WO1996017279A1 - Vehicle guidance system - Google Patents

Vehicle guidance system

Info

Publication number
WO1996017279A1
Authority
WO
WIPO (PCT)
Prior art keywords
row
steering
image
objects
vehicle
Prior art date
Application number
PCT/AU1995/000797
Other languages
French (fr)
Inventor
John Billingsley
Murray Kenneth Schoenfisch
Original Assignee
The University Of Southern Queensland
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The University Of Southern Queensland filed Critical The University Of Southern Queensland
Priority to AU39747/95A priority Critical patent/AU691051B2/en
Publication of WO1996017279A1 publication Critical patent/WO1996017279A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01BSOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B69/00Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
    • A01B69/007Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow
    • A01B69/008Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow automatic
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means

Definitions

  • TITLE VEHICLE GUIDANCE SYSTEM FIELD OF THE INVENTION
  • the system uses frame-sequential analysis methods to yield the necessary steering data
  • Vision systems can acquire data at a very high rate. A full-colour high-resolution image can require 1.4 megabytes of memory to hold it, and twenty-five such images are received from a conventional camera each second. Many vision projects have become congested by such data rates, requiring massive computing power to extract the simplest of features
  • the invention resides in a method of guiding a vehicle including the steps of: acquiring and storing a digitised image of a scene containing at least one row of objects; selecting a portion of the image; analysing the portion of the image to identify said objects; performing a regression calculation on the objects to locate a position of the row in terms of a lateral displacement value compared to a previously identified row; converting the lateral displacement to a steering correction signal; and applying the steering correction signal to steering means of the vehicle to effect vehicle guidance
  • the image is acquired by means which preferably employs a video camera to generate a vision signal and an interface card which stores an image of that signal in the memory of the processor means, which is preferably a computer
  • the signals acquired represent an image of the scene ahead of the vehicle and may with advantage be of relatively low resolution thereby reducing the volume of data to be processed
  • Samples are selected for numerical processing from a small portion of the image. Such a portion is bounded by a parallelogram of which two sides are horizontal. The other sides are located such that the last estimate of the position of a row is centred between them, and they are inclined to be parallel to the image of that row
  • a measure of quality of fit is also obtained. Provided the fit is good, the coordinates of the parallelogram are changed before the next image is acquired
  • the deviation of the coordinates of the parallelogram from a central datum is used to initiate a steering action. This may be performed by a hydraulic actuator
  • a feedback signal is preferably taken from a transducer mounted on the steering mechanism
  • the method is preferably extended to include several parallelograms each centred on an image of a different row of objects, so that interruption of a single row does not disrupt the steering control effected by the method
  • the invention resides in a vehicle guidance system comprising image acquisition means for acquiring an image of a scene containing at least one row of objects processor means for processing said image to estimate at least one row position and determine steering correction signals, input means in signal connection with said processor means for inputting parameters affecting processing in said processor means, and steering control means responsive to the steering correction signals from said processor means to control the direction of steering of the vehicle, wherein the processor means determines the steering correction signals by calculating the deviation of the at least one estimated row position from a previously estimated row position
  • the processor means is programmed with an algorithm to perform the steps of selecting a portion of the image analysing the portion of the image to identify objects, performing a regression calculation on the objects to locate a position of the row in terms of a lateral displacement value compared to a previously identified row converting the lateral displacement value to steering correction signals
  • the scene contains more than one row and the processor means is programmed with an algorithm wherein the step of selecting a portion of the image includes selecting a number of portions, each portion containing only one row and the step of performing a regression calculation includes performing a regression calculation on the objects in each portion and wherein the step of converting the lateral displacement to a steering direction correction includes determining a lateral displacement value of a vanishing point of the regression lines of the more than one rows
  • FIG 1 is a schematic block diagram of a vehicle guidance system
  • FIG 2 illustrates aspects of the image analysis algorithm
  • FIG 3 is a schematic block diagram of a variation on the system of FIG 1
  • FIG 4 illustrates a display of the system of FIG 1 or FIG 3, and
  • FIG 5 shows a practical result of the implementation of the invention
  • referring to FIG 1, there is shown a block diagram of the components of one embodiment of a vision guidance system for a vehicle
  • the invention will be described in terms of application to automatic steering of an agricultural vehicle following crop rows. It will be appreciated that the description applies equally to other equivalent situations
  • the system comprises a video camera 1 that records an image of the crop rows ahead of the vehicle
  • a video interface card 2 captures single frames in the form of signals from the camera 1 and supplies them to a processor 3
  • the processor 3 stores the image signals for further processing
  • CCD charge coupled device
  • An input device such as a keypad 4 allows a user to input parameters that affect the operation of the processor.
  • the operation of the system is indicated on display means 5.
  • a full colour image is captured in the on-board memory of video interface card 2 and can be merged "live" as a window forming part of a VGA display on display means 5
  • the image can be scaled horizontally and vertically with no use of processing time in processor 3. Lines and other graphics can be superimposed on the screen image (discussed later), so that the performance of the analysis system becomes very clear to see
  • An advantage of the interface is the provision of colour.
  • a field with a newly shooting crop may be littered with light-coloured detritus which makes discerning the crop rows difficult if brightness alone is used.
  • the spatial resolution of chrominance is not as sharp as that of luminance but resolution is not of the greatest importance.
  • the chrominance signal can be measured on a red/green axis or on a blue/yellow axis. Coupled with the light/dark luminance axis, a three-dimensional analysis space is defined.
  • the red/green, blue/yellow and light/dark parameters can be set by the user with keypad 4.
  • the image may be captured as an array in main memory of processor 3, where the software is able to access it for processing
  • part of the image can be intermittently copied to a display memory so that it can be seen on the display means 5 and the effectiveness of the algorithm assessed.
  • the processor 3 can conveniently be a commercially available personal computer such as a 386 or 486.
  • the processing algorithm may be implemented in the C programming language.
  • the main outputs of the processor 3 are signals to control solenoid valves 6 to control hydraulic pressure from the reservoir 7 to steering means 8
  • a sensor 9 provides feedback on the position of the steering means to the processor 3
  • the steering means 8 comprises a double ended hydraulic ram that adjusts the steering direction of the wheels of the vehicle
  • the steering means 8 acts independently of the steering wheel of the vehicle and without affecting its operation. Movement of the wheels due to the operation of the steering means 8 does not move the vehicle's steering wheel. The system is over-ridden by manual operation of the steering wheel
  • the processor 3 calculates steering corrections based on an analysis of the image captured by the video interface card 2 from the camera 1
  • the task is to identify a row of crops and locate its displacement from a datum position
  • the crop takes the form of a spotty row of variously-sized blobs. At its best, it is a linearly-connected domain with a highly irregular outline. If a window can be established within which members of only a single crop row will be present, however, an averaging technique can be used.
  • the analysis method makes heavy use of information learned from previous frames.
  • a window can be set for the next frame where movement of the vehicle should not have carried it as far as an adjoining row
  • the new frame will yield a new window for searching the following frame and so on.
  • the task therefore becomes one of making the best estimate of a line through a row of objects within the frame.
  • the threshold is accordingly lowered by one count for the following frame
  • the target value is held in the computer as an adjustable parameter. It is increased or decreased by the farmer during start-of-field set-up using the keypad 4
  • the display means 5 shows the actual scene overlaid by the image quantised by the threshold. The user can adjust the threshold until the width of the white pixels matches the thickness of the crop
  • a routine within the program accepts initial values of xfit and sfit which define the window for summation. This window is both laterally displaced and sheared to be centred on the last perceived line fit.
  • the routine computes values fitmean and fitslope which are the corrections to be made to xfit and sfit to achieve a minimum. It also computes a variable quality from which a decision can be made about the validity of the result.
  • the weights m(x, y) are given by the function picbit(x, y) which accesses the vision data
  • the computation is arranged to yield totals and moments in an efficient manner, where m is the total, mx is the total horizontal moment about the (sheared) window centre-line, mxx is the second moment, and my, mxy and myy have similar appropriate meanings
  • the moment of inertia about the fitted line can be computed. If the fit is good, the result should be small. If the crop is scattered, however, the moment of inertia will be larger. As a test, this moment of inertia is compared against myy, the moment of inertia about a horizontal axis. The ratio gives a measure of the quality of fit and the information is only acted upon if the quality is sufficiently high - typically greater than 4.0. Often a row may fade out half way down the field. For this reason, the computation is performed not just for one row but for two or three.
  • two variables represent the state of the vehicle's location. These are the lateral movement of the rows, measured either at the vanishing point or at the centre of the view, and the change in the aggregate slope of the rows. These variables are used to steer the vehicle.
  • Figure 2a depicts rows of newly sprouted plants as viewed from a camera mounted on an agricultural vehicle.
  • the newly sprouted plants appear in relatively neat rows
  • steering information can be derived from straight lines fitted to the rows of plants.
  • Figure 2e depicts that the moment about the regression line gives a measure to guard against errors
  • Figure 2f depicts that the moment about the regression line gives a measure to guard against errors
  • Movement in the vanishing point of the three lines indicates a change in the heading of the agricultural vehicle (figure 2g) and movement of the pattern indicates lateral displacement of the vehicle
  • the steering control elements are controlled from a separate processor 10
  • the processor 10 receives input from the keypad 4, steering sensor 9 and main processing computer 11
  • the main processing computer is conveniently a standard personal computer based on the 486 chip set
  • the processor 10 controls the solenoid valves 6 to effect automatic steering of steering means 8
  • the steering control processor 10 can be turned on or off - allowing manual control to be unimpeded - and set point values can be sent from keypad 4 to set the steering target angle
  • the steering sensor 9 may be a Hall-effect sensor that measures rotation of the wheels of the vehicle
  • the solenoid valves 6 can be switched in a four-millisecond cycle to give smooth and precise control
  • a schematic representation of the display means 5 is shown in FIG 4
  • the display means is a conventional visual display unit associated with the computer 11 or processor 3
  • the screen 12 is divided into four segments
  • a central segment 13 provides an image of the crop rows as viewed by the camera 1
  • the image is overlaid with regression lines 14 and windows 15 calculated by the algorithm
  • a further portion 16 of the screen displays parameters describing the fit of the regression lines 14 to the image
  • a visual indication of the steering correction is provided below the central segment 13
  • a line 17 indicates the position of the vehicle wheels relative to a straight ahead mark 18
  • the data for the line 17 is drawn from the sensor 9
  • a second line 19 indicates the correction from straight ahead applied by the system
  • the lines 17 and 19 are constantly changing in the display
  • Set-up data 20 is provided at the bottom of the screen 12
  • the set-up data 20 can be adjusted at any time by entering a set-up mode and using the keypad 4
  • the display is central to the user-friendly nature of the system
  • a driver of a vehicle implementing the system has immediate indication of the operations of the system
  • the particular benefit of this method is a relatively small number of numerical calculations which may typically only involve one hundred picture elements (pixels) per image analysed
  • the method also allows very rapid tracking of an image change, so that control is not lost if a severe disturbance occurs
  • A practical implementation of the invention is shown in FIG 5
  • the graph shows the actual performance of a test vehicle on a 35-second test run at a speed of one metre per second. Deflection from the ideal direction is measured in centimetres
  • the graph shows that after an initial 'acquisition' period the vehicle maintained direction to within 2 centimetres either side of the ideal direction
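The vanishing-point behaviour described in the points above (movement of the crossing point of the fitted lines indicating a heading change) can be sketched as the intersection of two fitted row lines x = a + s·y in image coordinates. This is a minimal illustration in C, the language the patent suggests; the function and parameter names are illustrative assumptions, not taken from the patent.

```c
#include <math.h>

/* Intersect two fitted row lines x = a1 + s1*y and x = a2 + s2*y.
   Returns 1 and writes the intersection (the vanishing point), or 0
   if the lines are (near-)parallel and no finite crossing exists. */
int vanishing_point(double a1, double s1, double a2, double s2,
                    double *vx, double *vy)
{
    if (fabs(s1 - s2) < 1e-9)
        return 0;                       /* parallel rows: no crossing */
    *vy = (a2 - a1) / (s1 - s2);        /* solve a1 + s1*y == a2 + s2*y */
    *vx = a1 + s1 * *vy;
    return 1;
}
```

Lateral drift of the returned point between frames would then feed the heading correction, while a common sideways shift of the whole line pattern indicates lateral displacement of the vehicle.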

Abstract

A method of guiding a vehicle in which an image of a scene containing rows of objects is acquired and stored. A portion of the image within an applied parallelogram is analysed to identify objects having a brightness that exceeds an adaptive threshold. A regression calculation is performed on the objects to locate a position of the row in terms of a lateral displacement value compared to a previously identified row. The lateral displacement value is converted to a steering correction signal which is applied to a steering means of the vehicle to effect vehicle guidance. A number of portions can be analysed simultaneously to identify a number of rows thereby improving the robustness of the method. A vehicle guidance system embodying the method is also disclosed.

Description

TITLE: VEHICLE GUIDANCE SYSTEM
FIELD OF THE INVENTION
THIS INVENTION relates to a vision guidance system for automatic steering of vehicles, and in particular agricultural vehicles. The system uses frame-sequential analysis methods to yield the necessary steering data
BACKGROUND OF THE INVENTION
There is a need for automated guidance of agricultural vehicles, not to remove the presence of a driver but to allow greater attention to be given by the driver to the cultivation operation. Automatic steering also promises to improve the effectiveness of "controlled traffic", a technique where vehicles seek to use the same "footprint" every time to minimise compaction damage to the soil. Under manual control, this increases the pressure on the driver to maintain precise control of the track of the vehicle
For spraying operations, high speeds are desirable to enable a ground vehicle to challenge the role of a crop-spraying aircraft. Once again the driver's task is made more demanding and an "autopilot" becomes highly desirable
Many guidance methods can be considered, ranging from buried leader cables to beacons, surveying instruments or even satellite navigation. All have their drawbacks. The most appealing method is to follow human practice and take guidance from the crop itself, steering the vehicle by means of the view of the rows ahead
There are however many complications as the condition of the crop changes through the growing cycle. Initially the plants appear as rows of small dots among scattered random dots which are weeds. Later they fuse to form a clear solid line. Before long, however, the lines have thickened and threaten to block the laneways. Many variations of the vision algorithm are thus required to fulfil all the seasonal requirements
Vision systems can acquire data at a very high rate. A full-colour high-resolution image can require 1.4 megabytes of memory to hold it, and twenty-five such images are received from a conventional camera each second. Many vision projects have become congested by such data rates, requiring massive computing power to extract the simplest of features
Prior attempts have been made to develop a viable automatic steering system for agricultural vehicles but these have suffered from the problems identified above. Yanmar Agricultural Equipment KK (Japanese Patent Application number J03012713) describes a complex image processor that compares a recorded crop image with a virtual crop row to determine an error signal. Kubota KK (Japanese Patent Application number J01319878) discloses a system that uses linear approximation to identify crop rows on the basis of colour separation. Neither of these systems is user-friendly and both are processing intensive. Examples may be found in the engineering literature of proposals for automatic guidance of agricultural vehicles. One such proposal is that of Gerrish et al reported in Transactions of the American Society of Agricultural Engineers, 1986, volume 95, section 5, pages 540-554. The Gerrish approach depends on edge identification techniques. Another approach is the proposal of Reid and Searcy reported in Transactions of the American Society of Agricultural Engineers, 1988, volume 31, section 6, pages 1624-1632. The Reid and Searcy approach relies on a Bayes classifier algorithm to set threshold levels. Both of these approaches are processing intensive and failed to result in a practical solution to vehicle guidance.
OBJECT OF THE INVENTION
It is an object of the present invention to provide a vehicle guidance system, particularly for agricultural vehicles.
It is a further object to provide an automatic guidance system for agricultural vehicles that is user-friendly, economic and reliable.
Further objects will be evident from the following disclosure.
DISCLOSURE OF THE INVENTION
In one form, although it need not be the only or indeed the broadest form, the invention resides in a method of guiding a vehicle including the steps of: acquiring and storing a digitised image of a scene containing at least one row of objects; selecting a portion of the image; analysing the portion of the image to identify said objects; performing a regression calculation on the objects to locate a position of the row in terms of a lateral displacement value compared to a previously identified row; converting the lateral displacement to a steering correction signal; and applying the steering correction signal to steering means of the vehicle to effect vehicle guidance. The image is acquired by means which preferably employs a video camera to generate a vision signal and an interface card which stores an image of that signal in the memory of the processor means, which is preferably a computer. The signals acquired represent an image of the scene ahead of the vehicle and may with advantage be of relatively low resolution, thereby reducing the volume of data to be processed
Samples are selected for numerical processing from a small portion of the image. Such a portion is bounded by a parallelogram of which two sides are horizontal. The other sides are located such that the last estimate of the position of a row is centred between them, and they are inclined to be parallel to the image of that row
If the image is stationary with respect to the vehicle, the new imaged row will be centred in the parallelogram. Any change of direction or orientation will cause a displacement of the image. A regression calculation is performed to locate the position of the new row, in terms of lateral displacement and change of inclination
A measure of quality of fit is also obtained. Provided the fit is good, the coordinates of the parallelogram are changed before the next image is acquired
The deviation of the coordinates of the parallelogram from a central datum is used to initiate a steering action. This may be performed by a hydraulic actuator. A feedback signal is preferably taken from a transducer mounted on the steering mechanism
The method is preferably extended to include several parallelograms, each centred on an image of a different row of objects, so that interruption of a single row does not disrupt the steering control effected by the method. In a further form the invention resides in a vehicle guidance system comprising: image acquisition means for acquiring an image of a scene containing at least one row of objects; processor means for processing said image to estimate at least one row position and determine steering correction signals; input means in signal connection with said processor means for inputting parameters affecting processing in said processor means; and steering control means responsive to the steering correction signals from said processor means to control the direction of steering of the vehicle, wherein the processor means determines the steering correction signals by calculating the deviation of the at least one estimated row position from a previously estimated row position
In preference the processor means is programmed with an algorithm to perform the steps of: selecting a portion of the image; analysing the portion of the image to identify objects; performing a regression calculation on the objects to locate a position of the row in terms of a lateral displacement value compared to a previously identified row; and converting the lateral displacement value to steering correction signals
In preference the scene contains more than one row and the processor means is programmed with an algorithm wherein the step of selecting a portion of the image includes selecting a number of portions, each portion containing only one row; the step of performing a regression calculation includes performing a regression calculation on the objects in each portion; and the step of converting the lateral displacement to a steering direction correction includes determining a lateral displacement value of a vanishing point of the regression lines of the more than one rows
BRIEF DETAILS OF THE DRAWINGS
To assist in understanding the invention, preferred embodiments will now be described with reference to the following figures, in which:
FIG 1 is a schematic block diagram of a vehicle guidance system,
FIG 2 illustrates aspects of the image analysis algorithm,
FIG 3 is a schematic block diagram of a variation on the system of FIG 1,
FIG 4 illustrates a display of the system of FIG 1 or FIG 3, and
FIG 5 shows a practical result of the implementation of the invention
DETAILED DESCRIPTION OF THE DRAWINGS
In the drawings like reference numerals refer to like parts. Referring to FIG 1, there is shown a block diagram of the components of one embodiment of a vision guidance system for a vehicle. The invention will be described in terms of application to automatic steering of an agricultural vehicle following crop rows. It will be appreciated that the description applies equally to other equivalent situations. The system comprises a video camera 1 that records an image of the crop rows ahead of the vehicle. A video interface card 2 captures single frames in the form of signals from the camera 1 and supplies them to a processor 3. The processor 3 stores the image signals for further processing. The inventors have found that a commercially available CCD (charge coupled device) video camera and commercially available video card with software are an appropriate and economic combination
An input device such as a keypad 4 allows a user to input parameters that affect the operation of the processor. The operation of the system is indicated on display means 5.
A full colour image is captured in the on-board memory of video interface card 2 and can be merged "live" as a window forming part of a VGA display on display means 5. The image can be scaled horizontally and vertically with no use of processing time in processor 3. Lines and other graphics can be superimposed on the screen image (discussed later), so that the performance of the analysis system becomes very clear to see. An advantage of the interface is the provision of colour. A field with a newly shooting crop may be littered with light-coloured detritus which makes discerning the crop rows difficult if brightness alone is used. Even the use of a green filter over the lens of the camera 1 makes little improvement. With colour it is possible to use the chrominance signal rather than luminance to capture an image based on the "greenness" of each point. The spatial resolution of chrominance is not as sharp as that of luminance, but resolution is not of the greatest importance. The chrominance signal can be measured on a red/green axis or on a blue/yellow axis. Coupled with the light/dark luminance axis, a three-dimensional analysis space is defined. The red/green, blue/yellow and light/dark parameters can be set by the user with keypad 4.
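A per-pixel "greenness" classification of the kind described above might look like the following C sketch. The excess-green measure (2G minus R minus B) is a common choice in crop-vision work but is an assumption here; the patent does not give an explicit formula, and the function name and threshold are illustrative.

```c
/* Classify a pixel as crop by chrominance rather than luminance.
   Greenness is taken (as an assumption) to be the excess of green
   over red and blue; the threshold would be operator-adjustable. */
int is_crop_pixel(unsigned char r, unsigned char g, unsigned char b,
                  int greenness_threshold)
{
    int greenness = 2 * (int)g - (int)r - (int)b;
    return greenness > greenness_threshold;
}
```

Such a measure lets a dark green seedling score higher than bright but colour-neutral detritus, which is the motivation the text gives for preferring chrominance over luminance.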
Alternatively, the image may be captured as an array in the main memory of processor 3, where the software is able to access it for processing. At some cost in overall speed, part of the image can be intermittently copied to a display memory so that it can be seen on the display means 5 and the effectiveness of the algorithm assessed.
The processor 3 can conveniently be a commercially available personal computer such as a 386 or 486. The processing algorithm may be implemented in the C programming language. The main outputs of the processor 3 are signals to control solenoid valves 6 to control hydraulic pressure from the reservoir 7 to steering means 8. A sensor 9 provides feedback on the position of the steering means to the processor 3. The steering means 8 comprises a double-ended hydraulic ram that adjusts the steering direction of the wheels of the vehicle. The steering means 8 acts independently of the steering wheel of the vehicle and without affecting its operation. Movement of the wheels due to the operation of the steering means 8 does not move the vehicle's steering wheel. The system is over-ridden by manual operation of the steering wheel
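The valve-and-feedback arrangement just described suggests a simple control step per cycle: compare the demanded wheel angle against the sensor reading and energise the appropriate solenoid. This C sketch is an assumption about how such a loop could be structured; the valve commands, deadband and names are illustrative and not taken from the patent.

```c
/* One step of a solenoid-valve steering loop: drive toward the
   demanded wheel angle using the feedback sensor, with a deadband
   so the valves are not chattered when the error is small. */
typedef enum { VALVE_OFF, VALVE_LEFT, VALVE_RIGHT } valve_cmd;

valve_cmd steer_step(double demanded_angle, double sensed_angle,
                     double deadband)
{
    double error = demanded_angle - sensed_angle;
    if (error > deadband)
        return VALVE_LEFT;      /* ram extends: steer toward demand */
    if (error < -deadband)
        return VALVE_RIGHT;
    return VALVE_OFF;           /* within deadband: hold position   */
}
```

Run at a short fixed cycle (the text later mentions a four-millisecond valve cycle), such a step gives smooth correction without continuous valve actuation.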
The processor 3 calculates steering corrections based on an analysis of the image captured by the video interface card 2 from the camera 1. The task is to identify a row of crops and locate its displacement from a datum position. There will certainly not be a well-defined object with shape which could be analysed by outline methods, even if time permitted, nor is there a well-defined edge that can be identified by edge analysis techniques. In the early stages of growth, the crop takes the form of a spotty row of variously-sized blobs. At its best, it is a linearly-connected domain with a highly irregular outline. If a window can be established within which members of only a single crop row will be present, however, an averaging technique can be used. The analysis method makes heavy use of information learned from previous frames. With knowledge of the location of a row, a window can be set for the next frame where movement of the vehicle should not have carried it as far as an adjoining row. The new frame will yield a new window for searching the following frame, and so on. The task therefore becomes one of making the best estimate of a line through a row of objects within the frame.
Within the frame a count is made of the pixels which exceed a threshold and are therefore deemed to be objects. The proportion of bright to total pixels is therefore known. The 'correct' value of this proportion is a property of the object. For crops it will be related to the plant's stage of development, varying from, say, 0.1 or less when the plants are newly emerged to 0.5 or more as the canopy closes. (Above this value the farmer would not wish to enter the crop with a vehicle unless harvesting)
If the proportion of bright pixels is seen to fall below the target value, the threshold is accordingly lowered by one count for the following frame. The target value is held in the computer as an adjustable parameter. It is increased or decreased by the farmer during start-of-field set-up using the keypad 4. The display means 5 shows the actual scene overlaid by the image quantised by the threshold. The user can adjust the threshold until the width of the white pixels matches the thickness of the crop
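The adaptive threshold behaviour can be sketched in C as follows. The text describes lowering the threshold by one count when too few pixels are bright; the symmetric raising branch, the byte-valued pixel range and the function signature are assumptions added for completeness.

```c
/* One frame of threshold adaptation: count pixels above the current
   threshold and nudge the threshold by one count so the bright
   proportion tracks the operator-set target. */
int adapt_threshold(const unsigned char *frame, int npixels,
                    int threshold, double target)
{
    int bright = 0;
    for (int i = 0; i < npixels; i++)
        if (frame[i] > threshold)
            bright++;

    double proportion = (double)bright / npixels;
    if (proportion < target && threshold > 0)
        threshold--;            /* too few bright pixels: admit more   */
    else if (proportion > target && threshold < 255)
        threshold++;            /* symmetric step (an assumption here) */
    return threshold;
}
```

Because the crop-row boundaries are fuzzy, a one-count step per frame is gentle enough to track lighting changes without oscillating visibly.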
Until the threshold is changed again, the threshold will adapt to variations in picture brightness to preserve the proportion of bright cells in the window. Because of the 'fuzzy' boundaries of the crop rows, there is a good deal of latitude in the setting before performance deteriorates. Regression is conventionally used to fit the best straight line to a sequence of points, usually pairs of measurement samples or readings from which statistics are to be drawn. The regression line minimises a quadratic cost function: the sum of the weights of the points times the squares of their distances from the line. In the present case, however, the measurements are brightness values for a two-dimensional array of points and evaluation of the cost involves a double summation. The cost function is
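The adaptive threshold update described above can be sketched in Python. This is an illustrative sketch, not the patent's code: the patent states only that the threshold is lowered by one count when the bright proportion falls below the target, so the complementary upward adjustment shown here is an assumption, and the function and variable names are invented for the example.

```python
def update_threshold(pixels, threshold, target):
    """Adapt the binarisation threshold so that the proportion of
    'bright' pixels (those deemed objects) tracks the target proportion."""
    bright = sum(1 for p in pixels if p > threshold)
    proportion = bright / len(pixels)
    if proportion < target:
        # Too few bright pixels: lower the threshold by one count
        # for the following frame, as the patent describes.
        threshold -= 1
    elif proportion > target:
        # Assumed complementary adjustment when too many pixels qualify.
        threshold += 1
    return threshold
```

Re-binarising each frame with the returned threshold preserves the proportion of bright cells in the window as overall picture brightness varies.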
C = Σ m(x,y)·(x − xfit − sfit·y)²

where the double summation runs over the pixels (x, y) in the window. The best-fit line is defined by the offset and slope parameters xfit and sfit which minimise C. xfit and sfit can be solved from the simultaneous equations

∂C/∂xfit = 0,  ∂C/∂sfit = 0
A routine within the program accepts initial values of xfit and sfit which define the window for summation. This window is both laterally displaced and sheared to be centred on the last perceived line fit. The routine computes values fitmean and fitslope which are the corrections to be made to xfit and sfit to achieve a minimum. It also computes a variable quality from which a decision can be made about the validity of the result. The weights m(x, y) are given by the function picbit(x, y) which accesses the vision data. The computation is arranged to yield totals and moments in an efficient manner, where m is the total, mx is the total horizontal moment about the (sheared) window centre-line, mxx is the second moment and my, mxy and myy have similar appropriate meanings. The results are given by

fitmean = (mx·myy − mxy·my) / (m·myy − my²)

and

fitslope = (m·mxy − mx·my) / (m·myy − my²)
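These closed-form results can be checked with a short Python sketch that accumulates the weighted totals and moments over a window and then applies the two formulas. The variable names follow the patent's m, mx, my, mxy, myy convention; the (x, y, weight) triples stand in for the picbit(x, y) image access, which is not reproduced here.

```python
def fit_row(points):
    """Weighted least-squares fit of the line x = xfit + sfit*y.

    points: iterable of (x, y, weight) triples, with x measured
    horizontally from the (sheared) window centre-line and weight
    m(x, y) taken from the thresholded image (1 bright, 0 dark).
    """
    m = mx = my = mxy = myy = 0.0
    for x, y, w in points:
        m += w          # total weight
        mx += w * x     # horizontal moment
        my += w * y     # vertical moment
        mxy += w * x * y
        myy += w * y * y  # second moment about the horizontal axis
    det = m * myy - my * my  # common denominator of both formulas
    fitmean = (mx * myy - mxy * my) / det
    fitslope = (m * mxy - mx * my) / det
    return fitmean, fitslope
```

For pixels of unit weight lying exactly on x = 2 + 0.5·y the routine returns an offset of 2 and a slope of 0.5, as expected.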
The moment of inertia about the fitted line can be computed. If the fit is good, the result should be small. If the crop is scattered, however, the moment of inertia will be larger. As a test, this moment of inertia is compared against myy, the moment of inertia about a horizontal axis. The ratio gives a measure of the quality of fit and the information is only acted upon if the quality is sufficiently high - typically greater than 4.0. Often a row may fade out half way down the field. For this reason, the computation is performed not just for one row but for two or three.
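A sketch of this validity test in Python, assuming the quality measure is the ratio of myy to the residual moment about the fitted line (the patent gives the comparison and the typical 4.0 threshold but not an explicit formula, so the exact ratio and all names here are illustrative):

```python
def fit_quality(points, fitmean, fitslope):
    """Ratio of the moment of inertia about a horizontal axis (myy)
    to the moment of inertia about the fitted line, using horizontal
    distances as in the cost function. A tight row gives a large
    ratio; a scattered crop gives a small one."""
    myy = sum(w * y * y for x, y, w in points)
    residual = sum(w * (x - fitmean - fitslope * y) ** 2
                   for x, y, w in points)
    if residual == 0.0:
        return float("inf")  # perfect fit
    return myy / residual

def fit_is_valid(points, fitmean, fitslope, threshold=4.0):
    # Only act on the fit if the quality is sufficiently high.
    return fit_quality(points, fitmean, fitslope) > threshold
```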
Within the program, two variables represent the state of the vehicle's location. These are the lateral movement of the rows, measured either at the vanishing point or at the centre of the view, and the change in the aggregate slope of the rows. These variables are used to steer the vehicle.
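The patent does not specify the control law that maps these two variables onto a steering demand. A minimal proportional sketch, under the assumption that the correction is a weighted combination of the two state variables (the gains k_lat and k_head, the sign convention and all names are illustrative, not from the patent):

```python
def steering_correction(lateral_offset, heading_change,
                        k_lat=0.5, k_head=1.0):
    """Combine the lateral displacement of the row pattern and the
    change in aggregate row slope (heading) into one steering demand.
    The negative sign steers the vehicle back towards the rows."""
    return -(k_lat * lateral_offset + k_head * heading_change)
```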
Aspects of the image analysis algorithm are illustrated in figures 2a to 2h. Figure 2a depicts rows of newly sprouted plants as viewed from a camera mounted on an agricultural vehicle. The newly sprouted plants appear in relatively neat rows. As depicted in figure 2b, steering information can be derived from straight lines fitted to the rows of plants.
This is achieved by selecting part of the image by applying a window as shown in figure 2c. This is done statistically in the processor 3. A regression line is fitted using a least squares procedure as shown in figure 2d.
Figure 2e depicts that the moment about the regression line gives a measure to guard against errors. To improve accuracy, a group of three windows is used for the calculations as shown in figure 2f. The three windows are updated by corrections to all valid regression lines.
Movement in the vanishing point of the three lines indicates a change in the heading of the agricultural vehicle (figure 2g) and movement of the pattern indicates lateral displacement of the vehicle (figure 2h).
In an alternative embodiment of the invention shown in Fig. 3, the steering control elements are controlled from a separate processor 10. This allows the structure of the control software to be simplified. The processor 10 receives input from the keypad 4, steering sensor 9 and main processing computer 11. The main processing computer is conveniently a standard personal computer based on the 486 chip set. The processor 10 controls the solenoid valves 6 to effect automatic steering of steering means 8. The steering control processor 10 can be turned on or off - allowing manual control to be unimpeded - and set point values can be sent from keypad 4 to set the steering target angle. The steering sensor 9 may be a Hall effect sensor that measures rotation of the wheels of the vehicle. The solenoid valves 6 can be switched in a four-millisecond cycle to give smooth and precise control.

A schematic representation of the display means 5 is shown in Fig. 4. In the preferred embodiment the display means is a conventional visual display unit associated with the computer 11 or processor 3. The screen 12 is divided into four segments. A central segment 13 provides an image of the crop rows as viewed by the camera 1. The image is overlaid with regression lines 14 and windows 15 calculated by the algorithm. A further portion 16 of the screen displays parameters describing the fit of the regression lines 14 to the image. A visual indication of the steering correction is provided below the central segment 13. A line 17 indicates the position of the vehicle wheels relative to a straight-ahead mark 18. The data for the line 17 is drawn from the sensor 9. A second line 19 indicates the correction from straight ahead applied by the system. The lines 17 and 19 are constantly changing in the display. Set-up data 20 is provided at the bottom of the screen 12. The set-up data 20 can be adjusted at any time by entering a set-up mode and using the keypad 4.
The display is central to the user-friendly nature of the system. A driver of a vehicle implementing the system has an immediate indication of the operations of the system.
The particular benefit of this method is a relatively small number of numerical calculations, which may typically involve only one hundred picture elements (pixels) per image analysed. The method also allows very rapid tracking of an image change, so that control is not lost if a severe disturbance occurs.
A practical implementation of the invention is shown in Fig. 5. The graph shows the actual performance of a test vehicle on a 35-second test run at a speed of one metre per second. Deflection from the ideal direction is measured in centimetres. The graph shows that after an initial 'acquisition' period the vehicle maintained direction to within 2 centimetres either side of the ideal direction.
Throughout the specification the aim has been to describe the preferred embodiments of the invention without limiting the invention to any one embodiment or specific collection of features.

Claims

1. A method of guiding a vehicle including the steps of acquiring and storing a digitised image of a scene containing at least one row of objects, selecting a portion of the image, analysing the portion of the image to identify said objects, performing a regression calculation on the objects to locate a position of the row in terms of a lateral displacement value compared to a previously identified row, converting the lateral displacement to a steering correction signal, and applying the steering correction signal to steering means of the vehicle to effect vehicle guidance.
2. The method of claim 1 wherein the portion of the image selected is bound by a parallelogram, said parallelogram having two horizontal sides and two inclined sides, said inclined sides being parallel to a row in the portion of the image and located such that the last estimated position of the row is centred between the inclined sides.
3. The method of claim 2 further including the step of repositioning the parallelogram based on the lateral displacement value and a change in inclination value obtained from the regression calculation.
4. The method of claim 1 further including the step of calculating a quality of fit of the regression calculation and only applying the steering correction signal to steering means of the vehicle if the quality of fit exceeds a preset value.

5. The method of claim 1 wherein the scene contains more than one row of objects and the step of selecting a portion of the image includes selecting a number of portions, each portion containing only one row of objects.

6. The method of claim 5 further including the step of performing a regression calculation on the row of objects in each portion to locate the position of each row in terms of lateral displacement and change in inclination compared to a previously identified row for that portion.

7. The method of claim 5 further including the step of performing a regression calculation on the row of objects in each portion and wherein the step of converting the lateral displacement value to a steering direction correction includes determining a lateral displacement value of the vanishing point of the regression lines of the more than one rows.
8. The method of claim 1 in which the step of analysing the portion of the image to identify objects involves comparing a brightness of each picture element in the portion of the image against a threshold level, wherein picture elements having a brightness greater than the threshold level are considered as objects.
9. The method of claim 8 wherein brightness is a weighted average of light/dark luminance, red/green chrominance and blue/yellow chrominance, said weights being positive, zero or negative, and said method further including the step of a user inputting the weights.

10. The method of claim 8 wherein the threshold level is adapted such that the proportion of picture elements in the portion considered as objects approaches a preset level.
11. The method of claim 1 further including the step of displaying the scene, results of the regression calculation and a graphic depiction of the steering control signals on a display means.
12. A vehicle guidance system comprising image acquisition means for acquiring an image of a scene containing at least one row of objects, processor means for processing said image to estimate at least one row position and determine steering correction signals, input means in signal connection with said processor means for inputting parameters affecting processing in said processor means, and steering control means responsive to the steering correction signals from said processor means to control the direction of steering of the vehicle, wherein the processor means determines the steering correction signals by calculating the deviation of the at least one estimated row position from a previously estimated row position.

13. The system of claim 12 wherein the processor means is programmed with an algorithm to perform the steps of selecting a portion of the image, analysing the portion of the image to identify objects, performing a regression calculation on the objects to locate a position of the row in terms of a lateral displacement value compared to a previously identified row, and converting the lateral displacement value to steering correction signals.
14. The system of claim 12 wherein said image acquisition means is a video camera.
15. The system of claim 12 wherein the image acquisition means is a video camera and the vehicle guidance system further comprises an interface means between the video camera and the processor means for capturing single frames from the video camera.

16. The system of claim 12 wherein the input means is a keypad for input of parameters by a person operating the vehicle, said parameters including at least threshold levels and weights for chrominance and luminance measurements.
17. The system of claim 12 wherein said steering control means comprises at least one computer actuated solenoid valve controlling flow of hydraulic fluid from a reservoir to an hydraulic steering means, said hydraulic steering means being in operative connection with steerable wheels of the vehicle.
18. The system of claim 12 further comprising steering direction sensing means in signal connection with the processor means for providing feedback of a current steering direction of the vehicle.

19. The system of claim 12 wherein said processor means comprises a vision processor means for processing said image to estimate at least one row position and compare said row position with a previously estimated row position to determine steering correction signals, and a steering processor means in signal connection with the vision processor means for controlling the direction of steering of the vehicle in response to signals received from the vision processor means, the input means and a steer direction sensing means.
PCT/AU1995/000797 1994-11-29 1995-11-29 Vehicle guidance system WO1996017279A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU39747/95A AU691051B2 (en) 1994-11-29 1995-11-29 Vehicle guidance system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AUPM9716 1994-11-29
AUPM9716A AUPM971694A0 (en) 1994-11-29 1994-11-29 Vision guidance for agricultural vehicles

Publications (1)

Publication Number Publication Date
WO1996017279A1 true WO1996017279A1 (en) 1996-06-06

Family

ID=3784226

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU1995/000797 WO1996017279A1 (en) 1994-11-29 1995-11-29 Vehicle guidance system

Country Status (2)

Country Link
AU (1) AUPM971694A0 (en)
WO (1) WO1996017279A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0048138A1 (en) * 1980-09-12 1982-03-24 Deere & Company Guidance aid for towing vehicle
US4555725A (en) * 1983-08-24 1985-11-26 Deutz-Allis Corporation Agricultural implement steering guidance system and method
DE3507570A1 (en) * 1985-03-04 1986-09-04 Willi Eisen GmbH, 3060 Stadthagen Method and device for automatically steering a vehicle, in particular an agricultural vehicle, along a pattern aligned in the direction of travel
AU6649286A (en) * 1985-12-20 1987-06-25 Yoshida Kogyo K.K. Apparatus and method for controlling automatically controlled wagon
US4769700A (en) * 1981-11-20 1988-09-06 Diffracto Ltd. Robot tractors
EP0446903A2 (en) * 1990-03-15 1991-09-18 Honda Giken Kogyo Kabushiki Kaisha Automatic travelling apparatus
EP0640903A1 (en) * 1993-08-28 1995-03-01 Lucas Industries Public Limited Company A driver assistance system for a vehicle

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6285930B1 (en) 2000-02-28 2001-09-04 Case Corporation Tracking improvement for a vision guidance system
US6490539B1 (en) 2000-02-28 2002-12-03 Case Corporation Region of interest selection for varying distances between crop rows for a vision guidance system
US6686951B1 (en) 2000-02-28 2004-02-03 Case, Llc Crop row segmentation by K-means clustering for a vision guidance system
US6278918B1 (en) 2000-02-28 2001-08-21 Case Corporation Region of interest selection for a vision guidance system
US6385515B1 (en) 2000-06-15 2002-05-07 Case Corporation Trajectory path planner for a vision guidance system
US6445983B1 (en) 2000-07-07 2002-09-03 Case Corporation Sensor-fusion navigator for automated guidance of off-road vehicles
US8712144B2 (en) 2003-04-30 2014-04-29 Deere & Company System and method for detecting crop rows in an agricultural field
US8855405B2 (en) 2003-04-30 2014-10-07 Deere & Company System and method for detecting and analyzing features in an agricultural field for vehicle guidance
US8737720B2 (en) 2003-04-30 2014-05-27 Deere & Company System and method for detecting and analyzing features in an agricultural field
EP1529428A1 (en) 2003-11-06 2005-05-11 Deere & Company Method and system for automatic steering of an agricultural vehicle
US7400957B2 (en) 2003-11-06 2008-07-15 Deere & Company Process and steering system for the automatic steering of an agricultural vehicle
US7248968B2 (en) 2004-10-29 2007-07-24 Deere & Company Obstacle detection using stereo vision
US7580549B2 (en) 2005-07-01 2009-08-25 Deere & Company Method and system for vehicular guidance using a crop image
US8185275B2 (en) 2005-07-01 2012-05-22 Deere & Company System for vehicular guidance with respect to harvested crop
US8433483B2 (en) 2005-07-01 2013-04-30 Deere & Company Method and system for vehicular guidance with respect to harvested crop
US7792622B2 (en) 2005-07-01 2010-09-07 Deere & Company Method and system for vehicular guidance using a crop image
US7684916B2 (en) 2005-07-01 2010-03-23 Deere & Company Method and system for vehicular guidance using a crop image
US7570783B2 (en) 2005-07-01 2009-08-04 Deere & Company Method and system for vehicular guidance using a crop image
US7904218B2 (en) 2006-05-18 2011-03-08 Applied Perception, Inc. Vision guidance system and method for identifying the position of crop rows in a field
US8019513B2 (en) 2006-05-18 2011-09-13 Applied Perception Inc. Vision guidance system and method for identifying the position of crop rows in a field
US8121345B2 (en) 2006-05-18 2012-02-21 Applied Perception, Inc. Vision guidance system and method for identifying the position of crop rows in a field
WO2021105006A1 (en) * 2019-11-25 2021-06-03 Robert Bosch Gmbh Method for estimating a course of plant rows
US11400976B2 (en) 2020-07-22 2022-08-02 Ford Global Technologies, Llc Steering wheel angle calibration

Also Published As

Publication number Publication date
AUPM971694A0 (en) 1994-12-22

Similar Documents

Publication Publication Date Title
WO1996017279A1 (en) Vehicle guidance system
CN110243372B (en) Intelligent agricultural machinery navigation system and method based on machine vision
US20210034057A1 (en) Method for autonomous detection of crop location based on tool depth and location
Billingsley et al. The successful development of a vision guidance system for agriculture
EP0975209B1 (en) Agricultural harvester with robotic control
EP1738630B1 (en) Method and system for vehicular guidance with respect to harvested crop
US7570783B2 (en) Method and system for vehicular guidance using a crop image
CN103891697B (en) The variable spray method of a kind of indoor autonomous spraying machine device people
US7580549B2 (en) Method and system for vehicular guidance using a crop image
Guerrero et al. Crop rows and weeds detection in maize fields applying a computer vision system based on geometry
AU691051B2 (en) Vehicle guidance system
Olsen Determination of row position in small-grain crops by analysis of video images
CN114092822B (en) Image processing method, movement control method, and movement control system
JP2006101816A (en) Method and apparatus for controlling steering
CN113016331A (en) Wide-narrow row ratoon rice harvesting regulation and control system and method based on binocular vision
Okamoto et al. Automatic guidance system with crop row sensor
CA3233542A1 (en) Vehicle row follow system
JP2502981Y2 (en) Image processing system for crop row detection in rice transplanter
WO2023276227A1 (en) Row detection system, farm machine provided with row detection system, and method for detecting row
WO2023276228A1 (en) Row detection system, farm machine provided with row detection system, and row detection method
US20230094371A1 (en) Vehicle row follow system
US20230403964A1 (en) Method for Estimating a Course of Plant Rows
Okamoto et al. Automatic weeding cultivator using crop-row detector
Ericson et al. A vision-guided mobile robot for precision agriculture
JPH0257110A (en) Automatic steering control apparatus of farm working machine

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU BB BG BR BY CA CH CN CZ DE DK EE ES FI GB GE HU IS JP KE KG KP KR KZ LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TT UA UG US UZ VN

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): KE LS MW SD SZ UG AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
ENP Entry into the national phase

Ref country code: US

Ref document number: 1997 860417

Date of ref document: 19970529

Kind code of ref document: A

Format of ref document f/p: F

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: CA