US20020030739A1 - Moving object detection apparatus


Info

Publication number
US20020030739A1
US20020030739A1 (application US09/946,528)
Authority
US
United States
Prior art keywords
moving object
background
window area
image
monitoring
Prior art date
Legal status
Abandoned
Application number
US09/946,528
Inventor
Shigeki Nagaya
Takafumi Miyatake
Takehiro Fujita
Current Assignee
Individual
Original Assignee
Individual
Priority date
Application filed by Individual filed Critical Individual
Priority to US09/946,528
Publication of US20020030739A1

Classifications

    • F41G7/2226 Homing guidance systems comparing the observed data with stored target data, e.g. target configuration data
    • F41G7/2253 Passive homing systems, i.e. comprising a receiver and not requiring an active illumination of the target
    • F41G7/2293 Homing guidance systems characterised by the type of waves, using electromagnetic waves other than radio waves
    • G06T7/20 Image analysis; Analysis of motion
    • G06V10/421 Global feature extraction by analysis of the whole pattern, by analysing segments intersecting the pattern
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/54 Surveillance or monitoring of activities of traffic, e.g. cars on the road, trains or boats
    • G08B13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19604 Image analysis involving reference image or background adaptation with time to compensate for changing conditions, e.g. reference image update on detection of light level change
    • G08B13/19691 Signalling events for better perception by user, e.g. indicating alarms by making display brighter, adding text, creating a sound
    • G08G1/04 Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors

Definitions

  • The invention relates to a moving object detection apparatus and method for monitoring a movie input from a camera, in order to measure traffic flows on roads, detect failures at railroad crossings, and prevent crimes in banks or convenience stores.
  • Of these problems, the gradual illuminance change arising from complicated backgrounds is solved by the moving object detection method using the background difference.
  • The background difference is a method of separating/extracting only a moving object by taking the difference between a background image, which reflects only the background, and a frame image containing the moving object, exploiting the fact that the background hardly changes in a movie taken with a fixed camera.
  • The background image is automatically acquired by determining and using the median or mode of the intensity of each pixel along the time axis.
  • FIG. 24 simply shows the principle of the moving object detection method using the background difference. If a background image 100 is given in advance for a scene 110 to be monitored, a moving object 111 can be separated/extracted as a scene change 121 from the differential image 120 between the background image 100 and the scene 110 .
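The background-difference principle described above can be sketched in a few lines. The following NumPy fragment is illustrative only (it is not part of the patent; the threshold value and the toy images are assumptions): it thresholds the absolute difference between a background image and a frame to obtain a change mask isolating the moving object.

```python
import numpy as np

def background_difference(background, frame, threshold=30):
    """Separate a moving object from a frame by differencing it
    against a reference background image (grayscale arrays)."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold  # boolean change mask

# Tiny illustration: a flat background and a frame with a bright "object".
background = np.full((4, 4), 50, dtype=np.uint8)
frame = background.copy()
frame[1:3, 1:3] = 200          # the moving object occupies a 2x2 block
mask = background_difference(background, frame)
```

The mask is nonzero exactly where the frame departs from the background, which is the scene change 121 of FIG. 24.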
  • FIG. 25 simply shows the problem of the method of the prior art.
  • A stopped object 131 may appear in the scene of the background image 100 and cause a structure change of the background, as shown in a scene 130.
  • The parked object 131 is then extracted as a change 141, as indicated in a differential image 140 between the background image 100 and the scene 130.
  • The structure change and the moving object merge into each other so thoroughly, as in the region 161 in a scene 160, that they cannot be separated even after a moving object 151 has passed.
  • A number of structure changes of the background occur in actual movie monitoring.
  • For example, an automobile having passed along a road may stop at a parking meter on the road edge and form part of a new background.
  • Conversely, an object that has been stopped at the parking meter may move away, making the previously hidden region a portion of the new background.
  • If a passing object drops something onto the road, the fallen object may also form part of the new background.
  • On a snowy road, a passing object may leave its tracks.
  • The prior-art method using the background difference could not cope with such structure changes of the background, because it is impossible to discriminate whether a portion with a changed background structure belongs to the moving object or to a new background region. For this discrimination, it is conceivable to execute a motion analysis of the moving object.
  • With a motion analysis algorithm such as the optical flow, however, the number of moving objects has to be known in advance. Once the number of moving objects is mis-recognized, the subsequent processing will find it difficult not only to separate the background change region but even to detect the presence of the background change itself.
  • The invention has the following three objects.
  • A first object is to judge whether a pixel region of interest belongs to the background or to the moving object, and thereby to judge the kind of the background change, if any.
  • A second object is to extract only the moving object by separating/judging the background change region and the moving object region.
  • A third object is to easily calculate the moving direction or velocity of the extracted moving object.
  • The invention comprises, as its basic component means, means for inputting a movie, means for extracting/detecting a moving object, and means for outputting the processed result as a movie.
  • The following means are provided for realizing the judgment of structure changes in the moving object and the background for a predetermined pixel region, according to the first object.
  • The means are: means for acquiring the pixel region to be judged for the background from the movie; means for calculating the correlation between the pixel region at a reference time and the pixel region of each frame; means for holding the calculated correlation values sequentially; means for judging the interval which is predicted to belong to the background because of the absence of a moving object; and means for deciding the interval for which the moving object is present, from the interval which is predicted to belong to the judged background.
  • The means for deciding the interval for which the moving object is present comprises: means for judging the presence of a background change from the interval which is predicted to belong to the judged background; means for classifying the background change into an illuminance change or a structure change; and means for deciding the interval for which the moving object is present, from the interval which is predicted to belong to the judged background.
  • The following means are provided for realizing the extraction of only the moving object by separating/judging the background change region and the moving object region, according to the second object.
  • The means are: means for acquiring, from the movie, the frame image containing the moving object and the images in which only the two backgrounds are projected (the original and future background images before and after the interval for which the moving object is present); means for creating the original background differential image and the future background differential image from the frame image and the original and future background images; means for determining the merged region by a logical product of the original background differential image and the future background differential image; and means for cutting the moving object image out of the frame image and the merged region.
  • The following means are provided for the third object. The means are: means for cutting out the spatial-temporal image of the interval for which the moving object is present; means for separating, from the spatial-temporal image, the slit images of only the two backgrounds (the original background slit image and the future background slit image before and after the interval for which the moving object is present) and the moving object region; means for correcting the moving object region by morphology processing and hole-fill processing; means for determining a common merged region from the two corrected background differential images; and means for estimating the direction/velocity of the moving object by calculating the inclination of the obtained merged region.
  • The moving object is detected by the following procedure.
  • First, the structure changes in the moving object and the background are judged for a specific pixel region.
  • The pixel region to be judged for the background is acquired from the movie, and for each frame the correlation with the pixel region at a reference time is calculated.
  • The correlation values thus calculated can be handled as a sequence.
  • The interval which is predicted to belong to the background because of the absence of a moving object is judged. Whether or not the background has changed in the interval predicted to belong to the background is then judged, to classify the judged background change into an illuminance change or a structure change.
  • The interval for which the moving object is present is decided from the interval predicted to belong to the background.
  • Next, the frame image containing the moving object is acquired on the basis of the interval for which the moving object is present.
  • The original background image and the future background image, as located before and after the interval for which the moving object is present, are acquired from the movie.
  • The original background differential image and the future background differential image are created from the frame image and the original and future background images.
  • The merged region is determined by the logical product of the original background differential image and the future background differential image, and the moving object image is then cut out of the frame image and the merged region.
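The logical-product step can be sketched as follows. This is an illustrative NumPy fragment, not the patented implementation; the threshold and the one-row toy images are assumptions. The point it demonstrates: a structure change such as a fallen object matches the future background, so it shows up only in the original-background difference and is removed by the AND, while the true moving object differs from both backgrounds and survives.

```python
import numpy as np

def merged_region(orig_bg, future_bg, frame, threshold=30):
    """AND the differences against the backgrounds before and after
    the moving-object interval; only pixels differing from BOTH
    backgrounds (the moving object) remain in the merged region."""
    d_orig = np.abs(frame.astype(int) - orig_bg.astype(int)) > threshold
    d_future = np.abs(frame.astype(int) - future_bg.astype(int)) > threshold
    return d_orig & d_future

orig_bg = np.full((1, 6), 50, dtype=np.uint8)
future_bg = orig_bg.copy()
future_bg[0, 4] = 200            # dropped object joins the new background
frame = orig_bg.copy()
frame[0, 1] = 255                # moving object
frame[0, 4] = 200                # dropped object visible in the frame
mask = merged_region(orig_bg, future_bg, frame)
```

Only the moving-object pixel at column 1 survives; the structure change at column 4 is excluded because it equals the future background.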
  • Finally, the moving direction or velocity of the extracted moving object is calculated simply.
  • The spatial-temporal image of the interval for which the moving object is present is cut out.
  • The moving object region is separated from the background slit image and the spatial-temporal image.
  • The moving object region is corrected by the morphology and hole-fill processings.
  • The merged region is determined from the logical product of the corrected original background differential image and future background differential image, and the inclination of the merged region is then calculated to estimate the direction/velocity of the moving object.
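The inclination step can be sketched as follows. This NumPy fragment is illustrative only (the morphology and hole-fill corrections are omitted), and fitting a line to per-frame centroids is one plausible way to measure the inclination of the merged region; it is an assumption, not the patent's stated formula. In a spatio-temporal mask whose rows are slit positions and whose columns are frames, the slope of that line is the velocity in pixels per frame, and its sign gives the moving direction along the slit.

```python
import numpy as np

def estimate_velocity(region):
    """region: boolean spatio-temporal mask, rows = slit position,
    cols = time (frames). Fit a line to the per-frame centroid of
    the moving-object region; the slope is pixels per frame."""
    times, centroids = [], []
    for t in range(region.shape[1]):
        ys = np.flatnonzero(region[:, t])
        if ys.size:                      # frames where the object is on the slit
            times.append(t)
            centroids.append(ys.mean())
    slope, _ = np.polyfit(times, centroids, 1)
    return slope

# Toy object moving 2 pixels per frame along the slit.
region = np.zeros((10, 4), dtype=bool)
for t in range(4):
    region[2 * t, t] = True
v = estimate_velocity(region)
```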
  • FIG. 1 shows a hardware construction for realizing the invention
  • FIG. 2 shows a system construction for realizing the invention
  • FIG. 3 shows relations among a movie, a slit image and a spatial-temporal image
  • FIG. 4 shows a relation in distance between a background slit 1041 and a current slit 1042 at each time
  • FIG. 5 shows a spatial-temporal image 1050 and a sequence of distances when a structure change occurs in the background
  • FIG. 6 explains a data flow of background judgment means 700 ;
  • FIG. 7 shows a flow chart of background period judgment means 720 ;
  • FIG. 8 shows the influences of an illuminance change upon a slit image 1040 ;
  • FIG. 9 shows the influences of an illuminance change upon a slit vector when the slit image 1040 is deemed as a vector
  • FIG. 10 shows the mapping of the ordinary slit vector and the slit vector, as influenced by the illuminance change, upon a unit sphere;
  • FIG. 11 shows a flow chart of background true/false judgment means 730 ;
  • FIG. 12 shows a flow chart of background structure change judgment means 750 ;
  • FIG. 13 shows the summary of a method of extracting a moving object 1100 exclusively by separating/judging a background change portion and a moving object region from a spatial-temporal image 1050 ;
  • FIG. 14 shows a slit setting method for analyzing the motion of the moving object 1100 in the movie 1010 , and the spatial-temporal image 1050 obtained by the method;
  • FIG. 15 explains the summary of a method for calculating the moving direction/velocity 1003 of the extracted moving object from the inclination of a slit 1030 and the inclination 1210 of the moving object;
  • FIG. 16 shows a data flow of motion analysis means 800 for realizing the aforementioned method
  • FIG. 17 shows a data flow of means 810 for creating an original background difference and a future background difference
  • FIG. 18 shows the summary of moving object region separation means 811 ;
  • FIG. 19 shows a flow chart of the moving object region separation means
  • FIG. 20 shows the summary of a method of extracting a moving object exclusively by separating/judging a background change portion and the moving object region relative to a frame image
  • FIG. 21 shows the summary of a method of extracting the background change exclusively by separating/judging the background change portion and the moving object region with respect to the frame image
  • FIG. 22 shows a data flow of moving object extraction means 900 for realizing the aforementioned method
  • FIG. 23 shows an example of the resultant display screen which is outputted on a display 300 by result output means 600 ;
  • FIG. 24 shows the moving object detection/extraction by the conventional method using the background difference
  • FIG. 25 shows a problem in the moving object detection/extraction by the conventional method using the background difference.
  • FIG. 2 shows one embodiment of the hardware construction for realizing the invention.
  • a TV camera 200 takes a scene to be monitored, transforms it into video signals 201 and transmits them to a computer 400 .
  • the video signals 201 are digitized for each frame and stored in the memory of the computer 400 .
  • This computer 400 reads out its memory content and follows the processing program stored at another address in the memory, to judge whether the pixels on the frame image belong to the background or to the moving object, to extract the moving object, and to estimate the moving direction/velocity.
  • the image of the moving object extracted and the other accompanying processed results are transmitted to a display 300 .
  • This display 300 outputs to the screen the results processed by the computer 400, such as the background image and the image and moving direction/velocity of the moving object. This information is transmitted through a network 210 to the display of a safety control unit or a monitoring center.
  • FIG. 1 shows one example of the system construction which is realized in the computer 400 .
  • This computer 400 includes video input means 500 , result output means 600 , background judgment means 700 , motion analysis means 800 and moving object extraction means 900 .
  • The video input means 500 transforms the video signals into digital image data 1000 for each frame and transmits them to the background judgment means 700, the motion analysis means 800 and the moving object extraction means 900.
  • the result output means 600 displays the processed results of the background judgment means 700 , the motion analysis means 800 and the moving object extraction means 900 , such as a background image 1002 , a moving direction/velocity 1003 and a moving object image 1004 on the display such that they can be easily observed by the user.
  • the background judgment means 700 judges whether each pixel on the digital image data 1000 belongs to the background, the moving object or the change in the background structure, and transmits a moving object period information 1001 to the motion analysis means 800 and the moving object extraction means 900 .
  • the moving object period information 1001 is a collection of the period (or interval) for each pixel, in which the moving object is judged to exist.
  • the background judgment means 700 transmits the background image 1002 or the accumulation of the pixels judged as the background to the result output means 600 . The detail of the background judgment method will be described with reference to FIGS. 3 to 12 .
  • the motion analysis means 800 calculates the moving direction/velocity 1003 of the moving object from the digital image data 1000 and the moving object period information 1001 and transmits them to the result output means 600 .
  • the detail of the method of calculating the moving direction/velocity of the moving object will be described with reference to FIGS. 13 to 19 .
  • the moving object extraction means 900 extracts the moving object image 1004 from the digital image data 1000 and the moving object period information 1001 and transmits it to the result output means 600 .
  • the detail of the extraction unit of the moving object image 1004 will be described with reference to FIGS. 20 to 22 .
  • FIG. 3 shows the relations among the movie, the slit image and the spatial-temporal image.
  • The movie (or motion picture) is constructed of a sequence of twenty-five to thirty still images per second, called frame images. This sequence is schematically shown as a movie 1010.
  • The movie 1010 is an arrangement of frame images from time T0 to time Tn.
  • A slit image 1040 is the collection of pixels contained in a segment, called the slit 1030, of a frame image 1020.
  • The arrangement of these slit images 1040 in chronological order for each frame is called the spatial-temporal image 1050, because the spatial-temporal image 1050 contains both temporal and spatial information.
  • A pixel having no temporal intensity change forms a line which flows horizontally in the temporal direction, as indicated by 1051.
  • Such a pixel having no temporal intensity change can be considered as belonging to the background.
  • The horizontal line may break even in the background, because the intensity of even a background pixel changes with an illuminance change, such as a change in the sunshine condition, or with the movement of an object constructing the background.
  • An object moving in the frame image appears as an image, as indicated by 1052, and usually forms no horizontal line.
  • The moving object forms a horizontal line only when it stands still on the slit or when the slit is placed parallel to the moving direction. The latter case can be included in the aforementioned case in which an object forming the background changes.
  • In short, the background appears as the horizontal line 1051, and everything else appears as the images 1052.
  • The cause of these changes 1052 is thought to be either the moving object or a background change.
  • In the invention, whether or not a pixel belongs to the background is judged on the basis of this characteristic of the spatial-temporal image 1050, to detect/extract the moving object.
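Building the spatial-temporal image amounts to stacking the slit pixels of every frame side by side. The following NumPy sketch is illustrative (the vertical slit, its position, and the toy frames are assumptions): rows are positions along the slit, columns are time, so an unchanging background pixel traces a horizontal line.

```python
import numpy as np

def spatiotemporal_image(frames, slit_x):
    """Stack the pixel column under a vertical slit at x = slit_x
    from every frame; rows are space, columns are time."""
    return np.stack([f[:, slit_x] for f in frames], axis=1)

# Three 4x4 frames of a constant background; an object crosses the
# slit in frame 1 and breaks the horizontal lines there.
frames = [np.full((4, 4), 10, dtype=np.uint8) for _ in range(3)]
frames[1][1, 2] = 200
st = spatiotemporal_image(frames, slit_x=2)
```

Row 0 of `st` stays at the background intensity across all columns (a horizontal line), while the object appears as an isolated bright pixel.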
  • FIG. 4 shows the relation in distance between the slit image of the background period and the current slit at each time.
  • The slit image 1040 is extracted from an interval of the spatial-temporal image 1050 in which neither the moving object nor the background changes, and is set as a background slit 1041.
  • The slit image 1040 extracted at any other time is set as a current slit 1042.
  • Consider the slit vector, in which the intensities of the individual pixels composing the slit image 1040 are set as the vector elements, and the distance between two slit vectors, as given by formula 1060. If this distance is determined for the current slit 1042 at each time of the spatial-temporal image 1050, there is obtained a graph given by a distance sequence 1070 in which the distances are arranged in chronological order.
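The slit-vector distance can be sketched as follows. Formula 1060 is not reproduced in this text, so a plain Euclidean distance between the intensity vectors is assumed here as one natural choice; this NumPy fragment is illustrative only.

```python
import numpy as np

def slit_distance(bg_slit, cur_slit):
    """Distance between two slit images viewed as intensity vectors
    (Euclidean distance is an assumption; formula 1060 is not
    reproduced in the text)."""
    a = bg_slit.astype(float).ravel()
    b = cur_slit.astype(float).ravel()
    return float(np.linalg.norm(a - b))

bg = np.array([50, 50, 50], dtype=np.uint8)       # background slit 1041
same = bg.copy()                                  # a current slit matching it
moving = np.array([50, 200, 50], dtype=np.uint8)  # a moving object on the slit
```

Evaluating this distance for every frame's current slit yields the distance sequence 1070: near zero while the scene matches the background slit, and large while an object crosses.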
  • The following facts can be derived from the characteristics of the distance sequence 1070 and the spatial-temporal image 1050, as described with reference to FIG. 3.
  • A flat portion of the distance sequence 1070 which has at least a constant length is predicted to belong to the background, so that it contains no moving object. In the other portions, which change more, it is thought that a moving object has passed or that the background has changed.
  • A flat portion having at least the constant length is defined as a background period 1071,
  • and the remaining portions are defined as a moving object period 1072.
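Detecting background periods as sufficiently long flat runs can be sketched as follows. The flatness tolerance and minimum run length are assumed parameters, not values from the patent; this fragment is illustrative only.

```python
def background_periods(dists, flat_tol=5.0, min_len=3):
    """Return (start, end) index pairs of runs in the distance
    sequence whose step-to-step variation stays within flat_tol for
    at least min_len samples: the candidate background periods 1071.
    Everything outside these runs is the moving object period 1072."""
    periods, start = [], 0
    for i in range(1, len(dists) + 1):
        if i == len(dists) or abs(dists[i] - dists[i - 1]) > flat_tol:
            if i - start >= min_len:
                periods.append((start, i))
            start = i
    return periods

# Flat (background), a bump (moving object passing), flat again.
seq = [0, 1, 0, 1, 120, 140, 130, 2, 1, 0, 1]
```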
  • FIG. 5 shows the spatial-temporal image 1050 and the distance sequence 1070 when the structure change of the background occurs.
  • Assume that a moving object 1100 drops a falling object 1200 onto the slit 1030, as shown in the movie 1010 of FIG. 5(a).
  • The background structure is changed by the falling object 1200, so the spatial-temporal image 1050 appears as shown in FIG. 5(b).
  • The image 1201 of the falling object 1200 appears just behind the spatial-temporal image 1101 of the moving object 1100.
  • This image 1201 becomes part of the background partway through, so that it forms a horizontal line, as shown in FIG. 5(b).
  • The distance sequence 1070 is determined, as shown in FIG. 5(c), by using the background slit 1041.
  • Both a background period 1073 and a background period 1074 are flat portions of the distance sequence 1070 having at least the constant length, so that they belong to the background period 1071.
  • In the background period 1073, the average value of the distance sequence is substantially zero.
  • The background period 1074, however, takes an average value above a certain level, because its slit is made different from the background slit 1041 by the image 1201 of the falling object.
  • In other words, the image 1201 of the falling object is detected as the difference from the background slit 1041.
  • Thus the average value of the distance sequence differs depending upon whether or not the slit image is identical to the background slit 1041. Therefore, the individual background periods 1071 will be defined as the true background period 1073 and the false background period 1074 so that they may be differentiated.
  • The cause of a false background period 1074 is thought to be not only a structure change of the background but also an abrupt illuminance change. For either cause, the occurrence of the false background period 1074 means that the background has changed.
  • Accordingly, the slit 1042 may be updated as a new background from the false background period 1074, to repeat the judgments by the distance sequence 1070.
  • Thus, the invention separates the intervals of the background and the moving object by deeming a flat period of at least a constant length in the distance sequence 1070 as the background period 1071 and the remainder as the moving object period 1072.
  • Among the background periods 1071, moreover, one having an average value approximate to zero is classified as the true background period 1073, whereas the others are classified as the false background period 1074.
  • For a false background period 1074, assuming that the background has been changed by an illuminance change or a structure change, the slit image 1042 of the false background period 1074 is updated as a new background, and the foregoing judging procedures are repeated.
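The true/false classification of a background period can be sketched as follows. The near-zero tolerance is an assumed parameter, and this NumPy fragment is illustrative only: a flat run whose mean distance stays near zero matches the held background slit (true period), while a flat run at a higher level means the background itself has changed (false period) and the slit should be updated from that interval.

```python
import numpy as np

def classify_period(dists, start, end, zero_tol=5.0):
    """Classify a background period [start, end): 'true' if its mean
    distance from the background slit is near zero, else 'false'
    (the background changed; update the slit from this interval)."""
    return "true" if np.mean(dists[start:end]) < zero_tol else "false"

# Flat near zero (true period), then flat at a higher level after a
# structure change such as the falling object 1200 (false period).
seq = [0, 1, 0, 1, 60, 61, 59, 60]
```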
  • This realizes a method of judging the presence of the moving object and the structure change of the background by constantly discriminating among three items: the moving object, the background, and the structure change of the background.
  • FIG. 6 explains the data flow of the background judgment means 700 for realizing the aforementioned method.
  • This background judgment means 700 includes slit image creation means 701 , background slit hold means 702 , distance calculation means 703 , distance sequence hold means 704 , distance sequence smoothing means 710 , smoothed time sequence hold means 705 , background period judgment means 720 , background true/false judgment means 730 , moving object period acquire means 740 and background structure change judgment means 750 .
  • The slit image creation means 701 creates the current slit 1042 to be judged, on the basis of the inputted digital image data 1000, and transmits it to the distance calculation means 703.
  • The background slit hold means 702 holds the background slit 1041, which is judged by the background true/false judgment means 730 or the background structure change judgment means 750, and transmits it in response to a demand from the distance calculation means 703.
  • The distance calculation means 703 calculates the distance in accordance with the formula 1060 by treating the current slit 1042 and the background slit 1041 as vectors.
  • The calculated distance is transmitted to the distance sequence hold means 704.
  • This distance sequence hold means 704 holds the calculated distances over a constant past time period so that they may be handled as a sequence.
  • the distance sequence 1070 is updated to discard the oldest value and contain the newest value each time the distance is newly transmitted. Moreover, the distance sequence hold means 704 transmits the distance sequence 1070 to the distance sequence smoothing means 710 in response to a demand from the distance sequence smoothing means 710 .
  • This distance sequence smoothing means 710 smoothes the distance sequence 1070, which is stored in the distance sequence hold means 704, by the moving average method. This is because small vibrations are frequently caused in the distance sequence by influences such as jitter. The distance sequence 1070 thus smoothed is transmitted to the smoothed sequence hold means 705.
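As a sketch of this smoothing step, the distance sequence can be averaged with a simple moving window. The window length below is an assumed value; the patent does not specify the number of taps.

```python
import numpy as np

def smooth_distance_sequence(distances, window=5):
    """Smooth the distance sequence 1070 with a moving average to
    suppress small vibrations caused by jitter. `window` is an
    assumed smoothing width."""
    d = np.asarray(distances, dtype=float)
    kernel = np.ones(window) / window
    # mode="same" keeps the sequence length, so the smoothed values
    # still line up with the original frame times.
    return np.convolve(d, kernel, mode="same")
```

Near the ends of the sequence the average runs over fewer real samples (missing neighbors are treated as zero), which is acceptable here because the judgment means below examine a recent, fully covered interval.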
  • This smoothed-sequence hold means 705 holds the smoothed latest distance sequence. This distance sequence is transmitted to the background period judgment means 720 , the background true/false judgment means 730 and the moving object period acquire means 740 in response to their individual demands.
  • the background period judgment means 720 searches the background period 1071 from the smoothed latest distance sequence 1070 and transmits the result as the interval to the background true/false judgment means 730 and the moving object period acquire means 740 , respectively.
  • This search for the background period 1071 is realized by judging the flat portion of the smoothed distance sequence 1070, as described with reference to FIG. 4.
  • the search algorithm will be detailed with reference to FIG. 7.
  • the background true/false judgment means 730 judges whether the background period is the true one 1073 or the false one 1074 , on the basis of the background period 1071 and the smoothed distance sequence 1070 . After this, the current slit image 1042 is transmitted to the background slit hold means 702 and the background structure change judgment means 750 in accordance with the judgment result. In the case of the true background period 1073 , the current slit image 1042 is transmitted as a new background slit 1041 to the background slit hold means 702 . In the case of the false background period 1074 , the current slit image 1042 is transmitted to the background structure change judgment means 750 to extract the structure change of the background. The algorithm for the true/false judgment will be detailed with reference to FIGS. 8 to 11 .
  • the moving object period acquire means 740 determines the maximum of the smoothed distance sequence 1070 within the interval in which the moving object is predicted to exist, and returns the number of moving objects predicted for the period and the time of the maximum portion as the moving object period.
  • the background structure change judgment means 750 judges whether the background change is caused by the structure change or the illuminance change, from both the background slit 1041 stored in the background slit hold means 702 and the current slit image 1042 transmitted from the background period true/false judgment means 730 , thereby to update the current slit image 1042 as a new background. This judgment algorithm will be detailed with reference to FIG. 12.
  • FIG. 7 shows the flow chart of the background period judgment means.
  • the smoothed sequence 1070 for a constant interval (e.g., the latest forty-five frames) is acquired (at Step 2001 ).
  • the maximum/minimum for the interval are acquired (at Step 2002 ) from the smoothed sequence 1070 . If the difference between the maximum and the minimum is over a predetermined threshold, it is judged that the period is not the background period 1071 , and the procedure is ended. If the difference is below the threshold, it is judged that the period is the background period 1071 , and the routine advances to Step 2004 (at Step 2003 ). At last, the leading and ending times of the sequence are returned as the interval (at Step 2004 ).
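Steps 2001 to 2004 above amount to a flatness test over the latest window of the smoothed sequence. A minimal sketch follows; the threshold value is an assumption.

```python
def judge_background_period(smoothed, threshold):
    """Steps 2001-2004: the interval is a background period 1071 when
    the smoothed distance sequence stays flat, i.e. the spread between
    its maximum and minimum is at most `threshold`."""
    hi, lo = max(smoothed), min(smoothed)   # Step 2002
    if hi - lo > threshold:                 # Step 2003: not flat enough
        return None                         # not a background period
    return 0, len(smoothed) - 1             # Step 2004: leading/ending times
```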
  • FIG. 8 shows the influences of the illuminance change upon the slit image.
  • Consider the slit 1040 in which the brightnesses of the individual pixels are given by P1 to Pn, as shown in FIG. 8( a ).
  • a graph is drawn to take the positions of pixels on the abscissa and the brightnesses of pixels on the ordinate, as indicated by 1046 in FIG. 8( b ).
  • Let the slit image 1040 be a vector v 1048 having the individual pixel brightnesses as its elements, as shown in FIG. 9( a ). If the base vectors of the individual pixels P1 to Pn are designated by b1, b2, b3, ..., and bn, the vector v 1048 can be expressed as one point in an n-dimensional vector space, as shown in FIG. 9( b ). Next, it is assumed that an abrupt illuminance change occurs for that vector v 1048 so that the slit vector changes into a slit vector v′ 1049, as shown in FIG. 9( c ). At this time, it can be deemed from the consideration of FIG. 8 that the changed slit vector v′ 1049 lies on substantially the same straight line as the vector v 1048 and is a scalar multiple of the vector v 1048.
  • the original slit vector 1048 and the slit vector 1049 changed by the illuminance have substantially identical directions even if they have highly different coordinate positions in the vector space.
  • the slit vector having a changed structure is predicted to be highly different not only in the coordinate position but also in the direction. In order to discriminate the illuminance change and the structure change of the slit 1040 , therefore, it is sufficient to consider the direction.
  • FIG. 10 shows the projections of the ordinary slit vector 1048 and the slit vector 1049, which is influenced by the illuminance change, upon a unit sphere.
  • the distance PQ between the projected vector P 1048 ′ of the vector v 1048 upon the unit sphere and the projected vector Q 1049 ′ of the vector v 1049 upon the unit sphere becomes far shorter than the original distance vv′.
  • This normalized intervector distance will be called the normalized distance so that it may be discriminated from the distance, as defined by the formula 1060 .
  • this normalized distance is utilized to discriminate whether the background change is caused by the structure change or the illuminance change.
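The normalized distance can be sketched directly from this definition: both slit vectors are projected onto the unit sphere (divided by their lengths) before the ordinary vector distance is taken.

```python
import numpy as np

def normalized_distance(u, v, eps=1e-12):
    """Distance between two slit vectors after projection onto the
    unit sphere. A pure (scalar) illuminance change leaves the
    direction of a vector almost unchanged, so this distance stays
    small; a structure change moves the direction and the distance
    grows. `eps` guards against all-black slits."""
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    u = u / (np.linalg.norm(u) + eps)
    v = v / (np.linalg.norm(v) + eps)
    return float(np.linalg.norm(u - v))
```

For example, doubling every brightness in a slit leaves the normalized distance near zero, while rearranging the pattern of bright and dark pixels does not.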
  • FIG. 11 shows the flow chart of the background true/false judgment means 730 .
  • the background period 1071 is acquired (at Step 2101 ) from the background period judgment means 720 .
  • an average value is acquired (at Step 2102 ) from the smoothed sequence of the background period 1071 . If the average value is below a predetermined threshold, the given background period 1071 is judged to be the true background period 1073 , and the routine advances to Step 2104 . If the average is over the threshold, the background period is judged to be false (at Step 2103 ), as at 1074 , and the routine advances to Step 2105 .
  • If at Step 2104 the background period 1071 is true, as at 1073, the routine is ended by storing the latest slit as the new background slit 1041 in the background slit hold means 702. If at Step 2105 the background period 1071 is false, as at 1074, the routine is ended by judging, by the background structure change judgment means 750, whether the falseness is due to the illuminance change or to the structure change.
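The true/false test of Steps 2102-2103 is a simple threshold on the average of the smoothed sequence over the background period; a sketch follows, with the threshold value an assumption.

```python
def judge_true_background(smoothed_period, threshold):
    """Steps 2102-2103: the background period 1071 is the true one
    1073 when the average smoothed distance stays at or below the
    threshold; otherwise it is the false one 1074."""
    average = sum(smoothed_period) / len(smoothed_period)
    return average <= threshold
```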
  • FIG. 12 shows the flow chart of the background structure change judgment means 750 .
  • the normalized distance between the background slit 1041 and the latest current slit 1042 in the smoothed sequence is determined (at Step 2201 ). If this normalized distance is below a predetermined threshold, the change is judged to be caused by the illuminance change, and the routine advances to Step 2203. If it is over the threshold, the change is judged (at Step 2202 ) to be caused by the structure change, and the routine advances to Step 2204.
  • at Step 2203, the background period 1071 is judged to be the false background period 1074 due to the illuminance change.
  • the routine is ended by setting the value in the smoothed sequence to zero and by storing the current slit 1042 as the new background slit 1041 in the background slit hold means 702 .
  • at Step 2204, the background period 1071 is judged to be the false background period 1074 due to the structure change.
  • the routine is ended by storing the latest current slit 1042 as the new background slit 1041 in the background slit hold means 702 and by setting all values in the smoothed sequence to zero.
  • FIG. 13 summarizes the method of extracting the moving object exclusively by separating/judging the background change region and the moving object region from the spatial-temporal image.
  • FIGS. 14 and 15 summarize the method of calculating the moving direction and velocity.
  • FIGS. 16 to 19 explain the motion analysis means for realizing those methods.
  • FIG. 13 summarizes the method of extracting the moving object exclusively by separating/judging the background change region and the moving object region from the spatial-temporal image 1050 .
  • a spatial-temporal image 1053 in the interval where the moving object is thought to exist is cut out of the spatial-temporal image 1050 on the basis of the moving object interval 1001 , and the original background slit 1041 and a future background slit 1041 ′ are acquired.
  • an original background differential image 1054 is created from the original background slit 1041 and the spatial-temporal image 1053
  • a future background differential image 1055 is created from the future background slit 1041 ′ and the spatial-temporal image 1053 .
  • This background differential image contains not only a moving object region 1102 but also the differential region 1202 between the background slit 1041 and the background structure change.
  • the logical product of the original background differential image 1054 and the future background differential image 1055 is determined to extract the moving object region 1102 .
  • the differential region 1202 from the background structure change is canceled so that only the moving object region 1102 or the common region can be extracted.
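With binary differential images, the logical product that cancels the background structure change region is a single element-wise AND; a sketch assuming boolean NumPy masks:

```python
import numpy as np

def merge_background_differences(orig_diff, future_diff):
    """A background structure change appears in only one of the two
    differential images (relative to the original or to the future
    background), while the moving object region 1102 appears in both.
    The element-wise AND therefore keeps only the moving object."""
    return np.logical_and(orig_diff, future_diff)
```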
  • FIG. 14 summarizes the slit setting method for analyzing the motions of the moving object in the movie and the method of calculating the moving direction/velocity 1003 of the moving object 1101 which is extracted from the spatial-temporal image 1050 obtained by the slit setting method.
  • the moving object 1101 is inclined forward or backward in the spatial-temporal image 1050 thus obtained, as shown in FIG. 14. This is because one of the upper and lower sides of the moving object 1101 reaches the slit 1030 earlier than the other. As a result, even with a single slit, it can be determined from the positive or negative sign of the inclination whether the moving object has moved from the left or from the right. From the magnitude of the inclination 1210, moreover, the average velocity at which the object crosses the slit 1030 can be calculated.
  • the moving direction/velocity 1003 of the moving object 1101 can be calculated according to the invention.
  • the moving object region 1201 is extracted from the spatial-temporal image 1050 which is obtained from the slit 1030 set at the inclination, and this inclination 1210 is calculated from the moment of the region to estimate the moving direction and velocity.
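The patent only says the inclination 1210 is calculated "from the moment of the region"; one standard way to do this, sketched below, is the principal axis orientation obtained from second-order central moments of the binary region.

```python
import numpy as np

def region_inclination(mask):
    """Estimate the inclination of a binary moving object region in
    the spatial-temporal image from its second-order central moments
    (principal axis orientation), returned in radians."""
    ys, xs = np.nonzero(mask)
    x = xs - xs.mean()
    y = ys - ys.mean()
    mu20 = (x * x).mean()    # second-order central moments
    mu02 = (y * y).mean()
    mu11 = (x * y).mean()
    return 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
```

A region lying along the main diagonal of the image, for instance, comes out at 45 degrees; the sign of the angle distinguishes a forward from a backward lean, i.e. the moving direction.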
  • FIG. 15 explains the principle for calculating the moving direction/velocity 1003 of the extracted moving object from the inclination of the slit 1030 and the inclination 1210 of the moving object 1101.
  • the inclination of the slit 1030 is designated by θ. It is assumed that the moving object 1100 having a horizontal velocity v passes the slit 1030. If the moving object 1100 has a height h and if it moves by w after its upper portion passes the slit and before its lower portion passes the slit, the horizontal moving velocity v is expressed by a formula 1610. Next, the inclination of the image 1101 of the moving object in the spatial-temporal image 1050 is designated by φ.
  • the frame number s for the moving object to move by w in the frame image is described by a formula 1620 .
  • a formula 1630 is obtained if the formulas 1610 and 1620 are rearranged for v.
  • the positive and negative signs of v indicate the directions, and the absolute value indicates the magnitude of the horizontal velocity component.
  • the moving direction/velocity 1003 are calculated from the inclination of the slit 1030 and the inclination of the image 1101 of the moving object 1100 in the spatial-temporal image 1050 .
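Formulas 1610-1630 are not reproduced in this text; one consistent reading of the geometry, with the spatial axis in pixels and time in frames, is: the object of height h travels w = h·tan(θ) between its upper and lower edges crossing the inclined slit (formula 1610), that travel takes s = h·tan(φ) frames when the object image is inclined by φ (formula 1620), so v = tan(θ)/tan(φ) pixels per frame (formula 1630). The names theta/phi and the frame-rate conversion below are assumptions of this sketch.

```python
import math

def horizontal_velocity(theta, phi, frame_rate=30.0):
    """Horizontal velocity of the moving object from the slit
    inclination `theta` and the inclination `phi` of the object image
    in the spatial-temporal image. The sign of the result gives the
    moving direction; the magnitude is the horizontal component."""
    v_pixels_per_frame = math.tan(theta) / math.tan(phi)
    return v_pixels_per_frame * frame_rate   # pixels per second
```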
  • FIG. 16 shows the data flow of the motion analysis means 800 for realizing the aforementioned method.
  • This motion analysis means 800 includes spatial-temporal image creation means 801 , background slit acquire means 802 , background difference creation means 810 and 810 ′, background difference merging means 803 and merged background difference inclination judgment means 820 .
  • These spatial-temporal image creation means 801 , background slit acquire means 802 , background difference merging means 803 and merged background difference inclination judgment means 820 realize the method of extracting the moving object 1101 exclusively by separating/judging the background change region 1202 and the moving object region 1102 from the spatial-temporal image 1050 , as described with reference to FIG. 13.
  • the spatial-temporal image creation means 801 creates the spatial-temporal image 1053 in the interval, for which the moving object 1101 exists, by acquiring the slit images from the digital image data 1000 and the moving object interval 1001 and by arranging them in the frame order.
  • the spatial-temporal image 1053 is transmitted in response to the demands from the background difference creation means 810 and 810 ′.
  • the background slit acquire means 802 acquires the original background slit 1041 and the future background slit 1041 ′ from before and behind the interval, for which the moving object 1101 exists, on the basis of the digital image data 1000 and the moving object interval 1001 .
  • the background difference creation means 810 and 810 ′ create the original background difference 1054 and the future background difference 1055 from the spatial-temporal image 1053, the original background slit 1041 and the future background slit 1041 ′.
  • the detail of the background difference creation algorithm will be described with reference to FIG. 17.
  • the background difference merging means 803 creates a merged background difference 1056 from the logical product of the created original background difference 1054 and future background difference 1055 . Only the moving object 1101 is extracted by the procedure described above.
  • the merged background difference inclination judgment means 820 realizes the method of calculating the moving direction/velocity 1003 of the extracted moving object from the inclination of the slit 1030 and the inclination 1210 of the moving object, as described with reference to FIGS. 14 and 15 .
  • FIG. 17 shows the construction (or data flow) of the means 810 for creating the background difference by using the original background and the future background.
  • the background difference creation means 810 includes moving object region separation means 811 , moving object region morphology means 812 , noise region elimination means 813 and occluded region supply means 814 .
  • the moving object region separation means 811 makes only the moving object region 1201 binary to separate/extract it from either the background slit image 1041 and the current slit image 1042 or the background frame image and the current frame image. The detail of this separation/extraction algorithm will be described with reference to FIGS. 18 and 19.
  • the moving object region morphology means 812 , the noise region elimination means 813 and the occluded region supply means 814 correct the rupture or segmentation of the moving object region, as caused by the moving object region separation means 811 .
  • the moving object region morphology means 812 connects the ruptured or segmented moving object region 1201 by morphological operations, applied about three times.
  • the noise region elimination means 813 eliminates the minute regions independent of the morphologically connected moving object region 1201 by deeming them as noise.
  • the occluded region supply means 814 searches for and smears (fills) the holes contained in the moving object region 1201 .
  • the moving object 1200 is cut out as the moving object region 1201 to create the background differential image.
  • this cutout can be made by the processings using the changed background image in place of the current image.
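The three correction means can be sketched with elementary binary morphology. The closing below (dilation followed by erosion, repeated about three times as for means 812) connects ruptured parts; real noise-region elimination (813) and hole filling (814) would typically use a connected-component library, omitted here for brevity.

```python
import numpy as np

def binary_close(mask, iterations=3):
    """Morphological closing of a binary moving object region with a
    4-neighbour structuring element: dilation connects ruptured or
    segmented parts, and the following erosion restores the outline.
    Pixels outside the image are padded with False for the dilation."""
    def dilate(m):
        p = np.pad(m, 1, constant_values=False)
        return (p[1:-1, 1:-1] | p[:-2, 1:-1] | p[2:, 1:-1]
                | p[1:-1, :-2] | p[1:-1, 2:])

    m = np.asarray(mask, dtype=bool)
    for _ in range(iterations):
        m = dilate(m)
    for _ in range(iterations):
        m = ~dilate(~m)     # erosion as the dual of dilation
    return m
```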
  • FIG. 18 summarizes the moving object region separation means 811 .
  • the background slit 1041 and the current slit 1042 are acquired.
  • the background slit 1041 and the current slit 1042 are compared for each pixel to judge whether they belong to the background or the moving object. This comparison does not resort to the brightness of the corresponding pixel alone; instead, a local slit composed of a w-number of pixels including the corresponding one is created, and the judgment is made by determining the normalized distance between the local slits.
  • a local slit 1044 containing the target pixel Pn 1045 and a corresponding background local slit 1043 are created to determine the normalized distance between the two. Because this normalized distance is used, the background can be correctly judged and eliminated even if an illuminance change of its portion is caused by the shadow of the moving object or by the light.
  • FIG. 19 shows the flow chart of the moving object region separation means 811 .
  • the current slit 1042 to be judged is acquired from the movie 1010 (at Step 2301 ).
  • the routine advances to Step 2303 if there is any unjudged pixel; otherwise the routine is ended (at Step 2302 ).
  • the two local slits 1043 and 1044 are acquired sequentially from above (at Steps 2303 and 2304 ) from the background slit 1041 and the current slit 1042 .
  • the dispersion of the background local slit 1043 is determined and is compared with the predetermined threshold TV.
  • the routine advances to Step 2306 , if below the threshold TV, and otherwise to Step 2307 (at Step 2305 ).
  • the normalized distance is determined (at Step 2306 ) as the distance between the two local slits 1043 and 1044 .
  • the vector distance is determined (at Step 2307 ) as the distance between the two local slits 1043 and 1044 .
  • the distance between the two local slits thus determined is compared with a predetermined threshold TD.
  • the routine advances to Step 2309 , if below the threshold TD, and otherwise to Step 2310 (at Step 2308 ).
  • if the distance is below the threshold, the target pixel is judged to belong to the background (at Step 2309 ); otherwise it is judged to belong to the non-background (at Step 2310 ).
  • the procedure described above is returned to Step 2302 and is repeated.
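Steps 2301 to 2310 can be collected into a per-pixel loop; the local slit width w and the thresholds TV and TD below are assumed values.

```python
import numpy as np

def separate_moving_object(background_slit, current_slit,
                           w=5, tv=1.0, td=0.1):
    """Per-pixel background / non-background judgment of the moving
    object region separation means. For each target pixel, local
    slits of width `w` are taken from the background slit 1041 and
    the current slit 1042 (Steps 2303-2304); if the dispersion of the
    background local slit is below TV, the normalized distance is
    used (Step 2306), otherwise the plain vector distance (Step 2307);
    pixels whose distance exceeds TD are marked as non-background
    (Steps 2308-2310). Returns True where the pixel belongs to the
    moving object."""
    bg = np.asarray(background_slit, dtype=float)
    cur = np.asarray(current_slit, dtype=float)
    half = w // 2
    out = np.zeros(len(cur), dtype=bool)
    for i in range(len(cur)):
        lo, hi = max(0, i - half), min(len(cur), i + half + 1)
        b, c = bg[lo:hi], cur[lo:hi]
        if b.var() < tv:
            # Flat background: compare directions, robust to a local
            # illuminance change such as a shadow.
            bn = b / (np.linalg.norm(b) + 1e-12)
            cn = c / (np.linalg.norm(c) + 1e-12)
            dist = np.linalg.norm(bn - cn)
        else:
            dist = np.linalg.norm(b - c)
        out[i] = dist > td
    return out
```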
  • FIGS. 20 and 21 explain the method of extracting the moving object and the background by separating/judging the background change region and the moving object region from the frame image.
  • FIG. 20 summarizes the method of extracting the moving object exclusively by separating/judging the background change region and the moving object region from the frame image.
  • an original background image 1801 , a future background image 1803 and a current frame image 1802 are acquired.
  • Let a moving object 1804 and a falling object 1805 be projected on the current frame image 1802, and let another falling object 1806 be projected, in addition to the falling object 1805, on the future background image 1803.
  • an original background difference 1807 between the original background image 1801 and the current frame image 1802 and a future background difference 1808 between the future background image 1803 and the current frame image 1802 are created. A moving object region 1809 and a background change region 1810 by the falling object 1805 appear in the original background difference 1807.
  • in the future background difference 1808, there appear the moving object region 1809 and a background change region 1811 by the falling object 1806.
  • FIG. 21 summarizes the method of extracting the background change exclusively by separating/judging the background change region and the moving object region from the frame image.
  • the original background image 1801 and the future background image 1803 are acquired.
  • in the future background image 1803, the falling object 1805 and the falling object 1806 are reflected.
  • a background difference 1901 between the original background image 1801 and the future background image 1803 is created.
  • in this background difference 1901, there appear the background change region 1810 by the falling object 1805 and the background change region 1811 by the falling object 1806.
  • if a cutout is made from the future background image 1803 by using the background difference 1901 as the mask image, it is possible to obtain a background structure change image 1902 containing only the falling objects 1805 and 1806.
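The cutout step, both here and for the moving object image 1804 in FIG. 22, is an element-wise masking; a sketch with NumPy:

```python
import numpy as np

def cut_out(image, mask):
    """Cut a region out of an image by using a binary difference
    image as the mask: pixels outside the mask are set to zero."""
    return np.where(np.asarray(mask, dtype=bool), image, 0)
```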
  • FIG. 22 shows the construction (or data flow) of the moving object extraction means 900 for realizing the aforementioned method.
  • the moving object extraction means 900 is constructed to include the six components of frame image acquire means 901 , background image creation means 902 , background difference creation means 910 and 910 ′, background difference merging means 903 and moving object cutout means 904 . These means realize the method of extracting only the moving object by separating/judging the background change region and the moving object region, as described with reference to FIG. 20.
  • the frame image acquire means 901 acquires the frame image 1802 of the interval, for which the moving object 1100 seems to exist, from the moving object interval 1001 and the digital image data 1000 , and transmits it to the background difference creation means 910 and 910 ′.
  • the background image creation means 902 acquires the frame image of the interval, which is judged as the background, from the moving object interval 1001 and the digital image data 1000 , and transmits it as the original background image 1801 and the future background image 1803 to the background difference creation means 910 and 910 ′.
  • These background difference creation means 910 and 910 ′ repeat the processings of the background difference creation means 810 and 810 ′, as described with reference to FIG. 17.
  • the background difference merging means 903 creates the merged background difference 1812 from the logical product of the original background difference 1807 and the future background difference 1808 of the frame image and transmits it to the moving object cutout means 904 .
  • This moving object cutout means 904 uses the merged background difference 1812 as a mask image to cut the moving object image 1804 out of the frame image 1802 .
  • FIG. 23 shows an example of the result display screen which is outputted onto the display 300 by the result output means 600 .
  • the result display screen 2000 is constructed to include at least the four components of an input image movie display region 2010 , a background change representative screen display region 2020 , a moving object representative screen display region 2030 and a correlated value sequence display region 2040 .
  • the slit 1030 is placed upright in the middle of the movie 1010 ; a representative screen 2032 of the moving object having passed through the slit 1030 is displayed in the moving object representative screen display region 2030 , and representative screens 2022 and 2023 of the changes in the background are displayed in the background change representative screen display region 2020 .
  • the sequence (or the distance sequence 1070 ) of the correlated values of the slit is displayed in the correlated value sequence display region 2040 to indicate to the user the grounds for judging the presence of the moving object and the background change.
  • the input movie display region 2010 is a portion for displaying the present movie 1010 which is inputted from the TV camera 200 .
  • the background change representative screen display region 2020 is a portion for displaying the background representative screen 2022 before change and the background representative screen after change by detecting the structure change of the background in the movie 1010 .
  • the detected change in the background structure is displayed as a pair of upper and lower parts of the background representative screen 2022 before change and the background representative screen 2023 after change in a background change display window 2021 so that their difference may be judged by the user.
  • This screen example shows that the background change is exemplified by the parking of an automobile or an object fallen from a truck.
  • the background change display window 2021 is provided with a scroll bar so that the change in the background structure thus far detected may be observed.
  • a marker 2024 is attached so that the group of the latest representative screens may be quickly understood.
  • the moving object representative screen display region 2030 is a portion for displaying the representative screen 2032 projecting the moving object by detecting this object in the movie 1010 .
  • the detected moving object is displayed in a moving object representative screen display window 2031 so that it may be judged by the user.
  • This screen example shows that the moving object is exemplified by a motorbike, a black automobile, a white automobile, a gray automobile or a truck.
  • the moving object representative screen display window 2031 is provided with a scroll bar so that the representative screen of the moving objects thus far detected may be observed.
  • a marker 2033 is attached so that the latest moving object may be discriminated.
  • the correlated value sequence display region 2040 is a portion for displaying both the spatial-temporal image 1050 obtained from the slit 1030 and the sequence 1070 of the correlated values (or distances) at the corresponding time.
  • the pixels and graph values of the spatial-temporal image 1050 and the distance sequence 1070 at the latest time are always displayed at the right-hand side of a correlated value sequence display window 2041 .
  • also displayed are a moving object detection marker 2042 , indicating the position on the spatial-temporal image at the time of detecting the moving object, and a background change detection marker 2043 , indicating the position on the spatial-temporal image at the time of detecting the background change.
  • in the description thus far, the window area of interest on the movie 1010 has been exemplified by the slit 1030 .
  • in the processings of the background judgment means 700 and the moving object extraction means 900 , however, essentially the same operations are performed for an assembly of a plurality of adjacent pixels, even if the shape is different from that of the slit 1030 .
  • the window area of interest may therefore have a square, circular or concentric shape.
  • for example, a movie of ten-odd hours, as obtained from a TV camera attached to the entrance of a house or office, is judged with the correlated value sequence of the entire frame image, so that a list of the representative images of visitors or delivered parcels may be extracted.
  • the background and the moving object can be judged so that the moving object can be exclusively detected, even under the complicated background having a change in the illuminating condition or a structure change.
  • as to the moving object to be extracted, no restriction is exerted upon its shape, color, moving direction or velocity.
  • the moving direction and velocity can be calculated.
  • the object to be processed is only several percent of the pixels in the movie, so that the processing is ten times or more as fast as that of the moving object extraction apparatus of the prior art.
  • the amount of memory to be used can also be reduced to several percent.
  • the real-time processing can thus be achieved even by an inexpensive computer such as a personal computer.

Abstract

A moving object is detected from a movie. The actual movie has a complicated background. In order to detect the moving object, the invention is constructed to comprise, in addition to means 500 for inputting the movie and a display 300 for outputting the processed result: means 700 for judging the interval which is predicted to belong to the background as to a pixel region in the movie; means 800 for extracting the moving object; and means 900 for calculating the moving direction and velocity of the moving object.
Thanks to the above-specified construction, even under a complicated background in which not only a change in the illumination condition but also a structure change will occur, the presence of the structure change of the background can be judged to detect/extract the moving object in real time. Moreover, the moving direction and velocity of the moving object can also be calculated.

Description

    BACKGROUND OF THE INVENTION
  • The invention relates to moving object detection apparatus and method for monitoring the movie which is inputted with a camera, to measure the traffic flows on roads, to detect failures on railroads/crossings, and to prevent crimes in banks or convenience stores. [0001]
  • At present, various places such as the roads, the crossings or the service floors of banks are monitored with camera movies. This technique is intended to prevent traffic jams, accidents or crimes in advance by monitoring objects moving in a specified place (as will be called the “moving bodies” or “moving objects”). In the traffic flow surveys frequently conducted on roads, for example, the statistical data on the traffic flows can be collected by monitoring how many automobiles, motorbikes, bicycles or pedestrians pass the monitoring area and by classifying the traffic flows into various categories. In the monitoring of the traffic jams on the roads, the accidents at the crossings or the service floors of banks or convenience stores, on the other hand, the accidents or crimes can be prevented in advance by detecting failures such as the jams, the stops of automobiles due to engine stalls, the falling objects or the suspicious behaviors of customers. Thus, there is a strong need for movie-monitoring the moving objects. At its present technical level, however, this movie-monitoring cannot do without manual labor, which causes problems of high cost and the easy introduction of human error. In this environment, automation of the monitoring by computers or the like is desired, and various methods have been proposed using models or templates. [0002]
  • The actual case of movie-monitoring frequently occurs not indoors but outdoors. As a result, the objects and backgrounds are strongly influenced by the climate conditions such as rainfalls or snowfalls and the illumination conditions such as the sunshine or street lights. By the shadow of the environment or the reflection of light due to rainfall, for example, the apparent shapes are highly changed. When the illumination changes from the sunlight to a mercury lamp, moreover, the contrast in brightness or color between the target to be monitored and the background will change. Even the movie at the same location changes in its image characteristics with the season or the time of day. It frequently follows that an effective characteristic quantity can be extracted under one condition but not under another. Thus, under a complicated background, the monitoring has very low reliability depending upon the kind of characteristic quantity used in the recognition algorithm, so that its practical application is difficult. [0003]
  • The asymptotic illumination change among those problems coming from the complicated backgrounds is solved by the moving object detection method using the background difference. This background difference is the method of separating/extracting only a moving object by taking a difference between the background image reflecting only the background and the frame image containing the moving object, exploiting the fact that the background will hardly change in a movie taken with a fixed camera. The background image is automatically acquired by the method of determining and using the medians and modes of the intensity of each pixel along the time axis. FIG. 24 simply shows the principle of the moving object detection method using the background difference. If a background image 100 is given in advance for a scene 110 to be monitored, a moving object 111 can be separated/extracted as a scene change 121 from the differential image 120 between the background image 100 and the scene 110. [0004]
  • A feature of this method is its robustness to the monitoring place. This is because any complicated background, such as a utility pole 101, is deleted by the differential operation as long as the camera has no motion. The prior art of the moving object detection method according to the background difference is exemplified by 1) IEICE Trans. D-II, Vol. J72-DII, No. 6, pp. 855-865, 1989, 2) IPSJ SIG-Notes, CV 75-5, 1991, and 3) IEICE Trans. D-II, Vol. J77-DII, No. 9, pp. 1716-1726, 1994. [0005]
  • However, the method of the prior art has a problem in that it is weak against structure changes of the background. FIG. 25 simply shows this problem. For example, assume that a stopped object 131 appears in the scene of the background image 100 and causes a structure change of the background, as shown in a scene 130. According to the method of the prior art, the stopped object 131 is extracted as a change 141, as indicated in a differential image 140 between the background image 100 and the scene 130. However, it is impossible to discriminate whether the change 141 is caused by a moving object or by a structure change of the background. In a scene 150 on and after the structure change of the background, therefore, the structure change and the moving object merge into each other, as in the region 161 in a scene 160, so that they cannot be separated even after a moving object 151 has passed. [0006]
  • A number of structure changes of the background occur in actual movie monitoring. For example, an automobile that has been passing along a road may stop at a parking meter on the road edge and form part of a new background. On the contrary, an object that has been stopped at the parking meter may move away, making the previously hidden region a portion of the new background. When a passing object drops something onto the road, the falling object may also form part of the new background. In addition, an object passing through snow may leave its tracks. [0007]
  • Thus, the method of the prior art using the background difference cannot cope with the structure change of the background. This is because it is impossible to discriminate whether a portion having a changed background structure belongs to the moving object or to a new background region. For this discrimination, it is conceivable to execute a motion analysis of the moving object. For a motion analysis algorithm such as the optical flow, however, the number of moving objects has to be known in advance. Once the number of moving objects is mis-recognized, the subsequent processing will find it difficult not only to separate the background change region but even to detect the presence of the background change itself. [0008]
  • Another problem is that the separation/extraction of the moving object is unstable. This is because, for the aforementioned reason, the background change region and the moving object region cannot always be correctly discriminated even if the presence of a background change can be judged. When a parcel is dropped from a moving object and left on the road, for example, the moving object region is also updated as background if the change in the new background caused by the falling object is detected and the background is updated. As a result, an artifact remains in the region where the moving object was present at the time of the background update. Thus, after the structure change of the background, the moving object cannot be correctly separated/extracted from the background, making it difficult to continue the monitoring process. [0009]
  • In order to solve the problems thus far described, the invention has the following three objects. [0010]
  • A first object is to judge whether a pixel region of interest belongs to the background or to the moving object, and thereby to judge the kind of the background change, if any. [0011]
  • A second object is to extract only the moving object by separating/judging the background change region and the moving object region. [0012]
  • A third object is to easily calculate the moving direction or velocity of the moving object extracted. [0013]
  • SUMMARY OF THE INVENTION
  • First of all, the invention comprises, as its basic component means, means for inputting a movie, means for extracting/detecting a moving object, and means for outputting the processed result as the movie. [0014]
  • Next, the following means are provided for realizing the judgment of the presence of the structure change in the moving object and the background for a predetermined pixel region according to the first object. [0015]
  • The means are: means for acquiring the pixel region to be judged for the background, from the movie; means for calculating the correlation between the pixel region at a reference time and the pixel region of each frame; means for holding the calculated correlation values sequentially; means for judging the interval which is predicted to belong to the background because of the absence of the moving object; and means for deciding the interval for which the moving object is present, from the interval which is predicted to belong to the judged background. Moreover, the means for deciding the interval for which the moving object is present comprises: means for judging the presence of the background change from the interval which is predicted to belong to the judged background; means for classifying the background change into the illuminance change or the structure change; and means for deciding the interval for which the moving object is present, from the interval which is predicted to belong to the judged background. [0016]
  • The following means are provided for realizing the extraction of only the moving object by separating/judging the background change region and the moving object region according to the second object. [0017]
  • The means are: means for acquiring, from the movie, the frame image containing the moving object and the two images in which only the background is projected (that is, the original and future background images before and after the interval for which the moving object is present), as located before and after that interval; means for creating the original background differential image and the future background differential image from the frame image and the original and future background images; means for determining the merged region by a logical product of the original background differential image and the future background differential image; and means for cutting the moving object image out of the frame image and the merged region. [0018]
  • The following means are provided for easily calculating the moving direction or velocity of the extracted moving object according to the third object. [0019]
  • The means are: means for cutting out the spatial-temporal image of the interval for which the moving object is present; means for separating, from the spatial-temporal image, the slit images of only the two backgrounds (that is, the original background slit image and the future background slit image before and after the interval for which the moving object is present), as located before and after that interval, and the moving object region; means for correcting the moving object region by the morphology processing and the hole-fill processing; means for determining a common merged region from the two corrected background differential images; and means for estimating the direction/velocity of the moving object by calculating the inclination of the obtained merged region. [0020]
  • The other characteristic moving object detection apparatus and method will become apparent from the description to be made in the following. [0021]
  • On the basis of the movie inputted by the movie input means, according to the invention, the moving object is detected by the following procedure. [0022]
  • First of all, the structure changes in the moving object and the background are judged for a specific pixel region. The pixel region to be judged for the background is acquired from the movie, and for each frame the correlation with the pixel region at a reference time is calculated. The correlation values thus calculated can be handled as a sequence. Next, for the sequence of correlation values, the interval which is predicted to belong to the background because of the absence of the moving object is judged. Whether or not the background has changed during the interval predicted to belong to the background is judged, and the judged background change is classified into the illuminance change or the structure change. At last, the interval for which the moving object is present is decided from the interval predicted to belong to the background. [0023]
  • Next, only the moving object is extracted by separating/judging the background change region and the moving object region. First of all, the frame image containing the moving object is acquired on the basis of the interval for which the moving object is present. Next, the original background image and the future background image, as located before and after the interval for which the moving object is present, are acquired from the movie. Next, the original background differential image and the future background differential image are created from the frame image and the original and future background images. The merged region is determined by the logical product of the original background differential image and the future background differential image, and the moving object image is then cut out of the frame image and the merged region. [0024]
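  • This merging step can be sketched roughly in Python (hypothetical names; one-dimensional slit-sized images, with the dropped object already part of the future background). The logical product of the two differential masks keeps only pixels that differ from both backgrounds, i.e. the moving object:

```python
def diff_mask(frame, background, threshold=10):
    """Binary background-differential image."""
    return [abs(p - b) > threshold for p, b in zip(frame, background)]

def merged_region(frame, bg_before, bg_after, threshold=10):
    """Logical product of the original and future background differences."""
    d_orig = diff_mask(frame, bg_before, threshold)
    d_future = diff_mask(frame, bg_after, threshold)
    return [a and b for a, b in zip(d_orig, d_future)]

bg_before = [10, 10, 10, 10]    # original background
bg_after  = [10, 10, 200, 10]   # dropped object now forms the background at pixel 2
frame     = [10, 150, 200, 10]  # moving object at pixel 1, dropped object visible
region = merged_region(frame, bg_before, bg_after)
# region -> [False, True, False, False]: the dropped object is suppressed
```

The dropped object differs from the original background but not from the future one, so the logical product removes it, which is exactly the separation of the structure change from the moving object described above.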
  • Then, the moving direction or velocity of the extracted moving object is calculated simply. First of all, the spatial-temporal image of the interval for which the moving object is present is cut out. Next, the moving object region is separated from the background slit image and the spatial-temporal image. The moving object region is corrected by the morphology and hole-fill processings. The merged region is determined from the logical product of the corrected original background differential image and future background differential image, and the inclination of the merged region is then calculated to estimate the direction/velocity of the moving object. [0025]
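  • As an illustrative sketch of this inclination estimate (hypothetical names; the merged region is given as per-frame boolean masks along the slit, and the drift of its centroid per frame stands in for the inclination of the region in the spatial-temporal image):

```python
def estimate_velocity(merged_regions):
    """Approximate the inclination of the merged region in the
    spatial-temporal image by the drift of its centroid (pixels/frame)."""
    centroids = []
    for t, mask in enumerate(merged_regions):
        idx = [i for i, on in enumerate(mask) if on]
        if idx:
            centroids.append((t, sum(idx) / len(idx)))
    (t0, c0), (t1, c1) = centroids[0], centroids[-1]
    return (c1 - c0) / (t1 - t0)

# Object crossing a 3-pixel slit at one pixel per frame:
masks = [
    [True, False, False],
    [False, True, False],
    [False, False, True],
]
v = estimate_velocity(masks)  # -> 1.0 pixel per frame
```

The sign of the result gives the moving direction along the slit, and its magnitude the velocity, matching the role the inclination 1210 plays in FIG. 15.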
  • At last, the aforementioned processed results are displayed on the display by the result output means. [0026]
  • Still further advantages of the present invention will become apparent to those of ordinary skill in the art upon reading and understanding the following detailed description of the preferred and alternate embodiments.[0027]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The invention will be described in conjunction with certain drawings which are for the purpose of illustrating the preferred and alternate embodiments of the invention only, and not for the purpose of limiting the same, and wherein: [0028]
  • FIG. 1 shows a hardware construction for realizing the invention; [0029]
  • FIG. 2 shows a system construction for realizing the invention; [0030]
  • FIG. 3 shows relations among a movie, a slit image and a spatial-temporal image; [0031]
  • FIG. 4 shows a relation in distance between a background slit 1041 and a current slit 1042 at each time; [0032]
  • FIG. 5 shows a spatial-temporal image 1050 and a sequence of distances when a structure change occurs in the background; [0033]
  • FIG. 6 explains a data flow of background judgment means 700; [0034]
  • FIG. 7 shows a flow chart of background period judgment means 720; [0035]
  • FIG. 8 shows the influences of an illuminance change upon a slit image 1040; [0036]
  • FIG. 9 shows the influences of an illuminance change upon a slit vector when the slit image 1040 is deemed as a vector; [0037]
  • FIG. 10 shows the mapping of the ordinary slit vector and the slit vector, as influenced by the illuminance change, upon a unit sphere; [0038]
  • FIG. 11 shows a flow chart of background true/false judgment means 730; [0039]
  • FIG. 12 shows a flow chart of background structure change judgment means 750; [0040]
  • FIG. 13 shows the summary of a method of extracting a moving object 1100 exclusively by separating/judging a background change portion and a moving object region from a spatial-temporal image 1050; [0041]
  • FIG. 14 shows a slit setting method for analyzing the motion of the moving object 1100 in the movie 1010, and the spatial-temporal image 1050 obtained by the method; [0042]
  • FIG. 15 explains the summary of a method for calculating the moving direction/velocity 1003 of the extracted moving object from the inclination of a slit 1030 and the inclination 1210 of the moving object; [0043]
  • FIG. 16 shows a data flow of motion analysis means 800 for realizing the aforementioned method; [0044]
  • FIG. 17 shows a data flow of means 810 for creating an original background difference and a future background difference; [0045]
  • FIG. 18 shows the summary of moving object region separation means 811; [0046]
  • FIG. 19 shows a flow chart of the moving object region separation means; [0047]
  • FIG. 20 shows the summary of a method of extracting a moving object exclusively by separating/judging a background change portion and the moving object region relative to a frame image; [0048]
  • FIG. 21 shows the summary of a method of extracting the background change exclusively by separating/judging the background change portion and the moving object region with respect to the frame image; [0049]
  • FIG. 22 shows a data flow of moving object extraction means 900 for realizing the aforementioned method; [0050]
  • FIG. 23 shows an example of the resultant display screen which is outputted on a display 300 by result output means 600; [0051]
  • FIG. 24 shows the moving object detection/extraction by the conventional method using the background difference; and [0052]
  • FIG. 25 shows a problem in the moving object detection/extraction by the conventional method using the background difference.[0053]
  • DETAILED DESCRIPTION OF THE PREFERRED AND ALTERNATE EMBODIMENTS
  • One embodiment of the invention will be described in detail in the following. [0054]
  • FIG. 1 shows one embodiment of the hardware construction for realizing the invention. A TV camera 200 takes a scene to be monitored, transforms it into video signals 201 and transmits them to a computer 400. During this transmission, the video signals 201 are digitized for each frame and stored in the memory of the computer 400. The computer 400 reads out the memory content and follows the processing program, which is stored at another address in the memory, to judge whether the pixels on the frame image belong to the background or the moving object, to extract the moving object and to estimate the moving direction/velocity. The image of the extracted moving object and the other accompanying processed results are transmitted to a display 300. This display 300 outputs the results processed by the computer 400, such as the background image and the image and moving direction/velocity of the moving object, to the screen. This information is transmitted through a network 210 to the display of a safety control unit or a monitoring center. [0055]
  • FIG. 2 shows one example of the system construction which is realized in the computer 400. The computer 400 includes video input means 500, result output means 600, background judgment means 700, motion analysis means 800 and moving object extraction means 900. The video input means 500 transforms the video signals into digital image data 1000 for each frame and transmits them to the background judgment means 700, the motion analysis means 800 and the moving object extraction means 900. [0056]
  • The result output means 600 displays the processed results of the background judgment means 700, the motion analysis means 800 and the moving object extraction means 900, such as a background image 1002, a moving direction/velocity 1003 and a moving object image 1004, on the display such that they can be easily observed by the user. [0057]
  • The background judgment means 700 judges whether each pixel on the digital image data 1000 belongs to the background, the moving object or a change in the background structure, and transmits moving object period information 1001 to the motion analysis means 800 and the moving object extraction means 900. The moving object period information 1001 is a collection, for each pixel, of the periods (or intervals) in which the moving object is judged to exist. In addition, the background judgment means 700 transmits the background image 1002, i.e., the accumulation of the pixels judged as the background, to the result output means 600. The details of the background judgment method will be described with reference to FIGS. 3 to 12. [0058]
  • The motion analysis means 800 calculates the moving direction/velocity 1003 of the moving object from the digital image data 1000 and the moving object period information 1001 and transmits it to the result output means 600. The details of the method of calculating the moving direction/velocity of the moving object will be described with reference to FIGS. 13 to 19. [0059]
  • The moving object extraction means 900 extracts the moving object image 1004 from the digital image data 1000 and the moving object period information 1001 and transmits it to the result output means 600. The details of the extraction of the moving object image 1004 will be described with reference to FIGS. 20 to 22. [0060]
  • First of all, the method of judging the presence of the moving object and of structure changes in the background will be summarized with reference to FIGS. 3 to 5. Next, the background judgment means for realizing the method will be described with reference to FIGS. 6 to 12. [0061]
  • FIG. 3 shows the relations among the movie, the slit image and the spatial-temporal image. The movie (or motion picture) is constructed of a sequence of twenty-five to thirty still images per second, called frame images. This sequence is schematically shown as a movie 1010. In this case, the movie 1010 is an arrangement of frame images from time T0 to time Tn. A slit image 1040 is the collection of the pixels of a frame image 1020 which are contained in a segment called the slit 1030. The arrangement of these slit images 1040 in chronological order for each frame is called the spatial-temporal image 1050, because the spatial-temporal image 1050 contains both temporal and spatial information. [0062]
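  • The construction of the slit image and the spatial-temporal image can be sketched as follows (a minimal illustration with hypothetical names; frames are nested lists indexed as frame[y][x], and the slit is a list of pixel coordinates):

```python
def slit_image(frame, slit):
    """Collect the pixels lying on the slit (a list of (x, y) coordinates)."""
    return [frame[y][x] for (x, y) in slit]

def spatio_temporal_image(frames, slit):
    """Stack the slit images of all frames in chronological order."""
    return [slit_image(f, slit) for f in frames]

# Two 2x2 frames and a vertical slit at x = 0:
frames = [
    [[1, 2],
     [3, 4]],
    [[5, 6],
     [7, 8]],
]
slit = [(0, 0), (0, 1)]
st = spatio_temporal_image(frames, slit)
# st -> [[1, 3], [5, 7]]: one slit image per frame, in time order
```

Each row of the result is one slit image, so a background pixel that never changes produces a constant column, which appears as the horizontal line described next.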
  • In the spatial-temporal image 1050 of a fixed camera, a pixel having no temporal intensity change forms a line which flows horizontally in the temporal direction, as indicated by 1051. Such a pixel can be considered as belonging to the background. On the other hand, the horizontal line may break even in the background. This is because the intensity of a pixel, even in the background, changes with illuminance changes such as the sunshine condition and with the movement of objects making up the background. [0063]
  • On the contrary, an object moving in the frame image appears as a pattern, as indicated by 1052, and usually forms no horizontal line. The moving object forms a horizontal line only when it stands still on the slit or when the slit is aligned with the moving direction. This can be included in the aforementioned case in which an object forming the background changes. [0064]
  • Thus, in the spatial-temporal image 1050, the background appears as the horizontal line 1051, and everything else appears as patterns such as 1052. These patterns 1052 are thought to come from the moving object or from a background change. In the invention, whether or not a pixel belongs to the background is judged on the basis of this characteristic of the spatial-temporal image 1050, to detect/extract the moving object. [0065]
  • FIG. 4 shows the relation in distance between the slit image of the background period and the current slit at each time. First of all, the slit image 1040 is extracted from an interval of the spatial-temporal image 1050 in which neither the moving object nor the background changes, and is set as a background slit β 1041. Next, the slit image 1040 is extracted at other times and set as a current slit τ 1042. Here is considered the slit vector, in which the intensities of the individual pixels composing the slit image 1040 are the vector elements, and a distance δ between two slit vectors, as given by formula 1060. If the distance δ is determined for the current slit τ 1042 at each time of the spatial-temporal image 1050, there is obtained a graph given by a distance sequence 1070 in which the distances δ are arranged in chronological order. [0066]
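  • The distance δ and the distance sequence 1070 can be sketched in Python as follows (hypothetical names; formula 1060 is assumed here to be the Euclidean distance between the two slit vectors):

```python
import math

def slit_distance(s1, s2):
    """Distance between two slit vectors, assumed Euclidean (formula 1060)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(s1, s2)))

def distance_sequence(st_image, background_slit):
    """Distance of every current slit to the background slit, in time order."""
    return [slit_distance(s, background_slit) for s in st_image]

bg = [10, 10, 10]                               # background slit beta
st = [[10, 10, 10], [10, 14, 13], [10, 10, 10]]  # spatial-temporal image
seq = distance_sequence(st, bg)
# seq -> [0.0, 5.0, 0.0]: the middle frame deviates from the background
```

Flat, near-zero stretches of the resulting sequence correspond to the background period, and peaks to the moving object period, which is exactly the judgment made below.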
  • The following facts can be derived from the characteristics of the distance sequence 1070 and the spatial-temporal image 1050, as described with reference to FIG. 3. A flat portion of the distance sequence 1070 which has at least a certain length is predicted to belong to the background, so that it has no moving object. In the other portions, which have more changes, it is thought that a moving object has passed or that the background has changed. In order to discriminate these in the following description, the flat portion of the distance sequence 1070 having at least the certain length is defined as a background period 1071, and the remaining portions are defined as a moving object period 1072. [0067]
  • FIG. 5 shows the spatial-temporal image 1050 and the distance sequence 1070 when a structure change of the background occurs. Consider the case in which a moving object 1100 drops a falling object 1200 onto the slit 1030, as shown in the movie 1010 of FIG. 5(a). In this case, the background structure is changed by the falling object 1200, so that the spatial-temporal image 1050 is as shown in FIG. 5(b). Specifically, the image 1201 of the falling object 1200 appears just behind the image 1101, on the spatial-temporal image, of the moving object 1100. This image 1201 becomes part of the background from then on, so that it forms a horizontal line, as shown in FIG. 5(b). [0068]
  • For this spatial-temporal image 1050, the distance sequence 1070 is determined, as shown in FIG. 5(c), by using the background slit β 1041. In this distance sequence 1070, both a background period 1073 and a background period 1074 are flat portions of at least a certain length, so that they belong to the background period 1071. In a period with the same slit as the background slit β 1041, the average value is substantially zero, as in the background period 1073. On the other hand, the background period 1074 takes an average value above a certain level, because its slit is made different from the background slit β 1041 by the image 1201 of the falling object, which is detected as a difference from the background slit β 1041. Even for the same background period 1071, the average value of the distance sequence thus differs depending upon whether or not the slit image is identical to the background slit β 1041. Therefore, the individual background periods 1071 will be defined as the true background period 1073 and the false background period 1074 so that they may be differentiated. [0069]
  • The false background period 1074 is thought to be caused not only by a structure change of the background but also by an abrupt illuminance change. For either reason, the occurrence of the false background period 1074 means that the background has changed. In order to continue the judgment of the background or the moving object, the current slit τ 1042 of the false background period 1074 may be adopted as a new background, and the judgments by the distance sequence 1070 repeated. [0070]
  • On the basis of the characteristics described above, the invention separates the intervals of the background and the moving object by deeming a flat period of at least a certain length in the distance sequence 1070 as the background period 1071 and the remainder as the moving object period 1072. Of the background periods 1071, moreover, one having an average value approximating zero is classified as the true background period 1073, whereas the other is classified as the false background period 1074. In the case of the false background period 1074, assuming that the background has been changed by an illuminance change or a structure change, the slit image τ 1042 of the false background period 1074 is adopted as a new background, and the foregoing judging procedures are repeated. By these procedures, there is realized a method of judging the presence of the moving object and of the structure change of the background by discriminating the three items at all times: the moving object, the background and the structure change of the background. [0071]
  • FIG. 6 explains the data flow of the background judgment means 700 for realizing the aforementioned method. This background judgment means 700 includes slit image creation means 701, background slit hold means 702, distance calculation means 703, distance sequence hold means 704, distance sequence smoothing means 710, smoothed sequence hold means 705, background period judgment means 720, background true/false judgment means 730, moving object period acquire means 740 and background structure change judgment means 750. [0072]
  • The slit image creation means 701 creates the current slit 1042 to be judged on the basis of the inputted digital image data 1000, and transmits it to the distance calculation means 703. [0073]
  • The background slit hold means 702 holds the background slit 1041, which is judged by the background true/false judgment means 730 or the background structure change judgment means 750, and transmits it in response to a demand from the distance calculation means 703. [0074]
  • The distance calculation means 703 calculates the distance in accordance with the formula 1060 by treating the current slit 1042 and the background slit 1041 as vectors. The calculated distance δ is transmitted to the distance sequence hold means 704. [0075]
  • The distance sequence hold means 704 holds the calculated distances δ over a past constant time period so that they may be handled as a sequence. The distance sequence 1070 is updated to discard the oldest value and contain the newest value each time a distance is newly transmitted. Moreover, the distance sequence hold means 704 transmits the distance sequence 1070 to the distance sequence smoothing means 710 in response to a demand from the distance sequence smoothing means 710. [0076]
  • The distance sequence smoothing means 710 smoothes the distance sequence 1070, which is stored in the distance sequence hold means 704, by the moving-average method. This is because small vibrations are frequently caused in the distance sequence by influences such as jitter. The distance sequence 1070 thus smoothed is transmitted to the smoothed sequence hold means 705. [0077]
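  • A minimal sketch of this moving-average smoothing (a hypothetical implementation; a centered window whose size shrinks near the ends of the sequence):

```python
def smooth(seq, window=3):
    """Centered moving average; the window shrinks near the sequence ends."""
    half = window // 2
    out = []
    for i in range(len(seq)):
        lo, hi = max(0, i - half), min(len(seq), i + half + 1)
        out.append(sum(seq[lo:hi]) / (hi - lo))
    return out

jittery = [0.0, 3.0, 0.0, 3.0, 0.0]   # small vibrations, e.g. from jitter
smoothed = smooth(jittery)
# smoothed -> [1.5, 1.0, 2.0, 1.0, 1.5]
```

Suppressing these small vibrations keeps the flat-portion test below from being broken by noise alone.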
  • The smoothed sequence hold means 705 holds the latest smoothed distance sequence. This distance sequence is transmitted to the background period judgment means 720, the background true/false judgment means 730 and the moving object period acquire means 740 in response to their individual demands. [0078]
  • The background period judgment means 720 searches for the background period 1071 in the latest smoothed distance sequence 1070 and transmits the result as an interval to the background true/false judgment means 730 and the moving object period acquire means 740, respectively. This search for the background period 1071 is realized by judging the flat portion of the smoothed distance sequence 1070, as described with reference to FIG. 4. The search algorithm will be detailed with reference to FIG. 7. [0079]
  • The background true/false judgment means 730 judges whether a background period is the true one 1073 or the false one 1074, on the basis of the background period 1071 and the smoothed distance sequence 1070. After this, the current slit image 1042 is transmitted to the background slit hold means 702 or the background structure change judgment means 750 in accordance with the judgment result. In the case of the true background period 1073, the current slit image 1042 is transmitted as a new background slit 1041 to the background slit hold means 702. In the case of the false background period 1074, the current slit image 1042 is transmitted to the background structure change judgment means 750 to extract the structure change of the background. The algorithm for the true/false judgment will be detailed with reference to FIGS. 8 to 11. [0080]
  • The moving object period acquire means 740 determines the maxima of the smoothed distance sequence 1070 within the interval in which the moving object is predicted to exist, and returns the number of moving objects predicted for the period and the times of the maximum portions as the moving object period. [0081]
  • The background structure change judgment means 750 judges whether the background change is caused by a structure change or by an illuminance change, from both the background slit 1041 stored in the background slit hold means 702 and the current slit image 1042 transmitted from the background true/false judgment means 730, and thereby updates the current slit image 1042 as a new background. This judgment algorithm will be detailed with reference to FIG. 12. [0082]
  • FIG. 7 shows the flow chart of the background period judgment means. First of all, the smoothed sequence 1070 for a constant interval (e.g., the latest forty-five frames) is acquired from the smoothed sequence hold means 705 (at Step 2001). Next, the maximum/minimum for the interval are acquired from the smoothed sequence 1070 (at Step 2002). If the difference between the maximum and the minimum is over a predetermined threshold, it is judged that the period is not the background period 1071, and the procedure is ended; if the difference is below the threshold, it is judged that the period is the background period 1071, and the routine advances to Step 2004 (at Step 2003). At last, the leading and ending times of the sequence are returned as the interval (at Step 2004). [0083]
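  • The core test of this flow chart, the flatness check, can be sketched as follows (hypothetical names; a sketch of Steps 2002 and 2003, not the full flow chart):

```python
def is_background_period(smoothed, threshold):
    """A flat interval (max minus min below the threshold) is predicted
    to belong to the background, i.e. to contain no moving object."""
    return (max(smoothed) - min(smoothed)) < threshold

flat = [5.0, 5.2, 5.1, 5.0]     # flat but offset: a (false) background period
moving = [0.0, 8.0, 15.0, 3.0]  # large swings: a moving object period
r_flat = is_background_period(flat, threshold=1.0)      # True
r_moving = is_background_period(moving, threshold=1.0)  # False
```

Note that the check only measures flatness, not level: the offset flat interval still passes, which is why the separate true/false judgment of FIG. 11 is needed afterwards.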
  • FIG. 8 shows the influences of the illuminance change upon the slit image. First of all, consider the slit image 1040 in which the brightnesses of the individual pixels are given by P1 to Pn, as shown in FIG. 8(a). Next, a graph is drawn taking the positions of the pixels on the abscissa and the brightnesses of the pixels on the ordinate, as indicated by 1046 in FIG. 8(b). [0084]
  • If an abrupt illuminance change occurs and darkens the slit image 1040 as a whole, the brightnesses of the individual pixels P1 to Pn in the slit image 1040 grow uniformly dark from 1046 to 1047 while holding their mutual relations, as shown in FIG. 8(c). These illuminance changes are shown in FIG. 9 with the slit image deemed as a vector. [0085]
  • The slit image 1040 can be deemed to be a vector v 1048 having the individual pixel brightnesses as its elements, as shown in FIG. 9(a). If the base vectors of the individual pixels P1 to Pn are designated by b1, b2, b3, . . . , bn, the vector v 1048 can be expressed as one point in an n-dimensional vector space, as shown in FIG. 9(b). Next, it is assumed that an abrupt illuminance change occurs for that vector v 1048, so that the slit vector changes into a slit vector v′ 1049, as shown in FIG. 9(c). At this time, it can be deemed from the consideration of FIG. 8 that the changed slit vector v′ 1049 lies on a straight line substantially identical to that of the vector v 1048 and is a scalar multiple of the vector v 1048. [0086]
  • Thus, the original slit vector 1048 and the slit vector 1049 changed by the illuminance have substantially identical directions even if their coordinate positions in the vector space differ greatly. A slit vector produced by a structure change, on the other hand, is expected to differ greatly not only in coordinate position but also in direction. In order to discriminate an illuminance change from a structure change of the slit 1040, therefore, it is sufficient to consider the direction. [0087]
  • FIG. 10 shows the projections of the ordinary slit vector 1048 and the slit vector 1049 influenced by the illuminance change upon a unit sphere. As shown in FIG. 10, the distance PQ between the projection P 1048′ of the vector v 1048 upon the unit sphere and the projection Q 1049′ of the vector v′ 1049 upon the unit sphere becomes far shorter than the original distance vv′. Whether the relation between two different slit images is caused merely by an illuminance change or by a structure change can therefore be judged by whether or not the vector distance on the unit sphere is extremely short. This normalized intervector distance, as defined by the formula 1060, will be called the normalized distance to discriminate it from the ordinary distance. [0088]
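Formula 1060 appears only in the drawings; under the natural reading of the text (the Euclidean distance between the two vectors after each is projected onto the unit sphere), the normalized distance can be sketched as:

```python
import math

def normalized_distance(a, b):
    """Distance between two slit vectors after each is projected onto
    the unit sphere (scaled to unit length): a purely multiplicative
    illuminance change then collapses to (almost) zero distance."""
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.sqrt(sum((x / na - y / nb) ** 2 for x, y in zip(a, b)))

v = [100.0, 150.0, 200.0]          # original slit vector
v_dark = [50.0, 75.0, 100.0]       # uniform darkening: a scalar multiple
v_struct = [200.0, 150.0, 100.0]   # structure change: pattern altered
```

For the uniformly darkened slit the normalized distance is essentially zero, while the structurally changed slit keeps a clearly non-zero distance, which is exactly the discrimination the text describes.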
  • In the invention, this normalized distance is utilized to discriminate whether the background change is caused by the structure change or the illuminance change. [0089]
  • FIG. 11 shows the flow chart of the background true/false judgment means 730. First, the background period 1071 is acquired (at Step 2101) from the background period judgment means 720. Next, an average value is computed (at Step 2102) from the smoothed sequence of the background period 1071. If the average value is below a predetermined threshold, the given background period 1071 is judged to be the true background period 1073, and the routine advances to Step 2104. If the average is over the threshold, the background period is judged to be false, as at 1074, and the routine advances to Step 2105 (at Step 2103). At Step 2104, for the true background period 1073, the routine ends by storing the latest slit as the new background slit 1041 in the background slit hold means 702. At Step 2105, for the false background period 1074, the routine ends by judging, with the background structure change judgment means 750, whether the falseness is due to an illuminance change or a structure change. [0090]
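The core of Steps 2102 and 2103 is a single threshold test on the average of the smoothed sequence; a minimal sketch (the function name, string labels, and threshold are illustrative assumptions):

```python
def classify_background_period(smoothed_interval, threshold):
    """Steps 2102-2103 of FIG. 11: the period is a true background
    when the average smoothed distance stays below `threshold` (the
    scene still matches the stored background slit); otherwise it is
    a false background, i.e. the background itself has changed."""
    average = sum(smoothed_interval) / len(smoothed_interval)  # Step 2102
    return "true" if average < threshold else "false"          # Step 2103
```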
  • FIG. 12 shows the flow chart of the background structure change judgment means 750. First, in order to judge whether the false background period 1074 is caused by an illuminance change, the normalized distance between the background slit 1041 and the latest current slit 1042 in the smoothed sequence is determined (at Step 2201). If this normalized distance is below a predetermined threshold, the change is attributed to the illuminance, and the routine advances to Step 2203. If over the threshold, the change is attributed to the structure (at Step 2202), and the routine advances to Step 2204. [0091]
  • In the case of Step 2203, the background period 1071 is the false background period 1074 due to an illuminance change. The routine ends by setting the values in the smoothed sequence to zero and by storing the current slit 1042 as the new background slit 1041 in the background slit hold means 702. In the case of Step 2204, the background period 1071 is the false background period 1074 due to a structure change. In this case as well, the routine ends by storing the latest current slit 1042 as the new background slit 1041 in the background slit hold means 702 and by setting all values in the smoothed sequence to zero. [0092]
  • Next, the method of extracting the moving object exclusively and the method of calculating the moving direction and velocity will be described. FIG. 13 summarizes the method of extracting the moving object exclusively by separating/judging the background change region and the moving object region from the spatial-temporal image. FIGS. 14 and 15 summarize the method of calculating the moving direction and velocity. FIGS. 16 to 19 explain the motion analysis means for realizing those methods. [0093]
  • FIG. 13 summarizes the method of extracting the moving object exclusively by separating/judging the background change region and the moving object region from the spatial-[0094] temporal image 1050.
  • First, a spatial-temporal image 1053 for the interval in which the moving object is thought to exist is cut out of the spatial-temporal image 1050 on the basis of the moving object interval 1001, and the original background slit 1041 and a future background slit 1041′ are acquired. Next, an original background differential image 1054 is created from the original background slit 1041 and the spatial-temporal image 1053, and a future background differential image 1055 is created from the future background slit 1041′ and the spatial-temporal image 1053. Each background differential image contains not only a moving object region 1102 but also the differential region 1202 between the background slit 1041 and the background structure change. [0095]
  • Finally, the logical product of the original background differential image 1054 and the future background differential image 1055 is taken to extract the moving object region 1102. As a result, the differential region 1202 caused by the background structure change is cancelled, so that only the moving object region 1102, i.e., the common region, is extracted. [0096]
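The logical-product merging can be sketched on binary difference images; the nested-list image representation and all names are illustrative:

```python
def merge_background_differences(orig_diff, future_diff):
    """Pixel-wise logical product of the two binary difference images:
    a pixel that differs from BOTH the original and the future
    background can only belong to the moving object, while a structure
    change matches one of the two backgrounds and is cancelled."""
    return [[a and b for a, b in zip(ra, rb)]
            for ra, rb in zip(orig_diff, future_diff)]

# Hypothetical 2x3 masks: the moving object occupies the same pixels
# in both differences; a fallen object appears only in the original.
orig = [[1, 1, 0], [1, 0, 1]]
future = [[1, 1, 0], [1, 0, 0]]
```

Merging these two masks keeps only the pixels set in both, i.e. the moving object region, and drops the fallen-object pixel present in just one of them.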
  • FIG. 14 summarizes the slit setting method for analyzing the motion of the moving object in the movie, and the method of calculating the moving direction/velocity 1003 of the moving object 1101 extracted from the spatial-temporal image 1050 obtained by that slit setting method. [0097]
  • Generally speaking, if the slit 1030 is set at a right angle or at a non-parallel oblique direction with respect to the moving direction of the moving object, the moving object 1101 is inclined forward or backward in the resulting spatial-temporal image 1050, as shown in FIG. 14. This is because the upper or lower side of the moving object 1101 reaches the slit 1030 earlier than the opposite side. As a result, even with a single slit, it can be determined from the positive or negative sign of the inclination whether the moving object has moved from the left or from the right. From the magnitude of the inclination 1210, moreover, the average velocity at which the object crosses the slit 1030 can be calculated. [0098]
  • By utilizing this, the moving direction/velocity 1003 of the moving object 1101 can be calculated according to the invention. In the motion analysis means 800, the moving object region 1201 is extracted from the spatial-temporal image 1050 obtained from the inclined slit 1030, and the inclination 1210 is calculated from the moment of the region to estimate the moving direction and velocity. [0099]
  • FIG. 15 explains the principle of calculating the moving direction/velocity 1003 of the extracted moving object 1101 from the inclination of the slit 1030 and the inclination 1210 of the moving object 1101. [0100]
  • First, the inclination of the slit 1030, as set on the movie 1010, from the horizontal direction is designated by α. It is assumed that the moving object 1100, having a horizontal velocity v, passes the slit 1030. If the moving object 1100 has a height h and moves by w between the instant its upper portion passes the slit and the instant its lower portion passes the slit, the horizontal moving velocity v is expressed by a formula 1610. Next, the inclination of the image 1101 of the moving object in the spatial-temporal image 1050 is designated by θ. If the number of frame images per second is f, the number of frames s for the moving object to move by w is described by a formula 1620. A formula 1630 is obtained by rearranging the formulas 1610 and 1620 for v. The positive or negative sign of v indicates the direction, and the absolute value indicates the magnitude of the horizontal velocity component. [0101]
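Formulas 1610 to 1630 appear only in the drawing, so the following is a plausible reconstruction under the simplest geometry consistent with the text (w = h/tan α between the upper and lower crossings, s = h/tan θ frames for that travel, hence v = f·tan θ/tan α, independent of h); all numbers are hypothetical:

```python
import math

# Hypothetical values, purely for illustration.
h = 2.0                    # object height, same length unit as w
alpha = math.radians(60)   # slit inclination from the horizontal
theta = math.radians(30)   # inclination of the object's trace
f = 30.0                   # frame images per second

w = h / math.tan(alpha)    # travel between upper/lower crossings (cf. 1610)
s = h / math.tan(theta)    # frames needed to move by w (cf. 1620)
v = f * w / s              # rearranged for v (cf. 1630)

# The h's cancel, leaving v = f * tan(theta) / tan(alpha); the sign of
# the tangents carries the moving direction, as the text states.
assert math.isclose(v, f * math.tan(theta) / math.tan(alpha))
```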
  • In the invention, on the basis described above, the moving direction/[0102] velocity 1003 are calculated from the inclination of the slit 1030 and the inclination of the image 1101 of the moving object 1100 in the spatial-temporal image 1050.
  • FIG. 16 shows the data flow of the motion analysis means [0103] 800 for realizing the aforementioned method. This motion analysis means 800 includes spatial-temporal image creation means 801, background slit acquire means 802, background difference merging means 803 and merged background difference inclination judgment means 820.
  • These spatial-temporal image creation means [0104] 801, background slit acquire means 802, background difference merging means 803 and merged background difference inclination judgment means 820 realize the method of extracting the moving object 1101 exclusively by separating/judging the background change region 1202 and the moving object region 1102 from the spatial-temporal image 1050, as described with reference to FIG. 13.
  • The spatial-temporal image creation means [0105] 801 creates the spatial-temporal image 1053 in the interval, for which the moving object 1101 exists, by acquiring the slit images from the digital image data 1000 and the moving object interval 1001 and by arranging them in the frame order. The spatial-temporal image 1050 is transmitted in response to the demands from the background difference creation means 810 and 810′.
  • The background slit acquire means [0106] 802 acquires the original background slit 1041 and the future background slit 1041′ from before and behind the interval, for which the moving object 1101 exists, on the basis of the digital image data 1000 and the moving object interval 1001.
  • The background difference creation means [0107] 810 and 810′ create the original background difference 1054 and the future background difference 1055 from the spatial-temporal image 1053, the original background slit 1041 and the future background slit 1041′. The details of the background difference creation algorithm will be described with reference to FIG. 17.
  • The background difference merging means [0108] 803 creates a merged background difference 1056 from the logical product of the created original background difference 1054 and future background difference 1055. Only the moving object 1101 is extracted by the procedure described above.
  • The merged background difference inclination judgment means [0109] 820 realizes the method of calculating the moving direction/velocity 1003 of the extracted moving object from the inclination of the slit 1030 and the inclination 1210 of the moving object, as described with reference to FIGS. 14 and 15.
  • FIG. 17 shows the construction (or data flow) of the [0110] means 810 for creating the background difference by using the original background and the future background. The background difference creation means 810 includes moving object region separation means 811, moving object region morphology means 812, noise region elimination means 813 and occluded region supply means 814.
  • The moving object region separation means [0111] 811 makes only the moving object region 1201 binary to separate/extract it from either the background slit image 1041 and the current slit image 1042 or the background frame image and the current frame image. The detail of this separation/extraction algorithm will be described with reference to FIGS. 18 and 19.
  • Next, on the assumption that the moving object forms one occluded region of at least a certain size, the moving object region morphology means [0112] 812, the noise region elimination means 813 and the occluded region supply means 814 correct the rupture or segmentation of the moving object region caused by the moving object region separation means 811.
  • The moving object region morphology means [0113] 812 connects the ruptured or segmented moving object region 1201 by morphological operations; about three such operations are applied. The noise region elimination means 813 eliminates the minute regions left independent of the moving object region 1201 after the morphology, deeming them noise. The occluded region supply means 814 searches for and smears over the holes contained in the moving object region 1201.
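Since a slit is a single row of pixels, these three corrections can be illustrated in one dimension. The run-merging approach and all parameter values below are assumptions for the sketch, not the patented implementation:

```python
def _runs(mask):
    """Indices [start, end) of consecutive 1-runs in a binary mask."""
    runs, start = [], None
    for i, m in enumerate(mask):
        if m and start is None:
            start = i
        elif not m and start is not None:
            runs.append((start, i))
            start = None
    if start is not None:
        runs.append((start, len(mask)))
    return runs

def clean_object_mask(mask, gap=3, min_len=2):
    """1-D sketch of means 812-814 on a binary slit mask: merge runs
    separated by at most `gap` pixels (morphology, means 812, which
    also smears small holes as means 814 would), then drop runs
    shorter than `min_len` as noise (means 813)."""
    merged = []
    for s, e in _runs(mask):
        if merged and s - merged[-1][1] <= gap:
            merged[-1] = (merged[-1][0], e)    # close the small gap
        else:
            merged.append((s, e))
    out = [0] * len(mask)
    for s, e in merged:
        if e - s >= min_len:                   # keep only sizeable runs
            for i in range(s, e):
                out[i] = 1
    return out
```

A ruptured object `[0,1,1,0,1,1,1,...]` becomes one solid run, while an isolated single-pixel blip is discarded as noise.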
  • By the processing described above, the moving [0114] object 1200 is cut out as the moving object region 1201 to create the background differential image. When a change in the background structure is to be cut out instead, the same processing can be applied using the changed background image in place of the current image.
  • FIG. 18 summarizes the moving object region separation means [0115] 811. First, the background slit 1041 and the current slit 1042, from which the moving object region is to be cut out, are acquired. Next, the background slit 1041 and the current slit 1042 are compared pixel by pixel to judge whether each pixel belongs to the background or to the moving object. This comparison does not rely on the brightness of the corresponding pixel alone; instead, a local slit composed of w pixels including the corresponding one is created, and the judgment is made by determining the normalized distance between the local slits.
  • When it is judged whether a [0116] target pixel Pn 1045 belongs to the background or to the moving object, for example, a local slit τ1 1044 containing the target pixel Pn 1045 and a corresponding background local slit β1 1043 are created, and the normalized distance between the two is determined. Because the normalized distance is used, the background can be correctly judged and eliminated even if part of it undergoes an illuminance change caused by the shadow of the moving object or by a light.
  • However, when the individual pixel values of the background [0117] local slit 1043 have a small dispersion, the ordinary vector distance is used instead. This is because the normalized distance would take a nearly zero value and be misjudged as the background even when, for example, a white object moves over a dark background.
  • FIG. 19 shows the flow chart of the moving object region separation means [0118] 811. First, the current slit 1042 to be judged is acquired from the movie 1010 (at Step 2301). Next, it is checked whether the judgment has been executed for all slit pixels; the routine advances to Step 2303 if any pixel remains unjudged, and otherwise ends (at Step 2302). The two local slits 1043 and 1044 are acquired sequentially from the top of the background slit 1041 and the current slit 1042 (at Steps 2303 and 2304). The dispersion of the background local slit 1043 is determined and compared with the predetermined threshold TV; the routine advances to Step 2306 if below the threshold TV, and otherwise to Step 2307 (at Step 2305). At Step 2306, the ordinary vector distance is determined as the distance between the two local slits 1043 and 1044. At Step 2307, the normalized distance is determined as the distance between the two local slits 1043 and 1044. The distance thus determined is compared with a predetermined threshold TD; the routine advances to Step 2309 if below the threshold TD, and otherwise to Step 2310 (at Step 2308). If below the threshold, the target pixel is judged to belong to the background (at Step 2309); otherwise it belongs to the non-background (at Step 2310). The procedure then returns to Step 2302 and is repeated.
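Steps 2301 to 2310 can be sketched per pixel as follows. The window width and the thresholds TV and TD are illustrative values, and, following the rule for small-dispersion backgrounds stated above, the ordinary vector distance is taken on the low-dispersion branch (a real implementation would likely scale TD per branch):

```python
import math

def separate_moving_pixels(background, current, w=5, tv=25.0, td=10.0):
    """Per-pixel separation sketch of FIG. 19 (Steps 2301-2310):
    0 = background pixel, 1 = moving object (non-background) pixel."""
    def local(slit, i):                      # local slit around pixel i
        lo = max(0, i - w // 2)
        return slit[lo:lo + w]

    def dispersion(xs):                      # variance of a local slit
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    def vdist(a, b):                         # ordinary vector distance
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def ndist(a, b):                         # normalized distance
        na = math.sqrt(sum(x * x for x in a)) or 1.0
        nb = math.sqrt(sum(x * x for x in b)) or 1.0
        return math.sqrt(sum((x / na - y / nb) ** 2
                             for x, y in zip(a, b)))

    labels = []
    for i in range(len(current)):            # Step 2302: every slit pixel
        bg, cur = local(background, i), local(current, i)  # 2303-2304
        # Step 2305: low-dispersion background -> ordinary distance,
        # otherwise the illuminance-tolerant normalized distance.
        d = vdist(bg, cur) if dispersion(bg) < tv else ndist(bg, cur)
        labels.append(0 if d < td else 1)    # Steps 2308-2310
    return labels
```

On a uniform background slit with one bright current pixel, the bright pixel (and its window neighborhood) is labeled non-background while distant pixels stay background.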
  • FIGS. 20 and 21 explain the method of extracting the moving object and the background by separating/judging the background change region and the moving object region from the frame image. [0119]
  • FIG. 20 summarizes the method of extracting the moving object exclusively by separating/judging the background change region and the moving object region from the frame image. [0120]
  • First, an [0121] original background image 1801, a future background image 1803 and a current frame image 1802 are acquired. Here, it is assumed that a moving object 1804 and a falling object 1805 appear in the current frame image 1802, and that another falling object 1806 appears, in addition to the falling object 1805, in the future background image 1803.
  • Next, an [0122] original background difference 1807 is created between the original background image 1801 and the current frame image 1802, and a future background difference 1808 between the future background image 1803 and the current frame image 1802. A moving object region 1809 and a background change region 1810 due to the falling object appear in the original background difference 1807. In the future background difference 1808, on the other hand, there appear the moving object region 1809 and a background change region 1811 due to the falling object 1806.
  • If a [0123] merged difference 1812 of the original background difference 1807 and the future background difference 1808 is determined by the logical product, the background change regions 1810 and 1811 are deleted, leaving only the moving object region 1809.
  • Finally, when a cutout is made from the [0124] frame image 1802 by using the merged difference 1812 as the mask image, a moving object image 1813 containing only the moving object 1804 is obtained.
  • FIG. 21 summarizes the method of extracting the background change exclusively by separating/judging the background change region and the moving object region from the frame image. [0125]
  • First, the [0126] original background image 1801 and the future background image 1803 are acquired. Here, it is assumed that the falling objects 1805 and 1806 appear in the future background image 1803. Next, a background difference 1901 between the original background image 1801 and the future background image 1803 is created. In this background difference 1901, there appear the background change region 1810 due to the falling object 1805 and the background change region 1811 due to the falling object 1806. Finally, when a cutout is made from the future background image 1803 by using the background difference 1901 as the mask image, a background structure change image 1902 containing only the falling objects 1805 and 1806 is obtained.
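The masked cutout used here and for the moving object image reduces to copying pixels where the mask is set; a minimal sketch with nested lists standing in for images (names and fill value are illustrative):

```python
def cut_out(image, mask, fill=0):
    """Copy the pixels of `image` where the binary `mask` is set and
    fill the rest, yielding e.g. a moving object image from a frame
    image, or a background structure change image from the future
    background image."""
    return [[p if m else fill for p, m in zip(prow, mrow)]
            for prow, mrow in zip(image, mask)]
```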
  • FIG. 22 shows the construction (or data flow) of the moving object extraction means [0127] 900 for realizing the aforementioned method. The moving object extraction means 900 is constructed to include the six components of frame image acquire means 901, background image creation means 902, background difference creation means 910 and 910′, background difference merging means 903 and moving object cutout means 904. These means realize the method of extracting only the moving object by separating/judging the background change region and the moving object region, as described with reference to FIG. 20.
  • The frame image acquire means [0128] 901 acquires the frame image 1802 of the interval, in which the moving object 1100 seems to exist, from the moving object interval 1001 and the digital image data 1000, and transmits it to the background difference creation means 910 and 910′. The background image creation means 902 acquires the frame images of the intervals judged as background from the moving object interval 1001 and the digital image data 1000, and transmits them as the original background image 1801 and the future background image 1803 to the background difference creation means 910 and 910′. The background difference creation means 910 and 910′ repeat the processing of the background difference creation means 810 and 810′, as described with reference to FIG. 17, to create the original background difference 1807 and the future background difference 1808 of the frame image. The background difference merging means 903 creates the merged background difference 1812 from the logical product of the original background difference 1807 and the future background difference 1808 of the frame image, and transmits it to the moving object cutout means 904.
  • This moving object cutout means [0129] 904 uses the merged background difference 1812 as a mask image to cut the moving object image 1804 out of the frame image 1802.
  • FIG. 23 shows an example of the result display screen which is outputted onto the [0130] display 300 by the result output means 600. The result display screen 2000 is constructed to include at least the four components of an input image movie display region 2010, a background change representative screen display region 2020, a moving object representative screen display region 2030 and a correlated value sequence display region 2040.
  • In this display result example, the [0131] slit 1030 is placed upright in the middle of the movie 1010; a representative screen 2032 of a moving object having passed through the slit 1030 is displayed in the moving object representative screen display region 2030, and representative screens 2022 and 2023 of changes in the background are displayed in the background change representative screen display region 2020. Moreover, the sequence of the correlated values of the slit (the distance sequence 1070) is displayed in the correlated value sequence display region 2040 to show the user the grounds for judging the presence of the moving object and the background change.
  • The input [0132] movie display region 2010 is a portion for displaying the present movie 1010 which is inputted from the TV camera 200.
  • The background change representative [0133] screen display region 2020 is a portion for displaying, upon detection of a structure change of the background in the movie 1010, the background representative screen 2022 before the change and the background representative screen 2023 after the change. Each detected change in the background structure is displayed as an upper/lower pair of the before-change screen 2022 and the after-change screen 2023 in a background change display window 2021, so that their difference may be judged by the user. This screen example shows background changes such as the parking of an automobile or an object fallen from a truck. The background change display window 2021 is provided with a scroll bar so that the background structure changes detected so far may be reviewed. A marker 2024 is attached so that the group of the latest representative screens may be quickly identified.
  • The moving object representative [0134] screen display region 2030 is a portion for displaying, upon detection of a moving object in the movie 1010, the representative screen 2032 showing that object. Each detected moving object is displayed in a moving object representative screen display window 2031 so that it may be judged by the user. This screen example shows moving objects such as a motorbike, a black automobile, a white automobile, a gray automobile and a truck. The moving object representative screen display window 2031 is provided with a scroll bar so that the representative screens of the moving objects detected so far may be reviewed. A marker 2033 is attached so that the latest moving object may be identified.
  • The correlated value [0135] sequence display region 2040 is a portion for displaying both the spatial-temporal image 1050 obtained from the slit 1030 and the sequence 1070 of the correlated values (or distances) at the corresponding times. The pixels and graph values of the spatial-temporal image 1050 and the distance sequence 1070 at the latest time are always displayed at the right-hand side of a correlated value sequence display window 2041. At the same time, a moving object detection marker 2042 indicating the position on the spatial-temporal image at the time a moving object was detected and a background change detection marker 2043 indicating the position at the time a background change was detected are displayed, so that the grounds for the detections may be understood at a glance by the user.
  • In the present embodiment, the window area of interest on the [0136] movie 1010 is exemplified by the slit 1030. For the processing in the background judgment means 700 and the moving object extraction means 900, however, essentially the same operations apply to any assembly of a plurality of adjacent pixels, even if its shape differs from the slit 1030.
  • Another conceivable embodiment of the invention uses a window area of interest of square, circular or concentric shape. For example, a movie of ten-odd hours, as obtained from a TV camera attached to the entrance of a house or office, may be judged with the correlated value sequence of the entire frame image, so that a list of representative images of visitors or delivered parcels may be extracted. [0137]
  • According to the invention, the background and the moving object can be distinguished so that the moving object can be exclusively detected, even against a complicated background subject to changes in the illuminating conditions or in the structure. No restriction is placed on the shape, color, moving direction or velocity of the moving object to be extracted. Moreover, the moving direction and velocity can be calculated. [0138]
  • If the background changes, it can further be judged whether the change is a structure change or a change in the illumination conditions. [0139]
  • In addition, the object to be processed is only a few percent of the pixels in the movie, so that the processing is ten times or more faster than that of the moving object extraction apparatus of the prior art. The amount of memory used is likewise reduced to a few percent. Thus, real-time processing can be achieved even by an inexpensive computer such as a personal computer. [0140]
  • The present invention has been described with reference to the preferred and alternate embodiments. Obviously, modifications and alterations will occur to those of ordinary skill in the art upon reading and understanding the invention. It is intended that the invention be construed as including all such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof. [0141]

Claims (20)

We claim:
1. A method for monitoring a moving object, comprising the steps of:
setting a window area of interest for an inputted time-varying image;
calculating a correlation between a first data of the window area in a frame (A) and a second data of the window area in a frame (B); and
deciding a first interval in which a moving object is present in the window area, based on a pattern of a calculated correlation value over a predetermined time period.
2. A method for monitoring a moving object according to claim 1, further comprising the step of:
displaying a representative screen of time-varying images in the first interval.
3. A method for monitoring a moving object according to claim 1, further comprising the steps of:
storing representative screens of time-varying images in a plurality of the first intervals; and
displaying the representative screens.
4. A method for monitoring a moving object according to claim 1, further comprising the steps of:
deciding a second interval in which a moving object is not present in the window area, based on the pattern of the calculated correlation value over the predetermined time period; and
displaying a representative screen of time-varying images in the second interval as a background screen.
5. A method of monitoring a moving object according to claim 1, wherein the data of the window area are represented as a feature and the calculated correlation value is assumed by the distance between the features.
6. A method for monitoring a moving object according to claim 1, wherein the window area has an arbitrary shape.
7. A method for monitoring a moving object, comprising the steps of:
setting a window area of interest for an inputted time-varying image;
calculating a correlation between a reference data of the window area in a reference frame and a current data of the window area in a current frame;
deciding whether an image of the window area of the current frame includes an image of a moving object, based on a pattern of a calculated correlation value over a predetermined time period; and
updating the reference data as the current data when the image of the window area of the current frame does not include the image of the moving object.
8. A method of monitoring a moving object according to claim 7, wherein the window area has an arbitrary shape.
9. A method of monitoring a moving object according to claim 8, wherein the arbitrary shape is one of a straight line, a segment shape, a square shape, a circular shape and a concentric shape.
10. A method for monitoring a moving object, comprising the steps of:
setting a window area of interest for an inputted time-varying image;
calculating a correlation between a reference data of the window area in a reference frame and a current data of the window area in a current frame;
deciding whether an image of the window area in the current frame is a background image, which is a reference image for detecting a moving object, based on a plurality of calculated correlation values; and
updating the reference data as the current data when the image of the window area in the current frame is a background image.
11. A method of monitoring a moving object according to claim 10, further comprising the steps of:
deciding whether the background image is changed or not; and
judging whether the background change is caused by an illuminance change or a structure change.
12. A method for monitoring a moving object, comprising the steps of:
setting a window area of interest for an inputted time-varying image;
calculating a correlation between a first data of the window area in a frame (A) and a second data of the window area in a frame (B); and
deciding an interval in which an image of the window area changes temporarily, based on a plurality of calculated correlation values.
13. A method of monitoring a moving object according to claim 12, further comprising the step of:
displaying a representative screen of time-varying images in the interval.
14. A monitoring system, comprising:
a TV camera for taking in time-varying images;
a computer for monitoring the time-varying images; and
a display for displaying the result of the monitoring;
wherein said computer calculates a correlation between a first data of a window area in a frame (A) and a second data of the window area in a frame (B) and decides a first interval in which a moving object is present in the window area based on a pattern of a calculated correlation value over a predetermined time period; and
wherein said display displays a representative screen of the time-varying images in the first interval.
15. A monitoring system according to claim 14, wherein the data of the window area are represented as a feature vector and the calculated correlation value is assumed by a distance between the feature vectors.
16. A monitoring system, comprising:
a TV camera for taking in time-varying images;
a computer for monitoring the time-varying images; and
a display for displaying the result of monitoring;
wherein said computer calculates a correlation between a reference data and a current data of the window area in a current frame and decides whether an image of the window area of the current frame includes an image of a moving object or not based on a pattern of a calculated correlation value over a predetermined time period.
17. A monitoring system according to claim 16, wherein the reference data is updated by the current data when an image of the window area of the current frame does not include the image of the moving object.
18. A monitoring system, comprising:
a TV camera for taking in time-varying images;
a computer for monitoring the time-varying images; and
a display for displaying the result of monitoring;
wherein said computer calculates a correlation between a first data of a window area in a frame (A) and a second data of the window area in a frame (B) and decides an interval in which an image of the window area changes temporarily, based on a plurality of calculated correlation values.
19. A monitoring system, comprising:
means for taking in time-varying images;
means for setting a window area of interest for the time-varying images;
means for calculating a correlation between a first data of the window area in a frame (A) and a second data of the window area in a frame (B); and
means for deciding a first interval in which a moving object is present in the window area, based on a pattern of a calculated correlation value over a predetermined time period.
20. A monitoring system, comprising:
means for taking in time-varying images;
means for setting a window area of interest for the time-varying images;
means for calculating a correlation between a first data of the window area in a frame (A) and a second data of the window area in a frame (B); and
means for deciding an interval in which an image of the window area changes temporarily, based on a plurality of calculated correlation values.
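The correlation test recited in claims 12-20 can be illustrated with a short sketch. The patent discloses no source code, so everything concrete below is an illustrative assumption rather than the claimed implementation: the function names (`detect_intervals`, `window_distance`), the Euclidean distance measure, and the `threshold` value are all hypothetical. The sketch treats the window-area pixels as a feature vector whose distance stands in for the correlation value (claim 15), marks an interval while the window differs from reference data (claims 12, 14, 16), and updates the reference only when no moving object is present (claim 17).

```python
import numpy as np

def window_distance(a, b):
    """Distance between two window patches treated as feature vectors.

    A small distance corresponds to a high correlation between the
    patches (an illustrative stand-in for the claimed correlation).
    """
    va = a.astype(float).ravel()
    vb = b.astype(float).ravel()
    return float(np.linalg.norm(va - vb)) / va.size

def detect_intervals(frames, window, threshold=5.0):
    """Return (start, end) frame-index pairs during which the window
    image deviates from the background reference, i.e. a moving object
    is presumed present in the window area.

    frames    : iterable of 2-D grayscale images
    window    : (y0, y1, x0, x1) window area of interest
    threshold : hypothetical decision level on the patch distance
    """
    y0, y1, x0, x1 = window
    reference = None        # background reference data for the window
    intervals = []
    start = None
    for t, frame in enumerate(frames):
        patch = frame[y0:y1, x0:x1]
        if reference is None:
            # Assume the first frame shows an empty background.
            reference = patch
            continue
        d = window_distance(reference, patch)
        if d > threshold:
            # Window image deviates from the reference: object present.
            if start is None:
                start = t
        else:
            if start is not None:
                # Object has left the window; close the interval.
                intervals.append((start, t - 1))
                start = None
            # Update the reference only while no moving object is
            # present, so the object never pollutes the background.
            reference = patch
    if start is not None:
        intervals.append((start, len(frames) - 1))
    return intervals
```

On a synthetic sequence of ten blank 8x8 frames in which frames 3-5 contain a bright blob, `detect_intervals(frames, (0, 8, 0, 8))` reports the single interval `(3, 5)`; the reference update in the quiet frames is what lets the detector tolerate slow background drift such as illuminance change.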
US09/946,528 1995-02-17 2001-09-06 Moving object detection apparatus Abandoned US20020030739A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/946,528 US20020030739A1 (en) 1995-02-17 2001-09-06 Moving object detection apparatus

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP02909995A JP3569992B2 (en) 1995-02-17 1995-02-17 Mobile object detection / extraction device, mobile object detection / extraction method, and mobile object monitoring system
JP7-029099 1995-02-17
US08/601,951 US5721692A (en) 1995-02-17 1996-02-15 Moving object detection apparatus
US09/023,467 US5862508A (en) 1995-02-17 1998-02-13 Moving object detection apparatus
US18222098A 1998-10-30 1998-10-30
US09/946,528 US20020030739A1 (en) 1995-02-17 2001-09-06 Moving object detection apparatus

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US18222098A Continuation 1995-02-17 1998-10-30

Publications (1)

Publication Number Publication Date
US20020030739A1 true US20020030739A1 (en) 2002-03-14

Family

ID=12266908

Family Applications (3)

Application Number Title Priority Date Filing Date
US08/601,951 Expired - Lifetime US5721692A (en) 1995-02-17 1996-02-15 Moving object detection apparatus
US09/023,467 Expired - Lifetime US5862508A (en) 1995-02-17 1998-02-13 Moving object detection apparatus
US09/946,528 Abandoned US20020030739A1 (en) 1995-02-17 2001-09-06 Moving object detection apparatus

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US08/601,951 Expired - Lifetime US5721692A (en) 1995-02-17 1996-02-15 Moving object detection apparatus
US09/023,467 Expired - Lifetime US5862508A (en) 1995-02-17 1998-02-13 Moving object detection apparatus

Country Status (2)

Country Link
US (3) US5721692A (en)
JP (1) JP3569992B2 (en)

Families Citing this family (76)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8948442B2 (en) * 1982-06-18 2015-02-03 Intelligent Technologies International, Inc. Optical monitoring of vehicle interiors
JP3569992B2 (en) * 1995-02-17 2004-09-29 株式会社日立製作所 Mobile object detection / extraction device, mobile object detection / extraction method, and mobile object monitoring system
US5969755A (en) * 1996-02-05 1999-10-19 Texas Instruments Incorporated Motion based event detection system and method
WO2004089486A1 (en) 1996-06-21 2004-10-21 Kyota Tanaka Three-dimensional game device and information recording medium
US6356272B1 (en) * 1996-08-29 2002-03-12 Sanyo Electric Co., Ltd. Texture information giving method, object extracting method, three-dimensional model generating method and apparatus for the same
EP0831422B1 (en) * 1996-09-20 2007-11-14 Hitachi, Ltd. Method of displaying moving object for enabling identification of its moving route, display system using the same, and program recording medium therefor
IL131056A (en) * 1997-01-30 2003-07-06 Yissum Res Dev Co Generalized panoramic mosaic
US5936639A (en) * 1997-02-27 1999-08-10 Mitsubishi Electric Information Technology Center America, Inc. System for determining motion control of particles
US6130707A (en) * 1997-04-14 2000-10-10 Philips Electronics N.A. Corp. Video motion detector with global insensitivity
US5903271A (en) * 1997-05-23 1999-05-11 International Business Machines Corporation Facilitating viewer interaction with three-dimensional objects and two-dimensional images in virtual three-dimensional workspace by drag and drop technique
JP3444160B2 (en) * 1997-10-09 2003-09-08 松下電器産業株式会社 Moving object detection method
KR100246626B1 (en) * 1997-10-16 2000-03-15 정선종 Joint marker extraction method using space-time information for morphological image segmentation
JP3567066B2 (en) * 1997-10-31 2004-09-15 株式会社日立製作所 Moving object combination detecting apparatus and method
US6177944B1 (en) * 1998-09-18 2001-01-23 International Business Machines Corporation Two phase rendering for computer graphics
US6278460B1 (en) 1998-12-15 2001-08-21 Point Cloud, Inc. Creating a three-dimensional model from two-dimensional images
JP3721867B2 (en) * 1999-07-07 2005-11-30 日本電気株式会社 Video display device and display method
US6424370B1 (en) * 1999-10-08 2002-07-23 Texas Instruments Incorporated Motion based event detection system and method
JP3601392B2 (en) * 1999-12-27 2004-12-15 住友電気工業株式会社 Image processing apparatus, image processing method, and vehicle monitoring system
JP3828349B2 (en) 2000-09-27 2006-10-04 株式会社日立製作所 MOBILE BODY DETECTION MEASUREMENT METHOD, DEVICE THEREOF, AND RECORDING MEDIUM CONTAINING MOBILE BODY DETECTION MEASUREMENT PROGRAM
US7215795B2 (en) * 2000-09-28 2007-05-08 Hitachi Kokusai Electric Inc. Intruding object detecting method and intruding object monitoring apparatus employing the method
DE10050083A1 (en) * 2000-10-10 2002-04-18 Sick Ag Device and method for detecting objects
US8711217B2 (en) 2000-10-24 2014-04-29 Objectvideo, Inc. Video surveillance system employing video primitives
US20050162515A1 (en) * 2000-10-24 2005-07-28 Objectvideo, Inc. Video surveillance system
US9892606B2 (en) 2001-11-15 2018-02-13 Avigilon Fortress Corporation Video surveillance system employing video primitives
US8564661B2 (en) 2000-10-24 2013-10-22 Objectvideo, Inc. Video analytic rule detection system and method
US7200246B2 (en) * 2000-11-17 2007-04-03 Honeywell International Inc. Object detection
US6711279B1 (en) * 2000-11-17 2004-03-23 Honeywell International Inc. Object detection
US6466158B2 (en) 2000-12-08 2002-10-15 Lockheed Martin Corp. Identifying closely clustered moving targets
KR100450793B1 (en) * 2001-01-20 2004-10-01 삼성전자주식회사 Apparatus for object extraction based on the feature matching of region in the segmented images and method therefor
KR100355382B1 (en) * 2001-01-20 2002-10-12 삼성전자 주식회사 Apparatus and method for generating object label images in video sequence
US7522257B2 (en) * 2001-01-23 2009-04-21 Kenneth Jacobs System and method for a 3-D phenomenoscope
US9781408B1 (en) 2001-01-23 2017-10-03 Visual Effect Innovations, Llc Faster state transitioning for continuous adjustable 3Deeps filter spectacles using multi-layered variable tint materials
US7604348B2 (en) * 2001-01-23 2009-10-20 Kenneth Martin Jacobs Continuous adjustable 3deeps filter spectacles for optimized 3deeps stereoscopic viewing and its control method and means
US10742965B2 (en) 2001-01-23 2020-08-11 Visual Effect Innovations, Llc Faster state transitioning for continuous adjustable 3Deeps filter spectacles using multi-layered variable tint materials
US7850304B2 (en) * 2001-01-23 2010-12-14 Kenneth Martin Jacobs Continuous adjustable 3Deeps filter spectacles for optimized 3Deeps stereoscopic viewing and its control method and means
US8750382B2 (en) 2001-01-23 2014-06-10 Kenneth Martin Jacobs System and method for calculating 3Deeps action specs motion estimation from the motion vectors in an MPEG file
US7405801B2 (en) * 2001-01-23 2008-07-29 Kenneth Jacobs System and method for Pulfrich Filter Spectacles
US7508485B2 (en) * 2001-01-23 2009-03-24 Kenneth Martin Jacobs System and method for controlling 3D viewing spectacles
US6891570B2 (en) * 2001-01-31 2005-05-10 Itt Manufacturing Enterprises Inc. Method and adaptively deriving exposure time and frame rate from image motion
US6778705B2 (en) * 2001-02-27 2004-08-17 Koninklijke Philips Electronics N.V. Classification of objects through model ensembles
US7424175B2 (en) 2001-03-23 2008-09-09 Objectvideo, Inc. Video segmentation using statistical pixel modeling
CA2451992C (en) 2001-05-15 2013-08-27 Psychogenics Inc. Systems and methods for monitoring behavior informatics
US7162101B2 (en) * 2001-11-15 2007-01-09 Canon Kabushiki Kaisha Image processing apparatus and method
US6697010B1 (en) * 2002-04-23 2004-02-24 Lockheed Martin Corporation System and method for moving target detection
MXPA02005732A (en) * 2002-06-10 2004-12-13 Valencia Reuther Herman Smart time measuring card.
EP1537550A2 (en) * 2002-07-15 2005-06-08 Magna B.S.P. Ltd. Method and apparatus for implementing multipurpose monitoring system
CN100334598C (en) * 2002-11-26 2007-08-29 东芝照明技术株式会社 Market plan support system
US7664292B2 (en) * 2003-12-03 2010-02-16 Safehouse International, Inc. Monitoring an output from a camera
DE102004018410A1 (en) * 2004-04-16 2005-11-03 Robert Bosch Gmbh Safety system and method for its operation
US7653261B2 (en) * 2004-11-12 2010-01-26 Microsoft Corporation Image tapestry
US7529429B2 (en) 2004-11-12 2009-05-05 Carsten Rother Auto collage
US7532771B2 (en) * 2004-11-12 2009-05-12 Microsoft Corporation Image processing system for digital collage
JP2006165935A (en) * 2004-12-07 2006-06-22 Nec Corp Device and method for converting control information
US9077882B2 (en) 2005-04-05 2015-07-07 Honeywell International Inc. Relevant image detection in a camera, recorder, or video streaming device
JP4610005B2 (en) * 2005-07-08 2011-01-12 財団法人電力中央研究所 Intruding object detection apparatus, method and program by image processing
CA2649389A1 (en) * 2006-04-17 2007-11-08 Objectvideo, Inc. Video segmentation using statistical pixel modeling
JP4866159B2 (en) * 2006-06-27 2012-02-01 株式会社日立製作所 Moving body detection device
US20080122926A1 (en) * 2006-08-14 2008-05-29 Fuji Xerox Co., Ltd. System and method for process segmentation using motion detection
US8045783B2 (en) * 2006-11-09 2011-10-25 Drvision Technologies Llc Method for moving cell detection from temporal image sequence model estimation
JP4821642B2 (en) * 2007-02-15 2011-11-24 株式会社ニコン Image processing method, image processing apparatus, digital camera, and image processing program
TWI355615B (en) * 2007-05-11 2012-01-01 Ind Tech Res Inst Moving object detection apparatus and method by us
JP4972491B2 (en) * 2007-08-20 2012-07-11 株式会社構造計画研究所 Customer movement judgment system
JP4807354B2 (en) * 2007-12-25 2011-11-02 住友電気工業株式会社 Vehicle detection device, vehicle detection system, and vehicle detection method
JP4513869B2 (en) 2008-02-13 2010-07-28 カシオ計算機株式会社 Imaging apparatus, strobe image generation method, and program
JP2009194595A (en) * 2008-02-14 2009-08-27 Sony Corp Broadcast system, transmitter, transmission method, receiver, reception method, exhibition device, exhibition method, program, and recording medium
JP2010011016A (en) * 2008-06-26 2010-01-14 Sony Corp Tracking point detection apparatus, method, program, and recording medium
JP2012053708A (en) 2010-09-01 2012-03-15 Toshiba Tec Corp Store system, sales registration device and program
JP6024229B2 (en) * 2012-06-14 2016-11-09 富士通株式会社 Monitoring device, monitoring method, and program
WO2014155877A1 (en) * 2013-03-26 2014-10-02 ソニー株式会社 Image processing device, image processing method, and program
JP6157242B2 (en) * 2013-06-28 2017-07-05 キヤノン株式会社 Image processing apparatus and image processing method
FR3010220A1 (en) * 2013-09-03 2015-03-06 Rizze SYSTEM FOR CENSUSING VEHICLES BY THE CLOUD
US10664496B2 (en) * 2014-06-18 2020-05-26 Hitachi, Ltd. Computer system
US20170164267A1 (en) * 2015-12-03 2017-06-08 The Trustees Of Columbia University In The City Of New York Apparatus to inhibit misuse of an electrically powered device
WO2017120196A1 (en) 2016-01-04 2017-07-13 The Trustees Of Columbia University In The City Of New York Apparatus to effect an optical barrier to pests
KR102153607B1 (en) * 2016-01-22 2020-09-08 삼성전자주식회사 Apparatus and method for detecting foreground in image
US11436839B2 (en) 2018-11-02 2022-09-06 Toyota Research Institute, Inc. Systems and methods of detecting moving obstacles

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2754867B2 (en) * 1990-05-23 1998-05-20 松下電器産業株式会社 Image motion detection device
JP3011748B2 (en) * 1990-09-12 2000-02-21 日本電信電話株式会社 Mobile counting device
JPH0589242A (en) * 1991-09-26 1993-04-09 Nippon Telegr & Teleph Corp <Ntt> Object area segmenting device for image

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5134472A (en) * 1989-02-08 1992-07-28 Kabushiki Kaisha Toshiba Moving object detection apparatus and method
US5103305A (en) * 1989-09-27 1992-04-07 Kabushiki Kaisha Toshiba Moving object detecting system
US5099322A (en) * 1990-02-27 1992-03-24 Texas Instruments Incorporated Scene change detection system and method
US5243418A (en) * 1990-11-27 1993-09-07 Kabushiki Kaisha Toshiba Display monitoring system for detecting and tracking an intruder in a monitor area
US5331312A (en) * 1991-08-23 1994-07-19 Matsushita Electric Industrial Co., Ltd. Obstacle-detecting apparatus
US5416693A (en) * 1991-08-28 1995-05-16 Fuji Xerox Co., Ltd. Moving picture search support device
US5566251A (en) * 1991-09-18 1996-10-15 David Sarnoff Research Center, Inc Video merging employing pattern-key insertion
US5500904A (en) * 1992-04-22 1996-03-19 Texas Instruments Incorporated System and method for indicating a change between images
US5465115A (en) * 1993-05-14 1995-11-07 Rct Systems, Inc. Video traffic monitor for retail establishments and the like
US5684887A (en) * 1993-07-02 1997-11-04 Siemens Corporate Research, Inc. Background recovery in monocular vision
US5748775A (en) * 1994-03-09 1998-05-05 Nippon Telegraph And Telephone Corporation Method and apparatus for moving object extraction based on background subtraction
US5802361A (en) * 1994-09-30 1998-09-01 Apple Computer, Inc. Method and system for searching graphic images and videos
US5841883A (en) * 1994-10-27 1998-11-24 Yazaki Corporation Method of diagnosing a plant automatically and device for executing method thereof
US5805733A (en) * 1994-12-12 1998-09-08 Apple Computer, Inc. Method and system for detecting scenes and summarizing video sequences
US5721692A (en) * 1995-02-17 1998-02-24 Hitachi, Ltd. Moving object detection apparatus
US5862508A (en) * 1995-02-17 1999-01-19 Hitachi, Ltd. Moving object detection apparatus
US5974219A (en) * 1995-10-11 1999-10-26 Hitachi, Ltd. Control method for detecting change points in motion picture images and for stopping reproduction thereof and control system for monitoring picture images utilizing the same
US6005493A (en) * 1996-09-20 1999-12-21 Hitachi, Ltd. Method of displaying moving object for enabling identification of its moving route display system using the same, and program recording medium therefor

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7477417B1 (en) * 1999-09-07 2009-01-13 Dai Nippon Printing Co., Ltd. Image processing system
US20040252861A1 (en) * 2003-02-14 2004-12-16 Sony Corporation Image processing apparatus and method, program, and recording medium
US7409075B2 (en) * 2003-02-14 2008-08-05 Sony Corporation Image processing apparatus and method, program, and recording medium
EP1460598A1 (en) * 2003-03-17 2004-09-22 Adam Mazurek Process and apparatus for analyzing and identifying moving objects
US7777780B2 (en) 2003-09-03 2010-08-17 Canon Kabushiki Kaisha Image motion display method and apparatus
US9131122B2 (en) 2003-09-03 2015-09-08 Canon Kabushiki Kaisha Apparatus, method, system, and storage medium causing a display to display a graph indicating a degree of change of part of a captured image
US20080024612A1 (en) * 2003-09-03 2008-01-31 Canon Kabushiki Kaisha Display apparatus, image processing apparatus, and image processing system
US8654199B2 (en) 2003-09-03 2014-02-18 Canon Kabushiki Kaisha Image motion detection apparatus and method for determining a parameter for detecting a moving object in a moving image and computer readable medium having encoded thereon a program instructing a computer to perform the method
US20050078853A1 (en) * 2003-10-10 2005-04-14 Buehler Christopher J. System and method for searching for changes in surveillance video
US7280673B2 (en) * 2003-10-10 2007-10-09 Intellivid Corporation System and method for searching for changes in surveillance video
US20080303911A1 (en) * 2003-12-11 2008-12-11 Motion Reality, Inc. Method for Capturing, Measuring and Analyzing Motion
US20060008118A1 (en) * 2004-07-02 2006-01-12 Mitsubishi Denki Kabushiki Kaisha Image processing apparatus and image monitoring system
US8031226B2 (en) * 2005-03-01 2011-10-04 Fujifilm Corporation Image output apparatus, image output method, image output program, image trimming apparatus, image trimming method, and image trimming program
US20060204044A1 (en) * 2005-03-01 2006-09-14 Fuji Photo Film Co., Ltd. Image output apparatus, image output method, image output program, image trimming apparatus, image trimming method, and image trimming program
US10019877B2 (en) * 2005-04-03 2018-07-10 Qognify Ltd. Apparatus and methods for the semi-automatic tracking and examining of an object or an event in a monitored site
US20100157049A1 (en) * 2005-04-03 2010-06-24 Igal Dvir Apparatus And Methods For The Semi-Automatic Tracking And Examining Of An Object Or An Event In A Monitored Site
US8264544B1 (en) * 2006-11-03 2012-09-11 Keystream Corporation Automated content insertion into video scene
US20080112642A1 (en) * 2006-11-14 2008-05-15 Microsoft Corporation Video Completion By Motion Field Transfer
US8243805B2 (en) 2006-11-14 2012-08-14 Microsoft Corporation Video completion by motion field transfer
US20080310677A1 (en) * 2007-06-18 2008-12-18 Weismuller Thomas P Object detection system and method incorporating background clutter removal
US20100296743A1 (en) * 2009-05-21 2010-11-25 Nobuhiro Tsunashima Image processing apparatus, image processing method and program
US8433139B2 (en) * 2009-05-21 2013-04-30 Sony Corporation Image processing apparatus, image processing method and program for segmentation based on a degree of dispersion of pixels with a same characteristic quality
US8218818B2 (en) * 2009-09-01 2012-07-10 Behavioral Recognition Systems, Inc. Foreground object tracking
WO2011028379A3 (en) * 2009-09-01 2011-05-05 Behavioral Recognition Systems, Inc. Foreground object tracking
US20110052002A1 (en) * 2009-09-01 2011-03-03 Wesley Kenneth Cobb Foreground object tracking
FR3007878A1 (en) * 2013-06-27 2015-01-02 Rizze DEVICE FOR A ROAD SURVEILLANCE VIDEO SYSTEM TO RECORD THE CONTEXT OF AN EVENT ACCORDING TO THE PRESENCE OF A VEHICLE IN THE FIELD OF VISION OF THE CAMERA
US20170011261A1 (en) * 2015-07-09 2017-01-12 Analog Devices Global Video processing for human occupancy detection
US10372977B2 (en) * 2015-07-09 2019-08-06 Analog Devices Gloval Unlimited Company Video processing for human occupancy detection
US10410371B2 (en) 2017-12-21 2019-09-10 The Boeing Company Cluttered background removal from imagery for object detection

Also Published As

Publication number Publication date
US5721692A (en) 1998-02-24
JP3569992B2 (en) 2004-09-29
JPH08221577A (en) 1996-08-30
US5862508A (en) 1999-01-19

Similar Documents

Publication Publication Date Title
US5721692A (en) Moving object detection apparatus
EP0749098B1 (en) Method and apparatus for sensing object located within visual field of imaging device
JP6180482B2 (en) Methods, systems, products, and computer programs for multi-queue object detection and analysis
EP0567059B1 (en) Object recognition system using image processing
KR100459476B1 (en) Apparatus and method for queue length of vehicle to measure
JP5325899B2 (en) Intrusion alarm video processor
CA2132515C (en) An object monitoring system
US8457360B2 (en) Detection of vehicles in an image
EP0878965A2 (en) Method for tracking entering object and apparatus for tracking and monitoring entering object
EP0986036A2 (en) Method of updating reference background image, method of detecting entering objects and system for detecting entering objects using the methods
WO2001033503A1 (en) Image processing techniques for a video based traffic monitoring system and methods therefor
JP6653361B2 (en) Road marking image processing apparatus, road marking image processing method, and road marking image processing program
JPH07210795A (en) Method and instrument for image type traffic flow measurement
JP2001067566A (en) Fire detecting device
JPH11284997A (en) Traveling object sensing device
JP3377659B2 (en) Object detection device and object detection method
JP7125843B2 (en) Fault detection system
JP4025007B2 (en) Railroad crossing obstacle detection device
JP3294468B2 (en) Object detection method in video monitoring device
JP2002190013A (en) System and method for detecting congestion by image recognition
KR101930429B1 (en) System for monitoring standardized accident standardization and method for analyzing accident situation using the same
Yu et al. Vision based vehicle detection and traffic parameter extraction
JP2001175959A (en) Method and device for detecting invasion object
JP2002300573A (en) Video diagnostic system on-board of video monitor
Michalopoulos et al. Machine-vision system for multispot vehicle detection

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION