US20020030739A1 - Moving object detection apparatus - Google Patents
- Publication number: US20020030739A1 (application US09/946,528)
- Authority
- US
- United States
- Prior art keywords
- moving object
- background
- window area
- image
- monitoring
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G7/00—Direction control systems for self-propelled missiles
- F41G7/20—Direction control systems for self-propelled missiles based on continuous observation of target position
- F41G7/22—Homing guidance systems
- F41G7/2226—Homing guidance systems comparing the observed data with stored target data, e.g. target configuration data
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G7/00—Direction control systems for self-propelled missiles
- F41G7/20—Direction control systems for self-propelled missiles based on continuous observation of target position
- F41G7/22—Homing guidance systems
- F41G7/2253—Passive homing systems, i.e. comprising a receiver and do not requiring an active illumination of the target
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G7/00—Direction control systems for self-propelled missiles
- F41G7/20—Direction control systems for self-propelled missiles based on continuous observation of target position
- F41G7/22—Homing guidance systems
- F41G7/2273—Homing guidance systems characterised by the type of waves
- F41G7/2293—Homing guidance systems characterised by the type of waves using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/42—Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
- G06V10/421—Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation by analysing segments intersecting the pattern
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06V20/54—Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19604—Image analysis to detect motion of the intruder, e.g. by frame subtraction involving reference image or background adaptation with time to compensate for changing conditions, e.g. reference image update on detection of light level change
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19678—User interface
- G08B13/19691—Signalling events for better perception by user, e.g. indicating alarms by making display brighter, adding text, creating a sound
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
Definitions
- the invention relates to a moving object detection apparatus and method for monitoring video input from a camera, for example to measure traffic flow on roads, to detect failures at railroad crossings, and to prevent crimes in banks or convenience stores.
- among those problems, the gradual illumination change arising from complicated backgrounds is addressed by the moving object detection method using the background difference.
- This background difference is a method of separating/extracting only a moving object by taking the difference between a background image, which shows only the background, and a frame image containing the moving object, exploiting the fact that the background hardly changes in video taken with a fixed camera.
- the background image is automatically acquired by taking the median or mode of each pixel's intensity along the time axis.
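The temporal-median acquisition described above can be sketched as follows (a minimal illustration in Python, assuming a stack of grayscale frames held as a NumPy array; the array shapes are illustrative, not from the patent):

```python
import numpy as np

def acquire_background(frames):
    """Estimate the background as the per-pixel temporal median."""
    return np.median(frames, axis=0).astype(frames.dtype)

# toy example: mostly-constant pixels with a transient object in one frame
frames = np.full((5, 4, 4), 50, dtype=np.uint8)
frames[2, 1:3, 1:3] = 255  # object visible in only one of five frames
bg = acquire_background(frames)
print(bg.max())  # the transient object is rejected by the median
```

Because the object occupies each pixel in only a minority of frames, the median recovers the pure background everywhere.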
- FIG. 24 simply shows the principle of the moving object detection method using the background difference. If a background image 100 is given in advance for a scene 110 to be monitored, a moving object 111 can be separated/extracted as a scene change 121 from the differential image 120 between the background image 100 and the scene 110 .
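The background-difference principle of FIG. 24 can be sketched as follows (a minimal illustration, assuming grayscale NumPy arrays and a hypothetical threshold value; not the patent's exact procedure):

```python
import numpy as np

def background_difference(background, frame, threshold=30):
    """Extract a change mask by thresholding |frame - background|."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold  # True where the scene differs from the background

# toy example: a flat background and a frame with a bright "object"
background = np.full((8, 8), 100, dtype=np.uint8)
frame = background.copy()
frame[2:5, 3:6] = 200  # moving object region
mask = background_difference(background, frame)
print(mask.sum())  # number of changed pixels
```

The mask corresponds to the scene change 121 in the differential image 120.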
- FIG. 25 simply shows the problem of the method of the prior art.
- a stopped object 131 appears in the scene of the background image 100, causing a structure change of the background, as shown in a scene 130.
- the parked object 131 is extracted as a change 141 , as indicated in a differential image 140 between the background image 100 and the scene 130 .
- the structure change and the moving object merge into each other, as in the region 161 of a scene 160, so that they cannot be separated even after a moving object 151 has passed.
- a number of structure changes of the background occur in the actual movie monitoring.
- an automobile that had been passing along a road may stop at a parking meter on the road edge and form part of a new background.
- an object that had been stopped at the parking meter may move away, so that the region it had hidden becomes part of the new background.
- if a passing object drops something onto the road, the fallen object may also form part of the new background.
- an object passing through snow may leave its tracks behind.
- the method of the prior art using the background difference could not cope with the structure change of the background. This is because it is impossible to discriminate whether the portion having a changed background structure belongs to the moving object or to a new background region. For this discrimination, it is conceivable to execute a motion analysis of the moving object.
- with a motion analysis algorithm such as optical flow, however, the number of moving objects has to be known in advance. Once the number of moving objects is mis-recognized, the subsequent processing will find it difficult not only to separate the background change region but even to detect the presence of the background change itself.
- the invention has the following three objects.
- a first object is to judge whether a pixel region of interest belongs to the background or the moving object, thereby to judge the kind of the background change, if any.
- a second object is to extract only the moving object by separating/judging the background change region and the moving object region.
- a third object is to easily calculate the moving direction or velocity of the moving object extracted.
- the invention comprises, as its basic component means, means for inputting a movie, means for extracting/detecting a moving object, and means for outputting the processed result as the movie.
- the following means are provided for realizing the judgment of the presence of the structure change in the moving object and the background for a predetermined pixel region according to the first object.
- the means are: means for acquiring the pixel region to be judged for the background, from the movie; means for calculating the correlation between the pixel region at a time and the pixel region of each frame; means for holding the calculated correlated values sequentially; means for judging the interval which is predicted to belong to the background because of absence of the moving object; and means for deciding the interval for which the moving object is present, from the interval which is predicted to belong to the judged background.
- the means for deciding the interval for which the moving object is present comprises: means for judging the presence of the background change from the interval which is predicted to belong to the judged background; means for classifying the background change into the illuminance change or the structure change; and means for deciding the interval for which the moving object is present, from the interval which is predicted to belong to the judged background.
- the following means are provided for realizing the extraction of only the moving object by separating/judging the background change region and the moving object region according to the second object.
- the means are: means for acquiring the frame image containing the moving object and the images having only two projected backgrounds (or the original and future background images before and after the interval for which the moving object is present), as located before and after the interval, for which the moving object is present, from the movie; means for creating the original background differential image and the future background differential image from the frame image and the original and future background images; means for determining the merging region by a logical product from the original background differential image and the future background differential image; and means for cutting the moving object image out of the frame image and the merging region.
- the means are: means for cutting out the spatial-temporal image of the interval for which the moving object is present; means for separating the slit images (or the original background slit image and the future background slit image before and after the interval for which the moving object is present) of only the two backgrounds, as located before and after the interval for which the moving object is present, and the moving object region from the spatial-temporal image; means for correcting the moving object region by the morphology processing and the hole fill processing; means for determining a common merging region from the corrected two background differential images, and means for estimating the direction/velocity of the moving object by calculating the inclination of the obtained merged regions.
- the moving object is detected by the following procedure.
- the structure changes in the moving object and the background are judged for a specific pixel region.
- the pixel region to be judged for the background is acquired from the movie, and for each frame its correlation with the pixel region at a reference time is calculated.
- the correlated values thus calculated can be handled as a sequence.
- the interval which is predicted to belong to the background, because of the absence of a moving object, is judged. Whether or not the background has changed during that interval is then judged, to classify the background change as an illuminance change or a structure change.
- the interval for which the moving object is present is decided from the interval to be predicted to belong to the background.
- the frame image containing the moving object is acquired on the basis of the interval for which the moving object is present.
- the original background image and the future background image as located before and after the interval for which the moving object is present, are acquired from the movie.
- the original background differential image and the future background differential image are created from the frame image and the original and future background images.
- the merged region is determined by the logical product of the original background differential image and the future background differential image, and the moving object image is then cut out of the frame image using the merged region.
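The cut-out step above can be sketched as follows (a simplified illustration with binary masks in Python; the threshold value and array shapes are assumptions, not the patent's):

```python
import numpy as np

def extract_moving_object(frame, bg_before, bg_after, threshold=30):
    """Cut the moving object out of a frame using two background differences.

    A pixel belongs to the moving object only if it differs from BOTH the
    original (before) and future (after) backgrounds: the logical product
    suppresses regions that changed in only one background, i.e. structure
    changes that joined or left the background.
    """
    d_before = np.abs(frame.astype(np.int16) - bg_before.astype(np.int16)) > threshold
    d_after = np.abs(frame.astype(np.int16) - bg_after.astype(np.int16)) > threshold
    merged = d_before & d_after            # logical product of the two differences
    out = np.zeros_like(frame)
    out[merged] = frame[merged]            # cut the object out of the frame
    return out, merged

# toy scene: an object at columns 2-3 plus a newly "parked" region at column 6
bg_before = np.full((4, 8), 100, dtype=np.uint8)
bg_after = bg_before.copy()
bg_after[:, 6] = 200                       # structure change joins the new background
frame = bg_before.copy()
frame[:, 2:4] = 220                        # the moving object
frame[:, 6] = 200                          # parked object already present in frame
_, merged = extract_moving_object(frame, bg_before, bg_after)
print(merged.sum())                        # only the moving object columns remain
```

The parked region differs only from the original background, so the logical product removes it, leaving the moving object alone.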
- the moving direction or velocity of the extracted moving object is simply calculated.
- the spatial-temporal image of the interval for which the moving object is present is cut out.
- the moving object region is separated from the background slit image and the spatial-temporal image.
- the moving object region is corrected by the morphology and hole fill processing.
- the merged region is determined from the logical product of the corrected original background differential image and future background differential image, and the inclination of the merged region is then calculated to estimate the direction/velocity of the moving object.
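The inclination-based estimate can be sketched as follows (a toy illustration: fitting a line to the object's trace in a binary spatial-temporal mask; the least-squares fit is an assumption for illustration, not the patent's stated method):

```python
import numpy as np

def estimate_velocity(st_mask):
    """Estimate slit-position change per frame from an object's trace.

    st_mask: binary spatial-temporal image, rows = time, cols = slit position.
    Returns the slope (pixels per frame) of the trace's centroid path,
    i.e. the inclination of the merged region.
    """
    times, centroids = [], []
    for t, row in enumerate(st_mask):
        xs = np.flatnonzero(row)
        if xs.size:                       # frames where the object crosses the slit
            times.append(t)
            centroids.append(xs.mean())
    slope, _ = np.polyfit(times, centroids, 1)   # least-squares line fit
    return slope

# object moving 2 pixels per frame along the slit
st = np.zeros((6, 20), dtype=np.uint8)
for t in range(6):
    st[t, 2 * t: 2 * t + 3] = 1
print(round(estimate_velocity(st), 3))
```

A steeper inclination of the trace corresponds to a faster object; the sign of the slope gives the moving direction along the slit.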
- FIG. 1 shows a hardware construction for realizing the invention
- FIG. 2 shows a system construction for realizing the invention
- FIG. 3 shows relations among a movie, a slit image and a spatial-temporal image
- FIG. 4 shows a relation in distance between a background slit 1041 and a current slit 1042 at each time
- FIG. 5 shows a spatial-temporal image 1050 and a sequence of distances when a structure change occurs in the background
- FIG. 6 explains a data flow of background judgment means 700 ;
- FIG. 7 shows a flow chart of background period judgment means 720 ;
- FIG. 8 shows the influences of an illuminance change upon a slit image 1040 ;
- FIG. 9 shows the influences of an illuminance change upon a slit vector when the slit image 1040 is deemed as a vector
- FIG. 10 shows the mapping of the ordinary slit vector and the slit vector, as influenced by the illuminance change, upon a unit sphere;
- FIG. 11 shows a flow chart of background true/false judgment means 730 ;
- FIG. 12 shows a flow chart of background structure change judgment means 750 ;
- FIG. 13 shows the summary of a method of extracting a moving object 1100 exclusively by separating/judging a background change portion and a moving object region from a spatial-temporal image 1050 ;
- FIG. 14 shows a slit setting method for analyzing the motion of the moving object 1100 in the movie 1010 , and the spatial-temporal image 1050 obtained by the method;
- FIG. 15 explains the summary of a method for calculating the moving direction/velocity 1003 of the extracted moving object from the inclination of a slit 1030 and the inclination 1210 of the moving object;
- FIG. 16 shows a data flow of motion analysis means 800 for realizing the aforementioned method
- FIG. 17 shows a data flow of means 810 for creating an original background difference and a future background difference
- FIG. 18 shows the summary of moving object region separation means 811 ;
- FIG. 19 shows a flow chart of the moving object region separation means
- FIG. 20 shows the summary of a method of extracting a moving object exclusively by separating/judging a background change portion and the moving object region relative to a frame image
- FIG. 21 shows the summary of a method of extracting the background change exclusively by separating/judging the background change portion and the moving object region with respect to the frame image
- FIG. 22 shows a data flow of moving object extraction means 900 for realizing the aforementioned method
- FIG. 23 shows an example of the resultant display screen which is outputted on a display 300 by result output means 600 ;
- FIG. 24 shows the moving object detection/extraction by the conventional method using the background difference
- FIG. 25 shows a problem in the moving object detection/extraction by the conventional method using the background difference.
- FIG. 1 shows one embodiment of the hardware construction for realizing the invention.
- a TV camera 200 takes a scene to be monitored, transforms it into video signals 201 and transmits them to a computer 400 .
- the video signals 201 are digitized for each frame and stored in the memory of the computer 400 .
- This computer 400 reads out its memory content and, following the processing program stored at another address in the memory, judges whether the pixels on the frame image belong to the background or the moving object, extracts the moving object, and estimates the moving direction/velocity.
- the image of the moving object extracted and the other accompanying processed results are transmitted to a display 300 .
- This display 300 outputs the results processed by the computer 400, such as the background image and the image and moving direction/velocity of the moving object, to the screen. This information is transmitted through a network 210 to the display of a safety control unit or a monitoring center.
- FIG. 2 shows one example of the system construction which is realized in the computer 400.
- This computer 400 includes video input means 500 , result output means 600 , background judgment means 700 , motion analysis means 800 and moving object extraction means 900 .
- the video input means 500 transforms the video signals into digital image data 1000 for each frame and transmits them to the background judgment means 700, the motion analysis means 800 and the moving object extraction means 900.
- the result output means 600 displays the processed results of the background judgment means 700 , the motion analysis means 800 and the moving object extraction means 900 , such as a background image 1002 , a moving direction/velocity 1003 and a moving object image 1004 on the display such that they can be easily observed by the user.
- the background judgment means 700 judges whether each pixel on the digital image data 1000 belongs to the background, the moving object or the change in the background structure, and transmits a moving object period information 1001 to the motion analysis means 800 and the moving object extraction means 900 .
- the moving object period information 1001 is a collection of the period (or interval) for each pixel, in which the moving object is judged to exist.
- the background judgment means 700 transmits the background image 1002 or the accumulation of the pixels judged as the background to the result output means 600 . The detail of the background judgment method will be described with reference to FIGS. 3 to 12 .
- the motion analysis means 800 calculates the moving direction/velocity 1003 of the moving object from the digital image data 1000 and the moving object period information 1001 and transmits them to the result output means 600 .
- the detail of the method of calculating the moving direction/velocity of the moving object will be described with reference to FIGS. 13 to 19 .
- the moving object extraction means 900 extracts the moving object image 1004 from the digital image data 1000 and the moving object period information 1001 and transmits it to the result output means 600 .
- the detail of the extraction unit of the moving object image 1004 will be described with reference to FIGS. 20 to 22 .
- FIG. 3 shows the relations among the movie, the slit image and the spatial-temporal image.
- the movie (or motion picture) is composed of twenty-five to thirty still images per second, called frame images. This sequence is schematically shown as a movie 1010.
- the movie 1010 is an arrangement of frame images from time T0 to time Tn.
- a slit image 1040 is the collection of pixels from a frame image 1020 that are contained in a line segment called the slit 1030.
- the arrangement of these slit images 1040 in chronological order, one per frame, is called the spatial-temporal image 1050. This is because the spatial-temporal image 1050 contains both temporal and spatial information.
- the pixel having no temporal intensity change forms the line which flows horizontally in the temporal direction, as indicated by 1051 .
- This pixel having no temporal intensity change can be considered as belonging to the background.
- the horizontal line may break even in the background. This is because even a background pixel's intensity changes with illuminance changes, such as sunshine conditions, and with the movement of objects that make up the background.
- an object moving in the frame image appears as an image, as indicated by 1052 , and usually forms no horizontal line.
- the moving object forms a horizontal line only when it stands still on the slit or when the slit lies along the moving direction. This can be included in the aforementioned case in which an object forming the background changes.
- the background appears as the horizontal line 1051, and everything else appears as patterns such as 1052.
- these patterns 1052 are thought to come from either a moving object or a background change.
- whether or not it belongs to the background is judged on the basis of the characteristic of the spatial-temporal image 1050 to detect/extract the moving object.
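The slit and spatial-temporal image construction can be sketched as follows (assuming a vertical slit at a fixed column and grayscale frames as NumPy arrays; the slit placement is illustrative):

```python
import numpy as np

def spatio_temporal_image(frames, slit_col):
    """Stack the slit (one column) of every frame into rows over time."""
    return np.stack([frame[:, slit_col] for frame in frames])

# a static background: every slit image is identical,
# so each background pixel forms a horizontal line over time
frames = [np.arange(16, dtype=np.uint8).reshape(4, 4) for _ in range(5)]
st = spatio_temporal_image(frames, slit_col=2)
print(st.shape, bool((st == st[0]).all()))
```

Each row of `st` is one slit image 1040; pixels with no temporal change produce identical values down each column, i.e. the horizontal lines 1051.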
- FIG. 4 shows the relation in the distance between the slit image of the background period and the current slit at each time.
- the slit image 1040 is extracted from an interval of the spatial-temporal image 1050 in which neither the moving object nor the background changes, and is set as the background slit 1041.
- the slit image 1040 extracted at any other time is set as the current slit 1042.
- consider the slit vector whose elements are the intensities of the individual pixels composing the slit image 1040, and the distance between two slit vectors, as given by formula 1060. If this distance is determined for the current slit 1042 at each time of the spatial-temporal image 1050, a graph is obtained as the distance sequence 1070, in which the distances are arranged in chronological order.
- the following facts can be derived from the characteristics of the distance sequence 1070 and the spatial-temporal image 1050 , as described with reference to FIG. 3.
- a flat portion of the distance sequence 1070 that is at least a certain length is predicted to belong to the background, with no moving object present. In the other, more variable portions, it is thought that a moving object has passed or that the background has changed.
- the flat portion of at least the certain length is defined as a background period 1071
- the remaining portions are defined as a moving object period 1072.
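The distance sequence of formula 1060 and the flat-interval test can be sketched as follows (the Euclidean distance, window length, and flatness tolerance are assumptions for illustration):

```python
import numpy as np

def distance_sequence(st_image, background_slit):
    """Distance between the background slit vector and each time's slit vector."""
    diff = st_image.astype(np.float64) - background_slit.astype(np.float64)
    return np.linalg.norm(diff, axis=1)

def background_periods(dists, window=5, flat_tol=1.0):
    """Start times whose window of distances is flat (max - min below tol)."""
    flat = []
    for t in range(len(dists) - window + 1):
        w = dists[t:t + window]
        if w.max() - w.min() < flat_tol:
            flat.append(t)
    return flat

bg = np.full(8, 100, dtype=np.uint8)
st = np.tile(bg, (12, 1))
st[5:7] = 200                      # a moving object crosses the slit at t = 5..6
d = distance_sequence(st, bg)
print(background_periods(d))       # flat windows avoid the disturbed frames
```

Windows overlapping the object's passage show a large max-min spread and are excluded; the surviving windows are the background periods 1071.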
- FIG. 5 shows the spatial-temporal image 1050 and the distance sequence 1070 when the structure change of the background occurs.
- a moving object 1100 drops a falling object 1200 onto the slit 1030 , as shown in the movie 1010 of FIG. 5( a ).
- the background structure is changed by the falling object 1200, so the spatial-temporal image 1050 becomes as shown in FIG. 5( b ).
- the image 1201 of the falling object 1200 appears just behind the image 1101 on the spatial-temporal image of the moving object 1100 .
- Partway through, the image 1201 becomes part of the background, so it turns into a horizontal line, as shown in FIG. 5( b ).
- the distance sequence 1070 is determined, as shown in FIG. 5( c ), by using the background slit 1041.
- both a background period 1073 and a background period 1074 are flat portions of the distance sequence 1070 of at least the certain length, so both qualify as background periods 1071.
- when the slit image is identical to the background slit 1041, the average value is substantially zero, as in the background period 1073.
- the background period 1074, by contrast, takes an average value at or above a certain level, because its slit differs from the background slit 1041 owing to the image 1201 of the falling object.
- the image 1201 of the falling object is detected as the difference from the background slit 1041.
- the average value of the distance sequence differs depending upon whether or not the slit image is identical to the background slit 1041. Therefore, the individual background periods 1071 are defined as the true background period 1073 and the false background period 1074 so that they may be differentiated.
- the cause of a false background period 1074 may be not only a structure change of the background but also an abrupt illuminance change. For either reason, the occurrence of a false background period 1074 means that the background has changed.
- the slit 1042 may be updated as a new background from the false background period 1074 to repeat the judgments by the distance sequence 1070.
- the invention separates the intervals of the background and the moving object by deeming a flat period of at least a certain length in the distance sequence 1070 as the background period 1071 and the remainder as the moving object period 1072.
- among the background periods 1071, moreover, one having an average value near zero is classified as a true background period 1073, whereas the others are classified as false background periods 1074.
- for a false background period 1074, assuming that the background has changed through an illuminance change or a structure change, the slit image 1042 of the false background period 1074 is updated as the new background, and the foregoing judging procedure is repeated.
- this yields a method of judging the presence of the moving object and the structure change of the background by discriminating, at all times, among the three items: the moving object, the background, and the structure change of the background.
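The true/false classification of a flat background period can be sketched as follows (the near-zero tolerance is an assumed value, not specified by the patent):

```python
def classify_background_period(dists, zero_tol=1.0):
    """Classify a flat background period by the mean of its distances.

    Near-zero mean -> true background period (the slit matches the held
    background); otherwise -> false background period (the background itself
    has changed, and the current slit should replace the held background slit).
    """
    mean = sum(dists) / len(dists)
    return "true" if mean < zero_tol else "false"

print(classify_background_period([0.1, 0.2, 0.1]))    # slit matches background
print(classify_background_period([50.0, 50.2, 49.9])) # background has changed
```

Both inputs are flat (small spread), so only the level of the mean distinguishes the true period 1073 from the false period 1074.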
- FIG. 6 explains the data flow of the background judgment means 700 for realizing the aforementioned method.
- This background judgment means 700 includes slit image creation means 701 , background slit hold means 702 , distance calculation means 703 , distance sequence hold means 704 , distance sequence smoothing means 710 , smoothed time sequence hold means 705 , background period judgment means 720 , background true/false judgment means 730 , moving object period acquire means 740 and background structure change judgment means 750 .
- the slit image creation means 701 creates the current slit 1042 to be judged on the basis of the digital image data 1000 inputted, and transmits it to the distance calculation means 703 .
- the background slit hold means 702 holds the background slit 1041, which is judged by the background true/false judgment means 730 or the background structure change judgment means 750, and transmits it in response to the demand from the distance calculation means 703.
- the distance calculation means 703 calculates the distance in accordance with the formula 1060 by assuming the current slit 1042 and the background slit 1041 as vectors.
- the distance ⁇ calculated is transmitted to the distance sequence hold means 704 .
- This distance sequence hold means 704 holds the calculated distance ⁇ over a past constant time period so that the distance ⁇ may be handled as a sequence.
- the distance sequence 1070 is updated to discard the oldest value and contain the newest value each time the distance is newly transmitted. Moreover, the distance sequence hold means 704 transmits the distance sequence 1070 to the distance sequence smoothing means 710 in response to a demand from the distance sequence smoothing means 710 .
- This distance sequence smoothing means 710 smoothes the distance sequence 1070 stored in the distance sequence hold means 704 by the moving average method. This is because small vibrations are frequently caused in the distance sequence by influences such as jitter. The smoothed distance sequence 1070 is transmitted to the smoothed sequence hold means 705.
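The moving-average smoothing can be sketched as follows (the window size is an assumption for illustration):

```python
import numpy as np

def smooth(dists, window=3):
    """Suppress jitter in the distance sequence with a simple moving average."""
    kernel = np.ones(window) / window
    return np.convolve(dists, kernel, mode="valid")

# alternating jitter is flattened toward the local mean
print(smooth(np.array([0.0, 3.0, 0.0, 3.0, 0.0])).tolist())
```

Each output value averages `window` consecutive distances, so single-frame vibrations no longer break the flat portions used for background-period detection.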
- This smoothed sequence hold means 705 holds the latest smoothed distance sequence. This distance sequence is transmitted to the background period judgment means 720, the background true/false judgment means 730 and the moving object period acquire means 740 in response to their individual demands.
- the background period judgment means 720 searches the background period 1071 from the smoothed latest distance sequence 1070 and transmits the result as the interval to the background true/false judgment means 730 and the moving object period acquire means 740 , respectively.
- This search of the background period 1071 is realized by judging the flat portion of the smoothed distance sequence 1070, as described with reference to FIG. 4.
- the search algorithm will be detailed with reference to FIG. 7.
- the background true/false judgment means 730 judges whether the background period is the true one 1073 or the false one 1074 , on the basis of the background period 1071 and the smoothed distance sequence 1070 . After this, the current slit image 1042 is transmitted to the background slit hold means 702 and the background structure change judgment means 750 in accordance with the judgment result. In the case of the true background period 1073 , the current slit image 1042 is transmitted as a new background slit 1041 to the background slit hold means 702 . In the case of the false background period 1074 , the current slit image 1042 is transmitted to the background structure change judgment means 750 to extract the structure change of the background. The algorithm for the true/false judgment will be detailed with reference to FIGS. 8 to 11 .
- The moving object period acquire means 740 determines the maximum of the smoothed distance sequence 1070 within the interval in which the moving object is predicted to exist, and returns the number of moving objects predicted for the period and the time of the maximum portion as the moving object period.
- The background structure change judgment means 750 judges whether the background change is caused by the structure change or the illuminance change, from both the background slit 1041 stored in the background slit hold means 702 and the current slit image 1042 transmitted from the background true/false judgment means 730, and thereby updates the current slit image 1042 as a new background. This judgment algorithm will be detailed with reference to FIG. 12.
- FIG. 7 shows the flow chart of the background period judgment means.
- First, the smoothed sequence 1070 for a constant interval, e.g., the latest forty-five frames, is acquired.
- The maximum and minimum for the interval are acquired (at Step 2002) from the smoothed sequence 1070. If the difference between the maximum and the minimum is over a predetermined threshold, it is judged that the period is not the background period 1071, and the procedure is ended. If the difference is below the threshold, it is judged that the period is the background period 1071, and the routine advances to Step 2004 (at Step 2003). Finally, the leading and ending times of the sequence are returned as the interval (at Step 2004).
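The flat-portion test of FIG. 7 can be sketched as below. The function names and the choice of threshold are illustrative assumptions; only the max-minus-min test against a threshold is taken from the text.

```python
def is_background_period(smoothed, threshold):
    """Judge whether the smoothed distance sequence for the interval is
    flat enough to be a background period (Steps 2002-2003): the spread
    between its maximum and minimum must stay below the threshold."""
    return (max(smoothed) - min(smoothed)) < threshold

def background_interval(smoothed, threshold):
    """Return the (leading, ending) indices of the interval when it is
    judged as a background period (Step 2004), else None."""
    if is_background_period(smoothed, threshold):
        return (0, len(smoothed) - 1)
    return None
```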
- FIG. 8 shows the influences of the illuminance change upon the slit image.
- Consider the slit 1040, in which the brightnesses of the individual pixels are given by P 1 to P n, as shown in FIG. 8( a ).
- A graph is drawn taking the positions of the pixels on the abscissa and their brightnesses on the ordinate, as indicated by 1046 in FIG. 8( b ).
- Let the slit image 1040 be a vector v 1048 having the individual pixel brightnesses as its elements, as shown in FIG. 9( a ). If the base vectors of the individual pixels P 1 to P n are designated by b 1, b 2, b 3, ..., and b n, the vector v 1048 can be expressed as one point in an n-dimensional vector space, as shown in FIG. 9( b ). Next, it is assumed that an abrupt illuminance change occurs for that vector v 1048 so that the slit vector changes into a slit vector v′ 1049, as shown in FIG. 9( c ). At this time, it can be deemed from the consideration of FIG. 8 that the changed slit vector v′ 1049 lies on substantially the same straight line as the vector v 1048 and is a scalar multiple of the vector v 1048.
- Thus, the original slit vector v 1048 and the slit vector v′ 1049 changed by the illuminance have substantially identical directions even if they have quite different coordinate positions in the vector space.
- By contrast, a slit vector whose structure has changed is predicted to be quite different not only in the coordinate position but also in the direction. In order to discriminate between the illuminance change and the structure change of the slit 1040, therefore, it is sufficient to consider the direction.
- FIG. 10 shows the projections of the ordinary slit vector v 1048 and the slit vector v′ 1049, which is influenced by the illuminance change, onto a unit sphere.
- The distance PQ between the projection P 1048 ′ of the vector v 1048 onto the unit sphere and the projection Q 1049 ′ of the vector v′ 1049 onto the unit sphere becomes far shorter than the original distance vv′.
- This normalized intervector distance will be called the normalized distance so that it may be discriminated from the distance, as defined by the formula 1060 .
- this normalized distance is utilized to discriminate whether the background change is caused by the structure change or the illuminance change.
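The two distances can be sketched as follows. The text does not reproduce formula 1060, so the plain Euclidean intervector distance is an assumption here; the normalized distance is then the same distance taken after projecting both slit vectors onto the unit sphere.

```python
import math

def distance(u, v):
    """Euclidean distance between two slit vectors (assumed to
    correspond to the distance of formula 1060)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def normalized_distance(u, v):
    """Distance between the unit-sphere projections of two slit
    vectors.  A pure illuminance change only scales a slit vector, so
    this distance stays near zero; a structure change also rotates the
    vector, so the distance grows."""
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return distance([a / nu for a in u], [b / nv for b in v])
```

For a slit uniformly brightened by a factor of two, the ordinary distance is large while the normalized distance is essentially zero, which is exactly the discrimination the text relies on.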
- FIG. 11 shows the flow chart of the background true/false judgment means 730 .
- the background period 1071 is acquired (at Step 2101 ) from the background period judgment means 720 .
- an average value is acquired (at Step 2102 ) from the smoothed sequence of the background period 1071 . If the average value is below a predetermined threshold, the given background period 1071 is judged to be the true background period 1073 , and the routine advances to Step 2104 . If the average is over the threshold, the background period is judged to be false (at Step 2103 ), as at 1074 , and the routine advances to Step 2105 .
- At Step 2104, the background period 1071 is true, as at 1073, and the routine is ended by storing the latest slit as the new background slit 1041 in the background slit hold means 702. At Step 2105, the background period 1071 is false, as at 1074, and the routine is ended by judging whether this is due to the illuminance change or the structure change, by the background structure change judgment means 750.
- FIG. 12 shows the flow chart of the background structure change judgment means 750 .
- The normalized distance between the background slit 1041 and the latest current slit 1042 in the smoothed sequence is determined (at Step 2201). If this normalized distance is below a predetermined threshold, the change is judged to be caused by the illuminance change, and the routine advances to Step 2203. If it is over the threshold, the change is judged (at Step 2202) to be caused by the structure change, and the routine advances to Step 2204.
- At Step 2203, the background period 1071 is the false background period 1074 due to the illuminance change.
- The routine is ended by setting all values in the smoothed sequence to zero and by storing the current slit 1042 as the new background slit 1041 in the background slit hold means 702.
- At Step 2204, the background period 1071 is the false background period 1074 due to the structure change.
- The routine is likewise ended by storing the latest current slit 1042 as the new background slit 1041 in the background slit hold means 702 and by setting all values in the smoothed sequence to zero.
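The FIG. 12 flow can be sketched as below. The threshold value and all names are illustrative assumptions; what is taken from the text is the decision rule (normalized distance against a threshold) and the fact that both branches install the current slit as the new background and clear the smoothed sequence.

```python
import math

def _normalized_distance(u, v):
    # Distance between the unit-sphere projections of u and v.
    nu = math.sqrt(sum(a * a for a in u)) or 1.0
    nv = math.sqrt(sum(b * b for b in v)) or 1.0
    return math.sqrt(sum((a / nu - b / nv) ** 2 for a, b in zip(u, v)))

def judge_background_change(background_slit, current_slit, smoothed_seq,
                            norm_threshold=0.1):
    """Decide whether a false background period is due to an illuminance
    change or a structure change (Steps 2201-2204).  Returns the judged
    cause, the new background slit, and the reset smoothed sequence."""
    d = _normalized_distance(background_slit, current_slit)
    cause = "illuminance" if d < norm_threshold else "structure"
    return cause, list(current_slit), [0.0] * len(smoothed_seq)
```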
- FIG. 13 summarizes the method of extracting the moving object exclusively by separating/judging the background change region and the moving object region from the spatial-temporal image.
- FIGS. 14 and 15 summarize the method of calculating the moving direction and velocity.
- FIGS. 16 to 19 explain the motion analysis means for realizing those methods.
- FIG. 13 summarizes the method of extracting the moving object exclusively by separating/judging the background change region and the moving object region from the spatial-temporal image 1050 .
- First, a spatial-temporal image 1053 in the interval where the moving object is thought to exist is cut out of the spatial-temporal image 1050 on the basis of the moving object interval 1001, and the original background slit 1041 and a future background slit 1041 ′ are acquired.
- Then, an original background differential image 1054 is created from the original background slit 1041 and the spatial-temporal image 1053, and a future background differential image 1055 is created from the future background slit 1041 ′ and the spatial-temporal image 1053.
- Each background differential image contains not only a moving object region 1102 but also the differential region 1202 caused by the structure change of the background.
- the logical product of the original background differential image 1054 and the future background differential image 1055 is determined to extract the moving object region 1102 .
- the differential region 1202 from the background structure change is canceled so that only the moving object region 1102 or the common region can be extracted.
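The merging step is a pixelwise logical AND of the two binary difference masks. A minimal sketch, assuming the masks are given as nested lists of 0/1 values:

```python
def merge_background_differences(orig_diff, future_diff):
    """Merge the original and future background differences by a
    pixelwise logical product.  A structure-change region appears in
    only one of the two differences, so only the moving object region
    (the common region) survives the AND."""
    return [[a & b for a, b in zip(row_o, row_f)]
            for row_o, row_f in zip(orig_diff, future_diff)]
```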
- FIG. 14 summarizes the slit setting method for analyzing the motions of the moving object in the movie and the method of calculating the moving direction/velocity 1003 of the moving object 1101 which is extracted from the spatial-temporal image 1050 obtained by the slit setting method.
- The image of the moving object 1101 is inclined forward or backward in the spatial-temporal image 1050 thus obtained, as shown in FIG. 14. This is because the upper or lower side of the moving object reaches the slit 1030 faster than the opposite side. As a result, even with a single slit, it can be determined from the positive or negative sign of the inclination whether the moving object has moved from the left or from the right. From the magnitude of the inclination 1210, moreover, the average velocity at which the object crosses the slit 1030 can be calculated.
- the moving direction/velocity 1003 of the moving object 1101 can be calculated according to the invention.
- The moving object region 1201 is extracted from the spatial-temporal image 1050 which is obtained from the slit 1030 set at an inclination, and this inclination 1210 is calculated from the moments of the region to estimate the moving direction and velocity.
- FIG. 15 explains the principle for calculating the moving direction/velocity 1003 of the extracted moving object, from the inclination of the slit 1030 and the inclination 1210 of the moving object image 1101.
- The inclination of the slit 1030 is designated by θ. It is assumed that the moving object 1100 having a horizontal velocity v passes the slit 1030. If the moving object 1100 has a height h and if the moving object 1100 moves by w after its upper portion passes the slit and before its lower portion passes the slit, the horizontal moving velocity v is expressed by a formula 1610. Next, the inclination of the image 1101 of the moving object in the spatial-temporal image 1050 is designated by φ.
- Then, the number of frames s required for the moving object to move by w in the frame images is described by a formula 1620.
- A formula 1630 is obtained when the formulas 1610 and 1620 are rearranged for v.
- the positive and negative signs of v indicate the directions, and the absolute value indicates the magnitude of the horizontal velocity component.
- the moving direction/velocity 1003 are calculated from the inclination of the slit 1030 and the inclination of the image 1101 of the moving object 1100 in the spatial-temporal image 1050 .
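One plausible reconstruction of this calculation is sketched below. The formulas 1610 to 1630 appear only in FIG. 15, so the exact form here is an assumption: with slit inclination θ from the vertical, an object of height h offsets the crossing times of its top and bottom by the distance w = h·tan(θ); if the image inclination φ is defined so that tan(φ) = h/s (object height in pixels per frame), then s = h/tan(φ) frames elapse while the object covers w, giving v = w·f/s = f·tan(θ)·tan(φ) at frame rate f.

```python
import math

def horizontal_velocity(theta_deg, phi_deg, frame_rate):
    """Assumed reconstruction of formula 1630: horizontal velocity from
    the slit inclination theta and the inclination phi of the moving
    object's image in the spatial-temporal image.  The sign of the
    result indicates the moving direction, its absolute value the
    magnitude of the horizontal velocity component."""
    theta = math.radians(theta_deg)
    phi = math.radians(phi_deg)
    return frame_rate * math.tan(theta) * math.tan(phi)
```

Note that a forward-inclined image (positive φ) and a backward-inclined one (negative φ) give opposite signs, matching the left/right discrimination described above.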
- FIG. 16 shows the data flow of the motion analysis means 800 for realizing the aforementioned method.
- This motion analysis means 800 includes spatial-temporal image creation means 801 , background slit acquire means 802 , background difference merging means 803 and merged background difference inclination judgment means 820 .
- These spatial-temporal image creation means 801 , background slit acquire means 802 , background difference merging means 803 and merged background difference inclination judgment means 820 realize the method of extracting the moving object 1101 exclusively by separating/judging the background change region 1202 and the moving object region 1102 from the spatial-temporal image 1050 , as described with reference to FIG. 13.
- the spatial-temporal image creation means 801 creates the spatial-temporal image 1053 in the interval, for which the moving object 1101 exists, by acquiring the slit images from the digital image data 1000 and the moving object interval 1001 and by arranging them in the frame order.
- The spatial-temporal image 1053 is transmitted in response to demands from the background difference creation means 810 and 810 ′.
- the background slit acquire means 802 acquires the original background slit 1041 and the future background slit 1041 ′ from before and behind the interval, for which the moving object 1101 exists, on the basis of the digital image data 1000 and the moving object interval 1001 .
- The background difference creation means 810 and 810 ′ create the original background difference 1054 and the future background difference 1055 from the spatial-temporal image 1053, the original background slit 1041 and the future background slit 1041 ′.
- the detail of the background difference creation algorithm will be described with reference to FIG. 17.
- the background difference merging means 803 creates a merged background difference 1056 from the logical product of the created original background difference 1054 and future background difference 1055 . Only the moving object 1101 is extracted by the procedure described above.
- The merged background difference inclination judgment means 820 realizes the method of calculating the moving direction/velocity 1003 of the extracted moving object from the inclination of the slit 1030 and the inclination 1210 of the moving object, as described with reference to FIGS. 14 and 15.
- FIG. 17 shows the construction (or data flow) of the means 810 for creating the background difference by using the original background and the future background.
- the background difference creation means 810 includes moving object region separation means 811 , moving object region morphology means 812 , noise region elimination means 813 and occluded region supply means 814 .
- The moving object region separation means 811 binarizes only the moving object region 1201 to separate/extract it from either the background slit image 1041 and the current slit image 1042 or the background frame image and the current frame image. The details of this separation/extraction algorithm will be described with reference to FIGS. 18 and 19.
- The moving object region morphology means 812, the noise region elimination means 813 and the occluded region supply means 814 correct the rupture or segmentation of the moving object region, as caused by the moving object region separation means 811.
- The moving object region morphology means 812 connects the ruptured or segmented moving object region 1201 by morphological operations. About three such operations are applied.
- The noise region elimination means 813 eliminates, as noise, the minute regions left independent of the moving object region 1201 after the morphological operations.
- The occluded region supply means 814 searches for and fills in the holes contained in the moving object region 1201.
- By the procedure described above, the moving object 1200 is cut out as the moving object region 1201 to create the background differential image.
- This cutout can also be made by the same processings using the changed background image in place of the current image.
- FIG. 18 summarizes the moving object region separation means 811.
- the background slit 1041 and the current slit 1042 are acquired.
- The background slit 1041 and the current slit 1042 are compared for each pixel to judge whether the pixel belongs to the background or to the moving object. This comparison does not rely on the brightness of the single corresponding pixel; instead, a local slit composed of w pixels including the corresponding one is created, and the judgment is made by determining the normalized distance between the local slits.
- Specifically, a local slit 1044 containing the target pixel Pn 1045 and a corresponding background local slit 1043 are created, and the normalized distance between the two is determined. Because this normalized distance is used, the background can be correctly judged and eliminated even if the illuminance of a portion of it is changed by the shadow of the moving object or by light.
- FIG. 19 shows the flow chart of the moving object region separation means 811.
- the current slit 1042 to be judged is acquired from the movie 1010 (at Step 2301 ).
- The routine advances to Step 2303 if any pixel remains unjudged; otherwise the routine is ended (at Step 2302).
- the two local slits 1043 and 1044 are acquired sequentially from above (at Steps 2303 and 2304 ) from the background slit 1041 and the current slit 1042 .
- the dispersion of the background local slit 1043 is determined and is compared with the predetermined threshold TV.
- the routine advances to Step 2306 , if below the threshold TV, and otherwise to Step 2307 (at Step 2305 ).
- the normalized distance is determined (at Step 2306 ) as the distance between the two local slits 1043 and 1044 .
- the vector distance is determined (at Step 2307 ) as the distance between the two local slits 1043 and 1044 .
- the distance between the two local slits thus determined is compared with a predetermined threshold TD.
- the routine advances to Step 2309 , if below the threshold TD, and otherwise to Step 2310 (at Step 2308 ).
- If the distance is below the threshold, the target pixel is judged to belong to the background (at Step 2309); otherwise, it is judged to belong to the non-background (at Step 2310).
- The routine then returns to Step 2302, and the procedure described above is repeated.
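The per-pixel separation of FIG. 19 can be sketched as below. The window width w and the thresholds TV and TD are assumptions (the patent leaves their values to the figure), but the branching follows the flow chart: when the background local slit has low dispersion the normalized distance is used, otherwise the plain vector distance.

```python
import math

def separate_moving_region(background_slit, current_slit,
                           w=3, tv=25.0, td=0.1):
    """Return a 0/1 mask over the current slit, 1 meaning the pixel is
    judged as non-background (a moving object candidate)."""
    n = len(current_slit)
    half = w // 2
    mask = []
    for i in range(n):                               # Steps 2302-2304
        lo, hi = max(0, i - half), min(n, i + half + 1)
        bg, cur = background_slit[lo:hi], current_slit[lo:hi]
        mean = sum(bg) / len(bg)
        dispersion = sum((x - mean) ** 2 for x in bg) / len(bg)
        if dispersion < tv:                          # Step 2305
            # Flat background: use the normalized distance so that a
            # shadow or illuminance change is still judged as background.
            nb = math.sqrt(sum(x * x for x in bg)) or 1.0
            nc = math.sqrt(sum(x * x for x in cur)) or 1.0
            d = math.sqrt(sum((a / nb - b / nc) ** 2
                              for a, b in zip(bg, cur)))   # Step 2306
        else:
            d = math.sqrt(sum((a - b) ** 2
                              for a, b in zip(bg, cur)))   # Step 2307
        mask.append(0 if d < td else 1)              # Steps 2308-2310
    return mask
```

With a flat background, a uniform brightening (e.g., a shadow lifting) yields an all-zero mask, while a localized brightness structure is flagged as a moving object region.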
- FIGS. 20 and 21 explain the method of extracting the moving object and the background by separating/judging the background change region and the moving object region from the frame image.
- FIG. 20 summarizes the method of extracting the moving object exclusively by separating/judging the background change region and the moving object region from the frame image.
- First, an original background image 1801, a future background image 1803 and a current frame image 1802 are acquired.
- Let a moving object 1804 and a falling object 1805 be projected on the current frame image 1802, and let another falling object 1806 be projected, in addition to the falling object 1805, on the future background image 1803.
- Next, an original background difference 1807 between the original background image 1801 and the current frame image 1802 and a future background difference 1808 between the future background image 1803 and the current frame image 1802 are created. A moving object region 1809 and a background change region 1810 by the falling object 1805 appear in the original background difference 1807.
- In the future background difference 1808, there appear the moving object region 1809 and a background change region 1811 by the falling object 1806.
- FIG. 21 summarizes the method of extracting the background change exclusively by separating/judging the background change region and the moving object region from the frame image.
- the original background image 1801 and the future background image 1803 are acquired.
- In the future background image 1803, the falling object 1805 and the falling object 1806 are reflected.
- a background difference 1901 between the original background image 1801 and the future background image 1803 is created.
- In this background difference 1901, there appear the background change region 1810 by the falling object 1805 and the background change region 1811 by the falling object 1806.
- If a cutout is made from the future background image 1803 by using this background difference 1901 as the mask image, it is possible to obtain a background structure change image 1902 containing only the falling objects 1805 and 1806.
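This extraction of the background change alone can be sketched as below. The per-pixel absolute-difference threshold is an assumption; the text only states that the difference between the two backgrounds serves as a mask for cutting out the future background.

```python
def extract_background_change(orig_bg, future_bg, threshold=10):
    """Build a mask from the pixels where the original and future
    background images differ, then cut the future background out with
    that mask; only the newly added (fallen) objects survive."""
    mask = [[1 if abs(a - b) > threshold else 0
             for a, b in zip(ro, rf)]
            for ro, rf in zip(orig_bg, future_bg)]
    change_image = [[p if m else 0 for p, m in zip(rf, rm)]
                    for rf, rm in zip(future_bg, mask)]
    return mask, change_image
```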
- FIG. 22 shows the construction (or data flow) of the moving object extraction means 900 for realizing the aforementioned method.
- The moving object extraction means 900 is constructed to include six components: frame image acquire means 901, background image creation means 902, background difference creation means 910 and 910 ′, background difference merging means 903 and moving object cutout means 904. These means realize the method of extracting only the moving object by separating/judging the background change region and the moving object region, as described with reference to FIG. 20.
- the frame image acquire means 901 acquires the frame image 1802 of the interval, for which the moving object 1100 seems to exist, from the moving object interval 1001 and the digital image data 1000 , and transmits it to the background difference creation means 910 and 910 ′.
- the background image creation means 902 acquires the frame image of the interval, which is judged as the background, from the moving object interval 1001 and the digital image data 1000 , and transmits it as the original background image 1801 and the future background image 1803 to the background difference creation means 910 and 910 ′.
- These background difference creation means 910 and 910 ′ repeat the processings of the background difference creation means 810 and 810 ′, as described with reference to FIG. 17.
- the background difference merging means 903 creates the merged background difference 1812 from the logical product of the original background difference 1807 and the future background difference 1808 of the frame image and transmits it to the moving object cutout means 904 .
- This moving object cutout means 904 uses the merged background difference 1812 as a mask image to cut the moving object image 1804 out of the frame image 1802.
- FIG. 23 shows an example of the result display screen which is outputted onto the display 300 by the result output means 600 .
- the result display screen 2000 is constructed to include at least the four components of an input image movie display region 2010 , a background change representative screen display region 2020 , a moving object representative screen display region 2030 and a correlated value sequence display region 2040 .
- The slit 1030 is placed upright in the middle of the movie 1010; representative screens 2022 and 2023 of the changes in the background are displayed in the background change representative screen display region 2020, and a representative screen 2032 of the moving object having passed through the slit 1030 is displayed in the moving object representative screen display region 2030.
- The sequence of the correlated values of the slit (i.e., the distance sequence 1070) is displayed in the correlated value sequence display region 2040 to indicate to the user the grounds for judging the presence of the moving object and the background change.
- the input movie display region 2010 is a portion for displaying the present movie 1010 which is inputted from the TV camera 200 .
- The background change representative screen display region 2020 is a portion for displaying the background representative screen 2022 before a change and the background representative screen 2023 after the change, by detecting the structure change of the background in the movie 1010.
- the detected change in the background structure is displayed as a pair of upper and lower parts of the background representative screen 2022 before change and the background representative screen 2023 after change in a background change display window 2021 so that their difference may be judged by the user.
- This screen example shows that the background change is exemplified by the parking of an automobile or by an object fallen from a truck.
- the background change display window 2021 is provided with a scroll bar so that the change in the background structure thus far detected may be observed.
- a marker 2024 is attached so that the group of the latest representative screens may be quickly understood.
- the moving object representative screen display region 2030 is a portion for displaying the representative screen 2032 projecting the moving object by detecting this object in the movie 1010 .
- the detected moving object is displayed in a moving object representative screen display window 2031 so that it may be judged by the user.
- This screen example displays that the moving object is exemplified by a motorbike, a black automobile, a white automobile, a gray automobile or a truck.
- the moving object representative screen display window 2031 is provided with a scroll bar so that the representative screen of the moving objects thus far detected may be observed.
- a marker 2033 is attached so that the latest moving object may be discriminated.
- the correlated value sequence display region 2040 is a portion for displaying both the spatial-temporal image 1050 obtained from the slit 1030 and the sequence 1070 of the correlated values (or distances) at the corresponding time.
- the pixels and graph values of the spatial-temporal image 1050 and the distance sequence 1070 at the latest time are always displayed at the righthand of a correlated value sequence display window 2041 .
- Moreover, a moving object detection marker 2042, indicating the position on the spatial-temporal image at the time of detecting the moving object, and a background change detection marker 2043, indicating the position on the spatial-temporal image at the time of detecting the background change, are displayed.
- In the foregoing description, the window area of interest on the movie 1010 is exemplified by the slit 1030.
- In the processings of the background judgment means 700 and the moving object extraction means 900, however, essentially the same operations are performed on any assembly of a plurality of adjacent pixels, even if its shape differs from that of the slit 1030.
- Therefore, the window area of interest may also have a square, circular or concentric shape.
- In another example, a movie of ten-odd hours, as obtained from a TV camera attached to the entrance of a house or office, is judged with the correlated value sequence of the entire frame image so that a list of the representative images of visitors or delivered parcels may be extracted.
- According to the invention, the background and the moving object can be judged so that the moving object can be exclusively detected, even under a complicated background having a change in the illuminating condition or a structure change.
- No restriction is exerted upon the shape, color, moving direction or velocity of the moving object to be extracted.
- Moreover, the moving direction and velocity can be calculated.
- Since the object to be processed amounts to only several percent of the pixels in the movie, the processing speed is ten times or more that of the moving object extraction apparatus of the prior art.
- The amount of memory to be used can also be reduced to several percent.
- Thus, real time processing can be achieved even by an inexpensive computer such as a personal computer.
Abstract
A moving object is detected from a movie. The actual movie has a complicated background. In order to detect the moving object, the invention is constructed to comprise, in addition to means 500 for inputting the movie and a display 300 for outputting the processed result: means 700 for judging the interval which is predicted to belong to the background as to a pixel region in the movie; means 800 for extracting the moving object; and means 900 for calculating the moving direction and velocity of the moving object.
Thanks to the above-specified construction, even under a complicated background in which not only a change in the illumination condition but also a structure change may occur, the presence of a structure change of the background can be judged so as to detect/extract the moving object in real time. Moreover, the moving direction and velocity of the moving object can also be calculated.
Description
- The invention relates to a moving object detection apparatus and method for monitoring a movie which is inputted with a camera, to measure the traffic flows on roads, to detect failures on railroads/crossings, and to prevent crimes in banks or convenience stores.
- At present, various places such as roads, crossings or the service floors of banks are monitored with camera movies. This technique is intended to prevent traffic jams, accidents or crimes in advance by monitoring objects moving in a specified place (as will be called the “moving bodies” or “moving objects”). In the traffic flow surveys frequently undergone at roads, for example, the statistical data on the traffic flows can be collected by monitoring how many automobiles, motorbikes, bicycles or pedestrians pass the monitoring area and by classifying the traffic flows into various categories. In the monitoring of the traffic jams on roads, the accidents at crossings or the service floors of banks or convenience stores, on the other hand, accidents or crimes can be prevented in advance by detecting failures such as the jams, the stops of automobiles due to engine stalls, falling objects or the suspicious behaviors of customers. Thus, there are high needs for movie-monitoring the moving objects. At its present technical level, however, this movie-monitoring cannot do without human labor. This causes problems of high cost and susceptibility to human error. In this environment, automation of the monitoring by computers or the like is desired, and various methods have been proposed using models or templates.
- The actual movie-monitoring frequently takes place not indoors but outdoors. As a result, the objects and backgrounds are intensely influenced by climate conditions such as rainfalls or snowfalls and by illumination conditions such as sunshine or street lights. By the shadow of the environment or the reflection of light due to rainfall, for example, the apparent shapes are highly changed. When the illumination changes from sunlight to a mercury lamp, moreover, the contrast in brightness or color between the target to be monitored and the background will change. Even the movie of the same location changes in its image characteristics with the seasons and times. It frequently follows that a characteristic quantity effective under one condition cannot be extracted under another. Thus, under a complicated background, the monitoring has very low reliability depending upon the kind of characteristic quantity used in the recognition algorithm, so that its practical application is difficult.
- The asymptotic illumination change among those problems coming from the complicated backgrounds is solved by the moving object detection method using the background difference. This background difference is a method of separating/extracting only a moving object by taking a difference between the background image reflecting only the background and the frame image containing the moving object, exploiting the fact that the background will hardly change in a movie taken with a fixed camera. The background image is automatically acquired by a method determining and using the medians or modes of the intensity of each pixel along the time axis. FIG. 24 simply shows the principle of the moving object detection method using the background difference. If a background image 100 is given in advance for a scene 110 to be monitored, a moving object 111 can be separated/extracted as a scene change 121 from the differential image 120 between the background image 100 and the scene 110.
- The feature of this method is that it is robust at any monitoring place. This is because any complicated background, such as a utility pole 101, would be deleted by the differential operation if the camera had no motion. The prior art of the moving object detection method according to the background difference is exemplified by 1) IEICE Trans. D-II, Vol. J72-DII, No. 6, pp. 855-865, 1989, 2) IPSJ SIG-Notes, CV 75-5, 1991, and 3) IEICE Trans. D-II, Vol. J77-DII, No. 9, pp. 1716-1726, 1994.
- However, the method of the prior art has a problem in that it is weak against the structure change of the background. FIG. 25 simply shows this problem. For example, it is assumed that a stopped object 131 appears in the scene of the background image 100 to cause the structure change of the background, as shown in a scene 130. According to the method of the prior art, the parked object 131 is extracted as a change 141, as indicated in a differential image 140 between the background image 100 and the scene 130. However, it is impossible to discriminate whether the change 141 is caused by the moving object or by the structure change of the background. In a scene 150 on and after the structure change of the background, therefore, the structure change and the moving object merge into each other so completely, as in a region 161 in a scene 160, that they cannot be separated, even after a moving object 151 has passed.
- A number of structure changes of the background occur in actual movie monitoring. For example, an automobile having passed along a road may stop at a parking meter on the road edge to form part of a new background. On the contrary, an object having been stopped at the parking meter may move away, making the previously hidden region into a portion of the new background. When a passing object drops an object onto the road, the falling object may also form part of the new background. In addition, an object having passed through snow may leave its tracks.
- Thus, the method of the prior art using the background difference could not cope with the structure change of the background. This is because it is impossible to discriminate whether a portion having a changed background structure belongs to the moving object or to a new background region. For this discrimination, it is conceivable to execute a motion analysis of the moving object. For a motion analysis algorithm such as the optical flow, however, the number of moving objects has to be known in advance. Once the number of the moving objects is mis-recognized, the subsequent processing will find it difficult not only to separate the background change region but even to judge the presence of the background change itself.
- It can be enumerated as another problem that the separation/extraction of the moving object is unstable. This is because the background change region and the moving object region cannot always be correctly discriminated, for the aforementioned reason, even if the presence of the background change could be judged. When a parcel is dropped from a moving object and left on the road, for example, the moving object region is also updated as the background if the change in the new background by the fallen object is detected and the background is updated. As a result, noise comes into the region where the moving object was present at the background updating time. Thus, after the structure change of the background, the moving object cannot be correctly separated/extracted from the background, making it resultantly difficult to continue the monitoring process.
- In order to solve the problems thus far described, the invention has the following three objects.
- A first object is to judge whether a pixel region of interest belongs to the background or the moving object, thereby to judge the kind of the background change, if any.
- A second object is to extract only the moving object by separating/judging the background change region and the moving object region.
- A third object is to easily calculate the moving direction or velocity of the moving object extracted.
- First of all, the invention comprises, as its basic component means, means for inputting a movie, means for extracting/detecting a moving object, and means for outputting the processed result as the movie.
- Next, the following means are provided for realizing the judgment of the presence of the structure change in the moving object and the background for a predetermined pixel region according to the first object.
- The means are: means for acquiring, from the movie, the pixel region to be judged for the background; means for calculating the correlation between the pixel region at a given time and the pixel region of each frame; means for holding the calculated correlation values sequentially; means for judging the interval which is predicted to belong to the background because of the absence of the moving object; and means for deciding the interval for which the moving object is present, from the interval which is predicted to belong to the judged background. Moreover, the means for deciding the interval for which the moving object is present comprises: means for judging the presence of the background change from the interval which is predicted to belong to the judged background; means for classifying the background change into the illuminance change or the structure change; and means for deciding the interval for which the moving object is present, from the interval which is predicted to belong to the judged background.
- The following means are provided for realizing the extraction of only the moving object by separating/judging the background change region and the moving object region according to the second object.
- The means are: means for acquiring, from the movie, the frame image containing the moving object and the images in which only the two backgrounds are projected (that is, the original and future background images located before and after the interval for which the moving object is present); means for creating the original background differential image and the future background differential image from the frame image and the original and future background images; means for determining the merged region by a logical product of the original background differential image and the future background differential image; and means for cutting the moving object image out of the frame image and the merged region.
- The following means are provided for easily calculating the moving direction or velocity of the extracted moving object according to the third object.
- The means are: means for cutting out the spatial-temporal image of the interval for which the moving object is present; means for separating, from the spatial-temporal image, the slit images of only the two backgrounds (that is, the original background slit image and the future background slit image located before and after the interval for which the moving object is present) and the moving object region; means for correcting the moving object region by the morphology processing and the hole fill processing; means for determining a common merged region from the two corrected background differential images; and means for estimating the direction/velocity of the moving object by calculating the inclination of the obtained merged region.
- Other characteristics of the moving object detection apparatus and method will become apparent from the description to be made in the following.
- On the basis of the movie inputted by the movie input means, according to the invention, the moving object is detected by the following procedure.
- First of all, the structure changes in the moving object and the background are judged for a specific pixel region. The pixel region to be judged for the background is acquired from the movie for each frame, and the correlation with the pixel region at a given time is calculated. The correlation values thus calculated can be handled as a sequence. Next, for the sequence of the correlation values, the interval which is predicted to belong to the background because of the absence of the moving object is judged. Whether or not the background has changed in the interval predicted to belong to the background is judged, to classify the judged background change into the illuminance change or the structure change. At last, the interval for which the moving object is present is decided from the interval predicted to belong to the background.
- Next, only the moving object is extracted by separating/judging the background change region and the moving object region. First of all, the frame image containing the moving object is acquired on the basis of the interval for which the moving object is present. Next, the original background image and the future background image, as located before and after the interval for which the moving object is present, are acquired from the movie. Next, the original background differential image and the future background differential image are created from the frame image and the original and future background images. The merged region is determined by the logical product of the original background differential image and the future background differential image, and then the moving object image is cut out of the frame image and the merged region.
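The merging step just described can be sketched as follows. This is a minimal illustration assuming 8-bit grayscale images held as NumPy arrays; the threshold value and the function names are ours, not taken from the patent:

```python
import numpy as np

def background_difference(frame, background, threshold=30):
    """Mark pixels whose absolute difference from a background image
    exceeds the threshold."""
    return np.abs(frame.astype(np.int16) - background.astype(np.int16)) > threshold

def merged_region(frame, original_bg, future_bg, threshold=30):
    """Logical product of the original and future background differences:
    a pixel is kept only if it differs from BOTH backgrounds, so a region
    that became part of the new background (e.g. a dropped object, which
    matches the future background) is discarded."""
    return np.logical_and(
        background_difference(frame, original_bg, threshold),
        background_difference(frame, future_bg, threshold),
    )

def cut_out_moving_object(frame, mask):
    """Cut the moving object image out of the frame using the merged region."""
    out = np.zeros_like(frame)
    out[mask] = frame[mask]
    return out
```

In a one-row example, a pixel covered by the moving object differs from both backgrounds and survives the logical product, while a pixel changed only by a dropped object matches the future background and is discarded.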
- Then, the moving direction or velocity of the extracted moving object is simply calculated. First of all, the spatial-temporal image of the interval for which the moving object is present is cut out. Next, the moving object region is separated from the background slit images and the spatial-temporal image. The moving object region is corrected by the morphology and hole fill processing. The merged region is determined from the logical product of the corrected original background differential image and future background differential image, and the inclination of the merged region is calculated to estimate the direction/velocity of the moving object.
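The inclination calculation at the end of this procedure can be sketched as follows, assuming the merged region is available as a boolean position-versus-time array. The least-squares line fit is an illustrative choice of ours, since the text only states that the inclination is calculated:

```python
import numpy as np

def inclination(merged_region):
    """Fit a line through the pixels of the merged region in the
    spatial-temporal image (rows = position on the slit, columns = time).
    The slope is the inclination: its sign gives the moving direction and
    its magnitude the velocity in slit pixels per frame."""
    pos, t = np.nonzero(merged_region)
    slope, _ = np.polyfit(t, pos, 1)
    return slope

# An object advancing one slit pixel per frame traces a diagonal region:
region = np.eye(6, dtype=bool)
```

For the diagonal toy region above the fitted inclination is 1.0, i.e. one pixel per frame along the slit.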
- At last, the aforementioned processed results are displayed on the display by the result output means.
- Still further advantages of the present invention will become apparent to those of ordinary skill in the art upon reading and understanding the following detailed description of the preferred and alternate embodiments.
- The invention will be described in conjunction with certain drawings which are for the purpose of illustrating the preferred and alternate embodiments of the invention only, and not for the purpose of limiting the same, and wherein:
- FIG. 1 shows a hardware construction for realizing the invention;
- FIG. 2 shows a system construction for realizing the invention;
- FIG. 3 shows relations among a movie, a slit image and a spatial-temporal image;
- FIG. 4 shows a relation in distance between a
background slit 1041 and a current slit 1042 at each time; - FIG. 5 shows a spatial-
temporal image 1050 and a sequence of distances when a structure change occurs in the background; - FIG. 6 explains a data flow of background judgment means 700;
- FIG. 7 shows a flow chart of background period judgment means 720;
- FIG. 8 shows the influences of an illuminance change upon a
slit image 1040; - FIG. 9 shows the influences of an illuminance change upon a slit vector when the
slit image 1040 is deemed as a vector; - FIG. 10 shows the mapping of the ordinary slit vector and the slit vector, as influenced by the illuminance change, upon a unit sphere;
- FIG. 11 shows a flow chart of background true/false judgment means 730;
- FIG. 12 shows a flow chart of background structure change judgment means 750;
- FIG. 13 shows the summary of a method of extracting a moving
object 1100 exclusively by separating/judging a background change portion and a moving object region from a spatial-temporal image 1050; - FIG. 14 shows a slit setting method for analyzing the motion of the moving
object 1100 in the movie 1010, and the spatial-temporal image 1050 obtained by the method; - FIG. 15 explains the summary of a method for calculating the moving direction/
velocity 1003 of the extracted moving object from the inclination of a slit 1030 and the inclination 1210 of the moving object; - FIG. 16 shows a data flow of motion analysis means 800 for realizing the aforementioned method;
- FIG. 17 shows a data flow of
means 810 for creating an original background difference and a future background difference; - FIG. 18 shows the summary of moving object region separation means 811;
- FIG. 19 shows a flow chart of the moving object region separation means;
- FIG. 20 shows the summary of a method of extracting a moving object exclusively by separating/judging a background change portion and the moving object region relative to a frame image;
- FIG. 21 shows the summary of a method of extracting the background change exclusively by separating/judging the background change portion and the moving object region with respect to the frame image;
- FIG. 22 shows a data flow of moving object extraction means 900 for realizing the aforementioned method;
- FIG. 23 shows an example of the resultant display screen which is outputted on a
display 300 by result output means 600; - FIG. 24 shows the moving object detection/extraction by the conventional method using the background difference; and
- FIG. 25 shows a problem in the moving object detection/extraction by the conventional method using the background difference.
- One embodiment of the invention will be described in detail in the following.
- FIG. 1 shows one embodiment of the hardware construction for realizing the invention. A TV camera 200 takes a scene to be monitored, transforms it into video signals 201 and transmits them to a computer 400. At this transmission, the video signals 201 are digitized for each frame and stored in the memory of the computer 400. This computer 400 reads out its memory content, following the processing program which is stored at another address in the memory, to judge whether the pixels on the frame image belong to the background or the moving object, to extract the moving object and to estimate the moving direction/velocity. The image of the extracted moving object and the other accompanying processed results are transmitted to a display 300. This display 300 outputs the results processed by the computer 400, such as the background image and the image and moving direction/velocity of the moving object, to the screen. This information is transmitted through a network 210 to the display of a safety control unit or a monitor center. - FIG. 2 shows one example of the system construction which is realized in the
computer 400. This computer 400 includes video input means 500, result output means 600, background judgment means 700, motion analysis means 800 and moving object extraction means 900. The video input means 500 transforms the video signals into digital image data 1000 for each frame and transmits them to the background judgment means 700, the motion analysis means 800 and the moving object extraction means 900. - The result output means 600 displays the processed results of the background judgment means 700, the motion analysis means 800 and the moving object extraction means 900, such as a
background image 1002, a moving direction/velocity 1003 and a moving object image 1004 on the display such that they can be easily observed by the user. - The background judgment means 700 judges whether each pixel on the
digital image data 1000 belongs to the background, the moving object or a change in the background structure, and transmits moving object period information 1001 to the motion analysis means 800 and the moving object extraction means 900. The moving object period information 1001 is a collection of the periods (or intervals), for each pixel, in which the moving object is judged to exist. In addition, the background judgment means 700 transmits the background image 1002, or the accumulation of the pixels judged as the background, to the result output means 600. The detail of the background judgment method will be described with reference to FIGS. 3 to 12. - The motion analysis means 800 calculates the moving direction/
velocity 1003 of the moving object from the digital image data 1000 and the moving object period information 1001 and transmits them to the result output means 600. The detail of the method of calculating the moving direction/velocity of the moving object will be described with reference to FIGS. 13 to 19. - The moving object extraction means 900 extracts the moving
object image 1004 from the digital image data 1000 and the moving object period information 1001 and transmits it to the result output means 600. The detail of the extraction of the moving object image 1004 will be described with reference to FIGS. 20 to 22. - First of all, the method of judging the moving object and whether or not the structure changes in the background will be summarized with reference to FIGS. 3 to 5. Next, the background judgment means for realizing the method will be described with reference to FIGS. 6 to 12.
- FIG. 3 shows the relations among the movie, the slit image and the spatial-temporal image. The movie (or motion picture) is constructed of a sequence of twenty-five to thirty still images per second, called frame images. This sequence is schematically shown as a
movie 1010. In this case, the movie 1010 is an arrangement of frame images from time T0 to time Tn. A slit image 1040 is the collection of the pixels of a frame image 1020 which are contained in a segment called the slit 1030. The arrangement of these slit images 1040 in chronological order, one per frame, is called the spatial-temporal image 1050. This is because the spatial-temporal image 1050 contains both temporal and spatial information. - In the spatial-
temporal image 1050 of a fixed camera, a pixel having no temporal intensity change forms a line which flows horizontally in the temporal direction, as indicated by 1051. Such a pixel can be considered as belonging to the background. On the other hand, the horizontal line may break even in the background. This is because the intensity of a pixel, even in the background, changes with illuminance changes such as the sunshine condition and with the movement of an object constructing the background. - On the contrary, an object moving in the frame image appears as an image, as indicated by 1052, and usually forms no horizontal line. The moving object forms a horizontal line only when it stands still on the slit or when the slit is placed parallel to the moving direction. This can be included in the aforementioned case in which the object forming the background changes.
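The construction of the spatial-temporal image from slit images can be sketched in code. This is a toy illustration assuming grayscale frames held as NumPy arrays and a vertical slit; the function names are ours, not the patent's:

```python
import numpy as np

def slit_image(frame, x, y0, y1):
    """Collect the pixels of one frame that lie on a vertical slit."""
    return frame[y0:y1, x]

def spatial_temporal_image(frames, x, y0, y1):
    """Arrange the slit images of all frames in chronological order.
    Rows = slit pixels (space), columns = frames (time), so a background
    pixel with no intensity change traces a horizontal line."""
    return np.stack([slit_image(f, x, y0, y1) for f in frames], axis=1)

# With identical frames (a static background), every row is constant,
# i.e. every slit pixel forms a horizontal line:
frames = [np.full((8, 8), 100, dtype=np.uint8) for _ in range(5)]
st = spatial_temporal_image(frames, x=3, y0=0, y1=8)
```

A moving object crossing the slit would instead break this constancy in the columns it occupies.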
- Thus in the spatial-
temporal image 1050, the background appears as the horizontal line 1051, and everything else appears as the images 1052. The reason for these changes 1052 is thought to come either from a moving object or from a background change. In the invention, whether or not a pixel belongs to the background is judged on the basis of this characteristic of the spatial-temporal image 1050, to detect/extract the moving object. - FIG. 4 shows the relation in the distance between the slit image of the background period and the current slit at each time. First of all, the
slit image 1040 is extracted from an interval of the spatial-temporal image 1050 in which the moving object and the background do not change, and is set as a background slit β 1041. Next, the slit image 1040 is extracted at other times and set as a current slit τ 1042. Here is considered the slit vector, in which the intensities of the individual pixels composing the slit image 1040 are set as the vector elements, and a distance δ between two slit vectors, as given by formula 1060. If the distance δ is determined for the current slit τ 1042 at each time of the spatial-temporal image 1050, there can be obtained a graph given by a distance sequence 1070 in which the distances δ are arranged in chronological order. - The following facts can be derived from the characteristics of the
distance sequence 1070 and the spatial-temporal image 1050, as described with reference to FIG. 3. A flat portion of the distance sequence 1070 which has at least a constant length is predicted to belong to the background, so that it has no moving object. In the other portions, which have more changes, it is thought that a moving object has passed or that the background has changed. In order to discriminate these in the following description, the flat portion of the distance sequence 1070 having at least the constant length is defined as a background period 1071, and the remaining portions are defined as a moving object period 1072. - FIG. 5 shows the spatial-
temporal image 1050 and the distance sequence 1070 when a structure change of the background occurs. Here is considered the case in which a moving object 1100 drops a falling object 1200 onto the slit 1030, as shown in the movie 1010 of FIG. 5(a). In this case, the background structure is changed by the falling object 1200, so that the spatial-temporal image 1050 is taken as shown in FIG. 5(b). Specifically, the image 1201 of the falling object 1200 appears just behind the image 1101, on the spatial-temporal image, of the moving object 1100. This image 1201 forms part of the background on its way, so that it becomes a horizontal line, as shown in FIG. 5(b). - For this spatial-
temporal image 1050, the distance sequence 1070 is determined, as shown in FIG. 5(c), by using the background slit β 1041. Of this distance sequence 1070, both a background period 1073 and a background period 1074 are flat portions of the distance sequence 1070 having at least a constant length, so that they belong to the background period 1071. In a period of the same slit as the background slit β 1041, the average value is substantially zero, as at the background period 1073. On the other hand, the background period 1074 takes at least a constant average value because the slit is made different from the background slit β 1041 by the image 1201 of the falling object. This is because the image 1201 of the falling object is detected as the difference from the background slit β 1041. Even for the same background period 1071, the average value of the distance sequence differs depending upon whether or not the slit image is identical to the background slit β 1041. Therefore, the individual background periods 1071 will be defined as the true background period 1073 and the false background period 1074 so that they may be differentiated. - The reason for causing the
false background period 1074 is thought to come not only from the structure change of the background but also from an abrupt illuminance change. For either reason, the occurrence of the false background period 1074 means that the background has changed. In order to continue the judgment of the background or the moving object, the slit τ 1042 may be updated as a new background from the false background period 1074, to repeat the judgments by the distance sequence 1070. - On the basis of the characteristics described above, the invention separates the intervals of the background and the moving object by deeming a flat period of a constant length in the
distance sequence 1070 as the background period 1071 and the remainder as the moving object period 1072. Of the background periods 1071, moreover, one having an average value approximate to zero is classified as the true background period 1073, whereas the other is classified as the false background period 1074. In the case of the false background period 1074, assuming that the background has been changed by an illuminance change or a structure change, the slit image τ 1042 of the false background period 1074 is updated as a new background, and the foregoing judging procedures are repeated. By these procedures, there is realized the method of judging the presence of the moving object and the structure change of the background by discriminating the three items at all times: the moving object, the background and the structure change of the background. - FIG. 6 explains the data flow of the background judgment means 700 for realizing the aforementioned method. This background judgment means 700 includes slit image creation means 701, background slit hold means 702, distance calculation means 703, distance sequence hold means 704, distance sequence smoothing means 710, smoothed sequence hold means 705, background period judgment means 720, background true/false judgment means 730, moving object period acquire means 740 and background structure change judgment means 750.
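The judging procedure summarized above can be sketched as follows. The text does not reproduce formula 1060, so the Euclidean norm is assumed here for the inter-slit distance, and the thresholds are illustrative choices of ours:

```python
import numpy as np

def slit_distance(background_slit, current_slit):
    """Distance delta between two slit vectors (Euclidean norm assumed)."""
    return float(np.linalg.norm(current_slit.astype(float) -
                                background_slit.astype(float)))

def distance_sequence(st_image, background_slit):
    """Distance of the current slit at each time to the background slit."""
    return np.array([slit_distance(background_slit, st_image[:, t])
                     for t in range(st_image.shape[1])])

def is_background_period(seq, flat_threshold=5.0):
    """A flat portion (small max-min spread) is predicted to be background;
    a bumpy portion is a moving object period."""
    return float(seq.max() - seq.min()) < flat_threshold

def is_true_background(seq, avg_threshold=5.0):
    """A background period with near-zero average is a true background
    period; a flat but offset one (e.g. after a dropped object) is false."""
    return float(seq.mean()) < avg_threshold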
- The slit image creation means 701 creates the
current slit 1042 to be judged, on the basis of the digital image data 1000 inputted, and transmits it to the distance calculation means 703. - The background slit hold means 702 holds the
background slit 1041, which is judged by the background true/false judgment means 730 or the background structure change judgment means 750, and transmits it in response to the demand from the distance calculation means 703. - The distance calculation means 703 calculates the distance in accordance with the
formula 1060 by assuming the current slit 1042 and the background slit 1041 as vectors. The calculated distance δ is transmitted to the distance sequence hold means 704. - This distance sequence hold means 704 holds the calculated distance δ over a past constant time period so that the distance δ may be handled as a sequence. The
distance sequence 1070 is updated to discard the oldest value and contain the newest value each time a distance is newly transmitted. Moreover, the distance sequence hold means 704 transmits the distance sequence 1070 to the distance sequence smoothing means 710 in response to a demand from the distance sequence smoothing means 710. - This distance sequence smoothing means 710 smoothes the
distance sequence 1070, which is stored in the distance sequence hold means 704, by the moving average method. This is because small vibrations are frequently caused in the distance sequence by influences such as jitter. The distance sequence 1070 thus smoothed is transmitted to the smoothed sequence hold means 705. - This smoothed sequence hold means 705 holds the latest smoothed distance sequence. This distance sequence is transmitted to the background period judgment means 720, the background true/false judgment means 730 and the moving object period acquire means 740 in response to their individual demands.
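The smoothing step can be sketched as a simple moving average; the window size below is an illustrative choice, as the text does not specify one:

```python
import numpy as np

def smooth_distance_sequence(seq, window=5):
    """Moving-average smoothing to suppress small vibrations (e.g. jitter)
    in the distance sequence before the flatness judgment."""
    kernel = np.ones(window) / window
    # 'valid' avoids the zero-padded edge artifacts of mode='same'.
    return np.convolve(seq, kernel, mode="valid")
```

With mode="valid" the smoothed sequence is shorter by window - 1 samples, which is acceptable here because the judgment operates on a sliding window of recent frames anyway.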
- The background period judgment means 720 searches for the
background period 1071 in the latest smoothed distance sequence 1070 and transmits the result as an interval to the background true/false judgment means 730 and the moving object period acquire means 740, respectively. This search for the background period 1071 is realized by judging the flat portion of the smoothed distance sequence 1070, as described with reference to FIG. 4. The search algorithm will be detailed with reference to FIG. 7. - The background true/false judgment means 730 judges whether the background period is the true one 1073 or the
false one 1074, on the basis of the background period 1071 and the smoothed distance sequence 1070. After this, the current slit image 1042 is transmitted to the background slit hold means 702 or the background structure change judgment means 750, in accordance with the judgment result. In the case of the true background period 1073, the current slit image 1042 is transmitted as a new background slit 1041 to the background slit hold means 702. In the case of the false background period 1074, the current slit image 1042 is transmitted to the background structure change judgment means 750 to extract the structure change of the background. The algorithm for the true/false judgment will be detailed with reference to FIGS. 8 to 11. - The moving object period acquire means 740 determines the maximum on the basis of the interval in which the moving object exists and the smoothed
distance sequence 1070, and returns, as the moving object period, the number of moving objects predicted to exist in the period and the time of the maximum portion. - The background structure change judgment means 750 judges whether the background change is caused by the structure change or the illuminance change, from both the
background slit 1041 stored in the background slit hold means 702 and the current slit image 1042 transmitted from the background true/false judgment means 730, and thereby updates the current slit image 1042 as a new background. This judgment algorithm will be detailed with reference to FIG. 12. - FIG. 7 shows the flow chart of the background period judgment means 720. First of all, the smoothed
sequence 1070 for a constant interval (e.g., the latest forty-five frames) is acquired (at Step 2001) from the smoothed sequence hold means 705. Next, the maximum and minimum of the interval are acquired (at Step 2002) from the smoothed sequence 1070. If the difference between the maximum and the minimum is over a predetermined threshold, it is judged that the period is not the background period 1071, and the procedure is ended. If the difference is below the threshold, it is judged that the period is the background period 1071, and the routine advances to Step 2004 (at Step 2003). At last, the leading and ending times of the sequence are returned as the interval (at Step 2004). - FIG. 8 shows the influences of the illuminance change upon the slit image. First of all, there is considered the
slit image 1040 in which the brightnesses of the individual pixels are given by P1 to Pn, as shown in FIG. 8(a). Next, a graph is drawn taking the positions of pixels on the abscissa and the brightnesses of pixels on the ordinate, as indicated by 1046 in FIG. 8(b). - If an abrupt illuminance change occurs to darken the
slit image 1040 as a whole, the brightnesses of the individual pixels P1 to Pn in the slit image 1040 grow uniformly dark from 1046 to 1047 while holding their relations, as shown in FIG. 8(c). These illuminance changes are shown in FIG. 9 when the slit image is deemed a vector. - It can be deemed that the
slit image 1040 is a vector v 1048 having the individual pixel brightnesses as its elements, as shown in FIG. 9(a). If the base vectors of the individual pixels P1 to Pn are designated by b1, b2, b3, …, bn, the vector v 1048 can be expressed as one point in an n-dimensional vector space, as shown in FIG. 9(b). Next, it is assumed that an abrupt illuminance change occurs for that vector v 1048, so that the slit vector changes into a slit vector v′ 1049, as shown in FIG. 9(c). At this time, it can be deemed from the consideration of FIG. 8 that the changed slit vector v′ 1049 exists on a straight line substantially identical to that of the vector v 1048 and is a scalar multiple of the vector v 1048. - Thus, it can be understood that the
original slit vector 1048 and the slit vector 1049 changed by the illuminance have substantially identical directions even if they have highly different coordinate positions in the vector space. On the other hand, a slit vector having a changed structure is predicted to be highly different not only in the coordinate position but also in the direction. In order to discriminate the illuminance change and the structure change of the slit image 1040, therefore, it is sufficient to consider the direction. - FIG. 10 shows the projections of the
ordinary slit vector 1048 and the slit vector 1049, which is influenced by the illuminance change, upon a unit sphere. As shown in FIG. 10, the distance PQ between the projected vector P 1048′ of the vector v 1048 upon the unit sphere and the projected vector Q 1049′ of the vector v′ 1049 upon the unit sphere becomes far shorter than the original distance vv′. Whether the relation between two different slit images is caused merely by an illuminance change or by a structure change can be judged depending upon whether or not the vector distance on the unit sphere is extremely short. This normalized inter-vector distance will be called the normalized distance, so that it may be discriminated from the distance defined by the formula 1060. - In the invention, this normalized distance is utilized to discriminate whether the background change is caused by the structure change or the illuminance change.
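The normalized distance can be sketched directly: projection onto the unit sphere is division by the vector norm. The function name and the example vectors are ours:

```python
import numpy as np

def normalized_distance(v, w):
    """Distance between two slit vectors after projecting both onto the
    unit sphere. A pure illuminance change (w = k * v) gives ~0, while a
    structure change also moves the direction and gives a large value."""
    p = v / np.linalg.norm(v)
    q = w / np.linalg.norm(w)
    return float(np.linalg.norm(p - q))

v = np.array([10.0, 20.0, 30.0])
dim_light = normalized_distance(v, 0.5 * v)                        # illuminance change
rearranged = normalized_distance(v, np.array([30.0, 10.0, 20.0]))  # structure change
```

The uniformly darkened slit keeps its direction, so its normalized distance is essentially zero, while the rearranged slit changes direction and yields a clearly larger value.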
- FIG. 11 shows the flow chart of the background true/false judgment means 730. First of all, the
background period 1071 is acquired (at Step 2101) from the background period judgment means 720. Next, an average value is acquired (at Step 2102) from the smoothed sequence of the background period 1071. If the average value is below a predetermined threshold, the given background period 1071 is judged to be the true background period 1073, and the routine advances to Step 2104. If the average is over the threshold, the background period is judged to be false (at Step 2103), as at 1074, and the routine advances to Step 2105. At Step 2104, the background period 1071 being true, as at 1073, the routine is ended by storing the latest slit as the new background slit 1041 in the background slit hold means 702. At Step 2105, the background period 1071 being false, as at 1074, the routine is ended by judging whether it is due to the illuminance change or the structure change, by the background structure change judgment means 750. - FIG. 12 shows the flow chart of the background structure change judgment means 750. First of all, in order to judge whether the judgment of the
false background period 1074 is caused by the illuminance change, the normalized distance between the background slit 1041 and the latest current slit 1042 in the smoothed sequence is determined (at Step 2201). If this normalized distance is below a predetermined threshold, the judgment is caused by the illuminance change, and the routine advances to Step 2203. If over the threshold, the judgment is caused (at Step 2202) by the structure change, and the routine advances to Step 2204. - In the case of
Step 2203, the background period 1071 is the false background period 1074 due to the illuminance change. The routine is ended by setting the values in the smoothed sequence to zero and by storing the current slit 1042 as the new background slit 1041 in the background slit hold means 702. In the case of Step 2204, the background period 1071 is the false background period 1074 due to the structure change. In this case, the routine is ended by storing the latest current slit 1042 as the new background slit 1041 in the background slit hold means 702 and by setting all values in the smoothed sequence to zero. - Next, the method of extracting the moving object exclusively and the method of calculating the moving direction and velocity will be described in the following. FIG. 13 summarizes the method of extracting the moving object exclusively by separating/judging the background change region and the moving object region from the spatial-temporal image. FIGS. 14 and 15 summarize the method of calculating the moving direction and velocity. FIGS. 16 to 19 explain the motion analysis means for realizing those methods.
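Read as pseudocode, the two flow charts of FIGS. 11 and 12 reduce to two threshold tests. A hedged sketch of that decision flow (the function names, the inline normalized distance and the threshold values are assumptions, not the patent's implementation):

```python
import math

def _normalized_distance(a, b):
    # Distance between the unit-sphere projections of two slit vectors.
    na = math.sqrt(sum(x * x for x in a)) or 1.0
    nb = math.sqrt(sum(x * x for x in b)) or 1.0
    return math.sqrt(sum((x / na - y / nb) ** 2 for x, y in zip(a, b)))

def judge_background(smoothed_seq, background_slit, current_slit,
                     true_threshold=5.0, illum_threshold=0.1):
    """FIG. 11: the period is a true background if the average of the
    smoothed sequence is below a threshold.  FIG. 12: a false background
    period is attributed to an illuminance change when the normalized
    distance between the background slit and the latest current slit is
    small, and to a structure change otherwise."""
    average = sum(smoothed_seq) / len(smoothed_seq)
    if average <= true_threshold:
        return "true_background"       # Step 2104: store latest slit
    if _normalized_distance(background_slit, current_slit) <= illum_threshold:
        return "illuminance_change"    # Step 2203: reset sequence, store slit
    return "structure_change"          # Step 2204: store slit, reset sequence

print(judge_background([0.5, 1.0, 0.8], [50, 80, 120], [25, 40, 60]))
# -> true_background
```

In either false-background branch the current slit replaces the stored background slit and the smoothed sequence is cleared, exactly as the Step 2203/2204 descriptions state.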
- FIG. 13 summarizes the method of extracting the moving object exclusively by separating/judging the background change region and the moving object region from the spatial-
temporal image 1050. - First of all, a spatial-
temporal image 1053 in the interval where the moving object is thought to exist is cut out of the spatial-temporal image 1050 on the basis of the moving object interval 1001, and the original background slit 1041 and a future background slit 1041′ are acquired. Next, an original background differential image 1054 is created from the original background slit 1041 and the spatial-temporal image 1053, and a future background differential image 1055 is created from the future background slit 1041′ and the spatial-temporal image 1053. This background differential image contains not only a moving object region 1102 but also the differential region 1202 between the background slit 1041 and the background structure change. - At last, the logical product of the original
background differential image 1054 and the future background differential image 1055 is determined to extract the moving object region 1102. As a result, the differential region 1202 from the background structure change is canceled so that only the moving object region 1102, or the common region, can be extracted. - FIG. 14 summarizes the slit setting method for analyzing the motions of the moving object in the movie and the method of calculating the moving direction/
velocity 1003 of the moving object 1101 which is extracted from the spatial-temporal image 1050 obtained by the slit setting method.
slit 1030 is set at a right angle or in a non-parallel oblique direction with respect to the moving direction of the moving object, the moving object 1101 is inclined forward or backward in the spatial-temporal image 1050 thus obtained, as shown in FIG. 14. This is because the upper or lower side of the moving object 1101 reaches the slit 1030 faster than the opposite side with respect to the slit 1030. As a result, even with a single slit, it can be determined from the positive or negative sign of the inclination whether the moving object has moved from the left or from the right. From the magnitude of the inclination 1210, moreover, the average velocity at which the object crosses the slit 1030 can be calculated. - By utilizing this, the moving direction/
velocity 1003 of the moving object 1101 can be calculated according to the invention. In the motion analysis means 800, the moving object region 1201 is extracted from the spatial-temporal image 1050 which is obtained from the slit 1030 set at the inclination, and this inclination 1210 is calculated from the moment of the region to estimate the moving direction and velocity. - FIG. 15 explains the principle for calculating the moving direction/
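The patent states that the inclination 1210 is calculated "from the moment of the region" without giving the formula; a standard moment-based orientation estimate (an assumption on our part, not necessarily the patent's exact computation) would look like:

```python
import math

def region_inclination(points):
    """Orientation of a binary region from its second central moments:
    theta = 0.5 * atan2(2*mu11, mu20 - mu02).  `points` lists the (t, y)
    pixels of the moving-object region in the spatial-temporal image
    (t = frame index, y = position along the slit).  The sign of the
    returned angle tells whether the object crossed from the left or the
    right; its magnitude relates to the crossing velocity."""
    n = len(points)
    mt = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    mu20 = sum((p[0] - mt) ** 2 for p in points) / n
    mu02 = sum((p[1] - my) ** 2 for p in points) / n
    mu11 = sum((p[0] - mt) * (p[1] - my) for p in points) / n
    return 0.5 * math.atan2(2.0 * mu11, mu20 - mu02)

# A region slanted one pixel per frame leans at 45 degrees:
pts = [(i, i) for i in range(10)]
print(math.degrees(region_inclination(pts)))  # 45.0
```

Mirroring the region (one pixel per frame in the opposite direction) flips the sign of the angle, matching the left/right decision described above.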
velocity 1003 of the extracted moving object 1101, from the inclination of the slit 1030 and the inclination 1210 of the moving object 1101. - First of all, the inclination of the
slit 1030, as set on the movie 1010, from the horizontal direction is designated by α. It is assumed that the moving object 1100 having a horizontal velocity v passes the slit 1030. If the moving object 1100 has a height h and if the moving object 1100 moves by w after its upper portion passes the slit and before its lower portion passes the slit, the horizontal moving velocity v is expressed by a formula 1610. Next, the inclination of the image 1101 of the moving object in the spatial-temporal image 1050 is expressed by θ. If the number of frame images per second is f, the frame number s for the moving object to move by w in the frame image is described by a formula 1620. A formula 1630 is obtained if the formulas 1610 and 1620 are combined. - In the invention, on the basis described above, the moving direction/
velocity 1003 are calculated from the inclination of the slit 1030 and the inclination of the image 1101 of the moving object 1100 in the spatial-temporal image 1050. - FIG. 16 shows the data flow of the motion analysis means 800 for realizing the aforementioned method. This motion analysis means 800 includes spatial-temporal image creation means 801, background slit acquire means 802, background difference merging means 803 and merged background difference inclination judgment means 820.
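Combining the two stated relations, v = w / t (formula 1610) and s = f · w / v (formula 1620), gives v = f · w / s as a plausible reading of formula 1630. A sketch, with the additional geometric assumption (not spelled out in the surviving text) that the horizontal travel w between the upper and lower crossings relates to the object height h and the slit inclination α by w = h / tan(α):

```python
import math

def horizontal_velocity(h, alpha_deg, s, f):
    """v = f * w / s, the combination of formulas 1610 and 1620.
    h: object height, alpha_deg: slit inclination from the horizontal,
    s: number of frames between the upper and lower portions crossing
    the slit, f: frames per second.  w = h / tan(alpha) is an assumed
    reading of the FIG. 15 geometry, not a quoted formula."""
    w = h / math.tan(math.radians(alpha_deg))
    return f * w / s

# Example: a 2 m tall object, slit at 45 degrees, crossings 15 frames
# apart at 30 frames per second:
print(horizontal_velocity(2.0, 45.0, 15, 30))  # ~4.0 (m/s)
```

Note the sanity checks this passes: a steeper slit (larger α) gives a smaller w and hence a smaller velocity for the same frame gap s, and doubling the frame rate f doubles the estimated velocity.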
- These spatial-temporal image creation means 801, background slit acquire means 802, background difference merging means 803 and merged background difference inclination judgment means 820 realize the method of extracting the moving object 1101 exclusively by separating/judging the background change region 1202 and the moving object region 1102 from the spatial-temporal image 1050, as described with reference to FIG. 13. - The spatial-temporal image creation means 801 creates the spatial-
temporal image 1053 in the interval, for which the moving object 1101 exists, by acquiring the slit images from the digital image data 1000 and the moving object interval 1001 and by arranging them in the frame order. The spatial-temporal image 1053 is transmitted in response to the demands from the background difference creation means 810 and 810′. - The background slit acquire means 802 acquires the
original background slit 1041 and the future background slit 1041′ from before and behind the interval, for which the moving object 1101 exists, on the basis of the digital image data 1000 and the moving object interval 1001. - The background difference creation means 810 and 810′ create the
original background difference 1054 and the future background difference 1055 from the spatial-temporal image 1053, the original background slit 1041 and the future background slit 1041′. The detail of the background difference creation algorithm will be described with reference to FIG. 17. - The background difference merging means 803 creates a
merged background difference 1056 from the logical product of the created original background difference 1054 and future background difference 1055. Only the moving object 1101 is extracted by the procedure described above. - The merged background difference inclination judgment means 820 realizes the method of calculating the moving direction/
velocity 1003 of the extracted moving object from the inclination of the slit 1030 and the inclination 1210 of the moving object, as described with reference to FIGS. 14 and 15.
means 810 for creating the background difference by using the original background and the future background. The background difference creation means 810 includes moving object region separation means 811, moving object region morphology means 812, noise region elimination means 813 and occluded region supply means 814. - The moving object region separation means 811 makes only the moving
object region 1201 binary to separate/extract it from either the background slit image 1041 and the current slit image 1042 or the background frame image and the current frame image. The detail of this separation/extraction algorithm will be described with reference to FIGS. 18 and 19. - Next, with the assumption that the moving object is one closed region having at least a certain size, the moving object region morphology means 812, the noise region elimination means 813 and the occluded region supply means 814 correct the rupture or segmentation of the moving object region, as caused by the moving object region separation means 811.
- The moving object region morphology means 812 connects the ruptured or segmented moving object region 1201 by the morphology operations. The number of these operations is about three. The noise region elimination means 813 eliminates the minute regions independent of the morphologically processed moving object region 1201 by deeming them as noises. The occluded region supply means 814 searches for and fills the holes contained in the moving object region 1201.
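A hedged sketch of this cleanup stage on a binary mask, using a 3x3 structuring element; repeating the closing about three times echoes "the number of these morphologies is about three" (the helper names are assumptions, and a full implementation would also add the small-region removal and hole filling the text describes):

```python
def _dilate(img):
    h, w = len(img), len(img[0])
    return [[1 if any(img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                      if 0 <= y + dy < h and 0 <= x + dx < w) else 0
             for x in range(w)] for y in range(h)]

def _erode(img):
    h, w = len(img), len(img[0])
    return [[1 if all(img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                      if 0 <= y + dy < h and 0 <= x + dx < w) else 0
             for x in range(w)] for y in range(h)]

def connect_region(mask, n=3):
    """Morphological closing (n dilations followed by n erosions) to
    connect a ruptured or segmented moving-object region."""
    for _ in range(n):
        mask = _dilate(mask)
    for _ in range(n):
        mask = _erode(mask)
    return mask

# A one-pixel rupture in the region is bridged by the closing:
print(connect_region([[1, 1, 0, 1, 1]]))  # [[1, 1, 1, 1, 1]]
```

In practice a library routine (e.g. a morphology function from an image-processing package) would replace these nested loops; the pure-Python version is only meant to make the dilate/erode sequence explicit.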
object 1200 is cut out as the moving object region 1201 to create the background differential image. When the change in the background structure is to be cut out, this cutout can be made by the same processings using the changed background image in place of the current image. - FIG. 18 summarizes the moving object region separation means 811. First of all, there are acquired the
background slit 1041 and the current slit 1042 from which the moving object region is to be cut out. Next, the background slit 1041 and the current slit 1042 are compared for each pixel to judge whether they belong to the background or the moving object. This comparison does not resort to the brightness of the corresponding pixel alone; instead, a local slit composed of w pixels including the corresponding one is created, and the judgment is made by determining the normalized distance between the local slits.
target pixel Pn 1045 belongs to the background or the moving object, for example, a local slit τ1 1044 containing the target pixel Pn 1045 and a corresponding background local slit β1 1043 are created to determine the normalized distance between the two. Because this normalized distance is used, the background can be correctly judged and eliminated even if an illuminance change of its portion is caused by the shadow of the moving object or by the light. - However, when the individual pixel values of the background
local slit 1043 have a small dispersion, the ordinary vector distance is used. This is because the normalized distance takes a zero value and the pixel is misjudged as the background even when a white object moves over a dark background. - FIG. 19 shows the flow chart of the moving object region separation means 811. First of all, the current slit 1042 to be judged is acquired from the movie 1010 (at Step 2301). Next, it is checked whether or not the judgment has been executed for all slit pixels. The routine advances to Step 2303 if there is any non-judged pixel, and otherwise the routine is ended (at Step 2302). The two local slits are created (at Steps 2303 and 2304) from the background slit 1041 and the current slit 1042. The dispersion of the background local slit 1043 is determined and is compared with the predetermined threshold TV. The routine advances to Step 2306 if below the threshold TV, and otherwise to Step 2307 (at Step 2305). The distance between the two local slits is then determined, as the ordinary vector distance (at Step 2306) or as the normalized distance (at Step 2307). The routine returns to Step 2302 and is repeated. - FIGS. 20 and 21 explain the method of extracting the moving object and the background by separating/judging the background change region and the moving object region from the frame image.
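The per-pixel decision of FIGS. 18 and 19 can be sketched as follows (the window width w, the dispersion threshold TV and the two distance thresholds are assumed values, not figures from the patent):

```python
import math

def separate_moving_object(background_slit, current_slit,
                           w=5, tv=25.0, t_norm=0.1, t_ord=30.0):
    """For each pixel, build a local slit of w pixels around it in both
    the background slit and the current slit.  If the background local
    slit has a small dispersion (flat background), compare with the
    ordinary vector distance; otherwise compare with the normalized
    distance, which is insensitive to shadows and illuminance changes.
    Returns a 0/1 mask marking moving-object pixels."""
    n, half, mask = len(current_slit), w // 2, []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        beta = background_slit[lo:hi]   # background local slit (beta-1 1043)
        tau = current_slit[lo:hi]       # current local slit (tau-1 1044)
        mean = sum(beta) / len(beta)
        dispersion = sum((x - mean) ** 2 for x in beta) / len(beta)
        if dispersion < tv:
            d = math.dist(beta, tau)    # ordinary vector distance
            mask.append(1 if d > t_ord else 0)
        else:
            nb = math.sqrt(sum(x * x for x in beta)) or 1.0
            nt = math.sqrt(sum(x * x for x in tau)) or 1.0
            d = math.dist([x / nb for x in beta], [x / nt for x in tau])
            mask.append(1 if d > t_norm else 0)
    return mask

bg = [10.0] * 20
cur = [10.0] * 20
cur[8:12] = [200.0] * 4  # a bright object over a flat dark background
mask = separate_moving_object(bg, cur)
print(mask[10], mask[0])  # 1 0
```

The flat-background branch is exactly the white-object-over-dark-background case the text warns about: with a near-constant background the normalized distance collapses toward zero, so the ordinary distance must take over.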
- FIG. 20 summarizes the method of extracting the moving object exclusively by separating/judging the background change region and the moving object region from the frame image.
- First of all, an
original background image 1801, a future background image 1803 and a current frame image 1802 are acquired. Here, it is assumed that a moving object 1804 and a falling object 1805 are projected on the current frame image 1802 and that another falling object 1806 is projected, in addition to the falling object 1805, on the future background image 1803. - Next, there are created an
original background difference 1807 between the original background image 1801 and the current frame image 1802 and a future background difference 1808 between the future background image 1803 and the current frame image 1802. A moving object region 1809 and a background change region 1810 by the falling object 1805 appear in the original background difference 1807. In the future background difference 1808, on the other hand, there appear the moving object region 1809 and a background change region 1811 by the falling object 1806. - If a
merged difference 1812 of the original background difference 1807 and the future background difference 1808 is determined by the logical product, the background change regions 1810 and 1811 are canceled so that there remains only the moving object region 1809. - At last, when a cutout is made from the
frame image 1802 by using the merged difference 1812 as the mask image, it is possible to obtain a moving object image 1813 containing only the moving object 1804.
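Per FIG. 20, the merged difference is the pixel-wise logical product of the two background differences, and the moving-object image is the frame image cut out under that mask. A minimal sketch (the function names and the tiny 1x5 "images" are illustrative assumptions):

```python
def logical_product(orig_diff, future_diff):
    """Merged difference 1812: pixel-wise AND of the original and future
    background differences.  A background-change region appears in only
    one of the two differences and is therefore canceled."""
    return [[a & b for a, b in zip(ra, rb)]
            for ra, rb in zip(orig_diff, future_diff)]

def cutout(frame, mask):
    """Keep frame pixels where the mask is 1; zero elsewhere."""
    return [[p if m else 0 for p, m in zip(rf, rm)]
            for rf, rm in zip(frame, mask)]

orig_diff   = [[1, 1, 0, 1, 0]]  # moving object (cols 0-1) + falling object 1805 (col 3)
future_diff = [[1, 1, 0, 0, 1]]  # moving object (cols 0-1) + falling object 1806 (col 4)
frame       = [[9, 9, 2, 7, 3]]

merged = logical_product(orig_diff, future_diff)
print(merged)                 # [[1, 1, 0, 0, 0]] -- only the moving object survives
print(cutout(frame, merged))  # [[9, 9, 0, 0, 0]]
```

Each falling object appears in exactly one of the two difference masks, so the AND removes both while the moving object, present in both, survives.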
- First of all, the
original background image 1801 and the future background image 1803 are acquired. Here, it is assumed that the falling object 1805 and the falling object 1806 are reflected on the future background image 1803. Next, a background difference 1901 between the original background image 1801 and the future background image 1803 is created. In this background difference 1901, there appear the background change region 1810 by the falling object 1805 and the background change region 1811 by the falling object 1806. At last, when a cutout is made from the future background image 1803 by using the background difference 1901 as the mask image, it is possible to obtain a background structure change image 1902 containing only the falling objects 1805 and 1806.
- The frame image acquire means901 acquires the
frame image 1802 of the interval, for which the moving object 1100 seems to exist, from the moving object interval 1001 and the digital image data 1000, and transmits it to the background difference creation means 910 and 910′. The background image creation means 902 acquires the frame images of the intervals which are judged as the background, from the moving object interval 1001 and the digital image data 1000, and transmits them as the original background image 1801 and the future background image 1803 to the background difference creation means 910 and 910′. These background difference creation means 910 and 910′ repeat the processings of the background difference creation means 810 and 810′, as described with reference to FIG. 17, to create the original background difference 1807 and the future background difference 1808 of the frame image. The background difference merging means 903 creates the merged background difference 1812 from the logical product of the original background difference 1807 and the future background difference 1808 of the frame image and transmits it to the moving object cutout means 904. - This moving object cutout means 904 cuts the
merged background difference 1812 as the mask image out of the frame image 1802 to extract the moving object image 1813. - FIG. 23 shows an example of the result display screen which is outputted onto the
display 300 by the result output means 600. The result display screen 2000 is constructed to include at least the four components of an input image movie display region 2010, a background change representative screen display region 2020, a moving object representative screen display region 2030 and a correlated value sequence display region 2040. - In this display result example, the
slit 1030 is placed upright in the middle of the movie 1010, and representative screens of the background change and a representative screen 2032 of the moving object having passed through the slit 1030 are displayed in the background change representative screen display region 2020 and the moving object representative screen display region 2030, respectively. Moreover, the sequence of the correlated values of the slit (or the distance sequence 1070) is displayed in the correlated value sequence display region 2040 to indicate to the user the grounds for judging the presence of the moving object and the background change. - The input
movie display region 2010 is a portion for displaying the present movie 1010 which is inputted from the TV camera 200. - The background change representative
screen display region 2020 is a portion for displaying the background representative screen 2022 before change and the background representative screen 2023 after change by detecting the structure change of the background in the movie 1010. The detected change in the background structure is displayed as a pair of upper and lower parts, the background representative screen 2022 before change and the background representative screen 2023 after change, in a background change display window 2021 so that their difference may be judged by the user. This screen example displays that the background change is exemplified by the parking of an automobile or an object fallen from a truck. The background change display window 2021 is provided with a scroll bar so that the changes in the background structure thus far detected may be observed. At this time, a marker 2024 is attached so that the group of the latest representative screens may be quickly understood. - The moving object representative
screen display region 2030 is a portion for displaying the representative screen 2032 projecting the moving object by detecting this object in the movie 1010. The detected moving object is displayed in a moving object representative screen display window 2031 so that it may be judged by the user. This screen example displays that the moving object is exemplified by a motorbike, a black automobile, a white automobile, a gray automobile or a truck. The moving object representative screen display window 2031 is provided with a scroll bar so that the representative screens of the moving objects thus far detected may be observed. At this time, moreover, a marker 2033 is attached so that the latest moving object may be discriminated. - The correlated value
sequence display region 2040 is a portion for displaying both the spatial-temporal image 1050 obtained from the slit 1030 and the sequence 1070 of the correlated values (or distances) at the corresponding time. The pixels and graph values of the spatial-temporal image 1050 and the distance sequence 1070 at the latest time are always displayed at the right-hand side of a correlated value sequence display window 2041. At the same time, there are displayed a moving object detection marker 2042 indicating the position on the spatial-temporal image at the time of detecting the moving object and a background change detection marker 2043 indicating the position on the spatial-temporal image at the time of detecting the background change, so that the grounds for the detections may be understood at a glance by the user. - In the present embodiment, the window area of interest on the
movie 1010 is exemplified by the slit 1030. For the processings in the background judgment means 700 and the moving object extraction means 900, however, essentially the same operations apply to any assembly of a plurality of adjacent pixels, even if the shape is different from that of the slit 1030. - Another embodiment conceivable for the invention uses a window area of interest having a square, circular or concentric shape. For example, a movie of ten and several hours, as obtained from a TV camera attached to the entrance of a house or office, is judged with the correlated value sequence of the entire frame image so that a list of the representative images of visitors or delivered parcels may be extracted.
- According to the invention, the background and the moving object can be judged so that the moving object can be exclusively detected, even under a complicated background having a change in the illuminating condition or a structure change. No restriction is imposed upon the shape, color, moving direction or velocity of the moving object to be extracted. Moreover, the moving direction and velocity can be calculated.
- If the background changes, it can be judged whether the change is a structure change or an illumination condition change.
- In addition, only several percent of the pixels in the movie are processed, so that the processing speed is ten times or more that of the moving object extraction apparatus of the prior art. As a result, the amount of memory used can also be reduced to several percent. Thus, real-time processing can be achieved even by an inexpensive computer such as a personal computer.
- The present invention has been described with reference to the preferred and alternate embodiments. Obviously, modifications and alterations will occur to those of ordinary skill in the art upon reading and understanding the invention. It is intended that the invention be construed as including all such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.
Claims (20)
1. A method for monitoring a moving object, comprising the steps of:
setting a window area of interest for an inputted time-varying image;
calculating a correlation between a first data of the window area in a frame (A) and a second data of the window area in a frame (B); and
deciding a first interval in which a moving object is present in the window area, based on a pattern of a calculated correlation value over a predetermined time period.
2. A method for monitoring a moving object according to claim 1 , further comprising the step of:
displaying a representative screen of time-varying images in the first interval.
3. A method for monitoring a moving object according to claim 1 , further comprising the steps of:
storing representative screens of time-varying images in a plurality of the first intervals; and
displaying the representative screens.
4. A method for monitoring a moving object according to claim 1 , further comprising the steps of:
deciding a second interval in which a moving object is not present in the window area, based on the pattern of the calculated correlation value over the predetermined time period; and
displaying a representative screen of time-varying images in the second interval as a background screen.
5. A method of monitoring a moving object according to claim 1 , wherein the data of the window area are represented as a feature and the calculated correlation value is assumed by the distance between the features.
6. A method for monitoring a moving object according to claim 1 , wherein the window area has an arbitrary shape.
7. A method for monitoring a moving object, comprising the steps of:
setting a window area of interest for an inputted time-varying image;
calculating a correlation between a reference data of the window area in a reference frame and a current data of the window area in a current frame;
deciding whether an image of the window area of the current frame includes an image of a moving object or not, based on a pattern of a calculated correlation value over a predetermined time period; and
updating the reference data as the current data when the image of the window area of the current frame does not include the image of the moving object.
8. A method of monitoring a moving object according to claim 7 , wherein the window area has an arbitrary shape.
9. A method of monitoring a moving object according to claim 8 , wherein the arbitrary shape is one of a straight line, a segment shape, a square shape, a circular shape and a concentric shape.
10. A method for monitoring a moving object, comprising the steps of:
setting a window area of interest for an inputted time-varying image;
calculating a correlation between a reference data of the window area in a reference frame and a current data of the window area in a current frame;
deciding whether an image of the window area in the current frame is a background image, which is a reference image for detecting a moving object, based on a plurality of calculated correlation values; and
updating the reference data as the current data when the image of the window area in the current frame is a background image.
11. A method of monitoring a moving object according to claim 10 , further comprising the steps of:
deciding whether the background image is changed or not; and
judging whether the background change is caused by an illuminance change or a structure change.
12. A method for monitoring a moving object, comprising the steps of:
setting a window area of interest for an inputted time-varying image;
calculating a correlation between a first data of the window area in a frame (A) and a second data of the window area in a frame (B); and
deciding an interval in which an image of the window area changes temporarily, based on a plurality of calculated correlation values.
13. A method of monitoring a moving object according to claim 12 , further comprising the step of:
displaying a representative screen of time-varying images in the interval.
14. A monitoring system, comprising:
a TV camera for taking in time-varying images;
a computer for monitoring the time-varying images; and
a display for displaying the result of the monitoring;
wherein said computer calculates a correlation between a first data of a window area in a frame (A) and a second data of the window area in a frame (B) and decides a first interval in which a moving object is present in the window area, based on a pattern of a calculated correlation value over a predetermined time period; and
wherein said display displays a representative screen of the time-varying images in the first interval.
15. A monitoring system according to claim 14 , wherein the data of the window area are represented as a feature vector and the calculated correlation value is assumed by a distance between the feature vectors.
16. A monitoring system, comprising:
a TV camera for taking in time-varying images;
a computer for monitoring the time-varying images; and
a display for displaying the result of monitoring;
wherein said computer calculates a correlation between a reference data of a window area in a reference frame and a current data of the window area in a current frame and decides whether an image of the window area of the current frame includes an image of a moving object or not, based on a pattern of a calculated correlation value over a predetermined time period.
17. A monitoring system according to claim 16 , wherein the reference data is updated by the current data when an image of the window area of the current frame does not include the image of the moving object.
18. A monitoring system, comprising:
a TV camera for taking in time-varying images;
a computer for monitoring the time-varying images; and
a display for displaying the result of monitoring;
wherein said computer calculates a correlation between a first data of a window area in a frame (A) and a second data of the window area in a frame (B) and decides an interval in which an image of the window area changes temporarily, based on a plurality of calculated correlation values.
19. A monitoring system, comprising:
means for taking in time-varying images;
means for setting a window area of interest for the time-varying images;
means for calculating a correlation between a first data of the window area in a frame (A) and a second data of the window area in a frame (B); and
means for deciding a first interval in which a moving object is present in the window area, based on a pattern of a calculated correlation value over a predetermined time period.
20. A monitoring system, comprising:
means for taking in time-varying images;
means for setting a window area of interest for the time-varying images;
means for calculating a correlation between a first data of the window area in a frame (A) and a second data of the window area in a frame (B); and
means for deciding an interval in which an image of the window area changes temporarily, based on a plurality of calculated correlation values.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/946,528 US20020030739A1 (en) | 1995-02-17 | 2001-09-06 | Moving object detection apparatus |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP02909995A JP3569992B2 (en) | 1995-02-17 | 1995-02-17 | Mobile object detection / extraction device, mobile object detection / extraction method, and mobile object monitoring system |
JP7-029099 | 1995-02-17 | ||
US08/601,951 US5721692A (en) | 1995-02-17 | 1996-02-15 | Moving object detection apparatus |
US09/023,467 US5862508A (en) | 1995-02-17 | 1998-02-13 | Moving object detection apparatus |
US18222098A | 1998-10-30 | 1998-10-30 | |
US09/946,528 US20020030739A1 (en) | 1995-02-17 | 2001-09-06 | Moving object detection apparatus |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18222098A Continuation | 1995-02-17 | 1998-10-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
US20020030739A1 true US20020030739A1 (en) | 2002-03-14 |
Family
ID=12266908
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/601,951 Expired - Lifetime US5721692A (en) | 1995-02-17 | 1996-02-15 | Moving object detection apparatus |
US09/023,467 Expired - Lifetime US5862508A (en) | 1995-02-17 | 1998-02-13 | Moving object detection apparatus |
US09/946,528 Abandoned US20020030739A1 (en) | 1995-02-17 | 2001-09-06 | Moving object detection apparatus |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/601,951 Expired - Lifetime US5721692A (en) | 1995-02-17 | 1996-02-15 | Moving object detection apparatus |
US09/023,467 Expired - Lifetime US5862508A (en) | 1995-02-17 | 1998-02-13 | Moving object detection apparatus |
Country Status (2)
Country | Link |
---|---|
US (3) | US5721692A (en) |
JP (1) | JP3569992B2 (en) |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1460598A1 (en) * | 2003-03-17 | 2004-09-22 | Adam Mazurek | Process and apparatus for analyzing and identifying moving objects |
US20040252861A1 (en) * | 2003-02-14 | 2004-12-16 | Sony Corporation | Image processing apparatus and method, program, and recording medium |
US20050078853A1 (en) * | 2003-10-10 | 2005-04-14 | Buehler Christopher J. | System and method for searching for changes in surveillance video |
US20060008118A1 (en) * | 2004-07-02 | 2006-01-12 | Mitsubishi Denki Kabushiki Kaisha | Image processing apparatus and image monitoring system |
US20060204044A1 (en) * | 2005-03-01 | 2006-09-14 | Fuji Photo Film Co., Ltd. | Image output apparatus, image output method, image output program, image trimming apparatus, image trimming method, and image trimming program |
US20080024612A1 (en) * | 2003-09-03 | 2008-01-31 | Canon Kabushiki Kaisha | Display apparatus, image processing apparatus, and image processing system |
US20080112642A1 (en) * | 2006-11-14 | 2008-05-15 | Microsoft Corporation | Video Completion By Motion Field Transfer |
US20080303911A1 (en) * | 2003-12-11 | 2008-12-11 | Motion Reality, Inc. | Method for Capturing, Measuring and Analyzing Motion |
US20080310677A1 (en) * | 2007-06-18 | 2008-12-18 | Weismuller Thomas P | Object detection system and method incorporating background clutter removal |
US7477417B1 (en) * | 1999-09-07 | 2009-01-13 | Dai Nippon Printing Co., Ltd. | Image processing system |
US20100157049A1 (en) * | 2005-04-03 | 2010-06-24 | Igal Dvir | Apparatus And Methods For The Semi-Automatic Tracking And Examining Of An Object Or An Event In A Monitored Site |
US20100296743A1 (en) * | 2009-05-21 | 2010-11-25 | Nobuhiro Tsunashima | Image processing apparatus, image processing method and program |
US20110052002A1 (en) * | 2009-09-01 | 2011-03-03 | Wesley Kenneth Cobb | Foreground object tracking |
US8264544B1 (en) * | 2006-11-03 | 2012-09-11 | Keystream Corporation | Automated content insertion into video scene |
FR3007878A1 (en) * | 2013-06-27 | 2015-01-02 | Rizze | Device for a road surveillance video system to record the context of an event according to the presence of a vehicle in the field of vision of the camera |
US20170011261A1 (en) * | 2015-07-09 | 2017-01-12 | Analog Devices Global | Video processing for human occupancy detection |
US10410371B2 (en) | 2017-12-21 | 2019-09-10 | The Boeing Company | Cluttered background removal from imagery for object detection |
Families Citing this family (76)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8948442B2 (en) * | 1982-06-18 | 2015-02-03 | Intelligent Technologies International, Inc. | Optical monitoring of vehicle interiors |
JP3569992B2 (en) * | 1995-02-17 | 2004-09-29 | 株式会社日立製作所 | Mobile object detection / extraction device, mobile object detection / extraction method, and mobile object monitoring system |
US5969755A (en) * | 1996-02-05 | 1999-10-19 | Texas Instruments Incorporated | Motion based event detection system and method |
WO2004089486A1 (en) | 1996-06-21 | 2004-10-21 | Kyota Tanaka | Three-dimensional game device and information recording medium |
US6356272B1 (en) * | 1996-08-29 | 2002-03-12 | Sanyo Electric Co., Ltd. | Texture information giving method, object extracting method, three-dimensional model generating method and apparatus for the same |
EP0831422B1 (en) * | 1996-09-20 | 2007-11-14 | Hitachi, Ltd. | Method of displaying moving object for enabling identification of its moving route, display system using the same, and program recording medium therefor |
IL131056A (en) * | 1997-01-30 | 2003-07-06 | Yissum Res Dev Co | Generalized panoramic mosaic |
US5936639A (en) * | 1997-02-27 | 1999-08-10 | Mitsubishi Electric Information Technology Center America, Inc. | System for determining motion control of particles |
US6130707A (en) * | 1997-04-14 | 2000-10-10 | Philips Electronics N.A. Corp. | Video motion detector with global insensitivity |
US5903271A (en) * | 1997-05-23 | 1999-05-11 | International Business Machines Corporation | Facilitating viewer interaction with three-dimensional objects and two-dimensional images in virtual three-dimensional workspace by drag and drop technique |
JP3444160B2 (en) * | 1997-10-09 | 2003-09-08 | 松下電器産業株式会社 | Moving object detection method |
KR100246626B1 (en) * | 1997-10-16 | 2000-03-15 | 정선종 | Joint marker extraction method using space-time information for morphological image segmentation |
JP3567066B2 (en) * | 1997-10-31 | 2004-09-15 | 株式会社日立製作所 | Moving object combination detecting apparatus and method |
US6177944B1 (en) * | 1998-09-18 | 2001-01-23 | International Business Machines Corporation | Two phase rendering for computer graphics |
US6278460B1 (en) | 1998-12-15 | 2001-08-21 | Point Cloud, Inc. | Creating a three-dimensional model from two-dimensional images |
JP3721867B2 (en) * | 1999-07-07 | 2005-11-30 | 日本電気株式会社 | Video display device and display method |
US6424370B1 (en) * | 1999-10-08 | 2002-07-23 | Texas Instruments Incorporated | Motion based event detection system and method |
JP3601392B2 (en) * | 1999-12-27 | 2004-12-15 | 住友電気工業株式会社 | Image processing apparatus, image processing method, and vehicle monitoring system |
JP3828349B2 (en) | 2000-09-27 | 2006-10-04 | 株式会社日立製作所 | Mobile body detection measurement method, device thereof, and recording medium containing mobile body detection measurement program |
US7215795B2 (en) * | 2000-09-28 | 2007-05-08 | Hitachi Kokusai Electric Inc. | Intruding object detecting method and intruding object monitoring apparatus employing the method |
DE10050083A1 (en) * | 2000-10-10 | 2002-04-18 | Sick Ag | Device and method for detecting objects |
US8711217B2 (en) | 2000-10-24 | 2014-04-29 | Objectvideo, Inc. | Video surveillance system employing video primitives |
US20050162515A1 (en) * | 2000-10-24 | 2005-07-28 | Objectvideo, Inc. | Video surveillance system |
US9892606B2 (en) | 2001-11-15 | 2018-02-13 | Avigilon Fortress Corporation | Video surveillance system employing video primitives |
US8564661B2 (en) | 2000-10-24 | 2013-10-22 | Objectvideo, Inc. | Video analytic rule detection system and method |
US7200246B2 (en) * | 2000-11-17 | 2007-04-03 | Honeywell International Inc. | Object detection |
US6711279B1 (en) * | 2000-11-17 | 2004-03-23 | Honeywell International Inc. | Object detection |
US6466158B2 (en) | 2000-12-08 | 2002-10-15 | Lockheed Martin Corp. | Identifying closely clustered moving targets |
KR100450793B1 (en) * | 2001-01-20 | 2004-10-01 | 삼성전자주식회사 | Apparatus for object extraction based on the feature matching of region in the segmented images and method therefor |
KR100355382B1 (en) * | 2001-01-20 | 2002-10-12 | 삼성전자 주식회사 | Apparatus and method for generating object label images in video sequence |
US7522257B2 (en) * | 2001-01-23 | 2009-04-21 | Kenneth Jacobs | System and method for a 3-D phenomenoscope |
US9781408B1 (en) | 2001-01-23 | 2017-10-03 | Visual Effect Innovations, Llc | Faster state transitioning for continuous adjustable 3Deeps filter spectacles using multi-layered variable tint materials |
US7604348B2 (en) * | 2001-01-23 | 2009-10-20 | Kenneth Martin Jacobs | Continuous adjustable 3deeps filter spectacles for optimized 3deeps stereoscopic viewing and its control method and means |
US10742965B2 (en) | 2001-01-23 | 2020-08-11 | Visual Effect Innovations, Llc | Faster state transitioning for continuous adjustable 3Deeps filter spectacles using multi-layered variable tint materials |
US7850304B2 (en) * | 2001-01-23 | 2010-12-14 | Kenneth Martin Jacobs | Continuous adjustable 3Deeps filter spectacles for optimized 3Deeps stereoscopic viewing and its control method and means |
US8750382B2 (en) | 2001-01-23 | 2014-06-10 | Kenneth Martin Jacobs | System and method for calculating 3Deeps action specs motion estimation from the motion vectors in an MPEG file |
US7405801B2 (en) * | 2001-01-23 | 2008-07-29 | Kenneth Jacobs | System and method for Pulfrich Filter Spectacles |
US7508485B2 (en) * | 2001-01-23 | 2009-03-24 | Kenneth Martin Jacobs | System and method for controlling 3D viewing spectacles |
US6891570B2 (en) * | 2001-01-31 | 2005-05-10 | Itt Manufacturing Enterprises Inc. | Method and adaptively deriving exposure time and frame rate from image motion |
US6778705B2 (en) * | 2001-02-27 | 2004-08-17 | Koninklijke Philips Electronics N.V. | Classification of objects through model ensembles |
US7424175B2 (en) | 2001-03-23 | 2008-09-09 | Objectvideo, Inc. | Video segmentation using statistical pixel modeling |
CA2451992C (en) | 2001-05-15 | 2013-08-27 | Psychogenics Inc. | Systems and methods for monitoring behavior informatics |
US7162101B2 (en) * | 2001-11-15 | 2007-01-09 | Canon Kabushiki Kaisha | Image processing apparatus and method |
US6697010B1 (en) * | 2002-04-23 | 2004-02-24 | Lockheed Martin Corporation | System and method for moving target detection |
MXPA02005732A (en) * | 2002-06-10 | 2004-12-13 | Valencia Reuther Herman | Smart time measuring card. |
EP1537550A2 (en) * | 2002-07-15 | 2005-06-08 | Magna B.S.P. Ltd. | Method and apparatus for implementing multipurpose monitoring system |
CN100334598C (en) * | 2002-11-26 | 2007-08-29 | 东芝照明技术株式会社 | Market plan support system |
US7664292B2 (en) * | 2003-12-03 | 2010-02-16 | Safehouse International, Inc. | Monitoring an output from a camera |
DE102004018410A1 (en) * | 2004-04-16 | 2005-11-03 | Robert Bosch Gmbh | Safety system and method for its operation |
US7653261B2 (en) * | 2004-11-12 | 2010-01-26 | Microsoft Corporation | Image tapestry |
US7529429B2 (en) | 2004-11-12 | 2009-05-05 | Carsten Rother | Auto collage |
US7532771B2 (en) * | 2004-11-12 | 2009-05-12 | Microsoft Corporation | Image processing system for digital collage |
JP2006165935A (en) * | 2004-12-07 | 2006-06-22 | Nec Corp | Device and method for converting control information |
US9077882B2 (en) | 2005-04-05 | 2015-07-07 | Honeywell International Inc. | Relevant image detection in a camera, recorder, or video streaming device |
JP4610005B2 (en) * | 2005-07-08 | 2011-01-12 | 財団法人電力中央研究所 | Intruding object detection apparatus, method and program by image processing |
CA2649389A1 (en) * | 2006-04-17 | 2007-11-08 | Objectvideo, Inc. | Video segmentation using statistical pixel modeling |
JP4866159B2 (en) * | 2006-06-27 | 2012-02-01 | 株式会社日立製作所 | Moving body detection device |
US20080122926A1 (en) * | 2006-08-14 | 2008-05-29 | Fuji Xerox Co., Ltd. | System and method for process segmentation using motion detection |
US8045783B2 (en) * | 2006-11-09 | 2011-10-25 | Drvision Technologies Llc | Method for moving cell detection from temporal image sequence model estimation |
JP4821642B2 (en) * | 2007-02-15 | 2011-11-24 | 株式会社ニコン | Image processing method, image processing apparatus, digital camera, and image processing program |
TWI355615B (en) * | 2007-05-11 | 2012-01-01 | Ind Tech Res Inst | Moving object detection apparatus and method by us |
JP4972491B2 (en) * | 2007-08-20 | 2012-07-11 | 株式会社構造計画研究所 | Customer movement judgment system |
JP4807354B2 (en) * | 2007-12-25 | 2011-11-02 | 住友電気工業株式会社 | Vehicle detection device, vehicle detection system, and vehicle detection method |
JP4513869B2 (en) | 2008-02-13 | 2010-07-28 | カシオ計算機株式会社 | Imaging apparatus, strobe image generation method, and program |
JP2009194595A (en) * | 2008-02-14 | 2009-08-27 | Sony Corp | Broadcast system, transmitter, transmission method, receiver, reception method, exhibition device, exhibition method, program, and recording medium |
JP2010011016A (en) * | 2008-06-26 | 2010-01-14 | Sony Corp | Tracking point detection apparatus, method, program, and recording medium |
JP2012053708A (en) | 2010-09-01 | 2012-03-15 | Toshiba Tec Corp | Store system, sales registration device and program |
JP6024229B2 (en) * | 2012-06-14 | 2016-11-09 | 富士通株式会社 | Monitoring device, monitoring method, and program |
WO2014155877A1 (en) * | 2013-03-26 | 2014-10-02 | ソニー株式会社 | Image processing device, image processing method, and program |
JP6157242B2 (en) * | 2013-06-28 | 2017-07-05 | キヤノン株式会社 | Image processing apparatus and image processing method |
FR3010220A1 (en) * | 2013-09-03 | 2015-03-06 | Rizze | System for censusing vehicles by the cloud |
US10664496B2 (en) * | 2014-06-18 | 2020-05-26 | Hitachi, Ltd. | Computer system |
US20170164267A1 (en) * | 2015-12-03 | 2017-06-08 | The Trustees Of Columbia University In The City Of New York | Apparatus to inhibit misuse of an electrically powered device |
WO2017120196A1 (en) | 2016-01-04 | 2017-07-13 | The Trustees Of Columbia University In The City Of New York | Apparatus to effect an optical barrier to pests |
KR102153607B1 (en) * | 2016-01-22 | 2020-09-08 | 삼성전자주식회사 | Apparatus and method for detecting foreground in image |
US11436839B2 (en) | 2018-11-02 | 2022-09-06 | Toyota Research Institute, Inc. | Systems and methods of detecting moving obstacles |
Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5099322A (en) * | 1990-02-27 | 1992-03-24 | Texas Instruments Incorporated | Scene change detection system and method |
US5103305A (en) * | 1989-09-27 | 1992-04-07 | Kabushiki Kaisha Toshiba | Moving object detecting system |
US5134472A (en) * | 1989-02-08 | 1992-07-28 | Kabushiki Kaisha Toshiba | Moving object detection apparatus and method |
US5243418A (en) * | 1990-11-27 | 1993-09-07 | Kabushiki Kaisha Toshiba | Display monitoring system for detecting and tracking an intruder in a monitor area |
US5331312A (en) * | 1991-08-23 | 1994-07-19 | Matsushita Electric Industrial Co., Ltd. | Obstacle-detecting apparatus |
US5416693A (en) * | 1991-08-28 | 1995-05-16 | Fuji Xerox Co., Ltd. | Moving picture search support device |
US5465115A (en) * | 1993-05-14 | 1995-11-07 | Rct Systems, Inc. | Video traffic monitor for retail establishments and the like |
US5500904A (en) * | 1992-04-22 | 1996-03-19 | Texas Instruments Incorporated | System and method for indicating a change between images |
US5566251A (en) * | 1991-09-18 | 1996-10-15 | David Sarnoff Research Center, Inc | Video merging employing pattern-key insertion |
US5684887A (en) * | 1993-07-02 | 1997-11-04 | Siemens Corporate Research, Inc. | Background recovery in monocular vision |
US5721692A (en) * | 1995-02-17 | 1998-02-24 | Hitachi, Ltd. | Moving object detection apparatus |
US5748775A (en) * | 1994-03-09 | 1998-05-05 | Nippon Telegraph And Telephone Corporation | Method and apparatus for moving object extraction based on background subtraction |
US5802361A (en) * | 1994-09-30 | 1998-09-01 | Apple Computer, Inc. | Method and system for searching graphic images and videos |
US5805733A (en) * | 1994-12-12 | 1998-09-08 | Apple Computer, Inc. | Method and system for detecting scenes and summarizing video sequences |
US5841883A (en) * | 1994-10-27 | 1998-11-24 | Yazaki Corporation | Method of diagnosing a plant automatically and device for executing method thereof |
US5974219A (en) * | 1995-10-11 | 1999-10-26 | Hitachi, Ltd. | Control method for detecting change points in motion picture images and for stopping reproduction thereof and control system for monitoring picture images utilizing the same |
US6005493A (en) * | 1996-09-20 | 1999-12-21 | Hitachi, Ltd. | Method of displaying moving object for enabling identification of its moving route display system using the same, and program recording medium therefor |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2754867B2 (en) * | 1990-05-23 | 1998-05-20 | 松下電器産業株式会社 | Image motion detection device |
JP3011748B2 (en) * | 1990-09-12 | 2000-02-21 | 日本電信電話株式会社 | Mobile counting device |
JPH0589242A (en) * | 1991-09-26 | 1993-04-09 | Nippon Telegr & Teleph Corp <Ntt> | Object area segmenting device for image |
- 1995-02-17: JP application JP02909995A, patent JP3569992B2, status: not active, Expired - Fee Related
- 1996-02-15: US application US08/601,951, patent US5721692A, status: not active, Expired - Lifetime
- 1998-02-13: US application US09/023,467, patent US5862508A, status: not active, Expired - Lifetime
- 2001-09-06: US application US09/946,528, publication US20020030739A1, status: not active, Abandoned
Patent Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5134472A (en) * | 1989-02-08 | 1992-07-28 | Kabushiki Kaisha Toshiba | Moving object detection apparatus and method |
US5103305A (en) * | 1989-09-27 | 1992-04-07 | Kabushiki Kaisha Toshiba | Moving object detecting system |
US5099322A (en) * | 1990-02-27 | 1992-03-24 | Texas Instruments Incorporated | Scene change detection system and method |
US5243418A (en) * | 1990-11-27 | 1993-09-07 | Kabushiki Kaisha Toshiba | Display monitoring system for detecting and tracking an intruder in a monitor area |
US5331312A (en) * | 1991-08-23 | 1994-07-19 | Matsushita Electric Industrial Co., Ltd. | Obstacle-detecting apparatus |
US5416693A (en) * | 1991-08-28 | 1995-05-16 | Fuji Xerox Co., Ltd. | Moving picture search support device |
US5566251A (en) * | 1991-09-18 | 1996-10-15 | David Sarnoff Research Center, Inc | Video merging employing pattern-key insertion |
US5500904A (en) * | 1992-04-22 | 1996-03-19 | Texas Instruments Incorporated | System and method for indicating a change between images |
US5465115A (en) * | 1993-05-14 | 1995-11-07 | Rct Systems, Inc. | Video traffic monitor for retail establishments and the like |
US5684887A (en) * | 1993-07-02 | 1997-11-04 | Siemens Corporate Research, Inc. | Background recovery in monocular vision |
US5748775A (en) * | 1994-03-09 | 1998-05-05 | Nippon Telegraph And Telephone Corporation | Method and apparatus for moving object extraction based on background subtraction |
US5802361A (en) * | 1994-09-30 | 1998-09-01 | Apple Computer, Inc. | Method and system for searching graphic images and videos |
US5841883A (en) * | 1994-10-27 | 1998-11-24 | Yazaki Corporation | Method of diagnosing a plant automatically and device for executing method thereof |
US5805733A (en) * | 1994-12-12 | 1998-09-08 | Apple Computer, Inc. | Method and system for detecting scenes and summarizing video sequences |
US5721692A (en) * | 1995-02-17 | 1998-02-24 | Hitachi, Ltd. | Moving object detection apparatus |
US5862508A (en) * | 1995-02-17 | 1999-01-19 | Hitachi, Ltd. | Moving object detection apparatus |
US5974219A (en) * | 1995-10-11 | 1999-10-26 | Hitachi, Ltd. | Control method for detecting change points in motion picture images and for stopping reproduction thereof and control system for monitoring picture images utilizing the same |
US6005493A (en) * | 1996-09-20 | 1999-12-21 | Hitachi, Ltd. | Method of displaying moving object for enabling identification of its moving route display system using the same, and program recording medium therefor |
Cited By (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7477417B1 (en) * | 1999-09-07 | 2009-01-13 | Dai Nippon Printing Co., Ltd. | Image processing system |
US20040252861A1 (en) * | 2003-02-14 | 2004-12-16 | Sony Corporation | Image processing apparatus and method, program, and recording medium |
US7409075B2 (en) * | 2003-02-14 | 2008-08-05 | Sony Corporation | Image processing apparatus and method, program, and recording medium |
EP1460598A1 (en) * | 2003-03-17 | 2004-09-22 | Adam Mazurek | Process and apparatus for analyzing and identifying moving objects |
US7777780B2 (en) | 2003-09-03 | 2010-08-17 | Canon Kabushiki Kaisha | Image motion display method and apparatus |
US9131122B2 (en) | 2003-09-03 | 2015-09-08 | Canon Kabushiki Kaisha | Apparatus, method, system, and storage medium causing a display to display a graph indicating a degree of change of part of a captured image |
US20080024612A1 (en) * | 2003-09-03 | 2008-01-31 | Canon Kabushiki Kaisha | Display apparatus, image processing apparatus, and image processing system |
US8654199B2 (en) | 2003-09-03 | 2014-02-18 | Canon Kabushiki Kaisha | Image motion detection apparatus and method for determining a parameter for detecting a moving object in a moving image and computer readable medium having encoded thereon a program instructing a computer to perform the method |
US20050078853A1 (en) * | 2003-10-10 | 2005-04-14 | Buehler Christopher J. | System and method for searching for changes in surveillance video |
US7280673B2 (en) * | 2003-10-10 | 2007-10-09 | Intellivid Corporation | System and method for searching for changes in surveillance video |
US20080303911A1 (en) * | 2003-12-11 | 2008-12-11 | Motion Reality, Inc. | Method for Capturing, Measuring and Analyzing Motion |
US20060008118A1 (en) * | 2004-07-02 | 2006-01-12 | Mitsubishi Denki Kabushiki Kaisha | Image processing apparatus and image monitoring system |
US8031226B2 (en) * | 2005-03-01 | 2011-10-04 | Fujifilm Corporation | Image output apparatus, image output method, image output program, image trimming apparatus, image trimming method, and image trimming program |
US20060204044A1 (en) * | 2005-03-01 | 2006-09-14 | Fuji Photo Film Co., Ltd. | Image output apparatus, image output method, image output program, image trimming apparatus, image trimming method, and image trimming program |
US10019877B2 (en) * | 2005-04-03 | 2018-07-10 | Qognify Ltd. | Apparatus and methods for the semi-automatic tracking and examining of an object or an event in a monitored site |
US20100157049A1 (en) * | 2005-04-03 | 2010-06-24 | Igal Dvir | Apparatus And Methods For The Semi-Automatic Tracking And Examining Of An Object Or An Event In A Monitored Site |
US8264544B1 (en) * | 2006-11-03 | 2012-09-11 | Keystream Corporation | Automated content insertion into video scene |
US20080112642A1 (en) * | 2006-11-14 | 2008-05-15 | Microsoft Corporation | Video Completion By Motion Field Transfer |
US8243805B2 (en) | 2006-11-14 | 2012-08-14 | Microsoft Corporation | Video completion by motion field transfer |
US20080310677A1 (en) * | 2007-06-18 | 2008-12-18 | Weismuller Thomas P | Object detection system and method incorporating background clutter removal |
US20100296743A1 (en) * | 2009-05-21 | 2010-11-25 | Nobuhiro Tsunashima | Image processing apparatus, image processing method and program |
US8433139B2 (en) * | 2009-05-21 | 2013-04-30 | Sony Corporation | Image processing apparatus, image processing method and program for segmentation based on a degree of dispersion of pixels with a same characteristic quality |
US8218818B2 (en) * | 2009-09-01 | 2012-07-10 | Behavioral Recognition Systems, Inc. | Foreground object tracking |
WO2011028379A3 (en) * | 2009-09-01 | 2011-05-05 | Behavioral Recognition Systems, Inc. | Foreground object tracking |
US20110052002A1 (en) * | 2009-09-01 | 2011-03-03 | Wesley Kenneth Cobb | Foreground object tracking |
FR3007878A1 (en) * | 2013-06-27 | 2015-01-02 | Rizze | Device for a road surveillance video system to record the context of an event according to the presence of a vehicle in the field of vision of the camera |
US20170011261A1 (en) * | 2015-07-09 | 2017-01-12 | Analog Devices Global | Video processing for human occupancy detection |
US10372977B2 (en) * | 2015-07-09 | 2019-08-06 | Analog Devices Global Unlimited Company | Video processing for human occupancy detection |
US10410371B2 (en) | 2017-12-21 | 2019-09-10 | The Boeing Company | Cluttered background removal from imagery for object detection |
Also Published As
Publication number | Publication date |
---|---|
US5721692A (en) | 1998-02-24 |
JP3569992B2 (en) | 2004-09-29 |
JPH08221577A (en) | 1996-08-30 |
US5862508A (en) | 1999-01-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5721692A (en) | Moving object detection apparatus | |
EP0749098B1 (en) | Method and apparatus for sensing object located within visual field of imaging device | |
JP6180482B2 (en) | Methods, systems, products, and computer programs for multi-queue object detection and analysis | |
EP0567059B1 (en) | Object recognition system using image processing | |
KR100459476B1 (en) | Apparatus and method for queue length of vehicle to measure | |
JP5325899B2 (en) | Intrusion alarm video processor | |
CA2132515C (en) | An object monitoring system | |
US8457360B2 (en) | Detection of vehicles in an image | |
EP0878965A2 (en) | Method for tracking entering object and apparatus for tracking and monitoring entering object | |
EP0986036A2 (en) | Method of updating reference background image, method of detecting entering objects and system for detecting entering objects using the methods | |
WO2001033503A1 (en) | Image processing techniques for a video based traffic monitoring system and methods therefor | |
JP6653361B2 (en) | Road marking image processing apparatus, road marking image processing method, and road marking image processing program | |
JPH07210795A (en) | Method and instrument for image type traffic flow measurement | |
JP2001067566A (en) | Fire detecting device | |
JPH11284997A (en) | Traveling object sensing device | |
JP3377659B2 (en) | Object detection device and object detection method | |
JP7125843B2 (en) | Fault detection system | |
JP4025007B2 (en) | Railroad crossing obstacle detection device | |
JP3294468B2 (en) | Object detection method in video monitoring device | |
JP2002190013A (en) | System and method for detecting congestion by image recognition | |
KR101930429B1 (en) | System for monitoring standardized accident standardization and method for analyzing accident situation using the same | |
Yu et al. | Vision based vehicle detection and traffic parameter extraction | |
JP2001175959A (en) | Method and device for detecting invasion object | |
JP2002300573A (en) | Video diagnostic system on-board of video monitor | |
Michalopoulos et al. | Machine-vision system for multispot vehicle detection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |