WO2016005011A1 - Method in a robotic cleaning device for facilitating detection of objects from captured images - Google Patents

Method in a robotic cleaning device for facilitating detection of objects from captured images

Info

Publication number
WO2016005011A1
Authority
WO
WIPO (PCT)
Prior art keywords
cleaning device
image
robotic cleaning
vicinity
luminous section
Application number
PCT/EP2014/078143
Other languages
French (fr)
Inventor
Anders Haegermarck
Original Assignee
Aktiebolaget Electrolux
Application filed by Aktiebolaget Electrolux filed Critical Aktiebolaget Electrolux
Publication of WO2016005011A1 publication Critical patent/WO2016005011A1/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0248Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images

Definitions

  • the invention relates to a robotic cleaning device and a method for the robotic cleaning device of facilitating detection of objects from captured images.
  • Robotic vacuum cleaners are known in the art, which are equipped with drive means in the form of one or more motors for moving the cleaner across a surface to be cleaned.
  • the robotic vacuum cleaners are further equipped with intelligence in the form of microprocessor(s) and navigation means for causing an autonomous behaviour such that the robotic vacuum cleaners freely can move around and clean a space in the form of e.g. a room.
  • these prior art robotic vacuum cleaners have the capability of more or less autonomously vacuum cleaning a room in which furniture such as tables and chairs and other obstacles such as walls and stairs are located.
  • These robotic vacuum cleaners have navigated a room by means of structured light, such as e.g. line laser beams, to illuminate obstacles to be detected and by registering laser light directly reflected from the obstacles back towards the cleaner.
  • An object of the present invention is to solve, or at least mitigate, this problem in the art and to provide an improved method at a robotic cleaning device for facilitating detection of objects.
  • a method for a robotic cleaning device of facilitating detection of objects from captured images comprises the steps of illuminating a vicinity of the robotic cleaning device with structured light, recording a first image of the vicinity of the robotic cleaning device, changing position of the robotic cleaning device, and recording a second image of the vicinity of the robotic cleaning device.
  • the method further comprises the steps of detecting at least one luminous section in the first and second image and determining, when comparing the second image and the first image, whether the luminous section maintains its position in the second image, in which case the luminous section is considered to be a direct reflection of the structured light impinging on an object in said vicinity.
  • a robotic cleaning device comprising a propulsion system arranged to move the robotic cleaning device, at least one light source arranged to illuminate a vicinity of the robotic cleaning device with structured light, a camera device arranged to capture images of the vicinity of the robotic cleaning device, and a controller arranged to control the propulsion system to move the robotic cleaning device.
  • the controller is further arranged to control the camera device to capture a first image, to control the propulsion system to change position of the robotic cleaning device, to control the camera device to capture a second image, to detect at least one luminous section in the first and second image, to determine, when comparing the second image and the first image, whether the luminous section maintains its position in the second image, in which case the luminous section is considered to be a direct reflection of the structured light impinging on an object in the vicinity.
  • a vicinity of the robotic cleaning device is illuminated with structured light, and a camera of the robotic cleaning device continuously captures images of the vicinity.
  • the structured light may for instance be line laser beams.
  • the robotic cleaning device changes position by means of performing a yaw, pitch, translation or roll movement, or a combination thereof, and captures a second image.
  • the line laser beams will impinge on an obstacle, such as a wall or a floor, and be reflected towards the camera.
  • the captured images will not only comprise illuminated sections being a result of directly reflected light from an illuminated object, but also indirect reflections and detected light of strong light sources such as sunbeams and lamps. This will cause false detections in the form of image data representing lines in relation to which the robotic cleaning device cannot position itself.
  • the second image and the first image are compared, and it is determined whether the luminous section maintains its position in the second image, in which case the luminous section is considered to be a direct reflection of the structured light impinging on an object in the vicinity.
  • the directly reflected light will cause a luminous section, e.g. a line as exemplified hereinabove, in the second image to maintain its position as compared to the first image.
  • the movement of the image data extracted from the luminous section will be zero for a direct reflection of an illuminated object when comparing the first and the second image, while an indirect reflection will cause a movement of the image data extracted from the indirectly reflected line laser beams captured in the first and second image, which movement will be directly related to change in yaw angle of the robotic cleaning device.
  • the movement of the extracted image data will also be directly related to the change in yaw angle, but displacement of the extracted image data will be smaller as compared to the displacement of the indirectly reflected line laser beams.
  • Since the camera capturing the images is fixedly related to the laser light source, the camera will always follow the movement of the projected laser beams, in which case the image data extracted from the luminous section(s) of the two captured images will maintain its position.
  • the laser sensor has two main sources of false detections: first, strong sunlight will cause bright areas in the image, despite an optical filter. These can be mistaken for laser lines. Second, the laser may be reflected on shiny surfaces, causing multiple detected laser lines. Both these types of false detections can be suppressed in the present invention by comparing two consecutive images.
  • In case the luminous section does not maintain its position in the second image, it is not considered a direct reflection of the structured light impinging on an object in the vicinity, but an indirect reflection or a result of a fixed light source such as a sunbeam or a lamp, and the luminous section is thus filtered out from the first and second image.
  • Figure 1 shows a bottom view of a robotic cleaning device according to embodiments of the present invention
  • Figure 2 shows a front view of the robotic cleaning device illustrated in Figure 1;
  • Figure 3a shows a robotic cleaning device implementing the method according to an embodiment of the present invention
  • Figure 3b illustrates an image captured by the robotic cleaning device of the environment in Figure 3a;
  • Figure 4a shows the robotic cleaning device of Figure 3a performing a yaw movement;
  • Figure 4b illustrates an image captured by the robotic cleaning device of the environment in Figure 4a.
  • Figure 5 illustrates a flow chart of an embodiment of the method according to the present invention.
  • the invention relates to robotic cleaning devices, or in other words, to automatic, self-propelled machines for cleaning a surface, e.g. a robotic vacuum cleaner, a robotic sweeper or a robotic floor washer.
  • the robotic cleaning device according to the invention can be mains-operated and have a cord, be battery-operated or use any other kind of suitable energy source, for example solar energy.
  • Figure 1 shows a robotic cleaning device 10 according to embodiments of the present invention in a bottom view, i.e. the bottom side of the robotic cleaning device is shown.
  • the arrow indicates the forward direction of the robotic cleaning device.
  • the robotic cleaning device 10 comprises a main body 11 housing components such as a propulsion system comprising driving means in the form of two electric wheel motors 15a, 15b for enabling movement of the driving wheels 12, 13 such that the cleaning device can be moved over a surface to be cleaned.
  • Each wheel motor 15a, 15b is capable of controlling the respective driving wheel 12, 13 to rotate independently of each other in order to move the robotic cleaning device 10 across the surface to be cleaned.
  • a number of different driving wheel arrangements, as well as various wheel motor arrangements, can be envisaged.
  • the robotic cleaning device may have any appropriate shape, such as a device having a more traditional circular-shaped main body, or a triangular-shaped main body.
  • a track propulsion system may be used or even a hovercraft propulsion system.
  • the propulsion system may further be arranged to cause the robotic cleaning device 10 to perform any one or more of a yaw, pitch, translation or roll movement.
  • a controller 16 such as a microprocessor controls the wheel motors 15a, 15b to rotate the driving wheels 12, 13 as required in view of information received from an obstacle detecting device (not shown in Figure 1) for detecting obstacles in the form of walls, floor lamps, table legs, around which the robotic cleaning device must navigate.
  • the obstacle detecting device may be embodied in the form of a 3D sensor system registering its surroundings, implemented by means of e.g. a 3D camera, a camera in combination with lasers, a laser scanner, etc. for detecting obstacles and communicating information about any detected obstacle to the microprocessor 16.
  • the microprocessor 16 communicates with the wheel motors 15a, 15b to control movement of the wheels 12, 13 in accordance with information provided by the obstacle detecting device such that the robotic cleaning device 10 can move as desired across the surface to be cleaned. This will be described in more detail with reference to subsequent drawings.
  • the main body 11 may optionally be arranged with a cleaning member 17 for removing debris and dust from the surface to be cleaned in the form of a rotatable brush roll arranged in an opening 18 at the bottom of the robotic cleaner 10.
  • the rotatable brush roll 17 is arranged along a horizontal axis in the opening 18 to enhance the dust and debris collecting properties of the cleaning device 10.
  • a brush roll motor 19 is operatively coupled to the brush roll to control its rotation in line with instructions received from the controller 16.
  • the main body 11 of the robotic cleaner 10 comprises a suction fan 20 creating an air flow for transporting debris to a dust bag or cyclone arrangement (not shown) housed in the main body via the opening 18 in the bottom side of the main body 11. The suction fan 20 is driven by a fan motor 21 communicatively connected to the controller 16.
  • the main body 11 of the robotic cleaning device 10 is further equipped with an angle-measuring device 24, such as e.g. a gyroscope 24 and/or an accelerometer or any other appropriate device for measuring orientation of the robotic cleaning device 10.
  • a three-axis gyroscope is capable of measuring rotational velocity in a roll, pitch and yaw movement of the robotic cleaning device 10.
  • a three-axis accelerometer is capable of measuring acceleration in all directions, which is mainly used to determine whether the robotic cleaning device is bumped or lifted or if it is stuck (i.e. not moving even though the wheels are turning).
  • the robotic cleaning device 10 further comprises encoders (not shown in Figure 1) on each drive wheel 12, 13 which generate pulses when the wheels turn.
  • the encoders may for instance be magnetic or optical.
  • By counting the pulses at the controller 16, the speed of each wheel 12, 13 can be determined.
  • By combining wheel speed readings with gyroscope information, the controller 16 can perform so-called dead reckoning to determine position and heading of the cleaning device 10.
  • the controller/processing unit 16 embodied in the form of one or more microprocessors is arranged to execute a computer program 25 downloaded to a suitable storage medium 26 associated with the microprocessor, such as a Random Access Memory (RAM), a Flash memory or a hard disk drive.
  • the controller 16 is arranged to carry out a method according to embodiments of the present invention when the appropriate computer program 25 comprising computer-executable instructions is downloaded to the storage medium 26 and executed by the controller 16.
  • the storage medium 26 may also be a computer program product comprising the computer program 25.
  • the computer program 25 may be transferred to the storage medium 26 by means of a suitable computer program product, such as a digital versatile disc (DVD), compact disc (CD) or a memory stick.
  • the computer program 25 may be downloaded to the storage medium 26 over a network.
  • the controller 16 may alternatively be embodied in the form of a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field- programmable gate array (FPGA), a complex programmable logic device (CPLD), etc.
  • Figure 2 shows a front view of the robotic cleaning device 10 of Figure 1 in an embodiment of the present invention illustrating the previously mentioned obstacle detecting device in the form of a 3D sensor system 22 comprising at least a camera 23 and a first and a second line laser 27, 28, which may be horizontally or vertically oriented line lasers. Further shown is the controller 16, the main body 11, the driving wheels 12, 13, and the rotatable brush roll 17 previously discussed with reference to Figure 1.
  • the controller 16 is operatively coupled to the camera 23 for recording images of a vicinity of the robotic cleaning device 10.
  • the first and second line lasers 27, 28 may preferably be vertical line lasers and are arranged lateral of the camera 23 and configured to illuminate a height and a width that is greater than the height and width of the robotic cleaning device 10.
  • the angle of the field of view of the camera 23 is preferably smaller than the space illuminated by the first and second line lasers 27, 28.
  • the camera 23 is controlled by the controller 16 to capture and record a plurality of images per second. Data from the images is extracted by the controller 16 and the data is typically saved in the memory 26 along with the computer program 25.
  • the first and second line lasers 27, 28 are typically arranged on a respective side of the camera 23 along an axis being perpendicular to an optical axis of the camera. Further, the line lasers 27, 28 are directed such that their respective laser beams intersect within the field of view of the camera 23. Typically, the intersection coincides with the optical axis of the camera 23.
  • the first and second line lasers 27, 28 are configured to scan, preferably in a vertical orientation, the vicinity of the robotic cleaning device 10, normally in the direction of movement of the robotic cleaning device 10.
  • the first and second line lasers 27, 28 are configured to send out laser beams, which illuminate furniture, walls and other objects of e.g. a room to be cleaned.
  • the camera 23 is controlled by the controller 16 to capture and record images from which the controller 16 creates a representation or layout of the surroundings that the robotic cleaning device 10 is operating in, by extracting features from the images and by measuring the distance covered by the robotic cleaning device 10, while the robotic cleaning device 10 is moving across the surface to be cleaned.
  • the controller 16 derives positional data of the robotic cleaning device 10 with respect to the surface to be cleaned from the recorded images, generates a 3D representation of the surroundings from the derived positional data and controls the driving motors 15a, 15b to move the robotic cleaning device across the surface to be cleaned in accordance with the generated 3D representation and navigation information supplied to the robotic cleaning device 10 such that the surface to be cleaned can be navigated by taking into account the generated 3D representation. Since the derived positional data will serve as a foundation for the navigation of the robotic cleaning device, it is important that the positioning is correct; the robotic device will otherwise navigate according to a "map" of its surroundings that is misleading.
  • the 3D representation generated from the images recorded by the 3D sensor system 22 thus facilitates detection of obstacles in the form of walls, floor lamps, table legs, around which the robotic cleaning device must navigate as well as rugs, carpets, doorsteps, etc., that the robotic cleaning device 10 must traverse.
  • the robotic cleaning device 10 is hence configured to learn about its environment or surroundings by operating/cleaning.
  • the 3D sensor system 22 is separated from the main body 11 of the robotic cleaning device 10.
  • the 3D sensor system 22 is likely to be integrated with the main body 11 of the robotic cleaning device 10 to minimize the height of the robotic cleaning device 10, thereby allowing it to pass under obstacles, such as e.g. a sofa.
  • the 3D sensor system 22 comprising the camera 23 and the first and second vertical line lasers 27, 28 is arranged to record images of a vicinity of the robotic cleaning device 10 from which objects/obstacles may be detected.
  • the controller 16 is capable of positioning the robotic cleaning device 10 with respect to the detected obstacles and hence a surface to be cleaned by deriving positional data from the recorded images. From the positioning, the controller 16 controls movement of the robotic cleaning device 10 by means of controlling the wheels 12, 13 via the wheel drive motors 15a, 15b, across the surface to be cleaned.
  • the derived positional data facilitates control of the movement of the robotic cleaning device 10 such that cleaning device can be navigated to move very close to an object, and to move closely around the object to remove debris from the surface on which the object is located.
  • the derived positional data is utilized to move flush against the object, being e.g. a thick rug or a wall.
  • the controller 16 continuously generates and transfers control signals to the drive wheels 12, 13 via the drive motors 15a, 15b such that the robotic cleaning device 10 is navigated close to the object.
  • Figure 3a illustrates facilitation of detection of objects in accordance with an embodiment of the present invention.
  • one vertical line laser 27 of the robotic device 10 is shown.
  • the line laser 27 projects a laser beam 30 onto the floor and a first wall 31 of the room to be cleaned.
  • the laser beam 30 is also reflected against a second wall 32, thus creating a "false" laser beam 33.
  • the false laser beam 33 is indirectly reflected against the second wall 32 towards the camera 23 of the robotic cleaning device 10.
  • sunlight entering the room via window 34 will cause two fixedly arranged light sources 35, 36 which undesirably may be detected by the camera 23.
  • Figure 3b illustrates a first image 37 captured by the camera 23 in the situation shown in Figure 3a, where for illustrational purposes only the detected light sources are shown.
  • the first image 37 comprises three luminous sections in the form of the directly reflected laser beam 30, the indirectly reflected laser beam 33 and the fixed sunbeam 35.
  • the indirectly reflected laser beam 33 and the fixed sunbeam 35 will cause errors in the obstacle detection for the robotic cleaning device and subsequent creation of the 3D representation, since false lines will appear in the captured images, in relation to which the robotic cleaning device cannot position itself correctly. Even though optical filters can be used to filter out light, it is oftentimes still not enough.
  • Figure 4a illustrates the room of Figure 3a, but where a yaw movement of the robotic device 10 has been performed. That is, the robotic cleaning device 10 turns about its z-axis with a change in angle of Δρ. This will cause the indirectly reflected laser beam 33 to move to the left on the second wall 32.
  • the laser beams pertaining to the previous robotic cleaning device position of Figure 3a are shown with dashed lines in Figure 4a.
  • Figure 4b illustrates a second image 38 captured by the camera 23 in the situation shown in Figure 4a, i.e. after the heading of the robotic cleaning device 10 has changed with Δρ as compared to its heading when capturing the first image 37.
  • the second image 38 comprises three luminous sections in the form of the directly reflected laser beam 30, the indirectly reflected laser beam 33 and the fixed sunbeam 35.
  • the position of the directly reflected "real" laser beam 30 is maintained in the second image 38 as compared to in the first image 37, since the camera 23 is fixedly arranged with respect to the line laser 27 and thus follows the laser beam 30.
  • the indirectly reflected laser beam 33 moves to the left in the second image 38 with a distance corresponding to Δx1, while the fixed sunbeam 35 moves to the left with a smaller distance Δx2.
  • the luminous sections 33 and 35 of the second image 38 in Figure 4b have moved as compared to their position in the first image 37 of Figure 3b.
  • These two luminous sections are thus false detections (in this case an indirect reflection of the line laser and a fixed light source, respectively), and can advantageously be filtered out from the first and the second image.
  • "clean" images with no false detections are provided, which will greatly improve the obstacle-detecting capacity of the robotic cleaning device 10 and the ability of creating 3D representations from the captured images.
  • the line lasers 27, 28 are optionally controlled to emit light at a highest possible power.
  • an optical filter can be arranged in front of the camera 23 to make the camera more perceptive to the light emitted by the line lasers 27, 28.
  • the optical filter is adapted to a wavelength of the structured light emitted by the line lasers 27, 28.
  • the estimated position of the robot cleaning device 10 is typically recorded at the time of capturing the respective picture 37, 38 by applying dead reckoning.
  • Since a CMOS camera is equipped with a light sensor array, where each individual light sensor (i.e. pixel) in the array represents detected light from a unique position in space, a recorded picture will contain image data representing objects that the line lasers have illuminated, which image data further can be associated with unique coordinates. From the extracted lines in the captured images 37, 38, a representation of the illuminated vicinity along the projected laser lines 27, 28 can be created. As the position of the robot 10 is recorded when each image is captured, the 2D representations provided by the captured images 37, 38 can be transformed into 3D space and used to build a complete 3D map of the room as the robot moves and continuously records further sections of the room.
  • Figure 5 illustrates a flowchart of an embodiment of the method of facilitating detection of objects from captured images according to the present invention.
  • In a first step S101, the first and second vertical line lasers 27, 28 of the 3D sensor system 22 illuminate a vicinity of the robotic cleaning device 10 with laser light.
  • the floor and the first wall 31 of the room to be cleaned are illuminated.
  • In a second step S102, the camera 23 of the 3D sensor system 22 records a first image 37 of the vicinity from which obstacles may be detected.
  • the controller 16 is capable of positioning the robotic cleaning device 10 with respect to the detected obstacles (and hence a surface to be cleaned) by deriving positional data from the recorded images. From the positioning, the controller 16 controls movement of the robotic cleaning device 10 by means of controlling the wheels 12, 13 via the wheel drive motors 15a, 15b, across the surface to be cleaned.
  • the derived positional data facilitates control of the movement of the robotic cleaning device 10 such that cleaning device can be navigated to move very close to an object, and to move closely around the object to remove debris from the surface on which the object is located.
  • the derived positional data is utilized to move flush against the object, being e.g. a wall or a piece of furniture.
  • the controller 16 continuously generates and transfers control signals to the drive wheels 12, 13 via the drive motors 15a, 15b such that the robotic cleaning device 10 is navigated close to the object.
  • the robotic cleaning device 10 changes its position in step S103, as previously has been described with reference to Figure 4a.
  • the controller 16 effects a yaw movement of the cleaning device 10 and records Δρ via the gyroscope 24.
  • the change in position of the cleaning device 10 is not necessarily brought about by a yaw movement, but could alternatively be effected by a pitch, roll or translation movement, or a combination thereof.
  • In step S104, the controller 16 controls the camera 23 to capture a second image 38.
  • the controller 16 subsequently detects, in step S105, a plurality of luminous sections 30, 33, 35 in the first and the second images 37, 38.
  • In step S106, the controller 16 determines whether the luminous sections 30, 33, 35 maintain their position in the second image 38. If that is the case, the controller 16 will consider such a luminous section to be a directly reflected laser beam as a result of the line lasers 27, 28 impinging on an object to be detected, in this particular example the first wall 31. Consequently, the image data extracted from the luminous section 30 is kept in the first and second image 37, 38, while image data extracted from the luminous sections 33, 35 optionally will be removed from the images in step S107, since it relates to an indirectly reflected laser beam and a fixed light source, respectively.
  • indirect reflections such as the laser beam 33 reflected against the second wall 32 can be distinguished from fixed light sources such as the sunbeams 35, 36.
  • For a clockwise rotation, the extracted image data caused by the indirect reflection 33 will move to the left in the second image 38 as compared to its position in the first image 37, and so will the extracted image data caused by the fixed light source 35, but the movement of the indirect reflection will be greater; a counter-clockwise rotation will instead result in the respective extracted image data moving to the right when comparing the two images 37, 38.
  • the movement- i.e. displacement - of the extracted image data in the second image 38 will be greater for an indirect reflection 33 than for a fixed light source 35.
  • a measure is evaluated based on a relation between the change in position of the robotic cleaning device 10 - caused e.g. by a yaw, pitch, roll or translation movement of the robotic cleaning device 10 - and a displacement Δx of the extracted image data in the second image 38 to determine whether the luminous section in the second image is an indirect reflection 33 of the structured light impinging on the object 31 or a fixed light source 35.
  • a measure k is introduced: k = Δx/Δρ, where Δx denotes the displacement of the extracted image data between the two images 37, 38 and Δρ the change in yaw angle of the robotic cleaning device 10 (a numerical sketch follows at the end of this list).
  • Δx is defined as positive for extracted image data moving to the right in the second image 38, and Δρ is defined as positive for rotation in a counter-clockwise direction of the robotic cleaning device 10.
  • a threshold value can be appropriately set to determine whether the extracted image data should be rejected as false. If k exceeds the threshold value, the extracted image data is considered to be caused by an indirect reflection. If k is below the threshold value (but greater than 0, i.e. a displacement of the image data is detected), the extracted image data is considered to be caused by a fixed light source.
  • the camera used has a 90° field of view distributed over 752 pixels horizontally.
  • k would be approximately 1 for the fixed light source 35.
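To make the classification by the measure k concrete, the following numerical sketch assumes k = Δx/Δρ with the pixel displacement Δx converted into an angle using the 90° / 752-pixel figures given above, so that a distant fixed light source yields k close to 1. The threshold value and the function name are illustrative assumptions, not taken from the patent.

```python
def classify_luminous_section(dx_pixels, dyaw_deg,
                              image_width=752, fov_deg=90.0,
                              threshold=1.5):
    """Classify a luminous section from its displacement between two images.

    k = (angular displacement of the image data) / (change in yaw angle):
    roughly 0 for a direct reflection, close to 1 for a fixed light source,
    and larger for an indirect reflection, which moves more than the
    rotation alone would explain.
    """
    deg_per_pixel = fov_deg / image_width  # ~0.12 degrees per pixel
    k = (dx_pixels * deg_per_pixel) / dyaw_deg
    if abs(k) < 1e-6:
        return "direct reflection"    # image data kept
    if k > threshold:
        return "indirect reflection"  # filtered out
    return "fixed light source"       # filtered out

# Example with a 5-degree counter-clockwise yaw (both quantities positive):
for dx in (0.0, 42.0, 95.0):
    print(dx, classify_luminous_section(dx, 5.0))
# -> direct reflection, fixed light source (k ~ 1.0), indirect reflection
```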

Abstract

The invention relates to a robotic cleaning device and a method for the robotic cleaning device of facilitating detection of objects from captured images. In a first aspect of the present invention a method is provided for a robotic cleaning device of facilitating detection of objects from captured images. The method comprises the steps of illuminating a vicinity of the robotic cleaning device with structured light, recording a first image of the vicinity of the robotic cleaning device, changing position of the robotic cleaning device, and recording a second image of the vicinity of the robotic cleaning device. The method further comprises the steps of detecting at least one luminous section in the first and second image and determining, when comparing the second image and the first image, whether the luminous section maintains its position in the second image, in which case the luminous section is considered to be a direct reflection of the structured light impinging on an object in said vicinity.

Description

METHOD IN A ROBOTIC CLEANING DEVICE FOR FACILITATING DETECTION OF OBJECTS FROM CAPTURED IMAGES
TECHNICAL FIELD
The invention relates to a robotic cleaning device and a method for the robotic cleaning device of facilitating detection of objects from captured images.
BACKGROUND
In many fields of technology, it is desirable to use robots with an autonomous behaviour such that they freely can move around a space without colliding with possible obstacles.
Robotic vacuum cleaners are known in the art, which are equipped with drive means in the form of one or more motors for moving the cleaner across a surface to be cleaned. The robotic vacuum cleaners are further equipped with intelligence in the form of microprocessor(s) and navigation means for causing an autonomous behaviour such that the robotic vacuum cleaners freely can move around and clean a space in the form of e.g. a room. Thus, these prior art robotic vacuum cleaners have the capability of more or less autonomously vacuum cleaning a room in which furniture such as tables and chairs and other obstacles such as walls and stairs are located. These robotic vacuum cleaners have navigated a room by means of structured light, such as e.g. line laser beams, to illuminate obstacles to be detected and registering laser light directly reflected from the obstacles back towards the cleaner in order to determine where the obstacles are located in the room. Images are continuously captured by a camera of the robotic cleaning device, and distance to the illuminated obstacle such as a wall or a floor can be calculated by detecting the directly reflected laser line in the captured images and using trigonometric functions based on the known position of the cleaner, such that a 3D representation of the room subsequently can be created relative to the robot cleaner. A problem in the art is however that not only the direct reflections of the laser light are registered by the camera of the robotic vacuum cleaner, but also indirect reflections as well as other sources of light in the form of for example sunbeams entering the room via windows. This will cause errors in the obstacle detection and subsequent creation of the 3D representation, since "false" lines will appear in the captured images. Even though optical filters can be used to filter out light, the wavelength of which differs from that of the laser light, it is still not enough. This is particularly the case for laser light which is not reflected directly from the obstacle, but via a further obstacle and then back towards the camera (for instance from a first wall to be detected via a second wall and back to the camera). Such indirectly reflected light cannot be filtered out, since it is of the same wavelength as the laser light from which the obstacles are to be detected. This has the negative effect that the robotic cleaning device cannot separate directly reflected laser light from "noise", i.e. indirectly reflected light and/or powerful light sources such as e.g. sunbeams and higher-power lamps. Consequently, the obstacle detection and subsequent creation of a 3D representation of the room will be incorrect.
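For readers less familiar with structured-light ranging, the following is a minimal sketch of the line-laser triangulation outlined above, assuming the simplest possible geometry: the line laser sits at a small, known baseline from the camera and projects parallel to the optical axis, so the image column of the reflected line encodes the distance. The function name and the baseline value are illustrative assumptions, not taken from the patent.

```python
import math

def distance_from_laser_pixel(pixel_col: float,
                              image_width: int = 752,
                              fov_deg: float = 90.0,
                              baseline_m: float = 0.05) -> float:
    """Triangulate the distance to the surface a line laser impinges on.

    Sketch under assumed geometry: the laser is offset `baseline_m` from
    the camera and projects parallel to the optical axis, so the closer
    the obstacle, the further from the image centre its reflected line
    appears. The 752-pixel / 90-degree figures match the camera mentioned
    later in the text; the baseline is an assumption.
    """
    # Pinhole model: focal length in pixels from the horizontal field of view.
    focal_px = (image_width / 2) / math.tan(math.radians(fov_deg) / 2)
    # Bearing of the detected laser line relative to the optical axis.
    bearing = math.atan(abs(pixel_col - image_width / 2) / focal_px)
    if bearing == 0.0:
        return float("inf")  # line at the image centre: surface at infinity
    # Parallel-beam triangulation: tan(bearing) = baseline / distance.
    return baseline_m / math.tan(bearing)
```

With these assumed numbers, a line detected 100 pixels from the image centre would correspond to a surface roughly 0.19 m away.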
SUMMARY
An object of the present invention is to solve, or at least mitigate, this problem in the art and to provide an improved method at a robotic cleaning device for facilitating detection of objects.
This object is attained in a first aspect of the present invention by a method for a robotic cleaning device of facilitating detection of objects from captured images. The method comprises the steps of illuminating a vicinity of the robotic cleaning device with structured light, recording a first image of the vicinity of the robotic cleaning device, changing position of the robotic cleaning device, and recording a second image of the vicinity of the robotic cleaning device. The method further comprises the steps of detecting at least one luminous section in the first and second image and determining, when comparing the second image and the first image, whether the luminous section maintains its position in the second image, in which case the luminous section is considered to be a direct reflection of the structured light impinging on an object in said vicinity.
This object is attained in a second aspect of the present invention by a robotic cleaning device comprising a propulsion system arranged to move the robotic cleaning device, at least one light source arranged to illuminate a vicinity of the robotic cleaning device with structured light, a camera device arranged to capture images of the vicinity of the robotic cleaning device, and a controller arranged to control the propulsion system to move the robotic cleaning device. The controller is further arranged to control the camera device to capture a first image, to control the propulsion system to change position of the robotic cleaning device, to control the camera device to capture a second image, to detect at least one luminous section in the first and second image, to determine, when comparing the second image and the first image, whether the luminous section maintains its position in the second image, in which case the luminous section is considered to be a direct reflection of the structured light impinging on an object in the vicinity.
In the present invention, a vicinity of the robotic cleaning device is illuminated with structured light, and a camera of the robotic cleaning device continuously captures images of the vicinity. Hence, with the structured light, being for instance line laser beams, not all parts of the field of view of the camera are illuminated in the same way with respect to for instance intensity, time, polarity, color, etc., in order to add information that aids in the interpretation of the captured images. After a first image is captured, the robotic cleaning device changes position by means of performing a yaw, pitch, translation or roll movement, or a combination thereof, and captures a second image. The line laser beams will impinge on an obstacle, such as a wall or a floor, and be reflected towards the camera. From the captured images, luminous sections are detected in the form of laser lines reflected from the obstacle back towards the camera. Thus, image data representing a line formed by the laser line beams are extracted from the recorded images. From these extracted lines, a 3D representation of the illuminated space along the projected laser lines can be created. Since e.g. a Complementary Metal Oxide Semiconductor (CMOS) camera is equipped with a light sensor array, where each individual light sensor (i.e. pixel) in the array represents detected light from a unique position in space, a recorded picture will contain image data representing objects that the line lasers have illuminated, which image data further can be associated with unique coordinates. As a result, it is possible to position the robotic cleaning device with respect to the illuminated objects/obstacles using trigonometric functions since the positional and angular relationship between the camera and the laser light source(s) of the robotic cleaning device are known.
However, the captured images will not only comprise illuminated sections being a result of directly reflected light from an illuminated object, but also indirect reflections and detected light of strong light sources such as sunbeams and lamps. This will cause false detections in the form of image data representing lines in relation to which the robotic cleaning device cannot position itself.
Advantageously, in the present invention, the second image and the first image are compared, and it is determined whether the luminous section maintains its position in the second image, in which case the luminous section is considered to be a direct reflection of the structured light impinging on an object in the vicinity. The directly reflected light will cause a luminous section, e.g. a line as exemplified hereinabove, in the second image to maintain its position as compared to the first image. As an example, if the robotic cleaning device performs a small yaw movement (i.e. a movement about its z-axis) between the first and the second captured image, the movement of the image data extracted from the luminous section will be zero for a direct reflection of an illuminated object when comparing the first and the second image, while an indirect reflection will cause a movement of the image data extracted from the indirectly reflected line laser beams captured in the first and second image, which movement will be directly related to change in yaw angle of the robotic cleaning device. For a fixed light source, such as a sunbeam, the movement of the extracted image data will also be directly related to the change in yaw angle, but displacement of the extracted image data will be smaller as compared to the displacement of the indirectly reflected line laser beams. Since the camera capturing the images is fixedly related to the laser light source, the camera will always follow the movement of the projected laser beams, in which case the image data extracted from the luminous section(s) of the two captured images will maintain its position. The laser sensor has two main sources of false detections: first, strong sunlight will cause bright areas in the image, despite an optical filter. These can be mistaken for laser lines. Second, the laser may be reflected on shiny surfaces, causing multiple detected laser lines. Both these types of false detections can be suppressed in the present invention by comparing two consecutive images.
In an embodiment of the present invention, in case the luminous section does not maintain its position in the second image, it is not considered a direct reflection of the structured light impinging on an object in the vicinity, but an indirect reflection or a result of a fixed light source such as a sunbeam or a lamp, and the luminous section is thus filtered out from the first and second image.
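As a concrete illustration of this comparison, here is a hedged sketch in Python. It assumes each detected luminous section has been reduced to the horizontal pixel position of its centroid; the function name and the pixel tolerance are illustrative assumptions, not specified in the patent.

```python
def filter_direct_reflections(sections_first, sections_second, tol_px=2.0):
    """Keep luminous sections that hold their position across two images.

    Sections that reappear at (almost) the same position are treated as
    direct reflections of the structured light, because the camera and the
    line lasers move together; the rest are discarded as indirect
    reflections or fixed light sources.
    """
    direct = []
    for x1 in sections_first:
        if any(abs(x1 - x2) <= tol_px for x2 in sections_second):
            direct.append(x1)
    return direct

# Example: after the yaw movement, the direct beam stays at column 390,
# while the indirect reflection and the sunbeam drift by different amounts.
first = [390.0, 510.0, 620.0]
second = [390.5, 478.0, 601.0]
print(filter_direct_reflections(first, second))  # -> [390.0]
```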
Further embodiments of the present invention will be described in the following.
It is noted that the invention relates to all possible combinations of features recited in the claims. Further features of, and advantages with, the present invention will become apparent when studying the appended claims and the following description. Those skilled in the art realize that different features of the present invention can be combined to create embodiments other than those described in the following.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention is now described, by way of example, with reference to the accompanying drawings, in which:
Figure 1 shows a bottom view of a robotic cleaning device according to embodiments of the present invention;
Figure 2 shows a front view of the robotic cleaning device illustrated in Figure 1;
Figure 3a shows a robotic cleaning device implementing the method according to an embodiment of the present invention;
Figure 3b illustrates an image captured by the robotic cleaning device of the environment in Figure 3a;
Figure 4a shows the robotic cleaning device of Figure 3a performing a yaw movement;
Figure 4b illustrates an image captured by the robotic cleaning device of the environment in Figure 4a; and
Figure 5 illustrates a flow chart of an embodiment of the method according to the present invention.
DETAILED DESCRIPTION
The invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout the description. The invention relates to robotic cleaning devices, or in other words, to automatic, self-propelled machines for cleaning a surface, e.g. a robotic vacuum cleaner, a robotic sweeper or a robotic floor washer. The robotic cleaning device according to the invention can be mains-operated and have a cord, be battery-operated or use any other kind of suitable energy source, for example solar energy.
Figure 1 shows a robotic cleaning device 10 according to embodiments of the present invention in a bottom view, i.e. the bottom side of the robotic cleaning device is shown. The arrow indicates the forward direction of the robotic cleaning device. The robotic cleaning device 10 comprises a main body 11 housing components such as a propulsion system comprising driving means in the form of two electric wheel motors 15a, 15b for enabling movement of the driving wheels 12, 13 such that the cleaning device can be moved over a surface to be cleaned. Each wheel motor 15a, 15b is capable of controlling the respective driving wheel 12, 13 to rotate independently of each other in order to move the robotic cleaning device 10 across the surface to be cleaned. A number of different driving wheel arrangements, as well as various wheel motor arrangements, can be envisaged. It should be noted that the robotic cleaning device may have any appropriate shape, such as a device having a more traditional circular-shaped main body, or a triangular-shaped main body. As an alternative, a track propulsion system may be used or even a hovercraft propulsion system. The propulsion system may further be arranged to cause the robotic cleaning device 10 to perform any one or more of a yaw, pitch, translation or roll movement. A controller 16 such as a microprocessor controls the wheel motors 15a, 15b to rotate the driving wheels 12, 13 as required in view of information received from an obstacle detecting device (not shown in Figure 1) for detecting obstacles in the form of walls, floor lamps, table legs, around which the robotic cleaning device must navigate. The obstacle detecting device may be embodied in the form of a 3D sensor system registering its surroundings, implemented by means of e.g. a 3D camera, a camera in combination with lasers, a laser scanner, etc. for detecting obstacles and communicating information about any detected obstacle to the microprocessor 16. The microprocessor 16 communicates with the wheel motors 15a, 15b to control movement of the wheels 12, 13 in accordance with information provided by the obstacle detecting device such that the robotic cleaning device 10 can move as desired across the surface to be cleaned. This will be described in more detail with reference to subsequent drawings.
Further, the main body 11 may optionally be arranged with a cleaning member 17 for removing debris and dust from the surface to be cleaned in the form of a rotatable brush roll arranged in an opening 18 at the bottom of the robotic cleaner 10. Thus, the rotatable brush roll 17 is arranged along a horizontal axis in the opening 18 to enhance the dust and debris collecting properties of the cleaning device 10. In order to rotate the brush roll 17, a brush roll motor 19 is operatively coupled to the brush roll to control its rotation in line with instructions received from the controller 16. Moreover, the main body 11 of the robotic cleaner 10 comprises a suction fan 20 creating an air flow for transporting debris to a dust bag or cyclone arrangement (not shown) housed in the main body via the opening 18 in the bottom side of the main body 11. The suction fan 20 is driven by a fan motor 21 communicatively connected to the controller 16 from which the fan motor 21 receives instructions for controlling the suction fan 20. It should be noted that a robotic cleaning device having either one of the rotatable brush roll 17 and the suction fan 20 for transporting debris to the dust bag can be envisaged. A combination of the two will however enhance the debris-removing capabilities of the robotic cleaning device 10. The main body 11 of the robotic cleaning device 10 is further equipped with an angle-measuring device 24, such as e.g. a gyroscope 24 and/or an accelerometer or any other appropriate device for measuring orientation of the robotic cleaning device 10. A three-axis gyroscope is capable of measuring rotational velocity in a roll, pitch and yaw movement of the robotic cleaning device 10. A three-axis accelerometer is capable of measuring acceleration in all directions, which is mainly used to determine whether the robotic cleaning device is bumped or lifted or if it is stuck (i.e. not moving even though the wheels are turning). The robotic cleaning device 10 further comprises encoders (not shown in Figure 1) on each drive wheel 12, 13 which generate pulses when the wheels turn. The encoders may for instance be magnetic or optical. By counting the pulses at the controller 16, the speed of each wheel 12, 13 can be determined. By combining wheel speed readings with gyroscope information, the controller 16 can perform so-called dead reckoning to determine position and heading of the cleaning device 10.
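The dead reckoning mentioned above can be made concrete with a short sketch: a minimal differential-drive update that combines wheel speeds (derived from the encoder pulse counts) with the yaw rate reported by the gyroscope. All names and the update scheme are illustrative assumptions.

```python
import math

def dead_reckon(x, y, heading, v_left, v_right, gyro_yaw_rate, dt):
    """One dead-reckoning step for a differential-drive cleaner.

    The mean wheel speed gives the distance covered during `dt`, while the
    gyroscope provides the change in heading; integrating both yields an
    updated position and heading estimate.
    """
    forward = 0.5 * (v_left + v_right) * dt  # distance covered this step
    heading += gyro_yaw_rate * dt            # gyroscope as heading reference
    x += forward * math.cos(heading)
    y += forward * math.sin(heading)
    return x, y, heading
```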
With further reference to Figure 1, the controller/processing unit 16 embodied in the form of one or more microprocessors is arranged to execute a computer program 25 downloaded to a suitable storage medium 26 associated with the microprocessor, such as a Random Access Memory (RAM), a Flash memory or a hard disk drive. The controller 16 is arranged to carry out a method according to embodiments of the present invention when the appropriate computer program 25 comprising computer-executable instructions is downloaded to the storage medium 26 and executed by the controller 16. The storage medium 26 may also be a computer program product comprising the computer program 25. Alternatively, the computer program 25 may be transferred to the storage medium 26 by means of a suitable computer program product, such as a digital versatile disc (DVD), compact disc (CD) or a memory stick. As a further alternative, the computer program 25 may be downloaded to the storage medium 26 over a network. The controller 16 may alternatively be embodied in the form of a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field- programmable gate array (FPGA), a complex programmable logic device (CPLD), etc.
Figure 2 shows a front view of the robotic cleaning device 10 of Figure 1 in an embodiment of the present invention illustrating the previously mentioned obstacle detecting device in the form of a 3D sensor system 22 comprising at least a camera 23 and a first and a second line laser 27, 28, which may be horizontally or vertically oriented line lasers. Further shown is the controller 16, the main body 11, the driving wheels 12, 13, and the rotatable brush roll 17 previously discussed with reference to Figure 1. The controller 16 is operatively coupled to the camera 23 for recording images of a vicinity of the robotic cleaning device 10. The first and second line lasers 27, 28 may preferably be vertical line lasers and are arranged lateral of the camera 23 and configured to illuminate a height and a width that is greater than the height and width of the robotic cleaning device 10. Further, the angle of the field of view of the camera 23 is preferably smaller than the space illuminated by the first and second line lasers 27, 28. The camera 23 is controlled by the controller 16 to capture and record a plurality of images per second. Data from the images is extracted by the controller 16 and the data is typically saved in the memory 26 along with the computer program 25.
The first and second line lasers 27, 28 are typically arranged on a respective side of the camera 23 along an axis being perpendicular to an optical axis of the camera. Further, the line lasers 27, 28 are directed such that their respective laser beams intersect within the field of view of the camera 23. Typically, the intersection coincides with the optical axis of the camera 23.
The first and second line laser 27, 28 are configured to scan, preferably in a vertical orientation, the vicinity of the robotic cleaning device 10, normally in the direction of movement of the robotic cleaning device 10. The first and second line lasers 27, 28 are configured to send out laser beams, which illuminate furniture, walls and other objects of e.g. a room to be cleaned. The camera 23 is controlled by the controller 16 to capture and record images from which the controller 16 creates a representation or layout of the surroundings that the robotic cleaning device 10 is operating in, by extracting features from the images and by measuring the distance covered by the robotic cleaning device 10, while the robotic cleaning device 10 is moving across the surface to be cleaned. Thus, the controller 16 derives positional data of the robotic cleaning device 10 with respect to the surface to be cleaned from the recorded images, generates a 3D representation of the surroundings from the derived positional data and controls the driving motors 15a, 15b to move the robotic cleaning device across the surface to be cleaned in accordance with the generated 3D representation and navigation information supplied to the robotic cleaning device 10 such that the surface to be cleaned can be navigated by taking into account the generated 3D representation. Since the derived positional data will serve as a foundation for the navigation of the robotic cleaning device, it is important that the positioning is correct; the robotic device will otherwise navigate according to a "map" of its surroundings that is misleading.
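The map building described above amounts to a change of coordinates per image: points detected along the laser lines are expressed in the robot's own frame, then rotated and translated into a common room frame using the pose recorded at capture time. The following is a minimal sketch; the names and frame conventions are assumptions.

```python
import math

def laser_points_to_world(points_robot, robot_x, robot_y, robot_heading):
    """Transform laser-line points from the robot frame to the room frame.

    Each point is (forward, sideways, height) relative to the robot;
    successive images, each tagged with the robot pose at capture time,
    then accumulate into one 3D representation of the room.
    """
    c, s = math.cos(robot_heading), math.sin(robot_heading)
    return [(robot_x + c * fx - s * sy,
             robot_y + s * fx + c * sy,
             z)
            for fx, sy, z in points_robot]
```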
The 3D representation generated from the images recorded by the 3D sensor system 22 thus facilitates detection of obstacles in the form of walls, floor lamps, table legs, around which the robotic cleaning device must navigate as well as rugs, carpets, doorsteps, etc., that the robotic cleaning device 10 must traverse. The robotic cleaning device 10 is hence configured to learn about its environment or surroundings by operating/cleaning.
With reference to Figure 2, for illustrational purposes, the 3D sensor system 22 is separated from the main body 11 of the robotic cleaning device 10.
However, in a practical implementation, the 3D sensor system 22 is likely to be integrated with the main body 11 of the robotic cleaning device 10 to minimize the height of the robotic cleaning device 10, thereby allowing it to pass under obstacles, such as e.g. a sofa.
Hence, the 3D sensor system 22 comprising the camera 23 and the first and second vertical line lasers 27, 28 is arranged to record images of a vicinity of the robotic cleaning device 10 from which objects/obstacles may be detected. The controller 16 is capable of positioning the robotic cleaning device 10 with respect to the detected obstacles and hence a surface to be cleaned by deriving positional data from the recorded images. From the positioning, the controller 16 controls movement of the robotic cleaning device 10 by means of controlling the wheels 12, 13 via the wheel drive motors 15a, 15b, across the surface to be cleaned.
The derived positional data facilitates control of the movement of the robotic cleaning device 10 such that cleaning device can be navigated to move very close to an object, and to move closely around the object to remove debris from the surface on which the object is located. Hence, the derived positional data is utilized to move flush against the object, being e.g. a thick rug or a wall. Typically, the controller 16 continuously generates and transfers control signals to the drive wheels 12, 13 via the drive motors 15a, 15b such that the robotic cleaning device 10 is navigated close to the object.
Figure 3a illustrates facilitation of detection of objects in accordance with an embodiment of the present invention. For illustrational purposes, one vertical line laser 27 of the robotic device 10 is shown. As can be seen, the line laser 27 projects a laser beam 30 onto the floor and a first wall 31 of the room to be cleaned. The laser beam 30 is also reflected against a second wall 32, thus creating a "false" laser beam 33. Hence, the false laser beam 33 is indirectly reflected against the second wall 32 towards the camera 23 of the robotic cleaning device 10. Further, sunlight entering the room via window 34 will cause two fixedly arranged light sources 35, 36 which undesirably may be detected by the camera 23.
Figure 3b illustrates a first image 37 captured by the camera 23 in the situation shown in Figure 3a, where for illustrational purposes only the detected light sources are shown. Thus, the first image 37 comprises three luminous sections in the form of the directly reflected laser beam 30, the indirectly reflected laser beam 33 and the fixed sunbeam 35. As previously discussed, the indirectly reflected laser beam 33 and the fixed sunbeam 35 will cause errors in the obstacle detection for the robotic cleaning device and in the subsequent creation of the 3D representation, since false lines will appear in the captured images, in relation to which the robotic cleaning device cannot position itself correctly. Even though optical filters can be used to filter out light, this is oftentimes not sufficient.
Figure 4a illustrates the room of Figure 3a, but after a yaw movement of the robotic cleaning device 10 has been performed. That is, the robotic cleaning device 10 turns about its z-axis with a change in angle of Δρ. This will cause the indirectly reflected laser beam 33 to move to the left on the second wall 32. The laser beams pertaining to the previous position of the robotic cleaning device in Figure 3a are shown with dashed lines in Figure 4a.
Figure 4b illustrates a second image 38 captured by the camera 23 in the situation shown in Figure 4a, i.e. after the heading of the robotic cleaning device 10 has changed by Δρ as compared to its heading when capturing the first image 37. Again, for illustrational purposes, only the detected light sources are shown. Thus, the second image 38 comprises three luminous sections in the form of the directly reflected laser beam 30, the indirectly reflected laser beam 33 and the fixed sunbeam 35. As can be seen, the position of the directly reflected "real" laser beam 30 is maintained in the second image 38 as compared to the first image 37, since the camera 23 is fixedly arranged with respect to the line laser 27 and thus follows the laser beam 30. However, as was shown in Figure 4a, the indirectly reflected laser beam 33 moves to the left in the second image 38 by a distance corresponding to Δχ1, and the fixed sunbeam 35 will accordingly move to the left in the second image 38, but by a distance corresponding to Δχ2 being smaller than the distance Δχ1. The positions of the luminous sections corresponding to the first image 37 of Figure 3b are shown with dashed lines in Figure 4b. Now, in an embodiment of the present invention, since the luminous section pertaining to the laser beam 30 in the second image 38 in Figure 4b maintains its position with respect to the first image 37, it is considered to be a direct reflection of the line laser 27 impinging on the object to be detected, namely the first wall 31. That is, the extracted image data corresponding to the luminous section caused by the laser beam 30 is correctly detected data. To the contrary, the luminous sections 33 and 35 of the second image 38 in Figure 4b have moved as compared to their positions in the first image 37 of Figure 3b. These two luminous sections are thus false detections (in this case an indirect reflection of the line laser and a fixed light source, respectively), and can advantageously be filtered out from the first and the second image. As a result, "clean" images with no false detections are provided, which will greatly improve the obstacle-detecting capacity of the robotic cleaning device 10 and the ability to create 3D representations from the captured images.
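To illustrate this comparison, the following is a minimal sketch, assuming grayscale images as NumPy arrays; the brightness threshold and the pixel tolerance are hypothetical values, and this is merely one way the position comparison could be realized, not the patented implementation itself:

```python
import numpy as np
from scipy import ndimage

def luminous_centroids(image, brightness_threshold=200):
    """Locate bright regions (candidate luminous sections) in a grayscale
    image and return one centroid (row, col) per region."""
    bright = image >= brightness_threshold
    labels, count = ndimage.label(bright)
    if count == 0:
        return np.empty((0, 2))
    return np.array(ndimage.center_of_mass(bright, labels, list(range(1, count + 1))))

def keep_direct_reflections(first, second, tolerance_px=2.0):
    """Keep only luminous sections that maintain their position between the
    first and second image; sections that moved (indirect reflections or
    fixed light sources) are rejected as false detections."""
    c1 = luminous_centroids(first)
    c2 = luminous_centroids(second)
    kept = []
    for centroid in c1:
        if c2.shape[0] == 0:
            break
        # distance to the nearest luminous section in the second image
        nearest = np.linalg.norm(c2 - centroid, axis=1).min()
        if nearest <= tolerance_px:  # position maintained -> direct reflection
            kept.append(centroid)
    return kept
```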
In order to minimize detrimental effects of ambient light, the line lasers 27, 28 are optionally controlled to emit light at the highest possible power. Further, an optical filter can be arranged in front of the camera 23 to make the camera more sensitive to the light emitted by the line lasers 27, 28. Hence, the optical filter is adapted to a wavelength of the structured light emitted by the line lasers 27, 28.
The estimated position of the robotic cleaning device 10 is typically recorded at the time of capturing the respective image 37, 38 by applying dead reckoning, a known method in which a current position is calculated using locational data pertaining to a previously determined position. Image data of each image is filtered for noise reduction, and image data in the form of a line 30 defining the respective vertical laser line 27, 28 is extracted using any appropriate edge detection method, such as e.g. the Canny edge detection algorithm. Since the respective line extracted from the image data may be grainy, image processing may be further enhanced by extracting the center of the laser lines present in the image, using for instance the so-called center of gravity method on adjacent pixel values in the respective edge-detected laser line to calculate the center of the laser line.
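As an illustration, a minimal sketch of these steps follows, using OpenCV's Canny detector to stand in for "any appropriate edge detection method", an intensity-weighted center-of-gravity refinement per image row, and a simple dead-reckoning pose update; the parameter values and the exact weighting scheme are assumptions, not taken from the application:

```python
import numpy as np
import cv2  # OpenCV; its Canny implementation stands in for the edge detection step

def update_pose(pose, distance, delta_heading):
    """Dead reckoning: advance an (x, y, heading) pose from odometry data,
    i.e. compute the current position from the previously determined one."""
    x, y, heading = pose
    heading += delta_heading
    return (x + distance * np.cos(heading), y + distance * np.sin(heading), heading)

def laser_line_centers(image, low=50, high=150):
    """Extract the laser line from an 8-bit grayscale image and refine it to
    sub-pixel accuracy with a center-of-gravity computation per row."""
    edges = cv2.Canny(image, low, high)  # coarse location of the (grainy) line
    centers = []
    for row in range(image.shape[0]):
        cols = np.flatnonzero(edges[row])
        if cols.size == 0:
            continue  # the laser line does not cross this row
        lo, hi = cols.min(), cols.max() + 1
        weights = image[row, lo:hi].astype(float)
        if weights.sum() > 0:
            # center of gravity over adjacent pixel values in the edge-detected line
            center = lo + np.average(np.arange(hi - lo), weights=weights)
            centers.append((row, center))
    return centers
```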
Since a CMOS camera is equipped with a light sensor array, where each individual light sensor (i.e. pixel) in the array represents detected light from a unique position in space, a recorded image will contain image data representing objects that the line lasers have illuminated, and this image data can further be associated with unique coordinates. From the extracted lines in the captured images 37, 38, a representation of the illuminated vicinity along the projected laser lines 27, 28 can be created. As the position of the robot 10 is recorded when each image is captured, the 2D representations provided by the captured images 37, 38 can be transformed into 3D space and used to build a complete 3D map of the room as the robot moves and continuously records further sections of the room.

Figure 5 illustrates a flowchart of an embodiment of the method of facilitating detection of objects from captured images according to the present invention. In a first step S101, the first and second vertical line lasers 27, 28 of the 3D sensor system 22 illuminate a vicinity of the robotic cleaning device 10 with laser light. As can be seen in Figure 3a, the floor and the first wall 31 of the room to be cleaned are illuminated. Thereafter, in step S102, the camera 23 of the 3D sensor system 22 records a first image 37 of the vicinity from which obstacles may be detected.
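Returning briefly to the transformation into 3D space described above, one common way to realize it is to intersect the camera ray through each extracted line pixel with the known laser plane, and then place the result in the map using the dead-reckoned pose. The following sketch assumes a pinhole camera model with intrinsics fx, fy, cx, cy and particular axis conventions, none of which are specified in the application:

```python
import numpy as np

def pixel_to_camera_3d(row, col, fx, fy, cx, cy, laser_plane):
    """Triangulate: intersect the camera ray through pixel (row, col) with
    the laser plane, given as (n, d) with n . p = d in camera coordinates."""
    ray = np.array([(col - cx) / fx, (row - cy) / fy, 1.0])
    n, d = laser_plane
    t = d / np.dot(n, ray)  # ray parameter where the ray meets the plane
    return t * ray          # 3D point in camera coordinates

def camera_to_world(point_cam, robot_pose):
    """Place a camera-frame point into the world map using the robot's
    dead-reckoned pose (x, y, heading). Assumed conventions: camera z
    points forward, x right, y down; world z is height above the floor."""
    x, y, heading = robot_pose
    c, s = np.cos(heading), np.sin(heading)
    forward, right = point_cam[2], point_cam[0]
    return np.array([x + c * forward + s * right,
                     y + s * forward - c * right,
                     -point_cam[1]])  # image "down" becomes negative height
```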
The controller 16 is capable of positioning the robotic cleaning device 10 with respect to the detected obstacles (and hence a surface to be cleaned) by deriving positional data from the recorded images. From the positioning, the controller 16 controls movement of the robotic cleaning device 10, by means of controlling the wheels 12, 13 via the wheel drive motors 15a, 15b, across the surface to be cleaned. The derived positional data facilitates control of the movement of the robotic cleaning device 10 such that the cleaning device can be navigated to move very close to an object, and to move closely around the object to remove debris from the surface on which the object is located. Hence, the derived positional data is utilized to move flush against the object, being e.g. a wall or a piece of furniture. Typically, the controller 16 continuously generates and transfers control signals to the drive wheels 12, 13 via the drive motors 15a, 15b such that the robotic cleaning device 10 is navigated close to the object.
Now, after the first image 37 has been recorded by the camera 23, the robotic cleaning device 10 changes its position in step S103, as previously described with reference to Figure 4a. Thus, the controller 16 effects a yaw movement of the cleaning device 10 and records Δρ via the gyroscope 24. As previously mentioned, the change in position of the cleaning device 10 is not necessarily brought about by a yaw movement, but could alternatively be undertaken by performing a roll or pitch movement, or even a translation (i.e. moving forward or back), which causes smaller changes in the second image than the rotations do. A combination of these different types of movements can further be envisaged. Further, in case the cleaning device 10 moves across e.g. a thick rug or the like, the cleaning device 10 is "automatically" subjected to these types of movements when the images are captured. Thereafter, in step S104, the controller 16 controls the camera 23 to capture a second image 38. The controller 16 subsequently detects, in step S105, a plurality of luminous sections 30, 33, 35 in the first and the second images 37, 38. By comparing the first and second images captured by the camera 23 in step S106, the controller 16 determines whether the luminous sections 30, 33, 35 maintain their positions in the second image 38. If that is the case, the controller 16 will consider such a luminous section to be a directly reflected laser beam resulting from the line lasers 27, 28 impinging on an object to be detected, in this particular example the first wall 31. Consequently, the image data extracted from the luminous section 30 is kept in the first and second images 37, 38, while the image data extracted from the luminous sections 33, 35 optionally will be removed from the images in step S107, since it relates to an indirectly reflected laser beam and a fixed light source, respectively.
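Purely to summarize the flow, steps S101-S107 could be orchestrated as below; the robot API (lasers_on, camera.capture, yaw) is a hypothetical name scheme, and keep_direct_reflections is the helper sketched earlier in this text:

```python
def facilitate_object_detection(robot, yaw_degrees=2.0):
    """One pass through steps S101-S107 against a hypothetical robot API."""
    robot.lasers_on()                # S101: illuminate the vicinity with structured light
    first = robot.camera.capture()   # S102: capture the first image
    robot.yaw(yaw_degrees)           # S103: change position; delta recorded via the gyroscope
    second = robot.camera.capture()  # S104: capture the second image
    # S105-S106: detect luminous sections and check which maintain position
    direct = keep_direct_reflections(first, second)
    # S107 (optional): sections that moved are false detections and are filtered out
    return direct
```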
With reference again to Figures 4a and b, in a further embodiment of the present invention, indirect reflections such as the laser beam 33 reflected against the second wall 32 can, in relation to a predetermined reference orientation of the robotic cleaning device, be distinguished from fixed light sources such as the sunbeams 35, 36.
For instance, if a clockwise rotation of the robotic cleaning device 10 is regarded as the predetermined reference orientation, the extracted image data caused by the indirect reflection 33 will move to the left in the second image 38 as compared to its position in the first image 37, as will the extracted image data caused by the fixed light source 35, although the movement of the indirect reflection will be greater. A counter-clockwise rotation will instead result in the respective extracted image data moving to the right when comparing the two images 37, 38. Hence, the movement, i.e. displacement, of the extracted image data in the second image 38 will be greater for an indirect reflection 33 than for a fixed light source 35.
Thus, in an embodiment, a measure is evaluated based on a relation between the change in position of the robotic cleaning device 10 (caused e.g. by a yaw, pitch, roll or translation movement of the robotic cleaning device 10) and a displacement Δχ of the extracted image data in the second image 38, to determine whether the luminous section in the second image is an indirect reflection 33 of the structured light impinging on the object 31 or a fixed light source 35. In a further embodiment, a measure k is introduced:
k = Δχ / Δρ,

where, as an example with reference to Figures 4a and b, Δχ is defined as positive for extracted image data moving to the right in the second image 38, and Δρ is defined as positive for rotation in a counter-clockwise direction of the robotic cleaning device 10.
When the robotic cleaning device 10 turns right, as is the case in Figures 4a and b, the extracted image data pertaining to the fixed light source 35 moves a distance Δχ2 to the left, while the extracted image data of the indirect reflection 33 also moves to the left, but by a greater distance Δχ1.
Consequently, the movement of both the indirect reflection 33 and the fixed light source 35 will result in a positive k (a negative Δχ divided by a negative Δρ), but the indirect reflection will result in a greater k, i.e. |Δχ1| > |Δχ2|. The direct reflection will not move, resulting in k = 0.
Thus, with respect to the clockwise rotation of Figures 4a and b, for: (a) the direct reflection in the form of the laser beam 30, Δχ is zero, resulting in k = 0;
(b) the indirect reflection in the form of the laser beam 33, Δχ1 is negative, resulting in k = positive (since Δρ is also negative for the clockwise rotation); and
(c) the fixed light source in the form of the sunbeam 35, Δχ2 is negative, resulting in k = positive, but with a smaller k as compared to the indirect reflection.
Thus, a threshold value can be appropriately set to determine whether the extracted image data should be rejected as false. If k exceeds an appropriately set threshold value, the extracted image data is considered to be caused by an indirect reflection. If k is below the threshold value (but greater than 0, i.e. a displacement of image data is detected), the extracted image data is considered to be caused by a fixed light source.
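As a minimal sketch of this classification, with the threshold and the small-value guard chosen as assumptions rather than values from the application:

```python
def classify_luminous_section(dx_deg, dyaw_deg, threshold=1.5, eps=1e-3):
    """Classify a luminous section from k = dx/dyaw (both in degrees).
    k = 0: direct reflection (kept); large k: indirect reflection;
    small non-zero k: fixed light source (both filtered out)."""
    if abs(dyaw_deg) < eps:
        return "unknown"  # no rotation between the images; measure undefined
    k = dx_deg / dyaw_deg
    if abs(k) < eps:
        return "direct reflection"
    if abs(k) > threshold:
        return "indirect reflection"
    return "fixed light source"
```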
In a practical example, the camera used has a 90° field of view distributed over 752 pixels horizontally. In such an example, if Δχ for the respective extracted image data 30, 33, 35 is multiplied by 90/752, k would be approximately 1 for the fixed light source 35.
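This is consistent with a distant fixed light source displacing in the image by the same angle as the rotation itself. A small worked example with hypothetical figures (a 5° yaw and a 42-pixel displacement, neither taken from the application):

```python
FOV_DEG, WIDTH_PX = 90.0, 752        # camera parameters from the example above
DEG_PER_PX = FOV_DEG / WIDTH_PX      # ~0.12 degrees per pixel

dyaw_deg = 5.0                       # hypothetical yaw between the two images
dx_px = 42                           # hypothetical displacement of the fixed light source
k = (dx_px * DEG_PER_PX) / dyaw_deg  # ~1.0, as expected for a fixed light source
print(round(k, 2))                   # 1.01
```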
The invention has mainly been described above with reference to a few embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the invention, as defined by the appended patent claims.

Claims

1. A method for a robotic cleaning device (10) of facilitating detection of objects from captured images, the method comprising:
illuminating (S101) a vicinity of the robotic cleaning device with structured light;
capturing (S102) a first image (37) of the vicinity of the robotic cleaning device;
changing (S103) position of the robotic cleaning device;
capturing (S104) a second image (38) of the vicinity of the robotic cleaning device;
detecting (S105) at least one luminous section (30, 33, 35) in the first and second image; and
determining (S106), when comparing the second image and the first image, whether the luminous section maintains its position in the second image, in which case the luminous section (30) is considered to be a direct reflection of the structured light impinging on an object (31) in said vicinity.
2. The method of claim 1, further comprising:
filtering (S107) out the luminous section from the first and second image if the luminous section does not maintain its position in the second image.
3. The method of claims 1 or 2, wherein the determining further comprises:
evaluating, when comparing the second image (38) and the first image (37), a measure based on a relation between the change in position of the robotic cleaning device (10) and a displacement (Δχ) of the luminous section (33, 35) in the second image to determine whether the luminous section is an indirect reflection of the structured light impinging on the object (31) or a fixed light source in said vicinity.
4. The method of claim 3, wherein the measure (k) is based on a ratio of the displacement (Δχ) of the luminous section (33, 35) in the second image (38) to a property (Δρ) of the change in position of the robotic cleaning device (10), the luminous section being determined to be an indirect reflection of the structured light impinging on the object (31) if a magnitude of the ratio exceeds a threshold value, and being determined to be a fixed light source in said vicinity if the magnitude is below the threshold value.
5. Robotic cleaning device (10) comprising:
a propulsion system (12, 13, 15a, 15b) arranged to move the robotic cleaning device (10);
at least one light source (27) arranged to illuminate a vicinity of the robotic cleaning device with structured light;
a camera device (23) arranged to capture images of the vicinity of the robotic cleaning device;
a controller (16) arranged to control the propulsion system to move the robotic cleaning device; wherein
the controller (16) further is arranged to control the camera device to capture a first image (37), to control the propulsion system to change position of the robotic cleaning device, to control the camera device to capture a second image (38), to detect at least one luminous section (30, 33, 35) in the first and second image, to determine, when comparing the second image and the first image, whether the luminous section maintains its position in the second image, in which case the luminous section (30) is considered to be a direct reflection of the structured light impinging on an object (31) in said vicinity.
6. The robotic cleaning device (10) of claim 5, the controller (16) further being arranged to filter out the luminous section (33, 35) from the first and second image (37, 38) in case the luminous section does not maintain its position in the second image.
7. The robotic cleaning device (10) of claims 5 or 6, the controller (16) further being arranged to detect the object (31) from the captured images (37, 38).
8. The robotic cleaning device (10) of claim 7, further comprising:
a position-measuring device (24) arranged to measure the position of the robotic cleaning device (10), the controller (16) further being arranged to: position the robotic cleaning device with respect to the detected object (31) from the captured images (37, 38) and positional data provided by the position-measuring device, wherein the controlling of the movement of the robotic cleaning device is performed on the basis of the positioning.
9. The robotic cleaning device (10) of claim 8, the position-measuring device (24) comprising an accelerometer and/or a gyroscope.
10. The robotic cleaning device (10) of any one of claims 5-8, said at least one light source comprising:
a first and second vertical line laser (27, 28) arranged to illuminate said vicinity of the robotic cleaning device.
11. The robotic cleaning device (10) of claim 10, wherein said first (27) and second (28) line lasers are arranged on a respective side of the camera device (23) along an axis being perpendicular to an optical axis of the camera device.
12. The robotic cleaning device (10) according to any one of claims 5-11, the controller (16) further being arranged to:
evaluate, when comparing the second image (38) and the first image (37), a measure based on a relation between the change in position of the robotic cleaning device (10) and a displacement (Δχ) of the luminous section (33, 35) in the second image to determine whether the luminous section is an indirect reflection of the structured light impinging on the object (31) or a fixed light source in said vicinity.
13. The robotic cleaning device (10) according to claim 12, wherein the measure (k) is based on a ratio of the displacement (Δχ) of the luminous section (33, 35) in the second image (38) to a property (Δρ) of the change in position of the robotic cleaning device (10), the luminous section being determined to be an indirect reflection of the structured light impinging on the object (31) if a magnitude of the ratio exceeds a threshold value, and being determined to be a fixed light source in said vicinity if the magnitude is below the threshold value.
14. A computer program (25) comprising computer-executable instructions for causing a device (10) to perform the steps recited in any one of claims 1-4 when the computer-executable instructions are executed on a controller (16) included in the device.
15. A computer program product comprising a computer readable medium (26), the computer readable medium having the computer program (25) according to claim 14 embodied therein.
PCT/EP2014/078143 2014-07-10 2014-12-17 Method in a robotic cleaning device for facilitating detection of objects from captured images WO2016005011A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1450885-7 2014-07-10
SE1450885 2014-07-10

Publications (1)

Publication Number Publication Date
WO2016005011A1 true WO2016005011A1 (en) 2016-01-14

Family

ID=52146473

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2014/078143 WO2016005011A1 (en) 2014-07-10 2014-12-17 Method in a robotic cleaning device for facilitating detection of objects from captured images

Country Status (1)

Country Link
WO (1) WO2016005011A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5155775A (en) * 1988-10-13 1992-10-13 Brown C David Structured illumination autonomous machine vision system
US20030194110A1 (en) * 2002-04-16 2003-10-16 Koninklijke Philips Electronics N.V. Discriminating between changes in lighting and movement of objects in a series of images using different methods depending on optically detectable surface characteristics
US20060165276A1 (en) * 2005-01-25 2006-07-27 Samsung Electronics Co., Ltd Apparatus and method for estimating location of mobile body and generating map of mobile body environment using upper image of mobile body environment, and computer readable recording medium storing computer program controlling the apparatus
WO2014033055A1 (en) * 2012-08-27 2014-03-06 Aktiebolaget Electrolux Robot positioning system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MAY S ET AL: "Robust 3D-mapping with time-of-flight cameras", INTELLIGENT ROBOTS AND SYSTEMS, 2009. IROS 2009. IEEE/RSJ INTERNATIONAL CONFERENCE ON, IEEE, PISCATAWAY, NJ, USA, 10 October 2009 (2009-10-10), pages 1673 - 1678, XP031581042, ISBN: 978-1-4244-3803-7 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10877484B2 (en) 2014-12-10 2020-12-29 Aktiebolaget Electrolux Using laser sensor for floor type detection
US11099554B2 (en) 2015-04-17 2021-08-24 Aktiebolaget Electrolux Robotic cleaning device and a method of controlling the robotic cleaning device
US11712142B2 (en) 2015-09-03 2023-08-01 Aktiebolaget Electrolux System of robotic cleaning devices
CN108603935A (en) * 2016-03-15 2018-09-28 伊莱克斯公司 The method that robotic cleaning device and robotic cleaning device carry out cliff detection
US11169533B2 (en) 2016-03-15 2021-11-09 Aktiebolaget Electrolux Robotic cleaning device and a method at the robotic cleaning device of performing cliff detection
WO2018219473A1 (en) * 2017-06-02 2018-12-06 Aktiebolaget Electrolux Method of detecting a difference in level of a surface in front of a robotic cleaning device
CN110621208A (en) * 2017-06-02 2019-12-27 伊莱克斯公司 Method for detecting a height difference of a surface in front of a robotic cleaning device
JP2020522288A (en) * 2017-06-02 2020-07-30 アクチエボラゲット エレクトロルックス How to detect level differences on the front surface of a robot cleaning device
JP7243967B2 (en) 2017-06-02 2023-03-22 アクチエボラゲット エレクトロルックス Method for Detecting Level Differences on a Surface in Front of a Robotic Cleaning Device
US11474533B2 (en) 2017-06-02 2022-10-18 Aktiebolaget Electrolux Method of detecting a difference in level of a surface in front of a robotic cleaning device
EP3432107A1 (en) * 2017-07-21 2019-01-23 LG Electronics Inc. Cleaning robot and controlling method thereof
US11921517B2 (en) 2017-09-26 2024-03-05 Aktiebolaget Electrolux Controlling movement of a robotic cleaning device
US11507097B2 (en) * 2018-02-05 2022-11-22 Pixart Imaging Inc. Control apparatus for auto clean machine and auto clean machine control method
US20210212541A1 (en) * 2018-05-16 2021-07-15 Lg Electronics Inc. Vacuum cleaner and control method thereof
CN109033136B (en) * 2018-06-05 2021-06-29 北京智行者科技有限公司 Operation map updating method
CN109033136A (en) * 2018-06-05 2018-12-18 北京智行者科技有限公司 A kind of operation map updating method

Similar Documents

Publication Publication Date Title
US10877484B2 (en) Using laser sensor for floor type detection
WO2016005011A1 (en) Method in a robotic cleaning device for facilitating detection of objects from captured images
US10678251B2 (en) Cleaning method for a robotic cleaning device
US10149589B2 (en) Sensing climb of obstacle of a robotic cleaning device
US9946263B2 (en) Prioritizing cleaning areas
JP6202544B2 (en) Robot positioning system
US11474533B2 (en) Method of detecting a difference in level of a surface in front of a robotic cleaning device
US11169533B2 (en) Robotic cleaning device and a method at the robotic cleaning device of performing cliff detection
US20180242806A1 (en) System of robotic cleaning devices
US20220299650A1 (en) Detecting objects using a line array
US20190246852A1 (en) Robotic cleaning device and a method of controlling movement of the robotic cleaning device
WO2017108077A1 (en) Controlling movement of a robotic cleaning device
WO2024008279A1 (en) Robotic cleaning device using optical sensor for navigation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14818947

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14818947

Country of ref document: EP

Kind code of ref document: A1