US20150367780A1 - Method for ascertaining the heart rate of the driver of a vehicle - Google Patents

Method for ascertaining the heart rate of the driver of a vehicle

Info

Publication number
US20150367780A1
Authority
US
United States
Prior art keywords
driver
camera
recited
heart rate
eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US14/744,158
Other versions
US10043074B2 (en)
Inventor
Joerg Hilsebecher
Philippe Dreuw
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Assigned to ROBERT BOSCH GMBH. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HILSEBECHER, JOERG; DREUW, Philippe
Publication of US20150367780A1 publication Critical patent/US20150367780A1/en
Application granted granted Critical
Publication of US10043074B2 publication Critical patent/US10043074B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/163Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/18Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • G06K9/00604
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/8006Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying scenes of vehicle interior, e.g. for monitoring passengers or cargo


Abstract

In a method for ascertaining the heart rate of the driver of a vehicle, the eye position is determined with the aid of a camera, and an image section of the driver's head is acquired. To ascertain the heart rate, successive images which were recorded over time with the aid of the camera are analyzed.

Description

    FIELD OF THE INVENTION
  • The present invention relates to a method for ascertaining the heart rate of the driver of a vehicle.
  • BACKGROUND INFORMATION
  • From German Published Patent Application No. 10 2010 023 369 A1 it is already known to integrate an electrode into a vehicle seat for the capacitive measurement of biological signals, so that an electrocardiogram of a vehicle passenger can be recorded in a contact-free manner. The electrode is made up of multiple electrically active layers situated directly on top of one another, which are configured as a sensor layer, a shield layer and a grounding layer with insulating layers interposed between them. In the capacitive measurement for ascertaining the electrocardiogram, one layer of the electrode forms one plate of an electrical plate-type capacitor, and the skin surface of the driver forms the second plate. The information from the ascertained electrocardiogram can be used for the vehicle control or transmitted wirelessly to an external station.
  • SUMMARY
  • The present invention is based on the objective of ascertaining the heart rate of the driver of a vehicle with high precision.
  • The method of the present invention makes it possible to determine the heart rate of the driver of a vehicle in a contactless manner and with high precision. This is accomplished with the aid of a camera, which is aimed at the head of the driver and determines the eye position of the driver. The images recorded by the camera are then analyzed: multiple temporally successive images from the camera are examined in order to ascertain the heart rate. The analysis pertains to an image section of the driver's head; changes in this image section across the temporally successive images reveal the heart rate of the driver.
  • This procedure has the advantage that determining the eye position also fixes, with high precision, the image section of the driver's head from which the heart rate is ascertained. As soon as the eye position has been determined, the image section of interest is known very precisely via its position relative to the eye, and can then be analyzed in a subsequent evaluation on the basis of multiple successively recorded camera images. Ascertaining the eye position therefore constitutes the first step, which is followed by a second step in which the ascertained image section of the driver's head is analyzed. Both the eye position and the image section to be examined for ascertaining the heart rate are preferably recorded in a joint image of the camera and analyzed.
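To make the two-step structure concrete, the following Python sketch is offered purely as an illustration; it is not part of the patent. The helper names `locate_eye` and `estimate_heart_rate` are hypothetical and are sketched further below; the offsets and the section size in `roi_from_eye` are arbitrary illustration values.

```python
# Minimal sketch of the two-step procedure (illustration only, not from the patent).
# locate_eye() and estimate_heart_rate() are hypothetical helpers sketched further below.
import numpy as np

def roi_from_eye(frame, eye_xy, dx=0, dy=40, w=60, h=30):
    """Crop an image section at a fixed offset from the detected eye position.
    Offsets and size are arbitrary illustration values."""
    x, y = int(eye_xy[0]) + dx, int(eye_xy[1]) + dy
    return frame[y:y + h, x:x + w]

def process_frames(frames, fps):
    """frames: temporally successive grayscale images of the driver's head."""
    samples = []
    for frame in frames:
        eye_xy = locate_eye(frame)            # step 1: eye position (e.g. via corneal reflection)
        if eye_xy is None:
            continue                          # no reliable eye position in this image
        roi = roi_from_eye(frame, eye_xy)     # step 2: image section known relative to the eye
        if roi.size == 0:
            continue
        samples.append(float(np.mean(roi)))   # one brightness sample per image
    return estimate_heart_rate(np.asarray(samples), fps)
```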
  • If appropriate, the eye position can be considered in a further context as well, e.g., for ascertaining the line of sight and consequently also the head posture of the driver, which may be taken into account for instance when evaluating the attentiveness state of the driver, or for providing additional information, such as the projection of information onto the windshield. The ascertainment of the eye position therefore constitutes information that is able to be utilized multiple times in different ways.
  • To detect the eye position, a camera which operates in the infrared range, and optionally an infrared illumination device, are advantageously used, via which infrared radiation, especially in the near-infrared range, is emitted in the direction of the driver. A characteristic highlight on the cornea in the region of the pupil of the driver's eye is created because the eye reflects the infrared light in the manner of a convex mirror (corneal reflection). The highlight on the cornea is visible in the recorded image of the infrared camera, so that it can be utilized to ascertain the eye position. If appropriate, the camera image may be subjected to further image processing steps, such as a feature extraction and a determination of the center of the pupil, in order to ascertain the line of sight.
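The patent does not prescribe how the highlight is found in the image. As one possible, purely illustrative realization of the hypothetical `locate_eye` helper above, the corneal reflection can be treated as the brightest compact spot in the near-infrared frame; the sketch below uses OpenCV (4.x API), and the threshold value is an assumption, not a value from the patent.

```python
import cv2
import numpy as np

def locate_eye(frame_ir, min_val=200):
    """Return the (x, y) centroid of the brightest compact spot in an 8-bit
    near-infrared frame, taken here as the corneal reflection, or None if no
    clear highlight is found. min_val is an illustrative threshold."""
    blurred = cv2.GaussianBlur(frame_ir, (5, 5), 0)            # suppress sensor noise
    _, mask = cv2.threshold(blurred, min_val, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    spot = max(contours, key=cv2.contourArea)                  # largest bright blob
    m = cv2.moments(spot)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])          # centroid of the highlight
```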
  • It is also possible to ascertain the eye position using light in the visible range. Here, too, a reflecting point of light is produced on the cornea of the eye, which can then be recorded by a camera operating with visible light.
  • An image section of the camera image is analyzed in multiple successive steps in order to ascertain the heart rate. The image section contains a defined facial region, such as an eye or both eyes of the driver, or a facial region beyond the eyes, such as the lips, cheeks or the forehead. The image area including this facial region, which either lies at a specified distance from the eyes or contains one eye or both eyes, is able to be analyzed once successive image recordings are available, for instance with the aid of an image frequency analysis method.
  • In the analysis, the recorded images in the image section of interest are examined with regard to changes in color or changes in size, especially changes in volume. Because of the pulsation caused by the heartbeat, the color and/or the size of the examined facial region in the image section changes rhythmically, which is able to be determined by the appropriate analysis methods. The heart rate results from this temporal change in color or size.
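As a purely illustrative sketch of such an image-frequency analysis (the patent does not specify an algorithm): the mean brightness of the image section is tracked over the successive images and the dominant frequency is searched for in a physiologically plausible band. The band limits of roughly 0.7–3 Hz (about 42–180 beats per minute) and the use of a plain FFT are assumptions for the example; this function stands in for the hypothetical `estimate_heart_rate` helper used earlier.

```python
import numpy as np

def estimate_heart_rate(samples, fps, f_lo=0.7, f_hi=3.0):
    """Estimate the heart rate in beats per minute from a time series of mean
    brightness (or size) values of the examined image section.
    f_lo/f_hi bound the plausible band (illustrative values)."""
    samples = np.asarray(samples, dtype=float)
    if samples.size < 2:
        return None                                  # not enough successive images
    samples = samples - samples.mean()               # remove the constant component
    spectrum = np.abs(np.fft.rfft(samples))          # magnitude spectrum of the pulsation
    freqs = np.fft.rfftfreq(samples.size, d=1.0 / fps)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    if not np.any(band):
        return None                                  # recording too short for this band
    peak = freqs[band][np.argmax(spectrum[band])]    # dominant pulsation frequency
    return 60.0 * peak
```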
  • In an analysis of the eye, the eyeball in particular is examined, which pulsates in synchrony with the cardiac rhythm (fundus pulsation) and is subject to a corresponding change in color and volume, which is able to be recorded.
  • If appropriate, an additional detection method may be used to determine the head position, so that the information and data obtained with the aid of the camera become more reliable. If the additional detection method does not supply an unambiguous signal, the information obtained from the camera may be discarded, at least with regard to the heart-rate analysis, so that such an analysis is dispensed with. The additional detection system is used, for example, to determine the head position. The detection system may also be designed as a voice detection system, which ascertains whether the driver is talking; the direction from which the voice originates can then be used to infer the position of the driver's mouth, for instance.
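A plausibility check of this kind can be as simple as discarding the camera-based estimate whenever the head position reported by the additional sensor does not agree with the camera. The sketch below is a hypothetical illustration; both positions are assumed to be given in normalized image coordinates, and the tolerance is an invented value.

```python
def gated_heart_rate(camera_estimate, head_pos_camera, head_pos_aux, tol=0.15):
    """Return the camera-based heart rate only if the head position derived from
    the additional detection system confirms the camera; otherwise discard it.
    Positions are assumed to be normalized (x, y) coordinates; tol is illustrative."""
    if camera_estimate is None or head_pos_aux is None:
        return None
    dx = abs(head_pos_camera[0] - head_pos_aux[0])
    dy = abs(head_pos_camera[1] - head_pos_aux[1])
    if max(dx, dy) > tol:      # no unambiguous confirmation -> dispense with the analysis
        return None
    return camera_estimate
```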
  • The information regarding the heart rate can be used further in the vehicle, for instance in a driver assistance system which is parameterized as a function of the heart rate. For example, if the heart rate suggests that the driver's health is compromised, it may be advantageous to parameterize a driver assistance system, such as a brake assistant, with reduced trigger thresholds. In addition, it is also possible to make the information available in the vehicle in an optical, acoustic or haptic manner. Finally, the heart rate may also be transmitted to an external location outside of the vehicle, for instance to an emergency hotline.
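Purely as an illustration of such a parameterization (the patent names no concrete values), a heart rate outside an assumed normal band could lower the trigger threshold of a brake assistant; every number in the sketch is invented for the example.

```python
def brake_assist_threshold(heart_rate_bpm, normal_band=(50.0, 110.0),
                           base_threshold=1.0, reduced_threshold=0.7):
    """Map the ascertained heart rate onto a trigger threshold for a brake
    assistant. All numeric values are illustrative assumptions."""
    if heart_rate_bpm is None:
        return base_threshold              # no reliable estimate -> keep default behaviour
    lo, hi = normal_band
    if heart_rate_bpm < lo or heart_rate_bpm > hi:
        return reduced_threshold           # suspected impairment -> trigger earlier
    return base_threshold
```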
  • The method is advantageously executed in a system in the vehicle which includes a camera for ascertaining the eye position of the driver, and an evaluation unit for analyzing a recorded image section of the driver's head. The camera, for example, is an infrared camera, which may be equipped with an illumination unit, preferably for generating infrared radiation, especially in the near-infrared range.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows a schematic illustration of a system for ascertaining the heart rate of the driver in a vehicle.
  • FIG. 2 shows, in a perspective view, the passenger compartment of a vehicle having cameras at different positions for recording the head region of the driver.
  • FIG. 3 shows a schematic representation of an infrared-sensitive camera for detecting the head region of the driver.
  • DETAILED DESCRIPTION
  • FIG. 1 shows a system 1 for ascertaining the heart rate of the driver in a vehicle. System 1 is installed in a vehicle and makes it possible to record the head region of the driver with the aid of one or more camera(s), and furthermore to ascertain the heart rate on the basis of the analysis of multiple successive images recorded by the camera.
  • System 1 includes at least one camera 3, which operates in particular in the near-infrared range, and furthermore an additional detection system 2, which includes at least one passenger compartment sensor with the aid of which the driver, or an utterance or reaction of the driver, can be detected acoustically, haptically or optically, independently of and in addition to camera 3. Detection system 2, for example, is a microphone in the vehicle interior for recording the utterances made inside the vehicle. It is advantageous that the source of the spoken words, that is to say the position of the driver's head and especially of the driver's mouth, is able to be detected with the aid of detection system 2.
  • Camera 3, which preferably operates in the near infrared range, may be part of a gaze detection device, which is used to ascertain the driver's line of sight.
  • Camera 3 may be assigned an infrared source, such as an LED, for generating a weak infrared signal, which is reflected by the eye of the driver and recorded by the infrared-sensitive camera. In the process, the corneal reflex in the eye is recorded by the camera, from which the eye position of the driver is inferred.
  • System 1 additionally includes a processing and evaluation unit 4, in which the data are analyzed and the output signals for actuating display devices 5, 6 and 7 are generated. Display devices 5, 6 and 7 preferably are optical, acoustic and/or haptic display devices for representing the heart rate of the driver ascertained in processing and evaluation unit 4. In addition or as an alternative to a display device, an actuation of a driver assistance system in the vehicle via processing and evaluation unit 4 is conceivable as well. For example, given a raised heart rate which indicates an indisposition or an acute risk state of the driver, the parameterization of a driver assistance system, such as a brake assistant, may be changed in favor of lower trigger thresholds.
  • The current heart rate of the driver is determined by analyzing multiple successive images, which are recorded with the aid of camera 3 in system 1. In the process, a certain part of the driver's head is examined and analyzed, preferably an eye or both eyes of the driver, while other head regions, especially facial regions such as the lips, cheeks or forehead, may be considered as well. The pulsation in the body induces a change in color and/or size in the examined facial regions that varies with the heart rate and can be ascertained by analyzing the successive images. For example, the fundus pulsation, i.e., the change in the size of the eye, is examined and analyzed in order to ascertain the heart rate. To do so, the recorded images may be analyzed using an image frequency analysis method.
  • FIG. 2 illustrates the interior of a vehicle in which cameras 3, which are part of a system for ascertaining the heart rate of the driver, have been installed at different positions. A first camera 3a is located in the dashboard area, directly below the steering wheel. A camera 3b is situated on the steering wheel, and further cameras 3c and 3d are located in the left and right A-pillar, respectively. Cameras 3e, 3f and 3g are disposed at different heights in the center of the dashboard area. Camera 3a is integrated into the interior mirror.
  • It basically suffices to provide only one camera 3 in the passenger compartment, which is focused on the head of the driver. However, it may be advantageous to provide a plurality of cameras 3 in the passenger compartment, in order to be able to record the eyes of the driver at different head positions, and to be able to derive the heart rate on this basis.
  • As can be gathered from FIG. 3, the orientation of camera 3 with respect to the head of the driver is such that the facial region around the eyes is able to be recorded. Using an infrared illumination unit, which may be integrated into camera 3, the driver's eye is illuminated, and the corneal reflex in the eye is recorded via the camera, from which the eye position of the driver is ascertained in a very precise manner. By recording multiple successive images with the aid of camera 3 which cover a time interval of multiple heartbeats, the heart rate is able to be determined by way of analysis, especially in the frequency range, for instance on the basis of the fundus pulsation.
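The requirement that the recording cover several heartbeats translates directly into a minimum analysis window. A short worked sketch under assumed values (a 30 frames-per-second camera and a 10-second window; neither number appears in the patent): the frequency resolution of a plain FFT over N frames is fps/N.

```python
fps = 30.0                       # assumed camera frame rate (illustration only)
window_s = 10.0                  # assumed window covering several heartbeats
n_frames = int(fps * window_s)   # 300 successive images per analysis window
resolution_hz = fps / n_frames   # 0.1 Hz frequency resolution of a plain FFT
resolution_bpm = 60.0 * resolution_hz
print(n_frames, resolution_bpm)  # 300 frames, about 6 beats per minute resolution
```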

Claims (15)

What is claimed is:
1. A method for ascertaining a heart rate of a driver of a vehicle, comprising:
ascertaining, via a camera, an eye position of an eye of the driver;
recording an image section of a head of the driver; and
analyzing successive images recorded by the camera in order to ascertain the heart rate.
2. The method as recited in claim 1, wherein the eye position of the driver is ascertained based on a corneal reflex in the eye.
3. The method as recited in claim 1, wherein images of the eye are analyzed in order to ascertain the heart rate.
4. The method as recited in claim 1, wherein images of a defined facial region outside of the eye are analyzed in order to ascertain the heart rate.
5. The method as recited in claim 4, wherein the defined facial region outside of the eye includes lips.
6. The method as recited in claim 1, wherein the recorded images are analyzed with regard to a change in color.
7. The method as recited in claim 1, wherein the recorded images are analyzed with regard to a change in a size of a facial region.
8. The method as recited in claim 1, wherein the recorded images are analyzed using an image frequency analysis method.
9. The method as recited in claim 1, wherein a line of sight of the driver is ascertained from the eye position.
10. The method as recited in claim 1, wherein the camera includes an infrared camera.
11. The method as recited in claim 1, wherein the camera operates in a near-infrared range.
12. The method as recited in claim 10, wherein an infrared illumination unit is used, via which infrared radiation is generated.
13. The method as recited in claim 1, further comprising an additional detection system for increasing a reliability of information obtained by analyzing the camera images.
14. The method as recited in claim 13, wherein the additional detection system ascertains whether the driver is speaking.
15. A system for a vehicle for ascertaining a heart rate of a driver, comprising:
a camera for ascertaining an eye position of an eye of the driver and for recording an image section of a head of the driver; and
an evaluation unit for analyzing successive images recorded by the camera in order to ascertain the heart rate.
US14/744,158 2014-06-20 2015-06-19 Method for ascertaining the heart rate of the driver of a vehicle Active 2036-03-15 US10043074B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102014211882.4A DE102014211882A1 (en) 2014-06-20 2014-06-20 Method for determining the heart rate of the driver of a vehicle
DE102014211882 2014-06-20
DE102014211882.4 2014-06-20

Publications (2)

Publication Number Publication Date
US20150367780A1 (en) 2015-12-24
US10043074B2 US10043074B2 (en) 2018-08-07

Family

ID=54767916

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/744,158 Active 2036-03-15 US10043074B2 (en) 2014-06-20 2015-06-19 Method for ascertaining the heart rate of the driver of a vehicle

Country Status (3)

Country Link
US (1) US10043074B2 (en)
CN (1) CN105193402A (en)
DE (1) DE102014211882A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017154213A1 (en) * 2016-03-11 2017-09-14 三菱電機株式会社 Vehicle-mounted device, warning output method, and warning output program
CN109409172B (en) * 2017-08-18 2021-08-13 安徽三联交通应用技术股份有限公司 Driver sight line detection method, system, medium, and apparatus
EP3666169A1 (en) * 2018-12-11 2020-06-17 Aptiv Technologies Limited Driver monitoring system
US11527081B2 (en) 2020-10-20 2022-12-13 Toyota Research Institute, Inc. Multiple in-cabin cameras and lighting sources for driver monitoring


Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998049028A1 (en) * 1997-04-25 1998-11-05 Applied Science Group, Inc. An alertness monitor
CN1433911A (en) * 2002-01-22 2003-08-06 王廷森 Intelligent vehicle robbing-resistant facilities
US7484646B1 (en) * 2004-09-23 2009-02-03 The United States Of America As Represented By The Secretary Of The Navy Dive mask index bracket
CN1614640A (en) * 2004-12-10 2005-05-11 清华大学 Onboard apparatus for monitoring vehicle status in running
US7363844B2 (en) * 2006-03-15 2008-04-29 James Barton Remotely operated, underwater non-destructive ordnance recovery system and method
EP2486539B1 (en) * 2009-10-06 2016-09-07 Koninklijke Philips N.V. Method and system for obtaining a first signal for analysis to characterize at least one periodic component thereof
DE102010023369A1 (en) 2010-06-10 2010-12-30 Daimler Ag Vehicle i.e. lorry, has electrode built in sensor, shielding, mass and isolating layers directly lying one upon another for capacitive measurement of biological signals of occupant in seat and/or couch
US20140375785A1 (en) * 2013-06-19 2014-12-25 Raytheon Company Imaging-based monitoring of stress and fatigue

Patent Citations (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8604932B2 (en) * 1992-05-05 2013-12-10 American Vehicular Sciences, LLC Driver fatigue monitoring system and method
US9129505B2 (en) * 1995-06-07 2015-09-08 American Vehicular Sciences Llc Driver fatigue monitoring system and method
US20020188219A1 (en) * 2001-06-06 2002-12-12 Eytan Suchard Method and apparatus for inferring physical/mental fitness through eye response analysis
US8322855B2 (en) * 2004-05-06 2012-12-04 Indo Internacional S.A. Method for determining the visual behaviour of a Person
US7602278B2 (en) * 2005-01-19 2009-10-13 Takata-Petri Ag Steering wheel assembly for a motor vehicle
US7835834B2 (en) * 2005-05-16 2010-11-16 Delphi Technologies, Inc. Method of mitigating driver distraction
US20070291983A1 (en) * 2006-06-14 2007-12-20 Hammoud Riad I Method of tracking a human eye in a video image
US7982618B2 (en) * 2007-01-29 2011-07-19 Denso Corporation Wakefulness maintaining apparatus and method of maintaining wakefulness
US8077915B2 (en) * 2007-10-12 2011-12-13 Sony Ericsson Mobile Communications Ab Obtaining information by tracking a user
US8274578B2 (en) * 2008-05-15 2012-09-25 Sungkyunkwan University Foundation For Corporate Collaboration Gaze tracking apparatus and method using difference image entropy
US8986218B2 (en) * 2008-07-09 2015-03-24 Imotions A/S System and method for calibrating and normalizing eye data in emotional testing
US8849845B2 (en) * 2010-11-03 2014-09-30 Blackberry Limited System and method for displaying search results on electronic devices
US20120150387A1 (en) * 2010-12-10 2012-06-14 Tk Holdings Inc. System for monitoring a vehicle driver
US9542847B2 (en) * 2011-02-16 2017-01-10 Toyota Motor Engineering & Manufacturing North America, Inc. Lane departure warning/assistance method and system having a threshold adjusted based on driver impairment determination using pupil size and driving patterns
US20140276090A1 (en) * 2011-03-14 2014-09-18 American Vehcular Sciences Llc Driver health and fatigue monitoring system and method using optics
US8754388B2 (en) * 2011-03-16 2014-06-17 Controlrad Systems, Inc. Radiation control and minimization system and method using collimation/filtering
US8885882B1 (en) * 2011-07-14 2014-11-11 The Research Foundation For The State University Of New York Real time eye tracking for human computer interaction
US9043042B2 (en) * 2011-07-19 2015-05-26 GM Global Technology Operations LLC Method to map gaze position to information display in vehicle
US8988350B2 (en) * 2011-08-20 2015-03-24 Buckyball Mobile, Inc Method and system of user authentication with bioresponse data
US9554185B2 (en) * 2011-12-15 2017-01-24 Arris Enterprises, Inc. Supporting multiple attention-based, user-interaction modes
US20150379362A1 (en) * 2013-02-21 2015-12-31 Iee International Electronics & Engineering S.A. Imaging device based occupant monitoring system supporting multiple functions
US9380976B2 (en) * 2013-03-11 2016-07-05 Sync-Think, Inc. Optical neuroinformatics
US20140276104A1 (en) * 2013-03-14 2014-09-18 Nongjian Tao System and method for non-contact monitoring of physiological parameters
US9120379B2 (en) * 2013-09-25 2015-09-01 Denso International America, Inc. Adaptive instrument display using eye tracking
US20160342206A1 (en) * 2014-01-29 2016-11-24 Tarke A Shazly Eye and head tracking device
US20160110868A1 (en) * 2014-10-20 2016-04-21 Microsoft Corporation Facial Skin Mask Generation for Heart Rate Detection

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018034781A1 (en) * 2016-08-16 2018-02-22 Honda Motor Co., Ltd. Vehicle data selection system for modifying automated driving functionalities and method thereof
US10759424B2 (en) 2016-08-16 2020-09-01 Honda Motor Co., Ltd. Vehicle data selection system for modifying automated driving functionalities and method thereof
DE102017216328B3 (en) * 2017-09-14 2018-12-13 Audi Ag A method for monitoring a state of attention of a person, processing device, storage medium, and motor vehicle
CN111753586A (en) * 2019-03-28 2020-10-09 合肥工业大学 Fatigue driving monitoring device and method
JP2020199848A (en) * 2019-06-07 2020-12-17 本田技研工業株式会社 Driver state detection device
CN111652036A (en) * 2020-03-30 2020-09-11 华南理工大学 Fatigue driving identification method based on fusion of heart rate and facial features of vision
WO2021205508A1 (en) * 2020-04-06 2021-10-14 三菱電機株式会社 Biological information acquisition device and biological information acquisition method
JP7391188B2 (en) 2020-04-06 2023-12-04 三菱電機株式会社 Biometric information acquisition device and biometric information acquisition method

Also Published As

Publication number Publication date
CN105193402A (en) 2015-12-30
US10043074B2 (en) 2018-08-07
DE102014211882A1 (en) 2015-12-24


Legal Events

Date Code Title Description
AS Assignment

Owner name: ROBERT BOSCH GMBH, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HILSEBECHER, JOERG;DREUW, PHILIPPE;SIGNING DATES FROM 20150715 TO 20150720;REEL/FRAME:037290/0720

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4