US20140093135A1 - Systems and methods for three-dimensional interaction monitoring in an EMS environment - Google Patents


Info

Publication number
US20140093135A1
US20140093135A1 (Application US14/040,159)
Authority
US
United States
Prior art keywords: occurrence, human, condition, emergency, patient
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/040,159
Inventor
C. Shane Reid
Chad Ashmore
Robert H. GOTSCHALL
Martin BURES
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zoll Medical Corp
Original Assignee
Zoll Medical Corp
Application filed by Zoll Medical Corp filed Critical Zoll Medical Corp
Priority to US14/040,159
Publication of US20140093135A1
Assigned to ZOLL MEDICAL CORPORATION. Assignors: ASHMORE, CHAD; GOTSCHALL, ROBERT H.; REID, C. SHANE; BURES, Martin
Status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services
    • G06Q 50/22 Social work
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 10/00 ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H 10/60 ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/10 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to drugs or medications, e.g. for ensuring correct administration to patients
    • G16H 20/13 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to drugs or medications, e.g. for ensuring correct administration to patients delivered from dispensers
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/67 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation

Definitions

  • Embodiments of the present invention relate generally to gesture recognition and three-dimensional interaction tracking in an emergency medical services environment.
  • In an emergency medical services (“EMS”) or first responder environment, caregivers must often focus more acutely on patient care in a shorter amount of time and with a greater number of uncertainties and variables than their counterparts in a hospital setting. Creating a record of the EMS caregiver's encounter with a patient, however, remains important. Manual input of information into patient charting systems (e.g. by typing or by writing) can sometimes take valuable time and attention away from patient care, can be distracting, and can often be inaccurately recreated from memory after an EMS encounter.
  • a method for tracking interactions in an emergency response environment includes receiving color images and depth information from within a field of view of a sensor array; maintaining an emergency encounter record; monitoring one or both of a position of an object and movement of the object in the emergency response environment based on the color images and depth information received by the sensor array; and recording an occurrence of a condition in the emergency encounter record, wherein the condition is based on the one or both of the position of the object and the movement of the object.
  • monitoring one or both of the position of the object and movement of the object comprises monitoring one or both of the position of the human and movement of an at least partial skeletal approximation of the human.
  • recording the occurrence of the condition comprises recording a time at which the condition occurs.
  • recording the occurrence of the condition further comprises recording a type of the condition.
  • recording the occurrence of the condition further comprises recording as video footage the color images received during the occurrence of the condition.
  • correlating the at least the portion of the streaming clinical data comprises flagging the at least the portion of the streaming clinical data corresponding to a time of the occurrence of the condition.
  • a system for tracking interactions in an emergency response environment includes a sensor array, wherein the sensor array is adapted to receive color images and depth information in its field of view; a control system communicably coupled to the sensor array, the control system configured to: maintain an emergency encounter record; monitor one or both of position and movement of an object in the emergency response environment based on the color images and depth information received from the sensor array; and record an occurrence of a condition in the emergency encounter record, wherein the condition is based on the one or both of position and movement of the object.
  • a method for inventory control in an emergency response environment includes detecting three-dimensional movement of a human body in the emergency response environment with a sensor array, wherein the sensor array generates visual information and depth information about the emergency response environment; detecting three-dimensional movement of an object in the emergency response environment; determining an occurrence of contact between the human body and the object; and recording an entry in an emergency encounter record based on the occurrence of the contact.
  • the object is a narcotic medication stored in an enclosure in the emergency response environment, the method further comprising: determining, based on the detection of the three-dimensional movement of the human body and the object, an occurrence of intersection of the human body with the enclosure; and recording an entry in the emergency encounter record based on the occurrence of the intersection.
  • the method further comprising: determining, based on the detection of the three-dimensional movement of the object, an occurrence of removal of the narcotic medication from the enclosure; and recording an entry in the emergency encounter record based on the occurrence of the removal.
  • FIG. 1 illustrates an emergency response environment with a vehicle control system communicably coupled to other devices, according to embodiments of the present invention.
  • FIG. 2 illustrates a computer system, according to embodiments of the present invention.
  • FIG. 3 illustrates an emergency response environment with a system that monitors three-dimensional interaction, according to embodiments of the present invention.
  • FIG. 4 illustrates a system including a vehicle control system and a sensor array, according to embodiments of the present invention.
  • FIG. 5 illustrates a table listing various hand and finger gestures that may be recognized by the system of FIG. 4 , according to embodiments of the present invention.
  • FIG. 6 illustrates a table listing various head and facial gestures that may be recognized by the system of FIG. 4 , according to embodiments of the present invention.
  • FIG. 7 depicts a flow chart illustrating a method for monitoring three-dimensional interaction in an emergency response environment, according to embodiments of the present invention.
  • FIG. 8 depicts a flow chart illustrating a method for monitoring three-dimensional interaction of a caregiver with a patient in an emergency response environment, according to embodiments of the present invention.
  • FIG. 9 depicts a flow chart illustrating a method for monitoring three-dimensional interaction in an emergency response environment for inventory control, according to embodiments of the present invention.
  • FIG. 10 depicts a flow chart illustrating a method for gesture recognition in an emergency response environment, according to embodiments of the present invention.
  • a system 100 performs advanced data management, integration and presentation of EMS data from multiple different devices.
  • System 100 includes a mobile environment 101 , an enterprise environment 102 , and an administration environment 103 .
  • Devices within the various environments 101 , 102 , 103 may be communicably coupled via a network 120 , such as, for example, the Internet.
  • System 100 is further described in Patent Cooperation Treaty Application Publication No. WO 2011/011454, published on Jan. 27, 2011, which is incorporated herein by reference in its entirety for all purposes.
  • communicably coupled is used in its broadest sense to refer to any coupling whereby information may be passed.
  • communicably coupled includes electrically coupled by, for example, a wire; optically coupled by, for example, an optical cable; and/or wirelessly coupled by, for example, a radio frequency or other transmission media.
  • “Communicably coupled” also includes, for example, indirect coupling, such as through a network, or direct coupling.
  • the mobile environment 101 is an ambulance or other EMS vehicle—for example a vehicular mobile environment (VME).
  • the mobile environment may also be the local network of data entry devices as well as diagnostic and therapeutic devices established at time of treatment of a patient or patients in the field environment—the “At Scene Patient Mobile Environment” (ASPME).
  • the mobile environment may also be a combination of one or more of VMEs and/or ASPMEs.
  • the mobile environment may include a navigation device 110 used by the driver 112 to track the position of the mobile environment 101, locate the mobile environment 101 and/or the emergency location, and locate the transport destination, according to embodiments of the present invention.
  • the navigation device 110 may include a Global Positioning System (“GPS”), for example.
  • the navigation device 110 may also be configured to perform calculations about vehicle speed, the travel time between locations, and estimated times of arrival. According to embodiments of the present invention, the navigation device 110 is located at the front of the ambulance to assist the driver 112 in navigating the vehicle.
  • the navigation device 110 may be, for example, a RescueNet® Navigator onboard electronic data communication system available from ZOLL Data Systems of Broomfield, Colo.
  • a patient monitoring device 106 and a patient charting device 108 are also often used for patient care in the mobile environment 101 , according to embodiments of the present invention.
  • the EMS technician 114 attaches the patient monitoring device 106 to the patient 116 to monitor the patient 116 .
  • the patient monitoring device 106 may be, for example, a defibrillator device with electrodes and/or sensors configured for attachment to the patient 116 to monitor heart rate and/or to generate electrocardiographs (“ECG's”), according to embodiments of the present invention.
  • the patient monitoring device 106 may also include sensors to detect or a processor to derive or calculate other patient conditions.
  • the patient monitoring device 106 may monitor, detect, treat and/or derive or calculate blood pressure, temperature, respiration rate, blood oxygen level, end-tidal carbon dioxide level, pulmonary function, blood glucose level, and/or weight, according to embodiments of the present invention.
  • the patient monitoring device 106 may be a Zoll E-Series® or X-Series defibrillator available from Zoll Medical Corporation of Chelmsford, Mass., according to embodiments of the present invention.
  • a patient monitoring device may also be a patient treatment device, or another kind of device that includes patient monitoring and/or patient treatment capabilities, according to embodiments of the present invention.
  • the patient charting device 108 is a device used by the EMS technician 114 to generate records and/or notes about the patient's 116 condition and/or treatments applied to the patient, according to embodiments of the present invention.
  • the patient charting device 108 may be used to note a dosage of medicine given to the patient 116 at a particular time.
  • the patient charting device 108 and/or patient monitoring device 106 may have a clock, which may be synchronized with an external time source such as a network or a satellite to prevent the EMS technician from having to manually enter a time of treatment or observation (or having to attempt to estimate the time of treatment for charting purposes long after the treatment was administered), according to embodiments of the present invention.
  • the patient charting device 108 may also be used to record biographic and/or demographic and/or historical information about a patient, for example the patient's name, identification number, height, weight, and/or medical history, according to embodiments of the present invention.
  • the patient charting device 108 is a tablet PC, such as for example the TabletPCR component of the RescueNet® ePCR Suite available from Zoll Data Systems of Broomfield, Colo.
  • the patient charting device 108 is a wristband or smart-phone such as an Apple iPhone or iPad with interactive data entry interface such as a touch screen or voice recognition data entry that may be communicably connected to the VCS 104 and tapped to indicate what was done with the patient 116 and when it was done.
  • the navigation device 110 , the charting device 108 , and the monitoring device 106 are each separately very useful to the EMS drivers 112 and technicians 114 before, during, and after the patient transport.
  • a vehicle control system (“VCS”) 104 receives, organizes, stores, and displays data from each device 106, 108, 110 to further enhance the usefulness of each device 106, 108, 110 and to make it much easier for the EMS technician 114 to perform certain tasks that would normally require the EMS technician 114 to divert visual and manual attention to each device 106, 108, 110 separately, according to embodiments of the present invention.
  • the VCS centralizes and organizes information that would normally be de-centralized and disorganized, according to embodiments of the present invention.
  • the VCS 104 is communicably coupled to the patient monitoring device 106 , the patient charting device 108 , and the navigation device 110 , according to embodiments of the present invention.
  • the VCS 104 is also communicably coupled to a storage medium 118 .
  • the VCS 104 may be a touch-screen, flat panel PC, and the storage medium 118 may be located within or external to the VCS 104 , according to embodiments of the present invention.
  • the VCS 104 may include a display template serving as a graphical user interface, which permits the user (e.g. EMS tech 114 ) to select different subsets and/or display modes of the information gathered from and/or sent to devices 106 , 108 , 110 , according to embodiments of the present invention.
  • Some embodiments of the present invention include various steps, some of which may be performed by hardware components or may be embodied in machine-executable instructions. These machine-executable instructions may be used to cause a general-purpose or a special-purpose processor programmed with the instructions to perform the steps. Alternatively, the steps may be performed by a combination of hardware, software, and/or firmware. In addition, some embodiments of the present invention may be performed or implemented, at least in part (e.g., one or more modules), on one or more computer systems, mainframes (e.g., IBM mainframes such as the IBM zSeries, Unisys ClearPath Mainframes, HP Integrity NonStop servers, NEC Express series, and others), or client-server type systems. In addition, specific hardware aspects of embodiments of the present invention may incorporate one or more of these systems, or portions thereof.
  • FIG. 2 is an example of a computer system 200 with which embodiments of the present invention may be utilized.
  • the computer system includes a bus 201, at least one processor 202, at least one communication port 203, a main memory 204, a removable storage media 205, a read only memory 206, and a mass storage 207.
  • Processor(s) 202 can be any known processor, such as, but not limited to, an Intel® Itanium® or Itanium 2® processor(s), or AMD® Opteron® or Athlon MP® processor(s), or Motorola® lines of processors.
  • Communication port(s) 203 can be any of an RS-232 port for use with a modem based dialup connection, a 10/100 Ethernet port, or a Gigabit port using copper or fiber, for example. Communication port(s) 203 may be chosen depending on a network such as a Local Area Network (LAN), Wide Area Network (WAN), or any network to which the computer system 200 connects.
  • Main memory 204 can be Random Access Memory (RAM), or any other dynamic storage device(s) commonly known to one of ordinary skill in the art.
  • Read only memory 206 can be any static storage device(s) such as Programmable Read Only Memory (PROM) chips for storing static information such as instructions for processor 202 , for example.
  • Mass storage 207 can be used to store information and instructions.
  • hard disks such as the Adaptec® family of SCSI drives, an optical disc, an array of disks such as RAID (e.g. the Adaptec family of RAID drives), or any other mass storage devices may be used, for example.
  • Bus 201 communicably couples processor(s) 202 with the other memory, storage and communication blocks.
  • Bus 201 can be a PCI/PCI-X or SCSI based system bus depending on the storage devices used, for example.
  • Removable storage media 205 can be any kind of external hard-drives, floppy drives, flash drives, IOMEGA® Zip Drives, Compact Disc-Read Only Memory (CD-ROM), Compact Disc-Re-Writable (CD-RW), or Digital Video Disk-Read Only Memory (DVD-ROM), for example.
  • FIG. 3 illustrates an emergency response environment with a system 300 that monitors three-dimensional interaction, according to embodiments of the present invention.
  • System 300 includes a sensor or sensor array 1 .
  • Sensor 1 may be a camera, video camera, or other imaging device capable of collecting visual information.
  • sensor 1 is a sensor array that includes an image capture device, for example a color image capture device, as well as a depth determining device, for example an infrared emitter and infrared depth sensor.
  • Sensor 1 may also include an audio capture device.
  • sensor 1 may be a sensor array such as a Kinect® sensor array available from Microsoft Corporation.
  • Sensor 1 may also or alternatively be a LEAP™ device available from Leap Motion, Inc.
  • Sensor 1 may be, or include, a wide variety of hardware that permits collection of visual, depth, audio, and color information and the like, according to embodiments of the present invention.
  • Sensor 1 may be placed within an emergency response environment, for example in the back 152 of an ambulance 101 , such that activities of the patient 116 and/or crew members 2 , 3 are at least partially within its field of view.
  • sensor 1 may be mounted on a wall or ceiling of the back compartment 152 of the ambulance 101 .
  • the sensor 1 may also include, within its field of view, a patient support 4 , such as a bed, cot, or stretcher, upon which a patient 116 is laying and/or being treated.
  • the back 152 of the ambulance 101 may further include a supply cabinet 5 , for example a medicine cabinet or narcotics cabinet, which may be stocked with medicines, for example narcotic 6 .
  • FIG. 4 illustrates a system including a vehicle control system 104 communicably coupled with a sensor array 1 , according to embodiments of the present invention.
  • Sensor array 1 may include an imaging device 9 , a depth sensor system 10 , and/or an audio input 11 , according to embodiments of the present invention.
  • VCS 104 may also be communicably coupled with a patient monitoring device 106 , a charting system 108 , a navigation system 110 , and vehicle operations systems 8 .
  • the vehicle operation systems 8 may include sensors and controllers installed in the vehicle relating to vehicle safety and/or operation, including both manufacturer-installed and aftermarket devices, for example vehicle speed sensors, seatbelt detectors, accelerometers, and other vehicle- and safety-related devices, including without limitation those described in U.S. Provisional Patent Application Ser. No. 61/656,527, filed on Jun. 7, 2012, which is incorporated by reference herein in its entirety for all purposes.
  • Vehicle control system 104 may be configured to create, maintain, and/or update an encounter record 7 , which may be stored locally in an emergency response environment (for example in database 118 ) and/or remotely on an enterprise database 130 .
  • the encounter record 7 may include information obtained by the vehicle control system 104 and each of the devices to which VCS 104 is communicably coupled. Records in the encounter record 7 may be specific to an encounter with a particular patient 116 , and/or a particular dispatch of the vehicle 101 , for example.
  • the VCS 104 may be configured to track interactions in the emergency response environment, for example interactions by and among caregivers 2, 3, and patient 116 and/or objects in the emergency response environment.
  • the VCS 104 may be configured to receive color images and depth information from within a field of view of the sensor array 1 .
  • the VCS 104 may also be configured to maintain an emergency encounter record 7 , either locally and/or remotely.
  • the VCS 104 monitors a position of an object and/or movement of the object in the emergency response environment based on the color images and depth information received by the sensor array 1 .
  • the sensor array 1 may be a Kinect® sensor array
  • the VCS 104 may include software that receives data from the sensor array 1 to detect or approximate movements and locations of human bodies and their respective linkages (skeletal joints and bones) in three-dimensional space.
  • the VCS 104 can distinguish between different humans in the field of view of the sensor 1 , and can monitor or observe the movements of two or more of such humans in the field of view.
  • the VCS 104 is configured to recognize which of the humans is a patient and which is a caregiver.
  • VCS 104 may recognize a human as a patient by observing that the particular human is laying relatively still on the patient support 4 , while another human is an EMS technician 2 because the other human is standing up or moving around the back of the ambulance 101 .
  • the VCS 104 may be configured to track or monitor three-dimensional movements of one or more humans in the emergency response environment by approximating elements of their basic skeletal structure and, as such, can determine when two humans are in contact or close proximity. For example, the VCS 104 can determine when a hand or arm of the EMS technician 2 reaches over and touches an area of the patient's 116 body, according to embodiments of the present invention.
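  • By way of illustration, the following is a minimal sketch of such a proximity check, assuming the depth-sensing software exposes each tracked skeleton as a mapping from joint names to 3-D coordinates; the joint names and the 10 cm contact threshold are assumptions chosen for the example, not part of the original disclosure.

```python
# Hypothetical sketch: detect when a caregiver's hand comes within a
# threshold distance of any tracked joint of the patient's skeleton.
# Joint names and the contact threshold are illustrative assumptions.
import math

CONTACT_THRESHOLD_M = 0.10  # 10 cm, an assumed proximity threshold

def distance(a, b):
    """Euclidean distance between two (x, y, z) points in meters."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def detect_contact(caregiver_skeleton, patient_skeleton,
                   hand_joints=("hand_left", "hand_right")):
    """Return (hand_joint, patient_joint) if a caregiver hand is within
    CONTACT_THRESHOLD_M of any patient joint, else None."""
    for hand in hand_joints:
        if hand not in caregiver_skeleton:
            continue
        for joint_name, joint_pos in patient_skeleton.items():
            if distance(caregiver_skeleton[hand], joint_pos) <= CONTACT_THRESHOLD_M:
                return hand, joint_name
    return None

# Example with made-up coordinates (meters, sensor frame):
caregiver = {"hand_right": (0.52, 1.10, 1.95), "head": (0.60, 1.60, 2.10)}
patient = {"head": (0.55, 1.12, 1.98), "torso": (0.50, 0.90, 2.00)}
print(detect_contact(caregiver, patient))  # -> ('hand_right', 'head')
```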
  • the VCS 104 is configured to record into the emergency encounter record 7 an occurrence of a condition.
  • a condition may be based on the position of the object and/or the movement of the object.
  • the object may be a human, and the VCS 104 may monitor the human's movement (or a skeletal approximation thereof) in three-dimensional space, and make an entry in the encounter record 7 when the human or part of the human intersects a certain location (e.g. within the ambulance 101 ), or remains in a particular location for a certain amount of time, or intersects or nears another object.
  • the VCS 104 may be configured to make an entry to the encounter record 7 when one object (e.g. a caregiver or a caregiver's hand) approaches, touches, or otherwise intersects another object (e.g. the patient 116).
  • the VCS 104 may be configured to mark the encounter record 7 when a caregiver 2 or 3 approached the patient 116 and/or touched the patient 116 , or when an object approached or touched the patient 116 .
  • the VCS 104 may be configured to update the encounter record 7 in various ways based on the observance of a condition based on three-dimensional visual and position data. For example, the VCS 104 may be configured to enter into the encounter record 7 a time at which the condition occurred, and/or an identification of the condition or type of condition that occurred, and/or other data coinciding with the occurrence of the condition, for example video data or color images covering the time or time range when the condition occurred.
  • the VCS 104 receives streaming clinical data about a patient 116 , for example from a defibrillator or other patient monitoring device 106 communicably coupled to the patient, and correlates at least a portion of the streaming clinical data in the emergency encounter record 7 with the occurrence of the condition based on the sensor's 1 visual data.
  • correlating some or all of the streaming clinical data includes flagging some or all of the streaming clinical data that corresponds to a time of the occurrence of the condition.
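  • A minimal sketch of this correlation step is shown below, assuming the clinical stream is a list of timestamped samples and using an assumed five-second flagging window; the field names are illustrative and not taken from any particular monitoring or charting system.

```python
# Hypothetical sketch: record a condition occurrence in an encounter record
# and flag streaming clinical samples (e.g. from a monitor/defibrillator)
# that fall within a window around the occurrence time. Field names and
# the +/- 5 second window are illustrative assumptions.
from dataclasses import dataclass, field

FLAG_WINDOW_S = 5.0  # assumed correlation window around the occurrence

@dataclass
class ClinicalSample:
    t: float          # seconds since start of encounter
    channel: str      # e.g. "ECG", "SpO2"
    value: float
    flagged: bool = False

@dataclass
class EncounterRecord:
    conditions: list = field(default_factory=list)
    clinical: list = field(default_factory=list)

    def record_condition(self, t, condition_type, video_clip=None):
        self.conditions.append(
            {"time": t, "type": condition_type, "video": video_clip})
        # Correlate by flagging clinical samples near the occurrence time.
        for sample in self.clinical:
            if abs(sample.t - t) <= FLAG_WINDOW_S:
                sample.flagged = True

record = EncounterRecord()
record.clinical = [ClinicalSample(t, "ECG", 0.0) for t in range(0, 60, 2)]
record.record_condition(t=30.0, condition_type="caregiver touched patient")
print(sum(s.flagged for s in record.clinical))  # samples within +/- 5 s of t=30
```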
  • FIG. 7 illustrates a flow chart 700 showing the recording of an occurrence of a condition based on three-dimensional position and shape visual data, according to embodiments of the present invention.
  • One or more distinct objects are identified (block 702 ), for example by VCS 104 and sensor 1 .
  • the position and/or movement of the one or more objects are tracked or otherwise monitored or modeled (block 704 ), and based on such tracking the VCS 104 identifies the occurrence of a condition (block 706 ).
  • the occurrence of the condition, or information about the condition is recorded in the patient encounter record 7 (block 708 ).
  • FIG. 8 illustrates a flow chart 800 describing a similar method in greater detail, according to embodiments of the present invention.
  • An individual human or distinct humans are identified in an emergency response environment, for example the back of an ambulance (block 802 ). At least one of the humans is identified as a patient (block 804 ). The position and/or movement of the one or more humans is observed or tracked or otherwise modeled (block 806 ), and based thereon the VCS 104 identifies the occurrence of a condition, for example the occurrence of patient treatment (block 808 ).
  • Information about the patient contact may be recorded in the encounter record 7 (block 810 ), for example by recording a time or time range at which the condition (e.g. treatment) occurred (block 812 ), and/or by recording a type of contact (e.g. treatment) which occurred (block 814 ).
  • the VCS 104 may update the encounter record 7 to reflect that an oral medication was or may have been administered to the patient 116 , and the particular time which this occurred.
  • the VCS 104 may be configured to prompt the EMS technician 2 or other caregiver at a later time, for example after the emergency encounter or at the end of a standard shift, to confirm or validate the perceived interactions or conditions that were entered into the patient encounter record 7 .
  • the VCS 104 might observe the occurrence of the EMS technician's 2 hand going to the face of the patient 116 and flag such occurrence as the possible administration of an oral medication, but when prompting the EMS technician 2 for later confirmation, may give the EMS technician 2 the ability to edit the observation to reflect that the interaction was instead a turning of the head of the patient, or some other reason for why the caregiver 2 contacted the patient 116 .
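  • The following sketch illustrates one way this observe-now-confirm-later flow could be structured; the entry fields and prompt wording are assumptions made for illustration, not a definitive implementation.

```python
# Hypothetical sketch of the "observe now, confirm later" flow described
# above: a perceived interaction is logged as an unconfirmed entry, and the
# caregiver is prompted after the encounter to confirm or edit it.
from dataclasses import dataclass

@dataclass
class ObservedInteraction:
    time: float
    description: str          # e.g. "hand of technician reached patient's face"
    interpretation: str       # e.g. "possible oral medication administration"
    confirmed: bool = False

def review_interactions(interactions):
    """Post-encounter review loop: confirm or re-label each observation."""
    for obs in interactions:
        answer = input(f"At t={obs.time:.0f}s: {obs.interpretation}? (y/n/edit) ")
        if answer.lower().startswith("y"):
            obs.confirmed = True
        elif answer.lower().startswith("e"):
            obs.interpretation = input("Enter corrected interpretation: ")
            obs.confirmed = True
        # otherwise leave the entry unconfirmed for later review

pending = [ObservedInteraction(742.0,
                               "hand of technician reached patient's face",
                               "possible oral medication administration")]
# review_interactions(pending)  # would prompt the caregiver interactively
```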
  • FIG. 9 depicts a flow chart 900 illustrating a method for monitoring three-dimensional interaction in an emergency response environment for inventory control, according to embodiments of the present invention.
  • the VCS 104 may identify a particular location within the emergency response environment, for example a supply cabinet 5 , using sensor 1 and known information about the environment (block 902 ).
  • the VCS 104 may also be configured for customization regarding the locations of certain items in the emergency response environment. For example, during an initialization and/or configuration protocol, the VCS 104 may prompt the user to run the user's finger or hand around an outer perimeter of a supply cabinet 5 and/or a door thereto, so that the VCS 104 can log the three-dimensional position of the supply cabinet 5 .
  • Such cabinet 5 may be, for example, a narcotics cabinet 5 to which access is often controlled for safety and security reasons.
  • the VCS 104 may identify individual humans in the emergency response environment, for example the back of an ambulance (block 904 ), and track the position and/or movement of such humans (block 906 ). This may be done with visual and depth information received from the sensor array 1 , according to embodiments of the present invention. Based on such visual and depth information received from the sensor array 1 , the VCS 104 may also detect or track three-dimensional movement of an object in the emergency response environment, for example a non-human object. The VCS 104 may determine an occurrence of contact between the human body and the object (block 908 ), for example an occurrence of the human body or a portion thereof approaching and/or intersecting the narcotics cabinet 5 .
  • the VCS 104 may also record an entry in an emergency encounter record 7 based on the occurrence of the contact, for example a note that the cabinet 5 was accessed (block 910 ) along with a time (block 912 ) and/or an identity of the person who accessed the cabinet 5 (block 914 ).
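  • One possible sketch of the cabinet initialization and access-logging steps follows, assuming the traced fingertip positions are available as 3-D points and representing the cabinet as an axis-aligned bounding volume; the coordinates, margin, and field names are illustrative assumptions.

```python
# Hypothetical sketch: build an axis-aligned bounding volume for a supply
# cabinet from fingertip positions traced during an initialization protocol,
# then log an access entry whenever a tracked hand enters that volume.
def bounding_volume(traced_points, margin=0.02):
    """Axis-aligned (min, max) corners enclosing the traced perimeter."""
    xs, ys, zs = zip(*traced_points)
    lo = (min(xs) - margin, min(ys) - margin, min(zs) - margin)
    hi = (max(xs) + margin, max(ys) + margin, max(zs) + margin)
    return lo, hi

def inside(point, volume):
    lo, hi = volume
    return all(l <= p <= h for p, l, h in zip(point, lo, hi))

def log_cabinet_access(record, t, person_id, hand_pos, cabinet_volume):
    if inside(hand_pos, cabinet_volume):
        record.append({"time": t, "event": "narcotics cabinet accessed",
                       "person": person_id})

# Initialization: user runs a finger around the cabinet door perimeter.
trace = [(1.20, 1.50, 0.40), (1.60, 1.50, 0.40), (1.60, 1.10, 0.40), (1.20, 1.10, 0.40)]
cabinet = bounding_volume(trace)

encounter_record = []
log_cabinet_access(encounter_record, t=815.2, person_id="crew_2",
                   hand_pos=(1.40, 1.30, 0.41), cabinet_volume=cabinet)
print(encounter_record)
```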
  • the VCS 104 may be configured to observe the occurrence of various different types of conditions of note. For example, the VCS 104 may be configured to detect an intersection of a human form with the area of the door or opening to the cabinet 5 .
  • the VCS 104 may be configured to detect that a shape that correlates to the shape of the narcotic medication 6 has gone from inside such area of the door or cabinet opening to outside such area.
  • VCS 104 may also be configured to note whether a human has an object in the human's hand as well as the shape and/or size of the object.
  • the VCS 104 may further be configured to update an inventory database, based on the occurrence of the removal, to reflect that the narcotic medication has been used and needs restocking. Similar processes may be used to track the use of other objects and the inventory associated therewith, as well as to track in general the intersection of objects with humans and use thereby, according to embodiments of the present invention.
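  • A minimal sketch of the removal detection and restocking update might look like the following, with the shape-matching step abstracted away; the item name, par level, and inventory fields are assumptions made for illustration.

```python
# Hypothetical sketch: when a shape matching the narcotic medication moves
# from inside the cabinet volume to outside it, record the removal and mark
# the item for restocking in an inventory table.
def track_removal(prev_pos, curr_pos, cabinet_volume, inventory, record, t,
                  item="narcotic 6"):
    lo, hi = cabinet_volume
    was_inside = all(l <= p <= h for p, l, h in zip(prev_pos, lo, hi))
    now_outside = not all(l <= p <= h for p, l, h in zip(curr_pos, lo, hi))
    if was_inside and now_outside:
        record.append({"time": t, "event": f"{item} removed from cabinet"})
        inventory[item]["on_hand"] -= 1
        inventory[item]["needs_restock"] = (
            inventory[item]["on_hand"] < inventory[item]["par"])

inventory = {"narcotic 6": {"on_hand": 2, "par": 2, "needs_restock": False}}
record = []
cabinet = ((1.18, 1.08, 0.38), (1.62, 1.52, 0.42))
track_removal((1.40, 1.30, 0.40), (1.40, 1.30, 0.80), cabinet, inventory,
              record, t=820.7)
print(record, inventory["narcotic 6"]["needs_restock"])
```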
  • the occurrence of an access event to the particular cabinet 5 may further trigger other information gathering, for example it may trigger a camera on the inside of the cabinet 5 and/or another video camera elsewhere in the vehicle 101 .
  • the identity of each crew member accessing the cabinet 5 may be recorded in the encounter record 7 , according to embodiments of the present invention.
  • FIG. 10 depicts a flow chart 1000 illustrating a method for gesture recognition in an emergency response environment, according to embodiments of the present invention.
  • System 400, including VCS 104 and sensors 1, may be configured to track motions, positions, and interactions of humans and objects as described above.
  • system 400 as well as VCS 104 and sensors 1 may also or alternatively be configured to monitor such visual information for the occurrence of gestures.
  • three-dimensional position and visual information may be used to monitor for gestures; in other cases, mere visual information may be used to detect gestures (e.g. based on pattern recognition or other visual cues or patterns).
  • sensor 1 may be one of a number of various types of sensors or sensor arrays.
  • VCS 104 may be configured to track an entire human body and/or one or more portions thereof to identify gestures being made, for example gestures being made by one or more hands and/or fingers or by the head and/or neck (block 1002 ).
  • VCS 104 receives visual information about at least a portion of a human body from at least one sensor 1 , and maintains the encounter record 7 .
  • the VCS 104 is configured to monitor the visual information to determine movements of the at least the portion of the human body (for example the hand or the head), and to recognize an occurrence of a gesture based on the movements of the at least the portion of the human body.
  • the VCS 104 recognizes one or more hand or finger gestures based on visual and/or depth information received by sensor 1 , for example one or more hand or finger gestures listed in FIG. 5 .
  • the VCS 104 may also recognize one or more head or facial gestures based on visual and/or depth information received by sensor 1 , for example one or more head or facial gestures listed in FIG. 6 .
  • Examples of hand or finger gestures may include waving a hand or finger, making a fist, raising the fist, shaking the fist, making the “thumbs up” signal, spreading fingers apart, displaying a count (e.g. zero, one, two, three, four, five, six, seven, eight, nine, or ten digits extended), pointing, moving hands together, pulling hands apart, and/or tapping on the wrist.
  • Examples of head or facial gestures may include nodding the head, bobbing the head, shaking the head side-to-side as in the “no” gesture, shaking the head up and down as in the “yes” gesture, blinking, opening or closing the mouth, sticking the tongue out, raising or lowering eyebrows, and/or opening or closing the eyes.
  • When the VCS 104 recognizes a gesture, the VCS 104 records an entry in the emergency encounter record 7 based on the occurrence of the gesture (block 1004).
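  • A sketch of such a gesture entry is shown below; the gesture name, the data sources captured alongside it, and the record fields are assumptions made for illustration only.

```python
# Hypothetical sketch: when a gesture is recognized, write an entry to the
# encounter record containing the gesture type, the time, and whatever data
# values are configured to be captured with it (crew, clinical, vehicle).
import time

def record_gesture(record, gesture_type, data_sources):
    """Append a gesture entry; data_sources maps a label to a callable
    returning the current value of that data stream."""
    record.append({
        "event": "gesture",
        "type": gesture_type,
        "time": time.time(),
        "data": {label: read() for label, read in data_sources.items()},
    })

encounter_record = []
record_gesture(
    encounter_record,
    gesture_type="thumbs_up",
    data_sources={
        "crew": lambda: ["tech_2", "tech_3"],      # crew identification
        "blood_pressure": lambda: (118, 76),       # from patient monitor
        "vehicle_speed_mph": lambda: 42.0,         # from vehicle systems
    },
)
print(encounter_record[0]["type"], encounter_record[0]["data"]["blood_pressure"])
```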
  • a gesture may be artificial, or alternatively such a gesture may be natural.
  • An artificial gesture is a gesture made by a human for the primary purpose of triggering the condition with VCS 104 .
  • an artificial gesture may be a gesture that would not normally be made in the normal course of treating a patient 116 in an emergency response environment. For example, making a “thumbs up” signal is one example of an artificial gesture.
  • a patient whose head is involuntarily bobbing is an example of a natural gesture, or a gesture which is not performed only to trigger VCS 104 .
  • the entry which the VCS 104 makes in the patient encounter record 7 based on the recognition of the gesture may include information about the type of gesture made (block 1006 ), information about the time at which the gesture was made (block 1008 ), and/or information about other data values at the time the gesture was made (block 1010 ), for example information about the crew (block 1012 ), patient clinical data (block 1014 ), and vehicle operation or safety conditions (block 1016 ).
  • VCS 104 may be configured to write the patient's 116 current blood pressure reading to the encounter record 7 whenever VCS 104 receives visual and/or depth information from the sensor 1 indicating that the caregiver 2 attending to the patient 116 taps his or her left wrist with the right hand or fingers (tapping the location where a watch would normally be worn).
  • Successive gestures may be used to take the VCS 104 down various pathways and/or treatment protocols, or to confirm previous gestures or options that become available because of those gestures.
  • the VCS 104 may be configured to record a blood pressure reading to the encounter record 7 when it identifies the wrist tapping gesture followed by a chest tapping gesture, and may be configured to record an ECG waveform signal to the encounter record 7 when it identifies the same wrist tapping gesture followed by a back-of-the-neck tapping gesture.
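  • The following sketch shows one way such two-gesture sequences could be dispatched to charting actions; the gesture names and actions mirror the example above, while the arming and reset behavior is an assumption.

```python
# Hypothetical sketch of successive-gesture handling: a wrist tap "arms" the
# charting mode, and the following gesture selects what gets written to the
# encounter record (blood pressure vs. ECG waveform, per the example above).
SEQUENCES = {
    ("wrist_tap", "chest_tap"): "record_blood_pressure",
    ("wrist_tap", "neck_tap"): "record_ecg_waveform",
}

class GestureSequencer:
    def __init__(self):
        self.pending = None  # first gesture of a two-gesture sequence

    def on_gesture(self, gesture):
        """Return the action name if a known sequence completes, else None."""
        if self.pending is not None:
            action = SEQUENCES.get((self.pending, gesture))
            self.pending = None
            if action:
                return action
        if any(seq[0] == gesture for seq in SEQUENCES):
            self.pending = gesture
        return None

seq = GestureSequencer()
print(seq.on_gesture("wrist_tap"))   # None (sequence armed)
print(seq.on_gesture("neck_tap"))    # record_ecg_waveform
```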
  • the VCS 104 may also be configured to record in the encounter record 7 the audiovisual (e.g. video and/or audio) information received during or within a certain time range of the gesture, according to embodiments of the present invention.
  • the VCS is configured to identify simultaneous occurrence of gestures, for example two or more gestures selected from FIG. 5 , FIG. 6 , or any other natural or artificial gestures.
  • the VCS 104 is configured to identify simultaneous occurrence of gestures along with position and/or movement information for entire human bodies or portions thereof, or simultaneous occurrence of other factors such as vehicle position along the ambulance route, patient vital signs, and/or vehicle speed.
  • VCS 104 may also be configured to identify simultaneous occurrence of gestures by the same person, for example a different or similar gesture with each hand, or a hand and a head.
  • the VCS 104 may be configured to recognize a hand waving gesture and to make a record in the encounter record 7 and notify the ambulance driver to slow down if the hand waving gesture is received at a time when the vehicle speed is exceeding 60 miles per hour.
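  • A minimal sketch of combining a recognized gesture with vehicle data follows; the 60 mph threshold comes from the example above, while the speed source and the notification callback are assumptions.

```python
# Hypothetical sketch of a combined condition: a recognized hand-waving
# gesture only triggers a "slow down" notification and an encounter-record
# entry when the vehicle speed exceeds 60 mph at that moment.
SPEED_LIMIT_MPH = 60.0

def on_hand_wave(t, vehicle_speed_mph, record, notify_driver):
    if vehicle_speed_mph > SPEED_LIMIT_MPH:
        record.append({"time": t, "event": "hand wave at high speed",
                       "speed_mph": vehicle_speed_mph})
        notify_driver("Crew signalled: please slow down.")

record = []
on_hand_wave(t=1024.5, vehicle_speed_mph=67.0, record=record,
             notify_driver=lambda msg: print("DRIVER ALERT:", msg))
print(record)
```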
  • the visually recognized gestures may be paired or correlated or combined with other information received by VCS 104 , either in the creation of the condition which triggers a further event (such as writing to the encounter record 7 or creating a notification or some other action), or in the creation of the entry to the encounter record 7 itself (for example the types of information that would be flagged or gathered or otherwise noted upon occurrence of the condition).
  • the VCS 104 identifies (either in the encounter record 7 or for other devices) whether a patient is being transported by the vehicle 101 , for example by determining whether a human figure is sitting on or laying on the patient support 4 .
  • the VCS 104 may also identify the position of a patient or a crew member, for example whether the patient or crew member is sitting or standing.
  • the VCS 104 may also receive from sensor 1 information about structures beyond a normal emergency response environment, for example larger-scale depth images of emergency incidents such as buildings on fire, to aid in the location of emergency workers and/or victims.
  • sensor 1 may be communicably coupled with VCS 104 .
  • Multiple sensors 1 may be used to expand the field or depth of view, or to collect similar information from a different viewing angle, in order to observe more objects or humans, or gather more detailed information about shapes and/or movements.
  • sensor 1 is described as being mounted within a vehicle, sensor 1 or multiples thereof may alternatively be mounted on a device (for example a defibrillator taken to an emergency response scene) and/or on a person (for example on a crew member's helmet).
  • Embodiments of the present invention may also be used for charting and/or counting functions. Often, medics must reconstruct past events that occurred during patient treatment. Embodiments of the present invention improve accuracy and help to accurately document times at which various events occurred.
  • the VCS 104 may recognize boundaries of multiple cabinets or storage areas within an ambulance 101 , and may log the times at which each storage area was accessed by a medic, as well as the identity (e.g. obtained from voice or body or facial recognition) of the medic who accessed the area. Such a “bounding volume” may be preprogrammed into VCS 104 and/or customized or initialized upon installation of VCS 104 , sensor 1 , and/or a new storage area.
  • the VCS 104 may count a number of boxes on the floor of the ambulance to determine a number of items used in the encounter, and reconcile that with the medications and other durable goods charted for the patient encounter. The VCS 104 may then prompt the medic for additional information to help reconcile the encounter record 7 .
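  • A sketch of the reconciliation step might look like the following, with assumed item names and counts; the visual counting of boxes itself is abstracted away here.

```python
# Hypothetical sketch of the reconciliation step: compare the number of
# used-item containers counted visually on the ambulance floor against the
# items charted for the encounter, and produce prompts where they disagree.
from collections import Counter

def reconcile(counted, charted):
    """Return a list of prompts for items whose counts do not match."""
    prompts = []
    for item in set(counted) | set(charted):
        seen, documented = counted.get(item, 0), charted.get(item, 0)
        if seen != documented:
            prompts.append(f"{item}: {seen} counted on floor, "
                           f"{documented} charted; please reconcile.")
    return prompts

counted_boxes = Counter({"saline 500ml": 3, "epinephrine 1mg": 1})
charted_items = Counter({"saline 500ml": 2, "epinephrine 1mg": 1})
for prompt in reconcile(counted_boxes, charted_items):
    print(prompt)
```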
  • the system 400 may also determine when a patient is being touched, either by another human or by an implement held by another human. This information may be used either during the patient encounter, or afterward, to determine whether inappropriate patient contact has occurred.
  • the system 400 may determine when an IV is being started.
  • System 400 may also use gesture-based charting, for example quick-logging with artificial gestures, to save time over manual entry or typing of such information.
  • Embodiments of the present invention may also include voice recognition, which may filter out siren sounds or road sounds, and which may also provide feedback to the crew.
  • Embodiments of the present invention may also be configured to identify crew members, for example through facial recognition, pattern recognition, name badge reading, skeletal modeling, habits or movements, or via another mechanism such as crew logins or RFID badges which are also communicably coupled to VCS 104 .
  • the system 400 may be used for security monitoring, to detect the presence of unidentified or unwanted intruders in the vehicle 101 .
  • the system 400 may be used to begin tracking a person when the person makes a gesture or performs a certain activity, and then continue to track the same person after the gesture or activity, for a certain period of time or until another event occurs, for example another visual event.
  • the system 400 identifies an operator of a medical device using visual information; for example, a patient monitoring device 106 , such as a defibrillator, may include a camera or other type of sensor array 1 , and upon use of the device 106 the device 106 may observe visual characteristics of the person directly in front of the device 106 in order to identify the person or monitor or interpret activities of that person.
  • the system 400 may also be configured to recognize or identify in its field of view equipment used by medical personnel, either by visual cues or otherwise, and may perform similar medical personnel identification or visual monitoring even when the camera or sensor array 1 is not in or near the device being used.
  • Such multiple devices used by medical personnel may be wirelessly or otherwise communicably coupled with each other and/or with system 400 , so that activities performed on various devices and by the personnel are correlated for a more complete patient record without requiring manual annotation, according to embodiments of the present invention.
  • the system 400 may be mounted not only in a vehicle, such as the back of an ambulance, but system 400 and/or parts thereof may also be integrated into or mounted on a medical device, including a portable medical device such as a defibrillator.
  • the system 400 may also be configured to “remember” a person based on that person's gestures; for example, the system 400 may observe certain gestures performed by a person one day after the person identifies himself or herself to the system 400 , and may then visually identify the same person the next day based on observing similar gestures, even if the person has not specifically identified himself or herself to the system 400 on the following occasion.
  • the system 400 may also be configured to count the number of distinct individual people in a given area, according to embodiments of the present invention.
  • the system 400 may also be configured to monitor certain activities and to interpret various aspects of those activities, and even to provide feedback to the performer of the activities either in real time or in a later review.
  • the system 400 may monitor an EMS technician's twelve-lead placement on a patient, and/or may provide adaptive feedback, for example adaptive feedback to a person who is administering cardiopulmonary resuscitation.
  • the system 400 may also be configured to identify a certain portion of the body, or an object held by a person, and to track the movement of the body part or object and record the tracked motion as writing.
  • an EMS technician could write numbers, letters, or words in the air using a finger, and the system 400 may be configured to record such movement as writing.
  • the EMS technician may initiate such “air writing” recording mode with a gesture or other activation; in other embodiments, the system 400 automatically recognizes such “air writing” based on the absence of other objects with which the user's hand or finger could be interacting, for example for a certain period of time.
  • Such recording capabilities may save the EMS technician time in data entry or patient charting, and would permit the medical professional to create charting entries and other writings even when the medical professional's hands are dirty, or when the medical professional does not wish to physically touch devices so as to maintain sterility for hands or gloved hands, according to embodiments of the present invention.
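  • The sketch below illustrates one possible way to buffer fingertip positions as “air writing” strokes when the hand is not near any other object; the idle-distance threshold is an assumption, and recognition of the written characters from the stroke is not shown.

```python
# Hypothetical sketch of the "air writing" idea: fingertip positions are
# buffered while no nearby object could explain the hand motion, and the
# buffered trajectory is stored as a writing stroke.
import math

NEAR_OBJECT_M = 0.15  # assumed: closer than this, the hand is "using" something

def near_any_object(fingertip, object_positions):
    return any(math.dist(fingertip, obj) < NEAR_OBJECT_M for obj in object_positions)

def capture_air_writing(fingertip_track, object_positions):
    """Return the fingertip trajectory segments captured as writing strokes."""
    strokes, current = [], []
    for point in fingertip_track:
        if near_any_object(point, object_positions):
            if current:
                strokes.append(current)
                current = []
        else:
            current.append(point)
    if current:
        strokes.append(current)
    return strokes

track = [(0.10, 1.20, 1.0), (0.12, 1.25, 1.0), (0.14, 1.22, 1.0), (0.50, 0.80, 1.4)]
objects = [(0.52, 0.80, 1.4)]  # e.g. a device the hand eventually reaches for
print(capture_air_writing(track, objects))  # first three points form one stroke
```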

Abstract

A method for tracking interactions in an emergency response environment according to embodiments of the present invention includes receiving color images and depth information from within a field of view of a sensor array; maintaining an emergency encounter record; monitoring one or both of a position of an object and movement of the object in the emergency response environment based on the color images and depth information received by the sensor array; and recording an occurrence of a condition in the emergency encounter record, wherein the condition is based on the one or both of the position of the object and the movement of the object.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Patent Application Ser. No. 61/707,671, filed on Sep. 28, 2012, and of U.S. Provisional Patent Application Ser. No. 61/707,665, filed on Sep. 28, 2012, both of which are incorporated herein by reference in their entireties for all purposes.
  • TECHNICAL FIELD
  • Embodiments of the present invention relate generally to gesture recognition and three-dimensional interaction tracking in an emergency medical services environment.
  • BACKGROUND
  • In an emergency medical services (“EMS”) or first responder environment, caregivers must often focus more acutely on patient care in a shorter amount of time and with a greater number of uncertainties and variables than their counterparts in a hospital setting. Creating a record of the EMS caregiver's encounter with a patient, however, remains important. Manual input of information into patient charting systems (e.g. by typing or by writing) can sometimes take valuable time and attention away from patient care, can be distracting, and can often be inaccurately recreated from memory after an EMS encounter.
  • SUMMARY
  • A method for tracking interactions in an emergency response environment according to embodiments of the present invention includes receiving color images and depth information from within a field of view of a sensor array; maintaining an emergency encounter record; monitoring one or both of a position of an object and movement of the object in the emergency response environment based on the color images and depth information received by the sensor array; and recording an occurrence of a condition in the emergency encounter record, wherein the condition is based on the one or both of the position of the object and the movement of the object.
  • The method of paragraph [0004], wherein the object is a human, and wherein monitoring one or both of the position of the object and movement of the object comprises monitoring one or both of the position of the human and movement of an at least partial skeletal approximation of the human.
  • The method of any of paragraphs [0004] and [0005], wherein the human is a first object, wherein the condition comprises the at least partial skeletal approximation of the human coming within a certain distance of a second object.
  • The method of any of paragraphs [0004] through [0006], wherein the human is a first human, and wherein the second object is a second human.
  • The method of any of paragraphs [0004] through [0007], wherein the condition comprises the first human touching the second human.
  • The method of any of paragraphs [0004] through [0008], wherein the second human is a patient being treated by the first human in the emergency response environment.
  • The method of any of paragraphs [0004] through [0009], wherein recording the occurrence of the condition comprises recording a time at which the condition occurs.
  • The method of any of paragraphs [0004] through [0010], wherein recording the occurrence of the condition further comprises recording a type of the condition.
  • The method of any of paragraphs [0004] through [0011], wherein recording the occurrence of the condition further comprises recording as video footage the color images received during the occurrence of the condition.
  • The method of any of paragraphs [0004] through [0012], further comprising: receiving streaming clinical data about a patient, and correlating at least a portion of the streaming clinical data in the emergency encounter record with the occurrence of the condition.
  • The method of any of paragraphs [0004] through [0013], wherein correlating the at least the portion of the streaming clinical data comprises flagging the at least the portion of the streaming clinical data corresponding to a time of the occurrence of the condition.
  • A system for tracking interactions in an emergency response environment according to embodiments of the present invention includes a sensor array, wherein the sensor array is adapted to receive color images and depth information in its field of view; a control system communicably coupled to the sensor array, the control system configured to: maintain an emergency encounter record; monitor one or both of position and movement of an object in the emergency response environment based on the color images and depth information received from the sensor array; and record an occurrence of a condition in the emergency encounter record, wherein the condition is based on the one or both of position and movement of the object.
  • A method for inventory control in an emergency response environment, according to embodiments of the present invention, includes detecting three-dimensional movement of a human body in the emergency response environment with a sensor array, wherein the sensor array generates visual information and depth information about the emergency response environment; detecting three-dimensional movement of an object in the emergency response environment; determining an occurrence of contact between the human body and the object; and recording an entry in an emergency encounter record based on the occurrence of the contact.
  • The method of paragraph [0016], wherein the object is a narcotic medication stored in an enclosure in the emergency response environment, the method further comprising: determining, based on the detection of the three-dimensional movement of the human body and the object, an occurrence of intersection of the human body with the enclosure; and recording an entry in the emergency encounter record based on the occurrence of the intersection.
  • The method of any of paragraphs [0016] and [0017], wherein the object is a narcotic medication stored in an enclosure in the emergency response environment, the method further comprising: determining, based on the detection of the three-dimensional movement of the object, an occurrence of removal of the narcotic medication from the enclosure; and recording an entry in the emergency encounter record based on the occurrence of the removal.
  • The method of any of paragraphs [0016] through [0018], further comprising updating an inventory database, based on the occurrence of the removal, to reflect that the narcotic medication has been used and needs restocking.
  • While multiple embodiments are disclosed, still other embodiments of the present invention will become apparent to those skilled in the art from the following detailed description, which shows and describes illustrative embodiments of the invention. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not restrictive.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates an emergency response environment with a vehicle control system communicably coupled to other devices, according to embodiments of the present invention.
  • FIG. 2 illustrates a computer system, according to embodiments of the present invention.
  • FIG. 3 illustrates an emergency response environment with a system that monitors three-dimensional interaction, according to embodiments of the present invention.
  • FIG. 4 illustrates a system including a vehicle control system and a sensor array, according to embodiments of the present invention.
  • FIG. 5 illustrates a table listing various hand and finger gestures that may be recognized by the system of FIG. 4, according to embodiments of the present invention.
  • FIG. 6 illustrates a table listing various head and facial gestures that may be recognized by the system of FIG. 4, according to embodiments of the present invention.
  • FIG. 7 depicts a flow chart illustrating a method for monitoring three-dimensional interaction in an emergency response environment, according to embodiments of the present invention.
  • FIG. 8 depicts a flow chart illustrating a method for monitoring three-dimensional interaction of a caregiver with a patient in an emergency response environment, according to embodiments of the present invention.
  • FIG. 9 depicts a flow chart illustrating a method for monitoring three-dimensional interaction in an emergency response environment for inventory control, according to embodiments of the present invention.
  • FIG. 10 depicts a flow chart illustrating a method for gesture recognition in an emergency response environment, according to embodiments of the present invention.
  • While the invention is amenable to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and are described in detail below. The intention, however, is not to limit the invention to the particular embodiments described. On the contrary, the invention is intended to cover all modifications, equivalents, and alternatives falling within the scope of the invention as defined by the appended claims.
  • DETAILED DESCRIPTION
  • As illustrated in FIG. 1, a system 100 according to embodiments of the present invention performs advanced data management, integration and presentation of EMS data from multiple different devices. System 100 includes a mobile environment 101, an enterprise environment 102, and an administration environment 103. Devices within the various environments 101, 102, 103 may be communicably coupled via a network 120, such as, for example, the Internet. System 100 is further described in Patent Cooperation Treaty Application Publication No. WO 2011/011454, published on Jan. 27, 2011, which is incorporated herein by reference in its entirety for all purposes.
  • As used herein, the phrase “communicably coupled” is used in its broadest sense to refer to any coupling whereby information may be passed. Thus, for example, communicably coupled includes electrically coupled by, for example, a wire; optically coupled by, for example, an optical cable; and/or wirelessly coupled by, for example, a radio frequency or other transmission media. “Communicably coupled” also includes, for example, indirect coupling, such as through a network, or direct coupling.
  • According to embodiments of the present invention, the mobile environment 101 is an ambulance or other EMS vehicle—for example a vehicular mobile environment (VME). The mobile environment may also be the local network of data entry devices as well as diagnostic and therapeutic devices established at the time of treatment of a patient or patients in the field environment—the "At Scene Patient Mobile Environment" (ASPME). The mobile environment may also be a combination of one or more of VMEs and/or ASPMEs. The mobile environment may include a navigation device 110 used by the driver 112 to track the position of the mobile environment 101, locate the mobile environment 101 and/or the emergency location, and locate the transport destination, according to embodiments of the present invention. The navigation device 110 may include a Global Positioning System ("GPS"), for example. The navigation device 110 may also be configured to perform calculations about vehicle speed, the travel time between locations, and estimated times of arrival. According to embodiments of the present invention, the navigation device 110 is located at the front of the ambulance to assist the driver 112 in navigating the vehicle. The navigation device 110 may be, for example, a RescueNet® Navigator onboard electronic data communication system available from ZOLL Data Systems of Broomfield, Colo.
  • As illustrated in FIG. 1, a patient monitoring device 106 and a patient charting device 108 are also often used for patient care in the mobile environment 101, according to embodiments of the present invention. The EMS technician 114 attaches the patient monitoring device 106 to the patient 116 to monitor the patient 116. The patient monitoring device 106 may be, for example, a defibrillator device with electrodes and/or sensors configured for attachment to the patient 116 to monitor heart rate and/or to generate electrocardiographs (“ECG's”), according to embodiments of the present invention. The patient monitoring device 106 may also include sensors to detect or a processor to derive or calculate other patient conditions. For example, the patient monitoring device 106 may monitor, detect, treat and/or derive or calculate blood pressure, temperature, respiration rate, blood oxygen level, end-tidal carbon dioxide level, pulmonary function, blood glucose level, and/or weight, according to embodiments of the present invention. The patient monitoring device 106 may be a Zoll E-Series® or X-Series defibrillator available from Zoll Medical Corporation of Chelmsford, Mass., according to embodiments of the present invention. A patient monitoring device may also be a patient treatment device, or another kind of device that includes patient monitoring and/or patient treatment capabilities, according to embodiments of the present invention.
  • The patient charting device 108 is a device used by the EMS technician 114 to generate records and/or notes about the patient's 116 condition and/or treatments applied to the patient, according to embodiments of the present invention. For example, the patient charting device 108 may be used to note a dosage of medicine given to the patient 116 at a particular time. The patient charting device 108 and/or patient monitoring device 106 may have a clock, which may be synchronized with an external time source such as a network or a satellite to prevent the EMS technician from having to manually enter a time of treatment or observation (or having to attempt to estimate the time of treatment for charting purposes long after the treatment was administered), according to embodiments of the present invention. The patient charting device 108 may also be used to record biographic and/or demographic and/or historical information about a patient, for example the patient's name, identification number, height, weight, and/or medical history, according to embodiments of the present invention. According to embodiments of the present invention, the patient charting device 108 is a tablet PC, such as for example the TabletPCR component of the RescueNet® ePCR Suite available from Zoll Data Systems of Broomfield, Colo. According to some embodiments of the present invention, the patient charting device 108 is a wristband or smart-phone such as an Apple iPhone or iPad with interactive data entry interface such as a touch screen or voice recognition data entry that may be communicably connected to the VCS 104 and tapped to indicate what was done with the patient 116 and when it was done.
  • The navigation device 110, the charting device 108, and the monitoring device 106 are each separately very useful to the EMS drivers 112 and technicians 114 before, during, and after the patient transport. A vehicle control system ("VCS") 104 receives, organizes, stores, and displays data from each device 106, 108, 110 to further enhance the usefulness of each device 106, 108, 110 and to make it much easier for the EMS technician 114 to perform certain tasks that would normally require the EMS technician 114 to divert visual and manual attention to each device 106, 108, 110 separately, according to embodiments of the present invention. In other words, the VCS centralizes and organizes information that would normally be decentralized and disorganized, according to embodiments of the present invention.
  • The VCS 104 is communicably coupled to the patient monitoring device 106, the patient charting device 108, and the navigation device 110, according to embodiments of the present invention. The VCS 104 is also communicably coupled to a storage medium 118. The VCS 104 may be a touch-screen, flat panel PC, and the storage medium 118 may be located within or external to the VCS 104, according to embodiments of the present invention. The VCS 104 may include a display template serving as a graphical user interface, which permits the user (e.g. EMS tech 114) to select different subsets and/or display modes of the information gathered from and/or sent to devices 106, 108, 110, according to embodiments of the present invention.
  • Some embodiments of the present invention include various steps, some of which may be performed by hardware components or may be embodied in machine-executable instructions. These machine-executable instructions may be used to cause a general-purpose or a special-purpose processor programmed with the instructions to perform the steps. Alternatively, the steps may be performed by a combination of hardware, software, and/or firmware. In addition, some embodiments of the present invention may be performed or implemented, at least in part (e.g., one or more modules), on one or more computer systems, mainframes (e.g., IBM mainframes such as the IBM zSeries, Unisys ClearPath Mainframes, HP Integrity NonStop servers, NEC Express series, and others), or client-server type systems. In addition, specific hardware aspects of embodiments of the present invention may incorporate one or more of these systems, or portions thereof.
  • As such, FIG. 2 is an example of a computer system 200 with which embodiments of the present invention may be utilized. According to the present example, the computer system includes a bus 201, at least one processor 202, at least one communication port 203, a main memory 204, removable storage media 205, a read only memory 206, and a mass storage 207.
  • Processor(s) 202 can be any known processor, such as, but not limited to, an Intel® Itanium® or Itanium 2® processor(s), or AMD® Opteron® or Athlon MP® processor(s), or Motorola® lines of processors. Communication port(s) 203 can be any of an RS-232 port for use with a modem-based dial-up connection, a 10/100 Ethernet port, or a Gigabit port using copper or fiber, for example. Communication port(s) 203 may be chosen depending on a network such as a Local Area Network (LAN), Wide Area Network (WAN), or any other network to which the computer system 200 connects. Main memory 204 can be Random Access Memory (RAM), or any other dynamic storage device(s) commonly known to one of ordinary skill in the art. Read only memory 206 can be any static storage device(s) such as Programmable Read Only Memory (PROM) chips for storing static information such as instructions for processor 202, for example.
  • Mass storage 207 can be used to store information and instructions. For example, hard disks such as the Adaptec® family of SCSI drives, an optical disc, an array of disks such as RAID (e.g. the Adaptec family of RAID drives), or any other mass storage devices may be used. Bus 201 communicably couples processor(s) 202 with the other memory, storage and communication blocks. Bus 201 can be a PCI/PCI-X or SCSI based system bus depending on the storage devices used, for example. Removable storage media 205 can be any kind of external hard drive, floppy drive, flash drive, IOMEGA® Zip Drive, Compact Disc-Read Only Memory (CD-ROM), Compact Disc-Re-Writable (CD-RW), or Digital Video Disk-Read Only Memory (DVD-ROM), for example. The components described above are meant to exemplify some types of possibilities. In no way should the aforementioned examples limit the scope of the invention, as they are only exemplary embodiments.
  • FIG. 3 illustrates an emergency response environment with a system 300 that monitors three-dimensional interaction, according to embodiments of the present invention. System 300 includes a sensor or sensor array 1. Sensor 1 may be a camera, video camera, or other imaging device capable of collecting visual information. According to some embodiments of the present invention, sensor 1 is a sensor array that includes an image capture device, for example a color image capture device, as well as a depth determining device, for example an infrared emitter and infrared depth sensor. Sensor 1 may also include an audio capture device. For example, sensor 1 may be a sensor array such as a Kinect® sensor array available from Microsoft Corporation. Sensor 1 may also or alternatively be a LEAP™ device available from Leap Motion, Inc. Sensor 1 may be, or include, a wide variety of hardware that permits collection of visual, depth, audio, and color information and the like, according to embodiments of the present invention.
  • Sensor 1 may be placed within an emergency response environment, for example in the back 152 of an ambulance 101, such that activities of the patient 116 and/or crew members 2, 3 are at least partially within its field of view. For example, sensor 1 may be mounted on a wall or ceiling of the back compartment 152 of the ambulance 101. The sensor 1 may also include, within its field of view, a patient support 4, such as a bed, cot, or stretcher, upon which a patient 116 is lying and/or being treated. The back 152 of the ambulance 101 may further include a supply cabinet 5, for example a medicine cabinet or narcotics cabinet, which may be stocked with medicines, for example narcotic 6.
  • FIG. 4 illustrates a system including a vehicle control system 104 communicably coupled with a sensor array 1, according to embodiments of the present invention. Sensor array 1 may include an imaging device 9, a depth sensor system 10, and/or an audio input 11, according to embodiments of the present invention. VCS 104 may also be communicably coupled with a patient monitoring device 106, a charting system 108, a navigation system 110, and vehicle operations systems 8. The vehicle operation systems 8 may include sensors and controllers installed in the vehicle relating to vehicle safety and/or operation, including both manufacturer-installed and aftermarket devices, for example vehicle speed sensors, seatbelt detectors, accelerometers, and other vehicle- and safety-related devices, including without limitation those described in U.S. Provisional Patent Application Ser. No. 61/656,527, filed on Jun. 7, 2012, which is incorporated by reference herein in its entirety for all purposes.
  • Vehicle control system 104 may be configured to create, maintain, and/or update an encounter record 7, which may be stored locally in an emergency response environment (for example in database 118) and/or remotely on an enterprise database 130. The encounter record 7 may include information obtained by the vehicle control system 104 and each of the devices to which VCS 104 is communicably coupled. Records in the encounter record 7 may be specific to an encounter with a particular patient 116, and/or a particular dispatch of the vehicle 101, for example.
  • The VCS 104 may be configured to track interactions in the emergency response environment, for example interactions by and among caregivers 2, 3, and patient 116 and/or objects in the emergency response environment. The VCS 104 may be configured to receive color images and depth information from within a field of view of the sensor array 1. The VCS 104 may also be configured to maintain an emergency encounter record 7, locally and/or remotely. The VCS 104 monitors a position of an object and/or movement of the object in the emergency response environment based on the color images and depth information received by the sensor array 1. For example, the sensor array 1 may be a Kinect® sensor array, and the VCS 104 may include software that receives data from the sensor array 1 to detect or approximate movements and locations of human bodies and their respective linkages (skeletal joints and bones) in three-dimensional space.
  • As such, the VCS 104 can distinguish between different humans in the field of view of the sensor 1, and can monitor or observe the movements of two or more of such humans in the field of view. According to some embodiments of the present invention, the VCS 104 is configured to recognize which of the humans is a patient and which is a caregiver. For example, VCS 104 may recognize a human as a patient by observing that the particular human is lying relatively still on the patient support 4, while another human is an EMS technician 2 because the other human is standing up or moving around the back of the ambulance 101. The VCS 104 may be configured to track or monitor three-dimensional movements of one or more humans in the emergency response environment by approximating elements of their basic skeletal structure and, as such, can determine when two humans are in contact or close proximity. For example, the VCS 104 can determine when a hand or arm of the EMS technician 2 reaches over and touches an area of the patient's 116 body, according to embodiments of the present invention.
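To make the proximity test concrete, the following Python sketch is a hypothetical illustration only, not the patented implementation: it assumes skeletal joints are already available as 3-D coordinates (for example from a Kinect-style tracker) and flags when any caregiver joint comes within a small threshold distance of any patient joint. The joint names, coordinates, and threshold are assumptions made for illustration.

```python
import math

def joint_distance(a, b):
    """Euclidean distance between two 3-D joint positions (meters)."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def detect_contact(caregiver_joints, patient_joints, threshold_m=0.05):
    """Return the first (caregiver_joint, patient_joint) pair closer than
    threshold_m, else None; a near-zero threshold approximates touching."""
    for cg_name, cg_pos in caregiver_joints.items():
        for pt_name, pt_pos in patient_joints.items():
            if joint_distance(cg_pos, pt_pos) <= threshold_m:
                return cg_name, pt_name
    return None

# Hypothetical joint data {joint_name: (x, y, z)}: the caregiver's right
# hand is near the patient's head, so a contact is reported.
caregiver = {"right_hand": (0.52, 1.10, 1.98), "left_hand": (0.10, 1.05, 2.00)}
patient = {"head": (0.50, 1.08, 2.00), "torso": (0.50, 0.80, 2.00)}
print(detect_contact(caregiver, patient))  # -> ('right_hand', 'head')
```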
  • Any or all of the information received by the VCS 104 from the sensor array 1, as well as any additional data or information derived from such sensor information, may be stored to the encounter record 7. Such information may also be stored to the encounter record 7 in a manner that correlates it with other data in the encounter record 7 from other devices, for example records in the encounter record 7 may include a time index and/or a patient identification.
  • According to embodiments of the present invention, the VCS 104 is configured to record into the emergency encounter record 7 an occurrence of a condition. Such condition may be based on the position of the object and/or the movement of the object. For example, the object may be a human, and the VCS 104 may monitor the human's movement (or a skeletal approximation thereof) in three-dimensional space, and make an entry in the encounter record 7 when the human or part of the human intersects a certain location (e.g. within the ambulance 101), or remains in a particular location for a certain amount of time, or intersects or nears another object. The VCS 104 may be configured to make an entry to the encounter record 7 when one object (e.g. a human) comes within a certain distance of another object (e.g. another human), for example a zero or minimal distance at which the first object is touching the second object. As such, the VCS 104 may be configured to mark the encounter record 7 when a caregiver 2 or 3 approached the patient 116 and/or touched the patient 116, or when an object approached or touched the patient 116.
  • The VCS 104 may be configured to update the encounter record 7 in various ways based on the observance of a condition based on three-dimensional visual and position data. For example, the VCS 104 may be configured to enter into the encounter record 7 a time at which the condition occurred, and/or an identification of the condition or type of condition that occurred, and/or other data coinciding with the occurrence of the condition, for example video data or color images covering the time or time range when the condition occurred. In some cases, the VCS 104 receives streaming clinical data about a patient 116, for example from a defibrillator or other patient monitoring device 106 communicably coupled to the patient, and correlates at least a portion of the streaming clinical data in the emergency encounter record 7 with the occurrence of the condition based on the sensor's 1 visual data. According to embodiments of the present invention, correlating some or all of the streaming clinical data includes flagging some or all of the streaming clinical data that corresponds to a time of the occurrence of the condition.
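A minimal sketch of the time-based flagging described above is shown below. The sample structure, field names, and 30-second window are assumptions for illustration only, not details taken from the specification.

```python
from datetime import datetime, timedelta

def flag_clinical_data(samples, event_time, window_s=30):
    """Flag every clinical sample whose timestamp falls within window_s
    seconds of the detected condition so it can be correlated with the
    occurrence in the encounter record. Each sample is a dict with a
    'time' key."""
    window = timedelta(seconds=window_s)
    for sample in samples:
        if abs(sample["time"] - event_time) <= window:
            sample["flagged_event"] = "condition_occurrence"
    return samples

# Example: a condition detected at 14:03:10 flags only the nearby sample.
event = datetime(2013, 9, 27, 14, 3, 10)
stream = [
    {"time": datetime(2013, 9, 27, 14, 3, 5), "hr_bpm": 92},
    {"time": datetime(2013, 9, 27, 14, 5, 0), "hr_bpm": 95},
]
print(flag_clinical_data(stream, event))
```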
  • FIG. 7 illustrates a flow chart 700 showing the recording of an occurrence of a condition based on three-dimensional position and shape visual data, according to embodiments of the present invention. One or more distinct objects are identified (block 702), for example by VCS 104 and sensor 1. The position and/or movement of the one or more objects are tracked or otherwise monitored or modeled (block 704), and based on such tracking the VCS 104 identifies the occurrence of a condition (block 706). The occurrence of the condition, or information about the condition, is recorded in the patient encounter record 7 (block 708).
  • FIG. 8 illustrates a flow chart 800 describing a similar method in greater detail, according to embodiments of the present invention. An individual human or distinct humans are identified in an emergency response environment, for example the back of an ambulance (block 802). At least one of the humans is identified as a patient (block 804). The position and/or movement of the one or more humans is observed or tracked or otherwise modeled (block 806), and based thereon the VCS 104 identifies the occurrence of a condition, for example the occurrence of patient treatment (block 808). Information about the patient contact may be recorded in the encounter record 7 (block 810), for example by recording a time or time range at which the condition (e.g. treatment) occurred (block 812), and/or by recording a type of contact (e.g. treatment) which occurred (block 814).
  • For example, if the sensor 1 data supplied to the VCS 104 is interpreted by the VCS 104 as a caregiver's 2 hand going to the head or mouth area of the patient 116, the VCS 104 may update the encounter record 7 to reflect that an oral medication was or may have been administered to the patient 116, and the particular time at which this occurred. Alternatively, or in addition, the VCS 104 may be configured to prompt the EMS technician 2 or other caregiver at a later time, for example after the emergency encounter or at the end of a standard shift, to confirm or validate the perceived interactions or conditions that were entered into the patient encounter record 7. For example, the VCS 104 might observe the occurrence of the EMS technician's 2 hand going to the face of the patient 116 and flag such occurrence as the possible administration of an oral medication, but when prompting the EMS technician 2 for later confirmation, may give the EMS technician 2 the ability to edit the observation to reflect that the interaction was instead a turning of the head of the patient, or some other reason why the caregiver 2 contacted the patient 116.
  • FIG. 9 depicts a flow chart 900 illustrating a method for monitoring three-dimensional interaction in an emergency response environment for inventory control, according to embodiments of the present invention. The VCS 104 may identify a particular location within the emergency response environment, for example a supply cabinet 5, using sensor 1 and known information about the environment (block 902). The VCS 104 may also be configured for customization regarding the locations of certain items in the emergency response environment. For example, during an initialization and/or configuration protocol, the VCS 104 may prompt the user to run the user's finger or hand around an outer perimeter of a supply cabinet 5 and/or a door thereto, so that the VCS 104 can log the three-dimensional position of the supply cabinet 5. Such cabinet 5 may be, for example, a narcotics cabinet 5 to which access is often controlled for safety and security reasons.
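The initialization protocol described above might, for example, reduce the traced fingertip positions to an axis-aligned bounding volume. The following Python sketch is hypothetical; the margin value, data layout, and function name are assumptions rather than details from the specification.

```python
def bounding_volume(points, margin_m=0.02):
    """Compute an axis-aligned bounding volume from 3-D fingertip samples
    collected while the user traces the cabinet opening; the margin pads
    the volume slightly to absorb tracking noise."""
    xs, ys, zs = zip(*points)
    return {
        "min": (min(xs) - margin_m, min(ys) - margin_m, min(zs) - margin_m),
        "max": (max(xs) + margin_m, max(ys) + margin_m, max(zs) + margin_m),
    }

# Example: four corners traced around a cabinet door (coordinates in meters).
trace = [(0.9, 1.2, 0.4), (1.3, 1.2, 0.4), (1.3, 1.6, 0.42), (0.9, 1.6, 0.42)]
print(bounding_volume(trace))
```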
  • The VCS 104 may identify individual humans in the emergency response environment, for example the back of an ambulance (block 904), and track the position and/or movement of such humans (block 906). This may be done with visual and depth information received from the sensor array 1, according to embodiments of the present invention. Based on such visual and depth information received from the sensor array 1, the VCS 104 may also detect or track three-dimensional movement of an object in the emergency response environment, for example a non-human object. The VCS 104 may determine an occurrence of contact between the human body and the object (block 908), for example an occurrence of the human body or a portion thereof approaching and/or intersecting the narcotics cabinet 5. The VCS 104 may also record an entry in an emergency encounter record 7 based on the occurrence of the contact, for example a note that the cabinet 5 was accessed (block 910) along with a time (block 912) and/or an identity of the person who accessed the cabinet 5 (block 914). The VCS 104 may be configured to observe the occurrence of various different types of conditions of note. For example, the VCS 104 may be configured to detect an intersection of a human form with the area of the door or opening to the cabinet 5. The VCS 104 may be configured to detect that a shape that correlates to the shape of the narcotic medication 6 has gone from inside such area of the door or cabinet opening to outside such area. VCS 104 may also be configured to note whether a human has an object in the human's hand as well as the shape and/or size of the object. The VCS 104 may further be configured to update an inventory database, based on the occurrence of the removal, to reflect that the narcotic medication has been used and needs restocking. Similar processes may be used to track the use of other objects and the inventory associated therewith, as well as to track in general the intersection of objects with humans and use thereby, according to embodiments of the present invention. According to some embodiments of the present invention, the occurrence of an access event to the particular cabinet 5 may further trigger other information gathering, for example it may trigger a camera on the inside of the cabinet 5 and/or another video camera elsewhere in the vehicle 101. The identity of each crew member accessing the cabinet 5 may be recorded in the encounter record 7, according to embodiments of the present invention.
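A simplified sketch of the cabinet-access and removal logic described above follows. It assumes an axis-aligned bounding volume for the cabinet 5 and a plain dictionary inventory; the function names, record fields, and item identifiers are illustrative only and do not reflect an actual VCS 104 implementation.

```python
from datetime import datetime

def inside(point, volume):
    """True if a 3-D point lies inside an axis-aligned bounding volume."""
    return all(volume["min"][i] <= point[i] <= volume["max"][i] for i in range(3))

def log_cabinet_access(hand_position, volume, crew_id, encounter_record):
    """Record a cabinet-access entry when a tracked hand intersects the
    cabinet volume (compare blocks 908-914 of FIG. 9)."""
    if inside(hand_position, volume):
        encounter_record.append({"event": "cabinet_access",
                                 "crew": crew_id,
                                 "time": datetime.now()})

def log_removal(medication_id, inventory, encounter_record):
    """Decrement the count and mark the item for restocking when a shape
    matching the medication is seen leaving the cabinet volume."""
    inventory[medication_id]["count"] -= 1
    inventory[medication_id]["restock"] = True
    encounter_record.append({"event": "medication_removed", "item": medication_id})

# Example with a hypothetical cabinet volume and a single inventory entry.
cabinet_volume = {"min": (0.88, 1.18, 0.38), "max": (1.32, 1.62, 0.44)}
record, inventory = [], {"narcotic_6": {"count": 2, "restock": False}}
log_cabinet_access((1.0, 1.3, 0.41), cabinet_volume, "medic_2", record)
log_removal("narcotic_6", inventory, record)
print(record, inventory)
```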
  • FIG. 10 depicts a flow chart 1000 illustrating a method for gesture recognition in an emergency response environment, according to embodiments of the present invention. While system 400, including VCS 104 and sensors 1, may be configured to track motions, positions, and interactions of humans and objects as described above, system 400 as well as VCS 104 and sensors 1 may also or alternatively be configured to monitor such visual information for the occurrence of gestures. In some cases, three-dimensional position and visual information may be used to monitor for gestures; in other cases, mere visual information may be used to detect gestures (e.g. based on pattern recognition or other visual cues or patterns). As such, sensor 1 may be one of a number of various types of sensors or sensor arrays.
  • VCS 104 may be configured to track an entire human body and/or one or more portions thereof to identify gestures being made, for example gestures being made by one or more hands and/or fingers or by the head and/or neck (block 1002). VCS 104 receives visual information about at least a portion of a human body from at least one sensor 1, and maintains the encounter record 7. The VCS 104 is configured to monitor the visual information to determine movements of the at least the portion of the human body (for example the hand or the head), and to recognize an occurrence of a gesture based on the movements of the at least the portion of the human body. For example, the VCS 104 recognizes one or more hand or finger gestures based on visual and/or depth information received by sensor 1, for example one or more hand or finger gestures listed in FIG. 5. The VCS 104 may also recognize one or more head or facial gestures based on visual and/or depth information received by sensor 1, for example one or more head or facial gestures listed in FIG. 6.
  • Examples of hand or finger gestures may include waving a hand or finger, making a fist, raising the fist, shaking the fist, making the "thumbs up" signal, spreading fingers apart, displaying a count (e.g. zero, one, two, three, four, five, six, seven, eight, nine, or ten digits extended), pointing, moving hands together, pulling hands apart, and/or tapping on the wrist. Examples of head or facial gestures may include nodding the head, bobbing the head, shaking the head side-to-side as in the "no" gesture, shaking the head up and down as in the "yes" gesture, blinking, opening or closing the mouth, sticking the tongue out, raising or lowering eyebrows, and/or opening or closing the eyes.
  • When the VCS 104 recognizes a gesture, the VCS 104 records an entry in the emergency encounter record 7 based on the occurrence of the gesture (block 1004). Such a gesture may be artificial or natural. An artificial gesture is a gesture made by a human for the primary purpose of triggering the condition with the VCS 104; as such, an artificial gesture may be a gesture that would not otherwise be made in the normal course of treating a patient 116 in an emergency response environment. Making a "thumbs up" signal is one example of an artificial gesture. A patient's head bobbing involuntarily is an example of a natural gesture, that is, a gesture which is not performed only to trigger the VCS 104.
  • The entry which the VCS 104 makes in the patient encounter record 7 based on the recognition of the gesture may include information about the type of gesture made (block 1006), information about the time at which the gesture was made (block 1008), and/or information about other data values at the time the gesture was made (block 1010), for example information about the crew (block 1012), patient clinical data (block 1014), and vehicle operation or safety conditions (block 1016). For example, VCS 104 may be configured to write the patient's 116 current blood pressure reading to the encounter record 7 whenever VCS 104 receives visual and/or depth information from the sensor 1 indicating that the caregiver 2 attending to the patient 116 taps his or her left wrist with the right hand or fingers (tapping the location where a watch would normally be worn). Successive gestures may be used to take the VCS 104 down various pathways and/or treatment protocols, or to confirm previous gestures or options that become available because of those gestures. For example, the VCS 104 may be configured to record a blood pressure reading to the encounter record 7 when it identifies the wrist tapping gesture followed by a chest tapping gesture, and may be configured to record an ECG waveform signal to the encounter record 7 when it identifies the same wrist tapping gesture followed by a back-of-the-neck tapping gesture. The VCS 104 may also be configured to record in the encounter record 7 the audiovisual (e.g. video and/or audio) information received during or within a certain time range of the gesture, according to embodiments of the present invention.
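One way such gesture sequences could be mapped to chart entries is sketched below in Python. The gesture labels, vitals dictionary, and record format are hypothetical stand-ins, not the mapping actually used by the VCS 104.

```python
from datetime import datetime

# Hypothetical mapping from a two-gesture sequence to the clinical value
# that should be written into the encounter record.
GESTURE_SEQUENCES = {
    ("wrist_tap", "chest_tap"): "blood_pressure",
    ("wrist_tap", "neck_tap"): "ecg_waveform",
}

def handle_gesture_sequence(first, second, vitals, encounter_record):
    """Write the vital sign selected by a recognized gesture sequence,
    stamped with the recognition time (compare blocks 1004-1014 of FIG. 10)."""
    value_key = GESTURE_SEQUENCES.get((first, second))
    if value_key is not None:
        encounter_record.append({"gesture_sequence": (first, second),
                                 "value_type": value_key,
                                 "value": vitals.get(value_key),
                                 "time": datetime.now()})

record = []
vitals = {"blood_pressure": "118/76", "ecg_waveform": "<waveform excerpt>"}
handle_gesture_sequence("wrist_tap", "chest_tap", vitals, record)
print(record)
```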
  • According to some embodiments of the present invention, the VCS 104 is configured to identify simultaneous occurrence of gestures, for example two or more gestures selected from FIG. 5, FIG. 6, or any other natural or artificial gestures. According to some embodiments of the present invention, the VCS 104 is configured to identify simultaneous occurrence of gestures along with position and/or movement information for entire human bodies or portions thereof, or simultaneous occurrence of other factors such as vehicle position along the ambulance route, patient vital signs, and/or vehicle speed. VCS 104 may also be configured to identify simultaneous occurrence of gestures by the same person, for example a different or similar gesture with each hand, or a hand and a head. For example, the VCS 104 may be configured to recognize a hand waving gesture and to make a record in the encounter record 7 and notify the ambulance driver to slow down if the hand waving gesture is received at a time when the vehicle speed exceeds 60 miles per hour. In this way, the visually recognized gestures may be paired or correlated or combined with other information received by VCS 104, either in the creation of the condition which triggers a further event (such as writing to the encounter record 7 or creating a notification or some other action), or in the creation of the entry to the encounter record 7 itself (for example the types of information that would be flagged or gathered or otherwise noted upon occurrence of the condition).
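The hand-waving/vehicle-speed example might be expressed roughly as follows. The speed threshold, field names, and notification callback are assumptions made for illustration, not a definitive implementation.

```python
def evaluate_speed_gesture(gesture, vehicle_speed_mph, encounter_record, notify_driver):
    """Combine a recognized gesture with vehicle telemetry: if a hand-waving
    gesture occurs while the vehicle exceeds 60 mph, record the occurrence
    and notify the driver to slow down."""
    if gesture == "hand_wave" and vehicle_speed_mph > 60:
        encounter_record.append({"event": "slow_down_requested",
                                 "speed_mph": vehicle_speed_mph})
        notify_driver("Crew requested reduced speed")

record = []
evaluate_speed_gesture("hand_wave", 68, record, notify_driver=print)
print(record)
```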
  • According to some embodiments, the VCS 104 identifies (either in the encounter record 7 or for other devices) whether a patient is being transported by the vehicle 101, for example by determining whether a human figure is sitting or lying on the patient support 4. The VCS 104 may also identify the position of a patient or a crew member, for example whether the patient or crew member is sitting or standing. The VCS 104 may also receive from sensor 1 information about structures beyond a normal emergency response environment, for example larger-scale depth images of emergency incidents such as buildings on fire, to aid in the location of emergency workers and/or victims.
  • Although one sensor 1 is shown and described, multiple sensors 1, either of the same type or of different types, may be communicably coupled with VCS 104. Multiple sensors 1 may be used to expand the field or depth of view, or to collect similar information from a different viewing angle, in order to observe more objects or humans, or gather more detailed information about shapes and/or movements. Although sensor 1 is described as being mounted within a vehicle, sensor 1 or multiples thereof may alternatively be mounted on a device (for example a defibrillator taken to an emergency response scene) and/or on a person (for example on a crew member's helmet).
  • Embodiments of the present invention may also be used for charting and/or counting functions. Often, medics must reconstruct past events that occurred during patient treatment. Embodiments of the present invention improve accuracy and help to accurately document times at which various events occurred. For example, the VCS 104 may recognize boundaries of multiple cabinets or storage areas within an ambulance 101, and may log the times at which each storage area was accessed by a medic, as well as the identity (e.g. obtained from voice or body or facial recognition) of the medic who accessed the area. Such a “bounding volume” may be preprogrammed into VCS 104 and/or customized or initialized upon installation of VCS 104, sensor 1, and/or a new storage area. The VCS 104 may count a number of boxes on the floor of the ambulance to determine a number of items used in the encounter, and reconcile that with the medications and other durable goods charted for the patient encounter. The VCS 104 may then prompt the medic for additional information to help reconcile the encounter record 7.
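A possible reconciliation step is sketched below, under the assumption that both the visual count and the charted quantities are available as simple item-to-count mappings; the item names are hypothetical.

```python
def reconcile_counts(counted_items, charted_items):
    """Compare items counted visually (e.g. empty boxes on the ambulance
    floor) with items charted for the encounter and return discrepancies
    the medic should be prompted to resolve."""
    discrepancies = {}
    for item, counted in counted_items.items():
        charted = charted_items.get(item, 0)
        if counted != charted:
            discrepancies[item] = {"counted": counted, "charted": charted}
    return discrepancies

counted = {"saline_500ml": 3, "epinephrine_1mg": 1}
charted = {"saline_500ml": 2, "epinephrine_1mg": 1}
print(reconcile_counts(counted, charted))  # -> saline discrepancy to resolve
```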
  • As described above, the system 400 may also determine when a patient is being touched, either by another human or by an implement held by another human. This information may be used either during the patient encounter, or afterward, to determine whether inappropriate patient contact has occurred. The system 400 may determine when an IV is being started. System 400 may also use gesture-based charting, for example quick-logging with artificial gestures, to save time over manual entry or typing of such information. Embodiments of the present invention may also include voice recognition, which may filter out siren sounds or road sounds, and which may also provide feedback to the crew. Embodiments of the present invention may also be configured to identify crew members, for example through facial recognition, pattern recognition, name badge reading, skeletal modeling, habits or movements, or via another mechanism such as crew logins or RFID badges which are also communicably coupled to VCS 104. According to some embodiments of the present invention, the system 400 may be used for security monitoring, to detect the presence of unidentified or unwanted intruders in the vehicle 101.
  • According to some embodiments of the present invention, the system 400 may be used to begin tracking a person when the person makes a gesture or performs a certain activity, and then continue to track the same person after the gesture or activity, for a certain period of time or until another event occurs, for example another visual event. In some embodiments, the system 400 identifies an operator of a medical device using visual information; for example, a patient monitoring device 106, such as a defibrillator, may include a camera or other type of sensor array 1, and upon use of the device 106 the device 106 may observe visual characteristics of the person directly in front of the device 106 in order to identify the person or monitor or interpret activities of that person. The system 400 may also be configured to recognize or identify in its field of view equipment used by medical personnel, either by visual cues or otherwise, and may perform similar medical personnel identification or visual monitoring even when the camera or sensor array 1 is not in or near the device being used. Such multiple devices used by medical personnel may be wirelessly or otherwise communicably coupled with each other and/or with system 400, so that activities performed on various devices and by the personnel are correlated for a more complete patient record without requiring manual annotation, according to embodiments of the present invention. The system 400 may be mounted not only in a vehicle, such as the back of an ambulance, but system 400 and/or parts thereof may also be integrated into or mounted on a medical device, including a portable medical device such as a defibrillator.
  • The system 400 may also be configured to “remember” a person based on that person's gestures; for example, the system 400 may observe certain gestures performed by a person one day after the person identifies himself or herself to the system 400, and may then visually identify the same person the next day based on observing similar gestures, even if the person has not specifically identified himself or herself to the system 400 on the following occasion. The system 400 may also be configured to count the number of distinct individual people in a given area, according to embodiments of the present invention.
  • The system 400 may also be configured to monitor certain activities and to interpret various aspects of those activities, and even to provide feedback to the performer of the activities either in real time or in a later review. For example, the system 400 may monitor an EMS technician's twelve-lead placement on a patient, and/or may provide adaptive feedback, for example adaptive feedback to a person who is administering cardiopulmonary resuscitation. The system 400 may also be configured to identify a certain portion of the body, or an object held by a person, and to track the movement of the body part or object and record the tracked motion as writing. For example, an EMS technician could write numbers, letters, or words in the air using a finger, and the system 400 may be configured to record such movement as writing. The EMS technician may initiate such “air writing” recording mode with a gesture or other activation; in other embodiments, the system 400 automatically recognizes such “air writing” based on the absence of other objects with which the user's hand or finger could be interacting, for example for a certain period of time. Such recording capabilities may save the EMS technician time in data entry or patient charting, and would permit the medical professional to create charting entries and other writings even when the medical professional's hands are dirty, or when the medical professional does not wish to physically touch devices so as to maintain sterility for hands or gloved hands, according to embodiments of the present invention.
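As a rough illustration of how an "air writing" stroke might be captured once the hand has been idle with respect to other objects, consider the following sketch. The idle threshold, sample format, and 2-D projection are assumptions for illustration and are not prescribed by the specification.

```python
def capture_air_writing(fingertip_samples, idle_since_s, min_idle_s=2.0):
    """Collect a fingertip trajectory as an 'air writing' stroke once the
    hand has been free of nearby objects for min_idle_s seconds. Each
    sample is (timestamp_s, x, y, z); the stroke is projected to 2-D so a
    handwriting recognizer could consume it."""
    stroke_start = idle_since_s + min_idle_s
    return [(x, y) for t, x, y, z in fingertip_samples if t >= stroke_start]

samples = [(0.0, 0.10, 0.20, 1.0), (2.5, 0.12, 0.25, 1.0), (2.6, 0.15, 0.30, 1.0)]
print(capture_air_writing(samples, idle_since_s=0.0))  # -> [(0.12, 0.25), (0.15, 0.3)]
```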
  • Various modifications and additions can be made to the exemplary embodiments discussed without departing from the scope of the present invention. For example, while the embodiments described above refer to particular features, the scope of this invention also includes embodiments having different combinations of features and embodiments that do not include all of the described features. Accordingly, the scope of the present invention is intended to embrace all such alternatives, modifications, and variations as fall within the scope of the claims, together with all equivalents thereof.

Claims (16)

What is claimed is:
1. A method for tracking interactions in an emergency response environment, the method comprising:
receiving color images and depth information from within a field of view of a sensor array;
maintaining an emergency encounter record;
monitoring one or both of a position of an object and movement of the object in the emergency response environment based on the color images and depth information received by the sensor array; and
recording an occurrence of a condition in the emergency encounter record, wherein the condition is based on the one or both of the position of the object and the movement of the object.
2. The method of claim 1, wherein the object is a human, and wherein monitoring one or both of the position of the object and movement of the object comprises monitoring one or both of the position of the human and movement of an at least partial skeletal approximation of the human.
3. The method of claim 2, wherein the human is a first object, wherein the condition comprises the at least partial skeletal approximation of the human coming within a certain distance of a second object.
4. The method of claim 3, wherein the human is a first human, and wherein the second object is a second human.
5. The method of claim 4, wherein the condition comprises the first human touching the second human.
6. The method of claim 5, wherein the second human is a patient being treated by the first human in the emergency response environment.
7. The method of claim 1, wherein recording the occurrence of the condition comprises recording a time at which the condition occurs.
8. The method of claim 7, wherein recording the occurrence of the condition further comprises recording a type of the condition.
9. The method of claim 7, wherein recording the occurrence of the condition further comprises recording as video footage the color images received during the occurrence of the condition.
10. The method of claim 1, further comprising:
receiving streaming clinical data about a patient, and
correlating at least a portion of the streaming clinical data in the emergency encounter record with the occurrence of the condition.
11. The method of claim 10, wherein correlating the at least the portion of the streaming clinical data comprises flagging the at least the portion of the streaming clinical data corresponding to a time of the occurrence of the condition.
12. A system for tracking interactions in an emergency response environment, the system comprising:
a sensor array, wherein the sensor array is adapted to receive color images and depth information in its field of view;
a control system communicably coupled to the sensor array, the control system configured to:
maintain an emergency encounter record;
monitor one or both of position and movement of an object in the emergency response environment based on the color images and depth information received from the sensor array; and
record an occurrence of a condition in the emergency encounter record, wherein the condition is based on the one or both of position and movement of the object.
13. A method for inventory control in an emergency response environment, the method comprising:
detecting three-dimensional movement of a human body in the emergency response environment with a sensor array, wherein the sensor array generates visual information and depth information about the emergency response environment;
detecting three-dimensional movement of an object in the emergency response environment;
determining an occurrence of contact between the human body and the object; and
recording an entry in an emergency encounter record based on the occurrence of the contact.
14. The method of claim 13, wherein the object is a narcotic medication stored in an enclosure in the emergency response environment, the method further comprising:
determining, based on the detection of the three-dimensional movement of the human body and the object, an occurrence of intersection of the human body with the enclosure; and
recording an entry in the emergency encounter record based on the occurrence of the intersection.
15. The method of claim 13, wherein the object is a narcotic medication stored in an enclosure in the emergency response environment, the method further comprising:
determining, based on the detection of the three-dimensional movement of the object, an occurrence of removal of the narcotic medication from the enclosure; and
recording an entry in the emergency encounter record based on the occurrence of the removal.
16. The method of claim 15, further comprising updating an inventory database, based on the occurrence of the removal, to reflect that the narcotic medication has been used and needs restocking.
US14/040,159 2012-09-28 2013-09-27 Systems and methods for three-dimensional interaction monitoring in an ems environment Abandoned US20140093135A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/040,159 US20140093135A1 (en) 2012-09-28 2013-09-27 Systems and methods for three-dimensional interaction monitoring in an ems environment

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201261707665P 2012-09-28 2012-09-28
US201261707671P 2012-09-28 2012-09-28
US14/040,159 US20140093135A1 (en) 2012-09-28 2013-09-27 Systems and methods for three-dimensional interaction monitoring in an ems environment

Publications (1)

Publication Number Publication Date
US20140093135A1 true US20140093135A1 (en) 2014-04-03

Family

ID=50385259

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/040,159 Abandoned US20140093135A1 (en) 2012-09-28 2013-09-27 Systems and methods for three-dimensional interaction monitoring in an ems environment
US14/040,147 Active 2033-10-22 US9911166B2 (en) 2012-09-28 2013-09-27 Systems and methods for three-dimensional interaction monitoring in an EMS environment

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/040,147 Active 2033-10-22 US9911166B2 (en) 2012-09-28 2013-09-27 Systems and methods for three-dimensional interaction monitoring in an EMS environment

Country Status (5)

Country Link
US (2) US20140093135A1 (en)
EP (1) EP2901368A4 (en)
JP (1) JP2015533248A (en)
CN (1) CN104995638A (en)
WO (1) WO2014052802A2 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017006529A1 (en) * 2017-07-11 2019-01-17 Drägerwerk AG & Co. KGaA A method, apparatus and computer program for capturing optical image data of a patient environment and for detecting a patient examination
WO2019032812A1 (en) * 2017-08-10 2019-02-14 Nuance Communications, Inc. Automated clinical documentation system and method
US10809970B2 (en) 2018-03-05 2020-10-20 Nuance Communications, Inc. Automated clinical documentation system and method
US11043207B2 (en) 2019-06-14 2021-06-22 Nuance Communications, Inc. System and method for array data simulation and customized acoustic modeling for ambient ASR
US11179293B2 (en) 2017-07-28 2021-11-23 Stryker Corporation Patient support system with chest compression system and harness assembly with sensor system
US11216480B2 (en) 2019-06-14 2022-01-04 Nuance Communications, Inc. System and method for querying data points from graph data structures
US11222103B1 (en) 2020-10-29 2022-01-11 Nuance Communications, Inc. Ambient cooperative intelligence system and method
US11222716B2 (en) 2018-03-05 2022-01-11 Nuance Communications System and method for review of automated clinical documentation from recorded audio
US11227679B2 (en) 2019-06-14 2022-01-18 Nuance Communications, Inc. Ambient clinical intelligence system and method
US11316865B2 (en) 2017-08-10 2022-04-26 Nuance Communications, Inc. Ambient cooperative intelligence system and method
US11515020B2 (en) 2018-03-05 2022-11-29 Nuance Communications, Inc. Automated clinical documentation system and method
US11531807B2 (en) 2019-06-28 2022-12-20 Nuance Communications, Inc. System and method for customized text macros
US11670408B2 (en) 2019-09-30 2023-06-06 Nuance Communications, Inc. System and method for review of automated clinical documentation

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110172550A1 (en) 2009-07-21 2011-07-14 Michael Scott Martin Uspa: systems and methods for ems device communication interface
EP4053760A1 (en) 2010-04-09 2022-09-07 Zoll Medical Corporation Systems and methods for ems device communications interface
WO2014052802A2 (en) * 2012-09-28 2014-04-03 Zoll Medical Corporation Systems and methods for three-dimensional interaction monitoring in an ems environment
EP2770452A1 (en) * 2013-02-22 2014-08-27 Samsung Electronics Co., Ltd. Method and system for transmitting result of examination of specimen from medical device to destination through mobile device
US10404784B2 (en) * 2013-02-22 2019-09-03 Samsung Electronics Co., Ltd. Method and system for transmitting result of examination of specimen from medical device to destination
JP6115335B2 (en) * 2013-06-10 2017-04-19 ノーリツプレシジョン株式会社 Information processing apparatus, information processing method, and program
US20150346932A1 (en) * 2014-06-03 2015-12-03 Praveen Nuthulapati Methods and systems for snapshotting events with mobile devices
US11493348B2 (en) 2017-06-23 2022-11-08 Direct Current Capital LLC Methods for executing autonomous rideshare requests
US11106927B2 (en) * 2017-12-27 2021-08-31 Direct Current Capital LLC Method for monitoring an interior state of an autonomous vehicle
US11819369B2 (en) 2018-03-15 2023-11-21 Zoll Medical Corporation Augmented reality device for providing feedback to an acute care provider

Citations (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5850352A (en) * 1995-03-31 1998-12-15 The Regents Of The University Of California Immersive video, including video hypermosaicing to generate from multiple video views of a scene a three-dimensional video mosaic from which diverse virtual video scene images are synthesized, including panoramic, scene interactive and stereoscopic images
US6115052A (en) * 1998-02-12 2000-09-05 Mitsubishi Electric Information Technology Center America, Inc. (Ita) System for reconstructing the 3-dimensional motions of a human figure from a monocularly-viewed image sequence
US6236737B1 (en) * 1997-03-26 2001-05-22 Dalhousie University Dynamic target addressing system
US20030058111A1 (en) * 2001-09-27 2003-03-27 Koninklijke Philips Electronics N.V. Computer vision based elderly care monitoring system
US6741977B1 (en) * 1999-01-29 2004-05-25 Hitachi, Ltd. Image recording/reproducing apparatus in monitor system
US7382895B2 (en) * 2002-04-08 2008-06-03 Newton Security, Inc. Tailgating and reverse entry detection, alarm, recording and prevention using machine vision
US7539532B2 (en) * 2006-05-12 2009-05-26 Bao Tran Cuffless blood pressure monitoring appliance
US20090195382A1 (en) * 2008-01-31 2009-08-06 Sensormatic Electronics Corporation Video sensor and alarm system and method with object and event classification
US20090237247A1 (en) * 2007-06-08 2009-09-24 Brunetti Sam F Remote area monitoring system
US20090259113A1 (en) * 2007-11-08 2009-10-15 General Electric Company System and method for determining pain level
US20110007139A1 (en) * 2007-06-08 2011-01-13 Brunetti Sam F Method and system for administering remote area monitoring system
US7961910B2 (en) * 2009-10-07 2011-06-14 Microsoft Corporation Systems and methods for tracking a model
US8009867B2 (en) * 2009-01-30 2011-08-30 Microsoft Corporation Body scan
US20120154582A1 (en) * 2010-09-14 2012-06-21 General Electric Company System and method for protocol adherence
US8213680B2 (en) * 2010-03-19 2012-07-03 Microsoft Corporation Proxy training data for human body tracking
US20120212582A1 (en) * 2011-02-22 2012-08-23 Richard Deutsch Systems and methods for monitoring caregiver and patient protocol compliance
US8284157B2 (en) * 2010-01-15 2012-10-09 Microsoft Corporation Directed performance in motion capture system
US8290249B2 (en) * 2009-05-01 2012-10-16 Microsoft Corporation Systems and methods for detecting a tilt angle from a depth image
US8295542B2 (en) * 2007-01-12 2012-10-23 International Business Machines Corporation Adjusting a consumer experience based on a 3D captured image stream of a consumer response
US8334842B2 (en) * 2010-01-15 2012-12-18 Microsoft Corporation Recognizing user intent in motion capture system
US8351652B2 (en) * 2009-05-29 2013-01-08 Microsoft Corporation Systems and methods for tracking a model
US8374423B2 (en) * 2009-12-18 2013-02-12 Microsoft Corporation Motion detection using depth images
US8379101B2 (en) * 2009-05-29 2013-02-19 Microsoft Corporation Environment and/or target segmentation
US8385596B2 (en) * 2010-12-21 2013-02-26 Microsoft Corporation First person shooter control with virtual skeleton
US8457353B2 (en) * 2010-05-18 2013-06-04 Microsoft Corporation Gestures and gesture modifiers for manipulating a user-interface
US20130173300A1 (en) * 2011-12-30 2013-07-04 Elwha Llc Evidence-based healthcare information management protocols
US8565479B2 (en) * 2009-08-13 2013-10-22 Primesense Ltd. Extraction of skeletons from 3D maps
US8570373B2 (en) * 2007-06-08 2013-10-29 Cisco Technology, Inc. Tracking an object utilizing location information associated with a wireless device
US8652038B2 (en) * 2006-05-12 2014-02-18 Bao Tran Health monitoring appliance
US20140132728A1 (en) * 2012-11-12 2014-05-15 Shopperception, Inc. Methods and systems for measuring human interaction
US20150302539A1 (en) * 2014-04-16 2015-10-22 Vios Medical Singapore PTE LTD Patient care and health information management systems and methods

Family Cites Families (79)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5534917A (en) 1991-05-09 1996-07-09 Very Vivid, Inc. Video image based control system
US6409660B1 (en) * 1996-09-19 2002-06-25 Ortivus Ab Portable telemedicine device
US6347299B1 (en) * 1997-07-31 2002-02-12 Ncr Corporation System for navigation and editing of electronic records through speech and audio
US6950534B2 (en) 1998-08-10 2005-09-27 Cybernet Systems Corporation Gesture-controlled interfaces for self-service machines and other applications
US7227526B2 (en) 2000-07-24 2007-06-05 Gesturetek, Inc. Video-based image control system
US7058204B2 (en) 2000-10-03 2006-06-06 Gesturetek, Inc. Multiple camera control system
US7095401B2 (en) 2000-11-02 2006-08-22 Siemens Corporate Research, Inc. System and method for gesture interface
US7143044B2 (en) 2000-12-29 2006-11-28 International Business Machines Corporation Translator for infants and toddlers
US6968294B2 (en) 2001-03-15 2005-11-22 Koninklijke Philips Electronics N.V. Automatic system for monitoring person requiring care and his/her caretaker
JP3969974B2 (en) * 2001-07-17 2007-09-05 株式会社山武 Patient transporter and stretcher for patient transport
US7110569B2 (en) 2001-09-27 2006-09-19 Koninklijke Philips Electronics N.V. Video based detection of fall-down and other events
US7340077B2 (en) * 2002-02-15 2008-03-04 Canesta, Inc. Gesture recognition system using depth perceptive sensors
US7369685B2 (en) 2002-04-05 2008-05-06 Identix Corporation Vision-based operating method and system
AU2003234910B2 (en) 2002-05-07 2008-07-17 Kyoto University Medical cockpit system
US7225131B1 (en) 2002-06-14 2007-05-29 At&T Corp. System and method for accessing and annotating electronic medical records using multi-modal interface
AU2003245758A1 (en) 2002-06-21 2004-01-06 Cedara Software Corp. Computer assisted system and method for minimal invasive hip, uni knee and total knee replacement
US6984208B2 (en) 2002-08-01 2006-01-10 The Hong Kong Polytechnic University Method and apparatus for sensing body gesture, posture and movement
DE60215504T2 (en) 2002-10-07 2007-09-06 Sony France S.A. Method and apparatus for analyzing gestures of a human, e.g. for controlling a machine by gestures
EP1627272B2 (en) 2003-02-04 2017-03-08 Mako Surgical Corp. Interactive computer-assisted surgery system and method
US20050267354A1 (en) 2003-02-04 2005-12-01 Joel Marquart System and method for providing computer assistance with spinal fixation procedures
US8745541B2 (en) 2003-03-25 2014-06-03 Microsoft Corporation Architecture for controlling a computer using hand gestures
DE10334073A1 (en) 2003-07-25 2005-02-10 Siemens Ag Medical technical control system
US7212109B2 (en) 2004-02-13 2007-05-01 Ge Medical Systems Global Technology Company, Llc Hygienic input device for medical information systems
US20050255434A1 (en) 2004-02-27 2005-11-17 University Of Florida Research Foundation, Inc. Interactive virtual characters for training including medical diagnosis training
CN100573548C (en) 2004-04-15 2009-12-23 格斯图尔泰克股份有限公司 The method and apparatus of tracking bimanual movements
EP1766940A4 (en) 2004-06-04 2012-04-11 Systems Ltd Keyless System to enhance data entry in mobile and fixed environment
JP2008506188A (en) 2004-07-09 2008-02-28 ジェスチャーラド インコーポレイテッド Gesture-based reporting method and system
US7501995B2 (en) 2004-11-24 2009-03-10 General Electric Company System and method for presentation of enterprise, clinical, and decision support information utilizing eye tracking navigation
JP5080273B2 (en) 2005-01-07 2012-11-21 クアルコム,インコーポレイテッド Tilt sensor based on optical flow
CN101198964A (en) 2005-01-07 2008-06-11 格斯图尔泰克股份有限公司 Creating 3D images of objects by illuminating with infrared patterns
WO2006124935A2 (en) 2005-05-17 2006-11-23 Gesturetek, Inc. Orientation-sensitive signal output
US7301464B2 (en) * 2005-05-24 2007-11-27 Electronic Data Systems Corporation Process and method for safer vehicle navigation through facial gesture recognition and operator condition monitoring
US20070016008A1 (en) 2005-06-23 2007-01-18 Ryan Schoenefeld Selective gesturing input to a surgical navigation system
US7607079B2 (en) 2005-07-08 2009-10-20 Bruce Reiner Multi-input reporting and editing tool
JP2007058625A (en) * 2005-08-25 2007-03-08 Fuji Xerox Co Ltd Information processor, information processing method and computer program
WO2007027610A2 (en) * 2005-08-30 2007-03-08 Bruce Reiner Multi-functional navigational device and method
US7643862B2 (en) 2005-09-15 2010-01-05 Biomet Manufacturing Corporation Virtual mouse for use in surgical navigation
US20070118400A1 (en) 2005-11-22 2007-05-24 General Electric Company Method and system for gesture recognition to drive healthcare applications
US8380631B2 (en) * 2006-07-19 2013-02-19 Mvisum, Inc. Communication of emergency medical data over a vulnerable system
US7698002B2 (en) 2006-09-29 2010-04-13 Nellcor Puritan Bennett Llc Systems and methods for user interface and identification in a medical device
US20080097176A1 (en) * 2006-09-29 2008-04-24 Doug Music User interface and identification in a medical device systems and methods
US20080104547A1 (en) 2006-10-25 2008-05-01 General Electric Company Gesture-based communications
US20080114615A1 (en) 2006-11-15 2008-05-15 General Electric Company Methods and systems for gesture-based healthcare application interaction in thin-air display
US20080114614A1 (en) 2006-11-15 2008-05-15 General Electric Company Methods and systems for healthcare application interaction using gesture-based interaction enhanced with pressure sensitivity
US7694240B2 (en) 2006-11-22 2010-04-06 General Electric Company Methods and systems for creation of hanging protocols using graffiti-enabled devices
US7971156B2 (en) * 2007-01-12 2011-06-28 International Business Machines Corporation Controlling resource access based on user gesturing in a 3D captured image stream of the user
US7801332B2 (en) * 2007-01-12 2010-09-21 International Business Machines Corporation Controlling a system based on user behavioral signals detected from a 3D captured image stream
WO2008134745A1 (en) 2007-04-30 2008-11-06 Gesturetek, Inc. Mobile video-based therapy
CN101778598B (en) 2007-08-10 2013-03-27 皇家飞利浦电子股份有限公司 Motion detection in medical systems
US7987069B2 (en) 2007-11-12 2011-07-26 Bee Cave, Llc Monitoring patient support exiting and initiating response
WO2009094591A2 (en) * 2008-01-24 2009-07-30 Micropower Appliance Video delivery systems using wireless cameras
US20090198696A1 (en) * 2008-02-01 2009-08-06 Flexscan, Inc. Emergency medical record
US8203454B2 (en) 2008-03-03 2012-06-19 The General Hospital Corporation Wheelchair alarm system and method
EP2291815A2 (en) 2008-05-07 2011-03-09 Carrot Medical Llc Integration system for medical instruments with remote control
US8184092B2 (en) 2008-05-22 2012-05-22 International Business Machines Corporation Simulation of writing on game consoles through the use of motion-sensing technology
US10117721B2 (en) 2008-10-10 2018-11-06 Truevision Systems, Inc. Real-time surgical reference guides and methods for surgical applications
KR100995885B1 (en) 2008-11-17 2010-11-23 휴잇테크놀러지스 주식회사 System and Method of notifying in-vehicle emergency based on eye writing recognition
US7996793B2 (en) 2009-01-30 2011-08-09 Microsoft Corporation Gesture recognizer system architecture
EP3499507A1 (en) * 2009-07-21 2019-06-19 Zoll Medical Corporation System for providing role-based data feeds for caregivers
DE102009037316A1 (en) 2009-08-14 2011-02-17 Karl Storz Gmbh & Co. Kg Control and method for operating a surgical light
WO2011060140A1 (en) * 2009-11-12 2011-05-19 Soteria Systems, Llc Personal safety application for mobile device and method
US8935003B2 (en) 2010-09-21 2015-01-13 Intuitive Surgical Operations Method and system for hand presence detection in a minimally invasive surgical system
US8543240B2 (en) 2009-11-13 2013-09-24 Intuitive Surgical Operations, Inc. Master finger tracking device and method of use in a minimally invasive surgical system
US8682489B2 (en) 2009-11-13 2014-03-25 Intuitive Surgical Operations, Inc. Method and system for hand control of a teleoperated minimally invasive slave surgical instrument
US8996173B2 (en) 2010-09-21 2015-03-31 Intuitive Surgical Operations, Inc. Method and apparatus for hand gesture control in a minimally invasive surgical system
US8666781B2 (en) 2009-12-23 2014-03-04 Ai Cure Technologies, LLC Method and apparatus for management of clinical trials
US8605165B2 (en) 2010-10-06 2013-12-10 Ai Cure Technologies Llc Apparatus and method for assisting monitoring of medication adherence
US8631355B2 (en) 2010-01-08 2014-01-14 Microsoft Corporation Assigning gesture dictionaries
US20110213342A1 (en) 2010-02-26 2011-09-01 Ashok Burton Tripathi Real-time Virtual Indicium Apparatus and Methods for Guiding an Implant into an Eye
US8520027B2 (en) 2010-05-14 2013-08-27 Intuitive Surgical Operations, Inc. Method and system of see-through console overlay
US9400503B2 (en) * 2010-05-20 2016-07-26 iRobot Corporation Mobile human interface robot
US20120059671A1 (en) 2010-09-08 2012-03-08 William Park System for real time recording and reporting of emergency medical assessment data
US9436286B2 (en) * 2011-01-05 2016-09-06 Qualcomm Incorporated Method and apparatus for tracking orientation of a user
WO2012129669A1 (en) * 2011-03-28 2012-10-04 Gestsure Technologies Inc. Gesture operated control for medical information systems
US20120116986A1 (en) 2011-12-22 2012-05-10 Patricia Ault System and Method for Integrating Medical Treatment Guidelines with Real-Time, Ad-Hoc, Community Generated Commentary to Facilitate Collaborative Evidence-Based Practice
US8830054B2 (en) * 2012-02-17 2014-09-09 Wavemarket, Inc. System and method for detecting and responding to an emergency
US8930040B2 (en) 2012-06-07 2015-01-06 Zoll Medical Corporation Systems and methods for video capture, user feedback, reporting, adaptive parameters, and remote data access in vehicle safety monitoring
US20140009378A1 (en) * 2012-07-03 2014-01-09 Yen Hsiang Chew User Profile Based Gesture Recognition
WO2014052802A2 (en) * 2012-09-28 2014-04-03 Zoll Medical Corporation Systems and methods for three-dimensional interaction monitoring in an ems environment

Patent Citations (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5850352A (en) * 1995-03-31 1998-12-15 The Regents Of The University Of California Immersive video, including video hypermosaicing to generate from multiple video views of a scene a three-dimensional video mosaic from which diverse virtual video scene images are synthesized, including panoramic, scene interactive and stereoscopic images
US6236737B1 (en) * 1997-03-26 2001-05-22 Dalhousie University Dynamic target addressing system
US6115052A (en) * 1998-02-12 2000-09-05 Mitsubishi Electric Information Technology Center America, Inc. (Ita) System for reconstructing the 3-dimensional motions of a human figure from a monocularly-viewed image sequence
US6741977B1 (en) * 1999-01-29 2004-05-25 Hitachi, Ltd. Image recording/reproducing apparatus in monitor system
US20030058111A1 (en) * 2001-09-27 2003-03-27 Koninklijke Philips Electronics N.V. Computer vision based elderly care monitoring system
US7382895B2 (en) * 2002-04-08 2008-06-03 Newton Security, Inc. Tailgating and reverse entry detection, alarm, recording and prevention using machine vision
US8652038B2 (en) * 2006-05-12 2014-02-18 Bao Tran Health monitoring appliance
US7539532B2 (en) * 2006-05-12 2009-05-26 Bao Tran Cuffless blood pressure monitoring appliance
US8295542B2 (en) * 2007-01-12 2012-10-23 International Business Machines Corporation Adjusting a consumer experience based on a 3D captured image stream of a consumer response
US20090237247A1 (en) * 2007-06-08 2009-09-24 Brunetti Sam F Remote area monitoring system
US20110007139A1 (en) * 2007-06-08 2011-01-13 Brunetti Sam F Method and system for administering remote area monitoring system
US8199009B2 (en) * 2007-06-08 2012-06-12 Bas Strategic Solutions, Inc. Method and system for administering remote area monitoring system
US8570373B2 (en) * 2007-06-08 2013-10-29 Cisco Technology, Inc. Tracking an object utilizing location information associated with a wireless device
US20090259113A1 (en) * 2007-11-08 2009-10-15 General Electric Company System and method for determining pain level
US20090195382A1 (en) * 2008-01-31 2009-08-06 Sensormatic Electronics Corporation Video sensor and alarm system and method with object and event classification
US8009867B2 (en) * 2009-01-30 2011-08-30 Microsoft Corporation Body scan
US8290249B2 (en) * 2009-05-01 2012-10-16 Microsoft Corporation Systems and methods for detecting a tilt angle from a depth image
US8340432B2 (en) * 2009-05-01 2012-12-25 Microsoft Corporation Systems and methods for detecting a tilt angle from a depth image
US8379101B2 (en) * 2009-05-29 2013-02-19 Microsoft Corporation Environment and/or target segmentation
US8351652B2 (en) * 2009-05-29 2013-01-08 Microsoft Corporation Systems and methods for tracking a model
US8565479B2 (en) * 2009-08-13 2013-10-22 Primesense Ltd. Extraction of skeletons from 3D maps
US7961910B2 (en) * 2009-10-07 2011-06-14 Microsoft Corporation Systems and methods for tracking a model
US8374423B2 (en) * 2009-12-18 2013-02-12 Microsoft Corporation Motion detection using depth images
US8334842B2 (en) * 2010-01-15 2012-12-18 Microsoft Corporation Recognizing user intent in motion capture system
US8284157B2 (en) * 2010-01-15 2012-10-09 Microsoft Corporation Directed performance in motion capture system
US8465108B2 (en) * 2010-01-15 2013-06-18 Microsoft Corporation Directed performance in motion capture system
US8213680B2 (en) * 2010-03-19 2012-07-03 Microsoft Corporation Proxy training data for human body tracking
US8457353B2 (en) * 2010-05-18 2013-06-04 Microsoft Corporation Gestures and gesture modifiers for manipulating a user-interface
US20120154582A1 (en) * 2010-09-14 2012-06-21 General Electric Company System and method for protocol adherence
US8385596B2 (en) * 2010-12-21 2013-02-26 Microsoft Corporation First person shooter control with virtual skeleton
US20120212582A1 (en) * 2011-02-22 2012-08-23 Richard Deutsch Systems and methods for monitoring caregiver and patient protocol compliance
US20130173300A1 (en) * 2011-12-30 2013-07-04 Elwha Llc Evidence-based healthcare information management protocols
US20140132728A1 (en) * 2012-11-12 2014-05-15 Shopperception, Inc. Methods and systems for measuring human interaction
US20150302539A1 (en) * 2014-04-16 2015-10-22 Vios Medical Singapore PTE LTD Patient care and health information management systems and methods

Cited By (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11666247B2 (en) 2017-07-11 2023-06-06 Drägerwerk AG & Co. KGaA Method, device and computer program for capturing optical image data of patient surroundings and for identifying a patient check-up
DE102017006529A1 (en) * 2017-07-11 2019-01-17 Drägerwerk AG & Co. KGaA A method, apparatus and computer program for capturing optical image data of a patient environment and for detecting a patient examination
US11179293B2 (en) 2017-07-28 2021-11-23 Stryker Corporation Patient support system with chest compression system and harness assembly with sensor system
US11723835B2 (en) 2017-07-28 2023-08-15 Stryker Corporation Patient support system with chest compression system and harness assembly with sensor system
US11322231B2 (en) 2017-08-10 2022-05-03 Nuance Communications, Inc. Automated clinical documentation system and method
EP3665697A4 (en) * 2017-08-10 2021-06-09 Nuance Communications, Inc. Automated clinical documentation system and method
US10546655B2 (en) 2017-08-10 2020-01-28 Nuance Communications, Inc. Automated clinical documentation system and method
US11853691B2 (en) 2017-08-10 2023-12-26 Nuance Communications, Inc. Automated clinical documentation system and method
US10957428B2 (en) 2017-08-10 2021-03-23 Nuance Communications, Inc. Automated clinical documentation system and method
WO2019032812A1 (en) * 2017-08-10 2019-02-14 Nuance Communications, Inc. Automated clinical documentation system and method
US10978187B2 (en) 2017-08-10 2021-04-13 Nuance Communications, Inc. Automated clinical documentation system and method
WO2019032778A1 (en) * 2017-08-10 2019-02-14 Nuance Communications, Inc. Automated clinical documentation system and method
EP3665699A4 (en) * 2017-08-10 2021-06-09 Nuance Communications, Inc. Automated clinical documentation system and method
US11043288B2 (en) 2017-08-10 2021-06-22 Nuance Communications, Inc. Automated clinical documentation system and method
US11605448B2 (en) 2017-08-10 2023-03-14 Nuance Communications, Inc. Automated clinical documentation system and method
US11074996B2 (en) 2017-08-10 2021-07-27 Nuance Communications, Inc. Automated clinical documentation system and method
US11101022B2 (en) 2017-08-10 2021-08-24 Nuance Communications, Inc. Automated clinical documentation system and method
US11101023B2 (en) 2017-08-10 2021-08-24 Nuance Communications, Inc. Automated clinical documentation system and method
US11114186B2 (en) 2017-08-10 2021-09-07 Nuance Communications, Inc. Automated clinical documentation system and method
WO2019032819A1 (en) 2017-08-10 2019-02-14 Nuance Communications, Inc. Automated clinical documentation system and method
US11482311B2 (en) 2017-08-10 2022-10-25 Nuance Communications, Inc. Automated clinical documentation system and method
US11482308B2 (en) 2017-08-10 2022-10-25 Nuance Communications, Inc. Automated clinical documentation system and method
US11404148B2 (en) 2017-08-10 2022-08-02 Nuance Communications, Inc. Automated clinical documentation system and method
WO2019032826A1 (en) * 2017-08-10 2019-02-14 Nuance Communications, Inc. Automated clinical documentation system and method
US11316865B2 (en) 2017-08-10 2022-04-26 Nuance Communications, Inc. Ambient cooperative intelligence system and method
WO2019032815A1 (en) * 2017-08-10 2019-02-14 Nuance Communications, Inc. Automated clinical documentation system and method
US10957427B2 (en) 2017-08-10 2021-03-23 Nuance Communications, Inc. Automated clinical documentation system and method
US11257576B2 (en) * 2017-08-10 2022-02-22 Nuance Communications, Inc. Automated clinical documentation system and method
US11295838B2 (en) 2017-08-10 2022-04-05 Nuance Communications, Inc. Automated clinical documentation system and method
US11295839B2 (en) 2017-08-10 2022-04-05 Nuance Communications, Inc. Automated clinical documentation system and method
US11250382B2 (en) 2018-03-05 2022-02-15 Nuance Communications, Inc. Automated clinical documentation system and method
US11295272B2 (en) 2018-03-05 2022-04-05 Nuance Communications, Inc. Automated clinical documentation system and method
US10809970B2 (en) 2018-03-05 2020-10-20 Nuance Communications, Inc. Automated clinical documentation system and method
US11494735B2 (en) 2018-03-05 2022-11-08 Nuance Communications, Inc. Automated clinical documentation system and method
US11222716B2 (en) 2018-03-05 2022-01-11 Nuance Communications System and method for review of automated clinical documentation from recorded audio
US11270261B2 (en) 2018-03-05 2022-03-08 Nuance Communications, Inc. System and method for concept formatting
EP3761861A4 (en) * 2018-03-05 2022-01-12 Nuance Communications, Inc. Automated clinical documentation system and method
US11515020B2 (en) 2018-03-05 2022-11-29 Nuance Communications, Inc. Automated clinical documentation system and method
US11250383B2 (en) 2018-03-05 2022-02-15 Nuance Communications, Inc. Automated clinical documentation system and method
US11227679B2 (en) 2019-06-14 2022-01-18 Nuance Communications, Inc. Ambient clinical intelligence system and method
US11216480B2 (en) 2019-06-14 2022-01-04 Nuance Communications, Inc. System and method for querying data points from graph data structures
US11043207B2 (en) 2019-06-14 2021-06-22 Nuance Communications, Inc. System and method for array data simulation and customized acoustic modeling for ambient ASR
US11531807B2 (en) 2019-06-28 2022-12-20 Nuance Communications, Inc. System and method for customized text macros
US11670408B2 (en) 2019-09-30 2023-06-06 Nuance Communications, Inc. System and method for review of automated clinical documentation
US11222103B1 (en) 2020-10-29 2022-01-11 Nuance Communications, Inc. Ambient cooperative intelligence system and method

Also Published As

Publication number Publication date
CN104995638A (en) 2015-10-21
US20140096091A1 (en) 2014-04-03
WO2014052802A2 (en) 2014-04-03
JP2015533248A (en) 2015-11-19
EP2901368A2 (en) 2015-08-05
EP2901368A4 (en) 2016-05-25
WO2014052802A3 (en) 2014-07-31
US9911166B2 (en) 2018-03-06

Similar Documents

Publication Publication Date Title
US9911166B2 (en) Systems and methods for three-dimensional interaction monitoring in an EMS environment
US20200174594A1 (en) Facilitating user input via head-mounted display device and arm-mounted peripheral device
US11816322B2 (en) EMS decision support interface, event history, and related tools
US20190282324A1 (en) Augmented Reality Device for Providing Feedback to an Acute Care Provider
US11202579B2 (en) Wrist-worn device for coordinating patient care
US20220331028A1 (en) System for Capturing Movement Patterns and/or Vital Signs of a Person
US20190252053A1 (en) Wearable system for healthcare management
US9412161B2 (en) Systems and methods for medical use of motion imaging and capture
US8727981B2 (en) Ambient sensing of patient discomfort
US20080106374A1 (en) Patient Room Information System
CN106462928A (en) Patient care and health information management systems and methods
US20190302460A1 (en) Augmented reality systems for time critical biomedical applications
CN105377120A (en) Use of muscle oxygen saturation and PH in clinical decision support
Moshnyaga et al. A medication adherence monitoring system for people with dementia
Michelin et al. Faceguard: A wearable system to avoid face touching
CN109310330A (en) System and method for medical device patient measurement
WO2013109864A1 (en) System for reducing patient non-adherence
Sivaramakrishnan et al. A touchless interface for interventional radiology procedures
EP4120284A2 (en) Image-based risk analysis of individuals in clinical settings
Shahraki et al. The Role of a Human‐Machine Interaction (HMI) System on the Medical Devices
TR2023013940A2 (en) DRUG INTAKE CONTROL SYSTEM AND METHOD
TWM644850U (en) Hybrid teaching system for total parenteral nutrition preparation
KIT Remote Rehabilitation System Based on the Fusion of Noninvasive Wearable Device and Motion-sensing for Pulmonary Patients
Wegter Optimization of user interaction with DICOM in the Operation Room of a hospital

Legal Events

Date Code Title Description
AS Assignment

Owner name: ZOLL MEDICAL CORPORATION, MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:REID, C. SHANE;ASHMORE, CHAD;GOTSCHALL, ROBERT H.;AND OTHERS;SIGNING DATES FROM 20131001 TO 20131004;REEL/FRAME:033906/0332

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION