US20130343585A1 - Multisensor hearing assist device for health - Google Patents
- Publication number
- US20130343585A1 (application Ser. No. 13/623,545)
- Authority
- US
- United States
- Prior art keywords
- assist device
- hearing assist
- user
- hearing
- sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/80—Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61J—CONTAINERS SPECIALLY ADAPTED FOR MEDICAL OR PHARMACEUTICAL PURPOSES; DEVICES OR METHODS SPECIALLY ADAPTED FOR BRINGING PHARMACEUTICAL PRODUCTS INTO PARTICULAR PHYSICAL OR ADMINISTERING FORMS; DEVICES FOR ADMINISTERING FOOD OR MEDICINES ORALLY; BABY COMFORTERS; DEVICES FOR RECEIVING SPITTLE
- A61J1/00—Containers specially adapted for medical or pharmaceutical purposes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R25/00—Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; Electric tinnitus maskers providing an auditory perception
- H04R25/55—Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; Electric tinnitus maskers providing an auditory perception using an external connection, either wireless or wired
- H04R25/554—Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; Electric tinnitus maskers providing an auditory perception using an external connection, either wireless or wired using a wireless connection, e.g. between microphone and amplifier or using Tcoils
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2225/00—Details of deaf aids covered by H04R25/00, not provided for in any of its subgroups
- H04R2225/43—Signal processing in hearing aids to enhance the speech intelligibility
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2225/00—Details of deaf aids covered by H04R25/00, not provided for in any of its subgroups
- H04R2225/55—Communication between hearing aids and external devices via a network for data exchange
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2460/00—Details of hearing devices, i.e. of ear- or headphones covered by H04R1/10 or H04R5/033 but not provided for in any of their subgroups, or of hearing aids covered by H04R25/00 but not provided for in any of its subgroups
- H04R2460/03—Aspects of the reduction of energy consumption in hearing devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2460/00—Details of hearing devices, i.e. of ear- or headphones covered by H04R1/10 or H04R5/033 but not provided for in any of their subgroups, or of hearing aids covered by H04R25/00 but not provided for in any of its subgroups
- H04R2460/13—Hearing devices using bone conduction transducers
Definitions
- the present invention relates to hearing assist devices that sense, analyze, and communicate user health characteristics.
- a hearing aid is an electro-acoustic device that typically fits in or behind the ear of a wearer, and amplifies and modulates sound for the wearer. Hearing aids are frequently worn by persons who are hearing impaired to improve their ability to hear sounds. A hearing aid may be worn in one or both ears of a user, depending on whether one or both of the user's ears need assistance.
- FIG. 1 shows a communication system that includes a multi-sensor hearing assist device that communicates with a near field communication (NFC)-enabled communications device, according to an exemplary embodiment.
- FIGS. 2-4 show various configurations for associating a multi-sensor hearing assist device with an ear of a user, according to exemplary embodiments.
- FIG. 5 shows a multi-sensor hearing assist device that mounts over an ear of a user, according to an exemplary embodiment.
- FIG. 6 shows a multi-sensor hearing assist device that extends at least partially into the ear canal of a user, according to an exemplary embodiment.
- FIG. 7 shows a circuit block diagram of a multi-sensor hearing assist device that is configured to communicate with external devices according to multiple communication schemes, according to an exemplary embodiment.
- FIG. 8 shows a flowchart of a process for a hearing assist device that processes and transmits sensor data and receives a command from a second device, according to an exemplary embodiment.
- FIG. 9 shows a communication system that includes a multi-sensor hearing assist device that communicates with one or more communications devices and network-connected devices, according to an exemplary embodiment.
- FIG. 10 shows a flowchart of a process for wirelessly charging a battery of a hearing assist device, according to an exemplary embodiment.
- FIG. 11 shows a flowchart of a process for broadcasting sound that is generated based on sensor data, according to an exemplary embodiment.
- FIG. 12 shows a flowchart of a process for generating and broadcasting filtered sound from a hearing assist device, according to an exemplary embodiment.
- FIG. 13 shows a flowchart of a process for generating an information signal in a hearing assist device based on a voice of a user, and transmitting the information signal to a second device, according to an exemplary embodiment.
- FIG. 14 shows a flowchart of a process for generating voice based at least on sensor data to be broadcast by a speaker of a hearing assist device to a user, according to an exemplary embodiment.
- FIG. 15 shows a system that includes a hearing assist device and a cloud/service/phone portable device that may be communicatively connected thereto, according to an exemplary embodiment.
- references in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
- a hearing aid is an electro-acoustic device that typically fits in or behind the ear of a wearer, and amplifies and modulates sound for the wearer. Hearing aids are frequently worn by persons who are hearing impaired to improve their ability to hear sounds. A hearing aid may be worn in one or both ears of a user, depending on whether one or both of the user's ears need hearing assistance.
- Hearing assist devices such as hearing aids, headsets, and headphones, are typically worn in contact with the user's ear, and in some cases extend into the user's ear canal.
- a hearing assist device is typically positioned in close proximity to various organs and physical features of a wearer, such as the inner ear structure (e.g., the ear canal, ear drum, ossicles, Eustachian tube, cochlea, auditory nerve, etc.), skin, brain, veins and arteries, and further physical features of the wearer.
- a hearing assist device may be configured to detect various characteristics of a user's health.
- the detected characteristics may be used to treat health-related issues of the wearer, and perform further health-related functions.
- health monitoring technology may be incorporated into a hearing assist device to monitor the health of a wearer.
- health monitoring technology that may be incorporated in a hearing assist device include health sensors that determine (e.g., sense/detect/measure/collect, etc.) various physical characteristics of the user, such as blood pressure, heart rate, temperature, humidity, blood oxygen level, skin galvanometric levels, brain wave information, arrhythmia onset detection, skin chemistry changes, falling down impacts, long periods of activity, etc.
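- One of the motion-based determinations named above (detecting falling down impacts) can be sketched in code. The following Python fragment is a hypothetical illustration, not part of the patent: it flags a possible fall when a large acceleration spike is followed by near-stillness. The thresholds, the (x, y, z) sample format, and the function names are illustrative assumptions.

```python
# Hypothetical sketch of fall detection from a hearing assist device's
# motion sensor. Thresholds and sampling scheme are illustrative
# assumptions, not values from the patent.
import math

FALL_G_THRESHOLD = 3.0   # assumed impact threshold, in g
STILL_G_EPSILON = 0.2    # assumed "lying still" band around 1 g

def magnitude(sample):
    """Euclidean magnitude of one (x, y, z) accelerometer sample, in g."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def detect_fall(samples):
    """Return True if a large impact is followed by near-stillness."""
    impact_index = None
    for i, s in enumerate(samples):
        if magnitude(s) >= FALL_G_THRESHOLD:
            impact_index = i
            break
    if impact_index is None:
        return False
    # After the impact, every remaining sample should hover near 1 g
    # (gravity only), suggesting the wearer is motionless.
    after = samples[impact_index + 1:]
    return bool(after) and all(
        abs(magnitude(s) - 1.0) <= STILL_G_EPSILON for s in after
    )
```

In this sketch, a device would buffer recent samples and run `detect_fall` periodically; a `True` result could trigger the alerting behaviors described below.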
- Sensor information resulting from the monitoring may be analyzed within the hearing assist device, or may be transmitted from the hearing assist device and analyzed at a remote location.
- the sensor information may be analyzed at a local computer, in a smart phone or other mobile device, or at a remote location, such as at a cloud-based server.
- instructions and/or other information may be communicated back to the wearer.
- Such information may be provided to the wearer by a display screen (e.g., a desktop computer display, a smart phone display, a tablet computer display, a medical equipment display, etc.), by the hearing assist device itself (e.g., by voice, beeps, etc.), or may be provided to the wearer in another manner.
- Medical personnel and/or emergency response personnel may be alerted when particular problems with the wearer are detected by the hearing assist device.
- the medical personnel may evaluate information received from the hearing assist device, and provide information back to the hearing assist device/wearer.
- the hearing assist device may provide the wearer with reminders, alarms, instructions, etc.
- the hearing assist device may be configured with speech/voice recognition capability. For instance, the wearer may provide commands, such as by voice, to the hearing assist device.
- the hearing assist device may be configured to perform various audio processing functions to suppress background noise and/or other sounds, as well as amplifying other sounds, and may be configured to modify audio according to a particular frequency response of the hearing of the wearer.
- the hearing assist device may be configured to detect vibrations (e.g., jaw movement of the wearer during talking), and may use the detected vibrations to aid in improving speech/voice recognition.
- FIG. 1 shows a communication system 100 that includes a multi-sensor hearing assist device 102 that communicates with a near field communication (NFC)-enabled communications device 104 , according to an exemplary embodiment.
- Hearing assist device 102 may be worn in association with the ear of a user, and may be configured to communicate with other devices, such as communications device 104 .
- hearing assist device 102 includes a plurality of sensors 106 a and 106 b , processing logic 108 , an NFC transceiver 110 , storage 112 , and a rechargeable battery 114 . These features of hearing assist device 102 are described as follows.
- Sensors 106 a and 106 b are medical sensors that each sense a characteristic of the user and generate a corresponding sensor output signal. Although two sensors 106 a and 106 b are shown in hearing assist device 102 in FIG. 1 , any number of sensors may be included in hearing assist device 102 , including three sensors, four sensors, five sensors, etc. (e.g., tens of sensors, hundreds of sensors, etc.).
- Examples of sensors 106 a and 106 b include a blood pressure sensor, a heart rate sensor, a temperature sensor, a humidity sensor, a blood oxygen level sensor, a skin galvanometric level sensor, a brain wave information sensor, an arrhythmia onset detection sensor (e.g., a chest strap with multiple sensor pads), a skin chemistry sensor, a motion sensor (e.g., to detect falling down impacts, long periods of activity, etc.), an air pressure sensor, etc.
- Processing logic 108 may be implemented in hardware (e.g., one or more processors, electrical circuits, etc.), or any combination of hardware with software and/or firmware. Processing logic 108 may receive sensor information from sensors 106 a , 106 b , etc., and may process the sensor information to generate processed sensor data. Processing logic 108 may execute one or more programs that define various operational characteristics, such as: (i) a sequence or order of retrieving sensor information from sensors of hearing assist device 102 , (ii) sensor configurations and reconfigurations (via a preliminary setup or via adaptations over the course of time), (iii) routines by which particular sensor data is at least pre-processed, and (iv) one or more functions/actions to be performed based on particular sensor data values, etc.
- processing logic 108 may store and/or access sensor data in storage 112 , processed or unprocessed. Furthermore, processing logic 108 may access one or more programs stored in storage 112 for execution.
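- The processing-logic behavior described above (a retrieval sequence, pre-processing routines, and actions triggered by particular sensor data values) can be sketched as follows. This Python fragment is a hypothetical illustration: the sensor names, normal ranges, and the storage dictionary are assumed stand-ins for processing logic 108 and storage 112.

```python
# Hypothetical sketch of a processing-logic polling loop: read sensors in
# a fixed order, log each reading, and flag out-of-range values. All
# names, ranges, and units are illustrative assumptions.
SENSOR_ORDER = ["heart_rate", "inner_ear_temp", "ph"]  # assumed sequence
NORMAL_RANGES = {                                       # assumed limits
    "heart_rate": (40, 180),         # beats per minute
    "inner_ear_temp": (35.0, 38.5),  # degrees Celsius
    "ph": (4.0, 7.0),
}

def poll_and_process(read_sensor, storage):
    """Read each sensor once, store the value, and return alert tuples."""
    alerts = []
    for name in SENSOR_ORDER:
        value = read_sensor(name)                    # raw sensor output
        storage.setdefault(name, []).append(value)   # log to storage
        low, high = NORMAL_RANGES[name]
        if not (low <= value <= high):
            alerts.append((name, value))             # action: flag value
    return alerts
```

A returned alert could then drive the reporting paths described in this document, such as transmitting data to an external device or broadcasting a warning to the wearer.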
- Storage 112 may include one or more types of storage, including memory (e.g., random access memory (RAM), read only memory (ROM), etc.) that is volatile or non-volatile.
- NFC transceiver 110 is configured to wirelessly communicate with a second device (e.g., a local or remote supporting device), such as NFC-enabled communications device 104 according to NFC techniques.
- NFC communication between hearing assist device 102 and NFC-enabled communications device 104 uses magnetic induction between two loop antennas (e.g., coils, microstrip antennas, etc.) located within each other's near field, effectively forming an air-core transformer.
- NFC communications occur over relatively short ranges (e.g., within a few centimeters), and are conducted at radio frequencies.
- NFC communications may be performed by NFC transceiver 110 at a 13.56 MHz frequency, with data transfers of up to 424 kilobits per second.
- NFC transceiver 110 may be configured to perform NFC communications at other frequencies and data transfer rates. Examples of standards according to which NFC transceiver 110 may be configured to conduct NFC communications include ISO/IEC 18092 and those defined by the NFC Forum, which was founded in 2004 by Nokia, Philips and Sony.
- NFC-enabled communications device 104 may be configured with an NFC transceiver to perform NFC communications.
- NFC-enabled communications device 104 may be any type of device that may be enabled with NFC capability, such as a docking station, a desktop computer (e.g., a personal computer, etc.), a mobile computing device (e.g., a personal digital assistant (PDA), a laptop computer, a notebook computer, a tablet computer (e.g., an Apple iPad™), a netbook, etc.), a mobile phone (e.g., a cell phone, a smart phone, etc.), a medical appliance, etc.
- NFC-enabled communications device 104 may be network-connected to enable hearing assist device 102 to communicate with entities over the network (e.g., cloud computers or servers, web services, etc.).
- NFC transceiver 110 enables sensor data (processed or unprocessed) to be transmitted by processing logic 108 from hearing assist device 102 to NFC-enabled communications device 104 . In this manner, the sensor data may be reported, processed, and/or analyzed externally to hearing assist device 102 . Furthermore, NFC transceiver 110 enables processing logic 108 at hearing assist device 102 to receive data and/or instructions/commands from NFC-enabled communications device 104 in response to the transmitted sensor data.
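- The exchange described above (sensor data out, commands back) can be sketched with a simple message framing. The patent does not specify a frame format; the length-prefixed JSON layout below is purely an illustrative assumption.

```python
# Hypothetical sketch of framing sensor data for transmission to an
# NFC-enabled device, and parsing a command frame sent back. The byte
# layout (2-byte big-endian length + JSON payload) is an assumption.
import json
import struct

def frame_sensor_data(sensor_data):
    """Encode a dict as JSON, prefixed with a 2-byte big-endian length."""
    payload = json.dumps(sensor_data).encode("utf-8")
    return struct.pack(">H", len(payload)) + payload

def parse_command(frame):
    """Inverse of frame_sensor_data, for a command frame from the peer."""
    (length,) = struct.unpack(">H", frame[:2])
    return json.loads(frame[2 : 2 + length].decode("utf-8"))
```

In use, the device would pass the framed bytes to its NFC transceiver, and hand received frames to `parse_command` to recover instructions such as a reconfiguration request.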
- NFC transceiver 110 enables processing logic 108 at hearing assist device 102 to receive programs (e.g., program code), including new programs, program updates, applications, “apps”, and/or other programs from NFC-enabled communications device 104 that can be executed by processing logic 108 to change/update the functionality of hearing assist device 102 .
- Rechargeable battery 114 includes one or more electrochemical cells that store charge that may be used to power components of hearing assist device 102 , including one or more of sensors 106 a , 106 b , etc., processing logic 108 , NFC transceiver 110 , and storage 112 .
- Rechargeable battery 114 may be any suitable rechargeable battery type, including lead-acid, nickel cadmium (NiCd), nickel metal hydride (NiMH), lithium ion (Li-ion), and lithium ion polymer (Li-ion polymer). Charging of the batteries may be through a typical tethered recharger or via NFC power delivery.
- Although NFC communications are shown, alternative communication approaches can be employed. Such alternatives may include wireless power transfer schemes as well.
- Hearing assist device 102 may be configured in any manner to be associated with the ear of a user.
- FIGS. 2-4 show various configurations for associating a hearing assist device with an ear of a user, according to exemplary embodiments.
- hearing assist device 102 may be a hearing aid type that fits and is inserted partially or fully in an ear 202 of a user.
- hearing assist device 102 includes sensors 106 a - 106 n that contact the user. Example forms of hearing assist device 102 of FIG. 2 include ear buds, “receiver in the canal” hearing aids, “in the ear” (ITE) hearing aids, “in the canal” (ITC) hearing aids, “completely in canal” (CIC) hearing aids, etc.
- hearing assist device 102 may be a hearing aid type that mounts on top of, or behind ear 202 of the user. As shown in FIG. 3 , hearing assist device 102 includes sensors 106 a - 106 n that contact the user. Example forms of hearing assist device 102 of FIG. 3 include “behind the ear” (BTE) hearing aids, “open fit” or “over the ear” (OTE) hearing aids, eyeglasses hearing aids (e.g., that contain hearing aid functionality in or on the glasses arms), etc.
- hearing assist device 102 may be a headset or headphones that mount on the head of the user and include speakers that are held close to the user's ears. As shown in FIG. 4 , hearing assist device 102 includes sensors 106 a - 106 n that contact the user. In the embodiment of FIG. 4 , sensors 106 a - 106 n may be spaced further apart in the headphones, including being dispersed in the ear pad(s) and/or along the headband that connects together the ear pads (when a headband is present).
- hearing assist device 102 may be configured in further forms, including combinations of the forms shown in FIGS. 2-4 , and is not intended to be limited to the embodiments illustrated in FIGS. 2-4 .
- hearing assist device 102 may be a cochlear implant-type hearing aid, or other type of hearing assist device.
- the following section describes some example forms of hearing assist device 102 with associated sensor configurations.
- hearing assist device 102 may be configured in various forms, and may include any number and type of sensors.
- FIG. 5 shows a hearing assist device 500 that is an example of hearing assist device 102 according to an exemplary embodiment.
- Hearing assist device 500 is configured to mount over an ear of a user, and has a portion that is at least partially inserted into the ear.
- a user may wear a single hearing assist device 500 on one ear, or may simultaneously wear first and second hearing assist devices 500 on the user's right and left ears, respectively.
- hearing assist device 500 includes a case or housing 502 that includes a first portion 504 , a second portion 506 , and a third portion 508 .
- First portion 504 is shaped to be positioned behind/over the ear of a user.
- first portion 504 has a crescent shape, and may optionally be molded in the shape of a user's outer ear (e.g., by taking an impression of the outer ear, etc.).
- Second portion 506 extends perpendicularly from a side of an end of first portion 504 .
- Second portion 506 is shaped to be inserted at least partially into the ear canal of the user.
- Third portion 508 extends from second portion 506 , and may be referred to as an earmold shaped to conform to the user's ear shape, to better adhere hearing assist device 500 to the user's ear.
- hearing assist device 500 further includes a speaker 512 , a forward IR/UV (ultraviolet) communication transceiver 520 , a BTLE (BLUETOOTH low energy) antenna 522 , at least one microphone 524 , a telecoil 526 , a tethered sensor port 528 , a skin communication conductor 534 , a volume controller 540 , and a communication and power delivery coil 542 .
- hearing assist device 500 includes a plurality of medical sensors, including at least one pH sensor 510 , an IR (infrared) or sonic distance sensor 514 , an inner ear temperature sensor 516 , a position/motion sensor 518 , a WPT (wireless power transfer)/NFC coil 530 , a switch 532 , a glucose spectroscopy sensor 536 , a heart rate sensor 538 , and a subcutaneous sensor 544 .
- hearing assist device 500 may include one or more of these further features and/or alternative features. The features of hearing assist device 500 are described as follows.
- As shown in FIG. 5 , speaker 512 , IR or sonic distance sensor 514 , and inner ear temperature sensor 516 are located on a circular surface of second portion 506 of hearing assist device 500 that faces into the ear of the user. Position/motion sensor 518 and pH sensor 510 are located on a perimeter surface of second portion 506 around the circular surface that contacts the ear canal of the user. In alternative embodiments, one or more of these features may be located in/on different locations of hearing assist device 500 .
- pH sensor 510 is a sensor that may be present to measure a pH of skin of the user's inner ear. The measured pH value may be used to determine a medical problem of the user, such as an onset of stroke. pH sensor 510 may include one or more metallic plates. Upon receiving power (e.g., from rechargeable battery 114 of FIG. 1 ), pH sensor 510 may generate a sensor output signal (e.g., an electrical signal) that indicates a measured pH value.
- Speaker 512 is a speaker of hearing assist device 500 that broadcasts environmental sound, received by microphone(s) 524 and subsequently amplified and/or filtered by processing logic of hearing assist device 500 , into the ear of the user to assist the user in hearing the environmental sound. Furthermore, speaker 512 may broadcast additional sounds into the ear of the user for the user to hear, including alerts (e.g., tones, beeping sounds), voice, and/or further sounds that may be generated by or received by processing logic of hearing assist device 500 , and/or may be stored in hearing assist device 500 .
- IR or sonic distance sensor 514 is a sensor that may be present to sense a displacement distance. Upon receiving power, IR or sonic distance sensor 514 may generate an IR light pulse, a sonic (e.g., ultrasonic) pulse, or other light or sound pulse, that may be reflected in the ear of the user, and the reflection may be received by IR or sonic distance sensor 514 . A time of reflection may be compared for a series of pulses to determine a displacement distance within the ear of user. IR or sonic distance sensor 514 may generate a sensor output signal (e.g., an electrical signal) that indicates a measured displacement distance.
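- The time-of-reflection computation described above can be illustrated briefly. The sketch below is a hypothetical example for the sonic case: distance follows from the round-trip time at the speed of sound, and comparing two successive pulses yields displacement. The speed-of-sound constant is a standard room-air figure; the function names are assumptions.

```python
# Hypothetical sketch of time-of-flight distance and displacement for a
# sonic distance sensor. Constants and names are illustrative.
SPEED_OF_SOUND_M_PER_S = 343.0  # approximate, in air at ~20 degrees C

def distance_from_round_trip(seconds):
    """One-way distance: the pulse travels out and back, so halve it."""
    return SPEED_OF_SOUND_M_PER_S * seconds / 2.0

def displacement(round_trip_a, round_trip_b):
    """Displacement between two successive pulse measurements, meters."""
    return distance_from_round_trip(round_trip_b) - distance_from_round_trip(round_trip_a)
```

An IR variant would use the speed of light instead, with correspondingly finer timing requirements.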
- hearing assist device 500 can perform the following when a user inserts and turns on hearing assist device 500 : (i) automatically adjust the volume to fall within a target range; and (ii) prevent excess volume associated with unexpected loud sound events. It is noted that the amount of volume adjustment that may be applied can vary by frequency. It is also noted that the excess volume associated with unexpected loud sound events may be further prevented by using a hearing assist device that has a relatively tight fit, thereby allowing the hearing assist device to act as an ear plug.
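- The two volume behaviors above (pulling levels into a target range, and capping unexpected loud events) can be sketched per frequency band. The Python fragment below is a hypothetical illustration; the band labels, target range, and ceiling are assumed values, not figures from the patent.

```python
# Hypothetical sketch of per-band automatic volume adjustment: clamp each
# band's output level into a target range, with a hard ceiling for sudden
# loud events. All dB values are illustrative assumptions.
TARGET_RANGE_DB = (55.0, 75.0)  # assumed comfortable listening range
HARD_CEILING_DB = 85.0          # assumed cap for unexpected loud events

def adjust_band_level(level_db):
    """Clamp one band's output level into the target range and ceiling."""
    low, high = TARGET_RANGE_DB
    level_db = min(level_db, HARD_CEILING_DB)
    if level_db < low:
        return low
    if level_db > high:
        return high
    return level_db

def adjust_spectrum(levels_db):
    """Apply the clamp independently per frequency band, since the amount
    of volume adjustment can vary by frequency."""
    return {band: adjust_band_level(db) for band, db in levels_db.items()}
```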
- Hearing efficiency and performance data over the spectrum of normal audible frequencies can be gathered by delivering each frequency (or frequency range) at an output volume level, measuring eardrum deflection characteristics, and delivering audible test questions to the user via hearing assist device 500 .
- This can be accomplished solely by hearing assist device 500 or with assistance from a smartphone or other external device or service.
- a user may respond to an audio (or textual) prompt “Can you hear this?” with a “yes” or “no” response.
- the response is received by microphone(s) 524 (or via touch input for example) and processed internally or on an assisting external device to identify the response.
- the amplitude of the audio output can be adjusted to determine a given user's hearing threshold for each frequency (or frequency range). From this hearing efficiency and performance data, input frequency equalization can be performed by hearing assist device 500 so as to deliver to the user audio signals that will be perceived in much the same way as someone with no hearing impairment. In addition, such data can be delivered to the assisting external device (e.g., to a smartphone) for use by such device in producing audio output for the user.
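- The threshold search and equalization derivation described above can be sketched as follows. This is a hypothetical simplification: a real fitting procedure is more elaborate, and the step size, starting level, and "normal" threshold are assumed values.

```python
# Hypothetical sketch of per-frequency hearing threshold search using
# yes/no responses, and a derived equalization gain. Values are assumed.
def find_threshold_db(can_hear, start_db=80.0, step_db=10.0, floor_db=0.0):
    """Descend in fixed steps; return the lowest level the user confirms.

    can_hear is a callback standing in for playing a tone and collecting
    the user's "yes"/"no" response via microphone or touch input.
    """
    level = start_db
    last_heard = None
    while level >= floor_db and can_hear(level):
        last_heard = level
        level -= step_db
    return last_heard  # None: the user never heard the starting level

def build_equalization(thresholds_db, normal_threshold_db=20.0):
    """Per-frequency gain lifting the user's threshold toward normal."""
    return {
        freq: max(0.0, t - normal_threshold_db)
        for freq, t in thresholds_db.items()
        if t is not None
    }
```

The resulting gain table could be applied by the device's input frequency equalization, or exported to an assisting external device as the text describes.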
- the assisting device can deliver an adjusted audio output tailored for the user if (i) the user is not wearing hearing assist device 500 , (ii) the battery power of hearing assist device 500 is depleted, (iii) hearing assist device 500 is powered down, or (iv) hearing assist device 500 is operating in a lower power mode.
- the supporting device can deliver the audio signal: (a) in an audible form via a speaker, generated with the intent of directly reaching the eardrum; (b) in an audible form intended for receipt and amplification control by hearing assist device 500 without further need for user specific audio equalization; and (c) in a non-audible form (e.g., electromagnetic transmission) for receipt and conversion to an audible form by hearing assist device 500 , again without further equalization.
- a wearer may further tweak their recommended equalization via slide bars and such in a manner similar to adjusting equalization for other conventional audio equipment. Such tweaking can be carried out via the supporting device user interface.
- a plurality of equalization settings can be supported, with each being associated with a particular mode of operation of hearing assist device 500 . That is, conversation in a quiet room with one other person might receive one equalization profile, while a concert hall might receive another. Modes can be selected in many automatic or commanded ways via either or both of hearing assist device 500 and the external supporting device. Automatic selection can be performed via analysis and classification of captured audio. Certain classifications may trigger selection of a particular mode. Commands may be delivered via any user input interface, such as voice input (voice recognized commands), tactile input commands, etc.
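- The classification-to-mode mapping described above can be sketched with a toy example. The classifier below keys on two summary statistics of captured audio; the class names, statistics, and mode labels are all hypothetical stand-ins, since the patent does not specify a classification method.

```python
# Hypothetical sketch of automatic mode selection: classify captured
# audio, then map the classification to an equalization mode. The
# classifier, thresholds, and mode names are illustrative assumptions.
MODE_FOR_CLASS = {                    # assumed classification -> mode map
    "quiet_conversation": "conversation_eq",
    "concert_hall": "concert_eq",
    "outdoor": "perspective_mode",
}

def classify_audio(rms_db, dynamic_range_db):
    """Toy classifier on two summary statistics of captured audio."""
    if rms_db < 50.0:
        return "quiet_conversation"
    if dynamic_range_db > 40.0:
        return "concert_hall"
    return "outdoor"

def select_mode(rms_db, dynamic_range_db):
    """Select the equalization mode triggered by the classification."""
    return MODE_FOR_CLASS[classify_audio(rms_db, dynamic_range_db)]
```

A commanded selection (voice or tactile input) would simply bypass `classify_audio` and set the mode directly.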
- Audio modes may also comprise alternate or additional audio processing techniques. For example, in one mode, to enhance audio perspective and directionality, delays might be selectively introduced (or increased in a stereoscopic manner) to enhance a wearer's ability to discern the location of an audio source. Sensor data may support automatic mode selection in such situations. Detecting walking impacts and outdoor GPS (Global Positioning System) location might automatically trigger such an enhanced perspective mode. A medical condition might trigger another mode which attenuates environmental audio while delivering synthesized voice commands to the wearer. In another exemplary mode, both echoes and delays might be introduced to simulate a theater environment. For example, when audio is being sourced by a television channel broadcast of a movie, the theater environment mode might be selected. Such selection may be in response to commands from a set top box, television, or media player, or by identifying one of the same as the audio source.
- Such audio processing may be carried out in one or both of hearing assist device 500 and an external supporting device.
- the external supporting device may receive the audio for processing: (i) directly via built in microphones; (ii) from storage; or (iii) via yet another external device.
- the source audio may be captured by hearing assist device 500 itself and delivered via a wired or wireless pathway to the external supporting device for processing before delivery of either the processed audio signals or substitute audio back to hearing assist device 500 for delivery to the wearer.
- sensor data may be captured in one or both of hearing assist device 500 and an external supporting device.
- Sensor data captured by hearing assist device 500 may likewise be delivered via such or other wired or wireless pathways to the external supporting device for (further) processing.
- the external supporting device may then respond to the sensor data received and processed by delivering audio content and/or hearing aid commands back to hearing assist device 500 .
- Such commands may be to reconfigure some aspect of hearing assist device 500 or manage communication or power delivery.
- Such audio content may be instructional, comprise queries, or consist of commands to be delivered to the wearer via the ear drums.
- Sensor data may be stored and displayed in some form locally on the external supporting device along with similar audio, graphical or textual content, commands or queries.
- Sensors within one or both of hearing assist device 500 and an external supporting device may be medical sensors or environmental sensors (e.g., latitude/longitude, velocity, temperature, wearer's physical orientation, acceleration, elevation, tilt, humidity, etc.).
- hearing assist device 500 may also be configured with an imager that may be located near transceiver 520 .
- the imager can then be used to capture images or video that may be relayed to one or more external supporting devices for real time display, storage or processing. For example, upon detecting a medical situation and no response to audible content queries delivered via hearing assist device 500 , the imager can be commanded (from an internal or external command origin) to capture an image or a video sequence.
- Such imager output can be delivered to medical staff via a user's supporting smartphone so that a determination can be made as to the user's condition or the position/location of hearing assist device 500 .
- Inner ear temperature sensor 516 is a sensor that may be present to measure a temperature of the user.
- inner ear temperature sensor 516 may include a lens used to measure inner ear temperature.
- IR light emitted by an IR light emitter may be reflected from the user's skin, such as the ear canal or ear drum, and received by a single temperature sensor element, a one-dimensional array of temperature sensor elements, a two-dimensional array of temperature sensor elements, or other configuration of temperature sensor elements.
- Inner ear temperature sensor 516 may generate a sensor output signal (e.g., an electrical signal) that indicates a measured inner ear temperature.
- Such a configuration may also be used to determine a distance to the user's ear drum.
- the IR light emitter and sensor may be used to determine a distance to the user's ear drum from hearing assist device 500 , which may be used by processing logic to automatically control a volume of sound emitted from hearing assist device 500 , as well as for other purposes.
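As a rough illustration of such distance-driven volume control, the sketch below maps a measured ear-drum distance to a clamped output gain; the inverse-square scaling and all constants are assumptions for illustration only:

```python
def volume_gain_for_distance(distance_mm, reference_mm=20.0):
    """Scale speaker gain with measured distance to the ear drum.

    Inverse-square sketch: a farther ear drum gets more gain, clamped to
    a safe range. All values are illustrative assumptions, not
    calibrated acoustic data.
    """
    gain = (distance_mm / reference_mm) ** 2
    return max(0.5, min(gain, 2.0))
```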
- the IR light emitter/sensor may also be used as an imager that captures an image of the inside of the user's ear. This could be used to identify characteristics of vein structures inside the user's ear, for example.
- the IR light emitter/sensor could also be used to detect the user's heartbeat, as well as to perform further functions.
- hearing assist device 500 may include a light sensor that senses outdoor light levels for various purposes.
- Position/motion sensor 518 includes one or more sensors that may be present to measure time of day, location, acceleration, orientation, vibrations, and/or other movement related characteristics of the user.
- position/motion sensor 518 may include one or more of a GPS (global positioning system) receiver (to measure user position), an accelerometer (to measure acceleration of the user), a gyroscope (to measure orientation of the head of the user), a magnetometer (to determine a direction the user is facing), a vibration sensor (e.g., a micro-electromechanical system (MEMS) vibration sensor), etc.
- Position/motion sensor 518 may be used for various benefits, including determining whether a user has fallen (e.g., based on measured position, acceleration, orientation, etc.), for local VoD, and many more benefits. Position/motion sensor 518 may generate a sensor output signal (e.g., an electrical signal) that indicates one or more of the measured time of day, location, acceleration, orientation, vibration, etc.
- MEMS sensors may be configured to record the position/movement of the head of a wearer for health purposes, for location applications, and/or for other reasons.
- a wireless transceiver and a MEMS sensor of hearing assist device 500 can determine the location of the head of the user, which may be a more meaningful positioning reference point than a position of a mobile device (e.g., cellphone) held against the user's head by the user.
- hearing assist device 500 may be configured to alert the user when the user is not looking at the road properly while driving.
- hearing assist device 500 may determine that the wearer fell down and may send a communication signal to the user's mobile device to dial 911 or other emergency number.
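A fall-detection heuristic of the kind described (an impact spike followed by stillness) might look like the following sketch; the thresholds and the impact-then-still rule are illustrative assumptions, not the disclosed algorithm:

```python
def detect_fall(accel_samples_g, impact_threshold_g=2.5, still_threshold_g=0.2):
    """Flag a fall when a large impact spike is followed by near-stillness.

    accel_samples_g: acceleration magnitudes in g, oldest first.
    Thresholds are illustrative assumptions, not clinical values.
    """
    for i, a in enumerate(accel_samples_g):
        if a >= impact_threshold_g:
            after = accel_samples_g[i + 1:]
            # Lying still reads close to 1 g (gravity only).
            if after and all(abs(x - 1.0) < still_threshold_g for x in after):
                return True
    return False
```

On a positive result, the device might signal the user's mobile device to dial an emergency number, as described above.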
- wireless communication signals may be used to help triangulate and determine position of the head.
- the user may shake their head up/down and/or may otherwise move their head to answer verbal commands provided by hearing assist device 500 and/or by the user's phone without the user having to speak.
- the user may be enabled to speak to hearing assist device 500 to respond to queries (e.g., “did you fall?”, “are you alright?”, “should I dial for help?”, “are you falling asleep?”, etc.).
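Head-gesture answering (nod for yes, shake for no) could be approximated by counting direction reversals in gyroscope data, as in this hypothetical sketch; the amplitude and swing thresholds are assumptions:

```python
def classify_head_gesture(pitch_deltas, yaw_deltas, min_swings=2, amplitude=10.0):
    """Classify a nod ("yes") vs. a shake ("no") from gyroscope angle
    deltas in degrees, by counting direction reversals above an
    amplitude threshold. A real device would filter and debounce."""
    def swings(deltas):
        signs = [1 if d > amplitude else -1 if d < -amplitude else 0
                 for d in deltas]
        signs = [s for s in signs if s != 0]
        return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

    nod, shake = swings(pitch_deltas), swings(yaw_deltas)
    if nod >= min_swings and nod >= shake:
        return "yes"
    if shake >= min_swings:
        return "no"
    return "none"
```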
- Position data can be processed in hearing assist device 500 , in the mobile device, in the “cloud”, etc. In an embodiment, to save power, the position data may be used to augment mobile/cloud data for better accuracy and for special circumstances.
- Hearing assist device 500 may determine a proximity to the mobile device of the user even if the camera on the mobile device is not in view. Based upon position and sensor data, hearing assist device 500 may determine the direction the person is looking to aid artificial reality. In an embodiment, hearing assist device 500 may be configured to calibrate position data when the head is in view of a remote camera.
- the sensor information indicated by position/motion sensor 518 and/or other sensors may be used for various purposes. For instance, position/motion information may be used to determine that the user has fallen down/collapsed.
- In response, voice and/or video assist may be provided (e.g., by a handheld device in communication with hearing assist device 500 ).
- Such sensor data and feedback information, if warranted, can be automatically forwarded to medical staff, ambulance services, and/or family members, for example, as described elsewhere herein.
- the analysis of the data that triggered the forwarding process may be performed in whole or in part on hearing assist device 500 , on the assisting local device (e.g., a smart phone, tablet computer, set top box, TV, etc., in communication with hearing assist device 500 ), and/or on remote computing systems (e.g., at medical staff offices or as might be available through a cloud or portal service).
- forward IR/UV (ultraviolet) communication transceiver 520 , BTLE antenna 522 , microphone(s) 524 , telecoil 526 , tethered sensor port 528 , WPT/NFC coil 530 , switch 532 , skin communication conductor 534 , glucose spectroscopy sensor 536 , a heart rate sensor 538 , volume controller 540 , and communication and power delivery coil 542 are located at different locations in/on the first portion 504 of hearing assist device 500 . In alternative embodiments, one or more of these features may be located in/on different locations of hearing assist device 500 .
- Forward IR/UV communication transceiver 520 is a communication mechanism that may be present to enable communications with another device, such as a smart phone, computer, etc.
- Forward IR/UV communication transceiver 520 may receive information/data from processing logic of hearing assist device 500 to be transmitted to the other device in the form of modulated light (e.g., IR light, UV light, etc.), and may receive information/data in the form of modulated light from the other device to be provided to the processing logic of hearing assist device 500 .
- Forward IR/UV communication transceiver 520 may enable low power communications for hearing assist device 500 , to reduce a load on a battery of hearing assist device 500 .
- an emitter/receiver of forward IR/UV communication transceiver 520 may be positioned on housing 502 to be facing forward in a direction a wearer of hearing assist device 500 faces. In this manner, the forward IR/UV communication transceiver 520 may communicate with a device held by the wearer, such as a smart phone, a tablet computer, etc., to provide text to be displayed to the wearer, etc.
- BTLE antenna 522 is a communication mechanism coupled to a BluetoothTM transceiver in hearing assist device 500 that may be present to enable communications with another device, such as a smart phone, computer, etc.
- BTLE antenna 522 may receive information/data from processing logic of hearing assist device 500 to be transmitted to the other device according to the BluetoothTM specification, and may receive information/data transmitted according to the BluetoothTM specification from the other device to be provided to the processing logic of hearing assist device 500 .
- Microphone(s) 524 is a sensor that may be present to receive environmental sounds, including voice of the user, voice of other persons, and other sounds in the environment (e.g., traffic noise, music, etc.). Microphone(s) 524 may include any number of microphones, and may be configured in any manner, including being omni-directional (non-directional), directional, etc. Microphone(s) 524 generates an audio signal based on the received environmental sound that may be processed and/or filtered by processing logic of hearing assist device 500 , may be stored in digital form in hearing assist device 500 , may be transmitted from hearing assist device 500 , and may be used in other ways.
- Telecoil 526 is a communication mechanism that may be present to enable communications with another device.
- Telecoil 526 is an audio induction loop that enables audio sources to be directly coupled to hearing assist device 500 in a manner known to persons skilled in the relevant art(s).
- Telecoil 526 may be used with a telephone, a radio system, and induction loop systems that transmit sound to hearing aids.
- Tethered sensor port 528 is a port that a remote sensor (separate from hearing assist device 500 ) may be coupled with to interface with hearing assist device 500 .
- port 528 may be an industry standard or proprietary connector type.
- a remote sensor may have a tether (one or more wires) with a connector at an end that may be plugged into port 528 . Any number of tethered sensor ports 528 may be present.
- sensor types that may interface with tethered sensor port 528 include brainwave sensors (e.g., electroencephalography (EEG) sensors that record electrical activity along the scalp according to EEG techniques) attached to the user's scalp, heart rate/arrhythmia sensors attached to a chest of the user, etc.
- Such brainwave sensors may record/measure electrical signals of the user's brain.
- WPT/NFC coil 530 is a communication mechanism coupled to an NFC transceiver in hearing assist device 500 that may be present to enable communications with another device, such as a smart phone, computer, etc., as described above with respect to NFC transceiver 110 ( FIG. 1 ).
- Switch 532 is a switching mechanism that may be present on housing 502 to perform various functions, such as switching power on or off, switching between different power and/or operational modes, etc. A user may interact with switch 532 to switch power on or off, to switch between modes, etc. Switch 532 may be any type of switch, including a toggle switch, a push button switch, a rocker switch, a three-(or greater) position switch, a dial switch, etc.
- Skin communication conductor 534 is a communication mechanism coupled to a transceiver in hearing assist device 500 that may be present to enable communications with another device, such as a smart phone, computer, etc., through skin of the user.
- skin communication conductor 534 may enable communications to flow between hearing assist device 500 and a smart phone held in the hand of the user, a second hearing assist device worn on an opposite ear of the user, a pacemaker or other device implanted in the user, or other communications device in communication with skin of the user.
- a transceiver of hearing assist device 500 may receive information/data from processing logic to be transmitted from skin communication conductor 534 through the user's skin to the other device, and the transceiver may receive information/data at skin communication conductor 534 that was transmitted from the other device through the user's skin to be provided to the processing logic of hearing assist device 500 .
- Glucose spectroscopy sensor 536 is a sensor that may be present to measure a glucose level of the user using spectroscopy techniques in a manner known to persons skilled in the relevant art(s). Such a measurement may be valuable in determining whether a user has diabetes. Such a measurement can also be valuable in helping a diabetic user determine whether insulin is needed, etc. (e.g., hypoglycemia or hyperglycemia).
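Spectroscopic glucose measurement is commonly modeled with the Beer-Lambert law; the sketch below shows the basic relation, with placeholder calibration constants rather than clinical values:

```python
import math

def absorbance(incident_intensity, transmitted_intensity):
    """Beer-Lambert absorbance: A = log10(I0 / I)."""
    return math.log10(incident_intensity / transmitted_intensity)

def glucose_concentration(a, molar_absorptivity, path_length_cm):
    """Concentration c = A / (epsilon * l) from the Beer-Lambert law.

    The calibration constants (epsilon, path length) are placeholders;
    a real sensor would be calibrated per wavelength and per user.
    """
    return a / (molar_absorptivity * path_length_cm)
```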
- Glucose spectroscopy sensor 536 may be configured to monitor glucose in combination with subcutaneous sensor 544 . As shown in FIG. 5 , subcutaneous sensor 544 is shown separate from, and proximate to, hearing assist device 500 .
- subcutaneous sensor 544 may be embedded in the user's skin, in or around the user's ear. In an alternative embodiment, subcutaneous sensor 544 may be located in/on hearing assist device 500 .
- Subcutaneous sensor 544 is a sensor that may be present to measure any attribute of a user's health, characteristics or status.
- subcutaneous sensor 544 may be a glucose sensor implanted under the skin behind the ear so as to provide a reasonably close mating location with communication and power delivery coil 542 .
- glucose spectroscopy sensor 536 may measure the user glucose level with respect to subcutaneous sensor 544 , and may generate a sensor output signal (e.g., an electrical signal) that indicates a glucose level of the user.
- Heart rate sensor 538 is a sensor that may be present to measure a heart rate of the user. For instance, in an embodiment, upon receiving power, heart rate sensor 538 may measure pressure changes with respect to a blood vessel in the ear, or may measure heart rate in another manner, such as via changes in reflectivity, or otherwise as would be known to persons skilled in the relevant art(s). Missed beats, elevated heart rate, and further heart conditions may be detected in this manner. Heart rate sensor 538 may generate a sensor output signal (e.g., an electrical signal) that indicates a measured heart rate.
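A minimal heart-rate estimate from such a pressure or reflectivity waveform can be obtained by counting rising threshold crossings, as in this illustrative sketch (the threshold and the absence of filtering are assumptions):

```python
def heart_rate_bpm(samples, sample_rate_hz, threshold):
    """Estimate heart rate by counting rising threshold crossings in a
    sampled pressure/reflectivity waveform. A simplified sketch; a real
    sensor pipeline would band-pass filter and reject motion artifacts."""
    beats = sum(1 for prev, cur in zip(samples, samples[1:])
                if prev < threshold <= cur)
    duration_s = len(samples) / sample_rate_hz
    return 60.0 * beats / duration_s
```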
- subcutaneous sensor 544 might comprise at least a portion of an internal heart monitoring device which communicates via communication and power delivery coil 542 heart status information and data. Subcutaneous sensor 544 could also be associated with or be part of a pacemaker or defibrillating implant, insulin pump, etc.
- Volume controller 540 is a user interface mechanism that may be present on housing 502 to enable a user to modify a volume at which sound is broadcast from speaker 512 .
- a user may interact with volume controller 540 to increase or decrease the volume.
- Volume controller 540 may be any suitable controller type (e.g., a potentiometer), including a rotary volume dial, a thumb wheel, a capacitive touch sensing device, etc.
- communication and power delivery coil 542 may support both communication and power delivery, or may be dedicated to one or the other.
- such coil may only support power delivery (if needed to charge or otherwise deliver power to subcutaneous sensor 544 ), and can be replaced with any other type of communication system that supports communication with subcutaneous sensor 544 .
- coils/antennas of hearing assist device 500 may be separately included in hearing assist device 500 , or in embodiments, two or more of the coils/antennas may be combined as a single coil/antenna.
- the processing logic of hearing assist device 500 may be operable to set up/configure and adaptively reconfigure each of the sensors of hearing assist device 500 based on an analysis of the data obtained by such sensor as well as on an analysis of data obtained by other sensors.
- a first sensor of hearing assist device 500 may be configured to operate at one sampling rate (or sensing rate) which is analyzed periodically or continuously.
- a second sensor of hearing assist device 500 can be in a sleep or power down mode to conserve battery power. When a threshold is exceeded or other triggering event occurs, such first sensor can be reconfigured by the processing logic of hearing assist device 500 to sample at a higher rate or continuously and the second sensor can be powered up and configured. Additionally, multiple types of sensor data can be used to construct or derive single conclusions.
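The threshold-triggered reconfiguration described above can be sketched as follows; the sampling rates and the trigger rule are illustrative assumptions:

```python
class SensorManager:
    """Sketch of threshold-triggered sensor reconfiguration: a first
    sensor samples slowly to conserve battery until a reading crosses a
    threshold, then it is sped up and a second sensor is powered up.
    Rates and the trigger rule are illustrative assumptions."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.primary_rate_hz = 1        # slow polling to save battery
        self.secondary_awake = False    # second sensor sleeps initially

    def on_sample(self, value):
        if value > self.threshold and not self.secondary_awake:
            self.primary_rate_hz = 50   # sample faster (or continuously)
            self.secondary_awake = True # power up and configure sensor two
```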
- FIG. 6 shows a hearing assist device 600 that is an example of hearing assist device 102 according to an exemplary embodiment.
- Hearing assist device 600 is configured to be at least partially inserted into the ear canal of a user (e.g., an ear bud).
- a user may wear a single hearing assist device 600 on one ear, or may simultaneously wear first and second hearing assist devices 600 on the user's right and left ears, respectively.
- hearing assist device 600 includes a case or housing 602 that has a generally cylindrical shape, and includes a first portion 604 , a second portion 606 , and a third portion 608 .
- First portion 604 is shaped to be inserted at least partially into the ear canal of the user.
- Second portion 606 extends coaxially from first portion 604 .
- Third portion 608 is a handle that extends from second portion 606 . A user grasps third portion 608 to extract hearing assist device 600 from the ear of the user.
- hearing assist device 600 further includes pH sensor 510 , speaker 512 , IR (infrared) or sonic distance sensor 514 , inner ear temperature sensor 516 , and an antenna 610 .
- pH sensor 510 , speaker 512 , IR (infrared) or sonic distance sensor 514 , inner ear temperature sensor 516 may function and be configured similarly as described above.
- hearing assist device 600 may include an outer ear temperature sensor to determine outside ear temperature.
- Antenna 610 may include one or more coils or other types of antennas to function as any one or more of the coils/antennas described above with respect to FIG. 5 and/or elsewhere herein (e.g., an NFC antenna, a BluetoothTM antenna, etc.).
- antennas, such as coils, mentioned herein may be implemented as any suitable type of antenna, including a coil, a microstrip antenna, or another antenna type.
- although further sensors, communication mechanisms, switches, etc., of hearing assist device 500 of FIG. 5 are not shown included in hearing assist device 600 , one or more of these features may additionally and/or alternatively be included in hearing assist device 600 .
- sensors that are present in a hearing assist device may all operate simultaneously, or one or more sensors may be run periodically, and may be off at other times (e.g., based on an algorithm in program code, etc.). By running fewer sensors at any one time, battery power may be conserved.
- Sensor management (e.g., duty cycling, continuous operations, threshold triggers, sampling rates, etc.) may similarly involve the assisting local device (e.g., smart phone, tablet computer, set top box, TV, etc.) and/or remote computing systems (e.g., at medical staff offices or as might be available through a cloud or portal service).
- Hearing assist devices 102 , 500 , and 600 may be configured in various ways with circuitry to process sensor information, and to communicate with other devices.
- the next section describes some example circuit embodiments for hearing assist devices, as well as processes for communicating with other devices, and for further functionality.
- hearing assist devices may be configured in various ways to perform their functions.
- FIG. 7 shows a circuit block diagram of a hearing assist device 700 that is configured to communicate with external devices according to multiple communication schemes, according to an exemplary embodiment.
- Hearing assist devices 102 , 500 , and 600 may each be implemented similarly to hearing assist device 700 , according to embodiments.
- hearing assist device 700 includes a plurality of sensors 702 a - 702 c , processing logic 704 , a microphone 706 , an amplifier 708 , a filter 710 , an analog-to-digital (A/D) converter 712 , a speaker 714 , an NFC coil 716 , an NFC transceiver 718 , an antenna 720 , a BluetoothTM transceiver 722 , a charge circuit 724 , a battery 726 , a plurality of sensor interfaces 728 a - 728 c , and a digital-to-analog (D/A) converter 764 .
- Processing logic 704 includes a digital signal processor (DSP) 730 , a central processing unit (CPU) 732 , and a memory 734 .
- Sensors 702 a - 702 c , processing logic 704 , amplifier 708 , filter 710 , A/D converter 712 , NFC transceiver 718 , BluetoothTM transceiver 722 , charge circuit 724 , sensor interfaces 728 a - 728 c , D/A converter 764 , DSP 730 , and CPU 732 may each be implemented in the form of hardware (e.g., electrical circuits, digital logic, etc.) or a combination of hardware and software/firmware.
- the features of hearing assist device 700 shown in FIG. 7 are described as follows.
- microphone 706 is a sensor that receives environmental sounds, including voice of the user of hearing assist device 700 , voice of other persons, and other sounds in the environment (e.g., traffic noise, music, etc.).
- Microphone 706 may be configured in any manner, including being omni-directional (non-directional), directional, etc., and may include one or more microphones.
- Microphone 706 may be a miniature microphone conventionally used in hearing aids, as would be known to persons skilled in the relevant art(s), or may be another suitable type of microphone.
- Microphone(s) 524 ( FIG. 5 ) is an example of microphone 706 .
- Microphone 706 generates a received audio signal 740 based on the received environmental sound.
- Amplifier 708 receives and amplifies received audio signal 740 to generate an amplified audio signal 742 .
- Amplifier 708 may be any type of amplifier, including a low-noise amplifier for amplifying low level signals.
- Filter 710 receives and processes amplified audio signal 742 to generate a filtered audio signal 744 .
- Filter 710 may be any type of filter, including being a filter configured to filter out noise, other high frequencies, and/or other frequencies as desired.
- A/D converter 712 receives filtered audio signal 744 , which may be an analog signal, and converts filtered audio signal 744 to digital form, to generate a digital audio signal 746 .
- A/D converter 712 may be configured in any manner, including as a conventional A/D converter.
- Processing logic 704 receives digital audio signal 746 , and may process digital audio signal 746 in any manner to generate processed digital audio signal 762 .
- DSP 730 may receive digital audio signal 746 , and may perform digital signal processing on digital audio signal 746 to generate processed digital audio signal 762 .
- DSP 730 may be configured in any manner, including as a conventional DSP known to persons skilled in the relevant art(s), or in another manner.
- DSP 730 may perform any suitable type of digital signal processing to process/filter digital audio signal 746 , including processing digital audio signal 746 in the frequency domain to manipulate the frequency spectrum of digital audio signal 746 (e.g., according to Fourier transform/analysis techniques, etc.).
- DSP 730 may amplify particular frequencies, may attenuate particular frequencies, and may otherwise modify digital audio signal 746 in the discrete domain. DSP 730 may perform the signal processing for various reasons, including noise cancelation or hearing loss compensation. For instance, DSP 730 may process digital audio signal 746 to compensate for a personal hearing frequency response of the user, such as compensating for poor hearing of high frequencies, middle range frequencies, or other personal frequency response characteristics of the user.
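Frequency-domain compensation of this kind can be sketched with an FFT and per-band gains; the band boundaries and gain values below are placeholders, not a fitted audiogram:

```python
import numpy as np

def compensate(audio, sample_rate_hz, band_gains):
    """Apply per-band linear gains in the frequency domain, e.g. boosting
    high frequencies for a user who hears them poorly.

    band_gains maps (low_hz, high_hz) tuples to linear gains; all values
    here are illustrative assumptions, not calibrated compensation.
    """
    spectrum = np.fft.rfft(audio)
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sample_rate_hz)
    for (low, high), gain in band_gains.items():
        spectrum[(freqs >= low) & (freqs < high)] *= gain
    return np.fft.irfft(spectrum, n=len(audio))
```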
- DSP 730 may be pre-configured to process digital audio signal 746 .
- DSP 730 may receive instructions from CPU 732 regarding how to process digital audio signal 746 .
- CPU 732 may access one or more DSP configurations stored in memory 734 (e.g., in other data 768 ) that may be provided to DSP 730 to configure DSP 730 for digital signal processing of digital audio signal 746 .
- CPU 732 may select a DSP configuration based on a hearing assist mode selected by a user of hearing assist device 700 (e.g., by interacting with switch 532 , etc.).
- D/A converter 764 receives processed digital audio signal 762 , and converts processed digital audio signal 762 to analog form, generating processed audio signal 766 .
- D/A converter 764 may be configured in any manner, including as a conventional D/A converter.
- Speaker 714 receives processed audio signal 766 , and broadcasts sound generated based on processed audio signal 766 into the ear of the user. The user is enabled to hear the broadcast sound, which may be amplified, filtered, and/or otherwise frequency manipulated with respect to the sound received by microphone 706 .
- Speaker 714 may be a miniature speaker conventionally used in hearing aids, as would be known to persons skilled in the relevant art(s), or may be another suitable type of speaker.
- Speaker 512 ( FIG. 5 ) is an example of speaker 714 .
- Speaker 714 may include one or more speakers.
- FIG. 8 shows a flowchart 800 of a process for a hearing assist device that processes and transmits sensor data and receives a command from a second device, according to an exemplary embodiment.
- hearing assist device 700 (as well as any of hearing assist devices 102 , 500 , and 600 ) may perform flowchart 800 . Further structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the following description of flowchart 800 and hearing assist device 700 .
- Flowchart 800 begins with step 802 .
- a sensor output signal is received from a medical sensor of the hearing assist device that senses a characteristic of the user.
- sensors 702 a - 702 c may each sense/measure information about a health characteristic of the user of hearing assist device 700 .
- Sensors 702 a - 702 c may each be one of the sensors shown in FIGS. 5 and 6 , and/or mentioned elsewhere herein. Although three sensors are shown in FIG. 7 for purposes of illustration, other numbers of sensors may be present in hearing assist device 700 , including one sensor, two sensors, or greater numbers of sensors.
- Sensors 702 a - 702 c each may generate a corresponding sensor output signal 758 a - 758 c (e.g., an electrical signal) that indicates the measured information about the corresponding health characteristic.
- sensor output signals 758 a - 758 c may be analog or digital signals having levels or values corresponding to the measured information.
- Sensor interfaces 728 a - 728 c are each optionally present, depending on whether the corresponding sensor outputs a sensor output signal that needs to be modified to be receivable by CPU 732 .
- each of sensor interfaces 728 a - 728 c may include an amplifier, filter, and/or A/D converter (e.g., similar to amplifier 708 , filter 710 , and A/D converter 712 ) that respectively amplify (e.g., increase or decrease), filter (e.g., reduce particular frequencies), and/or convert to digital form the corresponding sensor output signal.
- Sensor interfaces 728 a - 728 c (when present) respectively output modified sensor output signals 760 a - 760 c.
- the sensor output signal is processed to generate processed sensor data.
- processing logic 704 receives modified sensor output signals 760 a - 760 c .
- Processing logic 704 may process modified sensor output signals 760 a - 760 c in any manner to generate processed sensor data.
- CPU 732 may receive modified sensor output signals 760 a - 760 c .
- CPU 732 may process the sensor information in one or more of modified sensor output signals 760 a - 760 c to generate processed sensor data.
- CPU 732 may manipulate the sensor information (e.g., according to an algorithm of code 738 ) to convert the sensor information into a presentable form (e.g., scaling the sensor information, adding or subtracting a constant to/from the sensor information, etc.). Furthermore, CPU 732 may transmit the sensor information of modified sensor output signals 760 a - 760 c to DSP 730 to be digital signal processed by DSP 730 to generate processed sensor data, and may receive the processed sensor data from DSP 730 . The processed and/or raw (unprocessed) sensor data may optionally be stored in memory 734 (e.g., as sensor data 736 ).
- the processed sensor data is wirelessly transmitted from the hearing assist device to a second device.
- CPU 732 may provide the sensor data (processed or raw) (e.g., from CPU registers, from DSP 730 , from memory 734 , etc.) to a transceiver to be transmitted from hearing assist device 700 .
- hearing assist device 700 includes an NFC transceiver 718 and a BT transceiver 722 , which may each be used to transmit sensor data from hearing assist device 700 .
- hearing assist device 700 may include one or more additional and/or alternative transceivers that may transmit sensor data from hearing assist device 700 , including a Wi-Fi transceiver, a forward IR/UV communication transceiver (e.g., transceiver 520 of FIG. 5 ), a telecoil transceiver (which may transmit via telecoil 526 ), a skin communication transceiver (which may transmit via skin communication conductor 534 ), etc.
- NFC transceiver 718 may receive an information signal 740 from CPU 732 that includes sensor data for transmitting.
- NFC transceiver 718 may modulate the sensor data onto NFC antenna signal 748 to be transmitted from hearing assist device 700 by NFC coil 716 when NFC coil 716 is energized by an RF field generated by a second device.
- BT transceiver 722 may receive an information signal 754 from CPU 732 that includes sensor data for transmitting.
- BT transceiver 722 may modulate the sensor data onto BT antenna signal 752 to be transmitted from hearing assist device 700 by antenna 720 (e.g., BTLE antenna 522 of FIG. 5 ), according to a BluetoothTM communication protocol or standard.
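The receive-process-transmit flow of flowchart 800 can be sketched as follows, with callables standing in for the sensor hardware and radio layers; the scaling constant in the usage example is a hypothetical illustration:

```python
def flowchart_800(sensor, process, transmit):
    """Sketch of flowchart 800: receive a sensor output signal, process
    it into presentable sensor data, and wirelessly transmit it to a
    second device. The callables stand in for hardware and radio layers."""
    raw = sensor()          # receive sensor output signal (step 802)
    data = process(raw)     # generate processed sensor data
    return transmit(data)   # transmit to a second device (e.g., NFC/BT)

# Hypothetical usage: scale a raw ADC count into a presentable value.
result = flowchart_800(lambda: 512,
                       lambda raw: raw * 0.07,
                       lambda data: {"sent": data})
```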
- a hearing assist device may transmit/make a first communication with one or more other devices to provide sensor data and/or other information, and to receive information.
- FIG. 9 shows a communication system 900 that includes a hearing assist device communicating with other communication devices, according to an exemplary embodiment.
- communication system 900 includes hearing assist device 700 , a mobile computing device 902 , a stationary computing device 904 , and a server 906 .
- System 900 is described as follows.
- Mobile computing device 902 (e.g., a local supporting device) is a device capable of communicating with hearing assist device 700 according to one or more communication techniques. For instance, as shown in FIG. 9 , mobile computing device 902 includes a telecoil 910 , one or more microphones 912 , an IR/UV communication transceiver 914 , a WPT/NFC coil 916 , and a BluetoothTM antenna 918 . In embodiments, mobile computing device 902 may include one or more of these features and/or alternative or additional features (e.g., communication mechanisms, etc.).
- Mobile computing device 902 may be any type of mobile electronic device, including a personal digital assistant (PDA), a laptop computer, a notebook computer, a tablet computer (e.g., an Apple iPad™), a netbook, a mobile phone (e.g., a cell phone, a smart phone, etc.), a special purpose medical device, etc.
- the features of mobile computing device 902 shown in FIG. 9 are described as follows.
- Telecoil 910 is a communication mechanism that may be present to enable mobile computing device 902 to communicate with hearing assist device 700 via a telecoil (e.g., telecoil 526 of FIG. 5 ).
- telecoil 910 and an associated transceiver may enable mobile computing device 902 to couple audio sources and/or other communications to hearing assist device 700 in a manner known to persons skilled in the relevant art(s).
- Microphone(s) 912 may be present to receive voice of a user of mobile computing device 902 .
- the user may provide instructions for mobile computing device 902 and/or for hearing assist device 700 by speaking into microphone(s) 912 .
- the received voice may be transmitted to hearing assist device 700 (in digital or analog form) according to any communication mechanism, or may be converted into data and/or commands to be provided to hearing assist device 700 to cause functions/actions in hearing assist device 700 .
- Microphone(s) 912 may include any number of microphones, and may be configured in any manner, including being omni-directional (non-directional), directional, etc.
- IR/UV communication transceiver 914 is a communication mechanism that may be present to enable communications with hearing assist device 700 via an IR/UV communication transceiver of hearing assist device 700 (e.g., forward IR/UV communication transceiver 520 of FIG. 5 ). IR/UV communication transceiver 914 may receive information/data from and/or transmit information/data to hearing assist device 700 (e.g., in the form of modulated light, as described above).
- WPT/NFC coil 916 is an NFC antenna coupled to a NFC transceiver in mobile computing device 902 that may be present to enable NFC communications with an NFC communication mechanism of hearing assist device 700 (e.g., NFC transceiver 110 of FIG. 1 , NFC coil 530 of FIG. 5 ). WPT/NFC coil 916 may be used to receive information/data from and/or transmit information/data to hearing assist device 700 .
- Bluetooth™ antenna 918 is a communication mechanism coupled to a Bluetooth™ transceiver in mobile computing device 902 that may be present to enable communications with hearing assist device 700 (e.g., BT transceiver 722 and antenna 720 of FIG. 7 ). Bluetooth™ antenna 918 may be used to receive information/data from and/or transmit information/data to hearing assist device 700 .
- mobile computing device 902 and hearing assist device 700 may exchange communication signals 920 according to any communication mechanism/protocol/standard mentioned herein or otherwise known. According to step 806 , hearing assist device 700 may wirelessly transmit sensor data to mobile computing device 902 .
- Stationary computing device 904 (e.g., a local supporting device) is also a device capable of communicating with hearing assist device 700 according to one or more communication techniques.
- stationary computing device 904 may be capable of communicating with hearing assist device 700 according to any of the communication mechanisms shown for mobile computing device 902 in FIG. 9 , and/or according to other communication mechanisms/protocols/standards described elsewhere herein or otherwise known.
- Stationary computing device 904 may be any type of stationary electronic device, including a desktop computer (e.g., a personal computer, etc.), a docking station, a set top box, a gateway device, an access point, special purpose medical equipment, etc.
- stationary computing device 904 and hearing assist device 700 may exchange communication signals 922 according to any communication mechanism/protocol/standard mentioned herein or otherwise known. According to step 806 , hearing assist device 700 may wirelessly transmit sensor data to stationary computing device 904 .
- mobile computing device 902 and/or stationary computing device 904 may communicate with server 906 (e.g., a remote supporting device, a third device) via network 908 .
- Network 908 may be any type of communication network, including a local area network (LAN), a wide area network (WAN), a personal area network (PAN), a phone network (e.g., a cellular network, a land based network), or a combination of communication networks, such as the Internet.
- Network 908 may include wired and/or wireless communication pathway(s) implemented using any of a wide variety of communication media and associated protocols.
- such communication pathway(s) may comprise wireless communication pathways implemented via radio frequency (RF) signaling, infrared (IR) signaling, or the like.
- Such signaling may be carried out using long-range wireless protocols such as WIMAX® (IEEE 802.16) or GSM (Global System for Mobile Communications), medium-range wireless protocols such as WI-FI® (IEEE 802.11), and/or short-range wireless protocols such as BLUETOOTH® or any of a variety of IR-based protocols.
- Such communication pathway(s) may also comprise wired communication pathways established over twisted pair, Ethernet cable, coaxial cable, optical fiber, or the like, using suitable communication protocols therefor.
- Communications over network 908 may be protected using security protocols (e.g., private key exchange, etc.).
- Server 906 may be any computer system, including a stationary computing device, a server computer, a mobile computing device, etc.
- Server 906 may include a web service, an API (application programming interface), or other service or interface for communications.
- Sensor data and/or other information may be transmitted (e.g., relayed) to server 906 over network 908 to be processed.
- server 906 may transmit processed data, instructions, and/or other information through network 908 to mobile computing device 902 (and/or stationary computing device 904 ) to be transmitted to hearing assist device 700 to be stored, to cause a function/action at hearing assist device 700 , and/or for other reasons.
- hearing assist device 700 may receive a second communication as a wirelessly transmitted communication signal from a second device at NFC coil 716 , antenna 720 , or other antenna or communication mechanism at hearing assist device 700 .
- the communication may include a command and/or may identify a function, and hearing assist device 700 may respond by performing the command and/or function.
- hearing assist device 700 may respond by gathering additional sensor data, by analyzing retrieved sensor data, by performing a command, etc.
- Example commands include commands relating to sensor data capture, such as a command for a particular sensor to perform and/or provide a measurement, a command related to a sensing configuration (e.g., turning on and/or off particular sensors, calibrating particular sensors, etc.), a command related to a hearing assist device configuration (e.g., turning on and/or off particular hearing assist device components, calibrating particular components, etc.), a command that defines audio playback, etc.
- a received communication may define audio playback, such as by including or causing audio data to be played to the user by a speaker of hearing assist device 700 as voice or other sound, including audio that prompts for user input (e.g., requests a user response to a question, etc.).
- a command may be transmitted from NFC coil 716 on NFC antenna signal 748 to NFC transceiver 718 .
- NFC transceiver 718 may demodulate command data from the received communication signal, and provide the command to CPU 732 .
- the command may be transmitted from antenna 720 on BT antenna signal 752 to BT transceiver 722 .
- BT transceiver 722 may demodulate command data from the received communication signal, and provide the command to CPU 732 .
- the CPU 732 may execute the received command.
- the received command may cause hearing assist device 700 to perform one or more functions/actions.
- the command may cause hearing assist device 700 to turn on or off, to change modes, to activate or deactivate one or more sensors, to wirelessly transmit further information, to execute particular program code (e.g., stored as code 738 in memory 734 ), to play a sound (e.g., an alert, a tone, a beeping noise, pre-recorded or synthesized voice, etc.) from speaker 714 to the user to inform the user of information and/or cause the user to perform a function/action, and/or cause one or more additional and/or alternative functions/actions to be performed by hearing assist device 700 . Further examples of such commands and functions/actions are described elsewhere herein.
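- A minimal sketch of how CPU 732 might execute such received commands (the command names and device state fields here are illustrative assumptions, not defined by this description):

```python
class HearingAssistDevice:
    """Toy model of command execution by the device CPU. Commands are
    assumed to arrive as small dictionaries after demodulation; the
    'op' names and state fields are hypothetical."""

    def __init__(self):
        self.powered = True
        self.mode = "normal"
        self.active_sensors = set()

    def execute(self, command: dict):
        op = command.get("op")
        if op == "power":
            # Turn the device on or off.
            self.powered = bool(command["on"])
        elif op == "set_mode":
            # Change operating modes.
            self.mode = command["mode"]
        elif op == "sensor":
            # Activate or deactivate a particular sensor.
            (self.active_sensors.add if command["on"]
             else self.active_sensors.discard)(command["id"])
        else:
            raise ValueError(f"unknown command: {op}")
        return self
```

- A dispatcher like this keeps each command's effect local to one branch, so new commands (e.g., "play a sound", "run stored code") can be added without touching the transceiver-side demodulation path.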
- a hearing assist device may be configured to convert received RF energy into charge for storage in a battery of the hearing assist device.
- hearing assist device 700 includes charge circuit 724 for charging battery 726 , which is a rechargeable battery (e.g., rechargeable battery 114 ).
- charge circuit 724 may operate according to FIG. 10 .
- FIG. 10 shows a flowchart 1000 of a process for wirelessly charging a battery of a hearing assist device, according to an exemplary embodiment. Flowchart 1000 is described as follows.
- a radio frequency signal is received.
- NFC coil 716 , antenna 720 , and/or other antenna or coil of hearing assist device 700 may receive a radio frequency (RF) signal.
- the RF signal may be a communication signal that includes data (e.g., modulated on the RF signal), or may be an un-modulated RF signal.
- Charge circuit 724 may be coupled to one or more of NFC coil 716 , antenna 720 , or other antenna to receive the RF signal.
- a charge current is generated that charges a rechargeable battery of the hearing assist device based on the received radio frequency signal.
- charge circuit 724 is configured to generate a charge current 756 that is used to charge battery 726 .
- Charge circuit 724 may be configured in various ways to convert a received RF signal to a charge current.
- charge circuit 724 may include an induction coil to take power from an electromagnetic field and convert it to electrical current.
- charge circuit 724 may include a diode rectifier circuit that rectifies the received RF signal to a DC (direct current) signal, and may include one or more charge pump circuits coupled to the diode rectifier circuit used to create a higher voltage value from the DC signal.
- charge circuit 724 may be configured in other ways to generate charge current 756 from a received RF signal.
- hearing assist device 700 may maintain power for operation, with battery 726 being charged periodically by RF fields generated by other devices, rather than needing to physically replace batteries.
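- As a back-of-envelope model of charge circuit 724 (the efficiency and battery-voltage figures below are illustrative assumptions, not values from this description), the charge current available from a received RF field follows from P = V × I:

```python
def harvested_charge_current(field_power_mw: float,
                             rectifier_efficiency: float = 0.6,
                             battery_voltage: float = 1.4) -> float:
    """Estimate the charge current (mA) available from a received RF
    field: the diode rectifier converts a fraction of the incident RF
    power to DC, and the charge pump delivers it at the battery
    voltage.  Both parameter defaults are rough assumptions."""
    dc_power_mw = field_power_mw * rectifier_efficiency
    return dc_power_mw / battery_voltage  # P = V * I  ->  I = P / V
```

- For example, 10 mW of received RF at 60% conversion efficiency into a 1.4 V cell yields roughly 4.3 mA of charge current, enough for slow trickle charging when the device is near an RF field source.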
- hearing assist device 700 may be configured to generate sound based on received sensor data.
- hearing assist device 700 may operate according to FIG. 11 .
- FIG. 11 shows a flowchart 1100 of a process for generating and broadcasting sound based on sensor data, according to an exemplary embodiment. For purposes of illustration, flowchart 1100 is described as follows with reference to FIG. 7 .
- Flowchart 1100 begins with step 1102 .
- an audio signal is generated based at least on the processed sensor data.
- a sensor output signal may be processed to generate processed sensor data.
- the processed sensor data may be stored in memory 734 as sensor data 736 , may be held in registers in CPU 732 , or may be present in another location.
- Audio data for one or more sounds (e.g., tones, beeping sounds, voice segments, etc.) corresponding to particular sensor data may be stored in memory 734 .
- CPU 732 or DSP 730 may select the audio data corresponding to particular sensor data from memory 734 .
- CPU 732 may transmit a request for the audio data from another device using a communication mechanism (e.g., NFC transceiver 718 , BT transceiver 722 , etc.).
- DSP 730 may receive the audio data from CPU 732 , from memory 734 , or from another device, and may generate processed digital audio signal 762 based thereon.
- In step 1104 , sound is generated based on the audio signal, and the sound is broadcast from a speaker of the hearing assist device into the ear of the user.
- D/A converter 764 may be present, and may receive processed digital audio signal 762 .
- D/A converter 764 may convert processed digital audio signal 762 to analog form to generate processed audio signal 766 .
- Speaker 714 receives processed audio signal 766 , and broadcasts sound generated based on processed audio signal 766 into the ear of the user.
- sounds may be provided to the user by hearing assist device 700 based at least on sensor data, and optionally further based on additional information.
- the sounds may provide information to the user, and may remind or instruct the user to perform a function/action.
- the sounds may include one or more of a tone, a beeping sound, or a voice that includes at least one of a verbal instruction to the user, a verbal warning to the user, or a verbal question to the user.
- a tone or a beeping sound may be provided to the user as an alert based on particular values of sensor data (e.g., indicating a high glucose/blood sugar value), and/or a voice instruction may be provided to the user as the alert based on the particular values of sensor data (e.g., a voice segment stating “Blood sugar is low—Insulin is required” or “hey, your heart rate is 80 beats per minute, your heart is fine, your pacemaker has got 6 hours of battery left.”).
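- A minimal sketch of mapping a sensor value to an alert sound as described above (the thresholds and messages are illustrative assumptions, not clinical guidance):

```python
def glucose_alert(glucose_mg_dl: float):
    """Map a glucose reading to an audio alert: a voice message for a
    low reading, beeps for a high reading, a single confirmation tone
    when in range.  All thresholds here are made-up examples."""
    if glucose_mg_dl < 70:
        return ("voice", "Blood sugar is low - please check it now")
    if glucose_mg_dl > 180:
        return ("beep", 3)   # three beeps for a high reading
    return ("tone", 1)       # single confirmation tone when in range
```

- The returned tuple stands in for a selection from the stored audio data: the first element picks the class of sound, and the second carries either the repeat count or the text to synthesize or retrieve.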
- hearing assist device 700 may be configured to generate filtered environmental sound.
- hearing assist device 700 may operate according to FIG. 12 .
- FIG. 12 shows a flowchart 1200 of a process for generating and broadcasting filtered sound from a hearing assist device, according to an exemplary embodiment. For purposes of illustration, flowchart 1200 is described as follows with reference to FIG. 7 .
- Flowchart 1200 begins with step 1202 .
- an audio signal is generated based on environmental sound received by at least one microphone of the hearing assist device.
- microphone 706 may generate a received audio signal 740 based on received environmental sound.
- Received audio signal 740 may optionally be amplified, filtered, and converted to digital form to generate digital audio signal 746 , as shown in FIG. 7 .
- DSP 730 may receive digital audio signal 746 , and may perform digital signal processing on digital audio signal 746 to generate processed digital audio signal 762 .
- DSP 730 may favor one or more frequencies by amplifying particular frequencies, may attenuate particular frequencies, and/or may otherwise filter digital audio signal 746 in the discrete domain.
- DSP 730 may perform the signal processing for various reasons, including noise cancelation or hearing loss compensation. For instance, DSP 730 may process digital audio signal 746 to compensate for a personal hearing frequency response of the user, such as compensating for poor hearing of high frequencies, middle range frequencies, or other personal frequency response characteristics of the user.
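- A toy sketch of such personal frequency response compensation, applying assumed per-band gains (an audiogram-style prescription invented for illustration) to a simplified spectral representation of the audio:

```python
import bisect

def compensate(spectrum, band_edges, band_gains_db):
    """Apply per-band gain to a toy spectrum, given as a list of
    (freq_hz, amplitude) pairs, mimicking how a DSP might boost the
    frequencies a user hears poorly.  band_edges is a sorted list of
    boundary frequencies; band_gains_db has one entry per band."""
    out = []
    for freq, amp in spectrum:
        band = bisect.bisect_right(band_edges, freq)  # locate the band
        gain = 10 ** (band_gains_db[band] / 20.0)     # dB -> linear
        out.append((freq, amp * gain))
    return out
```

- For a user with high-frequency hearing loss, for example, edges [1000, 4000] with gains [0, 0, 12] leave low and middle frequencies untouched while boosting everything above 4 kHz by 12 dB; a production implementation would apply equivalent gains with filter banks rather than on a component list.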
- In step 1206 , sound is generated based on the modified audio signal, and the sound is broadcast from a speaker of the hearing assist device into the ear of the user.
- D/A converter 764 may be present, and may receive processed digital audio signal 762 .
- D/A converter 764 may convert processed digital audio signal 762 to analog form to generate processed audio signal 766 .
- Speaker 714 receives processed audio signal 766 , and broadcasts sound generated based on processed audio signal 766 into the ear of the user.
- environmental noise, voice, and other sounds may be tailored to a particular user's personal hearing frequency response characteristics.
- particular noises in the environment may be attenuated (e.g., road noise, engine noise, etc.) to be filtered from the received environmental sounds so that the user may better hear important or desired sounds.
- sounds that are desired to be heard (e.g., music, a conversation, a verbal warning, verbal instructions, sirens, sounds of a nearby car accident, etc.) may thereby be made easier for the user to hear.
- hearing assist device 700 may be configured to transmit recorded voice of a user to another device.
- hearing assist device 700 may operate according to FIG. 13 .
- FIG. 13 shows a flowchart 1300 of a process for generating an information signal in a hearing assist device based on a voice of a user, and for transmitting the information signal to a second device, according to an exemplary embodiment.
- flowchart 1300 is described as follows with reference to FIG. 7 .
- Flowchart 1300 begins with step 1302 .
- an audio signal is generated based on a voice of the user received at a microphone of the hearing assist device.
- microphone 706 may generate a received audio signal 740 based on received voice of the user.
- Received audio signal 740 may optionally be amplified, filtered, and converted to digital form to generate digital audio signal 746 , as shown in FIG. 7 .
- the voice of the user may be any statement made by the user, including a question, a statement of fact, a command, or any other verbal sequence. For instance, the user may ask “what is my heart rate?” Such statements may be intended for capture by one or more hearing assist devices and by supporting local and remote systems. Such statements may also include unintentional sounds such as semi-lucid ramblings, moaning, choking, coughing, and/or other sounds. Any one or more of the hearing assist devices and the supporting local device may receive (via microphones) such audio and forward it from the hearing assist device(s) as needed for further processing.
- This processing may include voice and/or sound recognition, comparisons with command words or sequences, (video, audio) prompting for (gesture, tactile or audible) confirmation, carrying out commands, storage for later analysis or playback, and/or forwarding to an appropriate recipient system for further processing, storage, and/or presentations to others.
- an information signal is generated based on the audio signal.
- DSP 730 may receive digital audio signal 746 .
- DSP 730 and/or CPU 732 may generate an information signal from digital audio signal 746 to be transmitted to a second device from hearing assist device 700 .
- DSP 730 and/or CPU 732 may optionally perform voice/speech recognition on digital audio signal 746 to recognize spoken words included therein, and may include the spoken words in the generated information signal.
- code 738 stored in memory 734 may include a voice recognition program that may be executed by CPU 732 and/or DSP 730 .
- the voice recognition program may use conventional or proprietary voice recognition techniques.
- voice recognition techniques may be augmented by sensor data.
- position/motion sensor 518 may include a vibration sensor.
- the vibration sensor may detect vibrations of the user associated with speaking (e.g., jaw movement of the wearer during talking), and may generate corresponding vibration information/data.
- the vibration information output by the vibration sensor may be received by CPU 732 and/or DSP 730 , and may be used to aid in improving speech/voice recognition performed by the voice recognition program.
- the vibration information may be used by the voice recognition program to detect breaks between words, to identify the location of spoken syllables, to identify the syllables themselves, and/or to better perform other aspects of voice recognition.
- the vibration information may be transmitted from hearing assist device 700 , along with the information signal, to a second device to perform the voice recognition process at the second device (or other device).
- the generated information signal is transmitted to the second device.
- CPU 732 may provide the information signal (e.g., from CPU registers, from DSP 730 , from memory 734 , etc.) to a transceiver to be transmitted from hearing assist device 700 (e.g., NFC transceiver 718 , BT transceiver 722 , or other transceiver).
- Another device, such as mobile computing device 902 , stationary computing device 904 , or server 906 (which may be associated devices, third party devices utilized by third parties, or devices otherwise related or unrelated to hearing assist device 700 ), may receive the transmitted voice information, and may analyze the voice (spoken words, moans, slurred words, etc.) therein to determine one or more functions/actions to be performed. As a result, one or more functions/actions may be determined to be performed by hearing assist device 700 or another device.
- hearing assist device 700 may be configured to enable voice to be received and/or generated to be played to the user.
- hearing assist device 700 may operate according to FIG. 14 .
- FIG. 14 shows a flowchart 1400 of a process for generating voice to be broadcast to a user, according to an exemplary embodiment. For purposes of illustration, flowchart 1400 is described as follows with reference to FIG. 7 .
- Flowchart 1400 begins with step 1402 .
- a sensor output signal is received from a medical sensor of the hearing assist device that senses a characteristic of the user.
- sensors 702 a - 702 c each sense/measure information about a health characteristic of the user of hearing assist device 700 .
- sensor 702 a may sense a characteristic of the user (e.g., a heart rate, a blood pressure, a glucose level, a temperature, etc.).
- Sensor 702 a generates sensor output signal 758 a , which indicates the measured information about the corresponding health characteristic.
- Sensor interface 728 a , when present, may convert sensor output signal 758 a to modified sensor output signal 760 a , to be received by processing logic.
- processed sensor data is generated based on the sensor output signal.
- processing logic 704 receives modified sensor output signal 760 a , and may process modified sensor output signal 760 a in any manner.
- CPU 732 may receive modified sensor output signal 760 a , and may process the sensor information contained therein to generate processed sensor data.
- CPU 732 may manipulate the sensor information (e.g., according to an algorithm of code 738 ) to convert the sensor information into a presentable form (e.g., scaling the sensor information, adding or subtracting a constant to/from the sensor information, etc.), or may otherwise process the sensor information.
- CPU 732 may transmit the sensor information of modified sensor output signal 760 a to DSP 730 to be digital signal processed.
- a voice audio signal generated based at least on the processed sensor data is received.
- the processed sensor data generated in step 1404 may be transmitted from hearing assist device 700 to another device (e.g., as shown in FIG. 9 ), and a voice audio signal may be generated at the other device based on the processed sensor data.
- the voice audio signal may be generated by processing logic 704 based on the processed sensor data.
- the voice audio signal contains voice information (e.g., spoken words) that relate to the processed sensor data.
- the voice information may include a verbal alert, verbal instructions, and/or other verbal information to be provided to the user based on the processed sensor data (e.g., based on a value of measured sensor data, etc.).
- the voice information may be generated by being synthesized, being retrieved from memory 734 (e.g., a library of recorded spoken segments in other data 768 ), or being generated from a combination thereof. It is noted that the voice audio signal may be generated based on processed sensor data from one or more sensors. DSP 730 may output the voice audio signal as processed digital audio signal 762 .
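- A minimal sketch of composing such a voice message from library segments plus a synthesized value (the segment identifiers and the "synth:" convention are hypothetical, invented for illustration):

```python
# Illustrative library of pre-recorded segment ids, standing in for a
# library of recorded spoken segments held in device memory.
SEGMENTS = {
    "heart_rate_is": "seg_hr",
    "beats_per_minute": "seg_bpm",
}

def build_voice_message(heart_rate_bpm: int):
    """Compose the playlist for 'your heart rate is N beats per
    minute' by combining retrieved library segments with a
    synthesized number, mirroring the retrieval-plus-synthesis
    combination described above."""
    return [SEGMENTS["heart_rate_is"],
            f"synth:{heart_rate_bpm}",
            SEGMENTS["beats_per_minute"]]
```

- Pre-recording the fixed phrases keeps the synthesized portion down to the variable value, which is cheaper than synthesizing the whole sentence on a low-power device.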
- voice is broadcast from the speaker into the ear of the user based on the received voice audio signal.
- D/A converter 764 may be present, and may receive processed digital audio signal 762 .
- D/A converter 764 may convert processed digital audio signal 762 to analog form to generate processed audio signal 766 .
- Speaker 714 receives processed audio signal 766 , and broadcasts voice generated based on processed audio signal 766 into the ear of the user.
- voice may be provided to the user by hearing assist device 700 based at least on sensor data, and optionally further based on additional information.
- the voice may provide information to the user, and may remind or instruct the user to perform a function/action.
- the voice may include at least one of a verbal instruction to the user (“take an iron supplement”), a verbal warning to the user (“your heart rate is high”), a verbal question to the user (“have you fallen down, and do you need assistance?”), or a verbal answer to the user (“your heart rate is 98 beats per minute”).
- hearing assist devices may be configured to perform various functions using hardware (e.g., circuits), or a combination of hardware and software/firmware (e.g., code 738 of FIG. 7 , etc.). Furthermore, hearing assist devices may communicate with remote devices (e.g., mobile computing device 902 , stationary computing device 904 , server 906 , etc.) that include corresponding functionality.
- FIG. 15 shows a system 1500 comprising a hearing assist device 1501 and a cloud/service/phone/portable device 1503 that may be communicatively connected thereto.
- Hearing assist device 1501 may comprise, for example and without limitation, one of hearing assist devices 102 , 500 , 600 , or 700 described above. Although only a single hearing assist device 1501 is shown in FIG. 15 , system 1500 may include two hearing assist devices.
- Device 1503 may comprise, for example and without limitation, mobile computing device 902 , stationary computing device 904 , server 906 , or another remote device that is accessible to hearing assist device 1501 .
- device 1503 may be local with respect to the wearer of hearing assist device 1501 or remote with respect to the wearer of hearing assist device 1501 .
- Hearing assist device 1501 includes a number of processing modules that may be implemented as software or firmware running on one or more general purpose processors (e.g., CPU 732 of FIG. 7 ) and/or DSPs (e.g., DSP 730 ), as dedicated circuitry, or as a combination thereof.
- Such processors and/or dedicated circuitry are collectively referred to in FIG. 15 as general purpose (DSP) and dedicated processing circuitry 1513 .
- the processing modules include a speech generation module 1523 , a speech/noise recognition module 1525 , an enhanced audio processing module 1527 , a clock/scheduler module 1529 , a mode select and reconfiguration module 1531 , and a battery management module 1533 .
- hearing assist device 1501 further includes local storage 1535 .
- Local storage 1535 comprises one or more volatile and/or non-volatile memory devices or structures that are internal to hearing assist device 1501 (e.g., memory 734 of FIG. 7 ). Such memory devices or structures may be used to store recorded audio information in an audio playback queue 1537 as well as to store information and settings 1539 associated with hearing assist device 1501 , a user thereof, a device paired thereto, and to services (cloud-based or otherwise) accessed by or on behalf of hearing assist device 1501 .
- Hearing assist device 1501 further includes sensor components and associated circuitry 1541 .
- sensor components and associated circuitry may include but are not limited to one or more microphones, bone conduction sensors, temperature sensors, blood pressure sensors, blood glucose sensors, pulse oximetry sensors, pH sensors, vibration sensors, accelerometers, gyros, magnetos, any other sensor mentioned elsewhere herein, or the like.
- Hearing assist device 1501 still further includes user interface (UI) components and associated circuitry 1543 .
- UI components may include buttons, switches, dials, capacitive touch sensing devices, or other mechanical components by which a user may control and configure the operation of hearing assist device 1501 (e.g., switch 532 and volume controller 540 ).
- Such UI components may also comprise capacitive sensing components to allow for touch-based or tap-based interaction with hearing assist device 1501 .
- Such UI components may further include a voice-based UI.
- voice-based UI may utilize speech/noise recognition module 1525 to recognize commands uttered by a user of hearing assist device 1501 and/or speech generation module 1523 to provide output in the form of pre-defined or synthesized speech.
- In embodiments in which hearing assist device 1501 comprises an integrated part of a pair of glasses, a visor, or a helmet, user interface components and associated circuitry 1543 may also comprise a display integrated with or projected upon a portion of the glasses, visor or helmet for presenting information to a user.
- Hearing assist device 1501 also includes communication interfaces and associated circuitry 1545 for carrying out communication over one or more wired, wireless, or skin-based communication pathways. Communication interfaces and associated circuitry 1545 enable hearing assist device 1501 to communicate with device 1503 . Communication interfaces and associated circuitry 1545 may also enable hearing assist device 1501 to communicate with a second hearing assist device worn by the same user as well as with other devices.
- cloud/service/phone/portable device 1503 comprises power resources, processing resources, and storage resources that can be used by hearing assist device 1501 to assist in performing certain operations and/or to improve the performance of such operations when a communication pathway has been established between the two devices.
- device 1503 includes a number of assist processing modules that may be implemented as software or firmware running on one or more general purpose processors and/or DSPs, as dedicated circuitry, or as a combination thereof.
- Such processors and/or dedicated circuitry are collectively referred to in FIG. 15 as general/dedicated processing circuitry (with hearing assist device support) 1553 .
- the processing modules include a speech generation assist module 1555 , a speech/noise recognition assist module 1557 , an enhanced audio processing assist module 1559 , a clock/scheduler assist module 1561 , a mode select and reconfiguration assist module 1563 , and a battery management assist module 1565 .
- device 1503 further includes storage 1567 .
- Storage 1567 comprises one or more volatile and/or non-volatile memory devices/structures and/or storage systems that are internal to or otherwise accessible to device 1503 .
- Such memory devices/structures and/or storage systems may be used to store recorded audio information in an audio playback queue 1569 as well as to store information and settings 1571 associated with hearing assist device 1501 , a user thereof, a device paired thereto, and to services (cloud-based or otherwise) accessed by or on behalf of hearing assist device 1501 .
- storage 1567 may be used to record commands to be cached in device 1503 , such that when a time window becomes available for device 1503 to communicate with the outside environment (because of power savings or availability), such stored commands (and/or other data) may be sent to the user's mobile device, other devices, the cloud, etc. for processing. Results of such processing may be transmitted back to device 1503 , to an email address of the user, a text message address of the user, and/or may be provided to the user in another manner.
- Device 1503 also includes communication interfaces and associated circuitry 1577 for carrying out communication over one or more wired, wireless or skin-based communication pathways.
- Communication interfaces and associated circuitry 1577 enable device 1503 to communicate with hearing assist device 1501 . Such communication may be direct (point-to-point between device 1503 and hearing assist device 1501 ) or indirect (through one or more intervening devices or nodes).
- Communication interfaces and associated circuitry 1577 may also enable device 1503 to communicate with other devices or access various remote services, including cloud-based services.
- device 1503 may also comprise supplemental sensor components and associated circuitry 1573 and supplemental user interface components and associated circuitry 1575 that can be used by hearing assist device 1501 to assist in performing certain operations and/or to improve the performance of such operations.
- a prerequisite for providing external operational support to hearing assist device 1501 by device 1503 may be the establishment of a communication pathway between device 1503 and hearing assist device 1501 .
- the establishment of such a communication pathway is achieved by implementing a communication service on hearing assist device 1501 that monitors for the presence of device 1503 and selectively establishes communication therewith in accordance with a predefined protocol.
- a communication service may be implemented on device 1503 that monitors for the presence of hearing assist device 1501 and selectively establishes communication therewith in accordance with a predefined protocol. Still other methods of establishing a communication pathway between hearing assist device 1501 and device 1503 may be used.
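The presence-monitoring service described above can be sketched as follows. The scan interface, attempt count, and "pair with the first match" policy are illustrative assumptions, not details from the disclosure.

```python
import time

def establish_pathway(scan, max_attempts=5, backoff_s=0.0):
    """Poll for a companion device and pair with the first one found.

    `scan` is a callable returning a list of visible device IDs
    (a stand-in for whatever radio discovery the device implements).
    Returns the paired device ID, or None if none appeared.
    """
    for _attempt in range(max_attempts):
        visible = scan()
        if visible:
            return visible[0]      # predefined protocol: pair with first match
        time.sleep(backoff_s)      # wait before re-scanning to save power
    return None
```

Either endpoint could run this loop; the disclosure allows the service to live on the hearing assist device, the companion device, or both.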
- Hearing assist device 1501 includes battery management module 1533 that monitors a state of a battery internal to hearing assist device 1501 .
- Battery management module 1533 may also be configured to alert a wearer of hearing assist device 1501 when such battery is in a low-power state so that the wearer can recharge the battery.
- the wearer of hearing assist device 1501 can cause such recharging to occur by bringing a portable electronic device within a certain distance of hearing assist device 1501 such that power may be transferred via an NFC link, WPT link, or other suitable link for transferring power between such devices.
- hearing assist device 1501 may be said to be utilizing the power resources of device 1503 to assist in the performance of its operations.
- hearing assist device 1501 can also utilize other resources of device 1503 to assist in performing certain operations and/or to improve the performance of such operations. Whether and when hearing assist device 1501 so utilizes the resources of device 1503 may vary depending upon the designs of such devices and/or any user configuration of such devices.
- hearing assist device 1501 may be programmed to only utilize certain resources of device 1503 when the battery power available to hearing assist device 1501 has dropped below a certain level. As another example, hearing assist device 1501 may be programmed to only utilize certain resources of device 1503 when it is determined that an estimated amount of power that will be consumed in maintaining a particular communication pathway between hearing assist device 1501 and device 1503 will be less than an estimated amount of power that will be saved by offloading functionality to and/or utilizing the resources of device 1503 .
- an assistance feature of device 1503 may be provided when a very low power communication pathway can be established or exists between hearing assist device 1501 and device 1503 , but that same assistance feature of device 1503 may be disabled if the only communication pathway that can be established or exists between hearing assist device 1501 and device 1503 is one that consumes a relatively greater amount of power.
- Still other decision algorithms can be used to determine whether and when hearing assist device 1501 will utilize resources of device 1503 .
- Such algorithms may be applied by battery management module 1533 of hearing assist device 1501 and/or by battery management assist module 1565 of device 1503 prior to activating assistance features of device 1503 .
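The offload decision described above can be sketched as a simple predicate; the threshold and the power estimates are illustrative placeholders, not values given in the disclosure.

```python
def should_offload(battery_pct, link_cost_mw, local_cost_mw,
                   low_battery_threshold_pct=20.0):
    """Decide whether to offload work to the companion device 1503.

    Offload when the local battery is low, or whenever maintaining the
    communication pathway is estimated to cost less power than
    performing the operation locally.
    """
    if battery_pct < low_battery_threshold_pct:
        return True
    return link_cost_mw < local_cost_mw
```

A low-power pathway (small `link_cost_mw`) enables assistance features that a power-hungry pathway would disable, matching the behavior described above.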
- a user interface provided by hearing assist device 1501 and/or device 1503 may enable a user to select which features of hearing assist device 1501 should be able to utilize external operational support and/or under what conditions such external operational support should be provided.
- the settings established by the user may be stored as part of information and settings 1539 in local storage 1535 of hearing assist device 1501 and/or as part of information and settings 1571 in storage 1567 of device 1503 .
- hearing assist device 1501 can also utilize resources of a second hearing assist device to perform certain operations.
- hearing assist device 1501 may communicate with a second hearing assist device worn by the same user to coordinate distribution or shared execution of particular operations. Such communication may be carried out, for example, via a point-to-point link between the two hearing assist devices or via links between the two hearing assist devices and an intermediate device, such as a portable electronic device being carried by a user.
- the determination of whether a particular operation should be performed by hearing assist device 1501 versus the second hearing assist device may be made by battery management module 1533 , a battery management module of the second hearing assist device, or via coordination between both battery management modules.
- when the battery of hearing assist device 1501 holds a greater charge than that of the second hearing assist device, hearing assist device 1501 may be selected to perform a particular operation, such as taking a blood pressure reading or the like.
- Such battery imbalance may result from, for example, one hearing assist device being used at a higher volume than the other over an extended period of time.
- By allocating operations in this manner, a more balanced discharging of the batteries of both devices can be achieved.
- certain sensors may be present on hearing assist device 1501 that are not present on the second hearing assist device and certain sensors may be present on the second hearing assist device that are not present on hearing assist device 1501 , such that a distribution of functionality between the two hearing assist devices is achieved by design.
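The coordination between the two hearing assist devices can be sketched as a task-assignment rule. The device records and the "most charge wins" tie-break are illustrative assumptions.

```python
def pick_device(task, devices):
    """Choose which hearing assist device runs `task`.

    `devices` maps a device name to a dict with its battery percentage
    and the set of sensors it carries. Devices lacking the required
    sensor are excluded; among the rest, the one with the most charge
    wins, which balances discharge over time.
    """
    capable = {name: d for name, d in devices.items()
               if task in d["sensors"]}
    if not capable:
        return None
    return max(capable, key=lambda name: capable[name]["battery_pct"])
```

When only one device carries the needed sensor (the "distribution by design" case above), the battery comparison never comes into play.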
- Hearing assist device 1501 comprises a speech generation module 1523 that enables hearing assist device 1501 to generate and output verbal audio information (spoken words or the like) to a wearer thereof via a speaker of hearing assist device 1501 .
- verbal audio information may be used to implement a voice UI, to provide speech-based alerts, messages and reminders as part of a clock/scheduler feature implemented by clock/schedule module 1529 , or to provide emergency alerts or messages to a wearer of hearing assist device based on a detected medical condition of the wearer, or the like.
- the speech generated by speech generation module 1523 may be pre-recorded and/or dynamically synthesized, depending upon the implementation.
- speech generation assist module 1555 of device 1503 may operate to perform all or part of the speech generation function that would otherwise be performed by speech generation module 1523 of hearing assist device 1501 . Such operation by device 1503 can advantageously cause the battery power of hearing assist device 1501 to be conserved. Any speech generated by speech generation assist module 1555 may be communicated back to hearing assist device 1501 for playback via at least one speaker of hearing assist device 1501 . Any of a wide variety of well-known speech codecs may be used to carry out such transmission of speech information in an efficient manner. Additionally or alternatively, any speech generated by speech generation assist module 1555 can be played back via one or more speakers of device 1503 if device 1503 is local with respect to the wearer of hearing assist device 1501 .
- speech generation assist module 1555 may provide a more elaborate set of features than those provided by speech generation module 1523 , as device 1503 may have access to greater power, processing and storage resources than hearing assist device 1501 to support such additional features.
- speech generation assist module 1555 may provide a more extensive vocabulary of pre-recorded words, terms and sentences or may provide a more powerful speech synthesis engine.
- Hearing assist device 1501 includes a speech/noise recognition module 1525 that is operable to apply speech and/or noise recognition algorithms to audio input received via one or more microphones of hearing assist device 1501 .
- Such algorithms can enable speech/noise recognition module 1525 to determine when a wearer of hearing assist device 1501 is speaking and further to recognize words that are spoken by such wearer, while rejecting non-speech utterances and noise.
- Such algorithms may be used, for example, to enable hearing assist device 1501 to provide a voice-based UI by which a wearer of hearing assist device 1501 can exercise voice-based control over the device.
- speech/noise recognition assist module 1557 of device 1503 may operate to perform all or part of the speech/noise recognition functions that would otherwise be performed by speech/noise recognition module 1525 of hearing assist device 1501 . Such operation by device 1503 can advantageously cause the battery power of hearing assist device 1501 to be conserved.
- speech/noise recognition assist module 1557 may provide a more elaborate set of features than those provided by speech/noise recognition module 1525 , as device 1503 may have access to greater power, processing and storage resources than hearing assist device 1501 to support such additional features.
- speech/noise recognition assist module 1557 may include a training program that a wearer of hearing assist device 1501 can use to train the speech recognition logic to better recognize and interpret his/her own voice.
- speech/noise recognition assist module 1557 may include a process by which a wearer of hearing assist device 1501 can add new words to the dictionary of words that are recognized by the speech recognition logic.
- Such additional features may be included in an application that can be installed by the wearer on device 1503 .
- Such additional features may also be supported by a user interface that forms part of supplemental user interface components and associated circuitry 1575 .
- Hearing assist device 1501 includes an enhanced audio processing module 1527 .
- Enhanced audio processing module 1527 may be configured to process an input audio signal received by hearing assist device 1501 to achieve a desired frequency response prior to playing back such input audio signal to a wearer of hearing assist device 1501 .
- enhanced audio processing module 1527 may selectively amplify certain frequency components of an input audio signal prior to playing back such input audio signal to the wearer.
- the frequency response to be achieved may be specified by or derived from a prescription for the wearer that is provided to hearing assist device 1501 by an external device or system.
- such prescription may be formatted in a standardized manner in order to facilitate use thereof by any of a variety of hearing assistance devices and audio reproduction systems.
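One way a prescription could map to per-band amplification is the classic "half-gain rule" (gain of roughly half the measured hearing loss in each band). This is purely an illustration of the idea; real fitting formulas such as NAL-NL2 or DSL are considerably more involved, and nothing here is specified by the disclosure.

```python
def prescription_gains(audiogram_db):
    """Derive per-band gains (dB) from a hearing-loss audiogram.

    `audiogram_db` maps band center frequency (Hz) to measured hearing
    loss (dB HL). Applies the half-gain rule as a simple illustration.
    """
    return {freq_hz: loss_db / 2.0
            for freq_hz, loss_db in audiogram_db.items()}
```

Enhanced audio processing module 1527 would then apply these gains when selectively amplifying frequency components before playback.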
- enhanced audio processing module 1527 may modify a first input audio signal received by hearing assist device 1501 prior to playback of the first input audio signal to one ear of the wearer, while an enhanced audio processing module of the second hearing assist device modifies a second input audio signal received by the second hearing assist device prior to playback of the second input audio signal to the other ear of the wearer.
- Such modification of the first and second input audio signals can be used to achieve enhanced spatial signaling for the wearer. That is to say, the enhanced audio signals provided to both ears of the wearer will enable the wearer to better determine the spatial origin of sounds.
- Such enhancement is desirable for persons who have a poor ability to detect the spatial origin of sound, and therefore a poor ability to respond to spatial cues.
- an appropriate user-specific “head transfer function” can be determined through testing of a user. The results of such testing may then be used to calibrate the spatial audio enhancement function applied at each ear.
- the hearing assist devices described above may be used in further applications, as well as in variations on the above-described embodiments.
- various health monitoring technologies can be integrated into hearing aid devices as well as into local and remote supporting devices and systems.
- Local systems may comprise one or more smart phones, tablets, or computers, whether portable or stationary. Such devices may have application software installed (downloaded) therein to define supporting behaviors.
- Local systems or devices may also comprise other dedicated health care devices such as monitors, rate measuring devices, and so on that may be stationary or be worn or carried by the user.
- Sensor data collected by one or both of the hearing aid devices and local supporting devices or systems can be used together to help provide a basis for a more accurate diagnosis of a user's current health.
- a hearing assist device may be docked to a stationary docking station, or a mobile device may be held adjacent to the hearing assist device (e.g., against the ear of the user) to cause sensor data and/or other information to be transmitted from the hearing assist device according to NFC techniques, as well as to enable information to be received by the hearing assist device.
- Temperature information measured by a sensor of the hearing assist device may be used to determine whether the hearing aid is being worn by a user. If the hearing assist device is determined to not be worn (e.g., temperature below human temperature is detected), processing logic of the hearing assist device may cause the hearing assist device to enter a low power state (e.g., with periodic flashing LED or audio to support attempts to find a misplaced hearing aid).
- an elevated human temperature may cause the processing logic to power up communication circuitry within the hearing assist device, which may in turn cause a remote device to power up and participate in a data exchange with the hearing assist device.
- the elevated temperature may be reported to another person, including medical personnel.
- a temperature extreme may cause a request to be transmitted to a remote device (e.g., a smart phone) to dial 911, medical staff, or family members.
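The temperature-driven behavior above can be sketched as a small decision function. The threshold values are illustrative placeholders, not clinical values from the disclosure.

```python
def temperature_action(temp_c):
    """Map an ear-temperature reading to a device action.

    Below human skin temperature the device is likely not being worn;
    elevated readings escalate from caregiver reporting up to an
    emergency dial request relayed through a remote device.
    """
    if temp_c < 30.0:
        return "low_power_findme"   # not worn: sleep, flash LED to be found
    if temp_c > 40.0:
        return "dial_emergency"     # temperature extreme: request 911 dial
    if temp_c > 38.0:
        return "report_fever"       # elevated: notify medical personnel
    return "normal"
```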
- Program code, applications, or “apps” may be downloaded to a hearing assist device and stored in memory (e.g., in memory 734 of FIG. 7 as code 738 ). Such applications can program different sensor response functionality, tailoring the hearing assist device and/or mobile computing device to service a particular user. For example, sensor data (e.g., motion, heart rate, stress levels) may be analyzed by processing logic along with recorded audio sounds (e.g., sounds of pain, moaning, slurred words, a lack of sound) to detect a lack of movement, a stroke, or a heart attack, and to cause a request to be generated to dial 911 and/or a doctor immediately (e.g., through a wireless link to a local access point or phone).
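The multi-sensor fusion just described can be sketched as a simple scoring rule. The sensor names, thresholds, and audio labels are invented for illustration; a real classifier would be far more sophisticated.

```python
def assess_emergency(motion_level, heart_rate_bpm, audio_tags):
    """Fuse motion, heart-rate, and audio cues into an emergency decision.

    `audio_tags` is a set of labels a hypothetical audio classifier
    produced, e.g. {"moaning", "slurred_words", "silence"}.
    """
    concerning_audio = {"moaning", "slurred_words", "pain", "silence"}
    score = 0
    if motion_level < 0.1:                          # prolonged lack of movement
        score += 1
    if heart_rate_bpm < 40 or heart_rate_bpm > 150:  # abnormal heart rate
        score += 1
    if audio_tags & concerning_audio:                # distressed audio cues
        score += 1
    return "dial_911" if score >= 2 else "monitor"
```

Requiring agreement between two independent cues before dialing is one way to keep false alarms down; the disclosure leaves the exact logic open.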
- the smart phone can determine a distance between the hearing assist device and the ear, and processing logic of the hearing assist device may adjust broadcast sound accordingly. If there is no hearing assist device in the user's ear, and the phone reliably identifies the correct user (e.g., by camera, by sound, etc.), the phone may be configured to compensate for hearing loss of the user (e.g., by amplifying particular frequency ranges). The user may manually increase the phone volume, and may select an icon or other user interface mechanism to turn on hearing aid frequency compensation. Magnetic field induction may also be used to communicate the audio signal.
- an alarm clock signal delivered via a hearing assist device may be configured to repeat until the user is determined to be upright by processing logic of the hearing assist device (e.g., based on measured information from position/motion 518 ).
- processing logic may cause a message (e.g., from memory 734 ) to be broadcast to the user, such as “All systems stable. You are at home and it is 8 am. It is time to take your XYZ pill.”
- Such messages, alerts, etc. may be triggered by processing logic in response to sensor data changes (e.g., emergencies, etc.), smart phone interaction, and/or pressing a status button on the hearing assist device.
- a user may be determined by processing logic to have fallen down (e.g., based on measured information from position/motion 518 that indicates an impact and/or user orientation).
- processing logic may cause a message (e.g., from memory 734 ) to be broadcast to the user, such as “are you ok? Say yes if so, and no if injured.”
- the processing logic of the hearing assist device may then step the user through a question/answer (Q/A) interaction that, based on sensor data and circumstances, arrives at a likelihood that medical intervention is needed (e.g., “Are you sweaty? Can you read a book at arm's length? Can you read the letters? Shut one eye. Shut the other eye.”).
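A minimal sketch of such a Q/A triage step follows. The questions, weights, and cutoff are hypothetical; they merely illustrate turning answers into an escalation decision.

```python
def triage(answers):
    """Score a post-fall question/answer exchange.

    `answers` maps question IDs to the user's "yes"/"no" responses.
    A risk score at or above the cutoff escalates to medical staff.
    """
    risk = 0
    if answers.get("are_you_ok") == "no":
        risk += 2
    if answers.get("sweaty") == "yes":
        risk += 1
    if answers.get("can_read_at_arms_length") == "no":
        risk += 1
    return "call_medical_staff" if risk >= 2 else "continue_monitoring"
```

The full interaction record could then be transmitted to a medical staff member for review, as described below.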
- Information regarding the Q/A interaction may be transmitted to a medical staff member, who may review the interaction and deliver their own voice to the user through the hearing assist device and/or smart phone.
- a microphone of the hearing assist device may be used to capture the user's verbal responses, which may be delivered back to the medical staff member (or a family member).
- Such a communication flow may of course be carried out via a local cell phone of the user, or via a back-door channel through a third party's cell phone (a third-party device).
- status message playback at the hearing assist device can be triggered by voice recognized commands received from the user.
- the hearing assist device and smart phone may use NFC to transfer a call or other audio to the hearing assist device.
- a skin pathway (e.g., via skin communication conductor 534 ) may be used for communications from a hand-held smart phone to skin communication conductor 534 .
- a doctor may remotely evaluate (and control) hearing assist device performance/settings/battery, extra collected health data, etc., and deliver audio, with or without placing a call.
- voice signaling can be injected into the hearing pathway (via speakers of the hearing assist device, or by speakers of the mobile computing device).
- a warning message may be received from a smart environment for dangerous items in that area (e.g., from access points, smart phones, computers, sensors, etc.).
- the warning message may be played to the ear of the user with background sounds suppressed (e.g., by DSP 730 ) to make sure that the user hears the warning message.
- An intelligent mixing of sounds may be performed by DSP 730 .
- the hearing assist device may be configured to ensure that particular desired sounds are heard clearly despite a sound level of the radio.
- the hearing assist device and/or the vehicle itself may amplify certain desired sounds or other sensor readings to the user, such as another vehicle that is getting too close, or an obstacle detected in front of the vehicle.
- voice/speech recognition may be incorporated into a hearing assist device to enable commands from the user to be recognized and transmitted to a remote device under certain circumstances.
- voice commands provided by the user may be explicit (e.g., “contact my doctor”), or may be coded (e.g., saying “apple” to cause the hearing assist device to contact the user's doctor) for various reasons, such as to avoid public embarrassment regarding wearing a hearing aid.
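The explicit and coded command mapping can be sketched as a simple lookup table. The phrases and action names are hypothetical examples in the spirit of the "apple" example above.

```python
# Hypothetical mapping from recognized utterances to device actions.
CODED_COMMANDS = {
    "apple": "contact_doctor",          # coded phrasing avoids public embarrassment
    "contact my doctor": "contact_doctor",  # explicit equivalent
    "call home": "call_family",
}

def interpret(utterance):
    """Map a recognized utterance (explicit or coded) to a device action."""
    return CODED_COMMANDS.get(utterance.strip().lower(), "unknown")
```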
- voice of the user may be recognized by the hearing assist device, and converted to a text message that is displayed to the user on a mobile computing device, or transmitted to one or more intended recipients.
- the mobile computing device may transmit commands to the hearing assist device that are converted to audio that is broadcast into the ear of the user by a speaker of the hearing assist device (e.g., to provide a privacy mode).
- augmented reality features may be provided, where the mobile computing device can supply extra information (e.g., by voice) to the user based upon location and other aspects.
- a medical condition of the user may be detected by the hearing assist device, as well as a location of the user, which may be used to launch a web search to find a local medical clinic (e.g., contact information, an address, etc.).
- the hearing assist device and/or mobile computing device can train on the voice of a talker to support better filtering over time.
- the hearing assist device may implement voice recognition that detects slurred or unusual speech patterns of the user, which may indicate a potential medical condition of the user. For instance, slurred speech and time of detection information may prove critical when attempting to identify a window of opportunity in which blood thinners may be useful in minimizing brain damage due to a stroke.
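The timing logic behind the stroke example can be sketched as follows. The 4.5-hour figure is a commonly cited thrombolytic treatment window, used here only as an illustrative parameter and not as medical guidance or a value from the disclosure.

```python
from datetime import datetime, timedelta

def within_treatment_window(detected_at, now, window_hours=4.5):
    """Check whether a slurred-speech onset timestamp still falls
    inside a treatment window in which blood thinners may be useful.
    """
    return now - detected_at <= timedelta(hours=window_hours)
```

Recording the detection timestamp at onset, as the disclosure suggests, is what makes this check possible later at the point of care.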
- a hearing assist device may perform an emergency call through a smart phone. For instance, if a person finds a user that is unconscious, the person may place their smart phone near the ear of the user, and the user's hearing assist device may make an emergency communication through the smart phone.
- the hearing assist device may gather sensor data to be used to evaluate the user's health, and may relay this sensor data through the smart phone to an emergency responder.
- the hearing assist device may even provide commands to the person to perform on the unconscious user (e.g., “feel the user's forehead,” etc.).
- injected voice may be provided by a hearing assist device to a user.
- the user may be listening to music that is transmitted to the hearing assist device from a remote device (e.g., through BluetoothTM, the user's skin, etc.).
- Voice provided by the hearing assist device may interrupt the music to provide verbal information to the user, such as “your blood pressure is dropping,” “you have a fever,” etc.
- program code or “apps” may be downloaded to a hearing assist device as well as to the remote device(s). Upgrades to downloaded apps may also be downloaded. Such downloads may be performed opportunistically to preserve battery life. For instance, such downloads may be queued to be performed when the hearing assist device is being charged (e.g., by a proximate device providing an RF field, when it is placed in a charger, etc.).
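The opportunistic download behavior can be sketched as a queue that only releases work while the device is charging. The class and method names are illustrative.

```python
class DownloadQueue:
    """Queue app/firmware downloads and release them only while charging,
    so radio use does not drain the hearing aid battery."""

    def __init__(self):
        self.pending = []

    def request(self, item):
        """Record a download request without acting on it yet."""
        self.pending.append(item)

    def poll(self, is_charging):
        """Return the items to download now (all of them if charging)."""
        if not is_charging:
            return []
        ready, self.pending = self.pending, []
        return ready
```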
Abstract
Description
- This application claims the benefit of U.S. Provisional Application No. 61/662,217, filed on Jun. 20, 2012, which is incorporated by reference herein in its entirety.
- 1. Field of the Invention
- The present invention relates to hearing assist devices that sense, analyze, and communicate user health characteristics.
- 2. Background Art
- Persons may become hearing impaired for a variety of reasons, including aging and being exposed to excessive noise, which can both damage hair cells in the inner ear. A hearing aid is an electro-acoustic device that typically fits in or behind the ear of a wearer, and amplifies and modulates sound for the wearer. Hearing aids are frequently worn by persons who are hearing impaired to improve their ability to hear sounds. A hearing aid may be worn in one or both ears of a user, depending on whether one or both of the user's ears need assistance.
- Methods, systems, and apparatuses are described for hearing assist devices that include health sensors, transmitters, and receivers, as well as additional and/or alternative functionality, substantially as shown in and/or described herein in connection with at least one of the figures, as set forth more completely in the claims.
- The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate embodiments and, together with the description, further serve to explain the principles of the embodiments and to enable a person skilled in the pertinent art to make and use the embodiments.
- FIG. 1 shows a communication system that includes a multi-sensor hearing assist device that communicates with a near field communication (NFC)-enabled communications device, according to an exemplary embodiment.
- FIGS. 2-4 show various configurations for associating a multi-sensor hearing assist device with an ear of a user, according to exemplary embodiments.
- FIG. 5 shows a multi-sensor hearing assist device that mounts over an ear of a user, according to an exemplary embodiment.
- FIG. 6 shows a multi-sensor hearing assist device that extends at least partially into the ear canal of a user, according to an exemplary embodiment.
- FIG. 7 shows a circuit block diagram of a multi-sensor hearing assist device that is configured to communicate with external devices according to multiple communication schemes, according to an exemplary embodiment.
- FIG. 8 shows a flowchart of a process for a hearing assist device that processes and transmits sensor data and receives a command from a second device, according to an exemplary embodiment.
- FIG. 9 shows a communication system that includes a multi-sensor hearing assist device that communicates with one or more communications devices and network-connected devices, according to an exemplary embodiment.
- FIG. 10 shows a flowchart of a process for wirelessly charging a battery of a hearing assist device, according to an exemplary embodiment.
- FIG. 11 shows a flowchart of a process for broadcasting sound that is generated based on sensor data, according to an exemplary embodiment.
- FIG. 12 shows a flowchart of a process for generating and broadcasting filtered sound from a hearing assist device, according to an exemplary embodiment.
- FIG. 13 shows a flowchart of a process for generating an information signal in a hearing assist device based on a voice of a user, and transmitting the information signal to a second device, according to an exemplary embodiment.
- FIG. 14 shows a flowchart of a process for generating voice based at least on sensor data to be broadcast by a speaker of a hearing assist device to a user, according to an exemplary embodiment.
- FIG. 15 shows a system that includes a hearing assist device and a cloud/service/phone portable device that may be communicatively connected thereto, according to an exemplary embodiment.
- Embodiments will now be described with reference to the accompanying drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.
- The present specification discloses numerous example embodiments. The scope of the present patent application is not limited to the disclosed embodiments, but also encompasses combinations of the disclosed embodiments, as well as modifications to the disclosed embodiments.
- References in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
- Furthermore, it should be understood that spatial descriptions (e.g., “above,” “below,” “up,” “left,” “right,” “down,” “top,” “bottom,” “vertical,” “horizontal,” etc.) used herein are for purposes of illustration only, and that practical implementations of the structures described herein can be spatially arranged in any orientation or manner.
- Numerous exemplary embodiments are described as follows. It is noted that any section/subsection headings provided herein are not intended to be limiting. Embodiments are described throughout this document, and any type of embodiment may be included under any section/subsection. Furthermore, disclosed embodiments may be combined with each other in any manner.
- Persons may become hearing impaired for a variety of reasons, including aging and being exposed to excessive noise, which can both damage hair cells in the inner ear. A hearing aid is an electro-acoustic device that typically fits in or behind the ear of a wearer, and amplifies and modulates sound for the wearer. Hearing aids are frequently worn by persons who are hearing impaired to improve their ability to hear sounds. A hearing aid may be worn in one or both ears of a user, depending on whether one or both of the user's ears need hearing assistance.
- Opportunities exist for integrating further functionality into hearing assist devices that are worn in/on a human ear. Hearing assist devices, such as hearing aids, headsets, and headphones, are typically worn in contact with the user's ear, and in some cases extend into the user's ear canal. As such, a hearing assist device is typically positioned in close proximity to various organs and physical features of a wearer, such as the inner ear structure (e.g., the ear canal, ear drum, ossicles, Eustachian tube, cochlea, auditory nerve, etc.), skin, brain, veins and arteries, and further physical features of the wearer. Because of this advantageous positioning, a hearing assist device may be configured to detect various characteristics of a user's health. Furthermore, the detected characteristics may be used to treat health-related issues of the wearer, and to perform further health-related functions. As such, hearing assist devices may even be worn by users who have no hearing problems at all, in order to detect other health problems.
- For instance, in embodiments, health monitoring technology may be incorporated into a hearing assist device to monitor the health of a wearer. Examples of health monitoring technology that may be incorporated in a hearing assist device include health sensors that determine (e.g., sense/detect/measure/collect, etc.) various physical characteristics of the user, such as blood pressure, heart rate, temperature, humidity, blood oxygen level, skin galvanometric levels, brain wave information, arrhythmia onset detection, skin chemistry changes, falling down impacts, long periods of activity, etc.
- Sensor information resulting from the monitoring may be analyzed within the hearing assist device, or may be transmitted from the hearing assist device and analyzed at a remote location. For instance, the sensor information may be analyzed at a local computer, in a smart phone or other mobile device, or at a remote location, such as at a cloud-based server. In response to the analysis of the sensor information, instructions and/or other information may be communicated back to the wearer. Such information may be provided to the wearer by a display screen (e.g., a desktop computer display, a smart phone display, a tablet computer display, a medical equipment display, etc.), by the hearing assist device itself (e.g., by voice, beeps, etc.), or may be provided to the wearer in another manner. Medical personnel and/or emergency response personnel (e.g., reachable at the 911 phone number) may be alerted when particular problems with the wearer are detected by the hearing assist device. The medical personnel may evaluate information received from the hearing assist device, and provide information back to the hearing assist device/wearer. The hearing assist device may provide the wearer with reminders, alarms, instructions, etc.
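The local screening-and-alerting flow described above can be sketched in a few lines. This is a hypothetical illustration only: the sensor names, normal ranges, and alert action below are assumptions for illustration, not values taken from this disclosure.

```python
# Hypothetical sketch of local sensor screening: readings are checked against
# assumed normal ranges, and out-of-range values produce alert records that
# could be forwarded to a paired device, cloud service, or medical personnel.

NORMAL_RANGES = {
    "heart_rate_bpm": (40, 150),
    "temperature_c": (35.0, 38.5),
    "blood_oxygen_pct": (92, 100),
}

def screen_reading(name, value):
    """Return None if the reading is in range, else an alert record."""
    low, high = NORMAL_RANGES[name]
    if low <= value <= high:
        return None
    return {"sensor": name, "value": value, "action": "notify_medical_contact"}

# An elevated heart rate produces one alert; a normal temperature produces none.
alerts = [a for a in (screen_reading("heart_rate_bpm", 182),
                      screen_reading("temperature_c", 36.6))
          if a is not None]
```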
- The hearing assist device may be configured with speech/voice recognition capability. For instance, the wearer may provide commands, such as by voice, to the hearing assist device. The hearing assist device may be configured to perform various audio processing functions to suppress background noise and/or other sounds, as well as to amplify other sounds, and may be configured to modify audio according to a particular frequency response of the hearing of the wearer. The hearing assist device may be configured to detect vibrations (e.g., jaw movement of the wearer during talking), and may use the detected vibrations to aid in improving speech/voice recognition.
- Hearing assist devices may be configured in various ways, according to embodiments. For instance,
FIG. 1 shows a communication system 100 that includes a multi-sensor hearing assist device 102 that communicates with a near field communication (NFC)-enabled communications device 104, according to an exemplary embodiment. Hearing assist device 102 may be worn in association with the ear of a user, and may be configured to communicate with other devices, such as communications device 104. As shown in FIG. 1, hearing assist device 102 includes first and second sensors 106a and 106b, processing logic 108, an NFC transceiver 110, storage 112, and a rechargeable battery 114. These features of hearing assist device 102 are described as follows. -
Sensors 106a and 106b may each be any type of health-related sensor. Although two sensors 106a and 106b are shown included in hearing assist device 102 in FIG. 1 for ease of illustration, any number of sensors may be included in hearing assist device 102, including three sensors, four sensors, five sensors, etc. (e.g., tens of sensors, hundreds of sensors, etc.). Examples of sensors for sensors 106a and 106b are described elsewhere herein. -
Processing logic 108 may be implemented in hardware (e.g., one or more processors, electrical circuits, etc.), or any combination of hardware with software and/or firmware. Processing logic 108 may receive sensor information from sensors 106a and 106b, and may process and/or analyze the received sensor information. Processing logic 108 may execute one or more programs that define various operational characteristics, such as: (i) a sequence or order of retrieving sensor information from sensors of hearing assist device 102, (ii) sensor configurations and reconfigurations (via a preliminary setup or via adaptations over the course of time), (iii) routines by which particular sensor data is at least pre-processed, and (iv) one or more functions/actions to be performed based on particular sensor data values, etc. - For instance,
processing logic 108 may store and/or access sensor data in storage 112, processed or unprocessed. Furthermore, processing logic 108 may access one or more programs stored in storage 112 for execution. Storage 112 may include one or more types of storage, including memory (e.g., random access memory (RAM), read only memory (ROM), etc.) that is volatile or non-volatile. -
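The program-defined behavior described above (a retrieval ordering, pre-processing routines, and storage of results) might be sketched as follows. This is an illustrative assumption only: the sensor names, polling order, and moving-average smoothing step are hypothetical, not taken from the disclosure.

```python
# Hypothetical sketch of processing logic that polls sensors in a configured
# order (item (i)), pre-processes each reading (item (iii)), and stores the
# result. Real hardware access is replaced by a read_sensor callable.

POLL_SEQUENCE = ["heart_rate", "temperature", "blood_oxygen"]  # assumed order

def preprocess(name, raw, history):
    """Smooth a raw reading with a short moving average over recent samples."""
    window = history.get(name, []) + [raw]
    history[name] = window[-4:]  # keep only the last few readings
    return sum(history[name]) / len(history[name])

def poll_once(read_sensor, history, storage):
    """Poll each sensor in sequence and store the smoothed values."""
    for name in POLL_SEQUENCE:
        storage[name] = preprocess(name, read_sensor(name), history)
    return storage

# Simulated sensor readings in place of real hardware:
readings = {"heart_rate": 72.0, "temperature": 36.6, "blood_oxygen": 97.0}
stored = poll_once(lambda n: readings[n], history={}, storage={})
```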
NFC transceiver 110 is configured to wirelessly communicate with a second device (e.g., a local or remote supporting device), such as NFC-enabled communications device 104, according to NFC techniques. NFC uses magnetic induction between two loop antennas (e.g., coils, microstrip antennas, etc.) located within each other's near field, effectively forming an air-core transformer. As such, NFC communications occur over relatively short ranges (e.g., within a few centimeters), and are conducted at radio frequencies. For instance, in one example, NFC communications may be performed by NFC transceiver 110 at a 13.56 MHz frequency, with data transfers of up to 424 kilobits per second. In other embodiments, NFC transceiver 110 may be configured to perform NFC communications at other frequencies and data transfer rates. Examples of standards according to which NFC transceiver 110 may be configured to conduct NFC communications include ISO/IEC 18092 and those defined by the NFC Forum, which was founded in 2004 by Nokia, Philips and Sony. - NFC-enabled
communications device 104 may be configured with an NFC transceiver to perform NFC communications. NFC-enabled communications device 104 may be any type of device that may be enabled with NFC capability, such as a docking station, a desktop computer (e.g., a personal computer, etc.), a mobile computing device (e.g., a personal digital assistant (PDA), a laptop computer, a notebook computer, a tablet computer (e.g., an Apple iPad™), a netbook, etc.), a mobile phone (e.g., a cell phone, a smart phone, etc.), a medical appliance, etc. Furthermore, NFC-enabled communications device 104 may be network-connected to enable hearing assist device 102 to communicate with entities over the network (e.g., cloud computers or servers, web services, etc.). -
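As a worked example of the data rate quoted above, the ideal time to move a buffered sensor log across a 424 kilobit-per-second NFC link can be estimated. The payload size is an illustrative assumption, and real transfers would add protocol overhead.

```python
# Worked example using the 424 kbit/s NFC rate quoted above: estimated ideal
# transfer time for a buffered sensor log. Payload size is an assumption.

NFC_RATE_BPS = 424_000  # 424 kilobits per second

def transfer_seconds(payload_bytes):
    """Ideal transfer time in seconds, ignoring protocol overhead."""
    return payload_bytes * 8 / NFC_RATE_BPS

# A 53 kB log takes one second at the ideal rate; real-world times are longer.
t = transfer_seconds(53_000)
```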
NFC transceiver 110 enables sensor data (processed or unprocessed) to be transmitted by processing logic 108 from hearing assist device 102 to NFC-enabled communications device 104. In this manner, the sensor data may be reported, processed, and/or analyzed externally to hearing assist device 102. Furthermore, NFC transceiver 110 enables processing logic 108 at hearing assist device 102 to receive data and/or instructions/commands from NFC-enabled communications device 104 in response to the transmitted sensor data. Furthermore, NFC transceiver 110 enables processing logic 108 at hearing assist device 102 to receive programs (e.g., program code), including new programs, program updates, applications, “apps”, and/or other programs from NFC-enabled communications device 104 that can be executed by processing logic 108 to change/update the functionality of hearing assist device 102. -
Rechargeable battery 114 includes one or more electrochemical cells that store charge that may be used to power components of hearing assist device 102, including one or more of sensors 106a and 106b, processing logic 108, NFC transceiver 110, and storage 112. Rechargeable battery 114 may be any suitable rechargeable battery type, including lead-acid, nickel cadmium (NiCd), nickel metal hydride (NiMH), lithium ion (Li-ion), and lithium ion polymer (Li-ion polymer). Charging of the battery may be through a typical tethered recharger or via NFC power delivery. - Although NFC communications are shown, alternative communication approaches can be employed. Such alternatives may include wireless power transfer schemes as well.
- Hearing assist
device 102 may be configured in any manner to be associated with the ear of a user. For instance, FIGS. 2-4 show various configurations for associating a hearing assist device with an ear of a user, according to exemplary embodiments. In FIG. 2, hearing assist device 102 may be a hearing aid type that fits and is inserted partially or fully into an ear 202 of a user. As shown in FIG. 2, hearing assist device 102 includes sensors 106a-106n that contact the user. Example forms of hearing assist device 102 of FIG. 2 include ear buds, "receiver in the canal" hearing aids, "in the ear" (ITE) hearing aids, "in the canal" (ITC) hearing aids, "completely in canal" (CIC) hearing aids, etc. Although not illustrated, cochlear implant configurations may also be used. - In
FIG. 3, hearing assist device 102 may be a hearing aid type that mounts on top of, or behind, ear 202 of the user. As shown in FIG. 3, hearing assist device 102 includes sensors 106a-106n that contact the user. Example forms of hearing assist device 102 of FIG. 3 include "behind the ear" (BTE) hearing aids, "open fit" or "over the ear" (OTE) hearing aids, eyeglasses hearing aids (e.g., that contain hearing aid functionality in or on the glasses arms), etc. - In
FIG. 4, hearing assist device 102 may be a headset or headphones that mount on the head of the user and include speakers that are held close to the user's ears. As shown in FIG. 4, hearing assist device 102 includes sensors 106a-106n that contact the user. In the embodiment of FIG. 4, sensors 106a-106n may be spaced further apart, including being dispersed in the ear pad(s) and/or along the headband that connects together the ear pads (when a headband is present). - It is noted that hearing assist
device 102 may be configured in further forms, including combinations of the forms shown in FIGS. 2-4, and is not intended to be limited to the embodiments illustrated in FIGS. 2-4. For instance, hearing assist device 102 may be a cochlear implant-type hearing aid, or other type of hearing assist device. The following section describes some example forms of hearing assist device 102 with associated sensor configurations. - As described above, hearing assist
device 102 may be configured in various forms, and may include any number and type of sensors. For instance, FIG. 5 shows a hearing assist device 500 that is an example of hearing assist device 102 according to an exemplary embodiment. Hearing assist device 500 is configured to mount over an ear of a user, and has a portion that is at least partially inserted into the ear. A user may wear a single hearing assist device 500 on one ear, or may simultaneously wear first and second hearing assist devices 500 on the user's right and left ears, respectively. - As shown in
FIG. 5, hearing assist device 500 includes a case or housing 502 that includes a first portion 504, a second portion 506, and a third portion 508. First portion 504 is shaped to be positioned behind/over the ear of a user. For instance, as shown in FIG. 5, first portion 504 has a crescent shape, and may optionally be molded in the shape of a user's outer ear (e.g., by taking an impression of the outer ear, etc.). Second portion 506 extends perpendicularly from a side of an end of first portion 504. Second portion 506 is shaped to be inserted at least partially into the ear canal of the user. Third portion 508 extends from second portion 506, and may be referred to as an earmold shaped to conform to the user's ear shape, to better adhere hearing assist device 500 to the user's ear. - As shown in
FIG. 5, hearing assist device 500 further includes a speaker 512, a forward IR/UV (ultraviolet) communication transceiver 520, a BTLE (BLUETOOTH low energy) antenna 522, at least one microphone 524, a telecoil 526, a tethered sensor port 528, a skin communication conductor 534, a volume controller 540, and a communication and power delivery coil 542. Furthermore, hearing assist device 500 includes a plurality of medical sensors, including at least one pH sensor 510, an IR (infrared) or sonic distance sensor 514, an inner ear temperature sensor 516, a position/motion sensor 518, a WPT (wireless power transfer)/NFC coil 530, a switch 532, a glucose spectroscopy sensor 536, a heart rate sensor 538, and a subcutaneous sensor 544. In embodiments, hearing assist device 500 may include one or more of these further features and/or alternative features. The features of hearing assist device 500 are described as follows. - As shown in
FIG. 5, speaker 512, IR or sonic distance sensor 514, and inner ear temperature sensor 516 are located on a circular surface of second portion 506 of hearing assist device 500 that faces into the ear of the user. Position/motion sensor 518 and pH sensor 510 are located on a perimeter surface of second portion 506, around the circular surface, that contacts the ear canal of the user. In alternative embodiments, one or more of these features may be located in/on different locations of hearing assist device 500. -
pH sensor 510 is a sensor that may be present to measure a pH of skin of the user's inner ear. The measured pH value may be used to determine a medical problem of the user, such as an onset of stroke. pH sensor 510 may include one or more metallic plates. Upon receiving power (e.g., from rechargeable battery 114 of FIG. 1), pH sensor 510 may generate a sensor output signal (e.g., an electrical signal) that indicates a measured pH value. - Speaker 512 (also referred to as a “loudspeaker”) is a speaker of hearing
assist device 500 that broadcasts environmental sound received by microphone(s) 524, subsequently amplified and/or filtered by processing logic of hearing assist device 500, into the ear of the user to assist the user in hearing the environmental sound. Furthermore, speaker 512 may broadcast additional sounds into the ear of the user for the user to hear, including alerts (e.g., tones, beeping sounds), voice, and/or further sounds that may be generated by or received by processing logic of hearing assist device 500, and/or may be stored in hearing assist device 500. - IR or
sonic distance sensor 514 is a sensor that may be present to sense a displacement distance. Upon receiving power, IR or sonic distance sensor 514 may generate an IR light pulse, a sonic (e.g., ultrasonic) pulse, or other light or sound pulse, that may be reflected in the ear of the user, and the reflection may be received by IR or sonic distance sensor 514. A time of reflection may be compared for a series of pulses to determine a displacement distance within the ear of the user. IR or sonic distance sensor 514 may generate a sensor output signal (e.g., an electrical signal) that indicates a measured displacement distance. - A distance and eardrum deflection that is determined using IR or sonic distance sensor 514 (e.g., by using high rate sampling or continuous sampling) may be used to calculate an estimate of the “actual” or “true” decibel level of an audio signal being input to the ear of the user. By incorporating such functionality, hearing assist
device 500 can perform the following when a user inserts and turns on hearing assist device 500: (i) automatically adjust the volume to fall within a target range; and (ii) prevent excess volume associated with unexpected loud sound events. It is noted that the amount of volume adjustment that may be applied can vary by frequency. It is also noted that the excess volume associated with unexpected loud sound events may be further prevented by using a hearing assist device that has a relatively tight fit, thereby allowing the hearing assist device to act as an ear plug. - Hearing efficiency and performance data over the spectrum of normal audible frequencies can be gathered by delivering each frequency (or frequency range) at an output volume level, measuring eardrum deflection characteristics, and delivering audible test questions to the user via hearing
assist device 500. This can be accomplished solely by hearing assist device 500 or with assistance from a smartphone or other external device or service. For example, a user may respond to an audio (or textual) prompt “Can you hear this?” with a “yes” or “no” response. The response is received by microphone(s) 524 (or via touch input, for example) and processed internally or on an assisting external device to identify the response. Depending on the user's response, the amplitude of the audio output can be adjusted to determine a given user's hearing threshold for each frequency (or frequency range). From this hearing efficiency and performance data, input frequency equalization can be performed by hearing assist device 500 so as to deliver to the user audio signals that will be perceived in much the same way as someone with no hearing impairment. In addition, such data can be delivered to the assisting external device (e.g., to a smartphone) for use by such device in producing audio output for the user. For example, the assisting device can deliver an adjusted audio output tailored for the user if (i) the user is not wearing hearing assist device 500, (ii) the battery power of hearing assist device 500 is depleted, (iii) hearing assist device 500 is powered down, or (iv) hearing assist device 500 is operating in a lower power mode. In such situations, the supporting device can deliver the audio signal: (a) in an audible form via a speaker, generated with the intent of directly reaching the eardrum; (b) in an audible form intended for receipt and amplification control by hearing assist device 500 without further need for user-specific audio equalization; or (c) in a non-audible form (e.g., electromagnetic transmission) for receipt and conversion to an audible form by hearing assist device 500, again without further equalization.
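The per-frequency threshold test and resulting equalization described above might be sketched as follows. The band list, step size, and reference threshold are illustrative assumptions, and the wearer's responses are simulated in place of real prompts delivered through the device's speaker.

```python
# Hypothetical sketch of the hearing test described above: for each frequency
# band, the test level is raised until the (simulated) wearer reports hearing
# it, and the resulting thresholds become per-band equalization boosts.

BANDS_HZ = [250, 500, 1000, 2000, 4000, 8000]  # assumed test bands
NORMAL_THRESHOLD_DB = 20.0  # assumed reference for unimpaired hearing

def find_threshold(can_hear, start_db=0.0, step_db=5.0, max_db=90.0):
    """Raise the test level until can_hear(level) reports True; return that level."""
    level = start_db
    while level <= max_db:
        if can_hear(level):
            return level
        level += step_db
    return max_db

def equalization_gains(thresholds_db):
    """Per-band boost: how far each measured threshold sits above the reference."""
    return {hz: max(0.0, t - NORMAL_THRESHOLD_DB)
            for hz, t in zip(BANDS_HZ, thresholds_db)}

# Simulated wearer with high-frequency loss (true thresholds per band, in dB):
true_thresholds = [20, 20, 25, 35, 60, 60]
measured = [find_threshold(lambda lvl, t=t: lvl >= t) for t in true_thresholds]
gains = equalization_gains(measured)
```

The gains dictionary is the kind of data that could then drive input frequency equalization on the device or be shared with an assisting smartphone.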
- After testing and setup, a wearer may further tweak their recommended equalization via slide bars and such in a manner similar to adjusting equalization for other conventional audio equipment. Such tweaking can be carried out via the supporting device user interface. In addition, a plurality of equalization settings can be supported with each being associated with a particular mode of operation of hearing
assist device 500. That is, conversation in a quiet room with one other person might receive one equalization profile, while a concert hall might receive another. Modes can be selected in many automatic or commanded ways via either or both of hearing assist device 500 and the external supporting device. Automatic selection can be performed via analysis and classification of captured audio. Certain classifications may trigger selection of a particular mode. Commands may be delivered via any user input interface, such as voice input (voice-recognized commands), tactile input commands, etc.
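The mapping from a classified audio environment to an equalization profile might be sketched as follows. The classifier itself is not shown, and the class names and profile values are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch of automatic mode selection: a classification of the
# captured audio (classifier not shown) selects an equalization profile.

MODE_PROFILES = {
    "quiet_conversation": {"low_db": 0, "mid_db": 6, "high_db": 9},
    "concert_hall":       {"low_db": -3, "mid_db": 0, "high_db": 3},
    "default":            {"low_db": 0, "mid_db": 0, "high_db": 0},
}

def select_profile(audio_class):
    """Pick the equalization profile for a classified audio environment."""
    return MODE_PROFILES.get(audio_class, MODE_PROFILES["default"])

profile = select_profile("concert_hall")
```

A commanded override (voice or tactile input) could simply bypass the classifier and call select_profile with a user-chosen mode name.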
- Other similar and all of such functionality can be carried out by one or both of hearing
assist device 500 and an external supporting device. When assisting the hearing aid device, the external supporting device may receive the audio for processing: (i) directly via built-in microphones; (ii) from storage; or (iii) via yet another external device. Alternatively, the source audio may be captured by hearing assist device 500 itself and delivered via a wired or wireless pathway to the external supporting device for processing, before delivery of either the processed audio signals or substitute audio back to hearing assist device 500 for delivery to the wearer. - Similarly, sensor data may be captured in one or both of hearing
assist device 500 and an external supporting device. Sensor data captured by hearing assist device 500 may likewise be delivered via such or other wired or wireless pathways to the external supporting device for (further) processing. The external supporting device may then respond to the sensor data received and processed by delivering audio content and/or hearing aid commands back to hearing assist device 500. Such commands may be to reconfigure some aspect of hearing assist device 500 or to manage communication or power delivery. Such audio content may be instructional, comprise queries, or consist of commands to be delivered to the wearer via the ear drums. Sensor data may be stored and displayed in some form locally on the external supporting device, along with similar audio, graphical or textual content, commands or queries. In addition, such sensor data can be further delivered to yet other external supporting devices for further processing, analysis and storage. Sensors within one or both of hearing assist device 500 and an external supporting device may be medical sensors or environmental sensors (e.g., latitude/longitude, velocity, temperature, wearer's physical orientation, acceleration, elevation, tilt, humidity, etc.). - Although not shown, hearing assist
device 500 may also be configured with an imager that may be located near transceiver 520. The imager can then be used to capture images or video that may be relayed to one or more external supporting devices for real time display, storage or processing. For example, upon detecting a medical situation and no response to audible content queries delivered via hearing assist device 500, the imager can be commanded (from an internal or external command origin) to capture an image or a video sequence. Such imager output can be delivered to medical staff via a user's supporting smartphone so that a determination can be made as to the user's condition or the position/location of hearing assist device 500. - Inner
ear temperature sensor 516 is a sensor that may be present to measure a temperature of the user. For instance, in an embodiment, inner ear temperature sensor 516 may include an IR light emitter and a lens used to measure inner ear temperature. Upon receiving power, IR light emitted by the IR light emitter may be reflected from the user's skin, such as in the ear canal or at the ear drum, and received by a single temperature sensor element, a one-dimensional array of temperature sensor elements, a two-dimensional array of temperature sensor elements, or other configuration of temperature sensor elements. Inner ear temperature sensor 516 may generate a sensor output signal (e.g., an electrical signal) that indicates a measured inner ear temperature. - Such a configuration may also be used to determine a distance to the user's ear drum. The IR light emitter and sensor may be used to determine a distance to the user's ear drum from hearing
assist device 500, which may be used by processing logic to automatically control a volume of sound emitted from hearing assist device 500, as well as for other purposes. Furthermore, the IR light emitter/sensor may also be used as an imager that captures an image of the inside of the user's ear. This could be used to identify characteristics of vein structures inside the user's ear, for example. The IR light emitter/sensor could also be used to detect the user's heartbeat, as well as to perform further functions. Note that hearing assist device 500 may include a light sensor that senses outdoor light levels for various purposes. - Position/
motion sensor 518 includes one or more sensors that may be present to measure time of day, location, acceleration, orientation, vibrations, and/or other movement-related characteristics of the user. For instance, position/motion sensor 518 may include one or more of a GPS (global positioning system) receiver (to measure user position), an accelerometer (to measure acceleration of the user), a gyroscope (to measure orientation of the head of the user), a magnetometer (to determine a direction the user is facing), a vibration sensor (e.g., a micro-electromechanical system (MEMS) vibration sensor), etc. Position/motion sensor 518 may be used for various benefits, including determining whether a user has fallen (e.g., based on measured position, acceleration, orientation, etc.), for local VoD, and many more benefits. Position/motion sensor 518 may generate a sensor output signal (e.g., an electrical signal) that indicates one or more of the measured time of day, location, acceleration, orientation, vibration, etc. - In one example, MEMS sensors may be configured to record the position/movement of the head of a wearer for health purposes, for location applications, and/or for other reasons. A wireless transceiver and a MEMS sensor of hearing
assist device 500 can determine the location of the head of the user, which may be a more meaningful positioning reference point than a position of a mobile device (e.g., cellphone) held against the user's head by the user. In this manner, hearing assist device 500 may be configured to tell the user that the user is not looking at the road properly when driving. In another example, in this manner, hearing assist device 500 may determine that the wearer fell down and may send a communication signal to the user's mobile device to dial 911 or another emergency number. If the user is wearing a pair of hearing assist devices 500, wireless communication signals may be used to help triangulate and determine the position of the head. The user may shake their head up/down and/or may otherwise move their head to answer verbal queries provided by hearing assist device 500 and/or by the user's phone without the user having to speak. The user may be enabled to speak to hearing assist device 500 to respond to queries (e.g., “did you fall?”, “are you alright?”, “should I dial for help?”, “are you falling asleep?”, etc.). Position data can be processed in hearing assist device 500, in the mobile device, in the “cloud”, etc. In an embodiment, to save power, the position data may be used to augment mobile/cloud data for better accuracy and for special circumstances. Hearing assist device 500 may determine a proximity to the mobile device of the user even if the camera on the mobile device is not in view. Based upon position and sensors, hearing assist device 500 may determine the direction the person is looking at to aid artificial reality. In an embodiment, hearing assist device 500 may be configured to calibrate position data when the head is in view of a remote camera. - The sensor information indicated by position/
motion sensor 518 and/or other sensors may be used for various purposes. For instance, position/motion information may be used to determine that the user has fallen down/collapsed. In response, voice and/or video assist (e.g., by a handheld device in communication with hearing assist device 500) may be used to gather feedback from the user (e.g., to find out if they are ok, and/or to further supplement the sensor data collection which triggered the feedback request). Such sensor data and feedback information, if warranted, can be automatically forwarded to medical staff, ambulance services, and/or family members, for example, as described elsewhere herein. The analysis of the data that triggered the forwarding process may be performed in whole or in part on one (or both) hearing assist devices 500, and/or on the assisting local device (e.g., a smart phone, tablet computer, set top box, TV, etc., in communication with hearing assist device 500), and/or on remote computing systems (e.g., at medical staff offices or as might be available through a cloud or portal service). - As shown in
FIG. 5, forward IR/UV (ultraviolet) communication transceiver 520, BTLE antenna 522, microphone(s) 524, telecoil 526, tethered sensor port 528, WPT/NFC coil 530, switch 532, skin communication conductor 534, glucose spectroscopy sensor 536, heart rate sensor 538, volume controller 540, and communication and power delivery coil 542 are located at different locations in/on first portion 504 of hearing assist device 500. In alternative embodiments, one or more of these features may be located in/on different locations of hearing assist device 500. - Forward IR/
UV communication transceiver 520 is a communication mechanism that may be present to enable communications with another device, such as a smart phone, computer, etc. Forward IR/UV communication transceiver 520 may receive information/data from processing logic of hearing assist device 500 to be transmitted to the other device in the form of modulated light (e.g., IR light, UV light, etc.), and may receive information/data in the form of modulated light from the other device to be provided to the processing logic of hearing assist device 500. Forward IR/UV communication transceiver 520 may enable low power communications for hearing assist device 500, to reduce a load on a battery of hearing assist device 500. In an embodiment, an emitter/receiver of forward IR/UV communication transceiver 520 may be positioned on housing 502 to face forward in a direction a wearer of hearing assist device 500 faces. In this manner, forward IR/UV communication transceiver 520 may communicate with a device held by the wearer, such as a smart phone, a tablet computer, etc., to provide text to be displayed to the wearer, etc. -
BTLE antenna 522 is a communication mechanism coupled to a Bluetooth™ transceiver in hearing assist device 500 that may be present to enable communications with another device, such as a smart phone, computer, etc. BTLE antenna 522 may receive information/data from processing logic of hearing assist device 500 to be transmitted to the other device according to the Bluetooth™ specification, and may receive information/data transmitted according to the Bluetooth™ specification from the other device to be provided to the processing logic of hearing assist device 500. - Microphone(s) 524 is a sensor that may be present to receive environmental sounds, including voice of the user, voice of other persons, and other sounds in the environment (e.g., traffic noise, music, etc.). Microphone(s) 524 may include any number of microphones, and may be configured in any manner, including being omni-directional (non-directional), directional, etc. Microphone(s) 524 generates an audio signal based on the received environmental sound that may be processed and/or filtered by processing logic of hearing
assist device 500, may be stored in digital form in hearing assist device 500, may be transmitted from hearing assist device 500, and may be used in other ways. -
Telecoil 526 is a communication mechanism that may be present to enable communications with another device. Telecoil 526 is an audio induction loop that enables audio sources to be directly coupled to hearing assist device 500 in a manner known to persons skilled in the relevant art(s). Telecoil 526 may be used with a telephone, a radio system, and induction loop systems that transmit sound to hearing aids. -
Tethered sensor port 528 is a port that a remote sensor (separate from hearing assist device 500) may be coupled with to interface with hearing assist device 500. For instance, port 528 may be an industry standard or proprietary connector type. A remote sensor may have a tether (one or more wires) with a connector at an end that may be plugged into port 528. Any number of tethered sensor ports 528 may be present. Examples of sensor types that may interface with tethered sensor port 528 include brainwave sensors (e.g., electroencephalography (EEG) sensors that record electrical activity along the scalp according to EEG techniques) attached to the user's scalp, heart rate/arrhythmia sensors attached to a chest of the user, etc. Such brainwave sensors may record/measure electrical signals of the user's brain. - WPT/
NFC coil 530 is a communication mechanism coupled to a NFC transceiver in hearing assistdevice 500 that may be present to enable communications with another device, such as a smart phone, computer, etc., as described above with respect to NFC transceiver 110 (FIG. 1 ). -
Switch 532 is a switching mechanism that may be present on housing 502 to perform various functions, such as switching power on or off, switching between different power and/or operational modes, etc. A user may interact with switch 532 to switch power on or off, to switch between modes, etc. Switch 532 may be any type of switch, including a toggle switch, a push button switch, a rocker switch, a three- (or greater) position switch, a dial switch, etc. - Skin communication conductor 534 is a communication mechanism coupled to a transceiver in hearing assist device 500 that may be present to enable communications with another device, such as a smart phone, computer, etc., through skin of the user. For instance, skin communication conductor 534 may enable communications to flow between hearing assist device 500 and a smart phone held in the hand of the user, a second hearing assist device worn on an opposite ear of the user, a pacemaker or other device implanted in the user, or another communications device in communication with skin of the user. A transceiver of hearing assist device 500 may receive information/data from processing logic to be transmitted from skin communication conductor 534 through the user's skin to the other device, and the transceiver may receive information/data at skin communication conductor 534 that was transmitted from the other device through the user's skin to be provided to the processing logic of hearing assist device 500. -
Glucose spectroscopy sensor 536 is a sensor that may be present to measure a glucose level of the user using spectroscopy techniques in a manner known to persons skilled in the relevant art(s). Such a measurement may be valuable in determining whether a user has diabetes. Such a measurement can also be valuable in helping a diabetic user determine whether insulin is needed, etc. (e.g., hypoglycemia or hyperglycemia). Glucose spectroscopy sensor 536 may be configured to monitor glucose in combination with subcutaneous sensor 544. As shown in FIG. 5, subcutaneous sensor 544 is shown separate from, and proximate to, hearing assist device 500. In such an embodiment, subcutaneous sensor 544 may be embedded in the user's skin, in or around the user's ear. In an alternative embodiment, subcutaneous sensor 544 may be located in/on hearing assist device 500. Subcutaneous sensor 544 is a sensor that may be present to measure any attribute of a user's health, characteristics, or status. For example, subcutaneous sensor 544 may be a glucose sensor implanted under the skin behind the ear so as to provide a reasonably close mating location with communication and power delivery coil 542. When powered, glucose spectroscopy sensor 536 may measure the user's glucose level with respect to subcutaneous sensor 544, and may generate a sensor output signal (e.g., an electrical signal) that indicates a glucose level of the user. -
Heart rate sensor 538 is a sensor that may be present to measure a heart rate of the user. For instance, in an embodiment, upon receiving power, heart rate sensor 538 may measure pressure changes with respect to a blood vessel in the ear, or may measure heart rate in another manner, such as via changes in reflectivity or otherwise, as would be known to persons skilled in the relevant art(s). Missed beats, elevated heart rate, and further heart conditions may be detected in this manner. Heart rate sensor 538 may generate a sensor output signal (e.g., an electrical signal) that indicates a measured heart rate. In addition, subcutaneous sensor 544 might comprise at least a portion of an internal heart monitoring device which communicates heart status information and data via communication and power delivery coil 542. Subcutaneous sensor 544 could also be associated with or be part of a pacemaker or defibrillating implant, insulin pump, etc. -
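The beat detection described above can be illustrated with a small threshold-crossing sketch. This is not the patent's implementation — the function name, threshold scheme, and rates are illustrative assumptions about how a sampled pulse waveform (pressure- or reflectivity-based) might be reduced to a heart rate:

```python
import math

def estimate_bpm(samples, sample_rate_hz, threshold):
    """Estimate heart rate by counting rising threshold crossings in a
    sampled pulse waveform (e.g., from a pressure or optical sensor)."""
    beats = 0
    above = False
    for s in samples:
        if not above and s > threshold:  # rising edge: count one beat
            beats += 1
            above = True
        elif above and s < threshold:
            above = False
    duration_s = len(samples) / sample_rate_hz
    return 60.0 * beats / duration_s

# Synthetic 10-second waveform sampled at 50 Hz with a 1.2 Hz (72 bpm) pulse:
wave = [math.sin(2 * math.pi * 1.2 * n / 50) for n in range(500)]
print(round(estimate_bpm(wave, 50, 0.5)))  # → 72
```

A real device would additionally reject motion artifacts and missed samples; the sketch shows only the counting step that turns a waveform into the beats-per-minute figure carried in the sensor output signal.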
Volume controller 540 is a user interface mechanism that may be present on housing 502 to enable a user to modify a volume at which sound is broadcast from speaker 512. A user may interact with volume controller 540 to increase or decrease the volume. Volume controller 540 may be any suitable controller type (e.g., a potentiometer), including a rotary volume dial, a thumb wheel, a capacitive touch sensing device, etc. - Instead of supporting both power delivery and communications, communication and power delivery coil 542 may be dedicated to one or the other. For example, such a coil may only support power delivery (if needed to charge or otherwise deliver power to subcutaneous sensor 544), and can be replaced with any other type of communication system that supports communication with subcutaneous sensor 544. - It is noted that the coils/antennas of hearing
assist device 500 may be separately included in hearing assist device 500, or, in embodiments, two or more of the coils/antennas may be combined as a single coil/antenna. - The processing logic of hearing assist device 500 may be operable to set up/configure and adaptively reconfigure each of the sensors of hearing assist device 500 based on an analysis of the data obtained by that sensor as well as on an analysis of data obtained by other sensors. For example, a first sensor of hearing assist device 500 may be configured to operate at one sampling rate (or sensing rate), with its data analyzed periodically or continuously. Meanwhile, a second sensor of hearing assist device 500 can be in a sleep or power-down mode to conserve battery power. When a threshold is exceeded or another triggering event occurs, the first sensor can be reconfigured by the processing logic of hearing assist device 500 to sample at a higher rate or continuously, and the second sensor can be powered up and configured. Additionally, multiple types of sensor data can be used to construct or derive single conclusions. For example, heart rate can be gathered multiple ways (via multiple sensors) and combined to provide a more robust and trustworthy conclusion. Likewise, a combination of data obtained from different sensors (e.g., pH plus temperature plus horizontal posture plus impact detected plus weak heart rate) may result in an ambulance being called or may indicate a possible heart attack. Or, if glucose is too high, hyperglycemia may be indicated, while if glucose is too low, hypoglycemia may be indicated. Or, if glucose and heart data are acceptable, then a stroke may be indicated. This processing can be done in whole or in part within hearing assist device 500, with audio content being played to the wearer thereof to gather further voiced information from the wearer to assist in conclusions or to warn the wearer. -
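The threshold-triggered reconfiguration described above can be sketched as follows. The rates, threshold, and two-sensor arrangement are illustrative assumptions, not values from the patent:

```python
class SensorManager:
    """Sketch of adaptive sensor management: a primary sensor samples
    slowly while a secondary sensor sleeps to conserve battery; a
    reading over the threshold raises the primary's sampling rate and
    powers up the secondary sensor."""

    def __init__(self, threshold, slow_rate_hz=1, fast_rate_hz=50):
        self.threshold = threshold
        self.primary_rate_hz = slow_rate_hz   # slow duty-cycled sampling
        self.fast_rate_hz = fast_rate_hz
        self.secondary_awake = False          # secondary starts asleep

    def on_primary_reading(self, value):
        if value > self.threshold:            # triggering event
            self.primary_rate_hz = self.fast_rate_hz
            self.secondary_awake = True       # power up second sensor
        return self.primary_rate_hz, self.secondary_awake

mgr = SensorManager(threshold=100)
print(mgr.on_primary_reading(42))    # → (1, False)
print(mgr.on_primary_reading(180))   # → (50, True)
```

The same pattern extends naturally to the multi-sensor fusion case: several such managers could feed one decision routine that combines, e.g., heart rate, posture, and impact readings before escalating.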
FIG. 6 shows a hearing assist device 600 that is an example of hearing assist device 102, according to an exemplary embodiment. Hearing assist device 600 is configured to be at least partially inserted into the ear canal of a user (e.g., an ear bud). A user may wear a single hearing assist device 600 on one ear, or may simultaneously wear first and second hearing assist devices 600 on the user's right and left ears, respectively. - As shown in
FIG. 6, hearing assist device 600 includes a case or housing 602 that has a generally cylindrical shape, and includes a first portion 604, a second portion 606, and a third portion 608. First portion 604 is shaped to be inserted at least partially into the ear canal of the user. Second portion 606 extends coaxially from first portion 604. Third portion 608 is a handle that extends from second portion 606. A user grasps third portion 608 to extract hearing assist device 600 from the ear of the user. - As shown in
FIG. 6, hearing assist device 600 further includes pH sensor 510, speaker 512, IR (infrared) or sonic distance sensor 514, inner ear temperature sensor 516, and an antenna 610. pH sensor 510, speaker 512, IR (infrared) or sonic distance sensor 514, and inner ear temperature sensor 516 may function and be configured similarly as described above. In another embodiment, hearing assist device 600 may include an outer ear temperature sensor to determine outside ear temperature. Antenna 610 may include one or more coils or other types of antennas to function as any one or more of the coils/antennas described above with respect to FIG. 5 and/or elsewhere herein (e.g., an NFC antenna, a Bluetooth™ antenna, etc.). - It is noted that antennas, such as coils, mentioned herein may be implemented as any suitable type of antenna, including a coil, a microstrip antenna, or another antenna type. Although further sensors, communication mechanisms, switches, etc., of hearing assist device 500 of FIG. 5 are not shown included in hearing assist device 600, one or more of these further features of hearing assist device 500 may additionally and/or alternatively be included in hearing assist device 600. Furthermore, sensors that are present in a hearing assist device may all operate simultaneously, or one or more sensors may be run periodically, and may be off at other times (e.g., based on an algorithm in program code, etc.). By running fewer sensors at any one time, battery power may be conserved. Note that, in addition to one or more of sensor data compression, analysis, encryption, and processing, sensor management (duty cycling, continuous operations, threshold triggers, sampling rates, etc.) can be performed in whole or in part in any one or both hearing assist devices, the assisting local device (e.g., smart phone, tablet computer, set top box, TV, etc.), and/or remote computing systems (at medical staff offices or as might be available through a cloud or portal service). - According to embodiments, hearing assist devices may be configured in various ways to perform their functions. For instance, FIG. 7 shows a circuit block diagram of a hearing assist device 700 that is configured to communicate with external devices according to multiple communication schemes, according to an exemplary embodiment. Hearing assist devices 500 and 600 may be configured similarly to hearing assist device 700, according to embodiments. - As shown in
FIG. 7, hearing assist device 700 includes a plurality of sensors 702a-702c, processing logic 704, a microphone 706, an amplifier 708, a filter 710, an analog-to-digital (A/D) converter 712, a speaker 714, an NFC coil 716, an NFC transceiver 718, an antenna 720, a Bluetooth™ transceiver 722, a charge circuit 724, a battery 726, a plurality of sensor interfaces 728a-728c, and a digital-to-analog (D/A) converter 764. Processing logic 704 includes a digital signal processor (DSP) 730, a central processing unit (CPU) 732, and a memory 734. Sensors 702a-702c, processing logic 704, amplifier 708, filter 710, A/D converter 712, NFC transceiver 718, Bluetooth™ transceiver 722, charge circuit 724, sensor interfaces 728a-728c, D/A converter 764, DSP 730, and CPU 732 may each be implemented in the form of hardware (e.g., electrical circuits, digital logic, etc.) or a combination of hardware and software/firmware. The features of hearing assist device 700 shown in FIG. 7 are described as follows. - Hearing aid functionality of hearing assist device 700 is first described. In FIG. 7, microphone 706, amplifier 708, filter 710, A/D converter 712, processing logic 704, D/A converter 764, and speaker 714 provide at least some of the hearing aid functionality of hearing assist device 700. Microphone 706 is a sensor that receives environmental sounds, including voice of the user of hearing assist device 700, voice of other persons, and other sounds in the environment (e.g., traffic noise, music, etc.). Microphone 706 may be configured in any manner, including being omni-directional (non-directional), directional, etc., and may include one or more microphones. Microphone 706 may be a miniature microphone conventionally used in hearing aids, as would be known to persons skilled in the relevant art(s), or may be another suitable type of microphone. Microphone(s) 524 (FIG. 5) is an example of microphone 706. Microphone 706 generates a received audio signal 740 based on the received environmental sound. -
Amplifier 708 receives and amplifies received audio signal 740 to generate an amplified audio signal 742. Amplifier 708 may be any type of amplifier, including a low-noise amplifier for amplifying low level signals. Filter 710 receives and processes amplified audio signal 742 to generate a filtered audio signal 744. Filter 710 may be any type of filter, including a filter configured to filter out noise, other high frequencies, and/or other frequencies as desired. A/D converter 712 receives filtered audio signal 744, which may be an analog signal, and converts filtered audio signal 744 to digital form, to generate a digital audio signal 746. A/D converter 712 may be configured in any manner, including as a conventional A/D converter. -
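The frequency-domain shaping attributed to DSP 730 in the following paragraphs can be sketched with a small DFT. This is an illustrative model only — a real hearing aid DSP would use an optimized FFT with a per-band gain table — and the function name and parameters are assumptions:

```python
import cmath
import math

def apply_band_gain(samples, sample_rate_hz, cutoff_hz, gain):
    """Apply a gain to all frequency components at or above cutoff_hz,
    e.g., boosting high frequencies for a user who hears them poorly."""
    n = len(samples)
    # Forward DFT
    spec = [sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]
    # Scale every bin whose (aliased) frequency is at/above the cutoff
    for k in range(n):
        freq_hz = min(k, n - k) * sample_rate_hz / n
        if freq_hz >= cutoff_hz:
            spec[k] *= gain
    # Inverse DFT (input was real, so take the real part)
    return [sum(spec[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

# A 3 kHz tone sampled at 8 kHz, boosted 2x above a 2 kHz cutoff:
tone = [math.cos(2 * math.pi * 3 * t / 8) for t in range(8)]
boosted = apply_band_gain(tone, 8000, 2000, 2.0)
```

With these inputs, every sample of the high-frequency tone comes back doubled, while a tone below the cutoff would pass through unchanged — the essence of per-band hearing loss compensation.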
Processing logic 704 receives digital audio signal 746, and may process digital audio signal 746 in any manner to generate processed digital audio signal 762. For instance, as shown in FIG. 7, DSP 730 may receive digital audio signal 746, and may perform digital signal processing on digital audio signal 746 to generate processed digital audio signal 762. DSP 730 may be configured in any manner, including as a conventional DSP known to persons skilled in the relevant art(s), or in another manner. DSP 730 may perform any suitable type of digital signal processing to process/filter digital audio signal 746, including processing digital audio signal 746 in the frequency domain to manipulate the frequency spectrum of digital audio signal 746 (e.g., according to Fourier transform/analysis techniques, etc.). DSP 730 may amplify particular frequencies, may attenuate particular frequencies, and may otherwise modify digital audio signal 746 in the discrete domain. DSP 730 may perform the signal processing for various reasons, including noise cancelation or hearing loss compensation. For instance, DSP 730 may process digital audio signal 746 to compensate for a personal hearing frequency response of the user, such as compensating for poor hearing of high frequencies, middle range frequencies, or other personal frequency response characteristics of the user. - In one embodiment, DSP 730 may be pre-configured to process digital audio signal 746. In another embodiment, DSP 730 may receive instructions from CPU 732 regarding how to process digital audio signal 746. For instance, CPU 732 may access one or more DSP configurations stored in memory 734 (e.g., in other data 768) that may be provided to DSP 730 to configure DSP 730 for digital signal processing of digital audio signal 746. For instance, CPU 732 may select a DSP configuration based on a hearing assist mode selected by a user of hearing assist device 700 (e.g., by interacting with switch 532, etc.). - As shown in
FIG. 7, D/A converter 764 receives processed digital audio signal 762, and converts processed digital audio signal 762 to analog form, generating processed audio signal 766. D/A converter 764 may be configured in any manner, including as a conventional D/A converter. Speaker 714 receives processed audio signal 766, and broadcasts sound generated based on processed audio signal 766 into the ear of the user. The user is enabled to hear the broadcast sound, which may be amplified, filtered, and/or otherwise frequency manipulated with respect to the sound received by microphone 706. Speaker 714 may be a miniature speaker conventionally used in hearing aids, as would be known to persons skilled in the relevant art(s), or may be another suitable type of speaker. Speaker 512 (FIG. 5) is an example of speaker 714. Speaker 714 may include one or more speakers. - Hearing assist
device 700 of FIG. 7 is further described as follows with respect to FIGS. 8-14. FIG. 8 shows a flowchart 800 of a process for a hearing assist device that processes and transmits sensor data and receives a command from a second device, according to an exemplary embodiment. In an embodiment, hearing assist device 700 (as well as either of hearing assist devices 500 and 600) may operate according to flowchart 800. Further structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the following description of flowchart 800 and hearing assist device 700. -
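The steps of flowchart 800, detailed below, can be sketched end-to-end as follows. The scale/offset calibration and the transmit callback are illustrative assumptions, not interfaces defined in the patent:

```python
def handle_sensor(raw_adc, scale, offset, transmit):
    """Step 802: accept a raw sensor output; step 804: process it into
    a presentable value; step 806: hand it to a transceiver callback
    for wireless transmission."""
    value = raw_adc * scale + offset   # step 804: scale to real units
    transmit(value)                    # step 806: e.g., NFC/BT send
    return value

sent = []
# e.g., a 10-bit ADC count of 512 mapped to a temperature reading with
# assumed calibration constants:
reading = handle_sensor(512, scale=0.05, offset=10.0, transmit=sent.append)
print(round(reading, 1))  # → 35.6
```

In the device itself, the `transmit` role would be played by NFC transceiver 718 or BT transceiver 722, and the scaled value could also be retained in memory 734 as sensor data 736.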
Flowchart 800 begins with step 802. In step 802, a sensor output signal is received from a medical sensor of the hearing assist device that senses a characteristic of the user. For example, as shown in FIG. 7, sensors 702a-702c may each sense/measure information about a health characteristic of the user of hearing assist device 700. Sensors 702a-702c may each be one of the sensors shown in FIGS. 5 and 6, and/or mentioned elsewhere herein. Although three sensors are shown in FIG. 7 for purposes of illustration, other numbers of sensors may be present in hearing assist device 700, including one sensor, two sensors, or greater numbers of sensors. Sensors 702a-702c each may generate a corresponding sensor output signal 758a-758c (e.g., an electrical signal) that indicates the measured information about the corresponding health characteristic. For instance, sensor output signals 758a-758c may be analog or digital signals having levels or values corresponding to the measured information. - Sensor interfaces 728a-728c are each optionally present, depending on whether the corresponding sensor outputs a sensor output signal that needs to be modified to be receivable by CPU 732. For instance, each of sensor interfaces 728a-728c may include an amplifier, filter, and/or A/D converter (e.g., similar to amplifier 708, filter 710, and A/D converter 712) that respectively amplifies (e.g., increases or decreases), attenuates particular frequencies of, and/or converts to digital form the corresponding sensor output signal. Sensor interfaces 728a-728c (when present) respectively output modified sensor output signals 760a-760c. - In
step 804, the sensor output signal is processed to generate processed sensor data. For instance, as shown in FIG. 7, processing logic 704 receives modified sensor output signals 760a-760c. Processing logic 704 may process modified sensor output signals 760a-760c in any manner to generate processed sensor data. For instance, as shown in FIG. 7, CPU 732 may receive modified sensor output signals 760a-760c. CPU 732 may process the sensor information in one or more of modified sensor output signals 760a-760c to generate processed sensor data. For instance, CPU 732 may manipulate the sensor information (e.g., according to an algorithm of code 738) to convert the sensor information into a presentable form (e.g., scaling the sensor information, adding or subtracting a constant to/from the sensor information, etc.). Furthermore, CPU 732 may transmit the sensor information of modified sensor output signals 760a-760c to DSP 730 to be digital signal processed by DSP 730 to generate processed sensor data, and may receive the processed sensor data from DSP 730. The processed and/or raw (unprocessed) sensor data may optionally be stored in memory 734 (e.g., as sensor data 736). - In step 806, the processed sensor data is wirelessly transmitted from the hearing assist device to a second device. For instance, as shown in FIG. 7, CPU 732 may provide the sensor data (processed or raw) (e.g., from CPU registers, from DSP 730, from memory 734, etc.) to a transceiver to be transmitted from hearing assist device 700. In the embodiment of FIG. 7, hearing assist device 700 includes an NFC transceiver 718 and a BT transceiver 722, which may each be used to transmit sensor data from hearing assist device 700. In alternative embodiments, hearing assist device 700 may include one or more additional and/or alternative transceivers that may transmit sensor data from hearing assist device 700, including a Wi-Fi transceiver, a forward IR/UV communication transceiver (e.g., transceiver 520 of FIG. 5), a telecoil transceiver (which may transmit via telecoil 526), a skin communication transceiver (which may transmit via skin communication conductor 534), etc. The operation of such alternative transceivers will become apparent to persons skilled in the relevant art(s) based on the teachings provided herein. - As shown in
FIG. 7, NFC transceiver 718 may receive an information signal 740 from CPU 732 that includes sensor data for transmitting. In an embodiment, NFC transceiver 718 may modulate the sensor data onto NFC antenna signal 748 to be transmitted from hearing assist device 700 by NFC coil 716 when NFC coil 716 is energized by an RF field generated by a second device. - Similarly, BT transceiver 722 may receive an information signal 754 from CPU 732 that includes sensor data for transmitting. In an embodiment, BT transceiver 722 may modulate the sensor data onto BT antenna signal 752 to be transmitted from hearing assist device 700 by antenna 720 (e.g., BTLE antenna 522 of FIG. 5), according to a Bluetooth™ communication protocol or standard. - In embodiments, a hearing assist device may transmit/make a first communication with one or more other devices to provide sensor data and/or other information, and to receive information. For instance,
FIG. 9 shows a communication system 900 that includes a hearing assist device communicating with other communication devices, according to an exemplary embodiment. As shown in FIG. 9, communication system 900 includes hearing assist device 700, a mobile computing device 902, a stationary computing device 904, and a server 906. System 900 is described as follows. - Mobile computing device 902 (e.g., a local supporting device) is a device capable of communicating with hearing assist device 700 according to one or more communication techniques. For instance, as shown in FIG. 9, mobile computing device 902 includes a telecoil 910, one or more microphones 912, an IR/UV communication transceiver 914, a WPT/NFC coil 916, and a Bluetooth™ antenna 918. In embodiments, mobile computing device 902 may include one or more of these features and/or alternative or additional features (e.g., communication mechanisms, etc.). Mobile computing device 902 may be any type of mobile electronic device, including a personal digital assistant (PDA), a laptop computer, a notebook computer, a tablet computer (e.g., an Apple iPad™), a netbook, a mobile phone (e.g., a cell phone, a smart phone, etc.), a special purpose medical device, etc. The features of mobile computing device 902 shown in FIG. 9 are described as follows. -
Telecoil 910 is a communication mechanism that may be present to enable mobile computing device 902 to communicate with hearing assist device 700 via a telecoil (e.g., telecoil 526 of FIG. 5). For instance, telecoil 910 and an associated transceiver may enable mobile computing device 902 to couple audio sources and/or other communications to hearing assist device 700 in a manner known to persons skilled in the relevant art(s). - Microphone(s) 912 may be present to receive voice of a user of mobile computing device 902. For instance, the user may provide instructions for mobile computing device 902 and/or for hearing assist device 700 by speaking into microphone(s) 912. The received voice may be transmitted to hearing assist device 700 (in digital or analog form) according to any communication mechanism, or may be converted into data and/or commands to be provided to hearing assist device 700 to cause functions/actions in hearing assist device 700. Microphone(s) 912 may include any number of microphones, and may be configured in any manner, including being omni-directional (non-directional), directional, etc. - IR/
UV communication transceiver 914 is a communication mechanism that may be present to enable communications with hearing assist device 700 via an IR/UV communication transceiver of hearing assist device 700 (e.g., forward IR/UV communication transceiver 520 of FIG. 5). IR/UV communication transceiver 914 may receive information/data from and/or transmit information/data to hearing assist device 700 (e.g., in the form of modulated light, as described above). - WPT/NFC coil 916 is an NFC antenna coupled to an NFC transceiver in mobile computing device 902 that may be present to enable NFC communications with an NFC communication mechanism of hearing assist device 700 (e.g., NFC transceiver 110 of FIG. 1, NFC coil 530 of FIG. 5). WPT/NFC coil 916 may be used to receive information/data from and/or transmit information/data to hearing assist device 700. - Bluetooth™ antenna 918 is a communication mechanism coupled to a Bluetooth™ transceiver in mobile computing device 902 that may be present to enable communications with hearing assist device 700 (e.g., BT transceiver 722 and antenna 720 of FIG. 7). Bluetooth™ antenna 918 may be used to receive information/data from and/or transmit information/data to hearing assist device 700. - As shown in
FIG. 9, mobile computing device 902 and hearing assist device 700 may exchange communication signals 920 according to any communication mechanism/protocol/standard mentioned herein or otherwise known. According to step 806, hearing assist device 700 may wirelessly transmit sensor data to mobile computing device 902. - Stationary computing device 904 (e.g., a local supporting device) is also a device capable of communicating with hearing assist device 700 according to one or more communication techniques. For instance, stationary computing device 904 may be capable of communicating with hearing assist device 700 according to any of the communication mechanisms shown for mobile computing device 902 in FIG. 9, and/or according to other communication mechanisms/protocols/standards described elsewhere herein or otherwise known. Stationary computing device 904 may be any type of stationary electronic device, including a desktop computer (e.g., a personal computer, etc.), a docking station, a set top box, a gateway device, an access point, special purpose medical equipment, etc. - As shown in FIG. 9, stationary computing device 904 and hearing assist device 700 may exchange communication signals 922 according to any communication mechanism/protocol/standard mentioned herein or otherwise known. According to step 806, hearing assist device 700 may wirelessly transmit sensor data to stationary computing device 904. - It is noted that mobile computing device 902 (and/or stationary computing device 904) may communicate with server 906 (e.g., a remote supporting device, a third device). For instance, as shown in FIG. 9, mobile computing device 902 (and/or stationary computing device 904) may be communicatively coupled with server 906 by network 908. Network 908 may be any type of communication network, including a local area network (LAN), a wide area network (WAN), a personal area network (PAN), a phone network (e.g., a cellular network, a land based network), or a combination of communication networks, such as the Internet. Network 908 may include wired and/or wireless communication pathway(s) implemented using any of a wide variety of communication media and associated protocols. For example, such communication pathway(s) may comprise wireless communication pathways implemented via radio frequency (RF) signaling, infrared (IR) signaling, or the like. Such signaling may be carried out using long-range wireless protocols such as WIMAX® (IEEE 802.16) or GSM (Global System for Mobile Communications), medium-range wireless protocols such as WI-FI® (IEEE 802.11), and/or short-range wireless protocols such as BLUETOOTH® or any of a variety of IR-based protocols. Such communication pathway(s) may also comprise wired communication pathways established over twisted pair, Ethernet cable, coaxial cable, optical fiber, or the like, using suitable communication protocols therefor. It is noted that security protocols (e.g., private key exchange, etc.) may be used to protect sensitive health information that is communicated by hearing assist device 700 to and from remote devices. -
Server 906 may be any computer system, including a stationary computing device, a server computer, a mobile computing device, etc. Server 906 may include a web service, an API (application programming interface), or other service or interface for communications. - Sensor data and/or other information may be transmitted (e.g., relayed) to server 906 over network 908 to be processed. After such processing, in response, server 906 may transmit processed data, instructions, and/or other information through network 908 to mobile computing device 902 (and/or stationary computing device 904) to be transmitted to hearing assist device 700 to be stored, to cause a function/action at hearing assist device 700, and/or for other reasons. - Referring back to
FIG. 8, in step 808, at least one command is received from the second device at the hearing assist device. For instance, referring to FIG. 7, hearing assist device 700 may receive a second communication as a wirelessly transmitted communication signal from a second device at NFC coil 716, antenna 720, or another antenna or communication mechanism at hearing assist device 700. The communication may include a command and/or may identify a function, and hearing assist device 700 may respond by performing the command and/or function. For instance, hearing assist device 700 may respond by gathering additional sensor data, by analyzing retrieved sensor data, by performing a command, etc. Example commands include commands relating to sensor data capture, such as a command for a particular sensor to perform and/or provide a measurement, a command related to a sensing configuration (e.g., turning on and/or off particular sensors, calibrating particular sensors, etc.), a command related to a hearing assist device configuration (e.g., turning on and/or off particular hearing assist device components, calibrating particular components, etc.), a command that defines audio playback, etc. A received communication may define audio playback, such as by including or causing audio data to be played to the user by a speaker of hearing assist device 700 as voice or other sound, including audio data that prompts for user input (e.g., requests a user response to a question, etc.). - For instance, in the example of
NFC coil 716, a command may be transmitted from NFC coil 716 on NFC antenna signal 748 to NFC transceiver 718. NFC transceiver 718 may demodulate command data from the received communication signal, and provide the command to CPU 732. In the example of antenna 720, the command may be transmitted from antenna 720 on BT antenna signal 752 to BT transceiver 722. BT transceiver 722 may demodulate command data from the received communication signal, and provide the command to CPU 732. -
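Once a command has been demodulated and handed to the CPU, it might be dispatched through a lookup table. The command codes and handler actions below are invented for illustration, not codes defined in the patent:

```python
# Illustrative command dispatch for a demodulated NFC/BT command byte.
state = {"sensors_on": False, "mode": "normal"}

COMMANDS = {
    0x01: lambda: state.update(sensors_on=True),    # activate sensors
    0x02: lambda: state.update(sensors_on=False),   # deactivate sensors
    0x10: lambda: state.update(mode="low_power"),   # change modes
}

def execute(command_code):
    """Look up and run the handler for a received command code."""
    handler = COMMANDS.get(command_code)
    if handler is None:
        return False          # unknown command: ignore safely
    handler()
    return True

execute(0x01)
print(state)   # → {'sensors_on': True, 'mode': 'normal'}
```

A table-driven dispatch like this keeps the command set easy to extend (e.g., adding a code that triggers audio playback or a calibration routine) without changing the receive path.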
CPU 732 may execute the received command. The received command may cause hearing assist device 700 to perform one or more functions/actions. For instance, in embodiments, the command may cause hearing assist device 700 to turn on or off, to change modes, to activate or deactivate one or more sensors, to wirelessly transmit further information, to execute particular program code (e.g., stored as code 738 in memory 734), to play a sound (e.g., an alert, a tone, a beeping noise, pre-recorded or synthesized voice, etc.) from speaker 714 to the user to inform the user of information and/or cause the user to perform a function/action, and/or cause one or more additional and/or alternative functions/actions to be performed by hearing assist device 700. Further examples of such commands and functions/actions are described elsewhere herein. - In embodiments, a hearing assist device may be configured to convert received RF energy into charge for storage in a battery of the hearing assist device. For instance, as shown in
FIG. 7, hearing assist device 700 includes charge circuit 724 for charging battery 726, which is a rechargeable battery (e.g., rechargeable battery 114). In an embodiment, charge circuit 724 may operate according to FIG. 10. FIG. 10 shows a flowchart 1000 of a process for wirelessly charging a battery of a hearing assist device, according to an exemplary embodiment. Flowchart 1000 is described as follows. - In
step 1002 of flowchart 1000, a radio frequency signal is received. For example, as shown in FIG. 7, NFC coil 716, antenna 720, and/or another antenna or coil of hearing assist device 700 may receive a radio frequency (RF) signal. The RF signal may be a communication signal that includes data (e.g., modulated on the RF signal), or may be an un-modulated RF signal. Charge circuit 724 may be coupled to one or more of NFC coil 716, antenna 720, or another antenna to receive the RF signal. - In
step 1004, a charge current is generated that charges a rechargeable battery of the hearing assist device based on the received radio frequency signal. In an embodiment, charge circuit 724 is configured to generate a charge current 756 that is used to charge battery 726. Charge circuit 724 may be configured in various ways to convert a received RF signal to a charge current. For instance, charge circuit 724 may include an induction coil that takes power from an electromagnetic field and converts it to electrical current. Alternatively, charge circuit 724 may include a diode rectifier circuit that rectifies the received RF signal to a DC (direct current) signal, and may include one or more charge pump circuits coupled to the diode rectifier circuit to create a higher voltage value from the DC signal. Alternatively, charge circuit 724 may be configured in other ways to generate charge current 756 from a received RF signal. - In this manner, hearing assist
device 700 may maintain power for operation, with battery 726 being charged periodically by RF fields generated by other devices, rather than needing to have its batteries physically replaced. - In another embodiment, hearing assist
device 700 may be configured to generate sound based on received sensor data. For instance, hearing assist device 700 may operate according to FIG. 11. FIG. 11 shows a flowchart 1100 of a process for generating and broadcasting sound based on sensor data, according to an exemplary embodiment. For purposes of illustration, flowchart 1100 is described as follows with reference to FIG. 7. -
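As one illustration of the kind of sensor-data-to-audio mapping that such a process enables, the following sketch selects a stored audio clip based on a sensor reading. The thresholds, sensor names, and clip identifiers are all assumptions made for illustration only:

```python
# Hypothetical mapping from processed sensor data to stored audio clips.
# All thresholds and clip names are illustrative assumptions.

def select_alert(sensor, value):
    """Return a (kind, clip) pair to play for a reading, or None."""
    if sensor == "glucose":
        if value < 70:
            return ("voice", "blood_sugar_low_insulin_required")
        if value > 180:
            return ("beep", "glucose_out_of_range")
    elif sensor == "heart_rate":
        if value > 120:
            return ("tone", "heart_rate_high")
    return None  # reading in range: stay silent
```

In practice, the selected clip identifier would index into audio data held in the device's memory (or fetched from a paired device) before being handed to the DSP for playback.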
Flowchart 1100 begins with step 1102. In step 1102, an audio signal is generated based at least on the processed sensor data. For instance, as described above with respect to the steps of FIG. 8, a sensor output signal may be processed to generate processed sensor data. The processed sensor data may be stored in memory 734 as sensor data 736, may be held in registers in CPU 732, or may be present in another location. Audio data for one or more sounds (e.g., tones, beeping sounds, voice segments, etc.) may be stored in memory 734 (e.g., as other data 768) and may be selected for play to the user based on particular sensor data (e.g., particular values of sensor data, etc.). CPU 732 or DSP 730 may select the audio data corresponding to particular sensor data from memory 734. Alternatively, CPU 732 may transmit a request for the audio data to another device using a communication mechanism (e.g., NFC transceiver 718, BT transceiver 722, etc.). DSP 730 may receive the audio data from CPU 732, from memory 734, or from another device, and may generate processed digital audio signal 762 based thereon. - In
step 1104, sound is generated based on the audio signal, the sound broadcast from a speaker of the hearing assist device into the ear of the user. For instance, as shown in FIG. 7, D/A converter 764 may be present, and may receive processed digital audio signal 762. D/A converter 764 may convert processed digital audio signal 762 to analog form to generate processed audio signal 766. Speaker 714 receives processed audio signal 766, and broadcasts sound generated based on processed audio signal 766 into the ear of the user. - In this manner, sounds may be provided to the user by hearing
assist device 700 based at least on sensor data, and optionally further based on additional information. The sounds may provide information to the user, and may remind or instruct the user to perform a function/action. The sounds may include one or more of a tone, a beeping sound, or a voice that includes at least one of a verbal instruction to the user, a verbal warning to the user, or a verbal question to the user. For instance, a tone or a beeping sound may be provided to the user as an alert based on particular values of sensor data (e.g., indicating an out-of-range glucose/blood sugar value), and/or a voice instruction may be provided to the user as the alert based on the particular values of sensor data (e.g., a voice segment stating “Blood sugar is low—insulin is required” or “Your heart rate is 80 beats per minute, your heart is fine, and your pacemaker has 6 hours of battery left.”). - In another embodiment, hearing assist
device 700 may be configured to generate filtered environmental sound. For instance, hearing assist device 700 may operate according to FIG. 12. FIG. 12 shows a flowchart 1200 of a process for generating and broadcasting filtered sound from a hearing assist device, according to an exemplary embodiment. For purposes of illustration, flowchart 1200 is described as follows with reference to FIG. 7. -
Flowchart 1200 begins with step 1202. In step 1202, an audio signal is generated based on environmental sound received by at least one microphone of the hearing assist device. For instance, as shown in FIG. 7, microphone 706 may generate a received audio signal 740 based on received environmental sound. Received audio signal 740 may optionally be amplified, filtered, and converted to digital form to generate digital audio signal 746, as shown in FIG. 7. - In step 1204, one or more frequencies of the audio signal are selectively favored to generate a modified audio signal. As shown in
FIG. 7, DSP 730 may receive digital audio signal 746, and may perform digital signal processing on digital audio signal 746 to generate processed digital audio signal 762. DSP 730 may favor one or more frequencies by amplifying particular frequencies, by attenuating particular frequencies, and/or by otherwise filtering digital audio signal 746 in the discrete domain. DSP 730 may perform the signal processing for various reasons, including noise cancelation or hearing loss compensation. For instance, DSP 730 may process digital audio signal 746 to compensate for a personal hearing frequency response of the user, such as compensating for poor hearing of high frequencies, middle range frequencies, or other personal frequency response characteristics of the user. - In
step 1206, sound is generated based on the modified audio signal, the sound broadcast from a speaker of the hearing assist device into the ear of the user. For instance, as shown in FIG. 7, D/A converter 764 may be present, and may receive processed digital audio signal 762. D/A converter 764 may convert processed digital audio signal 762 to analog form to generate processed audio signal 766. Speaker 714 receives processed audio signal 766, and broadcasts sound generated based on processed audio signal 766 into the ear of the user. - In this manner, environmental noise, voice, and other sounds may be tailored to a particular user's personal hearing frequency response characteristics. Furthermore, particular noises in the environment (e.g., road noise, engine noise, etc.) may be attenuated or filtered out of the received environmental sounds so that the user may better hear important or desired sounds. Furthermore, sounds that are desired to be heard (e.g., music, a conversation, a verbal warning, verbal instructions, sirens, sounds of a nearby car accident, etc.) may be amplified so that the user may better hear them.
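One concrete way to "favor" high frequencies, as step 1204 describes for a user with poor high-frequency hearing, is a first-difference pre-emphasis filter. The sketch below is a minimal pure-Python illustration (the coefficient k and sample rate are assumed values, and a real DSP would use a properly fitted multi-band filter):

```python
import math

def boost_high_frequencies(samples, k=0.8):
    """First-difference pre-emphasis: y[n] = x[n] + k*(x[n] - x[n-1]).
    Gain is ~1 at DC and rises with frequency, so high bands are favored."""
    out, prev = [], 0.0
    for x in samples:
        out.append(x + k * (x - prev))
        prev = x
    return out

def peak(samples):
    return max(abs(s) for s in samples)

# One second of a low (200 Hz) and a high (3 kHz) test tone at 8 kHz.
fs = 8000
low = [math.sin(2 * math.pi * 200 * n / fs) for n in range(fs)]
high = [math.sin(2 * math.pi * 3000 * n / fs) for n in range(fs)]
```

Running both tones through the filter leaves the 200 Hz tone nearly unchanged while the 3 kHz tone comes out more than twice as large, which is the "selective favoring" behavior of step 1204 in its simplest form.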
- In another embodiment, hearing assist
device 700 may be configured to transmit recorded voice of a user to another device. For instance, hearing assist device 700 may operate according to FIG. 13. FIG. 13 shows a flowchart 1300 of a process for generating an information signal in a hearing assist device based on a voice of a user, and for transmitting the information signal to a second device, according to an exemplary embodiment. For purposes of illustration, flowchart 1300 is described as follows with reference to FIG. 7. -
Flowchart 1300 begins with step 1302. In step 1302, an audio signal is generated based on a voice of the user received at a microphone of the hearing assist device. For instance, as shown in FIG. 7, microphone 706 may generate a received audio signal 740 based on received voice of the user. Received audio signal 740 may optionally be amplified, filtered, and converted to digital form to generate digital audio signal 746, as shown in FIG. 7. - The voice of the user may be any statement made by the user, including a question, a statement of fact, a command, or any other verbal sequence. For instance, the user may ask “what is my heart rate”. Such statements may be intended by the user for capture by one or more hearing assist devices and supporting local and remote systems. Such statements may also include unintentional sounds such as semi-lucid ramblings, moaning, choking, coughing, and/or other sounds. Any one or more of the hearing assist devices and the supporting local device can receive (via microphones) such audio and forward the audio from the hearing assist device(s) as needed for further processing. This processing may include voice and/or sound recognition, comparisons with command words or sequences, (video, audio) prompting for (gesture, tactile, or audible) confirmation, carrying out commands, storage for later analysis or playback, and/or forwarding to an appropriate recipient system for further processing, storage, and/or presentation to others.
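A simple first screen on such captured audio is deciding which frames contain intentional speech at all. The sketch below combines per-frame microphone energy with a jaw-vibration cue of the kind a vibration sensor could supply; the frame representation, threshold, and boolean vibration flag are all assumptions for illustration:

```python
# Hypothetical speech-pause detector: a frame is a candidate word break
# when microphone energy is low AND no jaw vibration is sensed. Requiring
# both cues helps reject external noise that occurs during pauses.

def find_word_breaks(mic_energy, jaw_vibration, energy_floor=0.1):
    breaks = []
    for i, (e, v) in enumerate(zip(mic_energy, jaw_vibration)):
        if e < energy_floor and not v:
            breaks.append(i)
    return breaks

energy = [0.9, 0.8, 0.05, 0.7, 0.6, 0.02, 0.03]
vibration = [True, True, False, True, True, False, False]
```

With these example frames, frames 2, 5, and 6 would be flagged as pauses between words; a recognizer could use such break indices to segment the audio before recognition.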
- In
step 1304, an information signal is generated based on the audio signal. As shown in FIG. 7, DSP 730 may receive digital audio signal 746. In an embodiment, DSP 730 and/or CPU 732 may generate an information signal from digital audio signal 746 to be transmitted to a second device from hearing assist device 700. DSP 730 and/or CPU 732 may optionally perform voice/speech recognition on digital audio signal 746 to recognize spoken words included therein, and may include the spoken words in the generated information signal. - For instance, in an embodiment,
code 738 stored in memory 734 may include a voice recognition program that may be executed by CPU 732 and/or DSP 730. The voice recognition program may use conventional or proprietary voice recognition techniques. Furthermore, such voice recognition techniques may be augmented by sensor data. For instance, as described above, position/motion sensor 518 may include a vibration sensor. The vibration sensor may detect vibrations of the user associated with speaking (e.g., jaw movement of the wearer during talking), and generate corresponding vibration information/data. The vibration information output by the vibration sensor may be received by CPU 732 and/or DSP 730, and may be used to aid in improving speech/voice recognition performed by the voice recognition program. For instance, the vibration information may be used by the voice recognition program to detect breaks between words, to identify the locations of spoken syllables, to identify the syllables themselves, and/or to better perform other aspects of voice recognition. Alternatively, the vibration information may be transmitted from hearing assist device 700, along with the information signal, to a second device to perform the voice recognition process at the second device (or another device). - In
step 1306, the generated information signal is transmitted to the second device. For instance, as shown in FIG. 7, CPU 732 may provide the information signal (e.g., from CPU registers, from DSP 730, from memory 734, etc.) to a transceiver to be transmitted from hearing assist device 700 (e.g., NFC transceiver 718, BT transceiver 722, or another transceiver). - Another device, such as
mobile computing device 902, stationary computing device 904, or server 906, which may be an associated device, a third party device (utilized by a third party), or a device otherwise related or unrelated to hearing assist device 700, may receive the transmitted voice information, and may analyze the voice (spoken words, moans, slurred words, etc.) therein to determine one or more functions/actions to be performed. As a result, one or more functions/actions may be determined to be performed by hearing assist device 700 or another device. - In another embodiment, hearing assist
device 700 may be configured to enable voice to be received and/or generated to be played to the user. For instance, hearing assist device 700 may operate according to FIG. 14. FIG. 14 shows a flowchart 1400 of a process for generating voice to be broadcast to a user, according to an exemplary embodiment. For purposes of illustration, flowchart 1400 is described as follows with reference to FIG. 7. -
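One lightweight way voice output of this kind is often produced is by filling sensor values into pre-recorded or synthesized phrase templates. The following sketch is an assumed illustration; the template strings merely echo the example phrases used elsewhere in this description:

```python
# Hypothetical template-based voice message builder for sensor readings.
# Template wording and sensor names are illustrative assumptions.

TEMPLATES = {
    "heart_rate": "your heart rate is {value} beats per minute",
    "temperature": "your temperature is {value} degrees",
}

def voice_message(sensor, value):
    template = TEMPLATES.get(sensor)
    if template is None:
        return None  # no phrase available: fall back to a tone or beep
    return template.format(value=value)
```

The resulting string would then be rendered to audio, either by concatenating recorded word segments from memory or by a speech synthesis engine, before being handed to the D/A converter and speaker.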
Flowchart 1400 begins with step 1402. In step 1402, a sensor output signal is received from a medical sensor of the hearing assist device that senses a characteristic of the user. Similarly to step 802 of FIG. 8, sensors 702a-702c each sense/measure information about a health characteristic of the user of hearing assist device 700. For instance, sensor 702a may sense a characteristic of the user (e.g., a heart rate, a blood pressure, a glucose level, a temperature, etc.). Sensor 702a generates sensor output signal 758a, which indicates the measured information about the corresponding health characteristic. Sensor interface 728a, when present, may convert sensor output signal 758a to modified sensor output signal 760a, to be received by processing logic. - In
step 1404, processed sensor data is generated based on the sensor output signal. Similarly to step 804 of FIG. 8, processing logic 704 receives modified sensor output signal 760a, and may process modified sensor output signal 760a in any manner. For instance, as shown in FIG. 7, CPU 732 may receive modified sensor output signal 760a, and may process the sensor information contained therein to generate processed sensor data. For instance, CPU 732 may manipulate the sensor information (e.g., according to an algorithm of code 738) to convert the sensor information into a presentable form (e.g., scaling the sensor information, adding or subtracting a constant to/from the sensor information, etc.), or may otherwise process the sensor information. Furthermore, CPU 732 may transmit the sensor information of modified sensor output signal 760a to DSP 730 to be digital signal processed. - In
step 1406, a voice audio signal generated based at least on the processed sensor data is received. In an embodiment, the processed sensor data generated in step 1404 may be transmitted from hearing assist device 700 to another device (e.g., as shown in FIG. 9), and a voice audio signal may be generated at the other device based on the processed sensor data. In another embodiment, the voice audio signal may be generated by processing logic 704 based on the processed sensor data. The voice audio signal contains voice information (e.g., spoken words) that relates to the processed sensor data. For instance, the voice information may include a verbal alert, verbal instructions, and/or other verbal information to be provided to the user based on the processed sensor data (e.g., based on a value of measured sensor data, etc.). The voice information may be synthesized, retrieved from memory 734 (e.g., from a library of recorded spoken segments in other data 768), or generated from a combination thereof. It is noted that the voice audio signal may be generated based on processed sensor data from one or more sensors. DSP 730 may output the voice audio signal as processed digital audio signal 762. - In
step 1408, voice is broadcast from the speaker into the ear of the user based on the received voice audio signal. For instance, as shown in FIG. 7, D/A converter 764 may be present, and may receive processed digital audio signal 762. D/A converter 764 may convert processed digital audio signal 762 to analog form to generate processed audio signal 766. Speaker 714 receives processed audio signal 766, and broadcasts voice generated based on processed audio signal 766 into the ear of the user. - In this manner, voice may be provided to the user by hearing
assist device 700 based at least on sensor data, and optionally further based on additional information. The voice may provide information to the user, and may remind or instruct the user to perform a function/action. For instance, the voice may include at least one of a verbal instruction to the user (“take an iron supplement”), a verbal warning to the user (“your heart rate is high”), a verbal question to the user (“have you fallen down, and do you need assistance?”), or a verbal answer to the user (“your heart rate is 98 beats per minute”). - The next section describes some example hardware/software/firmware embodiments for hearing assist devices and associated remote devices.
- In embodiments, hearing assist devices may be configured to perform various functions using hardware (e.g., circuits), or a combination of hardware and software/firmware (e.g.,
code 738 of FIG. 7, etc.). Furthermore, hearing assist devices may communicate with remote devices (e.g., mobile computing device 902, stationary computing device 904, server 906, etc.) that include corresponding functionality. According to an embodiment, FIG. 15 shows a system 1500 comprising a hearing assist device 1501 and a cloud/service/phone/portable device 1503 that may be communicatively connected thereto. Hearing assist device 1501 may comprise, for example and without limitation, one of the hearing assist devices described herein. Although only one hearing assist device 1501 is shown in FIG. 15, it is to be understood that system 1500 may include two hearing assist devices. Device 1503 may comprise, for example and without limitation, mobile computing device 902, stationary computing device 904, server 906, or another remote device that is accessible to hearing assist device 1501. Thus, device 1503 may be local with respect to the wearer of hearing assist device 1501 or remote with respect to the wearer of hearing assist device 1501. -
Hearing assist device 1501 includes a number of processing modules that may be implemented as software or firmware running on one or more general purpose processors (e.g., CPU 732 of FIG. 7) and/or DSPs (e.g., DSP 730), as dedicated circuitry, or as a combination thereof. Such processors and/or dedicated circuitry are collectively referred to in FIG. 15 as general purpose (DSP) and dedicated processing circuitry 1513. As shown in FIG. 15, the processing modules include a speech generation module 1523, a speech/noise recognition module 1525, an enhanced audio processing module 1527, a clock/scheduler module 1529, a mode select and reconfiguration module 1531, and a battery management module 1533. - As also shown in
FIG. 15, hearing assist device 1501 further includes local storage 1535. Local storage 1535 comprises one or more volatile and/or non-volatile memory devices or structures that are internal to hearing assist device 1501 (e.g., memory 734 of FIG. 7). Such memory devices or structures may be used to store recorded audio information in an audio playback queue 1537, as well as to store information and settings 1539 associated with hearing assist device 1501, a user thereof, a device paired thereto, and services (cloud-based or otherwise) accessed by or on behalf of hearing assist device 1501. -
Hearing assist device 1501 further includes sensor components and associated circuitry 1541. Such sensor components and associated circuitry may include, but are not limited to, one or more microphones, bone conduction sensors, temperature sensors, blood pressure sensors, blood glucose sensors, pulse oximetry sensors, pH sensors, vibration sensors, accelerometers, gyros, magnetos, any other sensor mentioned elsewhere herein, or the like. -
Hearing assist device 1501 still further includes user interface (UI) components and associated circuitry 1543. Such UI components may include buttons, switches, dials, capacitive touch sensing devices, or other mechanical components by which a user may control and configure the operation of hearing assist device 1501 (e.g., switch 532 and volume controller 540). Such UI components may also comprise capacitive sensing components to allow for touch-based or tap-based interaction with hearing assist device 1501. Such UI components may further include a voice-based UI. Such a voice-based UI may utilize speech/noise recognition module 1525 to recognize commands uttered by a user of hearing assist device 1501 and/or speech generation module 1523 to provide output in the form of pre-defined or synthesized speech. In an embodiment in which hearing assist device 1501 comprises an integrated part of a pair of glasses, a visor, or a helmet, user interface components and associated circuitry 1543 may also comprise a display integrated with or projected upon a portion of the glasses, visor, or helmet for presenting information to a user. -
Hearing assist device 1501 also includes communication interfaces and associated circuitry 1545 for carrying out communication over one or more wired, wireless, or skin-based communication pathways. Communication interfaces and associated circuitry 1545 enable hearing assist device 1501 to communicate with device 1503. Communication interfaces and associated circuitry 1545 may also enable hearing assist device 1501 to communicate with a second hearing assist device worn by the same user, as well as with other devices. - Generally speaking, cloud/service/phone/
portable device 1503 comprises power resources, processing resources, and storage resources that can be used by hearing assist device 1501 to assist in performing certain operations and/or to improve the performance of such operations when a communication pathway has been established between the two devices. - In particular,
device 1503 includes a number of assist processing modules that may be implemented as software or firmware running on one or more general purpose processors and/or DSPs, as dedicated circuitry, or as a combination thereof. Such processors and/or dedicated circuitry are collectively referred to in FIG. 15 as general/dedicated processing circuitry (with hearing assist device support) 1553. As shown in FIG. 15, the processing modules include a speech generation assist module 1555, a speech/noise recognition assist module 1557, an enhanced audio processing assist module 1559, a clock/scheduler assist module 1561, a mode select and reconfiguration assist module 1563, and a battery management assist module 1565. - As also shown in
FIG. 15, device 1503 further includes storage 1567. Storage 1567 comprises one or more volatile and/or non-volatile memory devices/structures and/or storage systems that are internal to or otherwise accessible to device 1503. Such memory devices/structures and/or storage systems may be used to store recorded audio information in an audio playback queue 1569, as well as to store information and settings 1571 associated with hearing assist device 1501, a user thereof, a device paired thereto, and services (cloud-based or otherwise) accessed by or on behalf of hearing assist device 1501. For instance, storage 1567 may be used to record commands to be cached in device 1503, such that when a time window becomes available for device 1503 to communicate with the outside environment (because of power savings or availability), such stored commands (and/or other data) may be sent to the user's mobile device, other devices, the cloud, etc. for processing. Results of such processing may be transmitted back to device 1503, to an email address of the user, or to a text message address of the user, and/or may be provided to the user in another manner. -
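The caching behavior just described (accumulate commands while no link is available, flush them in order when a communication window opens) might be sketched as a simple queue. The class and method names below are illustrative assumptions:

```python
# Hypothetical command cache: commands accumulate while no communication
# window is available and are flushed in order when one opens.

class CommandCache:
    def __init__(self):
        self._pending = []

    def record(self, command):
        """Store a command until a communication window is available."""
        self._pending.append(command)

    def flush(self, window_open):
        """Return and clear all pending commands if the window is open;
        otherwise keep them cached and return nothing."""
        if not window_open:
            return []
        sent, self._pending = self._pending, []
        return sent

cache = CommandCache()
cache.record("upload_sensor_log")
cache.record("sync_settings")
```

In a real system the flush would hand the batch to a transceiver, and the window signal would come from the power-management or link-availability logic described in this section.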
Device 1503 also includes communication interfaces and associated circuitry 1577 for carrying out communication over one or more wired, wireless, or skin-based communication pathways. Communication interfaces and associated circuitry 1577 enable device 1503 to communicate with hearing assist device 1501. Such communication may be direct (point-to-point between device 1503 and hearing assist device 1501) or indirect (through one or more intervening devices or nodes). Communication interfaces and associated circuitry 1577 may also enable device 1503 to communicate with other devices or access various remote services, including cloud-based services. - In an embodiment in which
device 1503 comprises a device that is carried by or is otherwise locally accessible to a wearer of hearing assist device 1501, device 1503 may also comprise supplemental sensor components and associated circuitry 1573 and supplemental user interface components and associated circuitry 1575 that can be used by hearing assist device 1501 to assist in performing certain operations and/or to improve the performance of such operations. - Further explanation and examples of how external operational support may be provided to a hearing assist device will now be provided with continued reference to
system 1500 of FIG. 15. - A prerequisite for providing external operational support to hearing assist
device 1501 by device 1503 may be the establishment of a communication pathway between device 1503 and hearing assist device 1501. In one embodiment, the establishment of such a communication pathway is achieved by implementing a communication service on hearing assist device 1501 that monitors for the presence of device 1503 and selectively establishes communication therewith in accordance with a predefined protocol. Alternatively, a communication service may be implemented on device 1503 that monitors for the presence of hearing assist device 1501 and selectively establishes communication therewith in accordance with a predefined protocol. Still other methods of establishing a communication pathway between hearing assist device 1501 and device 1503 may be used. -
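Once such a pathway exists, whether to actually use the resources of device 1503 can be framed as the power trade-off discussed below: offload when the link is estimated to cost less power than the local work it replaces, or when the local battery is critically low. A minimal decision sketch, with all estimates and thresholds assumed:

```python
# Illustrative offload policy for a hearing assist device. The battery
# threshold and power estimates are assumed example values, not taken
# from the specification.

def should_offload(battery_fraction, link_cost_mw, local_savings_mw,
                   low_battery=0.2):
    if battery_fraction < low_battery:
        return True  # conserve what little battery remains
    return link_cost_mw < local_savings_mw
```

A low-power link (e.g., NFC or BLE) would thus enable an assistance feature that a more power-hungry pathway would leave disabled.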
Hearing assist device 1501 includes battery management module 1533, which monitors a state of a battery internal to hearing assist device 1501. Battery management module 1533 may also be configured to alert a wearer of hearing assist device 1501 when such battery is in a low-power state so that the wearer can recharge the battery. As discussed above, the wearer of hearing assist device 1501 can cause such recharging to occur by bringing a portable electronic device within a certain distance of hearing assist device 1501 such that power may be transferred via an NFC link, WPT link, or other suitable link for transferring power between such devices. In an embodiment in which device 1503 comprises such a portable electronic device, hearing assist device 1501 may be said to be utilizing the power resources of device 1503 to assist in the performance of its operations. - As also noted above, when a communication pathway has been established between hearing assist
device 1501 and device 1503, hearing assist device 1501 can also utilize other resources of device 1503 to assist in performing certain operations and/or to improve the performance of such operations. Whether and when hearing assist device 1501 so utilizes the resources of device 1503 may vary depending upon the designs of such devices and/or any user configuration of such devices. - For example, hearing
assist device 1501 may be programmed to utilize certain resources of device 1503 only when the battery power available to hearing assist device 1501 has dropped below a certain level. As another example, hearing assist device 1501 may be programmed to utilize certain resources of device 1503 only when it is determined that the estimated amount of power consumed in maintaining a particular communication pathway between hearing assist device 1501 and device 1503 will be less than the estimated amount of power saved by offloading functionality to and/or utilizing the resources of device 1503. In accordance with such an embodiment, an assistance feature of device 1503 may be provided when a very low power communication pathway can be established or exists between hearing assist device 1501 and device 1503, but that same assistance feature of device 1503 may be disabled if the only communication pathway that can be established or exists between hearing assist device 1501 and device 1503 is one that consumes a relatively greater amount of power. - Still other decision algorithms can be used to determine whether and when hearing
assist device 1501 will utilize resources of device 1503. Such algorithms may be applied by battery management module 1533 of hearing assist device 1501 and/or by battery management assist module 1565 of device 1503 prior to activating assistance features of device 1503. Furthermore, a user interface provided by hearing assist device 1501 and/or device 1503 may enable a user to select which features of hearing assist device 1501 should be able to utilize external operational support and/or under what conditions such external operational support should be provided. The settings established by the user may be stored as part of information and settings 1539 in local storage 1535 of hearing assist device 1501 and/or as part of information and settings 1571 in storage 1567 of device 1503. - In accordance with certain embodiments, hearing
assist device 1501 can also utilize resources of a second hearing assist device to perform certain operations. For example, hearing assist device 1501 may communicate with a second hearing assist device worn by the same user to coordinate distribution or shared execution of particular operations. Such communication may be carried out, for example, via a point-to-point link between the two hearing assist devices or via links between the two hearing assist devices and an intermediate device, such as a portable electronic device being carried by a user. The determination of whether a particular operation should be performed by hearing assist device 1501 versus the second hearing assist device may be made by battery management module 1533, by a battery management module of the second hearing assist device, or via coordination between both battery management modules. - For example, if hearing
assist device 1501 has more battery power available than the second hearing assist device, hearing assist device 1501 may be selected to perform a particular operation, such as taking a blood pressure reading or the like. Such battery imbalance may result from, for example, one hearing assist device being used at a higher volume than the other over an extended period of time. Via coordination between the two hearing assist devices, a more balanced discharging of the batteries of both devices can be achieved. Furthermore, in accordance with certain embodiments, certain sensors may be present on hearing assist device 1501 that are not present on the second hearing assist device, and certain sensors may be present on the second hearing assist device that are not present on hearing assist device 1501, such that a distribution of functionality between the two hearing assist devices is achieved by design. -
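A coordination rule of the kind described, route a task to whichever hearing assist device both carries the needed sensor and has the most battery remaining, might be sketched as follows (the device records and sensor names are illustrative assumptions):

```python
# Hypothetical task routing between two hearing assist devices: prefer a
# device that has the required sensor and the most battery remaining.

def pick_device(task, devices):
    """devices: list of dicts with 'name', 'battery' (0..1), 'sensors'."""
    capable = [d for d in devices if task in d["sensors"]]
    if not capable:
        return None  # neither device carries the needed sensor
    return max(capable, key=lambda d: d["battery"])["name"]

left = {"name": "left", "battery": 0.40, "sensors": {"blood_pressure", "pulse"}}
right = {"name": "right", "battery": 0.75, "sensors": {"pulse"}}
```

Repeatedly applying such a rule tends to even out battery drain across the pair, while tasks tied to a sensor present on only one side always run there.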
Hearing assist device 1501 comprises a speech generation module 1523 that enables hearing assist device 1501 to generate and output verbal audio information (spoken words or the like) to a wearer thereof via a speaker of hearing assist device 1501. Such verbal audio information may be used to implement a voice UI, to provide speech-based alerts, messages, and reminders as part of a clock/scheduler feature implemented by clock/scheduler module 1529, or to provide emergency alerts or messages to a wearer of hearing assist device 1501 based on a detected medical condition of the wearer, or the like. The speech generated by speech generation module 1523 may be pre-recorded and/or dynamically synthesized, depending upon the implementation. - When a communication pathway has been established between hearing assist
device 1501 and device 1503, speech generation assist module 1555 of device 1503 may operate to perform all or part of the speech generation function that would otherwise be performed by speech generation module 1523 of hearing assist device 1501. Such operation by device 1503 can advantageously conserve the battery power of hearing assist device 1501. Any speech generated by speech generation assist module 1555 may be communicated back to hearing assist device 1501 for playback via at least one speaker of hearing assist device 1501. Any of a wide variety of well-known speech codecs may be used to carry out such transmission of speech information in an efficient manner. Additionally or alternatively, any speech generated by speech generation assist module 1555 can be played back via one or more speakers of device 1503 if device 1503 is local with respect to the wearer of hearing assist device 1501. - Furthermore, speech generation assist
module 1555 may provide a more elaborate set of features than those provided by speech generation module 1523, as device 1503 may have access to greater power, processing and storage resources than hearing assist device 1501 to support such additional features. For example, speech generation assist module 1555 may provide a more extensive vocabulary of pre-recorded words, terms and sentences or may provide a more powerful speech synthesis engine. -
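The decision of where to synthesize speech, implied by the passage above, can be sketched as a simple policy. The threshold value and return labels are illustrative assumptions, not part of the disclosure.

```python
def choose_speech_generator(pathway_established, battery_pct,
                            low_battery_pct=25):
    """Pick which module synthesizes speech for playback to the wearer."""
    if not pathway_established:
        # No link to device 1503: speech must be generated locally.
        return "speech_generation_module_1523"
    if battery_pct < low_battery_pct:
        # Offload to device 1503 to conserve hearing-device battery.
        return "speech_generation_assist_module_1555"
    # Pathway available and battery healthy: prefer the richer remote
    # feature set (larger vocabulary, more powerful synthesis engine).
    return "speech_generation_assist_module_1555"
```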
Hearing assist device 1501 includes a speech/noise recognition module 1525 that is operable to apply speech and/or noise recognition algorithms to audio input received via one or more microphones of hearing assist device 1501. Such algorithms can enable speech/noise recognition module 1525 to determine when a wearer of hearing assist device 1501 is speaking and further to recognize words that are spoken by such wearer, while rejecting non-speech utterances and noise. Such algorithms may be used, for example, to enable hearing assist device 1501 to provide a voice-based UI by which a wearer of hearing assist device 1501 can exercise voice-based control over the device. - When a communication pathway has been established between hearing assist
device 1501 and device 1503, speech/noise recognition assist module 1557 of device 1503 may operate to perform all or part of the speech/noise recognition functions that would otherwise be performed by speech/noise recognition module 1525 of hearing assist device 1501. Such operation by device 1503 can advantageously conserve the battery power of hearing assist device 1501. - Furthermore, speech/noise recognition assist
module 1557 may provide a more elaborate set of features than those provided by speech/noise recognition module 1525, as device 1503 may have access to greater power, processing and storage resources than hearing assist device 1501 to support such additional features. For example, speech/noise recognition assist module 1557 may include a training program that a wearer of hearing assist device 1501 can use to train the speech recognition logic to better recognize and interpret his/her own voice. As another example, speech/noise recognition assist module 1557 may include a process by which a wearer of hearing assist device 1501 can add new words to the dictionary of words that are recognized by the speech recognition logic. Such additional features may be included in an application that can be installed by the wearer on device 1503. Such additional features may also be supported by a user interface that forms part of supplemental user interface components and associated circuitry 1575. Of course, such features may be included in speech/noise recognition module 1525 in accordance with certain embodiments. -
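The user-extensible dictionary feature described above can be sketched with a toy recognizer. The class, its interface, and the word-matching behavior are illustrative assumptions standing in for real speech recognition logic.

```python
class SpeechRecognizerAssist:
    """Toy stand-in for assist module 1557's extended features: a base
    vocabulary that the wearer can extend with new words."""

    def __init__(self, base_vocabulary):
        self.vocabulary = {w.lower() for w in base_vocabulary}

    def add_word(self, word):
        # Wearer-initiated dictionary extension, as described above.
        self.vocabulary.add(word.lower())

    def recognize(self, utterance):
        # Return the in-vocabulary words of the utterance, in order,
        # implicitly rejecting everything else as unrecognized.
        return [w for w in utterance.lower().split() if w in self.vocabulary]
```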
Hearing assist device 1501 includes an enhanced audio processing module 1527. Enhanced audio processing module 1527 may be configured to process an input audio signal received by hearing assist device 1501 to achieve a desired frequency response prior to playing back such input audio signal to a wearer of hearing assist device 1501. For example, enhanced audio processing module 1527 may selectively amplify certain frequency components of an input audio signal prior to playing back such input audio signal to the wearer. The frequency response to be achieved may be specified by or derived from a prescription for the wearer that is provided to hearing assist device 1501 by an external device or system. In certain embodiments, such prescription may be formatted in a standardized manner in order to facilitate use thereof by any of a variety of hearing assistance devices and audio reproduction systems. - In accordance with a further embodiment in which
hearing assist device 1501 is worn in conjunction with a second hearing assist device, enhanced audio processing module 1527 may modify a first input audio signal received by hearing assist device 1501 prior to playback of the first input audio signal to one ear of the wearer, while an enhanced audio processing module of the second hearing assist device modifies a second input audio signal received by the second hearing assist device prior to playback of the second input audio signal to the other ear of the wearer. Such modification of the first and second input audio signals can be used to achieve enhanced spatial signaling for the wearer. That is to say, the enhanced audio signals provided to both ears of the wearer will enable the wearer to better determine the spatial origin of sounds. Such enhancement is desirable for persons who have a poor ability to detect the spatial origin of sound, and therefore a poor ability to respond to spatial cues. To determine the appropriate modifications for the left and right ears of the wearer, an appropriate user-specific “head transfer function” can be determined through testing of a user. The results of such testing may then be used to calibrate the spatial audio enhancement function applied at each ear. - The next section describes some further example applications/embodiments for hearing assist devices.
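The two enhancements described above, prescription-based frequency shaping and per-ear spatial calibration, might be sketched as follows. The band layout, dB arithmetic, and the sine-based interaural level difference model are illustrative assumptions; the disclosure says only that a user-specific head transfer function is measured through testing.

```python
import math

def apply_prescription(band_levels_db, prescription_gains_db):
    """Selectively amplify frequency bands of an input signal.

    band_levels_db:        measured level per band, in dB
    prescription_gains_db: per-band gain from the wearer's prescription
    """
    if len(band_levels_db) != len(prescription_gains_db):
        raise ValueError("prescription must cover every band")
    return [lvl + g for lvl, g in zip(band_levels_db, prescription_gains_db)]

def per_ear_gains(azimuth_deg, max_ild_db=6.0):
    """Crude interaural level difference model for spatial enhancement.
    Positive azimuth = source to the wearer's right. Returns the gain in
    dB to apply at (left ear, right ear); in practice these would be
    calibrated from the user's measured head transfer function."""
    ild = max_ild_db * math.sin(math.radians(azimuth_deg))
    return (-ild / 2.0, +ild / 2.0)
```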
- The hearing assist devices described above may be used in further applications, as well as in variations on the above-described embodiments. In such applications, various health monitoring technologies can be integrated into hearing aid devices as well as into local and remote supporting devices and systems. Local systems may comprise one or more smart phones, tablets, or computers that are portable or stationary. Such devices may have application software installed (downloaded) therein to define supporting behaviors. Local systems or devices may also comprise other dedicated health care devices such as monitors, rate measuring devices, and so on that may be stationary or may be worn or carried by the user. Sensor data collected by one or both of the hearing aid devices and local supporting devices or systems can be used together to help provide a basis for a more accurate diagnosis of a user's current health.
- For instance, in an embodiment, a hearing assist device may be docked to a stationary docking station, or a mobile device may be held adjacent to the hearing assist device (e.g., against the ear of the user) to cause sensor data and/or other information to be transmitted from the hearing assist device according to NFC techniques, as well as to enable information to be received by the hearing assist device.
- Temperature information measured by a sensor of the hearing assist device, and/or further sensed information, may be used to determine whether the hearing aid is being worn by a user. If the hearing assist device is determined not to be worn (e.g., a temperature below human body temperature is detected), processing logic of the hearing assist device may cause the hearing assist device to enter a low power state (e.g., with a periodically flashing LED or audio to support attempts to find a misplaced hearing aid).
- Similarly, an elevated human temperature (e.g., a fever, over 99 degrees Fahrenheit, etc.) may cause the processing logic to power up communication circuitry within the hearing assist device, which may in turn cause a remote device to power up and participate in a data exchange with the hearing assist device. In this manner, the elevated temperature may be reported to another person, including medical personnel. In an embodiment, a temperature extreme may cause a request to be transmitted to a remote device (e.g., a smart phone) to dial 911, medical staff, or family members.
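The worn-detection and fever-reporting behavior above amounts to a small threshold policy. A sketch, where the not-worn cutoff of 93 °F is an assumption and the 99 °F fever threshold follows the example given:

```python
def temperature_policy(temp_f):
    """Map an ear-temperature reading (degrees Fahrenheit) to an action."""
    if temp_f < 93.0:
        # Well below human body temperature: likely not being worn,
        # so enter a low power state (with a locator LED/audio beacon).
        return "low_power"
    if temp_f > 99.0:
        # Elevated temperature: power up communication circuitry and
        # report to a remote device / medical personnel.
        return "report_fever"
    return "normal"
```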
- Program code, applications, or “apps” may be downloaded to a hearing assist device and stored in memory (e.g., in memory 734 of FIG. 7 as code 738). Such applications can program different sensor response functionality, tailoring the hearing assist device and/or mobile computing device to service a particular user. For example, sensor data (e.g., motion, heart rate, stress levels) may be analyzed by processing logic along with recorded audio sounds (e.g., sounds of pain, moaning, slurred words, a lack of sound) to determine a lack of movement, a stroke, or a heart attack, and to cause a request to be generated to dial 911 and/or a doctor immediately (e.g., through a wireless link to a local access point or phone). - In an embodiment, using NFC communications between the hearing assist device and a smart phone, the smart phone can determine a distance between the hearing assist device and the ear, and processing logic of the hearing assist device may adjust broadcast sound accordingly. If there is no hearing assist device in the user's ear, and the phone reliably identifies the correct user (e.g., by camera, by sound, etc.), the phone may be configured to compensate for hearing loss of the user (e.g., by amplifying particular frequency ranges). The user may manually increase the phone volume, and may select an icon or other user interface mechanism to turn on hearing aid frequency compensation. Magnetic field induction may also be used to communicate the audio signal.
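The phone-side decision described above reduces to a small branch. A sketch, with mode names invented for illustration:

```python
def phone_output_mode(device_in_ear, user_identified):
    """Decide how the phone should shape its output audio, based on the
    NFC-derived presence of a hearing assist device in the user's ear and
    whether the phone has reliably identified the user."""
    if device_in_ear:
        # Deliver audio through the hearing assist device, adjusting
        # broadcast sound for the measured phone-to-ear distance.
        return "route_to_hearing_assist_device"
    if user_identified:
        # No device in the ear, known user: compensate for hearing loss
        # by amplifying particular frequency ranges at the phone.
        return "apply_hearing_loss_compensation"
    return "normal_playback"
```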
- In another embodiment, an alarm clock signal delivered via a hearing assist device may be configured to repeat until the user is determined to be upright by processing logic of the hearing assist device (e.g., based on measured information from position/motion 518). At this point, processing logic may cause a message (e.g., from memory 734) to be broadcast to the user, such as “All systems stable. You are at home and it is 8 am. It is time to take your XYZ pill.” Such messages, alerts, etc., may be triggered by processing logic in response to sensor data changes (e.g., emergencies, etc.), smart phone interaction, and/or pressing of a status button on the hearing assist device. A user may be determined by processing logic to have fallen down (e.g., based on measured information from position/motion 518 that indicates an impact and/or user orientation). At this point, processing logic may cause a message (e.g., from memory 734) to be broadcast to the user, such as “Are you OK? Say yes if so, and no if injured.” The processing logic of the hearing assist device may then step the user through a question/answer (Q/A) interaction that, based on sensor data and circumstances, arrives at a likelihood of needed medical intervention (e.g., “Are you sweaty? Can you read a book at arm's length? Can you read the letters? Shut one eye. Shut the other eye. Do you feel any numbness?”). Information regarding the Q/A interaction may be transmitted to a medical staff member, who may review the interaction and deliver their own voice to the user through the hearing assist device and/or smart phone. A microphone of the hearing assist device may be used to capture the user's verbal responses, which may be delivered back to the medical staff member (or a family member). - Such a communication flow may of course be carried out via a local cell phone device of the user, or via a back door channel through a third party's cell phone (a third party device). Also note that status message playback at the hearing assist device can be triggered by voice-recognized commands received from the user. In addition, the hearing assist device and smart phone may use NFC to transfer a call or other audio to the hearing assist device. A skin pathway (e.g., via skin communication conductor 534) for communications to the
skin communication conductor 534 from a hand-held smart phone may be used. A doctor may remotely evaluate (and control) hearing assist device performance, settings, and battery, review extra collected health data, etc., and deliver audio, with or without placing a call. - In addition to parallel text on a smart phone or other hand-held device UI, voice signaling can be injected into the hearing pathway (via speakers of the hearing assist device, or by speakers of the mobile computing device). For example, a warning message may be received from a smart environment for dangerous items in that area (e.g., from access points, smart phones, computers, sensors, etc.). The warning message may be played to the ear of the user with background sounds suppressed (e.g., by DSP 730) to make sure that the user hears the warning message. An intelligent mixing of sounds may be performed by
DSP 730. For instance, if the user is in a vehicle, the hearing assist device may be configured to ensure that particular desired sounds are heard clearly despite the sound level of the radio. The hearing assist device and/or the vehicle itself may amplify certain desired sounds, or present other sensor readings to the user, such as another vehicle that is getting too close, or an obstacle detected in front of the vehicle. - As described above, voice/speech recognition may be incorporated into a hearing assist device to enable commands from the user to be recognized and transmitted to a remote device under certain circumstances. Such voice commands provided by the user may be explicit (e.g., “contact my doctor”), or may be coded (e.g., saying “apple” to cause the hearing assist device to contact the user's doctor) for various reasons, such as to avoid public embarrassment regarding wearing a hearing aid. Furthermore, the voice of the user may be recognized by the hearing assist device, and converted to a text message that is displayed to the user on a mobile computing device, or transmitted to one or more intended recipients. Furthermore, the mobile computing device may transmit commands to the hearing assist device that are converted to audio that is broadcast into the ear of the user by a speaker of the hearing assist device (e.g., to provide a privacy mode). In an embodiment, augmented reality may be provided, where the mobile computing device can provide extra information (e.g., by voice) to the user based upon location and other aspects. For example, a medical condition of the user may be detected by the hearing assist device, as well as a location of the user, which may be used to launch a web search to find a local medical clinic (e.g., contact information, an address, etc.). Also, when the user is talking to someone, the hearing assist device and/or mobile computing device can train on the voice of a talker to support better filtering over time.
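The explicit and coded voice commands described above amount to a small command table. A sketch, where the command set and action names are invented for illustration:

```python
# Map recognized utterances to device actions. "apple" is the coded
# alias from the example above, letting the wearer trigger an action
# discreetly without revealing that a hearing aid is in use.
COMMANDS = {
    "contact my doctor": "contact_doctor",  # explicit command
    "apple": "contact_doctor",              # coded equivalent
}

def handle_voice_command(utterance):
    """Resolve a recognized utterance to an action, or 'unrecognized'."""
    return COMMANDS.get(utterance.strip().lower(), "unrecognized")
```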
- As described above, the hearing assist device may implement voice recognition that detects slurred or unusual speech patterns of the user, which may indicate a potential medical condition of the user. For instance, slurred speech and time of detection information may prove critical when attempting to identify a window of opportunity in which blood thinners may be useful in minimizing brain damage due to a stroke.
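Timestamping the first detection of slurred speech makes the treatment-window check above a simple comparison. A sketch: the 4.5-hour default is a commonly cited thrombolysis window used only as an illustrative value, not a figure from the disclosure.

```python
import datetime

def stroke_window_open(slur_detected_at, now, window_hours=4.5):
    """Whether 'now' is still within the treatment window that opened
    when slurred speech was first detected, so responders can judge if
    blood thinners may still be useful in minimizing brain damage."""
    elapsed = now - slur_detected_at
    return elapsed <= datetime.timedelta(hours=window_hours)
```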
- In an embodiment, a hearing assist device may perform an emergency call through a smart phone. For instance, if a person finds a user who is unconscious, the person may place their smart phone near the ear of the user, and the user's hearing assist device may make an emergency communication through the smart phone. The hearing assist device may gather sensor data to be used to evaluate the user's health, and may relay this sensor data through the smart phone to an emergency responder. The hearing assist device may even provide instructions for the person to perform on the unconscious user (e.g., “feel the user's forehead,” etc.).
- As described above, injected voice may be provided by a hearing assist device to a user. For instance, the user may be listening to music that is transmitted to the hearing assist device from a remote device (e.g., through Bluetooth™, the user's skin, etc.). Voice provided by the hearing assist device may interrupt the music to provide verbal information to the user, such as “your blood pressure is dropping,” “you have a fever,” etc.
- Furthermore, as described above, program code or “apps” may be downloaded to a hearing assist device as well as to the remote device(s). Upgrades to downloaded apps may also be downloaded. Such downloads may be performed opportunistically to preserve battery life. For instance, such downloads may be queued to be performed when the hearing assist device is being charged (e.g., by a proximate device providing an RF field, when it is placed in a charger, etc.).
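The opportunistic download behavior above can be sketched as a deferred queue. The function and its two-list return shape are illustrative assumptions:

```python
def flush_download_queue(queue, charging):
    """Defer queued app/upgrade downloads until the device is charging
    (e.g., placed in a charger or within a proximate RF field), so that
    downloads do not drain the hearing assist device's battery.

    Returns (downloads_performed, remaining_queue)."""
    if not charging:
        return [], list(queue)  # keep everything queued for later
    return list(queue), []      # charging: perform all queued downloads
```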
- While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the embodiments. Thus, the breadth and scope of the described embodiments should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/623,545 US20130343585A1 (en) | 2012-06-20 | 2012-09-20 | Multisensor hearing assist device for health |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261662217P | 2012-06-20 | 2012-06-20 | |
US13/623,545 US20130343585A1 (en) | 2012-06-20 | 2012-09-20 | Multisensor hearing assist device for health |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130343585A1 true US20130343585A1 (en) | 2013-12-26 |
Family
ID=49774491
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/594,489 Expired - Fee Related US9185501B2 (en) | 2012-06-20 | 2012-08-24 | Container-located information transfer module |
US13/623,545 Abandoned US20130343585A1 (en) | 2012-06-20 | 2012-09-20 | Multisensor hearing assist device for health |
US13/623,435 Abandoned US20130343584A1 (en) | 2012-06-20 | 2012-09-20 | Hearing assist device with external operational support |
US14/879,765 Active US9730005B2 (en) | 2012-06-20 | 2015-10-09 | Container-located information transfer module |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/594,489 Expired - Fee Related US9185501B2 (en) | 2012-06-20 | 2012-08-24 | Container-located information transfer module |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/623,435 Abandoned US20130343584A1 (en) | 2012-06-20 | 2012-09-20 | Hearing assist device with external operational support |
US14/879,765 Active US9730005B2 (en) | 2012-06-20 | 2015-10-09 | Container-located information transfer module |
Country Status (1)
Country | Link |
---|---|
US (4) | US9185501B2 (en) |
US11350226B2 (en) * | 2015-12-30 | 2022-05-31 | Earlens Corporation | Charging protocol for rechargeable hearing systems |
US11344256B2 (en) | 2017-02-21 | 2022-05-31 | Bose Corporation | Collecting biologically-relevant information using an earpiece |
US11355966B2 (en) | 2019-12-13 | 2022-06-07 | Energous Corporation | Charging pad with guiding contours to align an electronic device on the charging pad and efficiently transfer near-field radio-frequency energy to the electronic device |
US11381118B2 (en) | 2019-09-20 | 2022-07-05 | Energous Corporation | Systems and methods for machine learning based foreign object detection for wireless power transmission |
USD956719S1 (en) * | 2020-09-30 | 2022-07-05 | Shenzhen Zio Communication Technology Co., Ltd. | Earphone |
US11382571B2 (en) | 2008-10-29 | 2022-07-12 | Flashback Technologies, Inc. | Noninvasive predictive and/or estimative blood pressure monitoring |
US11389069B2 (en) | 2008-10-29 | 2022-07-19 | Flashback Technologies, Inc. | Hemodynamic reserve monitor and hemodialysis control |
US11394755B1 (en) * | 2021-06-07 | 2022-07-19 | International Business Machines Corporation | Guided hardware input prompts |
US11395634B2 (en) | 2008-10-29 | 2022-07-26 | Flashback Technologies, Inc. | Estimating physiological states based on changes in CRI |
US11395594B2 (en) | 2008-10-29 | 2022-07-26 | Flashback Technologies, Inc. | Noninvasive monitoring for fluid resuscitation |
EP3865058A4 (en) * | 2018-11-26 | 2022-07-27 | Osong Medical Innovation Foundation | Core body temperature measurement device having battery charging structure |
US11411441B2 (en) | 2019-09-20 | 2022-08-09 | Energous Corporation | Systems and methods of protecting wireless power receivers using multiple rectifiers and establishing in-band communications using multiple rectifiers |
US11406269B2 (en) | 2008-10-29 | 2022-08-09 | Flashback Technologies, Inc. | Rapid detection of bleeding following injury |
US20220272461A1 (en) * | 2019-12-06 | 2022-08-25 | Gn Hearing A/S | Method for charging a battery of a hearing device |
US11445014B2 (en) * | 2019-11-11 | 2022-09-13 | Sivantos Pte. Ltd. | Method for operating a hearing device, and hearing device |
US11462949B2 (en) | 2017-05-16 | 2022-10-04 | Wireless electrical Grid LAN, WiGL Inc | Wireless charging method and system |
US11478190B2 (en) | 2008-10-29 | 2022-10-25 | Flashback Technologies, Inc. | Noninvasive hydration monitoring |
US20220345836A1 (en) * | 2018-02-28 | 2022-10-27 | Starkey Laboratories, Inc. | Health monitoring with ear-wearable devices and accessory devices |
US11502551B2 (en) | 2012-07-06 | 2022-11-15 | Energous Corporation | Wirelessly charging multiple wireless-power receivers using different subsets of an antenna array to focus energy at different locations |
US11516603B2 (en) | 2018-03-07 | 2022-11-29 | Earlens Corporation | Contact hearing device and retention structure materials |
US11539243B2 (en) | 2019-01-28 | 2022-12-27 | Energous Corporation | Systems and methods for miniaturized antenna for wireless power transmissions |
US11558538B2 (en) | 2016-03-18 | 2023-01-17 | Opkix, Inc. | Portable camera system |
WO2023057461A1 (en) * | 2021-10-06 | 2023-04-13 | Sivantos Pte. Ltd. | Method for operating a hearing aid system |
US11638563B2 (en) | 2018-12-27 | 2023-05-02 | Starkey Laboratories, Inc. | Predictive fall event management system and method of using same |
EP4227959A1 (en) * | 2022-02-10 | 2023-08-16 | GN Hearing A/S | Hearing system with cardiac arrest detection |
US11758338B2 (en) | 2020-06-05 | 2023-09-12 | Starkey Laboratories, Inc. | Authentication and encryption key exchange for assistive listening devices |
US11778392B2 (en) * | 2019-11-14 | 2023-10-03 | Starkey Laboratories, Inc. | Ear-worn electronic device configured to compensate for hunched or stooped posture |
US11786694B2 (en) | 2019-05-24 | 2023-10-17 | NeuroLight, Inc. | Device, method, and app for facilitating sleep |
US11799324B2 (en) | 2020-04-13 | 2023-10-24 | Energous Corporation | Wireless-power transmitting device for creating a uniform near-field charging area |
US11831361B2 (en) | 2019-09-20 | 2023-11-28 | Energous Corporation | Systems and methods for machine learning based foreign object detection for wireless power transmission |
US20230410058A1 (en) * | 2022-06-21 | 2023-12-21 | Avaya Management L.P. | Virtual meeting participation |
US11857293B2 (en) | 2008-10-29 | 2024-01-02 | Flashback Technologies, Inc. | Rapid detection of bleeding before, during, and after fluid resuscitation |
US11916398B2 (en) | 2021-12-29 | 2024-02-27 | Energous Corporation | Small form-factor devices with integrated and modular harvesting receivers, and shelving-mounted wireless-power transmitters for use therewith |
US11918386B2 (en) | 2018-12-26 | 2024-03-05 | Flashback Technologies, Inc. | Device-based maneuver and activity state-based physiologic status monitoring |
Families Citing this family (130)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7668325B2 (en) | 2005-05-03 | 2010-02-23 | Earlens Corporation | Hearing system having an open chamber for housing components and reducing the occlusion effect |
US9318108B2 (en) | 2010-01-18 | 2016-04-19 | Apple Inc. | Intelligent automated assistant |
US8977255B2 (en) | 2007-04-03 | 2015-03-10 | Apple Inc. | Method and system for operating a multi-function portable electronic device using voice-activation |
US8676904B2 (en) | 2008-10-02 | 2014-03-18 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US20120311585A1 (en) | 2011-06-03 | 2012-12-06 | Apple Inc. | Organizing task items that represent tasks to perform |
US10417037B2 (en) | 2012-05-15 | 2019-09-17 | Apple Inc. | Systems and methods for integrating third party services with a digital assistant |
US9185501B2 (en) | 2012-06-20 | 2015-11-10 | Broadcom Corporation | Container-located information transfer module |
PL2870072T3 (en) | 2012-07-05 | 2018-03-30 | P.C.O.A. Devices Ltd. | Medication dispenser |
DK3284700T3 (en) | 2012-07-30 | 2019-09-09 | DosentRX Ltd | A CONTAINER FOR CONTAINING AND DISPENSING SOLID MEDICINE PILLS
US9191755B2 (en) | 2012-12-14 | 2015-11-17 | Starkey Laboratories, Inc. | Spatial enhancement mode for hearing aids |
KR102516577B1 (en) | 2013-02-07 | 2023-04-03 | Apple Inc. | Voice trigger for a digital assistant
US10652394B2 (en) | 2013-03-14 | 2020-05-12 | Apple Inc. | System and method for processing voicemail |
US20140270287A1 (en) * | 2013-03-15 | 2014-09-18 | Qualcomm Incorporated | Bluetooth hearing aids enabled during voice activity on a mobile phone |
US10748529B1 (en) | 2013-03-15 | 2020-08-18 | Apple Inc. | Voice activated device for use with a voice-based digital assistant |
US9215075B1 (en) | 2013-03-15 | 2015-12-15 | Poltorak Technologies Llc | System and method for secure relayed communications from an implantable medical device |
US20140273824A1 (en) * | 2013-03-15 | 2014-09-18 | Medtronic, Inc. | Systems, apparatus and methods facilitating secure pairing of an implantable device with a remote device using near field communication |
US10020008B2 (en) * | 2013-05-23 | 2018-07-10 | Knowles Electronics, Llc | Microphone and corresponding digital interface |
CN105379308B (en) | 2013-05-23 | 2019-06-25 | Knowles Electronics, LLC | Microphone, microphone system and method of operating a microphone
US9711166B2 (en) | 2013-05-23 | 2017-07-18 | Knowles Electronics, Llc | Decimation synchronization in a microphone |
US10176167B2 (en) | 2013-06-09 | 2019-01-08 | Apple Inc. | System and method for inferring user intent from speech inputs |
US9100775B2 (en) * | 2013-09-18 | 2015-08-04 | Plantronics, Inc. | Audio delivery system for headsets |
US9502028B2 (en) | 2013-10-18 | 2016-11-22 | Knowles Electronics, Llc | Acoustic activity detection apparatus and method |
US9147397B2 (en) | 2013-10-29 | 2015-09-29 | Knowles Electronics, Llc | VAD detection apparatus and method of operating the same |
US20150172832A1 (en) * | 2013-12-17 | 2015-06-18 | United Sciences, Llc | Identity confirmation using wearable computerized earpieces and related methods
US20150172827A1 (en) * | 2013-12-17 | 2015-06-18 | United Sciences, Llc | Identity confirmation using wearable computerized earpieces and related methods |
US20150172828A1 (en) * | 2013-12-17 | 2015-06-18 | United Sciences, Llc | Identity confirmation using wearable computerized earpieces and related methods
US9877116B2 (en) | 2013-12-30 | 2018-01-23 | Gn Hearing A/S | Hearing device with position data, audio system and related methods |
JP6674737B2 (en) * | 2013-12-30 | 2020-04-01 | GN Hearing A/S | Listening device having position data and method of operating the listening device
EP2890156B1 (en) * | 2013-12-30 | 2020-03-04 | GN Hearing A/S | Hearing device with position data and method of operating a hearing device |
EP2908549A1 (en) | 2014-02-13 | 2015-08-19 | Oticon A/s | A hearing aid device comprising a sensor member |
EP3149728B1 (en) | 2014-05-30 | 2019-01-16 | Apple Inc. | Multi-command single utterance input method |
US10170123B2 (en) | 2014-05-30 | 2019-01-01 | Apple Inc. | Intelligent assistant for home automation |
US9715875B2 (en) | 2014-05-30 | 2017-07-25 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
IL233295B (en) | 2014-06-22 | 2019-11-28 | Ilan Paz | A controlled pill-dispensing system |
US9338493B2 (en) | 2014-06-30 | 2016-05-10 | Apple Inc. | Intelligent automated assistant for TV user interactions |
EP2982296A1 (en) * | 2014-08-07 | 2016-02-10 | Oticon A/s | A hearing assistance system with improved signal processing comprising an implanted part |
CN107211203B (en) * | 2014-10-30 | 2020-01-21 | Smartear Co., Ltd. | Intelligent flexible interactive earplug
US9779752B2 (en) | 2014-10-31 | 2017-10-03 | At&T Intellectual Property I, L.P. | Acoustic enhancement by leveraging metadata to mitigate the impact of noisy environments |
WO2016118480A1 (en) | 2015-01-21 | 2016-07-28 | Knowles Electronics, Llc | Low power voice trigger for acoustic apparatus and method |
US10121472B2 (en) | 2015-02-13 | 2018-11-06 | Knowles Electronics, Llc | Audio buffer catch-up apparatus and method with two microphones |
US9886953B2 (en) | 2015-03-08 | 2018-02-06 | Apple Inc. | Virtual assistant activation |
US9438300B1 (en) * | 2015-03-10 | 2016-09-06 | Invensense, Inc. | Sensor fusion for antenna tuning |
IL238387B (en) | 2015-04-20 | 2019-01-31 | Paz Ilan | Medication dispenser depilling mechanism |
EP3089364B1 (en) | 2015-05-01 | 2019-01-16 | Nxp B.V. | A gain function controller |
US10200824B2 (en) | 2015-05-27 | 2019-02-05 | Apple Inc. | Systems and methods for proactively identifying and surfacing relevant content on a touch-sensitive device |
US9661426B2 (en) * | 2015-06-22 | 2017-05-23 | Gn Hearing A/S | Hearing aid having combined antennas |
EP3110170B1 (en) * | 2015-06-22 | 2019-02-20 | GN Hearing A/S | A hearing aid having combined antennas |
US20160378747A1 (en) | 2015-06-29 | 2016-12-29 | Apple Inc. | Virtual assistant for media playback |
US9478234B1 (en) | 2015-07-13 | 2016-10-25 | Knowles Electronics, Llc | Microphone apparatus and method with catch-up buffer |
EP3139627B1 (en) * | 2015-09-02 | 2019-02-13 | Sonion Nederland B.V. | Ear phone with multi-way speakers |
US10331312B2 (en) | 2015-09-08 | 2019-06-25 | Apple Inc. | Intelligent automated assistant in a media environment |
US10671428B2 (en) | 2015-09-08 | 2020-06-02 | Apple Inc. | Distributed personal assistant |
US10740384B2 (en) | 2015-09-08 | 2020-08-11 | Apple Inc. | Intelligent automated assistant for media search and playback |
US10747498B2 (en) | 2015-09-08 | 2020-08-18 | Apple Inc. | Zero latency digital assistant |
CA3002134C (en) | 2015-10-15 | 2021-11-02 | Ilan Paz | Image recognition-based dosage form dispensers |
US10453450B2 (en) * | 2015-10-20 | 2019-10-22 | Bragi GmbH | Wearable earpiece voice command control system and method |
WO2017077529A1 (en) | 2015-11-02 | 2017-05-11 | P.C.O.A. | Lockable advanceable oral dosage form dispenser containers |
US10691473B2 (en) | 2015-11-06 | 2020-06-23 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US10956666B2 (en) | 2015-11-09 | 2021-03-23 | Apple Inc. | Unconventional virtual assistant interactions |
EP3171614B1 (en) | 2015-11-23 | 2020-11-04 | Goodix Technology (HK) Company Limited | A controller for an audio system |
US10063979B2 (en) | 2015-12-08 | 2018-08-28 | Gn Hearing A/S | Hearing aid with power management |
EP3179741B1 (en) * | 2015-12-08 | 2019-09-25 | GN Hearing A/S | Hearing aid with power management |
US9980033B2 (en) * | 2015-12-21 | 2018-05-22 | Bragi GmbH | Microphone natural speech capture voice dictation system and method |
US10223066B2 (en) | 2015-12-23 | 2019-03-05 | Apple Inc. | Proactive assistance based on dialog communication between devices |
US10244332B2 (en) * | 2016-01-25 | 2019-03-26 | Cochlear Limited | Device monitoring for program switching |
DK3430817T3 (en) | 2016-03-14 | 2020-08-31 | Sonova AG | WIRELESS BODY-WORN PERSONAL DEVICE WITH LOSS TRACKING FUNCTIONALITY
US10337783B2 (en) * | 2016-04-12 | 2019-07-02 | Abigail Weaver | Carry bag with insulated medicine compartment and related methods |
US9937346B2 (en) | 2016-04-26 | 2018-04-10 | Cochlear Limited | Downshifting of output in a sense prosthesis |
US10586535B2 (en) | 2016-06-10 | 2020-03-10 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
DK179415B1 (en) | 2016-06-11 | 2018-06-14 | Apple Inc | Intelligent device arbitration and control |
DK201670540A1 (en) | 2016-06-11 | 2018-01-08 | Apple Inc | Application integration with a digital assistant |
US10339960B2 (en) | 2016-10-13 | 2019-07-02 | International Business Machines Corporation | Personal device for hearing degradation monitoring |
US10678502B2 (en) * | 2016-10-20 | 2020-06-09 | Qualcomm Incorporated | Systems and methods for in-ear control of remote devices |
CN110235453B (en) | 2016-12-09 | 2021-10-15 | The Research Foundation for The State University of New York | Fiber microphone
DK3358812T3 (en) * | 2017-02-03 | 2019-08-12 | Widex A/S | COMMUNICATION CHANNELS BETWEEN A PERSONAL COMMUNICATION DEVICE AND AT LEAST ONE HEAD-WORN DEVICE
DK180048B1 (en) | 2017-05-11 | 2020-02-04 | Apple Inc. | MAINTAINING THE DATA PROTECTION OF PERSONAL INFORMATION |
US10726832B2 (en) | 2017-05-11 | 2020-07-28 | Apple Inc. | Maintaining privacy of personal information |
DK179496B1 (en) | 2017-05-12 | 2019-01-15 | Apple Inc. | USER-SPECIFIC ACOUSTIC MODELS
DK201770429A1 (en) | 2017-05-12 | 2018-12-14 | Apple Inc. | Low-latency intelligent automated assistant |
DK179745B1 (en) | 2017-05-12 | 2019-05-01 | Apple Inc. | SYNCHRONIZATION AND TASK DELEGATION OF A DIGITAL ASSISTANT |
US20180336275A1 (en) | 2017-05-16 | 2018-11-22 | Apple Inc. | Intelligent automated assistant for media exploration |
US20180336892A1 (en) | 2017-05-16 | 2018-11-22 | Apple Inc. | Detecting a trigger of a digital assistant |
JP7003993B2 (en) * | 2017-07-21 | 2022-01-21 | Sony Group Corporation | Acoustic output device
US10617842B2 (en) | 2017-07-31 | 2020-04-14 | Starkey Laboratories, Inc. | Ear-worn electronic device for conducting and monitoring mental exercises |
US10674285B2 (en) | 2017-08-25 | 2020-06-02 | Starkey Laboratories, Inc. | Cognitive benefit measure related to hearing-assistance device use |
US20190132683A1 (en) * | 2017-10-31 | 2019-05-02 | Starkey Laboratories, Inc. | Hearing device including a sensor and a method of forming same |
US10356537B2 (en) * | 2017-12-01 | 2019-07-16 | Semiconductor Components Industries, Llc | All-in-one method for wireless connectivity and contactless battery charging of small wearables |
DK3506656T3 (en) * | 2017-12-29 | 2023-05-01 | GN Hearing A/S | HEARING INSTRUMENT COMPRISING A PARASITIC BATTERY ANTENNA ELEMENT
US11570559B2 (en) | 2017-12-29 | 2023-01-31 | Gn Hearing A/S | Hearing instrument comprising a parasitic battery antenna element |
DE102018204260B4 (en) | 2018-03-20 | 2019-11-21 | ZF Friedrichshafen AG | Evaluation device, apparatus, method and computer program product for enabling a hearing-impaired person to perceive a sound event in the environment
US10818288B2 (en) | 2018-03-26 | 2020-10-27 | Apple Inc. | Natural assistant interaction |
EP3554096B9 (en) * | 2018-04-11 | 2023-07-05 | GN Hearing A/S | A hearing aid housing with an integrated antenna |
US11145294B2 (en) | 2018-05-07 | 2021-10-12 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
US10928918B2 (en) | 2018-05-07 | 2021-02-23 | Apple Inc. | Raise to speak |
DK180639B1 (en) | 2018-06-01 | 2021-11-04 | Apple Inc | DISABLING OF ATTENTION-AWARE VIRTUAL ASSISTANT
US10892996B2 (en) | 2018-06-01 | 2021-01-12 | Apple Inc. | Variable latency device coordination |
DK179822B1 (en) | 2018-06-01 | 2019-07-12 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device |
DE102018209822A1 (en) * | 2018-06-18 | 2019-12-19 | Sivantos Pte. Ltd. | Method for controlling the data transmission between at least one hearing aid and a peripheral device of a hearing aid system and hearing aid |
NL2021491B1 (en) | 2018-08-23 | 2020-02-27 | Audus B V | Method, system, and hearing device for enhancing an environmental audio signal of such a hearing device |
US11462215B2 (en) | 2018-09-28 | 2022-10-04 | Apple Inc. | Multi-modal inputs for voice commands |
US11527265B2 (en) * | 2018-11-02 | 2022-12-13 | BriefCam Ltd. | Method and system for automatic object-aware video or audio redaction |
EP3668108A1 (en) * | 2018-12-14 | 2020-06-17 | Widex A/S | Hearing assistive system with sensors for acquiring a physiological signal |
US11264029B2 (en) | 2019-01-05 | 2022-03-01 | Starkey Laboratories, Inc. | Local artificial intelligence assistant system with ear-wearable device |
US11264035B2 (en) * | 2019-01-05 | 2022-03-01 | Starkey Laboratories, Inc. | Audio signal processing for automatic transcription using ear-wearable device |
US11607170B2 (en) | 2019-02-01 | 2023-03-21 | Starkey Laboratories, Inc. | Detection of physical abuse or neglect using data from ear-wearable devices |
US20200273566A1 (en) * | 2019-02-22 | 2020-08-27 | Starkey Laboratories, Inc. | Sharing of health-related data based on data exported by ear-wearable device |
US11348573B2 (en) | 2019-03-18 | 2022-05-31 | Apple Inc. | Multimodality in digital assistant systems |
US11011182B2 (en) * | 2019-03-25 | 2021-05-18 | Nxp B.V. | Audio processing system for speech enhancement |
EP3684079B1 (en) * | 2019-03-29 | 2024-03-20 | Sonova AG | Hearing device for orientation estimation and method of its operation |
US11213688B2 (en) | 2019-03-30 | 2022-01-04 | Advanced Bionics Ag | Utilization of a non-wearable coil to remotely power a cochlear implant from a distance |
US11307752B2 (en) | 2019-05-06 | 2022-04-19 | Apple Inc. | User configurable task triggers |
DK201970509A1 (en) | 2019-05-06 | 2021-01-15 | Apple Inc | Spoken notifications |
US20220304580A1 (en) * | 2019-05-13 | 2022-09-29 | Starkey Laboratories, Inc. | Ear-worn devices for communication with medical devices |
US11140099B2 (en) | 2019-05-21 | 2021-10-05 | Apple Inc. | Providing message response suggestions |
DK201970510A1 (en) | 2019-05-31 | 2021-02-11 | Apple Inc | Voice identification in digital assistant systems |
DK180129B1 (en) | 2019-05-31 | 2020-06-02 | Apple Inc. | User activity shortcut suggestions |
US11468890B2 (en) | 2019-06-01 | 2022-10-11 | Apple Inc. | Methods and user interfaces for voice-based control of electronic devices |
US11076243B2 (en) * | 2019-06-20 | 2021-07-27 | Samsung Electro-Mechanics Co., Ltd. | Terminal with hearing aid setting, and setting method for hearing aid |
US10832535B1 (en) * | 2019-09-26 | 2020-11-10 | Bose Corporation | Sleepbuds for parents |
WO2021069715A1 (en) * | 2019-10-09 | 2021-04-15 | Jacoti Bv | System of processing devices to perform an algorithm |
EP3806493B1 (en) * | 2019-10-11 | 2023-07-19 | GN Hearing A/S | A hearing device having a magnetic induction coil |
EP4046395B1 (en) * | 2019-10-14 | 2023-11-15 | Starkey Laboratories, Inc. | Hearing assistance system with automatic hearing loop memory |
US20230016667A1 (en) * | 2019-12-17 | 2023-01-19 | Starkey Laboratories, Inc. | Hearing assistance systems and methods for monitoring emotional state |
US10993045B1 (en) * | 2020-03-30 | 2021-04-27 | Sonova Ag | Hearing devices and methods for implementing automatic sensor-based on/off control of a hearing device |
US11183193B1 (en) | 2020-05-11 | 2021-11-23 | Apple Inc. | Digital assistant hardware abstraction |
US11061543B1 (en) | 2020-05-11 | 2021-07-13 | Apple Inc. | Providing relevant data items based on context |
US11490204B2 (en) | 2020-07-20 | 2022-11-01 | Apple Inc. | Multi-device audio adjustment coordination |
US11438683B2 (en) | 2020-07-21 | 2022-09-06 | Apple Inc. | User identification using headphones |
DK180923B1 (en) | 2020-07-27 | 2022-06-27 | GN Hearing A/S | HEAD-WEARABLE HEARING INSTRUMENT WITH ENHANCED COEXISTENCE BETWEEN MULTIPLE COMMUNICATION INTERFACES
CN112218221B (en) * | 2020-10-21 | 2022-06-03 | Goertek Intelligent Technology Co., Ltd. | Hearing aid adapter and control method
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5721783A (en) * | 1995-06-07 | 1998-02-24 | Anderson; James C. | Hearing aid with wireless remote processor |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6786860B2 (en) | 2001-10-03 | 2004-09-07 | Advanced Bionics Corporation | Hearing aid design |
US6839446B2 (en) | 2002-05-28 | 2005-01-04 | Trevor I. Blumenau | Hearing aid with sound replay capability |
DE10228157B3 (en) | 2002-06-24 | 2004-01-08 | Siemens Audiologische Technik Gmbh | Hearing aid system with a hearing aid and an external processor unit |
US7012520B2 (en) * | 2003-06-17 | 2006-03-14 | Infraegis, Inc. | Global intelligent remote detection system |
US20060214789A1 (en) * | 2005-03-24 | 2006-09-28 | Joshua Posamentier | Tamper detection with RFID tag |
US8094848B1 (en) | 2006-04-24 | 2012-01-10 | At&T Mobility Ii Llc | Automatically configuring hearing assistive device |
JP2008097585A (en) * | 2006-09-11 | 2008-04-24 | Seiko Epson Corp | Contactless data communication system and contactless IC tag
KR100826877B1 (en) * | 2006-09-28 | 2008-05-06 | Electronics and Telecommunications Research Institute | RFID tag with LED and RF identification managing method using the same
DE102006057644A1 (en) * | 2006-12-05 | 2008-06-12 | Deutsche Post Ag | Container for shipping objects and method for producing the containers |
US8157730B2 (en) | 2006-12-19 | 2012-04-17 | Valencell, Inc. | Physiological and environmental monitoring systems and methods |
WO2008103925A1 (en) | 2007-02-22 | 2008-08-28 | Personics Holdings Inc. | Method and device for sound detection and audio control |
DE102007008738A1 (en) | 2007-02-22 | 2008-08-28 | Siemens Audiologische Technik Gmbh | Method for improving spatial perception and corresponding hearing device |
KR20080084548A (en) * | 2007-03-14 | 2008-09-19 | Electronics and Telecommunications Research Institute | Apparatus and method for transmitting sensor status of RFID tag
JP5300205B2 (en) * | 2007-03-22 | 2013-09-25 | Canon Inc. | Target substance detection element, target substance detection method, and method for manufacturing target substance detection element
US20090076804A1 (en) | 2007-09-13 | 2009-03-19 | Bionica Corporation | Assistive listening system with memory buffer for instant replay and speech to text conversion |
WO2009095937A1 (en) * | 2008-01-28 | 2009-08-06 | Paolo Stefanelli | Container for fluid products, in particular perfumes, deodorants, creams and similar |
EP2247986B1 (en) * | 2008-01-30 | 2014-12-31 | Neology, Inc. | Rfid authentication architecture and methods for rfid authentication |
US7929722B2 (en) | 2008-08-13 | 2011-04-19 | Intelligent Systems Incorporated | Hearing assistance using an external coprocessor |
US20100045425A1 (en) * | 2008-08-21 | 2010-02-25 | Chivallier M Laurent | Data transmission of sensors
US8477029B2 (en) * | 2008-10-23 | 2013-07-02 | Whirlpool Corporation | Modular attribute sensing device |
US20100101317A1 (en) * | 2008-10-23 | 2010-04-29 | Whirlpool Corporation | Lid based amount sensor |
US9202456B2 (en) | 2009-04-23 | 2015-12-01 | Qualcomm Incorporated | Systems, methods, apparatus, and computer-readable media for automatic control of active noise cancellation |
EP2531100B1 (en) | 2010-02-01 | 2017-09-20 | T&W Engineering A/S | Portable eeg monitor system with wireless communication |
US8827171B2 (en) * | 2011-04-20 | 2014-09-09 | Honda Motor Co., Ltd. | Vehicular automatic temperature regulation system |
US9323893B2 (en) | 2011-06-23 | 2016-04-26 | Orca Health, Inc. | Using mobile consumer devices to communicate with consumer medical devices |
US20130043735A1 (en) * | 2011-08-16 | 2013-02-21 | Qualcomm Incorporated | Systems, methods, and devices for multi-level signaling via a wireless power transfer field |
US9185501B2 (en) | 2012-06-20 | 2015-11-10 | Broadcom Corporation | Container-located information transfer module |
- 2012
  - 2012-08-24 US US13/594,489 patent/US9185501B2/en not_active Expired - Fee Related
  - 2012-09-20 US US13/623,545 patent/US20130343585A1/en not_active Abandoned
  - 2012-09-20 US US13/623,435 patent/US20130343584A1/en not_active Abandoned
- 2015
  - 2015-10-09 US US14/879,765 patent/US9730005B2/en active Active
Cited By (295)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD288512S (en) * | 1985-04-24 | 1987-03-03 | Thermo-Serv, Inc. | Wine glass |
US10516950B2 (en) | 2007-10-12 | 2019-12-24 | Earlens Corporation | Multifunction system and method for integrated hearing and communication with noise cancellation and feedback management |
US10863286B2 (en) | 2007-10-12 | 2020-12-08 | Earlens Corporation | Multifunction system and method for integrated hearing and communication with noise cancellation and feedback management |
US10154352B2 (en) | 2007-10-12 | 2018-12-11 | Earlens Corporation | Multifunction system and method for integrated hearing and communication with noise cancellation and feedback management |
US11483665B2 (en) | 2007-10-12 | 2022-10-25 | Earlens Corporation | Multifunction system and method for integrated hearing and communication with noise cancellation and feedback management |
US11310605B2 (en) | 2008-06-17 | 2022-04-19 | Earlens Corporation | Optical electro-mechanical hearing devices with separate power and signal components |
US10516949B2 (en) | 2008-06-17 | 2019-12-24 | Earlens Corporation | Optical electro-mechanical hearing devices with separate power and signal components |
US10652661B2 (en) | 2008-06-27 | 2020-05-12 | Snik, LLC | Headset cord holder |
US10660378B2 (en) | 2008-06-27 | 2020-05-26 | Snik, LLC | Headset cord holder |
US11057714B2 (en) | 2008-09-22 | 2021-07-06 | Earlens Corporation | Devices and methods for hearing |
US10743110B2 (en) | 2008-09-22 | 2020-08-11 | Earlens Corporation | Devices and methods for hearing |
US10511913B2 (en) | 2008-09-22 | 2019-12-17 | Earlens Corporation | Devices and methods for hearing |
US10516946B2 (en) | 2008-09-22 | 2019-12-24 | Earlens Corporation | Devices and methods for hearing |
US10237663B2 (en) | 2008-09-22 | 2019-03-19 | Earlens Corporation | Devices and methods for hearing |
US11395594B2 (en) | 2008-10-29 | 2022-07-26 | Flashback Technologies, Inc. | Noninvasive monitoring for fluid resuscitation |
US11406269B2 (en) | 2008-10-29 | 2022-08-09 | Flashback Technologies, Inc. | Rapid detection of bleeding following injury |
US11857293B2 (en) | 2008-10-29 | 2024-01-02 | Flashback Technologies, Inc. | Rapid detection of bleeding before, during, and after fluid resuscitation |
US11395634B2 (en) | 2008-10-29 | 2022-07-26 | Flashback Technologies, Inc. | Estimating physiological states based on changes in CRI |
US11382571B2 (en) | 2008-10-29 | 2022-07-12 | Flashback Technologies, Inc. | Noninvasive predictive and/or estimative blood pressure monitoring |
US11478190B2 (en) | 2008-10-29 | 2022-10-25 | Flashback Technologies, Inc. | Noninvasive hydration monitoring |
US11389069B2 (en) | 2008-10-29 | 2022-07-19 | Flashback Technologies, Inc. | Hemodynamic reserve monitor and hemodialysis control |
US9473859B2 (en) * | 2008-12-31 | 2016-10-18 | Starkey Laboratories, Inc. | Systems and methods of telecommunication for bilateral hearing instruments |
US20150334493A1 (en) * | 2008-12-31 | 2015-11-19 | Thomas Howard Burns | Systems and methods of telecommunication for bilateral hearing instruments |
US10609492B2 (en) | 2010-12-20 | 2020-03-31 | Earlens Corporation | Anatomically customized ear canal hearing apparatus |
US11153697B2 (en) | 2010-12-20 | 2021-10-19 | Earlens Corporation | Anatomically customized ear canal hearing apparatus |
US11743663B2 (en) | 2010-12-20 | 2023-08-29 | Earlens Corporation | Anatomically customized ear canal hearing apparatus |
US10284964B2 (en) | 2010-12-20 | 2019-05-07 | Earlens Corporation | Anatomically customized ear canal hearing apparatus |
US9420381B2 (en) * | 2012-01-13 | 2016-08-16 | Samsung Electronics Co., Ltd. | Multimedia playing apparatus and method for outputting modulated sound according to hearing characteristic of user |
US20130182855A1 (en) * | 2012-01-13 | 2013-07-18 | Samsung Electronics Co., Ltd. | Multimedia playing apparatus and method for outputting modulated sound according to hearing characteristic of user |
US11570540B2 (en) | 2012-02-22 | 2023-01-31 | Snik, LLC | Magnetic earphones holder |
US11575983B2 (en) | 2012-02-22 | 2023-02-07 | Snik, LLC | Magnetic earphones holder |
US10993013B2 (en) | 2012-02-22 | 2021-04-27 | Snik Llc | Magnetic earphones holder |
US10993012B2 (en) | 2012-02-22 | 2021-04-27 | Snik Llc | Magnetic earphones holder |
US10524038B2 (en) | 2012-02-22 | 2019-12-31 | Snik Llc | Magnetic earphones holder |
US11652369B2 (en) | 2012-07-06 | 2023-05-16 | Energous Corporation | Systems and methods of determining a location of a receiver device and wirelessly delivering power to a focus region associated with the receiver device |
US10992185B2 (en) | 2012-07-06 | 2021-04-27 | Energous Corporation | Systems and methods of using electromagnetic waves to wirelessly deliver power to game controllers |
US10992187B2 (en) | 2012-07-06 | 2021-04-27 | Energous Corporation | System and methods of using electromagnetic waves to wirelessly deliver power to electronic devices |
US11502551B2 (en) | 2012-07-06 | 2022-11-15 | Energous Corporation | Wirelessly charging multiple wireless-power receivers using different subsets of an antenna array to focus energy at different locations |
US10965164B2 (en) | 2012-07-06 | 2021-03-30 | Energous Corporation | Systems and methods of wirelessly delivering power to a receiver device |
US9135915B1 (en) * | 2012-07-26 | 2015-09-15 | Google Inc. | Augmenting speech segmentation and recognition using head-mounted vibration and/or motion sensors |
US20150356981A1 (en) * | 2012-07-26 | 2015-12-10 | Google Inc. | Augmenting Speech Segmentation and Recognition Using Head-Mounted Vibration and/or Motion Sensors |
US9779758B2 (en) * | 2012-07-26 | 2017-10-03 | Google Inc. | Augmenting speech segmentation and recognition using head-mounted vibration and/or motion sensors |
US9560458B2 (en) * | 2012-12-14 | 2017-01-31 | Oticon A/S | Configurable hearing instrument |
US20140169596A1 (en) * | 2012-12-14 | 2014-06-19 | Oticon A/S | Configurable hearing instrument |
US9619626B2 (en) * | 2013-01-08 | 2017-04-11 | Samsung Electronics Co., Ltd | Method and apparatus for identifying exercise information of user |
US20140195018A1 (en) * | 2013-01-08 | 2014-07-10 | Samsung Electronics Co., Ltd. | Method and apparatus for identifying exercise information of user |
US11185257B2 (en) | 2013-06-14 | 2021-11-30 | Oticon A/S | Hearing assistance device with brain computer interface |
US20140369537A1 (en) * | 2013-06-14 | 2014-12-18 | Oticon A/S | Hearing assistance device with brain computer interface |
US10743121B2 (en) | 2013-06-14 | 2020-08-11 | Oticon A/S | Hearing assistance device with brain computer interface |
US9788128B2 (en) * | 2013-06-14 | 2017-10-10 | Gn Hearing A/S | Hearing instrument with off-line speech messages |
US20140369536A1 (en) * | 2013-06-14 | 2014-12-18 | Gn Resound A/S | Hearing instrument with off-line speech messages |
US9210517B2 (en) * | 2013-06-14 | 2015-12-08 | Oticon A/S | Hearing assistance device with brain computer interface |
US9906872B2 (en) * | 2013-06-21 | 2018-02-27 | The Trustees Of Dartmouth College | Hearing-aid noise reduction circuitry with neural feedback to improve speech comprehension |
US20160157030A1 (en) * | 2013-06-21 | 2016-06-02 | The Trustees Of Dartmouth College | Hearing-Aid Noise Reduction Circuitry With Neural Feedback To Improve Speech Comprehension |
US20150130628A1 (en) * | 2013-07-22 | 2015-05-14 | Center For Integrated Smart Sensors Foundation | Nfc or rfid based bio sensor measurement device and measuring method using the same |
US11707633B2 (en) | 2013-10-29 | 2023-07-25 | Physio-Control, Inc. | Variable sound system for audio devices |
US10792507B2 (en) * | 2013-10-29 | 2020-10-06 | Physio-Control, Inc. | Variable sound system for audio devices |
US20200009395A1 (en) * | 2013-10-29 | 2020-01-09 | Physio-Control, Inc. | Variable sound system for audio devices |
US11247062B2 (en) | 2013-10-29 | 2022-02-15 | Physio-Control, Inc. | Variable sound system for audio devices |
US11317224B2 (en) | 2014-03-18 | 2022-04-26 | Earlens Corporation | High fidelity and reduced feedback contact hearing apparatus and methods |
US10034103B2 (en) | 2014-03-18 | 2018-07-24 | Earlens Corporation | High fidelity and reduced feedback contact hearing apparatus and methods |
EP2928215A1 (en) * | 2014-04-04 | 2015-10-07 | Oticon A/s | Self-calibration of multi-microphone noise reduction system for hearing assistance devices using an auxiliary device |
EP2928211A1 (en) * | 2014-04-04 | 2015-10-07 | Oticon A/s | Self-calibration of multi-microphone noise reduction system for hearing assistance devices using an auxiliary device |
US20150289064A1 (en) * | 2014-04-04 | 2015-10-08 | Oticon A/S | Self-calibration of multi-microphone noise reduction system for hearing assistance devices using an auxiliary device |
US9591411B2 (en) * | 2014-04-04 | 2017-03-07 | Oticon A/S | Self-calibration of multi-microphone noise reduction system for hearing assistance devices using an auxiliary device |
US20170094401A1 (en) * | 2014-05-20 | 2017-03-30 | Bugatone Ltd. | Aural measurements from earphone output speakers |
US10142722B2 (en) * | 2014-05-20 | 2018-11-27 | Bugatone Ltd. | Aural measurements from earphone output speakers |
US10555102B2 (en) | 2014-05-20 | 2020-02-04 | Bugatone Ltd. | Aural measurements from earphone output speakers |
US9769858B2 (en) | 2014-05-30 | 2017-09-19 | Apple Inc. | Seamless connectivity between hearing aid and multiple devices |
US9763276B2 (en) * | 2014-05-30 | 2017-09-12 | Apple Inc. | Seamless connectivity between hearing aid and multiple devices |
US20150351143A1 (en) * | 2014-05-30 | 2015-12-03 | Apple Inc. | Seamless connectivity between hearing aid and multiple devices |
US11800303B2 (en) | 2014-07-14 | 2023-10-24 | Earlens Corporation | Sliding bias and peak limiting for optical hearing devices |
US11259129B2 (en) | 2014-07-14 | 2022-02-22 | Earlens Corporation | Sliding bias and peak limiting for optical hearing devices |
US10531206B2 (en) | 2014-07-14 | 2020-01-07 | Earlens Corporation | Sliding bias and peak limiting for optical hearing devices |
US10185163B2 (en) | 2014-08-03 | 2019-01-22 | PogoTec, Inc. | Wearable camera systems and apparatus and method for attaching camera systems or other electronic devices to wearable articles |
US20160033308A1 (en) * | 2014-08-04 | 2016-02-04 | Infineon Technologies Ag | Intelligent gauge devices and related systems and methods |
US11265663B2 (en) | 2014-08-22 | 2022-03-01 | K/S Himpp | Wireless hearing device with physiologic sensors for health monitoring |
US11265665B2 (en) * | 2014-08-22 | 2022-03-01 | K/S Himpp | Wireless hearing device interactive with medical devices |
US11265664B2 (en) * | 2014-08-22 | 2022-03-01 | K/S Himpp | Wireless hearing device for tracking activity and emergency events |
US10299050B2 (en) * | 2014-08-27 | 2019-05-21 | Auditory Labs, Llc | Mobile audio receiver |
US9621228B2 (en) * | 2014-08-29 | 2017-04-11 | Freelinc Technologies | Spatially aware communications using radio frequency (RF) communications standards |
US10038475B2 (en) * | 2014-08-29 | 2018-07-31 | Freelinc Technologies Inc. | Proximity boundary based communication using radio frequency (RF) communication standards |
US9780837B2 (en) | 2014-08-29 | 2017-10-03 | Freelinc Technologies | Spatially enabled secure communications |
US20160066185A1 (en) * | 2014-08-29 | 2016-03-03 | Freelinc Technologies | Spatially aware communications using radio frequency (rf) communications standards |
US10122414B2 (en) | 2014-08-29 | 2018-11-06 | Freelinc Technologies Inc. | Spatially enabled secure communications |
US10084512B2 (en) | 2014-08-29 | 2018-09-25 | Freelinc Technologies | Proximity boundary based communication |
US9621227B2 (en) * | 2014-08-29 | 2017-04-11 | Freelinc Technologies | Proximity boundary based communication using radio frequency (RF) communication standards |
US20160065271A1 (en) * | 2014-08-29 | 2016-03-03 | Freelinc Technologies | Proximity boundary based communication using radio frequency (rf) communication standards |
US9705564B2 (en) | 2014-08-29 | 2017-07-11 | Freelinc Technologies | Spatially enabled secure communications |
US20170338857A1 (en) * | 2014-08-29 | 2017-11-23 | Freelinc Technologies Inc. | Proximity boundary based communication using radio frequency (rf) communication standards |
US9838082B2 (en) | 2014-08-29 | 2017-12-05 | Freelinc Technologies | Proximity boundary based communication |
US11115519B2 (en) | 2014-11-11 | 2021-09-07 | K/S Himpp | Subscription-based wireless service for a hearing device |
US10516951B2 (en) | 2014-11-26 | 2019-12-24 | Earlens Corporation | Adjustable venting for hearing instruments |
US11252516B2 (en) | 2014-11-26 | 2022-02-15 | Earlens Corporation | Adjustable venting for hearing instruments |
US20170118568A1 (en) * | 2014-12-10 | 2017-04-27 | Starkey Laboratories, Inc. | Managing a hearing assistance device via low energy digital communications |
US10506355B2 (en) * | 2014-12-10 | 2019-12-10 | Starkey Laboratories, Inc. | Managing a hearing assistance device via low energy digital communications |
US11284249B2 (en) * | 2014-12-12 | 2022-03-22 | Gn Hearing A/S | Apparatus for secure hearing device communication and related method |
US10045207B2 (en) * | 2014-12-12 | 2018-08-07 | Gn Hearing A/S | Apparatus for secure hearing device communication and related method |
US10595197B2 (en) * | 2014-12-12 | 2020-03-17 | Gn Hearing A/S | Apparatus for secure hearing device communication and related method |
US20170064545A1 (en) * | 2014-12-12 | 2017-03-02 | Gn Resound A/S | Apparatus for secure hearing device communication and related method |
US9503437B2 (en) * | 2014-12-12 | 2016-11-22 | Gn Resound A/S | Apparatus for secure hearing device communication and related method |
US20190141522A1 (en) * | 2014-12-12 | 2019-05-09 | Gn Hearing A/S | Apparatus for secure hearing device communication and related method |
US10348965B2 (en) | 2014-12-23 | 2019-07-09 | PogoTec, Inc. | Wearable camera system |
US10887516B2 (en) | 2014-12-23 | 2021-01-05 | PogoTec, Inc. | Wearable camera system |
US10164685B2 (en) | 2014-12-31 | 2018-12-25 | Freelinc Technologies Inc. | Spatially aware wireless network |
US10674292B2 (en) * | 2015-02-23 | 2020-06-02 | Oticon A/S | Method and apparatus for controlling a hearing instrument to relieve tinitus, hyperacusis, and hearing loss |
US20190261105A1 (en) * | 2015-02-23 | 2019-08-22 | Oticon A/S | Method and apparatus for controlling a hearing instrument to relieve tinitus, hyperacusis, and hearing loss |
US20160278647A1 (en) * | 2015-03-26 | 2016-09-29 | Intel Corporation | Misalignment detection of a wearable device |
US11678810B2 (en) | 2015-03-26 | 2023-06-20 | Intel Corporation | Sensor data transmissions |
US10292607B2 (en) | 2015-03-26 | 2019-05-21 | Intel Corporation | Sensor data transmissions |
US11426592B2 (en) * | 2015-05-14 | 2022-08-30 | Cochlear Limited | Functionality migration |
US20160331964A1 (en) * | 2015-05-14 | 2016-11-17 | Cochlear Limited | Functionality migration |
US10970030B2 (en) | 2015-06-05 | 2021-04-06 | Apple Inc. | Changing companion communication device behavior based on status of wearable device |
WO2016196838A1 (en) * | 2015-06-05 | 2016-12-08 | Apple Inc. | Changing companion communication device behavior based on status of wearable device |
US10067734B2 (en) | 2015-06-05 | 2018-09-04 | Apple Inc. | Changing companion communication device behavior based on status of wearable device |
US11630636B2 (en) | 2015-06-05 | 2023-04-18 | Apple Inc. | Changing companion communication device behavior based on status of wearable device |
US10241351B2 (en) | 2015-06-10 | 2019-03-26 | PogoTec, Inc. | Eyewear with magnetic track for electronic wearable device |
US10255902B2 (en) * | 2015-06-25 | 2019-04-09 | Boe Technology Group Co., Ltd. | Voice synthesis device, voice synthesis method, bone conduction helmet and hearing aid |
US10536782B2 (en) * | 2015-07-02 | 2020-01-14 | Carl L. C. Kah, Jr. | External ear insert for hearing enhancement |
US20170006387A1 (en) * | 2015-07-02 | 2017-01-05 | Carl L.C. Kah, JR. | External ear insert for hearing enhancement |
US10425518B2 (en) | 2015-07-03 | 2019-09-24 | teleCalm, Inc. | Telephone system for impaired individuals |
WO2017007728A1 (en) * | 2015-07-03 | 2017-01-12 | teleCalm, Inc. | Telephone system for impaired individuals |
US9686392B2 (en) | 2015-07-03 | 2017-06-20 | teleCalm, Inc. | Telephone system for impaired individuals |
US9704497B2 (en) * | 2015-07-06 | 2017-07-11 | Apple Inc. | Method and system of audio power reduction and thermal mitigation using psychoacoustic techniques |
US20170064470A1 (en) * | 2015-08-24 | 2017-03-02 | Ivana Popovac | Prosthesis functionality control and data presentation |
CN108028995A (en) * | 2015-08-24 | 2018-05-11 | 科利耳有限公司 | Prosthesis function is controlled to be represented with data |
US11917375B2 (en) | 2015-08-24 | 2024-02-27 | Cochlear Limited | Prosthesis functionality control and data presentation |
US10575108B2 (en) * | 2015-08-24 | 2020-02-25 | Cochlear Limited | Prosthesis functionality control and data presentation |
US10397688B2 (en) | 2015-08-29 | 2019-08-27 | Bragi GmbH | Power control for battery powered personal area network device system and method |
US11690428B2 (en) | 2015-09-30 | 2023-07-04 | Apple Inc. | Portable listening device with accelerometer |
US11944172B2 (en) * | 2015-09-30 | 2024-04-02 | Apple Inc. | Portable listening device with sensors |
US20210274273A1 (en) * | 2015-09-30 | 2021-09-02 | Apple Inc. | Portable listening device with sensors |
US10292601B2 (en) | 2015-10-02 | 2019-05-21 | Earlens Corporation | Wearable customized ear canal apparatus |
US11058305B2 (en) | 2015-10-02 | 2021-07-13 | Earlens Corporation | Wearable customized ear canal apparatus |
US20170111747A1 (en) * | 2015-10-14 | 2017-04-20 | Sonion Nederland B.V. | Hearing device with vibration sensitive transducer |
US10021494B2 (en) * | 2015-10-14 | 2018-07-10 | Sonion Nederland B.V. | Hearing device with vibration sensitive transducer |
US10582289B2 (en) | 2015-10-20 | 2020-03-03 | Bragi GmbH | Enhanced biometric control systems for detection of emergency events system and method |
US20170127196A1 (en) * | 2015-10-29 | 2017-05-04 | PogoTec, Inc. | Hearing aid adapted for wireless power reception |
US11166112B2 (en) * | 2015-10-29 | 2021-11-02 | PogoTec, Inc. | Hearing aid adapted for wireless power reception |
EP3163911B1 (en) | 2015-10-29 | 2018-08-01 | Sivantos Pte. Ltd. | Hearing aid system with sensor for detection of biological data |
EP3163911A1 (en) * | 2015-10-29 | 2017-05-03 | Sivantos Pte. Ltd. | Hearing aid system with sensor for detection of biological data |
US20170127193A1 (en) * | 2015-10-29 | 2017-05-04 | Sivantos Pte. Ltd. | Hearing aid system and method containing a sensor for capturing biological data |
US20220060836A1 (en) * | 2015-10-29 | 2022-02-24 | Pogotec Inc. | Hearing aid adapted for wireless power reception |
CN106963548A (en) * | 2015-10-29 | 2017-07-21 | 西万拓私人有限公司 | Hearing system with the sensor for gathering biological data |
US10341787B2 (en) * | 2015-10-29 | 2019-07-02 | PogoTec, Inc. | Hearing aid adapted for wireless power reception |
US20170156010A1 (en) * | 2015-11-27 | 2017-06-01 | Rishubh VERMA | External component with inductance and mechanical vibratory functionality |
US10321247B2 (en) * | 2015-11-27 | 2019-06-11 | Cochlear Limited | External component with inductance and mechanical vibratory functionality |
US10412519B1 (en) * | 2015-12-27 | 2019-09-10 | Philip Scott Lyren | Switching binaural sound |
US20190297442A1 (en) * | 2015-12-27 | 2019-09-26 | Philip Scott Lyren | Switching Binaural Sound |
US11337012B2 (en) | 2015-12-30 | 2022-05-17 | Earlens Corporation | Battery coating for rechargable hearing systems |
US10178483B2 (en) | 2015-12-30 | 2019-01-08 | Earlens Corporation | Light based hearing systems, apparatus, and methods |
US11070927B2 (en) | 2015-12-30 | 2021-07-20 | Earlens Corporation | Damping in contact hearing systems |
US20170195804A1 (en) * | 2015-12-30 | 2017-07-06 | Earlens Corporation | Charging protocol for rechargable hearing systems |
US10306381B2 (en) * | 2015-12-30 | 2019-05-28 | Earlens Corporation | Charging protocol for rechargable hearing systems |
US10492010B2 (en) | 2015-12-30 | 2019-11-26 | Earlens Corporation | Damping in contact hearing systems |
US11350226B2 (en) * | 2015-12-30 | 2022-05-31 | Earlens Corporation | Charging protocol for rechargeable hearing systems |
US10779094B2 (en) | 2015-12-30 | 2020-09-15 | Earlens Corporation | Damping in contact hearing systems |
US11516602B2 (en) | 2015-12-30 | 2022-11-29 | Earlens Corporation | Damping in contact hearing systems |
US10524068B2 (en) | 2016-01-07 | 2019-12-31 | Sonova Ag | Hearing assistance device transducers and hearing assistance devices with same |
US11700475B2 (en) | 2016-03-11 | 2023-07-11 | Bragi GmbH | Earpiece with GPS receiver |
US10893353B2 (en) | 2016-03-11 | 2021-01-12 | Bragi GmbH | Earpiece with GPS receiver |
US10506351B2 (en) * | 2016-03-11 | 2019-12-10 | Sonova Ag | Hearing assistance device and method with automatic security control |
US11336989B2 (en) | 2016-03-11 | 2022-05-17 | Bragi GmbH | Earpiece with GPS receiver |
US11558538B2 (en) | 2016-03-18 | 2023-01-17 | Opkix, Inc. | Portable camera system |
US10433788B2 (en) * | 2016-03-23 | 2019-10-08 | Bragi GmbH | Earpiece life monitor with capability of automatic notification system and method |
US20190246216A1 (en) * | 2016-03-30 | 2019-08-08 | Oticon A/S | Hearing device and monitoring system thereof |
US10743115B2 (en) * | 2016-03-30 | 2020-08-11 | Oticon A/S | Hearing device and monitoring system thereof |
EP3035710A3 (en) * | 2016-03-30 | 2016-11-02 | Oticon A/s | Monitoring system for a hearing device |
US10015601B2 (en) * | 2016-03-30 | 2018-07-03 | Oticon A/S | Hearing device and monitoring system thereof |
US10313802B2 (en) * | 2016-03-30 | 2019-06-04 | Oticon A/S | Hearing device and monitoring system thereof |
US10157037B2 (en) * | 2016-03-31 | 2018-12-18 | Bose Corporation | Performing an operation at a headphone system |
US9924255B2 (en) | 2016-03-31 | 2018-03-20 | Bose Corporation | On/off head detection using magnetic field sensing |
US11272281B2 (en) | 2016-04-19 | 2022-03-08 | Snik Llc | Magnetic earphones holder |
US10951968B2 (en) | 2016-04-19 | 2021-03-16 | Snik Llc | Magnetic earphones holder |
US11153671B2 (en) | 2016-04-19 | 2021-10-19 | Snik Llc | Magnetic earphones holder |
US11678101B2 (en) * | 2016-04-19 | 2023-06-13 | Snik Llc | Magnetic earphones holder |
US11638075B2 (en) | 2016-04-19 | 2023-04-25 | Snik Llc | Magnetic earphones holder |
US11722811B2 (en) | 2016-04-19 | 2023-08-08 | Snik Llc | Magnetic earphones holder |
US20190158946A1 (en) * | 2016-04-19 | 2019-05-23 | Snik Llc | Magnetic earphones holder |
US11095972B2 (en) | 2016-04-19 | 2021-08-17 | Snik Llc | Magnetic earphones holder |
US10631074B2 (en) | 2016-04-19 | 2020-04-21 | Snik Llc | Magnetic earphones holder |
US11632615B2 (en) | 2016-04-19 | 2023-04-18 | Snik Llc | Magnetic earphones holder |
CN107801138A (en) * | 2016-08-29 | 2018-03-13 | 奥迪康有限公司 | hearing aid device with voice control function |
US11540065B2 (en) | 2016-09-09 | 2022-12-27 | Earlens Corporation | Contact hearing systems, apparatus and methods |
US11102594B2 (en) | 2016-09-09 | 2021-08-24 | Earlens Corporation | Contact hearing systems, apparatus and methods |
KR101913295B1 (en) * | 2016-09-23 | 2018-10-30 | 애플 인크. | Broadcasting a device state in a wireless communication network |
KR102033682B1 (en) | 2016-09-23 | 2019-10-18 | 애플 인크. | Broadcasting a device state in a wireless communication network |
US10834567B2 (en) | 2016-09-23 | 2020-11-10 | Apple Inc. | Broadcasting a device state in a wireless communication network |
KR20180033077A (en) * | 2016-09-23 | 2018-04-02 | 애플 인크. | Broadcasting a device state in a wireless communication network |
US10349259B2 (en) | 2016-09-23 | 2019-07-09 | Apple Inc. | Broadcasting a device state in a wireless communication network |
KR20180120636A (en) * | 2016-09-23 | 2018-11-06 | 애플 인크. | Broadcasting a device state in a wireless communication network |
US20180101656A1 (en) * | 2016-10-07 | 2018-04-12 | Bragi GmbH | Software Application Transmission via Body Interface Using a Wearable Device in Conjunction with Removable Body Sensor Arrays System and Method |
US10049184B2 (en) * | 2016-10-07 | 2018-08-14 | Bragi GmbH | Software application transmission via body interface using a wearable device in conjunction with removable body sensor arrays system and method |
US11908442B2 (en) | 2016-11-03 | 2024-02-20 | Bragi GmbH | Selective audio isolation from body generated sound system and method |
US10896665B2 (en) | 2016-11-03 | 2021-01-19 | Bragi GmbH | Selective audio isolation from body generated sound system and method |
US11417307B2 (en) | 2016-11-03 | 2022-08-16 | Bragi GmbH | Selective audio isolation from body generated sound system and method |
US10398374B2 (en) | 2016-11-04 | 2019-09-03 | Bragi GmbH | Manual operation assistance with earpiece with 3D sound cues |
US10681450B2 (en) | 2016-11-04 | 2020-06-09 | Bragi GmbH | Earpiece with source selection within ambient environment |
US10665243B1 (en) * | 2016-11-11 | 2020-05-26 | Facebook Technologies, Llc | Subvocalized speech recognition |
US11671774B2 (en) | 2016-11-15 | 2023-06-06 | Earlens Corporation | Impression procedure |
US11166114B2 (en) | 2016-11-15 | 2021-11-02 | Earlens Corporation | Impression procedure |
US20190052979A1 (en) * | 2017-01-05 | 2019-02-14 | Ohio State Innovation Foundation | Systems and methods for wirelessly charging a hearing device |
WO2018129281A1 (en) * | 2017-01-05 | 2018-07-12 | Ohio State Innovation Foundation | Systems and methods for wirelessly charging a hearing device |
US10624559B2 (en) | 2017-02-13 | 2020-04-21 | Starkey Laboratories, Inc. | Fall prediction system and method of using the same |
US11344256B2 (en) | 2017-02-21 | 2022-05-31 | Bose Corporation | Collecting biologically-relevant information using an earpiece |
EP3313092A1 (en) * | 2017-03-17 | 2018-04-25 | Oticon A/s | A hearing system for monitoring a health related parameter |
EP3376780A1 (en) * | 2017-03-17 | 2018-09-19 | Oticon A/s | A hearing system for monitoring a health related parameter |
JP7455417B2 (en) | 2017-03-23 | 2024-03-26 | 株式会社Agama-X | EEG measurement device, EEG measurement system, EEG measurement method, and EEG measurement program |
US10918325B2 (en) * | 2017-03-23 | 2021-02-16 | Fuji Xerox Co., Ltd. | Brain wave measuring device and brain wave measuring system |
US20180271428A1 (en) * | 2017-03-23 | 2018-09-27 | Fuji Xerox Co., Ltd. | Brain wave measuring device and brain wave measuring system |
US11011942B2 (en) | 2017-03-30 | 2021-05-18 | Energous Corporation | Flat antennas having two or more resonant frequencies for use in wireless power transmission systems |
US11462949B2 (en) | 2017-05-16 | 2022-10-04 | Wireless electrical Grid LAN, WiGL Inc | Wireless charging method and system |
US10213157B2 (en) * | 2017-06-09 | 2019-02-26 | Bose Corporation | Active unipolar dry electrode open ear wireless headset and brain computer interface |
US10575105B2 (en) | 2017-06-09 | 2020-02-25 | Sivantos Pte. Ltd. | Method for characterizing a receiver in a hearing device, hearing device and test apparatus for a hearing device |
KR102390101B1 (en) | 2017-06-23 | 2022-04-26 | 에너저스 코포레이션 | Systems, methods and devices using wires of sound reproduction devices as antennas for the reception of wirelessly transmitted power |
US10848853B2 (en) * | 2017-06-23 | 2020-11-24 | Energous Corporation | Systems, methods, and devices for utilizing a wire of a sound-producing device as an antenna for receipt of wirelessly delivered power |
JP7320096B2 (en) | 2017-06-23 | 2023-08-02 | エナージャス コーポレイション | Systems, methods, and devices for utilizing a wire of a sound producing device as an antenna for receiving wirelessly delivered power |
US11218795B2 (en) * | 2017-06-23 | 2022-01-04 | Energous Corporation | Systems, methods, and devices for utilizing a wire of a sound-producing device as an antenna for receipt of wirelessly delivered power |
KR20200020793A (en) * | 2017-06-23 | 2020-02-26 | 에너저스 코포레이션 | Systems, methods, and devices using wires of a sound reproduction device as antennas for reception of wireless transmission power |
JP2020526158A (en) * | 2017-06-23 | 2020-08-27 | エナージャス コーポレイション | System, method and device for utilizing wires of a sound producing device as an antenna for receiving power delivered wirelessly |
US10701470B2 (en) * | 2017-09-07 | 2020-06-30 | Light Speed Aviation, Inc. | Circumaural headset or headphones with adjustable biometric sensor |
US20190075382A1 (en) * | 2017-09-07 | 2019-03-07 | Light Speed Aviation, Inc. | Circumaural headset or headphones with adjustable biometric sensor |
US10764668B2 (en) | 2017-09-07 | 2020-09-01 | Lightspeed Aviation, Inc. | Sensor mount and circumaural headset or headphones with adjustable sensor |
US10344960B2 (en) | 2017-09-19 | 2019-07-09 | Bragi GmbH | Wireless earpiece controlled medical headlight |
US11272367B2 (en) | 2017-09-20 | 2022-03-08 | Bragi GmbH | Wireless earpieces for hub communications |
US11711695B2 (en) | 2017-09-20 | 2023-07-25 | Bragi GmbH | Wireless earpieces for hub communications |
EP3701729A4 (en) * | 2017-10-23 | 2021-12-22 | Cochlear Limited | Advanced assistance for prosthesis assisted communication |
CN111226445A (en) * | 2017-10-23 | 2020-06-02 | 科利耳有限公司 | Advanced auxiliary device for prosthesis-assisted communication |
US11817721B2 (en) | 2017-10-30 | 2023-11-14 | Energous Corporation | Systems and methods for managing coexistence of wireless-power signals and data signals operating in a same frequency band |
US11342798B2 (en) | 2017-10-30 | 2022-05-24 | Energous Corporation | Systems and methods for managing coexistence of wireless-power signals and data signals operating in a same frequency band |
US10728642B2 (en) | 2018-02-28 | 2020-07-28 | Starkey Laboratories, Inc. | Portable case for modular hearing assistance devices |
US11716580B2 (en) * | 2018-02-28 | 2023-08-01 | Starkey Laboratories, Inc. | Health monitoring with ear-wearable devices and accessory devices |
US20220345836A1 (en) * | 2018-02-28 | 2022-10-27 | Starkey Laboratories, Inc. | Health monitoring with ear-wearable devices and accessory devices |
US10659859B2 (en) | 2018-02-28 | 2020-05-19 | Starkey Laboratories, Inc. | Portable case for modular hearing assistance devices |
US10939216B2 (en) * | 2018-02-28 | 2021-03-02 | Starkey Laboratories, Inc. | Health monitoring with ear-wearable devices and accessory devices |
US20190268707A1 (en) * | 2018-02-28 | 2019-08-29 | Starkey Laboratories, Inc. | Health monitoring with ear-wearable devices and accessory devices |
US11019417B2 (en) | 2018-02-28 | 2021-05-25 | Starkey Laboratories, Inc. | Modular hearing assistance device |
US11395076B2 (en) * | 2018-02-28 | 2022-07-19 | Starkey Laboratories, Inc. | Health monitoring with ear-wearable devices and accessory devices |
US11516603B2 (en) | 2018-03-07 | 2022-11-29 | Earlens Corporation | Contact hearing device and retention structure materials |
US11564044B2 (en) | 2018-04-09 | 2023-01-24 | Earlens Corporation | Dynamic filter |
US11212626B2 (en) | 2018-04-09 | 2021-12-28 | Earlens Corporation | Dynamic filter |
US10880655B2 (en) * | 2018-06-18 | 2020-12-29 | Sivantos Pte. Ltd. | Method for operating a hearing apparatus system, and hearing apparatus system |
US20190387327A1 (en) * | 2018-06-18 | 2019-12-19 | Sivantos Pte. Ltd. | Method for operating a hearing apparatus system, and hearing apparatus system |
US10798497B2 (en) * | 2018-07-03 | 2020-10-06 | Tom Yu-Chi CHANG | Hearing aid device and a system for controlling a hearing aid device |
US20200015022A1 (en) * | 2018-07-03 | 2020-01-09 | Tom Yu-Chi CHANG | Hearing aid device and a system for controlling a hearing aid device |
US11265643B2 (en) * | 2018-09-17 | 2022-03-01 | Starkey Laboratories, Inc. | Hearing device including a sensor and hearing system including same |
US11300857B2 (en) | 2018-11-13 | 2022-04-12 | Opkix, Inc. | Wearable mounts for portable camera |
US20210267464A1 (en) * | 2018-11-15 | 2021-09-02 | Kyocera Corporation | Biosensor |
EP3865058A4 (en) * | 2018-11-26 | 2022-07-27 | Osong Medical Innovation Foundation | Core body temperature measurement device having battery charging structure |
US11277697B2 (en) | 2018-12-15 | 2022-03-15 | Starkey Laboratories, Inc. | Hearing assistance system with enhanced fall detection features |
US11330380B2 (en) | 2018-12-21 | 2022-05-10 | Starkey Laboratories, Inc. | Modularization of components of an ear-wearable device |
US10911878B2 (en) | 2018-12-21 | 2021-02-02 | Starkey Laboratories, Inc. | Modularization of components of an ear-wearable device |
US11918386B2 (en) | 2018-12-26 | 2024-03-05 | Flashback Technologies, Inc. | Device-based maneuver and activity state-based physiologic status monitoring |
US11638563B2 (en) | 2018-12-27 | 2023-05-02 | Starkey Laboratories, Inc. | Predictive fall event management system and method of using same |
US11184052B2 (en) * | 2018-12-28 | 2021-11-23 | Samsung Electronics Co., Ltd. | Apparatus and method with near-field communication |
US20200212961A1 (en) * | 2018-12-28 | 2020-07-02 | Samsung Electronics Co., Ltd. | Apparatus and method with near-field communication |
WO2020144189A1 (en) | 2019-01-07 | 2020-07-16 | Cosinuss Gmbh | Method for providing data for an interface |
US11539243B2 (en) | 2019-01-28 | 2022-12-27 | Energous Corporation | Systems and methods for miniaturized antenna for wireless power transmissions |
US11018779B2 (en) | 2019-02-06 | 2021-05-25 | Energous Corporation | Systems and methods of estimating optimal phases to use for individual antennas in an antenna array |
US11463179B2 (en) | 2019-02-06 | 2022-10-04 | Energous Corporation | Systems and methods of estimating optimal phases to use for individual antennas in an antenna array |
US11784726B2 (en) | 2019-02-06 | 2023-10-10 | Energous Corporation | Systems and methods of estimating optimal phases to use for individual antennas in an antenna array |
USD869445S1 (en) * | 2019-05-22 | 2019-12-10 | Shenzhen Qianhai Patuoxun Network And Technology Co., Ltd | Earphone |
US11786694B2 (en) | 2019-05-24 | 2023-10-17 | NeuroLight, Inc. | Device, method, and app for facilitating sleep |
US11304016B2 (en) * | 2019-06-04 | 2022-04-12 | Concha Inc. | Method for configuring a hearing-assistance device with a hearing profile |
US20220191632A1 (en) * | 2019-06-04 | 2022-06-16 | Concha Inc. | Method for configuring a hearing-assistance device with a hearing profile |
US11871187B2 (en) * | 2019-06-04 | 2024-01-09 | Concha Inc. | Method for configuring a hearing-assistance device with a hearing profile |
WO2020264203A1 (en) * | 2019-06-28 | 2020-12-30 | Starkey Laboratories, Inc. | Direct informative communication through an ear-wearable device |
USD878337S1 (en) * | 2019-07-08 | 2020-03-17 | Shenzhen Ginto E-commerce Co., Limited | Earphone |
US11411441B2 (en) | 2019-09-20 | 2022-08-09 | Energous Corporation | Systems and methods of protecting wireless power receivers using multiple rectifiers and establishing in-band communications using multiple rectifiers |
US11831361B2 (en) | 2019-09-20 | 2023-11-28 | Energous Corporation | Systems and methods for machine learning based foreign object detection for wireless power transmission |
US11139699B2 (en) | 2019-09-20 | 2021-10-05 | Energous Corporation | Classifying and detecting foreign objects using a power amplifier controller integrated circuit in wireless power transmission systems |
US11381118B2 (en) | 2019-09-20 | 2022-07-05 | Energous Corporation | Systems and methods for machine learning based foreign object detection for wireless power transmission |
US11799328B2 (en) | 2019-09-20 | 2023-10-24 | Energous Corporation | Systems and methods of protecting wireless power receivers using surge protection provided by a rectifier, a depletion mode switch, and a coupling mechanism having multiple coupling locations |
US11715980B2 (en) | 2019-09-20 | 2023-08-01 | Energous Corporation | Classifying and detecting foreign objects using a power amplifier controller integrated circuit in wireless power transmission systems |
US11445014B2 (en) * | 2019-11-11 | 2022-09-13 | Sivantos Pte. Ltd. | Method for operating a hearing device, and hearing device |
US11778392B2 (en) * | 2019-11-14 | 2023-10-03 | Starkey Laboratories, Inc. | Ear-worn electronic device configured to compensate for hunched or stooped posture |
USD890724S1 (en) * | 2019-11-22 | 2020-07-21 | Stb International Limited | Earphone |
US20220272461A1 (en) * | 2019-12-06 | 2022-08-25 | Gn Hearing A/S | Method for charging a battery of a hearing device |
US11355966B2 (en) | 2019-12-13 | 2022-06-07 | Energous Corporation | Charging pad with guiding contours to align an electronic device on the charging pad and efficiently transfer near-field radio-frequency energy to the electronic device |
USD883260S1 (en) * | 2019-12-25 | 2020-05-05 | Shenzhen Qianhai Patuoxun Network And Technology Co., Ltd | Earphone |
US11411437B2 (en) | 2019-12-31 | 2022-08-09 | Energous Corporation | System for wirelessly transmitting energy without using beam-forming control |
US10985617B1 (en) | 2019-12-31 | 2021-04-20 | Energous Corporation | System for wirelessly transmitting energy at a near-field distance without using beam-forming control |
US11817719B2 (en) | 2019-12-31 | 2023-11-14 | Energous Corporation | Systems and methods for controlling and managing operation of one or more power amplifiers to optimize the performance of one or more antennas |
USD934839S1 (en) * | 2020-03-05 | 2021-11-02 | Shenzhen Yamay Digital Electronics Co. Ltd | Combined wireless earbuds and charging case |
USD893462S1 (en) * | 2020-03-05 | 2020-08-18 | Shenzhen Humboldt Technology Co., Ltd | Headphone |
US11799324B2 (en) | 2020-04-13 | 2023-10-24 | Energous Corporation | Wireless-power transmitting device for creating a uniform near-field charging area |
USD890138S1 (en) * | 2020-04-30 | 2020-07-14 | Shenzhen Qianhai Patuoxun Network And Technology Co., Ltd | Earphones |
USD901457S1 (en) * | 2020-06-03 | 2020-11-10 | Shenzhen Wireless Cloud Image Electronics Co., Ltd. | Wireless headset |
US11758338B2 (en) | 2020-06-05 | 2023-09-12 | Starkey Laboratories, Inc. | Authentication and encryption key exchange for assistive listening devices |
USD956719S1 (en) * | 2020-09-30 | 2022-07-05 | Shenzhen Zio Communication Technology Co., Ltd. | Earphone |
US11394755B1 (en) * | 2021-06-07 | 2022-07-19 | International Business Machines Corporation | Guided hardware input prompts |
WO2023057461A1 (en) * | 2021-10-06 | 2023-04-13 | Sivantos Pte. Ltd. | Method for operating a hearing aid system |
US11916398B2 (en) | 2021-12-29 | 2024-02-27 | Energous Corporation | Small form-factor devices with integrated and modular harvesting receivers, and shelving-mounted wireless-power transmitters for use therewith |
EP4227959A1 (en) * | 2022-02-10 | 2023-08-16 | GN Hearing A/S | Hearing system with cardiac arrest detection |
US20230410058A1 (en) * | 2022-06-21 | 2023-12-21 | Avaya Management L.P. | Virtual meeting participation |
Also Published As
Publication number | Publication date |
---|---|
US9185501B2 (en) | 2015-11-10 |
US20130343584A1 (en) | 2013-12-26 |
US9730005B2 (en) | 2017-08-08 |
US20130344806A1 (en) | 2013-12-26 |
US20160037288A1 (en) | 2016-02-04 |
Similar Documents
Publication | Title |
---|---|
US20130343585A1 (en) | Multisensor hearing assist device for health |
US11671773B2 (en) | Hearing aid device for hands free communication |
US10841682B2 (en) | Communication network of in-ear utility devices having sensors |
US9510112B2 (en) | External microphone array and hearing aid using it |
US20180263562A1 (en) | Hearing system for monitoring a health related parameter |
US9838771B1 (en) | In-ear utility device having a humidity sensor |
US10045130B2 (en) | In-ear utility device having voice recognition |
EP2874410A1 (en) | Communication system |
US20170347179A1 (en) | In-Ear Utility Device Having Tap Detector |
US11638106B2 (en) | Hearing system comprising a hearing aid and a processing device |
US20170347183A1 (en) | In-Ear Utility Device Having Dual Microphones |
WO2017205558A1 (en) | In-ear utility device having dual microphones |
US20220272462A1 (en) | Hearing device comprising an own voice processor |
US20240105177A1 (en) | Local artificial intelligence assistant system with ear-wearable device |
WO2020142679A1 (en) | Audio signal processing for automatic transcription using ear-wearable device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: BROADCOM CORPORATION, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BENNETT, JAMES D.;WALLEY, JOHN;SIGNING DATES FROM 20120926 TO 20121126;REEL/FRAME:029363/0832 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001 Effective date: 20160201
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001 Effective date: 20170120
|
AS | Assignment |
Owner name: BROADCOM CORPORATION, CALIFORNIA Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001 Effective date: 20170119 |