US20150154886A1 - Tactile communication apparatus, method, and computer program product - Google Patents

Tactile communication apparatus, method, and computer program product

Info

Publication number
US20150154886A1
Authority
US
United States
Prior art keywords
user
tactile
data
tactile communication
pins
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/096,858
Inventor
Thieab AlDossary
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US14/096,858
Publication of US20150154886A1
Status: Abandoned

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00 Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001 Teaching or communicating with blind persons
    • G09B21/003 Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
    • G09B21/004 Details of particular tactile cells, e.g. electro-mechanical or mechanical layout
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00 Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001 Teaching or communicating with blind persons
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00 Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001 Teaching or communicating with blind persons
    • G09B21/003 Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/247 Telephone sets including user guidance or feature selection means facilitating their use
    • H04M1/2474 Telephone terminals specially adapted for disabled people
    • H04M1/2476 Telephone terminals specially adapted for disabled people for a visually impaired user

Definitions

  • FIG. 11 is a sequence diagram of a tactile communication apparatus 1 providing the additional features of object identification and recognition.
  • Initially, the tactile communication apparatus 1 may be standing by at step S300 to receive an object identification code from the user 10.
  • Once the object identification code has been received from the user 10 at step S302, the detection processor 210 receives sensor data 60 of a plurality of objects to be identified via the detection unit 212.
  • Sensor data received by the detection unit 212 can be in any form capable of being processed by the detection processor 210, such as image information, motion information, or radar or sonar information.
  • The detection processor 210 processes and identifies objects and features contained within the sensor data at step S306. Once the sensor data has been processed at step S306, the detection processor 210 determines whether an identified object or feature corresponds to the object identification code received from the user 10 at step S302. If a recognized object fails to match the object identification code at step S308, the tactile communication device 30 may indicate an error or a no-match indication at step S310 by activating the vibration unit 306 in the tactile communication device 30. Once the user 10 is notified that no matches have been detected, the tactile communication apparatus 1 will return to a standby state at step S312.
  • If a recognized object matches the object identification code at step S308, the detection processor 210 may work in conjunction with the navigation processor 206 to generate directional data for the user to navigate to the recognized object at step S314.
  • Navigation data to the recognized object will be communicated to the user via the tactile communication device 30 at step S316.
  • Finally, the tactile communication apparatus 1 returns to a standby state at step S312.
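  • As a minimal illustration of the flow above, the following Python sketch mirrors the FIG. 11 sequence; the helper callbacks (identify_objects, vibrate_error, navigate_to) and the object record format are assumptions, not an implementation disclosed by the application.

```python
# Hypothetical sketch of the FIG. 11 detection sequence (steps S300-S316).
# The callbacks and the object/record format are illustrative assumptions.

def detection_sequence(object_code, sensor_frames, identify_objects,
                       vibrate_error, navigate_to):
    """Match a user-entered object identification code (e.g. "DC") against
    objects recognized in incoming sensor data, then hand off to navigation."""
    for frame in sensor_frames:                      # sensor data 60 via detection unit 212
        objects = identify_objects(frame)            # S306: isolate objects and features
        match = next((o for o in objects if o.get("code") == object_code), None)
        if match is None:                            # S308: no match found
            vibrate_error()                          # S310: vibration unit signals no match
            continue                                 # keep scanning further sensor data
        navigate_to(match["position"])               # S314/S316: guide the user to the object
        return match
    return None                                      # S312: return to standby
```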
  • As illustrated in the hardware block diagram of FIG. 12, the tactile communication apparatus 1 includes a CPU 500 which performs the processes described above.
  • the process data and instructions may be stored in memory 502 .
  • These processes and instructions may also be stored on a storage medium disk 504 such as a hard drive (HDD) or portable storage medium or may be stored remotely.
  • Further, the claimed advancements are not limited by the form of the computer-readable media on which the instructions of the inventive process are stored.
  • For example, the instructions may be stored on CDs, DVDs, in FLASH memory, RAM, ROM, PROM, EPROM, EEPROM, a hard disk, or any other information processing device with which the tactile communication apparatus 1 communicates, such as a server or computer.
  • Further, the claimed advancements may be provided as a utility application, background daemon, or component of an operating system, or a combination thereof, executing in conjunction with CPU 500 and an operating system such as Microsoft Windows 7, UNIX, Solaris, LINUX, Apple MAC-OS, and other systems known to those skilled in the art.
  • CPU 500 may be a Xeon or Core processor from Intel of America or an Opteron processor from AMD of America, or may be another processor type that would be recognized by one of ordinary skill in the art.
  • Alternatively, the CPU 500 may be implemented on an FPGA, ASIC, PLD, or using discrete logic circuits, as one of ordinary skill in the art would recognize. Further, CPU 500 may be implemented as multiple processors cooperatively working in parallel to perform the instructions of the inventive processes described above.
  • The tactile communication apparatus 1 in FIG. 12 also includes a signal receiver 204, such as an Intel Ethernet PRO network interface card from Intel Corporation of America, for interfacing with the wireless network 40.
  • The wireless network 40 can be a public network, such as the Internet, or a private network such as a LAN or WAN, or any combination thereof, and can also include PSTN or ISDN sub-networks.
  • The network 40 can also be wired, such as an Ethernet network, or can be wireless, such as a cellular network including EDGE, 3G, and 4G wireless cellular systems.
  • The wireless network can also be WiFi, Bluetooth, or any other known wireless form of communication.
  • The tactile communication apparatus 1 further includes a display controller 508, such as a NVIDIA GeForce GTX or Quadro graphics adaptor from NVIDIA Corporation of America, for interfacing with a display 510, such as a Hewlett Packard HPL2445w LCD monitor.
  • A general purpose I/O interface 512 interfaces with the keypad 302 as well as a touch screen panel 516 on or separate from the display 510.
  • The general purpose I/O interface 512 also connects to the plurality of pins 304.
  • A sound controller 520, such as a Sound Blaster X-Fi Titanium from Creative, is also provided in the tactile communication apparatus 1 to interface with speakers/microphone 522, thereby providing sounds and/or music.
  • The speakers/microphone 522 can also be used to accept dictated words as commands for controlling the tactile communication apparatus 1 or for providing location and/or other information with respect to a target.
  • The general purpose storage controller 524 connects the storage medium disk 504 with a communication bus 526, which may be an ISA, EISA, VESA, PCI, or similar bus, for interconnecting all of the components of the tactile communication apparatus 1.
  • A description of the general features and functionality of the display 510, as well as the display controller 508, storage controller 524, network controller 506, and sound controller 520, is omitted herein for brevity as these features are known.
  • FIG. 13A illustrates a front view of a tactile communication apparatus 2 according to an exemplary embodiment.
  • Whereas the tactile communication apparatus 1 included two main parts (a processing section 20 and a tactile communication device 30), this embodiment has three: a processing section 20, a tactile communication device for the palm 30, and a set of tactile components for the wrist 35.
  • The processing section 20 receives wireless signals and data from external sources and generates pin activation signals based on the data to be communicated to a user 10.
  • The processing section 20 is incorporated into a visual interface component 50 which resembles a watch face.
  • The watch face component 50 can also act as a control component for the wearable apparatuses 30 and 35 as well as perform more rudimentary functions such as serving as a wrist watch (e.g., a smartwatch).
  • The watch component can be a touchscreen interface for a computer system that enables the user to adjust the settings on the tactile communicator and operate various installable applications; this component can also be operated by push buttons, by voice, or by any other means that enables the user to operate the apparatus.
  • The tactile communication components 35 receive pin activation signals which activate a plurality of pin combinations in a particular sequence to physically communicate data to the user 10 through a tactile projection and/or haptic mechanism (e.g., vibration).
  • FIG. 13B illustrates a rear view of the tactile communication apparatus 2 .
  • The tactile communication components 35 are assembled to communicate information to the user 10 via the wrist; in the example presented in FIG. 13B, the components are shown assembled together in a row to make contact with the inside section of the user's wrist, as shown in FIG. 13A.
  • FIGS. 14A and 14B depict a single tactile communicator component 35 and the way it functions. As illustrated, three pins 45 are situated in a solid casing; FIGS. 14A and 14B illustrate the mechanical pins 45 rested in the casing and raised out of it, respectively. In the example illustrated in FIGS. 14A and 14B, three pins 45 are depicted in a line; however, that number is non-limiting and more pins can be included in the case component 35, where every pin in the case can function independently of the others. Each individual component is attachable to a strap-like component that enables all of the parts of the apparatus 20, 35, 50 to assemble together.
  • In FIG. 13B there are ten case components 35, each containing the three raisable pins 45 depicted in FIGS. 14A and 14B, situated next to each other to make contact with the inside section of the wrist.
  • The assembly of the components 20, 35, 50 is not limited to the depiction in FIGS. 13A and 13B and is not limited to making contact with just the inside of the wrist or to ten cases containing raisable pins; the assembly can include casings containing differing numbers of raisable pins to make contact with the outside section of the wrist as well as the inside section, for example.
  • The tactile case components 35 communicate information in a number of ways; one is a 'wave' fashion in which the pins 45 on the wrist relay the feeling of a 'rippling' effect.
  • FIGS. 15A and 15B show how a ripple effect would occur.
  • FIG. 15B shows three different levels to which the pins 45 can raise out from the case 35; the more the pins rise, the greater the intensity of the tactile effect on the user.
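  • A minimal sketch of how such a 'wave' could be swept across the cases, assuming ten wrist cases and three raise levels as in FIGS. 13B and 15B; the set_case_level driver callback and the timing are hypothetical.

```python
import time

NUM_CASES = 10     # assumed: ten wrist case components 35 (FIG. 13B)
MAX_LEVEL = 3      # assumed: three raise levels (FIG. 15B)

def ripple(set_case_level, dwell_s=0.05):
    """Sweep a raised 'crest' along the row of cases to create a rippling feel."""
    for crest in range(NUM_CASES):
        for case in range(NUM_CASES):
            # the crest case is raised fully; its neighbours are raised less
            level = max(0, MAX_LEVEL - abs(case - crest))
            set_case_level(case, level)
        time.sleep(dwell_s)
    for case in range(NUM_CASES):
        set_case_level(case, 0)    # drop every pin once the wave has passed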
  • FIGS. 16A and 16B illustrate a musical equalizer display in the wrist tactile communicator. A user who listens to music or sound through a digital music or sound player may wish to feel rhythms, as depicted in FIG. 16A, in their wrist; the tactile apparatus 35 would act as a tactile equalizer, relaying the beats and melodies of the music to the user in a tactile format as depicted in #1, #2, and #3 of FIG. 16B.
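  • For illustration, a sketch of how per-band audio magnitudes might be quantised into pin raise levels for each wrist case; the band analysis, case count, and level range are assumptions rather than details from the application.

```python
def equalizer_levels(band_magnitudes, num_cases=10, max_level=3):
    """Map audio band magnitudes in the range 0.0-1.0 onto a raise level per case."""
    levels = []
    for case in range(num_cases):
        # reuse the last band if fewer bands than cases are supplied
        mag = band_magnitudes[min(case, len(band_magnitudes) - 1)]
        levels.append(round(max(0.0, min(1.0, mag)) * max_level))
    return levels

# Example: a beat-heavy low end raises the first cases higher than the rest.
print(equalizer_levels([0.9, 0.7, 0.4, 0.2, 0.1, 0.1, 0.05, 0.0, 0.0, 0.0]))
```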
  • Another example of a method for the tactile communicator 35 is to relay information in a 'meter' format, so that pins positioned in a single case raise and hold to indicate a variable reading; should the variable level change (for example, a temperature or speed), the pins 45 in that case 35 would drop and the pins in a second case would raise and hold to indicate the change. Examples would be to relay such quantities as speed, temperature, or tuning.
  • An example of a use for such a method of communication can be to relay a car's speed and compare it to the speed limit of the road the car is travelling on; FIG. 17 illustrates an application in which the wrist tactile display 35 serves as a speedometer.
  • This embodiment would function by downloading an application on a portable communication device such as a smartphone to synchronize with a GIS mapping system such as Google Maps, retrieve road speed limit data from an online database, match the speed of the car to the speed limit, and relay that information in a meter tactile format so that the driver knows whether the car is surpassing the speed limit.
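  • The speed-versus-limit 'meter' could be reduced to choosing which case to raise, as in the sketch below; the centre-equals-limit layout and the 5 km/h step per case are assumptions made only for the example.

```python
def speed_meter_case(speed_kmh, limit_kmh, num_cases=10, step_kmh=5):
    """Return the index of the wrist case to raise: the centre case means the car
    is at the speed limit, cases past the centre mean the limit is being exceeded."""
    centre = num_cases // 2
    offset = round((speed_kmh - limit_kmh) / step_kmh)
    return max(0, min(num_cases - 1, centre + offset))

# Example: 70 km/h on a 50 km/h road raises a case well past the centre.
print(speed_meter_case(70, 50))   # -> 9 with the assumed parameters
```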
  • FIG. 18 demonstrates another example of a 'meter' application; it illustrates a method of using the wrist tactile display 35 for tuning a string instrument, so that the center cases of pins, e and f, indicate that the string is in tune and the other cases of pins indicate whether the string is sharp or flat.
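  • As a sketch of the tuning application, the measured pitch deviation could be quantised to a case index with the centre meaning in tune; the cents-per-case resolution is an assumption, since FIG. 18 only fixes which cases mean in tune, sharp, or flat.

```python
import math

def tuner_case(frequency_hz, target_hz, cents_per_case=10, num_cases=10):
    """Map a measured string frequency to a wrist case index; centre = in tune,
    higher indices = sharp, lower indices = flat (layout assumed)."""
    cents = 1200 * math.log2(frequency_hz / target_hz)   # deviation in cents
    centre = num_cases // 2
    offset = round(cents / cents_per_case)
    return max(0, min(num_cases - 1, centre + offset))

# Example: a slightly flat A string (438 Hz against 440 Hz) lands just below centre.
print(tuner_case(438.0, 440.0))
```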
  • FIG. 19 shows an example where the wrist tactile components 35 clarify the word being relayed through the palm communicator 30; in this example the word 'HEY' is being relayed to the user. While the tactile device 30 communicates the letters of the text in the form of Braille to the palm of the hand, the tactile components 35 can indicate such information as how many characters are in the word and possibly how many words are in the paragraph.
  • In #1 of FIG. 19, all of the pins in the first case 'a' on the wrist communicator 35 raise, while only the middle pins in cases 'b' and 'c' raise; this indicates to the user that there are three letters in the word.
  • The pins would then change to show the second letter of the word, the letter 'E' (#2). While the tactile communicator in the palm 30 displays the letter 'E', the tactile communication component 35 raises the second case of pins 'b' to indicate that 'E' is the second letter of the three-letter word; the process repeats for #3. Using the tactile communication components 35 enables a better understanding of the text information being communicated through the palm device 30.
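  • A rough sketch of how the palm and wrist outputs could be coordinated for the 'HEY' example; the wrist encoding (full raise for the current letter, partial raise for the others) and both driver callbacks are assumptions.

```python
def relay_word(word, show_braille_letter, raise_wrist_cases):
    """Show each letter on the palm device 30 while the wrist components 35
    indicate the word length and which letter is currently being shown."""
    length = len(word)
    for index, letter in enumerate(word):
        show_braille_letter(letter)        # palm device 30 presents the Braille cell
        # assumed encoding: the case of the current letter is raised fully (3),
        # the other cases up to the word length are raised partially (1)
        cases = {i: (3 if i == index else 1) for i in range(length)}
        raise_wrist_cases(cases)           # wrist components 35

# Example with print-based stand-ins for the hardware drivers:
relay_word("HEY", lambda ch: print("palm:", ch), lambda c: print("wrist:", c))
```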
  • The tactile communication components 35 can also have the ability to vibrate and to change temperature, heating (via a resistive element) or cooling (via a Peltier element) the individual pin cases, to enhance and diversify the alternative applications for the overall device.
  • An example of an application that could use the ability of the pin cases 35 to heat or cool is a compass-type application: a user can search for a location on a GIS application such as Google Maps; when the user points their arm or hand in the correct direction the wrist tactile components 35 heat, and when the wrist communicator is not pointed in the correct direction they cool.
  • The tactile communication components 35 have the ability to change temperature gradually (as per a user setting) to indicate the level of change, or nearly instantly if selected. When, for example, the correct direction is located with the aid of the temperature-changing function, the tactile components 35 can vibrate to indicate to the user that the correct direction has been found.
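  • A hedged sketch of the compass-style application described above: compare the arm's heading to the bearing of the searched-for location, heat when aligned, cool otherwise, and vibrate on alignment; the tolerance, scaling, and driver callbacks are all assumptions.

```python
def compass_feedback(heading_deg, bearing_to_target_deg, set_temperature, vibrate,
                     tolerance_deg=15):
    """Warm the wrist cases when pointing toward the target, cool them otherwise."""
    # signed angular error in the range -180..180 degrees
    error = (bearing_to_target_deg - heading_deg + 180) % 360 - 180
    if abs(error) <= tolerance_deg:
        set_temperature(+1.0)   # heat: the user is pointing the right way
        vibrate()               # confirm that the correct direction has been found
    else:
        # cool proportionally to how far off-target the user points (gradual change)
        set_temperature(-min(1.0, abs(error) / 180.0))
```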

Abstract

A tactile communication apparatus that includes a signal receiver configured to decode data received via a wireless signal, a tactile communication device containing a plurality of pins on one side, each pin configured to respectively move in both an outward direction and an inward direction to form a plurality of pin combinations based on a plurality of activation signals, and a communication processor configured to generate the plurality of pin activation signals based on the received data so as to convey the data to a user through the plurality of pin combinations of the tactile communication device.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • The present application is related to U.S. application Ser. No. 13/564,395, filed Aug. 1, 2012, the entire contents of which is incorporated herein by reference.
  • GRANT OF NON-EXCLUSIVE RIGHT
  • This application was prepared with financial support from the Saudi Arabian Cultural Mission, and in consideration therefore the present inventor(s) has granted The Kingdom of Saudi Arabia a non-exclusive right to practice the present invention.
  • BACKGROUND Field of the Disclosure
  • Embodiments described herein relate generally to an apparatus, method, and computer program product for tactile communication. More particularly, the embodiments described relate to an apparatus that can facilitate data communications both for users whose visual and auditory senses are otherwise occupied and for the visually impaired.
  • SUMMARY
  • According to an embodiment, there is provided a tactile communication apparatus that includes a signal receiver configured to decode data received via a wireless signal, a tactile communication device containing a plurality of pins on one side, each pin configured to respectively move in both an outward direction and an inward direction to form a plurality of pin combinations based on a plurality of activation signals, and a communication processor configured to generate the plurality of pin activation signals based on the received data so as to convey the data to a user through the plurality of pin combinations of the tactile communication device.
  • According to another embodiment, there is also provided a method of tactile communication.
  • The foregoing paragraphs have been provided by way of general introduction, and are not intended to limit the scope of the following claims. The described embodiments, together with further advantages, will be best understood by reference to the following detailed description taken in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A more complete appreciation of the present advancements and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings. However, the accompanying drawings and the exemplary depictions do not in any way limit the scope of the advancements embraced by the specification. The scope of the advancements embraced by the specification and drawings are defined by the words of the accompanying claims.
  • FIGS. 1A and 1B illustrate a front view and a rear view, respectively, of a tactile communication apparatus according to an exemplary embodiment.
  • FIG. 2 illustrates an ergonomic design of the tactile communication device according to an exemplary embodiment.
  • FIG. 3 illustrates a pin of the tactile communication device according to an exemplary embodiment.
  • FIGS. 4A and 4B illustrate Braille code and the corresponding output of the tactile communication device according to an exemplary embodiment.
  • FIG. 5 illustrates the output of relative directional data via the tactile communication device according to an exemplary embodiment.
  • FIG. 6 is a block diagram of a tactile communication apparatus according to an exemplary embodiment.
  • FIG. 7 is a block diagram of a tactile communication apparatus including navigational features according to an exemplary embodiment.
  • FIG. 8 is a block diagram of a tactile communication apparatus including detection features according to an exemplary embodiment.
  • FIG. 9 is a sequence diagram illustrating the communication features of the tactile communication apparatus according to an exemplary embodiment.
  • FIG. 10 is a sequence diagram illustrating the navigational features of the tactile communication apparatus according to an exemplary embodiment.
  • FIG. 11 is a sequence diagram illustrating the detection features of the tactile communication apparatus according to an exemplary embodiment.
  • FIG. 12 is a hardware block diagram of the tactile communication apparatus according to an exemplary embodiment.
  • FIGS. 13A and 13B illustrate a front view and a rear view, respectively, of a tactile communication apparatus with an additional tactile wrist communicator according to an exemplary embodiment.
  • FIGS. 14A and 14B illustrate an individual component with an internal tactile component rested and suspended, respectively; it represents one of several tactile pin components for the wrist.
  • FIG. 15A illustrates a 'wave' form of communicating; it demonstrates how the pins raise and drop to relay a pulse feeling in the inside section of the wrist. FIG. 15B shows the levels to which the pins can raise.
  • FIGS. 16A and 16B show how wrist tactile pins can relay tactile information in the form of a musical equalizer.
  • FIGS. 17A and 17B illustrate how tactile information relayed to the wrist can be used to communicate the speed of the user against the speed limit of the road the user is travelling on.
  • FIGS. 18A and 18B show a function of tactile wrist communication whereby the tuning of an instrument can be determined by feeling raisable pins on the inside section of the wrist.
  • FIG. 19 demonstrates how tactile communication to the wrist can work alongside relaying tactile messages to the palm by indicating the number of letters in a word to the inside section of the wrist while communicating the letters to the palm.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • Tremendous developments have occurred with mobile communication devices in a very short time frame. However, they have been dependent on the users' visual or auditory senses to interact with them, often causing the user to pause whatever they are doing to use the device. Tactile communication allows users to feel the information, enabling less disruption to their physical activities in certain cases.
  • The present inventor recognized the need to improve the way information can be communicated discreetly to individuals without interruption to their visual and auditory activities, and to assist navigation and communication while they are in motion. With the way computer technology is advancing and changing people's lives, adequate methods of communication need to be established to tackle these issues, especially in mobile situations.
  • The tactile communication apparatus is designed to communicate data such as simple text in a physical or tactile manner. Text can be communicated, for example, in the form of Braille, and directions in the form of directional tactile indication. The tactile communication apparatus combines a hardware unit with computer software. It is designed to be versatile in the sense that it can work with several software programs as well as wired and wireless networks. Along with simple text, directional communications, and frequency levels, the tactile communication apparatus is able to interact with the surrounding environment to communicate additional data such as tag detection, GPS navigation, object recognition and identification, obstacle detection, etc. It can also communicate frequency levels such as a music tactile equalizer, heart rate, blood pressure, impact response (for such applications as video games), etc.
  • The tactile communicator has also been ergonomically designed to tackle many mobility communication issues highlighted in the user research. It introduces a new way of communication to mobile smartphone users in such a way that their visual and auditory senses are not interrupted. The communication method is discreet, light, easy to use, unrestrictive, and very useful for navigation in an outdoor mobile environment.
  • Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout several views.
  • FIG. 1A illustrates a front view of a tactile communication apparatus 1 according to an exemplary embodiment. The tactile communication apparatus 1 consists of two main parts: a processing section 20 and a tactile communication device 30. The processing section 20 receives signals and data from external sources and generates pin activation signals based on the data to be communicated to a user 10. The tactile communication device 30 receives pin activation signals and activates a plurality of pin combinations in a particular sequence to physically communicate data to the user 10 through a tactile projection and/or haptic mechanism (e.g., vibration). FIG. 1B illustrates a rear view of the tactile communication apparatus 1.
  • The processing section 20 receives data from any number of wired or wireless inputs. Such wired inputs may be received via a network cable, fiber optic cable, USB cable, FireWire cable, or the like. Wireless inputs may be received from any form of wireless network such as WiFi, cellular, or near field communication systems and associated protocols. Once a signal from a wired or wireless network is received by the processing section 20, it is processed by the appropriate processing portion to decode the signal into useful information and/or data. Activation signals for the tactile communication device 30 are then generated based on the decoded information and/or data.
  • The tactile communication device 30 facilitates communication with the user 10 through both receiving data from the user 10 through a keypad 302 and transmitting data to the user 10 through a set of pins 304. Information received by the processing section 20 is processed, and activation signals for the set of pins 304 are generated and sent to the tactile communication device 30. The tactile communication device 30 then activates the appropriate sequence of pins 304 to convey the information or data to the user 10 through a tactile indication.
  • The tactile communication device 30 is ergonomically designed, as illustrated in FIG. 2, which shows the rear section of the tactile communication device 30 corresponding to FIG. 1B, so as to comfortably and completely contour to the shape of the palm of the user's 10 hand. This allows a more efficient and effective method of tactile communication with the user 10 because the pins 304 of the tactile communication device 30 are more likely to come into contact with the user 10, and the user 10 is more likely to understand and recognize the sequence of pin activations from the tactile communication device 30.
  • The front section of the tactile communication device 30, as illustrated in FIG. 1A, is flat and contains a keypad 302. The keypad 302 can contain any number of keys in any number of configurations. The user 10 can use the keypad 302 as an interface to initiate communication or respond to received communication. For a non-limiting example, the keypad 302 can be of a similar configuration to that of a standard or mobile telephone alpha/numeric keypad where the first key corresponds to 1 or ABC, the second key corresponds to 2 or DEF, etc. When the user 10 wants to input a message that starts with the letter “B,” the user will press the first key two times to indicate that the second character of the first key is desired to be input. In a second non-limiting example, the tactile communication device 30 or processing section 20 can be equipped with software where the user 10 presses keys containing the desired letters once and the software will infer the desired word/phrase based on the keypad 302 combinations pressed by the user 10.
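  • As an illustration of the multi-tap entry described above, a small decoder is sketched below; the key-to-letter layout follows the example in the text (first key = ABC, second key = DEF) and the rest of the table is an assumption.

```python
# Assumed layout following the example above: the first key carries ABC, the second DEF, ...
KEY_LETTERS = ["ABC", "DEF", "GHI", "JKL", "MNO", "PQRS", "TUV", "WXYZ"]

def decode_multitap(presses):
    """Decode (key_index, press_count) pairs; pressing the first key twice gives 'B'."""
    letters_out = []
    for key_index, count in presses:
        letters = KEY_LETTERS[key_index - 1]
        letters_out.append(letters[(count - 1) % len(letters)])
    return "".join(letters_out)

print(decode_multitap([(1, 2), (2, 2)]))   # -> "BE"
```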
  • The pins 304 of the tactile communication device 30 can be any form of mechanism that can convey a tactile indication, such as a solenoid 300 illustrated in FIG. 3. In an exemplary embodiment, the solenoid 300 contains a plunger 302, a pin 304, a coil 306, an end stop 308, a frame 310, and a set of permanent magnets 312. A pin activation signal generated at the processing section 20 actuates the solenoid 300 via the permanent magnets 312 and the coil 306. This causes the plunger 302 to push the pin 304 in an outward direction until the pin reaches the end stop 308. When the pin 304 is moving in an outward direction, it comes into contact with the user 10, providing a tactile indication. When the activation signal is no longer present, the plunger 302 returns to its initial state and the pin 304 moves in an inward direction. When the pin 304 is moving in an inward direction, it comes out of contact with the user 10 and no longer provides a tactile indication. Through the use of multiple solenoids, combinations of tactile indications can be created by activating the multiple solenoids through specific sequences so as to physically communicate data and information.
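  • A minimal, hypothetical driver sketch for the solenoid pins described above; the set_coil callback stands in for whatever GPIO or driver circuitry energises the coil 306, and the hold times are illustrative.

```python
import time

def pulse_pin(set_coil, hold_s=0.2):
    """Energise one solenoid coil so the plunger pushes the pin outward to the end
    stop, hold briefly so the user feels it, then release so the pin retracts."""
    set_coil(True)      # coil energised: pin 304 moves outward (tactile contact)
    time.sleep(hold_s)
    set_coil(False)     # coil released: pin 304 moves inward (contact removed)

def present_combination(set_pin_coil, active_pins, hold_s=0.3):
    """Raise a specific combination of pins at once, then release them all."""
    for pin in active_pins:
        set_pin_coil(pin, True)
    time.sleep(hold_s)
    for pin in active_pins:
        set_pin_coil(pin, False)
```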
  • The sequence in which the pins 304 are activated can correspond to any form of code or language understood by the user 10 such as Braille which is commonly used by the blind or people with limited visual capability.
  • FIG. 4A illustrates the letters of the alphabet and the corresponding Braille code. The user 10 will recognize letters based on a specific pin 304 combination based on the Braille code and be able to spell out words over a series of pin 304 combinations. FIG. 4B illustrates the pin 304 combinations presented to the user 10 by the tactile communication device 30 as discussed above. It should be noted that in the non-limiting illustration in FIG. 4B, the left-most column and the right-most column are used to present characters according to the Braille code, but any configuration that is easily understandable by the user 10 may be used.
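  • A short sketch of a character-to-dot lookup using standard six-dot Braille (dots 1-3 in the left column, 4-6 in the right column); only a handful of letters are listed, and the assignment of dots to physical pins 304 is an assumption.

```python
# Standard 6-dot Braille patterns for a few letters (dot numbers, not pin numbers).
BRAILLE_DOTS = {
    "A": {1}, "B": {1, 2}, "C": {1, 4}, "D": {1, 4, 5}, "E": {1, 5},
    "H": {1, 2, 5}, "Y": {1, 3, 4, 5, 6},
}

def pins_for_text(text):
    """Yield, per character, the set of Braille dots (pins) to raise for that cell."""
    for ch in text.upper():
        yield ch, BRAILLE_DOTS.get(ch, set())

for ch, dots in pins_for_text("HEY"):
    print(ch, sorted(dots))
```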
  • FIG. 4B also illustrates how direction information is passed to the user 10 based on cardinal direction indications such as North, South, East, West, etc. When communicating directional information, the tactile communication apparatus 1 can guide the user 10 to any specified target or location using cardinal directions based on the pin 304 combinations illustrated in FIG. 4B.
  • Further, FIG. 5 illustrates how direction information is passed to the user 10 as a relative direction indication based on a bearing relative to the user's 10 current direction.
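  • For illustration, a relative direction could be produced by quantising the difference between the target bearing and the user's heading into a few sectors; the sector count and labels are assumptions, not the pin encodings of FIG. 5.

```python
def relative_direction(target_bearing_deg, user_heading_deg, sectors=8):
    """Quantise the bearing-minus-heading angle into one of eight relative sectors."""
    labels = ["ahead", "ahead-right", "right", "behind-right",
              "behind", "behind-left", "left", "ahead-left"]
    diff = (target_bearing_deg - user_heading_deg) % 360
    return labels[round(diff / (360 / sectors)) % sectors]

# Example: a target bearing of 90 degrees while heading 45 degrees is ahead-right.
print(relative_direction(90, 45))
```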
  • FIG. 6 is a block diagram of an exemplary tactile communication apparatus 1. Data and information are sent to the tactile communication apparatus 1 via a wireless network 40. It should also be noted that data and information can also be sent to the tactile communication apparatus 1 via a wired network. The processing section 20 receives the data signal from the wireless network 40 at the signal receiver 204. The signal receiver 204 decodes the data signal and sends the data to the communication processor 202. The communication processor 202 parses the data and generates pin activation signals that are sent to the tactile communication device 30, which physically communicates the data to the user 10 via the pins 304. Data and information can also be generated by the user 10 at the tactile communication device 30, via the keypad 302, and sent to the communication processor 202. The communication processor 202 will process the inputs received from the tactile communication device 30 and construct a formatted data or information message. The message will be sent to the signal receiver 204, which will generate a data packet based on the medium over which the message will be transmitted and then transmit the data packet to the wired or wireless network 40.
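  • The data flow of FIG. 6 can be summarised as two short pipelines, sketched below with assumed callback stand-ins for the signal receiver 204, communication processor 202, and tactile communication device 30.

```python
def handle_incoming(raw_signal, decode, parse, to_pin_signals, drive_pins):
    """Receive path: decode the signal, parse the data, generate pin activation
    signals, and drive the pins 304 (all callbacks are illustrative stand-ins)."""
    data = decode(raw_signal)               # signal receiver 204
    message = parse(data)                   # communication processor 202
    for activation in to_pin_signals(message):
        drive_pins(activation)              # tactile communication device 30

def handle_outgoing(key_presses, compose, packetise, transmit):
    """Transmit path: compose a message from keypad input, packetise it for the
    network medium, and send it over the wired or wireless network 40."""
    message = compose(key_presses)          # communication processor 202
    transmit(packetise(message))            # signal receiver 204 and network 40
```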
  • In another embodiment, the tactile communication device 30 can also include a vibration unit 306 to provide an additional means of tactile communication. The vibration unit 306 may be activated to provide general or non specific indication or acknowledgement of an event such as confirmation that a message has been sent, indication that a message has been received, or to notify the user 10 of an error.
  • In another embodiment, FIG. 7 is a block diagram of a tactile communication apparatus 1 that can provide position and navigation functionality. In this embodiment, the processing section 20 also contains a GPS unit 208 that receives position data from a satellite network 50. The GPS unit calculates a current position based on the received position data and then sends the current position to the navigation processor 206. The navigation processor 206 can either relay the current position to the user 10 via the pins 304, or update navigation data to a predetermined location or object and provide directional information to the user 10 via the pins 304 based on a current position. Directional information, for example, can be provided to the user 10 via cardinal direction, as illustrated in FIG. 4B, or relative direction, as illustrated in FIG. 5. The user 10 can input a desired destination or object to the navigation processor 206, via the keypad 302, for which the navigation processor 206 will calculate directional information.
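  • To make the navigation step concrete, the sketch below computes an initial bearing from the current GPS position to the destination using the standard great-circle formula; this is a common calculation offered for illustration, not one recited by the application.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from the current position to the destination, in degrees
    clockwise from North, from which cardinal or relative indications can be derived."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

# Example: the bearing from central London toward Paris is roughly 148 degrees (SSE).
print(round(bearing_deg(51.5074, -0.1278, 48.8566, 2.3522)))
```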
  • In another embodiment, FIG. 8 is a block diagram of a tactile communication apparatus 1 that can provide object recognition and identification functionality. In this embodiment, the processing section 20 also contains a detection unit 212 that receives images or sensor data 60 of the surrounding environment of the user 10. Images or sensor data 60 can be obtained from an appropriate sensing mechanism such as a camera, video recorder, motion detector, or radar or sonar device. Data from one of these devices is received by the detection unit 212, where objects and features contained within the data can be identified and stripped or isolated. Object and feature data are then sent to the detection processor 210 where they are processed and compared to known or predetermined objects. If a match is made and an object or feature is recognized, the detection processor 210 will notify the user 10 via the tactile communication device 30 of the recognition.
  • In a non-limiting example of the above described process, the user 10 may wish to locate a nearby object, such as a digital camera. The user 10 would enter an appropriate code into the keypad 302, such as “DC” for digital camera, to indicate to the tactile communication apparatus 1 that the user would like to locate this object. The tactile communication apparatus 1 would then receive image or sensor data 60 from the surrounding environment from an appropriate sensor (not shown), which can either be attached to the tactile communication apparatus 1 or be a separate device. Image and sensor data 60 would then be fed into the detection unit 212 for image processing. Features and objects located within the image and sensor data would then be sent to the detection processor 210, which would parse the features and objects until the digital camera was recognized.
  • Further, the detection processor 210 could work in conjunction with the navigation processor 206 so that, once a desired object has been recognized or found, the navigation processor 206 could guide the user 10 to the object using the pins 304 of the tactile communication device 30.
  • FIG. 9 is a sequence diagram of a tactile communication apparatus 1 according to an exemplary embodiment. Initially, the tactile communication apparatus 1 may be standing by at step S100 to receive a wireless signal via the signal receiver 204. When a signal is received by the signal receiver 204 at step S102, the signal is decoded or demodulated based on the type of network and protocol over which the signal was received. The signal is then processed at the communication processor 202 to produce the data which is to be communicated to the user 10 at step S104. The communication processor 202 generates pin activation signals at step S106 and transmits the pin activation signals to the tactile communication device 30.
  • Once the tactile communication device 30 receives the pin activation signals at step S108, the tactile communication device 30 activates the appropriate pins in a specific sequence according to the pin activation signals so as to communicate the received data to the user 10. When the data has been communicated to the user 10 via the tactile communication device 30, the user 10 may or may not provide a response to the data to the tactile communication device 30 via the keypad 302 at step S110. If no user response is detected at step S110, the tactile communication apparatus 1 returns to a standby state at step S112. If a user 10 response has been detected at step S110 via the keypad 302, the communication processor 202 receives the data from the tactile communication device 30 at step S114. The received data from user 10 is processed at step S116 so as to transmit the data via a wireless signal. Once the data has been encoded or modulated via the appropriate means based on the network, the data is transmitted over the wireless network at step S118. Finally, the tactile communication apparatus 1 returns to the standby state at step S112.
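  • The FIG. 9 sequence can be condensed into the following sketch of a single receive/display/reply cycle; the I/O helper callables are placeholders rather than elements of the apparatus.

    # Placeholders: receive_signal() returns bytes or None, display_on_pins()
    # drives the device, read_keypad() returns a user reply or None, transmit()
    # sends bytes back over the network.
    def communication_cycle(receive_signal, display_on_pins, read_keypad, transmit):
        raw = receive_signal()               # S100/S102: standby until a signal arrives
        if raw is None:
            return "standby"
        message = raw.decode("utf-8")        # S104: decode/demodulate and process
        display_on_pins(message)             # S106/S108: pin activation at the device
        reply = read_keypad()                # S110: optional user response via the keypad
        if reply:
            transmit(reply.encode("utf-8"))  # S114-S118: encode and send back
        return "standby"                     # S112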
  • According to another embodiment, FIG. 10 is a sequence diagram of a tactile communication apparatus 1 illustrating the features of providing navigational data to a user 10. Initially, the tactile communication apparatus 1 may be standing by at step S200 to receive a desired destination or location from the user 10. Once a desired location or destination has been received from the user at step S202, the navigation processor 206 processes the destination data to produce navigation data at step S204. The navigation processor 206 also receives GPS data from the GPS unit 208. Once the navigational data has been generated by the navigation processor 206 at step S204, the navigation processor 206 generates pin activation signals at step S206 to communicate the navigation data to the user 10. Activation signals are received at the tactile communication device 30 at step S208, which initiates the tactile communication of the navigation data to the user 10. While the navigational data is being communicated to the user 10, the user may respond or continue to follow the navigation data at step S210. If the user provides no response and continues to follow the navigation data at step S210, a further determination will be made, based on the continued supply of GPS data from the GPS unit 208, as to whether the user 10 has reached the desired destination at step S212. If the desired destination has not yet been reached at step S212, the tactile communication apparatus 1 continues to process navigation data at step S204 to continue to guide the user 10 to the desired destination. If the desired destination has been determined to be reached at step S212, the tactile communication apparatus returns to a standby state at step S214. Upon receiving the tactile indication at step S208, the user 10 may respond to provide updated destination information or corrections at step S210. If a user 10 response has been detected at step S210, the navigation processor 206 receives the user input at step S216 and then processes that information at step S218 to update or correct the navigational data. The tactile communication apparatus 1 then determines if the new destination has been reached at step S212. If the new destination has not yet been reached at step S212, the tactile communication apparatus 1 continues to process navigation data at step S204. Otherwise, the tactile communication apparatus 1 enters a standby state once the new destination has been reached at step S214.
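  • A compact sketch of the FIG. 10 guidance loop is given below; the arrival threshold and the helper callables are illustrative assumptions.

    ARRIVAL_THRESHOLD_M = 10.0  # assumed arrival radius

    def guide_to_destination(get_gps_fix, destination, distance_m, direction_to,
                             relay_to_pins, read_keypad):
        """Loop of steps S204-S218: guide, accept corrections, stop on arrival."""
        while True:
            position = get_gps_fix()                            # GPS unit 208
            if distance_m(position, destination) < ARRIVAL_THRESHOLD_M:
                return "arrived"                                # S212 -> S214 (standby)
            relay_to_pins(direction_to(position, destination))  # S204-S208
            correction = read_keypad()                          # S210/S216: optional update
            if correction:
                destination = correction                        # S218: corrected destination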
  • According to another embodiment, FIG. 11 is a sequence diagram of a tactile communication apparatus 1 providing the additional features of object identification and recognition. Initially, the tactile communication apparatus 1 may be standing by at step S300 to receive an object identification code from the user 10. When an object identification code is received from the user 10 at step S302, via the keypad 302, the detection processor 210 receives sensor data 60 of a plurality of objects to be identified via the detection unit 212. Sensor data received by the detection unit 212 can be in any form capable of being processed by the detection processor 210, such as image information, motion information, or radar or sonar information. Once the surrounding objects about the tactile communication apparatus 1 have been detected at step S304, the detection processor 210 processes and identifies objects and features contained within the sensor data at step S306. Once the sensor data has been processed at step S306, the detection processor 210 determines if an identified object or feature corresponds to the object identification code received from the user 10 at step S302. If no recognized object matches the object identification code at step S308, the tactile communication device 30 may indicate an error or a no-match indication at step S310 by activating a vibration unit 306 in the tactile communication device 30. Once the user 10 is notified that no matches have been detected, the tactile communication apparatus 1 will return to a standby state at step S312. If at step S308 a recognized object has been determined to match the object identification code, the detection processor 210 may work in conjunction with the navigation processor 206 to generate directional data for the user to navigate to the recognized object at step S314. Navigation data to the recognized object will be communicated to the user via the tactile communication device 30 at step S316. When the user 10 has been successfully guided to the desired object at step S318, the tactile communication apparatus returns to a standby state at step S312.
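  • The FIG. 11 flow can likewise be sketched as follows; the dictionary-based detection output and the helper callables are illustrative assumptions.

    def locate_object(user_code, detect_objects, vibrate, guide_to):
        """Steps S302-S318: match the user's code, vibrate on failure, else guide."""
        detections = detect_objects()       # detection unit 212 (S304), assumed to yield
                                            # {"code": ..., "position": ...} dictionaries
        match = next((d for d in detections if d["code"] == user_code), None)  # S306-S308
        if match is None:
            vibrate()                       # S310: no-match indication via vibration unit 306
            return "standby"                # S312
        guide_to(match["position"])         # S314-S318: joint detection/navigation guidance
        return "standby"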
  • Next, a hardware description of the tactile communication apparatus 1 according to exemplary embodiments is described with reference to FIG. 12. In FIG. 12, the tactile communication apparatus 1 includes a CPU 500 which performs the processes described above. The process data and instructions may be stored in memory 502. These processes and instructions may also be stored on a storage medium disk 504, such as a hard disk drive (HDD) or a portable storage medium, or may be stored remotely. Further, the claimed advancements are not limited by the form of the computer-readable media on which the instructions of the inventive process are stored. For example, the instructions may be stored on CDs, DVDs, in FLASH memory, RAM, ROM, PROM, EPROM, EEPROM, a hard disk, or any other information processing device with which the tactile communication apparatus 1 communicates, such as a server or computer.
  • Further, the claimed advancements may be provided as a utility application, background daemon, or component of an operating system, or combination thereof, executing in conjunction with CPU 500 and an operating system such as Microsoft Windows 7, UNIX, Solaris, LINUX, Apple MAC-OS and other systems known to those skilled in the art.
  • CPU 500 may be a Xeon or Core processor from Intel of America or an Opteron processor from AMD of America, or may be another processor type that would be recognized by one of ordinary skill in the art. Alternatively, the CPU 500 may be implemented on an FPGA, ASIC, PLD, or using discrete logic circuits, as one of ordinary skill in the art would recognize. Further, CPU 500 may be implemented as multiple processors cooperatively working in parallel to perform the instructions of the inventive processes described above.
  • The tactile communication apparatus 1 in FIG. 12 also includes a signal receiver 204, such as an Intel Ethernet PRO network interface card from Intel Corporation of America, for interfacing with the network 40. As can be appreciated, the network 40 can be a public network, such as the Internet, or a private network, such as a LAN or WAN, or any combination thereof, and can also include PSTN or ISDN sub-networks. The network 40 can be wired, such as an Ethernet network, or wireless, such as a cellular network including EDGE, 3G, and 4G wireless cellular systems. The wireless network can also be WiFi, Bluetooth, or any other known form of wireless communication.
  • The tactile communication apparatus 1 further includes a display controller 508, such as an NVIDIA GeForce GTX or Quadro graphics adaptor from NVIDIA Corporation of America, for interfacing with display 510, such as a Hewlett Packard HPL2445w LCD monitor. A general purpose I/O interface 512 interfaces with the keypad 302 as well as a touch screen panel 516 on or separate from display 510. The general purpose I/O interface 512 also connects to the plurality of pins 304.
  • A sound controller 520, such as a Sound Blaster X-Fi Titanium from Creative, is also provided in the tactile communication apparatus 1 to interface with speakers/microphone 522, thereby providing sounds and/or music. The speakers/microphone 522 can also be used to accept dictated words as commands for controlling the tactile communication apparatus 1 or to audibly provide location and/or navigation information to the user 10.
  • The general purpose storage controller 524 connects the storage medium disk 504 with communication bus 526, which may be an ISA, EISA, VESA, PCI, or similar, for interconnecting all of the components of the tactile communication apparatus 1. A description of the general features and functionality of the display 510, as well as the display controller 508, storage controller 524, network controller 506, and sound controller 520 is omitted herein for brevity as these features are known.
  • FIG. 13A illustrates a front view of a tactile communication apparatus 2 according to an exemplary embodiment. Unlike FIG. 1A, where the tactile communication apparatus 1 included two main parts (a processing section 20 and a tactile communication device 30), this embodiment has three: a processing section 20, a tactile communication device 30 for the palm, and a set of tactile components 35 for the wrist. The processing section 20 receives wireless signals and data from external sources and generates pin activation signals based on the data to be communicated to a user 10. In this embodiment, the processing section 20 is incorporated into a visual interface component 50 which resembles a watch face. The watch face component 50 can also act as a control component for the wearable apparatuses 30 and 35, as well as perform more rudimentary functions such as serving as a wrist watch (e.g., a smartwatch). In this example, the watch component can provide a touchscreen interface for a computer system that enables the user to adjust the settings of the tactile communicator and operate various installable applications; this component can also be operated by push buttons, by voice, or by any other means that enables the user to operate the apparatus. The tactile communication components 35 receive pin activation signals which activate a plurality of pin combinations in a particular sequence to physically communicate data to the user 10 through a tactile projection and/or haptic mechanism (e.g., vibration). FIG. 13B illustrates a rear view of the tactile communication apparatus 2.
  • The tactile communication components 35 are assembled to communicate information to the user 10 via the wrist; in the example presented in FIG. 13B, the components are assembled together in a row to make contact with the inside section of the user's wrist, as shown in FIG. 13A. FIGS. 14A and 14B depict a single tactile communicator component 35 and the way it functions. As illustrated, three pins 45 are situated in a solid casing. FIGS. 14A and 14B illustrate the mechanical pins 45 rested in the casing and raised out of it, respectively. In the example illustrated in FIGS. 14A and 14B, three pins 45 are depicted in a line; however, that number is non-limiting and more pins can be included in the case component 35, where every pin in the case can function independently of the others. Each individual component is attachable to a strap-like component that enables all of the parts of the apparatus 20, 35, 50 to be assembled together.
  • In FIG. 13B there are ten case components 35, each containing three raisable pins 45 as depicted in FIGS. 14A and 14B, each situated next to the others to make contact with the inside section of the wrist. The assembly of the components 20, 35, 50 is not limited to the depiction in FIGS. 13A and 13B, and is not limited to making contact with just the inside of the wrist or to ten cases containing raisable pins; the assembly can include, for example, casings containing differing numbers of raisable pins that make contact with the outside section of the wrist as well as the inside section.
  • The tactile case components 35 communicate information in a number of ways; one is a 'wave' fashion, such that the pins 45 on the wrist relay the feeling of a 'rippling' effect. FIGS. 15A and 15B show how a ripple effect would occur. FIG. 15B shows three different levels to which the pins 45 raise out from the case 35; the more the pins rise, the greater the intensity of the tactile effect on the user. In this example, the pins in the center cases e and f of #1 in FIG. 15A fully rise; in #2 they drop to 75% height as the next set of pins, d and g, rise in the adjacent cases; then the next set (c and h) rise while d and g drop to 75% and e and f drop to 50%. The process continues through #6, creating a ripple feeling as depicted in FIG. 15A. This effect can be used to relay such things as a heartbeat when a user is participating in a sport activity and may want to feel their heart or pulse rate on their wrist, or when a user is playing a computer game and synchronizes the apparatus to relay such things as the 'impact' of, for example, a hit or the health of a character in the game.
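  • The ripple sequence can be generated programmatically; the sketch below follows the worked example (100%, 75%, 50%, retracted) for ten cases labelled a through j, while the frame timing is an illustrative assumption.

    CASES = list("abcdefghij")
    CENTRE = 4.5                               # midpoint between cases e and f
    HEIGHT_BY_AGE = {0: 1.0, 1: 0.75, 2: 0.5}  # newest ring fully raised, older rings sink

    def ripple_frames(steps=6):
        """Yield one {case: height} frame per step, radiating outward from e and f."""
        for step in range(steps):
            frame = {}
            for i, case in enumerate(CASES):
                ring = int(abs(i - CENTRE) - 0.5)  # 0 for e/f, 1 for d/g, 2 for c/h, ...
                age = step - ring
                frame[case] = HEIGHT_BY_AGE.get(age, 0.0) if age >= 0 else 0.0
            yield frame

    for n, frame in enumerate(ripple_frames(), start=1):
        print(n, {c: frame[c] for c in "defg"})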
  • FIGS. 16A & 16B illustrate a musical equalizer display in the wrist tactile communicator for a user that listens to music or sound through a digital music or sound player and may wish to feel rhythms as depicted in FIG. 16A in their wrist; the tactile apparatus 35 would work to act as a tactile equalizer for relaying the beats and melodies of the music to the user in a tactile format as depicted in #1, #2 and #3 of FIG. 16B.
  • Another example of a method for the tactile communicator 35 is to relay information in a 'meter' format, so that pins positioned in a single case raise and hold to indicate a variable reading; should the variable level change (for example, a temperature or speed), the pins 45 in that case 35 would then drop and pins in a second case would raise and hold to indicate the variable change. Examples would be to relay such things as speed, temperature, or tuning. One use for such a method of communication is to relay a car's speed and compare it to the speed limit of the road the car is travelling on. FIG. 17 illustrates an application in which the wrist tactile display 35 serves as a speedometer; this embodiment would function by downloading an application on a portable communication device, such as a smartphone, to synchronize with a GIS mapping system such as Google Maps, retrieve road speed limit data from an online database, match the speed of the car to the speed limit, and relay that information in a meter tactile format so that the driver can know whether the car is surpassing the speed limit or not.
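  • The speedometer meter mode can be sketched as a mapping from the ratio of current speed to speed limit onto a case index; the 125%-of-limit full-scale point is an illustrative assumption.

    def speed_meter_case(speed_kmh, limit_kmh, num_cases=10):
        """Return the index of the single case whose pins rise and hold."""
        ratio = speed_kmh / float(limit_kmh)         # 1.0 means exactly at the limit
        index = int(ratio * (num_cases - 1) / 1.25)  # topmost case reached at 125% of the limit
        return min(max(index, 0), num_cases - 1)

    print(speed_meter_case(80, 100))   # under the limit -> a low case (5)
    print(speed_meter_case(130, 100))  # over the limit  -> the topmost case (9)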
  • FIG. 18 demonstrates another example of a 'meter' application; it illustrates a method of using the wrist tactile display 35 for tuning a string instrument, so that the center cases of pins e and f indicate the string is in tune and the other cases of pins indicate whether the string is either sharp or flat.
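  • A sketch of this tuner mode is given below; the cents scale, tolerance, and step size are illustrative assumptions.

    def tuner_case(cents_off, tolerance=5, cents_per_case=15):
        """cents_off < 0 means flat, > 0 means sharp; return the case letter(s) to raise."""
        if abs(cents_off) <= tolerance:
            return "ef"                                  # in tune: centre cases e and f
        steps = min(4, 1 + int((abs(cents_off) - tolerance) // cents_per_case))
        return chr(ord("e") - steps) if cents_off < 0 else chr(ord("f") + steps)

    print(tuner_case(0), tuner_case(-20), tuner_case(40))  # ef c i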
  • A feature of the tactile components 35 is that they can assist the tactile apparatus 30 in relaying text to the user. FIG. 19 shows an example where the wrist tactile components 35 clarify the word being relayed through the palm communicator 30; in this example the word 'HEY' is being relayed to the user. While the tactile device 30 communicates the letters of the text in the form of Braille to the palm of the hand, the tactile components 35 can indicate such information as how many characters are in the word and possibly how many words are in the paragraph. In FIG. 19 #1, all of the pins in the first case 'a' on the wrist communicator 35 raise, while only the middle pins in cases 'b' and 'c' raise; these indicate to the user that there are three letters in the word. Immediately after relaying the first letter of the word in #1, the pins change to show the second letter of the word, the letter 'E', in #2. While the tactile communicator in the palm 30 displays the letter 'E', the tactile communication component 35 raises the second case of pins 'b' to indicate that the letter 'E' is the second letter of the three-letter word; the process repeats for #3. Using the tactile communication components 35 enables a better understanding of the text information being communicated through the tactile device 30.
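  • The word-position aid of FIG. 19 can be sketched as follows; representing a wrist frame as a list of case/pin dictionaries is an illustrative assumption.

    def word_position_frames(word):
        """For each letter shown on the palm, describe the wrist cases: the case for
        the current letter raises all pins, the others raise only the middle pin."""
        for position, letter in enumerate(word):
            wrist = [{"case": chr(ord("a") + i),
                      "pins": "all" if i == position else "middle"}
                     for i in range(len(word))]
            yield letter, wrist

    for letter, wrist in word_position_frames("HEY"):
        print(letter, wrist)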
  • The tactile communication components 35 can also have the ability to vibrate and to change temperature, heating (via a resistive element) or cooling (via a Peltier element) the individual pin cases, to enhance and diversify the alternative applications for the overall device. An example of an application that could use the ability of the pin cases 35 to heat or cool is a compass-type application: a user can search for a location on a GIS application such as Google Maps; when the user points their arm or hand in the correct direction, the wrist tactile components 35 heat, and when the wrist communicator is not pointed in the correct direction, they cool. The tactile communication components 35 have the ability to change temperature gradually (per user setting), to indicate the level of change, or nearly instantly if so selected. When, for example, the correct direction is located with the aid of the temperature-changing function, the tactile components 35 can vibrate to indicate to the user that the correct direction has been found.
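  • The compass-style temperature feedback can be sketched as a function of the angular error between the arm's bearing and the target bearing; the angular thresholds are illustrative assumptions.

    def compass_feedback(arm_bearing_deg, target_bearing_deg, aligned_tolerance_deg=10):
        """Heat when pointing near the target bearing, cool when pointing away,
        and vibrate once the arm is aligned with the target direction."""
        error = abs((arm_bearing_deg - target_bearing_deg + 180) % 360 - 180)  # 0..180 degrees
        if error <= aligned_tolerance_deg:
            return {"temperature": "heat", "vibrate": True}
        return {"temperature": "heat" if error < 90 else "cool", "vibrate": False}

    print(compass_feedback(40, 45))    # nearly aligned -> heat and vibrate
    print(compass_feedback(200, 45))   # pointing away  -> cool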

Claims (13)

What is claimed is:
1. A tactile communication apparatus comprising:
a signal receiver configured to decode data received via a wireless signal;
a portable tactile communication device containing a plurality of pins on one side and configured to be detachably attached over a wrist of a user such that the one side opposes the wrist of the user without restriction of hand movement, each pin configured to reciprocally move in an outward direction and an inward direction to form a plurality of pin combinations that contact the wrist of the user in response to a plurality of pin activation signals activating at least a portion of the plurality of pins; and
a communication processor configured to generate the plurality of pin activation signals determined from the received data so as to convey the data tactilely to the user through the plurality of pin combinations of the portable tactile communication device.
2. The tactile communication apparatus of claim 1, wherein
subsets of the plurality of pins are arranged in tactile communication components, each subset being contained in a different tactile communication component, one tactile communication component being positioned next to another tactile communication component so as to convey multiple types of information to adjacent locations on the wrist of the user.
3. The tactile communication apparatus of claim 2, wherein
the one side of the portable tactile communication device containing the plurality of pins is ergonomically shaped to match at least an underside of the wrist of the user such that each of the plurality of pins, when moved in an outwards direction, comes into contact with the wrist of the user.
4. The tactile communication apparatus of claim 3, wherein
the communication processor is configured to activate the pins to rise by variable degrees to form pin combinations to convey one of a ripple effect and a pulse effect to the wrist of the user.
5. The tactile communication apparatus of claim 3, wherein
the communication processor is configured to activate the pins to raise by variable degrees to form pin combinations that tactilely relay an equalizer pattern to the wrist of the user.
6. The tactile communication apparatus of claim 3, wherein
the communication processor is configured to activate the pins to raise by variable degrees to form pin combinations that tactilely relay information, such as temperature, speed, and tuning, in a meter format.
7. The tactile communication apparatus of claim 2, wherein
the communication processor is configured to execute a downloadable application and provide output information from the application tactilely to the wrist of the user via the plurality of pins.
8. The wearable tactile communication apparatus of claim 2, wherein
the detection processor is included in at least one of a smartphone and a smartwatch and communicates wirelessly with the signal receiver.
9. A wearable tactile communication apparatus comprising:
a signal receiver configured to decode data received via a wireless signal;
a portable tactile communication device containing a plurality of pins on one side and configured to be detachably attached over a palm or wrist of a user such that the one side opposes the palm or wrist of the user without restriction of finger movement, the plurality of pins configured to controllably and reciprocatively move in an outward direction and an inward direction to form a plurality of pin combinations to be received in the palm or wrist of the user in response to a plurality of pin activation signals activating at least a portion of the plurality of pins; and
a communication processor configured to generate the plurality of pin activation signals determined from the received data so as to convey the data tactilely to the user through the plurality of pin combinations of the portable tactile communication device,
wherein said received data includes object movement data, and said plurality of pin activation signals corresponds with conveying content of the object movement data to a palm or wrist of the user.
10. The wearable tactile communication apparatus of claim 9, wherein
said object movement data includes automobile blind spot sensor data, said communication processor configured to generate pin activation signals that tactilely inform the user of a presence of an obstacle in a blind spot of an automobile driven by the user.
11. The wearable tactile communication apparatus of claim 9, wherein
said object movement data includes lane departure sensor data provided by a lane departure sensor, said communication processor configured to generate pin activation signals that tactilely inform the user of a lane departure of a vehicle driven by the user.
12. The wearable tactile communication apparatus of claim 9, wherein
said object movement data includes obstacle avoidance sensor data provided by an obstacle avoidance sensor in an automobile, said communication processor configured to generate pin activation signals that tactilely inform the user of a presence of an obstacle approaching the vehicle driven by the user.
13. A non-transitory computer-readable storage medium with computer readable instructions stored therein that when executed by a computer, cause the computer to execute a tactile communication method using a tactile communications device, the method comprising:
receiving, with a signal receiver, data in a wireless signal;
receiving a destination data input by a user;
receiving a predetermined object data input by a user;
generating navigational data based on GPS positional data and the destination data;
matching with the computer the predetermined object data to sensor data received from a plurality of sensors;
generating with a communications processor a plurality of pin activation signals based on the received data, the navigational data, and the results of the matching; and
reciprocally moving a plurality of pins of the portable tactile communication device in an outward direction and an inward direction to form a plurality of pin combinations corresponding to output data from a downloadable application received wirelessly to relay an indication based on the plurality of pin activation signals, wherein the portable tactile communication device contains a plurality of pins on one side and is configured to be detachably attached over a wrist or palm of a user such that the one side opposes the wrist or palm of the user without restriction of finger movement, each pin configured to reciprocally move in an outward direction and an inward direction to form a plurality of pin combinations to be received in the wrist or palm of the user in response to the plurality of pin activation signals activating at least a portion of the plurality of pins.
US14/096,858 2012-08-01 2013-12-04 Tactile communication apparatus, method, and computer program product Abandoned US20150154886A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/096,858 US20150154886A1 (en) 2012-08-01 2013-12-04 Tactile communication apparatus, method, and computer program product

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/564,395 US8696357B2 (en) 2012-08-01 2012-08-01 Tactile communication apparatus, method, and computer program product
US14/096,858 US20150154886A1 (en) 2012-08-01 2013-12-04 Tactile communication apparatus, method, and computer program product

Publications (1)

Publication Number Publication Date
US20150154886A1 true US20150154886A1 (en) 2015-06-04

Family

ID=50025842

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/564,395 Active US8696357B2 (en) 2012-08-01 2012-08-01 Tactile communication apparatus, method, and computer program product
US14/096,858 Abandoned US20150154886A1 (en) 2012-08-01 2013-12-04 Tactile communication apparatus, method, and computer program product

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/564,395 Active US8696357B2 (en) 2012-08-01 2012-08-01 Tactile communication apparatus, method, and computer program product

Country Status (1)

Country Link
US (2) US8696357B2 (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8696357B2 (en) * 2012-08-01 2014-04-15 Thieab AlDossary Tactile communication apparatus, method, and computer program product
US20140184384A1 (en) * 2012-12-27 2014-07-03 Research Foundation Of The City University Of New York Wearable navigation assistance for the vision-impaired
US9390630B2 (en) * 2013-05-03 2016-07-12 John James Daniels Accelerated learning, entertainment and cognitive therapy using augmented reality comprising combined haptic, auditory, and visual stimulation
WO2015175193A1 (en) * 2014-05-16 2015-11-19 Tactile Engineering, Llc Refreshable tactile display
US10121335B2 (en) * 2014-07-18 2018-11-06 Google Technology Holdings LLC Wearable haptic device for the visually impaired
US9727182B2 (en) 2014-07-18 2017-08-08 Google Technology Holdings LLC Wearable haptic and touch communication device
US9965036B2 (en) 2014-07-18 2018-05-08 Google Technology Holdings LLC Haptic guides for a touch-sensitive display
US9454915B2 (en) * 2014-08-20 2016-09-27 Thieab Ibrahim Aldossary Electro tactile communication apparatus, method, and computer program product
US10437335B2 (en) 2015-04-14 2019-10-08 John James Daniels Wearable electronic, multi-sensory, human/machine, human/human interfaces
GB2541516B (en) * 2015-07-02 2018-04-11 Wal Mart Stores Inc Tactile navigation systems and methods
US9646470B1 (en) 2015-10-15 2017-05-09 Honeywell International Inc. Aircraft systems and methods with operator monitoring
US11229787B2 (en) 2016-11-25 2022-01-25 Kinaptic, LLC Haptic human machine interface and wearable electronics methods and apparatus
US10380915B2 (en) * 2017-06-29 2019-08-13 Stephen Sophorn Lim Braille dot delivery system
US10959674B2 (en) 2017-10-23 2021-03-30 Datafeel Inc. Communication devices, methods, and systems
US10251788B1 (en) * 2017-10-30 2019-04-09 Dylan Phan Assisting the visually impaired
EP3710913A1 (en) * 2017-11-13 2020-09-23 Telefonaktiebolaget LM Ericsson (PUBL) Input device for a computing device
CN108279780B (en) * 2018-03-01 2020-07-24 京东方科技集团股份有限公司 Wearable device and control method
RU2708107C1 (en) * 2019-06-24 2019-12-04 Дмитрий Петрович Герасимук Tactile communication system
CA3177615A1 (en) 2020-10-30 2022-05-05 Datafeel Inc. Wearable data communication apparatus, kits, methods, and systems

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2743922B1 (en) * 1996-01-19 1998-04-17 Parienti Raoul READING DEVICE FOR THE BLIND
US6762749B1 (en) * 1997-01-09 2004-07-13 Virtouch Ltd. Tactile interface system for electronic data display system
AUPO709197A0 (en) 1997-05-30 1997-06-26 University Of Melbourne, The Improvements in electrotactile vocoders
US6230135B1 (en) * 1999-02-02 2001-05-08 Shannon A. Ramsay Tactile communication apparatus and method
JP2003079685A (en) * 2001-09-17 2003-03-18 Seiko Epson Corp Auxiliary appliance for walking of visually handicapped person
US20100109918A1 (en) * 2003-07-02 2010-05-06 Raanan Liebermann Devices for use by deaf and/or blind people
WO2005008914A1 (en) * 2003-07-10 2005-01-27 University Of Florida Research Foundation, Inc. Mobile care-giving and intelligent assistance device
DE102005009110A1 (en) * 2005-01-13 2006-07-27 Siemens Ag Device for communicating environmental information to a visually impaired person
JP2007293132A (en) * 2006-04-26 2007-11-08 Pioneer Electronic Corp Mobile information input/output device and general purpose braille output device
US8138896B2 (en) 2007-12-31 2012-03-20 Apple Inc. Tactile feedback in an electronic device
US8362883B2 (en) * 2008-06-10 2013-01-29 Design Interactive, Inc. Method and system for the presentation of information via the tactile sense
US8451240B2 (en) * 2010-06-11 2013-05-28 Research In Motion Limited Electronic device and method of providing tactile feedback
US9691300B2 (en) 2010-09-21 2017-06-27 Sony Corporation Text-to-touch techniques
US9715837B2 (en) * 2011-12-20 2017-07-25 Second Sight Medical Products, Inc. Text reading and translation in a visual prosthesis
EP2608189A1 (en) * 2011-12-21 2013-06-26 Thomson Licensing Braille display system and method for operating a refreshable Braille display

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3831296A (en) * 1972-08-02 1974-08-27 E Hagle Alphanumeric tactile information communication system
US4445871A (en) * 1981-11-12 1984-05-01 Becker John V Tactile communication
US4581491A (en) * 1984-05-04 1986-04-08 Research Corporation Wearable tactile sensory aid providing information on voice pitch and intonation patterns
US4982432A (en) * 1984-05-30 1991-01-01 University Of Melbourne Electrotactile vocoder
US5047952A (en) * 1988-10-14 1991-09-10 The Board Of Trustee Of The Leland Stanford Junior University Communication system for deaf, deaf-blind, or non-vocal individuals using instrumented glove
US8581747B2 (en) * 2007-03-27 2013-11-12 Fujitsu Limited Pedestrian support system
US7788032B2 (en) * 2007-09-14 2010-08-31 Palm, Inc. Targeting location through haptic feedback signals
US20090130639A1 (en) * 2007-11-21 2009-05-21 Michael Skinner Tactile display to allow sight impaired to feel visual information including color
US20130293657A1 (en) * 2012-05-02 2013-11-07 Richard Delmerico Printed image for visually-impaired person
US8696357B2 (en) * 2012-08-01 2014-04-15 Thieab AlDossary Tactile communication apparatus, method, and computer program product

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10562450B2 (en) 2015-09-21 2020-02-18 Ford Global Technologies, Llc Enhanced lane negotiation
US9630555B1 (en) * 2016-01-25 2017-04-25 Ford Global Technologies, Llc Driver alert system for speed and acceleration thresholds
KR101807976B1 (en) 2016-12-05 2017-12-11 박재홍 Wearable control device
US11915607B2 (en) * 2020-05-29 2024-02-27 Brailleazy, Inc. Modular refreshable braille display system
US20220233391A1 (en) * 2021-01-25 2022-07-28 City University Of Hong Kong Human-interface device and a guiding apparatus for a visually impaired user including such human-interface device
US11684537B2 (en) * 2021-01-25 2023-06-27 City University Of Hong Kong Human-interface device and a guiding apparatus for a visually impaired user including such human-interface device

Also Published As

Publication number Publication date
US8696357B2 (en) 2014-04-15
US20140038139A1 (en) 2014-02-06

Similar Documents

Publication Publication Date Title
US20150154886A1 (en) Tactile communication apparatus, method, and computer program product
US9304588B2 (en) Tactile communication apparatus
US10706692B2 (en) Device, system and method for mobile devices to communicate through skin response
US9454915B2 (en) Electro tactile communication apparatus, method, and computer program product
US20170060850A1 (en) Personal translator
US9727182B2 (en) Wearable haptic and touch communication device
US20160284235A1 (en) Wearable Blind Guiding Apparatus
US8082152B2 (en) Device for communication for persons with speech and/or hearing handicap
US20200026764A1 (en) Method of video call
KR20200029391A (en) Vibrating haptic device for the blind
KR20170104541A (en) Navigation devices and methods
US20190019512A1 (en) Information processing device, method of information processing, and program
CN108763552B (en) Family education machine and learning method based on same
US20120218091A1 (en) Device, system and method for mobile devices to communicate through skin response
US11755111B2 (en) Spatially aware computing hub and environment
US20220343795A1 (en) Orientation assistance system
US10636261B2 (en) Intuitive tactile devices, systems and methods
JP6687743B2 (en) Information transmitting / receiving apparatus and information transmitting / receiving method
KR101683160B1 (en) Object recognition system for the visually impaired
JP6376834B2 (en) Information transmission system
US20160337822A1 (en) Vehicle and control method thereof
KR20070089263A (en) Navigation system for the blind person and mobile device therewith
CN113366483A (en) Information processing apparatus, information processing method, and information processing program
WO2020245991A1 (en) Road guidance system and mobile information terminal used in same
CN114730530A (en) Language teaching machine

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION