US9782669B1 - RF tracking with active sensory feedback - Google Patents

RF tracking with active sensory feedback

Info

Publication number
US9782669B1
Authority
US
United States
Prior art keywords: wireless device, radio signals, antennae, haptic, feedback
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US13/918,295
Inventor
Edward L. Hill
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Position Imaging Inc
Original Assignee
Position Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Position Imaging Inc
Priority to US13/918,295
Assigned to POSITION IMAGING, INC. Assignment of assignors interest (see document for details). Assignors: HILL, EDWARD L.
Priority to US15/687,779 (US10269182B2)
Application granted
Publication of US9782669B1
Assigned to ANKURA TRUST COMPANY, LLC. Security interest (see document for details). Assignors: POSITION IMAGING IP LLC
Legal status: Active
Adjusted expiration

Classifications

    • A63F13/04
    • A63F13/235: Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console, using a wireless connection, e.g. infrared or piconet
    • A63F13/285: Generating tactile feedback signals via the game input device, e.g. force feedback
    • A63F13/428: Processing input control signals of video game devices by mapping the input signals into game commands, involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes

Definitions

  • Sensory feedback to the user can consist of, for example, vibrations, shocks, pressures, friction, motion restriction, and sound, as is known in the art.
  • a haptic interface controller 63 converts commands from the virtual system controller 40 into user feedback.
  • the haptic interface controller 63 performs a transfer function necessary to control the various haptic feedback devices 61 into “realistic” levels of feedback, dependent on the type of feedback and the mechanization of the feedback.
  • a digital-to-analog converter, coupled with the appropriate other circuitry, is included in the haptic interface controller 63.
  • Visual and auditory feedback can be performed separately and remotely from the handheld wireless device 60 , for example, at the base station 6 , or at an electronic device in communication with the base station 6 .
  • haptic feedback from the user can consist of trigger pulls, pressure, temperature, sweat, heart rate, sound, etc., as is known in the art.
  • haptic interface controller 63 typically incorporates an analog-to-digital converter.
  • the haptic feedback device that receives the user output 61 and produces the user input 62 , communicates through the haptic interface controller 63 with the virtual system controller 40 using a haptic wireless interface 64 .
  • the haptic wireless interface 64 includes an antenna 65 for bi-directional communication with the wireless antenna 70 of the haptic wireless interface 50 coupled to the virtual system controller 40 .
  • These interfaces 50 , 64 use standard RF techniques, known in the art, to transmit and receive data corresponding to feedback to and from the user.
  • These haptic wireless interfaces 50 and 64 use antennae 70 and 65 , respectively, to transmit and receive this haptic data.
  • FIG. 2 shows another embodiment of a wireless haptic feedback system 2 ′ including an RF tracking system 4 ′, a base station 6 ′, and a handheld device 160 .
  • the RF tracking system 4 ′ includes RF receiver antennae 100 , 170 , an RF transceiver 120 , a position and orientation calculation algorithm 130 running on a computing device, RF transmitter antennae 180 , and an RF transceiver and controller 190 .
  • the base station 6′ includes the RF receiver antennae 100, 170, the RF transceiver 120, the position and orientation calculation algorithm 130, and a virtual system controller 140.
  • This embodiment is similar to the embodiment of FIG. 1, except that the RF tracking system 4′ uses the transceivers 120, 190 instead of the RF receiver 20 and the RF transmitter 90. Accordingly, the two-way RF communication between the handheld device 160 and the base station 6′ (for transmitting and receiving haptic data) occurs between these transceivers 120, 190.
  • Handheld device 160 is held, grasped, worn, etc., by the user and is typically powered by a self-contained power source.
  • the handheld device 160 comprises two functions: a transmitter function necessary for tracking the position of the device 160, and a haptic feedback function necessary for sensory feedback.
  • the RF transceiver and controller block 190 and transmitter antennae block 180 accomplish the transmitter function.
  • the RF transceiver and controller block 190 generates the signals and waveforms required to feed the transmitter antennae block 180 , such that when such signals and waveforms are received by the receiver antennae 100 and processed by the RF transceiver 120 , a position of the handheld device 160 can be ascertained using the position and orientation calculation algorithm 130 running on a computing device.
  • a haptic device acts as a user input 162 and user output 161 of a virtual environment in conjunction with the RF tracking system 4 ′ and the virtual system controller 140 .
  • the virtual system controller 140, which typically runs on a processor (not shown) with associated memory, program storage, and interfaces, includes a program interface, haptic simulation rule generation, and other functionality for stitching the user inputs and outputs and positioning data into a realistic VR environment.
  • the virtual system controller 140 is configured to process data from the RF tracking system 4′ to change the virtual environment's evolution over time.
  • Haptic feedback to the user can consist of vibrations, shocks, pressures, friction, motion restriction, sound, etc. as is known in the art. These devices are controlled by a haptic interface controller 163 , which converts commands from the virtual system controller 140 into user feedback.
  • the haptic interface controller 163 accomplishes a transfer function necessary to control the various haptic feedback devices 161 into “realistic” levels of feedback, dependent on the type of feedback and the mechanization of the feedback.
  • a digital-to-analog converter, coupled with the appropriate other circuitry, is included in the haptic interface controller 163.
  • haptic feedback from the user can consist of trigger pulls, pressure, temperature, sweat, heart rate, sound, etc., as is known in the art.
  • haptic interface controller 163 typically incorporates an analog-to-digital converter.
  • the haptic feedback device that receives the user output 161 and produces the user input 162 communicates, through the haptic interface controller 163 , with the virtual system controller 140 using a wireless interface.
  • the wireless interface is bi-directional to the virtual system controller 140 .
  • These interfaces use standard RF techniques, known in the art, to transmit and receive data corresponding to feedback to and from the user.
  • RF transceiver 120 and RF transceiver 190 use antennae 170 and 165 , respectively, to transmit and receive this data.
  • transceiver block 190 contains a transmitter and receiver section for antenna 165 .
  • the antenna 165 communicates with the antenna 170 , which is connected to the RF transceiver 120 .
  • the RF transceiver 120 also contains a transmitter and receiver section for the antenna 170 , in addition to the receive function needed for tracking.
  • the RF transceiver 120 communicates with the virtual system controller 140 .
  • RF transceivers provide transmitter and receiver functionality. This allows the receiver antennae 100 to receive signals while the antenna 170 can transmit and receive signals related to the haptic interface, all from RF transceiver hardware 120 . In a similar manner, the transmitter antennae 180 can transmit data while the antenna 165 can transmit and receive signals related to the haptic interface, all from RF transceiver hardware 190 . As is also known in the art, multiplexing schemes, such as switching, time multiplexing, frequency multiplexing, phase multiplexing, etc., can be designed into the RF transceiver 120 and/or the RF transceiver 190 , such that antennae may be shared. For example, the antenna 170 and one of receiver antennae 100 can be combined to reduce the antenna count (see insert A). In a similar manner, antenna 165 and one of receiver antennae 180 can be combined to reduce the antenna count (see insert B).
  • FIG. 3 shows an embodiment of a haptic feedback system 2 ′′ including an RF tracking system 4 ′′, a base station 6 ′′, and a handheld device 260 .
  • the RF tracking system 4 ′′ includes RF receiver antennae 200 , 271 , an RF receiver 220 , a position and orientation calculation algorithm 230 running on a computing device, RF transmitter antennae 266 , 280 , and an RF transmitter and controller 290 .
  • the base station 6 ′′ includes the RF receiver antennae 200 , 271 , the RF receiver 220 , the position and orientation calculation algorithm 230 , a virtual system controller 240 , a haptic wireless interface 250 , haptic wireless interface antennae 270 .
  • the RF receiver antennae 200 , 271 are typically fixed and act as a tracking reference frame.
  • This embodiment is similar to the embodiment of FIG. 1, with a notable difference being that the two-way wireless RF communication between the handheld device 260 and the base station 6″ (for transmitting and receiving haptic data) is split into two unidirectional paths: one path uses the strengths of the unidirectional tracking-system components (from the RF transmitter and controller 290 to the RF receiver 220), and the other path uses a unidirectional interface (from the haptic wireless interface 250 to a haptic wireless interface 264 in the device 260).
  • the handheld device 260 is held, grasped, worn, etc. by the user and is typically powered by a self-contained power source.
  • the handheld device 260 comprises two functions: a transmitter function necessary for tracking the position of the device 260, and a haptic feedback function necessary for sensory feedback.
  • the RF transmitter and controller block 290 and transmitter antennae block 280 accomplish the transmitter function.
  • the RF transmitter and controller 290 generates the signals and waveforms required to feed the transmitter antennae 280 , such that when such signals and waveforms are received by the receiver antennae 200 and processed by the RF receiver 220 , a position of the hand held device 260 can be ascertained using the position and orientation calculation algorithm 230 running on the computing device.
  • a haptic device acts as a user input 262 and user output 261 of a virtual environment in conjunction with the RF tracking system 4 ′′ and the virtual system controller 240 .
  • the virtual system controller 240, which typically runs on a processor (not shown) with associated memory, program storage, and interfaces, includes a program interface, haptic simulation rule generation, and other functionality for stitching the user inputs and outputs and positioning data into a realistic VR environment.
  • the virtual system controller 240 is configured to process data from the RF tracking system 4″ to change the virtual environment's evolution over time.
  • Haptic feedback to the user can consist of vibrations, shocks, pressures, friction, motion restriction, sound, etc. as is known in the art. These devices are controlled by a haptic interface controller 263 , which converts commands from the virtual system controller into user feedback.
  • the haptic interface controller 263 realizes the transfer function necessary to control the various haptic feedback devices 261 into “realistic” levels of feedback, dependent on the type of feedback and the mechanization of the feedback.
  • a digital-to-analog converter, coupled with the appropriate other circuitry, is included in the haptic interface controller 263.
  • haptic feedback from the user can consist of trigger pulls, pressure, temperature, sweat, heart rate, sound, etc., as is known in the art.
  • These devices are also interfaced to the haptic interface controller 263 , which typically incorporates an analog-to-digital converter.
  • the haptic feedback device which receives the user output 261 and produces the user input 262 , communicates, through the haptic interface controller 263 , with the virtual system controller 240 using two wireless interfaces.
  • Transmission of the data from the user haptic interface 264 to the virtual system controller 240 over one of the two wireless interfaces occurs through the same RF components used for position tracking (i.e., RF components 290 , 280 , 200 , 220 ).
  • Separate antennae/channels are shown in FIG. 3 for the haptic wireless interface to the virtual system controller 240.
  • These separate channels comprise a transmitting antenna 266 and a receiving antenna 271 .
  • These antennae 266 , 271 may be physically separate antennae, as shown, or they may be incorporated into the tracking system antennae framework.
  • insert C indicates that receiver antennae 200 and antenna 271 are combined.
  • transmitter antennae 280 and antenna 266 are combined, as signified by insert D.
  • the RF receiver 220 separates data into position-tracking data and haptic interface data, and sends the haptic interface data to the virtual system controller 240 for further processing.
  • Transmission of data in the opposite direction, from the virtual system controller 240 to the handheld device 260, is accomplished by the haptic wireless interface 250 transmitting the data over the haptic wireless interface antenna 270 at the base station 6″ to its counterpart antenna 265 in the handheld device 260.
  • the haptic wireless interface 264 processes the data received by the antenna 265 and sends the appropriate information to the haptic interface controller 263 , for conversion into the appropriate formats for controlling the feedback to the user.
  • As is known in the art, multiplexing schemes, such as switching, time multiplexing, frequency multiplexing, phase multiplexing, etc., can be designed into the RF receiver 220 and/or the RF transmitter and controller 290 such that antennae may be shared.
  • the antenna 271 and one of receiver antennae 200 can be combined to reduce the antenna count (see insert C).
  • antenna 266 and one of transmitter antennae 280 can be combined to reduce the antenna count (see insert D).
  • FIG. 4 shows an embodiment of a process 480 that can be performed by any of the embodiments described in connection with FIG. 1 , FIG. 2 , and FIG. 3 .
  • the virtual system controller 40 generates (step 500 ) a virtual reality scenario.
  • This virtual reality scenario can take, for example, the form of a game or a simulation.
  • Various points of interest are registered in the virtual reality scenario.
  • the virtual reality scenario is presented (step 510 ) to the user, where visual scenes, cues, and haptic data are generated for supplying to the user.
  • the virtual system controller 40 correlates the computed position of the handheld device to the points of interest identified in the virtual reality scenario. When the handheld device moves to one of these points of interest, the virtual system controller 40 identifies and generates the appropriate type of haptic data associated with that point of interest.
  • the base station 6 transmits (step 520 ) haptic data to the handheld device 60 .
  • the tracking system components of the RF tracking system 4 track (step 530) the handheld device 60.
  • user haptic input 62 is monitored (step 540 ) for changes and sent back (step 550 ) to the virtual system controller 40 using an RF link between the antennae 65 , 70 of the haptic wireless interfaces 64 , 50 , respectively.
  • the process 480 repeats until the virtual reality scenario ends.
  • Inertial sensors, such as accelerometers and gyroscopes, within the wireless handheld device can provide short-term navigation improvements.
  • Magnetic sensors can determine orientation relative to the earth's magnetic field. These sensory inputs can be optimally combined, such as with a Kalman filter, to provide better tracking accuracy and responsiveness; a simplified blending sketch appears after this list.
  • the handheld device is configured to perform the calculations.
  • the wireless device receives the timing information acquired from the radio signals received by the receiver antennae and calculates its own position from this timing information. Further, the wireless device can send its calculated position back to the base station for use by the virtual system controller.
  • the virtual system controller uses the position data, in conjunction with the human-computer interactive software, to produce haptic alert inputs. These haptic alert inputs, when sent to the wireless device, activate the haptic or auditory (or both) function at the wireless device.
  • the wireless device can perform the position calculation and execute the human-computer interactive software. This embodiment can forego any need for the wireless device to transmit the calculated position to the base station or for the base station to transmit the haptic alert inputs to the wireless device.
  • the wireless device itself generates the haptic alerts (and subsequent sensory feedback) based on its calculated position and points of registration within the human-computer interactive software.
  • sensory feedback systems can simultaneously track the positions of multiple wireless devices engaged in a virtual reality environment produced by an interactive computer program. These sensory feedback systems can activate sensory feedback in each of these devices individually, in accordance with the particular points of interest with which their respective positions correlate.
  • aspects of the present invention may be embodied as a system, method, and computer program product.
  • aspects of the present invention may be embodied entirely in hardware, entirely in software (including, but not limited to, firmware, program code, resident software, microcode), or in a combination of hardware and software. All such embodiments may generally be referred to herein as a circuit, a module, or a system.
  • aspects of the present invention may be in the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wired, optical fiber cable, radio frequency (RF), etc. or any suitable combination thereof.
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as JAVA, Smalltalk, C#, C++, and Visual C++ or the like and conventional procedural programming languages, such as the C and Pascal programming languages or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
  • any such remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider such as AT&T, MCI, Sprint, EarthLink, MSN, or GTE).
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • aspects of the described invention may be implemented in one or more integrated circuit (IC) chips manufactured with semiconductor-fabrication processes.
  • the maker of the IC chips can distribute them in raw wafer form (on a single wafer with multiple unpackaged chips), as bare die, or in packaged form.
  • the IC chip is mounted in a single chip package, for example, a plastic carrier with leads affixed to a motherboard or other higher level carrier, or in a multichip package, for example, a ceramic carrier having surface and/or buried interconnections.
  • the IC chip is then integrated with other chips, discrete circuit elements, and/or other signal processing devices as part of either an intermediate product, such as a motherboard, or of an end product.
  • the end product can be any product that includes IC chips, ranging from electronic gaming systems and other low-end applications to advanced computer products having a display, an input device, and a central processor.
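
As a complement to the sensor-fusion bullet above (inertial sensors providing short-term improvements that are combined with the RF position fix), the sketch below blends an RF fix with an inertial dead-reckoning prediction using a simple complementary filter standing in for the Kalman filter mentioned in the text. The sample interval, blend weight, and function signature are illustrative assumptions, not the patent's implementation.

```python
# Complementary-filter sketch: blend an absolute RF fix with an inertial prediction.
import numpy as np

def fuse(rf_pos, prev_pos, prev_vel, accel, dt, alpha=0.2):
    """rf_pos: latest RF-derived 3D position (m); prev_pos/prev_vel: previous fused
    state; accel: accelerometer reading in the tracking frame with gravity removed
    (m/s^2); alpha: weight given to the RF fix (higher means trust RF more)."""
    pred_pos = prev_pos + prev_vel * dt + 0.5 * accel * dt ** 2   # dead reckoning
    pred_vel = prev_vel + accel * dt
    fused_pos = alpha * rf_pos + (1.0 - alpha) * pred_pos         # complementary blend
    return fused_pos, pred_vel

pos, vel = np.zeros(3), np.zeros(3)
pos, vel = fuse(rf_pos=np.array([0.010, 0.0, 0.0]), prev_pos=pos, prev_vel=vel,
                accel=np.array([0.5, 0.0, 0.0]), dt=0.01)
print(pos, vel)   # fused position nudged toward the RF fix; velocity from the IMU
```

A Kalman filter would additionally weight the two sources by their estimated uncertainties and track sensor biases, which is why the text above describes the combination as optimal.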

Abstract

A system includes a wireless device having a radio frequency (RF) receiver, an RF transmitter, and means for producing sensory feedback to a user of the wireless device. A position-tracking system includes at least three antennae. The position-tracking system computes a multi-dimensional position of the wireless device using triangulation or trilateration based on time of arrival information from radio signals transmitted by the RF transmitter of the wireless device and received by each of the at least three antennae. A processor receives the multi-dimensional position of the wireless device determined by the position-tracking system, correlates the multi-dimensional position to a point of interest registered with an interactive software program that produces a virtual reality environment, and generates, in response to the multi-dimensional position correlated with the registered point of interest, data configured to activate the sensory feedback producing means of the wireless device.

Description

RELATED APPLICATION
This application claims the benefit of and priority to U.S. provisional application No. 61/659,544, filed Jun. 14, 2012, titled “RF Tracking with Active Sensory Feedback,” the entirety of which is incorporated by reference herein.
FIELD OF THE INVENTION
The invention relates generally to systems and methods for human-computer interaction. More particularly, the invention relates to position tracking of game controllers and other types of 3D input devices to enhance the virtual reality experience of a user with sensory feedback, for example, tactile, auditory, and/or visual feedback.
BACKGROUND
In human-computer interaction there is a natural evolution, enabled by technical innovation, toward a more immersive experience for users. Consumers have witnessed a major movement by television manufacturers toward “3D” TV, which provides realistic images that appear to exist in true free space rather than on a two-dimensional screen. Game controller manufacturers provide auditory and tactile feedback in game controllers, adding to the feel of a player being in the game or battle by emitting sounds or by vibrating the handheld controller as an avatar or similar image on the screen is hit by a bullet, a punch, and so on. These are direct sensory inputs created to make interacting with a computer more life-like.
The NINTENDO® WII REMOTE® wireless controller is an example of the recent state of the art in user-interactive controllers for computer display game systems. It is a movable wireless remote controller that incorporates inertial sensors to provide motion capture. It is hand-held by the interactive user, transmits input data to the computer-controlled game display system via conventional short-range wireless RF transmissions (e.g., a BLUETOOTH® system), and receives data via infrared light sensors.
Previous systems using magnetic, optical, ultrasound, and non-integrated-circuit inertial devices were cumbersome and expensive. Motion capture using inertial sensing, as in the Wii®, allows people to use their own movement to initiate an interaction with the video game. This inertial sensing was a breakthrough for game playing and pushed the “virtual reality” experience past the simple visual, tactile, or audible interaction of the original VR systems that became popular in the 1980s.
After the success of the Wii®, many new applications were created to add to the immersive experience of games or related software. Additional competitive systems were introduced, including Sony's MOVE™ controller, which uses inertial sensors and camera-tracking technology that follows an active optical marker to register the controller's position in three-dimensional space. Other technologies have focused on eliminating the need for a controller and/or peripheral; for example, Microsoft's KINECT®, a camera-based gesture-tracking system, uses reflected infrared light to make measurements of the light reflected off the objects being tracked. Both the MOVE™ and the KINECT® register either the user's peripheral or the user's body in a software program's digital environment. This three-dimensional registration is used to fuse the user and the user's environment with the digital program being interacted with on screen. High-end motion-capture systems still use these same optical techniques.
These input devices have allowed consumers to interact with games in new and highly intuitive ways. These current state-of-the-art movable controllers also provide haptic feedback, which is commonly used in arcade and video game controllers. An example of this feature is the simulated automobile steering wheel programmed to provide a “feel” of the road: as the user makes a turn or accelerates, the steering wheel responds by resisting turns or slipping out of control. Other simple examples include handlebar shake in motorcycle-based games, gun shake in shooting games, and joystick vibrations. Sony's DUALSHOCK™ technology and the handheld remote controller for the NINTENDO® WII® feature such technology.
A consumer 3D touch device with high-resolution three-dimensional force feedback, allowing games to simulate objects, textures, recoil, momentum, and the physical presence of objects through haptic feedback initiated at the device, is now available from NOVINT™ HAPTICS™. The feedback is enabled by actuators that apply forces to the skin to simulate touching something, such as a virtual object on the screen, in free space. The actuator provides mechanical motion in response to an electrical stimulus. Most early designs of haptic feedback use electromagnetic technologies, such as vibratory motors with an offset mass (for example, the pager motor found in most cell phones) or voice coils, in which a central mass or output is moved by a magnetic field. These electromagnetic motors typically operate at resonance and provide strong feedback but a limited range of sensations. Next-generation actuator technologies are beginning to emerge, offering a wider range of effects thanks to more rapid response times. Next-generation haptic actuator material technologies include electro-active materials, which include piezoelectric and polymer materials.
SUMMARY
In one aspect, the invention features a system for wirelessly tracking a multi-dimensional position of and for providing sensory feedback to a wireless device. The system comprises a wireless device having a radio frequency (RF) receiver, an RF transmitter, and means for producing sensory feedback to a user of the wireless device. A position-tracking system includes at least three antennae. The position-tracking system computes a multi-dimensional position of the wireless device using triangulation or trilateration based on time of arrival information from radio signals transmitted by the RF transmitter of the wireless device and received by each of the at least three antennae. A processor receives the multi-dimensional position of the wireless device determined by the position-tracking system, correlates the multi-dimensional position to a point of interest registered with an interactive software program that produces a virtual reality environment, and generates, in response to the multi-dimensional position correlated with the registered point of interest, data configured to activate the sensory feedback producing means of the wireless device.
In another aspect, the invention features a method of providing sensory feedback to a wireless device. The method comprises receiving, at three or more antennae, radio signals emitted by a wireless device being used to interact with a virtual reality environment produced by an interactive computer program. A multi-dimensional position of the wireless device is computed using triangulation or trilateration based on time of arrival information from the radio signals received at the three or more antennae. The computed multi-dimensional position of the wireless device is correlated to a point of interest registered with the interactive computer program. Data configured to activate a particular sensory feedback are generated in response to the correlated multi-dimensional position. The particular sensory feedback at the wireless device is activated in response to the generated data.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of an embodiment of a RF tracking system and a wireless haptic feedback system with a separate interface for haptic data transmission.
FIG. 2 is a block diagram of another embodiment of a RF tracking system and a wireless haptic feedback system with a combined interface for RF tracking and haptic data transmission.
FIG. 3 is a block diagram showing another embodiment of a RF tracking system and a wireless haptic feedback system with a partially combined interface for RF tracking and haptic data transmission.
FIG. 4 is flow diagram of an embodiment of a process of information flow in a virtual reality system.
DETAILED DESCRIPTION
Radio frequency (RF) communication for remote wireless data transfer is ubiquitous in the electronics industry. Consequently, hardware supporting the RF communication industry has evolved and advanced over the past twenty years, producing robust, cost-effective solutions for wirelessly sending data. Over the decades, RF hardware has benefited from cost reductions and sensitivity increases. In addition, algorithms used for improving signal integrity continue to evolve and improve. Position tracking of wireless devices using RF communications has become a technical reality. An example of a system capable of using RF communications to track the position of a wireless transmitter can be found in U.S. application Ser. No. 13/079,800, filed Apr. 4, 2011, and titled “Multiplexing Receiver System”, the entirety of which application is incorporated by reference herein. Specifically, the position of a wireless device with an RF transmitter, in free space, can be computed using time as the basis of measurement for the position calculations. Systems that utilize such RF-based tracking, comparing radio signal times of arrival at multiple antennae connected to the system but disposed in different locations, can achieve multi-dimensional (2D or 3D) position accuracy of 1 cm or less in the x, y, and z dimensions. A time-based position-tracking technique that allows systems to resolve the position of a wireless device rapidly and with such high resolution enables improvements to existing applications and the creation of new applications for human-computer free-space kinematic interaction involving wireless peripherals, whether held or worn by the user.
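As a concrete illustration of the time-of-arrival position calculation described above, the sketch below estimates a 3D transmitter position from times of arrival at four fixed receive antennae by iteratively solving the time-difference-of-arrival equations. It is a minimal example under assumed antenna positions and an ideal, noise-free channel; it is not the algorithm of the incorporated “Multiplexing Receiver System” application, and the function and variable names are illustrative.

```python
# Minimal TDOA multilateration sketch (assumptions: ideal noise-free timing, known
# antenna positions, antenna 0 used as the timing reference).
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def estimate_position(antennas, toas, x0, iters=20):
    """Estimate a 3D position from times of arrival at N >= 4 fixed antennae.

    antennas: (N, 3) array of antenna positions in meters
    toas:     (N,) array of times of arrival in seconds
    x0:       initial position guess
    Only time differences relative to antenna 0 are used, so a common clock
    offset at the transmitter does not matter.
    """
    x = np.asarray(x0, dtype=float)
    d_meas = C * (toas - toas[0])                    # measured range differences
    for _ in range(iters):
        r = np.linalg.norm(antennas - x, axis=1)     # ranges from current estimate
        d_pred = r - r[0]                            # predicted range differences
        # Jacobian of (r_i - r_0) with respect to the position estimate x
        J = (x - antennas) / r[:, None] - (x - antennas[0]) / r[0]
        residual = d_meas - d_pred
        dx, *_ = np.linalg.lstsq(J[1:], residual[1:], rcond=None)  # skip reference row
        x = x + dx
        if np.linalg.norm(dx) < 1e-9:
            break
    return x

if __name__ == "__main__":
    antennas = np.array([[0.0, 0, 0], [3, 0, 0], [0, 3, 0], [0, 0, 3]])
    true_pos = np.array([1.2, 0.8, 1.5])
    toas = np.linalg.norm(antennas - true_pos, axis=1) / C   # ideal times of arrival
    print(estimate_position(antennas, toas, x0=[1.0, 1.0, 1.0]))  # ~[1.2, 0.8, 1.5]
```

With four or more antennae and a reasonable starting guess, a few Gauss-Newton iterations recover the position; reaching the centimeter-level accuracy quoted above additionally requires careful handling of receiver noise, clock distribution among the antennae, and multipath.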
In addition to enabling high-accuracy position tracking, RF-based communication can provide users with line-of-sight independence through better signal propagation characteristics, unlike the linear signal paths required when light is sensed by cameras or infrared (IR) sensors. A sensory feedback system that can accurately track the actual position of a user's handheld device or controller with RF data enables new applications not easily attainable with a gesture-tracking camera system or an inertial-sensing motion-capture technique. Larger working volumes than camera systems and line-of-sight independence can make RF-based position tracking the technique of choice for registering a wireless device's position in various interactive computer programs. With a larger working volume and line-of-sight independence, interactive games can be created or improved, and new training simulation and therapy programs, used to improve kinesthetic performance and proprioceptive awareness, become possible.
Beyond the accuracy improvements that this time-based RF position-tracking system can deliver, there are other areas of improvement for human-computer interaction. For example, by sensing the actual position of the wireless device accurately, a user can receive feedback from the system that makes interaction more realistic. In addition to improving the interactive functionality through visual cues, seen by the user on a screen, that represent the location of the wireless device in 3D space, a system can also provide tactile, visual, or auditory feedback (or any combination thereof) to the wireless device as the device travels within free space to locations of interest registered by the interactive software program. Applying tactile or auditory feedback at known or targeted points in space can improve kinematic interaction within virtually generated environments. Improved kinematic interaction can bring more realism and increased effectiveness to computer programs for physical therapy, kinematic-based training, gaming, CAD, or other three-dimensional design applications.
As described herein, for a sensory feedback system to be effective, the system accurately calculates the position of the wireless device, in three dimensions, and initiates tactile stimuli and/or other sensory feedback at specific points in 3D-space registered with a computer program. For example, a golf video game can achieve a higher level of realistic interaction as the position of a game controller (the wireless device) is tracked and displayed on the screen. As the user moves the game controller in free space, the image of the game controller, in this example displayed as a golf club, also moves in relation to the game controller held by the user. In this application, the game controller can supply haptic and/or auditory feedback as the position of the controller in free space correlates with the position of the ball on the screen. The user can thus “feel” and “hear” the impact of the game controller (visually represented as the golf club) hitting the ball registered on the screen. This combination of position tracking and tactile and/or auditory feedback provides a much more immersive level of human-computer interaction.
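To make the golf example concrete, the sketch below correlates a tracked controller position with a point of interest registered by the interactive program (the ball) and generates feedback-activation data when the controller reaches it. The PointOfInterest class, the trigger radius, and the feedback payload are illustrative assumptions rather than structures defined by the patent.

```python
# Illustrative sketch: correlate a tracked device position with registered points of
# interest and emit feedback-activation data (class and field names are assumptions).
from dataclasses import dataclass
import numpy as np

@dataclass
class PointOfInterest:
    name: str
    position: np.ndarray          # registered 3D location in the tracking frame, meters
    radius: float                 # distance at which feedback is triggered, meters
    feedback: dict                # e.g. {"haptic": "impulse", "audio": "club_hit.wav"}

def correlate(device_pos, points_of_interest):
    """Return activation data for every registered point the device has reached."""
    events = []
    for poi in points_of_interest:
        if np.linalg.norm(device_pos - poi.position) <= poi.radius:
            events.append({"target": poi.name, **poi.feedback})
    return events

ball = PointOfInterest("golf_ball", np.array([0.40, 0.10, 0.90]), 0.03,
                       {"haptic": "impulse", "audio": "club_hit.wav"})
print(correlate(np.array([0.41, 0.11, 0.91]), [ball]))   # feedback fires near the ball
```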
In brief overview, sensory feedback systems described herein combine RF-based position tracking of a wireless device with a sensory feedback function performed at or in the wireless device. This wireless device can include a single transceiver or a low power radio receiver or receivers, a separate transmitter or transmitters, and an actuator or similar device that can produce sensory feedback to the user. The wireless device can also include one or more inertial sensors, for example, accelerometers, gyroscopes, and magnetic sensors, to provide an orientation of the wireless device to the position-tracking system. The user uses this wireless device to engage in a virtual reality environment produced by an interactive software program. Utilizing time of arrival information of radio signals, sent from the wireless device and received by a network of separate receive antennae, a computer (or CPU), in communication with the antenna network, dynamically computes a 2D or 3D position of the wireless device, thereby enabling accurate position tracking. The computed positional information is correlated to points of interest registered within a virtual environment produced by the interactive software program. A computer uses these registered positions to provide sensory feedback to the user as the user engages with the virtual environment provided by the interactive software program.
The sensory feedback functionality is built into the handheld device, which can be worn rather than held, and includes a driving apparatus, a sensing device, a controller, and software. The sensory feedback system is configured to process data from a virtual reality environment simulated by an interactive software program and displayed on a computer monitor (or TV, projector, or other visual display device) and to transmit data to a driver control card. The driver control card is configured to control the driving apparatus. The sensing device can measure user force feedback (e.g., intensity of grip, rapidity of motion) and transmit this data back to the system. This feedback cycle, namely, the user measurement from the sensing device, virtual reality immersion through position registration within an interactive software program, and haptic feedback, provides a fully rendered 3D spatial experience.
FIG. 1 shows an embodiment of a wireless haptic feedback system 2 including an RF tracking system 4, a base station 6, and a handheld device 60. The RF tracking system 4 includes RF receiver antennae 10, an RF receiver 20, a position and orientation calculation algorithm 30 running on a computing device, RF transmitter antennae 80, and an RF transmitter and controller 90. The base station 6 includes the RF receiver antennae 10, the RF receiver system 20, the position and orientation calculation algorithm 30, a virtual system controller 40, a haptic wireless interface 50, and haptic wireless interface antenna 70.
Each of the RF receiver antennae 10 is typically fixed at a different location in the RF tracking system 4. One or more of the RF antennae can serve to provide a tracking reference signal. All or a portion of the receiver antennae can be passive receivers. In brief overview, the receiver antennae 10 receive the radio signal transmitted from the transmitters of the wireless device, and the RF tracking system 4 generates timing information from these radio signals (e.g., from their carrier-wave phase information) in order to make comparisons among the antennae 10. The position and orientation calculation algorithm 30 of the RF tracking system 4 computes the physical position of the wireless device 60 in 3D space using the known, fixed, and separate physical locations of each receiver antenna 10, comparing the timing information associated with the receiver antennae 10, and performing triangulation or trilateration calculations based on this information.
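One way such timing comparisons could be turned into a position estimate is sketched below. This is an assumed, generic time-difference-of-arrival (TDOA) multilateration, not the patented algorithm; the antenna layout and the noise-free measurements are fabricated for the example.

```python
# Illustrative sketch only (an assumption, not the patented algorithm): a generic
# time-difference-of-arrival (TDOA) multilateration that turns timing comparisons
# among fixed receiver antennae into a 3D position estimate.
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0  # propagation speed of the radio signals, m/s

def tdoa_position(antennae, tdoas, guess=(0.0, 0.0, 1.0)):
    """antennae: (N, 3) known antenna positions; tdoas: (N-1,) arrival-time
    differences of antennae[1:] relative to antennae[0], in seconds."""
    antennae = np.asarray(antennae, dtype=float)

    def residuals(p):
        d = np.linalg.norm(antennae - p, axis=1)       # range to each antenna
        return (d[1:] - d[0]) - C * np.asarray(tdoas)  # model vs. measured range differences

    return least_squares(residuals, np.asarray(guess, dtype=float)).x

# Four antennae at assumed positions and noise-free TDOAs simulated for a
# transmitter at (1.0, 2.0, 1.5); the solver recovers that position.
ants = [(0, 0, 0), (5, 0, 0), (0, 5, 0), (0, 0, 3)]
true_p = np.array([1.0, 2.0, 1.5])
ranges = np.linalg.norm(np.array(ants, dtype=float) - true_p, axis=1)
print(tdoa_position(ants, (ranges[1:] - ranges[0]) / C))
```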
The handheld wireless device 60 can be any type of electronics device typically powered by a self-contained power source and, for example, held, grasped, or worn by a user. The handheld wireless device 60 performs two functions: a transmitter function necessary for tracking the position of the device 60, and a haptic feedback function necessary for sensory feedback. The RF transmitter and controller block 90 and the transmitter antennae block 80 accomplish the transmitter function. The RF transmitter and controller block 90 generates the signals and waveforms required to feed the transmitter antennae block 80, such that when such signals and waveforms are received by the receiver antennae 10 and processed by the RF receiver system 20, a position of the handheld device 60 can be ascertained using the position and orientation calculation algorithm 30 running on the computing device.
A haptic device provides user input 62 to and receives user output 61 from a realistic virtual reality (VR) environment in conjunction with the RF tracking system 4 and the virtual system controller 40. The virtual system controller 40, which typically runs on a processor (not shown) with associated memory, program storage and interfaces, includes a program interface, haptic simulation rule generation and other functionality for integrating the user inputs and user outputs and positioning data into the interactive software program that produces the VR environment. The virtual system controller 40 (and associated hardware) can be separate from or integrated with the computer system that runs the interactive software program. The virtual system controller 40 is configured to process data from the RF tracking system 4 to change the VR environment over time.
Sensory feedback to the user (user output 61) can consist of, for example, vibrations, shocks, pressures, friction, motion restriction, and sound, as is known in the art. To produce the sensory feedback, a haptic interface controller 63 converts commands from the virtual system controller 40 into user feedback. The haptic interface controller 63 performs the transfer function necessary to drive the various haptic feedback devices 61 at "realistic" levels of feedback, depending on the type of feedback and its mechanization. In a typical digital system, a digital-to-analog converter, coupled with the appropriate other circuitry, is included in the haptic interface controller 63. Visual and auditory feedback can be performed separately and remotely from the handheld wireless device 60, for example, at the base station 6, or at an electronic device in communication with the base station 6.
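A minimal sketch of such a conversion is shown below, assuming an 8-bit digital-to-analog converter and a per-feedback-type gain. The function name, gain values, and feedback labels are illustrative assumptions, not the disclosed transfer function.

```python
# Illustrative sketch only (assumed 8-bit DAC and made-up gains, not the
# disclosed transfer function): converting an abstract haptic command into a
# DAC code that drives a vibration actuator, shaped per feedback type.
def haptic_command_to_dac(feedback_type: str, intensity: float) -> int:
    """intensity in [0.0, 1.0] mapped to an 8-bit DAC code for the actuator driver."""
    intensity = max(0.0, min(1.0, intensity))
    # Per-type shaping: a sharp "impact" is driven harder than a soft "texture".
    gain = {"impact": 1.0, "rumble": 0.6, "texture": 0.35}.get(feedback_type, 0.5)
    return round(gain * intensity * 255)

print(haptic_command_to_dac("impact", 0.9))   # strongest drive level
print(haptic_command_to_dac("rumble", 0.9))   # same intensity, gentler drive
```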
In a similar manner, haptic feedback from the user (user input 62) can consist of trigger pulls, pressure, temperature, sweat, heart rate, sound, etc., as is known in the art. These devices are interfaced to the haptic interface controller 63, which typically incorporates an analog-to-digital converter.
Two-way RF communication between the handheld device 60 and the base station 6, for transmitting and receiving haptic data, occurs between the haptic wireless interfaces 50 and 64. The haptic feedback device, which receives the user output 61 and produces the user input 62, communicates through the haptic interface controller 63 with the virtual system controller 40 using a haptic wireless interface 64. The haptic wireless interface 64 includes an antenna 65 for bi-directional communication with the wireless antenna 70 of the haptic wireless interface 50 coupled to the virtual system controller 40. These haptic wireless interfaces 50 and 64 use standard RF techniques, known in the art, and their antennae 70 and 65, respectively, to transmit and receive the haptic data corresponding to feedback to and from the user.
FIG. 2 shows another embodiment of a wireless haptic feedback system 2′ including an RF tracking system 4′, a base station 6′, and a handheld device 160. The RF tracking system 4′ includes RF receiver antennae 100, 170, an RF transceiver 120, a position and orientation calculation algorithm 130 running on a computing device, RF transmitter antennae 180, and an RF transceiver and controller 190. The base station 6′ includes the RF receiver antennae 100, 170, the RF transceiver 120, the position and orientation calculation algorithm 130, and a virtual system controller 140. This embodiment is similar to the embodiment of FIG. 1, with notable differences being that the RF tracking system 4′ uses the transceivers 120, 190 instead of the RF receiver 20 and the RF transmitter and controller 90. Accordingly, the two-way RF communication between the handheld device 160 and the base station 6′ (for transmitting and receiving haptic data) occurs between these transceivers 120, 190.
The handheld device 160 is held, grasped, worn, etc., by the user and is typically powered by a self-contained power source. The handheld device 160 performs two functions: a transmitter function necessary for tracking the position of the device 160, and a haptic feedback function necessary for sensory feedback. The RF transceiver and controller block 190 and the transmitter antennae block 180 accomplish the transmitter function. The RF transceiver and controller block 190 generates the signals and waveforms required to feed the transmitter antennae block 180, such that when such signals and waveforms are received by the receiver antennae 100 and processed by the RF transceiver 120, a position of the handheld device 160 can be ascertained using the position and orientation calculation algorithm 130 running on a computing device.
A haptic device acts as a user input 162 and user output 161 of a virtual environment in conjunction with the RF tracking system 4′ and the virtual system controller 140. The virtual system controller 140, which typically runs on a processor (not shown) with associated memory, program storage, and interfaces, includes a program interface, haptic simulation rule generation, and other functionality for integrating the user inputs and outputs and positioning data into a realistic VR environment. The virtual system controller 140 is configured to process data from the RF tracking system 4′ to change the virtual environment's evolution over time.
Haptic feedback to the user (user output 161) can consist of vibrations, shocks, pressures, friction, motion restriction, sound, etc., as is known in the art. These devices are controlled by a haptic interface controller 163, which converts commands from the virtual system controller 140 into user feedback. The haptic interface controller 163 performs the transfer function necessary to drive the various haptic feedback devices 161 at "realistic" levels of feedback, depending on the type of feedback and its mechanization. In a typical digital system, a digital-to-analog converter, coupled with the appropriate other circuitry, is included in the haptic interface controller 163.
In a similar manner, haptic feedback from the user (user input 162) can consist of trigger pulls, pressure, temperature, sweat, heart rate, sound, etc., as is known in the art. These devices are also interfaced to haptic interface controller 163, which typically incorporates an analog-to-digital converter.
The haptic feedback device that receives the user output 161 and produces the user input 162 communicates, through the haptic interface controller 163, with the virtual system controller 140 using a wireless interface. The wireless interface is bi-directional to the virtual system controller 140. These interfaces use standard RF techniques, known in the art, to transmit and receive data corresponding to feedback to and from the user. RF transceiver 120 and RF transceiver 190 use antennae 170 and 165, respectively, to transmit and receive this data.
In addition to providing the tracking transmitter function, transceiver block 190 contains a transmitter and receiver section for antenna 165. The antenna 165 communicates with the antenna 170, which is connected to the RF transceiver 120. The RF transceiver 120 also contains a transmitter and receiver section for the antenna 170, in addition to the receive function needed for tracking. The RF transceiver 120 communicates with the virtual system controller 140.
As is known in the art, RF transceivers provide transmitter and receiver functionality. This allows the receiver antennae 100 to receive signals while the antenna 170 can transmit and receive signals related to the haptic interface, all from the RF transceiver hardware 120. In a similar manner, the transmitter antennae 180 can transmit data while the antenna 165 can transmit and receive signals related to the haptic interface, all from the RF transceiver hardware 190. As is also known in the art, multiplexing schemes, such as switching, time multiplexing, frequency multiplexing, phase multiplexing, etc., can be designed into the RF transceiver 120 and/or the RF transceiver 190, such that antennae may be shared. For example, the antenna 170 and one of the receiver antennae 100 can be combined to reduce the antenna count (see insert A). In a similar manner, the antenna 165 and one of the transmitter antennae 180 can be combined to reduce the antenna count (see insert B).
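As one way to picture such antenna sharing, the short sketch below lays out an assumed time-division schedule in which a single antenna alternates between position-tracking reception and haptic-data slots. The frame length and slot counts are illustrative parameters, not values from the disclosure.

```python
# Illustrative sketch only (frame length and slot counts are assumed values):
# a time-division schedule that lets one physical antenna alternate between
# position-tracking reception and haptic-data communication.
def tdm_schedule(frame_ms: float = 10.0, tracking_slots: int = 8, haptic_slots: int = 2):
    """Yield (start_ms, duration_ms, role) slots for one repeating frame."""
    total = tracking_slots + haptic_slots
    slot_ms = frame_ms / total
    roles = ["tracking"] * tracking_slots + ["haptic"] * haptic_slots
    for i, role in enumerate(roles):
        yield (i * slot_ms, slot_ms, role)

for slot in tdm_schedule():
    print(slot)   # e.g. (0.0, 1.0, 'tracking') ... (9.0, 1.0, 'haptic')
```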
FIG. 3 shows an embodiment of a haptic feedback system 2″ including an RF tracking system 4″, a base station 6″, and a handheld device 260. The RF tracking system 4″ includes RF receiver antennae 200, 271, an RF receiver 220, a position and orientation calculation algorithm 230 running on a computing device, RF transmitter antennae 266, 280, and an RF transmitter and controller 290. The base station 6″ includes the RF receiver antennae 200, 271, the RF receiver 220, the position and orientation calculation algorithm 230, a virtual system controller 240, a haptic wireless interface 250, and a haptic wireless interface antenna 270. The RF receiver antennae 200, 271 are typically fixed and act as a tracking reference frame. This embodiment is similar to the embodiment of FIG. 1, with a notable difference being that the two-way wireless RF communication between the handheld device 260 and the base station 6″ (for transmitting and receiving haptic data) is split into two unidirectional paths, one path using the strengths of the unidirectional tracking system components (from the RF transmitter and controller 290 to the RF receiver 220) and the other path using a unidirectional interface (from the haptic wireless interface 250 to a haptic wireless interface 264 in the device 260).
The handheld device 260 is held, grasped, worn, etc., by the user and is typically powered by a self-contained power source. The handheld device 260 performs two functions: a transmitter function necessary for tracking the position of the device 260, and a haptic feedback function necessary for sensory feedback. The RF transmitter and controller block 290 and the transmitter antennae block 280 accomplish the transmitter function. The RF transmitter and controller 290 generates the signals and waveforms required to feed the transmitter antennae 280, such that when such signals and waveforms are received by the receiver antennae 200 and processed by the RF receiver 220, a position of the handheld device 260 can be ascertained using the position and orientation calculation algorithm 230 running on the computing device.
A haptic device acts as a user input 262 and user output 261 of a virtual environment in conjunction with the RF tracking system 4″ and the virtual system controller 240. The virtual system controller 240, which typically runs on a processor (not shown) with associated memory, program storage, and interfaces, includes a program interface, haptic simulation rule generation, and other functionality for integrating the user inputs and outputs and positioning data into a realistic VR environment. The virtual system controller 240 is configured to process data from the RF tracking system 4″ to change the virtual environment's evolution over time.
Haptic feedback to the user (user output 261) can consist of vibrations, shocks, pressures, friction, motion restriction, sound, etc., as is known in the art. These devices are controlled by a haptic interface controller 263, which converts commands from the virtual system controller 240 into user feedback. The haptic interface controller 263 realizes the transfer function necessary to drive the various haptic feedback devices 261 at "realistic" levels of feedback, depending on the type of feedback and its mechanization. In a typical digital system, a digital-to-analog converter, coupled with the appropriate other circuitry, is included in the haptic interface controller 263.
In a similar manner, haptic feedback from the user (user input 262) can consist of trigger pulls, pressure, temperature, sweat, heart rate, sound, etc., as is known in the art. These devices are also interfaced to the haptic interface controller 263, which typically incorporates an analog-to-digital converter. The haptic feedback device, which receives the user output 261 and produces the user input 262, communicates, through the haptic interface controller 263, with the virtual system controller 240 using two wireless interfaces.
Transmission of the data from the user haptic interface 264 to the virtual system controller 240 over one of the two wireless interfaces occurs through the same RF components used for position tracking (i.e., RF components 290, 280, 200, 220). Separate antennae/channels are shown in FIG. 3 for the haptic wireless interface to the virtual system controller 240. These separate channels comprise a transmitting antenna 266 and a receiving antenna 271. These antennae 266, 271 may be physically separate antennae, as shown, or they may be incorporated into the tracking system antennae framework. In FIG. 3, insert C indicates that receiver antennae 200 and antenna 271 are combined. In a similar manner, transmitter antennae 280 and antenna 266 are combined, as signified by insert D. The RF receiver 220 separates data into position-tracking data and haptic interface data, and sends the haptic interface data to the virtual system controller 240 for further processing.
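The separation performed at the receiver can be pictured with the small sketch below. The packet dictionaries and the "kind" field are an assumed format invented for the example, not a disclosed data format.

```python
# Illustrative sketch only (the packet dictionaries and the "kind" field are an
# assumed format): separating position-tracking packets from haptic-interface
# packets so the haptic data can be routed to the virtual system controller.
def demultiplex(packets):
    tracking, haptic = [], []
    for pkt in packets:
        (haptic if pkt.get("kind") == "haptic" else tracking).append(pkt)
    return tracking, haptic

received = [{"kind": "tracking", "toa_ns": 1042.7},
            {"kind": "haptic", "payload": {"grip": 0.8, "trigger": True}}]
tracking_pkts, haptic_pkts = demultiplex(received)
print(len(tracking_pkts), len(haptic_pkts))   # 1 1
```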
Transmission of data back, from the virtual system controller 240 to the user haptic device as haptic feedback 261, occurs over the other of the two wireless interfaces. This transmission is accomplished by the haptic wireless interface 250 transmitting the data over the haptic wireless interface antenna 270 at the base station 6″ to its counterpart antenna 265 in the handheld device 260. The haptic wireless interface 264 processes the data received by the antenna 265 and sends the appropriate information to the haptic interface controller 263, for conversion into the appropriate formats for controlling the feedback to the user.
As is known in the art, multiplexing schemes, such as switching, time multiplexing, frequency multiplexing, phase multiplexing, etc., can be designed into RF receiver 220 and/or the RF transmitter and controller 290 such that antennae may be shared. For example, the antenna 271 and one of receiver antennae 200 can be combined to reduce the antenna count (see insert C). In a similar manner, antenna 266 and one of transmitter antennae 280 can be combined to reduce the antenna count (see insert D).
FIG. 4 shows an embodiment of a process 480 that can be performed by any of the embodiments described in connection with FIG. 1, FIG. 2, and FIG. 3. For purposes of the description of the process 480, reference is made to the elements of FIG. 1, unless specifically indicated otherwise. Initially, the virtual system controller 40 generates (step 500) a virtual reality scenario. This virtual reality scenario can take, for example, the form of a game or a simulation. Various points of interest are registered in the virtual reality scenario. The virtual reality scenario is presented (step 510) to the user, where visual scenes, cues, and haptic data are generated for supplying to the user. To generate the haptic data, the virtual system controller 40 correlates the computed position of the handheld device to the points of interest identified in the virtual reality scenario. When the handheld device moves to one of these points of interest, the virtual system controller 40 identifies and generates the appropriate type of haptic data associated with that point of interest.
The base station 6 transmits (step 520) haptic data to the handheld device 60. Substantially simultaneously, the tracking system components of the RF tracking system 4 track (step 530) the handheld device 60. Additionally, user haptic input 62 is monitored (step 540) for changes and sent back (step 550) to the virtual system controller 40 using an RF link between the antennae 65, 70 of the haptic wireless interfaces 64, 50, respectively. The process 480 repeats until the virtual reality scenario ends.
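The process 480 can be read as a simple control loop. The sketch below is only an illustrative rendering under assumed, simplified interfaces; the stub classes merely print each step so the flow of steps 500 through 550 can be followed end to end, and none of the class or method names come from the disclosure.

```python
# Illustrative sketch only (assumed, simplified interfaces): the feedback cycle
# of FIG. 4 expressed as a loop; the stub classes stand in for the controller,
# the RF tracking system, and the handheld device.
class VirtualSystemController:
    def __init__(self, frames=3):
        self.frames = frames
    def generate_scenario(self):
        print("step 500: virtual reality scenario generated")
    def scenario_ended(self):
        self.frames -= 1
        return self.frames < 0
    def present_scenario(self):
        print("step 510: scenario presented to the user")
    def haptic_data(self):
        return {"feedback": "vibrate", "intensity": 0.5}
    def update(self, position, user_input):
        print("step 550: controller updated with", position, user_input)

class TrackingSystem:
    def track(self, device):
        print("step 530: handheld device tracked")
        return (0.1, 0.2, 0.3)

class HandheldDevice:
    def receive_haptic(self, data):
        print("step 520: haptic data transmitted to device:", data)
    def read_haptic_input(self):
        print("step 540: user haptic input monitored")
        return {"grip": 0.7}

controller, tracker, device = VirtualSystemController(), TrackingSystem(), HandheldDevice()
controller.generate_scenario()
while not controller.scenario_ended():
    controller.present_scenario()
    device.receive_haptic(controller.haptic_data())
    position = tracker.track(device)
    user_input = device.read_haptic_input()
    controller.update(position, user_input)
```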
As known in the art, various other methods and means are available to enhance the tracking performance of the described RF tracking system 4. Inertial sensors such as accelerometers and gyroscopes, within the wireless handheld device, can provide short-term navigation improvements. Magnetic sensors can determine orientation relative to the earth's magnetic field. These sensory inputs can be optimally combined, such as with a Kalman filter, to provide better tracking accuracy and responsiveness.
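As a rough illustration of such sensor fusion, the following one-dimensional Kalman filter blends an acceleration-driven prediction with intermittent RF position fixes. The noise parameters, time step, and simulated measurements are assumptions for the demonstration, not values from the described system.

```python
# Illustrative sketch only (noise parameters, time step, and measurements are
# assumptions for the demo): a one-dimensional Kalman filter that blends an
# acceleration-driven prediction with intermittent RF position fixes.
import numpy as np

def kalman_fuse(accels, rf_positions, dt=0.01, accel_var=0.5, rf_var=0.01):
    """accels: per-step acceleration (m/s^2); rf_positions: per-step RF fix (m) or None."""
    x = np.zeros(2)                            # state: [position, velocity]
    P = np.eye(2)                              # state covariance
    F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity transition
    B = np.array([0.5 * dt * dt, dt])          # acceleration input vector
    Q = accel_var * np.outer(B, B)             # process noise from accelerometer uncertainty
    H = np.array([[1.0, 0.0]])                 # the RF fix observes position only
    estimates = []
    for a, z in zip(accels, rf_positions):
        x = F @ x + B * a                      # predict with inertial data
        P = F @ P @ F.T + Q
        if z is not None:                      # correct with an RF position fix
            y = z - (H @ x)[0]
            S = (H @ P @ H.T)[0, 0] + rf_var
            K = (P @ H.T)[:, 0] / S
            x = x + K * y
            P = (np.eye(2) - np.outer(K, H[0])) @ P
        estimates.append(x[0])
    return estimates

# Constant 1 m/s^2 acceleration from rest, with an RF fix every 10th step.
n = 50
fixes = [0.5 * (i * 0.01) ** 2 if i % 10 == 0 else None for i in range(n)]
est = kalman_fuse([1.0] * n, fixes)
print(round(est[-1], 3))   # estimated position after 0.5 s of motion
```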
In addition, although each of the embodiments described herein performs the position tracking calculations at the RF tracking system separate from the wireless handheld device, in other embodiments, the handheld device is configured to perform the calculations. In such embodiments, the wireless device receives the timing information acquired from the radio signals received by the receiver antennae and calculates its own position from this timing information. Further, the wireless device can send its calculated position back to the base station for use by the virtual system controller. The virtual system controller uses the position data, in conjunction with the human-computer interactive software, to produce haptic alert inputs. These haptic alert inputs, when sent to the wireless device, activate the haptic or auditory (or both) function at the wireless device.
In still another embodiment, the wireless device can perform the position calculation and execute the human-computer interactive software. This embodiment can forego any need for the wireless device to transmit the calculated position to the base station or for the base station to transmit the haptic alert inputs to the wireless device. The wireless device, itself, generates the haptic alerts (and subsequent sensory feedback) based on its calculated position and points of registration within the human-computer interactive software.
In addition, although described herein with a single wireless device, it is to be understood that the various aforementioned embodiments of sensory feedback systems can simultaneously track the positions of multiple wireless devices engaged in a virtual reality environment produced by an interactive computer program. These sensory feedback systems can activate sensory feedback in each of these devices individually, in accordance with the particular points of interest with which their respective positions correlate.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, and computer program product. Thus, aspects of the present invention may be embodied entirely in hardware, entirely in software (including, but not limited to, firmware, program code, resident software, microcode), or in a combination of hardware and software. All such embodiments may generally be referred to herein as a circuit, a module, or a system. In addition, aspects of the present invention may be in the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wired, optical fiber cable, radio frequency (RF), etc. or any suitable combination thereof.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as JAVA, Smalltalk, C#, C++, and Visual C++ or the like and conventional procedural programming languages, such as the C and Pascal programming languages or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Aspects of the described invention may be implemented in one or more integrated circuit (IC) chips manufactured with semiconductor-fabrication processes. The maker of the IC chips can distribute them in raw wafer form (on a single wafer with multiple unpackaged chips), as bare die, or in packaged form. When in packaged form, the IC chip is mounted in a single chip package, for example, a plastic carrier with leads affixed to a motherboard or other higher level carrier, or in a multichip package, for example, a ceramic carrier having surface and/or buried interconnections. The IC chip is then integrated with other chips, discrete circuit elements, and/or other signal processing devices as part of either an intermediate product, such as a motherboard, or of an end product. The end product can be any product that includes IC chips, ranging from electronic gaming systems and other low-end applications to advanced computer products having a display, an input device, and a central processor.
Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It is to be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed.
While the invention has been shown and described with reference to specific preferred embodiments, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the following claims.

Claims (15)

What is claimed is:
1. A system for wirelessly tracking a multi-dimensional position of, and for providing sensory feedback to, a wireless device, the system comprising:
a wireless device having a radio-frequency (RF) receiver, an RF transmitter and controller, and means for producing sensory feedback to a user of the wireless device, the RF transmitter transmitting radio signals;
a position-tracking system including a base station having at least three antennae in communication with a receiver, the at least three antennae receiving the radio signals transmitted by the wireless device, the receiver acquiring timing information from the radio signals received by the at least three antennae, the base station including an RF transmitter that transmits radio signals conveying the timing information, the RF receiver of the wireless device receiving the timing information conveyed by the radio signals transmitted by the RF transmitter of the base station and the controller of the wireless device computing a physical multi-dimensional position of the wireless device within a physical three-dimensional environment using triangulation or trilateration based on the received timing information; and
a processor receiving the computed physical multi-dimensional position of the wireless device, correlating the computed physical multi-dimensional position of the wireless device to a point of interest registered with an interactive software program that produces a virtual reality environment, and generating, in response to the physical multi-dimensional position of the wireless device within the physical three-dimensional environment correlated with the registered point of interest, haptic data configured to activate the sensory feedback producing means of the wireless device.
2. The system of claim 1, wherein the means for producing sensory feedback produces any one or combination of haptic feedback, auditory feedback, or visual feedback.
3. The system of claim 1, wherein the wireless device includes one or more inertial sensors to provide an orientation of the wireless device to the position-tracking system.
4. A method of providing sensory feedback to a wireless device, the method comprising:
receiving, at three or more antennae, radio signals emitted by a wireless device being used to interact with a virtual reality environment produced by an executing interactive computer program;
acquiring timing information from the radio signals received by the at least three antennae;
receiving, by the wireless device, radio signals conveying the timing information;
computing, by the wireless device, a physical multi-dimensional position of the wireless device within a three-dimensional environment using triangulation or trilateration based on timing information from the radio signals received at the three or more antennae;
correlating the computed physical multi-dimensional position of the wireless device within the three-dimensional environment to a point of interest registered with the interactive computer program;
generating, in response to the correlated physical multi-dimensional position of the wireless device, haptic data configured to activate a particular sensory feedback; and
activating, in response to the haptic data, the particular sensory feedback at the wireless device.
5. The method of claim 4, wherein the particular sensory feedback includes any one or combination of haptic feedback, auditory feedback, or visual feedback.
6. The method of claim 4, further comprising providing an orientation of the wireless device determined by an inertial sensor to the position-tracking system.
7. A method of providing sensory feedback to a wireless device, the method comprising:
registering points of interest within a software program;
receiving radio signals transmitted by the wireless device;
transmitting radio signals to the wireless device with timing information acquired from the received radio signals;
computing, by the wireless device, a physical position of an object within a physical three-dimensional environment based on the timing information transmitted to the wireless device;
correlating the physical position of the object within the physical three-dimensional environment to one of the points of interest registered within the software program; and
generating haptic data configured to activate a particular sensory feedback on the wireless device in response to correlating the physical position of the object to one of the points of interest.
8. The method of claim 7, further comprising:
transmitting radio signals with the haptic data to the wireless device in response to receiving the radio signals transmitted by the wireless device that conveys the computed physical position of the object; and
activating, in response to receiving the haptic data, the particular sensory feedback at the wireless device.
9. The method of claim 8, wherein the wireless device comprises the object, and further comprising:
receiving the radio signals transmitted from the wireless device at three or more receiver antennae; and
acquiring the timing information from the radio signals transmitted from the wireless device and received at the three or more receiver antennae.
10. The method of claim 9, wherein the transmission of the haptic data occurs over a different RF channel from an RF channel by which the radio signals transmitted from the wireless device are received at the three or more receiver antennae.
11. The method of claim 9, wherein the transmission of the haptic data to the wireless device occurs over an RF channel that shares a receiver antenna of the three or more receiver antennae with an RF channel over which the radio signals that are transmitted from the wireless device and received at the three or more receiver antennae.
12. The method of claim 7, further comprising receiving radio signals with haptic input over an RF channel, and wherein the transmission of the haptic data to the wireless device occurs over a different RF channel from the RF channel over which the radio signals with the haptic input are received.
13. The method of claim 7, wherein the wireless device comprises the object, and further comprising:
receiving the radio signals transmitted by the wireless device at three or more receiver antennae over a first RF channel;
acquiring the timing information from the radio signals that are transmitted by the wireless device and received at the three or more receiver antennae; and
receiving radio signals with haptic input from the wireless device over a second RF channel that shares a receiver antenna of the three or more receiver antenna with the first RF channel.
14. The method of claim 7, further comprising receiving the computed physical position of the object within the physical three-dimensional environment in radio signals transmitted by the wireless device over an RF channel.
15. The method of claim 7, wherein the steps of correlating the physical position of the object within the physical three-dimensional environment to one of the points of interest registered within the software program and of generating haptic data configured to activate a particular sensory feedback on the wireless device in response to correlating the physical position of the object to one of the points of interest are performed by the wireless device.
US13/918,295 2012-06-14 2013-06-14 RF tracking with active sensory feedback Active 2035-11-30 US9782669B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/918,295 US9782669B1 (en) 2012-06-14 2013-06-14 RF tracking with active sensory feedback
US15/687,779 US10269182B2 (en) 2012-06-14 2017-08-28 RF tracking with active sensory feedback

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261659544P 2012-06-14 2012-06-14
US13/918,295 US9782669B1 (en) 2012-06-14 2013-06-14 RF tracking with active sensory feedback

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/687,779 Continuation-In-Part US10269182B2 (en) 2012-06-14 2017-08-28 RF tracking with active sensory feedback

Publications (1)

Publication Number Publication Date
US9782669B1 true US9782669B1 (en) 2017-10-10

Family

ID=59982113

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/918,295 Active 2035-11-30 US9782669B1 (en) 2012-06-14 2013-06-14 RF tracking with active sensory feedback

Country Status (1)

Country Link
US (1) US9782669B1 (en)

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170327083A1 (en) * 2016-05-10 2017-11-16 Volkswagen Ag Method and system for secure access to a vehicle
US20170372524A1 (en) * 2012-06-14 2017-12-28 Position Imaging, Inc. Rf tracking with active sensory feedback
US10148918B1 (en) 2015-04-06 2018-12-04 Position Imaging, Inc. Modular shelving systems for package tracking
US10180490B1 (en) 2012-08-24 2019-01-15 Position Imaging, Inc. Radio frequency communication system
US10200819B2 (en) 2014-02-06 2019-02-05 Position Imaging, Inc. Virtual reality and augmented reality functionality for mobile devices
US10234539B2 (en) 2012-12-15 2019-03-19 Position Imaging, Inc. Cycling reference multiplexing receiver system
US10237698B2 (en) 2013-01-18 2019-03-19 Position Imaging, Inc. System and method of locating a radio frequency (RF) tracking device using a calibration routine
US10257654B2 (en) 2014-01-17 2019-04-09 Position Imaging, Inc. Wireless relay station for radio frequency-based tracking system
US10324474B2 (en) 2015-02-13 2019-06-18 Position Imaging, Inc. Spatial diversity for relative position tracking
US10416276B2 (en) 2010-11-12 2019-09-17 Position Imaging, Inc. Position tracking system and method using radio signals and inertial sensing
US10444323B2 (en) 2016-03-08 2019-10-15 Position Imaging, Inc. Expandable, decentralized position tracking systems and methods
US10455364B2 (en) 2016-12-12 2019-10-22 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10605904B2 (en) 2011-11-10 2020-03-31 Position Imaging, Inc. Systems and methods of wireless position tracking
US10634761B2 (en) 2013-12-13 2020-04-28 Position Imaging, Inc. Tracking system with mobile reader
US10634503B2 (en) 2016-12-12 2020-04-28 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10634506B2 (en) 2016-12-12 2020-04-28 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10642560B2 (en) 2015-02-13 2020-05-05 Position Imaging, Inc. Accurate geographic tracking of mobile devices
US10856108B2 (en) 2013-01-18 2020-12-01 Position Imaging, Inc. System and method of locating a radio frequency (RF) tracking device using a calibration routine
US10853757B1 (en) 2015-04-06 2020-12-01 Position Imaging, Inc. Video for real-time confirmation in package tracking systems
US11089232B2 (en) 2019-01-11 2021-08-10 Position Imaging, Inc. Computer-vision-based object tracking and guidance module
US11120392B2 (en) 2017-01-06 2021-09-14 Position Imaging, Inc. System and method of calibrating a directional light source relative to a camera's field of view
US11132004B2 (en) 2015-02-13 2021-09-28 Position Imaging, Inc. Spatial diveristy for relative position tracking
US11175375B2 (en) 2010-11-12 2021-11-16 Position Imaging, Inc. Position tracking system and method using radio signals and inertial sensing
US11361536B2 (en) 2018-09-21 2022-06-14 Position Imaging, Inc. Machine-learning-assisted self-improving object-identification system and method
US11416805B1 (en) 2015-04-06 2022-08-16 Position Imaging, Inc. Light-based guidance for package tracking systems
US11436553B2 (en) 2016-09-08 2022-09-06 Position Imaging, Inc. System and method of object tracking using weight confirmation
US11501244B1 (en) 2015-04-06 2022-11-15 Position Imaging, Inc. Package tracking systems and methods
WO2023043874A1 (en) * 2021-09-17 2023-03-23 Stryker Corporation Patient support apparatuses with patient monitoring

Citations (81)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3824596A (en) 1972-09-27 1974-07-16 Southwest Res Inst Automatic sector indicating direction finder system
US3940700A (en) 1972-08-15 1976-02-24 Paul Haas Method and installation for the detection of a source generating electro-magnetic oscillations
US4328499A (en) 1979-10-24 1982-05-04 The Marconi Company Limited Radio direction finding systems
US5010343A (en) 1988-04-26 1991-04-23 Vaisala Oy Method and device in the antenna and receiving system of a radio theodolite
US5426438A (en) 1992-12-30 1995-06-20 Delfin Systems Method and apparatus for adaptively determining the bearing angle of a radio frequency signal
US5574468A (en) 1995-04-20 1996-11-12 Litton Systems, Inc. Phase-equivalent interferometer arrays
US5592180A (en) 1992-08-20 1997-01-07 Nexus1994 Limited Direction finding and mobile location system for trunked mobile radio systems
US5600330A (en) 1994-07-12 1997-02-04 Ascension Technology Corporation Device for measuring position and orientation using non-dipole magnet IC fields
US5657026A (en) 1996-01-26 1997-08-12 Electronic Tracking Systems, Inc. Beacon signal receiving system
US5923286A (en) 1996-10-23 1999-07-13 Honeywell Inc. GPS/IRS global position determination method and apparatus with integrity loss provisions
US5953683A (en) 1997-10-09 1999-09-14 Ascension Technology Corporation Sourceless orientation sensor
US6167347A (en) 1998-11-04 2000-12-26 Lin; Ching-Fang Vehicle positioning method and system thereof
US6255991B1 (en) 2000-01-19 2001-07-03 Trw Inc. Low cost angle of arrival measurement system
US20020021277A1 (en) * 2000-04-17 2002-02-21 Kramer James F. Interface for controlling a graphical image
US6409687B1 (en) 1998-04-17 2002-06-25 Massachusetts Institute Of Technology Motion tracking system
US6417802B1 (en) 2000-04-26 2002-07-09 Litton Systems, Inc. Integrated inertial/GPS navigation system
US6496778B1 (en) 2000-09-14 2002-12-17 American Gnc Corporation Real-time integrated vehicle positioning method and system with differential GPS
US6512748B1 (en) 1998-01-30 2003-01-28 Ntt Mobile Communications Network Inc. Radio paging signal coding control scheme using variable number of logical channels according to paging signal traffic
US20030053492A1 (en) 2000-09-01 2003-03-20 Osamu Matsunaga Multiplexer, receiver, and multiplex transmission method
US20030120425A1 (en) 2001-12-26 2003-06-26 Kevin Stanley Self-correcting wireless inertial navigation system and method
US6593885B2 (en) 2000-04-27 2003-07-15 Wherenet Corp Low cost DTOA location processing system based on multiple readers-to-single processor architecture
US6630904B2 (en) 1999-02-02 2003-10-07 The Charles Stark Draper Laboratory, Inc. Deeply-integrated adaptive INS/GPS navigator with extended-range code tracking
US20030195017A1 (en) * 1999-09-30 2003-10-16 Tao Chen Wireless communication system with base station beam sweeping
US6683568B1 (en) 1999-05-14 2004-01-27 Auckland Uniservices Limited Position estimation services
US6697736B2 (en) 2002-02-06 2004-02-24 American Gnc Corporation Positioning and navigation method and system thereof
US6721657B2 (en) 2001-06-04 2004-04-13 Novatel, Inc. Inertial GPS navigation system
US20040095907A1 (en) 2000-06-13 2004-05-20 Agee Brian G. Method and apparatus for optimization of wireless multipoint electromagnetic communication networks
US6750816B1 (en) 2003-02-12 2004-06-15 Novatel, Inc. Integrated GPS-inertial system
US20040176102A1 (en) 2001-11-20 2004-09-09 Integrinautics Corporation Multiple antenna multi-frequency measurement system
US20050143916A1 (en) 2003-12-26 2005-06-30 In-Jun Kim Positioning apparatus and method combining RFID, GPS and INS
US20050184907A1 (en) 2002-03-18 2005-08-25 Hall Christopher J. Method and apparatus for geolocating a wireless communications device
US20050275626A1 (en) 2000-06-21 2005-12-15 Color Kinetics Incorporated Entertainment lighting system
US20060013070A1 (en) 2002-12-04 2006-01-19 Sverre Holm Ultrasonic tracking and locating system
US6989789B2 (en) 2000-02-25 2006-01-24 Thales Method for locating radioelectric sources using two-channel high resolution radiogoniometer
US7009561B2 (en) 2003-03-11 2006-03-07 Menache, Llp Radio frequency motion tracking system and method
US20060061469A1 (en) 2004-09-21 2006-03-23 Skyfence Inc. Positioning system that uses signals from a point source
US20060066485A1 (en) * 2004-09-24 2006-03-30 Guohua Min Wireless tracking system based upon phase differences
US20060101497A1 (en) 2004-06-25 2006-05-11 International Business Machines Corporation User-aware video display
US7143004B2 (en) 1998-12-09 2006-11-28 Microstrain, Inc. Solid state orientation sensor with 360 degree measurement capability
US7190309B2 (en) 2004-09-24 2007-03-13 Hill Edward L Radio signal transmitter with multiple antennas for improved position detection
US20070060384A1 (en) 2005-09-14 2007-03-15 Nintendo Co., Ltd. Storage medium storing video game program
US7193559B2 (en) 2003-01-21 2007-03-20 Novatel, Inc. Inertial GPS navigation system with modified kalman filter
US7236091B2 (en) 2005-02-10 2007-06-26 Pinc Solutions Position-tracking system
US7295925B2 (en) 1997-10-22 2007-11-13 Intelligent Technologies International, Inc. Accident avoidance systems and methods
US20080048913A1 (en) 2004-12-02 2008-02-28 Commissariat A L'energie Atomique Local Positioning System and Method
US7409290B2 (en) 2004-04-17 2008-08-05 American Gnc Corporation Positioning and navigation method and system thereof
US20080204322A1 (en) 2003-11-03 2008-08-28 Gordon Kenneth Andrew Oswald Determining Positional Information
US7443342B2 (en) 2004-01-13 2008-10-28 The Nippon Signal Co., Ltd. Reception time determining apparatus and distance measuring apparatus using the same
US20080316324A1 (en) 2007-06-22 2008-12-25 Broadcom Corporation Position detection and/or movement tracking via image capture and processing
US7533569B2 (en) 2006-03-15 2009-05-19 Qualcomm, Incorporated Sensor-based orientation system
US20090149202A1 (en) 2007-12-07 2009-06-11 Christian Steele System and method for determination of position
US20090243932A1 (en) 2008-03-31 2009-10-01 Radliofy Llc Methods and Systems for Determining the Location of an Electronic Device
US7612715B2 (en) 2003-07-12 2009-11-03 Qinetiq Limited Direction finding
US7646330B2 (en) 2005-03-14 2010-01-12 Alfred E. Mann Foundation For Scientific Research System and method for locating objects and communicating with the same
US20100103989A1 (en) 2006-10-17 2010-04-29 Smith Stephen F Robust Low-Frequency Spread-Spectrum Navigation System
US20100103173A1 (en) 2008-10-27 2010-04-29 Minkyu Lee Real time object tagging for interactive image display applications
US20100123664A1 (en) * 2008-11-14 2010-05-20 Samsung Electronics Co., Ltd. Method for operating user interface based on motion sensor and a mobile terminal having the user interface
US20110006774A1 (en) 2007-05-24 2011-01-13 Penguin Automated Systems ,Inc. Subsurface positioning system and method for monitoring movement underground
US7876268B2 (en) 2006-08-22 2011-01-25 James P. Jacobs System and method for precise location of receiving antennas
US20110187600A1 (en) 2010-01-29 2011-08-04 Tc License Ltd. System and method for measurement of distance to a tag by a modulated backscatter rfid reader
US20110208481A1 (en) 2010-02-19 2011-08-25 Vladimir Slastion Extended range interferometric methods and systems
US20110210843A1 (en) 2010-03-01 2011-09-01 Andrew Llc System and method for location of mobile devices in confined environments
US20110241942A1 (en) 2010-04-02 2011-10-06 Position Imaging, Inc. Multiplexing receiver system
US20110256882A1 (en) 2005-12-15 2011-10-20 Invisitrack, Inc. Multi-Path Mitigation in Rangefinding and Tracking Objects Using Reduced Attenuation RF Technology
US20120013509A1 (en) 2010-07-14 2012-01-19 Zebra Enterprise Solutions Corp. Frequency channel diversity for real-time locating systems, methods, and computer program products
US20120127088A1 (en) * 2010-11-19 2012-05-24 Apple Inc. Haptic input device
US20120184285A1 (en) 2011-01-14 2012-07-19 Qualcomm Incorporated Methods and apparatuses for use in providing positioning assistance data to mobile stations via a self-organizing network
US8269624B2 (en) 2006-12-27 2012-09-18 Industrial Technology Research Institute Positioning apparatus and method
US20130021417A1 (en) 2011-07-21 2013-01-24 Seiko Epson Corporation Recording apparatus
US20130036043A1 (en) 2011-07-06 2013-02-07 Patrick Faith Image-based product mapping
US8457655B2 (en) 2011-09-19 2013-06-04 Qualcomm Incorporated Hybrid time of arrival based positioning system
US20130314210A1 (en) 2012-05-22 2013-11-28 Trimble Navigation Limited Multi-modal entity tracking and display
US20140253368A1 (en) 2008-09-12 2014-09-11 Propagation Research Associates, Inc. Multi-mode, multi-static interferometer utilizing pseudo orthogonal codes
US20140300516A1 (en) 2011-11-10 2014-10-09 Position Imaging, Inc. Systems and methods of wireless position tracking
US20150009949A1 (en) 2013-07-08 2015-01-08 Alexey Khoryaev Synchronizing peer-to-peer operation for outside network coverage and partial network coverage using lte air interface
US20150091757A1 (en) 2013-09-30 2015-04-02 At&T Intellectual Property I, Lp Systems and Methods for High Precision Indoor Location Tracking
US20150169916A1 (en) 2013-12-13 2015-06-18 Position Imaging, Inc. Tracking system with mobile reader
US20150323643A1 (en) 2012-12-15 2015-11-12 Position Imaging, Inc. Cycling reference multiplexing receiver system
US20160256100A1 (en) 2011-07-22 2016-09-08 Tci3-Pressure Applications Llc Systems and methods for monitoring and providing therapeutic support for a user
US9482741B1 (en) 2013-01-18 2016-11-01 Position Imaging, Inc. System and method of locating a radio frequency (RF) tracking device using a calibration routine
US20160371574A1 (en) 2013-05-06 2016-12-22 The Johns Hopkins University System for preventing instrument retention

Patent Citations (83)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3940700A (en) 1972-08-15 1976-02-24 Paul Haas Method and installation for the detection of a source generating electro-magnetic oscillations
US3824596A (en) 1972-09-27 1974-07-16 Southwest Res Inst Automatic sector indicating direction finder system
US4328499A (en) 1979-10-24 1982-05-04 The Marconi Company Limited Radio direction finding systems
US5010343A (en) 1988-04-26 1991-04-23 Vaisala Oy Method and device in the antenna and receiving system of a radio theodolite
US5592180A (en) 1992-08-20 1997-01-07 Nexus1994 Limited Direction finding and mobile location system for trunked mobile radio systems
US5426438A (en) 1992-12-30 1995-06-20 Delfin Systems Method and apparatus for adaptively determining the bearing angle of a radio frequency signal
US5600330A (en) 1994-07-12 1997-02-04 Ascension Technology Corporation Device for measuring position and orientation using non-dipole magnet IC fields
US5574468A (en) 1995-04-20 1996-11-12 Litton Systems, Inc. Phase-equivalent interferometer arrays
US5657026A (en) 1996-01-26 1997-08-12 Electronic Tracking Systems, Inc. Beacon signal receiving system
US5923286A (en) 1996-10-23 1999-07-13 Honeywell Inc. GPS/IRS global position determination method and apparatus with integrity loss provisions
US5953683A (en) 1997-10-09 1999-09-14 Ascension Technology Corporation Sourceless orientation sensor
US7295925B2 (en) 1997-10-22 2007-11-13 Intelligent Technologies International, Inc. Accident avoidance systems and methods
US6512748B1 (en) 1998-01-30 2003-01-28 Ntt Mobile Communications Network Inc. Radio paging signal coding control scheme using variable number of logical channels according to paging signal traffic
US6409687B1 (en) 1998-04-17 2002-06-25 Massachusetts Institute Of Technology Motion tracking system
US6167347A (en) 1998-11-04 2000-12-26 Lin; Ching-Fang Vehicle positioning method and system thereof
US6292750B1 (en) 1998-11-04 2001-09-18 Ching-Fang Lin Vehicle positioning method and system thereof
US7143004B2 (en) 1998-12-09 2006-11-28 Microstrain, Inc. Solid state orientation sensor with 360 degree measurement capability
US6630904B2 (en) 1999-02-02 2003-10-07 The Charles Stark Draper Laboratory, Inc. Deeply-integrated adaptive INS/GPS navigator with extended-range code tracking
US6683568B1 (en) 1999-05-14 2004-01-27 Auckland Uniservices Limited Position estimation services
US20030195017A1 (en) * 1999-09-30 2003-10-16 Tao Chen Wireless communication system with base station beam sweeping
US6255991B1 (en) 2000-01-19 2001-07-03 Trw Inc. Low cost angle of arrival measurement system
US6989789B2 (en) 2000-02-25 2006-01-24 Thales Method for locating radioelectric sources using two-channel high resolution radiogoniometer
US20020021277A1 (en) * 2000-04-17 2002-02-21 Kramer James F. Interface for controlling a graphical image
US6417802B1 (en) 2000-04-26 2002-07-09 Litton Systems, Inc. Integrated inertial/GPS navigation system
US6593885B2 (en) 2000-04-27 2003-07-15 Wherenet Corp Low cost DTOA location processing system based on multiple readers-to-single processor architecture
US20040095907A1 (en) 2000-06-13 2004-05-20 Agee Brian G. Method and apparatus for optimization of wireless multipoint electromagnetic communication networks
US20050275626A1 (en) 2000-06-21 2005-12-15 Color Kinetics Incorporated Entertainment lighting system
US20030053492A1 (en) 2000-09-01 2003-03-20 Osamu Matsunaga Multiplexer, receiver, and multiplex transmission method
US6496778B1 (en) 2000-09-14 2002-12-17 American Gnc Corporation Real-time integrated vehicle positioning method and system with differential GPS
US6721657B2 (en) 2001-06-04 2004-04-13 Novatel, Inc. Inertial GPS navigation system
US20040176102A1 (en) 2001-11-20 2004-09-09 Integrinautics Corporation Multiple antenna multi-frequency measurement system
US20030120425A1 (en) 2001-12-26 2003-06-26 Kevin Stanley Self-correcting wireless inertial navigation system and method
US6697736B2 (en) 2002-02-06 2004-02-24 American Gnc Corporation Positioning and navigation method and system thereof
US20050184907A1 (en) 2002-03-18 2005-08-25 Hall Christopher J. Method and apparatus for geolocating a wireless communications device
US20060013070A1 (en) 2002-12-04 2006-01-19 Sverre Holm Ultrasonic tracking and locating system
US7193559B2 (en) 2003-01-21 2007-03-20 Novatel, Inc. Inertial GPS navigation system with modified kalman filter
US6750816B1 (en) 2003-02-12 2004-06-15 Novatel, Inc. Integrated GPS-inertial system
US7009561B2 (en) 2003-03-11 2006-03-07 Menache, Llp Radio frequency motion tracking system and method
US7612715B2 (en) 2003-07-12 2009-11-03 Qinetiq Limited Direction finding
US20080204322A1 (en) 2003-11-03 2008-08-28 Gordon Kenneth Andrew Oswald Determining Positional Information
US20050143916A1 (en) 2003-12-26 2005-06-30 In-Jun Kim Positioning apparatus and method combining RFID, GPS and INS
US7443342B2 (en) 2004-01-13 2008-10-28 The Nippon Signal Co., Ltd. Reception time determining apparatus and distance measuring apparatus using the same
US7409290B2 (en) 2004-04-17 2008-08-05 American Gnc Corporation Positioning and navigation method and system thereof
US20060101497A1 (en) 2004-06-25 2006-05-11 International Business Machines Corporation User-aware video display
US20060061469A1 (en) 2004-09-21 2006-03-23 Skyfence Inc. Positioning system that uses signals from a point source
US7190309B2 (en) 2004-09-24 2007-03-13 Hill Edward L Radio signal transmitter with multiple antennas for improved position detection
US20060066485A1 (en) * 2004-09-24 2006-03-30 Guohua Min Wireless tracking system based upon phase differences
US20080048913A1 (en) 2004-12-02 2008-02-28 Commissariat A L'energie Atomique Local Positioning System and Method
US7236091B2 (en) 2005-02-10 2007-06-26 Pinc Solutions Position-tracking system
US7646330B2 (en) 2005-03-14 2010-01-12 Alfred E. Mann Foundation For Scientific Research System and method for locating objects and communicating with the same
US20070060384A1 (en) 2005-09-14 2007-03-15 Nintendo Co., Ltd. Storage medium storing video game program
US20110256882A1 (en) 2005-12-15 2011-10-20 Invisitrack, Inc. Multi-Path Mitigation in Rangefinding and Tracking Objects Using Reduced Attenuation RF Technology
US7533569B2 (en) 2006-03-15 2009-05-19 Qualcomm, Incorporated Sensor-based orientation system
US7876268B2 (en) 2006-08-22 2011-01-25 James P. Jacobs System and method for precise location of receiving antennas
US20100103989A1 (en) 2006-10-17 2010-04-29 Smith Stephen F Robust Low-Frequency Spread-Spectrum Navigation System
US8269624B2 (en) 2006-12-27 2012-09-18 Industrial Technology Research Institute Positioning apparatus and method
US20110006774A1 (en) 2007-05-24 2011-01-13 Penguin Automated Systems, Inc. Subsurface positioning system and method for monitoring movement underground
US20080316324A1 (en) 2007-06-22 2008-12-25 Broadcom Corporation Position detection and/or movement tracking via image capture and processing
US20090149202A1 (en) 2007-12-07 2009-06-11 Christian Steele System and method for determination of position
US20090243932A1 (en) 2008-03-31 2009-10-01 Radiofy LLC Methods and Systems for Determining the Location of an Electronic Device
US20140253368A1 (en) 2008-09-12 2014-09-11 Propagation Research Associates, Inc. Multi-mode, multi-static interferometer utilizing pseudo orthogonal codes
US20100103173A1 (en) 2008-10-27 2010-04-29 Minkyu Lee Real time object tagging for interactive image display applications
US20100123664A1 (en) * 2008-11-14 2010-05-20 Samsung Electronics Co., Ltd. Method for operating user interface based on motion sensor and a mobile terminal having the user interface
US20110187600A1 (en) 2010-01-29 2011-08-04 Tc License Ltd. System and method for measurement of distance to a tag by a modulated backscatter rfid reader
US20110208481A1 (en) 2010-02-19 2011-08-25 Vladimir Slastion Extended range interferometric methods and systems
US20110210843A1 (en) 2010-03-01 2011-09-01 Andrew Llc System and method for location of mobile devices in confined environments
US20110241942A1 (en) 2010-04-02 2011-10-06 Position Imaging, Inc. Multiplexing receiver system
US20120013509A1 (en) 2010-07-14 2012-01-19 Zebra Enterprise Solutions Corp. Frequency channel diversity for real-time locating systems, methods, and computer program products
US20120127088A1 (en) * 2010-11-19 2012-05-24 Apple Inc. Haptic input device
US20120184285A1 (en) 2011-01-14 2012-07-19 Qualcomm Incorporated Methods and apparatuses for use in providing positioning assistance data to mobile stations via a self-organizing network
US20130036043A1 (en) 2011-07-06 2013-02-07 Patrick Faith Image-based product mapping
US20130021417A1 (en) 2011-07-21 2013-01-24 Seiko Epson Corporation Recording apparatus
US20160256100A1 (en) 2011-07-22 2016-09-08 Tci3-Pressure Applications Llc Systems and methods for monitoring and providing therapeutic support for a user
US8457655B2 (en) 2011-09-19 2013-06-04 Qualcomm Incorporated Hybrid time of arrival based positioning system
US20140300516A1 (en) 2011-11-10 2014-10-09 Position Imaging, Inc. Systems and methods of wireless position tracking
US20130314210A1 (en) 2012-05-22 2013-11-28 Trimble Navigation Limited Multi-modal entity tracking and display
US20150323643A1 (en) 2012-12-15 2015-11-12 Position Imaging, Inc. Cycling reference multiplexing receiver system
US9482741B1 (en) 2013-01-18 2016-11-01 Position Imaging, Inc. System and method of locating a radio frequency (RF) tracking device using a calibration routine
US20160371574A1 (en) 2013-05-06 2016-12-22 The Johns Hopkins University System for preventing instrument retention
US20150009949A1 (en) 2013-07-08 2015-01-08 Alexey Khoryaev Synchronizing peer-to-peer operation for outside network coverage and partial network coverage using lte air interface
US20160286508A1 (en) 2013-07-08 2016-09-29 Intel IP Corporation Synchronizing peer-to-peer operation for outside network coverage and partial network coverage using lte air interface
US20150091757A1 (en) 2013-09-30 2015-04-02 At&T Intellectual Property I, Lp Systems and Methods for High Precision Indoor Location Tracking
US20150169916A1 (en) 2013-12-13 2015-06-18 Position Imaging, Inc. Tracking system with mobile reader

Non-Patent Citations (19)

* Cited by examiner, † Cited by third party
Title
"ADXL202/ADXL210 Product Sheet," Analog Devices, Inc., Analog.com, 1999; 11 pages.
Adrian Schumacher, "Integration of a GPS aided Strapdown Inertial Navigation System for Land Vehicles", Master of Science Thesis, KTH Electrical Engineering, 2006; 67 pages.
Debo Sun, "Ultra-Tight GPS/Reduced IMU for Land Vehicle Navigation", Mar. 2010, UCGE Reports No. 20305; 254 pages.
Farrell & Barth, "The Global Positiong System & Interial Navigation", 1999, McGraw-Hill; pp. 245-252.
Farrell, et al., "Real-Time Differential Carrier Phase GPS=Aided INS", Jul. 2000, IEEE Transactions on Control Systems Technology, vol. 8, No. 4; 13 pages.
Filho, et al., "Integrated GPS/INS Navigation System Based on a Gyrpscope-Free IMU", DINCON Brazilian Conference on Synamics, Control, and Their Applications, May 22-26, 2006; 6 pages.
Goodall, Christopher L., "Improving Usability of Low-Cost INS/GPS Navigation Systems using Intelligent Techniques", Jan. 2009, UCGE Reports No. 20276; 234 pages.
Grewal & Andrews, "Global Positioning Systems, Inertial Nagivation, and Integration", 2001, John Weiley and Sons, pp. 252-256.
International Search Report & Written Opinion in international patent application PCT/US12/64860, mailed on Feb. 28, 2013; 8 pages.
Jennifer Denise Gautier, "GPS/INS Generalized Evaluation Tool (GIGET) for the Design and Testing of Integrated Navigation Systems", Dissertation, Stanford University, Jun. 2003; 160 pages.
Jianchen Gao, "Development of a Precise GPS/INS/On-Board Vehicle Sensors Integrated Vehicular Positioning System", Jun. 2007, UCGE Reports No. 20555; 245 pages.
Pourhomayoun, Mohammad and Mark Fowler, "Improving WLAN-based Indoor Mobile Positioning Using Sparsity," Conference Record of the Forty-Sixth Asilomar Conference on Signals, Systems and Computers, Nov. 4-7, 2012, pp. 1393-1396, Pacific Grove, California.
Proakis, John G. and Masoud Salehi, "Communication Systems Engineering", Second Edition, Prentice-Hall, Inc., Upper Saddle River, New Jersey, 2002; 815 pages.
Santiago Alban, "Design and Performance of a Robust GPS/INS Attitude System for Automobile Applications", Dissertation, Stanford University, Jun. 2004; 218 pages.
Schmidt & Phillips, "INS/GPS Integration Architectures", NATO RTO Lecture Seriers, First Presented Oct. 20-21, 2003; 24 pages.
Sun, et al., "Analysis of the Kalman Filter With Different INS Error Models for GPS/INS Integration in Aerial Remote Sensing Applications", Bejing, 2008, The International Archives of the Photogrammerty, Remote Sensing and Spatial Information Sciences vol. XXXVII, Part B5.; 8 pages.
U.S. Appl. No. 13/293,639, filed Nov. 10, 2011, Edward Hill (26 pages).
Vikas Kumar N., "Integration of Inertial Navigation System and Global Positioning System Using Kalman Filtering", M.Tech Dissertation, Indian Institute of Technology, Bombay, Mumbai, Jul. 2004; 69 pages.
Yong Yang, "Tightly Coupled MEMS INS/GPS Integration with INS Aided Receiver Tracking Loops", Jun. 2008, UCGE Reports No. 20270; 205 pages.

Cited By (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11175375B2 (en) 2010-11-12 2021-11-16 Position Imaging, Inc. Position tracking system and method using radio signals and inertial sensing
US10416276B2 (en) 2010-11-12 2019-09-17 Position Imaging, Inc. Position tracking system and method using radio signals and inertial sensing
US10605904B2 (en) 2011-11-10 2020-03-31 Position Imaging, Inc. Systems and methods of wireless position tracking
US10269182B2 (en) * 2012-06-14 2019-04-23 Position Imaging, Inc. RF tracking with active sensory feedback
US20170372524A1 (en) * 2012-06-14 2017-12-28 Position Imaging, Inc. Rf tracking with active sensory feedback
US10180490B1 (en) 2012-08-24 2019-01-15 Position Imaging, Inc. Radio frequency communication system
US10534067B2 (en) 2012-08-24 2020-01-14 Position Imaging, Inc. Radio frequency communication system
US10338192B2 (en) 2012-08-24 2019-07-02 Position Imaging, Inc. Radio frequency communication system
US10234539B2 (en) 2012-12-15 2019-03-19 Position Imaging, Inc. Cycling reference multiplexing receiver system
US10856108B2 (en) 2013-01-18 2020-12-01 Position Imaging, Inc. System and method of locating a radio frequency (RF) tracking device using a calibration routine
US10237698B2 (en) 2013-01-18 2019-03-19 Position Imaging, Inc. System and method of locating a radio frequency (RF) tracking device using a calibration routine
US10634762B2 (en) 2013-12-13 2020-04-28 Position Imaging, Inc. Tracking system with mobile reader
US11226395B2 (en) 2013-12-13 2022-01-18 Position Imaging, Inc. Tracking system with mobile reader
US10634761B2 (en) 2013-12-13 2020-04-28 Position Imaging, Inc. Tracking system with mobile reader
US10257654B2 (en) 2014-01-17 2019-04-09 Position Imaging, Inc. Wireless relay station for radio frequency-based tracking system
US10623898B2 (en) 2014-01-17 2020-04-14 Position Imaging, Inc. Wireless relay station for radio frequency-based tracking system
US10200819B2 (en) 2014-02-06 2019-02-05 Position Imaging, Inc. Virtual reality and augmented reality functionality for mobile devices
US10631131B2 (en) 2014-02-06 2020-04-21 Position Imaging, Inc. Virtual reality and augmented reality functionality for mobile devices
US10324474B2 (en) 2015-02-13 2019-06-18 Position Imaging, Inc. Spatial diversity for relative position tracking
US10642560B2 (en) 2015-02-13 2020-05-05 Position Imaging, Inc. Accurate geographic tracking of mobile devices
US11132004B2 (en) 2015-02-13 2021-09-28 Position Imaging, Inc. Spatial diversity for relative position tracking
US11057590B2 (en) 2015-04-06 2021-07-06 Position Imaging, Inc. Modular shelving systems for package tracking
US10148918B1 (en) 2015-04-06 2018-12-04 Position Imaging, Inc. Modular shelving systems for package tracking
US11501244B1 (en) 2015-04-06 2022-11-15 Position Imaging, Inc. Package tracking systems and methods
US11416805B1 (en) 2015-04-06 2022-08-16 Position Imaging, Inc. Light-based guidance for package tracking systems
US10853757B1 (en) 2015-04-06 2020-12-01 Position Imaging, Inc. Video for real-time confirmation in package tracking systems
US10444323B2 (en) 2016-03-08 2019-10-15 Position Imaging, Inc. Expandable, decentralized position tracking systems and methods
US10293786B1 (en) 2016-05-10 2019-05-21 Volkswagen Ag Method and system for secure access to a vehicle
US20170327083A1 (en) * 2016-05-10 2017-11-16 Volkswagen Ag Method and system for secure access to a vehicle
US10183650B2 (en) * 2016-05-10 2019-01-22 Volkswagen Ag Method and system for secure access to a vehicle
US11436553B2 (en) 2016-09-08 2022-09-06 Position Imaging, Inc. System and method of object tracking using weight confirmation
US10634506B2 (en) 2016-12-12 2020-04-28 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10634503B2 (en) 2016-12-12 2020-04-28 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10455364B2 (en) 2016-12-12 2019-10-22 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US11022443B2 (en) 2016-12-12 2021-06-01 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US11506501B2 (en) 2016-12-12 2022-11-22 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US11774249B2 (en) 2016-12-12 2023-10-03 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US11120392B2 (en) 2017-01-06 2021-09-14 Position Imaging, Inc. System and method of calibrating a directional light source relative to a camera's field of view
US11361536B2 (en) 2018-09-21 2022-06-14 Position Imaging, Inc. Machine-learning-assisted self-improving object-identification system and method
US11089232B2 (en) 2019-01-11 2021-08-10 Position Imaging, Inc. Computer-vision-based object tracking and guidance module
US11637962B2 (en) 2019-01-11 2023-04-25 Position Imaging, Inc. Computer-vision-based object tracking and guidance module
WO2023043874A1 (en) * 2021-09-17 2023-03-23 Stryker Corporation Patient support apparatuses with patient monitoring

Similar Documents

Publication Publication Date Title
US10269182B2 (en) RF tracking with active sensory feedback
US9782669B1 (en) RF tracking with active sensory feedback
CN107533233B (en) System and method for augmented reality
US10974138B2 (en) Haptic surround functionality
EP2661663B1 (en) Method and apparatus for tracking orientation of a user
CN109791435B (en) Calibration of magnetic and optical sensors in virtual reality or augmented reality display systems
US20180250590A1 (en) System and method for providing haptic stimulus based on position
US9375640B2 (en) Information processing system, computer-readable storage medium, and information processing method
EP2497545A2 (en) Information processing program, information processing system, and information processing method
KR20170026567A (en) Three dimensional contextual feedback
KR20160017120A (en) A proximity sensor mesh for motion capture
US20170087455A1 (en) Filtering controller input mode
JP2008307392A (en) Self contained inertial navigation system for interactive control using movable controller
CN109240487B (en) Immersive system, control method, and non-transitory computer readable medium
JP2022518779A (en) Methods and systems for resolving hemispherical ambiguity in 6-DOF attitude measurements
KR20170004589A (en) A insole, a mobile terminal and method for controlling the same
KR101840745B1 (en) A golf simulation method of host and electronic putting device communicating host
Loviscach Playing with all senses: Human–Computer interface devices for games

Legal Events

Date Code Title Description
AS Assignment

Owner name: POSITION IMAGING, INC., NEW HAMPSHIRE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HILL, EDWARD L.;REEL/FRAME:032894/0184

Effective date: 20140513

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Year of fee payment: 4

AS Assignment

Owner name: ANKURA TRUST COMPANY, LLC, CONNECTICUT

Free format text: SECURITY INTEREST;ASSIGNOR:POSITION IMAGING IP LLC;REEL/FRAME:064688/0851

Effective date: 20230822