WO2006096776A2 - Teleportation systems and methods in a virtual environment - Google Patents

Teleportation systems and methods in a virtual environment

Info

Publication number
WO2006096776A2
WO2006096776A2 (PCT/US2006/008264)
Authority
WO
WIPO (PCT)
Prior art keywords
user
virtual environment
teleportation
directional input
create
Application number
PCT/US2006/008264
Other languages
French (fr)
Other versions
WO2006096776A3 (en)
Inventor
Leonidas Deligiannidis
Original Assignee
The University Of Georgia Research Foundation, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by The University Of Georgia Research Foundation, Inc.
Priority to US11/816,968 (published as US20080153591A1)
Publication of WO2006096776A2
Publication of WO2006096776A3

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment


Abstract

Provided are systems and methods for teleportation in a virtual environment. One embodiment of such a system can be implemented with: a head mounted display configured to provide an immersive virtual environment; a teleportation device configured to provide navigation in the virtual environment; and at least one feedback device configured to provide a user with information corresponding to movement of the teleportation device within the virtual environment. The system also includes a plurality of input devices configured to generate a plurality of input signals in response to inputs from the user, and a computing device configured to receive the plurality of input signals and control the at least one feedback device.

Description

TELEPORTATION SYSTEMS AND METHODS IN A VIRTUAL ENVIRONMENT
CROSS-REFERENCE TO RELATED APPLICATION
This application claims priority to copending U.S. provisional application
entitled, "TELEPORTATION SYSTEMS AND METHODS," having ser. no. 60/659,283, filed March 7, 2005, which is entirely incorporated herein by reference.
TECHNICAL FIELD
The present disclosure is generally related to virtual technology and, more particularly, is related to systems and methods for providing user interaction in a virtual environment.
BACKGROUND
Large scale Immersive Virtual Environments (IVEs) are common in current research. Some of the major problems in large scale IVEs, however, are traveling and
navigation. These problems have been addressed by input devices such as handheld
and fixed station user input devices as well as environment specific devices such as, for example, a virtual reality snowboard. The utilization of these input devices, however, is awkward or unnatural and may require extensive training, especially if the
device offers many degrees of freedom. For example, some of the previous input
devices have required the user to memorize and perform specific coded gestures or
sequences of gestures to make virtual environmental changes such as a direction or
mode change. In such a device having many degrees of freedom, the user is tasked with memorizing and performing many potentially unnatural tasks and gestures to
travel and navigate within a large scale IVE.
SUMMARY
Embodiments of the present disclosure provide a system and method for teleportation in a virtual environment. Briefly described, one embodiment of the
system, among others, can be implemented as follows: a head mounted display
configured to provide an immersive virtual environment; a teleportation device configured to provide navigation in the virtual environment; at least one feedback
device configured to provide a user with information corresponding to movement of
the teleportation device within the virtual environment; a plurality of input devices configured to generate a plurality of input signals in response to inputs from the user; and a computing device configured to receive the plurality of input signals and control the at least one feedback device. Embodiments of the present disclosure can also be viewed as methods for
providing teleportation in a virtual environment. In this regard, one embodiment of
such a method, among others, can be broadly summarized by the following steps:
delivering a video signal, corresponding to a virtual environment, to a user; delivering an audio signal, corresponding to the virtual environment, to the user; receiving a plurality of inputs corresponding to a three-dimensional position for each of a plurality
of user physiological features; providing a vibratory feedback, corresponding to the virtual environment, to the user; and directing air towards the user to create a motion sensation.
Other systems, methods, features, and advantages of the present disclosure will
be or become apparent to one with skill in the art upon examination of the following drawings and detailed description. It is intended that all such additional systems,
methods, features, and advantages be included within this description, be within the scope of the present disclosure, and be protected by the accompanying claims.
BRIEF DESCRIPTION OF THE DRAWINGS
Many aspects of the disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate
corresponding parts throughout the several views.
FIG. 1 is a schematic diagram of an embodiment of a system for teleportation in a virtual environment.
FIG. 2 is a schematic diagram of an alternative embodiment of a system for teleportation in a virtual environment.
FIG. 3 is a schematic diagram illustrating a top view of an embodiment of a system for teleportation in a virtual environment.
FIG. 4 is a schematic diagram illustrating a top view of an embodiment of a teleportation device showing exemplary inputs to a directional input component.
FIG. 5 is a schematic diagram illustrating a partial front view of a system for teleportation in a virtual environment.
FIG. 6 is a schematic diagram illustrating a side view of an embodiment of a teleportation device showing exemplary inputs to a directional input component.
FIG. 7 is a schematic diagram illustrating a side view of an alternative embodiment of a teleportation device.
FIG. 8 is a functional block diagram illustrating an embodiment of a control arrangement for a teleportation system as disclosed herein.
FIG. 9 is a block diagram illustrating an embodiment of an architecture for controlling a teleportation system.
FIG. 10 is a block diagram illustrating an embodiment of a method for providing teleportation in a virtual environment.
DETAILED DESCRIPTION
Having summarized various aspects of the present disclosure, reference will now be made in detail to the description of the disclosure as illustrated in the drawings. While the disclosure will be described in connection with these drawings,
there is no intent to limit it to the embodiment or embodiments disclosed herein. On the contrary, the intent is to cover all alternatives, modifications and equivalents
included within the spirit and scope of the disclosure as defined by the appended
claims.
Reference is first made to FIG. 1, which is a schematic diagram of an embodiment of a system 100 for teleportation in an immersive virtual environment. An immersive virtual environment includes multiple sources of feedback for a user to create the sensation that the user is fully immersed in the virtual environment. The
system 100 includes a teleportation device 104 that provides for general purpose
navigation in virtual environments. The navigation activities can include, for
example, traveling from one place to another for exploring and searching within the
virtual environment. A user 108 can rotate himself/herself and the teleportation
device 104, physically move forward and backward (and up and down), and change the speed of travel. The system 100 also includes a computing device 102, which can include a
processor, memory, and one or more input/output devices, all communicatively coupled via one or more data buses. The computing device 102 is configured to provide data to a head mounted display 114. The head mounted display 114 is configured to communicate video and audio signals to a user 108 using one or more
displays and audio output components. The computing device 102 is also configured
to receive user position data from user position sensors 112 proximate to different
user physiological features. Examples of user physiological features that might
provide useful position data include, but are not limited to, the head, hands, arms, feet, and legs. The embodiment of FIG. 1 includes user position sensors 112 at the users head and hands. In addition to providing three-dimensional position data, the user position sensors 112 can also be used to provide orientation data to the computing device 102.
The computing device 102 is also configured to receive position and
orientation data from one or more teleportation device position sensors 116 that are
mounted to the teleportation device 104. In this manner, the computing device can render the virtual environment based on the position and orientation of the teleportation device 104.
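To make the tracking data concrete, here is a minimal sketch of the kind of report a position sensor 112 or teleportation device position sensor 116 might deliver to the computing device 102; the field layout and units are assumptions for illustration, not taken from the patent.

```python
# Hypothetical sensor record (layout assumed): a three-dimensional position
# plus an orientation, one report per tracked feature or device.
from dataclasses import dataclass

@dataclass
class SensorReport:
    sensor_id: int                            # e.g., head, left hand, right hand, device
    position: tuple[float, float, float]      # x, y, z in tracker coordinates
    orientation: tuple[float, float, float]   # yaw, pitch, roll in degrees (assumed)
```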
The teleportation device 104 includes a base 118 configured to optionally support all or a portion of the user 108. The base 118 is attached to a directional input
component 110 through a moveable coupling 120. The moveable coupling 120 of this embodiment includes one or more springs configured in modes of compression,
tension, or some combination thereof.
The teleportation device 104 also includes a vibratory feedback device 106
configured to be controlled by the computing device 102. The vibratory feedback device 106 is used to deliver sound and/or vibration to the user 108 to simulate
varying rates of movement within the virtual environment. In this manner the sound
and/or vibration of the teleportation device 104 in motion is simulated. For example,
the vibratory feedback device 106 may be configured to operate at a low frequency and output level when the teleportation device 104 is moving through the virtual environment at a slow speed. Accordingly, the output level and frequency might be increased as the speed of the teleportation device 104 is increased. In some
embodiments, the vibratory feedback device 106 can be configured as a subwoofer speaker, for example. Alternatively, or in addition, the vibratory feedback device 106 can be implemented as vibrotactile devices mounted at a variety of points on the teleportation device 104.
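The speed-dependent behavior described above can be illustrated with a short sketch; the frequency band and top speed below are assumptions chosen for illustration, not values from the patent.

```python
# Minimal sketch (assumed ranges): scale the vibratory feedback device 106
# with the teleportation device's travel speed -- low frequency and low output
# at slow speeds, both rising as speed increases.
MIN_HZ, MAX_HZ = 20.0, 60.0   # assumed subwoofer rumble band
MAX_SPEED = 30.0              # assumed top speed in the virtual environment

def vibration_for_speed(speed: float) -> tuple[float, float]:
    """Return (frequency_hz, output_level) for a given travel speed."""
    t = max(0.0, min(speed / MAX_SPEED, 1.0))    # normalize and clamp to [0, 1]
    frequency = MIN_HZ + t * (MAX_HZ - MIN_HZ)   # slow travel -> low frequency
    return frequency, t                          # output level rises with speed
```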
Reference is now made to FIG. 2, which is a schematic diagram of an alternative embodiment of a system 122 for teleportation in a virtual environment. In addition to the components of the system 100 described above in reference to FIG. 1, the system 122 also includes a position interface 124, configured to communicate with the position sensors 112, 116. Communication between the position interface
124 and the position sensors 112, 116 can be accomplished using any one of a variety
of wired or wireless communication technologies. The position interface 124, also
referred to as a 3-D tracker, reports the position and orientation of each of the position sensors 112, 116 to the computing device 102.
The system 122 also includes one or more fans 128 for generating a wind simulation. The fan or fans 128 can be controlled by the computing device 102
through an output device controller 130. The output device controller 130 can
include, for example, relays and/or electronic speed controllers to vary the speed and direction of the simulated wind. The system 122 can also optionally include a status interface system 125
configured to maintain the status of one or more of the peripheral devices external to
the computing device 102. The status interface system 125 can be implemented to replace or supplement either or both of the position interface 124 and the output
device controller 130. Additionally, the status interface system 125 includes the
functionality to detect the operation of user input devices such as buttons or switches. The status interface system 125 may be implemented in separate units, or as a single unit (e.g., with two cards in it, one corresponding to the switching action function of a relay controller and the other having functionality to detect button presses and releases). The status interface system 125, when implemented as a single unit, may
have additional cards corresponding to analog-to-digital conversion (ADC) and
digital-to-analog conversion (DAC) to control, for example, fan speed.
Reference is now made to FIG. 3, which is a schematic diagram illustrating a
top view of an embodiment of a system for teleportation in a virtual environment. The system 138 includes a teleportation device 104 having a base 118 and a directional input component 110, also referred to as a steering wheel or handle bar. The teleportation device 104 includes a vibratory feedback device 106 and one or
more user interface devices configured to allow the user to cause or trigger an operation within the virtual environment. The user interface devices can include
switches and buttons, among others. Alternative embodiments may include user
interface devices using one or more touch screens.
The user interface devices can include an UP button 140 and a DOWN button 142 for causing the teleportation device 104 to move up or down within the virtual environment. Alternatively, the UP and DOWN functions could be combined into
one multiple position switch, for example a three position center return switch. User interface devices can also be implemented as a STOP 144 button configured to cause
the teleportation device 104 to stop within the virtual environment. A FLY/DRIVE switch 150 is also included. The FLY/DRIVE switch 150 can be toggled between a
fly mode and a drive mode. Also included are an INC button 154 and a DEC button 156 configured to cause the teleportation device to increase speed or decrease speed, respectively. Like
the UP and DOWN functions, the INC and DEC functions can alternatively be combined into a multiple position switch such as a toggle switch. Other alternative
embodiments can include throttle and/or handbrake structures that can generate, for example, analog signals to increase or decrease the speed, respectively. The analog
signals from a throttle and/or a handbrake may be processed using, for example,
analog-to-digital conversion hardware and/or software. A throttle and/or handbrake can also be configured to generate digital signals. For example, devices providing a
quadrature pulse output in conjunction with a counter can be used for increasing and
decreasing the speed.
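As a rough illustration of the quadrature approach, the sketch below uses the conventional two-channel transition table to accumulate a count from A/B pulses; nothing here beyond the idea of counting quadrature pulses comes from the patent.

```python
# Minimal sketch: standard quadrature decoding. Each valid A/B transition
# moves a counter up or down; the running count can stand in for the
# commanded speed increase or decrease.

# (previous_state, new_state) -> step; states pack channels as (A << 1) | B.
_STEP = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

class QuadratureCounter:
    def __init__(self) -> None:
        self.state = 0b00
        self.count = 0                       # proxy for commanded speed

    def sample(self, a: int, b: int) -> int:
        """Feed one sample of channels A and B; return the running count."""
        new_state = (a << 1) | b
        self.count += _STEP.get((self.state, new_state), 0)  # ignore invalid jumps
        self.state = new_state
        return self.count
```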
Other user interface devices can be included such as a LIGHTS button 148 for adjusting the lighting levels in the virtual environment. Some embodiments may feature a simple on and off control for the lighting. Other embodiments may include
incremental changes in the lighting levels through actuation of the LIGHTS button 148. The teleportation device 104 can also include a DEBUG button 146 configured
to allow the user to debug one or more applications running on the computing device
102. For example, a user may experience a situation where he or she cannot move in
the virtual environment due to a collision with multiple objects, such as might occur
during a glitch in an application's implementation. A user can activate the DEBUG
button 146 and disable collision detection temporarily to enable testing of other parts of the application. The teleportation device 104 can also include a JUMP button 152
to permit the vehicle to jump over obstacles in the virtual environment when in drive
mode.
The system 138 also includes an example arrangement of fans 128. The fans 128 used independently or in selective combination can be used to simulate wind that
corresponds to the motion within the virtual environment. For example, where the
teleportation device 104 is traveling to one side or another, the corresponding fan 128 would be activated to simulate wind commensurate with that motion. Also, when the
teleportation device 104 is turned or rotated, a three dimensional position sensor 116
can detect which direction the teleportation device 104 is facing and operate one or more fans 128 corresponding to movement in the new direction.
Brief reference is now made to FIG. 4, which is a schematic diagram
illustrating a top view of an embodiment of a teleportation device showing exemplary inputs to a directional input component. The teleportation device 104 includes a base 118 moveably coupled to a directional input component 110. By way of example,
when a user rotates the directional input component 110 clockwise, the teleportation
device 104 will turn to the right in the virtual environment. Similarly, when a user
rotates the directional input component 110 counter-clockwise, the teleportation device 104 will turn to the left in the virtual environment. To cause an upward movement of the teleportation device 104 in the virtual environment, the directional input component 110 is pulled or tilted towards the user. Similarly, to cause a
downward movement of the teleportation device 104 in the virtual environment, the
directional input component 110 is pushed or tilted away from the user. Alternative
embodiments may use a directional input component 110 mounted to a telescopic shaft where the up and down motions are accomplished by manipulating the directional input component in a substantially vertical up and down motion.
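A minimal sketch of this mapping follows; the sign conventions and function name are assumptions used only for illustration.

```python
# Hypothetical mapping from directional input component 110 deflections to
# movement commands, following the FIG. 4 description: clockwise rotation
# turns right, tilting toward the user moves up, tilting away moves down.

def steering_to_command(rotation: float, tilt: float) -> dict:
    """rotation > 0 is clockwise; tilt > 0 is tilted toward the user."""
    command = {"turn": "none", "vertical": "none"}
    if rotation > 0:
        command["turn"] = "right"        # clockwise -> turn right
    elif rotation < 0:
        command["turn"] = "left"         # counter-clockwise -> turn left
    if tilt > 0:
        command["vertical"] = "up"       # pulled/tilted toward user -> move up
    elif tilt < 0:
        command["vertical"] = "down"     # pushed/tilted away -> move down
    return command
```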
Brief reference is now made to FIG. 5, which is a schematic diagram
illustrating a partial front view of a system for teleportation in a virtual environment.
An arrangement of multiple fans of an embodiment includes an over the head fan 210 for simulating, for example, upward movement in the virtual environment. Similarly, the arrangement includes a right side fan 212 and a left side fan 214 for simulating
right and left motion, respectively. A left ground fan 218 and a right ground fan 216 can be used to simulate left and right downward movement, respectively. Similarly, a
front of face fan 220 can be used to simulate forward motion. Each of the fans can be
driven at varying speeds to create the sensation of changing speeds within the virtual environment. Additionally, the fans can be used alone or in combination to create
varying degrees of speed and directional simulation.
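One plausible way to drive such an arrangement is sketched below, keyed to the fan reference numerals of FIG. 5; the direction labels, duty-cycle model, and top speed are assumptions for illustration.

```python
# Minimal sketch: map a coarse motion direction in the virtual environment to
# per-fan duty cycles in [0, 1], scaled by travel speed.

FANS = {
    "over_head": 210, "right_side": 212, "left_side": 214,
    "right_ground": 216, "left_ground": 218, "front_of_face": 220,
}

_SELECTION = {
    "forward": ["front_of_face"],
    "up": ["over_head"],
    "right": ["right_side"],
    "left": ["left_side"],
    "down_right": ["right_ground"],
    "down_left": ["left_ground"],
}

def fans_for_motion(direction: str, speed: float, max_speed: float = 30.0) -> dict:
    """Return a duty cycle for each fan given the motion direction and speed."""
    duty = max(0.0, min(speed / max_speed, 1.0))
    active = _SELECTION.get(direction, [])
    return {name: (duty if name in active else 0.0) for name in FANS}
```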
Brief reference is made to FIG. 6, which is a schematic diagram illustrating a side view of an embodiment of a teleportation device showing exemplary inputs to a directional input component. The teleportation device 104 includes a base 118
coupled to a directional input component 110 via a moveable coupling 120. To direct
the teleportation device 104 to move down, the user 108 pushes or tilts the directional input component 110 away from himself/herself. Similarly, to direct the teleportation
device 104 to move up, the user 108 pulls or tilts the directional input device 110
towards himself/herself. Alternative embodiments can feature a telescopic arrangement such that the directional input device is moved substantially vertically up
and down to direct the upward or downward movement of the teleportation device 104 within the virtual environment. Brief reference is made to FIG. 7, which is a schematic diagram illustrating a side view of an alternative embodiment of a teleportation device. The teleportation
device 104 includes a base 118 attached to a directional input component 110 through
a moveable coupling 120. In this embodiment, the moveable coupling 120 is a spring. Additional springs 230 are included to provide force feedback through additional
resistance. Multiple springs or other biasing elements can be used independently or in
combination to achieve a desired level of force feedback in all or selected axes.
Reference is now made to FIG. 8, which is a functional block diagram illustrating an embodiment of a control arrangement for a teleportation system as
disclosed herein. The computer 160 (herein, computer or host computer) communicates with a 3-D tracker 162, a fan/relay controller 176, an eye tracking
controller 182, and the status interface 166. Note that in some embodiments, fewer or
more components and/or functionality can be implemented. The 3-D tracker 162 provides the position and orientation of the user's head, hands, the teleportation device, etc. to perform the following functionality:
1. Enable a glove interface 186 (used to manipulate 3-D objects in a virtual environment) and a gesture recognizer 188 to recognize gestures and
manipulate 3-D objects in a virtual environment.
2. Enable a collision detector 194 to detect collisions between the user, the
teleportation device, and a 3-D virtual environment.
As shown in FIG. 8, the teleportation system comprises a physics component 196 to simulate gravity, so that the user stays on the ground and not in the middle of the air
when operating in the "drive" (as opposed to "fly") mode. For example, when the
user jumps over an obstacle in a virtual environment, he/she lands on the ground in the virtual environment. The output of the physics component 196 is fed to a vibrator controller 198
that simulates vibrations, and also provides input to a graphics generator 190 that drives the 3-D output graphics on a head mounted display 202 that is fully immersive. The graphics generator 190 may retrieve environment data from an environment storage 192. In one embodiment, the head mounted display 202 is used and comprises
a display and headphones or speakers. The headphones can be used to hear things or
events in the virtual environment, such as a bouncing ball, etc. For example, the
teleportation system can simulate circumstances such as when a user collides with another object by activating one or more vibration units 200 to provide tactile
simulation. That is, if the user takes a turn at 100 miles/hour or 5 miles/hour in the virtual environment, he/she feels the difference in wind blowing at him/her, the
vibration from the subwoofer, and perhaps the vibration from the vibrotactile devices. In one embodiment, the host computer 160, the status interface 166, or both, control the wind generator units 180 (on/off) and their speed (how much air they
blow). The wind generator units 180 can be driven through a fan speed controller 178
and a fan/relay controller 176. In one embodiment, the host computer 160 also drives a sound generator 172 that simulates the noise generated by the teleportation device and can also serve as a secondary vibration mechanism. The sound generator 172 can be used to drive sound output units 174 using data in a sound data storing unit 170. The status interface 166 can use a switch polling facility 168 to detect button presses
(e.g., user interface devices coupled to activation devices or switches) from user input
devices that are attached to the teleportation device and send that information to the
host computer 160. In some embodiments, the teleportation system may also
comprise a speech recognizer 164 that recognizes commands that a user verbally issues. In one embodiment, the eye tracking controller 182 can communicate with a
separate computer coupled to the host computer through, for example, an output
interface 184. A head mounted display 202 comprises a camera that tracks the user's eye. The eye tracking controller 182 determines the coordinates of the eye and further determines what the user is observing in the virtual environment. Such a feature may be useful in games. For example, as a missile from the enemy is coming at the user,
the user can look at the missile and press a button located on the teleportation device
and activate a missile interceptor to destroy the incoming missile.
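Tying the FIG. 8 components together, the sketch below shows one iteration of a plausible host-computer update loop; every object and method name is a placeholder, not an interface defined by the patent.

```python
# Hypothetical orchestration of the FIG. 8 data flow: 3-D tracker input feeds
# physics and collision detection, whose results drive the graphics generator,
# vibration units, wind generators, and sound generator.

def frame(tracker, physics, collider, graphics, vibrators, fans, sound):
    poses = tracker.read()               # head, hands, and teleportation device poses
    state = physics.step(poses)          # e.g., gravity keeps the user grounded in drive mode
    hits = collider.check(state)         # user/device collisions with the 3-D environment
    graphics.render(state)               # drive the immersive head mounted display 202
    vibrators.update(state.speed, hits)  # tactile feedback for speed and collisions
    fans.update(state.velocity)          # wind commensurate with direction and speed
    sound.update(state.speed)            # simulated noise of the teleportation device
```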
FIG. 9 is a block diagram illustrating an embodiment of an architecture for
controlling a teleportation system. The control computer generally includes a
processor 240, memory 242, and one or more input and/or output (I/O) devices 250 (or peripherals) that are communicatively coupled via a local interface 244. The local interface 244 may be, for example, one or more buses or other wired or wireless connections. The local interface 244 may have additional elements such as
controllers, buffers (caches), drivers, repeaters, and receivers, to enable communication. Further, the local interface 244 may include address, control, and/or
data connections that enable appropriate communication among the aforementioned components.
The processor 240 is a hardware device for executing software, particularly that which is stored in memory. The processor 240 may be any custom made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with the processing device, a
semiconductor-based microprocessor (in the form of a microchip or chip set), a
macroprocessor, or generally any device for executing software instructions. The memory 242 may include any one or combination of volatile memory
elements (e.g., random access memory (RAM)) and nonvolatile memory elements
(e.g., ROM, hard drive, etc.). Moreover, the memory 242 may incorporate electronic, magnetic, optical, and/or other types of storage media. Note that the memory 242 may
have a distributed architecture in which various components are situated
remotely from one another but may be accessed by the processor 240.
The software in memory 242 may include one or more separate programs, each of which comprises an ordered listing of executable instructions for implementing
logical functions, such as the logical functions shown in FIG. 8. In the example of
FIG. 9, the software in the memory 242 includes control software 246 for providing one or more of the functionality shown in FIG. 8 according to an embodiment. The
memory 242 may also comprise a suitable operating system (O/S) 248. The operating
system 248 essentially controls the execution of other computer programs, such as the control software, and provides scheduling, input-output control, file and data management, memory management, and communication control and related services.
The control software 246 is a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed. The control software 246 can be implemented, in one embodiment, as a distributed
network of modules, where one or more of the modules can be accessed by one or more applications or programs or components thereof. In some embodiments, the
control software 246 can be implemented as a single module with all of the
functionality of the aforementioned modules. When the control software 246 is a
source program, then the program is translated via a compiler, assembler, interpreter,
or the like, which may or may not be included within the memory, so as to operate
properly in connection with the operating system 248. Furthermore, the control software 246 can be written with (a) an object oriented programming language, which has classes of data and methods, or (b) a procedure programming language, which has
routines, subroutines, and/or functions, for example but not limited to, C, C++, Pascal, Basic, Fortran, Cobol, Perl, Java, and Ada. The I/O devices 250 may include input devices such as, for example, a
keyboard, mouse, scanner, microphone, sensor(s), etc. Furthermore, the I/O devices
250 may also include output devices such as, for example, a printer, display, audio
devices, vibration devices, etc. Finally, the I/O devices 250 may further include devices that communicate both inputs and outputs such as, for instance, a modulator/demodulator (modem for accessing another device, system, or network), a radio frequency (RF) or other transceiver, a telephonic interface, a bridge, a router,
etc.
When the control computer is in operation, the processor 240 is configured to
execute software stored within the memory 242, to communicate data to and from the
memory 242, and to generally control operations of the control computer pursuant to the software. The control software 246 and the operating system 248, in whole or in part, but typically the latter, are read by the processor 240, perhaps buffered within the processor 240, and then executed.
It should be noted that the control software 246 can be stored on any computer-readable medium for use by or in connection with any computer-related
system or method. In the context of this document, a computer-readable medium is an
electronic, magnetic, optical, or other physical device or means that can contain or
store a computer program for use by or in connection with a computer related system or method. The control software 246 can be embodied in any computer-readable
medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other
system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
In an alternative embodiment, where the functionality of the control software
246 is implemented in hardware, or as a combination of software and hardware, the functionality of the control software 246 can be implemented with any or a
combination of the following technologies, which are each well known in the art: a
discrete logic circuit(s) having logic gates for implementing logic functions upon data
signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array(s) (PGA), a field programmable
gate array (FPGA), etc.; or can be implemented with other technologies now known or later developed.
Reference is now made to FIG. 10, which is a block diagram illustrating an embodiment of a method 300 for providing teleportation in a virtual environment. The method 300 includes the step of delivering a video signal to the user in block 310.
The video signal may be delivered using, for example, one or more displays
configured in a head mounted device. The video signal provides the user with the visual information corresponding to the virtual environment. The method 300 also includes the step of delivering an audio signal to a user in block 320. The audio signal can be delivered through, for example, headphones or speakers. The audio signal can
be used to communicate sounds within the virtual environment that correspond to objects or events.
The method 300 also includes the step of receiving position inputs relating to
user physiological features and the teleportation device in block 330. For example,
the three-dimensional position and orientation of the hands and head of the user can serve to ensure that the user's position and video signal correspond to the virtual
environment. Similarly, by receiving the three-dimensional position and orientation data for the teleportation device, the computer controlling the virtual environment can
correctly render the teleportation device in the virtual environment. A user is provided vibratory feedback in block 340. By providing the vibratory feedback, a user can experience the sounds and vibrations corresponding to
different rates of speed and events such as collisions in the virtual environment. Additionally, to further enhance the sensation of motion, air is directed towards the
user in block 350. The air is directed at varying rates and from different directions to
create the sensation of moving at different speeds and in different directions. Air can
be directed using multiple wind generation devices including, for example, fans or blowers. Each wind generation device can be driven independently or in combination at one or more preset speeds or at any speed over a range of speeds. Controlling the wind generation units can be accomplished using relays, electronic speed controllers, electronic motor drives, or any combination thereof.
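The closing point about relays versus speed controllers can be shown with a small sketch; the class interfaces are assumptions, not hardware APIs.

```python
# Minimal sketch: a relay-driven wind unit collapses any speed request to
# on/off, while a unit behind an electronic speed controller honors a
# continuous duty cycle.

class RelayFan:
    """Wind generation unit switched by a relay: on or off only."""
    def __init__(self) -> None:
        self.on = False

    def set_speed(self, duty: float) -> None:
        self.on = duty > 0.0             # any nonzero request switches the relay on

class VariableFan:
    """Wind generation unit behind an electronic speed controller."""
    def __init__(self) -> None:
        self.duty = 0.0

    def set_speed(self, duty: float) -> None:
        self.duty = max(0.0, min(duty, 1.0))   # clamp to the 0-1 duty range
```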
Any process descriptions or blocks in flow charts should be understood as representing modules, segments, or portions of code which include one or more
executable instructions for implementing specific logical functions or steps in the
process, and alternate implementations are included within the scope of an
embodiment of the present disclosure in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order,
depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present disclosure.
It should be emphasized that the above-described embodiments of the present
disclosure, particularly, any illustrated embodiments, are merely possible examples of implementations, merely set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described
embodiment(s) of the disclosure without departing substantially from the spirit and
principles of the disclosure. All such modifications and variations are intended to be
included herein within the scope of this disclosure and the present disclosure and protected by the following claims.

Claims

CLAIMS
At least the following is claimed:
1. A system for teleportation in a virtual environment, comprising: a head mounted display configured to provide an immersive virtual
environment;
a teleportation device configured to provide navigation in the virtual environment; at least one feedback device configured to provide a user with information corresponding to movement of the teleportation device within the virtual environment;
a plurality of input devices configured to generate a plurality of input signals in response to inputs from the user; and a computing device configured to receive the plurality of input signals and
control the at least one feedback device.
2. The system of claim 1, wherein the head mounted display comprises: a video display configured to provide a video signal corresponding to the virtual environment; and
an audio device configured to provide an audio signal corresponding to the virtual environment.
3. The system of claim 1, wherein the at least one feedback device
comprises a fan directed to the user and configured to create a motion sensation.
4. The system of claim 3, further comprising a plurality of fans
configured to create the motion sensation in a plurality of directions.
5. The system of claim 1, wherein the at least one feedback device
comprises a speaker configured to generate information in the form of an audio signal
and a vibratory signal to the user, the audio and vibratory signals configured to create
a motion sensation corresponding to changes in the virtual environment.
6. The system of claim 1, wherein the plurality of input devices comprise
a plurality of user position sensors configured to provide three-dimensional location
data corresponding to a plurality of user physiological features.
7. The system of claim 6, wherein the plurality of user physiological
features are selected from the group consisting of: hands, arms, head, and torso.
8. The system of claim 6, wherein one of the plurality of sensors
comprises a teleportation device position sensor configured to provide three
dimensional location data corresponding to the teleportation device.
9. The system of claim 1, wherein one of the plurality of input devices
comprises a user interface device configured to trigger an operation within the virtual
environment.
10. The system of claim 9, wherein the user interface device is an electrical
switch.
11. The system of claim 9, wherein the operation is selected from the
group consisting of: flight mode, lights, stop, move up, move down, and jump.
12. The system of claim 1, further comprising a position interface configured to receive a position sensor input and transmit position and orientation data
to the computing device.
13. The system of claim 1, wherein the teleportation device comprises: a base configured to support at least a portion of the user; and
a directional input portion coupled to the base using a moveable coupling and
configured to simulate a directional input member of a personal vehicle.
14. The system of claim 13, wherein the moveable coupling comprises a biasing element.
15. The system of claim 13, wherein the directional input portion is
configured to tilt away from the user to cause upward movement in the virtual
environment and wherein the directional input portion is configured to tilt toward the
user to cause a downward movement in the virtual environment.
16. The system of claim 1, further comprising a means for controlling a plurality of fans with the computing device.
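To make the directional input recited in claims 13-15 concrete, the following is a minimal sketch of how the tilt of the directional input portion might be mapped to vertical motion. It is illustrative only: the claims specify the tilt-direction convention but no sensor API, units, dead zone, or gain, so all numeric values and names below are assumptions.

```python
# Hypothetical sketch of the tilt-to-motion mapping of claims 13-15.
# The dead zone and gain are assumed; the claims recite no such values.

DEAD_ZONE_DEG = 3.0  # ignore small tilts so the biased coupling can recenter
GAIN = 0.05          # assumed virtual meters per second per degree of tilt

def vertical_velocity(tilt_deg: float) -> float:
    """Map tilt of the directional input portion to vertical velocity.

    Per claim 15, positive tilt (away from the user) produces upward
    movement in the virtual environment and negative tilt (toward the
    user) produces downward movement.
    """
    if abs(tilt_deg) < DEAD_ZONE_DEG:
        return 0.0
    sign = 1.0 if tilt_deg > 0 else -1.0
    return sign * GAIN * (abs(tilt_deg) - DEAD_ZONE_DEG)
```

The dead zone lets a biasing element such as that of claim 14 return the directional input portion to center without causing drift in the virtual environment.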
17. A method for providing teleportation in a virtual environment, comprising:
delivering a video signal, corresponding to a virtual environment, to a user;
delivering an audio signal, corresponding to the virtual environment, to the user;
receiving a plurality of inputs corresponding to a three-dimensional position for each of a plurality of user physiological features;
providing a vibratory feedback, corresponding to the virtual environment, to the user; and
directing air towards the user to create a motion sensation.
18. The method of claim 17, wherein the directing comprises varying a fan output to create the motion sensation corresponding to a plurality of velocities.
19. The method of claim 17, further comprising receiving user interface inputs configured to trigger an operation within the virtual environment.
20. The method of claim 19, wherein the operation is selected from the
group consisting of: flight mode, lights, stop, move up, move down, and jump.
21. The method of claim 17, further comprising supporting a portion of the
user in a configuration consistent with a personal vehicle.
22. The method of claim 21, wherein the personal vehicle comprises a
scooter.
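Claim 18 varies a fan output to create a motion sensation corresponding to a plurality of velocities. A minimal sketch of one such mapping follows; the 8-bit duty-cycle range and the saturation speed are assumptions, not claim limitations.

```python
# Hypothetical sketch of claim 18: scale fan output with virtual speed.
# The 0-255 duty-cycle range and MAX_SPEED are illustrative assumptions.

MAX_SPEED = 10.0  # assumed virtual meters per second at full fan output

def fan_duty(speed: float) -> int:
    """Return a 0-255 duty cycle that grows linearly with virtual speed."""
    fraction = max(0.0, min(speed / MAX_SPEED, 1.0))
    return round(255 * fraction)
```

For example, fan_duty(5.0) returns 128: roughly half output at half the assumed maximum mapped speed.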
23. A system for teleportation in a virtual environment, comprising:
a head mounted display configured to provide a video signal and an audio signal to a user;
a teleportation device configured to support a portion of the user, the teleportation device comprising a base moveably coupled to a directional input component;
a plurality of user position sensors configured to transmit three-dimensional position and orientation data corresponding to a plurality of user physiological features;
a directional input component sensor configured to transmit three-dimensional position and orientation data corresponding to the directional input component of the teleportation device;
a low frequency driver attached to the teleportation device and configured to provide vibratory feedback to the user corresponding to motion in the virtual environment;
a plurality of fans directed at the user and configured to create a motion sensation in a plurality of directions by controlling the output of each of the plurality of fans independently;
a computing device configured to receive a plurality of input signals and generate a plurality of output commands to control a plurality of output devices;
an output device controller, configured to receive a portion of the plurality of output commands and control a portion of the plurality of output devices; and
a position interface device configured to receive signals from the plurality of user position sensors and transmit three-dimensional position and orientation data to the computing device.
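Claim 23 recites a plurality of fans whose outputs are controlled independently to create a motion sensation in a plurality of directions. One way such independent control might work in software is sketched below; the four-fan ring layout, the fan bearings, and the saturation speed are assumptions and are not recited in the claims.

```python
import math

# Hypothetical sketch of independent per-fan control (claims 4, 16, and 23).
# The four-fan ring layout and saturation speed are illustrative assumptions.

FAN_ANGLES = (0.0, math.pi / 2, math.pi, 3 * math.pi / 2)  # assumed fan bearings
MAX_SPEED = 10.0  # assumed virtual meters per second at full output

def fan_levels(vx: float, vy: float) -> list:
    """Return one output level (0.0 to 1.0) per fan.

    Each fan is weighted by the cosine of the angle between its bearing
    and the motion vector, so airflow is strongest from the direction of
    travel and falls off for fans facing away from it.
    """
    speed = math.hypot(vx, vy)
    if speed == 0.0:
        return [0.0] * len(FAN_ANGLES)
    heading = math.atan2(vy, vx)
    magnitude = min(speed / MAX_SPEED, 1.0)
    return [magnitude * max(0.0, math.cos(a - heading)) for a in FAN_ANGLES]
```

Moving along +x at full speed drives the fan at bearing 0 to full output while the opposing fan stays off, one plausible reading of the claimed directional motion sensation.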
PCT/US2006/008264 2005-03-07 2006-03-07 Teleportation systems and methods in a virtual environment WO2006096776A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/816,968 US20080153591A1 (en) 2005-03-07 2006-03-07 Teleportation Systems and Methods in a Virtual Environment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US65928305P 2005-03-07 2005-03-07
US60/659,283 2005-03-07

Publications (2)

Publication Number Publication Date
WO2006096776A2 true WO2006096776A2 (en) 2006-09-14
WO2006096776A3 WO2006096776A3 (en) 2007-08-16

Family

ID=36954010

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/008264 WO2006096776A2 (en) 2005-03-07 2006-03-07 Teleportation systems and methods in a virtual environment

Country Status (2)

Country Link
US (1) US20080153591A1 (en)
WO (1) WO2006096776A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE202016103302U1 (en) 2016-06-22 2016-07-11 Stefan Zimmermann Guide for an electrical line intended for virtual reality glasses
CN110335511A (en) * 2019-05-30 2019-10-15 桂林蓝港科技有限公司 Control system and method for a student-side virtual reality head-mounted display device

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007148266A1 (en) * 2006-06-19 2007-12-27 Ambx Uk Limited Game enhancer
US9254438B2 (en) 2009-09-29 2016-02-09 International Business Machines Corporation Apparatus and method to transition between a media presentation and a virtual environment
US9256347B2 (en) * 2009-09-29 2016-02-09 International Business Machines Corporation Routing a teleportation request based on compatibility with user contexts
KR101926477B1 (en) * 2011-07-18 2018-12-11 삼성전자 주식회사 Contents play method and apparatus
US9265458B2 (en) 2012-12-04 2016-02-23 Sync-Think, Inc. Application of smooth pursuit cognitive testing paradigms to clinical drug development
US9380976B2 (en) 2013-03-11 2016-07-05 Sync-Think, Inc. Optical neuroinformatics
US9393490B2 (en) * 2014-04-14 2016-07-19 International Business Machines Corporation Simulation based on audio signals
US10628186B2 (en) * 2014-09-08 2020-04-21 Wirepath Home Systems, Llc Method for electronic device virtualization and management
DE102014013961A1 (en) 2014-09-19 2016-03-24 Audi Ag Virtual reality glasses, system with virtual reality glasses, and method for operating virtual reality glasses
US10466790B2 (en) * 2015-03-17 2019-11-05 Whirlwind VR, Inc. System and method for processing an audio and video input in a point of view program for haptic delivery
US10768704B2 (en) * 2015-03-17 2020-09-08 Whirlwind VR, Inc. System and method for modulating a peripheral device based on an unscripted feed using computer vision
US10825350B2 (en) * 2017-03-28 2020-11-03 Wichita State University Virtual reality driver training and assessment system
US10777008B2 (en) 2017-08-31 2020-09-15 Disney Enterprises, Inc. Drones generating various air flow effects around a virtual reality or augmented reality user
US10898798B2 (en) * 2017-12-26 2021-01-26 Disney Enterprises, Inc. Directed wind effect for AR/VR experience
KR20190122546A (en) * 2018-04-20 2019-10-30 한국과학기술원 Kinesthetic-feedback wearable apparatus for virtual reality or augmented reality and method for controlling the same
US20220154964A1 (en) * 2020-11-16 2022-05-19 Mumarba LLC Artificial Breeze System

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6184847B1 (en) * 1998-09-22 2001-02-06 Vega Vista, Inc. Intuitive control of portable data displays

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7098891B1 (en) * 1992-09-18 2006-08-29 Pryor Timothy R Method for providing human input to a computer
US6591250B1 (en) * 1998-02-23 2003-07-08 Genetic Anomalies, Inc. System and method for managing virtual property
US6972734B1 (en) * 1999-06-11 2005-12-06 Canon Kabushiki Kaisha Mixed reality apparatus and mixed reality presentation method
US20010041328A1 (en) * 2000-05-11 2001-11-15 Fisher Samuel Heyward Foreign language immersion simulation process and apparatus
US6952716B1 (en) * 2000-07-12 2005-10-04 Treehouse Solutions, Inc. Method and system for presenting data over a network based on network user choices and collecting real-time data related to said choices
US6884170B2 (en) * 2001-09-27 2005-04-26 Igt Method and apparatus for graphically portraying gaming environment and information regarding components thereof
JP2004329463A (en) * 2003-05-06 2004-11-25 Nintendo Co Ltd Game device and control program of virtual camera
US7828657B2 (en) * 2003-05-20 2010-11-09 Turbine, Inc. System and method for enhancing the experience of participant in a massively multiplayer game
US7584082B2 (en) * 2003-08-07 2009-09-01 The Mathworks, Inc. Synchronization and data review system
KR20070007898A (en) * 2004-05-10 2007-01-16 가부시키가이샤 세가 Electronic game machine, data processing method in electronic game machine, program and storage medium for the same


Also Published As

Publication number Publication date
WO2006096776A3 (en) 2007-08-16
US20080153591A1 (en) 2008-06-26

Similar Documents

Publication Publication Date Title
US20080153591A1 (en) Teleportation Systems and Methods in a Virtual Environment
US6864877B2 (en) Directional tactile feedback for haptic feedback interface devices
US6147674A (en) Method and apparatus for designing force sensations in force feedback computer applications
US10322336B2 (en) Haptic braille output for a game controller
JP4441179B2 (en) Tactile remote control device for toys
US5803738A (en) Apparatus for robotic force simulation
JP2020030845A (en) Non-collocated haptic cues in immersive environments
US8737035B2 (en) Magnetically movable objects over a display of an electronic device
EP1877147A2 (en) Manifold compatibility electronic omni axis human interface
US20090325699A1 (en) Interfacing with virtual reality
Rahman et al. Motion-path based in car gesture control of the multimedia devices
WO2005050427A1 (en) Tactile force sense information display system and method
CN104423595A (en) Systems and methods for performing haptic conversion
JP2010061667A (en) Method and apparatus for controlling force feedback interface utilizing host computer
CN111298426B (en) Electronic contest cabin applied to virtual reality
CN201223711Y Interactive body-building equipment with sensing function
US20170348594A1 (en) Device, System, and Method for Motion Feedback Controller
Borst et al. Touchpad-driven haptic communication using a palm-sized vibrotactile array with an open-hardware controller design
CN210845261U (en) Multi-functional immersive VR motion platform device
WO2023189405A1 (en) Input/output device
WO2021240601A1 (en) Virtual space body sensation system
KR20180105285A Haptic sensing apparatus and system
WO2001026089A1 (en) Cursor positioning device with tactile output capability (the 'living mouse')
TWI479364B (en) Portable device with magnetic controlling touch feedback function and magnetic controlling touch feedback device
CN110384932A Virtual reality system for cycling simulation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 11816968

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

122 Ep: pct application non-entry in european phase

Ref document number: 06737435

Country of ref document: EP

Kind code of ref document: A2