CA2239125A1 - Method and apparatus for providing force feedback for a graphical user interface - Google Patents


Info

Publication number
CA2239125A1
CA2239125A1 (application CA002239125A)
Authority
CA
Canada
Prior art keywords
force
cursor
user
target
recited
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002239125A
Other languages
French (fr)
Inventor
Louis B. Rosenberg
Scott B. Brave
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Immersion Human Interface Corp
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US08/566,282 (US5734373A)
Application filed by Individual
Publication of CA2239125A1

Classifications

    • G06F 3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/23 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F 13/285 Generating tactile feedback signals via the game input device, e.g. force feedback
    • A63F 13/533 Controlling the output signals based on the game progress, involving additional visual information for prompting the player, e.g. by displaying a game menu
    • A63F 13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F 13/803 Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • G05G 9/047 Manually-actuated control mechanisms with a single controlling member movable by hand about orthogonal axes, e.g. joysticks
    • G06F 3/038 Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
    • G06F 3/0383 Signal control means within the pointing device
    • G06F 3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • A63F 2300/1025 Details of the interface with the game device, e.g. USB version detection
    • A63F 2300/1037 Converting control signals received from the game device into a haptic signal, e.g. using force feedback
    • A63F 2300/308 Details of the user interface
    • A63F 2300/64 Computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A63F 2300/8017 Driving on land or water; Flying
    • G05G 2009/04766 Joystick-type control members providing feel, e.g. indexing means, means to create counterforce
    • G05G 2009/04777 Joystick-type control members with additional push or pull action on the handle
    • G06F 2203/014 Force feedback applied to GUI
    • G06F 2203/015 Force feedback applied to a joystick
    • H01H 2003/008 Mechanisms for operating contacts with a haptic or a tactile feedback controlled by electrical means, e.g. a motor or magnetofriction
    • H04L 67/131 Protocols for games, networked simulations or virtual reality

Abstract

A human/computer interface device (14) which has a physical object (34), such as a joystick, controlling a graphical object, such as a cursor, within a graphical user interface (GUI). A signal (24) is output from the host computer to the interface device to apply a force sensation to the physical object using one or more actuators (30). The force sensation assists the user in selecting a desired operating system function or physically informs the user of the graphical objects encountered by the cursor within the GUI. A microprocessor (26) local to the interface apparatus can be used to control forces on the physical object.

Description

METHOD AND APPARATUS FOR PROVIDING FORCE FEEDBACK FOR A GRAPHICAL USER INTERFACE

Technical Field

The present invention relates generally to interface devices for allowing humans to interface with computer systems, and more particularly to computer systems and computer interface devices that provide force feedback to the user.
Background Art

Computer systems are used extensively in many different industries to implement many applications, such as word processing, data management, simulations, games, and other tasks. These types of applications are very popular with the mass market of home consumers. A computer system typically displays a visual environment to a user on a display screen or other visual output device. Users can interact with the displayed environment to perform functions on the computer, play a game, experience a simulation or "virtual reality" environment, use a computer aided design (CAD) system, or otherwise influence events or images depicted on the screen. Such user interaction can be implemented through the use of a human-computer interface device, such as a joystick, mouse, trackball, stylus and tablet, "joypad" button controller, foot pedal, yoke hand grip, or the like, that is connected to the computer system controlling the displayed environment. The computer updates the environment in response to the user's manipulation of an object such as a joystick handle or mouse, and provides feedback to the user utilizing the display screen and, typically, audio speakers.
One visual environment that is particularly common is the graphical user interface (GUI). Information within GUIs is presented to users through purely visual and auditory means, such as a video monitor and sound card, which present images and sound effects describing various graphical metaphors of the operating system. Common GUIs include the Windows® operating system from Microsoft Corporation and the System 7 operating system from Apple Computer, Inc. These interfaces allow a user to graphically select and manipulate functions of the operating system of the computer by using a mouse, trackball, joystick, or other input device. Other graphical computer environments are similar to GUIs. For example, graphical "pages" on the World Wide Web of the Internet communication network utilize features similar to those of GUIs to select and operate particular functions. Some CAD systems similarly provide graphical presentations. In addition, there has been some contemplation of three-dimensional (3-D) GUIs that present simulated 3-D environments on a 2-D screen.

GUIs typically require users to carefully move and position a user-controlled graphical object, such as a cursor or pointer, across the screen and onto other displayed graphical objects or predefined regions on a computer screen. Such manual tasks can be described as "targeting" activities, where a user physically manipulates a mouse, joystick, or other interface device in order to command the cursor to a desired location or displayed object, known as a "target" herein. Such targets can include, for example, icons for executing application programs and manipulating files; windows for displaying icons and other information; pull-down menus for selecting particular functions of the operating system or an application program; buttons for selecting presented options; and scroll bars or "sliders" for scrolling information in windows.
Upon moving the cursor to the desired target, the user must maintain the cursor at the acquired target while pressing a button, squeezing a trigger, depressing a pedal, or making some other gesture to command the execution of the given selection or operation. Examples of targeting tasks include positioning a cursor on a graphical icon, selecting and pressing a graphical representation of a button, choosing among numerous items within a graphical representation of a pull-down menu, setting a continuous analog value from a provided range of values by positioning an indicator within a graphical representation of a scroll bar, and selecting a region of text by highlighting the region using the cursor, as well as a number of other common windows-based and text-based metaphors.
The movement of a cursor onto various displayed graphical objects of a GUI may require significant dexterity. Users may move the cursor too far over an object and have to backtrack. Or, particular graphical objects might be mistakenly selected when the user does not wish to select the object, due to pressing a button or moving the cursor by accident. In addition, a user may become confused as to which window a cursor is positioned in if the user is viewing other data on the screen at the same time as moving the cursor.
In particular, persons with neuromotor disabilities who suffer from spastic manual control have much greater difficulty interacting with GUIs because they lack the fine motor coordination required to manually position the computer cursor accurately and efficiently. While manual targeting activities are adequately executed by persons with normal neuromotor functionality, persons with spastic hand motions find such tasks to be physically challenging, if not impossible.
What is needed is a computer system and interface device that will allow all users to more accurately and efficiently perform cursor movement activities and manipulate operating system and other functions within a GUI.

Disclosure of the Invention

The present invention is directed to controlling and providing force feedback to a user operating a human/computer interface device in conjunction with a graphical user interface (GUI) displayed by a host computer system. Force sensations are provided to the interface device to assist and/or inform the user of graphical objects encountered by a user-controlled cursor in the GUI.
More specifically, a method of the present invention for providing force feedback within a graphical user interface (GUI) environment of a computer system includes a step of receiving an indication of movement of a physical object that is manipulated by a user. This physical object, such as a joystick handle or a mouse, is included in an interface device that outputs the indication of movement to the computer system. A user-controlled graphical object, such as a cursor, is moved within the GUI based on the indication of the movement of the physical object. Preferably, a position control paradigm is implemented such that the location of the cursor in the GUI approximately corresponds to a location of the physical object with reference to an origin; alternatively, a rate control paradigm may be used. The cursor and the GUI are displayed on a display screen connected to the computer system, and the GUI allows the user to interface with operating system functions implemented by the computer system through graphical objects displayed on the screen. A signal is output from the computer system to the interface device to command the interface device to apply a desired force sensation to the physical object using one or more electrically controlled actuators. This desired force sensation is associated with at least one of the graphical objects and operating system functions of the graphical user interface.
Preferably, the force sensation applied to the physical object is at least partially determined by a location of the cursor in the GUI with respect to targets associated with the graphical objects in the GUI. These targets may include or be associated with such graphical objects as icons, windows, pull-down menus and menu items, scroll bars ("sliders"), and buttons. The force sensation output to the physical object is associated with targets that affect the cursor. This force preferably assists the user in selecting the desired operating system function that is associated with the force. For example, a target can provide an attractive force on the physical object and cursor so that the cursor is more easily moved onto the target. In addition, the force on the physical object may inform the user of the graphical object that the cursor has moved into or near. An operating system function may be performed as indicated by the location of the cursor and as indicated by a command from the user, such as a physical (or simulated) button press. Velocity or acceleration of the cursor may also affect the applied force.
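As a concrete illustration of this flow, the sketch below maps the physical object's position to a cursor location under a position control paradigm and selects a force from the target the cursor occupies. It is a minimal sketch under assumed conventions: the type names, the scaling constant, and the rectangular target test are illustrative, not taken from the patent.

```c
#include <stdbool.h>

/* Hypothetical types for illustrating the position-control update; the
 * names and fields are assumptions, not the patent's structures. */
typedef struct { float x, y; } Vec2;
typedef struct { Vec2 min, max; int force_id; } Target;

#define K_SCALE 40.0f   /* screen pixels per unit of object travel (assumed) */

/* Position control paradigm: the cursor's location corresponds to the
 * physical object's location with reference to an origin. */
static Vec2 object_to_cursor(Vec2 obj_pos, Vec2 screen_origin) {
    Vec2 c = { screen_origin.x + K_SCALE * obj_pos.x,
               screen_origin.y + K_SCALE * obj_pos.y };
    return c;
}

static bool cursor_in_target(Vec2 c, const Target *t) {
    return c.x >= t->min.x && c.x <= t->max.x &&
           c.y >= t->min.y && c.y <= t->max.y;
}

/* One host update: move the cursor, then pick the force sensation
 * associated with whichever target (if any) the cursor occupies; the
 * result would be sent to the interface device as a force command. */
int select_force(Vec2 obj_pos, Vec2 origin, const Target *targets, int n) {
    Vec2 cursor = object_to_cursor(obj_pos, origin);
    for (int i = 0; i < n; i++)
        if (cursor_in_target(cursor, &targets[i]))
            return targets[i].force_id;
    return 0;   /* cursor over no target: no target force */
}
```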
Each of the targets is preferably associated with at least two different target force sensations that may affect the physical object and the cursor depending on the location of the cursor with respect to each target. The two different target force sensations include an internal target force sensation and an external target force sensation. The internal target force is applied to the physical object when the cursor is located within or moving in or out of the target. The external target force is applied to the physical object when the cursor is located outside the target. The targets are also preferably ordered in a hierarchy, and a target's level in the hierarchy determines if the target will provide forces on the physical object.
The magnitude, direction, duration, and other parameters of the internal and external forces of a target can depend on the type of the target. For example, the external force sensation of icons is an attractive force between the icon and the cursor, which is applied to the physical object when the cursor is within a predetermined distance of the icon. An internal capture force of an icon is preferably an attractive force when the cursor is moved into the icon, and a barrier force when the cursor is moved out of the icon. An internal dead region force is preferably zero near the center area of the icon so the cursor can be moved freely when inside the icon. Other graphical objects can be assigned forces in desired ranges within and external to the graphical objects. A damping force can be used as a dead region force for other graphical objects to provide resistance to the motion of the physical object. In addition, an inertia force can be applied to the physical object when a target is moved by the cursor in the GUI. The target can have a simulated mass that allows a resistive force to be applied to the physical object based on the mass, velocity, or other factors.
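Along a single axis, the icon forces just described might look like the sketch below. It collapses the external attraction and the internal capture region into one spring-like pull and uses assumed constants, so it is one plausible reading of the described behavior rather than the patented method.

```c
#include <math.h>

/* Illustrative one-axis force for an icon target: zero in a dead region
 * near the center, an attractive pull within a predetermined distance,
 * and nothing beyond that range. All constants are assumptions. */

#define DEAD_RADIUS     8.0f   /* dead region: cursor moves freely inside  */
#define ATTRACT_RADIUS 60.0f   /* predetermined range of the attraction    */
#define K_SPRING        0.5f   /* pull strength per unit of offset         */

/* dx is the cursor's offset from the icon center along this axis; the
 * return value is the force applied to the physical object on this axis. */
float icon_axis_force(float dx) {
    float d = fabsf(dx);
    if (d < DEAD_RADIUS || d > ATTRACT_RADIUS)
        return 0.0f;
    return -K_SPRING * dx;     /* pull back toward the icon center */
}
```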
A system of the present invention for providing force feedback to a user manipulating an interface apparatus includes a host computer system. The host receives an input signal from the interface apparatus describing the location, velocity, and/or acceleration of the physical object in a degree of freedom. The host provides a host output signal and updates the location of the cursor within the GUI displayed on the display screen based on the input signal. A microprocessor local to the interface apparatus and separate from the host receives the host output signal and provides a processor output signal. An actuator receives the processor output signal and provides a force along a degree of freedom to the physical object in accordance with the processor signal. A sensor detects motion of the physical object along the degree of freedom and outputs the input signal including information representative of the motion of the physical object. Preferably, the sensor outputs the input signal to the local microprocessor, which outputs the input signal to the host.

The physical object can preferably be moved in one or more degrees of freedom using, for example, a gimbal or slotted yoke mechanism, where an actuator and sensor can be provided for each degree of freedom. A standard serial interface included on many computers, such as the Universal Serial Bus, can be used to interface the host computer system with the local microprocessor. A clock is preferably coupled to the host computer system and/or the local processor, which can be accessed for timing data to help determine the force output by the actuator.

The host computer can receive the sensor information in a supervisory mode and output a high-level host command to the microprocessor whenever a force sensation felt by the user is to be updated or changed. In accordance with the host command, the microprocessor reads sensor and timing data and outputs force values to the actuator according to a force sensation process that is selected. The force sensation process can include using force equations, reading force profiles of predetermined force values from a storage device, or other steps, and may be dependent on sensor data, timing data, host command data, or other data. Alternatively, the host can directly control the actuators of the interface device.
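A minimal sketch of that division of labor follows, assuming hypothetical hardware-access routines (host_command_pending, read_sensor_position, and so on) and a reduced set of sensations; an actual force sensation process could instead read stored force profiles or depend on other data, as the text notes.

```c
/* Local microprocessor loop: the host intervenes only when the force
 * sensation changes; force values are otherwise computed locally each
 * tick from sensor and timing data. All routines are hypothetical. */

typedef enum { FORCE_NONE, FORCE_SPRING, FORCE_DAMPER } SensationType;

typedef struct {
    SensationType type;
    float gain;              /* magnitude parameter from the host command */
} HostCommand;

extern int         host_command_pending(void);
extern HostCommand read_host_command(void);   /* over the serial bus      */
extern float       read_sensor_position(void);/* e.g. from encoder counter*/
extern float       read_clock_seconds(void);  /* local timing data        */
extern void        write_actuator_force(float f);

void local_force_loop(void) {
    HostCommand cmd = { FORCE_NONE, 0.0f };
    float prev_pos = read_sensor_position();
    float prev_t   = read_clock_seconds();

    for (;;) {
        if (host_command_pending())            /* supervisory update      */
            cmd = read_host_command();

        float pos = read_sensor_position();
        float t   = read_clock_seconds();
        float vel = (pos - prev_pos) / (t - prev_t + 1e-6f);

        float force = 0.0f;
        switch (cmd.type) {                    /* selected sensation      */
        case FORCE_SPRING: force = -cmd.gain * pos; break; /* restoring   */
        case FORCE_DAMPER: force = -cmd.gain * vel; break; /* resistance  */
        default: break;
        }
        write_actuator_force(force);
        prev_pos = pos;
        prev_t   = t;
    }
}
```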
In another method of the present invention for providing force feedback for graphical objects in a game implemented on a computer system, a user-controlled first graphical object or "paddle" is displayed on a display screen of the computer system. The paddle moves on the display screen during a game in response to manipulations of a physical object of an interface device by a user. A second graphical object or "ball" is also displayed and moved on the display screen. When the paddle collides with the ball, a compression of the paddle is displayed at the location where the ball contacts the paddle. The paddle and ball each have a predetermined simulated mass and/or simulated compliance. A force command is also output to the interface device to apply a force to the physical object in at least one degree of freedom. The force is applied in the direction of the compression and has a magnitude in accordance with the simulated masses, compliances, and velocities of the graphical objects. In addition, factors such as gravity can affect the movement of the graphical objects on the screen and the forces applied to the physical object.
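One plausible spring-damper reading of that force command is sketched below; the force law and the coefficients are illustrative assumptions, not taken from the patent.

```c
/* Illustrative paddle/ball collision force: the magnitude grows with the
 * displayed compression (spring term, from the simulated compliances)
 * and with the relative approach velocity (damping term, scaled here by
 * the simulated masses). All coefficients are assumptions. */

typedef struct {
    float mass;         /* simulated mass                          */
    float compliance;   /* simulated compliance (larger = softer)  */
    float vel;          /* velocity along the collision axis       */
} SimBody;

float collision_force(const SimBody *paddle, const SimBody *ball,
                      float compression /* displayed overlap depth */) {
    if (compression <= 0.0f)
        return 0.0f;                    /* objects not in contact */
    float k = 1.0f / (paddle->compliance + ball->compliance);
    float rel_vel = ball->vel - paddle->vel;
    float damping = 0.1f * (paddle->mass + ball->mass);
    /* The result is applied to the physical object in the direction
     * of the compression. */
    return k * compression + damping * rel_vel;
}
```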
The method and apparatus of the present invention advantageously provide force feedback to a user in conjunction with movement of a cursor in a GUI. This allows the movement of the cursor to be affected by forces output on the physical object manipulated by the user. Thus, the forces can assist in manipulating operating system functions of the GUI and/or inform the user of the GUI's spatial "landscape" of graphical objects, providing a more efficient GUI. Also, physically handicapped users have a far easier time moving a cursor to various graphical objects and regions of a GUI when the forces of the present invention are provided. In addition, a separate microprocessor local to the interface device can read and process sensor signals as well as output force command signals independently of the host computer, thus saving significant processing time on the host computer and providing more accurate force feedback when using a serial or other relatively low-bandwidth communication interface between the host and the interface device.
These and other advantages of the present invention will become apparent to those skilled in the art upon a reading of the following specification of the invention and a study of the several figures of the drawing.

Brief Description of the Drawings

Figure 1 is a block diagram of a control system in accordance with the present invention for controlling a force feedback interface device from a host computer;
Figure 2 is a schematic diagram of an actuator interface for providing control signals to an active actuator for the present invention;
Figure 3 is a schematic diagram of an actuator interface for providing control signals to a passive actuator for the present invention;
Figure 4 is a flow diagram illustrating a first embodiment of a method of the present invention for controlling a force feedback interface device;
Figure 5 is a flow diagram illustrating a second embodiment of a method of the present invention for controlling a force feedback interface device;
Figure 6 is a schematic diagram of a closed-loop five-bar linkage mechanism for providing two degrees of freedom to the user object of the interface device;
Figure 7 is a perspective view of a preferred embodiment of the linkage mechanism shown in Figure 6;
Figure 8 is a perspective view of a slotted yoke joystick embodiment of a mechanical interface for the user object;
Figure 9 is a table summarizing rate control commands of the present invention;
Figures 10a-c are diagrammatic representations of restoring force profiles;
Figures 11a-c are diagrammatic representations of restoring spring force profiles;
Figure 12 is a diagrammatic representation of a vector force;
Figures 13a-b are diagrammatic representations of vibration force profiles;
Figure 14 is a table summarizing position control commands of the present invention;
Figure 15 is a diagrammatic representation of a groove force profile;
Figure 16 is a diagrammatic representation of a barrier force profile;
Figures 17a-17i are diagrammatic illustrations of a paddle and ball interaction controlled by a paddle command of the present invention;
Figures 17j-k are diagrammatic illustrations of paddle and ball embodiments displayed on a display screen;
Figure 18 is a diagrammatic illustration of a display screen showing a graphical user interface (GUI) and the interaction of forces of the present invention with a user-controlled cursor;
Figure 19 is a diagrammatic illustration of a display screen showing the GUI of Figure 18 having three windows with forces affecting the user-controlled cursor;
Figure 20a is a diagrammatic illustration of a display screen showing targets of a GUI and external forces provided by those targets;
Figure 20b is a diagrammatic illustration of a target and the internal forces provided by that target;
Figure 20c is a diagrammatic illustration of another embodiment of a target and external forces for that target;
Figure 21 is a diagrammatic illustration of a display screen showing the GUI of Figure 18 having a pull-down menu and associated forces of the present invention;
Figure 22 is a diagrammatic illustration of a display screen showing the GUI of Figure 18 having a pop-up window with buttons and associated forces of the present invention;
Figure 23 is a flow diagram illustrating a method of the present invention for providing force feedback within a GUI;
Figure 24 is a flow diagram illustrating a step of Figure 23 for assigning force ranges and magnitudes to graphical objects within a GUI;
Figure 25 is a flow diagram illustrating a step of Figure 23 for determining the target of lowest hierarchy in which the cursor is positioned;
Figure 26 is a flow diagram illustrating a step of Figure 23 for applying appropriate forces to the user object based on targets in the GUI; and
Figure 27 is a flow diagram illustrating a method for applying external and internal forces for a target based on the position of the cursor.

Best Modes for Carrying Out the Invention

FIGURE 1 is a block diagram illustrating a generic control system 10 of the present invention for an interface device controlled by a host computer system. Control system 10 includes a host computer system 12 and an interface device 14.
Host computer system 12 is preferably a personal computer, such as an IBM-compatible or Macintosh personal computer, or a workstation, such as a SUN or Silicon Graphics workstation. For example, the host computer system can be a personal computer which operates under the MS-DOS or Windows operating systems in conformance with an IBM PC AT standard. Alternatively, host computer system 12 can be one of a variety of home video game systems commonly connected to a television set, such as systems available from Nintendo, Sega, or Sony. In other embodiments, home computer system 12 can be a "set top box" which can be used, for example, to provide interactive television functions to users.

In the described embodiment, host computer system 12 implements a host application program with which a user 22 is interacting via peripherals and interface device 14. For example, the host application program can be an operating system, graphical user interface (GUI), video game, medical simulation, scientific analysis program, or even an operating system or other application program that utilizes force feedback. Typically, the host application provides images to be displayed on a display output device, as described below, and/or other feedback, such as auditory signals.
Host computer system 12 preferably includes a host microprocessor 16, random access memory (RAM) 17, read-only memory (ROM) 19, input/output (I/O) electronics 21, a clock 18, a display screen 20, and an audio output device 21. Host microprocessor 16 can include a variety of available microprocessors from Intel, Motorola, or other manufacturers. Microprocessor 16 can be a single microprocessor chip, or can include multiple primary and/or co-processors. The microprocessor preferably retrieves and stores instructions and other necessary data from RAM 17 and ROM 19, as is well known to those skilled in the art. In the described embodiment, host computer system 12 can receive sensor data or a sensor signal via a bus 24 from sensors of interface device 14 and other information. Microprocessor 16 can receive data from bus 24 using I/O electronics 21, and can use I/O electronics to control other peripheral devices. Host computer system 12 can also output a "force command" to interface device 14 via bus 24 to cause force feedback for the interface device. Clock 18 is a standard clock crystal or equivalent component used by host computer system 12 to provide timing to electrical signals used by microprocessor 16 and other components of the computer system. Clock 18 is accessed by host computer system 12 in the control process, as described subsequently.

Display screen 20 is coupled to host microprocessor 16 by suitable display drivers and can be used to display images generated by host computer system 12 or other computer systems. Display screen 20 can be a standard display screen or CRT, 3-D goggles, or any other visual interface. In a described embodiment, display screen 20 displays images of a GUI, application program, simulation, or game environment. In other embodiments, other images can be displayed. A user 22 of the host computer 12 and interface device 14 can receive visual feedback by viewing display screen 20.

Herein, computer 12 may be referred to as displaying computer or graphical "objects" or "entities". These computer objects are not physical objects, but are logical software units, collections of data and/or procedures that may be displayed as images by computer 12 on display screen 20, as is well known to those skilled in the art. For example, a cursor or a third-person view of a car might be considered player-controlled computer objects that can be moved across the screen. A displayed, simulated cockpit of an aircraft might also be considered an "object", or the simulated aircraft can be considered a computer-controlled "entity".
Audio output device 21, such as speakers, is preferably coupled to host microprocessor 16 via amplifiers, filters, and other circuitry well known to those skilled in the art. Host processor 16 outputs signals to speakers 21 to provide sound output to user 22 when an "audio event" occurs during the implementation of the host application program. Other types of peripherals can also be coupled to host processor 16, such as storage devices (hard disk drive, CD-ROM drive, floppy disk drive, etc.), printers, and other input and output devices.
An interface device 14 is coupled to host computer system 12 by a bi-directional bus 24. The bi-directional bus sends signals in either direction between host computer system 12 and the interface device. Herein, the term "bus" is intended to generically refer to an interface such as that between host computer 12 and microprocessor 26, which typically includes one or more connecting wires or other connections and can be implemented in a variety of ways, as described below. In the preferred embodiment, bus 24 is a serial interface bus providing data according to a serial communication protocol. An interface port of host computer system 12, such as an RS-232 serial interface port, connects bus 24 to host computer system 12. Other standard serial communication protocols can also be used in the serial interface and bus 24, such as RS-422, Universal Serial Bus (USB), MIDI, or other protocols well known to those skilled in the art. For example, the USB standard provides a relatively high speed serial interface that can provide force feedback signals in the present invention with a high degree of realism. USB can also source more power to drive peripheral devices. Since each device that accesses the USB is assigned a unique USB address by the host computer, multiple devices can share the same bus. In addition, the USB standard includes timing data that is encoded along with differential data.

An advantage of the present invention is that low-bandwidth serial communication signals can be used to interface with interface device 14, thus allowing a standard built-in serial interface of many computers to be used directly. Alternatively, a parallel port of host computer system 12 can be coupled to a parallel bus 24 and communicate with the interface device using a parallel protocol, such as SCSI or PC Parallel Printer Bus. In a different embodiment, bus 24 can be connected directly to a data bus of host computer system 12 using, for example, a plug-in card and slot or other access of computer system 12. For example, on an IBM AT compatible computer, the interface card can be implemented as an ISA, EISA, VESA local bus, PCI, or other well-known standard interface card which plugs into the motherboard of the computer and provides input and output ports connected to the main data bus of the computer.
In another embodiment, an additional bus 25 can be included to communicate between host computer system 12 and interface device 14. Bus 24 can be coupled to the standard serial port of host computer 12, while additional bus 25 can be coupled to a second port of the host computer system. For example, many computer systems include a "game port" in addition to a serial RS-232 port to connect a joystick or similar game controller to the computer. The two buses 24 and 25 can be used simultaneously to provide increased data bandwidth. For example, microprocessor 26 can send sensor signals to host computer 12 via a uni-directional bus 25 and a game port, while host computer 12 can output force feedback signals from a serial port to microprocessor 26 via a uni-directional bus 24. Other combinations of data flow configurations can be implemented in other embodiments.

Interface device 14 includes a local microprocessor 26, sensors 28, actuators 30, a user object 34, an optional sensor interface 36, an optional actuator interface 38, and other optional input devices 39. Interface device 14 may also include additional electronic components for communicating via standard protocols on bus 24. In the preferred embodiment, multiple interface devices 14 can be coupled to a single host computer system 12 through bus 24 (or multiple buses 24) so that multiple users can simultaneously interface with the host application program (in a multi-player game or simulation, for example). In addition, multiple players can interact in the host application program with multiple interface devices 14 using networked host computers 12, as is well known to those skilled in the art.
Local microprocessor 26 is coupled to bus 24 and is preferably included within the housing of interface device 14 to allow quick communication with other components of the interface device. Processor 26 is considered "local" to interface device 14, where "local" herein refers to processor 26 being a separate microprocessor from any processors in host computer system 12. "Local" also preferably refers to processor 26 being dedicated to force feedback and sensor I/O of interface device 14, and being closely coupled to sensors 28 and actuators 30, such as within the housing of interface device 14 or in a housing coupled closely to interface device 14. Microprocessor 26 can be provided with software instructions to wait for commands or requests from computer host 16, decode the command or request, and handle/control input and output signals according to the command or request. In addition, processor 26 preferably operates independently of host computer 16 by reading sensor signals and calculating appropriate forces from those sensor signals, time signals, and a force sensation process (also referred to as a "subroutine" herein) selected in accordance with a host command. Suitable microprocessors for use as local microprocessor 26 include the MC68HC711E9 by Motorola and the PIC16C74 by Microchip, for example. Microprocessor 26 can include one microprocessor chip, or multiple processors and/or co-processor chips. In other embodiments, microprocessor 26 can include a digital signal processor (DSP) chip. Microprocessor 26 can receive signals from sensors 28 and provide signals to actuators 30 of the interface device 14 in accordance with instructions provided by host computer 12 over bus 24. Microprocessor 26 can also receive commands from any other input devices included on interface apparatus 14 and provide appropriate signals to host computer 12 to indicate that the input information has been received, along with any information included in the input information. For example, buttons, switches, dials, or other input controls on interface device 14 or user object 34 can provide signals to microprocessor 26.
Local memory 27, such as RAM and/or ROM, is preferably coupled to microprocessor 26 in interface device 14 to store instructions for microprocessor 26 and to store temporary and other data, such as calibration parameters. A local clock 29 can be coupled to the microprocessor 26 to provide timing data, similar to system clock 18 of host computer 12; the timing data might be required, for example, to compute forces output by actuators 30. In alternate embodiments using the USB communication interface, timing data for microprocessor 26 can be retrieved from a USB signal.
In one embodiment, host computer 12 can provide low-level force commands over bus 24, which microprocessor 26 directly provides to actuators 30. This embodiment is described in greater detail with respect to Figure 4. In a different embodiment, host computer system 12 can provide high-level supervisory commands to microprocessor 26 over bus 24, and microprocessor 26 manages low-level local force control loops to sensors 28 and actuators 30 in accordance with the high-level commands. This embodiment is described in greater detail with respect to Figure 5.
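The contrast between the two command styles can be made concrete with hypothetical message layouts like those below; the patent does not specify a wire format, so every field here is an illustrative guess.

```c
#include <stdint.h>

/* Low-level style: the host computes the force itself and sends a raw
 * value every update; the local microprocessor just passes it through. */
typedef struct {
    int16_t force_x;       /* raw force value for the X-axis actuator */
    int16_t force_y;       /* raw force value for the Y-axis actuator */
} LowLevelCommand;

/* High-level style: sent only when the sensation changes; the local
 * microprocessor then runs its own control loop until superseded. */
typedef struct {
    uint8_t  sensation;    /* e.g. spring, damper, vibration, barrier */
    uint8_t  axis_mask;    /* which degrees of freedom are affected   */
    int16_t  magnitude;    /* sensation strength parameter            */
    uint16_t duration_ms;  /* 0 = persist until the next command      */
} HighLevelCommand;
```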
In the preferred embodiment, sensors 28, actuators 30, microprocessor 26, and other related electronic components are included in a housing for interface device 14, to which user object 34 is directly or indirectly coupled. Alternatively, microprocessor 26 and/or other electronic components of interface device 14 can be provided in a separate housing from user object 34, sensors 28, and actuators 30. Also, additional mechanical structures may be included in interface device 14 to provide object 34 with desired degrees of freedom. Some embodiments of such mechanisms are described with reference to Figures 6-8.

Sensors 28 sense the position, motion, and/or other characteristics of a user object 34 of the interface device 14 along one or more degrees of freedom and provide signals to microprocessor 26 including information representative of those characteristics. Examples of embodiments of user objects and movement within provided degrees of freedom are described subsequently with respect to Figures 6-8. Typically, a sensor 28 is provided for each degree of freedom along which object 34 can be moved, or a single compound sensor can be used to sense movement in multiple degrees of freedom. Examples of sensors suitable for several embodiments described herein are digital optical encoders, which sense the change in position of an object about a rotational axis and provide digital signals indicative of the change in position. Linear optical encoders similarly sense the change in position of object 34 along a linear degree of freedom. Either relative or absolute sensors can be used. A suitable optical encoder is the "Softpot" from U.S. Digital of Vancouver, Washington.
Sensors 28 provide an electrical signal to an optional sensor interface 36, which can be used to convert sensor signals to signals that can be interpreted by the microprocessor 26 and/or host computer system 12. For example, sensor interface 36 receives the two phase-related signals from a sensor 28 and converts the two signals into another pair of clock signals, which drive a bi-directional binary counter. The output of the binary counter is received by microprocessor 26 as a binary number representing the angular position of the encoded shaft. Such circuits, or equivalent circuits, are well known to those skilled in the art; for example, the Quadrature Chip LS7166 from Hewlett Packard, California performs the functions described above. Each sensor 28 can be provided with its own sensor interface, or one sensor interface may handle data from multiple sensors. For example, a sensor interface can include a separate processing chip dedicated to each sensor that provides input data. Alternately, microprocessor 26 can perform these interface functions without the need for a separate sensor interface 36. The position value signals can be used by microprocessor 26 and are also sent to host computer system 12, which updates the host application program and outputs force control signals. In alternate embodiments, sensor signals from sensors 28 can be provided directly to host computer system 12, bypassing microprocessor 26. Also, sensor interface 36 can be included within host computer system 12, such as on an interface board or card.
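In software, the quadrature decoding that such a counter performs can be sketched as below. The state-table method shown is a standard decoding technique and is purely illustrative; the embodiment described here uses a dedicated hardware counter instead.

```c
#include <stdint.h>

/* Software quadrature decoder: the encoder's two phase-related channels
 * are sampled, and a position counter is stepped up or down depending on
 * the Gray-code transition between successive samples. */

static int32_t position   = 0;   /* running count, read as shaft position */
static uint8_t prev_state = 0;

/* Call on each sample of the encoder's A and B channels (0 or 1 each). */
void quadrature_step(uint8_t a, uint8_t b) {
    /* Transition table indexed by (previous state << 2) | current state:
     * +1 for one rotation direction, -1 for the other, 0 for no change
     * or an invalid (skipped) transition. */
    static const int8_t table[16] = {
         0, -1, +1,  0,
        +1,  0,  0, -1,
        -1,  0,  0, +1,
         0, +1, -1,  0
    };
    uint8_t state = (uint8_t)((a << 1) | b);
    position += table[(prev_state << 2) | state];
    prev_state = state;
}
```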
Alternatively, an analog sensor can be used instead of a digital sensor for all or some of the sensors 28. For example, a strain gauge can be connected to measure forces on object 34 rather than positions of the object. Also, velocity sensors and/or accelerometers can be used to directly measure velocities and accelerations on object 34. Analog sensors can provide an analog signal representative of the position/velocity/acceleration of the user object in a particular degree of freedom. An analog-to-digital converter (ADC) can convert the analog signal to a digital signal that is received and interpreted by microprocessor 26 and/or host computer system 12, as is well known to those skilled in the art. The resolution of the detected motion of object 34 would be limited by the resolution of the ADC. However, noise can sometimes mask small movements of object 34 from an analog sensor, which can potentially mask the play that is important to some embodiments of the present invention (described subsequently). Other types of interface circuitry 36 can also be used. For example, sensor interface 36 can include angle-determining chips to pre-process angle signals read from sensors 28 before sending them to the microprocessor 26. For example, a data bus plus chip-enable lines allow any of the angle-determining chips to communicate with the microprocessor.
Actuators 30 transmit forces to user object 34 of the intPrf:lce device 14 in one or more 10 directions along one or more degrees of freedom in response to signals received from microprocessor 26. Typically, an actuator 30 is provided for each degree of freedom along which forces are desired to be tr~ncmittP~l Actuators 30 can include two types: active actuators and passive actuators. Active actuators include linear current control motors, stepper motors, pn~ m~ti~/hydraulic active actuators, a torquer (motor with limited angular range), a voice coil 15 actuator, and other types of actuators that lldll~llul a force to move an object. For example, active z~rtll~torc can drive a rotational shaft about an axis in a rotary degree of freedom, or drive a linear shaft along a linear degree of freedom. Passive act--~t- rC can also be used for ~chl~qtors 30.
Magnetic particle brakes, friction brakes, or pl~P~ rlhydraulic passive actuators can be used in ~lt1iti~n to or instead of a motor to g~leldle a damping resistance or friction in a degree of motion.
A suitable magnetic particle brake for interface device 14 is available from Force Limited, Inc. of Santa Monica, California.
In alternate embodiments, all or some of sensors 28 and actuators 30 can be included together as a sensor/actuator pair transducer. A suitable transducer for the present invention including both an optical encoder and current controlled motor is a 20 W basket wound servo motor manufactured by Maxon.
Actuator interface 38 can be optionally connected between actuators 30 and microprocessor 26. Interface 38 converts signals from microprocessor 26 into signals appropriate to drive actuators 30. Interface 38 can include power amplifiers, switches, digital to analog converters (DACs), and other components. An example of an actuator interface for active actuators is described with reference to Figure 2. An example of an actuator interface for passive actuators is described with reference to Figure 3. In alternate embodiments, interface 38 circuitry can be provided within microprocessor 26 or in actuators 30.
Other input devices 39 can optionally be included in interface device 14 and send input signals to microprocessor 26. Such input devices can include buttons, dials, switches, or other mechanisms. For example, in embodiments where user object 34 is a joystick, other input devices can include one or more buttons provided, for example, on the joystick handle or base and used to supplement the input from the user to a game or simulation. The operation of such input devices is well known to those skilled in the art.
Power supply 40 can optionally be coupled to actuator interface 38 and/or actuators 30 to provide electrical power. Active actuators typically require a separate power source to be driven.
Power supply 40 can be included within the housing of interface device 14, or can be provided as a separate component, for example, connected by an electrical power cord. Alternatively, if the USB or a similar communication protocol is used, interface device 14 can draw power from the USB and thus have no need for power supply 40. This embodiment is most applicable to a device 14 having passive actuators 30, since passive actuators require little power to operate. Active actuators tend to require more power than can be drawn from USB, but this restriction can be overcome in a number of ways. One way is to configure interface device 14 to appear as more than one peripheral to host computer 12; for example, each provided degree of freedom of user object 34 can be configured as a different peripheral and receive its own allocation of power. This would allow host 12 to allocate more power to interface device 14. Alternatively, power from the USB can be stored and regulated by interface device 14 and thus used when needed to drive actuators 30. For example, power can be stored over time and then immediately dissipated to provide a jolt force to the user object 34. A capacitor circuit, for example, can store the energy and dissipate the energy when enough power has been stored. Microprocessor 26 may have to regulate the output of forces to assure that time is allowed for power to be stored. This power storage embodiment can also be used in non-USB embodiments of interface device 14 to allow a smaller power supply 40 to be used.
Safety switch 41 is preferably included in interface device 14 to provide a mechanism to allow a user to override and deactivate actuators 30, or require a user to activate actuators 30, for safety reasons. Certain types of actuators, especially active actuators such as motors, can pose a safety issue for the user if the actuators unexpectedly move user object 34 against the user with a strong force. In addition, if a failure in the control system 10 occurs, the user may desire to quickly deactivate the actuators to avoid any injury. To provide this option, safety switch 41 is coupled to actuators 30. In the preferred embodiment, the user must continually activate or close safety switch 41 during operation of interface device 14 to activate the actuators 30. If, at any time, the safety switch is deactivated (opened), power from power supply 40 is cut to actuators 30 (or the actuators are otherwise deactivated) as long as the safety switch is deactivated. For example, a preferred embodiment of safety switch is an optical switch located on user object 34 (such as a joystick) or on a convenient surface of a housing enclosing interface device 14. When the user covers the optical switch with a hand or finger, the sensor of the switch is blocked from sensing ambient light, and the switch is closed. The actuators 30 thus will function as long as the user covers the switch. Other types of safety switches 41 can be provided in other embodiments. For example, an electrostatic contact switch can be used to sense contact, a button or trigger can be pressed, or a different type of sensor switch can be used.
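The dead-man behavior of safety switch 41 can be summarized in firmware terms. The C sketch below is illustrative only; the helper names are hypothetical, and the preferred embodiment described above cuts power in hardware rather than in software:

    #include <stdbool.h>
    #include <stdint.h>

    /* Hypothetical hardware hooks; real implementations would read the
     * optical switch and drive the actuator interface. Stubbed here so
     * the sketch is self-contained. */
    static bool safety_switch_closed(void) { return true; }
    static void actuator_output(int16_t force) { (void)force; }

    /* Gate every force output on the safety switch: forces flow only
     * while the user holds the switch closed; otherwise output zero. */
    static void apply_force_with_safety(int16_t commanded_force)
    {
        actuator_output(safety_switch_closed() ? commanded_force : 0);
    }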
User object 34 is preferably a device or article that may be grasped or otherwise contacted or controlled by a user and which is coupled to interface device 14. By "grasp", it is meant that users may releasably engage a grip portion of the object in some fashion, such as by hand, with their fingertips, or even orally in the case of handicapped persons. The user 22 can manipulate and move the object along provided degrees of freedom to interface with the host application program the user is viewing on display screen 20. Object 34 can be a joystick, mouse, trackball, stylus, steering wheel, medical instrument (laparoscope, catheter, etc.), pool cue, hand grip, knob, button, or other article.
FIGURE 2 is a schematic diagram illustrating an example of an actuator interface 38 for an active actuator 30 of interface device 14. In this example, actuator 30 is a linear current controlled servo motor. Actuator interface 38 includes a DAC circuit 44 and a power amplifier circuit 46.
DAC circuit 44 is coupled to microprocessor 26 and preferably receives a digital signal representing a force value from the microprocessor 26. DAC 48 is suitable for converting an input digital signal to an analog voltage that is output to power amplifier circuit 46. Op amp 50, for example, outputs a signal from zero to -5 volts proportional to the binary number at its input. Op amp 52 is an inverting summing amplifier that converts the output voltage to a symmetrical bipolar range; this output signal is suitable for power amplification in amplification circuit 46. Of course, DAC circuit 44 is intended as one example of many possible circuits that can be used to convert a digital signal to a desired analog signal.
Power amplifier circuit 46 receives an analog low-power control voltage from DAC circuit 44 and amplifies the voltage to control actuators 30. Actuator 30 can be a high-power, current-controlled servo motor 30. The input voltage controls a transconductance stage composed of amplifier 54 and several resistors. The transconductance stage produces an output current proportional to the input voltage to drive motor 30 while drawing very little current from the input voltage source. The second amplifier stage, including amplifier 56, resistors, and a capacitor C, provides additional current capacity by enhancing the voltage swing of the second terminal 57 of motor 30. Of course, circuit 46 is intended as one example of many possible circuits that can be used to amplify voltages to drive active actuators 30.
FIGURE 3 is a schematic diagram illustrating an example of an actuator interface 38' that can be used in conjunction with passive actuators. Interface 38' is suitable for use with passive actuators (dampers) that are controlled with an analog voltage. Interface 38' includes a DAC circuit 44, amplifier 60, transistor 62, and voltage protector 64. DAC circuit 44 is coupled to microprocessor 26 and receives a digital signal from the computer system representing a resistive force value to be applied to user object 34. DAC circuit 44 converts the digital signal voltages to analog voltages which are then output to amplifier 60. Amplifier 60 receives the analog voltage from DAC 44 on a positive terminal and scales the voltage signal to a range usable by actuator 30.
Amplifier 60 can be implemented as an operational amplifier or the like. Transistor 62 is coupled to the output of amplifier 60 and preferably operates as an amplifier to provide increased output current to actuator 30. Resistor R1 is coupled between amplifier 60 and the emitter of transistor 62, and resistor R2 is coupled between amplifier 60 and ground. Voltage protector 64 is coupled to the emitter of transistor 62 and provides protection from voltage spikes when using inductive loads. Suitable passive actuators 30 for use with this circuitry include variable solenoids or magnetic particle brakes. A separate DAC and amplifier can be used for each actuator 30 implemented in the interface apparatus. Interface 38' is intended as one example of many possible circuits that can be used to interface a computer system to actuators. In an alternate embodiment, an on/off signal might only be needed, for example, for a solenoid driving an on/off valve of a fluid-controlled actuator.
FIGURE 4 is a flow diagram illustrating a first embodiment of a method 70 for controlling a force feedback interface device of the present invention. Method 70 is directed to a "host-controlled" embodiment, in which host computer system 12 provides direct, low-level force commands to microprocessor 26, and the microprocessor directly provides these force commands to actuators 30 to control forces output by the actuators.
For example, the host controlled mode is suitable for embodiments using a USB communication interface. Data rates are sufficiently high to allow the host to communicate at 500 Hz or greater and provide realistic force feedback to the user object 34.
The process begins at 72. In step 74, host computer system 12 and interface device 14 are powered up, for example, by a user activating power switches. After step 74, the process 70 branches into two parallel (simultaneous) processes. One process is implemented on host computer system 12, and the other process is implemented on local microprocessor 26. These two processes branch out of step 74 in different directions to indicate this simultaneity.
In the host computer system process, step 76 is first implemented, in which an application program is processed or updated. This application can be a simulation, video game, scientific program, or other program. Images can be displayed for a user on output display screen 20 and other feedback can be presented, such as audio feedback.
Two branches exit step 76 to indicate that there are two processes running simultaneously (multitasking) on host computer system 12. In one process, step 78 is implemented, where sensor data is received by the host computer from local microprocessor 26. As detailed below in the microprocessor process, the local processor 26 continually receives signals from sensors 28, processes the raw data, and sends processed sensor data to host computer 12. Alternatively, local processor 26 sends raw data directly to host computer system 12. "Sensor data", as referred to herein, can include position values, velocity values, and/or acceleration values derived from the sensors 28 which detect motion of object 34 in one or more degrees of freedom. In addition, any other data received from other input devices 39 can also be received by host computer system 12 as sensor data in step 78, such as signals indicating a button on interface device 14 has been activated by the user. Finally, the term "sensor data" also can include a history of values, such as position values recorded previously and stored in order to calculate a velocity.
After sensor data is read in step 78, the process returns to step 76, where the host computer system 12 can update the application program in response to the user's manipulations of object 34 and any other user input received in step 78, as well as determine if forces need to be applied to object 34 in the parallel process. Step 78 is implemented in a continual loop of reading data from local processor 26.
The second branch from step 76 is concerned with the process of the host computer determining force commands to provide force feedback to the user manipulating object 34. These commands are described herein as "low-level" force commands, as distinguished from the "high-level" or supervisory force commands described in the embodiment of Figure 5. A low level force command instructs an actuator to output a force of a particular magnitude. For example, the low level command typically includes a magnitude force value, e.g., equivalent signal(s) to instruct the actuator to apply a force of a desired magnitude value. Low level force commands may also designate a direction of force if an actuator can apply force in a selected direction, and/or other low-level information as required by an actuator.
The second branch starts with step 80, in which the host computer system checks if a change in the force applied to user object 34 is required. This can be determined by several types of criteria, the most important of which are the sensor data read by the host computer in step 78, timing data, and the implementation or "events" of the application program updated in step 76. The sensor data read in step 78 informs the host computer 12 how the user is interacting with the application program. From the position of object 34 sensed over time, the host computer system 12 can determine when forces should be applied to the object. For example, the position of a computer generated object within a GUI may determine if a change in force feedback is called for.
In addition, the velocity and/or acceleration of the user object can influence whether a change in force on the object is required. Also, other input, such as a user activating buttons or other controls on interface device 14, can change the forces required on object 34 depending on how those controls have been programmed to affect the application program.

Other criteria for determining if a change in force is required include events in the application program, such as collision events between graphical objects. Forces should thus be applied to the user object dependent on this collision event to simulate an impact. Forces can be required on the user object depending on a combination of such an event and the sensor data read in step 78. Other parameters in the application program can determine if a change in force to the user object is necessary, such as other input devices or user interface devices connected to host computer system 12 and inputting data to the application program (other interface devices can be directly connected, connected remotely through a network, etc.).
If no change in force is currently required in step 80, then the process returns to step 76 to update the host application and return to step 80 to again check until such a change in force is required. When such a change is required, step 82 is implemented, in which host computer 12 determines appropriate low-level force commands to be sent to the actuators 30 of interface device 14, these force commands being dependent on a selected force sensation process, sensor data, the host application, and the clock 18.
The low-level force commands can be determined, in part, from a selected force sensation process. A "force sensation process", as referred to herein, is a set of instructions for providing force commands dependent on other parameters or conditions, such as sensor data read in step 78 and timing data from clock 18. In the described embodiment, force sensation processes can include several different types of steps and/or instructions. One type of instruction is a force algorithm, which includes an equation that host computer 12 can use to calculate or model a force value based on sensor and timing data. Several types of algorithms can be used. For example, algorithms in which force varies linearly (or nonlinearly) with the position of object 34 can be used to provide a simulated force like a spring. Algorithms in which force varies linearly (or nonlinearly) with the velocity of object 34 can also be used to provide a simulated damping force or other forces. Algorithms in which force varies linearly (or nonlinearly) with the acceleration of object 34 can also be used to provide, for example, a simulated inertial force on a mass (for linear variation) or a simulated gravitational pull (for nonlinear variation).
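By way of illustration, the three classes of force algorithm named above reduce to simple arithmetic. The following C sketch is not taken from the patent; the gain constants and names are arbitrary assumptions:

    /* Illustrative force algorithms of the three classes described
     * above: force varying with position (spring), with velocity
     * (damping), and with acceleration (inertia). Units are arbitrary. */
    float spring_force(float position)          /* F = -k * x */
    {
        const float k = 2.0f;                   /* simulated stiffness */
        return -k * position;
    }

    float damping_force(float velocity)         /* F = -b * v */
    {
        const float b = 0.5f;                   /* damping constant */
        return -b * velocity;
    }

    float inertial_force(float acceleration)    /* F = -m * a */
    {
        const float m = 1.5f;                   /* simulated mass */
        return -m * acceleration;
    }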
For force values depending on the velocity and acceleration of user object 34, the velocity and acceleration can be provided in a number of different ways. The sensor data read by host computer 12 in step 78 can include position data, velocity data, and acceleration data. In a preferred embodiment, the velocity and acceleration data was calculated previously by microprocessor 26 and then provided to the host computer 12. The host computer can thus use the velocity and acceleration data directly in an algorithm to calculate a force value.
In an alternate embodiment, the sensor data read in step 78 includes position data and no velocity or acceleration data, so that host computer 12 is required to calculate the velocity and acceleration from the position data. This can be accomplished by recording a number of past position values, recording the time when each such position value was received using the system clock 18, and calculating a velocity and/or acceleration from such data. For example, a kinematic equation which calculates a force based on the velocity of the user object multiplied by a damping constant can be used to determine a damping force on the user object. This type of equation can simulate motion of object 34 along one degree of freedom through a fluid or similar material. A damping constant can indicate the degree of resistance that object 34 experiences when moving through a simulated material, such as a liquid, where a greater number indicates greater resistance. The difference in position and direction of movement of the user object is calculated, and the force is set equal to the damping constant multiplied by the change in position. Movement in other mediums, such as on a bumpy surface, on an inclined plane, etc., can be simulated in a similar fashion using different methods of calculating the force.
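A minimal C sketch of this history-based damping calculation follows; the structure and names are assumptions for illustration, not the patent's implementation:

    /* Damping force computed from a history of timestamped positions:
     * velocity is estimated from the change in position over the
     * elapsed clock time, then scaled by a damping constant. */
    typedef struct {
        float position;   /* sensed position of object 34 */
        float time;       /* system clock timestamp, in seconds */
    } sample_t;

    float damping_force_from_history(sample_t prev, sample_t curr,
                                     float damping_constant)
    {
        float dt = curr.time - prev.time;
        if (dt <= 0.0f)
            return 0.0f;                       /* no elapsed time recorded */
        float velocity = (curr.position - prev.position) / dt;
        return -damping_constant * velocity;   /* resists direction of motion */
    }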
The determination of force commands is preferably influenced by timing data accessed from system clock 18. For example, in the damping force example described above, the velocity of the user object 34 is determined by calculating the difference of positions of the user object and multiplying by the damping constant. The host computer can access clock 18 to determine how much time has actually elapsed since the last position data was received and thus use the clock's timing data in the modulation of forces and force sensations to the user. Timing data can be used in other algorithms and force sensation processes of the present invention.
Other instructions can also be included in a force sensation process. For example, conditions can be included to provide forces only in desired directions or under other particular circumstances. For example, to simulate a virtual obstruction such as a wall, forces should be applied in only one direction (uni-directional). To simulate uni-directional resistance, conditions
can be included in the obstruction force sensation process. Also, a "null" force sensation process can be available that instructs host computer 12 (or microprocessor 26 in Figure 5) to issue a low level command or force values to provide zero forces (i.e., remove all forces) on object 34.
Another type of force sensation process does not use algorithms to model a force, but instead uses force values that have been previously calculated or sampled and stored as a digitized "force profile" in memory or other storage device. These force values may have been previously generated using an equation or algorithm as described above, or provided by sampling and digitizing forces. For example, to provide a particular force sensation to the user, host computer 12 can be instructed by a force sensation process to retrieve successive force values from a certain storage device, such as RAM, ROM, hard disk, etc. These force values can be sent directly to an actuator to provide particular forces without requiring host computer 12 to calculate the force values. In addition, previously-stored force values can be output with respect to other parameters to provide different types of forces and force sensations from one set of stored force values. For example, using system clock 18, the stored force values can be output in sequence according to a particular time interval that can vary depending on the desired force. Or, different retrieved force values can be output depending on the current position of user object 34.
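Such force profile playback can be sketched in C as follows, indexed either by elapsed clock time or by the current position of the user object; the sample values and function names are invented for illustration:

    #include <stddef.h>

    /* Playback of a digitized "force profile": previously stored force
     * samples are retrieved in sequence by elapsed time, or selected by
     * the current position of the user object. Values are illustrative. */
    static const float profile[] = { 0.0f, 0.2f, 0.6f, 1.0f, 0.6f, 0.2f, 0.0f };
    static const size_t profile_len = sizeof(profile) / sizeof(profile[0]);

    /* Time-indexed: one stored sample per 'interval' seconds. */
    float profile_force_by_time(float elapsed, float interval)
    {
        size_t i = (size_t)(elapsed / interval);
        return (i < profile_len) ? profile[i] : 0.0f;
    }

    /* Position-indexed: map the object's range of travel onto the profile. */
    float profile_force_by_position(float pos, float min_pos, float max_pos)
    {
        if (pos <= min_pos) return profile[0];
        if (pos >= max_pos) return profile[profile_len - 1];
        float frac = (pos - min_pos) / (max_pos - min_pos);
        return profile[(size_t)(frac * (float)(profile_len - 1))];
    }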
Host computer 12 can determine a force command in step 82 according to a newly-selected force sensation process, or to a previously selected force sensation process. For example, if this is a second or later iteration of step 82, the same force sensation process as in the previous iteration can be again implemented if parameters (such as the position of object 34) allow it, as determined by the host application program.
The force command determined in step 82 can also depend on instructions that check for other parameters. These instructions can be included within or external to the above-described force sensation processes. One such parameter is the set of values provided by the implemented host application program (if any). The application program may determine that a particular force command should be output or force sensation process implemented based on events occurring within the application program or other instructions. Force commands or values can be provided by the host application program independently of sensor data. Also, the host application program can provide its own particular position, velocity, and/or acceleration data to a selected force sensation process to calculate or provide a force that is not based on the manipulation of user object 34, but is provided to simulate an event in the application program. Such events may include collision events, such as occur when a user-controlled computer image impacts a virtual surface or structure. Also, other input devices connected to host computer 12 can influence events and, therefore, the forces applied to user object 34. For example, the sensor data from multiple
interface devices 14 connected to a single host computer can influence the forces felt on other connected interface devices by influencing events and computer-controlled images/objects of the host application program. Also, the force commands determined in step 82 can be based on other inputs to host computer 12, such as activations of buttons or other input devices in (or external to) interface device 14. For example, a particular application program might require that a force be applied to a joystick whenever a user presses a fire button on the joystick.
The above-described force sensation processes and other parameters can be used to provide a variety of haptic sensations to the user through the user object 34 to simulate many different types of tactile events. For example, typical haptic sensations may include a virtual damping (described above), a virtual obstruction, and a virtual texture. Virtual obstructions are provided to simulate walls, obstructions, and other uni-directional forces in a GUI, simulation, game, etc., and are used to provide a physical resistance to movement of the joystick in a direction.
Virtual textures can be used to simulate a surface condition or similar texture. For example, as the user moves a joystick along an axis, the host computer sends a rapid sequence of commands to repetitively 1) apply resistance along that axis, and 2) then immediately apply no resistance along that axis, according to a force sensation process and correlated with spatial position. Thus, the user feels a physical sensation of texture, similar to the feeling of dragging a stick over a grating.
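The alternating-resistance texture can be sketched in C as a force correlated with spatial position, as described above; the grating pitch and resistance values below are illustrative assumptions, not values from the patent:

    #include <math.h>

    /* Virtual texture: resistance is applied and removed repetitively as
     * a function of spatial position, like dragging a stick over a
     * grating. On alternating spatial bands, motion is resisted; in
     * between, no resistance is applied. */
    float texture_force(float position, float velocity)
    {
        const float pitch = 0.05f;        /* spatial period of the grating */
        const float resistance = 0.8f;    /* damping applied on the ridges */

        float phase = fmodf(fabsf(position), pitch) / pitch;
        if (phase < 0.5f)
            return -resistance * velocity;
        return 0.0f;
    }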
In next step 84, a low-level force collll.l~ld rl~trl ~ d in step 82 is output to microprocessor 26 over bus 24. Tnis force col~ a~d typically includes a force value that was 5 det~rmin.~l in accordance with the parameters described above. The force command can be output as an actual force signal that is merely relayed to an actuator 30 by microprocessor 26; or, the force comm~n~l can be converted to an al?plu~liate forrn by microprocessor 26 before being sent to actuator 30. In addition, the low-level force co~nand preferably includes inforrnation in-lic~ting to microprocessor 26 which :~rfll~tor~ a~;e to receive this force value (if multiple actuators are 10 in(~.hlde~l on int~rf~e device 14). The process then returns to step 76 to process/update the host application program. The process continues to step 80, where the host computer checks if a dirr~ t force cornmand should be output as t1rt~ cl by the parameters described above. If so, a new force cnmm~n(l is (letP.rmin( d and output in step 84. If no change of force is required, host co~ uLel 12 does not issue another comm~n~l, since microprocessor 26 can contin-~es to output the 15 previous force comm~n-l to actuators 30 (alternatively, host computer 12 can continue to output comm~n-lc, even if no change of force is required). Subsequent force comm~n~ls output in step 84 can be ~ od in accordance with the same force sensation process, or a different force sensation process, depending on the parameters of step 82.
In addition, the host computer 12 preferably synchronizes any appropriate visual feedback, auditory feedback, or other feedback related to the host application with the application of forces on user object 34. For example, in a video game application, the onset or start of visual events, such as an object colliding with the user on display screen 20, should be synchronized with the onset or start of forces felt by the user which correspond to or complement those visual events. The onsets of visual events and force events preferably occur within about 30 milliseconds (ms) of each other. This span of time is the typical limit of human perceptual ability to perceive the events as simultaneous. If the visual and force events occur outside this range, then a time lag between the events can usually be perceived. Similarly, the output of auditory signals, corresponding to the onset of auditory events in the host application, is preferably synchronized with the onset of output forces that correspond to/complement those auditory events. Again, the onsets of these events preferably occur within about 30 ms of each other. For example, host computer system 12 can output sounds of an explosion from speakers 21 as close in time as possible to the forces felt by the user from that explosion in a simulation. Preferably, the magnitude of the sound is in direct (as opposed to inverse) proportion to the magnitude of the forces applied to user object 34.
The local microprocessor 26 implements the process branching from step 74 and starting with step 86 in parallel with the host computer process described above. In step 86, the interface device 14 is activated. For example, signals can be sent between host computer 12 and interface device 14 to acknowledge that the interface device is now active. From step 86, two processes branch to indicate that there are two processes running simultaneously (multi-tasking) on local processor 26. In one process, step 88 is implemented, in which the processor 26 reads raw data (sensor readings) from sensors 28. Such raw data preferably includes position values describing the position of the user object along provided degrees of freedom. In alternate embodiments, sensors 28 can include velocity sensors and accelerometers for providing raw velocity and acceleration values of object 34. The raw data read in step 88 can also include other input, such as from an activated button or other control 39 of interface device 14.
In next step 90, processor 26 processes the received raw data into sensor data, if applicable. In the preferred embodiment, this processing includes two steps: computing velocity and/or acceleration values from raw position data (if velocity and/or acceleration are needed to compute forces), and filtering the computed velocity and acceleration data. The velocity and acceleration values are computed from raw position data received in step 88 and stored position and time values. Preferably, processor 26 stores a number of position values and time values corresponding to when the position values were received. Processor 26 can use its own or a local system clock (not shown in Fig. 1) to determine the timing data. The velocity and acceleration can be computed using the stored position data and timing data and then filtered to remove noise from the data, such as large spikes that may result in velocity calculations from quick changes in position of object 34. In an alternate embodiment, circuitry that is electrically coupled to but separate from processor 26 can receive the raw data and determine velocity and acceleration. For example, an application-specific integrated circuit (ASIC) or discrete logic circuitry can use counters or the like to determine velocity and acceleration.
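One simple way to perform the filtering step described above is a first-order low-pass filter on the velocity estimate. The following C sketch is one possible approach, not the filter specified by the patent; all names are assumptions:

    /* History-based velocity estimate followed by a first-order low-pass
     * filter to suppress spikes from quick position changes. */
    typedef struct {
        float prev_position;
        float prev_time;
        float filtered_velocity;
    } velocity_filter_t;

    float update_velocity(velocity_filter_t *vf, float position, float now)
    {
        const float alpha = 0.2f;              /* smoothing factor, 0..1 */
        float dt = now - vf->prev_time;
        if (dt > 0.0f) {
            float raw = (position - vf->prev_position) / dt;
            vf->filtered_velocity += alpha * (raw - vf->filtered_velocity);
        }
        vf->prev_position = position;
        vf->prev_time = now;
        return vf->filtered_velocity;
    }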
Alternatively, step 90 can be omitted, and the processor 26 can provide raw position data to host computer 12 (and other input data from other input devices 39). This would require host computer 12 to filter and calculate velocity and acceleration from the position data. In other embodiments, the filtering can be performed on host computer 12 while the velocity and acceleration calculation can be performed on the processor 26. In embodiments where velocity and/or acceleration sensors are used to provide raw velocity and acceleration data, the calculation of velocity and/or acceleration can be omitted. After step 90, step 91 is implemented, in which the processor 26 sends the processed sensor data to the host computer 12 via bus 24. The process then returns to step 88 to read raw data. Steps 88, 90 and 91 are thus continuously implemented to provide current sensor data to host computer system 12.
The second branch from step 86 is concerned with processor 26 controlling the actuators 30 to provide forces calculated by host computer 12 to object 34. The second branch starts with step 92, in which processor 26 checks if a low-level force command has been received from host computer 12 over bus 24. If not, the process continually checks for such a force command. When a force command has been received, step 94 is implemented, in which processor 26 outputs a low-level processor force command to the designated actuators to set the output force to the desired magnitude, direction, etc. This force command may be equivalent to the received low-level command from the host computer, or, the processor 26 can optionally convert the force command to an appropriate form usable by actuator 30 (or actuator interface 38 can perform such conversion). The process then returns to step 92 to check for another force command from the host computer 12.
FIGURE 5 is a flow diagram illustrating a second embodiment of a method 100 for controlling force feedback interface device 14 of the present invention. Method 100 is directed to a "reflex" embodiment, in which host computer system 12 provides only high-level supervisory force commands ("host commands") to microprocessor 26, while the microprocessor independently determines and provides low-level force commands (force values) to actuators 30 as an independent "reflex" to control forces output by the actuators.
The process of Figure 5 is suitable for low speed communication interfaces, such as a standard RS-232 serial interface. However, the embodiment of Figure 5 is also suitable for high speed communication interfaces such as USB, since the local microprocessor relieves computational burden from host processor 16. In addition, this embodiment can provide a straightforward command protocol, an example of which is described with respect to Figures 9 and 14, and which allows software developers to easily provide force feedback in a host application. In this embodiment, for example, the slower "interrupt data transfers" mode of USB can be used.
The proeess begins at 102. In step 104, host eo,ll~uLei system 12 and int~rf~(~e deviee 14 are powered up, for example, by a user aetivating power switehes. After step 104, the process 100 branehes into two parallel processes. One process is implem~nt~d on host eomputer system 12, and the other process is implemented on local rnieroproeessor 26.
In the host computer system process, step 106 is first implemented, in which an application program is processed. This application can be a simulation, video game, scientific program, or other program. Images can be displayed for a user on output display screen 20 and other feedback can be presented, such as audio feedback.
Two branches exit step 106 to indicate that there are two processes running simultaneously (multi-tasking, etc.) on host computer system 12. In one of the processes, step 108 is implemented, where sensor data from the user object is received by the host computer from local microprocessor 26. Similarly to step 78 of the process of Figure 4, host computer system 12 receives either raw data (e.g., position data and no velocity or acceleration data) or processed sensor data (position, velocity and/or acceleration data) from microprocessor 26. In addition, any other data received from other input devices 39 can also be received by host computer system 12 from microprocessor 26 in step 108, such as signals indicating a button on interface device 14 has been pressed by the user.
Unlike the previous embodiment of Figure 4, the host computer does not c~ ul~t~ force values from the received sensor data in step 108. Rather, host co~ u~er 12 monitors the sensor data to determine when a change in the type of force is required. This is described in greater detail below. Of course, host colllpu~er 12 also uses the sensor data as input for the host application to update the host application accordingly.
After sensor data is received in step 108, the process returns to step 106, where the host computer system 12 can update the application program in response to the user's manipulations of object 34 and any other user input received in step 108. Step 108 is then implemented again in a continual loop of receiving sets of sensor data from local processor 26. Since the host computer does not need to directly control actuators based on sensor data, the sensor data can be provided at a much lower speed. For example, since the host computer updates the host application and images on display screen 20 in response to sensor data, the sensor data need only be read at 60-70 Hz (the refresh cycle of a typical display screen) compared to the much higher rate of about 500-1000 Hz (or greater) needed to realistically provide low-level force feedback signals from sensor signals. Host computer 12 also preferably synchronizes visual, audio, and force events similarly as described above with reference to Figure 4.
The second branch from step 106 is concerned with the process of the host computer determining high-level force commands ("host commands") to provide force feedback to the user manipulating object 34. The second branch starts with step 110, in which the host computer system checks if a change in the type of force applied to user object 34 is required. The "type" of force is a force sensation or profile produced by a particular force sensation process or force value which the local microprocessor 26 can implement independently of the host computer. The host computer 12 determines whether a change in the type of force is required depending on the sensor data read by the host computer in step 108 and depending on the events of the application program updated in step 106. As explained with reference to Figure 4, the sensor data informs the host computer when forces should be applied to the object based on the object's current position, velocity, and/or acceleration. The user's manipulations of object 34 may have caused a new type of force to be required. For example, if the user is moving a virtual race car within a virtual pool of mud in a video game, a damping type of force should be applied to the object 34 as long as the race car moves within the mud. Thus, damping forces need to be continually applied to the object, but no change in the type of force is required. When the race car moves out of the pool of mud, a new type of force (i.e., a removal of damping force in this case) is required. The events of the application program may also require a change in the type of force applied. Forces may be required on the user object depending on a combination of an application event and the sensor data read in step 108. Also, other input, such as a user activating buttons or other input devices 39 on interface device 14, can change the type of forces required on object 34.
If no change in the type of force is ~;ullellLly required in step 110, then the process returns S to step 106 to update the host application and return to step 110 to again check until such a change the type of force is required. When such a change is required, step 112 is implementPfl, in which host computer 12 de,t~rmin-o,s an ~ fiate host cnmm~nrl to send to microprocessor 26. The available host comm~n(l~ for host co~ ul~l 12 may each correspond to an associated force sensation process implemented by microprocessor 26. For ex~mple, host comm~n-l~ to provide a 10 damping force, a spring force, a gravitational pull, a bumpy surface force, a virtual obstruction force, and other forces can be available to host colll~uLer 12. These host comm~n-l~ can also include a ~lesign~tion of the particular :~rtn~t~rS 30 or degrees of freedom which are to apply this desired force on object 34. The host comm~n-ls can also include other coll..llalld p~ L~r information which might vary the force produced by a particular force sensation process, such as a 15 damping constant. The host command may also preferably override the local control of the processor 26 and include low-level force values. A preferred coll..lland protocol and detailed description of a set of host comm:~n-l~ is described in greater detail below with respect to Figures 9 and 14. In next step 114, the host computer sends the host comm~n-1 to the microprocessor 26 over bus 24. The process then returns to step 106 to update the host application and to return to 20 step 110 to check if another change in force is required.
The local microprocessor 26 implements the process branching from step 104 and starting with step 116 in parallel with the host computer process described above. In step 116, the interface device 14 is activated. For example, signals can be sent between host computer 12 and interface device 14 to acknowledge that the interface device is now active and can be commanded by host computer 12. From step 116, two processes branch to indicate that there are two processes running simultaneously (multi-tasking) on local processor 26. In one process, step 118 is implemented, in which the processor 26 reads raw data from sensors 28. As described in step 88 of Figure 4, processor 26 preferably reads position data and no velocity or acceleration data from sensors 28. In alternate embodiments, sensors 28 can include velocity sensors and accelerometers for providing velocity and acceleration values of object 34. The sensor data read in step 118 can also include other input, such as from an activated button or other control of interface device 14.
In next step 120, processor 26 processes the received raw data into sensor data. As described in step 90 of Figure 4, this processing preferably includes the two steps of computing velocity and acceleration data from the filtered position data and filtering the velocity and acceleration data. Processor 26 can use its own local clock 29 to determine the timing data needed for computing velocity and acceleration. In addition, a history of previously recorded values, such as position or velocity values, can be used to calculate sensor data. In embodiments where velocity and/or acceleration sensors are used, the calculation of velocity and/or acceleration is omitted. In next step 121, the processor 26 sends the processed sensor data to host computer 12 and also stores the data for computing forces, as described in the second branch process of processor 26.
The process then returns to step 118 to read raw data. Steps 118, 120 and 121 are thus continuously implemented to provide current sensor data to processor 26 and host computer 12.
The second branch from step 116 is concerned with an "actuator process" in which processor 26 controls the actuators 30 to provide forces to object 34. The second branch starts with step 122, in which processor 26 checks if a host command has been received from host computer 12 over bus 24. If so, the process continues to step 124, where a force sensation process associated with the host command is selected. Such force sensation processes can be stored local to microprocessor 26 in, for example, memory 27 such as RAM or ROM (or EPROM, EEPROM, etc.). Thus, the microprocessor might select a damping force sensation process if the high level command indicated that the damping force from this force sensation process should be applied to object 34. The available force sensation processes are preferably similar to those described above with reference to Figure 4, and may include algorithms, stored force profiles or values, conditions, etc. In some embodiments, steps 118, 120, and 121 for reading sensor data can be incorporated in the force sensation processes for the microprocessor, so that sensor data is only read once a force sensation process has been selected. Also, the host command may in some instances simply be a low-level force command that provides a force value to be sent to an actuator 30, in which case a force sensation process need not be selected.
After a force sensation process has been selected in step 124, or if a new host command has not been received in step 122, then step 126 is implemented, in which processor 26 determines a processor low-level force command (i.e., force value). The force value is derived from the force sensation process and any other data required by the force sensation process, as well as command parameters included in relevant host commands, sensor data, and/or timing data from local clock 29. Thus, if no new high level command was received in step 122, then the microprocessor 26 determines a force command according to the same force sensation process that it was previously using in step 126. In addition, the host command can include other command parameter information needed to determine a force command. For example, the host command can indicate the direction of a force along a degree of freedom.
In step 128, processor 26 outputs the determined processor force command to actuators 30 to set the output force to the desired level. Before sending out the force command, processor 26 can optionally convert the force command to an appropriate form usable by actuator 30, or actuator interface 38 can perform such conversion. The process then returns to step 122 to check if another host command has been received from the host computer 12.
The actuator process of microprocessor 26 (steps 118, 120, 122, 124, 126, and 128) thus operates to provide forces on object 34 independently of host computer 12 according to a selected 5 force sensation process and other parameters. The force sensation process ciet~.rmin~s how the processor force coll~ll~ld is to be 'l~L~ based on the most reeent sensor data read by microprocessor 26. Since a force sensation process in~1ieAt~s how forces should be applied depending on the position and other parameters of user object 34, the processor ean issue low-level foree commAn~lc, freeing the host colll~ulel to process the host applirAtion and ~etermine only 10 when a new type of force needs to be output. This greatly improves c~ icAtion rates between host colll~ulel 12 and int-rf~re device 14. In addition, the host CU111~UI~1 12 ~l~;r~;ldbly has the ability to override local control operation of microprocessor 26 and direetly provide calculated or other force values as described above with reference to Figure 4. This override mode can also be implemented as a force sensation process.
FIGURE 6 is a schematic diagram of an example of a gimbal mechanism 140 for providing two or more rotary degrees of freedom to object 34. Gimbal mechanism 140 can be coupled to interface device 14 or be provided with sensors 28 and actuators 30 separately from the other components of interface device 14. Gimbal mechanism 140 can be supported by a grounded surface 142, which can be a surface of the housing of interface device 14, for example (schematically shown as part of member 144). Gimbal mechanism 140 is preferably a five-member linkage that includes a ground member 144, extension members 146a and 146b, and central members 148a and 148b. Ground member 144 is coupled to ground 142. The members of gimbal mechanism 140 are rotatably coupled to one another through the use of bearings or pivots, wherein extension member 146a is rotatably coupled to ground member 144 and can rotate about an axis A, central member 148a is rotatably coupled to extension member 146a and can rotate about a floating axis D, extension member 146b is rotatably coupled to ground member 144 and can rotate about axis B, central member 148b is rotatably coupled to extension member 146b and can rotate about floating axis E, and central member 148a is rotatably coupled to central member 148b at a center point P at the intersection of axes D and E. The axes D and E are "floating" in the sense that they are not fixed in one position as are axes A and B. Axes A and B are substantially mutually perpendicular. Gimbal mechanism 140 is thus formed as a five member closed chain.
Each end of one member is coupled to the end of another member. The five-member linkage is arranged such that extension member 146a, central member 148a, and central member 148b can be rotated about axis A in a first degree of freedom. The linkage is also arranged such that extension member 146b, central member 148b, and central member 148a can be rotated about axis B in a second degree of freedom.

User object 34 is a physical object that can be coupled to a linear axis member 150, or linear axis member 150 can be considered part of object 34. Linear member 150 is coupled to central member 148a and central member 148b at the point of intersection P of axes D and E and extends out of the plane defined by axes D and E. Linear axis member 150 can be rotated about axis A (and E) by rotating extension member 146a, central member 148a, and central member 148b in a first revolute degree of freedom, shown as arrow line 151. Member 150 can also be rotated about axis B (and D) by rotating extension member 146b and the two central members about axis B
in a second revolute degree of freedom, shown by arrow line 152. In alternate embodiments, linear axis member 150 can be linearly moved along floating axis C, providing a third degree of freedom as shown by arrows 153. In addition, linear axis member 150 in some embodiments can be rotated about axis C, as indicated by arrow 155, to provide an additional degree of freedom. These additional degrees of freedom can also be provided with sensors and actuators.
Sensors 28 and actuators 30 can be coupled to gimbal mechanism 140 at the link points between members of the apparatus and provide input to and output as described above. Sensors and actuators can be coupled to extension members 146a and 146b, for example. User object 34 is coupled to mechanism 140. User object 34 may be moved in both (or all three or four) degrees of freedom provided by gimbal mechanism 140 and linear axis member 150. As object 34 is moved about axis A, floating axis D varies its position, and as object 34 is moved about axis B, floating axis E varies its position.
FIGURE 7 is a perspective view of a specific embodiment of an apparatus 160 including gimbal mechanism 140 and other components of interface device 14 for providing mechanical input and output to host computer system 12. Apparatus 160 includes gimbal mechanism 140, sensors 141 and actuators 143. User object 34 is shown in this embodiment as a joystick having a grip portion 162 and is coupled to central member 148a. Apparatus 160 operates in substantially the same fashion as gimbal mechanism 140 described with reference to Figure 6.
Gimbal mechanism 140 provides support for apparatus 160 on grounded surface 142, such as a table top or similar surface. Gimbal mechanism 140 includes ground member 144, capstan drive mechanisms 164, extension members 146a and 146b, central drive member 148a, and central link member 148b. Ground member 144 includes a base member 166 and vertical support members 168 which are coupled to grounded surface 142. A vertical support member 168 is coupled to each outer surface of base member 166 such that vertical members 168 are in substantially 90-degree relation with each other. A capstan drive mechanism 164 is preferably coupled to each vertical member 168. Capstan drive mechanisms 164 are included in gimbal mechanism 140 to provide mechanical advantage without introducing friction and backlash to the system. The drums 170 are rotated using a cable coupled to a pulley, which is driven by an actuator 143.

Extension member 146a is rigidly coupled to a capstan drum 170 and is rotated about axis A as capstan drum 170 is rotated. Likewise, extension member 146b is rigidly coupled to the other capstan drum 170 and can be rotated about axis B. Central drive member 148a is rotatably coupled to extension member 146a, and central link member 148b is rotatably coupled to an end of extension member 146b. Central drive member 148a and central link member 148b are rotatably coupled to each other at the center of rotation of the gimbal mechanism, which is the point of intersection P of axes A and B. Bearing 172 connects the two central members 148a and 148b together at the intersection point P. Gimbal mechanism 140 provides two degrees of freedom to an object 34 positioned at or near the center point P of rotation such that the object at or near point P can be rotated about axes A and B or have a combination of rotational movement about these axes. Also, object 34 can be rotated or translated in other degrees of freedom, such as a linear degree of freedom along axis C or a rotary degree of freedom about axis C.
Sensors 141 and actuators 143 are preferably coupled to gimbal mechanism 140 to provide input and output signals between apparatus 160 and microprocessor 26. In the described embodiment, sensors 141 and actuators 143 are combined in the same housing as grounded transducers 174. For example, transducers 174a and 174b are bi-directional transducers having optical encoder sensors 141 and active DC servo motors 143. Passive actuators can also be used.
The housing of each grounded transducer 174a is preferably coupled to a vertical support member 168 and preferably includes both an actuator 143 for providing force in the first revolute degree of freedom about axis A and a sensor 141 for measuring the position of object 34 in the first degree of freedom about axis A. A rotational shaft of actuator 174a is coupled to a pulley of capstan drive mechanism 164 to transmit input and output along the first degree of freedom. Grounded transducer 174b preferably corresponds to grounded transducer 174a in function and operation.
Object 34 is shown in Figure 7 as a joystick having a grip portion 162 for the user to grasp.
A user can move the joystick about axes A and B; these movements are sensed by processor 26 and host computer system 12. Forces can be applied preferably in the two degrees of freedom to simulate various haptic sensations. Optionally, other objects 34 can be coupled to gimbal mechanism 140, as described above.
FIGURE 8 is a perspective view of a different embodiment of object 34 and supporting mechanism 180 that can be used in conjunction with interface device 14. Mechanism 180 includes a slotted yoke configuration for use with joystick controllers that is well-known to those skilled in the art. Mechanism 180 includes slotted yoke 182a, slotted yoke 182b, sensors 184a and 184b, bearings 186a and 186b, actuators 188a and 188b, and joystick 34. Slotted yoke 182a is rigidly coupled to shaft 189a that extends through and is rigidly coupled to sensor 184a at one end of the yoke. Slotted yoke 182a is similarly coupled to shaft 189c and bearing 186a at the other end of the yoke. Slotted yoke 182a is rotatable about axis L and this movement is detected by sensor 184a.

CA 02239l2~ l998-0~-29 In slltrrn~P. embo~limrntc, bearing 186a and be implemented as another sensor like sensor 184a.
Similarly, slotted yoke 182b is rigidly coupled to shaft 189b and sensor 184b at one end and shaft 189d and bearing 186b at the other end. Yoke 182b can rotate about axis M and this movement can be detected by sensor 184b.
Object 34 is a joystick that is pivotally attached to ground surface 190 at one end 192 so that the other end 194 typically can move in four 90-degree directions above surface 190 in two degrees of freedom (and additional directions in other embodiments). Joystick 34 extends through slots 196 and 198 in yokes 182a and 182b, respectively. Thus, as joystick 34 is moved in any direction, yokes 182a and 182b follow the joystick and rotate about axes L and M. Sensors 184a-d detect this rotation and can thus track the motion of joystick 34. Actuators 188a and 188b allow the user to experience force feedback when handling joystick 34. Alternatively, other types of objects 34 can be used in place of or coupled to the joystick, and/or additional degrees of freedom can be provided to joystick 34. For example, the joystick can be provided with a rotary degree of freedom about axis K, as indicated by arrow 193. Sensors and/or actuators can also be included for such additional degrees of freedom.
Other embodiments of mechanical interface apparatuses and transducers can also be used in interface device 14 to provide mechanical input/output for user object 34. For example, mechanical apparatuses which provide one or more linear degrees of freedom to user object 34 can be used. In addition, passive actuators having an amount of "play" can be provided.
FIGURE 9 is a table 300 showing a number of preferred host commands that can be used in the embodiment of Figure 5, where host computer 12 sends high level host commands to local microprocessor 26, which implements local force sensation processes in accordance with the host commands. As discussed previously, low communication rates on bus 24 (Figure 1) can impede performance, specifically the accuracy and realism, of force feedback. The local microprocessor can implement force sensation processes based on host commands independently of the host computer, thus requiring fewer signals to be communicated over bus 24. Preferably, a communication language or force feedback protocol should be standardized for the transfer of host commands from the host processor 16 to the local processor 26 to permit the efficient transmission of high level supervisory commands (host commands) to local processor 26.
A preferred embodiment contains two primary modes or "control paradigms" of operation for force feedback interface device 14: rate control and position control. These modes imply a classification scheme for host commands parameterized by the command parameters. While the difference between rate control and position control is generally subtle to the user while he or she interacts with an application, the difference may be profound when representing force feedback information. Some of the commands can be used as either rate control or position control commands. Other force feedback commands may be constructed in addition to, or as alternatives to, the following sample force feedback commands.

Rate control refers to a user object mapping in which the displacement of the user object 34 along one or more provided degrees of freedom is abstractly mapped to motion of a computer-simulated entity under control, such as an airplane, race car, or other simulated "player" or player-controlled graphical object. Rate control is an abstraction which makes force feedback less intuitive because there is not a direct physical mapping between object motion and commanded motion of the simulated computer entity. In contrast, position control refers to a user object mapping in which displacement of the joystick handle or other user manipulable object directly dictates displacement of a simulated computer entity, so that the fundamental relation between joystick displacements and computer displacements is present. Thus, most rate control paradigms are fundamentally different from position control in that the user object (joystick) can be held steady at a given position but the simulated entity under control is in motion at a given commanded velocity, while the position control paradigm only allows the entity under control to be in motion if the user object is in motion. Position control host commands are described in greater detail below with respect to Figure 14, while rate control commands are described presently with reference to Figure 9.
For example, a common form of rate control is a velocity derived abstraction in which displacement of the user object, such as a joystick handle, dictates a velocity of the simulated computer entity, such as a vehicle or other graphical object displayed on display screen 20, in a simulated environment. The greater the joystick handle is moved from the original position, the greater the velocity of the controlled vehicle or player-controlled graphical object. Other common rate control paradigms used in computer games are acceleration controlled. Rate control force feedback commands roughly correspond to forces which would be exerted on a vehicle or other simulated entity controlled by the simulated environment through the force feedback interface device 14. Such forces are termed vehicle-centric forces.
Herein, rate control commands are divided into "conditions" and "overlays," although other classifications may be used in alternate embodiments. Conditions set up a basic physical model or background sensations about the user object including simulated stiffness, simulated damping, simulated inertias, deadbands where simulated forces diminish, and directional constraints dictating the physical model's functionality. Multiple conditions may be specified in a single command to effectively superpose condition forces. Overlays, in contrast, are forces that may be applied in addition to the conditions in the background. Any number of overlays can preferably be provided in addition to condition forces. A condition can be specified by one condition command or by multiple condition commands.

Descriptions will now be provided for several types of forces 302, as referenced in table 300, that can be implemented by microprocessor 26 from host commands. These forces include: restoring force, restoring spring, vector force, vibration, sluggish stick, wobble, unstable, button reflex jolt, and ratchet force. The restoring force, restoring spring, sluggish stick, and unstable forces are considered condition forces. The vector force, vibration, wobble, and button reflex jolt forces are considered overlay forces.
The forces 302 shown in table 300 can be implemented with host commands provided by host computer 12 to microprocessor 26. Examples 304 of host commands and their syntax are shown in table 300 for each type of force 302. In the described embodiment, host commands 304 preferably include a command portion 306 and a number of command parameters 308. Commands 304 indicate the type of force which the host computer 12 is instructing the processor 26 to implement. This command portion may have a corresponding force sensation process which the processor 26 can retrieve from memory 27 and implement. Command parameters 308 are values or indicators provided by the host computer 12 which customize and/or modify the type of force indicated by command portion 306. For the following preferred rate control embodiments, most of the command parameters control different forces in the same way. The magnitude parameter is a percentage of a maximum magnitude corresponding to a maximum force able to be output by actuators 30. The duration parameter usually corresponds to a time interval for applying the particular force model, or can be indefinitely applied. The style parameter may select a direction in which to apply the force model, and/or a degree of freedom along which to apply the force model.
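By way of a purely illustrative sketch, such a host command might be held on the local microprocessor as a small structure carrying the command portion and the generic parameters described above; the C type names, fields, and units below are assumptions for illustration only and not the described embodiment's literal protocol:

    /* Hypothetical encoding of a host command: a command portion that
       selects a force sensation process, plus generic parameters. */
    typedef enum { CMD_RESTORING, CMD_SPRING, CMD_VIBRATION, CMD_JOLT } cmd_type_t;

    typedef struct {
        cmd_type_t type;      /* command portion (selects the force process) */
        float magnitude_pct;  /* percent of maximum force output by actuators 30 */
        long  duration_ms;    /* time interval; a negative value = indefinite */
        unsigned style;       /* direction and/or degree-of-freedom flags */
    } host_command_t;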
Although not listed in Figure 9, all of the described types of forces 302 can have additional parameters or incorporate other properties into the listed parameters. A "deadband" parameter could specify a size of a region where a force would be small or zero. A parameter can be included indicating whether a force is bi-directional or uni-directional along a degree of freedom. Subclass 310 indicates a classification of the types of forces 302 as either conditions or overlays, as explained above.
FIGURES 10a-c are graphs illustrating force versus displacement profiles for a restoring force. The force in graph 312 of Figure 10a is bi-directional, where the force on the right side of the vertical axis is applied in one direction along a degree of freedom, and the force on the left side of the vertical axis is applied in the opposite direction along that degree of freedom. The force shown in graph 314 of Figure 10b is uni-directional. Preferably, whether the force is uni-directional or bi-directional is specified with, for example, the style parameter 308 of the command 306 shown in table 300 of Figure 9. In addition, the desired degrees of freedom along which the restoring force is to be applied are also preferably specified in the style parameter.
A restoring force applied to user object 34 always points back towards an origin position O (or "neutral position") of the user object along a degree of freedom. For example, the origin position for a joystick can be the joystick's center position, as shown in Figures 7 and 8. The magnitude of restoring force, specified by the magnitude command parameter, generally remains constant in either direction for the range 316 along the degree of freedom of the user object. The maximum force magnitude F is preferably limited to about 75% of the maximum possible output force in the selected degree of freedom, so that jolts and vibrations can be overlaid on top of the restoring sensation (described below). As the object is moved toward the origin position O, the applied force is constant until the user object is moved within a localized region R about the origin position. When the user object is in the localized region R, the applied force rapidly drops to zero or a small value. Thus, the restoring force profile provides a constant "restoring sensation" that forces the user object back to the origin position when the object is in range 316. This restoring force then diminishes or vanishes as the object nears and reaches the origin position.
In Figure 10c, the restoring force is shown similarly to the force in Figure 10a, except that the applied force is about zero in an extended region 318 about the origin position. Region 318 is known as a "deadband", and allows the user to have some freedom to move object 34 for a short distance around the origin before forces are applied. A restoring force sensation can be suitably applied in a rate control paradigm to the situation of hitting a wall or some other obstruction while controlling a simulated vehicle.
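A minimal sketch of how such a restoring force profile might be computed each servo tick, under a hypothetical signature, with the localized region R treated as the deadband and the 75% cap applied as described above:

    #include <math.h>

    /* Restoring force: constant magnitude pointing back toward origin O,
       dropping to zero within region R (the deadband). x is displacement
       from O along one degree of freedom; f_max is the peak actuator force. */
    float restoring_force(float x, float magnitude_pct, float region_r, float f_max)
    {
        float f = (magnitude_pct / 100.0f) * 0.75f * f_max; /* headroom for overlaid jolts */
        if (fabsf(x) <= region_r)
            return 0.0f;                /* near the origin: force drops out */
        return (x > 0.0f) ? -f : f;     /* constant force toward the origin */
    }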
FIGURES 11a-11c are graphs illustrating force versus displacement profiles for a restoring spring force. Rather than maintaining a constant magnitude over much of its positive or negative displacement, as provided by the restoring force of Figures 10a-10c, a restoring spring force varies linearly over an appreciable portion of the user object's displacement, and is proportional to the object 34's distance from the origin position O. A restoring spring force applied to the user object always points back towards the neutral position along a degree of freedom. Graph 320 of Figure 11a shows the bi-directional case, and graph 322 of Figure 11b shows the uni-directional case. A deadband specified by a deadband parameter is provided about the origin position, as shown in graph 324 of Figure 11c. The restoring spring force can have a spring coefficient parameter to describe a desired "stiffness" of the object 34.
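The restoring spring profile could be sketched in the same hypothetical style, with the force proportional to displacement outside the deadband and the spring coefficient parameter supplying the stiffness:

    /* Restoring spring: force proportional to distance from origin O,
       zero inside the deadband. k comes from the spring coefficient
       command parameter. Illustrative only. */
    float restoring_spring_force(float x, float k, float deadband)
    {
        if (x > deadband)  return -k * (x - deadband);
        if (x < -deadband) return -k * (x + deadband);
        return 0.0f;
    }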
The sluggish force creates a damping force on user object 34 having a magnitude proportional to the velocity of the user object when moved by the user. An example of this type of damping force is described above with respect to step 82 of Figure 4. The degree of "viscosity" of the sluggish force can be specified by a viscous damping coefficient, which can be expressed as a percentage of a maximum damping coefficient. The sluggish stick force is particularly suited for rate control applications to simulate controlling, for example, a very heavy vehicle that is poorly responsive to the movement of the user object. The unstable force creates an inverted pendulum style instability. Alternatively, the unstable force is modeled on a spring having a negative spring constant (an unstable or diverging spring). A force is applied to the user object in a direction away from the object's origin position and is increased as the user object is moved further away from the origin position. This creates a force that makes it difficult for the user to bring the object to the origin position. This force can be used as another vehicle-related sensation, and could replace a restoring spring force when, for example, a simulated vehicle guidance control is damaged.
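Sketches of these two conditions, under the same illustrative assumptions (v is the sensed user object velocity; b_max is an assumed maximum damping coefficient):

    /* Sluggish stick: damping force opposing the object's velocity v. */
    float sluggish_force(float v, float damping_pct, float b_max)
    {
        return -(damping_pct / 100.0f) * b_max * v;
    }

    /* Unstable: a diverging spring (negative spring constant), pushing
       the object farther away the more it is displaced from the origin. */
    float unstable_force(float x, float k)
    {
        return k * x;   /* positive sign: force grows away from the origin */
    }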
In alternative embodiments, the condition forces described above can be commanded using only one generic host command with a number of parameters to control the characteristics of the condition forces.
The condition commands can be provided in the background while overlay commands are applied in addition to or "over" the condition forces. For example, a sluggish damping force can be provided as a background force to the user object, and a "jolt" overlay force can be commanded over the sluggish force to provide a quick, jerky motion on the user object for a few seconds. Of course, overlay forces may also be applied exclusively when no other forces are being applied, or may cancel other previously-commanded forces if desired. The example overlay forces shown in Figure 9 are described below.
FIGURE 12 is a graph 326 illustrating a vector force model. A vector force is an overlay command, and thus can be applied in addition to the condition forces described above. It is a general force applied to the joystick in a given direction specified by a direction command parameter. Figure 12 shows a two-dimensional representation of the vector force in an example direction in the X-Y plane of a user object having two degrees of freedom.
FIGURES 13a-13b are graphs illustrating force versus time profiles for a vibration force.
Figure 13a is a graph 328 showing a bi-directional vibration force while Figure 13b is a graph 330 showing a uni-directional vibration force. The vibration command shown in Figure 9 accepts magnitude, frequency, style, direction, and duration command parameters. The frequency parameter can be implemented as a percentage of a maximum frequency and is inversely proportional to a time interval of one period, Tp. The direction command parameter can be specified as an angle or degree of freedom. The style parameter can indicate whether the vibration force is uni-directional or bi-directional. In addition, a duty cycle parameter can be provided in alternate embodiments indicating the percentage of a time period that the vibration force is applied.
Also, a command parameter can be included to designate the "shape" or profile of the vibration waveform in the time axis, where one of a predetermined number of shapes can be selected. For example, the force might be specified as a sinusoidal force, a sawtooth-shaped force, a square waveform force, etc.
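A vibration overlay of this kind might be generated as a function of time as in the sketch below; the mapping of the frequency parameter to hertz, the sinusoid-only shape, and the signature are all assumptions for illustration:

    #include <math.h>

    /* Vibration overlay: time-varying periodic force. t is elapsed time in
       seconds; shape selection is reduced here to a sinusoid for brevity. */
    float vibration_force(float t, float magnitude, float freq_hz, int bidirectional)
    {
        float s = sinf(2.0f * 3.14159265f * freq_hz * t);
        if (!bidirectional)
            s = 0.5f * (s + 1.0f);      /* uni-directional: one-sided waveform */
        return magnitude * s;
    }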
A wobble force paradigm is another overlay force that can be commanded by host computer 12. This force creates a random (or seemingly random to the user), off-balance force sensation on the user object. For example, it can simulate an erratic control for a damaged vehicle. A style parameter might also specify a type of wobble force from a predetermined list of different types.
The jolt force is typically a short, high magnitude force that is output on the user object, and can be used, for example, to notify the user of an event or simulated object in the computer environment. The jolt force can be used as an overlay force which can be felt in addition to any condition forces in effect. Typical parameters include the magnitude of the force of the jolt, the duration of the jolt, and direction(s) or degree(s) of freedom in which the jolt is applied, which can be specified as an angle or particular degrees of freedom.
The button force is not an actual force but may be used as a command to trigger other forces when an input device 39 is activated by the user. In many game situations, for example, it may be advantageous to trigger a force as a direct response to pressing a button or other input device 39 on the interface apparatus 14 rather than generating the force from a host command after processing the pressed button on the host computer 12.
For example, a common force to use in conjunction with a button command is the jolt force. A specific command, e.g., BUTTON_JOLT, can be provided to cause a jolt force whenever a specified button is pressed, and which includes button and jolt command parameters.
When the button jolt command is received by microprocessor 26, the microprocessor can run a button check as a background process until commanded to terminate the button background process. Thus, when the microprocessor 26 determines that the user has pressed a button from the sensor data, the jolt force can be overlaid on any existing forces that are output.
The button command sets up the microprocessor 26 to output a force when the other input device 39 has been activated. The button command may accept a number of command parameters including, for example, button and autofire frequency parameters (in addition to any command parameters specific to the desired force to be output when the button is pressed). The button parameter selects the particular button(s) which the microprocessor 26 will check to be activated by the user and which will provide the desired forces. For example, a joystick may have multiple buttons, and the software developer may want to provide a force only when a particular one of those buttons is pressed. A duration parameter can determine how long the jolt lasts after the button is pressed. The "autofire" frequency parameter designates the frequency of a repeating force when the user holds down a button. For example, if the user holds down a particular button, the microprocessor can automatically repeat a jolt force after a predetermined time interval has passed after the user first pressed the button. The autofire parameter can also optionally designate whether the autofire feature is being used for a particular button and the desired time interval before the repeating forces are applied.
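The button reflex described above might run on the local microprocessor as a small background check executed every servo tick, roughly as sketched here; the structure, field names, and timing scheme are all illustrative assumptions:

    /* Background button-jolt check, called once per tick. Returns nonzero
       when a jolt overlay should be (re)triggered. Initialize held_ms to -1. */
    typedef struct {
        unsigned button_mask;   /* which button(s) trigger the jolt */
        long autofire_ms;       /* repeat interval while held; 0 = no autofire */
        long held_ms;           /* time the button has been held; -1 = released */
    } button_jolt_t;

    int button_jolt_check(button_jolt_t *bj, unsigned buttons, long tick_ms)
    {
        if (!(buttons & bj->button_mask)) {
            bj->held_ms = -1;            /* released: re-arm the trigger */
            return 0;
        }
        bj->held_ms = (bj->held_ms < 0) ? 0 : bj->held_ms + tick_ms;
        if (bj->held_ms == 0)
            return 1;                    /* first press: fire the jolt */
        return bj->autofire_ms > 0 && (bj->held_ms % bj->autofire_ms) < tick_ms;
    }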

FIGURE 14 is a table 332 showing a number of preferred position control host commands that can be used in the embodiment of Figure 5. Herein, "position control" refers to a mapping of a user object in which displacement of the joystick handle or other user object directly dictates displacement of a computer-simulated entity or object. The mapping can have an arbitrary scale factor or even be non-linear, but the fundamental relation between user object displacements and computer object or entity displacements should be present. Under a position control mapping, the computer-controlled entity does not move unless the user object is in motion; a static user object dictates static commands to microprocessor 26 from host computer 12.
Position control is not a popular mapping for traditional computer games, but is effectively used in other applications such as the graphical user interface (GUI) embodiments disclosed herein. Position control is an intuitive and effective metaphor for force feedback interactions because it is a direct physical mapping rather than an abstract control paradigm. In other words, because the user object experiences the same physical manipulations as the entity being controlled within the computer, position control allows physical computer simulations to be directly reflected as realistic force feedback sensations. Examples of position control in computer environments might be controlling a paddle in a pong-style tennis game or controlling a cursor in a GUI or windows desktop environment. Contrasted with rate control's vehicle-centric forces, position control force feedback roughly corresponds to forces which would be perceived directly by the user. These are "user-centric" forces.
Descriptions will now be provided for several types of position control forces 334, as referenced in table 332, that can be implemented by microprocessor 26 from host commands.
These forces include: vector, groove, divot, texture, barrier, field, paddle, and button reflex jolt.
Many of the examples 336 of host commands corresponding to these forces use magnitude and style parameters as discussed with reference to the rate control paradigms. As with the rate control commands, command parameters of the same name generally have the same properties for different host commands. However, the duration parameter is typically not used for position control commands as much as for rate control commands, since position control forces are typically applied depending on the current position of the user object. The position control force models thus generally remain in effect until the host computer 12 issues a new host force command or a clear command. In alternate embodiments, a duration parameter can be used.
Preferred parameterizations for the described position control commands are summarized in Figure 14. All the forces listed below can include additional command parameters, such as deadband parameters, or incorporate other properties into the parameters listed in Figure 14.
Similar to the host commands shown in Figure 9, host commands 336 preferably include a command portion 338 and a number of command parameters 340. Commands 336 indicate the type of force which the host computer 12 is instructing the processor 26 to implement. This command portion may have a corresponding force sensation process which the processor 26 can retrieve from memory 27 and implement. Command portions 338 can be specified in virtually any form.

A vector force is a general force having a magnitude and direction. Refer to Figure 12 for a polar representation of the vector force. Most position control sensations will be generated by the programmer/developer using a vector force command and appropriate instructions and programming constructs. A duration parameter is typically not needed since the host 12 or microprocessor 26 can regulate or modify the force based on user object motions, not time.
FIGURE 15 is a graph 342 showing a force versus displacement relationship for a groove force of the present invention. The groove force provides a linear detent sensation along a given degree of freedom, shown by ramps 344. The user object feels like it is captured in a "groove" where there is a restoring force along the degree of freedom to keep the stick in the groove. This restoring force groove is centered about a center groove position C located at the current location of the user object when the host command was received. Alternatively, the location of the center groove position can be specified from a command parameter along one or more degrees of freedom. Thus, if the user attempts to move the user object out of the groove, a resisting force is applied.
The magnitude (stiffness) parameter specifies the amount of force or resistance applied.
Optionally, a "snap-out" feature can be implemented within the groove force sensation process where the groove forces turn off when the user object deviates from the groove by a given snap-out distance, shown as distance S. Thus, the microprocessor 26 would receive a groove command having a snap distance magnitude. When the microprocessor detects the user object moving outside this snap distance, it turns off the groove forces. This snap-out feature can be implemented equally well by the host computer 12 sending a clear command to turn off forces. Also, a deadband DB can also be provided to allow the user object to move freely near the center groove position C, specified with a deadband command parameter. A style command parameter indicates the orientation of the groove along one or more degrees of freedom (e.g., horizontal, vertical, diagonal). For example, horizontal and vertical grooves can be useful to provide forces for scroll bars in windows. A user moving a cursor in a graphical user interface can feel groove forces moving the cursor and user object toward the middle of the scroll bar. The deadband gives the user room to move the cursor within the scroll bar region. The snap-out distance can be used to free the cursor/user object from forces once the cursor is moved out of the scroll bar region.
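A sketch of the groove condition, under assumed names; here d is the displacement from the center groove position C, and a real sensation process would latch the groove off once the snap-out distance is exceeded rather than re-enabling it when the object returns:

    #include <math.h>

    /* Groove: restoring force toward center C, with deadband DB and
       snap-out distance S, per the description above. Illustrative only. */
    float groove_force(float d, float stiffness, float deadband, float snap_out)
    {
        if (fabsf(d) >= snap_out) return 0.0f;  /* beyond S: groove turned off */
        if (fabsf(d) <= deadband) return 0.0f;  /* inside DB: free movement */
        return (d > 0.0f) ? -stiffness * (d - deadband)
                          : -stiffness * (d + deadband);
    }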
A divot is essentially two (or more) orthogonal grooves that provide restoring forces in more than one degree of freedom. This provides the sensation of a point detent along a given degree of freedom. If the divot is provided in two degrees of freedom, for example, then the user object feels as if it has been captured in a circular depression. The user object is captured at a point where there is a restoring force along both axes to keep the user object at the point. The snap-out feature of the groove force can also be implemented for the divot. In addition, the deadband feature of the groove can be provided for the divot command.

A texture force simulates a surface property, as described above with reference to Figure 4.
A texture is a spatially varying force (as opposed to vibration, a time varying force) that simulates the force felt, for example, when a stick is moved over a grating. Other types of textures can also be simulated. The user object has to be moved to feel the texture forces, i.e., each "bump" of the grating has a specific position in the degree of freedom. The texture force has several characteristics that can be specified by a programmer/developer using the host command and command parameters. These command parameters preferably include a magnitude, a grit, and a style. The magnitude specifies the amount of force applied to the user object at each "bump" of the grating. The grit is basically the spacing between each of the grating bumps. The style command parameter can specify an orientation of the texture. For example, the style can specify a horizontal grating, a vertical grating, or a diagonal grating (or a superposition of these gratings).
Furthermore, the style parameter can specify if the texture is felt bi-directionally or uni-directionally along a degree of freedom. Alternatively, additional command parameters can be provided to control the position of the "bumps" of the texture force. For example, information can be included to instruct the distance between bumps to vary exponentially over a distance, or vary according to a specified formula. Alternatively, the texture spacing could vary randomly. In yet other embodiments, the command parameters can specify one of several available standard texture patterns that microprocessor 26 can retrieve from memory.
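One plausible sketch of a spatially varying texture, with evenly spaced bumps set by the grit parameter; the sawtooth bump profile is an assumption, since the description above leaves the exact profile open:

    #include <math.h>

    /* Texture: force varies with position x, one "bump" per grit spacing. */
    float texture_force(float x, float magnitude, float grit)
    {
        float phase = fmodf(fabsf(x), grit) / grit;   /* 0..1 across each bump */
        return magnitude * (2.0f * phase - 1.0f);     /* sawtooth grating */
    }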
A barrier force, when commanded, simulates a wall or other obstruction placed at a location in user object space, and is described above with reference to Figure 4. The host command can specify the hardness of the barrier (magnitude of the force applied), the location of the barrier along the degree of freedom, and the snap distance or thickness of the barrier. A barrier can also be provided with a compliance or springiness using a spring constant. Horizontal barriers and vertical barriers can be provided as separate host commands, if desired. As indicated in graph 346 of FIGURE 16, a barrier force only has a finite thickness. The force increases steeply as the user object is moved closer into the barrier (past point B). The snap-through distance defines the size of the region where the barrier is felt by the user. If the user object is moved into a barrier, and then is moved past the thickness of the barrier, the barrier force is turned off. The barrier force can act as a hard obstruction, where the microprocessor provides maximum force magnitude to the user object 34, or as a bump or softer barrier, where a smaller force magnitude is applied (as specified by the magnitude command parameter). The barrier can remain for an extended period unless removed or moved to a new location. Multiple barriers can also be provided in succession along a degree of freedom.
Alternatively, the barrier force can be provided by sending a host command having only two command parameters, hardness and location. The hardness parameter can specify the height and slope of the resistive force. As shown in graph 348 of Figure 16, the user object can move from left to right along the distance axis. The user object feels a resistive force when hitting the barrier at point B. After the user object has been moved to point S (the snap-distance), the force is applied to the user object in the opposite direction (a negative magnitude force), which decreases as the user object is moved in the same direction. This simulates a bump or hill, where the force is resistive until the user object is moved to the top of the bump, where the force becomes an assisting force as the object is moved down the other side of the bump.
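A sketch of this two-parameter "bump" barrier of graph 348, with an explicit snap distance marking the crest; the linear slopes and the names are assumptions for illustration:

    /* Barrier as a "bump": resistive force past point B, assisting force
       past the snap distance S, fading out down the far side of the bump. */
    float barrier_force(float x, float location, float hardness, float snap_dist)
    {
        float pen = x - location;           /* penetration past point B */
        if (pen <= 0.0f)
            return 0.0f;                    /* not touching the barrier */
        if (pen <= snap_dist)
            return -hardness * pen;         /* resistive, climbing the bump */
        {
            float f = hardness * (2.0f * snap_dist - pen);
            return (f > 0.0f) ? f : 0.0f;   /* assisting, then decaying away */
        }
    }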
A force field type force attracts or repulses the user object with respect to a specific position. This force can be defined by command parameters such as a field magnitude and the specific field origin position which the force field is applied with respect to. A sense parameter can be included to indicate an attractive field or a repulsive field. For example, the force field can be an attractive field to simulate a force of gravity between the field origin position and a user-controlled cursor or graphical object. Although the field origin position can be thought of as a gravitational mass or an electric charge, the attractive force need not depend on the inverse square of displacement from the specific position; for example, the force can depend on an inverse of the displacement. The attractive force field also attempts to maintain the user object at the field origin position once the user object has been moved to that position. A repulsive field operates similarly except it forces the user object away from a specified field origin position. In addition, ranges can be specified as additional command parameters to limit the effect of a force field to a particular distance range about the field origin position.
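A one-dimensional force field sketch with the inverse-displacement falloff mentioned above, a sense parameter, and a range limit; the signature and thresholds are assumed:

    #include <math.h>

    /* Force field: attract (sense = +1) or repel (sense = -1) the user
       object with respect to a field origin, limited to a distance range. */
    float field_force(float x, float origin, float magnitude, int sense, float range)
    {
        float d = x - origin;
        if (fabsf(d) > range)  return 0.0f;   /* outside the field's range */
        if (fabsf(d) < 0.001f) return 0.0f;   /* at the origin: rest there */
        float f = magnitude / fabsf(d);       /* inverse of displacement */
        return (d > 0.0f ? -f : f) * (float)sense;
    }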
FIGURES 17a-17i are diagrammatic illustrations of a "paddle" computer object 350 interacting with a "ball" computer object or similar object 352. These computer objects can be displayed on display screen 20 by host computer 12. The force interactions between the ball and paddle can be controlled by a software developer using a host command, as explained below. In the described example, paddle object 350 is controlled by a player by a position control paradigm such that the movement of paddle object 350 is directly mapped to movement of user object 34. In alternate embodiments, ball object 352 or both objects can be controlled by players.
Figures 17a-17h show how paddle object 350 interacts with a moving ball object 352 as ball object 352 collides with the paddle object. In Figure 17a, ball 352 first impacts paddle 350.
Preferably, an initial force is applied to user object 34 in the appropriate direction. In Figures 17b and 17c, ball 352 is moving into the compliant paddle or "sling". Preferably, a force based on a simulated mass of ball 352 is felt by the user through user object 34 which is appropriate to the simulated velocity of the ball (and/or the paddle), the simulated compliance of the paddle (and/or the ball), and the strength and direction of simulated gravity. These factors (and other desired physical factors) can preferably be set using a host command with the appropriate parameters. For example, the following host command can be used:
PADDLE (B_mass, B_vel_x, B_vel_y, Gravity, Sense, Compliance_X, Compliance_Y, style)

where the command parameter B_mass indicates the simulated mass of the ball, B_vel_x and B_vel_y are the velocity of the ball, gravity is the strength of gravity, sense is the direction of gravity, and Compliance_X and Compliance_Y are the simulated compliance or stiffness of the paddle object 350. Other parameters can also be included to control other physical aspects of the computer environment and interaction of objects. For example, a simulated mass of the paddle can also be specified. Also, the ball 352 can be displayed as a compressed object when it impacts paddle 350, with, for example, a reduced height and an oval shape. In addition, damping parameters in the x and y axes can also be included in the paddle command to add a damping force to the collision between the ball and paddle in addition to the compliance (spring) force. Also, the parameters such as the compliance and/or damping of the paddle might be allowed to be adjusted by the user with other input 39 or a third degree of freedom of user object 34. The style parameter of the paddle command might select one of several different predetermined paddle configurations that are available and stored in, for example, memory 27. The configurations can have different paddle lengths, widths, compliances, or other displayed and/or force characteristics of a paddle.
In Figure 17d, the ball has reached a maximum flexibility point of paddle 350 and can no longer move in the same direction. As shown in Figures 17e through 17g, the ball is forced in the opposite direction due to the compliance of the paddle. In addition, the user may preferably exert force on user object 34 to direct the ball in a certain direction and to add more velocity to the ball's movement. This allows the user a fine degree of control and allows a significant application of skill in directing the ball in a desired direction. The force feedback paddle is thus an improved component of "pong" type and other similar video games. In addition, the paddle 350 can optionally flex in the opposite direction as shown in Figure 17h. An interface apparatus providing two linear (X and Y) degrees of freedom to user object 34 as well as a rotating ("spin") third degree of freedom about the Z axis (or C axis in Figure 6) is quite suitable for the paddle-ball implementation.
A schematic model of the forces interacting between ball 352 and paddle 350 is shown in Figure 17i. A spring force indicated by spring constant K is provided in both degrees of freedom X and Y to indicate the springiness of the paddle 350; g is a gravity direction. In addition, a damping force indicated by damping constant B is also provided to slow the ball 352 down once it contacts paddle 350. The spring and damping forces can also be applied in one degree of freedom.

The paddle control algorithm is a dynamic algorithm in which microprocessor 26 computes interaction forces while a ball compresses the paddle and then releases from the paddle. The paddle command is sent by host computer 12 when the ball contacts the paddle. The paddle command reports ball location to the host computer so that the host can update graphics displayed on display screen 20 during the interaction period. In presently preferred embodiments, the updates only need to be provided at about 60 Hz to the host, since most displays 20 can only display at that rate. However, the forces should be computed and output at about 500 Hz or more to provide a realistic "feel" to the interaction. Thus the local microprocessor can compute the forces quickly while occasionally reporting the sensor readings of the paddle to the host at a slower rate. Other types of video game or simulation interactions can also be commanded with a high-level host command in a similar fashion. In addition, in alternative embodiments, host computer 12 can control the actuators 30 directly to implement the paddle and ball force feedback, without sending any high level host commands. Also, similar paddle-ball interactions and forces can be provided between interactions of other types of graphical objects, such as between a cursor in a GUI and another object or surface.
FIGURE 17j is a diagrammatic illustration of a 2-D implementation of displayed graphical objects on display screen 20 which can be implemented with the paddle host command described above or implemented in a GUI or other graphical environment. A playing field is displayed in which action is to take place, and two goals 368 and 370 are provided on the playing field. Two paddles 360 and 362 are displayed which are moved around the playing field. Paddles 360 and 362 are shown as vertically-aligned segments having a length and a relatively small width, but can be oriented and/or shaped quite differently in other embodiments. For example, other geometric shapes, images of tennis rackets, or images of a person holding a tennis racket can be used in place of the paddles. Paddles 360 and 362, ball 352, goals 368 and 370, and any other computer-generated objects that are included in the simulation are generically referred to herein as "graphical objects."
In one embodiment, paddle 360 can be controlled by host computer system 12, and paddle 362 can be controlled by the user by physically manipulating the user object. Ball 352 can be moved on display screen 20 according to simulated physical parameters, such as velocity, acceleration, gravity, compliance of objects, and other parameters as discussed previously. When the ball 352 collides with paddle 362, the paddle flexes, and the user feels the collision force. For example, if ball 352 is moving in direction 364, then the user feels a force in the equivalent degrees of freedom of user object 34. In some embodiments, both the paddle 362 and the ball 352 can be moved in direction 364 to simulate the paddle being pushed back by the ball. FIGURE 17k shows a similar embodiment in which a first-person perspective view (or simulated 3-D view) of the graphical objects is shown on display screen 20, where ball 352 is a sphere.
Paddles 360 and 362 are used to block the ball from moving into the defended goal and to direct the ball back at the desired goal. By moving the paddle in a combination of direction 366 and up 10 and down movement, the user can influence the movement of the baU to a fine degree, thus allowing a player's skill to influence game results to a greater degree than in previous games without force feedback. In addition, other features can be inf~]~ ed to further inflllçll~e the ball's direction and the forces felt by the user. For example, the orientation of the paddle can be changed by rotating the paddle about a center point of the paddle. This rotation might be sensed from a 15 "spin" degree of freedom of the user object about an axis C, as described above with reference to Figures 6 and 7. Force fe.e-1h~rk could thus be a~ iately applied in that spin degree of freedom. Other features can also be provided, such as allowing a ball to "stick" or be trapped to a paddle when the two objects collide and/or when a button is pressed by the user. The user could then activate or release the button, for example, to release the ball at a desired time.
In another embodiment, paddle 360 can be controlled by another user rather than host computer 12. For example, a second interface device 14 can be connected to another input/output port of host computer 12 and can be used to control paddle 360 by a second user. Each player would therefore feel the forces on their respective paddle from the ball directed by the other player.
In addition, if the two paddles 360 and 362 were brought into contact with one another, each player could feel the direct force of the other player on each player's user object. That is, a first user's force on his user object would cause his paddle 362 to move into the other paddle 360, which would cause both the first and second users to feel the collision force. If the first paddle 362 were allowed to push the other paddle 360 across the screen, then the second user would feel the first user's pushing force. The first user would feel similar forces from the second user. This creates the effect as if each player were pushing the other player directly. Such pushing or "tug of war" games between two users can take several different embodiments.
Furthermore, the second interface device 14 need not be connected to computer 12. Instead, host computer 12 can be coupled to a second host computer through a direct or network interface, as is well known to those skilled in the art. The movement of a first user object would thus be communicated from the first host computer to the second host computer, which would then command forces on the second user object; and vice-versa. The embodiment of Figure 17k is appropriate for such an embodiment, where each user can view paddle 362 as the paddle under his own control on his own display screen 20 and paddle 360 as the other player's paddle.
In addition, a clear command is preferably available to the host computer. This comrnand S can include a pa~ tel- s~eciryillg particular degrees of freedom and allows the host computer to cancel all forces in the specified degrees of freedom. This allows forces to be removed before other forces are applied if the programrner does not wish to ~upefllll~ose the forces. Also, a configuration host cnmm~n~ can be provided to initially set up the intPrf~ce device 14 to receive particu}ar c.~.. ~.. ~ication parameters and to specify which input and output will be used for a 10 particular application, e.g. the host computer can instruct local microprocessor 26 to report specific information to the host C~ )uk;l and how often to report the information. For example, host cvlll~ulel 12 can instruct microprocessor 26 to report position values from particular degrees of freedom, button states from particular buttons of int~rf:~ce device 14, and to what degree to report errors that occur to the host computer. A "request information" cnl .""~-tl can also be sent by host 15 cn".l.. lel 12 to intt~rf~re device 14 to receive inforrnation stored on the h~ r~ce device 14 at the time of m~nuf~ctllre, such as serial number, model number, style information, calibration parameters and information, resolution of sensor data, resolution of force control, range of motion along provided degrees of freedom, vendor identifi~zltinn, device class, and power management information, etc.
20 In addition, the above described forces can be superimposed. The host computer can send anewhostc~ "1..,~ 1 whileaprevioushostcnmm~n-l is still in effect. This allows forces applied to the user object to be combined from different controlling comm~nr1~. The microprocessor 26 or host colu~uLel may prevent cert~in commS--~ that have contradictory effects from being ~-~pe~ rosed (such as a restoring force and a restoring spring). For example, the latest host 25 coll~lllalld sent can override previous cnmm:-nllc if those previous commzln-ls conflict with the new comm~n~l Or, the conflicting comm~n-ls can be assigned priorities and the coll~ll~ld with the highest priority overrides the other conflicting comm~n(1~.
It should be noted that the high-level host co...,..~ and cnmm~n~l parameters described above are merely examples for implPm-onting the forces of the present invention. For example, 30 co, .lllland parameters that are described separately can be combined into single parameters having different portions. Also, the distinct comm~n(l~ shown can be combined or separated in different ways, as shown above with the example of the condition cnmm~ncl for providing mnltirl~ rate control condition forces.

CA 0223912~ 1998-0~-29 In addition to common interface devices with one or two rectangular or spherical degrees of freedom, such as standard joysticks, other int~ re devices can be provided with three or more degrees of freedom. When the third degree of freedom is about an axis along the stick itself, those skilled in the art call it "spin" or "twist." Each degree of freedom of a user object can have its own 5 ~1P~ie~tf~1 high-level host comm~n-l By independently associating high-level host comm~ntl~ to each degree of freedom, many possible combinations of position control, rate control, and other abstract mappings can be impl~m~nte~l with interfare devices. Multiple control paradigms may also be mixed in a single degree of freedom. For example, a joystick may have position control for small deviations from the origin in a degree of freedom and rate control for large deviations from 10 the origin in the same degree of freedom. Such a mixed control paradigm can be referred to as a local position/global rate control paradigm.
FIGURE 18 is a dia~ ie illustration of display screen 20 displaying a graphical user interface (GUI) 500 used for interfacing with an ~t;iaLillg system implemented by co~ uLer system 12. A preferred embodiment of the present invention implements force feedback technologies to 15 embellish a graphical user interface with physical sensations. By cu" ." ,~ ir~ting with force feedback in~rface device 14 or a similar apparatus that provides force feeflh~rk to the user, the computer 12 can present not only visual and auditory information to the user, but also physical forces. These physical forces can be carefully tle~i~n~-l to enh~nre manual pe.rollllance by, for example, re~ cing the difficulty of required "~L~thlg" tasks. Such force feeclb~rk sensations can be used to fzlrilit~te 20 interaction with co~ uLel operating systems for all users. In addition, those users who suffer from spastic hand motion and other dexterity~ bilit~ting conditions reap great reward from the addition of these force feedback sensations.
The addition of computer generated force fee-lh~ek sensations to a windows uLJel~Lillg system environment can enh~nce manual pelru,lllallce in at least two ways. First, physical forces can be used 25 to provide haptic sensory cues on user object 34 which increase a users ~el~ Lual underst~n~1in~ of the GUI spatial "landscape" pc)lLIdyed on display screen 20. For example, sensations of physical bumps or textures which are applied to user object 34 as the user moves a cursor across the screen can be used to indicate to the user that he has positioned the cursor within a given region or crossed a particular boundary.
Second, computer-gen.-r~ted forces can be used to provide physical constraints or assistive biases which actually help the user acquire and ...~ - the cursor at a given target displayed on screen 20 within GUI ~00. For example, an attractive force field can be used to physically attract user object 34, and thus the cursor controlled by user object 34, to the location associated with a given target such as an icon. Using such an attractive field, a user simply needs to move a cursor on the 35 screen close to the desired target, and the force feellb~ek int~ ee device 14 will assist the user in CA 0223912~ 1998-0~-29 WO 97/21160 PCTlIB96/~1441 moving the cursor to the target. Many other abstract force feedback sensations can be used to enhance and embellish the wide variety of GUI-based metaphors.
Herein, the manual tasks of the user to move a cursor displayed on screen 20 by physically manipulating user object 34 (also referred to as a "physical object") in order to command the cursor to 5 a desired location or displayed object, are described as l~ge~ g" activities. "Targets", as referenced herein, are defined regions in the GUI 500 to which a cursor may be moved by the user that are associated with one or more forces and which are typically associated wit_ gr~rhir~l objects of GUI
500. Such targets can be associated with, for example, graphical objects such as icons, pull-down menu items, and buttons. A target usually is defined as the exact ~iim~nsions of its associated 10 graphical object, and is ~up~ lposed and ":~tt~-h~rl" to its associated graphical object such that the target has a constant spatial position with respect to the graphical object (i.e., when the graphical object is moved, its target also moves the same rii~t~n~e and direction). Usually, "graphical objects"
are those images appearing on the display screen which the user may select with a cursor to implement an op~,ldL~,lg system function, such as displaying images, ey~ocuting an application program, or 15 pelrc,lllling another c~ function. For simplicity, the term "target" may refer to the entire graphical object with which the target is ~o~i ~e(l Thus, an icon or window itself is often referred to herein as a "target". However, more generally, a target need not follow the exact dimensions of the graphical object associated with the target. For example, a target can be defined as either the exact displayed area of an associated graphical object, or the target can be defined as only a portion of the 20 graphical object. The target can also be a different size and/or shape than its associated graphical object, and may even be positioned a ~lict~nre away from its associated ~dl~l~ical object. The entire screen or background of GUI 500 can also be considered a "target" which may provide forces on user object 34. In addition, a single graphical object can have multiple targets associated with it. For e~r~mple7 a window might have one target associated with its entire area, and a separate target 25 associated with the title bar or corner button of the window.
Upon moving the cursor to the desired target, the user typically m~int~in~ the cursor at the acquired target while providing a "c~ 1 gesture" associated with a physical action such as pressing a button, squeezing a trigger, d~ si~g a pedal, or making some other gesture to command the execution of a particular ope~ g system function associated with the graphical object/target. In 30 the preferred embodiment, the command gesture can be provided as other input 39 as shown in Figure 1. For example, the "click" (press) of a physical button positioned on a mouse or ioystick while the cursor is on an icon allows an application program that is associated with the icon to execute.
Likewise, the click of a button while the cursor is on a portion of a window allows the user to move or "drag" the window across the screen by moving the user object. The command gesture can be 35 used to modify forces or for other functions in the present invention as well. For example, a button CA 0223912~ 1998-0~-29 WO 97/Z1160 PCT/~1396/01441 -46 ~
on the user object can be ~1e~ign~tP-l to remove the forces applied in a certain region or target in GUI
500.
In other embo-lim.-nt~, the "comm~n-l gesture" can be provided by manipulating the physical object of the interface device within provided degrees of freedom and/or with graphical objects 5 displayed on the screen. For example, if a user object has a third degree of freedom, such as linear tr~n~l~tion along axis C of Figure 6, then movement in this direction can indicate a command gesture.
In other embodiments, graphical objects on the screen can provide a command gesture when manipulated by a user. For example, if a pull-down menu is displayed, a small button can be displayed on or near each menu item. The user could then move the cursor onto the apL~Iu~liate 10 button to select that menu item. Also, a side view of button could be displayed, where the user moves the cursor into the button to "press" it and provide the ~;o~ ~d gesture. A spring force on user object 34 can be associated with this pressing motion to provide the feel of a mech~nic~l button.
The discussion below will build upon a concept of GUI targets being in~ln~lerl in a particular hierarchy of levels in relation to each other. A first target that is inclllde~l or grouped within a second 15 target is considered a "child" of the second target and lower in the hierarchy than the second target.
For example, the display screen 20 may display two windows. Windows are typically considered to be at the same l~ie~ ;lly level, since windows typically are not grouped inside other windows.
However, a window that is grouped within a higher level window, such as a window inf~ d~d within a Program Manager window (see Figure 19), is considered to be at a lower level in the hierarchy than 20 the grouping window. Within each window may be several icons. The icons are children at a lower hierarchy level than the window that groups them, since they are grouped within and associated with that window. These target concepts will become clearer below. It should be noted that one target may be displayed "within" or over another target and still be at the same hierarchy as the other target.
For example, a window can be displayed within the outer perim~ ter of another window yet still not be 25 grouped within that other window, so that the windows have the same hierarchy level.
The GUI permits the user to access various opeldLillg system functions implempntto~l by an operating system running on co~ uLer system 12. For example, the Windows ol)eldLi,lg system can be running on computer system 12 to jmpl o3JeldLillg system functions. Operating system functions typically in(~ cle, but are not limited to, peripheral input/output functions (such as writing or 30 reading data to disk or another peripheral), sel.octing and running application programs and other programs that are independent of the opeldLillg system, selecting or m~n~ing programs and data in memory, viewing/display functions (such as scrolling a document in a window, displaying and/or moving a cursor or icon across the screen, displaying or moving a window, displaying menu titles and selections, etc.), and other functions impl~m~nt~.-l by computer system 12. For simplicity of 35 discussion, the functions of application prograrns such as word processors, spre~-lch~ets, and other applications will be subsumed into the term "operating system functions", although the functions of CA 0223912~ 1998-0~-29 WO g7/21160 PCT/IB96/01441 an application program are usually considered to be independent of the operating system. Typically, application programs make use of opel~Lillg system functions to interface with the user; for example, a word processor will implement a window function of an operating system to display a text file in a window on the display screen. An ~t~Jeldtillg system function may typically be selected by the "type"
5 of graphical object; for r.xzlmrl~, an icon may generally execute an application program, a window gener~lly displays collections of other graphical objects, a slider bar scrolls images on the screen, a menu item may perform a variety of operating system functions depending on its label, etc.
In addition, other types of intrrf~res are similar to GUI's and can be used with the present invention. For example, a user can set up a "page" on the World Wide Web which is implPm~-ntrcl by 10 a remote CollyJut~l or server. The remote computer is connrctr-l to host colll~uLt;l 12 over a network such as the Internet and the Web page can be ~3.~ces.~e-l by dirrplt;lll users through the network. The page can include graphical objects similar to the graphical objects of a GUI, such as icons, pull-down menus, etc., as well as other graphical objects, such as "links" that access a different page or portion of the World Wide Web or other network when selected. These graphical objects can have forces 15 associated with them to assist in selecting objects or functions and infc.llllillg the user of the gr~rhir~l layout on the screen. In such an embodiment, the speed of data transfer between the host computer and a network node can often be slow. Therefore, the reflex embo~limrnt as described above with reference to Figure 5 is quite suitable, since the local microprocessor 26 can impl~mrnt force sensation processes controlled by c~-mmS-n-lc received from the remote COlll~llt~,l implPmrnting the 20 Web page and/or from the host c(~ Pr 12. In yet other emboriimrnt.~, a ~im~ tf~cl three-~limf-.n~ional GUI can be implemented with the present invention, in which an isometric or perspective view of a GUI environment and its graphical objects can be displayed. .AItrrn~tively, a "first person"
view of a GUI i..l~ . r;~-~ can be implem.o.ntr~l to allow a user to select u~la~illg system functions within a ~imlll~t~.-l 3-D virtual envh..~
GUI 500 is preferably implemented on host computer 12 using program instructions. The use of program instructions to pelrollll operations on a host computer and microprocessor is well known to those skilled in the art, and can be stored on a l'co~ uL~l readable medium." Herein, such a mrt~ m includes by way of example memory such as RAM and ROM coupled to host com~u~;l 12, memory 27, mzlgnrtir disks, m~gnrtir tape, optically readable media such as CD ROMs, 30 semiconductor memory such as PCMCIA cards, etc. In each case, the m~Ail-m may take the form of a portable item such as a small disk, diskette, cassette, etc., or it may take the form of a relatively larger or immobile item such as a hard disk drive.
~ In Figure 18, the display screen 20 displays a GUI 500, which can, for example, be implr.mrntrrl by a Microsoft Windows~ operating system, a M:~rintt~h operating system, or any 35 other available opeldLillg system inc~ ul~ g a GUI. In the example shown, a program manager window 501 contains various icons 502 that are grouped by window 501, here labeled as "Main", CA 0223912~ 1998-0~-29 "Startup", and "Tools", although other or dirrGlG.IL icons may be grouped within window 501. A
menu bar 504 may be included in window 501 which permits pull-down menus to appear by selecting menu hP~tling targets 505 with a user-controlled graphical object 506, such as a cursor, that is controlled by the user via a user-rnanipulable device such as the user object 34. For example, a user 5 rnay select any of the "File", "Options", "Window", and "Help" menu hP~ ing~ 505 to display an associated pull-down menu of menu items (shown in Figure 19). Typically, a command gesture such as a button press or other input 39 (as in Figure 1) is also required to display a pull-down menu when cursor 506 is positioned at a menu hP~-ling 505. In ~ltP.rn~tP embo-limPnt.~, a pull down menu might be ~ tr~m~tir~lly displayed (without a co~ al~d gesture) when cursor 505 is positioned at the 10 associated menu hP.~tling 505. In the subsequent description, the terms "user-controlled graphical object" and "cursor" will be used interçh~ngP~hly.
The present invention provides force feedback to the user through user object 34 based on a location, a velocity, an acceleration, a history of one or more of these values, and/or other characteristics of the cursor 506 within the GUI 500 environment. Other "events" within the GUI may also provide forces, as described above with reference to Figures 4 and 5. Several preferred embodiments of forces or "force sensations" applied to the user object 34 are described below. As described above in the embodiments of Figures 4 and 5, the host computer can provide a signal to local processor 26 (or directly to actuators 30) to apply different force sensations on user object 34. These "force sensations" can be forces of a single magnitude in one direction, or they may be an interaction or sequence of forces, for example, to create the sensation of a texture, a damping force, a barrier, etc. The terms "force" and "force sensation" (i.e. "type" of force) are used interchangeably herein, where it is assumed that single forces and/or sequences/interactions of forces can be provided.
In one preferred embodiment of Figure 18, the force feedback depends upon a distance between cursor 506 and a target, such as window 501, using one of the aforementioned force models. The distance can be measured from one or more points within the window 501 or its perimeter. As depicted in Figure 18, the window 501 is considered to be the highest level target displayed in GUI 500 (in actuality, the entire screen area of GUI 500 is preferably considered the highest level target, as described below). Icons 502 and menu bar 504 are targets that have a lower level in the hierarchy. In other situations, the window 501 could be grouped within a higher level target, and the icons 502 and menu bar 504 could include additional targets lower in hierarchy than the icons and menu bar. Alternatively, icons 502 and menu bar 504 can be the same hierarchical level as window 501, if, for example, icons 502 were positioned outside of window 501 and were considered on the "desktop", i.e., not grouped in any particular window. In addition, none of the associated targets are restricted to be the same size or shape as their corresponding graphical objects, e.g. a target can be defined as a particular portion of a graphical object.

Herein, it is assumed that a position control paradigm is implemented by the GUI 500 and interface device 14. For example, the position of cursor 506 is directly related to the position of user object 34 in provided degrees of freedom of the user object. Thus, when cursor 506 is moved left on screen 20, user object 34 is moving in a corresponding direction. The distance that user object 34 moves may not be the same distance that cursor 506 moves on screen 20, but it is typically related by a predetermined function. When describing the position of cursor 506 herein, the position of user object 34 within provided degrees of freedom is assumed to correlate with the cursor's position.
When forces are described herein as "affecting", "influencing" or "being applied to" cursor 506, it should be assumed that these forces are actually being applied to user object 34 by actuators 30, which in turn affects the position of cursor 506.
In alternate embodiments, a rate control paradigm can be used in GUI 500. For example, a user can push a joystick in one direction to cause the cursor to move in that direction, where the further the joystick is moved in that direction, the faster the cursor will move across the screen (in one implementation of rate control). In such an embodiment, for example, the user might move the joystick from the origin position and then stop moving the joystick, and the cursor would continue moving across the screen at a constant speed. Forces can be applied to user object 34 dependent on the position of cursor 506 similarly to the position control embodiment. Another example where a rate control paradigm would be appropriate is the button-like stick or knob that is positioned between keys of the keyboard on many portable computers, and which uses rate control to move a cursor within a GUI.
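By way of illustration only — the patent contains no source code, and the function names, gain, and timestep below are assumptions — a minimal C sketch of this rate control paradigm might read:

#include <stdio.h>

typedef struct { double x, y; } vec2;

/* Rate control: cursor velocity is proportional to joystick displacement
   from its origin, so a held deflection moves the cursor at constant speed. */
void rate_control_step(vec2 stick, vec2 *cursor, double gain, double dt)
{
    cursor->x += gain * stick.x * dt;   /* displacement maps to velocity */
    cursor->y += gain * stick.y * dt;
}

int main(void)
{
    vec2 cursor = {512.0, 384.0}, stick = {0.5, 0.0};
    for (int i = 0; i < 10; i++)
        rate_control_step(stick, &cursor, 200.0, 0.01);
    printf("cursor after 0.1 s: (%.1f, %.1f)\n", cursor.x, cursor.y);
    return 0;
}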
In a preferred embodiment, the host commands as described above with reference to Figures 9-17 can be used to provide the various forces used for a GUI 500 environment. The "reflex" mode of using the host computer 12 only for high-level supervisory commands can be helpful in increasing the response time for forces applied to the user object, which is essential in creating realistic and accurate force feedback. For example, it may be convenient for host computer 12 to send a "spatial representation" to microprocessor 26, which is data describing the layout of all the graphical objects displayed in the GUI which are associated with forces and the types of these graphical objects (in the Web page embodiment, the layout/type of graphical objects can be downloaded from the remote computer providing the page). The microprocessor can store such a spatial representation in memory 27. In addition, the microprocessor 26 can be provided with the necessary instructions or data to correlate sensor readings with the position of the cursor on the display screen. The microprocessor would then be able to check sensor readings, determine cursor and target positions, and determine output forces independently of host computer 12. The host could implement operating system functions (such as displaying images) when appropriate, and low-speed handshaking signals can be communicated between processor 26 and host 12 to correlate the microprocessor and host processes.
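One way to picture such a spatial representation is as an array of per-target records downloaded from the host. The patent does not specify a data layout, so every field and type name in this C sketch is an assumption:

typedef enum { T_ICON, T_WINDOW, T_BUTTON, T_SLIDER, T_MENU_ITEM } target_type;

/* Hypothetical record for one target; microprocessor 26 could store an
   array of these in memory 27 and run reflex processes on them without
   host intervention. */
typedef struct {
    target_type type;      /* kind of graphical object */
    int    x, y, w, h;     /* target bounds in screen coordinates */
    int    parent;         /* index of enclosing target; -1 = top level */
    int    force_model;    /* index into predetermined force sensations */
    double magnitude;      /* per-target force magnitude or "mass" */
} spatial_entry;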
Also, memory 27 can be a permanent form of memory such as ROM or EPROM which stores predetermined force sensations (force models, values, reflexes, etc.) for microprocessor 26 that are to be associated with particular types of graphical objects.
Other methods besides the use of the reflex embodiment can also be used to provide the forces within the GUI environment. For example, host computer 12 can be connected directly to sensors 28 and actuators 30 of interface device 14 by a fast communication interface to control the force feedback on user object 34, thus eliminating the need for local microprocessor 26.
In the described embodiment, targets such as window 501, icons 502 and menu headings 505 have force fields associated with them to influence and enhance the user's ability to move cursor 506 to or around the targets. For example, icons 502 may have an attractive force associated with them.
This attractive force originates from a desired point I within each icon 502, which may be located at the center position of the icon. Alternatively, point I can be located at a different area of icon 502, such as near the perimeter of the icon. Likewise, window 501 preferably has an attractive force associated with it which originates from a point W within window 501, which may be at the center of the window. Points I and W are considered to be "field origin points". Alternatively, force fields can originate from a point or region not shown on the screen. These attractive forces are known as "external forces" since they affect the cursor 506 when the cursor is positioned externally to the targets. External and internal forces of targets are described in greater detail with respect to Figure 20a.
In alternate embodiments, the field origin need not be a point, but can be a region or other defined area. For example, as shown in Figure 18, the entire area of an icon 502a can be considered the "field origin region" for an attractive force. In such an embodiment, the cursor may be able to be moved freely in a certain dimension when within a region defined by the borders of the target. For example, if cursor 506 is in region R1 defined by the top and bottom borders of icon 502a, then horizontal forces might attract the cursor toward icon 502a, but no vertical forces would be applied.
Similarly, if cursor 506 is in region R2 defined by the left and right borders of icon 502a, then only vertical attractive forces might affect the cursor.
The attractive forces associated with window 501 and icons 502 are applied to user object 34 to influence the movement of user object 34 and cursor 506. Thus, an attractive force associated with window 501 will cause host computer 12 to command the actuators 30 of interface device 14 to apply appropriate forces on user object 34 to move or bias the user object. Forces are applied to user object 34 in a direction such that cursor 506 is correspondingly moved in a direction toward field origin point W of window 501. It should be noted that the forces to user object 34 do not actually have to move the user object in the appropriate direction; for example, when using passive actuators, the user object cannot be physically moved by the actuators. In this case, resistive forces can be applied so that user object 34 is more easily moved by the user in the appropriate direction, and is blocked or feels resistance when moving in other directions away from or tangent to point W (passive actuator embodiments are described in greater detail with respect to Figure 20c). The attractive force applied to user object 34, which would move or bias cursor 506 toward point W, is represented by dotted line 507 in Figure 18. Preferably, the force is applied with reference to a single reference point of cursor 506, which is the tip point T in the preferred embodiment. In alternate embodiments, the reference point can be located at the center or other location on cursor 506 or other user-controlled graphical object. The attractive forces can be computed, for example, with a 1/R or 1/R² relationship between field origin point W or I and cursor tip T to simulate gravity, as described above with reference to Figure 14.
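A minimal sketch of this gravity-like attraction, assuming the 1/R² form (the gain constant and function names are invented for illustration):

#include <math.h>

typedef struct { double x, y; } vec2;

/* Attractive force pulling cursor tip T toward field origin point W,
   with magnitude falling off as 1/R^2 (use K / r for a 1/R law instead). */
vec2 attractive_force(vec2 tip, vec2 origin)
{
    const double K = 50.0;            /* assumed gain constant */
    vec2 f = {0.0, 0.0};
    double dx = origin.x - tip.x, dy = origin.y - tip.y;
    double r = sqrt(dx * dx + dy * dy);
    if (r < 1e-6)
        return f;                     /* at the origin: no force */
    double mag = K / (r * r);
    f.x = mag * dx / r;               /* unit vector toward W, scaled */
    f.y = mag * dy / r;
    return f;
}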
For other types of targets, repulsive fields may be associated with a field origin point. For example, it may be desired to prevent cursor 506 from moving to or accessing particular regions or targets on the screen within GUI 500. These regions might be displaying data that is processing in the background or other data that is desired to not be selected by cursor 506. If window 501 is one such target, for example, a repulsive field in the opposite direction to that represented by line 507 can be associated with window 501 and can originate at field origin point W. The force would move user object 34 and cursor 506 away from the target, making it more difficult for the user to move cursor 506 onto the target.
In the preferred embodiment, the position of cursor 506 determines which field forces will affect the cursor 506 and user object 34. As described in greater detail subsequently, targets preferably are associated with internal and external forces in relation to cursor 506. Preferably, attractive forces are external forces and thus affect user object 34 and cursor 506 only when the cursor 506 is positioned externally to the target. In the preferred embodiment, only the external forces of the highest level targets that are external to cursor 506 will affect the cursor 506 and object 34. Thus, in Figure 18, only the attractive force of window 501 will affect cursor 506 and user object 34, since the icons 502 and menu headings 505 are at a lower level in the hierarchy. If cursor 506 were positioned within window 501, only the attractive fields of icons 502 and menu headings 505 would affect cursor 506 and user object 34 and the attractive force 507 would preferably be removed. This relationship is described in greater detail with respect to Figure 20a. In alternate embodiments, the forces from various targets can be combined or excluded in different ways.
FIGURE 19 diagrammatically illustrates the GUI 500 wherein multiple windows 501, 530 and 540 are displayed on display screen 20. Grouped within window 501 are icons 502, menu bar 504, window 518, and pull-down menu 517; window 518 includes an icon 519. Grouped within window 530 are icons 532 and menu bar 534. Window 540 includes icons 542 and menu bar 544.
All three windows 501, 530, and 540 are at the same hierarchical level. Therefore, in a preferred embodiment, when the cursor 506 is positioned outside the perimeters of all three windows as shown, cursor 506 and user object 34 are influenced by a combination of the three external attractive forces, one attractive force from each window. These attractive forces are represented by dashed lines (vectors) 520, 522, and 524. Dashed line 520 represents the attractive force in a direction toward field origin point W1 of window 501, line 522 represents the attractive force toward field origin point W2 of window 530, and line 524 represents the attractive force toward field origin point W3 of window 540. The magnitudes of these forces are preferably dependent on a formula, such as the inverse of the distance between each target and point T of the cursor. These attractive forces are preferably summed together as vectors to provide a resulting total attractive force in a resultant direction having a resultant magnitude (not shown). Thus, cursor 506 and user object 34 would be moved or biased in the resulting direction until either reaching the resulting field origin point or until a condition occurred to change the forces applied to cursor 506. In alternate embodiments, other methods can be used to combine force vectors from multiple targets. For example, other organizations of hierarchies can be used. Or the magnitudes of forces might not be summed, such that the resultant attractive force can be assigned a predetermined magnitude or a magnitude that depends on the types of targets that have contributed forces.
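The vector summation can be sketched as follows, reusing the attractive_force() example above; again, the names are illustrative only:

/* Reuses the vec2 type and attractive_force() from the earlier sketch. */
vec2 attractive_force(vec2 tip, vec2 origin);

vec2 total_external_force(vec2 tip, const vec2 *field_origins, int n)
{
    vec2 total = {0.0, 0.0};
    for (int i = 0; i < n; i++) {
        vec2 f = attractive_force(tip, field_origins[i]);
        total.x += f.x;               /* the resultant direction and */
        total.y += f.y;               /* magnitude emerge from the sum */
    }
    return total;
}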
No forces associated with icons 502, 532, and 542, menu bars 504, 534, and 544, pull-down menu 510, internal window 518, nor "thumb" and corresponding scroll bar 582 affect cursor 506 and user object 34 while cursor 506 is positioned external to the windows as shown. The principal task to be performed in Figure 19 is the activation or selection of a particular window, not a window's contents. Thus, the inclusion of forces arising from targets inside a window would interfere with window selection. Once the cursor 506 is positioned within a window, then the forces associated with the targets inside the window take effect. For example, once cursor 506 is moved within window 501, the external attractive force associated with window 501 is preferably removed and the external attractive forces of icons 502, window 518, and menu headings 505 are applied. The attractive force of icon 519 within window 518 is preferably not applied, since it is not at the highest hierarchy level external to cursor 506, i.e., icon 519 is at a lower hierarchy level than window 518.
Only forces associated with highest level external targets preferably affect cursor 506. One reason is that, if attractive forces associated with targets inside a window were added to the window's external force, then a window with several icons could "overpower" other windows by exerting a much greater magnitude of attractive force on user object 34 than the other windows. The cursor 506 might then be trapped into always moving to the window with several icons. If each window affects cursor 506 equally, then it is easier for the user to move the cursor to the desired window. Of course, in alternate embodiments, if a window having more targets were desired to exert a greater force on cursor 506 than windows having fewer targets, then such an effect can be implemented. For example, the magnitude of the forces in such an embodiment could be limited so that the user would still be able to select all of the windows displayed in GUI 500, yet the user would feel slightly stronger forces from windows having greater numbers of icons.
The embodiment described above assumes that the magnitude of external force associated with each window on cursor 506 is calculated the same way. However, in other embodiments, the magnitude of attractive or other forces associated with targets can differ depending on characteristics of the targets or can be commanded by the software programmer or user to be a desired magnitude.
For example, the size of windows 501, 530, and 540 may determine the magnitude of attractive force affecting cursor 506. If a user drags a window to be a smaller size, the attractive force associated with that window might be made proportionally smaller. For example, a virtual "mass" can be assigned to a target based on size, and the mass can be multiplied by the inverse of the distance between the target and the cursor to equal the resulting attractive force. The cursor can also be assigned a mass, if desired, to simulate real physical forces between objects. Also, other features or characteristics of the target, such as color, type, shape, etc., might control the magnitude of the force depending on how the programmer or user sets up a desired GUI force environment.
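A sketch of this mass rule (magnitude = mass × inverse distance), under the assumption that mass scales with window area; both scale factors are invented for illustration:

/* Virtual "mass" proportional to window area, so resizing a window
   proportionally changes its pull. */
double window_mass(double width, double height)
{
    const double K_MASS = 0.001;      /* assumed scale factor */
    return K_MASS * width * height;
}

/* Force magnitude = mass / distance, per the inverse-distance rule. */
double mass_attraction_magnitude(double mass, double distance)
{
    return (distance > 1e-6) ? mass / distance : 0.0;
}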
In addition, a programmer of the GUI 500 or of an application program running under the GUI is preferably able to control the magnitude of the forces associated with particular targets displayed (or the "masses" of targets). For example, the force field host command and command parameters, described above, may be able to designate a magnitude for particular displayed windows.
Each target could thus have a different, predetermined force associated with it. This might allow a software developer to designate a desired force to be associated with a particular window for his application program running under GUI 500. In addition, in some embodiments, a user of GUI 500 might be allowed to designate particular magnitudes of forces associated with targets. A menu command or other standard method to allow the user to associate forces with particular targets can be implemented.

FIGURE 20a is a diagrammatic illustration of displayed targets illustrating the concepts of internal and external forces of the present invention associated with targets. As referred to herein, "external forces" are those forces associated with a target which affect cursor 506 when the cursor 506 is positioned externally to that target, i.e. when the cursor is positioned outside the perimeter of the target. In contrast, "internal forces" are those forces associated with a target which affect cursor 506 when the cursor is positioned internally to the target, i.e., within the perimeter of the target. Each target preferably has external forces and internal forces assigned to it, as described below. Of course, the internal forces and/or external forces associated with a target may be designated as zero, effectively removing those forces.
Target regions 550, 552, 554, 556, and 558 are displayed in GUI environment 500. Targets 550, 552, and 554 are at the same hierarchical level, and are associated with graphical objects such as windows or icons. Targets are "associated with" an appropriate graphical object such as an icon, meaning that they can be characterized as a property of the icon. The target is typically the same size as the associated graphical object, but may be defined to be smaller or larger than the object, or to be a different shape than the object, in other embodiments. Targets 556 and 558 are grouped within target 554 and are thus at the same hierarchical level as each other but at a lower hierarchical level than the other targets 550, 552, and 554. For example, targets 556 and 558 can be associated with icons, windows, menus, menu items within a menu 554, or other targets grouped within window 554.
Rectangular and circular targets are shown in Figure 20a, although other shapes, even irregular ones, may be provided as targets.
Points 560, 562, and 564 represent possible locations of cursor 506 in GUI 500. As explained above with reference to Figures 18 and 19, external forces associated with lower level targets 556 and 558 will not affect cursor 506 when the cursor is positioned external to higher level target 554. Therefore, when cursor 506 is at point 560 external to target regions 550, 552, and 554, the total force on cursor 506 is equal to the sum of external target forces associated with each target 550, 552, and 554. As an example, the associated forces may be an attractive (or repulsive) force field as described above. The forces would thus be in a direction toward the field origin points W1, W2, and W3, shown as dashed lines 566, 567, and 569. Alternatively, the external forces can be one or any combination of the force models previously described with respect to Figures 9 and 14. For example, a texture external force or a damping external force can be applied, or a combination of these or other forces. Additionally, other forces and force models may be assigned as external forces. It should be noted that many types of force models do not require a field origin as in the examples of Figures 18 and 19.
In addition, external target ranges are preferably assigned to each external force associated with each of the targets 550, 552, and 554. These external ranges define an external region from a target point P to the range limit in which the external force will be in effect. In one embodiment, the target point P for defining ranges can be the same point as the field origin point, as shown for target 550. For example, external range 555 may represent the border of a defined external region 568 for target 550, which is a predetermined distance from point P. If cursor 506 is positioned within the external region 568 from the perimeter of target 550 to external range 555, then the external force associated with target 550 is in effect. If cursor 506 is outside region 568, then the external force is not in effect. For example, the external target force associated with target region 550 is zero at point 560 because its external region 568 does not extend to point 560. By defining such ranges, the processing time of local microprocessor 26 and/or host computer 12 is reduced, since the external forces need only be computed and applied when the cursor is in these regions. The external region 568 can be defined as a distance from point P, or may alternatively be defined with respect to the perimeter of a target, or may have a predetermined shape about its associated target region. In addition, a total force resulting from the external forces of multiple targets may have a newly-computed external range. In alternate embodiments, the region outside the external range of a target can be assigned a different force model and/or magnitude instead of zero.
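The range gating might be sketched as follows, assuming a circular external region about point P and reusing the earlier attractive_force() example:

/* External force gated by range: zero outside external region 568
   (here a circle of radius `range` about target point P), so the force
   need only be computed while the cursor is inside the region. */
vec2 attractive_force(vec2 tip, vec2 origin);   /* from the sketch above */

vec2 gated_external_force(vec2 tip, vec2 point_p, double range)
{
    vec2 zero = {0.0, 0.0};
    double dx = tip.x - point_p.x, dy = tip.y - point_p.y;
    if (dx * dx + dy * dy > range * range)
        return zero;                  /* beyond the external range */
    return attractive_force(tip, point_p);
}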
Point 562 is located within target 554 (internal to target 554) and externally to targets 556 and 558. At point 562, the total force affecting cursor 506 would be a combination of an internal target force for target 554 and external target forces for targets 556 and 558. Cursor 506 is "insulated" from the external forces of targets 550, 552, and 554 since it is inside target 554. The external forces associated with targets 556 and 558 are similar to the external forces described above. The internal force associated with target 554 affects cursor 506 only when the cursor is within the perimeter of the target. Internal target forces of the preferred embodiment are described below with reference to Figure 20b.
Point 564 is internal to target 556. A cursor 506 placed at point 564 would experience an internal force associated with target 556 and no other forces. There are no external forces affecting the cursor 506 at this location, since there are no targets of lower hierarchical level grouped in target 556. In addition, the internal force of target 554 is removed when the cursor is affected by an internal force of a target of lower hierarchical level, which in this case is the internal force of target 556.
FIGURE 20b is a diagrammatic illustration of a single target 570 which is associated with internal and external forces. In the provided example, target 570 may be associated with a menu item, button, icon, or window. An external region shape delineated by range 555 denotes a region for an external force associated with target 570. Cursor 506 is influenced by the external target force for target 570 when it is inside the external region 568 defined between dashed line 572 and an outer perimeter 575 of target 570. Alternately, the external region can be defined between dashed line 572 and an inner perimeter 577 of target 570. Recall that the target associated with a graphical object need not be the same size and shape as the graphical object, so a target perimeter may lie inside or outside the perimeter of the graphical object displayed on the screen 20.
An internal target region 574 may include a dead region 576 and a capture region 578. Dead region 576 is defined as the innermost, central region of target 570 and extends to an inner perimeter 577. In the dead region, forces associated with the dead region ("dead region forces") applied to cursor 506 would preferably be zero so as to allow substantially free movement of the cursor within this region (also, any external forces of any targets included within target 570 would be in effect). This dead region thus corresponds to the deadband regions discussed above with reference to Figures 9 and 14, as applied to the restoring and restoring spring forces and the groove/divot forces. Alternatively, a particular force or force model can be associated with dead region 576. For example, a damping force or texture force sensation can be provided when the cursor is positioned within this region, providing force feedback awareness to the user that cursor 506 is inside target 570.
Other force models can also be applied, such as the forces described above with respect to Figures 9 and 14. In addition, the entire displayed GUI portion 500 on the screen 20 is preferably considered a target, and a dead region force such as a damping force or a texture force can be applied to user object 34 when pointer 506 is moving over the background or desktop of the GUI. Such a damping force may greatly help users with a dexterity disability and allow these users to move pointer 506 more accurately. Or, individual windows can be assigned different dead region forces. This feature can be useful to distinguish the "feel" of different windows displayed on the screen, thus reducing the confusion of the user. For example, one window can have a texture dead region force of closely spaced bumps, while a different window can have a texture dead region force of widely-spaced bumps. This allows the user to identify which window the cursor is in just by the feel of the dead region texture.
The capture region 578 is preferably provided at or near the perimeter of target 570. The forces associated with capture region 578 are applied to cursor 506 when the cursor is positioned within or is moved through the capture region. Since the capture region is typically narrow, it may sometimes be difficult to determine if the cursor is within the capture region. For example, the host computer or local microprocessor 26 determines the location of cursor 506 (and user object 34) by taking samples of sensors 28. If the user is moving user object 34 very quickly, the readings from the sensors may be at too slow a frequency to provide data showing that the cursor was located inside the capture region. The width of capture region 578 (i.e., the distance from inner perimeter 577 to outer perimeter 575) can thus be made large enough so that the cursor can be detected within the capture region even when the user moves the cursor quickly. Alternatively, a history of sensor readings can be checked to determine if the cursor was previously outside (or inside) the target 570, and if the cursor is subsequently inside (or outside) the target 570, thus indicating that the cursor has passed through capture region 578 and that a capture force should therefore be applied to user object 34.
In the preferred embodiment, two different forces can affect cursor 506, depending on whether the cursor has moved from the dead region to the external region of the target (exiting target 570), or vice-versa (entering target 570). When the cursor is moved from dead region 576 to external region 568, an "exit capture force" is applied to user object 34. The exit capture force is preferably a barrier or "snap over" force positioned at inner perimeter 577, which preferably includes a spring force as represented symbolically by springs 579 in Figure 20b. The spring force causes a spring resistance to the motion of cursor 506 in the exit direction, which starts as a small resistive force in the direction toward the dead region 576 and which increases as the cursor is moved closer to outer perimeter 575. The spring force may cause the cursor/user object to move back toward dead region 576 if the user lets go of the user object. This barrier force thus prevents the cursor from easily "escaping" the target 570. In embodiments having passive actuators, a damping barrier force can be provided instead of the spring force. The barrier force can be useful to keep cursor 506 within an icon, scroll bar, or menu heading so that the user may more easily select the operation designated by the icon, etc. In addition, by providing a zero dead region force and a barrier exit capture force, a user can move the cursor within the internal area of a target and "feel" the shape of the target, which adds to the sensory perception of graphical objects. Outer perimeter 575 of target 570 preferably defines a snap distance (or width) of the barrier, so that once cursor 506 is moved beyond perimeter 575, the exit capture force is removed. The divot force model can be used when a capture force is desired on all four sides of the perimeter of target 570, and a groove force model can be used if capture forces are only desired in one dimension.
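A one-dimensional sketch of the exit "snap over" spring, together with the sensor-history crossing test mentioned above; the stiffness value and names are assumptions:

/* Exit capture force along one axis: zero in the dead region, a spring
   pushing back toward the dead region while the cursor is inside the
   capture band between the inner and outer perimeters, and zero once
   past the outer perimeter (the cursor has "escaped"). */
double exit_capture_force(double pos, double inner, double outer)
{
    const double K_SNAP = 2.0;        /* assumed spring stiffness */
    if (pos <= inner || pos >= outer)
        return 0.0;
    return -K_SNAP * (pos - inner);   /* grows toward the outer perimeter */
}

/* History check: a fast cursor may jump the narrow band between samples;
   comparing the previous and current sides of a perimeter detects the
   crossing anyway. */
int crossed_capture_region(double prev_pos, double cur_pos, double perim)
{
    return (prev_pos < perim) != (cur_pos < perim);
}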
When the cursor 506 enters target 570, an "entry capture force" is applied to user object 34.
Preferably, the entry capture force is the same spring force as the exit capture force, in the same direction toward the dead region 576. Thus, when cursor 506 first enters the capture region, the spring force will immediately begin to push the user object/cursor toward the dead region. The closer the cursor is positioned to the dead region, the less spring force is applied. In some embodiments, the magnitude of the entry spring force can be limited to a predetermined value or offset to prevent the cursor 506 from moving past ("overshooting") target 570 due to excessive attractive force.
Alternatively, an attractive force field similar to the external attractive force fields described above can be provided as the entry capture force. In such an embodiment, the direction of movement of cursor 506 must be established so that it is known whether to provide the exit capture force or the entry capture force. The history of sensor readings can be checked as described above to determine cursor direction. In alternate embodiments, different or additional types of entry capture forces can be applied.
In addition, a different "inertia" force can be applied to user object 34 when cursor 506 is positioned in dead region 576 for particular types of targets and when specific conditions are met.
For example, the inertia force can be applied when a command gesture, such as the pressing or holding of a button, is input by the user. In one preferred embodiment, the inertia force is provided when the user moves pointer 506 into dead region 576, holds down a button on the joystick or mouse, and moves or "drags" the graphical object (and associated target 570) with pointer 506 across screen 20. The dragged target 570 has a simulated "mass" that will affect the amount of inertia force applied to user object 34. In some embodiments, the inertia force can be affected by the velocity and/or acceleration of cursor 506 in addition to or instead of the simulated mass. Other factors that may affect the magnitude of inertia force, such as gravity, can also be simulated. For example, if a large icon is dragged by cursor 506, then the user may feel a relatively large damping force when moving user object 34. When the user drags a relatively small icon with pointer 506, then a smaller damping force should be applied to user object 34. Larger objects, such as windows, can be assigned different masses than other objects, such as icons. Alternatively, an icon's mass can be related to how large in terms of storage space (e.g. in bytes) its associated program or file is. For example, an icon of a large-sized file is more difficult to move (is "heavier") than an icon for a smaller-sized file. A target's mass can also be related to other target/graphical object characteristics, such as the type of graphical object, the type of application program associated with the graphical object (i.e., larger mass for word processor icons, less mass for game program icons, etc.), or a predetermined priority level. Thus, force feedback can directly relate information about a target to the user, assisting in performing and selecting desired operating system tasks. In addition, an inertia force feature may be useful if a user wishes to retain a specific screen layout of graphical objects in GUI 500. For example, all the objects on the screen can be assigned a very high "mass" if the user does not want objects to be moved easily from the preferred layout.
Other types of forces can also be applied to user object 34 when other command gestures are provided and/or when the target is dragged or moved, such as texture forces and jolts. In addition, if simulated masses are being used to calculate the external force of a target, as for the attractive gravity force described above, then that same mass can be used to compute an inertia force for the target when the target is dragged. In yet another embodiment, a target may have a spring force associated with its position before it was moved. For example, when the user drags an icon, the movement of user object 34 would feel like a spring is attached between the icon and its former position. This force would bias the cursor toward the former position of the icon. In a different, similar embodiment, a spring or other type of force can be provided on user object 34 when a graphical object is resized.
For example, a window can typically be changed in size by selecting a border or corner of the window with cursor 506 and dragging the window to a desired size. If the window is dragged to a larger size, then a "stretching" spring force can be applied to the user object. If the window is dragged to a smaller size, then a "compressing" spring force can be applied. Such spring forces can also be provided in a CAD program when graphical objects are stretched or otherwise manipulated.
The implementation of these types of forces can include a simple proportionality between displacement and force, and is well known to those skilled in the art.
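In its simplest form this is Hooke's law; a sketch, with an assumed stiffness:

/* Resize spring: force proportional to the displacement of the dragged
   border from its original position ("stretching" when the window is
   enlarged, "compressing" when it is shrunk). */
double resize_spring_force(double border_pos, double original_pos)
{
    const double K_RESIZE = 1.5;      /* assumed stiffness */
    return -K_RESIZE * (border_pos - original_pos);   /* F = -k * x */
}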
Also, the targets for inertia forces can be defined separately from the targets for the internal and external forces as described above. For example, most windows in a GUI can only be dragged by cursor 506 when the cursor is located on a "title bar" (upper portion) of the window or similar specific location. The window can be associated with an inertia target and a separate internal/external force target. Thus, the target for the internal/external forces can be defined to cover the entire window, while the target for the inertia forces can be defined to cover just the title bar of the window.
If the cursor 506 were located on the title bar, then both inertia and internal forces could be in effect.
In addition, damping and/or friction forces can be provided instead of or in addition to the inertia forces. For example, each graphical object can be assigned a simulated damping coefficient or a coefficient of friction. Such friction might be useful when free-hand drawing in a CAD application program in the GUI, where the coefficient of friction might be based on "pen size" of a drawing cursor. A texture force might also be applied when a graphical object is dragged. Other examples of forces and associated graphical objects and functions include providing force jolts or "bumps" when the cursor 506 encounters a region, when an object is released after having been dragged across the screen, when a window is entered or exited by the cursor, or when a window is opened or closed. In a text document, these bumps can be provided when the cursor moves between words, lines, letters, paragraphs, page breaks, etc.
FIGURE 20c is a diagrammatic illustration of a target 559 in a GUI 500 providing a "groove" external force. This type of external force is suitable for an interface device 14 having passive actuators 30. Passive actuators may only provide resistance to motion of user object 34, and thus cannot provide an attractive or repulsive force field as an external force of a target. Thus, an external force of target 559 can be provided as external grooves 561, e.g. the groove force model as described above with reference to Figure 14 can be used. These grooves are preferably positioned in horizontal and vertical directions and intersect at the center C of target 559. It should be noted that grooves 561 are preferably not displayed within GUI 500, and are shown in Figure 20c for explanatory purposes (i.e., the grooves are felt, not seen). (Alternatively, the grooves can be displayed.) When cursor 506 is moved into a groove, resistive forces are applied to resist further movement out of the groove but to freely allow movement along the length of the groove. For example, if cursor 506 is positioned at horizontal groove 563a, the cursor 506 may freely be moved (i.e. with no external forces applied from target 559) left and right as shown by arrows 565. However, the groove "walls" provide a resistive force to the cursor when the cursor is moved up or down. This tends to guide or bias the movement of the cursor 506 toward (or directly away from) target 559. Similarly, if cursor 506 is positioned at vertical groove 563b, the cursor may freely be moved up and down as shown by arrows 557, but must overcome a resistive barrier force when moving left or right. The grooves 561 preferably have a predefined length which determines the external range of the external force of the target.

When cursor 506 is moved along a groove toward the center of target 559, the cursor eventually reaches the center C of the target. At this position, both grooves 561 provide combined barrier forces to the cursor in all four directions, thus locking the cursor in place. Once the cursor is locked, the user can conveniently provide a command gesture to select the graphical object associated with target 559. In a preferred embodiment, the external groove forces are removed once the user selects the target. For example, if target 559 is associated with a button as shown in Figure 22, the cursor would be guided to target 559, and, once the button is selected, the grooves would be removed, allowing the cursor to be moved freely. Once the cursor moved out of the external region defined by the ends E of the grooves, the external force would again be in effect.
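A sketch of the resistive groove follows; because passive actuators can only resist motion, the output is modeled here as a damping resistance against the velocity component perpendicular to the groove (names and gain are assumptions):

typedef struct { double x, y; } vec2;   /* as in the earlier sketches */

/* Passive groove: free motion along the groove axis, resistance against
   motion perpendicular to it, zero outside the groove's predefined length. */
double groove_resistance(vec2 pos, vec2 vel, vec2 center,
                         double half_len, int horizontal)
{
    const double B_WALL = 5.0;        /* assumed resistance gain */
    double along = horizontal ? pos.x - center.x : pos.y - center.y;
    if (along < -half_len || along > half_len)
        return 0.0;                   /* outside the external range */
    double v_perp = horizontal ? vel.y : vel.x;
    return B_WALL * (v_perp < 0 ? -v_perp : v_perp);  /* resist only */
}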
When cursor 506 is moved along a groove toward the center of target 559, the cursor eventually reaches the center C of the target. At this position, both grooves 561 provide combined barrier forces to the cursor in all four directions, thus locking the cursor in place. Once the cursor is 30 locked, the user can conveniently provide a command gesture to select the graphical object associated with target 559. In a ~l~rell~d embodiment, the external groove forces are removed once the user selects the target. For example, if target 559 is associated with a button as shown in Figure 22, the cursor would be guided to target 559, and, once the button is selected, the grooves would be removed, allowing the cursor to be moved freely. Once the cursor moved out of the external region 35 defined by the ends E of the grooves, the external force would again be in effect.

FIGURE 21 is a diagrammatic illustration of display screen 20 showing GUI 500 and window 501 with a pull-down menu. The foregoing concepts and preferred embodiments will now be applied to selection of menu items in a GUI environment. Once the cursor 506 is inside the window 501, forces applied to user object 34 depend upon the cursor 506 location relative to targets within window 501 on the next lowest level of the hierarchy below window 501. Menu bar 504 is preferably considered to be on the same hierarchy level as icons 502, so that both icons 502 and menu bar 504 exert attractive external forces on cursor 506. Alternatively, menu bar 504 can be assigned a hierarchy level below that of window 501 but above that of icons 502, which would allow only the menu bar to attract cursor 506 (hierarchy levels of other graphical objects might also be changed in other embodiments).
Figure 21 depicts window 501 with a file pull-down menu 510, where menu 510 includes one or more menu items 516. The display of menu 510 results from a selection of the "File" menu heading 505 of the menu bar 504, and is typically performed by moving cursor 506 onto menu heading 505 and selecting or holding down a button, such as a mouse or joystick button. Once a pull-down menu such as the "File" pull-down menu 510 has been displayed, force models associated with the menu 510 or its items 516 will affect cursor 506 and user object 34. For example, if the cursor 506 is located within window 501 as denoted by the dashed cursor outline 512 in Figure 21 after activating the pull-down menu 510, the cursor 506/user object 34 is preferably attracted from its position at outline 512 toward field origin point S of the menu 510 with an attractive external force of the menu 510. Alternatively, a field origin region defined as the entire menu 510 can be defined, as described above. Once the cursor 506 is located within the perimeter of menu 510, as shown by location 514, then the attractive external force of the menu is no longer in effect. Any internal menu forces of menu 510, or menu items 516, are then in effect, as described below. Preferably, menu 510 has one external force associated with it that attracts cursor 506 to the center (or other designated field origin position) of the menu 510. Alternatively, each menu item 516 can be associated with its own external force, which may all sum to a total force that can affect cursor 506 if the cursor is positioned outside menu 510. For example, each menu item might have its own attractive external force with its own field origin point located at the center of each menu item; or, other force models can be used in other embodiments. In addition, some menu items 516 might be designed to have an external force of greater magnitude than other items. External force magnitudes might be assigned, for example, according to characteristics of the menu items (size, order in the list, etc.), frequency of use, or according to personal desires of a programmer or user of GUI 500.
Once positioned inside the pull-down menu 510, the cursor 506 will inevitably lie within one of several menu items 516 demarcated by dashed and solid perimeters 521 in Figure 21. The dashed lines are typically not displayed in standard menus of GUIs, but are shown here for explanatory purposes. Preferably, the menu 510 has no internal forces, but each menu item 516 has its own internal forces which are in effect within the perimeters 521 of the item areas. The dashed lines define the perimeter of each menu item with respect to other menu items 516. The menu items are preferably similar to the target 570 shown in Figure 20b. Preferably, each menu item includes a zero magnitude force in its dead region 576 and includes a barrier or "snap-over" force (such as a spring or damping force) located at perimeter 521 as its exit capture force in accordance with that described with reference to Figure 20b. This capture force keeps cursor 506 within a particular menu item 516 once the cursor has moved there. In addition, each menu item 516 can include a "snap-to" entry capture force positioned at the middle of the menu item to attract the cursor 506 to this middle point.
Thus, the cursor is assisted in rem~ining within a particular menu item target, such as the Open F7 item target 517, by the use of force feedback as previously discussed with reference to Figure 20b.
~ach menu item 516 such as New, Open Move, Copy, etc. can have its own dead region for free movement within a item 516 and a capture region to assist in keeping the cursor in the particular item target that it is located. Preferred force models are the grooves and barriers discussed with reference to Figure 14. For example, a groove force model can be provided at each menu item so that extra force is necessary to move the cursor 506 "out" of the groove past a pc~rim.~tPr 521, but does not prevent cursor 506 from moving left or right out of the menu. By impeding movement between selection areas 516, the force feedback prevents accidental shifting between menu items and prevents the inadvertent selection of an incorrect menu item and ~ e~ g system function. The menu items typically have no external force, since they abut at their borders. An external force can be provided at the left and right borders of each menu item if desired. The abovedescribed "snap to" force can also be useful for snap-to grid lines in a CAD program or constraining motion in a drawing program to perpendicular or 45-degree angle directions.
In other embodiments, other forces can be provided in addition to those discussed to ease the movement of cursor 506 over the menu items 516. For example, the user may inadvertently skip the cursor over some menu items 516 if a great deal of force has to be used to move the cursor 506 over perimeters 521 between menu items. To prevent the undesired skipping over of selections 516, a damping force can be provided in the dead region 576 of each selection 516 to slow down the cursor in a menu item. Alternatively, a repulsive entry capture force can be provided by the menu items that are not immediately adjacent to the menu item that the cursor is in, such that the skipping problem is reduced.
The scroll bar or "slider" 581 also preferably is designated as a target of the present invention. The scroll bar preferably includes a "thumb" 580, a guide 582 in which to move the thumb, and arrows 583. Cursor 506 can be positioned over thumb 580 in the scroll bar 581 for the window 501 and the user can scroll or move the view of icons, text, or other information shown in window 501 by moving thumb 580 in a vertical direction along guide 582, as is well known to those skilled in the art.

Guide 582 is preferably a target of the present invention such that external forces and internal forces are associated with the guide. Preferably, an attractive external force is associated with the guide so that cursor 506 is attracted to a field origin point N within thumb 580. Thus, the field origin point of the guide can vary its position within guide 582 when the user moves the thumb. The guide 582 can be designated the same hierarchical level as icons 502, or a higher or lower level. Internal forces of the guide are preferably equivalent to those of Figure 20b. The capture forces on the top and bottom sides of the groove prevent cursor 506 from easily moving onto arrows 583 when moving thumb 580. In an alternate embodiment, the dead region of guide 582 has zero width, so that the cursor is always attracted to a point halfway across the width of the guide, i.e. an entry capture force to the middle line L of the guide. This would be close to a groove force model, except that the sides of guide 582 near arrows 583 would have a barrier force and thus be like a divot. In a passive actuator (or other) embodiment, such a groove can be provided along guide 582, and the cursor can be locked onto thumb 580 as described with respect to Figure 20c. The cursor would, of course, still be able to be moved with the thumb when locked on the thumb, and could be released with a command gesture.
Preferably, thumb 580 and arrows 583 are considered children objects of guide 582, i.e., the thumb and arrows are at a lower hierarchical level than the guide and are considered "within" the guide. Thus, the external forces of the thumb and arrows are only applicable when cursor 506 is positioned within the guide. The external forces of arrows 583 are preferably zero, and thumb 580 preferably has an attractive external force. The internal forces of thumb 580 and arrows 583 are preferably similar to those described with reference to Figure 20b.
Thumb 580 can also be assigned inertia forces as described with reference to Figure 21. The user could feel the inertia "mass" of the thumb when moving it along guide 582. Since thumb 580 can be viewed as an icon with constrained movement, many forces attributable to icons can be assigned to thumbs.
As described above, graphical objects/targets such as icons 502 and window 501 can be assigned simulated "masses" which can be used to provide inertia forces when the targets are dragged across the screen. The inertia forces can also be applied due to collisions or other interactions with other graphical objects and targets. For example, if pointer 506 is dragging icon 502, and the icon collides with the edge 587 of window 501, then a collision force can be applied to user object 34. This collision force can be based on the speed/direction of the icon/cursor as it was moved, the mass of the icon, and any simulated compliances of the icon 502 and the edge 587. For example, edge 587 can be assigned to be very compliant, so that when icon 502 is dragged into the edge, a spring-like force is applied to user object 34 which causes icon 502 and cursor 506 to bounce back away from edge 587.
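A sketch of such a compliant-edge collision, modeling the edge as a spring whose stiffness is the inverse of an assumed compliance value:

/* Collision with a compliant window edge: penetration depth times a
   stiffness derived from the edge's compliance gives the spring-like
   "bounce back" force on the dragged icon/cursor. */
double edge_collision_force(double penetration, double compliance)
{
    if (penetration <= 0.0 || compliance <= 0.0)
        return 0.0;                   /* no contact (rigid case aside) */
    double k = 1.0 / compliance;      /* very compliant edge -> soft spring */
    return -k * penetration;          /* pushes icon and cursor back out */
}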

Alternatively, these same sorts of "collision" forces can be applied to cursor 506 regardless of whether any object is being dragged or not. For example, certain edges, objects, or regions in GUI 500 can either be designated as "pass-through" objects or as "solid" objects. Cursor 506 would be able to move over any pass-through objects without user object 34 feeling any forces. However, forces would be applied to user object 34 if cursor 506 moves over or into any solid object. Cursor 506 could be assigned a mass of its own so that the user object will feel collision forces in accordance with the mass of cursor 506, the velocity of the cursor across the screen, and a designated compliance of the cursor and the object moved into. This can be useful in a GUI to prevent or hinder access to certain objects or functions. Such objects could be designated as solid objects, which would allow cursor 506 to be moved freely about the screen without concern about selecting undesired functions.

FIGURE 22 is a diagrammatic illustration of display screen 20 showing window 501 and a "pop-up" window 586. Window 501 includes icons 502. Window 586 includes buttons 584 and a "radio button" 585, and the window is typically removed from the screen after a button 584 has been selected. Buttons 584 can also be displayed in more "permanent" (i.e., non-pop-up) regions of GUI 500. Similarly to the targets associated with the graphical objects described above, each button 584 in window 586 in Figure 22 has external and internal forces associated with it, as described with reference to Figure 20a. Thus, an attractive external force (or other desired force) and a zero dead region force and divot capture force can be associated with each button 584. Essentially, the buttons 584 are analogous to menu items 516 in Figure 21, except that a certain distance on the screen separates the buttons 584 from each other. Also, buttons 584 preferably have a radially-shaped external region for their external forces.
Radio button 585 is similar to buttons 584 in that a particular function may be selected or toggled if the user moves cursor 506 onto the radio button 585 and provides a command gesture such as pushing a button. Button 585 preferably is implemented similarly to buttons 584 except that button 585 has a round perimeter and preferably a round external region. In other embodiments, buttons can have other shapes.
In an alternate embodiment, the forces associated with buttons 584 and 585 can be "turned off" or otherwise changed after the button has been selected by the user using cursor 506. For example, an attractive external force and entry capture force of a button 584 can draw or guide the cursor to the button. The exit capture force impedes the cursor from moving outside of the button.
Once the button is selected, however, the capture and external forces can be removed, so that the cursor can be moved freely (and/or be affected by the forces associated with other targets on the screen). The forces can then be reapplied upon desired conditions. For example, once the cursor moves out of the external region of the button, then the forces would be back in effect and would be reapplied when the cursor was moved back into the external region of the button. Likewise, some or all of the forces associated with the button could be changed to different types of force models once the button was pressed. This embodiment can also be applied to other types of graphical objects, such as icons, e.g., once the icon is selected, forces are removed until the cursor is moved out of the external region and back into the external region, when the forces would be reapplied.
500 with ç~trrn~l and internal forces. The mapping will generally include slccigning one or more force models and range sizes/shapes to each external and internal region of types of graphical objects.
For example, icons may be :-cci nP-l particular forces and ranges, and sliders may be ~ccignf~-l n~ forces and ranges. Also, particular icons or other objects can be ~?ccipnP(1 particular forces or 20ranges if the prog,d~ er has so ~ cign~te~l If only a portion of a graphical object is to be used as a target, then that portion can be defined in this step. The process of mapping forces to gr~rhir~l objects in the GUI is described in greater detail with respect to Figure 24.
In step 618, the position of the user object 34 is read by host computer 12 and/or microprocessor 26 and the cursor position on the screen is updated accordingly. This is typically 25accomplished by first reading sensors 28 on int~rf~-~e device 14 to ~letPrmin~ where user object 34 is positioned. These readings are then collvelLed to the coordinates on screen 20 and Lhe cursor is moved to the a~p,~li~e location corresponding to the position of the user object in a position control paradigm, as is well known to those skilled in the art. Since the sensor readings may include non-integer values, the sensor readings can be c~llvt;,led to integer values which are associated with 30coordinates on the screen so that the cursor position can be updated. However, when forces are calculated (as in step 622 below), the original non-integer sensor readings are used~ since these values include the necessary accuracy.
In ~lt~rn~five embo-1im~nt.~, the display rnight be updated in other ways in response to the position or other chars~-~t--ti~tirs of motion of the user object. For example, some application programs 35implemented by host ColllLJul~l 12 might use two dimensional, planar input to control other aspects of an intPrf~ or prograrn, such as panning a screen, rotating a controlled object, moving a user-CA 0223912~ 1998-0~-29 controlled player, vehicle, or Vi~w~Oillt through sim~ te(l 3-D virtual space, etc. Also, the velocity or acceleration of the user object can be c~lrul~ted and used as input. In other embodim~nts~ the meçh~nism 14 might allow three or more degrees of freedom to the user object, thus allowing other ways to control objects and operating system functions.
In step 620, process 610 determines a target of lowest hierarchy in which the cursor is located. As mentioned above in the discussion of Figures 18-20a, the hierarchies assigned to targets influence the forces that are in effect on cursor 506. This process is described in greater detail with respect to Figure 25. In step 622, an appropriate force is determined from the external and internal forces for each target that affects the cursor, where the target selected in step 620 helps determine which forces are in effect. In addition, other conditions or events in the GUI may contribute to the forces applied to the user object. The contributing forces are combined and the combined total force is applied to the user object 34 by actuators 30. This step is described in greater detail with respect to Figure 26. After step 622, the process returns to step 618 to again read the user object position and apply appropriate forces.
FIGURE 24 is a flow diagram illustrating an example of step 616 of Figure 23, in which forces are mapped to graphical objects. The process begins at 630, and in step 632, an available target is selected to assign forces to that target. After a target has been selected, process 616 implements a series of steps 634, 636, 638, 640, and 642 to determine the particular target's type. These steps can be performed in any order, or simultaneously. Step 634 checks if the selected target is an icon. If so, step 644 assigns a radial dead range, a radial capture range, and a radial external range to the icon.
The "dead range" is the size of the dead region 576 about the center of the icon, defined by inner pi rimett-r 577 as shown in Figure 20b. The "capture range" is defined between inner and outer perimeters 577 and 575, so that a radial capture range in-lieates that the inner and outer pe. i, . ~e~ are circular about the center of the icon. The capture and external ranges are preferably radial even though the icon itself may be rectangular or shaped otherwise. In other embodiments, other shaped ranges can be assigned. The process then continues to step 652, described below. If the target is not an icon, the process continues to step 636.
In step 636, the process checks if the selected target is a button or window; if so, step 646 assigns rectangular dead and capture ranges and a radial external range to the selected target. Buttons are illustrated in Figure 22. Since the windows and buttons are rectangular, a rectangular capture range is desired to match the shape of the perimeter of the window or button. A radial external range can be provided as a predetermined distance from a center point of the window or button. The process then continues to step 652. If the target is not a button or window, the process continues to step 638. Step 638 checks whether the target is a radio button; if so, step 648 assigns radial internal and external ranges, since the radio button is typically circular in shape. The process then continues to step 652. If the target is not a radio button, the process continues to step 640, in which the process checks if the target is a slider. If so, step 650 assigns rectangular dead, capture, and external ranges to the guide, thumb, and arrows as explained previously. If the slider is implemented as a one dimensional groove, then the dead range would be linear, i.e., zero in one dimension. The process then continues to step 652, described below. If the target is not a slider, the process continues to step 642, where the process checks if the target is a menu item or menu heading (or a menu 510, in which preferably no internal ranges are assigned). If so, step 650 is implemented as described above, except that no external ranges are preferably assigned to menu items. In other embodiments, the process can test for other types of graphical objects to which the programmer wishes to assign ranges. If none of the steps 634, 636, 638, 640, or 642 are true, then control passes to step 643, in which the external
and internal force ranges of the target are set to zero. Alternatively, the process can check for a particular graphical object to which predetermined or desired force ranges are assigned. This special object can be designated as such by the programmer or user. If such a special object is provided, then the process can continue to step 652.
After force ranges are assigned to the selected target in any of steps 644, 646, 648, or 650, step 652 determines whether the selected target is special. If not, step 656 assigns force magnitudes and/or force models or force sensation processes to the external and internal forces for the particular target according to the target's type. For example, an icon may be assigned standard, predetermined force magnitudes or force models for its external attractive force and for its internal dead and capture forces. Alternatively, the object can be assigned a "mass" which will influence the magnitudes of the assigned forces. If the target is special, step 654 assigns any special magnitude (or mass) to the target according to any particular instructions or values provided by a programmer or user. This allows individual targets to be assigned desired force magnitudes. After either step 654 or 656, method 616 ends at step 658.
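The type tests of steps 634-650 reduce in code to a switch over target types. The sketch below reuses the illustrative TargetForceMap and RangeShape types from the earlier sketch; the target-type enum is likewise an assumption.

```c
/* Sketch of steps 634-650: assign range shapes by target type.
   Reuses the illustrative TargetForceMap/RangeShape defined above. */
typedef enum {
    T_ICON, T_BUTTON, T_WINDOW, T_RADIO_BUTTON,
    T_SLIDER, T_MENU_ITEM, T_OTHER
} TargetType;

void assign_ranges(TargetType type, TargetForceMap *m)
{
    switch (type) {
    case T_ICON:                        /* step 644: all ranges radial  */
        m->dead_shape = m->capture_shape = m->external_shape = RANGE_RADIAL;
        break;
    case T_BUTTON:
    case T_WINDOW:                      /* step 646: rectangular dead/  */
        m->dead_shape = m->capture_shape = RANGE_RECTANGULAR;
        m->external_shape = RANGE_RADIAL;   /* capture, radial external */
        break;
    case T_RADIO_BUTTON:                /* step 648: circular target    */
        m->dead_shape = m->capture_shape = m->external_shape = RANGE_RADIAL;
        break;
    case T_SLIDER:                      /* step 650: linear dead range  */
        m->dead_shape = RANGE_LINEAR;
        m->capture_shape = m->external_shape = RANGE_RECTANGULAR;
        break;
    case T_MENU_ITEM:                   /* no external range assigned   */
        m->dead_shape = m->capture_shape = RANGE_RECTANGULAR;
        m->external_shape = RANGE_NONE;
        break;
    default:                            /* step 643: ranges set to zero */
        m->dead_shape = m->capture_shape = m->external_shape = RANGE_NONE;
        break;
    }
}
```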
The assigned force ranges, magnitudes and models can also be stored in memory 27 as a "parameter page" by microprocessor 26 or host computer 12. For example, each parameter page might assign different types or ranges of forces to the graphical objects. These parameter pages can be loaded quickly to provide different force environments, or may allow host computer 12 to build another force environment by sending host commands while the processor 26 implements a different force environment. Parameter pages are described in greater detail with respect to U.S.
patent application __________, entitled "Method and Apparatus for Controlling Force Feedback Interface Systems Utilizing a Host Computer," filed 12/1/95 on behalf of Rosenberg et al.
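A "parameter page" can be pictured as nothing more than a swappable table of such mappings. The sketch below assumes the illustrative TargetForceMap above and a fixed-size table; the patent leaves the storage format open.

```c
/* Hypothetical parameter page: a bundle of force mappings that the
   local microprocessor can swap in as a unit to change the whole
   force environment. */
#define MAX_TARGET_TYPES 32

typedef struct {
    TargetForceMap maps[MAX_TARGET_TYPES];
    int            count;
} ParameterPage;

static ParameterPage active_page;     /* the page the force loop reads  */

void load_parameter_page(const ParameterPage *page)
{
    active_page = *page;              /* swap environments in one copy  */
}
```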
FIGURE 25 is a flow diagram illustrating step 620 of Figure 23, in which the target of lowest hierarchy in which the cursor resides is determined. The process begins at 660. By well-known binary tree or set theoretic hierarchy methods, step 662 determines whether the cursor 506 is positioned within the perimeter of a target and whether that target includes other children targets which the cursor is also within. For example, referring to Figure 19, process 620 may determine that the cursor 506 is within window 501, but is also within window 518 of window 501, and that the cursor is additionally within an icon 519 of window 518. The target of lowest hierarchy in which the cursor is positioned would thus be the icon 519.
Step 664 essentially determines whether the cursor 506 is in a region where two targets of the same hierarchical level overlap. This can occur if, for example, two icons or windows of the same (lowest) hierarchical level happen to be displayed on the same portion of the screen. Process 620 queries whether the cursor 506 is in more than one of the lowest level targets. If the cursor 506 is in an overlap region, then step 666 selects the "top" target whose object is displayed on screen 20. The "bottom" target will be partially or totally hidden by the top target. If there is no overlap in step 664, then step 668 selects the lowest level target normally. The process is complete at 669 after step 666 or 668.
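In code, the search of steps 662-668 amounts to a recursive descent that, among overlapping siblings, keeps the one drawn on top. The Target node layout and the back-to-front sibling order are assumptions for this sketch.

```c
/* Sketch of step 620: find the lowest-hierarchy target containing the
   cursor. Siblings are assumed linked in back-to-front draw order, so
   the last containing sibling is the one displayed on top (step 666). */
typedef struct Target Target;
struct Target {
    int     x, y, w, h;               /* screen rectangle of the target */
    Target *children;                 /* first child, or NULL           */
    Target *next;                     /* next sibling (drawn later)     */
};

static int contains(const Target *t, int cx, int cy)
{
    return cx >= t->x && cx < t->x + t->w &&
           cy >= t->y && cy < t->y + t->h;
}

const Target *lowest_target(const Target *t, int cx, int cy)
{
    const Target *top = NULL;
    for (const Target *c = t->children; c != NULL; c = c->next)
        if (contains(c, cx, cy))
            top = c;                  /* later sibling overlaps on top  */
    return top ? lowest_target(top, cx, cy) : t;
}
```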
FIGURE 26 is a flow diagram illustrating step 622 of Figure 23, in which an appropriate force is applied to the user object 34 based on the cursor's position and the target in which the cursor is located. The process begins at 670. Having determined the target of lowest hierarchical level in which the cursor is positioned in step 620, step 672 calculates an internal force for that target containing the cursor 506 (the "lowest target"). The internal force is calculated using a force model or function, such as a force sensation process, given appropriate parameters such as magnitude, duration, coefficients, sensor data, and timing data. Force models, force sensation processes, and parameters were discussed above at length with respect to Figures 4-5 and 9-17. The internal force might be calculated in accordance with the dead region 576 if the cursor is positioned there; or, the internal force might be calculated according to a capture force if the cursor is positioned in capture region 574 or has just passed through the capture region.
In step 674, a total force value is initialized to the internal force of the lowest target that was calculated in step 672. Thus, only the internal force of the lowest hierarchical target in which the cursor is positioned is included in the total force that is to be applied to the user object. The internal forces of any higher level targets are preferably not included in the total force. As an example, consider a cursor 506 inside a window containing only icons. If the cursor 506 is not in an icon's target, the window itself is the lowest hierarchy target in which the cursor 506 resides. Only the internal target force for the window is calculated. If the cursor is moved into an icon, only the internal force from that icon is included in the total force; the internal force of the window is ignored.
Step 675 determines the children targets of the lowest target whose forces will affect the user object. These "external" children are included in the lowest target which the cursor is positioned in, but which are external to the cursor, i.e., the cursor is not positioned in any of the external children.
Thus, the external forces of the external children will affect cursor 506 and user object 34. Any targets included in the external children are preferably not added as a force. If the cursor is in the "desktop" or background target of GUI 500, then the external children are the next highest level targets on the screen. For example, the windows 501, 530 and 540 would be external children when cursor 506 is positioned on the desktop as shown in Figure 19. In alternate embodiments, the external children might also include additional lower level targets within other external children.
In step 676, the process determines whether any external forces of the external children have not been combined into the total force. If so, step 677 selects a previously unvisited external child and computes the external force for the child. The external force from this child is only computed if cursor 506 is within the external range of the child; if the cursor is outside the external range, the external force is set at zero. This saves processing time if the cursor is not in the external range.
Alternatively, if a particular force is assigned to regions outside the external range, that force is computed. The external force is computed according to the particular force model assigned to the external force, such as the attractive force field model described in the examples above.
Step 678 computes the total force by adding the external force from the child of step 677 to the total force to be applied to the user object 34. It should be noted that the directions and magnitudes of the previous total force and the external force are taken into account when determining the direction and magnitude of the resulting total force. For example, if the previous total force had a magnitude of 5 in a left direction, and the external force had a magnitude of 8 in the right direction, then the sum of step 678 would result in a total force of magnitude 3 in the right direction. The process then returns to step 676 to check for another unvisited external child and add an external force to the total force in steps 677 and 678. Steps 676-678 are repeated until external force contributions from all the external children have been combined into the total force.
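The accumulation of steps 672-678 is a signed vector sum, which is why the magnitude-5 leftward and magnitude-8 rightward forces above net to magnitude 3 rightward. The sketch below reuses the Vec2 and Target types from the earlier sketches; the force-model functions are assumed.

```c
/* Sketch of steps 672-678: total force = internal force of the lowest
   target plus the external forces of its external children, summed as
   signed vectors so opposing directions cancel. */
Vec2 internal_force(const Target *t, Vec2 cursor);  /* assumed models; */
Vec2 external_force(const Target *t, Vec2 cursor);  /* external_force
                                  returns zero outside the external range */

Vec2 compute_total_force(const Target *lowest, Vec2 cursor)
{
    Vec2 total = internal_force(lowest, cursor);    /* steps 672-674   */
    for (const Target *c = lowest->children; c != NULL; c = c->next) {
        if (contains(c, (int)cursor.x, (int)cursor.y))
            continue;                 /* cursor inside: not "external"  */
        Vec2 e = external_force(c, cursor);          /* steps 677-678   */
        total.x += e.x;               /* signed sum: magnitude 5 left   */
        total.y += e.y;               /* + 8 right = 3 right            */
    }
    return total;
}
```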
After all the external children forces have been added to the total force, then, from the negative result of step 676, the process checks if a command gesture has been input by the user which would affect the force applied to the user object. For example, such a situation might occur if the inertia forces described above were implemented. These forces would be applied when the user held down a button or provided similar input and dragged an icon or window. If such input has been received, then the total force is adjusted based on the command gesture and the particular conditions or location of the cursor or other factors (such as the velocity of the cursor, mass of the dragged icon, simulated gravity, etc.). The "adjustment" to the total force may be an addition or subtraction to the magnitude of the total force and/or a change in direction, depending on how strong and in what direction the inertia force is applied.
In next step 684, or after a negative result of step 680, the process checks whether another condition that affects the force on the user object is in effect. Such a condition, for example, might be when cursor 506 collides with a "solid" graphical object of GUI 500 as discussed above, if such a feature is being implemented. The forces from such a collision would affect the total force output by actuators 30 on user object 34. If such a condition exists, then the total force is adjusted appropriately in step 686.
After step 686, or after a negative result of step 684, the total force is applied to the user object 34 in step 688 using actuators 30 as explained previously. The process is then complete at 689. In alternative embodiments, steps 680-686 can be performed at other stages in process 622, such as before step 672.
FIGURE 27 is a flow diagram illustrating an example method 690 for applying internal or external forces to user object 34 from a single target, where cursor 506 is positioned near the target's boundary. To simplify the discussion, process 690 assumes that only one target is displayed on screen 20, and thus does not take into account forces from other targets that may influence the force applied to the user object depending on the cursor's position. The steps of adding forces from multiple targets are described above with reference to Figure 26. Also, other necessary steps as described above, such as updating the cursor position, are omitted from process 690 for expediency.
Process 690 begins at step 692, and in step 694 determines whether cursor 506 is in a particular target's capture zone. If so, an optional step 696 determines whether the host computer 12 and/or local microprocessor 26 last detected the cursor 506 in the target dead zone. If this was the case, then the cursor 506 is moving from the dead zone to the external zone. Thus, step 698 is
applied, where the exiting capture force is applied according to the appropriate force sensation process. For example, the exiting capture force in the preferred embodiment is a barrier such as a spring force to prevent the cursor 506 from easily escaping the perimeter of the target. The process is then complete at 702. It should be noted that in the preferred embodiment, the exit and entry capture forces are the same (a barrier force), so that step 696 is not necessary in such an embodiment, and steps 698 and 706 are the same step. Steps 696, 698, and 706 as shown are needed if the entry and exit capture forces are different.
If the last non-capture position of the cursor was not in the dead region, then the cursor is most likely being moved from the external region of the target to the dead region of the target. If this is the case, step 706 applies the entry capture force to the user object 34 as described above with reference to Figure 20b. For example, in an alternate embodiment, the entry capture force can be an attractive force that pulls the cursor 506 and user object 34 toward the center of the target. The process is then complete at 702.
If, in step 694, the present position of the cursor is not in the capture region, then the process checks if the cursor is in the dead region of the target in step 700. If so, then the internal dead region force assigned to the dead region is applied in step 701. In the preferred embodiment, the dead region force is zero and thus step 701 is omitted; however, in other embodiments, a dead region force can be calculated based on a force sensation process such as a damping or texture force. The process is then complete at 702. If the cursor is not in the dead region in step 700, then the process checks if the cursor is in the external region, as defined by the external range of the target, in step 703. If so, step 704 applies the external force of the target to the user object. If the cursor is positioned outside the external range, then the process is complete at 702. Alternatively, if a force is assigned to the target's region outside the external range, then that force can be applied to the user object.
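Process 690 thus reduces to one zone test per update. This sketch assumes the zone predicates and a spring model toward the target center, and reuses the Vec2, Target, and external_force declarations from the earlier sketches; in line with the preferred embodiment, entry and exit capture forces share a single barrier model, so no dead-zone history is consulted.

```c
/* Sketch of process 690: choose the force for a single target from the
   cursor's current zone. Zone predicates and the spring helper are
   assumed; K_CAPTURE is an illustrative stiffness value. */
#define K_CAPTURE 0.8f

int  in_capture_zone(const Target *t, Vec2 c);      /* assumed tests   */
int  in_dead_zone(const Target *t, Vec2 c);
int  in_external_range(const Target *t, Vec2 c);
Vec2 spring_toward_center(const Target *t, Vec2 c, float k);

Vec2 single_target_force(const Target *t, Vec2 cursor)
{
    Vec2 zero = {0.0f, 0.0f};
    if (in_capture_zone(t, cursor))     /* steps 694/698/706: one      */
        return spring_toward_center(t, cursor, K_CAPTURE); /* barrier  */
    if (in_dead_zone(t, cursor))        /* step 701: zero force in the */
        return zero;                    /* preferred embodiment        */
    if (in_external_range(t, cursor))   /* steps 703/704: e.g. an      */
        return external_force(t, cursor);   /* attractive field       */
    return zero;                        /* outside the external range  */
}
```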
The force feedback sensations of the present invention are advantageously provided in a GUI 500. These forces can both assist a user in selecting and performing operating system functions, and can inform the user of the various graphical objects displayed by the GUI. In particular, those users who suffer from spastic hand motion and other dexterity-debilitating conditions are greatly advantaged by the addition of these force feedback sensations in a GUI environment. Formerly difficult tasks such as maneuvering a cursor onto an icon become much easier using the force feedback of the present invention by implementing attractive forces, damping forces, and other forces that assist a user in hand-eye coordination.
While this invention has been described in terms of several preferred embodiments, it is contemplated that alterations, modifications and permutations thereof will become apparent to those skilled in the art upon a reading of the specification and study of the drawings. For example, many different types of forces can be applied to the user object 34 in accordance with different graphical objects or regions appearing on the computer's display screen. Also, many varieties of graphical objects in a GUI can be associated with particular forces to assist the user in selecting the objects or to notify the user that the cursor has moved into a particular region or object. In addition, many types of user objects can be provided to transmit the forces to the user, such as a joystick, a mouse, a trackball, a stylus, or other objects. Furthermore, certain terminology has been used for the purposes of descriptive clarity, and not to limit the present invention. It is therefore intended that the following appended claims include all such alterations, modifications and permutations as fall within the true spirit and scope of the present invention.
What is claimed is:

Claims (107)

- AMENDED CLAIMS
[received by the International Bureau on 6 September 1997 (06.09.97);
original claims 1-92 replaced by new claims 1-107 (15 pages)]
1. A method for providing force feedback to users interacting with a graphical user interface environment of a computer system, the method comprising steps of:
receiving an indication of movement of a physical object that is manipulated by a user, said physical object being included in a human interface device that outputs said indication to said computer system;
moving a user-controlled graphical object within a graphical user interface including at least one graphical target, said movement of said graphical object based on said indication of said movement of said physical object, wherein said user-controlled graphical object and said graphical user interface are displayed on a display screen connected to said computer system, and wherein said graphical user interface allows said user to interface with operating system functions implemented by said computer system when said user-controlled graphical object interacts with graphical targets associated with said operating system functions;
outputting a signal from said computer system to said interface device to command said interface device to apply a force sensation to said physical object, wherein said force sensation is associated with said interaction of said user-controlled graphical object with at least one of said targets associated with said operating system functions of said graphical user interface, and wherein said force sensation applied to said physical object is determined, at least in part, by a location of said cursor in said graphical user interface with respect to at least one of said targets located in said graphical user interface.
2. A method as recited in claim 1 wherein said user-controlled graphical object is a cursor.
3. A method as recited in claim 2 wherein said force sensation applied to said physical object is determined, at least in part, additionally by at least one of:
a velocity of said physical object with respect to a ground of said physical object, an acceleration of said physical object with respect to a ground of said physical object, and a history of previous locations of said cursor in said graphical user interface with respect to targets located in said graphical user interface.
4. A method as recited in claim 3 wherein said force sensation on said physical object assists said user to select said target associated with said force sensation.
5. A method as recited in claim 3 wherein said force sensation on said physical object informs said user of other targets in said graphical user interface which can be manipulated to interface said user with at least one operating system function.
6. A computer readable medium including program instructions for performing steps for a process of providing force feedback to the user of a graphical user interface displayed by a computer system, the steps comprising:
determining a location of a user-controlled cursor within a graphical user interface displayed on a display screen of a computer system, said cursor being controlled by said user by manipulating a physical object of an interface device, wherein a position control paradigm is implemented on said computer system such that movement of said cursor in said graphical user interface approximately corresponds to movement of said physical object with reference to an origin of said interface device;
determining which targets displayed within said graphical user interface are associated with target forces that affect said physical object based on said location of said user-controlled cursor, wherein said targets allow said user to interface with operating system functions implemented by said graphical user interface;
providing a signal to apply a resulting force to said physical object based on said forces associated with said targets which affect forces on said physical object.
7. A computer readable medium as recited in claim 6 wherein at least one of said targets is associated with at least two different target force sensations depending on said location of said cursor with respect to said at least one target.
8. A computer readable medium as recited in claim 7 wherein said two different target force sensations include an internal target force sensation and an external target force sensation, wherein said internal target force sensation is applied to said physical object when said cursor is located within said target, and said external target force sensation is applied to said physical object when said cursor is located outside said target.
9. A computer readable medium as recited in claim 8 wherein said internal target force sensation includes a capture force sensation when said cursor is located near a boundary of said target, and wherein said internal target force sensation includes a dead region force sensation when said cursor is located within said boundary and near a center of said target.
10. A computer readable medium as recited in claim 8 wherein said targets of said GUI
are ordered in a hierarchy, and wherein said step of determining which targets may affect force sensations on said physical object includes determining a cursor target of lowest hierarchy in which said cursor is located.
11. A computer readable medium as recited in claim 10 wherein an internal target force sensation associated with said cursor target affects said physical object and wherein no other internal force sensations of other targets affect said physical object.
12. A computer readable medium as recited in claim 11 wherein other targets at a hierarchical level lower than said cursor target have external force sensations that affect said physical object, wherein said cursor is not located in said other targets.
13. A computer readable medium as recited in claim 12 wherein said resulting force sensation applied to said physical object is a sum of said internal target force sensation of said cursor target and said external target force sensations of said other targets at said hierarchy lower than said lowest hierarchical level.
14. A computer readable medium as recited in claim 13 wherein said other targets are included within said cursor target and are at a lower hierarchical level than said cursor target.
15. A computer readable medium as recited in claim 12 wherein when said cursor is located in at least two of said cursor targets, only the target displayed on top of said other targets of said at least two targets is selected as said cursor target.
16. A computer readable medium as recited in claim 6 further comprising a step of mapping force sensations to targets provided by said graphical user interface.
17. A computer readable medium as recited in claim 16 wherein said step of mapping force sensations includes assigning ranges to said targets, said ranges defining external regions around said targets which affect said external force sensation on said physical object when said cursor is located within said external regions.
18. A computer readable medium as recited in claim 9 wherein said capture force sensation includes a snap-over force sensation, said snap-over force sensation providing a resistance to said movement of said cursor when said cursor moves from within said target to outside said target.
19. A computer readable medium as recited in claim 9 wherein said external target force sensation includes a force field, said force field providing an attractive force sensation to said physical object to draw said cursor toward said target.
20. A computer readable medium as recited in claim 9 wherein said external target force sensation includes a force field, said force field providing a repulsive force sensation to said physical object to push said cursor away from said target.
21. A computer readable medium as recited in claim 9 wherein said external target force sensation includes two intersecting groove force sensations, said groove force sensations biasing said cursor to move toward said target and locking said cursor on said target.
22. A computer readable medium as recited in claim 21 further comprising a step of removing said groove force sensations after said cursor is locked on said target when said target is selected by said user.
23. A computer readable medium as recited in claim 16 wherein said target is one of an icon, button, slider, menu item on a pull-down menu, and window.
24. A computer system for providing force feedback to a user of a graphical user interface displayed by said computer system, comprising:
means for determining a location of a user-controlled cursor within a graphical user interface displayed on a display screen of a computer system, said cursor being controlled by said user by manipulating a physical object of an interface device, and wherein targets are provided in said graphical user interface which are associated with operating system functions implemented by said graphical user interface, said cursor being movable to select said targets to implement said associated operating system function;
means for determining which targets displayed within said graphical user interface are associated with target forces affecting said physical object based on said location of said user-controlled cursor; and means for providing a signal to apply a resulting force sensation to said physical object based on said forces associated with said targets which affect forces on said physical object.
25. A computer system as recited in claim 24 wherein said means for determining said targets includes means for determining a cursor target of lowest hierarchy in which said cursor is located.
26. A computer system as recited in claim 25 wherein said cursor target is a menu item in a pull-down menu.
27. A computer system as recited in claim 26 wherein said menu item is associated with a snap-over force at a boundary of said menu item, said snap-over force providing an obstructive force to said movement of said cursor when said cursor is moved from one menu item to another menu item.
28. A computer system as recited in claim 26 wherein said menu item is associated with a snap-to force at a center area of said menu item, said snap-to force providing an attractive force that keeps said cursor at said center area of said menu item.
29. A computer system as recited in claim 25 wherein said cursor target is a slider having a thumb that can be moved within a linear guide.
30. A computer system as recited in claim 29 wherein said slider is associated with a snap-to force provided at a middle line along a length and halfway across a width of said guide of said slider, said snap-to force providing an attractive force that keeps said cursor at said middle line of said guide.
31. A computer system as recited in claim 25 wherein said cursor target is an icon, wherein said icon is associated with an attractive force associated with a surrounding region having a predetermined size, said attractive force pulling said physical object and said cursor when said cursor is moved within said region.
32. A computer system as recited in claim 29 wherein said icon is associated with a capture force associated with a boundary of said icon, said capture force providing an obstructive force to said physical object when said cursor is moved out of said icon.
33. A computer system as recited in claim 29 wherein said icon is associated with a dead region force associated with an internal region of said icon, said dead region force providing a predetermined damping force to said physical object when said cursor is moved in said internal region.
34. A computer system as recited in claim 52 wherein said target is a graphical button.
35. A system for providing force feedback to a user manipulating an interface device, the system comprising:
a host computer system for receiving an input control signal describing a location of a user manipulable physical object in a degree of freedom and for providing a host output control signal, wherein said host computer system updates the location of a user-controlled cursor within a graphical user interface displayed on a display screen of said host computer system, said cursor being updated based on said input control signal, and wherein said host computer system displays a number of targets in said graphical user interface and determines which targets are associated with target forces that affect said physical object based on said location of said user-controlled cursor, wherein said targets allow said user to interface with operating system functions implemented by said graphical user interface, and wherein said host output control signal commands a resulting force to said physical object based on said forces associated with said targets which affect forces on said physical object;
a microprocessor local to said interface device and separate from said host computer system for receiving said host output control signal from said host computer system and providing a processor output control signal, wherein said microprocessor is operative in a local control process to provide said processor output control signal to said actuator in response to said position and motion of said object independently of said host output control signal;
an actuator for receiving said processor output control signal and providing a force along said degree of freedom in accordance with said processor output control signal to said physical object coupled to said actuator; and a sensor for detecting motion of said user manipulable object along said degree of freedom and outputting said input control signal to said microprocessor, said input control signal including information representative of the position and motion of said object, and wherein said microprocessor provides said input control signal to said host computer system.
36. A system as recited in claim 35 wherein said host output signal is a high level command from said host computer system, and wherein said processor implements one of a plurality of local routines selected in accordance with said high level command to implement said local control process.
37. A system as recited in claim 35 further comprising a memory device accessible by said local processor, wherein said local processor stores a spatial representation of said targets in said memory device such that said processor may detect when said target forces affect said physical object.
38. A system as recited in claim 37 wherein said memory device includes a permanent memory storage for storing predetermined target forces associated with said targets.
39. A method for providing force feedback for graphical objects in a game implemented on a computer system, the method comprising the steps of:
displaying a user-controlled first graphical object on a display screen of a computer system, said graphical object moving on said display screen during a game in response to manipulations of a physical object of an interface device by a user, said interface device being coupled to said computer system;
displaying a second graphical object on said display screen;
when said first graphical object collides with said second graphical object on said screen:
a) displaying a compression of said first object where said second object contacts said first object, wherein said first object has a defined simulated compliance and said second object has a defined simulated mass;
b) applying a force to said physical object manipulated by said user in at least one degree of freedom provided by said interface device, said force being applied in a direction corresponding to the direction of said compression and having a magnitude in accordance with said simulated masses of said first and second graphical objects.
40. A method as recited in claim 39 wherein said force has a magnitude also in accordance with a simulated compliance of said first graphical object.
41. A method as recited in claim 40 wherein said force has a magnitude also in accordance with simulated velocities of said first graphical object and said second graphical object.
42. A method as recited in claim 40 wherein said force has a magnitude also in accordance with a simulated gravity acting on said second graphical object.
43. A method as recited in claim 40 wherein said first graphical object is a linear segment, and wherein said second graphical object is a circular object.
44. A method as recited in claim 40 wherein said second graphical object moves on said display screen during a game in response to manipulations of a second physical object of a second interface device by a second user, said second interface device being coupled to said computer system.
45. A method as recited in claim 44 wherein said first and second interface devices input sensor signals indicating motion of said first physical object and said second physical object, respectively, wherein said user is a first user and said force command to said first interface device is based on sensor signals output from said second interface device, and wherein a force command output to said second interface device is based on sensor signals output from said first interface device.
46. A method as recited in claim 40 wherein said second graphical object moves on said display screen during a game in response to manipulations of a second physical object of a second interface device by a second user, said second interface device being coupled to a second computer system coupled to said computer system through a network interface.
47. A method as recited in claim 40 further comprising a step of changing said simulated compliance of said first graphical object when an input command is received from said user operating an input device on said interface device.
48. A method as recited in claim 40 further comprising displaying a goal object on said display screen and wherein said user moves said first graphical object to block said second graphical object from moving into said goal object.
49. A method as recited in claim 43 further comprising changing a displayed orientation of said linear segment according to input data received from said user operating said input device.
50. A method as recited in claim 49 wherein said physical object is a joystick handle, and wherein said input data received from said user includes a spin of said joystick handle in a rotary degree of freedom.
51. A method as recited in claim 50 wherein said interface device provides two linear degrees of freedom to said joystick handle in addition to said rotary degree of freedom.
52. A method as recited in claim 40 wherein said first graphical object represents a paddle and said second graphical object represents a ball.
53. A method for providing force feedback for interacting simulated objects in a simulation implemented on a computer system, the method comprising the steps of:
displaying a user-controlled first simulated object and a second simulated object on a display device of a computer system, said first simulated object moving on said display device during a simulation in response to manipulations of a physical object of an interface device by a user, said interface device being coupled to said computer system;
determining when said first simulated object engages said second simulated object within said simulation;
displaying said determined engagement of said first simulated object with said second simulated object, wherein said first simulated object has a predetermined simulated compliance and said second object has a predetermined simulated mass; and outputting a force command to said interface device to apply a force to said physical object manipulated by said user in at least one degree of freedom provided by said interface device, said force being applied in the direction of said engagement of said second simulated object with said first simulated object and having a magnitude in accordance with said simulated mass of said second simulated object.
54. A method as recited in claim 52 wherein a simulated compliance of said first simulated object additionally affects said magnitude of said force.
55. A method as recited in claim 53 wherein simulated velocities of said first simulated object and said second object additionally affect said magnitude of said force.
56. A method as recited in claim 53 wherein a simulated gravity acting on said simulated objects additionally affects said magnitude of said force.
57. A method as recited in claim 52 wherein said first simulated object elongates at a point of impact with said second simulated object in accordance with said simulated compliance of said first simulated object.
58. An interface device for use with a host computer displaying graphical targets and a cursor, at least one of said targets associated with an operating system function implemented by said host computer, said interface device providing force feedback sensations to a user in coordination with displayed interactions between said cursor and said targets, said interface device comprising:
a user manipulatable object grasped and moved by a user;
a support mechanism coupled to said user manipulatable object and which supports said user manipulatable object with respect to an origin while allowing a plurality of degrees of freedom in the motion of said user manipulatable object with respect to said origin;
a local processor separate from said host computer, said local processor executing a local process simultaneously with execution of a host application by said host computer, said local processor coupled to said host computer by a communication interface, said local process involving the execution of a plurality of local force routines;
an actuator electrically coupled to said local processor and physically coupled to said user manipulatable object for imparting a controllable force upon said user manipulatable object in at least one of said degrees of freedom, said degree of freedom in which said force is imparted being a forced degree of freedom;
a sensor apparatus coupled to said local processor, said sensor apparatus providing said local processor with a locative signal responsive to and corresponding with manipulation of said user manipulatable object along said forced degree of freedom;
a user adjustable switch apparatus coupled to said local processor, said switch apparatus providing a switch state signal to said local processor representing the state of said switch;
a plurality of host commands received by said local processor over said communication interface, each of said host commands including a command identifier and at least one of said host commands including a command parameter;
a plurality of force routines stored in memory accessible to said local processor, wherein said force routines control a magnitude of said force produced by said actuator as a function of said locative signal, wherein a particular one of said force routines is locally executed in response to a particular one of said received host commands, said particular force routine being selected based in part upon said command identifier of said particular host command, and wherein said force routine uses values derived from a command parameter of said particular host command, said values used to coordinate force sensations caused by said force routine with a displayed interaction between said cursor and at least one of said targets displayed by said host computer, wherein said local process is implemented on said local processor, said local process:
enabling communication between said interface device and said host computer, reporting a representation of said locative signal to said host computer, wherein said host computer updates a displayed location of said cursor in response to said representation, executing one of said force routines in response to at least one of said received host commands, wherein said executed force routine is coordinated with said displayed interaction between said cursor and at least one of said targets, and reporting a representation of said switch state signal to said host computer, wherein said host computer executes an operating system function in response to a particular switch state when said cursor interacts with at least one of said targets associated with said operating system function.
59. An interface device as recited in claim 58 wherein at least one of said force routines is a damping routine that causes said actuator to output a damping force upon said user manipulatable object that is a function of the velocity of said user manipulatable object, wherein said damping routine is executed by said local processor in correlation with a display by said host computer of said cursor dragging said target.
60. An interface device as recited in claim 59 wherein said target is one of a thumb of a slider, a window, and a graphical icon.
61. An interface device as recited in claim 59 wherein a damping coefficient is used by said damping routine and is derived from a parameter in one of said received host commands, wherein said damping coefficient controls an intensity of said damping force.
62. An interface device as recited in claim 59 wherein said intensity of said damping force is correlated to a storage size of a file represented by said target.
63. An interface device as recited in claim 59 wherein said intensity of said damping force is correlated to a size or type of a file represented by said target.
64. An interface device as recited in claim 59 wherein said damping force is applied to said user manipulatable object in response to a state of said switch state signal.
65. An interface device as recited in claim 58 wherein at least one of said force routines is an inertia routine that creates an inertia force upon said user manipulatable object that is a function of an acceleration of said user manipulatable object, said inertia routine being executed by said local processor in correlation with said displayed cursor dragging a displayed target.
66. An interface device as recited in claim 65 wherein said target is one of a thumb of a slider, a window, and a graphical icon.
67. An interface device as recited in claim 65 wherein a mass value used by said inertia routine is derived from a parameter received from said host command and affects a magnitude of said inertia force.
68. An interface device as recited in claim 67 wherein said magnitude of said inertia force is correlated to a displayed size of said target.
69. An interface device as recited in claim 67 wherein a magnitude of said inertia force is correlated to a size or type of a file represented by or displayed in said target.
70. An interface device as recited in claim 65 wherein said inertia force is imparted upon said user manipulatable object in response to a state of said switch state signal.
71. An interface device as recited in claim 58 wherein at least one of said force routines is a stretch routine that causes said actuator to generate a force upon said user manipulatable object whose magnitude increases with displacement of said user manipulatable object from a defined location, said stretch routine being executed by said local processor in correlation with a display of said cursor modifying a size of a displayed target.
72. An interface device as recited in claim 71 wherein said stretch routine is a spring routine that creates a linear relationship of force output versus user object displacement using a stiffness parameter received from said host computer as a linear proportionality.
73. An interface device as recited in claim 71 wherein said displayed target is a rectangular region and said host computer displays said cursor adjusting a size of said rectangular region.
74. An interface device as recited in claim 72 wherein said rectangular region is a window.
75. An interface device as recited in claim 71 wherein said defined location is derived from at least one parameter received by said local processor from said host computer.
76. An interface device as recited in claim 58 wherein at least one of said force routines is a field routine that causes said actuator to generate an attractive force upon said user manipulatable object having a magnitude based on a field equation, said field equation using a distance of said user manipulatable object from a field origin to generate said attractive force.
77. An interface device as recited in claim 76 wherein said field routine is executed by said local processor in correlation with display of a cursor and a target such that said origin is associated with a displayed location of said target and said generated force acts to draw said user manipulable object and said cursor towards said target.
78. An interface device as recited in claim 77 wherein said cursor is attracted to multiple targets simultaneously, said field routine using a distance between said user manipulatable object and multiple field origins such that a resultant force on said user manipulatable object is a summation of attractive forces associated with each of said multiple field origins.
79. An interface device as recited in claim 77 wherein a location of said field origin is derived from at least one parameter received from said host computer.
80. An interface device as recited in claim 77 wherein a region is defined around a particular field origin, wherein said attractive force associated with said field origin is not imparted on said user manipulatable object when said cursor is outside said region.
81. An interface device as recited in claim 58 wherein at least one of said force routines is a jolt routine that causes a jolt force for a specified magnitude and duration upon said user manipulatable object, a magnitude and duration of said jolt force being derived from parameters received from said host computer.
82. An interface device as recited in claim 81 wherein said jolt routine is executed in coordination with said cursor entering or exiting a target.
83. An interface device as recited in claim 82 wherein said target is one of a window, icon, and menu item.
84. An interface device as recited in claim 81 wherein said jolt force has a direction, said direction being derived from at least one parameter received from said host computer.
85. An interface device as recited in claim 58 wherein at least one of said force routines is a capture force routine that resists motion of said user manipulatable object out of a defined boundary by producing an opposing force on said user manipulatable object that is a function of a displacement of said user manipulatable object past said boundary, wherein said capture force routine is executed by said local processor in correlation with a display of interaction between said cursor and target such that cursor motion past a defined boundary within said target is resisted.
86. An interface device as recited in claim 85, wherein said displacement of said user manipulatable object past said boundary corresponds to displacement of said cursor past a displayed boundary.
87. An interface device as recited in claim 85 wherein said capture force routine implements a dead region having a size based upon parameters received from said host, wherein no force is output on said user manipulatable object while said user manipulable object is in said dead region.
88. An interface device as recited in claim 85 wherein said capture force routine provides an opposing force having a magnitude scaled by a hardness parameter received from said host computer.
89. An interface device as recited in claim 58 wherein a location of said boundary is derived in part from a location parameter received from said host computer.
90. An interface device as recited in claim 85 wherein said opposing force is eliminated when said user manipulatable object is moved beyond said boundary by more than a defined snap distance.
91. An interface device as recited in claim 90 wherein said snap distance is derived from a parameter received from said host computer.
92. An interface device as recited in claim 85 wherein said interaction includes motion of said cursor from a location inside a displayed boundary of said target to a location outside said displayed boundary of said target.
93. An interface device as recited in claim 92 wherein said target is one of an icon, window, menu item, and button.
94. An interface device as recited in claim 85 wherein said capture force is eliminated in response to a change in said switch state signal.
95. An interface device as recited in claim 85 wherein said capture force routine uses a history of locative signal values represented in memory accessible to said local processor to determine if said user manipulatable object has crossed said boundary.
96. An interface device as recited in claim 58 wherein a plurality of said force routines are active at once such that a resultant force upon said user manipulatable object is a sum of forces imparted by each of said force routines.
97. A method for providing force feedback to a computer user controlling a location of a displayed cursor, wherein said force feedback corresponds with a displayed interaction between said cursor and a displayed target, said user using an interface device including a user manipulatable object grasped and moved by said user, a support mechanism which supports said object with respect to an origin while allowing a plurality of degrees of freedom in the motion of said object with respect to said origin, an actuator for imparting an electronically modulated force upon said object along a degree of freedom, and a sensor apparatus for providing a locative signal responsive to and corresponding with manipulation of said object along said degree of freedom, the method comprising steps of:
displaying a cursor by a computer system, said cursor having a location correlated to said user manipulatable object as indicated by said locative signal;
providing a hierarchy of targets displayed by said computer system, wherein a first target is displayed as contained within a second target, said first target being a child and said second target being a parent; providing an internal capture force on said user manipulatable object when said cursor is inside a boundary of a particular target, wherein said internal capture force resists movements of said user manipulatable object that would cause said cursor to move from inside said boundary to outside said boundary of said particular target; and outputting a signal from said computer system to said interface device to command said interface device to cause said actuator to apply said internal capture force upon said user manipulatable object.
98. A method as recited in claim 97 wherein said interface device provides a switch state signal representing the state of said switch when said user manipulates a switch apparatus included in said interface device, and wherein said internal capture force is eliminated in response to a change in said switch state.
99. A method as recited in claim 98 further producing an attractive external force on said user manipulatable object when said cursor is outside a boundary of a particular target, wherein said attractive external force is applied on said user manipulatable object such that said cursor is moved towards said particular target.
100. A method as recited in claim 99 wherein said attractive external force is removed when said cursor moves from outside a boundary associated with said target to inside said boundary.
101. A method as recited in claim 99 wherein application of said attractive external force and said internal capture force associated with a particular target is dependent upon a location of said cursor and a hierarchy including said target and at least one other target.
102. A method as recited in claim 99 wherein said attractive external force associated with a particular target is applied when said cursor is inside a parent target of said particular target, and wherein said attractive external force is not applied when said cursor is outside a parent target of said particular target.
103. A method as recited in claim 97 further providing a dead-region within a boundary associated with a given target such that an internal capture force associated with said target is not applied within said dead-region.
104. A method as recited in claim 103 wherein a damping force is applied within said dead-region to provide a feel sensation for said dead region.
105. A method as recited in claim 103 wherein a texture force is applied within said dead-region to provide a feel sensation for said dead region.
106. A method as recited in claim 97 wherein said targets are defined locations upon a web page.
107. A method as recited in claim 97 further comprising a step of performing an operating system function in response to a user command gesture, wherein said internal capture force applied to said object assists said user in selecting said operating system function using said user command gesture.
CA002239125A 1995-12-01 1996-11-26 Method and apparatus for providing force feedback for a graphical user interface Abandoned CA2239125A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US08/566,282 1995-12-01
US08/566,282 US5734373A (en) 1993-07-16 1995-12-01 Method and apparatus for controlling force feedback interface systems utilizing a host computer
US08/571,606 US6219032B1 (en) 1995-12-01 1995-12-13 Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface
US08/571,606 1995-12-13

Publications (1)

Publication Number Publication Date
CA2239125A1 true CA2239125A1 (en) 1997-06-12

Family

ID=27074136

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002239125A Abandoned CA2239125A1 (en) 1995-12-01 1996-11-26 Method and apparatus for providing force feedback for a graphical user interface

Country Status (5)

Country Link
US (3) US6219032B1 (en)
EP (2) EP0864144B1 (en)
CA (1) CA2239125A1 (en)
DE (1) DE69635902T2 (en)
WO (1) WO1997021160A2 (en)

Families Citing this family (447)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030161889A1 (en) * 1984-03-16 2003-08-28 Reid Robert H. Vaccines against diseases caused by enteropathogenic organisms using antigens encapsulated within biodegradable-biocompatible microspheres
US5889670A (en) * 1991-10-24 1999-03-30 Immersion Corporation Method and apparatus for tactilely responsive user interface
US6433771B1 (en) 1992-12-02 2002-08-13 Cybernet Haptic Systems Corporation Haptic device attribute control
US6437771B1 (en) * 1995-01-18 2002-08-20 Immersion Corporation Force feedback device including flexure member between actuator and user object
US5734373A (en) 1993-07-16 1998-03-31 Immersion Human Interface Corporation Method and apparatus for controlling force feedback interface systems utilizing a host computer
US6941543B1 (en) 1995-05-30 2005-09-06 Roy-G-Biv Corporation Motion control system and method
US20100131081A1 (en) * 1995-05-30 2010-05-27 Brown David W Systems and methods for motion control
US20060206219A1 (en) * 1995-05-30 2006-09-14 Brown David W Motion control systems and methods
US5691897A (en) 1995-05-30 1997-11-25 Roy-G-Biv Corporation Motion control systems
US5959613A (en) 1995-12-01 1999-09-28 Immersion Corporation Method and apparatus for shaping force signals for a force feedback device
US5825308A (en) 1996-11-26 1998-10-20 Immersion Human Interface Corporation Force feedback interface having isotonic and isometric functionality
US5956484A (en) * 1995-12-13 1999-09-21 Immersion Corporation Method and apparatus for providing force feedback over a computer network
US6028593A (en) * 1995-12-01 2000-02-22 Immersion Corporation Method and apparatus for providing simulated physical interactions within computer generated environments
US6147674A (en) 1995-12-01 2000-11-14 Immersion Corporation Method and apparatus for designing force sensations in force feedback computer applications
US6219032B1 (en) * 1995-12-01 2001-04-17 Immersion Corporation Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface
US6161126A (en) 1995-12-13 2000-12-12 Immersion Corporation Implementing force feedback over the World Wide Web and other computer networks
US6300936B1 (en) * 1997-11-14 2001-10-09 Immersion Corporation Force feedback system including multi-tasking graphical host environment and interface device
SE519661C2 (en) * 1996-02-23 2003-03-25 Immersion Corp Pointing devices and method for marking graphic details on a display with sensory feedback upon finding said detail
US7225404B1 (en) * 1996-04-04 2007-05-29 Massachusetts Institute Of Technology Method and apparatus for determining forces to be applied to a user through a haptic interface
US6111577A (en) 1996-04-04 2000-08-29 Massachusetts Institute Of Technology Method and apparatus for determining forces to be applied to a user through a haptic interface
US7811090B2 (en) 1996-05-08 2010-10-12 Gaumard Scientific Company, Inc. Interactive education system for teaching patient care
US6503087B1 (en) * 1996-05-08 2003-01-07 Gaumard Scientific, Inc. Interactive education system for teaching patient care
US7192284B2 (en) * 2000-08-17 2007-03-20 Gaumard Scientific Company, Inc. Interactive education system for teaching patient care
US20090148822A1 (en) 2007-12-07 2009-06-11 Gaumard Scientific Company, Inc. Interactive Education System for Teaching Patient Care
US8696362B2 (en) * 1996-05-08 2014-04-15 Gaumard Scientific Company, Inc. Interactive education system for teaching patient care
US7976312B2 (en) * 1996-05-08 2011-07-12 Gaumard Scientific Company, Inc. Interactive education system for teaching patient care
US8016598B2 (en) 1996-05-08 2011-09-13 Gaumard Scientific Company, Inc. Interactive education system for teaching patient care
US6443735B1 (en) * 1996-05-08 2002-09-03 Gaumard Scientific, Inc. Computerized education system for teaching patient care
US6374255B1 (en) 1996-05-21 2002-04-16 Immersion Corporation Haptic authoring
US6084587A (en) * 1996-08-02 2000-07-04 Sensable Technologies, Inc. Method and apparatus for generating and interfacing with a haptic virtual reality environment
US6411276B1 (en) 1996-11-13 2002-06-25 Immersion Corporation Hybrid control of haptic feedback for host computer and interface device
US7489309B2 (en) * 1996-11-26 2009-02-10 Immersion Corporation Control knob with multiple degrees of freedom and force feedback
US6128006A (en) * 1998-03-26 2000-10-03 Immersion Corporation Force feedback mouse wheel and other control wheels
US6956558B1 (en) * 1998-03-26 2005-10-18 Immersion Corporation Rotary force feedback wheels for remote control devices
US6686911B1 (en) 1996-11-26 2004-02-03 Immersion Corporation Control knob with control modes and force feedback
US6292170B1 (en) 1997-04-25 2001-09-18 Immersion Corporation Designing compound force sensations for computer applications
US6285351B1 (en) 1997-04-25 2001-09-04 Immersion Corporation Designing force sensations for computer applications including sounds
US7091948B2 (en) * 1997-04-25 2006-08-15 Immersion Corporation Design of force sensations for haptic feedback computer interfaces
US6292174B1 (en) 1997-08-23 2001-09-18 Immersion Corporation Enhanced cursor control using limited-workspace force feedback devices
US6252579B1 (en) 1997-08-23 2001-06-26 Immersion Corporation Interface device and method for providing enhanced cursor control with force feedback
US6882354B1 (en) 1997-09-17 2005-04-19 Sun Microsystems, Inc. Scroll bars with user feedback
US20010032278A1 (en) * 1997-10-07 2001-10-18 Brown Stephen J. Remote generation and distribution of command programs for programmable devices
US6211861B1 (en) 1998-06-23 2001-04-03 Immersion Corporation Tactile mouse device
US8020095B2 (en) * 1997-11-14 2011-09-13 Immersion Corporation Force feedback system including multi-tasking graphical host environment
US6088019A (en) * 1998-06-23 2000-07-11 Immersion Corporation Low cost force feedback device with actuator for non-primary axis
US6243078B1 (en) 1998-06-23 2001-06-05 Immersion Corporation Pointing device with forced feedback button
US6448977B1 (en) 1997-11-14 2002-09-10 Immersion Corporation Textures and other spatial sensations for a relative haptic interface device
US6191796B1 (en) 1998-01-21 2001-02-20 Sensable Technologies, Inc. Method and apparatus for generating and interfacing with rigid and deformable surfaces in a haptic virtual reality environment
IL123073A0 (en) 1998-01-26 1998-09-24 Simbionix Ltd Endoscopic tutorial system
US6219034B1 (en) * 1998-02-23 2001-04-17 Kristofer E. Elbing Tactile computer interface
US6697043B1 (en) 1999-12-21 2004-02-24 Immersion Corporation Haptic interface device and actuator assembly providing linear haptic sensations
US6184868B1 (en) 1998-09-17 2001-02-06 Immersion Corp. Haptic feedback control devices
US6429846B2 (en) 1998-06-23 2002-08-06 Immersion Corporation Haptic feedback for touchpads and other touch controls
JP4032404B2 (en) * 1998-07-10 2008-01-16 フジノン株式会社 Operating device
US6421048B1 (en) * 1998-07-17 2002-07-16 Sensable Technologies, Inc. Systems and methods for interacting with virtual objects in a haptic virtual reality environment
US6417638B1 (en) 1998-07-17 2002-07-09 Sensable Technologies, Inc. Force reflecting haptic interface
US6552722B1 (en) * 1998-07-17 2003-04-22 Sensable Technologies, Inc. Systems and methods for sculpting virtual objects in a haptic virtual reality environment
US6950534B2 (en) * 1998-08-10 2005-09-27 Cybernet Systems Corporation Gesture-controlled interfaces for self-service machines and other applications
US20010008561A1 (en) * 1999-08-10 2001-07-19 Paul George V. Real-time object tracking system
US7036094B1 (en) 1998-08-10 2006-04-25 Cybernet Systems Corporation Behavior recognition system
US7038667B1 (en) * 1998-10-26 2006-05-02 Immersion Corporation Mechanisms for control knobs and other interface devices
US10820949B2 (en) * 1999-04-07 2020-11-03 Intuitive Surgical Operations, Inc. Medical robotic system with dynamically adjustable slave manipulator characteristics
US6424356B2 (en) 1999-05-05 2002-07-23 Immersion Corporation Command of force sensations in a force feedback system using force effect suites
DE20022244U1 (en) * 1999-07-01 2001-11-08 Immersion Corp Control of vibrotactile sensations for haptic feedback devices
US6717568B1 (en) * 1999-09-10 2004-04-06 Sony Computer Entertainment Inc. Method of controlling the movement of a position indicating item, storage medium on which a program implementing said method is stored, and electronic device
DE20080209U1 (en) 1999-09-28 2001-08-09 Immersion Corp Control of haptic sensations for interface devices with vibrotactile feedback
US7050955B1 (en) * 1999-10-01 2006-05-23 Immersion Corporation System, method and data structure for simulated interaction with graphical objects
US20100131078A1 (en) * 1999-10-27 2010-05-27 Brown David W Event driven motion systems
US8032605B2 (en) * 1999-10-27 2011-10-04 Roy-G-Biv Corporation Generation and distribution of motion commands over a distributed network
DE19958443C2 (en) * 1999-12-03 2002-04-25 Siemens Ag operating device
US6822635B2 (en) 2000-01-19 2004-11-23 Immersion Corporation Haptic interface for laptop computers and other portable devices
US6924787B2 (en) * 2000-04-17 2005-08-02 Immersion Corporation Interface for controlling a graphical image
US6724400B1 (en) * 2000-05-06 2004-04-20 Novint Technologies, Inc. Human-computer interface incorporating personal and application domains
US6833826B1 (en) * 2000-05-06 2004-12-21 Novint Technologies, Inc. Human-computer interface
JP2003534620A (en) 2000-05-24 2003-11-18 イマージョン コーポレイション Haptic device and method using electroactive polymer
EP1182539B1 (en) * 2000-08-16 2009-03-25 Sony Deutschland GmbH Haptic device control
US7976313B2 (en) * 2000-08-17 2011-07-12 Gaumard Scientific Company, Inc. Interactive education system for teaching patient care
US6867770B2 (en) * 2000-12-14 2005-03-15 Sensable Technologies, Inc. Systems and methods for voxel warping
US6442451B1 (en) 2000-12-28 2002-08-27 Robotic Workspace Technologies, Inc. Versatile robot control system
AU2002251731A1 (en) * 2001-01-04 2002-07-16 Roy-G-Biv Corporation Systems and methods for transmitting motion control data
US6958752B2 (en) 2001-01-08 2005-10-25 Sensable Technologies, Inc. Systems and methods for three-dimensional modeling
US20020101457A1 (en) * 2001-01-31 2002-08-01 Microsoft Corporation Bezel interface for small computing devices
US20020135615A1 (en) * 2001-01-31 2002-09-26 Microsoft Corporation Overlaid display for electronic devices
WO2002071241A1 (en) * 2001-02-09 2002-09-12 Roy-G-Biv Corporation Event management systems and methods for the distribution of motion control commands
US7904194B2 (en) * 2001-02-09 2011-03-08 Roy-G-Biv Corporation Event management systems and methods for motion control systems
US7202851B2 (en) * 2001-05-04 2007-04-10 Immersion Medical Inc. Haptic interface for palpation simulation
US20020171675A1 (en) * 2001-05-15 2002-11-21 International Business Machines Corporation Method and system for graphical user interface (GUI) widget having user-selectable mass
US20020171689A1 (en) * 2001-05-15 2002-11-21 International Business Machines Corporation Method and system for providing a pre-selection indicator for a graphical user interface (GUI) widget
JP3556203B2 (en) * 2001-05-18 2004-08-18 株式会社ソニー・コンピュータエンタテインメント Display device and display method
US7409441B2 (en) * 2001-05-18 2008-08-05 Sony Computer Entertainment Inc. Display apparatus for accessing desired web site
IL143255A (en) 2001-05-20 2015-09-24 Simbionix Ltd Endoscopic ultrasonography simulation
DE10126421B4 (en) * 2001-05-31 2005-07-14 Caa Ag Vehicle computer system and method for controlling a cursor for a vehicle computer system
US7024723B2 (en) * 2001-06-15 2006-04-11 Headwaters R&D, Inc. Duster cleaning member for a vacuum cleaner
US6937033B2 (en) * 2001-06-27 2005-08-30 Immersion Corporation Position sensor with resistive element
US7056123B2 (en) 2001-07-16 2006-06-06 Immersion Corporation Interface apparatus with cable-driven force feedback and grounded actuators
JP4187182B2 (en) * 2001-07-27 2008-11-26 株式会社バンダイナムコゲームス Image generation system, program, and information storage medium
US20030069998A1 (en) * 2001-08-31 2003-04-10 Brown David W. Motion services protocol accessible through uniform resource locator (URL)
US6703550B2 (en) * 2001-10-10 2004-03-09 Immersion Corporation Sound data output and manipulation using haptic feedback
US7665021B2 (en) * 2001-11-09 2010-02-16 Adobe Systems Incorporated System and method for script based event timing
EP1456830A1 (en) * 2001-11-14 2004-09-15 The Henry M. Jackson Foundation Multi-tactile display haptic interface device
US11202676B2 (en) 2002-03-06 2021-12-21 Mako Surgical Corp. Neural monitor-based dynamic haptics
US8996169B2 (en) 2011-12-29 2015-03-31 Mako Surgical Corp. Neural monitor-based dynamic haptics
US8010180B2 (en) 2002-03-06 2011-08-30 Mako Surgical Corp. Haptic guidance system and method
US7747311B2 (en) 2002-03-06 2010-06-29 Mako Surgical Corp. System and method for interactive haptic positioning of a medical device
JP4061105B2 (en) * 2002-03-29 2008-03-12 アルプス電気株式会社 Haptic device
CA2385224C (en) * 2002-05-07 2012-10-02 Corel Corporation Dockable drop-down dialogs
JP3986885B2 (en) * 2002-05-16 2007-10-03 アルプス電気株式会社 Haptic device
US20040008222A1 (en) * 2002-07-09 2004-01-15 Silverlynk, Corporation User intuitive easy access computer system
US6925357B2 (en) 2002-07-25 2005-08-02 Intouch Health, Inc. Medical tele-robotic system
US20040162637A1 (en) 2002-07-25 2004-08-19 Yulun Wang Medical tele-robotic system with a master remote station with an arbitrator
US20040036680A1 (en) * 2002-08-26 2004-02-26 Mark Davis User-interface features for computers with contact-sensitive displays
US7331868B2 (en) 2002-09-13 2008-02-19 Igt Wagering gaming device providing physical stimulation responses to various components of the gaming device
US7789756B2 (en) 2002-09-13 2010-09-07 Igt Wagering gaming device having simulated control of movement of game functional elements
US20040090460A1 (en) * 2002-11-12 2004-05-13 Hideya Kawahara Method and apparatus for updating a User Interface for a computer system based on a physics model
US20040113931A1 (en) * 2002-12-05 2004-06-17 Anderson Thomas G. Human-computer interfaces incorporating haptics and path-based interaction
AU2003296334A1 (en) 2002-12-08 2004-06-30 Immersion Corporation Haptic communication devices
US7779166B2 (en) * 2002-12-08 2010-08-17 Immersion Corporation Using haptic effects to enhance information content in communications
US20060136631A1 (en) * 2002-12-08 2006-06-22 Immersion Corporation, A Delaware Corporation Methods and systems for providing haptic messaging to handheld communication devices
US8830161B2 (en) 2002-12-08 2014-09-09 Immersion Corporation Methods and systems for providing a virtual touch haptic effect to handheld communication devices
US20060136630A1 (en) * 2002-12-08 2006-06-22 Immersion Corporation, A Delaware Corporation Methods and systems for providing haptic messaging to handheld communication devices
US8059088B2 (en) 2002-12-08 2011-11-15 Immersion Corporation Methods and systems for providing haptic messaging to handheld communication devices
JP4143437B2 (en) * 2003-02-20 2008-09-03 アルプス電気株式会社 Haptic input device
US7275292B2 (en) * 2003-03-07 2007-10-02 Avago Technologies Wireless Ip (Singapore) Pte. Ltd. Method for fabricating an acoustical resonator on a substrate
JP4167099B2 (en) * 2003-03-19 2008-10-15 アルプス電気株式会社 Image display device
US7850456B2 (en) 2003-07-15 2010-12-14 Simbionix Ltd. Surgical simulation device, system and method
US6836982B1 (en) 2003-08-14 2005-01-04 Caterpillar Inc Tactile feedback system for a remotely controlled work machine
US8027349B2 (en) 2003-09-25 2011-09-27 Roy-G-Biv Corporation Database event driven motion systems
JP4148084B2 (en) * 2003-09-25 2008-09-10 株式会社デンソー Display operation system
US20070022194A1 (en) * 2003-09-25 2007-01-25 Brown David W Database event driven motion systems
US20060064503A1 (en) * 2003-09-25 2006-03-23 Brown David W Data routing systems and methods
US7411576B2 (en) * 2003-10-30 2008-08-12 Sensable Technologies, Inc. Force reflecting haptic interface
US7382378B2 (en) 2003-10-30 2008-06-03 Sensable Technologies, Inc. Apparatus and methods for stenciling an image
US7542026B2 (en) * 2003-11-03 2009-06-02 International Business Machines Corporation Apparatus method and system for improved feedback of pointing device event processing
JP4220355B2 (en) * 2003-11-10 2009-02-04 アルプス電気株式会社 Haptic input device
JP4180491B2 (en) * 2003-11-10 2008-11-12 アルプス電気株式会社 Haptic input device
WO2005048086A2 (en) * 2003-11-17 2005-05-26 Roy-G-Biv Corporation Command processing systems and methods
US20060066569A1 (en) * 2003-12-08 2006-03-30 Immersion Corporation, A Delaware Corporation Methods and systems for providing haptic messaging to handheld communication devices
US7813836B2 (en) 2003-12-09 2010-10-12 Intouch Technologies, Inc. Protocol for a remotely controlled videoconferencing robot
US7889209B2 (en) * 2003-12-10 2011-02-15 Sensable Technologies, Inc. Apparatus and methods for wrapping texture onto the surface of a virtual object
US7626589B2 (en) * 2003-12-10 2009-12-01 Sensable Technologies, Inc. Haptic graphical user interface for adjusting mapped texture
US7982711B2 (en) * 2003-12-19 2011-07-19 Immersion Corporation Haptic profiling system and method
US7742036B2 (en) * 2003-12-22 2010-06-22 Immersion Corporation System and method for controlling haptic devices having multiple operational modes
US7791588B2 (en) 2003-12-22 2010-09-07 Immersion Corporation System and method for mapping instructions associated with haptic feedback
US7149596B2 (en) * 2004-01-13 2006-12-12 Sensable Technologies, Inc. Apparatus and methods for modifying a model of an object to enforce compliance with a manufacturing constraint
US8639819B2 (en) * 2004-02-05 2014-01-28 Nokia Corporation Ad-hoc connection between electronic devices
JP4173114B2 (en) * 2004-02-23 2008-10-29 株式会社国際電気通信基礎技術研究所 Experience drawing device
US20050204438A1 (en) 2004-02-26 2005-09-15 Yulun Wang Graphical interface for a remote presence system
JP4220416B2 (en) * 2004-03-05 2009-02-04 アルプス電気株式会社 Haptic input device
CA2462620A1 (en) * 2004-03-30 2005-09-30 Jvl Corporation Pool video game
FR2870617B1 (en) * 2004-05-19 2006-07-28 France Telecom CREATION AND PLAYBACK OF AN AUGMENTED AMBIENCE
US7624355B2 (en) * 2004-05-27 2009-11-24 Baneth Robin C System and method for controlling a user interface
US8077963B2 (en) 2004-07-13 2011-12-13 Yulun Wang Mobile robot with a head-based movement mapping scheme
US20060064640A1 (en) * 2004-09-23 2006-03-23 Forlines Clifton L Method for editing graphics objects with multi-level input devices
US7728823B2 (en) 2004-09-24 2010-06-01 Apple Inc. System and method for processing raw data of track pad device
US7719522B2 (en) * 2004-09-24 2010-05-18 Apple Inc. Raw data track pad device and system
US7388454B2 (en) * 2004-10-01 2008-06-17 Avago Technologies Wireless Ip Pte Ltd Acoustic resonator performance enhancement using alternating frame structure
EP1805585B1 (en) * 2004-10-08 2017-08-16 Immersion Corporation Haptic feedback for button and scrolling action simulation in touch input devices
US9471332B2 (en) * 2004-10-19 2016-10-18 International Business Machines Corporation Selecting graphical component types at runtime
US8981876B2 (en) 2004-11-15 2015-03-17 Avago Technologies General Ip (Singapore) Pte. Ltd. Piezoelectric resonator structures and electrical filters having frame elements
US7202560B2 (en) 2004-12-15 2007-04-10 Avago Technologies Wireless Ip (Singapore) Pte. Ltd. Wafer bonding of micro-electro mechanical systems to active circuitry
US7791434B2 (en) 2004-12-22 2010-09-07 Avago Technologies Wireless Ip (Singapore) Pte. Ltd. Acoustic resonator performance enhancement using selective metal etch and having a trench in the piezoelectric
US20100312129A1 (en) 2005-01-26 2010-12-09 Schecter Stuart O Cardiovascular haptic handle system
US20060164396A1 (en) * 2005-01-27 2006-07-27 Microsoft Corporation Synthesizing mouse events from input device events
US7369013B2 (en) * 2005-04-06 2008-05-06 Avago Technologies Wireless Ip Pte Ltd Acoustic resonator performance enhancement using filled recessed region
US7825903B2 (en) * 2005-05-12 2010-11-02 Immersion Corporation Method and apparatus for providing haptic effects to a touch panel
US20060277466A1 (en) * 2005-05-13 2006-12-07 Anderson Thomas G Bimodal user interaction with a simulated object
US7618413B2 (en) * 2005-06-22 2009-11-17 Boston Scientific Scimed, Inc. Medical device control system
US8839095B2 (en) * 2005-08-19 2014-09-16 Adobe Systems Incorporated User interface to define and/or communicate space between objects
US9552686B2 (en) 2005-09-02 2017-01-24 Igt Video and mechanical spinning bonus wheel
US20070057951A1 (en) * 2005-09-12 2007-03-15 Microsoft Corporation View animation for scaling and sorting
EP1764674B1 (en) 2005-09-14 2012-06-13 Volkswagen AG Input device
US9198728B2 (en) 2005-09-30 2015-12-01 Intouch Technologies, Inc. Multi-camera mobile teleconferencing platform
DE102005047650A1 (en) 2005-10-05 2007-04-12 Volkswagen Ag Entry device for e.g. land vehicle, has controller for adjusting slider corresponding to touch movement at touch screen, and actuator for deflecting touch screen when slider reaches preset position or is moved to preset distance
US8565839B2 (en) 2005-10-13 2013-10-22 Abbott Medical Optics Inc. Power management for wireless devices
US8380126B1 (en) 2005-10-13 2013-02-19 Abbott Medical Optics Inc. Reliable communications for wireless devices
US8187883B2 (en) * 2005-10-21 2012-05-29 Wisconsin Alumni Research Foundation Method and system for delivering nucleic acid into a target cell
DE102006029506B4 (en) 2005-10-28 2018-10-11 Volkswagen Ag input device
US7809805B2 (en) * 2007-02-28 2010-10-05 Facebook, Inc. Systems and methods for automatically locating web-based social network members
US9411428B2 (en) * 2006-01-31 2016-08-09 Hillcrest Laboratories, Inc. 3D pointing devices with keyboards
US7479685B2 (en) * 2006-03-10 2009-01-20 Avago Technologies General Ip (Singapore) Pte. Ltd. Electronic device on substrate with cavity and mitigated parasitic leakage path
US20080229254A1 (en) * 2006-03-24 2008-09-18 Ervin-Dawson Warner Method and system for enhanced cursor control
JP5330640B2 (en) * 2006-05-09 2013-10-30 任天堂株式会社 GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME PROCESSING METHOD
EP1857912A1 (en) * 2006-05-16 2007-11-21 Research In Motion Limited Tactile feedback system and method for a mobile communication device having a trackball
JP2009537231A (en) 2006-05-19 2009-10-29 マコ サージカル コーポレーション Method and apparatus for controlling a haptic device
US8849679B2 (en) 2006-06-15 2014-09-30 Intouch Technologies, Inc. Remote controlled robot system that provides medical images
DE102006037725B4 (en) 2006-06-22 2018-05-30 Volkswagen Ag Motor vehicle with an input device
CA2871409C (en) * 2006-07-10 2015-05-19 Microsoft Corporation Trackball system and method for a mobile data processing device
US20090019370A1 (en) * 2006-09-14 2009-01-15 Joseph Pally System for controlling objects in a recursive browser system: forcefield
US7750593B2 (en) * 2006-10-26 2010-07-06 Honeywell International Inc. Active human-machine interface system without a force sensor
US9295765B2 (en) 2006-11-09 2016-03-29 Abbott Medical Optics Inc. Surgical fluidics cassette supporting multiple pumps
US10959881B2 (en) 2006-11-09 2021-03-30 Johnson & Johnson Surgical Vision, Inc. Fluidics cassette for ocular surgical system
US8491528B2 (en) 2006-11-09 2013-07-23 Abbott Medical Optics Inc. Critical alignment of fluidics cassettes
US8414534B2 (en) 2006-11-09 2013-04-09 Abbott Medical Optics Inc. Holding tank devices, systems, and methods for surgical fluidics cassette
US9522221B2 (en) 2006-11-09 2016-12-20 Abbott Medical Optics Inc. Fluidics cassette for ocular surgical system
US9345462B2 (en) 2006-12-01 2016-05-24 Boston Scientific Scimed, Inc. Direct drive endoscopy systems and methods
CN101627411B (en) 2007-01-16 2014-03-19 西姆博尼克斯有限公司 Device and method for simulated image to guide medical procedure
US8543338B2 (en) 2007-01-16 2013-09-24 Simbionix Ltd. System and method for performing computerized simulations for image-guided procedures using a patient specific model
US20100113152A1 (en) * 2007-01-30 2010-05-06 Ron Shmuel Computer games based on mental imagery
US20080238635A1 (en) * 2007-03-28 2008-10-02 Gunnar Klinghult Force feedback for input devices
US9160783B2 (en) 2007-05-09 2015-10-13 Intouch Technologies, Inc. Robot system that operates through a network firewall
WO2008144077A1 (en) 2007-05-18 2008-11-27 Boston Scientific Scimed, Inc. Drive systems and methods of use
US10596032B2 (en) 2007-05-24 2020-03-24 Johnson & Johnson Surgical Vision, Inc. System and method for controlling a transverse phacoemulsification system with a footpedal
US10485699B2 (en) 2007-05-24 2019-11-26 Johnson & Johnson Surgical Vision, Inc. Systems and methods for transverse phacoemulsification
US10363166B2 (en) 2007-05-24 2019-07-30 Johnson & Johnson Surgical Vision, Inc. System and method for controlling a transverse phacoemulsification system using sensed data
US20090015568A1 (en) * 2007-07-12 2009-01-15 Koski David A Method and Apparatus for Implementing Slider Detents
US20090015557A1 (en) * 2007-07-12 2009-01-15 Koski David A Responsiveness Control Method for Pointing Device Movement With Respect to a Graphical User Interface
US8462112B2 (en) * 2007-07-12 2013-06-11 Apple Inc. Responsiveness control system for pointing device movement with respect to a graphical user interface
US8692767B2 (en) * 2007-07-13 2014-04-08 Synaptics Incorporated Input device and method for virtual trackball operation
US20090031257A1 (en) * 2007-07-26 2009-01-29 Motorola, Inc. Method and system of attractive links
US10342701B2 (en) 2007-08-13 2019-07-09 Johnson & Johnson Surgical Vision, Inc. Systems and methods for phacoemulsification with vacuum based pumps
US9569088B2 (en) * 2007-09-04 2017-02-14 Lg Electronics Inc. Scrolling method of mobile terminal
US8260985B2 (en) 2007-10-05 2012-09-04 Pano Logic, Inc. Universal serial bus assistance engine
US20090104964A1 (en) * 2007-10-17 2009-04-23 Igt Gaming system, gaming device and gaming method providing player physical activation of the symbol generator
US9128525B2 (en) 2008-01-04 2015-09-08 Tactus Technology, Inc. Dynamic tactile interface
US9552065B2 (en) 2008-01-04 2017-01-24 Tactus Technology, Inc. Dynamic tactile interface
US9423875B2 (en) 2008-01-04 2016-08-23 Tactus Technology, Inc. Dynamic tactile interface with exhibiting optical dispersion characteristics
US8553005B2 (en) 2008-01-04 2013-10-08 Tactus Technology, Inc. User interface system
US8243038B2 (en) 2009-07-03 2012-08-14 Tactus Technologies Method for adjusting the user interface of a device
US9612659B2 (en) 2008-01-04 2017-04-04 Tactus Technology, Inc. User interface system
US8456438B2 (en) 2008-01-04 2013-06-04 Tactus Technology, Inc. User interface system
US8547339B2 (en) 2008-01-04 2013-10-01 Tactus Technology, Inc. System and methods for raised touch screens
US8922502B2 (en) 2008-01-04 2014-12-30 Tactus Technology, Inc. User interface system
US9720501B2 (en) 2008-01-04 2017-08-01 Tactus Technology, Inc. Dynamic tactile interface
US9274612B2 (en) 2008-01-04 2016-03-01 Tactus Technology, Inc. User interface system
US8154527B2 (en) 2008-01-04 2012-04-10 Tactus Technology User interface system
US8922510B2 (en) 2008-01-04 2014-12-30 Tactus Technology, Inc. User interface system
US8570295B2 (en) 2008-01-04 2013-10-29 Tactus Technology, Inc. User interface system
US8704790B2 (en) 2010-10-20 2014-04-22 Tactus Technology, Inc. User interface system
US9280224B2 (en) 2012-09-24 2016-03-08 Tactus Technology, Inc. Dynamic tactile interface and methods
US8179377B2 (en) 2009-01-05 2012-05-15 Tactus Technology User interface system
US9052790B2 (en) 2008-01-04 2015-06-09 Tactus Technology, Inc. User interface and methods
US8199124B2 (en) 2009-01-05 2012-06-12 Tactus Technology User interface system
US8179375B2 (en) * 2008-01-04 2012-05-15 Tactus Technology User interface system and method
US9013417B2 (en) 2008-01-04 2015-04-21 Tactus Technology, Inc. User interface system
US8587541B2 (en) 2010-04-19 2013-11-19 Tactus Technology, Inc. Method for actuating a tactile interface layer
US9760172B2 (en) 2008-01-04 2017-09-12 Tactus Technology, Inc. Dynamic tactile interface
US9430074B2 (en) 2008-01-04 2016-08-30 Tactus Technology, Inc. Dynamic tactile interface
US9588683B2 (en) 2008-01-04 2017-03-07 Tactus Technology, Inc. Dynamic tactile interface
US9298261B2 (en) 2008-01-04 2016-03-29 Tactus Technology, Inc. Method for actuating a tactile interface layer
US8947383B2 (en) 2008-01-04 2015-02-03 Tactus Technology, Inc. User interface system and method
US9063627B2 (en) 2008-01-04 2015-06-23 Tactus Technology, Inc. User interface and methods
US9557915B2 (en) 2008-01-04 2017-01-31 Tactus Technology, Inc. Dynamic tactile interface
US20090221196A1 (en) * 2008-02-29 2009-09-03 Blair Charles S Torsional control boat throttle system
BRPI0804355A2 (en) * 2008-03-10 2009-11-03 Lg Electronics Inc terminal and control method
US10875182B2 (en) 2008-03-20 2020-12-29 Teladoc Health, Inc. Remote presence system mounted to operating room hardware
WO2009120992A2 (en) * 2008-03-27 2009-10-01 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter system input device
US9161817B2 (en) 2008-03-27 2015-10-20 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter system
US8343096B2 (en) 2008-03-27 2013-01-01 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter system
US8684962B2 (en) 2008-03-27 2014-04-01 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter device cartridge
US20090248042A1 (en) * 2008-03-27 2009-10-01 Kirschenman Mark B Model catheter input device
US8317745B2 (en) * 2008-03-27 2012-11-27 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter rotatable device cartridge
US9241768B2 (en) * 2008-03-27 2016-01-26 St. Jude Medical, Atrial Fibrillation Division, Inc. Intelligent input device controller for a robotic catheter system
US8317744B2 (en) 2008-03-27 2012-11-27 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter manipulator assembly
US8641664B2 (en) * 2008-03-27 2014-02-04 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter system with dynamic response
US8179418B2 (en) 2008-04-14 2012-05-15 Intouch Technologies, Inc. Robotic based health care system
US8170241B2 (en) 2008-04-17 2012-05-01 Intouch Technologies, Inc. Mobile tele-presence system with a microphone system
US7732977B2 (en) * 2008-04-30 2010-06-08 Avago Technologies Wireless Ip (Singapore) Transceiver circuit for film bulk acoustic resonator (FBAR) transducers
US7855618B2 (en) * 2008-04-30 2010-12-21 Avago Technologies Wireless Ip (Singapore) Pte. Ltd. Bulk acoustic resonator electrical impedance transformers
KR101630864B1 (en) * 2008-05-09 2016-06-16 코닌클리케 필립스 엔.브이. Method and system for conveying an emotion
JP4717905B2 (en) * 2008-05-28 2011-07-06 アルプス電気株式会社 Operation feeling imparting type input device
US20090303175A1 (en) * 2008-06-05 2009-12-10 Nokia Corporation Haptic user interface
US20090313020A1 (en) * 2008-06-12 2009-12-17 Nokia Corporation Text-to-speech user interface control
US20090312645A1 (en) * 2008-06-16 2009-12-17 Boston Scientific Scimed, Inc. Methods and Devices for Accessing Anatomic Structures
US20090327886A1 (en) * 2008-06-27 2009-12-31 Microsoft Corporation Use of secondary factors to analyze user intention in gui element activation
US9193065B2 (en) 2008-07-10 2015-11-24 Intouch Technologies, Inc. Docking system for a tele-presence robot
US9842192B2 (en) 2008-07-11 2017-12-12 Intouch Technologies, Inc. Tele-presence robot system with multi-cast features
US8327294B2 (en) * 2008-07-17 2012-12-04 International Business Machines Corporation Method and system to reduce workload and skills required in usage of mouse or other pointing devices
US8151188B2 (en) * 2008-07-23 2012-04-03 General Electric Company Intelligent user interface using on-screen force feedback and method of use
US8340819B2 (en) 2008-09-18 2012-12-25 Intouch Technologies, Inc. Mobile videoconferencing robot system with network adaptive driving
US8749495B2 (en) * 2008-09-24 2014-06-10 Immersion Corporation Multiple actuation handheld device
MX2011003854A (en) 2008-10-10 2011-06-09 Internet Services Llc Haptic output device for use with haptic encoded media.
JP4631957B2 (en) * 2008-10-17 2011-02-16 株式会社デンソー Navigation device
US8996165B2 (en) 2008-10-21 2015-03-31 Intouch Technologies, Inc. Telepresence robot with a camera boom
US9795507B2 (en) 2008-11-07 2017-10-24 Abbott Medical Optics Inc. Multifunction foot pedal
WO2010054146A1 (en) * 2008-11-07 2010-05-14 Abbott Medical Optics Inc. Method for programming foot pedal settings and controlling performance through foot pedal variation
US10219940B2 (en) 2008-11-07 2019-03-05 Johnson & Johnson Surgical Vision, Inc. Automatically pulsing different aspiration levels to an ocular probe
CA2941766A1 (en) 2008-11-07 2010-05-14 Abbott Medical Optics Inc. Automatically switching different aspiration levels and/or pumps to an ocular probe
AU2009313413B2 (en) 2008-11-07 2015-01-22 Johnson & Johnson Surgical Vision, Inc. Controlling of multiple pumps
AU2009313411B2 (en) 2008-11-07 2015-03-12 Johnson & Johnson Surgical Vision, Inc. Adjustable foot pedal control for ophthalmic surgery
EP2373266B1 (en) 2008-11-07 2020-04-29 Johnson & Johnson Surgical Vision, Inc. Surgical cassette apparatus
US8463435B2 (en) 2008-11-25 2013-06-11 Intouch Technologies, Inc. Server connectivity control for tele-presence robot
US9138891B2 (en) 2008-11-25 2015-09-22 Intouch Technologies, Inc. Server connectivity control for tele-presence robot
US20100146395A1 (en) * 2008-12-08 2010-06-10 Gustavo De Los Reyes Method and System for Exploiting Interactions Via A Virtual Environment
KR20100075009A (en) 2008-12-24 2010-07-02 삼성전자주식회사 Method and apparatus for providing gui
US20100167820A1 (en) * 2008-12-29 2010-07-01 Houssam Barakat Human interface device
JP5773884B2 (en) * 2008-12-31 2015-09-02 セント・ジュード・メディカル・エイトリアル・フィブリレーション・ディヴィジョン・インコーポレーテッド Robot catheter system input device
US9588684B2 (en) 2009-01-05 2017-03-07 Tactus Technology, Inc. Tactile interface for a computing device
WO2010085007A1 (en) * 2009-01-21 2010-07-29 한국과학기술연구원 Vibrotactile device and vibration method using the same
US8849680B2 (en) 2009-01-29 2014-09-30 Intouch Technologies, Inc. Documentation through a remote presence robot
US9652030B2 (en) * 2009-01-30 2017-05-16 Microsoft Technology Licensing, Llc Navigation of a virtual plane using a zone of restriction for canceling noise
US10007340B2 (en) * 2009-03-12 2018-06-26 Immersion Corporation Systems and methods for interfaces featuring surface-based haptic effects
US9746923B2 (en) * 2009-03-12 2017-08-29 Immersion Corporation Systems and methods for providing features in a friction display wherein a haptic effect is configured to vary the coefficient of friction
US9927873B2 (en) 2009-03-12 2018-03-27 Immersion Corporation Systems and methods for using textures in graphical user interface widgets
US9696803B2 (en) 2009-03-12 2017-07-04 Immersion Corporation Systems and methods for friction displays and additional haptic effects
US9874935B2 (en) * 2009-03-12 2018-01-23 Immersion Corporation Systems and methods for a texture engine
US10564721B2 (en) * 2009-03-12 2020-02-18 Immersion Corporation Systems and methods for using multiple actuators to realize textures
US9256282B2 (en) * 2009-03-20 2016-02-09 Microsoft Technology Licensing, Llc Virtual object manipulation
US9492317B2 (en) 2009-03-31 2016-11-15 Abbott Medical Optics Inc. Cassette capture mechanism
US8897920B2 (en) 2009-04-17 2014-11-25 Intouch Technologies, Inc. Tele-presence robot system with software modularity, projector and laser pointer
US20100271312A1 (en) * 2009-04-22 2010-10-28 Rachid Alameh Menu Configuration System and Method for Display on an Electronic Device
US20120005693A1 (en) * 2010-01-08 2012-01-05 Cypress Semiconductor Corporation Development, Programming, and Debugging Environment
WO2010134649A1 (en) * 2009-05-19 2010-11-25 한국과학기술연구원 Vibration haptic mobile apparatus and operating method thereof
US20100306825A1 (en) 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method for facilitating user interaction with a simulated object associated with a physical location
US9383823B2 (en) 2009-05-29 2016-07-05 Microsoft Technology Licensing, Llc Combining gestures beyond skeletal
US8248185B2 (en) 2009-06-24 2012-08-21 Avago Technologies Wireless Ip (Singapore) Pte. Ltd. Acoustic resonator structure comprising a bridge
US8902023B2 (en) 2009-06-24 2014-12-02 Avago Technologies General Ip (Singapore) Pte. Ltd. Acoustic resonator structure having an electrode with a cantilevered portion
JP2012532384A (en) * 2009-07-03 2012-12-13 タクタス テクノロジー User interface expansion system
DE102009032068A1 (en) 2009-07-07 2011-01-13 Volkswagen Ag Method for providing user interface in motor vehicle, involves assigning virtual mass to graphic object, and exercising haptic feedback when shifting graphic object on actuator, where feedback depends on virtual mass of graphic object
US9330497B2 (en) 2011-08-12 2016-05-03 St. Jude Medical, Atrial Fibrillation Division, Inc. User interface devices for electrophysiology lab diagnostic and therapeutic equipment
US9439736B2 (en) 2009-07-22 2016-09-13 St. Jude Medical, Atrial Fibrillation Division, Inc. System and method for controlling a remote medical device guidance system in three-dimensions using gestures
WO2011123669A1 (en) 2010-03-31 2011-10-06 St. Jude Medical, Atrial Fibrillation Division, Inc. Intuitive user interface control for remote catheter navigation and 3d mapping and visualization systems
US8384755B2 (en) 2009-08-26 2013-02-26 Intouch Technologies, Inc. Portable remote presence robot
US8421761B2 (en) * 2009-08-26 2013-04-16 General Electric Company Imaging multi-modality touch pad interface systems, methods, articles of manufacture, and apparatus
US11399153B2 (en) * 2009-08-26 2022-07-26 Teladoc Health, Inc. Portable telepresence apparatus
FR2950170B1 (en) * 2009-09-16 2011-10-14 Airbus Operations Sas METHOD FOR GENERATING INTERFACE CONFIGURATION FILES FOR COMPUTERS OF AN AVIONIC PLATFORM
US8665227B2 (en) * 2009-11-19 2014-03-04 Motorola Mobility Llc Method and apparatus for replicating physical key function with soft keys in an electronic device
US8448092B2 (en) * 2009-11-25 2013-05-21 International Business Machines Corporation Positional effects in a three-dimensional desktop environment
JP5471393B2 (en) * 2009-12-11 2014-04-16 株式会社日本自動車部品総合研究所 Input device
EP2517089A4 (en) 2009-12-21 2016-03-09 Tactus Technology User interface system
US9298262B2 (en) 2010-01-05 2016-03-29 Tactus Technology, Inc. Dynamic tactile interface
US9243316B2 (en) 2010-01-22 2016-01-26 Avago Technologies General Ip (Singapore) Pte. Ltd. Method of fabricating piezoelectric material with selected c-axis orientation
US8796904B2 (en) 2011-10-31 2014-08-05 Avago Technologies General Ip (Singapore) Pte. Ltd. Bulk acoustic resonator comprising piezoelectric layer and inverse piezoelectric layer
US11154981B2 (en) * 2010-02-04 2021-10-26 Teladoc Health, Inc. Robot user interface for telepresence robot system
US8619035B2 (en) 2010-02-10 2013-12-31 Tactus Technology, Inc. Method for assisting user input to a device
US20110199302A1 (en) * 2010-02-16 2011-08-18 Microsoft Corporation Capturing screen objects using a collision volume
US8433828B2 (en) 2010-02-26 2013-04-30 Apple Inc. Accessory protocol for touch screen device accessibility
US8670017B2 (en) 2010-03-04 2014-03-11 Intouch Technologies, Inc. Remote presence system including a cart that supports a robot face and an overhead camera
WO2011112984A1 (en) 2010-03-11 2011-09-15 Tactus Technology User interface system
US10343283B2 (en) 2010-05-24 2019-07-09 Intouch Technologies, Inc. Telepresence robot system that can be accessed by a cellular phone
US10808882B2 (en) 2010-05-26 2020-10-20 Intouch Technologies, Inc. Tele-robotic system with a robot face placed on a chair
US20110304559A1 (en) * 2010-06-11 2011-12-15 Research In Motion Limited Portable electronic device including touch-sensitive display and method of changing tactile feedback
US9132352B1 (en) 2010-06-24 2015-09-15 Gregory S. Rabin Interactive system and method for rendering an object
US9051045B2 (en) * 2010-07-28 2015-06-09 Woodward Mpc, Inc. Indirect drive active control column
US8593488B1 (en) * 2010-08-11 2013-11-26 Apple Inc. Shape distortion
US8636519B2 (en) * 2010-10-05 2014-01-28 Biosense Webster (Israel) Ltd. Simulation of an invasive procedure
WO2012054781A1 (en) 2010-10-20 2012-04-26 Tactus Technology User interface system and method
EP2453428A1 (en) * 2010-11-12 2012-05-16 EADS Construcciones Aeronauticas, S.A. Simulation methods and systems for the control panels of complex systems
JP5305250B2 (en) * 2010-11-12 2013-10-02 株式会社デンソー Vehicle operation input device
US20120139841A1 (en) * 2010-12-01 2012-06-07 Microsoft Corporation User Interface Device With Actuated Buttons
US9264664B2 (en) 2010-12-03 2016-02-16 Intouch Technologies, Inc. Systems and methods for dynamic bandwidth allocation
US8689123B2 (en) 2010-12-23 2014-04-01 Microsoft Corporation Application reporting in an application-selectable user interface
US9323250B2 (en) 2011-01-28 2016-04-26 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
EP2668008A4 (en) 2011-01-28 2018-01-24 Intouch Technologies, Inc. Interfacing with a mobile telepresence robot
US8962443B2 (en) 2011-01-31 2015-02-24 Avago Technologies General Ip (Singapore) Pte. Ltd. Semiconductor device having an airbridge and method of fabricating the same
US9048812B2 (en) 2011-02-28 2015-06-02 Avago Technologies General Ip (Singapore) Pte. Ltd. Bulk acoustic wave resonator comprising bridge formed within piezoelectric layer
US9425764B2 (en) 2012-10-25 2016-08-23 Avago Technologies General Ip (Singapore) Pte. Ltd. Acoustic resonator having composite electrodes with integrated lateral features
US9203374B2 (en) 2011-02-28 2015-12-01 Avago Technologies General Ip (Singapore) Pte. Ltd. Film bulk acoustic resonator comprising a bridge
US9083302B2 (en) 2011-02-28 2015-07-14 Avago Technologies General Ip (Singapore) Pte. Ltd. Stacked bulk acoustic resonator comprising a bridge and an acoustic reflector along a perimeter of the resonator
US9136818B2 (en) 2011-02-28 2015-09-15 Avago Technologies General Ip (Singapore) Pte. Ltd. Stacked acoustic resonator comprising a bridge
US9154112B2 (en) 2011-02-28 2015-10-06 Avago Technologies General Ip (Singapore) Pte. Ltd. Coupled resonator filter comprising a bridge
US9148117B2 (en) 2011-02-28 2015-09-29 Avago Technologies General Ip (Singapore) Pte. Ltd. Coupled resonator filter comprising a bridge and frame elements
US9444426B2 (en) 2012-10-25 2016-09-13 Avago Technologies General Ip (Singapore) Pte. Ltd. Acoustic resonator having integrated lateral feature and temperature compensation feature
US8575820B2 (en) 2011-03-29 2013-11-05 Avago Technologies General Ip (Singapore) Pte. Ltd. Stacked bulk acoustic resonator
US8942828B1 (en) 2011-04-13 2015-01-27 Stuart Schecter, LLC Minimally invasive cardiovascular support system with true haptic coupling
US10769739B2 (en) 2011-04-25 2020-09-08 Intouch Technologies, Inc. Systems and methods for management of information among medical providers and facilities
US20140139616A1 (en) 2012-01-27 2014-05-22 Intouch Technologies, Inc. Enhanced Diagnostics for a Telepresence Robot
US9098611B2 (en) 2012-11-26 2015-08-04 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US9158445B2 (en) 2011-05-27 2015-10-13 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US9658766B2 (en) 2011-05-27 2017-05-23 Microsoft Technology Licensing, Llc Edge gesture
US20120304107A1 (en) * 2011-05-27 2012-11-29 Jennifer Nan Edge gesture
US20120304131A1 (en) * 2011-05-27 2012-11-29 Jennifer Nan Edge gesture
US9104307B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US8350445B1 (en) 2011-06-16 2013-01-08 Avago Technologies Wireless Ip (Singapore) Pte. Ltd. Bulk acoustic resonator comprising non-piezoelectric layer and bridge
US8811720B2 (en) 2011-07-12 2014-08-19 Raytheon Company 3D visualization of light detection and ranging data
US8922302B2 (en) 2011-08-24 2014-12-30 Avago Technologies General Ip (Singapore) Pte. Ltd. Acoustic resonator formed on a pedestal
US8944940B2 (en) * 2011-08-29 2015-02-03 Icuemotion, Llc Racket sport inertial sensor motion tracking analysis
US9063673B2 (en) * 2011-08-30 2015-06-23 Uniquesoft, Llc System and method for implementing application code from application requirements
US20130057587A1 (en) 2011-09-01 2013-03-07 Microsoft Corporation Arranging tiles
US9146670B2 (en) 2011-09-10 2015-09-29 Microsoft Technology Licensing, Llc Progressively indicating new content in an application-selectable user interface
US9802364B2 (en) 2011-10-18 2017-10-31 3D Systems, Inc. Systems and methods for construction of an instruction set for three-dimensional printing of a user-customizable image of a three-dimensional structure
US9582178B2 (en) 2011-11-07 2017-02-28 Immersion Corporation Systems and methods for multi-pressure interaction on touch-sensitive surfaces
US8836751B2 (en) 2011-11-08 2014-09-16 Intouch Technologies, Inc. Tele-presence system with a user interface that displays different communication links
US8600450B2 (en) * 2011-12-28 2013-12-03 Sony Corporation Receiving user input on a graphical user interface
US9569057B2 (en) * 2012-01-05 2017-02-14 Sony Corporation Information processing apparatus and method for outputting a guiding operation to a user
KR102024006B1 (en) * 2012-02-10 2019-09-24 삼성전자주식회사 Apparatus and method for controlling vibration flow between vibration devices
US8493354B1 (en) 2012-08-23 2013-07-23 Immersion Corporation Interactivity model for shared feedback on mobile devices
US9700457B2 (en) 2012-03-17 2017-07-11 Abbott Medical Optics Inc. Surgical cassette
US9251313B2 (en) 2012-04-11 2016-02-02 Intouch Technologies, Inc. Systems and methods for visualizing and managing telepresence devices in healthcare networks
US8902278B2 (en) 2012-04-11 2014-12-02 Intouch Technologies, Inc. Systems and methods for visualizing and managing telepresence devices in healthcare networks
US20130293580A1 (en) 2012-05-01 2013-11-07 Zambala Lllp System and method for selecting targets in an augmented reality environment
US9361021B2 (en) 2012-05-22 2016-06-07 Irobot Corporation Graphical user interfaces including touchpad driving interfaces for telemedicine devices
WO2013176758A1 (en) 2012-05-22 2013-11-28 Intouch Technologies, Inc. Clinical workflows utilizing autonomous and semi-autonomous telemedicine devices
US10013082B2 (en) 2012-06-05 2018-07-03 Stuart Schecter, LLC Operating system with haptic interface for minimally invasive, hand-held surgical instrument
JP6106973B2 (en) * 2012-07-11 2017-04-05 富士ゼロックス株式会社 Information processing apparatus and program
US9429912B2 (en) * 2012-08-17 2016-08-30 Microsoft Technology Licensing, Llc Mixed reality holographic object development
US9116546B2 (en) 2012-08-29 2015-08-25 Immersion Corporation System for haptically representing sensor input
US9056244B2 (en) 2012-09-12 2015-06-16 Wms Gaming Inc. Gaming apparatus incorporating targeted haptic feedback
US9405417B2 (en) 2012-09-24 2016-08-02 Tactus Technology, Inc. Dynamic tactile interface and methods
US9671874B2 (en) * 2012-11-08 2017-06-06 Cuesta Technology Holdings, Llc Systems and methods for extensions to alternative control of touch-based devices
US9202350B2 (en) * 2012-12-19 2015-12-01 Nokia Technologies Oy User interfaces and associated methods
EP2746153A1 (en) * 2012-12-20 2014-06-25 BAE Systems PLC Inceptor apparatus
EP2935000B1 (en) 2012-12-20 2019-06-26 BAE Systems PLC Inceptor apparatus
EP2770413A3 (en) * 2013-02-22 2017-01-04 Samsung Electronics Co., Ltd. An apparatus for providing a cursor in electronic devices and a method thereof
JP5835255B2 (en) * 2013-03-19 2015-12-24 カシオ計算機株式会社 Graph display device and graph display program
CN103203750B (en) * 2013-04-15 2016-03-02 苏州工业园区职业技术学院 Dual-core single-axis high-speed soldering robot servo control system based on image acquisition
US9557813B2 (en) 2013-06-28 2017-01-31 Tactus Technology, Inc. Method for reducing perceived optical distortion
US9194977B1 (en) 2013-07-26 2015-11-24 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Active response gravity offload and method
WO2015020178A1 (en) * 2013-08-06 2015-02-12 Square Enix Holdings Co., Ltd. Information processing apparatus, control method, program, and recording medium
JP6098438B2 (en) * 2013-08-27 2017-03-22 株式会社デンソー Operating device
CN103481285B (en) 2013-09-16 2016-03-09 国家电网公司 Control system and method for a high-voltage hot-line work robot based on virtual reality technology
US9164587B2 (en) 2013-11-14 2015-10-20 Immersion Corporation Haptic spatialization system
US9619029B2 (en) 2013-11-14 2017-04-11 Immersion Corporation Haptic trigger control system
US20170031452A1 (en) * 2014-01-15 2017-02-02 Juice Design Co., Ltd. Manipulation determination apparatus, manipulation determination method, and, program
US9733788B2 (en) 2014-03-17 2017-08-15 Microsoft Technology Licensing, Llc Multi-stage cursor control
US9874858B2 (en) 2014-03-18 2018-01-23 The United States Of America As Represented By The Secretary Of The Navy Automation control system and a method in an automation control system
US10668353B2 (en) 2014-08-11 2020-06-02 Icuemotion Llc Codification and cueing system for sport and vocational activities
JP6428053B2 (en) * 2014-08-26 2018-11-28 カシオ計算機株式会社 Graph display device, program, and server device
KR102056298B1 (en) 2014-09-02 2019-12-16 애플 인크. Semantic framework for variable haptic output
US10185396B2 (en) * 2014-11-12 2019-01-22 Immersion Corporation Haptic trigger modification system
US9174134B1 (en) 2014-11-12 2015-11-03 Immersion Corporation Peripheral device with haptic diminishment prevention component
DE102014226798A1 (en) 2014-12-22 2016-06-23 Mxo Media-Ag Method and device for generating signals for an apparatus for electrical muscle stimulation
USD766930S1 (en) * 2014-12-31 2016-09-20 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
US10613629B2 (en) 2015-03-27 2020-04-07 Chad Laurendeau System and method for force feedback interface devices
US10854104B2 (en) 2015-08-28 2020-12-01 Icuemotion Llc System for movement skill analysis and skill augmentation and cueing
FR3044434B1 (en) * 2015-12-01 2018-06-15 Dassault Aviation INTERFACE SYSTEM BETWEEN A USER AND A DISPLAY IN THE COCKPIT OF AN AIRCRAFT, AIRCRAFT AND ASSOCIATED METHOD
WO2017150128A1 (en) 2016-03-04 2017-09-08 株式会社ソニー・インタラクティブエンタテインメント Control device and control program
US20170300116A1 (en) * 2016-04-15 2017-10-19 Bally Gaming, Inc. System and method for providing tactile feedback for users of virtual reality content viewers
DK180122B1 (en) 2016-06-12 2020-05-19 Apple Inc. Devices, methods and graphical user interfaces for providing haptic feedback
DK179823B1 (en) 2016-06-12 2019-07-12 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
CN107506115A (en) * 2016-06-14 2017-12-22 阿里巴巴集团控股有限公司 Menu display processing method, apparatus and system
JP6626576B2 (en) 2016-07-21 2019-12-25 株式会社ソニー・インタラクティブエンタテインメント Operation device and control system
US10482646B1 (en) * 2016-07-21 2019-11-19 Pixar Directable cloth animation
EP3851175A1 (en) * 2016-07-26 2021-07-21 Sony Interactive Entertainment Inc. Operating device and method for controlling the same
EP3493029B1 (en) 2016-07-26 2020-07-22 Sony Interactive Entertainment Inc. Information processing system, operation device, and method for controlling operation device
EP3502841B1 (en) * 2016-08-18 2023-07-26 Sony Group Corporation Information processing device, information processing system and information processing method
DK201670720A1 (en) 2016-09-06 2018-03-26 Apple Inc Devices, Methods, and Graphical User Interfaces for Generating Tactile Outputs
DK179278B1 (en) 2016-09-06 2018-03-26 Apple Inc Devices, methods and graphical user interfaces for haptic mixing
US10895950B2 (en) * 2016-12-09 2021-01-19 International Business Machines Corporation Method and system for generating a holographic image having simulated physical properties
WO2018119302A1 (en) * 2016-12-23 2018-06-28 Dmitri Boutoussov Dental system and method
CN106681597A (en) * 2017-01-12 2017-05-17 合肥杰美电子科技有限公司 Method and system for moving an icon's selected state forward and backward in a human-computer interface
US11862302B2 (en) 2017-04-24 2024-01-02 Teladoc Health, Inc. Automated transcription and documentation of tele-health encounters
DK201770372A1 (en) 2017-05-16 2019-01-08 Apple Inc. Tactile feedback for locked device user interfaces
WO2018216780A1 (en) 2017-05-24 2018-11-29 株式会社ミライセンス Stimulus transmission device
CN110799144B (en) * 2017-07-06 2023-06-06 直观外科手术操作公司 System and method for haptic feedback of selection of menu items in a remote control system
US10483007B2 (en) 2017-07-25 2019-11-19 Intouch Technologies, Inc. Modular telehealth cart with thermal imaging and touch screen user interface
US11636944B2 (en) 2017-08-25 2023-04-25 Teladoc Health, Inc. Connectivity infrastructure for a telehealth platform
JP6959349B2 (en) 2017-09-29 2021-11-02 株式会社ソニー・インタラクティブエンタテインメント Operating device and its control device
JP6949982B2 (en) 2017-10-27 2021-10-13 株式会社ソニー・インタラクティブエンタテインメント Operation device
US10617299B2 (en) 2018-04-27 2020-04-14 Intouch Technologies, Inc. Telehealth cart that supports a removable tablet with seamless audio/video switching
JP2021018546A (en) * 2019-07-18 2021-02-15 トヨタ自動車株式会社 Communication device for vehicle and communication system for vehicle
US11226690B2 (en) * 2020-04-10 2022-01-18 Dell Products, L.P. Systems and methods for guiding a user with a haptic mouse
US20210326594A1 (en) * 2020-04-17 2021-10-21 James Patrick COSTELLO Computer-generated supplemental content for video
US11482002B1 (en) * 2020-10-16 2022-10-25 Splunk Inc. Codeless anchor detection for detectable features in an environment
FR3118672A1 (en) * 2021-01-05 2022-07-08 Commissariat A L'energie Atomique Et Aux Energies Alternatives HAPTIC INTERFACE WITH INCREASED REACTIVITY TIME
DE102021205349A1 (en) 2021-05-26 2022-12-01 Robert Bosch Gesellschaft mit beschränkter Haftung joystick
DE102021205814A1 (en) 2021-06-09 2022-12-15 Robert Bosch Gesellschaft mit beschränkter Haftung joystick
DE102022200968A1 (en) 2022-01-31 2023-08-03 Robert Bosch Gesellschaft mit beschränkter Haftung joystick

Family Cites Families (251)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2906179A (en) 1957-01-28 1959-09-29 North American Aviation Inc Vector gage
US3157853A (en) 1957-12-06 1964-11-17 Hirsch Joseph Tactile communication system
US2972140A (en) 1958-09-23 1961-02-14 Hirsch Joseph Apparatus and method for communication through the sense of touch
GB958325A (en) 1962-07-08 1964-05-21 Communications Patents Ltd Improvements in or relating to ground-based flight training or simulating apparatus
US3490059A (en) 1966-06-06 1970-01-13 Martin Marietta Corp Three axis mounting and torque sensing apparatus
US3497668A (en) 1966-08-25 1970-02-24 Joseph Hirsch Tactile control system
US3517446A (en) 1967-04-19 1970-06-30 Singer General Precision Vehicle trainer controls and control loading
US3531868A (en) 1968-04-18 1970-10-06 Ford Motor Co Surface scanner for measuring the coordinates of points on a three-dimensional surface
US3623064A (en) 1968-10-11 1971-11-23 Bell & Howell Co Paging receiver having cycling eccentric mass
US3903614A (en) 1970-03-27 1975-09-09 Singer Co Apparatus for simulating aircraft control loading
US3875488A (en) 1973-02-15 1975-04-01 Raytheon Co Inertially stabilized gimbal platform
US3902687A (en) 1973-06-25 1975-09-02 Robert E Hightower Aircraft indicator system
US3890958A (en) 1974-04-08 1975-06-24 Moog Automotive Inc Physiological diagnostic apparatus
US3944798A (en) 1974-04-18 1976-03-16 Eaton-Leonard Corporation Method and apparatus for measuring direction
US3911416A (en) 1974-08-05 1975-10-07 Motorola Inc Silent call pager
US4125800A (en) 1975-09-02 1978-11-14 Contraves Gorez Corporation Power controller with a modular power output
US4114882A (en) 1976-10-29 1978-09-19 Robert Ralph Runte Variable velocity control for playing images for a manually controlled electronic video display game
US4148014A (en) 1977-04-06 1979-04-03 Texas Instruments Incorporated System with joystick to control velocity vector of a display cursor
US4160508A (en) 1977-08-19 1979-07-10 Nasa Controller arm for a remotely related slave arm
US4127752A (en) 1977-10-13 1978-11-28 Sheldahl, Inc. Tactile touch switch panel
US4216467A (en) 1977-12-22 1980-08-05 Westinghouse Electric Corp. Hand controller
US4262549A (en) 1978-05-10 1981-04-21 Schwellenbach Donald D Variable mechanical vibrator
US4236325A (en) 1978-12-26 1980-12-02 The Singer Company Simulator control loading inertia compensator
US4359223A (en) * 1979-11-01 1982-11-16 Sanders Associates, Inc. Interactive video playback system
US4464117A (en) 1980-08-27 1984-08-07 Dr. Ing. Reiner Foerst Gmbh Driving simulator apparatus
US4638798A (en) 1980-09-10 1987-01-27 Shelden C Hunter Stereotactic method and apparatus for locating and treating or removing lesions
NL8006091A (en) 1980-11-07 1982-06-01 Fokker Bv FLIGHT SIMULATOR.
US4333070A (en) 1981-02-06 1982-06-01 Barnes Robert W Motor vehicle fuel-waste indicator
JPS57169643A (en) 1981-04-13 1982-10-19 Yamato Scale Co Ltd Load cell for multiple components of force
US4599070A (en) 1981-07-29 1986-07-08 Control Interface Company Limited Aircraft simulator and simulated control system therefor
EP0085518B1 (en) 1982-01-22 1989-08-16 British Aerospace Public Limited Company Control apparatus
US4484191A (en) 1982-06-14 1984-11-20 Vavra George S Tactile signaling systems for aircraft
US4593470A (en) 1982-07-14 1986-06-10 Micro Control Systems, Inc. Portable three dimensional graphics tablet
US4477973A (en) 1982-07-14 1984-10-23 Micro Control Systems, Inc. Three dimensional graphics tablet
US4709917A (en) * 1982-09-03 1987-12-01 Yang Tai Her Mock bicycle for exercise and training effects
US4477043A (en) 1982-12-15 1984-10-16 The United States Of America As Represented By The Secretary Of The Air Force Biodynamic resistant control stick
FR2545606B1 (en) 1983-05-06 1985-09-13 Hispano Suiza Sa FORCE TENSOR SENSOR
GB2142711A (en) 1983-07-04 1985-01-23 Philips Electronic Associated Manually operable x-y signal generator
JPS6029833A (en) 1983-07-28 1985-02-15 Canon Inc Image display device
GB2146776B (en) 1983-09-16 1986-07-30 Ferranti Plc Accelerometer systems
US4524348A (en) * 1983-09-26 1985-06-18 Lefkowitz Leonard R Control interface
US4550221A (en) 1983-10-07 1985-10-29 Scott Mabusth Touch sensitive control device
JPS60170709A (en) 1984-02-16 1985-09-04 Toshiba Corp Measuring instrument for shape
US4571834A (en) 1984-02-17 1986-02-25 Orthotronics Limited Partnership Knee laxity evaluator and motion module/digitizer arrangement
US4581491A (en) 1984-05-04 1986-04-08 Research Corporation Wearable tactile sensory aid providing information on voice pitch and intonation patterns
US4688983A (en) 1984-05-21 1987-08-25 Unimation Inc. Low cost robot
US4676002A (en) 1984-06-25 1987-06-30 Slocum Alexander H Mechanisms to determine position and orientation in space
JPS61105411A (en) 1984-10-29 1986-05-23 Mitsutoyo Mfg Co Ltd Measuring method of multidimensional measuring machine
US4654648A (en) 1984-12-17 1987-03-31 Herrington Richard A Wireless cursor control system
US4935728A (en) 1985-01-02 1990-06-19 Altra Corporation Computer control
US4782327A (en) 1985-01-02 1988-11-01 Victor B. Kley Computer control
US4632341A (en) 1985-02-06 1986-12-30 The United States Of America As Represented By The Secretary Of The Air Force Stabilizing force feedback in bio-actuated control systems
US5078152A (en) 1985-06-23 1992-01-07 Loredan Biomedical, Inc. Method for diagnosis and/or training of proprioceptor feedback capabilities in a muscle and joint system of a human patient
DE3523188A1 (en) 1985-06-28 1987-01-08 Zeiss Carl Fa CONTROL FOR COORDINATE MEASURING DEVICES
US4704909A (en) 1985-07-22 1987-11-10 Grahn Allen R Multicomponent force-torque sensor
US4679331A (en) 1985-08-26 1987-07-14 Ppg Industries, Inc. Apparatus and method for determining contour characteristics of a contoured article
US4713007A (en) 1985-10-11 1987-12-15 Alban Eugene P Aircraft controls simulator
US5275174B1 (en) * 1985-10-30 1998-08-04 Jonathan A Cook Repetitive strain injury assessment
NL8503096A (en) 1985-11-11 1987-06-01 Fokker Bv SIMULATOR OF MECHANICAL PROPERTIES OF A CONTROL SYSTEM.
US4934694A (en) 1985-12-06 1990-06-19 Mcintosh James L Computer controlled exercise system
US4891764A (en) 1985-12-06 1990-01-02 Tensor Development Inc. Program controlled force measurement and control system
US5103404A (en) 1985-12-06 1992-04-07 Tensor Development, Inc. Feedback for a manipulator
US4811608A (en) 1985-12-18 1989-03-14 Spatial Systems Pty Limited Force and torque converter
US5591924A (en) * 1985-12-18 1997-01-07 Spacetec IMC Corporation Force and torque converter
US5195179A (en) 1986-01-29 1993-03-16 Hitachi, Ltd. Coordinate input apparatus
US4787051A (en) 1986-05-16 1988-11-22 Tektronix, Inc. Inertial mouse system
US4803413A (en) 1986-07-15 1989-02-07 Honeywell Inc. Magnetic isolating and pointing gimbal apparatus
US4791934A (en) 1986-08-07 1988-12-20 Picker International, Inc. Computer tomography assisted stereotactic surgery system and method
US4689449A (en) 1986-10-03 1987-08-25 Massachusetts Institute Of Technology Tremor suppressing hand controls
US4849692A (en) 1986-10-09 1989-07-18 Ascension Technology Corporation Device for quantitatively measuring the relative position and orientation of two bodies in the presence of metals utilizing direct current magnetic fields
US4945305A (en) 1986-10-09 1990-07-31 Ascension Technology Corporation Device for quantitatively measuring the relative position and orientation of two bodies in the presence of metals utilizing direct current magnetic fields
NL8602697A (en) 1986-10-27 1988-05-16 Huka Bv Developments JOYSTICK.
US4750487A (en) 1986-11-24 1988-06-14 Zanetti Paul H Stereotactic frame
CA1299362C (en) 1986-12-10 1992-04-28 Gregory James Mcdonald Coordinate measuring system
US4945501A (en) 1987-01-20 1990-07-31 The Warner & Swasey Company Method for determining position within the measuring volume of a coordinate measuring machine and the like and system therefor
US4819195A (en) 1987-01-20 1989-04-04 The Warner & Swasey Company Method for calibrating a coordinate measuring machine and the like and system therefor
US4800721A (en) 1987-02-13 1989-01-31 Caterpillar Inc. Force feedback lever
US4794392A (en) 1987-02-20 1988-12-27 Motorola, Inc. Vibrator alert device for a communication receiver
US4839838A (en) 1987-03-30 1989-06-13 Labiche Mitchell Spatial input apparatus
GB2204131B (en) 1987-04-28 1991-04-17 Ibm Graphics input tablet
US4961138A (en) 1987-05-01 1990-10-02 General Datacomm, Inc. System and apparatus for providing three dimensions of input into a host processor
IT1214292B (en) 1987-05-05 1990-01-10 Garda Impianti Srl EQUIPMENT FOR MEASUREMENT AND/OR CONTROL OF THE POSITION AND ORIENTATION OF CHARACTERISTIC POINTS OR AREAS OF STRUCTURES, IN PARTICULAR OF VEHICLE BODIES.
US4868549A (en) 1987-05-18 1989-09-19 International Business Machines Corporation Feedback mouse
DE3717459A1 (en) 1987-05-23 1988-12-01 Zeiss Carl Fa HAND-HELD COORDINATE MEASURING DEVICE
US4982618A (en) 1987-11-03 1991-01-08 Culver Craig F Multifunction tactile manipulatable control
DE3740070A1 (en) 1987-11-26 1989-06-08 Zeiss Carl Fa ROTARY-SWIVEL DEVICE FOR THE PROBE HEADS OF COORDINATE MEASURING MACHINES
GB8729638D0 (en) 1987-12-19 1988-02-03 Renishaw Plc Mounting for surface sensing device
US5251127A (en) 1988-02-01 1993-10-05 Faro Medical Technologies Inc. Computer-aided surgery apparatus
GB8803847D0 (en) 1988-02-18 1988-03-16 Renishaw Plc Mounting for surface-sensing device
SE461548B (en) 1988-02-18 1990-02-26 Johansson Ab C E METHOD AND DEVICE FOR DETERMINING AND CORRECTING POSITIONAL ERRORS WHEN MEASURING THE POSITION OF A POINT OR WHEN POSITIONING TO A POINT HAVING A SPECIFIC POSITION
US5038089A (en) 1988-03-23 1991-08-06 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Synchronized computational architecture for generalized bilateral control of robot arms
US4907970A (en) 1988-03-30 1990-03-13 Grumman Aerospace Corporation Sidestick-type thrust control simulator
US4885565A (en) 1988-06-01 1989-12-05 General Motors Corporation Touchscreen CRT with tactile feedback
US4942545A (en) 1988-06-06 1990-07-17 Combustion Engineering, Inc. Calibration of eddy current profilometry
US5050608A (en) 1988-07-12 1991-09-24 Medirand, Inc. System for indicating a position to be operated in a patient's body
DE58903515D1 (en) 1988-10-03 1993-03-25 Zeiss Carl Fa TEST BODY FOR COORDINATE MEASURING DEVICES.
FR2638010B1 (en) 1988-10-13 1991-01-18 Acroe MODULAR FORCE-FEEDBACK KEYBOARD AND FLAT MODULAR ACTUATOR
US5007085A (en) 1988-10-28 1991-04-09 International Business Machines Corporation Remotely sensed personal stylus
US4907973A (en) 1988-11-14 1990-03-13 Hon David C Expert system simulator for modeling realistic internal environments and performance
US4930770A (en) 1988-12-01 1990-06-05 Baker Norman A Eccentrically loaded computerized positive/negative exercise machine
US5189806A (en) 1988-12-19 1993-03-02 Renishaw Plc Method of and apparatus for scanning the surface of a workpiece
US5116051A (en) 1989-01-12 1992-05-26 Atari Games Corporation Strain gauge pressure-sensitive video game control
US5044956A (en) 1989-01-12 1991-09-03 Atari Games Corporation Control device such as a steering wheel for video vehicle simulator with realistic feedback forces
US5186695A (en) * 1989-02-03 1993-02-16 Loredan Biomedical, Inc. Apparatus for controlled exercise and diagnosis of human performance
US5019761A (en) 1989-02-21 1991-05-28 Kraft Brett W Force feedback control for backhoe
JPH02220106A (en) 1989-02-22 1990-09-03 Okuma Mach Works Ltd Digitizing controller with a measuring function
GB8904955D0 (en) 1989-03-03 1989-04-12 Atomic Energy Authority Uk Multi-axis hand controller
JPH02290506A (en) 1989-04-28 1990-11-30 Mitsutoyo Corp Three-dimensional measuring instrument
US5184306A (en) 1989-06-09 1993-02-02 Regents Of The University Of Minnesota Automated high-precision fabrication of objects of complex and unique geometry
JPH07104146B2 (en) 1989-08-29 1995-11-13 Mitutoyo Corporation Rotation table scanning control method for coordinate measuring probe
US5139261A (en) 1989-09-15 1992-08-18 Openiano Renato M Foot-actuated computer game controller serving as a joystick
US5182557A (en) * 1989-09-20 1993-01-26 Semborg Recrob, Corp. Motorized joystick
US5065145A (en) 1989-10-06 1991-11-12 Summagraphics Corporation Method and apparatus for producing signals corresponding to the position of a cursor
US5209131A (en) 1989-11-03 1993-05-11 Rank Taylor Hobson Metrology
US5126948A (en) 1989-11-08 1992-06-30 Ltv Aerospace And Defense Company Digital position encoder and data optimizer
US5107080A (en) 1989-12-01 1992-04-21 Massachusetts Institute Of Technology Multiple degree of freedom damped hand controls
US4983786A (en) 1990-01-17 1991-01-08 The University Of British Columbia XY velocity controller
US5022407A (en) 1990-01-24 1991-06-11 Topical Testing, Inc. Apparatus for automated tactile testing
US5259894A (en) 1990-01-26 1993-11-09 Sampson Richard K Method for solvent bonding non-porous materials to automatically create variable bond characteristics
US5072361A (en) 1990-02-01 1991-12-10 Sarcos Group Force-reflective teleoperation control system
US5184319A (en) 1990-02-02 1993-02-02 Kramer James F Force feedback and textures simulating interface device
US5631861A (en) * 1990-02-02 1997-05-20 Virtual Technologies, Inc. Force feedback and texture simulating interface device
US5132672A (en) 1990-03-27 1992-07-21 Apple Computer, Inc. Three degree of freedom graphic object controller
US5095303A (en) 1990-03-27 1992-03-10 Apple Computer, Inc. Six degree of freedom graphic object controller
JPH03292524A (en) 1990-04-11 1991-12-24 Oki Electric Ind Co Ltd Cursor shift system
US5128671A (en) 1990-04-12 1992-07-07 Ltv Aerospace And Defense Company Control device having multiple degrees of freedom
US5035242A (en) 1990-04-16 1991-07-30 David Franklin Method and apparatus for sound responsive tactile stimulation of deaf individuals
US5022384A (en) 1990-05-14 1991-06-11 Capitol Systems Vibrating/massage chair
US5044639A (en) * 1990-05-29 1991-09-03 Taito America Corporation Ball game with player controlled rebound surface
US5080377A (en) * 1990-05-31 1992-01-14 Rare Coin-It, Inc. Video display system
GB9014130D0 (en) * 1990-06-25 1990-08-15 Hewlett Packard Co User interface
US5251156A (en) 1990-08-25 1993-10-05 Carl-Zeiss-Stiftung, Heidenheim/Brenz Method and apparatus for non-contact measurement of object surfaces
US5181181A (en) 1990-09-27 1993-01-19 Triton Technologies, Inc. Computer apparatus input device for three-dimensional information
WO1992007350A1 (en) 1990-10-15 1992-04-30 National Biomedical Research Foundation Three-dimensional cursor control device
US5142506A (en) 1990-10-22 1992-08-25 Logitech, Inc. Ultrasonic position locating method and apparatus therefor
NL194053C (en) 1990-12-05 2001-05-03 Koninkl Philips Electronics Nv Device with a rotationally symmetrical body.
US5223776A (en) 1990-12-31 1993-06-29 Honeywell Inc. Six-degree virtual pivot controller
US5098437A (en) 1991-02-13 1992-03-24 Pfizer Hospital Products Group, Inc. Acetabular cup positioning insert
US5142931A (en) 1991-02-14 1992-09-01 Honeywell Inc. 3 degree of freedom hand controller
US5212473A (en) * 1991-02-21 1993-05-18 Typeright Keyboard Corp. Membrane keyboard and method of using same
US5143505A (en) 1991-02-26 1992-09-01 Rutgers University Actuator system for providing force feedback to a dextrous master glove
US5354162A (en) 1991-02-26 1994-10-11 Rutgers University Actuator system for providing force feedback to portable master support
DE69212149D1 (en) * 1991-03-21 1996-08-14 Atari Games Corp DRIVING SIMULATOR WITH CROSS-NETWORK FEEDBACK
US5131844A (en) 1991-04-08 1992-07-21 Foster-Miller, Inc. Contact digitizer, particularly for dental applications
JPH06508222A (en) 1991-05-23 1994-09-14 Atari Games Corporation Modular display simulator
US5146566A (en) 1991-05-29 1992-09-08 IBM Corporation Input/output system for computer user interface using magnetic levitation
US5178012A (en) 1991-05-31 1993-01-12 Rockwell International Corporation Twisting actuator accelerometer
US5279309A (en) 1991-06-13 1994-01-18 International Business Machines Corporation Signaling device and method for monitoring positions in a surgical operation
JP2514490B2 (en) 1991-07-05 1996-07-10 Daihen Corporation Teaching control method by interlocking manual operation of industrial robot
US5386507A (en) 1991-07-18 1995-01-31 Teig; Steven L. Computer graphics system for selectively modelling molecules and investigating the chemical and physical properties thereof
US5185561A (en) 1991-07-23 1993-02-09 Digital Equipment Corporation Torque motor as a tactile feedback device in a computer system
EP0526056B1 (en) 1991-07-27 1996-01-31 Renishaw Transducer Systems Limited Calibration and measurement device
US5186629A (en) 1991-08-22 1993-02-16 International Business Machines Corporation Virtual graphics display capable of presenting icons and windows to the blind computer user and method
US5235868A (en) 1991-10-02 1993-08-17 Culver Craig F Mechanism for generating control signals
US5262777A (en) 1991-11-16 1993-11-16 Sri International Device for generating multidimensional input signals to a computer
US5889670A (en) 1991-10-24 1999-03-30 Immersion Corporation Method and apparatus for tactilely responsive user interface
US5220260A (en) 1991-10-24 1993-06-15 Lex Computer And Management Corporation Actuator having electronically controllable tactile responsiveness
US5228356A (en) 1991-11-25 1993-07-20 Chuang Keh Shih K Variable effort joystick
US5309140A (en) * 1991-11-26 1994-05-03 The United States Of America As Represented By The Secretary Of The Navy Feedback system for remotely operated vehicles
US5230623A (en) 1991-12-10 1993-07-27 Radionics, Inc. Operating pointer with interactive computergraphics
US5471571A (en) * 1991-12-27 1995-11-28 Xerox Corporation Method and apparatus for setting a graphical object's position and orientation with viscous dragging
GB9201214D0 (en) 1992-01-21 1992-03-11 Mcmahon Michael J Surgical retractors
CA2062147C (en) 1992-03-02 1995-07-25 Kenji Hara Multi-axial joy stick device
US5589828A (en) * 1992-03-05 1996-12-31 Armstrong; Brad A. 6 Degrees of freedom controller with capability of tactile feedback
US5999185A (en) * 1992-03-30 1999-12-07 Kabushiki Kaisha Toshiba Virtual reality control using image, model and control data to manipulate interactions
US5757358A (en) * 1992-03-31 1998-05-26 The United States Of America As Represented By The Secretary Of The Navy Method and apparatus for enhancing computer-user selection of computer-displayed objects through dynamic selection area and constant visual feedback
JP3199130B2 (en) 1992-03-31 2001-08-13 Pioneer Corporation 3D coordinate input device
US5245320A (en) 1992-07-09 1993-09-14 Thrustmaster, Inc. Multiport game card with configurable address
US5313230A (en) 1992-07-24 1994-05-17 Apple Computer, Inc. Three degree of freedom graphic object controller
US5296871A (en) 1992-07-27 1994-03-22 Paley W Bradford Three-dimensional mouse with tactile feedback
US5428748A (en) 1992-09-24 1995-06-27 National Semiconductor Corporation Method and apparatus for automatically configuring a computer peripheral
US5283970A (en) * 1992-09-25 1994-02-08 Strombecker Corporation Toy guns
US5264768A (en) 1992-10-06 1993-11-23 Honeywell, Inc. Active hand controller feedback loop
US5666473A (en) 1992-10-08 1997-09-09 Science & Technology Corporation & UNM Tactile computer aided sculpting device
US5790108A (en) 1992-10-23 1998-08-04 University Of British Columbia Controller
US5397323A (en) * 1992-10-30 1995-03-14 International Business Machines Corporation Remote center-of-motion robot for surgery
JP3200779B2 (en) 1992-11-10 2001-08-20 Makoto Nishimura Pulse burner for brazing metal
US5389865A (en) 1992-12-02 1995-02-14 Cybernet Systems Corporation Method and system for providing a tactile virtual reality and manipulator defining an interface device therefor
US5629594A (en) 1992-12-02 1997-05-13 Cybernet Systems Corporation Force feedback system
FI92111C (en) 1992-12-11 1994-09-26 Icl Personal Systems Oy Method and arrangement for moving the cursor on a computer screen
US5526480A (en) 1992-12-28 1996-06-11 International Business Machines Corporation Time domain scroll bar for multimedia presentations in a data processing system
US5451924A (en) * 1993-01-14 1995-09-19 Massachusetts Institute Of Technology Apparatus for providing sensory substitution of force feedback
US5355148A (en) 1993-01-14 1994-10-11 Ast Research, Inc. Fingerpoint mouse
EP0607580A1 (en) * 1993-01-21 1994-07-27 International Business Machines Corporation Tactile feedback mechanism for cursor control
US5374942A (en) 1993-02-05 1994-12-20 Gilligan; Federico G. Mouse and method for concurrent cursor position and scrolling control
US5402582A (en) 1993-02-23 1995-04-04 Faro Technologies Inc. Three dimensional coordinate measuring apparatus
US5412880A (en) 1993-02-23 1995-05-09 Faro Technologies Inc. Method of constructing a 3-dimensional map of a measurable quantity using three dimensional coordinate measuring apparatus
US5435554A (en) * 1993-03-08 1995-07-25 Atari Games Corporation Baseball simulation system
JP3686686B2 (en) * 1993-05-11 2005-08-24 Matsushita Electric Industrial Co., Ltd. Haptic device, data input device, and data input apparatus
US5429140A (en) 1993-06-04 1995-07-04 Greenleaf Medical Systems, Inc. Integrated virtual reality rehabilitation system
US5405152A (en) 1993-06-08 1995-04-11 The Walt Disney Company Method and apparatus for an interactive video game with physical feedback
US5396266A (en) 1993-06-08 1995-03-07 Technical Research Associates, Inc. Kinesthetic feedback apparatus and method
US5351692A (en) 1993-06-09 1994-10-04 Capistrano Labs Inc. Laparoscopic ultrasonic probe
US5513100A (en) 1993-06-10 1996-04-30 The University Of British Columbia Velocity controller with force feedback stiffness control
US5805140A (en) 1993-07-16 1998-09-08 Immersion Corporation High bandwidth force feedback interface using voice coils and flexures
US5734373A (en) 1993-07-16 1998-03-31 Immersion Human Interface Corporation Method and apparatus for controlling force feedback interface systems utilizing a host computer
US5767839A (en) 1995-01-18 1998-06-16 Immersion Human Interface Corporation Method and apparatus for providing passive force feedback to human-computer interface systems
US5701140A (en) 1993-07-16 1997-12-23 Immersion Human Interface Corp. Method and apparatus for providing a cursor control interface with force feedback
US5721566A (en) 1995-01-18 1998-02-24 Immersion Human Interface Corp. Method and apparatus for providing damping force feedback
US5739811A (en) 1993-07-16 1998-04-14 Immersion Human Interface Corporation Method and apparatus for controlling human-computer interface systems providing force feedback
US6057828A (en) 1993-07-16 2000-05-02 Immersion Corporation Method and apparatus for providing force sensations in virtual environments in accordance with host software
US5731804A (en) 1995-01-18 1998-03-24 Immersion Human Interface Corp. Method and apparatus for providing high bandwidth, low noise mechanical I/O for computer systems
US5519618A (en) * 1993-08-02 1996-05-21 Massachusetts Institute Of Technology Airport surface safety logic
US5625576A (en) 1993-10-01 1997-04-29 Massachusetts Institute Of Technology Force reflecting haptic interface
US5436640A (en) 1993-10-29 1995-07-25 Thrustmaster, Inc. Video game and simulator joystick controller with geared potentiometer actuation
US5384460A (en) 1993-11-03 1995-01-24 Silitek Corporation Encoder with a light emitting editing wheel
US5397865A (en) * 1993-11-15 1995-03-14 Park; Noel S. Digitizing tablet with display and plot capability, and methods of training a user
US5436638A (en) 1993-12-17 1995-07-25 Fakespace, Inc. Image display method and apparatus with means for yoking viewpoint orienting muscles of a user
DE69423313T2 (en) 1993-12-20 2000-07-13 Seiko Epson Corp Electronic notification system
US5577981A (en) * 1994-01-19 1996-11-26 Jarvik; Robert Virtual reality exercise machine and computer controlled video system
WO1995020787A1 (en) 1994-01-27 1995-08-03 Exos, Inc. Multimode feedback display technology
CA2140164A1 (en) 1994-01-27 1995-07-28 Kenneth R. Robertson System and method for computer cursor control
US5436542A (en) 1994-01-28 1995-07-25 Surgix, Inc. Telescopic camera mount with remotely controlled positioning
US5482051A (en) * 1994-03-10 1996-01-09 The University Of Akron Electromyographic virtual reality system
US5623642A (en) * 1994-04-06 1997-04-22 MaK Technologies, Inc. Method for simulating Newtonian interactions over a computer network
US5564004A (en) * 1994-04-13 1996-10-08 International Business Machines Corporation Method and system for facilitating the selection of icons
US6004134A (en) * 1994-05-19 1999-12-21 Exos, Inc. Interactive simulation including force feedback
US5565887A (en) 1994-06-29 1996-10-15 Microsoft Corporation Method and apparatus for moving a cursor on a computer screen
US5623582A (en) 1994-07-14 1997-04-22 Immersion Human Interface Corporation Computer interface or control input device for laparoscopic surgical instrument and other elongated mechanical objects
US5530455A (en) 1994-08-10 1996-06-25 Mouse Systems Corporation Roller mouse for implementing scrolling in windows applications
EP0727065B1 (en) * 1994-09-07 2002-02-06 Koninklijke Philips Electronics N.V. Virtual workspace with user-programmable tactile feedback
US5565840A (en) * 1994-09-21 1996-10-15 Thorner; Craig Tactile sensation generator
US5570111A (en) 1994-10-03 1996-10-29 International Business Machines Corporation Graphical user interface cursor positioning device having a negative inertia transfer function
US5642469A (en) * 1994-11-03 1997-06-24 University Of Washington Direct-drive manipulator for pen-based force display
US5666138A (en) 1994-11-22 1997-09-09 Culver; Craig F. Interface control
JP3236180B2 (en) * 1994-12-05 2001-12-10 NEC Corporation Coordinate pointing device
US5565888A (en) 1995-02-17 1996-10-15 International Business Machines Corporation Method and apparatus for improving visibility and selectability of icons
US5583478A (en) * 1995-03-01 1996-12-10 Renzi; Ronald Virtual environment tactile system
DE69622101T2 (en) 1995-03-13 2003-02-13 Koninkl Philips Electronics Nv 3-D INPUT BY VERTICAL SHIFTING OF MOUSE OR ROLLER BALL
US5611040A (en) 1995-04-05 1997-03-11 Microsoft Corporation Method and system for activating double click applications with a single click
EP0747807B1 (en) * 1995-04-11 2002-03-06 Dragon Systems Inc. Moving an element shown on a computer display
US5736978A (en) 1995-05-26 1998-04-07 The United States Of America As Represented By The Secretary Of The Air Force Tactile graphics display
US5691898A (en) * 1995-09-27 1997-11-25 Immersion Human Interface Corp. Safe and low cost computer peripherals with force feedback for consumer applications
US5589854A (en) 1995-06-22 1996-12-31 Tsai; Ming-Chang Touching feedback device
JP3562049B2 (en) * 1995-07-21 2004-09-08 Seiko Epson Corporation Video display method and apparatus
US5771037A (en) 1995-07-24 1998-06-23 Altra Computer display cursor controller
US5805165A (en) * 1995-08-31 1998-09-08 Microsoft Corporation Method of selecting a displayed control item
US5808601A (en) * 1995-09-12 1998-09-15 International Business Machines Corporation Interactive object selection pointer method and apparatus
US5754023A (en) * 1995-10-26 1998-05-19 Cybernet Systems Corporation Gyro-stabilized platforms for force-feedback applications
US5710574A (en) * 1995-11-14 1998-01-20 International Business Machines Corporation Method and system for positioning a graphical pointer within a widget of a data processing system graphical user interface
US5825308A (en) 1996-11-26 1998-10-20 Immersion Human Interface Corporation Force feedback interface having isotonic and isometric functionality
US5877748A (en) 1995-11-20 1999-03-02 Redlich; Sanford I. Computer control input interface system
US6061004A (en) 1995-11-26 2000-05-09 Immersion Corporation Providing force feedback using an interface device including an indexing function
US6219032B1 (en) * 1995-12-01 2001-04-17 Immersion Corporation Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface
US6028593A (en) 1995-12-01 2000-02-22 Immersion Corporation Method and apparatus for providing simulated physical interactions within computer generated environments
US5956484A (en) 1995-12-13 1999-09-21 Immersion Corporation Method and apparatus for providing force feedback over a computer network
US6078308A (en) 1995-12-13 2000-06-20 Immersion Corporation Graphical click surfaces for force feedback applications to provide user selection using cursor interaction with a trigger position within a boundary of a graphical object
SE519661C2 (en) * 1996-02-23 2003-03-25 Immersion Corp Pointing devices and method for marking graphic details on a display with sensory feedback upon finding said detail
US6020876A (en) 1997-04-14 2000-02-01 Immersion Corporation Force feedback interface with selective disturbance filter
US6088019A (en) 1998-06-23 2000-07-11 Immersion Corporation Low cost force feedback device with actuator for non-primary axis
US6162123A (en) * 1997-11-25 2000-12-19 Woolston; Thomas G. Interactive electronic sword game

Also Published As

Publication number Publication date
WO1997021160A2 (en) 1997-06-12
EP0864144B1 (en) 2006-03-08
US7199790B2 (en) 2007-04-03
DE69635902D1 (en) 2006-05-04
EP0864144A2 (en) 1998-09-16
WO1997021160A3 (en) 1997-10-09
US9582077B2 (en) 2017-02-28
US20010002126A1 (en) 2001-05-31
DE69635902T2 (en) 2006-12-14
EP0864144A4 (en) 2002-01-30
EP1640851A2 (en) 2006-03-29
US6219032B1 (en) 2001-04-17
US20070139375A1 (en) 2007-06-21

Similar Documents

Publication Publication Date Title
CA2239125A1 (en) Method and apparatus for providing force feedback for a graphical user interface
EP0852789B1 (en) Method and apparatus for controlling force feedback interface systems utilizing a host computer
AU734986B2 (en) Force feedback interface having isotonic and isometric functionality
US7106313B2 (en) Force feedback interface device with force functionality button
US6100874A (en) Force feedback mouse interface
US6750877B2 (en) Controlling haptic feedback for enhancing navigation in a graphical environment
US6078308A (en) Graphical click surfaces for force feedback applications to provide user selection using cursor interaction with a trigger position within a boundary of a graphical object
WO1997012357A9 (en) Method and apparatus for controlling force feedback interface systems utilizing a host computer
WO2002057885A2 (en) Controlling haptic feedback for enhancing navigation in a graphical environment

Legal Events

Date Code Title Description
EEER Examination request
FZDE Discontinued