US5512920A - Locator device for control of graphical objects - Google Patents

Locator device for control of graphical objects

Info

Publication number
US5512920A
US5512920A (application US08/291,667)
Authority
US
United States
Prior art keywords
locator device
rotation
mouse
graphical
locator
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US08/291,667
Inventor
Sarah F. F. Gibson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Benhov GmbH LLC
Original Assignee
Mitsubishi Electric Research Laboratories Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Research Laboratories Inc
Priority to US08/291,667
Assigned to MITSUBISHI ELECTRIC RESEARCH LABORATORIES, INC. Assignment of assignors interest. Assignors: GIBSON, SARAH F. FRISKEN
Application granted
Publication of US5512920A
Assigned to MITSUBISHI ELECTRIC INFORMATION TECHNOLOGY CENTER AMERICA, INC. Change of name. Assignors: MITSUBISHI ELECTRIC RESEARCH LABORATORIES, INC.
Assigned to MITSUBISHI ELECTRIC RESEARCH LABORATORIES, INC. Change of name. Assignors: MITSUBISHI ELECTRIC INFORMATION TECHNOLOGY CENTER AMERICA, INC.
Assigned to BINARY SERVICES LIMITED LIABILITY COMPANY. Assignment of assignors interest. Assignors: MITSUBISHI ELECTRIC RESEARCH LABORATORIES, INC.
Anticipated expiration
Status: Expired - Lifetime

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543 - Mice or pucks
    • G06F3/03544 - Mice or pucks having dual sensing arrangement, e.g. two balls or two coils used to track rotation of the pointing device
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

Abstract

A three degree-of-freedom locator device for the control of graphical objects on a computer display mimics natural two-dimensional movement of the user's hand by providing for simultaneous translation and rotation of the graphical object. In one embodiment, a computer mouse-type locator with two trackball and position-encoder sensor systems detects movement of the locator over a stationary pad or like device. The graphical object is selected by a mouse button click. Subsequent two-dimensional translation and rotation of the locator are detected by the dual trackball system and used to control the position and orientation of the graphical object. The result is intuitive control of the placement of the selected object on the computer screen. The natural movement of the mouse in the user's hand is transformed into simultaneous translation and rotation of the object.

Description

FIELD OF INVENTION
This invention relates to locator devices for the control of graphical displays and more particularly to a device which permits simultaneous two-dimensional (2D) translation and rotation of a graphical object, or simultaneous control of position and direction of the gaze or viewpoint in a virtual environment.
BACKGROUND OF THE INVENTION
Many graphics applications require the user to interactively position graphical objects on the computer screen. This positioning is frequently done with an input device known as a locator device. Examples of such devices include the computer mouse, data tablet, touch panel, 2D joystick, or trackball. These 2D locator devices have only two degrees of freedom, requiring the user to control 2D translations separately from 2D rotations. Unfortunately, this separation of rotation and translation is clumsy and somewhat counter-intuitive for control of object placement.
For example, in a typical CAD/CAM or computer drawing application, it is often necessary not only to position the graphical elements or icons by linear translation but also to rotate the elements or icons for proper presentation on the screen. Present drawing and graphical systems accomplish the translation and rotation of an object or element using two separate input modes. In the first input mode, the object or element is selected by the mouse or other locator, and then translated to a new position by mimicking subsequent movement of the locator device. In the second input mode, the object or element is selected, a center of rotation is established, and movements of the locator device are translated into rotation of the object or element about the selected center of rotation.
An example of one embodiment of rotation of graphical objects is the MacDraw system offered by Apple Computer. In this system, the rotational input mode is selected from the appropriate menu and then a mouse click over the graphical element selects the element and automatically determines the center of rotation. This automatic selection of the center of rotation depends on the class of the selected graphical object. The user has little or no control over the location of the center of rotation. Subsequent movement of the mouse results in rotation of the object about this predetermined center of rotation. Another graphical rotation method is embodied in the Adobe PhotoShop system. In this method, one of a set of rotations listed in 10-degree increments is selected from a menu. In another version of this method, also embodied in the Adobe PhotoShop system, rotation angles are selected from an icon depicting arrows or rotation vectors which graphically represent a finite list of possible rotation angles.
It will be appreciated that all of the above mentioned schemes for placement of a graphical element require separate control of rotation and translation. Hence they are extremely cumbersome for drawing and design applications and completely impractical for applications which require real-time, interactive control of object or viewpoint placement.
More particularly, a computer mouse has a sensor which detects either absolute position (x, y) or position displacement (Δx, Δy). The detected position or displacement is input to the computer and can be used to control the location of a graphical object on the computer screen. Sensors of position or displacement can be: optical, detecting light reflected from a patterned mouse pad; mechanical, detecting movement of the mouse using encoders on a roller on the base of the mouse; or electromagnetic, sensing absolute position over a grid of active wires embedded in a tablet.
Prior locator devices detect 2D position or displacement of the mouse with a single sensor, or a single sensor system, that records either the position pair, (x, y), or the displacement pair, (Δx, Δy). Rotation of the mouse about the sensor is not detected. Because the sensor monitors only 2 variables, x and y, or Δx and Δy, prior systems have only 2 degrees of freedom and are incapable of simultaneous control of position and orientation of graphical elements. Instead, conventional locator devices require that position and orientation of graphical objects be separately controlled. This is typically done by having two separate input modes for the locator device. In the first mode, mouse movements are interpreted as object translations. In the second mode, the mouse movements are converted into rotation about a pre-selected origin, which can either be implicit or set explicitly by the user. Switching between these modes can be relatively clumsy and cannot be done quickly enough for real-time applications.
SUMMARY OF THE INVENTION
In order to alleviate the complexity and cumbersome nature of prior systems for translating and rotating a graphical element or object, the subject invention provides a computer mouse-type locator device which enables simultaneous 2D rotation and translation of graphical objects. Using this device, the graphical object's position and orientation is made to mimic the position and orientation of the locator device. In this way, both 2D rotations and translations of the user's hand are directly translated into corresponding motions of the selected graphical object. This form of control provides an intuitive and simple interface for graphical object positioning.
Current 2D locator devices include a single sensor or a single sensor system which detects a pair of absolute positions, (x1, y1), or a pair of displacements, (Δx1, Δy1). Examples of such sensor systems include a trackball with two orthogonally placed displacement encoders or a light pen and a tablet with an encoded grid. In the subject locator device, a second sensor or sensor system is placed on the locator device at a position which is physically separated from the first sensor. This second sensor or sensor system provides the necessary and sufficient information for simultaneous 2D translation and rotation. The subject design detects pairs of absolute positions, (x1, y1) and (x2, y2), or pairs of displacements, (Δx1, Δy1) and (Δx2, Δy2). By adding the second sensor system, the subject locator device now has the required three degrees of freedom. Hence, the inputs from the subject locator device, in addition to a pre-set center of rotation, enable both position and orientation to be updated simultaneously.
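Concretely, the three degrees of freedom can be recovered from the two displacement pairs alone. The following sketch (Python; the function name, the sign convention, and passing the fixed sensor separation a as a parameter are assumptions for illustration, not part of the patent) applies the chord-angle relation developed in the Detailed Description:

```python
import math

def pose_delta(dx1, dy1, dx2, dy2, a):
    """Recover the three degrees of freedom (dx, dy, dtheta) from the
    displacement pairs reported by the two sensors.

    (dx1, dy1), (dx2, dy2) -- displacements of the primary and secondary
    sensors; a -- fixed physical separation between the two sensors.
    """
    # Chord traced by the secondary sensor relative to the primary one.
    rel_x, rel_y = dx2 - dx1, dy2 - dy1
    d = math.hypot(rel_x, rel_y)
    # Chord-angle relation: rotating by theta about the primary sensor
    # moves the secondary sensor along a chord of length d = 2a sin(theta/2).
    theta = 2.0 * math.asin(min(d / (2.0 * a), 1.0))
    # Sign convention (assumption): with the secondary sensor mounted in
    # the +x direction of the device, positive rel_y is counterclockwise.
    if rel_y < 0.0:
        theta = -theta
    # The translation is taken to follow the primary sensor.
    return dx1, dy1, theta
```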
Given the subject hardware design, there are several possible software solutions for converting the position or displacement pairs into object placement. The optimal software solution for a given situation will be application dependent. The important contribution of the proposed locator device design is that it supplies the information required for simultaneous control of 2D translation and rotation. Several software strategies could easily be developed to customize an application's response to movements of the locator device.
There are various applications for the subject locator device which span many areas of computer graphics, computer design, and virtual reality. Examples from three important areas are listed below.
With respect to computer assisted drawing, art, and desktop publishing, the subject locator device may be used with supporting software to interactively position and orient graphical objects, designs, text, pictures, and photographs on a canvas or image. The subject locator device offers an advantage over existing devices because it enables intuitive and interactive placement of the objects and eliminates the need for using separate modes to translate and rotate the objects.
With respect to interactive graphics applications, it will be noted that applications which involve 2D object manipulation or maneuvering through 2D space benefit from the intuitive interface that is enabled by the subject device. Examples of computer games that benefit from this technology include virtual jigsaw puzzles and 2D maze traversal. Examples in manufacturing include interactive assembly of 2D machine parts and graphical testing of insertability. Examples in computer assisted surgery include pre-surgical planning involving sizing and placement of implants relative to a pre-surgical medical image.
Finally, it will be appreciated that the subject locator device may be used to great advantage to control the 2D position and angle of the viewpoint for an individual moving or walking through a virtual reality scene. In an interactive, immersive virtual reality environment, it is extremely important that viewpoint control be simple and intuitive. In many virtual reality applications, viewpoint control is accomplished by tracking the 3D head position of the user with either an external or head-mounted device. However, it will be appreciated that in many situations, such as virtual architectural walk-throughs or the exploration of virtual worlds, the user's feet remain on the floor and maneuvering through the space mostly requires positioning and orienting the person in the 2D floor space. Hence, in these applications, the subject locator device provides a simple, inexpensive, single-handed input device that enables users to guide themselves through virtual environments.
In summary, a three degree-of-freedom locator device for the control of graphical objects on a computer display mimics natural 2D movement of a user's hand by providing for simultaneous translation and rotation of the graphical object. In one embodiment, a computer mouse-type locator with two trackball and position-encoder sensor systems detects movement of the locator over a stationary pad or like device. The graphical object is selected with a mouse button click and subsequent 2D translation and rotation of the locator are mimicked by the graphical object. The result is intuitive placement of the selected object on the computer screen by transforming the natural movement of the mouse in the user's hand into simultaneous translation and rotation of the object. The subject locator device has applications in many areas of computer graphics, computer-aided design, and virtual reality.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other features of the subject invention will be better understood in conjunction with the Detailed Description taken together with the Drawings, of which:
FIGS. 1A and 1B are respectively bottom and top views of a dual trackball mouse for control of object translation and rotation on a display via a single mouse movement;
FIG. 2 is a diagrammatic illustration of the movement of the dual trackball mouse of FIG. 1, illustrating both mouse movement and corresponding movement of the designated image on the display;
FIG. 3 is a diagrammatic illustration of the connection of the dual trackball mouse of FIG. 1 to a computer, with attendant display;
FIG. 4 is a diagrammatic illustration of the utilization of the dual trackball mouse of FIG. 1 to translate and rotate a jigsaw puzzle piece into the appropriate position within the puzzle;
FIG. 5 is a top view of a room layout, in which points A-H represent the path of an observer moving through the virtual building, such that the position and orientation of the observer's gaze are controlled by a continuous, natural movement of the dual trackball mouse;
FIGS. 6-13 are diagrammatic illustrations showing scenes viewed by the observer at selected points A-H along the virtual path; and
FIG. 14 is a block diagram illustrating one system for translating the outputs from a dual trackball mouse into translations and rotations of the graphical object selected to be moved and rotated.
DETAILED DESCRIPTION
Prior 2D locator devices include a single sensor or a single sensor system which detects a pair of absolute positions, (x1, y1) or a pair of displacements, (Δx1, Δy1). Examples of such sensor systems include a trackball with two orthogonally placed displacement encoders or a light pen and a tablet with an encoded grid. The purpose of the encoding system is to determine trackball rotation and to be able to translate trackball rotation into x and y displacements.
In this embodiment, the dual trackball mouse of FIGS. 1A and 1B includes a pair of trackballs 12 and 14 which project from a bottom surface 16 of mouse housing 18. When the mouse is inverted, the trackballs co-act with a surface 20, causing the trackballs to rotate within their respective housings as mouse 10 is moved across surface 20. As can be seen, a top surface 22 of mouse 10 has a pair of switches or buttons, 24 and 26, which are utilized in the normal manner to click on and thereby select a graphical object on the screen.
As illustrated by arrow 28 in FIG. 1A, the second sensor or sensor system is placed on the locator device at a position which is physically separated from the first sensor. This second sensor or sensor system provides the third degree of freedom required for simultaneous 2D translation and rotation. The subject design detects, and inputs to a computer 30 of FIG. 3 via connection 32, pairs of absolute positions, (x1, y1) and (x2, y2), or pairs of displacements, (Δx1, Δy1) and (Δx2, Δy2). These inputs, in addition to a preset center of rotation, enable both position and orientation to be updated simultaneously. It will be appreciated that the center of rotation of an object on a screen can be established explicitly by a mouse click on the desired position on the graphical display or implicitly by pre-specifying a point on the screen or the graphical object. For example, the center of mass could be established a priori as the object's center of rotation.
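As a small illustration of such an implicit choice, the center of mass of a polygonal object might be approximated by averaging its vertices (a sketch; representing the object as a vertex list is an assumption, and a true area centroid would differ for irregular polygons):

```python
def vertex_centroid(points):
    """Average of the object's vertices: a simple implicit rotation center."""
    n = len(points)
    return (sum(x for x, _ in points) / n,
            sum(y for _, y in points) / n)
```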
Referring now to FIG. 2, in one application, an object 40 on screen 42 within a room 44 is translated and rotated with one mouse movement to a position 40' as illustrated in dotted outline. This requires translation of the object as illustrated by arrow 44 and a rotation of the object as illustrated by arrow 46.
As can be seen, mouse 10 is rotated over pad 20 such that the orientation and position of object 40 mimics that of the mouse as the mouse is moved across surface 20. This can be seen by examining edge 50 of mouse 10 which corresponds to the orientation of chair arm 52 of object 40. It can be seen that edge 50' corresponds to arm orientation 52' as the mouse is moved from one position to the other over surface 20.
The rotary encoders normally utilized in a single-trackball mouse convert the rotation of the ball into orthogonal distances based on the movement of the surface of the ball in these two orthogonal directions. This provides the Δx and Δy of the equations presented below. Having derived the translation of one of the trackballs in the two orthogonal directions, one can utilize this same information from a trackball physically removed from the first trackball to derive the angular rotation of the mouse. Being thus able to sense both the translation and rotation of the mouse simultaneously over a surface, it is relatively easy to make a corresponding object mimic the motion of the mouse.
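As a rough illustration of this encoder stage, each trackball's two encoders reduce to a scale factor from counts to surface distance (a sketch; the resolution and circumference figures are invented for the example and vary by device):

```python
COUNTS_PER_REV = 512        # encoder ticks per roller revolution (invented)
MM_PER_REV = 6.0            # ball surface travel per roller revolution (invented)
MM_PER_COUNT = MM_PER_REV / COUNTS_PER_REV

def counts_to_displacement(counts_x, counts_y):
    """Convert the two orthogonal encoder counts of one trackball into the
    (dx, dy) displacement of the ball surface, i.e. of the mouse itself."""
    return counts_x * MM_PER_COUNT, counts_y * MM_PER_COUNT
```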
Referring now to FIG. 4, in another application, a piece 60 of a jigsaw puzzle 62 is rotated and translated into place as illustrated at 60' through the corresponding translation and rotation of mouse 10 to position 10' through translation 64 and rotation 66. Thus, in a two-dimensional space, objects can be moved from one position to another in a single movement of a mouse or like locator device.
Applications for such a locator device extend to the arena of virtual reality in which an individual can seemingly navigate through a virtual reality scene with the simple movement of this specialized mouse. Movements of the mouse provide a change in the scene corresponding to the position and direction of the gaze as controlled by the mouse.
FIG. 5 illustrates such a virtual reality scene where the user navigates through a series of rooms 70 along the virtual path represented by the dotted line 72. It can be seen that the path starts at a portal 76 and proceeds through a doorway 78 into a room 80, with the path then exiting room 80 through door 82 into room 84. The path then extends through a widened arch 86 into a room 88 and thence through door 90 back into the original foyer indicated by reference character 92.
In each of the succeeding figures, namely FIGS. 6-13, path 72 is established through the movement of mouse 10 along path 72' over surface 20. The gaze of the observer is in the direction of the path at any given point along the path. It is to be appreciated that in addition to control of the gaze and movement through the virtual space in the manner described above, it is also possible, by using mouse buttons in addition to mouse movement, to fix the gaze on a specific point or object and use the movement of the mouse to control only the virtual position of the observer. In this method, the scene is presented as if the observer walks along the dotted line while turning his/her head so that his/her gaze is constantly fixed on the selected object or position within the virtual scene. It is also to be appreciated that buttons on the mouse can be utilized to control many other functions, including image zooming, object launching, or the control of specialized illumination.
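One plausible realization of this viewpoint control (a sketch; the class and its interface are invented for illustration) integrates the locator's per-event translation and rotation into a 2D pose and, while the relevant button is held, pins the gaze to a selected target:

```python
import math

class Observer:
    """2D pose of the viewer: position (x, y) and gaze heading in radians."""

    def __init__(self, x=0.0, y=0.0, heading=0.0):
        self.x, self.y, self.heading = x, y, heading

    def update(self, dx, dy, dtheta, gaze_target=None):
        # Integrate the locator's translation into the viewer position.
        self.x += dx
        self.y += dy
        if gaze_target is None:
            # Free mode: the gaze follows the rotation of the device.
            self.heading += dtheta
        else:
            # Button held: gaze stays fixed on the selected target.
            tx, ty = gaze_target
            self.heading = math.atan2(ty - self.y, tx - self.x)
```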
Referring now to FIG. 14, a block diagram is shown for translating the outputs from the dual trackball mouse of FIG. 1 into translation and rotation of a graphical object. In this figure, inputs 100 from the dual trackball mouse are applied to a unit 102 which calculates intermediate values d and a. These intermediate values are used in calculating the rotation matrix M and the translation vector T in unit 104.
Unit 104 calculates both the rotation matrix M and the translation vector T required to reposition and reorient the selected graphical object. Unit 106 shows how the object placement at time t is generated from the object placement at time t-1. This is accomplished by first rotating all points (x, y) in the graphical object by pre-multiplying by rotation matrix M and then by translating the resultant vector by the vector T.
More specifically, FIG. 14 illustrates one algorithm that provides an intuitive translation of mouse movement into object placement. In this example algorithm, the center of rotation of the graphical object is determined from the position of the locator device over the graphical object when the object is selected. After the object is selected and the center of rotation established, motions of the locator device are used to reposition the graphical object.
In this example algorithm, the center of rotation of the object lies at the position of one of the two sensors or sensor systems, called the primary sensor. The object transformation is then composed of a rotation of the object about the primary sensor position, (x1, y1), by the angle θ = 2 sin⁻¹(d/2a), followed by a translation of the rotated object by d1 = (Δx1, Δy1), where:

$$d = \sqrt{(\Delta x_2 - \Delta x_1)^2 + (\Delta y_2 - \Delta y_1)^2}, \qquad a = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2};$$

that is, d is the displacement of the second sensor relative to the first and a is the fixed distance between the two sensors.
The transformation of any point (x, y) in the original object onto its new position (x', y') can be calculated using the matrix equation:

$$\begin{pmatrix} x' \\ y' \end{pmatrix} = M \begin{pmatrix} x \\ y \end{pmatrix} + T, \qquad M = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, \qquad T = \begin{pmatrix} x_1 + \Delta x_1 \\ y_1 + \Delta y_1 \end{pmatrix} - M \begin{pmatrix} x_1 \\ y_1 \end{pmatrix}.$$
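Read together, units 102, 104 and 106 amount to only a few lines of computation. The sketch below is one possible rendering (Python; displacement-pair input and a counterclockwise sign convention are assumptions for illustration, since the text above specifies only the magnitude of θ):

```python
import math

def place_object(points, x1, y1, dx1, dy1, dx2, dy2, a):
    """One update step: rotate `points` about the primary sensor position
    (x1, y1) by theta = 2 asin(d / 2a), then translate by (dx1, dy1)."""
    # Unit 102: intermediate values d and a (a passed in as the fixed
    # sensor separation; it could equally be computed from positions).
    rel_x, rel_y = dx2 - dx1, dy2 - dy1
    d = math.hypot(rel_x, rel_y)
    theta = 2.0 * math.asin(min(d / (2.0 * a), 1.0))
    if rel_y < 0.0:                    # sign convention (assumption)
        theta = -theta
    # Unit 104: rotation matrix M and translation vector T,
    # with T = center + d1 - M * center so that p' = M p + T.
    c, s = math.cos(theta), math.sin(theta)
    tx = x1 + dx1 - (c * x1 - s * y1)
    ty = y1 + dy1 - (s * x1 + c * y1)
    # Unit 106: apply p' = M p + T to every point of the object.
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]
```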
It will be appreciated that the above is only one of many algorithms that may be utilized given the information available from the dual, laterally displaced sensor systems such as provided by the dual trackball system. For example, a different mapping function could rotate the object about an arbitrary origin (or axis in 3D space) or could adjust the relative sensitivity of the object to translational and rotational movements of the locator device.
It will be appreciated that the center of rotation may be made to correspond to the center of mass of the object. Alternatively, this center of rotation can be made to correspond to a corner or other predetermined position on the graphical object or display. Moreover, the center of rotation can be made to correspond to any arbitrary point chosen interactively by the user.
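Whichever point is chosen, only the translation vector T changes; the rotation matrix M is unaffected. A brief sketch of the general form (hypothetical helper, same conventions as the previous sketch):

```python
import math

def transform_about(points, cx, cy, theta, dx, dy):
    """Rotate `points` by theta about center (cx, cy), then translate by (dx, dy)."""
    c, s = math.cos(theta), math.sin(theta)
    # T = center + translation - M * center, as before.
    tx = cx + dx - (c * cx - s * cy)
    ty = cy + dy - (s * cx + c * cy)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]
```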
While the subject invention has been described in detail in terms of a dual trackball mouse, simultaneous translation and rotation of a graphical object can be accomplished by any locator device having two sensor systems that are spaced apart at a fixed distance, and which both detect orthogonal pairs of position displacement or absolute position. Moreover, it will be appreciated that since the second sensor is placed at a fixed distance from the original sensor, the second sensor system actually only requires the ability to detect a single displacement or position (such as Δx or x) since the second displacement or position (Δy or y) could then be calculated from the known geometry.
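To make that last observation concrete: because the updated sensor positions must remain the fixed distance a apart, the unmeasured coordinate is a root of that rigid-body constraint. A sketch in absolute-position terms (hypothetical function; the two geometric roots are disambiguated here by continuity with the previous sample, a choice the patent leaves open):

```python
import math

def recover_y2(x1_new, y1_new, x2_new, y2_prev, a):
    """Solve |p2' - p1'| = a for the unmeasured y coordinate of sensor 2,
    given the full new position of sensor 1 and the new x of sensor 2."""
    under = a * a - (x2_new - x1_new) ** 2
    if under < 0.0:
        raise ValueError("inconsistent reading: sensor separation exceeded")
    root = math.sqrt(under)
    hi, lo = y1_new + root, y1_new - root
    # Two geometric roots; resolve the ambiguity by continuity, keeping
    # the solution closer to the previously known y position of sensor 2.
    return hi if abs(hi - y2_prev) <= abs(lo - y2_prev) else lo
```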
Having now described a few embodiments of the invention, and some modifications and variations thereto, it should be apparent to those skilled in the art that the foregoing is merely illustrative and not limiting, having been presented by way of example only. Numerous modifications and other embodiments are within the scope of one of ordinary skill in the art and are contemplated as falling within the scope of the invention as limited only by the appended claims and equivalents thereto.

Claims (14)

I claim:
1. Apparatus for the control of graphical objects appearing on a computer driven display, comprising:
a locator device for permitting simultaneous two-dimensional translation and rotation of said graphical object including spaced-apart rollerball sensors positioned on said locator device in a fixed relationship to each other on the same side of said locator device, each of said rollerball sensors including means for generating signals corresponding to rollerball movement for simultaneously generating an output corresponding to the angle associated with the angular orientation of said locator device and the translation of said locator device as said locator device is moved and rotated in a two-dimensional plane; and
means coupled to the output of said locator device for translating and rotating said graphical object in accordance with the sensed displacements of said sensors to mimic the angular rotation and displacement of said locator device, whereby the movement of said graphical object on said display is controlled by the movement of said locator device.
2. The apparatus of claim 1, wherein the output of said locator device is a pair of absolute positions of said sensors.
3. The apparatus of claim 1, wherein the output of said locator device is a pair of displacements of said sensors.
4. The apparatus of claim 3, wherein said means for moving said graphical object includes means for calculating translation and rotation matrices from said angle, said matrices controlling the position and orientation of said graphical object.
5. The apparatus of claim 4, wherein said matrix corresponding to the rotation of said graphical object by θ is given by the matrix:

$$M = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, \qquad \theta = 2\sin^{-1}\!\left(\frac{\sqrt{(\Delta x_2 - \Delta x_1)^2 + (\Delta y_2 - \Delta y_1)^2}}{2\sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}}\right),$$

and where x and y denote the positions of respective sensors at respective positions 1 and 2, and Δx and Δy denote translations of said sensors.
6. The apparatus of claim 1, wherein said locator device includes a mouse which has multiple rollerballs.
7. The apparatus of claim 1, wherein said locator device includes light sensors.
8. The apparatus of claim 1, wherein said locator device includes rotary encoders.
9. The apparatus of claim 1, and further including means for selecting a graphical object on said display for translation and rotation thereof and means for establishing the point about which said object is to be rotated.
10. The apparatus of claim 9, wherein said means for establishing the point about which said object is rotated includes means for determining center of mass of said object and means for establishing said point as said center of mass.
11. The apparatus of claim 9, wherein said means for establishing the point about which said object is rotated includes means for establishing said point of rotation as a predetermined point at said object.
12. The apparatus of claim 11, wherein said predetermined point is a corner of said object.
13. The apparatus of claim 11, wherein said predetermined point is the geographic center of said object.
14. The apparatus of claim 11, and further including means at said locator for selecting the object to be translated and rotated and for establishing on said object the position of said predetermined point.
US08/291,667 1994-08-17 1994-08-17 Locator device for control of graphical objects Expired - Lifetime US5512920A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US08/291,667 US5512920A (en) 1994-08-17 1994-08-17 Locator device for control of graphical objects

Publications (1)

Publication Number Publication Date
US5512920A (en) 1996-04-30

Family

ID=23121297

Family Applications (1)

Application Number Title Priority Date Filing Date
US08/291,667 Expired - Lifetime US5512920A (en) 1994-08-17 1994-08-17 Locator device for control of graphical objects

Country Status (1)

Country Link
US (1) US5512920A (en)

Cited By (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5689669A (en) * 1994-04-29 1997-11-18 General Magic Graphical user interface for navigating between levels displaying hallway and room metaphors
WO1998031005A1 (en) * 1997-01-09 1998-07-16 Virtouch Ltd. Mouse-like input/output device with display screen and method for its use
US5883619A (en) * 1996-11-12 1999-03-16 Primax Electronics Ltd. Computer mouse for scrolling a view of an image
US5936612A (en) * 1997-05-30 1999-08-10 Wang; Yanqing Computer input device and method for 3-D direct manipulation of graphic objects
US5949408A (en) * 1995-09-28 1999-09-07 Hewlett-Packard Company Dual orientation display handheld computer devices
US5966130A (en) * 1994-05-12 1999-10-12 Benman, Jr.; William J. Integrated virtual networks
US5973704A (en) * 1995-10-09 1999-10-26 Nintendo Co., Ltd. Three-dimensional image processing apparatus
US6001015A (en) * 1995-10-09 1999-12-14 Nintendo Co., Ltd. Operation controlling device and video processing system used therewith
US6002351A (en) * 1995-11-10 1999-12-14 Nintendo Co., Ltd. Joystick device
US6007428A (en) * 1995-10-09 1999-12-28 Nintendo Co., Ltd. Operation controlling device and video processing system used therewith
US6102803A (en) * 1995-05-10 2000-08-15 Nintendo Co., Ltd. Operating device with analog joystick
US6115028A (en) * 1996-08-22 2000-09-05 Silicon Graphics, Inc. Three dimensional input system using tilt
US6139434A (en) * 1996-09-24 2000-10-31 Nintendo Co., Ltd. Three-dimensional image processing apparatus with enhanced automatic and user point of view control
US6139433A (en) * 1995-11-22 2000-10-31 Nintendo Co., Ltd. Video game system and method with enhanced three-dimensional character and background control due to environmental conditions
US6155926A (en) * 1995-11-22 2000-12-05 Nintendo Co., Ltd. Video game system and method with enhanced three-dimensional character and background control
US6241611B1 (en) 1995-05-10 2001-06-05 Nintendo Co., Ltd. Function expansion device and operating device using the function expansion device
US6244959B1 (en) 1996-09-24 2001-06-12 Nintendo Co., Ltd. Three-dimensional image processing system with enhanced character control
US6267673B1 (en) 1996-09-20 2001-07-31 Nintendo Co., Ltd. Video game system with state of next world dependent upon manner of entry from previous world via a portal
US6271842B1 (en) * 1997-04-04 2001-08-07 International Business Machines Corporation Navigation via environmental objects in three-dimensional workspace interactive displays
US6283857B1 (en) 1996-09-24 2001-09-04 Nintendo Co., Ltd. Three-dimensional image processing apparatus with enhanced automatic and user point of view control
US6383079B1 (en) 1995-11-22 2002-05-07 Nintendo Co., Ltd. High performance/low cost video game system with multi-functional peripheral processing subsystem
US6497618B1 (en) 1995-10-09 2002-12-24 Nintendo Co. Ltd. Video game system with data transmitting/receiving controller
US6679776B1 (en) 1997-07-17 2004-01-20 Nintendo Co., Ltd. Video game system
US20040212587A1 (en) * 2003-04-25 2004-10-28 Microsoft Corporation Computer input device with angular displacement detection capabilities
US20050215321A1 (en) * 2004-03-29 2005-09-29 Saied Hussaini Video game controller with integrated trackball control device
US6967644B1 (en) * 1998-10-01 2005-11-22 Canon Kabushiki Kaisha Coordinate input apparatus and control method thereof, and computer readable memory
US20050278711A1 (en) * 2002-11-28 2005-12-15 Silva Sonia D Method and assembly for processing, viewing and installing command information transmitted by a device for manipulating images
GB2391615B (en) * 2002-04-08 2006-06-14 Agilent Technologies Inc Apparatus and method for sensing rotation
US7126584B1 (en) 1995-10-09 2006-10-24 Nintendo Co., Ltd. Operating device and image processing system using same
WO2006131945A2 (en) * 2005-06-08 2006-12-14 Nicola Narracci Pointing device, or mouse
US7305631B1 (en) * 2002-09-30 2007-12-04 Danger, Inc. Integrated motion sensor for a data processing device
US20090093857A1 (en) * 2006-12-28 2009-04-09 Markowitz H Toby System and method to evaluate electrode position and spacing
US20090264739A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Determining a position of a member within a sheath
US20090264738A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Method and apparatus for mapping a structure
US20090264750A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Locating a member in a structure
US20090264752A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Method And Apparatus For Mapping A Structure
US20090280301A1 (en) * 2008-05-06 2009-11-12 Intertape Polymer Corp. Edge coatings for tapes
US20090297001A1 (en) * 2008-04-18 2009-12-03 Markowitz H Toby Method And Apparatus For Mapping A Structure
US20110051845A1 (en) * 2009-08-31 2011-03-03 Texas Instruments Incorporated Frequency diversity and phase rotation
US20110106203A1 (en) * 2009-10-30 2011-05-05 Medtronic, Inc. System and method to evaluate electrode position and spacing
US8135467B2 (en) 2007-04-18 2012-03-13 Medtronic, Inc. Chronically-implantable active fixation medical electrical leads and related methods for non-fluoroscopic implantation
US8175681B2 (en) 2008-12-16 2012-05-08 Medtronic Navigation Inc. Combination of electromagnetic and electropotential localization
US8340751B2 (en) 2008-04-18 2012-12-25 Medtronic, Inc. Method and apparatus for determining tracking a virtual point defined relative to a tracked member
US8419741B2 (en) 2000-03-17 2013-04-16 Kinamed, Inc. Marking template for installing a custom replacement device for resurfacing a femur and associated installation method
EP2609862A1 (en) * 2010-08-23 2013-07-03 FUJIFILM Corporation Image display device, method and program
US8494613B2 (en) 2009-08-31 2013-07-23 Medtronic, Inc. Combination localization system
US8494614B2 (en) 2009-08-31 2013-07-23 Regents Of The University Of Minnesota Combination localization system
EP2701039A2 (en) * 2011-04-21 2014-02-26 Cheolwoo Kim Universal motion controller in which a 3d movement and a rotational input are possible
US11243618B1 (en) * 2021-05-25 2022-02-08 Arkade, Inc. Computer input devices having translational and rotational degrees of freedom
USD999776S1 (en) * 2020-04-29 2023-09-26 Toontrack Music Ab Display screen or portion thereof with graphical user interface
USD1000474S1 (en) * 2021-03-17 2023-10-03 Beijing Xiaomi Mobile Software Co., Ltd. Display screen with animated graphical user interface
USD1017637S1 (en) * 2022-01-20 2024-03-12 Clo Virtual Fashion Inc. Display panel with icon
USD1017638S1 (en) * 2022-01-20 2024-03-12 Clo Virtual Fashion Inc. Display panel with icon

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4759075A (en) * 1983-03-14 1988-07-19 Ana Tech Corporation Method and apparatus for vectorizing documents and symbol recognition
US5263135A (en) * 1985-07-18 1993-11-16 Canon Kabushiki Kaisha Image processing apparatus
US4917516A (en) * 1987-02-18 1990-04-17 Retter Dale J Combination computer keyboard and mouse data entry system
US4827413A (en) * 1987-06-16 1989-05-02 Kabushiki Kaisha Toshiba Modified back-to-front three dimensional reconstruction algorithm
US5298919A (en) * 1991-08-02 1994-03-29 Multipoint Technology Corporation Multi-dimensional input device

Cited By (141)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5896133A (en) * 1994-04-29 1999-04-20 General Magic Graphical user interface for navigating between street, hallway, room, and function metaphors
US5689669A (en) * 1994-04-29 1997-11-18 General Magic Graphical user interface for navigating between levels displaying hallway and room metaphors
US5966130A (en) * 1994-05-12 1999-10-12 Benman, Jr.; William J. Integrated virtual networks
US6102803A (en) * 1995-05-10 2000-08-15 Nintendo Co., Ltd. Operating device with analog joystick
US6461242B2 (en) 1995-05-10 2002-10-08 Nintendo Co., Ltd. Operating device for an image processing apparatus
US6489946B1 (en) 1995-05-10 2002-12-03 Nintendo Co., Ltd. Operating device with analog joystick
US6241611B1 (en) 1995-05-10 2001-06-05 Nintendo Co., Ltd. Function expansion device and operating device using the function expansion device
US6186896B1 (en) 1995-05-10 2001-02-13 Nintendo Co., Ltd. Operating device with analog joystick
US5949408A (en) * 1995-09-28 1999-09-07 Hewlett-Packard Company Dual orientation display handheld computer devices
US20050174328A1 (en) * 1995-10-09 2005-08-11 Nintendo Co., Ltd. User controlled graphics object movement based on a amount of joystick angular rotation and point of view angle
US6007428A (en) * 1995-10-09 1999-12-28 Nintendo Co., Ltd. Operation controlling device and video processing system used therewith
US6200253B1 (en) 1995-10-09 2001-03-13 Nintendo Co., Ltd. Controller pack
US7102618B2 (en) 1995-10-09 2006-09-05 Nintendo Co., Ltd. User controlled graphics object movement based on a amount of joystick angular rotation and point of view angle
US6917356B1 (en) 1995-10-09 2005-07-12 Nintendo Co. Ltd. User controlled graphics object movement based on amount of joystick angular rotation and point of view angle
US6778190B1 (en) 1995-10-09 2004-08-17 Nintendo Co., Ltd. Three-dimensional image processing apparatus
US6332840B1 (en) 1995-10-09 2001-12-25 Ninetendo Co., Ltd. Operation controlling device and video processing system used therewith
US6676520B2 (en) 1995-10-09 2004-01-13 Nintendo Co., Ltd. Video game system providing physical sensation
US6421056B1 (en) 1995-10-09 2002-07-16 Nintendo Co., Ltd. Three-dimensional image processing apparatus
US6001015A (en) * 1995-10-09 1999-12-14 Nintendo Co., Ltd. Operation controlling device and video processing system used therewith
US5973704A (en) * 1995-10-09 1999-10-26 Nintendo Co., Ltd. Three-dimensional image processing apparatus
US6590578B2 (en) 1995-10-09 2003-07-08 Nintendo Co., Ltd. Three-dimensional image processing apparatus
US6497618B1 (en) 1995-10-09 2002-12-24 Nintendo Co. Ltd. Video game system with data transmitting/receiving controller
US7126584B1 (en) 1995-10-09 2006-10-24 Nintendo Co., Ltd. Operating device and image processing system using same
US7594854B2 (en) 1995-10-09 2009-09-29 Nintendo Co., Ltd. Video game system with data transmitting/receiving controller
US6325718B1 (en) 1995-10-09 2001-12-04 Nintendo Co., Ltd. Operation controlling device and video processing system used therewith
US6307486B1 (en) 1995-11-10 2001-10-23 Nintendo Co., Ltd. Joystick device
US6002351A (en) * 1995-11-10 1999-12-14 Nintendo Co., Ltd. Joystick device
US6139433A (en) * 1995-11-22 2000-10-31 Nintendo Co., Ltd. Video game system and method with enhanced three-dimensional character and background control due to environmental conditions
US6155926A (en) * 1995-11-22 2000-12-05 Nintendo Co., Ltd. Video game system and method with enhanced three-dimensional character and background control
US6331146B1 (en) 1995-11-22 2001-12-18 Nintendo Co., Ltd. Video game system and method with enhanced three-dimensional character and background control
US6454652B2 (en) 1995-11-22 2002-09-24 Nintendo Co., Ltd. Video game system and method with enhanced three-dimensional character and background control due to environmental conditions
US6383079B1 (en) 1995-11-22 2002-05-07 Nintendo Co., Ltd. High performance/low cost video game system with multi-functional peripheral processing subsystem
US6115028A (en) * 1996-08-22 2000-09-05 Silicon Graphics, Inc. Three dimensional input system using tilt
US6267673B1 (en) 1996-09-20 2001-07-31 Nintendo Co., Ltd. Video game system with state of next world dependent upon manner of entry from previous world via a portal
US6244959B1 (en) 1996-09-24 2001-06-12 Nintendo Co., Ltd. Three-dimensional image processing system with enhanced character control
US6491585B1 (en) 1996-09-24 2002-12-10 Nintendo Co., Ltd. Three-dimensional image processing apparatus with enhanced automatic and user point of view control
US6139434A (en) * 1996-09-24 2000-10-31 Nintendo Co., Ltd. Three-dimensional image processing apparatus with enhanced automatic and user point of view control
US6283857B1 (en) 1996-09-24 2001-09-04 Nintendo Co., Ltd. Three-dimensional image processing apparatus with enhanced automatic and user point of view control
US5883619A (en) * 1996-11-12 1999-03-16 Primax Electronics Ltd. Computer mouse for scrolling a view of an image
US5912660A (en) * 1997-01-09 1999-06-15 Virtouch Ltd. Mouse-like input/output device with display screen and method for its use
WO1998031005A1 (en) * 1997-01-09 1998-07-16 Virtouch Ltd. Mouse-like input/output device with display screen and method for its use
US6271842B1 (en) * 1997-04-04 2001-08-07 International Business Machines Corporation Navigation via environmental objects in three-dimensional workspace interactive displays
US5936612A (en) * 1997-05-30 1999-08-10 Wang; Yanqing Computer input device and method for 3-D direct manipulation of graphic objects
US6679776B1 (en) 1997-07-17 2004-01-20 Nintendo Co., Ltd. Video game system
US7070507B2 (en) 1997-07-17 2006-07-04 Nintendo Co., Ltd. Video game system
US6967644B1 (en) * 1998-10-01 2005-11-22 Canon Kabushiki Kaisha Coordinate input apparatus and control method thereof, and computer readable memory
US9393032B2 (en) * 2000-03-17 2016-07-19 Kinamed, Inc. Marking template for installing a custom replacement device for resurfacing a femur and associated installation method
US20140194998A1 (en) * 2000-03-17 2014-07-10 Kinamed, Inc. Marking template for installing a custom replacement device for resurfacing a femur and associated installation method
US20140194997A1 (en) * 2000-03-17 2014-07-10 Kinamed, Inc. Marking template for installing a custom replacement device for resurfacing a femur and associated installation method
US8936601B2 (en) * 2000-03-17 2015-01-20 Kinamed, Inc. Marking template for installing a custom replacement device for resurfacing a femur and associated installation method
US8936602B2 (en) * 2000-03-17 2015-01-20 Kinamed, Inc. Marking template for installing a custom replacement device for resurfacing a femur and associated installation method
US8961529B2 (en) 2000-03-17 2015-02-24 Kinamed, Inc. Marking template for installing a custom replacement device for resurfacing a femur and associated installation method
US8419741B2 (en) 2000-03-17 2013-04-16 Kinamed, Inc. Marking template for installing a custom replacement device for resurfacing a femur and associated installation method
US8771281B2 (en) 2000-03-17 2014-07-08 Kinamed, Inc. Marking template for installing a custom replacement device for resurfacing a femur and associated installation method
GB2391615B (en) * 2002-04-08 2006-06-14 Agilent Technologies Inc Apparatus and method for sensing rotation
US7305631B1 (en) * 2002-09-30 2007-12-04 Danger, Inc. Integrated motion sensor for a data processing device
US20050278711A1 (en) * 2002-11-28 2005-12-15 Silva Sonia D Method and assembly for processing, viewing and installing command information transmitted by a device for manipulating images
US20040212587A1 (en) * 2003-04-25 2004-10-28 Microsoft Corporation Computer input device with angular displacement detection capabilities
KR101044102B1 (en) 2003-04-25 2011-06-28 마이크로소프트 코포레이션 Computer input device with angular displacement detection capabilities
JP2004326744A (en) * 2003-04-25 2004-11-18 Microsoft Corp Computer input device with angular displacement detection capability
US7081884B2 (en) * 2003-04-25 2006-07-25 Microsoft Corporation Computer input device with angular displacement detection capabilities
US20050215321A1 (en) * 2004-03-29 2005-09-29 Saied Hussaini Video game controller with integrated trackball control device
WO2006131945A3 (en) * 2005-06-08 2007-05-18 Nicola Narracci Pointing device, or mouse
WO2006131945A2 (en) * 2005-06-08 2006-12-14 Nicola Narracci Pointing device, or mouse
US20090093857A1 (en) * 2006-12-28 2009-04-09 Markowitz H Toby System and method to evaluate electrode position and spacing
US7941213B2 (en) 2006-12-28 2011-05-10 Medtronic, Inc. System and method to evaluate electrode position and spacing
US8135467B2 (en) 2007-04-18 2012-03-13 Medtronic, Inc. Chronically-implantable active fixation medical electrical leads and related methods for non-fluoroscopic implantation
US20090264778A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Uni-Polar and Bi-Polar Switchable Tracking System between
US8532734B2 (en) 2008-04-18 2013-09-10 Regents Of The University Of Minnesota Method and apparatus for mapping a structure
US20090262982A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Determining a Location of a Member
US20090262109A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Illustrating a three-dimensional nature of a data set on a two-dimensional display
US20090264744A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Reference Structure for a Tracking System
US20090262979A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Determining a Material Flow Characteristic in a Structure
US20090264743A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Interference Blocking and Frequency Selection
US20090264749A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Identifying a structure for cannulation
US20090267773A1 (en) * 2008-04-18 2009-10-29 Markowitz H Toby Multiple Sensor for Structure Identification
US10426377B2 (en) 2008-04-18 2019-10-01 Medtronic, Inc. Determining a location of a member
US20090297001A1 (en) * 2008-04-18 2009-12-03 Markowitz H Toby Method And Apparatus For Mapping A Structure
CN105769180B (en) * 2008-04-18 2019-07-12 美敦力公司 System and method for correcting the distortion in potential position sensing
US9662041B2 (en) 2008-04-18 2017-05-30 Medtronic, Inc. Method and apparatus for mapping a structure
CN105769180A (en) * 2008-04-18 2016-07-20 美敦力公司 System And Method Used For Correcting Distortion In Potential Position Sensing System
US20090264752A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Method And Apparatus For Mapping A Structure
US20090264751A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Determining the position of an electrode relative to an insulative cover
US8106905B2 (en) * 2008-04-18 2012-01-31 Medtronic, Inc. Illustrating a three-dimensional nature of a data set on a two-dimensional display
US20090264742A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Determining and Illustrating a Structure
US20090264739A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Determining a position of a member within a sheath
US8185192B2 (en) 2008-04-18 2012-05-22 Regents Of The University Of Minnesota Correcting for distortion in a tracking system
US20120130232A1 (en) * 2008-04-18 2012-05-24 Regents Of The University Of Minnesota Illustrating a Three-Dimensional Nature of a Data Set on a Two-Dimensional Display
US8208991B2 (en) 2008-04-18 2012-06-26 Medtronic, Inc. Determining a material flow characteristic in a structure
US8214018B2 (en) 2008-04-18 2012-07-03 Medtronic, Inc. Determining a flow characteristic of a material in a structure
US8260395B2 (en) 2008-04-18 2012-09-04 Medtronic, Inc. Method and apparatus for mapping a structure
US8340751B2 (en) 2008-04-18 2012-12-25 Medtronic, Inc. Method and apparatus for determining tracking a virtual point defined relative to a tracked member
US8345067B2 (en) 2008-04-18 2013-01-01 Regents Of The University Of Minnesota Volumetrically illustrating a structure
CN102118994B (en) * 2008-04-18 2016-05-25 美敦力公司 For the method and apparatus of mapping structure
US8364252B2 (en) 2008-04-18 2013-01-29 Medtronic, Inc. Identifying a structure for cannulation
US8391965B2 (en) 2008-04-18 2013-03-05 Regents Of The University Of Minnesota Determining the position of an electrode relative to an insulative cover
US8421799B2 (en) * 2008-04-18 2013-04-16 Regents Of The University Of Minnesota Illustrating a three-dimensional nature of a data set on a two-dimensional display
US20090264727A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Method and apparatus for mapping a structure
US8424536B2 (en) 2008-04-18 2013-04-23 Regents Of The University Of Minnesota Locating a member in a structure
US8442625B2 (en) 2008-04-18 2013-05-14 Regents Of The University Of Minnesota Determining and illustrating tracking system members
US8457371B2 (en) 2008-04-18 2013-06-04 Regents Of The University Of Minnesota Method and apparatus for mapping a structure
US9332928B2 (en) 2008-04-18 2016-05-10 Medtronic, Inc. Method and apparatus to synchronize a location determination in a structure with a characteristic of the structure
US9179860B2 (en) 2008-04-18 2015-11-10 Medtronic, Inc. Determining a location of a member
US8494608B2 (en) 2008-04-18 2013-07-23 Medtronic, Inc. Method and apparatus for mapping a structure
US9131872B2 (en) 2008-04-18 2015-09-15 Medtronic, Inc. Multiple sensor input for structure identification
US20090264741A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Determining a Size of a Representation of a Tracked Member
US8560042B2 (en) 2008-04-18 2013-10-15 Medtronic, Inc. Locating an indicator
US8660640B2 (en) 2008-04-18 2014-02-25 Medtronic, Inc. Determining a size of a representation of a tracked member
US9101285B2 (en) 2008-04-18 2015-08-11 Medtronic, Inc. Reference structure for a tracking system
US8663120B2 (en) 2008-04-18 2014-03-04 Regents Of The University Of Minnesota Method and apparatus for mapping a structure
US20090264738A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Method and apparatus for mapping a structure
US20090264745A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Method and Apparatus To Synchronize a Location Determination in a Structure With a Characteristic of the Structure
US8768434B2 (en) 2008-04-18 2014-07-01 Medtronic, Inc. Determining and illustrating a structure
US20090264777A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Determining a Flow Characteristic of a Material in a Structure
US20090265128A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Correcting for distortion in a tracking system
US20090264746A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Tracking a guide member
US8831701B2 (en) 2008-04-18 2014-09-09 Medtronic, Inc. Uni-polar and bi-polar switchable tracking system between
US8843189B2 (en) 2008-04-18 2014-09-23 Medtronic, Inc. Interference blocking and frequency selection
US8839798B2 (en) 2008-04-18 2014-09-23 Medtronic, Inc. System and method for determining sheath location
US8887736B2 (en) 2008-04-18 2014-11-18 Medtronic, Inc. Tracking a guide member
US20090264750A1 (en) * 2008-04-18 2009-10-22 Markowitz H Toby Locating a member in a structure
US20090280301A1 (en) * 2008-05-06 2009-11-12 Intertape Polymer Corp. Edge coatings for tapes
US20100304096A2 (en) * 2008-05-06 2010-12-02 Intertape Polymer Corp. Edge coatings for tapes
US8175681B2 (en) 2008-12-16 2012-05-08 Medtronic Navigation Inc. Combination of electromagnetic and electropotential localization
US8731641B2 (en) 2008-12-16 2014-05-20 Medtronic Navigation, Inc. Combination of electromagnetic and electropotential localization
US20110051845A1 (en) * 2009-08-31 2011-03-03 Texas Instruments Incorporated Frequency diversity and phase rotation
US8494613B2 (en) 2009-08-31 2013-07-23 Medtronic, Inc. Combination localization system
US8494614B2 (en) 2009-08-31 2013-07-23 Regents Of The University Of Minnesota Combination localization system
US8355774B2 (en) 2009-10-30 2013-01-15 Medtronic, Inc. System and method to evaluate electrode position and spacing
US20110106203A1 (en) * 2009-10-30 2011-05-05 Medtronic, Inc. System and method to evaluate electrode position and spacing
EP2609862A1 (en) * 2010-08-23 2013-07-03 FUJIFILM Corporation Image display device, method and program
EP2609862A4 (en) * 2010-08-23 2014-06-11 Fujifilm Corp Image display device, method and program
EP2701039A2 (en) * 2011-04-21 2014-02-26 Cheolwoo Kim Universal motion controller in which a 3d movement and a rotational input are possible
EP2701039A4 (en) * 2011-04-21 2014-11-26 Cheolwoo Kim Universal motion controller in which a 3d movement and a rotational input are possible
USD999776S1 (en) * 2020-04-29 2023-09-26 Toontrack Music Ab Display screen or portion thereof with graphical user interface
USD1010680S1 (en) 2020-04-29 2024-01-09 Toontrack Music Ab Display screen or portion thereof with graphical user interface
USD1000474S1 (en) * 2021-03-17 2023-10-03 Beijing Xiaomi Mobile Software Co., Ltd. Display screen with animated graphical user interface
US11243618B1 (en) * 2021-05-25 2022-02-08 Arkade, Inc. Computer input devices having translational and rotational degrees of freedom
US11487367B1 (en) 2021-05-25 2022-11-01 Arkade, Inc. Computer input devices having translational and rotational degrees of freedom
USD1017637S1 (en) * 2022-01-20 2024-03-12 Clo Virtual Fashion Inc. Display panel with icon
USD1017638S1 (en) * 2022-01-20 2024-03-12 Clo Virtual Fashion Inc. Display panel with icon

Similar Documents

Publication Publication Date Title
US5512920A (en) Locator device for control of graphical objects
Burdea et al. Virtual reality technology
Steinicke et al. Taxonomy and implementation of redirection techniques for ubiquitous passive haptic feedback
US5889505A (en) Vision-based six-degree-of-freedom computer input device
US8274535B2 (en) Video-based image control system
Leibe et al. The perceptive workbench: Toward spontaneous and natural interaction in semi-immersive virtual environments
US6181343B1 (en) System and method for permitting three-dimensional navigation through a virtual reality environment using camera-based gesture inputs
Leibe et al. Toward spontaneous interaction with the perceptive workbench
EP1292877B1 (en) Apparatus and method for indicating a target by image processing without three-dimensional modeling
US10762599B2 (en) Constrained virtual camera control
US20150097777A1 (en) 3D Motion Interface Systems and Methods
Evans et al. Tablet-based valuators that provide one, two, or three degrees of freedom
WO2009059716A1 (en) Pointing device and method for operating the pointing device
US20160334884A1 (en) Remote Sensitivity Adjustment in an Interactive Display System
US6714198B2 (en) Program and apparatus for displaying graphical objects
US6489944B2 (en) Apparatus and method for image display
Walairacht et al. 4+4 fingers manipulating virtual objects in mixed-reality environment
Zhang Vision-based interaction with fingers and papers
JPH08129449A (en) Signal input device
JPH10255052A (en) Gesture interface device
US10768721B2 (en) Model controller
Singletary et al. Toward Spontaneous Interaction with the Perceptive Workbench
Baggiani et al. Advanced man-machine interface for cultural heritage
Wittkopf et al. I3-EYE-CUBE
JPH08315172A (en) Virtual body display device

Legal Events

Date Code Title Description
AS Assignment

Owner name: MITSUBISHI ELECTRIC RESEARCH LABORATORIES, INC., MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GIBSON, SARAH F. FRISKEN;REEL/FRAME:007128/0122

Effective date: 19940816

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: MITSUBISHI ELECTRIC INFORMATION TECHNOLOGY CENTER AMERICA, INC.

Free format text: CHANGE OF NAME;ASSIGNOR:MITSUBISHI ELECTRIC RESEARCH LABORATORIES, INC.;REEL/FRAME:008186/0570

Effective date: 19960424

FPAY Fee payment

Year of fee payment: 4

AS Assignment

Owner name: MITSUBISHI ELECTRIC RESEARCH LABORATORIES, INC., MASSACHUSETTS

Free format text: CHANGE OF NAME;ASSIGNOR:MITSUBISHI ELECTRIC INFORMATION TECHNOLOGY CENTER AMERICA, INC.;REEL/FRAME:011564/0329

Effective date: 20000828

REMI Maintenance fee reminder mailed

FPAY Fee payment

Year of fee payment: 8

SULP Surcharge for late payment

Year of fee payment: 7

FPAY Fee payment

Year of fee payment: 12

AS Assignment

Owner name: BINARY SERVICES LIMITED LIABILITY COMPANY, DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MITSUBISHI ELECTRIC RESEARCH LABORATORIES, INC.;REEL/FRAME:020638/0402

Effective date: 20071207

FEPP Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
