USRE40891E1 - Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom - Google Patents
- Publication number
- USRE40891E1 (application US 11/188,284, filed as US 18828405 A)
- Authority
- US
- United States
- Prior art keywords
- axis
- sensor
- controller
- signal
- response
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03543—Mice or pucks
- G06F3/0338—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
- G06F3/03547—Touch pads, in which fingers can move on a surface
- G06F3/0362—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/22—Character recognition characterised by the type of writing
- G06V30/228—Character recognition characterised by the type of writing of three-dimensional handwriting, e.g. writing in the air
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1068—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
Abstract
Disclosed is a multiple coordinate controller device having a three-dimensional body with a first surface portion and a second surface portion, where the second surface portion is not coplanar with the first surface portion. A first transducer with a first sensing surface is coupled to the first surface portion of the body and is capable of detecting both positions and a range of pressure forces at positions on the first sensing surface. The first transducer is further capable of providing a first range of z coordinates at a detected x,y coordinate in response to the range of pressure forces on said first sensing surface. A second transducer having a second sensing surface is coupled to the second surface portion of the body and is capable of detecting both positions and a range of pressure forces at the positions on the second sensing surface. The second transducer is further capable of providing a second range of z coordinates of opposite polarity to the first range of z coordinates in response to the range of forces on the second sensing surface.
Description
The present application is a continuation-in-part of U.S. patent application Ser. No. 08/696,366 filed on Aug. 13, 1996, now abandoned which is a continuation-in-part of U.S. patent application Ser. No. 08/509,797 filed on Aug. 1, 1995, now U.S. Pat. No. 5,729,249, which is a continuation of U.S. patent application Ser. No. 08/238,257 filed on May 3, 1994, now abandoned, which is a continuation-in-part of U.S. patent application Ser. No. 07/798,572 filed on Nov. 26, 1991, now U.S. Pat. No. 5,335,557, all of which are incorporated herein by reference. The present application also claims the benefit of U.S. Provisional Application No. 60/086,036, filed May 19, 1998, which is incorporated herein by reference.
1. Field of the Invention
The present invention relates to the field of input control devices. More specifically, it relates to force-sensitive input-control devices with multiple surfaces capable of providing intuitive input in one to thirty-six degrees of freedom.
2. Description of the Related Art
(a) Prior Art 3D and 6D Input Control Devices
Two-dimensional input control devices such as mice, joysticks, trackballs, light pens and tablets are commonly used for interactive computer graphics. These devices are refined, accurate and easy to use. Three-dimensional (“3D”) devices allow for the positioning of cursors or objects relative to conventional X, Y and Z coordinates. Six-dimensional (“6D”) devices are also capable of orienting or rotating objects. More specifically, 6D devices may provide position information as in a 3D device and further provide rotational control about each of three axes, commonly referred to as roll, pitch and yaw. However, current 3D and 6D input devices do not exhibit the refinement, accuracy or ease of use characteristic of existing 2D input devices. In fact, existing 3D/6D input devices are typically cumbersome, inaccurate, non-intuitive, tiring to use, and limited in their ability to manipulate objects.
One well-known category of 3D computer controllers is the "computer glove," such as the Power Glove controller distributed by Mattel, Inc. Similar devices include the Exos Dextrous Hand Master by Exos, Inc., and the Data Glove by VPL Research, Inc. These controllers are worn as a glove and variously include sensors for determining the position and orientation of the glove and the bend of the various fingers. Position and orientation information is provided by ranging information between multiple electromagnetic or acoustic transducers on a base unit and corresponding sensors on the glove. However, the user is required to wear a bulky and awkward glove, and movement of these awkward controllers in free space is tiring. Further, these devices are typically affected by electromagnetic or acoustic interference, and they are limited in their ability to manipulate objects because of the inherent dissimilarity between the free-form movement of a glove and the more constrained movement of manipulated objects.
A second category of 3D/6D controllers is referred to as "Flying Mice." The Bird controller by Ascension Technology Corp. of Burlington, Vt. tracks position and orientation in six dimensions using pulsed (DC) magnetic fields. However, it is affected by the presence of metals and also requires manipulating the controller in free space. The 2D/6D Mouse of Logitech Inc. is similar in function, but uses acoustic ranging similar to the Mattel device. The 3SPACE sensor from Polhemus, described in U.S. Pat. No. 4,017,858, issued to Jack Kuipers Apr. 12, 1977, uses electromagnetic coupling between three transmitter antennas and three receiver antennas. Three transmitter antenna coils are orthogonally arranged, as are three receiver antennas, and the nine transmitter/receiver combinations provide three-dimensional position and orientation information. However, all "flying mouse" devices require the undesirable and tiring movement of the user's entire arm to manipulate the controller in free space. Further, these devices are either tethered by a cord or sensitive to electromagnetic or acoustic noise.
A device similar to the flying mice is taught in U.S. Pat. No. 4,839,838. This device is a 6D controller using 6 independent accelerometers in an “inertial mouse.” However, the device must still be moved in space, and the use of accelerometers rather than ranging devices limits the accuracy. Another inertial mouse system is taught in U.S. Pat. No. 4,787,051 issued to Lynn T. Olson.
A third category of 3D/6D controllers includes 3D/6D joysticks and trackballs. Spaceball of Spatial Systems, Inc. is a rigid sphere containing strain gauges or optical sensors to measure the forces and torques applied to a motionless ball. The user pushes, pulls or twists the ball to generate 3D translation and orientation control signals. Spaceball is described in detail in U.S. Pat. No. 4,811,608 issued to John A. Hilton Mar. 14, 1989. Similarly, the DIMENSION 6/Geoball controller distributed by CiS Graphics Inc. incorporates a 6-axis optical torque sensor housed in a spherical enclosure. The device measures translational forces and rotational torques. However, these devices are subject to a number of disadvantages. For example, it is difficult to provide for precise positioning, as there is no provision for the use of a stylus. Further, these devices are primarily controlled with hand muscles, rather than with the more precise finger muscles. Further still, these devices provide only relative control and make no provision for an absolute origin or absolute positions. They are therefore not suitable for providing closure in digitized 3D inputs. Finally, they are limited in their ability to provide an intuitive feel for 3D manipulation of a controlled object not specified in the Cartesian coordinate system. For example, they are not readily adaptable to spherical or cylindrical coordinate systems.
(b) Prior Art Force-sensitive Transducers
Force-sensitive transducers are characterized in that they do not require a significant amount of motion in order to provide a control input. These devices have appeared in a number of configurations, some of which can sense not only the presence or absence of the touch of a user's finger or stylus, but can also quantitatively measure the amount of force applied. One such device is available from Tekscan, Inc. of Boston, Mass. This device includes several force-sensitive pads in a grid-based matrix that can detect the force and position of multiple fingers at one time. Another force-sensitive device is available from Intelligent Computer Music Systems, Inc. of Albany, N.Y. under the TouchSurface trademark. The TouchSurface device can continuously follow the movement and pressure of a fingertip or stylus on its surface by responding to the position (X and Y) at which the surface is touched and to the force (Z) with which it is touched. Further, if two positions are touched simultaneously in the TouchSurface device, an average position of the two positions is provided. However, these devices are currently limited to manipulating objects in 2.5 dimensions, i.e. X-position, Y-position, and positive Z-direction, and are not available in any intuitive controllers.
Force-sensitive transducers have been used in two-dimensional applications in place of spring-loaded joysticks. For example, U.S. Pat. No. 4,719,538 issued to John D. Cox teaches using force-responsive capacitive transducers in a joystick-type device. However, these devices do not typically provide for 3D/6D inputs. An augmented 2D controller using force-sensitive devices is taught in U.S. Pat. No. 4,896,543 issued to Larry S. Gullman. Gullman describes a three-axis force measurement stylus used as a computer input device wherein the forces sensed by the stylus are used for recognizing ciphers, selecting colors, or establishing line widths and line densities. However, this device does not provide inputs for roll, yaw or pitch, and does not provide a negative Z input (i.e., there is no input once the stylus is lifted). Thus, it is limited in its ability to provide 3D positioning information, as this would require an undesirable bias of some sort.
(c) Prior Art 3D/6D Field Controllers
3D/6D controllers are found in many field applications, such as controllers for heavy equipment. These devices must be rugged, accurate and immune from the effects of noise. Accordingly, many input control devices used for interactive computer graphics are not suitable for use in field applications. As a result, heavy equipment controllers typically consist of a baffling array of heavy-but-reliable levers which have little if any intuitive relationship to the function being performed. For example, a typical heavy crane includes separate lever controls for boom rotation (swing), boom telescope (extension), boom lift and hook hoist. This poor user interface requires the operator to select and pull the one lever, out of many, corresponding to the boom rotation control to cause the boom to rotate to the left. Such non-intuitive controls make training difficult and time-consuming and increase the likelihood of accidents.
Accordingly, it is desirable to provide a 3D/6D controller that is easy to use, inexpensive, accurate, intuitive, not sensitive to electromagnetic or acoustic interference, and flexible in its ability to manipulate objects. Specifically, a substantial need exists for a graphical input device capable of providing for the precision manipulation of position and spatial orientation of an object. It is desirable that the device accept intuitive and simple input actions such as finger motion to manipulate position and orientation and does not require manipulation of a controller in free space or otherwise cause fatigue. It is desirable that the device provide the dual-functionality of both absolute and relative inputs, that is, inputs similar to a data tablet or touch panel that provide for absolute origins and positions, and inputs similar to mice and trackballs that report changes from former positions and orientations. It is desirable that the device recognize multiple points for versatile positioning and spatial orientation of one or more objects and allow the use of multiple finger touch to point or move a controlled object in a precise manner.
An input controller of the present invention incorporates multiple force/touch sensitive input elements and provides intuitive input in up to 36 degrees-of-freedom, including position and rotation, in either a Cartesian, cylindrical or spherical coordinate system. Input can be provided in the provided degrees of freedom without requiring movement of the controller, so that the controller is suitable for controlling both cursors or other computer objects in an interactive computer system and for controlling equipment such as heavy cranes and fork lift trucks.
More specifically, the preferred embodiment of the present invention provides a substantially cube-shaped input controller which includes a sensor on each of the six faces of the controller. The sensors are sensitive to the touch of a user's finger or other pointing object. In various embodiments, a controlled object may be translated by either a "pushing" or "dragging" metaphor on various faces of the controller. A controlled object may be rotated by a "pushing," "twisting," or "gesture" metaphor on various faces of the controller. In certain embodiments, the same sensor is used for both position and rotational inputs, and the two are differentiated by the magnitude of the force applied to the sensor. Preferably, each sensor includes a main sensor located near the center portion of each face of the controller, and a number of edge sensors surrounding the main sensor and located proximate to the edges of each face of the controller.
According to one embodiment, each face of the controller can be used to provide input in six degrees of freedom to control an object. If every face of the controller is used, a total of thirty-six degrees of freedom may be utilized. This allows the simultaneous control of multiple objects. In one embodiment, a computer-generated object displayed on a computer system includes a virtual hand. The entire hand and individual fingers of the hand may be simultaneously moved in several degrees of freedom by the user when providing input on multiple faces of the controller at the same time. In other embodiments, sets of faces can each control a separate object. For example, two opposing faces on the controller can command the translation and rotation of one object, while two different opposing faces can command the translation and rotation of a second object.
In a different embodiment, the controller of the present invention can be used to provide input to an application program implemented by a computer system, such as a computer aided design (CAD) program. A front face on the controller can be used to control a cursor in the program, and left and right faces can provide commands equivalent to left and right buttons on a mouse or other pointing device typically used with the program. An object displayed by the CAD program can be manipulated by using two touch points simultaneously. An object can be deformed, such as twisted, shrunk, or stretched, by providing input on the edge sensors of the controller. Two points of an object can be simultaneously deformed using separate faces of the controller.
In another embodiment, “pseudo force feedback” is provided to the user when the user controls a computer-generated object in a virtual environment. When a user-controlled computer object, such as a virtual hand, engages another object in the virtual environment, such as an obstacle, the user-controlled object is not allowed to move further in the direction of the obstacle object. The user thus feels the surface of the controller as if it were the surface of the obstacle, and receives visual feedback confirming this pseudo-sensation. In another embodiment, active tactile feedback can be provided to the user with the use of tactile sensation generators, such as vibratory diaphragms, placed on the controller or on peripheral surfaces to the controller.
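The "pseudo force feedback" behavior described above amounts to clamping the commanded motion of the controlled object at an obstacle boundary. The following one-dimensional Python sketch illustrates the idea; the function name, the interval representation of the obstacle, and all values are illustrative assumptions, not taken from the patent:

```python
def clamp_motion(position, delta, obstacle_min, obstacle_max):
    """Truncate a commanded move so the controlled object stops at an
    obstacle rather than penetrating it (1-D sketch of the patent's
    'pseudo force feedback' rule)."""
    target = position + delta
    # Approaching the obstacle from the low side: stop at its near edge.
    if position <= obstacle_min and target > obstacle_min:
        return obstacle_min
    # Approaching from the high side: stop at the far edge.
    if position >= obstacle_max and target < obstacle_max:
        return obstacle_max
    return target
```

The user still presses the controller face, but the object no longer advances, so the rigid controller surface is perceived as the obstacle's surface.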
The present invention provides an intuitive, inexpensive, and accurate controller for providing input in 3 or more degrees of freedom. The controller is flexible in its ability to manipulate objects and provide a relatively large number of degrees of freedom for a user, such that multiple objects can be manipulated simultaneously by a user. This allows realistic control of objects such as virtual hands in a simulated environment. In addition, the controller is not manipulated in free space and thus does not cause hand fatigue. The multiple dimensions of input can be generated without requiring movement of the controller, which provides a controller suitable for controlling both cursors and displayed objects in an interactive computer system. Further, the controller is insensitive to acoustic or electromagnetic noise and is thus suitable for controlling equipment such as heavy cranes and forklift trucks.
These and other advantages of the present invention will become apparent to those skilled in the art upon a reading of the following specification of the invention and a study of the several figures of the drawing.
FIGS. 33a1, 33a2, 33b1, 33b2, 33c1, 33c2, 33d1, 33d2, 33d3, 33d4, 33d5, 33d6, 33e1, and 33e2 illustrate the interpretation of various gestures;
FIG. 47e and FIG. 47f illustrate a controller in accordance with yet another embodiment of the present invention;
FIG. 53b and FIG. 53c illustrate a method of operating the controller of FIG. 53a to produce x, y, z, pitch, yaw, and roll rotation signals;
The user interface is intuitive since a real or computer generated object will move as if it is responding to the pressure (i.e., force) on controller 105. For example, pressing down on force-sensitive pad 120, positioned on the top of controller 105, will cause a controlled object to move downward (−Y). Similarly, pressing up on force-sensitive pad 135, positioned on the bottom of controller 105, will cause the object to move upward (+Y). Pressing the controller towards the user, by pressing on force-sensitive pad 130, positioned on the back of controller 105, will cause the object to move towards the user (−Z). Pressing the controller away from the user, by pressing on force-sensitive pad 110, positioned on the front of controller 105, will cause the object to move away from the user (+Z). Pressing the controller to the left, by pressing on force-sensitive pad 115 on the right side of controller 105, will cause the object to move to the left (−X). Similarly, pressing the controller to the right, by pressing on force-sensitive pad 125, positioned on the left side of controller 105, will cause the object to move to the right (+X).
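The pushing metaphor above maps each force-sensitive pad to a fixed unit direction. A minimal Python sketch follows; the pad reference numbers are taken from the text, while the dictionary representation and function name are assumptions for illustration:

```python
# Pad number -> (x, y, z) unit direction commanded when that pad is pressed,
# following the mapping in the text.
PAD_DIRECTIONS = {
    120: (0.0, -1.0, 0.0),   # top pad: move downward (-Y)
    135: (0.0, +1.0, 0.0),   # bottom pad: move upward (+Y)
    130: (0.0, 0.0, -1.0),   # back pad: move toward user (-Z)
    110: (0.0, 0.0, +1.0),   # front pad: move away from user (+Z)
    115: (-1.0, 0.0, 0.0),   # right-side pad: move left (-X)
    125: (+1.0, 0.0, 0.0),   # left-side pad: move right (+X)
}

def translation(active_pads):
    """Sum the unit directions of all currently pressed pads."""
    x = y = z = 0.0
    for pad in active_pads:
        dx, dy, dz = PAD_DIRECTIONS[pad]
        x, y, z = x + dx, y + dy, z + dz
    return (x, y, z)
```

Pressing two pads at once (e.g. front and left) would then command a diagonal motion, consistent with the per-axis description in the text.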
One advantage of the controller 105 is that it exhibits a zero neutral force, i.e., the controller does not require a force on any sensors or mechanical members to maintain a neutral position. The user merely stops applying pressure to the sensors, and the controller is in a neutral state that does not input movement signals to the computer 220.
In the preferred first embodiment, controller 105 is sensitive to the presence of a touch input, and A/D converter 205 provides a binary signal output to integrator 210 for each force-sensitive pad. This provides a controller with a single "speed"; that is, activation of a force-sensitive pad will result in the cursor, object or equipment moving in the desired direction at a certain speed. Alternatively, force-sensitive pads 110, 115, 120, 125, 130 and 135 can be of the type that provide analog outputs responsive to the magnitude of the applied force, A/D converter 205 can be of the type that provides a multi-bit digital signal, and integrator 210 can be of the type that integrates multi-bit values. The use of multi-bit signals allows for multiple "speeds"; that is, the speed of the cursor or object movement in a given direction will be responsive to the magnitude of the force applied to the corresponding force-sensitive pads.
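The binary versus multi-bit behavior described above can be sketched as follows. The threshold and gain values are illustrative assumptions; the patent specifies only the qualitative behavior:

```python
def pad_speed(force, threshold=0.1, multi_bit=False, gain=1.0):
    """Commanded speed for one pad.

    Binary mode: any force above the touch threshold yields one fixed
    speed. Multi-bit mode: speed is proportional to the applied force.
    (Threshold and gain values are assumed for illustration.)
    """
    if force <= threshold:
        return 0.0
    return gain * force if multi_bit else 1.0

def integrate(position, force, dt, **mode):
    """One integrator step: advance the position by speed * dt."""
    return position + pad_speed(force, **mode) * dt
```

In binary mode any detected touch moves the object at the single fixed speed; in multi-bit mode a harder press moves it proportionally faster.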
In operation, sensors 310, 315 and 320 provide redundant X, Y and Z position control of a cursor, object or equipment. That is, Y-position information can be entered on either sensor 310 or 315, X-position information can be entered on either sensor 310 or 320, and Z-position information can be entered on either sensor 315 or 320. The two X inputs are summed to provide the final X position information. Y and Z information is obtained in the same manner. Thus a change in position on a sensor is interpreted as a change of position of the real or computer-generated object, with a fixed or programmable gain.
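The redundant-entry scheme above (two sensors per axis, summed, with a fixed or programmable gain) reduces to a small combining function. The following Python sketch assumes a simple list-of-deltas interface, which is not specified in the patent:

```python
def axis_input(changes, gain=1.0):
    """Combine redundant change-of-position inputs for one axis.

    `changes` holds the change-of-position reported by each sensor able
    to drive this axis (e.g. X from sensors 310 and 320). The inputs are
    summed and scaled by a fixed or programmable gain, as in the text.
    """
    return gain * sum(changes)
```

The same function would be applied independently to the X, Y and Z axes, each fed by its own pair of sensors.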
For applications requiring six degree-of-freedom input, such as manipulating the orientation of an object or equipment, sensors 310, 315 and 320 also provide the pitch, yaw and roll control. Specifically, the third signal provided by each sensor is used to differentiate "light" from "strong" pressures on each sensor. Threshold detector 535, illustrated in FIG. 5, receives the third signal from each sensor and couples the related two analog signals to either position interpreter 540 or orientation interpreter 545 in response to the third signal being "light" or "strong" respectively. Specifically, when a pressure exceeding a pre-defined threshold is detected, it is interpreted as a "strong" pressure, i.e., an orientation "gesture", and the two analog signals from the affected sensor are used to provide orientation information. Referring to FIG. 4b, when a strong pressure is detected on sensor 310, the two analog signals from sensor 310 are used to provide yaw information about the Z-axis, as indicated by the arrow on sensor 310. Similarly, when a strong pressure is detected on sensor 315, the two analog signals from sensor 315 are used to provide roll information about the X-axis. Finally, when a strong pressure is detected on sensor 320, the two analog signals from sensor 320 are used to provide pitch information about the Y-axis. In alternate embodiments, other types of input can be provided on sensors 310, 315, and 320 to command rotation of the controlled object. For example, trajectory gestures can be input, such as the circle gesture described in FIG. 35d, to generate a sequence of positive/negative angle changes and cause the controlled object to rotate. Similarly, a winding, snake-like gesture would cause the controlled object to rotate in alternating directions about an axis.
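The threshold detector's routing decision can be sketched compactly. The threshold value and the tuple-based return format are assumptions for illustration; the patent specifies only that a pre-defined threshold separates "light" (position) from "strong" (orientation) pressures:

```python
STRONG_THRESHOLD = 0.7  # assumed pre-defined pressure threshold

def route(sensor_id, analog_xy, force, threshold=STRONG_THRESHOLD):
    """Route one sensor sample, as threshold detector 535 does.

    Light pressure: the two analog signals feed the position
    interpreter. Strong pressure: the same two signals are
    reinterpreted as an orientation gesture.
    """
    interpreter = "orientation" if force > threshold else "position"
    return (interpreter, sensor_id, analog_xy)
```

Note that the same two analog signals are delivered either way; only their interpretation changes with the pressure level.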
The X, Y and Z position data and the orientation data are derived in the same way as described with reference to controller 305 illustrated in FIGS. 3 and 4. The additional sensors provide multiple redundant entry capabilities. Specifically, yaw information about the Z-axis can be provided by either sensor 610 or sensor 630. Roll information about the X-axis can be provided by either sensor 615 or sensor 625. Pitch information about the Y-axis can be provided by either sensor 620 or sensor 635. Similarly, X-position information can be provided by sensors 610, 620, 630 and 635. Y-position data can be provided by sensors 610, 615, 630 and 625. Z-position data can be provided by sensors 620, 615, 635 and 625. As before, multiple inputs can be resolved either by averaging or by ignoring secondary inputs. More specifically, priority can be given to specific sensors, or priority can be given with regard to the relative time of the inputs. Further, inputs can be interpreted in either absolute or relative modes.
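The resolution strategies named above (averaging, priority by sensor, priority by relative time of input) can be sketched as one dispatch function. The sample representation of `(sensor_id, timestamp, value)` and the mode names are assumptions for illustration:

```python
def resolve(inputs, mode="average", preferred=None):
    """Resolve redundant inputs for one axis.

    `inputs` is a list of (sensor_id, timestamp, value) samples from the
    sensors able to drive this axis. Modes mirror the text: average the
    inputs, prefer the earliest touch, or prefer a specific sensor.
    """
    if not inputs:
        return 0.0
    if mode == "average":
        return sum(v for _, _, v in inputs) / len(inputs)
    if mode == "earliest":            # priority by relative time of input
        return min(inputs, key=lambda s: s[1])[2]
    if mode == "sensor":              # priority to a specific sensor
        for sid, _, v in inputs:
            if sid == preferred:
                return v
        return inputs[0][2]           # fall back to the first input
    raise ValueError(f"unknown mode: {mode}")
```

Absolute versus relative interpretation would then be applied downstream, either using the resolved value directly as a coordinate or as a delta from the previous value.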
Alternatively, rotation commands can be generated by another technique using the 6-sided controller of FIG. 6. Specifically, a rotation command is generated by simultaneously dragging a finger on one panel in a first direction, and dragging another finger on the opposite panel in the opposite direction. For example, as illustrated in FIG. 20a , the user's thumb 2010 is dragged vertically upward in a +Y direction on panel 610. Simultaneously, the user's forefinger 2020 is dragged vertically downward in a −Y direction on panel 630. This is interpreted as a positive rotation about the X-axis, as illustrated in FIG. 20b , where a displayed (or controlled) object 2030 is rotated about the X-axis as illustrated. More specifically, the position and change-of-position information is detected separately for each of the six panels. When touch points are detected simultaneously on opposite panels, the change-of-position information is compared for the opposite panels. If the change-of-position information indicates that the touch points are moving in substantially opposite directions, a rotation command is generated. Rotation nominally corresponds to the rotation about the affected axis such that a single complete rotation of the touch points about the controller 605 would result in a single revolution of the image. Alternatively, magnifications could be used such that the image would be rotated by an amount proportional to the rotation of the touch points.
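The "substantially opposite directions" test above can be expressed as a check on the angle between the two change-of-position vectors. The following sketch uses a cosine test with an assumed tolerance; the patent does not specify how "substantially opposite" is quantified:

```python
def rotation_command(delta_a, delta_b, tol=0.25):
    """Detect the two-finger opposite-drag rotation gesture.

    delta_a and delta_b are (dx, dy) change-of-position vectors for
    simultaneous touch points on two opposite panels. If they move in
    substantially opposite directions, return a rotation magnitude
    proportional to the drag; otherwise return None. `tol` (assumed)
    bounds how anti-parallel the motions must be.
    """
    ax, ay = delta_a
    bx, by = delta_b
    na = (ax * ax + ay * ay) ** 0.5
    nb = (bx * bx + by * by) ** 0.5
    if na == 0 or nb == 0:
        return None                     # one finger is not moving
    cos_angle = (ax * bx + ay * by) / (na * nb)
    if cos_angle < -(1 - tol):          # substantially opposite
        return (na + nb) / 2            # rotation proportional to drag
    return None
```

A gain factor could scale the returned magnitude so that a full circuit of the touch points yields exactly one revolution of the image, or a magnified rotation, as the text describes.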
A fourth embodiment of a 6D controller 705 is illustrated in FIG. 7. Controller 705 is shaped in the form of a cube with three attached knobs. Six force-sensitive matrix sensors 710, 715, 720, 725, 730 and 735 are positioned on controller 705 in the same manner as explained in detail with regard to controller 605 illustrated in FIG. 6. However, these force-sensitive matrix sensors are used only to generate position commands in the X, Y, and Z directions.
As illustrated with regard to knob 740, each knob includes at least one sensor pad that can detect one-dimensional position information about the circumference of the knob. Preferably, each sensor can average two inputs. Movement of one or two pressure points on a sensor is interpreted as rotation about the axis of that sensor. Thus, each knob generates orientation information about one axis in response to the twisting of a thumb and finger about that knob. Specifically, sensor 745 on knob 740 provides one-dimensional position information about the circumference of knob 740. In the case of two inputs applied to a sensor, the average position of the two inputs is interpreted in a relative mode, and a programmable gain is provided. More specifically, the rotational command (the change in rotation) is calculated as follows:
θ = G × 360° × dl / L
Where θ is the rotational command; G is the programmable gain; dl is the change in the average position of the fingers; and L is the circumference of the knob.
For example, twisting the thumb and finger one centimeter on knob 740 is interpreted as 90° of rotation about the Y-axis. Alternatively, the gain can be increased or decreased as desired.
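The formula above can be checked numerically. Note that the 1 cm → 90° example holds whenever G·360°/L equals 90 per centimeter, e.g., unity gain on a knob of 4 cm circumference; the circumference value here is an assumed one chosen to match the example.

```python
def knob_rotation(dl, circumference, gain=1.0):
    """Rotational command θ = G * 360° * dl / L, where dl is the change in
    the average position of the two fingers along the knob sensor and
    L is the circumference of the knob."""
    return gain * 360.0 * dl / circumference

# 1 cm of twist on an (assumed) 4 cm circumference knob with unity gain:
theta = knob_rotation(dl=1.0, circumference=4.0)   # 90.0 degrees
```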
Another embodiment of a touch cylinder 900 is illustrated in FIGS. 9a-9d . Again, touch cylinder 900 is constructed of six cylinders, each aligned along a Cartesian coordinate axis and connected together at the origin of the Cartesian coordinate system. Each cylinder has force-sensitive sensors on its end for position information, as in touch cylinder 800. However, touch cylinder 900 derives rotational information in a different manner. Specifically, the circumference of each cylinder is covered with a force-sensitive sensor that is divided into at least four sections. For example, the cylinder aligned in the +X direction includes sections 901, 902, 903, and 904. Each section covers 90° along the circumference of the cylinder. Similarly, the other five cylinders are also covered by force-sensitive sensors, each with four sections. As illustrated, the centers of each of the sections lie on a plane of the Cartesian coordinate system defined by the six cylinders.
Operation of touch cylinder 900 is described with reference to a “push” mode. Specifically, rotational information is provided by “pushing” sensors positioned on the sides of the cylinders to rotate the object about one of the axes other than the axis of the cylinder carrying the enabled sensor, as if the object had been “pushed” in the same direction as the controller. This is more easily explained by illustration. Referring to FIG. 9b , a rotational yaw input about the Z-axis is provided by pressing any of sensors 902, 904, 905, 906, 907, 908, 909 or 910. Sensors 904, 906, 908, and 910 provide a positive (counterclockwise) yaw signal, while sensors 902, 905, 907 and 909 provide negative (clockwise) yaw signals. These signals can be combined as described above, and the signals can be either “on/off” or have multiple levels. Roll and pitch information is provided in a similar manner, as illustrated in the simplified diagrams of FIGS. 9c and 9d.
A third embodiment of a touch cylinder 1000 is illustrated in FIGS. 10a-10c . Unlike touch cylinders 800 and 900, touch cylinder 1000 has no sensors on the ends of the six cylinders. Six sensors on the cylinders provide orientation information in the same manner as sensors 810-815 in touch cylinder 800. However, the sensor pads of touch cylinder 1000 are two-dimensional, providing information responsive to the position of pressure along each cylinder as well as around its circumference. As illustrated in FIG. 10a , movement of the thumb and forefinger along the X-axis cylinder in the X-direction is detected by sensor 1010. The X-position information from the two inputs (thumb and forefinger) is averaged and used to provide a relative position input to the cursor or controlled object. Y-position information is provided in a similar manner as illustrated in FIG. 10b. Z-position information is provided as illustrated in FIG. 10c.
Alternatively, Z-position can be responsive to the force applied to sensors 1105 and 1115, in a manner similar to controller 105. Theta information can be obtained in a manner similar to that used for rotation information in controller 305. Radial information can be obtained from the force of the pressure applied to sensor 1110.
The raised edges of the controller provide an area of the sensor tactilely distinguished from flat surface 1915, which operates in a different mode. When computer system 220 reads input signals from coordinates of the edge sensor areas, it can distinguish this input as a different command from input entered on the main sensor areas. For example, in a relative mode for X- and Y-position, a change in position on sensor area 1915 is interpreted as a proportional change in cursor or object position on a display device of the computer 220. Once the operator's finger reaches edge sensor 1910, a steady force (without substantial movement) on edge sensor 1910 is interpreted as a continuation of the cursor movement. Cursor movement can be continued at either the most recent velocity along an axis, or at a preset speed, as long as a force is detected on the portion of edge sensor 1910 on that axis, such as portion 1920 with regard to movement in the positive X-direction. Alternatively, the speed of the cursor movement along an axis could be proportional to the amount of force applied to edge sensor 1910 on that axis. Thus, area 1920 would provide control of +X cursor speed, area 1925 would provide control of +Y cursor speed, area 1930 would provide control of −X cursor speed, and area 1935 would provide control of −Y cursor speed. In either case, the operator is provided with the advantages of two alternative operating modes and the ability to combine the two modes in order to continue object movements in a desired direction after reaching the edge of main sensor area 1915.
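The two continuation modes described above (continuing at the most recent velocity versus a force-proportional speed) might be sketched as follows for a single axis. The function name, the scalar per-axis representation, and the gain parameter are illustrative assumptions.

```python
def continued_velocity(edge_force, last_velocity, force_gain=None):
    """Cursor velocity along one axis while the finger rests on an
    edge sensor portion for that axis.

    If force_gain is given, speed is proportional to the applied force
    (the alternative mode above), keeping the direction of the most
    recent movement; otherwise the most recent velocity is continued.
    """
    if edge_force <= 0:
        return 0.0                      # no force detected: no continuation
    if force_gain is not None:
        direction = 1.0 if last_velocity >= 0 else -1.0
        return direction * force_gain * edge_force
    return last_velocity                # continue at most recent velocity
```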
When a user presses an edge sensor area without previously entering translation input on the adjacent main sensor, then the edge sensor input can be interpreted as a separate command and not as a continuation command. For example, an object or cursor can be rotated using the edge sensors, as described in greater detail below. In an alternative embodiment, only the edge sensors are used, and the main sensor area does not provide input when touched.
Four edge sensors 2520 surround and are immediately adjacent to each of the main sensors 2510 so that a user's finger may move continuously from a main sensor 2510 to an edge sensor 2520. Each of the edge sensors 2520 is inclined and raised relative to the adjacent main sensor to tactilely distinguish it from the associated main sensor 2510. Alternatively, edge sensors 2520 could be otherwise tactilely distinguished, such as by the use of a texture different from that used on the adjacent main sensor 2510. One function of the edge sensors 2520 is to provide a continuation command as described above with regard to the operation of FIG. 19. In addition, edge sensors 2520 may be used to provide rotation commands. Specifically, the eight edge sensors 2520x parallel to the X-axis may be used to provide rotation commands about the X-axis. As illustrated in FIG. 25a , four of these edge sensors (2520−x) provide a negative rotation command, and four (2520+x) provide a positive rotation command. In a similar manner, the eight edge sensors 2520z parallel to the Z-axis are used to provide rotation commands about the Z-axis. Similarly again, the eight edge sensors 2520y parallel to the Y-axis are used to provide rotation commands about the Y-axis.
The protocol for rotation command generation is illustrated in FIGS. 26a-f . Specifically, a rotation command is generated in response to the user touching one or more of the edge sensors 2520. FIG. 26a illustrates a user touching two of the edge sensors 2520(+x) which are located diagonally from each other on opposing faces of the controller 2500. This results in the generation of a positive X-axis rotation command, which causes the rotation of, for example, a computer-generated object 2522 as illustrated in FIG. 26b. Similarly, FIG. 26c illustrates generation of a positive Y-axis rotation command from the touching of diagonally-opposite edge sensors, resulting in the rotation of the computer-generated object 2522 as illustrated in FIG. 26d. Similarly again, FIG. 26e illustrates generation of a positive Z-axis rotation command, resulting in the rotation of object 2522 as illustrated in FIG. 26f. Both positive and negative rotations are provided in response to the detection of touch on the appropriate edge sensors 2520. Further, the amplitude of the rotation signal is preferably proportional to the magnitude of the force applied to the sensors, such that a more powerful force on the edge sensors 2520 is interpreted as a more rapid rotation.
Rotation commands are distinguished from translation commands by determining whether a touch on a main sensor 2510 at a position immediately adjacent to an edge sensor 2520 occurred immediately prior to, or simultaneously with, the initiation of the touch of an edge sensor 2520. If touch points are detected on an edge sensor 2520 and on a main sensor 2510, and the touch points are continuous in time and position, the user's intention is interpreted as a continuation of a translation command. If touch points are detected on edge sensors 2520 only, without a prior and adjacent detection on the adjacent main sensor, then the magnitude of the force signal on the edge will be interpreted as a rotational command. It is preferable that a certain amount of “hysteresis” is provided in the command interpretation, such that if a user partially touches a main sensor 2510 while applying a rotation gesture, it is not interpreted as a continuation of a translation command. This is easily accomplished, as a continuation of a translation command cannot occur unless a translation command had been previously provided, and that previous translation command is smoothly continued by the candidate continuation command. This is described more fully below in the section titled Gesture Interpretation.
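The continuation-versus-rotation decision above can be sketched as follows. The tuple representation of touch points and the single adjacency tolerance (used for both the time gap and the position gap) are assumptions made for illustration; the patent requires only that the touches be "continuous in time and position".

```python
def interpret_edge_touch(edge_touch, prior_main_touch, prior_translation,
                         adjacency_window=0.1):
    """Decide between continuation-of-translation and rotation.

    edge_touch: (timestamp, position) of the touch on an edge sensor.
    prior_main_touch: (timestamp, position) of the last touch on the
        adjacent main sensor, or None if the edge was touched directly.
    prior_translation: True if a translation command was just generated
        (the "hysteresis" condition described above).
    adjacency_window: assumed tolerance for "continuous in time and position".
    """
    if (prior_translation and prior_main_touch is not None
            and edge_touch[0] - prior_main_touch[0] <= adjacency_window
            and abs(edge_touch[1] - prior_main_touch[1]) <= adjacency_window):
        return "continue_translation"
    return "rotate"
```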
The rotation and continuous-translation input modes are very intuitive. The rotation mode is especially intuitive because the user's push action on an edge (one finger) or “twisting gesture” (pushing two diagonally opposite edge sensors with two fingers) causes a controlled object to rotate in the pushing or twisting direction.
Rotation commands about an arbitrary axis may also be generated using a controller 2500′ similar to the controller 2500 illustrated in FIG. 25a. Specifically, in this alternative embodiment, edge sensors 2520 are replaced with edge sensors 2520′ capable of providing a signal responsive to the position at which they are touched. For example, edge sensors 2520′ along the X-axis provide a signal corresponding to the position along the X-axis at which a touch occurs. Similarly, the edge sensors 2520′ along the Y- (and Z-) axis provide a signal corresponding to the position along the Y- (and Z-) axis. Such position detection on the edge sensors can provide the user with a greater degree of control over the movement and manipulation of an object.
An alternative embodiment of the cylinder of FIG. 11 is illustrated in FIGS. 29a and 29b . As illustrated, cylinder 2900 includes an edge sensor 2910 raised and inclined relative to the flat main sensor 2920. Rotation and translation continuation commands are generated in the same manner as described with reference to controller 2500. For example, when a user pushes edge sensor 2910 at point P2, located at an angle theta relative to a reference position P1, the displayed (or controlled) object is rotated about axis R′, where axis R′ lies in the plane of the top surface 2920 of cylinder 2900 and is shifted theta−90 degrees from reference axis R, theta being the angle defined by points P1 and P2 as illustrated.
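A minimal sketch of deriving the rotation axis R′ from the touch angle is given below. It assumes the reference axis R is the x-axis of a right-handed frame whose z-axis is normal to the top surface; that frame choice is not specified in the text.

```python
import math

def rotation_axis_from_edge_touch(theta_deg):
    """Unit vector of rotation axis R' in the plane of the top surface.

    R' is shifted (theta - 90) degrees from the reference axis R (taken
    here as the x-axis), where theta is the angle of touch point P2
    measured from reference position P1 around the circular edge sensor.
    """
    phi = math.radians(theta_deg - 90.0)
    return (math.cos(phi), math.sin(phi), 0.0)
```

For a push at theta = 90°, the object rotates about the reference axis R itself; a push at theta = 180° rotates it about the in-plane axis perpendicular to R.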
Gesture Interpretation
Gestures applied to the controllers, such as controllers 2500 and 2500′, may be interpreted in a number of different ways by a computer interface and used to control the movement of display objects on an interactive computer display or used to control the movement of a physical piece of equipment, such as an industrial crane. The interpretation of gestures can be broken down into 3 cases.
In case 1, there is no detection of pressure or touch on main sensors 2510, but there is detection of pressure on edge sensors 2520. This case is interpreted as rotation of the camera view, as illustrated in the flow chart of FIG. 30. Referring to FIG. 30 , step 3005 is the entry point for the logic executed when no touch points are detected on main sensors 2510. In step 3010, a test is conducted to determine if there are any touch points on edge sensors 2520. If no, the logic is exited in step 3015. If yes, step 3020 tests whether there is a single touch point on edge sensors 2520. If yes, the camera view is rotated in step 3025 about the “i-axis”, which is either the x-, y-, or z-axis, depending on the edge sensor touched. The camera view is the view of the virtual environment as, for example, displayed on a computer screen or the like. The rotation of the camera view with a single edge sensor touch point is illustrated in FIGS. 33a1 and 33a2. If more than a single touch point is detected, a test is conducted in step 3030 to determine if two touch points occur on parallel edge sensors, as shown in the examples of FIGS. 26a , 26c, and 26e. If yes, the camera view is rotated about the appropriate axis in step 3035. If no, the camera view is simultaneously rotated about the two axes indicated by the touched edge sensors in step 3040.
In case 2, there is a detection of a single touch or pressure point on main sensors 2510. This case is interpreted as a cursor manipulation or camera view rotation as illustrated in the flow chart of FIG. 31. Referring to FIG. 31 , step 3105 is the entry point for the logic executed when a single touch point is detected on main sensors 2510. In step 3110 a test is made to determine whether there are any touch points on any of the edge sensors 2520. If no, the touch point is interpreted as a cursor translation in step 3115, i.e., a cursor or object is moved in the direction of the touch point as determined by the trajectory of the touch point on the main sensor or by the direction of the single touch point (depending on the embodiment). If there are touch points on any of the edge sensors, a test is made in step 3130 to determine whether the touch point on a main sensor 2510 is within a specified region adjacent to the edge sensor 2520 on which a touch was detected, and whether a translation command has been just previously generated. This region 3132 of the main sensor 2510 is shown in FIG. 31a. If yes, the gesture is interpreted as a continuation of the cursor or object translation in step 3135. If no, the gesture is interpreted as a camera view rotation in step 3140, similar to the camera rotation implemented in FIG. 30.
In case 3, there is a detection of multiple touch points on main sensors 2510. This case is interpreted as an object manipulation as illustrated in the flow chart of FIG. 32. Referring to FIG. 32 , step 3205 is the entry point for the logic executed when multiple touch points are detected on main sensors 2510. In step 3210, a test is made to determine if any touch points are detected on edge sensors 2520. If no, a test is made in step 3215 to determine if the finger dragging is occurring in significantly opposite directions and the touch pressure exceeds a threshold value. If yes, the gesture is interpreted as object grasp and rotation in step 3220. (This gesture and its interpretation are illustrated in FIGS. 33e1 and 33e2.) If no, a test is made in step 3225 to determine if pressure on one touch point is significantly greater than another and exceeds the threshold value. If yes, the gesture is interpreted as an object grasp and translation along the appropriate axis in step 3230. For example, as illustrated in FIGS. 33d1 and 33d2, the pressure on back sensor 3227 is stronger than the pressure on front sensor 3228, so that the object and claw move along the Z axis in a negative direction. In FIGS. 33d3 and 33d4, the pressure on front sensor 3228 is stronger than the pressure on back sensor 3227, so that the object and claw move along the Z axis in a positive direction. If the pressure of one touch point is not greater than the other, the gesture is interpreted as an object grasp and translation on the X-Y plane in step 3235, as illustrated in FIGS. 33d5 and 33d6.
Returning to step 3210, if touch points are detected on edge sensors 2520, a test is made in step 3240 to determine if there is only one touch point on edge sensor 2520. If yes, the gesture is interpreted as an object grasp and rotation in step 3245, as illustrated in FIGS. 33b1 and 33b2. If no, a test is made in step 3250 to determine if the edges touched are parallel and if the touch points on the main sensor panel 2510 are within a specified region adjacent to the edge and whether there was a translation command just previously generated (similar to step 3130 of FIG. 31). If these tests are not all met, the gesture is interpreted as a camera view rotation in step 3255. If the conditions of step 3250 are met, a test is made in step 3260 to determine if three touch points occur on edge sensors 2520. If yes, the gesture is interpreted as a continuation of object translation and object rotation in step 3265, as illustrated in FIGS. 33c1 and 33c2. If no, the gesture is interpreted as a continuation of object translation in step 3270.
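The three-case dispatch of FIGS. 30-32 can be condensed into a sketch like the following. The label strings are illustrative, and the finer per-case tests (opposite drags, pressure comparison, edge adjacency and prior-translation checks) are elided, so several branches return a combined label where the full flow charts would test further.

```python
def interpret_gesture(main_touches, edge_touches):
    """Top-level gesture dispatch by count of touch points, following the
    three cases described above.

    main_touches / edge_touches: lists of touch points detected on the
    main sensors 2510 and edge sensors 2520 respectively.
    """
    if not main_touches:                           # case 1: edges only
        if not edge_touches:
            return "none"
        return "camera_rotation"
    if len(main_touches) == 1:                     # case 2: one main touch
        if not edge_touches:
            return "cursor_translation"
        return "continuation_or_camera_rotation"   # needs adjacency test
    # case 3: multiple main-sensor touch points -> object manipulation
    if not edge_touches:
        return "object_grasp_manipulation"         # rotation or translation
    if len(edge_touches) == 1:
        return "object_grasp_rotation"
    return "object_translation_or_camera_rotation" # needs parallel-edge test
```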
The controllers described in FIGS. 1-10 , 13 and 14 are adapted for use in the Cartesian coordinate system. In general, they can be categorized by the modes used for position and rotation control. Specifically, a “push mode” for position control is used in the embodiments described with reference to FIGS. 1 , 8, and 9a. In contrast, a “drag mode” for position is used in the embodiments described with reference to FIGS. 3 , 6, 7, and 10a-c. With regard to rotation, three general modes are used. A “gesture mode” for rotation is used in the embodiments described with reference to FIGS. 3 and 6 . A “push mode” or “torque mode” for rotation is used in the embodiments described with reference to FIGS. 9a-d . Finally, a “twist mode” for rotation is used in the embodiments described with reference to FIGS. 7 and 8 . These modes can be combined in a number of ways as taught by the various embodiments. Further, different modes can be adapted to the cylindrical and spherical controllers taught with reference to FIGS. 11 , 12, 16 and 18.
The user's finger 3504 can be pushed against the main sensor 3508 in the direction of the z-axis shown by arrow 3518 to provide input in the z degree of freedom. A threshold pressure, greater than the pressure needed for movement in the x- and y-degrees of freedom, preferably commands the z-axis input, as described in greater detail below in FIG. 35e. As shown in FIG. 35a , the z-axis input is unidirectional, i.e., only movement in one direction along the z-axis can be input by the user when using just one face 3502 of the controller 3500. However, various implementations can assist the user in providing bi-directional movement along the z-axis, if desired, while using only one face 3502. For example, a “spring return” type command can be provided, as described above with reference to FIG. 28b , where the position of the controlled object on the Z-axis (relative to an origin) is directly proportional to the amount of pressure applied to the main sensor. When pressure is removed, the object returns to the origin position. Or, a “remain-in position” command can be provided as described above, where the controlled object moves along the Z-axis while the main sensor is touched, and the object stops at its current position when pressure is removed (optionally, the velocity of the object can be proportional to the amount of force on the main sensor). To provide bi-directional Z-axis movement, a special command input by the user on the controller, such as a finger tap or other gesture on the main sensor (or edge sensors), can toggle the desired direction along the z-axis. For example, the default can be +Z movement, and the user can tap the main sensor to subsequently command −Z movement. Alternatively, a separate peripheral device such as a button on controller 3500 or a device separate from cube 3500 can toggle the z-axis direction. 
Of course, if other faces of the controller 3500 are not being used for separate, independent input, then those faces can be used to provide the bi-directional z-axis movement, as described in the embodiments above.
The six degrees of freedom provided by a single face 3502 of controller 3500 can be multiplied by the number of active faces on the cube to achieve the total number of degrees of freedom in which the user may simultaneously provide input to a computer system or controlled device, e.g., when all six faces are used, there are 36 degrees of freedom. By using multiple fingers simultaneously on different faces of the controller, the user can independently and simultaneously control multiple sets of six degrees of freedom.
If the force F is greater than threshold #1 in step 3534, then in step 3536, the process checks whether the force F is between the first threshold and a second force threshold (threshold #2). If so, the force F is used to implement bi-directional z-axis movement, as described for FIG. 35a , and the x- and y-data are not needed (although in some embodiments, the z-axis movement can use the x- and y-data to help determine the direction of z-axis translation). For example, a spring-return type command can be used, or a remain-in-position command with the use of a finger tap input gesture. The process is then complete at 3541.
If the force F does not fall in the range of step 3536, the force F must be greater than threshold #2 (a check for F being greater than threshold #2 can be provided in alternate embodiments). Thus, in step 3539, the x- and y-data of the touch point are used to determine the amount of roll commanded by the user, as described in FIG. 35d. The F data is typically not needed to determine the change in the angle of roll of the controlled object. A preferred method of calculating the roll uses the following formula:
Δθ = tan⁻¹(Y1/X1) − tan⁻¹(Y2/X2)
where Δθ is the change in angle of roll of the controlled object, (X1, Y1) is the starting touch point of the roll gesture, and (X2, Y2) is the ending point of the roll gesture.
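The force-threshold dispatch of steps 3534-3539 together with the roll formula can be sketched as below. The threshold values are placeholders, and atan2 is used in place of tan⁻¹(y/x) so the result stays correct in all quadrants and when x = 0; this quadrant handling is an implementation choice, not something the text specifies.

```python
import math

def interpret_touch(force, x1, y1, x2, y2, thr1=0.5, thr2=2.0):
    """Select input mode by force threshold, then compute the command.

    thr1, thr2: the two force thresholds (assumed placeholder values).
    Returns ("xy", None) for x/y translation, ("z", force) for z-axis
    movement, or ("roll", dtheta_degrees) for a roll command computed
    from the start (x1, y1) and end (x2, y2) of the roll gesture.
    """
    if force <= thr1:
        return ("xy", None)          # light touch: x/y translation
    if force <= thr2:
        return ("z", force)          # medium force: z-axis movement
    # Strong force: roll gesture; angles via quadrant-aware atan2.
    dtheta = math.degrees(math.atan2(y1, x1) - math.atan2(y2, x2))
    return ("roll", dtheta)
```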
In other embodiments, each finger 3564, 3566, and 3568 can be controlled independently of the other fingers by a separate face of the controller. For example, pinky finger 3564 can be controlled by the left face of cube 3500, ring finger 3566 can be controlled by the bottom face of cube 3500, and the middle finger 3568 can be controlled by the back face 3548 of controller 3500. However, such an arrangement is somewhat awkward for the user to manipulate with one hand, so that the user finger-virtual finger correspondence would be difficult to maintain.
In step 3616, the process checks whether any touch points have been detected from the user pressing fingers (or other objects) on the sensor pads. If a single touch point has been detected, i.e., the user is pressing only one sensor pad, then the process continues to step 3618, in which a camera view control command is generated. This camera view control command rotates or translates the view as seen by the user in a display such as display screen 3560. The control command is sent to the appropriate destination to implement the command. For example, a microprocessor in the controlling computer system 220 can receive the control command and generate a proper response by rotating or translating the camera view on display screen 3560. Step 3618 is described in greater detail with respect to FIG. 38a. The process then returns to step 3614 to read the six sensor pads.
If the process determines that two touch points have been detected in step 3616, then in step 3620, a virtual hand movement command is generated. This type of command causes the entire virtual hand 3562 to move in three-dimensional space (the simulated space may have less than three dimensions if the simulation is so constrained). The virtual hand command is then implemented, e.g., the computer system moves the hand 3562 to correspond to the current position of the user's finger on a main sensor pad, or continues to move the hand if the user's finger is on an edge sensor after a translation command, as described in the embodiments above. The generation of virtual hand control commands is described in greater detail with respect to FIG. 38b. The process then returns to step 3614 to read the six sensor pads.
If the process determines that three or more touch points have been detected in step 3616, then the process continues to step 3622, where a virtual finger movement command is generated. This type of command causes one or more fingers of hand 3562 to move in three-dimensional space. The command is implemented, e.g., by the computer displaying the finger moving in the appropriate manner. The generation of virtual finger controls is described in greater detail with respect to FIG. 38c. The process then returns to step 3614 to read the sensor pads.
If the touch point is not on an edge sensor in step 3628, then the process continues to step 3632, where a translation command for the camera view is implemented corresponding to the trajectory of the touch point on the sensor pad. For example, the last-processed touch point on the pad is examined and compared to the current touch point. From these two touch points, a vector can be determined and the view shown on the display device is translated along the vector, as if a camera were being translated by which the user was viewing the scene. The process is then complete at 3634 and returns to the process of FIG. 38.
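The trajectory-to-vector computation of step 3632 reduces to a difference of consecutive touch points. The gain factor mapping sensor-pad units to camera-space units is an assumption added for illustration.

```python
def camera_translation(prev_point, curr_point, gain=1.0):
    """Translation vector for the camera view from the last-processed
    touch point and the current touch point on the main sensor pad.

    gain: assumed scale factor between pad coordinates and view units.
    """
    dx = (curr_point[0] - prev_point[0]) * gain
    dy = (curr_point[1] - prev_point[1]) * gain
    return (dx, dy)
```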
If the detected touch points are not on diagonally-located edge sensors in step 3642, then, in step 3646, a translation command for the virtual hand is implemented that corresponds to the trajectory of both touch points on the controller. The virtual hand is moved in directions corresponding to the touch points. For example, as shown above in FIGS. 33d5 and 33d6, the two fingers on opposite faces of the controller cause the hand to translate in a plane. This is typically the most common form of input method to translate the virtual hand. In another scenario, if one of a user's fingers is dragged along the y-direction on the front face 3540, and another finger is dragged in the x-direction along the top face 3544, then the virtual hand is moved along a vector resulting from corresponding component vectors along the x- and y-axes. If one finger is not moved and the other finger is dragged, then the virtual hand is translated according to the one finger that is being dragged. After step 3646, the process is complete at 3648 and returns to the main process of FIG. 38.
The process also checks if the force of the user's touch points on main sensors is less than the user-defined threshold value at step 3654. As explained above, multiple fingers can be simultaneously dragged on the main sensors of different faces of the controller. If the touch point is less than the threshold, then step 3660 is performed, in which the process checks if the touch trajectory is along the x-axis and/or the y-axis of the controller. If along the x-axis, step 3662 is performed, in which a bending control command is generated to bend the two (or more) joints of the appropriate virtual finger(s) about the z-axis, thus providing x-axis translation of the tip of the virtual finger. An example of this motion is shown in FIGS. 37j and 37l . After step 3662, the process is complete at 3672 and returns to the process of FIG. 38. If the touch trajectory is along the y-axis in step 3660, then the process provides a bending command for the joints of the virtual finger to implement a bend of the appropriate virtual finger about the x-axis of the hand, thereby providing y-axis translation of the tip of the finger. Simultaneous implementation of steps 3664 and 3662 for x-axis and y-axis translations can also be provided. The process is then complete at 3672 and returns to the process of FIG. 38.
The process also checks in step 3654 if any of the detected touch points are on an edge sensor of the controller that is predetermined to correspond with a virtual finger. As explained above with reference to FIGS. 37o and 37p , the pressing of an edge sensor causes a virtual finger to move about the lower joint of the finger while remaining pointing straight, i.e., a “pointing gesture” is performed by the virtual hand. If a touch point is on a predetermined edge sensor, then in step 3668, a bending command is provided about the second, lower joint of the appropriate virtual finger to generate the pointing action. The process is then complete at 3672 and returns to the process of FIG. 38.
The above process provides a large and flexible range of virtual hand and virtual finger motions to the user through the intuitive use of the controller. Unlike more limited input devices, the controller allows the fingers and the hand to be controlled simultaneously and independently of each other, allowing a user to perform virtual actions and interact with virtual objects in a highly realistic manner.
In the example shown in FIG. 40b , the objects 3712 and 3714 are able to connect to each other only if predetermined angular velocities are achieved for the two objects. Thus, simultaneous rotation of the two objects is required. Similar simulations, games, or other activities can be performed by controlling multiple objects simultaneously with controller 3500.
In FIG. 41e , the user presses the main sensor of bottom panel 3545 with a strong push of finger 3736, and presses the main sensor of top panel 3544 with a weaker push of finger 3550. In FIG. 41f , the object 3730 is shortened along the y-axis in the directions shown by arrows 3746 corresponding to the y-axis of the controller. The object 3730 is shortened a greater amount at end 3740 than at end 3742 since the user applied a greater pressure on bottom face 3545 than top face 3544. The previous dimensions of object 3730 are shown as dashed lines 3748. Thus, when the user presses main sensors on opposing faces of the controller, the controlled object is reduced in the corresponding dimension as if the user is “squeezing” the object. By pressing on all four faces of controller, the user can cause the shortening manipulation of FIG. 41f and the deforming manipulation of FIG. 41d to take place simultaneously.
In FIG. 41g , the user is performing dragging or translation gestures on controller 3500 to manipulate the shape of a computer-generated object. The user uses two fingers of each hand to perform each gesture. Fingers 3740 and 3742 are pushing the diagonally-opposed edge sensors 3744 and 3746, respectively, which are situated on the right face and left face, respectively, of controller 3500. The pressing of these diagonal edge sensors on these opposing faces causes the object 3730 to twist about the y-axis as shown by arrows 3748 in FIG. 41h. At the same time, the user is dragging fingers 3750 and 3752 in a linear motion along the main sensors of the front face 3540 and the back face 3548. This gesture causes the lower end of object 3730 to extend, as shown by arrows 3754 in FIG. 41h. Object deformation as shown in FIGS. 41a-41g is described below with respect to FIG. 43. The simultaneous manipulation of different portions of the object shown in FIG. 41h is allowed in the present invention due to the several degrees of freedom available on each face of controller 3500.
In addition, other functions can be provided using the controller. For example, the right face 3806 and the left face 3808 can be used to select functions normally selected by the right and left mouse buttons, respectively. Thus, the left face 3808 can be pressed by the user to select an object 3822 that has been modeled or drawn using the CAD program. These functions are described in greater detail below with respect to the process of FIG. 43.
In FIG. 42c, the user is applying pressure to right face 3806 with finger 3802 and is applying pressure to left face 3808 with finger 3820. As shown in FIG. 42d, the left and right faces of the controller preferably control the movement of object 3822 displayed by the CAD program, similarly to the controlled movement previously shown in FIG. 33d3. The user can preferably select a drawn or modeled object (such as object 3822) using cursor 3816 and the left or right faces of the cube, so that the selected object will respond to appropriate commands entered on the controller. In FIG. 42d, the user is applying a strong pressure to right face 3806 and a weaker pressure to left face 3808. Accordingly, object 3822 moves in a direction corresponding to the stronger force, as shown by arrow 3824, with a velocity proportional to the difference between the two pressures. Alternatively, other methods can be used to move the object 3822 using controller 3800. For example, the user can drag his or her fingers on opposing faces of the controller and move the object as shown previously in FIG. 33d5.
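The pressure-difference behavior described above can be sketched as a simple signed-velocity rule. This is purely an illustrative sketch, not from the patent: the function name `face_pair_velocity` and the gain constant `K` are hypothetical, and the sign convention (positive toward the right face's push direction) is an assumption.

```python
# Illustrative sketch of "velocity proportional to the difference
# between the two pressures". K is a hypothetical tuning gain; the
# sign convention is an assumption for illustration only.
K = 0.5  # hypothetical velocity gain (units per pressure unit)

def face_pair_velocity(right_pressure: float, left_pressure: float) -> float:
    """Signed velocity: positive when the right-face press is stronger,
    negative when the left-face press is stronger, zero when balanced."""
    return K * (right_pressure - left_pressure)
```

With this rule, equal presses on opposing faces produce no motion, matching the squeezing/moving distinction drawn between FIG. 41 and FIG. 42.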
If object deformation mode is selected in step 3836, then the process checks in step 3844 if the touch point is on an edge sensor of the controller. If so, the process implements a twisting deformation of the displayed object in step 3846, as described in greater detail with respect to FIG. 43c. The process then returns to step 3834. If the touch point is not on the edge, it is on a main sensor of the controller, and the displayed object is shrunk or stretched in accordance with the user's input in step 3848, as described in greater detail with respect to FIG. 43d. The process then returns to step 3834.
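The deformation-mode branch above is a two-way dispatch on where the touch point lands. The following sketch restates it; the function name and return strings are illustrative assumptions, with the step numbers from the text carried in comments.

```python
# Illustrative restatement of the deformation-mode branch; names are
# hypothetical, step numbers refer to the process described in the text.
def deformation_action(touch_on_edge_sensor: bool) -> str:
    if touch_on_edge_sensor:
        return "twist"              # step 3846, per FIG. 43c
    return "shrink_or_stretch"      # step 3848 (main sensor), per FIG. 43d
```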
If the detected touch point was not on the front sensor pad, then the process checks whether the detected touch point is positioned on the left sensor pad (relative to the front sensor pad) in step 3860. If so, then a left “click” command, equivalent to the click of a left button on a pointing device, is provided in step 3862. Typically, the left button on a mouse, trackball, touch tablet, or other pointing device is the main button used to select objects or items displayed on the screen. Any functions selectable with the left mouse button can preferably be selected using the left face 3808 of the controller. For example, a “double click” of the left mouse button is often used to execute a program or perform a function different from that performed when only a single click is input. The left face of the controller can be touched twice in succession to perform the double click. Other buttons or controls on standard input devices can be associated with the left face 3808 of the controller in other embodiments. The process is then complete at 3858.
If the touch point is not detected on the left sensor pad in step 3860, then in step 3864 the process checks if the touch point is detected on the main sensor pad of the right face 3806. If so, a right click command is implemented in step 3866. This command is equivalent to the command generated if the user selected the right mouse button (or equivalent control) on a mouse or other input pointing device. This step is thus similar to step 3862 for the left button of the mouse. Other buttons or controls on standard input devices can be associated with the right face 3806 of the controller in other embodiments. The process is then complete at 3858.
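The face-to-button mapping in the two paragraphs above, including the two-touches-in-succession double click, can be sketched as follows. Everything here is an illustrative assumption: the class name, the event model (a face name plus a timestamp), and the `DOUBLE_CLICK_WINDOW` threshold are hypothetical, as the patent does not specify a timing value.

```python
# Hypothetical sketch of the left/right face -> mouse-button dispatch.
DOUBLE_CLICK_WINDOW = 0.4  # assumed threshold in seconds (not in the text)

class ButtonMapper:
    def __init__(self):
        self._last_left_touch = None  # timestamp of the previous left-face touch

    def dispatch(self, face: str, timestamp: float) -> str:
        if face == "left":
            prev = self._last_left_touch
            self._last_left_touch = timestamp
            # two left-face touches close together read as a double click
            if prev is not None and timestamp - prev <= DOUBLE_CLICK_WINDOW:
                return "left_double_click"
            return "left_click"
        if face == "right":
            return "right_click"
        return "no_command"
```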
In other embodiments, the tactile sensation generators can be placed on other portions of each face of the controller, such as in the center of each face. Also, the tactile sensation generators can be of different sizes, e.g., a tactile sensation generator can cover an entire main sensor 3976 or an entire face of the controller 3962. In other embodiments, additional tactile sensation generators can be provided, such as a generator on each edge sensor and on the main sensor of a face. Also, the tactile sensation generator 3946 as shown in FIG. 45a can be utilized on base 3974 to provide additional user feedback.
The x,y translation signal produced by the first transducer at the first position is determined by the position of the object. When the user moves her finger, the x,y coordinates are changed by an x,y translation signal generated by the first transducer based on the direction of finger movement, as follows: towards top edge 4102, the y coordinates are increased; towards bottom edge 4104, the y coordinates are decreased; towards left edge 4106, the x coordinates are decreased; and towards right edge 4108, the x coordinates are increased. That is, the object is moved in a relative, as opposed to absolute, fashion in relation to the movement of the finger on the sensing surface.
When the user moves her finger, a second transducer coupled to second sensing surface 4116 will transmit a pitch and a yaw rotation signal. If the user moves her finger towards the top edge 4102, a positive pitch signal will be transmitted; towards the bottom edge 4104, a negative pitch signal; towards the left edge 4106, a negative yaw signal; and towards the right edge 4108, a positive yaw signal.
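The drag-direction conventions of the two preceding paragraphs can be summarized as lookup tables. The table and function names below are illustrative, not from the patent; only the axis names and signs come from the text.

```python
# Sign conventions restated from the text: first surface -> relative
# x,y translation; second surface -> pitch/yaw rotation.
FIRST_SURFACE = {
    "top": ("y", +1), "bottom": ("y", -1),    # edges 4102 / 4104
    "left": ("x", -1), "right": ("x", +1),    # edges 4106 / 4108
}
SECOND_SURFACE = {
    "top": ("pitch", +1), "bottom": ("pitch", -1),
    "left": ("yaw", -1), "right": ("yaw", +1),
}

def drag_signal(surface_map: dict, toward_edge: str) -> tuple:
    """Return (signal name, sign) for a drag toward the named edge."""
    return surface_map[toward_edge]
```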
As shown in each of FIGS. 46a-h, first edge sensor 4120 and second edge sensor 4125 are positioned around the periphery of first sensor 4110 and second sensor 4115, respectively. Preferably, first edge sensing surface 4121 and second edge sensing surface 4126 are tactilely distinguished from first sensing surface 4111 and second sensing surface 4116 to let the user know that she has accessed the edge sensors without looking at the controller. Edge sensing surfaces 4121 and 4126 may also be raised or lowered with respect to sensing surfaces 4111 and 4116 to perform the same function.
After operating controller 4000 as indicated in the methods with reference to FIGS. 46c, 46d, 46e, and 46f, the user may continue the specified finger movement to contact the edge sensing surfaces 4121 and 4126. A first edge transducer and a second edge transducer coupled to edge sensing surfaces 4121 and 4126 then generate a continuation command signal. The continuation command signal continues the x,y and z translation signals as well as the pitch, yaw and roll rotation signals until the user initiates another signal by contacting the sensing surfaces thereafter. For example, if a user places her finger in the middle of the first sensing surface 4111 and moves the finger to the first edge sensing surface 4121 while maintaining contact with the sensing surfaces, controller 4000 will continue to generate an x,y translation signal that increases the x coordinates after the user has lifted her finger away from the sensing surfaces.
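The continuation behavior just described is essentially a latch: the edge transducer freezes the last signal and keeps emitting it until a new contact arrives. The sketch below is a hypothetical restatement; the class and method names are assumptions, not the patent's terminology.

```python
# Illustrative latch model of the continuation command signal.
class ContinuationLatch:
    def __init__(self):
        self._last_signal = None
        self._latched = False

    def on_sensor_motion(self, signal):
        # Normal drag on a sensing surface: remember the signal,
        # and any prior continuation is cancelled by the new contact.
        self._last_signal = signal
        self._latched = False

    def on_edge_contact(self):
        # Finger crossed onto the edge sensing surface: latch the
        # last signal so it continues after the finger lifts.
        self._latched = True

    def current_signal(self):
        return self._last_signal if self._latched else None
```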
The x,y and y,z translation signals produced at the first position are determined by the position of the object being moved. When the user moves her finger on the first sensing surface, the x,y coordinates are changed by an x,y translation signal generated by the first transducer based on the direction of finger movement on the first sensing surface, as follows: towards top surface 4272, the y coordinates are increased; towards bottom surface 4290, the y coordinates are decreased; towards left surface 4247, the x coordinates are decreased; and towards right surface 4222, the x coordinates are increased.
When the user moves her finger on second sensing surface 4221, the y,z coordinates are changed by a y,z translation signal generated by the second transducer based on the direction of finger movement on second sensing surface 4221, as follows: towards top surface 4272, the y coordinates are increased; towards bottom surface 4290, the y coordinates are decreased; towards front surface 4212, the z coordinates are decreased; and towards rear surface 4285, the z coordinates are increased.
If a finger is dragged on first sensing surface 4211 towards top surface 4272, a positive pitch signal is generated; towards bottom surface 4290, a negative pitch signal; towards right surface 4222, a positive yaw signal; and towards left surface 4247, a negative yaw signal. If a finger is dragged on second sensing surface 4221 towards top surface 4272, a positive roll signal is generated; towards bottom surface 4290, a negative roll signal; towards front surface 4212, a negative yaw signal; and towards rear surface 4285, a positive yaw signal.
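The rotation conventions above for the wedge controller's two sensing surfaces can be collected into one table. This is an illustrative restatement only; the table name and the "first"/"second" surface keys are assumptions, while the axis names and signs follow the text.

```python
# Drag (surface, toward-surface) -> (rotation signal, sign), restated
# from the text for sensing surfaces 4211 ("first") and 4221 ("second").
ROTATION_MAP = {
    ("first", "top"): ("pitch", +1), ("first", "bottom"): ("pitch", -1),
    ("first", "right"): ("yaw", +1), ("first", "left"): ("yaw", -1),
    ("second", "top"): ("roll", +1), ("second", "bottom"): ("roll", -1),
    ("second", "front"): ("yaw", -1), ("second", "rear"): ("yaw", +1),
}
```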
FIG. 47e and FIG. 47f illustrate a controller 4240 in accordance with yet another embodiment of the present invention. Controller 4240 is identical to controller 4000, except that it further comprises a third sensor 4245 having a third sensing surface 4246 positioned on the left surface 4247 of wedge shaped body 4205. A third edge sensor 4250 having a third edge sensing surface 4251 is positioned around the periphery of third sensor 4245.
In a preferred embodiment, the method further includes an operation in which a user presses a finger within a second range of force against either the second sensing surface 4221, to generate an x− translation signal, or the third sensing surface 4246, to generate an x+ translation signal. Preferably, the second range of force is greater than the first range of force used in the method. Again, the third edge sensor 4250 may be used to generate a continuation control signal as described above.
For example, if a user wants to generate an x translation signal, she must swipe her finger along a surface of an available sensor located on a surface of cube shaped body 4320 in the direction of x-axis 4330. For example, a user may execute a finger swipe on the front surface 4321 or the rear surface 4322 of controller 4315b in the direction of x-axis 4330 to generate an x translation signal. If a user wanted to generate a y translation signal from controller 4315f, she would execute a finger swipe in the direction of y-axis 4335 on any of the faces of controller 4315f except for the top surface 4323.
For example, if a user wants to generate a pitch rotation signal, she must swipe her finger along a surface of an available sensor located on a surface of cube shaped body 4320 in the direction of pitch rotation around x-axis 4330. For example, a user may execute a finger swipe on the front surface 4321 or the rear surface 4322 of controller 4315b in the direction of pitch rotation around x-axis 4330 while holding another finger against any other available sensor to generate a pitch rotation signal.
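Reading the two preceding paragraphs together, a swipe alone yields a translation signal, while the same swipe with a second finger held against another sensor yields a rotation signal. The sketch below restates that gating rule; the function name and return strings are hypothetical.

```python
# Hypothetical sketch of the swipe-gating rule: a second held sensor
# turns a translation swipe into a rotation gesture.
def interpret_swipe(swipe_axis: str, other_sensor_held: bool) -> str:
    if other_sensor_held:
        return f"rotation about {swipe_axis}"
    return f"translation along {swipe_axis}"
```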
FIG. 53b and FIG. 53c illustrate a method of operating controller 4405 to produce x, y, z, pitch, yaw, and roll rotation signals. The sensors and edge sensors located on top surface 4415, front surface 4420, left surface 4435, right surface 4440, and rear surface 4445, function identically with the sensors located on corresponding faces of controller 4315f of FIG. 49f.
The sensors and edge sensors located on left front surface 4425 and right rear surface 4455 may be used to generate an x″ rotation signal, which commands the rotation of an object around an x″ axis. The x″ axis is defined at negative 45 degrees from the x-axis and located on the x,z plane. Each sensor of controller 4405 may be operated to generate a rotation signal by sliding an object on the sensor in the desired direction while touching a second sensor with another object.
The invention has been described herein in terms of several preferred embodiments. Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention. For example, a variety of types of pressure-sensitive sensors can be utilized with the present invention. Various configurations and combinations of input gestures and commands can be detected by the controller in various embodiments as necessary for a particular application. Also, various types of computer-generated objects and real objects can be controlled with the present invention and be commanded to interact with other objects in an environment. Furthermore, certain terminology has been used for the purposes of descriptive clarity, and not to limit the present invention. The embodiments and preferred features described above should be considered exemplary, with the invention being defined by the appended claims.
Claims (54)
1. A multiple coordinate controller device comprising:
a three-dimensional body having a first surface portion and a second surface portion which is not coplanar with said first surface;
a first transducer having a first sensing surface, said first transducer being coupled to said first portion of said body and being capable of detecting both positions and a range of pressure forces at said positions on said first sensing surface, wherein said first transducer is further capable of providing a first range of z coordinates at a detected x,y coordinate in response to said range of pressure forces on said first sensing surface, said first range of z coordinates provided only if said range of pressure forces are greater than a first threshold pressure;
a second transducer having a second sensing surface, said second transducer being coupled to said second surface portion of said body and being capable of detecting both positions and a range of pressure forces at said positions on said second sensing surface, wherein said second transducer is further capable of providing a second range of z coordinates of opposite polarity to said first range of z coordinates in response to said range of forces on said second sensing surface, said second range of z coordinates provided only if said range of pressure forces are greater than a second threshold pressure.
2. The multiple coordinate controller device as recited in claim 1 wherein said first transducer detects a first position on said first sensing surface producing a first x,y coordinate and a second position on said first sensing surfaces producing a second x,y coordinate.
3. The multiple coordinate controller device as recited in claim 2 further comprising a first edge transducer having a first edge sensing surface positioned at least partially around a periphery of said first sensing surface, said first edge transducer being coupled to said first surface portion of said body and being capable of detecting a force on said first edge sensing surface.
4. The multiple coordinate controller device as recited in claim 3 further comprising a second edge transducer having a second edge sensing surface positioned at least partially around a periphery of said second sensing surface, said second edge transducer being coupled to said second surface portion of said body and being capable of detecting a force on said second edge sensing surface.
5. The multiple coordinate controller device as recited in claim 4 , wherein said first edge transducer provides a continuation control signal in response to said force applied to said first edge sensing surface, wherein said continuation control signal commands a continuation of movement in a direction determined by said first detected x,y coordinate and said second detected x,y coordinate.
6. The multiple coordinate controller device as recited in claim 5 wherein said first and second sensing surfaces and said first and second edge sensing surfaces are approximately a rectangular shape.
7. The multiple coordinate controller device as recited in claim 6 , wherein said first edge sensing surface is tactilely distinguished from said first sensing surface and said second edge sensing surface is tactilely distinguished from said second sensing surface.
8. The multiple coordinate controller device as recited in claim 6 , wherein said first edge sensing surface is raised from said first sensing surface and said second edge sensing surface is raised from said second sensing surface.
9. The multiple coordinate controller device as recited in claim 6 wherein said second transducer detects a third and fourth position on said second sensing surface.
10. A multiple coordinate controller device comprising:
a three-dimensional body having a first surface portion and a second surface portion which is not coplanar with said first surface; and
a sensor consisting essentially of:
a first transducer having a first sensing surface, said first transducer being coupled to said first surface portion of said body and being capable of detecting both positions and a range of pressure forces at said positions on said first sensing surface, wherein said first transducer is further capable of providing a first range of z coordinates at a detected x,y coordinate in response to said first range of forces, said first range of z coordinates provided only if said range of pressure forces are greater than a first threshold pressure;
a second transducer having a second sensing surface, said second transducer being coupled to said second surface portion of said body and being capable of detecting both positions and a range of pressure forces at said positions on said second sensing surface, wherein said second transducer is further capable of providing a second range of z coordinates of opposite polarity to said first range of z coordinates in response to said second range of forces, said second range of z coordinates provided only if said range of pressure forces are greater than a second threshold pressure;
whereby said sensor is capable of providing x,y and z coordinates from said first transducer and said second transducer, and
whereby, said first sensing surface and said second sensing surface do not substantially deform under pressure.
11. A two sided controller comprising:
a body having a first surface and an opposing second surface, said first surface and said second surface having dimensions that are substantially greater than a separation between said first surface and said second surface;
a first sensor assembly supported by said first surface and including a first generally flat pressure sensor surrounded, at least in part, by a first generally flat edge pressure sensor;
a second sensor assembly supported by said second surface and including a second generally flat pressure sensor surrounded, at least in part, by a second generally flat edge pressure sensor;
wherein said body is sized to be contacted on said first sensor assembly with the thumb of a hand and simultaneously on said second sensor assembly with a finger of said hand.
12. A wedge shaped controller comprising:
a body having a front edge surface having a first area, a back edge surface having a second area less than said first area, and a pair of side edge surfaces coupling said front edge surface to said back edge surface, whereby said body has a wedge shape with angled side edges;
a first sensor assembly supported by said front edge surface and including a first generally flat pressure sensor surrounded, at least in part, by a first generally flat edge pressure sensor; and
a second sensor assembly supported by one of said pair of side edge surfaces and including a second generally flat pressure sensor surrounded, at least in part, by a second generally flat edge pressure sensor.
13. A wedge shaped controller as recited in claim 12 further comprising:
a third sensor assembly supported by the other of said pair of side edge surfaces and including a third generally flat pressure sensor surrounded, at least in part, by a third generally flat edge pressure sensor.
14. A wedge shaped controller as recited in claim 12 wherein said body further has a top surface and a bottom surface, and is provided with a pressure sensor on at least one of said top surface and said bottom surface.
15. A touch-sensitive manually operable controller for providing position control information relative to three axes, the controller comprising:
a top surface, a bottom surface, and a peripheral side surface;
a first sensor positioned on the side surface of the controller and generally aligned on and orthogonal relative to an X-axis of a Cartesian coordinate system, the first sensor adapted for providing a first Y-signal in response to the position of a force applied to the sensor along the Y-axis and a first Z-signal in response to the position of a force applied to the sensor along the Z-axis;
a second sensor positioned on the top surface of the controller and generally aligned on and orthogonal relative to a Y-axis of a Cartesian coordinate system, the second sensor adapted for providing a first X-signal in response to the position of a force applied to the sensor along the X-axis and a second Z-signal in response to the position of a force applied to the sensor along the Z-axis;
a third sensor positioned on the side surface of the controller and generally aligned on and orthogonal relative to a Z-axis of a Cartesian coordinate system, the third sensor adapted for providing a second X-signal in response to the position of a force applied to the sensor along the X-axis and a second Y-signal in response to the position of a force applied to the sensor along the Y-axis; and
a fourth sensor positioned on the side surface of the controller opposite the first sensor and generally aligned on and orthogonal relative to an X-axis of a Cartesian coordinate system, the fourth sensor adapted for providing a third Y-signal in response to the position of a force applied to the sensor along the Y-axis and a third Z-signal in response to the position of a force applied to the sensor along the Z-axis.
16. A touch-sensitive manually operable controller for providing position control information relative to three axes, the controller comprising:
a top surface, a bottom surface, and a peripheral side surface;
a first sensor positioned on the side surface of the controller and generally aligned on and orthogonal relative to an X-axis of a Cartesian coordinate system, the first sensor adapted for providing a first roll-signal in response to the position of a force applied to the sensor along the Y-axis and a first yaw-signal in response to the position of a force applied to the sensor along the Z-axis;
a second sensor positioned on the top surface of the controller and generally aligned on and orthogonal relative to a Y-axis of a Cartesian coordinate system, the second sensor adapted for providing a second roll-signal in response to the position of a force applied to the sensor along the X-axis and a first pitch-signal in response to the position of a force applied to the sensor along the Z-axis;
a third sensor positioned on the side surface of the controller and generally aligned on and orthogonal relative to a Z-axis of a Cartesian coordinate system, the third sensor adapted for providing a second pitch-signal in response to the position of a force applied to the sensor along the Y-axis; and
a fourth sensor positioned on the side surface of the controller opposite the first sensor and generally aligned on and orthogonal relative to an X-axis of a Cartesian coordinate system, the fourth sensor adapted for providing a third roll-signal in response to the position of a force applied to the sensor along the Y-axis and a second yaw-signal in response to the position of a force applied to the sensor along the Z-axis.
17. A three dimensional controller comprising:
a body having multiple faces wherein a first, second and a third face of the multiple faces meet at a common apex;
a first axis controller, positioned on the first face, which is generally aligned on and orthogonal relative to an X-axis of a Cartesian coordinate system, the first axis controller adapted for providing a first Y-signal in response to the position of a force applied to the first axis controller along the Y-axis and a first Z-signal in response to the position of a force applied to the first axis controller along the Z-axis;
a second axis controller, positioned on the second face, which is generally aligned on and orthogonal relative to a Y-axis of a Cartesian coordinate system, the second sensor adapted for providing a first X-signal in response to the position of a force applied to the second axis controller along the X-axis and a second Z-signal in response to the position of a force applied to the second axis controller along the Z-axis; and
a third axis controller, positioned on the third face, which is generally aligned on and orthogonal relative to a Z-axis of a Cartesian coordinate system, the third sensor adapted for providing a second X-signal in response to the position of a force applied to the third axis controller along the X-axis and a second Y-signal in response to the position of a force applied to the third axis controller along the Y-axis.
18. A three dimensional controller as recited in claim 17 wherein the first, the second and the third axis controllers comprise trackballs.
19. A three dimensional controller as recited in claim 17 wherein the first, the second and the third axis controllers comprise stick sensors.
20. A three dimensional controller as recited in claim 17 wherein the first, the second and the third axis controllers comprise zone sensors.
21. A three dimensional controller comprising:
a body having multiple faces wherein a first, a second and a third face of the multiple faces meet at a common apex;
a first axis controller, positioned on the first face, which is generally aligned on and orthogonal relative to an X-axis of a Cartesian coordinate system, the first axis controller adapted for providing a first Y-signal in response to the position of a force applied to the sensor along the Y-axis and a first Z-signal in response to the position of a force applied to the sensor along the Z-axis;
a second axis controller, positioned on the second face, which is generally aligned on and orthogonal relative to a Y-axis of a Cartesian coordinate system, the second sensor adapted for providing a first X-signal in response to the position of a force applied to the sensor along the X-axis and a second Z-signal in response to the position of a force applied to the sensor along the Z-axis;
a third axis controller, positioned on the third face, which is generally aligned on and orthogonal relative to a Z-axis of a Cartesian coordinate system, the third sensor adapted for providing a second X-signal in response to the position of a force applied to the sensor along the X-axis and a second Y-signal in response to the position of a force applied to the sensor along the Y-axis; and
a fourth axis controller, positioned on the third surface, which is generally aligned on and orthogonal relative to an X-axis of a Cartesian coordinate system, the fourth axis controller adapted for providing a third Y-signal in response to the position of a force applied to the fourth axis controller along the Y-axis and a third Z-signal in response to the position of a force applied to the fourth axis controller along the Z-axis.
22. A three dimensional controller as recited in claim 21 wherein the first, the second, the third and the fourth axis controllers comprise trackballs.
23. A three dimensional controller as recited in claim 21 wherein the first, the second, the third and the fourth axis controllers comprise stick sensors.
24. A three dimensional controller as recited in claim 21 wherein the first, the second, the third and the fourth axis controllers comprise zone sensors.
25. A multiple coordinate controller device comprising:
a three-dimensional movable body having a first surface portion, a second surface portion which is not coplanar with said first surface, and a tracking surface adapted to engage a reference surface;
a first transducer having a first sensing surface, said first transducer being coupled to said first portion of said body and being capable of detecting both positions and a range of pressure forces at said positions on said first sensing surface, wherein said first transducer is further capable of providing a first range of z coordinates at a detected x,y coordinate in response to said range of pressure forces on said first sensing surface, said first range of z coordinates provided only if said range of pressure forces are greater than a first threshold pressure;
a second transducer having a second sensing surface, said second transducer being coupled to said second surface portion of said body and being capable of detecting both positions and a range of pressure forces at said positions on said second sensing surface, wherein said second transducer is further capable of providing a second range of z coordinates of opposite polarity to said first range of z coordinates in response to said range of forces on said second sensing surface, said second range of z coordinates provided only if said range of pressure forces are greater than a second threshold pressure; and
a mouse sensor mechanism supported by said body and adapted to engage said reference surface as said body is moved over said reference surface, wherein said first and second sensing surfaces can be engaged by a finger as said body is engaged by a hand of a user.
26. The multiple coordinate controller device as recited in claim 25 wherein said first transducer detects a first position on said first sensing surface producing a first x,y coordinate and a second position on said first sensing surfaces producing a second x,y coordinate.
27. The multiple coordinate controller device as recited in claim 26 further comprising a first edge transducer having a first edge sensing surface positioned at least partially around a periphery of said first sensing surface, said first edge transducer being coupled to said first surface portion of said body and being capable of detecting a force on said first edge sensing surface.
28. The multiple coordinate controller device as recited in claim 27 further comprising a second edge transducer having a second edge sensing surface, said second edge transducer being coupled to said second surface portion of said body and being capable of detecting a force on said second edge sensing surface.
29. The multiple coordinate controller device as recited in claim 28 , wherein said first edge transducer provides a continuation control signal in response to said force applied to said first edge sensing surface, wherein said continuation control signal commands a continuation of movement in a direction determined by said first detected x,y coordinate and said second detected x,y coordinate.
30. The multiple coordinate controller device as recited in claim 29 wherein said first and second sensing surfaces and said first and second edge sensing surfaces are approximately a rectangular shape.
31. The multiple coordinate controller device as recited in claim 30 , wherein said first edge sensing surface is tactilely distinguished from said first sensing surface and said second edge sensing surface is tactilely distinguished from said second sensing surface.
32. The multiple coordinate controller device as recited in claim 30 , wherein said first edge sensing surface is raised from said first sensing surface and said second edge sensing surface is raised from said second sensing surface.
33. The multiple coordinate controller device as recited in claim 30 wherein said second transducer detects a third and fourth position on said second sensing surface.
34. A three dimensional controller comprising:
a body having multiple faces wherein a first, a second and a third face of the multiple faces meet at a common apex;
a first sensor, positioned on the first face, which is generally aligned on and orthogonal relative to an X-axis of a Cartesian coordinate system, the first sensor adapted for providing a first roll-signal in response to the position of a force applied to the sensor along the Y-axis and a first yaw-signal in response to the position of a force applied to the sensor along the Z-axis;
a second sensor, positioned on the second face, which is generally aligned on and orthogonal relative to a Y-axis of a Cartesian coordinate system, the second sensor adapted for providing a second roll-signal in response to the position of a force applied to the sensor along the X-axis and a first pitch-signal in response to the position of a force applied to the sensor along the Z-axis;
a third sensor, positioned on the third face, which is generally aligned on and orthogonal relative to a Z-axis of a Cartesian coordinate system, the third sensor adapted for providing a second pitch-signal in response to the position of a force applied to the sensor along the Y-axis; and
a fourth sensor, positioned on the third face, which is generally aligned on and orthogonal relative to an X-axis of a Cartesian coordinate system, the fourth sensor adapted for providing a third roll-signal in response to the position of a force applied to the sensor along the Y-axis and a second yaw-signal in response to the position of a force applied to the sensor along the Z-axis.
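The scheme in claim 34 — each face orthogonal to one axis reports rotations about the remaining axes from the two in-plane touch coordinates — can be sketched as a lookup. The function name `face_signals`, the face labels, and the signal convention are illustrative assumptions, not part of the patent:

```python
def face_signals(face, touch_pos):
    """Map a touch position on one controller face to rotation signals.

    `face` names the axis the face is orthogonal to ("x", "y" or "z");
    `touch_pos` is the (u, v) coordinate of the applied force on that
    face. Returns the rotation-signal contributions per claim 34.
    """
    u, v = touch_pos
    if face == "x":   # X-face: Y-position -> roll, Z-position -> yaw
        return {"roll": u, "yaw": v}
    if face == "y":   # Y-face: X-position -> roll, Z-position -> pitch
        return {"roll": u, "pitch": v}
    if face == "z":   # Z-face: Y-position -> pitch (third sensor)
        return {"pitch": v}
    raise ValueError(f"unknown face: {face}")
```

Summing the contributions of all touched faces would yield the combined roll, pitch and yaw commands; redundant faces (the first and fourth sensors both report roll and yaw) allow the same rotation to be driven from opposite sides of the body.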
35. A three dimensional controller as recited in claim 34 wherein the first, the second, the third and the fourth sensors comprise trackballs.
36. A three dimensional controller as recited in claim 34 wherein the first, the second, the third and the fourth sensors comprise stick sensors.
37. A three dimensional controller as recited in claim 34 wherein the first, the second, the third and the fourth sensors comprise zone sensors.
38. A touch-sensitive manually operable controller for providing position control information relative to three axes, the controller comprising:
a top surface, a front surface, and a peripheral side surface;
a first sensor positioned on the side surface of the controller and generally aligned on and orthogonal relative to an X-axis of a Cartesian coordinate system, the first sensor adapted for providing a first Y-signal in response to the position of a force applied to the sensor along the Y-axis and a yaw-signal in response to the position of a force applied to the sensor along the Z-axis;
a second sensor positioned on the top surface of the controller and generally aligned on and orthogonal relative to a Y-axis of a Cartesian coordinate system, the second sensor adapted for providing a first X-signal in response to the position of a force applied to the sensor along the X-axis and a first Z-signal in response to the position of a force applied to the sensor along the Z-axis;
a third sensor positioned on the side surface of the controller and generally aligned on and orthogonal relative to a Z-axis of a Cartesian coordinate system, the third sensor adapted for providing a second X-signal in response to the position of a force applied to the sensor along the X-axis and a second Y-signal in response to the position of a force applied to the sensor along the Y-axis; and
a fourth sensor positioned on the side surface of the controller opposite the first sensor and generally aligned on and orthogonal relative to an X-axis of a Cartesian coordinate system, the fourth sensor adapted for providing a third Y-signal in response to the position of a force applied to the sensor along the Y-axis and a second Z-signal in response to the position of a force applied to the sensor along the Z-axis.
39. An input device comprising:
a plurality of pressure sensitive sensors on a body wherein each pressure sensitive sensor of the plurality of pressure sensitive sensors can be manipulated by a finger;
wherein said body is a mouse body having a lower surface for engagement with a reference surface for relative movement with respect thereto, said lower surface of said mouse body being provided with at least one sensor for sensing x and y degrees of freedom, said mouse body further having an upper surface;
wherein said pressure sensitive sensors are each at least one of a button, touch tablet, trackball and joystick and are accessible from said upper surface of said mouse body; and
wherein at least one of said pressure sensitive sensors senses a degree of freedom other than the x and y degrees of freedom.
40. An input device comprising:
a movable mouse body having a tracking surface adapted to engage a reference surface and a curved upper surface provided with at least two two-degrees-of-freedom pressure sensitive sensors to provide multiple degrees of freedom; and
a sensor mechanism supported by said mouse body and adapted to engage said reference surface as said body is moved over said reference surface to track at least two degrees of freedom.
41. The input device as recited in claim 40 wherein the plurality of pressure sensitive sensors comprises two pressure sensitive sensors.
42. The input device as recited in claim 40 wherein the plurality of pressure sensitive sensors comprises three pressure sensitive sensors.
43. The input device as recited in claim 40 wherein the plurality of pressure sensitive sensors comprises four pressure sensitive sensors.
44. The input device as recited in claim 40 , wherein said pressure sensitive sensors do not transmit a signal when a pressure is not present on said pressure sensitive sensors.
45. A touch-sensitive manually operable controller for providing position control information relative to three axes, the controller comprising:
a top surface, a front surface, and a peripheral side surface;
a first sensor positioned on the side surface of the controller and generally aligned on and orthogonal relative to an X-axis of a Cartesian coordinate system, the first sensor adapted for providing a roll-signal in response to the position of a force applied to the sensor along the Y-axis and a yaw-signal in response to the position of a force applied to the sensor along the Z-axis;
a second sensor positioned on the top surface of the controller and generally aligned on and orthogonal relative to a Y-axis of a Cartesian coordinate system, the second sensor adapted for providing a first X-signal in response to the position of a force applied to the sensor along the X-axis and a first Z-signal in response to the position of a force applied to the sensor along the Z-axis;
a third sensor positioned on the side surface of the controller and generally aligned on and orthogonal relative to a Z-axis of a Cartesian coordinate system, the third sensor adapted for providing a second X-signal in response to the position of a force applied to the sensor along the X-axis and a first Y-signal in response to the position of a force applied to the sensor along the Y-axis; and
a fourth sensor positioned on the side surface of the controller opposite the first sensor and generally aligned on and orthogonal relative to an X-axis of a Cartesian coordinate system, the fourth sensor adapted for providing a second Y-signal in response to the position of a force applied to the sensor along the Y-axis and a second Z-signal in response to the position of a force applied to the sensor along the Z-axis.
46. A touch-sensitive manually operable controller for providing position control information relative to three axes, the controller comprising:
a top surface, a front surface, and a peripheral side surface;
a first sensor positioned on the side surface of the controller and generally aligned on and orthogonal relative to an X-axis of a Cartesian coordinate system, the first sensor adapted for providing a first Y-signal in response to the position of a force applied to the sensor along the Y-axis and a yaw-signal in response to the position of a force applied to the sensor along the Z-axis;
a second sensor positioned on the top surface of the controller and generally aligned on and orthogonal relative to a Y-axis of a Cartesian coordinate system, the second sensor adapted for providing a first X-signal in response to the position of a force applied to the sensor along the X-axis and a pitch-signal in response to the position of a force applied to the sensor along the Z-axis;
a third sensor positioned on the side surface of the controller and generally aligned on and orthogonal relative to a Z-axis of a Cartesian coordinate system, the third sensor adapted for providing a second X-signal in response to the position of a force applied to the sensor along the X-axis and a second Y-signal in response to the position of a force applied to the sensor along the Y-axis; and
a fourth sensor positioned on the side surface of the controller opposite the first sensor and generally aligned on and orthogonal relative to an X-axis of a Cartesian coordinate system, the fourth sensor adapted for providing a third Y-signal in response to the position of a force applied to the sensor along the Y-axis and a first Z-signal in response to the position of a force applied to the sensor along the Z-axis.
47. An input device comprising:
a mouse body having a lower surface and an upper surface;
an x-y position sensor associated with said lower surface of said mouse body and providing output signals representing two degrees of freedom; and
a plurality of touch panels spaced apart on said upper surface of said mouse body for providing a plurality of output signals representing a plurality of additional degrees of freedom.
48. The input device as recited in claim 47 wherein the additional degrees of freedom include a pitch.
49. The input device as recited in claim 47 wherein the additional degrees of freedom include a yaw.
50. The input device as recited in claim 47 wherein the additional degrees of freedom include a roll.
51. The input device as recited in claim 47 wherein the additional degrees of freedom include at least pitch, yaw and roll.
52. The input device as recited in claim 47 wherein the additional degrees of freedom consist of pitch, yaw and roll.
53. The input device as recited in claim 52 wherein the body includes a bottom surface, a top surface, and two opposite side surfaces both of which are in contact with the top surface and the bottom surface, and the pitch, yaw and roll touch panels are each located on the body such that one touch panel is located on each of the two side surfaces and the top surface.
54. An input device comprising:
a body;
an x-y position sensor providing output signals representing two degrees of freedom; and
a plurality of touch panels spaced apart on said body for providing a plurality of output signals representing a plurality of additional degrees of freedom;
wherein the additional degrees of freedom consist of pitch, yaw and roll, and wherein the body includes a bottom surface and a top surface, where the pitch, yaw and roll touch panels are each located on the top surface.
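The device of claims 47-54 combines a conventional x-y position sensor with touch panels that each supply an extra degree of freedom. A minimal sketch of merging these into one input report follows; the `PoseDelta` type, the `merge_report` name, and the panel naming are assumptions for illustration, not from the patent (the zero-default also reflects claim 44's "no signal without pressure" behavior):

```python
from dataclasses import dataclass

@dataclass
class PoseDelta:
    """One input report: x/y translation from the position sensor plus
    pitch, yaw and roll deltas from the touch panels."""
    dx: float = 0.0
    dy: float = 0.0
    pitch: float = 0.0
    yaw: float = 0.0
    roll: float = 0.0

def merge_report(xy_delta, panel_values):
    """Combine the x-y sensor output with the touch-panel outputs.

    `panel_values` maps panel names ("pitch", "yaw", "roll") to their
    current output; a panel that is not being pressed is simply absent
    and contributes zero.
    """
    return PoseDelta(
        dx=xy_delta[0],
        dy=xy_delta[1],
        pitch=panel_values.get("pitch", 0.0),
        yaw=panel_values.get("yaw", 0.0),
        roll=panel_values.get("roll", 0.0),
    )
```

A host application could consume such reports to translate an on-screen object with the mouse while simultaneously rotating it with the panels, which is the five-degree-of-freedom interaction the claims describe.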
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/188,284 USRE40891E1 (en) | 1991-11-26 | 2005-07-22 | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom |
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US07/798,572 US5335557A (en) | 1991-11-26 | 1991-11-26 | Touch sensitive input control device |
US23825794A | 1994-05-03 | 1994-05-03 | |
US08/509,797 US5729249A (en) | 1991-11-26 | 1995-08-01 | Touch sensitive input control device |
US69636696A | 1996-08-13 | 1996-08-13 | |
US8603698P | 1998-05-19 | 1998-05-19 | |
US09/216,663 US6597347B1 (en) | 1991-11-26 | 1998-12-16 | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom |
US11/188,284 USRE40891E1 (en) | 1991-11-26 | 2005-07-22 | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/216,663 Reissue US6597347B1 (en) | 1991-11-26 | 1998-12-16 | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom |
Publications (1)
Publication Number | Publication Date |
---|---|
USRE40891E1 true USRE40891E1 (en) | 2009-09-01 |
Family
ID=27536432
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/188,284 Expired - Fee Related USRE40891E1 (en) | 1991-11-26 | 2005-07-22 | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom |
Country Status (1)
Country | Link |
---|---|
US (1) | USRE40891E1 (en) |
Cited By (74)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070200826A1 (en) * | 2003-07-31 | 2007-08-30 | Kye Systems Corp. | Computer input device for automatically scrolling |
US20090160805A1 (en) * | 2007-12-21 | 2009-06-25 | Kabushiki Kaisha Toshiba | Information processing apparatus and display control method |
US20090283341A1 (en) * | 2008-05-16 | 2009-11-19 | Kye Systems Corp. | Input device and control method thereof |
US20090303231A1 (en) * | 2008-06-09 | 2009-12-10 | Fabrice Robinet | Touch Screen Device, Method, and Graphical User Interface for Manipulating Three-Dimensional Virtual Objects |
US20100001878A1 (en) * | 2008-02-21 | 2010-01-07 | Honeywell International Inc. | Apparatus for controlling an object that is movable within a coordinate system having a plurality of axes |
US20100149129A1 (en) * | 2008-12-15 | 2010-06-17 | Fuminori Homma | Information processing apparatus, information processing method and program |
US20100207899A1 (en) * | 2007-10-12 | 2010-08-19 | Oh Eui Jin | Character input device |
US20110041098A1 (en) * | 2009-08-14 | 2011-02-17 | James Thomas Kajiya | Manipulation of 3-dimensional graphical objects or view in a multi-touch display |
US20110057882A1 (en) * | 2003-07-31 | 2011-03-10 | Kye Systems Corporation | Computer input device for automatically scrolling |
US20110115784A1 (en) * | 2009-11-17 | 2011-05-19 | Tartz Robert S | System and method of controlling three dimensional virtual objects on a portable computing device |
US20120056837A1 (en) * | 2010-09-08 | 2012-03-08 | Samsung Electronics Co., Ltd. | Motion control touch screen method and apparatus |
US20120075181A1 (en) * | 2009-03-22 | 2012-03-29 | Cherif Atia Algreatly | 3D computer cursor |
US20120092330A1 (en) * | 2010-10-19 | 2012-04-19 | Elan Microelectronics Corporation | Control methods for a multi-function controller |
US8498100B1 (en) | 2012-03-02 | 2013-07-30 | Microsoft Corporation | Flexible hinge and removable attachment |
US20130249847A1 (en) * | 2010-12-16 | 2013-09-26 | BSH Bosch und Siemens Hausgeräte GmbH | Operating device for a household appliance having an electronic display panel |
US8621939B2 (en) * | 2007-12-18 | 2014-01-07 | Intuitive Surgical Operations, Inc. | Ribbed force sensor |
US20140009424A1 (en) * | 2011-03-25 | 2014-01-09 | Kyocera Corporation | Electronic device, control method, and control program |
TWI424340B (en) * | 2009-12-21 | 2014-01-21 | Soft Haptic Technology Inc | Pointing device |
US20140060223A1 (en) * | 2011-06-03 | 2014-03-06 | Sony Corporation | Actuator device, multi-shaft driving device, and robot device |
US8719603B2 (en) | 2012-03-02 | 2014-05-06 | Microsoft Corporation | Accessory device authentication |
US8873227B2 (en) | 2012-03-02 | 2014-10-28 | Microsoft Corporation | Flexible hinge support layer |
US20150015476A1 (en) * | 2013-07-09 | 2015-01-15 | EZ as a Drink Productions, Inc. | Handheld computing platform with integrated pressure sensor and associated methods of use |
US8947353B2 (en) | 2012-06-12 | 2015-02-03 | Microsoft Corporation | Photosensor array gesture detection |
US8949477B2 (en) | 2012-05-14 | 2015-02-03 | Microsoft Technology Licensing, Llc | Accessory device architecture |
US8991473B2 (en) | 2012-10-17 | 2015-03-31 | Microsoft Technology Holding, LLC | Metal alloy injection molding protrusions |
US9064654B2 (en) | 2012-03-02 | 2015-06-23 | Microsoft Technology Licensing, Llc | Method of manufacturing an input device |
US9075566B2 (en) | 2012-03-02 | 2015-07-07 | Microsoft Technology Licensing, Llc | Flexible hinge spine |
US9123279B2 (en) | 2011-12-27 | 2015-09-01 | Industrial Technology Research Institute | Flexible display and method for controlling the flexible display |
US9201185B2 (en) | 2011-02-04 | 2015-12-01 | Microsoft Technology Licensing, Llc | Directional backlighting for display panels |
US9229476B2 (en) | 2013-05-08 | 2016-01-05 | EZ as a Drink Productions, Inc. | Personal handheld electronic device with a touchscreen on a peripheral surface |
US9230064B2 (en) | 2012-06-19 | 2016-01-05 | EZ as a Drink Productions, Inc. | Personal wellness device |
US20160034177A1 (en) * | 2007-01-06 | 2016-02-04 | Apple Inc. | Detecting and interpreting real-world and security gestures on touch and hover sensitive devices |
US9256089B2 (en) | 2012-06-15 | 2016-02-09 | Microsoft Technology Licensing, Llc | Object-detecting backlight unit |
US9304549B2 (en) | 2013-03-28 | 2016-04-05 | Microsoft Technology Licensing, Llc | Hinge mechanism for rotatable component attachment |
US9317072B2 (en) | 2014-01-28 | 2016-04-19 | Microsoft Technology Licensing, Llc | Hinge mechanism with preset positions |
US9354748B2 (en) | 2012-02-13 | 2016-05-31 | Microsoft Technology Licensing, Llc | Optical stylus interaction |
US9360893B2 (en) | 2012-03-02 | 2016-06-07 | Microsoft Technology Licensing, Llc | Input device writing surface |
US9426905B2 (en) | 2012-03-02 | 2016-08-23 | Microsoft Technology Licensing, Llc | Connection device for computing devices |
US9432070B2 (en) | 2012-10-16 | 2016-08-30 | Microsoft Technology Licensing, Llc | Antenna placement |
US9448631B2 (en) | 2013-12-31 | 2016-09-20 | Microsoft Technology Licensing, Llc | Input device haptics and pressure sensing |
US9447620B2 (en) | 2014-09-30 | 2016-09-20 | Microsoft Technology Licensing, Llc | Hinge mechanism with multiple preset positions |
US9459160B2 (en) | 2012-06-13 | 2016-10-04 | Microsoft Technology Licensing, Llc | Input device sensor configuration |
US9544504B2 (en) | 2012-11-02 | 2017-01-10 | Microsoft Technology Licensing, Llc | Rapid synchronized lighting and shuttering |
US9552777B2 (en) | 2013-05-10 | 2017-01-24 | Microsoft Technology Licensing, Llc | Phase control backlight |
US20170115867A1 (en) * | 2015-10-27 | 2017-04-27 | Yahoo! Inc. | Method and system for interacting with a touch screen |
US9684382B2 (en) | 2012-06-13 | 2017-06-20 | Microsoft Technology Licensing, Llc | Input device configuration having capacitive and pressure sensors |
US9752361B2 (en) | 2015-06-18 | 2017-09-05 | Microsoft Technology Licensing, Llc | Multistage hinge |
US9759854B2 (en) | 2014-02-17 | 2017-09-12 | Microsoft Technology Licensing, Llc | Input device outer layer and backlighting |
US20170322626A1 (en) * | 2016-05-06 | 2017-11-09 | The Board Of Trustees Of The Leland Stanford Junior University | Wolverine: a wearable haptic interface for grasping in virtual reality |
US9824808B2 (en) | 2012-08-20 | 2017-11-21 | Microsoft Technology Licensing, Llc | Switchable magnetic lock |
US9855102B2 (en) | 2007-12-18 | 2018-01-02 | Intuitive Surgical Operations, Inc. | Force sensor temperature compensation |
US9864415B2 (en) | 2015-06-30 | 2018-01-09 | Microsoft Technology Licensing, Llc | Multistage friction hinge |
US9870066B2 (en) | 2012-03-02 | 2018-01-16 | Microsoft Technology Licensing, Llc | Method of manufacturing an input device |
WO2018067130A1 (en) * | 2016-10-04 | 2018-04-12 | Hewlett-Packard Development Company, L.P. | Three-dimensional input device |
US10031556B2 (en) | 2012-06-08 | 2018-07-24 | Microsoft Technology Licensing, Llc | User experience adaptation |
US10037057B2 (en) | 2016-09-22 | 2018-07-31 | Microsoft Technology Licensing, Llc | Friction hinge |
US10061385B2 (en) | 2016-01-22 | 2018-08-28 | Microsoft Technology Licensing, Llc | Haptic feedback for a touch input device |
US20180253221A1 (en) * | 2017-03-02 | 2018-09-06 | Samsung Electronics Co., Ltd. | Display device and user interface displaying method thereof |
US10102345B2 (en) | 2012-06-19 | 2018-10-16 | Activbody, Inc. | Personal wellness management platform |
US10107994B2 (en) | 2012-06-12 | 2018-10-23 | Microsoft Technology Licensing, Llc | Wide field-of-view virtual image projector |
US10120420B2 (en) | 2014-03-21 | 2018-11-06 | Microsoft Technology Licensing, Llc | Lockable display and techniques enabling use of lockable displays |
US10124246B2 (en) | 2014-04-21 | 2018-11-13 | Activbody, Inc. | Pressure sensitive peripheral devices, and associated methods of use |
US10133849B2 (en) | 2012-06-19 | 2018-11-20 | Activbody, Inc. | Merchandizing, socializing, and/or gaming via a personal wellness device and/or a personal wellness platform |
US10156889B2 (en) | 2014-09-15 | 2018-12-18 | Microsoft Technology Licensing, Llc | Inductive peripheral retention device |
US20190056786A1 (en) * | 2017-08-17 | 2019-02-21 | Google Inc. | Adjusting movement of a display screen to compensate for changes in speed of movement across the display screen |
US10222889B2 (en) | 2015-06-03 | 2019-03-05 | Microsoft Technology Licensing, Llc | Force inputs and cursor control |
US10272294B2 (en) | 2016-06-11 | 2019-04-30 | Apple Inc. | Activity and workout updates |
US10324733B2 (en) | 2014-07-30 | 2019-06-18 | Microsoft Technology Licensing, Llc | Shutdown notifications |
US10344797B2 (en) | 2016-04-05 | 2019-07-09 | Microsoft Technology Licensing, Llc | Hinge with multiple preset positions |
US10416799B2 (en) | 2015-06-03 | 2019-09-17 | Microsoft Technology Licensing, Llc | Force sensing and inadvertent input control of an input device |
US10578499B2 (en) | 2013-02-17 | 2020-03-03 | Microsoft Technology Licensing, Llc | Piezo-actuated virtual buttons for touch surfaces |
US11064910B2 (en) | 2010-12-08 | 2021-07-20 | Activbody, Inc. | Physical activity monitoring system |
US11103787B1 (en) | 2010-06-24 | 2021-08-31 | Gregory S. Rabin | System and method for generating a synthetic video stream |
USRE48963E1 (en) | 2012-03-02 | 2022-03-08 | Microsoft Technology Licensing, Llc | Connection device for computing devices |
Citations (56)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3490059A (en) | 1966-06-06 | 1970-01-13 | Martin Marietta Corp | Three axis mounting and torque sensing apparatus |
US4017858A (en) | 1973-07-30 | 1977-04-12 | Polhemus Navigation Sciences, Inc. | Apparatus for generating a nutating electromagnetic field |
US4216467A (en) | 1977-12-22 | 1980-08-05 | Westinghouse Electric Corp. | Hand controller |
GB2060173A (en) * | 1979-09-25 | 1981-04-29 | Fiat Ricerche | Capacitive transducer with six degrees of freedom |
US4302011A (en) | 1976-08-24 | 1981-11-24 | Peptek, Incorporated | Video game apparatus and method |
US4313113A (en) | 1980-03-24 | 1982-01-26 | Xerox Corporation | Cursor control |
US4394773A (en) | 1980-07-21 | 1983-07-19 | Siemens Corporation | Fingerprint sensor |
US4448083A (en) | 1981-04-13 | 1984-05-15 | Yamato Scale Company, Ltd. | Device for measuring components of force and moment in plural directions |
JPS6095331A (en) * | 1983-10-31 | 1985-05-28 | Sumitomo Heavy Ind Ltd | Force and moment sensor |
JPS60129635A (en) * | 1983-12-19 | 1985-07-10 | Omron Tateisi Electronics Co | Force detection apparatus |
US4550221A (en) | 1983-10-07 | 1985-10-29 | Scott Mabusth | Touch sensitive control device |
US4550617A (en) | 1983-05-06 | 1985-11-05 | Societe Nationale D'etude Et De Construction De Moteurs D'aviation "S.N.E.C.M.A." | Multi axis force and moments transducer |
SU1244515A1 (en) | 1984-11-27 | 1986-07-15 | Московский Институт Электронного Машиностроения | Device for simultaneous determining of components of force and displacement |
US4601206A (en) | 1983-09-16 | 1986-07-22 | Ferranti Plc | Accelerometer system |
JPS61292028A (en) | 1985-06-07 | 1986-12-22 | Fujitsu Ltd | Force detecting device |
US4684801A (en) | 1986-02-28 | 1987-08-04 | Carroll Touch Inc. | Signal preconditioning for touch entry device |
US4704909A (en) | 1985-07-22 | 1987-11-10 | Grahn Allen R | Multicomponent force-torque sensor |
US4720805A (en) | 1985-12-10 | 1988-01-19 | Vye Scott R | Computerized control system for the pan and tilt functions of a motorized camera head |
US4763100A (en) | 1987-08-13 | 1988-08-09 | Wood Lawson A | Joystick with additional degree of control |
US4787051A (en) | 1986-05-16 | 1988-11-22 | Tektronix, Inc. | Inertial mouse system |
US4798919A (en) | 1987-04-28 | 1989-01-17 | International Business Machines Corporation | Graphics input tablet with three-dimensional data |
US4811608A (en) | 1985-12-18 | 1989-03-14 | Spatial Systems Pty Limited | Force and torque converter |
US4823634A (en) | 1987-11-03 | 1989-04-25 | Culver Craig F | Multifunction tactile manipulatable control |
US4839838A (en) | 1987-03-30 | 1989-06-13 | Labiche Mitchell | Spatial input apparatus |
US4954817A (en) | 1988-05-02 | 1990-09-04 | Levine Neil A | Finger worn graphic interface device |
US4983786A (en) | 1990-01-17 | 1991-01-08 | The University Of British Columbia | XY velocity controller |
US4988981A (en) | 1987-03-17 | 1991-01-29 | Vpl Research, Inc. | Computer data entry and manipulation apparatus and method |
US5095303A (en) | 1990-03-27 | 1992-03-10 | Apple Computer, Inc. | Six degree of freedom graphic object controller |
WO1992008208A1 (en) | 1990-11-01 | 1992-05-14 | Queen Mary & Westfield College | Data input device having more than two degrees of freedom |
US5128671A (en) | 1990-04-12 | 1992-07-07 | Ltv Aerospace And Defense Company | Control device having multiple degrees of freedom |
GB2254911A (en) * | 1991-04-20 | 1992-10-21 | Ind Limited W | Haptic computer input/output device, eg data glove. |
US5165897A (en) | 1990-08-10 | 1992-11-24 | Tini Alloy Company | Programmable tactile stimulator array system and method of operation |
US5178012A (en) | 1991-05-31 | 1993-01-12 | Rockwell International Corporation | Twisting actuator accelerometer |
US5185561A (en) | 1991-07-23 | 1993-02-09 | Digital Equipment Corporation | Torque motor as a tactile feedback device in a computer system |
WO1993011526A1 (en) | 1991-12-03 | 1993-06-10 | Logitech, Inc. | 3d mouse on a pedestal |
US5262777A (en) | 1991-11-16 | 1993-11-16 | Sri International | Device for generating multidimensional input signals to a computer |
US5327161A (en) * | 1989-08-09 | 1994-07-05 | Microtouch Systems, Inc. | System and method for emulating a mouse input device with a touchpad input device |
US5335557A (en) * | 1991-11-26 | 1994-08-09 | Taizo Yasutake | Touch sensitive input control device |
US5354162A (en) * | 1991-02-26 | 1994-10-11 | Rutgers University | Actuator system for providing force feedback to portable master support |
US5376948A (en) * | 1992-03-25 | 1994-12-27 | Visage, Inc. | Method of and apparatus for touch-input computer and related display employing touch force location external to the display |
US5389865A (en) * | 1992-12-02 | 1995-02-14 | Cybernet Systems Corporation | Method and system for providing a tactile virtual reality and manipulator defining an interface device therefor |
US5408407A (en) * | 1993-03-15 | 1995-04-18 | Pentek, Inc. | System and method for positioning a work point |
US5429140A (en) * | 1993-06-04 | 1995-07-04 | Greenleaf Medical Systems, Inc. | Integrated virtual reality rehabilitation system |
WO1995020787A1 (en) | 1994-01-27 | 1995-08-03 | Exos, Inc. | Multimode feedback display technology |
WO1995020788A1 (en) | 1994-01-27 | 1995-08-03 | Exos, Inc. | Intelligent remote multimode sense and display system utilizing haptic information compression |
US5440476A (en) * | 1993-03-15 | 1995-08-08 | Pentek, Inc. | System for positioning a work point in three dimensional space |
US5483261A (en) * | 1992-02-14 | 1996-01-09 | Itu Research, Inc. | Graphical input controller and method with rear screen image detection |
US5506605A (en) * | 1992-07-27 | 1996-04-09 | Paley; W. Bradford | Three-dimensional mouse with tactile feedback |
US5543590A (en) * | 1992-06-08 | 1996-08-06 | Synaptics, Incorporated | Object position detector with edge motion feature |
US5555894A (en) * | 1993-05-11 | 1996-09-17 | Matsushita Electric Industrial Co., Ltd. | Force sensation exhibiting device, data input device and data input equipment |
US5565891A (en) * | 1992-03-05 | 1996-10-15 | Armstrong; Brad A. | Six degrees of freedom graphics controller |
US5703623A (en) * | 1996-01-24 | 1997-12-30 | Hall; Malcolm G. | Smart orientation sensing circuit for remote control |
US5717423A (en) * | 1994-12-30 | 1998-02-10 | Merltec Innovative Research | Three-dimensional display |
US6087599A (en) * | 1997-11-24 | 2000-07-11 | The Whitaker Corporation | Touch panels having plastic substrates |
US6091406A (en) * | 1996-12-25 | 2000-07-18 | Elo Touchsystems, Inc. | Grating transducer for acoustic touchscreens |
US6597347B1 (en) * | 1991-11-26 | 2003-07-22 | Itu Research Inc. | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom |
Patent Citations (62)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3490059A (en) | 1966-06-06 | 1970-01-13 | Martin Marietta Corp | Three axis mounting and torque sensing apparatus |
US4017858A (en) | 1973-07-30 | 1977-04-12 | Polhemus Navigation Sciences, Inc. | Apparatus for generating a nutating electromagnetic field |
US4302011A (en) | 1976-08-24 | 1981-11-24 | Peptek, Incorporated | Video game apparatus and method |
US4216467A (en) | 1977-12-22 | 1980-08-05 | Westinghouse Electric Corp. | Hand controller |
GB2060173A (en) * | 1979-09-25 | 1981-04-29 | Fiat Ricerche | Capacitive transducer with six degrees of freedom |
US4313113A (en) | 1980-03-24 | 1982-01-26 | Xerox Corporation | Cursor control |
US4394773A (en) | 1980-07-21 | 1983-07-19 | Siemens Corporation | Fingerprint sensor |
US4448083A (en) | 1981-04-13 | 1984-05-15 | Yamato Scale Company, Ltd. | Device for measuring components of force and moment in plural directions |
US4550617A (en) | 1983-05-06 | 1985-11-05 | Societe Nationale D'etude Et De Construction De Moteurs D'aviation "S.N.E.C.M.A." | Multi axis force and moments transducer |
US4601206A (en) | 1983-09-16 | 1986-07-22 | Ferranti Plc | Accelerometer system |
US4550221A (en) | 1983-10-07 | 1985-10-29 | Scott Mabusth | Touch sensitive control device |
JPS6095331A (en) * | 1983-10-31 | 1985-05-28 | Sumitomo Heavy Ind Ltd | Force and moment sensor |
JPS60129635A (en) * | 1983-12-19 | 1985-07-10 | Omron Tateisi Electronics Co | Force detection apparatus |
SU1244515A1 (en) | 1984-11-27 | 1986-07-15 | Московский Институт Электронного Машиностроения | Device for simultaneous determining of components of force and displacement |
JPS61292028A (en) | 1985-06-07 | 1986-12-22 | Fujitsu Ltd | Force detecting device |
US4704909A (en) | 1985-07-22 | 1987-11-10 | Grahn Allen R | Multicomponent force-torque sensor |
US4720805A (en) | 1985-12-10 | 1988-01-19 | Vye Scott R | Computerized control system for the pan and tilt functions of a motorized camera head |
US4811608A (en) | 1985-12-18 | 1989-03-14 | Spatial Systems Pty Limited | Force and torque converter |
US4684801A (en) | 1986-02-28 | 1987-08-04 | Carroll Touch Inc. | Signal preconditioning for touch entry device |
US4787051A (en) | 1986-05-16 | 1988-11-22 | Tektronix, Inc. | Inertial mouse system |
US4988981B1 (en) | 1987-03-17 | 1999-05-18 | Vpl Newco Inc | Computer data entry and manipulation apparatus and method |
US4988981A (en) | 1987-03-17 | 1991-01-29 | Vpl Research, Inc. | Computer data entry and manipulation apparatus and method |
US4839838A (en) | 1987-03-30 | 1989-06-13 | Labiche Mitchell | Spatial input apparatus |
US4798919A (en) | 1987-04-28 | 1989-01-17 | International Business Machines Corporation | Graphics input tablet with three-dimensional data |
US4763100A (en) | 1987-08-13 | 1988-08-09 | Wood Lawson A | Joystick with additional degree of control |
US4823634A (en) | 1987-11-03 | 1989-04-25 | Culver Craig F | Multifunction tactile manipulatable control |
US4954817A (en) | 1988-05-02 | 1990-09-04 | Levine Neil A | Finger worn graphic interface device |
US5327161A (en) * | 1989-08-09 | 1994-07-05 | Microtouch Systems, Inc. | System and method for emulating a mouse input device with a touchpad input device |
US4983786A (en) | 1990-01-17 | 1991-01-08 | The University Of British Columbia | XY velocity controller |
US5095303A (en) | 1990-03-27 | 1992-03-10 | Apple Computer, Inc. | Six degree of freedom graphic object controller |
US5128671A (en) | 1990-04-12 | 1992-07-07 | Ltv Aerospace And Defense Company | Control device having multiple degrees of freedom |
US5165897A (en) | 1990-08-10 | 1992-11-24 | Tini Alloy Company | Programmable tactile stimulator array system and method of operation |
WO1992008208A1 (en) | 1990-11-01 | 1992-05-14 | Queen Mary & Westfield College | Data input device having more than two degrees of freedom |
US5354162A (en) * | 1991-02-26 | 1994-10-11 | Rutgers University | Actuator system for providing force feedback to portable master support |
GB2254911A (en) * | 1991-04-20 | 1992-10-21 | W Industries Limited | Haptic computer input/output device, e.g. data glove. |
US5178012A (en) | 1991-05-31 | 1993-01-12 | Rockwell International Corporation | Twisting actuator accelerometer |
US5185561A (en) | 1991-07-23 | 1993-02-09 | Digital Equipment Corporation | Torque motor as a tactile feedback device in a computer system |
US5262777A (en) | 1991-11-16 | 1993-11-16 | Sri International | Device for generating multidimensional input signals to a computer |
US6597347B1 (en) * | 1991-11-26 | 2003-07-22 | Itu Research Inc. | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom |
US5335557A (en) * | 1991-11-26 | 1994-08-09 | Taizo Yasutake | Touch sensitive input control device |
US5729249A (en) * | 1991-11-26 | 1998-03-17 | Itu Research, Inc. | Touch sensitive input control device |
US5774113A (en) * | 1991-12-03 | 1998-06-30 | Logitech, Inc. | 3-D mouse on a pedestal |
WO1993011526A1 (en) | 1991-12-03 | 1993-06-10 | Logitech, Inc. | 3d mouse on a pedestal |
US5483261A (en) * | 1992-02-14 | 1996-01-09 | Itu Research, Inc. | Graphical input controller and method with rear screen image detection |
US5565891A (en) * | 1992-03-05 | 1996-10-15 | Armstrong; Brad A. | Six degrees of freedom graphics controller |
US5376948A (en) * | 1992-03-25 | 1994-12-27 | Visage, Inc. | Method of and apparatus for touch-input computer and related display employing touch force location external to the display |
US5543590A (en) * | 1992-06-08 | 1996-08-06 | Synaptics, Incorporated | Object position detector with edge motion feature |
US5506605A (en) * | 1992-07-27 | 1996-04-09 | Paley; W. Bradford | Three-dimensional mouse with tactile feedback |
US5459382A (en) * | 1992-12-02 | 1995-10-17 | Cybernet Systems Corporation | Method and system for providing a tactile virtual reality and manipulator defining an interface device therefor |
US5459382B1 (en) * | 1992-12-02 | 1998-06-09 | Cybernet Systems Corp | Method and system for providing a tactile virtual reality and manipulator defining an interface device therefor |
US5389865A (en) * | 1992-12-02 | 1995-02-14 | Cybernet Systems Corporation | Method and system for providing a tactile virtual reality and manipulator defining an interface device therefor |
US5408407A (en) * | 1993-03-15 | 1995-04-18 | Pentek, Inc. | System and method for positioning a work point |
US5440476A (en) * | 1993-03-15 | 1995-08-08 | Pentek, Inc. | System for positioning a work point in three dimensional space |
US5778885A (en) * | 1993-05-11 | 1998-07-14 | Matsushita Electric Industrial Co. | Force sensation exhibiting device data input device and data input equipment |
US5555894A (en) * | 1993-05-11 | 1996-09-17 | Matsushita Electric Industrial Co., Ltd. | Force sensation exhibiting device, data input device and data input equipment |
US5429140A (en) * | 1993-06-04 | 1995-07-04 | Greenleaf Medical Systems, Inc. | Integrated virtual reality rehabilitation system |
WO1995020787A1 (en) | 1994-01-27 | 1995-08-03 | Exos, Inc. | Multimode feedback display technology |
WO1995020788A1 (en) | 1994-01-27 | 1995-08-03 | Exos, Inc. | Intelligent remote multimode sense and display system utilizing haptic information compression |
US5717423A (en) * | 1994-12-30 | 1998-02-10 | Merltec Innovative Research | Three-dimensional display |
US5703623A (en) * | 1996-01-24 | 1997-12-30 | Hall; Malcolm G. | Smart orientation sensing circuit for remote control |
US6091406A (en) * | 1996-12-25 | 2000-07-18 | Elo Touchsystems, Inc. | Grating transducer for acoustic touchscreens |
US6087599A (en) * | 1997-11-24 | 2000-07-11 | The Whitaker Corporation | Touch panels having plastic substrates |
Non-Patent Citations (5)
Title |
---|
Kameyama, et al., "A Shape Modeling System with a Volume Scanning Display and Multisensory Input Device," Presence, vol. 2, No. 2, pp. 104-111 (Spring 1993). |
Kameyama, Ken-ichi, and Koichi Ohtomi, "A Shape Modeling System with a Volume Scanning Display and Multisensory Input Device," Presence, vol. 2, No. 2 (Spring 1993). * |
Krueger, "Artificial Reality: Perceptual Systems," pp. 54-75 (1983). |
Murakami, et al., "Direct and Intuitive Input Device for 3-D Shape Deformation," Human Factors in Computing Systems (Apr. 24-28, 1994). |
Murakami, Tamotsu, and Naomasa Nakajima, "Direct and Intuitive Input Device for 3-D Shape Deformation," Human Factors in Computing Systems (Apr. 24-28, 1994). * |
Cited By (161)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8217896B2 (en) | 2003-07-31 | 2012-07-10 | Kye Systems Corporation | Computer input device for automatically scrolling |
US7859517B2 (en) * | 2003-07-31 | 2010-12-28 | Kye Systems Corporation | Computer input device for automatically scrolling |
US20070200826A1 (en) * | 2003-07-31 | 2007-08-30 | Kye Systems Corp. | Computer input device for automatically scrolling |
US20110057882A1 (en) * | 2003-07-31 | 2011-03-10 | Kye Systems Corporation | Computer input device for automatically scrolling |
US10620066B2 (en) | 2005-03-30 | 2020-04-14 | Intuitive Surgical Operations, Inc. | Ribbed force sensor |
US20160034177A1 (en) * | 2007-01-06 | 2016-02-04 | Apple Inc. | Detecting and interpreting real-world and security gestures on touch and hover sensitive devices |
US20100207899A1 (en) * | 2007-10-12 | 2010-08-19 | Oh Eui Jin | Character input device |
US9829994B2 (en) * | 2007-10-12 | 2017-11-28 | Eui Jin OH | Character input device |
US8621939B2 (en) * | 2007-12-18 | 2014-01-07 | Intuitive Surgical Operations, Inc. | Ribbed force sensor |
US10390896B2 (en) | 2007-12-18 | 2019-08-27 | Intuitive Surgical Operations, Inc. | Force sensor temperature compensation |
US11650111B2 (en) | 2007-12-18 | 2023-05-16 | Intuitive Surgical Operations, Inc. | Ribbed force sensor |
US9855102B2 (en) | 2007-12-18 | 2018-01-02 | Intuitive Surgical Operations, Inc. | Force sensor temperature compensation |
US11571264B2 (en) | 2007-12-18 | 2023-02-07 | Intuitive Surgical Operations, Inc. | Force sensor temperature compensation |
US9952107B2 (en) | 2007-12-18 | 2018-04-24 | Intuitive Surgical Operations, Inc | Ribbed force sensor |
US20090160805A1 (en) * | 2007-12-21 | 2009-06-25 | Kabushiki Kaisha Toshiba | Information processing apparatus and display control method |
US20100001878A1 (en) * | 2008-02-21 | 2010-01-07 | Honeywell International Inc. | Apparatus for controlling an object that is movable within a coordinate system having a plurality of axes |
US20090283341A1 (en) * | 2008-05-16 | 2009-11-19 | Kye Systems Corp. | Input device and control method thereof |
US20090303231A1 (en) * | 2008-06-09 | 2009-12-10 | Fabrice Robinet | Touch Screen Device, Method, and Graphical User Interface for Manipulating Three-Dimensional Virtual Objects |
US8477139B2 (en) * | 2008-06-09 | 2013-07-02 | Apple Inc. | Touch screen device, method, and graphical user interface for manipulating three-dimensional virtual objects |
US8547358B2 (en) * | 2008-12-15 | 2013-10-01 | Sony Corporation | Information processing apparatus, information processing method and program |
US20100149129A1 (en) * | 2008-12-15 | 2010-06-17 | Fuminori Homma | Information processing apparatus, information processing method and program |
US9035877B2 (en) * | 2009-03-22 | 2015-05-19 | Cherif Atia Algreatly | 3D computer cursor |
US20120075181A1 (en) * | 2009-03-22 | 2012-03-29 | Cherif Atia Algreatly | 3D computer cursor |
US10198854B2 (en) * | 2009-08-14 | 2019-02-05 | Microsoft Technology Licensing, Llc | Manipulation of 3-dimensional graphical objects for view in a multi-touch display |
US20110041098A1 (en) * | 2009-08-14 | 2011-02-17 | James Thomas Kajiya | Manipulation of 3-dimensional graphical objects or view in a multi-touch display |
US20110115784A1 (en) * | 2009-11-17 | 2011-05-19 | Tartz Robert S | System and method of controlling three dimensional virtual objects on a portable computing device |
US8922583B2 (en) * | 2009-11-17 | 2014-12-30 | Qualcomm Incorporated | System and method of controlling three dimensional virtual objects on a portable computing device |
CN102667674A (en) * | 2009-11-17 | 2012-09-12 | 高通股份有限公司 | System and method of controlling three dimensional virtual objects on a portable computing device |
TWI424340B (en) * | 2009-12-21 | 2014-01-21 | Soft Haptic Technology Inc | Pointing device |
US11103787B1 (en) | 2010-06-24 | 2021-08-31 | Gregory S. Rabin | System and method for generating a synthetic video stream |
US20120056837A1 (en) * | 2010-09-08 | 2012-03-08 | Samsung Electronics Co., Ltd. | Motion control touch screen method and apparatus |
US9684439B2 (en) * | 2010-09-08 | 2017-06-20 | Samsung Electronics Co., Ltd | Motion control touch screen method and apparatus |
US9013398B2 (en) * | 2010-10-19 | 2015-04-21 | Elan Microelectronics Corporation | Control methods for a multi-function controller |
US20120092330A1 (en) * | 2010-10-19 | 2012-04-19 | Elan Microelectronics Corporation | Control methods for a multi-function controller |
US11064910B2 (en) | 2010-12-08 | 2021-07-20 | Activbody, Inc. | Physical activity monitoring system |
US9904354B2 (en) * | 2010-12-16 | 2018-02-27 | BSH Hausgeräte GmbH | Operating device for a household appliance having an electronic display panel |
US20130249847A1 (en) * | 2010-12-16 | 2013-09-26 | BSH Bosch und Siemens Hausgeräte GmbH | Operating device for a household appliance having an electronic display panel |
US9201185B2 (en) | 2011-02-04 | 2015-12-01 | Microsoft Technology Licensing, Llc | Directional backlighting for display panels |
US9430081B2 (en) * | 2011-03-25 | 2016-08-30 | Kyocera Corporation | Electronic device, control method, and control program |
US20140009424A1 (en) * | 2011-03-25 | 2014-01-09 | Kyocera Corporation | Electronic device, control method, and control program |
US9561585B2 (en) * | 2011-06-03 | 2017-02-07 | Sony Corporation | Actuator device, multi-shaft driving device, and robot device |
US20140060223A1 (en) * | 2011-06-03 | 2014-03-06 | Sony Corporation | Actuator device, multi-shaft driving device, and robot device |
US9123279B2 (en) | 2011-12-27 | 2015-09-01 | Industrial Technology Research Institute | Flexible display and method for controlling the flexible display |
US9354748B2 (en) | 2012-02-13 | 2016-05-31 | Microsoft Technology Licensing, Llc | Optical stylus interaction |
US9360893B2 (en) | 2012-03-02 | 2016-06-07 | Microsoft Technology Licensing, Llc | Input device writing surface |
US9793073B2 (en) | 2012-03-02 | 2017-10-17 | Microsoft Technology Licensing, Llc | Backlighting a fabric enclosure of a flexible cover |
US8498100B1 (en) | 2012-03-02 | 2013-07-30 | Microsoft Corporation | Flexible hinge and removable attachment |
US8947864B2 (en) | 2012-03-02 | 2015-02-03 | Microsoft Corporation | Flexible hinge and removable attachment |
US8543227B1 (en) | 2012-03-02 | 2013-09-24 | Microsoft Corporation | Sensor fusion algorithm |
USRE48963E1 (en) | 2012-03-02 | 2022-03-08 | Microsoft Technology Licensing, Llc | Connection device for computing devices |
US8935774B2 (en) | 2012-03-02 | 2015-01-13 | Microsoft Corporation | Accessory device authentication |
US8903517B2 (en) | 2012-03-02 | 2014-12-02 | Microsoft Corporation | Computer device and an apparatus having sensors configured for measuring spatial information indicative of a position of the computing devices |
US9047207B2 (en) | 2012-03-02 | 2015-06-02 | Microsoft Technology Licensing, Llc | Mobile device power state |
US9064654B2 (en) | 2012-03-02 | 2015-06-23 | Microsoft Technology Licensing, Llc | Method of manufacturing an input device |
US9075566B2 (en) | 2012-03-02 | 2015-07-07 | Microsoft Technology Licensing, Llc | Flexible hinge spine |
US9098117B2 (en) | 2012-03-02 | 2015-08-04 | Microsoft Technology Licensing, Llc | Classifying the intent of user input |
US9111703B2 (en) | 2012-03-02 | 2015-08-18 | Microsoft Technology Licensing, Llc | Sensor stack venting |
US9116550B2 (en) | 2012-03-02 | 2015-08-25 | Microsoft Technology Licensing, Llc | Device kickstand |
US8896993B2 (en) | 2012-03-02 | 2014-11-25 | Microsoft Corporation | Input device layers and nesting |
US9134807B2 (en) | 2012-03-02 | 2015-09-15 | Microsoft Technology Licensing, Llc | Pressure sensitive key normalization |
US9134808B2 (en) | 2012-03-02 | 2015-09-15 | Microsoft Technology Licensing, Llc | Device kickstand |
US9146620B2 (en) | 2012-03-02 | 2015-09-29 | Microsoft Technology Licensing, Llc | Input device assembly |
US9158383B2 (en) | 2012-03-02 | 2015-10-13 | Microsoft Technology Licensing, Llc | Force concentrator |
US9158384B2 (en) | 2012-03-02 | 2015-10-13 | Microsoft Technology Licensing, Llc | Flexible hinge protrusion attachment |
US9176901B2 (en) | 2012-03-02 | 2015-11-03 | Microsoft Technology Licensing, Llc | Flux fountain |
US9176900B2 (en) | 2012-03-02 | 2015-11-03 | Microsoft Technology Licensing, Llc | Flexible hinge and removable attachment |
US8873227B2 (en) | 2012-03-02 | 2014-10-28 | Microsoft Corporation | Flexible hinge support layer |
US8548608B2 (en) | 2012-03-02 | 2013-10-01 | Microsoft Corporation | Sensor fusion algorithm |
US8564944B2 (en) | 2012-03-02 | 2013-10-22 | Microsoft Corporation | Flux fountain |
US8854799B2 (en) | 2012-03-02 | 2014-10-07 | Microsoft Corporation | Flux fountain |
US10963087B2 (en) | 2012-03-02 | 2021-03-30 | Microsoft Technology Licensing, Llc | Pressure sensitive keys |
US8570725B2 (en) | 2012-03-02 | 2013-10-29 | Microsoft Corporation | Flexible hinge and removable attachment |
US9268373B2 (en) | 2012-03-02 | 2016-02-23 | Microsoft Technology Licensing, Llc | Flexible hinge spine |
US9275809B2 (en) | 2012-03-02 | 2016-03-01 | Microsoft Technology Licensing, Llc | Device camera angle |
US9298236B2 (en) | 2012-03-02 | 2016-03-29 | Microsoft Technology Licensing, Llc | Multi-stage power adapter configured to provide a first power level upon initial connection of the power adapter to the host device and a second power level thereafter upon notification from the host device to the power adapter |
US8610015B2 (en) | 2012-03-02 | 2013-12-17 | Microsoft Corporation | Input device securing techniques |
US9304949B2 (en) | 2012-03-02 | 2016-04-05 | Microsoft Technology Licensing, Llc | Sensing user input at display area edge |
US9304948B2 (en) | 2012-03-02 | 2016-04-05 | Microsoft Technology Licensing, Llc | Sensing user input at display area edge |
US8614666B2 (en) * | 2012-03-02 | 2013-12-24 | Microsoft Corporation | Sensing user input at display area edge |
US10013030B2 (en) | 2012-03-02 | 2018-07-03 | Microsoft Technology Licensing, Llc | Multiple position input device cover |
US8850241B2 (en) | 2012-03-02 | 2014-09-30 | Microsoft Corporation | Multi-stage power adapter configured to provide low power upon initial connection of the power adapter to the host device and high power thereafter upon notification from the host device to the power adapter |
US8830668B2 (en) | 2012-03-02 | 2014-09-09 | Microsoft Corporation | Flexible hinge and removable attachment |
US9411751B2 (en) | 2012-03-02 | 2016-08-09 | Microsoft Technology Licensing, Llc | Key formation |
US9426905B2 (en) | 2012-03-02 | 2016-08-23 | Microsoft Technology Licensing, Llc | Connection device for computing devices |
US8791382B2 (en) | 2012-03-02 | 2014-07-29 | Microsoft Corporation | Input device securing techniques |
US8646999B2 (en) | 2012-03-02 | 2014-02-11 | Microsoft Corporation | Pressure sensitive key normalization |
US9946307B2 (en) | 2012-03-02 | 2018-04-17 | Microsoft Technology Licensing, Llc | Classifying the intent of user input |
US9904327B2 (en) | 2012-03-02 | 2018-02-27 | Microsoft Technology Licensing, Llc | Flexible hinge and removable attachment |
US9460029B2 (en) | 2012-03-02 | 2016-10-04 | Microsoft Technology Licensing, Llc | Pressure sensitive keys |
US8699215B2 (en) | 2012-03-02 | 2014-04-15 | Microsoft Corporation | Flexible hinge spine |
US9465412B2 (en) | 2012-03-02 | 2016-10-11 | Microsoft Technology Licensing, Llc | Input device layers and nesting |
US9870066B2 (en) | 2012-03-02 | 2018-01-16 | Microsoft Technology Licensing, Llc | Method of manufacturing an input device |
US8719603B2 (en) | 2012-03-02 | 2014-05-06 | Microsoft Corporation | Accessory device authentication |
US8780541B2 (en) | 2012-03-02 | 2014-07-15 | Microsoft Corporation | Flexible hinge and removable attachment |
US9619071B2 (en) | 2012-03-02 | 2017-04-11 | Microsoft Technology Licensing, Llc | Computing device and an apparatus having sensors configured for measuring spatial information indicative of a position of the computing devices |
US9618977B2 (en) | 2012-03-02 | 2017-04-11 | Microsoft Technology Licensing, Llc | Input device securing techniques |
US9852855B2 (en) | 2012-03-02 | 2017-12-26 | Microsoft Technology Licensing, Llc | Pressure sensitive key normalization |
US9678542B2 (en) | 2012-03-02 | 2017-06-13 | Microsoft Technology Licensing, Llc | Multiple position input device cover |
US8780540B2 (en) | 2012-03-02 | 2014-07-15 | Microsoft Corporation | Flexible hinge and removable attachment |
US8724302B2 (en) | 2012-03-02 | 2014-05-13 | Microsoft Corporation | Flexible hinge support layer |
US9710093B2 (en) | 2012-03-02 | 2017-07-18 | Microsoft Technology Licensing, Llc | Pressure sensitive key normalization |
US9766663B2 (en) | 2012-03-02 | 2017-09-19 | Microsoft Technology Licensing, Llc | Hinge for component attachment |
US8949477B2 (en) | 2012-05-14 | 2015-02-03 | Microsoft Technology Licensing, Llc | Accessory device architecture |
US9348605B2 (en) | 2012-05-14 | 2016-05-24 | Microsoft Technology Licensing, Llc | System and method for accessory device architecture that passes human interface device (HID) data via intermediate processor |
US10678743B2 (en) | 2012-05-14 | 2020-06-09 | Microsoft Technology Licensing, Llc | System and method for accessory device architecture that passes via intermediate processor a descriptor when processing in a low power state |
US9959241B2 (en) | 2012-05-14 | 2018-05-01 | Microsoft Technology Licensing, Llc | System and method for accessory device architecture that passes via intermediate processor a descriptor when processing in a low power state |
US10031556B2 (en) | 2012-06-08 | 2018-07-24 | Microsoft Technology Licensing, Llc | User experience adaptation |
US8947353B2 (en) | 2012-06-12 | 2015-02-03 | Microsoft Corporation | Photosensor array gesture detection |
US10107994B2 (en) | 2012-06-12 | 2018-10-23 | Microsoft Technology Licensing, Llc | Wide field-of-view virtual image projector |
US9952106B2 (en) | 2012-06-13 | 2018-04-24 | Microsoft Technology Licensing, Llc | Input device sensor configuration |
US10228770B2 (en) | 2012-06-13 | 2019-03-12 | Microsoft Technology Licensing, Llc | Input device configuration having capacitive and pressure sensors |
US9684382B2 (en) | 2012-06-13 | 2017-06-20 | Microsoft Technology Licensing, Llc | Input device configuration having capacitive and pressure sensors |
US9459160B2 (en) | 2012-06-13 | 2016-10-04 | Microsoft Technology Licensing, Llc | Input device sensor configuration |
US9256089B2 (en) | 2012-06-15 | 2016-02-09 | Microsoft Technology Licensing, Llc | Object-detecting backlight unit |
US9230064B2 (en) | 2012-06-19 | 2016-01-05 | EZ as a Drink Productions, Inc. | Personal wellness device |
US10102345B2 (en) | 2012-06-19 | 2018-10-16 | Activbody, Inc. | Personal wellness management platform |
US10133849B2 (en) | 2012-06-19 | 2018-11-20 | Activbody, Inc. | Merchandizing, socializing, and/or gaming via a personal wellness device and/or a personal wellness platform |
US9824808B2 (en) | 2012-08-20 | 2017-11-21 | Microsoft Technology Licensing, Llc | Switchable magnetic lock |
US9432070B2 (en) | 2012-10-16 | 2016-08-30 | Microsoft Technology Licensing, Llc | Antenna placement |
US8991473B2 (en) | 2012-10-17 | 2015-03-31 | Microsoft Technology Licensing, LLC | Metal alloy injection molding protrusions |
US9544504B2 (en) | 2012-11-02 | 2017-01-10 | Microsoft Technology Licensing, Llc | Rapid synchronized lighting and shuttering |
US10578499B2 (en) | 2013-02-17 | 2020-03-03 | Microsoft Technology Licensing, Llc | Piezo-actuated virtual buttons for touch surfaces |
US9304549B2 (en) | 2013-03-28 | 2016-04-05 | Microsoft Technology Licensing, Llc | Hinge mechanism for rotatable component attachment |
US9229476B2 (en) | 2013-05-08 | 2016-01-05 | EZ as a Drink Productions, Inc. | Personal handheld electronic device with a touchscreen on a peripheral surface |
US9552777B2 (en) | 2013-05-10 | 2017-01-24 | Microsoft Technology Licensing, Llc | Phase control backlight |
US9262064B2 (en) * | 2013-07-09 | 2016-02-16 | EZ as a Drink Productions, Inc. | Handheld computing platform with integrated pressure sensor and associated methods of use |
US20150015476A1 (en) * | 2013-07-09 | 2015-01-15 | EZ as a Drink Productions, Inc. | Handheld computing platform with integrated pressure sensor and associated methods of use |
US9448631B2 (en) | 2013-12-31 | 2016-09-20 | Microsoft Technology Licensing, Llc | Input device haptics and pressure sensing |
US10359848B2 (en) | 2013-12-31 | 2019-07-23 | Microsoft Technology Licensing, Llc | Input device haptics and pressure sensing |
US9317072B2 (en) | 2014-01-28 | 2016-04-19 | Microsoft Technology Licensing, Llc | Hinge mechanism with preset positions |
US9759854B2 (en) | 2014-02-17 | 2017-09-12 | Microsoft Technology Licensing, Llc | Input device outer layer and backlighting |
US10120420B2 (en) | 2014-03-21 | 2018-11-06 | Microsoft Technology Licensing, Llc | Lockable display and techniques enabling use of lockable displays |
US10124246B2 (en) | 2014-04-21 | 2018-11-13 | Activbody, Inc. | Pressure sensitive peripheral devices, and associated methods of use |
US10324733B2 (en) | 2014-07-30 | 2019-06-18 | Microsoft Technology Licensing, Llc | Shutdown notifications |
US10156889B2 (en) | 2014-09-15 | 2018-12-18 | Microsoft Technology Licensing, Llc | Inductive peripheral retention device |
US9447620B2 (en) | 2014-09-30 | 2016-09-20 | Microsoft Technology Licensing, Llc | Hinge mechanism with multiple preset positions |
US9964998B2 (en) | 2014-09-30 | 2018-05-08 | Microsoft Technology Licensing, Llc | Hinge mechanism with multiple preset positions |
US10416799B2 (en) | 2015-06-03 | 2019-09-17 | Microsoft Technology Licensing, Llc | Force sensing and inadvertent input control of an input device |
US10222889B2 (en) | 2015-06-03 | 2019-03-05 | Microsoft Technology Licensing, Llc | Force inputs and cursor control |
US9752361B2 (en) | 2015-06-18 | 2017-09-05 | Microsoft Technology Licensing, Llc | Multistage hinge |
US9864415B2 (en) | 2015-06-30 | 2018-01-09 | Microsoft Technology Licensing, Llc | Multistage friction hinge |
US10606322B2 (en) | 2015-06-30 | 2020-03-31 | Microsoft Technology Licensing, Llc | Multistage friction hinge |
US20170115867A1 (en) * | 2015-10-27 | 2017-04-27 | Yahoo! Inc. | Method and system for interacting with a touch screen |
US11182068B2 (en) * | 2015-10-27 | 2021-11-23 | Verizon Patent And Licensing Inc. | Method and system for interacting with a touch screen |
US10061385B2 (en) | 2016-01-22 | 2018-08-28 | Microsoft Technology Licensing, Llc | Haptic feedback for a touch input device |
US10344797B2 (en) | 2016-04-05 | 2019-07-09 | Microsoft Technology Licensing, Llc | Hinge with multiple preset positions |
US10248201B2 (en) * | 2016-05-06 | 2019-04-02 | The Board Of Trustees Of The Leland Stanford Junior University | Wolverine: a wearable haptic interface for grasping in virtual reality |
US20170322626A1 (en) * | 2016-05-06 | 2017-11-09 | The Board Of Trustees Of The Leland Stanford Junior University | Wolverine: a wearable haptic interface for grasping in virtual reality |
US11161010B2 (en) | 2016-06-11 | 2021-11-02 | Apple Inc. | Activity and workout updates |
US10272294B2 (en) | 2016-06-11 | 2019-04-30 | Apple Inc. | Activity and workout updates |
US11148007B2 (en) | 2016-06-11 | 2021-10-19 | Apple Inc. | Activity and workout updates |
US11660503B2 (en) | 2016-06-11 | 2023-05-30 | Apple Inc. | Activity and workout updates |
US11918857B2 (en) | 2016-06-11 | 2024-03-05 | Apple Inc. | Activity and workout updates |
US10037057B2 (en) | 2016-09-22 | 2018-07-31 | Microsoft Technology Licensing, Llc | Friction hinge |
WO2018067130A1 (en) * | 2016-10-04 | 2018-04-12 | Hewlett-Packard Development Company, L.P. | Three-dimensional input device |
US10712836B2 (en) | 2016-10-04 | 2020-07-14 | Hewlett-Packard Development Company, L.P. | Three-dimensional input device |
US20180253221A1 (en) * | 2017-03-02 | 2018-09-06 | Samsung Electronics Co., Ltd. | Display device and user interface displaying method thereof |
US11231785B2 (en) * | 2017-03-02 | 2022-01-25 | Samsung Electronics Co., Ltd. | Display device and user interface displaying method thereof |
US10528144B1 (en) | 2017-08-17 | 2020-01-07 | Google Llc | Adjusting movement of a display screen to compensate for changes in speed of movement across the display screen |
US10423229B2 (en) * | 2017-08-17 | 2019-09-24 | Google Llc | Adjusting movement of a display screen to compensate for changes in speed of movement across the display screen |
US20190056786A1 (en) * | 2017-08-17 | 2019-02-21 | Google Inc. | Adjusting movement of a display screen to compensate for changes in speed of movement across the display screen |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
USRE40891E1 (en) | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom | |
US6597347B1 (en) | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom | |
US5729249A (en) | Touch sensitive input control device | |
Biggs et al. | Haptic interfaces | |
MacKenzie | Input devices and interaction techniques for advanced computing | |
US5095302A (en) | Three dimensional mouse via finger ring or cavity | |
Greenstein et al. | Input devices | |
Evans et al. | Tablet-based valuators that provide one, two, or three degrees of freedom | |
US20200310561A1 (en) | Input device for use in 2d and 3d environments | |
CN107209582A (en) | The method and apparatus of high intuitive man-machine interface | |
KR101318244B1 (en) | System and Method for Implemeting 3-Dimensional User Interface | |
KR20030024681A (en) | Three dimensional human-computer interface | |
Smith et al. | Digital foam interaction techniques for 3D modeling | |
Jacoby et al. | Gestural interaction in a virtual environment | |
Cui et al. | Mid-air interaction with optical tracking for 3D modeling | |
JP3421167B2 (en) | Input device for contact control | |
Oh et al. | FingerTouch: Touch interaction using a fingernail-mounted sensor on a head-mounted display for augmented reality | |
Zhu et al. | TapeTouch: A handheld shape-changing device for haptic display of soft objects | |
US6239785B1 (en) | Tactile computer input device | |
Evreinova et al. | From kinesthetic sense to new interaction concepts: Feasibility and constraints | |
Springer et al. | State-of-the-art virtual reality hardware for computer-aided design | |
Nitzsche et al. | Mobile haptic interaction with extended real or virtual environments | |
Bayousuf et al. | Haptics-based systems characteristics, classification, and applications | |
Sofronia et al. | Haptic devices in engineering and medicine | |
JPH04257014A (en) | Input device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
REMI | Maintenance fee reminder mailed | | |
FPAY | Fee payment | | Year of fee payment: 8 |
SULP | Surcharge for late payment | | Year of fee payment: 7 |
REMI | Maintenance fee reminder mailed | | |
LAPS | Lapse for failure to pay maintenance fees | | |