US20160162064A1 - Method for actuating a tactile interface layer - Google Patents
- Publication number
- US20160162064A1 (U.S. application Ser. No. 15/046,123)
- Authority
- US
- United States
- Prior art keywords
- region
- user
- computing device
- deformable
- location
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04105—Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04809—Textured surface identifying touch areas, e.g. overlay structure for a virtual keyboard
- This invention relates generally to tactile user interfaces, and more specifically to new and useful mountable systems and methods for selectively raising portions of a surface of the user interface of a device.
- FIG. 1 is a schematic representation of the method of the preferred embodiments
- FIG. 2 is a top view of a variation of the tactile interface layer
- FIG. 3 is a cross sectional view of a variation of the tactile interface layer
- FIGS. 4A, 4B, and 4C are cross-sectional views illustrating the operation of a deformable region of a tactile interface layer
- FIG. 5 is a cross sectional view of a variation of the tactile interface layer with a valve
- FIG. 6 is a schematic representation of a variety of gestures and exemplary interpretations as commands
- FIGS. 7A and 7B are schematic representations of a swiping gesture and the elimination of a deformed region as applied to the variation of the tactile interface layer in FIGS. 2-4;
- FIGS. 8A and 8B are schematic representations of a pinch open gesture and the creation of a deformed region as applied to the variation of the tactile interface layer in FIGS. 2-4;
- FIGS. 9A, 9B, 10A, and 10B are schematic representations of a pinch open gesture and a change of the deformable region in a first and second variation, respectively, as applied to the variation of the tactile interface layer in FIGS. 2-4;
- FIGS. 11A and 11B are schematic representations of a drag gesture and a change in location of the deformed region, as applied to the variation of the tactile interface layer in FIGS. 2-4;
- FIG. 12 is a flowchart representation of a variation of the method
- FIG. 13 is a flowchart representation of a variation of the method
- FIG. 14 is a flowchart representation of a variation of the method
- FIG. 15 is a flowchart representation of a variation of the method
- FIG. 16 is a flowchart representation of a variation of the method
- FIG. 17 is a flowchart representation of a variation of the method
- FIG. 18 is a flowchart representation of a variation of the method.
- FIG. 19 is a flowchart representation of a variation of the method.
- the method S 100 for actuating a tactile interface layer 100 of a device that defines a surface with a deformable region of the preferred embodiments includes: detecting a gesture of the user along the surface of the tactile interface layer that includes a movement of a finger of the user from a first location (1) to a second location (2) on the surface Step S 110 ; interpreting the gesture as a command for the deformable region Step S 120 ; and manipulating the deformable region of the surface based on the command Step S 130 .
- the method S 100 for actuating a tactile interface layer 100 may also include the step of receiving a user input for a particular interpretation of a gesture as a command Step S 140 .
- the step of receiving a user input for a particular interpretation of a gesture as a command Step S 140 may include receiving a user input from the user of the device, but may alternatively include receiving a user input from a person remote from the device, for example, a third party such as the manufacturer or a second user.
- the user input for a particular interpretation of a gesture as a command may be received from any other suitable user.
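As a rough illustration of the S 100 flow (detect a gesture in Step S 110, interpret it as a command in Step S 120, manipulate the deformable region in Step S 130, with user-configurable associations per Step S 140), a minimal sketch follows. The `layer` methods and the gesture/command labels are hypothetical placeholders, not the patent's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    gesture_type: str        # e.g. "swipe", "pinch_close" (illustrative labels)
    first_location: tuple    # (x, y) where the finger motion begins (Step S110)
    second_location: tuple   # (x, y) where the finger motion ends (Step S110)

# User-configurable gesture-to-command associations (Step S140); assumed defaults.
COMMAND_MAP = {"swipe": "eliminate", "pinch_close": "create"}

def actuate_tactile_layer(gesture: Gesture, layer) -> None:
    """Step S120: interpret the gesture as a command; Step S130: manipulate the region."""
    command = COMMAND_MAP.get(gesture.gesture_type)
    if command is None:
        return  # e.g. an accidental brush that maps to no command
    if command == "eliminate":
        layer.retract_region_near(gesture.first_location)   # hypothetical call
    elif command == "create":
        layer.expand_region_near(gesture.second_location)   # hypothetical call
```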
- the method S 100 is preferably applied to a tactile interface layer 100 that is to be used with an electronic device and, more preferably, in an electronic device that benefits from an adaptive user interface.
- the electronic device may include a display and may include a touch sensor.
- the electronic device may be an automotive console, a steering wheel, a desktop computer, a laptop computer, a tablet computer, a television, a radio, a desk phone, a mobile phone, a PDA, a personal navigation device, a personal media player, a camera, a watch, a remote control, a mouse, a trackpad, or a keyboard.
- the tactile interface layer 100 may, however, be used as the user interface for any suitable device that interfaces with a user in a tactile and/or visual manner.
- the tactile interface layer 100 is preferably integrated with the device; for example, in the variation wherein the tactile interface layer 100 includes a sensor 140, the tactile interface layer 100 is preferably assembled into the device and presented to the user as one unit.
- the tactile interface layer 100 may alternatively function as an accessory to a device; the user may be presented the tactile interface layer 100 and the device as two separate units wherein, when coupled to each other, the tactile interface layer 100 functions to provide tactile guidance to the user and/or to receive user inputs.
- the method S 100 may be applied to any other suitable arrangement of the tactile interface layer 100 .
- the method S 100 of the preferred embodiments is preferably applied to any suitable tactile interface layer that includes deformable regions.
- the method S 100 of the preferred embodiments may be applied to the user interface system as described in U.S. application Ser. Nos. 11/969,848, 12/319,334, and 12/497,622.
- the tactile interface layer may be applied over a display, but may alternatively be applied on to a surface without a display.
- the tactile interface layer may be applied to any suitable surface of a device that may benefit from a tactile interface.
- the tactile interface layer 100 of this variation preferably includes a layer 110 that defines a surface 115 , a substrate that at least partially defines a fluid vessel that includes a volume of fluid 112 , and a displacement device 130 coupled to the fluid vessel that manipulates the volume of fluid 112 to expand and/or contract at least a portion of the fluid vessel, thereby deforming a particular region 113 of the surface 115 .
- the substrate may also include a support region that substantially prevents inward deformation of the layer 110 (for example, inward deformation into the fluid vessel).
- the tactile interface layer 100 of this variation may also include a second layer 210 (as shown in FIGS. 10 a and 10 b ) that allows for an additional degree of deformation of the surface 115 .
- the step of manipulating the deformable region of the surface based on the command Step S 130 preferably includes manipulating the fluid within the fluid vessel.
- the displacement device 130 is preferably actuated to manipulate the fluid within the fluid vessel to deform a particular region 113 of the surface.
- the fluid vessel preferably includes a cavity 125 and the displacement device 130 preferably influences the volume of fluid 112 within the cavity 125 to expand and retract the cavity 125 .
- the fluid vessel may alternatively be a channel 138 or a combination of a channel 138 and a cavity 125 , as shown in FIG. 3 b .
- the fluid vessel may also include a second cavity 125 b in addition to a first cavity 125 a .
- the tactile interface layer of this variation may include a valve 139 that functions to direct fluid within the tactile interface layer 100 .
- the step of manipulating the fluid within the fluid vessel may include actuating the valve 139 to direct fluid within the tactile interface layer 100 .
- the user interface enhancement system 100 may include a second displacement device 130 that functions to influence the volume of fluid 112 within the second cavity 125 b to expand and retract the second cavity 125 b , thereby deforming a second particular region 113 b of the surface.
- the second cavity 125 b is preferably similar or identical to the cavity 125 , but may alternatively be any other suitable kind of cavity.
- the following examples may be described as expanding a fluid vessel that includes a cavity 125 and a channel 138, but the fluid vessel may be any other suitable combination of cavity 125 and/or channel 138.
- any other suitable type of tactile interface layer 100 may be used.
- the tactile interface layer 100 preferably functions to provide tactile guidance to a user when using a device that the tactile interface layer 100 is applied to. As shown in FIG. 4 , the surface 115 of the tactile interface layer 100 preferably remains flat until tactile guidance is to be provided to the user at the location of the particular region 113 . In the variation of the tactile interface layer 100 as described above, the displacement device 130 then preferably expands the cavity 125 (or any other suitable portion of the fluid vessel) to expand the particular region 113 outward, forming a deformation that may be felt by a user (referenced throughout this document as a “tactilely distinguishable formation”), and providing tactile guidance for the user.
- the expanded particular region 113 preferably also provides tactile feedback to the user when he or she applies force onto the particular region 113 to provide input.
- This tactile feedback may be the result of Newton's third law: whenever a first body (the user's finger) exerts a force on a second body (the surface 115), the second body exerts an equal and opposite force on the first body; in other words, a passive tactile response.
- the displacement device 130 may retract the cavity 125 to deform the particular region 113 inward.
- any other suitable method of deforming a particular region 113 of the tactile interface layer 100 may be used.
- the tactile interface layer 100 preferably includes a sensor that functions to detect the gesture of the user, for example, a capacitive sensor that functions to detect the motion of a finger of the user from the first location to the second location.
- a pressure sensor located within the fluid vessel may be used to detect changes in pressure within the fluid vessel to detect the motion of a finger of the user from the first location to the second location.
- the sensor may be a sensor included in the device to which the tactile interface layer 100 is applied; for example, the device may include a touch sensitive display onto which the tactile interface layer 100 is overlaid.
- the gesture of the user may be detected using the sensing capabilities of the touch sensitive display. However, any other suitable gesture detection may be used.
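A minimal sketch of reducing raw touch samples (from a capacitive sensor, a pressure sensor within the fluid vessel, or the device's touch sensitive display) to the first and second locations of a gesture, while filtering out motions too small or slow to be deliberate, is shown below. The distance and duration thresholds are illustrative assumptions, not values from the patent.

```python
import math

def detect_swipe(samples, min_distance=20.0, max_duration=0.8):
    """Reduce a stream of touch samples [(x, y, t), ...] to a
    (first_location, second_location) pair, or None if the motion is too
    short or too slow to count as a deliberate gesture."""
    if len(samples) < 2:
        return None
    x0, y0, t0 = samples[0]
    x1, y1, t1 = samples[-1]
    distance = math.hypot(x1 - x0, y1 - y0)
    if distance < min_distance or (t1 - t0) > max_duration:
        return None  # likely an accidental brush, not a command
    return (x0, y0), (x1, y1)
```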
- the tactile interface layer 100 preferably includes a processor that functions to interpret the detected gesture as a command.
- the processor preferably functions to discern between a gesture that is provided by the user as a command and a gesture that may be provided by the user but is not meant to be a command, for example, an accidental brush of the finger along the surface of the tactile interface layer 100.
- the processor may include a storage device that functions to store a plurality of gesture and command associations and/or user preferences for interpretations of gestures as commands.
- the processor may be any suitable type of processor and the storage device may be any suitable type of storage device, for example, a flash memory device, a hard-drive, or any other suitable type.
- the processor and/or storage device may alternatively be a processor and/or storage device included into the device that the tactile interface layer 100 is applied to. However, any other suitable arrangement of the processor and/or storage device may be used.
- a gesture may be one of a variety of movements of one or more fingers of the user across the surface 115 of the tactile interface layer 100 .
- the gesture may be detected as a swipe from a first location to a second location arranged in any suitable location along the surface 115 of the tactile interface layer 100 .
- this first variation of gesture may be detected as a swipe from a first location relative to a deformed particular region 113 to a second location relative to the deformed particular region 113 .
- Detection of a gesture relative to a deformed particular region 113 may be particularly useful in the variation of the tactile interface layer 100 that includes a plurality of deformable regions and may function to allow the interpretation of the gesture as a command for a particular deformable region that is substantially proximal to the detected gesture.
- the gesture may be detected relative to any other suitable portion of the tactile interface layer.
- the gesture may be a single finger moving from the first location to the second location on the surface 115 , as shown in FIG. 6 (Example A).
- the gesture may include more than one finger, for example, two fingers, where the first finger moves from a first location to a second location and the second finger moves from a third location to a fourth location, as shown in FIG. 6 (Examples B-E).
- the fingers of the user preferably move substantially concurrently.
- the fingers may move one after the other, or in other words, a “staggered” gesture, for example, a first finger moves and then the second finger moves or the first finger starts moving and continues moving as the second finger starts to move.
- any other suitable temporal relationship between the fingers of the user during a gesture may be used.
- Example A the finger or fingers of a user move from a first location to a second location in a “swiping” motion.
- a second variation at least two of the fingers of the user move apart from each other in a “pinch open” motion, as shown in Example B.
- a first finger moves from a first location to a second and a second finger moves from a third location to a fourth, where the second and fourth locations are farther apart from each other than the first and third.
- a third variation of the gesture may be thought of as opposite that of the second variation, where at least two of the fingers of the user move together in a “pinch close” motion, as shown in Example C.
- At least two fingers of the user may move in substantially the same direction in a “drag” motion, as shown in Example D.
- a first finger moves from a first location to a second and a second finger moves from a third location substantially adjacent to the first location to a fourth location substantially adjacent to the second location.
- the first and second fingers remain substantially equidistant from the beginning of the gesture to the end of the gesture.
- the first and second fingers also remain substantially equidistant from the beginning of the gesture to the end of the gesture.
- the first finger moves from a first location to a second location and the second finger moves from a third to a fourth location along the surface by rotating about a point that is substantially midway between the first and third locations.
- the fingers of a user rotate about a center that is located substantially midway between the initial positions of the first and second fingers of the user.
- the gesture is preferably one of the variations as described above, the gesture may be any other suitable combination of the above variations and/or any other suitable type of gesture.
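The gesture variations above can be distinguished from the start and end positions of two fingers. The sketch below is one assumed way to do so, not the patent's classifier; the pixel tolerance is an arbitrary example value, and a single-finger swipe would be handled separately.

```python
import math

def classify_two_finger_gesture(f1_start, f1_end, f2_start, f2_end, tol=10.0):
    """Crude classifier for the two-finger gesture variations described above."""
    d_start = math.dist(f1_start, f2_start)
    d_end = math.dist(f1_end, f2_end)
    v1 = (f1_end[0] - f1_start[0], f1_end[1] - f1_start[1])
    v2 = (f2_end[0] - f2_start[0], f2_end[1] - f2_start[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    if d_end > d_start + tol:
        return "pinch_open"      # fingers move apart (Example B)
    if d_end < d_start - tol:
        return "pinch_close"     # fingers move together (Example C)
    if dot > 0:
        return "drag"            # fingers move in the same direction (Example D)
    return "rotate"              # roughly constant spacing, opposing motion (Example E)
```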
- the gesture may be interpreted as one of a variety of commands for the deformable particular region 113 .
- Examples A-E show exemplary pairings between a gesture and the interpreted command. However, any other suitable type of association between gesture and command may be used.
- the command may be to un-deform (or “eliminate”) the deformed particular region 113 , as shown in FIG. 6 (Examples A and B).
- the command may be to change the shape of the deformed particular region, as shown in FIG.
- the command may be to actuate (or “create”) the deformed particular region 113 . This may be thought of as the opposite of the first variation of command.
- the command may be to change the location of a deformed particular region 113 , as shown in FIG. 6 (Example D).
- the fourth variation may alternatively be thought of as the "elimination" of the originally deformed particular region 113 at a first location and the "creation" of another deformed particular region 113 at a second location.
- the second location is preferably indicated by the gesture provided by the user.
- the location of the deformed particular region may be changed using any other suitable method.
- the command may be to change an already deformed particular region 113 , for example, to change the firmness or the height of the deformed particular region 113 , as shown in FIG. 6 (Example E).
- any other suitable type of change to the deformed particular region 113 may be used. The gesture may alternatively be one that is not in contact with the surface 115.
- the sensor that detects the gesture may be a video sensor or a distance sensor that detects the motion of the user that is removed from the surface 115 .
- the gesture may include any other suitable body part of the user, for example, a hand, an arm, and/or a foot.
- the command interpreted from the gesture along the surface 115 of the tactile interface layer is preferably one of the variations described above, but may alternatively be any suitable combination of the above variations or any other suitable type of command for the deformable region.
- the gesture may also be interpreted as a command for the device, for example, when applied to a device that is a mobile phone, music player, or any other suitable device that outputs sound, the command may include a user command to change the volume of the sound output.
- the command may include a user command to change the brightness or any other suitable property of the visual output.
- any other suitable command for the device may be used.
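One way to express the gesture-to-command associations described above (and stored, per Step S 140, with any user overrides) is a simple lookup table. The sketch below mirrors FIG. 6 Examples A-E, but the exact pairings are illustrative rather than fixed by the patent.

```python
# Illustrative association table pairing gestures with commands for the
# deformable region; Step S140 would let a user or third party override it.
GESTURE_TO_COMMAND = {
    "swipe":       "eliminate",        # Example A
    "pinch_open":  "eliminate",        # or "spread", per the third interpretation
    "pinch_close": "create",           # Example C
    "drag":        "move",             # Example D
    "rotate":      "change_stiffness", # Example E
}

def interpret(gesture_type):
    """Return the command for a recognized gesture, or None for unrecognized motion."""
    return GESTURE_TO_COMMAND.get(gesture_type)
```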
- a “swiping” gesture as shown in FIGS. 7 a and 7 b , or a “pinch open” gesture may be interpreted as an “eliminate” command to un-deform the deformed particular region 113 .
- the “swiping” motion and the “pinch open” gesture are substantially similar to motions a user may make when moving an object away or pushing out a crease; thus, it may be useful to associate such gestures with the elimination of a deformation of the deformed particular region 113.
- the “swiping” and “pinch open” gesture may involve one and two fingers of the user, respectively, but may alternatively involve two and four fingers of the user, respectively, or any other suitable number of fingers of the user or any other suitable number of fingers of multiple users.
- the location of the gesture relative to a deformed particular region 113 may be used to determine the deformed region that the user wishes to eliminate.
- the tactile interface layer 100 may detect that the finger of the user passes over a particular deformed particular region 113 and interpret the gesture as a command to eliminate that particular deformation.
- the tactile interface layer 100 may detect a command motion from the user and be prepared to eliminate a deformed particular region at a location later indicated by the user.
- the swiping or pinch open gesture may indicate to the tactile interface layer 100 that the user desires to eliminate a particular deformed region.
- the user interface 100 may actuate an operation mode that waits for a user to indicate the desired deformation to eliminate.
- the user may then indicate the desired location for the desired deformation to eliminate anywhere on the tactile interface layer 100 .
- the location may be substantially adjacent to where the user provided the gesture, but may alternatively be substantially distal from where the user provided the gesture along the surface 115 .
- the user may define their desired location using any other suitable method, for example, applying pressure to a particular location on the surface 115 .
- any other suitable method to indicate the desired deformed region to eliminate may be used.
- a “pinch close” gesture may be interpreted as a “creation” command.
- the user may “create” the button for any suitable reason, for example, to mark a location on a screen, to mark an option, to mark a file for easy reference, or to indicate that tactile guidance is desired at a particular location.
- the user may draw two fingers together to indicate the desired location of a deformed particular region of the surface 113 .
- the tactile interface layer 100 preferably detects the motion and location of the fingers; the fluid vessel and/or cavity 125 corresponding to a particular region 113 that is substantially adjacent to the location central to the fingers as the user draws them together is expanded, and the desired particular region of the surface 113 is deformed, as shown in FIG. 8 b.
- the user may alternatively draw more than two fingers together to better define a central location.
- the tactile interface layer 100 may alternatively detect a gesture from the user and be prepared to expand a cavity 125 in a location indicated by the user. For example, the motion of drawing two fingers together may indicate to the tactile interface layer 100 that the user desires expansion of a cavity 125 .
- the user interface 100 may actuate an operation mode that waits for a user indication for the desired location for a deformed region of the surface.
- the user may then indicate the desired location for the deformed region of the surface anywhere on the tactile interface layer 100 .
- the location may be substantially adjacent to where the user draws two fingers together, but may alternatively be substantially distal from where the user draws two fingers together.
- the user may indicate an arrangement of deformable regions to deform. For example, once a command to deform a particular region is interpreted, the user may indicate the desired arrangement of regions by providing a second gesture, such as to trace a shape on the surface 115 that may indicate, for example, a QWERTY keyboard configuration of deformable regions.
- the user may also indicate a desired shape of the deformed region of the surface.
- the user may trace out a desired shape along the surface and the user interface 100 may function to deform the regions substantially adjacent to the traced shape along the surface.
- the user may define their desired location using any other suitable method, for example, applying pressure to a particular location on the surface 115 .
- This variation is preferably used on the variation of the user interface system that includes a plurality of cavities 125 to provide the user with a plurality of options of the location of the deformed particular region of the surface 113 , but may alternatively be used on a tactile interface layer 100 with any other suitable number of cavities 125 .
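For the "creation" interpretation, the region to deform can be chosen as the cavity whose particular region lies nearest the point midway between the two fingers being drawn together; a sketch under that assumption follows, with all names hypothetical.

```python
import math

def select_cavity_for_creation(finger_a, finger_b, cavity_centers):
    """Pick the cavity to expand for a pinch-close 'create' command.

    `finger_a` and `finger_b` are (x, y) finger positions; `cavity_centers`
    maps a cavity id to the (x, y) center of its particular region."""
    midpoint = ((finger_a[0] + finger_b[0]) / 2.0,
                (finger_a[1] + finger_b[1]) / 2.0)
    # Choose the cavity whose particular region is closest to the midpoint.
    return min(cavity_centers,
               key=lambda cid: math.dist(cavity_centers[cid], midpoint))
```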
- a pointing stick trademarked by IBM as the TRACKPOINT and by Synaptics as the TOUCHSTYK (which are both informally known as the “nipple”).
- the user may pull two fingers in a “pinch open” gesture to indicate the desire to expand (e.g., “spread out”) a deformed particular region 113 , for example, a user may pull two fingers in opposite directions away from a deformed particular region 113 or pull a finger in one direction away from a deformed particular region 113 , indicating to the user interface system 100 that the total surface area of the deformation of the particular region 113 is to be increased, or “spread.”
- the deformable region of the tactile interface layer 100 may include a first and second degree of deformation, as shown in FIGS.
- the deformable region may require a first pressure to deform the first degree and a second pressure to deform the second degree; for example, the layer 110 may include a second portion that requires a higher pressure to deform.
- a first and second cavity 125 a and 125 b may be coupled to the deformable region such that the expansion of one of the first and second cavities 125 a and 125 b results in a portion of the deformable region deforming and the expansion of both the first and second cavities 125 a and 125 b results in the full deformable region deforming, as shown in FIGS. 10 a and 10 b .
- the degree of deformation of the particular region 113 may be decreased and substantially adjacent particular regions 113 may be expanded to produce the effect of spreading a deformation across a large surface area.
- the deformation of the particular region 113 may be maintained and substantially adjacent particular regions 113 can be expanded to substantially the same degree, providing the effect of enlarging a deformed particular region 113 .
- the “spreading” of the deformed particular region 113 may be radially equidistant from the original deformed particular region 113. More specifically, the central point of the resulting deformed particular region 113 is preferably the same as the central point of the original deformed particular region 113, as shown in FIG. 9.
- the central point of the resulting deformed particular region 113 may be different from the central point of the original deformed particular region 113 , for example, a user may pull one finger away from the deformed particular region 113 in one direction, indicating expansion of the deformed particular region 113 in the indicated direction, thus moving the central point of the deformed particular region 113 towards the indicated direction, as shown in FIG. 10 .
- the motion of two fingers pulled in opposite directions away from a deformed particular region 113 may indicate to the user interface system 100 to retract the cavity 125 and undeform the deformed particular region 113 .
- the user interface system 100 may provide any other suitable active response to the motion of two fingers pulled in opposite directions away from a deformed particular region 113 .
- a “dragging” gesture may be interpreted as a command to move the deformed region from a first location to a second location along the surface 115 .
- the user may move his or her finger(s) along the surface 115 (preferably in contact with the surface 115 , but may also be any other suitable distance away from the surface 115 ) to indicate successive particular regions 113 to deform.
- the prior particular region 113 preferably undeforms (in other words, the cavity 125 corresponding to the prior particular region 113 retracts), resulting in the user seemingly “dragging” the deformed particular region 113 along the surface 115 .
- the successive particular regions 113 are preferably substantially adjacent or continuous with each prior particular region 113 to provide an experience akin to that of dragging a single object along a surface as opposed to touching a first object on a surface and then another object on the same surface.
- the deformed region may be “pushed” by the dragging gesture.
- the dragging gesture preferably starts on one side of the deformed region and “pushes” the deformed region forward, as shown in FIGS. 11 a and 11 b .
- Subsequent particular regions 113 may be deformed forward of the original deformed region (instead of behind as described in the dragging example) to emulate the user pushing the deformed region from a first location to a second location along the surface 115 .
- the user may indicate the deformed region that is to be moved by the start of the “dragging” gesture and then indicate the desired location of the moved deformed region by the end of the “dragging” gesture (in other words, where the user lifts the fingers off the surface 115 after the gesture).
- the initial deformed particular region may be “eliminated” and a particular region at the desired location is “created” while deformable regions in between the eliminated and created deformations are not actuated.
- any other suitable actuation of deformable regions may be used.
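The "dragging" behavior described above amounts to expanding each successive particular region under the finger while retracting the prior one, so the deformation appears to travel along the surface. A sketch of that loop, using hypothetical layer calls, follows.

```python
def drag_deformation(path_points, layer):
    """Move a deformation along a dragged path.

    `path_points` is the ordered list of (x, y) finger positions; `layer.region_at`,
    `layer.expand`, and `layer.retract` are hypothetical tactile-layer calls."""
    previous_region = None
    for point in path_points:
        region = layer.region_at(point)
        if region is None or region == previous_region:
            continue
        layer.expand(region)                 # deform the next particular region
        if previous_region is not None:
            layer.retract(previous_region)   # the prior region's cavity retracts
        previous_region = region
```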
- the user may dictate interaction between expanded cavities 125 .
- the user may “drag” an object along a path and particular regions 113 are expanded along the path.
- the object and the existing deformed particular region 113 may “react” based on actions of the user.
- the deformed particular region 113 of the object and the existing deformed particular region 113 may seemingly “merge,” for example, the total surface area of the existing deformed particular region 113 may grow as if the deformed particular region 113 of the object is added into the existing deformed particular region 113 , similar to the third exemplary interpretation.
- the user may then also drag the “merged” particular region 113 to a different location.
- the existing deformed particular region 113 and the deformed particular region 113 for the object may “repel” each other, for example, the object may represent a baseball bat and the existing deformed particular region 113 may represent a ball, and the user may “hit” the ball with the baseball bat, seemingly “repelling” the two deformed particular regions.
- the user may perform a splitting motion on an existing deformed particular region 113 and the existing deformed particular region 113 may “split,” forming two distinct deformed particular regions 113 .
- Each of the resulting two distinct deformed particular regions 113 is preferably of a smaller surface area than the original existing deformed particular region 113 .
- An example of a splitting motion may be drawing two fingers apart substantially adjacent to the existing deformed particular region 113. However, any other suitable interaction between expanded cavities 125 may be implemented. While an active response to a command given by the user is preferably one of the examples described here, any other suitable active response may be used.
- a rotating gesture may be interpreted as a command to change the characteristics of a deformed particular region 113 substantially proximal to the user input.
- the command of the fifth exemplary interpretation allows for a plurality of states in between fully deformed and fully undeformed, respectively.
- the rotating gesture around a deformed particular region 113 may be interpreted as a command to increase the stiffness of the deformation. This may be particularly useful in a scenario where the command includes a command for the volume of the device and the deformed particular region 113 indicates the location of the “increase volume” button.
- the deformed particular region 113 may become progressively stiffer to the touch as the volume becomes higher and approaches its maximum, indicating to the user through tactile means where they currently are along the volume scale.
- the rate of stiffness increase may be selected by the user to be tailored to their tactile preferences and/or sensitivity.
- the height of the deformed particular region 113 may also be adjusted as the volume level changes.
- the displacement device 130 may adjust the amount of fluid that is displaced to expand the cavity 125 . The more fluid that is displaced to expand the cavity 125 , the stiffer the particular region 113 will feel to the touch.
- the user interface system 100 may also include a valve that directs the fluid displaced by the displacement device 130 .
- the valve may direct additional fluid into the cavity 125 .
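The stiffness adjustment can be modeled as the displacement device pushing more or less fluid into the cavity as the controlled setting (e.g., the volume level) changes: the more fluid displaced, the stiffer the region feels. The sketch below assumes a simple linear mapping and illustrative fluid volumes.

```python
def displaced_fluid_for_level(level, max_level, min_volume_ml=0.2, max_volume_ml=1.0):
    """Map a setting (e.g. audio volume) to the fluid volume the displacement
    device pushes into the cavity; more displaced fluid -> stiffer region."""
    level = max(0, min(level, max_level))      # clamp to the valid range
    fraction = level / max_level
    return min_volume_ml + fraction * (max_volume_ml - min_volume_ml)

# Example: a rotating gesture that raises the volume from 5 to 6 (of 10) would
# increase the displaced fluid from displaced_fluid_for_level(5, 10) to
# displaced_fluid_for_level(6, 10), making the button feel progressively firmer.
```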
- the active response may alternatively be a combination of the variations described above or any other suitable combination of gestures and commands.
- the method S 200 for responding to an implicit gesture includes: determining that a mobile computing device is held by a user in Block S 210, the mobile computing device comprising a substrate defining a fluid channel, an attachment surface, and a fluid conduit fluidly coupled to the fluid channel and passing through the attachment surface, a tactile layer defining a deformable region and a peripheral region, the peripheral region adjacent the deformable region and coupled to the attachment surface, the deformable region adjacent the peripheral region, arranged over the fluid conduit, and disconnected from the attachment surface, and a displacement device configured to displace fluid through the fluid channel to transition the deformable region from a retracted setting to an expanded setting; identifying a position of the mobile computing device in a hand of the user in Block S 220; predicting a location of a future input into the mobile computing device in Block S 230, the location proximal the deformable region; and transitioning the deformable region from the retracted setting to the expanded setting in Block S 240.
- the second method S 200 functions to predict a position of an upcoming input based on how a mobile computing device (e.g., a smartphone, a tablet, a PDA, personal music player, wearable device, watch, wristband, etc.) is held by a user and then to manipulate a dynamic tactile interface within the mobile computing device to yield a tactilely-distinguishable formation on the dynamic tactile interface proximal the predicted position of the upcoming input, a desired location of a button (i.e., input region), or shape of the dynamic tactile interface.
- the second method S 200 can manipulate one or more deformable regions of a dynamic tactile interface within a mobile computing device to dynamically form tactilely-distinguishable formations on the mobile computing device, thereby improving convenience and ease of use of the mobile computing device.
- the second method S 200 identifies that the mobile computing device is held in a portrait orientation in a user's left hand and thus transitions a deformable region over the top left quadrant (i.e., II Cartesian quadrant) of the display to define a physical “unlock” region adjacent a repositioned unlock slider rendered on the display.
- the second method S 200 thus identifies how the mobile computing device is held and manipulates the dynamic tactile layer to place the physical unlock region in a position directly and naturally accessible by the user's left thumb, thus increasing the ease with which the user may unlock the mobile computing device.
- the second method S 200 can also adjust the position of a key (e.g., graphic) rendered on the display to align with the physical unlock region. Furthermore, for the unlock region that defines an elongated ridge indicating a swipe input to unlock, the second method S 200 can modify a required input swipe direction to accommodate the user's hand position over the mobile computing device. In this example, when the mobile computing device is held in a portrait orientation in the user's left hand, the second method S 200 can set the swipe direction from right to left, whereas the second method S 200 sets the swipe direction from left to right when the mobile computing device is held in a portrait orientation in the user's right hand.
- the second method S 200 identifies that the mobile computing device is held in a portrait orientation in a user's right hand and thus transitions a pair of deformable regions on the upper right region of the side of the mobile computing device into expanded settings to define a physical “volume up” key and a physical “volume down” key.
- the second method S 200 thus identifies how the mobile computing device is held and manipulates the dynamic tactile layer to place physical volume adjustment regions in positions directly and naturally accessible by the user's right index finger, thus increasing the ease with which the user may adjust the volume output of the mobile computing device.
- the second method S 200 can also render a “+” image key and a “−” image key near the perimeter of the display proximal the physical “volume up” and “volume down” keys to indicate control functions of the corresponding physical keys to the user.
- the second method S 200 determines the orientation of the mobile computing device relative to the horizon (e.g., portrait, landscape, 37° from horizontal) and transitions deformable regions within the dynamic tactile interface between expanded and retracted settings to maintain a physical “home” button proximal a current effective bottom center of the mobile computing device.
- the second method S 200 can identify when the mobile computing device is rotated relative to the horizon and frequently update the position of the home button (e.g., a home button rendered on the display and a home button defined by a deformable region in the expanded setting), such as every five seconds or when the change in position of the mobile computing device exceeds a threshold position change while the mobile computing device is unlocked and in operation.
- the second method S 200 accesses a user application history including frequency and duration of use of native applications displayed on the home screen.
- the second method S 200 subsequently manipulates a set of deformable regions, each adjacent a displayed native application key, with a deformable region adjacent a native application key corresponding to a highest-use native application transitioned to a highest expanded position and with a deformable region adjacent a native application key corresponding to a lowest-use native application transitioned to a lowest expanded position or retained in the retracted position.
- the second method S 200 can adjust the height of various deformable regions adjacent native application keys displayed within a home screen on the mobile computing device according to a likelihood that the user will select each native application based on application selection history.
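Scaling the expanded height of each application key's deformable region by relative usage could look like the following sketch; the maximum height and the linear scaling are assumptions for illustration only.

```python
def region_heights_from_usage(usage_counts, max_height_mm=1.5):
    """Scale each native application key's deformable-region height by relative use,
    so the most-used application gets the tallest, most tactilely prominent key.

    `usage_counts` maps an application name to a selection count."""
    if not usage_counts:
        return {}
    top = max(usage_counts.values())
    return {app: ((count / top) * max_height_mm if top else 0.0)
            for app, count in usage_counts.items()}

# Example: {"mail": 120, "camera": 30} -> mail key at 1.5 mm, camera key at 0.375 mm.
```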
- Block S 210 of the second method S 200 recites determining that the mobile computing device is held by the user. Furthermore, Block S 220 of the second method S 200 recites identifying a position of the mobile computing device in a hand of the user.
- Block S 210 and Block S 220 function to interface with one or more sensors on the mobile computing device to detect that the mobile computing device is being held and how the mobile computing device is being held.
- Blocks S 210 and/or S 220 can interface with one or more capacitive, resistive, optical, or other touch sensors arranged about the mobile computing device, such as on and around the display, the side of the mobile computing device, and/or a back surface of the mobile computing device, to detect a finger or hand hovering over or in contact with the mobile computing device.
- Blocks S 210 and/or S 220 can additionally or alternatively interface with one or more heat sensors within the mobile computing device to detect a local temperature change across a surface of the device and to correlate the temperature change with a hand holding the mobile computing device and/or interface with an accelerometer and/or a gyroscope to detect that the mobile computing device is being held, moved, and/or manipulated.
- Block S 210 can characterize accelerometer and/or gyroscope outputs as the mobile computing device being in a user's pocket while the user is walking, resting on a table or horizontal surface, or in a user's hand, etc.
- Blocks S 210 and S 220 can interface with a heart rate sensor within the wearable device to detect the user's current heart rate, and the second method S 200 can set a position of one or more deformable regions on the wearable device based on the user's current heart rate.
- Blocks S 210 and S 220 can similarly detect the user's current breathing rate or other vital sign, and the second method S 200 can set a position of one or more deformable regions on the wearable device accordingly.
- Blocks S 210 and S 220 can additionally or alternatively interface with one or more bio-sensors integrated into the wearable device (or other computing device) to identify a user who is holding the wearable device based on bio-signature output from the bio-sensor, and Blocks S 210 and S 220 can thus adjust a position of one or more deformable regions (e.g., a location, a height, a firmness, and/or a unique gesture definition related to a deformable region) according to a preference of the identified user.
- Block S 220 can thus compare sensed touch areas to a touch area model to characterize a touch sensor output as a left hand or a right hand holding the mobile computing device in a portrait, landscape, or other orientation.
- Block S 220 can similarly compare sensed heat areas to a heat area model to characterize a temperature sensor output as a left or right hand holding the mobile computing device in a portrait, landscape, or other orientation.
- Block S 220 can also determine how the mobile computing device is held, such as by one or both hands of the user, based on how text or other inputs are entered into the mobile computing device, and Block S 220 can further verify such characterization of user inputs substantially in real-time based on accelerometer and/or gyroscope data collected by sensors in the mobile computing device.
- Blocks S 210 and S 220 can additionally or alternatively implement machine vision and/or machine learning to identify a face, body, clothing feature, etc. in a field of view of a (forward-facing) camera within the mobile computing device and thus determine that the mobile computing device is held and how the mobile computing device is held based on the identified face, body, clothing feature, etc.
- Block S 210 can implement facial recognition to determine that the mobile computing device is currently held, and Block S 220 can implement face tracking to predict which hand the user is using to hold the mobile computing device.
- Blocks S 210 and S 220 can additionally or alternatively interface with a rear-facing camera within the mobile computing device to identify a hand (e.g., left or right) holding the mobile computing device.
- Blocks S 210 and S 220 can similarly identify a hand shape or hand motion (i.e., gesture) in a field of view of a camera within the mobile computing device (and not touching the mobile computing device), and subsequent Blocks of the second method S 200 can set a deformable region position according to the identified hand shape or gesture.
- Blocks S 210 and S 220 can additionally or alternatively determine if the mobile computing device is worn, in use, in a particular location, or in an “ON” or “unlocked” state.
- the second method S 200 can selectively expand and retract one or more side, back, or on-screen deformable regions based on location data of the mobile computing device determined in Blocks S 210 and S 220 through a location (e.g., GPS) sensor within the mobile computing device.
- the second method S 200 can thus selectively control the position of various deformable regions based on whether the user is at home, in his car, what app is running on the mobile computing device, etc.
- Block S 210 and Block S 220 can function in any other way to determine that the mobile computing device is being held and to characterize how the mobile computing device is held.
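A crude stand-in for Blocks S 210 and S 220 might combine edge-contact counts from touch sensors with the accelerometer's gravity vector to decide whether the device is held, by which hand, and in which orientation; a real implementation would compare sensed touch areas against a touch area model as described above. All thresholds and heuristics below are assumptions.

```python
def classify_grip(left_edge_contacts, right_edge_contacts, accel_xyz):
    """Rough Block S210/S220 sketch: is the device held, by which hand, and how oriented?

    `left_edge_contacts`/`right_edge_contacts` are counts of touch contacts along each
    edge; `accel_xyz` is the accelerometer reading (gravity vector) in g."""
    ax, ay, az = accel_xyz
    held = (left_edge_contacts + right_edge_contacts) > 0
    if not held:
        return {"held": False}
    orientation = "portrait" if abs(ay) >= abs(ax) else "landscape"
    # Heuristic: a left-hand grip tends to place the palm (more contacts) along
    # the left edge and the fingertips along the right edge, and vice versa.
    hand = "left" if left_edge_contacts >= right_edge_contacts else "right"
    return {"held": True, "hand": hand, "orientation": orientation}
```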
- Block S 230 of the second method S 200 recites predicting a location of a future input into the mobile computing device, the location proximal the deformable region.
- Block S 230 functions to predict a location of an upcoming input based on how the mobile computing device is held (e.g., orientation of the mobile computing device, which hand(s) the user is using to hold the mobile computing device).
- Block S 230 can predict an upcoming input to include an “unlock” gesture.
- Block S 230 can also predict a convenient or preferred unlock input to be from Quadrant I of the display (current top-right quadrant) to Quadrant II of the display (current top-left quadrant) based on the holding hand and orientation determined in Blocks S 210 and S 220.
- Block S 230 can thus predict the upcoming input and a preferred location for the upcoming input.
- Block S 230 can predict an upcoming input to include either of a “volume up” gesture and a “volume down” gesture.
- Block S 230 can also predict that convenient or preferred “volume up” and “volume down” input regions lie off the display on an upper left lateral side of the mobile computing device such that the user's right index finger falls substantially naturally on the “volume up” and “volume down” input regions.
- Block S 230 can thus predict the upcoming input and a preferred or convenient location for the upcoming input based on the holding position of the mobile computing device determined in Blocks S 210 and S 220 .
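Block S 230's prediction can then reduce to a lookup from the holding hand and orientation to a preferred input region, as in this illustrative table for the "unlock" example (quadrant names follow the Cartesian convention used above; the mapping itself is an assumption, not the patent's).

```python
# Illustrative lookup from Block S210/S220 output to a preferred on-screen
# region for the upcoming "unlock" input (Quadrant II = top-left).
PREFERRED_UNLOCK_REGION = {
    ("left", "portrait"):   "quadrant_II",   # top-left, under the left thumb
    ("right", "portrait"):  "quadrant_I",    # top-right, under the right thumb
    ("left", "landscape"):  "left_edge",
    ("right", "landscape"): "right_edge",
}

def predict_unlock_region(hand, orientation):
    """Block S230 sketch: fall back to the bottom center when the grip is unknown."""
    return PREFERRED_UNLOCK_REGION.get((hand, orientation), "bottom_center")
```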
- Block S 240 of the second method S 200 recites transitioning a deformable region from the retracted setting to the expanded setting.
- Block S 240 functions to control the displacement device to displace fluid through the fluid channel to transition the deformable region from the retracted setting to the expanded setting.
- Block S 240 can control one or more valves and/or one or more displacement devices within the mobile computing device to selectively expand and/or retract a particular subset of deformable regions, as described above or as described in U.S. patent application Ser. No. 12/319,334, filed on 5 Jan. 2009, which is incorporated in its entirety by this reference.
- the second method S 200 can function to predict a future input and/or a preferred or convenient location for a future input and manipulate a deformable region on the mobile computing device to define a tangible button accordingly.
- the second method S 200 can manipulate one or more deformable regions over a display within the mobile computing device (i.e., an on-screen physical button) and/or one or more deformable regions remote from the display (i.e., an off-screen physical button).
- the second method can therefore control one or more valves, displacement devices, etc. to form a physical volume up button, volume down button, lock button, unlock button, ringer or vibrator state button, home button, camera shutter button, and/or application selection button, etc. on the mobile computing device.
- the second method S 200 can further manage outputs from a touch sensor to handle user inputs into selectively formed buttons, and the second method can also interface with a display driver to render visual input region identifiers adjacent (i.e., under) on-screen buttons and/or to render visual input identifiers near or pointing to off-screen buttons.
- the second method S 200 can detect a first gesture, selectively adjust the position of a particular deformable region accordingly, detect a subsequent gesture, assign a particular output type to the particular deformable region, and then generate an output of the particular output type when the particular deformable region is subsequently selected by the user.
- the second method S 200 can function in any other way to estimate how the mobile computing device is held, to predict a type and/or location of a future input, and to manipulate a vertical position of one or more deformable regions accordingly to the predicted type and/or location of the future input.
- An example of method S 200 includes detecting an ongoing phone call on a mobile phone with a touchscreen or other sound output through a speaker of the mobile phone.
- Method S 200 can further detect the orientation of the phone by detecting that the touchscreen is proximal to and/or contacting an ear of the user, such as when the user holds the mobile phone up to the ear during the ongoing phone call.
- method S 200 can select and expand a deformable region corresponding to the ear and the speaker such that the deformable region forms an earpiece.
- method S 200 can expand the earpiece to conform to the ear and focus sound output from the speaker toward the ear for improved hearing.
- the method S 300 registers interaction with a dynamic tactile interface.
- the dynamic tactile interface includes a tactile layer and a substrate, the tactile layer defining a tactile surface, a deformable region, and a peripheral region adjacent the deformable region and coupled to the substrate opposite the tactile surface.
- the method S 300 includes detecting an orientation of the device in Block S 310 ; predicting a location of an upcoming input related to a native application executing on the device in Block S 320 ; selecting a particular deformable region from a set of deformable regions, the particular deformable region substantially coincident the input location in Block S 330 ; selectively transitioning the particular deformable region from a retracted setting into an expanded setting, the deformable region substantially flush with the peripheral region in the retracted setting and tactilely distinguishable from the peripheral region in the expanded setting in Block S 340 ; and detecting an input, corresponding to the anticipated input, on the particular deformable region in Block S 350 .
- one variation of method S 300 includes receiving a notification event at the device in Block S 315; detecting a particular location of an input object contacting a surface of the device prior to an upcoming input in Block S 320; in response to the notification event, rendering a virtual communication on a region of a display of the device adjacent the particular location, the virtual communication corresponding to the notification event in Block S 325; selecting a particular deformable region from a set of deformable regions, the particular deformable region corresponding to the anticipated output and adjacent the particular location in Block S 340; selectively transitioning the particular deformable region from a retracted setting substantially flush with the peripheral region to an expanded setting tactilely distinguishable from the peripheral region in Block S 350; and detecting an input to the particular deformable region in Block S 360.
- method S 300 functions to register an implicit event associated with an input, define a command for the dynamic tactile interface in response to the implicit event, and, in response to the command, modify the dynamic tactile interface according to an anticipated future input to the dynamic tactile interface.
- method S 300 functions to correlate spatial orientation of the device and a native application executing on the device with a configuration of deformable regions of the dynamic tactile interface.
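- As an illustration of how Blocks S 310 through S 350 could fit together in software, the following Python sketch strings hypothetical sensor, application, and displacement-device interfaces into the detect-predict-select-expand-detect flow; all helper names are assumptions, not interfaces defined by method S 300:

```python
# Sketch of the Block S310-S350 flow; helper objects (sensors, native_app,
# displacement_device) are hypothetical stand-ins for device-specific drivers.
from dataclasses import dataclass

@dataclass
class DeformableRegion:
    region_id: int
    x: float          # center of the region on the tactile surface
    y: float
    expanded: bool = False

def register_interaction(sensors, displacement_device, regions, native_app):
    orientation = sensors.read_orientation()                   # Block S310
    x, y = native_app.predict_input_location(orientation)      # Block S320
    target = min(regions,                                      # Block S330
                 key=lambda r: (r.x - x) ** 2 + (r.y - y) ** 2)
    displacement_device.expand(target.region_id)               # Block S340
    target.expanded = True
    touch = sensors.wait_for_touch()                           # Block S350
    return touch if target_contains(target, touch) else None

def target_contains(region, touch, radius=5.0):
    # Treat the region as a circular button of assumed radius (in mm).
    return (region.x - touch.x) ** 2 + (region.y - touch.y) ** 2 <= radius ** 2
```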
- the dynamic tactile interface can further include a display coupled to the substrate opposite the tactile layer and displaying an image of a key substantially aligned with the deformable region and/or a touch sensor coupled to the substrate and outputting a signal corresponding to an input on a tactile surface of the tactile layer adjacent the deformable region.
- the dynamic tactile interface can also include a housing that transiently engages a (mobile) computing device and transiently retains the substrate over a digital display of the (mobile) computing device.
- the dynamic tactile interface can be implemented within or in conjunction with a computing device to provide tactile guidance to a user entering input selections through a touchscreen or other illuminated surface of the computing device.
- the dynamic tactile interface defines one or more deformable regions of a tactile layer that can be selectively expanded and retracted to intermittently provide tactile guidance to a user interacting with the computing device.
- the dynamic tactile interface is integrated into or applied over a touchscreen of a mobile computing device, such as a smartphone or a tablet.
- the dynamic tactile interface can include a set of round or rectangular deformable regions, wherein each deformable region is substantially aligned with a virtual key of a virtual keyboard rendered on a display integrated into the mobile computing device, and wherein each deformable region in the set mimics a physical hard key when in an expanded setting.
- the dynamic tactile interface can retract the set of deformable regions to yield a substantially uniform (e.g., flush) tactile surface yielding reduced optical distortion of an image rendered on the display.
- the dynamic tactile interface can include an elongated deformable region aligned with a virtual ‘swipe-to-unlock’ input region rendered on the display such that, when in the expanded setting, the elongated deformable region provides tactile guidance for a user entering an unlock gesture into the mobile computing device.
- the dynamic tactile interface can transition the elongated deformable region back to the retracted setting to yield a uniform surface over the display.
- the dynamic tactile interface can alternatively embody an aftermarket device that adds tactile functionality to an existing computing device.
- the dynamic tactile interface can include a housing that transiently engages an existing (mobile) computing device and transiently retains the substrate over a digital display of the computing device.
- the displacement device of the dynamic tactile interface can thus be manually or automatically actuated to transition the deformable region(s) of the tactile layer between expanded and retracted settings.
- Block S 310 detects an orientation of the device.
- Block S 310 can interface with a sensor incorporated into the device (e.g., a touch sensor, an optical sensor, an accelerometer, Global Positioning System, etc.) to detect the orientation of the device relative an external surface or body.
- Block S 310 can interface with an accelerometer built into the device to detect orientation of a mobile phone relative to a horizontal surface.
- the mobile phone can be oriented in a portrait orientation, such that a minor axis of the device can be substantially parallel to the horizontal surface.
- the device can be oriented in a landscape orientation, such that the major axis of the device can be substantially parallel the horizontal surface.
- Block S 310 can detect the device in any other orientation with any other sensor suitable for detecting orientation of the device.
- Block S 310 can detect, with an optical sensor, a display of the device resting on a horizontal surface.
- Block S 310 can further detect the position of the device relative an external surface and/or object.
- Block S 310 can detect an input object (e.g., a finger) resting on a surface of the device.
- Block S 310 can detect the input object with a sensor, such as a capacitive, resistive, and/or optical sensor.
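- One way Block S 310 could classify portrait versus landscape from an accelerometer sample is to compare the gravity vector against the device's major and minor axes, as in the following sketch; the axis convention and tilt threshold are assumptions, not values from the specification:

```python
import math

def classify_orientation(ax, ay, az, tilt_threshold_deg=30.0):
    """Classify orientation from one accelerometer sample (in g).

    Assumes x runs along the device's minor axis and y along its major axis,
    a common but not universal convention.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az) or 1.0
    # Angle of each body axis relative to horizontal (0 deg = lying flat).
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, ay / g))))   # major axis
    roll = math.degrees(math.asin(max(-1.0, min(1.0, ax / g))))    # minor axis
    if abs(pitch) < tilt_threshold_deg and abs(roll) < tilt_threshold_deg:
        return "face_up_or_down"      # display roughly parallel to the table
    return "portrait" if abs(pitch) > abs(roll) else "landscape"
```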
- Block S 320 predicts a location of an upcoming input related to a native application executing on the device.
- Block S 320 can predict a particular input at a particular location in response to execution of the native application.
- Block S 320 can predict a contact with a surface of the device at the particular location.
- Block S 320 can identify a future input defined by a contact by an input object (e.g., a finger) on a portion of the touchscreen of the computing device corresponding to a virtual image rendered by the touchscreen.
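- Block S 320 could, for example, map the foreground native application and its rendered virtual control to an expected touch location; the application names and normalized coordinates in this sketch are purely illustrative assumptions:

```python
# Illustrative lookup from (application, UI state) to an anticipated input,
# expressed in normalized touchscreen coordinates; every entry is an assumption.
EXPECTED_INPUTS = {
    ("camera", "viewfinder"): ("shutter_button", (0.5, 0.9)),
    ("phone", "incoming_call"): ("answer_icon", (0.5, 0.8)),
    ("lock_screen", "locked"): ("unlock_slider", (0.5, 0.85)),
}

def predict_input_location(app_name, ui_state):
    """Return (label, (x, y)) of the anticipated input, or None if unknown."""
    return EXPECTED_INPUTS.get((app_name, ui_state))
```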
- Block S 330 selects a particular deformable region from a set of deformable regions, the particular deformable region corresponding to the anticipated input and adjacent the input location.
- Block S 330 can select the particular deformable region adjacent or arranged over the input location.
- Block S 330 can select a particular deformable region with a shape substantially corresponding to the anticipated input. For example, if the anticipated input includes a slide gesture across the tactile surface, Block S 330 can select a particular deformable region that forms an elongated and elevated button, such that the user can slide a finger across the expanded deformable region to enter the gesture into the device.
- Block S 330 can select a set of particular deformable regions from the set of deformable regions, such that the set of particular deformable regions cooperatively correspond to the anticipated input.
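- The selection of Block S 330 could weight both proximity to the anticipated input location and the shape of each deformable region, so that a slide gesture maps to an elongated region and a tap maps to a round one; the scoring below is an assumed heuristic, not a rule from the specification:

```python
def select_region(regions, input_xy, gesture_kind):
    """Pick the deformable region best matching the anticipated input.

    `regions` is an iterable of objects with x, y and shape ('round' or
    'elongated') attributes; the weighting is a hypothetical heuristic.
    """
    preferred_shape = "elongated" if gesture_kind == "slide" else "round"

    def score(region):
        dist2 = (region.x - input_xy[0]) ** 2 + (region.y - input_xy[1]) ** 2
        shape_penalty = 0.0 if region.shape == preferred_shape else 10.0
        return dist2 + shape_penalty

    return min(regions, key=score)
```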
- Block S 340 selectively transitions the particular deformable region from a retracted setting substantially flush with the peripheral region to an expanded setting tactilely distinguishable from the peripheral region.
- Block S 340 can transition the particular deformable region(s) by displacing fluid from a fluid vessel into a cavity arranged under the deformable region.
- the dynamic tactile interface can include a substrate and a tactile layer, the tactile layer defining a deformable region and a peripheral region adjacent the deformable region and coupled to the substrate opposite the tactile surface, the substrate defining a fluid channel and cooperating with the deformable region to define a cavity filled with fluid.
- a displacement device fluidly coupled to the fluid channel can displace fluid between the cavity and a reservoir fluidly coupled to the displacement device, thereby transitioning the deformable region between an expanded setting substantially elevated above the peripheral region and a retracted setting substantially flush with the peripheral region.
- the tactile layer can define one or more deformable regions operable between the expanded and retracted settings to intermittently define tactilely distinguishable formations over a surface, such as over a touch-sensitive digital display (e.g., a touchscreen), such as described in U.S. patent application Ser. No. 13/414,589.
- the displacement device can transition the deformable region into the expanded setting by displacing fluid from the fluid vessel into the cavity.
- Method S 300 can additionally or alternatively transition the particular deformable region(s) using electromechanical actuation.
- method S 300 can be implemented with a “snap dome” deformable region.
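- In the fluid-displacement variation, Block S 340 amounts to driving the displacement device until the cavity reaches a target fluid volume (and hence button height); the following sketch assumes a pump driver with simple start/stop control, and its object names, volumes, and timeout are illustrative assumptions:

```python
import time

def expand_region(pump, cavity, target_volume_ml, timeout_s=3.0):
    """Displace fluid from the reservoir into the cavity until the target
    volume is reached. `pump` and `cavity` are hypothetical driver objects."""
    pump.set_direction("reservoir_to_cavity")
    pump.start()
    deadline = time.monotonic() + timeout_s
    try:
        while cavity.volume_ml() < target_volume_ml:
            if time.monotonic() > deadline:
                raise TimeoutError("cavity did not reach target volume")
            time.sleep(0.01)
    finally:
        pump.stop()

def retract_region(pump, cavity, timeout_s=3.0):
    # Reverse the displacement to return the deformable region substantially
    # flush with the peripheral region.
    pump.set_direction("cavity_to_reservoir")
    pump.start()
    deadline = time.monotonic() + timeout_s
    try:
        while cavity.volume_ml() > 0.05:   # assumed "flush" threshold
            if time.monotonic() > deadline:
                raise TimeoutError("cavity did not fully drain")
            time.sleep(0.01)
    finally:
        pump.stop()
```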
- Block S 350 detects an input, corresponding to the anticipated input, to the particular deformable region.
- Block S 350 detects an input at a sensor, such as a touch sensor integrated in a touchscreen display of the mobile computing device (e.g., a capacitive, resistive, or optical touch sensor).
- Block S 350 can detect the input at a pressure sensor by detecting a change in pressure of the fluid in the cavity. An increase in pressure of the fluid in the cavity corresponds to depression of the deformable region into the cavity and, thus, an input to the dynamic tactile interface.
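- Detecting an input from cavity pressure, as Block S 350 describes, reduces to watching for a pressure rise above the expanded-setting baseline; the threshold, sample count, and polling limit in this sketch are assumptions:

```python
def detect_press(read_pressure_kpa, baseline_kpa, threshold_kpa=1.5,
                 samples=3, max_polls=1000):
    """Return True once cavity pressure exceeds the expanded-setting baseline
    by `threshold_kpa` for `samples` consecutive readings (a simple debounce),
    or False if no press is seen within `max_polls` readings.
    `read_pressure_kpa` is a hypothetical callable polling the pressure sensor.
    """
    consecutive = 0
    for _ in range(max_polls):
        if read_pressure_kpa() - baseline_kpa > threshold_kpa:
            consecutive += 1
            if consecutive >= samples:
                return True
        else:
            consecutive = 0
    return False
```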
- method S 300 functions to register interaction with the dynamic tactile interface by detecting an orientation of the device in Block S 310 ; identifying an anticipated input corresponding to a native application currently executing on the device, the anticipated input associated with an input location of the device in Block S 320 ; selecting a particular deformable region from a set of deformable regions, the particular deformable region corresponding to the anticipated input and adjacent the input location in Block S 330 ; selectively transitioning the particular deformable region from a retracted setting substantially flush with the peripheral region to an expanded setting tactilely distinguishable from the peripheral region in Block S 340 ; and detecting an input, corresponding to the anticipated input, to the particular deformable region in Block S 350 .
- method S 300 includes detecting a mobile phone held by a user in a landscape orientation in Block S 310 .
- Block S 310 can detect the mobile phone held by two hands of the user, the mobile phone situated between a thumb and an index finger of each hand as shown in FIG. 16 .
- In Block S 320 , method S 300 can detect a native camera application executing on the phone and anticipate a future input corresponding to selection of a shutter button to save an image captured by a lens and rendered by the native camera application on a display of the mobile phone.
- Block S 320 further detects an anticipated input location of the future input corresponding to the location of one of the index fingers.
- Block S 330 can select the deformable region at a location corresponding to the anticipated input location, and Block S 340 can expand the deformable region.
- Blocks S 330 and S 340 can function to form a tactilely distinguishable shutter button substantially underneath the index finger that is resting on a surface of (and holding) the mobile phone.
- Block S 350 can detect depression of the tactilely distinguishable shutter button and trigger image capture with the camera accordingly.
- method S 300 can include detecting the orientation of the mobile phone (e.g., in a portrait orientation) in Block S 310 .
- Block S 320 can detect a camera application executing on the mobile phone, the camera application capturing an image detected by a forward-facing camera built into a face of the mobile phone proximal the display.
- Block S 320 can anticipate an input, such as selection of a virtual shutter button in order to capture the image with the forward-facing camera (i.e., a “selfie”) as shown in FIG. 17 .
- the input location can correspond to the virtual shutter button rendered by the display.
- the virtual shutter button can be located at a center of the display, proximal an edge of the display.
- the input location can correspond to any location on any surface of the mobile device.
- the input location can be centered on the display corresponding to an ergonomic location for contact by a finger (e.g., a thumb).
- the input location can also be arranged adjacent a finger holding the mobile phone and contacting a surface outside the display (e.g., an edge of the phone).
- Block S 330 can select the particular deformable region corresponding to the ergonomic location and Block S 340 can expand the deformable region into a tactilely distinguishable dome.
- Blocks S 330 and S 340 function to deploy a physical shutter button and Block S 350 can detect depression of the physical shutter button, which can trigger the camera application to capture the image detected by the forward-facing camera.
- method S 300 can include detecting the orientation of the mobile computing device with an accelerometer or other orientation-detecting sensor.
- Block S 310 of method S 300 can detect the minor axis of the mobile computing device substantially parallel a horizontal plane, thereby defining a portrait orientation.
- Block S 310 of method S 300 can also detect the major axis of the mobile phone substantially parallel a horizontal plane, thereby defining a landscape orientation as shown in FIG. 18 .
- Block S 320 can identify an input to a key of a virtual keyboard as an anticipated input to a native application that renders the virtual keyboard on the touchscreen of the mobile computing device.
- Block S 320 can predict the orientation of the virtual keyboard in response to the orientation of the mobile computing device detected in Block S 310 .
- Block S 320 can identify an anticipated input to a portrait keyboard in response to detection of the mobile computing device in the portrait orientation.
- Block S 320 can identify an anticipated input to a landscape keyboard in response to detection of the mobile computing device in the landscape orientation.
- Block S 330 can select a set of particular deformable regions corresponding to (e.g., arranged over) each key of the virtual keyboard rendered by the touchscreen.
- Block S 340 can selectively transition the set of particular deformable regions to an expanded setting, thereby rendering a physical keyboard of deformable regions in an orientation corresponding to the orientation of the device.
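- A sketch of the keyboard case: choose the key layout from the detected orientation and expand only the deformable regions aligned with that layout, retracting the others; the region identifiers here are hypothetical:

```python
PORTRAIT_KEY_REGIONS = {"p_q", "p_w", "p_e"}      # hypothetical region ids
LANDSCAPE_KEY_REGIONS = {"l_q", "l_w", "l_e"}     # hypothetical region ids

def deploy_keyboard(orientation, displacement_device):
    """Expand the deformable regions aligned with the rendered virtual keyboard
    and retract the regions belonging to the other layout."""
    portrait = orientation == "portrait"
    active = PORTRAIT_KEY_REGIONS if portrait else LANDSCAPE_KEY_REGIONS
    inactive = LANDSCAPE_KEY_REGIONS if portrait else PORTRAIT_KEY_REGIONS
    for region_id in inactive:
        displacement_device.retract(region_id)
    for region_id in active:
        displacement_device.expand(region_id)
```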
- method S 300 can detect a music application executing on the mobile computing device and expand a deformable region corresponding to (e.g., adjacent, coincident) a volume control (e.g., a volume slider) in anticipation of an input to modify a volume output by the device and/or a native application executing thereon.
- Method S 300 can detect an input object proximal a surface of the mobile computing device. For example, method S 300 can detect a finger resting on a surface opposite the touchscreen (e.g., a back surface of the mobile computing device). Method S 300 can identify the anticipated input that changes the volume output as a slide gesture across the tactile interface.
- Method S 300 can select a particular deformable region or set of deformable regions that define a substantially elongated and tactilely distinguishable button on which the user can enter the slide gesture and that are located substantially coincident the input object, such as adjacent a finger resting on a back surface of the mobile computing device opposite a touchscreen.
- method S 300 can detect an input object proximal a surface of the device, and, upon detection of the input object contacting the device, method S 300 can expand the particular deformable region coincident the input object.
- Method S 300 can identify an anticipated input corresponding to a command to wake a “sleeping” device (e.g., a device in a low energy mode). For example, method S 300 can anticipate depression of a wake button on the “sleeping” device.
- the “sleeping” device can be powered on (e.g., consuming energy from a battery and executing programs) but a touchscreen of the device can be disabled until the command to wake the “sleeping” device enables the touchscreen.
- Method S 300 can detect the input object proximal or coincident a surface of the device. For example, method S 300 can detect a hand or finger resting on the device as would occur if one were to hold the device in the hand. Accordingly, method S 300 can select the particular deformable region coincident or adjacent the input object and selectively expand the particular deformable region. Method S 300 can detect depression of the particular deformable region and interpret depression of the particular deformable region as a command to wake the “sleeping” device accordingly.
- method S 300 includes detecting an incoming message to a native messaging application executing on the computing device. In response to the incoming message, method S 300 identifies an anticipated output from the computing device and the native message application corresponding to a notification indicating receipt of the incoming message. For example, method S 300 can anticipate an icon rendered by the touchscreen in response to receipt of the incoming message. The icon can include an abbreviated version of the message. Method S 300 can further anticipate an input corresponding to the icon, such as a slide gesture substantially over the icon. Method S 300 can unlock a lock screen and open the message in response to detection of the slide gesture into the device.
- Method S 300 can further select a particular deformable region corresponding to the icon (e.g., of substantially the same shape as the icon) and selectively expand the deformable region to an expanded setting in anticipation of the slide input. Method S 300 can also detect the slide input, which can be applied to the deformable region and, thus, the icon.
- method S 300 can detect an incoming phone call and, thus, render a notification on the display to notify the user of the incoming phone call.
- method S 300 can render a virtual icon on a touchscreen of the device to prompt the user to answer the phone call.
- method S 300 can selectively expand a particular deformable region arranged over the virtual icon.
- method S 300 can select and expand a particular deformable region corresponding to an anticipated input location, such as a surface of the device where an input object (e.g., the user's finger) is in contact with the device prior and up to the time of the incoming phone call.
- the method can raise a particular deformable region adjacent a surface of the device that the user is already touching, and the user can answer the phone call by depressing the particular deformable region thus raised under or adjacent the user's finger.
- method S 300 can detect an external surface, such as a surface on which the device rests, and selectively deform the particular deformable region(s) opposite the external surface.
- a mobile phone can rest on a surface of a table with the touchscreen of the mobile phone contacting the surface of the table.
- Method S 300 can detect the surface of the table proximal the touchscreen.
- method S 300 can identify a notification notifying the user of the phone call, a location of the notification corresponding to a surface of the mobile phone opposite the external surface (e.g., the back of the phone), and an anticipated input corresponding to answering the incoming phone call.
- method S 300 can select the particular deformable region corresponding to the location of the notification opposite the external surface (e.g., the back of the phone) and transition the deformable region to an expanded setting, thereby indicating the incoming phone call and providing a tactile feature on which a user can apply the anticipated input.
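- For the face-down incoming-call case, the selection step only has to restrict candidate regions to the surface opposite the one the device rests on; a sketch under assumed surface labels and region attributes:

```python
def select_answer_region(regions, resting_surface, notification_xy):
    """Pick a deformable region on the surface opposite the one the phone
    rests on. `regions` have `surface` ('front', 'back', 'edge'), `x`, and `y`
    attributes; the surface labels and opposite-surface mapping are assumptions."""
    opposite = {"front": "back", "back": "front"}.get(resting_surface)
    candidates = [r for r in regions if r.surface == opposite]
    if not candidates:
        return None
    return min(candidates, key=lambda r: (r.x - notification_xy[0]) ** 2 +
                                         (r.y - notification_xy[1]) ** 2)
```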
- Another example of the variation includes expanding the particular deformable region corresponding to an icon representing a local area wireless technology or short-range wireless communication rendered by the touchscreen of the mobile computing device in response to short-range wireless communication (e.g., Bluetooth) between the mobile computing device and a secondary device, as shown in FIG. 15 .
- method S 300 detects a short-range wireless communication application executing on the mobile computing device.
- Method S 300 can detect an event corresponding to the secondary device within an area proximal the mobile computing device.
- the secondary device can also execute a native short-range wireless communication application or emit a short-range wireless communication signal that is detectable by the mobile computing device when the secondary device is within wireless range of the mobile computing device.
- method S 300 can render on the display of the mobile computing device an interface through which the user can confirm continued short-range wireless communication between the mobile computing device and the secondary device.
- Method S 300 can select a deformable region substantially corresponding to the interface and selectively expand the deformable region, thereby yielding a raised button with which a user can interact to confirm continued wireless communication with the second device.
- the interface can correspond to an image of an icon rendered on a touchscreen within the mobile computing device, the icon graphically representing the short-range wireless communication between the devices.
- the icon can include a list of devices (or local area networks) within the area proximal the mobile computing device from which the user can select one or more devices (or local area networks) with which the mobile computing device may communicate.
- the interface can be represented over a region of the mobile computing device distinct from the touchscreen portion, such as a side or back surface of the mobile computing device.
- method S 300 can retract the deformable region(s) and disable input(s) to the mobile computing device in response to receipt of a signal from a third party device indicating the mobile computing device was lost or stolen.
- method S 300 can detect a phone tracking application executing on the mobile computing device.
- Method S 300 can detect a message from a third party device indicating that the owner of the mobile computing device no longer possesses the mobile computing device.
- method S 300 tracks location and can disable interactive features of the mobile computing device.
- Method S 300 can disable inputs and outputs to the mobile computing device.
- method S 300 can selectively transition expanded deformable regions to the retracted setting.
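- The lost-or-stolen response described here is essentially "retract everything, then ignore input while continuing to report location"; a sketch with hypothetical driver objects:

```python
def lockdown(regions, displacement_device, touch_controller, tracker):
    """Retract all expanded deformable regions and disable interactive features
    while continuing to report location. All objects are hypothetical drivers."""
    for region in regions:
        if region.expanded:
            displacement_device.retract(region.region_id)
            region.expanded = False
    touch_controller.disable()          # stop accepting inputs
    tracker.start_reporting_location()  # keep tracking per the phone-tracking app
```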
- the systems and methods of the embodiments can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions.
- the instructions can be executed by computer-executable components integrated with the application, applet, host, server, network, website, communication service, communication interface, native application, frame, iframe, hardware/firmware/software elements of a user computer or mobile device, or any suitable combination thereof.
- Other systems and methods of the embodiments can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions.
- the instructions can be executed by computer-executable components integrated with apparatuses and networks of the type described above.
- the computer-readable instructions can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory, EEPROMs, optical devices (CD or DVD), hard drives, floppy drives, or any suitable device.
- the computer-executable component can be a processor, though any suitable dedicated hardware device can (alternatively or additionally) execute the instructions.
Description
- This application is a continuation of U.S. application Ser. No. 14/471,889, filed 28 Aug. 2014, which claims the benefit of U.S. Provisional Application No. 61/871,264, filed on 28 Aug. 2013, which is incorporated in its entirety by this reference.
- This application is related to U.S. application Ser. No. 11/969,848, filed on 4 Jan. 2008; U.S. application Ser. No. 12/319,334, filed on 5 Jan. 2009; U.S. application Ser. No. 12/497,622, filed on 3 Jul. 2009, which are all incorporated in their entirety by this reference.
- This invention relates generally to tactile user interfaces, and more specifically to new and useful mountable systems and methods for selectively raising portions of a surface of the user interface of a device.
- FIG. 1 is a schematic representation of the method of the preferred embodiments;
- FIG. 2 is a top view of a variation of the tactile interface layer;
- FIG. 3 is a cross sectional view of a variation of the tactile interface layer;
- FIGS. 4A, 4B, and 4C are cross-sectional views illustrating the operation of a deformable region of a tactile interface layer;
- FIG. 5 is a cross sectional view of a variation of the tactile interface layer with a valve;
- FIG. 6 is a schematic representation of a variety of gestures and exemplary interpretations as commands;
- FIGS. 7A and 7B are schematic representations of a swiping gesture and the elimination of a deformed region as applied to the variation of the tactile interface layer in FIGS. 2-4;
- FIGS. 8A and 8B are schematic representations of a pinch open gesture and the creation of a deformed region as applied to the variation of the tactile interface layer in FIGS. 2-4;
- FIGS. 9A, 9B, 10A, and 10B are schematic representations of a pinch open gesture and a change of the deformable region in a first and second variation, respectively, as applied to the variation of the tactile interface layer in FIGS. 2-4;
- FIGS. 11A and 11B are schematic representations of a drag gesture and a change in location of the deformed region, as applied to the variation of the tactile interface layer in FIGS. 2-4;
- FIG. 12 is a flowchart representation of a variation of the method;
- FIG. 13 is a flowchart representation of a variation of the method;
- FIG. 14 is a flowchart representation of a variation of the method;
- FIG. 15 is a flowchart representation of a variation of the method;
- FIG. 16 is a flowchart representation of a variation of the method;
- FIG. 17 is a flowchart representation of a variation of the method;
- FIG. 18 is a flowchart representation of a variation of the method; and
- FIG. 19 is a flowchart representation of a variation of the method.
- The following description of the preferred embodiments of the invention is not intended to limit the invention to these preferred embodiments, but rather to enable any person skilled in the art to make and use this invention.
- As shown in
FIG. 1, the method S100 for actuating a tactile interface layer 100 of a device that defines a surface with a deformable region of the preferred embodiments includes: detecting a gesture of the user along the surface of the tactile interface layer that includes a movement of a finger of the user from a first location (1) to a second location (2) on the surface Step S110; interpreting the gesture as a command for the deformable region Step S120; and manipulating the deformable region of the surface based on the command Step S130. The method S100 for actuating a tactile interface layer 100 may also include the step of receiving a user input for a particular interpretation of a gesture as a command Step S140. The step of receiving a user input for a particular interpretation of a gesture as a command Step S140 may include receiving a user input from the user of the device, but may alternatively include receiving a user input from a person remote from the device, for example, a third party such as the manufacturer or a second user. However, the user input for a particular interpretation of a gesture as a command may be received from any other suitable user. The method S100 is preferably applied to a tactile interface layer 100 that is to be used with an electronic device and, more preferably, in an electronic device that benefits from an adaptive user interface. The electronic device may include a display and may include a touch sensor. For example, the electronic device may be an automotive console, a steering wheel, a desktop computer, a laptop computer, a tablet computer, a television, a radio, a desk phone, a mobile phone, a PDA, a personal navigation device, a personal media player, a camera, a watch, a remote control, a mouse, a trackpad, or a keyboard. The tactile interface layer 100 may, however, be used as the user interface for any suitable device that interfaces with a user in a tactile and/or visual manner. The tactile interface layer 100 is preferably integrated with the device; for example, in the variation wherein the tactile interface layer 100 includes a sensor 140, the tactile interface layer 100 is preferably assembled into the device and presented to the user as one unit. Alternatively, the tactile interface layer 100 may function as an accessory to a device: the user may be presented the tactile interface layer 100 and the device as two separate units wherein, when coupled to each other, the tactile interface layer 100 functions to provide tactile guidance to the user and/or to receive user inputs. However, the method S100 may be applied to any other suitable arrangement of the tactile interface layer 100. - The method S100 of the preferred embodiments is preferably applied to any suitable tactile interface layer that includes deformable regions. In particular, as shown in
FIGS. 2-4, the method S100 of the preferred embodiments may be applied to the user interface system as described in U.S. application Ser. Nos. 11/969,848, 12/319,334, and 12/497,622. The tactile interface layer may be applied over a display, but may alternatively be applied onto a surface without a display. However, the tactile interface layer may be applied to any suitable surface of a device that may benefit from a tactile interface. The tactile interface layer 100 of this variation preferably includes a layer 110 that defines a surface 115, a substrate that at least partially defines a fluid vessel that includes a volume of fluid 112, and a displacement device 130 coupled to the fluid vessel that manipulates the volume of fluid 112 to expand and/or contract at least a portion of the fluid vessel, thereby deforming a particular region 113 of the surface 115. The substrate may also include a support region that substantially prevents inward deformation of the layer 110 (for example, inward deformation into the fluid vessel). The tactile interface layer 100 of this variation may also include a second layer 210 (as shown in FIGS. 10a and 10b) that allows for an additional degree of deformation of the surface 115. In this variation of the tactile interface layer 100, the step of manipulating the deformable region of the surface based on the command Step S130 preferably includes manipulating the fluid within the fluid vessel. In particular, the displacement device 130 is preferably actuated to manipulate the fluid within the fluid vessel to deform a particular region 113 of the surface. The fluid vessel preferably includes a cavity 125 and the displacement device 130 preferably influences the volume of fluid 112 within the cavity 125 to expand and retract the cavity 125. The fluid vessel may alternatively be a channel 138 or a combination of a channel 138 and a cavity 125, as shown in FIG. 3b. The fluid vessel may also include a second cavity 125 b in addition to a first cavity 125 a. When the second cavity 125 b is expanded, a second particular region 113 b on the surface 115 is preferably deformed. The displacement device 130 preferably influences the volume of fluid 112 within the second cavity 125 b independently of the first cavity 125 a. As shown in FIG. 5, the tactile interface layer of this variation may include a valve 139 that functions to direct fluid within the tactile interface layer 100. In this variation, the step of manipulating the fluid within the fluid vessel may include actuating the valve 139 to direct fluid within the tactile interface layer 100. Alternatively, the user interface enhancement system 100 may include a second displacement device 130 that functions to influence the volume of fluid 112 within the second cavity 125 b to expand and retract the second cavity 125 b, thereby deforming a second particular region 113 b of the surface. The second cavity 125 b is preferably similar or identical to the cavity 125, but may alternatively be any other suitable kind of cavity. The following examples may be described as expanding a fluid vessel that includes a cavity 125 and a channel 138, but the fluid vessel may be any other suitable combination of cavity 125 and/or channel 138. However, any other suitable type of tactile interface layer 100 may be used. - The
tactile interface layer 100 preferably functions to provide tactile guidance to a user when using a device that the tactile interface layer 100 is applied to. As shown in FIG. 4, the surface 115 of the tactile interface layer 100 preferably remains flat until tactile guidance is to be provided to the user at the location of the particular region 113. In the variation of the tactile interface layer 100 as described above, the displacement device 130 then preferably expands the cavity 125 (or any other suitable portion of the fluid vessel) to expand the particular region 113 outward, forming a deformation that may be felt by a user (referenced throughout this document as a “tactilely distinguishable formation”), and providing tactile guidance for the user. The expanded particular region 113 preferably also provides tactile feedback to the user when he or she applies force onto the particular region 113 to provide input. This tactile feedback may be the result of Newton's third law: whenever a first body (the user's finger) exerts a force on a second body (the surface 115), the second body exerts an equal and opposite force on the first body, or, in other words, a passive tactile response. Alternatively, the displacement device 130 may retract the cavity 125 to deform the particular region 113 inward. However, any other suitable method of deforming a particular region 113 of the tactile interface layer 100 may be used. - The
tactile interface layer 100 preferably includes a sensor that functions to detect the gesture of the user, for example, a capacitive sensor that functions to detect the motion of a finger of the user from the first location to the second location. Alternatively, in the variation of the tactile interface layer 100 as described above, a pressure sensor located within the fluid vessel may be used to detect changes in pressure within the fluid vessel to detect the motion of a finger of the user from the first location to the second location. Alternatively, the sensor may be a sensor included in the device to which the tactile interface layer 100 is applied; for example, the device may include a touch sensitive display onto which the tactile interface layer 100 is overlaid. The gesture of the user may be detected using the sensing capabilities of the touch sensitive display. However, any other suitable gesture detection may be used. - Similarly, the
tactile interface layer 100 preferably includes a processor that functions to interpret the detected gesture as a command. The processor preferably functions to discern between a gesture that is provided by the user as a command and a gesture that may be provided by the user but not meant to be a command, for example, an accidental brush of the finger along the surface of the tactile interface layer 100. The processor may include a storage device that functions to store a plurality of gesture and command associations and/or user preferences for interpretations of gestures as commands. The processor may be any suitable type of processor and the storage device may be any suitable type of storage device, for example, a flash memory device, a hard-drive, or any other suitable type. The processor and/or storage device may alternatively be a processor and/or storage device included in the device that the tactile interface layer 100 is applied to. However, any other suitable arrangement of the processor and/or storage device may be used. - As shown in
FIG. 6, a gesture may be one of a variety of movements of one or more fingers of the user across the surface 115 of the tactile interface layer 100. The gesture may be detected as a swipe from a first location to a second location arranged in any suitable location along the surface 115 of the tactile interface layer 100. Alternatively, this first variation of gesture may be detected as a swipe from a first location relative to a deformed particular region 113 to a second location relative to the deformed particular region 113. Detection of a gesture relative to a deformed particular region 113 may be particularly useful in the variation of the tactile interface layer 100 that includes a plurality of deformable regions and may function to allow the interpretation of the gesture as a command for a particular deformable region that is substantially proximal to the detected gesture. However, the gesture may be detected relative to any other suitable portion of the tactile interface layer. - The gesture may be a single finger moving from the first location to the second location on the
surface 115, as shown inFIG. 6 (Example A). Alternatively, the gesture may include more than one finger, for example, two fingers, where the first finger moves from a first location to a second location and the second finger moves from a third location to a forth location, as shown inFIG. 6 (Examples B-E). In the variation where the gesture includes more than one finger, the fingers of the user preferably move substantially concurrently. Alternatively, the fingers may move one after the other, or in other words, a “staggered” gesture, for example, a first finger moves and then the second finger moves or the first finger starts moving and continues moving as the second finger starts to move. However, any other suitable temporal relationship between the fingers of the user during a gesture may be used. - In a first variation of the gesture, as shown in Example A, the finger or fingers of a user move from a first location to a second location in a “swiping” motion. In a second variation, at least two of the fingers of the user move apart from each other in a “pinch open” motion, as shown in Example B. In other words, a first finger moves from a first location to a second and a second finger moves from a third location to a fourth, where the second and fourth locations are farther apart from each other than the first and third. A third variation of the gesture may be thought of as opposite that of the second variation, where at least two of the fingers of the user move together in a “pinch close” motion, as shown in Example C. In a fourth variation of the gesture, at least two fingers of the user may move in substantially the same direction in a “drag” motion, as shown in Example D. In other words, a first finger moves from a first location to a second and a second finger moves from a third location substantially adjacent to the first location to a fourth location substantially adjacent to the second location. In this variation, the first and second fingers remain substantially equidistant from the beginning of the gesture to the end of the gesture. In a fifth variation, as shown in Example E, the first and second fingers also remain substantially equidistant from the beginning of the gesture to the end of the gesture. In this fifth variation, the first finger moves from a first location to a second location and the second finger moves from a third to a fourth location along the surface by rotating about a point that is substantially in between the distance between the first and third locations. In other words, the fingers of a user rotate about a center that is substantially defined by the distance between the initial positions of the first and second fingers of the user. While the gesture is preferably one of the variations as described above, the gesture may be any other suitable combination of the above variations and/or any other suitable type of gesture.
- As shown in
FIG. 6 , the gesture may be interpreted as one of a variety of commands for the deformableparticular region 113. Examples A-E show exemplary parings between a gesture and the interpreted command. However, any other suitable type of association between gesture and command may be used. In a first variation, the command may be to un-deform (or “eliminate”) the deformedparticular region 113, as shown inFIG. 6 (Examples A and B). In a second variation, the command may be to change the shape of the deformed particular region, as shown inFIG. 6 (Example B), for example, to enlarge the surface area of the deformed particular region and/or to change overall shape of the deformed particular region (e.g., from a substantially round button to a substantially square button). In a third variation, the command may be to actuate (or “create”) the deformedparticular region 113. This may be thought of as the opposite of the first variation of command. In a fourth variation, the command may be to change the location of a deformedparticular region 113, as shown inFIG. 6 (Example D). In the variation of thetactile interface layer 100 that includes a plurality of deformable regions, the fourth variation may alternatively be thought of the “elimination” of the originally deformedparticular region 113 at a first location and the “creation” of another deformedparticular region 113 at a second location. The second location is preferably indicated by the gesture provided by the user. However, the location of the deformed particular region may be changed using any other suitable method. In a fifth variation, the command may be to change an already deformedparticular region 113, for example, to change the firmness or the height of the deformedparticular region 113, as shown inFIG. 6 (Example E). However, any other suitable type of change to the deformedparticular region 113 may be used, for example, a gesture that is not in contact with thesurface 115. In this variation, the sensor that detects the gesture may be a video sensor or a distance sensor that detects the motion of the user that is removed from thesurface 115. Similarly, the gesture may include any other suitable body part of the user, for example, a hand, an arm, and/or a foot. - The command interpreted from the gesture along the
surface 115 of the tactile interface layer is preferably one of the variations described above, but may alternatively be any suitable combination of the above variations or any other suitable type of command for the deformable region. In addition to a command for the deformable region, the gesture may also be interpreted as a command for the device, for example, when applied to a device that is a mobile phone, music player, or any other suitable device that outputs sound, the command may include a user command to change the volume of the sound output. Similarly, in a device that provides a visual output, the command may include a user command to change the brightness or any other suitable property of the visual output. However, any other suitable command for the device may be used. - The following include exemplary interpretations of particular gestures as commands for the deformable region and implementation of the command using the variation of the
tactile interface layer 100 as described in U.S. application Ser. Nos. 11/969,848, 12/319,334, and 12/497,622, which are incorporated in their entireties by this reference. - In a first exemplary interpretation, as shown in
FIG. 6 (Examples A and B), a “swiping” gesture, as shown inFIGS. 7a and 7b , or a “pinch open” gesture may be interpreted as an “eliminate” command to un-deform the deformedparticular region 113. The “swiping” motion and the “pinch open” gesture is substantially similar to a motion a user may make when moving an object away or to push out a crease, thus, it may be useful to associate such a gesture with the elimination of a deformation of the deformedparticular region 113. The “swiping” and “pinch open” gesture may involve one and two fingers of the user, respectively, but may alternatively involve two and four fingers of the user, respectively, or any other suitable number of fingers of the user or any other suitable number of fingers of multiple users. In this exemplary interpretation, in the variation of thetactile interface layer 100 that includes a plurality of deformable regions, the location of the gesture relative to a deformedparticular region 113 may be used to determine the deformed region that the user wishes to eliminate. For example, as shown in Example A andFIGS. 7a and 7b , thetactile interface layer 100 may detect that the finger of the user passes over a particular deformedparticular region 113 and interprets the gesture as a command to eliminate the particular deformation. Alternatively, thetactile interface layer 100 may detect a command motion from the user and be prepared to eliminate a deformed particular region at a location later indicated by the user. For example, the swiping or pinch open gesture may indicate to thetactile interface layer 100 that the user desires to eliminate a particular deformed region. Upon detection of the gesture, theuser interface 100 may actuate an operation mode that waits for a user to indicate the desired deformation to eliminate. The user may then indicate the desired location for the desired deformation to eliminate anywhere on thetactile interface layer 100. The location may be substantially adjacent to where the user provided the gesture, but may alternatively be substantially distal from where the user provided the gesture along thesurface 115. However, the user may define their desired location using any other suitable method, for example, applying pressure to a particular location on thesurface 115. However, any other suitable method to indicate the desired deformed region to eliminate may be used. - In a second exemplary interpretation, as shown in
FIG. 6 (Example C) and inFIGS. 8a and 8b , a “pinch close” gesture may be interpreted as a “creation” command. The user may “create” the button for any suitable reason, for example, to mark a location on a screen, to mark an option, to mark a file for easy reference, or to indicate that tactile guidance is desired at a particular location. For example, as shown inFIGS. 8a and 8b , the user may draw two fingers together to indicate the desired location of a deformed particular region of thesurface 113. Thetactile interface layer 100 preferably detects the motion of the fingers and location along the fluid vessel and/orcavity 125 corresponding to aparticular region 113 that is substantially adjacent to the location substantially central to the location of the fingers as the user draws the fingers is expanded and the desired particular region of thesurface 113 is deformed, as shown inFIG. 8b . The user may alternatively draw more than two fingers together to better define a central location. Similar to the first exemplary interpretation, thetactile interface layer 100 may alternatively detect a gesture from the user and be prepared to expand acavity 125 in a location indicated by the user. For example, the motion of drawing two fingers together may indicate to thetactile interface layer 100 that the user desires expansion of acavity 125. Upon detection of the gesture, theuser interface 100 may actuate an operation mode that waits for a user indication for the desired location for a deformed region of the surface. The user may then indicate the desired location for the deformed region of the surface anywhere on thetactile interface layer 100. The location may be substantially adjacent to where the user draws two fingers together, but may alternatively be substantially distal from where the user draws two fingers together. Alternatively, the user may indicate an arrangement of deformable regions to deform. For example, once a command to deform a particular region is interpreted, the user may indicate the desired arrangement of regions by providing a second gesture, such as to trace a shape on thesurface 115 that may indicate, for example, a QWERTY keyboard configuration of deformable regions. The user may also indicate a desired shape of the deformed region of the surface. For example, the user may trace out a desired shape along the surface and theuser interface 100 may function to deform the regions substantially adjacent to the traced shape along the surface. However, the user may define their desired location using any other suitable method, for example, applying pressure to a particular location on thesurface 115. This variation is preferably used on the variation of the user interface system that includes a plurality ofcavities 125 to provide the user with a plurality of options of the location of the deformed particular region of thesurface 113, but may alternatively be used on atactile interface layer 100 with any other suitable number ofcavities 125. This may be a useful tactile experience where the device is a trackpad and the user draws his or her fingers together to create a pointing stick, such as the pointing stick trademarked by IBM as the TRACKPOINT and by Synaptics as the TOUCHSTYK (which are both informally known as the “nipple”). This allows a trackpad to be combined with a pointing stick where the two navigational interfaces are generally kept separate. - In a third exemplary interpretation, as shown in
FIG. 6 (Example B), andFIGS. 9 and 10 , the user may pull two fingers in a “pinch open” gesture to indicate the desire to expand (e.g., “spread out”) a deformedparticular region 113, for example, a user may pull two fingers in opposite directions away from a deformedparticular region 113 or pull a finger in one direction away from a deformedparticular region 113, indicating to theuser interface system 100 that the total surface area of the deformation of theparticular region 113 is to be increased, or “spread.” In a first example, the deformable region of thetactile interface layer 100 may include a first and second degree of deformation, as shown inFIGS. 9a and 9b . In this example, the deformable region may require a first pressure to deform the first degree and a second pressure to deform the second degree; for example, thelayer 110 may include a second portion that requires a higher pressure to deform. Alternatively, a first andsecond cavity second cavities second cavities FIGS. 10a and 10b . Alternatively, the degree of deformation of theparticular region 113 may be decreased and substantially adjacentparticular regions 113 may be expanded to produce the effect of spreading a deformation across a large surface area. Alternatively, the deformation of theparticular region 113 may be maintained and substantially adjacentparticular regions 113 can be expanded to substantially the same degree, providing the effect of enlarging a deformedparticular region 113. The “spreading” of the deformedparticular region 113 may be radially equidistant from the original deformedparticular region 113. More specifically, the central point of the resulting deformedparticular region 113 is preferably of the same central point of the original deformedparticular region 113, as shown inFIG. 9 . Alternatively, the central point of the resulting deformedparticular region 113 may be different from the central point of the original deformedparticular region 113, for example, a user may pull one finger away from the deformedparticular region 113 in one direction, indicating expansion of the deformedparticular region 113 in the indicated direction, thus moving the central point of the deformedparticular region 113 towards the indicated direction, as shown inFIG. 10 . Alternatively, the motion of two fingers pulled in opposite directions away from a deformedparticular region 113 may indicate to theuser interface system 100 to retract thecavity 125 and undeform the deformedparticular region 113. However, theuser interface system 100 may provide any other suitable active response to the motion of two fingers pulled in opposite directions away from a deformedparticular region 113. - In a fourth exemplary interpretation, as shown in
FIG. 6 (Example D), andFIGS. 11a and 11b , a “dragging” gesture may be interpreted as a command to move the deformed region from a first location to a second location along thesurface 115. For example, once afirst cavity 125 has been expanded and aparticular region 113 has been deformed, the user may move his or her finger(s) along the surface 115 (preferably in contact with thesurface 115, but may also be any other suitable distance away from the surface 115) to indicate successiveparticular regions 113 to deform. As a successiveparticular region 113 deforms, the priorparticular region 113 preferably undeforms (in other words, thecavity 125 corresponding to the priorparticular region 113 retracts), resulting in the user seemingly “dragging” the deformedparticular region 113 along thesurface 115. As the user moves his or her finger(s) along thesurface 115, the successiveparticular regions 113 are preferably substantially adjacent or continuous with each priorparticular region 113 to provide an experience akin to that of dragging a single object along a surface as opposed to touching a first object on a surface and then another object on the same surface. Alternatively, the deformed region may be “pushed” by the dragging gesture. In this variation, the dragging gesture preferably starts on one side of the deformed region and “pushes” the deformed region forward, as shown inFIGS. 11a and 11b . Subsequentparticular regions 113 may be deformed forward of the original deformed region (instead of behind as described in the dragging example) to emulate the user pushing the deformed region from a first location to a second location along thesurface 115. Yet alternatively, the user may indicate the deformed region that is to be moved by the start of the “dragging” gesture and then indicate the desired location of the moved deformed region by the end of the “dragging” gesture (in other words, where the user lifts the fingers off thesurface 115 after the gesture). In this variation, the initial deformed particular region may be “eliminated” and a particular region at the desired location is “created” while deformable regions in between the eliminated and created deformations are not actuated. However, any other suitable actuation of deformable regions may be used. - In another aspect of the fourth exemplary interpretation, the user may dictate interaction between expanded
cavities 125. For example, in the “dragging” example mentioned above, the user may “drag” an object along a path andparticular regions 113 are expanded along the path. When an object is dragged over an existing deformedparticular region 113, the object and the existing deformedparticular region 113 may “react” based on actions of the user. For example, if the user pauses the dragging motion when the object is in the location of the existing deformedparticular region 113, the deformedparticular region 113 of the object and the existing deformedparticular region 113 may seemingly “merge,” for example, the total surface area of the existing deformedparticular region 113 may grow as if the deformedparticular region 113 of the object is added into the existing deformedparticular region 113, similar to the third exemplary interpretation. The user may then also drag the “merged”particular region 113 to a different location. Alternatively, the existing deformedparticular region 113 and the deformedparticular region 113 for the object may “repel” each other, for example, the object may represent a baseball bat and the existing deformedparticular region 113 may represent a ball, and the user may “hit” the ball with the baseball bat, seemingly “repelling” the two deformed particular regions. Similarly, the user may perform a splitting motion on an existing deformedparticular region 113 and the existing deformedparticular region 113 may “split,” forming two distinct deformedparticular regions 113. Each of the resulting two distinct deformedparticular regions 113 is preferably of a smaller surface area than the original existing deformedparticular region 113. An example of a splitting motion may be drawing two fingers apart substantially adjacent to the existing deformedparticular region 113, However, any other suitable interaction between expandedcavities 125 may be implemented. While an active response to a command given by the user is preferably one of the examples described here, any active response to a command given by the user may be used. - A fifth exemplary interpretation, as shown in
FIG. 6 (Example E), a rotating gesture may be interpreted as a command to change the characteristics of a deformedparticular region 113 substantially proximal to the user input. Unlike the first, second, third, and fourth exemplary interpretations where the deformations have binary states of expanded and retracted, the command of the fifth exemplary interpretation allows for a plurality of states in between fully deformed and fully undeformed, respectively. For example, the rotating gesture around a deformedparticular region 113 may be interpreted as a command to increase the stiffness of the deformation. This may be particular useful in a scenario where the command includes a command for the volume of the device and the deformedparticular region 113 indicates the location of the “increase volume” button. When the user provides the rotating gesture around the button to indicate an input to increase the volume, the deformedparticular region 113 may become progressively stiffer to the touch as the volume becomes higher and reaches the limit of the volume strength, indicating to the user through tactile means where along the volume scale they are currently. The rate of stiffness increase may be selected by the user to be tailored to their tactile preferences and/or sensitivity. The height of the deformedparticular region 113 may also be adjusted as the volume level changes. To adjust the stiffness of the particular region of thesurface 113, thedisplacement device 130 may adjust the amount of fluid that is displaced to expand thecavity 125. The more fluid that is displaced to expand thecavity 125, the stiffer theparticular region 113 will feel to the touch. Similarly, the more fluid that is displaced to expand thecavity 125, the taller the deformation of theparticular region 113. Theuser interface system 100 may also include a valve that directs the fluid displaced by thedisplacement device 130. In this variation, when additional fluid is desired to expand thecavity 125 to increase the stiffness and/or the height of the deformedparticular region 113, the valve may direct additional fluid into thecavity 125. - While the interpretation of the gesture as a command is preferably one of the variations described above, the active response may alternatively be a combination of the variations described above or any other suitable combination of gestures and commands.
- As shown in
FIG. 12 , the method S200 for responding to an implicit gesture includes: determining that a mobile computing device is held by a user in Block S210, the mobile computing device comprising a substrate defining a fluid channel, an attachment surface, and a fluid conduit fluidly coupled to the fluid channel and passing through the attachment surface, a tactile layer defining a deformable region and a peripheral region, the peripheral region adjacent the deformable region and coupled to the attachment surface, the deformable region adjacent the peripheral region, arranged over the fluid conduit, and disconnected from the attachment surface, and a displacement device configured to displace fluid through the fluid channel to transition the deformable region from a retracted setting to an expanded setting; identifying a position of the mobile computing device in a hand of the user in Block S220; predicting a location of a future input into the mobile computing device in Block S230, the location proximal the deformable region; and transitioning the deformable region from the retracted setting to the expanded setting in Block S240. - Generally, the second method S200 functions to predict a position of an upcoming input based on how a mobile computing device (e.g., a smartphone, a tablet, a PDA, a personal music player, a wearable device, a watch, a wristband, etc.) is held by a user and then to manipulate a dynamic tactile interface within the mobile computing device to yield a tactilely-distinguishable formation on the dynamic tactile interface proximal the predicted position of the upcoming input, a desired location of a button (i.e., input region), or a desired shape of the dynamic tactile interface. Thus, the second method S200 can manipulate one or more deformable regions of a dynamic tactile interface within a mobile computing device to dynamically form tactilely-distinguishable formations on the mobile computing device, thereby improving convenience and ease of use of the mobile computing device.
- In one example, while the mobile computing device is ‘locked,’ the second method S200 identifies that the mobile computing device is held in a portrait orientation in a user's left hand and thus transitions a deformable region over the top left quadrant (i.e., Cartesian Quadrant II) of the display to define a physical “unlock” region adjacent a repositioned unlock slider rendered on the display. In this example, the second method S200 thus identifies how the mobile computing device is held and manipulates the dynamic tactile layer to place the physical unlock region in a position directly and naturally accessible by the user's left thumb, thus increasing the ease with which the user may unlock the mobile computing device. In this example, the second method S200 can also adjust the position of a key (e.g., graphic) rendered on the display to align with the physical unlock region. Furthermore, for an unlock region that defines an elongated ridge indicating a swipe input to unlock, the second method S200 can modify the required input swipe direction to accommodate the user's hand position over the mobile computing device. In this example, when the mobile computing device is held in a portrait orientation in the user's left hand, the second method S200 can set the swipe direction from right to left, whereas the second method S200 sets the swipe direction from left to right when the mobile computing device is held in a portrait orientation in the user's right hand.
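- As a rough illustration of the hand-and-orientation logic in this example, the sketch below maps a detected holding hand and orientation to an unlock-region quadrant and swipe direction. The string values and the `place_unlock_region` helper are hypothetical; the quadrant labels follow the Cartesian numbering used in this description.

```python
# Minimal sketch, assuming a detected grip is reported as (hand, orientation).
# Quadrant II = top-left, Quadrant I = top-right. All names are hypothetical.

def place_unlock_region(hand: str, orientation: str) -> dict:
    if orientation == "portrait" and hand == "left":
        # Left thumb naturally reaches the top-left quadrant; swipe right-to-left.
        return {"quadrant": "II", "swipe": "right_to_left"}
    if orientation == "portrait" and hand == "right":
        return {"quadrant": "I", "swipe": "left_to_right"}
    # Fallback: keep the default unlock slider position and direction.
    return {"quadrant": "default", "swipe": "left_to_right"}

config = place_unlock_region("left", "portrait")
# -> {'quadrant': 'II', 'swipe': 'right_to_left'}
```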
- In another example, while the mobile computing device is outputting audio (e.g., through headphones or through an internal speaker), the second method S200 identifies that the mobile computing device is held in a portrait orientation in a user's right hand and thus transitions a pair of deformable regions on the upper right region of the side of the mobile computing device into expanded settings to define a physical “volume up” key and a physical “volume down” key. In this example, the second method S200 thus identifies how the mobile computing device is held and manipulates the dynamic tactile layer to place physical volume adjustment regions in positions directly and naturally accessible by the user's right index finger, thus increasing the ease with which the user may adjust the volume output of the mobile computing device. In this example, the second method S200 can also render a “+” image key and a “−” image key near the perimeter of the display, proximal the physical “volume up” and “volume down” keys, to indicate the control functions of the corresponding physical keys to the user.
- In yet another example implementation, while the mobile computing device is in use (e.g., unlocked), the second method S200 determines the orientation of the mobile computing device relative to the horizon (e.g., portrait, landscape, 37° from horizontal) and transitions deformable regions within the dynamic tactile interface between expanded and retracted settings to maintain a physical “home” button proximal a current effective bottom center of the mobile computing device. Furthermore, in this example, the second method S200 can identify when the mobile computing device is rotated relative to the horizon and frequently update the position of the home button (e.g., a home button rendered on the display and a home button defined by a deformable region in the expanded setting), such as every five seconds or when the change in position of the mobile computing device exceeds a threshold position change while the mobile computing device is unlocked and in operation.
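- The update policy described in this example can be sketched compactly, assuming orientation samples arrive from an accelerometer and that the home button is re-placed either on a fixed interval or when the orientation change exceeds a threshold. The class, callback, threshold, and interval values below are illustrative assumptions, not part of the disclosure.

```python
import time

# Hypothetical sketch of keeping a physical "home" button at the effective
# bottom center of the device. Threshold and interval values are assumptions.
ANGLE_THRESHOLD_DEG = 30.0
UPDATE_INTERVAL_S = 5.0

class HomeButtonPlacer:
    def __init__(self, actuate):
        self.actuate = actuate          # callback that re-places the home button
        self.last_angle = 0.0
        self.last_update = 0.0

    def on_orientation_sample(self, angle_from_horizon_deg: float) -> None:
        now = time.monotonic()
        moved = abs(angle_from_horizon_deg - self.last_angle) > ANGLE_THRESHOLD_DEG
        stale = (now - self.last_update) > UPDATE_INTERVAL_S
        if moved or stale:
            # Re-place the home button at the current effective bottom center.
            self.actuate(angle_from_horizon_deg)
            self.last_angle = angle_from_horizon_deg
            self.last_update = now
```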
- In another example implementation, once the mobile computing device is unlocked and a home screen with native application keys is rendered on the display, the second method S200 accesses a user application history including the frequency and duration of use of the native applications displayed on the home screen. The second method S200 subsequently manipulates a set of deformable regions, each adjacent a displayed native application key, with the deformable region adjacent the native application key corresponding to the highest-use native application transitioned to the highest expanded position and with the deformable region adjacent the native application key corresponding to the lowest-use native application transitioned to the lowest expanded position or retained in the retracted position. Thus, in this example, the second method S200 can adjust the height of various deformable regions adjacent native application keys displayed within a home screen on the mobile computing device according to a likelihood that the user will select each native application based on application selection history.
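- The usage-weighted height adjustment in this example amounts to normalizing application selection counts into deformation heights. The function and field names below are hypothetical and the linear normalization is an assumption.

```python
# Illustrative sketch: scale deformable-region height with application use.
# Heights are normalized between retracted (0.0) and maximum expansion (1.0).

def region_heights(usage_counts: dict[str, int]) -> dict[str, float]:
    """Map each home-screen application key to an expansion height in [0, 1]."""
    if not usage_counts:
        return {}
    most_used = max(usage_counts.values())
    if most_used == 0:
        return {app: 0.0 for app in usage_counts}
    return {app: count / most_used for app, count in usage_counts.items()}

heights = region_heights({"mail": 42, "camera": 17, "calculator": 0})
# -> mail fully expanded (1.0), camera ~0.4, calculator retained retracted (0.0)
```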
- Block S210 of the second method S200 recites determining that the mobile computing device is held by the user. Furthermore, Block S220 of the second method S200 recites identifying a position of the mobile computing device in a hand of the user. Generally, Block S210 and Block S220 function to interface with one or more sensors on the mobile computing device to detect that the mobile computing device is being held and how the mobile computing device is being held. For example, Blocks S210 and/or S220 can interface with one or more capacitive, resistive, optical, or other touch sensors arranged about the mobile computing device, such as on and around the display, the side of the mobile computing device, and/or a back surface of the mobile computing device, to detect a finger or hand hovering over or in contact with the mobile computing device. Blocks S210 and/or S220 can additionally or alternatively interface with one or more heat sensors within the mobile computing device to detect a local temperature change across a surface of the device and to correlate the temperature change with a hand holding the mobile computing device, and/or interface with an accelerometer and/or a gyroscope to detect that the mobile computing device is being held, moved, and/or manipulated. For example, Block S210 can characterize accelerometer and/or gyroscope outputs as the mobile computing device being in a user's pocket while the user is walking, resting on a table or horizontal surface, or in a user's hand, etc. In another example, for a mobile computing device that is a wearable device (e.g., a smart wristband), Blocks S210 and S220 can interface with a heart rate sensor within the wearable device to detect the user's current heart rate, and the second method S200 can set a position of one or more deformable regions on the wearable device based on the user's current heart rate. Blocks S210 and S220 can similarly detect the user's current breathing rate or other vital sign, and the second method S200 can set a position of one or more deformable regions on the wearable device accordingly. Blocks S210 and S220 can additionally or alternatively interface with one or more bio-sensors integrated into the wearable device (or other computing device) to identify the user who is holding the wearable device based on a bio-signature output from the bio-sensor, and Blocks S210 and S220 can thus adjust a position of one or more deformable regions (e.g., a location, a height, a firmness, and/or a unique gesture definition related to a deformable region) according to a preference of the identified user.
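- Blocks S210 and S220, as described above, amount to fusing several sensor channels into a “held / not held” decision and a grip characterization. The sketch below shows one plausible, simplified decision structure; every threshold, field name, and cue is an assumption for illustration only.

```python
from dataclasses import dataclass

# Hypothetical fused sensor snapshot; field names are illustrative.
@dataclass
class SensorSnapshot:
    touch_contact_area_mm2: float   # from capacitive/resistive touch sensors
    local_temp_rise_c: float        # warming consistent with a holding hand
    accel_variance: float           # motion energy from accelerometer/gyroscope

def is_held(s: SensorSnapshot) -> bool:
    """Assumed rule: one very strong cue, or at least two weaker cues."""
    cues = [
        s.touch_contact_area_mm2 > 200.0,
        s.local_temp_rise_c > 1.5,
        0.02 < s.accel_variance < 2.0,   # moving, but not resting on a table
    ]
    return sum(cues) >= 2 or s.touch_contact_area_mm2 > 600.0

snapshot = SensorSnapshot(touch_contact_area_mm2=350.0,
                          local_temp_rise_c=2.0,
                          accel_variance=0.4)
print(is_held(snapshot))  # True in this illustrative case
```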
- Block S220 can thus compare sensed touch areas to a touch area model to characterize a touch sensor output as a left hand or a right hand holding the mobile computing device in a portrait, landscape, or other orientation. Block S220 can similarly compare sensed heat areas to a heat area model to characterize a temperature sensor output as a left or right hand holding the mobile computing device in a portrait, landscape, or other orientation. Block S220 can also determine how the mobile computing device is held, such as by one or both hands of the user, based on how text or other inputs are entered into the mobile computing device, and Block S220 can further verify such characterization of user inputs substantially in real-time based on accelerometer and/or gyroscope data collected by sensors in the mobile computing device.
- Blocks S210 and S220 can additionally or alternatively implement machine vision and/or machine learning to identify a face, body, clothing feature, etc. in a field of view of a (forward-facing) camera within the mobile computing device and thus determine that the mobile computing device is held and how the mobile computing device is held based on the identified face, body, clothing feature, etc. For example, Block S210 can implement facial recognition to determine that the mobile computing device is currently held, and Block S220 can implement face tracking to predict which hand the user is using to hold the mobile computing device. Blocks S210 and S220 can additionally or alternatively interface with a rear-facing camera within the mobile computing device to identify a hand (e.g., left or right) holding the mobile computing device. Blocks S210 and S220 can similarly identify a hand shape or hand motion (i.e., gesture) in a field of view of a camera within the mobile computing device (and not touching the mobile computing device), and subsequent Blocks of the second method S200 can set a deformable region position according to the identified hand shape or gesture.
- Blocks S210 and S220 can additionally or alternatively determine whether the mobile computing device is worn, in use, in a particular location, or in an “ON” or “unlocked” state. For example, the second method S200 can selectively expand and retract one or more side, back, or on-screen deformable regions based on location data of the mobile computing device determined in Blocks S210 and S220 through a location (e.g., GPS) sensor within the mobile computing device. In this example, the second method S200 can thus selectively control the position of various deformable regions based on whether the user is at home or in his car, what application is running on the mobile computing device, etc.
- However, Block S210 and Block S220 can function in any other way to determine that the mobile computing device is being held and to characterize how the mobile computing device is held.
- Block S230 of the second method S200 recites predicting a location of a future input into the mobile computing device, the location proximal the deformable region. Generally, Block S230 functions to predict a location of an upcoming input based on how the mobile computing device is held (e.g., the orientation of the mobile computing device, which hand(s) the user is using to hold the mobile computing device). In an example similar to that described above, when the mobile computing device is “locked” and Blocks S210 and S220 determine that the user has picked up the mobile computing device with his left hand and is holding the mobile computing device in a portrait configuration, Block S230 can predict an upcoming input to include an “unlock” gesture. In this example, Block S230 can also predict a convenient or preferred unlock input to be a swipe from Quadrant I of the display (the current top-right quadrant) to Quadrant II of the display (the current top-left quadrant) based on the holding hand and orientation determined in Blocks S210 and S220. Block S230 can thus predict the upcoming input and a preferred location for the upcoming input.
- In another example similar to that described above, when the mobile computing device is outputting sound, such as through a headphone stereo jack or an internal speaker, and Blocks S210 and S220 determine that the user is holding the mobile computing device in his right hand in a portrait configuration, Block S230 can predict an upcoming input to include either of a “volume up” gesture and a “volume down” gesture. In this example, Block S230 can also predict convenient or preferred “volume up” and “volume down” input regions to lie off the display on an upper left lateral side of the mobile computing device such that the user's right index finger falls substantially naturally on the “volume up” and “volume down” input regions. Block S230 can thus predict the upcoming input and a preferred or convenient location for the upcoming input based on the holding position of the mobile computing device determined in Blocks S210 and S220.
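- The prediction step in Block S230 can be thought of as a lookup from device state and grip characterization to an expected input and a preferred input region, as the two examples above suggest. The table below is purely illustrative; the state names and region identifiers are assumptions.

```python
# Minimal sketch of Block S230-style prediction: (device state, grip) ->
# (expected input, preferred input region). All names are hypothetical.

PREDICTION_TABLE = {
    ("locked", "left_hand_portrait"): ("unlock_swipe", "display_quadrant_II"),
    ("locked", "right_hand_portrait"): ("unlock_swipe", "display_quadrant_I"),
    ("audio_playing", "right_hand_portrait"):
        ("volume_adjust", "upper_left_lateral_side"),
}

def predict_future_input(device_state: str, grip: str):
    return PREDICTION_TABLE.get((device_state, grip),
                                ("unknown", "default_region"))

print(predict_future_input("audio_playing", "right_hand_portrait"))
# -> ('volume_adjust', 'upper_left_lateral_side')
```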
- Block S240 of the second method S200 recites transitioning a deformable region from the retracted setting to the expanded setting. Generally, Block S240 functions to control the displacement device to displace fluid through the fluid channel to transition the deformable region from the retracted setting to the expanded setting. Block S240 can control one or more valves and/or one or more displacement devices within the mobile computing device to selectively expand and/or retract a particular subset of deformable regions, as described above or as described in U.S. patent application Ser. No. 12/319,334, filed on 5 Jan. 2009, which is incorporated in its entirety by this reference.
- Therefore, the second method S200 can function to predict a future input and/or a preferred or convenient location for a future input and to manipulate a deformable region on the mobile computing device to define a tangible button accordingly. The second method S200 can manipulate one or more deformable regions over a display within the mobile computing device (i.e., an on-screen physical button) and/or one or more deformable regions remote from the display (i.e., an off-screen physical button). As described above, the second method can therefore control one or more valves, displacement devices, etc. to form a physical volume up button, volume down button, lock button, unlock button, ringer or vibrator state button, home button, camera shutter button, and/or application selection button, etc. on the mobile computing device. The second method S200 can further manage outputs from a touch sensor to handle user inputs into selectively formed buttons, and the second method can also interface with a display driver to render visual input region identifiers adjacent (i.e., under) on-screen buttons and/or to render visual input identifiers near or pointing to off-screen buttons. For example, the second method S200 can detect a first gesture, selectively adjust the position of a particular deformable region accordingly, detect a subsequent gesture, assign a particular output type to the particular deformable region, and then generate an output of the particular output type when the particular deformable region is subsequently selected by the user. However, the second method S200 can function in any other way to estimate how the mobile computing device is held, to predict a type and/or location of a future input, and to manipulate a vertical position of one or more deformable regions according to the predicted type and/or location of the future input.
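- Read as a whole, the second method S200 behaves as a detect, predict, actuate, and handle loop. The outline below is a non-authoritative sketch of that control flow; the helper objects and their methods merely stand in for the block behaviors described above and are not part of the disclosure.

```python
# Illustrative control-flow sketch of the second method S200.
# Each helper stands in for a block described above; all names are hypothetical.

def run_s200_cycle(sensors, predictor, displacement, touch, display):
    if not sensors.device_is_held():                     # Block S210
        return None
    grip = sensors.characterize_grip()                   # Block S220
    expected_input, region_id = predictor.predict(grip)  # Block S230
    displacement.expand(region_id)                       # Block S240
    display.render_key_label(region_id, expected_input)  # align graphic with button
    event = touch.wait_for_input(region_id)
    if event is not None:
        # Generate the output type previously assigned to this region.
        return predictor.output_for(region_id, event)
    return None
```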
- An example of method S200 includes detecting an ongoing phone call, or other sound output through a speaker, on a mobile phone with a touchscreen. Method S200 can further detect the orientation of the phone by detecting the touchscreen proximal to and/or contacting an ear of the user, such as when the user holds the mobile phone up to the ear during the ongoing phone call. In response, method S200 can select and expand a deformable region corresponding to the ear and the speaker such that the deformable region forms an earpiece. Thus, method S200 can expand the earpiece to conform to the ear and focus sound output from the speaker toward the ear for improved hearing.
- As shown in
FIG. 14 , the method S300 registers interaction with a dynamic tactile interface. The dynamic tactile interface includes a tactile layer and a substrate, the tactile layer defining a tactile surface, a deformable region, and a peripheral region adjacent the deformable region and coupled to the substrate opposite the tactile surface. The method S300 includes detecting an orientation of the device in Block S310; predicting a location of an upcoming input related to a native application executing on the device in Block S320; selecting a particular deformable region from a set of deformable regions, the particular deformable region substantially coincident the input location in Block S330; selectively transitioning the particular deformable region from a retracted setting into an expanded setting, the deformable region substantially flush with the peripheral region in the retracted setting and tactilely distinguishable from the peripheral region in the expanded setting in Block S340; and detecting an input, corresponding to the anticipated input, on the particular deformable region in Block S350. - As shown in
FIG. 19 , one variation of method S300 includes receiving a notification event at the device in Block S315; detecting a particular location of an input object contacting a surface of the device prior to an upcoming input in Block S320; in response to the notification event, rendering a virtual communication on a region of a display of the device adjacent the particular location, the virtual communication corresponding to the notification event, in Block S325; selecting a particular deformable region from a set of deformable regions, the particular deformable region corresponding to the anticipated output and adjacent the particular location, in Block S340; selectively transitioning the particular deformable region from a retracted setting substantially flush with the peripheral region to an expanded setting tactilely distinguishable from the peripheral region in Block S350; and detecting an input to the particular deformable region in Block S360. - Generally, method S300 functions to register an implicit event associated with an input, define a command for the dynamic tactile interface in response to the implicit event, and, in response to the command, modify the dynamic tactile interface according to an anticipated future input to the dynamic tactile interface. In particular, method S300 functions to correlate the spatial orientation of the device and a native application executing on the device with a configuration of deformable regions of the dynamic tactile interface.
- The dynamic tactile interface can further include a display coupled to the substrate opposite the tactile layer and displaying an image of a key substantially aligned with the deformable region and/or a touch sensor coupled to the substrate and outputting a signal corresponding to an input on a tactile surface of the tactile layer adjacent the deformable region. The dynamic tactile interface can also include a housing that transiently engages a (mobile) computing device and transiently retains the substrate over a digital display of the (mobile) computing device.
- Generally, the dynamic tactile interface can be implemented within or in conjunction with a computing device to provide tactile guidance to a user entering input selections through a touchscreen or other illuminated surface of the computing device. In particular, the dynamic tactile interface defines one or more deformable regions of a tactile layer that can be selectively expanded and retracted to intermittently provide tactile guidance to a user interacting with the computing device. In one implementation, the dynamic tactile interface is integrated into or applied over a touchscreen of a mobile computing device, such as a smartphone or a tablet. For example, the dynamic tactile interface can include a set of round or rectangular deformable regions, wherein each deformable region is substantially aligned with a virtual key of a virtual keyboard rendered on a display integrated into the mobile computing device, and wherein each deformable region in the set mimics a physical hard key when in an expanded setting. However, in this example, when the virtual keyboard is not rendered on the display of the mobile computing device, the dynamic tactile interface can retract the set of deformable regions to yield a substantially uniform (e.g., flush) tactile surface yielding reduced optical distortion of an image rendered on the display. In another example, the dynamic tactile interface can include an elongated deformable region aligned with a virtual ‘swipe-to-unlock’ input region rendered on the display such that, when in the expanded setting, the elongated deformable region provides tactile guidance for a user entering an unlock gesture into the mobile computing device. Once the mobile computing device is unlocked responsive to the swipe gesture suitably aligned with the virtual input region, the dynamic tactile interface can transition the elongated deformable region back to the retracted setting to yield a uniform surface over the display.
- The dynamic tactile interface can alternatively embody an aftermarket device that adds tactile functionality to an existing computing device. For example, the dynamic tactile interface can include a housing that transiently engages an existing (mobile) computing device and transiently retains the substrate over a digital display of the computing device. The displacement device of the dynamic tactile interface can thus be manually or automatically actuated to transition the deformable region(s) of the tactile layer between expanded and retracted settings.
- Generally, Block S310 detects an orientation of the device. In particular, Block S310 can interface with a sensor incorporated into the device (e.g., a touch sensor, an optical sensor, an accelerometer, a Global Positioning System receiver, etc.) to detect the orientation of the device relative to an external surface or body. For example, Block S310 can interface with an accelerometer built into the device to detect the orientation of a mobile phone relative to a horizontal surface. The mobile phone can be oriented in a portrait orientation, such that the minor axis of the device is substantially parallel to the horizontal surface. Likewise, the device can be oriented in a landscape orientation, such that the major axis of the device is substantially parallel to the horizontal surface. Alternatively, Block S310 can detect the device in any other orientation with any other sensor suitable for detecting the orientation of the device. For example, Block S310 can detect, with an optical sensor, a display of the device resting on a horizontal surface. Block S310 can further detect the position of the device relative to an external surface and/or object. In another example, Block S310 can detect an input object (e.g., a finger) resting on a surface of the device. Block S310 can detect the input object with a sensor, such as a capacitive, resistive, and/or optical sensor.
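- Block S310's accelerometer-based orientation check can be sketched as comparing which device axis gravity projects onto. The axis convention, the flat-on-surface test, and the thresholds below are assumptions for illustration.

```python
import math

# Sketch of Block S310: classify portrait vs. landscape from accelerometer
# gravity components. Assumes x = minor (width) axis, y = major (height) axis,
# z = normal to the display. All thresholds are assumptions.

def classify_orientation(ax: float, ay: float, az: float) -> str:
    g = math.sqrt(ax * ax + ay * ay + az * az) or 1.0
    if abs(az) / g > 0.9:
        return "flat_on_surface"      # display roughly parallel to the table
    # Gravity mostly along the major axis -> device upright in portrait,
    # i.e., the minor axis lies substantially parallel to the horizontal.
    return "portrait" if abs(ay) > abs(ax) else "landscape"

print(classify_orientation(0.2, 9.6, 1.0))   # 'portrait'
print(classify_orientation(9.5, 0.4, 1.2))   # 'landscape'
```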
- Generally, Block S320 predicts a location of an upcoming input related to a native application executing on the device. In particular, Block S320 can predict a particular input at a particular location in response to execution of the native application, such as a contact with a surface of the device at the particular location. For example, Block S320 can identify a future input defined by a contact by an input object (e.g., a finger) on a portion of the touchscreen of the computing device corresponding to a virtual image rendered by the touchscreen.
- Generally, Block S330 selects a particular deformable region from a set of deformable regions, the particular deformable region corresponding to the anticipated input and adjacent the input location. In particular, Block S330 can select the particular deformable region adjacent or arranged over the input location. Block S330 can select a particular deformable region with a shape substantially corresponding to the anticipated input. For example, if the anticipated input includes a slide gesture across the tactile surface, Block S330 can select a particular deformable region that forms an elongated and elevated button, such that the user can slide a finger across the expanded deformable region to enter the gesture into the device. Alternatively, Block S330 can select a set of particular deformable regions from the set of deformable regions, such that the set of particular deformable regions cooperatively correspond to the anticipated input.
- Generally, Block S340 selectively transitions the particular deformable region from a retracted setting substantially flush with the peripheral region to an expanded setting tactilely distinguishable from the peripheral region. In particular, Block S340 can transition the particular deformable region(s) by displacing fluid from a fluid vessel into a cavity arranged under the deformable region. The dynamic tactile interface can include a substrate and a tactile layer, the tactile layer defining a deformable region and a peripheral region adjacent the deformable region and coupled to the substrate opposite the tactile surface, the substrate defining a fluid channel and cooperating with the deformable region to define a cavity filled with fluid. A displacement device (e.g., a pump) fluidly coupled to the fluid channel can displace fluid between the cavity and a reservoir fluidly coupled to the displacement device, thereby transitioning the deformable region between an expanded setting substantially elevated above the peripheral region and a retracted setting substantially flush with the peripheral region. Generally, the tactile layer can define one or more deformable regions operable between the expanded and retracted settings to intermittently define tactilely distinguishable formations over a surface, such as over a touch-sensitive digital display (e.g., a touchscreen), as described in U.S. patent application Ser. No. 13/414,589. Thus, the displacement device can transition the deformable region into the expanded setting by displacing fluid from the fluid vessel into the cavity. Method S300 can additionally or alternatively transition the particular deformable region(s) using electromechanical actuation. For example, method S300 can be implemented with a “snap dome” deformable region.
- Generally, Block S350 detects an input, corresponding to the anticipated input, to the particular deformable region. In particular, Block S350 detects an input at a sensor, such as a touch sensor integrated in a touchscreen display of the mobile computing device (e.g., a capacitive, resistive, or optical touch sensor). Alternatively, Block S350 can detect the input at a pressure sensor by detecting a change in pressure of the fluid in the cavity. An increase in pressure of the fluid in the cavity corresponds to depression of the deformable region into the cavity and, thus, an input to the dynamic tactile interface.
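- Block S350's pressure-based alternative amounts to watching for a step increase in cavity fluid pressure above the expanded-and-untouched baseline. The sketch below assumes a sampled pressure signal, a baseline, and a threshold; none of these values are specified by the description.

```python
# Illustrative sketch of Block S350 using cavity fluid pressure: depression of
# the expanded deformable region raises pressure above the baseline.
# Baseline and threshold values are assumptions.

def detect_press(samples_kpa: list[float],
                 baseline_kpa: float,
                 threshold_kpa: float = 2.0) -> bool:
    """Return True if any sample rises above the baseline by the threshold."""
    return any(p - baseline_kpa > threshold_kpa for p in samples_kpa)

baseline = 101.3  # assumed pressure with the region expanded and untouched
print(detect_press([101.4, 101.6, 104.1, 103.8], baseline))  # True: input detected
```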
- Generally, method S300 functions to register interaction with the dynamic tactile interface by: detecting an orientation of the device in Block S310; identifying an anticipated input corresponding to a native application currently executing on the device, the anticipated input associated with an input location of the device, in Block S320; selecting a particular deformable region from a set of deformable regions, the particular deformable region corresponding to the anticipated input and adjacent the input location, in Block S330; selectively transitioning the particular deformable region from a retracted setting substantially flush with the peripheral region to an expanded setting tactilely distinguishable from the peripheral region in Block S340; and detecting an input, corresponding to the anticipated input, to the particular deformable region in Block S350.
- One example of method S300 includes detecting a mobile phone held by a user in a landscape orientation in Block S310. Block S310 can detect the mobile phone held by two hands of the user, the mobile phone situated between a thumb and an index finger of each hand as shown in
FIG. 16 . In Block S320, method S300 can detect a native camera application executing on the phone and anticipate a future input corresponding to selection of a shutter button to save an image captured by a lens and rendered by the native camera application on a display of the mobile phone. Block S320 further detects an anticipated input location of the future input corresponding to the location of one of the index fingers. Block S330 can select the deformable region at a location corresponding to the anticipated input location, and Block S340 can expand the deformable region. Thus, Blocks S330 and S340 can function to form a tactilely distinguishable shutter button substantially underneath the index finger that is resting on a surface of (and holding) the mobile phone. Block S350 can detect depression of the tactilely distinguishable shutter button and trigger image capture with the camera accordingly. - In a similar example, method S300 can include detecting the orientation of the mobile phone (e.g., in a portrait orientation) in Block S310. Block S320 can detect a camera application executing on the mobile phone, the camera application capturing an image detected by a forward-facing camera built into a face of the mobile phone proximal the display. Block S320 can anticipate an input, such as selection of a virtual shutter button in order to capture the image with the forward-facing camera (i.e., a “selfie”), as shown in
FIG. 17 . The input location can correspond to the virtual shutter button rendered by the display. The virtual shutter button can be located at a center of the display, proximal an edge of the display. Alternatively, the input location can correspond to any location on any surface of the mobile device. For example, the input location can be centered on the display corresponding to an ergonomic location for contact by a finger (e.g., a thumb). The input location can also be arranged adjacent a finger holding the mobile phone and contacting a surface outside the display (e.g., an edge of the phone). Block S330 can select the particular deformable region corresponding to the ergonomic location and Block S340 can expand the deformable region into a tactilely distinguishable dome. Thus, Blocks S330 and S340 function to deploy a physical shutter button and Block S350 can detect depression of the physical shutter button, which can trigger the camera application to capture the image detected by the forward-facing camera. - In another example, method S300 can include detecting the orientation of the mobile computing device with an accelerometer or other orientation-detecting sensor. Block S310 of method S300 can detect the minor axis of the mobile computing device substantially parallel a horizontal plane, thereby defining a portrait orientation. Block S310 of method S300 can also detect the major axis of the mobile phone substantially parallel a horizontal plane, thereby defining a landscape orientation as shown in
FIG. 18 . Block S320 can identify an input to a key of a virtual keyboard as an anticipated input to a native application that renders the virtual keyboard on the touchscreen of the mobile computing device. Block S320 can predict the orientation of the virtual keyboard in response to the orientation of the mobile computing device detected in Block S310. For example, Block S320 can identify an anticipated input to a portrait keyboard in response to detection of the mobile computing device in the portrait orientation. Likewise, Block S320 can identify an anticipated input to a landscape keyboard in response to detection of the mobile computing device in the landscape orientation. Block S330 can select a set of particular deformable regions corresponding to (e.g., arranged over) each key of the virtual keyboard rendered by the touchscreen. Block S340 can selectively transition the set of particular deformable regions to an expanded setting, thereby rendering a physical keyboard of deformable regions in an orientation corresponding to the orientation of the device. - In another example shown in
FIG. 13 , method S300 can detect a music application executing on the mobile computing device and expand a deformable region corresponding to (e.g., adjacent, coincident) a volume control (e.g., a volume slider) in anticipation of an input to modify a volume output by the device and/or a native application executing thereon. Method S300 can detect an input object proximal a surface of the mobile computing device. For example, method S300 can detect a finger resting on a surface opposite the touchscreen (e.g., a back surface of the mobile computing device). Method S300 can identify the anticipated input that changes the volume output as a slide gesture across the tactile interface. Method S300 can select a particular deformable region or set of deformable regions that define a substantially elongated and tactilely distinguishable button on which the user can enter the slide gesture and that are located substantially coincident the input object, such as adjacent a finger resting on a back surface of the mobile computing device opposite a touchscreen. - In another example, method S300 can detect an input object proximal a surface of the device and, upon detection of the input object contacting the device, method S300 can expand the particular deformable region coincident the input object. Method S300 can identify an anticipated input corresponding to a command to wake a “sleeping” device (e.g., a device in a low energy mode). For example, method S300 can anticipate depression of a wake button on the “sleeping” device. The “sleeping” device can be powered on (e.g., consuming energy from a battery and executing programs) but a touchscreen of the device can be disabled until the command to wake the “sleeping” device enables the touchscreen. Method S300 can detect the input object proximal or coincident a surface of the device. For example, method S300 can detect a hand or finger resting on the device as would occur if one were to hold the device in the hand. Accordingly, method S300 can select the particular deformable region coincident or adjacent the input object and selectively expand the particular deformable region. Method S300 can detect depression of the particular deformable region and interpret depression of the particular deformable region as a command to wake the “sleeping” device accordingly.
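- The selection logic running through these examples (Block S330 and the volume-slider case just described) pairs the anticipated gesture with a region shape and a location near the detected input object. The region records, field names, and distance rule below are illustrative assumptions only.

```python
# Illustrative sketch of Block S330-style selection: choose a deformable region
# whose shape suits the anticipated gesture and whose location is nearest the
# detected input object. Region records and field names are hypothetical.

REGIONS = [
    {"id": "dome_back_upper", "shape": "dome", "xy": (10.0, 120.0)},
    {"id": "ridge_back_mid", "shape": "elongated", "xy": (12.0, 80.0)},
    {"id": "ridge_side_right", "shape": "elongated", "xy": (65.0, 90.0)},
]

def select_region(anticipated_gesture: str, finger_xy: tuple[float, float]) -> dict:
    # A slide gesture favors an elongated ridge; a discrete press favors a dome.
    wanted_shape = "elongated" if anticipated_gesture == "slide" else "dome"
    candidates = [r for r in REGIONS if r["shape"] == wanted_shape] or REGIONS
    def dist(r):
        dx, dy = r["xy"][0] - finger_xy[0], r["xy"][1] - finger_xy[1]
        return dx * dx + dy * dy
    return min(candidates, key=dist)

print(select_region("slide", (15.0, 85.0))["id"])  # 'ridge_back_mid'
```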
- An example of this variation includes expanding a deformable region corresponding to an icon indicating receipt of an incoming message as shown in
FIG. 19 . In particular, method S300 includes detecting an incoming message to a native messaging application executing on the computing device. In response to the incoming message, method S300 identifies an anticipated output from the computing device and the native messaging application corresponding to a notification indicating receipt of the incoming message. For example, method S300 can anticipate an icon rendered by the touchscreen in response to receipt of the incoming message. The icon can include an abbreviated version of the message. Method S300 can further anticipate an input corresponding to the icon, such as a slide gesture substantially over the icon. Method S300 can unlock a lock screen and open the message in response to detection of the slide gesture input into the device. Method S300 can further select a particular deformable region corresponding to the icon (e.g., of substantially the same shape as the icon) and selectively expand the deformable region to an expanded setting in anticipation of the slide input. Method S300 can also detect the slide input, which can be applied to the deformable region and, thus, the icon. - Another example of the variation includes expanding the particular deformable region in response to an incoming phone call, the particular deformable region corresponding to an anticipated input that answers the incoming phone call. In particular, method S300 can detect an incoming phone call and, thus, render a notification on the display to notify the user of the incoming phone call. For example, method S300 can render a virtual icon on a touchscreen of the device to prompt the user to answer the phone call. Additionally, method S300 can selectively expand a particular deformable region arranged over the virtual icon. Alternatively, method S300 can select and expand a particular deformable region corresponding to an anticipated input location, such as a surface of the device where an input object (e.g., the user's finger) is in contact with the device prior and up to the time of the incoming phone call. Thus, the method can raise a particular deformable region adjacent a surface of the device that the user is already touching, and the user can answer the phone call by depressing the particular deformable region thus raised under or adjacent the user's finger.
- In another example, method S300 can detect an external surface, such as a surface on which the device rests, and selectively deform the particular deformable region(s) opposite the external surface. For example, a mobile phone can rest on a surface of a table with the touchscreen of the mobile phone contacting the surface of the table. Method S300 can detect the surface of the table proximal the touchscreen. In response to receipt of an incoming phone call, method S300 can identify a notification notifying the user of the phone call, a location of the notification corresponding to a surface of the mobile phone opposite the external surface (e.g., the back of the phone), and an anticipated input corresponding to answering the incoming phone call. Thus, method S300 can select the particular deformable region corresponding to the location of the notification opposite the external surface (e.g., the back of the phone) and transition the deformable region to an expanded setting, thereby indicating the incoming phone call and providing a tactile feature on which a user can apply the anticipated input.
- Another example of the variation includes expanding the particular deformable region corresponding to an icon representing a local area wireless technology or short-range wireless communication rendered by the touchscreen of the mobile computing device in response to short-range wireless communication (e.g., Bluetooth) between the mobile computing device and a secondary device, as shown in
FIG. 15 . In particular, in this example, method S300 detects a short-range wireless communication application executing on the mobile computing device. Method S300 can detect an event corresponding to the secondary device within an area proximal the mobile computing device. The secondary device can also execute a native short-range wireless communication application or emit a short-range wireless communication signal that is detectable by the mobile computing device when the secondary device is within wireless range of the mobile computing device. In response to detection of the secondary device (e.g., detection of a wireless signal from the secondary device), method S300 can render on the display of the mobile computing device an interface through which the user can confirm continued short-range wireless communication between the mobile computing device and the secondary device. Method S300 can select a deformable region substantially corresponding to the interface and selectively expand the deformable region, thereby yielding a raised button with which a user can interact to confirm continued wireless communication with the secondary device. The interface can correspond to an image of an icon rendered on a touchscreen within the mobile computing device, the icon graphically representing the short-range wireless communication between the devices. For example, the icon can include a list of devices (or local area networks) within the area proximal the mobile computing device, from which the user can select one or more devices (or local area networks) with which the mobile computing device may communicate. Alternatively, the interface can be represented over a region of the mobile computing device distinct from the touchscreen portion, such as a side or back surface of the mobile computing device. - In another example of the variation, method S300 can retract the deformable region(s) and disable input(s) to the mobile computing device in response to receipt of a signal from a third-party device indicating that the mobile computing device was lost or stolen. In particular, method S300 can detect a phone tracking application executing on the mobile computing device. Method S300 can detect a message from a third-party device indicating that the owner of the mobile computing device no longer possesses the mobile computing device. Thus, with the phone tracking application, method S300 tracks the location of the mobile computing device and can disable interactive features of the mobile computing device. Method S300 can disable inputs and outputs to the mobile computing device and can thus selectively transition expanded deformable regions to the retracted setting.
- The systems and methods of the embodiments can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions. The instructions can be executed by computer-executable components integrated with the application, applet, host, server, network, website, communication service, communication interface, native application, frame, iframe, hardware/firmware/software elements of a user computer or mobile device, or any suitable combination thereof. Other systems and methods of the embodiments can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions, the instructions executed by computer-executable components integrated with apparatuses and networks of the type described above. The computer-readable medium can be stored on any suitable computer-readable media such as RAM, ROM, flash memory, EEPROM, optical devices (CD or DVD), hard drives, floppy drives, or any suitable device. The computer-executable component can be a processor, though any suitable dedicated hardware device can (alternatively or additionally) execute the instructions.
- As a person skilled in the art will recognize from the previous detailed description and from the figures and claims, modifications and changes can be made to the preferred embodiments of the invention without departing from the scope of this invention defined in the following claims.
Claims (1)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/046,123 US20160162064A1 (en) | 2008-01-04 | 2016-02-17 | Method for actuating a tactile interface layer |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/969,848 US8547339B2 (en) | 2008-01-04 | 2008-01-04 | System and methods for raised touch screens |
US12/319,334 US8154527B2 (en) | 2008-01-04 | 2009-01-05 | User interface system |
US12/497,622 US8179375B2 (en) | 2008-01-04 | 2009-07-03 | User interface system and method |
US201361871264P | 2013-08-28 | 2013-08-28 | |
US14/471,889 US9298261B2 (en) | 2008-01-04 | 2014-08-28 | Method for actuating a tactile interface layer |
US15/046,123 US20160162064A1 (en) | 2008-01-04 | 2016-02-17 | Method for actuating a tactile interface layer |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/471,889 Continuation US9298261B2 (en) | 2008-01-04 | 2014-08-28 | Method for actuating a tactile interface layer |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160162064A1 true US20160162064A1 (en) | 2016-06-09 |
Family
ID=52667510
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/471,889 Expired - Fee Related US9298261B2 (en) | 2008-01-04 | 2014-08-28 | Method for actuating a tactile interface layer |
US15/046,123 Abandoned US20160162064A1 (en) | 2008-01-04 | 2016-02-17 | Method for actuating a tactile interface layer |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/471,889 Expired - Fee Related US9298261B2 (en) | 2008-01-04 | 2014-08-28 | Method for actuating a tactile interface layer |
Country Status (1)
Country | Link |
---|---|
US (2) | US9298261B2 (en) |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170192663A1 (en) * | 2014-09-25 | 2017-07-06 | Intel Corporation | Touch-based link initialization and data transfer |
US9791929B2 (en) * | 2014-10-31 | 2017-10-17 | Elwha Llc | Tactile control system |
US10459537B2 (en) * | 2015-09-30 | 2019-10-29 | Stmicroelectronics, Inc. | Encapsulated pressure sensor |
KR20170043884A (en) | 2015-10-14 | 2017-04-24 | 삼성전자주식회사 | Electronic device including portective member |
JP6685695B2 (en) * | 2015-10-30 | 2020-04-22 | キヤノン株式会社 | Terminal and imaging device |
US10401962B2 (en) | 2016-06-21 | 2019-09-03 | Immersion Corporation | Haptically enabled overlay for a pressure sensitive surface |
JP6734152B2 (en) * | 2016-08-29 | 2020-08-05 | 京セラ株式会社 | Electronic device, control device, control program, and operating method of electronic device |
DK201670728A1 (en) | 2016-09-06 | 2018-03-19 | Apple Inc | Devices, Methods, and Graphical User Interfaces for Providing Feedback During Interaction with an Intensity-Sensitive Button |
US20180157395A1 (en) * | 2016-12-07 | 2018-06-07 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US10365818B2 (en) * | 2017-09-20 | 2019-07-30 | Synaptics Incorporated | Force transfer element for edge force sensing |
US20190302986A1 (en) * | 2018-03-30 | 2019-10-03 | Canon Kabushiki Kaisha | Operation apparatus and method for controlling the same |
US10871848B2 (en) | 2018-09-13 | 2020-12-22 | Sensel, Inc. | Method and apparatus for variable impedence touch sensor array gesture recognition |
US11003274B2 (en) | 2018-09-13 | 2021-05-11 | Sensel, Inc. | Method and apparatus for automotive variable impedance touch sensor array |
US10891050B2 (en) * | 2018-09-13 | 2021-01-12 | Sensel, Inc. | Method and apparatus for variable impedance touch sensor arrays in non-planar controls |
US10990223B2 (en) | 2018-09-13 | 2021-04-27 | Sensel, Inc. | Method and apparatus for variable impedence touch sensor array force aware interaction in large surface devices |
DE112019005472T5 (en) * | 2018-11-02 | 2021-10-28 | Sony Group Corporation | ELECTRONIC DEVICE AND SHAPE CHANGING SYSTEM |
WO2020139815A1 (en) | 2018-12-26 | 2020-07-02 | Sensel, Inc. | Method and apparatus for variable impedance touch sensor array force aware interaction with handheld display devices |
US11194415B2 (en) | 2019-01-03 | 2021-12-07 | Sensel, Inc. | Method and apparatus for indirect force aware touch control with variable impedance touch sensor arrays |
US11541305B2 (en) * | 2020-09-28 | 2023-01-03 | Snap Inc. | Context-sensitive remote eyewear controller |
Family Cites Families (441)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
NL270860A (en) | 1960-10-31 | 1900-01-01 | ||
US3659354A (en) | 1970-10-21 | 1972-05-02 | Mitre Corp | Braille display device |
US3780236A (en) | 1971-05-18 | 1973-12-18 | Gen Signal Corp | Push button switch assembly with slidable interlocking means preventing simultaneous operation of two or more pushbuttons |
US3759108A (en) | 1971-09-16 | 1973-09-18 | Gen Electric | Single gauge multi-time constant and multi-tissue ratio automatic decompression instruments |
US3818487A (en) | 1972-08-24 | 1974-06-18 | W Brody | Soft control materials |
US4109118A (en) | 1976-09-01 | 1978-08-22 | Victor Kley | Keyswitch pad |
US4209819A (en) | 1978-03-13 | 1980-06-24 | Key Tronic Corporation | Capacitive keyswitch |
US4290343A (en) | 1978-10-30 | 1981-09-22 | Mts Systems Corporation | High volume poppet valve with orifice opening speed control |
US4307268A (en) | 1978-11-17 | 1981-12-22 | Rogers Corporation | Tactile element and keyboard including the tactile element |
US4517421A (en) | 1980-01-28 | 1985-05-14 | Margolin George D | Resilient deformable keyboard |
US4543000A (en) | 1981-10-13 | 1985-09-24 | Hasenbalg Ralph D | Latching actuator |
US4467321A (en) | 1982-04-30 | 1984-08-21 | Volnak William M | Chording keyboard for generating binary data |
US4477700A (en) | 1983-11-14 | 1984-10-16 | Rogers Corporation | Tactile membrane keyboard with elliptical tactile key elements |
US4584625A (en) | 1984-09-11 | 1986-04-22 | Kellogg Nelson R | Capacitive tactile sensor |
AT387100B (en) | 1986-05-06 | 1988-11-25 | Siemens Ag Oesterreich | TACTILE DOTS OR PICTURE DISPLAY |
JPH0439613Y2 (en) | 1986-05-23 | 1992-09-17 | ||
US5194852A (en) | 1986-12-01 | 1993-03-16 | More Edward S | Electro-optic slate for direct entry and display and/or storage of hand-entered textual and graphic information |
US4920343A (en) | 1988-09-30 | 1990-04-24 | Honeywell Inc. | Capacitive keyswitch membrane with self contained sense-to-ground capacitance |
US4940734A (en) | 1988-11-23 | 1990-07-10 | American Cyanamid | Process for the preparation of porous polymer beads |
GB2239376A (en) | 1989-12-18 | 1991-06-26 | Ibm | Touch sensitive display |
US5631861A (en) | 1990-02-02 | 1997-05-20 | Virtual Technologies, Inc. | Force feedback and texture simulating interface device |
DE4012267A1 (en) | 1990-03-13 | 1991-11-28 | Joerg Fricke | DEVICE FOR TASTABLE PRESENTATION OF INFORMATION |
US5212473A (en) | 1991-02-21 | 1993-05-18 | Typeright Keyboard Corp. | Membrane keyboard and method of using same |
DE4133000C2 (en) | 1991-10-04 | 1993-11-18 | Siegfried Dipl Ing Kipke | Piezo-hydraulic module for the implementation of tactile information |
US5195659A (en) | 1991-11-04 | 1993-03-23 | Eiskant Ronald E | Discreet amount toothpaste dispenser |
US5369228A (en) | 1991-11-30 | 1994-11-29 | Signagraphics Corporation | Data input device with a pressure-sensitive input surface |
US5880411A (en) | 1992-06-08 | 1999-03-09 | Synaptics, Incorporated | Object position detector with edge motion feature and gesture recognition |
US5488204A (en) | 1992-06-08 | 1996-01-30 | Synaptics, Incorporated | Paintbrush stylus for capacitive touch sensor pad |
US5889236A (en) | 1992-06-08 | 1999-03-30 | Synaptics Incorporated | Pressure sensitive scrollbar feature |
US5412189A (en) | 1992-12-21 | 1995-05-02 | International Business Machines Corporation | Touch screen apparatus with tactile information |
AU6018494A (en) | 1993-05-21 | 1994-12-20 | Arthur D. Little Enterprises, Inc. | User-configurable control device |
US5721566A (en) | 1995-01-18 | 1998-02-24 | Immersion Human Interface Corp. | Method and apparatus for providing damping force feedback |
US6437771B1 (en) | 1995-01-18 | 2002-08-20 | Immersion Corporation | Force feedback device including flexure member between actuator and user object |
US5739811A (en) | 1993-07-16 | 1998-04-14 | Immersion Human Interface Corporation | Method and apparatus for controlling human-computer interface systems providing force feedback |
US5767839A (en) | 1995-01-18 | 1998-06-16 | Immersion Human Interface Corporation | Method and apparatus for providing passive force feedback to human-computer interface systems |
US5731804A (en) | 1995-01-18 | 1998-03-24 | Immersion Human Interface Corp. | Method and apparatus for providing high bandwidth, low noise mechanical I/O for computer systems |
US5459461A (en) | 1993-07-29 | 1995-10-17 | Crowley; Robert J. | Inflatable keyboard |
US5623582A (en) | 1994-07-14 | 1997-04-22 | Immersion Human Interface Corporation | Computer interface or control input device for laparoscopic surgical instrument and other elongated mechanical objects |
US5496174A (en) | 1994-08-04 | 1996-03-05 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Method and device for producing a tactile display using an electrorheological fluid |
US5717423A (en) | 1994-12-30 | 1998-02-10 | Merltec Innovative Research | Three-dimensional display |
US6850222B1 (en) | 1995-01-18 | 2005-02-01 | Immersion Corporation | Passive force feedback for computer interface devices |
NL9500589A (en) | 1995-03-28 | 1996-11-01 | Tieman Bv F J | Braille cell with an actuator containing a mechanically active, intrinsically conductive polymer. |
US6166723A (en) | 1995-11-17 | 2000-12-26 | Immersion Corporation | Mouse interface device providing force feedback |
US7113166B1 (en) | 1995-06-09 | 2006-09-26 | Immersion Corporation | Force feedback devices using fluid braking |
US8228305B2 (en) | 1995-06-29 | 2012-07-24 | Apple Inc. | Method for providing human input to a computer |
US7973773B2 (en) | 1995-06-29 | 2011-07-05 | Pryor Timothy R | Multipoint, virtual control, and force based touch screen applications |
US5959613A (en) | 1995-12-01 | 1999-09-28 | Immersion Corporation | Method and apparatus for shaping force signals for a force feedback device |
JP3524247B2 (en) | 1995-10-09 | 2004-05-10 | 任天堂株式会社 | Game machine and game machine system using the same |
US6384743B1 (en) | 1999-06-14 | 2002-05-07 | Wisconsin Alumni Research Foundation | Touch screen for the vision-impaired |
US5754023A (en) | 1995-10-26 | 1998-05-19 | Cybernet Systems Corporation | Gyro-stabilized platforms for force-feedback applications |
JPH09146708A (en) | 1995-11-09 | 1997-06-06 | Internatl Business Mach Corp <Ibm> | Driving method for touch panel and touch input method |
US6639581B1 (en) | 1995-11-17 | 2003-10-28 | Immersion Corporation | Flexure mechanism for interface device |
US6100874A (en) | 1995-11-17 | 2000-08-08 | Immersion Corporation | Force feedback mouse interface |
US5825308A (en) | 1996-11-26 | 1998-10-20 | Immersion Human Interface Corporation | Force feedback interface having isotonic and isometric functionality |
JP2000501033A (en) | 1995-11-30 | 2000-02-02 | ヴァーチャル テクノロジーズ インコーポレイテッド | Human / machine interface with tactile feedback |
US6028593A (en) | 1995-12-01 | 2000-02-22 | Immersion Corporation | Method and apparatus for providing simulated physical interactions within computer generated environments |
US7027032B2 (en) | 1995-12-01 | 2006-04-11 | Immersion Corporation | Designing force sensations for force feedback computer applications |
US6169540B1 (en) | 1995-12-01 | 2001-01-02 | Immersion Corporation | Method and apparatus for designing force sensations in force feedback applications |
US6219032B1 (en) | 1995-12-01 | 2001-04-17 | Immersion Corporation | Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface |
US6300936B1 (en) | 1997-11-14 | 2001-10-09 | Immersion Corporation | Force feedback system including multi-tasking graphical host environment and interface device |
US6078308A (en) | 1995-12-13 | 2000-06-20 | Immersion Corporation | Graphical click surfaces for force feedback applications to provide user selection using cursor interaction with a trigger position within a boundary of a graphical object |
US6374255B1 (en) | 1996-05-21 | 2002-04-16 | Immersion Corporation | Haptic authoring |
US7629969B2 (en) | 1996-08-12 | 2009-12-08 | Tyco Electronics Corporation | Acoustic condition sensor employing a plurality of mutually non-orthogonal waves |
US6407757B1 (en) | 1997-12-18 | 2002-06-18 | E-Book Systems Pte Ltd. | Computer-based browsing method and computer program product for displaying information in an electronic book form |
US6024576A (en) | 1996-09-06 | 2000-02-15 | Immersion Corporation | Hemispherical, high bandwidth mechanical interface for computer systems |
JP3842876B2 (en) | 1996-09-27 | 2006-11-08 | 株式会社リコー | Digital camera |
US6411276B1 (en) | 1996-11-13 | 2002-06-25 | Immersion Corporation | Hybrid control of haptic feedback for host computer and interface device |
US7489309B2 (en) | 1996-11-26 | 2009-02-10 | Immersion Corporation | Control knob with multiple degrees of freedom and force feedback |
US6154201A (en) | 1996-11-26 | 2000-11-28 | Immersion Corporation | Control knob with multiple degrees of freedom and force feedback |
US6686911B1 (en) | 1996-11-26 | 2004-02-03 | Immersion Corporation | Control knob with control modes and force feedback |
US6278441B1 (en) | 1997-01-09 | 2001-08-21 | Virtouch, Ltd. | Tactile interface system for electronic data display system |
CA2278726C (en) | 1997-01-27 | 2004-08-31 | Immersion Corporation | Method and apparatus for providing high bandwidth, realistic force feedback including an improved actuator |
JPH10255106A (en) | 1997-03-10 | 1998-09-25 | Toshiba Corp | Touch panel, touch panel input device and automatic teller machine |
US5982304A (en) | 1997-03-24 | 1999-11-09 | International Business Machines Corporation | Piezoelectric switch with tactile response |
US7091948B2 (en) | 1997-04-25 | 2006-08-15 | Immersion Corporation | Design of force sensations for haptic feedback computer interfaces |
US6243074B1 (en) | 1997-08-29 | 2001-06-05 | Xerox Corporation | Handedness detection for a physical manipulatory grammar |
US6268857B1 (en) | 1997-08-29 | 2001-07-31 | Xerox Corporation | Computer user interface using a physical manipulatory grammar |
US5917906A (en) | 1997-10-01 | 1999-06-29 | Ericsson Inc. | Touch pad with tactile feature |
GB2332940A (en) | 1997-10-17 | 1999-07-07 | Patrick Eldridge | Mouse pad |
US6088019A (en) | 1998-06-23 | 2000-07-11 | Immersion Corporation | Low cost force feedback device with actuator for non-primary axis |
US8020095B2 (en) | 1997-11-14 | 2011-09-13 | Immersion Corporation | Force feedback system including multi-tasking graphical host environment |
US6211861B1 (en) | 1998-06-23 | 2001-04-03 | Immersion Corporation | Tactile mouse device |
US6243078B1 (en) | 1998-06-23 | 2001-06-05 | Immersion Corporation | Pointing device with forced feedback button |
US6448977B1 (en) | 1997-11-14 | 2002-09-10 | Immersion Corporation | Textures and other spatial sensations for a relative haptic interface device |
US6256011B1 (en) | 1997-12-03 | 2001-07-03 | Immersion Corporation | Multi-function control device with force feedback |
US6667738B2 (en) | 1998-01-07 | 2003-12-23 | Vtech Communications, Ltd. | Touch screen overlay apparatus |
US6160540A (en) | 1998-01-12 | 2000-12-12 | Xerox Corporation | Zoomorphic computer user interface |
EP1717684A3 (en) | 1998-01-26 | 2008-01-23 | Fingerworks, Inc. | Method and apparatus for integrating manual input |
US7663607B2 (en) | 2004-05-06 | 2010-02-16 | Apple Inc. | Multipoint touchscreen |
US20060033724A1 (en) | 2004-07-30 | 2006-02-16 | Apple Computer, Inc. | Virtual input device placement on a touch screen user interface |
US8479122B2 (en) | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
EP1051698B1 (en) | 1998-01-28 | 2018-01-17 | Immersion Medical, Inc. | Interface device and method for interfacing instruments to vascular access simulation systems |
US6100541A (en) | 1998-02-24 | 2000-08-08 | Caliper Technologies Corporation | Microfluidic devices and systems incorporating integrated optical elements |
US5977867A (en) | 1998-05-29 | 1999-11-02 | Nortel Networks Corporation | Touch pad panel with tactile feedback |
US6369803B2 (en) | 1998-06-12 | 2002-04-09 | Nortel Networks Limited | Active edge user interface |
US6429846B2 (en) | 1998-06-23 | 2002-08-06 | Immersion Corporation | Haptic feedback for touchpads and other touch controls |
US6697043B1 (en) | 1999-12-21 | 2004-02-24 | Immersion Corporation | Haptic interface device and actuator assembly providing linear haptic sensations |
US6188391B1 (en) | 1998-07-09 | 2001-02-13 | Synaptics, Inc. | Two-layer capacitive touchpad and method of making same |
JP2000029612A (en) | 1998-07-15 | 2000-01-28 | Smk Corp | Touch panel input device |
JP2000029611A (en) | 1998-07-15 | 2000-01-28 | Smk Corp | Touch panel input device |
US6681031B2 (en) | 1998-08-10 | 2004-01-20 | Cybernet Systems Corporation | Gesture-controlled interfaces for self-service machines and other applications |
US6359572B1 (en) | 1998-09-03 | 2002-03-19 | Microsoft Corporation | Dynamic keyboard |
US6354839B1 (en) | 1998-10-10 | 2002-03-12 | Orbital Research, Inc. | Refreshable braille display system |
US7038667B1 (en) | 1998-10-26 | 2006-05-02 | Immersion Corporation | Mechanisms for control knobs and other interface devices |
US6218966B1 (en) | 1998-11-05 | 2001-04-17 | International Business Machines Corporation | Tactile feedback keyboard |
US6756970B2 (en) | 1998-11-20 | 2004-06-29 | Microsoft Corporation | Pen-based computer system |
GB2345193B (en) | 1998-12-22 | 2002-07-24 | Nokia Mobile Phones Ltd | Metallic keys |
CA2278832A1 (en) | 1999-01-06 | 2000-07-06 | Vtech Communications, Ltd. | Touch screen overlay apparatus |
US7124425B1 (en) | 1999-03-08 | 2006-10-17 | Immersion Entertainment, L.L.C. | Audio/video system and method utilizing a head mounted apparatus with noise attenuation |
JP3817965B2 (en) | 1999-04-21 | 2006-09-06 | 富士ゼロックス株式会社 | Detection device |
US6377685B1 (en) | 1999-04-23 | 2002-04-23 | Ravi C. Krishnan | Cluster key arrangement |
US6903721B2 (en) | 1999-05-11 | 2005-06-07 | Immersion Corporation | Method and apparatus for compensating for position slip in interface devices |
CA2273113A1 (en) | 1999-05-26 | 2000-11-26 | Tactex Controls Inc. | Touch pad using a non-electrical deformable pressure sensor |
US7210160B2 (en) | 1999-05-28 | 2007-04-24 | Immersion Entertainment, L.L.C. | Audio/video programming and charging system and method |
US7151528B2 (en) | 1999-06-22 | 2006-12-19 | Cirque Corporation | System for disposing a proximity sensitive touchpad behind a mobile phone keypad |
US7144616B1 (en) | 1999-06-28 | 2006-12-05 | California Institute Of Technology | Microfabricated elastomeric valve and pump systems |
US6929030B2 (en) | 1999-06-28 | 2005-08-16 | California Institute Of Technology | Microfabricated elastomeric valve and pump systems |
US6899137B2 (en) | 1999-06-28 | 2005-05-31 | California Institute Of Technology | Microfabricated elastomeric valve and pump systems |
US6501462B1 (en) | 1999-07-01 | 2002-12-31 | Gateway, Inc. | Ergonomic touch pad |
US6982696B1 (en) | 1999-07-01 | 2006-01-03 | Immersion Corporation | Moving magnet actuator for providing haptic feedback |
US8169402B2 (en) | 1999-07-01 | 2012-05-01 | Immersion Corporation | Vibrotactile haptic feedback devices |
US7561142B2 (en) | 1999-07-01 | 2009-07-14 | Immersion Corporation | Vibrotactile haptic feedback devices |
US6337678B1 (en) | 1999-07-21 | 2002-01-08 | Tactiva Incorporated | Force feedback computer input and output device with coordinated haptic elements |
US6529183B1 (en) | 1999-09-13 | 2003-03-04 | Interval Research Corp. | Manual interface combining continuous and discrete capabilities |
DE20080209U1 (en) | 1999-09-28 | 2001-08-09 | Immersion Corp | Control of haptic sensations for interface devices with vibrotactile feedback |
US6680729B1 (en) | 1999-09-30 | 2004-01-20 | Immersion Corporation | Increasing force transmissibility for tactile feedback interface devices |
FI19992510A (en) | 1999-11-24 | 2001-05-25 | Nokia Mobile Phones Ltd | Electronic device and method in the electronic device |
US6693626B1 (en) | 1999-12-07 | 2004-02-17 | Immersion Corporation | Haptic feedback using a keyboard device |
US6509892B1 (en) | 1999-12-17 | 2003-01-21 | International Business Machines Corporation | Method, system and program for topographical interfacing |
US20010008396A1 (en) | 2000-01-14 | 2001-07-19 | Nobuhiro Komata | Recording medium, computer and method for selecting computer display items |
US6573844B1 (en) | 2000-01-18 | 2003-06-03 | Microsoft Corporation | Predictive keyboard |
US6822635B2 (en) | 2000-01-19 | 2004-11-23 | Immersion Corporation | Haptic interface for laptop computers and other portable devices |
AU2001244340A1 (en) | 2000-03-30 | 2001-10-15 | Electrotextiles Company Limited | Input device |
US6924787B2 (en) | 2000-04-17 | 2005-08-02 | Immersion Corporation | Interface for controlling a graphical image |
US6937225B1 (en) | 2000-05-15 | 2005-08-30 | Logitech Europe S.A. | Notification mechanisms on a control device |
AU2001264781A1 (en) | 2000-05-22 | 2001-12-17 | Digit Wireless, Llc | Input devices and their use |
US7196688B2 (en) | 2000-05-24 | 2007-03-27 | Immersion Corporation | Haptic devices using electroactive polymers |
FR2810779B1 (en) | 2000-06-21 | 2003-06-13 | Commissariat Energie Atomique | Element with evolving relief |
US7159008B1 (en) | 2000-06-30 | 2007-01-02 | Immersion Corporation | Chat interface with haptic feedback functionality |
US7233476B2 (en) | 2000-08-11 | 2007-06-19 | Immersion Corporation | Actuator thermal protection in haptic feedback devices |
DE10046099A1 (en) | 2000-09-18 | 2002-04-04 | Siemens Ag | Touch sensitive display with tactile feedback |
US6683627B1 (en) | 2000-09-28 | 2004-01-27 | International Business Machines Corporation | Scroll box controls |
US7182691B1 (en) | 2000-09-28 | 2007-02-27 | Immersion Corporation | Directional inertial tactile feedback using rotating masses |
AU2002213043A1 (en) | 2000-10-06 | 2002-04-15 | Protasis Corporation | Fluid separation conduit cartridge |
US7006081B2 (en) | 2000-10-20 | 2006-02-28 | Elo Touchsystems, Inc. | Acoustic touch sensor with laminated substrate |
US7463249B2 (en) | 2001-01-18 | 2008-12-09 | Illinois Tool Works Inc. | Acoustic wave touch actuated switch with feedback |
US6949176B2 (en) | 2001-02-28 | 2005-09-27 | Lightwave Microsystems Corporation | Microfluidic control using dielectric pumping |
US7567232B2 (en) | 2001-03-09 | 2009-07-28 | Immersion Corporation | Method of using tactile feedback to deliver silent status information to a user of an electronic device |
US6819316B2 (en) | 2001-04-17 | 2004-11-16 | 3M Innovative Properties Company | Flexible capacitive touch sensor |
US6636202B2 (en) | 2001-04-27 | 2003-10-21 | International Business Machines Corporation | Interactive tactile display for computer screen |
US7202851B2 (en) | 2001-05-04 | 2007-04-10 | Immersion Medical Inc. | Haptic interface for palpation simulation |
US6924752B2 (en) | 2001-05-30 | 2005-08-02 | Palmone, Inc. | Three-dimensional contact-sensitive feature for electronic devices |
US6937033B2 (en) | 2001-06-27 | 2005-08-30 | Immersion Corporation | Position sensor with resistive element |
US7154470B2 (en) | 2001-07-17 | 2006-12-26 | Immersion Corporation | Envelope modulator for haptic feedback devices |
US6700556B2 (en) | 2001-07-26 | 2004-03-02 | Xerox Corporation | Display sheet with stacked electrode structure |
JP3708508B2 (en) | 2001-08-23 | 2005-10-19 | 株式会社アイム | Fingertip tactile input device and portable information terminal using the same |
US6937229B2 (en) | 2001-08-28 | 2005-08-30 | Kevin Murphy | Keycap for displaying a plurality of indicia |
US6995745B2 (en) | 2001-09-13 | 2006-02-07 | E-Book Systems Pte Ltd. | Electromechanical information browsing device |
US7151432B2 (en) | 2001-09-19 | 2006-12-19 | Immersion Corporation | Circuit and method for a switch matrix and switch sensing |
US6703550B2 (en) | 2001-10-10 | 2004-03-09 | Immersion Corporation | Sound data output and manipulation using haptic feedback |
AU2002336708A1 (en) | 2001-11-01 | 2003-05-12 | Immersion Corporation | Method and apparatus for providing tactile sensations |
FI112415B (en) | 2001-11-28 | 2003-11-28 | Nokia Oyj | Piezoelectric user interface |
US6975305B2 (en) | 2001-12-07 | 2005-12-13 | Nec Infrontia Corporation | Pressure-sensitive touch panel |
EP1459245B1 (en) | 2001-12-12 | 2006-03-08 | Koninklijke Philips Electronics N.V. | Display system with tactile guidance |
US7352356B2 (en) | 2001-12-13 | 2008-04-01 | United States Of America | Refreshable scanning tactile graphic display for localized sensory stimulation |
US6703924B2 (en) | 2001-12-20 | 2004-03-09 | Hewlett-Packard Development Company, L.P. | Tactile display apparatus |
KR100769783B1 (en) | 2002-03-29 | 2007-10-24 | 가부시끼가이샤 도시바 | Display input device and display input system |
US6904823B2 (en) | 2002-04-03 | 2005-06-14 | Immersion Corporation | Haptic shifting devices |
CN1692401B (en) | 2002-04-12 | 2011-11-16 | 雷斯里·R·奥柏梅尔 | Multi-axis transducer means and joystick |
US7161580B2 (en) | 2002-04-25 | 2007-01-09 | Immersion Corporation | Haptic feedback using rotary harmonic moving mass |
US7369115B2 (en) | 2002-04-25 | 2008-05-06 | Immersion Corporation | Haptic devices having multiple operational modes including at least one resonant mode |
US7209113B2 (en) | 2002-05-09 | 2007-04-24 | Gateway Inc. | Stylus pen expansion slot |
JP3974451B2 (en) | 2002-05-15 | 2007-09-12 | 株式会社 日立ディスプレイズ | Liquid crystal display |
US6655788B1 (en) | 2002-05-17 | 2003-12-02 | Viztec Inc. | Composite structure for enhanced flexibility of electro-optic displays with sliding layers |
FI20021024A (en) | 2002-05-30 | 2003-12-01 | Nokia Corp | Cover structure for a keyboard |
FI20021162A0 (en) | 2002-06-14 | 2002-06-14 | Nokia Corp | Electronic device and a method for administering its keypad |
US6930234B2 (en) | 2002-06-19 | 2005-08-16 | Lanny Davis | Adjustable keyboard apparatus and method |
KR20050012802A (en) | 2002-06-19 | 2005-02-02 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | Tactile device |
US6776546B2 (en) | 2002-06-21 | 2004-08-17 | Microsoft Corporation | Method and system for using a keyboard overlay with a touch-sensitive display screen |
US20060087479A1 (en) | 2002-06-21 | 2006-04-27 | Bridgestone Corporation | Image display and method for manufacturing image display |
US7068782B2 (en) | 2002-06-27 | 2006-06-27 | Motorola, Inc. | Communications devices with receiver earpieces and methods therefor |
US7656393B2 (en) | 2005-03-04 | 2010-02-02 | Apple Inc. | Electronic device having display and surrounding touch sensitive bezel for user interface and control |
US11275405B2 (en) | 2005-03-04 | 2022-03-15 | Apple Inc. | Multi-functional hand-held device |
JP3600606B2 (en) | 2002-09-20 | 2004-12-15 | 株式会社東芝 | Electronics |
WO2004028955A2 (en) | 2002-09-25 | 2004-04-08 | California Institute Of Technology | Microfluidic large scale integration |
US7138985B2 (en) | 2002-09-25 | 2006-11-21 | Ui Evolution, Inc. | Tactilely enhanced visual image display |
US7253807B2 (en) | 2002-09-25 | 2007-08-07 | Uievolution, Inc. | Interactive apparatuses with tactiley enhanced visual imaging capability and related methods |
US6965370B2 (en) | 2002-11-19 | 2005-11-15 | Immersion Corporation | Haptic feedback devices for simulating an orifice |
US20040106360A1 (en) | 2002-11-26 | 2004-06-03 | Gilbert Farmer | Method and apparatus for cleaning combustor liners |
FR2849258B1 (en) | 2002-12-19 | 2006-12-22 | Commissariat Energie Atomique | SURFACE MODIFICATION PLATE |
US7138977B2 (en) | 2003-01-15 | 2006-11-21 | Motorola, Inc. | Proportional force input apparatus for an electronic device |
US7336266B2 (en) | 2003-02-20 | 2008-02-26 | Immersion Corporation | Haptic pads for use with user-interface devices |
WO2004077379A2 (en) | 2003-02-24 | 2004-09-10 | Peichun Yang | Electroactive polymer actuator braille cell and braille display |
JP3669363B2 (en) | 2003-03-06 | 2005-07-06 | ソニー株式会社 | Electrodeposition type display panel manufacturing method, electrodeposition type display panel, and electrodeposition type display device |
US7064748B2 (en) | 2003-03-11 | 2006-06-20 | Eastman Kodak Company | Resistive touch screen with variable resistivity layer |
US7081888B2 (en) | 2003-04-24 | 2006-07-25 | Eastman Kodak Company | Flexible resistive touch screen |
WO2004097788A1 (en) | 2003-04-28 | 2004-11-11 | Immersion Corporation | Systems and methods for user interfaces designed for rotary input devices |
US7280095B2 (en) | 2003-04-30 | 2007-10-09 | Immersion Corporation | Hierarchical methods for generating force feedback effects |
DE10324579A1 (en) | 2003-05-30 | 2004-12-16 | Daimlerchrysler Ag | operating device |
EP1816545A3 (en) | 2003-05-30 | 2007-08-15 | Immersion Corporation | System and method for low power haptic feedback |
JP2005316931A (en) | 2003-06-12 | 2005-11-10 | Alps Electric Co Ltd | Input method and input device |
GB0313808D0 (en) | 2003-06-14 | 2003-07-23 | Binstead Ronald P | Improvements in touch technology |
US7056051B2 (en) | 2003-06-16 | 2006-06-06 | Fiffie Artiss J | Inflatable device for displaying information |
US7098897B2 (en) | 2003-06-30 | 2006-08-29 | Motorola, Inc. | Touch screen assembly and display for an electronic device |
US20050020325A1 (en) | 2003-07-24 | 2005-01-27 | Motorola, Inc. | Multi-configuration portable electronic device and method for operating the same |
DE10340188A1 (en) | 2003-09-01 | 2005-04-07 | Siemens Ag | Screen with a touch-sensitive user interface for command input |
US7245292B1 (en) | 2003-09-16 | 2007-07-17 | United States Of America As Represented By The Secretary Of The Navy | Apparatus and method for incorporating tactile control and tactile feedback into a human-machine interface |
US20050073506A1 (en) | 2003-10-05 | 2005-04-07 | Durso Nick P. | C-frame slidable touch input apparatus for displays of computing devices |
JP2005117313A (en) | 2003-10-07 | 2005-04-28 | Fujitsu Ltd | Piezo-electric element and touch panel device |
US20050088417A1 (en) | 2003-10-24 | 2005-04-28 | Mulligan Roger C. | Tactile touch-sensing system |
US7161276B2 (en) | 2003-10-24 | 2007-01-09 | Face International Corp. | Self-powered, electronic keyed, multifunction switching system |
US7096852B2 (en) | 2003-10-30 | 2006-08-29 | Immersion Corporation | Haptic throttle devices and methods |
US7218313B2 (en) | 2003-10-31 | 2007-05-15 | Zeetoo, Inc. | Human interface system |
WO2005050428A2 (en) | 2003-11-18 | 2005-06-02 | Johnson Controls Technology Company | Reconfigurable user interface |
US7495659B2 (en) | 2003-11-25 | 2009-02-24 | Apple Inc. | Touch pad for handheld device |
US8164573B2 (en) | 2003-11-26 | 2012-04-24 | Immersion Corporation | Systems and methods for adaptive interpretation of input from a touch-sensitive input device |
EP1544048A1 (en) | 2003-12-17 | 2005-06-22 | IEE INTERNATIONAL ELECTRONICS & ENGINEERING S.A. | Device for the classification of seat occupancy |
US7112737B2 (en) | 2003-12-31 | 2006-09-26 | Immersion Corporation | System and method for providing a haptic effect to a musical instrument |
US7064655B2 (en) | 2003-12-31 | 2006-06-20 | Sony Ericsson Mobile Communications Ab | Variable-eccentricity tactile generator |
US7283120B2 (en) | 2004-01-16 | 2007-10-16 | Immersion Corporation | Method and apparatus for providing haptic feedback having a position-based component and a predetermined time-based component |
US7307626B2 (en) | 2004-01-27 | 2007-12-11 | Tyco Electronics Corporation | Capacitive touch sensor |
US7403191B2 (en) | 2004-01-28 | 2008-07-22 | Microsoft Corporation | Tactile overlay for an imaging display |
US7129854B2 (en) | 2004-02-10 | 2006-10-31 | Motorola, Inc. | Electronic device with force sensing key |
US7432911B2 (en) | 2004-02-26 | 2008-10-07 | Research In Motion Limited | Keyboard for mobile devices |
CA2460943A1 (en) | 2004-03-16 | 2005-09-16 | Unknown | Pocket size computers |
US7205981B2 (en) | 2004-03-18 | 2007-04-17 | Immersion Corporation | Method and apparatus for providing resistive haptic feedback using a vacuum source |
US7289111B2 (en) | 2004-03-25 | 2007-10-30 | International Business Machines Corporation | Resistive touch pad with multiple regions of sensitivity |
US7289106B2 (en) | 2004-04-01 | 2007-10-30 | Immersion Medical, Inc. | Methods and apparatus for palpation simulation |
US7319374B2 (en) | 2004-04-14 | 2008-01-15 | Immersion Corporation | Moving magnet actuator |
US20050231489A1 (en) | 2004-04-15 | 2005-10-20 | Research In Motion Limited | System and method for providing dynamic tactile feedback on hand-held electronic devices |
US7522152B2 (en) | 2004-05-27 | 2009-04-21 | Immersion Corporation | Products and processes for providing haptic feedback in resistive interface devices |
US7515122B2 (en) | 2004-06-02 | 2009-04-07 | Eastman Kodak Company | Color display device with enhanced pixel pattern |
JP4148187B2 (en) | 2004-06-03 | 2008-09-10 | ソニー株式会社 | Portable electronic device, input operation control method and program thereof |
JP2006011646A (en) | 2004-06-23 | 2006-01-12 | Pioneer Electronic Corp | Tactile sense display device and tactile sense display function-equipped touch panel |
US7743348B2 (en) | 2004-06-30 | 2010-06-22 | Microsoft Corporation | Using physical objects to adjust attributes of an interactive display application |
US7342573B2 (en) | 2004-07-07 | 2008-03-11 | Nokia Corporation | Electrostrictive polymer as a combined haptic-seal actuator |
US7198137B2 (en) | 2004-07-29 | 2007-04-03 | Immersion Corporation | Systems and methods for providing haptic feedback with position sensing |
US20070229233A1 (en) | 2004-08-02 | 2007-10-04 | Dort David B | Reconfigurable tactile-enhanced display including "tap-and-drop" computing system for vision impaired users |
US7245202B2 (en) | 2004-09-10 | 2007-07-17 | Immersion Corporation | Systems and methods for networked haptic devices |
US8002089B2 (en) | 2004-09-10 | 2011-08-23 | Immersion Corporation | Systems and methods for providing a haptic device |
US7324020B2 (en) | 2004-09-21 | 2008-01-29 | Nokia Corporation | General purpose input board for a touch actuation |
US8232969B2 (en) | 2004-10-08 | 2012-07-31 | Immersion Corporation | Haptic feedback for button and scrolling action simulation in touch input devices |
US7397466B2 (en) | 2004-11-12 | 2008-07-08 | Eastman Kodak Company | Integral spacer dots for touch screen |
US8207945B2 (en) | 2004-12-01 | 2012-06-26 | Koninklijke Philips Electronics, N.V. | Image display that moves physical objects and causes tactile sensation |
US8199107B2 (en) | 2004-12-22 | 2012-06-12 | University Of Waterloo | Input interface device with transformable form factor |
US7551161B2 (en) | 2004-12-30 | 2009-06-23 | Mann W Stephen G | Fluid user interface such as immersive multimediator or input/output device with one or more spray jets |
JP2006268068A (en) | 2005-03-22 | 2006-10-05 | Fujitsu Ten Ltd | Touch panel device |
TWI258709B (en) | 2005-03-28 | 2006-07-21 | Elan Microelectronics Corp | Touch panel capable of soliciting keying feel |
JP2006285785A (en) | 2005-04-01 | 2006-10-19 | Fujitsu Ten Ltd | Touch panel device |
US7355595B2 (en) | 2005-04-15 | 2008-04-08 | Microsoft Corporation | Tactile device for scrolling |
US7382357B2 (en) | 2005-04-25 | 2008-06-03 | Avago Technologies Ecbu Ip Pte Ltd | User interface incorporating emulated hard keys |
US7692637B2 (en) | 2005-04-26 | 2010-04-06 | Nokia Corporation | User input device for electronic device |
US7825903B2 (en) | 2005-05-12 | 2010-11-02 | Immersion Corporation | Method and apparatus for providing haptic effects to a touch panel |
US7609178B2 (en) | 2006-04-20 | 2009-10-27 | Pressure Profile Systems, Inc. | Reconfigurable tactile sensor input device |
US7433719B2 (en) | 2005-06-03 | 2008-10-07 | Research In Motion Limited | Handheld electronic device and keypad having tactile features |
US7195170B2 (en) | 2005-06-09 | 2007-03-27 | Fuji Xerox Co., Ltd. | Post-bit: multimedia ePaper stickies |
US20070013662A1 (en) | 2005-07-13 | 2007-01-18 | Fauth Richard M | Multi-configurable tactile touch-screen keyboard and associated methods |
WO2007016704A2 (en) | 2005-08-02 | 2007-02-08 | Ipifini, Inc. | Input device having multifunctional keys |
TWI428937B (en) | 2005-08-12 | 2014-03-01 | Cambrios Technologies Corp | Nanowires-based transparent conductors |
US7233722B2 (en) | 2005-08-15 | 2007-06-19 | General Display, Ltd. | System and method for fiber optics based direct view giant screen flat panel display |
US7671837B2 (en) | 2005-09-06 | 2010-03-02 | Apple Inc. | Scrolling input arrangements using capacitive sensors on a flexible membrane |
US20070085837A1 (en) | 2005-10-17 | 2007-04-19 | Eastman Kodak Company | Touch input device with display front |
JP5208362B2 (en) | 2005-10-28 | 2013-06-12 | ソニー株式会社 | Electronics |
US7307231B2 (en) | 2005-11-16 | 2007-12-11 | Matsushita Electric Industrial Co., Ltd. | Touch panel, method of manufacturing the same, and input device using the same |
US8166649B2 (en) | 2005-12-12 | 2012-05-01 | Nupix, LLC | Method of forming an electroded sheet |
KR100677624B1 (en) | 2005-12-19 | 2007-02-02 | 삼성전자주식회사 | Liquid cooling system and electric appliances adopting the same |
US20070152983A1 (en) | 2005-12-30 | 2007-07-05 | Apple Computer, Inc. | Touch pad with symbols based on mode |
US8421755B2 (en) | 2006-01-17 | 2013-04-16 | World Properties, Inc. | Capacitive touch sensor with integral EL backlight |
US8068605B2 (en) | 2006-03-07 | 2011-11-29 | Sony Ericsson Mobile Communications Ab | Programmable keypad |
KR100826532B1 (en) | 2006-03-28 | 2008-05-02 | 엘지전자 주식회사 | Mobile communication terminal and its method for detecting a key input |
US20070236469A1 (en) | 2006-03-30 | 2007-10-11 | Richard Woolley | Fluid level sensing utilizing a mutual capacitance touchpad device |
US7511702B2 (en) | 2006-03-30 | 2009-03-31 | Apple Inc. | Force and location sensitive display |
US7538760B2 (en) | 2006-03-30 | 2009-05-26 | Apple Inc. | Force imaging input device and system |
WO2007115316A2 (en) | 2006-04-04 | 2007-10-11 | Chaffee Robert B | Method and apparatus for monitoring and controlling pressure in an inflatable device |
US8400402B2 (en) | 2006-04-14 | 2013-03-19 | Pressure Profile Systems, Inc. | Electronic device housing with integrated user input capability |
ATE473278T1 (en) | 2006-04-22 | 2010-07-15 | Scarab Genomics Llc | METHOD AND COMPOSITIONS FOR PRODUCING RECOMBINANT PROTEINS USING A GENE FOR TRNA |
US7978181B2 (en) | 2006-04-25 | 2011-07-12 | Apple Inc. | Keystroke tactility arrangement on a smooth touch surface |
US20070257634A1 (en) | 2006-05-05 | 2007-11-08 | Leschin Stephen J | Self-powered portable electronic device |
US7903092B2 (en) | 2006-05-25 | 2011-03-08 | Atmel Corporation | Capacitive keyboard with position dependent reduced keying ambiguity |
US8139035B2 (en) | 2006-06-21 | 2012-03-20 | Nokia Corporation | Touch sensitive keypad with tactile feedback |
US7841385B2 (en) | 2006-06-26 | 2010-11-30 | International Business Machines Corporation | Dual-chamber fluid pump for a multi-fluid electronics cooling system and method |
US8068097B2 (en) | 2006-06-27 | 2011-11-29 | Cypress Semiconductor Corporation | Apparatus for detecting conductive material of a pad layer of a sensing device |
US7916002B2 (en) | 2006-06-30 | 2011-03-29 | Nokia Corporation | Haptic operative user interface input apparatus |
US7545289B2 (en) | 2006-07-17 | 2009-06-09 | Synaptics Incorporated | Capacitive sensing using a repeated pattern of sensing elements |
US7834853B2 (en) | 2006-07-24 | 2010-11-16 | Motorola, Inc. | Handset keypad |
JP2008033739A (en) | 2006-07-31 | 2008-02-14 | Sony Corp | Touch screen interaction method and apparatus based on tactile force feedback and pressure measurement |
US8144271B2 (en) | 2006-08-03 | 2012-03-27 | Perceptive Pixel Inc. | Multi-touch sensing through frustrated total internal reflection |
CN101501989B (en) | 2006-08-07 | 2012-06-27 | 京瓷株式会社 | Method for manufacturing surface acoustic wave device |
JP4697095B2 (en) | 2006-08-29 | 2011-06-08 | ソニー株式会社 | Touch panel display device, electronic device and game device |
US8786033B2 (en) | 2006-09-01 | 2014-07-22 | IVI Holdings, Ltd. | Biometric sensor and sensor panel, method for detecting biometric pattern using the same, and method for manufacturing the same |
US8564544B2 (en) | 2006-09-06 | 2013-10-22 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
DE102006045174A1 (en) | 2006-09-25 | 2008-04-03 | Siemens Ag | Method for producing a contrast-enhanced image data set of a patient's analysis region, involving loading an image data set of the analysis region in which healthy tissue is represented with lower intensity than blood and diseased tissue |
US20100315345A1 (en) | 2006-09-27 | 2010-12-16 | Nokia Corporation | Tactile Touch Screen |
US7890863B2 (en) | 2006-10-04 | 2011-02-15 | Immersion Corporation | Haptic effects with proximity sensing |
KR101144423B1 (en) | 2006-11-16 | 2012-05-10 | 엘지전자 주식회사 | Mobile phone and display method of the same |
US20080136791A1 (en) | 2006-12-07 | 2008-06-12 | Sony Ericsson Mobile Communications Ab | Liquid resistive touch panel |
KR100851279B1 (en) | 2006-12-07 | 2008-08-08 | 한국전자통신연구원 | Braille Display Device for the physically challenged and Manufacturing Method Thereof |
KR101330697B1 (en) | 2006-12-21 | 2013-11-18 | 삼성디스플레이 주식회사 | Display device |
US20080165139A1 (en) | 2007-01-05 | 2008-07-10 | Apple Inc. | Touch screen stack-up processing |
US8144129B2 (en) | 2007-01-05 | 2012-03-27 | Apple Inc. | Flexible touch sensing circuits |
US20080202251A1 (en) | 2007-02-27 | 2008-08-28 | Iee International Electronics & Engineering S.A. | Capacitive pressure sensor |
US20080238448A1 (en) | 2007-03-30 | 2008-10-02 | Cypress Semiconductor Corporation | Capacitance sensing for percussion instruments and methods therefor |
US20080248836A1 (en) | 2007-04-04 | 2008-10-09 | Motorola, Inc. | Method and apparatus for controlling a skin texture surface on a device using hydraulic control |
US20080251368A1 (en) | 2007-04-12 | 2008-10-16 | Sony Ericsson Mobile Communications Ab | Input device |
US8130202B2 (en) | 2007-05-01 | 2012-03-06 | International Business Machines Corporation | Infrared touch screen gated by touch force |
US20080291169A1 (en) | 2007-05-21 | 2008-11-27 | Brenner David S | Multimodal Adaptive User Interface for a Portable Electronic Device |
US7733575B2 (en) | 2007-05-31 | 2010-06-08 | Artificial Muscle, Inc. | Optical systems employing compliant electroactive materials |
US20080303796A1 (en) | 2007-06-08 | 2008-12-11 | Steven Fyke | Shape-changing display for a handheld electronic device |
EP2171756A1 (en) | 2007-06-21 | 2010-04-07 | Nxp B.V. | Esd protection circuit |
US20090002328A1 (en) | 2007-06-26 | 2009-01-01 | Immersion Corporation, A Delaware Corporation | Method and apparatus for multi-touch tactile touch panel actuator mechanisms |
US7956770B2 (en) | 2007-06-28 | 2011-06-07 | Sony Ericsson Mobile Communications Ab | Data input device and portable electronic device |
TW200901014A (en) | 2007-06-28 | 2009-01-01 | Sense Pad Tech Co Ltd | Touch panel device |
US7880106B2 (en) | 2007-06-28 | 2011-02-01 | Apple Inc. | Switch assembly constructions |
US7952498B2 (en) | 2007-06-29 | 2011-05-31 | Verizon Patent And Licensing Inc. | Haptic computer interface |
US20090009480A1 (en) | 2007-07-06 | 2009-01-08 | Sony Ericsson Mobile Communications Ab | Keypad with tactile touch glass |
US20090015547A1 (en) | 2007-07-12 | 2009-01-15 | Franz Roger L | Electronic Device with Physical Alert |
US7828771B2 (en) | 2007-07-26 | 2010-11-09 | Entra Pharmaceuticals, Inc. | Systems and methods for delivering drugs |
US20090033617A1 (en) | 2007-08-02 | 2009-02-05 | Nokia Corporation | Haptic User Interface |
US8077154B2 (en) | 2007-08-13 | 2011-12-13 | Motorola Mobility, Inc. | Electrically non-interfering printing for electronic devices having capacitive touch sensors |
US20090132093A1 (en) | 2007-08-21 | 2009-05-21 | Motorola, Inc. | Tactile Conforming Apparatus and Method for a Device |
FR2920628B1 (en) | 2007-08-30 | 2011-07-01 | Celsius X Vi Ii | PORTABLE PHONE WITH A MECHANICAL WATCH |
US8270158B2 (en) | 2007-08-30 | 2012-09-18 | Hewlett-Packard Development Company, L.P. | Housing construction for mobile computing device |
JP5106955B2 (en) | 2007-09-07 | 2012-12-26 | ソニーモバイルコミュニケーションズ株式会社 | User interface device and portable information terminal |
US8098235B2 (en) | 2007-09-28 | 2012-01-17 | Immersion Corporation | Multi-touch device having dynamic haptic effects |
KR101404745B1 (en) | 2007-10-15 | 2014-06-12 | 엘지전자 주식회사 | Jog input device and portable terminal having the same |
US8232976B2 (en) | 2010-03-25 | 2012-07-31 | Panasonic Corporation Of North America | Physically reconfigurable input and output systems and methods |
US8217903B2 (en) | 2007-11-02 | 2012-07-10 | Research In Motion Limited | Electronic device and tactile touch screen |
US20090115734A1 (en) | 2007-11-02 | 2009-05-07 | Sony Ericsson Mobile Communications Ab | Perceivable feedback |
KR100896812B1 (en) | 2007-11-12 | 2009-05-11 | 한국과학기술원 | Haptic module using magnetic force, electronic apparatuses having the module |
US8379182B2 (en) | 2007-11-16 | 2013-02-19 | Manufacturing Resources International, Inc. | Cooling system for outdoor electronic displays |
US8208115B2 (en) | 2007-11-16 | 2012-06-26 | Manufacturing Resources International, Inc. | Fluid cooled display |
US8174508B2 (en) | 2007-11-19 | 2012-05-08 | Microsoft Corporation | Pointing and data entry input device |
US8866641B2 (en) | 2007-11-20 | 2014-10-21 | Motorola Mobility Llc | Method and apparatus for controlling a keypad of a device |
US10488926B2 (en) | 2007-11-21 | 2019-11-26 | Immersion Corporation | Method and apparatus for providing a fixed relief touch screen with locating features using deformable haptic surfaces |
US8253698B2 (en) | 2007-11-23 | 2012-08-28 | Research In Motion Limited | Tactile touch screen for electronic device |
US20090140989A1 (en) | 2007-12-04 | 2009-06-04 | Nokia Corporation | User interface |
US7679839B2 (en) | 2007-12-10 | 2010-03-16 | Artificial Muscle, Inc. | Optical lens displacement systems |
JP2009151684A (en) | 2007-12-21 | 2009-07-09 | Sony Corp | Touch-sensitive sheet member, input device and electronic equipment |
US8395587B2 (en) | 2007-12-21 | 2013-03-12 | Motorola Mobility Llc | Haptic response apparatus for an electronic device |
US8123660B2 (en) | 2007-12-28 | 2012-02-28 | Immersion Corporation | Method and apparatus for providing communications with haptic cues |
US8373549B2 (en) | 2007-12-31 | 2013-02-12 | Apple Inc. | Tactile feedback in an electronic device |
US9857872B2 (en) | 2007-12-31 | 2018-01-02 | Apple Inc. | Multi-touch display screen with localized tactile feedback |
US20090167567A1 (en) | 2008-01-02 | 2009-07-02 | Israeli Aerospace Industries Ltd. | Method for avoiding collisions and a collision avoidance system |
US8456438B2 (en) | 2008-01-04 | 2013-06-04 | Tactus Technology, Inc. | User interface system |
US9052790B2 (en) | 2008-01-04 | 2015-06-09 | Tactus Technology, Inc. | User interface and methods |
US8179377B2 (en) | 2009-01-05 | 2012-05-15 | Tactus Technology | User interface system |
US8553005B2 (en) | 2008-01-04 | 2013-10-08 | Tactus Technology, Inc. | User interface system |
US8547339B2 (en) | 2008-01-04 | 2013-10-01 | Tactus Technology, Inc. | System and methods for raised touch screens |
US8179375B2 (en) | 2008-01-04 | 2012-05-15 | Tactus Technology | User interface system and method |
US8570295B2 (en) | 2008-01-04 | 2013-10-29 | Tactus Technology, Inc. | User interface system |
US8243038B2 (en) | 2009-07-03 | 2012-08-14 | Tactus Technologies | Method for adjusting the user interface of a device |
US9274612B2 (en) | 2008-01-04 | 2016-03-01 | Tactus Technology, Inc. | User interface system |
US8199124B2 (en) | 2009-01-05 | 2012-06-12 | Tactus Technology | User interface system |
US8922510B2 (en) | 2008-01-04 | 2014-12-30 | Tactus Technology, Inc. | User interface system |
US8154527B2 (en) | 2008-01-04 | 2012-04-10 | Tactus Technology | User interface system |
US9063627B2 (en) | 2008-01-04 | 2015-06-23 | Tactus Technology, Inc. | User interface and methods |
US8922503B2 (en) | 2008-01-04 | 2014-12-30 | Tactus Technology, Inc. | User interface system |
US8947383B2 (en) | 2008-01-04 | 2015-02-03 | Tactus Technology, Inc. | User interface system and method |
US8125461B2 (en) | 2008-01-11 | 2012-02-28 | Apple Inc. | Dynamic input graphic display |
US7890257B2 (en) | 2008-01-14 | 2011-02-15 | Research In Motion Limited | Using a shape-changing display as an adaptive lens for selectively magnifying information displayed onscreen |
US20090181724A1 (en) | 2008-01-14 | 2009-07-16 | Sony Ericsson Mobile Communications Ab | Touch sensitive display with ultrasonic vibrations for tactile feedback |
US8004501B2 (en) | 2008-01-21 | 2011-08-23 | Sony Computer Entertainment America Llc | Hand-held device with touchscreen and digital tactile pixels |
US20090195512A1 (en) | 2008-02-05 | 2009-08-06 | Sony Ericsson Mobile Communications Ab | Touch sensitive display with tactile feedback |
US8022933B2 (en) | 2008-02-21 | 2011-09-20 | Sony Corporation | One button remote control with haptic feedback |
US20090243998A1 (en) | 2008-03-28 | 2009-10-01 | Nokia Corporation | Apparatus, method and computer program product for providing an input gesture indicator |
EP3484135A1 (en) | 2008-04-02 | 2019-05-15 | Twilio Inc. | System and method for processing telephony sessions |
US9829977B2 (en) | 2008-04-02 | 2017-11-28 | Immersion Corporation | Method and apparatus for providing multi-point haptic feedback texture systems |
US8212795B2 (en) | 2008-05-21 | 2012-07-03 | Hypercom Corporation | Payment terminal stylus with touch screen contact detection |
DE602008005865D1 (en) | 2008-05-29 | 2011-05-12 | Lg Electronics Inc | Transparent display and operating procedures for it |
US7924143B2 (en) | 2008-06-09 | 2011-04-12 | Research In Motion Limited | System and method for providing tactile feedback to a user of an electronic device |
US8421483B2 (en) | 2008-06-13 | 2013-04-16 | Sony Ericsson Mobile Communications Ab | Touch and force sensing for input devices |
US8115745B2 (en) | 2008-06-19 | 2012-02-14 | Tactile Displays, Llc | Apparatus and method for interactive display with tactile feedback |
US8174372B2 (en) | 2008-06-26 | 2012-05-08 | Immersion Corporation | Providing haptic feedback on a touch surface |
KR20100010860A (en) | 2008-07-23 | 2010-02-02 | 엘지전자 주식회사 | Mobile terminal and event control method thereof |
TWI489329B (en) | 2008-08-20 | 2015-06-21 | Au Optronics Corp | Touch panel, display, and manufacturing method of touch panel |
US8174452B2 (en) | 2008-09-25 | 2012-05-08 | Apple Inc. | Cavity antenna for wireless electronic devices |
TW201013259A (en) | 2008-09-30 | 2010-04-01 | J Touch Corp | Double-sided composite touch panel structure |
US8441450B2 (en) | 2008-09-30 | 2013-05-14 | Apple Inc. | Movable track pad with added functionality |
US7999660B2 (en) | 2008-10-10 | 2011-08-16 | Motorola Mobility, Inc. | Electronic device with suspension interface for localized haptic response |
US20100097323A1 (en) | 2008-10-17 | 2010-04-22 | Honeywell International Inc. | Hydrogel-based tactile-feedback touch screen |
US8427433B2 (en) | 2008-10-17 | 2013-04-23 | Honeywell International Inc. | Tactile-feedback touch screen |
US8436816B2 (en) | 2008-10-24 | 2013-05-07 | Apple Inc. | Disappearing button or slider |
US8222799B2 (en) | 2008-11-05 | 2012-07-17 | Bayer Materialscience Ag | Surface deformation electroactive polymer transducers |
US20100121928A1 (en) | 2008-11-07 | 2010-05-13 | Penango, Inc. | Methods and systems for allocating and indicating trustworthiness of secure communications |
US8106787B2 (en) | 2008-11-14 | 2012-01-31 | Nokia Corporation | Warning system indicating excessive force on a touch screen or display |
US20100141608A1 (en) | 2008-12-09 | 2010-06-10 | Lili Huang | Index Matching For Touch Screens |
US8362882B2 (en) | 2008-12-10 | 2013-01-29 | Immersion Corporation | Method and apparatus for providing Haptic feedback from Haptic textile |
US9600070B2 (en) | 2008-12-22 | 2017-03-21 | Apple Inc. | User interface having changeable topography |
US8384680B2 (en) | 2008-12-23 | 2013-02-26 | Research In Motion Limited | Portable electronic device and method of control |
US8345013B2 (en) | 2009-01-14 | 2013-01-01 | Immersion Corporation | Method and apparatus for generating haptic feedback from plasma actuation |
EP2500924B1 (en) | 2009-02-24 | 2015-07-22 | BlackBerry Limited | Breathable sealed dome switch assembly |
US8253703B2 (en) | 2009-03-03 | 2012-08-28 | Empire Technology Development Llc | Elastomeric wave tactile interface |
US8361334B2 (en) | 2009-03-18 | 2013-01-29 | Medtronic, Inc. | Plasma deposition to increase adhesion |
US8169306B2 (en) | 2009-03-23 | 2012-05-01 | Methode Electronics, Inc. | Touch panel assembly with haptic effects and method of manufacturing thereof |
US8125347B2 (en) | 2009-04-09 | 2012-02-28 | Samsung Electronics Co., Ltd. | Text entry system with depressable keyboard on a dynamic display |
US8224392B2 (en) | 2009-04-29 | 2012-07-17 | Lg Electronics Inc. | Mobile terminal capable of recognizing fingernail touch and method of controlling the operation thereof |
US8279200B2 (en) | 2009-05-19 | 2012-10-02 | Microsoft Corporation | Light-induced shape-memory polymer display screen |
US8417297B2 (en) | 2009-05-22 | 2013-04-09 | Lg Electronics Inc. | Mobile terminal and method of providing graphic user interface using the same |
US8400410B2 (en) | 2009-05-26 | 2013-03-19 | Microsoft Corporation | Ferromagnetic user interfaces |
KR101658991B1 (en) | 2009-06-19 | 2016-09-22 | 삼성전자주식회사 | Touch panel and electronic device including the touch panel |
EP2449452B1 (en) | 2009-07-03 | 2016-02-10 | Tactus Technology | User interface enhancement system |
US8310458B2 (en) | 2009-07-06 | 2012-11-13 | Research In Motion Limited | Electronic device including a moveable touch-sensitive input and method of controlling same |
US8120588B2 (en) | 2009-07-15 | 2012-02-21 | Sony Ericsson Mobile Communications Ab | Sensor assembly and display including a sensor assembly |
US8378797B2 (en) | 2009-07-17 | 2013-02-19 | Apple Inc. | Method and apparatus for localization of haptic feedback |
US8395591B2 (en) | 2009-07-22 | 2013-03-12 | Empire Technology Development Llc | Electro-osmotic tactile display |
US8723825B2 (en) | 2009-07-28 | 2014-05-13 | Cypress Semiconductor Corporation | Predictive touch surface scanning |
US20110029862A1 (en) | 2009-07-30 | 2011-02-03 | Research In Motion Limited | System and method for context based predictive text entry assistance |
US8390594B2 (en) | 2009-08-18 | 2013-03-05 | Immersion Corporation | Haptic feedback using composite piezoelectric actuator |
US8456430B2 (en) | 2009-08-21 | 2013-06-04 | Motorola Mobility Llc | Tactile user interface for an electronic device |
EP2473927A4 (en) | 2009-09-04 | 2016-05-11 | Iii Holdings 2 Llc | System and method for managing internet media content |
US8816965B2 (en) | 2009-09-30 | 2014-08-26 | At&T Mobility Ii Llc | Predictive force sensitive keypad |
US8350820B2 (en) | 2009-11-06 | 2013-01-08 | Bose Corporation | Touch-based user interface user operation accuracy enhancement |
US8558802B2 (en) | 2009-11-21 | 2013-10-15 | Freescale Semiconductor, Inc. | Methods and apparatus for performing capacitive touch sensing and proximity detection |
GB0922165D0 (en) | 2009-12-18 | 2010-02-03 | Pelikon Ltd | Human interface device and related methods |
WO2011087816A1 (en) | 2009-12-21 | 2011-07-21 | Tactus Technology | User interface system |
US8994666B2 (en) | 2009-12-23 | 2015-03-31 | Colin J. Karpfinger | Tactile touch-sensing interface system |
KR101616875B1 (en) | 2010-01-07 | 2016-05-02 | 삼성전자주식회사 | Touch panel and electronic device including the touch panel |
US8519974B2 (en) | 2010-01-19 | 2013-08-27 | Sony Corporation | Touch sensing device, touch screen device comprising the touch sensing device, mobile device, method for sensing a touch and method for manufacturing a touch sensing device |
KR101631892B1 (en) | 2010-01-28 | 2016-06-21 | 삼성전자주식회사 | Touch panel and electronic device including the touch panel |
US20110193787A1 (en) | 2010-02-10 | 2011-08-11 | Kevin Morishige | Input mechanism for providing dynamically protruding surfaces for user interaction |
US8619035B2 (en) | 2010-02-10 | 2013-12-31 | Tactus Technology, Inc. | Method for assisting user input to a device |
US8330305B2 (en) | 2010-02-11 | 2012-12-11 | Amazon Technologies, Inc. | Protecting devices from impact damage |
US8253052B2 (en) | 2010-02-23 | 2012-08-28 | Research In Motion Limited | Keyboard dome stiffener assembly |
US20120056846A1 (en) | 2010-03-01 | 2012-03-08 | Lester F. Ludwig | Touch-based user interfaces employing artificial neural networks for hdtp parameter and symbol derivation |
WO2011112984A1 (en) | 2010-03-11 | 2011-09-15 | Tactus Technology | User interface system |
US8450627B2 (en) | 2010-04-01 | 2013-05-28 | Apple Inc. | Capacitive dome switch |
WO2011133604A1 (en) | 2010-04-19 | 2011-10-27 | Tactus Technology | User interface system |
KR20130141344A (en) | 2010-04-19 | 2013-12-26 | 택투스 테크놀로지, 아이엔씨. | Method of actuating a tactile interface layer |
US8599165B2 (en) | 2010-08-16 | 2013-12-03 | Perceptive Pixel Inc. | Force and true capacitive touch measurement techniques for capacitive touch sensors |
US8592699B2 (en) | 2010-08-20 | 2013-11-26 | Apple Inc. | Single support lever keyboard mechanism |
KR101323052B1 (en) | 2010-10-01 | 2013-10-29 | 엘지디스플레이 주식회사 | Electrostatic capacity type touch screen panel |
WO2012054780A1 (en) | 2010-10-20 | 2012-04-26 | Tactus Technology | User interface system |
CN103124946B (en) | 2010-10-20 | 2016-06-29 | 泰克图斯科技公司 | User interface system and method |
US8780060B2 (en) | 2010-11-02 | 2014-07-15 | Apple Inc. | Methods and systems for providing haptic control |
JP5648437B2 (en) | 2010-11-15 | 2015-01-07 | セイコーエプソン株式会社 | Electro-optical device and projection display device |
US8966408B2 (en) | 2011-07-11 | 2015-02-24 | Apple Inc. | Removable clip with user interface |
US8963886B2 (en) | 2011-07-13 | 2015-02-24 | Flatfrog Laboratories Ab | Touch-sensing display panel |
US8947105B2 (en) | 2011-12-01 | 2015-02-03 | Atmel Corporation | Capacitive coupling of bond pads |
US8711118B2 (en) | 2012-02-15 | 2014-04-29 | Immersion Corporation | Interactivity model for shared feedback on mobile devices |
US9471185B2 (en) | 2012-02-21 | 2016-10-18 | Atmel Corporation | Flexible touch sensor input device |
EP2730995B1 (en) | 2012-05-25 | 2016-11-30 | Nintendo Co., Ltd. | Controller device, information processing system, and communication method |
- 2014
  - 2014-08-28 US US14/471,889 patent/US9298261B2/en not_active Expired - Fee Related
- 2016
  - 2016-02-17 US US15/046,123 patent/US20160162064A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US9298261B2 (en) | 2016-03-29 |
US20150077364A1 (en) | 2015-03-19 |
Similar Documents
Publication | Title |
---|---|
US9298261B2 (en) | Method for actuating a tactile interface layer | |
US11907013B2 (en) | Continuity of applications across devices | |
US11126704B2 (en) | Authenticated device used to unlock another device | |
US11269575B2 (en) | Devices, methods, and graphical user interfaces for wireless pairing with peripheral devices and displaying status information concerning the peripheral devices | |
US20220100841A1 (en) | Authenticated device used to unlock another device | |
US9372539B2 (en) | Method for actuating a tactile interface layer | |
US11747956B2 (en) | Multi-dimensional object rearrangement | |
US11354015B2 (en) | Adaptive user interfaces | |
US20190079648A1 (en) | Method, device, and graphical user interface for tabbed and private browsing | |
US10488919B2 (en) | System for gaze interaction | |
TWI629636B (en) | Method for controlling an electronic device, electronic device and non-transitory computer-readable storage medium | |
US20170235360A1 (en) | System for gaze interaction | |
US20170090566A1 (en) | System for gaze interaction | |
KR20150127530A (en) | User input method of portable device and the portable device enabling the method | |
KR20070113025A (en) | Apparatus and operating method of touch screen | |
US20150160731A1 (en) | Method of recognizing gesture through electronic device, electronic device, and computer readable recording medium | |
EP3187977A1 (en) | System for gaze interaction | |
WO2018156912A1 (en) | System for gaze interaction | |
US20130321322A1 (en) | Mobile terminal and method of controlling the same | |
US11635826B2 (en) | Device, method, and graphical user interface for adjusting touch activation regions associated with selectable user interface elements | |
KR20120078816A (en) | Providing method of virtual touch pointer and portable device supporting the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: TACTUS TECHNOLOGY, INC., CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: PARTHASARATHY, RADHAKRISHNAN; YAIRI, MICAH; SIGNING DATES FROM 20141002 TO 20141003; REEL/FRAME: 037937/0095 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
| AS | Assignment | Owner name: SILICON VALLEY BANK, CALIFORNIA; Free format text: SECURITY INTEREST; ASSIGNOR: TACTUS TECHNOLOGY, INC.; REEL/FRAME: 043445/0953; Effective date: 20170803 |
| AS | Assignment | Owner name: TACTUS TECHNOLOGY, INC., CALIFORNIA; Free format text: RELEASE BY SECURED PARTY; ASSIGNOR: SILICON VALLEY BANK; REEL/FRAME: 046492/0687; Effective date: 20180508 |
| AS | Assignment | Owner name: SILICON VALLEY BANK, CALIFORNIA; Free format text: SECURITY INTEREST; ASSIGNOR: TACTUS TECHNOLOGY, INC.; REEL/FRAME: 047155/0587; Effective date: 20180919 |