US20110227877A1 - Visual Simulation of Touch Pressure

Info

Publication number
US20110227877A1
Authority
US
United States
Prior art keywords
display, touch, image, pressure, touch pressure
Legal status
Abandoned
Application number
US13/118,145
Inventor
Paul Chen
Current Assignee
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Application filed by Microsoft Corp
Priority to US 13/118,145 (US20110227877A1)
Publication of US20110227877A1
Assigned to Microsoft Technology Licensing, LLC (assignment of assignors interest; assignor: Microsoft Corporation)
Status: Abandoned (current)

Classifications

    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
        • G06F 3/0412: Digitisers structurally integrated in a display
        • G06F 3/0421: Digitisers using opto-electronic means, by interrupting or reflecting a light beam, e.g. optical touch-screen
        • G06F 3/044: Digitisers using capacitive means
        • G06F 3/045: Digitisers using resistive elements, e.g. a single continuous surface or two parallel surfaces put in contact
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The simulation of touch pressure on a touch-sensitive display is disclosed. In one disclosed embodiment, a touch pressure is simulated on a touch-sensitive display by detecting inputs corresponding to each of an untouched display and two or more measures of touch pressure, and displaying images on the display corresponding to the untouched display and each measure of touch pressure. In this manner, a user may be provided with a richer visual response to a touch-sensitive display input.

Description

    BACKGROUND
  • Touch-sensitive displays may be used as input devices in many different computing device environments. Generally, touch-sensitive displays comprise a mechanism for detecting the touch of a user's finger or other object on a display screen, and therefore allow a user to input selections or commands to a computing device by touching the display in an appropriate location indicated by a graphical user interface (GUI). A touch-sensitive display may detect touch via any of several different mechanisms, including but not limited to optical, capacitive, and resistive mechanisms.
  • To provide a richer and more intuitive user experience, some GUIs may be configured to alter an image displayed on the display screen in response to a user's touch to simulate a reaction to the touch. For example, some user-selectable items may appear on a GUI as buttons. Such buttons may be displayed in either a “button up” or “button pressed down” state to visually simulate the pressing of a button by the user. However, such graphical representations of a physical response to a touch input are generally binary in nature, having only two states (pressed or unpressed) that are presented to the user.
  • SUMMARY
  • Accordingly, the simulation of touch pressure on a touch-sensitive display is described below in the Detailed Description. For example, in one disclosed embodiment, a touch pressure is simulated on a touch-sensitive display by detecting inputs corresponding to each of an untouched display and two or more measures of touch pressure, and displaying images on the display corresponding to the untouched display and each measure of touch pressure. In this manner, a user may be provided with a richer visual response to a touch-sensitive display input.
  • This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows an embodiment of a touch-sensitive display device.
  • FIG. 2 shows a process flow depicting an embodiment of a method of simulating touch pressure via a touch-sensitive display.
  • FIG. 3 shows a graphical and schematic representation of a change in a displayed image as a function of an increase in a measure of touch pressure.
  • FIG. 4 shows a schematic view of an embodiment of a touch-sensing mechanism for a touch-sensitive display.
  • FIG. 5 shows a schematic view of another embodiment of a touch-sensing mechanism for a touch-sensitive display.
  • FIG. 6 shows a schematic view of a change in an image detected by the touch-sensing mechanism of FIG. 5 with an increase in touch pressure.
  • FIG. 7 shows a schematic view of another embodiment of a touch-sensing mechanism for a touch-sensitive display.
  • FIG. 8 shows a schematic view of another embodiment of a touch-sensing mechanism for a touch-sensitive display.
  • DETAILED DESCRIPTION
  • FIG. 1 shows an embodiment of a touch-sensitive display device 100. Touch-sensitive display device 100 has a table-like configuration and comprises a horizontally disposed display surface 102 with a touch-sensitive input device. Touch-sensitive display device 100 may be configured to detect the touch of a person 104 and/or an object 106 other than a person, depending upon the touch-sensing mechanism employed by touch-sensitive display device 100. It will be appreciated that reference to a touch by an “object” in the discussion below refers to a touch by a person or another object interchangeably unless an embodiment is described specifically in the context of one or the other.
  • Further, while the embodiment of FIG. 1 comprises a display device comprising a horizontally disposed touch-sensitive surface, it will be appreciated that the embodiments discussed below and the concepts generally disclosed herein may be implemented on any suitable touch-enabled display device. Examples of such devices include, but are not limited to, computing devices such as laptop and desktop computers, hand-held devices, cellular phones, portable media players, personal digital assistants, cameras, video cameras, and other programmable consumer and commercial electronics and appliances. As used herein, the term “computing device” may include any device that electronically executes one or more programs. The embodiments described herein may be implemented on such devices, for example, via computer-executable instructions or code, such as programs, stored on a computer-readable storage medium and executed by the computing device. Generally, programs include routines, objects, components, data structures, and the like that perform particular tasks or implement particular abstract data types. The term “program” as used herein may connote a single program or multiple programs acting in concert, and may be used to denote applications, services, or any other type or class of program.
  • Touch-sensitive display device 100 may be used to display any suitable type of content or data, including but not limited to photographic data, video data, graphical data, documents, spreadsheets, presentations, etc. Further, as described in more detail below, touch-sensitive display device 100 may be used to simulate the appearance and properties of various materials and/or surfaces, and to simulate the response of the simulated material and/or surface to a measure of touch pressure detected by the touch-sensing mechanism or mechanisms employed.
  • FIG. 2 shows a process flow depicting an embodiment of a method 200 for simulating a touch pressure via a touch-sensitive display. Method 200 comprises, at 202, displaying an image in an untouched state, and then, at 204, detecting an input from a touch-sensitive input device associated with the display that corresponds to a first measure of touch pressure. Detecting the input corresponding to a first measure of touch pressure may be accomplished in any number of ways, depending upon the touch sensing mechanism employed by the touch-sensitive input device. For example, detecting the first measure of touch pressure may comprise optically detecting, at 206, a height of an object above a display screen via one or more optical detectors configured to image the front side of the display screen and a region of space adjacent to the front side of the display screen. Likewise, detecting the first measure of touch pressure may comprise optically detecting, at 208, the presence of a shadow on the display screen from an image detector configured to image a backside of the display screen. Further, detecting the first measure of touch pressure may comprise detecting, at 210, a touch via a capacitance change via a capacitive touch-screen device, and/or detecting, at 212, a touch via a resistive change via a resistive touch screen device. Also, detecting the first measure of touch pressure may further comprise determining or estimating, at 214, a surface area of the display screen that is touched from the optical, capacitive and/or resistive inputs received from the touch-sensitive input device.
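  • The detection operations at 204-214 can be viewed as normalizing whatever raw signal the particular sensing mechanism provides (object height, shadow or contact area, capacitance or resistance change) into a single measure of touch pressure. The Python sketch below illustrates that idea only; the function names, scaling constants, and 0-to-1 normalization are illustrative assumptions rather than anything specified by this disclosure.

```python
# Illustrative sketch: reduce different raw sensor readings to one
# dimensionless "measure of touch pressure" in the range [0, 1].
# All constants and names here are assumptions for illustration.

def pressure_from_height(height_mm: float, max_height_mm: float = 50.0) -> float:
    """Optical sensing: the closer the object is to the screen, the higher
    the simulated pressure (0.0 at or above max_height_mm, 1.0 at contact)."""
    return max(0.0, min(1.0, 1.0 - height_mm / max_height_mm))

def pressure_from_contact_area(area_px: int, max_area_px: int = 2000) -> float:
    """Shadow-, capacitance- or resistance-based sensing: a larger contact
    area is read as a harder, more deforming touch."""
    return max(0.0, min(1.0, area_px / max_area_px))

def measure_of_touch_pressure(height_mm=None, area_px=None) -> float:
    """Combine whichever signals the input device supplies; 0.0 = untouched."""
    estimates = []
    if height_mm is not None:
        estimates.append(pressure_from_height(height_mm))
    if area_px is not None:
        estimates.append(pressure_from_contact_area(area_px))
    return max(estimates) if estimates else 0.0
```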
  • Method 200 next comprises, at 216, displaying an image corresponding to the first measure of touch pressure. As opposed to the two-state systems described above in which only images corresponding to general “untouched” and “touched” states are displayed, process 216 permits the image displayed to more finely reflect the specific input or inputs detected at 204. As an example, where the touch-sensitive input device comprises one or more image capture devices configured to detect a height of an object above the display screen, the image corresponding to the first measure of touch pressure may be specifically tailored to reflect the actual detected height of the object above the display surface. Where an object is detected at a farther distance from the display screen, the image displayed on the display screen may be modified only slightly to simulate a light touch. Likewise, where an object is detected at a closer distance from the display screen, the image displayed on the display screen may be more heavily modified to simulate a stronger touch. Further, measuring the surface area of the surface of the object that is responsible for generating a touch input, as indicated at 214, may allow the image corresponding to the first measure of touch pressure to be tailored such that the response of the image to the touch corresponds to the shape and size of the “touching” surface of the object.
  • The image corresponding to the first measure of touch pressure may be calculated or determined in any suitable manner. For example, as indicated at 218, the image corresponding to the first measure of touch pressure may be calculated utilizing mathematical functions that apply a gradient of pressure effect to a displayed image. Alternatively, as indicated at 220, various images corresponding to different measures of touch pressure may be stored in a look-up table. In these embodiments, an input received from the touch-sensitive input device may be compared to the look-up table, and an image corresponding to that measure of touch pressure may be located in the table and displayed on the display. While both of these approaches may provide the ability to simulate multiple degrees of touch pressure, the use of mathematical functions may allow for a greater response range, a more object-specific response, and/or a finer degree of detail. Further, in some embodiments, a sound emitted by a device may change as a function of a measure of touch pressure, as indicated at 221.
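  • As a concrete illustration of the two approaches above, the sketch below contrasts a look-up table of pre-rendered images with a mathematical pressure-gradient function; the image names, band thresholds, and Gaussian falloff are illustrative assumptions, not details taken from this disclosure.

```python
import math

# (a) Look-up table: discrete pre-rendered images keyed by pressure bands.
#     (Image names and band thresholds are assumptions for illustration.)
PRESSURE_LUT = [
    (0.00, "surface_untouched.png"),
    (0.25, "surface_light_touch.png"),
    (0.50, "surface_medium_touch.png"),
    (0.75, "surface_heavy_touch.png"),
]

def image_from_lut(pressure: float) -> str:
    """Return the stored image for the highest pressure band reached."""
    chosen = PRESSURE_LUT[0][1]
    for threshold, image in PRESSURE_LUT:
        if pressure >= threshold:
            chosen = image
    return chosen

# (b) Mathematical function: a continuous "gradient of pressure" effect that
#     can respond to arbitrarily fine changes in the measured pressure.
def indentation_depth(pressure: float, distance_px: float,
                      radius_px: float = 30.0) -> float:
    """Depth of the simulated indentation at a given distance from the touch
    point, falling off smoothly away from the contact."""
    return pressure * math.exp(-(distance_px / radius_px) ** 2)
```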
  • The amount of variation between an appearance of the untouched image and an appearance of the image corresponding to the first measure of touch pressure may be a function of the material or surface being simulated. For example, where the surface being simulated is fabric, sand, soft clay, or other relatively soft surface, the displayed image may undergo a relatively significant change in response to a detected change in a measure of touch pressure. Examples of changes that may be made to such images in response to a touch input may include displaying a relatively deep deformation or depression in the surface. Likewise, where the surface being simulated is a relatively hard surface, the displayed image may undergo a relatively insignificant change in response to the detected measure of touch pressure.
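  • One simple way to realize this material dependence is to scale the rendered deformation by a per-material softness factor, as in the brief sketch below; the material names and softness values are illustrative assumptions.

```python
# Illustrative sketch: soft simulated surfaces deform much more than hard
# ones for the same measure of touch pressure. Values are assumptions only.
MATERIAL_SOFTNESS = {
    "sand": 1.0,
    "soft_clay": 0.9,
    "fabric": 0.7,
    "wood": 0.1,
    "steel": 0.02,
}

def displayed_deformation(pressure: float, material: str) -> float:
    """Deformation amount (arbitrary units) used when rendering the image."""
    return pressure * MATERIAL_SOFTNESS.get(material, 0.5)
```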
  • Continuing with FIG. 2, method 200 next comprises, at 222, detecting an input corresponding to a second measure of touch pressure. The second measure of touch pressure may be either greater or lesser than the first measure of touch pressure, thereby corresponding to either an increase or decrease in the simulated pressure displayed on the display. The second measure of touch pressure may be detected in any suitable manner, depending upon the touch-sensing mechanism utilized. For example, the second measure of touch pressure may be detected by optically detecting a change in the height of the object above the screen compared to the first measure of touch pressure, as indicated at 224. Likewise, the second measure of touch pressure may be detected by optically detecting a change in the size of a shadow detected on the screen via an image capture device configured to image a backside of the display screen, as indicated at 226. In other embodiments, a change in a capacitance or resistance corresponding to a change in a pressure or an area of the touch-sensitive display contacted by an object may be detected via a suitable capacitive or resistive touch-sensitive input device, as shown at 228 and 230.
  • Further, in some embodiments, the second (or first) measure of touch pressure may be determined, at 232, by measuring a velocity of an object approaching the screen. In this manner, a greater velocity may be interpreted as causing the exertion of a greater simulated pressure on the displayed image, while a lesser velocity may be interpreted as causing the exertion of a lesser simulated pressure on the displayed image. This may allow a visual effect of an “impact” to be simulated for different “impact” speeds.
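  • A minimal sketch of this velocity-based estimate follows, assuming successive height samples from an optical sensor; the sampling interval and full-scale velocity are illustrative assumptions.

```python
# Illustrative sketch: estimate an "impact" pressure from how quickly the
# object approaches the screen. Constants are assumptions for illustration.

def approach_velocity(prev_height_mm: float, curr_height_mm: float,
                      dt_s: float) -> float:
    """Velocity toward the display in mm/s (positive = approaching)."""
    return (prev_height_mm - curr_height_mm) / dt_s

def impact_pressure(velocity_mm_s: float,
                    full_scale_mm_s: float = 500.0) -> float:
    """Map approach velocity onto a simulated impact pressure in [0, 1]."""
    return max(0.0, min(1.0, velocity_mm_s / full_scale_mm_s))
```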
  • Continuing with FIG. 2, method 200 next includes, at 234, displaying an image corresponding to a second measure of touch pressure. As with the image corresponding to the first measure of touch pressure at 216, the image corresponding to the second measure of touch pressure may be determined mathematically, or may be determined via a look-up table. Where the image corresponding to the second measure of touch pressure takes into account the shape and size of the contacting object in simulating the touch pressure, the use of a mathematical formula to calculate the image may offer a richer, more detailed response to the measure of touch pressure than the use of images stored in a look-up table.
  • The image corresponding to the second measure of touch pressure may simulate the second measure of touch pressure in any suitable manner. For example, where the second measure of touch pressure corresponds to a greater touch pressure than the first measure of touch pressure, an indentation effect, lighting/shading effects, and/or other visual simulation of pressure may be increased to simulate the increase in pressure. Likewise, where the second measure of touch pressure corresponds to a lesser touch pressure than the first measure of touch pressure, an indentation, lighting/shading effect, and/or other visual simulation of pressure may be decreased to simulate the decrease in pressure. Further, the rate at which an indentation, lighting/shading effect, and/or other visual simulation of pressure decreases may be controlled to more realistically simulate a property of the displayed material or surface. For example, if the displayed material or surface is a pillow, a decrease in the measure of touch pressure may be simulated by a more gradual decrease in the visual effects in the displayed image, simulating a slow return to an untouched state.
  • Further, as indicated at 236, the image corresponding to either the first or the second measure of touch pressure may be displayed for a duration after removal of the touch pressure. Again using the example of the display of a pillow, a residual indentation may remain in the displayed image for an extended period of time after the cessation of any measure of touch pressure to simulate a property of a real pillow. Likewise, if the simulated surface is clay, a depression may remain in the displayed image indefinitely until reset by a user to simulate the moldability of clay. It will be appreciated that any suitable material and/or material property may be displayed in this manner. Further, display device 100 may be configured to simulate any number of surfaces and/or materials, and may utilize any number of general or material-specific mathematical functions to calculate the images corresponding to any suitable measure of touch pressure.
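  • The sketch below illustrates one way to implement this kind of material-dependent relaxation and persistence; the recovery rates and material names are illustrative assumptions rather than values from this disclosure.

```python
# Illustrative sketch: after the measure of touch pressure drops, relax the
# rendered indentation toward its rest depth at a material-specific rate.
# A pillow recovers slowly, water almost instantly, and clay (rate 0.0)
# keeps its depression until reset. Rates are assumptions for illustration.
RECOVERY_RATE_PER_S = {
    "water": 10.0,
    "pillow": 0.5,
    "clay": 0.0,   # residual indentation persists indefinitely
}

def relax_indentation(current_depth: float, target_depth: float,
                      material: str, dt_s: float) -> float:
    """Move the rendered depth toward the depth implied by the current
    measure of touch pressure, limited by the material's recovery rate."""
    if target_depth >= current_depth:   # pressure increased: follow at once
        return target_depth
    rate = RECOVERY_RATE_PER_S.get(material, 1.0)
    return max(target_depth, current_depth - rate * dt_s)
```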
  • In some embodiments, properties of a material other than a degree/type/duration of indentation and/or lighting/shading effects may be simulated in response to different measures of touch pressure. For example, a degree, magnitude, or duration of a motion simulated on the display may be varied depending upon the magnitude of the measure of touch pressure. As a specific example, if the simulated material is water or other liquid, a magnitude of a splash and/or ripple effect may be varied depending upon the measure of touch pressure, wherein a greater measure of touch pressure and/or higher measured touch velocity may cause an increased magnitude and/or duration of simulated ripples. Likewise, an output sound may be varied in response to different measures of touch pressure, as shown at 237. For example, if a cymbal is displayed on the display, a greater measure of touch pressure and/or a higher measured touch velocity may cause a greater magnitude and/or duration of vibration of the displayed cymbal, as well as a louder initial sound. Further, referring again to the water example, a “splash” sound emitted in response to a detected measure of touch pressure may be varied depending upon the magnitude of the measure of touch pressure.
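  • For motion and sound effects such as the ripple and cymbal examples above, both the visual magnitude and the output volume can be driven from the same measure of touch pressure (and, optionally, the measured touch velocity), as in the sketch below; the weighting and decibel range are illustrative assumptions.

```python
# Illustrative sketch: derive a ripple magnitude and an output sound level
# from the measure of touch pressure and the measured touch velocity.
# The weights and the decibel range are assumptions for illustration.

def ripple_magnitude(pressure: float, velocity: float = 0.0) -> float:
    """Harder and faster touches produce larger, longer-lived ripples."""
    return min(1.0, 0.7 * pressure + 0.3 * min(1.0, velocity))

def splash_volume_db(pressure: float, min_db: float = -40.0,
                     max_db: float = 0.0) -> float:
    """Scale the emitted 'splash' sound level with the touch pressure."""
    return min_db + (max_db - min_db) * max(0.0, min(1.0, pressure))
```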
  • While FIG. 2 depicts the detection of a first measure of touch pressure occurring before the detection of a second measure of touch pressure, it will be appreciated that the various processes depicted in FIG. 2 may be performed in any suitable order. For example, the input corresponding to the first measure of touch pressure may cease for a duration before the input corresponding to the second measure of touch pressure begins, thereby having a period with no touch pressure between the two periods of touch pressure. Alternatively, the first and second touch pressures may occur back-to-back, without any intermediate period of no measure of touch pressure. Further, while FIG. 2 depicts the display of images corresponding to an untouched state and two different measures of touch pressure, it will be appreciated that the concepts disclosed in FIG. 2 may apply to any number of touch states, including but not limited to embodiments that employ equations that allow a displayed image to be continuously or finely altered in response to small changes in the measure of touch pressure received from the input device.
  • Method 200 may be used in a wide variety of applications. For example, method 200 may be used to provide a richer and more entertaining display background. As a specific example, a computing device may employ a desktop background depicting water, sand, clay, etc. that reacts to a user's input according to method 200. Likewise, method 200 may be used to provide a richer user experience in various games and entertainment programs. For example, an application may be configured to display a drum kit, and the visual effects displayed and the sounds emitted may be modified depending upon the measure of touch pressure received.
  • Method 200 may also find uses in therapeutic and training environments. For example, children with autism sometimes demonstrate an unusual sensitivity to the feel of different materials and surface textures. As a possible therapy for such children, method 200 may be used to display to an autistic child a material or surface that has caused a negative touch response in that child. Because the actual display surface has a different texture or feel than the material displayed in the image on the display, the feel of the display surface may not cause the same negative reaction caused by the feel of the actual material. However, the displayed image of the material may react to the user's touch in a manner that simulates how the actual material would react to the user's touch. Therefore, the user may develop a familiarity with some properties of the actual material via manipulating the simulated material before being re-introduced to the actual material.
  • Method 200 may find further use in professional training environments. For example, a display device may be configured to allow virtual dissections to be performed, thereby allowing doctors, medical students, veterinarians, veterinary students, and other health professionals to study anatomy via virtual dissections performed at a display device embodying method 200. For example, a display device may be configured to detect the proximity or touch of a practice scalpel, and tissue displayed on the display may be configured to display a reaction to the scalpel, such as to indent under light pressure and to open an incision under heavier pressure. It will be appreciated that the above-listed examples of use environments for method 200 are set forth merely for the purpose of example, and are not intended to be limiting in any manner.
  • FIG. 3 shows a graphical representation 302 and a schematic representation 304 of an example of changes made to an image displayed on a touch-sensitive display as an increase in touch pressure is detected over a period of time. Each numbered zone in graph 302 corresponds to the schematic representation of the surface effect having the same number.
  • As can be seen in graph 302, the degree of effects applied to the image increases relatively proportionately with increases in touch pressure. As touch pressure initially increases at a relatively faster pace, the surface effects are also changed at a relatively faster pace. The change is relatively linear for the first portion of the detected increase in measured touch pressure, and then increases less rapidly as the measure of touch pressure continues to increase. The schematic representations of the surface effects shown at 304 represent an indentation that may be displayed around the outer perimeter of an object, such as a finger approaching or touching the display surface. As the pressure increases, the indentation simulated in the image also increases and becomes more sharply delineated. Only five separate degrees of applied effects are shown in FIG. 3 for the purpose of clarity and simplicity. However, it will be appreciated that any number of separate degrees of applied effects may be employed, and that the richest implementations may be able to detect and respond to sufficiently fine changes in the measure of touch pressure as to represent a continuous reaction to changes in the measurement of pressure.
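  • One simple curve with the shape described above, roughly linear at small pressures and flattening as the measure of touch pressure continues to grow, is a saturating exponential; the formula below is an illustrative choice rather than one given in this disclosure:

$$E(p) = E_{\max}\left(1 - e^{-p/p_{0}}\right)$$

where $p$ is the measure of touch pressure, $E_{\max}$ is the largest degree of applied surface effect, and $p_{0}$ controls how quickly the effect saturates.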
  • As mentioned above, various different touch-sensitive input devices may be used to detect a change in touch pressure. FIGS. 4 and 5 show schematic diagrams of two examples of optical touch-sensitive display systems. First referring to FIG. 4, touch-sensitive display system 400 comprises a display screen 402 and a plurality of cameras 404a, 404b and 404c arranged around display screen 402. Cameras 404a-c are configured to capture images of a front side 406 of display screen 402 (i.e. the side of the display screen that faces a user) and a region of space adjacent to front side 406 of display screen 402. Cameras 404a-c are further configured to provide this image data to an electronic controller 407 configured to determine a height of an object 408 above display screen 402 when the object is within the field of view of cameras 404a-c. Therefore, in this embodiment, the height of object 408 above display screen 402 may serve as the measure of touch pressure exerted by the object, wherein the measure of pressure increases as object 408 gets closer to display screen 402. Further, cameras 404a-c may also be used to determine an approximate size of the object. From this object height and object size data, a measure of touch pressure and a measure of a touch surface area may be determined and used to modify an image displayed on display screen 402.
  • Further, controller 407 and cameras 404a-c may be configured to capture image data at appropriately high frame rates such that a velocity at which object 408 is moving relative to display screen 402 may be determined. In this manner, the velocity of the approaching object 408 may be used as an additional input to determine a measure of touch pressure.
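  • A small per-frame tracker such as the illustrative sketch below could combine the two camera-derived quantities discussed here, turning each height sample into both a measure of touch pressure and an approach velocity; the frame rate and height range constants are assumptions for illustration.

```python
# Illustrative sketch: track the object height reported by the cameras on
# each frame and derive a pressure measure plus an approach velocity.
# Frame rate and height range are assumptions for illustration.

class ApproachTracker:
    def __init__(self, frame_rate_hz: float = 120.0,
                 max_height_mm: float = 50.0):
        self.dt_s = 1.0 / frame_rate_hz
        self.max_height_mm = max_height_mm
        self.prev_height_mm = None

    def update(self, height_mm: float):
        """Return (pressure, velocity_mm_s) for the latest camera frame."""
        pressure = max(0.0, min(1.0, 1.0 - height_mm / self.max_height_mm))
        velocity = 0.0
        if self.prev_height_mm is not None:
            velocity = (self.prev_height_mm - height_mm) / self.dt_s
        self.prev_height_mm = height_mm
        return pressure, velocity
```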
  • FIG. 5 shows another embodiment of an optical touch-sensitive display device 500. Display device 500 comprises a camera 502 and an infrared light source 504 disposed within a body 506 of the display device. Camera 502 is configured to capture an image of a backside 508 of a display screen 510. This allows camera 502 to image objects that reflect infrared light from source 504. Display screen 510 may include a diffuser layer (not shown) to allow an image to be projected onto the screen.
  • In the embodiment of FIG. 5, objects located on or slightly above display screen 510 may be detectable, while objects located farther from the screen will not be imaged due to the presence of the diffuser layer in display screen 510. Further, where the object 512 (such as a finger) that is used to touch display screen 510 is relatively soft or deformable, the size of the image caused by the touching object may increase with increasing touch pressure due to the deformation of the touching object on the display screen 510. This is illustrated in FIG. 6, where a light touch is indicated by solid shape 600 and a heavier touch is indicated by dashed line shape 602. In this manner, a varying measure of touch may be detected by changes in the size of the object imaged by camera 502. This data may be input to an electronic controller 514 for calculating the appropriate modifications to make to an image displayed on screen 510.
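  • In this rear-camera arrangement, the bright region produced by a fingertip grows as the finger presses harder and flattens against the screen, so the blob's area can stand in for the measure of touch pressure; the thresholding sketch below is illustrative only, with assumed pixel and area constants.

```python
# Illustrative sketch: the size of the bright region seen by the rear camera
# (FIG. 6) serves as the measure of touch pressure. Thresholds are assumed.

def blob_area(frame, threshold: int = 200) -> int:
    """Count pixels brighter than `threshold` in a 2-D grayscale frame,
    given as a list of rows of integer pixel values."""
    return sum(1 for row in frame for pixel in row if pixel > threshold)

def pressure_from_blob(frame, light_touch_area: int = 200,
                       heavy_touch_area: int = 1200) -> float:
    """Map blob area linearly onto a 0-1 measure of touch pressure."""
    area = blob_area(frame)
    if area <= light_touch_area:
        return 0.0
    return min(1.0, (area - light_touch_area)
               / (heavy_touch_area - light_touch_area))
```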
  • FIGS. 7 and 8 show simple schematic diagrams of other types of touch-sensitive displays that may be used to provide a measure of touch pressure. First, FIG. 7 shows a simple schematic diagram of a resistive touch-sensitive display 700. Resistive touch-sensitive display 700 comprises two layers of materials 702, 704 held in a separated arrangement by one or more spacers (not shown). Each layer of material comprises an electrically conductive coating facing the other layer, and at least one of the layers comprises a resistive coating. A voltage V1 is applied to layer 702, and a voltage V2 is applied to layer 704. When touched, layers 702 and 704 are pressed together, thereby completing a circuit between layers 702 and 704. The (x,y) touch location and a measure of pressure may be determined by a controller 706 by analyzing the properties of the signal produced by the contacted layers.
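  • One plausible way to extract a pressure measure from such a panel, sketched below purely for illustration, is to note that a firmer press enlarges the contact patch between the layers and therefore lowers the measured contact resistance; the calibration resistances are assumptions, not figures from this disclosure.

```python
# Illustrative sketch: treat a lower measured contact resistance (firmer,
# larger contact between the layers) as a higher measure of touch pressure.
# The calibration resistances are assumptions for illustration.

def pressure_from_contact_resistance(r_touch_ohms: float,
                                     r_light_ohms: float = 2000.0,
                                     r_heavy_ohms: float = 200.0) -> float:
    """0.0 for a barely-touching (high-resistance) contact, 1.0 for a firm
    (low-resistance) contact, clamped to the [0, 1] range."""
    if r_touch_ohms >= r_light_ohms:
        return 0.0
    if r_touch_ohms <= r_heavy_ohms:
        return 1.0
    return (r_light_ohms - r_touch_ohms) / (r_light_ohms - r_heavy_ohms)
```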
  • Next, FIG. 8 shows a simple schematic diagram of a capacitive touch-sensitive display 800. Capacitive touch-sensitive display 800 comprises a capacitive layer 802 comprising a material configured to store an electric charge. When the screen is touched, some charge is transferred to the touching object as long as the object is electrically conductive. The decrease in stored charge is detected by the measurement of voltage at each corner of the screen, and the (x,y) touch location and measure of touch pressure may be determined from these voltage measurements by a controller 804.
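  • A hedged sketch of how a controller might decode such corner measurements follows; the use of corner-signal ratios for position and of the total signal as a pressure proxy is an illustrative assumption about one possible analysis, not a method prescribed by this disclosure.

```python
# Illustrative sketch: the signal drawn from each corner of a surface-
# capacitive panel grows as the touch approaches that corner, so the corner
# ratios give a rough (x, y) location while the total signal can serve as a
# crude measure of touch pressure. Normalization values are assumptions.

def decode_capacitive_touch(top_left: float, top_right: float,
                            bottom_left: float, bottom_right: float,
                            full_scale: float = 4.0):
    """Return ((x, y), pressure), each component normalized to [0, 1]."""
    total = top_left + top_right + bottom_left + bottom_right
    if total <= 0.0:
        return (0.5, 0.5), 0.0             # no touch detected
    x = (top_right + bottom_right) / total   # larger share on the right => larger x
    y = (bottom_left + bottom_right) / total # larger share at the bottom => larger y
    pressure = min(1.0, total / full_scale)
    return (x, y), pressure
```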
  • It will be appreciated that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. Furthermore, the specific routines or methods described herein may represent one or more of any number of processing strategies such as event-driven, interrupt-driven, multi-tasking, multi-threading, and the like. As such, various acts illustrated may be performed in the sequence illustrated, in parallel, or in some cases omitted. Likewise, the order of any of the above-described processes is not necessarily required to achieve the features and/or results of the embodiments described herein, but is provided for ease of illustration and description. The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.

Claims (11)

1-10. (canceled)
11. A display device, comprising:
a display;
an input device configured to detect a change in a proximity of an object to the display and a change in a touch pressure of an object on the display; and
a controller configured to receive a plurality of inputs from the input device and to simulate a change in a virtual pressure exerted on an image displayed on the display by changing the image in response to a detected change in the proximity of the object to the display.
12. The device of claim 11, wherein the input device comprises a plurality of image capture devices positioned to optically monitor a space adjacent to a front side of the display.
13. The device of claim 12, wherein the controller is configured to determine a speed of an object moving toward the display, and to change the image displayed on the display in response to the speed of the object moving toward the display.
14. The device of claim 11, wherein the input device comprises one or more of a capacitive touch-screen device and a resistive touch-screen device.
15. The device of claim 11, wherein the input device comprises an image capture device configured to capture an image of a backside of the display.
16. The device of claim 11, wherein the controller is configured to calculate the changed image based upon one or more of the detected change in the proximity of the object to the display and the detected change in the touch pressure of the object on the display.
17. The device of claim 11, wherein the controller is configured to simulate an increase in pressure by changing the image to simulate an indentation in the image corresponding to the increase in pressure, and wherein the controller is configured to simulate the indentation for a duration after the input device no longer detects one or more of the object proximate to the display and the object on the display.
18. A device, comprising:
a display;
a plurality of image capture devices positioned to optically monitor a surface of the display and a volume of space adjacent to the surface of the display; and
a controller configured to receive image data from the plurality of image capture devices, to calculate a distance from the display of an object detected by the plurality of image capture devices, and to change an image displayed on the display based upon the distance of the object from the display.
19. The device of claim 18, wherein the controller is configured to determine a speed of a motion of an object toward the display, and to change the image displayed on the display in response to the speed of the motion of the object toward the display.
20. The device of claim 18, wherein the controller is configured to change the image displayed on the display via calculations based upon the distance of the object from the display.
US 13/118,145, priority date 2007-04-16, filed 2011-05-27, Visual Simulation of Touch Pressure, status Abandoned, published as US20110227877A1 (en)

Priority Applications (1)

US 13/118,145 (US20110227877A1), priority date 2007-04-16, filed 2011-05-27: Visual Simulation of Touch Pressure

Applications Claiming Priority (2)

US 11/787,372 (US 7,973,778 B2), priority date 2007-04-16, filed 2007-04-16: Visual simulation of touch pressure
US 13/118,145 (US20110227877A1), priority date 2007-04-16, filed 2011-05-27: Visual Simulation of Touch Pressure

Related Parent Applications (1)

US 11/787,372 (Division; US 7,973,778 B2), priority date 2007-04-16, filed 2007-04-16: Visual simulation of touch pressure

Publications (1)

US20110227877A1 (en), published 2011-09-22

Family

ID=39853283

Family Applications (2)

US 11/787,372: Visual simulation of touch pressure, priority date 2007-04-16, filed 2007-04-16, granted as US 7,973,778 B2 (en), status Active (2030-05-04)
US 13/118,145: Visual Simulation of Touch Pressure, priority date 2007-04-16, filed 2011-05-27, published as US20110227877A1 (en), status Abandoned

Country Status (1)

US (2 publications): US 7,973,778 B2 (en)

Families Citing this family (91)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090046110A1 (en) * 2007-08-16 2009-02-19 Motorola, Inc. Method and apparatus for manipulating a displayed image
TWI380196B (en) * 2007-12-25 2012-12-21 Pixart Imaging Inc Method for detecting users' pressing action and optical operating unit
US20100085328A1 (en) * 2008-10-08 2010-04-08 Hewlett-Packard Development Company, L.P. Touch-Sensitive Display Device And Method
JP4885938B2 (en) * 2008-12-25 2012-02-29 京セラ株式会社 Input device
JP4746085B2 (en) * 2008-12-25 2011-08-10 京セラ株式会社 Input device
JP4746086B2 (en) * 2008-12-25 2011-08-10 京セラ株式会社 Input device
KR101628782B1 (en) * 2009-03-20 2016-06-09 삼성전자주식회사 Apparatus and method for providing haptic function using multi vibrator in portable terminal
US20100238126A1 (en) * 2009-03-23 2010-09-23 Microsoft Corporation Pressure-sensitive context menus
JP4723660B2 (en) * 2009-04-24 2011-07-13 京セラ株式会社 Input device
WO2010131122A2 (en) * 2009-05-13 2010-11-18 France Telecom User interface to provide enhanced control of an application program
US8373669B2 (en) * 2009-07-21 2013-02-12 Cisco Technology, Inc. Gradual proximity touch screen
US8499239B2 (en) * 2009-08-28 2013-07-30 Microsoft Corporation Globe container
EP2494427A4 (en) * 2009-10-26 2015-09-02 Semiconductor Energy Lab Display device and semiconductor device
KR20110056167A (en) * 2009-11-20 2011-05-26 삼성전자주식회사 Display apparatus and calibration method therefor
GB0921216D0 (en) * 2009-12-03 2010-01-20 St Microelectronics Res & Dev Improved touch screen device
US20110221684A1 (en) * 2010-03-11 2011-09-15 Sony Ericsson Mobile Communications Ab Touch-sensitive input device, mobile device and method for operating a touch-sensitive input device
TWI564757B (en) 2010-08-31 2017-01-01 萬國商業機器公司 Computer device with touch screen, method, and computer readable medium for operating the same
US9507485B2 (en) * 2010-09-27 2016-11-29 Beijing Lenovo Software Ltd. Electronic device, displaying method and file saving method
JP5457987B2 (en) * 2010-09-27 2014-04-02 株式会社ジャパンディスプレイ Touch detection device, display device with touch detection function, touch position detection method, and electronic device
US20120105373A1 (en) * 2010-10-31 2012-05-03 Chih-Min Liu Method for detecting touch status of surface of input device and input device thereof
US9965094B2 (en) 2011-01-24 2018-05-08 Microsoft Technology Licensing, Llc Contact geometry tests
US8988087B2 (en) 2011-01-24 2015-03-24 Microsoft Technology Licensing, Llc Touchscreen testing
US9542092B2 (en) 2011-02-12 2017-01-10 Microsoft Technology Licensing, Llc Prediction-based touch contact tracking
US8982061B2 (en) 2011-02-12 2015-03-17 Microsoft Technology Licensing, Llc Angular contact geometry
JP5708083B2 (en) * 2011-03-17 2015-04-30 ソニー株式会社 Electronic device, information processing method, program, and electronic device system
TWI454978B (en) * 2011-05-02 2014-10-01 Shih Hua Technology Ltd Touching based input device
TWI453649B (en) * 2011-05-02 2014-09-21 Shih Hua Technology Ltd Display device with touch panel
US8913019B2 (en) 2011-07-14 2014-12-16 Microsoft Corporation Multi-finger detection and component resolution
US9785281B2 (en) 2011-11-09 2017-10-10 Microsoft Technology Licensing, Llc. Acoustic touch sensitive testing
US8914254B2 (en) 2012-01-31 2014-12-16 Microsoft Corporation Latency measurement
KR101991749B1 (en) * 2012-02-16 2019-06-21 삼성전자주식회사 Apparatus and method for controlling lock function in portable terminal
KR101894567B1 (en) * 2012-02-24 2018-09-03 삼성전자 주식회사 Operation Method of Lock Screen And Electronic Device supporting the same
WO2013169845A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for scrolling nested regions
WO2013169843A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for manipulating framed graphical objects
AU2013259642A1 (en) 2012-05-09 2014-12-04 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
WO2013169842A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for selecting object within a group of objects
DE112013002387T5 (en) 2012-05-09 2015-02-12 Apple Inc. Apparatus, method and graphical user interface for providing tactile feedback for operations in a user interface
WO2013169849A2 (en) 2012-05-09 2013-11-14 Industries Llc Yknots Device, method, and graphical user interface for displaying user interface objects corresponding to an application
WO2013169851A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for facilitating user interaction with controls in a user interface
EP2847660B1 (en) 2012-05-09 2018-11-14 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
WO2013169865A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
DE112013002409T5 (en) 2012-05-09 2015-02-26 Apple Inc. Apparatus, method and graphical user interface for displaying additional information in response to a user contact
DE112013002412T5 (en) 2012-05-09 2015-02-19 Apple Inc. Apparatus, method and graphical user interface for providing feedback for changing activation states of a user interface object
AU2013259630B2 (en) 2012-05-09 2016-07-07 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to gesture
WO2013169875A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US9493342B2 (en) 2012-06-21 2016-11-15 Nextinput, Inc. Wafer level MEMS force dies
US9481084B2 (en) * 2012-06-22 2016-11-01 Microsoft Technology Licensing, Llc Touch quality test robot
EP2870445A1 (en) 2012-07-05 2015-05-13 Ian Campbell Microelectromechanical load sensor and methods of manufacturing the same
US9557846B2 (en) 2012-10-04 2017-01-31 Corning Incorporated Pressure-sensing touch system utilizing optical and capacitive systems
US9317147B2 (en) 2012-10-24 2016-04-19 Microsoft Technology Licensing, Llc. Input testing tool
US10101905B1 (en) * 2012-12-07 2018-10-16 American Megatrends, Inc. Proximity-based input device
KR101905174B1 (en) * 2012-12-29 2018-10-08 애플 인크. Device, method, and graphical user interface for navigating user interface hierachies
WO2014105279A1 (en) 2012-12-29 2014-07-03 Yknots Industries Llc Device, method, and graphical user interface for switching between user interfaces
KR102001332B1 (en) 2012-12-29 2019-07-17 애플 인크. Device, method, and graphical user interface for determining whether to scroll or select contents
EP2939095B1 (en) 2012-12-29 2018-10-03 Apple Inc. Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics
WO2014105275A1 (en) 2012-12-29 2014-07-03 Yknots Industries Llc Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
CN104903834B (en) 2012-12-29 2019-07-05 苹果公司 For equipment, method and the graphic user interface in touch input to transition between display output relation
EP3094950B1 (en) 2014-01-13 2022-12-21 Nextinput, Inc. Miniaturized and ruggedized wafer level mems force sensors
US9971406B2 (en) 2014-12-05 2018-05-15 International Business Machines Corporation Visually enhanced tactile feedback
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US9990107B2 (en) 2015-03-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US20170045981A1 (en) 2015-08-10 2017-02-16 Apple Inc. Devices and Methods for Processing Touch Inputs Based on Their Intensities
US10152208B2 (en) 2015-04-01 2018-12-11 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US9674426B2 (en) 2015-06-07 2017-06-06 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
WO2016201235A1 (en) 2015-06-10 2016-12-15 Nextinput, Inc. Ruggedized wafer level mems force sensor with a tolerance trench
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
EP3580539A4 (en) 2017-02-09 2020-11-25 Nextinput, Inc. Integrated digital force sensors and related methods of manufacture
US11243125B2 (en) 2017-02-09 2022-02-08 Nextinput, Inc. Integrated piezoresistive and piezoelectric fusion force sensor
US11221263B2 (en) 2017-07-19 2022-01-11 Nextinput, Inc. Microelectromechanical force sensor having a strain transfer layer arranged on the sensor die
US11423686B2 (en) 2017-07-25 2022-08-23 Qorvo Us, Inc. Integrated fingerprint and force sensor
WO2019023552A1 (en) 2017-07-27 2019-01-31 Nextinput, Inc. A wafer bonded piezoresistive and piezoelectric force sensor and related methods of manufacture
WO2019079420A1 (en) 2017-10-17 2019-04-25 Nextinput, Inc. Temperature coefficient of offset compensation for force sensor and strain gauge
WO2019090057A1 (en) 2017-11-02 2019-05-09 Nextinput, Inc. Sealed force sensor with etch stop layer
US11874185B2 (en) 2017-11-16 2024-01-16 Nextinput, Inc. Force attenuator for force sensor
CN109104658B (en) * 2018-07-26 2020-06-05 歌尔科技有限公司 Touch identification method and device of wireless earphone and wireless earphone
US10962427B2 (en) 2019-01-10 2021-03-30 Nextinput, Inc. Slotted MEMS force sensor
JP7309466B2 (en) * 2019-06-11 2023-07-18 キヤノン株式会社 Electronic equipment and its control method
US11907463B2 (en) * 2020-05-08 2024-02-20 Accenture Global Solutions Limited Pressure-sensitive machine interface device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19942376A1 (en) 1999-09-04 2001-04-12 Schott Interactive Glass Gmbh Pressure switch element and its use

Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5241308A (en) * 1990-02-22 1993-08-31 Paragon Systems, Inc. Force sensitive touch panel
US5231698A (en) * 1991-03-20 1993-07-27 Forcier Mitchell D Script/binary-encoded-character processing method and system
US5953735A (en) * 1991-03-20 1999-09-14 Forcier; Mitchell D. Script character processing method and system with bit-mapped document editing
US5610629A (en) * 1991-12-06 1997-03-11 Ncr Corporation Pen input to liquid crystal display
US5475401A (en) * 1993-04-29 1995-12-12 International Business Machines, Inc. Architecture and method for communication of writing and erasing signals from a remote stylus to a digitizing display
US20020130844A1 (en) * 1998-12-31 2002-09-19 Natoli Anthony James Francis Virtual reality keyboard system and method
US6801191B2 (en) * 2001-04-27 2004-10-05 Matsushita Electric Industrial Co., Ltd. Input device and inputting method with input device
US6567102B2 (en) * 2001-06-05 2003-05-20 Compal Electronics Inc. Touch screen using pressure to control the zoom ratio
US7138984B1 (en) * 2001-06-05 2006-11-21 Idc, Llc Directly laminated touch sensitive screen
US7077009B2 (en) * 2001-06-28 2006-07-18 Tactex Controls Inc. Pressure sensitive surfaces
US20060252541A1 (en) * 2002-07-27 2006-11-09 Sony Computer Entertainment Inc. Method and system for applying gearing effects to visual tracking
US20040108995A1 (en) * 2002-08-28 2004-06-10 Takeshi Hoshino Display unit with touch panel
US20060022962A1 (en) * 2002-11-15 2006-02-02 Gerald Morrison Size/scale and orientation determination of a pointer in a camera-based touch system
US20050057531A1 (en) * 2003-09-17 2005-03-17 Joseph Patino Method and system for generating characters
US20050226505A1 (en) * 2004-03-31 2005-10-13 Wilson Andrew D Determining connectedness and offset of 3D objects relative to an interactive surface
US20060022955A1 (en) * 2004-07-30 2006-02-02 Apple Computer, Inc. Visual expander
US20060044280A1 (en) * 2004-08-31 2006-03-02 Huddleston Wyatt A Interface
US20060109256A1 (en) * 2004-10-08 2006-05-25 Immersion Corporation, A Delaware Corporation Haptic feedback for button and scrolling action simulation in touch input devices
US20060227120A1 (en) * 2005-03-28 2006-10-12 Adam Eikman Photonic touch screen apparatus and method of use
US20060284874A1 (en) * 2005-06-15 2006-12-21 Microsoft Corporation Optical flow-based manipulation of graphical objects

Cited By (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110134061A1 (en) * 2009-12-08 2011-06-09 Samsung Electronics Co. Ltd. Method and system for operating a mobile device according to the rate of change of the touch area
US9619025B2 (en) * 2009-12-08 2017-04-11 Samsung Electronics Co., Ltd. Method and system for operating a mobile device according to the rate of change of the touch area
US20120274601A1 (en) * 2011-04-29 2012-11-01 Shih Hua Technology Ltd. Method for detecting touch trace based on resistive touch panel
US8624872B2 (en) * 2011-04-29 2014-01-07 Shih Hua Technology Ltd. Method for detecting touch trace based on resistive touch panel
US9417754B2 (en) 2011-08-05 2016-08-16 P4tents1, LLC User interface system, method, and computer program product
US10013094B1 (en) 2011-08-05 2018-07-03 P4tents1, LLC System, method, and computer program product for a multi-pressure selection touch screen
US10013095B1 (en) 2011-08-05 2018-07-03 P4tents1, LLC Multi-type gesture-equipped touch screen system, method, and computer program product
US10031607B1 (en) 2011-08-05 2018-07-24 P4tents1, LLC System, method, and computer program product for a multi-pressure selection touch screen
US10120480B1 (en) 2011-08-05 2018-11-06 P4tents1, LLC Application-specific pressure-sensitive touch screen system, method, and computer program product
US10133397B1 (en) 2011-08-05 2018-11-20 P4tents1, LLC Tri-state gesture-equipped touch screen system, method, and computer program product
US10146353B1 (en) 2011-08-05 2018-12-04 P4tents1, LLC Touch screen system, method, and computer program product
US10156921B1 (en) 2011-08-05 2018-12-18 P4tents1, LLC Tri-state gesture-equipped touch screen system, method, and computer program product
US10162448B1 (en) 2011-08-05 2018-12-25 P4tents1, LLC System, method, and computer program product for a pressure-sensitive touch screen for messages
US10203794B1 (en) 2011-08-05 2019-02-12 P4tents1, LLC Pressure-sensitive home interface system, method, and computer program product
US10209806B1 (en) 2011-08-05 2019-02-19 P4tents1, LLC Tri-state gesture-equipped touch screen system, method, and computer program product
US10209809B1 (en) 2011-08-05 2019-02-19 P4tents1, LLC Pressure-sensitive touch screen system, method, and computer program product for objects
US10209807B1 (en) 2011-08-05 2019-02-19 P4tents1, LLC Pressure sensitive touch screen system, method, and computer program product for hyperlinks
US10209808B1 (en) 2011-08-05 2019-02-19 P4tents1, LLC Pressure-based interface system, method, and computer program product with virtual display layers
US10222893B1 (en) 2011-08-05 2019-03-05 P4tents1, LLC Pressure-based touch screen system, method, and computer program product with virtual display layers
US10222892B1 (en) 2011-08-05 2019-03-05 P4tents1, LLC System, method, and computer program product for a multi-pressure selection touch screen
US10222895B1 (en) 2011-08-05 2019-03-05 P4tents1, LLC Pressure-based touch screen system, method, and computer program product with virtual display layers
US10222894B1 (en) 2011-08-05 2019-03-05 P4tents1, LLC System, method, and computer program product for a multi-pressure selection touch screen
US10222891B1 (en) 2011-08-05 2019-03-05 P4tents1, LLC Setting interface system, method, and computer program product for a multi-pressure selection touch screen
US10275087B1 (en) 2011-08-05 2019-04-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10275086B1 (en) 2011-08-05 2019-04-30 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10338736B1 (en) 2011-08-05 2019-07-02 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10345961B1 (en) 2011-08-05 2019-07-09 P4tents1, LLC Devices and methods for navigating between user interfaces
US10365758B1 (en) 2011-08-05 2019-07-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10386960B1 (en) 2011-08-05 2019-08-20 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10521047B1 (en) 2011-08-05 2019-12-31 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10534474B1 (en) 2011-08-05 2020-01-14 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10540039B1 (en) 2011-08-05 2020-01-21 P4tents1, LLC Devices and methods for navigating between user interface
US10551966B1 (en) 2011-08-05 2020-02-04 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10592039B1 (en) 2011-08-05 2020-03-17 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product for displaying multiple active applications
US10606396B1 (en) 2011-08-05 2020-03-31 P4tents1, LLC Gesture-equipped touch screen methods for duration-based functions
US10642413B1 (en) 2011-08-05 2020-05-05 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10649580B1 (en) 2011-08-05 2020-05-12 P4tents1, LLC Devices, methods, and graphical use interfaces for manipulating user interface objects with visual and/or haptic feedback
US10649581B1 (en) 2011-08-05 2020-05-12 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10649579B1 (en) 2011-08-05 2020-05-12 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10649571B1 (en) 2011-08-05 2020-05-12 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10649578B1 (en) 2011-08-05 2020-05-12 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10656752B1 (en) 2011-08-05 2020-05-19 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10656757B1 (en) 2011-08-05 2020-05-19 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10656754B1 (en) 2011-08-05 2020-05-19 P4tents1, LLC Devices and methods for navigating between user interfaces
US10656755B1 (en) 2011-08-05 2020-05-19 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10656759B1 (en) 2011-08-05 2020-05-19 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10656753B1 (en) 2011-08-05 2020-05-19 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10656758B1 (en) 2011-08-05 2020-05-19 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10656756B1 (en) 2011-08-05 2020-05-19 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10664097B1 (en) 2011-08-05 2020-05-26 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10671213B1 (en) 2011-08-05 2020-06-02 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10671212B1 (en) 2011-08-05 2020-06-02 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10725581B1 (en) 2011-08-05 2020-07-28 P4tents1, LLC Devices, methods and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10782819B1 (en) 2011-08-05 2020-09-22 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10788931B1 (en) 2011-08-05 2020-09-29 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10838542B1 (en) 2011-08-05 2020-11-17 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10936114B1 (en) 2011-08-05 2021-03-02 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US10996787B1 (en) 2011-08-05 2021-05-04 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US11061503B1 (en) 2011-08-05 2021-07-13 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11740727B1 (en) 2011-08-05 2023-08-29 P4Tents1 Llc Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback

Also Published As

Publication number Publication date
US20080252616A1 (en) 2008-10-16
US7973778B2 (en) 2011-07-05

Similar Documents

Publication Publication Date Title
US7973778B2 (en) Visual simulation of touch pressure
CN104737096B (en) Display device
CN205721636U (en) An electronic device
CN107690609A (en) Force inputs and cursor control
CN107667332A (en) Force sensing and inadvertent input control
CN107835968A (en) Force curves and inadvertent input control
TWI584164B (en) Emulating pressure sensitivity on multi-touch devices
CN105683882A (en) Latency measuring and testing system and method
Kubo et al. B2B-Swipe: Swipe gesture for rectangular smartwatches from a bezel to a bezel
Heo et al. Indirect shear force estimation for multi-point shear force operations
CN110096136A (en) System for visual processing of spectrograms to generate haptic effects
US20140198071A1 (en) Force Sensing Touchscreen
US20180011538A1 (en) Multimodal haptic effects
Liu et al. Tri-modal tactile display and its application into tactile perception of visualized surfaces
US9075438B2 (en) Systems and related methods involving stylus tactile feel
TWI488082B (en) Portable electronic apparatus and touch sensing method
Smith et al. Low-cost malleable surfaces with multi-touch pressure sensitivity
Nguyen et al. Bendid: Flexible interface for localized deformation recognition
Nguyen et al. SOFTii: soft tangible interface for continuous control of virtual objects with pressure-based input
JP5876733B2 (en) User interface device capable of imparting tactile vibration according to object height, tactile vibration imparting method and program
Sziládi et al. Cost-effective hand gesture computer control interface
Hwang et al. Micpen: pressure-sensitive pen interaction using microphone with standard touchscreen
Li et al. Pseudo-haptics for rigid tool/soft surface interaction feedback in virtual environments
Muller Multi-touch displays: design, applications and performance evaluation
Chung et al. Effect of elastic touchscreen and input devices with different softness on user task performance and subjective satisfaction

Legal Events

Date Code Title Description

AS Assignment
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034544/0001
Effective date: 20141014

STCB Information on status: application discontinuation
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION