US20110199386A1 - Overlay feature to provide user assistance in a multi-touch interactive display environment - Google Patents

Overlay feature to provide user assistance in a multi-touch interactive display environment

Info

Publication number
US20110199386A1
Authority
US
United States
Prior art keywords
overlay, display, touch, semi-transparent
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/705,026
Inventor
Pallavi Dharwada
Jason Laberge
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honeywell International Inc
Original Assignee
Honeywell International Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honeywell International Inc filed Critical Honeywell International Inc
Priority to US12/705,026
Assigned to HONEYWELL INTERNATIONAL INC. Assignment of assignors interest (see document for details). Assignors: DHARWADA, PALLAVI; LABERGE, JASON
Publication of US20110199386A1
Status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048: Indexing scheme relating to G06F 3/048
    • G06F 2203/04804: Transparency, e.g. transparent or translucent windows
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048: Indexing scheme relating to G06F 3/048
    • G06F 2203/04806: Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048: Indexing scheme relating to G06F 3/048
    • G06F 2203/04808: Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Abstract

An example method of displaying items on a touch screen display includes detecting contact with a touch-sensitive display. The method further includes generating a semi-transparent overlay that appears on the display based on the contact with the touch-sensitive display. In other embodiments, the example method includes detecting how many contacts are made with a touch-sensitive display, and generating an overlay that appears on the display, wherein the type of overlay that appears on the display is based on the number of contacts that are made with the touch-sensitive display.

Description

    BACKGROUND
  • Monitoring large and complex environments is a challenging task for security operators because situations evolve quickly, information is distributed across multiple screens and systems, uncertainty is rampant, decisions can carry high risk and far-reaching consequences, and responses must be quick and coordinated when problems occur. The increased market presence of single-touch and multi-touch interaction devices such as the iPhone, GPS navigators, the HP TouchSmart laptop, Microsoft Surface and BlackBerry mobile devices offers a significant opportunity to investigate new gesture-based interaction techniques that can improve operator performance during complex monitoring and response tasks.
  • However, the solutions that are typically incorporated to address the myriad of needs in complex security environments often consist of adding a multitude of features and functions. Unfortunately, one consequence of adding features is that operators must remember which features are available, including when and how to access them.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1A illustrates an example touch-sensitive display that shows a semi-transparent overlay after touching the display.
  • FIG. 1B illustrates the touch-sensitive display of FIG. 1A after a user has executed the gesture path shown on the semi-transparent overlay.
  • FIG. 2A illustrates an example touch-sensitive display that shows an overlay after a single touch of the display.
  • FIG. 2B illustrates the touch-sensitive display of FIG. 2A after a user has executed the gesture path shown on the overlay.
  • FIG. 3A illustrates an example touch-sensitive display that shows an overlay after a dual touch of the display.
  • FIG. 3B illustrates the touch-sensitive display of FIG. 3A after a user has executed the gesture path shown on the overlay.
  • FIG. 4A illustrates an example touch-sensitive display that shows an overlay after a simultaneous four touch point contact of the display.
  • FIG. 4B illustrates the touch-sensitive display of FIG. 4A after a user has executed the gesture path shown on the overlay.
  • FIG. 5 is a block diagram of an example system for executing the method described herein with reference to FIGS. 1-4.
  • DETAILED DESCRIPTION
  • In the following description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific embodiments which may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that structural, electrical, and optical changes may be made without departing from the scope of the present invention. The following description of example embodiments is, therefore, not to be taken in a limited sense, and the scope of the present invention is defined by the appended claims.
  • The functions or algorithms described herein may be implemented in software or a combination of software and human implemented procedures in one embodiment. The software may consist of computer executable instructions stored on computer readable media such as memory or other type of storage devices. Further, such functions correspond to modules, which are software, hardware, firmware or any combination thereof. Multiple functions may be performed in one or more modules as desired, and the embodiments described are merely examples. The software may be executed on a digital signal processor, ASIC, microprocessor, or other type of processor operating on a computer system, such as a personal computer, server or other computer system.
  • FIGS. 1A-1B illustrate an example method of displaying items on a touch screen display 10. The method includes detecting contact with a touch-sensitive display 10. The method further includes generating a semi-transparent overlay 12 that appears on the display 10 (FIG. 1A) based on the contact with the touch-sensitive display 10.
  • The semi-transparent overlay(s) 12 described herein may provide memory aiding functions that help users remember the features available at any given time in complex environments. The semi-transparent overlays 12 can show (i) different features or functions available based on the current context; (ii) context-sensitive menu options (similar to a right click function in most desktop applications); (iii) help information (e.g., tooltips, online help, etc.).
  • In some embodiments, when a semi-transparent overlay 12 shows the available options, it can provide paths that the user can trace with their finger(s) to help users remember the available gestures at any given time (i.e., help users intuitively identify the actions that can be performed on the display 10) or help users access the functions available. The semi-transparent overlay 12 may be context aware in the sense that the gestures or other features shown in each overlay window will vary based on the current situation shown on the display 10 (other options may also be possible). Some examples of context aware situations include (i) information currently shown on the screen (e.g., camera positions and status); (ii) current situation (e.g., normal operations vs. active incident); (iii) features recently accessed (e.g., navigation moves); and/or (iv) last user interaction (e.g., entering data in a form).
  • It should be noted that any of the semi-transparent overlays 12 can be turned on/off depending on the task situation and/or user preferences (i.e., an expert user may not require the overlay). When gestures are detected as being carried out commensurate with the gesture path shown in the overlay 12, the display 10 shows information in the same manner as if the gesture was completed without the semi-transparent overlay.
  • One example operation of the method involving the semi-transparent overlay may be described as follows:
  • 1. A user brings hands toward the display 10 and touches one or more points on the surface.
  • 2. The method detects touch points and the touch points remain active based on timing requirements.
  • 3. The method generates a context aware semi-transparent overlay 12, which may be shown on the display 10 along with corresponding text that indicates the action the overlay 12 affords (see FIG. 1A). The semi-transparent overlay window 12 may be anchored to the touch point on a corner, and may show the options available in the current context. If multiple overlays apply in the current context, they may be arranged in a tiled manner along with appropriate text labels (if text labels are used).
  • 4. The user traces the gesture path(s) shown in the semi-transparent overlay (see FIG. 1B). Depending on the application where the system and method are used, when gesture tracing begins, the overlay(s) may disappear or be minimized.
  • In one example embodiment, the semi-transparent overlay window(s) will default to 50% transparency with background color (R:221, G:221, B:221), although other colors and transparencies may be configurable. In addition, the semi-transparent overlay(s) 12 may be anchored to the touch point on a corner. The size of the semi-transparent overlay window may also vary depending on (i) current context (gestures supported); (ii) size of gesture tracing paths; (iii) font size; (iv) display resolution; and (v) display size.
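  • By way of illustration only, the following TypeScript sketch (assuming a browser DOM; the names OverlayConfig and createOverlay are hypothetical, not part of this disclosure) shows one way the stated defaults could be applied: 50% transparency, an rgb(221,221,221) background, and a corner of the overlay window anchored to the touch point.

```typescript
// Hypothetical sketch; names and platform (browser DOM) are assumptions.
interface OverlayConfig {
  opacity: number;     // 0.5 corresponds to the 50% transparency default
  background: string;  // default background color from the text above
}

const DEFAULTS: OverlayConfig = {
  opacity: 0.5,
  background: "rgb(221, 221, 221)",
};

function createOverlay(touchX: number, touchY: number,
                       cfg: OverlayConfig = DEFAULTS): HTMLElement {
  const el = document.createElement("div");
  el.style.position = "absolute";
  el.style.left = `${touchX}px`; // anchor the window's corner
  el.style.top = `${touchY}px`;  // to the detected touch point
  el.style.opacity = String(cfg.opacity);
  el.style.backgroundColor = cfg.background;
  document.body.appendChild(el);
  return el;
}

// Example use: createOverlay(touch.clientX, touch.clientY);
```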
  • The semi-transparent overlay 12 may provide a visual cue as to where the touch point must start for the gesture trace. As an example, the user may have to move their finger (touch point) to the middle and trace either up/down or left/right. In addition, users may be able to close the semi-transparent overlay 12 by pressing a close button. Gesture paths may be shown as dashed lines with arrows at the end to indicate the direction of tracing for each possible gesture.
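  • As a minimal sketch of that visual treatment (assuming an HTML canvas layered over the display; drawGesturePath is a hypothetical name), a gesture path could be rendered as a dashed segment with an arrowhead marking the tracing direction:

```typescript
// Hypothetical sketch: dashed gesture path with a direction arrowhead.
function drawGesturePath(ctx: CanvasRenderingContext2D,
                         x1: number, y1: number,
                         x2: number, y2: number): void {
  ctx.setLineDash([8, 6]); // dashed line for the traceable path
  ctx.beginPath();
  ctx.moveTo(x1, y1);
  ctx.lineTo(x2, y2);
  ctx.stroke();

  // Arrowhead at the end indicates the direction of tracing.
  const angle = Math.atan2(y2 - y1, x2 - x1);
  ctx.setLineDash([]);
  ctx.beginPath();
  ctx.moveTo(x2, y2);
  ctx.lineTo(x2 - 10 * Math.cos(angle - Math.PI / 6),
             y2 - 10 * Math.sin(angle - Math.PI / 6));
  ctx.moveTo(x2, y2);
  ctx.lineTo(x2 - 10 * Math.cos(angle + Math.PI / 6),
             y2 - 10 * Math.sin(angle + Math.PI / 6));
  ctx.stroke();
}
```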
  • If multiple semi-transparent overlays are shown based on the current context, the last one used may be shown anchored to the touch point and act as the primary or active overlay. Additional overlays may be organized in a tiled arrangement starting from the bottom left corner and moving to the right and up (consuming available display real estate as needed).
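  • A sketch of one possible tiling rule (all names and the coordinate convention are assumptions): secondary overlays fill from the bottom-left corner, moving right along a row and wrapping upward when a row is full.

```typescript
// Hypothetical sketch: compute the top-left position of the i-th secondary
// overlay tile, filling bottom-left to right, then up one row at a time.
function tilePosition(index: number, tileW: number, tileH: number,
                      screenW: number, screenH: number): { x: number; y: number } {
  const perRow = Math.max(1, Math.floor(screenW / tileW));
  const col = index % perRow;
  const row = Math.floor(index / perRow);
  return { x: col * tileW, y: screenH - (row + 1) * tileH };
}
```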
  • In some embodiments, as soon as the user starts tracing a gesture path in one of the semi-transparent overlays 12 any other active overlays may disappear. Other embodiments could have the semi-transparent overlay(s) 12 (i) increase in transparency (thus causing them to fade more in the background but not disappear); (ii) reduce to an inactive window (but again not disappear); and/or (iii) stay active (depending on context). In still other embodiments, as soon as the user starts tracing a gesture path in one of the semi-transparent overlays, the active gesture path may change color to provide feedback to the user.
  • It should be noted that when the user starts tracing a gesture path the system and method may react in the same manner as if the gesture was completed without the semi-transparent overlay 12 and gesture path. As an example, the user may start by touching the center and then tracing the gesture path to the right such that the system and method could react by moving the current view to the right (i.e., the same behavior that would happen if the user completed a left-to-right gesture without the semi-transparent overlay 12).
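  • To make that concrete, a minimal sketch (moveCurrentView and completeGesture are assumed names) routes a traced overlay gesture and a free-form gesture through the same handler, so the view pans identically in both cases:

```typescript
// Hypothetical sketch: one dispatch point for traced and free-form gestures.
type PanHandler = (dx: number, dy: number) => void;

const moveCurrentView: PanHandler = (dx, dy) => {
  console.log(`pan current view by (${dx}, ${dy})`);
};

function completeGesture(dx: number, dy: number, overlay?: HTMLElement): void {
  overlay?.remove();       // dismiss (or fade/minimize) the overlay if shown
  moveCurrentView(dx, dy); // the display reacts the same either way
}
```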
  • FIGS. 2A-2B illustrate another example method of displaying items on a touch screen display 10. The method includes detecting how many contacts are made with a touch-sensitive display 10. The method further includes generating an overlay 14, 16, 18 that appears on the display 10, wherein the type of overlay 14, 16, 18 that appears on the display 10 is based on the number of contacts that are made with the touch-sensitive display 10 (FIGS. 2A, 3A and 4A).
  • It should be noted that the touch points detected by the system may come from a single finger, multiple fingers, multiple hands or multiple users.
  • Single Touch Point Timing Requirements
  • In some embodiments, a single touch point may be detected by the unit within a configurable time period (e.g., 1 sec). In addition, the touch point may need to remain active for a minimum configurable amount of time before the overlay is shown (e.g., 250 msecs). The appropriate time requirements will depend on the display platform, sampling rate, and target domain and/or application. When the timing requirements are met, the appropriate context overlay is shown (see FIG. 2A).
  • Two Touch Point Timing Requirements
  • In some embodiments, two touch points must be detected by the unit within a configurable time period (e.g., 1 sec). In addition, both touch points may need to remain active for a minimum configurable amount of time before the overlay is shown (e.g., 250 msecs). The appropriate time requirements will depend on the display platform, sampling rate, and target domain and/or application. When the timing requirements are met, the appropriate dual touch context overlay is shown (see FIG. 3A).
  • Four Touch Point Timing Requirements
  • In some embodiments, four touch points must be detected by the unit within a configurable time period (e.g., 1 sec). In addition, all touch points may need to remain active for a minimum configurable amount of time before the overlay is shown (e.g., 250 msecs). The appropriate time requirements will depend on the display platform, sampling rate, and target domain and/or application. When the timing requirements are met, the appropriate four touch context overlay is shown (see FIG. 4A).
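  • The three timing rules above share one pattern, sketched below under stated assumptions (browser touch events; the 1 s window, the 250 ms hold, and the mapping of one, two, and four touch points to overlays 14, 16 and 18 come from this description, while every identifier is hypothetical): all touch points must arrive within the detection window and remain active for the minimum hold time before the matching overlay is shown.

```typescript
// Hypothetical sketch of the timing requirements; values are configurable.
const DETECTION_WINDOW_MS = 1000; // all touch points must arrive within 1 s
const MIN_HOLD_MS = 250;          // points must stay active for 250 ms

const overlayByTouchCount: Record<number, string> = {
  1: "single-touch overlay (14)",
  2: "dual-touch overlay (16)",
  4: "four-touch overlay (18)",
};

let firstTouchAt = 0;
let holdTimer: number | undefined;

window.addEventListener("touchstart", (e: TouchEvent) => {
  const now = performance.now();
  if (e.touches.length === 1) firstTouchAt = now;
  if (now - firstTouchAt > DETECTION_WINDOW_MS) return; // arrived too late

  clearTimeout(holdTimer);
  const count = e.touches.length;
  holdTimer = window.setTimeout(() => {
    const overlay = overlayByTouchCount[count];
    if (overlay) console.log(`show ${overlay}`); // show the matching overlay
  }, MIN_HOLD_MS);
});

// Lifting a finger before the hold time cancels the pending overlay.
window.addEventListener("touchend", () => clearTimeout(holdTimer));
```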
  • One example operation of the method involving an overlay may be described as follows:
  • 1. A user brings hands toward the multi-touch display and places one or more touch points on the surface. The method detects the touch points, and they remain active based on timing requirements.
  • 2. The display 10 shows an overlay 14, 16, 18 based on number of touch points detected (FIGS. 2A, 3A and 4A). As an example, an overlay window may be anchored to the touch point on a corner of a window.
  • 3. The overlay window(s) 14, 16, 18 show options that are available in the current context for the number of touch points that are detected. If multiple overlays apply in the current context, they may be arranged in a tiled manner.
  • 4. The user traces gesture paths shown in the particular overlay 14, 16, 18 (FIGS. 2B, 3B and 4B). In some embodiments, when gesture tracing begins additional overlays disappear or are minimized.
  • The user accesses different options by tracing the touch gesture path to select the option they want. When more than one touch point is detected by the system, the overlay shows the gestures and/or options available in the current context. FIGS. 3A-3B show example two-fingered navigation options for zooming the current view. FIGS. 4A-4B show an example four-fingered option for taking a snapshot of the current view on the display 10.
  • A block diagram of a computer system that executes programming 525 for performing the above method is shown in FIG. 5. The programming may be written in one of many languages, such as Visual Basic, Java and others. A general computing device in the form of a computer 510 may include a processing unit 502, memory 504, removable storage 512, and non-removable storage 514. Memory 504 may include volatile memory 506 and non-volatile memory 508. Computer 510 may include—or have access to a computing environment that includes—a variety of computer-readable media, such as volatile memory 506 and non-volatile memory 508, removable storage 512 and non-removable storage 514. Computer storage includes random access memory (RAM), read only memory (ROM), erasable programmable read-only memory (EPROM) and electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD ROM), Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium capable of storing computer-readable instructions.
  • Computer 510 may include or have access to a computing environment that includes input 516, output 518, and a communication connection 520. The input 516 may be a keyboard and mouse/touchpad, or other type of data input device, and the output 518 may be a display device or printer or other type of device to communicate information to a user. In one embodiment, a touch screen device may be used as both an input and an output device.
  • The computer may operate in a networked environment using a communication connection to connect to one or more remote computers. The remote computer may include a personal computer (PC), server, router, network PC, a peer device or other common network node, or the like. The communication connection may include a Local Area Network (LAN), a Wide Area Network (WAN) or other networks.
  • Computer-readable instructions stored on a computer-readable medium are executable by the processing unit 502 of the computer 510. A hard drive, CD-ROM, and RAM are some examples of articles including a computer-readable medium.
  • The method described herein may help to provide on-demand assistance to help users know the features and functions available at any given time. The on-demand assistance is a context aware overlay that is activated when the user places at least one finger on the touch-sensitive display. In some embodiments, the overlay is semi-transparent so as not to occlude the critical information shown in the environment that is shown on the display. Showing the overlay may help users remember the features or functions available by reinforcing the options available. The need for an overlay may be reduced with repeated use because users may be more likely to remember the options available and how to use them.
  • The Abstract is provided to comply with 37 C.F.R. §1.72(b) to allow the reader to quickly ascertain the nature and gist of the technical disclosure. The Abstract is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims.

Claims (20)

1. A method of displaying items on a touch screen display comprising:
detecting contact with a touch-sensitive display; and
generating a semi-transparent overlay that appears on the display based on the contact with the touch-sensitive display.
2. The method of claim 1, wherein generating a semi-transparent overlay includes generating a semi-transparent overlay having 50% transparency.
3. The method of claim 1, wherein generating a semi-transparent overlay includes generating a plurality of semi-transparent overlays.
4. The method of claim 1, wherein generating a semi-transparent overlay includes generating a semi-transparent overlay that provides paths which a user can trace on the display to perform operations.
5. The method of claim 4, further comprising causing the semi-transparent overlay to disappear once the user traces the paths shown on the semi-transparent overlay.
6. A method comprising:
detecting how many contacts are made with a touch-sensitive display; and
generating an overlay that appears on the display, wherein the type of overlay that appears on the display is based on the number of contacts that are made with the touch-sensitive display.
7. The method of claim 6, wherein generating an overlay includes generating a plurality of overlays that appear on the display based on the number of contacts that are made with the touch-sensitive display.
8. The method of claim 6, wherein generating an overlay includes generating an overlay that provides paths which a user can trace on the display to perform an operation.
9. The method of claim 8, further comprising causing the overlay to disappear once the user traces the paths shown on the overlay.
10. The method of claim 6, wherein detecting how many contacts are made with a touch-sensitive display includes determining how many contacts are made with the display for a minimum amount of time.
11. A system comprising:
a touch-sensitive display; and
a processor to detect contact with a touch-sensitive display and to generate a semi-transparent overlay that appears on the display based on the contact with the touch-sensitive display.
12. The system of claim 11 wherein the processor generates a semi-transparent overlay having 50% transparency.
13. The system of claim 11 wherein the processor generates a plurality of semi-transparent overlays.
14. The system of claim 11 wherein the processor generates a semi-transparent overlay that provides paths which a user can trace on the display to perform operations.
15. The system of claim 14 wherein the processor causes the semi-transparent overlay to disappear once the user traces the paths shown on the semi-transparent overlay.
16. A system comprising:
a touch-sensitive display; and
a processor to detect contact with a touch-sensitive display and to generate an overlay that appears on the display, wherein the type of overlay that appears on the display is based on the number of contacts that are made with the touch-sensitive display.
17. The system of claim 16 wherein the processor generates a plurality of overlays that appear on the display based on the number of contacts that are made with the touch-sensitive display.
18. The system of claim 16 wherein the processor generates an overlay that provides paths which a user can trace on the display to perform an operation.
19. The system of claim 16 wherein the processor causes the overlay to disappear once the user traces the paths shown on the overlay.
20. The system of claim 16 wherein the processor determines how many contacts are made with the display for a minimum amount of time.
US12/705,026, filed 2010-02-12 (priority 2010-02-12): Overlay feature to provide user assistance in a multi-touch interactive display environment. Status: Abandoned. Publication: US20110199386A1 (en).

Priority Applications (1)

Application Number: US12/705,026
Priority Date: 2010-02-12
Filing Date: 2010-02-12
Title: Overlay feature to provide user assistance in a multi-touch interactive display environment
Publication: US20110199386A1 (en)

Applications Claiming Priority (1)

Application Number: US12/705,026
Priority Date: 2010-02-12
Filing Date: 2010-02-12
Title: Overlay feature to provide user assistance in a multi-touch interactive display environment
Publication: US20110199386A1 (en)

Publications (1)

Publication Number: US20110199386A1
Publication Date: 2011-08-18

Family

Family ID: 44369344

Family Applications (1)

Application Number: US12/705,026
Title: Overlay feature to provide user assistance in a multi-touch interactive display environment
Priority Date: 2010-02-12
Filing Date: 2010-02-12
Status: Abandoned
Publication: US20110199386A1 (en)

Country Status (1)

Country: US
Publication: US20110199386A1 (en)

Cited By (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110185300A1 (en) * 2010-01-28 2011-07-28 Microsoft Corporation Brush, carbon-copy, and fill gestures
US20110199495A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Method of manipulating assets shown on a touch-sensitive display
US20110199314A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Gestures on a touch-sensitive display
US20110199516A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Method of showing video on a touch-sensitive display
US20110209097A1 (en) * 2010-02-19 2011-08-25 Hinckley Kenneth P Use of Bezel as an Input Mechanism
US20110209098A1 (en) * 2010-02-19 2011-08-25 Hinckley Kenneth P On and Off-Screen Gesture Combinations
US20110209100A1 (en) * 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen pinch and expand gestures
US8239785B2 (en) 2010-01-27 2012-08-07 Microsoft Corporation Edge gestures
US20120281080A1 (en) * 2011-05-05 2012-11-08 Shouyu Wang User behavior tracing method, apparatus and system used in touch screen terminals
CN102955670A (en) * 2011-08-22 2013-03-06 富士施乐株式会社 Input display apparatus and method, image forming apparatus and imaging apparatus
US8473870B2 (en) 2010-02-25 2013-06-25 Microsoft Corporation Multi-screen hold and drag gesture
US8707174B2 (en) 2010-02-25 2014-04-22 Microsoft Corporation Multi-screen hold and page-flip gesture
US8751970B2 (en) 2010-02-25 2014-06-10 Microsoft Corporation Multi-screen synchronous slide gesture
US8799827B2 (en) 2010-02-19 2014-08-05 Microsoft Corporation Page manipulations using on and off-screen gestures
US8836802B2 (en) 2011-03-21 2014-09-16 Honeywell International Inc. Method of defining camera scan movements using gestures
US8836648B2 (en) 2009-05-27 2014-09-16 Microsoft Corporation Touch pull-in gesture
US20140267125A1 (en) * 2011-10-03 2014-09-18 Furuno Electric Co., Ltd. Device having touch panel, radar apparatus, plotter apparatus, ship network system, information displaying method and information displaying program
US9015737B2 (en) 2013-04-18 2015-04-21 Microsoft Technology Licensing, Llc Linked advertisements
US9052820B2 (en) 2011-05-27 2015-06-09 Microsoft Technology Licensing, Llc Multi-application environment
US9075522B2 (en) 2010-02-25 2015-07-07 Microsoft Technology Licensing, Llc Multi-screen bookmark hold gesture
US9104440B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US9158445B2 (en) 2011-05-27 2015-10-13 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US9229918B2 (en) 2010-12-23 2016-01-05 Microsoft Technology Licensing, Llc Presenting an application change through a tile
US9261964B2 (en) 2005-12-30 2016-02-16 Microsoft Technology Licensing, Llc Unintentional touch rejection
US9274682B2 (en) 2010-02-19 2016-03-01 Microsoft Technology Licensing, Llc Off-screen gestures to create on-screen input
US9367205B2 (en) 2010-02-19 2016-06-14 Microsoft Technolgoy Licensing, Llc Radial menus with bezel gestures
US9411504B2 (en) 2010-01-28 2016-08-09 Microsoft Technology Licensing, Llc Copy and staple gestures
US9454304B2 (en) 2010-02-25 2016-09-27 Microsoft Technology Licensing, Llc Multi-screen dual tap gesture
US9477337B2 (en) 2014-03-14 2016-10-25 Microsoft Technology Licensing, Llc Conductive trace routing for display and bezel sensors
US20160322027A1 (en) * 2012-11-12 2016-11-03 Sony Corporation Information processing device, communication system, and information processing method
US9519356B2 (en) 2010-02-04 2016-12-13 Microsoft Technology Licensing, Llc Link gestures
US9582122B2 (en) 2012-11-12 2017-02-28 Microsoft Technology Licensing, Llc Touch-sensitive bezel techniques
US9658766B2 (en) 2011-05-27 2017-05-23 Microsoft Technology Licensing, Llc Edge gesture
US20170154175A1 (en) * 2011-03-21 2017-06-01 Assa Abloy Ab System and method of secure data entry
US9696888B2 (en) 2010-12-20 2017-07-04 Microsoft Technology Licensing, Llc Application-launching interface for multiple modes
US20170254050A1 (en) * 2016-03-03 2017-09-07 Caterpillar Inc. System and method for operating implement system of machine
US9965165B2 (en) 2010-02-19 2018-05-08 Microsoft Technology Licensing, Llc Multi-finger gestures
US10254955B2 (en) 2011-09-10 2019-04-09 Microsoft Technology Licensing, Llc Progressively indicating new content in an application-selectable user interface
US10579250B2 (en) 2011-09-01 2020-03-03 Microsoft Technology Licensing, Llc Arranging tiles
US10969944B2 (en) 2010-12-23 2021-04-06 Microsoft Technology Licensing, Llc Application reporting in an application-selectable user interface
US11272017B2 (en) 2011-05-27 2022-03-08 Microsoft Technology Licensing, Llc Application notifications manifest
US11755194B2 (en) * 2020-10-06 2023-09-12 Capital One Services, Llc Interactive searching using gestures on any mobile search results page

Citations (69)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4992866A (en) * 1989-06-29 1991-02-12 Morgan Jack B Camera selection and positioning system and method
US5483261A (en) * 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
US5742504A (en) * 1995-11-06 1998-04-21 Medar, Inc. Method and system for quickly developing application software for use in a machine vision system
US5793367A (en) * 1993-01-07 1998-08-11 Canon Kabushiki Kaisha Apparatus and method for displaying both an image and control information related to the image
US5872594A (en) * 1994-09-20 1999-02-16 Thompson; Paul A. Method for open loop camera control using a motion model to control camera movement
US6208329B1 (en) * 1996-08-13 2001-03-27 Lsi Logic Corporation Supplemental mouse button emulation system, method and apparatus for a coordinate based data input device
US20010026263A1 (en) * 2000-01-21 2001-10-04 Shino Kanamori Input unit and capturing apparatus using the same
US20020067412A1 (en) * 1994-11-28 2002-06-06 Tomoaki Kawai Camera controller
US6542191B1 (en) * 1996-04-23 2003-04-01 Canon Kabushiki Kaisha Image display apparatus, camera control apparatus and method
US6697105B1 (en) * 1996-04-24 2004-02-24 Canon Kabushiki Kaisha Camera control system and method
US20050036036A1 (en) * 2001-07-25 2005-02-17 Stevenson Neil James Camera control apparatus and method
US20050079896A1 (en) * 2003-10-14 2005-04-14 Nokia Corporation Method and apparatus for locking a mobile telephone touch screen
US6888565B1 (en) * 1999-08-31 2005-05-03 Canon Kabushiki Kaisha Apparatus and method for remote-controlling image sensing apparatus in image sensing system
US6954224B1 (en) * 1999-04-16 2005-10-11 Matsushita Electric Industrial Co., Ltd. Camera control apparatus and method
US20050225634A1 (en) * 2004-04-05 2005-10-13 Sam Brunetti Closed circuit TV security system
US6965376B2 (en) * 1991-04-08 2005-11-15 Hitachi, Ltd. Video or information processing method and processing apparatus, and monitoring method and monitoring apparatus using the same
US6965394B2 (en) * 2001-03-30 2005-11-15 Koninklijke Philips Electronics N.V. Remote camera control device
US6973200B1 (en) * 1997-04-22 2005-12-06 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
US6992702B1 (en) * 1999-09-07 2006-01-31 Fuji Xerox Co., Ltd System for controlling video and motion picture cameras
US20060026065A1 (en) * 2004-04-22 2006-02-02 Bolatti Hugo A Digital entertainment distribution system
US20060026521A1 (en) * 2004-07-30 2006-02-02 Apple Computer, Inc. Gestures for touch sensitive input devices
US20060034043A1 (en) * 2004-08-10 2006-02-16 Katsumi Hisano Electronic device, control method, and control program
US20060036944A1 (en) * 2004-08-10 2006-02-16 Microsoft Corporation Surface UI for gesture-based interaction
US20060050090A1 (en) * 2000-03-16 2006-03-09 Kamran Ahmed User selectable hardware zoom in a video display system
US7030861B1 (en) * 2001-02-10 2006-04-18 Wayne Carl Westerman System and method for packing multi-touch gestures onto a hand
US20060123360A1 (en) * 2004-12-03 2006-06-08 Picsel Research Limited User interfaces for data processing devices and systems
US7061525B1 (en) * 1997-01-28 2006-06-13 Canon Kabushiki Kaisha Apparatus and method for controlling a camera based on a displayed image
US20060161871A1 (en) * 2004-07-30 2006-07-20 Apple Computer, Inc. Proximity detector in handheld device
US20060187196A1 (en) * 2005-02-08 2006-08-24 Underkoffler John S System and method for gesture based control system
US20060267966A1 (en) * 2005-05-24 2006-11-30 Microsoft Corporation Hover widgets: using the tracking state to extend capabilities of pen-operated devices
US7183944B2 (en) * 2001-06-12 2007-02-27 Koninklijke Philips Electronics N.V. Vehicle tracking and identification of emergency/law enforcement vehicles
US20070146337A1 (en) * 2005-12-23 2007-06-28 Bas Ording Continuous scrolling list with acceleration
US20070171273A1 (en) * 2006-01-26 2007-07-26 Polycom, Inc. System and Method for Controlling Videoconference with Touch Screen Interface
US7278115B1 (en) * 1999-06-18 2007-10-02 Microsoft Corporation Methods, apparatus and data structures for providing a user interface to objects, the user interface exploiting spatial memory and visually indicating at least one object parameter
US20070229471A1 (en) * 2006-03-30 2007-10-04 Lg Electronics Inc. Terminal and method for selecting displayed items
US20080013826A1 (en) * 2006-07-13 2008-01-17 Northrop Grumman Corporation Gesture recognition interface system
US7362221B2 (en) * 2005-11-09 2008-04-22 Honeywell International Inc. Touchscreen device for controlling a security system
US20080129686A1 (en) * 2006-12-04 2008-06-05 Samsung Electronics Co., Ltd. Gesture-based user interface method and apparatus
US20080143559A1 (en) * 2006-12-18 2008-06-19 Dietz Paul H Appliance Control Panel
US7394367B1 (en) * 2004-11-16 2008-07-01 Colorado Vnet, Llc Keypad for building automation
US20080163053A1 (en) * 2006-12-28 2008-07-03 Samsung Electronics Co., Ltd. Method to provide menu, using menu set and multimedia device using the same
US20080168403A1 (en) * 2007-01-06 2008-07-10 Appl Inc. Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
US7411575B2 (en) * 2003-09-16 2008-08-12 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
US7479949B2 (en) * 2006-09-06 2009-01-20 Apple Inc. Touch screen device, method, and graphical user interface for determining commands by applying heuristics
US20090040188A1 (en) * 2007-08-08 2009-02-12 Se Youp Chu Terminal having touch screen and method of performing function thereof
US20090084612A1 (en) * 2007-10-01 2009-04-02 Igt Multi-user input systems and processing techniques for serving multiple users
US7519223B2 (en) * 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
US7535463B2 (en) * 2005-06-15 2009-05-19 Microsoft Corporation Optical flow-based manipulation of graphical objects
US20090160785A1 (en) * 2007-12-21 2009-06-25 Nokia Corporation User interface, device and method for providing an improved text input
US20090262091A1 (en) * 2008-01-07 2009-10-22 Tetsuo Ikeda Information Processing Apparatus and Vibration Control Method in Information Processing Apparatus
US20100058252A1 (en) * 2008-08-28 2010-03-04 Acer Incorporated Gesture guide system and a method for controlling a computer system by a gesture
US20100053219A1 (en) * 2008-08-22 2010-03-04 Google Inc. Panning In A Three Dimensional Environment On A Mobile Device
US20100138763A1 (en) * 2008-12-01 2010-06-03 Lg Electronics Inc. Method for operating execution icon of mobile terminal
US20100188423A1 (en) * 2009-01-28 2010-07-29 Tetsuo Ikeda Information processing apparatus and display control method
US20100224192A1 (en) * 2009-03-06 2010-09-09 Cardinal Health 207, Inc. Automated Oxygen Delivery Method
US20100259486A1 (en) * 2009-04-08 2010-10-14 Douglas Anson System And Method For Secure Gesture Information Handling System Communication
US20100304731A1 (en) * 2009-05-26 2010-12-02 Bratton R Alex Apparatus and method for video display and control for portable device
US20110093822A1 (en) * 2009-01-29 2011-04-21 Jahanzeb Ahmed Sherwani Image Navigation for Touchscreen User Interface
US20110117526A1 (en) * 2009-11-16 2011-05-19 Microsoft Corporation Teaching gesture initiation with registration posture guides
US20110181526A1 (en) * 2010-01-26 2011-07-28 Shaffer Joshua H Gesture Recognizers with Delegates for Controlling and Modifying Gesture Recognition
US20110199495A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Method of manipulating assets shown on a touch-sensitive display
US20110199516A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Method of showing video on a touch-sensitive display
US20110199314A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Gestures on a touch-sensitive display
US20110225553A1 (en) * 2010-03-15 2011-09-15 Abramson Robert W Use Of Standalone Mobile Devices To Extend HID Capabilities Of Computer Systems
US20110239155A1 (en) * 2007-01-05 2011-09-29 Greg Christie Gestures for Controlling, Manipulating, and Editing of Media Files Using Touch Sensitive Devices
US8085300B2 (en) * 2005-09-20 2011-12-27 Sony Corporation Surveillance camera system, remote-controlled monitoring device, control method, and their control program
US20120023509A1 (en) * 2007-01-07 2012-01-26 Christopher Blumenberg Application programming interfaces for gesture operations
US20120088526A1 (en) * 2010-10-08 2012-04-12 Research In Motion Limited System and method for displaying object location in augmented reality
US20120242850A1 (en) * 2011-03-21 2012-09-27 Honeywell International Inc. Method of defining camera scan movements using gestures

US20080168403A1 (en) * 2007-01-06 2008-07-10 Apple Inc. Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
US20100192109A1 (en) * 2007-01-06 2010-07-29 Wayne Carl Westerman Detecting and Interpreting Real-World and Security Gestures on Touch and Hover Sensitive Devices
US20100211920A1 (en) * 2007-01-06 2010-08-19 Wayne Carl Westerman Detecting and Interpreting Real-World and Security Gestures on Touch and Hover Sensitive Devices
US20120023509A1 (en) * 2007-01-07 2012-01-26 Christopher Blumenberg Application programming interfaces for gesture operations
US20090040188A1 (en) * 2007-08-08 2009-02-12 Se Youp Chu Terminal having touch screen and method of performing function thereof
US20090084612A1 (en) * 2007-10-01 2009-04-02 Igt Multi-user input systems and processing techniques for serving multiple users
US20090160785A1 (en) * 2007-12-21 2009-06-25 Nokia Corporation User interface, device and method for providing an improved text input
US20090262091A1 (en) * 2008-01-07 2009-10-22 Tetsuo Ikeda Information Processing Apparatus and Vibration Control Method in Information Processing Apparatus
US20100053219A1 (en) * 2008-08-22 2010-03-04 Google Inc. Panning In A Three Dimensional Environment On A Mobile Device
US20100058252A1 (en) * 2008-08-28 2010-03-04 Acer Incorporated Gesture guide system and a method for controlling a computer system by a gesture
US20100138763A1 (en) * 2008-12-01 2010-06-03 Lg Electronics Inc. Method for operating execution icon of mobile terminal
US20100188423A1 (en) * 2009-01-28 2010-07-29 Tetsuo Ikeda Information processing apparatus and display control method
US20110093822A1 (en) * 2009-01-29 2011-04-21 Jahanzeb Ahmed Sherwani Image Navigation for Touchscreen User Interface
US20100224192A1 (en) * 2009-03-06 2010-09-09 Cardinal Health 207, Inc. Automated Oxygen Delivery Method
US20100259486A1 (en) * 2009-04-08 2010-10-14 Douglas Anson System And Method For Secure Gesture Information Handling System Communication
US20100304731A1 (en) * 2009-05-26 2010-12-02 Bratton R Alex Apparatus and method for video display and control for portable device
US20110117526A1 (en) * 2009-11-16 2011-05-19 Microsoft Corporation Teaching gesture initiation with registration posture guides
US20110181526A1 (en) * 2010-01-26 2011-07-28 Shaffer Joshua H Gesture Recognizers with Delegates for Controlling and Modifying Gesture Recognition
US20110199516A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Method of showing video on a touch-sensitive display
US20110199314A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Gestures on a touch-sensitive display
US20110199495A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Method of manipulating assets shown on a touch-sensitive display
US8570286B2 (en) * 2010-02-12 2013-10-29 Honeywell International Inc. Gestures on a touch-sensitive display
US8638371B2 (en) * 2010-02-12 2014-01-28 Honeywell International Inc. Method of manipulating assets shown on a touch-sensitive display
US20110225553A1 (en) * 2010-03-15 2011-09-15 Abramson Robert W Use Of Standalone Mobile Devices To Extend HID Capabilities Of Computer Systems
US20120088526A1 (en) * 2010-10-08 2012-04-12 Research In Motion Limited System and method for displaying object location in augmented reality
US20120242850A1 (en) * 2011-03-21 2012-09-27 Honeywell International Inc. Method of defining camera scan movements using gestures
US8836802B2 (en) * 2011-03-21 2014-09-16 Honeywell International Inc. Method of defining camera scan movements using gestures

Cited By (69)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9594457B2 (en) 2005-12-30 2017-03-14 Microsoft Technology Licensing, Llc Unintentional touch rejection
US9946370B2 (en) 2005-12-30 2018-04-17 Microsoft Technology Licensing, Llc Unintentional touch rejection
US9952718B2 (en) 2005-12-30 2018-04-24 Microsoft Technology Licensing, Llc Unintentional touch rejection
US10019080B2 (en) 2005-12-30 2018-07-10 Microsoft Technology Licensing, Llc Unintentional touch rejection
US9261964B2 (en) 2005-12-30 2016-02-16 Microsoft Technology Licensing, Llc Unintentional touch rejection
US8836648B2 (en) 2009-05-27 2014-09-16 Microsoft Corporation Touch pull-in gesture
US8239785B2 (en) 2010-01-27 2012-08-07 Microsoft Corporation Edge gestures
US10282086B2 (en) 2010-01-28 2019-05-07 Microsoft Technology Licensing, Llc Brush, carbon-copy, and fill gestures
US20110185300A1 (en) * 2010-01-28 2011-07-28 Microsoft Corporation Brush, carbon-copy, and fill gestures
US9411504B2 (en) 2010-01-28 2016-08-09 Microsoft Technology Licensing, Llc Copy and staple gestures
US9411498B2 (en) 2010-01-28 2016-08-09 Microsoft Technology Licensing, Llc Brush, carbon-copy, and fill gestures
US8261213B2 (en) 2010-01-28 2012-09-04 Microsoft Corporation Brush, carbon-copy, and fill gestures
US9857970B2 (en) 2010-01-28 2018-01-02 Microsoft Technology Licensing, Llc Copy and staple gestures
US9519356B2 (en) 2010-02-04 2016-12-13 Microsoft Technology Licensing, Llc Link gestures
US20110199495A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Method of manipulating assets shown on a touch-sensitive display
US20110199516A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Method of showing video on a touch-sensitive display
US8638371B2 (en) 2010-02-12 2014-01-28 Honeywell International Inc. Method of manipulating assets shown on a touch-sensitive display
US8570286B2 (en) 2010-02-12 2013-10-29 Honeywell International Inc. Gestures on a touch-sensitive display
US20110199314A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Gestures on a touch-sensitive display
US8799827B2 (en) 2010-02-19 2014-08-05 Microsoft Corporation Page manipulations using on and off-screen gestures
US20110209097A1 (en) * 2010-02-19 2011-08-25 Hinckley Kenneth P Use of Bezel as an Input Mechanism
US9965165B2 (en) 2010-02-19 2018-05-08 Microsoft Technology Licensing, Llc Multi-finger gestures
US20110209098A1 (en) * 2010-02-19 2011-08-25 Hinckley Kenneth P On and Off-Screen Gesture Combinations
US10268367B2 (en) 2010-02-19 2019-04-23 Microsoft Technology Licensing, Llc Radial menus with bezel gestures
US9367205B2 (en) 2010-02-19 2016-06-14 Microsoft Technology Licensing, Llc Radial menus with bezel gestures
US9310994B2 (en) * 2010-02-19 2016-04-12 Microsoft Technology Licensing, Llc Use of bezel as an input mechanism
US9274682B2 (en) 2010-02-19 2016-03-01 Microsoft Technology Licensing, Llc Off-screen gestures to create on-screen input
US8751970B2 (en) 2010-02-25 2014-06-10 Microsoft Corporation Multi-screen synchronous slide gesture
US11055050B2 (en) 2010-02-25 2021-07-06 Microsoft Technology Licensing, Llc Multi-device pairing and combined display
US8473870B2 (en) 2010-02-25 2013-06-25 Microsoft Corporation Multi-screen hold and drag gesture
US8539384B2 (en) 2010-02-25 2013-09-17 Microsoft Corporation Multi-screen pinch and expand gestures
US20110209100A1 (en) * 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen pinch and expand gestures
US9075522B2 (en) 2010-02-25 2015-07-07 Microsoft Technology Licensing, Llc Multi-screen bookmark hold gesture
US8707174B2 (en) 2010-02-25 2014-04-22 Microsoft Corporation Multi-screen hold and page-flip gesture
US9454304B2 (en) 2010-02-25 2016-09-27 Microsoft Technology Licensing, Llc Multi-screen dual tap gesture
US9696888B2 (en) 2010-12-20 2017-07-04 Microsoft Technology Licensing, Llc Application-launching interface for multiple modes
US10969944B2 (en) 2010-12-23 2021-04-06 Microsoft Technology Licensing, Llc Application reporting in an application-selectable user interface
US11126333B2 (en) 2010-12-23 2021-09-21 Microsoft Technology Licensing, Llc Application reporting in an application-selectable user interface
US9229918B2 (en) 2010-12-23 2016-01-05 Microsoft Technology Licensing, Llc Presenting an application change through a tile
US10599822B2 (en) 2011-03-21 2020-03-24 Assa Abloy Ab System and method of secure data entry
US8836802B2 (en) 2011-03-21 2014-09-16 Honeywell International Inc. Method of defining camera scan movements using gestures
US20170154175A1 (en) * 2011-03-21 2017-06-01 Assa Abloy Ab System and method of secure data entry
US20120281080A1 (en) * 2011-05-05 2012-11-08 Shouyu Wang User behavior tracing method, apparatus and system used in touch screen terminals
US9658766B2 (en) 2011-05-27 2017-05-23 Microsoft Technology Licensing, Llc Edge gesture
US9535597B2 (en) 2011-05-27 2017-01-03 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US11698721B2 (en) 2011-05-27 2023-07-11 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US11272017B2 (en) 2011-05-27 2022-03-08 Microsoft Technology Licensing, Llc Application notifications manifest
US10303325B2 (en) 2011-05-27 2019-05-28 Microsoft Technology Licensing, Llc Multi-application environment
US9052820B2 (en) 2011-05-27 2015-06-09 Microsoft Technology Licensing, Llc Multi-application environment
US9104440B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US9104307B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US9158445B2 (en) 2011-05-27 2015-10-13 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
CN102955670A (en) * 2011-08-22 2013-03-06 富士施乐株式会社 Input display apparatus and method, image forming apparatus and imaging apparatus
US10579250B2 (en) 2011-09-01 2020-03-03 Microsoft Technology Licensing, Llc Arranging tiles
US10254955B2 (en) 2011-09-10 2019-04-09 Microsoft Technology Licensing, Llc Progressively indicating new content in an application-selectable user interface
US20140267125A1 (en) * 2011-10-03 2014-09-18 Furuno Electric Co., Ltd. Device having touch panel, radar apparatus, plotter apparatus, ship network system, information displaying method and information displaying program
US9459716B2 (en) * 2011-10-03 2016-10-04 Furuno Electric Co., Ltd. Device having touch panel, radar apparatus, plotter apparatus, ship network system, information displaying method and information displaying program
US10656750B2 (en) 2012-11-12 2020-05-19 Microsoft Technology Licensing, Llc Touch-sensitive bezel techniques
US9875725B2 (en) 2012-11-12 2018-01-23 Sony Corporation Information processing to exchange information via wireless communication
US10679586B2 (en) 2012-11-12 2020-06-09 Sony Corporation Information processing device, communication system, and information processing method
US9582122B2 (en) 2012-11-12 2017-02-28 Microsoft Technology Licensing, Llc Touch-sensitive bezel techniques
US11106292B2 (en) 2012-11-12 2021-08-31 Sony Corporation Information processing device, communication system, and information processing method
US20160322027A1 (en) * 2012-11-12 2016-11-03 Sony Corporation Information processing device, communication system, and information processing method
US10276132B2 (en) * 2012-11-12 2019-04-30 Sony Corporation Information processing for recognition of acceptance of user operation
US9015737B2 (en) 2013-04-18 2015-04-21 Microsoft Technology Licensing, Llc Linked advertisements
US9946383B2 (en) 2014-03-14 2018-04-17 Microsoft Technology Licensing, Llc Conductive trace routing for display and bezel sensors
US9477337B2 (en) 2014-03-14 2016-10-25 Microsoft Technology Licensing, Llc Conductive trace routing for display and bezel sensors
US20170254050A1 (en) * 2016-03-03 2017-09-07 Caterpillar Inc. System and method for operating implement system of machine
US11755194B2 (en) * 2020-10-06 2023-09-12 Capital One Services, Llc Interactive searching using gestures on any mobile search results page

Similar Documents

Publication Publication Date Title
US20110199386A1 (en) Overlay feature to provide user assistance in a multi-touch interactive display environment
US20180203596A1 (en) Computing device with window repositioning preview interface
US9026944B2 (en) Managing content through actions on context based menus
EP2893418B1 (en) Systems and methods for handling stackable workspaces
US9389777B2 (en) Gestures for manipulating tables, charts, and graphs
TWI609319B (en) Predictive contextual toolbar for productivity applications
EP3025218B1 (en) Multi-region touchpad
US20140306898A1 (en) Key swipe gestures for touch sensitive ui virtual keyboard
US20140306897A1 (en) Virtual keyboard swipe gestures for cursor movement
US20140306899A1 (en) Multidirectional swipe key for virtual keyboard
US20120272144A1 (en) Compact control menu for touch-enabled command execution
US20130191781A1 (en) Displaying and interacting with touch contextual user interface
KR20140051228A (en) Submenus for context based menu system
US11099723B2 (en) Interaction method for user interfaces
US11429272B2 (en) Multi-factor probabilistic model for evaluating user input
KR20140051230A (en) Launcher for context based menus
CN106104450B (en) Method for selecting a part of a graphical user interface
KR102129827B1 (en) User interface elements for content selection and extended content selection
US20140022179A1 (en) System and method for displaying keypad via various types of gestures
WO2015013152A1 (en) Scrollable smart menu
US9870122B2 (en) Graphical user interface for rearranging icons
US20140033129A1 (en) Computing device and method for controlling desktop applications
US9146654B2 (en) Movement reduction when scrolling for item selection during direct manipulation
EP3210101B1 (en) Hit-test to determine enablement of direct manipulations in response to user actions
US11782599B1 (en) Virtual mouse for electronic touchscreen display

Legal Events

Date Code Title Description
AS Assignment

Owner name: HONEYWELL INTERNATIONAL INC., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DHARWADA, PALLAVI;LABERGE, JASON;REEL/FRAME:023954/0242

Effective date: 20100203

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION