WO2012104288A1 - A device having a multipoint sensing surface - Google Patents
- Publication number
- WO2012104288A1 (PCT/EP2012/051531)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- action
- sensing surface
- multipoint sensing
- fingers
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
- H04N21/42206—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
- H04N21/42224—Touch pad or touch panel provided on the remote control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/432—Content retrieval operation from a local storage medium, e.g. hard-disk
- H04N21/4325—Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/438—Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving MPEG packets from an IP network
- H04N21/4383—Accessing a communication channel
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- single finger gestures are not used for scrolling or skipping instructions because the single finger swipe may be reserved for another function in the user interface of remote control 100 such as, for example, cursor movement. It should be understood that where reserving the single finger swipe is not required, the single finger swipe may be used for a single channel change, skip or scroll instruction, with additional fingers used to indicate a higher speed of channel change, skip or scroll.
- the technique described herein may be applied to cursor movement. This is particularly relevant to devices that have a separate touch sensitive surface and screen, or to devices which have a touch screen but where the cursor controlled by the touch sensitive surface of the touch screen is displayed on a screen other than the touch screen.
- the relation between amount of input movement and amount of cursor movement is determined by the cursor speed. For a given input movement a fast cursor moves further on the screen than a slow cursor.
- the technique described herein may be used to provide a multi-speed cursor whereby a cursor moves at one speed in response to a one finger gesture, and at a faster speed in response to a two finger gesture.
- the cursor may move faster still in response to a three finger gesture.
- the cursor may move at one speed in response to a one finger gesture, and at a slower speed in response to a two finger gesture.
- the cursor may move slower still in response to a three finger gesture. This may be particularly useful where precision cursor control is occasionally required.
- Figure 4 illustrates a method implemented by a device such as remote control 100 incorporating the gestures described above.
- the process starts and at 410 it is determined whether or not a touch input is detected. If no touch input is detected the process proceeds to a standby mode 405 in which the device may periodically determine whether touch inputs have been made to the multipoint sensing surface. If a touch input is detected at 410 the process proceeds to 420 where a determination is made as to the number of contact points that the touch input comprises. Then the process proceeds to 430 where the touch input is monitored and any movements of the contact points are detected. Once movement is detected a gesture is identified at 440 which best correlates with the detected movement. Thus, the process establishes what gesture was used and the number of contact points used to make it. At 450 the process performs the action corresponding to the input gesture and the number of contact points. The action may comprise transmitting a control instruction to a further device.
- FIG. 5 is a block diagram illustrating the components of remote control 100.
- Remote control 100 comprises a touch screen 510, a gesture module 520, a processor 530 and a graphics module 540.
- the gesture module 520 receives touch information from the touch sensitive surface of touch screen 510 and translates these signals into gesture information which is sent to a processor 530.
- Processor 530 is arranged to run instructions to allow the remote control 100 to perform as required.
- Processor 530 sends display information to a graphics module 540 which drives the display component of touch screen 510.
- Figure 6 illustrates the structure of a device 600 which does not comprise a touch screen but has separate touch and display surfaces.
- Device 600 comprises a touch surface 610, a gesture module 620, a processor 630, a graphics module 640, and a display 650.
- the gesture module 620 receives touch information from the touch surface 610 and translates these signals into gesture information which is sent to a processor 630.
- Processor 630 is arranged to run instructions to allow the device 600 to perform as required.
- Processor 630 sends display information to a graphics module 640 which drives the display 650.
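The touch-processing loop described for Figure 4 can be sketched as follows. This is an illustrative reconstruction in Python, not the patented implementation; the function names and the toy gesture matcher are invented for the example:

```python
def identify_gesture(movements):
    """Pick the gesture that best correlates with the detected contact
    movements (a toy stand-in for step 440). `movements` is a list of
    (dx, dy) displacement vectors, one per contact point."""
    dx = sum(m[0] for m in movements) / len(movements)
    dy = sum(m[1] for m in movements) / len(movements)
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"


def process_touch_input(movements):
    """Steps 420-450: count the contact points, identify the gesture,
    and return both so the corresponding action can be performed."""
    contact_points = len(movements)        # step 420
    gesture = identify_gesture(movements)  # steps 430 and 440
    return gesture, contact_points         # consumed by the action at step 450


# Example: three contact points all moving to the right.
gesture, fingers = process_touch_input([(10, 1), (11, 0), (9, -1)])
```

With the three finger rightward movement above, the sketch reports `("swipe_right", 3)`, which a device like remote control 100 could map to a multiple channel up instruction.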
Abstract
There is provided a device comprising a multipoint sensing surface and a gesture module. The multipoint sensing surface is for receiving inputs from one or more objects. The gesture module is configured to receive inputs from the multipoint sensing surface, and to match a gesture on the multipoint sensing surface to an action, the magnitude of the action determined by the number of contact points with which the gesture is made.
Description
A DEVICE HAVING
A MULTIPOINT SENSING SURFACE
Technical field
The present application relates to a device comprising a multipoint sensing surface; a method in a device having a multipoint sensing surface, and a computer-readable medium.
Background
Touch sensitive surfaces are popular device interfaces and can be found in the form of touch screens on smart-phones and as touch-pads on laptops. With the advent of capacitive, as opposed to resistive, touch sensors it became possible to detect multiple points of contact on the touch sensitive surface and from this the use of gestures in the user interface became more common.
An example of a simple gesture and action is pinch-to-zoom-out, whereby if two contact points such as those presented by a thumb and forefinger on the touch sensitive surface are brought together as a pinch the device responds by performing a zoom-out action on the currently displayed item, such as a picture or a webpage.
US Patent No. 7,340,077 to Gokturk et al. describes an arrangement where a camera is used to recognise a user's body position, which is used as an input into a related electronic device.
US Patent No. 6,191,773 to Maruno et al. describes an interface apparatus including means for recognizing the shape or movement of the hand of an operator, and using this to control the information displayed on the screen of a device.
US Patent Application Publication No. US 2010/0180298 by Kim et al. describes a terminal device coupled to a broadcast receiving apparatus. The terminal device includes a communication unit which receives electronic program guide (EPG) information from the Internet, a display unit that displays EPG information, and a control unit that controls the broadcast receiving apparatus to perform an operation corresponding to a selection.
"A Touch Gesture Reference Guide" by C. Villamor et al., available at www.lukew.com, describes gestures used for touch commands, using gestures to support user actions, visual representations of gestures, and outlines of how popular software platforms support touch gestures.
An advantage of using gestures as part of the user interface of a device is that certain actions can be performed intuitively by the user. A problem with existing devices that use touch sensitive surfaces is that as the gestures increase in number and complexity, the user interface becomes less intuitive and less user friendly.
Summary
There is provided a device comprising a multipoint sensing surface and a gesture module. The multipoint sensing surface is for receiving inputs from one or more objects. The gesture module is configured to receive inputs from the multipoint sensing surface, and to match a gesture on the multipoint sensing surface to an action, the magnitude of the action determined by the number of contact points with which the gesture is made. By having the magnitude or strength of an action that is invoked by a gesture determined by the number of fingers with which the gesture is made, a simple and intuitive man-machine interface is provided.
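As a minimal sketch of this idea (assuming an invented gesture vocabulary, and borrowing the two-versus-three finger magnitudes used later in the description):

```python
# Hypothetical gesture-to-action table; the names are illustrative only.
BASE_ACTIONS = {
    "swipe_right": "channel_up",
    "swipe_left": "channel_down",
}


def match_gesture(gesture, contact_points):
    """Match a gesture to an action and scale the action's magnitude
    by the number of contact points with which the gesture was made."""
    action = BASE_ACTIONS[gesture]
    # Assumed scaling, mirroring figure 2: two fingers -> 1, three -> 5.
    magnitude = 1 if contact_points <= 2 else 5
    return action, magnitude
```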
The device may further comprise a display. The multipoint sensing surface may be mounted on the display. The device may further comprise a transmission means for sending instructions to a further device. The further device may be at least one of: a television, a set top box, a radio, a media player and a PC.
The action may be to change channel on a further device, wherein the number of fingers with which the gesture is performed determines the number of channels changed. The direction of the gesture may determine the direction of channel change.
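A sketch of how such a channel change might be computed, with the step sizes assumed from the figure 2 example (two fingers for one channel, three fingers for five):

```python
# Assumed finger-count-to-step table, taken from the figure 2 example.
CHANNEL_STEPS = {2: 1, 3: 5}


def channel_delta(direction, fingers):
    """Signed number of channels to move: the finger count selects the
    step size and the swipe direction selects the sign."""
    step = CHANNEL_STEPS.get(fingers, 1)
    return step if direction == "right" else -step
```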
The action may be to skip a segment of media playback, wherein the number of fingers with which the gesture is performed determines the length of the skip. The direction of the gesture may determine the direction of the skip. The direction of the skip may be forwards in time or backwards in time.
The action may be to change the speed of media playback, wherein the number of fingers with which the gesture is performed determines the change in playback speed. The direction of the gesture may determine whether the playback speed is increased or decreased. The playback speed may be negative, sometimes referred to as "re-wind".
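One possible model is a speed ladder in which negative values represent "re-wind"; the ladder values and the step rule below are illustrative assumptions, not taken from the application:

```python
# Illustrative playback-speed ladder; negative entries are "re-wind".
SPEEDS = [-16, -8, -4, -2, 1, 2, 4, 8, 16]


def change_speed(current, direction, fingers):
    """Move along the speed ladder: the finger count sets how many
    steps to take, the gesture direction sets increase vs decrease."""
    steps = fingers - 1  # assumed: two fingers -> 1 step, three -> 2
    i = SPEEDS.index(current)
    i += steps if direction == "right" else -steps
    i = max(0, min(i, len(SPEEDS) - 1))  # clamp at the ends of the ladder
    return SPEEDS[i]
```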
The action may be to move a cursor on a screen, wherein the number of fingers with which the gesture is performed determines the amount the cursor moves. In this way a multi-speed cursor is provided.
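A multi-speed cursor of this kind might scale the raw touch displacement by a factor chosen from the finger count; the factors here are assumptions for illustration:

```python
# Assumed speed factors per finger count; not specified by the text.
CURSOR_SPEED = {1: 1.0, 2: 2.0, 3: 4.0}


def cursor_move(dx, dy, fingers):
    """Scale a raw touch displacement (dx, dy) by the speed factor
    selected by the number of fingers making the gesture."""
    s = CURSOR_SPEED.get(fingers, 1.0)
    return dx * s, dy * s
```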
There is further provided a method, in a device having a multipoint sensing surface. The method comprises receiving inputs from one or more objects in contact with the multipoint sensing surface. The method further comprises matching a gesture on the multipoint sensing surface to an action. The method further comprises determining the magnitude of the action by the number of contact points with which the gesture is made.
The multipoint sensing surface may be mounted in front of a display. Such a multipoint sensing surface is substantially transparent.
The method may further comprise sending instructions to a further device, the instructions relating to the action corresponding to the gesture and the number of contact points with which the gesture is made. The action may be to change channel, wherein the number of fingers with which the gesture is
performed determines the number of channels changed. The direction in which the gesture is made may determine the direction of channel change.
The gesture may be at least one of a horizontal swipe, a vertical swipe or a rotational swipe, or a combination thereof.
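Distinguishing the three swipe families could be done by comparing the stroke's path length with its straight-line chord: a curved (rotational) stroke travels much further than its chord. The threshold below is an assumption, not from the application:

```python
import math


def classify_swipe(points):
    """Classify a stroke, given as a list of (x, y) samples, as a
    horizontal, vertical or rotational swipe. The 2.0 curvature
    threshold is an illustrative assumption."""
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    chord = math.hypot(dx, dy)
    path = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    if chord > 0 and path / chord > 2.0:
        return "rotational"  # the stroke curves back on itself
    return "horizontal" if abs(dx) >= abs(dy) else "vertical"
```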
There is further still provided a computer-readable medium, carrying instructions, which, when executed by computer logic, causes said computer logic to carry out any of the methods defined herein.
There is further provided a remote control for selecting and controlling an electronic device. The remote control comprising: a touch-sensitive screen; a programmable digital signal processor in communication with the touch-sensitive screen; and a transmitter in communication with the programmable digital signal processor; wherein the processor is configured to determine a number of fingers touching the touch-sensitive screen, to set based on the number determined a pace corresponding to the electronic device, to identify a gesture made by the fingers on the screen, and to generate based on the gesture identified a corresponding command to be transmitted to the electronic device corresponding to the set pace.
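The claim's pipeline — count the fingers, set a pace from the count, identify the gesture, then emit a command at that pace — might be sketched as below; the pace table and command format are invented for illustration:

```python
# Hypothetical finger-count-to-pace table.
PACE = {2: 1, 3: 5}


def build_command(touch_points, gesture):
    """Determine the number of fingers touching the screen, set the
    pace from that number, and combine it with the identified gesture
    into the command to transmit."""
    fingers = len(touch_points)  # determine the finger count
    pace = PACE.get(fingers, 1)  # set the pace from the count
    return {"gesture": gesture, "pace": pace}
```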
Brief description of the drawings
A device having a multipoint sensing surface will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 illustrates an embodiment of a device as described herein having a multipoint sensing surface;
Figure 2 illustrates a plurality of gestures which may be used to instruct a command to be sent to a further device;
Figure 3 shows some alternative gestures which may be used herein; Figure 4 illustrates a method incorporating the gestures described herein;
Figure 5 is a block diagram illustrating the components of the device described herein; and
Figure 6 is a block diagram illustrating the components of an alternative device also described herein.
Detailed description
Figure 1 illustrates an embodiment of a device having a multipoint sensing surface. The device shown in figure 1 is a remote control comprising a user interface for receiving instructions from a user and a transmitting means for transmitting commands to at least one further device. Remote control 100 sends instructions to a television 110 and a radio 120. The instructions are transmitted from the remote control 100 using conventional wireless communications techniques such as infrared, Bluetooth™, Wi-Fi or similar. Remote control 100 comprises a display 102 which has a transparent touch sensitive surface overlaid to create what is commonly termed a touch-screen. Remote control 100 additionally includes a plurality of hardware buttons 103.
A user may use remote control 100 to send commands to the further devices 110, 120 to control their operation. The remote control 100 thus allows a user to perform actions such as turn on a further device, and to change the channel the further device outputs. A typical remote control has only physical buttons for receiving input from a user, but remote controls with touch sensitive surfaces such as touch screens are known.
Figure 2 illustrates a plurality of gestures which may be used to instruct a channel change command to be sent to one of the further devices 110, 120. Which of the further devices 110, 120 a gesture relates to is determined by an aspect of the user interface of device 100. For example, in one embodiment the user interface of remote control 100 comprises a plurality of buttons at the top of display 102, each button relating to a further device 110, 120 that may be controlled. Before making a gesture on the touch screen 102 the user must ensure that the appropriate button is highlighted, indicating that the appropriate further device is selected.
The touch screen 102 of remote control 100 is capable of detecting multiple simultaneous contact points. In the embodiment shown, remote control 100 comprises an LCD display with a capacitive touch surface overlaid.

Figure 2A illustrates a two finger swipe to the right, which is interpreted by remote control 100 as a single channel up instruction. Figure 2B illustrates a two finger swipe to the left, which is interpreted by remote control 100 as a single channel down instruction. While selecting a channel, switching between channels, or channel surfing, a user will typically wish to change from one channel to another channel multiple channels away in the displayed channel order. This may be achieved by the user inputting multiple consecutive channel up or channel down instructions. Remote control 100 presents an intuitive and more efficient method of providing these instructions by recognising a three finger swipe to the left or to the right as a multiple channel up or a multiple channel down instruction. This is illustrated in figure 2C, which shows a three finger swipe to the right indicating a five channel up instruction; and figure 2D, which illustrates a three finger swipe to the left indicating a five channel down instruction.
The number of channels that is skipped when a multiple channel change instruction is received is an implementation detail which will vary dependant upon the total number of channels available. For example, for a television with only eight channels a five channel change instruction is unlikely to be useful whereas a three channel change instruction is more likely to be used. Similarly, for a radio with one hundred channels a ten channel change instruction maybe more useful than a five channel change instruction. Figure 3 shows some alternative gestures to which the multiple channel change instruction may be assigned. Each of the gestures in figure 3 is shown made by two contact points which will typically be two fingers. Each of the gestures shown in figure 3 may also be made with one, three or even four fingers. Figures 3A and 3B illustrate a two finger swipe to the right and to the left respectively corresponding to the single channel change instructions in figures 2A and 2B. Figure 3C shows a two finger swipe up, and figure 3D shows a two finger swipe down. Figure 3E shows a two finger rotational swipe clockwise and Figure 3F shows a two finger rotational swipe anticlockwise. The channel up and channel down instruction described above in
relation to remote control 100 may alternatively be implemented with vertical swipes (figures 3C and 3D respectively) or rotational swipes (figures 3E and 3F respectively). It should be further noted that where the size of the touch screen 102 of remote control 100 is sufficient, three-speed channel change instructions may be implemented. For example, a two finger swipe may correspond to a single channel change, a three finger swipe may correspond to a four channel change and a four finger swipe may correspond to a ten channel change. It should be noted that the above described technique is not limited to channel change commands. The technique may also be applied to scrolling or skipping through a media presentation (such as a recorded or buffered TV programme, an audio file or an audio and video file), or scrolling through a list such as an electronic programme guide (EPG) or content catalogue. In the context of scrolling through a media presentation, a two finger swipe may correspond to scrolling at twice normal speed, whereas a three finger swipe may correspond to scrolling at four times normal speed. Alternatively, a two finger swipe may correspond to an increase of the scrolling speed by one step (from 1x to 2x, or from 2x to 4x), and a three finger swipe may correspond to an increase of the scrolling speed by two steps (from 1x to 4x, from 2x to 8x, or from 4x to 16x).
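The stepped speed scheme described above can be sketched in code. The following is an illustrative sketch only, not part of the original disclosure; the function name, the speed ladder and the cap at 16x are assumptions drawn from the example values in the text.

```python
# Illustrative mapping of finger count to scrolling-speed steps.
# A two finger swipe advances one step (1x -> 2x), a three finger swipe
# advances two steps (1x -> 4x); the top of the ladder acts as a cap.
SPEED_LEVELS = [1, 2, 4, 8, 16]  # playback speed multipliers

def next_speed(current, fingers):
    """Return the new speed multiplier after a swipe made with
    `fingers` contact points, starting from `current`."""
    steps = max(fingers - 1, 0)          # one step per finger beyond the first
    index = SPEED_LEVELS.index(current)
    return SPEED_LEVELS[min(index + steps, len(SPEED_LEVELS) - 1)]
```

For example, a three finger swipe at 2x would move two steps up the ladder to 8x, matching the "increase by two steps" behaviour described above.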
In the context of skipping through a presentation, a two finger swipe may correspond to skipping 5 seconds, whereas a three finger swipe may correspond to skipping 30 seconds. Similarly, when scrolling through a list, a two finger swipe may correspond to moving on to the next item on the list (or time slot in an EPG) and a three finger swipe may correspond to moving on to the fourth next item on the list (or time slot in the EPG). It should be noted that in the above examples the direction of the gesture may be used to determine the direction of the skipping or scrolling.
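The skip behaviour above, including direction, can be sketched as a small lookup. This is an illustrative sketch, not the disclosed implementation; the function name, the dictionary and the use of "right"/"left" strings for direction are assumptions, while the 5 s and 30 s values come from the text.

```python
# Finger count selects the skip length; swipe direction selects the sign.
SKIP_SECONDS = {2: 5, 3: 30}  # two fingers: 5 s, three fingers: 30 s

def skip_offset(fingers, direction):
    """Return a signed skip offset in seconds for a swipe made with
    `fingers` contact points; unrecognised counts produce no skip."""
    seconds = SKIP_SECONDS.get(fingers, 0)
    return seconds if direction == "right" else -seconds
```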
Furthermore, it should be noted that in the above, single finger gestures are not used for scrolling or skipping instructions, as the single finger gesture may be reserved for another function in the user interface of remote control 100 such as, for example, cursor movement. It should be understood that where reserving of the single finger swipe is not required, the single finger swipe may be used for single channel change, skip or scroll instructions, with additional fingers used to indicate a higher speed of channel change, skip or scroll.
Further still, the technique described herein may be applied to cursor movement. This is particularly relevant to devices that have a separate touch sensitive surface and screen, or to devices which have a touch screen but where the cursor controlled by the touch sensitive surface of the touch screen is displayed on a screen other than the touch screen. The relationship between the amount of input movement and the amount of cursor movement is determined by the cursor speed: for a given input movement, a fast cursor moves further on the screen than a slow cursor. The technique described herein may be used to provide a multi-speed cursor whereby the cursor moves at one speed in response to a one finger gesture, and at a faster speed in response to a two finger gesture. Optionally, the cursor may move faster still in response to a three finger gesture. Alternatively, the cursor may move at one speed in response to a one finger gesture, and at a slower speed in response to a two finger gesture.
Optionally, the cursor may move slower still in response to a three finger gesture. This may be particularly useful where precision cursor control is occasionally required.
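Both cursor modes above amount to scaling the input displacement by a per-finger-count gain. The sketch below is illustrative only; the gain values and names are assumptions, covering both the faster-with-more-fingers mode and the slower precision mode described in the text.

```python
# Gain tables keyed by the number of contact points.
CURSOR_GAIN = {1: 1.0, 2: 2.0, 3: 4.0}       # faster with more fingers
PRECISION_GAIN = {1: 1.0, 2: 0.5, 3: 0.25}   # slower, for precision control

def cursor_delta(dx, dy, fingers, precision=False):
    """Scale an input movement (dx, dy) into a cursor movement using the
    gain selected by the finger count and the active mode."""
    gain = (PRECISION_GAIN if precision else CURSOR_GAIN).get(fingers, 1.0)
    return (dx * gain, dy * gain)
```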
Figure 4 illustrates a method implemented by a device such as remote control 100 incorporating the gestures described above. At 400 the process starts and at 410 it is determined whether or not a touch input is detected. If no touch input is detected the process proceeds to a standby mode 405 in which the device may periodically determine whether touch inputs have been made to the multipoint sensing surface. If a touch input is detected at 410 the process proceeds to 420 where a determination is made as to the number of contact points that the touch input comprises. Then the process proceeds to 430 where the touch input is monitored and any movements of the contact
points are detected. Once movement is detected a gesture is identified at 440 which best correlates with the detected movement. Thus, the process establishes what gesture was used and the number of contact points used to make it. At 450 the process performs the action corresponding to the input gesture and the number of contact points. The action may comprise transmitting a control instruction to a further device.
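The final dispatch step of the Figure 4 method — performing the action corresponding to the gesture and the number of contact points — can be sketched as a lookup keyed on both. The gesture names and the action table below are hypothetical, not from the specification; the five-channel values mirror the figures 2C and 2D examples.

```python
# Actions are keyed on (gesture, number of contact points), so the same
# gesture made with more fingers maps to a larger-magnitude instruction.
ACTIONS = {
    ("swipe_right", 2): "channel_up_1",
    ("swipe_right", 3): "channel_up_5",
    ("swipe_left", 2): "channel_down_1",
    ("swipe_left", 3): "channel_down_5",
}

def handle_touch(gesture, contact_points):
    """Return the control instruction to transmit to the further device,
    or None if no action is assigned to this combination."""
    return ACTIONS.get((gesture, contact_points))
```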
Figure 5 is a block diagram illustrating the components of remote control 100. Remote control 100 comprises a touch screen 510, a gesture module 520, a processor 530 and a graphics module 540. The gesture module 520 receives touch information from the touch sensitive surface of touch screen 510 and translates it into gesture information which is sent to processor 530. Processor 530 is arranged to run instructions to allow the remote control 100 to perform as required. Processor 530 sends display information to graphics module 540, which drives the display component of touch screen 510.
Figure 6 illustrates the structure of a device 600 which does not comprise a touch screen but has separate touch and display surfaces. Device 600 comprises a touch surface 610, a gesture module 620, a processor 630, a graphics module 640, and a display 650. The gesture module 620 receives touch information from the touch surface 610 and translates it into gesture information which is sent to processor 630. Processor 630 is arranged to run instructions to allow the device 600 to perform as required. Processor 630 sends display information to graphics module 640, which drives the display 650.
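In either architecture, the gesture module's job is to turn the raw contact traces from the touch surface into (gesture, contact count) information for the processor. A minimal classification sketch follows; the trace representation, function name and dominant-axis heuristic are assumptions for illustration, not the disclosed implementation.

```python
# Each trace is a (start, end) pair of (x, y) coordinates, one per contact
# point. The gesture is classified from the average displacement, and the
# contact count is simply the number of traces.
def classify(traces):
    """Return (gesture_name, number_of_contact_points) for a set of
    contact-point traces; an empty input yields no gesture."""
    if not traces:
        return ("none", 0)
    dx = sum(ex - sx for (sx, _), (ex, _) in traces) / len(traces)
    dy = sum(ey - sy for (_, sy), (_, ey) in traces) / len(traces)
    if abs(dx) >= abs(dy):  # horizontal movement dominates
        gesture = "swipe_right" if dx > 0 else "swipe_left"
    else:                   # vertical movement dominates
        gesture = "swipe_down" if dy > 0 else "swipe_up"
    return (gesture, len(traces))
```

The returned pair is exactly what a dispatch table keyed on gesture and contact count needs, so the magnitude of the resulting action can depend on the number of fingers used.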
It will be apparent to the skilled person that the exact order and content of the actions carried out in the method described herein may be altered according to the requirements of a particular set of execution parameters. Accordingly, the order in which actions are described and/or claimed is not to be construed as a strict limitation on the order in which actions are to be performed.
Claims
1. A device comprising:
a multipoint sensing surface for receiving inputs from one or more objects;
a gesture module configured to receive inputs from the multipoint sensing surface, and to match a gesture on the multipoint sensing surface to an action, the magnitude of the action determined by the number of contact points with which the gesture is made.
2. The device of claim 1, further comprising a display.
3. The device of claim 2, wherein the multipoint sensing surface is mounted on the display.
4. The device of any preceding claim, further comprising a transmission means for sending instructions to a further device.
5. The device of claim 4, wherein the further device is at least one of: a television, a set top box, a radio, a media player and a PC.
6. The device of any preceding claim, wherein the action is to change channel, wherein the number of fingers with which the gesture is performed determines the number of channels changed.
7. The device of any of claims 1 to 5, wherein the action is to skip a segment of content playback, wherein the number of fingers with which the gesture is performed determines the length of the skip.
8. The device of any of claims 1 to 5, wherein the action is to change the speed of content playback, wherein the number of fingers with which the gesture is performed determines the change in playback speed.
9. The device of any of claims 1 to 5, wherein the action is to move a cursor on a screen, wherein the number of fingers with which the gesture is performed determines the amount the cursor moves.
10. The device of any preceding claim, wherein the gesture is at least one of a horizontal swipe, a vertical swipe or a rotational swipe, or a combination thereof.
11. A method, in a device having a multipoint sensing surface, the method comprising:
receiving inputs from one or more objects in contact with the multipoint sensing surface;
matching a gesture on the multipoint sensing surface to an action;
determining the magnitude of the action by the number of contact points with which the gesture is made.
12. The method of claim 11, wherein the multipoint sensing surface is mounted on a display.
13. The method of claim 11 or 12, further comprising sending instructions to a further device, the instructions relating to the action corresponding to the gesture and the number of contact points with which the gesture is made.
14. The method of claim 11, 12 or 13, wherein the action is to change channel, wherein the number of fingers with which the gesture is performed determines the number of channels changed.
15. The method of claim 14, wherein the direction of the gesture
determines the direction of channel change.
16. The method of any of claims 11 to 15, wherein the gesture is at least one of a horizontal swipe, a vertical swipe or a rotational swipe, or a combination thereof.
17. A computer-readable medium carrying instructions which, when executed by computer logic, cause said computer logic to carry out any of the methods defined by claims 11 to 16.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161439147P | 2011-02-03 | 2011-02-03 | |
US61/439,147 | 2011-02-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012104288A1 (en) | 2012-08-09 |
Family
ID=45833357
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2012/051531 WO2012104288A1 (en) | 2011-02-03 | 2012-01-31 | A device having a multipoint sensing surface |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2012104288A1 (en) |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130246948A1 (en) * | 2012-03-16 | 2013-09-19 | Lenovo (Beijing) Co., Ltd. | Control method and control device |
WO2014099893A3 (en) * | 2012-12-17 | 2014-08-21 | Motorola Mobility Llc | Multi-touch gesture for movement of media |
US9024894B1 (en) * | 2012-08-29 | 2015-05-05 | Time Warner Cable Enterprises Llc | Remote control including touch-sensing surface |
WO2017081455A1 (en) * | 2015-11-09 | 2017-05-18 | Sky Cp Limited | Television user interface |
WO2019127566A1 (en) * | 2017-12-30 | 2019-07-04 | 李庆远 | Method and device for multi-level gesture-based station changing |
WO2019127419A1 (en) * | 2017-12-29 | 2019-07-04 | 李庆远 | Multi-level fast forward and fast rewind hand gesture method and device |
US11201961B2 (en) * | 2017-05-16 | 2021-12-14 | Apple Inc. | Methods and interfaces for adjusting the volume of media |
AU2022200515B2 (en) * | 2017-05-16 | 2023-01-12 | Apple Inc. | Methods and interfaces for home media control |
US11620103B2 (en) | 2019-05-31 | 2023-04-04 | Apple Inc. | User interfaces for audio media control |
US11683408B2 (en) | 2017-05-16 | 2023-06-20 | Apple Inc. | Methods and interfaces for home media control |
US11714597B2 (en) | 2019-05-31 | 2023-08-01 | Apple Inc. | Methods and user interfaces for sharing audio |
US11750734B2 (en) | 2017-05-16 | 2023-09-05 | Apple Inc. | Methods for initiating output of at least a component of a signal representative of media currently being played back by another device |
US11755273B2 (en) | 2019-05-31 | 2023-09-12 | Apple Inc. | User interfaces for audio media control |
US11755712B2 (en) | 2011-09-29 | 2023-09-12 | Apple Inc. | Authentication with secondary approver |
US11782598B2 (en) | 2020-09-25 | 2023-10-10 | Apple Inc. | Methods and interfaces for media control with dynamic feedback |
US11785387B2 (en) | 2019-05-31 | 2023-10-10 | Apple Inc. | User interfaces for managing controllable external devices |
US11847378B2 (en) | 2021-06-06 | 2023-12-19 | Apple Inc. | User interfaces for audio routing |
US11893052B2 (en) | 2011-08-18 | 2024-02-06 | Apple Inc. | Management of local and remote media items |
US11900372B2 (en) | 2016-06-12 | 2024-02-13 | Apple Inc. | User interfaces for transactions |
US11907519B2 (en) | 2009-03-16 | 2024-02-20 | Apple Inc. | Device, method, and graphical user interface for moving a current position in content at a variable scrubbing rate |
US11907013B2 (en) | 2014-05-30 | 2024-02-20 | Apple Inc. | Continuity of applications across devices |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080062141A1 (en) * | 2006-09-11 | 2008-03-13 | Imran Chandhri | Media Player with Imaged Based Browsing |
EP1942401A1 (en) * | 2007-01-05 | 2008-07-09 | Apple Inc. | Multimedia communication device with touch screen responsive to gestures for controlling, manipulating and editing of media files |
US20080165140A1 (en) * | 2007-01-05 | 2008-07-10 | Apple Inc. | Detecting gestures on multi-event sensitive devices |
US20090153288A1 (en) * | 2007-12-12 | 2009-06-18 | Eric James Hope | Handheld electronic devices with remote control functionality and gesture recognition |
EP2182431A1 (en) * | 2008-10-28 | 2010-05-05 | Sony Corporation | Information processing |
US20100162181A1 (en) * | 2008-12-22 | 2010-06-24 | Palm, Inc. | Interpreting Gesture Input Including Introduction Or Removal Of A Point Of Contact While A Gesture Is In Progress |
EP2202614A1 (en) * | 2008-12-26 | 2010-06-30 | Brother Kogyo Kabushiki Kaisha | User input apparatus for multifunction peripheral device |
US20100214322A1 (en) * | 2009-02-24 | 2010-08-26 | Samsung Electronics Co., Ltd. | Method for controlling display and device using the same |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012104288A1 (en) | A device having a multipoint sensing surface | |
US11243615B2 (en) | Systems, methods, and media for providing an enhanced remote control having multiple modes | |
JP6144242B2 (en) | GUI application for 3D remote controller | |
US8881049B2 (en) | Scrolling displayed objects using a 3D remote controller in a media system | |
US10324612B2 (en) | Scroll bar with video region in a media system | |
US8194037B2 (en) | Centering a 3D remote controller in a media system | |
KR101621524B1 (en) | Display apparatus and control method thereof | |
US20120162101A1 (en) | Control system and control method | |
KR101379398B1 (en) | Remote control method for a smart television | |
US20090158222A1 (en) | Interactive and dynamic screen saver for use in a media system | |
US20090153475A1 (en) | Use of a remote controller Z-direction input mechanism in a media system | |
EP2682853A2 (en) | Mobile device and operation method control available for using touch and drag | |
US20090284532A1 (en) | Cursor motion blurring | |
CN111897480B (en) | Playing progress adjusting method and device and electronic equipment | |
EP2341492B1 (en) | Electronic device including touch screen and operation control method thereof | |
KR101515454B1 (en) | Remote controller having dual touch pad and method for controlling using the same | |
US20100162155A1 (en) | Method for displaying items and display apparatus applying the same | |
WO2012057179A1 (en) | Electronic device | |
KR20110134810A (en) | A remote controller and a method for remote contrlling a display | |
JP2013143139A (en) | Input apparatus, display apparatus, control method thereof and display system | |
KR102250091B1 (en) | A display apparatus and a display method | |
US20160124606A1 (en) | Display apparatus, system, and controlling method thereof | |
KR101253168B1 (en) | Apparatus for input with touch pad | |
JP5246974B2 (en) | Electronic device input device, input operation processing method, and input control program | |
KR20170111787A (en) | A display apparatus and a display method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12708772 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12708772 Country of ref document: EP Kind code of ref document: A1 |