WO2006036069A1 - Information processing system and method - Google Patents

Information processing system and method

Info

Publication number
WO2006036069A1
Authority
WO
WIPO (PCT)
Prior art keywords
optoelectronic device
processing system
motion
input
information processing
Prior art date
Application number
PCT/NO2005/000360
Other languages
French (fr)
Inventor
Hans Gude Gudensen
Per-Erik Nordal
Isak Engquist
Original Assignee
Hans Gude Gudensen
Per-Erik Nordal
Isak Engquist
Priority date
Filing date
Publication date
Application filed by Hans Gude Gudensen, Per-Erik Nordal, Isak Engquist
Publication of WO2006036069A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 1/00: Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16: Constructional details or arrangements
    • G06F 1/1613: Constructional details or arrangements for portable computers
    • G06F 1/1626: Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F 1/1633: Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F 1/1684: Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F 1/1686: Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated camera
    • G06F 3/02: Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/023: Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0233: Character input methods
    • G06F 3/0236: Character input methods using selection techniques to select from displayed items
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304: Detection arrangements using opto-electronic means
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346: Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042: Digitisers characterised by opto-electronic transducing means
    • G06F 3/0425: Digitisers using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485: Scrolling or panning
    • G06F 2200/00: Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F 2200/16: Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F 2200/163: Indexing scheme relating to constructional details of the computer
    • G06F 2200/1637: Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of a handheld computer
    • G06F 2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048: Indexing scheme relating to G06F3/048
    • G06F 2203/04806: Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • the present invention concerns an information processing system comprising an optoelectronic device, associated display means and display control electronics that is either physically a part of said optoelectronic device or physically separate therefrom, but connected therewith via a data link, wherein the optoelectronic device comprises an internal memory, a data processing system for operating the device, including input and/or output processes, and an optical image recorder for capturing information from the environment surrounding said optoelectronic device.
  • the present invention also concerns a method for operating an information processing system of this kind.
  • miniaturized buttons or other means of input have led to a situation where user control of the device is seriously slowed down, e.g. during text input and menu item selection.
  • the use of miniaturized multiple-choice buttons typically leads to increased operator fatigue and a higher incidence of erroneous data entries.
  • a common way to implement menu item selection is to employ a four-way button which has to be repeatedly operated in order to step through menu items that are typically arranged in a 1-dimensional list or a 2-dimensional array of symbols. As the lists and menus tend to increase in size with the increased functionality of mobile devices, the number of button operations required will increase in a highly undesirable manner.
  • the small area available for a display screen implies that displayed features must either be very small and correspondingly difficult to read, or they must be so few as to severely limit the information content in the display.
  • US patent application 2004/0141085 discloses a system that uses an orientation detecting mechanism (an accelerometer) to determine the orientation of an imaging device, e.g. a digital camera.
  • the acquired orientation information is used to reconfigure the displayed image for increased user viewing convenience.
  • the reconfiguration is explicitly effected on the captured image only. The general problem of using small displays for viewing and performing operations within large and content-rich images, diagrams, etc. is not addressed.
  • a solution to the problem of providing input to a small portable device with limited space for a keyboard is promoted by the firm Memsic Inc. of North Andover, MA, USA.
  • the portable device is equipped with a small accelerometer which senses tilting of the device. By tilting right/left and/or up/down and clicking on a button on the device, navigation is possible on a screen on the device, i.e. scrolling through lists, menu selection, map scrolling, gaming, etc.
  • An obvious drawback of this approach is the need for an accelerometer in the portable device, where space and cost typically are at an extreme premium. Beyond this, input is sensitive to motion of the device involving changes in speed or direction, and presupposes a given attitude of the portable device relative to the direction of the gravity vector.
  • a possible solution to the text input problem, described in US Pat. No. 5,818,437 is to assign multiple letters to each of the numerical buttons on the device, and use built-in dictionaries to find words that match the button sequence keyed by the user.
  • Drawbacks of this solution include the need to scroll through several valid alternatives to find the correct word, and having to explicitly spell words, names and abbreviations that are not included in the dictionary.
  • the optoelectronic device comprises hardware and software adapted for analyzing information captured by said optical image recorder with respect to temporal variations in the signals from the optical image recorder, said variations being caused by translational and/or rotational motion of said optoelectronic device relative to real or virtual objects, textures or scenery within the field of view of said optical image recorder, that said processing hardware and software are adapted for providing an output signal in scalar or vector form on the basis of the analyzed information, said output signal being indicative of the motion characteristics of said optoelectronic device relative to the real or virtual objects, textures or scenery within the field of view of said optical image recorder, that the display control electronics is connected with said optoelectronic device for receiving said output signal, and that said display control electronics is adapted for modifying or changing a displayed message or image in response to the characteristics of said output signal.
  • a method according to the invention which is characterized by deriving at least one parameter from translational and/or rotational motion of said optoelectronic device or auxiliary equipment linked to same, relative to real or virtual objects, textures or scenery objects within the field of view of said optical image recorder, and applying said at least one parameter as input to control electronics in said optoelectronic device and to effect a navigation and/or input commands in said display means, partly or completely in response to said at least one parameter.
  • figure 1 shows the main components in an information processing system according to the present invention
  • figures 2a and 2b a first preferred embodiment where the display screen on a mobile phone constitutes part of a virtual display where the viewing field is enhanced by scrolling a window across a larger page
  • figure 3 a second preferred embodiment involving a virtual keyboard
  • figure 4 a third preferred embodiment involving menu item selection
  • figures 5a and 5b an example of a variant of the third preferred embodiment
  • figure 6 a fourth preferred embodiment where an optoelectronic device according to the invention functions as a computer mouse
  • figures 7a and 7b show examples of variants of the fourth embodiment.
  • the main components of an optoelectronic device as used in the information processing system of the invention are shown in fig. 1.
  • the optoelectronic device comprises an optical image recorder, an image motion analyser linked with the former and used for analyzing recorded images and generating a continuously updated image output to a connected display.
  • the optoelectronic device is implemented in a mobile telephone or a PDA equipped with a digital camera and a display screen.
  • the optoelectronic device is termed a "phone" in these preferred embodiments. This is not meant to preclude that the optoelectronic device in question may be another type of mobile platform equipped with a digital camera and a display screen, e.g. a helmet-mounted or head-up device or system.
  • A first preferred embodiment is shown in figure 2.
  • This embodiment provides a virtual display wherein large pages of text, images, web content, etc. that would otherwise defy normal criteria for readability on the phone display screen, are accessed in selected segments that are scrolled across the larger page simply by moving the phone in three dimensions.
  • This scrolling is rendered intuitive by correlation between the virtual motion across the page to be viewed with the real motion of the phone.
  • the latter is typically a right/left and up/down motion of the phone above or in front of a surface or object or scene that exhibits adequate structure or texture for the camera to register movement.
  • the degree of zoom can be used to control the speed of scrolling for a given motion input, e.g. the more zoom, the slower the scrolling, and vice versa.
  • Combined with the lateral motion of the phone optionally there may be an in/out motion relative to the surface or object which effects a zoom in/zoom out function on the displayed image of the large page.
  • One may combine scrolling with the zoom function, such that fast scrolling reduces the zoom factor and vice versa.
  • the zoom level then applied may be maintained until the zoom function is activated again, e.g. by the relative in/out motion of the phone.
  • the absolute zoom level shall be user controllable.
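As a non-authoritative illustration of the zoom-coupled scrolling described above, the following sketch converts a measured camera displacement into a scroll step on the virtual page, attenuated by the current zoom factor (more zoom, slower scrolling), and optionally reduces the zoom while scrolling fast. The function names, scaling law and clamping limits are assumptions, not taken from the patent text.

```python
def scroll_step(motion_px: float, zoom: float) -> float:
    """Page-coordinate displacement for a given camera motion and zoom level."""
    if zoom <= 0:
        raise ValueError("zoom must be positive")
    return motion_px / zoom  # higher zoom -> smaller (slower) scroll step


def coupled_zoom(zoom: float, speed_px_s: float, k: float = 0.001) -> float:
    """Reduce the zoom factor while scrolling fast, and vice versa (clamped)."""
    return min(4.0, max(0.25, zoom / (1.0 + k * speed_px_s)))
```

The clamp keeps the automatically adjusted zoom within a sensible range, while the absolute level remains user controllable as stated above.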
  • An alternative variant of phone motion which is possible in cases where no suitable surface or object is available in reasonably close proximity to the phone, is to rotate the phone right/left and up/down, i.e. panning and tilting the camera to cause the observed scene to move across the camera chip surface.
  • this can be perceived intuitively as being related to tilting of the observer's head to view different parts of the page.
  • motion of the camera in a line of sight to the distant objects will cause the image to change very little, and other means of providing zooming commands must be used.
  • zooming in/out in the virtual image can be achieved in several ways, namely a) by using the zoom function built into the camera, if such exists; b) by pressing or toggling a mechanical input key, switch or pressure- sensitive field on the phone; c) by moving the phone in a prescribed pattern, e.g. rapid sideways or up/down rotation for zoom in/zoom out; or d) by acoustic input, e.g. voice command.
  • A motion threshold shall typically apply for the recorded speed of motion (translational or rotational), and the magnitude of the threshold may be selected or adjusted according to the type of usage and environment encountered in each application.
  • any predictable motion pattern that shall not serve as input cue to the optoelectronic device may be compensated for in the signal processing, e.g. a constant linear motion due to steady translational motion of the optoelectronic device such as walking.
  • the "dead band" would be centered around a certain speed value corresponding to the constant motion.
  • the algorithm for making the system insensitive to motion that is not intended by the operator to provide meaningful motion input to the optical image recorder may also be set to reject very rapid accelerations or motions, e.g. by setting an upper limit to the legitimate sensitive range.
  • the processing can easily be set to pick up a specific motion as an input cue, e.g. cyclic motion within a defined frequency band such as would occur in response to rapid wagging of the optoelectronic device sideways or up and down.
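The dead-band and upper-limit rejection described in the preceding bullets can be sketched as follows. This is an illustrative reading of the text, not the patent's implementation; the numeric limits and the function name are assumptions.

```python
def filter_motion(speed: float,
                  baseline: float = 0.0,
                  dead_band: float = 2.0,
                  upper_limit: float = 50.0) -> float:
    """Return the speed component treated as deliberate input, or 0.0 if rejected.

    A dead band of +/- dead_band around the baseline (e.g. the constant speed
    induced by steady walking) is ignored, and deviations above upper_limit
    are rejected as accidental jolts. All values are assumed, in pixels/frame.
    """
    delta = speed - baseline
    if abs(delta) < dead_band:    # inside the dead band: not an input cue
        return 0.0
    if abs(delta) > upper_limit:  # very rapid motion: likely accidental
        return 0.0
    return delta
```

Setting a non-zero `baseline` corresponds to compensating for a predictable motion pattern such as walking, so that the dead band is centered around the constant speed rather than around zero.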
  • the phone movement can also be used to scroll a virtual keyboard for text input, for instance to message systems like SMS.
  • a single function key is pressed to select a letter, and the range of characters can be extended or various special functions implemented by using movement in the third dimension.
  • writing via a virtual keyboard is performed without the need for a set of physical keys representing the alphabet or other symbols on the face of the phone. Instead, letters or symbols are selected on a virtual keyboard shown in the display, by a two- step process: First, moving the phone in a lateral or rotational motion causes a visual indicator function to move across the virtual keyboard.
  • a "Select" command is given to the phone, by one of the following modalities, namely a) by pressing or toggling a mechanical input key, switch or pressure- sensitive field on the phone; b) by moving the phone in a prescribed pattern, i.e. laterally up/down or rotationally (pan/tilt) or along the optical axis of the phone's camera (in/out); or c) by acoustic input, e.g. voice command.
  • the virtual keyboard may be larger than the display, causing only part of it to be visible within the field defined by the phone's display screen. This situation is depicted in figure 3, and in this case the virtual keyboard scrolls across the display, one keyboard letter or symbol being highlighted or framed at a time upon passing a selector field in e.g. the middle of the screen.
  • the whole virtual keyboard is shown inside the display, and a cursor or highlighting field moves across the virtual keyboard when the phone is moved.
  • Selection of upper or lower case characters or special symbols or functions is achieved by entering a command into the phone by one of the modalities a) to c) described earlier in this paragraph.
  • the command may be of the "Caps shift” or “Caps lock” type, depending on the situation.
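The two-step virtual-keyboard selection described above (motion moves a highlight across the key grid; a separate "Select" command commits the key) might be sketched as below. The key layout, the motion-to-cell scaling and the class interface are assumptions for illustration only.

```python
# Hypothetical 3x6 virtual keyboard layout (the patent does not specify one).
KEYS = [list("ABCDEF"), list("GHIJKL"), list("MNOPQR")]


class VirtualKeyboard:
    def __init__(self, step_px: float = 20.0):
        self.step = step_px  # assumed camera motion (px) per key cell
        self.col = 0.0
        self.row = 0.0

    def move(self, dx_px: float, dy_px: float) -> str:
        """Update the highlight from a lateral motion estimate; return the
        currently highlighted key (clamped to the keyboard edges)."""
        self.col = min(len(KEYS[0]) - 1, max(0.0, self.col + dx_px / self.step))
        self.row = min(len(KEYS) - 1, max(0.0, self.row + dy_px / self.step))
        return KEYS[int(self.row)][int(self.col)]

    def select(self) -> str:
        """Commit the highlighted key (the 'Select' command, e.g. a key press,
        a prescribed motion pattern, or a voice command)."""
        return KEYS[int(self.row)][int(self.col)]
```

The same accumulator works whether the whole keyboard is shown with a moving cursor or only a window of a larger keyboard scrolls past a fixed selector field.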
  • a sweep across a virtual background field where the menu icons are laid out can be performed quickly and simply.
  • Motion patterns and selection modes can be similar to those described for the virtual keyboard above.
  • a typical phone has many menus. The user could pre-set which menus should be subject to motion controlled selection, and then the camera would automatically be activated when these menus are selected. Menus could be nested hierarchically, and motion control could be used to navigate into and within sub-menus.
  • the phone camera is automatically activated to detect phone movement.
  • the phone calls the presently highlighted contact.
  • If the contact has more than one phone number, the first dial-up or select command produces a pop-up window with all the phone numbers. Phone movement is used to highlight the desired number and the dial-up button is used for placing the call.
  • This strategy enables fast selection of the desired contact in one sweeping phone movement. It enables more contacts on the screen than the usual text list layout, and is based on human recognition of images/graphical symbols rather than text, which is normally a faster search method.
  • the present invention could also in a general fashion implement a so-called relative pointer device, for instance for controlling the cursor on a computer screen. However, the present invention shall allow the extension of relative pointer technology far beyond that offered by e.g. a conventional computer mouse.
  • This opens the way for a fourth preferred embodiment, the principle of which is shown in fig. 6, which basically renders the use of the optoelectronic device as a three-dimensional computer mouse or similar input device to a mobile or stationary information processing system. Cursor movement on the computer screen is controlled by moving the optoelectronic device in one or more of the translational, rotational or combined translational/rotational modes described in the other preferred embodiments above, while mouse clicks can be effected in several ways, e.g.
  • the input from the optoelectronic device to the information processing system is the operator signature, e.g. in legal or economic transactions or in access control.
  • the operator in this case moves the optoelectronic device as he would a pen to create the pattern, which may be his name signature or some other pre-defined identifying pattern.
  • the motion need not take place against any supporting surface, but could be performed in the air in front of the operator ("writing in the air").
  • One example of such processing would be to scale the size of the recorded signature to a predefined norm.
  • an alternative to moving the phone in the open air shall be to equip the phone with additional physical hardware, e.g. a rounded, ball-like object which may be attached to the phone at the side where the camera is located.
  • the phone may be rested on the desktop or any other object, while the three-dimensional motions are achieved by tilting the phone in the various directions, as allowed by the thickness of the attached, rounded object.
  • the translational movements of the mouse can now be replaced by a rotational movement around any of three axes and over a fairly small angle.
  • Communication with the computer could be via a wire or by one of several wireless means.
  • Bluetooth communication would enable using the mobile electronic device for controlling presentations, including pointer control and the possibility for the user to stand up and move around.
  • An important advantage of this embodiment relative to the conventional computer mouse is that contact with or close proximity to a surface is not needed.
  • it is possible to provide input commands linked to a motion in the third dimension either by increasing or reducing the distance from the camera of the mobile electronic device to an object in front of it or by employing the camera zoom function. This opens up opportunities for 3-D cursor control, e.g. in CAD software or in the display/manipulation of 3-dimensional virtual or real-space objects.
  • the optoelectronic device can be equipped with a microphone, such as is the case when the optoelectronic device is a mobile telephone with camera. This makes it possible to insert audio files into the computer. Likewise, by means of a speaker or earphone attached to the optoelectronic device, it becomes possible to listen to audio files that have been located in the computer.
  • relative motion may also be created while the optoelectronic device is held stationary or nearly so.
  • the relative motion is then created by moving an object, e.g. the operator's hand (or finger), in front of the optical image recorder of the optoelectronic device.
  • This variant can be implemented in various ways, as shown in figs 7a and 7b.
  • Figure 7a shows a person carrying an optoelectronic device according to the invention with a camera viewing out in front of the person. Moving his hands in front of the camera causes a cursor to move on the screen of a computer as shown in fig. 7a, or on a screen in the optoelectronic device itself.
  • Mouse clicks can be done in several ways, e.g.
  • the person holds a contrast-enhancing or recognition aiding implement in his hand, e.g. a pen or pencil with a marker spot of high optical contrast on it, optionally in a color selected in an optimized spectral region, to simplify processing of the image.
  • Fingers or hands of the operator can be marked with a spot or symbol of high optical contrast, e.g. to show the positions of the fingertips.
  • variants of this fourth embodiment exemplified in figures 7a and 7b can be used to effect handwritten input into the computer, the optoelectronic device or both, including name or other types of signatures for legal and transaction purposes or access control.
  • Image analysis provides further possibilities for more complex and hence more compact or efficient input coding: For example, two fingers in contact with each other side by side may carry a different message from the same two fingers splayed away from each other.
  • a simple example of this is writing to virtual keyboards (cf. below), in which case lower case symbols can be written with a single finger, upper case with two fingers close together.
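The finger-configuration coding just described (fingers together versus splayed carrying different messages) could be discriminated from tracked fingertip positions roughly as follows. The distance threshold, labels and function name are hypothetical, introduced only to make the idea concrete.

```python
import math


def finger_gesture(points, together_px: float = 30.0) -> str:
    """Classify a compact input code from detected fingertip positions.

    points: list of (x, y) fingertip coordinates in the camera image.
    together_px is an assumed distance threshold (in pixels) below which
    two fingertips count as "together" (e.g. upper case on a virtual keyboard).
    """
    if len(points) == 1:
        return "lower"  # single finger: e.g. lower case symbol
    if len(points) == 2:
        (x1, y1), (x2, y2) = points
        d = math.hypot(x2 - x1, y2 - y1)
        return "upper" if d <= together_px else "splayed"
    return "unknown"
```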
  • detection of more than one motion vector in the field of view of the optical sensing device of the optoelectronic device can be employed as input.
  • An example of this is the use of two hands or fingers that are moved independently, but in coordinated fashion.
  • the principle scheme in figure 7b can be used to provide keyboard input via a "virtual keyboard":
  • a keyboard may be printed on the table surface or on a flat sheet lying on the table, and the camera records the position of the operator's fingers on the keyboard.
  • Image analysis software is used to determine which key is indicated, e.g. by analyzing the obscuration of keyboard by the hands and fingers. In cases where fingertips are tagged for enhanced visibility, the tag location relative to the keyboard is analyzed.
  • the actual tapping to indicate selection of an indicated key may be achieved by one of the modalities mentioned above in connection with mouse clicking, or by providing a motion cue to the imaging software. An example of the latter would be to set an activation threshold related to the dwell time on a given key, i.e. the finger must rest a time exceeding a predetermined minimum for a tapping to be registered.
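The dwell-time activation cue described above (a fingertip must rest on a key for longer than a predetermined minimum before a tap is registered) can be sketched as a simple per-frame filter. The sample-count threshold and function name are assumptions.

```python
def detect_tap(key_samples, dwell_min: int = 5):
    """Register a key when the fingertip rests on it for >= dwell_min frames.

    key_samples: sequence of key labels, one per video frame, with None when
    no fingertip is over any key. Returns the list of registered taps, firing
    exactly once per continuous dwell. dwell_min is an assumed threshold.
    """
    taps, current, count = [], None, 0
    for key in key_samples:
        if key == current and key is not None:
            count += 1
        else:
            current, count = key, 1
        if key is not None and count == dwell_min:
            taps.append(key)
    return taps
```

A brief pass over a key (fewer than `dwell_min` consecutive frames) produces no tap, which is what makes the dwell criterion robust against the finger merely sliding across the keyboard.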
  • a variant of a virtual keyboard would be to project a keyboard as an image onto the writing surface. Fingertip tags with high contrast could in this case be rendered especially useful by making them fluoresce in response to the light used in projecting the keyboard or another light source.
  • Another type of virtual keyboard can be established under a wider concept which may be termed a "virtual touch screen":
  • the optoelectronic device is equipped with a camera which is directed towards the screen display of a computer.
  • the operator places a pointer or wand, or one or more fingers or hand(s) against the screen, thereby obscuring parts of the displayed image.
  • Image analysis software determines which of the displayed image parts or regions are indicated, e.g. by analysing the contours of the object held in front of the screen. This principle thus provides the features of a touch-sensitive screen, offering input opportunities for menu selection, etc., but without the need for technically complex physical sensors to record the touching action.
  • writing may be performed in a manner similar to the case described above where writing was performed on a printed or projected keyboard.
  • the virtual touch screen provides opportunities for more sophisticated input operations than simple two-dimensional coordinate point indication, by making the image analysis software able to discriminate between different motion- and shape patterns for the operator's fingers and hands in front of the screen.
  • An example of this is to use the flat palm of the hand to mark or wipe larger parts of a document or scene, to effect copying, deletion or transfer.
  • One vital part of any of the indicated embodiments is the processing of camera image sequences to obtain the output signal which indicates movement of the mobile electronic device relative to imaged objects or scenes.
  • motion estimation is a well-established research field whose presently dominant application is compression of digital video sequences.
  • the MPEG video compression standards (see e.g.: http://www.chiariglione.org/mpeg/) rely heavily on motion estimation.
  • the present invention has an advantage over conventional video compression, in that only the camera motion need be estimated, which means characterization of the general movement of the background of the scene. The motion of smaller objects across the background can be disregarded, which lowers the demand for processing power compared to video compression.
  • the camera image needs to be split into macroblocks, typically of the size 16x16 pixels.
  • the motion of each macroblock from one frame to another is then computed, resulting in a motion vector (MV).
  • the motion vectors of all macroblocks together constitute the motion vector field (MVF).
  • Moving objects, mismatched macroblocks due to insufficient texture, etc. will show up as outliers in the MVF and can be disregarded through proper filtering.
  • the implementation of MVF calculation in the present invention will be chosen in each particular embodiment according to the properties of the camera and other phone hardware, available processing power, etc.
  • Block matching is probably the most common way to calculate the MVF (cf., e.g.: A. M. Tekalp, Digital Video Processing, Prentice Hall, 1995; J. Watkinson, The Engineer's Guide to Motion Compensation, Snell & Wilcox, 1994).
  • each macroblock in the new frame is compared with shifted regions of the same size from the previous frame, and the shift which results in the minimum error is selected as the best MV for that macroblock.
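The block-matching search described above can be sketched in a few lines of illustrative Python. The tiny frames, the 4x4 block (instead of a real 16x16 macroblock) and the search range are assumptions for demonstration only; the median filter at the end illustrates how the global camera motion can be extracted from the MVF while outliers are discarded, as noted above.

```python
import random
import statistics

def sad(block_a, block_b):
    # Sum of absolute differences between two equally sized blocks.
    return sum(abs(a - b) for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def get_block(frame, top, left, size):
    return [row[left:left + size] for row in frame[top:top + size]]

def best_motion_vector(prev_frame, new_frame, top, left, size, search=3):
    # Compare the macroblock in the new frame with shifted regions of the
    # same size from the previous frame; the shift giving minimum error
    # is selected as the motion vector (MV) for that macroblock.
    target = get_block(new_frame, top, left, size)
    best_err, best_mv = None, None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            err = sad(target, get_block(prev_frame, top + dy, left + dx, size))
            if best_err is None or err < best_err:
                best_err, best_mv = err, (dy, dx)
    return best_mv

def global_motion(motion_vector_field):
    # Camera motion as the componentwise median of the MVF: outliers from
    # moving objects or poorly textured macroblocks are disregarded.
    return (statistics.median(v[0] for v in motion_vector_field),
            statistics.median(v[1] for v in motion_vector_field))

# Synthetic demonstration: a random 20x20 texture shifted by one pixel
# down and two pixels right between frames.
random.seed(0)
prev_frame = [[random.randrange(256) for _ in range(20)] for _ in range(20)]
new_frame = [[prev_frame[y + 1][x + 2] for x in range(16)] for y in range(16)]
mv = best_motion_vector(prev_frame, new_frame, top=6, left=6, size=4)
# mv recovers the synthetic shift (1, 2)
```

A full-search implementation like this is exhaustive; practical systems use faster search strategies, but the minimum-error principle is the same.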
  • Other methods of camera motion estimation include phase correlation (cf., e.g.: J. Watkinson, The Engineer's Guide to Motion Compensation, Snell & Wilcox, 1994; Y. Liang, Phase-Correlation Motion Estimation, Final project, Stanford University) and the utilization of MPEG motion vectors (cf., e.g.: M. Pilu, On Using Raw MPEG Motion Vectors To Determine Global Camera Motion, Hewlett-Packard Company, 1997).
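For illustration, phase correlation can be sketched with a naive 2D discrete Fourier transform. The frame size and test shift are arbitrary assumptions, and a real implementation would use an FFT rather than the slow direct transform below.

```python
import cmath
import random

def dft2(img, inverse=False):
    # Naive 2D DFT; adequate for the tiny illustrative frames used here,
    # far too slow for real video (use an FFT in practice).
    h, w = len(img), len(img[0])
    sign = 1 if inverse else -1
    out = [[0j] * w for _ in range(h)]
    for u in range(h):
        for v in range(w):
            s = 0j
            for y in range(h):
                for x in range(w):
                    s += img[y][x] * cmath.exp(sign * 2j * cmath.pi
                                               * (u * y / h + v * x / w))
            out[u][v] = s / (h * w) if inverse else s
    return out

def phase_correlate(frame_a, frame_b):
    # The normalised cross-power spectrum, transformed back to the
    # spatial domain, peaks at the translation from frame_a to frame_b.
    A, B = dft2(frame_a), dft2(frame_b)
    h, w = len(A), len(A[0])
    cps = [[0j] * w for _ in range(h)]
    for u in range(h):
        for v in range(w):
            num = B[u][v] * A[u][v].conjugate()
            cps[u][v] = num / (abs(num) or 1.0)
    corr = dft2(cps, inverse=True)
    _, py, px = max((corr[y][x].real, y, x) for y in range(h) for x in range(w))
    # Coordinates past the midpoint wrap around to negative shifts.
    return (py if py <= h // 2 else py - h, px if px <= w // 2 else px - w)

# Synthetic demonstration: an 8x8 random frame circularly shifted by (2, -1).
random.seed(1)
frame = [[random.randrange(256) for _ in range(8)] for _ in range(8)]
shifted = [[frame[(y - 2) % 8][(x + 1) % 8] for x in range(8)] for y in range(8)]
# phase_correlate(frame, shifted) recovers (2, -1)
```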
  • Power consumption is a potential concern in embodiments of the present invention where an imaging camera and/or a display screen are operated on a battery.
  • the camera part of the system draws little power compared to an illuminated display screen.
  • several power conservation strategies are possible, since imaging requirements are different from and generally less demanding than those in regular photography.
  • the total number of available camera pixels need not be active in the primary photo-event or in subsequent signal processing steps.
  • Frame rates may also be reduced, depending on the updating rate and response speed that is required in applications based on the present invention.

Abstract

In an information processing system an optoelectronic device comprises a memory, a data processing system, an optical image recorder, and further hardware and software for analyzing information captured by the optical image recorder. On the basis of the analyzed information, an output signal indicative of the motion characteristics of the optoelectronic device relative to real or virtual recorded objects, textures or scenery is output to display control electronics adapted for managing a displayed image in response to the output signal characteristics. In a method for operating an information processing system of this kind, at least one parameter is derived from the motion of the optoelectronic device and equipment linked therewith relative to real or virtual recorded objects, textures or scenery, and the at least one parameter is applied as input to control electronics for enabling navigation and/or input commands in display means connected therewith. In particular, the system and method can be used for implementing a three-dimensional relative pointer device, in practice a three-dimensional computer mouse.

Description

Information processing system and method
Field of the invention
The present invention concerns an information processing system comprising an optoelectronic device, associated display means and display control electronics that either is physically a part of said optoelectronic device or physically separate therefrom, but connected therewith via a data link, wherein the optoelectronic device comprises an internal memory, a data processing system for operating the device, including input- and/or output processes, and an optical image recorder for capturing information from the environment surrounding said optoelectronic device. The present invention also concerns a method for operating an information processing system of this kind.
Background of the invention
Operator communication with information processing systems such as personal computers and other stationary systems has generally been achieved via keyboard and mouse. There is now a trend towards mobile and hand-held electronic devices such as mobile telephones and personal digital assistants (PDAs) becoming progressively more sophisticated, with growing processing power, memory capacity, functionalities and communication options and bandwidths. This implies that many of the tasks that are presently being performed on personal computers and similar stationary equipment shall be performed on physically much smaller platforms in the future. Before this can become a widespread reality, however, there are two major problems that must be solved, both of which stem from the small physical size of the devices in question.
First, there is no space for a traditional keyboard on which to enter operator commands. The severe restrictions on the number of buttons or other means of input (scroll wheels, four-way buttons, miniature joysticks) have led to a situation where user control of the device is seriously slowed down, e.g. during text input and menu item selection. Furthermore, extensive use of miniaturized multiple-choice buttons typically leads to increased operator fatigue and higher incidence of erroneous data entries: A common way to implement menu item selection is to employ a four-way button which has to be repeatedly operated in order to step through menu items that are typically arranged in a 1-dimensional list or a 2-dimensional array of symbols. As the lists and menus tend to increase in size with the increased functionality of mobile devices, the number of button operations required will increase in a highly undesired manner.
Second, the small area available for a display screen implies that displayed features must either be very small and correspondingly difficult to read, or they must be so few as to severely limit the information content in the display.
Several solutions have been tried to solve these problems, but with limited success. One solution is to display only a portion of a page, and scroll or step across it using touch buttons on the mobile device to bring sub-sections of the page into view. Analogously, a cursor or a highlighted area can be moved across the image on the display. Indeed, this is a widely used method to alleviate the problems associated with a small screen, but has a number of drawbacks: It requires dedicated touch buttons that must be operated in a hands-on mode, it is not practical for scrolling and navigation in more than one direction at a time, and it lacks intuitive user features that are desirable in many applications.
Another solution, described in US Pat. No. 6,466,203 is to exhibit the entire page at low magnification on a touch-sensitive display screen initially to provide an overview. The user can recognize the page's general lay-out and presence of hyperlinks. When the user touches a particular location on the touch screen that corresponds to a portion of the page's image, the portion gets displayed so as to fill the display's area. Thus, the user can browse the Web with a display of a limited size. There are several drawbacks associated with this solution: Apart from requiring input via a touch-sensitive screen of small size and high resolution, it requires a particular logic organization and editing of the displayed data to permit extraction of blocks of information, it is not suitable for incremental scanning across a large page, etc.
In US patent application 2002/0024506, a method is disclosed that enables scrolling and zooming in an image displayed on the small screen of a hand-held device, using an image capture device that tracks the motion of a reference target, where said target is a part of, or attached to, the user's body or clothing. The need for a reference target poses several drawbacks: There will be a delay at start-up while the target is acquired, the movement of the handheld device is limited by the requirement to keep the target in view of the image capture device, and there may be undesired scrolling or zooming when the user needs to e.g. turn his head.
US patent application 2004/0141085 discloses a system that uses an orientation detecting mechanism (an accelerometer) to determine the orientation of an imaging device, e.g. a digital camera. The acquired orientation information is used to reconfigure the displayed image for increased user viewing convenience. The reconfiguration is explicitly effected on the captured image only. The general problem of using small displays for viewing and performing operations within large and content-rich images, diagrams, etc. is not addressed.
A solution to the problem of providing input to a small portable device with limited space for a keyboard is promoted by the firm Memsic Inc. of North Andover, MA, U.S.A. In this case, the portable device is equipped with a small accelerometer which senses tilting of the device. By tilting right/left and/or up/down and clicking on a button on the device, navigation is possible on a screen on the device, i.e. scrolling through lists, menu selection, map scrolling, gaming, etc. An obvious drawback of this approach is the need for an accelerometer in the portable device, where space and cost typically are at an extreme premium. Beyond this, input is sensitive to motion of the device involving changes in speed or direction, and presupposes a given attitude of the portable device relative to the direction of the gravity vector.
A possible solution to the text input problem, described in US Pat. No. 5,818,437 is to assign multiple letters to each of the numerical buttons on the device, and use built-in dictionaries to find words that match the button sequence keyed by the user. Drawbacks of this solution include the need to scroll through several valid alternatives to find the correct word, and having to explicitly spell words, names and abbreviations that are not included in the dictionary.
It is thus a major object of the present invention to provide apparatus and methods for input/output operations involving information processing systems where the available space for keyboard or other mechanical input implements is restricted, and/or where available space for display is restricted. It is a further major object of the present invention to provide apparatus and methods for input/output operations involving information processing systems where at least the input from the operator takes place on a mobile, e.g. handheld or helmet-mounted component of the system.
Summary of the invention
The above objects as well as other features and advantages are realized with an information processing system according to the present invention which is characterized in that the optoelectronic device comprises hardware and software adapted for analyzing information captured by said optical image recorder with respect to temporal variations in the signals from the optical image recorder, said variations being caused by translational and/or rotational motion of said optoelectronic device relative to real or virtual objects, textures or scenery within the field of view of said optical image recorder, that said processing hardware and software are adapted for providing an output signal in scalar or vector form on the basis of the analyzed information, said output signal being indicative of the motion characteristics of said optoelectronic device relative to the real or virtual objects, textures or scenery within the field of view of said optical image recorder, that the display control electronics is connected with said optoelectronic device for receiving said output signal, and that said display control electronics is adapted for modifying or changing a displayed message or image in response to the characteristics of said output signal.
The above objects as well as other features and advantages are also realized with a method according to the invention which is characterized by deriving at least one parameter from translational and/or rotational motion of said optoelectronic device or auxiliary equipment linked to same, relative to real or virtual objects, textures or scenery within the field of view of said optical image recorder, and applying said at least one parameter as input to control electronics in said optoelectronic device to effect navigation and/or input commands in said display means, partly or completely in response to said at least one parameter.
Further features and advantages shall be apparent from the appended dependent claims. The invention shall now be explained in more detail by resorting to a discussion of exemplary embodiments thereof and with reference to the accompanying drawing figures, of which figure 1 shows the main components in an information processing system according to the present invention, figures 2a and 2b a first preferred embodiment where the display screen on a mobile phone constitutes part of a virtual display where the viewing field is enhanced by scrolling a window across a larger page, figure 3 a second preferred embodiment involving a virtual keyboard, figure 4 a third preferred embodiment involving menu item selection, figures 5a and 5b an example of a variant of the third preferred embodiment, figure 6 a fourth preferred embodiment where an optoelectronic device according to the invention functions as a computer mouse, and figures 7a and 7b examples of variants of the fourth embodiment.
The main components of an optoelectronic device as used in the information processing system of the invention are shown in fig. 1. The optoelectronic device comprises an optical image recorder and an image motion analyser linked with the former and used for analyzing recorded images and generating a continuously updated image output to a connected display. There shall now be described a set of preferred embodiments where the optoelectronic device is implemented in a mobile telephone or a PDA equipped with a digital camera and a display screen. For convenience the optoelectronic device is termed a "phone" in these preferred embodiments. This is not meant to preclude that the optoelectronic device in question may be another type of mobile platform equipped with a digital camera and a display screen, e.g. a helmet-mounted or head-up device or system.
A first preferred embodiment is shown in figure 2. This embodiment provides a virtual display wherein large pages of text, images, web content, etc. that would otherwise defy normal criteria for readability on the phone display screen, are accessed in selected segments that are scrolled across the larger page simply by moving the phone in three dimensions. This scrolling is rendered intuitive by correlation between the virtual motion across the page to be viewed with the real motion of the phone. The latter is typically a right/left and up/down motion of the phone above or in front of a surface or object or scene that exhibits adequate structure or texture for the camera to register movement. In cases where the phone camera is equipped with a zoom function with the zooming corresponding to movement in a third dimension, the degree of zoom can be used to control the speed of scrolling for a given motion input, e.g. the more zoom, the slower the scrolling, and vice versa.
Optionally, combined with the lateral motion of the phone, there may be an in/out motion relative to the surface or object, which effects a zoom in/zoom out function on the displayed image of the large page. One may combine scrolling with the zoom function, such that fast scrolling reduces the zoom factor and vice versa. When the scrolling stops, the zoom level then applied may be maintained until the zoom function is activated again, e.g. by the relative in/out motion of the phone. The absolute zoom level shall be user controllable.
An alternative variant of phone motion, which is possible in cases where no suitable surface or object is available in reasonably close proximity to the phone, is to rotate the phone right/left and up/down, i.e. panning and tilting the camera to cause the observed scene to move across the camera chip surface. In the virtual display image of the page this can be perceived intuitively as being related to tilting of the observer's head to view different parts of the page. When distant objects are imaged by the camera, motion of the camera in a line of sight to the distant objects will cause the image to change very little, and other means of providing zooming commands must be used. In this case, zooming in/out in the virtual image can be achieved in several ways, namely a) by using the zoom function built into the camera, if such exists; b) by pressing or toggling a mechanical input key, switch or pressure-sensitive field on the phone; c) by moving the phone in a prescribed pattern, e.g. rapid sideways or up/down rotation for zoom in/zoom out; or d) by acoustic input, e.g. voice command.
If the user scrolls to one of the edges of the page to be viewed, further movement/tilting of the phone should not result in a blank screen (corresponding to virtually looking outside the page). Rather, the scrolling should be stopped at the page edge for as long as the movement/tilting continues, to be resumed again as soon as the user begins scrolling in another direction. The user could be alerted of reaching the edge of the page by e.g. a thumping sound from the phone. Apart from avoiding getting lost in the virtual blank space surrounding the page, this scheme enables the user to virtually drag the page along if he desires to turn around, walk, or otherwise shift position.
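The scrolling behaviour described for this embodiment, motion-driven scrolling slowed down at higher zoom and clamped at the page edges, can be sketched as follows. The class name, the gain constant and the pixel units are illustrative assumptions, not part of the invention as claimed:

```python
class VirtualDisplay:
    """Sketch of the virtual-display window of the first embodiment.

    Page and window sizes are in virtual-page pixels; the gain constant
    and the inverse-zoom scrolling rule are illustrative assumptions."""

    def __init__(self, page_w, page_h, win_w, win_h):
        self.page_w, self.page_h = page_w, page_h
        self.win_w, self.win_h = win_w, win_h
        self.x = self.y = 0.0          # top-left corner of the window
        self.zoom = 1.0

    def apply_motion(self, dx_px, dy_px, gain=4.0):
        # The more zoom, the slower the scrolling, and vice versa.
        self.x = self._clamp(self.x + gain * dx_px / self.zoom,
                             self.page_w - self.win_w)
        self.y = self._clamp(self.y + gain * dy_px / self.zoom,
                             self.page_h - self.win_h)

    @staticmethod
    def _clamp(value, upper):
        # Stop at the page edge instead of scrolling into blank space.
        return max(0.0, min(value, upper))

    def at_edge(self):
        # Used e.g. to trigger an audible cue when the border is reached.
        return (self.x in (0.0, self.page_w - self.win_w)
                or self.y in (0.0, self.page_h - self.win_h))

view = VirtualDisplay(page_w=1000, page_h=800, win_w=200, win_h=150)
view.apply_motion(300, 0)   # large rightward motion: clamped at the edge
```

The `at_edge` test is where the "thumping sound" alert mentioned above would be hooked in.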
It is reasonable to expect that mobile and hand-held optoelectronic devices according to the present invention shall be exposed to mechanical disturbances in the form of unsteady and erratic, or cyclic or continuous motion that is not intended by the operator to provide meaningful motion input to the optical image recorder. To handle such situations, the present invention teaches the introduction of a discrimination threshold below which motion is not registered as input activation to the optoelectronic device
("clutching"). This threshold shall typically apply for the recorded speed of motion (translational or rotational), and the magnitude of the threshold may be selected or adjusted according to the type of usage and environment that is encountered in each application. In a variant of this scheme, any predictable motion pattern that shall not serve as input cue to the optoelectronic device may be compensated for in the signal processing, e.g. a constant linear motion due to steady translational motion of the optoelectronic device such as walking. In the latter example, the "dead band" would be centered around a certain speed value corresponding to the constant motion. The algorithm for making the system insensitive to motion that is not intended by the operator to provide meaningful motion input to the optical image recorder may also be set to reject very rapid accelerations or motions, e.g. by setting an upper limit to the legitimate sensitive range. Likewise, the processing can easily be set to pick up a specific motion as an input cue, e.g. cyclic motion within a defined frequency band such as would occur in response to rapid wagging of the optoelectronic device sideways or up and down.
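A minimal sketch of such a discrimination scheme, assuming a scalar speed input; the threshold values and the bias compensation are illustrative only and would be tuned per application and environment:

```python
def filter_motion(speed, lower=0.5, upper=50.0, bias=0.0):
    # Discrimination of motion not intended as input ("clutching").
    # lower/upper thresholds and the bias used to compensate predictable
    # motion (e.g. a steady walking speed) are illustrative assumptions.
    s = speed - bias
    if abs(s) < lower:      # dead band: ignore unsteady, erratic motion
        return 0.0
    if abs(s) > upper:      # reject very rapid accelerations or motions
        return 0.0
    return s
```

With `bias` set to a constant speed, the dead band is centered around that speed value, as described above for steady translational motion such as walking.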
The phone movement can also be used to scroll a virtual keyboard for text input, for instance to message systems like SMS. A single function key is pressed to select a letter, and the range of characters can be extended or various special functions implemented by using movement in the third dimension. This forms the basis of a second preferred embodiment, illustrated in figure 3. In this embodiment writing via a virtual keyboard is performed without the need for a set of physical keys representing the alphabet or other symbols on the face of the phone. Instead, letters or symbols are selected on a virtual keyboard shown in the display, by a two-step process: First, moving the phone in a lateral or rotational motion causes a visual indicator function to move across the virtual keyboard. When the desired letter or symbol has been reached, the camera is momentarily held still and a "Select" command is given to the phone, by one of the following modalities, namely a) by pressing or toggling a mechanical input key, switch or pressure-sensitive field on the phone; b) by moving the phone in a prescribed pattern, i.e. laterally up/down or rotationally (pan/tilt) or along the optical axis of the phone's camera (in/out); or c) by acoustic input, e.g. voice command.
Several variants of this basic scheme are possible. For instance, the virtual keyboard may be larger than the display, causing only part of it to be visible within the field defined by the phone's display screen. This situation is depicted in figure 3, and in this case the virtual keyboard scrolls across the display, one keyboard letter or symbol being highlighted or framed at a time upon passing a selector field in e.g. the middle of the screen.
In another variant of this embodiment, the whole virtual keyboard is shown inside the display, and a cursor or highlighting field moves across the virtual keyboard when the phone is moved. Selection of upper or lower case characters or special symbols or functions is achieved by entering a command into the phone by one of the modalities a) to c) described earlier in this paragraph. The command may be of the "Caps shift" or "Caps lock" type, depending on the situation.
In a third preferred embodiment illustrated in figure 4, movement of the phone is used for selecting menu items on the phone. It is often annoying and cumbersome to flip through ever-increasing numbers of main menu items, contact list items, or display icons. Instead of laboriously flipping through a large number of menu items by sequentially displaying individual windows on the display, a sweep across a virtual background field where the menu icons are laid out can be performed quickly and simply. Motion patterns and selection modes can be similar to those described for the virtual keyboard above. Several variants of this basic scheme are possible: A typical phone has many menus. The user could pre-set which menus should be subject to motion controlled selection, and then the camera would automatically be activated when these menus are selected. Menus could be nested hierarchically, and motion control could be used to navigate into and within sub-menus.
A variant of this third embodiment shall now be described, with reference to figures 5a and 5b: Mobile phones often have extensive contact lists, sorted alphabetically. The standard way of accessing the contact list for calling one of the contacts is to scroll through the list until the contact is found. To speed up the process, it is possible to press one of the alphabetic/numeric keys one or several times to select a letter, whereupon the list is directly scrolled to the first contact item with this initial letter. Even so, further scrolling is normally required to find the correct contact item, and on the average, a significant number of key presses have to be performed before a call can be placed. By using the virtual display concept (scrolling of the screen image) in combination with cursor control by phone movement, a more efficient way of contact list access can be implemented:
• Let each contact be represented by a small image (photo of person, company logo, cartoon image, etc).
• Arrange the images in a 2D array.
• When the user accesses the contact list, the phone camera is automatically activated to detect phone movement.
• The upper left portion of the 2D image array is presented on the display, with the upper left image highlighted (see figure 5a).
• Phone movement causes movement of the highlight between images. If the highlight reaches the display border, the display image is scrolled.
• If the highlight rests at the same image for longer than, say, 1 s, an additional pop-up information window is shown next to the image, revealing the contact name (see figure 5b).
• As soon as the dial-up button is pressed, the phone calls the presently highlighted contact.
• If the contact has more than one phone number, the first dial-up or select command produces a pop-up window with all the phone numbers. Phone movement is used to highlight the desired number and the dial-up button is used for placing the call.
This strategy enables fast selection of the desired contact in one sweeping phone movement. It enables more contacts on the screen than the usual text list layout, and is based on human recognition of images/graphical symbols rather than text, which is normally a faster search method. The present invention could also in a general fashion implement a so-called relative pointer device, for instance for controlling the cursor on a computer screen. However, the present invention shall allow the extension of relative pointer technology far beyond that offered by e.g. a conventional computer mouse. This opens for a fourth preferred embodiment, the principle of which is shown in fig. 6, which basically renders the use of the optoelectronic device as a three-dimensional computer mouse or similar input device to a mobile or stationary information processing system. Cursor movement on the computer screen is controlled by moving the optoelectronic device in one or more of the translational, rotational or combined translational/rotational modes described in the other preferred embodiments above, while mouse clicks can be effected in several ways, e.g.
by a) pressing or toggling a mechanical input key, switch or pressure-sensitive field on the optoelectronic device such that right and left mouse buttons could be assigned to user-selectable buttons on the mobile electronic device; b) moving the optoelectronic device in a prescribed pattern, which in principle may be any translational, rotational or combined translational/rotational motion, but in practice a simple lateral up/down or rotational (pan/tilt) motion shall be adequate, although an additional motion mode is along the optical axis of the camera (i.e. in/out); or c) acoustic input, e.g. voice command.
In a first variant of this fourth embodiment, the input from the optoelectronic device to the information processing system is the operator signature, e.g. in legal or economic transactions or in access control. The operator in this case moves the optoelectronic device as he would a pen to create the pattern, which may be his name signature or some other pre-defined identifying pattern. The motion need not take place against any supporting surface, but could be performed in the air in front of the operator ("writing in the air"). In certain instances, it may be advantageous to employ processing of the recorded signature to conform with predefined formats required in e.g. economic transactions. One example of such processing would be to scale the size of the recorded signature to a predefined norm. Further extensions of this embodiment are obvious to those skilled in the art of image and pattern analysis, e.g. analysis of hand-generated symbols and patterns to generate digital alphanumeric files in the information processing system.
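The scaling of a recorded signature to a predefined norm, as mentioned above, might be sketched as follows. The target box size stands in for the "predefined norm" and is purely an illustrative assumption:

```python
def normalize_signature(points, norm_w=300.0, norm_h=100.0):
    # Scale a recorded signature stroke (a list of (x, y) points) to a
    # predefined bounding box, preserving the aspect ratio. The default
    # box size is an illustrative assumption about the predefined norm.
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    x0, y0 = min(xs), min(ys)
    w = (max(xs) - x0) or 1.0   # guard against degenerate strokes
    h = (max(ys) - y0) or 1.0
    s = min(norm_w / w, norm_h / h)
    return [((x - x0) * s, (y - y0) * s) for x, y in points]
```

The same preprocessing step could precede comparison of the recorded pattern against a stored reference signature.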
In order to facilitate the use of the phone as a three-dimensional mouse, an alternative to moving the phone in the open air shall be to equip the phone with additional physical hardware, e.g. a rounded, ball-like object which may be attached to the phone at the side where the camera is located. By proper positioning of such a ball-type object, the phone may be rested on the desktop or any other object, while the three-dimensional motions are achieved by tilting the phone in the various directions, as allowed by the thickness of the attached, rounded object. Generally, the translational movements of the mouse can now be replaced by a rotational movement around any of three axes and over a fairly small angle.
Communication with the computer could be via a wire or by one of several wireless means. As an example, Bluetooth communication would enable using the mobile electronic device for controlling presentations, including pointer control and the possibility for the user to stand up and move around. An important advantage of this embodiment relative to the conventional computer mouse is that contact with or close proximity to a surface is not needed. Furthermore, it is possible to provide input commands linked to a motion in the third dimension, either by increasing or reducing the distance from the camera of the mobile electronic device to an object in front of it or by employing the camera zoom function. This opens up opportunities for 3-D cursor control, e.g. in CAD software or in the display/manipulation of three-dimensional virtual or real-space objects. Still further, the optoelectronic device can be equipped with a microphone, such as is the case when the optoelectronic device is a mobile telephone with camera. This makes it possible to insert audio files into the computer. Likewise, by means of a speaker or earphone attached to the optoelectronic device, it becomes possible to listen to audio files that have been located in the computer.
In a second variant of the fourth embodiment, relative motion is created while the optoelectronic device is stationary or nearly so. The relative motion is then created by moving an object, e.g. the operator's hand (or finger), in front of the optical image recorder of the optoelectronic device. This variant can be implemented in various ways, as shown in figs 7a and 7b. Figure 7a shows a person carrying an optoelectronic device according to the invention, with a camera viewing out in front of the person. Moving the hands in front of the camera causes a cursor to move on the screen of a computer, as shown in fig. 7a, or on a screen in the optoelectronic device itself. Mouse clicks can be done in several ways, e.g. by closing the hand to a fist, by voice command, by touching the screen at the appropriate spot, etc. In this manner all commands which can be executed by a mouse can be performed by hand (or finger, for a higher degree of precision) motions. Instead of hanging in front of the operator, the optoelectronic device could be mounted in a harness, helmet or the like which is carried by the operator. In fig. 7b the optoelectronic device is shown in a mount that is physically separated from the operator, but the basic input mechanisms are the same. Again, the operator may move his hands in free air or rest his hand against the surface of a table, to effect cursor motion. Mouse clicks can be effected in several ways, e.g. via the camera image or by the modalities mentioned in connection with fig. 6. In a variant of this embodiment, the person holds a contrast-enhancing or recognition-aiding implement in his hand, e.g. a pen or pencil with a marker spot of high optical contrast on it, optionally in a color selected in an optimized spectral region, to simplify processing of the image. Fingers or hands of the operator can be marked with a spot or symbol of high optical contrast, e.g. to show the positions of the fingertips.
The variants of this fourth embodiment exemplified in figures 7a and 7b can be used to effect handwritten input into the computer, the optoelectronic device or both, including name or other types of signatures for legal and transaction purposes or access control.
Image analysis provides further possibilities for more complex and hence more compact or efficient input coding: For example, two fingers in contact with each other side by side may carry a different message from the same two fingers splayed away from each other. A simple example of this is writing on virtual keyboards (cf. below), in which case lower case symbols can be written with a single finger and upper case with two fingers close together.
Likewise, detection of more than one motion vector in the field of view of the optical sensing device of the optoelectronic device can be employed as input. An example of this is the use of two hands or fingers that are moved independently, but in coordinated fashion.
Moreover, persons skilled in the art will recognize the technological relationship between the variant embodiments rendered in figures 7a and 7b and proposals for using nose tip or eyelid movements for controlling a cursor or accessing a keyboard. Approaches of this kind have been developed particularly in order to enable paraplegics to operate computers and the like, and persons skilled in the art will realize that the present invention can provide similar solutions by employing movements of any practicable body part. Evidently, in some cases physical impairment may restrict the operational possibilities, but the present invention can nevertheless be implemented to enable severely handicapped persons to undertake normally fairly complicated tasks.
The basic scheme of figure 7b can be used to provide keyboard input via a "virtual keyboard": A keyboard may be printed on the table surface or on a flat sheet lying on the table, and the camera records the position of the operator's fingers on the keyboard. Image analysis software is used to determine which key is indicated, e.g. by analyzing the obscuration of the keyboard by the hands and fingers. In cases where fingertips are tagged for enhanced visibility, the tag location relative to the keyboard is analyzed. The actual tapping to indicate selection of an indicated key may be achieved by one of the modalities mentioned above in connection with mouse clicking, or by providing a motion cue to the imaging software. An example of the latter would be to set an activation threshold related to the dwell time on a given key, i.e. the finger must rest for a time exceeding a predetermined minimum for a tap to be registered.
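The dwell-time activation threshold can be sketched as a small state machine. The 300 ms default and the per-frame `(key, timestamp)` interface are illustrative assumptions, not taken from the text:

```python
class DwellKeyDetector:
    """Register a virtual-keyboard tap once the same key has been
    indicated continuously for at least `dwell_s` seconds."""

    def __init__(self, dwell_s=0.3):
        self.dwell_s = dwell_s
        self._key = None      # key currently under the fingertip
        self._since = None    # time when that key was first indicated

    def update(self, key, t):
        """Feed the key indicated in the frame at time `t` (seconds).
        Returns the key if a tap is registered, else None."""
        if key != self._key:
            # Finger moved to a different key: restart the dwell clock.
            self._key, self._since = key, t
            return None
        if key is not None and t - self._since >= self.dwell_s:
            self._key, self._since = None, None   # re-arm after firing
            return key
        return None
```

The image analysis stage would call `update` once per frame with whichever key the fingertip (or its high-contrast tag) currently covers.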
A variant of a virtual keyboard would be to project a keyboard as an image onto the writing surface. Fingertip tags with high contrast could in this case be rendered especially useful by making them fluoresce in response to the light used in projecting the keyboard or another light source.
Another type of virtual keyboard can be established under a wider concept which may be termed a "virtual touch screen": In this case, the optoelectronic device is equipped with a camera which is directed towards the screen display of a computer. The operator places a pointer or wand, or one or more fingers or hands, against the screen, thereby obscuring parts of the displayed image. Image analysis software determines which of the displayed image parts or regions are indicated, e.g. by analyzing the contours of the object held in front of the screen. This principle thus provides the features of a touch-sensitive screen, providing input opportunities for menu selection, etc., but without the need for technically complex physical sensors to record the touching action. When the displayed image is a keyboard, writing may be performed in a manner similar to the case described above where writing was performed on a printed or projected keyboard. The virtual touch screen provides opportunities for more sophisticated input operations than simple two-dimensional coordinate point indication, by making the image analysis software able to discriminate between different motion and shape patterns for the operator's fingers and hands in front of the screen. An example of this is to use the flat palm of the hand to mark or wipe larger parts of a document or scene, to effect copying, deletion or transfer.
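A minimal sketch of the obscuration analysis, assuming the camera view has already been geometrically registered to the displayed image (a real system would also need photometric calibration and frame synchronization): pixels where the observed frame differs strongly from what was displayed are taken as hidden by the finger or wand, and their centroid gives the indicated point. The threshold value is an assumption.

```python
import numpy as np

def touch_point(displayed, observed, thresh=60):
    """Return the centroid (row, col) of the screen region obscured by
    the operator's finger or wand, or None if nothing is obscured.

    `displayed` is the image sent to the screen, `observed` the camera
    frame of that screen, both as equal-shape grayscale arrays.
    """
    diff = np.abs(observed.astype(int) - displayed.astype(int))
    mask = diff > thresh        # pixels that no longer match the display
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return int(ys.mean()), int(xs.mean())
```

The returned coordinate would then be looked up against the layout of the displayed keyboard or menu to decide which element is indicated.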
A person skilled in the art will immediately recognize the issues related to possible mismatch in phase and/or frequency between the frame updating in the screen display and in the camera sensor, but shall also be able to provide remedies to problems that would otherwise result from this. These issues shall therefore not be discussed here.
One vital part of any of the indicated embodiments is the processing of camera image sequences to obtain the output signal which indicates movement of the mobile electronic device relative to imaged objects or scenes. This is largely the problem known as motion estimation, a well-established research field whose major application at present is compression of digital video sequences. As an example, the MPEG video compression standards (see e.g.: http://www.chiariglione.org/mpeg/ ) rely heavily on motion estimation. In several of its embodiments, the present invention has an advantage over conventional video compression, in that only the camera motion need be estimated, which means characterization of the general movement of the background of the scene. The motion of smaller objects across the background can be disregarded, which lowers the demand for processing power compared to video compression. To cope with phenomena such as perspective, camera zoom (or in/out movement), and moving objects, the camera image needs to be split into macroblocks, typically of size 16×16 pixels. The motion of each macroblock from one frame to another is then computed, resulting in a motion vector (MV). This is repeated for each macroblock in the image, resulting in a motion vector field (MVF), which is then analyzed to yield the camera motion. Moving objects, mismatched macroblocks due to insufficient texture, etc., will show up as outliers in the MVF and can be disregarded through proper filtering. The implementation of MVF calculation in the present invention will be chosen in each particular embodiment according to the properties of the camera and other phone hardware, available processing power, etc. Block matching is probably the most common way to calculate the MVF (cf., e.g.: A. M. Tekalp, Digital Video Processing, Prentice Hall, 1995; J. Watkinson, The Engineer's Guide to Motion Compensation, Snell & Wilcox, 1994).
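A minimal sketch of this pipeline — MVF computation followed by outlier filtering to recover the camera motion — assuming a small exhaustive block-matching search and a median filter for outlier rejection (both illustrative choices):

```python
import numpy as np

def block_matching_mvf(prev, curr, block=16, search=4):
    """Estimate a motion vector field by exhaustive block matching:
    for each block x block macroblock of `curr`, try every shift within
    +/- `search` pixels into `prev` and keep the shift with minimum
    sum-of-absolute-differences (SAD)."""
    h, w = curr.shape
    mvf = []
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            target = curr[by:by + block, bx:bx + block]
            best, best_mv = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue   # candidate falls outside the frame
                    cand = prev[y:y + block, x:x + block]
                    sad = np.abs(target.astype(int) - cand.astype(int)).sum()
                    if best is None or sad < best:
                        best, best_mv = sad, (dy, dx)
            mvf.append(best_mv)
    return mvf

def global_motion(mvf):
    """Robust global camera motion: the per-component median of the MVF,
    which discards outlier vectors caused by small moving objects or
    macroblocks with insufficient texture."""
    arr = np.array(mvf)
    return tuple(np.median(arr, axis=0).astype(int))
```

Exhaustive search is the simplest formulation; practical implementations would use fast search strategies or reuse motion vectors already computed by the device's video-compression hardware.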
Briefly, each macroblock in the new frame is compared with shifted regions of the same size from the previous frame, and the shift which results in the minimum error is selected as the best MV for that macroblock. Other methods of camera motion estimation include phase correlation (cf., e.g.: J. Watkinson, The Engineer's Guide to Motion Compensation, Snell & Wilcox, 1994; Y. Liang, Phase-Correlation Motion Estimation, Final project, Stanford University) and the utilization of MPEG motion vectors (cf., e.g.: M. Pilu, On Using Raw MPEG Motion Vectors To Determine Global Camera Motion, Hewlett-Packard Company, 1997). The latter is of particular interest for the present invention in case hardware or software MPEG compression is already implemented in the mobile electronic device.
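Phase correlation can be sketched in a few lines: the normalized cross-power spectrum of the two frames' Fourier transforms is, ideally, a pure phase ramp, and its inverse transform is a sharp peak at the displacement. This is a minimal whole-frame-translation sketch; sub-pixel refinement, windowing against edge effects, and noise handling are omitted.

```python
import numpy as np

def phase_correlation(prev, curr):
    """Estimate the global (dy, dx) translation of `curr` relative to
    `prev` by phase correlation."""
    F1 = np.fft.fft2(prev)
    F2 = np.fft.fft2(curr)
    cross = np.conj(F1) * F2
    cross /= np.abs(cross) + 1e-12        # keep only the phase
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint correspond to negative shifts (FFT wrap-around)
    return tuple(int(p - n) if p > n // 2 else int(p)
                 for p, n in zip(peak, corr.shape))
```

Unlike block matching, this yields a single global displacement per frame pair directly, at the cost of two FFTs and an inverse FFT.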
Power consumption is a potential concern in embodiments of the present invention where an imaging camera and/or a display screen are operated on a battery. In general, the camera part of the system draws little power compared to an illuminated display screen. However, when operating the camera as an input tool according to the present invention, several power conservation strategies are possible, since imaging requirements are different from and generally less demanding than those in regular photography. Thus, the total number of available camera pixels need not be active in the primary photo-event or in subsequent signal processing steps. Frame rates may also be reduced, depending on the updating rate and response speed that is required in applications based on the present invention. Similarly, in embodiments of the present invention where a battery-fed display screen on the optoelectronic device is involved, several strategies are available to conserve power. Among them are for instance a) reducing the frame rate on the display; b) displaying reduced-complexity images, e.g. line or grid drawings and diagrams, on the screen instead of fully pixellated images; c) employing highlighting on a darker background; d) introducing an automatic "sleep mode" with a dark or reduced-intensity screen in periods without operator input.

Claims

1. An information processing system comprising an optoelectronic device, associated display means and display control electronics that either is physically a part of said optoelectronic device or physically separate therefrom, but connected therewith via a data link, wherein the optoelectronic device comprises an internal memory, a data processing system for operating the device, including input- and/or output processes, and an optical image recorder for capturing information from the environment surrounding said optoelectronic device; characterized in that the optoelectronic device comprises hardware and software adapted for analyzing information captured by said optical image recorder with respect to temporal variations in the signals from the optical image recorder, said variations being caused by translational and/or rotational motion of said optoelectronic device relative to real or virtual objects, textures or scenery within the field of view of said optical image recorder, that said processing hardware and software are adapted for providing an output signal in scalar or vector form on the basis of the analyzed information, said output signal being indicative of the motion characteristics of said optoelectronic device relative to the real or virtual objects, textures or scenery within the field of view of said optical image recorder, that the display control electronics is connected with said optoelectronic device for receiving said output signal, and that said display control electronics is adapted for modifying or changing a displayed message or image in response to the characteristics of said output signal.
2. An information processing system according to claim 1, characterized in that said optoelectronic device incorporates a display and associated display control electronics, said display having a substantially small size suitable for said optoelectronic device.
3. An information processing system according to claim 1, characterized in that said optoelectronic device is connected via a cable or wireless communication link to a display which is physically separated from said optoelectronic device.
4. An information processing system according to claim 3, characterized in that said wireless link is implemented by means of one or more of the following, viz. electromagnetic radiation means including, but not limited to radio-, microwave and infrared radiation means, acoustic radiation means, and inductive and capacitive pick-up means.
5. An information processing system according to claim 1, characterized in that said optoelectronic device is a mobile telephone, a personal digital assistant (PDA), a handheld computer game, a barcode scanner, a helmet-mounted display, a robot camera, a paraplegic communication tool, or a computer input device.
6. An information processing system according to claim 1, characterized in that the computer input device is a relative pointer device or a mouse capable of operating in three dimensions.
7. An information processing system according to claim 1, characterized in that said optical image recorder is an imaging camera.
8. An information processing system according to claim 1, characterized in that said display is an indicator light array for displaying vector magnitudes in one, two, three or more dimensions.
9. An information processing system according to claim 1, characterized in that said display is a screen for displaying images in two or three spatial dimensions.
10. An information processing system according to claim 9, characterized in that said images are in the form of symbols, drawings, artwork, photographs or movie sequences.
11. An information processing system according to claim 1, characterized in that said motion characteristics of said optoelectronic device are detected via automated analysis of the translational and/or rotational motion of one or more sub-regions of the image in the imaging plane of said camera.
12. An information processing system according to claim 11, characterized in that said processing hardware and software providing an output signal in scalar or vector form is integrated to a significant degree with processing hardware and software provided in said optoelectronic device for automatic motion stabilization and/or video compression of images captured by said camera.
13. A method for operating an information processing system comprising an optoelectronic device, associated display means that either is physically a part of said optoelectronic device or physically separate therefrom, but connected therewith via a data link; wherein said optoelectronic device comprises an internal memory and data processing system for operating the device, including input- and/or output processes, and an optical image recorder for capturing information from the environment surrounding said optoelectronic device, and wherein the method is characterized by deriving at least one parameter from translational and/or rotational motion of said optoelectronic device or auxiliary equipment linked to same, relative to real or virtual objects, textures or scenery within the field of view of said optical image recorder, and applying said at least one parameter as input to control electronics in said optoelectronic device and to effect a navigation and/or input commands in said display means, partly or completely in response to said at least one parameter.
14. A method according to claim 13, characterized by starting said navigation when said optoelectronic device detects an initializing event.
15. A method according to claim 14, characterized by said initializing event being one or more of the following, viz. input by mechanical means such as depression of a key or button, or toggling or throwing a switch, input by motion pattern of said optoelectronic device, input by acoustic, infrared or radio signal, input by prompt from execution of a program in said optoelectronic device, input from timer.
16. A method according to claim 13, characterized by representing said translational and/or rotational motion internally in said optoelectronic device by a scalar quantity in the form of an electrical signal level or an electrical vector in two or higher dimensions.
17. A method according to claim 16, characterized by deriving said at least one parameter from said electrical vector by scaling with a constant or variable factor.
18. A method according to claim 16, characterized by making the magnitude of said electrical vector proportional to the instantaneous rotation rate and/or translation speed of said optoelectronic device.
19. A method according to claim 16, characterized by making the magnitude of said electrical vector proportional to the integrated rotation angle and/or translation distance of said optoelectronic device, measured from said initializing event recorded by said optoelectronic device.
20. A method according to claim 13, characterized by establishing a non-responsive range of motion up to a threshold value by subtracting a certain base value from said parameter or parameters.
21. A method according to claim 20, characterized by continuously re-calculating said base value to compensate for any component of said motion that is constant over a certain amount of time, and which thus corresponds to motion of the person carrying said optoelectronic device.
22. A method according to claim 13, characterized by said navigation involving at least one of the following modes of display response, viz. scrolling of the screen image in any direction at constant magnification; and/or zooming into or out of a certain location in the screen image; and/or moving a cursor, symbol or highlighted field in any direction.
23. A method according to claim 22, characterized by combining scrolling and zooming such that a scrolling speed is inverse in some proportion to a zooming speed.
24. A method according to claim 22, characterized by using said cursor, symbol or highlighted field movement to highlight letters or symbols on a displayed image of a keyboard, thereby providing a means of text input via said optoelectronic device.
25. A method according to claim 22, characterized by using said cursor, symbol or highlighted field movement to highlight keys on a displayed image of a piano keyboard, or notes or other symbols on a displayed image of a note system, thereby providing a means of input of tone sequences or melodies via said optoelectronic device.
26. A method according to claim 22, characterized by said opto-electronic device being a mobile telephone, and further characterized by using said cursor, symbol or highlighted field movement to highlight images or symbols representing contacts in the contact list of said mobile telephone, thereby providing means of rapid selection of a contact to be called via said mobile telephone.
PCT/NO2005/000360 2004-09-27 2005-09-27 Information processing system and method WO2006036069A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
NO20044073 2004-09-27
NO20044073A NO20044073D0 (en) 2004-09-27 2004-09-27 Information Processing System and Procedures

Publications (1)

Publication Number Publication Date
WO2006036069A1 true WO2006036069A1 (en) 2006-04-06

Family

ID=35057647

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/NO2005/000360 WO2006036069A1 (en) 2004-09-27 2005-09-27 Information processing system and method

Country Status (2)

Country Link
NO (1) NO20044073D0 (en)
WO (1) WO2006036069A1 (en)

Cited By (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007042189A1 (en) * 2005-10-11 2007-04-19 Sony Ericsson Mobile Communication Ab Cellular communication terminals and methods that sense terminal movement for cursor control
WO2007068791A1 (en) * 2005-12-13 2007-06-21 Elcoteq Se Method and arrangement to manage graphical user interface and portable device provided with graphical user interface
EP1884864A1 (en) * 2006-08-02 2008-02-06 Research In Motion Limited System and Method for Adjusting Presentation of Moving Images on an Electronic Device According to an Orientation of the Device
WO2008029180A1 (en) * 2006-09-06 2008-03-13 Santosh Sharan An apparatus and method for position-related display magnification
WO2008098331A1 (en) * 2007-02-15 2008-08-21 Edson Roberto Minatel Optoeletronic device for helping and controlling industrial processes
WO2009001240A1 (en) * 2007-06-27 2008-12-31 Nokia Corporation Method, apparatus and computer program product for providing a scrolling mechanism for touch screen devices
WO2009032638A2 (en) * 2007-09-04 2009-03-12 Apple Inc. Application menu user interface
WO2009052848A1 (en) * 2007-10-25 2009-04-30 Nokia Corporation Controlling information presentation by an apparatus
WO2009071234A1 (en) * 2007-12-08 2009-06-11 T-Mobile International Ag Virtual keyboard of a mobile terminal
US7616186B2 (en) 2005-12-09 2009-11-10 Sony Ericsson Mobile Communications Ab Acceleration reference devices, cellular communication terminal systems, and methods that sense terminal movement for cursor control
WO2009150522A1 (en) * 2008-06-11 2009-12-17 Nokia Corporation Camera gestures for user interface control
US20100125816A1 (en) * 2008-11-20 2010-05-20 Bezos Jeffrey P Movement recognition as input mechanism
US20100149100A1 (en) * 2008-12-15 2010-06-17 Sony Ericsson Mobile Communications Ab Electronic Devices, Systems, Methods and Computer Program Products for Detecting a User Input Device Having an Optical Marker Thereon
WO2010068312A1 (en) * 2008-12-10 2010-06-17 Sony Ericsson Mobile Communications Ab System and method for modifying a plurality of key input regions based on detected tilt and/or rate of tilt of an electronic device
FR2940690A1 (en) * 2008-12-31 2010-07-02 Cy Play Mobile terminal i.e. mobile telephone, user navigation method, involves establishing contents to be displayed on terminal for permitting navigation on user interface having size larger than size of screen of terminal
FR2940703A1 (en) * 2008-12-31 2010-07-02 Cy Play Display modeling method for application on server, involves forming image based on pixels of traces, and transmitting image and encoding information conforming to assembly of modification data to encoder by transmitting unit
FR2940689A1 (en) * 2008-12-31 2010-07-02 Cy Play Method for navigating user of mobile terminal e.g. portable computer, on application, involves detecting movement of positioning device, and modulating movement data in audio format for transmitting movement data to server
WO2010080166A1 (en) * 2009-01-06 2010-07-15 Qualcomm Incorporated User interface for mobile devices
WO2010076436A3 (en) * 2008-12-31 2010-11-25 Cy Play Method for macroblock modeling of the display of a remote terminal by means of layers characterized by a movement vector and transparency data
EP2296076A1 (en) * 2009-09-15 2011-03-16 Palo Alto Research Center Incorporated System for interacting with objects in a virtual environment
EP2341412A1 (en) * 2009-12-31 2011-07-06 Sony Computer Entertainment Europe Limited Portable electronic device and method of controlling a portable electronic device
EP2370889A1 (en) * 2008-12-18 2011-10-05 Nokia Corporation Apparatus, method, computer program and user interface for enabling user input
EP2382527A2 (en) * 2008-12-30 2011-11-02 France Telecom User interface to provide enhanced control of an application program
US8139026B2 (en) 2006-08-02 2012-03-20 Research In Motion Limited System and method for adjusting presentation of text and images on an electronic device according to an orientation of the device
EP2438506A1 (en) * 2009-06-04 2012-04-11 Mellmo Inc. Displaying multi-dimensional data using a rotatable object
EP2482170A2 (en) * 2011-01-31 2012-08-01 Hand Held Products, Inc. Terminal operative for display of electronic record
EP2290931A3 (en) * 2009-08-24 2013-05-22 Pantech Co., Ltd. Apparatus and method for executing hot key function of mobile terminal
KR101281058B1 (en) 2011-12-06 2013-07-15 (주)나노티에스 Touch keyboard apparatus and touch position detection method thereof
US8493323B2 (en) 2006-08-02 2013-07-23 Research In Motion Limited System and method for adjusting presentation of moving images on an electronic device according to an orientation of the device
EP2597590A3 (en) * 2011-11-28 2013-11-27 Samsung Electronics Co., Ltd Method of authenticating password and portable device thereof
CN101196795B (en) * 2006-08-02 2014-04-09 黑莓有限公司 System and method for adjusting presentation of text, image and moving images on device
US8878773B1 (en) 2010-05-24 2014-11-04 Amazon Technologies, Inc. Determining relative motion as input
US8884928B1 (en) 2012-01-26 2014-11-11 Amazon Technologies, Inc. Correcting for parallax in electronic displays
US8947351B1 (en) 2011-09-27 2015-02-03 Amazon Technologies, Inc. Point of view determinations for finger tracking
US9035874B1 (en) 2013-03-08 2015-05-19 Amazon Technologies, Inc. Providing user input to a computing device with an eye closure
US9041734B2 (en) 2011-07-12 2015-05-26 Amazon Technologies, Inc. Simulating three-dimensional features
JP2015102924A (en) * 2013-11-22 2015-06-04 シャープ株式会社 Display device, scroll display method, and scroll display program
US9063574B1 (en) 2012-03-14 2015-06-23 Amazon Technologies, Inc. Motion detection systems for electronic devices
US9123272B1 (en) 2011-05-13 2015-09-01 Amazon Technologies, Inc. Realistic image lighting and shading
US9122917B2 (en) 2011-08-04 2015-09-01 Amazon Technologies, Inc. Recognizing gestures captured by video
US9223415B1 (en) 2012-01-17 2015-12-29 Amazon Technologies, Inc. Managing resource usage for task performance
US9269012B2 (en) 2013-08-22 2016-02-23 Amazon Technologies, Inc. Multi-tracker object tracking
US9285895B1 (en) 2012-03-28 2016-03-15 Amazon Technologies, Inc. Integrated near field sensor for display devices
US9335924B2 (en) 2006-09-06 2016-05-10 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
GB2532010A (en) * 2014-11-04 2016-05-11 Samsung Electronics Co Ltd Display method and device
US9367203B1 (en) 2013-10-04 2016-06-14 Amazon Technologies, Inc. User interface techniques for simulating three-dimensional depth
US9367232B2 (en) 2007-01-07 2016-06-14 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US9423929B2 (en) 2009-06-04 2016-08-23 Sap Se Predictive scrolling
US9619143B2 (en) 2008-01-06 2017-04-11 Apple Inc. Device, method, and graphical user interface for viewing application launch icons
US9733812B2 (en) 2010-01-06 2017-08-15 Apple Inc. Device, method, and graphical user interface with content display modes and display rotation heuristics
US9772751B2 (en) 2007-06-29 2017-09-26 Apple Inc. Using gestures to slide between user interfaces
US9933913B2 (en) 2005-12-30 2018-04-03 Apple Inc. Portable electronic device with interface reconfiguration mode
US9933937B2 (en) 2007-06-20 2018-04-03 Apple Inc. Portable multifunction device, method, and graphical user interface for playing online videos
US10055013B2 (en) 2013-09-17 2018-08-21 Amazon Technologies, Inc. Dynamic object tracking for user interfaces
FR3063821A1 (en) * 2017-03-10 2018-09-14 Institut Mines Telecom HUMAN MACHINE INTERFACE
US10088924B1 (en) 2011-08-04 2018-10-02 Amazon Technologies, Inc. Overcoming motion effects in gesture recognition
US10272294B2 (en) 2016-06-11 2019-04-30 Apple Inc. Activity and workout updates
CN109804341A (en) * 2017-07-31 2019-05-24 腾讯科技(深圳)有限公司 With the interaction of the three-dimensional internet content of display on a user interface
US10313505B2 (en) 2006-09-06 2019-06-04 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US10613732B2 (en) 2015-06-07 2020-04-07 Apple Inc. Selecting content items in a user interface display
US10620780B2 (en) 2007-09-04 2020-04-14 Apple Inc. Editing interface
US11153472B2 (en) 2005-10-17 2021-10-19 Cutting Edge Vision, LLC Automatic upload of pictures from a camera
US11199906B1 (en) 2013-09-04 2021-12-14 Amazon Technologies, Inc. Global user input management
US11216119B2 (en) 2016-06-12 2022-01-04 Apple Inc. Displaying a predetermined view of an application

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6433793B1 (en) * 1998-04-24 2002-08-13 Nec Corporation Scrolling system of a display image
US6288704B1 (en) * 1999-06-08 2001-09-11 Vega, Vista, Inc. Motion detection and tracking system to control navigation and display of object viewers
US20020130839A1 (en) * 2001-03-16 2002-09-19 Hugh Wallace Optical screen pointing device with inertial properties
WO2003003185A1 (en) * 2001-06-21 2003-01-09 Ismo Rakkolainen System for establishing a user interface
GB2387755A (en) * 2002-03-28 2003-10-22 Nec Corp Portable apparatus including improved pointing device using image shift
US20040204067A1 (en) * 2002-03-28 2004-10-14 Nec Corporation Portable apparatus including improved pointing device

Cited By (127)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007042189A1 (en) * 2005-10-11 2007-04-19 Sony Ericsson Mobile Communication Ab Cellular communication terminals and methods that sense terminal movement for cursor control
US7643850B2 (en) 2005-10-11 2010-01-05 Sony Ericsson Mobile Communications Ab Cellular communication terminals and methods that sense terminal movement for cursor control
US11818458B2 (en) 2005-10-17 2023-11-14 Cutting Edge Vision, LLC Camera touchpad
US11153472B2 (en) 2005-10-17 2021-10-19 Cutting Edge Vision, LLC Automatic upload of pictures from a camera
US7616186B2 (en) 2005-12-09 2009-11-10 Sony Ericsson Mobile Communications Ab Acceleration reference devices, cellular communication terminal systems, and methods that sense terminal movement for cursor control
WO2007068791A1 (en) * 2005-12-13 2007-06-21 Elcoteq Se Method and arrangement to manage graphical user interface and portable device provided with graphical user interface
US11650713B2 (en) 2005-12-30 2023-05-16 Apple Inc. Portable electronic device with interface reconfiguration mode
US9933913B2 (en) 2005-12-30 2018-04-03 Apple Inc. Portable electronic device with interface reconfiguration mode
US10884579B2 (en) 2005-12-30 2021-01-05 Apple Inc. Portable electronic device with interface reconfiguration mode
US11449194B2 (en) 2005-12-30 2022-09-20 Apple Inc. Portable electronic device with interface reconfiguration mode
US10915224B2 (en) 2005-12-30 2021-02-09 Apple Inc. Portable electronic device with interface reconfiguration mode
US10359907B2 (en) 2005-12-30 2019-07-23 Apple Inc. Portable electronic device with interface reconfiguration mode
US9110499B2 (en) 2006-08-02 2015-08-18 Blackberry Limited System and method for adjusting presentation of moving images on an electronic device according to an orientation of the device
US8139026B2 (en) 2006-08-02 2012-03-20 Research In Motion Limited System and method for adjusting presentation of text and images on an electronic device according to an orientation of the device
EP2259163A3 (en) * 2006-08-02 2011-03-16 Research In Motion Limited System and method for adjusting presentation of moving images on an electronic device according to an orientation of the device
CN101196795B (en) * 2006-08-02 2014-04-09 黑莓有限公司 System and method for adjusting presentation of text, image and moving images on device
EP1884864A1 (en) * 2006-08-02 2008-02-06 Research In Motion Limited System and Method for Adjusting Presentation of Moving Images on an Electronic Device According to an Orientation of the Device
US8493323B2 (en) 2006-08-02 2013-07-23 Research In Motion Limited System and method for adjusting presentation of moving images on an electronic device according to an orientation of the device
EP2090975A1 (en) * 2006-08-02 2009-08-19 Research In Motion Limited System and method for adjusting presentation of moving images on an electronic device according to an orientationof the device
US9367097B2 (en) 2006-08-02 2016-06-14 Blackberry Limited System and method for adjusting presentation of text and images on an electronic device according to an orientation of the device
US10313505B2 (en) 2006-09-06 2019-06-04 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US11736602B2 (en) 2006-09-06 2023-08-22 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US11240362B2 (en) 2006-09-06 2022-02-01 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US11029838B2 (en) 2006-09-06 2021-06-08 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
WO2008029180A1 (en) * 2006-09-06 2008-03-13 Santosh Sharan An apparatus and method for position-related display magnification
US9335924B2 (en) 2006-09-06 2016-05-10 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US9952759B2 (en) 2006-09-06 2018-04-24 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US10778828B2 (en) 2006-09-06 2020-09-15 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US11586348B2 (en) 2007-01-07 2023-02-21 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US9367232B2 (en) 2007-01-07 2016-06-14 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US10254949B2 (en) 2007-01-07 2019-04-09 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US11169691B2 (en) 2007-01-07 2021-11-09 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US10732821B2 (en) 2007-01-07 2020-08-04 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
WO2008098331A1 (en) * 2007-02-15 2008-08-21 Edson Roberto Minatel Optoelectronic device for helping and controlling industrial processes
US20100026812A1 (en) * 2007-02-15 2010-02-04 Edson Roberto Minatel Optoelectronic Device for Helping and Controlling Industrial Processes
US9933937B2 (en) 2007-06-20 2018-04-03 Apple Inc. Portable multifunction device, method, and graphical user interface for playing online videos
WO2009001240A1 (en) * 2007-06-27 2008-12-31 Nokia Corporation Method, apparatus and computer program product for providing a scrolling mechanism for touch screen devices
US10761691B2 (en) 2007-06-29 2020-09-01 Apple Inc. Portable multifunction device with animated user interface transitions
US11507255B2 (en) 2007-06-29 2022-11-22 Apple Inc. Portable multifunction device with animated sliding user interface transitions
US9772751B2 (en) 2007-06-29 2017-09-26 Apple Inc. Using gestures to slide between user interfaces
US11126321B2 (en) 2007-09-04 2021-09-21 Apple Inc. Application menu user interface
US11604559B2 (en) 2007-09-04 2023-03-14 Apple Inc. Editing interface
US11010017B2 (en) 2007-09-04 2021-05-18 Apple Inc. Editing interface
US11861138B2 (en) 2007-09-04 2024-01-02 Apple Inc. Application menu user interface
WO2009032638A2 (en) * 2007-09-04 2009-03-12 Apple Inc. Application menu user interface
WO2009032638A3 (en) * 2007-09-04 2010-01-21 Apple Inc. Application menu user interface
US10620780B2 (en) 2007-09-04 2020-04-14 Apple Inc. Editing interface
WO2009052848A1 (en) * 2007-10-25 2009-04-30 Nokia Corporation Controlling information presentation by an apparatus
RU2494442C2 (en) * 2007-12-08 2013-09-27 Т-Мобиле Интернациональ Аг Virtual keyboard of mobile terminal device
CN101889256B (en) * 2007-12-08 2013-04-17 T-移动国际股份公司 Virtual keyboard of a mobile terminal
KR101352462B1 (en) 2007-12-08 2014-01-17 티-모바일 인터내셔널 아게 Virtual keyboard of a mobile terminal
US20100313160A1 (en) * 2007-12-08 2010-12-09 T-Mobile International Ag Virtual keyboard of a mobile terminal
JP2011507058A (en) * 2007-12-08 2011-03-03 ティー−モバイル インターナツィオナール アーゲー Mobile device virtual keyboard
WO2009071234A1 (en) * 2007-12-08 2009-06-11 T-Mobile International Ag Virtual keyboard of a mobile terminal
US8527895B2 (en) 2007-12-08 2013-09-03 T-Mobile International, AG Virtual keyboard of a mobile terminal
US9619143B2 (en) 2008-01-06 2017-04-11 Apple Inc. Device, method, and graphical user interface for viewing application launch icons
US10628028B2 (en) 2008-01-06 2020-04-21 Apple Inc. Replacing display of icons in response to a gesture
WO2009150522A1 (en) * 2008-06-11 2009-12-17 Nokia Corporation Camera gestures for user interface control
US8269842B2 (en) 2008-06-11 2012-09-18 Nokia Corporation Camera gestures for user interface control
CN102089738A (en) * 2008-06-11 2011-06-08 诺基亚公司 Camera gestures for user interface control
US20100125816A1 (en) * 2008-11-20 2010-05-20 Bezos Jeffrey P Movement recognition as input mechanism
US8788977B2 (en) * 2008-11-20 2014-07-22 Amazon Technologies, Inc. Movement recognition as input mechanism
US9304583B2 (en) 2008-11-20 2016-04-05 Amazon Technologies, Inc. Movement recognition as input mechanism
CN102216882A (en) * 2008-12-10 2011-10-12 索尼爱立信移动通讯有限公司 System and method for modifying a plurality of key input regions based on detected tilt and/or rate of tilt of an electronic device
JP2012511867A (en) * 2008-12-10 2012-05-24 ソニー エリクソン モバイル コミュニケーションズ, エービー System and method for modifying a plurality of key input areas based on at least one of detected tilt and tilt rate of an electronic device
WO2010068312A1 (en) * 2008-12-10 2010-06-17 Sony Ericsson Mobile Communications Ab System and method for modifying a plurality of key input regions based on detected tilt and/or rate of tilt of an electronic device
US20100149100A1 (en) * 2008-12-15 2010-06-17 Sony Ericsson Mobile Communications Ab Electronic Devices, Systems, Methods and Computer Program Products for Detecting a User Input Device Having an Optical Marker Thereon
WO2010070460A1 (en) * 2008-12-15 2010-06-24 Sony Ericsson Mobile Communications Electronic devices, systems, methods and computer program products for detecting a user input device having an optical marker thereon
EP2370889A4 (en) * 2008-12-18 2012-08-08 Nokia Corp Apparatus, method, computer program and user interface for enabling user input
EP2370889A1 (en) * 2008-12-18 2011-10-05 Nokia Corporation Apparatus, method, computer program and user interface for enabling user input
KR101471708B1 (en) * 2008-12-18 2014-12-10 Nokia Corporation Apparatus, method, computer program and user interface for enabling user input
EP2382527A2 (en) * 2008-12-30 2011-11-02 France Telecom User interface to provide enhanced control of an application program
FR2940689A1 (en) * 2008-12-31 2010-07-02 Cy Play Method for navigating user of mobile terminal e.g. portable computer, on application, involves detecting movement of positioning device, and modulating movement data in audio format for transmitting movement data to server
US9185159B2 (en) 2008-12-31 2015-11-10 Cy-Play Communication between a server and a terminal
WO2010076436A3 (en) * 2008-12-31 2010-11-25 Cy Play Method for macroblock modeling of the display of a remote terminal by means of layers characterized by a movement vector and transparency data
FR2940690A1 (en) * 2008-12-31 2010-07-02 Cy Play Mobile terminal i.e. mobile telephone, user navigation method, involves establishing contents to be displayed on terminal for permitting navigation on user interface having size larger than size of screen of terminal
FR2940703A1 (en) * 2008-12-31 2010-07-02 Cy Play Display modeling method for application on server, involves forming image based on pixels of traces, and transmitting image and encoding information conforming to assembly of modification data to encoder by transmitting unit
US8441441B2 (en) 2009-01-06 2013-05-14 Qualcomm Incorporated User interface for mobile devices
WO2010080166A1 (en) * 2009-01-06 2010-07-15 Qualcomm Incorporated User interface for mobile devices
US9423929B2 (en) 2009-06-04 2016-08-23 Sap Se Predictive scrolling
EP2438506A1 (en) * 2009-06-04 2012-04-11 Mellmo Inc. Displaying multi-dimensional data using a rotatable object
EP2438506A4 (en) * 2009-06-04 2013-10-02 Mellmo Inc Displaying multi-dimensional data using a rotatable object
US8599153B2 (en) 2009-08-24 2013-12-03 Pantech Co., Ltd. Apparatus and method for executing hot key function of mobile terminal
EP2290931A3 (en) * 2009-08-24 2013-05-22 Pantech Co., Ltd. Apparatus and method for executing hot key function of mobile terminal
US9542010B2 (en) 2009-09-15 2017-01-10 Palo Alto Research Center Incorporated System for interacting with objects in a virtual environment
EP2296076A1 (en) * 2009-09-15 2011-03-16 Palo Alto Research Center Incorporated System for interacting with objects in a virtual environment
EP2341412A1 (en) * 2009-12-31 2011-07-06 Sony Computer Entertainment Europe Limited Portable electronic device and method of controlling a portable electronic device
US9733812B2 (en) 2010-01-06 2017-08-15 Apple Inc. Device, method, and graphical user interface with content display modes and display rotation heuristics
US9557811B1 (en) 2010-05-24 2017-01-31 Amazon Technologies, Inc. Determining relative motion as input
US8878773B1 (en) 2010-05-24 2014-11-04 Amazon Technologies, Inc. Determining relative motion as input
EP2482170A2 (en) * 2011-01-31 2012-08-01 Hand Held Products, Inc. Terminal operative for display of electronic record
CN102750077A (en) * 2011-01-31 2012-10-24 手持产品公司 Terminal operative for display of electronic record
EP2482170A3 (en) * 2011-01-31 2015-01-21 Hand Held Products, Inc. Terminal operative for display of electronic record
US9123272B1 (en) 2011-05-13 2015-09-01 Amazon Technologies, Inc. Realistic image lighting and shading
US9041734B2 (en) 2011-07-12 2015-05-26 Amazon Technologies, Inc. Simulating three-dimensional features
US10088924B1 (en) 2011-08-04 2018-10-02 Amazon Technologies, Inc. Overcoming motion effects in gesture recognition
US9122917B2 (en) 2011-08-04 2015-09-01 Amazon Technologies, Inc. Recognizing gestures captured by video
US8947351B1 (en) 2011-09-27 2015-02-03 Amazon Technologies, Inc. Point of view determinations for finger tracking
EP2597590A3 (en) * 2011-11-28 2013-11-27 Samsung Electronics Co., Ltd Method of authenticating password and portable device thereof
US9165132B2 (en) 2011-11-28 2015-10-20 Samsung Electronics Co., Ltd. Method of authenticating password and portable device thereof
KR101281058B1 (en) 2011-12-06 2013-07-15 (주)나노티에스 Touch keyboard apparatus and touch position detection method thereof
US9223415B1 (en) 2012-01-17 2015-12-29 Amazon Technologies, Inc. Managing resource usage for task performance
US8884928B1 (en) 2012-01-26 2014-11-11 Amazon Technologies, Inc. Correcting for parallax in electronic displays
US10019107B2 (en) 2012-01-26 2018-07-10 Amazon Technologies, Inc. Correcting for parallax in electronic displays
US9471153B1 (en) 2012-03-14 2016-10-18 Amazon Technologies, Inc. Motion detection systems for electronic devices
US9063574B1 (en) 2012-03-14 2015-06-23 Amazon Technologies, Inc. Motion detection systems for electronic devices
US9285895B1 (en) 2012-03-28 2016-03-15 Amazon Technologies, Inc. Integrated near field sensor for display devices
US9652083B2 (en) 2012-03-28 2017-05-16 Amazon Technologies, Inc. Integrated near field sensor for display devices
US9035874B1 (en) 2013-03-08 2015-05-19 Amazon Technologies, Inc. Providing user input to a computing device with an eye closure
US9483113B1 (en) 2013-03-08 2016-11-01 Amazon Technologies, Inc. Providing user input to a computing device with an eye closure
US9269012B2 (en) 2013-08-22 2016-02-23 Amazon Technologies, Inc. Multi-tracker object tracking
US11199906B1 (en) 2013-09-04 2021-12-14 Amazon Technologies, Inc. Global user input management
US10055013B2 (en) 2013-09-17 2018-08-21 Amazon Technologies, Inc. Dynamic object tracking for user interfaces
US9367203B1 (en) 2013-10-04 2016-06-14 Amazon Technologies, Inc. User interface techniques for simulating three-dimensional depth
JP2015102924A (en) * 2013-11-22 2015-06-04 シャープ株式会社 Display device, scroll display method, and scroll display program
GB2532010A (en) * 2014-11-04 2016-05-11 Samsung Electronics Co Ltd Display method and device
US10613732B2 (en) 2015-06-07 2020-04-07 Apple Inc. Selecting content items in a user interface display
US11161010B2 (en) 2016-06-11 2021-11-02 Apple Inc. Activity and workout updates
US11148007B2 (en) 2016-06-11 2021-10-19 Apple Inc. Activity and workout updates
US10272294B2 (en) 2016-06-11 2019-04-30 Apple Inc. Activity and workout updates
US11660503B2 (en) 2016-06-11 2023-05-30 Apple Inc. Activity and workout updates
US11918857B2 (en) 2016-06-11 2024-03-05 Apple Inc. Activity and workout updates
US11216119B2 (en) 2016-06-12 2022-01-04 Apple Inc. Displaying a predetermined view of an application
FR3063821A1 (en) * 2017-03-10 2018-09-14 Institut Mines Telecom HUMAN MACHINE INTERFACE
EP3373118A3 (en) * 2017-03-10 2018-12-12 Institut Mines-Telecom Human/machine interface comprising a camera and a marker
US10895953B2 (en) 2017-07-31 2021-01-19 Tencent Technology (Shenzhen) Company Limited Interaction with a three-dimensional internet content displayed on a user interface
CN109804341A (en) * 2017-07-31 2019-05-24 腾讯科技(深圳)有限公司 With the interaction of the three-dimensional internet content of display on a user interface

Also Published As

Publication number Publication date
NO20044073D0 (en) 2004-09-27

Similar Documents

Publication Publication Date Title
WO2006036069A1 (en) Information processing system and method
Lee et al. Interaction methods for smart glasses: A survey
Kratz et al. HoverFlow: expanding the design space of around-device interaction
US20220319100A1 (en) User interfaces simulated depth effects
US7366540B2 (en) Hand-held communication device as pointing device
US10209877B2 (en) Touch screen device, method, and graphical user interface for moving on-screen objects without using a cursor
JP6603332B2 (en) Device and method for operating a user interface with a stylus
CN105335001B (en) Electronic device having curved display and method for controlling the same
EP2263134B1 (en) Communication terminals with superimposed user interface
CN103262008B (en) Intelligent wireless mouse
US9740297B2 (en) Motion-based character selection
EP2045694B1 (en) Portable electronic device with mouse-like capabilities
KR100783552B1 (en) Input control method and device for mobile phone
EP2068235A2 (en) Input device, display device, input method, display method, and program
US8739053B2 (en) Electronic device capable of transferring object between two display units and controlling method thereof
US20100275122A1 (en) Click-through controller for mobile interaction
US20030234766A1 (en) Virtual image display with virtual keyboard
CN113253908B (en) Key function execution method, device, equipment and storage medium
CN104360813A (en) Display equipment and information processing method thereof
JP5173001B2 (en) Information processing apparatus, screen display method, control program, and recording medium
CN113168221A (en) Information processing apparatus, information processing method, and program
Ballagas et al. Mobile Phones as Pointing Devices.
Sasaki et al. Hit-wear: A menu system superimposing on a human hand for wearable computers
KR20110066545A (en) Method and terminal for displaying of image using touchscreen
Lik-Hang et al. Interaction methods for smart glasses

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: COMMUNICATION UNDER RULE 69 EPC (EPO FORM 1205A DATED 12/09/07)

122 Ep: pct application non-entry in european phase

Ref document number: 05798944

Country of ref document: EP

Kind code of ref document: A1