WO2011045786A2 - Wearable device for generating input for computerized systems - Google Patents

Wearable device for generating input for computerized systems

Info

Publication number
WO2011045786A2
Authority
WO
WIPO (PCT)
Prior art keywords
input device
screen
user
wearable
operative
Prior art date
Application number
PCT/IL2010/000834
Other languages
French (fr)
Other versions
WO2011045786A3 (en)
Inventor
Rami Parham
Original Assignee
Rami Parham
Priority date
Filing date
Publication date
Application filed by Rami Parham filed Critical Rami Parham
Publication of WO2011045786A2 publication Critical patent/WO2011045786A2/en
Publication of WO2011045786A3 publication Critical patent/WO2011045786A3/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F 3/0386 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry for light pen
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/033 Indexing scheme relating to G06F3/033
    • G06F 2203/0331 Finger worn pointing device

Definitions

  • Wearable device for generating input for computerized systems
  • the present invention relates generally to input devices and more particularly to portable input devices.
  • WO 2009024971 (A2): Finger-worn Devices and Related Methods of Use
  • WO 0237466 (A1): Electronic User Worn Interface Device
  • WO 2010064094 (Al): Portable Electronic Device with Split Vision Content Sharing Control and Method
  • WEB Mobile Phone with a Built-in Projector
  • Gyro mouse devices are known.
  • Multi-touch technology is known.
  • Optic touch technology is known.
  • Certain embodiments of the present invention seek to provide a convenient, intuitive wearable input device. There is thus provided, in accordance with at least one embodiment of the present invention, a convenient, intuitive wearable input device.
  • Certain embodiments of the present invention seek to provide a system for generating computerized input to electronic devices including some or all of the following components, interacting between them e.g. as shown and described herein:
  • an optical sensor such as an IR camera having optic communication with the sleeve device
  • Wireless communication controller emitter
  • wireless receiver on the sleeve device and the controlled host
  • Light sources e.g. some or all of:
  • Infra-red / Near Infra-red laser, including feedback circuitry for laser activation monitoring
  • red/green laser, including feedback circuitry for laser activation monitoring
  • touch pad scrolling bar for remote scrolling functionality
  • the optic sensor has a spatial field of view which is as wide as possible given application-specific and/or technical constraints.
  • the optic sensor is typically positioned and selected to point at a controlled screen such that its field of view includes the entire screen or to point at a user e.g. such that its field of view includes at least the user's hands.
  • the sensor's filtering is such as to match the optics characterization of the wearable input device's light sources, e.g. in terms of frequencies, power, and distribution.
  • the force sensor actuator, also termed herein "force sensor", may for example comprise a conventional button mechanism.
  • each wearable input device may include some or all of the following:
  • Signals terminal and processing unit, such as but not limited to an MCU-controlled system, embedded in the wearable sleeve, with I/Os, a serial interface, and signal conversion capabilities.
  • Wireless communication controller on wearable sleeve for emitting wireless information from sleeve to controlled host.
  • a wireless communication controller on controlled host (e.g. USB dongle) for emitting wireless information from controlled host to sleeves/IR Camera.
  • a wireless receiver on wearable sleeve for receiving wireless information from host.
  • a wireless receiver on the controlled host (e.g. USB dongle) for receiving wireless information from sleeve/s and/or IR Camera.
  • Optional sleeve memory unit e.g. with compatible interface to the MCU 2 for storing data (e.g. user preferences, user ID/nickname etc.).
  • light source e.g. Infra-red/Near Infra-red laser to generate optic signals to the IR camera.
  • feedback circuitry for laser activation monitoring operative to notify the system when a laser is activated (e.g. red laser / IR laser). Typically located adjacent the monitored laser diode/s.
  • light source: Infra-red / Near Infra-red LED, both vertical and horizontal, to create optic signals to the IR camera.
  • Horizontal light source facilitates remote physical screen control and surface control inter alia.
  • Vertical light source facilitates front projection touch control and laptop/desktop control.
  • Optional Red Laser operative for emphasis and visual pointing.
  • Optional battery charging indicator e.g. light which indicates battery being charged and/or low battery indicator e.g. light and/or, if mandated for eye safety and regulatory issues, a laser activation indicator e.g. light, and/or current work state indicator e.g. light.
  • buttons with haptic feedback each having 2 press options: half and full press, operative to trigger light sources activation and to trigger wireless events transmission.
  • Touch pad scrolling bar operative to trigger scroll up/down events, for applications in which it is desired to provide remote scrolling abilities rather than, say, locating the cursor on the scroller of the screen and dragging it up and down each time it is desired to scroll the page.
  • Force sensor actuator operative to trigger light sources activation, typically embedded within the sleeve's substrate or body.
  • a microphone typically embedded in the wearable sleeve.
  • Speaker embedded in wearable sleeve for applications in which it is desired to provide voice output (e.g. indicating a current work mode).
  • Wearable finger sleeve body/substrate which is suitable in terms of flexibility, texture, weight, and strength.
  • IR camera to detect IR signals and send them to the software application, located so as to point at the screen and/or the user, depending on desired work mode.
  • the one or more wearable input devices e.g. 2 sleeves; optical sensor e.g. IR camera, and software application resident e.g. on the controlled host, may be served by conventional apparatus for charging and data transfer from/to the host e.g. a Micro USB to USB cable and matching micro USB port on the sleeve/s.
  • the Micro USB port located anywhere on the sleeve, preferably at the bottom of the finger, typically connects the sleeve to a host with a USB port (e.g. computer) so as to charge the battery, send data and receive data.
  • Each wearable input device also may comprise the following optional components:
  • Flash memory which enables the sleeve to act as a 'wearable disc on key'
  • An input system including components 3a, 3d, 8, 9 above, which mimics or preserves mouse functionality e.g. left click, right click, scrolling using buttons and scroller similar to those in a regular mouse.
  • a wearable input device e.g. including components 1, 6a1, 6b, 10, 14 above, which is operative both when the optic sensor points at the screen (the 'natural' state for projected screens) and when the optic sensor points at the user (the 'natural' state for tangible screens).
  • a wearable input device e.g. including components 6b, 10, 14 above, having an operating mode in which touch/multi touch abilities are executed by pressing the screen with the fingertip - e.g. as in conventional touch screens.
  • a wearable input device e.g. including components 1, 6a1, 6b, 10, 14 above, having an operating mode in which remote cursor control is executed by joining the thumb and the forefinger in a 'pinching'-like manner, e.g. as shown in Figs. 4d - 4e, which affords ergonomic and intuitive motion for the human hand such that to the user it seems like s/he is holding the cursor with her or his thumb and forefinger and moving it around.
  • a wearable input device e.g. including components 1, 3a, 3d, 6a1, 6b, 8, 9, 10, 14 above, which enables both remote and touch interaction rather than either a remote or a touch solution alone.
  • a wearable input device e.g. including components 1, 3a, 3d, 6a1, 6b, 6c1,
  • a wearable input device e.g. including components 3a, 3d, 6b, 8, 9, 10, 14 above providing Laptop/desktop interaction in which, typically, at least the entire keyboard area has 2 roles: keyboard + mouse pad which may be alternated in zero setup time e.g. by joining the forefinger and the thumb together, typically such that mouse functionality including some or all of left click, right click and scrolling is in thumb reach.
  • Certain embodiments of the present invention seek to provide a touchless embodiment which prevents or diminishes screen wear (amortization) and, given the absence of friction between the finger and the screen, enables faster and smoother movements by the user.
  • a wearable input device operative to control an electronic system
  • the input device comprising a wearable substrate, an IR laser source mounted on the wearable substrate and operative to emit a laser beam impinging on a screen at a location whose coordinates depend on motion of the user wearing the input device, and a laser spot detector operative to detect coordinates of the location within the screen and accordingly to control the electronic system.
  • an IR- based apparatus for controlling an electronic device, the apparatus comprising an IR camera configured to be mountable on an electronic device, the electronic device having an input area, the IR camera's field of view including the input area, the IR camera being operative to sense IR signals generated by an input device worn on at least one user's hand, when the hand is operating on the input area; and a controlling functionality operative to receive the IR signals from the IR camera and to control the electronic device accordingly.
  • a wearable input device serving a human user
  • the device comprising a wearable substrate, and at least one force sensor mounted on the wearable substrate and operative to sense pressure patterns applied by the human user which mimic the pressure patterns the human user would apply to a mouse button, and a controlling functionality operative to receive signals, indicative of the pressure, from the force sensor and to control a normally cursor-based electronic device accordingly, including commanding the electronic device to respond to each pressure pattern applied by the human user, as it would respond to the same pressure pattern were it to have been applied by the human user to a cursor-based input device operating the electronic device.
  • a wearable input apparatus operative to provide multi-touch control of an electronic system when the apparatus is worn by a human user
  • the input apparatus comprising a first wearable input device operative to control the electronic system when mounted on the human user's right hand, and a second wearable input device which is a mirrored copy of the first wearable input device and is operative to control the electronic system when mounted on the human user's left hand.
  • a wearable input apparatus operative to control an electronic system when worn by a human user
  • the input apparatus comprising a finger-wearable substrate configured to be mounted on a user's finger and having a tip portion configured to be mounted on the user's finger tip and a force sensing device mounted on the tip portion and including at least one force sensor and being operative to sense pressure applied by the human user's thumb when the user presses thumb to finger, and to control said electronic system at least partly in accordance with at least one characteristic of the pressure.
  • said force sensing device comprises an annular substrate configured to pivot around the user's finger on which a plurality of force sensors are mounted, such that, by rotating the annular substrate, the user selectably positions any of the plurality of force sensors at a location on his finger tip which is accessible to his thumb.
  • Light sources may optionally be provided on the annular substrate.
  • an apparatus comprising an input device worn on at least one user's hand which enables a user to alternate the controlling functionality between a first state in which the electronic device is responsive to the input area and a second state in which the electronic device is responsive to said IR signals.
  • One hand may operate the keyboard and the other hand may operate the IR signals, and the controlling functionalities of the system may be such that both states may be operative simultaneously.
  • an apparatus wherein the input device includes a state selector allowing the user to alternate the controlling functionality between the first and second states.
  • a wearable input system including a wearable input device, and a docking station operative to receive the wearable input device and selectably retain and release the wearable input device once received, so as to enable a human to wear and shed the wearable input device without manual contact therewith.
  • the docking station is operative to provide mechanical protection to the wearable input device.
  • retaining of the input device by the docking station is activated by the human's twisting a body part on which the input device is mounted in a first azimuthal direction, after the device has been received by the docking station, and wherein releasing the input device by the docking station is activated by the human's twisting a body part on which the input device is mounted in a second azimuthal direction, opposite to the first direction, while the device is being retained by the docking station.
  • the laser spot detector comprises an IR camera arranged such that its field of view includes the screen.
  • a device wherein the screen may comprise a projected screen and the laser spot detector may be mounted on a projector projecting the projected screen.
  • the projector comprises an image projector in a handheld device or in a mobile device, e.g. a laptop, which is not 'hand held'.
  • the laser spot detector comprises an optic-sensitive functionality of the screen itself.
  • the electronic system comprises a computer.
  • the electronic system comprises a portable communication device.
  • the screen comprises a physical screen.
  • the laser source comprises an IR laser source.
  • a device having two user-selectable modes of operation including a remote mode of operation utilizing the laser source and the laser spot detector and a touch mode of operation.
  • a device wherein the touch mode of operation utilizes a light source mounted on the wearable input device and activated by an actual touching of the screen by the user, a sensor which senses a location of the light source and a controller which receives the light source location from the sensor and controls the electronic system accordingly.
  • a device wherein selectability of the selectable modes is activated by a human user who operates a two-manner press mechanism of a pressable element on the wearable input device.
  • an apparatus wherein the input area includes a keyboard. In accordance with an embodiment of the invention there is still further provided an apparatus wherein the input area includes a touch-pad.
  • a device wherein the controlling functionality is also operative to receive signals, indicative of a mouse-sliding operation simulated by the wearable substrate and to control a normally mouse-operated electronic device accordingly, including commanding the electronic device to respond to each mouse-sliding operation simulated by the human user using the substrate, as it would respond to the same mouse-sliding operation were it to have been applied by the human user to a mouse operating the electronic device.
  • first and second wearable input devices are enantiomers which are mirror images of each other.
  • an apparatus comprising an optical sensor operative to simultaneously sense positions of a plurality of light points generated by both input devices simultaneously.
  • an apparatus comprising a controlling application operative for simultaneously receiving and simultaneously processing positions of a plurality of light points generated by both input devices simultaneously and for controlling a host device accordingly.
  • the optical sensor comprises an IR sensor and the light points generated by both input devices simultaneously comprise IR light points.
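  • The following is a minimal sketch, under stated assumptions, of handling both devices' light points concurrently: the sensor is assumed to report every bright point per frame, and a nearest-neighbour match keeps each point bound to the hand it belonged to in the previous frame, so both input streams can be processed at once. Device names and coordinates are illustrative, not from the patent.

```python
import math

def track(points, last_positions):
    """Assign this frame's light points to device ids by nearest previous position."""
    assignments = {}
    free = list(points)
    for dev, last in last_positions.items():
        if not free:
            break
        best = min(free, key=lambda p: math.dist(p, last))  # closest point wins
        assignments[dev] = best
        free.remove(best)
    return assignments

last = {"left_hand": (0.2, 0.5), "right_hand": (0.8, 0.5)}
frame_points = [(0.82, 0.48), (0.21, 0.52)]   # two simultaneous IR points seen
print(track(frame_points, last))
# {'left_hand': (0.21, 0.52), 'right_hand': (0.82, 0.48)}
```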
  • a touchless user input system comprising a rear projection unit located at the bottom of a surface which projects onto the surface at an angle acute enough to accommodate the small distance between the screen and the projecting unit, a wearable input device emitting light, and a rear optical sensor located behind the screen e.g. adjacent the projecting unit, which is operative to see the light emitted by the wearable input unit.
  • a system comprising a force sensing actuator on the wearable input device which triggers emission of the light.
  • a system wherein a user joins thumb and finger together adjacent a desired screen location, wherein the light comprises IR light which impinges upon the surface at the desired screen location, and wherein the rear sensor detects the points on the screen from which the laser beams are scattering.
  • the IR laser source comprises a near-IR laser source.
  • an apparatus wherein the state selector is controlled by a manual operation by the user in which the user presses together his thumb and a finger wearing the input device.
  • a wearable input device which includes an optic sensor and which, when worn by a human user, provides inputs to an electronic device, the input device having a first operating mode in which the optic sensor points at a screen and a second operating mode in which the optic sensor points at the user.
  • an input device wherein the screen comprises a projected screen.
  • a wearable input device which, when worn by a human user, provides inputs to an electronic device, the input device having a "touch" operating mode in which the human user presses a screen which is not a touch-screen and the electronic device is controlled as though the screen were a touch screen.
  • a device which provides to a human user an experience as though the user were holding a cursor with thumb and forefinger and moving the cursor.
  • a wearable input device which, when worn by a human user, provides inputs to an electronic device, the input device having a "touch" operating mode in which the human user presses a screen and the electronic device is controlled as though the screen were a touch screen; and a remote operating mode in which the human user interacts with the screen remotely from afar.
  • a wearable input device which is operative in any of a selectable plurality of
  • an input device wherein the plurality of environments includes a front projected screen environment.
  • an input device wherein the plurality of environments includes a rear projected screen environment.
  • an input device wherein the plurality of environments includes a desktop computer environment.
  • an input device wherein the plurality of environments includes a laptop computer environment.
  • an input device wherein the plurality of environments includes a mobile device with an embedded pico projector.
  • an input device wherein the plurality of environments includes an interactive surface environment.
  • a method for providing IR-based apparatus for controlling an electronic device comprising providing an IR camera configured to be mountable on an electronic device, the electronic device having an input area, the IR camera's field of view including the input area, the IR camera being operative to sense IR signals generated by an input device worn on at least one user's hand, when the hand is operating on the input area, and providing a controlling functionality operative to receive the IR signals from the IR camera and to control the electronic device accordingly.
  • a method for providing a wearable input device serving a human user comprising mounting at least one force sensor on a wearable substrate, wherein the force sensor is operative to sense pressure patterns applied by the human user which mimic the pressure patterns the human user would apply to a mouse button, and providing a controlling functionality operative to receive signals, indicative of the pressure, from the force sensor and to control a normally cursor-based electronic device accordingly, including commanding the electronic device to respond to each pressure pattern applied by the human user, as it would respond to the same pressure pattern were it to have been applied by the human user to a cursor-based input device operating the electronic device.
  • a method for providing wearable input apparatus operative to provide multi-touch control of an electronic system when the apparatus is worn by a human user comprising providing a first wearable input device operative to control the electronic system when mounted on the human user's right hand, and providing a second wearable input device which is a mirrored copy of the first wearable input device and is operative to control the electronic system when mounted on the human user's left hand.
  • a method for providing wearable input apparatus operative to control an electronic system when worn by a human user comprising providing a finger-wearable substrate configured to be mounted on a user's finger and having a tip portion configured to be mounted on the user's finger tip, and mounting a force sensing device on the tip portion, the force sensing device including at least one force sensor and being operative to sense pressure applied by the human user's thumb when the user presses thumb to finger, and to control the electronic system at least partly in accordance with at least one characteristic of the pressure.
  • a method for providing a wearable input system including providing a docking station operative to receive a wearable input device and to selectably retain and release the wearable input device once received, so as to enable a human to wear and shed the wearable input device without manual contact therewith.
  • a method for providing a touchless user input system comprising providing a rear projection unit located at the bottom of a surface which projects onto the surface at an angle acute enough to accommodate the small distance between the screen and the projecting unit, providing a wearable input device emitting light, and providing a rear optical sensor operative to be disposed behind the screen e.g. adjacent the projecting unit, which is operative to see the light emitted by the wearable input unit.
  • a method for providing a wearable input device including providing a wearable input device which includes an optic sensor and which, when worn by a human user, provides inputs to an electronic device, the input device having a first operating mode in which the optic sensor points at a screen and a second operating mode in which the optic sensor points at the user.
  • a method for providing a wearable input device including providing a wearable input device which, when worn by a human user, provides inputs to an electronic device, the input device having a "touch" operating mode in which the human user presses a screen which is not a touch-screen and the electronic device is controlled as though the screen were a touch screen.
  • a method for providing a wearable input device including providing a wearable input device which, when worn by a human user, provides inputs to an electronic device, the input device having a "touch" operating mode in which the human user presses a screen and the electronic device is controlled as though said screen were a touch screen, and a remote operating mode in which the human user interacts with the screen remotely from afar.
  • a method for providing a wearable input device including providing a wearable input device which is operative in any of a selectable plurality of interactive environments.
  • a method for providing a wearable input device operative to control an electronic system comprising providing an IR laser source mounted on a wearable substrate and operative to emit a laser beam impinging on a screen at a location whose coordinates depend on motion of the user wearing the input device, and providing a laser spot detector operative to detect coordinates of the location within the screen and accordingly to control the electronic system.
  • a computer program product comprising a computer usable medium having a computer readable program code embodied therein, the computer readable program code adapted to be executed to implement any of the methods shown and described herein.
  • annular substrate comprises at least one light source selectably triggered by at least one of the force sensors respectively.
  • the at least one light source comprises at least one LED.
  • an apparatus wherein the user can control the keyboard with one hand and control the cursor with the other hand simultaneously.
  • an apparatus operative to distinguish between input streams emanating from each of the user's hands and to handle both of the input streams concurrently.
  • the light source comprises an LED.
  • a computer usable medium on which resides a controlling functionality.
  • a computer program product comprising a computer usable medium or computer readable storage medium, typically tangible, having a computer readable program code embodied therein, the computer readable program code adapted to be executed to implement any or all of the methods shown and described herein. It is appreciated that any or all of the computational steps shown and described herein may be computer-implemented. The operations in accordance with the teachings herein may be performed by a computer specially constructed for the desired purposes or by a general purpose computer specially configured for the desired purpose by a computer program stored in a computer readable storage medium.
  • Any suitable processor, display and input means may be used to process, display e.g. on a computer screen or other computer output device, store, and accept information such as information used by or generated by any of the methods and apparatus shown and described herein; the above processor, display and input means including computer programs, in accordance with some or all of the embodiments of the present invention.
  • a processor, workstation or other programmable device or computer or electronic computing device, either general-purpose or specifically constructed, used for processing; a computer display screen and/or printer and/or speaker for displaying; machine-readable memory such as optical disks, CDROMs, magneto-optical discs or other discs, RAMs, ROMs, EPROMs, EEPROMs, magnetic or optical or other cards, for storing; and a keyboard or mouse for accepting.
  • processor includes a single processing unit or a plurality of distributed or remote such units.
  • the above devices may communicate via any conventional wired or wireless digital communication means, e.g. via a wired or cellular telephone network or a computer network such as the Internet.
  • the apparatus of the present invention may include, according to certain embodiments of the invention, machine readable memory containing or otherwise storing a program of instructions which, when executed by the machine, implements some or all of the apparatus, methods, features and functionalities of the invention shown and described herein.
  • the apparatus of the present invention may include, according to certain embodiments of the invention, a program as above which may be written in any conventional programming language, and optionally a machine for executing the program such as but not limited to a general purpose computer which may optionally be configured or activated in accordance with the teachings of the present invention. Any of the teachings incorporated herein may wherever suitable operate on signals representative of physical objects or substances.
  • the term "computer” should be broadly construed to cover any kind of electronic device with data processing capabilities, including, by way of non-limiting example, personal computers, servers, computing system, communication devices, processors (e.g. digital signal processor (DSP), microcontrollers, field programmable gate array (FPGA), application specific integrated circuit (ASIC), etc.) and other electronic computing devices.
  • Any suitable output device or display may be used to display or output information generated by the apparatus and methods shown and described herein.
  • Any suitable processor may be employed to compute or generate information as described herein e.g. by providing one or more modules in the processor to perform functionalities described herein.
  • Any suitable computerized data storage e.g. computer memory may be used to store information received by or generated by the systems shown and described herein.
  • Functionalities shown and described herein may be divided between a server computer and a plurality of client computers. These or any other computerized components shown and described herein may communicate between themselves via a suitable computer network.
  • Fig. 1a is a pictorial side-view illustration of a wearable input device constructed and operative in accordance with one embodiment of the present invention.
  • Fig. 1b is a pictorial isometric illustration of the device of Fig. 1a.
  • Fig. 1c is a pictorial illustration of an IR camera mounted on an image projector, facing a screen onto which an image is being projected by the projector, all in accordance with an embodiment of the present invention.
  • Fig. 1d is a pictorial diagram of a first method whereby a wearable input device constructed and operative in accordance with certain embodiments of the present invention controls a computerized application.
  • Fig. 1e is a diagram of a state-chart for a half-press mechanism constructed and operative in accordance with an embodiment of the present invention.
  • Fig. 1f is a diagram of a state-chart for a touch-screen mechanism constructed and operative in accordance with an embodiment of the present invention.
  • Fig. 1g is a pictorial illustration of an IR camera mounted adjacent a computer screen, facing down toward a typical location of a user's hands on an input device such as a keyboard associated with the computer screen, all in accordance with an embodiment of the present invention.
  • Fig. 1h is a diagram of a state-chart for a continuous PC mechanism constructed and operative in accordance with an embodiment of the present invention.
  • Fig. 1i is a pictorial illustration of a user's hands on the keyboard of Fig. 1g, wherein the user is wearing an input device constructed and operative in accordance with an embodiment of the present invention, and is using the input device to control the keyboard and mouse without removing his hands from the keyboard, due to the operation of the IR camera of Fig. 1g in conjunction with the input device, all in accordance with an embodiment of the present invention.
  • Figs. 2a - 2d are pictorial illustrations of respective stages in the interaction of a wearable input device constructed and operative in accordance with an embodiment of the present invention, with a docking station constructed and operative in accordance with an embodiment of the present invention.
  • Fig. 3 is a pictorial diagram of a second method whereby a wearable input device constructed and operative in accordance with certain embodiments of the present invention controls a computerized application.
  • Figs. 4a - 4b are isometric views of a wearable "sleeve" device for generating computerized input to electronic devices, according to certain embodiments of the present invention.
  • Figs. 4c - 4e are pictorial illustrations of the device of Figs. 4a - 4b as mounted on and manipulated by a user's hands, according to certain embodiments of the present invention.
  • Fig. 5a illustrates an exemplary plurality of work modes according to which the sleeve device of Figs. 4a - 4b may operate, and which work modes are typically selectable as alternative states, using the state selection actuator 60 of Figs. 4a - 4b.
  • Fig. 5b illustrates a pico projector embedded in a mobile communication device such as a cellphone, which is operative in conjunction with a wearable input device according to certain embodiments of the present invention.
  • Fig. 5c illustrates a tangible screens environment which may be one of a selectable plurality of interactive environments in which the wearable input device shown and described herein is operative, according to certain embodiments of the present invention.
  • Fig. 5d illustrates an interactive surface environment which may be one of a selectable plurality of interactive environments in which the wearable input device shown and described herein is operative, according to certain embodiments of the present invention.
  • Figs. 6a - 6d are illustrations useful in understanding methods of operation of a software application typically resident on the controlled host e.g. computer or mobile communication device being controlled by inputs generated using the input generating system shown and described herein, all in accordance with certain embodiments of the present invention.
  • Figs. 7a - 7b are front and back isometric illustrations of a flat surface with rear projection configuration, characterized by projection at a very obtuse angle such that projection may be from very close, in conjunction with a rear sensor, whose field of view typically comprises the entire screen, which apparatus is useful in implementing a touchless embodiment of the present invention.
  • Figs. 8a - 8q are useful in understanding hardware components of a wearable input device constructed and operative in accordance with certain embodiments of the present invention.
  • Figs. 9a - 9f are useful in understanding software components of a wearable input device constructed and operative in accordance with certain embodiments of the present invention.
  • Fig. 1a is a pictorial illustration of the Input device.
  • the Input device typically includes some or all of an IR laser/LED and red laser, RF controllers and a flattened scrolling bar, typically accessible by a user's thumb.
  • the mode of operation of the apparatus of Figs. 1a and 1b is typically as follows: First, the two RF control buttons are able to transmit, to a PC application residing on the controlled computer, mouse clicks, e.g. left and right, in a way similar to the way a cordless mouse operates.
  • the scroll bar is a touch-sensitive electronic pad which simulates the operation of a scroll-wheel, creating signals that are transferred to the PC application using RF communication.
  • the input device's touch scroll bar is not visible in Fig. 1b as it is right under the thumb in the image.
  • the scrolling operation of the input device is typically effected by an intuitive sliding of the thumb along one side of the finger.
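  • A sketch of the scroll bar's signal generation, assuming the touch pad reports the thumb's position along the finger as a value in 0..1: successive readings are differenced and scaled into scroll-wheel ticks to be sent over RF to the host application. The sensitivity constant is an illustrative assumption.

```python
class ScrollBar:
    TICKS_PER_BAR = 20                   # assumed sensitivity: full slide = 20 ticks

    def __init__(self):
        self.last = None

    def touch(self, pos: float) -> int:
        """Return scroll ticks (+down / -up) for a new thumb position reading."""
        if self.last is None:            # first contact: no movement yet
            self.last, ticks = pos, 0
        else:
            ticks = round((pos - self.last) * self.TICKS_PER_BAR)
            if ticks:
                self.last = pos          # only advance once a full tick accrues
        return ticks

    def release(self):
        self.last = None                 # thumb lifted off the bar

bar = ScrollBar()
print([bar.touch(p) for p in (0.30, 0.35, 0.50, 0.52)])  # [0, 1, 3, 0]
```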
  • Mouse cursor control using the input device, according to certain embodiments of the invention, is now described in detail.
  • a laser ray in the IR / near IR spectrum is emitted from the input device, and strikes the screen at a given point. That point's coordinates are transferred to the PC application. Any suitable method may be employed to detect that point, such as but not limited to any of the following methods a - c:
  • a. the IR camera: After striking the screen, the laser ray is scattered into the space of the room. An IR camera is constantly videoing the screen and detects the points on the screen where the laser beams are being scattered from it; the camera then sends this input to the PC application (a sketch of this method appears after this list). The IR camera may be mounted on the projector facing the screen as shown in Fig. 1c.
  • b. Two or more IR cameras are located on the ribs of the screen, constantly videoing the screen and sending their input to a Central Computing Unit. Then, using mathematical and trigonometric computations on the integrated input, the CCU detects the contact point between the laser beams and the screen, sending it to the PC application.
  • c. the screen as a whole is an optic-sensitive screen, which self-detects the striking point of laser beams at a given range of the electromagnetic spectrum.
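  • Below is a minimal sketch of detection method (a) above, not the patent's implementation: it finds the brightest region in one IR camera frame and reports its pixel coordinates. Frame acquisition is stubbed with a synthetic array; the threshold and frame size are illustrative assumptions.

```python
import numpy as np

INTENSITY_THRESHOLD = 200  # assumed 8-bit sensor; tune to laser power and filtering

def detect_laser_spot(frame: np.ndarray):
    """Return (x, y) pixel coordinates of the scattered laser spot, or None."""
    if frame.max() < INTENSITY_THRESHOLD:
        return None                      # no spot bright enough in view
    ys, xs = np.nonzero(frame >= INTENSITY_THRESHOLD)
    # centroid of all above-threshold pixels approximates the spot centre
    return float(xs.mean()), float(ys.mean())

# usage with a synthetic frame standing in for one video frame of the screen
frame = np.zeros((480, 640), dtype=np.uint8)
frame[240:243, 320:323] = 255            # simulated scatter from the IR laser
print(detect_laser_spot(frame))          # ~(321.0, 241.0)
```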
  • the apparatus of Fig. 1b and/or variations thereof illustrated and described herein typically includes some or all of the following, suitably positioned e.g. as shown: horizontal IR LED 10, vertical IR LED 10, IR laser 30, red laser 40, scroller 50, buttons 60 and 70, state selector 80, force sensing actuator 90, IR LED 100, technology box 110 in which suitable non-location-sensitive components may be concentrated, speakers and microphone 120, flexible body 140 with paper battery inside, indicator LEDs 150 and on/off switch 160.
  • the wearable input device of the present invention is sometimes termed herein "sleeve" although the configuration thereof may or may not resemble an actual sleeve such as a finger-wearable sleeve.
  • the image is not projected onto the screen on the user's side, as in a conventional projector, but rather from the back.
  • the laser beam penetrates the transparent screen, refracting somewhat, and is absorbed in an IR camera disposed at the back of the screen. The camera sees the point on the screen from which the beam arrived and thereby discerns the location on the screen pointed at by the user.
  • a module of the PC application receives some or all of the above-mentioned inputs, such as but not limited to some or all of: mouse clicks, scroll messages, beam striking point, and integrates them into the OS, masquerading as a regular or multi-touch mouse.
  • the user receives feedback to her or his actions as if s/he were using a mouse; for example, the mouse cursor moves on the screen, clicks open windows and menus, etc. The interaction continues.
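  • As an illustration of this integration step, here is a hedged sketch of a host-side dispatcher that merges the event streams (camera spot coordinates, RF clicks, scroll-bar deltas) and replays them as ordinary mouse events; move_cursor, send_click and send_scroll are hypothetical stand-ins for whatever OS injection API is actually used.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str       # "spot", "click" or "scroll"
    payload: tuple  # coordinates, button name, or scroll delta

def move_cursor(x, y): print(f"cursor -> ({x}, {y})")   # stand-in for OS injection
def send_click(button): print(f"{button} click")        # stand-in for OS injection
def send_scroll(delta): print(f"scroll {delta:+d}")     # stand-in for OS injection

def dispatch(event: Event, screen_w: int = 1920, screen_h: int = 1080) -> None:
    """Replay one sleeve/camera event as an ordinary mouse event."""
    if event.kind == "spot":        # relative (0..1) coordinates from the camera
        u, v = event.payload
        move_cursor(round(u * screen_w), round(v * screen_h))
    elif event.kind == "click":     # RF button event: "left" or "right"
        send_click(event.payload[0])
    elif event.kind == "scroll":    # touch scroll-bar delta received over RF
        send_scroll(event.payload[0])

for e in (Event("spot", (0.5, 0.25)), Event("click", ("left",)), Event("scroll", (-3,))):
    dispatch(e)
```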
  • Fig. 1d is a graphical representation of the process, as a low-resolution process breakdown.
  • the input device typically comprises some or all of the following components: an on/off switch
  • light sources, typically IR laser + IR LED + Red Laser
  • buttons constructed in a two-press manner - half and full press, as described below
  • the input device body is made of a combination of flexible shape-memory polymers which easily conform themselves to the user's physical dimensions.
  • the input device may be worn on any of the user's fingers on either hand.
  • Half and full press: The two RF controllers, located at the second and third parts of the finger e.g. as shown in Fig. 1b, are constructed in a two-press manner - half and full press. Since the input device is a wearable mouse, it constantly follows any hand movement. Therefore, the user is typically afforded the ability to choose when s/he wants it to influence the cursor position and when not.
  • the half press mechanism helps him to easily achieve this ability as described with reference to the state-charts of Figs. 1e - 1h, inter alia.
  • the input device is typically multi-state and may be in one of several different states a - e at any given moment, such as but not limited to some or all of the following:
  • IR Laser state enables remote control of large screens, as described above.
  • the behavior of the operation algorithm is described in Fig. 1e, which is a suitable half press state-chart.
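  • A sketch of the half/full press behavior under stated assumptions (the exact state-chart of Fig. 1e is not reproduced here): a half press turns the light source on so that hand motion steers the cursor, a full press additionally fires the click, and release turns the light source off again.

```python
class HalfPressButton:
    """Two-stage RF button: half press steers the cursor, full press clicks."""
    IDLE, HALF, FULL = "idle", "half", "full"

    def __init__(self):
        self.state = self.IDLE
        self.light_on = False            # light source on => cursor follows hand

    def half_press(self):
        if self.state == self.IDLE:
            self.state = self.HALF
            self.light_on = True         # start influencing cursor position

    def full_press(self):
        if self.state == self.HALF:
            self.state = self.FULL
            print("click")               # event transmitted over RF to the host

    def release(self):
        self.state = self.IDLE
        self.light_on = False            # hand motion no longer moves the cursor

b = HalfPressButton()
b.half_press(); b.full_press(); b.release()
print(b.state, b.light_on)               # idle False
```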
  • Red Laser state: the device switches to a laser-pointer mode, emitting, instead of an IR laser detectable by the IR camera, a red laser in the human-visible spectrum. Using this mode the user may switch from controlling the mouse to highlighting important features on the screen, similar to the way a regular laser-pointer works. Although the user has no effect on the cursor position, the other functionality of the input device, such as clicking and scrolling abilities, continues to work regularly.
  • IR LED mode 1 - touch screen: This mode enables the user to easily transform any large screen into a multi-touch screen.
  • the user may work with the input device adjacent to the screen while the active light source may be the IR LED instead of the IR laser. Operation, according to certain embodiments, is illustrated in Fig. 1f, which is a touch screen state-chart.
  • the IR LED may have a stretching mechanism which may enable it to protrude from the input device similarly to a stylus pen.
  • IR LED mode 2 - personal computer: This mode enables the user to control desktop and laptop computers. In this mode's settings, the IR camera is located at the upper rib of the computer screen, facing down towards the user's hand which is typically lying on the keyboard, as shown in Fig. 1g. The user uses both hands to work with the keyboard and when s/he wishes to move the mouse cursor or execute any other mouse operation such as clicking or scrolling, s/he does not need to lift her or his hand from the keyboard and hold the mouse - s/he uses the input device while her or his hands are still on the keyboard.
  • the IR camera located on the computer's screen detects the IR signals being emitted from the input device and sends them to the PC application module on the controlled computer, which transforms them into mouse movements on the screen.
  • the behavior of the operation algorithm is as shown in Fig. 1e - the half press state chart.
  • IR LED mode 3 - continuous PC: This mode may be exactly the same as IR LED mode 2 (personal computer), except that the IR LED typically works constantly. Therefore, typically, any movement always influences the cursor position without the need to half press the button first.
  • the behavior of the operation algorithm is described in Fig. 1h, which is a continuous PC state-chart.
  • the input device enables the user in front of any kind of audience to achieve total remote mouse control, typically including some or all of the following: Left click, Right click, Cursor control, scrolling, double click, typically while offering advantages such as but not limited to some or all of the following:
  • the user is not limited to a single position and may walk freely around the presentation area, as large as it may be.
  • the presenter does not need to hold anything in her or his hands. Both her or his hands are totally free and s/he may use her or his body language in a full, natural manner and even hold objects during the presentation. Furthermore, the input device is built in such a way as to leave the tip of the pointing finger totally free, which allows him to comfortably type on a keyboard without first laying another object down.
  • the user may turn the input device into a laser pointer at any point while still maintaining control of any mouse functionality except for cursor position, such as but not limited to some or all of the following: Left click, Right click, scrolling, double click.
  • the user may use any kind of big screen, whether an LCD screen or a projection screen, as a touch screen.
  • the user may have full control of the keyboard and mouse while both her or his hands remain on the keyboard.
  • the advantages of such an approach may include but are not limited to some or all of the following: a. Effectiveness - the user's work rate increases since the move between mouse and keyboard requires zero setup time. Many users are wary of constantly moving their hands between mouse and keyboard and instead work with one hand on the mouse and one hand on the keyboard; this approach suffers from limitations because keyboards are designed for two-hand use. b. Ergonomics - the frequent move from keyboard to mouse wears the wrist and shoulder joints, phenomena which are overcome by certain embodiments of the input device shown and described herein.
  • Laptop control may be performed using an IR camera hooked on the top of the screen.
  • the IR camera facing the keyboard area may become embedded in laptops at the manufacturing stage. This may transform the input device into a completely independent unit, communicating with the laptop without the need for external devices.
  • Fig. 1i illustrates a possible implementation in which the IR camera is embedded into the laptop's body.
  • the input device market is moving towards multi-touch control e.g. iPhone and other devices.
  • This multi-touch control has barely been extended into desktop and laptops due to price & limitations of touch-screen technology.
  • An input device worn on one of the fingers of both the left and right hands brings multi-touch functionality to desktop / laptop / media center / projector screen applications.
  • the ability to control objects on the screen using both hands in an intuitive way is useful for everyday computer work-flow.
  • a docking station useful in association with the input device is now described. Referring now to Figs. 2a - 2d, the interaction between the input device and the docking station may be as follows:
  • the user inserts her or his finger into the docking station while the input device is on the finger.
  • the user rotates her or his finger 90 degrees clockwise.
  • the user takes the finger out of the docking station, leaving the input device inside.
  • the action of wearing the input device is symmetrical:
  • the user takes the finger out of the docking station with the input device on it, ready to be used.
  • the advantages of the docking station may include but are not limited to some or all of the following:
  • the docking station enables the user to wear and remove the input device from her or his hand quickly and efficiently and, more importantly, without the use of the other hand.
  • the input device is typically formed of relatively gentle micro-electronics.
  • the docking station shall protect the input device while in transit, e.g. in a laptop bag.
  • the docking station shall provide a very respectable casing for the input device.
  • the docking station may connect to electricity and be used as the input device's charger.
  • the docking station may be equipped with an internal large capacity rechargeable battery, and so able to charge the input device even when the docking station is not connected to electricity. This overcomes the space limitations in the input device, extending its battery life much further.
  • cordless input devices such as wireless keyboards or mouse devices often function in a way allowing them to communicate with a single specific plug which has their ID encoded into it.
  • moving the control from one computer to the next typically involves the user disconnecting the plug from the first computer and reconnecting it to the second computer.
  • a more convenient mode which is used mainly in Bluetooth devices such as cell phones and earpieces allows the user to connect any device to any other device using a process called "Pairing" in which the devices are authorized to work with each other.
  • the PC application, the IR camera and the entire input device's design typically enable the user to work in this mode of operation.
  • Advantages created by this communication mode may include but are not limited to some or all of the following:
  • the user control of the input device PC application may enable the definition of user groups, and so enable multi-user control of the controlled computer.
  • Examples may include but are not limited to the following:
  • the active user group shall be "presenter" but when students ask questions, the presenter switches the active user group to "student" and the entire discussion becomes interactive.
  • the user group may include, say, a marketing director, the CEO and the chairman.
  • the CEO and chairman may intervene and assume control at any point during the meeting, resulting in a dynamic, multi-user control system which revolutionizes the way people collaborate with computers.
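  • A minimal sketch of such user-group switching, assuming every incoming event is tagged with the id of the sleeve that produced it: the application keeps a mapping of users to groups and only acts on events whose user belongs to the currently active group. Group names and ids here are illustrative.

```python
class UserGroups:
    def __init__(self, groups: dict):
        self.groups = groups             # group name -> set of user ids
        self.active = None

    def activate(self, group: str):
        self.active = group              # e.g. presenter hands control over

    def accept(self, user_id: str) -> bool:
        """True if events from this user should currently control the host."""
        return self.active is not None and user_id in self.groups[self.active]

g = UserGroups({"presenter": {"alice"}, "student": {"bob", "carol"}})
g.activate("presenter")
print(g.accept("alice"), g.accept("bob"))    # True False
g.activate("student")                        # a question from the audience
print(g.accept("bob"))                       # True
```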
  • the input device is no longer the computer's mouse; it is a personal mouse, just as cell phones are a personal rather than an office communication device.
  • the Input device's components may include some or all of the following:
  • light sources, such as but not limited to some or all of the following:
  • buttons constructed in a two-press manner - half and full press
  • the Input device body may be made of a combination of flexible shape-memory polymers which easily conform themselves to the user's physical dimensions and enable the user to freely move her or his finger.
  • the Input device may be worn on any of the user's fingers on either hand, e.g. as shown in Fig. 1b.
  • the light source is found at the front of the finger, typically in the middle joint.
  • the edge of the Input device which is positioned, when worn by the user, at the edge of her or his finger, typically pivots or revolves, typically a full 360 degrees, such that some or all of the following four components may be disposed azimuthally at the tip of the user's finger, e.g. on its four sides respectively, approximately 90 degrees apart:
  • the Input device's Operating Mechanism typically includes some or all of the following: RF Controllers, Scroll bar, cursor position control, integration of mouse clicks, scroll messages and beam striking point into a controlled device's operating system and provision of user feedback re her or his own actions e.g. as with a mouse, light sources, half-press mechanism, force sensing mechanism. Each of these is described in detail below, according to example embodiments of the present invention: RF Controllers: The two or more RF controllers may be located at the 2nd and 3rd parts of the finger, but not necessarily, e.g. as shown in Fig. 1b.
  • the controlled device may be a Laptop / Desktop computer, mobile phone, camera, television or substantially any device that employs Human-Computer Interaction (HCI). Those events may then be interpreted as mouse clicks (e.g. left and right) in a way similar to the way a cordless mouse operates.
  • the scroll bar is a touch-sensitive electronic pad which simulates the operation of a scroll-wheel, creating signals that are transferred to the application in the same way.
  • the Input device's touch scroll bar is not visible in Fig. 1b as it is right under the thumb in the image.
  • the scrolling operation of the Input device is typically effected by sliding of the thumb along one side of the finger in any direction.
  • Cursor position: Input device control of the cursor position, according to certain embodiments, is now described, using the remote control environment operated by the IR Laser as an example. Other environments / light sources, as described below, may share the same control scheme.
  • a laser beam in the IR / near IR spectrum is emitted from the Input device, and strikes the screen at a given point. That point's coordinates are transferred to the application.
  • After striking the screen, the laser beam is scattered into the space of the room.
  • An IR camera is constantly videoing the screen area and detects the points on the screen where the laser beams are being scattered from.
  • the camera has been pre-calibrated to know the exact area of the screen within its field of view, so that it may convert the light source's absolute position to its exact relative position on the screen (a sketch of this conversion appears below).
  • the camera then sends this input to the application.
  • part/all of the abovementioned processing may take place in the application on the controlled device rather than at the camera.
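  • The following is a minimal sketch of this absolute-to-relative conversion, assuming the calibration step yields the four screen corners in camera pixels (e.g. by flashing markers at the corners): a projective (homography) mapping then sends camera pixel coordinates to relative screen coordinates. The corner values below are illustrative assumptions.

```python
import numpy as np

def homography(src, dst):
    """Solve the 3x3 projective map taking the 4 src points to the 4 dst points."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)   # fix the last entry to 1

# screen corners in camera pixels (hypothetical calibration result) -> unit square
corners_px = [(102, 88), (530, 95), (540, 400), (95, 390)]
H = homography(corners_px, [(0, 0), (1, 0), (1, 1), (0, 1)])

def to_screen(x_px, y_px):
    u, v, w = H @ np.array([x_px, y_px, 1.0])
    return u / w, v / w                      # relative (0..1) screen coordinates

print(to_screen(320, 240))                   # a spot roughly mid-screen
```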
  • Two or more IR cameras are located on the ribs of the screen, constantly videoing the screen and sending their input to a Central Computing Unit (CCU). Then, using mathematical and trigonometric computations on the integrated input, the CCU detects the contact point between the laser beams and the screen, sending it to the PC application.
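  • A hedged sketch of this trigonometric computation, simplified to 2D geometry in the screen plane: each edge-mounted camera contributes a bearing toward the scatter point, and the CCU intersects the two rays. Camera placement and bearings below are illustrative assumptions, not the patent's numbers.

```python
import math

def intersect(cam1, angle1, cam2, angle2):
    """Intersect two rays (origin point, direction angle in radians) in the screen plane."""
    d1 = (math.cos(angle1), math.sin(angle1))
    d2 = (math.cos(angle2), math.sin(angle2))
    # solve cam1 + t*d1 == cam2 + s*d2 for t (2x2 linear system via cross products)
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    t = ((cam2[0] - cam1[0]) * d2[1] - (cam2[1] - cam1[1]) * d2[0]) / denom
    return cam1[0] + t * d1[0], cam1[1] + t * d1[1]

# cameras at the two top corners of a 100x75 screen, y increasing downward
spot = intersect((0, 0), math.radians(45), (100, 0), math.radians(135))
print(spot)  # (50.0, 50.0): the laser strikes mid-screen, 50 units down
```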
  • the screen as a whole is an optic sensitive screen, which may self-detect the striking point of laser beams at a given range of the electromagnetic spectrum.
  • the image is typically not projected onto the screen frontally, i.e. from the side of the user, but rather from behind (back projection).
  • the laser beam hits the screen, scatters in the space of the room, and is absorbed by the IR-camera that is positioned behind the screen close to the projector.
• the camera sees the point on the screen from where the beam emanates, and thus identifies where on the screen the user is pointing. Integration of mouse clicks, scroll messages and beam striking point into a controlled device's operating system, and provision of user feedback regarding her or his own actions: a module of the application typically receives some or all of the abovementioned inputs, including e.g.
  • Fig. 3 is a pictorial diagram of this process according to certain embodiments.
  • the Input device typically includes several light sources such as but not limited to some or all of the following:
• a. Infra red laser - may be positioned e.g. laterally on the finger between the first and second joints (e.g. middle of the finger) or parallel to the finger between the second and third joints (e.g. base of the finger).
  • Red Laser - may be positioned as described below for Infra red laser.
  • Infra red LED - may be positioned on the finger tip on the revolving piece. However, the source of the IR LED may also be positioned on any part of the finger.
  • the camera typically identifies the source of the IR LED exactly in the same manner as it identifies the point where the laser beam scatters as described above - the rest of the process is as described above.
• the camera cannot, of course, identify the actual red laser beam; the red laser instead merely highlights elements on the screen, without the ability to act upon those elements. Since the Input device is a wearable device, it constantly follows hand movement. Therefore, the user has the ability to choose when s/he wants it to influence the cursor position and when not.
• the control of the sources of light may be effected by one or both of two mechanisms, half-press and force-sensing, each described in detail below.
• the user may determine which light source each mechanism controls at any given moment, either from the application interface or from the Input device itself, e.g. using the Selection Controller.
  • the user may always choose to leave the light source on continuously in order to manipulate the pointer location at all times.
• Half Press Mechanism: each of the RF controllers is constructed in a two-press manner - half press and full press. The operational algorithm of this mechanism is described in Fig. 1e, which is a state chart of the half press mechanism.
• when the Button Up/Down event is implemented, it may be conveyed either by wireless communication or by a distinct encoding of the light source that is emitted from the Input device and received by the IR camera, which effects a system command when a specific control is pressed. In effect, all the information that is transferred by wireless communication may instead be transferred by a suitable encoding of the light source.
  • the revolving part at the finger tip has a specific type of force sensor such as but not limited to FSR.
• FSR Force Sensing Resistor
• when the force applied to the sensor exceeds a specific threshold, which may be user-configurable, the appropriate light source is turned on. After the applied force falls below the threshold or goes to zero, the light source turns off.
• Enable/Disable may be enacted with this mechanism using the pad of the finger or the area of the finger nail of the inner/crossing finger, respectively. The user may enable this in any suitable manner, such as but not limited to:
  • the Input device application may work in multiple modes of operation such as touch screen mode, mouse pad mode, and position only mode, e.g. as described below. The transition between modes may be enacted using the Selection controller of the Input device and from the interface of the application.
• App_Touch Screen: when the light source is turned on, the application positions the cursor in the appropriate place and executes a Left Click Down. When the light source is turned off, the application executes a Left Click Up in the exact place where the light source appeared before turning off. When the light source moves, the application executes a mouse movement.
• App_Mouse Pad: the light source is always implemented as a mouse movement, typically with exceptions such as but not limited to some or all of the following: i. Left Click: less than a second (or other user-selectable time period) has passed from the moment the light source was turned on to the moment it was turned off, and during this time period the light source did not move, even minutely.
• ii. Double Click / Drag and Drop: a small movement of the finger between the two clicks may nullify the Double Click implementation by the operating system. Therefore, each single click starts a timer. If the system identifies another click within this time, regardless of location, the second click may be implemented exactly in the same place as the first click. If the system identifies that the light turned on during this time without turning off, regardless of location, the system implements a Drag & Drop exactly from the position of the first click. App_Position Only: in this state, the light source is implemented as a mouse movement only and cannot execute any type of mouse click.
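By way of illustration only, the following minimal Python sketch shows one way the App_Touch Screen and App_Mouse Pad behaviors described above might be synthesized in software. The inject_* callback names are assumptions of this sketch, not part of the present description:

    import time

    CLICK_WINDOW_S = 1.0   # user-selectable period for a tap to count as a click

    class TouchScreenMode:
        """Light on -> Left Click Down; light off -> Left Click Up, same spot."""
        def __init__(self, inject_down, inject_up, inject_move):
            self.down, self.up, self.move = inject_down, inject_up, inject_move
            self.last = None

        def on_light(self, pos):
            if self.last is None:
                self.down(pos)            # light source just turned on
            elif pos != self.last:
                self.move(pos)            # light source moved
            self.last = pos

        def on_light_off(self):
            if self.last is not None:
                self.up(self.last)        # release where the light last appeared
            self.last = None

    class MousePadMode:
        """Always mouse movement; a brief, motionless on/off is a Left Click."""
        def __init__(self, inject_click, inject_move):
            self.click, self.move = inject_click, inject_move
            self.t_on = self.start = None
            self.moved = False

        def on_light(self, pos):
            if self.t_on is None:
                self.t_on, self.start, self.moved = time.time(), pos, False
            elif pos != self.start:
                self.moved = True
            self.move(pos)

        def on_light_off(self):
            if self.t_on is not None and not self.moved \
                    and time.time() - self.t_on < CLICK_WINDOW_S:
                self.click(self.start)    # short motionless tap -> Left Click
            self.t_on = self.start = None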
• Dots per inch (dpi): optionally, in the IR LED configuration, the user may reduce the dpi, using a keyboard for example, thereby enabling the user to pinpoint very small icons with greater ease.
  • An ejectable stylus option may be provided according to which a small stylus is connected to the Input device which protrudes slightly from the finger tip.
  • the connection to the finger tip may be executed by screwing, applying pressure or any other mechanical application. This mechanism enables the user to have maximum specificity at the point of contact with the screen for applications in which precision is important.
  • the stages of operation may include some or all of the following, suitably ordered e.g. as follows:
• a. the Stylus, because it protrudes, touches the area, and therefore pressure is applied;
• b. the base of the Stylus, which completely absorbs the pressure applied on the Stylus, pushes on the Force sensor, typically located on the revolving side of the finger, and effects an on/off of the IR LED;
• c. the light enters the stylus from one end connected to the bulb and exits only from the other side which contacts the screen, similar to the operation of fiber optics;
• d. the camera identifies the light source from the point of the Stylus.
• a Pressure differentiation feature is provided according to which the Input device communicates to the application the level of pressure put on the Force sensing mechanism.
  • the application may relate to the light sources differently based on the level of pressure applied. Thus the user may execute different actions based on how much pressure s/he puts on the surface.
• gentle pressure may execute mouse movement whereas more pressure would execute a Drag & Drop.
  • the mechanism may also be controlled via the interface of the application.
  • thresholds of different pressure points and/or the appropriate actions they perform may each be configurable.
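Purely as a hedged sketch of the pressure-differentiation idea above, configurable force thresholds might select the action to perform; the threshold values and action names below are illustrative assumptions, not taken from the present description:

    from typing import Optional

    PRESSURE_ACTIONS = [            # (minimum normalized force, action)
        (0.10, "mouse_move"),       # gentle pressure: mouse movement
        (0.55, "drag_and_drop"),    # firmer pressure: drag & drop
    ]

    def action_for_pressure(force: float) -> Optional[str]:
        """Return the action configured for the highest threshold reached."""
        chosen = None
        for threshold, action in PRESSURE_ACTIONS:
            if force >= threshold:
                chosen = action
        return chosen

    assert action_for_pressure(0.30) == "mouse_move"
    assert action_for_pressure(0.80) == "drag_and_drop"
    assert action_for_pressure(0.05) is None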
• a Red Laser feature is provided according to which the device switches to a laser-pointer mode, emitting, instead of an IR laser detectable by the IR camera, a red laser in the spectrum visible to the human eye. Using this mode the user may switch from controlling the mouse to highlighting important features on the screen, similar to the way a regular laser-pointer works. It is important to note that although the user then has no effect on the cursor position, the other functionality of the Input device, such as clicking and scrolling abilities, continues to work regularly.
• a Multi-touch feature is provided according to which an Input device worn on one or more fingers of both hands enables the user to interact with the screen using a complete multi-touch ability. Different encoding of the light sources enables the application to associate each source of light with the appropriate Input device and hence, from the user's perspective, with the appropriate hand or finger.
  • Four Way Scrolling enhanced input is provided.
  • the user may effect 4-way scrolling whereby one hand is responsible for horizontal scrolling and the other hand for vertical scrolling, without constantly moving the cursor back and forward.
  • This is useful in programs like Excel, drawing boards and substantially any application where the file is larger than the visible screen size.
  • the user may determine the role of each hand and finger.
• when the system receives events from different devices, or hands/fingers, simultaneously, it may alternate passing them to the system, passing an event from each device alternately until all events have been passed, e.g. as in the sketch following this list.
  • the outcome is circular scrolling on the screen that is completely open and free.
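A minimal sketch of the alternation just described, assuming per-device event queues (the event strings are illustrative):

    from itertools import chain, zip_longest

    def interleave(per_device_events):
        """Round-robin merge of event queues, one event per device per turn."""
        _SKIP = object()
        rounds = zip_longest(*per_device_events, fillvalue=_SKIP)
        return [e for e in chain.from_iterable(rounds) if e is not _SKIP]

    left = ["scroll_h:+1", "scroll_h:+2"]                     # horizontal hand
    right = ["scroll_v:-1", "scroll_v:-2", "scroll_v:-3"]     # vertical hand
    print(interleave([left, right]))
    # ['scroll_h:+1', 'scroll_v:-1', 'scroll_h:+2', 'scroll_v:-2', 'scroll_v:-3']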
  • enhanced remote text input is provided.
• the Input device enables effective use of virtual keyboards whereby the user may type freely with two or more fingers in a much more intuitive way and also use shortcuts such as but not limited to Ctrl+C or Windows+E.
  • programmable gestures are provided.
• the Input device application includes several common gestures such as the @ sign to open a mail application, or the V sign to promptly open and close a virtual keyboard. This option typically also enables the user to create an unlimited number of personalized gestures, each of which may execute an action commonly used by the user. In addition, an engine may be provided that alerts the user, and in certain cases blocks the user, from creating a new form that is too similar to a form already in use, thereby reducing recognition errors in the system.
• a suitable actuation of the buttons, e.g. a double right click, notifies the application that the user is requesting to switch to Gesture Command mode.
  • the application gives feedback to the user that s/he is now in Gesture Command mode.
  • Embedded Flash Memory e.g. Disk on Key
• the Input device has internal flash memory such that it serves as a disk-on-key as well. Transferring information may be executed by a number of methods, including using a USB port, using the Input device docking station e.g. as described below, WiFi and other connection methodologies.
  • the internal memory of the Input device enables the controlled environment to immediately load the user's preferences including for example gestures that s/he's created, dpi preferences etc.
  • an embedded speaker and microphone is provided.
  • the speaker is located on the rotating finger tip apparatus while the microphone may be placed anywhere.
• the Input device may serve a communications function, e.g. as a Bluetooth earpiece, where the user places her or his finger wearing the Input device next to her or his ear.
  • the Input device is a personal control mechanism and is not intended for a specific work environment like a personal computer.
• the Input device may control any environment that supports it, e.g. similarly to the communication between a Bluetooth device and multiple cellular phones.
  • the user may go from device to device, click on her or his unique user-ID and start interacting with it.
  • social interfacing is provided.
• the IR camera and the Input device control application may handle several light sources simultaneously and therefore enable several users to share work space in the same environment. Due to the fact that the Input device is a personal control device, the system may associate sources of light with different users and therefore relate to each of them differently. Different encoding of the light source for each user, in addition to the wireless communication, may accurately identify each user at the time of the social interfacing. This individual identification enables management of different interactive permissions for every user. In addition, groups may be established, enabling permissions to be switched between sets of people rather than individuals.
• the Input device utilizes the hand's and wrist's natural position.
  • Traditional mice are designed around the "palm down" posture that has been proven to contribute to Repetitive Stress Injury (RSI) and Carpal Tunnel Syndrome (CTS).
  • RSI Repetitive Stress Injury
  • CTS Carpal Tunnel Syndrome
  • the Input device applies a clinically proven technique to relieve pressure from the median nerve in the wrist which is similar to the 'Hand shake position'. This position has been shown to reduce muscle load and the incidence of discomfort associated with RSI and CTS.
  • the Input device is available in several sizes for maximum comfort to the user.
  • the differences in size may be around the circumference and the length of the Input device.
  • the Input device body may be made of a combination of flexible polymers with memory shape which may easily conform themselves to the user's physical dimensions.
  • bidirectional synchronization is provided.
• Bidirectional communication between the controlled environment and the Input device enables automatic synchronization between the settings of the application and the internal settings of the Input device, and vice versa. For example:
  • a change in the work modes of the application may cause the active light source to change as well.
  • the mechanism of the bidirectional synchronization optionally enables the system to pre-define the users' modes of operation such that any changes in one direction may automatically synchronize in the other direction.
  • the user may always add additional operational modes.
  • Settings for the Input device may include setting one of, say, two mechanisms for turning on the light source, e.g. Half Press Mechanism and Force Sensing Mechanism as described herein; setting one of, say, three Light Sources (IR LED, IR Laser and Red Laser); and setting one of, say, three modes of operation in the application e.g. App_Touch Screen, App_Mouse Pad and App_Movement Only. Examples of available modes of operation for the Input device and their settings may include but are not limited to the following:
• a. Touch Screen: Force Sensing Mechanism + IR LED + App_Touch Screen
• b. Mouse Pad: Force Sensing Mechanism + IR LED + App_Mouse Pad
• c. Remote Control: Half Press Mechanism + IR Laser + App_Movement Only
• d. Nearby Control (also referred to as 'KeyTop' control, e.g. as described below): Half Press Mechanism + IR LED + App_Movement Only
• e. Red Laser: Half Press Mechanism + Red Laser + App_Movement Only
• the IR camera may be mounted on the projector facing the screen, as shown in Fig. 1c.
  • the user has full control, from anywhere in the area and may also control the projected screen with multitouch. This capability typically is provided regardless of whether the projection is from the rear or the front e.g. due to the mechanism that revolves at the fingertip and/or due to the use of lasers.
• this architecture works with any projector, such as pico projectors, whether standalone or embedded in another device.
  • Physical screens Control of physical (also termed herein "tangible") screens, such as but not limited to LCD or plasma, may, if desired, be effected in exactly the same manner as projected, non-physical screens.
  • the IR-camera is positioned such that it is facing the user. In this manner, the user may control any physical screen such as but not limited to IPTV, home theatre, and media center.
  • This mode enables the user to control desktop and laptop computers.
• the IR camera is located at the upper rib of the computer screen, facing down towards the user's hand (typically lying on the keyboard). The user uses both hands to work with the keyboard, and when s/he wishes to move the mouse cursor or perform any other mouse operation such as clicking or scrolling, s/he does not need to lift her or his hand from the keyboard and hold the mouse - s/he uses the Input device while her or his hands are still on the keyboard.
• the IR camera located on the computer's screen detects the IR signals being emitted from the Input device and sends them to the PC application module on the controlled computer, which transforms them into mouse movements on the screen, e.g. as shown in Figs. 1g and 1i.
  • the user may have full control of the keyboard and mouse while both her or his hands remain on the keyboard.
  • Advantages of certain embodiments include but are not limited to some or all of the following: a. Effectiveness - the user's work rate increases as the move from mouse to keyboard requires zero setup time. Many users are wary of the constant move of hands from mouse to keyboard and work with one hand on the mouse and one hand on the keyboard. This approach suffers from limitations because keyboards are planned for two-hand use.
  • the Input device may transform into a completely independent unit, communicating with the laptop without the need for external devices.
  • the wearable input device typically is operative to turn any surface into a fully operative multi-touch screen utilizing a finger touch control method, e.g. as follows:
  • the apparatus of the present invention can, according to certain embodiments, convert a 200 inch wall sized surface into an enormous interactive multi-touch screen.
  • a particular advantage of certain embodiments of the Input device architecture is that little development is required for future products that are to employ the input device shown and described herein. Typically, only two adjustments are made:
  • the IR camera as described herein is embedded in the Head-up display and its code adapted to the HUD's operating system.
  • a docking station is provided, e.g. as shown in Figs. 2a - 2d.
  • the roles of the docking station typically include some or all of:
  • a preferred method for using the Input device shown and described herein in conjunction with a Docking Station typically includes the following steps, suitably ordered e.g. as follows:
  • Wearing the device typically comprises:
  • Removing the Input device typically comprises:
• a. the user inserts her or his finger into the docking station while the Input device is on the finger;
• b. the user rotates her or his finger 90 degrees clockwise;
• c. the user takes the finger out of the docking station, leaving the Input device inside.
  • Figs. 4a - 4b are isometric views of a wearable "sleeve" device for generating computerized input to electronic devices, according to certain embodiments of the present invention.
  • many of the components may be located anywhere on the sleeves without adversely affecting functionality and only the following components of the apparatus of Figs. 4a - 4b are location sensitive, e.g. as shown: Horizontal LED (light emitting diode) 10; Vertical LED 20; IR (Infra red) laser 30, Red laser 40, Touch scrolling bar 50, buttons 60 and 70, State selection actuator 80 and Force sensing actuator 90.
  • Figs. 4c - 4e are pictorial illustrations of the device of Figs. 4a - 4b as mounted on and manipulated by a user's hands, according to certain embodiments of the present invention.
• Fig. 5a illustrates an exemplary plurality of work modes according to which the sleeve device of Figs. 4a - 4b may operate; which work modes are typically selectable as alternative states, using the state selection actuator 80 of Figs. 4a - 4b.
• the dimensions along which work modes differ may include one or both of: a. Active light source dimension of the sleeve's work mode: as described above, the sleeve device may have some or all of the following 4 light sources: IR Laser 30, Red Laser 40, Horizontal LED 10 and Vertical LED 20.
  • the active light source is typically triggered by the force sensing actuator 90.
  • the sleeve device typically includes a software application which is typically resident on the controlled host e.g. computer or mobile communication device being controlled by inputs generated using the input generating system shown and described herein.
  • the software application typically has several selectable operating modes, e.g. as shown in the right-most column of the table of Fig. 5a, which differ from each other as to, e.g. input interpretation and operating method.
  • Each work mode is typically predesigned to control one or more environments such as one or more of the following controlled environments: projected screen environment (front or rear), physical screen environment, laptop/desktop computer environment, mobile devices with embedded pico projector and interactive surfaces environment.
• the table of Fig. 5a summarizes seven exemplary work modes of the state selection actuator 80, some or all of which may be provided.
  • Controlled Environments projected screen environment, physical screen environment, laptop/desktop computer environment, and interactive surfaces environment controlled by the sleeve of Figs. 4a - 4b in accordance with a suitable work mode e.g. as shown in the table of Fig. 5a, are now described in detail.
• the selfsame input generating system, typically including the sleeves of Figs. 4a - 4b, the sensor of Figs. 1c, 1g, 1i, 5b or 5c, and the software application described below with reference to Figs. 6a - 6d, may be used to control multiple interactive environments such as but not limited to the four environments described below.
  • the software application described below with reference to Figs. 6a - 6d may interact with more than one sensor such that the user has the option of connecting an additional sensor to gain control of another environment simultaneously.
  • Suitable work modes for the Projected screen environment may include the following work modes from the table of Fig. 5a: 1 (Touch interaction), 2 (Remote projected screen interaction) or 5 (Power point interaction).
  • the sleeve device of Figs. 4a - 4b works with any projector including pico projectors whether they are standalone or embedded in another device (e.g. mobile phone), e.g. as shown in Fig. 5b.
  • Suitable work modes for the Physical screen environment may include the following work modes from the table of Fig. 5a: 3 (Remote tangible screen interaction) or 5 (Power point interaction).
  • the user may control any computer based application such as but not limited to an IPTV or a media center.
• when the sensor is pointing at the screen, from anywhere in the room, the user may perform touch interaction with the screen using work mode number 1 (Touch interaction) of Fig. 5a.
  • a suitable work mode for this environment may include the following work mode from the table of Fig. 5a: work-mode 4 (Laptop/Desktop computer interaction).
  • the sensor may be mounted on an adjustable pivot which enables the user to raise it slightly and control the computer screen using work mode number 3 (Remote tangible screen interaction) e.g. as in environment 2 (Tangible screens).
  • work mode number 3 Remote tangible screen interaction
  • this environment employs optic communication between sleeve/s and sensor.
  • a suitable work mode for this environment may be the following work mode from the table of Fig. 5a: work mode 6 (Surface interaction).
• the input generating system shown and described herein typically provides communication channels between the following components: sleeves of Figs. 4a - 4b, software application as described below with reference to Figs. 6a - 6d, and a sensor, e.g. the IR sensor mounted on a projector in Fig. 1c, on a laptop computer in Figs. 1g and 1i, and on a mobile-communication-device-mounted pico-projector in Fig. 5b.
  • the communication channels may for example comprise:
• when the user changes the sleeve device's work mode using the state selection actuator 80, the application software receives a respective event which changes the application's operating mode according to the table of Fig. 5a. However, if the sensor is static and permanently used for a specific environment, the user may notify the application software to always address that specific sensor with the suitable operation mode, e.g. as described below with reference to Figs. 6a - 6b.
  • Each of the sleeve device's buttons 60 and 70 is able to trigger multiple events transmitted to the application software. Examples for such events include some or all of the following:
  • the sleeve device's touch scrolling bar 50 is able to trigger multiple events transmitted to the application software. Examples for such events include some or all of:
• Each of the events transferred to the software application may contain background information regarding the user, such as but not limited to her or his identity and her or his sleeve device's current operating mode.
• aspects of the sleeve device's sleeves may be controlled and modified directly from the software application. Examples of this type of communication include but are not limited to: changing the sleeve device's work mode; embedding and modifying the user's ID and preferences in the sleeve's memory; and controlling the coding of the light sources in order to enable the sensor to distinguish between users, hands and light sources on the sleeve itself.
• Optics communication channel between the sleeves of Figs. 4a - 4b and the sensor: the optics communication between the sleeves of Figs. 4a - 4b and the sensor is now described.
  • the optics communication between the sleeves and the sensor may be such that the sensor is pointing at the controlled screen, e.g. as in environments 1 and 4 in the table of Fig. 5a.
  • the optic communication between the sleeves and the sensor may be such that the sensor is pointing at the sleeve device's sleeves which are mounted on the user's fingers e.g. as in environments 1, 2, 3 in the table of Fig. 5a.
  • the sensor constantly monitors the projected screen area.
• the user applies pressure to the force sensing actuator 90, which triggers the sleeve device's IR Laser according to the sleeve device's selected operating mode.
  • the IR Laser 30 is then emitted from the sleeve device's sleeve and impinges upon the projected screen at a given point. After impinging upon the screen, the laser beam is scattered into the space of the room.
  • the sensor detects the points on the screen where the laser beams are being scattered from.
• the surface has a rear projection unit located at the bottom of the surface which projects onto the surface at a very acute angle. This obviates any need for a distance between the screen and the projecting unit, thereby allowing the surface to be a 'flat unit' as opposed to conventional rear-projecting setups which require considerable distance between the screen and the projecting unit. Also, because the screen is projected onto the surface, light may pass through it, as opposed to physical screens like LCD and plasma. This enables the IR sensor, also located behind the screen next to the projecting unit, to "see" through the screen.
• the user pinches the surface, e.g. as shown in Figs. 4d - 4e, typically adjacent a desired screen location, thereby applying pressure to the force sensing actuator 90 which triggers the sleeve device's IR laser according to the sleeve device's current operating mode.
  • the IR Laser 30 is emitted from the sleeve device and impinges upon the surface at the pinching point. After striking the screen, the laser beam is scattered into the space of the room behind the surface due to its transparent character as described above.
• the rear sensor detects the points on the screen from which the laser beams are scattering.
• Figs. 7a - 7b are front and back isometric illustrations of a flat surface with rear projection configuration, characterized by projection at a very acute angle such that projection may be from very close, in conjunction with a rear sensor whose field of view typically comprises the entire screen.
  • the 4 dotted line segments emanate from, and signify a suitable "field of view" for, both the camera and the projector which as shown includes the entire screen.
  • the sensor constantly monitors the screen.
• the user applies pressure to the force sensing actuator 90, which triggers one of the sleeve device's IR LEDs according to the sleeve device's current operating mode.
• the light is then emitted from the IR LED and absorbed by the sensor, which detects the exact position of the user's fingertip.
  • the sensor constantly monitors the space of the room.
• the user applies pressure to the force sensing actuator 90, which triggers one of the sleeve device's IR LEDs according to the sleeve device's current operating mode. Light is then emitted from the IR LED and absorbed by the sensor, which detects the exact position of the user's fingertip.
  • the sensor constantly monitors the keyboard area or the space in front of the screen e.g., if the sensor is pivot-mounted as described above, according to its pivot direction.
• the user applies pressure to the force sensing actuator 90, which triggers one of the sleeve device's IR LEDs, according to the sleeve device's current operating mode.
• Light is then emitted from the IR LED and absorbed by the sensor, which detects the exact position of the user's fingertip.
  • the light sources may be coded, e.g. digital coding using discrete on/off, or analog coding using continuous power alternating, in a predetermined manner which enables the light sources to send background information to the sensor regarding the user with which the light sources are associated.
  • the background may for example include the user's identity, her or his sleeve device's current operating mode, and the specific light source currently being used by the user's sleeve device.
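As a hedged illustration only of the digital on/off coding just described, background information might be packed into a short frame of light-source states as follows; the frame layout and field widths here are assumptions chosen for the example:

    def encode_frame(user_id, mode, source):
        """Pack user (4 bits), work mode (3 bits) and light source (2 bits)
        into a start-marker-prefixed sequence of on/off states."""
        assert 0 <= user_id < 16 and 0 <= mode < 8 and 0 <= source < 4
        payload = (user_id << 5) | (mode << 2) | source
        bits = [(payload >> i) & 1 for i in reversed(range(9))]
        return [1, 1, 0] + bits          # '110' marks the start of a frame

    def decode_frame(bits):
        assert bits[:3] == [1, 1, 0], "missing start marker"
        payload = 0
        for b in bits[3:]:
            payload = (payload << 1) | b
        return payload >> 5, (payload >> 2) & 0b111, payload & 0b11

    assert decode_frame(encode_frame(user_id=5, mode=2, source=1)) == (5, 2, 1)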
  • the active light source turns on according to the current work mode. This results in the sleeve device communicating with both the sensor and the application software concurrently which facilitates, for example, remote drag & drop.
  • the sensor constantly processes the data it receives including light source position and background information and sends the output to the software application.
  • every aspect of the sensor may be controlled and modified directly from the software application such as but not limited to any or all of the following: modifying the sensor's state - on, off, standby; creating initial communication with the sensor, and adjusting the sensor's sensitivity.
  • the system of the present invention includes a wearable input device, which is in two-way data communication with a controlling module, typically comprising a software application, which also is in two-way data communication with a sensor as described herein.
• the sensor in turn is typically in one-way data communication with the wearable input device.
  • the controlling software application sends commands to an operating system of a controlled entity such as but not limited to a laptop, personal or other variety of computer (in which case the operating system may for example be a Windows operating system), or a cellular telephone or other variety of portable communication device (in which case the operating system may for example be a Symbian or Google Android operating system).
  • the software application receives input from the wearable input device/s and from the optic sensor e.g. IR camera and sends control commands, responsively, to the operating system of the controlled entity and, typically, also to the optic sensor (e.g. wake up commands) and/or to sleeve itself (e.g. change work mode commands).
  • the optical data is, for simplicity, represented by infra red (IR) light and the wireless data is, for simplicity, represented by radio frequency (RF) communication.
  • Fig. 6a is a simplified block diagram illustration of interactions between various components of the input-generating system, and the input the software application receives from the hardware (sensor or wearable input device), according to certain embodiments of the present invention.
  • the inputs may include some or all of the data set out in the table of Fig. 6B.
  • event type refers to the type of application operating mode-changing event received by the software from the sleeve, via the wireless communication channel between Sleeves of Figs. 4a - 4b and software application resident on controlled host, as described above.
  • the hardware layer is responsible for receiving the raw data, processing it and sending it to the application layer.
  • the application layer might use DLL files or drivers in order to read the data from the hardware.
• once the inputs have been received by the hardware layer, they are typically transmitted into the application layer.
  • One way of implementing this, for example, is by using two threads that are continuously running in parallel and reading the inputs: one thread reads the IR data sent by the IR Sensor and one reads the RF data sent by the wearable input device itself.
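A minimal sketch of that two-thread structure follows; the read_ir()/read_rf() stubs are hypothetical stand-ins for the DLL or driver calls that would actually deliver the hardware data:

    import queue, threading

    events = queue.Queue()

    def read_ir():   # would block on the IR sensor driver; stubbed out here
        return None

    def read_rf():   # would block on the wearable device's RF receiver
        return None

    def reader(source_name, read_fn):
        while True:
            data = read_fn()
            if data is None:                  # hypothetical end-of-stream marker
                break
            events.put((source_name, data))   # hand the input to the app layer

    for name, fn in (("IR", read_ir), ("RF", read_rf)):
        threading.Thread(target=reader, args=(name, fn), daemon=True).start()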
• once the application receives the inputs from the hardware layer, it decides what to do with the information.
• the action to execute may be decided using two factors:
  • Provision of a plurality of application modes is optional yet advantageous. Using different application modes allows the system to run several forms of user interaction. For example, in the 'normal mode' (default), when receiving an IR event, the application may change the position of the operating system's cursors. In a different mode, when viewing maps using Google Earth, the IR movements may be translated into map movement, which is done using APIs. The same may be done with the RF inputs.
  • the application is aware of its mode, and functions according to the method of operation which is associated with that mode.
  • the application may change modes in certain situations, such as but not limited to:
  • Wearable input device has been pressed.
  • Example application modes include:
  • Normal (default) mode may be used in order to control one or more of the applications.
  • the IR and RF inputs may control the operating system's cursors movement and clicks.
  • This mode typically incorporates various "Operation Modes" which control the cursors differently, such as but not limited to some or all of the following operation modes each described in detail below: absolute position operation mode, touch operation mode, and mouse-pad operation mode.
  • Absolute Position After retrieving the information from the IR Sensor, the application converts IR positioning values quantifying positions relative to the sensor's field of view into computer screen positioning values relative to the screen position, size and resolution, e.g. as per the calibration method described below.
• having converted the IR position into screen coordinates (X1, Y1), the application commands the operating system to move the relevant cursor to the exact position (X1, Y1); when using Windows, for example, it may use a Windows API called SetCursorPos.
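A hedged sketch of this absolute-position conversion: sensor field-of-view coordinates are scaled into screen pixels and handed to the Windows SetCursorPos API mentioned above. The sensor resolution constants are assumptions, and the snippet is Windows-only by construction:

    import ctypes

    SENSOR_W, SENSOR_H = 1024, 768       # assumed IR sensor coordinate range

    def to_screen(ir_x, ir_y, screen_w, screen_h):
        """Linear conversion from sensor coordinates to screen pixels."""
        return int(ir_x * screen_w / SENSOR_W), int(ir_y * screen_h / SENSOR_H)

    def move_cursor(ir_x, ir_y):
        user32 = ctypes.windll.user32
        sw = user32.GetSystemMetrics(0)  # SM_CXSCREEN: screen width in pixels
        sh = user32.GetSystemMetrics(1)  # SM_CYSCREEN: screen height in pixels
        user32.SetCursorPos(*to_screen(ir_x, ir_y, sw, sh))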
  • mouse clicks may be effected using the RF signals (Wearable input device button clicks). For example - the sequence 'Right hand left click down' + 'Right hand left click up' may implement a mouse click at the right cursor's current position.
  • Touch - This mode of operation uses the same cursor movement processing as does the Absolute position mode of operation described above. However, in Touch operating mode, mouse clicks may also be triggered by the IR input, in addition to triggering by RF input as described with reference to the absolute position operation mode.
  • the application monitors the number of IR sources found by the IR Sensor.
  • All of the above may be implemented for more than a single light source, typically limited only by the number of light sources the sensor may detect simultaneously.
• An example of a use case in which more than one light source may be usefully employed is controlling a multi-touch application (e.g. Windows 7).
• Mouse pad: the purpose of this mode of operation is to simulate a mouse pad using the IR information from the IR Sensor of Figs. 1c, 1g, 1i, 5b or 5c. This typically includes moving the cursor relative to the cursor's current location instead of to an absolute location. To do so, the application waits until a new IR source appears, and saves some or all of the following values:
  • the application typically calculates the vector movement of the IR source (distance and direction), translates it into desired vector movement of cursor position (A2, B2) and moves the cursor accordingly.
  • the user may define a measurement conversion unit between the two. Setting this value may allow the user to move the cursor faster or slower, balancing speed with precision. For example - When setting this value to 2, a 10 inch movement of the IR source in space may trigger a 20 inch movement of the cursor on the controlled screen.
  • the application may simulate mouse clicks by reading the RF signals (Wearable input device button clicks) or by detecting an IR source that has been activated for a short period of time (less than a second, for example), just like in a tangible mouse pad.
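A hedged sketch of the mouse-pad operating mode just described: the cursor moves by the vector the IR source travelled, scaled by the user's conversion ratio. The get_cursor()/set_cursor() names are hypothetical stand-ins for operating-system calls:

    class MousePadTracker:
        def __init__(self, get_cursor, set_cursor, ratio=2.0):
            self.get_cursor, self.set_cursor = get_cursor, set_cursor
            self.ratio = ratio        # e.g. 2: a 10 inch hand move -> 20 inches
            self.last = None          # last IR position; None while light is off

        def on_ir_point(self, x, y):
            if self.last is not None:
                dx, dy = x - self.last[0], y - self.last[1]
                cx, cy = self.get_cursor()
                self.set_cursor(cx + dx * self.ratio, cy + dy * self.ratio)
            self.last = (x, y)        # a new IR source: record its position only

        def on_ir_lost(self):
            self.last = None          # next appearance starts a fresh gesture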
  • Fig. 6c is a table setting out example results of various Operation Modes within the Normal Application Mode when an IR source is detected.
  • Calibration is the process of defining the relation between the IR sensor's spectrum (scope of view) and the screen.
  • the Application shows visual indications on the screen/surface.
• the user signals that indicator's location to the IR sensor.
• once the application receives the IR data, it has 2 sets of data: coordinates of the indicators on the screen {(A1, B1), (A2, B2), etc.} and coordinates of the sensor's scope from which the light has been received accordingly {(X1, Y1), (X2, Y2), etc.}.
  • the application uses trigonometric computation in order to convert the sensor's raw coordinates into screen coordinates.
  • the calibration computations also typically handle the direction of the conversion. For example, the left and right points at a sensor located behind the screen (i.e. environment number 4 - surface interaction) are exactly opposite to those retrieved by a sensor located in front of the surface (i.e. environment number 1 - projected screens).
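By way of illustration, a simplified two-point calibration sketch follows; it is one simple linear scheme, not necessarily the trigonometric computation referred to above. A sensor mounted behind the screen simply yields a negative x scale, which flips the direction as described:

    def make_calibration(screen_pts, sensor_pts):
        """Derive a per-axis linear map from two screen indicators (A1,B1),
        (A2,B2) and the sensor coordinates (X1,Y1),(X2,Y2) that saw them."""
        (a1, b1), (a2, b2) = screen_pts
        (x1, y1), (x2, y2) = sensor_pts
        sx = (a2 - a1) / (x2 - x1)   # negative when the sensor is behind the screen
        sy = (b2 - b1) / (y2 - y1)
        def to_screen(x, y):
            return a1 + sx * (x - x1), b1 + sy * (y - y1)
        return to_screen

    convert = make_calibration([(0, 0), (1920, 1080)], [(900, 100), (100, 700)])
    print(convert(500, 400))         # (960.0, 540.0): mid-sensor -> mid-screen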
• Application mode: activated when an application that may use the Wearable input device's inputs in a special way, such as but not limited to the Google Earth application, is opened.
• since Google Earth may receive various kinds of inputs, such as map moving, zoom in/out, pan and tilt, the optic and wireless inputs received by the software application are converted into meaningful commands to Google Earth, as opposed to merely sending cursor movements and clicks to the operating system.
  • application mode number 1 e.g. Google Earth
• the software typically distinguishes between some or all of the following 3 sub-states (a) - (c), which differ from one another e.g. in their input interpretation: sub-state a - One IR Point, Moving the Map: when the sensor detects that only one IR source is moving, the application orders Google Earth to move the map according to the distance and direction of the IR movement.
  • sub-state b -Two IR Points, Rotate and Zoom When the sensor detects two IR points AND the user has entered the rotate and zoom sub-state, the application orders Google Earth to move and rotate the earth concurrently, e.g. as illustrated by the example set out in Fig. 6d.
  • sub-state c - Two IR Points, Pan and Tilt When the sensor detects two IR points AND the user has entered the pan / tilt sub-state, the application orders Google Earth to Pan and Tilt the earth in space, rather than rotating and zooming as in sub-state b.
  • Examples of other special applications which may trigger a certain application mode include but are not limited to specific games where controlling the game is not achievable by conventional movement and click events, and advanced applications which employ complex interaction.
  • the software typically provides individual support for each such application.
  • Gesture mode A "gesture mode" may be provided which is used to activate a specific operation based on a pre-recorded path of IR inputs.
• the Gesture Mode may include one or both of the following two sub-states: a. Definition sub-state: the user defines an IR path which the application records, and adds an action to this path. For example:
  • a first, e.g. V sign may open a specific virtual keyboard software
  • a second, e.g. @ sign may open a mail application
  • a third, e.g. 'e' sign may open a browser application etc.
• b. Execution sub-state: the user decides that s/he wants to execute a gesture. S/he triggers the application to enter the execution sub-state, e.g. by holding down one button of the wearable input device for at least a predetermined period, e.g. a few seconds. The user then moves one or both of her or his hands to create the relevant gesture path. The application then compares this path to the internal gesture database and, when finding a match, runs the desired action.
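One simple way (an assumption of this sketch, not necessarily the application's actual matching algorithm) to compare an executed IR path against the recorded gesture database: both paths, assumed pre-resampled to N points, are scaled to a unit box and the entry with the smallest mean point-to-point distance wins, if under a threshold:

    import math

    N, THRESHOLD = 32, 0.25

    def normalize(path):
        """Scale an N-point path to fit a unit box anchored at its minimum."""
        assert len(path) == N
        xs, ys = [p[0] for p in path], [p[1] for p in path]
        w = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
        return [((x - min(xs)) / w, (y - min(ys)) / w) for x, y in path]

    def mean_distance(a, b):
        return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

    def recognize(path, gesture_db):
        """gesture_db: list of (recorded_path, action); returns action or None."""
        path = normalize(path)
        scored = [(mean_distance(path, normalize(g)), action)
                  for g, action in gesture_db]
        best, action = min(scored)
        return action if best < THRESHOLD else None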
• the Wearable input device typically includes a pair of wearable input devices which are enantiomers, i.e. mirror images of each other, such that they may be worn on both hands; the IR Sensor may send positions of a plurality of IR points to the application simultaneously, and the software application is designed to handle multiple IR inputs concurrently.
  • the standard multi-touch API may be employed in order to control default multi-touch behavior.
  • a multi touch environment may be simulated.
• multi-touch painting software may be created that may display several simulated cursors on the screen based on the detected IR sources. Since the RF and IR data typically include a unique identifier of the Wearable input device (e.g. user ID plus identification of hand as either left or right), the application may associate the unique identifier of each Wearable input device with the cursor it operates, even when the screen is operated by more than one user. When receiving an RF event (left click up, for example, also containing the Wearable input device's unique identifier), painting occurs on the screen using the appropriately corresponding cursor and without affecting other users' cursors.
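A minimal sketch of that per-device cursor routing; the class and event names are illustrative assumptions:

    class MultiUserCanvas:
        def __init__(self):
            self.cursors = {}                    # device id -> (x, y)
            self.strokes = []                    # (device id, x, y) paint events

        def on_ir_move(self, device_id, x, y):
            self.cursors[device_id] = (x, y)     # move only this user's cursor

        def on_rf_click(self, device_id):
            if device_id in self.cursors:        # paint at that cursor alone
                x, y = self.cursors[device_id]
                self.strokes.append((device_id, x, y))

    canvas = MultiUserCanvas()
    canvas.on_ir_move(("alice", "right"), 10, 20)
    canvas.on_ir_move(("bob", "left"), 30, 40)
    canvas.on_rf_click(("alice", "right"))       # only Alice's cursor paints
    print(canvas.strokes)                        # [(('alice', 'right'), 10, 20)]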
  • user preferences are accommodated by the software application. Some or all of the variables that define the behavior of the application in specific modes may be saved in the software application's internal memory. However, each and every one of them may be saved per a specific user, specific sensor or any combination between the two. Examples include:
  • Conversion ratio - a certain user defines that whenever s/he interacts with the system using the 'mouse pad' operating mode described herein, regardless of the controlled environment or the sensor being used, s/he always wants to use a conversion ratio size of, say, 4.
• Gestures - a certain user may define that when s/he interacts with sensor A s/he wants a particular gesture V to do something, but when s/he interacts with sensor B s/he wants the gesture V to do something else.
  • the user may also define different gestures on each sensor.
  • Fig. 8a is a simplified block diagram of hardware components of a wearable input device constructed and operative in accordance with certain embodiments of the present invention.
  • three light sources are provided which emit light at a certain wavelength and optical power in order to activate (or not, in case of the red laser) the sensor.
  • the IR laser diode emits light at a wavelength of at least 800nm where the wavelength corresponds to the sensor filter frequency response.
  • the laser diode typically includes a built-in photo diode which absorbs light emitted from the laser and may be interfaced to a feedback circuit to monitor and indicate the laser activation.
  • the IR laser diode may for example comprise a 5.6mm diameter diode cased in a TO-18 footprint as shown in Figs.
  • a specific laser collimator may be mounted on top of the diode which typically matches the 5.6mm diode and may include a holding tube with a locking ring and a 4mm plastic lens.
• Laser diode driving circuits: an example implementation thereof is shown in Fig. 8e.
  • the IR laser and RED laser may be driven using the same method.
  • a current source controlled by an MCU may be used to supply a steady current to the laser diode.
  • Trimmers R13 and R17 may be used to calibrate this current for the IR laser and red laser respectively, so as to calibrate each laser's current flow in order to control the optical emitted power by the lasers.
  • Signals "SHTD_IR_LASER” and “SHTD_RED_LASER” are used to turn on the current sources supplying current to the laser diodes.
  • the "LS_VOLTAGE_SHT” signal turns on/off the voltage supply to the lasers. This method assures a very low current consumption in the off state, when none of the lasers is turned on, and in the on state too by turning on only one circuit at a time (either the red laser circuit or IR laser circuit).
  • Suitable Feedback circuitry for the IR laser is shown in Fig. 8f.
  • almost every laser diode is typically supplied with a photodiode which generates current upon laser activation, so as to provide a feedback circuit to trigger a LED each time the laser emits.
  • the example circuit illustrated in Fig. 8f samples the current generated from the IR laser photodiode and converts it to measurable voltage.
  • a yellow LED mounted on the control's unit panel is turned on when the IR laser is emitting.
• the red laser emits light at a wavelength visible to the human eye and is outside the sensor filter's frequency response.
  • the red laser may have a 5.6mm diameter case in a TO-18 footprint.
  • the IR LED emits light at a wavelength of at least 800nm, however, unlike the IR laser, the IR LED light is projected directly to the sensor.
• the radiant power of the LED should be at least 20mW, to allow the sensor to detect it.
• the LED may comprise a 940nm LED device with a typical radiant power of 20mW and may have an outline of 3mm and a length of 3mm.
  • the relative radiant intensity vs. angular displacement of the LED is illustrated in Fig. 8g.
  • the axes of the graph of Fig. 8g include a normalized radiant intensity from 0-1 and the emitted light angle from -90° to +90°.
  • the parabolic lines mark the intensity axis and the straight lines mark the angles.
  • each spot on the curve has a parabolic line leading to an intensity reading and a straight line leading to an angle reading. For example pointing at the 0.4 reading and moving along the line to a point on the curve results in a meeting with the 60° angle meaning that the LED has 40% intensity at angle of 60°.
• Fig. 8b illustrates the CC2510 SOC by TI; this comes in a 4X4 mm QFN footprint to minimize board size.
• Fig. 8h is a top view, pin-out illustration of the CC2510.
  • Fig. 8i is an example of a suitable application circuit of the CC2510.
• the RF transceiver may use a 50 Ω antenna.
  • the antenna is typically a chip antenna of a very small size such as a Fractus ANT
• the board space used for this type of antenna is very small (approximately 5X15mm of board area).
  • the chip itself has a very low profile of 7X3mm rectangular shape.
• the CC2511 is a 2.4GHz SOC that includes a full-speed USB controller. RF connectivity from the control unit is received on the host side using one PCB dongle device which incorporates the CC2511 SOC.
• the scroller operates by capacitance sensing. If the application requires high-resolution sensors such as scroll bars or wheels, software running on the host processor may be employed.
• the memory footprint on the host typically, depending on the sensor type, is 10 KB of code and 600 bytes of data memory.
• Fig. 8j illustrates a three-part capacitance sensing solution.
  • a slider may comprise 5 to 8 or more discrete sensor segments depending on the sensor length, with each segment connected to a CIN input pin on the AD7147/ AD7148.
• Discrete sliders may be used for applications that employ linear, repeatable output position locations and may comprise discrete sensor elements arranged in a strip, one after the other.
  • the discrete sensing segments may operate like buttons.
  • Each sensing segment is arranged in close proximity to the next sensor; thus, when a user moves a finger along the slider, more than one sensor segment is activated at a time.
  • the slider of Fig. 8k can produce up to 128 output positions.
  • Each segment of the slider employs one CIN input connection to the AD7147/AD7148.
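Purely as an illustration of how a high-resolution position might be derived from such discrete segments (an assumption of this sketch, not taken from the AD7147/AD7148 documentation), a weighted centroid over the per-segment readings can be scaled to 128 output positions:

    def slider_position(readings, positions=128):
        """readings: one capacitance delta per CIN-connected segment."""
        total = sum(readings)
        if total == 0:
            return None                           # no finger on the slider
        centroid = sum(i * r for i, r in enumerate(readings)) / total
        return round(centroid * (positions - 1) / (len(readings) - 1))

    # Finger between segments 2 and 3 of an 8-segment slider:
    print(slider_position([0, 0, 7, 9, 1, 0, 0, 0]))   # 48, on a 0..127 scale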
  • Double action surface mounted switches are provided according to certain embodiments.
  • a double action switch allows a two signal device actuated by two different forces to be provided.
  • a first press on the switch triggers a "half press signal” and a subsequent press triggers a "full press signal”.
  • This mechanism exhibits a mechanical feedback for each of the presses so as to produce the following 4 electrical signals as shown in Figs. 8L, 8m, 8n and 8o: no to half press, half to full press, full to half press and half to no press.
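Purely as an illustration, the four signals listed above could be generated in software by a tiny state machine tracking no/half/full transitions; the event strings follow the description above, while the class and callback names are assumptions:

    TRANSITIONS = {
        ("no", "half"):   "no to half press",
        ("half", "full"): "half to full press",
        ("full", "half"): "full to half press",
        ("half", "no"):   "half to no press",
    }

    class DoubleActionSwitch:
        def __init__(self, emit):
            self.state, self.emit = "no", emit

        def update(self, new_state):
            event = TRANSITIONS.get((self.state, new_state))
            if event:
                self.emit(event)          # report the electrical signal
            self.state = new_state

    sw = DoubleActionSwitch(print)
    for s in ["half", "full", "half", "no"]:   # one full press-and-release
        sw.update(s)                           # prints all four signals in turn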
• Referring to the debounce circuitry for debouncing a mechanical switch illustrated in Figs. 8L, 8m, 8n and 8o, it is appreciated that when a switch is operated by a human, spikes of low and high voltages always occur across that switch, resulting in a series of High and Low signals that may be interpreted by a digital circuit as more than one pulse, instead of as one clean pulse or as a transition from one logic state to another.
• the circuitry of Figs. 8L, 8m, 8n and 8o includes a debounce circuit including a capacitor, pull-up resistor and Schmitt trigger IC.
• System connectivity may be provided by the example connection scheme illustrated in Fig. 8p.
  • a light sources board (LSPCB) has 6 SMD pads lined at the rear and connects directly to a 1mm pitch right angle 18 pin connector on the control board PCB (CPCB).
  • CPCB control board PCB
  • the control board may be unplugged by the operator from the finger without opening the enclosure.
  • the switches are connected to the same 18 pin connector on the CPCB.
  • the cable is soldered directly to the switches.
  • the scroll pad PCB (SPCB) is connected to the same 18 pin connector.
  • Fig. 8q is a table providing a glossary of terms used in Figs. 8a - 8p and in the description thereof, and elsewhere.
  • Fig. 9A is an example Sleeves Data flow diagram illustrating example interactions between all HW actors of a wearable input device e.g. Sleeve via which SW interface actors may interact.
  • Fig. 9b is an example Dongle Data flow illustrating example interactions between all HW actors of a Dongle provided in accordance with certain embodiments, via which SW interface actors interact.
• Fig. 9c is an example Main module Flow Chart for a wearable input device, e.g. that of Figs. 4a - 4b.
  • a main module of a wearable input device e.g. that of Figs. 4a - 4b, monitors and manages some or all of the buttons, state selection, slider and light sources e.g. as shown and described herein, and sends event signals accordingly.
• Fig. 9d is an example flow chart for the dongle of Fig. 9b, according to which dongle software residing in the dongle receives the RF event messages from a Laser Pointer device, prepares the events-list buffer for sending over USB, and sends data according to a Host request, where the host typically comprises the electronic device being controlled by the wearable input device shown and described herein.
• the Dongle main module typically waits for an RF message to be received, tests it and, if the message is valid, adds new events to the USB input buffer, etc.
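A hedged sketch of that dongle main loop; recv_rf(), host_requested() and usb_send() are hypothetical stand-ins for the dongle firmware's real I/O routines, and the validity check is an assumption:

    usb_buffer = []

    def valid(msg):
        return bool(msg) and msg.get("crc_ok", False)   # assumed validity check

    def dongle_step(recv_rf, host_requested, usb_send):
        msg = recv_rf()                            # wait for an RF message
        if valid(msg):
            usb_buffer.extend(msg["events"])       # queue new events for USB
        if host_requested():
            usb_send(usb_buffer[:])                # send the events list to host
            usb_buffer.clear()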
  • the host DLL typically contains variables and methods for receiving by USB and displaying the buttons and slider events (among other events).
  • An example Events list which may be transmitted by protocol RF, from Device to Dongle is set out in the table of Fig. 9e.
  • Fig. 9f is a table providing a glossary of terms used in Figs. 9a - 9e and in the description thereof, and elsewhere.
  • software components of the present invention including programs and data may, if desired, be implemented in ROM (read only memory) form including CD-ROMs, EPROMs and EEPROMs, or may be stored in any other suitable computer-readable medium such as but not limited to disks of various kinds, cards of various kinds and RAMs.
  • ROM read only memory
  • EEPROM electrically erasable programmable read-only memory
  • Components described herein as software may, alternatively, be implemented wholly or partly in hardware, if desired, using conventional techniques.
  • components described herein as hardware may, alternatively, be implemented wholly or partly in software, if desired, using conventional techniques.
  • Any computer-readable or machine-readable media described herein is intended to include non-transitory computer- or machine-readable media.
  • Any computations or other forms of analysis described herein may be performed by a suitable computerized method. Any step described herein may be computer- implemented.
• the invention shown and described herein may include (a) using a computerized method to identify a solution to any of the problems or objectives described herein, the solution optionally including at least one of a decision, an action, a product, a service or any other information described herein that impacts, in a positive manner, a problem or objective described herein; and (b) outputting the solution.

Abstract

A wearable input device operative to control an electronic system, the input device comprising a wearable substrate; an IR laser source mounted on the wearable substrate and operative to emit a laser beam impinging on a screen at a location whose coordinates depend on motion of the user wearing the input device; and a laser spot detector operative to detect coordinates of the location within the screen and accordingly to control the electronic system. Optionally, the device has one or more of the following features: multidisciplinary, controlling more than one of computers, cell phones, media centers, projected and other screens; multi-touch using both hands; and ergonomic.

Description

Wearable device for generating input for computerized systems
REFERENCE TO CO-PENDING APPLICATIONS Priority is claimed from US provisional application No. 61/272,610, entitled
"Wearable input device" and filed 13 October 2009 and from US provisional application No. 61/282,513 entitled "Multi-disciplinary personal input device and application thereof and filed 23 February 2010 FIELD OF THE INVENTION
The present invention relates generally to input devices and more particularly to portable input devices. BACKGROUND OF THE INVENTION
Conventional technology pertaining to certain embodiments of the present invention is described in the following publications inter alia:
1. US 5453759 : Pointing device for communication with computer systems
2. WO 2009024971 (A2): Finger-worn Devices and Related Methods of Use 3. WO 0237466 (A1): Electronic User Worn Interface Device
4. WO 2010053260 (A2): Mouse Controlled Via Finger Movements in Air
5. US 6198485 (B1): Method and Apparatus for Three-Dimensional Input Entry
6. US2006001646: Finger Worn and Operated Input Device
7. US 4988981 (A): Computer Data Entry and Manipulation Apparatus and
Method
8. US2008042995: Wearable Signal Input Apparatus for Data Processing System
9. US 7057604 (B2): Computer Mouse on a Glove 10. US 2003227437: Computer Pointing Device and Utilization System
11. GB 2442973 (A): Finger Worn Computer Mouse with an Optical Sensor on a Pivoting Arm
12. US 6587090: Finger Securable Computer Input Device
13. US 2006012567: Miniature Optical Mouse and Stylus 14. US 2009322680: Radio Frequency Pointing Device
15. US 7042438 (B2): Hand Manipulated Data Apparatus for Computers and Video Games
16. US 2010188428 (A1): Mobile Terminal with Image Projection
17. WO 2009125258 (A1): Communication Terminals with Superimposed User Interface
18. WO 2010064094 (A1): Portable Electronic Device with Split Vision Content Sharing Control and Method
19. WEB: Mobile Phone with a Built-in Projector
20. US 2008317331: Recognizing Hand Poses and/or Object Classes 21. WO 2004055726 (A1): Interface System
22. US 7006079 (B2): Information Input System
Gestural computing is known. Gyro mouse devices are known. Multi-touch technology is known. Optic touch technology is known.
The disclosures of all publications and patent documents mentioned in the specification, and of the publications and patent documents cited therein directly or indirectly, are hereby incorporated by reference. SUMMARY OF THE INVENTION
Certain embodiments of the present invention seek to provide a convenient, intuitive wearable input device. There is thus provided, in accordance with at least one embodiment of the present invention, a convenient, intuitive wearable input device.
Certain embodiments of the present invention seek to provide a system for generating computerized input to electronic devices including some or all of the following components, interacting among them e.g. as shown and described herein:
1. at least one wearable input device
2. an optical sensor such as an IR camera having optic communication with the sleeve device
3. a software application resident on the controlled host and having typically bidirectional wireless communication with the sleeve device and with the sensor, wherein the wearable input device includes some or all of:
a. State selection actuator selecting wearable input device's work mode
b. A signal terminal and processing unit
c. Wireless communication controller (emitter) and wireless receiver on the sleeve device and the controlled host
d. Energy supply
e. Optional sleeve Memory unit.
f. Light sources e.g. some or all of:
i. Infra-red / near infra-red laser (includes feedback circuitry for laser activation monitoring)
ii. Infra-red / near infra-red LED (light emitting diode)
iii. Optionally, red/green laser (includes feedback circuitry for laser activation monitoring)
g. 2 buttons with half press and full press mode and with haptic feedback whose clicks may be interpreted by the software application as mouse right clicks and mouse left clicks respectively.
h. Optionally, touch pad scrolling bar for remote scrolling functionality
i. Force sensor actuator triggering one of the light sources depending on work mode.
Typically, the optic sensor has a spatial field of view which is as wide as possible given application-specific and/or technical constraints. The optic sensor is typically positioned and selected to point at a controlled screen such that its field of view includes the entire screen, or to point at a user, e.g. such that its field of view includes at least the user's hands. The sensor's filtering is selected to match the optical characteristics of the wearable input device's light sources, e.g. in terms of frequencies, power and distribution. The force sensor actuator, also termed herein "force sensor", may for example comprise a conventional button mechanism.
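By way of illustration only, the interplay among the three components above (wearable device, optical sensor and host-resident software application) may be sketched as a simple host-side event loop. All names below, including HostController, the event strings and the queue-based transport, are hypothetical placeholders, not part of any claimed embodiment:

```python
# Hypothetical host-side sketch: the optical sensor contributes spot
# coordinates, the sleeve contributes button/scroll events over the
# wireless link, and the application merges the two streams.
import queue

class HostController:
    def __init__(self, camera_events, radio_events):
        self.camera_events = camera_events  # (x, y) spots from the IR camera
        self.radio_events = radio_events    # events from the sleeve's radio
        self.cursor = (0, 0)

    def run_once(self):
        # Spot coordinates drive the cursor position.
        try:
            self.cursor = self.camera_events.get_nowait()
        except queue.Empty:
            pass
        # Wireless events carry clicks and scrolls, independently of optics.
        try:
            event = self.radio_events.get_nowait()
        except queue.Empty:
            return
        if event == "left_click":
            print("left click at", self.cursor)
        elif event == "scroll_up":
            print("scroll up")
```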
In an example embodiment of the present invention, each wearable input device may include some or all of the following:
1. State selection actuator which alternates between work modes
2. Signals terminal and processing unit, such as but not limited to an MCU-controlled system, embedded in the wearable sleeve, with I/Os, serial interface and signal conversion capabilities.
3a. Wireless communication controller (emitter) on the wearable sleeve for emitting wireless information from the sleeve to the controlled host.
3b. For applications in which it is desired to enable 2-way communication (from host to sleeve/IR Camera), a wireless communication controller (emitter) on controlled host (e.g. USB dongle) for emitting wireless information from controlled host to sleeves/IR Camera.
3c. For applications in which it is desired to enable 2-way communication (from host to sleeve), a wireless receiver on the wearable sleeve for receiving wireless information from the host.
3d. Wireless receiver on controlled host (e.g. USB dongle) for receiving wireless information from sleeve/s and/or IR Camera.
4. Conventional apparatus for supplying and monitoring energy e.g. battery powered system with low battery monitoring, charging during work mode capability and battery over-voltage and under-voltage protection.
5. Optional sleeve memory unit e.g. with compatible interface to the MCU 2 for storing data (e.g. user preferences, user ID/nickname etc.).
6a1. Light source, e.g. infra-red/near infra-red laser, to generate optic signals to the IR camera.
6a2. If required due to eye safety and regulatory issues, feedback circuitry for laser activation monitoring, operative to notify the system when a laser is activated (e.g. red laser / IR laser). Typically located adjacent the monitored laser diode/s.
6b. Light source: infra-red / near infra-red LED, both vertical and horizontal, to create optic signals to the IR camera. The horizontal light source facilitates remote physical screen control and surface control, inter alia. The vertical light source facilitates front-projection touch control and laptop/desktop control.
6c1. Optional red laser operative for emphasis and visual pointing.
7. Optional battery charging indicator e.g. light which indicates battery being charged and/or low battery indicator e.g. light and/or, if mandated for eye safety and regulatory issues, a laser activation indicator e.g. light, and/or current work state indicator e.g. light.
8. 2 buttons with haptic feedback, each having 2 press options: half and full press, operative to trigger light sources activation and to trigger wireless events transmission.
9. Touch pad scrolling bar operative to trigger scroll up/down events, for applications in which it is desired to provide remote scrolling abilities rather than, say, locating the cursor on the scroller of the screen and dragging it up and down each time it is desired to scroll the page.
10. Force sensor actuator operative to trigger light sources activation, typically embedded within the sleeve's substrate or body.
11a. For applications in which it is desired to receive voice input via speech recognition, a microphone typically embedded in the wearable sleeve.
11b. Speaker embedded in the wearable sleeve, for applications in which it is desired to provide voice output (e.g. indicating a current work mode).
12. Wearable finger sleeve body/substrate which is suitable in terms of flexibility, texture, weight, and strength.
The above may be operative in conjunction with some or all of the following:
13. Software on the controlled host
14. IR camera to detect IR signals and send them to the software application, located so as to point at the screen and/or the user, depending on desired work mode.
The one or more wearable input devices, e.g. 2 sleeves; optical sensor, e.g. IR camera; and software application resident e.g. on the controlled host, may be served by conventional apparatus for charging and data transfer from/to the host, e.g. a Micro USB to USB cable and a matching micro USB port on the sleeve/s. The micro USB port, located anywhere on the sleeve, preferably at the bottom of the finger, typically connects the sleeve to a host with a USB port (e.g. a computer) so as to charge the battery, send data and receive data.
Each wearable input device may also comprise the following optional components:
16. Accelerometer for detecting depth motion
17. Gyro for detecting rotational motion
18. Flash memory which enables the sleeve to act as a 'wearable disc on key'
It is appreciated that various sub-combinations of the above components have a very wide variety of applications, such as but not limited to the following:
A. An input system, including components 3a, 3d, 8, 9 above, which mimics or preserves mouse functionality e.g. left click, right click, scrolling using buttons and scroller similar to those in a regular mouse.
B. A wearable input device, e.g. including components 1, 6a1, 6b, 10, 14 above, which is operative both when the optic sensor points at the screen (the 'natural' state for projected screens) and when the optic sensor points at the user (the 'natural' state for tangible screens).
C. A wearable input device, e.g. including components 6b, 10, 14 above, having an operating mode in which touch/multi touch abilities are executed by pressing the screen with the fingertip - e.g. as in conventional touch screens.
D. A wearable input device, e.g. including components 1, 6a1, 6b, 10, 14 above, having an operating mode in which remote cursor control is executed by joining the thumb and the forefinger in a 'pinching'-like manner, e.g. as shown in Figs. 4d - 4e, which affords ergonomic and intuitive motion for the human hand, such that to the user it seems as though s/he is holding the cursor with her or his thumb and forefinger and moving it around.
E. A wearable input device, e.g. including components 1, 3a, 3d, 6a1, 6b, 8, 9, 10, 14 above, which enables both remote and touch interaction, rather than either a remote or a touch solution but not both.
F. A wearable input device, e.g. including components 1, 3a, 3d, 6a1, 6b, 6c1, 8, 9, 10, 14 above, having multiple, e.g. 5, interactive environments, enabling convergence of input devices rather than mandating one such device per environment.
G. A wearable input device, e.g. including components 3a, 3d, 6b, 8, 9, 10, 14 above, providing laptop/desktop interaction in which, typically, at least the entire keyboard area has 2 roles: keyboard + mouse pad, which may be alternated in zero setup time, e.g. by joining the forefinger and the thumb together, typically such that mouse functionality, including some or all of left click, right click and scrolling, is within thumb reach.
Certain embodiments of the present invention seek to provide a touchless embodiment which prevents or diminishes screen wear and, in view of the absence of friction between the finger and the screen, enables faster and smoother movements by the user.
In accordance with an aspect of the invention there is provided a wearable input device operative to control an electronic system, the input device comprising a wearable substrate, an IR laser source mounted on the wearable substrate and operative to emit a laser beam impinging on a screen at a location whose coordinates depend on motion of the user wearing the input device, and a laser spot detector operative to detect coordinates of the location within the screen and accordingly to control the electronic system.
In accordance with an aspect of the invention there is further provided an IR- based apparatus for controlling an electronic device, the apparatus comprising an IR camera configured to be mountable on an electronic device, the electronic device having an input area, the IR camera's field of view including the input area, the IR camera being operative to sense IR signals generated by an input device worn on at least one user's hand, when the hand is operating on the input area; and a controlling functionality operative to receive the IR signals from the IR camera and to control the electronic device accordingly.
In accordance with an aspect of the invention there is further provided a wearable input device serving a human user, the device comprising a wearable substrate, and at least one force sensor mounted on the wearable substrate and operative to sense pressure patterns applied by the human user which mimic the pressure patterns the human user would apply to a mouse button, and a controlling functionality operative to receive signals, indicative of the pressure, from the force sensor and to control a normally cursor-based electronic device accordingly, including commanding the electronic device to respond to each pressure pattern applied by the human user, as it would respond to the same pressure pattern were it to have been applied by the human user to a cursor-based input device operating the electronic device.
In accordance with an aspect of the invention there is provided a wearable input apparatus operative to provide multi-touch control of an electronic system when the apparatus is worn by a human user, the input apparatus comprising a first wearable input device operative to control the electronic system when mounted on the human user's right hand, and a second wearable input device which is a mirrored copy of the first wearable input device and is operative to control the electronic system when mounted on the human user's left hand.
In accordance with an aspect of the invention there is provided a wearable input apparatus operative to control an electronic system when worn by a human user, the input apparatus comprising a finger-wearable substrate configured to be mounted on a user's finger and having a tip portion configured to be mounted on the user's finger tip and a force sensing device mounted on the tip portion and including at least one force sensor and being operative to sense pressure applied by the human user's thumb when the user presses thumb to finger, and to control said electronic system at least partly in accordance with at least one characteristic of the pressure.
In accordance with an embodiment of the invention there is provided an apparatus wherein said force sensing device comprises an annular substrate configured to pivot around the user's finger on which a plurality of force sensors are mounted, such that, by rotating the annular substrate, the user selectably positions any of the plurality of force sensors at a location on his finger tip which is accessible to his thumb. Light sources may optionally be provided on the annular substrate.
In accordance with an embodiment of the invention there is further provided an apparatus comprising an input device worn on at least one user's hand which enables a user to alternate the controlling functionality between a first state in which the electronic device is responsive to the input area and a second state in which the electronic device is responsive to said IR signals. Optionally and if desired, one hand may operate the keyboard and the other hand may operate the IR signals, and the controlling functionalities of the system may be such that both states may be operative simultaneously.
In accordance with an embodiment of the invention there is further provided an apparatus wherein the input device includes a states selector allowing the user to alternate the controlling functionality between the first and second states.
In accordance with an aspect of the invention there is further provided a wearable input system including a wearable input device, and a docking station operative to receive the wearable input device and selectably retain and release the wearable input device once received, so as to enable a human to wear and shed the wearable input device without manual contact therewith.
In accordance with an embodiment of the invention there is further provided a system wherein the docking station is operative to provide mechanical protection to the wearable input device.
In accordance with an embodiment of the invention there is still further provided a system wherein the docking station is operative to charge the input device while it is being retained within the docking station.
In accordance with an embodiment of the invention there is still further provided a system wherein retaining of the input device by the docking station is activated by the human's twisting a body part on which the input device is mounted in a first azimuthal direction, after the device has been received by the docking station, and wherein releasing the input device by the docking station is activated by the human's twisting a body part on which the input device is mounted in a second azimuthal direction, opposite to the first direction, while the device is being retained by the docking station.
In accordance with an embodiment of the invention there is still further provided a device wherein the laser spot detector comprises an IR camera arranged such that its field of view includes the screen.
In accordance with an embodiment of the invention there is still further provided a device wherein the screen may comprise a projected screen and the laser spot detector may be mounted on a projector projecting the projected screen.
In accordance with an embodiment of the invention there is still further provided a device wherein the projector comprises an image projector in a handheld device or a mobile device, e.g. a laptop, which is not handheld.
In accordance with an embodiment of the invention there is still further provided a device wherein the laser spot detector comprises an optic-sensitive functionality of the screen itself.
In accordance with an embodiment of the invention there is still further provided a device wherein the electronic system comprises a computer.
In accordance with an embodiment of the invention there is still further provided a device wherein the electronic system comprises a portable communication device.
In accordance with an embodiment of the invention there is still further provided a device wherein the screen comprises a physical screen.
In accordance with an embodiment of the invention there is still further provided a device wherein the laser source comprises an IR laser source.
In accordance with an embodiment of the invention there is still further provided a device having two user-selectable modes of operation including a remote mode of operation utilizing the laser source and the laser spot detector and a touch mode of operation.
In accordance with an embodiment of the invention there is still further provided a device wherein the touch mode of operation utilizes a light source mounted on the wearable input device and activated by an actual touching of the screen by the user, a sensor which senses a location of the light source and a controller which receives the light source location from the sensor and controls the electronic system accordingly.
In accordance with an embodiment of the invention there is still further provided a device wherein selectability of the selectable modes is activated by a human user who pinches thumb to finger.
In accordance with an embodiment of the invention there is still further provided a device wherein selectability of the selectable modes is activated by a human user who operates a two-manner press mechanism of a pressable element on the wearable input device.
In accordance with an embodiment of the invention there is still further provided an apparatus wherein the input area includes a keyboard.
In accordance with an embodiment of the invention there is still further provided an apparatus wherein the input area includes a touch-pad.
In accordance with an embodiment of the invention there is still further provided a device wherein the controlling functionality is also operative to receive signals, indicative of a mouse-sliding operation simulated by the wearable substrate and to control a normally mouse-operated electronic device accordingly, including commanding the electronic device to respond to each mouse-sliding operation simulated by the human user using the substrate, as it would respond to the same mouse-sliding operation were it to have been applied by the human user to a mouse operating the electronic device.
In accordance with an embodiment of the invention there is still further provided an apparatus wherein the first and second wearable input devices are enantiomers which are mirror images of each other.
In accordance with an embodiment of the invention there is still further provided an apparatus comprising an optical sensor operative to simultaneously sense positions of a plurality of light points generated by both input devices simultaneously.
In accordance with an embodiment of the invention there is still further provided an apparatus comprising a controlling application operative for simultaneously receiving and simultaneously processing positions of a plurality of light points generated by both input devices simultaneously and for controlling a host device accordingly.
In accordance with an embodiment of the invention there is still further provided an apparatus wherein the optical sensor comprises an IR sensor and the light points generated by both input devices simultaneously comprise IR light points.
In accordance with an aspect of the invention there is still further provided a touchless user input system comprising a rear projection unit located at the bottom of a surface which projects onto the surface at an angle acute enough to accommodate the small distance between the screen and the projecting unit, a wearable input device emitting light, and a rear optical sensor located behind the screen, e.g. adjacent the projecting unit, which is operative to see the light emitted by the wearable input unit.
In accordance with an embodiment of the invention there is still further provided a system comprising a force sensing actuator on the wearable input device which triggers emission of the light.
In accordance with an embodiment of the invention there is still further provided a system wherein a user joins thumb and finger together adjacent a desired screen location, the light comprises IR light which impinges upon the surface at the desired screen location and wherein the rear sensor detects the points on the screen from which the laser beams are scattering.
In accordance with an embodiment of the invention there is still further provided a device wherein the IR laser source comprises a near-IR laser source.
In accordance with an embodiment of the invention there is still further provided an apparatus wherein the state selector is controlled by a manual operation by the user in which the user presses together his thumb and a finger wearing the input device.
In accordance with an aspect of the invention there is still further provided a wearable input device which includes an optic sensor and which, when worn by a human user, provides inputs to an electronic device, the input device having a first operating mode in which the optic sensor points at a screen and a second operating mode in which the optic sensor points at the user.
In accordance with an embodiment of the invention there is still further provided an input device wherein the screen comprises a projected screen.
In accordance with an aspect of the invention there is still further provided a wearable input device which, when worn by a human user, provides inputs to an electronic device, the input device having a "touch" operating mode in which the human user presses a screen which is not a touch-screen and the electronic device is controlled as though the screen were a touch screen.
In accordance with an embodiment of the invention there is still further provided a device which provides to a human user an experience as though the user were holding a cursor with thumb and forefinger and moving the cursor.
In accordance with an aspect of the invention there is still further provided a wearable input device which, when worn by a human user, provides inputs to an electronic device, the input device having a "touch" operating mode in which the human user presses a screen and the electronic device is controlled as though the screen were a touch screen; and a remote operating mode in which the human user interacts with the screen remotely from afar.
In accordance with an aspect of the invention there is still further provided a wearable input device which is operative in any of a selectable plurality of interactive environments.
In accordance with an embodiment of the invention there is still further provided an input device wherein the plurality of environments includes a front projected screen environment.
In accordance with an embodiment of the invention there is still further provided an input device wherein the plurality of environments includes a rear projected screen environment.
In accordance with an embodiment of the invention there is still further provided an input device wherein the plurality of environments includes a desktop computer environment.
In accordance with an embodiment of the invention there is still further provided an input device wherein the plurality of environments includes a laptop computer environment.
In accordance with an embodiment of the invention there is still further provided an input device wherein the plurality of environments includes a mobile device with an embedded pico projector.
In accordance with an embodiment of the invention there is still further provided an input device wherein the plurality of environments includes an interactive surface environment.
In accordance with an aspect of the invention there is still further provided a method for providing IR-based apparatus for controlling an electronic device, the method comprising providing an IR camera configured to be mountable on an electronic device, the electronic device having an input area, the IR camera's field of view including the input area, the IR camera being operative to sense IR signals generated by an input device worn on at least one user's hand, when the hand is operating on the input area, and providing a controlling functionality operative to receive the IR signals from the IR camera and to control the electronic device accordingly.
In accordance with an aspect of the invention there is still further provided a method for providing a wearable input device serving a human user, the method comprising mounting at least one force sensor on a wearable substrate, wherein the force sensor is operative to sense pressure patterns applied by the human user which mimic the pressure patterns the human user would apply to a mouse button, and providing a controlling functionality operative to receive signals, indicative of the pressure, from the force sensor and to control a normally cursor-based electronic device accordingly, including commanding the electronic device to respond to each pressure pattern applied by the human user, as it would respond to the same pressure pattern were it to have been applied by the human user to a cursor-based input device operating the electronic device.
In accordance with an aspect of the invention there is still further provided a method for providing wearable input apparatus operative to provide multi-touch control of an electronic system when the apparatus is worn by a human user, the method comprising providing a first wearable input device operative to control the electronic system when mounted on the human user's right hand, and providing a second wearable input device which is a mirrored copy of the first wearable input device and is operative to control the electronic system when mounted on the human user's left hand.
In accordance with an aspect of the invention there is still further provided a method for providing wearable input apparatus operative to control an electronic system when worn by a human user, the method comprising providing a finger-wearable substrate configured to be mounted on a user's finger and having a tip portion configured to be mounted on the user's finger tip, and mounting a force sensing device on the tip portion, the force sensing device including at least one force sensor and being operative to sense pressure applied by the human user's thumb when the user presses thumb to finger, and to control the electronic system at least partly in accordance with at least one characteristic of the pressure.
In accordance with an aspect of the invention there is still further provided a method for providing a wearable input system, the method including providing a docking station operative to receive a wearable input device and to selectably retain and release the wearable input device once received, so as to enable a human to wear and shed the wearable input device without manual contact therewith.
In accordance with an aspect of the invention there is still further provided a method for providing a touchless user input system, the method comprising providing a rear projection unit located at the bottom of a surface which projects onto the surface at an angle acute enough to accommodate the small distance between the screen and the projecting unit, providing a wearable input device emitting light, and providing a rear optical sensor operative to be disposed behind the screen, e.g. adjacent the projecting unit, which is operative to see the light emitted by the wearable input unit.
In accordance with an aspect of the invention there is still further provided a method for providing a wearable input device, the method including providing a wearable input device which includes an optic sensor and which, when worn by a human user, provides inputs to an electronic device, the input device having a first operating mode in which the optic sensor points at a screen and a second operating mode in which the optic sensor points at the user.
In accordance with an aspect of the invention there is still further provided a method for providing a wearable input device, the method including providing a wearable input device which, when worn by a human user, provides inputs to an electronic device, the input device having a "touch" operating mode in which the human user presses a screen which is not a touch-screen and the electronic device is controlled as though the screen were a touch screen.
In accordance with an aspect of the invention there is still further provided a method for providing a wearable input device, the method including providing a wearable input device which, when worn by a human user, provides inputs to an electronic device, the input device having a "touch" operating mode in which the human user presses a screen and the electronic device is controlled as though said screen were a touch screen, and a remote operating mode in which the human user interacts with the screen remotely from afar.
In accordance with an aspect of the invention there is still further provided a method for providing a wearable input device, the method including providing a wearable input device which is operative in any of a selectable plurality of interactive environments.
In accordance with an aspect of the invention there is still further provided a method for providing a wearable input device operative to control an electronic system, the method comprising providing an IR laser source mounted on a wearable substrate and operative to emit a laser beam impinging on a screen at a location whose coordinates depend on motion of the user wearing the input device, and providing a laser spot detector operative to detect coordinates of the location within the screen and accordingly to control the electronic system.
In accordance with an aspect of the invention there is still further provided a computer program product, comprising a computer usable medium having a computer readable program code embodied therein, the computer readable program code adapted to be executed to implement any of the methods shown and described herein.
In accordance with an embodiment of the invention there is still further provided an apparatus wherein the annular substrate comprises at least one light source selectably triggered by at least one of the force sensors respectively.
In accordance with an embodiment of the invention there is still further provided an apparatus wherein the at least one light source comprises at least one LED.
In accordance with an embodiment of the invention there is still further provided an apparatus wherein the user can control the keyboard with one hand and control the cursor with the other hand simultaneously.
In accordance with an embodiment of the invention there is still further provided an apparatus wherein wireless communication actuators on the input device send signals directly to the electronic device.
In accordance with an embodiment of the invention there is still further provided an apparatus operative to distinguish between input streams emanating from each of the user's hands and to handle both of the input streams concurrently.
In accordance with an embodiment of the invention there is still further provided a device wherein the light source comprises a LED.
In accordance with an embodiment of the invention there is still further provided a computer usable medium on which resides a controlling functionality. Also provided is a computer program product, comprising a computer usable medium or computer readable storage medium, typically tangible, having a computer readable program code embodied therein, the computer readable program code adapted to be executed to implement any or all of the methods shown and described herein. It is appreciated that any or all of the computational steps shown and described herein may be computer-implemented. The operations in accordance with the teachings herein may be performed by a computer specially constructed for the desired purposes or by a general purpose computer specially configured for the desired purpose by a computer program stored in a computer readable storage medium.
Any suitable processor, display and input means may be used to process, display e.g. on a computer screen or other computer output device, store, and accept information such as information used by or generated by any of the methods and apparatus shown and described herein; the above processor, display and input means including computer programs, in accordance with some or all of the embodiments of the present invention. Any or all functionalities of the invention shown and described herein may be performed by a conventional personal computer processor, workstation or other programmable device or computer or electronic computing device, either general- purpose or specifically constructed, used for processing; a computer display screen and/or printer and/or speaker for displaying; machine-readable memory such as optical disks, CDROMs, magnetic-optical discs or other discs; RAMs, ROMs, EPROMs, EEPROMs, magnetic or optical or other cards, for storing, and keyboard or mouse for accepting. The term "process" as used above is intended to include any type of computation or manipulation or transformation of data represented as physical, e.g. electronic, phenomena which may occur or reside e.g. within registers and /or memories of a computer. The term processor includes a single processing unit or a plurality of distributed or remote such units.
The above devices may communicate via any conventional wired or wireless digital communication means, e.g. via a wired or cellular telephone network or a computer network such as the Internet.
The apparatus of the present invention may include, according to certain embodiments of the invention, machine readable memory containing or otherwise storing a program of instructions which, when executed by the machine, implements some or all of the apparatus, methods, features and functionalities of the invention shown and described herein. Alternatively or in addition, the apparatus of the present invention may include, according to certain embodiments of the invention, a program as above which may be written in any conventional programming language, and optionally a machine for executing the program such as but not limited to a general purpose computer which may optionally be configured or activated in accordance with the teachings of the present invention. Any of the teachings incorporated herein may wherever suitable operate on signals representative of physical objects or substances.
The embodiments referred to above, and other embodiments, are described in detail in the next section.
Any trademark occurring in the text or drawings is the property of its owner and occurs herein merely to explain or illustrate one example of how an embodiment of the invention may be implemented.
Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions, utilizing terms such as, "processing", "computing", "estimating", "selecting", "ranking", "grading", "calculating", "determining", "generating", "reassessing", "classifying", "producing", "stereo-matching", "registering", "detecting", "associating", "superimposing", "obtaining" or the like, refer to the action and/or processes of a computer or computing system, or processor or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities within the computing system's registers and/or memories, into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices. The term "computer" should be broadly construed to cover any kind of electronic device with data processing capabilities, including, by way of non-limiting example, personal computers, servers, computing systems, communication devices, processors (e.g. digital signal processor (DSP), microcontrollers, field programmable gate array (FPGA), application specific integrated circuit (ASIC), etc.) and other electronic computing devices.
The present invention may be described, merely for clarity, in terms of terminology specific to particular programming languages, operating systems, browsers, system versions, individual products, and the like. It will be appreciated that this terminology is intended to convey general principles of operation clearly and briefly, by way of example, and is not intended to limit the scope of the invention to any particular programming language, operating system, browser, system version, or individual product. Elements separately listed herein need not be distinct components and alternatively may be the same structure.
Any suitable output device or display may be used to display or output information generated by the apparatus and methods shown and described herein. Any suitable processor may be employed to compute or generate information as described herein e.g. by providing one or more modules in the processor to perform functionalities described herein. Any suitable computerized data storage e.g. computer memory may be used to store information received by or generated by the systems shown and described herein. Functionalities shown and described herein may be divided between a server computer and a plurality of client computers. These or any other computerized components shown and described herein may communicate between themselves via a suitable computer network.
BRIEF DESCRIPTION OF THE DRAWINGS
Certain embodiments of the present invention are illustrated in the following drawings:
Fig. 1a is a pictorial side-view illustration of a wearable input device constructed and operative in accordance with one embodiment of the present invention.
Fig. 1b is a pictorial isometric illustration of the device of Fig. 1a.
Fig. 1c is a pictorial illustration of an IR camera mounted on an image projector, facing a screen onto which an image is being projected by the projector, all in accordance with an embodiment of the present invention.
Fig. 1d is a pictorial diagram of a first method whereby a wearable input device constructed and operative in accordance with certain embodiments of the present invention controls a computerized application.
Fig. 1e is a diagram of a state-chart for a half-press mechanism constructed and operative in accordance with an embodiment of the present invention.
Fig. 1f is a diagram of a state-chart for a touch-screen mechanism constructed and operative in accordance with an embodiment of the present invention.
Fig. 1g is a pictorial illustration of an IR camera mounted adjacent a computer screen, facing down toward a typical location of a user's hands on an input device such as a keyboard associated with the computer screen, all in accordance with an embodiment of the present invention.
Fig. 1h is a diagram of a state-chart for a continuous-PC mechanism constructed and operative in accordance with an embodiment of the present invention.
Fig. 1i is a pictorial illustration of a user's hands on the keyboard of Fig. 1g, wherein the user is wearing an input device constructed and operative in accordance with an embodiment of the present invention, and is using the input device to control the keyboard and mouse without removing his hands from the keyboard, due to the operation of the IR camera of Fig. 1g in conjunction with the input device, all in accordance with an embodiment of the present invention.
Figs. 2a - 2d are pictorial illustrations of respective stages in the interaction of a wearable input device constructed and operative in accordance with an embodiment of the present invention, with a docking station constructed and operative in accordance with an embodiment of the present invention.
Fig. 3 is a pictorial diagram of a second method whereby a wearable input device constructed and operative in accordance with certain embodiments of the present invention controls a computerized application.
Figs. 4a - 4b are isometric views of a wearable "sleeve" device for generating computerized input to electronic devices, according to certain embodiments of the present invention.
Figs. 4c - 4e are pictorial illustrations of the device of Figs. 4a - 4b as mounted on and manipulated by a user's hands, according to certain embodiments of the present invention.
Fig. 5a illustrates an exemplary plurality of work modes according to which the sleeve device of Figs. 4a - 4b may operate, and which work modes are typically selectable as alternative states, using the state selection actuator 60 of Figs. 4a - 4b.
Fig. 5b illustrates a pico projector embedded in a mobile communication device such as a cellphone, which is operative in conjunction with a wearable input device according to certain embodiments of the present invention.
Fig. 5c illustrates a tangible screens environment which may be one of a selectable plurality of interactive environments in which the wearable input device shown and described herein is operative, according to certain embodiments of the present invention.
Fig. 5d illustrates an interactive surface environment which may be one of a selectable plurality of interactive environments in which the wearable input device shown and described herein is operative, according to certain embodiments of the present invention.
Figs. 6a - 6d are illustrations useful in understanding methods of operation of a software application typically resident on the controlled host e.g. computer or mobile communication device being controlled by inputs generated using the input generating system shown and described herein, all in accordance with certain embodiments of the present invention.
Figs. 7a - 7b are front and back isometric illustrations of a flat surface with rear projection configuration, characterized by projection at a very obtuse angle such that projection may be from very close, in conjunction with a rear sensor, whose field of view typically comprises the entire screen, which apparatus is useful in implementing a touchless embodiment of the present invention.
Figs. 8a - 8q are useful in understanding hardware components of a wearable input device constructed and operative in accordance with certain embodiments of the present invention.
Figs. 9a - 9f are useful in understanding software components of a wearable input device constructed and operative in accordance with certain embodiments of the present invention.
DETAILED DESCRIPTION OF CERTAIN EMBODIMENTS
Fig. 1a is a pictorial illustration of the input device. The input device, as shown in Fig. 1b, typically includes some or all of an IR laser/LED and red laser, RF controllers and a flattened scrolling bar, typically accessible by a user's thumb. The mode of operation of the apparatus of Figs. 1a and 1b is typically as follows: First, the two RF control buttons are able to transmit, to a PC application residing on the controlled computer, mouse clicks, e.g. left and right, in a way similar to the way a cordless mouse operates. The scroll bar is a touch-sensitive electronic pad which simulates the operation of a scroll-wheel, creating signals that are transferred to the PC application using RF communication. The input device's touch scroll bar is not visible in Fig. 1b as it is right under the thumb in the image. In summary, the scrolling operation of the input device is typically effected by an intuitive sliding of the thumb along one side of the finger. Mouse cursor control, using the input device, according to certain embodiments of the invention, is now described in detail. A laser ray in the IR / near IR spectrum is emitted from the input device and strikes the screen at a given point. That point's coordinates are transferred to the PC application. Any suitable method may be employed to detect that point, such as but not limited to any of the following methods a - c:
a. After striking the screen, the laser ray is scattered into the space of the room. An IR camera constantly videos the screen and detects the points on the screen from which the laser beams are being scattered. The camera then sends this input to the PC application. In projector environments, the IR camera may be mounted on the projector facing the screen, as shown in Fig. 1c.
b. Two or more IR cameras are located on the edges of the screen, constantly videoing the screen and sending their input to a Central Computing Unit (CCU). Then, using mathematical and trigonometric computations on the integrated input, the CCU detects the contact point between the laser beams and the screen, sending it to the PC application.
c. The screen as a whole is an optic-sensitive screen, which self-detects the striking point of laser beams at a given range of the electromagnetic spectrum.
The apparatus of Fig. 1b and/or variations thereof illustrated and described herein typically includes some or all of the following, suitably positioned e.g. as shown: horizontal IR LED 10, vertical IR LED 10, IR laser 30, red laser 40, scroller 50, buttons 60 and 70, state selector 80, force sensing actuator 90, IR LED 100, technology box 110 in which suitable non-location-sensitive components may be concentrated, speakers and microphone 120, flexible body 140 with paper battery inside, indicator LEDs 150 and on/off switch 160. It is appreciated that the wearable input device of the present invention is sometimes termed herein "sleeve" although the configuration thereof may or may not resemble an actual sleeve such as a finger-wearable sleeve.
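Merely to make method (a) concrete, the following is a minimal sketch of how a host application might locate the scattered laser spot in an IR camera frame and map it into screen coordinates, assuming an IR-pass-filtered camera exposed through OpenCV; the threshold value and calibration points are illustrative assumptions, not prescribed by the invention:

```python
# Sketch of method (a): locate the brightest spot in an IR camera frame and
# map it into screen coordinates via a pre-computed homography.
import cv2
import numpy as np

def detect_spot(frame_gray, threshold=220):
    """Return the (x, y) centroid of the scattered laser spot, or None."""
    _, mask = cv2.threshold(frame_gray, threshold, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None  # no sufficiently bright spot in this frame
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def to_screen(spot, homography):
    """Map a camera-space spot into screen pixel coordinates."""
    pt = np.array([[spot]], dtype=np.float32)  # shape (1, 1, 2)
    return cv2.perspectiveTransform(pt, homography)[0][0]

# The homography would be computed once, during calibration, from known
# correspondences between camera pixels and screen positions, e.g.:
# homography, _ = cv2.findHomography(camera_pts, screen_pts)
```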
According to certain embodiments, the image is not projected onto the screen on the user's side, as in a conventional projector, but rather from the back. The laser beam penetrates the transparent screen, refracting somewhat, and is absorbed in an IR camera disposed at the back of the screen. The camera sees the point on the screen from which the beam arrived and thereby discerns the location on the screen pointed at by the user.
A module of the PC application receives some or all of the above-mentioned inputs, such as but not limited to mouse clicks, scroll messages and beam striking points, and integrates them into the OS, masquerading as a regular or multi-touch mouse. The user receives feedback to her or his actions as if s/he were using a mouse; for example, the mouse cursor moves on the screen, clicks open windows and menus, etc. The interaction continues.
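A minimal sketch of such a module follows, using the pynput library purely as one example of OS-level mouse injection; the event names are hypothetical, and an actual module may instead masquerade as a multi-touch device:

```python
# Sketch of the PC-application module: detected beam coordinates and
# wireless sleeve events are replayed into the OS as ordinary mouse input.
from pynput.mouse import Button, Controller

mouse = Controller()

def on_spot(x, y):
    # The beam's striking point on the screen becomes the cursor position.
    mouse.position = (int(x), int(y))

def on_sleeve_event(event):
    # Click and scroll events arrive over the wireless link.
    if event == "left_click":
        mouse.click(Button.left)
    elif event == "right_click":
        mouse.click(Button.right)
    elif event == "scroll_up":
        mouse.scroll(0, 1)
    elif event == "scroll_down":
        mouse.scroll(0, -1)
```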
Fig. 1d is a graphical representation of the process, as a low-resolution process breakdown.
The input device typically comprises some or all of the following components:
a. on/off switch
b. light sources (typically IR laser + IR LED + red laser)
c. state selection button as described below
d. one or more indicator lights
e. touch pad scrolling bar
f. two buttons constructed in a two-press manner - half and full press, as described below
g. rechargeable energy unit
h. wireless communication unit
i. micro controller unit
The input device body is made of a combination of flexible polymers with shape memory which easily conform themselves to the user's physical dimensions. The input device may be worn on any finger of either hand.
Half and full press: The two RF controllers, located at the second and third parts of the finger, e.g. as shown in Fig. 1b, are constructed in a two-press manner - half and full press. Since the input device is a wearable mouse, it constantly follows any hand movement. Therefore, the user is typically afforded the ability to choose when s/he wants it to influence the cursor position and when not. The half-press mechanism helps her or him to easily achieve this, as described with reference to the state-charts of Figs. 1e - 1h, inter alia. The input device is typically multi-state and may be in one of several different states a - e at any given moment, such as but not limited to some or all of the following (an interpretive sketch of the press mechanism follows the list of states):
a. IR Laser state: enables remote control of large screens, as described above. The behavior of the operation algorithm is described in Fig. 1e, which is a suitable half-press state-chart.
b. Red Laser state: the device switches to a laser-pointer mode, emitting, instead of an IR laser detectable by the IR camera, a red laser in the spectrum viewable by the human eye. Using this mode the user may switch from controlling the mouse to highlighting important features on the screen, similar to the way a regular laser pointer works. Although the user has no effect on the cursor position, the other functionality of the input device, such as clicking and scrolling abilities, continues to work regularly.
c. IR LED mode 1 - touch screen: This mode enables the user to easily transform any large screen into a multi-touch screen. In this mode, the user may work with the input device adjacent to the screen while the active light source may be the IR LED instead of the IR laser. Operation, according to certain embodiments, is illustrated in Fig. 1f, which is a touch-screen state-chart. In order to help the user distinguish the exact contact point between the IR LED and the screen, the IR LED may have a stretching mechanism which may enable it to protrude from the input device similarly to a stylus pen.
d. IR LED mode 2 - personal computer: This mode enables the user to control desktop and laptop computers. In this mode's settings, the IR camera is located at the upper edge of the computer screen, facing down towards the user's hands, which typically lie on the keyboard, as shown in Fig. 1g. The user uses both hands to work with the keyboard and, when s/he wishes to move the mouse cursor or execute any other mouse operation such as clicking or scrolling, s/he does not need to lift her or his hand from the keyboard and hold the mouse - s/he uses the input device while her or his hands are still on the keyboard. The IR camera located on the computer's screen detects the IR signals emitted from the input device and sends them to the PC application module on the controlled computer, which transforms them into mouse movements on the screen. The behavior of the operation algorithm is as shown in Fig. 1e, the half-press state-chart.
e. IR LED mode 3 - continuous PC: This mode may be exactly the same as IR LED mode 2 - personal computer (d), except that the IR LED typically works constantly. Therefore, typically, any movement always influences the cursor position without the need to half press the button first. The behavior of the operation algorithm is described in Fig. 1h, which is a continuous-PC state-chart.
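The half/full-press behavior underlying states a, d and e may be paraphrased, purely interpretively, as a small state machine; the class, state and method names below are illustrative, not a normative reading of Figs. 1e - 1h:

```python
# Interpretive sketch of the half/full-press mechanism: a half press turns
# the light source on so that hand motion steers the cursor; a full press
# additionally emits a click event; release returns to idle. In
# continuous-PC mode (state e) the LED stays on and only the click logic
# applies.
class PressButton:
    IDLE, HALF, FULL = range(3)

    def __init__(self, light_source):
        self.state = PressButton.IDLE
        self.light = light_source  # IR laser or IR LED, per work mode

    def on_half_press(self):
        if self.state == PressButton.IDLE:
            self.light.on()               # cursor now follows hand movement
            self.state = PressButton.HALF

    def on_full_press(self):
        if self.state == PressButton.HALF:
            self.state = PressButton.FULL
            return "click"                # sent wirelessly to the host

    def on_release(self):
        if self.state != PressButton.IDLE:
            self.light.off()              # cursor position frozen again
            self.state = PressButton.IDLE
```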
The input device enables the user, in front of any kind of audience, to achieve total remote mouse control, typically including some or all of the following: left click, right click, cursor control, scrolling and double click, while offering advantages such as but not limited to some or all of the following:
a. The user is not limited to a single position and may walk freely around the presentation area, as large as it may be.
b. The presenter does not need to hold anything in her or his hands. Both her or his hands are totally free and s/he may use her or his body language in a full, natural manner and even hold objects during the presentation. Furthermore, the input device is built in such a way as to leave the tip of the pointing finger totally free, which allows her or him to comfortably type on a keyboard without first laying another object down.
c. The user may turn the input device into a laser pointer at any point while still maintaining control of any mouse functionality except for cursor position, such as but not limited to some or all of the following: Left click, Right click, scrolling, double click.
d. The user may use any kind of big screen, whether an LCD screen or a projection screen, as a touch screen.
Using the input device with desktop and laptop computers, the user may have full control of the keyboard and mouse while both her or his hands remain on the keyboard. The advantages of such an approach may include but are not limited to some or all of the following:
a. Effectiveness - the user's work rate increases as the move from mouse to keyboard requires zero setup time. Many users are weary of the constant movement of the hands from mouse to keyboard and work with one hand on the mouse and one hand on the keyboard. This approach suffers from limitations because keyboards are planned for two-hand use.
b. Ergonomics - the frequent move from keyboard to mouse wears the wrist and shoulder joints - phenomena which are overcome by certain embodiments of the input device shown and described herein. Additionally, it is known that daily work with a regular mouse may, in the long term, create issues such as Carpal Tunnel Syndrome and Repetitive Stress Injury. Beyond the pain, these effects cause considerable financial loss due to decreased work rate and, in more serious cases, punitive damages. The ability to work without needing to lift the hands every time to hold another object may drastically decrease such phenomena.
c. Space resources - the mouse located next to the computer requires additional table space, especially when padding surfaces are provided due to an inherent lack of ergonomics. This limitation becomes critical especially when working in limited spaces, usually with a laptop.
Laptop control may be performed using an IR camera hooked onto the top of the screen. According to certain embodiments, the IR camera facing the keyboard area becomes embedded in laptops at the manufacturing stage. This may transform the input device into a completely independent unit, communicating with the laptop without the need for external devices. Fig. 1i illustrates a possible implementation in which the IR camera is embedded into the laptop's body.
The input device market is moving towards multi-touch control, e.g. the iPhone and other devices. This multi-touch control, however, has barely been extended to desktops and laptops due to the price and limitations of touch-screen technology. An input device worn on one of the fingers of both the left and right hands brings multi-touch functionality to desktop / laptop / media center / projector screen applications. The ability to control objects on the screen using both hands in an intuitive way is useful for everyday computer work-flow.
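One plausible (assumed, not claimed) way for the host software to keep two simultaneous IR points associated with the left-hand and right-hand sleeves is nearest-neighbour matching against each point's last known position, sketched below:

```python
# Assumed approach: nearest-neighbour matching to keep two simultaneous
# IR detections associated with the left- and right-hand sleeves per frame.
import math

def assign_points(points, last):
    """points: up to two (x, y) detections in the current frame.
    last: {'left': (x, y), 'right': (x, y)} positions from the previous frame.
    Returns a dict mapping 'left'/'right' to their updated positions."""
    assigned = {}
    remaining = list(points)
    for hand in ("left", "right"):
        if not remaining:
            break
        nearest = min(remaining, key=lambda p: math.dist(p, last[hand]))
        assigned[hand] = nearest
        remaining.remove(nearest)
    return assigned
```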
Having described various modes of operation for the input device and each mode's advantages, a docking station useful in association with the input device is now described. Referring now to Figs. 2a - 2d, the interaction between the input device and the docking station may be as follows:
a. The user inserts her or his finger into the docking station while the input device is on the finger.
b. The user rotates her or his finger 90 degrees clockwise.
c. The user takes the finger out of the docking station, leaving the input device inside.
The action of wearing the input device is symmetrical:
a. The user inserts her or his finger into the docking station
b. The user rotates her or his finger 90 degrees counterclockwise
c. The user takes the finger out of the docking station with the input device on it, ready to be used.
The advantages of the docking station may include but are not limited to some or all of the following:
a. The docking station enables the user to wear and remove the input device from her or his hand quickly and efficiently and, more importantly, without the use of the other hand.
b. The input device is typically formed of relatively delicate micro-electronics. The docking station protects the input device while in transit, e.g. in a laptop bag.
c. The docking station provides a very respectable casing for the input device.
d. The docking station may connect to electricity and be used as the input device's charger. Furthermore, the docking station may be equipped with an internal large-capacity rechargeable battery, and so be able to charge the input device even when the docking station is not connected to electricity. This overcomes the space limitations in the input device, extending its battery life much further.
Conventionally, cordless input devices such as wireless keyboards or mouse devices often function in a way allowing them to communicate with a single specific plug which has their ID encoded into it. In this mode of operation, moving the control from one computer to the next typically involves the user disconnecting the plug from the first computer and reconnecting it to the second computer. A more convenient mode, used mainly in Bluetooth devices such as cell phones and earpieces, allows the user to connect any device to any other device using a process called "Pairing", in which the devices are authorized to work with each other. The PC application, the IR camera and the entire input device's design typically enable the user to work in this mode of operation. Advantages created by this communication mode may include but are not limited to some or all of the following:
a. All for one - every input device, given sufficient authorization, instantly assumes full control of the computer.
b. One for all - my input device, given sufficient authorization, instantly assumes full control of every other computer that is input-device enabled according to certain embodiments of the present invention.
c. Groups - The input device PC application may enable the definition of user groups, and so enable multi-user control of the controlled computer.
Examples may include but are not limited to the following:
a. During a lecture, while the presenter is presenting, the active user group shall be "presenter" but when students ask questions, the presenter switches the active user group to "student" and the entire discussion becomes interactive.
b. During a board meeting, while the marketing director is presenting, the user group may include, say, a marketing director, the CEO and the chairman. The CEO and chairman may intervene and assume control at any point during the meeting, resulting in a dynamic, multi-user control system which revolutionizes the way people collaborate with computers.
These advantages transform the input device into a product with a 'network effect': the customer value of owning an input device increases with the number of input devices in use. The input device is no longer the computer's mouse; it is a personal mouse, just as cell phones are personal rather than office communication devices.
A Multi Disciplinary Personal Input Device constructed and operative in accordance with an alternative embodiment of the present invention is now described. The Input device's components may include some or all of the following:
1. On / Off switch
2. 3 or more light sources, such as but not limited to some or all of the following:
a. Infra red laser
b. Infra red LED
c. Red Laser
3. 1 or more indicator lights
4. 1 or more buttons constructed in a two-press manner: half and full press
5. 1 or more Microphones
6. 1 or more Speakers
7. Selection controller
8. Touch pad scrolling bar
9. Force sensors
10. Rechargeable energy unit
11. Memory unit
12. Wireless communication unit
13. Micro controller unit
The Input device body may be made of a combination of flexible polymers with shape memory which easily conform to the user's physical dimensions and enable her or him to move the finger freely. The Input device may be worn on any finger of either hand, e.g. as shown in Fig. 1b.
• The controls that are controlled by the thumb are on the upper part of the finger
• The light source is found at the front of the finger, typically at the middle joint, and may be:
o Red Laser
o IR Laser
o IR LED
The edge of the Input device which is positioned, when worn by the user, at the tip of her or his finger, typically pivots or revolves, e.g. a full 360 degrees, such that some or all of the following four components may be disposed azimuthally at the tip of the user's finger, e.g. on its four sides respectively, approximately 90 degrees apart:
a. Touch pad scrolling bar
b. Force sensor
c. IR LED + Force Sensor (both not shown)
d. Speakers
The other components are not placement sensitive and may be placed anywhere on the Input device. The Input device's Operating Mechanism typically includes some or all of the following: RF controllers, scroll bar, cursor position control, integration of mouse clicks, scroll messages and beam striking point into a controlled device's operating system with provision of user feedback on her or his own actions e.g. as with a mouse, light sources, half-press mechanism, and force sensing mechanism. Each of these is described in detail below, according to example embodiments of the present invention:
RF Controllers: The two or more RF controllers may be located at the second and third segments of the finger, but not necessarily, e.g. as shown in Fig. 1b. Each of them is able to transmit direct events to the application residing on the controlled device via any kind of wireless communication. The controlled device may be a laptop / desktop computer, mobile phone, camera, television or substantially any device that employs Human-Computer Interaction (HCI). Those events may then be interpreted as mouse clicks (e.g. left and right) in a way similar to the way a cordless mouse operates.
Scroll bar: The scroll bar is a touch-sensitive electronic pad which simulates the operation of a scroll-wheel, creating signals that are transferred to the application in the same way. The Input device's touch scroll bar is not visible in Fig. 1b as it is right under the thumb in the image. In summary, the scrolling operation of the Input device is typically effected by sliding the thumb along one side of the finger in any direction.
Cursor position: Input device control of the cursor position, according to certain embodiments, is now described, using the remote control environment being operated by the IR Laser as an example. Other environments / light sources, as described below, may share the same control scheme. A laser beam in the IR / near IR spectrum is emitted from the Input device, and strikes the screen at a given point. That point's coordinates are transferred to the application.
There are several ways to detect that point such as but not limited to some or all of the following:
a. After striking the screen, the laser beam is scattered into the space of the room. An IR camera constantly films the screen area and detects the points on the screen from which the laser beams are being scattered. The camera has been pre-calibrated to know the exact area of the screen within its field of view, so that it may convert the light source's absolute position into its exact relative position on the screen. The camera then sends this input to the application. Alternatively, part or all of the abovementioned processing may take place in the application on the controlled device rather than at the camera.
b. Two or more IR cameras are located on the ribs of the screen, constantly videoing the screen and sending their input to a Central Computing Unit (CCU). Then, using mathematical and trigonometric computations on the integrated input, the CCU detects the contact point between the laser beams and the screen, sending it to the PC application.
c. The screen as a whole is an optic sensitive screen, which may self-detect the striking point of laser beams at a given range of the electromagnetic spectrum.
d. The image is typically not projected onto the screen frontally, i.e. from the side of the user, but rather from behind (back projection). The laser beam hits the screen, scatters into the space of the room, and is absorbed by the IR camera positioned behind the screen close to the projector. The camera sees the point on the screen from which the beam emanates, and thus identifies where on the screen the user is pointing.
Integration of mouse clicks, scroll messages and beam striking point into a controlled device's operating system, and provision of user feedback on her or his own actions: a module of the application typically receives some or all of the abovementioned inputs, e.g. some or all of mouse clicks, scroll messages and beam striking points, and integrates them into the controlled device's OS. The user receives feedback on her or his actions as if s/he were using a mouse (e.g. the mouse cursor may move on the screen, click open windows and menus, etc.) and the interaction continues. Fig. 3 is a pictorial diagram of this process according to certain embodiments.
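By way of illustration only, the following minimal sketch shows one way such an integration module might merge camera-reported beam coordinates with wireless button and scroll events and forward them to the host operating system; all names (BeamPoint, OsIntegrator, os_api) are hypothetical and not part of the original disclosure.

```python
# Minimal sketch of the integration module of Fig. 3; os_api is an assumed
# wrapper around whatever cursor/click API the host OS exposes.
from dataclasses import dataclass

@dataclass
class BeamPoint:
    x: int  # screen x coordinate reported by the camera
    y: int  # screen y coordinate reported by the camera

@dataclass
class ButtonEvent:
    button: str    # "left" or "right"
    pressed: bool  # True = down, False = up

class OsIntegrator:
    """Translates input-device events into operating-system cursor actions."""

    def __init__(self, os_api):
        self.os_api = os_api  # injected wrapper around the host OS input API

    def on_beam_point(self, pt: BeamPoint):
        self.os_api.move_cursor(pt.x, pt.y)  # the cursor follows the beam

    def on_button(self, ev: ButtonEvent):
        if ev.pressed:
            self.os_api.mouse_down(ev.button)
        else:
            self.os_api.mouse_up(ev.button)

    def on_scroll(self, delta: int):
        self.os_api.scroll(delta)  # scroll messages pass straight through
```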
Light Sources: As described above, the Input device typically includes several light sources such as but not limited to some or all of the following:
a. Infra red laser - may be positioned, e.g., laterally on the finger between the first and second joints (e.g. middle of the finger) or parallel to the finger between the second and third joints (e.g. base of the finger).
b. Red Laser - may be positioned as described above for the infra red laser.
c. Infra red LED - may be positioned on the finger tip on the revolving piece. However, the source of the IR LED may also be positioned on any part of the finger.
The camera typically identifies the source of the IR LED in exactly the same manner as it identifies the point where the laser beam scatters, as described above; the rest of the process is also as described above. The camera cannot, of course, identify the actual red laser beam; the red laser instead emphasizes elements on the screen, without the ability to act upon those elements. Since the Input device is a wearable device, it constantly follows hand movement. Therefore, the user has the ability to choose when s/he wants it to influence the cursor position and when not.
The control of the light sources may be effected by one or both of two mechanisms, half-press and force-sensing, each described in detail below. The user may determine which light source each mechanism controls at any given moment, either from the application interface or from the Input device itself, e.g. using the Selection Controller. In addition, the user may always choose to leave the light source on continuously in order to manipulate the pointer location at all times. Half Press Mechanism: Each of the RF controllers is constructed in a two-press manner: half and full press. The operational algorithm of this mechanism is described in Fig. 1e, which is a state chart of the half-press mechanism. A Button Up/Down event may be conveyed either by wireless communication or by a distinct encoding of the light emitted from the Input device and received by the IR camera, which effects a system command when a specific control is pressed. In effect, all the information that is transferred by wireless communication may instead be transferred by a specific encoding of the light source.
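A hedged sketch of such a two-stage button follows. Binding half press to light-source activation and full press to Button Down/Up events is an assumption consistent with the description above, and the light_source and radio helpers are hypothetical.

```python
# Illustrative state machine for the half/full press mechanism of Fig. 1e.
from enum import Enum

class PressState(Enum):
    RELEASED = 0
    HALF = 1
    FULL = 2

class HalfPressButton:
    def __init__(self, light_source, radio):
        self.state = PressState.RELEASED
        self.light = light_source  # controls the active light source
        self.radio = radio         # wireless link to the host application

    def update(self, new_state: PressState):
        if new_state == self.state:
            return
        if new_state == PressState.HALF:
            self.light.on()                 # half press: aim the beam
        elif new_state == PressState.FULL:
            self.radio.send("button_down")  # full press: commit the click
        elif new_state == PressState.RELEASED:
            if self.state == PressState.FULL:
                self.radio.send("button_up")
            self.light.off()
        self.state = new_state
```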
Force Sensing Mechanism: The revolving part at the finger tip has a specific type of force sensor such as but not limited to an FSR. When the applied force exceeds a specific threshold, which may be user-configurable, the appropriate light source is turned on. When the applied force falls back below the threshold or drops to zero, the light source turns off. Enable/disable may be enacted with this mechanism using the pad of the finger or the area of the fingernail of the inner/crossing finger, respectively. The user may enable this in any suitable manner, such as but not limited to:
a. Placing a capacitor along the length of the nail where the input is the direction of its discharge; or
b. A physical control positioned in the area of the fingernail.
Light Source Interpretation according to certain embodiments is now described. Typically, the Input device application may work in multiple modes of operation such as touch screen mode, mouse pad mode, and position-only mode, e.g. as described below. The transition between modes may be enacted using the Selection Controller of the Input device or from the interface of the application.
App_Touch Screen: When the light source is turned on, the application positions the cursor in the appropriate place and executes a Left Click Down. When the light source is turned off, the application executes a Left Click Up in the exact place where the light source appeared before turning off. When the light source moves, the application executes a mouse movement; since typically this only happens between the Left Click Down and the Left Click Up, it therefore acts as a Drag & Drop. When the light source turns on/off/on/off, the application interprets this action as a double click.
App_Mouse Pad: The light source is always interpreted as a mouse movement, typically with exceptions such as but not limited to some or all of the following:
i. Left Click: less than a second (or other user-selectable time period) has passed from the moment the light source was turned on to the moment it was turned off, and during this time period, e.g. a second, the light source did not move, even minutely.
ii. Double Click / Drag and Drop: A small movement of the finger between the two clicks may cause the operating system not to recognize a Double Click. Therefore, each single click starts a timer. If the system identifies another click within this time, regardless of location, the second click is implemented in exactly the same place as the first click. If the system identifies that the light turned on during this time without turning off, regardless of location, the system implements a Drag & Drop starting exactly from the position of the first click.
App_Position Only: In this mode, the light source is interpreted as mouse movement only and cannot execute any type of mouse click.
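Returning to the App_Mouse Pad rules above, they lend themselves to a small timer-driven interpreter. The sketch below is illustrative only: the timing constants and the os_api helper are assumptions, and it collapses the double-click and drag cases into a single press anchored at the first click's position for brevity.

```python
# Hedged sketch of the App_Mouse Pad click rules.
import time

CLICK_WINDOW = 1.0         # max light-on time still counted as a click (user-selectable)
DOUBLE_CLICK_WINDOW = 0.5  # timer armed after every single click

class MousePadInterpreter:
    def __init__(self, os_api):
        self.os_api = os_api
        self.on_since = None        # when the light source last appeared
        self.moved = False          # did the source move while on?
        self.last_click_pos = None  # position of the previous click
        self.last_click_time = 0.0
        self.dragging = False

    def light_on(self, pos):
        self.on_since, self.moved = time.time(), False
        if self.last_click_pos and time.time() - self.last_click_time < DOUBLE_CLICK_WINDOW:
            # Re-activation during the click timer: anchor a press at the
            # first click's position, regardless of the new location.
            self.os_api.mouse_down_at(self.last_click_pos)
            self.dragging = True

    def light_moved(self, delta):
        self.moved = True
        self.os_api.move_cursor_by(delta)  # relative movement, like a mouse pad

    def light_off(self, pos):
        if self.dragging:
            self.os_api.mouse_up_at(pos)   # drop at the release point
            self.dragging = False
        elif self.on_since and time.time() - self.on_since < CLICK_WINDOW and not self.moved:
            self.os_api.click_at(pos)      # short, stationary flash = left click
            self.last_click_pos, self.last_click_time = pos, time.time()
        self.on_since = None
```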
Dots per Inch: Optionally, in the IR LED configuration, the user may reduce the dpi while tracking, using the keyboard for example, thereby enabling her or him to pinpoint very small icons with greater ease.
An ejectable stylus option may be provided, according to which a small stylus connected to the Input device protrudes slightly from the finger tip. The connection to the finger tip may be effected by screwing, applying pressure or any other mechanical means. This mechanism gives the user maximum precision at the point of contact with the screen for applications in which precision is important. The stages of operation may include some or all of the following, suitably ordered e.g. as follows:
a. When the user wants to press with her or his finger on a specific area, the stylus, because it protrudes, touches the area and pressure is therefore applied to it.
b. The base of the stylus, which completely absorbs the pressure applied to the stylus, pushes on the force sensor typically located on the revolving side of the finger and switches the IR LED on/off.
c. Instead of scattering from the IR LED bulb, the light enters the stylus at the end connected to the bulb and exits only from the other end, which contacts the screen, similar to the operation of fiber optics.
d. The camera identifies the light source at the point of the stylus.
Optionally, a pressure differentiation feature is provided according to which the Input device communicates to the application the level of pressure applied to the force sensing mechanism. The application may treat the light sources differently based on the level of pressure applied. Thus the user may execute different actions based on how much pressure s/he puts on the surface.
For example, gentle pressure may execute a mouse movement whereas more pressure would execute a Drag & Drop. The mechanism may also be controlled via the interface of the application. For example, the thresholds of the different pressure levels and/or the appropriate actions they perform may each be configurable, e.g. as sketched below.
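For illustration, configurable pressure thresholds might be stored as an ordered table; the numeric values and action names below are assumptions, not taken from the text.

```python
# Map a force reading to the action bound to the highest threshold it exceeds.
from typing import Optional

PRESSURE_ACTIONS = [      # ordered (minimum force, action) pairs; illustrative
    (0.1, "move_cursor"),
    (0.6, "drag_and_drop"),
]

def action_for_pressure(force: float) -> Optional[str]:
    """Return the action for the highest threshold the applied force exceeds."""
    selected = None
    for threshold, action in PRESSURE_ACTIONS:
        if force >= threshold:
            selected = action
    return selected

print(action_for_pressure(0.3))  # move_cursor
print(action_for_pressure(0.8))  # drag_and_drop
```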
Optionally, a Red Laser feature is provided according to which the device switches to a laser-pointer mode, emitting, instead of an IR laser detectable by the IR camera, a red laser in the human-visible spectrum. Using this mode the user may switch from controlling the mouse to highlighting important features on the screen, similar to the way a regular laser-pointer works. It is important to note that although the user then has no effect on the cursor position, the other functionality of the Input device, such as the clicking and scrolling abilities, continues to work as usual. Optionally, a multi-touch feature is provided according to which an Input device worn on one or more fingers of both hands enables the user to interact with the screen with full multi-touch ability. Different encoding of the light sources enables the application to associate each light source with the appropriate user, hand or finger.
Optionally, Four Way Scrolling enhanced input is provided. Using an Input device on each hand, the user may effect 4-way scrolling whereby one hand is responsible for horizontal scrolling and the other hand for vertical scrolling, without constantly moving the cursor back and forth. This is useful in programs like Excel, drawing boards and substantially any application where the file is larger than the visible screen size. The user may determine the role of each hand and finger. When the system receives events from different devices, or hands/fingers, simultaneously, it may pass them to the system alternately, one event from each device in turn, until all events have been passed, e.g. as sketched below. The outcome is circular scrolling on the screen that is completely open and free.
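One hedged way to realize this alternating pass-through is a round-robin merge over per-device event queues, as in this illustrative sketch (queue contents and device names are assumptions):

```python
# Round-robin merge of pending scroll events across devices/hands.
from collections import deque
from itertools import cycle

def merge_scroll_events(per_device_queues: dict) -> list:
    """Forward one pending event per device in turn until every queue drains;
    per_device_queues maps a device id to a deque of pending events."""
    merged = []
    for device in cycle(list(per_device_queues)):
        queue = per_device_queues[device]
        if queue:
            merged.append(queue.popleft())
        if not any(per_device_queues.values()):
            break  # all queues drained
    return merged

# Example: horizontal events from the left hand, vertical from the right.
queues = {"left": deque(["h+1", "h+1"]), "right": deque(["v-1"])}
print(merge_scroll_events(queues))  # ['h+1', 'v-1', 'h+1']
```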
Optionally, enhanced remote text input is provided. The Input device enables effective use of virtual keyboards whereby the user may type freely with two or more fingers in a much more intuitive way and also use shortcuts such as but not limited to Ctrl+C, Windows+E. Optionally, programmable gestures are provided.
The Input device application includes several common gestures such as the @ sign to open a mail application, or the V sign to promptly open and close a virtual keyboard. This option typically also enables the user to create an unlimited number of personalized gestures, each of which may execute a commonly used action of the user. In addition, an engine may be provided that alerts the user, and in certain cases blocks the user, from creating a new form that is too similar to a form already in use, thereby reducing recognition errors in the system.
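A similarity engine of this kind might, for example, resample each recorded path and compare mean point-to-point distance. The matcher below is a hedged stand-in: the threshold, the resampling scheme and all function names are assumptions rather than the disclosed method.

```python
# Illustrative gesture-similarity guard for newly defined gesture paths.
import math

def resample(path, n=32):
    """Pick n points spread evenly (by index) over a recorded [(x, y), ...] path."""
    step = (len(path) - 1) / (n - 1)
    return [path[round(i * step)] for i in range(n)]

def gesture_distance(a, b):
    """Mean point-to-point distance between two resampled gesture paths."""
    pa, pb = resample(a), resample(b)
    return sum(math.dist(p, q) for p, q in zip(pa, pb)) / len(pa)

def is_too_similar(new_path, stored_paths, threshold=20.0):
    """True if the new gesture would be confusable with an existing one."""
    return any(gesture_distance(new_path, s) < threshold for s in stored_paths)
```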
Typically, the mode of operation of this option is as follows:
a. Clicking on a button/combination of buttons, e.g. Double right click, notifies the application that the user is requesting to switch to Gesture command mode.
b. The application gives feedback to the user that s/he is now in Gesture Command mode.
c. The user receives feedback from the screen with regard to the shape s/he is trying to draw, in an attempt to help the user draw more accurate 'forms'.
d. At the end of the process, the system analyzes the form and implements the appropriate command.
Optionally, Embedded Flash Memory, e.g. disk-on-key functionality, is provided. The Input device has internal flash memory such that it serves as a disk-on-key as well. Transferring information may be executed in a number of ways, including using a USB port, using the Input device docking station e.g. as described below, WiFi and other connection methodologies.
The internal memory of the Input device enables the controlled environment to immediately load the user's preferences including for example gestures that s/he's created, dpi preferences etc.
Optionally, an embedded speaker and microphone are provided. The speaker is located on the rotating finger tip apparatus while the microphone may be placed anywhere. In this manner, the Input device may serve a communications function, e.g. as a Bluetooth earpiece, where the user places her or his finger wearing the Input device next to her or his ear.
Optionally, Multiple Environments are provided.
The Input device is a personal control mechanism and is not tied to a specific work environment like a personal computer. Via a pairing mechanism, the Input device may control any environment that supports it, e.g. like the communication between a Bluetooth device and multiple cellular phones.
The user may go from device to device, click on her or his unique user-ID and start interacting with it.
Optionally, social interfacing is provided.
The IR camera and the Input device control application may handle several light sources simultaneously and therefore enable several users to share a work space in the same environment. Because the Input device is a personal control device, the system may associate light sources with different users and therefore relate to each of them differently. Different encoding of the light source for each user, in addition to the wireless communication, may accurately identify each user at the time of the social interfacing. This individual identification enables management of different interactive permissions for every user. In addition, groups may be established, enabling permissions to be switched between sets of people rather than individuals.
Optionally, ergonomic posture is provided in that the Input device utilizes the natural position of the hand and wrist. Traditional mice are designed around the "palm down" posture that has been shown to contribute to Repetitive Stress Injury (RSI) and Carpal Tunnel Syndrome (CTS). The Input device applies a clinically proven technique to relieve pressure from the median nerve in the wrist, similar to the 'handshake position'. This position has been shown to reduce muscle load and the incidence of discomfort associated with RSI and CTS.
Optionally, ergonomic customization is provided in that the Input device is available in several sizes for maximum comfort to the user. The differences in size may be around the circumference and the length of the Input device. In addition, as described above, the Input device body may be made of a combination of flexible polymers with memory shape which may easily conform themselves to the user's physical dimensions.
Optionally, bidirectional synchronization is provided. Bidirectional communication between the controlled environment and the Input device enables automatic synchronization between the settings of the application and the internal settings of the Input device and vice versa. For example:
a. Control over the level of the laser beam through the application.
b. A change in the work modes of the application may cause the active light source to change as well.
The mechanism of the bidirectional synchronization optionally enables the system to pre-define the users' modes of operation such that any changes in one direction may automatically synchronize in the other direction. The user may always add additional operational modes.
Examples of how the mechanism may be applied to several popular work modes are now described. Settings for the Input device may include setting one of, say, two mechanisms for turning on the light source, e.g. Half Press Mechanism and Force Sensing Mechanism as described herein; setting one of, say, three Light Sources (IR LED, IR Laser and Red Laser); and setting one of, say, three modes of operation in the application e.g. App_Touch Screen, App_Mouse Pad and App_Movement Only. Examples of available modes of operation for the Input device and their settings may include but are not limited to the following:
a. Touch Screen: Force Sensing Mechanism + IR LED + App_Touch Screen
b. Mouse Pad: Force Sensing Mechanism + IR LED + App_Mouse Pad
c. Remote Control: Half Press Mechanism + IR Laser + App_Movement Only
d. Nearby Control (also referred to as 'KeyTop' control, e.g. as described below): Half Press Mechanism + IR LED + App_Movement Only
e. Red Laser: Half Press Mechanism + Red Laser + App_Movement Only
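Expressed as data, the work-mode table above might look as follows; the structure and identifiers are illustrative assumptions, not part of the original disclosure.

```python
# Work mode -> (trigger mechanism, active light source, application mode).
WORK_MODES = {
    "touch_screen":   ("force_sensing", "ir_led",    "app_touch_screen"),
    "mouse_pad":      ("force_sensing", "ir_led",    "app_mouse_pad"),
    "remote_control": ("half_press",    "ir_laser",  "app_movement_only"),
    "nearby_control": ("half_press",    "ir_led",    "app_movement_only"),
    "red_laser":      ("half_press",    "red_laser", "app_movement_only"),
}

def apply_work_mode(device, app, mode_name: str):
    """Push one consistent setting triple to the device and the application."""
    trigger, light, app_mode = WORK_MODES[mode_name]
    device.set_trigger(trigger)     # half press vs. force sensing
    device.set_light_source(light)  # IR LED / IR laser / red laser
    app.set_mode(app_mode)          # kept in sync over the wireless link
```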
Examples of compatible Environments are now described, but are not intended to be limiting:
1. Projected Environments: The IR camera may be mounted on the projector facing the screen, as shown in Fig. 1e. In projector environments, the user has full control from anywhere in the area and may also control the projected screen with multi-touch. This capability typically is provided regardless of whether the projection is from the rear or the front, e.g. due to the mechanism that revolves at the fingertip and/or due to the use of lasers. In addition, this architecture works with any projector, including pico projectors, whether standalone or embedded in another device.
2. Physical screens: Control of physical (also termed herein "tangible") screens, such as but not limited to LCD or plasma, may, if desired, be effected in exactly the same manner as projected, non-physical screens. The IR-camera is positioned such that it is facing the user. In this manner, the user may control any physical screen such as but not limited to IPTV, home theatre, and media center.
3. Keytop Control: This mode enables the user to control desktop and laptop computers. In this mode's settings, the IR camera is located at the upper rib of the computer screen, facing down towards the user's hand (typically lying on the keyboard). The user uses both hands to work with the keyboard and, when s/he wishes to move the mouse cursor or perform any other mouse operation such as clicking or scrolling, s/he does not need to lift her or his hand from the keyboard and hold the mouse: s/he uses the Input device while her or his hands are still on the keyboard. The IR camera located on the computer's screen detects the IR signals being emitted from the Input device and sends them to the PC application module on the controlled computer, which transforms them into mouse movements on the screen, e.g. as shown in Figs. 1g and 1i. Using the Input device with desktop and laptop computers, the user may have full control of the keyboard and mouse while both her or his hands remain on the keyboard.
Advantages of certain embodiments include but are not limited to some or all of the following:
a. Effectiveness - the user's work rate increases as the move from mouse to keyboard requires zero setup time. Many users are wary of constantly moving their hands between mouse and keyboard and work with one hand on the mouse and one hand on the keyboard. This approach suffers from limitations because keyboards are designed for two-handed use.
b. Ergonomics - The frequent move from keyboard to mouse wears the wrist and shoulder joints, a phenomenon which Input devices, according to certain embodiments of the invention, overcome. Additionally, it is well known that daily work with a regular mouse may, in the long term, create issues such as Carpal Tunnel Syndrome and Repetitive Stress Injury. Beyond the pain, these effects cause considerable financial loss due to decreased work rate and, in more serious cases, punitive damages. The ability to work without needing to lift the hands every time to hold another object may drastically decrease such phenomena.
c. Space resources - the mouse located next to the computer requires additional table space, especially when padding surfaces are provided due to its inherent lack of ergonomics. This limitation becomes critical especially when working in limited spaces, usually with a laptop.
d. Multi-touch abilities on a huge virtual mouse pad which may, for example, be an order of magnitude bigger than conventional physical embedded mouse pads.
Once the IR camera is embedded in laptops at the manufacturing stage, the Input device may become a completely independent unit, communicating with the laptop without the need for external devices.
4. Touch screens - The wearable input device typically is operative to turn any surface into a fully operative multi-touch screen utilizing a finger touch control method, e.g. as follows:
A. when the screen is projected onto the surface and the sensor is pointing at the projected screen, the user touches a specific location, thereby triggering the force sensor, and the active light source turns on;
B. when the screen is a tangible screen and the sensor points at the tangible screen, the user touches a specific location, thereby triggering the force sensor, and the active light source turns on;
C. when the surface has no screen on it at all and is used as a control surface, e.g. as a mouse pad controlling another screen, the user touches a specific location, thereby triggering the force sensor, and the active light source turns on.
Typically, there is no size limit for the controlled screen. For example, given a conventional projector, the apparatus of the present invention can, according to certain embodiments, convert a 200 inch wall-sized surface into an enormous interactive multi-touch screen.
A particular advantage of certain embodiments of the Input device architecture is that little development is required for future products that are to employ the input device shown and described herein. Typically, only two adjustments are made:
a. Mechanical adaptation of the IR camera to the new hardware or environment
b. Code adaptation to the new operating system.
For example, to adapt the Input device to future applications such as Head-Up Display based on augmented reality, the IR camera as described herein is embedded in the Head-up display and its code adapted to the HUD's operating system.
According to certain embodiments, a docking station is provided, e.g. as shown in Figs. 2a - 2d. The roles of the docking station typically include some or all of:
a. Enabling the user to wear and remove the Input device from her or his hand quickly and efficiently and without the use of the other hand.
b. Protecting the Input device while in transit, e.g. inside a laptop bag.
c. Providing a casing for the Input device
d. Charging the Input device. The docking station may be equipped with an internal large-capacity rechargeable battery, and so is able to charge the Input device even when not connected to electricity. This extends the Input device's effective battery life much further.
A preferred method for using the Input device shown and described herein in conjunction with a docking station, e.g. as shown in Figs. 2a - 2d, typically includes the following steps, suitably ordered e.g. as follows:
i. Wearing the device typically comprises:
a. The user inserts her or his finger into the docking station
b. The user rotates her or his finger 90 degrees counterclockwise
c. The user takes the finger out of the docking station with the Input device on it, ready to be used.
ii. Removing the Input device typically comprises:
a. The user inserts her or his finger into the docking station while the Input device is on the finger.
b. The user rotates her or his finger 90 degrees clockwise.
c. The user takes the finger out of the docking station, leaving the Input device inside.
Figs. 4a - 4b are isometric views of a wearable "sleeve" device for generating computerized input to electronic devices, according to certain embodiments of the present invention. Typically, many of the components may be located anywhere on the sleeves without adversely affecting functionality and only the following components of the apparatus of Figs. 4a - 4b are location sensitive, e.g. as shown: Horizontal LED (light emitting diode) 10; Vertical LED 20; IR (Infra red) laser 30, Red laser 40, Touch scrolling bar 50, buttons 60 and 70, State selection actuator 80 and Force sensing actuator 90.
Figs. 4c - 4e are pictorial illustrations of the device of Figs. 4a - 4b as mounted on and manipulated by a user's hands, according to certain embodiments of the present invention.
Fig. 5a illustrates an exemplary plurality of work modes according to which the sleeve device of Figs. 4a - 4b may operate; these work modes are typically selectable as alternative states, using the state selection actuator 80 of Figs. 4a - 4b.
Multiple state selection functionality of the apparatus of Figs. 4a - 4b, according to certain embodiments, is now described. Using the state selection actuator 80, the user may select the work mode in which s/he wants the sleeve device to operate. The work modes differ from each other along two dimensions: type of active light source, and software operating mode. It is appreciated that the particular work modes included in the table of Fig. 5a are merely exemplary and none, one, some or all of these may in fact be provided.
The dimensions along which work modes differ may include one or both of:
a. Active light source dimension of the sleeve's work mode: as described above, the sleeve device may have some or all of the following 4 light sources: IR Laser 30, Red Laser 40, Horizontal IR LED 10, Vertical IR LED 20. The active light source is typically triggered by the force sensing actuator 90.
b. Software operating mode dimension of sleeve's work mode: as described below e.g. with reference to Figs. 6a - 6d, the sleeve device typically includes a software application which is typically resident on the controlled host e.g. computer or mobile communication device being controlled by inputs generated using the input generating system shown and described herein. The software application typically has several selectable operating modes, e.g. as shown in the right-most column of the table of Fig. 5a, which differ from each other as to, e.g. input interpretation and operating method.
Each work mode is typically predesigned to control one or more environments such as one or more of the following controlled environments: projected screen environment (front or rear), physical screen environment, laptop/desktop computer environment, mobile devices with embedded pico projector, and interactive surfaces environment. The table of Fig. 5a summarizes seven exemplary work modes of the state selection actuator 80, some or all of which may be provided.
Four exemplary Controlled Environments (projected screen environment, physical screen environment, laptop/desktop computer environment, and interactive surfaces environment) controlled by the sleeve of Figs. 4a - 4b in accordance with a suitable work mode, e.g. as shown in the table of Fig. 5a, are now described in detail. The selfsame input generating system, typically including the sleeves of Figs. 4a - 4b, the sensor of Figs. 1c, 1g, 1i, 5b or 5c, and the software application described below with reference to Figs. 6a - 6d, may be used to control multiple interactive environments such as but not limited to the four environments described below. Typically, the software application described below with reference to Figs. 6a - 6d may interact with more than one sensor such that the user has the option of connecting an additional sensor to gain control of another environment simultaneously.
The Projected screen environment is now described in detail with reference to Figs. 1c and 5b. Suitable work modes for the Projected screen environment may include the following work modes from the table of Fig. 5a: 1 (Touch interaction), 2 (Remote projected screen interaction) or 5 (Power point interaction). The sleeve device of Figs. 4a - 4b works with any projector, including pico projectors, whether standalone or embedded in another device (e.g. a mobile phone), e.g. as shown in Fig. 5b.
The Tangible screens environment is now described in detail with reference to Fig. 5c.
Suitable work modes for the Physical screen environment may include the following work modes from the table of Fig. 5a: 3 (Remote tangible screen interaction) or 5 (Power point interaction). The user may control any computer based application such as but not limited to an IPTV or a media center. When the sensor is pointing at the screen, from anywhere in the room, the user may perform touch interaction with the screen using work mode number 1 (Touch interaction) of Fig. 5a.
The Laptop/Desktop computer environment is now described in detail with reference to Fig. lg. A suitable work mode for this environment may include the following work mode from the table of Fig. 5a: work-mode 4 (Laptop/Desktop computer interaction). The sensor may be mounted on an adjustable pivot which enables the user to raise it slightly and control the computer screen using work mode number 3 (Remote tangible screen interaction) e.g. as in environment 2 (Tangible screens). Typically, this environment employs optic communication between sleeve/s and sensor.
The Interactive surface environment is now described in detail with reference to Fig. 5d. In the Interactive surface the projecting unit and the sensor are both embedded within the surface's lower plane. A suitable work mode for this environment may be the following work mode from the table of Fig. 5a: work mode 6 (Surface interaction).
The input generating system shown and described herein typically provides communication channels between the following components: the sleeves of Figs. 4a - 4b, the software application as described below with reference to Figs. 6a - 6d, and a sensor, e.g. the IR sensor mounted on a projector in Fig. 1c, on a laptop computer in Figs. 1g and 1i, and on a mobile-communication-device-mounted pico-projector in Fig. 5b. The communication channels may for example comprise:
a. Bi-directional wireless communication between the sleeve device of Figs. 4a - 4b and the software application resident on the controlled host.
b. Optic communication between the sleeve device (sleeves) and the sensor.
c. Bi-directional wireless communication between the sensor and the software application resident on the controlled host.
Each of communication channels a - c are now described in detail.
a. Wireless communication channel between the sleeves of Figs. 4a - 4b and the software application resident on the controlled host:
When the user changes the sleeve device's work mode using the state selection actuator 80, the application software receives a respective event which changes the application's operating mode according to the table of Fig. 5a. However, if the sensor is static and used for a specific environment permanently, the user may notify the application software to always address that specific sensor with the suitable operation mode e.g. as described below with reference to Figs. 6a - 6b.
Each of the sleeve device's buttons 60 and 70 is able to trigger multiple events transmitted to the application software. Examples for such events include some or all of the following:
o Left hand left button down
o Left hand left button up
o Left hand right button down
o Left hand right button up
o Right hand left button down
o Right hand left button up
o Right hand right button down
o Right hand right button up
The sleeve device's touch scrolling bar 50 is able to trigger multiple events transmitted to the application software. Examples for such events include some or all of:
o Left hand scroll down
o Left hand scroll up
o Right hand scroll down
o Right hand scroll up
Each of the events transferred to the software application may contain background information regarding the user, such as but not limited to her or his identity and her or his sleeve device's current operating mode. Utilizing the bi-directional communication, aspects of the sleeve device's sleeves may be controlled and modified directly from the software application. Examples of this type of communication include but are not limited to: changing the sleeve device's work mode; embedding and modifying the user's ID and preferences in the sleeve's memory; and controlling the coding of the light sources in order to enable the sensor to distinguish between users, hands and light sources on the sleeve itself.
b. Optics communication channel between the sleeves of Figs. 4a - 4b and the sensor:
The optics communication between the sleeves of Figs. 4a - 4b and the sensor of Figs. 1c, 1g, 1i, 5b or 5c is now described. The optics communication between the sleeves and the sensor may be such that the sensor is pointing at the controlled screen, e.g. as in environments 1 and 4 in the table of Fig. 5a. Alternatively, the optic communication between the sleeves and the sensor may be such that the sensor is pointing at the sleeve device's sleeves which are mounted on the user's fingers, e.g. as in environments 1, 2, 3 in the table of Fig. 5a. Each of the above two embodiments is now described in detail. It is appreciated that environment 1 (Projected screens) of Fig. 5a is typically controlled by either of the above two embodiments, selectably, using a single sensor.
Sensor pointing at controlled screen:
In Environment 1 (projected screens), the sensor constantly monitors the projected screen area. When the user joins her or his thumb to a finger, on which the sleeve of Figs. 4A - 4B is mounted, s/he presses the force sensing actuator 90 which triggers the sleeve device's IR Laser according to the sleeve device's selected operating mode. The IR Laser 30 is then emitted from the sleeve device's sleeve and impinges upon the projected screen at a given point. After impinging upon the screen, the laser beam is scattered into the space of the room. The sensor detects the points on the screen where the laser beams are being scattered from.
In Environment 4 (surface interaction), the image is not projected onto the screen from the side of the user (frontal projection), but rather from behind (rear projection).
A touchless embodiment of the present invention is now described in detail with reference to Figs. 7a - 7b. In the touchless embodiment, the surface has a rear projection unit located at the bottom of the surface which projects onto the surface at a very acute angle. This obviates any need for a distance between the screen and the projecting unit, thereby allowing the surface to be a 'flat unit' as opposed to conventional rear-projection setups which require considerable distance between the screen and the projecting unit. Also, because the screen is projected onto the surface, light may pass through it, as opposed to physical screens like LCD and plasma. This enables the IR sensor, also located behind the screen next to the projecting unit, to "see" the light emitted from the sleeve device's light sources. As a result, the user need not actually touch the screen, which is an improvement over conventional touch systems. Instead: i. The user 'pinches' (joins her or his thumb and finger together), e.g. as shown in
Figs. 4d - 4e, typically adjacent a desired screen location, thereby applying pressure to the force sensing actuator 90 which triggers the sleeve device's IR laser according to the sleeve device's current operating mode.
ii. Responsively, the IR Laser 30 is emitted from the sleeve device and impinges upon the surface at the pinching point. After striking the screen, the laser beam is scattered into the space of the room behind the surface due to its transparent character as described above.
iii. The rear sensor detects the points on the screen from which the laser beams are scattering.
For example, Figs. 7a - 7b are front and back isometric illustrations of a flat surface with rear projection configuration, characterized by projection at a very obtuse angle such that projection may be from very close, in conjunction with a rear sensor, whose field of view typically comprises the entire screen. The 4 dotted line segments emanate from, and signify a suitable "field of view" for, both the camera and the projector which as shown includes the entire screen.
Sensor pointing at sleeves:
In Environment 1 (projected screens), the sensor constantly monitors the screen. When the user touches the surface on which the screen is projected s/he applies pressure to the force sensing actuator 90 which triggers one of the sleeve device's IR LED's according to the sleeve device's current operating mode. The light is then emitted from the IR LED and absorbed in the sensor which detects the exact position of the user's fingertip.
In Environment 2 (tangible screens), the sensor constantly monitors the space of the room. When the user joins her or his thumb and finger together, s/he presses the force sensing actuator 90 which triggers one of the sleeve device's IR LEDs according to the sleeve device's current operating mode. Light is then emitted from the IR LED and absorbed in the sensor, which detects the exact position of the user's fingertip.
In Environment 3 (laptop/desktop computers), the sensor constantly monitors the keyboard area or the space in front of the screen e.g., if the sensor is pivot-mounted as described above, according to its pivot direction. When the user joins her or his thumb and finger together s/he applies pressure to the force sensing actuator 90 which triggers one of the sleeve device's IR LED's, according to the sleeve device's current operating mode. Light is then emitted from the IR LED and absorbed in the sensor which detects the exact position of the user's fingertip.
The light sources may be coded, e.g. digital coding using discrete on/off states, or analog coding using continuous power variation, in a predetermined manner which enables the light sources to send background information to the sensor regarding the user with whom the light sources are associated. The background information may for example include the user's identity, her or his sleeve device's current operating mode, and the specific light source currently being used by the user's sleeve device.
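As one hedged illustration of digital (on/off) coding, a short frame might carry the user id, hand and active source so the sensor can tell users apart; the frame layout, field widths and marker below are assumptions, not the disclosed encoding.

```python
# Illustrative on/off frame coding for a light source.
def encode_frame(user_id: int, right_hand: bool, source_id: int) -> list:
    """Pack background info into 7 payload bits, framed by a start marker."""
    payload = ((user_id & 0x0F) << 3) | ((1 if right_hand else 0) << 2) | (source_id & 0x03)
    bits = [(payload >> i) & 1 for i in reversed(range(7))]  # MSB first
    return [1, 1, 0] + bits  # start marker followed by the payload bits

def decode_frame(bits: list) -> dict:
    """Inverse of encode_frame; assumes the start marker was already located."""
    payload = 0
    for b in bits[3:]:
        payload = (payload << 1) | b
    return {
        "user_id": payload >> 3,
        "right_hand": bool((payload >> 2) & 1),
        "source_id": payload & 0x03,
    }

frame = encode_frame(user_id=5, right_hand=True, source_id=2)
print(decode_frame(frame))  # {'user_id': 5, 'right_hand': True, 'source_id': 2}
```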
Whenever one of the sleeve device's buttons 60 or 70 is pressed and held for more than, say, 0.25 seconds (typically this value is a dynamic parameter), the active light source turns on according to the current work mode. This results in the sleeve device communicating with both the sensor and the application software concurrently, which facilitates, for example, remote drag & drop.
c. Wireless communication channel between the sensor of Figs. 1c, 1g, 1i, 5b or 5c and the software application resident on the controlled host:
The sensor constantly processes the data it receives including light source position and background information and sends the output to the software application. Utilizing bi-directional communication, every aspect of the sensor may be controlled and modified directly from the software application such as but not limited to any or all of the following: modifying the sensor's state - on, off, standby; creating initial communication with the sensor, and adjusting the sensor's sensitivity.
According to certain embodiments, the system of the present invention includes a wearable input device which is in two-way data communication with a controlling module, typically comprising a software application, which also is in two-way data communication with a sensor as described herein. The sensor in turn is typically in one-way data communication with the wearable input device. The controlling software application sends commands to an operating system of a controlled entity such as but not limited to a laptop, personal or other variety of computer (in which case the operating system may for example be a Windows operating system), or a cellular telephone or other variety of portable communication device (in which case the operating system may for example be a Symbian or Google Android operating system).
One implementation of the software application which serves as the controlling module is now described with reference to Figs. 6a - 6d. Generally, the software application receives input from the wearable input device/s and from the optic sensor, e.g. IR camera, and responsively sends control commands to the operating system of the controlled entity and, typically, also to the optic sensor (e.g. wake-up commands) and/or to the sleeve itself (e.g. change-work-mode commands). In the following description of an embodiment of the controlling software application, which is typically resident in the controlled electronic device, the optical data is, for simplicity, represented by infra red (IR) light and the wireless data is, for simplicity, represented by radio frequency (RF) communication.
Fig. 6a is a simplified block diagram illustration of interactions between various components of the input-generating system, and the input the software application receives from the hardware (sensor or wearable input device), according to certain embodiments of the present invention. The inputs may include some or all of the data set out in the table of Fig. 6B. In Fig. 6b, "event type" refers to the type of application operating mode-changing event received by the software from the sleeve, via the wireless communication channel between Sleeves of Figs. 4a - 4b and software application resident on controlled host, as described above.
The hardware layer is responsible for receiving the raw data, processing it and sending it to the application layer. For example, the application layer might use DLL files or drivers in order to read the data from the hardware. Once the inputs have been received by the hardware layer, they are typically transmitted into the application layer. One way of implementing this, for example, is by using two threads that are continuously running in parallel and reading the inputs: one thread reads the IR data sent by the IR Sensor and one reads the RF data sent by the wearable input device itself.
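One possible shape for those two parallel reader threads follows; the queue, the sensor and receiver handles, and the event tuples are illustrative assumptions.

```python
# Two daemon threads feed a shared queue: one for IR data, one for RF data.
import threading
import queue

events = queue.Queue()

def read_ir(sensor):
    while True:
        point = sensor.read()        # blocking read of IR data from the sensor
        events.put(("ir", point))

def read_rf(receiver):
    while True:
        message = receiver.read()    # blocking read of RF data from the device
        events.put(("rf", message))

def start_readers(sensor, receiver):
    for target, arg in ((read_ir, sensor), (read_rf, receiver)):
        threading.Thread(target=target, args=(arg,), daemon=True).start()
```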
Certain application modes are now described in detail. Once the application receives the inputs from the hardware layer, it decides what to do with the information. The action to execute may be decided using two factors:
a. The "state" of the application, also termed herein the "Application Mode".
b. The input that has been received.
Provision of a plurality of application modes is optional yet advantageous. Using different application modes allows the system to run several forms of user interaction. For example, in the 'normal mode' (default), when receiving an IR event, the application may change the position of the operating system's cursors. In a different mode, when viewing maps using Google Earth, the IR movements may be translated into map movement, which is done using APIs. The same may be done with the RF inputs. The application is aware of its mode, and functions according to the method of operation which is associated with that mode.
The application may change modes in certain situations, such as but not limited to:
a. When a specific event occurs, for example, when a specific button on the
Wearable input device has been pressed.
b. As a result of user interaction with the environment, for example, when the user is opening a new program.
c. When an outside event happens, for example, when a chat message appears on the screen.
Example application modes include:
Normal (default) mode: may be used in order to control one or more of the applications. In this mode, the IR and RF inputs may control the operating system's cursor movement and clicks. This mode typically incorporates various "Operation Modes" which control the cursors differently, such as but not limited to some or all of the following operation modes, each described in detail below: absolute position operation mode, touch operation mode, and mouse-pad operation mode.
a. Absolute Position - After retrieving the information from the IR Sensor, the application converts IR positioning values quantifying positions relative to the sensor's field of view into computer screen positioning values relative to the screen position, size and resolution, e.g. as per the calibration method described below. Once the application has calculated the IR position on the screen (X1, Y1), it commands the operating system to move the relevant cursor to the exact position (X1, Y1); when using Windows, for example, it may use a Windows API called SetCursorPos, e.g. as sketched below. In this mode, mouse clicks may be effected using the RF signals (Wearable input device button clicks). For example, the sequence 'Right hand left click down' + 'Right hand left click up' may implement a mouse click at the right cursor's current position.
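As a sketch, on Windows the conversion plus cursor move might look like this. The per-axis scale/offset calibration form is an assumption (see the calibration sketch further below), while SetCursorPos is the Windows API call named in the text.

```python
import ctypes  # Windows-only example

def sensor_to_screen(x, y, cal):
    """Map sensor coordinates to screen pixels using per-axis scale and offset."""
    return (int(x * cal["sx"] + cal["ox"]), int(y * cal["sy"] + cal["oy"]))

def move_cursor(ir_x, ir_y, cal):
    sx, sy = sensor_to_screen(ir_x, ir_y, cal)
    ctypes.windll.user32.SetCursorPos(sx, sy)  # move the OS cursor to (sx, sy)
```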
b. Touch - This mode of operation uses the same cursor movement processing as does the Absolute position mode of operation described above. However, in Touch operating mode, mouse clicks may also be triggered by the IR input, in addition to triggering by RF input as described with reference to the absolute position operation mode.
a. The application monitors the number of IR sources found by the IR Sensor.
b. When it senses a new IR source, it moves the mouse to the desired location on the screen AND sends a "mouse down" event to the operating system.
c. Likewise, whenever that IR source disappears, it sends a "mouse up" event to the operating system.
All of the above may be implemented for more than a single light source, typically limited only by the number of light sources the sensor may detect simultaneously. An example of a use case in which more than one light source may be usefully employed is controlling a multi-touch application (e.g. on Windows 7).
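In code, the touch operating mode's appear/disappear rule might be expressed as follows; the per-source tracking and helper names are illustrative assumptions.

```python
# Hedged sketch of the touch operating mode: new IR source = mouse down,
# disappearing source = mouse up, moving source = drag.
class TouchInterpreter:
    def __init__(self, os_api):
        self.os_api = os_api
        self.active = {}  # source id -> last known screen position

    def update(self, detections: dict):
        """detections maps currently visible IR source ids to screen positions."""
        for sid, pos in detections.items():
            if sid not in self.active:
                self.os_api.mouse_down_at(pos)   # new source: touch begins
            else:
                self.os_api.move_cursor_to(pos)  # source moved: drag
        for sid in list(self.active):
            if sid not in detections:
                self.os_api.mouse_up_at(self.active[sid])  # source gone: touch ends
        self.active = dict(detections)
```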
c. Mouse pad - The purpose of this mode of operation is to simulate a mouse pad using the IR information from the IR Sensor of Figs. 1c, 1g, 1i, 5b or 5c. This typically includes moving the cursor relative to the cursor's current location instead of to an absolute location. To do so, the application waits until a new IR source appears, and saves some or all of the following values:
a. The initial mouse positioning (A1, B1) (where the cursor currently is before being moved, exactly when the IR appears).
b. The initial IR positioning (X1, Y1). After the initial settings have been saved, the application monitors the movement of the IR. Upon reading that the IR has been moved, the application now typically knows:
a. The initial IR position (X1, Y1).
b. The current IR position (X2, Y2).
c. The initial cursor position on the screen (A1, B1).
Knowing these values, the application typically calculates the vector movement of the IR source (distance and direction), translates it into desired vector movement of cursor position (A2, B2) and moves the cursor accordingly.
Conversion ratio: Since the mouse pad process uses a relation between the IR movement and the cursor movement, the user may define a conversion ratio between the two. Setting this value may allow the user to move the cursor faster or slower, balancing speed with precision. For example, when setting this value to 2, a 10 inch movement of the IR source in space may trigger a 20 inch movement of the cursor on the controlled screen, e.g. as sketched below. In this mode, the application may simulate mouse clicks by reading the RF signals (Wearable input device button clicks) or by detecting an IR source that has been activated for a short period of time (less than a second, for example), just like on a tangible mouse pad.
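A sketch of the mouse-pad calculation, with the conversion ratio applied to the IR displacement; the function name is illustrative.

```python
# (X1, Y1), (X2, Y2), (A1, B1) -> (A2, B2), scaled by the conversion ratio.
def mouse_pad_target(initial_ir, current_ir, initial_cursor, ratio=1.0):
    dx = (current_ir[0] - initial_ir[0]) * ratio
    dy = (current_ir[1] - initial_ir[1]) * ratio
    return (initial_cursor[0] + dx, initial_cursor[1] + dy)

# Example: with ratio 2, a 10-unit IR move yields a 20-unit cursor move.
print(mouse_pad_target((0, 0), (10, 0), (100, 100), ratio=2))  # (120, 100)
```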
Fig. 6c is a table setting out example results of various Operation Modes within the Normal Application Mode when an IR source is detected.
Calibration mode: Calibration is the process of defining the relation between the IR sensor's scope of view and the screen. When entering or starting operation in Calibration Mode, the application shows visual indications on the screen/surface. Then, using the Wearable input device's IR source, the user signals each indicator's location to the IR sensor. After the application receives the IR data, it has two sets of data: coordinates of the indicators on the screen {(A1, B1), (A2, B2), etc.} and the corresponding coordinates in the sensor's scope from which the light was received {(X1, Y1), (X2, Y2), etc.}. After calibrating more than one point, the application uses trigonometric computation in order to convert the sensor's raw coordinates into screen coordinates. The calibration computations also typically handle the direction of the conversion. For example, the left and right points for a sensor located behind the screen (i.e. environment number 4 - surface interaction) are exactly opposite to those retrieved by a sensor located in front of the surface (i.e. environment number 1 - projected screens).
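For illustration, a two-point calibration reduces to a per-axis linear fit whose sign handles the mirrored rear-sensor case; the dictionary form matches the cursor-move sketch above and is an assumption, not the disclosed computation.

```python
def calibrate(screen_pts, sensor_pts):
    """Derive per-axis (scale, offset) mapping sensor coords to screen coords
    from two calibration pairs; scale comes out negative when the sensor sees
    the screen mirrored (e.g. a sensor mounted behind the screen)."""
    (a1, b1), (a2, b2) = screen_pts
    (x1, y1), (x2, y2) = sensor_pts
    sx = (a2 - a1) / (x2 - x1)
    sy = (b2 - b1) / (y2 - y1)
    return {"sx": sx, "ox": a1 - sx * x1, "sy": sy, "oy": b1 - sy * y1}

def to_screen(x, y, cal):
    return (cal["sx"] * x + cal["ox"], cal["sy"] * y + cal["oy"])

# Example: sensor x axis mirrored relative to a 1920x1080 screen.
cal = calibrate([(0, 0), (1920, 1080)], [(640, 0), (0, 480)])
print(to_screen(320, 240, cal))  # (960.0, 540.0)
```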
Application mode: Activated when an application that may use the Wearable input device's inputs in a special way is opened, such as but not limited to the Google Earth application. Because Google Earth may receive various kinds of inputs such as map moving, zoom in/out, pan and tilt, optic and wireless input received by the software application is converted into meaningful orders to Google Earth, as opposed to merely sending cursor movements and clicks to the operating system. To achieve this, when in application mode number 1, e.g. Google Earth, the software typically distinguishes between some or all of the following 3 sub-states (a) - (c), which differ from one another e.g. in their input interpretation: sub-state a - One IR point, Moving the map: when the sensor detects that only one IR source is moving, the application orders Google Earth to move the map according to the distance and direction of the IR movement.
sub-state b - Two IR points, Rotate and Zoom: When the sensor detects two IR points AND the user has entered the rotate and zoom sub-state, the application orders Google Earth to move and rotate the earth concurrently, e.g. as illustrated by the example set out in Fig. 6d.
sub-state c - Two IR points, Pan and Tilt: When the sensor detects two IR points AND the user has entered the pan/tilt sub-state, the application orders Google Earth to pan and tilt the earth in space, rather than rotating and zooming as in sub-state b. Examples of other special applications which may trigger a certain application mode include but are not limited to specific games where controlling the game is not achievable by conventional movement and click events, and advanced applications which employ complex interaction. The software typically provides individual support for each such application. Gesture mode: A "gesture mode" may be provided which is used to activate a specific operation based on a pre-recorded path of IR inputs. The Gesture Mode may include one or both of the following two sub-states: a. Definition sub-state - The user defines an IR path which the application records, and attaches an action to this path. Example: a first sign, e.g. the V sign, may open specific virtual keyboard software; a second, e.g. the @ sign, may open a mail application; a third, e.g. an 'e' sign, may open a browser application, etc.
b. Execution sub-state - The user decides that s/he wants to execute a gesture. S/he triggers the application to enter the execution sub-state, e.g. by holding down one button of the wearable input device for a period of at least a predetermined length, e.g. a few seconds. The user then moves one or both of her or his hands to create the relevant gesture path. The application then compares this path to the internal gesture database and, upon finding a match, runs the desired action.
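By way of non-limiting illustration only, the path comparison of the execution sub-state might take a form such as the following C sketch, in which gesture paths are assumed to be pre-resampled to a fixed number of points and compared by mean point distance after size normalization; the identifiers, threshold and matching metric are illustrative assumptions, not a description of any actual embodiment.

/* Illustrative sketch only: match a recorded IR path against a
 * gesture database. Database paths are assumed pre-normalized. */
#include <stdio.h>
#include <math.h>

#define NPTS 8                        /* points per resampled path */

typedef struct {
    double x[NPTS], y[NPTS];
    const char *action;               /* e.g. "open virtual keyboard" */
} gesture_t;

/* Scale a path into a unit box so that gesture size and screen
 * position do not affect matching. */
static void normalize(double *x, double *y) {
    double minx = x[0], maxx = x[0], miny = y[0], maxy = y[0];
    for (int i = 1; i < NPTS; i++) {
        if (x[i] < minx) minx = x[i];
        if (x[i] > maxx) maxx = x[i];
        if (y[i] < miny) miny = y[i];
        if (y[i] > maxy) maxy = y[i];
    }
    double sx = (maxx > minx) ? maxx - minx : 1;
    double sy = (maxy > miny) ? maxy - miny : 1;
    for (int i = 0; i < NPTS; i++) {
        x[i] = (x[i] - minx) / sx;
        y[i] = (y[i] - miny) / sy;
    }
}

/* Returns the action of the closest gesture, or NULL if no entry
 * is within the (illustrative) acceptance threshold. */
const char *match_gesture(const gesture_t *db, int n, double *x, double *y) {
    normalize(x, y);
    const char *best = NULL;
    double best_d = 0.25;             /* acceptance threshold */
    for (int g = 0; g < n; g++) {
        double d = 0;
        for (int i = 0; i < NPTS; i++)
            d += hypot(db[g].x[i] - x[i], db[g].y[i] - y[i]);
        d /= NPTS;
        if (d < best_d) { best_d = d; best = db[g].action; }
    }
    return best;
}

int main(void) {
    gesture_t db[1] = {{ {0, 0.15, 0.3, 0.45, 0.55, 0.7, 0.85, 1},  /* a 'V' */
                         {0, 0.3, 0.6, 1, 1, 0.6, 0.3, 0},
                         "open virtual keyboard" }};
    double px[NPTS] = {10, 25, 40, 55, 65, 80, 95, 110};   /* raw IR path */
    double py[NPTS] = {200, 230, 260, 300, 300, 260, 230, 200};
    const char *a = match_gesture(db, 1, px, py);
    printf("%s\n", a ? a : "no match");
    return 0;
}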
Optionally, multi-touch functionality and/or multi-user support are provided by the software application. According to this embodiment, e.g. as shown in Fig. 4c, the Wearable input device typically includes a pair of wearable input devices which are enantiomers, i.e. mirror images of each other, such that they may be worn on both hands; the IR Sensor may send positions of a plurality of IR points to the application simultaneously and the software application is designed to handle multiple IR inputs concurrently. With operating systems that support multi-touch, e.g. Windows 7, the standard multi-touch API may be employed in order to control default multi-touch behavior. With operating systems that do not support multi-touch, a multi-touch environment may be simulated. For example, multi-touch painting software may be created that displays several simulated cursors on the screen based on the detected IR sources. Since the RF and IR data typically include a unique identifier of the Wearable input device (e.g. user ID plus identification of the hand as either left or right), the application may associate each cursor on the screen with the unique identifier of the Wearable input device operating it, even when the screen is operated by more than one user. When receiving an RF event (a left-click-up, for example, also containing the Wearable input device's unique identifier), painting occurs on the screen using the appropriately corresponding cursor and without affecting other users' cursors.
Optionally, user preferences are accommodated by the software application. Some or all of the variables that define the behavior of the application in specific modes may be saved in the software application's internal memory. Each of these variables may be saved per specific user, per specific sensor, or any combination of the two. Examples include:
A. Conversion ratio - a certain user defines that whenever s/he interacts with the system using the 'mouse pad' operating mode described herein, regardless of the controlled environment or the sensor being used, s/he always wants to use a conversion ratio size of, say, 4.
B. Calibration - a certain sensor is permanently mounted atop a stationary projector. A user defines that regardless of the active user identity, the calibration setting may always be a particular value, for that specific sensor.
C. Gestures - a certain user may define that when s/he interacts with sensor A s/he wants a particular gesture V to do something, but when s/he interacts with sensor B s/he wants the gesture V to do something else. The user may also define different gestures on each sensor.
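A non-limiting C sketch of how such per-user, per-sensor preference resolution (examples A - C above) might be organized follows; the wildcard convention, scoring rule and all names are illustrative assumptions only.

/* Illustrative sketch only: resolve a preference for a (user, sensor)
 * pair, preferring the most specific stored entry. 0 means "any". */
#include <stdio.h>
#include <string.h>

typedef struct {
    int user_id;      /* 0 = any user   */
    int sensor_id;    /* 0 = any sensor */
    const char *key;
    int value;
} pref_t;

/* Exact user+sensor beats user-only, which beats sensor-only,
 * which beats the global default. */
int lookup_pref(const pref_t *p, int n, int user, int sensor,
                const char *key, int fallback) {
    int best_score = -1, best_val = fallback;
    for (int i = 0; i < n; i++) {
        if (strcmp(p[i].key, key)) continue;
        if (p[i].user_id && p[i].user_id != user) continue;
        if (p[i].sensor_id && p[i].sensor_id != sensor) continue;
        int score = (p[i].user_id ? 2 : 0) + (p[i].sensor_id ? 1 : 0);
        if (score > best_score) { best_score = score; best_val = p[i].value; }
    }
    return best_val;
}

int main(void) {
    pref_t prefs[] = {
        {7, 0, "conversion_ratio", 4},   /* like example A: user 7, any sensor */
        {0, 3, "calibration_set",  2},   /* like example B: sensor 3, any user */
    };
    printf("%d\n", lookup_pref(prefs, 2, 7, 1, "conversion_ratio", 1)); /* 4 */
    return 0;
}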
One implementation of hardware components of a wearable input device constructed and operative in accordance with certain embodiments of the present invention, is now described with reference to Figs. 8a - 8q. It is appreciated that this and other detailed implementations are described herein merely by way of example and are not of course intended to be limiting.
Fig. 8a is a simplified block diagram of hardware components of a wearable input device constructed and operative in accordance with certain embodiments of the present invention. As shown, three light sources are provided which emit light at a certain wavelength and optical power in order to activate (or, in the case of the red laser, not activate) the sensor. The IR laser diode emits light at a wavelength of at least 800nm, where the wavelength corresponds to the sensor filter's frequency response. The laser diode typically includes a built-in photodiode which absorbs light emitted from the laser and may be interfaced to a feedback circuit to monitor and indicate laser activation. The IR laser diode may for example comprise a 5.6mm diameter diode cased in a TO-18 footprint as shown in Figs. 8b - 8d. In order to produce a fine, small, circular beam shape, a specific laser collimator may be mounted on top of the diode; this collimator typically matches the 5.6mm diode and may include a holding tube with a locking ring and a 4mm plastic lens.
As for laser diode driving circuits, an example implementation thereof is shown in Fig. 8e. The IR laser and red laser may be driven using the same method. A current source controlled by an MCU may be used to supply a steady current to the laser diode. Trimmers R13 and R17 may be used to calibrate this current for the IR laser and red laser respectively, thereby controlling the optical power emitted by each laser. Signals "SHTD_IR_LASER" and "SHTD_RED_LASER" are used to turn on the current sources supplying current to the laser diodes. The "LS_VOLTAGE_SHT" signal turns the voltage supply to the lasers on and off. This method assures very low current consumption in the off state, when neither laser is turned on, and in the on state as well, by turning on only one circuit at a time (either the red laser circuit or the IR laser circuit).
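Purely as a non-limiting illustration of this mutually exclusive enable logic, MCU firmware might drive the three signals as in the following C sketch; gpio_write(), the pin names and the signal polarity are assumptions standing in for the actual register accesses of any particular MCU.

/* Illustrative sketch only: never enable both laser current sources
 * at once, and cut the lasers' voltage supply entirely when idle. */
#include <stdbool.h>
#include <stdio.h>

enum pin { LS_VOLTAGE_SHT, SHTD_IR_LASER, SHTD_RED_LASER };

static void gpio_write(enum pin p, bool level) {   /* stub for the sketch */
    printf("pin %d -> %d\n", (int)p, (int)level);
}

void select_laser(bool ir_on, bool red_on) {
    if (ir_on && red_on) red_on = false;           /* at most one laser */
    /* Supply voltage only when some laser is to run, for minimal
     * off-state current consumption. */
    gpio_write(LS_VOLTAGE_SHT, ir_on || red_on);
    gpio_write(SHTD_IR_LASER,  ir_on);
    gpio_write(SHTD_RED_LASER, red_on);
}

int main(void) {
    select_laser(true, false);   /* IR laser on, red laser off */
    select_laser(false, false);  /* both off: supply cut */
    return 0;
}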
Suitable feedback circuitry for the IR laser according to certain embodiments is shown in Fig. 8f. As shown, almost every laser diode is supplied with a photodiode which generates current upon laser activation, providing a feedback circuit that triggers a LED each time the laser emits. The example circuit illustrated in Fig. 8f samples the current generated by the IR laser's photodiode and converts it to a measurable voltage. A yellow LED mounted on the control unit's panel is turned on while the IR laser is emitting. The red laser emits light at a wavelength visible to the human eye and outside the sensor filter's frequency response. The red laser may have a 5.6mm diameter case in a TO-18 footprint; its collimation may be produced using the same type of collimator as the IR laser. Similar to the IR laser, the IR LED emits light at a wavelength of at least 800nm; however, unlike the IR laser, the IR LED's light is projected directly to the sensor. The radiant power of the LED should be at least 20mW to allow the sensor to detect it. The LED may comprise a 940nm LED device with a typical radiant power of 20mW and may have an outline of 3mm and a length of 3mm.
The relative radiant intensity vs. angular displacement of the LED is illustrated in Fig. 8g. The axes of the graph of Fig. 8g include a normalized radiant intensity from 0 to 1 and the emitted light angle from -90° to +90°. The parabolic lines mark the intensity axis and the straight lines mark the angles. As shown, each spot on the curve has a parabolic line leading to an intensity reading and a straight line leading to an angle reading. For example, starting from the 0.4 intensity reading and moving along its parabolic line to the point where it meets the curve arrives at the 60° line, meaning that the LED emits 40% of its peak intensity at an angle of 60°.
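As a non-limiting aside, such radiation patterns are often approximated, purely for estimation purposes, by a Lambertian-style model I(θ) = cosⁿ(θ); the following C sketch (an assumption of this description, not data taken from Fig. 8g) fits n to the single reading discussed above, 40% intensity at 60°, and tabulates the resulting curve.

/* Illustrative estimation only: fit I(theta) = cos^n(theta) to the
 * single point (60 degrees, 0.4) and print the modeled pattern. */
#include <stdio.h>
#include <math.h>

int main(void) {
    const double PI = 3.14159265358979;
    double theta = 60.0 * PI / 180.0;
    double n = log(0.4) / log(cos(theta));   /* cos(60 deg) = 0.5, n ~ 1.32 */
    for (int deg = 0; deg <= 90; deg += 30)
        printf("%2d deg -> %.2f\n", deg, pow(cos(deg * PI / 180.0), n));
    return 0;   /* prints ~1.00, 0.83, 0.40, 0.00 */
}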
An example implementation of wireless communication using RF is now described with reference to Figs. 8h - 8i, which illustrate the CC2510 SoC by TI; this comes in a 4x4mm QFN footprint to minimize board size. Fig. 8h is a top-view, pin-out illustration of the CC2510. Fig. 8i is an example of a suitable application circuit of the CC2510. The RF transceiver may use a 50Ω antenna. For space reduction reasons the antenna is typically a chip antenna of a very small size such as a Fractus ANT 2.4GHz 802.11/Bluetooth SMD. The board space used for this type of antenna is very small (approximately 5x15mm of board area), and the chip itself has a very low-profile 7x3mm rectangular shape.
USB dongle: The CC2511 is a 2.4GHz SoC that includes a full-speed USB controller. RF connectivity from the control unit is received at the host side using a PCB dongle device which incorporates the CC2511 SoC.
The scroller operates by capacitance sensing. If the application requires high-resolution sensors such as scroll bars or wheels, software running on the host processor may be employed. Depending on the sensor type, the host typically stores 10 KB of code and 600 bytes of data memory. Fig. 8j illustrates a three-part capacitance-sensing solution.
A slider, as shown in Fig. 8k, may comprise 5 to 8 or more discrete sensor segments depending on the sensor length, with each segment connected to a CIN input pin on the AD7147/AD7148. Discrete sliders may be used for applications that employ linear, repeatable output position locations and may comprise discrete sensor elements arranged in a strip, one after the other. The discrete sensing segments may operate like buttons. Each sensing segment is arranged in close proximity to the next sensor; thus, when a user moves a finger along the slider, more than one sensor segment is activated at a time. The slider of Fig. 8k can produce up to 128 output positions. Each segment of the slider employs one CIN input connection to the AD7147/AD7148.
Double action surface mounted switches are provided according to certain embodiments. A double action switch allows a two-signal device actuated by two different forces to be provided. A first press on the switch triggers a "half press signal" and a subsequent press triggers a "full press signal". This mechanism exhibits a mechanical feedback for each of the presses so as to produce the following 4 electrical signals as shown in Figs. 8L, 8m, 8n and 8o: no to half press, half to full press, full to half press and half to no press.
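A non-limiting C sketch of how firmware might decode these four electrical signals into half-press and full-press events follows; the enum names, event strings and state machine layout are illustrative assumptions only.

/* Illustrative sketch only: a state machine over the four double
 * action switch signals enumerated above. */
#include <stdio.h>

typedef enum { PRESS_NONE, PRESS_HALF, PRESS_FULL } press_state_t;

typedef enum {
    SIG_NONE_TO_HALF, SIG_HALF_TO_FULL,
    SIG_FULL_TO_HALF, SIG_HALF_TO_NONE
} press_signal_t;

/* Advance the switch state and report the resulting event. */
press_state_t on_signal(press_state_t s, press_signal_t sig) {
    switch (sig) {
    case SIG_NONE_TO_HALF: printf("half press event\n");    return PRESS_HALF;
    case SIG_HALF_TO_FULL: printf("full press event\n");    return PRESS_FULL;
    case SIG_FULL_TO_HALF: printf("full press released\n"); return PRESS_HALF;
    case SIG_HALF_TO_NONE: printf("half press released\n"); return PRESS_NONE;
    }
    return s;
}

int main(void) {
    press_state_t s = PRESS_NONE;
    s = on_signal(s, SIG_NONE_TO_HALF);   /* first, lighter press   */
    s = on_signal(s, SIG_HALF_TO_FULL);   /* subsequent, firmer press */
    s = on_signal(s, SIG_FULL_TO_HALF);
    s = on_signal(s, SIG_HALF_TO_NONE);
    return 0;
}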
Referring to the debounce circuitry for debouncing a mechanical switch illustrated in Figs. 8L, 8m, 8n and 8o, it is appreciated that when a switch is operated by a human, spikes of low and high voltages always occur across that switch, resulting in a series of high and low signals that may be interpreted by a digital circuit as more than one pulse, instead of as one clean pulse or a single transition from one logic state to another. The circuitry of Figs. 8L, 8m, 8n and 8o therefore includes a debounce circuit comprising a capacitor, a pull-up resistor and a Schmitt-trigger IC.
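The circuit above debounces in hardware. Purely as a non-limiting software analog, not part of the described circuitry, a raw switch level might be accepted only after it has remained stable for several consecutive samples, e.g. as in the following C sketch; the sample count and poll rate are illustrative assumptions.

/* Illustrative sketch only: counter-based software debounce. */
#include <stdio.h>
#include <stdbool.h>

#define STABLE_SAMPLES 5   /* illustrative: ~5ms at a 1kHz poll rate */

typedef struct { bool stable; bool last_raw; int count; } debounce_t;

/* Feed one raw sample; returns the debounced level. */
bool debounce(debounce_t *d, bool raw) {
    if (raw == d->last_raw) {
        if (d->count < STABLE_SAMPLES) d->count++;
        if (d->count == STABLE_SAMPLES) d->stable = raw;
    } else {
        d->count = 0;           /* level changed: restart the counter */
        d->last_raw = raw;
    }
    return d->stable;
}

int main(void) {
    debounce_t d = {false, false, 0};
    bool noisy[] = {0, 1, 0, 1, 1, 1, 1, 1, 1, 1};  /* bounce, then settled */
    for (int i = 0; i < 10; i++)
        printf("%d", (int)debounce(&d, noisy[i])); /* prints 0000000011 */
    printf("\n");
    return 0;
}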
System connectivity may be provided by the example connection scheme illustrated in Fig. 8p. As shown, a light sources board (LSPCB) has 6 SMD pads lined at the rear and connects directly to a 1mm pitch right-angle 18-pin connector on the control board PCB (CPCB). Typically, the control board may be unplugged by the operator from the finger without opening the enclosure. The switches are connected to the same 18-pin connector on the CPCB; the cable is soldered directly to the switches. The scroll pad PCB (SPCB) is connected to the same 18-pin connector.
Fig. 8q is a table providing a glossary of terms used in Figs. 8a - 8p and in the description thereof, and elsewhere.
One implementation of software components of a wearable input device constructed and operative in accordance with certain embodiments of the present invention, is now described with reference to Figs. 9a - 9f.
Fig. 9a is an example Sleeve data flow diagram illustrating example interactions between all HW actors of a wearable input device, e.g. Sleeve, via which SW interface actors may interact.
Fig. 9b is an example Dongle data flow diagram illustrating example interactions between all HW actors of a Dongle provided in accordance with certain embodiments, via which SW interface actors interact.
Fig. 9c is an example Main module flow chart for a wearable input device, e.g. Sleeve, in which a main module of a wearable input device, e.g. that of Figs. 4a - 4b, monitors and manages some or all of the buttons, state selection, slider and light sources, e.g. as shown and described herein, and sends event signals accordingly.
Fig. 9d is an example flow chart for the dongle of Fig. 9b, according to which dongle software residing in the dongle receives RF event messages from a Laser Pointer device, prepares the buffer of events for sending to USB and sends data according to a Host request, where the host typically comprises the electronic device being controlled by the wearable input device shown and described herein. The Dongle main module typically waits for an RF message to be received, tests it and, if the message is valid, adds new events to the USB input buffer, etc.
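By way of non-limiting illustration, the dongle main module just described might be sketched in C as follows; the message layout, the validation rule (a simple XOR checksum) and all function names are assumptions of this sketch, not the actual protocol of any embodiment.

/* Illustrative sketch only: validate an RF message, buffer its
 * events, and hand them to the host on request. */
#include <stdio.h>
#include <stdint.h>
#include <string.h>

#define EVBUF_SIZE 64

typedef struct {
    uint16_t device_id;   /* unique Wearable input device identifier */
    uint8_t  event_code;  /* e.g. a button, slider or state event     */
    uint8_t  checksum;    /* illustrative XOR checksum                */
} rf_msg_t;

static uint8_t evbuf[EVBUF_SIZE];
static int evlen = 0;

static int msg_valid(const rf_msg_t *m) {
    return m->checksum ==
        ((m->device_id & 0xff) ^ (m->device_id >> 8) ^ m->event_code);
}

static void enqueue_event(const rf_msg_t *m) {
    if (evlen + 3 <= EVBUF_SIZE) {
        evbuf[evlen++] = m->device_id & 0xff;
        evbuf[evlen++] = m->device_id >> 8;
        evbuf[evlen++] = m->event_code;
    }
}

/* Called when the host polls the dongle over USB: hand over the
 * buffered events and clear the buffer. */
static int on_host_request(uint8_t *out) {
    int n = evlen;
    memcpy(out, evbuf, n);
    evlen = 0;
    return n;
}

int main(void) {
    rf_msg_t m = { 0x0102, 7, 0x01 ^ 0x02 ^ 7 };   /* a valid test message */
    if (msg_valid(&m)) enqueue_event(&m);
    uint8_t out[EVBUF_SIZE];
    printf("host read %d bytes\n", on_host_request(out));
    return 0;
}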
The host DLL (on a PC, for example) typically contains variables and methods for receiving over USB and displaying the button and slider events (among other events). An example events list, which may be transmitted over the RF protocol from Device to Dongle, is set out in the table of Fig. 9e. Fig. 9f is a table providing a glossary of terms used in Figs. 9a - 9e and in the description thereof, and elsewhere.
It is appreciated that certain embodiments of the present invention described in the context of mouse-operated applications may if desired be modified for cursor-based applications other than mouse-operated applications. It is appreciated that IR-based implementations described herein are only by way of example and any other suitable wireless technology may replace the IR implementation described herein. Also, buttons illustrated herein may of course be replaced by any other suitable actuator. Flowchart illustrations shown and described herein are intended to represent any methods which include some or all of the illustrated steps, suitably ordered e.g. as shown.
It is appreciated that terminology such as "mandatory", "required", "need" and "must" refers to implementation choices made within the context of a particular implementation or application described herewithin for clarity and is not intended to be limiting, since in an alternative implementation the same elements might be defined as not mandatory and not required, or might even be eliminated altogether.
It is appreciated that software components of the present invention including programs and data may, if desired, be implemented in ROM (read only memory) form including CD-ROMs, EPROMs and EEPROMs, or may be stored in any other suitable computer-readable medium such as but not limited to disks of various kinds, cards of various kinds and RAMs. Components described herein as software may, alternatively, be implemented wholly or partly in hardware, if desired, using conventional techniques. Conversely, components described herein as hardware may, alternatively, be implemented wholly or partly in software, if desired, using conventional techniques.
Included in the scope of the present invention, inter alia, are electromagnetic signals carrying computer-readable instructions for performing any or all of the steps of any of the methods shown and described herein, in any suitable order; machine-readable instructions for performing any or all of the steps of any of the methods shown and described herein, in any suitable order; program storage devices readable by machine, tangibly embodying a program of instructions executable by the machine to perform any or all of the steps of any of the methods shown and described herein, in any suitable order; a computer program product comprising a computer useable medium having computer readable program code, such as executable code, having embodied therein, and/or including computer readable program code for performing, any or all of the steps of any of the methods shown and described herein, in any suitable order; any technical effects brought about by any or all of the steps of any of the methods shown and described herein, when performed in any suitable order; any suitable apparatus or device or combination of such, programmed to perform, alone or in combination, any or all of the steps of any of the methods shown and described herein, in any suitable order; electronic devices each including a processor and a cooperating input device and/or output device and operative to perform in software any steps shown and described herein; information storage devices or physical records, such as disks or hard drives, causing a computer or other device to be configured so as to carry out any or all of the steps of any of the methods shown and described herein, in any suitable order; a program pre-stored e.g. in memory or on an information network such as the Internet, before or after being downloaded, which embodies any or all of the steps of any of the methods shown and described herein, in any suitable order, and the method of uploading or downloading such, and a system including server/s and/or client/s for using such; and hardware which performs any or all of the steps of any of the methods shown and described herein, in any suitable order, either alone or in conjunction with software. Any computer-readable or machine-readable media described herein is intended to include non-transitory computer- or machine-readable media.
Any computations or other forms of analysis described herein may be performed by a suitable computerized method. Any step described herein may be computer-implemented. The invention shown and described herein may include (a) using a computerized method to identify a solution to any of the problems or for any of the objectives described herein, the solution optionally including at least one of a decision, an action, a product, a service or any other information described herein that impacts, in a positive manner, a problem or objective described herein; and (b) outputting the solution.
Features of the present invention which are described in the context of separate embodiments may also be provided in combination in a single embodiment. Conversely, features of the invention, including method steps, which are described for brevity in the context of a single embodiment or in a certain order may be provided separately or in any suitable subcombination or in a different order. "e.g." is used herein in the sense of a specific example which is not intended to be limiting. Devices, apparatus or systems shown coupled in any of the drawings may in fact be integrated into a single platform in certain embodiments or may be coupled via any appropriate wired or wireless coupling such as but not limited to optical fiber, Ethernet, Wireless LAN, HomePNA, power line communication, cell phone, PDA, Blackberry GPRS, Satellite including GPS, or other mobile delivery. It is appreciated that in the description and drawings shown and described herein, functionalities described or illustrated as systems and sub-units thereof can also be provided as methods and steps therewithin, and functionalities described or illustrated as methods and steps therewithin can also be provided as systems and sub-units thereof. The scale used to illustrate various elements in the drawings is merely exemplary and/or appropriate for clarity of presentation and is not intended to be limiting.

Claims

1. A wearable input device operative to control an electronic system, the input device comprising:
a wearable substrate;
an IR laser source mounted on said wearable substrate and operative to emit a laser beam impinging on a screen at a location whose coordinates depend on motion of the user wearing the input device; and
a laser spot detector operative to detect coordinates of said location within said screen and accordingly to control said electronic system.
2. IR-based apparatus for controlling an electronic device, the apparatus comprising:
an IR camera configured to be mountable on an electronic device, the electronic device having an input area, the IR camera's field of view including said input area, the IR camera being operative to sense IR signals generated by an input device worn on at least one user's hand, when said hand is operating on said input area; and
a controlling functionality operative to receive said IR signals from said IR camera and to control said electronic device accordingly.
3. A wearable input device serving a human user, the device comprising:
a wearable substrate; and
at least one force sensor mounted on said wearable substrate and operative to sense pressure patterns applied by the human user which mimic the pressure patterns the human user would apply to a mouse button; and
a controlling functionality operative to receive signals, indicative of said pressure, from said force sensor and to control a normally cursor-based electronic device accordingly, including commanding the electronic device to respond to each pressure pattern applied by the human user, as it would respond to the same pressure pattern were it to have been applied by the human user to a cursor-based input device operating said electronic device.
4. Wearable input apparatus operative to provide multi-touch control of an electronic system when said apparatus is worn by a human user, the input apparatus comprising:
a first wearable input device operative to control the electronic system when mounted on the human user's right hand; and
a second wearable input device which is a mirrored copy of the first wearable input device and is operative to control the electronic system when mounted on the human user's left hand.
5. Wearable input apparatus operative to control an electronic system when worn by a human user, the input apparatus comprising:
a finger-wearable substrate configured to be mounted on a user's finger and having a tip portion configured to be mounted on the user's finger tip; and
a force sensing device mounted on said tip portion and including at least one force sensor and being operative to sense pressure applied by the human user's thumb when the user presses thumb to finger, and to control said electronic system at least partly in accordance with at least one characteristic of said pressure.
6. Apparatus according to claim 5 and wherein said force sensing device comprises an annular substrate configured to pivot around the user's finger on which a plurality of force sensors are mounted, such that, by rotating said annular substrate, the user selectably positions any of said plurality of force sensors at a location on his finger tip which is accessible to his thumb.
7. Apparatus according to claim 2 and also comprising an input device worn on at least one user's hand which enables a user to alternate the controlling functionality between a first state in which the electronic device is responsive to the input area and a second state in which the electronic device is responsive to said IR signals.
8. Apparatus according to claim 7 wherein said input device includes a state selector allowing the user to alternate the controlling functionality between the first and second states.
9. A wearable input system including:
a wearable input device; and
a docking station operative to receive the wearable input device and selectably retain and release the wearable input device once received, so as to enable a human to wear and shed the wearable input device without manual contact therewith.
10. A system according to claim 9 wherein said docking station is operative to provide mechanical protection to the wearable input device.
11. A system according to claim 9 wherein said docking station is operative to charge the input device while it is being retained within the docking station.
12. A system according to claim 9 wherein retaining of the input device by the docking station is activated by the human's twisting a body part on which the input device is mounted in a first azimuthal direction, after the device has been received by the docking station, and wherein releasing the input device by the docking station is activated by the human's twisting a body part on which the input device is mounted in a second azimuthal direction, opposite to said first direction, while the device is being retained by the docking station.
13. A device according to claim 1 wherein said laser spot detector comprises an IR camera arranged such that its field of view includes said screen.
14. A device according to claim 1 wherein said screen may comprise a projected screen and said laser spot detector may be mounted on a projector projecting said projected screen.
15. A device according to claim 14 wherein said projector comprises an image projector in a handheld device.
16. A device according to claim 1 wherein said laser spot detector comprises an optic-sensitive functionality of the screen itself.
17. A device according to claim 1 wherein said electronic system comprises a computer.
18. A device according to claim 1 wherein said electronic system comprises a portable communication device.
19. A device according to claim 1 wherein said screen comprises a physical screen.
20. A device according to claim 1 wherein said laser source comprises an IR laser source.
21. A device according to claim 13 having at least two user-selectable modes of operation including a remote mode of operation utilizing said laser source and said laser spot detector and a touch mode of operation.
22. A device according to claim 21 wherein said touch mode of operation utilizes: a light source mounted on the wearable input device and activated by an actual touching of the screen by the user;
a sensor which senses a location of the light source and
a controller which receives the light source location from the sensor and controls the electronic system accordingly.
23. A device according to claim 21 wherein selectability of said selectable modes is activated by a human user who pinches thumb to finger.
24. A device according to claim 21 wherein selectability of said selectable modes is activated by a human user who operates a two-manner press mechanism of a pressable element on the wearable input device.
25. Apparatus according to claim 2 wherein said input area includes a keyboard.
26. Apparatus according to claim 2 wherein said input area includes a touch-pad.
27. A device according to claim 3 wherein said controlling functionality is also operative to receive signals, indicative of a mouse-sliding operation simulated by said wearable substrate and to control a normally mouse-operated electronic device accordingly, including commanding the electronic device to respond to each mouse-sliding operation simulated by the human user using said substrate, as it would respond to the same mouse-sliding operation were it to have been applied by the human user to a mouse operating said electronic device.
28. Apparatus according to claim 4 wherein said first and second wearable input devices are enantiomers which are mirror images of each other.
29. Apparatus according to claim 4 and also comprising an optical sensor operative to simultaneously sense positions of a plurality of light points generated by both input devices simultaneously.
30. Apparatus according to claim 4 and also comprising a controlling application operative for simultaneously receiving and simultaneously processing positions of a plurality of light points generated by both input devices simultaneously and for controlling a host device accordingly.
31. Apparatus according to claim 29 wherein said optical sensor comprises an IR sensor and said light points generated by both input devices simultaneously comprise IR light points.
32. A touchless user input system comprising:
a rear projection unit located at the bottom of a surface which projects onto the surface at an angle acute enough to accommodate the small distance between the screen and the projecting unit;
a wearable input device emitting light; and
a rear optical sensor located behind the screen e.g. adjacent the projecting unit, which is operative to see said light emitted by said wearable input unit.
33. A system according to claim 32 and also comprising a force sensing actuator on said wearable input device which triggers emission of said light.
34. A system according to claim 32 wherein a user joins thumb and finger together adjacent a desired screen location, said light comprises IR light which impinges upon the surface at said desired screen location and wherein said rear sensor detects the points on the screen from which the laser beams are scattering.
35. A device according to claim 1 wherein said IR laser source comprises a near-IR laser source.
36. Apparatus according to claim 8 wherein said state selector is controlled by a manual operation by the user in which the user presses together his thumb and a finger wearing said input device.
37. A wearable input device which includes an optic sensor and which, when worn by a human user, provides inputs to an electronic device, the input device having a first operating mode in which the optic sensor points at a screen and a second operating mode in which the optic sensor points at the user.
38. An input device according to claim 37 wherein said screen comprises a projected screen.
39. A wearable input device which, when worn by a human user, provides inputs to an electronic device, the input device having a "touch" operating mode in which the human user presses a screen which is not a touch-screen and the electronic device is controlled as though said screen were a touch screen.
40. A device according to claim 23 which provides to a human user an experience as though the user were holding a cursor with thumb and forefinger and moving said cursor.
41. A wearable input device which, when worn by a human user, provides inputs to an electronic device, the input device having a "touch" operating mode in which the human user presses a screen and the electronic device is controlled as though said screen were a touch screen; and a remote operating mode in which the human user interacts with the screen remotely from afar.
42. A wearable input device which is operative in any of a selectable plurality of interactive environments.
43. An input device according to claim 42 wherein said plurality of environments includes a front projected screen environment.
44. An input device according to claim 42 wherein said plurality of environments includes a rear projected screen environment.
45. An input device according to claim 42 wherein said plurality of environments includes a desktop computer environment.
46. An input device according to claim 42 wherein said plurality of environments includes a laptop computer environment.
47. An input device according to claim 42 wherein said plurality of environments includes a mobile device with an embedded pico projector.
48. An input device according to claim 42 wherein said plurality of environments includes an interactive surface environment.
49. A method for providing IR-based apparatus for controlling an electronic device, the method comprising:
providing an IR camera configured to be mountable on an electronic device, the electronic device having an input area, the IR camera's field of view including said input area, the IR camera being operative to sense IR signals generated by an input device worn on at least one user's hand, when said hand is operating on said input area; and
providing a controlling functionality operative to receive said IR signals from said IR camera and to control said electronic device accordingly.
50. A method for providing a wearable input device serving a human user, the method comprising:
mounting at least one force sensor on a wearable substrate, wherein said force sensor is operative to sense pressure patterns applied by the human user which mimic the pressure patterns the human user would apply to a mouse button; and
providing a controlling functionality operative to receive signals, indicative of said pressure, from said force sensor and to control a normally cursor-based electronic device accordingly, including commanding the electronic device to respond to each pressure pattern applied by the human user, as it would respond to the same pressure pattern were it to have been applied by the human user to a cursor-based input device operating said electronic device.
51. A method for providing wearable input apparatus operative to provide multi- touch control of an electronic system when said apparatus is worn by a human user, the method comprising:
providing a first wearable input device operative to control the electronic system when mounted on the human user's right hand; and providing a second wearable input device which is a mirrored copy of the first wearable input device and is operative to control the electronic system when mounted on the human user's left hand.
52. A method for providing wearable input apparatus operative to control an electronic system when worn by a human user, the method comprising:
providing a finger-wearable substrate configured to be mounted on a user's finger and having a tip portion configured to be mounted on the user's finger tip; and mounting a force sensing device on said tip portion, the force sensing device including at least one force sensor and being operative to sense pressure applied by the human user's thumb when the user presses thumb to finger, and to control said electronic system at least partly in accordance with at least one characteristic of said pressure.
53. A method for providing a wearable input system, the method including:
providing a docking station operative to receive a wearable input device and to selectably retain and release the wearable input device once received, so as to enable a human to wear and shed the wearable input device without manual contact therewith.
54. A method for providing a touchless user input system, the method comprising: providing a rear projection unit located at the bottom of a surface which projects onto the surface at an angle acute enough to accommodate the small distance between the screen and the projecting unit;
providing a wearable input device emitting light; and
providing a rear optical sensor operative to be disposed behind the screen e.g. adjacent the projecting unit, which is operative to see said light emitted by said wearable input unit.
55. A method for providing a wearable input device, the method including:
providing a wearable input device which includes an optic sensor and which, when worn by a human user, provides inputs to an electronic device, the input device having a first operating mode in which the optic sensor points at a screen and a second operating mode in which the optic sensor points at the user.
56. A method for providing a wearable input device, the method including:
providing a wearable input device which, when worn by a human user, provides inputs to an electronic device, the input device having a "touch" operating mode in which the human user presses a screen which is not a touch-screen and the electronic device is controlled as though said screen were a touch screen.
57. A method for providing a wearable input device, the method including:
providing a wearable input device which, when worn by a human user, provides inputs to an electronic device, the input device having a "touch" operating mode in which the human user presses a screen and the electronic device is controlled as though said screen were a touch screen; and
a remote operating mode in which the human user interacts with the screen remotely from afar.
58. A method for providing a wearable input device, the method including:
providing a wearable input device which is operative in any of a selectable plurality of interactive environments.
59. A method for providing a wearable input device operative to control an electronic system, the method comprising:
providing an IR laser source mounted on a wearable substrate and operative to emit a laser beam impinging on a screen at a location whose coordinates depend on motion of the user wearing the input device; and
providing a laser spot detector operative to detect coordinates of said location within said screen and accordingly to control said electronic system.
60. A computer program product, comprising a computer usable medium having a computer readable program code embodied therein, said computer readable program code adapted to be executed to implement any of the methods shown and described herein.
61. Apparatus according to claim 6 wherein said annular substrate comprises at least one light source selectably triggered by at least one of said force sensors respectively.
62. Apparatus according to claim 61 wherein said at least one light source comprises at least one LED.
63. Apparatus according to claim 2 wherein the user can control the keyboard with one hand and control the cursor with the other hand simultaneously.
64. Apparatus according to claim 2 wherein wireless communication actuators on the input device send signals directly to said electronic device.
65. Apparatus according to claim 4 operative to distinguish between input streams emanating from each of the user's hands and to handle both of said input streams concurrently.
66. A device according to claim 22 wherein said light source comprises a LED.
67. A virtual mouse pad at least an order of magnitude larger than a conventional physical embedded mouse pad.
68. A device according to claim 21 wherein said screen is projected from the front.
69. A device according to claim 21 wherein said screen is projected from the rear.
70. A device according to claim 1 and having at least two user-selectable modes of operation including a remote mode of operation.
71. A device according to claim 69 wherein the remote mode of operation utilizes a LED as a light source and wherein said screen comprises a physical screen and said sensor is pointing at the user.
72. Apparatus for controlling a computer comprising:
a camera seated on the computer and having a first position in which the camera points at an area in front of the computer's screen and a second position in which the camera points at the computer's keyboard area; and
a controlling functionality receiving input from said camera and accordingly, controlling said computer.
73. Apparatus according to claim 72 and also comprising a laptop computer on which the camera is seated.
74. Apparatus according to claim 73 wherein said camera is seated on an upper rib of said laptop computer.
75. A computer usable medium on which resides a controlling functionality according to any of the preceding claims.
PCT/IL2010/000834 2009-10-13 2010-10-13 Wearable device for generating input for computerized systems WO2011045786A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US27261009P 2009-10-13 2009-10-13
US61/272,610 2009-10-13

Publications (2)

Publication Number Publication Date
WO2011045786A2 true WO2011045786A2 (en) 2011-04-21
WO2011045786A3 WO2011045786A3 (en) 2011-06-23

Family

ID=43827313

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2010/000834 WO2011045786A2 (en) 2009-10-13 2010-10-13 Wearable device for generating input for computerized systems

Country Status (1)

Country Link
WO (1) WO2011045786A2 (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200907764A (en) * 2007-08-01 2009-02-16 Unique Instr Co Ltd Three-dimensional virtual input and simulation apparatus

Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4988981A (en) 1987-03-17 1991-01-29 Vpl Research, Inc. Computer data entry and manipulation apparatus and method
US4988981B1 (en) 1987-03-17 1999-05-18 Vpl Newco Inc Computer data entry and manipulation apparatus and method
US5453759A (en) 1993-07-28 1995-09-26 Seebach; Jurgen Pointing device for communication with computer systems
US6198485B1 (en) 1998-07-29 2001-03-06 Intel Corporation Method and apparatus for three-dimensional input entry
US6587090B1 (en) 2000-10-03 2003-07-01 Eli D. Jarra Finger securable computer input device
WO2002037466A1 (en) 2000-11-02 2002-05-10 Essential Reality, Inc Electronic user worn interface device
US7057604B2 (en) 2001-07-06 2006-06-06 Mikamed Health Technologies Inc. Computer mouse on a glove
US20030227437A1 (en) 2002-06-05 2003-12-11 Ramirez Nohl W. Computer pointing device and utilization system
US7006079B2 (en) 2002-08-30 2006-02-28 Nara Institute Of Science And Technology Information input system
WO2004055726A1 (en) 2002-12-18 2004-07-01 National Institute Of Advanced Industrial Science And Technology Interface system
US7042438B2 (en) 2003-09-06 2006-05-09 Mcrae Michael William Hand manipulated data apparatus for computers and video games
US20060001646A1 (en) 2004-07-02 2006-01-05 Wei Hai Finger worn and operated input device
US20060012567A1 (en) 2004-07-13 2006-01-19 Todd Sicklinger Minature optical mouse and stylus
US20080042995A1 (en) 2004-08-27 2008-02-21 Lenovo (Beijing) Limited Wearable Signal Input Apparatus for Data Processing System
GB2442973A (en) 2006-10-20 2008-04-23 Kevin Moonie Finger worn computer mouse with an optical sensor on a pivoting arm
US20080317331A1 (en) 2007-06-19 2008-12-25 Microsoft Corporation Recognizing Hand Poses and/or Object Classes
WO2009024971A2 (en) 2007-08-19 2009-02-26 Saar Shai Finger-worn devices and related methods of use
WO2009125258A1 (en) 2008-04-08 2009-10-15 Sony Ericsson Mobile Communications Ab Communication terminals with superimposed user interface
US20090322680A1 (en) 2008-06-30 2009-12-31 Maurizio Sole Festa Radio frequency pointing device
US20100188428A1 (en) 2008-10-15 2010-07-29 Lg Electronics Inc. Mobile terminal with image projection
WO2010053260A2 (en) 2008-11-07 2010-05-14 Suh Changsu Mouse controlled via finger movements in air
WO2010064094A1 (en) 2008-12-01 2010-06-10 Sony Ericsson Mobile Communications Ab Portable electronic device with split vision content sharing control and method

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9535516B2 (en) 2010-02-23 2017-01-03 Muv Interactive Ltd. System for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith
US9880619B2 (en) 2010-02-23 2018-01-30 Muy Interactive Ltd. Virtual reality system with a finger-wearable control
US10528154B2 (en) 2010-02-23 2020-01-07 Touchjet Israel Ltd System for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith
US8854452B1 (en) 2012-05-16 2014-10-07 Google Inc. Functionality of a multi-state button of a computing device
WO2014106862A3 (en) * 2013-01-03 2014-09-25 Suman Saurav A method and system enabling control of different digital devices using gesture or motion control
US10078374B2 (en) 2013-01-03 2018-09-18 Saurav SUMAN Method and system enabling control of different digital devices using gesture or motion control
US10579216B2 (en) 2016-03-28 2020-03-03 Microsoft Technology Licensing, Llc Applications for multi-touch input detection

Also Published As

Publication number Publication date
WO2011045786A3 (en) 2011-06-23

Similar Documents

Publication Publication Date Title
US10528154B2 (en) System for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith
US9880619B2 (en) Virtual reality system with a finger-wearable control
US20220253188A1 (en) Systems and methods for controlling virtual scene perspective via physical touch input
US20220121344A1 (en) Methods for interacting with virtual controls and/or an affordance for moving virtual objects in virtual environments
KR102472428B1 (en) Light-Emitting User Input Device
US10534447B2 (en) Multi-surface controller
US9911240B2 (en) Systems and method of interacting with a virtual object
US10444849B2 (en) Multi-surface controller
CN103502923B (en) User and equipment based on touching and non-tactile reciprocation
US8446367B2 (en) Camera-based multi-touch mouse
TWI559174B (en) Gesture based manipulation of three-dimensional images
JP2018505455A (en) Multi-modal gesture-based interactive system and method using one single sensing system
US20220317776A1 (en) Methods for manipulating objects in an environment
AU2011219427B2 (en) A system for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith
Matulic et al. Pensight: Enhanced interaction with a pen-top camera
WO2011045786A2 (en) Wearable device for generating input for computerized systems
KR100749033B1 (en) A method for manipulating a terminal using user's glint, and an apparatus
US20240019938A1 (en) Systems for detecting gestures performed within activation-threshold distances of artificial-reality objects to cause operations at physical electronic devices, and methods of use thereof
US20240028129A1 (en) Systems for detecting in-air and surface gestures available for use in an artificial-reality environment using sensors at a wrist-wearable device, and methods of use thereof
Athira Touchless technology
US11641460B1 (en) Generating a volumetric representation of a capture region
US20240094819A1 (en) Devices, methods, and user interfaces for gesture-based interactions
WO2023244851A1 (en) Systems for detecting in-air and surface gestures available for use in an artificial-reality environment using sensors at a wrist-wearable device, and methods of use thereof
Cheng Direct interaction with large displays through monocular computer vision
KR20240036582A (en) Method and device for managing interactions with a user interface with a physical object

Legal Events

Date Code Title Description
NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10788405

Country of ref document: EP

Kind code of ref document: A2