WO2000075914A1 - Motion detection and tracking system to control navigation and display of object viewers - Google Patents

Motion detection and tracking system to control navigation and display of object viewers

Info

Publication number
WO2000075914A1
Authority
WO
WIPO (PCT)
Prior art keywords
recited
computer
user
computer system
certain portion
Application number
PCT/US2000/015210
Other languages
French (fr)
Inventor
James F. Flack
Sina Fateh
David L. Motte
Original Assignee
Vega Vista, Inc.
Application filed by Vega Vista, Inc.
Priority to JP2001502109A (published as JP2003501762A)
Priority to EP00939510A (published as EP1101215A4)
Publication of WO2000075914A1


Classifications

    • G — PHYSICS
        • G06 — COMPUTING; CALCULATING OR COUNTING
            • G06F — ELECTRIC DIGITAL DATA PROCESSING
                • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F 3/03 — Arrangements for converting the position or the displacement of a member into a coded form
                            • G06F 3/0304 — Detection arrangements using opto-electronic means
                • G06F 1/00 — Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
                    • G06F 1/16 — Constructional details or arrangements
                        • G06F 1/1613 — Constructional details or arrangements for portable computers
                            • G06F 1/1626 — Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
                            • G06F 1/163 — Wearable computers, e.g. on a belt
                            • G06F 1/1633 — Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
                                • G06F 1/1684 — Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675
                                    • G06F 1/1694 — Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
                • G06F 2200/00 — Indexing scheme relating to G06F 1/04 - G06F 1/32
                    • G06F 2200/16 — Indexing scheme relating to G06F 1/16 - G06F 1/18
                        • G06F 2200/163 — Indexing scheme relating to constructional details of the computer
                            • G06F 2200/1637 — Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of a handheld computer

Abstract

A computer program, system and method to track motion and control navigation and display of an object viewer. Information content generated by a digital processor (110) is mapped into a virtual display space suitable for conveying the information to a user. A certain portion of the virtual display space is displayed using a display device (28) coupled to the digital processor (110). An image capture device (60) captures an image from which a reference navigation target is acquired. Tracked movement of a display device (28) relative to the reference navigation target is used to update the displayed certain portion of the virtual display space in a manner related to the tracked movement.

Description

MOTION DETECTION AND TRACKING SYSTEM TO CONTROL NAVIGATION AND DISPLAY OF OBJECT VIEWERS
BACKGROUND OF THE INVENTION
The present invention relates generally to user interfaces. More specifically, the invention relates to a computer interface providing motion detection and tracking to control navigation and display of multi-dimensional object databases using a reference navigation target.
In the last few decades, enormous progress has occurred in developing and perfecting interactions between humans and computer systems. Improvements in user interfaces along with improvements in data capacity, display flexibility, and communication capabilities have led to the widespread use of applications such as Internet browsers, e-mail, map programs, imaging programs and video games that can be generally described as providing content-rich information to the user. While a discussion of the various stages of user interface evolution is unnecessary, the following highlights of that evolution are illustrative, providing a basis for understanding the utility of the invention claimed herein.
Traditional computer human interfaces 10 exist in a variety of sizes and forms including desktop computers, remote terminals, and portable devices such as laptop computers, notebook computers, hand held computers, and wearable computers.
In the beginning of the personal computer era, the desktop computer, which is still in use today, dominated the market. FIGURE 1 portrays a traditional desktop computer human interface 10. The traditional desktop computer 10 typically includes a display device 12, a keyboard 14, and a pointing device 16. The display device 12 is normally physically connected to the keyboard 14 and pointing device 16 via a computer. The pointing device 16 and buttons 18 may be physically integrated into the keyboard 14.
In the traditional desktop computer human interface 10, the keyboard 14 is used to enter data into the computer system. In addition, the user can control the computer system using the pointing device 16 by making selections on the display device 12. For example, using the pointing device the user can scroll the viewing area by selecting the vertical 38 or horizontal 36 scroll bar.
As semiconductor manufacturing technology developed, portable personal computers such as notebook and hand held computers became increasingly available. Notebook and hand held computers are often made of two mechanically linked components, one essentially containing the display device 12 and the other the keyboard 14 and pointing device 16. Hinges often link these two mechanical components with a flexible ribbon cabling connecting the components and embedded in the hinging mechanism. The two components can be closed like a book, often latching to minimize inadvertent opening.
The notebook computer greatly increased the portability of personal computers.
However, in the 1990s, a new computer interface paradigm emerged which enabled even greater portability and freedom and gave rise to the Personal Digital Assistant 20 (PDA hereafter). One of the first commercially successful PDAs was the Palm product line (PalmPilot™), now manufactured by 3Com. These machines are quite small, lightweight and relatively inexpensive, often fitting in a shirt pocket, weighing a few ounces and costing less than $400 when introduced. These machines possess very little memory (often less than 2 megabytes), a small display 28 (roughly 6 cm by 6 cm) and no physical keyboard. The pen-like pointing device 26, often stored next to or on the PDA 20, is applied to the display area 28 to enable its user to make choices and interact with the PDA device 20. External communication is often established via a serial port (not shown) in the PDA connecting to the cradle 22 connected by wire line 24 to a traditional computer 10. As will be appreciated, PDAs such as the PalmPilot™ have demonstrated the commercial reliability of this style of computer interface.
FIGURE 2 displays a prior art Personal Digital Assistant 20 in typical operation, in this case strapped upon the wrist of a user. At least one company, Orang-otang Computers, Inc., sells a family of wrist mountable cases for a variety of different PDAs. The pen pointer 26 is held in one hand while the PDA 20 is held on the wrist of the other hand. The display area 28 is often quite small compared to traditional computer displays 12. In the case of the Palm product line, the display area 28 contains an array of 160 pixels by 160 pixels in a 6 cm by 6 cm viewing area. Often, part of the display area is further allocated to menus and the like, further limiting the viewing area for an object such as an e-mail message page. This limitation in viewing area is partially addressed by making the menu bar 34 (FIG. 1) found on most traditional computer human interface displays 12 invisible on a PDA display 28 except when a menu button 29 is pressed.
Object database programs, such as map viewers, present a fairly consistent set of functions for viewing two-dimensional sheets. Where the object being viewed is larger than the display area of the display, controls to horizontally and vertically scroll the display area across the object are provided. Such viewing functions often possess visible controls accessed via a pointing device. As shown in FIGURE 1, horizontal scrolling is often controlled by a slider bar 36 horizontally aligned with a viewing region 40. Vertical scrolling is often controlled by a vertical slider bar 38 vertically aligned with the viewing region 40. Additionally such database interfaces often possess functionality to scroll in directions other than the vertical and horizontal orthogonal directions. This function is usually controlled by pointing to an icon, such as hand icon 42, which is then moved relative to the viewing area 40 while holding down the button 18.
Furthermore, object viewers often incorporate the ability to zoom in or out to control the resolution of detail and the amount of information visible upon the display device. Zoom out and zoom in controls 30, 32 are often either immediately visible or available from a pull down menu as items in one or more menu bars 34.
Finally, object viewers often include the ability to traverse a hierarchical organization of collections of objects such as folders of e-mail messages, log files of FAXes, project directories of schematics or floor plans, Internet web page links and objects representing various levels or sub-systems within a multi-tiered database.
In summary, traditional computer human interfaces 10, 20 have been employed in a variety of contexts to provide interactivity with multi-dimensional and/or multi-tiered object programs and systems. These interfaces superficially appear capable of providing a reasonable interface. However, size limitations and associated barriers drastically limit their functionality and interactivity. When the desired size (e.g. width and/or height) of the object's display format is larger than the size of the display screen itself, a method must be used to control which portion of the object is to be displayed on the screen at any given time. Various methods, in addition to those described above, have been devised to activate pan and scroll functions such as pushing an "arrow" key to shift the display contents in predefined increments in the direction indicated by the arrow key. Alternatively, a pen pointer or stylus can be used to activate pan and scroll functions to shift the display contents. In all of these examples, the physical display device remains relatively stationary and the larger object is viewed piece-wise and sequentially in small segments corresponding to the limitations of the physical size of the display screen.
In actual practice, these typical methods have many inherent problems. If the display screen is small relative to the object to be viewed, many individual steps are necessary for the entire object to be viewed as a sequence of displayed segments. This process may require many sequential command inputs using arrow keys or pen taps, thus generally requiring the use of both hands in the case of hand held computers. Furthermore, the context relationship between the current segment displayed on the screen and the overall content of the whole object can easily become confusing.
What is needed is a system that provides a simple and convenient method to control the display contents that also preserves the user's understanding of the relationship between the current segment on the display and the overall content of the object. Such a method is of particular value for personal information appliances such as hand held computers and communications devices with small display screens. Such appliances must satisfy the conflicting requirements of being small and convenient on the one hand and having the performance and utility of modern laptop or desktop computers on the other. Preferably, the method allows for single-handed control of the display contents.
SUMMARY OF THE INVENTION
The present invention addresses the aforementioned problems by providing a new method to control the contents presented on a small display screen. The present invention allows the user to easily traverse any and all segments of a large object using a hand held device with a small display screen. By moving the device in the direction the user is interested in, the user is allowed to traverse an object that is much larger than the display.
A device in accordance with one aspect of the present invention includes a digital processor, a computer memory, a computer readable medium, a display device, and a means for detecting motion of the display device relative to a reference navigation target. The digital processor is operable to map information resident in the computer readable medium into a virtual display space suitable for conveying the information to the user. The processor from time to time acquires data from the motion detecting means and uses the acquired data to calculate the position of the device relative to the user of the device. Based upon the calculated position of the device relative to the user, the processor displays upon the display device selected portions of the virtual display space. The motion detecting means preferably includes tracking movement of the device relative to a reference navigation target including a unique set of features, and more particularly, the set of features common to all computer users: the human head, face and/or shoulders.
Another aspect of the present invention provides a method for assisting a user in preserving awareness of the context of each displayed segment during the control and operation of a computer system while traversing objects having display formats that are larger than the display. This method begins by mapping the full sized object intended for display by the computer system into a virtual display space. Next, a certain portion of the virtual display space is actually displayed. Then, an image is captured by a motion detecting means and a reference navigation target is acquired from the captured image. Finally, the movement of the device is tracked relative to the reference navigation target and the displayed portion of the virtual display space is changed in a manner correlated to the tracked movement. Preferably the movement of the device is tracked relative to a reference navigation target including the unique human feature set of the head, face and/or shoulders of the user. In especially preferred embodiments, the aforementioned object is a type of detailed or content-rich information such as a geographic map, electronic schematic, video or still image, text document or Internet web page. The hand held device is a personal information appliance such as a hand held computer or mobile communication device capable of displaying text and/or graphical information, albeit on a display sized appropriately for a hand held, wearable or pocketable personal information appliance. This aspect of the present invention allows the user to traverse the object as described above. In addition, the user can use other functions of the personal information appliance, such as taking notes, conversing with others or recording messages, while using the virtual display space display management application of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
FIGURE 1 displays a prior art system including a traditional computer human interface and a Personal Digital Assistant;
FIGURE 2 displays a prior art Personal Digital Assistant in typical operation;
FIGURE 3 depicts a hand held computer having a video camera for detecting motion of the computer relative to the user in accordance with one embodiment of the current invention and a motion template to be used hereafter to describe the user's control interaction;
FIGURE 4 depicts a system block diagram in accordance with one preferred embodiment of the current invention with an embedded database incorporated in the processor and local motion processing means;
FIGURE 5 depicts a flow chart of the method in accordance with one preferred embodiment of the present invention;
FIGURE 6 depicts the initial display for a map viewing application in accordance with one embodiment of the current invention with the user indicating a zoom and scroll to focus in on California;
FIGURE 7 depicts the result of the user control interaction of the previous figure showing a map of California and displaying the next user control interaction, which will cause the display to zoom and focus on the San Francisco Bay Area;
FIGURE 8 depicts the result of the user control interaction of the previous figure showing a map of San Francisco Bay Area and displaying the next user control interaction, which will cause the display to zoom and focus on the waterfront of San Francisco;
FIGURES 9, 10 and 11 depict the results of the user control interaction of the previous figure showing a map of the San Francisco waterfront and displaying the next user control interaction, which will cause the display to zoom and focus on a portion of the San Francisco waterfront;
FIGURE 12 depicts the result of rotational movement of the hand held computer without rotational translation;
FIGURE 13 depicts a hand held computer in conjunction with a laptop and desktop computer in accordance with one embodiment of the present invention; and
FIGURE 14 depicts a personal information appliance in accordance with one embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
Central to this invention is the concept that motion of a display device relative to a reference navigation target controls an object viewer, where the object being viewed is typically essentially stationary in virtual space in the plane of the display device. One or more imaging devices, such as cameras, mounted on the display device and operably coupled to a motion processor are operable to capture an image from which the motion processor acquires a reference navigation target. The reference navigation target preferably includes a unique feature set such as a user's head, face and/or shoulders. The reference navigation target may also include an item having a unique feature set which is attached to the body of the user or to the clothing of the user. The motion processor tracks the movement of the display device relative to the reference navigation target and provides a motion data vector to a digital processor. The digital processor updates a displayed portion of the object in a manner related to the tracked movements of the display device. In this manner the user is able to traverse the entire object and examine the entire object either as a whole or as a sequence of displayed segments.
A unique human feature set, such as a user's head, face and/or shoulders, is optimally suited for this purpose as in any useful application of the display device, a user is typically positioned in front of the display device and looking at the display screen of the display device. Thus, the cameras can be conveniently positioned and oriented to capture the intended feature set for motion tracking.
FIGURE 3 depicts a hand held computer 20 in accordance with one embodiment of the current invention, including a video camera 60 oriented in such manner that the user's unique feature set is captured when the user is viewing the display device 28. In an unillustrated embodiment, additional cameras may be mounted on the computer 20 to achieve the objects of the invention. Also included in FIGURE 3 is a motion template 62 to be used hereafter to describe the user's control interaction. The hand held computer 20 is considered to have a processor internal to the case controlling the display device 28.
The display device 28 shown in FIGURE 3 is disposed in the same housing as the computer 20. The present invention is not limited to devices wherein the display device 28 and computer 20 are physically attached or disposed in a unitary housing. In the case where the display device and computer are remote one from the other, whether connected by wire or by wireless connection, the imaging device or devices are disposed upon or within the housing of the display device to capture the image in accordance with the present invention.
The video camera(s) 60 are preferably coupled to a motion processor for providing the internal processor with a motion vector measurement. Note that the various components of the motion vector measurement may be sampled at differing rates. FIGURE 4 depicts such a system. The processor 110 incorporates an embedded database 120. Coupled to the processor via connection 114 are a motion processor 115 and camera 116. Also coupled to the processor 110 via connection 112 is a display device 118. The connections 112, 114 may be wired or wireless, the only constraint being that the camera 116 is disposed on the display device 118. The motion processor preferably provides the ability to determine rotation of the hand held display device, while simultaneously determining translational motion. In a preferred embodiment of the invention, certain features of the reference navigation target, such as the relative apparent size of a user's head or the relative distance between the user's eyes, are used to enable zoom control to adjust the resolution of detail and/or the amount of information visible upon the display device.
The motion processor generates a motion vector relative to a frame of reference including the reference navigation target. Some preferred embodiments will use a 2-D frame of reference while other embodiments will use a 3-D frame of reference. Some preferred embodiments will use a rectilinear axis system, other embodiments will use a radial axis system. In a preferred embodiment, the origin will be positioned at a prominent feature of the reference navigation target, such as the human nose.
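By way of an illustrative sketch only (not part of the original disclosure), the following Python fragment shows one way such a motion vector might be derived from tracked facial features, with the origin taken at the nose as suggested above; the tuple layout, pixel units and use of the inter-eye distance as the z-axis cue are assumptions.

    import math

    def motion_vector(prev, cur):
        # prev and cur are ((nose_x, nose_y), eye_distance_px) samples from a
        # face tracker; this structure is a hypothetical convention, not the
        # patent's. The frame of reference is centered on the user's nose.
        (px, py), prev_eyes = prev
        (cx, cy), cur_eyes = cur
        dx = cx - px  # apparent horizontal displacement of the device
        dy = cy - py  # apparent vertical displacement of the device
        # The inter-eye distance grows as the device approaches the user, so
        # the log of its ratio serves as a signed z-axis (zoom) component.
        dz = math.log(cur_eyes / prev_eyes)
        return (dx, dy, dz)

    # Example: the nose shifts right and the eyes appear farther apart.
    print(motion_vector(((120, 90), 40.0), ((128, 90), 44.0)))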
The hand held device 20 may be further preferably augmented with other control inputs such as voice commands or button 61 on one side of the hand held computer 20. The control inputs may be operable to activate and/or deactivate the motion controlled display management function. Additionally, these control inputs may be operable to freeze the display upon activation or to freeze movement of the display in a desired axial or radial direction. Note that for the purpose of this invention, such controls, if buttons, may be positioned on any side or face of the hand held device 20.
The motion detection and tracking system of the present invention includes at least one image capture device such as a camera, image storage capabilities, image processing functions and display device motion estimation functions. With reference to FIGURE 5, in operation 200 an image capture device provides a captured image of the environment in the immediate vicinity of the hand held device, such as a view of the user's head, face and shoulders. Image storage capabilities maintain one or more reference images representing feature sets of one or more navigation reference targets, such as a generic representation of a user's head, face and shoulders, and/or current and previous captured images that can be used by the image processing function. In operation 210, the image processing function uses one or more captured images to acquire and identify the location of the navigation reference target, such as a user's head, face and/or shoulders, in the field of view of the image capture device. Pre-stored generic reference image data may be utilized as an aid to identify the navigation reference target within an image frame containing other foreground and background image data. In operation 220, the motion estimation process then computes the relative position of the navigation reference target with respect to the display device using growth motion, relative motion, stereoscopic photogrammetry or other measurement processes. This new relative position of the navigation reference target is compared with its previous estimated position and any changes are converted into new motion and position estimates of the display device. As the position of the display device relative to the reference navigation target is updated by the motion estimation process, an operation 230 makes this information available to an object viewer application that controls the content of the display on the display device. In operation 240, the displayed portion of a virtual display space is updated in a manner related to the tracked movement.
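As a concrete but purely illustrative rendering of operations 200 through 240, the sketch below uses OpenCV's stock frontal-face Haar cascade to stand in for the pre-stored generic reference image data; the patent does not prescribe a particular detector, and update_viewport is a hypothetical application hook standing in for operations 230 and 240.

    import cv2

    def update_viewport(dx, dy, growth):
        # Hypothetical hook for operations 230-240: a real object viewer
        # would pan and zoom its virtual display space here.
        print(f"pan=({dx:+.1f}, {dy:+.1f}) zoom x{growth:.3f}")

    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    camera = cv2.VideoCapture(0)
    prev = None
    while True:
        ok, frame = camera.read()  # operation 200: capture an image
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) == 0:
            continue  # navigation reference target not acquired this frame
        # Operation 210: take the largest detected face as the target.
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
        cur = (x + w / 2.0, y + h / 2.0, float(w))
        if prev is not None:
            # Operation 220: changes in position and apparent size ("growth")
            # become motion estimates for the display device.
            update_viewport(cur[0] - prev[0], cur[1] - prev[1], cur[2] / prev[2])
        prev = cur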
The present invention has a variety of practical uses. One embodiment of the present invention would allow a user to traverse a map database using only motion. FIGURE 3 depicts a hand held computer 20 running a map viewer database application. The database contains maps of various U.S. geographic regions for display on the computer display device 28.
By moving the hand held computer 20 along the positive z-axis, the user can zoom to a more specific region of the map, such as a closer view of California as depicted in FIGURE 6. Continued movement along the positive z-axis allows the user to zoom to more specific regions, such as the San Francisco Bay Area (FIGURE 7), the San Francisco waterfront (FIGURE 8), and finally to a detailed street map of the San Francisco waterfront (FIGURES 9, 10, and 11). At any zoom level, the user can move the hand held computer 20 along the x-axis, y-axis, or both, to explore the map in the corresponding direction. FIGURE 9 depicts an area of the San Francisco waterfront. By moving the hand held computer 20 along the positive x-axis 70, the user can explore the map in an eastward direction as depicted in FIGURE 10. Continued movement along the positive x-axis 74 will result in more eastward exploration as depicted in FIGURE 11.
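The pan-and-zoom behavior just described might be modeled as in the following minimal sketch, under the assumed conventions that positive z-motion zooms in and that panning is scaled by the current zoom so the map stays pinned in virtual space; the class and field names are illustrative, not the patent's.

    import math

    class MapViewport:
        # Center coordinates and scale of the displayed portion of the
        # virtual display space.
        def __init__(self, cx=0.0, cy=0.0, scale=1.0):
            self.cx, self.cy, self.scale = cx, cy, scale

        def apply_motion(self, dx, dy, dz):
            self.cx += dx / self.scale  # pan in map units, not screen pixels
            self.cy += dy / self.scale
            self.scale *= math.exp(dz)  # positive z (toward the user) zooms in

    # Moving along the positive z-axis zooms from the U.S. map toward
    # California, as in FIGURES 6 and 7.
    view = MapViewport(scale=1.0)
    view.apply_motion(0.0, 0.0, 0.7)
    print(round(view.scale, 2))  # -> 2.01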
FIGURE 12 depicts the result of rotational movement of the hand held computer 20. In this case the display 28 does not change when the computer 20 is rotated along an axis. Note, however, that other embodiments of the invention may include tracking capabilities allowing the invention to track rotation of the computer 20 and enabling the display 28 to be altered according to the rotation of the computer 20. This embodiment would enable a 2-D display to be rotated in 3-D space to present various viewpoints of a 3-D database within the device.
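For the rotation-tracking embodiment mentioned above, one plausible cue (an assumption for illustration, not taken from the patent) is the tilt of the line joining the user's tracked eye positions:

    import math

    def roll_angle(left_eye, right_eye):
        # Estimate device roll relative to the user from the slope of the
        # inter-eye line; a rotation-tracking embodiment could rotate the
        # displayed viewpoint of a 3-D database by this angle.
        dx = right_eye[0] - left_eye[0]
        dy = right_eye[1] - left_eye[1]
        return math.degrees(math.atan2(dy, dx))

    print(round(roll_angle((40, 52), (80, 60)), 1))  # -> 11.3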
A further embodiment of the present invention utilizes a hand held computer 20 in conjunction with a traditional laptop or desktop computer 10, as shown in FIGURE 13. The hand held computer 20 includes a motion detecting means as previously described. The hand held computer 20 is coupled to the desktop computer 10 utilizing an electronic coupling means, including a connecting wire, infrared, or radio transmissions.
This embodiment enables a user to utilize the hand held computer 20 much like a typical computer mouse. The user is able to move the hand held computer 20 to move, select or control items displayed on the desktop computer's display device 12. In addition, the user is able to traverse virtual objects located in the memory of the hand held device 20 and use this information in conjunction with information contained in the desktop computer 10. For example, a user can use the motion of the hand held computer 20 to traverse a geographic map located in the memory of the hand held device 20. When the user wants to know more information about a specific area of interest currently displayed on the hand held computer's display device, the user can upload the specific geographic coordinates into the desktop computer 10 via the electronic coupling connection. The desktop computer 10 then uses coordinates from the hand held computer 20 in conjunction with an internal database to provide specific geographic information to the user. In addition, the Internet may be used in conjunction with the desktop computer 10 and hand held computer 20 to provide additional information to the user. This furthers the previous example by utilizing the desktop computer to download additional geographic information utilizing Internet protocol. After uploading the coordinates into the desktop computer, as described above, the desktop computer is then utilized to search the Internet for additional geographical information. The desktop computer can search utilizing the uploaded coordinates from the hand held computer 20 directly, or the coordinates can be used in conjunction with an internal database to provide Internet search parameters. Once appropriate information is obtained from the Internet, it can be further downloaded into the hand held computer 20. For example, a more detailed geographic map may be downloaded from the Internet to the desktop computer 10 and subsequently uploaded to the hand held computer 20 for further traversal by the user. In this way, the information able to be displayed and utilized by the hand held computer 20 is greatly increased.
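As one hypothetical rendering of the electronic coupling just described — the patent requires only a connecting wire, infrared, or radio transmissions — the sketch below ships the displayed coordinates to the desktop as JSON over TCP; the host name, port and wire format are all assumptions.

    import json
    import socket

    def upload_coordinates(lat, lon, host="desktop.local", port=9000):
        # Send the coordinates of the currently displayed map region to the
        # companion desktop computer, which can then consult its internal
        # database or search the Internet for more detailed information.
        payload = json.dumps({"lat": lat, "lon": lon}).encode("utf-8")
        with socket.create_connection((host, port)) as conn:
            conn.sendall(payload)

    # upload_coordinates(37.808, -122.417)  # a San Francisco waterfront view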
Another embodiment of the present invention could substitute a command, other than motion, from the user to traverse the virtual map. For example, magnification could be controlled by a button 61 while the movement along the x and y axes is still controlled by the motion of the device. Another aspect of the present invention would allow one or more axes to be frozen by the user. The advantage of this arrangement is that accidental movement along a frozen axis would not change the display. For example, the user may want to see what is north of his position. In this case, the user would freeze the x-axis and z-axis, allowing movement only along the y-axis.
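A minimal sketch of this axis-freezing idea, reusing the (dx, dy, dz) motion vector convention assumed earlier; the set-of-axis-names interface is illustrative only.

    def apply_freeze(motion, frozen_axes):
        # Zero out components of the motion vector for axes the user has
        # frozen, so accidental movement along them leaves the display alone.
        dx, dy, dz = motion
        return (0.0 if "x" in frozen_axes else dx,
                0.0 if "y" in frozen_axes else dy,
                0.0 if "z" in frozen_axes else dz)

    # Looking north only: freeze x and z, allowing movement along y alone.
    print(apply_freeze((4.0, -2.5, 0.3), {"x", "z"}))  # -> (0.0, -2.5, 0.0)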
Another aspect of the present invention would allow the user to interact with two windows in the display of the device. In one window a map application as described above would run; the other window would run another application, such as a screen capture or word-processing application. For example, while navigating the virtual map in one window, the user could take notes in the other window, or capture a section of the virtual map in the other window. This allows the user to save sections of interest in the virtual map for later printing. In addition, if the user has access to another database, such as discussed above in relation to wireless remote systems, information about specific places of interest in the virtual map could be displayed in the second window while the user traverses the virtual map in the first window. As will be appreciated, the technology of the present invention is not limited to geographic maps. Object viewers can also include, but are not limited to, architectural, fluidic, electronic, and optical circuitry maps. Other information content could include conventional pages of documents with text, tables, illustrations, pictures, and spreadsheets. Additionally, the present invention finds particular application in the fields of the Internet, video telecommunications, and hand held video games.
The present invention finds additional application in navigating complex object systems including, for example, MRI images. The present invention allows the user to navigate such an object in an easy and intuitive way. By using the motion driven navigation system of the present invention, a user can navigate from one slice of the MRI image to the next easily using only one hand. Additionally, objects having multiple dimensions can be easily navigated using the system of the present invention. Functions conventionally accomplished by means of manual control inputs such as clicking and dragging are easily performed by translational and/or rotational movement of the device relative to the navigational reference target.
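As an illustrative sketch only, translation-driven slice navigation could map tracked movement along one axis to a slice index; the function name and the scale factor below are assumed values, not part of the disclosure:

```python
def select_slice(slices, z_translation, mm_per_slice=5.0):
    """Convert tracked movement of the device toward or away from the
    reference target into an MRI slice index, clamped to the volume."""
    index = int(z_translation / mm_per_slice)
    return slices[max(0, min(len(slices) - 1, index))]
```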
The object viewers and other applications running on the computer system of the present invention use an event queue, a standard element of the operating system and applications of both Palm OS™ and Windows CE, two commonly used real-time operating systems for hand held computers, PDAs, telephone-PDA hybrid devices and the like. An event queue contains events, that is, happenings within the program such as mouse clicks or key presses, stored in order with the oldest event first. The specifics of an event structure vary from system to system, so this discussion focuses on the most common elements of such structures. An event usually contains a designator of its type, such as button down, button up, pen down, or pen up. Event queues are serviced by event loops, which successively examine the next event in the queue and act upon it.
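A generic event structure and queue of the kind described above might look like the following sketch; the field names are illustrative and do not reflect the actual Palm OS™ or Windows CE structures:

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Event:
    kind: str   # e.g. "button_down", "button_up", "pen_down", "pen_up"
    data: dict  # event-specific payload, such as screen coordinates

event_queue = deque()  # serviced oldest event first

def post_event(event):
    event_queue.append(event)  # newest events go to the back

def next_event():
    """Return the oldest pending event, as an event loop would."""
    return event_queue.popleft() if event_queue else None
```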
Both the Palm OS™ and Windows CE operating systems support at least one running application. Each application consists of at least one event loop processing an event queue. Hardware related events are usually either part of the operating system of the hand held device or considered "below" the level of the application program. "Higher level" event types such as menu selections, touching scroll bars, mouse buttons and the like are often handled in separate event queues, each with a separate, concurrently executing event loop. Such concurrently executing program components are often referred to as threads.
Software interfaces to additional hardware, such as optional accessories, are often added to basic systems as threads running independently of the main event loop of each application and concurrently with these application event loops. Such additional event loops may process new hardware events, such as sensor measurements, and generate new data, which is incorporated into events placed into application event queues for application processing. One hardware accessory used by the present invention is an image capture device for motion detection and tracking.
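The threading arrangement described above, in which a sensor thread posts motion events into an application event queue, might be sketched as follows; the `read_motion` sampler and the sampling rate are assumptions for illustration:

```python
import queue
import threading
import time

app_queue = queue.Queue()  # application event queue

def sensor_thread(read_motion, stop):
    """Run independently of the application event loop, sampling the
    image-capture-based motion detector and posting motion events."""
    while not stop.is_set():
        dx, dy = read_motion()  # tracked displacement since last sample
        app_queue.put(("motion", {"dx": dx, "dy": dy}))
        time.sleep(0.02)        # assumed ~50 Hz sampling

def event_loop(handle):
    """Application event loop: examine the next event and act upon it."""
    while True:
        kind, data = app_queue.get()
        handle(kind, data)

# Hypothetical startup:
#   stop = threading.Event()
#   threading.Thread(target=sensor_thread, args=(read_fn, stop),
#                    daemon=True).start()
```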
In yet another preferred embodiment of the present invention, the system of the present invention is used to navigate the World Wide Web. With particular reference to FIGURE 14, a personal information appliance including a mobile communication device 40 includes a display screen 42 and an image capture device 46. A cursor 44 may be held stationary with respect to the boundaries of the display screen 42. As a web page 48 is navigated, tracked movement of the device 40 relative to the reference navigation target operates to place the cursor 44 over chosen hyperlinks in the web page 48. Control inputs such as voice commands or buttons (not shown) are then operable to select the chosen hyperlink and thereby enable navigation of the World Wide Web.
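Illustratively, this stationary-cursor navigation reduces to a hit test in page coordinates once tracked movement has scrolled the page beneath the cursor; the sketch below assumes rectangular link bounds and hypothetical names:

```python
def link_under_cursor(links, scroll_x, scroll_y, cursor_x, cursor_y):
    """Return the hyperlink, if any, lying under the stationary cursor
    after the page has been scrolled by the tracked device movement."""
    page_x = cursor_x + scroll_x  # cursor position in page coordinates
    page_y = cursor_y + scroll_y
    for x, y, w, h, url in links:  # each link: (x, y, width, height, url)
        if x <= page_x <= x + w and y <= page_y <= y + h:
            return url
    return None
```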
Although only a few embodiments of the present invention have been described in detail, it should be understood that the present invention may be embodied in many other specific forms without departing from the spirit or scope of the invention. Therefore, the present examples are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope of the appended claims.

Claims

We Claim:
1. A computer implemented method for assisting a user in the control and operation of a computer system, the computer system having a display device, the computer system providing information content for display, such information content potentially containing more content such as characters, pictures, lines, links, video or pixels than can be conveniently displayed entirely on the display device at one time, the computer implemented method comprising the acts of:
coupling a display device to a digital processor;
mapping information content generated by the digital processor into a virtual display space suitable for conveying the information to the user;
displaying a certain portion of the virtual display space using the display device;
capturing an image;
acquiring a reference navigation target from the captured image;
tracking movement of the display device relative to the reference navigation target; and
updating the displayed certain portion of the virtual display space in a manner related to the tracked movement.
2. A computer implemented method as recited in claim 1 wherein the reference navigation target is attached to a user's body.
3. A computer implemented method as recited in claim 1 wherein the reference navigation target is a part of a user's body.
4. A computer implemented method as recited in claim 1 wherein the reference navigation target is part of a user's clothing.
5. A computer implemented method as recited in claim 1 wherein the reference navigation target is attached to a user's clothing.
6. A computer implemented method as recited in claim 3 wherein the reference navigation target is a user's head.
7. A computer implemented method as recited in claim 3 wherein the reference navigation target is a user's face.
8. A computer implemented method as recited in claim 3 wherein the reference navigation target is a user's head and face.
9. A computer implemented method as recited in claim 3 wherein the reference navigation target is a user's head and shoulders.
10. A computer implemented method as recited in claim 3 wherein the reference navigation target is a user's face and shoulders.
11. A computer implemented method as recited in claim 1 wherein a virtual magnification of the displayed certain portion is updated in a manner correlated to the tracked movement.
12. A computer implemented method as recited in claim 1 wherein a virtual magnification of the displayed certain portion is updated in response to a command entered into the digital processor by the user.
13. A computer implemented method as recited in claim 1 wherein a virtual orientation of the displayed certain portion is updated in a manner correlated to the tracked movement.
14. A computer implemented method as recited in claim 1 wherein a virtual orientation of the displayed certain portion is updated in response to a command entered into the digital processor by the user.
15. A computer implemented method as recited in claim 1 wherein an application executing upon the digital processor is a multi-dimensional object database application providing a virtual object.
16. A computer implemented method as recited in claim 15 wherein updating the displayed certain portion includes traversing the virtual object in at least one dimension.
17. A computer implemented method as recited in claim 1 wherein updating the displayed certain portion includes scaling the displayed certain portion.
18. A computer implemented method as recited in claim 17 wherein the displayed certain portion is scaled in response to a command entered into the computer system by the user.
19. A computer implemented method as recited in claim 1 wherein the display device and the digital processor are connected remotely by a wire.
20. A computer implemented method as recited in claim 1 wherein the display device and the digital processor are connected remotely by a wireless connection.
21. A computer implemented method as recited in claim 1 wherein the display device and the digital processor are disposed in a personal information appliance.
22. A computer implemented method as recited in claim 21 wherein the personal information appliance is a hand held computer.
23. A computer implemented method as recited in claim 21 wherein the personal information appliance is a mobile communication device.
24. A computer implemented method as recited in claim 21 wherein the personal information appliance has voice messaging capabilities.
25. A computer implemented method as recited in claim 21 wherein the personal information appliance has data messaging capabilities.
26. A computer implemented method as recited in claim 21 wherein the personal information appliance has handwriting recognition capability.
27. A computer implemented method as recited in claim 21 wherein the personal information appliance has voice recognition capability.
28. A computer implemented method as recited in claim 1 wherein the displayed certain portion includes multiple application windows.
29. A computer implemented method as recited in claim 21 wherein the personal information appliance is coupled to a second computer.
30. A computer implemented method as recited in claim 29 further comprising the act of utilizing the personal information appliance to select information displayed on the second computer.
31. A computer system comprising:
a digital processor;
a computer memory coupled to the digital processor;
a display device coupled to the digital processor;
a motion detector referenced to a reference navigation target and coupled to the display device; and
a computer program embodied on a computer readable medium coupled to the digital processor, the computer program having computer executable instructions for:
mapping information content generated by the computer system into a virtual display space suitable for display via the display device;
displaying a certain portion of the virtual display space via the display device;
capturing an image;
acquiring the reference navigation target from the captured image;
tracking movement of the display device relative to the reference navigation target via the motion detector; and
updating the displayed certain portion of the virtual display space in a manner correlated to the tracked movement.
32. A computer system as recited in claim 31 wherein the reference navigation target is attached to a user's body.
33. A computer system as recited in claim 31 wherein the reference navigation target is a part of a user's body.
34. A computer system as recited in claim 31 wherein the reference navigation target is part of a user's clothing.
35. A computer system as recited in claim 31 wherein the reference navigation target is attached to a user's clothing.
36. A computer system as recited in claim 33 wherein the reference navigation target is a user's head.
37. A computer system as recited in claim 33 wherein the reference navigation target is a user's face.
38. A computer system as recited in claim 33 wherein the reference navigation target is a user's head and face.
39. A computer system as recited in claim 33 wherein the reference navigation target is a user's head and shoulders.
40. A computer system as recited in claim 33 wherein the reference navigation target is a user's face and shoulders.
41. A computer system as recited in claim 31 wherein a virtual magnification of the displayed certain portion is updated in a manner correlated to the tracked movement.
42. A computer system as recited in claim 31 wherein a virtual magnification of the displayed certain portion is updated in response to a command entered into the digital processor by the user.
43. A computer system as recited in claim 31 wherein a virtual orientation of the displayed certain portion is updated in a manner correlated to the tracked movement.
44. A computer system as recited in claim 31 wherein a virtual orientation of the displayed certain portion is updated in response to a command entered into the digital processor by the user.
45. A computer system as recited in claim 31 wherein an application executing upon the digital processor is a multi-dimensional object database application providing a virtual object.
46. A computer system as recited in claim 45 wherein updating the displayed certain portion includes traversing the virtual object in at least one dimension.
47. A computer system as recited in claim 31 wherein updating the displayed certain portion includes scaling the displayed certain portion.
48. A computer system as recited in claim 47 wherein the displayed certain portion is scaled in response to a command entered into the computer system by the user.
49. A computer system as recited in claim 31 wherein the display device and the digital processor are connected remotely by a wire.
50. A computer system as recited in claim 31 wherein the display device and the digital processor are connected remotely by a wireless connection.
51. A computer system as recited in claim 31 wherein the display device and the digital processor are disposed in a personal information appliance.
52. A computer system as recited in claim 51 wherein the personal information appliance is a hand held computer.
53. A computer system as recited in claim 51 wherein the personal information appliance is a mobile communication device.
54. A computer system as recited in claim 51 wherein the personal information appliance has voice messaging capabilities.
55. A computer system as recited in claim 51 wherein the personal information appliance has data messaging capabilities.
56. A computer system as recited in claim 51 wherein the personal information appliance has handwriting recognition capability.
57. A computer system as recited in claim 51 wherein the personal information appliance has voice recognition capability.
58. A computer system as recited in claim 31 wherein the displayed certain portion includes multiple application windows.
59. A computer system as recited in claim 51, wherein the personal information appliance is coupled to a second computer.
60. A computer system as recited in claim 59 wherein the computer program further comprises computer executable instructions for utilizing the personal information appliance to select information displayed on the second computer.
61. A computer system as recited in claim 31 wherein the motion detector further comprises an image capture device operably coupled to an image processor, operable to acquire the reference navigation target from the captured image and track movement of the display device relative to the reference navigation target.
62. A computer program embodied on a computer readable medium comprising:
a code segment that maps information content generated by a digital processor into a virtual display space suitable for conveying the information to a user;
a code segment that displays a certain portion of the virtual display space using a display device;
a code segment that captures an image;
a code segment that acquires a reference navigation target from the captured image;
a code segment that tracks movement of the display device relative to the reference navigation target; and
a code segment that updates the displayed certain portion of the virtual display space in a manner related to the tracked movement.
63. A computer program as recited in claim 62 wherein the reference navigation target is attached to a user's body.
64. A computer program as recited in claim 62 wherein the reference navigation target is part of a user's body.
65. A computer program as recited in claim 62 wherein the reference navigation target is part of a user's clothing.
66. A computer program as recited in claim 62 wherein the reference navigation target is attached to a user's clothing.
67. A computer program as recited in claim 64 wherein the reference navigation target is a user's head.
68. A computer program as recited in claim 64 wherein the reference navigation target is a user's face.
69. A computer program as recited in claim 64 wherein the reference navigation target is a user's head and face.
70. A computer program as recited in claim 64 wherein the reference navigation target is a user's head and shoulders.
71. A computer program as recited in claim 64 wherein the reference navigation target is a user's face and shoulders.
72. A computer program as recited in claim 62 wherein a virtual magnification of the displayed certain portion is updated in a manner correlated to the tracked movement.
73. A computer program as recited in claim 62 wherein a virtual magnification of the displayed certain portion is updated in response to a command entered into the digital processor by the user.
74. A computer program as recited in claim 62 wherein a virtual orientation of the displayed certain portion is updated in a manner correlated to the tracked movement.
75. A computer program as recited in claim 62 wherein a virtual orientation of the displayed certain portion is updated in response to a command entered into the digital processor by the user.
76. A computer program as recited in claim 62 wherein an application executing upon the digital processor is a multi-dimensional object database application providing a virtual object.
77. A computer program as recited in claim 76 wherein updating the displayed certain portion includes traversing the virtual object in at least one dimension.
78. A computer program as recited in claim 62 wherein updating the displayed certain portion includes scaling the displayed certain portion.
79. A computer program as recited in claim 78 wherein the displayed certain portion is scaled in response to a command entered into the computer system by the user.
80. A computer program as recited in claim 62 wherein the display device and the digital processor are connected remotely by wire.
81. A computer program as recited in claim 62 wherein the display device and the digital processor are connected remotely by a wireless connection.
82. A computer program as recited in claim 62 wherein the display device and the digital processor are disposed in a personal information appliance.
83. A computer program as recited in claim 82 wherein the personal information appliance is a hand held computer.
84. A computer program as recited in claim 82 wherein the personal information appliance is a mobile communication device.
85. A computer program as recited in claim 82 wherein the personal information appliance has data messaging capabilities.
86. A computer program as recited in claim 82 wherein the personal information appliance has handwriting recognition capability.
87. A computer program as recited in claim 82 wherein the personal information appliance has voice recognition capability.
88. A computer program as recited in claim 62 wherein the displayed certain portion includes multiple application windows.
89. A computer program as recited in claim 82 wherein the personal information appliance is coupled to a second computer.
90. A computer program as recited in claim 89 further comprising a code segment that utilizes the personal information appliance to select information displayed on the second computer.
PCT/US2000/015210 1999-06-08 2000-06-02 Motion detection and tracking system to control navigation and display of object viewers WO2000075914A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2001502109A JP2003501762A (en) 1999-06-08 2000-06-02 Movement detection and tracking system for controlling target viewer navigation and display
EP00939510A EP1101215A4 (en) 1999-06-08 2000-06-02 Motion detection and tracking system to control navigation and display of object viewers

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US32805399A 1999-06-08 1999-06-08
US09/328,053 1999-06-08
US09/441,001 1999-11-09
US09/441,001 US6288704B1 (en) 1999-06-08 1999-11-09 Motion detection and tracking system to control navigation and display of object viewers

Publications (1)

Publication Number Publication Date
WO2000075914A1 (en) 2000-12-14

Family

ID=26986188

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/015210 WO2000075914A1 (en) 1999-06-08 2000-06-02 Motion detection and tracking system to control navigation and display of object viewers

Country Status (5)

Country Link
US (1) US6288704B1 (en)
EP (1) EP1101215A4 (en)
JP (1) JP2003501762A (en)
CN (1) CN1300415A (en)
WO (1) WO2000075914A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1618553A1 (en) * 2003-03-19 2006-01-25 Franklin Dee Martin Multi-media data collection tool kit having an electronic multi-media "case" file and method of use
EP1748388A2 (en) * 2005-07-25 2007-01-31 LG Electronics Inc. Mobile communication terminal with means for estimating motion direction and method thereof
EP1501019A3 (en) * 2003-06-14 2007-04-04 Lg Electronics Inc. Apparatus and method for automatically compensating for an image gradient of a mobile communication terminal
EP1837741A2 (en) * 2006-03-23 2007-09-26 Accenture Global Services GmbH Gestural input for navigation and manipulation in virtual space
EP1887776A1 (en) * 2006-08-07 2008-02-13 Samsung Electronics Co., Ltd. Portable terminal and user interface control method thereof based on pattern recognition and analysis of image captured by camera
US7340342B2 (en) 2003-08-05 2008-03-04 Research In Motion Limited Mobile device with on-screen optical navigation
EP2065795A1 (en) * 2007-11-30 2009-06-03 Koninklijke KPN N.V. Auto zoom display system and method
US7703121B2 (en) 2004-12-03 2010-04-20 Eastman Kodak Company Method of distributing multimedia data to equipment provided with an image sensor
EP2210450A1 (en) * 2007-11-15 2010-07-28 Sk Telecom Co., LTD Method, system and server playing media using user equipment with motion sensor
US8081157B2 (en) 2006-01-25 2011-12-20 Samsung Electronics Co., Ltd. Apparatus and method of scrolling screen in portable device and recording medium storing program for performing the method
US8213686B2 (en) 2005-01-07 2012-07-03 Qualcomm Incorporated Optical flow based tilt sensor
CN112577488A (en) * 2020-11-24 2021-03-30 腾讯科技(深圳)有限公司 Navigation route determining method, navigation route determining device, computer equipment and storage medium

Families Citing this family (179)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060279542A1 (en) * 1999-02-12 2006-12-14 Vega Vista, Inc. Cellular phones and mobile devices with motion driven control
US20060061550A1 (en) * 1999-02-12 2006-03-23 Sina Fateh Display size emulation system
US20060061551A1 (en) * 1999-02-12 2006-03-23 Vega Vista, Inc. Motion detection and tracking system to control navigation and display of portable displays including on-chip gesture detection
US20020046100A1 (en) * 2000-04-18 2002-04-18 Naoto Kinjo Image display method
US7302280B2 (en) * 2000-07-17 2007-11-27 Microsoft Corporation Mobile phone operation based upon context sensing
US9189069B2 (en) 2000-07-17 2015-11-17 Microsoft Technology Licensing, Llc Throwing gestures for mobile devices
US7289102B2 (en) * 2000-07-17 2007-10-30 Microsoft Corporation Method and apparatus using multiple sensors in a device with a display
US8120625B2 (en) * 2000-07-17 2012-02-21 Microsoft Corporation Method and apparatus using multiple sensors in a device with a display
EP1311803B8 (en) * 2000-08-24 2008-05-07 VDO Automotive AG Method and navigation device for querying target information and navigating within a map view
US7724270B1 (en) 2000-11-08 2010-05-25 Palm, Inc. Apparatus and methods to achieve a variable color pixel border on a negative mode screen with a passive matrix drive
US6961029B1 (en) 2000-11-08 2005-11-01 Palm, Inc. Pixel border for improved viewability of a display device
US7425970B1 (en) * 2000-11-08 2008-09-16 Palm, Inc. Controllable pixel border for a negative mode passive matrix display device
US20020109673A1 (en) * 2001-01-04 2002-08-15 Thierry Valet Method and apparatus employing angled single accelerometer sensing multi-directional motion
JP4596203B2 (en) * 2001-02-19 2010-12-08 ソニー株式会社 Image processing apparatus and method, recording medium, and program
US6798429B2 (en) * 2001-03-29 2004-09-28 Intel Corporation Intuitive mobile device interface to virtual spaces
US20020158908A1 (en) * 2001-04-30 2002-10-31 Kristian Vaajala Web browser user interface for low-resolution displays
US20040174431A1 (en) * 2001-05-14 2004-09-09 Stienstra Marcelle Andrea Device for interacting with real-time streams of content
FI117488B (en) * 2001-05-16 2006-10-31 Myorigo Sarl Browsing information on screen
SE523636C2 (en) * 2001-07-22 2004-05-04 Tomer Shalit Ab Portable computerized handheld device and procedure for handling an object displayed on a screen
US7365734B2 (en) * 2002-08-06 2008-04-29 Rembrandt Ip Management, Llc Control of display content by movement on a fixed spherical space
USRE47457E1 (en) * 2001-08-07 2019-06-25 Facebook, Inc. Control of display content by movement on a fixed spherical space
WO2003015072A1 (en) * 2001-08-07 2003-02-20 Vega Vista Control of display content by movement on a fixed spherical space
US7079132B2 (en) * 2001-08-16 2006-07-18 Siemens Corporate Reseach Inc. System and method for three-dimensional (3D) reconstruction from ultrasound images
US7113618B2 (en) * 2001-09-18 2006-09-26 Intel Corporation Portable virtual reality
US6927757B2 (en) * 2001-09-18 2005-08-09 Intel Corporation Camera driven virtual workspace management
US20030067623A1 (en) * 2001-10-05 2003-04-10 Yuki Akiyama System for reading image information
US7714880B2 (en) * 2001-11-16 2010-05-11 Honeywell International Inc. Method and apparatus for displaying images on a display
US7487444B2 (en) 2002-03-19 2009-02-03 Aol Llc Reformatting columns of content for display
US6943811B2 (en) * 2002-03-22 2005-09-13 David J. Matthews Apparatus and method of managing data objects
JP3964734B2 (en) * 2002-05-17 2007-08-22 富士通テン株式会社 Navigation device
US7184025B2 (en) * 2002-05-31 2007-02-27 Microsoft Corporation Altering a display on a viewing device based upon a user controlled orientation of the viewing device
US20030231189A1 (en) * 2002-05-31 2003-12-18 Microsoft Corporation Altering a display on a viewing device based upon a user controlled orientation of the viewing device
US20040058732A1 (en) 2002-06-14 2004-03-25 Piccionelli Gregory A. Method, system and apparatus for location based gaming
US20070135943A1 (en) * 2002-09-18 2007-06-14 Seiko Epson Corporation Output service providing system that updates information based on positional information, terminal and method of providing output service
US8797402B2 (en) * 2002-11-19 2014-08-05 Hewlett-Packard Development Company, L.P. Methods and apparatus for imaging and displaying a navigable path
US7774158B2 (en) * 2002-12-17 2010-08-10 Evolution Robotics, Inc. Systems and methods for landmark generation for visual simultaneous localization and mapping
US20040119684A1 (en) * 2002-12-18 2004-06-24 Xerox Corporation System and method for navigating information
US8508643B2 (en) * 2003-01-17 2013-08-13 Hewlett-Packard Development Company, L.P. Method and system for processing an image
US7426329B2 (en) 2003-03-06 2008-09-16 Microsoft Corporation Systems and methods for receiving, storing, and rendering digital video, music, and pictures on a personal media player
US7526718B2 (en) * 2003-04-30 2009-04-28 Hewlett-Packard Development Company, L.P. Apparatus and method for recording “path-enhanced” multimedia
JP4864295B2 (en) * 2003-06-02 2012-02-01 富士フイルム株式会社 Image display system, image display apparatus, and program
FI117986B (en) * 2003-06-17 2007-05-15 Onesys Oy Procedure and arrangement for navigation in a real-time three-dimensional medical image model
JP2005026738A (en) * 2003-06-30 2005-01-27 Kyocera Corp Mobile communication apparatus
US20050078086A1 (en) * 2003-10-09 2005-04-14 Grams Richard E. Method and apparatus for controlled display
EP1679689B1 (en) * 2003-10-28 2014-01-01 Panasonic Corporation Image display device and image display method
JP3906200B2 (en) * 2003-11-27 2007-04-18 インターナショナル・ビジネス・マシーンズ・コーポレーション COMMUNICATION DEVICE, COMMUNICATION SYSTEM, COMMUNICATION METHOD, PROGRAM, AND RECORDING MEDIUM
US7460134B2 (en) * 2004-03-02 2008-12-02 Microsoft Corporation System and method for moving computer displayable content into a preferred user interactive focus area
US7365736B2 (en) * 2004-03-23 2008-04-29 Fujitsu Limited Customizable gesture mappings for motion controlled handheld devices
US7301529B2 (en) * 2004-03-23 2007-11-27 Fujitsu Limited Context dependent gesture response
US7180500B2 (en) * 2004-03-23 2007-02-20 Fujitsu Limited User definable gestures for motion controlled handheld devices
US20050212753A1 (en) * 2004-03-23 2005-09-29 Marvit David L Motion controlled remote controller
US7173604B2 (en) * 2004-03-23 2007-02-06 Fujitsu Limited Gesture identification of controlled devices
US7903084B2 (en) * 2004-03-23 2011-03-08 Fujitsu Limited Selective engagement of motion input modes
US7176886B2 (en) * 2004-03-23 2007-02-13 Fujitsu Limited Spatial signatures
US20050212760A1 (en) * 2004-03-23 2005-09-29 Marvit David L Gesture based user interface supporting preexisting symbols
EP1728142B1 (en) * 2004-03-23 2010-08-04 Fujitsu Ltd. Distinguishing tilt and translation motion components in handheld devices
US7301526B2 (en) * 2004-03-23 2007-11-27 Fujitsu Limited Dynamic adaptation of gestures for motion controlled handheld devices
US7280096B2 (en) * 2004-03-23 2007-10-09 Fujitsu Limited Motion sensor engagement for a handheld device
US7176887B2 (en) * 2004-03-23 2007-02-13 Fujitsu Limited Environmental modeling for motion controlled handheld devices
US7180502B2 (en) 2004-03-23 2007-02-20 Fujitsu Limited Handheld device with preferred motion selection
US7365737B2 (en) * 2004-03-23 2008-04-29 Fujitsu Limited Non-uniform gesture precision
US7180501B2 (en) * 2004-03-23 2007-02-20 Fujitsu Limited Gesture based navigation of a handheld user interface
US7301527B2 (en) * 2004-03-23 2007-11-27 Fujitsu Limited Feedback based user interface for motion controlled handheld devices
US7365735B2 (en) * 2004-03-23 2008-04-29 Fujitsu Limited Translation controlled cursor
US7176888B2 (en) 2004-03-23 2007-02-13 Fujitsu Limited Selective engagement of motion detection
US7301528B2 (en) * 2004-03-23 2007-11-27 Fujitsu Limited Distinguishing tilt and translation motion components in handheld devices
JP4241484B2 (en) * 2004-04-14 2009-03-18 日本電気株式会社 Portable terminal device, incoming response message transmission method, and server device
US7400316B2 (en) * 2004-05-28 2008-07-15 International Business Machines Corporation Method and apparatus for dynamically modifying web page display for mobile devices
US20090033630A1 (en) * 2004-06-04 2009-02-05 Koninklijke Philips Electronics, N.V. hand-held device for content navigation by a user
KR101287649B1 (en) * 2004-07-19 2013-07-24 크리에이티브 테크놀로지 엘티디 Method and apparatus for touch scrolling
FI20045300A (en) * 2004-08-17 2006-02-18 Nokia Corp Electronic device and procedure for controlling the functions of the electronic device and software product for implementing the procedure
EP1783585A4 (en) * 2004-08-27 2008-08-06 Fujitsu Ltd Operation screen creating method, display controller, operation screen creating program and computer-readable recording medium on which program is recorded
NO20044073D0 (en) * 2004-09-27 2004-09-27 Isak Engquist Information Processing System and Procedures
KR100678900B1 (en) * 2005-01-26 2007-02-05 삼성전자주식회사 Apparatus and method for displaying graphic object concurrently
US20090305727A1 (en) * 2005-03-04 2009-12-10 Heikki Pylkko Mobile device with wide range-angle optics and a radiation sensor
US20090297062A1 (en) * 2005-03-04 2009-12-03 Molne Anders L Mobile device with wide-angle optics and a radiation sensor
US7610345B2 (en) 2005-07-28 2009-10-27 Vaporstream Incorporated Reduced traceability electronic message system and method
US9282081B2 (en) 2005-07-28 2016-03-08 Vaporstream Incorporated Reduced traceability electronic message system and method
US20070032318A1 (en) * 2005-08-04 2007-02-08 Nishimura Ken A Motion sensor in sporting equipment
EP1915588B1 (en) * 2005-08-17 2015-09-30 TomTom International B.V. Navigation device and method of scrolling map data displayed on a navigation device
US8784336B2 (en) 2005-08-24 2014-07-22 C. R. Bard, Inc. Stylet apparatuses and methods of manufacture
US7647175B2 (en) * 2005-09-09 2010-01-12 Rembrandt Technologies, Lp Discrete inertial display navigation
US20070057911A1 (en) * 2005-09-12 2007-03-15 Sina Fateh System and method for wireless network content conversion for intuitively controlled portable displays
US20070061101A1 (en) * 2005-09-13 2007-03-15 Ibm Corporation Input device for providing position information to information handling systems
US7606552B2 (en) * 2005-11-10 2009-10-20 Research In Motion Limited System and method for activating an electronic device
FI118674B (en) * 2005-11-22 2008-02-15 Planmeca Oy Hardware in a dental environment and a method for controlling a hardware device
WO2007069173A2 (en) * 2005-12-12 2007-06-21 Koninklijke Philips Electronics, N.V. Method and apparatus for large screen interactive control using portable touchscreen device
US8049723B2 (en) * 2005-12-20 2011-11-01 Accenture Global Services Limited Wireless handheld device and method with remote GUI control
TW200734913A (en) * 2006-03-10 2007-09-16 Inventec Appliances Corp Electronic device and method using displacement sensor to move position displayed on screen
DE202006021132U1 (en) 2006-03-31 2012-12-20 Research In Motion Limited Device for providing map locations in user applications using URL strings
ATE409307T1 (en) * 2006-03-31 2008-10-15 Research In Motion Ltd USER INTERFACE METHOD AND APPARATUS FOR CONTROLLING THE VISUAL DISPLAY OF MAPS WITH SELECTABLE MAP ELEMENTS IN MOBILE COMMUNICATION DEVICES
EP1840511B1 (en) * 2006-03-31 2016-03-02 BlackBerry Limited Methods and apparatus for retrieving and displaying map-related data for visually displayed maps of mobile communication devices
US8121610B2 (en) 2006-03-31 2012-02-21 Research In Motion Limited Methods and apparatus for associating mapping functionality and information in contact lists of mobile communication devices
JP5521226B2 (en) * 2006-05-25 2014-06-11 富士フイルム株式会社 Display system, display method, and display program
US8564544B2 (en) 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US7794407B2 (en) 2006-10-23 2010-09-14 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
US8388546B2 (en) 2006-10-23 2013-03-05 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
WO2008054384A1 (en) * 2006-10-31 2008-05-08 Thomson Licensing A method and apparatus for producing a map for mobile reception at each cell tower
US20080120568A1 (en) * 2006-11-20 2008-05-22 Motorola, Inc. Method and device for entering data using a three dimensional position of a pointer
CN101212570B (en) * 2006-12-25 2011-06-22 鸿富锦精密工业(深圳)有限公司 Photographing mobile communication terminal
WO2008094458A1 (en) * 2007-01-26 2008-08-07 F-Origin, Inc. Viewing images with tilt control on a hand-held device
JP4330637B2 (en) * 2007-02-19 2009-09-16 シャープ株式会社 Portable device
US20080306708A1 (en) * 2007-06-05 2008-12-11 Raydon Corporation System and method for orientation and location calibration for image sensors
US20080305806A1 (en) * 2007-06-11 2008-12-11 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Context associating aspects
US20080304512A1 (en) * 2007-06-11 2008-12-11 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Context associating for context designated destination communication system
US20080304648A1 (en) * 2007-06-11 2008-12-11 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Context identifying aspects
US20080313335A1 (en) * 2007-06-15 2008-12-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Communicator establishing aspects with context identifying
CN101330811B (en) * 2007-06-22 2010-12-08 鸿富锦精密工业(深圳)有限公司 Portable electronic device and operation method thereof
WO2009024966A2 (en) * 2007-08-21 2009-02-26 Closevu Ltd. Method for adapting media for viewing on small display screens
US20090089705A1 (en) * 2007-09-27 2009-04-02 Microsoft Corporation Virtual object navigation
US8418083B1 (en) 2007-11-26 2013-04-09 Sprint Communications Company L.P. Applying a navigational mode to a device
US9521961B2 (en) 2007-11-26 2016-12-20 C. R. Bard, Inc. Systems and methods for guiding a medical instrument
US9636031B2 (en) 2007-11-26 2017-05-02 C.R. Bard, Inc. Stylets for use with apparatus for intravascular placement of a catheter
US8849382B2 (en) 2007-11-26 2014-09-30 C. R. Bard, Inc. Apparatus and display methods relating to intravascular placement of a catheter
CN101925333B (en) 2007-11-26 2014-02-12 C·R·巴德股份有限公司 Integrated system for intravascular placement of catheter
US9649048B2 (en) 2007-11-26 2017-05-16 C. R. Bard, Inc. Systems and methods for breaching a sterile field for intravascular placement of a catheter
US8781555B2 (en) 2007-11-26 2014-07-15 C. R. Bard, Inc. System for placement of a catheter including a signal-generating stylet
US10751509B2 (en) 2007-11-26 2020-08-25 C. R. Bard, Inc. Iconic representations for guidance of an indwelling medical device
US10524691B2 (en) 2007-11-26 2020-01-07 C. R. Bard, Inc. Needle assembly including an aligned magnetic element
US10449330B2 (en) 2007-11-26 2019-10-22 C. R. Bard, Inc. Magnetic element-equipped needle assemblies
US8209635B2 (en) * 2007-12-20 2012-06-26 Sony Mobile Communications Ab System and method for dynamically changing a display
US8478382B2 (en) 2008-02-11 2013-07-02 C. R. Bard, Inc. Systems and methods for positioning a catheter
GB2458881A (en) * 2008-03-19 2009-10-07 Robert William Albert Dobson Interface control using motion of a mobile device
JP5120277B2 (en) * 2008-03-31 2013-01-16 アイシン・エィ・ダブリュ株式会社 Navigation device and program
US9253416B2 (en) * 2008-06-19 2016-02-02 Motorola Solutions, Inc. Modulation of background substitution based on camera attitude and motion
US7953462B2 (en) 2008-08-04 2011-05-31 Vartanian Harry Apparatus and method for providing an adaptively responsive flexible display device
US9901714B2 (en) 2008-08-22 2018-02-27 C. R. Bard, Inc. Catheter assembly including ECG sensor and magnetic assemblies
CN103324386A (en) 2008-08-22 2013-09-25 谷歌公司 Anchored navigation in a three dimensional environment on a mobile device
US8437833B2 (en) 2008-10-07 2013-05-07 Bard Access Systems, Inc. Percutaneous magnetic gastrostomy
US20100095250A1 (en) * 2008-10-15 2010-04-15 Raytheon Company Facilitating Interaction With An Application
US8717283B1 (en) * 2008-11-25 2014-05-06 Sprint Communications Company L.P. Utilizing motion of a device to manipulate a display screen feature
FR2940690B1 (en) * 2008-12-31 2011-06-03 Cy Play A METHOD AND DEVICE FOR USER NAVIGATION OF A MOBILE TERMINAL ON AN APPLICATION EXECUTING ON A REMOTE SERVER
EP2382756B1 (en) 2018-08-22 Lewiner, Jacques Modelling method of the display of a remote terminal using macroblocks and masks characterized by a motion vector and transparency data
FR2940703B1 (en) * 2008-12-31 2019-10-11 Jacques Lewiner METHOD AND DEVICE FOR MODELING A DISPLAY
JP5347549B2 (en) * 2009-02-13 2013-11-20 ソニー株式会社 Information processing apparatus and information processing method
US9532724B2 (en) 2009-06-12 2017-01-03 Bard Access Systems, Inc. Apparatus and method for catheter navigation using endovascular energy mapping
JP5795576B2 (en) 2009-06-12 2015-10-14 バード・アクセス・システムズ,インコーポレーテッド Method of operating a computer-based medical device that uses an electrocardiogram (ECG) signal to position an intravascular device in or near the heart
US20100315439A1 (en) * 2009-06-15 2010-12-16 International Business Machines Corporation Using motion detection to process pan and zoom functions on mobile computing devices
WO2011019760A2 (en) 2009-08-10 2011-02-17 Romedex International Srl Devices and methods for endovascular electrography
US11103213B2 (en) 2009-10-08 2021-08-31 C. R. Bard, Inc. Spacers for use with an ultrasound probe
US10639008B2 (en) 2009-10-08 2020-05-05 C. R. Bard, Inc. Support and cover structures for an ultrasound probe head
EP2531098B1 (en) 2010-02-02 2020-07-15 C.R. Bard, Inc. Apparatus and method for catheter navigation and tip location
MX2012013858A (en) 2010-05-28 2013-04-08 Bard Inc C R Insertion guidance system for needles and medical components.
WO2011150376A1 (en) 2010-05-28 2011-12-01 C.R. Bard, Inc. Apparatus for use with needle insertion guidance system
US20110291964A1 (en) * 2010-06-01 2011-12-01 Kno, Inc. Apparatus and Method for Gesture Control of a Dual Panel Electronic Device
US8826495B2 (en) 2010-06-01 2014-09-09 Intel Corporation Hinged dual panel electronic device
US8977987B1 (en) 2010-06-14 2015-03-10 Google Inc. Motion-based interface control on computing device
MX338127B (en) 2010-08-20 2016-04-04 Bard Inc C R Reconfirmation of ecg-assisted catheter tip placement.
EP2619742B1 (en) 2010-09-24 2018-02-28 iRobot Corporation Systems and methods for vslam optimization
CN103189009B (en) 2010-10-29 2016-09-07 C·R·巴德股份有限公司 The bio-impedance auxiliary of Medical Devices is placed
JP5143291B2 (en) * 2011-04-20 2013-02-13 株式会社東芝 Image processing apparatus, method, and stereoscopic image display apparatus
KR20140051284A (en) 2011-07-06 2014-04-30 씨. 알. 바드, 인크. Needle length determination and calibration for insertion guidance system
USD724745S1 (en) 2011-08-09 2015-03-17 C. R. Bard, Inc. Cap for an ultrasound probe
USD699359S1 (en) 2011-08-09 2014-02-11 C. R. Bard, Inc. Ultrasound probe head
US8798840B2 (en) 2011-09-30 2014-08-05 Irobot Corporation Adaptive mapping with spatial summaries of sensor data
WO2013070775A1 (en) 2011-11-07 2013-05-16 C.R. Bard, Inc Ruggedized ultrasound hydrogel insert
US9035878B1 (en) 2012-02-29 2015-05-19 Google Inc. Input system
US8643951B1 (en) 2012-03-15 2014-02-04 Google Inc. Graphical menu and interaction therewith through a viewing window
CN104837413B (en) 2012-06-15 2018-09-11 C·R·巴德股份有限公司 Detect the device and method of removable cap on ultrasonic detector
US9020637B2 (en) 2012-11-02 2015-04-28 Irobot Corporation Simultaneous localization and mapping for a mobile robot
US9576183B2 (en) 2012-11-02 2017-02-21 Qualcomm Incorporated Fast initialization for monocular visual SLAM
US9037396B2 (en) 2013-05-23 2015-05-19 Irobot Corporation Simultaneous localization and mapping for a mobile robot
US9181760B2 (en) 2013-07-24 2015-11-10 Innovations, Inc. Motion-based view scrolling with proportional and dynamic modes
US10126839B2 (en) 2013-07-24 2018-11-13 Innoventions, Inc. Motion-based view scrolling with augmented tilt control
US9939525B2 (en) * 2013-11-29 2018-04-10 L.H. Kosowsky & Associates, Inc. Imaging system for obscured environments
ES2811323T3 (en) 2014-02-06 2021-03-11 Bard Inc C R Systems for the guidance and placement of an intravascular device
CN103970500B (en) * 2014-03-31 2017-03-29 小米科技有限责任公司 The method and device that a kind of picture shows
US9619016B2 (en) 2014-03-31 2017-04-11 Xiaomi Inc. Method and device for displaying wallpaper image on screen
CN104216634A (en) * 2014-08-27 2014-12-17 小米科技有限责任公司 Method and device for displaying manuscript
CN104461020B (en) * 2014-12-31 2018-01-19 珠海全志科技股份有限公司 Equipment physical direction and system logic direction mapping method and system
US10973584B2 (en) 2015-01-19 2021-04-13 Bard Access Systems, Inc. Device and method for vascular access
CN105988667B (en) * 2015-01-30 2019-07-02 广州市百果园信息技术有限公司 The method and device of list display
US10386941B2 (en) * 2015-06-16 2019-08-20 Intel Corporation Gyratory sensing system to enhance wearable device user experience via HMI extension
WO2016210325A1 (en) 2015-06-26 2016-12-29 C.R. Bard, Inc. Connector interface for ecg-based catheter positioning system
US11000207B2 (en) 2016-01-29 2021-05-11 C. R. Bard, Inc. Multiple coil system for tracking a medical device
US11156375B2 (en) * 2016-07-22 2021-10-26 Ademco Inc. Migration of settings from a non-connected building controller to another building controller
US10777007B2 (en) 2017-09-29 2020-09-15 Apple Inc. Cooperative augmented reality map interface
US10992079B2 (en) 2018-10-16 2021-04-27 Bard Access Systems, Inc. Safety-equipped connection systems and methods thereof for establishing electrical connections
CN111782098A (en) * 2020-07-02 2020-10-16 三星电子(中国)研发中心 Page navigation method and device and intelligent equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5322441A (en) * 1990-10-05 1994-06-21 Texas Instruments Incorporated Method and apparatus for providing a portable visual display
US5686940A (en) * 1993-12-24 1997-11-11 Rohm Co., Ltd. Display apparatus

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4502035A (en) 1983-07-11 1985-02-26 Obenauf James E Golfer's head motion sensor
US4649504A (en) * 1984-05-22 1987-03-10 Cae Electronics, Ltd. Optical position and orientation measurement techniques
JPH03189683A (en) * 1989-12-19 1991-08-19 Mita Ind Co Ltd Information processor
US5148477A (en) 1990-08-24 1992-09-15 Board Of Regents Of The University Of Oklahoma Method and apparatus for detecting and quantifying motion of a body part
US5214711A (en) 1990-08-24 1993-05-25 Board Of Regents Of The University Of Oklahoma Method and apparatus for detecting and quantifying motion of a body part
US5177872A (en) * 1990-10-05 1993-01-12 Texas Instruments Incorporated Method and apparatus for monitoring physical positioning of a user
US5469511A (en) * 1990-10-05 1995-11-21 Texas Instruments Incorporated Method and apparatus for presentation of on-line directional sound
US5526022A (en) * 1993-01-06 1996-06-11 Virtual I/O, Inc. Sourceless orientation sensor
US5447305A (en) 1993-02-01 1995-09-05 Creative Sports Design, Inc. Baseball batting aid for detecting motion of head in more than one axis of motion
JP3242219B2 (en) * 1993-06-23 2001-12-25 松下電器産業株式会社 Display device and display method
US5482048A (en) 1993-06-30 1996-01-09 University Of Pittsburgh System and method for measuring and quantitating facial movements
US5581670A (en) 1993-07-21 1996-12-03 Xerox Corporation User interface having movable sheet with click-through tools
CA2124624C (en) 1993-07-21 1999-07-13 Eric A. Bier User interface having click-through tools that can be composed with other tools
JPH0764754A (en) * 1993-08-24 1995-03-10 Hitachi Ltd Compact information processor
JP3727954B2 (en) * 1993-11-10 2005-12-21 キヤノン株式会社 Imaging device
JP3850032B2 (en) * 1995-02-13 2006-11-29 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Portable data processing apparatus provided with gravity control sensor for screen and screen orientation
US5689667A (en) 1995-06-06 1997-11-18 Silicon Graphics, Inc. Methods and system of controlling menus with radial and linear portions
US5666499A (en) 1995-08-04 1997-09-09 Silicon Graphics, Inc. Clickaround tool-based graphical interface with two cursors
US5790769A (en) 1995-08-04 1998-08-04 Silicon Graphics Incorporated System for editing time-based temporal digital media including a pointing device toggling between temporal and translation-rotation modes
US5774591A (en) 1995-12-15 1998-06-30 Xerox Corporation Apparatus and method for recognizing facial expressions and facial gestures in a sequence of images
US5802220A (en) 1995-12-15 1998-09-01 Xerox Corporation Apparatus and method for tracking facial motion through a sequence of images
US6118427A (en) 1996-04-18 2000-09-12 Silicon Graphics, Inc. Graphical user interface with optimal transparency thresholds for maximizing user performance and system efficiency
JPH09305743A (en) 1996-05-20 1997-11-28 Toshiba Corp Human face motion detecting system
JPH1049290A (en) * 1996-08-05 1998-02-20 Sony Corp Device and method for processing information
US5973669A (en) 1996-08-22 1999-10-26 Silicon Graphics, Inc. Temporal data control system
US6115028A (en) 1996-08-22 2000-09-05 Silicon Graphics, Inc. Three dimensional input system using tilt
JPH10240436A (en) * 1996-12-26 1998-09-11 Nikon Corp Information processor and recording medium
US6121953A (en) 1997-02-06 2000-09-19 Modern Cartoons, Ltd. Virtual reality system for sensing facial movements
US6009210A (en) * 1997-03-05 1999-12-28 Digital Equipment Corporation Hands-free interface to a virtual reality environment using head tracking
US5930379A (en) 1997-06-16 1999-07-27 Digital Equipment Corporation Method for detecting human body motion in frames of a video sequence
US6115025A (en) 1997-09-30 2000-09-05 Silicon Graphics, Inc. System for maintaining orientation of a user interface as a display changes orientation
US5916181A (en) 1997-10-24 1999-06-29 Creative Sports Designs, Inc. Head gear for detecting head motion and providing an indication of head movement
US6148271A (en) 1998-01-14 2000-11-14 Silicon Pie, Inc. Speed, spin rate, and curve measuring device
US6151563A (en) 1998-01-14 2000-11-21 Silicon Pie, Inc. Speed, spin rate, and curve measuring device using magnetic field sensors
US6005482A (en) 1998-09-17 1999-12-21 Xerox Corporation Surface mounted information collage

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5322441A (en) * 1990-10-05 1994-06-21 Texas Instruments Incorporated Method and apparatus for providing a portable visual display
US5686940A (en) * 1993-12-24 1997-11-11 Rohm Co., Ltd. Display apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1101215A4 *

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1618553A1 (en) * 2003-03-19 2006-01-25 Franklin Dee Martin Multi-media data collection tool kit having an electronic multi-media "case" file and method of use
EP1618553A4 (en) * 2003-03-19 2011-09-21 Franklin Dee Martin Multi-media data collection tool kit having an electronic multi-media "case" file and method of use
EP1501019A3 (en) * 2003-06-14 2007-04-04 Lg Electronics Inc. Apparatus and method for automatically compensating for an image gradient of a mobile communication terminal
US7340342B2 (en) 2003-08-05 2008-03-04 Research In Motion Limited Mobile device with on-screen optical navigation
US7917284B2 (en) 2003-08-05 2011-03-29 Research In Motion Limited Mobile device with on-screen optical navigation
US8290707B2 (en) 2003-08-05 2012-10-16 Research In Motion Limited Mobile device with on-screen optical navigation
US8600669B2 (en) 2003-08-05 2013-12-03 Blackberry Limited Mobile device with on-screen optical navigation
US7672776B2 (en) 2003-08-05 2010-03-02 Research In Motion Limited Mobile device with on-screen optical navigation
US8086397B2 (en) 2003-08-05 2011-12-27 Research In Motion Limited Mobile device with on-screen optical navigation
US7703121B2 (en) 2004-12-03 2010-04-20 Eastman Kodak Company Method of distributing multimedia data to equipment provided with an image sensor
US8983139B2 (en) 2005-01-07 2015-03-17 Qualcomm Incorporated Optical flow based tilt sensor
US8213686B2 (en) 2005-01-07 2012-07-03 Qualcomm Incorporated Optical flow based tilt sensor
EP1748388A3 (en) * 2005-07-25 2010-07-07 Lg Electronics Inc. Mobile communication terminal with means for estimating motion direction and method thereof
EP1748388A2 (en) * 2005-07-25 2007-01-31 LG Electronics Inc. Mobile communication terminal with means for estimating motion direction and method thereof
US8081157B2 (en) 2006-01-25 2011-12-20 Samsung Electronics Co., Ltd. Apparatus and method of scrolling screen in portable device and recording medium storing program for performing the method
EP1837741A2 (en) * 2006-03-23 2007-09-26 Accenture Global Services GmbH Gestural input for navigation and manipulation in virtual space
EP1837741A3 (en) * 2006-03-23 2013-04-03 Accenture Global Services Limited Gestural input for navigation and manipulation in virtual space
EP2262221A1 (en) * 2006-08-07 2010-12-15 Samsung Electronics Co., Ltd. Portable terminal and user interface control method thereof based on pattern recognition and analysis of image captured by camera
EP1887776A1 (en) * 2006-08-07 2008-02-13 Samsung Electronics Co., Ltd. Portable terminal and user interface control method thereof based on pattern recognition and analysis of image captured by camera
US7693333B2 (en) 2006-08-07 2010-04-06 Samsung Electronics Co., Ltd. Portable terminal and user interface control method thereof based on pattern recognition and analysis of image captured by camera
EP2210450A1 (en) * 2007-11-15 2010-07-28 Sk Telecom Co., LTD Method, system and server playing media using user equipment with motion sensor
EP2210450A4 (en) * 2007-11-15 2015-03-25 Sk Planet Co Ltd Method, system and server playing media using user equipment with motion sensor
US20090141147A1 (en) * 2007-11-30 2009-06-04 Koninklijke Kpn N.V. Auto zoom display system and method
EP2065795A1 (en) * 2007-11-30 2009-06-03 Koninklijke KPN N.V. Auto zoom display system and method
CN112577488A (en) * 2020-11-24 2021-03-30 腾讯科技(深圳)有限公司 Navigation route determining method, navigation route determining device, computer equipment and storage medium
CN112577488B (en) * 2020-11-24 2022-09-02 腾讯科技(深圳)有限公司 Navigation route determining method, navigation route determining device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN1300415A (en) 2001-06-20
EP1101215A4 (en) 2007-07-04
EP1101215A1 (en) 2001-05-23
JP2003501762A (en) 2003-01-14
US6288704B1 (en) 2001-09-11

Similar Documents

Publication Publication Date Title
US6288704B1 (en) Motion detection and tracking system to control navigation and display of object viewers
US20020024506A1 (en) Motion detection and tracking system to control navigation and display of object viewers
US20060061551A1 (en) Motion detection and tracking system to control navigation and display of portable displays including on-chip gesture detection
US20060279542A1 (en) Cellular phones and mobile devices with motion driven control
US9880640B2 (en) Multi-dimensional interface
US10275020B2 (en) Natural user interfaces for mobile image viewing
JP5372157B2 (en) User interface for augmented reality
US20110316888A1 (en) Mobile device user interface combining input from motion sensors and other controls
US9070229B2 (en) Manipulation of graphical objects
US9798443B1 (en) Approaches for seamlessly launching applications
US7330198B2 (en) Three-dimensional object manipulating apparatus, method and computer program
US20100174421A1 (en) User interface for mobile devices
US20100275122A1 (en) Click-through controller for mobile interaction
US20060061550A1 (en) Display size emulation system
US20110254792A1 (en) User interface to provide enhanced control of an application program
WO2006036069A1 (en) Information processing system and method
EP1228422A1 (en) Operation method of user interface of hand-held device
US8661352B2 (en) Method, system and controller for sharing data
US9778824B1 (en) Bookmark overlays for displayed content
CN102279700A (en) Display control apparatus, display control method, display control program, and recording medium
Haro et al. Mobile camera-based user interaction
US9665249B1 (en) Approaches for controlling a computing device based on head movement
WO2014178039A1 (en) Scrolling electronic documents with a smartphone
US10585485B1 (en) Controlling content zoom level based on user head movement
US20060176294A1 (en) Cursor for electronic devices

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 00800044.1

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 2000939510

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2001 502109

Country of ref document: JP

Kind code of ref document: A

AK Designated states

Kind code of ref document: A1

Designated state(s): CN JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWP Wipo information: published in national office

Ref document number: 2000939510

Country of ref document: EP