US20070216643A1 - Multipurpose Navigation Keys For An Electronic Device - Google Patents

Multipurpose Navigation Keys For An Electronic Device

Info

Publication number
US20070216643A1
Application US11/750,611, published as US20070216643A1
Authority
US
United States
Prior art keywords
press
navigation
navigation controller
key
current key
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/750,611
Inventor
Robert Morris
Stephen Sullivan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Scenera Technologies LLC
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US11/750,611
Assigned to IPAC ACQUISITION SUBSIDIARY I, LLC. Assignment of assignors interest (see document for details). Assignors: SULLIVAN, STEPHEN G.; MORRIS, ROBERT P.
Assigned to SCENERA TECHNOLOGIES, LLC. Assignment of assignors interest (see document for details). Assignor: IPAC ACQUISITION SUBSIDIARY I, LLC
Publication of US20070216643A1
Current legal status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G06F 3/0485 Scrolling or panning
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0489 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using dedicated keyboard keys or combinations thereof
    • G06F 3/04892 Arrangements for controlling cursor position based on codes indicative of cursor displacements from one discrete location to another, e.g. using cursor control keys associated to different directions or using the tab key

Abstract

A method and apparatus for multipurpose navigation in a portable electronic device are described. According to an exemplary embodiment, a portable electronic imaging device is described including a display screen for displaying objects including at least one of digital still images, video clips, menu items, and icons. The device also includes a navigation controller comprising navigation keys for allowing a user to navigate between the displayed objects by pressing a portion of the navigation controller corresponding to a navigation key last pressed to navigate between the displayed objects, wherein the device is configured to allow the user to select a currently displayed object without moving a finger from the portion of the navigation controller corresponding to the navigation key last pressed to navigate between the displayed objects, thereby implementing navigation and select functions on a single portion of the controller.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a Continuation application of co-pending U.S. patent application Ser. No. 10/869,733, filed Jun. 16, 2004, titled “Multipurpose Navigation Keys For An Electronic Imaging Device,” (now U.S. Pat. No. 7,222,307, issued May 22, 2007), which is commonly owned with this application and is herein incorporated by reference.
  • FIELD OF THE INVENTION
  • The present invention relates generally to portable electronic imaging devices, including digital cameras and cell phones, and more particularly to a method and apparatus for implementing navigation and select functions using a multipurpose navigation key.
  • BACKGROUND OF THE INVENTION
  • Portable electronic imaging devices capable of displaying digital images and video are commonplace today. Examples of such devices include digital cameras, camera-enabled cell phones, MP3 players, and personal digital assistants (PDAs), for instance. FIGS. 1A and 1B are diagrams illustrating example portions of the hardware interface included on conventional imaging devices.
  • Referring to FIG. 1A, a conventional imaging device 10 is equipped with a liquid-crystal display (LCD) or other type of display screen 12 for displaying objects 14. Objects that may be displayed on the display screen may include digital still images, video clips, menu items, and icons. In play mode, the display screen 12 is used as a playback screen for allowing the user to view objects individually or multiple objects at a time. Besides the display screen 12, the hardware user interface also includes a number of keys, buttons or switches for operating the device 10 and for navigating between displayed objects 14. Example keys include zoom keys (not shown) for zooming a displayed image, a navigation controller 18, and a select key 20. A four-way navigation controller 18 is shown in FIG. 1A, which includes four keys: left/right keys 18 a and 18 b, having a horizontal orientation, and up/down keys 18 c and 18 d, having a vertical orientation. FIG. 1B is a diagram similar to FIG. 1A, where like components have like reference numerals, but shows the conventional imaging device 10 with a two-way navigation controller that only includes two keys 18 a and 18 b, rather than four.
  • In both embodiments shown in FIGS. 1A and 1B, a user navigates to a desired object 14 by pressing the navigation controller 18. In the case where a single object 14 is displayed on the screen 12, the displayed object 14 is considered the current selection. In the case where multiple objects 14 are displayed, a highlight or other indication is moved from object 14 to object 14 as the user navigates to indicate the currently selected object 14. Once the user navigates to a desired object 14, the user may initiate the default action associated with the current selection by pressing the select key 20. Examples of actions that can be performed by pressing the select key 20 include edit, open/execute, and delete. The select key 20 is shown in the center of the navigation controller 18 in FIG. 1A, but the select key 20 may also be located outside of the navigation controller, as shown in FIG. 1B. In yet other embodiments, the 2-way/4-way navigation controller 18 may be implemented as an integrated 2-way/4-way key.
  • Although the current solution for allowing a user to navigate among objects and to initiate an action associated with the object 14 using a combination of the navigation controller 18 and the select key 20 works for its intended purposes, this implementation has several disadvantages. First, space for keys is limited on portable imaging devices. Having separate navigation and selection keys 18 and 20 occupies valuable space on the device 10. The user must find and press the right key in the correct sequence, which, given the small keys on many portable devices due to miniaturization, is not always an easy task.
  • In addition, the user must find the right portion of the navigation controller 18 for the direction of navigation desired. Users of devices with navigation controller keys often get unexpected results from pressing an undesired portion of the navigation controller key 18. The most typical error is when the user presses a navigation key when intending to press the selection key 20 to initiate the selection function.
  • Accordingly, what is needed is an improved method and apparatus for implementing the navigation and select functions on the portable electronic imaging device. The present invention addresses such a need.
  • BRIEF SUMMARY OF THE INVENTION
  • The present invention provides a portable electronic imaging device that includes a display screen for displaying objects including any combination of digital still images, video clips, menu items, and icons; and a navigation controller comprising navigation keys for allowing a user to navigate between the displayed objects, wherein the user may select a currently displayed object without moving a finger from a navigation key last pressed, thereby implementing navigation and select functions on a single controller. In the preferred embodiment, the portable imaging device is configured to detect double-presses and press-and-holds on any navigation key, and either or both of these events may be interpreted as a user selection event that invokes the default operation on the currently selected object(s).
  • According to the method and apparatus disclosed herein, the present invention eliminates the need for a user to use a select key, thus reducing user error. In addition, the select key may be eliminated from the device altogether, thereby saving space on navigation controller-equipped devices.
  • BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS
  • FIGS. 1A and 1B are diagrams illustrating example portions of the hardware interface included on conventional imaging devices.
  • FIGS. 2A and 2B are diagrams illustrating hardware user interface embodiments for a portable electronic imaging device having a multipurpose navigation controller in accordance with the present invention.
  • FIG. 3 is a flow diagram illustrating a method for implementing navigation and select functions on a portable electronic imaging device by providing a multipurpose navigation controller in accordance with a preferred embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The present invention relates to implementing navigation and select functions on a portable electronic device. The following description is presented to enable one of ordinary skill in the art to make and use the invention and is provided in the context of a patent application and its requirements. Various modifications to the preferred embodiments and the generic principles and features described herein will be readily apparent to those skilled in the art. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features described herein.
  • The present invention provides an improved method and apparatus for implementing navigation and select functions on a portable electronic imaging device by providing a multipurpose navigation controller that performs both navigation and select functions.
  • FIGS. 2A and 2B are diagrams illustrating hardware user interface embodiments for a portable electronic imaging device having a multipurpose navigation controller in accordance with the present invention, where like components have like reference numerals. The imaging device 50 equipped with the multipurpose navigation controller 56 of the present invention allows a user to select an object 54 displayed on screen 52 without moving his/her finger from the last navigation key 56 pressed. In the preferred embodiment, the portable imaging device 50 is configured to detect double-presses and press-and-holds on any navigation key 56. Either or both of these events may be interpreted as a user selection event, which when detected invokes the default operation on the currently selected object(s). Thus, when a user navigates to a displayed object 54, he/she can simply double-click the last navigation key 56 pressed (or any navigation key) or press-and-hold the last navigation key 56 pressed to select the current object(s) 54. In a further embodiment, the device 50 may be configured to detect double-presses and press-and-holds on any navigation key 56, such that a detected double-press indicates a user selection, while a detected press-and-hold invokes an action on the currently selected object, and vice versa. With the multipurpose navigation controller 56 of the present invention, no separate selection key is required to indicate a selection event, thus eliminating the need for a separate select key, which potentially saves space on the device and reduces user error.
  • In a preferred embodiment, the multi-purpose navigation controller 56 may be implemented as either a 4-way or 2-way navigation controller, as shown in FIGS. 2A and 2B, respectively, and the navigation controller 56 may be implemented with separate navigation keys or as an integrated 4-way/2-way key. Also, in the preferred embodiment, a separate select key is eliminated from the device 50 in order to save space. However, in an alternative embodiment, the device 50 may include a separate select key (not shown) for user convenience, whether located in the center of the navigation control or apart therefrom.
  • FIG. 3 is a flow diagram illustrating a method for implementing navigation and select functions on a portable electronic imaging device by providing a multipurpose navigation controller 56 in accordance with a preferred embodiment of the present invention. The process begins when the device 50 detects that one of the navigation keys 56 has been pressed and released in step 100. If so, the device 50 determines if the time between the previous press of the same key and the current press is less than a stored double-press time in step 102.
  • Referring again to FIGS. 2A and 2B, the double-press time 58 is preferably stored in a non-volatile memory 60 in the device 50 along with a release time 62. In a preferred embodiment, both are configurable (a hypothetical sketch of these parameters appears at the end of this description). Referring to FIGS. 2A, 2B, and 3, if the time between presses is greater than the double-press time 58, in step 102, then the device 50 interprets the key press as a navigation event and displays the next object in step 104 (or moves a highlight to the next object, depending on the current operating mode).
  • According to one aspect of the present invention, the device 50 is further configured to distinguish between fast scrolling during navigation and a double-press as follows. If the time between the previous press of the same key and the current press is less than the stored double-press time 58 in step 102, then the device 50 examines whether the last few presses (e.g., three) were performed on the same navigation key 56 in step 106. If the last few presses were performed on the same key in step 106, then the device 50 determines that the user is fast-scrolling through displayed objects during navigation in step 108. Accordingly, the current key press is interpreted as a navigation event and the next object is displayed, as described in step 104.
  • If the time between the previous press of the same key and the current press is less than the stored double-press time 58 in step 102, but the last few presses were not performed on the same navigation key in step 106, then the current key press is interpreted as a selection event in step 110. In step 112, the device 50 executes the action associated with the currently selected object.
  • Also, according to the present invention, if the device 50 detects that one of the navigation keys is pressed, but not released, for a time greater than the release time 62 in step 114, then this “press-and-hold” is interpreted as a selection event in step 110, and the device 50 executes the action as described in step 112 (the second sketch at the end of this description illustrates this classification).
  • A method and apparatus for implementing the navigation and select functions on the portable electronic imaging device using a multipurpose navigation key has been disclosed. The present invention has been described in accordance with the embodiments shown, and one of ordinary skill in the art will readily recognize that there could be variations to the embodiments, and any variations would be within the spirit and scope of the present invention. Accordingly, many modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the appended claims.
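
As an illustration of the configurable timing parameters referenced in the description above (the double-press time 58 and the release time 62 kept in non-volatile memory 60), the following is a minimal sketch of how such settings might be represented and persisted in device firmware. The field names, default values, and the nvram interface are assumptions made for this example, not details taken from the patent.

```python
from dataclasses import dataclass


@dataclass
class NavTimingConfig:
    """User-configurable timing thresholds corresponding to items 58 and 62 in FIGS. 2A/2B."""
    double_press_time_ms: int = 400  # longest gap between presses of the same key counted as a double-press
    release_time_ms: int = 700       # hold duration beyond which a press is treated as a press-and-hold

    def store(self, nvram) -> None:
        # 'nvram' stands in for whatever non-volatile storage interface the device exposes.
        nvram.write("double_press_time_ms", self.double_press_time_ms)
        nvram.write("release_time_ms", self.release_time_ms)
```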
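
Similarly, the following hypothetical sketch (the patent supplies no code; the class and method names are invented here) illustrates the key-event classification that FIG. 3 describes in steps 100 through 114: each press of a navigation key is interpreted as an ordinary navigation event, a fast-scroll navigation event, or a selection event, based on the stored double-press time, the release time, and whether the last few presses landed on the same key.

```python
import time
from collections import deque


class NavigationKeyHandler:
    """Classifies navigation-key events along the lines of FIG. 3 (steps 100-114)."""

    def __init__(self, config, history_len=3):
        self.config = config                      # a NavTimingConfig as sketched above
        self.history = deque(maxlen=history_len)  # the last few keys pressed
        self.last_press = {}                      # key -> time of its previous press (seconds)

    def on_press_and_release(self, key, now=None):
        """Step 100: a navigation key was pressed and released; classify the event."""
        now = time.monotonic() if now is None else now
        previous = self.last_press.get(key)
        self.last_press[key] = now
        self.history.append(key)

        # Step 102: gap longer than the double-press time -> ordinary navigation (step 104).
        if previous is None or (now - previous) * 1000 > self.config.double_press_time_ms:
            return "navigation"

        # Steps 106/108: several rapid presses all on the same key -> fast scrolling,
        # which is still treated as a navigation event (step 104).
        if len(self.history) == self.history.maxlen and all(k == key for k in self.history):
            return "navigation"

        # Steps 110/112: otherwise the rapid repeat is a double-press, i.e. a selection event.
        return "selection"

    def on_hold(self, key, held_ms):
        """Step 114: a key held without release longer than the release time is a selection event."""
        return "selection" if held_ms > self.config.release_time_ms else "navigation"
```

A "selection" result would then trigger execution of the action associated with the currently selected object (step 112); in the further embodiment described above, one of the two gestures could instead merely mark the selection while the other invokes the action.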

Claims (22)

1. A portable electronic imaging device, comprising:
a display screen for displaying objects including at least one of digital still images, video clips, menu items, and icons; and
a navigation controller comprising navigation keys for allowing a user to navigate between the displayed objects by pressing a portion of the navigation controller corresponding to a navigation key last pressed to navigate between the displayed objects, wherein the device is configured to allow the user to select a currently displayed object without moving a finger from the portion of the navigation controller corresponding to the navigation key last pressed to navigate between the displayed objects, thereby implementing navigation and select functions on a single portion of the controller.
2. The device of claim 1 wherein the device is configured to detect a double-press on any portion of the navigation controller corresponding to a navigation key and is configured to interpret the double-press as a user selection event that invokes an action on the currently selected object.
3. The device of claim 1 wherein the device is configured to detect that a portion of the navigation controller corresponding to a current key has been pressed, and is configured to interpret the current key press as a navigation event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is greater than a predetermined double-press time.
4. The device of claim 3 wherein the device is configured to interpret the current key press as a selection event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is less than the predetermined double-press time.
5. The device of claim 1 wherein the device is configured to detect that a portion of the navigation controller corresponding to a current key has been pressed, and is configured to interpret the current key press as a fast-scrolling event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is less than a predetermined double-press time and a last predetermined plurality of presses of the navigation controller correspond to a same portion of the navigation controller corresponding to the current key.
6. The device of claim 5 wherein the device is configured to interpret the current key press as a selection event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is less than a predetermined double-press time and the last predetermined plurality of presses of the navigation controller does not correspond to the same portion of the navigation controller corresponding to the current key.
7. The device of claim 1 wherein the device is configured to detect a press-and-hold on any portion of the navigation controller corresponding to a navigation key, and is configured to interpret the press-and-hold as a user selection event that invokes an action on the currently selected object.
8. The device of claim 7, wherein the device is configured to detect the press-and-hold when the current key is pressed, but not released, for a time greater than a release time.
9. The device of claim 1 wherein the device is configured to detect both double-presses and press-and-holds on any portion of the navigation controller corresponding to a navigation key, and a detected double-press indicates a user selection of a current object, while a detected press-and-hold invokes an action on the currently selected object.
10. The device of claim 1 wherein the device is configured to detect both double-presses and press-and-holds on any portion of the navigation controller corresponding to a navigation key, and a detected press-and-hold indicates a user selection of a current object, while a detected double-press invokes the action on the currently selected object.
11. The device of claim 1 wherein the device comprises at least one of a digital camera, camera-enabled cell phone, MP3 player, and a personal digital assistant.
12. A portable electronic imaging device, comprising:
a display screen for displaying objects including at least one of digital still images, video clips, menu items, and icons; and
a navigation controller comprising navigation keys for allowing a user to navigate between the displayed objects by pressing a portion of the navigation controller corresponding to a navigation key last pressed to navigate between the displayed objects, wherein the device is configured to detect at least one of a press-and-hold and a double-press of the portion of the navigation controller corresponding to the navigation key, thereby allowing the user to select a currently displayed object without moving a finger from the portion of the navigation controller corresponding to the navigation key last pressed to navigate between the displayed objects.
13. A method for providing a portable electronic imaging device with a multipurpose navigation controller, comprising:
displaying objects on a display screen, the objects including at least one of digital still images, video clips, menu items, and icons; and
providing the device with a navigation controller comprising navigation keys for allowing a user to navigate between the displayed objects by pressing a portion of the navigation controller corresponding to a navigation key last pressed to navigate between the displayed objects, wherein the device is configured to allow the user to select a currently displayed object without moving a finger from the portion of the navigation controller corresponding to the navigation key last pressed to navigate between the displayed objects, thereby implementing navigation and select functions on a single portion of the controller.
14. The method of claim 13 comprising configuring the device to detect a double-press on any portion of the navigation controller corresponding to a navigation key and to interpret the double-press as a user selection event that invokes an action on the currently selected object.
15. The method of claim 13 comprising configuring the device to detect that a portion of the navigation controller corresponding to a current key has been pressed, and to interpret the current key press as a navigation event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is greater than a predetermined double-press time.
16. The method of claim 15 comprising configuring the device to interpret the current key press as a selection event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is less than the predetermined double-press time.
17. The method of claim 13 comprising configuring the device to detect that a portion of the navigation controller corresponding to a current key has been pressed, and to interpret the current key press as a fast-scrolling event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is less than a predetermined double-press time and a last predetermined plurality of presses of the navigation controller correspond to a same portion of the navigation controller corresponding to the current key.
18. The method of claim 17 comprising configuring the device to interpret the current key press as a selection event if the time between previous presses of the same portion of the navigation controller corresponding to the current key and the current key press is less than a predetermined double-press time and the last predetermined plurality of presses of the navigation controller does not correspond to the same portion of the navigation controller corresponding to the current key.
19. The method of claim 13 comprising configuring the device to detect a press-and-hold on any portion of the navigation controller corresponding to a navigation key, and to interpret the press-and-hold as a user selection event that invokes an action on the currently selected object.
20. The method of claim 19, comprising configuring the device to detect the press-and-hold when the current key is pressed, but not released, for a time greater than a release time.
21. The method of claim 13 comprising configuring the device to detect both double-presses and press-and-holds on any portion of the navigation controller corresponding to a navigation key, wherein a detected double-press indicates a user selection, while a detected press-and-hold invokes an action on the currently selected object.
22. The method of claim 13 comprising configuring the device to detect both double-presses and press-and-holds on any portion of the navigation controller corresponding to a navigation key, wherein a detected press-and-hold indicates a user selection, while a detected double-press invokes the action on the currently selected object.
US11/750,611 2004-06-16 2007-05-18 Multipurpose Navigation Keys For An Electronic Device Abandoned US20070216643A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/750,611 US20070216643A1 (en) 2004-06-16 2007-05-18 Multipurpose Navigation Keys For An Electronic Device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/869,733 US7222307B2 (en) 2004-06-16 2004-06-16 Multipurpose navigation keys for an electronic imaging device
US11/750,611 US20070216643A1 (en) 2004-06-16 2007-05-18 Multipurpose Navigation Keys For An Electronic Device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/869,733 Continuation US7222307B2 (en) 2004-06-16 2004-06-16 Multipurpose navigation keys for an electronic imaging device

Publications (1)

Publication Number Publication Date
US20070216643A1 true US20070216643A1 (en) 2007-09-20

Family

ID=35482006

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/869,733 Expired - Fee Related US7222307B2 (en) 2004-06-16 2004-06-16 Multipurpose navigation keys for an electronic imaging device
US11/750,611 Abandoned US20070216643A1 (en) 2004-06-16 2007-05-18 Multipurpose Navigation Keys For An Electronic Device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/869,733 Expired - Fee Related US7222307B2 (en) 2004-06-16 2004-06-16 Multipurpose navigation keys for an electronic imaging device

Country Status (5)

Country Link
US (2) US7222307B2 (en)
EP (1) EP1766626A2 (en)
JP (1) JP4403260B2 (en)
CN (1) CN101189567A (en)
WO (1) WO2006009692A2 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090135142A1 (en) * 2007-11-27 2009-05-28 Motorola, Inc. Data entry device and method
US8542133B2 (en) 2007-07-06 2013-09-24 Synaptics Incorporated Backlit haptic key
US8599047B2 (en) * 2007-07-06 2013-12-03 Synaptics Incorporated Haptic keyboard assemblies and methods
CN105988686A (en) * 2015-06-10 2016-10-05 乐视致新电子科技(天津)有限公司 Play interface display method and device as well as terminal

Families Citing this family (123)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8645137B2 (en) 2000-03-16 2014-02-04 Apple Inc. Fast, language-independent method for user authentication by voice
US7073130B2 (en) * 2001-01-31 2006-07-04 Microsoft Corporation Methods and systems for creating skins
US6791581B2 (en) * 2001-01-31 2004-09-14 Microsoft Corporation Methods and systems for synchronizing skin properties
US7417625B2 (en) * 2004-04-29 2008-08-26 Scenera Technologies, Llc Method and system for providing input mechanisms on a handheld electronic device
US7222307B2 (en) * 2004-06-16 2007-05-22 Scenera Technologies, Llc Multipurpose navigation keys for an electronic imaging device
US20060015826A1 (en) * 2004-07-13 2006-01-19 Sony Corporation Hard disk multimedia player and method
US20060199616A1 (en) * 2005-03-03 2006-09-07 Agere Systems Inc. Mobile communication device having automatic scrolling capability and method of operation thereof
JP4515409B2 (en) 2005-05-20 2010-07-28 エルジー エレクトロニクス インコーポレイティド Continuous click device for mobile communication terminal and execution method thereof
US20070040808A1 (en) * 2005-08-22 2007-02-22 Creative Technology Ltd. User configurable button
US8677377B2 (en) 2005-09-08 2014-03-18 Apple Inc. Method and apparatus for building an intelligent automated assistant
KR100738902B1 (en) * 2006-03-16 2007-07-12 삼성전자주식회사 Apparatus and method for inputting characters in portable terminal
KR100738901B1 (en) * 2006-03-16 2007-07-12 삼성전자주식회사 Apparatus and method for inputting characters in portable terminal
US20070290992A1 (en) * 2006-06-16 2007-12-20 Creative Technology Ltd Control interface for media player
US9378343B1 (en) 2006-06-16 2016-06-28 Nokia Corporation Automatic detection of required network key type
JP2008040019A (en) * 2006-08-03 2008-02-21 Toshiba Corp Mobile terminal
US9318108B2 (en) 2010-01-18 2016-04-19 Apple Inc. Intelligent automated assistant
US8564543B2 (en) * 2006-09-11 2013-10-22 Apple Inc. Media player with imaged based browsing
US8736557B2 (en) 2006-09-11 2014-05-27 Apple Inc. Electronic device with image based browsers
US8281041B2 (en) * 2006-11-22 2012-10-02 Carefusion 303, Inc. System and method for preventing keypad entry errors
US9001047B2 (en) * 2007-01-07 2015-04-07 Apple Inc. Modal change based on orientation of a portable multifunction device
US8977255B2 (en) 2007-04-03 2015-03-10 Apple Inc. Method and system for operating a multi-function portable electronic device using voice-activation
US7979805B2 (en) * 2007-05-21 2011-07-12 Microsoft Corporation Button discoverability
US9933937B2 (en) 2007-06-20 2018-04-03 Apple Inc. Portable multifunction device, method, and graphical user interface for playing online videos
US9330720B2 (en) 2008-01-03 2016-05-03 Apple Inc. Methods and apparatus for altering audio output signals
US8327272B2 (en) 2008-01-06 2012-12-04 Apple Inc. Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars
US8996376B2 (en) 2008-04-05 2015-03-31 Apple Inc. Intelligent text-to-speech conversion
US10496753B2 (en) 2010-01-18 2019-12-03 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US20100030549A1 (en) 2008-07-31 2010-02-04 Lee Michael M Mobile device having human language translation capability with positional feedback
US8862252B2 (en) * 2009-01-30 2014-10-14 Apple Inc. Audio user interface for displayless electronic device
US8913771B2 (en) * 2009-03-04 2014-12-16 Apple Inc. Portable electronic device having a water exposure indicator label
US10255566B2 (en) 2011-06-03 2019-04-09 Apple Inc. Generating and processing task items that represent tasks to perform
US10241752B2 (en) 2011-09-30 2019-03-26 Apple Inc. Interface for a virtual digital assistant
US10241644B2 (en) 2011-06-03 2019-03-26 Apple Inc. Actionable reminder entries
US9858925B2 (en) 2009-06-05 2018-01-02 Apple Inc. Using context information to facilitate processing of commands in a virtual assistant
US9431006B2 (en) 2009-07-02 2016-08-30 Apple Inc. Methods and apparatuses for automatic speech recognition
US8736561B2 (en) 2010-01-06 2014-05-27 Apple Inc. Device, method, and graphical user interface with content display modes and display rotation heuristics
US8438504B2 (en) 2010-01-06 2013-05-07 Apple Inc. Device, method, and graphical user interface for navigating through multiple viewing areas
US10276170B2 (en) 2010-01-18 2019-04-30 Apple Inc. Intelligent automated assistant
US10553209B2 (en) 2010-01-18 2020-02-04 Apple Inc. Systems and methods for hands-free notification summaries
US10679605B2 (en) 2010-01-18 2020-06-09 Apple Inc. Hands-free list-reading by intelligent automated assistant
US10705794B2 (en) 2010-01-18 2020-07-07 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US8682667B2 (en) 2010-02-25 2014-03-25 Apple Inc. User profiling for selecting user specific voice input processing information
US8572489B2 (en) * 2010-12-16 2013-10-29 Harman International Industries, Incorporated Handlebar audio controls
US9262612B2 (en) 2011-03-21 2016-02-16 Apple Inc. Device access using voice authentication
US10057736B2 (en) 2011-06-03 2018-08-21 Apple Inc. Active transport based notifications
US8994660B2 (en) 2011-08-29 2015-03-31 Apple Inc. Text correction processing
US9483461B2 (en) 2012-03-06 2016-11-01 Apple Inc. Handling speech synthesis of content for multiple languages
US9280610B2 (en) 2012-05-14 2016-03-08 Apple Inc. Crowd sourcing information to fulfill user requests
US9721563B2 (en) 2012-06-08 2017-08-01 Apple Inc. Name recognition system
US9495129B2 (en) 2012-06-29 2016-11-15 Apple Inc. Device, method, and user interface for voice-activated navigation and browsing of a document
US9547647B2 (en) 2012-09-19 2017-01-17 Apple Inc. Voice-based media searching
WO2014197334A2 (en) 2013-06-07 2014-12-11 Apple Inc. System and method for user-specified pronunciation of words for speech synthesis and recognition
US9582608B2 (en) 2013-06-07 2017-02-28 Apple Inc. Unified ranking with entropy-weighted information for phrase-based semantic auto-completion
WO2014197336A1 (en) 2013-06-07 2014-12-11 Apple Inc. System and method for detecting errors in interactions with a voice-based digital assistant
WO2014197335A1 (en) 2013-06-08 2014-12-11 Apple Inc. Interpreting and acting upon commands that involve sharing information with remote devices
US10176167B2 (en) 2013-06-09 2019-01-08 Apple Inc. System and method for inferring user intent from speech inputs
EP3937002A1 (en) 2013-06-09 2022-01-12 Apple Inc. Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant
US9430463B2 (en) 2014-05-30 2016-08-30 Apple Inc. Exemplar-based natural language processing
US10078631B2 (en) 2014-05-30 2018-09-18 Apple Inc. Entropy-guided text prediction using combined word and character n-gram language models
US9842101B2 (en) 2014-05-30 2017-12-12 Apple Inc. Predictive conversion of language input
US9760559B2 (en) 2014-05-30 2017-09-12 Apple Inc. Predictive text input
US9715875B2 (en) 2014-05-30 2017-07-25 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
AU2015266863B2 (en) 2014-05-30 2018-03-15 Apple Inc. Multi-command single utterance input method
US9785630B2 (en) 2014-05-30 2017-10-10 Apple Inc. Text prediction using combined word N-gram and unigram language models
US10659851B2 (en) 2014-06-30 2020-05-19 Apple Inc. Real-time digital assistant knowledge updates
US9338493B2 (en) 2014-06-30 2016-05-10 Apple Inc. Intelligent automated assistant for TV user interactions
WO2016022496A2 (en) 2014-08-06 2016-02-11 Apple Inc. Reduced-size user interfaces for battery management
US10446141B2 (en) 2014-08-28 2019-10-15 Apple Inc. Automatic speech recognition based on user feedback
JP6349030B2 (en) 2014-09-02 2018-06-27 アップル インコーポレイテッド Small interface for managing alerts
US9818400B2 (en) 2014-09-11 2017-11-14 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US10789041B2 (en) 2014-09-12 2020-09-29 Apple Inc. Dynamic thresholds for always listening speech trigger
US10074360B2 (en) 2014-09-30 2018-09-11 Apple Inc. Providing an indication of the suitability of speech recognition
US9668121B2 (en) 2014-09-30 2017-05-30 Apple Inc. Social reminders
US9886432B2 (en) 2014-09-30 2018-02-06 Apple Inc. Parsimonious handling of word inflection via categorical stem + suffix N-gram language models
US9646609B2 (en) 2014-09-30 2017-05-09 Apple Inc. Caching apparatus for serving phonetic pronunciations
US10127911B2 (en) 2014-09-30 2018-11-13 Apple Inc. Speaker identification and unsupervised speaker adaptation techniques
US10552013B2 (en) 2014-12-02 2020-02-04 Apple Inc. Data detection
US9865280B2 (en) 2015-03-06 2018-01-09 Apple Inc. Structured dictation using intelligent automated assistants
US10567477B2 (en) 2015-03-08 2020-02-18 Apple Inc. Virtual assistant continuity
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US9721566B2 (en) 2015-03-08 2017-08-01 Apple Inc. Competing devices responding to voice triggers
US9899019B2 (en) 2015-03-18 2018-02-20 Apple Inc. Systems and methods for structured stem and suffix language models
US9842105B2 (en) 2015-04-16 2017-12-12 Apple Inc. Parsimonious continuous-space phrase representations for natural language processing
US10083688B2 (en) 2015-05-27 2018-09-25 Apple Inc. Device voice control for selecting a displayed affordance
US10127220B2 (en) 2015-06-04 2018-11-13 Apple Inc. Language identification from short strings
US9578173B2 (en) 2015-06-05 2017-02-21 Apple Inc. Virtual assistant aided communication with 3rd party service in a communication session
US10101822B2 (en) 2015-06-05 2018-10-16 Apple Inc. Language input correction
US10255907B2 (en) 2015-06-07 2019-04-09 Apple Inc. Automatic accent detection using acoustic models
US10186254B2 (en) 2015-06-07 2019-01-22 Apple Inc. Context-based endpoint detection
US11025565B2 (en) 2015-06-07 2021-06-01 Apple Inc. Personalized prediction of responses for instant messaging
US10747498B2 (en) 2015-09-08 2020-08-18 Apple Inc. Zero latency digital assistant
US10671428B2 (en) 2015-09-08 2020-06-02 Apple Inc. Distributed personal assistant
US9697820B2 (en) 2015-09-24 2017-07-04 Apple Inc. Unit-selection text-to-speech synthesis using concatenation-sensitive neural networks
US11010550B2 (en) 2015-09-29 2021-05-18 Apple Inc. Unified language modeling framework for word prediction, auto-completion and auto-correction
US10366158B2 (en) 2015-09-29 2019-07-30 Apple Inc. Efficient word encoding for recurrent neural network language models
US11587559B2 (en) 2015-09-30 2023-02-21 Apple Inc. Intelligent device identification
US10691473B2 (en) 2015-11-06 2020-06-23 Apple Inc. Intelligent automated assistant in a messaging environment
US10049668B2 (en) 2015-12-02 2018-08-14 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
US10223066B2 (en) 2015-12-23 2019-03-05 Apple Inc. Proactive assistance based on dialog communication between devices
US10446143B2 (en) 2016-03-14 2019-10-15 Apple Inc. Identification of voice inputs providing credentials
US9934775B2 (en) 2016-05-26 2018-04-03 Apple Inc. Unit-selection text-to-speech synthesis based on predicted concatenation parameters
US9972304B2 (en) 2016-06-03 2018-05-15 Apple Inc. Privacy preserving distributed evaluation framework for embedded personalized systems
US10249300B2 (en) 2016-06-06 2019-04-02 Apple Inc. Intelligent list reading
US10049663B2 (en) 2016-06-08 2018-08-14 Apple, Inc. Intelligent automated assistant for media exploration
DK179588B1 (en) 2016-06-09 2019-02-22 Apple Inc. Intelligent automated assistant in a home environment
US10509862B2 (en) 2016-06-10 2019-12-17 Apple Inc. Dynamic phrase expansion of language input
US10586535B2 (en) 2016-06-10 2020-03-10 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US10192552B2 (en) 2016-06-10 2019-01-29 Apple Inc. Digital assistant providing whispered speech
US10490187B2 (en) 2016-06-10 2019-11-26 Apple Inc. Digital assistant providing automated status report
US10067938B2 (en) 2016-06-10 2018-09-04 Apple Inc. Multilingual word prediction
DK179049B1 (en) 2016-06-11 2017-09-18 Apple Inc Data driven natural language event detection and classification
DK201670540A1 (en) 2016-06-11 2018-01-08 Apple Inc Application integration with a digital assistant
DK179343B1 (en) 2016-06-11 2018-05-14 Apple Inc Intelligent task discovery
DK179415B1 (en) 2016-06-11 2018-06-14 Apple Inc Intelligent device arbitration and control
US10043516B2 (en) 2016-09-23 2018-08-07 Apple Inc. Intelligent automated assistant
US10593346B2 (en) 2016-12-22 2020-03-17 Apple Inc. Rank-reduced token representation for automatic speech recognition
DK201770439A1 (en) 2017-05-11 2018-12-13 Apple Inc. Offline personal assistant
DK179496B1 (en) 2017-05-12 2019-01-15 Apple Inc. USER-SPECIFIC Acoustic Models
DK179745B1 (en) 2017-05-12 2019-05-01 Apple Inc. SYNCHRONIZATION AND TASK DELEGATION OF A DIGITAL ASSISTANT
DK201770432A1 (en) 2017-05-15 2018-12-21 Apple Inc. Hierarchical belief states for digital assistants
DK201770431A1 (en) 2017-05-15 2018-12-20 Apple Inc. Optimizing dialogue policy decisions for digital assistants using implicit feedback
DK179560B1 (en) 2017-05-16 2019-02-18 Apple Inc. Far-field extension for digital assistant services
EP3814877A1 (en) * 2018-06-05 2021-05-05 Ellodee Inc. Portable streaming audio player

Citations (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4456931A (en) * 1980-10-31 1984-06-26 Nippon Kogaku K.K. Electronic camera
US4937676A (en) * 1989-02-10 1990-06-26 Polaroid Corporation Electronic camera system with detachable printer
US4982291A (en) * 1987-08-27 1991-01-01 Casio Computer Co., Ltd. Electronic still video camera capable of searching desired picture in simple and quick manner
US5021989A (en) * 1986-04-28 1991-06-04 Hitachi, Ltd. Document browsing apparatus with concurrent processing and retrieval
US5138460A (en) * 1987-08-20 1992-08-11 Canon Kabushiki Kaisha Apparatus for forming composite images
US5237648A (en) * 1990-06-08 1993-08-17 Apple Computer, Inc. Apparatus and method for editing a video recording by selecting and displaying video clips
US5274458A (en) * 1991-01-25 1993-12-28 Sony Corporation Video camera
US5465133A (en) * 1988-10-04 1995-11-07 Asahi Kogaku Kogyo Kabushiki Kaisha Still video camera
US5497193A (en) * 1992-10-29 1996-03-05 Sony Corporation Electronic still camera with dual contact shutter switch for picture review
US5513306A (en) * 1990-08-09 1996-04-30 Apple Computer, Inc. Temporal event viewing and editing system
US5559943A (en) * 1994-06-27 1996-09-24 Microsoft Corporation Method and apparatus customizing a dual actuation setting of a computer input device switch
US5608491A (en) * 1994-02-04 1997-03-04 Nikon Corporation Camera with simplified parameter selection and dual mode operation and method of operation
US5635984A (en) * 1991-12-11 1997-06-03 Samsung Electronics Co., Ltd. Multi-picture control circuit and method for electronic still camera
US5682207A (en) * 1993-02-26 1997-10-28 Sony Corporation Image display apparatus for simultaneous display of a plurality of images
US5742339A (en) * 1994-12-27 1998-04-21 Asahi Kogaku Kogyo Kabushiki Kaisha Electronic still video camera
US5781175A (en) * 1986-04-21 1998-07-14 Canon Kabushiki Kaisha Image search apparatus
US5796428A (en) * 1993-10-21 1998-08-18 Hitachi, Ltd. Electronic photography system
US5845166A (en) * 1997-02-20 1998-12-01 Eastman Kodak Company Hybrid camera with identification matching of film and electronic images
US5861918A (en) * 1997-01-08 1999-01-19 Flashpoint Technology, Inc. Method and system for managing a removable memory in a digital camera
US5943043A (en) * 1995-11-09 1999-08-24 International Business Machines Corporation Touch panel "double-touch" input method and detection apparatus
US5969708A (en) * 1996-10-15 1999-10-19 Trimble Navigation Limited Time dependent cursor tool
US5977976A (en) * 1995-04-19 1999-11-02 Canon Kabushiki Kaisha Function setting apparatus
US6097431A (en) * 1996-09-04 2000-08-01 Flashpoint Technology, Inc. Method and system for reviewing and navigating among images on an image capture unit
US6122003A (en) * 1997-08-22 2000-09-19 Flashpoint Technology, Inc. Method and apparatus for changing operating modes of an image capture device
US6160926A (en) * 1998-08-07 2000-12-12 Hewlett-Packard Company Appliance and method for menu navigation
US20040036680A1 (en) * 2002-08-26 2004-02-26 Mark Davis User-interface features for computers with contact-sensitive displays
US6741232B1 (en) * 2002-01-23 2004-05-25 Good Technology, Inc. User interface for a data processing apparatus
US20050009571A1 (en) * 2003-02-06 2005-01-13 Chiam Thor Itt Main menu navigation principle for mobile phone user
US20050283729A1 (en) * 2004-06-16 2005-12-22 Morris Robert P Multipurpose navigation keys for an electronic imaging device
US6995875B2 (en) * 2000-06-07 2006-02-07 Hewlett-Packard Development Company, L.P. Appliance and method for navigating among multiple captured images and functional menus
US7058432B2 (en) * 2001-04-20 2006-06-06 Mitsubishi Denki Kabushiki Kaisha Pointing device and mobile telephone
US7345671B2 (en) * 2001-10-22 2008-03-18 Apple Inc. Method and apparatus for use of rotational user inputs

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0832847A (en) 1994-07-13 1996-02-02 Fuji Photo Film Co Ltd Electronic still camera and its control method
JP3399698B2 (en) 1994-08-23 2003-04-21 株式会社日立製作所 Recording device with camera
JPH08205014A (en) 1995-01-31 1996-08-09 Casio Comput Co Ltd Electronic still camera
JPH08223524A (en) 1995-02-08 1996-08-30 Hitachi Ltd Portable video camera

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8542133B2 (en) 2007-07-06 2013-09-24 Synaptics Incorporated Backlit haptic key
US8599047B2 (en) * 2007-07-06 2013-12-03 Synaptics Incorporated Haptic keyboard assemblies and methods
US20090135142A1 (en) * 2007-11-27 2009-05-28 Motorola, Inc. Data entry device and method
CN105988686A (en) * 2015-06-10 2016-10-05 乐视致新电子科技(天津)有限公司 Play interface display method and device as well as terminal

Also Published As

Publication number Publication date
JP4403260B2 (en) 2010-01-27
WO2006009692A2 (en) 2006-01-26
EP1766626A2 (en) 2007-03-28
US7222307B2 (en) 2007-05-22
WO2006009692A3 (en) 2008-02-07
JP2008503930A (en) 2008-02-07
CN101189567A (en) 2008-05-28
US20050283729A1 (en) 2005-12-22

Similar Documents

Publication Publication Date Title
US7222307B2 (en) Multipurpose navigation keys for an electronic imaging device
JP4740971B2 (en) Terminal device with display function
US7984381B2 (en) User interface
US9128597B2 (en) Method for switching user interface, electronic device and recording medium using the same
KR100814395B1 (en) Apparatus and Method for Controlling User Interface Using Jog Shuttle and Navigation Key
JP5192486B2 (en) Input device
US8072435B2 (en) Mobile electronic device, method for entering screen lock state and recording medium thereof
US8294679B2 (en) Display and operation device, operation device, and program
US8629929B2 (en) Image processing apparatus and image processing method
US20050184972A1 (en) Image display apparatus and image display method
KR100869950B1 (en) configuration structure of extendable idle screen of mobile device and display method thereof
KR20070085631A (en) Portable electronic device having user interactive visual interface
US20100097322A1 (en) Apparatus and method for switching touch screen operation
US20030001863A1 (en) Portable digital devices
JP5782168B2 (en) Input / output method and electronic equipment
KR20080079191A (en) Method and device for displaying contents in sliding-type mobile phone
JPH08237338A (en) Roller bar menu access equipment for cellular telephone set and its method
US20030081008A1 (en) Method and apparatus for controlling an electronic device via a menu displayed on a display screen of the electronic device
EP2334038A1 (en) Portable terminal device, image display method used for same, and recording medium to record program for same
WO2003077098A1 (en) Mobile communication device, display control method of mobile communication device, and program therefor
KR100771615B1 (en) A mobile telecommunication device and a data processing method using the device
KR101129661B1 (en) Mobile terminal
JP2009245037A (en) Display method of information display
KR101339837B1 (en) method of executing related function in mobile device having both-sided display
KR20070037549A (en) Mobile communication terminal with multiple input function and its input method

Legal Events

Date Code Title Description
AS Assignment

Owner name: IPAC ACQUISITION SUBSIDIARY I, LLC, NEW HAMPSHIRE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MORRIS, ROBERT P.;SULLIVAN, STEPHEN G.;REEL/FRAME:019319/0719;SIGNING DATES FROM 20040615 TO 20040616

Owner name: SCENERA TECHNOLOGIES, LLC, NEW HAMPSHIRE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IPAC ACQUISITION SUBSIDIARY I, LLC;REEL/FRAME:019319/0745

Effective date: 20061102

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION