US20040036680A1 - User-interface features for computers with contact-sensitive displays - Google Patents
- Publication number
- US20040036680A1 (application Ser. No. 10/452,233)
- Authority
- US
- United States
- Prior art keywords
- display
- input area
- active input
- user
- portable computer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/161—Indexing scheme relating to constructional details of the monitor
- G06F2200/1614—Image rotation following screen orientation, e.g. switching from landscape to portrait mode
Abstract
Description
- The application claims benefit of priority to U.S. Provisional Application No. 60/406,264, filed Aug. 26, 2002, entitled “User interface features for a handheld computer,” and naming Mark Davis and Carlo Bernoulli as inventors, the aforementioned priority application being hereby incorporated by reference for all purposes in its entirety.
- The present invention relates to user-interfaces for computers. In particular, the present invention relates to user-interface features for computers with contact-sensitive displays.
- Personal digital assistants (PDAs) are typical of computers that utilize contact-sensitive displays. A PDA is small in size, usually suited to being held in one of the user's hands and operated with the other. The display of the PDA is used to provide additional input functionality in lieu of a large keyboard, a mouse or other input mechanism that is incompatible with the size and portability of the PDA.
- PDAs often provide an active input area on the display, which is a designated region on the display where most of the user-contact and input is entered. One type of active input area used in PALM OS and POCKET PC devices provides for a handwriting recognition area to appear on the display. The user can form strokes on the region of the display where the handwriting recognition area is provided, and technology such as that provided by GRAFFITI or JOT is used to recognize the strokes as characters.
- Because the handwriting recognition area is often a frequent location of the user's attention, other input functionality is usually provided in conjunction with or next to the handwriting recognition area. This other input functionality is often in the form of icons and task bars that can be selected in order to cause the PDA to perform some function. In addition, electronic keyboards can be substituted on the display in place of the handwriting recognition area.
- Recently, devices such as TABLET PCs have become popular. Such devices also utilize an immediate handwriting recognition square for recognizing contact strokes provided on a display as characters.
- Embodiments of the invention provide for a configurable user-interface for a computer. Embodiments of the invention may apply to a handheld computer, such as a PDA, having an active input area, where handwriting recognition or digital keyboards may be displayed.
- According to one embodiment, input features such as icons provided with the active input area may be substituted in exchange for other input features.
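One way to picture this substitution behavior is the following sketch. The class, method names, and icon names are purely illustrative assumptions; the patent does not prescribe any implementation.

```python
# Hypothetical sketch of the feature-substitution flow: a qualifying tap
# event on a displayed feature opens a list of alternatives, and selecting
# one swaps it into the same position in the active input area.

class InputArea:
    def __init__(self, features, alternatives):
        self.features = list(features)          # features currently displayed
        self.alternatives = list(alternatives)  # features available as substitutes
        self.pending = None                     # feature selected for substitution

    def on_tap_event(self, feature, qualifies):
        """A qualifying tap event (e.g. a tap-and-hold) opens the alternatives list."""
        if not qualifies or feature not in self.features:
            return []                           # ordinary taps are handled elsewhere
        self.pending = feature
        return self.alternatives                # list shown to the user

    def on_list_selection(self, choice):
        """Swap the chosen alternative in; the old feature becomes available again."""
        if self.pending is None or choice not in self.alternatives:
            return
        idx = self.features.index(self.pending)
        self.features[idx] = choice             # same position in the input area
        self.alternatives.remove(choice)
        self.alternatives.append(self.pending)  # old feature can be re-selected later
        self.pending = None
```

A non-qualifying tap returns an empty list, reflecting that plain taps perform the feature's own function rather than opening the substitution list.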
- According to another embodiment, a display of the handheld computer may be provided in a portrait mode, with a left or right handed orientation. In providing the handedness orientation, the placement and orientation of the active input area in relation to other portions of the display is considered in order to facilitate users who are either left or right handed.
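A minimal sketch of one way such a handedness consideration could be realized in landscape mode. The docking rule (input area nearest the writing hand, so the hand does not cover content while entering strokes) and all coordinates are assumptions, not details from the patent.

```python
# Illustrative handedness-aware landscape layout: returns rectangles as
# (x, y, width, height) tuples for the content region and the active input area.

def landscape_layout(screen_w, screen_h, input_w, handedness):
    """Place the active input area at the edge nearest the writing hand."""
    if handedness == "right":
        # input area docked at the right edge; content fills the left
        input_rect = (screen_w - input_w, 0, input_w, screen_h)
        content_rect = (0, 0, screen_w - input_w, screen_h)
    else:
        # left-handed orientation: mirror image, input area at the left edge
        input_rect = (0, 0, input_w, screen_h)
        content_rect = (input_w, 0, screen_w - input_w, screen_h)
    return content_rect, input_rect
```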
- Other embodiments provide a feedback feature that echoes back to the user a particular character that was just entered through a handwriting recognition scheme. The particular character that is echoed back may be a glyph (e.g. a character before it is displayed as an alphabet or Roman numeral character) that the handheld computer determines match to a handwriting stroke of the user.
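The echo-back feedback might be sketched as follows. The stroke table is a stand-in for a real recognizer (the patent does not define a recognition algorithm), and the stroke names are invented for illustration.

```python
# Stand-in recognizer: maps a simplified stroke description to a glyph.
STROKE_TABLE = {
    "down": "i",
    "down-right": "l",
    "circle": "o",
}

def recognize_and_echo(stroke, echo):
    """Look up the stroke, echo the matched glyph back to the user, return it."""
    glyph = STROKE_TABLE.get(stroke)
    if glyph is None:
        return None   # unrecognized stroke: nothing to echo
    echo(glyph)       # feedback: show the user which glyph was matched
    return glyph
```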
- Still further, another embodiment provides for a configurable handwriting recognition area for an active input area. In particular, the handwriting recognition area portion of the active input area may be configurable in terms of the number of cells provided, the shape of each cell, the functionality provided by each cell (e.g. what kind of characters are to be recognized in a particular cell) and the dimensions of each cell in both the lengthwise and widthwise directions.
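The cell configurability described above can be sketched as a list of (mode, width) specifications laid out left to right. The recognition modes and pixel widths are illustrative assumptions.

```python
# Configurable handwriting recognition area: the number of cells, each cell's
# recognition mode, and each cell's width are all parameters.

def build_cells(specs):
    """specs: list of (mode, width) pairs, left to right. Returns cell ranges."""
    cells, x = [], 0
    for mode, width in specs:
        cells.append({"mode": mode, "left": x, "right": x + width})
        x += width
    return cells

def mode_at(cells, x):
    """Recognition mode for a stroke starting at horizontal position x."""
    for cell in cells:
        if cell["left"] <= x < cell["right"]:
            return cell["mode"]
    return None  # stroke began outside the recognition area

# a triple-cell configuration: lowercase | uppercase | numeric
triple = build_cells([("lowercase", 100), ("uppercase", 80), ("numeric", 60)])
```

Changing the `specs` list changes the number of cells, their order, and their relative widths, mirroring the configurability described in the text.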
- Embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings. Like reference numerals are intended to refer to similar elements among different figures.
- FIG. 1 is a simplified frontal view of a handheld computer with a configurable active input area, under an embodiment of the invention.
- FIGS. 2A-2D illustrate screen shots of a configurable active input area, under one or more embodiments of the invention.
- FIG. 3 describes a method for replacing elements of an active input area with other elements.
- FIGS. 4A-4C illustrate screen shots of an icon in an active input area being replaced by another icon.
- FIGS. 5A-5B illustrate screen shots of a handwriting recognition aid, under an embodiment of the invention.
- FIGS. 6A-6C are simplified frontal views of a handheld computer that has user-interface features which can be positioned to facilitate landscape modes with handedness orientation.
- FIGS. 7A-7D illustrate screen shots of a display of a handheld computer where different active input areas are displayed in left and right handedness orientations.
- FIG. 8 is a block diagram that illustrates a portable computer upon which an embodiment of the invention may be implemented.
- Embodiments of the invention provide a set of configurable user-interface features for computers that have contact-sensitive displays. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the present invention.
- Embodiments described herein provide for a portable computer with a contact-sensitive display having a user-interface that is configurable through user-contact with a display surface. In some embodiments, an active input area is provided that is configurable in appearance and functionality. As will be described, the configurable nature of the active input area allows for a flexible user-interface that can accommodate, amongst other considerations, left and right handedness, special business applications, and user-preferences.
- For purpose of description, embodiments of the invention are described in the context of handheld computers, such as PDAs and smart cell phones, which use contact-sensitive displays. Handheld computers, in particular, illustrate the problem of maximizing user-interface functionality and preferences on a device with a relatively small profile. Embodiments of the invention may also be employed with other types of computers that have contact-sensitive displays, such as on tablet computers, laptops and other portable computers.
- In one embodiment, a user-interface can be configured on a computer with a contact-sensitive display. A set of features that are selectable through contact with the display of the computer may be provided on a designated region of the computer's display. When selected, the features cause the computer to perform some function associated with that feature. A tap event, corresponding to an object making a specific form of contact with the display, may be entered by the user to initiate a substitution of one feature for another feature in the designated region. In response to the tap event, a list of alternative features is provided to the user. A selection of one of the alternative features is detected by the user once again making contact with the display. Then the selected alternative feature is provided on the display instead of the feature that was associated with the tap event.
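The tap events that trigger this substitution (described in more detail with FIG. 3 below) could be distinguished from ordinary taps roughly as follows. The thresholds are assumptions for illustration; the patent leaves them as design choices.

```python
# Hypothetical classifier for tap-event variants. A tap record carries its
# press duration, how far the object traveled while in contact, and how many
# taps occurred in quick succession.

HOLD_MS = 500   # minimum press duration for a tap-and-hold (assumed)
DRAG_PX = 12    # minimum travel for a tap-and-drag (assumed)
MULTI_TAPS = 2  # taps in quick succession for a double/triple-tap (assumed)

def classify_tap(duration_ms, travel_px, tap_count):
    """Distinguish substitution gestures from an ordinary selecting tap."""
    if tap_count >= MULTI_TAPS:
        return "multi-tap"
    if travel_px >= DRAG_PX:
        return "tap-and-drag"
    if duration_ms >= HOLD_MS:
        return "tap-and-hold"
    return "tap"  # plain tap: perform the icon's own function instead
```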
- According to another embodiment, a portable computer is provided that includes a housing, a contact-sensitive display and a processor. The processor is configured to provide an active input area on the display. The active input area includes functionality where the processor recognizes strokes entered on the display as characters. The portable computer may be oriented in a portrait mode, where the active input area extends primarily in a left-right direction from a perspective of a user that operates the portable computer. The portable computer may also be oriented in a landscape mode, where the active input area extends primarily in a top-bottom direction from the perspective of the user. When in the landscape mode, the processor is configured to provide a handedness orientation for the active input area with respect to the display and other features of the
handheld computer 100. - With respect to embodiments such as described below, an active input area refers to a graphic, contact-sensitive input mechanism provided on a display surface of a computer. The active input area provides functionality that is oriented for making the active input area the primary focus of the user when the user is interacting with the computer. Accordingly, the active input area may provide a handwriting recognition area, keypad, and/or a keyboard that enables a large number of possible user contacts to be entered and uniquely interpreted from one designated region of the display. To provide an example, in one embodiment, an active input area may include a display region designated for recognizing certain user-contacts as character input, including alphabet and numeric characters. The active input area may also be used to receive commands from the user for performing functions such as launching applications. In this way, an active input area may differ from other user-interfaces of a computer (such as mechanical features like keyboard and buttons) in that it is provided on a contact-sensitive display, and it can be used to receive a large number of unique user-inputs that can subsequently be interpreted.
- FIG. 1 illustrates a
handheld computer 100 with a configurable active input area 110, under an embodiment of the invention. In FIG. 1, handheld computer 100 includes a housing 120 having a front panel 122. A display surface 124 is exposed on the front panel 122. The display surface 124 may be part of a display assembly having a digitizer or other construction in which contact between an object and the display surface is detected and recorded. The housing 120 may also provide a plurality of buttons 130, or other actuatable mechanisms. The buttons 130 can be individually actuated to cause handheld computer 100 to perform some function such as launch a program. - An
active input area 110 is provided on display surface 124. In an embodiment, active input area 110 is purely digital, and can be selected to appear on display surface 124, rather than be a permanent aspect of the display surface 124. The active input area 110 includes a handwriting recognition area 112. A user may initiate contact with an object in the form of a gesture or stroke on handwriting recognition area 112, and the processing resources of handheld computer 100 interpret that stroke as a character or function. The handwriting recognition area 112 may be immediate in that a single stroke may be recognized as a character after that stroke is completed. A recognized character of an immediately recognized stroke may be outputted on display surface 124 prior to another stroke being entered. - The
handwriting recognition area 112 itself may be separated into two or more cells. In one embodiment, a first cell 112A recognizes strokes as alphabetical characters, and a second cell 112B recognizes strokes as numbers. Additional cells may be provided as needed. For example, embodiments described below provide for a “triple-cell” configuration, where one cell of handwriting recognition area 112 is for recognizing strokes as capital letters. Alternatively, a third or additional cell may be for recognizing strokes as functions. - The
active input area 110 also includes a plurality of active icons 115, which are placed adjacent to the handwriting recognition area 112. As used herein, the term “active icon” means an icon that has some functionality associated with it. An active icon can be selected to perform its associated function. Accordingly, active icons 115 are each individually selectable to cause the handheld computer 100 to perform a function that corresponds to that icon. Unless stated otherwise, reference to icons in this application is intended to mean “active icons”. In one embodiment, a set of four icons 115 is provided around handwriting recognition area 112, although more or fewer icons may be provided as part of active input area 110 as needed or desired. - In one embodiment, one characteristic of the active input area is that it contains multiple user-interface features of different types. Another characteristic of an active input area is that even though it is formed from multiple elements with different functionality, the active input area appears as a unit. Thus, when
active input area 110 is selected to appear, all of the elements designated to be part of the active input area at that particular moment appear with it. With respect to FIG. 1, this would mean that all of the active icons 115 and the handwriting recognition area 112 appear as the components of the active input area 110. Furthermore, these elements appear in the same configuration each time the active input area 110 is displayed. For example, each active icon 115 may occupy the same position relative to handwriting recognition area 112 each time active input area 110 is called on the display surface 124. - When not in use, an embodiment provides that
active input area 110 may be minimized into a task bar or other graphic feature that appears on the display. One embodiment provides that the active input area 110 may be made to appear on display surface 124 at any time through one or more taps on the display surface 124. Thus, an area of display surface 124 can be maximized for providing content by minimizing active input area 110, thus facilitating use of handheld computer 100 as, for example, an electronic book reader. - FIGS. 2A-2D provide screen shots of
display surface 124 to illustrate where the appearance of active input area 110 may be altered or otherwise changed as needed or selected by a user of handheld computer 100. FIG. 2A illustrates an embodiment where a user selects to provide active input area 110 with a triple-cell configuration. The active input area 110 may also include active icons 115. In the triple-cell configuration, a first cell 112A (usually on the far left) interprets gestures made on that part of the display surface 124 as small cap characters. A second cell 112B (usually on the far right) interprets gestures made on that part of the display surface 124 as numbers. A third cell 112C, which may appear in the middle, interprets gestures made on that part of the display surface 124 as capitalized letters. Such a configuration may be set as a preference of the user. - FIG. 2B illustrates another screen shot of how
active input area 110 can be made to appear on the display surface 124. In an embodiment such as shown, active icons 115 are removed from active input area 110. Rather, all of active input area 110 is made into handwriting recognition area 112. Furthermore, cells 112A (for interpreting strokes as characters) and 112B (for interpreting strokes as numbers) are re-sized to be larger widthwise (along the axis X) than the configuration illustrated in FIG. 2A. Furthermore, the dimensions of the two cells may differ from one another, in that cell 112A for characters is larger than cell 112B for numbers. As an example, a configuration such as shown in FIG. 2B may be designated as a user-preference because the user is more likely to use the character entry cell than the numeric entry cell. - FIG. 2C illustrates a configuration where
active input area 110 is formed entirely of handwriting recognition area 112, and further that handwriting recognition area 112 has an enlarged height (along the axis Y). For purpose of illustrating variation, a triple cell configuration is also shown, in that a third cell 112C is also provided for recognizing capital letters. - FIG. 2D illustrates a reverse configuration for
active input area 110, where handwriting recognition area 112 is made smaller in height (along axis Y), but not minimized. Such an embodiment provides more room on display surface 124 for providing content, while providing some space for a user to enter strokes onto handwriting recognition area 112. - In an embodiment,
active input area 110 is adjustable between various configurations, including configurations shown by FIGS. 2A-2D, through user-input with the display surface 124. In one embodiment, boundary lines 212 and 214 may be provided to delineate the active input area 110 from the remaining portion of the display surface 124. The boundary line 212 may correspond to a height of the active input area 110 from an edge 222 of the display surface. The boundary line 214 may correspond to a marker delineating the cells of handwriting recognition area 112. In order to adjust the height of the active input area 110, one embodiment enables the user to select boundary line 212 and move it either upward or downward relative to bottom edge 222, to yield configurations shown by FIGS. 2A and 2D respectively. In order to adjust the dimensions of the cells, the user may move boundary line 214. The movement of boundary lines 212, 214 may be done through contact with the display surface 124, or through some other means such as menu selection. - According to one embodiment, specific screen shots shown in FIGS. 2A-2D illustrate preferences that may be selected by the user. The user's selection may be based on factors such as whether
display surface 124 is to be used primarily for displaying content, or whether character recognition is to be enhanced. - Embodiments of the invention provide for elements of
active input area 110 to be selected and replaced by other elements as the need arises. As described by FIGS. 3 and 4A-4C, the selection and replacement of elements of active input area 110 may be done at the user level. - Alternatively, a manufacturer may provide the
handheld computer 100 with a particular (or default) configuration for active input area 110. Subsequently, vendors or original equipment manufacturers may alter the configuration of the handheld computer 100 from its original manufacturing in order to suit a particular need. For example, active input area 110 may be configured to include elements (such as icons) for a particular internal business application of a company. In one use, an entity such as the company may alter the configurations of the active input area 110 one time, and disable the ability of the end user to subsequently reconfigure the active input area. - A more general application for an embodiment of the invention is to enable the end user to configure and reconfigure
active input area 110 as the user desires. According to one embodiment, the active icons 115 that form part of active input area 110 can be selected and configured by a user of handheld computer 100. The user may, for example, switch the icons that appear in the active input area 110, alter the relative positions of such icons, and/or reduce, eliminate or increase the number of icons that appear as part of active input area 110. Once the selection of icons for the active input area 110 is designated by user-input or other means, an embodiment provides that the active input area 110 appears only with the designated selection of icons, at least until that selection is altered or replaced once again. - FIG. 3 illustrates a method for substituting out one of the
active icons 115 that appear in active input area 110 for another icon that is selected by the user. Step 310 provides that the active input area is displayed with a designated set of active icons 115. Thus, the active icons 115 of active input area 110 may be displayed with a specific orientation, position, appearance and functionality. - In
step 320, a tap event is detected that is associated with one of the icons that appears in the active input area 110. In one embodiment, the location of where the tap event occurs is what associates the tap event with a particular icon of active input area 110. - In
step 325, a determination is made as to whether the detected tap event qualifies as a tap event for substituting out one of the active icons 115 (or some other feature of active input area 110) for an alternative icon. The determination may be based on whether the tap event satisfies some pre-determined criteria. This determination may distinguish such a tap event from other taps and tap events which are not for substituting out icons from active input area 110. - In one embodiment, the tap event is a “tap and hold” where an object such as a stylus is tapped to the
display surface 124 and held in position for a designated duration. In such an embodiment, the duration for which the object making contact with the display is continuously held in contact with the display may form the criteria as to whether the tap event qualifies. The position where the tap and hold occurs may also be part of the criteria for qualifying the tap event. For example, in order to select a particular icon for replacement, the tap event may be required to occur over a particular active icon 115, and last a designated duration so that it is identified as a tap event to substitute out the particular icon. Should the tap occur elsewhere, or not for the designated duration, then the tap event would not be recognized as a tap event to substitute out that particular icon. - Rather than a tap and hold event, other embodiments may provide for other types of tap events. Examples of other such tap events include a “tap and drag” event, where the object is tapped to one place on
display surface 124, then dragged continuously to another place on the display surface. For an embodiment where the tap event is a tap and drag, the criteria for qualifying the tap event may be that the first icon is tapped, then the object is continuously dragged across the display to another designated location. - Still further, another alternative form for a tap event is a double-tap or even a triple-tap. For example, a series of three taps within a relatively small duration of time that occurs over one of the
icons 115 may be designated to qualify as a request to substitute out the selected icon. Other examples and scenarios are possible. - If the determination in
step 325 is that the tap event was not a request to reconfigure the selection of any of the icons 115 in the active input area 110, then step 330 provides that the tap event is ignored. -
Step 340 provides that a list of the alternative icons is displayed in response to a determination that the tap event was to substitute out one of the active icons. The alternative icons may correspond to icons that are not presented in the active input area 110, but that are available in that they are each associated with a distinct functionality by the handheld computer 100. Thus, the selection of any icon provided in the displayed list would cause handheld computer 100 to perform some function associated with that icon. The list may display representations of the available alternative active icons. These representations may correspond to iconic expressions, such as insignias, trademarks, and other graphic associations to the underlying application or functionality. - Once the list is displayed, the user is given an opportunity to select a new icon to replace the icon that has been selected for substitution. In
step 345, a determination is made as to whether the user made another selection for another icon to replace the first icon. In one embodiment, this selection may be made by the user tapping a representation of the second icon from the list provided in step 340. If the determination is that no selection was made from the list, then step 350 provides that the list is displayed until the user taps somewhere else on the display surface 124, or somehow initiates or causes some action to indicate that the list should be closed. For example, the user may launch another application with one of the buttons 130, or shut handheld computer 100 off. - If the determination is that a selection of the second icon is made from the list, then step 360 provides that the icon selected for substitution is replaced with the icon selected from the list. Until further alterations, this new icon will appear as part of the
active input area 110 each time the active input area is selected to appear. In addition, the next time the list is displayed, a representation of the icon that was substituted out may be provided in the list, so that this icon may be re-selected at a later time as one of the elements of the active input area 110. - FIGS. 4A-4C provide screen shots to illustrate a method such as described in FIG. 3. FIG. 4A shows
active input area 110 provided over a task bar 426. In one embodiment, the active input area 110 can be minimized or substituted out of the display. An icon or other item representing the active input area 110 may be provided on the task bar 426. This icon can be selected by a user through contact with the display, or other means, to cause the active input area to re-appear on the display surface 124. The task bar 426 may be persistent, in that it is either always present, or present automatically depending on certain applications or functions performed by the handheld computer 100. - FIG. 4A shows
active input area 110 with four active icons 115 when in a displayed state. Each of the active icons 115 is assigned a particular function. When the user taps one of the active icons 115, the function associated with that icon is performed. Examples of functions that can be assigned to active icons 115 include launching a particular application, performing a utility function (such as displaying a search tool or adjusting the contrast of the computer), or opening a particular record. Rather than change the function associated with a particular icon, embodiments of the invention permit the particular icon displayed in the active input area 110 to be replaced by a new icon. With the changing of a particular icon, the functionality offered by the replaced icon is exchanged for the functionality provided by the new replacement icon. Thus, the association between an icon in the active input area 110 and a function or application may be static. This allows the user to have the same visual association between a particular icon and the function associated with that icon. - FIGS. 4A-4C illustrate how a first
active icon 115A associated with a "display menu" function can be replaced by a second active icon 115B associated with a communication port application (referred to as "dialer"). The first active icon 115A is assumed to be selected for exchange with another icon by a tap event. The tap event that selects the first active icon 115A for exchange is different than a tap (or other tap event) that would select that icon and cause the handheld computer 100 to perform the function of the display menu icon. The act of selecting the first active icon 115A in order to cause the handheld computer 100 to perform the function associated with that icon may be performed simply by tapping the icon one time. In contrast, the tap event that selects the first active icon 115A for exchange with another icon may correspond to a stylus tapping on display surface 124 where first active icon 115A is provided, and holding the tap for a designated duration. Alternatively, the tap event for exchanging the first active icon 115A may correspond to the stylus dragging in contact with the display from a location where the first icon 115A is provided to some other location. Still further, the tap event for selecting the first active icon 115A for exchange may correspond to a double-tap or triple-tap on the location of the display surface where the first active icon 115A is provided. In each case, the tap event for selecting the icon for exchange with another icon is differentiable from the tap or tap event for performing the function of that icon, but the particular act required for the tap event may be a matter of design choice. - FIG. 4B illustrates a
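The distinction drawn above, between a plain activating tap and the tap events that select an icon for exchange, can be sketched as a small classifier. This is a minimal illustration in Python; the threshold names and values are assumptions, since the specification leaves the exact gesture a matter of design choice.

```python
# Hypothetical classifier distinguishing a plain activating tap from the
# tap events described for selecting an icon for exchange: tap-and-hold,
# a drag away from the icon, or a double/triple tap.

HOLD_THRESHOLD_MS = 500   # assumed duration that counts as a "hold"
DRAG_THRESHOLD_PX = 10    # assumed distance that counts as a "drag"

def classify_tap_event(duration_ms, distance_px, tap_count):
    """Return 'activate' for a plain tap, 'exchange' for a selecting event."""
    if tap_count >= 2:                    # double- or triple-tap
        return "exchange"
    if distance_px > DRAG_THRESHOLD_PX:   # stylus dragged off the icon
        return "exchange"
    if duration_ms >= HOLD_THRESHOLD_MS:  # tap held for the designated duration
        return "exchange"
    return "activate"                     # single short tap performs the function
```

For example, a 120 ms tap that barely moves classifies as "activate", while a 600 ms hold classifies as "exchange".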
list 410 that is opened in response to first active icon 115A being selected for exchange with another icon. The list 410 includes a plurality of representations 412. Each representation 412 corresponds to an alternative active icon that is available to be displayed as part of active input area 110. Once the list 410 is opened, if one of the representations 412 is selected, an icon of that representation would be generated to replace the first active icon 115A. In one embodiment, this would mean that the replacement icon would appear instead of the first active icon 115A, in the first active icon's position within the active input area 110. The selection of one of the representations 412 in list 410 may be accomplished by a stylus making contact with a point on display surface 124 where that representation is displayed. - Since the
representations 412 are fairly small, there is the possibility that what the user wishes to select and what the user actually selects are not the same thing. For example, the user may miss the desired representation when tapping the display surface 124. Embodiments of the invention provide a feedback function where the selected representation 412 is indicated to the user, to afford the user an opportunity to change the selection before the selection is made final. In FIG. 4B, the selection of one of the representations (the one corresponding to "dialer") is also visually indicated with some feedback. The feedback may correspond to highlighting the selected representation when it is selected from the list. Alternatively, the feedback may correspond to changing the appearance of the selected representation, such as changing its color, size, or shading. As another example, a distinctive audible signal may be provided to indicate which representation 412 from the list 410 was selected by the user. - In addition to providing feedback, the
list 410 may visually indicate information about the alternative icons, or about the functionality associated with those alternative icons. For example, the list 410 may indicate if certain applications are not available by graying out representations 412 that correspond to those applications. - For purpose of explanation, the particular representation selected in FIG. 4B is assumed to correspond to a second
active icon 115B. FIG. 4C illustrates when second active icon 115B is displayed in active input area 110 in place of first active icon 115A. The second active icon 115B takes the place of first active icon 115A in active input area 110. Thus, second active icon 115B occupies the relative position previously occupied by the first active icon 115A in active input area 110. The first active icon 115A is no longer present in active input area 110, but it is available for reselection and exchange with any other icon that is part of the active input area 110. When active input area 110 is subsequently called or used, the active input area appears with second icon 115B, at least until the active input area is re-configured. - In the past, when the user of the
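The in-place exchange described above can be sketched as follows. This is an illustrative Python model, not the patented implementation; the icon names and list structures are hypothetical. The key behaviors are that the replacement icon assumes the exact slot of the icon it replaces, and the replaced icon returns to the list of available representations for later reselection.

```python
# Sketch of exchanging one active icon for another while preserving its
# relative position in the active input area. Icon names are hypothetical.

def exchange_icon(active_icons, available, old_icon, new_icon):
    """Replace old_icon with new_icon in the active input area, keeping
    its slot; old_icon becomes available for reselection later."""
    slot = active_icons.index(old_icon)   # relative position within the area
    active_icons[slot] = new_icon         # static icon/function pairing moves in
    available.remove(new_icon)            # no longer offered in the list
    available.append(old_icon)            # re-selectable at a later time
    return active_icons

# Hypothetical configuration mirroring FIGS. 4A-4C: "display menu" is
# exchanged for "dialer", which takes over the first slot.
area = ["display menu", "find", "calc", "contrast"]
choices = ["dialer", "memo"]
exchange_icon(area, choices, "display menu", "dialer")
```

Because each icon keeps its own fixed function, the exchange never re-binds a function to an existing icon; it only swaps which statically bound icons are visible.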
handheld computer 100 wished to associate new iconic functionality within active input area 110, the user had to associate that new functionality with an icon that always appeared within the active input area. This required the user to learn a new visual association between that icon of the active input area 110 and the newly selected functionality that was to be provided with the active input area. In contrast, embodiments such as described with FIGS. 4A-4C enable the user to create static associations between icons that can appear in the active input area 110 and their respective functionalities. If the user wants a new functionality to be provided by an icon in the active input area 110, the user selects for the active input area a new icon which already has that functionality assigned to it. The user does not need to select a new function for an icon that cannot be substituted out of the active input area 110. - Furthermore, embodiments such as described in FIGS. 4A-4C enable
active input area 110 to carry icons created by third-party developers for particular applications. Application developers often create the icons that are associated with their programs. The icons are provided in order to let the user launch an application by selecting the icon associated with that application. Typically, the icons designed by the developers include graphics such as insignias and trademarks, which uniquely identify their application to the user. These icons are often listed in the menu of the handheld computer 100. With conventional handheld computers, the icon corresponding to the menu function is usually presented in the active input area 110, but the various icons that represent different applications, including third-party developer applications, are not part of the active input area. Alternatively, some conventional computers require the user to select a new function for a wildcard icon that always appears on the display, or switch the functionality of one icon (such as the menu icon) in order to assign that icon a new functionality. With embodiments such as described, however, the icons designed and provided by the developers can be imported by the user (or a vendor) into the active input area 110. - In an embodiment, the
handheld computer 100 is configured to display the icons that form the active input area 110 using monochromatic display resources. All of the active input area 110, including handwriting recognition area 112, may be provided using monochromatic resources, even if handheld computer 100 has color display resources. Monochromatic resources offer the advantage of being able to display content designed for both color and monochrome. There are many applications which are designed for monochrome environments. By providing for the handheld computer 100 to display the icons of active input area 110 in monochrome, no special consideration needs to be made to distinguish icons made for color from icons made for monochrome, as both types of icons would be displayed in the active input area 110 in monochrome. - While embodiments described with FIGS. 4A-4C contemplate the use of icons as a type of feature that can be switched from and into the
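One way to render both color-designed and monochrome-designed icons through a single monochromatic path is to reduce each pixel to on/off by luminance. The sketch below is an assumption about how such a reduction could work, not the patent's method; the threshold value is arbitrary, and the luminance weights are the standard BT.601 coefficients.

```python
# Sketch of a single monochromatic display path: every icon pixel,
# whether from a color or a monochrome icon, is reduced to 0/1 by
# luminance, so no per-icon distinction is required. Threshold assumed.

def to_monochrome(pixels, threshold=128):
    """pixels: rows of (r, g, b) tuples; returns rows of 0/1 values."""
    def luma(r, g, b):
        # Standard BT.601 luminance weighting of the color channels.
        return int(0.299 * r + 0.587 * g + 0.114 * b)
    return [[1 if luma(*px) >= threshold else 0 for px in row] for px_row in [] or pixels for row in [px_row]]
```

With this reduction, a white pixel maps to 1 and a black pixel to 0, and a saturated color icon degrades gracefully to the same representation a monochrome icon would use.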
active input area 110, embodiments of the invention may apply to other types of features. For example, handwriting recognition area 112 may be switched out of the active input area 110 in the same manner as the active icons. The handwriting recognition area 112 may be switched out and replaced by a digital keyboard, or a set of icons. Alternatively, the specific type of handwriting recognition area 112 that forms part of the active input area 110 may be selected in a manner such as described with FIGS. 4A-4C. For example, a two-cell version of handwriting recognition area 112 (see FIG. 2B) may be substituted for a triple-cell version (see FIG. 2B) in a manner described above. - It is possible for
handheld computer 100, or other computer with a contact-sensitive display, to accept character entry on any location of display surface 124. The acceptance of the character entry may be through display contact mechanisms, such as electronic keyboards and handwriting recognition areas. In the case where handwriting recognition is employed, the handheld computer 100 is configured to recognize strokes entered anywhere on display surface 124, where each stroke is immediately recognized as a corresponding character. For example, handheld computer 100 may be configured to recognize certain strokes, such as provided in GRAFFITI and JOT, as characters or commands when those strokes are entered on locations of display surface 124 other than active input area 110. In the case where an electronic keyboard is provided, the electronic keyboard itself may be provided anywhere on the display surface 124. Any taps entered on regions corresponding to keys of the electronic keyboard are recognized as corresponding characters. - With either stroke recognition or electronic keyboard entry, some degree of error exists between what is entered by the
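The electronic-keyboard case amounts to hit-testing each tap against key regions that may be positioned anywhere on the display surface. The following Python sketch illustrates that idea under assumed names and a hypothetical key layout; it is not taken from the specification.

```python
# Sketch of character entry via an electronic keyboard placed at an
# arbitrary location on the display surface: taps are hit-tested
# against key regions. Layout, sizes, and key names are assumptions.

def build_keyboard(origin_x, origin_y, key_w, key_h, rows):
    """Return (x, y, w, h, char) key regions for rows of characters."""
    keys = []
    for r, row in enumerate(rows):
        for c, ch in enumerate(row):
            keys.append((origin_x + c * key_w, origin_y + r * key_h,
                         key_w, key_h, ch))
    return keys

def key_at(keys, x, y):
    """Return the character whose key region contains the tap, or None."""
    for kx, ky, kw, kh, ch in keys:
        if kx <= x < kx + kw and ky <= y < ky + kh:
            return ch
    return None

# A hypothetical 2x3 keyboard placed partway down the display.
kb = build_keyboard(10, 200, 20, 20, ["abc", "def"])
```

Because the keyboard's origin is a parameter, the same hit-testing works wherever on the display surface the keyboard is generated.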
handheld computer 100. The display surfaces 124 are often small, causing the user to miss a key, or not enter a stroke correctly. In the case of handwriting recognition, the user is required to draw a stroke that matches one of a set of known strokes. If the user's stroke is off, the handheld computer 100 may recognize the wrong character or command. - In an embodiment,
active input area 110 has functionality other than that of receiving input. One embodiment provides that active input area 110 can be used as a visual guide for assisting the user to enter correctly shaped strokes on a remaining portion of display surface 124. For purpose of explanation, the following terminology is used in this application: a glyph is a recognized form of a stroke; and a stroke is what is traced by a user employing an object to make continuous contact (e.g. between a pen-down and a pen-up) with the display surface 124. In one embodiment, immediate handwriting recognition can be performed by matching a stroke to a glyph, and then displaying a character associated with the glyph. U.S. Pat. No. 6,493,464 (hereby incorporated for all purposes in its entirety by this application) describes an immediate handwriting recognition technique using strokes and glyphs. - With reference to FIG. 5A,
active input area 110 displays a set of glyphs 552. The region 526 of display surface 124, which excludes active input area 110, is shown as displaying a stroke 554 recently formed by the user. The stroke 554 may have been formed by, for example, the user tracing a shape on the region 526. Since the stroke 554 needs to match a shape of a desired glyph in the set of glyphs 552 in order to be properly recognized, displaying the set of glyphs in the active input area 110 provides a useful visual cue for the user. Such an embodiment may be particularly useful in the case where the user is unfamiliar with the particular stroke recognition technique used by the handheld computer 100 (such as GRAFFITI or JOT). Thus, active input area 110 may also serve as a feedback mechanism for providing visual feedback of a user's input operations. - According to another embodiment,
active input area 110 provides visual feedback as to the character that was identified from the stroke 554 that the user entered on the region 526. For example, for stroke 554, active input area 110 may simultaneously display or otherwise indicate which character was recognized from that stroke. In FIG. 5B, an indication is shown as to which glyph in the set of glyphs 552 corresponded to the stroke that the user entered. The indication may be in the form of highlighting or shading the one glyph that the handheld computer 100 determines to have matched the stroke 554 entered by the user onto the region 526. - The manner in which
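The stroke-to-glyph matching that drives this feedback can be illustrated with a toy matcher. The sketch below models strokes as short, equal-length point sequences and picks the nearest glyph so it can be highlighted; the real matcher in the incorporated U.S. Pat. No. 6,493,464 may work quite differently, and the glyph names and shapes here are invented for illustration.

```python
# Toy matcher: find the glyph in the displayed glyph set that is
# closest to the user's stroke, so that glyph can be highlighted as
# feedback. Stroke representation and glyph shapes are assumptions.

import math

def stroke_distance(a, b):
    """Mean point-to-point distance between two equal-length strokes."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def match_glyph(stroke, glyphs):
    """Return (name, character) of the glyph closest to the stroke."""
    return min(glyphs, key=lambda g: stroke_distance(stroke, g[2]))[:2]

# A hypothetical two-glyph set such as might be displayed in the
# active input area.
glyph_set = [
    ("down-stroke", "i", [(0, 0), (0, 1), (0, 2)]),
    ("slash", "/", [(2, 0), (1, 1), (0, 2)]),
]
```

A slightly wobbly vertical trace still matches the "down-stroke" glyph, which is the glyph the UI would then highlight or shade.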
active input area 110 and other user-interface features are provided on handheld computer 100 may accommodate landscape modes, with particular handedness configurations. Specifically, the active input area 110 and other input features can be provided on display surface 124 in a landscape mode, with a particular left-handed or right-handed orientation. - Different handedness configurations can be provided because the construction of
active input area 110 enables flexibility as to how it can be shaped and positioned. Specifically, when active input area 110 is electronically generated, the particular portion of display surface 124 upon which the active input area is displayed can be selected. Simultaneously, resources for detecting contact to display surface 124 may be oriented to recognize the particular forms of contact that correspond to the numerous entries that can be made through the active input area 110. Thus, active input area 110 can be created and recreated with physical characteristics that suit a particular configuration, such as a handedness orientation. In particular, the position, dimension, shape, orientation and even components of active input area 110 are selectable based on orienting all of the features according to a particular handedness. - FIGS. 6A-6C show how the flexibility in the manner
active input area 110 is provided can be used to accommodate various preferences of the user, including left or right handedness of the user in the landscape mode. In FIG. 6A, the handheld computer 100 is shown in a portrait mode, which may be the default configuration of the handheld computer. The display surface 124 is assumed to be rectangular in shape, and the portrait mode corresponds to when the length of the display surface extends in an up-down direction from the perspective of the user. The perspective of the user is shown by the axes X and Y, with the X axis corresponding to what the user views as being the up and down direction. The perspective offered with the axes X and Y is that of the user staring into the paper. - With reference to FIG. 6A,
active input area 110 extends a height from a bottom surface 612 of display surface 124. The buttons 130 are provided between the bottom surface 612 of display surface 124 and a bottom edge 616 of the housing 120. Based on convention, active input area 110 may be provided at the bottom portion of display surface 124. The active input area 110 may include active icons 115. - FIG. 6B illustrates
handheld computer 100 positioned in a landscape mode, with a left-handed orientation. The left-handed orientation means that most, if not all, of the user-interface features that require the user to make manual contact with handheld computer 100 are provided on the left-hand side of the handheld computer. The active input area 110 is positioned so that when used by a left-handed person, the person's hand will not block the user's view of the display surface 124. The left-hand orientation may be created by rotating display surface 124 clockwise 90 degrees in the direction of A. When rotated, housing 120 provides the buttons in the top-down configuration, to the left of display surface 124. The active input area 110 may be re-generated to extend in the same manner as in the portrait mode. Thus, active input area 110 extends in a top-bottom direction, as defined by axis X, but adjacent to a left boundary 621 (when viewed in the configuration of FIG. 6B) of the display surface 124. - FIG. 6C illustrates
handheld computer 100 positioned in a landscape mode, with a right-handed orientation. As with the left-handed orientation, most or all of the user-interface features that require the user to make manual contact with handheld computer 100 are provided on the right-hand side of the handheld computer. The active input area 110 is positioned so that when used by a right-handed person, the person's hand will not block the user's view of the display surface 124. The right-hand orientation may be created by rotating display surface 124 counter-clockwise 90 degrees in the direction of B. When rotated, housing 120 provides the buttons in the top-down configuration, to the right of display surface 124. The active input area 110 may be re-generated to extend in the same manner as in the portrait mode. Thus, active input area 110 extends in a top-bottom direction, as defined by axis X, but adjacent to a right boundary 623 (when viewed in the configuration of FIG. 6C) of the display surface 124. - Among other advantages,
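The three placements just described, bottom edge in portrait, left edge in left-handed landscape, right edge in right-handed landscape, reduce to a small placement function. The sketch below is illustrative only; the parameter names and the rectangle convention (origin at top-left, in pixels) are assumptions.

```python
# Sketch of regenerating the active input area for the configurations
# of FIGS. 6A-6C: along the bottom edge in portrait, along the left
# boundary for left-handed landscape, and along the right boundary for
# right-handed landscape. Coordinate convention: (x, y, w, h),
# origin at the top-left of the display surface.

def place_active_area(mode, handedness, disp_w, disp_h, thickness):
    """Return the (x, y, w, h) rectangle of the active input area."""
    if mode == "portrait":
        return (0, disp_h - thickness, disp_w, thickness)  # bottom edge
    if handedness == "left":
        return (0, 0, thickness, disp_h)                   # left boundary
    return (disp_w - thickness, 0, thickness, disp_h)      # right boundary
```

Because the area is electronically generated, re-running this placement (and re-orienting the contact-detection resources accordingly) is all that a mode or handedness change requires.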
handheld computer 100 can be configured to enable its contact-sensitive display to be viewed and used in a landscape mode with particular attention to the handedness of the user. - FIGS. 7A-7D show some specific examples of
display surface 124 accommodating different modes and handedness. FIG. 7A illustrates the portrait mode for display surface 124, with the length of the display surface 124 extending in the top-bottom direction, along the axis Y. In the example provided, active input area 110 is displaying a set of keys corresponding to special character and number keys. In FIG. 7B, the active input area 110 is rotated into the right-handed landscape orientation. The same set of keys provided in the active input area 110 with FIG. 7A are now stacked vertically, so that the length of the active input area 110 extends in the direction of the axis Y. - FIGS. 7C and 7D illustrate the
active input area 110 with cells that comprise the handwriting recognition area 112. When in the portrait mode, an embodiment provides that the left cell 112A, the right cell 112B and the center cell 112C of the handwriting recognition area 112 are provided to receive strokes as input. In FIG. 7C, the left-handed landscape orientation is shown, with the cell 112A being in the top position within active input area 110, and the cell 112C being in the bottom-most position. In the left-handed orientation, the active input area 110 appears to the left of the display surface 124. In FIG. 7D, the right-handed landscape orientation is shown. The right-handed orientation of FIG. 7D mirrors the orientation of active input area 110 in FIG. 7C, except that the active input area appears to the right of the display surface 124. - FIG. 8 illustrates the components of a
portable computer 800, under an embodiment of the invention. The portable computer 800 may, for example, correspond to handheld computer 100. In an embodiment, portable computer 800 includes a processor 810, an analog-digital (A/D) converter 820, a set of mechanical buttons 830, a volatile memory 840, a non-volatile memory 845 and a contact-sensitive display assembly 850. A power source 825 may be used to power the various components of the portable computer 800. One typical component of the portable computer 800 is an expansion port 842. Typically, multiple such expansion ports are provided on such portable computers. - The contact-sensitive display assembly 850 may include a
display 852 and a digitizer 854. A display driver 856 may also form part of the display assembly 850. The digitizer 854 may be connected to the A/D converter 820. The digitizer 854 uses analog signals to detect contact with the display 852, and to track the object making the contact as it moves over the display. The A/D converter converts the signals into a digital form for processor 810, which interprets what input is entered by the contact with the display 852. The driver 856 may be coupled to the processor 810 in order to receive signals that are translated into output on the display 852. The output may correspond to content that appears on the display surface 124 in previous embodiments, as well as to the digitally-created active input area 110. - The
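The signal path just described, analog contact signals from the digitizer, quantized by the A/D converter, then mapped by the processor to display coordinates, can be sketched numerically. The resolution values below are assumptions chosen only to make the arithmetic concrete; the specification does not state converter width or display resolution.

```python
# Sketch of the digitizer -> A/D converter -> processor path: analog
# contact signals are quantized to ADC counts, and the processor maps
# those counts to pixel coordinates. All scale values are assumptions.

ADC_MAX = 1023                    # assumed 10-bit A/D converter
DISPLAY_W, DISPLAY_H = 160, 240   # assumed display resolution in pixels

def adc_sample(analog_x, analog_y):
    """Quantize analog contact signals in [0.0, 1.0] to ADC counts."""
    return (round(analog_x * ADC_MAX), round(analog_y * ADC_MAX))

def to_display_coords(digital_x, digital_y):
    """Processor-side mapping of ADC counts to pixel coordinates."""
    return (digital_x * DISPLAY_W // (ADC_MAX + 1),
            digital_y * DISPLAY_H // (ADC_MAX + 1))
```

A contact at the analog midpoint, for instance, lands at the center pixel of the assumed display, which is how the processor decides whether the contact fell inside the active input area or elsewhere on the display surface.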
display driver 856 may provide some or all of the monochromatic resources that are used to display icons, representations of the icons, and/or the active input area 110. As mentioned, the monochromatic resources enable the developer to make just one set of icons that works for all applications and all devices, since all such applications and devices can use monochrome, but not all such devices use color. - While an embodiment such as described with FIG. 8 provides for a display assembly that is integrated and formed as part of the housing of the
portable computer 800, other embodiments may provide for a portable computer where the contact-sensitive display is remote to the housing of the portable computer, or at least to the housing where the processor 810 is provided. Such an embodiment may provide, for example, a projector that displays the content being provided by the processor 810 onto a surface such as a table. The portable computer 800 may sense the user's interaction with the surface where the projection is provided. Thus, the display surface may be external to the portable computer or its primary housing. - In the foregoing specification, the invention has been described with reference to specific embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
Claims (25)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/452,233 US20040036680A1 (en) | 2002-08-26 | 2003-05-30 | User-interface features for computers with contact-sensitive displays |
PCT/US2003/026869 WO2004019200A2 (en) | 2002-08-26 | 2003-08-26 | User-interface features for computers with contact sensitive displays |
AU2003262921A AU2003262921A1 (en) | 2002-08-26 | 2003-08-26 | User-interface features for computers with contact sensitive displays |
CA002496774A CA2496774A1 (en) | 2002-08-26 | 2003-08-26 | User-interface features for computers with contact sensitive displays |
EP03793432A EP1558985A2 (en) | 2002-08-26 | 2003-08-26 | User-interface features for computers with contact sensitive displays |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US40626402P | 2002-08-26 | 2002-08-26 | |
US10/452,233 US20040036680A1 (en) | 2002-08-26 | 2003-05-30 | User-interface features for computers with contact-sensitive displays |
Publications (1)
Publication Number | Publication Date |
---|---|
US20040036680A1 true US20040036680A1 (en) | 2004-02-26 |
Family
ID=31997669
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/452,232 Active 2025-04-07 US7406666B2 (en) | 2002-08-26 | 2003-05-30 | User-interface features for computers with contact-sensitive displays |
US10/452,233 Abandoned US20040036680A1 (en) | 2002-08-26 | 2003-05-30 | User-interface features for computers with contact-sensitive displays |
US12/144,545 Expired - Lifetime US7831934B2 (en) | 2002-08-26 | 2008-06-23 | User-interface features for computers with contact-sensitive displays |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/452,232 Active 2025-04-07 US7406666B2 (en) | 2002-08-26 | 2003-05-30 | User-interface features for computers with contact-sensitive displays |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/144,545 Expired - Lifetime US7831934B2 (en) | 2002-08-26 | 2008-06-23 | User-interface features for computers with contact-sensitive displays |
Country Status (5)
Country | Link |
---|---|
US (3) | US7406666B2 (en) |
EP (1) | EP1558985A2 (en) |
AU (1) | AU2003262921A1 (en) |
CA (1) | CA2496774A1 (en) |
WO (1) | WO2004019200A2 (en) |
Cited By (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040046791A1 (en) * | 2002-08-26 | 2004-03-11 | Mark Davis | User-interface features for computers with contact-sensitive displays |
US20050219226A1 (en) * | 2004-04-02 | 2005-10-06 | Ying Liu | Apparatus and method for handwriting recognition |
US20070094417A1 (en) * | 2005-05-16 | 2007-04-26 | Hur Yong S | Mobile terminal having scrolling device and method implementing functions using the same |
US20070216643A1 (en) * | 2004-06-16 | 2007-09-20 | Morris Robert P | Multipurpose Navigation Keys For An Electronic Device |
WO2007118019A2 (en) * | 2006-04-06 | 2007-10-18 | Motorola, Inc. | Method and apparatus for user interface adaptation |
US20080001932A1 (en) * | 2006-06-30 | 2008-01-03 | Inventec Corporation | Mobile communication device |
EP1923778A2 (en) * | 2006-11-16 | 2008-05-21 | LG Electronics, Inc. | Mobile terminal and screen display method thereof |
US20080166049A1 (en) * | 2004-04-02 | 2008-07-10 | Nokia Corporation | Apparatus and Method for Handwriting Recognition |
US20080266244A1 (en) * | 2007-04-30 | 2008-10-30 | Xiaoping Bai | Dual Sided Electrophoretic Display |
US20080305837A1 (en) * | 2007-06-08 | 2008-12-11 | Inventec Corporation | Mobile communication apparatus |
US20080316397A1 (en) * | 2007-06-22 | 2008-12-25 | Polak Robert D | Colored Morphing Apparatus for an Electronic Device |
US20090015597A1 (en) * | 2000-05-18 | 2009-01-15 | Palm, Inc. | Reorienting display on portable computing device |
US20090046072A1 (en) * | 2007-08-13 | 2009-02-19 | Emig David M | Electrically Non-interfering Printing for Electronic Devices Having Capacitive Touch Sensors |
US20090161059A1 (en) * | 2007-12-19 | 2009-06-25 | Emig David M | Field Effect Mode Electro-Optical Device Having a Quasi-Random Photospacer Arrangement |
US20090198132A1 (en) * | 2007-08-10 | 2009-08-06 | Laurent Pelissier | Hand-held ultrasound imaging device having reconfigurable user interface |
EP2115555A1 (en) * | 2007-02-27 | 2009-11-11 | Motorola, Inc. | Adaptable user interface and mechanism for a portable electronic device |
US20090300537A1 (en) * | 2008-05-27 | 2009-12-03 | Park Kenneth J | Method and system for changing format for displaying information on handheld device |
US20100110020A1 (en) * | 2008-10-31 | 2010-05-06 | Sprint Communications Company L.P. | Virtual press number pad |
US20100171693A1 (en) * | 2009-01-06 | 2010-07-08 | Kenichi Tamura | Display control device, display control method, and program |
US7859518B1 (en) | 2001-06-04 | 2010-12-28 | Palm, Inc. | Interface for interaction with display visible from both sides |
US20110012926A1 (en) * | 2009-07-17 | 2011-01-20 | Apple Inc. | Selective rotation of a user interface |
US20110029904A1 (en) * | 2009-07-30 | 2011-02-03 | Adam Miles Smith | Behavior and Appearance of Touch-Optimized User Interface Elements for Controlling Computer Function |
US20110105186A1 (en) * | 2009-10-29 | 2011-05-05 | Research In Motion Limited | Systems and methods for providing direct and indirect navigation modes for touchscreen devices |
US8059232B2 (en) | 2008-02-08 | 2011-11-15 | Motorola Mobility, Inc. | Electronic device and LC shutter for polarization-sensitive switching between transparent and diffusive states |
US20130215062A1 (en) * | 2008-01-29 | 2013-08-22 | Kyocera Corporation | Terminal device with display function |
US20130301272A1 (en) * | 2012-05-09 | 2013-11-14 | Chan Hee Wang | Display device and method for fabricating the same |
US20150253891A1 (en) * | 2008-01-04 | 2015-09-10 | Apple Inc. | Selective rejection of touch contacts in an edge region of a touch surface |
US20150309712A1 (en) * | 2010-11-26 | 2015-10-29 | Hologic, Inc. | User interface for medical image review workstation |
US9632608B2 (en) | 2008-12-08 | 2017-04-25 | Apple Inc. | Selective input signal rejection and modification |
US20180121071A1 (en) * | 2016-11-03 | 2018-05-03 | Ford Global Technologies, Llc | Vehicle display based on vehicle speed |
US9971496B2 (en) | 2014-08-04 | 2018-05-15 | Google Technology Holdings LLC | Method and apparatus for adjusting a graphical user interface on an electronic device |
US10282155B2 (en) | 2012-01-26 | 2019-05-07 | Google Technology Holdings LLC | Portable electronic device and method for controlling operation thereof taking into account which limb possesses the electronic device |
US10459608B2 (en) * | 2014-12-01 | 2019-10-29 | Ebay Inc. | Mobile optimized shopping comparison |
US11307743B2 (en) * | 2017-11-07 | 2022-04-19 | Samsung Electronics Co., Ltd. | Method, electronic device and storage medium for providing mode switching |
US11314391B2 (en) * | 2017-09-08 | 2022-04-26 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Navigation bar controlling method and terminal |
US11379060B2 (en) | 2004-08-25 | 2022-07-05 | Apple Inc. | Wide touchpad on a portable computer |
US11403483B2 (en) | 2017-06-20 | 2022-08-02 | Hologic, Inc. | Dynamic self-learning medical image method and system |
US11406332B2 (en) | 2011-03-08 | 2022-08-09 | Hologic, Inc. | System and method for dual energy and/or contrast enhanced breast imaging for screening, diagnosis and biopsy |
US11419565B2 | 2014-02-28 | 2022-08-23 | Hologic, Inc. | System and method for generating and displaying tomosynthesis image slabs
US11445993B2 (en) | 2017-03-30 | 2022-09-20 | Hologic, Inc. | System and method for targeted object enhancement to generate synthetic breast tissue images |
US11452486B2 (en) | 2006-02-15 | 2022-09-27 | Hologic, Inc. | Breast biopsy and needle localization using tomosynthesis systems |
US11455754B2 (en) | 2017-03-30 | 2022-09-27 | Hologic, Inc. | System and method for synthesizing low-dimensional image data from high-dimensional image data using an object grid enhancement |
US11508340B2 (en) | 2011-11-27 | 2022-11-22 | Hologic, Inc. | System and method for generating a 2D image using mammography and/or tomosynthesis image data |
US11550466B2 (en) | 2012-08-27 | 2023-01-10 | Samsung Electronics Co., Ltd. | Method of controlling a list scroll bar and an electronic device using the same |
US11589944B2 (en) | 2013-03-15 | 2023-02-28 | Hologic, Inc. | Tomosynthesis-guided biopsy apparatus and method |
US11663780B2 (en) | 2012-02-13 | 2023-05-30 | Hologic Inc. | System and method for navigating a tomosynthesis stack using synthesized image data |
US11701199B2 (en) | 2009-10-08 | 2023-07-18 | Hologic, Inc. | Needle breast biopsy system and method of use |
US11957497B2 (en) | 2017-03-30 | 2024-04-16 | Hologic, Inc | System and method for hierarchical multi-level feature image synthesis and representation |
Families Citing this family (56)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8095879B2 (en) * | 2002-12-10 | 2012-01-10 | Neonode Inc. | User interface for mobile handheld computer unit |
US7886236B2 (en) * | 2003-03-28 | 2011-02-08 | Microsoft Corporation | Dynamic feedback for gestures |
US7895537B2 (en) * | 2003-12-29 | 2011-02-22 | International Business Machines Corporation | Method and apparatus for setting attributes and initiating actions through gestures |
US7496385B2 (en) * | 2003-12-29 | 2009-02-24 | International Business Machines Corporation | Method for viewing information underlying lists and other contexts |
DE102004013415B4 (en) * | 2004-03-18 | 2011-12-08 | Disetronic Licensing Ag | Rotatable display of a medical, pharmaceutical or cosmetic device |
US7454174B2 (en) * | 2004-08-03 | 2008-11-18 | Qualcomm, Incorporated | Estimation of received signal strength |
US20060227100A1 (en) * | 2005-03-30 | 2006-10-12 | Yu Kun | Mobile communication terminal and method |
US7509588B2 (en) | 2005-12-30 | 2009-03-24 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
KR101269375B1 (en) * | 2006-05-24 | 2013-05-29 | 엘지전자 주식회사 | Touch screen apparatus and Imige displaying method of touch screen |
US20090213086A1 (en) * | 2006-04-19 | 2009-08-27 | Ji Suk Chae | Touch screen device and operating method thereof |
KR101327581B1 (en) * | 2006-05-24 | 2013-11-12 | 엘지전자 주식회사 | Apparatus and Operating method of touch screen |
TWI328185B (en) * | 2006-04-19 | 2010-08-01 | Lg Electronics Inc | Touch screen device for potable terminal and method of displaying and selecting menus thereon |
KR20070113018A (en) * | 2006-05-24 | 2007-11-28 | 엘지전자 주식회사 | Apparatus and operating method of touch screen |
KR20070113025A (en) * | 2006-05-24 | 2007-11-28 | 엘지전자 주식회사 | Apparatus and operating method of touch screen |
KR20070113022A (en) * | 2006-05-24 | 2007-11-28 | 엘지전자 주식회사 | Apparatus and operating method of touch screen responds to user input |
TW200805131A (en) * | 2006-05-24 | 2008-01-16 | Lg Electronics Inc | Touch screen device and method of selecting files thereon |
TW200744352A (en) * | 2006-05-26 | 2007-12-01 | Benq Corp | Mobile communication devices and methods for displaying menu thereof |
US20070295540A1 (en) * | 2006-06-23 | 2007-12-27 | Nurmi Mikko A | Device feature activation |
US10313505B2 (en) | 2006-09-06 | 2019-06-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
US8161395B2 (en) * | 2006-11-13 | 2012-04-17 | Cisco Technology, Inc. | Method for secure data entry in an application |
US8120584B2 (en) * | 2006-12-21 | 2012-02-21 | Cypress Semiconductor Corporation | Feedback mechanism for user detection of reference location on a sensing device |
US8519964B2 (en) | 2007-01-07 | 2013-08-27 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
US7999789B2 (en) * | 2007-03-14 | 2011-08-16 | Computime, Ltd. | Electrical device with a selected orientation for operation |
US20090019188A1 (en) * | 2007-07-11 | 2009-01-15 | Igt | Processing input for computing systems based on the state of execution |
KR101365595B1 (en) * | 2007-08-16 | 2014-02-21 | 삼성전자주식회사 | Method for inputting of device containing display unit based on GUI and apparatus thereof |
US8619038B2 (en) | 2007-09-04 | 2013-12-31 | Apple Inc. | Editing interface |
KR20090029138A (en) * | 2007-09-17 | 2009-03-20 | 삼성전자주식회사 | The method of inputting user command by gesture and the multimedia apparatus thereof |
KR101499546B1 (en) | 2008-01-17 | 2015-03-09 | 삼성전자주식회사 | Method and apparatus for controlling display area in touch screen device, and computer readable medium thereof |
US8769427B2 (en) | 2008-09-19 | 2014-07-01 | Google Inc. | Quick gesture input |
US8508475B2 (en) * | 2008-10-24 | 2013-08-13 | Microsoft Corporation | User interface elements positioned for display |
US8423916B2 (en) * | 2008-11-20 | 2013-04-16 | Canon Kabushiki Kaisha | Information processing apparatus, processing method thereof, and computer-readable storage medium |
US20100138781A1 (en) * | 2008-11-30 | 2010-06-03 | Nokia Corporation | Phonebook arrangement |
KR20100067381A (en) * | 2008-12-11 | 2010-06-21 | 삼성전자주식회사 | Method for providing a physical user interface and a portable terminal therefor |
JP5353345B2 (en) * | 2009-03-18 | 2013-11-27 | 株式会社リコー | Information processing apparatus, display processing method, and program |
US8587532B2 (en) | 2009-12-18 | 2013-11-19 | Intel Corporation | Multi-feature interactive touch user interface |
US8881060B2 (en) | 2010-04-07 | 2014-11-04 | Apple Inc. | Device, method, and graphical user interface for managing folders |
US10788976B2 (en) | 2010-04-07 | 2020-09-29 | Apple Inc. | Device, method, and graphical user interface for managing folders with multiple pages |
US8531417B2 (en) | 2010-09-02 | 2013-09-10 | Blackberry Limited | Location of a touch-sensitive control method and apparatus |
JP5075975B2 (en) * | 2010-12-27 | 2012-11-21 | 株式会社東芝 | Information processing apparatus, information processing method, and program |
US8610682B1 (en) * | 2011-02-17 | 2013-12-17 | Google Inc. | Restricted carousel with built-in gesture customization |
KR101898202B1 (en) * | 2012-02-09 | 2018-09-12 | 삼성전자주식회사 | Apparatus and method for guiding writing input for recognation of writing |
US9223483B2 (en) | 2012-02-24 | 2015-12-29 | Blackberry Limited | Method and apparatus for providing a user interface on a device that indicates content operators |
US8539375B1 (en) | 2012-02-24 | 2013-09-17 | Blackberry Limited | Method and apparatus for providing a user interface on a device enabling selection of operations to be performed in relation to content |
EP2631738B1 (en) * | 2012-02-24 | 2016-04-13 | BlackBerry Limited | Method and apparatus for adjusting a user interface to reduce obscuration |
CN103376972A (en) * | 2012-04-12 | 2013-10-30 | 环达电脑(上海)有限公司 | Electronic device and control method of touch control screen of electronic device |
US20140184519A1 (en) * | 2012-12-28 | 2014-07-03 | Hayat Benchenaa | Adapting user interface based on handedness of use of mobile computing device |
US20140331154A1 (en) * | 2013-05-05 | 2014-11-06 | Carrier Corporation | User defined interface system and a method for using the same |
WO2014192125A1 (en) * | 2013-05-30 | 2014-12-04 | 株式会社 東芝 | Electronic device and processing method |
KR102405189B1 (en) | 2013-10-30 | 2022-06-07 | 애플 인크. | Displaying relevant user interface objects |
US10066959B2 (en) | 2014-09-02 | 2018-09-04 | Apple Inc. | User interactions for a mapping application |
AU2016215440B2 (en) | 2015-02-02 | 2019-03-14 | Apple Inc. | Device, method, and graphical user interface for establishing a relationship and connection between two devices |
DK201670595A1 (en) | 2016-06-11 | 2018-01-22 | Apple Inc | Configuring context-specific user interfaces |
DK201770423A1 (en) | 2016-06-11 | 2018-01-15 | Apple Inc | Activity and workout updates |
US11816325B2 (en) * | 2016-06-12 | 2023-11-14 | Apple Inc. | Application shortcuts for carplay |
US11675476B2 (en) | 2019-05-05 | 2023-06-13 | Apple Inc. | User interfaces for widgets |
US11863700B2 (en) * | 2019-05-06 | 2024-01-02 | Apple Inc. | Providing user interfaces based on use contexts and managing playback of media |
Citations (72)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5252951A (en) * | 1989-04-28 | 1993-10-12 | International Business Machines Corporation | Graphical user interface with gesture recognition in a multiapplication environment |
US5276794A (en) * | 1990-09-25 | 1994-01-04 | Grid Systems Corporation | Pop-up keyboard system for entering handwritten data into computer generated forms |
US5347295A (en) * | 1990-10-31 | 1994-09-13 | Go Corporation | Control of a computer through a position-sensed stylus |
US5432720A (en) * | 1992-11-13 | 1995-07-11 | International Business Machines Corporation | Rotatable pen-based computer |
US5483261A (en) * | 1992-02-14 | 1996-01-09 | Itu Research, Inc. | Graphical input controller and method with rear screen image detection |
US5502461A (en) * | 1993-05-11 | 1996-03-26 | Sanyo Electric Co., Ltd. | Hand written character input system/allowing change of size of character writing frames |
US5596697A (en) * | 1993-09-30 | 1997-01-21 | Apple Computer, Inc. | Method for routing items within a computer system |
US5621438A (en) * | 1992-10-12 | 1997-04-15 | Hitachi, Ltd. | Pointing information processing apparatus with pointing function |
US5644737A (en) * | 1995-06-06 | 1997-07-01 | Microsoft Corporation | Method and system for stacking toolbars in a computer display |
US5731801A (en) * | 1994-03-31 | 1998-03-24 | Wacom Co., Ltd. | Two-handed method of displaying information on a computer display |
US5736974A (en) * | 1995-02-17 | 1998-04-07 | International Business Machines Corporation | Method and apparatus for improving visibility and selectability of icons |
US5745718A (en) * | 1995-07-31 | 1998-04-28 | International Business Machines Corporation | Folder bar widget |
US5757371A (en) * | 1994-12-13 | 1998-05-26 | Microsoft Corporation | Taskbar with start menu |
US5760773A (en) * | 1995-01-06 | 1998-06-02 | Microsoft Corporation | Methods and apparatus for interacting with data objects using action handles |
US5801699A (en) * | 1996-01-26 | 1998-09-01 | International Business Machines Corporation | Icon aggregation on a graphical user interface |
US5805167A (en) * | 1994-09-22 | 1998-09-08 | Van Cruyningen; Izak | Popup menus with directional gestures |
US5828376A (en) * | 1996-09-23 | 1998-10-27 | J. D. Edwards World Source Company | Menu control in a graphical user interface |
US5859623A (en) * | 1996-05-14 | 1999-01-12 | Proxima Corporation | Intelligent display system presentation projection arrangement and method of using same |
US5936619A (en) * | 1992-09-11 | 1999-08-10 | Canon Kabushiki Kaisha | Information processor |
US5940488A (en) * | 1996-11-15 | 1999-08-17 | Active Voice Corporation | Telecommunication management system and user interface |
US5973664A (en) * | 1998-03-19 | 1999-10-26 | Portrait Displays, Inc. | Parameterized image orientation for computer displays |
US6018346A (en) * | 1998-01-12 | 2000-01-25 | Xerox Corporation | Freeform graphics system having meeting objects for supporting meeting objectives |
US6037937A (en) * | 1997-12-04 | 2000-03-14 | Nortel Networks Corporation | Navigation tool for graphical user interface |
US6057836A (en) * | 1997-04-01 | 2000-05-02 | Microsoft Corporation | System and method for resizing and rearranging a composite toolbar by direct manipulation |
US6067584A (en) * | 1996-09-09 | 2000-05-23 | National Instruments Corporation | Attribute-based system and method for configuring and controlling a data acquisition task |
US6069623A (en) * | 1997-09-19 | 2000-05-30 | International Business Machines Corporation | Method and system for the dynamic customization of graphical user interface elements |
US6097392A (en) * | 1992-09-10 | 2000-08-01 | Microsoft Corporation | Method and system of altering an attribute of a graphic object in a pen environment |
US6096094A (en) * | 1997-10-03 | 2000-08-01 | National Instruments Corporation | Configuration manager for configuring a data acquisition system |
US6133915A (en) * | 1998-06-17 | 2000-10-17 | Microsoft Corporation | System and method for customizing controls on a toolbar |
US6181344B1 (en) * | 1998-03-20 | 2001-01-30 | Nuvomedia, Inc. | Drag-and-release method for configuring user-definable function key of hand-held computing device |
US20010002126A1 (en) * | 1995-12-01 | 2001-05-31 | Immersion Corporation | Providing force feedback to a user of an interface device based on interactions of a user-controlled cursor in a graphical user interface |
US6300946B1 (en) * | 1997-01-29 | 2001-10-09 | Palm, Inc. | Method and apparatus for interacting with a portable computer |
US6310610B1 (en) * | 1997-12-04 | 2001-10-30 | Nortel Networks Limited | Intelligent touch display |
US20010038394A1 (en) * | 2000-05-08 | 2001-11-08 | Tadao Tsuchimura | Information display system having graphical user interface, and medium |
US20020015064A1 (en) * | 2000-08-07 | 2002-02-07 | Robotham John S. | Gesture-based user interface to multi-level and multi-modal sets of bit-maps |
US6346972B1 (en) * | 1999-05-26 | 2002-02-12 | Samsung Electronics Co., Ltd. | Video display apparatus with on-screen display pivoting function |
US20020021278A1 (en) * | 2000-07-17 | 2002-02-21 | Hinckley Kenneth P. | Method and apparatus using multiple sensors in a device with a display |
US20020033836A1 (en) * | 2000-06-06 | 2002-03-21 | Smith Scott R. | Device and method for changing the orientation and configuration of a display of an electronic device |
US20020059350A1 (en) * | 2000-11-10 | 2002-05-16 | Marieke Iwema | Insertion point bungee space tool |
US20020078037A1 (en) * | 2000-10-12 | 2002-06-20 | Mitsuyuki Hatanaka | Information processing apparatus and method, and program storing medium |
US20020091700A1 (en) * | 2000-01-21 | 2002-07-11 | Steele Robert A. | Unique architecture for handheld computers |
US20020113784A1 (en) * | 2000-12-29 | 2002-08-22 | Feilmeier Michael Leon | Portable computer aided design apparatus and method |
US20020163544A1 (en) * | 2001-03-02 | 2002-11-07 | Baker Bruce R. | Computer device, method and article of manufacture for utilizing sequenced symbols to enable programmed application and commands |
US20020173721A1 (en) * | 1999-08-20 | 2002-11-21 | Novasonics, Inc. | User interface for handheld imaging devices |
US6493464B1 (en) * | 1994-07-01 | 2002-12-10 | Palm, Inc. | Multiple pen stroke character set and handwriting recognition system with immediate response |
US20020188636A1 (en) * | 2001-05-02 | 2002-12-12 | Peck David K. | System and method for in-line editing of web-based documents |
US20030013959A1 (en) * | 1999-08-20 | 2003-01-16 | Sorin Grunwald | User interface for handheld imaging devices |
US20030018245A1 (en) * | 2001-07-17 | 2003-01-23 | Accuimage Diagnostics Corp. | Methods for generating a lung report |
US20030016850A1 (en) * | 2001-07-17 | 2003-01-23 | Leon Kaufman | Systems and graphical user interface for analyzing body images |
US6525749B1 (en) * | 1993-12-30 | 2003-02-25 | Xerox Corporation | Apparatus and method for supporting the implicit structure of freeform lists, outlines, text, tables and diagrams in a gesture-based input system and editing system |
US20030050906A1 (en) * | 1998-08-26 | 2003-03-13 | Gervase Clifton-Bligh | Methods and devices for mapping data files |
US6545669B1 (en) * | 1999-03-26 | 2003-04-08 | Husam Kinawi | Object-drag continuity between discontinuous touch-screens |
US20030073430A1 (en) * | 2001-10-17 | 2003-04-17 | Palm, Inc. | User interface-technique for managing an active call |
US6597347B1 (en) * | 1991-11-26 | 2003-07-22 | Itu Research Inc. | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom |
US6624831B1 (en) * | 2000-10-17 | 2003-09-23 | Microsoft Corporation | System and process for generating a dynamically adjustable toolbar |
US20030221167A1 (en) * | 2001-04-25 | 2003-11-27 | Eric Goldstein | System, method and apparatus for selecting, displaying, managing, tracking and transferring access to content of web pages and other sources |
US6683623B1 (en) * | 2000-08-30 | 2004-01-27 | New Forum Publishers | System and method for providing and accessing educational information over a computer network |
US6683600B1 (en) * | 2000-04-19 | 2004-01-27 | Microsoft Corporation | Adaptive input pen mode selection |
US20040032413A1 (en) * | 2002-08-13 | 2004-02-19 | Fuller David W. | Multiple views for a measurement system diagram |
US20040113935A1 (en) * | 2001-05-25 | 2004-06-17 | O'neal David | System and method for electronic presentations |
US20040174398A1 (en) * | 2003-03-04 | 2004-09-09 | Microsoft Corporation | System and method for navigating a graphical user interface on a smaller display |
US20050024341A1 (en) * | 2001-05-16 | 2005-02-03 | Synaptics, Inc. | Touch screen with user interface enhancement |
US6952203B2 (en) * | 2002-01-08 | 2005-10-04 | International Business Machines Corporation | Touchscreen user interface: Bluetooth™ stylus for performing right mouse clicks |
US20060048058A1 (en) * | 2001-05-25 | 2006-03-02 | Learning Tree International | System and method for electronic presentations |
US7015894B2 (en) * | 2001-09-28 | 2006-03-21 | Ricoh Company, Ltd. | Information input and output system, method, storage medium, and carrier wave |
US7030888B1 (en) * | 1999-03-01 | 2006-04-18 | Eastman Kodak Company | Color processing |
US7185274B1 (en) * | 1999-12-07 | 2007-02-27 | Microsoft Corporation | Computer user interface architecture wherein users interact with both content and user interface by activating links |
US7190351B1 (en) * | 2002-05-10 | 2007-03-13 | Michael Goren | System and method for data input |
US20070128899A1 (en) * | 2003-01-12 | 2007-06-07 | Yaron Mayer | System and method for improving the efficiency, comfort, and/or reliability in Operating Systems, such as for example Windows |
US20070203906A1 (en) * | 2003-09-22 | 2007-08-30 | Cone Julian M | Enhanced Search Engine |
US20080177994A1 (en) * | 2003-01-12 | 2008-07-24 | Yaron Mayer | System and method for improving the efficiency, comfort, and/or reliability in Operating Systems, such as for example Windows |
US7831934B2 (en) * | 2002-08-26 | 2010-11-09 | Palm, Inc. | User-interface features for computers with contact-sensitive displays |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5889888A (en) * | 1996-12-05 | 1999-03-30 | 3Com Corporation | Method and apparatus for immediate response handwriting recognition system that handles multiple character sets |
CN1217255C (en) | 1999-12-28 | 2005-08-31 | 索尼株式会社 | Electronic device with dispaly function |
2003
- 2003-05-30 US US10/452,232 patent/US7406666B2/en active Active
- 2003-05-30 US US10/452,233 patent/US20040036680A1/en not_active Abandoned
- 2003-08-26 WO PCT/US2003/026869 patent/WO2004019200A2/en not_active Application Discontinuation
- 2003-08-26 CA CA002496774A patent/CA2496774A1/en not_active Abandoned
- 2003-08-26 EP EP03793432A patent/EP1558985A2/en not_active Ceased
- 2003-08-26 AU AU2003262921A patent/AU2003262921A1/en not_active Abandoned

2008
- 2008-06-23 US US12/144,545 patent/US7831934B2/en not_active Expired - Lifetime
Patent Citations (75)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5252951A (en) * | 1989-04-28 | 1993-10-12 | International Business Machines Corporation | Graphical user interface with gesture recognition in a multiapplication environment |
US5276794A (en) * | 1990-09-25 | 1994-01-04 | Grid Systems Corporation | Pop-up keyboard system for entering handwritten data into computer generated forms |
US5347295A (en) * | 1990-10-31 | 1994-09-13 | Go Corporation | Control of a computer through a position-sensed stylus |
US6597347B1 (en) * | 1991-11-26 | 2003-07-22 | Itu Research Inc. | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom |
US5483261A (en) * | 1992-02-14 | 1996-01-09 | Itu Research, Inc. | Graphical input controller and method with rear screen image detection |
US6097392A (en) * | 1992-09-10 | 2000-08-01 | Microsoft Corporation | Method and system of altering an attribute of a graphic object in a pen environment |
US5936619A (en) * | 1992-09-11 | 1999-08-10 | Canon Kabushiki Kaisha | Information processor |
US5621438A (en) * | 1992-10-12 | 1997-04-15 | Hitachi, Ltd. | Pointing information processing apparatus with pointing function |
US5432720A (en) * | 1992-11-13 | 1995-07-11 | International Business Machines Corporation | Rotatable pen-based computer |
US5566098A (en) * | 1992-11-13 | 1996-10-15 | International Business Machines Corporation | Rotatable pen-based computer with automatically reorienting display |
US5502461A (en) * | 1993-05-11 | 1996-03-26 | Sanyo Electric Co., Ltd. | Hand written character input system/allowing change of size of character writing frames |
US5596697A (en) * | 1993-09-30 | 1997-01-21 | Apple Computer, Inc. | Method for routing items within a computer system |
US6525749B1 (en) * | 1993-12-30 | 2003-02-25 | Xerox Corporation | Apparatus and method for supporting the implicit structure of freeform lists, outlines, text, tables and diagrams in a gesture-based input system and editing system |
US5731801A (en) * | 1994-03-31 | 1998-03-24 | Wacom Co., Ltd. | Two-handed method of displaying information on a computer display |
US6493464B1 (en) * | 1994-07-01 | 2002-12-10 | Palm, Inc. | Multiple pen stroke character set and handwriting recognition system with immediate response |
US5805167A (en) * | 1994-09-22 | 1998-09-08 | Van Cruyningen; Izak | Popup menus with directional gestures |
US5757371A (en) * | 1994-12-13 | 1998-05-26 | Microsoft Corporation | Taskbar with start menu |
US5760773A (en) * | 1995-01-06 | 1998-06-02 | Microsoft Corporation | Methods and apparatus for interacting with data objects using action handles |
US5736974A (en) * | 1995-02-17 | 1998-04-07 | International Business Machines Corporation | Method and apparatus for improving visibility and selectability of icons |
US5644737A (en) * | 1995-06-06 | 1997-07-01 | Microsoft Corporation | Method and system for stacking toolbars in a computer display |
US5745718A (en) * | 1995-07-31 | 1998-04-28 | International Business Machines Corporation | Folder bar widget |
US20010002126A1 (en) * | 1995-12-01 | 2001-05-31 | Immersion Corporation | Providing force feedback to a user of an interface device based on interactions of a user-controlled cursor in a graphical user interface |
US5801699A (en) * | 1996-01-26 | 1998-09-01 | International Business Machines Corporation | Icon aggregation on a graphical user interface |
US5859623A (en) * | 1996-05-14 | 1999-01-12 | Proxima Corporation | Intelligent display system presentation projection arrangement and method of using same |
US6067584A (en) * | 1996-09-09 | 2000-05-23 | National Instruments Corporation | Attribute-based system and method for configuring and controlling a data acquisition task |
US5828376A (en) * | 1996-09-23 | 1998-10-27 | J. D. Edwards World Source Company | Menu control in a graphical user interface |
US5940488A (en) * | 1996-11-15 | 1999-08-17 | Active Voice Corporation | Telecommunication management system and user interface |
US6300946B1 (en) * | 1997-01-29 | 2001-10-09 | Palm, Inc. | Method and apparatus for interacting with a portable computer |
US6057836A (en) * | 1997-04-01 | 2000-05-02 | Microsoft Corporation | System and method for resizing and rearranging a composite toolbar by direct manipulation |
US6069623A (en) * | 1997-09-19 | 2000-05-30 | International Business Machines Corporation | Method and system for the dynamic customization of graphical user interface elements |
US6096094A (en) * | 1997-10-03 | 2000-08-01 | National Instruments Corporation | Configuration manager for configuring a data acquisition system |
US6037937A (en) * | 1997-12-04 | 2000-03-14 | Nortel Networks Corporation | Navigation tool for graphical user interface |
US6310610B1 (en) * | 1997-12-04 | 2001-10-30 | Nortel Networks Limited | Intelligent touch display |
US6018346A (en) * | 1998-01-12 | 2000-01-25 | Xerox Corporation | Freeform graphics system having meeting objects for supporting meeting objectives |
US5973664A (en) * | 1998-03-19 | 1999-10-26 | Portrait Displays, Inc. | Parameterized image orientation for computer displays |
US6181344B1 (en) * | 1998-03-20 | 2001-01-30 | Nuvomedia, Inc. | Drag-and-release method for configuring user-definable function key of hand-held computing device |
US6133915A (en) * | 1998-06-17 | 2000-10-17 | Microsoft Corporation | System and method for customizing controls on a toolbar |
US20030050906A1 (en) * | 1998-08-26 | 2003-03-13 | Gervase Clifton-Bligh | Methods and devices for mapping data files |
US7030888B1 (en) * | 1999-03-01 | 2006-04-18 | Eastman Kodak Company | Color processing |
US6545669B1 (en) * | 1999-03-26 | 2003-04-08 | Husam Kinawi | Object-drag continuity between discontinuous touch-screens |
US6346972B1 (en) * | 1999-05-26 | 2002-02-12 | Samsung Electronics Co., Ltd. | Video display apparatus with on-screen display pivoting function |
US20020173721A1 (en) * | 1999-08-20 | 2002-11-21 | Novasonics, Inc. | User interface for handheld imaging devices |
US20030013959A1 (en) * | 1999-08-20 | 2003-01-16 | Sorin Grunwald | User interface for handheld imaging devices |
US7185274B1 (en) * | 1999-12-07 | 2007-02-27 | Microsoft Corporation | Computer user interface architecture wherein users interact with both content and user interface by activating links |
US20020091700A1 (en) * | 2000-01-21 | 2002-07-11 | Steele Robert A. | Unique architecture for handheld computers |
US6683600B1 (en) * | 2000-04-19 | 2004-01-27 | Microsoft Corporation | Adaptive input pen mode selection |
US20010038394A1 (en) * | 2000-05-08 | 2001-11-08 | Tadao Tsuchimura | Information display system having graphical user interface, and medium |
US20020033836A1 (en) * | 2000-06-06 | 2002-03-21 | Smith Scott R. | Device and method for changing the orientation and configuration of a display of an electronic device |
US20020021278A1 (en) * | 2000-07-17 | 2002-02-21 | Hinckley Kenneth P. | Method and apparatus using multiple sensors in a device with a display |
US20020015064A1 (en) * | 2000-08-07 | 2002-02-07 | Robotham John S. | Gesture-based user interface to multi-level and multi-modal sets of bit-maps |
US6683623B1 (en) * | 2000-08-30 | 2004-01-27 | New Forum Publishers | System and method for providing and accessing educational information over a computer network |
US20020078037A1 (en) * | 2000-10-12 | 2002-06-20 | Mitsuyuki Hatanaka | Information processing apparatus and method, and program storing medium |
US6624831B1 (en) * | 2000-10-17 | 2003-09-23 | Microsoft Corporation | System and process for generating a dynamically adjustable toolbar |
US20020059350A1 (en) * | 2000-11-10 | 2002-05-16 | Marieke Iwema | Insertion point bungee space tool |
US20020113784A1 (en) * | 2000-12-29 | 2002-08-22 | Feilmeier Michael Leon | Portable computer aided design apparatus and method |
US7076738B2 (en) * | 2001-03-02 | 2006-07-11 | Semantic Compaction Systems | Computer device, method and article of manufacture for utilizing sequenced symbols to enable programmed application and commands |
US20020163544A1 (en) * | 2001-03-02 | 2002-11-07 | Baker Bruce R. | Computer device, method and article of manufacture for utilizing sequenced symbols to enable programmed application and commands |
US20030221167A1 (en) * | 2001-04-25 | 2003-11-27 | Eric Goldstein | System, method and apparatus for selecting, displaying, managing, tracking and transferring access to content of web pages and other sources |
US20020188636A1 (en) * | 2001-05-02 | 2002-12-12 | Peck David K. | System and method for in-line editing of web-based documents |
US20050024341A1 (en) * | 2001-05-16 | 2005-02-03 | Synaptics, Inc. | Touch screen with user interface enhancement |
US20060048058A1 (en) * | 2001-05-25 | 2006-03-02 | Learning Tree International | System and method for electronic presentations |
US20040113935A1 (en) * | 2001-05-25 | 2004-06-17 | O'neal David | System and method for electronic presentations |
US20030016850A1 (en) * | 2001-07-17 | 2003-01-23 | Leon Kaufman | Systems and graphical user interface for analyzing body images |
US20030018245A1 (en) * | 2001-07-17 | 2003-01-23 | Accuimage Diagnostics Corp. | Methods for generating a lung report |
US7015894B2 (en) * | 2001-09-28 | 2006-03-21 | Ricoh Company, Ltd. | Information input and output system, method, storage medium, and carrier wave |
US7231208B2 (en) * | 2001-10-17 | 2007-06-12 | Palm, Inc. | User interface-technique for managing an active call |
US20030073430A1 (en) * | 2001-10-17 | 2003-04-17 | Palm, Inc. | User interface-technique for managing an active call |
US6952203B2 (en) * | 2002-01-08 | 2005-10-04 | International Business Machines Corporation | Touchscreen user interface: Bluetooth™ stylus for performing right mouse clicks |
US7190351B1 (en) * | 2002-05-10 | 2007-03-13 | Michael Goren | System and method for data input |
US20040032413A1 (en) * | 2002-08-13 | 2004-02-19 | Fuller David W. | Multiple views for a measurement system diagram |
US7831934B2 (en) * | 2002-08-26 | 2010-11-09 | Palm, Inc. | User-interface features for computers with contact-sensitive displays |
US20070128899A1 (en) * | 2003-01-12 | 2007-06-07 | Yaron Mayer | System and method for improving the efficiency, comfort, and/or reliability in Operating Systems, such as for example Windows |
US20080177994A1 (en) * | 2003-01-12 | 2008-07-24 | Yaron Mayer | System and method for improving the efficiency, comfort, and/or reliability in Operating Systems, such as for example Windows |
US20040174398A1 (en) * | 2003-03-04 | 2004-09-09 | Microsoft Corporation | System and method for navigating a graphical user interface on a smaller display |
US20070203906A1 (en) * | 2003-09-22 | 2007-08-30 | Cone Julian M | Enhanced Search Engine |
Cited By (89)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090015597A1 (en) * | 2000-05-18 | 2009-01-15 | Palm, Inc. | Reorienting display on portable computing device |
US8031212B2 (en) | 2000-05-18 | 2011-10-04 | Hewlett-Packard Development Company, L.P. | Reorienting display on portable computing device |
US7859518B1 (en) | 2001-06-04 | 2010-12-28 | Palm, Inc. | Interface for interaction with display visible from both sides |
US7831934B2 (en) * | 2002-08-26 | 2010-11-09 | Palm, Inc. | User-interface features for computers with contact-sensitive displays |
US7406666B2 (en) | 2002-08-26 | 2008-07-29 | Palm, Inc. | User-interface features for computers with contact-sensitive displays |
US20040046791A1 (en) * | 2002-08-26 | 2004-03-11 | Mark Davis | User-interface features for computers with contact-sensitive displays |
US20090007025A1 (en) * | 2002-08-26 | 2009-01-01 | Mark Davis | User-interface features for computers with contact-sensitive displays |
US20080166049A1 (en) * | 2004-04-02 | 2008-07-10 | Nokia Corporation | Apparatus and Method for Handwriting Recognition |
US8094938B2 (en) * | 2004-04-02 | 2012-01-10 | Nokia Corporation | Apparatus and method for handwriting recognition |
US7580029B2 (en) * | 2004-04-02 | 2009-08-25 | Nokia Corporation | Apparatus and method for handwriting recognition |
US20050219226A1 (en) * | 2004-04-02 | 2005-10-06 | Ying Liu | Apparatus and method for handwriting recognition |
US20070216643A1 (en) * | 2004-06-16 | 2007-09-20 | Morris Robert P | Multipurpose Navigation Keys For An Electronic Device |
US11379060B2 (en) | 2004-08-25 | 2022-07-05 | Apple Inc. | Wide touchpad on a portable computer |
US20070094417A1 (en) * | 2005-05-16 | 2007-04-26 | Hur Yong S | Mobile terminal having scrolling device and method implementing functions using the same |
US11918389B2 (en) | 2006-02-15 | 2024-03-05 | Hologic, Inc. | Breast biopsy and needle localization using tomosynthesis systems |
US11452486B2 (en) | 2006-02-15 | 2022-09-27 | Hologic, Inc. | Breast biopsy and needle localization using tomosynthesis systems |
WO2007118019A2 (en) * | 2006-04-06 | 2007-10-18 | Motorola, Inc. | Method and apparatus for user interface adaptation |
US10048860B2 (en) | 2006-04-06 | 2018-08-14 | Google Technology Holdings LLC | Method and apparatus for user interface adaptation |
WO2007118019A3 (en) * | 2006-04-06 | 2008-04-24 | Motorola Inc | Method and apparatus for user interface adaptation |
US20080001932A1 (en) * | 2006-06-30 | 2008-01-03 | Inventec Corporation | Mobile communication device |
EP1923778A3 (en) * | 2006-11-16 | 2010-04-07 | LG Electronics, Inc. | Mobile terminal and screen display method thereof |
US8217904B2 (en) | 2006-11-16 | 2012-07-10 | Lg Electronics Inc. | Mobile terminal and screen display method thereof |
EP1923778A2 (en) * | 2006-11-16 | 2008-05-21 | LG Electronics, Inc. | Mobile terminal and screen display method thereof |
US20080119237A1 (en) * | 2006-11-16 | 2008-05-22 | Lg Electronics Inc. | Mobile terminal and screen display method thereof |
EP2163970A2 (en) * | 2007-02-27 | 2010-03-17 | Motorola, Inc. | Adaptable user interface and mechanism for a portable electronic device |
EP2115555A1 (en) * | 2007-02-27 | 2009-11-11 | Motorola, Inc. | Adaptable user interface and mechanism for a portable electronic device |
EP2163970A3 (en) * | 2007-02-27 | 2010-06-09 | Motorola, Inc. | Adaptable user interface and mechanism for a portable electronic device |
EP2115555A4 (en) * | 2007-02-27 | 2010-06-09 | Motorola Inc | Adaptable user interface and mechanism for a portable electronic device |
US8902152B2 (en) | 2007-04-30 | 2014-12-02 | Motorola Mobility Llc | Dual sided electrophoretic display |
US20080266244A1 (en) * | 2007-04-30 | 2008-10-30 | Xiaoping Bai | Dual Sided Electrophoretic Display |
US20080305837A1 (en) * | 2007-06-08 | 2008-12-11 | Inventec Corporation | Mobile communication apparatus |
US7969414B2 (en) * | 2007-06-08 | 2011-06-28 | Inventec Corporation | Mobile communication apparatus |
US20090231283A1 (en) * | 2007-06-22 | 2009-09-17 | Polak Robert D | Colored Morphing Apparatus for an Electronic Device |
US9122092B2 (en) | 2007-06-22 | 2015-09-01 | Google Technology Holdings LLC | Colored morphing apparatus for an electronic device |
US8957863B2 (en) | 2007-06-22 | 2015-02-17 | Google Technology Holdings LLC | Colored morphing apparatus for an electronic device |
US20080316397A1 (en) * | 2007-06-22 | 2008-12-25 | Polak Robert D | Colored Morphing Apparatus for an Electronic Device |
US20090198132A1 (en) * | 2007-08-10 | 2009-08-06 | Laurent Pelissier | Hand-held ultrasound imaging device having reconfigurable user interface |
US20090046072A1 (en) * | 2007-08-13 | 2009-02-19 | Emig David M | Electrically Non-interfering Printing for Electronic Devices Having Capacitive Touch Sensors |
US8077154B2 (en) | 2007-08-13 | 2011-12-13 | Motorola Mobility, Inc. | Electrically non-interfering printing for electronic devices having capacitive touch sensors |
US20090161059A1 (en) * | 2007-12-19 | 2009-06-25 | Emig David M | Field Effect Mode Electro-Optical Device Having a Quasi-Random Photospacer Arrangement |
US8139195B2 (en) | 2007-12-19 | 2012-03-20 | Motorola Mobility, Inc. | Field effect mode electro-optical device having a quasi-random photospacer arrangement |
US10747428B2 (en) * | 2008-01-04 | 2020-08-18 | Apple Inc. | Selective rejection of touch contacts in an edge region of a touch surface |
US9891732B2 (en) * | 2008-01-04 | 2018-02-13 | Apple Inc. | Selective rejection of touch contacts in an edge region of a touch surface |
US11449224B2 (en) | 2008-01-04 | 2022-09-20 | Apple Inc. | Selective rejection of touch contacts in an edge region of a touch surface |
US20150253891A1 (en) * | 2008-01-04 | 2015-09-10 | Apple Inc. | Selective rejection of touch contacts in an edge region of a touch surface |
US20220391086A1 (en) * | 2008-01-04 | 2022-12-08 | Apple Inc. | Selective rejection of touch contacts in an edge region of a touch surface |
US11886699B2 (en) * | 2008-01-04 | 2024-01-30 | Apple Inc. | Selective rejection of touch contacts in an edge region of a touch surface |
US8797270B2 (en) * | 2008-01-29 | 2014-08-05 | Kyocera Corporation | Terminal device with display function |
US9013439B2 (en) | 2008-01-29 | 2015-04-21 | Kyocera Corporation | Terminal device with display function |
US20130215062A1 (en) * | 2008-01-29 | 2013-08-22 | Kyocera Corporation | Terminal device with display function |
US9477338B2 (en) | 2008-01-29 | 2016-10-25 | Kyocera Corporation | Terminal device with display function |
US8059232B2 (en) | 2008-02-08 | 2011-11-15 | Motorola Mobility, Inc. | Electronic device and LC shutter for polarization-sensitive switching between transparent and diffusive states |
US20090300537A1 (en) * | 2008-05-27 | 2009-12-03 | Park Kenneth J | Method and system for changing format for displaying information on handheld device |
US20100110020A1 (en) * | 2008-10-31 | 2010-05-06 | Sprint Communications Company L.P. | Virtual press number pad |
US9632608B2 (en) | 2008-12-08 | 2017-04-25 | Apple Inc. | Selective input signal rejection and modification |
US10452174B2 (en) | 2008-12-08 | 2019-10-22 | Apple Inc. | Selective input signal rejection and modification |
US20100171693A1 (en) * | 2009-01-06 | 2010-07-08 | Kenichi Tamura | Display control device, display control method, and program |
US9053652B2 (en) * | 2009-01-06 | 2015-06-09 | Sony Corporation | Display control device, display control method, and program |
US9766788B2 (en) | 2009-07-17 | 2017-09-19 | Apple Inc. | Selective rotation of a user interface |
US8817048B2 (en) * | 2009-07-17 | 2014-08-26 | Apple Inc. | Selective rotation of a user interface |
US20110012926A1 (en) * | 2009-07-17 | 2011-01-20 | Apple Inc. | Selective rotation of a user interface |
US20110029904A1 (en) * | 2009-07-30 | 2011-02-03 | Adam Miles Smith | Behavior and Appearance of Touch-Optimized User Interface Elements for Controlling Computer Function |
US11701199B2 (en) | 2009-10-08 | 2023-07-18 | Hologic, Inc. | Needle breast biopsy system and method of use |
US20110105186A1 (en) * | 2009-10-29 | 2011-05-05 | Research In Motion Limited | Systems and methods for providing direct and indirect navigation modes for touchscreen devices |
US10444960B2 (en) * | 2010-11-26 | 2019-10-15 | Hologic, Inc. | User interface for medical image review workstation |
US11775156B2 (en) | 2010-11-26 | 2023-10-03 | Hologic, Inc. | User interface for medical image review workstation |
US20150309712A1 (en) * | 2010-11-26 | 2015-10-29 | Hologic, Inc. | User interface for medical image review workstation |
US11406332B2 (en) | 2011-03-08 | 2022-08-09 | Hologic, Inc. | System and method for dual energy and/or contrast enhanced breast imaging for screening, diagnosis and biopsy |
US11837197B2 (en) | 2011-11-27 | 2023-12-05 | Hologic, Inc. | System and method for generating a 2D image using mammography and/or tomosynthesis image data |
US11508340B2 (en) | 2011-11-27 | 2022-11-22 | Hologic, Inc. | System and method for generating a 2D image using mammography and/or tomosynthesis image data |
US10282155B2 (en) | 2012-01-26 | 2019-05-07 | Google Technology Holdings LLC | Portable electronic device and method for controlling operation thereof taking into account which limb possesses the electronic device |
US11663780B2 (en) | 2012-02-13 | 2023-05-30 | Hologic Inc. | System and method for navigating a tomosynthesis stack using synthesized image data |
US9022611B2 (en) * | 2012-05-09 | 2015-05-05 | Samsung Display Co., Ltd. | Display device and method for fabricating the same |
US20130301272A1 (en) * | 2012-05-09 | 2013-11-14 | Chan Hee Wang | Display device and method for fabricating the same |
US11550466B2 (en) | 2012-08-27 | 2023-01-10 | Samsung Electronics Co., Ltd. | Method of controlling a list scroll bar and an electronic device using the same |
US11589944B2 (en) | 2013-03-15 | 2023-02-28 | Hologic, Inc. | Tomosynthesis-guided biopsy apparatus and method |
US11419565B2 (en) | 2014-02-28 | 2022-08-23 | Hologic, Inc. | System and method for generating and displaying tomosynthesis image slabs |
US11801025B2 (en) | 2014-02-28 | 2023-10-31 | Hologic, Inc. | System and method for generating and displaying tomosynthesis image slabs |
US9971496B2 (en) | 2014-08-04 | 2018-05-15 | Google Technology Holdings LLC | Method and apparatus for adjusting a graphical user interface on an electronic device |
US11366572B2 (en) | 2014-12-01 | 2022-06-21 | Ebay Inc. | Mobile optimized shopping comparison |
US10459608B2 (en) * | 2014-12-01 | 2019-10-29 | Ebay Inc. | Mobile optimized shopping comparison |
US20180121071A1 (en) * | 2016-11-03 | 2018-05-03 | Ford Global Technologies, Llc | Vehicle display based on vehicle speed |
US11445993B2 (en) | 2017-03-30 | 2022-09-20 | Hologic, Inc. | System and method for targeted object enhancement to generate synthetic breast tissue images |
US11455754B2 (en) | 2017-03-30 | 2022-09-27 | Hologic, Inc. | System and method for synthesizing low-dimensional image data from high-dimensional image data using an object grid enhancement |
US11957497B2 (en) | 2017-03-30 | 2024-04-16 | Hologic, Inc. | System and method for hierarchical multi-level feature image synthesis and representation |
US11403483B2 (en) | 2017-06-20 | 2022-08-02 | Hologic, Inc. | Dynamic self-learning medical image method and system |
US11850021B2 (en) | 2017-06-20 | 2023-12-26 | Hologic, Inc. | Dynamic self-learning medical image method and system |
US11314391B2 (en) * | 2017-09-08 | 2022-04-26 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Navigation bar controlling method and terminal |
US11307743B2 (en) * | 2017-11-07 | 2022-04-19 | Samsung Electronics Co., Ltd. | Method, electronic device and storage medium for providing mode switching |
Also Published As
Publication number | Publication date |
---|---|
US20040046791A1 (en) | 2004-03-11 |
EP1558985A2 (en) | 2005-08-03 |
CA2496774A1 (en) | 2004-03-04 |
US20090007025A1 (en) | 2009-01-01 |
WO2004019200A2 (en) | 2004-03-04 |
US7406666B2 (en) | 2008-07-29 |
AU2003262921A1 (en) | 2004-03-11 |
US7831934B2 (en) | 2010-11-09 |
AU2003262921A8 (en) | 2004-03-11 |
WO2004019200A3 (en) | 2005-02-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7406666B2 (en) | User-interface features for computers with contact-sensitive displays | |
US7966573B2 (en) | Method and system for improving interaction with a user interface | |
US9239673B2 (en) | Gesturing with a multipoint sensing device | |
EP1979804B1 (en) | Gesturing with a multipoint sensing device | |
US9292111B2 (en) | Gesturing with a multipoint sensing device | |
US7898529B2 (en) | User interface having a placement and layout suitable for pen-based computers | |
US7802202B2 (en) | Computer interaction based upon a currently active input device | |
US7644372B2 (en) | Area frequency radial menus | |
US20140123049A1 (en) | Keyboard with gesture-redundant keys removed | |
EP2003538A1 (en) | Method for operating user interface and recording medium for storing program applying the same | |
US20050162402A1 (en) | Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback | |
JP2010521022A (en) | Virtual keyboard input system using a pointing device used in digital equipment | |
US7479943B1 (en) | Variable template input area for a data input device of a handheld electronic system | |
US7571384B1 (en) | Method and system for handwriting recognition with scrolling input history and in-place editing | |
WO2014043275A1 (en) | Gesturing with a multipoint sensing device | |
AU2016238971B2 (en) | Gesturing with a multipoint sensing device | |
AU2014201419B2 (en) | Gesturing with a multipoint sensing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: PALM, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DAVIS, MARK;BERNOULLI, CARLO;REEL/FRAME:014143/0898 Effective date: 20030530 |
|
AS | Assignment |
Owner name: JP MORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT Free format text: SECURITY AGREEMENT;ASSIGNOR:PALM, INC.;REEL/FRAME:020113/0788 Effective date: 20071024 |
|
AS | Assignment |
Owner name: PALM, INC., CALIFORNIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT;REEL/FRAME:024630/0474 Effective date: 20100701 |
|
AS | Assignment |
Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PALM, INC.;REEL/FRAME:025204/0809 Effective date: 20101027 |
|
AS | Assignment |
Owner name: PALM, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P.;REEL/FRAME:030341/0459 Effective date: 20130430 |
|
AS | Assignment |
Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PALM, INC.;REEL/FRAME:031837/0659 Effective date: 20131218 Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PALM, INC.;REEL/FRAME:031837/0239 Effective date: 20131218 Owner name: PALM, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P.;REEL/FRAME:031837/0544 Effective date: 20131218 |
|
AS | Assignment |
Owner name: QUALCOMM INCORPORATED, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HEWLETT-PACKARD COMPANY;HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P.;PALM, INC.;REEL/FRAME:032177/0210 Effective date: 20140123 |
|
STCV | Information on status: appeal procedure |
Free format text: BOARD OF APPEALS DECISION RENDERED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION |