US20120221976A1 - Radial menu display systems and methods - Google Patents

Radial menu display systems and methods

Info

Publication number
US20120221976A1
US20120221976A1 (Application No. US 13/462,403)
Authority
US
United States
Prior art keywords
menu
graphical
radial menu
dimensional radial
graphical representation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/462,403
Inventor
Greg A. Johns
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Verizon Patent and Licensing Inc
Original Assignee
Verizon Patent and Licensing Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Verizon Patent and Licensing Inc filed Critical Verizon Patent and Licensing Inc
Priority to US13/462,403
Assigned to VERIZON PATENT AND LICENSING INC. reassignment VERIZON PATENT AND LICENSING INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: JOHNS, GREG A.
Publication of US20120221976A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

Definitions

  • FIG. 1 illustrates an exemplary radial menu display system.
  • FIG. 2 illustrates an exemplary device having the system of FIG. 1 implemented therein.
  • FIG. 3 illustrates another exemplary implementation of the system of FIG. 1 .
  • FIGS. 4A-4B illustrate a graphical representation of an exemplary two-dimensional radial menu in a graphical user interface (“GUI”).
  • FIG. 5 illustrates GUI 400 with an exemplary category menu view displayed therein.
  • FIG. 6 illustrates a graphical representation of an exemplary three-dimensional radial menu in a GUI.
  • FIG. 7 illustrates an exemplary radial menu display method.
  • FIG. 8 illustrates another exemplary radial menu display method.
  • FIG. 9 illustrates a perspective view of an exemplary three-dimensional radial menu model.
  • the graphical representation of the two-dimensional radial menu and the graphical representation of the three-dimensional radial menu may include a common center point that is repositioned in the graphical user interface upon transforming the graphical representation of the two-dimensional radial menu into a graphical representation of a three-dimensional radial menu in the GUI.
  • the graphical representation of the two-dimensional radial menu may include at least one category menu graphical object that is repositioned in the graphical user interface upon transforming the graphical representation of the two-dimensional radial menu into a graphical representation of a three-dimensional radial menu in the GUI.
  • data representative of a radial menu model may be maintained and utilized to render at least one of the two-dimensional radial menu model and the three-dimensional radial menu model in the GUI.
  • the radial menu model may include a center point, a plurality of category menu objects positioned about the center point at a first radial distance from the center point, and a plurality of application menu objects positioned about the plurality of category menu objects at a second radial distance from the center point.
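  • By way of illustration only, the radial menu model described above might be captured in a data structure like the following TypeScript sketch. The type and field names are assumptions introduced here for clarity, not terms from the disclosure; the sketch simply records a center point, category menu objects at a first radial distance, and application menu objects at a second radial distance.

```typescript
// Hypothetical shape of a radial menu model (names are illustrative assumptions).
interface MenuObject {
  id: string;
  label: string;
  angle: number; // angular position about the center point, in radians
}

interface CategoryMenuObject extends MenuObject {
  applicationIds: string[]; // applications hierarchically related to this category
}

interface RadialMenuModel {
  centerPoint: { x: number; y: number };
  innerRadius: number;              // first radial distance (inner layer of categories)
  outerRadius: number;              // second radial distance (outer layer of applications)
  categories: CategoryMenuObject[]; // inner radial layer
  applications: MenuObject[];       // outer radial layer
}
```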
  • exemplary graphical representations of radial menus disclosed herein may provide a convenient, intuitive, consistent, and centric experience for a user who utilizes one or more of the radial menus to navigate and select one or more features accessible to and/or provided by a computing device.
  • Exemplary embodiments of radial menu display systems and methods will now be described in more detail with reference to the accompanying drawings.
  • FIG. 1 illustrates an exemplary computing system 100 (“system 100 ”) configured to provide one or more graphical representations of radial menus.
  • System 100 may include a communication facility 110 , processing facility 120 , storage facility 130 , input/output (“I/O”) facility 140 , radial menu facility 150 , and user interface facility 160 communicatively coupled to one another as shown in FIG. 1 .
  • the components of system 100 may communicate with one another, including sending data to and receiving data from one another, using any suitable communication technologies.
  • system 100 may include any computing hardware and/or instructions (e.g., software programs), or combinations of computing instructions and hardware, configured to perform one or more of the processes described herein.
  • system 100 may be implemented on one physical computing device or may be implemented on more than one physical computing device.
  • system 100 may include any one of a number of computing devices employing any of a number of computer operating systems.
  • One or more of the processes described herein may be implemented at least in part as computer-executable instructions, i.e., instructions executable by one or more computing devices, tangibly embodied in a computer-readable medium.
  • For example, a processor (e.g., a microprocessor) receives instructions from a computer-readable medium (e.g., from a memory) and executes those instructions, thereby performing one or more processes, including one or more of the processes described herein.
  • Such instructions may be stored and transmitted using a variety of known computer-readable media.
  • a computer-readable medium includes any medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer). Such a medium may take many forms, including, but not limited to, non-volatile media and volatile media. Non-volatile media may include, for example, optical or magnetic disks and other persistent memory. Volatile media may include, for example, dynamic random access memory (“DRAM”), which typically constitutes a main memory.
  • Computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, or any other medium from which a computing device can read.
  • each of the components of system 100 may be implemented as hardware, computing instructions (e.g., software) tangibly embodied on a computer-readable medium, or a combination of hardware and tangibly embodied computing instructions configured to perform one or more of the processes described herein.
  • radial menu facility 150 may be implemented as one or more software applications embodied on one or more computer-readable media and configured to direct processing facility 120 , user interface facility 160 , and/or one or more other components of system 100 to execute one or more of the processes described herein.
  • system 100 may be implemented on one or more devices, as may suit a particular application.
  • FIG. 2 illustrates an exemplary phone device 200 (e.g., a home or business phone console such as a Verizon Hub phone device) having system 100 implemented thereon.
  • Device 200 may include one or more of the components of system 100 shown in FIG. 1 and may be configured to perform one or more of the processes and/or operations described herein.
  • While FIG. 2 illustrates an exemplary phone console device 200, system 100 may be implemented on other devices in other embodiments.
  • Such devices may include, but are not limited to, a communications device, user device, mobile device (e.g., a mobile phone device), handheld device, computer, personal-digital assistant device, set-top box and connected display device (e.g., a television), display device, console device, and any other device configured to perform one or more of the processes and/or operations described herein.
  • device 200 may include input mechanisms such as one or more of the input buttons 220 (e.g., input buttons 220 - 1 and 220 - 2 ) shown in FIG. 2 .
  • Input buttons 220 may be part of I/O facility 140 .
  • FIG. 3 illustrates another exemplary implementation 300 of system 100 .
  • components of system 100 may be distributed across a server subsystem 310 and an access device 320 configured to communicate with server subsystem 310 by way of a network 325 . Distribution of components of system 100 across server subsystem 310 and access device 320 may be arranged as may suit a particular application.
  • I/O facility 140 and user interface facility 160 may be implemented in access device 320 , and one or more of the other facilities may be implemented in server subsystem 310 .
  • I/O facility 140 , radial menu facility 150 , and user interface facility 160 may be implemented in access device 320 , and one or more of the other facilities may be implemented in server subsystem 310 .
  • any component of system 100 may be divided and distributed across server subsystem 310 and access device 320 .
  • radial menu facility 150 and/or user interface facility 160 may be divided and distributed across server subsystem 310 and access device 320 in certain embodiments.
  • Access device 320 may include, but is not limited to, a communications device, mobile device (e.g., a mobile phone device), handheld device, computing device (e.g., a desktop or laptop computer), phone device (e.g., Verizon Hub device), personal-digital assistant device, set-top box and connected display device, gaming device, wireless communications device, and/or any other device having one or more components of system 100 implemented thereon and configured to perform one or more of the processes described herein.
  • Network 325 may include one or more networks, including, but not limited to, wireless networks, mobile telephone networks (e.g., cellular telephone networks), closed media networks, subscriber television networks, cable networks, satellite networks, the Internet, intranets, local area networks, public networks, private networks, optical fiber networks, broadband networks, narrowband networks, voice communications networks, Voice over Internet Protocol (“VoIP”) networks, Public Switched Telephone Networks (“PSTN”), data communications networks, other communications networks, and any other networks capable of carrying communications and/or data between access device 320 and server subsystem 310. Communications between server subsystem 310 and access device 320 may be transported using any one of the above-listed networks, or any combination or sub-combination of the above-listed networks.
  • Access device 320 and server subsystem 310 may communicate over network 325 using any communication platforms and technologies suitable for transporting data and/or communication signals, including known communication technologies, devices, media, and protocols supportive of remote communications, examples of which include, but are not limited to, data transmission media, communications devices, Transmission Control Protocol (“TCP”), Internet Protocol (“IP”), File Transfer Protocol (“FTP”), Telnet, Hypertext Transfer Protocol (“HTTP”), Hypertext Transfer Protocol Secure (“HTTPS”), Session Initiation Protocol (“SIP”), Simple Object Access Protocol (“SOAP”), Extensible Mark-up Language (“XML”) and variations thereof, Simple Mail Transfer Protocol (“SMTP”), Real-Time Transport Protocol (“RTP”), User Datagram Protocol (“UDP”), Global System for Mobile Communications (“GSM”) technologies, Code Division Multiple Access (“CDMA”) technologies, Evolution Data Optimized Protocol (“EVDO”), Time Division Multiple Access (“TDMA”) technologies, Short Message Service (“SMS”), Multimedia Message Service (“MMS”), radio frequency (“RF”) signaling technologies, wireless communication technologies, and other suitable communications technologies.
  • Communication facility 110 may be configured to send, receive, and/or otherwise process data representative of or otherwise associated with communication events.
  • a “communication event” may include any communication between two or more communication devices and/or between two or more persons or entities (“contacts”) by way of the devices.
  • Examples of such communication events may include, but are not limited to, voice communications (e.g., Voice Over IP (“VoIP”), Public Switched Telephone Network (“PSTN”), or other active, attempted, completed, or recorded voice calls and/or messages), text messages (e.g., Short Message Service (“SMS”) messages), media messages (e.g., Multimedia Message Service (“MMS”) messages), e-mail messages, chat messages (e.g., Instant Messaging (“IM”) messages), and subscriber feed messages (e.g., RSS feed messages).
  • Communication facility 110 may employ any suitable technologies for processing communication events, including sending and/or receiving signals representative of or otherwise associated with communication events over one or more communication networks.
  • communication facility 110 implemented on device 200 may be configured to send and/or receive signals representative of or otherwise associated with communication events to/from another device over one or more communication networks.
  • Communication facility 110 may be configured to maintain data representative of communication events. Such data, which may be referred to as “communications data,” may be stored by communication facility 110 and/or on one or more suitable computer-readable media, such as storage facility 130 .
  • Communications data may include any information descriptive of or otherwise associated with one or more communication events.
  • communications data may include contact information descriptive of contacts associated with communication events (e.g., sender and receiver contact information).
  • contact information may include contact identifiers (e.g., contact names), phone numbers, e-mail addresses, and/or other information descriptive of parties to and/or devices associated with communication events.
  • communications data may include time information associated with communication events, including communication time stamps (e.g., start and end times), communication duration information, and any other information descriptive of time information (e.g., time component) associated with communication events.
  • Communications data may also include device identifiers, routing information, media attachments, communication content, address information, communication status information, communication type indicators, and/or other attributes or information descriptive of or otherwise associated with communication events.
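  • As a hedged illustration of the communications data described above (the field names are assumptions, not drawn from the disclosure), a stored communication event record might look like the following sketch:

```typescript
// Hypothetical record for one communication event maintained by communication facility 110.
interface CommunicationEventRecord {
  type: "voice" | "text" | "media" | "email" | "chat" | "feed"; // communication type indicator
  contacts: { name?: string; phoneNumber?: string; email?: string }[]; // sender/receiver contact info
  startTime: Date;        // communication time stamp (start)
  endTime?: Date;         // communication time stamp (end); duration can be derived
  status?: string;        // communication status information
  deviceIds?: string[];   // device identifiers
  attachments?: string[]; // media attachments, if any
  content?: string;       // communication content
}
```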
  • Processing facility 120 may include one or more processors and may be configured to execute and/or direct execution of one or more processes or operations described herein. Processing facility 120 may direct execution of operations in accordance with computer-executable instructions such as may be stored in storage facility 130 or another computer-readable medium. As an example, processing facility 120 may be configured to process data, including demodulating, decoding, and parsing acquired data, and encoding and modulating data for transmission by communication facility 110 .
  • Storage facility 130 may include one or more data storage media, devices, or configurations and may employ any type, form, and combination of storage media.
  • storage facility 130 may include, but is not limited to, a hard drive, network drive, flash drive, magnetic disc, optical disc, RAM, DRAM, other non-volatile and/or volatile storage unit, or a combination or sub-combination thereof.
  • Electronic data, including data described herein, may be temporarily and/or permanently stored in storage facility 130 .
  • I/O facility 140 may be configured to receive user input and provide user output and may include any hardware, firmware, software, or combination thereof supportive of input and output capabilities.
  • I/O facility 140 may include one or more devices for capturing user input, including, but not limited to, a microphone, speech recognition technologies, keyboard or keypad, touch screen component (e.g., touch screen display), receiver (e.g., an RF or infrared receiver), and one or more input buttons.
  • I/O facility 140 may include one or more devices for presenting output to a user, including, but not limited to, a graphics engine, a display (e.g., display 210 ), one or more display drivers, one or more audio speakers, and one or more audio drivers. Output may include audio, visual, textual, and/or haptic output.
  • I/O facility 140 is configured to display one or more GUIs for viewing by a user. Exemplary GUIs and GUI views that may be displayed by I/O facility 140 are described further below.
  • User interface facility 160 may be configured to generate, or direct processing facility 120 to generate, one or more user interfaces.
  • user interface facility 160 may be configured to generate and provide data representing one or more GUIs to I/O facility 140 for display.
  • user interface facility 160 may receive data from radial menu facility 150 and utilize the received data to generate a GUI view for display in a GUI.
  • User interface facility 160 may provide data representative of the GUI to I/O facility 140 for display.
  • exemplary GUIs and GUI views are described further below.
  • Radial menu facility 150 may be configured to generate, provide, render and/or utilize data representative of a radial menu for display in a GUI.
  • radial menu facility 150 may provide data representative of a radial menu to user interface facility 160 for inclusion in a GUI.
  • the data representative of a radial menu may be used by user interface facility 160 to display a graphical representation of a radial menu in a GUI.
  • the graphical representation of the radial menu may include a graphical representation of a two-dimensional radial menu and/or a three-dimensional radial menu. Exemplary graphical representations of a two-dimensional radial menu and a three-dimensional radial menu are described further below.
  • radial menu facility 150 may be configured to provide data representative of a transformation from a graphical representation of one radial menu to a graphical representation of another radial menu in a GUI.
  • radial menu facility 150 may provide data representative of a transformation from a graphical representation of a two-dimensional radial menu to a graphical representation of a three-dimensional radial menu in a GUI.
  • radial menu facility 150 may provide data representative of a transformation from a graphical representation of a three-dimensional radial menu to a graphical representation of a two-dimensional radial menu in a GUI. Exemplary transformations between graphical representations of radial menus are described further below.
  • radial menu facility 150 may be configured to maintain data representative of a radial menu model in a computer-readable medium such as storage facility 130 . Radial menu facility 150 may utilize the data representative of the radial menu model to render, or direct one or more components (e.g., processing facility 120 ) of system 100 to render, a graphical representation of the radial menu model in a GUI.
  • An exemplary radial menu model is described further below.
  • FIGS. 4A, 4B, and 5 illustrate exemplary graphical representations of radial menus that may be displayed in a GUI.
  • FIG. 4A illustrates a GUI 400 including an exemplary graphical representation of a two-dimensional radial menu 405 displayed therein.
  • Graphical representation of two-dimensional radial menu 405 may include a plurality of graphical objects arranged in a two-dimensional radial configuration within GUI 400 .
  • graphical objects are arranged about a center point in a generally circular configuration. This is illustrative only. Other radial configurations (e.g., an arc or a spiral) may be used in other embodiments.
  • graphical representation of two-dimensional radial menu 405 may include a center point graphical object 410 , a plurality of category menu graphical objects 420 (e.g., category menu graphical objects 420 - 1 through 420 - 5 ), and a plurality of application menu graphical objects 430 (e.g., application menu graphical objects 430 - 1 through 430 - 17 ).
  • Center point graphical object 410 may indicate generally a radial center point of graphical representation of two-dimensional radial menu 405 .
  • Center point graphical object 410 may comprise any suitable visual indicator and/or attribute configured to visually represent the radial center point of the graphical representation of two-dimensional radial menu 405 and/or to distinguish center point graphical object 410 from one or more other graphical objects displayed in GUI 400 .
  • Category menu graphical objects 420 may represent categories of applications (e.g., software and/or device applications). Each category may include a menu category associated with one or more applications that share at least one common attribute. Categories may be defined and organized in any suitable way and may be associated with groups of one or more applications based on any attribute(s) common to one or more applications. Examples of such categories may include, but are not limited to, a communications category, an accessories category, a settings category, a games category, an entertainment category, a media category, a software programs category, a “go to” category, and any other menu category with which an application may be associated. As an example, one of the category menu graphical objects 420 in GUI 400 may represent a communications category associated with one or more communications applications, such as voice communication applications, email applications, and/or text messaging applications.
  • Category menu graphical objects 420 may comprise any suitable visual indicator and/or attribute configured to visually represent one or more categories of applications and/or to distinguish category menu graphical objects 420 from one another and/or from one or more other graphical objects displayed in GUI 400 .
  • category menu graphical objects 420 may include text specifying menu categories represented by the category menu graphical objects 420 .
  • one of the category menu graphical objects 420 may represent a communications menu category and may include text (e.g., “communications”) indicative of the communications menu category.
  • Application menu graphical objects 430 may represent one or more applications, which may include any software and/or device applications provided by (e.g., executable by) computing system 100 and/or accessible to a user of computing system 100 .
  • applications may include, but are not limited to, voice communication applications (e.g., phone call applications), email applications, text messaging applications, instant messaging applications, printing applications, security applications, word processing applications, spreadsheet applications, media player applications, device programming applications, web browser applications, gaming applications, widget applications, and/or any other applications that are executable on computing system 100 .
  • Application menu graphical objects 430 may comprise any suitable visual indicator and/or attribute configured to visually represent one or more applications and/or to distinguish application menu graphical objects 430 from one another and/or from one or more other graphical objects displayed in GUI 400 .
  • application menu graphical objects 430 may include text and/or icons specifying applications represented by the application menu graphical objects 430 .
  • application menu graphical object 430 - 2 may represent an application and may include text and/or an icon indicative of the application.
  • Graphical representation of two-dimensional radial menu 405 may visually represent a hierarchical menu organization of applications and menu categories associated with computing system 100 .
  • graphical representation of two-dimensional radial menu 405 may visually represent relationships between applications and menu categories in GUI 400 .
  • application menu graphical objects 430 may be positioned relative to certain category menu graphical objects 420 to visually indicate one or more hierarchical relationships.
  • a position of category menu graphical object 420 - 2 relative to positions of application menu graphical objects 430 - 1 through 430 - 4 may represent that the applications represented by application menu graphical objects 430 - 1 through 430 - 4 are hierarchically related to a menu category represented by menu graphical object 420 - 2 .
  • the relationships between the applications represented by application menu graphical objects 430 - 1 through 430 - 4 and the menu category represented by category menu graphical object 420 - 2 may be visually depicted by alignment of each of the application menu graphical objects 430 - 1 through 430 - 4 with the category menu graphical object 420 - 2 moving in a particular direction away from center point graphical object 410 .
  • category menu graphical objects 420 may be radially aligned to form an inner radial layer of category menu graphical objects 420 positioned about center point graphical object 410 at a first radial distance from center point graphical object 410 .
  • the inner radial layer of category menu graphical objects 420 may substantially encircle center point graphical object 410 .
  • the inner radial layer of category menu graphical objects 420 may contain certain divisions, of any size or proportion, separating the individual category menu graphical objects 420 .
  • application menu graphical objects 430 may be radially aligned to form an outer radial layer of application menu graphical objects 430 positioned about center point graphical object 410 at a second radial distance from center point graphical object 410 .
  • the outer radial layer of application menu graphical objects 430 may substantially encircle the inner radial layer of category menu graphical objects 420 .
  • the outer layer of application menu graphical objects 430 may contain certain divisions, of any size or proportion, separating the individual application menu graphical objects 430 .
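  • As a simple sketch of how such a layout could be computed (offered only as an assumption for illustration; the disclosure does not require evenly spaced divisions), objects in each radial layer can be placed with basic polar-to-Cartesian math:

```typescript
// Place `count` objects evenly about a center point at a given radial distance.
// Even spacing is an assumption; divisions may be of any size or proportion.
function layoutRing(
  center: { x: number; y: number },
  radius: number,
  count: number
): { x: number; y: number }[] {
  const positions: { x: number; y: number }[] = [];
  for (let i = 0; i < count; i++) {
    const angle = (2 * Math.PI * i) / count; // angle measured about the center point
    positions.push({
      x: center.x + radius * Math.cos(angle),
      y: center.y + radius * Math.sin(angle),
    });
  }
  return positions;
}

// Example: an inner ring of 5 category objects and an outer ring of 17 application objects.
const center = { x: 0, y: 0 };
const categoryPositions = layoutRing(center, 100, 5);
const applicationPositions = layoutRing(center, 180, 17);
```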
  • one or more graphical objects may be user selectable in graphical representation of two-dimensional radial menu 405 . Accordingly, a user (e.g., a user of device 200 ) may select a particular graphical object displayed as part of two-dimensional radial menu 405 in GUI 400 .
  • a user selection may be detected in any suitable way through any suitable user interfaces, including touch screens, computer mice, image processing mechanisms, voice recognition mechanisms, buttons, joysticks, or any other user interface capable of detecting a user selection.
  • one or more graphical objects may comprise one or more selectable touch objects displayed on a touch screen and selectable by a physical object (e.g., a finger or thumb) touching the selectable touch object(s). Accordingly, a user may conveniently select any of the graphical objects included in two-dimensional radial menu 405 with a single touch.
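  • One possible way to resolve a single touch to a graphical object, sketched here purely as an assumption, is to convert the touch point to polar coordinates relative to the center point and map the resulting radius and angle to the center, the inner category layer, or the outer application layer:

```typescript
// Classify a touch point by its distance and angle from the radial menu's center point.
// The radius thresholds and sector mapping are hypothetical; any hit-testing scheme could be used.
type HitResult =
  | { kind: "center" }
  | { kind: "category" | "application"; index: number }
  | { kind: "none" };

function hitTest(
  touch: { x: number; y: number },
  center: { x: number; y: number },
  innerRadius: number,
  outerRadius: number,
  categoryCount: number,
  applicationCount: number
): HitResult {
  const dx = touch.x - center.x;
  const dy = touch.y - center.y;
  const r = Math.hypot(dx, dy);
  const angle = (Math.atan2(dy, dx) + 2 * Math.PI) % (2 * Math.PI);

  if (r < innerRadius * 0.5) return { kind: "center" };
  if (r < (innerRadius + outerRadius) / 2) {
    // Inner radial layer: map the angle to a category sector.
    return { kind: "category", index: Math.floor((angle / (2 * Math.PI)) * categoryCount) };
  }
  if (r < outerRadius * 1.25) {
    // Outer radial layer: map the angle to an application sector.
    return { kind: "application", index: Math.floor((angle / (2 * Math.PI)) * applicationCount) };
  }
  return { kind: "none" };
}
```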
  • FIG. 4B illustrates an exemplary graphical response 460 to a user selection of a graphical object in GUI 400 .
  • I/O facility 140 may detect a user selection of application menu graphical object 430 - 2 in GUI 400 , and radial menu facility 150 may instruct one or more components of system 100 to display graphical response 460 indicating the detected user selection of application menu graphical object 430 - 2 in GUI 400 .
  • Graphical response 460 may include any visual indication of a user selection of application menu graphical object 430 - 2 in GUI 400 .
  • application menu graphical object 430 - 2 may be enlarged in GUI 400 .
  • Graphical responses to user selections may include, without limitation, any graphical changes, movements, animation, events, modifications, or any other visual indications of user selections displayed in GUI 400 .
  • a user selection of a graphical object in two-dimensional radial menu 405 may be detected in any suitable way, and one or more predetermined actions may be performed by system 100 in response to the user selection.
  • an application may be launched and/or executed in response to a user selection of an application menu graphical object 430 in two-dimensional radial menu 405 .
  • For example, in response to a user selection of an application menu graphical object 430 representing a voice communication application, system 100 may execute the voice communication application.
  • a user selection of a graphical object in two-dimensional radial menu 405 may cause system 100 to display another view in GUI 400 .
  • system 100 may display a category menu view in GUI 400 in response to a user selection of a category menu graphical object 420 .
  • a user may select category menu graphical object 420 - 2 in GUI 400 .
  • system 100 may display a category menu view of category menu graphical object 420 - 2 in GUI 400 .
  • FIG. 5 illustrates GUI 400 with an exemplary category menu view 500 displayed therein.
  • category menu view 500 may include overview information 510 associated with a category, one or more selectable options 520 associated with the category, and one or more application icons 530 (e.g., application icons 530 - 1 through 530 - 4 ) associated with one or more applications within the category.
  • category menu view 500 may provide a dashboard view of information and options associated with a category.
  • category menu view 500 is associated with a communications category and includes overview information 510 associated with communications (e.g., communications statistics) and application icons 530 associated with communication applications. A user selection of an application icon 530 may cause system 100 to launch the corresponding application.
  • category menu view 500 in FIG. 5 may correspond to a selected category menu graphical object 420 - 2 in FIG. 4A
  • application icons 530 - 1 through 530 - 4 in FIG. 5 may correspond to application menu graphical objects 430 - 1 through 430 - 4 in FIG. 4A
  • category menu view 500 in FIG. 5 and category menu graphical object 420 - 2 in FIG. 4A may represent the same communications category
  • application menu graphical objects 430 - 1 through 430 - 4 in FIG. 4A and application icons 530 - 1 through 530 - 4 in FIG. 5 may represent the same communication applications (e.g., the same email, voice communication, text messaging, and other applications, respectively).
  • FIG. 6 illustrates GUI 400 with a graphical representation of an exemplary three-dimensional radial menu 605 displayed therein.
  • Graphical representation of three-dimensional radial menu 605 may include a plurality of graphical objects arranged in a three-dimensional radial configuration view within GUI 400 .
  • graphical representation of three-dimensional radial menu 605 may include a center point graphical object 610 and a plurality of category menu graphical objects 620 (e.g., category menu graphical objects 620 - 1 through 620 - 3 ) arranged radially about center point graphical object 610 .
  • the arrangement may form a three-dimensional radial layer of category menu graphical objects 620 at least partially or substantially encircling center point graphical object 610 at a certain radial distance from center point graphical object 610 .
  • the radial layer of category menu graphical objects 620 may contain certain divisions, of any size or proportion, separating the individual category menu graphical objects 620 as shown in FIG. 6 .
  • the example shown in FIG. 6 is illustrative only. Other radial configurations (e.g., a three-dimensional arc or spiral) may be used in other embodiments.
  • Center point graphical object 610 may indicate generally a radial center point of graphical representation of three-dimensional radial menu 605 .
  • Center point graphical object 610 may comprise any suitable visual indicator (e.g., an oval or ellipse) and/or attribute configured to visually represent the radial center point of the graphical representation of three-dimensional radial menu 605 and/or to distinguish center point graphical object 610 from one or more other graphical objects displayed in GUI 400 .
  • Category menu graphical objects 620 may represent categories of applications, including any of the categories of application mentioned above. As described above, each category may include a menu category associated with one or more applications that share at least one common attribute. Categories may be defined and organized in any suitable way and may be associated with groups of one or more applications based on any attribute(s) common to one or more applications. In the example illustrated in FIG. 6 , category menu graphical object 620 - 2 in GUI 400 represents a communications category associated with one or more communications applications, such as voice communication applications, email applications, and/or text messaging applications.
  • Category menu graphical objects 620 may comprise any suitable visual indicator and/or attribute configured to visually represent one or more categories of applications and/or to distinguish category menu graphical objects 620 from one another and/or from one or more other graphical objects displayed in GUI 400 . As shown in FIG. 6 , for example, category menu graphical objects 620 may comprise graphical representations of three-dimensional category menu views, such as category menu view 500 of FIG. 5 , arranged in a radial configuration about center point graphical object 610 .
  • category menu graphical object 620 - 2 may represent a communications menu category and may be associated with a plurality of communication applications, which may be represented by communication application icons 630 (e.g., icons 630 - 1 through 630 - 4 ) included with category menu graphical object 620 - 2 .
  • relationships between applications and category menu graphical objects 620 may be represented in other ways.
  • application menu graphical objects may be radially aligned to form an outer layer about center point graphical object 610 at a second radial distance from center point graphical object 610 and at least partially or substantially encircling the radial layer of category menu graphical objects 620 .
  • category menu graphical objects 620 may be transparent or semi-transparent such that application menu graphical objects may be visible through category menu graphical objects 620 .
  • Graphical representation of three-dimensional radial menu 605 may visually represent a hierarchical menu organization of applications and menu categories associated with computing system 100 .
  • graphical representation of three-dimensional radial menu 605 may visually represent relationships between applications and menu categories in GUI 400 .
  • application menu graphical objects and/or application icons 630 may be positioned relative to certain category menu graphical objects 620 to visually indicate one or more hierarchical relationships.
  • a position of application icons 630 relative to a position of category menu graphical object 620-2 (e.g., within or overlaid on category menu graphical object 620-2) may represent that the applications represented by application icons 630 are hierarchically related to a menu category represented by category menu graphical object 620-2.
  • one or more graphical objects may be user selectable in graphical representation of three-dimensional radial menu 605 . Accordingly, a user (e.g., a user of device 200 ) may select a particular graphical object displayed as part of three-dimensional radial menu 605 in GUI 400 . A user selection may be detected in any suitable way through any suitable user interfaces, including any of the ways and/or interfaces mentioned above.
  • one or more graphical objects may comprise one or more selectable touch objects displayed on a touch screen and selectable by a physical object (e.g., a finger or thumb) touching the selectable touch object(s). Accordingly, a user may conveniently select any of the graphical objects included in three-dimensional radial menu 605 with a single touch.
  • a user selection of a graphical object in three-dimensional radial menu 605 may be detected in any suitable way, and one or more predetermined actions may be performed by system 100 in response to the user selection.
  • an application may be launched and/or executed in response to a user selection of an application icon 630 in three-dimensional radial menu 605 .
  • For example, in response to a user selection of application icon 630-2, which may represent a voice communication application, system 100 may execute the voice communication application.
  • a user selection of a graphical object in three-dimensional radial menu 605 may cause system 100 to display another view in GUI 400 .
  • system 100 may display a category menu view in GUI 400 in response to a user selection of a category menu graphical object 620 .
  • a user may select category menu graphical object 620 - 2 in GUI 400 .
  • system 100 may display category menu view 500 shown in FIG. 5 .
  • Graphical representation of three-dimensional radial menu 605 may provide a front-view display of at least one of the category menu graphical objects 620 in GUI 400 .
  • graphical representation of three-dimensional radial menu 605 may include a front-view display of category menu graphical object 620 - 2 with one or more other category menu graphical objects 620 - 1 and 620 - 3 positioned in the periphery of GUI 400 adjacent opposite side edges of category menu graphical object 620 - 2 .
  • System 100 may be configured to pivot category menu graphical objects 620 around center point graphical object 610 in GUI 400 .
  • the pivoting may include moving category menu graphical objects 620 in and/or out of the front-view display shown in FIG. 6 .
  • category menu graphical objects 620 may be pivoted to the left around center point graphical object 610 in GUI 400 , which may cause category menu graphical object 620 - 2 to move out of the front-view display and category menu graphical object 620 - 3 to move into the front-view display.
  • system 100 may be configured to detect user input and pivot category menu graphical objects 620 around center point graphical object 610 in response to the detected user input. Any suitable user input may be defined and used to trigger the pivoting.
  • system 100 may be configured to pivot category menu graphical objects 620 around center point graphical object 610 in response to a finger swipe on a touch screen display (e.g., a sideways finger swipe indicative of a pivot direction), a physical object touching a predetermined area and/or graphical object displayed on a touch screen display, or any other suitable input.
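  • A minimal sketch of this pivoting behavior, assuming that a sideways finger swipe advances the ring by one category position (the gesture-to-rotation mapping is an assumption, not taken from the disclosure):

```typescript
// Pivot the ring of category menu graphical objects around the center point in response to swipes.
class CategoryCarousel {
  private rotation = 0; // current rotation of the category ring, in radians

  constructor(private categoryCount: number) {}

  // A sideways swipe pivots the ring one category position in the indicated direction.
  onSwipe(direction: "left" | "right"): void {
    const step = (2 * Math.PI) / this.categoryCount;
    this.rotation += direction === "left" ? -step : step;
  }

  // Angle at which category i should currently be drawn about the center point.
  angleOf(i: number): number {
    return this.rotation + (2 * Math.PI * i) / this.categoryCount;
  }

  // Index of the category currently occupying the front-view display (angle closest to 0).
  frontCategoryIndex(): number {
    const step = (2 * Math.PI) / this.categoryCount;
    const idx = Math.round(-this.rotation / step) % this.categoryCount;
    return (idx + this.categoryCount) % this.categoryCount;
  }
}
```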
  • the graphical representation of three-dimensional radial menu 605 shown in FIG. 6 may be a partial view of three-dimensional radial menu 605 .
  • one or more category menu graphical objects 620 such as category menu graphical objects 620 - 1 and 620 - 3 may be partially displayed in GUI 400 .
  • certain graphical objects of three-dimensional radial menu 605 may not be displayed at all in GUI 400 .
  • category menu graphical objects 620 - 1 through 620 - 3 are displayed in GUI 400 in FIG. 6
  • one or more other category menu graphical objects associated with three-dimensional radial menu 605 may be positioned outside of or otherwise omitted from GUI 400 .
  • Such other category menu graphical objects may move into and out of GUI 400 when system 100 pivots category menu graphical objects 620 around center point graphical object 610 as described above.
  • One or more graphical objects of three-dimensional radial menu 605 may correspond to one or more graphical objects of two-dimensional radial menu 405 .
  • three-dimensional radial menu 605 and two-dimensional radial menu 405 may share a common center point.
  • center point graphical object 410 and center point graphical object 610 may represent a common center point or center point graphical object shared by three-dimensional radial menu 605 and two-dimensional radial menu 405 .
  • the common center point graphical object may be repositioned in GUI 400 as part of a transition between views of three-dimensional radial menu 605 and two-dimensional radial menu 405 in GUI 400 .
  • category menu graphical objects 620 in three-dimensional radial menu 605 may correspond to category menu graphical objects 420 in two-dimensional radial menu 405 .
  • category menu graphical objects 620 and category menu graphical objects 420 may represent the same set of menu categories.
  • category menu graphical objects 420 and/or 620 may be repositioned in GUI 400 as part of a transition between views of two-dimensional radial menu 405 and three-dimensional radial menu 605 in GUI 400 .
  • application menu icons 630 in three-dimensional radial menu 605 may correspond to application menu graphical objects 430 in two-dimensional radial menu 405 .
  • application menu icons 630 and application menu graphical objects 430 may represent the same applications.
  • application menu graphical objects 430 and/or application menu icons 630 may be repositioned and/or modified in GUI 400 as part of a transition between views of two-dimensional radial menu 405 and three-dimensional radial menu 605 in GUI 400 .
  • application menu graphical objects 430 may merge into category menu graphical objects 620 as application menu icons 630 .
  • category menu graphical object 420 - 2 in FIG. 4A and category menu graphical object 620 - 2 in FIG. 6 may represent the same communications menu category
  • application menu graphical objects 430 - 1 through 430 - 4 in FIG. 4A and application menu icons 630 - 1 through 630 - 4 in FIG. 6 may represent the same communications applications.
  • system 100 may be configured to transition between graphical representations of two-dimensional radial menu 405 and three-dimensional radial menu 605 in GUI 400 .
  • radial menu facility 150 may direct processing facility 120 of system 100 to transform graphical representation of two-dimensional radial menu 405 to graphical representation of three-dimensional radial menu 605 and/or to transform graphical representation of three-dimensional radial menu 605 to graphical representation of two-dimensional radial menu 405 in GUI 400 .
  • system 100 may execute a transition between graphical representations of two-dimensional radial menu 405 and three-dimensional radial menu 605 in GUI 400 in response to detected user input. For example, a user selection of center point graphical object 410 of two-dimensional radial menu 405 in GUI 400 may trigger a transformation from graphical representation of two-dimensional radial menu 405 shown in FIG. 4A to graphical representation of three-dimensional radial menu 605 shown in FIG. 6 . Similarly, a user selection of center point graphical object 610 of three-dimensional radial menu 605 in GUI 400 may trigger a transformation from graphical representation of three-dimensional radial menu 605 shown in FIG. 6 to graphical representation of two-dimensional radial menu 405 shown in FIG. 4A .
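  • Expressed as a sketch (the handler and renderer names below are hypothetical), the trigger amounts to a toggle: selecting the shared center point graphical object switches which graphical representation is rendered.

```typescript
// Toggle between the 2-D and 3-D graphical representations when the shared
// center point graphical object is selected. `render` is a hypothetical callback.
type MenuMode = "2d" | "3d";

function onGraphicalObjectSelected(
  selectedId: string,
  centerPointId: string,
  currentMode: MenuMode,
  render: (mode: MenuMode) => void
): MenuMode {
  if (selectedId !== centerPointId) return currentMode; // only the center point triggers a transform
  const nextMode: MenuMode = currentMode === "2d" ? "3d" : "2d";
  render(nextMode); // transform the displayed graphical representation
  return nextMode;
}
```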
  • FIG. 7 illustrates an exemplary method 700 for radial menu display, which method 700 may include one or more transitions between graphical representations of two-dimensional and three-dimensional radial menus in GUI 400 . While FIG. 7 illustrates exemplary steps according to one embodiment, other embodiments may omit, add to, reorder, and/or modify any of the steps shown in FIG. 7 .
  • a graphical representation of a two-dimensional radial menu may be displayed in a GUI of a computing system.
  • system 100 may display graphical representation of two-dimensional radial menu 405 in GUI 400 , which may include radial menu facility 150 generating and/or providing data representative of two-dimensional radial menu 405 to user interface facility 160 for display in GUI 400 by I/O facility 140 .
  • user input may be detected.
  • the user input may be associated with the graphical representation of two-dimensional radial menu 405 in GUI 400 .
  • System 100 (e.g., I/O facility 140 of system 100 ) may detect the user input.
  • the user input may include any user input predefined to trigger a transition between graphical representations of radial menus.
  • the user input may include a user selection of center point graphical object 410 included in graphical representation of two-dimensional radial menu 405 .
  • the graphical representation of the two-dimensional radial menu may be transformed, in response to the user input, into a graphical representation of a three-dimensional radial menu in the GUI.
  • system 100 may transform graphical representation of a two-dimensional radial menu 405 to graphical representation of three-dimensional radial menu 605 in GUI 400 , which may include radial menu facility 150 providing data representative of the transformation to user interface facility 160 for display in GUI 400 by I/O facility 140 .
  • In step 740 , additional user input may be detected.
  • the additional user input may be associated with the graphical representation of three-dimensional radial menu 605 in GUI 400 .
  • System 100 (e.g., I/O facility 140 of system 100 ) may detect the additional user input.
  • the additional user input may include any user input predefined to trigger a transition between graphical representations of radial menus.
  • the user input may include a user selection of center point graphical object 610 included in graphical representation of three-dimensional radial menu 605 .
  • the graphical representation of the three-dimensional radial menu may be transformed, in response to the additional user input, back into the graphical representation of the two-dimensional radial menu in the GUI.
  • system 100 may transform graphical representation of a three-dimensional radial menu 605 back into graphical representation of two-dimensional radial menu 405 in GUI 400 , which may include radial menu facility 150 providing data representative of the transformation to user interface facility 160 for display in GUI 400 by I/O facility 140 .
  • a transformation from one graphical representation of a radial menu to another graphical representation of a radial menu may be performed by system 100 in any suitable way.
  • system 100 may replace a displayed graphical representation of a radial menu with another graphical representation of a radial menu in GUI 400 .
  • the transformation may be performed in other ways, which may include, without limitation, repositioning graphical objects, rotating graphical objects, re-orienting graphical objects, changing viewpoints relative to graphical objects, zooming in on or out from graphical objects in GUI 400 , adding a third dimension (e.g., a depth dimension along a z-axis) to one or more graphical objects, or any combination or sub-combination thereof.
  • a transformation from graphical representation of two-dimensional radial menu 405 to graphical representation of a three-dimensional radial menu 605 in GUI 400 may include repositioning a common center point graphical object in GUI 400 .
  • center point graphical object 410 of FIG. 4A may be repositioned in GUI 400 to become center point graphical object 610 in FIG. 6 .
  • at least one category menu graphical object 420 of FIG. 4A may be repositioned in GUI 400 to become at least one corresponding category menu graphical object 620 in FIG. 6 .
  • a transformation from graphical representation of two-dimensional radial menu 405 to graphical representation of a three-dimensional radial menu 605 in GUI 400 may also include zooming in on graphical representation of three-dimensional radial menu 605 , or zooming in on at least one or more graphical objects of graphical representation of three-dimensional radial menu 605 , displayed in GUI 400 .
  • a transformation from graphical representation of two-dimensional radial menu 405 to graphical representation of three-dimensional radial menu 605 in GUI 400 may include repositioning a viewpoint associated with graphical representation of two-dimensional radial menu 405 shown in FIG. 4A to produce graphical representation of three-dimensional radial menu 605 shown in FIG. 6 .
  • the repositioning of the viewpoint may include system 100 moving a viewpoint from a top-down viewpoint of graphical representation of two-dimensional radial menu 405 shown in FIG. 4A to a substantially ground-level viewpoint of graphical representation of three-dimensional radial menu 605 shown in FIG. 6 .
  • the substantially ground-level viewpoint of graphical representation of three-dimensional radial menu 605 may be positioned proximate to center point graphical object 610 of graphical representation of three-dimensional radial menu 605 in GUI 400 , as represented in FIG. 6 .
  • Such a transformation from graphical representation of two-dimensional radial menu 405 to graphical representation of three-dimensional radial menu 605 in GUI 400 may be configured to provide a user experience in which, from a perspective of a user, a user vantage point is moving from a top-down, relatively distant view of two-dimensional radial menu 405 to a ground-level, relatively proximate view of three-dimensional radial menu 605 .
  • This may be configured to facilitate a centric user experience that places a user perspective near the center point of three-dimensional radial menu 605 .
  • a transformation from graphical representation of two-dimensional radial menu 405 to graphical representation of three-dimensional radial menu 605 in GUI 400 may maintain consistent navigational principles and/or inputs between the two-dimensional radial menu 405 and the three-dimensional radial menu 605 , with the three-dimensional radial menu 605 providing a more immersive user experience than is provided by the two-dimensional radial menu 405 .
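  • One plausible way to realize such a viewpoint move, offered here only as a sketch (the specific camera coordinates are assumptions), is to interpolate a viewpoint from a top-down, relatively distant position to a ground-level position near the center point:

```typescript
// Interpolate a viewpoint from a top-down view of the two-dimensional menu to a
// ground-level view near the center point for the three-dimensional menu.
interface Viewpoint { x: number; y: number; z: number; }

const topDown: Viewpoint = { x: 0, y: 0, z: 500 };      // directly above, relatively distant
const groundLevel: Viewpoint = { x: 0, y: -40, z: 10 }; // near the center point, relatively proximate

// t runs from 0 (two-dimensional view) to 1 (three-dimensional view).
function interpolateViewpoint(t: number): Viewpoint {
  const lerp = (a: number, b: number) => a + (b - a) * t;
  return {
    x: lerp(topDown.x, groundLevel.x),
    y: lerp(topDown.y, groundLevel.y),
    z: lerp(topDown.z, groundLevel.z),
  };
}
```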
  • system 100 may display one or more animation effects configured to represent a transformation from one graphical representation of a radial menu to another graphical representation of the radial menu.
  • Graphical representations of radial menus may be rendered by system 100 in GUI 400 in any suitable manner.
  • radial menu facility 150 may utilize data representative of a two-dimensional radial menu model to render a graphical representation of two-dimensional radial menu 405 .
  • radial menu facility 150 may utilize data representative of a three-dimensional radial menu model to render a graphical representation of three-dimensional radial menu 605 .
  • radial menu facility 150 may utilize data representative of a single radial menu model to render graphical representations of two-dimensional radial menu 405 and graphical representation of three-dimensional radial menu 605 .
  • FIG. 8 illustrates another exemplary method 800 for radial menu display. While FIG. 8 illustrates exemplary steps according to one embodiment, other embodiments may omit, add to, reorder, and/or modify any of the steps shown in FIG. 8 .
  • data representative of a radial menu model may be maintained in a computer-readable medium of a computing system.
  • radial menu facility 150 may maintain data representative of a radial menu model in storage facility 130 of system 100 .
  • the data may be maintained in any format suitable for representing a radial menu model that may be used to render one or more graphical representations of radial menus in a GUI.
  • the radial menu model may be utilized to render a graphical representation of the radial menu model in a GUI.
  • radial menu facility 150 may utilize data representative of the radial menu model to render graphical representation of two-dimensional radial menu 405 for display in GUI 400 .
  • the radial menu model may be utilized, in response to detected user input, to render another graphical representation of the radial menu model in the GUI.
  • radial menu facility 150 may utilize data representative of the radial menu model to render graphical representation of three-dimensional radial menu 605 for display in GUI 400 .
  • data representative of the radial menu model may be utilized to generate various graphical representations of the radial menu model in GUI 400 .
  • FIG. 9 illustrates a perspective view of an exemplary three-dimensional radial menu model 900 , which may be utilized in method 700 and/or 800 to render one or more graphical representations of a radial menu in GUI 400 , as described above.
  • radial menu model 900 may include a radial configuration of three-dimensional objects, which may include a center point object 910 , category menu objects 920 (e.g., category menu objects 920 - 1 through 920 - 5 ), and application menu objects 930 (e.g., application menu objects 930 - 1 through 930 - 21 ).
  • Center point object 910 may generally indicate a radial center point of radial menu model 900 .
  • Category menu objects 920 may represent categories of applications, including any of the menu categories described above.
  • Application menu objects 930 may represent one or more applications, which may include any software and/or device applications provided by (e.g., executable by) computing system 100 and/or accessible to a user of computing system 100 .
  • Radial menu model 900 may represent a hierarchical menu organization of applications and menu categories associated with computing system 100 .
  • radial menu model 900 may represent relationships between applications and menu categories. Such relationships may be represented by relative positioning of objects in radial menu model 900 .
  • application menu objects 930 may be positioned relative to certain category menu objects 920 to visually indicate one or more hierarchical relationships.
  • In FIG. 9 , for example, application menu objects 930 - 1 through 930 - 4 are positioned adjacent to category menu object 920 - 1 to indicate relationships between the applications represented by application menu objects 930 - 1 through 930 - 4 and the menu category represented by category menu object 920 - 1 .
  • category menu objects 920 may be radially aligned to form an inner radial layer of category menu objects 920 positioned about center point object 910 at a first radial distance from center point object 910 .
  • the inner radial layer of category menu objects 920 may substantially encircle center point object 910 .
  • application menu objects 930 may be radially aligned to form an outer radial layer of application menu objects 930 positioned about center point object 910 at a second radial distance from center point object 910 .
  • the outer radial layer of application menu objects 930 may substantially encircle the inner radial layer of category menu objects 920 .
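A minimal sketch of one way such a radial menu model might be held in memory, with application objects grouped under their category objects and the two radial distances recorded; the class names, field names, and example values are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class ApplicationObject:
    name: str                     # e.g., an email or voice communication application

@dataclass
class CategoryObject:
    name: str                                         # menu category (e.g., "communications")
    applications: list = field(default_factory=list)  # hierarchically related applications

@dataclass
class RadialMenuModel:
    center_label: str
    categories: list                # inner radial layer, encircling the center point
    inner_radius: float = 1.0       # first radial distance (category layer)
    outer_radius: float = 2.0       # second radial distance (application layer)

model = RadialMenuModel(
    center_label="home",
    categories=[
        CategoryObject("communications",
                       [ApplicationObject("email"), ApplicationObject("voice call")]),
        CategoryObject("media", [ApplicationObject("music player")]),
    ],
)
print(len(model.categories), "categories at radius", model.inner_radius)
```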
  • One or more objects included in radial menu model 900 may be utilized by system 100 to generate a graphical representation of at least a portion of radial menu model 900 in GUI 400 .
  • system 100 may utilize one or more objects of radial menu model 900 , or another similarly configured radial menu model, to generate the graphical representation of two-dimensional radial menu 405 shown in FIG. 4A and/or the graphical representation of three-dimensional radial menu 605 shown in FIG. 6 .
  • system 100 may be configured to move a viewpoint relative to radial menu model 900 to generate various views of radial menu model 900 in GUI 400 .
  • system 100 may use a first viewpoint positioned directly above and a certain distance away from radial menu model 900 to generate the graphical representation of two-dimensional radial menu 405 shown in FIG. 4A .
  • System 100 may reposition the first viewpoint relative to radial menu model 900 to generate another view of radial menu model 900 .
  • the viewpoint may be moved from a first position directly above radial menu model 900 to a second position that provides an angled, zoomed-in perspective view of radial menu model 900 .
  • Such movements of the viewpoint may be used to transform the graphical representation of two-dimensional radial menu 405 shown in FIG. 4A to the graphical representation of three-dimensional radial menu 605 shown in FIG. 6 .
  • movement of a viewpoint relative to radial menu model 900 may be animated in real time in GUI 400 , which may cause radial menu model 900 to appear to be tilted, re-oriented, enlarged, minimized, or otherwise manipulated in GUI 400 .
  • various viewpoints of radial menu model 900 may be used by system 100 to display various graphical representations of radial menu model 900 in GUI 400 .
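As a rough sketch of the viewpoint idea (not the disclosed implementation), a point of the model can be projected into GUI coordinates under different viewpoint elevations: at 90 degrees the layout reads as the flat, two-dimensional representation, while a low elevation foreshortens it toward an angled, zoomed-in view. The `project` function and its parameters are assumptions introduced here.

```python
import math

def project(radius, angle_deg, elevation_deg, zoom=1.0):
    """Project a model point lying on the menu's ground plane into 2-D GUI
    coordinates for a viewpoint at the given elevation angle.

    At elevation 90 degrees (directly above) the layout appears as a flat
    circle, matching the two-dimensional representation; at low elevations the
    same point foreshortens toward an angled, perspective-style view."""
    a = math.radians(angle_deg)
    x, y = radius * math.cos(a), radius * math.sin(a)
    foreshorten = math.sin(math.radians(elevation_deg))
    return zoom * x, zoom * y * foreshorten

print(project(2.0, 45.0, elevation_deg=90.0))            # top-down view
print(project(2.0, 45.0, elevation_deg=10.0, zoom=3.0))  # angled, zoomed-in view
```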
  • a radial menu may additionally include one or more sub-categories, which may further classify and/or hierarchically organize applications based on one or more common attributes.
  • a sub-category may group applications based on one or more common attributes that are more specific than common attributes of a category. For example, within a communications category represented by a category menu graphical object, a sub-category menu graphical object may represent a sub-category of voice communication applications while another sub-category menu graphical object may represent a sub-category of Internet communication applications.
  • a sub-category may be represented by a sub-category menu graphical object.
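A hypothetical sketch of how a sub-category level could slot into the hierarchy, using the voice/Internet communication example above; the structure and names are illustrative only.

```python
# Category -> sub-category -> applications; every name here is illustrative.
menu_hierarchy = {
    "communications": {                                  # category (common attribute)
        "voice communication": ["phone call"],           # sub-category (more specific)
        "Internet communication": ["email", "chat"],     # sub-category (more specific)
    },
}
print(menu_hierarchy["communications"]["Internet communication"])
```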

Abstract

In certain embodiments, a graphical representation of a two-dimensional radial menu is displayed in a graphical user interface. The graphical representation of the two-dimensional radial menu is transformed into a graphical representation of a three-dimensional radial menu in the graphical user interface. In certain embodiments, the displaying comprises utilizing data representative of a three-dimensional radial menu model to render the graphical representation of the two-dimensional radial menu, based on a first viewpoint, in the graphical user interface, and the transforming comprises utilizing the data representative of the three-dimensional radial menu model to render the graphical representation of the three-dimensional radial menu, based on a second viewpoint, in the graphical user interface. In certain embodiments, the transforming comprises repositioning a viewpoint associated with the graphical representation of the two-dimensional radial menu to produce the graphical representation of the three-dimensional radial menu.

Description

    RELATED APPLICATIONS
  • This application is a continuation application of U.S. patent application Ser. No. 12/492,277, filed on Jun. 26, 2009, and entitled “RADIAL MENU DISPLAY SYSTEMS AND METHODS,” which is hereby incorporated by reference in its entirety.
  • BACKGROUND INFORMATION
  • Advances in electronic technologies and devices have put a wide variety of applications, features, and information at people's fingertips. The proliferation of such applications, features, and information on electronic devices has challenged designers of user interfaces for the electronic devices. For example, a common challenge has been to design and implement user interface elements that provide an intuitive and appropriate balance of information, usability, aesthetics, and functionality. The difficulty of the challenge is exacerbated for electronic devices that have limited resources and/or that are small in size, such as a phone device.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings illustrate various embodiments and are a part of the specification. The illustrated embodiments are merely examples and do not limit the scope of the disclosure. Throughout the drawings, identical or similar reference numbers designate identical or similar elements.
  • FIG. 1 illustrates an exemplary radial menu display system.
  • FIG. 2 illustrates an exemplary device having the system of FIG. 1 implemented therein.
  • FIG. 3 illustrates another exemplary implementation of the system of FIG. 1.
  • FIGS. 4A-4B illustrate a graphical representation of an exemplary two-dimensional radial menu in a graphical user interface (“GUI”).
  • FIG. 5 illustrates a graphical representation of an exemplary category menu view in a GUI.
  • FIG. 6 illustrates a graphical representation of an exemplary three-dimensional radial menu in a GUI.
  • FIG. 7 illustrates an exemplary radial menu display method.
  • FIG. 8 illustrates another exemplary radial menu display method.
  • FIG. 9 illustrates a perspective view of an exemplary three-dimensional radial menu model.
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • Exemplary radial menu display systems and methods are described herein. In certain embodiments, a graphical representation of a two-dimensional radial menu may be displayed in a graphical user interface (“GUI”). The graphical representation of the two-dimensional radial menu may be transformed into a graphical representation of a three-dimensional radial menu in the GUI. In certain embodiments, a user selection of a center point graphical object included in the graphical representation of the two-dimensional radial menu may be detected and the transformation of the graphical representation of the two-dimensional radial menu into the graphical representation of a three-dimensional radial menu in the GUI may be executed in response to the user selection.
  • In certain embodiments, the graphical representation of the two-dimensional radial menu and the graphical representation of the three-dimensional radial menu may include a common center point that is repositioned in the graphical user interface upon transforming the graphical representation of the two-dimensional radial menu into a graphical representation of a three-dimensional radial menu in the GUI. In certain embodiments, the graphical representation of the two-dimensional radial menu may include at least one category menu graphical object that is repositioned in the graphical user interface upon transforming the graphical representation of the two-dimensional radial menu into a graphical representation of a three-dimensional radial menu in the GUI.
  • In certain embodiments, data representative of a radial menu model may be maintained and utilized to render at least one of the graphical representation of the two-dimensional radial menu and the graphical representation of the three-dimensional radial menu in the GUI. The radial menu model may include a center point, a plurality of category menu objects positioned about the center point at a first radial distance from the center point, and a plurality of application menu objects positioned about the plurality of category menu objects at a second radial distance from the center point.
  • The exemplary graphical representations of radial menus disclosed herein may provide a convenient, intuitive, consistent, and centric experience for a user who utilizes one or more of the radial menus to navigate and select one or more features accessible to and/or provided by a computing device. Exemplary embodiments of radial menu display systems and methods will now be described in more detail with reference to the accompanying drawings.
  • FIG. 1 illustrates an exemplary computing system 100 (“system 100”) configured to provide one or more graphical representations of radial menus. System 100 may include a communication facility 110, processing facility 120, storage facility 130, input/output (“I/O”) facility 140, radial menu facility 150, and user interface facility 160 communicatively coupled to one another as shown in FIG. 1. The components of system 100 may communicate with one another, including sending data to and receiving data from one another, using any suitable communication technologies.
  • In some examples, system 100, or one or more components of system 100, may include any computing hardware and/or instructions (e.g., software programs), or combinations of computing instructions and hardware, configured to perform one or more of the processes described herein. In particular, it should be understood that system 100, or one or more components of system 100, may be implemented on one physical computing device or may be implemented on more than one physical computing device. Accordingly, system 100 may include any one of a number of computing devices employing any of a number of computer operating systems.
  • One or more of the processes described herein may be implemented at least in part as computer-executable instructions, i.e., instructions executable by one or more computing devices, tangibly embodied in a computer-readable medium. In general, a processor (e.g., a microprocessor) receives instructions, from a computer-readable medium (e.g., from a memory, etc.), and executes those instructions, thereby performing one or more processes, including one or more of the processes described herein. Such instructions may be stored and transmitted using a variety of known computer-readable media.
  • A computer-readable medium (also referred to as a processor-readable medium) includes any medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer). Such a medium may take many forms, including, but not limited to, non-volatile media and volatile media. Non-volatile media may include, for example, optical or magnetic disks and other persistent memory. Volatile media may include, for example, dynamic random access memory (“DRAM”), which typically constitutes a main memory. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, or any other medium from which a computing device can read.
  • Accordingly, each of the components of system 100 may be implemented as hardware, computing instructions (e.g., software) tangibly embodied on a computer-readable medium, or a combination of hardware and tangibly embodied computing instructions configured to perform one or more of the processes described herein. In certain embodiments, for example, radial menu facility 150 may be implemented as one or more software applications embodied on one or more computer-readable media and configured to direct processing facility 120, user interface facility 160, and/or one or more other components of system 100 to execute one or more of the processes described herein.
  • In certain embodiments, system 100 may be implemented on one or more devices, as may suit a particular application. For example, FIG. 2 illustrates an exemplary phone device 200 (e.g., a home or business phone console such as a Verizon Hub phone device) having system 100 implemented thereon. Device 200 may include one or more of the components of system 100 shown in FIG. 1 and may be configured to perform one or more of the processes and/or operations described herein. While FIG. 2 illustrates an exemplary phone console device 200, system 100 may be implemented on other devices in other embodiments. Such devices may include, but are not limited to, a communications device, user device, mobile device (e.g., a mobile phone device), handheld device, computer, personal-digital assistant device, set-top box and connected display device (e.g., a television), display device, console device, and any other device configured to perform one or more of the processes and/or operations described herein.
  • As shown in FIG. 2, device 200 may include a display 210, which may be part of I/O facility 140 and may include one or more display components and technologies configured to display one or more GUIs for viewing by a user of device 200. For example, display 210 may include a display screen configured to display one or more GUIs for viewing by a user of device 200. In certain implementations, the display screen may comprise a touch screen display configured to receive touch input. The touch screen display may employ any suitable single-touch and/or multi-touch touch screen technologies. Examples of GUIs and various GUI views that may be displayed on a display such as display 210 are described in detail further below. In addition to display 210, device 200 may include input mechanisms such as one or more of the input buttons 220 (e.g., input buttons 220-1 and 220-2) shown in FIG. 2. Input buttons 220 may be part of I/O facility 140.
  • The implementation of system 100 shown in FIG. 2 is illustrative only. Other embodiments may include alternative implementations. As an example, FIG. 3 illustrates another exemplary implementation 300 of system 100. In implementation 300, components of system 100 may be distributed across a server subsystem 310 and an access device 320 configured to communicate with server subsystem 310 by way of a network 325. Distribution of components of system 100 across server subsystem 310 and access device 320 may be arranged as may suit a particular application. In certain examples, I/O facility 140 and user interface facility 160 may be implemented in access device 320, and one or more of the other facilities may be implemented in server subsystem 310. In other examples, I/O facility 140, radial menu facility 150, and user interface facility 160 may be implemented in access device 320, and one or more of the other facilities may be implemented in server subsystem 310. In yet other examples, any component of system 100 may be divided and distributed across server subsystem 310 and access device 320. For instance, radial menu facility 150 and/or user interface facility 160 may be divided and distributed across server subsystem 310 and access device 320 in certain embodiments.
  • Server subsystem 310 may include at least one server with one or more of the components of system 100 implemented thereon, and access device 320 may include any suitable device with one or more components of system 100 implemented thereon. In certain embodiments, for example, access device 320 may include I/O facility 140, or user interface facility 160 and I/O facility 140, such that access device 320 is configured to generate and/or display one or more of the GUIs described herein for viewing by a user 330 of access device 320. Access device 320 may include, but is not limited to, a communications device, mobile device (e.g., a mobile phone device), handheld device, computing device (e.g., a desktop or laptop computer), phone device (e.g., Verizon Hub device), personal-digital assistant device, set-top box and connected display device, gaming device, wireless communications device, and/or any other device having one or more components of system 100 implemented thereon and configured to perform one or more of the processes described herein.
  • Network 325 may include one or more networks, including, but not limited to, wireless networks, mobile telephone networks (e.g., cellular telephone networks), closed media networks, subscriber television networks, cable networks, satellite networks, the Internet, intranets, local area networks, public networks, private networks, optical fiber networks, broadband networks, narrowband networks, voice communications networks, Voice over Internet Protocol ("VoIP") networks, Public Switched Telephone Networks ("PSTN"), data communications networks, other communications networks, and any other networks capable of carrying communications and/or data between access device 320 and server subsystem 310. Communications between server subsystem 310 and access device 320 may be transported using any one of the above-listed networks, or any combination or sub-combination of the above-listed networks.
  • Access device 320 and server subsystem 310 may communicate over network 325 using any communication platforms and technologies suitable for transporting data and/or communication signals, including known communication technologies, devices, media, and protocols supportive of remote communications, examples of which include, but are not limited to, data transmission media, communications devices, Transmission Control Protocol (“TCP”), Internet Protocol (“IP”), File Transfer Protocol (“FTP”), Telnet, Hypertext Transfer Protocol (“HTTP”), Hypertext Transfer Protocol Secure (“HTTPS”), Session Initiation Protocol (“SIP”), Simple Object Access Protocol (“SOAP”), Extensible Mark-up Language (“XML”) and variations thereof, Simple Mail Transfer Protocol (“SMTP”), Real-Time Transport Protocol (“RTP”), User Datagram Protocol (“UDP”), Global System for Mobile Communications (“GSM”) technologies, Code Division Multiple Access (“CDMA”) technologies, Evolution Data Optimized Protocol (“EVDO”), Time Division Multiple Access (“TDMA”) technologies, Short Message Service (“SMS”), Multimedia Message Service (“MMS”), radio frequency (“RF”) signaling technologies, wireless communication technologies (e.g., Bluetooth, Wi-Fi, etc.), in-band and out-of-band signaling technologies, and other suitable communications technologies.
  • Returning to FIG. 1, each of the components shown therein will now be described in additional detail.
  • Communication facility 110 may be configured to send, receive, and/or otherwise process data representative of or otherwise associated with communication events. As used herein, a “communication event” may include any communication between two or more communication devices and/or between two or more persons or entities (“contacts”) by way of the devices. Examples of such communication events may include, but are not limited to, voice communications (e.g., Voice Over IP (“VoIP”), Public Switched Telephone Network (“PSTN”), or other active, attempted, completed, or recorded voice calls and/or messages), text messages (e.g., Short Message Service (“SMS”) messages), media messages (e.g., Multimedia Message Service (“MMS”) messages), e-mail messages, chat messages (e.g., Instant Messaging (“IM”) messages), and subscriber feed messages (e.g., RSS feed messages).
  • Communication facility 110 may employ any suitable technologies for processing communication events, including sending and/or receiving signals representative of or otherwise associated with communication events over one or more communication networks. As an example, communication facility 110 implemented on device 200 may be configured to send and/or receive signals representative of or otherwise associated with communication events to/from another device over one or more communication networks.
  • Communication facility 110 may be configured to maintain data representative of communication events. Such data, which may be referred to as “communications data,” may be stored by communication facility 110 and/or on one or more suitable computer-readable media, such as storage facility 130. Communications data may include any information descriptive of or otherwise associated with one or more communication events. For example, communications data may include contact information descriptive of contacts associated with communication events (e.g., sender and receiver contact information). Such contact information may include contact identifiers (e.g., contact names), phone numbers, e-mail addresses, and/or other information descriptive of parties to and/or devices associated with communication events. As another example, communications data may include time information associated with communication events, including communication time stamps (e.g., start and end times), communication duration information, and any other information descriptive of time information (e.g., time component) associated with communication events. Communications data may also include device identifiers, routing information, media attachments, communication content, address information, communication status information, communication type indicators, and/or other attributes or information descriptive of or otherwise associated with communication events.
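A hedged sketch of what a single record of such communications data might look like; the `CommunicationEvent` class and its fields are assumptions chosen to mirror the attributes listed above, not a structure taken from the disclosure.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class CommunicationEvent:
    """Illustrative record of communications data for one communication event."""
    event_type: str                      # e.g., "voice", "SMS", "MMS", "e-mail", "chat"
    sender: str                          # contact identifier of the sending party
    receiver: str                        # contact identifier of the receiving party
    start_time: datetime                 # communication time stamp (start)
    end_time: Optional[datetime] = None  # communication time stamp (end), if completed
    status: str = "completed"            # communication status information
    content: Optional[str] = None        # message content or media attachment reference

event = CommunicationEvent("SMS", "Alice", "Bob", datetime(2009, 6, 26, 9, 30),
                           content="See you at noon")
print(event)
```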
  • Processing facility 120 may include one or more processors and may be configured to execute and/or direct execution of one or more processes or operations described herein. Processing facility 120 may direct execution of operations in accordance with computer-executable instructions such as may be stored in storage facility 130 or another computer-readable medium. As an example, processing facility 120 may be configured to process data, including demodulating, decoding, and parsing acquired data, and encoding and modulating data for transmission by communication facility 110.
  • Storage facility 130 may include one or more data storage media, devices, or configurations and may employ any type, form, and combination of storage media. For example, storage facility 130 may include, but is not limited to, a hard drive, network drive, flash drive, magnetic disc, optical disc, RAM, DRAM, other non-volatile and/or volatile storage unit, or a combination or sub-combination thereof. Electronic data, including data described herein, may be temporarily and/or permanently stored in storage facility 130.
  • I/O facility 140 may be configured to receive user input and provide user output and may include any hardware, firmware, software, or combination thereof supportive of input and output capabilities. For example, I/O facility 140 may include one or more devices for capturing user input, including, but not limited to, a microphone, speech recognition technologies, keyboard or keypad, touch screen component (e.g., touch screen display), receiver (e.g., an RF or infrared receiver), and one or more input buttons.
  • I/O facility 140 may include one or more devices for presenting output to a user, including, but not limited to, a graphics engine, a display (e.g., display 210), one or more display drivers, one or more audio speakers, and one or more audio drivers. Output may include audio, visual, textual, and/or haptic output. In certain embodiments, for example, I/O facility 140 is configured to display one or more GUIs for viewing by a user. Exemplary GUIs and GUI views that may be displayed by I/O facility 140 are described further below.
  • User interface facility 160 may be configured to generate, or direct processing facility 120 to generate, one or more user interfaces. For example, user interface facility 160 may be configured to generate and provide data representing one or more GUIs to I/O facility 140 for display. In certain embodiments, user interface facility 160 may receive data from radial menu facility 150 and utilize the received data to generate a GUI view for display in a GUI. User interface facility 160 may provide data representative of the GUI to I/O facility 140 for display. As mentioned, exemplary GUIs and GUI views are described further below.
  • Radial menu facility 150 may be configured to generate, provide, render and/or utilize data representative of a radial menu for display in a GUI. For example, radial menu facility 150 may provide data representative of a radial menu to user interface facility 160 for inclusion in a GUI. The data representative of a radial menu may be used by user interface facility 160 to display a graphical representation of a radial menu in a GUI. The graphical representation of the radial menu may include a graphical representation of a two-dimensional radial menu and/or a three-dimensional radial menu. Exemplary graphical representations of a two-dimensional radial menu and a three-dimensional radial menu are described further below.
  • In certain embodiments, radial menu facility 150 may be configured to provide data representative of a transformation from a graphical representation of one radial menu to a graphical representation of another radial menu in a GUI. For example, radial menu facility 150 may provide data representative of a transformation from a graphical representation of a two-dimensional radial menu to a graphical representation of a three-dimensional radial menu in a GUI. Alternatively or additionally, radial menu facility 150 may provide data representative of a transformation from a graphical representation of a three-dimensional radial menu to a graphical representation of a two-dimensional radial menu in a GUI. Exemplary transformations between graphical representations of radial menus are described further below.
  • In certain embodiments, radial menu facility 150 may be configured to maintain data representative of a radial menu model in a computer-readable medium such as storage facility 130. Radial menu facility 150 may utilize the data representative of the radial menu model to render, or direct one or more components (e.g., processing facility 120) of system 100 to render, a graphical representation of the radial menu model in a GUI. An exemplary radial menu model is described further below.
  • To help facilitate an understanding of radial menu facility 150 and radial menus, FIGS. 4A, 4B, and 5 illustrate exemplary graphical representations of radial menus that may be displayed in a GUI. FIG. 4A illustrates a GUI 400 including an exemplary graphical representation of a two-dimensional radial menu 405 displayed therein. Graphical representation of two-dimensional radial menu 405 may include a plurality of graphical objects arranged in a two-dimensional radial configuration within GUI 400. In the illustrated example, graphical objects are arranged about a center point in a generally circular configuration. This is illustrative only. Other radial configurations (e.g., an arc or a spiral) may be used in other embodiments.
  • As illustrated in FIG. 4A, graphical representation of two-dimensional radial menu 405 may include a center point graphical object 410, a plurality of category menu graphical objects 420 (e.g., category menu graphical objects 420-1 through 420-5), and a plurality of application menu graphical objects 430 (e.g., application menu graphical objects 430-1 through 430-17). Center point graphical object 410 may indicate generally a radial center point of graphical representation of two-dimensional radial menu 405. Center point graphical object 410 may comprise any suitable visual indicator and/or attribute configured to visually represent the radial center point of the graphical representation of two-dimensional radial menu 405 and/or to distinguish center point graphical object 410 from one or more other graphical objects displayed in GUI 400.
  • Category menu graphical objects 420 may represent categories of applications (e.g., software and/or device applications). Each category may include a menu category associated with one or more applications that share at least one common attribute. Categories may be defined and organized in any suitable way and may be associated with groups of one or more applications based on any attribute(s) common to one or more applications. Examples of such categories may include, but are not limited to, a communications category, an accessories category, a settings category, a games category, an entertainment category, a media category, a software programs category, a “go to” category, and any other menu category with which an application may be associated. As an example, one of the category menu graphical objects 420 in GUI 400 may represent a communications category associated with one or more communications applications, such as voice communication applications, email applications, and/or text messaging applications.
  • Category menu graphical objects 420 may comprise any suitable visual indicator and/or attribute configured to visually represent one or more categories of applications and/or to distinguish category menu graphical objects 420 from one another and/or from one or more other graphical objects displayed in GUI 400. In certain embodiments, for example, category menu graphical objects 420 may include text specifying menu categories represented by the category menu graphical objects 420. For instance, one of the category menu graphical objects 420 may represent a communications menu category and may include text (e.g., “communications”) indicative of the communications menu category.
  • Application menu graphical objects 430 may represent one or more applications, which may include any software and/or device applications provided by (e.g., executable by) computing system 100 and/or accessible to a user of computing system 100. Examples of such applications may include, but are not limited to, voice communication applications (e.g., phone call applications), email applications, text messaging applications, instant messaging applications, printing applications, security applications, word processing applications, spreadsheet applications, media player applications, device programming applications, web browser applications, gaming applications, widget applications, and/or any other applications that are executable on computing system 100.
  • Application menu graphical objects 430 may comprise any suitable visual indicator and/or attribute configured to visually represent one or more applications and/or to distinguish application menu graphical objects 430 from one another and/or from one or more other graphical objects displayed in GUI 400. In certain embodiments, for example, application menu graphical objects 430 may include text and/or icons specifying applications represented by the application menu graphical objects 430. For instance, application menu graphical object 430-2 may represent an application and may include text and/or an icon indicative of the application.
  • Graphical representation of two-dimensional radial menu 405 may visually represent a hierarchical menu organization of applications and menu categories associated with computing system 100. In particular, graphical representation of two-dimensional radial menu 405 may visually represent relationships between applications and menu categories in GUI 400. For instance, application menu graphical objects 430 may be positioned relative to certain category menu graphical objects 420 to visually indicate one or more hierarchical relationships. As an example, a position of category menu graphical object 420-2 relative to positions of application menu graphical objects 430-1 through 430-4 may represent that the applications represented by application menu graphical objects 430-1 through 430-4 are hierarchically related to a menu category represented by category menu graphical object 420-2. In the illustrated example, the relationships between the applications represented by application menu graphical objects 430-1 through 430-4 and the menu category represented by category menu graphical object 420-2 may be visually depicted by alignment of each of the application menu graphical objects 430-1 through 430-4 with the category menu graphical object 420-2 moving in a particular direction away from center point graphical object 410.
  • As illustrated in FIG. 4A, category menu graphical objects 420 may be radially aligned to form an inner radial layer of category menu graphical objects 420 positioned about center point graphical object 410 at a first radial distance from center point graphical object 410. In certain embodiments, the inner radial layer of category menu graphical objects 420 may substantially encircle center point graphical object 410. As shown in the illustrated example, the inner radial layer of category menu graphical objects 420 may contain certain divisions, of any size or proportion, separating the individual category menu graphical objects 420.
  • In addition, application menu graphical objects 430 may be radially aligned to form an outer radial layer of application menu graphical objects 430 positioned about center point graphical object 410 at a second radial distance from center point graphical object 410. In certain embodiments, the outer radial layer of application menu graphical objects 430 may substantially encircle the inner radial layer of category menu graphical objects 420. As shown in the illustrated example, the outer layer of application menu graphical objects 430 may contain certain divisions, of any size or proportion, separating the individual application menu graphical objects 430.
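As an illustrative sketch (not the disclosed layout algorithm), inner-ring category positions and outer-ring application positions can be computed so that each application falls within its category's angular sector moving outward from the center point; the function name, radii, and sector math are assumptions introduced here.

```python
import math

def layout_2d(categories, inner_radius=1.0, outer_radius=2.0):
    """Place category objects on an inner ring and their applications on an
    outer ring within the same angular sector, so each application lines up
    with its category moving outward from the center point.

    `categories` is a list of (category_name, [application_names]) pairs."""
    positions = {}
    sector = 2 * math.pi / len(categories)
    for i, (category, apps) in enumerate(categories):
        mid = (i + 0.5) * sector
        positions[category] = (inner_radius * math.cos(mid), inner_radius * math.sin(mid))
        for j, app in enumerate(apps):
            a = i * sector + (j + 0.5) * sector / max(len(apps), 1)
            positions[app] = (outer_radius * math.cos(a), outer_radius * math.sin(a))
    return positions

print(layout_2d([("communications", ["email", "voice call"]), ("media", ["music"])]))
```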
  • In certain embodiments, one or more graphical objects, such as center point graphical object 410, category menu graphical objects 420, and/or application menu graphical objects 430, may be user selectable in graphical representation of two-dimensional radial menu 405. Accordingly, a user (e.g., a user of device 200) may select a particular graphical object displayed as part of two-dimensional radial menu 405 in GUI 400. A user selection may be detected in any suitable way through any suitable user interfaces, including touch screens, computer mice, image processing mechanisms, voice recognition mechanisms, buttons, joysticks, or any other user interface capable of detecting a user selection. For example, one or more graphical objects, such as center point graphical object 410, category menu graphical objects 420, and/or application menu graphical objects 430, may comprise one or more selectable touch objects displayed on a touch screen that may be selected by a physical object (e.g., a finger or thumb) touching the selectable touch object(s). Accordingly, a user may conveniently select any of the graphical objects included in two-dimensional radial menu 405 with a single touch.
  • FIG. 4B illustrates an exemplary graphical response 460 to a user selection of a graphical object in GUI 400. In one example, I/O facility 140 may detect a user selection of application menu graphical object 430-2 in GUI 400, and radial menu facility 150 may instruct one or more components of system 100 to display graphical response 460 indicating the detected user selection of application menu graphical object 430-2 in GUI 400. Graphical response 460 may include any visual indication of a user selection of application menu graphical object 430-2 in GUI 400. For example, application menu graphical object 430-2 may be enlarged in GUI 400. Graphical responses to user selections may include, without limitation, any graphical changes, movements, animation, events, modifications, or any other visual indications of user selections displayed in GUI 400.
  • A user selection of a graphical object in two-dimensional radial menu 405 may be detected in any suitable way, and one or more predetermined actions may be performed by system 100 in response to the user selection. In certain embodiments, an application may be launched and/or executed in response to a user selection of an application menu graphical object 430 in two-dimensional radial menu 405. In response to a user selection of application menu graphical object 430-2, which may represent a voice communication application, for example, system 100 may execute the voice communication application.
  • In certain embodiments, a user selection of a graphical object in two-dimensional radial menu 405 may cause system 100 to display another view in GUI 400. For example, system 100 may display a category menu view in GUI 400 in response to a user selection of a category menu graphical object 420. As an example, a user may select category menu graphical object 420-2 in GUI 400. In response, system 100 may display a category menu view of category menu graphical object 420-2 in GUI 400.
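A minimal, assumption-laden sketch of how a touch might be mapped to a menu object and its predetermined action, using only the distance and angle of the touch from the center point; the radii, tolerance, and returned action strings are illustrative and not part of the disclosure.

```python
import math

def handle_touch(x, y, inner_radius=1.0, outer_radius=2.0, center_tolerance=0.3):
    """Classify a touch point in menu coordinates (center point at the origin)
    and name the predetermined action it would trigger."""
    distance = math.hypot(x, y)
    angle = math.degrees(math.atan2(y, x)) % 360
    if distance <= center_tolerance:
        return "transform between 2-D and 3-D representations"
    if distance <= inner_radius:
        return f"show category menu view for sector at {angle:.0f} degrees"
    if distance <= outer_radius:
        return f"launch application in sector at {angle:.0f} degrees"
    return "no menu object selected"

print(handle_touch(0.1, 0.0))   # center point selection
print(handle_touch(0.0, 0.9))   # category layer selection
print(handle_touch(1.5, 1.0))   # application layer selection
```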
  • FIG. 5 illustrates GUI 400 with an exemplary category menu view 500 displayed therein. As shown in FIG. 5, category menu view 500 may include overview information 510 associated with a category, one or more selectable options 520 associated with the category, and one or more application icons 530 (e.g., application icons 530-1 through 530-4) associated with one or more applications within the category. Hence, category menu view 500 may provide a dashboard view of information and options associated with a category. In the illustrated example, category menu view 500 is associated with a communications category and includes overview information 510 associated with communications (e.g., communications statistics) and application icons 530 associated with communication applications. A user selection of an application icon 530 may cause system 100 to launch the corresponding application.
  • In certain examples, category menu view 500 in FIG. 5 may correspond to a selected category menu graphical object 420-2 in FIG. 4A, and application icons 530-1 through 530-4 in FIG. 5 may correspond to application menu graphical objects 430-1 through 430-4 in FIG. 4A. That is, category menu view 500 in FIG. 5 and category menu graphical object 420-2 in FIG. 4A may represent the same communications category, and application menu graphical objects 430-1 through 430-4 in FIG. 4A and application icons 530-1 through 530-4 in FIG. 5 may represent the same communication applications (e.g., the same email, voice communication, text messaging, and other applications, respectively).
  • Another exemplary graphical representation of a radial menu will now be described. FIG. 6 illustrates GUI 400 with a graphical representation of an exemplary three-dimensional radial menu 605 displayed therein. Graphical representation of three-dimensional radial menu 605 may include a plurality of graphical objects arranged in a three-dimensional radial configuration view within GUI 400. As shown in the illustrated example, graphical representation of three-dimensional radial menu 605 may include a center point graphical object 610 and a plurality of category menu graphical objects 620 (e.g., category menu graphical objects 620-1 through 620-3) arranged radially about center point graphical object 610. The arrangement may form a three-dimensional radial layer of category menu graphical objects 620 at least partially or substantially encircling center point graphical object 610 at a certain radial distance from center point graphical object 610. The radial layer of category menu graphical objects 620 may contain certain divisions, of any size or proportion, separating the individual category menu graphical objects 620 as shown in FIG. 6. The example shown in FIG. 6 is illustrative only. Other radial configurations (e.g., a three-dimensional arc or spiral) may be used in other embodiments.
  • Center point graphical object 610 may indicate generally a radial center point of graphical representation of three-dimensional radial menu 605. Center point graphical object 610 may comprise any suitable visual indicator (e.g., an oval or ellipse) and/or attribute configured to visually represent the radial center point of the graphical representation of three-dimensional radial menu 605 and/or to distinguish center point graphical object 610 from one or more other graphical objects displayed in GUI 400.
  • Category menu graphical objects 620 may represent categories of applications, including any of the categories of applications mentioned above. As described above, each category may include a menu category associated with one or more applications that share at least one common attribute. Categories may be defined and organized in any suitable way and may be associated with groups of one or more applications based on any attribute(s) common to one or more applications. In the example illustrated in FIG. 6, category menu graphical object 620-2 in GUI 400 represents a communications category associated with one or more communications applications, such as voice communication applications, email applications, and/or text messaging applications.
  • Category menu graphical objects 620 may comprise any suitable visual indicator and/or attribute configured to visually represent one or more categories of applications and/or to distinguish category menu graphical objects 620 from one another and/or from one or more other graphical objects displayed in GUI 400. As shown in FIG. 6, for example, category menu graphical objects 620 may comprise graphical representations of three-dimensional category menu views, such as category menu view 500 of FIG. 5, arranged in a radial configuration about center point graphical object 610.
  • As shown in FIG. 6, category menu graphical object 620-2 may represent a communications menu category and may be associated with a plurality of communication applications, which may be represented by communication application icons 630 (e.g., icons 630-1 through 630-4) included with category menu graphical object 620-2. In other embodiments, relationships between applications and category menu graphical objects 620 may be represented in other ways. For example, rather than being positioned within category menu graphical objects 620, application menu graphical objects may be radially aligned to form an outer layer about center point graphical object 610 at a second radial distance from center point graphical object 610 and at least partially or substantially encircling the radial layer of category menu graphical objects 620. In such embodiments, category menu graphical objects 620 may be transparent or semi-transparent such that application menu graphical objects may be visible through category menu graphical objects 620.
  • Graphical representation of three-dimensional radial menu 605 may visually represent a hierarchical menu organization of applications and menu categories associated with computing system 100. In particular, graphical representation of three-dimensional radial menu 605 may visually represent relationships between applications and menu categories in GUI 400. For instance, application menu graphical objects and/or application icons 630 may be positioned relative to certain category menu graphical objects 620 to visually indicate one or more hierarchical relationships. As an example, a position of application icons 630 relative to a position of category menu graphical object 620-2 (e.g., within or overlaid on category menu graphical object 620-2) may represent that the applications represented by application icons 630 are hierarchically related to a menu category represented by category menu graphical object 620-2.
  • In certain embodiments, one or more graphical objects, such as center point graphical object 610, category menu graphical objects 620, and/or application icons 630, may be user selectable in graphical representation of three-dimensional radial menu 605. Accordingly, a user (e.g., a user of device 200) may select a particular graphical object displayed as part of three-dimensional radial menu 605 in GUI 400. A user selection may be detected in any suitable way through any suitable user interfaces, including any of the ways and/or interfaces mentioned above. For example, one or more graphical objects, such as center point graphical object 610, category menu graphical objects 620, and/or application icons 630, may comprise one or more selectable touch objects displayed on a touch screen that may be selected by a physical object (e.g., a finger or thumb) touching the selectable touch object(s). Accordingly, a user may conveniently select any of the graphical objects included in three-dimensional radial menu 605 with a single touch.
  • A user selection of a graphical object in three-dimensional radial menu 605 may be detected in any suitable way, and one or more predetermined actions may be performed by system 100 in response to the user selection. In certain embodiments, an application may be launched and/or executed in response to a user selection of an application icon 630 in three-dimensional radial menu 605. In response to a user selection of application icon 630-2, which may represent a voice communication application, for example, system 100 may execute the voice communication application.
  • In certain embodiments, a user selection of a graphical object in three-dimensional radial menu 605 may cause system 100 to display another view in GUI 400. For example, system 100 may display a category menu view in GUI 400 in response to a user selection of a category menu graphical object 620. As an example, a user may select category menu graphical object 620-2 in GUI 400. In response, system 100 may display category menu view 500 shown in FIG. 5.
  • Graphical representation of three-dimensional radial menu 605 may provide a front-view display of at least one of the category menu graphical objects 620 in GUI 400. For example, as shown in FIG. 6, graphical representation of three-dimensional radial menu 605 may include a front-view display of category menu graphical object 620-2 with one or more other category menu graphical objects 620-1 and 620-3 positioned in the periphery of GUI 400 adjacent opposite side edges of category menu graphical object 620-2.
  • System 100 may be configured to pivot category menu graphical objects 620 around center point graphical object 610 in GUI 400. The pivoting may include moving category menu graphical objects 620 in and/or out of the front-view display shown in FIG. 6. For example, category menu graphical objects 620 may be pivoted to the left around center point graphical object 610 in GUI 400, which may cause category menu graphical object 620-2 to move out of the front-view display and category menu graphical object 620-3 to move into the front-view display.
  • In certain embodiments, system 100 may be configured to detect user input and pivot category menu graphical objects 620 around center point graphical object 610 in response to the detected user input. Any suitable user input may be defined and used to trigger the pivoting. For example, system 100 may be configured to pivot category menu graphical objects 620 around center point graphical object 610 in response to a finger swipe on a touch screen display (e.g., a sideways finger swipe indicative of a pivot direction), a physical object touching a predetermined area and/or graphical object displayed on a touch screen display, or any other suitable input.
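For illustration, a sketch of pivoting the category layer in response to a sideways swipe by shifting every category's angular position around the center point; the 72-degree step (assuming five categories) and the function name are assumptions introduced here.

```python
def pivot(category_angles, swipe_direction, step_deg=72.0):
    """Rotate every category object's angular position around the center point
    in response to a sideways finger swipe; a left swipe pivots the layer one
    slot to the left."""
    delta = -step_deg if swipe_direction == "left" else step_deg
    return {name: (angle + delta) % 360 for name, angle in category_angles.items()}

angles = {"communications": 0.0, "media": 72.0, "settings": 144.0}
print(pivot(angles, "left"))   # communications pivots out of the front view
```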
  • In certain embodiments, the graphical representation of three-dimensional radial menu 605 shown in FIG. 6 may be a partial view of three-dimensional radial menu 605. As shown in FIG. 6, for example, one or more category menu graphical objects 620, such as category menu graphical objects 620-1 and 620-3, may be partially displayed in GUI 400. Additionally or alternatively, certain graphical objects of three-dimensional radial menu 605 may not be displayed at all in GUI 400. For example, while category menu graphical objects 620-1 through 620-3 are displayed in GUI 400 in FIG. 6, one or more other category menu graphical objects associated with three-dimensional radial menu 605 may be positioned outside of or otherwise omitted from GUI 400. Such other category menu graphical objects may move in and out of GUI 400 when system 100 pivots category menu graphical objects 620 around center point graphical object 610 as described above.
  • One or more graphical objects of three-dimensional radial menu 605 may correspond to one or more graphical objects of two-dimensional radial menu 405. In certain embodiments, for example, three-dimensional radial menu 605 and two-dimensional radial menu 405 may share a common center point. Hence, center point graphical object 410 and center point graphical object 610 may represent a common center point or center point graphical object shared by three-dimensional radial menu 605 and two-dimensional radial menu 405. As described in detail further below, the common center point graphical object may be repositioned in GUI 400 as part of a transition between views of three-dimensional radial menu 605 and two-dimensional radial menu 405 in GUI 400.
  • As another example of corresponding graphical objects, category menu graphical objects 620 in three-dimensional radial menu 605 may correspond to category menu graphical objects 420 in two-dimensional radial menu 405. For example, category menu graphical objects 620 and category menu graphical objects 420 may represent the same set of menu categories. As described in detail further below, category menu graphical objects 420 and/or 620 may be repositioned in GUI 400 as part of a transition between views of two-dimensional radial menu 405 and three-dimensional radial menu 605 in GUI 400.
  • As yet another example of corresponding graphical objects, application menu icons 630 in three-dimensional radial menu 605 may correspond to application menu graphical objects 430 in two-dimensional radial menu 405. For example, application menu icons 630 and application menu graphical objects 430 may represent the same applications. As described in detail further below, application menu graphical objects 430 and/or application menu icons 630 may be repositioned and/or modified in GUI 400 as part of a transition between views of two-dimensional radial menu 405 and three-dimensional radial menu 605 in GUI 400. For example, application menu graphical objects 430 may merge into category menu graphical objects 620 as application menu icons 630.
  • As an example of corresponding graphical objects, category menu graphical object 420-2 in FIG. 4A and category menu graphical object 620-2 in FIG. 6 may represent the same communications menu category, and application menu graphical objects 430-1 through 430-4 in FIG. 4A and application menu icons 630-1 through 630-4 in FIG. 6 may represent the same communications applications.
  • In certain embodiments, system 100 may be configured to transition between graphical representations of two-dimensional radial menu 405 and three-dimensional radial menu 605 in GUI 400. For example, radial menu facility 150 may direct processing facility 120 of system 100 to transform graphical representation of two-dimensional radial menu 405 to graphical representation of three-dimensional radial menu 605 and/or to transform graphical representation of three-dimensional radial menu 605 to graphical representation of two-dimensional radial menu 405 in GUI 400.
  • In certain embodiments, system 100 may execute a transition between graphical representations of two-dimensional radial menu 405 and three-dimensional radial menu 605 in GUI 400 in response to detected user input. For example, a user selection of center point graphical object 410 of two-dimensional radial menu 405 in GUI 400 may trigger a transformation from graphical representation of two-dimensional radial menu 405 shown in FIG. 4A to graphical representation of three-dimensional radial menu 605 shown in FIG. 6. Similarly, a user selection of center point graphical object 610 of three-dimensional radial menu 605 in GUI 400 may trigger a transformation from graphical representation of three-dimensional radial menu 605 shown in FIG. 6 to graphical representation of two-dimensional radial menu 405 shown in FIG. 4A.
  • FIG. 7 illustrates an exemplary method 700 for radial menu display, which method 700 may include one or more transitions between graphical representations of two-dimensional and three-dimensional radial menus in GUI 400. While FIG. 7 illustrates exemplary steps according to one embodiment, other embodiments may omit, add to, reorder, and/or modify any of the steps shown in FIG. 7.
  • In step 710, a graphical representation of a two-dimensional radial menu may be displayed in a GUI of a computing system. For example, system 100 may display graphical representation of two-dimensional radial menu 405 in GUI 400, which may include radial menu facility 150 generating and/or providing data representative of two-dimensional radial menu 405 to user interface facility 160 for display in GUI 400 by I/O facility 140.
  • In step 720, user input may be detected. The user input may be associated with the graphical representation of two-dimensional radial menu 405 in GUI 400. System 100 (e.g., I/O facility 140 of system 100) may detect the user input, which may include any user input predefined to trigger a transition between graphical representations of radial menus. In certain embodiments, the user input may include a user selection of center point graphical object 410 included in graphical representation of two-dimensional radial menu 405.
  • In step 730, the graphical representation of the two-dimensional radial menu may be transformed, in response to the user input, into a graphical representation of a three-dimensional radial menu in the GUI. For example, system 100 may transform graphical representation of a two-dimensional radial menu 405 to graphical representation of three-dimensional radial menu 605 in GUI 400, which may include radial menu facility 150 providing data representative of the transformation to user interface facility 160 for display in GUI 400 by I/O facility 140.
  • In step 740, additional user input may be detected. The additional user input may be associated with the graphical representation of three-dimensional radial menu 605 in GUI 400. System 100 (e.g., I/O facility 140 of system 100) may detect the additional user input, which may include any user input predefined to trigger a transition between graphical representations of radial menus. In certain embodiments, the user input may include a user selection of center point graphical object 610 included in graphical representation of three-dimensional radial menu 605.
  • In step 750, the graphical representation of the three-dimensional radial menu may be transformed, in response to the additional user input, back into the graphical representation of the two-dimensional radial menu in the GUI. For example, system 100 may transform graphical representation of three-dimensional radial menu 605 back into graphical representation of two-dimensional radial menu 405 in GUI 400, which may include radial menu facility 150 providing data representative of the transformation to user interface facility 160 for display in GUI 400 by I/O facility 140.
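A toy sketch of the toggle behavior of steps 710 through 750, where each selection of the center point graphical object transforms the displayed representation into the other; the controller class and method names are assumptions and do not come from the disclosure.

```python
class RadialMenuController:
    """Illustrative controller for the 2-D/3-D toggle described in steps 710-750."""

    def __init__(self):
        self.mode = "2d"          # start with the two-dimensional representation

    def on_center_point_selected(self):
        # Each selection of the center point graphical object transforms the
        # displayed representation into the other one.
        self.mode = "3d" if self.mode == "2d" else "2d"
        return f"now displaying {self.mode} radial menu"

controller = RadialMenuController()
print(controller.on_center_point_selected())  # 2-D -> 3-D (step 730)
print(controller.on_center_point_selected())  # 3-D -> 2-D (step 750)
```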
  • A transformation from one graphical representation of a radial menu to another graphical representation of a radial menu may be performed by system 100 in any suitable way. In certain embodiments, system 100 may replace a displayed graphical representation of a radial menu with another graphical representation of a radial menu in GUI 400. In certain other embodiments, the transformation may be performed in other ways, which may include, without limitation, repositioning graphical objects, rotating graphical objects, re-orienting graphical objects, changing viewpoints relative to graphical objects, zooming in on or out from graphical objects in GUI 400, adding a third dimension (e.g., a depth dimension along a z-axis) to one or more graphical objects, or any combination or sub-combination thereof.
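  • The object-level operations listed above (repositioning, zooming, adding a depth dimension, and so on) could be sketched as simple transforms on graphical-object records; the field names below are assumptions made only for this illustration.

```python
# Illustrative transforms on a simple graphical-object record; field names are assumptions.
from dataclasses import dataclass

@dataclass
class GraphicalObject:
    x: float
    y: float
    z: float = 0.0      # depth; 0 for a purely two-dimensional object
    scale: float = 1.0

def reposition(obj, dx, dy, dz=0.0):
    obj.x, obj.y, obj.z = obj.x + dx, obj.y + dy, obj.z + dz

def zoom(obj, factor):
    obj.scale *= factor

def add_depth(obj, depth):
    obj.z = depth        # give a flat object a z-extent for the 3D representation

obj = GraphicalObject(x=1.0, y=2.0)
reposition(obj, dx=0.5, dy=-0.5)
add_depth(obj, depth=1.0)
zoom(obj, factor=1.5)
```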
  • In certain embodiments, for example, a transformation from graphical representation of two-dimensional radial menu 405 to graphical representation of three-dimensional radial menu 605 in GUI 400 may include repositioning a common center point graphical object in GUI 400. For instance, center point graphical object 410 of FIG. 4A may be repositioned in GUI 400 to become center point graphical object 610 in FIG. 6. In addition, at least one category menu graphical object 420 of FIG. 4A may be repositioned in GUI 400 to become at least one corresponding category menu graphical object 620 in FIG. 6. In certain embodiments, a transformation from graphical representation of two-dimensional radial menu 405 to graphical representation of three-dimensional radial menu 605 in GUI 400 may also include zooming in on graphical representation of three-dimensional radial menu 605, or zooming in on one or more graphical objects of graphical representation of three-dimensional radial menu 605, displayed in GUI 400.
  • In certain embodiments, a transformation from graphical representation of two-dimensional radial menu 405 to graphical representation of three-dimensional radial menu 605 in GUI 400 may include repositioning a viewpoint associated with graphical representation of two-dimensional radial menu 405 shown in FIG. 4A to produce graphical representation of three-dimensional radial menu 605 shown in FIG. 6. The repositioning of the viewpoint may include system 100 moving a viewpoint from a top-down viewpoint of graphical representation of two-dimensional radial menu 405 shown in FIG. 4A to a substantially ground-level viewpoint of graphical representation of three-dimensional radial menu 605 shown in FIG. 6. The substantially ground-level viewpoint of graphical representation of three-dimensional radial menu 605 may be positioned proximate to center point graphical object 610 of graphical representation of three-dimensional radial menu 605 in GUI 400, as represented in FIG. 6.
  • Such a transformation from graphical representation of two-dimensional radial menu 405 to graphical representation of three-dimensional radial menu 605 in GUI 400, as may be represented by FIG. 4A and FIG. 6, may be configured to provide a user experience in which, from a perspective of a user, a user vantage point is moving from a top-down, relatively distant view of two-dimensional radial menu 405 to a ground-level, relatively proximate view of three-dimensional radial menu 605. This may be configured to facilitate a centric user experience that places a user perspective near the center point of three-dimensional radial menu 605. In some examples, a transformation from graphical representation of two-dimensional radial menu 405 to graphical representation of three-dimensional radial menu 605 in GUI 400 may maintain consistent navigational principles and/or inputs between the two-dimensional radial menu 405 and the three-dimensional radial menu 605, with the three-dimensional radial menu 605 providing a more immersive user experience than is provided by the two-dimensional radial menu 405. In certain embodiments, system 100 may display one or more animation effects configured to represent a transformation from one graphical representation of a radial menu to another graphical representation of the radial menu.
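  • One way to picture the viewpoint change described above is as two camera presets: a distant, straight-down camera for the two-dimensional representation and a ground-level camera placed near the center point for the three-dimensional one. The coordinates below are arbitrary and serve only to illustrate the idea.

```python
# Two illustrative camera presets; coordinates and field names are arbitrary assumptions.
from dataclasses import dataclass

@dataclass
class Viewpoint:
    position: tuple   # (x, y, z) camera position
    look_at: tuple    # point the camera is aimed at

MENU_CENTER = (0.0, 0.0, 0.0)

# Distant, top-down view: the radial menu reads as a flat, two-dimensional layout.
TOP_DOWN = Viewpoint(position=(0.0, 0.0, 50.0), look_at=MENU_CENTER)

# Low, nearby view next to the center point: the same menu reads as a scene
# surrounding the user's vantage point, i.e., the three-dimensional representation.
GROUND_LEVEL = Viewpoint(position=(0.0, -1.5, 1.0), look_at=(0.0, 5.0, 1.0))
```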
  • Graphical representations of radial menus, including graphical representations of two-dimensional radial menu 405 and three-dimensional radial menu 605, may be rendered by system 100 in GUI 400 in any suitable manner. In one example, radial menu facility 150 may utilize data representative of a two-dimensional radial menu model to render a graphical representation of two-dimensional radial menu 405. Alternatively, radial menu facility 150 may utilize data representative of a three-dimensional radial menu model to render a graphical representation of two-dimensional radial menu 405. In certain examples, radial menu facility 150 may utilize data representative of a single radial menu model to render both the graphical representation of two-dimensional radial menu 405 and the graphical representation of three-dimensional radial menu 605.
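  • The single-model approach mentioned above can be pictured as one render routine that projects the same set of model points with whichever viewpoint is active: an orthographic, top-down projection yields the flat two-dimensional look, while a perspective projection from a low, nearby camera yields the three-dimensional look. The projection math below is deliberately simplified and is not drawn from the described system.

```python
# Simplified sketch: one model, two projections. Not a real graphics pipeline.

def project_top_down(point):
    """Orthographic projection straight down the z-axis -> flat 2D layout."""
    x, y, z = point
    return (x, y)

def project_perspective(point, camera=(0.0, -1.5, 1.0), focal=2.0):
    """Crude perspective projection from a low, nearby camera -> 3D look."""
    x, y, z = point
    cx, cy, cz = camera
    depth = max(y - cy, 0.1)            # distance along the viewing direction
    return (focal * (x - cx) / depth, focal * (z - cz) / depth)

def render(model_points, mode):
    project = project_top_down if mode == "2D" else project_perspective
    return [project(p) for p in model_points]

# The same model data produces both representations:
model = [(1.0, 3.0, 0.5), (-2.0, 4.0, 0.5)]
flat_view = render(model, "2D")
immersive_view = render(model, "3D")
```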
  • FIG. 8 illustrates another exemplary method 800 for radial menu display. While FIG. 8 illustrates exemplary steps according to one embodiment, other embodiments may omit, add to, reorder, and/or modify any of the steps shown in FIG. 8.
  • In step 810, data representative of a radial menu model may be maintained in a computer-readable medium of a computing system. For example, radial menu facility 150 may maintain data representative of a radial menu model in storage facility 130 of system 100. The data may be maintained in any format suitable for representing a radial menu model that may be used to render one or more graphical representations of radial menus in a GUI.
  • In step 820, the radial menu model may be utilized to render a graphical representation of the radial menu model in a GUI. For example, radial menu facility 150 may utilize data representative of the radial menu model to render graphical representation of two-dimensional radial menu 405 for display in GUI 400.
  • In step 830, user input may be detected. For example, I/O facility 140 may detect user input, which may include any suitable form of user input, including any forms of user input mentioned above.
  • In step 840, the radial menu model may be utilized, in response to the user input, to render another graphical representation of the radial menu model in the GUI. For example, radial menu facility 150 may utilize data representative of the radial menu model to render graphical representation of three-dimensional radial menu 605 for display in GUI 400. Hence, data representative of the radial menu model may be utilized to generate various graphical representations of the radial menu model in GUI 400.
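  • A compact way to picture method 800 is as a small routine over persisted model data: the model stays fixed while each detected input only changes which representation of that model is rendered. Everything named in the sketch below is a placeholder invented for illustration.

```python
# Illustrative sketch of method 800; the model dict and render stub are placeholders.

def run_method_800():
    # step 810: maintain data representative of a radial menu model
    model = {"center": (0, 0), "categories": 5, "applications": 21}

    def render_representation(mode):
        # stand-in for rendering the model into the GUI in the requested mode
        return (f"{mode} representation "
                f"({model['categories']} categories, {model['applications']} applications)")

    views = [render_representation("2D")]      # step 820: first rendering
    # step 830: user input detected (simulated here)
    views.append(render_representation("3D"))  # step 840: another rendering of the same model
    return views

print(run_method_800())
```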
  • FIG. 9 illustrates a perspective view of an exemplary three-dimensional radial menu model 900, which may be utilized in method 700 and/or 800 to render one or more graphical representations of a radial menu in GUI 400, as described above. As shown in FIG. 9, radial menu model 900 may include a radial configuration of three-dimensional objects, which may include a center point object 910, category menu objects 920 (e.g., category menu objects 920-1 through 920-5), and application menu objects 930 (e.g., application menu objects 930-1 through 930-21).
  • Center point object 910 may generally indicate a radial center point of radial menu model 900. Category menu objects 920 may represent categories of applications, including any of the menu categories described above. Application menu objects 930 may represent one or more applications, which may include any software and/or device applications provided by (e.g., executable by) computing system 100 and/or accessible to a user of computing system 100.
  • Radial menu model 900 may represent a hierarchical menu organization of applications and menu categories associated with computing system 100. In particular, radial menu model 900 may represent relationships between applications and menu categories. Such relationships may be represented by relative positioning of objects in radial menu model 900. For instance, application menu objects 930 may be positioned relative to certain category menu objects 920 to visually indicate one or more hierarchical relationships. In FIG. 9, for example, application menu objects 930-1 through 930-4 are positioned adjacent to category menu object 920-1 to indicate relationships between the applications represented by application menu objects 930-1 through 930-4 and the menu category represented by category menu object 920-1.
  • As illustrated in FIG. 9, category menu objects 920 may be radially aligned to form an inner radial layer of category menu objects 920 positioned about center point object 910 at a first radial distance from center point object 910. In certain embodiments, the inner radial layer of category menu objects 920 may substantially encircle center point object 910. In addition, application menu objects 930 may be radially aligned to form an outer radial layer of application menu objects 930 positioned about center point object 910 at a second radial distance from center point object 910. In certain embodiments, the outer radial layer of application menu objects 930 may substantially encircle the inner radial layer of category menu objects 920.
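  • The inner and outer radial layers described above can be sketched as simple polar-coordinate placement: category objects evenly spaced on a ring at the first radial distance and application objects on a larger ring at the second radial distance. The counts below mirror FIG. 9 (five category objects, twenty-one application objects); the distances are arbitrary.

```python
# Illustrative polar placement of the two radial layers; radii are arbitrary assumptions.
import math

def radial_layer(count, radius, center=(0.0, 0.0)):
    """Place `count` objects evenly around `center` at the given radius."""
    cx, cy = center
    return [(cx + radius * math.cos(2.0 * math.pi * i / count),
             cy + radius * math.sin(2.0 * math.pi * i / count))
            for i in range(count)]

center_point = (0.0, 0.0)
category_positions = radial_layer(5, radius=2.0, center=center_point)      # inner layer
application_positions = radial_layer(21, radius=4.0, center=center_point)  # outer layer
```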
  • One or more objects included in radial menu model 900 may be utilized by system 100 to generate a graphical representation of at least a portion of radial menu model 900 in GUI 400. For example, system 100 may utilize one or more objects of radial menu model 900, or another similarly configured radial menu model, to generate the graphical representation of two-dimensional radial menu 405 shown in FIG. 4A and/or the graphical representation of three-dimensional radial menu 605 shown in FIG. 6.
  • In certain embodiments, system 100 may be configured to move a viewpoint relative to radial menu model 900 to generate various views of radial menu model 900 in GUI 400. For example, system 100 may use a first viewpoint positioned directly above and a certain distance away from radial menu model 900 to generate the graphical representation of two-dimensional radial menu 405 shown in FIG. 4A. System 100 may reposition the first viewpoint relative to radial menu model 900 to generate another view of radial menu model 900. For example, the viewpoint may be moved from a first position directly above radial menu model 900 to a second position that provides an angled, zoomed-in perspective view of radial menu model 900. Such movements of the viewpoint may be used to transform the graphical representation of two-dimensional radial menu 405 shown in FIG. 4A to the graphical representation of three-dimensional radial menu 605 shown in FIG. 6. In certain embodiments, movement of a viewpoint relative to radial menu model 900 may be animated in real time in GUI 400, which may cause radial menu model 900 to appear to be tilted, re-oriented, enlarged, minimized, or otherwise manipulated in GUI 400. In this or similar manner, various viewpoints of radial menu model 900 may be used by system 100 to display various graphical representations of radial menu model 900 in GUI 400.
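  • The real-time animation of a moving viewpoint mentioned above could be realized by interpolating the camera between the top-down and ground-level positions over a number of frames; the linear interpolation below is only one simple possibility and is not drawn from the described system.

```python
# Simple linear interpolation of a viewpoint between two positions; values are illustrative.

def lerp(a, b, t):
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def animate_viewpoint(start_pos, end_pos, frames=30):
    """Yield one camera position per frame, moving from start_pos to end_pos."""
    for frame in range(frames + 1):
        yield lerp(start_pos, end_pos, frame / frames)

top_down = (0.0, 0.0, 50.0)       # distant, straight-down camera (2D look)
ground_level = (0.0, -1.5, 1.0)   # low camera near the center point (3D look)

for camera_position in animate_viewpoint(top_down, ground_level):
    pass  # a real implementation would re-render the radial menu model here each frame
```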
  • The radial menu model 900 and graphical representations of radial menus described above are illustrative only. Other radial menu configurations may be used in other embodiments. For example, in certain embodiments, a radial menu may additionally include one or more sub-categories, which may further classify and/or hierarchically organize applications based on one or more common attributes. A sub-category may group applications based on one or more common attributes that are more specific than the common attributes of a category, and may be represented in the radial menu by a sub-category menu graphical object. For example, within a communications category represented by a category menu graphical object, one sub-category menu graphical object may represent a sub-category of voice communication applications while another sub-category menu graphical object may represent a sub-category of Internet communication applications.
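  • A sub-categorized hierarchy of the kind described above could be represented by a small nested structure; the category, sub-category, and application names below are hypothetical and only echo the example in the text.

```python
# Illustrative nested hierarchy with sub-categories; all names are hypothetical.
radial_menu_hierarchy = {
    "communications": {
        "voice communication": ["phone", "voicemail"],
        "internet communication": ["email", "instant messaging"],
    },
    # further categories, with or without sub-categories, would be added here
}
```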
  • In the preceding description, various exemplary embodiments have been described with reference to the accompanying drawings. It will, however, be evident that various modifications and changes may be made thereto, and additional embodiments may be implemented, without departing from the scope of the invention as set forth in the claims that follow. For example, certain features of one embodiment described herein may be combined with or substituted for features of another embodiment described herein. The description and drawings are accordingly to be regarded in an illustrative rather than a restrictive sense.

Claims (22)

1. A method comprising:
displaying, by a computing system, a graphical representation of a two-dimensional radial menu in a graphical user interface generated by the computing system;
detecting, by the computing system, user input associated with the graphical representation of the two-dimensional radial menu in the graphical user interface; and
transforming, by the computing system in response to the user input, the graphical representation of the two-dimensional radial menu into a graphical representation of a three-dimensional radial menu in the graphical user interface.
2. The method of claim 1, wherein:
the graphical representation of the two-dimensional radial menu and the graphical representation of the three-dimensional radial menu comprise a common center point; and
the transforming comprises repositioning the center point in the graphical user interface.
3. The method of claim 2, wherein the graphical representation of the two-dimensional radial menu comprises a center point graphical object representing the common center point in the graphical user interface.
4. The method of claim 3, wherein the graphical representation of the three-dimensional radial menu comprises the center point graphical object representing the common center point in the graphical user interface.
5. The method of claim 3, wherein the user input comprises a user selection of the center point graphical object included in the graphical representation of the two-dimensional radial menu.
6. The method of claim 1, wherein:
the graphical representation of the two-dimensional radial menu comprises at least one category menu graphical object; and
the transforming comprises repositioning the at least one category menu graphical object in the graphical user interface.
7. The method of claim 1, wherein the transforming comprises repositioning a viewpoint associated with the graphical representation of the two-dimensional radial menu to produce the graphical representation of the three-dimensional radial menu in the graphical user interface.
8. The method of claim 7, wherein the repositioning of the viewpoint comprises moving the viewpoint from a top-down viewpoint of the two-dimensional radial menu to a substantially ground-level viewpoint of the three-dimensional radial menu in the graphical user interface.
9. The method of claim 8, wherein the substantially ground-level viewpoint is positioned proximate to a center point of the graphical representation of the three-dimensional radial menu in the graphical user interface.
10. The method of claim 7, wherein the transforming further comprises zooming in on the graphical representation of the three-dimensional radial menu in the graphical user interface.
11. The method of claim 1, wherein the graphical representation of the two-dimensional radial menu comprises:
a center point graphical object;
a plurality of category menu graphical objects positioned about the center point graphical object at a first radial distance from the center point graphical object; and
a plurality of application menu graphical objects positioned about the plurality of category menu graphical objects at a second radial distance from the center point graphical object.
12. The method of claim 11, wherein:
the category menu graphical objects form an inner radial layer substantially encircling the center point graphical object; and
the application menu graphical objects form an outer radial layer substantially encircling the inner radial layer.
13. The method of claim 1, wherein:
the displaying comprises utilizing data representative of a three-dimensional radial menu model to render the graphical representation of the two-dimensional radial menu, based on a first viewpoint, in the graphical user interface of the computing system; and
the transforming comprises utilizing the data representative of the three-dimensional radial menu model to render the graphical representation of the three-dimensional radial menu, based on a second viewpoint, in the graphical user interface.
14. The method of claim 1, tangibly embodied as computer-executable instructions on at least one non-transitory computer-readable medium, the computer-executable instructions configured to direct at least one processor of the computing system to perform at least one of the displaying and the transforming.
15. A method comprising:
displaying, by a computing system, a graphical representation of a three-dimensional radial menu in a graphical user interface generated by the computing system;
detecting, by the computing system, user input associated with the graphical representation of the three-dimensional radial menu in the graphical user interface; and
transforming, by the computing system in response to the user input, the graphical representation of the three-dimensional radial menu into a graphical representation of a two-dimensional radial menu in the graphical user interface.
16. The method of claim 15, wherein:
the graphical representation of the two-dimensional radial menu and the graphical representation of the three-dimensional radial menu comprise a common center point; and
the transforming comprises repositioning the center point in the graphical user interface.
17. The method of claim 16, wherein the user input comprises a user selection of a center point graphical object graphically representing the center point in the graphical representation of the three-dimensional radial menu.
18. A method comprising:
maintaining, by a computing system, data representative of a radial menu model in a non-transitory computer-readable medium of the computing system, the radial menu model comprising
a center point object,
a plurality of category menu objects positioned about the center point object at a first radial distance from the center point object, and
a plurality of application menu objects positioned about the plurality of category menu objects at a second radial distance from the center point object;
utilizing, by the computing system, the radial menu model to render a graphical representation of one of a two-dimensional radial menu and a three-dimensional radial menu in a graphical user interface of the computing system;
detecting, by the computing system, user input associated with the graphical representation of one of the two-dimensional radial menu and the three-dimensional radial menu in the graphical user interface; and
transforming, by the computing system in response to the user input, the graphical representation of the one of the two-dimensional radial menu and the three-dimensional radial menu into the other of the two-dimensional radial menu and the three-dimensional radial menu in the graphical user interface.
19. A system comprising:
at least one processor; and
a radial menu facility in communication with the at least one processor and configured to direct the at least one processor to
display a graphical representation of a two-dimensional radial menu in a graphical user interface, and
transform the graphical representation of the two-dimensional radial menu into a graphical representation of a three-dimensional radial menu in the graphical user interface.
20. The system of claim 19, wherein:
the graphical representation of the two-dimensional radial menu and the graphical representation of the three-dimensional radial menu comprise a common center point; and
the transformation comprises repositioning the center point in the graphical user interface.
21. The system of claim 19, further comprising:
a storage facility in communication with the at least one processor;
wherein the radial menu facility is further configured to direct the at least one processor to
maintain data representative of a radial menu model in the storage facility, and
utilize the radial menu model to render the two-dimensional radial menu and the three-dimensional radial menu for display in the graphical user interface.
22. The system of claim 21, wherein the radial menu model comprises:
a center point object;
a plurality of category menu objects positioned about the center point object at a first radial distance from the center point object; and
a plurality of application menu objects positioned about the plurality of category menu objects at a second radial distance from the center point object.
US13/462,403 2009-06-26 2012-05-02 Radial menu display systems and methods Abandoned US20120221976A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/462,403 US20120221976A1 (en) 2009-06-26 2012-05-02 Radial menu display systems and methods

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/492,277 US8219930B2 (en) 2009-06-26 2009-06-26 Radial menu display systems and methods
US13/462,403 US20120221976A1 (en) 2009-06-26 2012-05-02 Radial menu display systems and methods

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/492,277 Continuation US8219930B2 (en) 2009-06-26 2009-06-26 Radial menu display systems and methods

Publications (1)

Publication Number Publication Date
US20120221976A1 true US20120221976A1 (en) 2012-08-30

Family

ID=43382180

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/492,277 Active 2030-08-04 US8219930B2 (en) 2009-06-26 2009-06-26 Radial menu display systems and methods
US13/462,403 Abandoned US20120221976A1 (en) 2009-06-26 2012-05-02 Radial menu display systems and methods

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/492,277 Active 2030-08-04 US8219930B2 (en) 2009-06-26 2009-06-26 Radial menu display systems and methods

Country Status (1)

Country Link
US (2) US8219930B2 (en)


Families Citing this family (221)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8645137B2 (en) 2000-03-16 2014-02-04 Apple Inc. Fast, language-independent method for user authentication by voice
US8677377B2 (en) 2005-09-08 2014-03-18 Apple Inc. Method and apparatus for building an intelligent automated assistant
US9318108B2 (en) 2010-01-18 2016-04-19 Apple Inc. Intelligent automated assistant
US8977255B2 (en) 2007-04-03 2015-03-10 Apple Inc. Method and system for operating a multi-function portable electronic device using voice-activation
US10002189B2 (en) 2007-12-20 2018-06-19 Apple Inc. Method and apparatus for searching using an active ontology
US9330720B2 (en) 2008-01-03 2016-05-03 Apple Inc. Methods and apparatus for altering audio output signals
US8232973B2 (en) 2008-01-09 2012-07-31 Apple Inc. Method, device, and graphical user interface providing word recommendations for text input
US8996376B2 (en) 2008-04-05 2015-03-31 Apple Inc. Intelligent text-to-speech conversion
US10496753B2 (en) 2010-01-18 2019-12-03 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US20100030549A1 (en) 2008-07-31 2010-02-04 Lee Michael M Mobile device having human language translation capability with positional feedback
US8676904B2 (en) 2008-10-02 2014-03-18 Apple Inc. Electronic devices with voice command and contextual data processing capabilities
US20120311585A1 (en) 2011-06-03 2012-12-06 Apple Inc. Organizing task items that represent tasks to perform
US10241644B2 (en) 2011-06-03 2019-03-26 Apple Inc. Actionable reminder entries
US9858925B2 (en) 2009-06-05 2018-01-02 Apple Inc. Using context information to facilitate processing of commands in a virtual assistant
US10241752B2 (en) 2011-09-30 2019-03-26 Apple Inc. Interface for a virtual digital assistant
US9431006B2 (en) 2009-07-02 2016-08-30 Apple Inc. Methods and apparatuses for automatic speech recognition
US8375329B2 (en) * 2009-09-01 2013-02-12 Maxon Computer Gmbh Method of providing a graphical user interface using a concentric menu
US20110167350A1 (en) * 2010-01-06 2011-07-07 Apple Inc. Assist Features For Content Display Device
US10705794B2 (en) 2010-01-18 2020-07-07 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US10276170B2 (en) 2010-01-18 2019-04-30 Apple Inc. Intelligent automated assistant
US10553209B2 (en) 2010-01-18 2020-02-04 Apple Inc. Systems and methods for hands-free notification summaries
US10679605B2 (en) 2010-01-18 2020-06-09 Apple Inc. Hands-free list-reading by intelligent automated assistant
KR101647044B1 (en) * 2010-02-11 2016-08-09 삼성전자 주식회사 Method and apparatus for displaying screen in mobile terminal
US9367205B2 (en) 2010-02-19 2016-06-14 Microsoft Technology Licensing, Llc Radial menus with bezel gestures
US8682667B2 (en) 2010-02-25 2014-03-25 Apple Inc. User profiling for selecting user specific voice input processing information
TWI494794B (en) * 2010-05-10 2015-08-01 Cal Comp Electronics & Comm Co Electronic apparatus and method of choosing data thereof
US8550920B1 (en) 2010-05-28 2013-10-08 Wms Gaming, Inc. Providing and controlling embeddable gaming content
KR101435594B1 (en) * 2010-05-31 2014-08-29 삼성전자주식회사 Display apparatus and display mehod thereof
CN102467315A (en) 2010-10-29 2012-05-23 国际商业机器公司 Method and system for controlling electronic equipment with touch type signal input device
US9262612B2 (en) 2011-03-21 2016-02-16 Apple Inc. Device access using voice authentication
US9329766B2 (en) * 2011-06-02 2016-05-03 Lenovo (Singapore) Pte. Ltd. Homepage re-assignment
US9310958B2 (en) * 2011-06-02 2016-04-12 Lenovo (Singapore) Pte. Ltd. Dock for favorite applications
US10057736B2 (en) 2011-06-03 2018-08-21 Apple Inc. Active transport based notifications
US9417754B2 (en) * 2011-08-05 2016-08-16 P4tents1, LLC User interface system, method, and computer program product
US8994660B2 (en) 2011-08-29 2015-03-31 Apple Inc. Text correction processing
US8707211B2 (en) 2011-10-21 2014-04-22 Hewlett-Packard Development Company, L.P. Radial graphical user interface
US9405435B2 (en) * 2011-11-02 2016-08-02 Hendricks Investment Holdings, Llc Device navigation icon and system, and method of use thereof
US8869068B2 (en) * 2011-11-22 2014-10-21 Backplane, Inc. Content sharing application utilizing radially-distributed menus
US9875023B2 (en) * 2011-11-23 2018-01-23 Microsoft Technology Licensing, Llc Dial-based user interfaces
CN104039581B (en) * 2012-01-09 2017-08-18 奥迪股份公司 For the method and apparatus for the 3D diagrams that user interface is produced in vehicle
KR101331498B1 (en) * 2012-02-07 2013-11-20 부가벤쳐스 The apparatus and method of idea wheel
US10134385B2 (en) 2012-03-02 2018-11-20 Apple Inc. Systems and methods for name pronunciation
US9483461B2 (en) 2012-03-06 2016-11-01 Apple Inc. Handling speech synthesis of content for multiple languages
US9952741B2 (en) * 2012-03-08 2018-04-24 Zte Corporation Mobile terminal application icon management method and mobile terminal
US9280610B2 (en) 2012-05-14 2016-03-08 Apple Inc. Crowd sourcing information to fulfill user requests
US10417037B2 (en) 2012-05-15 2019-09-17 Apple Inc. Systems and methods for integrating third party services with a digital assistant
US9721563B2 (en) 2012-06-08 2017-08-01 Apple Inc. Name recognition system
US9495129B2 (en) 2012-06-29 2016-11-15 Apple Inc. Device, method, and user interface for voice-activated navigation and browsing of a document
US9547647B2 (en) 2012-09-19 2017-01-17 Apple Inc. Voice-based media searching
US10503359B2 (en) 2012-11-15 2019-12-10 Quantum Interface, Llc Selection attractive interfaces, systems and apparatuses including such interfaces, methods for making and using same
US10289204B2 (en) 2012-11-15 2019-05-14 Quantum Interface, Llc Apparatuses for controlling electrical devices and software programs and methods for making and using same
JP2014130419A (en) * 2012-12-28 2014-07-10 Sony Corp Information processing device, information processing method, and program
US9600103B1 (en) 2012-12-31 2017-03-21 Allscripts Software, Llc Method for ensuring use intentions of a touch screen device
KR102516577B1 (en) 2013-02-07 2023-04-03 애플 인크. Voice trigger for a digital assistant
USD739872S1 (en) * 2013-02-22 2015-09-29 Samsung Electronics Co., Ltd. Display screen with animated graphical user interface
US10652394B2 (en) 2013-03-14 2020-05-12 Apple Inc. System and method for processing voicemail
US10748529B1 (en) 2013-03-15 2020-08-18 Apple Inc. Voice activated device for use with a voice-based digital assistant
CN103226446B (en) * 2013-05-16 2016-08-03 上海欧拉网络技术有限公司 The event response method of user interface and mobile device for mobile device
US9817548B2 (en) * 2013-05-20 2017-11-14 Citrix Systems, Inc. Providing enhanced user interfaces
USD741898S1 (en) * 2013-05-29 2015-10-27 Microsoft Corporation Display screen with animated graphical user interface
KR101835903B1 (en) 2013-05-29 2018-03-07 김지영 The apparatus of idea wheel
WO2014197334A2 (en) 2013-06-07 2014-12-11 Apple Inc. System and method for user-specified pronunciation of words for speech synthesis and recognition
WO2014197336A1 (en) 2013-06-07 2014-12-11 Apple Inc. System and method for detecting errors in interactions with a voice-based digital assistant
US9582608B2 (en) 2013-06-07 2017-02-28 Apple Inc. Unified ranking with entropy-weighted information for phrase-based semantic auto-completion
WO2014197335A1 (en) 2013-06-08 2014-12-11 Apple Inc. Interpreting and acting upon commands that involve sharing information with remote devices
EP3008641A1 (en) 2013-06-09 2016-04-20 Apple Inc. Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant
US10176167B2 (en) 2013-06-09 2019-01-08 Apple Inc. System and method for inferring user intent from speech inputs
KR20140144320A (en) * 2013-06-10 2014-12-18 삼성전자주식회사 Method and apparatus for providing user interface in electronic device
US10296160B2 (en) 2013-12-06 2019-05-21 Apple Inc. Method for extracting salient dialog usage from live data
USD751606S1 (en) * 2013-12-30 2016-03-15 Beijing Qihoo Technology Co., Ltd. Display screen with animated graphical user interface
JP6268526B2 (en) * 2014-03-17 2018-01-31 オムロン株式会社 MULTIMEDIA DEVICE, MULTIMEDIA DEVICE CONTROL METHOD, AND MULTIMEDIA DEVICE CONTROL PROGRAM
DE102014205574A1 (en) * 2014-03-26 2015-10-01 Ford Global Technologies, Llc System for controlling and selecting massage functions of a motor vehicle seat
US9569520B1 (en) 2014-03-31 2017-02-14 Juniper Networks, Inc. Classification of software based on user interface elements
USD783035S1 (en) * 2014-05-01 2017-04-04 Beijing Qihoo Technology Co., Ltd Display screen with an animated graphical user interface
US10170123B2 (en) 2014-05-30 2019-01-01 Apple Inc. Intelligent assistant for home automation
US9430463B2 (en) 2014-05-30 2016-08-30 Apple Inc. Exemplar-based natural language processing
US9715875B2 (en) 2014-05-30 2017-07-25 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
EP3149728B1 (en) 2014-05-30 2019-01-16 Apple Inc. Multi-command single utterance input method
US10078631B2 (en) 2014-05-30 2018-09-18 Apple Inc. Entropy-guided text prediction using combined word and character n-gram language models
US9760559B2 (en) 2014-05-30 2017-09-12 Apple Inc. Predictive text input
US9842101B2 (en) 2014-05-30 2017-12-12 Apple Inc. Predictive conversion of language input
US9785630B2 (en) 2014-05-30 2017-10-10 Apple Inc. Text prediction using combined word N-gram and unigram language models
US9633004B2 (en) 2014-05-30 2017-04-25 Apple Inc. Better resolution when referencing to concepts
US10255267B2 (en) 2014-05-30 2019-04-09 Apple Inc. Device, method, and graphical user interface for a predictive keyboard
US9971492B2 (en) 2014-06-04 2018-05-15 Quantum Interface, Llc Dynamic environment for object and attribute display and interaction
US9338493B2 (en) 2014-06-30 2016-05-10 Apple Inc. Intelligent automated assistant for TV user interactions
US10659851B2 (en) 2014-06-30 2020-05-19 Apple Inc. Real-time digital assistant knowledge updates
WO2016011190A1 (en) * 2014-07-15 2016-01-21 T6 Health Systems Llc Healthcare information analysis and graphical display presentation system
US10446141B2 (en) 2014-08-28 2019-10-15 Apple Inc. Automatic speech recognition based on user feedback
US9818400B2 (en) 2014-09-11 2017-11-14 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US10789041B2 (en) 2014-09-12 2020-09-29 Apple Inc. Dynamic thresholds for always listening speech trigger
US9886432B2 (en) 2014-09-30 2018-02-06 Apple Inc. Parsimonious handling of word inflection via categorical stem + suffix N-gram language models
US10127911B2 (en) 2014-09-30 2018-11-13 Apple Inc. Speaker identification and unsupervised speaker adaptation techniques
US9668121B2 (en) 2014-09-30 2017-05-30 Apple Inc. Social reminders
US9646609B2 (en) 2014-09-30 2017-05-09 Apple Inc. Caching apparatus for serving phonetic pronunciations
US10074360B2 (en) 2014-09-30 2018-09-11 Apple Inc. Providing an indication of the suitability of speech recognition
US10788948B2 (en) 2018-03-07 2020-09-29 Quantum Interface, Llc Systems, apparatuses, interfaces and implementing methods for displaying and manipulating temporal or sequential objects
US11205075B2 (en) 2018-01-10 2021-12-21 Quantum Interface, Llc Interfaces, systems and apparatuses for constructing 3D AR environment overlays, and methods for making and using same
US10552013B2 (en) 2014-12-02 2020-02-04 Apple Inc. Data detection
USD759081S1 (en) * 2014-12-11 2016-06-14 Microsoft Corporation Display screen with animated graphical user interface
WO2016116891A1 (en) * 2015-01-22 2016-07-28 Realitygate (Pty) Ltd Hierarchy navigation in a user interface
US10152299B2 (en) 2015-03-06 2018-12-11 Apple Inc. Reducing response latency of intelligent automated assistants
US9865280B2 (en) 2015-03-06 2018-01-09 Apple Inc. Structured dictation using intelligent automated assistants
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US10567477B2 (en) 2015-03-08 2020-02-18 Apple Inc. Virtual assistant continuity
US9721566B2 (en) 2015-03-08 2017-08-01 Apple Inc. Competing devices responding to voice triggers
US9899019B2 (en) 2015-03-18 2018-02-20 Apple Inc. Systems and methods for structured stem and suffix language models
US9842105B2 (en) 2015-04-16 2017-12-12 Apple Inc. Parsimonious continuous-space phrase representations for natural language processing
US10460227B2 (en) 2015-05-15 2019-10-29 Apple Inc. Virtual assistant in a communication session
US10083688B2 (en) 2015-05-27 2018-09-25 Apple Inc. Device voice control for selecting a displayed affordance
US10127220B2 (en) 2015-06-04 2018-11-13 Apple Inc. Language identification from short strings
US10101822B2 (en) 2015-06-05 2018-10-16 Apple Inc. Language input correction
US9578173B2 (en) 2015-06-05 2017-02-21 Apple Inc. Virtual assistant aided communication with 3rd party service in a communication session
US10255907B2 (en) 2015-06-07 2019-04-09 Apple Inc. Automatic accent detection using acoustic models
US10186254B2 (en) 2015-06-07 2019-01-22 Apple Inc. Context-based endpoint detection
US11025565B2 (en) 2015-06-07 2021-06-01 Apple Inc. Personalized prediction of responses for instant messaging
US10503264B1 (en) 2015-06-16 2019-12-10 Snap Inc. Radial gesture navigation
US20160378747A1 (en) 2015-06-29 2016-12-29 Apple Inc. Virtual assistant for media playback
US10747498B2 (en) 2015-09-08 2020-08-18 Apple Inc. Zero latency digital assistant
US10671428B2 (en) 2015-09-08 2020-06-02 Apple Inc. Distributed personal assistant
US9697820B2 (en) 2015-09-24 2017-07-04 Apple Inc. Unit-selection text-to-speech synthesis using concatenation-sensitive neural networks
US10366158B2 (en) 2015-09-29 2019-07-30 Apple Inc. Efficient word encoding for recurrent neural network language models
US11010550B2 (en) 2015-09-29 2021-05-18 Apple Inc. Unified language modeling framework for word prediction, auto-completion and auto-correction
US11587559B2 (en) 2015-09-30 2023-02-21 Apple Inc. Intelligent device identification
US10691473B2 (en) 2015-11-06 2020-06-23 Apple Inc. Intelligent automated assistant in a messaging environment
US10049668B2 (en) 2015-12-02 2018-08-14 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
US10223066B2 (en) 2015-12-23 2019-03-05 Apple Inc. Proactive assistance based on dialog communication between devices
USD858481S1 (en) * 2016-03-02 2019-09-03 Dolby Laboratories Licensing Corporation Headphones
US10446143B2 (en) 2016-03-14 2019-10-15 Apple Inc. Identification of voice inputs providing credentials
US10530731B1 (en) 2016-03-28 2020-01-07 Snap Inc. Systems and methods for chat with audio and video elements
US9934775B2 (en) 2016-05-26 2018-04-03 Apple Inc. Unit-selection text-to-speech synthesis based on predicted concatenation parameters
US9972304B2 (en) 2016-06-03 2018-05-15 Apple Inc. Privacy preserving distributed evaluation framework for embedded personalized systems
US10249300B2 (en) 2016-06-06 2019-04-02 Apple Inc. Intelligent list reading
US11227589B2 (en) 2016-06-06 2022-01-18 Apple Inc. Intelligent list reading
KR20170138279A (en) * 2016-06-07 2017-12-15 엘지전자 주식회사 Mobile terminal and method for controlling the same
US10049663B2 (en) 2016-06-08 2018-08-14 Apple, Inc. Intelligent automated assistant for media exploration
DK179309B1 (en) 2016-06-09 2018-04-23 Apple Inc Intelligent automated assistant in a home environment
US10586535B2 (en) 2016-06-10 2020-03-10 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US10067938B2 (en) 2016-06-10 2018-09-04 Apple Inc. Multilingual word prediction
US10509862B2 (en) 2016-06-10 2019-12-17 Apple Inc. Dynamic phrase expansion of language input
US10490187B2 (en) 2016-06-10 2019-11-26 Apple Inc. Digital assistant providing automated status report
US10192552B2 (en) 2016-06-10 2019-01-29 Apple Inc. Digital assistant providing whispered speech
DK179343B1 (en) 2016-06-11 2018-05-14 Apple Inc Intelligent task discovery
DK179415B1 (en) 2016-06-11 2018-06-14 Apple Inc Intelligent device arbitration and control
DK179049B1 (en) 2016-06-11 2017-09-18 Apple Inc Data driven natural language event detection and classification
DK201670540A1 (en) 2016-06-11 2018-01-08 Apple Inc Application integration with a digital assistant
DE102016007998A1 (en) * 2016-06-30 2018-01-04 Man Truck & Bus Ag Technology for operating a motor vehicle
US10474753B2 (en) 2016-09-07 2019-11-12 Apple Inc. Language identification using recurrent neural networks
US10043516B2 (en) 2016-09-23 2018-08-07 Apple Inc. Intelligent automated assistant
US11281993B2 (en) 2016-12-05 2022-03-22 Apple Inc. Model and ensemble compression for metric learning
US10593346B2 (en) 2016-12-22 2020-03-17 Apple Inc. Rank-reduced token representation for automatic speech recognition
US11204787B2 (en) 2017-01-09 2021-12-21 Apple Inc. Application integration with a digital assistant
DK201770383A1 (en) 2017-05-09 2018-12-14 Apple Inc. User interface for correcting recognition errors
US10417266B2 (en) 2017-05-09 2019-09-17 Apple Inc. Context-aware ranking of intelligent response suggestions
US10726832B2 (en) 2017-05-11 2020-07-28 Apple Inc. Maintaining privacy of personal information
US10395654B2 (en) 2017-05-11 2019-08-27 Apple Inc. Text normalization based on a data-driven learning network
DK201770439A1 (en) 2017-05-11 2018-12-13 Apple Inc. Offline personal assistant
DK179745B1 (en) 2017-05-12 2019-05-01 Apple Inc. SYNCHRONIZATION AND TASK DELEGATION OF A DIGITAL ASSISTANT
DK201770429A1 (en) 2017-05-12 2018-12-14 Apple Inc. Low-latency intelligent automated assistant
US11301477B2 (en) 2017-05-12 2022-04-12 Apple Inc. Feedback analysis of a digital assistant
DK179496B1 (en) 2017-05-12 2019-01-15 Apple Inc. USER-SPECIFIC Acoustic Models
DK201770431A1 (en) 2017-05-15 2018-12-20 Apple Inc. Optimizing dialogue policy decisions for digital assistants using implicit feedback
DK201770432A1 (en) 2017-05-15 2018-12-21 Apple Inc. Hierarchical belief states for digital assistants
US10311144B2 (en) 2017-05-16 2019-06-04 Apple Inc. Emoji word sense disambiguation
US20180336275A1 (en) 2017-05-16 2018-11-22 Apple Inc. Intelligent automated assistant for media exploration
US20180336892A1 (en) 2017-05-16 2018-11-22 Apple Inc. Detecting a trigger of a digital assistant
DK179560B1 (en) 2017-05-16 2019-02-18 Apple Inc. Far-field extension for digital assistant services
US10403278B2 (en) 2017-05-16 2019-09-03 Apple Inc. Methods and systems for phonetic matching in digital assistant services
US10657328B2 (en) 2017-06-02 2020-05-19 Apple Inc. Multi-task recurrent neural network architecture for efficient morphology handling in neural language modeling
US10574715B2 (en) 2017-08-03 2020-02-25 Streaming Global, Inc. Method and system for aggregating content streams based on sensor data
US10445429B2 (en) 2017-09-21 2019-10-15 Apple Inc. Natural language understanding using vocabularies with compressed serialized tries
USD901522S1 (en) * 2017-09-27 2020-11-10 Toyota Research Institute, Inc. Vehicle heads-up display screen or portion thereof with a graphical user interface
USD868103S1 (en) * 2017-09-27 2019-11-26 Toyota Research Institute, Inc. Display screen or portion thereof with an animated graphical user interface
US10755051B2 (en) 2017-09-29 2020-08-25 Apple Inc. Rule-based natural language processing
US10880353B2 (en) 2017-10-12 2020-12-29 Streaming Global, Inc. Systems and methods for cloud storage direct streaming
US10636424B2 (en) 2017-11-30 2020-04-28 Apple Inc. Multi-turn canned dialog
US10733982B2 (en) 2018-01-08 2020-08-04 Apple Inc. Multi-directional dialog
US10733375B2 (en) 2018-01-31 2020-08-04 Apple Inc. Knowledge-based framework for improving natural language understanding
US10789959B2 (en) 2018-03-02 2020-09-29 Apple Inc. Training speaker recognition models for digital assistants
US10592604B2 (en) 2018-03-12 2020-03-17 Apple Inc. Inverse text normalization for automatic speech recognition
US10818288B2 (en) 2018-03-26 2020-10-27 Apple Inc. Natural assistant interaction
US10909331B2 (en) 2018-03-30 2021-02-02 Apple Inc. Implicit identification of translation payload with neural machine translation
US11145294B2 (en) 2018-05-07 2021-10-12 Apple Inc. Intelligent automated assistant for delivering content from user experiences
US10928918B2 (en) 2018-05-07 2021-02-23 Apple Inc. Raise to speak
US10984780B2 (en) 2018-05-21 2021-04-20 Apple Inc. Global semantic word embeddings using bi-directional recurrent neural networks
DK179822B1 (en) 2018-06-01 2019-07-12 Apple Inc. Voice interaction at a primary device to access call functionality of a companion device
DK201870355A1 (en) 2018-06-01 2019-12-16 Apple Inc. Virtual assistant operation in multi-device environments
DK180639B1 (en) 2018-06-01 2021-11-04 Apple Inc DISABILITY OF ATTENTION-ATTENTIVE VIRTUAL ASSISTANT
US11386266B2 (en) 2018-06-01 2022-07-12 Apple Inc. Text correction
US10892996B2 (en) 2018-06-01 2021-01-12 Apple Inc. Variable latency device coordination
US10496705B1 (en) 2018-06-03 2019-12-03 Apple Inc. Accelerated task performance
USD887431S1 (en) * 2018-06-18 2020-06-16 Genomic Prediction, Inc. Display screen with graphical user interface
US11010561B2 (en) 2018-09-27 2021-05-18 Apple Inc. Sentiment prediction from textual data
US11462215B2 (en) 2018-09-28 2022-10-04 Apple Inc. Multi-modal inputs for voice commands
US10839159B2 (en) 2018-09-28 2020-11-17 Apple Inc. Named entity normalization in a spoken dialog system
US11170166B2 (en) 2018-09-28 2021-11-09 Apple Inc. Neural typographical error modeling via generative adversarial networks
USD916777S1 (en) * 2018-10-15 2021-04-20 Koninklijke Philips N.V. Display screen with graphical user interface
USD914042S1 (en) * 2018-10-15 2021-03-23 Koninklijke Philips N.V. Display screen with graphical user interface
US11475898B2 (en) 2018-10-26 2022-10-18 Apple Inc. Low-latency multi-speaker speech recognition
USD936665S1 (en) * 2018-11-21 2021-11-23 Biosense Webster (Israel) Ltd. Portion of a computer screen with a graphical user interface
US11638059B2 (en) 2019-01-04 2023-04-25 Apple Inc. Content playback on multiple devices
US11348573B2 (en) 2019-03-18 2022-05-31 Apple Inc. Multimodality in digital assistant systems
US11307752B2 (en) 2019-05-06 2022-04-19 Apple Inc. User configurable task triggers
DK201970509A1 (en) 2019-05-06 2021-01-15 Apple Inc Spoken notifications
US11475884B2 (en) 2019-05-06 2022-10-18 Apple Inc. Reducing digital assistant latency when a language is incorrectly determined
US11423908B2 (en) 2019-05-06 2022-08-23 Apple Inc. Interpreting spoken requests
US11140099B2 (en) 2019-05-21 2021-10-05 Apple Inc. Providing message response suggestions
DK201970510A1 (en) 2019-05-31 2021-02-11 Apple Inc Voice identification in digital assistant systems
US11496600B2 (en) 2019-05-31 2022-11-08 Apple Inc. Remote execution of machine-learned models
DK180129B1 (en) 2019-05-31 2020-06-02 Apple Inc. User activity shortcut suggestions
US11289073B2 (en) 2019-05-31 2022-03-29 Apple Inc. Device text to speech
US11194467B2 (en) 2019-06-01 2021-12-07 Apple Inc. Keyboard management user interfaces
US11360641B2 (en) 2019-06-01 2022-06-14 Apple Inc. Increasing the relevance of new available information
US11321904B2 (en) 2019-08-30 2022-05-03 Maxon Computer Gmbh Methods and systems for context passing between nodes in three-dimensional modeling
US11488406B2 (en) 2019-09-25 2022-11-01 Apple Inc. Text detection using global geometry estimators
US11762530B2 (en) 2020-02-05 2023-09-19 Legacy Productions, Inc. Interface for radial selection of time-based events
US11714928B2 (en) 2020-02-27 2023-08-01 Maxon Computer Gmbh Systems and methods for a self-adjusting node workspace
TWD210778S (en) * 2020-05-06 2021-04-01 宏碁股份有限公司 Graphical user interface for a display screen
US11373369B2 (en) 2020-09-02 2022-06-28 Maxon Computer Gmbh Systems and methods for extraction of mesh geometry from straight skeleton for beveled shapes
US11416136B2 (en) 2020-09-14 2022-08-16 Apple Inc. User interfaces for assigning and responding to user inputs
USD1021950S1 (en) * 2020-11-17 2024-04-09 Carrier Corporation Display screen or portion thereof with icon


Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5602564A (en) * 1991-11-14 1997-02-11 Hitachi, Ltd. Graphic data processing system
US6097393A (en) * 1996-09-03 2000-08-01 The Takshele Corporation Computer-executed, three-dimensional graphical resource management process and system
US6359635B1 (en) * 1999-02-03 2002-03-19 Cary D. Perttunen Methods, articles and apparatus for visibly representing information and for providing an input interface
US6549219B2 (en) * 1999-04-09 2003-04-15 International Business Machines Corporation Pie menu graphical user interface
AU2004240229B2 (en) * 2004-12-20 2011-04-07 Canon Kabushiki Kaisha A radial, three-dimensional, hierarchical file system view
JP2007287135A (en) * 2006-03-20 2007-11-01 Denso Corp Image display controller and program for image display controller
US20100241999A1 (en) * 2009-03-19 2010-09-23 Microsoft Corporation Canvas Manipulation Using 3D Spatial Gestures

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5898435A (en) * 1995-10-02 1999-04-27 Sony Corporation Image controlling device and image controlling method
US20060212829A1 (en) * 2005-03-17 2006-09-21 Takao Yahiro Method, program and device for displaying menu
US20070256029A1 (en) * 2006-05-01 2007-11-01 Rpo Pty Limited Systems And Methods For Interfacing A User With A Touch-Screen
US20080022228A1 (en) * 2006-07-24 2008-01-24 Samsung Electronics Co., Ltd. User interface device and method of implementing the same

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110271230A1 (en) * 2010-04-30 2011-11-03 Talkwheel.com, Inc. Visualization and navigation system for complex data and discussion platform
US20120047462A1 (en) * 2010-08-19 2012-02-23 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
US9021398B2 (en) 2011-07-14 2015-04-28 Microsoft Corporation Providing accessibility features on context based radial menus
US9026944B2 (en) 2011-07-14 2015-05-05 Microsoft Technology Licensing, Llc Managing content through actions on context based menus
US9086794B2 (en) 2011-07-14 2015-07-21 Microsoft Technology Licensing, Llc Determining gestures on context based menus
US9116602B2 (en) 2011-07-14 2015-08-25 Microsoft Technology Licensing, Llc Providing customization of context based menus
US9250766B2 (en) 2011-07-14 2016-02-02 Microsoft Technology Licensing, Llc Labels and tooltips for context based menus
US9582187B2 (en) 2011-07-14 2017-02-28 Microsoft Technology Licensing, Llc Dynamic context based menus
US9746995B2 (en) 2011-07-14 2017-08-29 Microsoft Technology Licensing, Llc Launcher for context based menus

Also Published As

Publication number Publication date
US8219930B2 (en) 2012-07-10
US20100333030A1 (en) 2010-12-30

Similar Documents

Publication Publication Date Title
US8219930B2 (en) Radial menu display systems and methods
JP6194278B2 (en) Notification of mobile device events
CN108027706B (en) Application interface display method and terminal equipment
US9270628B2 (en) System and method for providing notifications on a mobile computing device
US8375326B2 (en) Contextual-based and overlaid user interface elements
US9990105B2 (en) Accessible contextual controls within a graphical user interface
US8723808B2 (en) Mobile terminal including touch rotary dial display
AU2014388291B2 (en) Configurable electronic communication element
US9448692B1 (en) Graphical user interface for displaying menu options
US8839129B2 (en) User interface for a communication device
JP6609361B2 (en) Multi-participant live communication user interface
US20110202879A1 (en) Graphical context short menu
US10819840B2 (en) Voice communication method
WO2010071710A1 (en) Communications convergence and user interface systems, apparatuses, and methods
KR20180004783A (en) Information processing method, terminal, and computer storage medium
JP2017508365A (en) Reminder notification linked to an entity
CN113934301A (en) Dismissing notifications in response to presented notifications
WO2016184295A1 (en) Instant messenger method, user equipment and system
US11765114B2 (en) Voice communication method
US20130067374A1 (en) Method for directly manipulating incoming interactions in an instant communication client application
JP2012503364A (en) Predetermined response method and apparatus for wireless device
AU2022202360B2 (en) Voice communication method
US20150200889A1 (en) System and method for sending messages
WO2018213506A2 (en) Voice communication method
AU2019100525C4 (en) Voice communication method

Legal Events

Date Code Title Description
AS Assignment

Owner name: VERIZON PATENT AND LICENSING INC., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:JOHNS, GREG A.;REEL/FRAME:028144/0375

Effective date: 20090622

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION