US20120131479A1 - Resolution Independent User Interface Design

Resolution Independent User Interface Design

Info

Publication number
US20120131479A1
Authority
US
United States
Prior art keywords
user interface
graphical user
interface element
layers
subset
Prior art date
Legal status
Abandoned
Application number
US13/359,169
Inventor
Mark Zimmer
Current Assignee
Apple Inc
Original Assignee
Apple Inc
Priority date
Filing date
Publication date
Priority claimed from US10/876,298 (now U.S. Pat. No. 8,068,103)
Application filed by Apple Inc
Priority to US13/359,169
Assigned to APPLE INC. (assignment of assignors interest; see document for details). Assignors: ZIMMER, MARK
Publication of US20120131479A1
Status: Abandoned

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 - Arrangements for software engineering
    • G06F 8/30 - Creation or generation of source code
    • G06F 8/38 - Creation or generation of source code for implementing user interfaces
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Definitions

  • Material maps may also be represented in procedural fashion.
  • To this end, interface object design application 400 (see FIG. 4) may be enhanced to provide material map editor window 600, through which material maps may be defined and, subsequently, represented procedurally. Material map editor window 600 includes material map display region 605, material map control region 610, light property region 615 and light list region 620.
  • Material map display region 605 graphically displays the currently selected material map and may also display individual light sources associated with the displayed material map.
  • Light sources may be created by “clicking” on the material map's graphical display where a light handle does not already exist (in FIG. 6A, light handles are represented by circles in display region 605).
  • When a new light source is created in this way, a duplicate of the currently selected light source is created and placed at the location that was clicked.
  • A light source may be moved by selecting (e.g., “clicking”) and dragging its light handle to the desired position.
  • A user may place a light source “behind” the material map by placing the light source off the material map's surface, as shown with light handles 625 in FIG. 6A.
  • A light source may be selected by “clicking” on it (see discussion below).
  • Material map control region 610 permits the user to control the overall presentation of a material map and, in addition, selection and storage of the material map.
  • The material map may be displayed over a checkerboard background by selecting the “Over Checkerboard” check-box, and light handles may be displayed by selecting the “Display Light Handles” check-box.
  • The “Show” drop-down menu allows a user to show the material map being built (by selecting the “Material” item) or a flat material map that is an image in accordance with prior art material maps (by selecting the “Original” item).
  • The “Files” drop-down menu permits the user to open a previously stored procedural representation of a material map (i.e., a Material Recipe file), save the displayed material map as a material recipe file, open an original graphical material map (i.e., an image) or save the displayed material map as an image (i.e., to generate a graphical material map).
  • Light property list region 615 permits the user to set various properties of a selected light source. For example, if light source 630 is selected (see FIG. 6A), its properties would appear in region 615. Through slider controls of the type generally known in the art, the properties of the selected light source may be adjusted. For example, the power slider may be used to set whether the light source is diffuse (a lower value) or specular (a higher value).
  • The image scale slider may be used to set the size of the image that is reflected off the material map's surface; the image angle slider may be used to control the rotation of the reflected image about the point of contact between the image and the material map's surface; and the image X and Y offset sliders may be used to control the point at which the reflected image contacts the material map's surface.
  • A light source's “type” may be selected as colored, image, image masked or reflection map.
  • A “colored” light source is a colored light with a circular distribution (when viewed straight-on) and is defined such that it can describe a diffuse (e.g., a power value of 0) or specular (e.g., a power value of 64) light source.
  • An “image” light source describes an image, potentially masked by using an alpha transparency mask, that is placed tangent to the material map's surface at the location of its associated light handle.
  • An “image masked” light source is like an “image” light source but is additionally masked by the same distribution as defined by a colored light source.
  • A “reflection map” light source is a whole light ball image that is placed on top of the material map's surface and which may be rotated by the “image angle” slider. For a reflection map light source, only the image angle slider, of all the image-applicable sliders, is applicable.
  • The “Blend Mode” of the selected light source may be set through the “Blend Mode” drop-down menu as shown in FIG. 6E. Since each of the blend modes identified in FIG. 6E is generally known in the art, they will not be described further.
  • Light list region 620 lists all light sources associated with the displayed material map.
  • Region 620 permits the user to activate each light source (e.g., through “On” check-boxes) and to set the “Brightness” of each light source (e.g., through slider-type controls) individually.
  • Individual entries in the list of light sources may be selected (e.g., by “clicking” on their “Name”) and dragged up or down in the list to adjust their display priority.
  • The first light source in the list has the front-most priority while the last light source in the list has the back-most priority. In this way, light sources may be treated like “layers” that are composited on top of each other using blend modes, as sketched below. This manner of layering objects is well-known in the art.
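  • By way of illustration only (this sketch is not part of the original disclosure), back-to-front compositing of such light “layers” with the normal blend mode might be written in C as follows; the LightLayer structure, the premultiplied-alpha formulation and the treatment of the brightness slider are assumptions:

      #include <stddef.h>

      /* Hypothetical per-light contribution at one material-map pixel:
       * premultiplied RGBA plus the "On" and "Brightness" controls. */
      typedef struct {
          float r, g, b, a;   /* premultiplied color and alpha */
          int   on;           /* "On" check-box                */
          float brightness;   /* 0.0 .. 1.0 slider             */
      } LightLayer;

      typedef struct { float r, g, b, a; } MapPixel;

      /* Lights are listed front-most first, so composite them back-to-front
       * (last list entry first) with the standard source-over blend. */
      static MapPixel composite_lights(MapPixel base, const LightLayer *lights, size_t count)
      {
          for (size_t i = count; i-- > 0; ) {
              if (!lights[i].on)
                  continue;
              float a = lights[i].a * lights[i].brightness;
              base.r = lights[i].r * lights[i].brightness + base.r * (1.0f - a);
              base.g = lights[i].g * lights[i].brightness + base.g * (1.0f - a);
              base.b = lights[i].b * lights[i].brightness + base.b * (1.0f - a);
              base.a = a + base.a * (1.0f - a);
          }
          return base;
      }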
  • Image, image masked or reflection map lights may be procedurally defined through light maker window 700.
  • Light maker window 700 includes light image display region 705, light image primitive region 710, and light image focus region 715.
  • Light image display region 705 shows a graphical representation of a light source that is defined in accordance with the attribute or characteristic values set (and shown) in regions 710 and 715.
  • A circular type light image is shown in FIG. 7A.
  • As indicated in light image primitive region 710, other types of light sources that may be defined procedurally in accordance with the invention include rectangle shapes (“Rectangle”), horizontal lines (“Lines”) and rectangles (i.e., a pane split horizontally or vertically, or both).
  • An existing image file may be imported by selecting the “File” item in FIG. 7B.
  • Light image primitive region 710 includes five slider controls that may be used to adjust or set the value corresponding to the light source's radius, width, height, spacing and thickness characteristics or attributes.
  • The Radius slider applies to the Circle image type and controls its size.
  • The Width and Height sliders apply to the Rectangle, Lines, and Rectangles image types and control their size (bounds).
  • The Spacing slider applies to the Lines and Rectangles image types and controls the distance between lines or between sub-panes.
  • The Thickness slider applies to the Lines and Rectangles image types and controls the amount of coverage for the lines or sub-panes.
  • Slider controls may be used in light image focus region 715 to set the amount of blur applied to the image.
  • The Upper and Lower sliders control the amount of blur, in pixels, applied to the top and bottom of the image primitive. In the example illustrated in FIG. 7A, they apply to the top and bottom of the circle, respectively.
  • Each property identified through light property region 615 and light list region 620 is used as an attribute tag, and the property's associated value (e.g., the Power value of 1.0 and the Img. Scale value of 100 shown in FIG. 6A) is used as the value for that attribute. This is in keeping with the attribute-value description illustrated in Table 1 above.
  • These attribute-value pairs may be stored in a flat file or a hierarchically-ordered file such as an XML file.
  • Default values may be assigned to one or more of the material map's properties.
  • For example, a single light source may be instantiated with a specified collection of default property values.
  • Such a default light source may be “Colored” (see FIG. 6D), have a “Normal” blend mode (see FIG. 6E), a “Power” value of 1, be “on” and have a brightness of 50%, as sketched below.
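  • For illustration only, such a default light source could be mirrored in C roughly as follows; the type names, enumerators and field names are hypothetical and do not come from the patent:

      /* Hypothetical attribute structure for one light source. */
      typedef enum { LIGHT_COLORED, LIGHT_IMAGE, LIGHT_IMAGE_MASKED, LIGHT_REFLECTION_MAP } LightType;
      typedef enum { BLEND_NORMAL /* , other blend modes ... */ } BlendMode;

      typedef struct {
          LightType type;
          BlendMode blend_mode;
          float     power;        /* 0 = diffuse .. 64 = specular     */
          int       on;           /* "On" check-box                   */
          float     brightness;   /* 0.0 .. 1.0 ("Brightness" slider) */
      } LightSource;

      /* Defaults described above: Colored, Normal blend, Power 1, on, 50%. */
      static const LightSource kDefaultLight = {
          LIGHT_COLORED, BLEND_NORMAL, 1.0f, 1, 0.5f
      };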
  • Recipe files in accordance with the invention may be used to dynamically generate images, which can reduce the amount of memory needed to store a user interface and substantially reduce the time required to create a specific user interface element.
  • Because material maps in accordance with the invention are represented in a procedural manner, they are resolution independent. This is in sharp contrast with prior art material maps that rely on an image having a set or fixed resolution.
  • Thus, a single (procedurally defined) material map may be used for all resolutions rather than having multiple material maps, one for each display resolution.
  • In addition, because material maps in accordance with the invention are procedural in nature, they may be encrypted to protect their content. (That is, the text recipe file is encrypted.)
  • Attributes other than, or in addition to, those identified in Table 1 and in FIG. 6 may be used to specify an object.
  • Similarly, hierarchical storage means other than an XML file may be used to store an object's procedural specification.
  • A programmable control device may be a single computer processor, a special purpose processor (e.g., a digital signal processor, a graphics processing unit or a programmable graphics processing unit), a plurality of processors coupled by a communications link or a custom designed state machine.
  • Custom designed state machines may be embodied in a hardware device such as an integrated circuit including, but not limited to, application specific integrated circuits (“ASICs”) or field programmable gate arrays (“FPGAs”).
  • Storage devices suitable for tangibly embodying program instructions include, but are not limited to: magnetic disks (fixed, floppy, and removable) and tape; optical media such as CD-ROMs and digital video disks (“DVDs”); and semiconductor memory devices such as Electrically Programmable Read-Only Memory (“EPROM”), Electrically Erasable Programmable Read-Only Memory (“EEPROM”), Programmable Gate Arrays and flash devices.

Abstract

Graphical user interface material map objects are specified by a collection of attribute-value pairs, the collection of which comprises a complete description of the material map and may be used by a rendering engine to create a visual representation of the material map at any resolution. That is, material map representations in accordance with the invention are resolution independent. Another benefit of representing material maps in accordance with the invention is that they may be encrypted to prevent unauthorized inspection or use.

Description

    RELATED APPLICATION
  • This is a continuation application of U.S. patent application Ser. No. 11/459,140 filed 21 Jul. 2006, now U.S. Pat. No. ______, which is a continuation-in-part of U.S. patent application Ser. No. 10/876,298, now U.S. Pat. No. 8,068,103 entitled “User-Interface Design,” filed 24 Jun. 2004. This application claims priority to and incorporates each of the aforementioned applications in their entirety. This application is also related to U.S. patent application Ser. No. 11/696,631 filed 4 Apr. 2007 having the same title and inventor.
  • BACKGROUND
  • The invention relates generally to graphical user interface design and more particularly to a means for specifying a graphical user interface object in a procedural and largely display resolution independent manner.
  • Designing an efficient, ergonomic and aesthetically pleasing user interface is an integral stage of most application development projects. The graphical user interface (“GUI”) is what the user sees and interacts with. Accordingly, the GUI must present information and choices to a user in a way that is not only pleasing and natural to the eye but conducive to efficient use of the underlying application. One major concern in the development of modern GUIs is the resolution of the various objects that comprise the GUI. Typically, a designer designs a graphical user interface object (e.g., a pushbutton, scrollbar, or slider) for a specified resolution. As the resolution of the user's display changes, however, display of the originally designed object may become distorted. This is particularly a problem when a graphical object is designed at a first resolution (e.g., 75 or 100 pixels per inch) and the user's display is at a second, higher resolution (e.g., 120 or 150 pixels per inch).
  • In the past, two general techniques have been used to address the problem associated with displaying objects designed for a first resolution but which are displayed at a second resolution. In the first, an original (low resolution) object is up-sampled to generate a larger image (e.g., through linear or bicubic interpolation). This technique results in blurry edges such that the user interface no longer looks crisp. In the second, an original object is designed for display at a high resolution and is then down-sampled to an unknown target resolution. While useful in some circumstances, it is not possible a priori to know what width to give a line (e.g., an object's edge) at the higher resolution such that when down-sampled it remains crisp. This is particularly true when there are multiple target resolutions. Thus, both up-sampling and down-sampling techniques tend to disturb the designer's specified line width. One of ordinary skill in the art will recognize that line width is a critical factor in GUI design as the width of lines define the edge of graphical objects. If edges appear blurry or ill-defined, the entire GUI design may be compromised.
  • Thus, it would be beneficial to provide a means to specify the design of a graphical user interface object independent of its display resolution. Such a description may advantageously be used by a rendering module to display the designed object at substantially any resolution.
  • SUMMARY
  • In one embodiment, the invention provides a method to represent a graphical user interface object's material map in a procedural and, therefore, resolution independent manner. The method includes receiving values for each of a plurality of attributes associated with a material map object, associating a value for each of the plurality of attributes, and storing the plurality of attributes and their associated values in a file. The file may be a “flat” file or a hierarchically-ordered file. The collection of attribute-value pairs comprise a complete description of the graphical user interface object's material map and may be used by a rendering module to create a visual representation of the material map at any number of resolutions. In addition, because material maps in accordance with the invention are represented procedurally, they may be encrypted to prevent unauthorized inspection or use.
  • Those of ordinary skill in the art will recognize that methods in accordance with the described invention may be embodied in programs, program modules or applications that may be stored in any media that is readable and executable by a computer system.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows, in block-diagram format, generation of recipe files in accordance with one embodiment of the invention.
  • FIG. 2 shows a hierarchical structure for use in a recipe file in accordance with one embodiment of the invention.
  • FIG. 3 shows, in block-diagram format, use of recipe files in accordance with one embodiment of the invention.
  • FIG. 4 shows a screen image of a graphical user interface object design application in accordance with one embodiment of the invention.
  • FIGS. 5A through 5J show screen images of various graphical object layer attributes and values in accordance with one embodiment of the invention.
  • FIGS. 6A through 6E show screen images of various features of a material map editor window in accordance with one embodiment of the invention.
  • FIGS. 7A and 7B show screen images of a light source editor window in accordance with one embodiment of the invention.
  • DETAILED DESCRIPTION
  • Methods, devices and systems to describe or capture the design of graphical user interface objects in a procedural and, largely resolution independent, manner are described. The following embodiments of the invention, described in terms of graphical user interface object design conforming to the Apple Human Interface Guidelines, are illustrative only and are not to be considered limiting in any respect. (The Apple Human Interface Guidelines are available from Apple Computer, Inc. of Cupertino, Calif.)
  • It has been determined that a graphical user interface object may be completely described by a collection of resolution-independent attributes. The collection of all attributes for a given object type define that type of object (e.g., pushbuttons). While the attributes used to define an object may vary, in whole or in part, from object-type to object-type, one of ordinary skill in the art will recognize those attributes needed to completely specify a given object. For example, while some attributes may be associated with a number of different graphical interface objects (e.g., those associated with an object's location in a display window), many attributes may change from object to object (e.g., buttons have attributes associated with the “button” metaphor while check-boxes and slider tracks have attributes particular to their visual metaphor). Accordingly, the attributes identified herein are illustrative only and should not be used to limit the claimed methods, devices and systems.
  • Just as a specific collection of attributes define a specific type of graphical interface object (e.g., pushbuttons), the values associated with each of the specific attributes define a particular implementation or embodiment of the object (e.g., a regular size, deactivated pushbutton). In accordance with the invention, some attribute-values are specified independent of the resolution at which the object is to be displayed while other attribute-values are specified for two or more resolutions. In general, those attributes associated with the physical location or geometry of an object may be associated with a plurality of values—one value for each specified resolution. Other attributes are associated with fixed, resolution independent, values (e.g., visual characteristics such as opacity, color and curvature). By way of example, consider a pushbutton object whose radius (attribute “buttonRadius”) is specified for each of five predetermined resolutions (e.g., 100, 120, 125, 133.3 and 150 pixels per inch), but whose outline color (attributes “outlineRed”, “outlineGreen”, and “outlineBlue”) and opacity (attribute “buttonOpacity”) are fixed and resolution independent.
  • Thus, in accordance with the invention the collection of all attribute-values for a specified object completely define its visual characteristics and, as such, may be used to drive the graphical generation of the object (for example, by a rendering engine or operating system module responsible for rendering images). One benefit of object definitions in accordance with the invention is that graphical objects are defined in terms of a collection of resolution independent attributes. Another benefit of object definitions in accordance with the invention is that each attribute may be associated with a plurality of values, thereby permitting the designer to optimize the object's design for each of a specified number of resolutions. Still another benefit of object definitions in accordance with the invention is that if the actual displayed resolution of the graphical object is between two of the resolutions specified by the designer, the rendering engine may interpolate between the two values—a technique that generally provides a significantly improved display over prior art up-sampling or down-sampling techniques.
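  • By way of illustration only, and not as part of the original disclosure, the interpolation step described above might be written in C as follows; the data layout and function name are hypothetical, and simple linear interpolation is merely assumed:

      /* One attribute value specified for a particular resolution, e.g. a
       * pushbutton's buttonRadius of 10.5 at 100 ppi and 13 at 125 ppi. */
      typedef struct {
          float resolution;   /* pixels per inch */
          float value;
      } ResolutionValue;

      /* Return an attribute's value for an arbitrary display resolution by
       * interpolating between the two nearest designer-specified resolutions.
       * Entries are assumed to be sorted by ascending resolution. */
      static float attribute_value_at(const ResolutionValue *rv, int count, float ppi)
      {
          if (ppi <= rv[0].resolution)
              return rv[0].value;
          for (int i = 1; i < count; ++i) {
              if (ppi <= rv[i].resolution) {
                  float t = (ppi - rv[i - 1].resolution) /
                            (rv[i].resolution - rv[i - 1].resolution);
                  return rv[i - 1].value + t * (rv[i].value - rv[i - 1].value);
              }
          }
          return rv[count - 1].value;   /* beyond the highest specified resolution */
      }

  • Using the pushbutton buttonRadius values from Table 1 below (10.5 at 100 pixels per inch and 13 at 125 pixels per inch), a display at 112.5 pixels per inch would receive an interpolated radius of 11.75.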
  • In accordance with the invention, a graphical user interface object's design is specified by a collection of attribute-value pairs that are retained or stored in a file, hereinafter referred to as a “recipe” file. In one embodiment, the recipe file may be a “flat” file consisting of a sequential listing of attribute-value pairs. In another embodiment, the recipe file may be a hierarchically ordered file representing an inverted tree, where the root of the tree identifies the type of graphical object (e.g., a pushbutton, a check-box or a slider track) and the first level below the root identifies categories associated with the object (e.g., size, color and state). In one particular embodiment, hierarchically ordered recipe files are stored as eXtensible Markup Language (“XML”) files. Attributes and their values are then associated with each node. In this way, every aspect of a graphical user interface object may be systematically identified and recorded in the recipe file in a manner that is wholly independent from the method used to physically draw (render) the image on a computer display device.
  • Methods, devices and systems in accordance with the invention may be described in terms of two phases. In a first phase, recipe files are generated. In a second phase the recipe files are used to generate visual representations of the graphical user interface object for one or more applications at substantially any resolution.
  • Referring to FIG. 1, phase 1 100 is typically performed by designer 105 interacting with GUI design application 110 to generate one or more recipe files 115 for application 120. As noted above, recipe file(s) 115 may be organized in a hierarchical manner. FIG. 2 shows hierarchy 200 that has been found useful in the design of graphical user interface objects in accordance with the invention. As illustrated, root node 205 identifies the object type (e.g., pushbutton). Subordinate to root node 205 are nodes representing the relative size of the displayed object: Regular 210, Small 215 and Mini 220. It will be recognized that the sizes represented by nodes 210, 215 and 220 refer to the relative physical sizes of the displayed object and do not relate to the resolution at which such objects are displayed. Subordinate to the size nodes are “color” nodes, representing the fact that each (sub-) type of object may be associated with similar or separate and distinct color characteristics. Subordinate to the color nodes are states that each version of the object may assume, where a state is defined by the collection of attribute-value pairs associated with that (sub-) type of object. Thus, each node in the tree (root node 205 included) has a set of attribute-value pairs associated with it. In one embodiment, the root is fully populated—it always contains all attribute-value pairs needed to define the object. In this embodiment, each subordinate node only contains attribute-value pairs that serve to override the inherited attribute values of their parent node. For example, “Regular” node 210 may only possess size-based attributes (and their associated values), while Color-A node subordinate to node 210 may only serve to override the object's material map attribute-value (the main color of the object) and also perhaps the outline color attribute values.
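  • As an illustrative sketch only (the node structure and lookup routine are assumptions, not the patent's implementation), the parent-to-child override behavior just described might be resolved in C as follows:

      #include <stddef.h>
      #include <string.h>

      /* One node of the recipe hierarchy (e.g. root "pushbutton" -> "Regular"
       * -> "Color-A" -> state).  Only the root is guaranteed to carry every
       * attribute; subordinate nodes carry just the pairs they override. */
      typedef struct RecipeNode {
          const struct RecipeNode *parent;
          const char             **keys;     /* attribute names  */
          const char             **values;   /* attribute values */
          size_t                   count;
      } RecipeNode;

      /* Walk from the selected node toward the root and return the first
       * value found, so child overrides take priority over inherited values. */
      static const char *recipe_lookup(const RecipeNode *node, const char *attr)
      {
          for (; node != NULL; node = node->parent)
              for (size_t i = 0; i < node->count; ++i)
                  if (strcmp(node->keys[i], attr) == 0)
                      return node->values[i];
          return NULL;   /* not reached when the root is fully populated */
      }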
  • Referring to FIG. 3, phase 2 300 is typically performed when user 305 interacts with application 120 in such a manner as to require display of the graphical user object designed in accordance with phase 1 100. In one embodiment when this occurs, application 120 transmits recipe file 115 to rendering engine 310 which returns bitmap 315 which, ultimately, is presented to user 305 via display unit 320. In another embodiment, application 120 may extract the necessary information (in the form of attribute-value pairs) from recipe file 115 and transmit those to rendering engine 310. In still another embodiment, application 120 may indicate to rendering engine 310 where the recipe file(s) are located. One of ordinary skill will recognize that rendering engine 310 may be a stand-alone component or module directly accessible by applications or, alternatively, may be but one module of a larger graphical processing framework supported by the underlying operating system. One such modular or “framework” approach is described in the commonly owned and co-pending patent application entitled “System for Optimizing Graphics Operations” by John Harper, Ralph Brunner, Peter Graffagnino, and Mark Zimmer, Ser. No. 10/825,694, incorporated herein by reference in its entirety.
  • Referring to FIG. 4, in one embodiment a GUI designer may use interface object design application 400 to design, for example, a pushbutton object. Illustrative design application 400 includes browser region 405, resolution display region 410, expanded bit display region 415, object shape region 420 and user interface construction region 425.
  • Browser region 405 permits the selected display of various aspects of an object's design. In particular, region 405 provides a graphical representation of a recipe file's hierarchical structure in an Apple standard “Finder” format: the left-most pane identifies the recipe file's root (which, in turn, identifies the type of object—a pushbutton); the middle pane identifies categories of pushbuttons (e.g., inactive, mini, pressed, pulsed, regular and small). Subsequent panes display subcategories associated with a selected category. For example, disclosure triangles 430 indicate that pushbutton categories mini, pressed, pulsed, regular and small have additional aspects—the details of which are displayed in the right-most (and possibly subsequent) panes when one of these categories is selected.
  • Resolution display region 410 identifies one or more resolutions for which the object is being designed. As shown, the designer has specified that at least some attribute values for a pushbutton are specified for resolutions of 100, 120, 125, 133.3 and 150 pixels per inch. As noted above, not all attribute values are specified for each of these resolutions, only those that the designer determines are significant to the object's display. Illustrative attribute-value pairs for a pushbutton object and a scrollbar object, including those attributes having multiple values, are shown in Table 1 below. (It will be recognized that the objects shown in region 410 are not actually displayed at the indicated resolution, but are instead “simulations” of how the object would appear at those resolutions.)
  • Expanded bit display region 415 shows an expanded representation of the selected resolution image. In the illustrated example of FIG. 4, region 415 shows the 8× pixel-zoomed representation of the 100 pixel per inch pushbutton. Region 415 may be used, for example, to visually inspect the quality of the user interface object rendering or to compare the user interface object rendering with another pixel-zoomed rendering displayed in an image-editing application.
  • Object shape region 420 permits the designer to select, view and specify attribute values associated with a particular shape of the object being designed. For example, in the illustrated embodiment a pushbutton's shape may be any one of the shapes identified by shape buttons 435: Round, Lozenge (“Lozen . . . ”), Round Rectangle (“Round . . . ”), odd (“Scroll bar cap odd end”) or custom. Immediately beneath shape buttons 435, area 440 shows specific attributes associated with the selected shape and, through controls such as slider 445, text box 450, radio button 455 or color well 460 permits the designer to change the value associated with those attributes.
  • User interface construction region 425 serves as the primary interface for viewing and specifying attribute values associated with an object's various visual characteristics or layers. In the embodiment of FIG. 4, for example, a pushbutton may be comprised of Button, Inlay, Outside Shadow, Inside Shadow, Inside Coloring, Outline, Highlight, Figure, Master and Template Match layers. Each layer may be individually selected (noted by a check box alongside the layer's title) and each layer's respective attributes (and means for setting their value) may be individually disclosed through activation of their disclosure triangles (the dark triangle to the immediate left of each layer title's check box), see FIGS. 5A through 5J.
  • Thus, in accordance with the invention a graphical user interface object may be completely defined by a collection of attribute-value pairs that may be used by a rendering engine (or similar module) to display the object. Further, one or more attributes may have two or more values, wherein each value is associated with a specific display resolution. This latter feature permits a designer to uniquely and specifically optimize a single design for multiple resolutions with the added benefit of providing sufficient information for interpolation (generally performed by the rendering engine) should the actual resolution be different from any of the specified resolutions. By way of example only, Table 1 comprises a listing of attributes and their associated values for a pushbutton object and a scrollbar object.
  • TABLE 1
    Illustrative Attribute-Value Pairs
    Attribute Pushbutton Value Scrollbar Value
    buttonCenterX [1] 48/100, 48/133.333, 48/100, 48.5/120,
    47.5/150 48/125, 48.5/133.333,
    48.5/150
    buttonCenterY [1] 48/100, 48/120, 48.5/125, 48/100, 48.5/120,
    48.5/133.333, 48/150 48/125, 48.5/133.333,
    48.5/150
    buttonCurvature 0.7071 0.7071
    buttonMaterialAngle [2] 0 0
    buttonMaterialBlur [2] 0 0
    buttonMaterialBright [2] 0 0
    buttonMaterialChoke [2] 1 1
    buttonMaterialContr [2] 0 0
    buttonMaterialEdge [2] 0 0
    buttonMaterialFlat [2] 0 0
    buttonMaterialName [2], [3] clearmap aquamaterial
    buttonMaterialPull [2] 0 0
    buttonMaxX 55 217
    buttonMaxY 60 213
    buttonMinX 23 185
    buttonMinY 36 189
    buttonOffsetX 0 0
    buttonOffsetY 0 0
    buttonOpacity 0.868217 1
    buttonPoint1X [1] 43/100, 42/120, 41.5/125, 189
    41.5/133.333, 40/150
    buttonPoint1Y [1] 48/100, 48/120, 48.5/125, 201
    48.5/133.333, 48/150
    buttonPoint2X [1] 53/100, 54/120, 54.5/125, 213
    54.5/133.333, 55/150
    buttonPoint2Y [1] 48/100, 48/120, 48.5/125, 201
    48.5/133.333, 48/150
    buttonRadius [1] 10.5/100, 13/125, 7.5/100, 9/120,
    14/133.333, 15.5/150 9.5/125, 11/150
    buttonRoundness 0.5 0.5
    buttonType 1 0
    figureBlendMode 0 0
    figureBlue 0 0
    figureFillWithColor 0 0
    figureGreen 0 0
    figureName [3] mixed figure
    figureOpacity 1 1
    figureRed 0 0
    figureSize 0.5 0.5
    figureXPosition 0 0
    figureYPosition 0 0
    highlightMaterialAngle [2] 0 0
    highlightMaterialBlur [2] 41.509434 0
    highlightMaterialBright [2] −0.245283 0
    highlightMaterialChoke [2] 0.532075 1
    highlightMaterialContr [2] 0.433962 0
    highlightMaterialEdge [2] 0.481132 0
    highlightMaterialFlat [2] −0.226415 0
    highlightMaterialName [2] glasshighlightmaterial highlightmaterial
    highlightMaterialPull [2] −0.057/100, −0.038/120, 0
    −0.075/125, −0.075/150
    highlightOpacity 0.279683 1
    inlayMaterialAngle [2] 0 0
    inlayMaterialBlur [2] 0 0
    inlayMaterialBright [2] 0 0
    inlayMaterialChoke [2] 1 1
    inlayMaterialContr [2] 0 0
    inlayMaterialEdge [2] 0 0
    inlayMaterialFlat [2] 0 0
    inlayMaterialName [2], [3] inlaymaterial inlaymaterial
    inlayMaterialPull [2] 0 0
    inlayMaxRadius [1] 12.757/100, 15.795/125, 9.133/100, 10.935/120,
    17.01/133.333, 18.833/150 11.543/125,
    13.365/150
    inlayOpacity 1 1
    inlayThickness 0.43 0.43
    insideColoringBlue 0.386252 0
    insideColoringGreen 0.336153 0
    insideColoringOpacity 0.1 0.1
    insideColoringRed 0.705882 0
    insideShadowBlue 0 0
    insideShadowBlurRadius [1] 1.5/100, 1.857/125, 1.072/100, 1.286/120,
    2/133.333, 2.214/150 1.358/125, 1.572/150
    insideShadowGreen 0 0
    insideShadowOffsetX −0 0
    insideShadowOffsetY [1] 0.75/100, 0.929/125, 0.536/100, 0.643/120,
    1/133.333, 1.107/150 0.679/125, 0.786/150
    insideShadowOpacity 0.60686 1
    insideShadowRed 0 0
    masterOpacity 0.744186 1
    oddDirection 0 0
    outlineBlue 0.968326 0
    outlineFade 1 0
    outlineFadeAngle 0 0
    outlineFadeWidth [1] 31.78/100, 39.346/125, 1/100, 1.2/120,
    42.373/133.333, 46.913/150 1.267/125, 1.467/150
    outlineGreen 0.176788 0
    outlineMaxRadius [1] 10.5/100, 13/125, 7.5/100, 9/120,
    14/133.333, 15.5/150 9.5/125, 11/150
    outlineOpacity 0.601583 0.4
    outlineRed 0.242527 0
    outlineThickness [1] 0.175/100, 0.144/120, 0.267/100, 0.222/120,
    0.139/125, 0.129/133.333, 0.211/125,
    0.116/150 0.2/133.333, 0.182/150
    outsideShadowBlue 0 0
    outsideShadowBlurRadius 0.66 1.07175
    outsideShadowGreen 0 0
    outsideShadowOffsetX −0 0
    outsideShadowOffsetY 1.503958 0
    outsideShadowOpacity 0.601583 1
    outsideShadowRed 0 0
    outsideShadowScale 1 1
    roundRectHorizontal 1 1
    roundRectPointerDirection 0 0
    roundRectPointerShape 0 0
    roundRectPointiness 1.570796 1.570796
    showButton 1 1
    showFigure 1 0
    showHighlight 1 1
    showInlay 0 0
    showInsideColoring 0 0
    showInsideShadow 0 0
    showOutline 0 0
    showOutlineShadow 0 0
    templateMatchBottom [1] 7.308/100, 9.048/125, 7.5/100, 9/120,
    9.744/133.333, 10.788/150 9.5/125, 11/150
    templateMatchChop [1] 1 0
    templateMatchHoriz [1] 12.348/100, 15.288/125, 7.5/100, 9/120,
    16.464/133.333, 18.228/150 9.5/125, 11/150
    templateMatchLeft [1] 6.552/100, 8.112/125, 7.5/100, 9/120,
    8.736/133.333, 9.672/150 9.5/125, 11/150
    templateMatchRight [1] 6.3/100, 7.8/125, 7.5/100, 9/120,
    8.4/133.333, 9.3/150 9.5/125, 11/150
    templateMatchTop [1] 3.024/100, 3.744/125, 7.5/100, 9/120,
    4.032/133.333, 4.464/150 9.5/125, 11/150
    templateMatchVert 0 7.5/100, 9/120,
    9.5/125, 11/150
    undulationAmount 0 0
    undulationBlue 0 0
    undulationGreen 0 0
    undulationPeriod [1] 22/100, 27/125, 28/133.333, 16
    33/150
    undulationRed 0 0
    [1] The notation W/100, X/125, Y/133.333 and Z/150 indicates a value W should be used for a resolution of 100 pixels per inch, and so forth.
    [2] Attributes whose values are set through material maps (i.e., button material map, highlight material map and inlay material map).
    [3] Represents a file name. For example, an extension is added (e.g., “.png” for image files or “.pdf” for vector line art files).
  • In a current embodiment, graphical user interface objects identified in Table 2 may be defined/specified using the attributes (left-hand column) identified in Table 1. It will also be recognized that while many of the attributes above are specified by values in units of pixels, in other embodiments attribute values may be expressed in terms of a relative factor to a predetermined size factor.
  • TABLE 2
    Illustrative Graphical Interface Objects
    Help Button (regular, small, mini)
    Back Button (regular, small)
    Round Button (regular, small)
    Push Button (regular, small, mini)
    Square Bevel Button (regular, small, mini)
    Rounded Bevel Button (regular, small, mini)
    Metal Button (regular, small, mini)
    Segment Control (regular, small, mini)
    Window Title Bar Controls (regular, small, mini)
    Disclosure Button (regular, small, mini)
    Arrow Pop-Up Button (regular, small, mini)
    Pop-Up Button (regular, small, mini)
    Combo Button (regular, small, mini)
    Pulldown Button (regular, small, mini)
    Check Box (regular, small, mini)
    Radio Button (regular, small, mini)
    Scroll Bar Track (regular, small)
    Scroll Bar Thumb (regular, small)
    Scroll Bar Caps (regular, small)
    Slider Track (regular, small, mini)
    Circular Slider Thumb (regular, small, mini)
    Pointed Slider Thumb (north, east, south, west orientations) (regular, small, mini)
    Rectangular Text Field (regular, small, mini)
    Round Text Field (regular, small, mini)
    Tabs (north, east, south, west orientations)
    Determinate Progress Bar (regular, small)
    Asynchronous Progress Indicator
    iDisk Synch Progress Indicator
    Pane Splitter
    Drawer
    List Box
    Metal Window Shaping
  • To create a graphical user interface object, the body color of the object (for each point on the object) and the anti-aliased visibility mask of the object are needed. The body color of an object may be obtained by using a three-dimensional representation of the object, or by creating a virtual representation of the object that defines the surface normal for each pixel on the object. Once a unit-length surface normal vector is computed at a point p, the x and y coordinate values of this vector may be used to compute the apparent color of the object at point p by looking up a color from the object's relevant material map. (One of ordinary skill in the art will recognize that the term “material map” is also referred to as “environmental map,” “reflection map” and “sphere map.”) If the map is n pixels high and n pixels wide (this is done because a shaded sphere is inscribed in the map), one can address the material map at the two-dimensional location given by:
  • ( (x + 1) · n/2 , (y + 1) · n/2 )    EQ. 1
  • The color of the material map at this location may be used as the color for the object at point p. To get an accurate result, it is typical for a material map to be much larger (for example, 256×256 pixels or larger) than the graphical user interface object being rendered.
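  • As a concrete reading of EQ. 1 (a sketch for illustration, not language from the specification), the lookup below takes the x and y components of a unit surface normal and addresses an n×n material map; nearest-pixel sampling and a nested-list image representation are assumptions.

    # Minimal sketch: address an n-by-n material map using a unit surface normal.
    # material_map[row][col] is assumed to hold an (r, g, b) tuple.
    def lookup_material_color(material_map, nx, ny):
        n = len(material_map)              # the map is n pixels high and n pixels wide
        u = (nx + 1.0) * n / 2.0           # EQ. 1: ((x + 1) * n / 2, (y + 1) * n / 2)
        v = (ny + 1.0) * n / 2.0
        col = min(int(u), n - 1)           # clamp to valid pixel indices
        row = min(int(v), n - 1)
        return material_map[row][col]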
  • This same technique may be used in conjunction with a material map that possesses alpha (transparency) information. Once a color (with alpha) is looked up from a transparency material map, a highlight may be overlaid onto the object by using the alpha as a coverage fraction for the color from the map. Standard compositing methods may be used to accomplish this overlay operation.
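  • The overlay step described above can be sketched as a single source-over blend per pixel; the function below assumes colors are (r, g, b) tuples in the range 0 to 1 and uses the looked-up alpha as the coverage fraction.

    # Sketch of the highlight overlay: composite a highlight color (with alpha)
    # over the previously computed body color using source-over compositing.
    def overlay_highlight(body_rgb, highlight_rgb, highlight_alpha):
        return tuple(highlight_alpha * h + (1.0 - highlight_alpha) * b
                     for h, b in zip(highlight_rgb, body_rgb))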
  • An object's anti-aliased visibility mask may be obtained by computing a field that provides distance from the edge of the object. This field can be evaluated using a procedural description of the object. For example, a lozenge may be defined as the set of points at distance r or less from a finite non-zero-length line segment from point (p1x, p1y) to point (p2x, p2y). The distance d from the aforementioned line segment may be calculated at point (px, py) by a function such as that provided in Table 3 below.
  • TABLE 3
    Illustrative Field (Distance) Calculation for a Lozenge Object
    Let vx, vy, length, wx, wy, and d be floating point values, then
      vx = p1x − p2x
      vy = p1y − p2y
      length = √((vx)² + (vy)²)
      vx = vx / length
      vy = vy / length
      wx = px − p1x
      wy = py − p1y
      d = |(wx × vy) − (wy × vx)|            (distance from the line)
      if ((vx × wx) + (vy × wy)) > 0          (that is, if past point p1)
        d = √((wx)² + (wy)²)                  (use distance from p1)
      wx = px − p2x
      wy = py − p2y
      if ((vx × wx) + (vy × wy)) < 0          (that is, if past point p2)
        d = √((wx)² + (wy)²)                  (use distance from p2)
  • Given the distance function d defined above (see Table 3), an anti-aliased transparency value (mask) for the graphical user interface object may be computed as shown in Table 4. The same distance field may be used to construct the outline of the user interface object.
  • TABLE 4
    Illustrative Transparency Value (Mask) Calculation
    mask = r − d
    if (mask > 1.0) mask = 1.0
    if (mask < 0.0) mask = 0.0
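  • For clarity, Tables 3 and 4 can be combined into a single routine. The sketch below is a direct transcription into Python for illustration only; it assumes the segment from (p1x, p1y) to (p2x, p2y) has non-zero length.

    import math

    def lozenge_mask(px, py, p1x, p1y, p2x, p2y, r):
        """Distance from (px, py) to the segment p1-p2 (Table 3), converted to an
        anti-aliased coverage value clamped to [0, 1] (Table 4)."""
        vx, vy = p1x - p2x, p1y - p2y
        length = math.hypot(vx, vy)
        vx, vy = vx / length, vy / length      # unit vector pointing from p2 toward p1
        wx, wy = px - p1x, py - p1y
        d = abs(wx * vy - wy * vx)             # distance from the infinite line
        if vx * wx + vy * wy > 0:              # past point p1
            d = math.hypot(wx, wy)             # use distance from p1
        wx, wy = px - p2x, py - p2y
        if vx * wx + vy * wy < 0:              # past point p2
            d = math.hypot(wx, wy)             # use distance from p2
        return max(0.0, min(1.0, r - d))       # mask = r - d, clamped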
  • In another embodiment, material maps may be represented in procedural fashion. In this embodiment the interface object design application 400 (see FIG. 4) may be enhanced to provide an interface through which material maps may be defined and, subsequently, represented in a procedural fashion. In contrast, prior art material maps for use in user interface elements used images as described above.
  • Referring to FIG. 6A, interface object design application 400 may be enhanced to provide material map editor window 600 that includes material map display region 605, material map control region 610, light property region 615 and light list region 620.
  • Material map display region 605 graphically displays the currently selected material map and may also display individual light sources associated with the displayed material map. In the embodiment of FIG. 6A, light sources may be created by “clicking” on the material map's graphical display where a light handle does not already exist (in FIG. 6A, light handles are represented by circles in display region 605). In one embodiment, when a new light source is created in this way, a duplicate of the currently selected light source is created and placed at the location which was clicked. In similar fashion, a light source may be moved by selecting (e.g., “clicking”) and dragging the light handles to the desired position. Further, a user may place a light source “behind” the material map by placing the light source off the material map's surfaces such as shown with light handles 625 in FIG. 6A. Similarly, a light source may be selected by “clicking” on it (see discussion below).
  • Material map control region 610 permits the user to control the overall presentation of a material map and, in addition, selection and storage of the material map. For example, the material map may be displayed having a checkerboard background by selecting the “Over Checkerboard” check-box and light handles may be displayed by selecting the “Display Light Handles” check-box. Referring to FIG. 6B, the “Show” drop-down menu allows a user to show the material map to be built (by selecting the “Material” item) or a flat material map that is an image in accordance with prior art material maps (by selecting the “Original” item). Referring to FIG. 6C, the “Files” drop-down menu permits the user to open a previously stored procedural representation of a material map (i.e., a Material Recipe file), save the displayed material map as a material recipe file, open an original graphical material map (i.e., an image) or save the displayed material map as an image (i.e., to generate a graphical material map).
  • Light property region 615 permits the user to set various properties of a selected light source. For example, if light source 630 is selected (see FIG. 6A), its properties would appear in region 615. Properties of the selected light source may be adjusted through slider controls of the type generally known in the art. For example, the power slider may be used to set whether the light source is diffuse (a lower value) or specular (a higher value). For image and image masked light sources (see discussion below): the image scale slider may be used to set the size of the image that is reflected off the material map's surface; the image angle slider may be used to control the rotation of the reflected image about the point of contact between the image and the material map's surface; and the image X and Y offset sliders may be used to control the point at which the reflected image contacts the material map's surface.
  • In addition, drop-down menus may be used to set the “Type” and “Blend Mode” for the selected light source. Referring to FIG. 6D, in the illustrative embodiment a light source's “type” may be selected as colored, image, image masked or reflection map. A “colored” light source is a colored light with a circular distribution (when viewed straight-on) and is defined such that it can describe a diffuse (e.g., a power value of 0) or specular (e.g., a power value of 64) light source. An “image” light source describes an image, potentially masked using an alpha transparency mask, that is placed tangent to the material map's surface at the location of its associated light handle. An “image masked” light source is like an “image” light source but is additionally masked by the same distribution as defined by a colored light source. A “reflection map” light source is a whole light ball image that is placed on top of the material map's surface and which may be rotated with the “image angle” slider. It is noted that, for a reflection map light source, only the image angle slider among the image-related sliders is applicable. In a similar fashion, the “Blend Mode” of the selected light source may be set through the “Blend Mode” drop-down menu as shown in FIG. 6E. Since the blend modes identified in FIG. 6E are generally known in the art, they will not be described further.
  • Light list region 620 lists all light sources associated with the displayed material map. In addition, region 620 permits the user to activate each light source (e.g., through “On” check-boxes) and to set the “Brightness” of each light source (e.g., through slider-type controls) individually. In the illustrated embodiment, an individual entry in the list of light sources may be selected (e.g., by “clicking” on its “Name”) and dragged up or down in the list to adjust its display priority. In one embodiment, the first light source in the list has the front-most priority while the last light source in the list has the back-most priority. In this way, light sources may be treated like “layers” that are composited on top of each other using blend modes. This manner of layering objects is well-known in the art.
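  • A minimal sketch of this layering model follows (illustrative only; per-pixel images and blend modes other than source-over are omitted, and the brightness semantics are an assumption): the list is ordered front-most first, so it is composited in reverse, back to front.

    # Each layer is assumed to supply, at a given pixel, an (r, g, b) color, an
    # alpha, a brightness in [0, 1], and an on/off flag.
    def composite_lights(layers, base_rgb=(0.0, 0.0, 0.0)):
        color = base_rgb
        for rgb, alpha, brightness, enabled in reversed(layers):   # back to front
            if not enabled:
                continue
            a = alpha * brightness             # "Normal" blend mode (source over)
            color = tuple(a * c + (1.0 - a) * cur for c, cur in zip(rgb, color))
        return color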
  • Referring to FIG. 7A, in one embodiment, image, image masked or reflection map lights may be procedurally defined through light maker window 700. As shown, light maker window 700 includes light image display region 705, light image primitive region 710, and light image focus region 715. As in prior editing windows, light image display region 705 shows a graphical representation of a light source that is defined in accordance with the attribute or characteristic values set (and shown) in regions 710 and 715. For illustrative purposes, a circular type light image is shown in FIG. 7A. Referring to FIG. 7B and light image primitive region 710, other types of light sources that may be defined procedurally in accordance with the invention include rectangles (“Rectangle”), horizontal lines (“Lines”) and collections of rectangles (“Rectangles,” i.e., a pane split horizontally, vertically, or both). In addition, an existing image file may be imported by selecting the “File” item in FIG. 7B.
  • Referring again to FIG. 7A, light image primitive region 710 includes five (5) slider controls that may be used to adjust or set the value corresponding to the light source's radius, width, height, spacing and thickness characteristics or attributes. In the illustrative embodiment, the Radius slider applies to the Circle image type and controls its size. The Width and Height sliders apply to the Rectangle, Lines, and Rectangles image types and control their size (bounds). The Spacing slider applies to the Lines and Rectangles image types and controls the distance between lines or between sub-panes. The Thickness slider applies to the Lines and Rectangles image types and controls the amount of coverage for the lines or sub-panes. For example, a value of 50% would make the lines the same thickness as the distance between them. Similarly, slider controls may be used in light image focus region 715 to set the amount of blur applied to the image. The Upper and Lower sliders control the amount of blur, in pixels, applied to the top and bottom of the image primitive; in the example illustrated in FIG. 7A, these apply to the top and bottom of the circle, respectively.
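  • The exact slider semantics are not spelled out beyond the description above, but as one possible reading (an assumption for illustration, not the specification's definition), a “Lines” primitive could be generated as a periodic coverage function in which each repeating period is Spacing pixels tall and the line occupies Thickness percent of it, so that 50% makes the lines and the gaps equally thick:

    # Hypothetical "Lines" primitive: returns 1.0 where a line covers vertical
    # position y and 0.0 in the gaps between lines (blur from region 715 omitted).
    def lines_coverage(y, spacing, thickness_percent):
        phase = y % spacing
        return 1.0 if phase < spacing * (thickness_percent / 100.0) else 0.0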
  • Once a material map has been defined in material map editor window 600, the user may save the map in a procedural file by selecting the “Save Material Recipe” item from the “Files” drop-down menu (see FIG. 6C). In one embodiment, each property identified through light property region 615 and light list region 620 is used as an attribute tag, and the property's associated value (e.g., the Power value of 1.0 and the Img. Scale value of 100 shown in FIG. 6A) is used as the value for that attribute. This is in keeping with the attribute-value description illustrated in Table 1 above. As previously noted, attribute-value pairs may be stored in a flat file or a hierarchically-ordered file such as an XML file.
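  • As a sketch of what such a hierarchically-ordered recipe file might look like (the element names, dictionary layout, and file name below are assumptions for illustration, not the actual file format), each light's properties can be written out as attribute-value pairs in XML:

    import xml.etree.ElementTree as ET

    # Illustrative only: serialize a list of lights, each a dict of property
    # name/value pairs, into a hierarchically ordered "material recipe" file.
    def save_material_recipe(path, lights):
        root = ET.Element("materialRecipe")
        for light in lights:
            node = ET.SubElement(root, "light")
            for attribute, value in light.items():
                ET.SubElement(node, attribute).text = str(value)
        ET.ElementTree(root).write(path, encoding="utf-8")

    save_material_recipe("button.materialRecipe",
                         [{"Type": "Colored", "BlendMode": "Normal", "Power": 1.0,
                           "ImgScale": 100, "On": True, "Brightness": 0.5}])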
  • Referring again to FIG. 6A, when a new material map is to be generated (such as when a material map editor window 600 is initially opened), default values may be assigned to one or more of the material map's properties. For instance, a single light source may be instantiated with a specified collection of default property values. By way of example only, a default light source may be “Colored” (see FIG. 6D), have a “Normal” blend mode (see FIG. 6E), a “Power” value of 1, be “on” and have a brightness of 50%.
  • It will be recognized by those of ordinary skill in the art that the information (i.e., attributes or properties and their associated values) retained in a material map's recipe file may be used to generate a graphical representation of the material map. Unlike prior art user interface material maps, recipe files in accordance with the invention may be used to dynamically generate these images, which can reduce the amount of memory needed to store a user interface and substantially reduce the time required to create a specific user interface element. In addition, because material maps in accordance with the invention are represented in a procedural manner, they are resolution independent. This is in sharp contrast with prior art material maps that rely on an image having a set or fixed resolution. Thus, a single (procedurally defined) material map may be used for all resolutions rather than requiring multiple material maps, one for each display resolution. Further, because material maps in accordance with the invention are procedural in nature, they may be encrypted to protect their content. (That is, the text recipe file is encrypted.)
  • Various changes or modifications in the foregoing description may be made without departing from the concept of the invention. For example, attributes other than, or in addition to, those identified in Table 1 and in FIG. 6 may be used to specify an object. In addition, hierarchical storage means other than an XML file may be used to store an object's procedural specification.
  • It will be recognized that methods to represent and render a graphical user interface object in accordance with this description may be performed by a programmable control device executing instructions organized into one or more program modules. A programmable control device may be a single computer processor, a special purpose processor (e.g., a digital signal processor, a graphics processing unit or a programmable graphics processing unit), a plurality of processors coupled by a communications link or a custom designed state machine. Custom designed state machines may be embodied in a hardware device such as an integrated circuit including, but not limited to, application specific integrated circuits (“ASICs”) or field programmable gate arrays (“FPGAs”). Storage devices suitable for tangibly embodying program instructions include, but are not limited to: magnetic disks (fixed, floppy, and removable) and tape; optical media such as CD-ROMs and digital video disks (“DVDs”); and semiconductor memory devices such as Electrically Programmable Read-Only Memory (“EPROM”), Electrically Erasable Programmable Read-Only Memory (“EEPROM”), Programmable Gate Arrays and flash devices.
  • The preceding descriptions were presented to enable any person skilled in the art to make and use the invention as claimed and were provided in the context of the particular examples discussed above, variations of which will be readily apparent to those skilled in the art. Accordingly, the claims appended hereto are not intended to be limited by the disclosed embodiments, but are to be accorded the widest scope consistent with the principles and features disclosed herein.

Claims (20)

1. A computer program product to define a graphical user interface element, the computer program product embodied on a non-transitory computer readable medium and comprising computer code to cause a processor to:
make available a plurality of layers for procedurally defining a graphical user interface element;
receive an indication of association of a subset of the plurality of layers to the graphical user interface element;
receive a specification of a plurality of attribute values to associate with the subset of the plurality of layers;
store information associated with the subset of the plurality of layers and the plurality of attribute values associated with the graphical user interface element, wherein the information procedurally defines the graphical user interface element.
2. The computer program product of claim 1, wherein the subset of the plurality of layers is sufficiently complete to permit the graphical user interface element to be rendered.
3. The computer program product of claim 1, further comprising computer code to cause the processor to specify a plurality of default attribute values to associate with the subset of the plurality of layers to procedurally define the graphical user interface element.
4. The computer program product of claim 1, wherein the computer code to cause the processor to store further comprises computer code to store the information in a material map recipe file.
5. The computer program product of claim 4, wherein the information in the material map recipe file is hierarchically organized.
6. The computer program product of claim 4, further comprising computer code to cause the processor to:
provide at least a portion of the information in the material map recipe file to a rendering engine for rendering a visual representation of the graphical user interface element; and
generate, by the rendering engine, a visual representation of the graphical user interface element.
7. The computer program product of claim 6, wherein the computer code to cause the processor to generate further comprises computer code to cause the processor to generate, by the rendering engine, a visual representation of the graphical user interface element at an operational resolution setting of a display device.
8. A method of defining a graphical user interface element using one or more processing devices, the method comprising:
making available a plurality of layers to procedurally define a graphical user interface element;
receiving an indication at a processing device to associate a subset of the plurality of layers to the graphical user interface element;
receiving an indication of specification of a plurality of attribute values to associate with the subset of the plurality of layers; and
storing information associated with the subset of the plurality of layers and the plurality of attribute values associated with the graphical user interface element, wherein the information provides details sufficient to permit displaying the graphical user interface element at a plurality of operational resolution settings.
9. The method of claim 8, wherein the subset of the plurality of layers is sufficiently complete to permit the graphical user interface element to be rendered.
10. The method of claim 9, further comprising using at least a portion of the stored information to cause rendering of the graphical user interface element.
11. The method of claim 10, wherein the act of rendering comprises rendering the graphical user interface element at an operational resolution setting of a display device.
12. The method of claim 10, wherein the act of rendering is performed by an operating system level module.
13. The method of claim 8, wherein the act of storing information comprises storing the information in a material map recipe file.
14. The method of claim 13, wherein the information in the material map recipe file is hierarchically organized.
15. The method of claim 13, further comprising providing at least a portion of the information in the material map recipe file to a rendering engine for generating a visual representation of the graphical user interface element at an operational resolution setting of a display device.
16. The method of claim 8, wherein the plurality of attribute values are received through a graphical user interface design application.
17. The method of claim 8, wherein the graphical user interface element is selected from the group consisting of a pushbutton, a bevel button, a metal button, a disclosure button, a pop-up button, a combo button, a pull-down button, a check box, a radio button, a segmented control, a window title bar, a scroll bar track, a scroll bar thumb, a scrollbar cap, a slider track, a slider thumb, a text field, a progress bar, a progress indicator, a list box, a drawer, and a pane splitter.
18. The method of claim 8, wherein the act of receiving an indication of specification comprises automatically specifying a plurality of attribute values to associate with the subset of the plurality of layers to define the graphical user interface element.
19. A non-transitory program storage device, readable by a programmable control device, comprising instructions stored thereon for causing the programmable control device to perform a method in accordance with claim 8.
20. A computer system comprising:
one or more processing units;
a storage unit communicatively coupled to the one or more processing units; and
a display device communicatively coupled to the storage unit and the one or more processing units wherein the one or more processing units are collectively configured to—
make available a plurality of layers for procedurally defining a graphical user interface element;
receive an indication of association of a subset of the plurality of layers to the graphical user interface element;
receive a specification of a plurality of attribute values to associate with the subset of the plurality of layers; and
store information associated with the plurality of layers and the plurality of attribute values associated with the graphical user interface element, wherein the information procedurally defines the graphical user interface element.
US13/359,169 2004-06-24 2012-01-26 Resolution Independent User Interface Design Abandoned US20120131479A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/359,169 US20120131479A1 (en) 2004-06-24 2012-01-26 Resolution Independent User Interface Design

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US10/876,298 US8068103B2 (en) 2004-06-24 2004-06-24 User-interface design
US11/459,140 US8130237B2 (en) 2004-06-24 2006-07-21 Resolution independent user interface design
US13/359,169 US20120131479A1 (en) 2004-06-24 2012-01-26 Resolution Independent User Interface Design

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/459,140 Continuation US8130237B2 (en) 2004-06-24 2006-07-21 Resolution independent user interface design

Publications (1)

Publication Number Publication Date
US20120131479A1 2012-05-24

Family

ID=38285079

Family Applications (3)

Application Number Title Priority Date Filing Date
US11/459,140 Active 2026-07-16 US8130237B2 (en) 2004-06-24 2006-07-21 Resolution independent user interface design
US11/696,631 Active US7907146B2 (en) 2004-06-24 2007-04-04 Resolution independent user interface design
US13/359,169 Abandoned US20120131479A1 (en) 2004-06-24 2012-01-26 Resolution Independent User Interface Design

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US11/459,140 Active 2026-07-16 US8130237B2 (en) 2004-06-24 2006-07-21 Resolution independent user interface design
US11/696,631 Active US7907146B2 (en) 2004-06-24 2007-04-04 Resolution independent user interface design

Country Status (1)

Country Link
US (3) US8130237B2 (en)

Patent Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5767834A (en) * 1993-02-26 1998-06-16 Binar Graphics, Inc. Method of resetting a computer video display mode
US5648795A (en) * 1993-02-26 1997-07-15 Binar Graphics, Inc. Method of resetting a computer video display mode
US20040189715A1 (en) * 1997-08-27 2004-09-30 Microsoft Corp. User friendly remote system interface
US6028604A (en) * 1997-08-27 2000-02-22 Microsoft Corporation User friendly remote system interface providing previews of applications
US6313851B1 (en) * 1997-08-27 2001-11-06 Microsoft Corporation User friendly remote system interface
US6587129B1 (en) * 1997-10-06 2003-07-01 Canon Kabushiki Kaisha User interface for image acquisition devices
US6970602B1 (en) * 1998-10-06 2005-11-29 International Business Machines Corporation Method and apparatus for transcoding multimedia using content analysis
US6456305B1 (en) * 1999-03-18 2002-09-24 Microsoft Corporation Method and system for automatically fitting a graphical display of objects to the dimensions of a display window
US6404441B1 (en) * 1999-07-16 2002-06-11 Jet Software, Inc. System for creating media presentations of computer software application programs
US6606103B1 (en) * 1999-11-30 2003-08-12 Uhc Llc Infinite resolution scheme for graphical user interface object
US20010048448A1 (en) * 2000-04-06 2001-12-06 Raiz Gregory L. Focus state themeing
US20010030667A1 (en) * 2000-04-10 2001-10-18 Kelts Brett R. Interactive display interface for information objects
US20050132286A1 (en) * 2000-06-12 2005-06-16 Rohrabaugh Gary B. Resolution independent vector display of internet content
US20020091758A1 (en) * 2001-01-05 2002-07-11 Singh Raj R. Map viewing, publishing, and provisioning system
US20020158908A1 (en) * 2001-04-30 2002-10-31 Kristian Vaajala Web browser user interface for low-resolution displays
US20030043191A1 (en) * 2001-08-17 2003-03-06 David Tinsley Systems and methods for displaying a graphical user interface
US20030076340A1 (en) * 2001-09-18 2003-04-24 International Business Machines Corporation Computer system, display device, display controller, image processing method, display resolution change method, and computer program
US20030234799A1 (en) * 2002-06-20 2003-12-25 Samsung Electronics Co., Ltd. Method of adjusting an image size of a display apparatus in a computer system, system for the same, and medium for recording a computer program therefor
US20040090470A1 (en) * 2002-10-30 2004-05-13 Kim Hong-Ki Method, display system, and computer software for controlling icon appearance
US20040100490A1 (en) * 2002-11-21 2004-05-27 International Business Machines Corporation Skin button enhancements for remote control
US20040243940A1 (en) * 2003-05-31 2004-12-02 Samsung Electronics Co., Ltd Display apparatus and method of adjusting display settings thereof
US20050021970A1 (en) * 2003-07-21 2005-01-27 Curtis Reese Embedded data layers
US20050140694A1 (en) * 2003-10-23 2005-06-30 Sriram Subramanian Media Integration Layer
US20050182972A1 (en) * 2004-02-13 2005-08-18 Apostolopoulos John G. Methods for generating data for describing scalable media
US20050231512A1 (en) * 2004-04-16 2005-10-20 Niles Gregory E Animation of an object using behaviors
US7590947B1 (en) * 2004-05-28 2009-09-15 Adobe Systems Incorporated Intelligent automatic window sizing
US20060136847A1 (en) * 2004-12-16 2006-06-22 International Business Machines Corporation Method and computer program product for verifying a computer renderable document for on-screen appearance
US8004535B2 (en) * 2006-06-01 2011-08-23 Qualcomm Incorporated Apparatus and method for selectively double buffering portions of displayable content

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD732555S1 (en) * 2012-07-19 2015-06-23 D2L Corporation Display screen with graphical user interface
USD733167S1 (en) * 2012-07-20 2015-06-30 D2L Corporation Display screen with graphical user interface
CN103870229A (en) * 2012-12-17 2014-06-18 联想(北京)有限公司 Display method and electronic device
WO2021155770A1 (en) * 2020-02-04 2021-08-12 华为技术有限公司 Display method and electronic device

Also Published As

Publication number Publication date
US7907146B2 (en) 2011-03-15
US8130237B2 (en) 2012-03-06
US20060284878A1 (en) 2006-12-21
US20070171233A1 (en) 2007-07-26

Legal Events

Date Code Title Description
AS Assignment

Owner name: APPLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ZIMMER, MARK;REEL/FRAME:027601/0920

Effective date: 20060720

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION