US20080177825A1 - Server assisted device independent markup language - Google Patents

Server assisted device independent markup language

Info

Publication number
US20080177825A1
Authority
US
United States
Prior art keywords
document
mobile device
rendered
rendering
rendering context
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/933,082
Inventor
Micah Dubinko
Zhaowei Charlie Jiang
Nigel Choi
Chen Li
Keith Anthony Marlow
Guang Yang
Olga Volodymyrivna Gavrylyako
James Liang
Jeff Leung
Michael Jeremy Temkin
Abdul Rasel Khan
Ming Sui
Hui Guo
Jaekwon Park
Surendra Sadanand Rajam
Takayuki Tei
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yahoo Inc
Original Assignee
Yahoo Inc until 2017
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yahoo Inc (until 2017)
Priority to US11/933,082
Assigned to YAHOO! INC. reassignment YAHOO! INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KHAN, ABDUL RASEL, RAJAM, SURENDRA SADANAND, MARLOW, KEITH ANTHONY, TEI, TAKAYUKI, JIANG, ZHAOWEI CHARLIE, PARK, JAEKWON, SUI, MING, CHOI, NIGEL, DUBINKO, MICAH, GAVRYLYAKO, OLGA VOLODYMYRIVNA, LI, CHEN, YANG, GUANG, GUO, HUI, LIANG, JIAN, TEMKIN, MICHAEL, LEUNG, JEFF
Publication of US20080177825A1
Assigned to YAHOO HOLDINGS, INC. reassignment YAHOO HOLDINGS, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: YAHOO! INC.
Assigned to OATH INC. reassignment OATH INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: YAHOO HOLDINGS, INC.


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/957 Browsing optimisation, e.g. caching or content distillation
    • G06F16/9577 Optimising the visualization of content, e.g. distillation of HTML documents
    • G06F15/00 Digital computers in general; Data processing equipment in general
    • G06F15/16 Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
    • G06F40/00 Handling natural language data
    • G06F40/10 Text processing
    • G06F40/103 Formatting, i.e. changing of presentation of documents
    • G06F40/114 Pagination
    • G06F40/12 Use of codes for handling textual entities
    • G06F40/14 Tree-structured documents
    • G06F40/143 Markup, e.g. Standard Generalized Markup Language [SGML] or Document Type Definition [DTD]
    • G06F9/00 Arrangements for program control, e.g. control units

Definitions

  • the invention is generally directed to providing content over a network, and more particularly to enabling content from disparate sources to be expressed with a markup language that is both resolved with the assistance of a server and independent of a target remote device.
  • FIG. 1 illustrates a diagram of one embodiment of an exemplary system in which the invention may be practiced
  • FIG. 2 shows one embodiment of an exemplary mobile device
  • FIG. 3 illustrates one embodiment of an exemplary network device
  • FIG. 4 shows one embodiment of an exemplary platform for rendering a markup language document for display by a target remote device
  • FIG. 5A illustrates another embodiment of an exemplary platform that employs pipelined stages to render a markup language document for display by a target remote device
  • FIG. 5B shows one embodiment of exemplary pipelined stages that are employed with a platform to render a markup language document for display by a target remote device;
  • FIG. 6 illustrates yet another embodiment of an exemplary class tree for modules that enable a platform to render a markup language document for display by a target remote device;
  • FIG. 7 shows an overview of a process for generally employing a platform to render a markup language document for display by a target remote device
  • FIG. 8 illustrates an overview of a process for employing a platform for rendering a markup language document for display by a target remote device
  • FIG. 9 shows an overview of a process for employing class tree for modules that enable rendering of a markup language document for display by a target remote device
  • FIG. 10 illustrates a process for pipelined stages that render a markup language document for display by a target remote device
  • FIG. 11A shows a process for employing Temporary IDs and Indexes to pipeline process the rendering of a markup language document for display by a target remote device
  • FIG. 11B illustrates a process for employing Temporary IDs and Temporary Indexes to render a markup language document for display by a target remote device, in accordance with the invention.
  • the term “or” is an inclusive “or” operator, and is equivalent to the term “and/or,” unless the context clearly dictates otherwise.
  • the term “based on” is not exclusive and allows for being based on additional factors not described, unless the context clearly dictates otherwise.
  • the meaning of “a,” “an,” and “the” include plural references.
  • the meaning of “in” includes “in” and “on.”
  • receiving an item, such as a request, response, or other message, from a device or component includes receiving the message indirectly, such as when forwarded by one or more other devices or components.
  • sending an item to a device or component includes sending the item indirectly, such as when forwarded by one or more other devices or components.
  • MIN mobile identification number
  • a cellular telephone's phone number may be used as a MIN.
  • mobile client application refers to an application that runs on a mobile device.
  • a mobile client application may be written in one or more of a variety of languages, such as ‘C’, ‘C++’, ‘J2ME’, “Brew”, Java, and the like. Browsers, email clients, text messaging clients, calendars, and games are examples of mobile client applications.
  • network application refers to a computer-based application that communicates, directly or indirectly, with at least one other component across a network.
  • Web sites, email servers, messaging servers, and game servers are examples of network applications.
  • URI uniform resource identifier
  • URL uniform resource locator
  • URN uniform resource name
  • RFC 3986 describes a syntax for a URI.
  • URI is not limited to this syntax, and may include other syntaxes.
  • the invention is directed to a platform for enabling customized rendering of markup language pages provided over a network for subsequent display by a remote device.
  • a rendering context and a markup language (ML) document for the target remote device are received by the platform, which enables processes that can paginate and fully render pages that are subsequently delivered for display by the target remote device.
  • a post-rendering process may also be provided to perform additional processing of media items for the rendered ML document. This additional processing may include retrieving and embedding images in pages of the rendered ML document. For example, if a rendered page is in XML format, and includes a link to an image, the post-processing component may retrieve the image and embed it within the XML page as base 64 encoded data or another format.
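  • As an illustration of the post-rendering step described above, the following is a minimal sketch, in Java, of fetching a linked image and embedding it in a rendered XML page as Base64-encoded data; the class name, the element names, and the replacement strategy are illustrative assumptions rather than the patent's implementation.

```java
import java.io.InputStream;
import java.net.URL;
import java.util.Base64;

// Sketch of a post-rendering step: replace an <img src="..."/> link with
// inline Base64 data so the page needs no extra round trip.
// Element and attribute names are hypothetical, not taken from the patent.
public class PostRenderer {

    /** Fetches the image at the given URL and returns it as a Base64 string. */
    static String fetchAsBase64(String imageUrl) throws Exception {
        try (InputStream in = new URL(imageUrl).openStream()) {
            return Base64.getEncoder().encodeToString(in.readAllBytes());
        }
    }

    /** Rewrites a single image reference in a rendered XML page fragment. */
    static String embedImage(String renderedPage, String imageUrl) throws Exception {
        String data = fetchAsBase64(imageUrl);
        // Replace the external reference with embedded data (illustrative markup).
        return renderedPage.replace(
                "<img src=\"" + imageUrl + "\"/>",
                "<img encoding=\"base64\">" + data + "</img>");
    }
}
```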
  • the platform is markup language agnostic and can employ templates in the custom rendering process.
  • the remote device is arranged as a client device that provides for wired and/or wireless communication over a network.
  • the markup language document can be provided in virtually any standard or non-standard format, including, but not limited to, Handheld Device Markup Language (HDML), Wireless Markup Language (WML), WMLScript, Standard Generalized Markup Language (SGML), HyperText Markup Language (HTML), mobile HTML (mHTML), compact HTML (cHTML), eXtensible Markup Language (XML), and the like.
  • the markup language document may be provided to the platform in a mobile Markup Language (MML) that includes modules that are independent of the target remote device.
  • the platform walks the MML document to identify MML modules and instantiates a tree of classes based on the included modules. These classes are arranged to generate a corresponding XML document that includes customized code to handle the rendering context for the target mobile device, e.g., known bugs in software and hardware for the target remote device, and resource constraints such as display screen size, memory size, and communication link(s) provided by the carrier.
  • the platform can subsequently fully render this XML document for display by the target remote device.
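  • A minimal sketch of the module walk described above: each MML module tag maps to a small handler that emits device-native XML. The tag names, the handler registry, and the emitted markup are assumptions for illustration; the patent's actual class tree is described with respect to FIG. 6.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Sketch of expanding MML modules into device-native XML. Tag names such as
// "module.list" and "module.title" and the emitted markup are hypothetical.
public class MmlExpander {

    private final Map<String, Function<String, String>> modules = new HashMap<>();

    public MmlExpander() {
        // <module.list>a|b|c</module.list>  ->  <ul><li>a</li>...</ul>
        modules.put("module.list", body -> {
            StringBuilder xml = new StringBuilder("<ul>");
            for (String item : body.split("\\|")) {
                xml.append("<li>").append(item).append("</li>");
            }
            return xml.append("</ul>").toString();
        });
        // <module.title>text</module.title>  ->  <h1>text</h1>
        modules.put("module.title", body -> "<h1>" + body + "</h1>");
    }

    /** Expands one MML module into device-native XML, or fails for unknown tags. */
    public String expand(String tagName, String body) {
        Function<String, String> renderer = modules.get(tagName);
        if (renderer == null) {
            throw new IllegalArgumentException("Unknown MML module: " + tagName);
        }
        return renderer.apply(body);
    }
}
```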
  • the rendering context can be arranged as a data structure that contains the various parameters and data that are employed by the platform to optimize the rendering of each page in the ML document for display with a particular remote device.
  • the rendering context is generally provided to the platform by a separate application, platform, or process that can be managed by a content provider, carrier, and/or another third-party service.
  • the rendering context for a remote device can include, but is not limited to, screen size, color capabilities, type of markup language, browser application, known bugs in a software or hardware version of the mobile device or network gateway, or the like.
  • the platform can store configuration data related to attributes of a variety of remote devices and network carriers, along with methods for storing and retrieving that configuration data.
  • the storage and retrieval of data and/or parameters associated with a rendering context for a target remote device is provided in an HTTP cookie.
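  • A hedged sketch of a rendering context as a plain data structure, together with a helper that packs it into an HTTP cookie value as described above; the field names and the cookie format are assumptions, not the patent's schema.

```java
import java.util.StringJoiner;

// Sketch of a rendering context as a simple value object. The fields shown
// here (screen size, color capability, markup dialect, known bugs) follow the
// examples in the text, but the exact schema and cookie layout are assumptions.
public class RenderingContext {
    public final String deviceId;        // unique identifier for the device profile
    public final int screenWidth;        // effective display width in pixels
    public final int screenHeight;       // effective display height in pixels
    public final boolean colorDisplay;   // color capability
    public final String markupLanguage;  // e.g. "WML" or "XHTML-MP"
    public final String knownBugs;       // comma-separated bug identifiers

    public RenderingContext(String deviceId, int screenWidth, int screenHeight,
                            boolean colorDisplay, String markupLanguage, String knownBugs) {
        this.deviceId = deviceId;
        this.screenWidth = screenWidth;
        this.screenHeight = screenHeight;
        this.colorDisplay = colorDisplay;
        this.markupLanguage = markupLanguage;
        this.knownBugs = knownBugs;
    }

    /** Serializes the context into a single cookie value, e.g. "rc=dev123|176|208|1|WML|bugA". */
    public String toCookieValue() {
        StringJoiner sj = new StringJoiner("|");
        sj.add(deviceId).add(String.valueOf(screenWidth)).add(String.valueOf(screenHeight))
          .add(colorDisplay ? "1" : "0").add(markupLanguage).add(knownBugs);
        return "rc=" + sj;
    }
}
```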
  • the rendering of the ML document can include a plurality of different processes, including: (1) determining if location information regarding the target remote device can be provided in response to a request; (2) minifying cascading style sheets (CSS) by removing extraneous sheets; (3) fontifying cascading style sheets by stripping them out and rewriting as ML code; (4) shrinking the ML document to remove any ML code that is extraneous to the target remote device; (5) specifying one or more attributes of each media item to be embedded or linked in a page; (6) employing an estimate of each page size to repaginate the ML document pages to a size that is no more than the effective display screen size of the target remote device; (7) tailoring/removing ML code in a page that is estimated to be rendered as larger than the effective display screen size of the target remote device; and/or (8) rewriting URLs in a page to include locations of content and/or session identification information.
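  • As a rough illustration of steps (2) and (3) in the list above, the following sketch strips a trivial inline style and rewrites it as markup-level formatting for a device without CSS support; the regular expressions and the emitted tags are illustrative assumptions, since the exact transformation is not specified.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of "fontifying": rewrite a simple bold style as a markup tag and
// drop remaining style sheets the device cannot use. The patterns and the
// emitted tags are assumptions, not the patent's actual rules.
public class Fontifier {

    private static final Pattern BOLD_SPAN =
            Pattern.compile("<span style=\"font-weight:\\s*bold\">(.*?)</span>");

    /** Rewrites bold spans as <b> tags and removes any remaining <style> blocks. */
    public static String fontify(String ml) {
        Matcher m = BOLD_SPAN.matcher(ml);
        String rewritten = m.replaceAll("<b>$1</b>");
        // "Minify" step: remove style sheets that are extraneous to the target device.
        return rewritten.replaceAll("(?s)<style.*?</style>", "");
    }
}
```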
  • the rendering of the pages of the ML document for the target remote device may further include rewriting links or URIs within the document. For example, if the ML document includes a link to an image in one format, the link may be modified to an alternate image in a different format, if the target remote device is unable to display the first format.
  • a link may also be rewritten to include a parameter, such as a value to identify a continuing session, so that a new request using the link returns the session identifier.
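  • The link rewriting described above might look roughly like the following sketch; the query-parameter name "sid" and the image-format substitution are assumptions.

```java
// Sketch of the link rewriting described above: append a session identifier
// to a URL and swap an unsupported image format for a supported one. The
// parameter name "sid" and the PNG-to-GIF mapping are illustrative assumptions.
public class LinkRewriter {

    /** Appends a session identifier so that follow-up requests return it. */
    public static String addSession(String url, String sessionId) {
        String separator = url.contains("?") ? "&" : "?";
        return url + separator + "sid=" + sessionId;
    }

    /** Points a link at an alternate image when the device cannot show PNG. */
    public static String swapImageFormat(String url, boolean supportsPng) {
        if (!supportsPng && url.endsWith(".png")) {
            return url.substring(0, url.length() - 4) + ".gif";
        }
        return url;
    }
}
```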
  • a temporary XML ID can be added to each identified portion of the ML document that does not have an existing XML ID.
  • a temporary index can be built for each XML ID, which can be used by the plurality of processes to quickly find and render the corresponding elements (identified portions), e.g., a string(s), an image(s), and the like.
  • once rendering is complete, the temporary XML IDs can be removed. The use of temporary XML IDs and a temporary index can reduce the likelihood that a particular process has to walk the entire ML document to perform its portion of the full rendering of the document for the target remote device.
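  • A minimal sketch of the temporary ID and temporary index idea, using the standard org.w3c.dom API; the "tmp-" prefix and the use of the "id" attribute are assumptions.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Sketch: give every element that lacks an ID a temporary one, and build an
// index from ID to element so later processes can jump straight to the nodes
// they need instead of re-walking the whole document.
public class TempIndexStage {

    /** Walks the document once, assigns temporary IDs, and returns the index. */
    public static Map<String, Element> addTempIndex(Document doc) {
        Map<String, Element> index = new LinkedHashMap<>();
        NodeList all = doc.getElementsByTagName("*");
        int counter = 0;
        for (int i = 0; i < all.getLength(); i++) {
            Element el = (Element) all.item(i);
            String id = el.getAttribute("id");
            if (id.isEmpty()) {
                id = "tmp-" + (counter++);
                el.setAttribute("id", id);      // temporary ID, removed later
            }
            index.put(id, el);
        }
        return index;
    }

    /** Strips the temporary IDs after rendering so they never reach the device. */
    public static void removeTempIds(Map<String, Element> index) {
        for (Map.Entry<String, Element> entry : index.entrySet()) {
            if (entry.getKey().startsWith("tmp-")) {
                entry.getValue().removeAttribute("id");
            }
        }
    }
}
```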
  • FIG. 1 shows components of one embodiment of an environment in which the invention may be practiced. Not all the components may be required to practice the invention, and variations in the arrangement and type of the components may be made without departing from the spirit or scope of the invention.
  • system 100 of FIG. 1 includes local area networks (“LANs”)/wide area networks (“WANs”)-(network) 105 , wireless network 110 , Content Rendering Platform (CRP) 106 , mobile devices (client devices) 102 - 104 , client device 101 , and content provider 107 .
  • mobile devices 102 - 104 may include virtually any portable computing device capable of receiving and sending a message over a network, such as network 105, wireless network 110, or the like.
  • Mobile devices 102 - 104 may also be described generally as client devices that are configured to be portable.
  • mobile devices 102 - 104 may include virtually any portable computing device capable of connecting to another computing device and receiving information.
  • Such devices include portable devices such as, cellular telephones, smart phones, display pagers, radio frequency (RF) devices, infrared (IR) devices, Personal Digital Assistants (PDAs), handheld computers, laptop computers, wearable computers, tablet computers, integrated devices combining one or more of the preceding devices, and the like.
  • mobile devices 102 - 104 typically range widely in terms of capabilities and features.
  • a cell phone may have a numeric keypad and a few lines of monochrome LCD display on which only text may be displayed.
  • a web-enabled mobile device may have a touch sensitive screen, a stylus, and several lines of color LCD display in which both text and graphics may be displayed.
  • a web-enabled mobile device may include a browser application that is configured to receive and to send web pages, web-based messages, and the like.
  • the browser application may be configured to receive and display graphics, text, multimedia, and the like, employing virtually any web-based language, including wireless application protocol (WAP) messages, and the like.
  • the browser application for the mobile device is enabled to employ Handheld Device Markup Language (HDML), Wireless Markup Language (WML), WMLScript, JavaScript, Standard Generalized Markup Language (SGML), HyperText Markup Language (HTML), eXtensible Markup Language (XML), and the like, to display content and communicate messages.
  • Mobile devices 102 - 104 also may include at least one other client application that is configured to receive content from another computing device.
  • the client application may include a capability to provide and receive textual content, graphical content, audio content, and the like.
  • the client application may further provide information that identifies itself, including a type, capability, name, and the like.
  • mobile devices 102 - 104 may uniquely identify themselves through any of a variety of mechanisms, including a phone number, Mobile Identification Number (MIN), an electronic serial number (ESN), or other mobile device identifier.
  • the information may also indicate a content format that the mobile device is enabled to employ, mobile device manufacturer, model number, display colors, display size, enabled features, and wireless carrier. Such information may be provided in a message, or the like, sent to CRP 106 , client device 101 , or other computing devices.
  • Mobile devices 102 - 104 may also be configured to communicate a message, such as through Short Message Service (SMS), Multimedia Message Service (MMS), instant messaging (IM), internet relay chat (IRC), Mardam-Bey's IRC (mIRC), Jabber, and the like, between another computing device, such as CRP 106 , client device 101 , or the like.
  • Mobile devices 102 - 104 may be further configured to enable a user to participate in communications sessions, such as IM sessions.
  • mobile devices 102 - 104 may include a client application that is configured to manage various actions on behalf of the client device.
  • the client application may enable a user to interact with the browser application, email application, IM applications, SMS application, MMS application, and the like.
  • Mobile devices 102 - 104 may further be configured to include a client application that enables the end-user to log into an end-user account that may be managed by another computing device, such as Content Provider 107 .
  • Such end-user account may be configured to enable the end-user to receive emails, send/receive IM messages, SMS messages, access selected web pages, participate in a social networking activity, or the like. However, participation in various social networking activities may also be performed without logging into the end-user account.
  • mobile devices 102 - 104 may also communicate with non-mobile client devices, such as client device 101 , or the like.
  • Client device 101 may include virtually any computing device capable of communicating over a network to send and receive information, including social networking information, or the like.
  • the set of such devices may include devices that typically connect using a wired or wireless communications medium such as personal computers, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, or the like.
  • Wireless network 110 is configured to couple mobile devices 102 - 104 and its components with network 105 .
  • Wireless network 110 may include any of a variety of wireless sub-networks that may further overlay stand-alone ad-hoc networks, and the like, to provide an infrastructure-oriented connection for mobile devices 102 - 104 .
  • Such sub-networks may include mesh networks, Wireless LAN (WLAN) networks, cellular networks, and the like.
  • Wireless network 110 may further include an autonomous system of terminals, gateways, routers, and the like connected by wireless radio links, and the like. These connectors may be configured to move freely and randomly and organize themselves arbitrarily, such that the topology of wireless network 110 may change rapidly.
  • Wireless network 110 may further employ a plurality of access technologies including 2nd (2G), 3rd (3G), and 4th (4G) generation radio access for cellular systems, WLAN, WiMax, Wireless Router (WR) mesh, and the like.
  • Access technologies such as 2G, 3G, 4G, and future wireless access networks may enable wide area coverage for mobile devices, such as mobile devices 102 - 104, with various degrees of mobility.
  • wireless network 110 may enable a radio connection through a radio network access such as Global System for Mobile communication (GSM), General Packet Radio Services (GPRS), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (WCDMA), Universal Mobile Telephone System (UMTS), and the like.
  • wireless network 110 may include virtually any wireless communication mechanism by which information may travel between mobile devices 102 - 104 and another computing device, network, and the like.
  • Network 105 is configured to couple CRP 106 and its components with other computing devices, including, mobile devices 102 - 104 , client device 101 , and through wireless network 110 to mobile devices 102 - 104 .
  • Network 105 is enabled to employ any form of computer readable media for communicating information from one electronic device to another.
  • network 105 can include the Internet in addition to local area networks (LANs), wide area networks (WANs), direct connections, such as through a universal serial bus (USB) port, other forms of computer-readable media, or any combination thereof.
  • a router acts as a link between LANs, enabling messages to be sent from one to another.
  • communication links within LANs typically include twisted wire pair or coaxial cable
  • communication links between networks may utilize analog telephone lines, full or fractional dedicated digital lines including T1, T2, T3, and T4, Integrated Services Digital Networks (ISDNs), Digital Subscriber Lines (DSLs), wireless links including satellite links, or other communications links known to those skilled in the art.
  • remote computers and other related electronic devices could be remotely connected to either LANs or WANs via a modem and temporary telephone link.
  • network 105 includes any communication method by which information may travel between CRP 106 , client device 101 , and other computing devices.
  • communication media typically embodies computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave, data signal, or other transport mechanism and includes any information delivery media.
  • modulated data signal and “carrier-wave signal” includes a signal that has one or more of its characteristics set or changed in such a manner as to encode information, instructions, data, and the like, in the signal.
  • communication media includes wired media such as twisted pair, coaxial cable, fiber optics, wave guides, and other wired media and wireless media such as acoustic, RF, infrared, and other wireless media.
  • CRP 106 may include any computing device capable of connecting to network 105 to enable a platform for language agnostic rendering of markup language templates and pages for subsequent display by a particular remote device, such as mobile devices 102 - 104 and client device 101 .
  • a rendering context for the particular remote device and a markup language document are received by the platform, which processes both to generate a fully rendered markup language document that is subsequently delivered to, and displayed by, that particular remote device.
  • Devices that may operate as CRP 106 include personal computers, desktop computers, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, servers, and the like.
  • Although FIG. 1 illustrates CRP 106 as a single computing device, the invention is not so limited.
  • one or more functions of CRP 106 may be distributed across one or more distinct computing devices.
  • content rendering and the like may be performed by a plurality of computing devices, without departing from the scope or spirit of the present invention.
  • Content provider 107 can also include a variety of services used to provide content to remote devices. Such services include, but are not limited to, web services, third-party services, audio services, video services, email services, IM services, SMS services, MMS services, VoIP services, video game services, gaming services, calendaring services, shopping services, photo services, or the like. Devices that may operate as content provider 107 include personal computers, desktop computers, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, servers, and the like.
  • FIG. 2 shows one embodiment of mobile device 200 that may be included in a system implementing the invention.
  • Mobile device 200 may include many more or fewer components than those shown in FIG. 2. However, the components shown are sufficient to disclose an illustrative embodiment for practicing the present invention.
  • Mobile device 200 may represent, for example, mobile devices 102 - 104 of FIG. 1 .
  • mobile device 200 includes a processing unit (CPU) 222 in communication with a mass memory 230 via a bus 224 .
  • Mobile device 200 also includes a power supply 226 , one or more network interfaces 250 , an audio interface 252 , a display 254 , a keypad 256 , an illuminator 258 , an input/output interface 260 , a haptic interface 262 , and an optional global positioning systems (GPS) receiver 264 .
  • Power supply 226 provides power to mobile device 200 .
  • a rechargeable or non-rechargeable battery may be used to provide power.
  • the power may also be provided by an external power source, such as an AC adapter or a powered docking cradle that supplements and/or recharges a battery.
  • Mobile device 200 may optionally communicate with a base station (not shown), or directly with another computing device.
  • Network interface 250 includes circuitry for coupling mobile device 200 to one or more networks, and is constructed for use with one or more communication protocols and technologies including, but not limited to, global system for mobile communication (GSM), code division multiple access (CDMA), wideband CDMA (WCDMA), time division multiple access (TDMA), Universal Mobile Telephone Service (UMTS), user datagram protocol (UDP), transmission control protocol/Internet protocol (TCP/IP), SMS, general packet radio service (GPRS), WAP, ultra wide band (UWB), IEEE 802.16 Worldwide Interoperability for Microwave Access (WiMax), SIP/RTP, or any of a variety of other wireless communication protocols.
  • Audio interface 252 is arranged to produce and receive audio signals such as the sound of a human voice.
  • audio interface 252 may be coupled to a speaker and microphone (not shown) to enable telecommunication with others and/or generate an audio acknowledgement for some action.
  • Display 254 may be a liquid crystal display (LCD), gas plasma, light emitting diode (LED), or any other type of display used with a computing device.
  • Display 254 may also include a touch sensitive screen arranged to receive input from an object such as a stylus or a digit from a human hand.
  • Keypad 256 may comprise any input device arranged to receive input from a user.
  • keypad 256 may include a push button numeric dial, or a keyboard.
  • Keypad 256 may also include command buttons that are associated with selecting and sending images.
  • Illuminator 258 may provide a status indication and/or provide light. Illuminator 258 may remain active for specific periods of time or in response to events. For example, when illuminator 258 is active, it may backlight the buttons on keypad 256 and stay on while the client device is powered. Also, illuminator 258 may backlight these buttons in various patterns when particular actions are performed, such as dialing another client device. Illuminator 258 may also cause light sources positioned within a transparent or translucent case of the client device to illuminate in response to actions.
  • Mobile device 200 also comprises input/output interface 260 for communicating with external devices, such as a headset, or other input or output devices not shown in FIG. 2 .
  • Input/output interface 260 can utilize one or more communication technologies, such as USB, infrared, Bluetooth™, or the like.
  • Haptic interface 262 is arranged to provide tactile feedback to a user of the client device. For example, the haptic interface may be employed to vibrate mobile device 200 in a particular way when another user of a computing device is calling.
  • GPS transceiver 264 can determine the physical coordinates of mobile device 200 on the surface of the Earth, typically output as latitude and longitude values. GPS transceiver 264 can also employ other geo-positioning mechanisms, including, but not limited to, triangulation, assisted GPS (AGPS), E-OTD, CI, SAI, ETA, BSS, or the like, to further determine the physical location of mobile device 200 on the surface of the Earth. It is understood that under different conditions, GPS transceiver 264 can determine a physical location within millimeters for mobile device 200; and in other cases, the determined physical location may be less precise, such as within a meter or significantly greater distances. In one embodiment, however, mobile device 200 may, through other components, provide other information that may be employed to determine a physical location of the device, including, for example, a MAC address, IP address, or the like.
  • Mass memory 230 includes a RAM 232, a ROM 234, and other storage means. Mass memory 230 illustrates another example of computer storage media for storage of information such as computer readable instructions, data structures, program modules, or other data. Mass memory 230 stores a basic input/output system ("BIOS") 240 for controlling low-level operation of mobile device 200. The mass memory also stores an operating system 241 for controlling the operation of mobile device 200. It will be appreciated that this component may include a general purpose operating system such as a version of UNIX or LINUX™, or a specialized client communication operating system such as Windows Mobile™ or the Symbian® operating system. The operating system may include, or interface with, a Java virtual machine module that enables control of hardware components and/or operating system operations via Java application programs.
  • Memory 230 further includes one or more data storage 244 , which can be utilized by mobile device 200 to store, among other things, applications 242 and/or other data.
  • data storage 244 may also be employed to store information that describes various capabilities of mobile device 200 . The information may then be provided to another device based on any of a variety of events, including being sent as part of a header during a communication, sent upon request, or the like.
  • Applications 242 may include computer executable instructions which, when executed by mobile device 200 , transmit, receive, and/or otherwise process messages (e.g., SMS, MMS, IM, email, and/or other messages), audio, video, and enable telecommunication with another user of another client device.
  • Other examples of application programs include calendars, browsers, email clients, IM applications, SMS applications, VoIP applications, contact managers, task managers, transcoders, database programs, word processing programs, security applications, spreadsheet programs, video games, gaming programs, search programs, shopping cart programs, and so forth.
  • Applications 242 may further include browser 245 .
  • Browser 245 may be configured to receive and enable a display of rendered content provided by CRP 106 from content provider 107 . Further, browser 245 enables the user of mobile device 200 to select different actions displayed by the rendered content. In at least one embodiment, browser 245 enables the user to select one or more of a product to purchase, search for content and display the result, call a mobile telephonic device, display and respond to messages, or the like. Various embodiments for rendering the content for display on the mobile device are described in more detail below.
  • FIG. 3 shows one embodiment of a network device, according to one embodiment of the invention.
  • Network device 300 may include many more components than those shown. The components shown, however, are sufficient to disclose an illustrative embodiment for practicing the invention.
  • Network device 300 may represent, for example, CRP 106 , Client device 101 , and/or Content provider 107 of FIG. 1 .
  • Network device 300 includes processing unit 312 , video display adapter 314 , and a mass memory, all in communication with each other via bus 322 .
  • the mass memory generally includes RAM 316 , ROM 332 , and one or more permanent mass storage devices, such as hard disk drive 328 , tape drive, optical drive, and/or floppy disk drive.
  • the mass memory stores operating system 320 for controlling the operation of network device 300 . Any general-purpose operating system may be employed.
  • A basic input/output system ("BIOS") is also provided for controlling the low-level operation of network device 300.
  • network device 300 also can communicate with the Internet, or some other communications network, via network interface unit 310 , which is constructed for use with various communication protocols including the TCP/IP protocol.
  • Network interface unit 310 is sometimes known as a transceiver, transceiving device, or network interface card (NIC).
  • Computer storage media may include volatile, nonvolatile, removable, and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
  • Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computing device.
  • the mass memory also stores program code and data.
  • One or more applications 350 are loaded into mass memory and run on operating system 320 .
  • Examples of application programs may include transcoders, schedulers, calendars, database programs, word processing programs, HTTP programs, customizable user interface programs, IPSec applications, encryption programs, security programs, VPN programs, SMS message servers, IM message servers, email servers, account management and so forth.
  • Content rendering platform (CRP) 354 may also be included as an application program within applications 350 .
  • CRP 354 is configurable as a platform and/or a server that receives rendering context information and a markup language document from one or more resources. CRP 354 automatically renders and tailors the markup language document, based at least in part on the rendering context, in a manner suited for subsequent display and/or interaction on a target remote device.
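  • The platform's top-level contract, as described above, might be sketched as the following interface; the type and method names are assumptions, not the patent's API.

```java
// Sketch of the platform's top-level contract: take a markup document plus a
// rendering context and return a fully rendered document for the target
// device. The interface and method names are hypothetical.
public interface ContentRenderingPlatform {

    /**
     * @param mlDocument       the composite markup language document to render
     * @param renderingContext device parameters (screen size, markup dialect, known bugs, ...)
     * @return a fully rendered document, paginated for the target remote device
     */
    String render(String mlDocument, String renderingContext);
}
```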
  • FIG. 4 illustrates an overview of one embodiment of platform 400 for employing a composite Markup Language (ML) document and a Rendering Context to fully render the document for subsequent display on a target remote device.
  • Composite ML document 406 can be decomposed into more ML code.
  • Rendering Context 404 can include a unique identifier for identifying the particular combination of data and parameters for a target remote device.
  • main component 402 receives both composite ML document 406 and rendering context 404 .
  • Main component 402 validates the composite ML code and checks the data, parameters and unique identifier included with rendering context 404 . If an error is detected, main component 402 provides a notification of such an error, which can be provided to the target remote device and/or the provider of the composite ML document to the main component.
  • the composite ML document is provided by a content provider, other platform, and/or application to platform 400 .
  • the rendering context is separately determined either in real time and/or out of band from one or more services, platforms, applications, and/or sources, including the manufacturer of the remote device, header information in a message from the remote device, information from a gateway for a carrier that is in communication with the remote device, known bugs in software and/or hardware for the mobile device.
  • Cache 412 is arranged to store a document object model tree for code included in the composite ML document. Also, if a document object model tree is not initially present in cache 412, ML parser 410 parses the document to create composite ML document object model tree 414, which is subsequently stored in the cache. In any case, the cached tree is passed to markup expander 418, which generates a document object model tree of native ML tags. Resource manager 434 employs component 436 to enable substitution for named resources and component 438 to expand composite tags from a library.
  • Markup expander 418 provides Native ML tree 424 to Conditioner 426 where pre-resolution processes are performed, e.g., removal of whitespace and pagination for subsequent display by the target remote device.
  • the conditioned native ML tree is provided to Markup Resolver 428 where Component 430 executes the rendering of templates and component 432 renders tags that are included in a cache.
  • Tag templates are provided by Component 440 which is associated with Resource Manager 434 .
  • Markup Resolver 428 provides Main component 402 with a fully rendered version of the initially provided composite ML document. Main component 402 subsequently provides the fully rendered markup Document 416 for delivery to the target remote device.
  • the rendering can include a plurality of different processes that contribute to the full rendering of ML document, including: (1) determining if location information regarding the target remote device can be provided in response to a request; (2) minifying cascading style sheets (CSS) by removing extraneous sheets; (3) fontifying cascading style sheets by stripping them out and rewriting as ML code; (4) shrinking the ML document to remove any ML code that is extraneous to the target remote device; (5) specifying one or more attributes of each media item to be embedded or linked in a page; (6) employing an estimate of each page size to repaginate the ML document pages to a size that is no more than the effective display screen size of the target remote device; (7) tailoring/removing ML code in a page that is estimated to be rendered as larger than the effective display screen size of the target remote device; and (8) rewriting URLs to include locations of media items and/or session identification information.
  • the pagination process is generally performed in conjunction with the combined operation of Markup Resolver 428 and Conditioner 426 .
  • Rendering Context 404 is queried by Conditioner 426 for the effective/maximum page size for the target remote device for each page.
  • A general estimate of the size of each page is determined by analyzing the nodes included in each page. The estimate is compared to the effective page size to determine if there is an offset of page data that must be either repaginated to another new page or tailored/cut from an existing page.
  • Conditioner 426 also walks the native ML tree to identify each node that is defined as breakable onto which the offset falls. The container following the breakable node is identified by Conditioner 426 with a tag as the break off point for subsequent markup resolution by Markup Resolver 428 .
  • the pagination process is completed by Markup Resolver 428 which renders nodes in a page that are tagged by Conditioner 426 for rendering and not others. However, each time a node is rendered, the size of the rendered node is noted in a running total size for that particular page.
  • the resolver checks whether or not the split occurs within another tag. If the split occurs on a tag, then the split is set to follow a previous tag or data. However, if the split does not occur on a tag, then a breaking method is performed to find an offset such that the split occurs over whitespace or within an overly long word, e.g., one that is 100 characters long, a limit that can be adjusted by Resource Manager 434. Additionally, after inserting the split in the page, Markup Resolver 428 can render an end of page tag that includes displayable links to the next and previous pages of the split page as necessary.
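  • The pagination and splitting behavior described above might be sketched roughly as follows, using a plain character-count size estimate and the 100-character word limit mentioned above; both the estimate and the splitting heuristic are illustrative assumptions.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the pagination step: estimate the rendered size of the content,
// split it into pages no larger than the device's effective page size, and
// prefer to split on whitespace unless a single word is overly long.
public class Paginator {

    /** Splits text into pages of at most maxPageSize characters. */
    public static List<String> paginate(String text, int maxPageSize) {
        List<String> pages = new ArrayList<>();
        int start = 0;
        while (start < text.length()) {
            int end = Math.min(start + maxPageSize, text.length());
            if (end < text.length()) {
                // Walk back to whitespace so the split does not fall mid-word,
                // but give up after 100 characters (an "overly long word").
                int split = end;
                while (split > start && !Character.isWhitespace(text.charAt(split))
                        && end - split < 100) {
                    split--;
                }
                if (split > start) {
                    end = split;
                }
            }
            pages.add(text.substring(start, end).trim());
            start = end;
        }
        return pages;
    }
}
```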
  • the platform can facilitate a post-rendering component to embed a media item in the fully rendered markup document for subsequent display/playback with the target remote device.
  • the media item can include an image, audio file, sound, graphic, video, animation, or the like. Additionally, localization of the text and any other element for the fully rendered document can be performed prior to providing the document to the target remote device.
  • FIG. 5A illustrates overview 500 of one embodiment of platform 502 for pipelined full rendering of Markup Language (ML) document 504 based at least in part on corresponding Rendering Context 506 for subsequent display on a target remote device.
  • Rendering Context 506 and ML document 504 can be pre-processed by master document assembler 508 where informalities are handled, e.g., units can be added, blanks filled in, and/or footers added.
  • Master Document assembler 508 provides (preprocessed) ML document 504 and rendering context 506 to platform 502, where pipeline builder component 512 walks the document to determine and identify the different stages of processing that contribute to fully rendering the ML document, based also at least in part on the rendering context.
  • Pipeline builder component 512 employs pipeline stage library 514 to generate pipeline stages component 516 which is arranged to include a stage for each identified process that contributes to the full rendering of ML document 504 . Further, main component 510 generates a document object model (DOM) tree that is based on ML document 504 and Rendering Context 506 . Both the DOM tree and Rendering Context 506 are provided to at least a portion of the stages in pipeline stages component 516 .
  • the stages can enable a plurality of different processes that contribute to the full rendering of ML document 504 , including: (1) determining if location information regarding the target remote device can be provided in response to a request; (2) minifying cascading style sheets (CSS) by removing extraneous sheets; (3) fontifying cascading style sheets by stripping them out and rewriting as ML code; (4) shrinking the ML document to remove any ML code that is extraneous to the target remote device; (5) specifying one or more attributes of each media item to be embedded or linked in a page; (6) employing an estimate of each page size to repaginate the ML document pages to a size that is no more than the effective display screen size of the target remote device; (7) tailoring/removing ML code in a page that is estimated to be rendered as larger than the effective display screen size of the target remote device; and (8) rewriting URLs to include locations of media items and/or session identification information.
  • Temp Index Add and Temp Index Remove stages can optionally be included in pipeline stage library 514 and included as a processing stage in pipeline stage component 516 .
  • the Temp Index Add stage can be arranged to walk ML document 504 and identify portions in the document that do not already have an XML ID tag, and subsequently provide a Temp ID tag to these identified portions. Also, this stage can build a Temporary Index for all of the temporary and existing XML ID tags. Other pipelined stages in component 516 can subsequently use the Temporary Index to more quickly access just those elements in the ML document that are to be rendered by a particular stage.
  • the Temp Index Remove stage removes the Temp XML IDs and Temporary Index. The use of Temp XML IDs and a Temporary Index can reduce the likelihood that a particular process has to walk the entire ML document to perform its portion of the full rendering of the document for the target remote device.
  • Serializer 518 receives the fully rendered DOM tree for the target remote device and converts it into a stream of character bytes that are suitable for transmission over a communication link with the device. Additionally, a separate component can be arranged to localize the character bytes for a particular language, and another component can be arranged to embed a media item in the character stream for the target remote device.
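  • A minimal sketch of the serialization step using the standard JAXP transformer; the transformer-based approach and the UTF-8 encoding are assumptions rather than the patent's implementation.

```java
import java.io.StringWriter;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;

// Sketch of the serializer stage: turn the fully rendered DOM tree into a
// character stream suitable for transmission to the device.
public class DomSerializer {

    public static String serialize(Document renderedDom) throws Exception {
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.ENCODING, "UTF-8");
        t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "no");
        StringWriter out = new StringWriter();
        t.transform(new DOMSource(renderedDom), new StreamResult(out));
        return out.toString();
    }
}
```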
  • FIG. 5B illustrates an overview of one embodiment of a plurality of exemplary stages that can be included in pipelined stages component 516 as shown in FIG. 5A .
  • Component 516 is shown to include as follows: Temp Index Add stage 530 , Location Request stage 532 , Minify stage 534 , Paginate stage 536 , Fontify stage 538 , Media Item stage 540 , and Temp Index Remove stage 542 .
  • the stages included in component 516 correspond to a particular target remote device; some of the same stages, though not all of them, plus possibly other stages, might be included in component 516 if full rendering is to be performed for a different target remote device.
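  • The pipelined stages of FIG. 5B might be modeled roughly as follows; the Stage interface and the way stages are chained over a shared DOM tree are assumptions.

```java
import java.util.List;
import org.w3c.dom.Document;

// Sketch of the pipelined stages of FIG. 5B: each stage receives the DOM tree
// and the rendering context and transforms the tree in place. The Stage
// interface and the chaining scheme are hypothetical.
public class RenderingPipeline {

    public interface Stage {
        void apply(Document dom, String renderingContext);
    }

    private final List<Stage> stages;

    public RenderingPipeline(List<Stage> stages) {
        // e.g. Temp Index Add, Minify, Paginate, Fontify, Media Item, Temp Index Remove
        this.stages = stages;
    }

    /** Runs every stage in order over the same DOM tree. */
    public void run(Document dom, String renderingContext) {
        for (Stage stage : stages) {
            stage.apply(dom, renderingContext);
        }
    }
}
```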
  • FIG. 6 illustrates overview 600 of a class tree that can be instantiated from modules included in a document whose modules are created in a mobile Markup Language (MML).
  • MML is relatively independent of considerations for the target remote device and its classes can be instantiated by at least the different embodiments of a platform that are discussed above.
  • the main component of a platform can automatically parse the MML document to identify each MML module and subsequently instantiate a tree of classes based on the included MML modules.
  • These classes are arranged to generate a corresponding document that includes native XML for a target mobile device.
  • the platform can subsequently parse this corresponding native XML document to fully render the intent of the programmer for display by the target remote device.
  • the classes are arranged to generate virtually any type of Markup Language document that can include native code for rendering a display of pages on the target remote device.
  • the rendering can include a plurality of different processes that contribute to the full rendering of MML document, including: (1) determining if location information regarding the target remote device can be provided in response to a request; (2) minifying cascading style sheets (CSS) by removing extraneous sheets; (3) fontifying cascading style sheets by stripping them out and rewriting as ML code; (4) shrinking the ML document to remove any ML code that is extraneous to the target remote device; (5) specifying one or more attributes of each media item to be embedded or linked in a page; (6) employing an estimate of each page size to repaginate the ML document pages to a size that is no more than the effective display screen size of the target remote device; (7) tailoring/removing ML code in a page that is estimated to be rendered as larger than the effective display screen size of the target remote device; and (8) rewriting URLs to include locations of media items and/or session identification information.
  • the MML document can include at least two categories of data structures, e.g., structure and module tags.
  • the structure category can include content tags, form tags, mode tags, and organization tags, and the like.
  • the module category can include generic module tags, global module tags, and property module tags. Additionally, some of the module tags can be arranged for tailoring the rendering of the MML document.
  • the MML tags are primarily used to capture the intent of the programmer, not the styling necessary to actually render pages for display on a target remote device.
  • the generic module tag <module.list> is handled as a request by the platform to render a list for display on a target mobile device.
  • the platform handles these details by using data and parameters included in the Rendering Context, e.g., portrait or landscape orientation of the list, width and height of the effective display screen, and the pagination size for displaying pages.
  • the rendering of the intent of the MML tags is handled by at least one of the platforms discussed above in a manner substantially similar to other tags provided in other types of markup language documents.
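  • Continuing the <module.list> example, a sketch of how the platform might consult rendering-context parameters when expanding the list; the parameter names (items per page, orientation) and the emitted tags are illustrative assumptions.

```java
import java.util.List;

// Sketch: expand <module.list> using the rendering context so the emitted
// markup already respects the device's pagination size and orientation.
// The parameter names and the emitted tags are hypothetical.
public class ListModuleRenderer {

    public static String render(List<String> items, int itemsPerPage, boolean landscape) {
        StringBuilder xml = new StringBuilder(landscape ? "<list cols=\"2\">" : "<list cols=\"1\">");
        int limit = Math.min(items.size(), itemsPerPage);
        for (int i = 0; i < limit; i++) {
            xml.append("<item>").append(items.get(i)).append("</item>");
        }
        if (items.size() > limit) {
            xml.append("<item href=\"?page=2\">More...</item>");  // link to the next page
        }
        return xml.append("</list>").toString();
    }
}
```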
  • FIG. 7 illustrates overview 700 of a process generally employed by any embodiments of the inventive platforms to render a Markup Language (ML) document for display by a target remote device.
  • a Rendering Context as defined elsewhere in the application, for a particular target remote device is received.
  • a Markup Language document is also received for rendering for the target remote device.
  • the ML document and the Rendering Context are generated and subsequently provided by one or more of the same or different resources, including, but not limited to, content providers, carriers, web services, affiliates, users, and websites.
  • the Markup Language document can be coded in virtually any standard or non-standard Markup Language.
  • the data and parameters included in the Rendering Context for the target remote device are employed by one or more processes to fully render the ML document for display by the remote device.
  • the process can optionally provide a media item that can be either referenced by or embedded in the fully rendered ML document.
  • the fully rendered ML document plus any optionally included media items are made available for subsequent display by the target remote device.
  • the carrier provides the fully rendered ML document as a character string over a communication link to the target remote device.
  • the process returns to performing other actions.
  • FIG. 8 illustrates overview 800 of the process that can be performed by at least the platform illustrated in FIG. 4 .
  • the process steps to block 802 where a Markup Language (ML) document and Rendering Context are received from another resource, and the ML document is to be rendered for a target remote device.
  • the process checks the validity of the data and parameters included in the ML document and the Rendering Context.
  • a document object model tree is generated.
  • the Rendering Context enables the DOM tree to be populated with ML code that is native to the target remote device.
  • Stepping to block 810 the process determines pagination for the rendered pages of the ML document.
  • the process paginates these pages so that they are no greater in size than the effective size of the display screen for the target remote device. In some cases, next and previous links are created for newly paginated pages.
  • the process can also remove relatively unnecessary whitespace in the ML document.
  • the process parses the populated DOM tree to resolve the native ML code into a fully rendered ML document.
  • one or more media items can optionally be either referenced by or embedded in the fully rendered ML document.
  • the fully rendered ML document plus any optionally included media items are made available for subsequent display by the target remote device.
  • the carrier provides the fully rendered ML document as a character string over a communication link to the target remote device. Next, the process returns to performing other actions.
  • FIG. 9 illustrates overview 900 of a process for employing at least one of the disclosed embodiments of a platform to fully render a document written in a mobile Markup Language (MML) that is arranged to capture a programmer's intent, and not the particular details associated with the actual rendering of this document for subsequent display by a target remote device.
  • the process flows to block 902 where the MML document and a Rendering Context for the target mobile device are received from one or more of the same, or different, resources.
  • the MML document is parsed to identify each module in the document, and each identified module is subsequently instantiated.
  • the process generates a tree of ML code from the instantiated modules and where the generated ML code is also native to the target remote device.
  • the ML code in the tree is XML code.
  • the tree is parsed to resolve the ML code and paginate pages for subsequent display by the target remote device.
  • the process generates the fully rendered ML document for display by the target remote device.
  • the process can optionally provide a media item that can be either referenced by or embedded in the fully rendered ML document.
  • the fully rendered ML document plus any optionally included media items are made available for subsequent display by the target remote device.
  • the carrier provides the fully rendered ML document as a character string over a communication link to the target remote device.
  • the process returns to performing other actions.
  • FIG. 10 illustrates overview 1000 of a process for employing a platform to fully render a Markup Language (ML) document with a platform that provides pipelined stages for rendering different portions of the ML document.
  • the process flows to block 1002 where the ML document and a Rendering Context for the target mobile device are received from one or more of the same, or different, resources.
  • the ML document is assembled for processing. For example, common headers and footers, units, and other incidental elements can be added to the ML document.
  • the process parses the ML document and reviews the data and parameters of the Rendering Context to build an instance of pipelined stages that can enable the full rendering of the elements included in the ML document.
  • the process also converts the assembled ML document into a document object model (DOM) document.
  • the process performs pipelined stage rendering and pagination of the elements included in the DOM document based at least in part on the Rendering Context for the target remote device.
  • the process serializes the rendered DOM document into a character string, which can optionally be localized to a particular language that corresponds to the target remote device.
  • the process can optionally provide a media item that can be either referenced by or embedded in the fully rendered character string.
  • the fully rendered character string plus any optionally included media items are made available for subsequent display by the target remote device.
  • the carrier provides the fully rendered ML document as a character string over a communication link to the target remote device.
  • the process returns to performing other actions.
  • FIG. 11A illustrates process 1100 for a platform that creates and indexes temporary IDs for identified portions in a received XML document to speed up the pipeline stage rendering of the document's pages for subsequent display by a target remote device.
  • a received eXtensible Markup Language (XML) document is parsed to identify portions for subsequent processing.
  • a Rendering Context for a target remote device is also received.
  • decision block 1106 if an XML ID exists for an identified portion of the document, then the process moves to block 1108 . However, if the identified portion of the document doesn't correspond to an existing XML ID, then a temporary XML ID is added to the identified portion at block 1104 .
  • an identified portion of the document corresponds to either an existing or temporary XML ID
  • the process flows to block 1108 where a temporary index is generated for the XML document that lists the location of identified portions in the XML document based at least in part on the existing or temporary XML ID.
  • the temporary index and the Rendering Context are employed to estimate the page size of the XML document for display on the target remote device, and subsequently tailor and/or repaginate pages with oversized estimates. Also, session IDs can be added and URLs rewritten as necessary. Advancing to block 1112 , the process employs the temporary index and Rendering Context to perform pipelined stage rendering on identified portions until the entire XML document is fully rendered.
  • the temporary index and the temporary XML IDs are removed from the XML document.
  • the process can optionally provide a media item that can be either referenced by or embedded in the fully rendered XML document.
  • the fully rendered XML document plus any optionally included media items are made available for subsequent display by the target remote device.
  • the carrier provides the fully rendered ML document as a character string over a communication link to the target remote device.
  • FIG. 11B illustrates process 1120 for a platform that creates and indexes temporary IDs for identified portions in a received Markup Language (ML) document to speed up the rendering of the document's pages for subsequent display by a target remote device.
  • the ML document can be written in virtually any standard or non-standard Markup language.
  • the process flows to block 1122 where a received Markup Language (ML) document is parsed to identify portions of the document for subsequent processing.
  • a Rendering Context for a target remote device is also received.
  • decision block 1124 if an ML ID exists for an identified portion, then the process moves to block 1128. However, if the identified portion doesn't correspond to an existing ML ID, then a temporary ML ID is added to the identified portion at block 1126. In any case, after each identified portion of the document corresponds to either an existing or temporary ML ID, the process flows to block 1128 where a temporary index is generated for the ML document that lists the location of the identified portions in the ML document based at least in part on the existing or temporary ML ID.
  • the temporary index and the Rendering Context are employed to estimate the page size of the ML document for display on the target remote device, and subsequently tailor and/or repaginate pages with oversized estimates. Also, session IDs can be added and URLs rewritten as needed. Advancing to block 1132, the process employs the temporary index and Rendering Context to perform rendering of the identified portions until the entire ML document is fully rendered.
  • the temporary index and the temporary ML IDs are removed from the rendered ML document.
  • the process can optionally provide a media item that can be either referenced by or embedded in the fully rendered ML document.
  • the fully rendered ML document plus any optionally included media items are made available for subsequent display by the target remote device.
  • the carrier provides the fully rendered ML document as a character string over a communication link to the target remote device.
  • blocks of the flowchart illustrations support combinations of means for performing the specified actions, combinations of steps for performing the specified actions and program instruction means for performing the specified actions. It will also be understood that each block of the flowchart illustration, and combinations of blocks in the flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified actions or steps, or combinations of special purpose hardware and computer instructions.

Abstract

A platform for customized rendering of an editable mobile markup language document for a mobile device. A rendering context and an editable mobile markup language (MML) document for the mobile device are received by the platform, which performs processes that can paginate and fully render pages that are subsequently delivered for display by the mobile device. The MML document and its modules and content are independent of the rendering context for the mobile device. The mobile device can be arranged as a client device that provides for wired and/or wireless communication over a network.

Description

    RELATED APPLICATION
  • This utility patent application is a Continuation of U.S. patent application Ser. No. 11/537,593, filed on Sep. 26, 2006, the benefit of which is claimed under 35 U.S.C. §120, and which is further incorporated herein by reference.
  • FIELD OF THE INVENTION
  • The invention is generally directed to providing content over a network, and more particularly to enabling content from disparate sources to be expressed with a markup language that is both resolved with the assistance of a server and independent of a target remote device.
  • BACKGROUND OF THE INVENTION
  • Recent surveys have identified over 10,000 different models of mobile devices, such as mobile telephones, in operation worldwide. To meet the growing popularity of mobile devices, ten or more new models are being introduced into the marketplace each week. Also, there are hundreds of different carriers around the world that enable a wide variety of wireless services and communication links with different capabilities for providing content to mobile devices and other remotely located devices. Consequently, the context for providing and rendering content for use with a target remote device and/or a carrier can vary widely.
  • For example, there is not a standard size or color palette for display screens. Consequently, content rendered for use with a color display of one size may not be accurately displayable, if at all, by a monochrome display of a different size. Also, the capacity and reliability of different communication links provided by carriers to their individual customers can significantly impact the accurate and timely rendering of content for display on a remote device. Additionally, the general operation and known bugs in client applications such as browsers can widely differ. Furthermore, a developer may create content in one or more different languages that can have parameters that must be considered for accurate rendering of the content for display on a target remote device.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Non-limiting and non-exhaustive embodiments of the present invention are described with reference to the following drawings. In the drawings, like reference numerals refer to like parts throughout the various figures unless otherwise specified.
  • For a better understanding of the present invention, reference will be made to the following Detailed Description Of The Embodiments, which is to be read in association with the accompanying drawings, wherein:
  • FIG. 1 illustrates a diagram of one embodiment of an exemplary system in which the invention may be practiced;
  • FIG. 2 shows one embodiment of an exemplary mobile device;
  • FIG. 3 illustrates one embodiment of an exemplary network device;
  • FIG. 4 shows one embodiment of an exemplary platform for rendering a markup language document for display by a target remote device;
  • FIG. 5A illustrates another embodiment of an exemplary platform that employs pipelined stages to render a markup language document for display by a target remote device;
  • FIG. 5B shows one embodiment of exemplary pipelined stages that are employed with a platform to render a markup language document for display by a target remote device;
  • FIG. 6 illustrates yet another embodiment of an exemplary class tree for modules that enable a platform to render a markup language document for display by a target remote device;
  • FIG. 7 shows an overview of a process for generally employing a platform to render a markup language document for display by a target remote device;
  • FIG. 8 illustrates an overview of a process for employing a platform for rendering a markup language document for display by a target remote device;
  • FIG. 9 shows an overview of a process for employing a class tree for modules that enable rendering of a markup language document for display by a target remote device;
  • FIG. 10 illustrates a process for pipelined stages that render a markup language document for display by a target remote device;
  • FIG. 11A shows a process for employing Temporary IDs and Indexes to pipeline process the rendering of a markup language document for display by a target remote device; and
  • FIG. 11B illustrates a process for employing Temporary IDs and Temporary Indexes to render a markup language document for display by a target remote device, in accordance with the invention.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • The present invention now will be described more fully hereinafter with reference to the accompanying drawings, which form a part hereof, and which show, by way of illustration, specific exemplary embodiments by which the invention may be practiced. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Among other things, the present invention may be embodied as methods or devices. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. The following detailed description is, therefore, not to be taken in a limiting sense.
  • Throughout the specification and claims, the following terms take the meanings explicitly associated herein, unless the context clearly dictates otherwise. The phrase “in one embodiment” as used herein does not necessarily refer to the same embodiment, though it may. Furthermore, the phrase “in another embodiment” as used herein does not necessarily refer to a different embodiment, although it may. Thus, as described below, various embodiments of the invention may be readily combined, without departing from the scope or spirit of the invention.
  • In addition, as used herein, the term “or” is an inclusive “or” operator, and is equivalent to the term “and/or,” unless the context clearly dictates otherwise. The term “based on” is not exclusive and allows for being based on additional factors not described, unless the context clearly dictates otherwise. In addition, throughout the specification, the meaning of “a,” “an,” and “the” include plural references. The meaning of “in” includes “in” and “on.”
  • As used herein, the term “receiving” an item, such as a request, response, or other message, from a device or component includes receiving the message indirectly, such as when forwarded by one or more other devices or components. Similarly, “sending” an item to a device or component includes sending the item indirectly, such as when forwarded by one or more other devices or components.
  • As used herein, the term “mobile identification number” (MIN) refers to a number that uniquely identifies a mobile device within a mobile carrier's network. A cellular telephone's phone number may be used as a MIN.
  • As used herein, the term “mobile client application” refers to an application that runs on a mobile device. A mobile client application may be written in one or more of a variety of languages, such as ‘C’, ‘C++’, ‘J2ME’, “Brew”, Java, and the like. Browsers, email clients, text messaging clients, calendars, and games are examples of mobile client applications.
  • As used herein, the term “network application” refers to a computer-based application that communicates, directly or indirectly, with at least one other component across a network. Web sites, email servers, messaging servers, and game servers are examples of network applications.
  • As used herein, the term “uniform resource identifier” (URI) refers to an identifier used to identify an abstract or physical resource. The term URI includes a uniform resource locator (URL) and a uniform resource name (URN). RFC 3986 describes a syntax for a URI. As used herein, the term URI is not limited to this syntax, and may include other syntaxes.
  • Briefly stated, the invention is directed to a platform for enabling customized rendering of markup language pages provided over a network for subsequent display by a remote device. A rendering context and a markup language (ML) document for the target remote device are received by the platform, which enables processes that can paginate and fully render pages that are subsequently delivered for display by the target remote device. A post-rendering process may also be provided to perform additional processing of media items for the rendered ML document. This additional processing may include retrieving and embedding images in pages of the rendered ML document. For example, if a rendered page is in XML format, and includes a link to an image, the post-processing component may retrieve the image and embed it within the XML page as base64-encoded data or another format. The platform is markup language agnostic and can employ templates in the custom rendering process. Also, in at least one embodiment, the remote device is arranged as a client device that provides for wired and/or wireless communication over a network.
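  • As a rough illustration of such a post-rendering step, the following Java sketch inlines linked images as base64 data URIs. The class name, the img/src element and attribute names, and the image MIME type are assumptions made for this example, not details taken from the platform itself.

```java
import java.io.InputStream;
import java.net.URL;
import java.util.Base64;

import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Hypothetical post-rendering step: fetch each linked image and inline it as a
// base64 data URI so the rendered page needs no further image round trips.
public class MediaEmbedder {
    public static void embedImages(Document renderedPage) throws Exception {
        NodeList images = renderedPage.getElementsByTagName("img"); // element name is illustrative
        for (int i = 0; i < images.getLength(); i++) {
            Element img = (Element) images.item(i);
            String src = img.getAttribute("src");
            if (src.isEmpty() || src.startsWith("data:")) {
                continue; // nothing to fetch, or already embedded
            }
            try (InputStream in = new URL(src).openStream()) {
                byte[] bytes = in.readAllBytes();
                String encoded = Base64.getEncoder().encodeToString(bytes);
                // Replace the external reference with inline base64 data.
                img.setAttribute("src", "data:image/png;base64," + encoded);
            }
        }
    }
}
```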
  • The markup language document can be provided in virtually any standard or non-standard format, including, but not limited to, Handheld Device Markup Language (HDML), Wireless Markup Language (WML), WMLScript, Standard Generalized Markup Language (SGML), HyperText Markup Language (HTML), mobile HTML (mHTML), compact HTML (cHTML), eXtensible Markup Language (XML), and the like.
  • In at least one embodiment, the markup language document may be provided to the platform in a mobile Markup Language (MML) that includes modules that are independent of the target remote device. In this case, the platform walks the MML document to identify MML modules and instantiates a tree of classes based on the included modules. These classes are arranged to generate a corresponding XML document that includes customized code to handle the rendering context for the target mobile device, e.g., known bugs in software and hardware for the target remote device, and resource constraints such as display screen size, memory size, and communication link(s) provided by the carrier. The platform can subsequently fully render this XML document for display by the target remote device.
  • The rendering context can be arranged as a data structure that contains the various parameters and data that are employed by the platform to optimize the rendering of each page in the ML document for display with a particular remote device. The rendering context is generally provided to the platform by a separate application, platform, or process that can be managed by a content provider, carrier, and/or another third-party service. The rendering context for a remote device can include, but is not limited to, screen size, color capabilities, type of markup language, browser application, known bugs in a software or hardware version of the mobile device or network gateway, or the like. Also, the platform can store configuration data related to attributes of a variety of remote devices and network carriers, along with methods of storing and retrieving that configuration data. In at least one embodiment, the storage and retrieval of data and/or parameters associated with a rendering context for a target remote device is provided in an HTTP cookie.
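  • One plausible shape for such a data structure is sketched below in Java; the field names and the cookie encoding are assumptions chosen for illustration rather than the actual schema used by the platform.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative shape of a rendering context; the fields shown are assumptions,
// not the patent's actual schema.
public class RenderingContext {
    public final String deviceId;        // unique identifier for the device profile
    public final int screenWidth;        // effective display width in pixels
    public final int screenHeight;       // effective display height in pixels
    public final int colorDepth;         // bits per pixel; 1 for monochrome
    public final String markupLanguage;  // e.g. "XHTML-MP", "WML"
    public final String browser;         // client browser/user-agent family
    public final Map<String, String> knownBugs = new LinkedHashMap<>();

    public RenderingContext(String deviceId, int screenWidth, int screenHeight,
                            int colorDepth, String markupLanguage, String browser) {
        this.deviceId = deviceId;
        this.screenWidth = screenWidth;
        this.screenHeight = screenHeight;
        this.colorDepth = colorDepth;
        this.markupLanguage = markupLanguage;
        this.browser = browser;
    }

    // One possible way to carry a compact form of the context in an HTTP cookie value.
    public String toCookieValue() {
        return deviceId + "|" + screenWidth + "x" + screenHeight + "|" + colorDepth
                + "|" + markupLanguage + "|" + browser;
    }
}
```

  • A platform component could parse such a cookie value back into the same fields when a later request arrives from a known device, avoiding a fresh device-capability lookup.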
  • The rendering of the ML document can include a plurality of different processes, including: (1) determining if location information regarding the target remote device can be provided in response to a request; (2) minifying cascading style sheets (CSS) by removing extraneous sheets; (3) fontifying cascading style sheets by stripping them out and rewriting as ML code; (4) shrinking the ML document to remove any ML code that is extraneous to the target remote device; (5) specifying one or more attributes of each media item to be embedded or linked in a page; (6) employing an estimate of each page size to repaginate the ML document pages to a size that is no more than the effective display screen size of the target remote device; (7) tailoring/removing ML code in a page that is estimated to be rendered as larger than the effective display screen size of the target remote device; and/or (8) rewriting URLs in a page to include locations of content and/or session identification information.
  • The rendering of the pages of the ML document for the target remote device may further include rewriting links or URIs within the document. For example, if the ML document includes a link to an image in one format, the link may be modified to an alternate image in a different format, if the target remote device is unable to display the first format. A link may also be rewritten to include a parameter, such as a value to identify a continuing session, so that a new request using the link returns the session identifier.
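  • A minimal sketch of this kind of link rewriting follows; the "sid" query parameter and the PNG-to-GIF substitution are illustrative assumptions, not choices taken from the platform.

```java
// Hypothetical link-rewriting helpers for rendered pages.
public class LinkRewriter {
    // Append a session identifier so a follow-up request using the link carries the session.
    public static String withSession(String url, String sessionId) {
        String sep = url.contains("?") ? "&" : "?";
        return url + sep + "sid=" + sessionId;
    }

    // Point the link at an alternate image when the device cannot display the original format.
    public static String withSupportedFormat(String url, boolean supportsPng) {
        if (!supportsPng && url.endsWith(".png")) {
            return url.substring(0, url.length() - 4) + ".gif";
        }
        return url;
    }

    public static void main(String[] args) {
        System.out.println(withSession("http://example.com/page", "abc123"));
        System.out.println(withSupportedFormat("http://example.com/logo.png", false));
    }
}
```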
  • Additionally, during the rendering of the ML document, a temporary XML ID can be added to each identified portion of the ML document that doesn't have an existing XML ID. In the ML document, a temporary index can be built for each XML ID, which can be used by the plurality of processes to quickly find and render the corresponding elements (identified portions), e.g., a string(s), an image(s), and the like. Once the rendering is completed, the temporary XML IDs can be removed. The use of temporary XML IDs and a temporary index can reduce the likelihood that a particular process has to walk the entire ML document to perform its portion of the full rendering of the document for the target remote device.
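  • The idea can be sketched as a single pass over the DOM that assigns temporary IDs and records each element in a map, followed by a cleanup pass. The "tmp-" prefix, the "id" attribute name, and the decision to index every element are assumptions made for this example.

```java
import java.util.LinkedHashMap;
import java.util.Map;

import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// One possible reading of the temporary-ID scheme: walk the tree once, give every
// element of interest an ID, and index it so later stages can jump straight to it.
public class TempIndex {
    private final Map<String, Element> index = new LinkedHashMap<>();
    private int counter = 0;

    public Map<String, Element> build(Document doc) {
        NodeList all = doc.getElementsByTagName("*");
        for (int i = 0; i < all.getLength(); i++) {
            Element e = (Element) all.item(i);
            String id = e.getAttribute("id");
            if (id.isEmpty()) {
                id = "tmp-" + (counter++);        // temporary ID, removed after rendering
                e.setAttribute("id", id);
            }
            index.put(id, e);                     // later stages look elements up here
        }
        return index;
    }

    public void cleanup() {
        for (Map.Entry<String, Element> entry : index.entrySet()) {
            if (entry.getKey().startsWith("tmp-")) {
                entry.getValue().removeAttribute("id");  // strip only the temporary IDs
            }
        }
        index.clear();
    }
}
```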
  • Illustrative Operating Environment
  • FIG. 1 shows components of one embodiment of an environment in which the invention may be practiced. Not all the components may be required to practice the invention, and variations in the arrangement and type of the components may be made without departing from the spirit or scope of the invention. As shown, system 100 of FIG. 1 includes local area networks (“LANs”)/wide area networks (“WANs”)-(network) 105, wireless network 110, Content Rendering Platform (CRP) 106, mobile devices (client devices) 102-104, client device 101, and content provider 107.
  • One embodiment of mobile devices 102-104 is described in more detail below in conjunction with FIG. 2. Generally, however, mobile devices 102-104 may include virtually any portable computing device capable of receiving and sending a message over a network, such as network 105, wireless network 110, or the like. Mobile devices 102-104 may also be described generally as client devices that are configured to be portable. Thus, mobile devices 102-104 may include virtually any portable computing device capable of connecting to another computing device and receiving information. Such devices include portable devices such as cellular telephones, smart phones, display pagers, radio frequency (RF) devices, infrared (IR) devices, Personal Digital Assistants (PDAs), handheld computers, laptop computers, wearable computers, tablet computers, integrated devices combining one or more of the preceding devices, and the like. As such, mobile devices 102-104 typically range widely in terms of capabilities and features. For example, a cell phone may have a numeric keypad and a few lines of monochrome LCD display on which only text may be displayed. In another example, a web-enabled mobile device may have a touch sensitive screen, a stylus, and several lines of color LCD display in which both text and graphics may be displayed.
  • A web-enabled mobile device may include a browser application that is configured to receive and to send web pages, web-based messages, and the like. The browser application may be configured to receive and display graphics, text, multimedia, and the like, employing virtually any web-based language, including wireless application protocol (WAP) messages, and the like. In one embodiment, the browser application for the mobile device is enabled to employ Handheld Device Markup Language (HDML), Wireless Markup Language (WML), WMLScript, JavaScript, Standard Generalized Markup Language (SGML), HyperText Markup Language (HTML), eXtensible Markup Language (XML), and the like, to display content and communicate messages.
  • Mobile devices 102-104 also may include at least one other client application that is configured to receive content from another computing device. The client application may include a capability to provide and receive textual content, graphical content, audio content, and the like. The client application may further provide information that identifies itself, including a type, capability, name, and the like. In one embodiment, mobile devices 102-104 may uniquely identify themselves through any of a variety of mechanisms, including a phone number, Mobile Identification Number (MIN), an electronic serial number (ESN), or other mobile device identifier. The information may also indicate a content format that the mobile device is enabled to employ, mobile device manufacturer, model number, display colors, display size, enabled features, and wireless carrier. Such information may be provided in a message, or the like, sent to CRP 106, client device 101, or other computing devices.
  • Mobile devices 102-104 may also be configured to communicate a message, such as through Short Message Service (SMS), Multimedia Message Service (MMS), instant messaging (IM), internet relay chat (IRC), Mardam-Bey's IRC (mIRC), Jabber, and the like, between another computing device, such as CRP 106, client device 101, or the like. However, the present invention is not limited to these message protocols, and virtually any other message protocol may be employed.
  • Mobile devices 102-104 may be further configured to enable a user to participate in communications sessions, such as IM sessions. As such, mobile devices 102-104 may include a client application that is configured to manage various actions on behalf of the client device. For example, the client application may enable a user to interact with the browser application, email application, IM applications, SMS application, MMS application, and the like.
  • Mobile devices 102-104 may further be configured to include a client application that enables the end-user to log into an end-user account that may be managed by another computing device, such as Content Provider 107. Such end-user account, for example, may be configured to enable the end-user to receive emails, send/receive IM messages, SMS messages, access selected web pages, participate in a social networking activity, or the like. However, participation in various social networking activities may also be performed without logging into the end-user account. Additionally, mobile devices 102-104 may also communicate with non-mobile client devices, such as client device 101, or the like.
  • Client device 101 may include virtually any computing device capable of communicating over a network to send and receive information, including social networking information, or the like. The set of such devices may include devices that typically connect using a wired or wireless communications medium such as personal computers, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, or the like.
  • Wireless network 110 is configured to couple mobile devices 102-104 and their components with network 105. Wireless network 110 may include any of a variety of wireless sub-networks that may further overlay stand-alone ad-hoc networks, and the like, to provide an infrastructure-oriented connection for mobile devices 102-104. Such sub-networks may include mesh networks, Wireless LAN (WLAN) networks, cellular networks, and the like.
  • Wireless network 110 may further include an autonomous system of terminals, gateways, routers, and the like connected by wireless radio links, and the like. These connectors may be configured to move freely and randomly and organize themselves arbitrarily, such that the topology of wireless network 110 may change rapidly.
  • Wireless network 110 may further employ a plurality of access technologies including 2nd (2G), 3rd (3G), and 4th (4G) generation radio access for cellular systems, WLAN, WiMax, Wireless Router (WR) mesh, and the like. Access technologies such as 2G, 3G, 4G, and future wireless access networks may enable wide area coverage for mobile devices, such as mobile devices 102-104, with various degrees of mobility. For example, wireless network 110 may enable a radio connection through a radio network access such as Global System for Mobile communication (GSM), General Packet Radio Services (GPRS), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (WCDMA), Universal Mobile Telephone System (UMTS), and the like. In essence, wireless network 110 may include virtually any wireless communication mechanism by which information may travel between mobile devices 102-104 and another computing device, network, and the like.
  • Network 105 is configured to couple CRP 106 and its components with other computing devices, including, mobile devices 102-104, client device 101, and through wireless network 110 to mobile devices 102-104. Network 105 is enabled to employ any form of computer readable media for communicating information from one electronic device to another. Also, network 105 can include the Internet in addition to local area networks (LANs), wide area networks (WANs), direct connections, such as through a universal serial bus (USB) port, other forms of computer-readable media, or any combination thereof. On an interconnected set of LANs, including those based on differing architectures and protocols, a router acts as a link between LANs, enabling messages to be sent from one to another. Also, communication links within LANs typically include twisted wire pair or coaxial cable, while communication links between networks may utilize analog telephone lines, full or fractional dedicated digital lines including T1, T2, T3, and T4, Integrated Services Digital Networks (ISDNs), Digital Subscriber Lines (DSLs), wireless links including satellite links, or other communications links known to those skilled in the art. Furthermore, remote computers and other related electronic devices could be remotely connected to either LANs or WANs via a modem and temporary telephone link. In essence, network 105 includes any communication method by which information may travel between CRP 106, client device 101, and other computing devices.
  • Additionally, communication media typically embodies computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave, data signal, or other transport mechanism and includes any information delivery media. The terms “modulated data signal,” and “carrier-wave signal” includes a signal that has one or more of its characteristics set or changed in such a manner as to encode information, instructions, data, and the like, in the signal. By way of example, communication media includes wired media such as twisted pair, coaxial cable, fiber optics, wave guides, and other wired media and wireless media such as acoustic, RF, infrared, and other wireless media.
  • One embodiment of CRP 106 is described in more detail below in conjunction with FIG. 3. Briefly, however, CRP 106 may include any computing device capable of connecting to network 105 to enable a platform for language agnostic rendering of markup language templates and pages for subsequent display by a particular remote device, such as mobile devices 102-104 and client device 101. A rendering context for the particular remote device and a markup language document are received by the platform, which processes both to generate a fully rendered markup language document that is subsequently delivered to, and displayed by, that particular remote device. Devices that may operate as CRP 106 include personal computers, desktop computers, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, servers, and the like.
  • Although FIG. 1 illustrates CRP 106 as a single computing device, the invention is not so limited. For example, one or more functions of CRP 106 may be distributed across one or more distinct computing devices. For example, content rendering and the like, may be performed by a plurality of computing devices, without departing from the scope or spirit of the present invention.
  • Content provider 107 can also include a variety of services used to provide content to remote devices. Such services include, but are not limited to, web services, third-party services, audio services, video services, email services, IM services, SMS services, MMS services, VoIP services, video game services, gaming services, calendaring services, shopping services, photo services, or the like. Devices that may operate as content provider 107 include personal computers, desktop computers, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, servers, and the like.
  • Illustrative Mobile Device
  • FIG. 2 shows one embodiment of mobile device 200 that may be included in a system implementing the invention. Mobile device 200 may include many more or less components than those shown in FIG. 2. However, the components shown are sufficient to disclose an illustrative embodiment for practicing the present invention. Mobile device 200 may represent, for example, mobile devices 102-104 of FIG. 1.
  • As shown in the figure, mobile device 200 includes a processing unit (CPU) 222 in communication with a mass memory 230 via a bus 224. Mobile device 200 also includes a power supply 226, one or more network interfaces 250, an audio interface 252, a display 254, a keypad 256, an illuminator 258, an input/output interface 260, a haptic interface 262, and an optional global positioning systems (GPS) receiver 264. Power supply 226 provides power to mobile device 200. A rechargeable or non-rechargeable battery may be used to provide power. The power may also be provided by an external power source, such as an AC adapter or a powered docking cradle that supplements and/or recharges a battery.
  • Mobile device 200 may optionally communicate with a base station (not shown), or directly with another computing device. Network interface 250 includes circuitry for coupling mobile device 200 to one or more networks, and is constructed for use with one or more communication protocols and technologies including, but not limited to, global system for mobile communication (GSM), code division multiple access (CDMA), wideband CDMA (WCDMA), time division multiple access (TDMA), Universal Mobile Telephone Service (UMTS), user datagram protocol (UDP), transmission control protocol/Internet protocol (TCP/IP), SMS, general packet radio service (GPRS), WAP, ultra wide band (UWB), IEEE 802.16 Worldwide Interoperability for Microwave Access (WiMax), SIP/RTP, or any of a variety of other wireless communication protocols. Network interface 250 is sometimes known as a transceiver, transceiving device, or network interface card (NIC).
  • Audio interface 252 is arranged to produce and receive audio signals such as the sound of a human voice. For example, audio interface 252 may be coupled to a speaker and microphone (not shown) to enable telecommunication with others and/or generate an audio acknowledgement for some action. Display 254 may be a liquid crystal display (LCD), gas plasma, light emitting diode (LED), or any other type of display used with a computing device. Display 254 may also include a touch sensitive screen arranged to receive input from an object such as a stylus or a digit from a human hand.
  • Keypad 256 may comprise any input device arranged to receive input from a user. For example, keypad 256 may include a push button numeric dial, or a keyboard. Keypad 256 may also include command buttons that are associated with selecting and sending images. Illuminator 258 may provide a status indication and/or provide light. Illuminator 258 may remain active for specific periods of time or in response to events. For example, when illuminator 258 is active, it may backlight the buttons on keypad 256 and stay on while the client device is powered. Also, illuminator 258 may backlight these buttons in various patterns when particular actions are performed, such as dialing another client device. Illuminator 258 may also cause light sources positioned within a transparent or translucent case of the client device to illuminate in response to actions.
  • Mobile device 200 also comprises input/output interface 260 for communicating with external devices, such as a headset, or other input or output devices not shown in FIG. 2. Input/output interface 260 can utilize one or more communication technologies, such as USB, infrared, Bluetooth™, or the like. Haptic interface 262 is arranged to provide tactile feedback to a user of the client device. For example, the haptic interface may be employed to vibrate mobile device 200 in a particular way when another user of a computing device is calling.
  • Optional GPS transceiver 264 can determine the physical coordinates of mobile device 200 on the surface of the Earth, and typically outputs a location as latitude and longitude values. GPS transceiver 264 can also employ other geo-positioning mechanisms, including, but not limited to, triangulation, assisted GPS (AGPS), E-OTD, CI, SAI, ETA, BSS or the like, to further determine the physical location of mobile device 200 on the surface of the Earth. It is understood that under different conditions, GPS transceiver 264 can determine a physical location within millimeters for mobile device 200; and in other cases, the determined physical location may be less precise, such as within a meter or significantly greater distances. In one embodiment, however, mobile device 200 may, through other components, provide other information that may be employed to determine a physical location of the device, including, for example, a MAC address, IP address, or the like.
  • Mass memory 230 includes a RAM 232, a ROM 234, and other storage means. Mass memory 230 illustrates another example of computer storage media for storage of information such as computer readable instructions, data structures, program modules or other data. Mass memory 230 stores a basic input/output system (“BIOS”) 240 for controlling low-level operation of mobile device 200. The mass memory also stores an operating system 241 for controlling the operation of mobile device 200. It will be appreciated that this component may include a general purpose operating system such as a version of UNIX, or LINUX™, or a specialized client communication operating system such as Windows Mobile™ or the Symbian® operating system. The operating system may include, or interface with, a Java virtual machine module that enables control of hardware components and/or operating system operations via Java application programs.
  • Memory 230 further includes one or more data storage 244, which can be utilized by mobile device 200 to store, among other things, applications 242 and/or other data. For example, data storage 244 may also be employed to store information that describes various capabilities of mobile device 200. The information may then be provided to another device based on any of a variety of events, including being sent as part of a header during a communication, sent upon request, or the like.
  • Applications 242 may include computer executable instructions which, when executed by mobile device 200, transmit, receive, and/or otherwise process messages (e.g., SMS, MMS, IM, email, and/or other messages), audio, video, and enable telecommunication with another user of another client device. Other examples of application programs include calendars, browsers, email clients, IM applications, SMS applications, VoIP applications, contact managers, task managers, transcoders, database programs, word processing programs, security applications, spreadsheet programs, video games, gaming programs, search programs, shopping cart programs, and so forth. Applications 242 may further include browser 245.
  • Browser 245 may be configured to receive and enable a display of rendered content provided by CRP 106 from content provider 107. Further, browser 245 enables the user of mobile device 200 to select different actions displayed by the rendered content. In at least one embodiment, browser 245 enables the user to select one or more of a product to purchase, search for content and display the result, call a mobile telephonic device, display and respond to messages, or the like. Various embodiments for rendering the content for display on the mobile device are described in more detail below.
  • Illustrative Network Device
  • FIG. 3 shows one embodiment of a network device, according to one embodiment of the invention. Network device 300 may include many more components than those shown. The components shown, however, are sufficient to disclose an illustrative embodiment for practicing the invention. Network device 300 may represent, for example, CRP 106, Client device 101, and/or Content provider 107 of FIG. 1.
  • Network device 300 includes processing unit 312, video display adapter 314, and a mass memory, all in communication with each other via bus 322. The mass memory generally includes RAM 316, ROM 332, and one or more permanent mass storage devices, such as hard disk drive 328, tape drive, optical drive, and/or floppy disk drive. The mass memory stores operating system 320 for controlling the operation of network device 300. Any general-purpose operating system may be employed. Basic input/output system (“BIOS”) 318 is also provided for controlling the low-level operation of network device 300. As illustrated in FIG. 3, network device 300 also can communicate with the Internet, or some other communications network, via network interface unit 310, which is constructed for use with various communication protocols including the TCP/IP protocol. Network interface unit 310 is sometimes known as a transceiver, transceiving device, or network interface card (NIC).
  • The mass memory as described above illustrates another type of computer-readable media, namely computer storage media. Computer storage media may include volatile, nonvolatile, removable, and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computing device.
  • The mass memory also stores program code and data. One or more applications 350 are loaded into mass memory and run on operating system 320. Examples of application programs may include transcoders, schedulers, calendars, database programs, word processing programs, HTTP programs, customizable user interface programs, IPSec applications, encryption programs, security programs, VPN programs, SMS message servers, IM message servers, email servers, account management and so forth. Content rendering platform (CRP) 354 may also be included as an application program within applications 350.
  • CRP 354 is configurable as a platform and/or a server that receives rendering context information and a markup language document from one or more resources. CRP 354 automatically renders and tailors the markup language document, based at least in part on the rendering context, in a manner suited for subsequent display and/or interaction on a target remote device.
  • Illustrative Platforms
  • FIG. 4 illustrates an overview of one embodiment of platform 400 for employing a composite Markup Language (ML) document and a Rendering Context to fully render the document for subsequent display on a target remote device. Composite ML document 406 can be decomposed into more ML code. Also, Rendering Context 404 can include a unique identifier for identifying the particular combination of data and parameters for a target remote device.
  • From one or more other services, platforms, and/or applications, main component 402 receives both composite ML document 406 and rendering context 404. Main component 402 validates the composite ML code and checks the data, parameters and unique identifier included with rendering context 404. If an error is detected, main component 402 provides a notification of such an error, which can be provided to the target remote device and/or the provider of the composite ML document to the main component.
  • In at least one embodiment, the composite ML document is provided by a content provider, other platform, and/or application to platform 400. Similarly, the rendering context is separately determined either in real time and/or out of band from one or more services, platforms, applications, and/or sources, including the manufacturer of the remote device, header information in a message from the remote device, information from a gateway for a carrier that is in communication with the remote device, known bugs in software and/or hardware for the mobile device.
  • Cache 412 is arranged to store a document object model tree for code included in the composite ML document. Also, if a document object model tree isn't initially present in cache 412, ML parser 410 parses the document to create composite ML document object model tree 414, which is subsequently stored in the cache. In any case, the cached tree is passed to markup expander 418, which generates a document object model tree of native ML tags. Resource manager 434 employs component 436 to enable substitution for named resources and component 438 to expand composite tags from a library.
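  • The cache-or-parse behavior attributed to Cache 412 and ML parser 410 can be sketched roughly as follows; keying the cache on the raw document text, and the class and method names, are assumptions for illustration.

```java
import java.io.StringReader;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;
import org.xml.sax.InputSource;

// Rough sketch: return a previously parsed tree when one exists, otherwise parse
// the composite ML document and remember the result for the next request.
public class TreeCache {
    private final Map<String, Document> cache = new ConcurrentHashMap<>();

    public Document getTree(String compositeMl) throws Exception {
        Document cached = cache.get(compositeMl);
        if (cached != null) {
            return cached;                          // reuse the previously parsed tree
        }
        DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document parsed = builder.parse(new InputSource(new StringReader(compositeMl)));
        cache.put(compositeMl, parsed);             // store for subsequent requests
        return parsed;
    }
}
```

  • In practice a cached tree would likely be copied before later stages mutate it, since DOM documents are mutable; the sketch omits that detail for brevity.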
  • Markup expander 418 provides Native ML tree 424 to Conditioner 426 where pre-resolution processes are performed, e.g., removal of whitespace and pagination for subsequent display by the target remote device. The conditioned native ML tree is provided to Markup Resolver 428 where Component 430 executes the rendering of templates and component 432 renders tags that are included in a cache. Tag templates are provided by Component 440 which is associated with Resource Manager 434. Markup Resolver 428 provides Main component 402 with a fully rendered version of the initially provided composite ML document. Main component 402 subsequently provides the fully rendered markup Document 416 for delivery to the target remote device.
  • The rendering can include a plurality of different processes that contribute to the full rendering of ML document, including: (1) determining if location information regarding the target remote device can be provided in response to a request; (2) minifying cascading style sheets (CSS) by removing extraneous sheets; (3) fontifying cascading style sheets by stripping them out and rewriting as ML code; (4) shrinking the ML document to remove any ML code that is extraneous to the target remote device; (5) specifying one or more attributes of each media item to be embedded or linked in a page; (6) employing an estimate of each page size to repaginate the ML document pages to a size that is no more than the effective display screen size of the target remote device; (7) tailoring/removing ML code in a page that is estimated to be rendered as larger than the effective display screen size of the target remote device; and (8) rewriting URLs to include locations of media items and/or session identification information.
  • The pagination process is generally performed in conjunction with the combined operation of Markup Resolver 428 and Conditioner 426. Rendering Context 404 is queried by Conditioner 426 for the effective/maximum page size for the target remote device for each page. A general estimate of the size of each page is determined by analyzing the nodes included in each page. The estimate is compared to the effective page size to determine if there is an offset of page data that must be either repaginated to another new page or tailored/cut from an existing page. Conditioner 426 also walks the native ML tree to identify each breakable node onto which the offset falls. The container following the breakable node is identified by Conditioner 426 with a tag as the break-off point for subsequent markup resolution by Markup Resolver 428.
  • Further, the pagination process is completed by Markup Resolver 428, which renders nodes in a page that are tagged by Conditioner 426 for rendering, and not others. However, each time a node is rendered, the size of the rendered node is noted in a running total size for that particular page. After Markup Resolver 428 renders the container that is tagged for splitting, the resolver checks whether or not the split occurs within another tag. If the split occurs on a tag, then the split is set to follow a previous tag or data. However, if the split doesn't occur on a tag, then a breaking method is performed to find an offset such that the split occurs over whitespace or within an overly long word, e.g., one that is 100 characters long, a threshold that can be adjusted by Resource Manager 434. Additionally, after inserting the split in the page, Markup Resolver 428 can render an end-of-page tag that includes displayable links to the next and previous pages of the split page as necessary.
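  • The running-total-and-split idea can be illustrated on plain text, as below; the real platform operates on the markup tree and takes the page size from the Rendering Context, so maxPageChars and the 100-character threshold here are stand-ins.

```java
import java.util.ArrayList;
import java.util.List;

// Simplified illustration of pagination with a running size total: fill each page up to
// a limit, prefer to break on whitespace, and hard-break only overly long unbroken runs.
public class Paginator {
    public static List<String> paginate(String text, int maxPageChars) {
        List<String> pages = new ArrayList<>();
        int start = 0;
        while (start < text.length()) {
            int end = Math.min(start + maxPageChars, text.length());
            if (end < text.length()) {
                int ws = text.lastIndexOf(' ', end);
                // Break on whitespace near the limit unless that would mean the nearest
                // whitespace is more than ~100 characters back (an overly long word).
                if (ws > start && end - ws < 100) {
                    end = ws;
                }
            }
            pages.add(text.substring(start, end).trim());
            start = end;
        }
        return pages;
    }

    public static void main(String[] args) {
        for (String page : paginate("some long body of rendered text ... ".repeat(20), 120)) {
            System.out.println("[page] " + page);
        }
    }
}
```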
  • Also, although not shown, the platform can facilitate a post-rendering component to embed a media item in the fully rendered markup document for subsequent display/playback with the target remote device. The media item can include an image, audio file, sound, graphic, video, animation, or the like. Additionally, localization of the text and any other element for the fully rendered document can be performed prior to providing the document to the target remote device.
  • FIG. 5A illustrates overview 500 of one embodiment of platform 502 for pipelined full rendering of Markup Language (ML) document 504 based at least in part on corresponding Rendering Context 506 for subsequent display on a target remote device. Rendering Context 506 and ML document 504 can be pre-processed by master document assembler 508 where informalities are handled, e.g., units can be added, blanks filled in, and/or footers added. Master document assembler 508 provides (preprocessed) ML document 504 and Rendering Context 506 to platform 502, where pipeline builder component 512 walks the document to determine and identify the different stages of processing that contribute to fully rendering the ML document, based also at least in part on the rendering context. Pipeline builder component 512 employs pipeline stage library 514 to generate pipeline stages component 516, which is arranged to include a stage for each identified process that contributes to the full rendering of ML document 504. Further, main component 510 generates a document object model (DOM) tree that is based on ML document 504 and Rendering Context 506. Both the DOM tree and Rendering Context 506 are provided to at least a portion of the stages in pipeline stages component 516.
  • The stages can enable a plurality of different processes that contribute to the full rendering of ML document 504, including: (1) determining if location information regarding the target remote device can be provided in response to a request; (2) minifying cascading style sheets (CSS) by removing extraneous sheets; (3) fontifying cascading style sheets by stripping them out and rewriting as ML code; (4) shrinking the ML document to remove any ML code that is extraneous to the target remote device; (5) specifying one or more attributes of each media item to be embedded or linked in a page; (6) employing an estimate of each page size to repaginate the ML document pages to a size that is no more than the effective display screen size of the target remote device; (7) tailoring/removing ML code in a page that is estimated to be rendered as larger than the effective display screen size of the target remote device; and (8) rewriting URLs to include locations of media items and/or session identification information.
  • Additionally, Temp Index Add and Temp Index Remove stages (not shown) can optionally be included in pipeline stage library 514 and included as processing stages in pipeline stages component 516. The Temp Index Add stage can be arranged to walk ML document 504 and identify portions in the document that don't already have an XML ID tag, and subsequently provide a Temp ID tag to these identified portions. Also, this stage can build a Temporary Index for all of the temporary and existing XML ID tags. Other pipelined stages in component 516 can subsequently use the Temporary Index to more quickly access just those elements in the ML document that are to be rendered by a particular stage. Once the rendering is completed by the other pipelined stages, the Temp Index Remove stage removes the Temp XML IDs and Temporary Index. The use of Temp XML IDs and a Temporary Index can reduce the likelihood that a particular process has to walk the entire ML document to perform its portion of the full rendering of the document for the target remote device.
  • Serializer 518 receives the fully rendered DOM tree for the target remote device and converts it into a stream of character bytes that are suitable for transmission over a communication link with the device. Additionally, a separate component can be arranged to localize the character bytes for a particular language, and another component can be arranged to embed a media item in the character stream for the target remote device.
  • FIG. 5B illustrates an overview of one embodiment of a plurality of exemplary stages that can be included in pipelined stages component 516 as shown in FIG. 5A. Component 516 is shown to include the following: Temp Index Add stage 530, Location Request stage 532, Minify stage 534, Paginate stage 536, Fontify stage 538, Media Item stage 540, and Temp Index Remove stage 542. The stages included in component 516 correspond to a particular target remote device; some, but not all, of the same stages, plus possibly other stages, might be included in component 516 if full rendering is to be performed for a different target remote device.
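  • A hypothetical shape for such a stage pipeline is sketched below; the Stage interface, the builder-style add method, and the placeholder RenderingContext are assumptions made for this example, and the stage names in the comments simply echo FIG. 5B.

```java
import java.util.ArrayList;
import java.util.List;

import org.w3c.dom.Document;

// Illustrative pipeline: each stage transforms the DOM in place, guided by the
// rendering context, and stages are applied in the order the builder added them.
public class RenderingPipeline {
    public interface Stage {
        void apply(Document doc, RenderingContext context);
    }

    private final List<Stage> stages = new ArrayList<>();

    public RenderingPipeline add(Stage stage) {
        stages.add(stage);
        return this;
    }

    public void run(Document doc, RenderingContext context) {
        for (Stage stage : stages) {
            stage.apply(doc, context);   // e.g. TempIndexAdd, Minify, Paginate, Fontify, TempIndexRemove
        }
    }

    // Placeholder for the rendering-context structure described elsewhere in this document.
    public static class RenderingContext { }

    public static void main(String[] args) throws Exception {
        Document doc = javax.xml.parsers.DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();
        new RenderingPipeline()
                .add((d, c) -> { /* a TempIndexAdd-style stage would go here */ })
                .add((d, c) -> { /* a Paginate-style stage would go here */ })
                .run(doc, new RenderingContext());
    }
}
```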
  • Illustrative Mobile Markup Language
  • FIG. 6 illustrates overview 600 of a class tree that can be instantiated from modules included in a document with modules that are created in a mobile Markup Language (MML). MML is relatively independent of considerations for the target remote device and its classes can be instantiated by at least the different embodiments of a platform that are discussed above. The main component of a platform can automatically parse the MML document to identify each MML module and subsequently instantiate a tree of classes based on the included MML modules. These classes are arranged to generate a corresponding document that includes native XML for a target mobile device. The platform can subsequently parse this corresponding native XML document to fully render the intent of the programmer for display by the target remote device. In another embodiment, the classes are arranged to generate virtually any type of Markup Language document that can include native code for rendering a display of pages on the target remote device.
  • The rendering can include a plurality of different processes that contribute to the full rendering of MML document, including: (1) determining if location information regarding the target remote device can be provided in response to a request; (2) minifying cascading style sheets (CSS) by removing extraneous sheets; (3) fontifying cascading style sheets by stripping them out and rewriting as ML code; (4) shrinking the ML document to remove any ML code that is extraneous to the target remote device; (5) specifying one or more attributes of each media item to be embedded or linked in a page; (6) employing an estimate of each page size to repaginate the ML document pages to a size that is no more than the effective display screen size of the target remote device; (7) tailoring/removing ML code in a page that is estimated to be rendered as larger than the effective display screen size of the target remote device; and (8) rewriting URLs to include locations of media items and/or session identification information.
  • Additionally, the MML document can include at least two categories of data structures, e.g., structure and module tags. The structure category can include content tags, form tags, mode tags, and organization tags, and the like. Also, the module category can include generic module tags, global module tags, and property module tags. Additionally, some of the module tags can be arranged for tailoring the rendering of the MML document.
  • At least a portion of the MML tags is listed below.
  • Structure Tags
  • Content Tags - <text>
  • Type Tags - <media>; <mi> (media item)
  • Form Tags - <input>, <submit>, <hidden>, <label>, <value>
  • Mode Tags - <a>, <heading>, <subheading>
  • Organization Tags - <p>, <item>, <array>, <join>, <yml>
  • Module Tags
  • Generic Modules - <module.container>, <module.grid>, <module.pane>, <module.list>, <module.plain>
    styling (Attribute)
  • Global Modules - <yahoo.header>, <yahoo.footernav>, <yahoo.footerlegal>
  • Property Module - <yahoo.sch.result>
  • Although the names of the tags in the two categories tend to be descriptive of their functionality (which somewhat parallels similar sounding tags in other markup languages), the MML tags are primarily used to capture the intent of the programmer, not the styling necessary to actually render pages for display on a target remote device. For example, the generic module tag <module.list> is handled as a request by the platform to render a list for display on a target mobile device. However, since the <module.list> tag doesn't specify substantive details necessary to actually perform the rendering, the platform handles these details by using data and parameters included in the Rendering Context, e.g., portrait or landscape orientation of the list, width and height of the effective display screen, and the pagination size for displaying pages. The rendering of the intent of the MML tags is handled by at least one of the platforms discussed above in a manner substantially similar to other tags provided in other types of markup language documents.
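  • The following sketch suggests how a generic module such as <module.list> might capture intent while leaving layout decisions to values taken from the Rendering Context; the ListModule class, its parameters, and the rough characters-per-line estimate are all assumptions for this example.

```java
import java.util.List;

// Rough sketch: a "list" module only records the intent to show a list; how many
// items appear per page and how long each entry may be comes from the context.
public class ListModule {
    public String render(List<String> items, int screenWidth, int itemsPerPage) {
        StringBuilder out = new StringBuilder("<list>");
        int shown = Math.min(items.size(), itemsPerPage);   // pagination size from the context
        for (int i = 0; i < shown; i++) {
            String item = items.get(i);
            // Crude tailoring to the effective screen width (assumed ~8 pixels per character).
            int maxChars = screenWidth / 8;
            if (item.length() > maxChars) {
                item = item.substring(0, maxChars) + "...";
            }
            out.append("<item>").append(item).append("</item>");
        }
        return out.append("</list>").toString();
    }

    public static void main(String[] args) {
        System.out.println(new ListModule().render(
                List.of("first result", "second result with a rather long title", "third result"),
                96, 2));
    }
}
```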
  • Generalized Operation of Platforms
  • The methods and processes for certain aspects of the invention will now be described with respect to FIGS. 7-11B.
  • FIG. 7 illustrates overview 700 of a process generally employed by any of the embodiments of the inventive platforms to render a Markup Language (ML) document for display by a target remote device. Moving from a start block, the process steps to block 702 where a Rendering Context, as defined elsewhere in the application, for a particular target remote device is received. At block 704, a Markup Language document is also received for rendering for the target remote device. The ML document and the Rendering Context are generated and subsequently provided by one or more of the same or different resources, including, but not limited to, content providers, carriers, web services, affiliates, users, and websites. Additionally, the Markup Language document can be coded in virtually any standard or non-standard Markup Language.
  • Flowing to block 706, the data and parameters included in the Rendering Context for the target remote device are employed by one or more processes to fully render the ML document for display by the remote device. Moving to block 708, the process can optionally provide a media item that can be either referenced by or embedded in the fully rendered ML document. At block 710, the fully rendered ML document plus any optionally included media items are made available for subsequent display by the target remote device. In one embodiment, the carrier provides the fully rendered ML document as a character string over a communication link to the target remote device. Next, the process returns to performing other actions.
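  • A minimal sketch of this overall flow, assuming the ML document arrives as a markup string and the Rendering Context as a dictionary of device data and parameters, could look like the following; render_for_device and its helper are illustrative names only, and the transformation step is left as a placeholder.

```python
from typing import Optional

def apply_rendering_context(ml_document: str, rendering_context: dict) -> str:
    """Placeholder for the device-specific rendering processes (block 706)."""
    return ml_document  # a real platform would transform the markup here

def render_for_device(ml_document: str, rendering_context: dict,
                      media_item: Optional[bytes] = None) -> dict:
    """Render an ML document for one target remote device (FIG. 7).

    The fully rendered result is a character string suitable for delivery
    by the carrier (block 710), optionally accompanied by a media item
    (block 708).
    """
    response = {"document": apply_rendering_context(ml_document, rendering_context)}
    if media_item is not None:
        response["media"] = media_item
    return response
```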
  • FIG. 8 illustrates overview 800 of the process that can be performed by at least the platform illustrated in FIG. 4. Moving from a start block, the process steps to block 802 where a Markup Language (ML) document and Rendering Context are received from another resource, and the ML document is to be rendered for a target remote device. Moving to block 804, the process checks the validity of the data and parameters included in the ML document and the Rendering Context. At block 806, a document object model (DOM) tree is generated. At block 808, the Rendering Context enables the DOM tree to be populated with ML code that is native to the target remote device.
  • Stepping to block 810, the process determines pagination for the rendered pages of the ML document. The process paginates these pages so that they are no greater in size than the effective size of the display screen for the target remote device. In some cases, next and previous links are created for newly paginated pages. The process can also remove relatively unnecessary whitespace in the ML document. At block 812, the process parses the populated DOM tree to resolve the native ML code into a fully rendered ML document. Advancing to block 814, one or more media items can optionally be either referenced by or embedded in the fully rendered ML document. At block 816, the fully rendered ML document plus any optionally included media items are made available for subsequent display by the target remote device. In one embodiment, the carrier provides the fully rendered ML document as a character string over a communication link to the target remote device. Next, the process returns to performing other actions.
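  • The pagination step of blocks 810-812 might be approximated as below, assuming a page is measured by its serialized size against an effective display capacity taken from the Rendering Context; the <page> element, the previous/next link markup, and the byte-based measure are assumptions made for this sketch.

```python
import xml.etree.ElementTree as ET

def paginate(dom_root: ET.Element, effective_page_bytes: int) -> list:
    """Split a populated DOM tree into pages whose serialized size is no
    greater than the effective display capacity of the target device."""
    pages, current = [], ET.Element("page")
    for child in list(dom_root):
        grown = len(ET.tostring(current)) + len(ET.tostring(child))
        if len(current) and grown > effective_page_bytes:
            pages.append(current)                 # close the full page
            current = ET.Element("page")
        current.append(child)
    pages.append(current)
    for index, page in enumerate(pages):          # previous/next links
        if index > 0:
            ET.SubElement(page, "a", {"href": f"#page{index - 1}"}).text = "previous"
        if index < len(pages) - 1:
            ET.SubElement(page, "a", {"href": f"#page{index + 1}"}).text = "next"
    return pages
```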
  • FIG. 9 illustrates overview 900 of a process for employing at least one of the disclosed embodiments of a platform to fully render a document written in a mobile Markup Language (MML) that is arranged to convey a programmer's intent, rather than the particular details associated with the actual rendering of the document for subsequent display by a target remote device. Moving from a start block, the process flows to block 902 where the MML document and a Rendering Context for the target mobile device are received from one or more of the same, or different, resources. At block 904, the MML document is parsed to identify each module in the document, and each identified module is subsequently instantiated. Flowing to block 906, the process generates a tree of ML code from the instantiated modules, where the generated ML code is also native to the target remote device. In at least one embodiment, the ML code in the tree is XML code.
  • Advancing to block 908, the tree is parsed to resolve the ML code and paginate pages for subsequent display by the target remote device. At block 910, the process generates the fully rendered ML document for display by the target remote device. Moving to block 912, the process can optionally provide a media item that can be either referenced by or embedded in the fully rendered ML document. At block 914, the fully rendered ML document plus any optionally included media items are made available for subsequent display by the target remote device. In one embodiment, the carrier provides the fully rendered ML document as a character string over a communication link to the target remote device. Next, the process returns to performing other actions.
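  • The module identification and instantiation of blocks 904-906 can be pictured with a small registry that maps module tag names to classes able to emit native XML, as in the sketch below; the Module and ListModule classes, the registry, and the <select>/<option> output are all invented for this illustration.

```python
import xml.etree.ElementTree as ET

class Module:
    """Base class for an instantiated MML module."""
    def __init__(self, element: ET.Element, ctx: dict):
        self.element, self.ctx = element, ctx

    def to_native_xml(self) -> ET.Element:
        raise NotImplementedError

class ListModule(Module):
    def to_native_xml(self) -> ET.Element:
        native = ET.Element("select")             # assumed native list markup
        for item in self.element:
            ET.SubElement(native, "option").text = item.text
        return native

MODULE_REGISTRY = {"module.list": ListModule}     # hypothetical registry

def instantiate_modules(mml_root: ET.Element, ctx: dict) -> list:
    """Identify each module in the MML document and instantiate its class."""
    return [MODULE_REGISTRY[el.tag](el, ctx)
            for el in mml_root.iter() if el.tag in MODULE_REGISTRY]

def generate_native_tree(modules: list) -> ET.Element:
    """Assemble a tree of native ML code from the instantiated modules."""
    root = ET.Element("document")
    for module in modules:
        root.append(module.to_native_xml())
    return root
```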
  • FIG. 10 illustrates overview 1000 of a process for employing a platform to fully render a Markup Language (ML) document with a platform that provides pipelined stages for rendering different portions of the ML document. Moving from a start block, the process flows to block 1002 where the ML document and a Rendering Context for the target mobile device are received from one or more of the same, or different, resources. At block 1004, the ML document is assembled for processing. For example, common headers and footers, units, and other incidental elements can be added to the ML document. Next, at block 1006, the process parses the ML document and reviews the data and parameters of the Rendering Context to build an instance of pipelined stages that can enable the full rendering of the elements included in the ML document. The process also converts the assembled ML document into a document object model (DOM) document.
  • At block 1008, the process performs pipelined stage rendering and pagination of the elements included in the DOM document based at least in part on the Rendering Context for the target remote device. Flowing to block 1010, the process serializes the rendered DOM document into a character string, which can optionally be localized to a particular language that corresponds to the target remote device.
  • Moving to block 1012, the process can optionally provide a media item that can be either referenced by or embedded in the fully rendered character string. At block 1014, the fully rendered character string plus any optionally included media items are made available for subsequent display by the target remote device. In one embodiment, the carrier provides the fully rendered ML document as a character string over a communication link to the target remote device. Next, the process returns to performing other actions.
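  • One way to picture the pipelined stages of blocks 1006-1010 is as a list of callables chosen from the ML document and the Rendering Context and then applied in order before serialization, as sketched below; the individual stage names are placeholders rather than the stages actually built by the platform.

```python
import xml.etree.ElementTree as ET
from typing import Callable, List

Stage = Callable[[ET.Element, dict], ET.Element]

# Placeholder stages; a real platform would perform device-specific work here.
def strip_unsupported_markup(dom: ET.Element, ctx: dict) -> ET.Element: return dom
def repaginate(dom: ET.Element, ctx: dict) -> ET.Element: return dom
def localize(dom: ET.Element, ctx: dict) -> ET.Element: return dom

def build_pipeline(rendering_context: dict) -> List[Stage]:
    """Choose the stages needed for this document and target device."""
    stages: List[Stage] = [strip_unsupported_markup, repaginate]
    if rendering_context.get("language"):         # optional localization stage
        stages.append(localize)
    return stages

def run_pipeline(dom: ET.Element, rendering_context: dict) -> str:
    """Render stage by stage, then serialize the DOM to a character string."""
    for stage in build_pipeline(rendering_context):
        dom = stage(dom, rendering_context)
    return ET.tostring(dom, encoding="unicode")
```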
  • Illustrative Temporary Indexing
  • FIG. 11A illustrates process 1100 for a platform that temporarily creates and indexes temporary IDs for identified portions in a received XML document to speed up the pipeline stage rendering of the document's pages for subsequent display by a target remote device.
  • Moving from a start block, the process flows to block 1102 where a received XML document is parsed to identify portions for subsequent processing. A Rendering Context for a target remote device is also received. At decision block 1106, if an XML ID exists for an identified portion of the document, then the process moves to block 1108. However, if the identified portion of the document doesn't correspond to an existing XML ID, then a temporary XML ID is added to the identified portion at block 1104. In any case, after an identified portion of the document corresponds to either an existing or temporary XML ID, the process flows to block 1108 where a temporary index is generated for the XML document that lists the location of identified portions in the XML document based at least in part on the existing or temporary XML ID.
  • At block 1110, the temporary index and the Rendering Context are employed to estimate the page size of the XML document for display on the target remote device, and subsequently tailor and/or repaginate pages with oversized estimates. Also, session IDs can be added and URLs rewritten as necessary. Advancing to block 1112, the process employs the temporary index and Rendering Context to perform pipelined stage rendering on identified portions until the entire XML document is fully rendered.
  • At block 1114, the temporary index and the temporary XML IDs are removed from the XML document. Moving to block 1116, the process can optionally provide a media item that can be either referenced by or embedded in the fully rendered XML document. At block 1118, the fully rendered XML document plus any optionally included media items are made available for subsequent display by the target remote device. In one embodiment, the carrier provides the fully rendered ML document as a character string over a communication link to the target remote device. Next, the process returns to performing other actions.
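  • A simplified rendition of the temporary indexing idea follows, assuming the identified portions are XML elements and existing XML IDs appear as an ordinary id attribute; the tmp- prefix and the dictionary index are arbitrary choices made for this sketch.

```python
import xml.etree.ElementTree as ET
from itertools import count

TEMP_PREFIX = "tmp-"                     # arbitrary marker for temporary IDs

def add_temp_ids_and_index(root: ET.Element) -> dict:
    """Ensure every identified portion carries an ID and build a temporary
    index from ID to element for fast lookup during pipelined rendering."""
    counter, index = count(), {}
    for element in root.iter():
        element_id = element.get("id")
        if element_id is None:                        # no existing XML ID
            element_id = f"{TEMP_PREFIX}{next(counter)}"
            element.set("id", element_id)             # temporary XML ID
        index[element_id] = element
    return index

def remove_temp_ids(root: ET.Element) -> None:
    """Strip the temporary IDs (and only those) once rendering is complete."""
    for element in root.iter():
        if element.get("id", "").startswith(TEMP_PREFIX):
            del element.attrib["id"]
```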
  • FIG. 11B illustrates process 1120 for a platform that temporarily creates and indexes temporary IDs for identified portions in a received Markup Language (ML) document to speed up the rendering of the document's pages for subsequent display by a target remote device. The ML document can be written in virtually any standard or non-standard Markup language.
  • Moving from a start block, the process flows to block 1122 where a received Markup Language (ML) document is parsed to identify portions of the document. A Rendering Context for a target remote device is also received. At decision block 1124, if an ML ID exists for an identified portion, then the process moves to block 1128. However, if the identified portion doesn't correspond to an existing ML ID, then a temporary ML ID is added to the identified portion at block 1126. In any case, after each identified portion of the document corresponds to either an existing or temporary ML ID, the process flows to block 1128 where a temporary index is generated for the ML document that lists the location of the identified portions in the ML document based at least in part on the existing or temporary ML ID.
  • At block 1130, the temporary index and the Rendering Context are employed to estimate the page size of the ML document for display on the target remote device, and subsequently tailor and/or repaginate pages with oversized estimates. Also, session IDs can be added and URLs rewritten as needed. Advancing to block 1132, the process employs the temporary index and Rendering Context to perform rendering of the identified portions until the entire ML document is fully rendered.
  • At block 1134, the temporary index and the temporary ML IDs are removed from the rendered ML document. Moving to block 1136, the process can optionally provide a media item that can be either referenced by or embedded in the fully rendered ML document. At block 1138, the fully rendered ML document plus any optionally included media items are made available for subsequent display by the target remote device. In one embodiment, the carrier provides the fully rendered ML document as a character string over a communication link to the target remote device. Next, the process returns to performing other actions.
  • It will be understood that each block of the above flowchart illustrations, and combinations of blocks in the flowchart illustrations, can be implemented by computer program instructions. These program instructions may be provided to a processor to produce a machine, such that the instructions, which execute on the processor, create means for implementing the actions specified in the flowchart block or blocks. The computer program instructions may be executed by a processor to cause a series of operational steps to be performed by the processor to produce a computer-implemented process such that the instructions executing on the processor provide steps for implementing the actions listed in the flowcharts discussed above.
  • Accordingly, blocks of the flowchart illustrations support combinations of means for performing the specified actions, combinations of steps for performing the specified actions and program instruction means for performing the specified actions. It will also be understood that each block of the flowchart illustration, and combinations of blocks in the flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified actions or steps, or combinations of special purpose hardware and computer instructions.
  • In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made to the specific exemplary embodiments without departing from the broader spirit and scope of the invention as set forth in the appended claims. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.

Claims (22)

1. A method for generating a markup language (ML) document that is remotely rendered for display by a mobile device, comprising:
editing the ML document to provide content;
parsing the ML document to identify included modules;
instantiating a tree of classes based on the identified modules;
employing the tree of classes to generate an extensible markup language (XML) document that corresponds to content in the ML document, wherein the XML document further includes code that is customizable for a rendering context of the mobile device; and
enabling the remote rendering of the XML document for subsequent display of its content by the mobile device.
2. The method of claim 1, wherein the ML document is editable independent of the rendering context for the mobile device.
3. The method of claim 1, wherein the rendering context provides information regarding at least one of a software constraint, display screen size, a memory size, a type of processor, or a type of communication link.
4. The method of claim 1, further comprising employing the rendering context to enable pagination of the rendered XML document, wherein a size of each page in the rendered XML document is paginated to be no greater than an effective size for display by the mobile device.
5. The method of claim 1, further comprising remotely communicating the XML document to the mobile device that is fully rendered.
6. The method of claim 1, wherein the included modules are independent of the rendering context of the mobile device.
7. A machine readable storage medium having machine executable data stored thereon, which when executed causes actions for generating a markup language (ML) document that is remotely rendered for display by a mobile device, comprising:
editing content for the ML document;
parsing the ML document to identify included modules;
instantiating a tree of classes based on the identified modules;
employing the tree of classes to generate an extensible markup language (XML) document that corresponds to content in the ML document, wherein the XML document further includes code that is customizable for a rendering context of the mobile device; and
enabling the remote rendering of the XML document for subsequent display of its content by the mobile device.
8. The machine readable storage medium of claim 7, wherein the ML document is editable independent of the rendering context for the mobile device.
9. The machine readable storage medium of claim 7, wherein the rendering context provides information regarding at least one of a software constraint, display screen size, a memory size, a type of processor, and a type of communication link.
10. The machine readable storage medium of claim 7, further comprising employing the rendering context to enable pagination of the ML document, wherein a size of each page in the rendered ML document is paginated to be no greater than an effective size for display by the remote device.
11. The machine readable storage medium of claim 7, further comprising enabling the transmission of the XML document to the mobile device that is fully rendered.
12. The machine readable storage medium of claim 7, wherein the included modules are independent of the rendering context of the mobile device.
13. A system for generating a markup language (ML) document that is remotely rendered for display by a mobile device, comprising:
a server, including:
an interface for communicating over a network;
a memory for storing data;
a processor arranged to enable actions embodied by at least a portion of the stored data, comprising:
enabling editing of content for the ML document;
parsing the ML document to identify included modules;
instantiating a tree of classes based on the identified modules;
employing the tree of classes to generate an extensible markup language (XML) document that corresponds to content in the ML document, wherein the XML document further includes code that is customizable for a rendering context of the mobile device; and
rendering the XML document based at least in part on the rendering context; and
transmitting the rendered XML document to the mobile device; and
the mobile device, including:
a display;
an interface for communicating over the network;
a memory for storing data;
a processor arranged to enable actions embodied by at least a portion of the stored data, comprising:
receiving the rendered XML document; and
displaying the rendered XML document.
14. The system of claim 13, wherein the ML document is editable independent of the rendering context for the mobile device.
15. The system of claim 13, wherein the rendering context provides information regarding at least one of a software constraint, display screen size, a memory size, a type of processor, or a type of communication link.
16. The system of claim 13, further comprising employing the rendering context to enable pagination of the rendered XML document, wherein a size of each page in the rendered XML document is paginated to be no greater than an effective size for display by the mobile device.
17. The system of claim 13, wherein the transmitted XML document is fully rendered for the mobile device.
18. The system of claim 13, wherein the included modules are independent of the rendering context of the mobile device.
19. A server for generating a markup language (ML) document that is remotely rendered for display by a mobile device, comprising:
an interface for communicating over a network;
a memory for storing data;
a processor arranged to enable actions embodied by at least a portion of the stored data, comprising:
enabling editing of content for the ML document;
parsing the ML document to identify included modules;
instantiating a tree of classes based on the identified modules;
employing the tree of classes to generate an extensible markup language (XML) document that corresponds to content in the ML document, wherein the XML document further includes code that is customizable for a rendering context of the mobile device; and
rendering the XML document based at least in part on the rendering context; and
transmitting the rendered XML document to the mobile device; and
enabling the rendered XML document to be displayed at the mobile device.
20. The server of claim 19, wherein the ML document is editable independent of the rendering context for the mobile device.
21. The server of claim 19, wherein the rendering context provides information regarding at least one of a software constraint, display screen size, a memory size, a type of processor, or a type of communication link.
22. The server of claim 19, wherein the included modules are independent of the rendering context of the mobile device.
US11/933,082 2006-09-29 2007-10-31 Server assisted device independent markup language Abandoned US20080177825A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/933,082 US20080177825A1 (en) 2006-09-29 2007-10-31 Server assisted device independent markup language

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/537,593 US10452756B2 (en) 2006-09-29 2006-09-29 Platform for rendering content for a remote device
US11/933,082 US20080177825A1 (en) 2006-09-29 2007-10-31 Server assisted device independent markup language

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/537,593 Continuation US10452756B2 (en) 2006-09-29 2006-09-29 Platform for rendering content for a remote device

Publications (1)

Publication Number Publication Date
US20080177825A1 true US20080177825A1 (en) 2008-07-24

Family

ID=39230517

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/537,593 Active 2029-04-08 US10452756B2 (en) 2006-09-29 2006-09-29 Platform for rendering content for a remote device
US11/933,082 Abandoned US20080177825A1 (en) 2006-09-29 2007-10-31 Server assisted device independent markup language

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/537,593 Active 2029-04-08 US10452756B2 (en) 2006-09-29 2006-09-29 Platform for rendering content for a remote device

Country Status (7)

Country Link
US (2) US10452756B2 (en)
EP (1) EP2069970A4 (en)
JP (1) JP5108016B2 (en)
KR (1) KR101117396B1 (en)
CN (1) CN101523386B (en)
HK (1) HK1137534A1 (en)
WO (1) WO2008039581A1 (en)

Cited By (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080301236A1 (en) * 2007-05-31 2008-12-04 Microsoft Corporation Contextual social language
WO2010096192A1 (en) * 2009-02-18 2010-08-26 Exbiblio B.V. Interacting with rendered documents using a multi-function mobile device, such as a mobile phone
US20110074831A1 (en) * 2009-04-02 2011-03-31 Opsis Distribution, LLC System and method for display navigation
US7990556B2 (en) 2004-12-03 2011-08-02 Google Inc. Association of a portable scanner with input/output and storage devices
US8019648B2 (en) 2004-02-15 2011-09-13 Google Inc. Search engines and systems with handheld document data capture devices
US8081849B2 (en) 2004-12-03 2011-12-20 Google Inc. Portable scanning and memory device
US8146156B2 (en) 2004-04-01 2012-03-27 Google Inc. Archive of text captures from rendered documents
US8179563B2 (en) 2004-08-23 2012-05-15 Google Inc. Portable scanning device
US8261094B2 (en) 2004-04-19 2012-09-04 Google Inc. Secure data gathering from rendered documents
US8346620B2 (en) 2004-07-19 2013-01-01 Google Inc. Automatic modification of web pages
CN102902692A (en) * 2011-07-28 2013-01-30 腾讯科技(北京)有限公司 Method and system for processing and showing network media information
US20130046831A1 (en) * 2011-08-15 2013-02-21 Verizon Patent And Licensing Inc. Method and system for providing context-based view content management
US8442331B2 (en) 2004-02-15 2013-05-14 Google Inc. Capturing text from rendered documents using supplemental information
US20130124976A1 (en) * 2008-12-27 2013-05-16 Yoram Zahavi Method and system for inserting data in a web page that is transmitted to a handheld device
US8447066B2 (en) 2009-03-12 2013-05-21 Google Inc. Performing actions based on capturing information from rendered documents, such as documents under copyright
US8447111B2 (en) 2004-04-01 2013-05-21 Google Inc. Triggering actions in response to optically or acoustically capturing keywords from a rendered document
US8489624B2 (en) 2004-05-17 2013-07-16 Google, Inc. Processing techniques for text capture from a rendered document
US8583601B1 (en) 2007-09-28 2013-11-12 Emc Corporation Imminent failure backup
US8600196B2 (en) 2006-09-08 2013-12-03 Google Inc. Optical scanners, such as hand-held optical scanners
US8621349B2 (en) 2004-04-01 2013-12-31 Google Inc. Publishing techniques for adding value to a rendered document
US8619287B2 (en) 2004-04-01 2013-12-31 Google Inc. System and method for information gathering utilizing form identifiers
US8620083B2 (en) 2004-12-03 2013-12-31 Google Inc. Method and system for character recognition
US8619147B2 (en) 2004-02-15 2013-12-31 Google Inc. Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device
US20140047310A1 (en) * 2012-08-13 2014-02-13 Business Objects Software Ltd. Mobile drilldown viewer for standardized data
US8655966B1 (en) 2010-03-31 2014-02-18 Emc Corporation Mobile device data protection
US8683005B1 (en) 2010-03-31 2014-03-25 Emc Corporation Cache-based mobile device network resource optimization
US20140095988A1 (en) * 2012-09-29 2014-04-03 Bookboard, Inc. Creating and consuming streaming e-book content
US8694744B1 (en) 2010-03-31 2014-04-08 Emc Corporation Mobile device snapshot backup
US8694597B1 (en) 2010-03-31 2014-04-08 Emc Corporation Mobile device group-based data sharing
US8713418B2 (en) 2004-04-12 2014-04-29 Google Inc. Adding value to a rendered document
US8762854B2 (en) * 2007-11-07 2014-06-24 Cabot Communications Limited Systems and methods for itemising web pages for display on a screen
US8793162B2 (en) 2004-04-01 2014-07-29 Google Inc. Adding information or functionality to a rendered document via association with an electronic counterpart
US8799303B2 (en) 2004-02-15 2014-08-05 Google Inc. Establishing an interactive environment for rendered documents
WO2014149492A1 (en) * 2013-03-15 2014-09-25 Google Inc. Screencasting for multi-screen applications
US8874504B2 (en) 2004-12-03 2014-10-28 Google Inc. Processing techniques for visual capture data from a rendered document
US8892495B2 (en) 1991-12-23 2014-11-18 Blanding Hovenweep, Llc Adaptive pattern recognition based controller apparatus and method and human-interface therefore
US8903759B2 (en) 2004-12-03 2014-12-02 Google Inc. Determining actions involving captured information and electronic content associated with rendered documents
US8924352B1 (en) 2007-03-31 2014-12-30 Emc Corporation Automated priority backup and archive
US8990235B2 (en) 2009-03-12 2015-03-24 Google Inc. Automatically providing content associated with captured information, such as information captured in real-time
US9081799B2 (en) 2009-12-04 2015-07-14 Google Inc. Using gestalt information to identify locations in printed information
US9116890B2 (en) 2004-04-01 2015-08-25 Google Inc. Triggering actions in response to optically or acoustically capturing keywords from a rendered document
US9143638B2 (en) 2004-04-01 2015-09-22 Google Inc. Data capture from rendered documents using handheld device
US9152650B1 (en) 2010-03-31 2015-10-06 Emc Corporation Mobile device data recovery
US9268852B2 (en) 2004-02-15 2016-02-23 Google Inc. Search engines and systems with handheld document data capture devices
US9292482B1 (en) * 2015-04-30 2016-03-22 Workiva Inc. System and method for convergent document collaboration
US9323784B2 (en) 2009-12-09 2016-04-26 Google Inc. Image search using text-based elements within the contents of images
US9361131B1 (en) * 2011-06-24 2016-06-07 Amazon Technologies, Inc. Network resource access via a mobile shell
US9454764B2 (en) 2004-04-01 2016-09-27 Google Inc. Contextual dynamic advertising based upon captured rendered text
US9514089B1 (en) * 2010-03-31 2016-12-06 EMC IP Holding Company LLC Mobile device network data synchronization
US9529858B2 (en) 2014-03-06 2016-12-27 Yahoo! Inc. Methods and systems for ranking items on a presentation area based on binary outcomes
US9535563B2 (en) 1999-02-01 2017-01-03 Blanding Hovenweep, Llc Internet appliance system and method
US20170091160A1 (en) * 2015-09-30 2017-03-30 Samsung Display Co. Ltd. Display system and virtual web device in the cloud
WO2017067401A1 (en) * 2015-10-19 2017-04-27 阿里巴巴集团控股有限公司 Method and equipment for processing page
US10325014B2 (en) 2015-04-30 2019-06-18 Workiva Inc. System and method for convergent document collaboration
US10769431B2 (en) 2004-09-27 2020-09-08 Google Llc Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device
US11100281B1 (en) 2020-08-17 2021-08-24 Workiva Inc. System and method for maintaining links and revisions
US11100277B1 (en) 2021-02-15 2021-08-24 Workiva Inc. Systems, methods, and computer-readable media for flow-through formatting for links
US11354362B1 (en) 2021-05-06 2022-06-07 Workiva Inc. System and method for copying linked documents
US11443108B2 (en) 2020-08-17 2022-09-13 Workiva Inc. System and method for document management using branching
US11640495B1 (en) 2021-10-15 2023-05-02 Workiva Inc. Systems and methods for translation comments flowback
US11755825B2 (en) 2019-09-12 2023-09-12 Workiva Inc. Method, system, and computing device for facilitating private drafting

Families Citing this family (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7920101B2 (en) 2004-10-08 2011-04-05 Sharp Laboratories Of America, Inc. Methods and systems for imaging device display standardization
US8237946B2 (en) 2004-10-08 2012-08-07 Sharp Laboratories Of America, Inc. Methods and systems for imaging device accounting server redundancy
US8384925B2 (en) 2004-10-08 2013-02-26 Sharp Laboratories Of America, Inc. Methods and systems for imaging device accounting data management
US8213034B2 (en) 2004-10-08 2012-07-03 Sharp Laboratories Of America, Inc. Methods and systems for providing remote file structure access on an imaging device
US8230328B2 (en) * 2004-10-08 2012-07-24 Sharp Laboratories Of America, Inc. Methods and systems for distributing localized display elements to an imaging device
US8428484B2 (en) 2005-03-04 2013-04-23 Sharp Laboratories Of America, Inc. Methods and systems for peripheral accounting
US8345272B2 (en) 2006-09-28 2013-01-01 Sharp Laboratories Of America, Inc. Methods and systems for third-party control of remote imaging jobs
US8181107B2 (en) 2006-12-08 2012-05-15 Bytemobile, Inc. Content adaptation
US8019863B2 (en) 2008-03-28 2011-09-13 Ianywhere Solutions, Inc. Synchronizing events between mobile devices and servers
US8719365B1 (en) * 2009-02-12 2014-05-06 Adobe Systems Incorporated Graphic output from remote execution of applications redirected with dynamically sized virtual screen
US20100211893A1 (en) * 2009-02-19 2010-08-19 Microsoft Corporation Cross-browser page visualization presentation
US20110093619A1 (en) * 2009-10-16 2011-04-21 Ianywhere Solutions, Inc. Synchronizing Tasks between Mobile Devices and Servers
US20110126113A1 (en) * 2009-11-23 2011-05-26 c/o Microsoft Corporation Displaying content on multiple web pages
JP4818454B1 (en) * 2010-08-27 2011-11-16 株式会社東芝 Display device and display method
CN102035856A (en) * 2010-12-31 2011-04-27 深圳瑞高信息技术有限公司 Game community management method and system and game customer terminals
CN102074221B (en) * 2011-01-06 2012-08-22 深圳芯邦科技股份有限公司 Method and device for displaying characters
US9015576B2 (en) 2011-05-16 2015-04-21 Microsoft Technology Licensing, Llc Informed partitioning of data in a markup-based document
US20130227398A1 (en) * 2011-08-23 2013-08-29 Opera Software Asa Page based navigation and presentation of web content
US9536251B2 (en) * 2011-11-15 2017-01-03 Excalibur Ip, Llc Providing advertisements in an augmented reality environment
US20130159839A1 (en) * 2011-12-14 2013-06-20 Microsoft Corporation Semantic compression of cascading style sheets
CN104025068B (en) * 2012-01-02 2017-06-13 国际商业机器公司 The Conflict solving of the CSS definition from multiple sources
US20140272891A1 (en) * 2013-03-15 2014-09-18 Joseph Saladino System and method for remote fitness training
EP2802122A1 (en) * 2013-05-07 2014-11-12 Nagravision S.A. A Media Player for Receiving Media Content from a Remote Server
CN103294819B (en) * 2013-06-14 2018-08-31 北京新学堂网络科技有限公司 The method that Pagination Display is carried out to web page contents using HTML5 technologies
US10579713B2 (en) * 2014-05-30 2020-03-03 Apple Inc. Application Markup language
US9898450B2 (en) * 2014-11-07 2018-02-20 Rakuten Kobo Inc. System and method for repagination of display content
CN107526716B (en) * 2016-09-30 2020-02-11 腾讯科技(深圳)有限公司 Mail display method and device
US20180191798A1 (en) * 2016-12-30 2018-07-05 Google Inc. Methods and systems for server-side rendering of native content for presentation
US10552480B1 (en) * 2017-02-21 2020-02-04 Amazon Technologies, Inc. Package management for asset processing
CN111338628B (en) * 2020-03-10 2023-07-18 中国联合网络通信集团有限公司 Component rendering method and device
US11263387B1 (en) * 2020-05-05 2022-03-01 Progress Software Corporation Technology agnostic page builder architecture
CN112817915A (en) * 2021-02-01 2021-05-18 浪潮云信息技术股份公司 Automatic multi-product document uniform publishing and displaying method
CN113704824A (en) * 2021-08-31 2021-11-26 平安普惠企业管理有限公司 Synchronous generation method, device and equipment of page guide mark and storage medium

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020019812A1 (en) * 2000-06-16 2002-02-14 Board Karen Eleanor System and service for receiving, customizing, and re-broadcasting high-speed financial data to users operating wireless network-capable devices
US6473609B1 (en) * 1995-12-11 2002-10-29 Openwave Systems Inc. Method and architecture for interactive two-way communication devices to interact with a network
US20030236917A1 (en) * 2002-06-17 2003-12-25 Gibbs Matthew E. Device specific pagination of dynamically rendered data
US20040019628A1 (en) * 2002-07-09 2004-01-29 Puri Anish N. System for remotely rendering content for output by a printer
US20040110490A1 (en) * 2001-12-20 2004-06-10 Steele Jay D. Method and apparatus for providing content to media devices
US20040148571A1 (en) * 2003-01-27 2004-07-29 Lue Vincent Wen-Jeng Method and apparatus for adapting web contents to different display area
US20060161646A1 (en) * 2005-01-19 2006-07-20 Marc Chene Policy-driven mobile forms applications
US20060200761A1 (en) * 2001-06-29 2006-09-07 Melia Technologies, Ltd Content management and transformation system for digital content
US7162221B2 (en) * 2001-12-14 2007-01-09 Inphonic, Inc. Systems, methods, and computer program products for registering wireless device users in direct marketing campaigns
US20070113172A1 (en) * 2005-11-14 2007-05-17 Jochen Behrens Method and apparatus for virtualized XML parsing
US20070288853A1 (en) * 2006-06-09 2007-12-13 Nextair Corporation Software, methods and apparatus facilitating presentation of a wireless communication device user interface with multi-language support
US20080040659A1 (en) * 2004-02-05 2008-02-14 Stephen Doyle Markup Language Translator System
US7584423B2 (en) * 2000-06-12 2009-09-01 Gary Rohrabaugh Method, proxy and system to support full-page web browsing on hand-held devices
US7900137B2 (en) * 2003-10-22 2011-03-01 Opera Software Asa Presenting HTML content on a screen terminal display

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6278992B1 (en) * 1997-03-19 2001-08-21 John Andrew Curtis Search engine using indexing method for storing and retrieving data
US6178430B1 (en) * 1998-05-11 2001-01-23 Mci Communication Corporation Automated information technology standards management system
SE524391C2 (en) * 1998-12-28 2004-08-03 Spyglass Inc Method and system for content conversion of electronic documents for wireless clients.
WO2000073981A1 (en) * 1999-05-28 2000-12-07 Anoto Ab Recording of information
JP2001195391A (en) 2000-01-14 2001-07-19 Nec Information Service Ltd Format conversion and page division relay server
US7072984B1 (en) * 2000-04-26 2006-07-04 Novarra, Inc. System and method for accessing customized information over the internet using a browser for a plurality of electronic devices
WO2001086462A1 (en) 2000-05-08 2001-11-15 Leap Wireless International, Inc. Method of converting html/xml to hdml/wml in real-time for display on mobile devices
WO2002050719A2 (en) * 2000-12-18 2002-06-27 Kargo, Inc. A system and method for delivering content to mobile devices
GB0107784D0 (en) * 2001-03-28 2001-05-16 Hewlett Packard Co Improvement relating to developing documents
JP2003271508A (en) 2002-03-14 2003-09-26 Ntt Comware Corp Contents conversion system for portable terminal and contents conversion method
JP2005018390A (en) 2003-06-26 2005-01-20 Hitachi Ltd Description language converting method and execution system, and processing program
DE602004011952T2 (en) 2003-06-30 2008-07-03 International Business Machines Corp. Method and system for improving the presentation of HTML pages in an Internet access device
US7356843B1 (en) * 2003-10-01 2008-04-08 Symantec Corporation Security incident identification and prioritization
JP4389707B2 (en) 2004-07-16 2009-12-24 ソニー株式会社 Electronic device apparatus, server apparatus, Web page processing method and program thereof
JP2006243829A (en) 2005-02-28 2006-09-14 Toshiba Corp Method and system for converting web content

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6473609B1 (en) * 1995-12-11 2002-10-29 Openwave Systems Inc. Method and architecture for interactive two-way communication devices to interact with a network
US7584423B2 (en) * 2000-06-12 2009-09-01 Gary Rohrabaugh Method, proxy and system to support full-page web browsing on hand-held devices
US20020019812A1 (en) * 2000-06-16 2002-02-14 Board Karen Eleanor System and service for receiving, customizing, and re-broadcasting high-speed financial data to users operating wireless network-capable devices
US20060200761A1 (en) * 2001-06-29 2006-09-07 Melia Technologies, Ltd Content management and transformation system for digital content
US7162221B2 (en) * 2001-12-14 2007-01-09 Inphonic, Inc. Systems, methods, and computer program products for registering wireless device users in direct marketing campaigns
US20040110490A1 (en) * 2001-12-20 2004-06-10 Steele Jay D. Method and apparatus for providing content to media devices
US20030236917A1 (en) * 2002-06-17 2003-12-25 Gibbs Matthew E. Device specific pagination of dynamically rendered data
US20070113179A1 (en) * 2002-06-17 2007-05-17 Microsoft Corporation Device specific pagination of dynamically rendered data
US20040019628A1 (en) * 2002-07-09 2004-01-29 Puri Anish N. System for remotely rendering content for output by a printer
US20040148571A1 (en) * 2003-01-27 2004-07-29 Lue Vincent Wen-Jeng Method and apparatus for adapting web contents to different display area
US7900137B2 (en) * 2003-10-22 2011-03-01 Opera Software Asa Presenting HTML content on a screen terminal display
US20080040659A1 (en) * 2004-02-05 2008-02-14 Stephen Doyle Markup Language Translator System
US20060161646A1 (en) * 2005-01-19 2006-07-20 Marc Chene Policy-driven mobile forms applications
US20070113172A1 (en) * 2005-11-14 2007-05-17 Jochen Behrens Method and apparatus for virtualized XML parsing
US20070288853A1 (en) * 2006-06-09 2007-12-13 Nextair Corporation Software, methods and apparatus facilitating presentation of a wireless communication device user interface with multi-language support

Cited By (92)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8892495B2 (en) 1991-12-23 2014-11-18 Blanding Hovenweep, Llc Adaptive pattern recognition based controller apparatus and method and human-interface therefore
US9535563B2 (en) 1999-02-01 2017-01-03 Blanding Hovenweep, Llc Internet appliance system and method
US8799303B2 (en) 2004-02-15 2014-08-05 Google Inc. Establishing an interactive environment for rendered documents
US9268852B2 (en) 2004-02-15 2016-02-23 Google Inc. Search engines and systems with handheld document data capture devices
US8019648B2 (en) 2004-02-15 2011-09-13 Google Inc. Search engines and systems with handheld document data capture devices
US8064700B2 (en) 2004-02-15 2011-11-22 Google Inc. Method and system for character recognition
US8515816B2 (en) 2004-02-15 2013-08-20 Google Inc. Aggregate analysis of text captures performed by multiple users from rendered documents
US8447144B2 (en) 2004-02-15 2013-05-21 Google Inc. Data capture from rendered documents using handheld device
US8831365B2 (en) 2004-02-15 2014-09-09 Google Inc. Capturing text from rendered documents using supplement information
US8214387B2 (en) 2004-02-15 2012-07-03 Google Inc. Document enhancement system and method
US8442331B2 (en) 2004-02-15 2013-05-14 Google Inc. Capturing text from rendered documents using supplemental information
US8619147B2 (en) 2004-02-15 2013-12-31 Google Inc. Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device
US8447111B2 (en) 2004-04-01 2013-05-21 Google Inc. Triggering actions in response to optically or acoustically capturing keywords from a rendered document
US8620760B2 (en) 2004-04-01 2013-12-31 Google Inc. Methods and systems for initiating application processes by data capture from rendered documents
US8793162B2 (en) 2004-04-01 2014-07-29 Google Inc. Adding information or functionality to a rendered document via association with an electronic counterpart
US9633013B2 (en) 2004-04-01 2017-04-25 Google Inc. Triggering actions in response to optically or acoustically capturing keywords from a rendered document
US9454764B2 (en) 2004-04-01 2016-09-27 Google Inc. Contextual dynamic advertising based upon captured rendered text
US8621349B2 (en) 2004-04-01 2013-12-31 Google Inc. Publishing techniques for adding value to a rendered document
US9143638B2 (en) 2004-04-01 2015-09-22 Google Inc. Data capture from rendered documents using handheld device
US8781228B2 (en) 2004-04-01 2014-07-15 Google Inc. Triggering actions in response to optically or acoustically capturing keywords from a rendered document
US8619287B2 (en) 2004-04-01 2013-12-31 Google Inc. System and method for information gathering utilizing form identifiers
US8505090B2 (en) 2004-04-01 2013-08-06 Google Inc. Archive of text captures from rendered documents
US8146156B2 (en) 2004-04-01 2012-03-27 Google Inc. Archive of text captures from rendered documents
US9514134B2 (en) 2004-04-01 2016-12-06 Google Inc. Triggering actions in response to optically or acoustically capturing keywords from a rendered document
US9116890B2 (en) 2004-04-01 2015-08-25 Google Inc. Triggering actions in response to optically or acoustically capturing keywords from a rendered document
US8713418B2 (en) 2004-04-12 2014-04-29 Google Inc. Adding value to a rendered document
US9030699B2 (en) 2004-04-19 2015-05-12 Google Inc. Association of a portable scanner with input/output and storage devices
US8261094B2 (en) 2004-04-19 2012-09-04 Google Inc. Secure data gathering from rendered documents
US8489624B2 (en) 2004-05-17 2013-07-16 Google, Inc. Processing techniques for text capture from a rendered document
US8799099B2 (en) 2004-05-17 2014-08-05 Google Inc. Processing techniques for text capture from a rendered document
US8346620B2 (en) 2004-07-19 2013-01-01 Google Inc. Automatic modification of web pages
US9275051B2 (en) 2004-07-19 2016-03-01 Google Inc. Automatic modification of web pages
US8179563B2 (en) 2004-08-23 2012-05-15 Google Inc. Portable scanning device
US10769431B2 (en) 2004-09-27 2020-09-08 Google Llc Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device
US8953886B2 (en) 2004-12-03 2015-02-10 Google Inc. Method and system for character recognition
US8903759B2 (en) 2004-12-03 2014-12-02 Google Inc. Determining actions involving captured information and electronic content associated with rendered documents
US8874504B2 (en) 2004-12-03 2014-10-28 Google Inc. Processing techniques for visual capture data from a rendered document
US8081849B2 (en) 2004-12-03 2011-12-20 Google Inc. Portable scanning and memory device
US7990556B2 (en) 2004-12-03 2011-08-02 Google Inc. Association of a portable scanner with input/output and storage devices
US8620083B2 (en) 2004-12-03 2013-12-31 Google Inc. Method and system for character recognition
US8600196B2 (en) 2006-09-08 2013-12-03 Google Inc. Optical scanners, such as hand-held optical scanners
US8924352B1 (en) 2007-03-31 2014-12-30 Emc Corporation Automated priority backup and archive
US20080301236A1 (en) * 2007-05-31 2008-12-04 Microsoft Corporation Contextual social language
US8583601B1 (en) 2007-09-28 2013-11-12 Emc Corporation Imminent failure backup
US8762854B2 (en) * 2007-11-07 2014-06-24 Cabot Communications Limited Systems and methods for itemising web pages for display on a screen
US20130124976A1 (en) * 2008-12-27 2013-05-16 Yoram Zahavi Method and system for inserting data in a web page that is transmitted to a handheld device
US9152615B2 (en) * 2008-12-27 2015-10-06 Flash Networks, Ltd Method and system for inserting data in a web page that is transmitted to a handheld device
US8418055B2 (en) 2009-02-18 2013-04-09 Google Inc. Identifying a document by performing spectral analysis on the contents of the document
US8638363B2 (en) 2009-02-18 2014-01-28 Google Inc. Automatically capturing information, such as capturing information using a document-aware device
WO2010096192A1 (en) * 2009-02-18 2010-08-26 Exbiblio B.V. Interacting with rendered documents using a multi-function mobile device, such as a mobile phone
US8447066B2 (en) 2009-03-12 2013-05-21 Google Inc. Performing actions based on capturing information from rendered documents, such as documents under copyright
US8990235B2 (en) 2009-03-12 2015-03-24 Google Inc. Automatically providing content associated with captured information, such as information captured in real-time
US9075779B2 (en) 2009-03-12 2015-07-07 Google Inc. Performing actions based on capturing information from rendered documents, such as documents under copyright
US20110074831A1 (en) * 2009-04-02 2011-03-31 Opsis Distribution, LLC System and method for display navigation
US9081799B2 (en) 2009-12-04 2015-07-14 Google Inc. Using gestalt information to identify locations in printed information
US9323784B2 (en) 2009-12-09 2016-04-26 Google Inc. Image search using text-based elements within the contents of images
US9514089B1 (en) * 2010-03-31 2016-12-06 EMC IP Holding Company LLC Mobile device network data synchronization
US8655966B1 (en) 2010-03-31 2014-02-18 Emc Corporation Mobile device data protection
US8694597B1 (en) 2010-03-31 2014-04-08 Emc Corporation Mobile device group-based data sharing
US8683005B1 (en) 2010-03-31 2014-03-25 Emc Corporation Cache-based mobile device network resource optimization
US9152650B1 (en) 2010-03-31 2015-10-06 Emc Corporation Mobile device data recovery
US8694744B1 (en) 2010-03-31 2014-04-08 Emc Corporation Mobile device snapshot backup
US9361131B1 (en) * 2011-06-24 2016-06-07 Amazon Technologies, Inc. Network resource access via a mobile shell
CN102902692A (en) * 2011-07-28 2013-01-30 腾讯科技(北京)有限公司 Method and system for processing and showing network media information
US9443029B2 (en) * 2011-08-15 2016-09-13 Verizon Patent And Licensing Inc. Method and system for providing context-based view content management
US20130046831A1 (en) * 2011-08-15 2013-02-21 Verizon Patent And Licensing Inc. Method and system for providing context-based view content management
US20140047310A1 (en) * 2012-08-13 2014-02-13 Business Objects Software Ltd. Mobile drilldown viewer for standardized data
US20140095988A1 (en) * 2012-09-29 2014-04-03 Bookboard, Inc. Creating and consuming streaming e-book content
US9836437B2 (en) 2013-03-15 2017-12-05 Google Llc Screencasting for multi-screen applications
WO2014149492A1 (en) * 2013-03-15 2014-09-25 Google Inc. Screencasting for multi-screen applications
US9529858B2 (en) 2014-03-06 2016-12-27 Yahoo! Inc. Methods and systems for ranking items on a presentation area based on binary outcomes
US11048861B2 (en) 2015-04-30 2021-06-29 Workiva Inc. System and method for convergent document collaboration
US10325014B2 (en) 2015-04-30 2019-06-18 Workiva Inc. System and method for convergent document collaboration
US10331776B2 (en) 2015-04-30 2019-06-25 Workiva Inc. System and method for convergent document collaboration
US9292482B1 (en) * 2015-04-30 2016-03-22 Workiva Inc. System and method for convergent document collaboration
US10878182B2 (en) 2015-04-30 2020-12-29 Workiva Inc. Computing device for convergent document collaboration
US9552343B2 (en) 2015-04-30 2017-01-24 Workiva Inc. System and method for convergent document collaboration
US11361150B2 (en) 2015-04-30 2022-06-14 Workiva Inc. System and method for convergent document collaboration
US20170091160A1 (en) * 2015-09-30 2017-03-30 Samsung Display Co. Ltd. Display system and virtual web device in the cloud
US10534852B2 (en) * 2015-09-30 2020-01-14 Samsung Display Co., Ltd. Display system and virtual web device in the cloud
WO2017067401A1 (en) * 2015-10-19 2017-04-27 阿里巴巴集团控股有限公司 Method and equipment for processing page
US11755825B2 (en) 2019-09-12 2023-09-12 Workiva Inc. Method, system, and computing device for facilitating private drafting
US11100281B1 (en) 2020-08-17 2021-08-24 Workiva Inc. System and method for maintaining links and revisions
US11443108B2 (en) 2020-08-17 2022-09-13 Workiva Inc. System and method for document management using branching
US11544451B2 (en) 2020-08-17 2023-01-03 Workiva Inc. System and method for maintaining links and revisions
US11861300B2 (en) 2020-08-17 2024-01-02 Workiva Inc. System and method for maintaining links and revisions
US11734505B2 (en) 2020-08-17 2023-08-22 Workiva Inc. System and method for document branching
US11100277B1 (en) 2021-02-15 2021-08-24 Workiva Inc. Systems, methods, and computer-readable media for flow-through formatting for links
US11436405B1 (en) 2021-02-15 2022-09-06 Workiva Inc. Systems, methods, and computer-readable media for flow-through formatting for links
US11354362B1 (en) 2021-05-06 2022-06-07 Workiva Inc. System and method for copying linked documents
US11698935B2 (en) 2021-05-06 2023-07-11 Workiva Inc. System and method for copying linked documents
US11640495B1 (en) 2021-10-15 2023-05-02 Workiva Inc. Systems and methods for translation comments flowback

Also Published As

Publication number Publication date
KR101117396B1 (en) 2012-03-07
JP5108016B2 (en) 2012-12-26
JP2010505194A (en) 2010-02-18
EP2069970A4 (en) 2010-01-06
EP2069970A1 (en) 2009-06-17
HK1137534A1 (en) 2010-07-30
CN101523386B (en) 2013-05-22
CN101523386A (en) 2009-09-02
WO2008039581A1 (en) 2008-04-03
KR20090077807A (en) 2009-07-15
US10452756B2 (en) 2019-10-22
US20080155396A1 (en) 2008-06-26

Similar Documents

Publication Publication Date Title
US10452756B2 (en) Platform for rendering content for a remote device
US9479343B2 (en) Engine for processing content rules associated with locations in a page
US8224308B1 (en) Mobile device catalog registration based on user agents and customer snapshots of capabilities
US11170402B2 (en) Evaluating page content to determine user interest
US10503367B2 (en) Recasting a form-based user interface into a mobile device user interface using common data
US7668942B2 (en) Generating document templates that are robust to structural variations
JP4985769B2 (en) Providing actionable events based on customized user information in text messages for intercepted mobile devices
US7996000B1 (en) Managing page sizes for a mobile device using estimation of content customizer techniques
EP3485450B1 (en) Network based advertisement data traffic latency reduction
US9723057B2 (en) Reducing web page load latency by scheduling sets of successive outgoing HTTP calls
US8515908B2 (en) Determining related keywords based on lifestream feeds
US20080220747A1 (en) Scrolling mobile advertisements
US7949625B2 (en) Automated management of brand rules for providing content
JP5665812B2 (en) Mobile monetization
US20090055398A1 (en) Retrieving mobile user context information using tokenized virtual dictionaries
US20090063267A1 (en) Mobile intelligence tasks
US20120150855A1 (en) Cross-market model adaptation with pairwise preference data
US7590634B2 (en) Detection of inaccessible resources

Legal Events

Date Code Title Description
AS Assignment

Owner name: YAHOO! INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DUBINKO, MICAH;JIANG, ZHAOWEI CHARLIE;CHOI, NIGEL;AND OTHERS;REEL/FRAME:021042/0001;SIGNING DATES FROM 20041028 TO 20071202

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: YAHOO HOLDINGS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YAHOO! INC.;REEL/FRAME:042963/0211

Effective date: 20170613

AS Assignment

Owner name: OATH INC., NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YAHOO HOLDINGS, INC.;REEL/FRAME:045240/0310

Effective date: 20171231