US9232206B2 - Multimedia framework to provide ultra-low power multimedia playback - Google Patents

Multimedia framework to provide ultra-low power multimedia playback

Info

Publication number
US9232206B2
Authority
US
United States
Prior art keywords
multimedia
component
framework
application
monolithic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US13/921,196
Other versions
US20130279874A1
Inventor
Mayuresh Kulkarni
Dhiraj Nadgouda
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nvidia Corp
Original Assignee
Nvidia Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nvidia Corp
Priority to US13/921,196
Assigned to NVIDIA CORPORATION (Assignors: KULKARNI, MAYURESH; NADGOUDA, DHIRAJ)
Publication of US20130279874A1
Application granted
Publication of US9232206B2
Status: Active
Adjusted expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/79 Processing of colour television signals in connection with recording
    • H04N9/87 Regeneration of colour television signals
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26 Power supply means, e.g. regulation thereof
    • G06F1/32 Means for saving power
    • G06F1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26 Power supply means, e.g. regulation thereof
    • G06F1/32 Means for saving power
    • G06F1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234 Power saving characterised by the action undertaken
    • G06F1/3293 Power saving characterised by the action undertaken by switching to a less power-consuming processor, e.g. sub-CPU
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443 OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4436 Power management, e.g. shutting down unused components of the receiver
    • Y02B60/121
    • Y02B60/1217
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

A multimedia framework includes a monolithic multimedia component to include a specific interface provided by the multimedia framework, and a component control unit layer to serve as a point of control of an application, and to control a data flow through the monolithic multimedia component. When the application queries the component control unit layer for the specific interface, the specific interface passes a pointer thereof that signifies a role required by the application matching a role identified by the multimedia framework for the monolithic multimedia component to the application. A command from the application is transmitted to a tunnel of a multimedia stack interfaced with the monolithic multimedia component to ensure that the same monolithic multimedia component serves as a source component, one or more transform component(s) and/or a renderer. The application is unaware of the multi-tasking associated with the monolithic multimedia component.

Description

CLAIM OF PRIORITY
This application is a Divisional application of the U.S. Non-Provisional application Ser. No. 12/499,809 titled MULTIMEDIA FRAMEWORK TO PROVIDE ULTRA-LOW POWER MULTIMEDIA PLAYBACK filed on Jul. 9, 2009.
FIELD OF TECHNOLOGY
This disclosure relates generally to multimedia processing systems and, more particularly, to a method, apparatus, and a system to obtain an ultra-low power multimedia playback capability in multimedia players utilizing an appropriate pin-less multimedia framework implementation.
BACKGROUND
Multimedia frameworks simplify tasks related to multimedia handling over processing systems and processing system networks. Tasks may be simplified by easing multimedia capturing and playback, and multimedia streaming. Multimedia frameworks, such as DirectShow® for Windows®, can include modular multimedia components (e.g., filters in DirectShow® for Windows®). Specific interfaces may be provided by the typifying multimedia framework through the modular multimedia components. For example, a multimedia framework may identify roles for each component, and may give each component specific interfaces, thereby enabling the components to handle tasks and notify events.
To enable interfacing between components, member objects called pins may be provided. These pins, which may be data structures, are components that are aggregated within a component. In existing systems, every component communicates with every other component through pins thereof. Information regarding the correspondence between components and respective individual pins needs to be provided. Also, an architecture utilizing pins tends to create threads including worker threads corresponding to individual pins. As data flow occurs in worker threads, memory and resource consumption in a pin-based architecture are causes for concern.
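The resource cost described above can be illustrated with a minimal C++ sketch (illustrative only, not part of the original disclosure; all type and function names are hypothetical): each output pin owns a worker thread and a buffer queue, so every additional pin connection adds a thread and its associated memory to the process.

    // Illustrative sketch (hypothetical types, not DirectShow's actual API):
    // each output pin owns a worker thread that pushes buffers to the connected
    // input pin, so every extra pin adds a thread and a queue to the process.
    #include <atomic>
    #include <condition_variable>
    #include <cstdio>
    #include <functional>
    #include <mutex>
    #include <queue>
    #include <string>
    #include <thread>

    struct InputPin {
        std::function<void(const std::string&)> receive;  // downstream handler
    };

    class OutputPin {
    public:
        void connect(InputPin* peer) { peer_ = peer; }
        void start() {
            running_ = true;
            worker_ = std::thread([this] {          // one worker thread per pin
                while (running_) {
                    std::unique_lock<std::mutex> lock(m_);
                    cv_.wait(lock, [this] { return !q_.empty() || !running_; });
                    while (!q_.empty()) {
                        std::string buf = q_.front();
                        q_.pop();
                        lock.unlock();
                        if (peer_) peer_->receive(buf);  // data flow in the worker
                        lock.lock();
                    }
                }
            });
        }
        void deliver(const std::string& buf) {
            { std::lock_guard<std::mutex> lock(m_); q_.push(buf); }
            cv_.notify_one();
        }
        void stop() {
            running_ = false;
            cv_.notify_one();
            if (worker_.joinable()) worker_.join();
        }
    private:
        InputPin* peer_ = nullptr;
        std::thread worker_;
        std::queue<std::string> q_;
        std::mutex m_;
        std::condition_variable cv_;
        std::atomic<bool> running_{false};
    };

    int main() {
        InputPin rendererIn{[](const std::string& buf) { std::printf("render: %s\n", buf.c_str()); }};
        OutputPin sourceOut;
        sourceOut.connect(&rendererIn);
        sourceOut.start();
        sourceOut.deliver("frame-0");
        sourceOut.deliver("frame-1");
        sourceOut.stop();
        return 0;
    }
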
SUMMARY
Disclosed are a method, an apparatus, and a system to provide ultra-low power multimedia playback in multimedia players utilizing an appropriate pin-less multimedia framework implementation.
In one aspect, a multimedia framework to provide multimedia playback on a multimedia player includes a monolithic multimedia component including a specific interface provided by the multimedia framework. The specific interface signifies roles identified by the multimedia framework for the monolithic multimedia component. The multimedia framework also includes a component control unit layer to serve as a point of control of an application, and to control a data flow through the monolithic multimedia component. When the application queries the component control unit layer for the specific interface, the specific interface passes a pointer thereof that signifies a role required by the application matching a role identified by the multimedia framework for the monolithic multimedia component to the application, to indicate support of necessary interfaces providing communication between the application and the component control unit layer.
A multimedia stack is interfaced with the monolithic multimedia component. The multimedia stack includes a first block to parse an input, one or more second block(s) to transform a first block output data of the first block and/or a third block to place an output from the one or more second block(s) on a rendering device. A command from the application is transmitted to a tunnel of a multimedia stack interfaced with the monolithic multimedia component, and the multimedia stack ensures that the same monolithic multimedia component serves as a source component to read and to parse the input, one or more transform component(s) to transform the output data of the source component and/or a renderer to place the output of the one or more transform component(s) on the rendering device. Further, the application is unaware of the same monolithic multimedia component serving as the source component, the one or more transform component(s) and/or the renderer.
In another aspect, a non-transitory medium, readable through a multimedia player and including instructions embodied therein to implement a multimedia framework to provide multimedia playback on the multimedia player, is disclosed. The non-transitory medium includes instructions to signify roles identified by the multimedia framework for a monolithic multimedia component thereof through a specific interface provided by the multimedia framework, and instructions to control a data flow through the monolithic multimedia component through a component control unit layer configured to serve as a point of control of an application. Also, the non-transitory medium includes instructions to pass a pointer through the specific interface signifying a role required by the application matching a role identified by the multimedia framework for the monolithic multimedia component to the application to indicate support of necessary interfaces providing communication between the application and the component control unit layer, in response to the application querying the component control unit layer for the specific interface.
Further, the non-transitory medium includes instructions to transmit a command from the application to a tunnel of a multimedia stack interfaced with the monolithic multimedia component, and instructions to implement the multimedia stack with a first block to parse an input, one or more second block(s) to transform a first block output data of the first block and/or a third block to place an output from the one or more second block(s) on a rendering device to ensure that the same monolithic multimedia component serves as a source component to read and to parse the input, one or more transform component(s) to transform the output data of the source component and/or a renderer to place the output of the one or more transform component(s) on the rendering device. The application is unaware of the same monolithic multimedia component serving as the source component, the one or more transform component(s) and/or the renderer.
In yet another aspect, a multimedia player to execute a multimedia framework to provide multimedia playback thereon, is disclosed. The multimedia player includes a processor to execute instructions to signify roles identified by the multimedia framework for a monolithic multimedia component thereof through a specific interface provided by the multimedia framework, and to control a data flow through the monolithic multimedia component through a component control unit layer configured to serve as a point of control of an application. The processor is also configured to execute instructions to pass a pointer through the specific interface signifying a role required by the application matching a role identified by the multimedia framework for the monolithic multimedia component to the application to indicate support of necessary interfaces providing communication between the application and the component control unit layer, in response to the application querying the component control unit layer for the specific interface.
Further, the processor is configured to execute instructions to transmit a command from the application to a tunnel of a multimedia stack interfaced with the monolithic multimedia component, and to implement the multimedia stack with a first block to parse an input, one or more second block(s) to transform a first block output data of the first block and/or a third block to place an output from the one or more second block(s) on a rendering device to ensure that the same monolithic multimedia component serves as a source component to read and to parse the input, one or more transform component(s) to transform the output data of the source component, and/or a renderer to place the output of the one or more transform component(s) on the rendering device. The application is unaware of the same monolithic multimedia component serving as the source component, the one or more transform component(s) and/or the renderer.
The methods and systems disclosed herein may be implemented in any means for achieving various aspects, and may be executed in a form of a machine-readable medium embodying a set of instructions that, when executed by a machine, cause the machine to perform any of the operations disclosed herein. Other features will be apparent from the accompanying drawings and from the detailed description that follows.
BRIEF DESCRIPTION OF THE DRAWINGS
Example embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
FIG. 1 is a schematic view of a multimedia framework, exemplifying a pin-based architecture, in accordance with one or more embodiments.
FIG. 2 is a schematic view of a pin-less multimedia framework architecture, in accordance with one or more embodiments.
FIG. 3 is a schematic view of a multimedia processing system using the multimedia framework of FIG. 2, in accordance with one or more embodiments.
FIG. 4 is a process flow diagram that details the operations involved in a method of multimedia processing that offers power savings, in accordance with one or more embodiments.
FIG. 5 is a mobile device including a multimedia framework implementation, in accordance with one or more embodiments.
Other features of the present embodiments will be apparent from the accompanying drawings and from the detailed description that follows.
DETAILED DESCRIPTION
Disclosed are a method, an apparatus, and a system to provide ultra-low power multimedia playback in multimedia players utilizing an appropriate pin-less multimedia framework implementation. Although the present embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the various embodiments.
In general, example embodiments discussed below provide a multimedia framework for ultra-low power multimedia playback. In one or more embodiments, a method of multimedia processing in a multimedia processing system utilizing the implementation of the aforementioned multimedia framework may result in reduced power dissipation in the multimedia processing system. Examples of multimedia processing systems include, but are not limited to, mobile processors in portable multimedia players.
Various exemplary embodiments will now be described with reference to the accompanying figures.
FIG. 1 is a schematic view of a multimedia framework 100, exemplifying a pin-based architecture. Particularly, FIG. 1 illustrates a multimedia framework 100 that includes a framework application layer 102, an application interface 104, a component control unit layer 106, a source component 108, an output pin A 114, an input pin A 118, a transform component 110, an output pin B 116, an input pin B 120, and a renderer 112. The multimedia framework 100 may be a multimedia layer providing multimedia capture, processing, and playback (e.g., DirectShow® for Windows®) from local or remote sources. The multimedia framework 100 may be above a foundation layer that facilitates access to hardware (e.g., sound card).
Referring to the exemplary multimedia framework 100 of FIG. 1, the framework application layer 102 may communicate with the component control unit layer 106 through the application interface 104. An application at the framework application layer 102 level may perform a required task by connecting the source component 108, transform component 110, and the renderer 112 together with the help of the component control unit layer 106. The application interface(s) 104 may, therefore, facilitate communication between the application and the component control unit layer 106 by including necessary interfaces required for the aforementioned communication. The component control unit layer 106 (e.g., Filter Graph Manager in DirectShow® for Windows®) may control arrangements of the source component 108, transform component 110, and the renderer 112, and may also control a dataflow therethrough. The components (108, 110, and 112) may include interfaces that signify roles thereof identified by the multimedia framework 100.
Dataflow may be enabled through pins (114, 116, 118, and 120) that serve as interfaces between the components (108, 110, and 112). The directionality of the pins (114, 116, 118, and 120) influences the order in which components are arranged and connected to one another. The source component 108 may read and parse from an input file, and may send a bit-stream to downstream components. Therefore, the source component 108 of FIG. 1 may have one output pin (Output Pin A 114) and no input pins. The transform component 110 may do custom processing on the bit-stream to send data downstream. The custom processing may include a parsing, a decoding or a requisite data operation. As there can exist a plurality of transform components, the transform component 110 of FIG. 1 may include upstream and downstream components therein. The transform component 110 is shown in FIG. 1 as having one input pin (Input Pin A 118) and one output pin (Output Pin B 116) for example purposes.
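As an illustration of the pin-based arrangement just described (a hedged sketch under assumed names, not DirectShow's actual API), the following C++ fragment assembles the FIG. 1 chain; the direction of each pin fixes the order in which the source component 108, transform component 110, and renderer 112 are connected.

    // Illustrative sketch (hypothetical API, loosely in the spirit of a
    // filter-graph manager): pin direction fixes the order
    // source -> transform -> renderer.
    #include <cstdio>
    #include <string>
    #include <vector>

    enum class PinDir { Input, Output };

    struct Pin { std::string name; PinDir dir; };

    struct Component {
        std::string name;
        std::vector<Pin> pins;
        const Pin* firstPin(PinDir d) const {
            for (const auto& p : pins) if (p.dir == d) return &p;
            return nullptr;
        }
    };

    // Toy "component control unit layer": connects an upstream output pin to a
    // downstream input pin, as a filter-graph manager would.
    bool connect(const Component& upstream, const Component& downstream) {
        const Pin* out = upstream.firstPin(PinDir::Output);
        const Pin* in = downstream.firstPin(PinDir::Input);
        if (!out || !in) return false;
        std::printf("%s.%s --> %s.%s\n", upstream.name.c_str(), out->name.c_str(),
                    downstream.name.c_str(), in->name.c_str());
        return true;
    }

    int main() {
        Component source{"source 108", {{"Output Pin A 114", PinDir::Output}}};
        Component transform{"transform 110", {{"Input Pin A 118", PinDir::Input},
                                              {"Output Pin B 116", PinDir::Output}}};
        Component renderer{"renderer 112", {{"Input Pin B 120", PinDir::Input}}};
        connect(source, transform);   // source has no input pin: it starts the chain
        connect(transform, renderer); // renderer has no output pin: it ends the chain
        return 0;
    }
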
The renderer 112 may receive the processed output data of the transform process and place it on a rendering device. The rendering process may also include displaying multimedia on a screen, playing an audio file on a sound card, writing data to a file, etc. As the renderer 112 is at the end of a component chain, the renderer 112 may have one input pin (Input Pin B 120) and no output pins.
At a basic level, a component configuration may merely include a source component and a renderer. Such a configuration may merely be used for playing a multimedia file without processing.
FIG. 2 is a schematic view of a pin-less multimedia framework 200 architecture, in accordance with one or more embodiments. Particularly, FIG. 2 illustrates a multimedia framework 200 that includes a framework application layer 202, an application interface 204, a component control unit layer 206, a specific interface 216, a monolithic multimedia component 214, a first block 208, a second block 210, a third block 212, and a tunnel 218.
In one or more embodiments, an application at the framework application layer 202 level may perform a required task through the monolithic multimedia component 214 with the help of the component control unit layer 206. The application interface(s) 204 may, therefore, facilitate communication between the application and the component control unit layer 206 by including necessary interfaces required for the aforementioned communication. The component control unit layer 206 may serve as a point of control of an application, and may also control a dataflow through the monolithic multimedia component 214. The multimedia framework 200 may identify roles for the monolithic multimedia component 214 through the specific interface 216.
In one or more embodiments, the multimedia framework 200 may avoid the need for pins by transmitting commands from the application to a tunnel 218 of a multimedia stack 220 interfaced with the monolithic multimedia component 214. In one or more embodiments, the multimedia stack 220 may include a first block 208 to parse an input, one or more of a second block 210 to transform the output of the first block 208, and a third block 212 to place the resulting data of the second block 210 on a rendering device. In one or more embodiments, the output of one block of the multimedia stack 220 may be fed as an input to the next block downstream through the tunnel 218.
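A minimal C++ sketch of this pin-less data path (illustrative only; the block and tunnel names are assumptions, not the framework's API) chains the first block 208, second block 210, and third block 212 so that each block's output is handed directly to the next block downstream through the tunnel 218.

    // Illustrative sketch (hypothetical names) of the pin-less data path: three
    // blocks of the multimedia stack 220 are chained through a tunnel, so the
    // output of one block is handed directly to the next block downstream.
    #include <cstdio>
    #include <functional>
    #include <string>
    #include <vector>

    using Block = std::function<std::string(const std::string&)>;

    // The tunnel simply runs the blocks in order, feeding each output downstream.
    std::string runTunnel(const std::vector<Block>& blocks, std::string data) {
        for (const auto& block : blocks) data = block(data);
        return data;
    }

    int main() {
        Block parse = [](const std::string& in) { return "parsed(" + in + ")"; };      // first block 208
        Block transform = [](const std::string& in) { return "decoded(" + in + ")"; }; // second block 210
        Block render = [](const std::string& in) {                                     // third block 212
            std::printf("render: %s\n", in.c_str());
            return in;
        };
        runTunnel({parse, transform, render}, "input.mp4");  // "input.mp4" is a placeholder name
        return 0;
    }
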
In one or more embodiments, when the application at the framework application layer 202 level queries the component control unit layer 206 for the specific interface 216, the specific interface 216 may pass a pointer thereof to signify a role required by the application of the multimedia framework 200 matching the role identified by the multimedia framework 200 for the monolithic multimedia component 214. In one or more embodiments, commands from the application (e.g., seek, fast-forward, rewind etc.) may then be transmitted from the application to the tunnel 218 of the multimedia stack 220 to enable requisite functions to be performed with proper notification. This may cause the application to be unaware of the underlying component architecture. Specifically, even though the tasks of a source component 108, transform component 110, and renderer 112 of FIG. 1 are performed by the same monolithic multimedia component 214 utilizing the tunnel 218 of the multimedia stack 220, the application may see all source component, transform component, and renderer related interfaces that indicate the functional presence of a source component, transform component, and renderer, akin to FIG. 1.
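The control path described above may be sketched as follows (a hedged illustration with hypothetical interface names, not the actual framework API): the application queries the component control unit layer 206 for a role-specific interface, receives a pointer that is in fact implemented by the single monolithic multimedia component 214, and its commands, such as seek, are forwarded to the tunnel 218 of the multimedia stack 220.

    // Illustrative sketch (hypothetical interfaces): the application queries the
    // component control unit layer for a role-specific interface, receives a
    // pointer implemented by the single monolithic component, and its commands
    // are forwarded to the tunnel of the multimedia stack.
    #include <cstdio>

    struct ISeekable {                        // role the application asks for
        virtual void seek(double seconds) = 0;
        virtual ~ISeekable() = default;
    };

    class MonolithicComponent : public ISeekable {
    public:
        void seek(double seconds) override {
            // The command goes to the tunnel rather than to a pin of a dedicated
            // source/transform/renderer component.
            std::printf("tunnel <- seek(%.1f s)\n", seconds);
        }
    };

    class ComponentControlUnit {
    public:
        explicit ComponentControlUnit(MonolithicComponent* c) : component_(c) {}
        // Returns the requested interface if the monolithic component plays the
        // matching role; the caller never learns that one component backs every role.
        ISeekable* querySeekable() { return component_; }
    private:
        MonolithicComponent* component_;
    };

    int main() {
        MonolithicComponent component;
        ComponentControlUnit control(&component);
        if (ISeekable* seek = control.querySeekable())  // application-side query
            seek->seek(42.0);
        return 0;
    }
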
For example, the multimedia framework 200 may include an audio-related interface that is responsible to change audio-balance. This implies that the audio-related interface may have to be implemented by a renderer. In the multimedia framework 200 of FIG. 2, whenever the audio-related interface is queried by the application, the interface may pass a pointer thereof to the application. The application may see the renderer related interface that indicates the functional presence of a renderer, although internally only a single monolithic multimedia component 214 may be present.
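Continuing the example, a short sketch of such an audio-balance interface (the name IAudioBalance is hypothetical) shows how the application sees what appears to be a renderer-owned interface while the pointer it receives belongs to the monolithic multimedia component 214.

    // Illustrative sketch (hypothetical interface name IAudioBalance): the
    // interface looks like a renderer-owned one to the application, but the
    // pointer handed back belongs to the same monolithic component.
    #include <cstdio>

    struct IAudioBalance {
        virtual void setBalance(float balance) = 0;  // -1.0 = left, +1.0 = right
        virtual ~IAudioBalance() = default;
    };

    class MonolithicComponent : public IAudioBalance {
    public:
        void setBalance(float balance) override {
            std::printf("audio balance set to %+.2f\n", balance);
        }
    };

    int main() {
        MonolithicComponent component;
        IAudioBalance* renderer = &component;  // what the application "sees"
        renderer->setBalance(-0.25f);
        return 0;
    }
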
FIG. 3 is a schematic view of a multimedia processing system 300 using the multimedia framework 200 of FIG. 2, in accordance with one or more embodiments. In one or more embodiments, the multimedia framework 200 in the multimedia processing system 300 may communicate with a local file 302 through the monolithic multimedia component 214. The local file 302 may be an audio file, a video file or an audio/video (A/V) file that serves as the input. In one or more embodiments, the multimedia framework 200 may communicate with a global network of interconnected computers (e.g., Internet 304) through the monolithic multimedia component 214. The input to the monolithic multimedia component 214 may then be a multimedia file stream or a Uniform Resource Locator (URL) including a multimedia file.
The input to the monolithic multimedia component 214 may be processed by the multimedia framework 200, and the processed multimedia data may be played back on the file playback system 306. The file playback system 306 may be a media player or a device capable of playing media content. In one or more embodiments, the file playback system 306 may render a multimedia stream or a URL including a multimedia file for download on a computer or a mobile device. In one or more embodiments, a capability of capturing multimedia prior to creation of an input file may be provided to the multimedia framework 200. In one embodiment, the capture may be accomplished using a web camera or a video camera. In one or more embodiments, the multimedia framework 200 may be provided with the capability of performing a multimedia file format conversion to convenience compatibility in a plurality of multimedia devices. In one embodiment, for example, a high definition (HD) file may be converted to a 3GP file to convenience compatibility on a mobile device.
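For illustration (a hedged sketch; the helper and file names are hypothetical), the inputs accepted by the monolithic multimedia component 214 in FIG. 3 might be distinguished as a local file 302, a multimedia file stream, or a URL naming a multimedia file fetched over the Internet 304.

    // Illustrative sketch (hypothetical helper and file names) of the kinds of
    // input the monolithic component may consume: a local file, a multimedia
    // file stream, or a URL naming a multimedia file.
    #include <cstdio>
    #include <string>

    enum class InputKind { LocalFile, Url, Stream };

    InputKind classifyInput(const std::string& input) {
        if (input.rfind("http://", 0) == 0 || input.rfind("https://", 0) == 0)
            return InputKind::Url;        // fetched over the Internet 304
        if (input.rfind("stream:", 0) == 0)
            return InputKind::Stream;     // multimedia file stream
        return InputKind::LocalFile;      // local file 302
    }

    int main() {
        for (const std::string input : {"song.mp3", "https://example.com/clip.mp4", "stream:camera0"})
            std::printf("%s -> kind %d\n", input.c_str(), static_cast<int>(classifyInput(input)));
        return 0;
    }
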
In one or more embodiments, the multimedia stack 220 may be part of a multimedia processor 350 that operates at a frequency (e.g., 150 MHz) lower than the frequency (e.g., 650 MHz) of a central processor 330 that includes the framework application layer 202, the component control unit layer 206, and the monolithic multimedia component 214. The multimedia stack 220 is shown as being part of the multimedia processor 350 interfaced with the monolithic multimedia component 214.
In one or more embodiments, as most tasks may be offloaded to the multimedia processor 350, the activity of the central processor 330 may be restricted to a requisite parsing. In one or more embodiments, the aforementioned restriction of the activity of the central processor 330 may allow for ultra-low power multimedia playback, thereby resulting in power savings. In one or more embodiments, the lack of need for a pin-based architecture in the multimedia framework 200 of the multimedia processing system 300 of FIG. 3 may provide for memory and resource savings.
In one or more embodiments, a single monolithic multimedia component 214 serving as a parser, decoder, and renderer may reduce the number of components required to be loaded in memory for playback purposes. In one embodiment, the multimedia processor 350 may be part of a System-on-a-Chip (SoC). In one or more embodiments, the performance of the multimedia processor 350 may be improved by the use of multimedia accelerator modules. In one embodiment, the multimedia processing system 300 may be a mobile processor used in mobile phones. The central processor 330 may then be a Central Processing Unit (CPU) of the mobile processor. The CPU, which may be the maximum power consuming element of the multimedia processing system 300, may go into a “sleep” mode, “waking” up only to do the requisite parsing. In one or more embodiments, the multimedia framework 200 may have exclusive compatibility with particular implementations of hardware.
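A hedged sketch of the division of labour in FIG. 3 (the function names and work items are assumptions for illustration) keeps the central processor 330 asleep except for the requisite parsing, while the source, transform, and render work runs on the lower-clocked multimedia processor 350.

    // Illustrative sketch (hypothetical function names and work items) of the
    // processor split: the multimedia processor 350 (lower clock, e.g. 150 MHz)
    // carries the source/transform/render work, while the central processor 330
    // wakes only for the requisite parsing.
    #include <cstdio>
    #include <string>
    #include <vector>

    struct WorkItem { std::string name; bool needsCpuParsing; };

    void runOnMultimediaProcessor(const WorkItem& w) {
        std::printf("[multimedia processor] %s\n", w.name.c_str());
    }

    void runOnCentralProcessor(const WorkItem& w) {
        std::printf("[central processor, woken] %s\n", w.name.c_str());
    }

    int main() {
        std::vector<WorkItem> playback = {
            {"requisite parsing", true},                   // only this wakes the CPU
            {"read input (source component role)", false},
            {"decode/transform bit-stream", false},
            {"render to display/sound card", false},
        };
        for (const auto& w : playback) {
            if (w.needsCpuParsing) runOnCentralProcessor(w);     // brief wake-up
            else                   runOnMultimediaProcessor(w);  // CPU stays asleep
        }
        return 0;
    }
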
FIG. 4 is a process flow diagram that details the operations involved in a method of multimedia processing that offers power savings, in accordance with one or more embodiments. In operation 405, a multimedia processor 350 may be connected to the multimedia processing system 300 including the central processor 330 of FIG. 3. In operation 410, a pin-less multimedia framework 200 of FIG. 2 may be implemented. In operation 415, the tasks related to the source component 108, the transform component 110, and the renderer 112 of FIG. 1 may be executed on the multimedia processor 350 including the multimedia stack 220. The requisite parsing may solely be executed on the central processor 330 to result in power savings. In one embodiment, implementing the multimedia framework 200 in the multimedia processing system 300 may provide for ultra-low power multimedia playback. In one or more embodiments, power consumption may be decreased ten-fold compared to a multimedia processing system including a pin-based multimedia framework 100 implementation. In one example embodiment, the multimedia framework 200 may provide for 100 hours of audio playback using a 900 mAh battery.
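The FIG. 4 flow may be summarized in a short C++ sketch (illustrative only; the function names are hypothetical) that mirrors operations 405, 410, and 415.

    // Illustrative sketch (hypothetical function names) mirroring operations
    // 405, 410, and 415 of FIG. 4.
    #include <cstdio>

    void connectMultimediaProcessor() { std::printf("405: connect multimedia processor 350\n"); }
    void implementPinlessFramework()  { std::printf("410: implement pin-less multimedia framework 200\n"); }
    void executeOffloadedTasks()      { std::printf("415: run source/transform/render tasks on the multimedia processor\n"); }

    int main() {
        connectMultimediaProcessor();  // operation 405
        implementPinlessFramework();   // operation 410
        executeOffloadedTasks();       // operation 415
        return 0;
    }
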
FIG. 5 shows a mobile device 500 including a multimedia framework implementation, in accordance with one or more embodiments. In one or more embodiments, the mobile device 500 may include a processor/media interface module 510 that, in turn, may include a multimedia processor 502 to which most multimedia input, processing, and playback related tasks are off-loaded, and a central processor 504 on which only requisite parsing is executed. In one or more embodiments, the multimedia processor 502 may be interfaced with an audio device 506 that, in turn, may be interfaced with a display 508 to cause an output video to be displayed with audio.
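As a final illustration (hypothetical structs, not a hardware description), the FIG. 5 arrangement can be sketched as a processor/media interface module 510 that pairs the multimedia processor 502 with the central processor 504, with the multimedia processor driving the audio device 506 and, through it, the display 508.

    // Illustrative sketch (hypothetical structs) of the FIG. 5 arrangement:
    // module 510 pairs the multimedia processor 502 with the central processor
    // 504, and the multimedia processor drives the audio device 506, which is
    // interfaced with the display 508.
    #include <cstdio>

    struct Display {
        void show() { std::printf("display 508: video out\n"); }
    };

    struct AudioDevice {
        Display* display;
        void play() {
            std::printf("audio device 506: audio out\n");
            if (display) display->show();
        }
    };

    struct MultimediaProcessor {
        AudioDevice* audio;
        void playback() {
            std::printf("multimedia processor 502: decode and render\n");
            if (audio) audio->play();
        }
    };

    struct CentralProcessor {
        void parse() { std::printf("central processor 504: requisite parsing only\n"); }
    };

    struct ProcessorMediaInterfaceModule {  // module 510
        MultimediaProcessor mm;
        CentralProcessor cpu;
    };

    int main() {
        Display display;
        AudioDevice audio{&display};
        ProcessorMediaInterfaceModule module{{&audio}, {}};
        module.cpu.parse();     // CPU wakes only for parsing
        module.mm.playback();   // everything else is off-loaded
        return 0;
    }
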
Although the present embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the various embodiments. For example, the multimedia processing system 300 of FIG. 3 and the mobile device 500 of FIG. 5 may involve structural modifications that are well within the scope of the various embodiments. Also, for example, the various devices and modules described herein may be enabled and operated using hardware circuitry (e.g., CMOS based logic circuitry), firmware, software or any combination of hardware, firmware, and software. For example, the various electrical structures and methods may be embodied using transistors, logic gates, and electrical circuits (e.g., Application Specific Integrated Circuit (ASIC) circuitry and/or Digital Signal Processor (DSP) circuitry).
In addition, it will be appreciated that the various operations, processes, and methods disclosed herein may be embodied in a machine-readable medium and/or a machine accessible medium compatible with a data processing system (e.g., a computer system), and may be performed in any order (e.g., including using means for achieving the various operations). Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.

Claims (20)

What is claimed is:
1. A multimedia framework to provide multimedia playback on a multimedia player, the multimedia framework comprising:
a monolithic multimedia component to include a specific interface provided by the multimedia framework, the specific interface signifying roles identified by the multimedia framework for the monolithic multimedia component; and
a component control unit layer to serve as a point of control of an application, and to control a data flow through the monolithic multimedia component,
wherein when the application queries the component control unit layer for the specific interface, the specific interface passes a pointer thereof that signifies a role required by the application matching a role identified by the multimedia framework for the monolithic multimedia component, to the application, to indicate support of necessary interfaces providing communication between the application and the component control unit layer, and
wherein a command from the application is transmitted to a tunnel of a
multimedia stack interfaced with the monolithic multimedia component, the multimedia stack comprising more than one of:
a first block to parse an input,
at least one second block to transform a first block output data of the first block, and
a third block to place an output from the at least one second block on a rendering device,
to ensure that the same monolithic multimedia component serves as more than one of:
a source component to read and to parse the input,
at least one transform component to transform the output data of the source component, and
a renderer to place the output of the at least one transform component on the rendering device, and
to further ensure that the application is unaware of the same monolithic multimedia component serving as the at least one of the source component, the at least one transform component, and the renderer.
2. The multimedia framework of claim 1, wherein the input to be read by the monolithic multimedia component when serving as a source component is one of:
an audio file,
a video file,
an audio/video (A/V) file,
a Uniform Resource Locator (URL) comprising a multimedia file stream, and
a URL comprising a multimedia file.
3. The multimedia framework of claim 2, wherein the multimedia framework is capable of capturing a multimedia prior to creation of an input file comprising the multimedia.
4. The multimedia framework of claim 2, wherein the multimedia framework is capable of performing a multimedia file format conversion to convenience compatibility in a plurality of multimedia devices.
5. The multimedia framework of claim 3, wherein the multimedia capturing capability involves multimedia capturing using one of a web camera and a video camera.
6. A non-transitory medium, readable through a multimedia player and including instructions embodied therein to implement a multimedia framework to provide multimedia playback on the multimedia player, comprising:
instructions to signify roles identified by the multimedia framework for a monolithic multimedia component thereof through a specific interface provided by the multimedia framework;
instructions to control a data flow through the monolithic multimedia component through a component control unit layer configured to serve as a point of control of an application;
instructions to pass a pointer through the specific interface signifying a role required by the application matching a role identified by the multimedia framework for the monolithic multimedia component to the application to indicate support of necessary interfaces providing communication between the application and the component control unit layer, in response to the application querying the component control unit layer for the specific interface;
instructions to transmit a command from the application to a tunnel of a multimedia stack interfaced with the monolithic multimedia component; and
instructions to implement the multimedia stack with more than one of:
a first block to parse an input,
at least one second block to transform a first block output data of the first block, and
a third block to place an output from the at least one second block on a rendering device,
to ensure that the same monolithic multimedia component serves as more than one of:
a source component to read and to parse the input,
at least one transform component to transform the output data of the source component, and
a renderer to place the output of the at least one transform component on the rendering device, and
to further ensure that the application is unaware of the same monolithic multimedia component serving as the at least one of the source component, the at least one transform component, and the renderer.
7. The non-transitory medium of claim 6, comprising instructions compatible with the input to be read by the monolithic multimedia component when serving as a source component being one of:
an audio file,
a video file,
an audio/video (A/V) file,
a URL comprising a multimedia file stream, and
a URL comprising a multimedia file.
8. The non-transitory medium of claim 7, comprising instructions to provide a capability to the multimedia framework to capture a multimedia prior to creation of an input file comprising the multimedia.
9. The non-transitory medium of claim 8, comprising instructions to provide a capability to the multimedia framework to perform a multimedia file format conversion to convenience compatibility with a plurality of multimedia devices.
10. The non-transitory medium of claim 8, comprising instructions compatible with the multimedia capturing capability involving multimedia capturing using one of a web camera and a video camera.
11. A multimedia player to execute a multimedia framework to provide multimedia playback thereon, comprising:
a memory for storing instructions; and
a processor which, when executing said instructions, operates to:
signify roles identified by the multimedia framework for a monolithic multimedia component thereof through a specific interface provided by the multimedia framework,
control a data flow through the monolithic multimedia component through a component control unit layer configured to serve as a point of control of an application,
pass a pointer through the specific interface signifying a role required by the application matching a role identified by the multimedia framework for the monolithic multimedia component to the application to indicate support of necessary interfaces providing communication between the application and the component control unit layer, in response to the application querying the component control unit layer for the specific interface,
transmit a command from the application through a tunnel of a multimedia stack interfaced with the monolithic multimedia component, and
implement the multimedia stack with more than one of:
a first block to parse an input,
at least one second block to transform a first block output data of the first block, and
a third block to place an output from the at least one second block on a rendering device,
to ensure that the same monolithic multimedia component serves as more than one of:
a source component to read and to parse the input,
at least one transform component to transform the output data of the source component, and
a renderer to place the output of the at least one transform component on the rendering device, and
to further ensure that the application is unaware of the same monolithic multimedia component serving as the at least one of the source component, the at least one transform component, and the renderer.
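Claim 11 above recites the same monolithic multimedia component serving as the source, transform, and renderer blocks of the multimedia stack while the application remains unaware of that fact. The C++ sketch below is one hypothetical way to picture that arrangement, with the application pushing commands through an opaque tunnel; it is not the patented implementation, and all names (MonolithicComponent, MultimediaStack, Command) are invented.

// Illustrative sketch only -- invented names, not the patented implementation.
#include <cstdio>
#include <string>
#include <vector>

enum class Command { Play, Stop };

// One ("monolithic") component realizing every block of the multimedia stack.
class MonolithicComponent {
public:
    // Source role / first block: read and parse the input.
    std::vector<int> Parse(const std::string& input) {
        std::printf("parsing %s\n", input.c_str());
        return {1, 2, 3};                            // stand-in for parsed samples
    }
    // Transform role / second block: transform the parser's output data.
    std::vector<int> Transform(const std::vector<int>& parsed) {
        std::vector<int> out;
        for (int s : parsed) out.push_back(s * 2);   // stand-in for decoding
        return out;
    }
    // Renderer role / third block: place the output on the rendering device.
    void Render(const std::vector<int>& frames) {
        std::printf("rendering %zu frames\n", frames.size());
    }
    void OnCommand(Command c, const std::string& input) {
        if (c == Command::Play) Render(Transform(Parse(input)));
    }
};

// The tunnel: the application only pushes commands into the stack and stays
// unaware that a single component fills the source, transform, and renderer roles.
class MultimediaStack {
public:
    void SendCommand(Command c, const std::string& input) {
        component_.OnCommand(c, input);
    }
private:
    MonolithicComponent component_;
};

int main() {
    MultimediaStack stack;
    stack.SendCommand(Command::Play, "clip.mp4");    // hypothetical input file
    return 0;
}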
12. The multimedia player of claim 11, wherein the processor is configured to execute instructions compatible with the input to be read by the monolithic multimedia component when serving as a source component being one of:
an audio file,
a video file,
an A/V file,
a URL, comprising a multimedia file stream, and
a URL, comprising a multimedia file.
13. The multimedia player of claim 12, wherein the processor is further configured to execute instructions to capture, through the multimedia framework, a multimedia prior to creation of an input file comprising the multimedia.
14. The multimedia player of claim 13, wherein the processor is further configured to execute instructions to perform, through the multimedia framework, a multimedia file format conversion to convenience compatibility with a plurality of multimedia devices.
15. The multimedia player of claim 13, wherein the processor is configured to execute instructions to enable multimedia capturing using one of a web camera and a video camera.
16. The multimedia player of claim 11,
wherein the processor includes a central processor and a multimedia processor, the central processor operating at a frequency higher than that of the multimedia processor, and
wherein the multimedia framework executes at least one task associated therewith through the multimedia processor.
17. The multimedia player of claim 11,
wherein the multimedia processor operating at the lower frequency is part of a System-on-a-Chip (SoC).
18. The multimedia player of claim 16,
wherein the multimedia processor is configured to improve performance thereof through a use of multimedia accelerator modules.
19. The multimedia player of claim 11,
wherein the processor is a mobile processor used in mobile phones.
20. The multimedia player of claim 16,
wherein the central processor is solely configured to execute a requisite parsing associated with an input data to the processor.
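Claims 16-20 above carry the ultra-low-power aspect: the higher-frequency central processor performs only the requisite parsing of the input, while the remaining pipeline runs on a lower-frequency multimedia processor, optionally backed by multimedia accelerator modules on the SoC. The C++ sketch below models that split with an asynchronous worker standing in for the multimedia processor; it is a hypothetical illustration, not the claimed hardware, and its names and values are invented. The design intent, per the claims, is that offloading decode and render work lets the higher-frequency processor stay largely idle during playback.

// Illustrative sketch only: the worker task models a lower-frequency
// multimedia processor; all names and values are invented.
#include <cstdio>
#include <future>
#include <string>
#include <utility>
#include <vector>

// Work handed off to the multimedia processor (e.g. an SoC block backed by
// multimedia accelerator modules).
static void DecodeAndRender(std::vector<int> parsed) {
    std::printf("multimedia processor: decoding and rendering %zu units\n",
                parsed.size());
}

int main() {
    // Central processor: perform only the requisite parsing of the input.
    std::string input = "clip.mp4";              // hypothetical input file
    std::vector<int> parsed = {1, 2, 3, 4};      // stand-in for the parsed stream
    std::printf("central processor: parsed %s\n", input.c_str());

    // Offload the rest of the pipeline so the higher-frequency processor
    // can remain idle during playback.
    std::future<void> offload =
        std::async(std::launch::async, DecodeAndRender, std::move(parsed));
    offload.wait();
    return 0;
}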
US13/921,196 2009-07-09 2013-06-18 Multimedia framework to provide ultra-low power multimedia playback Active 2030-01-30 US9232206B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/921,196 US9232206B2 (en) 2009-07-09 2013-06-18 Multimedia framework to provide ultra-low power multimedia playback

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/499,809 US8488951B2 (en) 2009-07-09 2009-07-09 Multimedia framework to provide ultra-low power multimedia playback
US13/921,196 US9232206B2 (en) 2009-07-09 2013-06-18 Multimedia framework to provide ultra-low power multimedia playback

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/499,809 Division US8488951B2 (en) 2009-07-09 2009-07-09 Multimedia framework to provide ultra-low power multimedia playback

Publications (2)

Publication Number Publication Date
US20130279874A1 US20130279874A1 (en) 2013-10-24
US9232206B2 true US9232206B2 (en) 2016-01-05

Family

ID=43427532

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/499,809 Active 2031-10-14 US8488951B2 (en) 2009-07-09 2009-07-09 Multimedia framework to provide ultra-low power multimedia playback
US13/921,196 Active 2030-01-30 US9232206B2 (en) 2009-07-09 2013-06-18 Multimedia framework to provide ultra-low power multimedia playback

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/499,809 Active 2031-10-14 US8488951B2 (en) 2009-07-09 2009-07-09 Multimedia framework to provide ultra-low power multimedia playback

Country Status (1)

Country Link
US (2) US8488951B2 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011081367A2 (en) * 2009-12-28 2011-07-07 전자부품연구원 Multimedia-data-processing method
CN105898348A (en) * 2015-12-15 2016-08-24 乐视网信息技术(北京)股份有限公司 Method and device for reducing CPU temperature of video play terminal
JP6631374B2 (en) * 2016-04-13 2020-01-15 富士通株式会社 Information processing apparatus, operation status collection program, and operation status collection method

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5218704A (en) * 1989-10-30 1993-06-08 Texas Instruments Real-time power conservation for portable computers
US6343363B1 (en) * 1994-09-22 2002-01-29 National Semiconductor Corporation Method of invoking a low power mode in a computer system using a halt instruction
KR100409187B1 (en) * 1994-08-16 2004-03-10 소니 가부시끼 가이샤 TV signal receiver and program switching device and method and remote controller
DE19638594A1 (en) * 1996-09-20 1998-03-26 Basf Ag Process for the purification of wastewater and circulating water in papermaking, deinking and pulp bleaching
US7187694B1 (en) * 2002-03-29 2007-03-06 Pmc-Sierra, Inc. Generic packet parser
KR101026593B1 (en) * 2002-10-11 2011-04-04 소니 주식회사 Network control confirmation system, control communication terminal, server, and network control confirmation method
EA011136B1 (en) * 2004-08-31 2008-12-30 Биотек Прогресс, А.С. Method and devices for continuous processing of renewable raw materials
US20060136764A1 (en) * 2004-12-22 2006-06-22 Munguia Peter R Methods and apparatus to manage power consumption of a system
US20070121953A1 (en) * 2005-11-28 2007-05-31 Mediatek Inc. Audio decoding system and method
US20070128334A1 (en) * 2005-12-04 2007-06-07 William Pittman Additives to enhance various distillers grains
US9582060B2 (en) * 2006-08-31 2017-02-28 Advanced Silicon Technologies Llc Battery-powered device with reduced power consumption based on an application profile data
US8103952B2 (en) * 2007-03-27 2012-01-24 Konica Minolta Laboratory U.S.A., Inc. Directed SAX parser for XML documents
KR20090008057A (en) * 2007-07-16 2009-01-21 삼성전자주식회사 Vedio apparatus and method for supplying power using the same
CN101689117A (en) * 2007-07-30 2010-03-31 松下电器产业株式会社 Semiconductor integrated circuit and video/audio processing device using the same

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5848291A (en) 1993-09-13 1998-12-08 Object Technology Licensing Corp. Object-oriented framework for creating multimedia applications
US6421692B1 (en) 1993-09-13 2002-07-16 Object Technology Licensing Corporation Object-oriented multimedia [data routing system] presentation control system
US5511002A (en) 1993-09-13 1996-04-23 Taligent, Inc. Multimedia player component object system
US5913038A (en) * 1996-12-13 1999-06-15 Microsoft Corporation System and method for processing multimedia data streams using filter graphs
US5982360A (en) 1997-06-08 1999-11-09 United Microelectronics Corp. Adaptive-selection method for memory access priority control in MPEG processor
US6452644B1 (en) 1997-06-11 2002-09-17 Koninklijke Philips Electronics N.V. Method of controlling reception in data broadcast receiver
EP1001337A2 (en) * 1998-10-30 2000-05-17 Fujitsu Limited Apparatus, method and architecture for task oriented applications
US20020037160A1 (en) 2000-08-22 2002-03-28 David Locket Multimedia signal processing system
US7292572B2 (en) 2002-12-11 2007-11-06 Lsi Corporation Multi-level register bank based configurable ethernet frame parser
US20040267940A1 (en) * 2003-06-27 2004-12-30 Microsoft Corporation Media plug-in registration and dynamic loading
US8555251B2 (en) * 2005-03-24 2013-10-08 Sony Corporation Signal processing apparatus with user-configurable circuit configuration
US20100153758A1 (en) 2006-08-31 2010-06-17 Ati Technologies Ulc Method and apparatus for optimizing power consumption in a multiprocessor environment
US20090060032A1 (en) 2007-05-11 2009-03-05 Advanced Micro Devices, Inc. Software Video Transcoder with GPU Acceleration
US8165644B2 (en) 2007-08-29 2012-04-24 Qualcomm Incorporated Server initiated power mode switching in portable communication devices
US20100146198A1 (en) 2008-12-10 2010-06-10 Nvidia Corporation Optimal power usage in decoding a content stream stored in a secondary storage
US20120192208A1 (en) 2009-06-29 2012-07-26 Nokia Corporation Method, Apparatus and Computer Program for Providing Multimedia Functions Using a Software Wrapper Component
US8311487B2 (en) 2010-05-06 2012-11-13 Research In Motion Limited Multimedia playback calibration methods, devices and systems
US20130055293A1 (en) 2011-08-31 2013-02-28 Divx, Llc Systems and methods for utilizing supported players via a shared multimedia framework

Also Published As

Publication number Publication date
US20110008012A1 (en) 2011-01-13
US8488951B2 (en) 2013-07-16
US20130279874A1 (en) 2013-10-24

Similar Documents

Publication Publication Date Title
WO2018121014A1 (en) Video play control method and apparatus and terminal device
WO2018113318A1 (en) Multi-channel ddr interleaving control method and device, and storage medium
US9264835B2 (en) Exposing off-host audio processing capabilities
WO2010138785A1 (en) Display and interaction environment for mobile devices
US20180063481A1 (en) Human interface device (hid) based control of video data conversion at docking station
WO2017129130A1 (en) Audio processing method, server, user equipment, and system
US9407863B2 (en) System and method for processing visual information
CN103428582B (en) Video playing method and device and client
US9232206B2 (en) Multimedia framework to provide ultra-low power multimedia playback
WO2017092561A1 (en) Method and apparatus for realizing playing of audio and video contents
US20180063477A1 (en) Tablet docking stations as adapter for video conferencing system
WO2020224337A1 (en) Split-screen playback method and apparatus for screen-locked video, device, and storage medium
US20160026510A1 (en) Structured logging system
WO2018014794A1 (en) Smart television system
US20080010482A1 (en) Remote control of a media computing device
CN111694866A (en) Data searching and storing method, data searching system, data searching device, data searching equipment and data searching medium
US20110145429A1 (en) Multi-granular stream processing
US20120066696A1 (en) Generic hardware and software platform for electronic devices in multimedia, graphics, and computing applications
US9318143B2 (en) Motion detection enabled power optimized display
CN101442627A (en) Control method for peer-to-peer calculation set-top box player
US8612451B1 (en) Searching for data structures
CN109672745A (en) The online control method for playing back of audio and device for FreeRTOS
TWI531964B (en) Method, apparatus and machine-readable medium for audio distribution
CN101009836A (en) Embedded video playing device based on the dual processor
TWI611304B (en) Media shadow files and system

Legal Events

Date Code Title Description
AS Assignment

Owner name: NVIDIA CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KULKARNI, MAYURESH;NADGOUDA, DHIRAJ;REEL/FRAME:030639/0051

Effective date: 20130618

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

CC Certificate of correction
MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8