US8792555B2 - Methods and systems for resizing multimedia content - Google Patents

Methods and systems for resizing multimedia content

Info

Publication number
US8792555B2
US8792555B2
Authority
US
United States
Prior art keywords
data
segment
encoding
module
motion information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US11/668,828
Other versions
US20080037624A1 (en)
Inventor
Gordon Kent Walker
Vijayalakshmi R. Raveendran
Binita Gupta
Phanikumar Bhamidipati
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qualcomm Inc
Original Assignee
Qualcomm Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualcomm Inc
Priority to US11/668,828
Assigned to QUALCOMM INCORPORATED. Assignment of assignors interest (see document for details). Assignors: WALKER, GORDON KENT; GUPTA, BINITA; BHAMIDIPATI, PHANIKUMAR; RAVEENDRAN, VIJAYALAKSHMI R.
Publication of US20080037624A1
Application granted
Publication of US8792555B2
Status: Expired - Fee Related

Classifications

    • All classifications fall under H04N19/00 (Methods or arrangements for coding, decoding, compressing or decompressing digital video signals):
    • H04N19/51 Motion estimation or motion compensation
    • H04N19/587 Temporal sub-sampling or interpolation, e.g. decimation or subsequent interpolation of pictures in a video sequence
    • H04N19/109 Selection of coding mode or of prediction mode among a plurality of temporal predictive coding modes
    • H04N19/11 Selection of coding mode or of prediction mode among a plurality of spatial predictive coding modes
    • H04N19/115 Selection of the code volume for a coding unit prior to coding
    • H04N19/124 Quantisation
    • H04N19/132 Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking
    • H04N19/14 Coding unit complexity, e.g. amount of activity or edge presence estimation
    • H04N19/15 Data rate or code amount at the encoder output by monitoring actual compressed data size at the memory before deciding storage at the transmission buffer
    • H04N19/152 Data rate or code amount at the encoder output by measuring the fullness of the transmission buffer
    • H04N19/154 Measured or subjectively estimated visual quality after decoding, e.g. measurement of distortion
    • H04N19/194 Adaptive coding in which the adaptation method, adaptation tool or adaptation type is iterative or recursive, involving only two passes
    • H04N19/196 Adaptive coding specially adapted for the computation of encoding parameters, e.g. by averaging previously computed encoding parameters
    • H04N19/463 Embedding additional information in the video signal during the compression process by compressing encoding parameters before transmission
    • H04N19/517 Processing of motion vectors by encoding
    • H04N19/61 Transform coding in combination with predictive coding
    • H04N19/85 Pre-processing or post-processing specially adapted for video compression

Definitions

  • the disclosure relates to multimedia encoding and decoding and, more particularly, to multimedia resizing for efficient statistical multiplexing.
  • Data networks, such as wireless communication networks, have to trade off between services customized for a single terminal and services provided to a large number of terminals.
  • For example, the distribution of multimedia content to a large number of resource-limited portable devices (subscribers) is a complicated problem. Therefore, it is very important for network administrators, content retailers, and service providers to have a way to distribute content and/or other network services in a fast and efficient manner for presentation on networked devices.
  • Content delivery/media distribution systems may pack real time and non real time services into a transmission frame and deliver the frame to devices on a network.
  • a communication network may utilize Orthogonal Frequency Division Multiplexing (OFDM) to provide communications between a network server and one or more mobile devices.
  • This technology provides a transmission frame having data slots that are packed with services to be delivered and transmitted over a distribution network.
  • this disclosure describes techniques for resizing multimedia content for efficient statistical multiplexing. More specifically, in response to a request to resize a segment of data, an encoding module resizes the segment of data to reduce the bit rate of the segment of data.
  • bit rate refers to the number of bits used per unit of time to represent the segment of multimedia data. Often bit rate is specified in kilobits per second (kbit/s). Thus, the bit rate of the segment of data corresponds to the size of the segment of data, so reducing the bit rate of the segment of data reduces the size of the segment of data, as the simple calculation sketched below illustrates.
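  • The following minimal sketch (in Python, with a function name and example numbers that are illustrative assumptions rather than values from this disclosure) shows how a bit rate and a segment duration translate into a bit budget, and therefore into a segment size:

```python
def segment_bit_budget(bit_rate_kbps: float, duration_s: float = 1.0) -> int:
    """Approximate number of bits available for a segment of the given
    duration at the given bit rate."""
    return int(bit_rate_kbps * 1000 * duration_s)

# A one-second segment at 256 kbit/s corresponds to about 256,000 bits
# (32,000 bytes), so lowering the bit rate directly lowers the segment size.
print(segment_bit_budget(256))       # 256000 bits
print(segment_bit_budget(256) // 8)  # 32000 bytes
```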
  • the encoding module may resize the segment of data by adjusting the amount of motion information to be encoded. For example, the encoding module may reduce the amount of motion information that is to be encoded. In other words, the encoding module associated with the segment of data to be resized uses fewer bits to encode the motion information, thus reducing the size of the segment of data. In one aspect of this disclosure, the encoding module associated with the selected segment of data may merge two or more motion vectors. As another example, the encoding module may reselect an encoding mode for one or more blocks of pixels of at least one frame within the segment of data to reduce the amount of motion information to be encoded.
  • the encoding module may also adjust one or more encoding variables, such as a bit rate at which the segment of data is re-encoded or a quantization parameter used for re-encoding the segment of data, to further reduce the bit rate of the segment of data.
  • a method for encoding a stream of digital multimedia data comprises receiving a request to resize a segment of data associated with the stream of digital multimedia data and resizing the segment of data by adjusting an amount of motion information to be encoded for the segment of data in response to the request.
  • an apparatus for encoding a stream of digital multimedia data comprises an interface that receives a request to resize a segment of data associated with the stream of digital multimedia data and a resizing module that resizes the segment of data by adjusting an amount of motion information to be encoded for the segment of data in response to the request.
  • an apparatus for encoding a stream of digital multimedia data comprises means for receiving a request to resize a segment of data associated with the stream of digital multimedia data and means for resizing the segment of data by adjusting an amount of motion information to be encoded for the segment of data in response to the request.
  • a processor for processing digital video data is adapted to receive a request to resize a segment of data associated with the stream of digital multimedia data and resize the segment of data by adjusting an amount of motion information to be encoded for the segment of data in response to the request.
  • the techniques described herein may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the techniques may be realized in whole or in part by a computer program product including a computer readable medium comprising instructions that, when executed by a processor, perform one or more of the methods described herein. Accordingly, this disclosure also contemplates a computer-program product for processing digital video data that comprises a computer readable medium comprising instructions that cause at least one computer to receive a request to resize a segment of data associated with a stream of digital multimedia data and resize the segment of data by adjusting an amount of motion information to be encoded for the segment of data in response to the request.
  • FIG. 1 is a block diagram illustrating an exemplary encoding and decoding system.
  • FIG. 2 is a block diagram illustrating another exemplary encoding and decoding system.
  • FIG. 3 is a block diagram illustrating an exemplary encoder module for use within a multimedia encoding device.
  • FIG. 4 is a flow diagram illustrating exemplary operation of an encoder module encoding multimedia data in accordance with the techniques of this disclosure.
  • FIG. 5 is a flow diagram illustrating exemplary operation of an encoder module reducing the amount of motion information in accordance with one of the aspects of this disclosure.
  • FIG. 1 is a block diagram illustrating an exemplary encoding and decoding system 10 .
  • Encoding and decoding system 10 includes a multimedia encoding device 12 and a multimedia decoding device 14 .
  • Multimedia encoding device 12 encodes multimedia data, combines the encoded data and transmits the combined data to multimedia decoding device 14 via a transmission channel 16 .
  • Multimedia encoding device 12 may form part of a broadcast network component used to broadcast one or more channels of multimedia data.
  • Multimedia encoding device 12 may, for example, form part of a wireless base station, server, or any infrastructure node that is used to broadcast one or more channels of encoded multimedia data to one or more wireless devices, such as multimedia decoding device 14 .
  • Multimedia encoding device 12 may encode a plurality of services that include one or more flows of multimedia data, combine the encoded flows and transmit the combined flows to a multimedia decoding device via a transmission channel 16 .
  • the services may include both real-time and non-real-time multimedia content or service such as news, sports, weather, financial information, movies, and/or applications, programs, scripts, electronic mail, file transfer or any other type of suitable content or service.
  • multimedia encoding device 12 encodes, combines, and transmits portions of the flows of data received over a period of time. As an example, multimedia encoding device 12 may operate on the flows on a per second basis.
  • multimedia encoding device 12 encodes one-second segments of data of the plurality of flows, combines the one-second segments of data to form a superframe of data, and transmits the superframe over transmission channel 16 via a transmitter 22 .
  • the term “superframe” refers to a group of segments of data collected over a predetermined time period or window, such as a one second time period or window.
  • the segments of data include one or more frames of data.
  • Although the techniques of this disclosure are described in the context of one-second segments of data, the techniques may also be utilized for combining and transmitting other segments of data, such as segments of data received over a different period of time, which may or may not be a fixed period of time, or for individual frames or sets of frames of data.
  • superframes could be defined to cover larger or smaller time intervals than one-second periods, or even variable time periods.
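  • Purely as a sketch of the superframe notion described above, and not of any actual transmission frame format defined in this disclosure, a superframe can be modeled as a container that collects the segments gathered over one reporting window; the class and field names are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Segment:
    flow_id: int      # which multimedia flow this segment belongs to
    payload: bytes    # encoded data collected over the window (e.g., one second)

@dataclass
class Superframe:
    window_s: float = 1.0                                   # collection window
    segments: List[Segment] = field(default_factory=list)

    def add(self, segment: Segment) -> None:
        self.segments.append(segment)

    def size_bits(self) -> int:
        return 8 * sum(len(s.payload) for s in self.segments)

# Example: three one-second segments combined into a single superframe.
sf = Superframe()
for i in range(3):
    sf.add(Segment(flow_id=i, payload=b"\x00" * 1000))
print(sf.size_bits())   # 24000
```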
  • Multimedia decoding device 14 may comprise a user-device that receives the encoded multimedia data transmitted by multimedia encoding device 12 .
  • decoding device 14 may be implemented as part of a digital television, a wireless communication device, a personal digital assistant (PDA), a laptop computer or desktop computer, a digital music and video device, such as those sold under the trademark “iPod,” or a radiotelephone, such as a cellular, satellite or terrestrial-based radiotelephone.
  • Transmission channel 16 may comprise any wired or wireless medium, or combination thereof.
  • transmission channel 16 is a fixed bandwidth channel.
  • the amount of transmission channel resources available for transmitting the combined flows of data is limited.
  • the number of bits of data that multimedia encoding device 12 may transmit via transmission channel 16 is limited by the amount of transmission channel resources available for transmission.
  • the number of bits of data that multimedia encoding device 12 may transmit via transmission channel 16 is limited by the amount of air-link or air-interface resources available.
  • Transmission channel 16 may comprise one or more radio access technologies, such as global system for mobile communications (GSM), code division multiple access (CDMA), CDMA 2000, wideband CDMA (W-CDMA), CDMA 1x Evolution-Data Optimized (EV-DO), frequency division multiple access (FDMA), time division multiple access (TDMA) or the broad family of standards developed to facilitate wireless networking defined by the various IEEE 802.11x standards.
  • Multimedia encoding device 12 may attempt to output each of the flows of data at a constant quality level or bit rate.
  • the techniques described herein are applicable in either context.
  • multimedia encoding device 12 selects a bit rate for the flows of data based on a target quality level.
  • the target quality level used to determine the bit rate can be pre-selected, selected by a user, selected through an automatic process or a semi-automatic process requiring an input from a user or from another process, or selected dynamically by the encoding device or system based on predetermined criteria.
  • a target quality level can be selected based on, for example, the type of encoding application, or the type of client device that would be receiving the multimedia data. If the number of bits necessary to output each of the flows of data at the target quality level exceeds the amount of bits for which there is available transmission channel resources, multimedia encoding device 12 manages bit allocation among the flows in an attempt to preserve the highest overall quality for the plurality of flows.
  • multimedia encoding device 12 includes encoder modules 18 A- 18 N (collectively, “encoder modules 18 ”), a non-real-time (NRT) service module 19 , a multiplex module 20 and a transmitter 22 .
  • Encoder modules 18 receive flows of digital multimedia data from one or more sources.
  • Encoder modules 18 may, for example, receive the flows of multimedia data from a memory or an image capture device coupled to encoder modules 18 .
  • the flows of multimedia data may comprise live real-time video, audio, or video and audio flows to be coded and transmitted as a broadcast or on-demand, or may comprise pre-recorded and stored video, audio, or video and audio flows to be coded and transmitted as a broadcast or on-demand.
  • Encoder modules 18 send delivery requirements associated with the segments of data, such as quality and rate information for the real-time services of encoder modules 18, to multiplex module 20.
  • NRT service module 19 may also send delivery requirements, such as priority and latency requirements, associated with NRT services.
  • Encoder modules 18 and NRT service module 19 may send the delivery requirements to multiplex module 20 in response to a request from multiplex module 20 .
  • Encoder modules 18 and NRT service module 19 may communicate with multiplex module 20 using one or more control channels in accordance with a number of different communication protocols.
  • Encoder modules 18, NRT service module 19 and multiplex module 20 may, for example, communicate using protocols that utilize the message transport layer (MTL) as the underlying transport mechanism.
  • Multiplex module 20 analyzes the delivery requirements, e.g., the quality and rate information, priority requirements and latency requirements, to determine whether there are sufficient transmission channel resources to transmit the segments of data that encoder modules 18 desire to include in the current superframe. In other words, multiplex module 20 determines whether the segments of data that encoder modules 18 and NRT service module 19 desire to include in the current superframe fit within the fixed bandwidth channel.
  • Multiplex module 20 may, for example, determine an amount of transmission channel resources necessary to send each of the segments of data at the sizes and/or bit rates corresponding to a selected one of the quality levels. Multiplex module 20 may sum the amounts of transmission channel resources necessary to send the segments of data and compare the sum total of transmission channel resources required by all the segments of data with an amount of available transmission channel resources to determine whether there are sufficient transmission channel resources to send the segments of data. If multiplex module 20 determines that the plurality of segments of data do not fit within the available bandwidth, e.g., the sum total of necessary transmission channel resources exceeds the available transmission channel resources, multiplex module 20 selects one or more of the segments to be resized. Multiplex module 20 may attempt to select the segments of data to be resized that have a least amount of impact in quality at the corresponding reduced size.
  • Multiplex module 20 sends a request to encoder modules 18 associated with the selected segments of data to resize the flows of digital multimedia data in accordance with the reduced bit allocation or reduced bit rate. Additionally, multiplex module 20 may send a request to NRT module 19 to resize one or more of the NRT services. Multiplex module 20 may send a resize request to the encoder modules 18 and NRT module 19 associated with the selected segments via the control channel. The resize request may specify a maximum size, e.g., in bits, for the selected segment of data or a reduced bit rate for the segment of data.
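  • The sketch below illustrates the kind of admission check and resize-request selection described above for multiplex module 20; the data structures, the per-segment quality-impact estimate, and the greedy selection heuristic are assumptions made for illustration, not the algorithm prescribed by this disclosure.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class SegmentInfo:
    flow_id: int
    required_bits: int    # bits needed at the currently selected quality level
    reduced_bits: int     # bits needed after resizing to a lower quality level
    quality_drop: float   # estimated quality impact of that reduction

def select_segments_to_resize(segments: List[SegmentInfo],
                              available_bits: int) -> Dict[int, int]:
    """Return {flow_id: maximum size in bits} for the segments that should be
    resized so that all segments fit the fixed-bandwidth channel."""
    total = sum(s.required_bits for s in segments)
    resize_requests: Dict[int, int] = {}
    # Resize the segments whose reduction has the least impact on quality first.
    for seg in sorted(segments, key=lambda s: s.quality_drop):
        if total <= available_bits:
            break
        total -= seg.required_bits - seg.reduced_bits
        resize_requests[seg.flow_id] = seg.reduced_bits
    return resize_requests
```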
  • Encoder modules 18 associated with the selected segments of data receive the resize requests associated with their respective segments of data, and resize the segments of multimedia data.
  • Encoder modules 18 and NRT service module 19 may resize the segments of data in a number of different ways. With respect to the real-time services, encoder modules 18 associated with the selected segments of data adjust the amount of motion information that is to be encoded to resize the segment of data. For example, encoder modules 18 may reduce the amount of motion information that is to be encoded. In other words, encoder modules 18 associated with the selected segments of data use fewer bits to encode the motion information, thus reducing the size of the segments of data.
  • encoder modules 18 associated with the selected segments of data may merge two or more motion vectors to reduce the amount of motion information to be encoded.
  • encoder modules 18 may reselect an encoding mode for one or more blocks of pixels of at least one frame within the segment of data. As will be described in more detail below, reselecting the encoding mode for the blocks of pixels may have a similar effect to merging motion vectors.
  • encoder modules 18 may increase the amount of motion information that is to be encoded when the resize request indicates that there is additional bandwidth available, e.g., again by reselecting an encoding mode for one or more blocks of pixels of a frame within the segment of data.
  • the encoding modes may, for example, comprise one or more prediction modes.
  • the encoding modes may comprise prediction modes, such as macroblock level inter-modes (e.g., Inter 16×16, Inter 16×8, Inter 8×16, and Inter 8×8) or 8×8 sub-partition modes (e.g., Inter 8×8, Inter 8×4, Inter 4×8 and Inter 4×4) as well as intra-prediction modes for the same block sizes.
  • the encoding modes may also include forward, backward or bi-directional prediction mode selection for each of the inter-modes.
  • Encoder modules 18 associated with the selected segments of data may also adjust one or more encoding variables in addition to adjusting the amount of motion information to be encoded. In other words, encoder modules 18 may adjust one or more encoding variables and reduce the amount of motion information to be encoded to further resize the segments of data. For example, encoder modules 18 associated with the selected segments of data may adjust a bit rate at which the segments of data are re-encoded. As described above, encoder modules 18 associated with the selected segments of data may reduce the bit rate as specified in the resize request. Alternatively, encoder modules 18 may determine the reduced bit rate based on the number of bits allocated to the segment of data.
  • encoder modules 18 associated with the selected segments of data may adjust quantization parameters (QPs) used to re-encode the segments of data or the frame rate at which to encode subsequent segments of data.
  • Encoder modules 18 re-encode the segments of data or size-adjusted segments of data using the adjusted encoding variables. In this manner, encoder modules 18 associated with the selected segments of data resize the segments of data to satisfy the size or rate requirements specified in the resize requests.
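  • A minimal sketch of adjusting encoding variables in response to a resize request. The quantization-parameter step of six mirrors the re-encoding experiment discussed with Table 1 later in this description, the 0-51 QP range is an H.264-style assumption, and the remaining names and numbers are illustrative.

```python
def adjusted_encoding_variables(original_bit_rate_kbps: float,
                                original_qp: int,
                                max_bits: int,
                                duration_s: float = 1.0,
                                qp_step: int = 6,
                                qp_max: int = 51) -> tuple:
    """Derive a reduced bit rate from the bit budget in a resize request and a
    coarser quantization parameter for re-encoding the segment."""
    reduced_bit_rate = min(original_bit_rate_kbps,
                           max_bits / (1000 * duration_s))
    # A larger QP means coarser quantization and therefore fewer bits.
    new_qp = min(original_qp + qp_step, qp_max)
    return reduced_bit_rate, new_qp

# Example: a 450 kbit/s one-second segment asked to fit into 300,000 bits.
print(adjusted_encoding_variables(450.0, 30, 300_000))   # (300.0, 36)
```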
  • Multiplex module 20 collects the encoded segments of data when multiplex module 20 is ready to generate the current superframe. Multiplex module 20 may, for example, send transfer requests to encoder modules 18 via the control channel. In response to the requests, encoder modules 18 send the encoded segments of multimedia data to multiplex module 20 . Multiplex module 20 combines the flows of multimedia data to form a superframe and sends the superframe to transmitter 22 for transmission to one or more decoding devices via transmission channel 16 . In this manner, multiplex module 20 manages bit allocation among the flows to fit all the segments of data into the fixed bandwidth channel 16 while preserving the highest overall quality of the plurality of flows of data.
  • multiplex module 20 may receive delivery requirements, such as priority and latency requirements, for non-real-time services from NRT service module 19 , and analyze the delivery requirements of both the real-time and non-real-time services to determine whether the services fit within the fixed bandwidth channel.
  • Multiplex module 20 may also require resizing of one or more of the non-real-time services. In this manner, multiplex module 20 may arbitrate between real-time services and non-real-time services.
  • this disclosure describes use of the encoding techniques in the context of real-time services.
  • The components in multimedia encoding device 12 are exemplary of those applicable to implement the techniques described herein. Multimedia encoding device 12, however, may include many other components, if desired. For example, multimedia encoding device 12 may include more than one NRT service module 19. Moreover, the techniques of this disclosure are not necessarily limited to use in a system like that of FIG. 1, nor to a broadcast system. The techniques may find application in any multimedia encoding environment in which encoding techniques are used to encode a plurality of flows of multimedia data for transmission over a transmission channel with limited bandwidth. The illustrated components of multimedia encoding device 12 may be integrated as part of an encoder/decoder (CODEC).
  • multimedia encoding device 12 may be implemented as one or more processors, digital signal processors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), discrete logic, software, hardware, firmware, or any combinations thereof.
  • multimedia encoding device 12 may comply with a multimedia coding standard such as Moving Picture Experts Group (MPEG-4), one or more of the standards developed by the International Telecommunication Union Standardization Sector (ITU-T), e.g., H.263 or ITU-T H.264, or another coding standard.
  • FIG. 2 is a block diagram illustrating another exemplary encoding and decoding system 30 .
  • Encoding and decoding system 30 conforms substantially to encoding and decoding system 10 of FIG. 1, but the resizing of selected segments of multimedia data is performed by resizing modules 32A-32N (collectively, “resizing modules 32”) associated with the selected segments of data.
  • the functionality of encoder modules 18 of FIG. 1 is divided between encoder modules 34 A- 34 N (collectively, “encoder modules 34 ”) and resizing modules 32 .
  • encoder modules 34 provide multiplex module 20 with delivery requirements, such as at least quality and rate information, associated with each of the segments of data for use in allocating the available bandwidth to the segments of data and selecting one or more of the segments of data to be resized when the allocation fails.
  • Resizing modules 32 receive requests from multiplex module 20 to resize the segments of data in accordance with the requirements specified by multiplex module 20 in the resize request.
  • resizing modules 32 associated with the selected segments of data reduce the amount of motion information to be encoded, e.g., by merging one or more motion vectors in accordance with the techniques described herein, thus reducing the size of the segments of data.
  • resizing modules 32 may adjust one or more encoding variables, e.g., bit rate, frame rate or QP, to reduce the size of the segments of data.
  • FIG. 3 is a block diagram illustrating an exemplary encoder module 40 for use within a multimedia encoding device, such as multimedia encoding device 12 of FIG. 1 .
  • Encoder module 40 may, for example, represent any one of encoder modules 18 of encoding device 12 of FIG. 1 .
  • Encoder module 40 includes a multiplex module interface 42 , a content classification module 44 , quality-rate information generation module 46 , and an encoding module 48 .
  • Encoding module 48 further includes a resizing module 50 that resizes segments of data selected for resizing.
  • Encoder module 40 receives one or more flows of multimedia data from a source.
  • Encoder module 40 may, for example, receive the flows of multimedia data from a memory or an image capture device coupled to encoder module 40 .
  • the flows of multimedia data may comprise live real-time video, audio, or video and audio flows to be coded and transmitted as a broadcast, or may comprise a pre-recorded and stored video, audio, or video and audio flows to be coded and transmitted as a broadcast or on-demand.
  • Encoder module 40 may be configured to operate at a constant bit rate or quality level. For example, in some cases, encoder module 40 attempts to maintain a constant perceived quality metric for the flows of data regardless of the content of the data. In other words, encoder module 40 may attempt to output every flow of data at a target quality level. To maintain a constant or similar perceived quality level, encoder module 40 may select different bit rates for segments of data with different content. To this end, content classification module 44 classifies the segments of data based on their content. Content classification module 44 may classify the segment of data based on the complexity (e.g., spatial complexity and/or temporal complexity) of the data of the segment. One exemplary content classification method is described in co-pending and commonly assigned U.S. patent application Ser. No.
  • content classification module 44 may classify motion information, e.g., motion vectors, into categories of “high,” “medium,” and “low” (on an x-axis) and classify texture information, e.g., contrast ratio values, into categories of “high,” “medium,” and “low,” (on a y-axis) and the content classification is indicated at the point of intersection.
  • This classification may be associated, for example, with a particular quality-rate curve.
  • Content classification module 44 associates the segments of data with one or more delivery requirements based on the classifications.
  • Content classification module 44 may, for example, associate the segments of data with respective quality and rate information, such as quality-rate curves, quality-rate tables or the like.
  • the quality-rate curves model a quality metric, such as peak signal to noise ratio (PSNR), as a function of a bit rate.
  • Encoder module 40 may be configured with quality-rate curves that have been computed offline.
  • Quality-rate information generation module 46 may maintain a plurality of quality-rate curves that represent quality-rate characteristics for flows of data with varying content. As an example, quality-rate information generation module 46 may maintain quality-rate curves for eight different classes associated with varying levels of motion and texture in the content of the flows.
  • quality-rate information generation module 46 may maintain quality-rate curves that use a quality metric other than PSNR, such as mean opinion scores (MOS). Alternatively, quality-rate information generation module 46 may adjust the quality-rate curves to account for the fact that constant PSNR does not necessarily mean constant perceived quality. For example, quality-rate information generation module 46 may adjust traditional quality-rate curves by an offset as described in detail in co-pending and commonly assigned U.S. patent application Ser. No. 11/373,577, entitled “CONTENT CLASSIFICATION FOR MULTIMEDIA PROCESSING” and filed on Mar. 10, 2006, the entire content of which is incorporated herein by reference.
  • quality-rate information generation module 46 may adjust the target quality level associated with each of the content curves by an offset. Segments of data that include high motion, high texture content may, for example, be encoded at a slightly lower quality with respect to the target quality level, whereas segments of data that include low motion, low texture content may be encoded at slightly higher quality with respect to the target quality level. Because each content class has its own adjusted quality level relative to the overall target quality level, encoder module 40 may normalize the quality level for each content class to measure the current quality level at encoder module 40 .
  • content classification module 44 may associate the segments of data with pre-computed quality-rate tables that indicate one or more quality levels associated with the segments and sizes of the segment at each of the quality levels. To do so, content classification module 44 may associate the segment of data with a quality-rate curve, which corresponds to a particular one of the quality-rate tables. Quality-rate information generation module 46 may pre-compute the quality-rate curves, the adjusted quality-rate curves, or quality-rate tables, and store the pre-computed quality and rate information within a memory (not shown). Content classification module 44 may access the pre-computed quality and rate information when needed. Alternatively, quality-rate information generation module 46 may generate quality and rate information for the segments of data in real-time. For example, quality-rate information generation module 46 may create quality-rate tables based on the quality-rate curve associated with the segment of data.
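  • A hedged sketch of the pre-computed quality and rate information described above: a per-content-class table mapping candidate bit rates to an estimated quality level (PSNR is used here), which can be consulted when a bit rate must be chosen for a target quality. The class names and numbers are invented for illustration and are not values from this disclosure.

```python
# Hypothetical per-content-class quality-rate tables: candidate bit rates
# (kbit/s) mapped to an estimated quality level (here PSNR in dB).
QUALITY_RATE_TABLES = {
    "low_motion_low_texture":   {128: 38.0, 192: 40.5, 256: 42.0},
    "high_motion_high_texture": {256: 33.0, 384: 35.5, 512: 37.0},
}

def rate_for_target_quality(content_class: str, target_quality_db: float) -> int:
    """Return the smallest tabulated bit rate whose estimated quality meets
    the target quality level for the given content class."""
    table = QUALITY_RATE_TABLES[content_class]
    for rate in sorted(table):
        if table[rate] >= target_quality_db:
            return rate
    return max(table)   # fall back to the highest tabulated rate

print(rate_for_target_quality("high_motion_high_texture", 35.0))   # 384
```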
  • Encoder module 40 sends the delivery requirements, e.g., the quality and rate information, associated with each of the segments of data to be included within the current superframe to multiplex module 20 ( FIG. 1 ) via multiplex module interface 42 .
  • the quality and rate information assists multiplex module 20 in monitoring the size of the current superframe and determining which of the segments of data to resize, if resizing is required to fit the segments of data within the current superframe.
  • Encoder module 40 may send the quality and rate information to multiplex module 20 in response to a request from multiplex module 20 .
  • the quality and rate information may comprise a quality-rate curve or quality-rate table associated with the segment of data.
  • resizing module 50 may adjust the amount of motion information to be encoded. For example, resizing module 50 may reduce the amount of motion information associated with the segments of data.
  • each of the segments of data includes one or more frames of data.
  • the frames of data may be partitioned into a plurality of blocks of pixels. Some blocks of pixels, often referred to as “macroblocks,” comprise a grouping of sub-blocks of pixels.
  • a 16×16 macroblock may comprise four 8×8 sub-blocks of pixels.
  • the H.264 standard permits encoding of blocks with a variety of different sizes, e.g., 16×16, 16×8, 8×16, 8×8, 4×4, 8×4, and 4×8.
  • Each of the sub-blocks may include at least one motion vector describing the motion field for that particular sub-block.
  • resizing module 50 may merge motion vectors of these sub-blocks to generate a single motion vector for the macroblock. Additionally, resizing module 50 may merge the motion vectors associated with the macroblocks to generate a single motion vector for the frame. Resizing module 50 thus reduces the number of motion vectors, resulting in a reduction in the amount of information to be encoded and thus the size of the segment of data.
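  • A minimal sketch of the motion-vector merging idea just described: several sub-block motion vectors are collapsed into a single vector for the enclosing block, so fewer motion vectors need to be encoded. A component-wise median is assumed here as the merge rule; this disclosure does not prescribe a particular rule.

```python
from statistics import median
from typing import List, Tuple

MotionVector = Tuple[int, int]   # (dx, dy) displacement components

def merge_motion_vectors(sub_block_mvs: List[MotionVector]) -> MotionVector:
    """Collapse several sub-block motion vectors into a single vector for the
    enclosing block, reducing the number of motion vectors to encode."""
    dx = int(median(mv[0] for mv in sub_block_mvs))
    dy = int(median(mv[1] for mv in sub_block_mvs))
    return (dx, dy)

# Four 8x8 sub-block vectors merged into one 16x16 macroblock vector.
print(merge_motion_vectors([(3, 1), (4, 1), (3, 2), (5, 1)]))   # (3, 1)
```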
  • resizing module 50 reselects an encoding mode for one or more blocks of pixels of at least one frame within the segment of data based on the new bit budget, i.e., based on the maximum number of bits or bit rate indicated in the resize request. As described above, resizing module 50 may reselect an encoding prediction mode based on the new bit budget.
  • resizing module 50 may reselect inter-prediction modes (e.g., Inter 16×16, Inter 16×8, Inter 8×16, and Inter 8×8) or 8×8 sub-partition prediction modes (e.g., Inter 8×8, Inter 8×4, Inter 4×8 and Inter 4×4) as well as intra-prediction modes for one or more of the blocks of pixels of a frame within the segment of data.
  • the encoding modes may also include forward, backward or bi-directional prediction mode selection for each of the inter-modes.
  • resizing module 50 may reselect a forward, backward or bidirectional prediction mode for each of the selected inter-modes. In other words, resizing module 50 redoes the mode decision for the macroblocks, sub-blocks or other partition of pixels.
  • resizing module 50 may reselect the encoding mode based on the information generated during a first pass encoding.
  • encoding module 48 traverses each mode during a first pass encoding and generates motion vectors for each of the sub-blocks corresponding to each mode.
  • encoding module 48 generates a number of different motion vectors for each sub-block, with each of the different motion vectors corresponding to a particular mode.
  • encoding module 48 selects a mode for the blocks of pixels and encodes the motion vectors for the selected modes.
  • the mode information generated during the first pass encoding, i.e., the motion vectors associated with each of the modes, is stored in a memory (not shown) for use by resizing module 50 during reselection.
  • resizing module 50 may reselect the mode for one or more of the sub-blocks according to the new bit budget specified by multiplex module 20 in an attempt to select a better mode from the modes of the first pass. For example, resizing module 50 may reselect the mode to use the Inter 16×16 mode, which has only one motion vector, to replace the original Inter 16×8 mode, which needs two motion vectors. In this manner, reselecting the encoding mode for one or more blocks of pixels results in an outcome similar to merging motion information. Resizing module 50 retrieves the motion information generated for the newly selected mode, e.g., the Inter 16×16 mode in the above example, that was stored in the memory during the first pass encoding.
  • resizing module 50 may generate motion information for the reselected mode using motion information associated with the old mode. For example, resizing module 50 may merge the motion information of the Inter 16×8 mode to generate a motion vector for the new Inter 16×16 mode. Resizing module 50 may reselect the mode decision based on a quality-rate optimization rule. In other words, resizing module 50 may reselect the mode that has a small impact on the quality of the segment of data but a significant impact on bit rate savings. Since motion estimation is not necessary (although it may be performed) during the resizing, the computational complexity compared to the first pass is marginal. In this manner, resizing module 50 redoes the mode decision to select the motion vectors to merge together.
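  • The sketch below illustrates a quality-rate (rate-distortion style) reselection of the kind just described: first-pass candidate modes, whose motion information was cached during the first pass, are re-scored against the tighter bit budget and the lowest-cost mode is kept. The cost weighting, data layout, and numbers are assumptions made for illustration, not this disclosure's formulation.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class ModeCandidate:
    motion_vectors: List[Tuple[int, int]]   # cached during first-pass encoding
    motion_bits: int                        # bits needed to encode this mode's motion info
    distortion: float                       # e.g., residual SAD for this mode

def reselect_mode(candidates: Dict[str, ModeCandidate],
                  lagrange_multiplier: float) -> str:
    """Pick the mode minimizing distortion + lambda * rate.  A tighter bit
    budget (larger lambda) favors modes with fewer motion vectors, which has
    an effect similar to merging motion information."""
    return min(candidates,
               key=lambda mode: candidates[mode].distortion
                                + lagrange_multiplier * candidates[mode].motion_bits)

first_pass = {
    "Inter16x8":  ModeCandidate([(3, 1), (4, 1)], motion_bits=24, distortion=100.0),
    "Inter16x16": ModeCandidate([(3, 1)],         motion_bits=12, distortion=110.0),
}
print(reselect_mode(first_pass, lagrange_multiplier=0.5))  # Inter16x8  (112 < 116)
print(reselect_mode(first_pass, lagrange_multiplier=2.0))  # Inter16x16 (134 < 148)
```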
  • TABLE 1 includes results comparing the sizes of data re-encoded using the mode of the first pass and data re-encoded after redoing the mode decision for four segments of data. The different segments are identified as an animated segment, a music segment, a news segment and a sports segment.
  • TABLE 1 lists three types of coding results (column 1), the total bit rate, i.e., size (column 2), the base layer to enhancement layer ratio (B:E) (column 3), the percentage of motion information in P frames (column 4), the base layer luma PSNR (column 5) and the enhancement layer luma PSNR (column 6).
  • the three types of coding results are coding results for the first pass (first row of each segment), re-encoding results using the mode of the first pass (second row of each segment), and re-encoding results using the reselected mode decisions described above (third row of each segment).
  • the quantization parameter for re-encoding is obtained by increasing the quantization parameter used in the first pass by six.
  • re-encoding after redoing the mode decision can reduce the rate more significantly than re-encoding using the first pass mode.
  • the mode decided at the first pass is not optimal for a different bit budget. Therefore redoing the mode decision may significantly reduce the size of the segment of data.
  • resizing module 50 may also adjust one or more encoding variables to reduce the size of the selected segment of data.
  • the encoding variables may include a bit rate at which the selected segments of data are re-encoded, a QP at which the selected segments of data are re-encoded or the like.
  • resizing module 50 may re-encode the segment of data at a reduced bit rate to reduce the size of the selected segment of data.
  • the reduced bit rate may be specified within the resize request.
  • rate control module 52 may select a reduced bit rate based on other information, such as a maximum size, specified in the resize request.
  • resizing module 50 may make other adjustments to resize the segment of data such as re-encoding the segment of data using an adjusted quantization parameter.
  • resizing of the segment of data may cause the quality level of the segment of data to fall below the target quality level.
  • multiplex module 20 selects the segments to be re-encoded such that the overall quality of all the segments of data is preserved. If the quality level of the resized segment of data falls below a minimum quality level associated with encoder module 40 , resizing module 50 may resize the segment of data such that the quality level of the resized segment of data is greater than or equal to the minimum quality level.
  • rate control module 52 may select a higher bit rate that results in the segment of data being encoded at the minimum quality level.
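  • As a sketch of the minimum-quality safeguard just described; the table lookup, names, and numbers are assumed for illustration only.

```python
from typing import Dict

def clamp_bit_rate_to_min_quality(requested_rate_kbps: float,
                                  quality_rate_table: Dict[int, float],
                                  min_quality_db: float) -> float:
    """If the requested (reduced) bit rate would drop quality below the
    encoder's minimum quality level, raise it to the smallest tabulated rate
    that still meets that minimum."""
    acceptable = [rate for rate, quality in quality_rate_table.items()
                  if quality >= min_quality_db]
    floor_rate = min(acceptable) if acceptable else max(quality_rate_table)
    return max(requested_rate_kbps, floor_rate)

table = {256: 33.0, 384: 35.5, 512: 37.0}
print(clamp_bit_rate_to_min_quality(200.0, table, min_quality_db=35.0))   # 384
```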
  • Encoder module 40 receives a request from multiplex module 20 to send the encoded segments of data to be included within the current superframe. In response to the request from multiplex module 20 , encoder module 40 sends the encoded segments of data to multiplex module 20 . As described above, encoder module 40 sends the segments of data that were not selected for resizing at the original bit rate and sends the segments of data that were selected for resizing at the reduced bit rate.
  • The components in encoder module 40 are exemplary of those applicable to implement the techniques described herein. Encoder module 40, however, may include many other components, if desired.
  • the components in encoder module 40 may be implemented as one or more processors, digital signal processors, ASICs, FPGAs, discrete logic, software, hardware, firmware, or any combinations thereof.
  • encoder module 40 may comply with a multimedia coding standard such as MPEG-4, ITU-T H.263, ITU-T H.264, or another coding standard. Depiction of different features as modules is intended to highlight different functional aspects of encoder module 40 and does not necessarily imply that such modules must be realized by separate hardware or software components. Rather, functionality associated with one or more modules may be integrated within common or separate hardware or software components. Thus, the disclosure should not be limited to the example of encoder module 40 .
  • FIG. 4 is a flow diagram illustrating exemplary operation of an encoder module, such as encoder module 40 of FIG. 3 , encoding multimedia data in accordance with the techniques of this disclosure.
  • Encoder module 40 receives one or more flows of multimedia data from a source ( 60 ).
  • Encoder module 40 may, for example, receive the flows of multimedia data from a memory or an image capture device coupled to encoder module 40 .
  • the flows of multimedia data may comprise live real-time content, non real-time content, or a combination of real-time content and non real-time content.
  • Encoder module 40 classifies the segments of data based on their content ( 62 ).
  • Content classification module 44 may, for example, classify the received segments of data based on the complexity (e.g., spatial complexity and/or temporal complexity) of the data of the segment.
  • Content classification module 44 further associates the segments of data with quality and rate information based on the classification ( 64 ).
  • content classification module 44 may associate the segments of data with one of a plurality of quality-rate curves. As described above, the quality-rate curves may be pre-computed and stored in a memory.
  • content classification module 44 may associate the segments of data with one of a plurality of pre-computed quality-rate tables.
  • Encoder module 40 may generate additional quality and rate information for the segments of data ( 66 ).
  • quality and rate information generation module 46 may generate quality-rate tables for each of the segments of data.
  • the quality-rate tables indicate one or more quality levels associated with the segments of data and sizes of the segment of data at each of the quality levels.
  • Encoder module 40 sends the quality and rate information associated with the segment of data to a multiplex module 20 ( 68 ).
  • Encoder module 40 may, for example, send the quality and rate information associated with the segment of data in response to a request from the multiplex module.
  • Encoder module 40 may, for example, send a quality-rate curve and/or a quality-rate table associated with the segment of data.
  • the multiplex module uses the quality and rate information to monitor the size of a current superframe and to assist the multiplex module in determining which of the segments of data need to be resized.
  • encoder module 40 receives a resize request from the multiplex module 20 ( 70 ).
  • the resize request from the multiplex module 20 may include a reduced bit rate or maximum size, e.g., in bits, for the segment of data.
  • resizing module 50 resizes the encoded segment of data to reduce the size of the segment of data ( 72 ).
  • encoder modules 18 associated with the selected segments of data reduce the amount of motion information that is to be encoded, thus reducing the size of the segments of data. For example, encoder modules 18 associated with the selected segments of data may merge two or more motion vectors.
  • encoder modules 18 may reselect an encoding mode of one or more blocks of pixels based on the new bit budget (i.e., size) to reduce the size of the motion information to be encoded. Reselecting the encoding mode has the same result as merging motion vectors together. Additionally, resizing module 50 may adjust one or more encoding variables to further reduce the size of the segment of data. Resizing module 50 may, for example, re-encode the segment of data at a reduced bit rate a higher QP to reduce the size of the segment of data.
  • Encoder module 40 receives a request from the multiplex module 20 to send the encoded content of the segments of data to be included within the current superframe ( 74 ). In response to the request from the multiplex module, encoder module 2600 sends the encoded content of the segment of data to multiplex module 20 ( 76 ). As described above, encoder module 40 sends segments of data that were not selected for resizing at the original size and sends segments of data that were selected for resizing at the reduced size.
  • FIG. 5 is a flow diagram illustrating exemplary operation of an encoder module, such as encoder module 40 ( FIG. 3 ), reducing the amount of motion information in accordance with one of the aspects described herein.
  • encoding module 48 selects a block of pixels of a frame within the segment of data ( 80 ).
  • Encoding module 48 generates motion information for each mode for the selected block of pixels ( 82 ).
  • encoding module 48 may generate motion information for a forward prediction mode, a backward prediction mode and a bidirectional prediction mode.
  • Encoding module 48 stores the motion information generated for each of the modes in a memory for use during resizing ( 86 ).
  • Encoding module 48 selects one of the modes for the block of pixels to be used during a first pass encoding ( 84 ).
  • Encoding module 48 may, for example, select the modes for the blocks of pixels during the first pass encoding based on a quality-rate optimization rule. In other words, encoding module 48 selects the modes to achieve a target quality level.
  • Encoding module 48 determines whether there are any more blocks of pixels within the frames of the segment of data ( 88 ). When there are additional blocks of pixels within the frames of the segment of data, encoding module 48 selects the next block of pixels, generates motion information for each mode for the selected block of pixels, stores the motion information generated for each of the modes in a memory and selects one of the modes for the block of pixels to be used during a first pass encoding. If there are no additional blocks of pixels, encoding module waits to receive a resize request for the segment of data ( 90 ).
  • resizing module 50 Upon receiving a resize request for the segment of data, resizing module 50 reselects the mode for one or more of the blocks of pixels ( 92 ). Resizing module 50 may, for example, reselect the encoding mode to use Inter16 ⁇ 16 mode which has only one motion vector to replace the original Intra 16 ⁇ 8 mode which needs two motion vectors. Resizing module 50 retrieves the motion information generated for the Inter 16 ⁇ 16 mode and stored in the memory during the first pass encoding. Thus, redoing the mode decision during the re-encode has a similar result to merging one or more motion vectors. Resizing module 50 may reselect the mode decision based on a quality-rate optimization rule. In other words, resizing module 50 may reselect the mode that has a small impact on the quality of the segment of data but a significant impact on bit rate saving.
  • Resizing module 50 re-encodes the segment of data with the reduced motion information ( 96 ). In this manner, resizing module 50 resizes the segment of data by reducing the amount of motion information associated with the segment of data. As described above, resizing module 50 may additionally adjust one or more encoding variables to further resize the segment of data. For example, resizing module 50 may reduce a bit rate at which the segment of data is re-encoded or increase a quantization parameter used during re-encoding of the segment of data.
  • Computer-readable media may include computer storage media, communication media, or both, and may include any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a computer. Such computer-readable media can comprise RAM, such as synchronous dynamic random access memory (SDRAM), read-only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other tangible medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically, e.g., with lasers. Combinations of the above should also be included within the scope of computer-readable media.
  • A computer program product includes a computer-readable medium as well as any materials associated with the computer-readable medium, including packaging materials within which the computer-readable medium is packaged. The code associated with a computer-readable medium of a computer program product may be executed by a computer, e.g., by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.

Abstract

This disclosure describes techniques for resizing multimedia content for efficient statistical multiplexing. In response to a request to resize a current segment of data, an encoding module associated with the selected segment adjusts the amount of motion information to be encoded to resize the segment of data. For example, the encoding module associated with the selected segment of data may merge two or more motion vectors to reduce the amount of motion information to be encoded. As another example, the encoding module reselects encoding modes for one or more blocks of pixels of at least one frame within the segment of data.

Description

This application claims the benefit of U.S. Provisional Application No. 60/763,995, filed Jan. 31, 2006, and entitled “MULTIMEDIA CONTENT RE/ENCODING AND STATISTICAL MULTIPLEXING”, the entire content of which is incorporated herein by reference.
TECHNICAL FIELD
The disclosure relates to multimedia encoding and decoding and, more particularly, multimedia resizing for efficient statistical multiplexing.
BACKGROUND
Data networks, such as wireless communication networks, have to trade off between services customized for a single terminal and services provided to a large number of terminals. For example, the distribution of multimedia content to a large number of resource limited portable devices (subscribers) is a complicated problem. Therefore, it is very important for network administrators, content retailers, and service providers to have a way to distribute content and/or other network services in a fast and efficient manner for presentation on networked devices.
Content delivery/media distribution systems may pack real time and non real time services into a transmission frame and deliver the frame to devices on a network. For example, a communication network may utilize Orthogonal Frequency Division Multiplexing (OFDM) to provide communications between a network server and one or more mobile devices. This technology provides a transmission frame having data slots that are packed with services to be delivered and transmitted over a distribution network.
SUMMARY
In general, this disclosure describes techniques for resizing multimedia content for efficient statistical multiplexing. More specifically, in response to a request to resize a segment of data, an encoding module resizes the segment of data to reduce the bit rate of the segment of data. The term “bit rate” as used herein, refers to the number of bits used per unit of time to represent the segment of multimedia data. Often bit rate is specified in kilobits per second (kbits/s). Thus, the bit rate of the segment of data corresponds to the size of the segment of data. Therefore reducing the bit rate of the segment of data reduces the size of the segment of data.
In accordance with the techniques of this disclosure, the encoding module may resize the segment of data by adjusting the amount of motion information to be encoded. For example, the encoding module may reduce the amount of motion information that is to be encoded. In other words, the encoding module associated with the segment of data to be resized uses a fewer number of bits to encode the motion information, thus reducing the size of the segment of data. In one aspect of this disclosure, the encoding module associated with the selected segment of data may merge two or more motion vectors. As another example, the encoding module may reselect an encoding mode for one or more blocks of pixels of at least one frame within the segment of data to reduce the amount of motion information to be encoded. In addition to adjusting the amount of motion information to be encoded, the encoding module may also adjust one or more encoding variables, such as a bit rate at which the segment of data is re-encoded or a quantization parameter used for re-encoding the segment of data, to further reduce the bit rate of the segment of data.
In one aspect, a method for encoding streams of multimedia data comprises receiving a request to resize a segment of data associated with the stream of digital multimedia data and resizing the segment of data by adjusting an amount of motion information to be encoded for the segment of data in response to the request.
In another aspect, an apparatus for encoding a stream of digital multimedia data comprises an interface that receives a request to resize a segment of data associated with the stream of digital multimedia data and a resizing module that resizes the segment of data by adjusting an amount of motion information to be encoded for the segment of data in response to the request.
In a further aspect, an apparatus for encoding a stream of digital multimedia data comprises means for receiving a request to resize a segment of data associated with the stream of digital multimedia data and means for resizing the segment of data by adjusting an amount of motion information to be encoded for the segment of data in response to the request.
In another aspect, a processor for processing digital video data is adapted to receive a request to resize a segment of data associated with the stream of digital multimedia data and resize the segment of data by adjusting an amount of motion information to be encoded for the segment of data in response to the request.
The techniques described herein may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the techniques may be realized in whole or in part by a computer program product including a computer readable medium comprising instructions that, when executed by a processor, perform one or more of the methods described herein. Accordingly, this disclosure also contemplates a computer-program product for processing digital video data that comprises a computer readable medium comprising instructions that cause at least one computer to receive a request to resize a segment of data associated with the stream of digital multimedia data and resize the segment of data by adjusting an amount of motion information to be encoded for the segment of data in response to the request.
The details of one or more aspects are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of this disclosure will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a block diagram illustrating an exemplary encoding and decoding system.
FIG. 2 is a block diagram illustrating another exemplary encoding and decoding system.
FIG. 3 is a block diagram illustrating an exemplary encoder module for use within a multimedia encoding device.
FIG. 4 is a flow diagram illustrating exemplary operation of an encoder module encoding multimedia data in accordance with the techniques of this disclosure.
FIG. 5 is a flow diagram illustrating exemplary operation of an encoder module reducing the amount of motion information in accordance with one of the aspects of this disclosure.
DETAILED DESCRIPTION
In general, this disclosure describes techniques for resizing multimedia content for efficient statistical multiplexing. More specifically, in response to a request to resize a segment of data, an encoding module resizes the segment of data to reduce the bit rate of the segment of data. The term “bit rate” as used herein, refers to the number of bits used per unit of time to represent the segment of multimedia data. Often bit rate is specified in kilobits per second (kbits/s). Thus, the bit rate of the segment of data corresponds to the size of the segment of data. Therefore reducing the bit rate of the segment of data reduces the size of the segment of data.
In accordance with the techniques of this disclosure, the encoding module may resize the segment of data by adjusting the amount of motion information to be encoded. For example, the encoding module may reduce the amount of motion information that is to be encoded. In other words, the encoding module associated with the segment of data to be resized uses a fewer number of bits to encode the motion information, thus reducing the size of the segment of data. In one aspect of this disclosure, the encoding module associated with the selected segment of data may merge two or more motion vectors. As another example, the encoding module may reselect an encoding mode for one or more blocks of pixels of at least one frame within the segment of data to reduce the amount of motion information to be encoded. In addition to adjusting the amount of motion information to be encoded, the encoding module may also adjust one or more encoding variables, such as a bit rate at which the segment of data is re-encoded or a quantization parameter used for re-encoding the segment of data, to further reduce the bit rate of the segment of data.
FIG. 1 is a block diagram illustrating an exemplary encoding and decoding system 10. Encoding and decoding system 10 includes a multimedia encoding device 12 and a multimedia decoding device 14. Multimedia encoding device 12 encodes multimedia data, combines the encoded data and transmits the combined data to multimedia decoding device 14 via a transmission channel 16. Multimedia encoding device 12 may form part of a broadcast network component used to broadcast one or more channels of multimedia data. Multimedia encoding device 12 may, for example, form part of a wireless base station, server, or any infrastructure node that is used to broadcast one or more channels of encoded multimedia data to one or more wireless devices, such as multimedia decoding device 14.
Multimedia encoding device 12 may encode a plurality of services that include one or more flows of multimedia data, combine the encoded flows and transmit the combined flows to a multimedia decoding device via a transmission channel 16. The services may include both real-time and non-real-time multimedia content or service such as news, sports, weather, financial information, movies, and/or applications, programs, scripts, electronic mail, file transfer or any other type of suitable content or service. In one aspect of this disclosure, multimedia encoding device 12 encodes, combines, and transmits portions of the flows of data received over a period of time. As an example, multimedia encoding device 12 may operate on the flows on a per second basis. In other words, multimedia encoding device 12 encodes one-second segments of data of the plurality of flows, combines the one-second segments of data to form a superframe of data, and transmits the superframe over transmission channel 16 via a transmitter 22. As used herein, the term “superframe” refers to a group of segments of data collected over a predetermined time period or window, such as a one second time period or window. The segments of data include one or more frames of data. Although the techniques of this disclosure are described in the context of one-second segments of data, the techniques may also be utilized for combining and transmitting other segments of data, such as for segments of data received over a different period of time, that may or may not be a fixed period of time, or for individual frames or sets of frames of data. In other words, superframes could be defined to cover larger or smaller time intervals than one-second periods, or even variable time periods.
Multimedia decoding device 14 may comprise a user-device that receives the encoded multimedia data transmitted by multimedia encoding device 12. By way of example, decoding device 14 may be implemented as part of a digital television, a wireless communication device, a personal digital assistant (PDA), a laptop computer or desktop computer, a digital music and video device, such as those sold under the trademark “iPod,” or a radiotelephone such as a cellular, satellite or terrestrial-based radiotelephone. Although only a single multimedia decoding device 14 is illustrated in FIG. 1 for simplicity, multimedia encoding device 12 may transmit the combined flows of data to more than one multimedia decoding device.
Transmission channel 16 may comprise any wired or wireless medium, or combination thereof. In one aspect, transmission channel 16 is a fixed bandwidth channel. In other words, the amount of transmission channel resources available for transmitting the combined flows of data is limited. Thus, the number of bits of data that multimedia encoding device 12 may transmit via transmission channel 16 is limited by the amount of transmission channel resources available for transmission. In the wireless context, for example, the number of bits of data that multimedia encoding device 12 may transmit via transmission channel 16 is limited by the amount of air-link or air-interface resources available. Transmission channel 16 may comprise one or more radio access technologies, such as global system for mobile communications (GSM), code division multiple access (CDMA), CDMA 2000, wideband CDMA (W-CDMA), CDMA 1x Evolution-Data Optimized (EV-DO), frequency division multiple access (FDMA), time division multiple access (TDMA) or the broad family of standards developed to facilitate wireless networking defined by the various IEEE 802.11x standards.
Multimedia encoding device 12 may attempt to output each of the flows of data at a constant quality level or bit rate. The techniques described herein are applicable in either context. In the case of trying to maintain a constant quality, for example, multimedia encoding device 12 selects a bit rate for the flows of data based on a target quality level. The target quality level used to determine the bit rate can be pre-selected, selected by a user, selected through an automatic process or a semi-automatic process requiring an input from a user or from another process, or selected dynamically by the encoding device or system based on predetermined criteria. A target quality level can be selected based on, for example, the type of encoding application, or the type of client device that would be receiving the multimedia data. If the number of bits necessary to output each of the flows of data at the target quality level exceeds the number of bits for which transmission channel resources are available, multimedia encoding device 12 manages bit allocation among the flows in an attempt to preserve the highest overall quality for the plurality of flows.
As shown in FIG. 1, multimedia encoding device 12 includes encoder modules 18A-18N (collectively, “encoder modules 18”), a non-real-time (NRT) service module 19, a multiplex module 20 and a transmitter 22. Encoder modules 18 receive flows of digital multimedia data from one or more sources. Encoder modules 18 may, for example, receive the flows of multimedia data from a memory or an image capture device coupled to encoder modules 18. The flows of multimedia data may comprise live real-time video, audio, or video and audio flows to be coded and transmitted as a broadcast or on-demand, or may comprise pre-recorded and stored video, audio, or video and audio flows to be coded and transmitted as a broadcast or on-demand.
Encoder modules 18 send delivery requirements associated with the segments of data, such as quality and rate information for the real-time services of encoder modules 18, to multiplex module 20. NRT service module 19 may also send delivery requirements, such as priority and latency requirements, associated with NRT services. Encoder modules 18 and NRT service module 19 may send the delivery requirements to multiplex module 20 in response to a request from multiplex module 20. Encoder modules 18 and NRT service module 19 may communicate with multiplex module 20 using one or more control channels in accordance with a number of different communication protocols. In one aspect, the modules may communicate using protocols that utilize the message transport layer (MTL) as the underlying transport mechanism.
Multiplex module 20 analyzes the delivery requirements, e.g., the quality and rate information, priority requirements and latency requirements, to determine whether there are sufficient transmission channel resources to transmit the segments of data that encoder modules 18 desire to include in the current superframe. In other words, multiplex module 20 determines whether the segments of data that encoder modules 18 and NRT service module 19 desire to include in the current superframe fit within the fixed bandwidth channel.
Multiplex module 20 may, for example, determine an amount of transmission channel resources necessary to send each of the segments of data at the sizes and/or bit rates corresponding to a selected one of the quality levels. Multiplex module 20 may sum the amounts of transmission channel resources necessary to send the segments of data and compare the sum total of transmission channel resources required by all the segments of data with an amount of available transmission channel resources to determine whether there are sufficient transmission channel resources to send the segments of data. If multiplex module 20 determines that the plurality of segments of data do not fit within the available bandwidth, e.g., the sum total of necessary transmission channel resources exceeds the available transmission channel resources, multiplex module 20 selects one or more of the segments to be resized. Multiplex module 20 may attempt to select the segments of data to be resized that have the least impact on quality at the corresponding reduced size.
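The fit check and segment selection described above can be pictured with a short sketch. The following is an illustration only, not the patent's implementation: the segment structure, the fixed superframe capacity, and the greedy least-quality-loss selection rule are assumptions introduced here for clarity, and the quality-rate values are made up.

```python
# Hypothetical sketch of the multiplex module's fit check and its selection of
# segments to resize. Names, data layout, and the greedy rule are assumptions.

def fits_in_superframe(segment_sizes_bits, capacity_bits):
    """Return True if the summed segment sizes fit the fixed-bandwidth superframe."""
    return sum(segment_sizes_bits) <= capacity_bits

def select_segments_to_resize(segments, capacity_bits):
    """Greedily step segments down to lower-rate options, preferring the
    segment whose next step costs the least quality per bit saved.

    Each segment carries a 'quality_rate_table': (quality, size_bits) options
    sorted from highest to lowest quality. Returns the chosen option per id.
    """
    choice = {s['id']: 0 for s in segments}  # start at highest quality
    def total():
        return sum(s['quality_rate_table'][choice[s['id']]][1] for s in segments)

    while total() > capacity_bits:
        best = None
        for s in segments:
            i = choice[s['id']]
            if i + 1 >= len(s['quality_rate_table']):
                continue  # no lower-rate option left for this segment
            q0, b0 = s['quality_rate_table'][i]
            q1, b1 = s['quality_rate_table'][i + 1]
            cost = (q0 - q1) / max(b0 - b1, 1)  # quality lost per bit saved
            if best is None or cost < best[0]:
                best = (cost, s['id'])
        if best is None:
            break  # nothing left to shrink; the superframe cannot be met
        choice[best[1]] += 1
    return {s['id']: s['quality_rate_table'][choice[s['id']]] for s in segments}

segments = [
    {'id': 'flow_a', 'quality_rate_table': [(35.0, 300_000), (33.0, 180_000), (31.0, 130_000)]},
    {'id': 'flow_b', 'quality_rate_table': [(34.0, 280_000), (32.5, 170_000), (30.5, 120_000)]},
]
print(select_segments_to_resize(segments, capacity_bits=420_000))
```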
Multiplex module 20 sends a request to encoder modules 18 associated with the selected segments of data to resize the flows of digital multimedia data in accordance with the reduced bit allocation or reduced bit rate. Additionally, multiplex module 20 may send a request to NRT module 19 to resize one or more of the NRT services. Multiplex module 20 may send a resize request to the encoder modules 18 and NRT module 19 associated with the selected segments via the control channel. The resize request may specify a maximum size, e.g., in bits, for the selected segment of data or a reduced bit rate for the segment of data.
Encoder modules 18 associated with the selected segments of data receive the resize requests associated with their respective segments of data, and resize the segments of multimedia data. Encoder modules 18 and NRT service module 19 may resize the segments of data in a number of different ways. With respect to the real-time services, encoder modules 18 associated with the selected segments of data adjust the amount of motion information that is to be encoded to resize the segment of data. For example, encoder modules 18 may reduce the amount of motion information that is to be encoded. In other words, encoder modules 18 associated with the selected segments of data use a fewer number of bits to encode the motion information, thus reducing the size of the segment of data. In one aspect, encoder modules 18 associated with the selected segments of data may merge two or more motion vectors to reduce the amount of motion information to be encoded. As another example, encoder modules 18 may reselect an encoding mode for one or more blocks of pixels of at least one frame within the segment of data. As will be described in more detail below, reselecting the encoding mode for the blocks of pixels may have a similar effect to merging motion vectors. In another example, encoder modules 18 may increase the amount of motion information that is to be encoded when the resize request indicates that there is additional bandwidth available, e.g., again by reselecting an encoding mode for one or more blocks of pixels of a frame within the segment of data.
The encoding modes may, for example, comprise one or more prediction modes. In the context of H.264, for example, the encoding modes may comprise prediction modes, such as macroblock level inter-modes (e.g., Inter 16×16, Inter 16×8, Inter 8×16, and Inter 8×8) or 8×8 sub-partition modes (e.g., Inter 8×8, Inter 8×4, Inter 4×8 and Inter 4×4) as well as intra-prediction modes for the same block sizes. Additionally, the encoding modes may also include forward, backward or bi-directional prediction mode selection for each of the inter-modes.
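As a rough illustration of why the mode choice affects the amount of motion information, the sketch below counts motion vectors per 16×16 macroblock for a few H.264 partition modes, assuming uniform partitioning and a single prediction direction; bidirectional prediction would roughly double the counts, and intra modes carry no motion vectors. The counts are illustrative and not taken from the patent.

```python
# Approximate motion vectors per 16x16 macroblock, assuming the whole
# macroblock (or each 8x8 sub-partition) uses the named mode with a single
# prediction direction. Illustrative only.
MOTION_VECTORS_PER_MACROBLOCK = {
    'Inter 16x16': 1,
    'Inter 16x8': 2,
    'Inter 8x16': 2,
    'Inter 8x8': 4,
    'Inter 8x4': 8,    # two vectors per 8x8 sub-partition
    'Inter 4x8': 8,
    'Inter 4x4': 16,   # four vectors per 8x8 sub-partition
    'Intra (any size)': 0,
}

def motion_vector_savings(old_mode, new_mode, bidirectional=False):
    """Vectors saved per macroblock by reselecting from old_mode to new_mode."""
    scale = 2 if bidirectional else 1
    return scale * (MOTION_VECTORS_PER_MACROBLOCK[old_mode]
                    - MOTION_VECTORS_PER_MACROBLOCK[new_mode])

# Example: replacing Inter 16x8 (two vectors) with Inter 16x16 (one vector).
print(motion_vector_savings('Inter 16x8', 'Inter 16x16'))  # -> 1
```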
Encoder modules 18 associated with the selected segments of data may also adjust one or more encoding variables in addition to adjusting the amount of motion information to be encoded. In other words, encoder modules 18 may adjust one or more encoding variables and reduce the amount of motion information to be encoded to further resize the segments of data. For example, encoder modules 18 associated with the selected segments of data may adjust a bit rate at which the segments of data are re-encoded. As described above, encoder modules 18 associated with the selected segments of data may reduce the bit rate as specified in the resize request. Alternatively, encoder modules 18 may determine the reduced bit rate based on the number of bits allocated to the segment of data. As another example, encoder modules 18 associated with the selected segments of data may adjust quantization parameters (QPs) used to re-encode the segments of data or a frame rate at which to encode subsequent segments of data. Encoder modules 18 re-encode the segments of data or size-adjusted segments of data using the adjusted encoding variables. In this manner, encoder modules 18 associated with the selected segments of data resize the segments of data to satisfy the size or rate requirements specified in the resize requests.
Multiplex module 20 collects the encoded segments of data when multiplex module 20 is ready to generate the current superframe. Multiplex module 20 may, for example, send transfer requests to encoder modules 18 via the control channel. In response to the requests, encoder modules 18 send the encoded segments of multimedia data to multiplex module 20. Multiplex module 20 combines the flows of multimedia data to form a superframe and sends the superframe to transmitter 22 for transmission to one or more decoding devices via transmission channel 16. In this manner, multiplex module 20 manages bit allocation among the flows to fit all the segments of data into the fixed bandwidth channel 16 while preserving the highest overall quality of the plurality of flows of data.
The techniques of this disclosure may also be applied to non-real time services or a combination of real-time services and non-real time services. For example, multiplex module 20 may receive delivery requirements, such as priority and latency requirements, for non-real-time services from NRT service module 19, and analyze the delivery requirements of both the real-time and non-real-time services to determine whether the services fit within the fixed bandwidth channel. Multiplex module 20 may also require resizing of one or more of the non-real-time services. In this manner, multiplex module 20 may arbitrate between real-time services and non-real-time services. For purposes of illustration, however, this disclosure describes use of the encoding techniques in the context of real-time services.
The components in multimedia encoding device 12 are exemplary of those applicable to implement the techniques described herein. Multimedia encoding device 12, however, may include many other components, if desired. For example, multimedia encoding device 12 may include more than one NRT service module 19. Moreover, the techniques of this disclosure are not necessarily limited to use in a system like that of FIG. 1, nor a broadcast system. The techniques may find application in any multimedia encoding environment in which encoding techniques are used to encode a plurality of flows of multimedia data for transmission over a transmission channel with limited bandwidth. The illustrated components of multimedia encoding device 12 may be integrated as part of an encoder/decoder (CODEC).
The components in multimedia encoding device 12 may be implemented as one or more processors, digital signal processors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), discrete logic, software, hardware, firmware, or any combinations thereof. Moreover, multimedia encoding device 12 may comply with a multimedia coding standard such as Moving Picture Experts Group (MPEG-4), one or more of the standards developed by the International Telecommunication Union Standardization Sector (ITU-T), e.g., H.263 or ITU-T H.264, or another coding standard. Depiction of different features as modules is intended to highlight different functional aspects of multimedia encoding device 12 and does not necessarily imply that such modules must be realized by separate hardware or software components. Rather, functionality associated with one or more modules may be integrated within common or separate hardware or software components. Thus, the disclosure should not be limited to the example of multimedia encoding device 12.
FIG. 2 is a block diagram illustrating another exemplary encoding and decoding system 30. Encoding and decoding system 30 conforms substantially to encoding and decoding system 10 of FIG. 1, but the resizing of selected segments of multimedia data is performed by resizing modules 32A-32N (collectively, “resizing modules 32”) associated with the selected segments of data. Thus, the functionality of encoder modules 18 of FIG. 1 is divided between encoder modules 34A-34N (collectively, “encoder modules 34”) and resizing modules 32. In other words, encoder modules 34 provide multiplex module 20 with delivery requirements, such as at least quality and rate information, associated with each of the segments of data for use in allocating the available bandwidth to the segments of data and selecting one or more of the segments of data to be resized when the allocation fails.
Resizing modules 32 receive requests from multiplex module 20 to resize the segments of data in accordance with the requirements specified by multiplex module 20 in the resize request. In particular, resizing modules 32 associated with the selected segments of data reduce the amount of motion information to be encoded, e.g., by merging one or more motion vectors in accordance with the techniques described herein, thus reducing the size of the segments of data. In addition to reducing the amount of motion information to be encoded, resizing modules 32 may adjust one or more encoding variables, e.g., bit rate, frame rate or QP, to reduce the size of the segments of data.
FIG. 3 is a block diagram illustrating an exemplary encoder module 40 for use within a multimedia encoding device, such as multimedia encoding device 12 of FIG. 1. Encoder module 40 may, for example, represent any one of encoder modules 18 of encoding device 12 of FIG. 1. Encoder module 40 includes a multiplex module interface 42, a content classification module 44, quality-rate information generation module 46, and an encoding module 48. Encoding module 48 further includes a resizing module 50 that resizes segments of data selected for resizing.
Encoder module 40 receives one or more flows of multimedia data from a source. Encoder module 40 may, for example, receive the flows of multimedia data from a memory or an image capture device coupled to encoder module 40. The flows of multimedia data may comprise live real-time video, audio, or video and audio flows to be coded and transmitted as a broadcast, or may comprise pre-recorded and stored video, audio, or video and audio flows to be coded and transmitted as a broadcast or on-demand.
Encoder module 40 may be configured to operate at a constant bit rate or quality level. For example, in some cases, encoder module 40 attempts to maintain a constant perceived quality metric for the flows of data regardless of the content of the data. In other words, encoder module 40 may attempt to output every flow of data at a target quality level. To maintain a constant or similar perceived quality level, encoder module 40 may select different bit rates for segments of data with different content. To this end, content classification module 44 classifies the segments of data based on their content. Content classification module 44 may classify the segment of data based on the complexity (e.g., spatial complexity and/or temporal complexity) of the data of the segment. One exemplary content classification method is described in co-pending and commonly assigned U.S. patent application Ser. No. 11/373,577, entitled “CONTENT CLASSIFICATION FOR MULTIMEDIA PROCESSING” and filed on Mar. 10, 2006, the entire content of which is incorporated herein by reference. For example, content classification module 44 may classify motion information, e.g., motion vectors, into categories of “high,” “medium,” and “low” (on an x-axis) and classify texture information, e.g., contrast ratio values, into categories of “high,” “medium,” and “low,” (on a y-axis) and the content classification is indicated at the point of intersection. This classification may be associated, for example, with a particular quality-rate curve.
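A minimal sketch of this kind of motion/texture grid classification follows; the thresholds, the feature definitions (mean motion vector magnitude, contrast ratio), and the mapping onto eight curve classes are assumptions made for illustration, not the classifier described in the referenced application.

```python
# Hypothetical content classifier: bucket motion and texture into
# low/medium/high and map the pair to a content class index, which in turn
# selects a pre-computed quality-rate curve. Thresholds are placeholders.

def bucket(value, low_threshold, high_threshold):
    if value < low_threshold:
        return 'low'
    if value < high_threshold:
        return 'medium'
    return 'high'

def classify_segment(mean_mv_magnitude, contrast_ratio):
    motion = bucket(mean_mv_magnitude, low_threshold=1.0, high_threshold=4.0)
    texture = bucket(contrast_ratio, low_threshold=1.5, high_threshold=3.0)
    return motion, texture

# One possible mapping of the nine (motion, texture) pairs onto eight curves.
CLASS_TO_CURVE = {
    ('low', 'low'): 0, ('low', 'medium'): 1, ('low', 'high'): 2,
    ('medium', 'low'): 3, ('medium', 'medium'): 4, ('medium', 'high'): 5,
    ('high', 'low'): 6, ('high', 'medium'): 6, ('high', 'high'): 7,
}

print(CLASS_TO_CURVE[classify_segment(mean_mv_magnitude=2.5, contrast_ratio=3.4)])
```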
Content classification module 44 associates the segments of data with one or more delivery requirements based on the classifications. Content classification module 44 may, for example, associate the segments of data with respective quality and rate information, such as quality-rate curves, quality-rate tables or the like. The quality-rate curves model a quality metric, such as peak signal to noise ratio (PSNR), as a function of a bit rate. Encoder module 40 may be configured with quality-rate curves that have been computed offline. Alternatively, quality-rate information generation module 46 may generate the quality-rate curves by, for example, modeling the quality-rate curves using a logarithmic function of the form:
Q=a*ln(r)+b,
where Q is the quality metric, r is the bit rate, and a and b are constants computed using a number of sample data points. Quality-rate information generation module 46 may maintain a plurality of quality-rate curves that represent quality-rate characteristics for flows of data with varying content. As an example, quality-rate information generation module 46 may maintain quality-rate curves for eight different classes associated with varying levels of motion and texture in the content of the flows. To account for the fact that constant PSNR does not necessarily mean constant perceived quality, quality-rate information generation module 46 may maintain quality-rate curves that use a quality metric other than PSNR, such as mean opinion scores (MOS). Alternatively, quality-rate information generation module 46 may adjust the quality-rate curves to account for the fact that constant PSNR does not necessarily mean constant perceived quality. For example, quality-rate information generation module 46 may adjust traditional quality-rate curves by an offset as described in detail in co-pending and commonly assigned U.S. patent application Ser. No. 11/373,577, entitled “CONTENT CLASSIFICATION FOR MULTIMEDIA PROCESSING” and filed on Mar. 10, 2006, the entire content of which is incorporated herein by reference.
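For concreteness, a least-squares fit of the logarithmic model above could be computed as sketched below; the sample points are placeholders and the use of numpy.polyfit is an implementation choice made here, not the patent's procedure. Inverting the model gives the bit rate needed to reach a target quality.

```python
import numpy as np

# Sample (bit rate in kbits/s, quality metric) points for one content class.
# The values are illustrative placeholders.
rates = np.array([100.0, 200.0, 300.0, 400.0])
quality = np.array([30.5, 33.0, 34.4, 35.5])

# Fit Q = a*ln(r) + b by ordinary least squares on ln(r).
a, b = np.polyfit(np.log(rates), quality, deg=1)

def quality_at_rate(r):
    """Predicted quality metric at bit rate r (kbits/s)."""
    return a * np.log(r) + b

def rate_for_quality(q_target):
    """Bit rate (kbits/s) needed to hit the target quality, from the inverse model."""
    return float(np.exp((q_target - b) / a))

print(round(rate_for_quality(34.0), 1))
```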
Alternatively, quality-rate information generation module 46 may adjust the target quality level associated with each of the content curves by an offset. Segments of data that include high motion, high texture content may, for example, be encoded at a slightly lower quality with respect to the target quality level, whereas segments of data that include low motion, low texture content may be encoded at slightly higher quality with respect to the target quality level. Because each content class has its own adjusted quality level relative to the overall target quality level, encoder module 40 may normalize the quality level for each content class to measure the current quality level at encoder module 40. Encoder module 40 may achieve this normalization according to the linear equation below:
Qnorm=Qr−Qk,
where Qnorm is the normalized quality level, Qr is the recorded quality level, and Qk is the adjustment offset in quality level for curve k. If quality normalization is not a linear function, rank determination may be performed after quality normalization.
In another example, content classification module 44 may associate the segments of data with pre-computed quality-rate tables that indicate one or more quality levels associated with the segments and sizes of the segment at each of the quality levels. To do so, content classification module 44 may associate the segment of data with a quality-rate curve, which corresponds to a particular one of the quality-rate tables. Quality-rate information generation module 46 may pre-compute the quality-rate curves, the adjusted quality-rate curves, or quality-rate tables, and store the pre-computed quality and rate information within a memory (not shown). Content classification module 44 may access the pre-computed quality and rate information when needed. Alternatively, quality-rate information generation module 46 may generate quality and rate information for the segments of data in real-time. For example, quality-rate information generation module 46 may create quality-rate tables based on the quality-rate curve associated with the segment of data.
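A quality-rate table for a one-second segment could be tabulated from the fitted curve as in the sketch below; the candidate quality levels, the placeholder curve constants, and the one-second segment duration are assumptions for illustration.

```python
import math

def build_quality_rate_table(a, b, quality_levels, segment_duration_s=1.0):
    """Tabulate (quality level, segment size in bits) pairs from Q = a*ln(r) + b.

    r is in kbits/s, so a one-second segment occupies roughly r * 1000 bits.
    """
    table = []
    for q in quality_levels:
        rate_kbps = math.exp((q - b) / a)
        table.append((q, int(rate_kbps * 1000 * segment_duration_s)))
    return table

# Example with placeholder curve constants and a few candidate quality levels.
print(build_quality_rate_table(a=3.6, b=14.0, quality_levels=[35.0, 33.0, 31.0]))
```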
Encoder module 40 sends the delivery requirements, e.g., the quality and rate information, associated with each of the segments of data to be included within the current superframe to multiplex module 20 (FIG. 1) via multiplex module interface 42. The quality and rate information assists multiplex module 20 in monitoring the size of the current superframe and determining which of the segments of data to resize, if resizing is required to fit the segments of data within the current superframe. Encoder module 40 may send the quality and rate information to multiplex module 20 in response to a request from multiplex module 20. As described above, the quality and rate information may comprise a quality-rate curve or quality-rate table associated with the segment of data.
If any of the segments of data associated with encoder module 40 need to be resized, multiplex module 20 sends a resize request to encoder module 40. In response to the resize request, resizing module 50 resizes the segment of multimedia data to reduce the size of the segment of data. In accordance with the techniques of this disclosure, resizing module 50 may adjust the amount of motion information to be encoded. For example, resizing module 50 may reduce the amount of motion information associated with the segments of data. As described above, each of the segments of data includes one or more frames of data. The frames of data may be partitioned into a plurality of blocks of pixels. Some blocks of pixels, often referred to as “macroblocks,” comprise a grouping of sub-blocks of pixels. As an example, a 16×16 macroblock may comprise four 8×8 sub-blocks of pixels. The H.264 standard permits encoding of blocks with a variety of different sizes, e.g., 16×16, 16×8, 8×16, 8×8, 4×4, 8×4, and 4×8. Each of the sub-blocks may include at least one motion vector describing the motion field for that particular sub-block. To reduce the amount of motion information associated with the segments of data, and thus reduce the size of the segment of data, resizing module 50 may merge motion vectors of these sub-blocks to generate a single motion vector for the macroblock. Additionally, resizing module 50 may merge the motion vectors associated with the macroblocks to generate a single motion vector for the frame. Resizing module 50 thus reduces the number of motion vectors, resulting in a reduction in the amount of information to be encoded and thus the size of the segment of data.
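A merge of sub-block motion vectors into a single macroblock vector could look like the following sketch; using the component-wise median as the merged vector is an assumption made here for illustration, not a rule stated in the disclosure.

```python
# Hypothetical merge of four 8x8 sub-block motion vectors into one 16x16
# macroblock vector, reducing four vectors to one. The component-wise median
# is one reasonable merge rule; the disclosure does not prescribe a specific one.
from statistics import median

def merge_motion_vectors(sub_block_mvs):
    """sub_block_mvs: list of (dx, dy) motion vectors; returns a single (dx, dy)."""
    xs = [mv[0] for mv in sub_block_mvs]
    ys = [mv[1] for mv in sub_block_mvs]
    return (median(xs), median(ys))

# Four 8x8 sub-block vectors collapse into one macroblock vector.
print(merge_motion_vectors([(2, 1), (3, 1), (2, 2), (3, 0)]))  # -> (2.5, 1.0)
```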
In one aspect of this disclosure, resizing module 50 reselects an encoding mode for one or more blocks of pixels of at least one frame within the segment of data based on the new bit budget, i.e., based on the maximum number of bits or bit rate indicated in the resize request. As described above, resizing module 50 may reselect an encoding prediction mode based on the new bit budget. In the context of H.264, for example, resizing module 50 may reselect inter-prediction modes (e.g., Inter 16×16, Inter 16×8, Inter 8×16, and Inter 8×8), 8×8 sub-partition prediction modes (e.g., Inter 8×8, Inter 8×4, Inter 4×8 and Inter 4×4), or intra-prediction modes for one or more of the blocks of pixels of a frame within the segment of data. Alternatively or additionally, resizing module 50 may reselect a forward, backward or bidirectional prediction mode for each of the selected inter-modes. In other words, resizing module 50 redoes the mode decision for the macroblocks, sub-blocks or other partitions of pixels.
In one aspect, resizing module 50 may reselect the encoding mode based on the information generated during a first pass encoding. In particular, encoding module 48 traverses each mode during a first pass encoding and generates motion vectors for each of the sub-blocks corresponding to each mode. In other words, encoding module 48 generates a number of different motion vectors for each sub-block, with each of the different motion vectors corresponding to a particular mode. During the first pass encoding, encoding module 48 selects a mode for the blocks of pixels and encodes the motion vectors for the selected modes. The mode information generated during the first pass encoding, i.e., the motion vectors associated with each of the modes is stored in a memory (not shown) for use by resizing module 50 during reselection.
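A minimal sketch of this first-pass bookkeeping follows, assuming a plain in-memory dictionary keyed by block index; the mode names, the (bits, distortion) cost fields, and the data layout are illustrative assumptions rather than an actual encoder data structure.

```python
# Hypothetical first-pass store: for every block, keep the motion information
# and an estimated (bits, distortion) cost for each candidate mode, so the
# resize pass can reselect modes without redoing motion estimation.
first_pass_store = {}  # block_index -> {mode: {'mvs': [...], 'bits': int, 'distortion': float}}

def record_first_pass(block_index, mode, motion_vectors, bits, distortion):
    first_pass_store.setdefault(block_index, {})[mode] = {
        'mvs': motion_vectors, 'bits': bits, 'distortion': distortion,
    }

# Example: block 0 evaluated under two inter modes during the first pass.
record_first_pass(0, 'Inter 16x8',  [(2, 1), (3, 0)], bits=46, distortion=120.0)
record_first_pass(0, 'Inter 16x16', [(2, 1)],         bits=28, distortion=135.0)
```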
In response to the resize request, resizing module 50 may reselect the mode for one or more of the sub-blocks according to the new bit budget specified by multiplex module 20 in an attempt to select a better mode from the modes of the first pass. For example, resizing module 50 may reselect the mode to use the Inter 16×16 mode, which has only one motion vector, to replace the original Inter 16×8 mode, which needs two motion vectors. In this manner, reselecting the encoding mode for one or more blocks of pixels results in an outcome similar to merging motion information. Resizing module 50 retrieves the motion information generated for the newly selected mode, e.g., the Inter 16×16 mode in the above example, and stored in the memory during the first pass encoding. In this manner, redoing the mode decision during the re-encode has a similar result to merging one or more motion vectors. Alternatively, resizing module 50 may generate motion information for the reselected mode using motion information associated with the old mode. For example, resizing module 50 may merge the motion information of the Inter 16×8 mode to generate a motion vector for the new Inter 16×16 mode. Resizing module 50 may reselect the mode decision based on a quality-rate optimization rule. In other words, resizing module 50 may reselect the mode that has a small impact on the quality of the segment of data but a significant impact on bit rate saving. Since motion estimation is not necessary (although it may be performed) during the resizing, the computational complexity compared to the first pass is marginal. In this manner, resizing module 50 redoes the mode decision to select the motion vectors to merge together.
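Given such a store, the resize-time mode reselection could be sketched as a Lagrangian quality-rate trade-off, as below; the lambda weighting, the cost model, and the example numbers are assumptions, and a real encoder would use its codec's own rate and distortion estimates.

```python
# Hypothetical mode reselection against a tighter bit budget, reusing the
# per-mode results stored during the first pass (see the sketch above).

def reselect_modes(first_pass_store, bit_budget, lam=1.0):
    """Pick, per block, the mode minimizing distortion + lam * bits, then
    raise lam until the total motion-information bits fit the budget."""
    while True:
        selection, total_bits = {}, 0
        for block, modes in first_pass_store.items():
            mode = min(modes, key=lambda m: modes[m]['distortion'] + lam * modes[m]['bits'])
            selection[block] = mode
            total_bits += modes[mode]['bits']
        if total_bits <= bit_budget or lam > 1e6:
            return selection, total_bits
        lam *= 2.0  # weight rate more heavily and try again

store = {
    0: {'Inter 16x8': {'mvs': [(2, 1), (3, 0)], 'bits': 46, 'distortion': 120.0},
        'Inter 16x16': {'mvs': [(2, 1)], 'bits': 28, 'distortion': 135.0}},
    1: {'Inter 8x8': {'mvs': [(0, 0)] * 4, 'bits': 70, 'distortion': 90.0},
        'Inter 16x16': {'mvs': [(0, 0)], 'bits': 22, 'distortion': 96.0}},
}
print(reselect_modes(store, bit_budget=60))
```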
TABLE 1 includes results comparing the sizes of data re-encoded using the mode of the first pass and data re-encoded after redoing the mode decision for four segments of data. The different segments are identified as an animated segment, a music segment, a news segment and a sports segment. TABLE 1 illustrates three types of coding results (column 1), a total bit rate, i.e., size (column 2), a base layer to enhancement layer ratio (column 3), a percentage of motion information in P frames (column 4), a base layer Luma PSNR (column 5) and an enhancement layer Luma PSNR (column 6). The three types of coding results are coding results for the first pass (first row of each segment), re-encoding results using the mode of the first pass (second row of each segment), and re-encoding results using the reselected mode decisions described above (third row of each segment). The quantization parameter for re-encoding is obtained by increasing the quantization parameter used in the first pass by six.
TABLE 1

                                  Total Rate   B:E     PMV %   PSNR Base   PSNR Enh.
Animated
  1st Pass                        269          1.1:1   53%     31.2        35.4
  Re-encode with 1st pass mode    161          1.5:1   73%     27.2        31.1
  Re-encode with MD               135          1.1:1   57%     27.2        30.9
Music
  1st Pass                        281          1:1     51%     35.1        36.6
  Re-encode with 1st pass mode    166          1.5:1   72%     32.0        33.2
  Re-encode with MD               128          1.2:1   57%     32.0        32.8
News
  1st Pass                        297          0.8:1   46%     32.9        37.2
  Re-encode with 1st pass mode    182          1.1:1   78%     28.9        33.0
  Re-encode with MD               133          0.9:1   64%     28.9        32.6
Sports
  1st Pass                        301          1:1     60%     31.6        34.5
  Re-encode with 1st pass mode    179          1.3:1   80%     27.6        30.3
  Re-encode with MD               145          1.1:1   67%     27.5        29.9
As illustrated in the results of TABLE 1, re-encoding after redoing the mode decision (MD) can reduce the rate more significantly than re-encoding using the first pass mode. Looking at the base layer to enhancement layer ratio (B:E), it can be seen that it is hard to achieve a 1:1 ratio when the re-encoding is performed with the first pass mode because of the size of the motion information in P frames. Moreover, the larger base layer did not pay off in base layer quality, as shown by the base layer PSNR. Thus, the mode decided at the first pass is not optimal for a different bit budget. Therefore, redoing the mode decision may significantly reduce the size of the segment of data.
In addition to reducing the amount of motion information, resizing module 50 may also adjust one or more encoding variables to reduce the size of the selected segment of data. The encoding variables may include a bit rate at which the selected segments of data are re-encoded, a QP at which the selected segments of data are re-encoded or the like. For example, resizing module 50 may re-encode the segment of data at a reduced bit rate to reduce the size of the selected segment of data. In some cases, the reduced bit rate may be specified within the resize request. Alternatively, rate control module 52 may select a reduced bit rate based on other information, such as a maximum size, specified in the resize request. Alternatively or additionally, resizing module 50 may make other adjustments to resize the segment of data such as re-encoding the segment of data using an adjusted quantization parameter.
In some cases, resizing of the segment of data may cause the quality level of the segment of data to fall below the target quality level. However, as described above, multiplex module 20 selects the segments to be re-encoded such that the overall quality of all the segments of data is preserved. If the quality level of the resized segment of data falls below a minimum quality level associated with encoder module 40, resizing module 50 may resize the segment of data such that the quality level of the resized segment of data is greater than or equal to the minimum quality level. For example, if a bit rate included within the resize request results in the segment of data being encoded at a quality level below the minimum quality level associated with encoder module 40, rate control module 52 may select a higher bit rate that results in the segment of data being encoded at the minimum quality level.
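The minimum-quality floor could be enforced with the quality-rate model as sketched below; the specific curve constants and the clamping rule are assumptions made for illustration, not the rate control behavior specified in the disclosure.

```python
import math

def clamp_bit_rate_to_min_quality(requested_rate_kbps, a, b, min_quality):
    """Return a bit rate no lower than the one needed for the minimum quality.

    Uses the logarithmic model Q = a*ln(r) + b from the quality-rate curve.
    """
    predicted_quality = a * math.log(requested_rate_kbps) + b
    if predicted_quality >= min_quality:
        return requested_rate_kbps
    # Raise the rate to the smallest value that still meets the quality floor.
    return math.exp((min_quality - b) / a)

# Example with placeholder curve constants: the requested 90 kbits/s would fall
# below a 31 dB floor, so the rate is raised to the floor's rate instead.
print(round(clamp_bit_rate_to_min_quality(90.0, a=3.6, b=14.0, min_quality=31.0), 1))
```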
Encoder module 40 receives a request from multiplex module 20 to send the encoded segments of data to be included within the current superframe. In response to the request from multiplex module 20, encoder module 40 sends the encoded segments of data to multiplex module 20. As described above, encoder module 40 sends the segments of data that were not selected for resizing at the original bit rate and sends the segments of data that were selected for resizing at the reduced bit rate.
The components in encoder module 40 are exemplary of those applicable to implement the techniques described herein. Encoder module 40, however, may include many other components, if desired. The components in encoder module 40 may be implemented as one or more processors, digital signal processors, ASICs, FPGAs, discrete logic, software, hardware, firmware, or any combinations thereof. Moreover, encoder module 40 may comply with a multimedia coding standard such as MPEG-4, ITU-T H.263, ITU-T H.264, or another coding standard. Depiction of different features as modules is intended to highlight different functional aspects of encoder module 40 and does not necessarily imply that such modules must be realized by separate hardware or software components. Rather, functionality associated with one or more modules may be integrated within common or separate hardware or software components. Thus, the disclosure should not be limited to the example of encoder module 40.
FIG. 4 is a flow diagram illustrating exemplary operation of an encoder module, such as encoder module 40 of FIG. 3, encoding multimedia data in accordance with the techniques of this disclosure. Encoder module 40 receives one or more flows of multimedia data from a source (60). Encoder module 40 may, for example, receive the flows of multimedia data from a memory or an image capture device coupled to encoder module 40. The flows of multimedia data may comprise live real-time content, non real-time content, or a combination of real-time content and non real-time content.
Encoder module 40 classifies the segments of data based on their content (62). Content classification module 44 (FIG. 3) may, for example, classify the received segments of data based on the complexity (e.g., spatial complexity and/or temporal complexity) of the data of the segment. Content classification module 44 further associates the segments of data with quality and rate information based on the classification (64). As an example, content classification module 44 may associate the segments of data with one of a plurality of quality-rate curves. As described above, the quality-rate curves may be pre-computed and stored in a memory. As another example, content classification module 44 may associate the segments of data with one of a plurality of pre-computed quality-rate tables.
Encoder module 40 may generate additional quality and rate information for the segments of data (66). For example, quality and rate information generation module 46 may generate quality-rate tables for each of the segments of data. As described above, the quality-rate tables indicate one or more quality levels associated with the segments of data and sizes of the segment of data at each of the quality levels.
Encoder module 40 sends the quality and rate information associated with the segment of data to a multiplex module 20 (68). Encoder module 40 may, for example, send the quality and rate information associated with the segment of data in response to a request from the multiplex module. Encoder module 40 may, for example, send a quality-rate curve and/or a quality-rate table associated with the segment of data. As described in detail above, the multiplex module uses the quality and rate information to monitor the size of a current superframe and to assist the multiplex module in determining which of the segments of data need to be resized.
If any of the segments of data associated with encoder module 40 need to be resized, encoder module 40 receives a resize request from the multiplex module 20 (70). The resize request from the multiplex module 20 may include a reduced bit rate or maximum size, e.g., in bits, for the segment of data. In response to the resize request, resizing module 50 resizes the encoded segment of data to reduce the size of the segment of data (72). In accordance with the techniques of this disclosure, encoder modules 18 associated with the selected segments of data reduce the amount of motion information that is to be encoded, thus reducing the size of the segments of data. For example, encoder modules 18 associated with the selected segments of data may merge two or more motion vectors. As will be described herein, encoder modules 18 may reselect an encoding mode of one or more blocks of pixels based on the new bit budget (i.e., size) to reduce the size of the motion information to be encoded. Reselecting the encoding mode has a similar result to merging motion vectors together. Additionally, resizing module 50 may adjust one or more encoding variables to further reduce the size of the segment of data. Resizing module 50 may, for example, re-encode the segment of data at a reduced bit rate or a higher QP to reduce the size of the segment of data.
Encoder module 40 receives a request from the multiplex module 20 to send the encoded content of the segments of data to be included within the current superframe (74). In response to the request from the multiplex module, encoder module 40 sends the encoded content of the segment of data to multiplex module 20 (76). As described above, encoder module 40 sends segments of data that were not selected for resizing at their original size and sends segments of data that were selected for resizing at the reduced size.
FIG. 5 is a flow diagram illustrating exemplary operation of an encoder module, such as encoder module 40 (FIG. 3), reducing the amount of motion information in accordance with one of the aspects described herein. Initially, encoding module 48 selects a block of pixels of a frame within the segment of data (80). Encoding module 48 generates motion information for each mode for the selected block of pixels (82). For example, encoding module 48 may generate motion information for a forward prediction mode, a backward prediction mode and a bidirectional prediction mode. Encoding module 48 stores the motion information generated for each of the modes in a memory for use during resizing (86). Encoding module 48 selects one of the modes for the block of pixels to be used during a first pass encoding (84). Encoding module 48 may, for example, select the modes for the blocks of pixels during the first pass encoding based on a quality-rate optimization rule. In other words, encoding module 48 selects the modes to achieve a target quality level.
Encoding module 48 determines whether there are any more blocks of pixels within the frames of the segment of data (88). When there are additional blocks of pixels within the frames of the segment of data, encoding module 48 selects the next block of pixels, generates motion information for each mode for the selected block of pixels, stores the motion information generated for each of the modes in a memory, and selects one of the modes for the block of pixels to be used during the first pass encoding. If there are no additional blocks of pixels, encoding module 48 waits to receive a resize request for the segment of data (90).
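A minimal sketch of the first-pass loop described in the two preceding paragraphs follows: motion information is generated for every candidate mode of every block, cached for later resizing, and one mode per block is chosen by a simple quality-rate (rate-distortion) rule. The helper callable and the lambda weight are assumptions and are not part of this disclosure.

CANDIDATE_MODES = ("forward", "backward", "bidirectional")

def first_pass_encode(blocks, estimate_motion, lam=1.0):
    # estimate_motion(block, mode) is assumed to return a dict with
    # 'mv_bits' (bits needed for the motion information) and 'distortion'.
    mode_cache = {}     # (block id, mode) -> motion information, kept for resizing
    decisions = {}      # block id -> mode selected during the first pass
    for block in blocks:
        per_mode = {}
        for mode in CANDIDATE_MODES:
            info = estimate_motion(block, mode)
            mode_cache[(block["id"], mode)] = info
            per_mode[mode] = info
        # Quality-rate optimization: minimize distortion + lambda * rate.
        decisions[block["id"]] = min(
            per_mode, key=lambda m: per_mode[m]["distortion"] + lam * per_mode[m]["mv_bits"])
    return decisions, mode_cache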
Upon receiving a resize request for the segment of data, resizing module 50 reselects the mode for one or more of the blocks of pixels (92). Resizing module 50 may, for example, reselect the encoding mode to use the Inter 16×16 mode, which has only one motion vector, in place of the original Inter 16×8 mode, which needs two motion vectors. Resizing module 50 retrieves the motion information generated for the Inter 16×16 mode and stored in the memory during the first pass encoding. Thus, redoing the mode decision during the re-encode has a similar result to merging two or more motion vectors. Resizing module 50 may reselect the mode decision based on a quality-rate optimization rule. In other words, resizing module 50 may reselect the mode that has a small impact on the quality of the segment of data but yields a significant bit rate saving.
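Continuing the same hypothetical sketch, mode reselection during resizing can reuse the cached motion information and greedily switch blocks to modes that save the most motion-information bits per unit of added distortion, which mirrors the small-quality-impact, significant-bit-saving rule described above. The greedy strategy and all names are assumptions for illustration only.

def reselect_modes_for_resize(decisions, mode_cache, bits_to_save):
    # Rank alternative modes by quality cost per bit saved, relative to the
    # first-pass decision, then switch the cheapest blocks until enough
    # motion-information bits have been saved. No new motion search is needed,
    # because the per-mode motion information was cached during the first pass.
    candidates = []
    for (block_id, mode), info in mode_cache.items():
        chosen = mode_cache[(block_id, decisions[block_id])]
        if mode == decisions[block_id]:
            continue
        bit_saving = chosen["mv_bits"] - info["mv_bits"]
        quality_loss = info["distortion"] - chosen["distortion"]
        if bit_saving > 0:
            candidates.append((quality_loss / bit_saving, block_id, mode, bit_saving))
    saved = 0
    switched = set()
    for _, block_id, mode, bit_saving in sorted(candidates, key=lambda c: c[0]):
        if saved >= bits_to_save:
            break
        if block_id in switched:
            continue
        decisions[block_id] = mode      # e.g. Inter 16x16 replacing an Inter 16x8 decision
        switched.add(block_id)
        saved += bit_saving
    return decisions, saved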
Resizing module 50 re-encodes the segment of data with the reduced motion information (96). In this manner, resizing module 50 resizes the segment of data by reducing the amount of motion information associated with the segment of data. As described above, resizing module 50 may additionally adjust one or more encoding variables to further resize the segment of data. For example, resizing module 50 may reduce a bit rate at which the segment of data is re-encoded or increase a quantization parameter used during re-encoding of the segment of data.
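If mode reselection alone does not reach the requested size, the encoding variables can be tightened and the segment re-encoded, along the lines of the sketch below. The QP step and the ceiling of 51 (typical of H.264-style codecs) are illustrative assumptions, and encode_segment is an assumed callable rather than an element of this disclosure.

def further_resize(encode_segment, segment, target_bits, qp, qp_max=51, qp_step=2):
    # encode_segment(segment, qp) is assumed to return the encoded bitstream as bytes.
    bitstream = encode_segment(segment, qp)
    while len(bitstream) * 8 > target_bits and qp < qp_max:
        qp = min(qp + qp_step, qp_max)   # coarser quantization -> fewer bits, lower quality
        bitstream = encode_segment(segment, qp)
    return bitstream, qp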
Based on the teachings described herein, one skilled in the art should appreciate that an aspect disclosed herein may be implemented independently of any other aspects and that two or more of these aspects may be combined in various ways. The techniques described herein may be implemented in hardware, software, firmware, or any combination thereof. If implemented in hardware, the techniques may be realized using digital hardware, analog hardware or a combination thereof. If implemented in software, the techniques may be realized at least in part by one or more stored or transmitted instructions or code on a computer-readable medium. Computer-readable media may include computer storage media, communication media, or both, and may include any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a computer.
By way of example, and not limitation, such computer-readable media can comprise RAM, such as synchronous dynamic random access memory (SDRAM), read-only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other tangible medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically, e.g., with lasers. Combinations of the above should also be included within the scope of computer-readable media.
A computer program product, as disclosed herein, includes a computer-readable medium as well as any materials associated with the computer-readable medium, including packaging materials within which the computer-readable medium is packaged. The code associated with a computer-readable medium of a computer program product may be executed by a computer, e.g., by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, ASICs, FPGAs, or other equivalent integrated or discrete logic circuitry. In some aspects, the functionality described herein may be provided within dedicated software modules or hardware modules configured for encoding and decoding, or incorporated in a combined CODEC.
Various aspects have been described. These and other aspects are within the scope of the following claims.

Claims (53)

What is claimed is:
1. A method for encoding a stream of digital multimedia data, the method comprising:
determining, by a multiplex module, when an amount of available transmission channel resources for a current transmission frame is exceeded by a plurality of segments of data of the stream of digital multimedia data;
generating, by the multiplex module, a request to resize at least one segment of data of the plurality of segments of data;
receiving, at an encoder module, the request to resize the at least one segment of data associated with the stream of digital multimedia data for the current transmission frame, the request sent in response to a comparison between quality and rate information of the at least one segment of data and transmission channel resources with respect to the current transmission frame;
resizing, at the encoder module, the at least one segment of data by adjusting an amount of motion information to be encoded with the at least one segment of data in response to the request;
transmitting, from the encoder module to the multiplex module, the at least one resized segment of data and at least one unresized segment of data for the current transmission frame; and
generating, by the multiplex module, the current transmission frame by collecting the at least one resized segment of data and the at least one unresized segment of data.
2. The method of claim 1, wherein resizing the segment of data by adjusting the amount of motion information to be encoded comprises resizing the segment of data by reducing the amount of motion information to be encoded.
3. The method of claim 2, wherein reducing the amount of motion information to be encoded comprises merging motion information associated with two or more blocks of pixels of at least one frame within the segment of data.
4. The method of claim 1, further comprising reselecting encoding modes for one or more blocks of pixels based on information generated during a first pass encoding of the segment of data.
5. The method of claim 4, wherein reselecting the modes for blocks of pixels comprises reselecting the encoding modes for one or more blocks of pixels that result in a smallest impact in quality of the segment of data and a highest reduction in bit rate of the segment of data.
6. The method of claim 4, wherein reselecting encoding modes for one or more blocks of pixels comprises reselecting one of intra-modes or inter-modes associated with the blocks of pixels.
7. The method of claim 4, wherein reselecting encoding modes for one or more blocks of pixels comprises reselecting one of a forward prediction mode, a backward prediction mode, and a bi-directional prediction mode for one or more inter-mode blocks of pixels.
8. The method of claim 1, wherein adjusting the amount of motion information to be encoded comprises increasing the amount of motion information to be encoded.
9. The method of claim 1, wherein resizing the segment of data further comprises adjusting one or more encoding variables in response to the request to further resize the segment of data.
10. The method of claim 9, wherein adjusting one or more encoding variables comprises adjusting one of a bit rate at which to encode the segment of data, a quantization parameter at which to encode the segment of data, and a frame rate at which to encode the segment of data.
11. The method of claim 1, wherein the segment of data comprises portions of real-time data.
12. An apparatus for encoding a stream of digital multimedia data, the apparatus comprising:
a multiplex module that:
determines when an amount of available transmission channel resources for a current transmission frame is exceeded by a plurality of segments of data of the stream of digital multimedia data; and
generates a request to resize at least one segment of data of the plurality of segments of data;
an interface that receives the request to resize the at least one segment of data of the segments of data associated with the stream of digital multimedia data for the current transmission frame, the request sent in response to a comparison between quality and rate information of the at least one segment of data and transmission channel resources with respect to the current transmission frame; and
a resizing module that:
resizes the at least one segment of data by adjusting an amount of motion information to be encoded with the at least one segment of data in response to the request; and
transmits, to the multiplex module, the at least one resized segment of data and at least one unresized segment of data for the current transmission frame,
wherein the multiplex module collects the at least one resized segment of data and the at least one unresized segment of data to generate the current transmission frame.
13. The apparatus of claim 12, wherein the resizing module resizes the segment of data to reduce the amount of motion information to be encoded.
14. The apparatus of claim 13, wherein the resizing module merges motion information associated with two or more blocks of pixels of at least one frame within the segment of data to reduce the amount of motion information.
15. The apparatus of claim 12, wherein the resizing module reselects encoding modes for one or more blocks of pixels based on information generated during a first pass encoding of the segment of data.
16. The apparatus of claim 15, wherein the resizing module reselects the encoding modes for one or more blocks of pixels that result in a smallest impact in quality of the segment of data and a highest reduction in bit rate of the segment of data.
17. The apparatus of claim 15, wherein the resizing module reselects one of intra-modes or inter-modes associated with the blocks of pixels.
18. The apparatus of claim 15, wherein the resizing module reselects one of a forward prediction mode, a backward prediction mode, and a bi-directional prediction mode for one or more inter-mode blocks of pixels.
19. The apparatus of claim 12, wherein the resizing module increases the amount of motion information to be encoded.
20. The apparatus of claim 12, wherein the resizing module adjusts one or more encoding variables in response to the request to further resize the segment of data.
21. The apparatus of claim 20, wherein the resizing module adjusts one of a bit rate at which to encode the segment of data, a quantization parameter at which to encode the segment of data, and a frame rate at which to encode the segment of data.
22. The apparatus of claim 12, wherein the segment of data comprises portions of real-time data.
23. An apparatus for encoding a stream of digital multimedia data, the apparatus comprising:
means for determining when an amount of available transmission channel resources for a current transmission frame is exceeded by a plurality of segments of data of the stream of digital multimedia data and for generating a request to resize at least one segment of data of the plurality of segments of data;
means for receiving the request to resize the at least one segment of data associated with the stream of digital multimedia data for the current transmission frame, the request sent in response to a comparison between quality and rate information of the at least one segment of data and transmission channel resources with respect to the current transmission frame;
means for resizing the at least one segment of data by adjusting an amount of motion information to be encoded with the at least one segment of data in response to the request;
means for transmitting the at least one resized segment of data and at least one unresized segment of data; and
means for generating the current transmission frame by collecting the at least one resized segment of data and the at least one unresized segment of data.
24. The apparatus of claim 23, wherein the resizing means reduces the amount of motion information to be encoded.
25. The apparatus of claim 24, wherein the resizing means merges motion information associated with two or more blocks of pixels of at least one frame within the segment of data.
26. The apparatus of claim 23, wherein the resizing means reselects modes for one or more blocks of pixels based on information generated during a first pass encoding of the segment of data.
27. The apparatus of claim 26, wherein the resizing means reselects the encoding modes for one or more blocks of pixels that result in a smallest impact in quality of the segment of data and a highest reduction in bit rate of the segment of data.
28. The apparatus of claim 26, wherein the resizing means reselects one of intra-modes or inter-modes associated with the blocks of pixels.
29. The apparatus of claim 26, wherein the resizing means reselects one of a forward prediction mode, a backward prediction mode, and a bi-directional prediction mode for one or more inter-mode blocks of pixels.
30. The apparatus of claim 23, wherein the resizing means increases the amount of motion information to be encoded.
31. The apparatus of claim 23, wherein the resizing means adjusts one or more encoding variables in response to the request to further resize the segment of data.
32. The apparatus of claim 31, wherein the resizing means adjusts one of a bit rate at which to encode the segment of data, a frame rate at which to encode the segment of data, and a quantization parameter at which to encode the segment of data.
33. The apparatus of claim 23, wherein the segment of data comprises portions of real-time data.
34. A processor for processing digital video data, the processor being adapted to:
determine when an amount of available transmission channel resources for a current transmission frame is exceeded by a plurality of segments of data of the digital video data and generate a request to resize at least one segment of data of the plurality of segments of data associated with the digital video data;
receive the request to resize the at least one segment of data associated with the digital video data, the request sent in response to a comparison between quality and rate information of the at least one segment of data and transmission channel resources with respect to the current transmission frame;
resize the at least one segment of data by adjusting an amount of motion information to be encoded with the at least one segment of data in response to the request; and
generate the current transmission frame by collecting the at least one resized segment of data and at least one unresized segment of data of the plurality of segments of data.
35. The processor of claim 34, wherein the processor is adapted to reduce the amount of motion information to be encoded.
36. The processor of claim 35, wherein the processor is adapted to merge motion information associated with two or more blocks of pixels of at least one frame within the segment of data.
37. The processor of claim 34, wherein the processor is adapted to:
reselect encoding modes for one or more blocks of pixels based on information generated during a first pass encoding of the segment of data.
38. The processor of claim 37, wherein the processor is adapted to reselect the encoding modes for one or more blocks of pixels that result in a smallest impact in quality of the segment of data and a highest reduction in bit rate of the segment of data.
39. The processor of claim 37, wherein the processor is adapted to reselect one of intra-modes or inter-modes associated with the blocks of pixels.
40. The processor of claim 37, wherein the processor is adapted to reselect one of a forward prediction mode, a backward prediction mode, and a bi-directional prediction mode for one or more inter-mode blocks of pixels.
41. The processor of claim 34, wherein the processor is adapted to increase the amount of motion information to be encoded.
42. The processor of claim 34, wherein the processor is adapted to adjust one or more encoding variables in response to the request to further resize the segment of data.
43. The processor of claim 42, wherein the processor is adapted to adjust one of a bit rate at which to encode the segment of data, a frame rate at which to encode the segment of data and a quantization parameter at which to encode the segment of data.
44. A non-transitory computer-readable medium for processing digital video data comprising instructions that cause at least one computer to:
determine when an amount of available transmission channel resources for a current transmission frame is exceeded by a plurality of segments of data of the digital video data and generate a request to resize at least one segment of data associated with the digital video data;
receive the request to resize the at least one segment of data associated with the digital video data for the current transmission frame, the request sent in response to a comparison between quality and rate information of the at least one segment of data and transmission channel resources with respect to the current transmission frame;
resize the at least one segment of data by adjusting an amount of motion information to be encoded with the segment of data in response to the request; and
generate the current transmission frame by collecting the at least one resized segment of data and at least one unresized segment of data of the plurality of segments of data.
45. The non-transitory computer-readable medium of claim 44, wherein the instructions that cause the computer to resize the segment of data by adjusting the amount of motion information to be encoded comprise instructions to cause the computer to reduce the amount of motion information to be encoded.
46. The non-transitory computer-readable medium of claim 45, wherein the instructions that cause the computer to reduce the amount of motion information to be encoded comprise instructions that cause the computer to merge motion information associated with two or more blocks of pixels of at least one frame within the segment of data.
47. The non-transitory computer-readable medium of claim 44, wherein the computer readable medium further comprises instructions that cause the computer to reselect encoding modes for one or more blocks of pixels based on information generated during a first pass encoding of the segment of data.
48. The non-transitory computer-readable medium of claim 47, wherein the instructions that cause the computer to reselect encoding modes comprise instructions that cause the computer to reselect the encoding modes for one or more blocks of pixels that result in a smallest impact in quality of the segment of data and a highest reduction in bit rate of the segment of data.
49. The non-transitory computer-readable medium of claim 47, wherein the instructions that cause the computer to reselect encoding modes comprise instructions that cause the computer to reselect one of intra-modes or inter-modes associated with the blocks of pixels.
50. The non-transitory computer-readable medium of claim 47, wherein the instructions that cause the computer to reselect encoding modes comprise instructions that cause the computer to reselect one of a forward prediction mode, a backward prediction mode, and a bi-directional prediction mode for one or more inter-mode blocks of pixels.
51. The non-transitory computer-readable medium of claim 44, wherein the instructions that cause the computer to adjust the amount of motion information to be encoded comprise instructions that cause the computer to increase the amount of motion information to be encoded.
52. The non-transitory computer-readable medium of claim 44, wherein the computer readable medium further comprises instructions that cause the computer to adjust one or more encoding variables in response to the request to further resize the segment of data.
53. The non-transitory computer-readable medium of claim 52, wherein the instructions that cause the computer to adjust one or more encoding variables comprise instructions that cause the computer to adjust one of a bit rate at which to encode the segment of data, a frame rate at which to encode the segment of data and a quantization parameter at which to encode the segment of data.
US11/668,828 2006-01-31 2007-01-30 Methods and systems for resizing multimedia content Expired - Fee Related US8792555B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/668,828 US8792555B2 (en) 2006-01-31 2007-01-30 Methods and systems for resizing multimedia content

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US76399506P 2006-01-31 2006-01-31
US11/668,828 US8792555B2 (en) 2006-01-31 2007-01-30 Methods and systems for resizing multimedia content

Publications (2)

Publication Number Publication Date
US20080037624A1 US20080037624A1 (en) 2008-02-14
US8792555B2 true US8792555B2 (en) 2014-07-29

Family

ID=38267577

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/669,000 Expired - Fee Related US8582905B2 (en) 2006-01-31 2007-01-30 Methods and systems for rate control within an encoding device
US11/668,828 Expired - Fee Related US8792555B2 (en) 2006-01-31 2007-01-30 Methods and systems for resizing multimedia content

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/669,000 Expired - Fee Related US8582905B2 (en) 2006-01-31 2007-01-30 Methods and systems for rate control within an encoding device

Country Status (8)

Country Link
US (2) US8582905B2 (en)
EP (2) EP1982526A2 (en)
JP (2) JP2009525706A (en)
KR (2) KR20080102139A (en)
CN (3) CN101507279A (en)
AR (2) AR059272A1 (en)
TW (2) TW200746835A (en)
WO (2) WO2007090178A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110125907A1 (en) * 2003-11-24 2011-05-26 At&T Intellectual Property I, L.P. Methods, Systems, and Products for Providing Communications Services
US9996898B2 (en) 2014-05-30 2018-06-12 International Business Machines Corporation Flexible control in resizing of visual displays

Families Citing this family (63)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7653085B2 (en) 2005-04-08 2010-01-26 Qualcomm Incorporated Methods and apparatus for enhanced delivery of content over data network
US7974193B2 (en) * 2005-04-08 2011-07-05 Qualcomm Incorporated Methods and systems for resizing multimedia content based on quality and rate information
US20070201388A1 (en) * 2006-01-31 2007-08-30 Qualcomm Incorporated Methods and systems for resizing multimedia content based on quality and rate information
US8582905B2 (en) 2006-01-31 2013-11-12 Qualcomm Incorporated Methods and systems for rate control within an encoding device
US8665967B2 (en) * 2006-02-15 2014-03-04 Samsung Electronics Co., Ltd. Method and system for bit reorganization and packetization of uncompressed video for transmission over wireless communication channels
KR100770849B1 (en) * 2006-02-17 2007-10-26 삼성전자주식회사 Apparatus and method of adapting compressed video in wireless fading environment
US20070230461A1 (en) * 2006-03-29 2007-10-04 Samsung Electronics Co., Ltd. Method and system for video data packetization for transmission over wireless channels
US8175041B2 (en) * 2006-12-14 2012-05-08 Samsung Electronics Co., Ltd. System and method for wireless communication of audiovisual data having data size adaptation
US8804829B2 (en) * 2006-12-20 2014-08-12 Microsoft Corporation Offline motion description for video generation
EP2059049A1 (en) * 2007-11-07 2009-05-13 British Telecommunications public limited company Video coding
KR101398212B1 (en) * 2008-03-18 2014-05-26 삼성전자주식회사 Memory device and encoding and/or decoding method
US8176524B2 (en) * 2008-04-22 2012-05-08 Samsung Electronics Co., Ltd. System and method for wireless communication of video data having partial data compression
US8396114B2 (en) * 2009-01-29 2013-03-12 Microsoft Corporation Multiple bit rate video encoding using variable bit rate and dynamic resolution for adaptive video streaming
KR101830882B1 (en) * 2010-09-02 2018-02-22 삼성전자주식회사 Method and apparatus for generating control packet
US8813144B2 (en) * 2011-01-10 2014-08-19 Time Warner Cable Enterprises Llc Quality feedback mechanism for bandwidth allocation in a switched digital video system
US8751565B1 (en) 2011-02-08 2014-06-10 Google Inc. Components for web-based configurable pipeline media processing
KR20120103517A (en) * 2011-03-10 2012-09-19 한국전자통신연구원 Method for intra prediction and apparatus thereof
WO2012121575A2 (en) 2011-03-10 2012-09-13 한국전자통신연구원 Method and device for intra-prediction
US8681866B1 (en) 2011-04-28 2014-03-25 Google Inc. Method and apparatus for encoding video by downsampling frame resolution
US9106787B1 (en) 2011-05-09 2015-08-11 Google Inc. Apparatus and method for media transmission bandwidth control using bandwidth estimation
US8924580B2 (en) 2011-08-12 2014-12-30 Cisco Technology, Inc. Constant-quality rate-adaptive streaming
US9591318B2 (en) 2011-09-16 2017-03-07 Microsoft Technology Licensing, Llc Multi-layer encoding and decoding
US9398300B2 (en) * 2011-10-07 2016-07-19 Texas Instruments Incorporated Method, system and apparatus for intra-prediction in video signal processing using combinable blocks
US9659437B2 (en) 2012-09-28 2017-05-23 Bally Gaming, Inc. System and method for cross platform persistent gaming sessions using a mobile device
US20130097220A1 (en) * 2011-10-14 2013-04-18 Bally Gaming, Inc. Streaming bitrate control and management
US9672688B2 (en) 2011-10-14 2017-06-06 Bally Gaming, Inc. System and method for cross platform persistent gaming sessions using a mobile device
US9767642B2 (en) 2011-10-14 2017-09-19 Bally Gaming, Inc. System and method for cross platform persistent gaming sessions using a mobile device
TWI580264B (en) * 2011-11-10 2017-04-21 Sony Corp Image processing apparatus and method
US11089343B2 (en) 2012-01-11 2021-08-10 Microsoft Technology Licensing, Llc Capability advertisement, configuration and control for video coding and decoding
US9042441B2 (en) * 2012-04-25 2015-05-26 At&T Intellectual Property I, Lp Apparatus and method for media streaming
US9185429B1 (en) 2012-04-30 2015-11-10 Google Inc. Video encoding and decoding using un-equal error protection
US9125073B2 (en) 2012-08-03 2015-09-01 Intel Corporation Quality-aware adaptive streaming over hypertext transfer protocol using quality attributes in manifest file
KR102004928B1 (en) * 2012-12-04 2019-07-29 에스케이하이닉스 주식회사 Data storage device and processing method for error correction code thereof
US9172740B1 (en) 2013-01-15 2015-10-27 Google Inc. Adjustable buffer remote access
US9311692B1 (en) 2013-01-25 2016-04-12 Google Inc. Scalable buffer remote access
US9225979B1 (en) 2013-01-30 2015-12-29 Google Inc. Remote access encoding
US20140286438A1 (en) * 2013-03-19 2014-09-25 Nvidia Corporation Quality of service management server and method of managing streaming bit rate
US9438652B2 (en) * 2013-04-15 2016-09-06 Opentv, Inc. Tiered content streaming
WO2015054812A1 (en) 2013-10-14 2015-04-23 Microsoft Technology Licensing, Llc Features of base color index map mode for video and image coding and decoding
WO2015054813A1 (en) 2013-10-14 2015-04-23 Microsoft Technology Licensing, Llc Encoder-side options for intra block copy prediction mode for video and image coding
AU2013403224B2 (en) 2013-10-14 2018-10-18 Microsoft Technology Licensing, Llc Features of intra block copy prediction mode for video and image coding and decoding
US10390034B2 (en) * 2014-01-03 2019-08-20 Microsoft Technology Licensing, Llc Innovations in block vector prediction and estimation of reconstructed sample values within an overlap area
CN105917650B (en) 2014-01-03 2019-12-24 微软技术许可有限责任公司 Method for encoding/decoding video and image, computing device and computer readable medium
US11284103B2 (en) 2014-01-17 2022-03-22 Microsoft Technology Licensing, Llc Intra block copy prediction with asymmetric partitions and encoder-side search patterns, search ranges and approaches to partitioning
US10542274B2 (en) 2014-02-21 2020-01-21 Microsoft Technology Licensing, Llc Dictionary encoding and decoding of screen content
CN105247871B (en) 2014-03-04 2020-09-15 微软技术许可有限责任公司 Block flipping and skip mode in intra block copy prediction
WO2015192353A1 (en) 2014-06-19 2015-12-23 Microsoft Technology Licensing, Llc Unified intra block copy and inter prediction modes
US10306021B1 (en) * 2014-08-21 2019-05-28 Amazon Technologies, Inc. Streaming content to multiple clients
MX2017004211A (en) 2014-09-30 2017-11-15 Microsoft Technology Licensing Llc Rules for intra-picture prediction modes when wavefront parallel processing is enabled.
US20160100173A1 (en) * 2014-10-03 2016-04-07 International Business Machines Corporation Enhanced Video Streaming
US9591325B2 (en) 2015-01-27 2017-03-07 Microsoft Technology Licensing, Llc Special case handling for merged chroma blocks in intra block copy prediction mode
US10659783B2 (en) 2015-06-09 2020-05-19 Microsoft Technology Licensing, Llc Robust encoding/decoding of escape-coded pixels in palette mode
EP3364655A4 (en) * 2015-11-11 2018-10-10 Samsung Electronics Co., Ltd. Method and apparatus for decoding video, and method and apparatus for encoding video
US10301858B2 (en) * 2016-06-14 2019-05-28 Microsoft Technology Licensing, Llc Hinge mechanism
US10501973B2 (en) 2016-06-14 2019-12-10 Microsoft Technology Licensing, Llc Hinge with free-stop function
US9840861B1 (en) 2016-06-14 2017-12-12 Microsoft Technology Licensing, Llc Hinged device with snap open lock
US10061359B2 (en) 2016-07-28 2018-08-28 Microsoft Technology Licensing, Llc Hinged device with living hinge
US10372539B2 (en) * 2017-11-20 2019-08-06 Western Digital Technologies, Inc. Variable length CLDPC encoder and method of operation in an autonomous vehicle
US10986349B2 (en) 2017-12-29 2021-04-20 Microsoft Technology Licensing, Llc Constraints on locations of reference blocks for intra block copy prediction
US10924812B2 (en) * 2018-02-15 2021-02-16 Cisco Technology, Inc. Constant quality video encoding with encoding parameter fine-tuning
JP7229696B2 (en) 2018-08-02 2023-02-28 キヤノン株式会社 Information processing device, information processing method, and program
US11722710B1 (en) * 2021-12-03 2023-08-08 Amazon Technologies, Inc. Dynamic encoding parameter adjustment
CN114245168B (en) * 2021-12-16 2023-12-08 北京数码视讯技术有限公司 Multimedia stream transmission regulation device and method

Citations (103)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5426463A (en) 1993-02-22 1995-06-20 Rca Thomson Licensing Corporation Apparatus for controlling quantizing in a video signal compressor
JPH0818976A (en) 1994-06-29 1996-01-19 Toshiba Corp Dynamic image encoder/decoder
EP0759667A2 (en) 1995-08-22 1997-02-26 Digi-Media Vision Limited Statistical multiplexing
JPH09233475A (en) 1996-02-23 1997-09-05 Tsushin Hoso Kiko Image coder
US5764632A (en) 1996-04-01 1998-06-09 Nokia Mobile Phones Limited Mobile terminal having improved paging channel acquisition in a system using a digital control channel
JPH10276439A (en) 1997-03-28 1998-10-13 Sharp Corp Moving image encoding and decoding device using motion compensation inter-frame prediction system capable of area integration
US5844613A (en) * 1997-03-17 1998-12-01 Microsoft Corporation Global motion estimator for motion video signal encoding
US5854658A (en) 1995-12-26 1998-12-29 C-Cube Microsystems Inc. Statistical multiplexing system which encodes a sequence of video images using a plurality of video encoders
KR19990077445A (en) 1998-03-19 1999-10-25 포만 제프리 엘 A real-time single pass variable bit rate control strategy and encoder
JP2000031964A (en) 1998-07-10 2000-01-28 Digital Vision Laboratories:Kk Stream distribution system
WO2000008891A1 (en) 1998-08-03 2000-02-17 Siemens Aktiengesellschaft Method for switching a first communication link to a second communication link between two communication systems
US6038256A (en) 1996-12-31 2000-03-14 C-Cube Microsystems Inc. Statistical multiplexed video encoding using pre-encoding a priori statistics and a priori and a posteriori statistics
JP2000092471A (en) 1998-09-09 2000-03-31 Matsushita Electric Ind Co Ltd Video server device and its band managing method, and recording medium where band managing program is recorded
EP1014632A2 (en) 1998-12-23 2000-06-28 Teles AG Informationstechnologien Method and switching device to transfer data
US6091777A (en) * 1997-09-18 2000-07-18 Cubic Video Technologies, Inc. Continuously adaptive digital video compression system and method for a web streamer
KR20000049059A (en) 1996-10-11 2000-07-25 러셀 비. 밀러 Adaptive rate control for digital video compression
JP2000324498A (en) 1999-05-13 2000-11-24 Nec Corp Animation encoding device
EP1061522A2 (en) 1999-06-16 2000-12-20 Victor Company of Japan, Ltd. Recording apparatus
US6185253B1 (en) 1997-10-31 2001-02-06 Lucent Technology, Inc. Perceptual compression and robust bit-rate control system
JP2001136139A (en) 1999-11-05 2001-05-18 Sony Corp Multiplexer and multiplexing method
US6243495B1 (en) * 1998-02-13 2001-06-05 Grass Valley (Us) Inc. Method a group of picture structure in MPEG video
WO2001047283A1 (en) 1999-12-22 2001-06-28 General Instrument Corporation Video compression for multicast environments using spatial scalability and simulcast coding
US6256423B1 (en) 1998-09-18 2001-07-03 Sarnoff Corporation Intra-frame quantizer selection for video compression
US6310915B1 (en) * 1998-11-20 2001-10-30 Harmonic Inc. Video transcoder with bitstream look ahead for rate control and statistical multiplexing
EP1168731A1 (en) 2000-02-02 2002-01-02 Mitsubishi Denki Kabushiki Kaisha Method and apparatus for distributing compressed bit stream to user device over network
JP2002010260A (en) 2000-06-27 2002-01-11 Mitsubishi Electric Corp Motion vector detection method and motion picture encoding apparatus
WO2002007447A1 (en) 2000-07-14 2002-01-24 Sony United Kingdom Limited Data encoding apparatus with multiple encoders
US20020056005A1 (en) 2000-11-09 2002-05-09 Alcatel Method, devices and program modules for data transmission with assured quality of service
US20020101832A1 (en) 2001-01-30 2002-08-01 Xixan Chen Method and system for controlling device transmit power in a wireless communication network
US20020137535A1 (en) 2001-01-19 2002-09-26 Denso Corporation Open-loop power control enhancement for blind rescue channel operation
US20020141357A1 (en) 2001-02-01 2002-10-03 Samsung Electronics Co., Ltd. Method for providing packet call service in radio telecommunication system
US20020142772A1 (en) 2000-12-05 2002-10-03 Hunzinger Jason F. Minimum interference multiple-access method and system for connection rescue
WO2002082743A2 (en) 2001-04-05 2002-10-17 Cowave Networks, Inc. Reservation protocol in a node network
JP2003046582A (en) 2001-07-30 2003-02-14 Matsushita Electric Ind Co Ltd Video signal encoder, video signal decoder and video transmission system
US6526031B1 (en) 2001-06-21 2003-02-25 Motorola, Inc. Forward power control determination in spread spectrum communications systems
US20030040315A1 (en) 2001-08-20 2003-02-27 Farideh Khaleghi Reduced state transition delay and signaling overhead for mobile station state transitions
US20030053416A1 (en) 2001-09-19 2003-03-20 Microsoft Corporation Generalized reference decoder for image or video processing
US6539124B2 (en) * 1999-02-03 2003-03-25 Sarnoff Corporation Quantizer selection based on region complexities derived using a rate distortion model
US20030083093A1 (en) 2001-10-31 2003-05-01 Lg Electronics Inc. Method and apparatus for adjusting power in communication system
US6560231B1 (en) 1997-07-23 2003-05-06 Ntt Mobile Communications Network, Inc. Multiplex transmission system and bandwidth control method
US20030093515A1 (en) 2001-11-14 2003-05-15 Kauffman Marc W. Quality of service control of streamed content delivery
US6574279B1 (en) 2000-02-02 2003-06-03 Mitsubishi Electric Research Laboratories, Inc. Video transcoding using syntactic and semantic clues
US20030110297A1 (en) * 2001-12-12 2003-06-12 Tabatabai Ali J. Transforming multimedia data for delivery to multiple heterogeneous devices
US20030112366A1 (en) 2001-11-21 2003-06-19 General Instrument Corporation Apparatus and methods for improving video quality delivered to a display device
US20030123413A1 (en) 1998-11-09 2003-07-03 Samsung Electronics Co., Ltd. RSMA control device and method for mobile communication system
US20030185369A1 (en) 2002-03-29 2003-10-02 Oliver Neal C. Telephone conference bridge provided via a plurality of computer telephony resource algorithms
US6674796B1 (en) 2000-02-14 2004-01-06 Harmonic, Inc. Statistical multiplexed video encoding for diverse video formats
US20040013102A1 (en) 2001-06-27 2004-01-22 Mo-Han Fong Mapping information in wireless communications systems
US20040013103A1 (en) 2001-06-27 2004-01-22 Hang Zhang Communication of control information in wireless communication systems
US20040028139A1 (en) 2002-08-06 2004-02-12 Andre Zaccarin Video encoding
EP1395039A1 (en) 2002-04-05 2004-03-03 Matsushita Electric Industrial Co., Ltd. Code amount control device and code amount control method
US20040042438A1 (en) 2002-08-15 2004-03-04 James Jiang Trunking system for CDMA wireless communication
US6704281B1 (en) 1999-01-15 2004-03-09 Nokia Mobile Phones Ltd. Bit-rate control in a multimedia device
WO2004023743A2 (en) 2002-09-06 2004-03-18 Matsushita Electric Industrial Co., Ltd. Methods for performing medium dedication in order to ensure the quality of service for delivering real-time data across wireless network
CN1492685A (en) 2002-10-03 Matsushita Electric Industrial Co Ltd Image coding method and image coding device
US20040087331A1 (en) 1999-06-28 2004-05-06 Jong-Yoon Hwang Apparatus and method of controlling forward link power when in discontinuous transmission mode in a mobile communication
US6744743B2 (en) 2000-03-30 2004-06-01 Qualcomm Incorporated Method and apparatus for controlling transmissions of a communications system
US20040114817A1 (en) 2002-07-01 2004-06-17 Nikil Jayant Efficient compression and transport of video over a network
EP1443511A2 (en) 2003-01-22 2004-08-04 Matsushita Electric Industrial Co., Ltd. Method and device for ensuring storage time for digital broadcast
US20040165783A1 (en) 2001-09-26 2004-08-26 Interact Devices, Inc. System and method for dynamically switching quality settings of a codec to maintain a target data rate
JP2004248144A (en) 2003-02-17 2004-09-02 Ricoh Co Ltd Image processor, image compressor, image processing method, image compressing method, program, and recording medium
US20040220966A1 (en) 2003-05-02 2004-11-04 Justin Ridge Method and apparatus for providing a multimedia data stream
JP2004349855A (en) 2003-05-20 2004-12-09 Mitsubishi Electric Corp Coder
CN1555612A (en) 2001-05-03 Qualcomm Inc Method and apparatus for controlling uplink transmissions of a wireless communication system
US20050058058A1 (en) 2003-07-30 2005-03-17 Samsung Electronics Co., Ltd. Ranging method in a mobile communication system using orthogonal frequency division multiple access
US20050063330A1 (en) 2003-09-20 2005-03-24 Samsung Electronics Co., Ltd. Method for uplink bandwidth request and allocation based on a quality of service class in a broadband wireless access communication system
US20050078759A1 (en) 2003-08-27 2005-04-14 Interdigital Technology Corporation Subcarrier and bit allocation for real time services in multiuser orthogonal frequency division multiplex (OFDM) systems
US20050110188A1 (en) 2003-10-23 2005-05-26 John Rausch Orifice plate and method of forming orifice plate for fluid ejection device
US20050120128A1 (en) 2003-12-02 2005-06-02 Wilife, Inc. Method and system of bandwidth management for streaming data
JP2005167514A (en) 2003-12-01 2005-06-23 Matsushita Electric Ind Co Ltd Streaming data communication system, streaming data communication apparatus and suitable bit rate detecting method
US20050210515A1 (en) 2004-03-22 2005-09-22 Lg Electronics Inc. Server system for performing communication over wireless network and operating method thereof
US20050223013A1 (en) 2000-10-23 2005-10-06 Matthew Jarman Delivery of navigation data for playback of audio and video content
US20050220188A1 (en) 1997-03-14 2005-10-06 Microsoft Corporation Digital video signal encoder and encoding method
US6959042B1 (en) * 2001-10-01 2005-10-25 Cisco Technology, Inc. Methods and apparatus for measuring compressed video signals and applications to statistical remultiplexing
US20050282571A1 (en) 2004-06-02 2005-12-22 Valentin Oprescu-Surcobe Method and apparatus for regulating a delivery of a broadcast-multicast service in a packet data communication system
US20050286631A1 (en) 2004-06-27 2005-12-29 Hsi Jung Wu Encoding with visual masking
JP2006014288A (en) 2004-05-26 2006-01-12 Matsushita Electric Ind Co Ltd Motion vector coding equipment, method, program and medium
US20060013298A1 (en) 2004-06-27 2006-01-19 Xin Tong Multi-pass video encoding
US20060171460A1 (en) 2005-01-05 2006-08-03 Yoichi Masuda Image conversion apparatus
US7095754B2 (en) 2000-11-03 2006-08-22 At&T Corp. Tiered contention multiple access (TCMA): a method for priority-based shared channel access
WO2006099082A2 (en) 2005-03-10 2006-09-21 Qualcomm Incorporated Content adaptive multimedia processing
WO2006110876A2 (en) 2005-04-08 2006-10-19 Qualcomm Incorporated Methods and apparatus for enhanced delivery of content over a data network
US7180905B2 (en) 2001-11-02 2007-02-20 At & T Corp. Access method for periodic contention-free sessions
WO2007035238A2 (en) * 2005-09-16 2007-03-29 Sony Electronics, Inc. Motion vector selection
US20070076599A1 (en) 2005-09-30 2007-04-05 The Boeing Company System and method for providing integrated services across cryptographic boundaries in a network
US7245605B2 (en) 2001-11-02 2007-07-17 At&T Corp. Preemptive packet for maintaining contiguity in cyclic prioritized multiple access (CPMA) contention-free sessions
US7248600B2 (en) 2001-11-02 2007-07-24 At&T Corp. ‘Shield’: protecting high priority channel access attempts in overlapped wireless cells
US20070204067A1 (en) 2006-01-31 2007-08-30 Qualcomm Incorporated Methods and systems for rate control within an encoding device
US20070201388A1 (en) 2006-01-31 2007-08-30 Qualcomm Incorporated Methods and systems for resizing multimedia content based on quality and rate information
US7272181B2 (en) 1999-10-21 2007-09-18 Toshiba America Electronic Components, Inc. Method and apparatus for estimating and controlling the number of bits output from a video coder
US7277415B2 (en) 2001-11-02 2007-10-02 At&T Corp. Staggered startup for cyclic prioritized multiple access (CPMA) contention-free sessions
US7280517B2 (en) 2001-11-02 2007-10-09 At&T Corp. Wireless LANs and neighborhood capture
US20070274340A1 (en) 2005-04-08 2007-11-29 Qualcomm Incorporated Methods and systems for resizing multimedia content based on quality and rate information
US20080037420A1 (en) 2003-10-08 2008-02-14 Bob Tang Immediate ready implementation of virtually congestion free guaranteed service capable network: external internet nextgentcp (square waveform) TCP friendly san
US20080152003A1 (en) * 2006-12-22 2008-06-26 Qualcomm Incorporated Multimedia data reorganization between base layer and enhancement layer
US7400642B2 (en) 2003-08-29 2008-07-15 Samsung Electronics Co., Ltd Apparatus and method for controlling operational states of medium access control layer in a broadband wireless access communication system
US7571246B2 (en) 2004-07-29 2009-08-04 Microsoft Corporation Media transrating over a bandwidth-limited network
US7659907B1 (en) 2002-03-29 2010-02-09 Graphics Properties Holdings, Inc. System and method for providing dynamic control of a graphics session
US7746836B2 (en) * 2006-10-16 2010-06-29 Motorola, Inc. Method and apparatus for re-registration of connections for service continuity in an agnostic access internet protocol multimedia communication system
US7835437B1 (en) * 2003-03-10 2010-11-16 Ji Zhang Statistical remultiplexing of compressed video segments
US7876821B2 (en) * 2002-09-05 2011-01-25 Agency For Science, Technology And Research Method and an apparatus for controlling the rate of a video sequence; a video encoding device
US7913277B1 (en) * 2006-03-30 2011-03-22 Nortel Networks Limited Metadata extraction and re-insertion and improved transcoding in digital media systems
US8351513B2 (en) * 2006-12-19 2013-01-08 Allot Communications Ltd. Intelligent video signal encoding utilizing regions of interest information

Patent Citations (116)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5426463A (en) 1993-02-22 1995-06-20 Rca Thomson Licensing Corporation Apparatus for controlling quantizing in a video signal compressor
JPH0818976A (en) 1994-06-29 1996-01-19 Toshiba Corp Dynamic image encoder/decoder
EP0759667A2 (en) 1995-08-22 1997-02-26 Digi-Media Vision Limited Statistical multiplexing
US5854658A (en) 1995-12-26 1998-12-29 C-Cube Microsystems Inc. Statistical multiplexing system which encodes a sequence of video images using a plurality of video encoders
JPH09233475A (en) 1996-02-23 1997-09-05 Tsushin Hoso Kiko Image coder
US5764632A (en) 1996-04-01 1998-06-09 Nokia Mobile Phones Limited Mobile terminal having improved paging channel acquisition in a system using a digital control channel
US7023915B2 (en) 1996-10-11 2006-04-04 Qualcomm, Incorporated Adaptive rate control for digital video compression
KR20000049059A (en) 1996-10-11 2000-07-25 러셀 비. 밀러 Adaptive rate control for digital video compression
US6038256A (en) 1996-12-31 2000-03-14 C-Cube Microsystems Inc. Statistical multiplexed video encoding using pre-encoding a priori statistics and a priori and a posteriori statistics
US20050220188A1 (en) 1997-03-14 2005-10-06 Microsoft Corporation Digital video signal encoder and encoding method
US5844613A (en) * 1997-03-17 1998-12-01 Microsoft Corporation Global motion estimator for motion video signal encoding
JPH10276439A (en) 1997-03-28 1998-10-13 Sharp Corp Moving image encoding and decoding device using motion compensation inter-frame prediction system capable of area integration
US6560231B1 (en) 1997-07-23 2003-05-06 Ntt Mobile Communications Network, Inc. Multiplex transmission system and bandwidth control method
US6091777A (en) * 1997-09-18 2000-07-18 Cubic Video Technologies, Inc. Continuously adaptive digital video compression system and method for a web streamer
US6185253B1 (en) 1997-10-31 2001-02-06 Lucent Technology, Inc. Perceptual compression and robust bit-rate control system
US6243495B1 (en) * 1998-02-13 2001-06-05 Grass Valley (Us) Inc. Method a group of picture structure in MPEG video
KR19990077445A (en) 1998-03-19 1999-10-25 포만 제프리 엘 A real-time single pass variable bit rate control strategy and encoder
JP2000031964A (en) 1998-07-10 2000-01-28 Digital Vision Laboratories:Kk Stream distribution system
WO2000008891A1 (en) 1998-08-03 2000-02-17 Siemens Aktiengesellschaft Method for switching a first communication link to a second communication link between two communication systems
JP2000092471A (en) 1998-09-09 2000-03-31 Matsushita Electric Ind Co Ltd Video server device and its band managing method, and recording medium where band managing program is recorded
US6256423B1 (en) 1998-09-18 2001-07-03 Sarnoff Corporation Intra-frame quantizer selection for video compression
US20030123413A1 (en) 1998-11-09 2003-07-03 Samsung Electronics Co., Ltd. RSMA control device and method for mobile communication system
US6310915B1 (en) * 1998-11-20 2001-10-30 Harmonic Inc. Video transcoder with bitstream look ahead for rate control and statistical multiplexing
EP1014632A2 (en) 1998-12-23 2000-06-28 Teles AG Informationstechnologien Method and switching device to transfer data
US6704281B1 (en) 1999-01-15 2004-03-09 Nokia Mobile Phones Ltd. Bit-rate control in a multimedia device
US6539124B2 (en) * 1999-02-03 2003-03-25 Sarnoff Corporation Quantizer selection based on region complexities derived using a rate distortion model
JP2000324498A (en) 1999-05-13 2000-11-24 Nec Corp Animation encoding device
EP1061522A2 (en) 1999-06-16 2000-12-20 Victor Company of Japan, Ltd. Recording apparatus
US20040087331A1 (en) 1999-06-28 2004-05-06 Jong-Yoon Hwang Apparatus and method of controlling forward link power when in discontinuous transmission mode in a mobile communication
US7272181B2 (en) 1999-10-21 2007-09-18 Toshiba America Electronic Components, Inc. Method and apparatus for estimating and controlling the number of bits output from a video coder
JP2001136139A (en) 1999-11-05 2001-05-18 Sony Corp Multiplexer and multiplexing method
WO2001047283A1 (en) 1999-12-22 2001-06-28 General Instrument Corporation Video compression for multicast environments using spatial scalability and simulcast coding
US6574279B1 (en) 2000-02-02 2003-06-03 Mitsubishi Electric Research Laboratories, Inc. Video transcoding using syntactic and semantic clues
EP1168731A1 (en) 2000-02-02 2002-01-02 Mitsubishi Denki Kabushiki Kaisha Method and apparatus for distributing compressed bit stream to user device over network
US6674796B1 (en) 2000-02-14 2004-01-06 Harmonic, Inc. Statistical multiplexed video encoding for diverse video formats
US6744743B2 (en) 2000-03-30 2004-06-01 Qualcomm Incorporated Method and apparatus for controlling transmissions of a communications system
JP2002010260A (en) 2000-06-27 2002-01-11 Mitsubishi Electric Corp Motion vector detection method and motion picture encoding apparatus
WO2002007447A1 (en) 2000-07-14 2002-01-24 Sony United Kingdom Limited Data encoding apparatus with multiple encoders
US20050223013A1 (en) 2000-10-23 2005-10-06 Matthew Jarman Delivery of navigation data for playback of audio and video content
US7095754B2 (en) 2000-11-03 2006-08-22 At&T Corp. Tiered contention multiple access (TCMA): a method for priority-based shared channel access
US7024480B2 (en) 2000-11-09 2006-04-04 Alcatel Method, devices and program modules for data transmission with assured quality of service
US20020056005A1 (en) 2000-11-09 2002-05-09 Alcatel Method, devices and program modules for data transmission with assured quality of service
US20020142772A1 (en) 2000-12-05 2002-10-03 Hunzinger Jason F. Minimum interference multiple-access method and system for connection rescue
US20020137535A1 (en) 2001-01-19 2002-09-26 Denso Corporation Open-loop power control enhancement for blind rescue channel operation
US20020101832A1 (en) 2001-01-30 2002-08-01 Xixan Chen Method and system for controlling device transmit power in a wireless communication network
US20020141357A1 (en) 2001-02-01 2002-10-03 Samsung Electronics Co., Ltd. Method for providing packet call service in radio telecommunication system
WO2002082743A2 (en) 2001-04-05 2002-10-17 Cowave Networks, Inc. Reservation protocol in a node network
US7042856B2 (en) 2001-05-03 2006-05-09 Qualcomm, Incorporation Method and apparatus for controlling uplink transmissions of a wireless communication system
US20060262750A1 (en) 2001-05-03 2006-11-23 Qualcomm Incorporated Method and apparatus for controlling uplink transmissions of a wireless communication system
CN1555612A (en) 2001-05-03 Qualcomm Inc Method and apparatus for controlling uplink transmissions of a wireless communication system
US6526031B1 (en) 2001-06-21 2003-02-25 Motorola, Inc. Forward power control determination in spread spectrum communications systems
US20040013103A1 (en) 2001-06-27 2004-01-22 Hang Zhang Communication of control information in wireless communication systems
US20040013102A1 (en) 2001-06-27 2004-01-22 Mo-Han Fong Mapping information in wireless communications systems
JP2003046582A (en) 2001-07-30 2003-02-14 Matsushita Electric Ind Co Ltd Video signal encoder, video signal decoder and video transmission system
US20030040315A1 (en) 2001-08-20 2003-02-27 Farideh Khaleghi Reduced state transition delay and signaling overhead for mobile station state transitions
US20030053416A1 (en) 2001-09-19 2003-03-20 Microsoft Corporation Generalized reference decoder for image or video processing
US20040165783A1 (en) 2001-09-26 2004-08-26 Interact Devices, Inc. System and method for dynamically switching quality settings of a codec to maintain a target data rate
US6959042B1 (en) * 2001-10-01 2005-10-25 Cisco Technology, Inc. Methods and apparatus for measuring compressed video signals and applications to statistical remultiplexing
US20030083093A1 (en) 2001-10-31 2003-05-01 Lg Electronics Inc. Method and apparatus for adjusting power in communication system
US7180905B2 (en) 2001-11-02 2007-02-20 At & T Corp. Access method for periodic contention-free sessions
US7245605B2 (en) 2001-11-02 2007-07-17 At&T Corp. Preemptive packet for maintaining contiguity in cyclic prioritized multiple access (CPMA) contention-free sessions
US7248600B2 (en) 2001-11-02 2007-07-24 At&T Corp. ‘Shield’: protecting high priority channel access attempts in overlapped wireless cells
US7277415B2 (en) 2001-11-02 2007-10-02 At&T Corp. Staggered startup for cyclic prioritized multiple access (CPMA) contention-free sessions
US7280517B2 (en) 2001-11-02 2007-10-09 At&T Corp. Wireless LANs and neighborhood capture
US20030093515A1 (en) 2001-11-14 2003-05-15 Kauffman Marc W. Quality of service control of streamed content delivery
US20030112366A1 (en) 2001-11-21 2003-06-19 General Instrument Corporation Apparatus and methods for improving video quality delivered to a display device
US20030110297A1 (en) * 2001-12-12 2003-06-12 Tabatabai Ali J. Transforming multimedia data for delivery to multiple heterogeneous devices
US7659907B1 (en) 2002-03-29 2010-02-09 Graphics Properties Holdings, Inc. System and method for providing dynamic control of a graphics session
US20030185369A1 (en) 2002-03-29 2003-10-02 Oliver Neal C. Telephone conference bridge provided via a plurality of computer telephony resource algorithms
EP1395039A1 (en) 2002-04-05 2004-03-03 Matsushita Electric Industrial Co., Ltd. Code amount control device and code amount control method
US7936818B2 (en) * 2002-07-01 2011-05-03 Arris Group, Inc. Efficient compression and transport of video over a network
US20040114817A1 (en) 2002-07-01 2004-06-17 Nikil Jayant Efficient compression and transport of video over a network
US20040028139A1 (en) 2002-08-06 2004-02-12 Andre Zaccarin Video encoding
US20040042438A1 (en) 2002-08-15 2004-03-04 James Jiang Trunking system for CDMA wireless communication
US7876821B2 (en) * 2002-09-05 2011-01-25 Agency For Science, Technology And Research Method and an apparatus for controlling the rate of a video sequence; a video encoding device
WO2004023743A2 (en) 2002-09-06 2004-03-18 Matsushita Electric Industrial Co., Ltd. Methods for performing medium dedication in order to ensure the quality of service for delivering real-time data across wireless network
CN1492685A (en) 2002-10-03 2004-04-28 Matsushita Electric Industrial Co., Ltd. Image coding method and image coding device
EP1443511A2 (en) 2003-01-22 2004-08-04 Matsushita Electric Industrial Co., Ltd. Method and device for ensuring storage time for digital broadcast
JP2004248144A (en) 2003-02-17 2004-09-02 Ricoh Co Ltd Image processor, image compressor, image processing method, image compressing method, program, and recording medium
US7406202B2 (en) 2003-02-17 2008-07-29 Ricoh Company, Ltd. Image processing apparatus, image compression apparatus, image processing method, image compression method, program, and recording medium
US20040213466A1 (en) 2003-02-17 2004-10-28 Taku Kodama Image processing apparatus, image compression apparatus, image processing method, image compression method, program, and recording medium
US7835437B1 (en) * 2003-03-10 2010-11-16 Ji Zhang Statistical remultiplexing of compressed video segments
US20040220966A1 (en) 2003-05-02 2004-11-04 Justin Ridge Method and apparatus for providing a multimedia data stream
JP2004349855A (en) 2003-05-20 2004-12-09 Mitsubishi Electric Corp Coder
US20050058058A1 (en) 2003-07-30 2005-03-17 Samsung Electronics Co., Ltd. Ranging method in a mobile communication system using orthogonal frequency division multiple access
US20050078759A1 (en) 2003-08-27 2005-04-14 Interdigital Technology Corporation Subcarrier and bit allocation for real time services in multiuser orthogonal frequency division multiplex (OFDM) systems
US7400642B2 (en) 2003-08-29 2008-07-15 Samsung Electronics Co., Ltd Apparatus and method for controlling operational states of medium access control layer in a broadband wireless access communication system
US20050063330A1 (en) 2003-09-20 2005-03-24 Samsung Electronics Co., Ltd. Method for uplink bandwidth request and allocation based on a quality of service class in a broadband wireless access communication system
US20080037420A1 (en) 2003-10-08 2008-02-14 Bob Tang Immediate ready implementation of virtually congestion free guaranteed service capable network: external internet nextgentcp (square waveform) TCP friendly san
US20050110188A1 (en) 2003-10-23 2005-05-26 John Rausch Orifice plate and method of forming orifice plate for fluid ejection device
JP2005167514A (en) 2003-12-01 2005-06-23 Matsushita Electric Ind Co Ltd Streaming data communication system, streaming data communication apparatus and suitable bit rate detecting method
US20050120128A1 (en) 2003-12-02 2005-06-02 Wilife, Inc. Method and system of bandwidth management for streaming data
US20050210515A1 (en) 2004-03-22 2005-09-22 Lg Electronics Inc. Server system for performing communication over wireless network and operating method thereof
JP2006014288A (en) 2004-05-26 2006-01-12 Matsushita Electric Ind Co Ltd Motion vector coding equipment, method, program and medium
US7415241B2 (en) 2004-06-02 2008-08-19 Motorola, Inc. Method and apparatus for regulating a delivery of a broadcast-multicast service in a packet data communication system
US20050282571A1 (en) 2004-06-02 2005-12-22 Valentin Oprescu-Surcobe Method and apparatus for regulating a delivery of a broadcast-multicast service in a packet data communication system
US20060013298A1 (en) 2004-06-27 2006-01-19 Xin Tong Multi-pass video encoding
US20050286631A1 (en) 2004-06-27 2005-12-29 Hsi Jung Wu Encoding with visual masking
US7571246B2 (en) 2004-07-29 2009-08-04 Microsoft Corporation Media transrating over a bandwidth-limited network
US20060171460A1 (en) 2005-01-05 2006-08-03 Yoichi Masuda Image conversion apparatus
WO2006099082A2 (en) 2005-03-10 2006-09-21 Qualcomm Incorporated Content adaptive multimedia processing
WO2006110876A2 (en) 2005-04-08 2006-10-19 Qualcomm Incorporated Methods and apparatus for enhanced delivery of content over a data network
US20070274340A1 (en) 2005-04-08 2007-11-29 Qualcomm Incorporated Methods and systems for resizing multimedia content based on quality and rate information
US7653085B2 (en) 2005-04-08 2010-01-26 Qualcomm Incorporated Methods and apparatus for enhanced delivery of content over data network
US20110299587A1 (en) 2005-04-08 2011-12-08 Qualcomm Incorporated Methods and systems for resizing multimedia content based on quality and rate information
US7974193B2 (en) 2005-04-08 2011-07-05 Qualcomm Incorporated Methods and systems for resizing multimedia content based on quality and rate information
WO2007035238A2 (en) * 2005-09-16 2007-03-29 Sony Electronics, Inc. Motion vector selection
US7623458B2 (en) 2005-09-30 2009-11-24 The Boeing Company System and method for providing integrated services across cryptographic boundaries in a network
US20070076599A1 (en) 2005-09-30 2007-04-05 The Boeing Company System and method for providing integrated services across cryptographic boundaries in a network
US20070204067A1 (en) 2006-01-31 2007-08-30 Qualcomm Incorporated Methods and systems for rate control within an encoding device
US20080037624A1 (en) 2006-01-31 2008-02-14 Qualcomm Incorporated Methods and systems for resizing multimedia content
US20070201388A1 (en) 2006-01-31 2007-08-30 Qualcomm Incorporated Methods and systems for resizing multimedia content based on quality and rate information
US7913277B1 (en) * 2006-03-30 2011-03-22 Nortel Networks Limited Metadata extraction and re-insertion and improved transcoding in digital media systems
US7746836B2 (en) * 2006-10-16 2010-06-29 Motorola, Inc. Method and apparatus for re-registration of connections for service continuity in an agnostic access internet protocol multimedia communication system
US8351513B2 (en) * 2006-12-19 2013-01-08 Allot Communications Ltd. Intelligent video signal encoding utilizing regions of interest information
US20080152003A1 (en) * 2006-12-22 2008-06-26 Qualcomm Incorporated Multimedia data reorganization between base layer and enhancement layer

Non-Patent Citations (11)

* Cited by examiner, † Cited by third party
Title
3rd Generation Partnership Project (3GPP), "Physical Layer Standard for CDMA 2000 Spread Spectrum Systems", Doc. No. 3GPP2 C.S0002-C, Release C, Version 1.0, Dated May 28, 2002.
ISR-PCT/US2007/061416, dated Dec. 12, 2007.
ISR-PCT/US2007/061418; dated Mar. 8, 2007.
ISR-PCT/US2007/061419, dated Mar. 8, 2007.
Jia Zhike, et al., "Adaptive quantization scheme for very low bit rate video coding," Fifth Asia-Pacific Conference on Communications and Fourth Optoelectronics and Communications Conference (APCC/OECC '99), vol. 2, 1999, pp. 940-943.
Nakajima, T. et al., "Continuous media storage system supporting VBR streams," Proceedings of the 3rd International Workshop on Real-Time Computing Systems and Applications, Seoul, South Korea, Oct. 30-Nov. 1, 1996, IEEE Computer Soc., Los Alamitos, CA, pp. 26-33.
R. Thomas Derryberry et al, "Overview of CDMA2000, Revision D", May 28, 2002, XP-002311845.
TIA-1099, Forward Link Only Air Interface Specification for Terrestrial Mobile Multimedia Multicast, Oct. 2006.
Written Opinion-PCT/US2007/061418-ISA/EPO-Aug. 3, 2007.
Zhang X., et al., "Constant-Quality Constrained-Rate Allocation For FGS Video Bitstreams," Proceedings of SPIE Conference on Visual Communications and Image Processing (VCIP), Jan. 2002, pp. 817-827, vol. 4671.
Zheng, B. et al., "TSFD: Two stage frame dropping for scalable video transmission over data networks," 2001 IEEE Workshop on High Performance Switching and Routing, May 29-31, 2001, pp. 43-47.

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110125907A1 (en) * 2003-11-24 2011-05-26 At&T Intellectual Property I, L.P. Methods, Systems, and Products for Providing Communications Services
US9240901B2 (en) * 2003-11-24 2016-01-19 At&T Intellectual Property I, L.P. Methods, systems, and products for providing communications services by determining the communications services require a subcontracted processing service and subcontracting to the subcontracted processing service in order to provide the communications services
US10230658B2 (en) 2003-11-24 2019-03-12 At&T Intellectual Property I, L.P. Methods, systems, and products for providing communications services by incorporating a subcontracted result of a subcontracted processing service into a service requested by a client device
US9996898B2 (en) 2014-05-30 2018-06-12 International Business Machines Corporation Flexible control in resizing of visual displays
US10540744B2 (en) 2014-05-30 2020-01-21 International Business Machines Corporation Flexible control in resizing of visual displays

Also Published As

Publication number Publication date
TW200746835A (en) 2007-12-16
EP1982526A2 (en) 2008-10-22
WO2007090178A2 (en) 2007-08-09
TW200737850A (en) 2007-10-01
JP2009525705A (en) 2009-07-09
WO2007090178A3 (en) 2007-11-01
KR20080102139A (en) 2008-11-24
AR059273A1 (en) 2008-03-19
CN101371590A (en) 2009-02-18
US8582905B2 (en) 2013-11-12
CN101375604A (en) 2009-02-25
AR059272A1 (en) 2008-03-19
KR20080102141A (en) 2008-11-24
CN101507279A (en) 2009-08-12
KR100987232B1 (en) 2010-10-12
US20070204067A1 (en) 2007-08-30
EP1980111A2 (en) 2008-10-15
US20080037624A1 (en) 2008-02-14
JP2009525706A (en) 2009-07-09
WO2007090177A3 (en) 2007-10-18
WO2007090177A2 (en) 2007-08-09

Similar Documents

Publication Publication Date Title
US8792555B2 (en) Methods and systems for resizing multimedia content
US8885470B2 (en) Methods and systems for resizing multimedia content based on quality and rate information
US10123015B2 (en) Macroblock-level adaptive quantization in quality-aware video optimization
US20070201388A1 (en) Methods and systems for resizing multimedia content based on quality and rate information
KR101104654B1 (en) Methods and systems for quality controlled encoding
JP2008533844A (en) Situation adaptive bandwidth adjustment in video rate control
KR101583896B1 (en) Video coding
Singh et al. Optimising QoE for scalable video multicast over WLAN
CN112004084B (en) Code rate control optimization method and system by utilizing quantization parameter sequencing
Ozcelebi et al. Minimum delay content adaptive video streaming over variable bitrate channels with a novel stream switching solution
CN112004083B (en) Method and system for optimizing code rate control by utilizing inter-frame prediction characteristics
Kobayashi et al. A real-time 4K HEVC multi-channel encoding system with content-aware bitrate control
JP2017011574A (en) Moving image data distribution management apparatus, moving image data distribution management method, and program

Legal Events

Date Code Title Description
AS Assignment

Owner name: QUALCOMM INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WALKER, GORDON KENT;RAVEENDRAN, VIJAYALAKSHMI R.;GUPTA, BINITA;AND OTHERS;REEL/FRAME:020039/0762;SIGNING DATES FROM 20071005 TO 20071027

Owner name: QUALCOMM INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WALKER, GORDON KENT;RAVEENDRAN, VIJAYALAKSHMI R.;GUPTA, BINITA;AND OTHERS;SIGNING DATES FROM 20071005 TO 20071027;REEL/FRAME:020039/0762

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.)

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20180729