US20130254802A1 - Selection of advertisements for placement with content - Google Patents

Info

Publication number
US20130254802A1
Authority
US
United States
Prior art keywords
content
content item
video
candidate
sponsored content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/619,961
Inventor
Reuven Lax
Poorva Arankalle
Shamim Samadi
Rajas Moonka
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google LLC filed Critical Google LLC
Priority to US13/619,961
Assigned to GOOGLE INC. (assignment of assignors interest). Assignors: ARANKALLE, POORVA; LAX, REUVEN; MOONKA, RAJAS; SAMADI, SHAMIM
Publication of US20130254802A1
Status: Abandoned

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising

Definitions

  • Online video is a growing medium. The popularity of online video services reflects this growth. Advertisers see online video as another way to reach their customers. Many advertisers are interested in maximizing the number of actions (e.g., impressions and/or click-throughs) for their advertisements. To achieve this, advertisers make efforts to place advertisements with content that is relevant to their advertisements. For example, an advertiser can target its car advertisements to a website about cars.
  • one aspect of the subject matter described in this specification can be embodied in methods that include the actions of receiving a request for sponsored content for presentation with a content item, where the content item is included in a document; identifying one or more candidate sponsored content items based on a plurality of criteria, which includes information related to the content item independent of the document and information related to the document; selecting one or more of the candidate sponsored content items; and transmitting the selected sponsored content items for presentation with the content item.
  • Other embodiments of this aspect include corresponding systems, apparatus, and computer program products.
  • another aspect of the subject matter described in this specification can be embodied in methods that include the actions of receiving a request for sponsored content for presentation with a content item, where the content item is included in a presentation environment; identifying one or more candidate sponsored content items based on a plurality of criteria, which includes information related to the content item and information related to the presentation environment; selecting one or more of the candidate sponsored content items; transmitting the selected sponsored content items for presentation with the content item in the presentation environment.
  • Other embodiments of this aspect include corresponding systems, apparatus, and computer program products.
  • Advertisements can be placed for presentation with third party video content.
  • the placed advertisements are selected for relevance to video content, content of a page in which the video is embedded, and other related content. Advertisers can automatically and dynamically target embedded video content that may change over time.
  • FIG. 1 is a block diagram illustrating an example network environment.
  • FIG. 2 is a block diagram illustrating an example advertising delivery system.
  • FIGS. 3-4 are examples of a user interface illustrating advertising content displayed on a screen with video content.
  • FIG. 5 is a flow diagram illustrating an example process for selecting and delivering advertisements.
  • FIG. 6 is a block diagram illustrating an example generic computer and an example generic mobile computer device.
  • FIG. 1 shows an example of a network environment 100 .
  • the environment 100 includes a sponsored content (e.g., advertisement) provider 102 , a content sponsor (e.g., provider) 104 , and one or more user devices 106 , at least some of which communicate across network 108 .
  • discussion provided herein will make reference to the delivery of advertisements to a content provider. Other forms of sponsored content are possible.
  • the advertisement provider 102 can provide advertising content (e.g., advertisements or “ads” or “ad” in the singular) for presentation with content items (e.g., text, images, video, audio, games, multimedia playlists (such as static, dynamic or executable lists of multiple content items to be played), embedded executables or software) provided by the content provider 104 .
  • a video can be provided by the content provider 104 through the network 108 to one or more user devices 106 .
  • the ad content can be distributed, through network 108 , to one or more user devices 106 before, during, or after presentation of the video.
  • advertisement provider 102 is coupled with one or more advertising repositories (not shown).
  • the repositories store advertising that can be presented with various types of content items, including video and audio.
  • the environment 100 can be used to identify relevant advertising content based on the content of the content item and perhaps other criteria.
  • the advertisement provider 102 can acquire information about the subject matter of a video, such as by reading video metadata that includes keywords that describe the subject matter of the video, and/or analyzing speech in the video, and/or analyzing the visual content of the video, for example.
  • the information can be used to identify relevant advertisements, from which one or more are selected for presentation with the video.
  • content items can include various forms of electronic media.
  • a content item can include text, audio, video, advertisements, configuration parameters, documents, video files published on the Internet, television programs, audio podcasts, video podcasts, live or recorded talk shows, video voicemail, segments of a video conversation, and other distributable resources.
  • a “video content item” is an item of content that includes content that can be perceived visually when played, rendered, or decoded.
  • a video includes video data, and optionally audio data, executable code, hyperlinks, and metadata.
  • Video data includes content in the video that can be perceived visually when the video content item is played, rendered, or decoded.
  • Audio data includes content in the video that can be perceived aurally when the video content item is played, decoded, or rendered.
  • Executable code includes, for example, interactive ADOBE FLASH, JavaScript, or other forms of interactive content.
  • Hyperlinks include, for example, links embedded in or associated with the video or executable code that provide an address link, such as a Uniform Resource Locator (URL), to other content or executable code.
  • a video can include video data and any accompanying audio data regardless of whether or not the video is ultimately stored on a tangible medium.
  • a video can include, for example, a live or recorded television program, a live or recorded theatrical or dramatic work, a music video, a televised event (e.g., a sports event, a political event, a news event, etc.), video voicemail, etc.
  • Video, audio or other content items can also be part of a media playlist.
  • a video content item can also include many types of associated data or metadata.
  • Metadata includes, for example, tags, labels, keywords, time stamps, XML-enclosed data, or other non-displayed information about the content.
  • types of associated data include video data, audio data, closed-caption or subtitle data, a transcript, content descriptions (e.g., title, actor list, genre information, first performance or release date, etc.), related still images, user-supplied or provider-provided tags and ratings, etc.
  • Some of this data, such as the description can refer to the entire video content item, while other data (e.g., the closed-caption data) may be temporally-based or timecoded.
  • the temporally-based data may be used to detect scene or content changes to determine relevant portions of that data for targeting ad content to users.
  • the executable code and/or metadata may include interactive playlists of media content, such as lists of video, audio, web links, or other content types.
  • a video content item has one or more interstitial advertisement slots.
  • One or more video advertisements can be presented in between the portions of the video separated by an advertisement slot, similar to television advertising commercials that are presented between portions of a television program.
  • the positions of the advertisement slots can be specified by metadata associated with the video and stored at the video provider 209 .
  • the positions of the slots can be manually specified by the author of the video or automatically determined based on an analysis of the video.
  • An example technique for analyzing a video to determine positions of advertisement slots is disclosed in U.S. patent application Ser. No. 11/737,038, titled “Characterizing Content for Identification of Advertising,” filed Apr. 18, 2007, which is incorporated by reference in its entirety herein. Further details related to advertisement slots are disclosed in U.S. patent application Ser. No. 11/550,388, titled “Using Viewing Signals in Targeted Video Advertising,” filed Oct. 17, 2006, which is incorporated by reference in its entirety herein.
  • an “audio content item” is an item of content that can be perceived aurally when played, rendered, or decoded.
  • An audio content item includes audio data and optionally metadata.
  • the audio data includes content in the audio content item that can be perceived aurally when the audio content item is played, decoded, or rendered.
  • An audio content item may include audio data regardless of whether or not the audio content item is ultimately stored on a tangible medium.
  • An audio content item may include, for example, a live or recorded radio program, a live or recorded theatrical or dramatic work, a musical performance, a sound recording, a televised event (e.g., a sports event, a political event, a news event, etc.), voicemail, etc.
  • Each of different forms or formats of the audio data (e.g., original, compressed, packetized, streamed, etc.) may be considered to be an audio content item (e.g., the same audio content item, or different audio content items).
  • Advertising content can include text, graphics, video, audio, banners, links, executable code and scripts, and other web or television programming related data.
  • ad content can be formatted differently, based on whether it is primarily directed to websites, media players, email, television programs, closed captioning, etc.
  • ad content directed to a website may be formatted for display in a frame within a web browser.
  • ad content directed to a video player may be presented “in-stream” as a video content item is played in the video player.
  • in-stream ad content may replace the video content item in a video player for some period of time or be inserted between portions of the video content item.
  • An in-stream ad can be pre-roll (before the video content item), post-roll (after the video content item), or interstitial.
  • An in-stream ad may include video, audio, text, animated images, still images, or some combination thereof.
  • the advertisement can appear in the same form as video content, as an overlay over video content, or in other forms. Examples of forms of advertisements for video that can be used with the implementations described in this specification are disclosed in U.S. Provisional Application No. 60/915,654, entitled “User Interfaces For Web-Based Video Player,” filed May 2, 2007; and U.S. patent application Ser. No. 11/760,709, entitled “Systems and Processes for Presenting Informational Content,” filed Jun. 8, 2007, which are incorporated by reference in their entirety herein.
  • the content provider 104 can present content to a user device 106 through the network 108 .
  • the content providers 104 are web servers where the content includes webpages or other content written in the Hypertext Markup Language (HTML), or any language suitable for authoring webpages.
  • content provider 104 can include users, web publishers, and other entities capable of distributing content over a network.
  • the content provider 104 may make the content accessible through a known URL.
  • the content provider 104 can receive requests for content (e.g., articles, discussion threads, music, audio, video, graphics, search results, webpage listings, etc.). The content provider 104 can retrieve the requested content in response to the request or service the request in some other way.
  • the advertisement provider 102 can broadcast content as well (e.g., not necessarily responsive to a request).
  • Content provided by content provider 104 can include news, weather, entertainment, or other consumable textual, audio, video, game, or multimedia content. More particularly, the content can include various resources, such as documents (e.g., webpages, plain text documents, dynamic network applications provided to the user on-the-fly, Portable Document Format (PDF) documents, images), video or audio clips, etc.
  • the content can be graphic-intensive, media-rich data, such as, for example, FLASH-based content that presents video and audio, Asynchronous JavaScript and XML (AJAX) based web applications or web pages, and the like.
  • the content provider 104 can provide video content items for presentation to a user.
  • the content provider 104 can provide a video content item as a stream or as a downloadable file to a user device 106 .
  • the content provider 104 can also provide a video player module with a video.
  • a content item (e.g., a video content item, an audio content item) and a player module from the content provider 104 are embedded into a document (e.g., a webpage provided by content provider 104 ).
  • the document includes the player module and a reference to the video or audio content item.
  • the document can be sent to a user device 106 .
  • the embedded player module can retrieve the referenced video or audio content item for playback at the user device 106 .
  • the player can provide one or more pieces of content, such as via a playlist.
  • a document and a content item embedded in the document are provided by different content providers.
  • a video from a first content provider is embedded in a document (e.g., a webpage) from a second content provider.
  • the document includes a location (e.g., a URL) of the video at the first content provider.
  • the video can be obtained from the first content provider and played back at the user device 106 .
  • the environment 100 includes one or more user devices 106 .
  • the user device 106 can include a desktop computer, laptop computer, a media player (e.g., an MP3 player, a streaming audio player, a streaming video player, a television, a computer, a mobile device, etc.), a mobile phone, a browser facility (e.g., a web browser application), an e-mail facility, telephony means, a set top box, a television device or other electronic device that can access advertisements and other content via network 108 .
  • the content provider 104 may allow user device 106 to access content (e.g., webpages, videos, etc.).
  • the network 108 facilitates wireless or wireline communication between the advertisement provider 102 , the content provider 104 , and any other local or remote computers (e.g., user device 106 ).
  • the network 108 may be all or a portion of an enterprise or secured network.
  • the network 108 may be a virtual private network (VPN) between the content provider 104 and the user device 106 across a wireline or a wireless link. While illustrated as a single or continuous network, the network 108 may be logically divided into various sub-nets or virtual networks without departing from the scope of this disclosure, so long as at least a portion of the network 108 may facilitate communications between the advertisement provider 102 , content provider 104 , and at least one client (e.g., user device 106 ). In certain implementations, the network 108 may be a secure network associated with the enterprise and certain local or remote clients 106 .
  • Examples of network 108 include a local area network (LAN), a wide area network (WAN), a wireless phone network, a Wi-Fi network, and the Internet.
  • the content provider 104 may transmit information about how, when, and/or where the ads are to be rendered, and/or information about the results of that rendering (e.g., ad spot, specified segment, position, selection or not, impression time, impression date, size, temporal length, volume, conversion or not, etc.) back to the advertisement provider 102 through the network 108 .
  • such information may be provided back to the advertisement provider 102 by some other means.
  • FIG. 2 is a block diagram illustrating an example advertising delivery system 200 .
  • System 200 includes, or is communicatively coupled with, advertisement provider 201 , content provider 203 , and user device 205 , at least some of which communicate across network 207 .
  • the advertisement delivery system 200 is an example implementation of the network environment 100 , where advertisement provider 201 is an implementation of advertisement provider 102 , content provider 203 is an implementation of content provider 104 , user device 205 is an implementation of user device 106 , and network 207 is an implementation of network 108 .
  • the advertisement provider 201 includes a content analyzer 202 , an ad selection module 204 , an ad server 206 , and a surrounding content module 208 .
  • the content analyzer 202 can analyze received content items (e.g., videos) to determine one or more targeting criteria for content items.
  • the content analyzer 202 may implement various analysis methods, including, but not limited to weighting schemes, speech processing, image or object recognition, and statistical methods.
  • Analysis methods can be applied to the contextual elements of the received content item (e.g., a video, an audio clip) to determine relevant targeting criteria.
  • the received content can undergo one or more of audio volume normalization, automatic speech recognition, transcoding, indexing, image recognition, etc.
  • the content analyzer 202 includes a speech to text module 210 and an image recognition module 212 . Other modules are possible.
  • the speech to text module 210 can analyze a video to identify speech in a video or audio file or stream.
  • a video content item may be received in the system 200 .
  • the speech-to-text module 210 can analyze the video content item as a whole.
  • Textual information may be derived from the speech included in the audio data of the video or audio content item by performing speech recognition on the audio content, producing in some implementations hypothesized words annotated with confidence scores, or in other implementations a lattice which contains many hypotheses.
  • Examples of speech recognition techniques include techniques based on hidden Markov models, dynamic programming, or neural networks.
  • the speech analysis may include identifying phonemes, converting the phonemes to text, interpreting the phonemes as words or word combinations, and providing a representation of the words, and/or word combinations, which best corresponds with the received input speech (e.g., speech in the audio data of a video content item).
  • the text can be further processed to determine the subject matter of the video or audio content item. For example, keyword spotting (e.g., word or utterance recognition), pattern recognition (e.g., defining noise ratios, sound lengths, etc.), or structural pattern recognition (e.g., syntactic patterns, grammar, graphical patterns, etc.) may be used to determine the subject matter, including different segments, of the video content item.
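  • As an illustrative sketch only (not part of the original disclosure), the keyword-spotting step described above can be pictured as matching hypothesized words, each annotated with a recognition confidence score, against a list of target keywords. The data layout and the confidence threshold below are assumptions chosen for illustration.

```python
# Illustrative sketch of keyword spotting over speech-recognition output.
# The (word, confidence) layout and the threshold value are assumptions,
# not the implementation specified in the disclosure.

def spot_keywords(hypotheses, target_keywords, min_confidence=0.6):
    """Return target keywords found among hypothesized words whose
    recognition confidence meets a minimum threshold."""
    targets = {k.lower() for k in target_keywords}
    found = {}
    for word, confidence in hypotheses:
        word = word.lower()
        if word in targets and confidence >= min_confidence:
            # Keep the best confidence seen for each spotted keyword.
            found[word] = max(confidence, found.get(word, 0.0))
    return found

# Example: hypothesized words from the audio track of a car-review video.
hypotheses = [("test", 0.91), ("drive", 0.88), ("sedan", 0.72), ("bass", 0.41)]
print(spot_keywords(hypotheses, ["sedan", "drive", "baseball"]))
# {'drive': 0.88, 'sedan': 0.72}
```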
  • further processing may be carried out on the video or audio content item to refine the identification of subject matter in the video or audio content item.
  • a video or audio content item can also include timecoded metadata.
  • timecoded metadata include closed-captions, subtitles, or transcript data that includes a textual representation of the speech or dialogue in the video or audio content item.
  • a caption data module (not shown) at the advertisement provider 201 extracts the textual representation from the closed-caption, subtitle, or transcript data of the content item and uses the extracted text to identify subject matter in the video or audio content item.
  • the extracted text can be a supplement to or a substitute for a speech recognition analysis on the video or audio content item.
  • Further processing of received content can also include image or object recognition.
  • automatic object recognition can be applied to received or acquired video data of a video content item to determine targeting criteria for one or more objects associated with the video content item.
  • the image recognition module 212 may automatically extract still frames from a video content item for analysis.
  • the analysis may identify targeting criteria relevant to objects identified by the analysis.
  • the analysis may also identify changes between sequential frames of the video content item that may be indicia of different scenes (e.g., fading to black).
  • object recognition techniques include appearance-based object recognition, and object recognition based on local features.
  • An example of object recognition is disclosed in U.S. patent application Ser. No. 11/608,219, entitled “Image Based Contextual Advertisement Method and Branded Barcodes,” filed Dec. 7, 2006, which is incorporated by reference in its entirety herein.
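  • The following sketch illustrates one simple way the frame-comparison idea described above might be realized: a mean pixel difference between sequential frames plus a fade-to-black check. It is an assumption-laden illustration, not the technique of the referenced application; the threshold values are arbitrary.

```python
# Illustrative sketch of detecting likely scene changes between sequential
# frames (e.g., a fade to black) using a simple mean-luminance and
# frame-difference heuristic. Threshold values are assumptions.
import numpy as np

def scene_change_indices(frames, diff_threshold=30.0, black_threshold=10.0):
    """frames: list of grayscale frames as 2-D numpy arrays (0-255).
    Returns indices i where frame i starts a new scene candidate."""
    changes = []
    for i in range(1, len(frames)):
        prev, cur = frames[i - 1].astype(float), frames[i].astype(float)
        mean_abs_diff = np.abs(cur - prev).mean()
        # A fade to black counts only on the transition into a dark frame.
        faded_to_black = cur.mean() < black_threshold and prev.mean() >= black_threshold
        if mean_abs_diff > diff_threshold or faded_to_black:
            changes.append(i)
    return changes

# Example with synthetic frames: bright scene, then an abrupt cut to black.
bright = np.full((4, 4), 200.0)
dark = np.full((4, 4), 5.0)
print(scene_change_indices([bright, bright, dark, dark]))  # [2]
```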
  • the surrounding content module 208 reads content that is in proximity to (e.g., surrounds) an embedded content item.
  • the surrounding content module 208 reads the content that surrounds the embedded content item, i.e. the content of the document other than the video itself.
  • the surrounding content can be matched against advertisements to identify advertisements that have some subject matter relevance to the document, for example.
  • Advertisement provider 201 includes an ad selection module 204 .
  • the ad selection module 204 identifies, for a presentation of a content item, candidate advertisements and selects one or more of these candidate advertisements for presentation with the content item.
  • the ad selection module 204 can identify the candidate advertisements based on multiple criteria, including advertiser preferences, content provider preferences, the content of the video, and content surrounding or in proximity to the video. From these candidate advertisements, one or more are selected for presentation with the video.
  • advertisements are selected for presentation, in accordance with an auction. Further details on the identification of candidate advertisements and selection of advertisements are described below in reference to FIG. 5 .
  • Advertisement provider 201 includes an ad server 206 .
  • Ad server 206 may directly, or indirectly, enter, maintain, and track ad information.
  • the ads may be in the form of graphical ads such as so-called banner ads, text only ads, image ads, audio ads, video ads, ads combining one or more of such components, etc.
  • the ads may also include embedded information, such as a link, and/or machine executable instructions.
  • metadata can be associated with an advertisement.
  • the metadata can include one or more keywords indicating topics or concepts to which the advertisement is relevant or other information that indicates the subject matter of the advertisement.
  • the advertiser specifies one or more keywords that express the subject matter of the advertisement or one or more categories or verticals to which the advertisement is targeted.
  • Ad server 206 can receive requests for advertisements from a user device 205 . In response to these requests, the ad server 206 can transmit selected advertisements to the user device 205 . Further, the ad server 206 can receive usage information (e.g., advertisement click-through information) from the user device 205 . Although not shown, other entities may provide usage information (e.g., whether or not a conversion or selection related to the ad occurred) to the ad server 206 . For example, this usage information may include measured or observed user behavior related to ads that have been served. In some implementations, usage information can be measured in a privacy-preserving manner such that individually user identifiable information is filtered by the ad server, by the user device, or through an intermediary device.
  • the ad server 206 may include information concerning accounts, campaigns, creatives, targeting, advertiser preferences for ad placement, etc.
  • the term “account” relates to information for a given advertiser (e.g., a unique email address, a password, billing information, etc.).
  • a “campaign,” “advertising campaign,” or “ad campaign” refers to one or more groups of one or more advertisements, and may include a start date, an end date, budget information, targeting information, syndication information, etc.
  • the advertisement provider 201 can use one or more advertisement repositories 214 for selecting ads for presentation to a user.
  • the repositories 214 may include any memory or database module and may take the form of volatile or non-volatile memory including, without limitation, magnetic media, optical media, random access memory (RAM), read-only memory (ROM), removable media, or any other suitable local or remote memory component.
  • the content provider 203 includes a content server 215 .
  • the content server 215 can serve various types of content, including documents, video content items, and audio content items, for example.
  • the content server 215 can serve documents that have other content (e.g., video or audio content items from the content provider 203 or another content provider) embedded within.
  • the content that can be served by the content server 215 is stored in a content repository.
  • different types of content can be stored in separate repositories.
  • documents can be stored in a document repository 217
  • video content items and metadata associated with the video content items can be stored in a video content repository 218 .
  • video metadata is written using the Extensible Markup Language (XML).
  • the content server 215 serves video content items, such as video streams or video files, for example. Video player applications may be used to render video streams or files. Ads may be served by the ad server 206 in association with video content items. For example, one or more ads may be served before, during, or after a music video, program, program segment, etc. Alternatively, one or more ads may be served in association with a music video, program, program segment, etc.
  • the videos that can be served by the content server 215 are stored in a videos and video metadata repository 218 .
  • the content provider 203 also provides access to documents (e.g., a webpage) that include information (e.g., subject matter and categories of the videos) about the videos in the video repository 218 .
  • FIG. 2 is illustrated as having one content provider providing both documents and content items (e.g., video content items), in some implementations, separate content providers, with different content servers, provide documents and content items, respectively.
  • a first content provider can provide documents retrieved from a document repository, where the documents include references to content items (e.g., video or audio content items) stored in a content item repository and served by a second content server.
  • the advertisement provider 201 and the content provider 203 can provide content to a user device 205 .
  • the user device 205 is an example of an ad consumer.
  • the user device 205 may include a user device such as a media player (e.g., an MP3 player, a streaming audio player, a streaming video player, a television, a computer, a mobile device, etc.), a browser facility, an e-mail facility, telephony means, etc.
  • the user device 205 includes an application 220 for rendering and presenting content and advertisements.
  • the application is a web browser that can render and display documents (e.g., webpages) and other content.
  • one or more plug-ins can be associated with the application. The plug-ins can facilitate rendering and presentation of content (e.g., video or audio content items) or advertisements by the application 220 .
  • the user device 205 includes a content player module 222 and an advertising module 224 .
  • the content player module 222 can play back content items, such as video or audio content items, for example.
  • the content player module 222 is embedded in a document (e.g., a webpage), and received by the user device 205 when the document is received by the user device 205 .
  • the content player module 222 is executed by the application 220 or a plug-in to the application 220 .
  • the advertising module 224 sends requests for advertisements to the advertising provider 201 and receives advertisements responsive to the requests from the advertising provider 201 .
  • the advertising module 224 can provide the advertisements to the application 220 or the content player module 222 for presentation to the user.
  • the advertising module 224 is a sub-module of the content player module 222 .
  • FIG. 3 is an example user interface 300 illustrating video content displayed on a screen with surrounding content.
  • the user interface 300 illustrates an example web browser user interface.
  • the content shown in the user interface 300 can be presented in a webpage, an MP3 player, a streaming audio player, a streaming video player, a television, a computer, a mobile device, etc.
  • the content shown in the user interface 300 may be provided by advertisement provider 201 , content provider 203 , another networked device, or some combination of those providers.
  • the user interface 300 includes a content player region 302 and one or more “surrounding content” regions 304 A, 304 B, and 304 C.
  • the content player region 302 may include a media player for presenting text, images, video, or audio, or any combination thereof. An example of what can be shown in the content player region 302 is described in further detail below in relation to FIG. 4 .
  • the surrounding content regions 304 A, 304 B, and 304 C can display text, graphics, links, third party add-ins (e.g., search controls, download buttons, etc.), video and audio clips (e.g., graphics), help instructions (e.g., text, html, pop-up controls, etc.), and advertisements (e.g., banner ads, flash-based video/audio ads, scrolling ads, etc.).
  • the surrounding content regions 304 A, 304 B, and 304 C are portions of a webpage in which the content player is embedded.
  • the surrounding content may be related to the content displayed in the content player region 302 .
  • advertisements related to the content in the content player region 302 can be displayed in any of the surrounding content regions 304 A, 304 B, 304 C.
  • the content in the content player region 302 is chosen for display with the surrounding content because they are related in subject matter (e.g., a video and a video player embedded with a posting in a blog).
  • the surrounding content is not related to the content in the content player region 302 .
  • the surrounding content regions 304 A, 304 B, and 304 C may be in proximity to the content player region 302 during the presentation of video content in the region 302 .
  • the surrounding content regions 304 A, 304 B, and 304 C can be adjacent to the content player region 302 , either above, below, or to the side of the content player region 302 .
  • the user interface 300 may include an add-on, such as a stock ticker with text advertisements. The stock ticker can be presented in one of the surrounding content regions 304 A, 304 B, or 304 C.
  • FIG. 4 illustrates an example user interface that can be displayed in a video player, such as in content player region 302 .
  • Content items such as video, audio, and so forth can be displayed in the content player region 302 .
  • the region 302 includes a content display portion 402 for displaying a content item, a portion 404 for displaying information (e.g., title, running time, etc.) about the content item, player controls 405 (e.g., volume adjustment, full-screen mode, play/pause button, progress bar and slider, option menu, etc.), an advertisement display portion 408 , and a multi-purpose portion 406 that can be used to display various content (e.g., advertisements, closed-captions/subtitles/transcript of the content item, related links, etc.).
  • the content shown represents a video (or audio) interview occurring between a person located in New York City, New York and a person located in Los Angeles, Calif.
  • the interview is displayed in the content display portion 402 of the region 302 .
  • the content in the region 302 may be presented as a stream, upon visiting a particular site presenting the interview, or after execution of a downloaded file containing the interview or a link to the interview.
  • the region 302 may display additional content (e.g., advertisement content) that relates to the content shown in the video interview.
  • the additional content may change according to what is displayed in the region 302 .
  • the additional content can be substantially available as content from the content provider 203 and/or the advertisement provider 201 .
  • an on-screen advertisement is displayed in the multi-purpose portion 406 .
  • An additional on-screen advertisement is displayed in the advertisement display portion 408 .
  • on-screen advertisements may include text-and-audio, video, text, animated images, still images, or some combination thereof.
  • the content display portion 402 can display advertisements targeted to audio-only content, such as ads capable of being displayed in-stream with a podcast or web monitored radio broadcasts.
  • the advertisement provider 201 may provide interstitial advertisements, sound bites, or news information in the audio stream of music or disc jockey conversations.
  • Advertisements may be presented on the content display portion 402 .
  • Temporal placement of advertisements relative to a video content item may vary.
  • an advertisement presentation may be pre-roll, mid-roll or post-roll placement.
  • the progress bar in the player controls 405 also shows the positions of the advertisement slots in the content item being played.
  • the multi-purpose portion 406 may also include a skip advertisement link or control 410 .
  • if the skip advertisement link 410 is selected by the user, the currently displayed video advertisement is skipped and playback continues from the first frame of the video after the skipped video advertisement (or, playback stops if the skipped video advertisement is a post-roll advertisement).
  • the skip advertisement link or control 410 is a link.
  • the skip advertisement link or control 410 may be a button, selectable icon, or some other user-selectable user interface object.
  • the ability of a user to skip advertisements may affect the selection of an advertisement to be presented by the advertisement provider 201 .
  • a large number of skips for an advertisement may indicate that the advertisement is ineffective or unpopular, and thus can be made less likely to be selected for presentation (e.g., by decreasing its bid in a placement auction or being counted as a negative weighting factor in auctions or towards advertising impressions).
  • advertisements that are infrequently skipped may have a positive weighting factor and/or have a positive impact toward advertising impressions.
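  • As a hedged illustration of the skip-based weighting described above, the sketch below converts an advertisement's skip history into a multiplicative weighting factor. The formula and the neutral skip rate are assumptions, not values from the disclosure.

```python
# Illustrative sketch of turning an advertisement's skip history into a
# bid/weighting adjustment. The adjustment formula and the neutral skip
# rate of 0.5 are assumptions for illustration only.

def skip_rate_factor(times_skipped, times_served, neutral_rate=0.5):
    """Return a multiplicative weighting factor: below 1.0 for frequently
    skipped ads, above 1.0 for rarely skipped ads."""
    if times_served == 0:
        return 1.0  # no history yet; leave the bid unchanged
    skip_rate = times_skipped / times_served
    # Linear adjustment around the assumed neutral rate.
    return max(0.1, 1.0 + (neutral_rate - skip_rate))

print(round(skip_rate_factor(80, 100), 2))  # 0.7 -> frequently skipped, weighted down
print(round(skip_rate_factor(10, 100), 2))  # 1.4 -> rarely skipped, weighted up
```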
  • FIG. 5 illustrates an example process 500 for selecting and delivering advertisements for presentation with a content item.
  • the process 500 will be described in reference to a system (e.g., system 200 ) that performs the process.
  • process 500 will be described with reference to a video content item (e.g., a video stream or file), but it should be appreciated that process 500 is also applicable to other content items, such as audio content items, as well.
  • a request for one or more advertisements is received ( 502 ).
  • a user device 205 can receive a document from a content provider 203 .
  • the document can include an embedded video content item (hereinafter referred to as the “video”), an embedded content player (e.g., content player module 222 ), and an advertising module (e.g., advertising module 224 ).
  • the advertising module 224 sends a request for advertisements to the advertisement provider 201 .
  • the request can be for one or more advertisements for presentation with the video.
  • the request can specify the type (e.g., computer, mobile phone, etc.) of the user device 205 that sent the request. For example, the request can specify that the user device is a desktop or notebook computer or a mobile phone.
  • the request can include metadata associated with the video.
  • the metadata can specify preferences on what and how advertisements are presented with the video.
  • the preferences can be set by the author of the video, the author of the document in which the video is embedded, or content provider 203 (hereinafter referred to collectively as “content provider preferences”).
  • the content provider preferences specify the available positions in which video advertisements can be presented.
  • the metadata can specify that the video has two interstitial positions, a pre-roll position, and a post-roll position for advertisements, and that advertisements may be placed in any of these positions.
  • the metadata can also include data that provides an indication of the content (e.g., the subject matter) of the video.
  • the metadata can include one or more keywords that specify topics or concepts to which the video is relevant.
  • the metadata can include one or more categories or verticals (hereinafter referred as “channels”) into which the video can be classified.
  • the metadata can also include information referencing one or more playlists or playlist categories on which the content is placed or linked.
  • the topical data or the channel data can be used by the advertisement provider 201 to identify advertisements that are relevant to the video. In some other implementations, the topical or channel information of the video can be determined by analyzing the video.
  • the metadata can specify other types of content provider preferences as well.
  • the metadata can specify whether video advertisements displayed with the video can be skipped or not.
  • the metadata can also specify one or more target demographics for the video.
  • the metadata can specify the time lengths of advertisement placement positions, the maximum allowable advertisement duration, and/or which types of advertisements (e.g., text ads, overlay ads, banner ads, pop-up ads, video ads, etc.) are allowed.
  • the preferences in the metadata include blacklisted keywords or advertisers. An advertisement associated with a blacklisted keyword or advertiser is automatically disqualified from consideration for placement with the video.
  • an advertiser in the blacklist is specified by a name and/or a URL associated with the advertiser.
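  • For illustration only, the request metadata and content provider preferences described above might be represented as a structure like the following; every field name and value here is an assumption rather than a format specified in the disclosure.

```python
# Illustrative representation of the kind of metadata an advertising module
# might attach to an ad request, per the preferences described above.
# All field names and values here are assumptions for illustration.
ad_request = {
    "device_type": "mobile_phone",            # or "desktop", "notebook", ...
    "video_metadata": {
        "keywords": ["cars", "test drive"],    # topical keywords for the video
        "channels": ["auto reviews"],          # categories/verticals ("channels")
        "ad_positions": ["pre-roll", "interstitial", "interstitial", "post-roll"],
        "allow_skippable_ads": True,
        "max_ad_duration_seconds": 30,
        "allowed_ad_types": ["video", "overlay", "text"],
        "target_demographics": ["18-34"],
        "blacklisted_keywords": ["tobacco"],
        "blacklisted_advertisers": ["example-competitor.com"],
    },
}
```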
  • One or more candidate advertisements are identified ( 504 ).
  • one or more advertisements from the advertisements repository 214 are identified by the ad selection module 204 to be candidates in an auction for placement with the video.
  • the candidate advertisements are identified based on multiple criteria.
  • a criterion for identification of candidate advertisements is advertiser preferences.
  • an advertiser can set preferences related to the presentation and targeting of the advertisement.
  • the preferences can include preferences related to presentation position (e.g., pre-roll, post-roll, interstitial), capability of being skipped by the user, target demographics, type of device, and so forth.
  • the advertiser preferences are matched against the content provider preferences.
  • the advertiser can specify a preference as to whether a video advertisement can be presented pre-roll, post-roll, or in an interstitial slot.
  • the advertiser can specify a preference as to whether a user viewing a video advertisement can be given the capability to skip it.
  • the advertiser can specify a preference as to the target demographics to which the advertisement may be shown.
  • the advertiser can specify that the advertisement is best viewed on a desktop/notebook computer or a mobile phone. Other types of preferences are possible.
  • the content provider preferences may include a maximum allowable time length per video ad. A video ad may be disqualified if its length is longer than the maximum time length. As another example, the content provider preferences may include restrictions on which types of ads are allowed. If the ad does not belong to one of the allowed types, the ad can be disqualified.
  • an advertiser preference can be an absolute preference or a relative preference. If a preference is an absolute preference and it is not satisfied by the video, then the advertisement is disqualified from the auction, regardless of satisfaction of other preferences and relevance of the advertisement to the video or the document. If the preference is a relative preference and it is not completely satisfied, then the advertisement may still be eligible for the auction, subject to other criteria, but the incomplete satisfaction of the preference can be used to modify a bid associated with the advertisement in the auction.
  • the advertiser preferences can specify a bid modifier for a preference, in case that preference is not satisfied completely.
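  • The distinction between absolute and relative preferences can be sketched as follows; the data layout, the example modifiers, and the helper name are assumptions for illustration.

```python
# Illustrative sketch of applying advertiser preferences: an unsatisfied
# absolute preference disqualifies the ad, while an unsatisfied relative
# preference only applies a bid modifier. The data layout is an assumption.

def apply_preferences(base_bid, preferences, video_attributes):
    """preferences: list of (attribute, required_value, is_absolute, modifier).
    Returns the adjusted bid, or None if the ad is disqualified."""
    bid = base_bid
    for attribute, required_value, is_absolute, modifier in preferences:
        satisfied = video_attributes.get(attribute) == required_value
        if satisfied:
            continue
        if is_absolute:
            return None          # disqualified regardless of other criteria
        bid += modifier          # relative preference: modify the bid instead
    return bid

video = {"position": "interstitial", "skippable": True}
prefs = [
    ("position", "pre-roll", False, -0.50),   # relative: -$0.50 if not pre-roll
    ("skippable", False, False, -0.50),       # relative: -$0.50 if skippable
]
print(apply_preferences(2.00, prefs, video))  # 1.0
```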
  • the preferences can include a channel to which the advertisement is targeted.
  • An advertiser can target an advertisement to a particular channel. Examples of channels include “news videos” or “action movies.”
  • an advertisement targeted to a channel is identified as a candidate advertisement for auctions for placement with videos in that channel.
  • an advertisement targeted to a channel can be disregarded with respect to a particular video in the channel if other advertiser preferences for the advertisement are not satisfied by the video. For example, an advertisement can be disregarded if the advertiser specified that the advertisement be presented only as a pre-roll ad but a video only allows interstitial and post-roll advertisements.
  • Yet another criterion for identifying candidate advertisements is the content (e.g., subject matter) of the video.
  • the video can be associated with metadata specifying one or more topics or concepts.
  • the metadata specifying the topics or concepts can be included in a document provided by the content provider (e.g., a webpage) that contains information about the video.
  • the video can be retrieved by the advertisement provider 201 from the content provider 203 and analyzed by the content analyzer 202 .
  • One or more topics or concepts to which the video is relevant can be determined from the analysis.
  • the content of the advertisement is compared against the topics or concepts of the video.
  • the topics or concepts of the video are compared to the topics and concepts of the advertisement.
  • topical keywords of the video are compared to the topical keywords of the advertisement using, for example, keyword analysis, and a score can be determined from the comparison. If the score exceeds a threshold, for example, the advertisement can be identified as a candidate for the auction, pending satisfaction of other criteria, if any. Topics can also be determined from image, audio or video analysis.
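  • A minimal sketch of the keyword comparison and threshold test described above is shown below; the Jaccard-style overlap score and the threshold value are assumptions, not the scoring method specified in the disclosure.

```python
# Illustrative sketch: score an advertisement against a video by keyword
# overlap and admit it as a candidate if the score exceeds a threshold.
# The scoring formula and the threshold are assumptions.

def keyword_match_score(video_keywords, ad_keywords):
    video_set = {k.lower() for k in video_keywords}
    ad_set = {k.lower() for k in ad_keywords}
    if not video_set or not ad_set:
        return 0.0
    # Jaccard-style overlap between the two keyword sets.
    return len(video_set & ad_set) / len(video_set | ad_set)

def is_candidate(video_keywords, ad_keywords, threshold=0.2):
    return keyword_match_score(video_keywords, ad_keywords) >= threshold

print(is_candidate(["cars", "racing", "engines"], ["racing", "tires"]))  # True
```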
  • the topical keywords for a video can be determined using search engine data.
  • data related to interactions with a video by users who were directed to the video by a search query (e.g., the video was a result for the search query) can be an indication of whether the search query is a good topical keyword for the video.
  • Interactions with the video can include watching the video in its entirety, watching the video repeatedly, fast-forwarding or advancing the video, or skipping the video.
  • a search engine can collect the query and interaction data anonymously and the advertisement provider 201 can analyze the data to determine if a search query is a good keyword for the video based on the interactions of the users.
  • if users directed to the video by a search query tend to skip the video or fast-forward through it, the search query can be determined to be a poor keyword for the video.
  • if users directed to the video by a search query tend to watch the video in its entirety or repeatedly, the search query can be determined to be a good keyword for the video.
  • the good keywords are associated with the video and can be compared to the keywords for the advertisements.
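  • The following sketch illustrates, under assumed scoring weights, how anonymous interaction data might be turned into a good/poor keyword judgment as described above; the event names and weights are hypothetical.

```python
# Illustrative sketch of judging whether a search query is a good topical
# keyword for a video based on anonymous interaction data. The scoring
# weights and cutoff are assumptions.

def is_good_keyword(interactions, cutoff=0.0):
    """interactions: list of events for users who reached the video from the
    query, e.g. 'watched_fully', 'rewatched', 'fast_forwarded', 'skipped'."""
    weights = {"watched_fully": 1.0, "rewatched": 1.5,
               "fast_forwarded": -0.5, "skipped": -1.0}
    score = sum(weights.get(event, 0.0) for event in interactions)
    return score > cutoff

print(is_good_keyword(["watched_fully", "rewatched", "skipped"]))   # True
print(is_good_keyword(["skipped", "skipped", "fast_forwarded"]))    # False
```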
  • an advertisement can target specific channels.
  • a video can be analyzed to determine to which channel the video can be classified, or the channels to which the video can be classified can be specified in the video metadata.
  • videos can be classified into channels based on the analysis of the videos, if no channels were pre-specified. If the targeted channels of the advertisement match any of the channels to which the video is classified, the advertisement is identified as a candidate for the auction, subject to satisfaction of other preferences (e.g., satisfaction of content provider preferences regarding placement position).
  • advertisements can target categories of playlists or individual playlists based on such categorization.
  • a further criterion for identifying candidate advertisements is the surrounding content around the embedded video.
  • the surrounding content can include the content of the document in which the video is embedded.
  • the surrounding content can also include the user profile of the owner of the document or the user viewing the video. For example, if a video is embedded in a person's page on a social networking site, the user profile of the person owning the page and/or the user profile of the user who is viewing the page and the video are surrounding content.
  • the surrounding content module 208 can compare the surrounding content against the topics and concepts of the advertisement using, for example, keyword analysis, and a score can be determined from the comparison. If the score exceeds a threshold, for example, the advertisement can be identified as a candidate for the auction, pending satisfaction of other criteria, if any.
  • a “user profile” of a page owner or of a viewer of the video is demographic information associated with the respective user, such as age, gender, education, etc.
  • Demographic information of a user can be extracted from the user's profile on a social networking site, for example.
  • the demographic information can also be obtained from third parties. The demographic information can be obtained separate from any personally identifiable information.
  • a further criterion for identifying candidate advertisements is content of a page that links to the video.
  • Hyperlinks to a video can be identified by a web crawler system.
  • Content in the pages that contain the hyperlinks can be extracted.
  • the anchor text of a hyperlink and text content surrounding the hyperlink can be extracted.
  • the extracted text can be compared to the topics and concepts of the advertisements, similar to the comparison of the surrounding content and the advertisement described above.
  • the content from the linking page can be weighted higher if the linking page is a high-quality page based on inbound and outbound links, for example.
  • the page linking to the video or the page in which the video is embedded can be analyzed to determine if the content in the page is related specifically to the video or includes content related to other videos as well.
  • the page can be parsed in a structured way to identify components such as titles, particular blog posts, and so on.
  • content related to other videos can be weighted less or disregarded.
  • interaction data (e.g., skipping, fast-forwarding, watching multiple times, etc.) can be used to determine which hyperlinks and content related to the hyperlinks are more relevant to the video for use in advertisement selection.
  • another criterion is a text transcript of speech in the video generated by a speech recognition process applied to the video.
  • the transcript can be a source of keywords for the video; keyword analysis can be applied to the transcript to identify keywords for the video.
  • the speech recognition process can also determine per-term confidence levels for terms in the transcript, where a term can include words, phrases, numbers, characters, etc.
  • the per-term confidence levels measure a confidence of the speech recognition process that the corresponding terms were recognized correctly.
  • the confidence levels can be used as weights for terms in the transcript. For example, for two terms in the transcript of approximately equal relevance to the video, the term with the higher confidence level is given more weight, and thus more influence, when matching advertisements to terms in the transcript.
  • An example of using speech recognition, including confidence levels for terms, to identify advertisements relevant to content is disclosed in U.S. patent application Ser. No. 11/241,834, titled “Using Speech Recognition to Determine Advertisements Relevant to Audio Content and/or Audio Content Relevant to Advertisements,” filed Sep. 30, 2005, which is incorporated by reference herein in its entirety.
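  • As an illustration of the confidence weighting described above (and not the method of the referenced application), the sketch below weights each matching transcript term by its recognition confidence; the term relevance values are assumptions.

```python
# Illustrative sketch of weighting transcript terms by the per-term
# confidence levels reported by the speech recognizer, so that terms of
# similar relevance contribute in proportion to how reliably they were
# recognized. The example values are assumptions.

def weighted_term_score(transcript_terms, ad_keywords):
    """transcript_terms: list of (term, confidence). Returns a total match
    score in which each matching term is weighted by its confidence."""
    ad_set = {k.lower() for k in ad_keywords}
    return sum(conf for term, conf in transcript_terms if term.lower() in ad_set)

terms = [("convertible", 0.95), ("convertible", 0.40), ("horsepower", 0.80)]
print(round(weighted_term_score(terms, ["convertible", "horsepower"]), 2))  # 2.15
```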
  • the transcript can also be used to verify any publisher-provided metadata regarding the subject matter of the video.
  • the publisher of the video can declare in the video metadata that the video is about cars. If the transcript indicates that the video is about baseball (e.g., because the transcript has no mention of “cars”), the video and the publisher can be flagged for further manual review. If the manual review confirms that the video metadata does not accurately describe the subject matter of the video, the publisher can be penalized (e.g., banned from receiving advertisements from the advertisement provider 201 , or subjected to lesser penalties).
  • the comparison with the surrounding content is omitted if the advertisement targets particular channels of videos.
  • both the content of the video and the surrounding content can affect the identification of candidate advertisements.
  • Consider, for example, a video about cars. If the video is embedded in a webpage about racing, then advertisements for racing cars or sports cars, for example, can be identified as candidate advertisements. If the video is embedded in a webpage about parenting, then advertisements for family friendly cars (e.g., minivans) or child safety seats, for example, can be identified as candidate advertisements. Advertisements for racing cars or sports cars would be less likely to be identified as candidate advertisements in the second scenario because of the difference in the surrounding content (webpage about parenting vs. webpage about racing). In some implementations, content that is not likely to be relevant to the content of a webpage can be excluded from the targeting.
  • an advertisement is compared against both the content of the video and of the surrounding content, and the comparisons are combined in a weighted manner.
  • the comparison between the topics of the advertisement and the topics of the video, as determined from an analysis of the video, can yield one score, while the comparison between the topics of the advertisement and the content of the document in which the video is embedded yields another score.
  • comparison to the user profile of the document owner and/or to the user profile of a user viewing the document can yield respective scores.
  • Weights can be specified for each of these types of comparisons.
  • the weights can be specified by the author or provider of the video or the owner or provider of the document in which the video is embedded.
  • the weights can be specified per provider and/or per content item.
  • the weights can serve as an indication of which comparison should play a larger role in affecting which advertisements are candidates.
  • the weights can be adjusted over time to optimize the weightings. For example, weights can be determined from experimentation and empirical analysis of performance of advertisements placed using different weights. As another example, a machine learning process can be used to analyze the advertisement performance and adjust weights without manual intervention.
  • the scores from the comparisons to the content of the video and of the surrounding content can be combined into a weighted score for the advertisement using, for example, a linear combination.
  • a weighted score S can be determined for an advertisement. If the weighted score exceeds a specified threshold, then the advertisement is identified as a candidate advertisement for the auction. Otherwise, the advertisement is disregarded or disqualified.
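  • As a concrete illustration of the weighted combination described above, the Python sketch below combines a video-content comparison score A and a document-content comparison score B with weights and a threshold; the particular weights, scores, and threshold are assumptions, not values from the specification.

```
# Hypothetical sketch of the weighted score S = alpha*A + beta*B described above.
# The weights, comparison scores, and threshold are illustrative values only.

def weighted_score(video_score, document_score, alpha=0.7, beta=0.3):
    # Linear combination of the video-content and document-content comparisons.
    return alpha * video_score + beta * document_score

THRESHOLD = 0.5  # assumed cutoff for qualifying as a candidate advertisement

ads = {"racing_tires_ad": (0.9, 0.8), "cooking_ad": (0.1, 0.2)}
candidates = [name for name, (a, b) in ads.items()
              if weighted_score(a, b) >= THRESHOLD]
print(candidates)  # ['racing_tires_ad']
```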
  • One or more of the candidate advertisements are selected ( 506 ).
  • the identified candidate advertisements are placed into an auction for presentation with the video.
  • Advertisers can enter bids using the ad server 206 , for example.
  • the bids can be cost-per-click (CPC) bids or cost-per-thousand impression (CPM) bids.
  • the CPM bid of an advertisement can be converted to an estimated CPC bid by multiplying the CPM bid by a click-through rate of the advertisement.
  • the advertisements in the auction are then ordered based on their actual or estimated CPC bids.
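  • The following Python sketch is a toy illustration of ordering auction entries by actual or estimated CPC bids, using the CPM-to-CPC conversion described above; the bids and click-through rates are invented for the example.

```
# Illustrative ordering of ads in an auction by actual or estimated CPC bid.
# CPM bids are converted to estimated CPC bids as described above (CPM bid
# multiplied by the ad's click-through rate); all numbers are made up.

def estimated_cpc(bid, bid_type, click_through_rate):
    if bid_type == "CPC":
        return bid
    return bid * click_through_rate  # CPM bid scaled by CTR, per the description above

auction = [
    {"ad": "ad_1", "bid": 0.40, "type": "CPC", "ctr": 0.05},
    {"ad": "ad_2", "bid": 6.00, "type": "CPM", "ctr": 0.04},
]
auction.sort(key=lambda a: estimated_cpc(a["bid"], a["type"], a["ctr"]), reverse=True)
print([a["ad"] for a in auction])  # ['ad_1', 'ad_2'] with these toy numbers
```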
  • the advertiser can enter a base bid and any number of bid modifiers.
  • An example of a bid modifier is a decrease of $0.50 from the bid if the video allows skipping of advertisements.
  • Another example of a bid modifier is a decrease of $0.50 from the bid if the video does not allow pre-roll advertisements.
  • a bid of an advertisement can also be modified by the advertisement provider 201 based on the performance or quality of the advertisement over time.
  • quality measures of an advertisement can include a click-through rate, a conversion rate, a skip rate, the number of times the advertisement was viewed completely vs. the number of impressions, and so forth. Advertisements of low quality based on these measures can have their bids decreased. For example, if the advertisement has a high rate of being skipped by viewers and/or has a low conversion rate, the bid can be decreased.
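  • To make the bid-modifier and quality-adjustment ideas concrete, here is an assumed Python sketch: the modifier amounts mirror the examples above, while the quality-penalty rule and its magnitude are invented for illustration.

```
# Illustrative application of bid modifiers and a quality-based adjustment.
# The modifier amounts follow the examples in the text; the quality rule,
# its thresholds, and the 10% penalty are assumptions for this sketch.

def effective_bid(base_bid, video_allows_skipping, video_allows_preroll,
                  skip_rate, conversion_rate):
    bid = base_bid
    if video_allows_skipping:
        bid -= 0.50   # example modifier: the video allows skipping of advertisements
    if not video_allows_preroll:
        bid -= 0.50   # example modifier: the video does not allow pre-roll advertisements
    if skip_rate > 0.8 and conversion_rate < 0.01:
        bid *= 0.9    # assumed penalty for frequently skipped, low-converting ads
    return max(bid, 0.0)

print(effective_bid(2.00, video_allows_skipping=True, video_allows_preroll=True,
                    skip_rate=0.85, conversion_rate=0.005))  # 1.35 with these inputs
```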
  • the advertisements with the highest bids are selected by the ad selection module 204 to fill the available advertisement placement positions.
  • the three advertisements with the highest bids are selected for the positions.
  • the advertisements are placed into the positions in their bid order.
  • the advertisement with the highest bid is placed in the earliest position, the advertisement with the next highest bid is placed in the next position, and so forth.
  • the bid order determines a priority as to which of the selected advertisements will get a desired position. For example, if there are a pre-roll position, an interstitial slot, and a post-roll position, a highest bidding advertisement that prefers an interstitial slot will get the interstitial slot, while the next advertisement in the bid order can be placed in a pre-roll or a post-roll position even if the advertisement also prefers an interstitial slot.
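  • The following assumed Python sketch shows one way the bid-order priority over positions could work: each selected ad states a preferred position, and ads are assigned in bid order, falling back to whatever slot remains; the data and the fallback rule are illustrative.

```
# Illustrative assignment of selected ads to placement positions in bid order.
# Each ad may prefer a position; higher bidders get their preference first.
# The data and the simple fallback rule are assumptions for this sketch.

def assign_positions(selected_ads, positions):
    # selected_ads: list of (name, bid, preferred_position) tuples, any order
    # positions: available positions, e.g. pre-roll, interstitial, post-roll
    open_positions = list(positions)
    placement = {}
    for name, bid, preferred in sorted(selected_ads, key=lambda a: a[1], reverse=True):
        if preferred in open_positions:
            slot = preferred
        elif open_positions:
            slot = open_positions[0]  # fall back to the earliest remaining slot
        else:
            break
        open_positions.remove(slot)
        placement[name] = slot
    return placement

ads = [("ad_a", 1.20, "interstitial"), ("ad_b", 1.50, "interstitial"),
       ("ad_c", 0.90, "post-roll")]
print(assign_positions(ads, ["pre-roll", "interstitial", "post-roll"]))
# ad_b (highest bid) gets the interstitial slot; ad_a falls back to pre-roll.
```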
  • the selected advertisements are transmitted for presentation with the video ( 508 ).
  • the ad server 206 transmits the advertisements to the user device 205 .
  • the advertisements are received by the advertising module 224 and are placed into their respective positions.
  • the content player module 222 displays the advertisements at their respective positions. For example, the advertisements can be displayed in-stream with a video in the content display region 402 ( FIG. 4 ).
  • the user response to the advertisements and the video can be monitored. For example, data on click-throughs of the advertisement, viewing of the respective advertisements in their entirety (or not), viewing of the video in its entirety, and skipping of the advertisements can be collected by the advertisement provider 201 .
  • the advertisement provider can use this data to determine the performance of the advertisements, for example.
  • the performance data can be used to determine the amount of revenue to which the content provider is entitled.
  • An example of a revenue sharing scheme is disclosed in U.S.
  • the performance data can be used as a signal to adjust the weights of the scores of comparisons to the video and to the document or to adjust the bids of ads in an auction. For example, if the performance of an ad is good with respect to a particular video, the performance data can be used by the advertisement provider 201 to increase one or more weights with respect to the content provider of the video or with respect to the video. On the other hand, if the performance is poor, the weights can be decreased. As another example, the bid for a poor performing ad can be decreased, and the bid for a well performing ad can be increased.
  • the identification of candidate advertisements begins, after receiving the request for advertisements, with the determination of advertisements that are relevant to the video and to the document based on comparisons to the video content and to the document content, as described above.
  • An advertisement can be scored using a linear combination, for example, of the scores for the individual comparisons. For example, the score from the comparison to the video content can be weighted by a first weight to yield a first weighted score (e.g., α·A), and the comparison to the document can be weighted by a second weight to yield a second weighted score (e.g., β·B). The two weighted scores are added to yield the score for the advertisement. Ads with scores above a threshold are identified as candidates for further consideration.
  • the identified advertisements are then filtered by matching them against the content provider preferences. For example, if the content provider preferences specify a desire for interstitial ads only, and each ad can be at most 30 seconds long, then ads targeted for a pre-roll or post-roll position or that are longer than 30 seconds can be disqualified. As another example, if the content provider preferences specify a desire for video ads only, then non-video ads are disqualified.
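  • A minimal Python sketch of the preference filtering described above, assuming each ad and each set of content provider preferences is represented as a small dictionary; the field names are invented for the example.

```
# Illustrative filter of candidate ads against content provider preferences.
# Field names (position, duration, format) are assumptions for this sketch.

def passes_preferences(ad, prefs):
    if prefs.get("positions") and ad["position"] not in prefs["positions"]:
        return False
    if prefs.get("max_duration") and ad["duration"] > prefs["max_duration"]:
        return False
    if prefs.get("formats") and ad["format"] not in prefs["formats"]:
        return False
    return True

prefs = {"positions": {"interstitial"}, "max_duration": 30, "formats": {"video"}}
candidates = [
    {"name": "ad_1", "position": "interstitial", "duration": 20, "format": "video"},
    {"name": "ad_2", "position": "pre-roll", "duration": 15, "format": "video"},
    {"name": "ad_3", "position": "interstitial", "duration": 45, "format": "video"},
]
print([ad["name"] for ad in candidates if passes_preferences(ad, prefs)])  # ['ad_1']
```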
  • One or more of the non-disqualified ads are selected based on an auction, where each of the non-disqualified ads is associated with a bid.
  • the ads with the top bids are selected, subject to length restrictions. For example, if the auction is for one interstitial slot one minute long, and the ad with the highest bid is 40 seconds long, then the remaining 20 seconds in the slot can be filled by an ad of 20 seconds or shorter with the next highest bid, or the remainder can go unfilled if there is no ad of 20 seconds or shorter in the auction.
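  • To illustrate the length-constrained selection in the example above (a one-minute interstitial slot filled by a 40-second ad plus, if one is available, a shorter ad with the next highest bid), here is an assumed greedy Python sketch.

```
# Illustrative greedy fill of a fixed-length interstitial slot: take the highest
# bidder that still fits the remaining time, repeat until nothing fits.
# The data and the greedy policy are assumptions for this sketch.

def fill_slot(ads, slot_seconds):
    # ads: list of (name, bid, duration_seconds). Returns ads chosen for the slot.
    remaining = slot_seconds
    chosen = []
    for name, bid, duration in sorted(ads, key=lambda a: a[1], reverse=True):
        if duration <= remaining:
            chosen.append(name)
            remaining -= duration
    return chosen

auction = [("ad_x", 2.00, 40), ("ad_y", 1.50, 30), ("ad_z", 1.00, 20)]
print(fill_slot(auction, 60))  # ['ad_x', 'ad_z']: 40s + 20s fill the 60-second slot
```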
  • the ads selected from the auction are transmitted to the user device 205 for presentation to the user.
  • Data on user responses to the advertisements and the video can be collected.
  • the collected data can be used by the advertising provider 201 to adjust the weights in the determination of relevant advertisements and/or adjust the bids of ads in auctions.
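  • Below is an assumed Python sketch of a simple feedback rule consistent with the description above: good performance nudges a weight (or bid) up and poor performance nudges it down; the thresholds and step size are invented.

```
# Illustrative feedback adjustment of a weight (or bid) based on ad performance.
# Thresholds and step size are assumptions; the specification only says weights
# and bids can be increased for good performance and decreased for poor performance.

def adjust(value, click_through_rate, good=0.02, poor=0.005, step=0.05):
    if click_through_rate >= good:
        return value * (1 + step)
    if click_through_rate <= poor:
        return value * (1 - step)
    return value

weight = 0.7
weight = adjust(weight, click_through_rate=0.03)  # good performance raises the weight
print(round(weight, 3))  # 0.735
```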
  • FIG. 6 shows an example of a generic computer device 600 and a generic mobile computer device 650 , which may be used with the techniques described above.
  • Computing device 600 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, television set-top boxes, servers, blade servers, mainframes, and other appropriate computers.
  • Computing device 650 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, and other similar computing devices.
  • the components shown here, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit the implementations described and/or the claims.
  • Computing device 600 includes a processor 602 , memory 604 , a storage device 606 , a high-speed interface 608 connecting to memory 604 and high-speed expansion ports 610 , and a low speed interface 612 connecting to low speed bus 614 and storage device 606 .
  • Each of the components 602 , 604 , 606 , 608 , 610 , and 612 are interconnected using various buses, and may be mounted on a common motherboard or in other manners as appropriate.
  • the processor 602 can process instructions for execution within the computing device 600 , including instructions stored in the memory 604 or on the storage device 606 to display graphical information for a GUI on an external input/output device, such as display 616 coupled to high speed interface 608 .
  • multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory.
  • multiple computing devices 600 may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).
  • the memory 604 stores information within the computing device 600 .
  • the memory 604 is a volatile memory unit or units.
  • the memory 604 is a non-volatile memory unit or units.
  • the memory 604 may also be another form of computer-readable medium, such as a magnetic or optical disk.
  • the storage device 606 is capable of providing mass storage for the computing device 600 .
  • the storage device 606 may be or contain a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations.
  • a computer program product can be tangibly embodied in an information carrier.
  • the computer program product may also contain instructions that, when executed, perform one or more methods, such as those described above.
  • the information carrier is a computer- or machine-readable medium, such as the memory 604 , the storage device 606 , memory on processor 602 , or a propagated signal.
  • the high speed controller 608 manages bandwidth-intensive operations for the computing device 600 , while the low speed controller 612 manages lower bandwidth-intensive operations. Such allocation of functions is exemplary only.
  • the high-speed controller 608 is coupled to memory 604 , display 616 (e.g., through a graphics processor or accelerator), and to high-speed expansion ports 610 , which may accept various expansion cards (not shown).
  • low-speed controller 612 is coupled to storage device 606 and low-speed expansion port 614 .
  • the low-speed expansion port, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.
  • the computing device 600 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 620 , or multiple times in a group of such servers. It may also be implemented as part of a rack server system 624 . In addition, it may be implemented in a personal computer such as a laptop computer 622 . Alternatively, components from computing device 600 may be combined with other components in a mobile device (not shown), such as device 650 . Each of such devices may contain one or more of computing device 600 , 650 , and an entire system may be made up of multiple computing devices 600 , 650 communicating with each other.
  • Computing device 650 includes a processor 652 , memory 664 , an input/output device such as a display 654 , a communication interface 666 , and a transceiver 668 , among other components.
  • the device 650 may also be provided with a storage device, such as a microdrive or other device, to provide additional storage.
  • Each of the components 650 , 652 , 664 , 654 , 666 , and 668 are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate.
  • the processor 652 can execute instructions within the computing device 650 , including instructions stored in the memory 664 .
  • the processor may be implemented as a chipset of chips that include separate and multiple analog and digital processors.
  • the processor may provide, for example, for coordination of the other components of the device 650 , such as control of user interfaces, applications run by device 650 , and wireless communication by device 650 .
  • Processor 652 may communicate with a user through control interface 658 and display interface 656 coupled to a display 654 .
  • the display 654 may be, for example, a TFT (Thin-Film-Transistor Liquid Crystal Display) display or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology.
  • the display interface 656 may comprise appropriate circuitry for driving the display 654 to present graphical and other information to a user.
  • the control interface 658 may receive commands from a user and convert them for submission to the processor 652 .
  • an external interface 662 may be provided in communication with processor 652, so as to enable near area communication of device 650 with other devices. External interface 662 may provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces may also be used.
  • the memory 664 stores information within the computing device 650 .
  • the memory 664 can be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units.
  • Expansion memory 674 may also be provided and connected to device 650 through expansion interface 672 , which may include, for example, a SIMM (Single In Line Memory Module) card interface.
  • expansion memory 674 may provide extra storage space for device 650 , or may also store applications or other information for device 650 .
  • expansion memory 674 may include instructions to carry out or supplement the processes described above, and may include secure information also.
  • expansion memory 674 may be provided as a security module for device 650 , and may be programmed with instructions that permit secure use of device 650 .
  • secure applications may be provided via the SIMM cards, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner.
  • the memory may include, for example, flash memory and/or NVRAM memory, as discussed below.
  • a computer program product is tangibly embodied in an information carrier.
  • the computer program product contains instructions that, when executed, perform one or more methods, such as those described above.
  • the information carrier is a computer- or machine-readable medium, such as the memory 664 , expansion memory 674 , memory on processor 652 , or a propagated signal that may be received, for example, over transceiver 668 or external interface 662 .
  • Device 650 may communicate wirelessly through communication interface 666 , which may include digital signal processing circuitry where necessary. Communication interface 666 may provide for communications under various modes or protocols, such as GSM voice calls, SMS, EMS, or MMS messaging, CDMA, TDMA, PDC, WCDMA, CDMA2000, or GPRS, among others. Such communication may occur, for example, through radio-frequency transceiver 668 . In addition, short-range communication may occur, such as using a Bluetooth, WiFi, or other such transceiver (not shown). In addition, GPS (Global Positioning System) receiver module 670 may provide additional navigation- and location-related wireless data to device 650 , which may be used as appropriate by applications running on device 650 .
  • Device 650 may also communicate audibly using audio codec 660 , which may receive spoken information from a user and convert it to usable digital information. Audio codec 660 may likewise generate audible sound for a user, such as through a speaker, e.g., in a handset of device 650 . Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, etc.) and may also include sound generated by applications operating on device 650 .
  • the computing device 650 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a cellular telephone 680 . It may also be implemented as part of a smartphone 682 , personal digital assistant, or other similar mobile device.
  • the disclosed and other embodiments and the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them.
  • the disclosed and other embodiments can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, data processing apparatus.
  • the computer-readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them.
  • data processing apparatus encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers.
  • the apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
  • a propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus.
  • a computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
  • a computer program does not necessarily correspond to a file in a file system.
  • a program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
  • a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • the processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output.
  • the processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
  • processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
  • a processor will receive instructions and data from a read-only memory or a random access memory or both.
  • the essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data.
  • a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks.
  • a computer need not have such devices.
  • Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • the disclosed embodiments can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer.
  • Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • the disclosed embodiments can be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of what is disclosed here, or any combination of one or more such back-end, middleware, or front-end components.
  • the components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
  • the computing system can include clients and servers.
  • a client and server are generally remote from each other and typically interact through a communication network.
  • the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.

Abstract

Methods, systems, and apparatus, including computer program products, for selecting advertisements. A request for sponsored content for presentation with a content item in a document is received. One or more candidate sponsored content items are identified based on one or more criteria. The criteria include information related to the content item independent of the document, and information related to the document. One or more of the candidate sponsored content items are selected. The selected sponsored content items are transmitted for presentation with the content item.

Description

    RELATED APPLICATIONS
  • This application is a continuation of and claims the benefit under 35 U.S.C. §120 of U.S. patent application Ser. No. 11/850,652, titled “Selection Of Advertisements For Placement With Content,” filed Sep. 5, 2007, which was a non-provisional of and claimed the benefit under 35 U.S.C. §119 of U.S. Patent Application No. 60/946,702, titled “Selection of Advertisements for Placement with Content,” filed Jun. 27, 2007, both of which are incorporated by reference herein in their entirety.
  • BACKGROUND
  • The subject matter of this specification relates generally to advertising.
  • Online video is a growing medium. The popularity of online video services reflects this growth. Advertisers see online video as another way to reach their customers. Many advertisers are interested in maximizing the number of actions (e.g., impressions and/or click-throughs) for their advertisements. To achieve this, advertisers make efforts to place advertisements with content that is relevant to their advertisements. For example, an advertiser can target its car advertisements to a website about cars.
  • SUMMARY
  • In general, one aspect of the subject matter described in this specification can be embodied in methods that include the actions of receiving a request for sponsored content for presentation with a content item, where the content item is included in a document; identifying one or more candidate sponsored content items based on a plurality of criteria, which includes information related to the content item independent of the document and information related to the document; selecting one or more of the candidate sponsored content items; and transmitting the selected sponsored content items for presentation with the content item. Other embodiments of this aspect include corresponding systems, apparatus, and computer program products.
  • In general, another aspect of the subject matter described in this specification can be embodied in methods that include the actions of receiving a request for sponsored content for presentation with a content item, where the content item is included in a presentation environment; identifying one or more candidate sponsored content items based on a plurality of criteria, which includes information related to the content item and information related to the presentation environment; selecting one or more of the candidate sponsored content items; transmitting the selected sponsored content items for presentation with the content item in the presentation environment. Other embodiments of this aspect include corresponding systems, apparatus, and computer program products.
  • Particular embodiments of the subject matter described in this specification can be implemented to realize one or more of the following advantages. Advertisements can be placed for presentation with third party video content. The placed advertisements are selected for relevance to video content, content of a page in which the video is embedded, and other related content. Advertisers can automatically and dynamically target embedded video content that may change over time.
  • The details of one or more embodiments of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram illustrating an example network environment.
  • FIG. 2 is a block diagram illustrating an example advertising delivery system.
  • FIGS. 3-4 are examples of a user interface illustrating advertising content displayed on a screen with video content.
  • FIG. 5 is a flow diagram illustrating an example process for selecting and delivering advertisements.
  • FIG. 6 is a block diagram illustrating an example generic computer and an example generic mobile computer device.
  • Like reference numbers and designations in the various drawings indicate like elements.
  • DETAILED DESCRIPTION
  • FIG. 1 shows an example of a network environment 100. The environment 100 includes a sponsored content (e.g., advertisement) provider 102, a content sponsor (e.g., provider) 104, and one or more user devices 106, at least some of which communicate across network 108. By way of example, discussion provided herein will make reference to the delivery of advertisements to a content provider. Other forms of sponsored content are possible. In some implementations, the advertisement provider 102 can provide advertising content (e.g., advertisements or “ads” or “ad” in the singular) for presentation with content items (e.g., text, images, video, audio, games, multimedia playlists (such as static, dynamic or executable lists of multiple content items to be played), embedded executables or software) provided by the content provider 104. For example, a video can be provided by the content provider 104 through the network 108 to one or more user devices 106. The ad content can be distributed, through network 108, to one or more user devices 106 before, during, or after presentation of the video. In some implementations, advertisement provider 102 is coupled with one or more advertising repositories (not shown). The repositories store advertising that can be presented with various types of content items, including video and audio.
  • In some implementations, the environment 100 can be used to identify relevant advertising content based on the content of the content item and perhaps other criteria. For example, the advertisement provider 102 can acquire information about the subject matter of a video, such as by reading video metadata that includes keywords that describe the subject matter of the video, and/or analyzing speech in the video, and/or analyzing the visual content of the video, for example. The information can be used to identify relevant advertisements, from which one or more are selected for presentation with the video.
  • In some implementations, content items can include various forms of electronic media. For example, a content item can include text, audio, video, advertisements, configuration parameters, documents, video files published on the Internet, television programs, audio podcasts, video podcasts, live or recorded talk shows, video voicemail, segments of a video conversation, and other distributable resources.
  • In some implementations, a “video content item” is an item of content that includes content that can be perceived visually when played, rendered, or decoded. A video includes video data, and optionally audio data, executable code, hyperlinks, and metadata. Video data includes content in the video that can be perceived visually when the video content item is played, rendered, or decoded. Audio data includes content in the video that can be perceived aurally when the video content item is played, decoded, or rendered. Executable code includes, for example, interactive ADOBE FLASH, JavaScript, or other forms of interactive content. Hyperlinks include, for example, links embedded in or associated with the video or executable code that provide an address link, such as a Uniform Resource Locator (URL), to other content or executable code. A video can include video data and any accompanying audio data regardless of whether or not the video is ultimately stored on a tangible medium. A video can include, for example, a live or recorded television program, a live or recorded theatrical or dramatic work, a music video, a televised event (e.g., a sports event, a political event, a news event, etc.), video voicemail, etc. Video, audio or other content items can also be part of a media playlist.
  • A video content item can also include many types of associated data or metadata. Metadata includes, for example, tags, labels, keywords, time stamps, XML-enclosed data, or other non-displayed information about the content. Examples of types of associated data include video data, audio data, closed-caption or subtitle data, a transcript, content descriptions (e.g., title, actor list, genre information, first performance or release date, etc.), related still images, user-supplied or provider-provided tags and ratings, etc. Some of this data, such as the description, can refer to the entire video content item, while other data (e.g., the closed-caption data) may be temporally-based or timecoded. In some implementations, the temporally-based data may be used to detect scene or content changes to determine relevant portions of that data for targeting ad content to users. The executable code and/or metadata may include interactive playlists of media content, such as lists of video, audio, web links, or other content types.
  • In some implementations, a video content item has one or more interstitial advertisement slots. One or more video advertisements can be presented in between the portions of the video separated by an advertisement slot, similar to television advertising commercials that are presented between portions of a television program. The positions of the advertisement slots can be specified by metadata associated with the video and stored at the video provider 209. The positions of the slots can be manually specified by the author of the video or automatically determined based on an analysis of the video. An example technique for analyzing a video to determine positions of advertisement slots is disclosed in U.S. patent application Ser. No. 11/737,038, titled “Characterizing Content for Identification of Advertising,” filed Apr. 18, 2007, which is incorporated by reference in its entirety herein. Further details related to advertisement slots are disclosed in U.S. patent application Ser. No. 11/550,388, titled “Using Viewing Signals in Targeted Video Advertising,” filed Oct. 17, 2006, which is incorporated by reference in its entirety herein.
  • In some implementations, an “audio content item” is an item of content that can be perceived aurally when played, rendered, or decoded. An audio content item includes audio data and optionally metadata. The audio data includes content in the audio content item that can be perceived aurally when the audio content item is played, decoded, or rendered. An audio content item may include audio data regardless of whether or not the audio content item is ultimately stored on a tangible medium. An audio content item may include, for example, a live or recorded radio program, a live or recorded theatrical or dramatic work, a musical performance, a sound recording, a televised event (e.g., a sports event, a political event, a news event, etc.), voicemail, etc. Each of the different forms or formats of the audio data (e.g., original, compressed, packetized, streamed, etc.) may be considered to be an audio content item (e.g., the same audio content item, or different audio content items).
  • Advertising content can include text, graphics, video, audio, banners, links, executable code and scripts, and other web or television programming related data. As such, ad content can be formatted differently, based on whether it is primarily directed to websites, media players, email, television programs, closed captioning, etc. For example, ad content directed to a website may be formatted for display in a frame within a web browser. As another example, ad content directed to a video player may be presented “in-stream” as a video content item is played in the video player. In some implementations, in-stream ad content may replace the video content item in a video player for some period of time or be inserted between portions of the video content item. An in-stream ad can be pre-roll (before the video content item), post-roll (after the video content item), or interstitial. An in-stream ad may include video, audio, text, animated images, still images, or some combination thereof. The advertisement can appear in the same form as video content, as an overlay over video content, or in other forms. Examples of forms of advertisements for video that can be used with the implementations described in this specification are disclosed in U.S. Provisional Application No. 60/915,654, entitled “User Interfaces For Web-Based Video Player,” filed May 2, 2007; and U.S. patent application Ser. No. 11/760,709, entitled “Systems and Processes for Presenting Informational Content,” filed Jun. 8, 2007, which are incorporated by reference in their entirety herein.
  • The content provider 104 can present content to a user device 106 through the network 108. In some implementations, the content providers 104 are web servers where the content includes webpages or other content written in the Hypertext Markup Language (HTML), or any language suitable for authoring webpages. In general, content provider 104 can include users, web publishers, and other entities capable of distributing content over a network. In some implementations, the content provider 104 may make the content accessible through a known URL.
  • The content provider 104 can receive requests for content (e.g., articles, discussion threads, music, audio, video, graphics, search results, webpage listings, etc.). The content provider 104 can retrieve the requested content in response to the request or service the request in some other way. The advertisement provider 102 can broadcast content as well (e.g., not necessarily responsive to a request).
  • Content provided by content provider 104 can include news, weather, entertainment, or other consumable textual, audio, video, game, or multimedia content. More particularly, the content can include various resources, such as documents (e.g., webpages, plain text documents, dynamic network applications provided to the user on-the-fly, Portable Document Format (PDF) documents, images), video or audio clips, etc. In some implementations, the content can be graphic-intensive, media-rich data, such as, for example, FLASH-based content that presents video and audio, Asynchronous JavaScript and XML (AJAX) based web applications or web pages, and the like.
  • The content provider 104 can provide video content items for presentation to a user. The content provider 104 can provide a video content item as a stream or as a downloadable file to a user device 106. The content provider 104 can also provide a video player module with a video. In some implementations, a content item (e.g., a video content item, an audio content item) and a player module from the content provider 104 are embedded into a document (e.g., a webpage provided by content provider 104). The document includes the player module and a reference to the video or audio content item. The document can be sent to a user device 106. At the user device 106, the embedded player module can retrieve the referenced video or audio content item for playback at the user device 106. The player can provide one or more pieces of content, such as via a playlist.
  • In some implementations, a document and a content item embedded in the document are provided by different content providers. In an example implementation, a video from a first content provider is embedded in a document (e.g., a webpage) from a second content provider. The document includes a location (e.g., a URL) of the video at the first content provider. When the document is rendered at the user device 106, the video can be obtained from the first content provider and played back at the user device 106.
  • The environment 100 includes one or more user devices 106. The user device 106 can include a desktop computer, laptop computer, a media player (e.g., an MP3 player, a streaming audio player, a streaming video player, a television, a computer, a mobile device, etc.), a mobile phone, a browser facility (e.g., a web browser application), an e-mail facility, telephony means, a set top box, a television device or other electronic device that can access advertisements and other content via network 108. The content provider 104 may allow user device 106 to access content (e.g., webpages, videos, etc.).
  • The network 108 facilitates wireless or wireline communication between the advertisement provider 102, the content provider 104, and any other local or remote computers (e.g., user device 106). The network 108 may be all or a portion of an enterprise or secured network. In another example, the network 108 may be a virtual private network (VPN) between the content provider 104 and the user device 106 across a wireline or a wireless link. While illustrated as a single or continuous network, the network 108 may be logically divided into various sub-nets or virtual networks without departing from the scope of this disclosure, so long as at least a portion of the network 108 may facilitate communications between the advertisement provider 102, content provider 104, and at least one client (e.g., user device 106). In certain implementations, the network 108 may be a secure network associated with the enterprise and certain local or remote clients 106.
  • Examples of network 108 include a local area network (LAN), a wide area network (WAN), a wireless phone network, a Wi-Fi network, and the Internet.
  • In some implementations, the content provider 104 may transmit information about how, when, and/or where the ads are to be rendered, and/or information about the results of that rendering (e.g., ad spot, specified segment, position, selection or not, impression time, impression date, size, temporal length, volume, conversion or not, etc.) back to the advertisement provider 102 through the network 108. Alternatively, or in addition, such information may be provided back to the advertisement provider 102 by some other means.
  • FIG. 2 is a block diagram illustrating an example advertising delivery system 200. System 200 includes, or is communicatively coupled with, advertisement provider 201, content provider 203, and user device 205, at least some of which communicate across network 207. In some implementations, the advertisement delivery system 200 is an example implementation of the network environment 100, where advertisement provider 201 is an implementation of advertisement provider 102, content provider 203 is an implementation of content provider 104, user device 205 is an implementation of user device 106, and network 207 is an implementation of network 108.
  • In some implementations, the advertisement provider 201 includes a content analyzer 202 , an ad selection module 204 , an ad server 206 , and a surrounding content module 208 . The content analyzer 202 can analyze received content items (e.g., videos) to determine one or more targeting criteria for content items. For example, the content analyzer 202 may implement various analysis methods, including, but not limited to, weighting schemes, speech processing, image or object recognition, and statistical methods.
  • Analysis methods can be applied to the contextual elements of the received content item (e.g., a video, an audio clip) to determine relevant targeting criteria. For example, the received content can undergo one or more of audio volume normalization, automatic speech recognition, transcoding, indexing, image recognition, etc. In some implementations, the content analyzer 202 includes a speech to text module 210 and an image recognition module 212. Other modules are possible.
  • The speech to text module 210 can analyze a video to identify speech in a video or audio file or stream. For example, a video content item may be received in the system 200. The speech-to-text module 210 can analyze the video content item as a whole. Textual information may be derived from the speech included in the audio data of the video or audio content item by performing speech recognition on the audio content, producing in some implementations hypothesized words annotated with confidence scores, or in other implementations a lattice which contains many hypotheses. Examples of speech recognition techniques include techniques based on hidden Markov models, dynamic programming, or neural networks.
  • In some implementations, the speech analysis may include identifying phonemes, converting the phonemes to text, interpreting the phonemes as words or word combinations, and providing a representation of the words, and/or word combinations, which best corresponds with the received input speech (e.g., speech in the audio data of a video content item). The text can be further processed to determine the subject matter of the video or audio content item. For example, keyword spotting (e.g., word or utterance recognition), pattern recognition (e.g., defining noise ratios, sound lengths, etc.), or structural pattern recognition (e.g., syntactic patterns, grammar, graphical patterns, etc.) may be used to determine the subject matter, including different segments, of the video content item. In some implementations, further processing may be carried out on the video or audio content item to refine the identification of subject matter in the video or audio content item.
  • A video or audio content item can also include timecoded metadata. Examples of timecoded metadata include closed-captions, subtitles, or transcript data that includes a textual representation of the speech or dialogue in the video or audio content item. In some implementations, a caption data module (not shown) at the advertisement provider 201 extracts the textual representation from the closed-caption, subtitle, or transcript data of the content item and uses the extracted text to identify subject matter in the video or audio content item. The extracted text can be a supplement to or a substitute for a speech recognition analysis on the video or audio content item.
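  • By way of illustration, the following Python sketch spots frequent keywords in extracted caption or transcript text to approximate a video's subject matter; the stop-word list and frequency counting are assumptions, not the system's actual analysis.

```
# Illustrative keyword spotting over closed-caption or transcript text to
# approximate the subject matter of a video. The stop words and the simple
# frequency count are assumptions for this sketch.

from collections import Counter
import re

STOP_WORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "it", "this"}

def top_terms(transcript_text, n=3):
    words = re.findall(r"[a-z']+", transcript_text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return [term for term, _ in counts.most_common(n)]

caption_text = "The sports car accelerates quickly. The car has a quiet engine."
print(top_terms(caption_text))  # e.g. ['car', 'sports', 'accelerates'] for this toy text
```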
  • Further processing of received content can also include image or object recognition. For example, automatic object recognition can be applied to received or acquired video data of a video content item to determine targeting criteria for one or more objects associated with the video content item. For example, the image recognition module 212 may automatically extract still frames from a video content item for analysis. The analysis may identify targeting criteria relevant to objects identified by the analysis. The analysis may also identify changes between sequential frames of the video content item that may be indicia of different scenes (e.g., fading to black). Examples of object recognition techniques include appearance-based object recognition, and object recognition based on local features. An example of object recognition is disclosed in U.S. patent application Ser. No. 11/608,219, entitled “Image Based Contextual Advertisement Method and Branded Barcodes,” filed Dec. 7, 2006, which is incorporated by reference in its entirety herein.
  • The surrounding content module 208 reads content that is in proximity to (e.g., surrounds) an embedded content item. In implementations where a video or audio content item is embedded in a document, the surrounding content module 208 reads the content that surrounds the embedded content item, i.e., the content of the document other than the video itself. The surrounding content can be matched against advertisements to identify advertisements that have some subject matter relevance to the document, for example.
  • Advertisement provider 201 includes an ad selection module 204. In some implementations, the ad selection module 204 identifies, for a presentation of a content item, candidate advertisements and selects one or more of these candidate advertisements for presentation with the content item. The ad selection module 204 can identify the candidate advertisements based on multiple criteria, including advertiser preferences, content provider preferences, the content of the video, and content surrounding or in proximity to the video. From these candidate advertisements, one or more are selected for presentation with the video. In some implementations, advertisements are selected for presentation in accordance with an auction. Further details on the identification of candidate advertisements and selection of advertisements are described below in reference to FIG. 5.
  • Advertisement provider 201 includes an ad server 206. Ad server 206 may directly, or indirectly, enter, maintain, and track ad information. The ads may be in the form of graphical ads such as so-called banner ads, text only ads, image ads, audio ads, video ads, ads combining one or more of any such components, etc. The ads may also include embedded information, such as a link, and/or machine executable instructions. In some implementations, metadata can be associated with an advertisement. In some implementations, the metadata can include one or more keywords indicating topics or concepts to which the advertisement is relevant or other information that indicates the subject matter of the advertisement. In some implementations, the advertiser specifies one or more keywords that express the subject matter of the advertisement or one or more categories or verticals to which the advertisement is targeted.
  • Ad server 206 can receive requests for advertisements from a user device 205. In response to these requests, the ad server 206 can transmit selected advertisements to the user device 205. Further, the ad server 206 can receive usage information (e.g., advertisement click-through information) from the user device 205. Although not shown, other entities may provide usage information (e.g., whether or not a conversion or selection related to the ad occurred) to the ad server 206. For example, this usage information may include measured or observed user behavior related to ads that have been served. In some implementations, usage information can be measured in a privacy-preserving manner such that individually identifiable user information is filtered by the ad server, by the user device, or through an intermediary device.
  • The ad server 206 may include information concerning accounts, campaigns, creatives, targeting, advertiser preferences for ad placement, etc. The term “account” relates to information for a given advertiser (e.g., a unique email address, a password, billing information, etc.). A “campaign,” “advertising campaign,” or “ad campaign” refers to one or more groups of one or more advertisements, and may include a start date, an end date, budget information, targeting information, syndication information, etc.
  • The advertisement provider 201 can use one or more advertisement repositories 214 for selecting ads for presentation to a user. The repositories 214 may include any memory or database module and may take the form of volatile or non-volatile memory including, without limitation, magnetic media, optical media, random access memory (RAM), read-only memory (ROM), removable media, or any other suitable local or remote memory component.
  • The content provider 203 includes a content server 215. The content server 215 can serve various types of content, including documents, video content items, and audio content items, for example. In some implementations, the content server 215 can serve documents that have other content (e.g., video or audio content items from the content provider 203 or another content provider) embedded within. The content that can be served by the content server 215 is stored in a content repository. In some implementations, different types of content can be stored in separate repositories. For example, documents can be stored in a document repository 217, and video content items and metadata associated with the video content items can be stored in a video content repository 218. In an exemplary implementation, video metadata is written using the Extensible Markup Language (XML).
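  • Because the exemplary implementation stores video metadata as XML, the following Python sketch parses a hypothetical metadata record; the element and attribute names are invented for illustration and are not the repository's actual schema.

```
# Illustrative parsing of hypothetical XML video metadata. The element and
# attribute names are invented for this sketch and are not an actual schema.

import xml.etree.ElementTree as ET

metadata_xml = """
<video id="v123">
  <title>Track Day Highlights</title>
  <keywords>cars, racing, motorsport</keywords>
  <adSlot position="interstitial" offsetSeconds="120"/>
</video>
"""

root = ET.fromstring(metadata_xml)
keywords = [k.strip() for k in root.findtext("keywords").split(",")]
slots = [(s.get("position"), int(s.get("offsetSeconds"))) for s in root.findall("adSlot")]
print(keywords)  # ['cars', 'racing', 'motorsport']
print(slots)     # [('interstitial', 120)]
```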
  • In some implementations, the content server 215 serves video content items, such as video streams or video files, for example. Video player applications may be used to render video streams or files. Ads may be served by the ad server 206 in association with video content items. For example, one or more ads may be served before, during, or after a music video, program, program segment, etc. Alternatively, one or more ads may be served in association with a music video, program, program segment, etc. The videos that can be served by the content server 215 are stored in a videos and video metadata repository 218. In some implementations, the content provider 203 also provides access to documents (e.g., a webpage) that include information (e.g., subject matter and categories of the videos) about the videos in the video repository 218.
  • While FIG. 2 is illustrated as having one content provider providing both documents and content items (e.g., video content items), in some implementations, separate content providers, with different content servers, provide documents and content items, respectively. For example, a first content provider can provide documents retrieved from a document repository, where the documents include references to content items (e.g., video or audio content items) stored in a content item repository and served by a second content server.
  • In operation, the advertisement provider 201 and the content provider 203 can provide content to a user device 205. The user device 205 is an example of an ad consumer. The user device 205 may include a user device such as a media player (e.g., an MP3 player, a streaming audio player, a streaming video player, a television, a computer, a mobile device, etc.), a browser facility, an e-mail facility, telephony means, etc.
  • The user device 205 includes an application 220 for rendering and presenting content and advertisements. In some implementations, the application is a web browser that can render and display documents (e.g., webpages) and other content. In an exemplary implementation, one or more plug-ins can be associated with the application. The plug-ins can facilitate rendering and presentation of content (e.g., video or audio content items) or advertisements by the application 220.
  • The user device 205 includes a content player module 222 and an advertising module 224. The content player module 222 can play back content items, such as video or audio content items, for example. In some implementations, the content player module 222 is embedded in a document (e.g., a webpage), and received by the user device 205 when the document is received by the user device 205. When the document is rendered by the application 220, the content player module 222 is executed by the application 220 or a plug-in to the application 220.
  • The advertising module 224 sends requests for advertisements to the advertising provider 201 and receives advertisements responsive to the requests from the advertising provider 201. The advertising module 224 can provide the advertisements to the application 220 or the content player module 222 for presentation to the user. In some implementations, the advertising module 224 is a sub-module of the content player module 222.
  • FIG. 3 is an example user interface 300 illustrating video content displayed on a screen with surrounding content. The user interface 300 illustrates an example web browser user interface. However, the content shown in the user interface 300 can be presented in a webpage, an MP3 player, a streaming audio player, a streaming video player, a television, a computer, a mobile device, etc. The content shown in the user interface 300 may be provided by advertisement provider 201, content provider 203, another networked device, or some combination of those providers.
  • As shown, the user interface 300 includes a content player region 302 and one or more “surrounding content” regions 304A, 304B, and 304C. The content player region 302 may include a media player for presenting text, images, video, or audio, or any combination thereof. An example of what can be shown in the content player region 302 is described in further detail below in relation to FIG. 4.
  • The surrounding content regions 304A, 304B, and 304C can display text, graphics, links, third party add-ins (e.g., search controls, download buttons, etc.), video and audio clips (e.g., graphics), help instructions (e.g., text, html, pop-up controls, etc.), and advertisements (e.g., banner ads, flash-based video/audio ads, scrolling ads, etc.). In some implementations, the surrounding content regions 304A, 304B, and 304C are portions of a webpage in which the content player is embedded.
  • The surrounding content may be related to the content displayed in the content player region 302. For example, advertisements related to the content in the content player region 302 can be displayed in any of the surrounding content regions 304A, 304B, 304C. As another example, the content in the content player region 302 is chosen for display with the surrounding content because they are related in subject matter (e.g., a video and a video player embedded with a posting in a blog). In some other implementations, the surrounding content is not related to the content in the content player region 302.
  • The surrounding content regions 304A, 304B, and 304C may be in proximity to the content player region 302 during the presentation of video content in the region 302. For example, the surrounding content regions 304A, 304B, and 304C can be adjacent to the content player region 302, either above, below, or to the side of the content player region 302. For example, the user interface 300 may include an add-on, such as a stock ticker with text advertisements. The stock ticker can be presented in one of the surrounding content regions 304A, 304B, or 304C.
  • FIG. 4 illustrates an example user interface that can be displayed in a video player, such as in content player region 302. Content items, such as video, audio, and so forth can be displayed in the content player region 302. The region 302 includes a content display portion 402 for displaying a content item, a portion 404 for displaying information (e.g., title, running time, etc.) about the content item, player controls 405 (e.g., volume adjustment, full-screen mode, play/pause button, progress bar and slider, option menu, etc.), an advertisement display portion 408, and a multi-purpose portion 406 that can be used to display various content (e.g., advertisements, closed-captions/subtitles/transcript of the content item, related links, etc.).
  • As shown, the content represents a video (or audio) interview occurring between a person located in New York City, New York and a person located in Los Angeles, California. The interview is displayed in the content display portion 402 of the region 302.
  • The region 302 may be presented as a stream, upon visiting a particular site presenting the interview, or after the execution of a downloaded file containing the interview or a link to the interview. As such, the region 302 may display additional content (e.g., advertisement content) that relates to the content shown in the video interview. For example, the additional content may change according to what is displayed in the region 302. The additional content can be provided by the content provider 203 and/or the advertisement provider 201.
  • An on-screen advertisement is displayed in the multi-purpose portion 406. An additional on-screen advertisement is displayed in the advertisement display portion 408. In some implementations, on-screen advertisements may include text-and-audio, video, text, animated images, still images, or some combination thereof.
  • In some implementations, the content display portion 402 can display advertisements targeted to audio-only content, such as ads capable of being presented in-stream with a podcast or web-based radio broadcast. For example, the advertisement provider 201 may provide interstitial advertisements, sound bites, or news information in the audio stream of music or disc jockey conversations.
  • Advertisements may be presented on the content display portion 402. Temporal placement of advertisements relative to a video content item may vary. For example, an advertisement may be presented in a pre-roll, mid-roll, or post-roll placement.
  • In some implementations, the progress bar in the player controls 405 also shows the positions of the advertisement slots in the content item being played.
  • The multi-purpose portion 406 may also include a skip advertisement link or control 410. When the skip advertisement link 410 is selected by the user, the currently displayed video advertisement is skipped and playback continues from the first frame of the video after the skipped video advertisement (or, playback stops if the skipped video advertisement is a post-roll advertisement). In some implementations, the skip advertisement link or control 410 is a link. In some other implementations, the skip advertisement link or control 410 may be a button, selectable icon, or some other user-selectable user interface object.
  • In some implementations, the ability of a user to skip advertisements, for example, by using the skip advertisement link or control 410, may affect the selection of an advertisement to be presented by the advertisement provider 201. For example, a large number of skips for an advertisement may indicate that the advertisement is ineffective or unpopular, and thus can be made less likely to be selected for presentation (e.g., by decreasing its bid in a placement auction or being counted as a negative weighting factor in auctions or towards advertising impressions). Conversely, advertisements that are infrequently skipped may receive a positive weighting factor and/or have a positive impact toward advertising impressions.
  • FIG. 5 illustrates an example process 500 for selecting and delivering advertisements for presentation with a content item. For convenience, the process 500 will be described in reference to a system (e.g., system 200) that performs the process. For ease of explanation, process 500 will be described with reference to a video content item (e.g., a video stream or file), but it should be appreciated that process 500 is also applicable to other content items, such as audio content items, as well.
  • A request for one or more advertisements is received (502). In some implementations, a user device 205 can receive a document from a content provider 203. In some implementations, the document can include an embedded video content item (hereinafter referred to as the “video”), an embedded content player (e.g., content player module 222), and an advertising module (e.g., advertising module 224). The advertising module 224 sends a request for advertisements to the advertisement provider 201. The request can be for one or more advertisements for presentation with the video. In some implementations, the request can specify the type (e.g., computer, mobile phone, etc.) of the user device 205 that sent the request. For example, the request can specify that the user device is a desktop or notebook computer or a mobile phone.
  • The request can include metadata associated with the video. The metadata can specify preferences on what and how advertisements are presented with the video. The preferences can be set by the author of the video, the author of the document in which the video is embedded, or content provider 203 (hereinafter referred to collectively as “content provider preferences”).
  • In some implementations, the content provider preferences specify the available positions in which video advertisements can be presented. For example, the metadata can specify that the video has two interstitial positions, a pre-roll position, and a post-roll position for advertisements, and that advertisements may be placed in any of these positions. As another example, the content provider preferences can specify that advertisements may be presented with the video only as post-roll advertisements, only as pre-roll advertisements before the video, only as interstitial advertisements between portions of the video, or some combination thereof (e.g., no interstitial advertisements, i.e., pre-roll or post-roll advertisements only).
  • The metadata can also include data that provides an indication of the content (e.g., the subject matter) of the video. For example, the metadata can include one or more keywords that specify topics or concepts to which the video is relevant. The metadata can include one or more categories or verticals (hereinafter referred to as “channels”) into which the video can be classified. The metadata can also include information referencing one or more playlists or playlist categories on which the content is placed or linked. The topical data or the channel data can be used by the advertisement provider 201 to identify advertisements that are relevant to the video. In some other implementations, the topical or channel information of the video can be determined by analyzing the video.
  • In some implementations, the metadata can specify other types of content provider preferences as well. For example, the metadata can specify whether video advertisements displayed with the video can be skipped or not. As another example, the metadata can also specify one or more target demographics for the video. As a further example, the metadata can specify the time lengths of advertisement placement positions, the maximum allowable advertisement duration, and/or which types of advertisements (e.g., text ads, overlay ads, banner ads, pop-up ads, video ads, etc.) are allowed. In some implementations, the preferences in the metadata include blacklisted keywords or advertisers. An advertisement associated with a blacklisted keyword or advertiser is automatically disqualified from consideration for placement with the video. In some implementations, an advertiser in the blacklist is specified by a name and/or a URL associated with the advertiser.
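  • The following is a minimal sketch of how such request metadata and content provider preferences might be represented in code. It is illustrative only; the field names (allowed_positions, max_ad_duration_seconds, blacklisted_advertisers, and so on) are assumptions and not a schema defined by this disclosure.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ContentProviderPreferences:
    """Illustrative container for content provider preferences carried in an ad request."""
    allowed_positions: List[str] = field(
        default_factory=lambda: ["pre-roll", "interstitial", "post-roll"])
    skippable_allowed: bool = True
    max_ad_duration_seconds: int = 30
    allowed_ad_types: List[str] = field(
        default_factory=lambda: ["video", "overlay", "text"])
    blacklisted_keywords: List[str] = field(default_factory=list)
    blacklisted_advertisers: List[str] = field(default_factory=list)


@dataclass
class AdRequest:
    """Illustrative ad request: device type, video keywords/channels, and provider preferences."""
    device_type: str
    video_keywords: List[str]
    video_channels: List[str]
    preferences: ContentProviderPreferences


# Example request for a car-related video embedded in a webpage (hypothetical values).
request = AdRequest(
    device_type="desktop",
    video_keywords=["cars", "racing"],
    video_channels=["auto videos"],
    preferences=ContentProviderPreferences(
        blacklisted_advertisers=["example-competitor.com"]),
)
```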
  • One or more candidate advertisements are identified (504). In some implementations, one or more advertisements from the advertisements repository 214 are identified by the ad selection module 204 to be candidates in an auction for placement with the video. In some implementations, the candidate advertisements are identified based on multiple criteria.
  • In some implementations, a criterion for identification of candidate advertisements is advertiser preferences. For an advertisement, an advertiser can set preferences related to the presentation and targeting of the advertisement. The preferences can include preferences related to presentation position (e.g., pre-roll, post-roll, interstitial), capability of being skipped by the user, target demographics, type of device, and so forth. In some implementations, the advertiser preferences are matched against the content provider preferences.
  • For example, the advertiser can specify a preference as to whether a video advertisement can be presented pre-roll, post-roll, or in an interstitial slot. The advertiser can specify a preference as to whether a user viewing a video advertisement can be given the capability to skip it. The advertiser can specify a preference as to the target demographics to which the advertisement may be shown. The advertiser can specify that the advertisement is best viewed on a desktop/notebook computer or a mobile phone. Other types of preferences are possible.
  • In some implementations, other characteristics of the advertisement can be matched against the content provider preferences. For example, the content provider preferences may include a maximum allowable time length per video ad. A video ad may be disqualified if its length is longer than the maximum time length. As another example, the content provider preferences may include restrictions on which types of ads are allowed. If the ad does not belong to one of the allowed types, the ad can be disqualified.
  • In some implementations, an advertiser preference can be an absolute preference or a relative preference. If a preference is an absolute preference and it is not satisfied by the video, then the advertisement is disqualified from the auction, regardless of satisfaction of other preferences and relevance of the advertisement to the video or the document. If the preference is a relative preference and it is not completely satisfied, then the advertisement may still be eligible for the auction, subject to other criteria, but the incomplete satisfaction of the preference can be used to modify a bid associated with the advertisement in the auction. The advertiser preferences can specify a bid modifier for a preference, in case that preference is not satisfied completely.
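  • A hedged sketch of how absolute and relative advertiser preferences with bid modifiers could be evaluated is shown below. The preference record layout and the example dollar amounts are assumptions for illustration, not a defined data model.

```python
def apply_advertiser_preferences(base_bid, preferences, video_attributes):
    """Return (eligible, adjusted_bid) after checking advertiser preferences.

    preferences: list of dicts with keys 'attribute', 'required_value',
    'absolute' (bool), and 'bid_modifier' (applied when a relative preference
    is unmet). Illustrative structure only.
    """
    bid = base_bid
    for pref in preferences:
        satisfied = video_attributes.get(pref["attribute"]) == pref["required_value"]
        if satisfied:
            continue
        if pref["absolute"]:
            return False, 0.0          # absolute preference unmet: disqualify the ad
        bid += pref["bid_modifier"]    # relative preference unmet: adjust the bid
    return True, max(bid, 0.0)


# Example: the advertiser requires a pre-roll slot (absolute) and prefers
# non-skippable placements (relative, -$0.50 if the video allows skipping).
prefs = [
    {"attribute": "position", "required_value": "pre-roll", "absolute": True, "bid_modifier": 0.0},
    {"attribute": "skippable", "required_value": False, "absolute": False, "bid_modifier": -0.50},
]
print(apply_advertiser_preferences(2.00, prefs, {"position": "pre-roll", "skippable": True}))
# -> (True, 1.5)
```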
  • In some implementations, the preferences can include a channel to which the advertisement is targeted. An advertiser can target an advertisement to a particular channel. Examples of channels include “news videos” or “action movies.” In some implementations, an advertisement targeted to a channel is identified as a candidate advertisement for auctions for placement with videos in that channel. However, an advertisement targeted to a channel can be disregarded with respect to a particular video in the channel if other advertiser preferences for the advertisement are not satisfied by the video. For example, an advertisement can be disregarded if the advertiser specified that the advertisement be presented only as a pre-roll ad but a video only allows interstitial and post-roll advertisements.
  • Yet another criterion for identifying candidate advertisements is the content (e.g., subject matter) of the video. The video can be associated with metadata specifying one or more topics or concepts. Alternatively, a document provided by the content provider (e.g., a webpage) can include topical or concept information for the video. The video can be retrieved by the advertisement provider 201 from the content provider 203 and analyzed by the content analyzer 202. One or more topics or concepts to which the video is relevant can be determined from the analysis. The content of the advertisement is compared against the topics or concepts of the video. In an example implementation, the topics or concepts of the video are compared to the topics and concepts of the advertisement. For example, the topical keywords of the video are compared to the topical keywords of the advertisement using, for example, keyword analysis, and a score can be determined from the comparison. If the score exceeds a threshold, for example, the advertisement can be identified as a candidate for the auction, pending satisfaction of other criteria, if any. Topics can also be determined from image, audio, or video analysis.
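  • One simple way such a keyword comparison could be scored is a normalized overlap between the video's topical keywords and the advertisement's keywords, as sketched below; the Jaccard-style measure and the threshold value are assumptions chosen for illustration.

```python
def keyword_overlap_score(video_keywords, ad_keywords):
    """Score topical similarity as the Jaccard overlap of two keyword sets (0.0 to 1.0)."""
    video_set, ad_set = set(video_keywords), set(ad_keywords)
    if not video_set or not ad_set:
        return 0.0
    return len(video_set & ad_set) / len(video_set | ad_set)


CANDIDATE_THRESHOLD = 0.2  # illustrative cutoff, not a value from this disclosure

score = keyword_overlap_score(["cars", "racing", "engines"], ["cars", "tires", "racing"])
is_candidate = score > CANDIDATE_THRESHOLD
print(score, is_candidate)  # 0.5 True
```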
  • In some implementations, the topical keywords for a video can be determined using search engine data. For example, data related to interactions with a video by users who were directed to the video by a search query (e.g., the video was a result for the search query) can be an indication of whether the search query is a good topical keyword for the video. Interactions with the video can include watching the video in its entirety, watching the video repeatedly, fast-forwarding or advancing the video, or skipping the video. A search engine can collect the query and interaction data anonymously and the advertisement provider 201 can analyze the data to determine if a search query is a good keyword for the video based on the interactions of the users. For example, if users who accessed a video through a search query tended to skip the video, then the search query can be determined to be a poor keyword for the video. On the other hand, if the users tend to watch the video over and over again, then the search query can be determined to be a good keyword for the video. The good keywords are associated with the video and can be compared to the keywords for the advertisements.
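  • Below is a rough sketch of how anonymized query and interaction data might be aggregated into a keyword-quality signal; the interaction weights and the acceptance threshold are invented for illustration only.

```python
from collections import defaultdict

# Illustrative weights: completions and replays count for a query, skips count against it.
INTERACTION_WEIGHTS = {
    "watched_fully": 1.0,
    "rewatched": 1.5,
    "fast_forwarded": -0.5,
    "skipped": -1.0,
}


def good_keywords(interaction_log, min_score=0.0):
    """interaction_log: iterable of (query, interaction) pairs collected anonymously.

    Returns the queries whose aggregate interaction score exceeds min_score."""
    scores = defaultdict(float)
    for query, interaction in interaction_log:
        scores[query] += INTERACTION_WEIGHTS.get(interaction, 0.0)
    return [query for query, score in scores.items() if score > min_score]


log = [("vintage cars", "watched_fully"), ("vintage cars", "rewatched"),
       ("free movies", "skipped"), ("free movies", "skipped")]
print(good_keywords(log))  # ['vintage cars']
```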
  • In some implementations, an advertisement can target specific channels. A video can be analyzed to determine to which channel the video can be classified, or the channels to which the video can be classified can be specified in the video metadata. In other words, videos can be classified into channels based on the analysis of the videos, if no channels were pre-specified. If the targeted channels of the advertisement match any of the channels to which the video is classified, the advertisement is identified as a candidate for the auction, subject to satisfaction of other preferences (e.g., satisfaction of content provider preferences regarding placement position). Similarly, advertisements can target categories of playlists or individual playlists based on such categorization.
  • A further criterion for identifying candidate advertisements is the surrounding content around the embedded video. The surrounding content can include the content of the document in which the video is embedded. In some implementations, the surrounding content can also include the user profile of the owner of the document or the user viewing the video. For example, if a video is embedded in a person's page on a social networking site, the user profile of the person owning the page and/or the user profile of the user who is viewing the page and the video are surrounding content. The surrounding content module 208 can compare the surrounding content against the topics and concepts of the advertisement using, for example, keyword analysis, and a score can be determined from the comparison. If the score exceeds a threshold, for example, the advertisement can be identified as a candidate for the auction, pending satisfaction of other criteria, if any.
  • In some implementations, a “user profile” of a page owner or of a viewer of the video is demographic information associated with the respective user, such as age, gender, education, etc. Demographic information of a user can be extracted from the user's profile on a social networking site, for example. In some implementations, the demographic information can also be obtained from third parties. The demographic information can be obtained separate from any personally identifiable information.
  • In some implementations, a further criterion for identifying candidate advertisements is content of a page that links to the video. Hyperlinks to a video can be identified by a web crawler system. Content in the pages that contain the hyperlinks can be extracted. For example, the anchor text of a hyperlink and text content surrounding the hyperlink can be extracted. The extracted text can be compared to the topics and concepts of the advertisements, similar to the comparison of the surrounding content and the advertisement described above. In some implementations, the content from the linking page can be weighted higher if the linking page is a high-quality page based on inbound and outbound links, for example.
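  • As a rough illustration of extracting anchor text for hyperlinks that point to a given video page, the sketch below uses Python's standard html.parser module; a production crawler would be considerably more involved, and the URL shown is hypothetical.

```python
from html.parser import HTMLParser


class AnchorTextExtractor(HTMLParser):
    """Collect the anchor text of hyperlinks pointing at a target URL."""

    def __init__(self, target_url):
        super().__init__()
        self.target_url = target_url
        self._in_target_link = False
        self.anchor_texts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a" and dict(attrs).get("href") == self.target_url:
            self._in_target_link = True
            self.anchor_texts.append("")

    def handle_data(self, data):
        if self._in_target_link:
            self.anchor_texts[-1] += data

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_target_link = False


page = '<p>Watch this <a href="http://example.com/video123">great racing highlight</a> now.</p>'
extractor = AnchorTextExtractor("http://example.com/video123")
extractor.feed(page)
print(extractor.anchor_texts)  # ['great racing highlight']
```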
  • In some implementations, the page linking to the video or the page in which the video is embedded can be analyzed to determine if the content in the page is related specifically to the video or includes content related to other videos as well. For example, the page can be parsed in a structured way to identify components such as titles, particular blog posts, and so on. When identifying candidate advertisements for the video, content related to other videos can be weighted less or disregarded.
  • In some implementations, interaction data (e.g., skipping, fast-forwarding, watching multiple times, etc.) can be collected for users navigating to the video through the hyperlinks. The interaction data can be used to determine which hyperlinks and content related to the hyperlinks are more relevant to the video for use in advertisement selection.
  • In some implementations, another criterion is a text transcript of speech in the video generated by a speech recognition process applied to the video. The transcript can be a source of keywords for the video; keyword analysis can be applied to the transcript to identify keywords for the video.
  • In some implementations, the speech recognition process can also determine per-term confidence levels for terms in the transcript, where a term can include words, phrases, numbers, characters, etc. The per-term confidence levels measure a confidence of the speech recognition process that the corresponding terms were recognized correctly. The confidence levels can be used as weights for terms in the transcript. For example, for two terms in the transcript of approximately equal relevance to the video, the term with the higher confidence level is given more weight, and thus more influence, when matching advertisements to terms in the transcript. An example of using speech recognition, including confidence levels for terms, to identify advertisements relevant to content is disclosed in U.S. patent application Ser. No. 11/241,834, titled “Using Speech Recognition to Determine Advertisements Relevant to Audio Content and/or Audio Content Relevant to Advertisements,” filed Sep. 30, 2005, which is incorporated by reference herein in its entirety.
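  • A minimal sketch of weighting transcript terms by recognition confidence appears below. The (term, confidence) data layout and the way confidences feed the match score are assumptions made for illustration and are not taken from the referenced application.

```python
def confidence_weighted_match(transcript_terms, ad_keywords):
    """transcript_terms: iterable of (term, confidence) pairs from speech recognition.

    Returns a score in which higher-confidence terms contribute more weight."""
    ad_set = {kw.lower() for kw in ad_keywords}
    total = matched = 0.0
    for term, confidence in transcript_terms:
        total += confidence
        if term.lower() in ad_set:
            matched += confidence
    return matched / total if total else 0.0


transcript = [("convertible", 0.95), ("engine", 0.90), ("uh", 0.40), ("baseball", 0.30)]
print(confidence_weighted_match(transcript, ["engine", "convertible"]))  # ~0.73
```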
  • In some implementations, the transcript can also be used to verify any publisher-provided metadata regarding the subject matter of the video. For example, the publisher of the video can declare in the video metadata that the video is about cars. If the transcript indicates that the video is about baseball (e.g., because the transcript has no mention of “cars”), the video and the publisher can be flagged for further manual review. If the manual review confirms that the video metadata does not accurately describe the subject matter of the video, the publisher can be penalized (e.g., banned from receiving advertisements from the advertisement provider 201, or subjected to lesser penalties).
  • In some implementations, the comparison with the surrounding content is omitted if the advertisement targets particular channels of videos.
  • As an example of how both the content of the video and the surrounding content can affect the identification of candidate advertisements, consider a video about cars. If the video is embedded in a webpage about racing, then advertisements for racing cars or sports cars, for example, can be identified as candidate advertisements. If the video is embedded in a webpage about parenting, then advertisements for family friendly cars (e.g., minivans) or child safety seats, for example, can be identified as candidate advertisements. Advertisements for racing cars or sports cars would be less likely to be identified as candidate advertisements in the second scenario because of the difference in the surrounding content (webpage about parenting vs. webpage about racing). In some implementations, content that is not likely to be relevant to the content of a webpage can be excluded from the targeting.
  • In some implementations, an advertisement is compared against both the content of the video and of the surrounding content, and the comparisons are combined in a weighted manner. For example, the comparison between the topics of the advertisement and the topics of the video, as determined from an analysis of the video, can yield one score, while the comparison between the topics of the advertisement and the content of the document in which the video is embedded can yield another score. Further, comparison to the user profile of the document owner and/or to the user profile of a user viewing the document can yield respective scores. Weights can be specified for each of these types of comparisons. In some implementations, the weights can be specified by the author or provider of the video or the owner or provider of the document in which the video is embedded. The weights can be specified per provider and/or per content item. The weights can serve as an indication of which comparison should play a larger role in affecting which advertisements are candidates.
  • In some implementations, the weights can be adjusted over time to optimize the weightings. For example, weights can be determined from experimentation and empirical analysis of performance of advertisements placed using different weights. As another example, a machine learning process can be used to analyze the advertisement performance and adjust weights without manual intervention.
  • The scores from the comparisons to the content of the video and of the surrounding content can be combined into a weighted score for the advertisement using, for example, a linear combination. An example linear combination is S=αA+βB+γC+ . . . , where S is the weighted score, α, β, and γ are the weights, and A, B, C are the scores determined from the individual comparisons. A weighted score S can be determined for an advertisement. If the weighted score exceeds a specified threshold, then the advertisement is identified as a candidate advertisement for the auction. Otherwise, the advertisement is disregarded or disqualified.
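  • The linear combination above can be expressed directly in code, as in the sketch below; the particular weights and the candidate threshold are placeholders, not values specified by this disclosure.

```python
def weighted_score(scores, weights):
    """Combine per-comparison scores (video, document, user profiles, ...) linearly:
    S = alpha*A + beta*B + gamma*C + ..."""
    if len(scores) != len(weights):
        raise ValueError("scores and weights must align")
    return sum(w * s for w, s in zip(weights, scores))


# A: video-content comparison, B: document comparison, C: viewer-profile comparison.
A, B, C = 0.8, 0.4, 0.6
alpha, beta, gamma = 0.5, 0.3, 0.2            # illustrative weights
S = weighted_score([A, B, C], [alpha, beta, gamma])
CANDIDATE_THRESHOLD = 0.5                      # illustrative threshold
print(S, S > CANDIDATE_THRESHOLD)              # 0.64 True
```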
  • One or more of the candidate advertisements are selected (506). In some implementations, the identified candidate advertisements are placed into an auction for presentation with the video. Advertisers can enter bids using the ad server 206, for example. The bids can be cost-per-click (CPC) bids or cost-per-thousand impression (CPM) bids. In an example implementation, the CPM bid of an advertisement can be converted to an estimated CPC bid by multiplying the CPM bid by a click-through rate of the advertisement. The advertisements in the auction are then ordered based on their actual or estimated CPC bids.
  • In some implementations, the advertiser can enter a base bid and any number of bid modifiers. An example of a bid modifier is a decrease of $0.50 from the bid if the video allows skipping of advertisements. Another example of a bid modifier is a decrease of $0.50 from the bid if the video does not allow pre-roll advertisements. These modifiers allow an advertiser to fine tune its bid based on the advertiser's value of an ad placement under various conditions.
  • A bid of an advertisement can also be modified by the advertisement provider 201 based on the performance or quality of the advertisement over time. In some implementations, quality measures of an advertisement can include a click-through rate, a conversion rate, a skip rate, the number of times the advertisement was viewed completely vs. the number of impressions, and so forth. Advertisements of low quality based on these measures can have their bids decreased. For example, if the advertisement has a high rate of being skipped by viewers and/or has a low conversion rate, the bid can be decreased.
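  • One hedged way quality measures such as skip rate and conversion rate could be folded into a bid adjustment is sketched below; the penalty and bonus factors are illustrative knobs, not values specified by this disclosure.

```python
def quality_adjusted_bid(bid, skip_rate, conversion_rate,
                         skip_penalty=0.5, conversion_bonus=0.5):
    """Decrease the bid for frequently skipped ads and increase it for converting ads.

    skip_rate and conversion_rate are in [0, 1]; skip_penalty and conversion_bonus
    are illustrative tuning factors."""
    multiplier = 1.0 - skip_penalty * skip_rate + conversion_bonus * conversion_rate
    return max(bid * multiplier, 0.0)


print(quality_adjusted_bid(2.00, skip_rate=0.8, conversion_rate=0.01))  # ~1.21
print(quality_adjusted_bid(2.00, skip_rate=0.1, conversion_rate=0.05))  # ~1.95
```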
  • The advertisements with the highest bids are selected by the ad selection module 204 to fill the available advertisement placement positions. Thus, for example, if there are three available positions (say, a pre-roll position for one ad, an interstitial slot for one ad, and a post-roll position for one ad), the three advertisements with the highest bids are selected for the positions. In an example implementation, the advertisements are placed into the positions in their bid order. Thus, for example, the advertisement with the highest bid is placed in the earliest position, the advertisement with the next highest bid is placed in the next position, and so forth.
  • In some other implementations, the bid order determines a priority as to which of the selected advertisements will get a desired position. For example, if there are a pre-roll position, an interstitial slot, and a post-roll position, a highest bidding advertisement that prefers an interstitial slot will get the interstitial slot, while the next advertisement in the bid order can be placed in a pre-roll or a post-roll position even if the advertisement also prefers an interstitial slot.
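  • The following sketch assigns selected advertisements to positions in descending bid order, letting a higher bidder claim its preferred slot first; the simple data model is an assumption made for illustration.

```python
def assign_positions(ads, positions):
    """ads: list of dicts with 'name', 'bid', and 'preferred_position'.
    positions: list of available slot names, e.g. ['pre-roll', 'interstitial', 'post-roll'].

    Ads are processed in descending bid order; each gets its preferred slot if it is
    still free, otherwise the next free slot."""
    free = list(positions)
    assignment = {}
    for ad in sorted(ads, key=lambda a: a["bid"], reverse=True):
        if not free:
            break
        slot = ad["preferred_position"] if ad["preferred_position"] in free else free[0]
        free.remove(slot)
        assignment[slot] = ad["name"]
    return assignment


ads = [
    {"name": "ad_a", "bid": 3.00, "preferred_position": "interstitial"},
    {"name": "ad_b", "bid": 2.50, "preferred_position": "interstitial"},
    {"name": "ad_c", "bid": 1.75, "preferred_position": "post-roll"},
]
print(assign_positions(ads, ["pre-roll", "interstitial", "post-roll"]))
# {'interstitial': 'ad_a', 'pre-roll': 'ad_b', 'post-roll': 'ad_c'}
```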
  • The selected advertisements are transmitted for presentation with the video (508). The ad server 206 transmits the advertisements to the user device 205. The advertisements are received by the advertising module 224 and are placed into their respective positions. The content player module 222 displays the advertisements at their respective positions. For example, the advertisements can be displayed in-stream with a video in the content display region 402 (FIG. 4).
  • In some implementations, after the advertisements are transmitted to the user device 205, the user response to the advertisements and the video can be monitored. For example, data on click-throughs of the advertisement, viewing of the respective advertisements in their entirety (or not), viewing of the video in its entirety, and skipping of the advertisements can be collected by the advertisement provider 201. The advertisement provider can use this data to determine the performance of the advertisements, for example. In an example implementation where the content provider participates in a revenue sharing scheme, the performance data can be used to determine the amount of revenue to which the content provider is entitled. An example of a revenue sharing scheme is disclosed in U.S. patent application Ser. No. 11/755,624, entitled “Flexible Revenue Sharing and Referral Bounty System,” filed May 30, 2007, which is incorporated by reference in its entirety herein.
  • Further, the performance data can be used as a signal to adjust the weights of the scores of comparisons to the video and to the document or to adjust the bids of ads in an auction. For example, if the performance of an ad is good with respect to a particular video, the performance data can be used by the advertisement provider 201 to increase one or more weights with respect to the content provider of the video or with respect to the video. On the other hand, if the performance is poor, the weights can be decreased. As another example, the bid for a poor performing ad can be decreased, and the bid for a well performing ad can be increased.
  • In an example implementation of the process 500 described above, the identification of candidate advertisements begins, after receiving the request for advertisements, with the determination of advertisements that are relevant to the video and to the document based on comparisons to the video content and to the document content, as described above. An advertisement can be scored using a linear combination, for example, of the scores for the individual comparisons. For example, the score from the comparison to the video content can be weighted by a first weight to yield a first weighted score (e.g., αA), and the score from the comparison to the document content can be weighted by a second weight to yield a second weighted score (e.g., βB). The two weighted scores are added to yield the score for the advertisement. Ads with scores above a threshold are identified as candidates for further consideration.
  • The identified advertisements are then filtered by matching them against the content provider preferences. For example, if the content provider preferences specify a desire for interstitial ads only, and each ad can be at most 30 seconds long, then ads targeted for a pre-roll or post-roll position or that are longer than 30 seconds can be disqualified. As another example, if the content provider preferences specify a desire for video ads only, then non-video ads are disqualified.
  • One or more of the non-disqualified ads are selected based on an auction, where each of the non-disqualified ads is associated with a bid. The ads with the top bids are selected, subject to length restrictions. For example, if the auction is for one interstitial slot of 1 minute long, and the ad with the highest bid is 40 seconds long, then the remaining 20 seconds in the slot can be filled by an ad of 20 seconds or shorter with the next highest bid, or it can go unfilled if there are no ads of 20 seconds or shorter in the auction. The ads selected from the auction are transmitted to the user device 205 for presentation to the user. Data on user responses to the advertisements and the video (e.g., click-throughs of the advertisement, viewing of the respective advertisements in their entirety, viewing of the video in its entirety, and skipping of the advertisements) can be collected. The collected data can be used by the advertising provider 201 to adjust the weights in the determination of relevant advertisements and/or adjust the bids of ads in auctions.
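  • A hedged sketch of filling a fixed-length interstitial slot with the highest-bidding ads that still fit, echoing the 1-minute example above, is shown below; the greedy strategy is one possible reading, not a behavior defined by this disclosure.

```python
def fill_slot(slot_seconds, candidate_ads):
    """Greedily fill a slot of slot_seconds with the highest-bidding ads that fit.

    candidate_ads: list of dicts with 'name', 'bid', and 'duration' (seconds).
    Returns the chosen ads; remaining time may go unfilled if nothing fits."""
    remaining = slot_seconds
    chosen = []
    for ad in sorted(candidate_ads, key=lambda a: a["bid"], reverse=True):
        if ad["duration"] <= remaining:
            chosen.append(ad)
            remaining -= ad["duration"]
    return chosen


candidates = [
    {"name": "ad_x", "bid": 4.00, "duration": 40},
    {"name": "ad_y", "bid": 3.00, "duration": 30},   # too long for the remaining 20 seconds
    {"name": "ad_z", "bid": 2.00, "duration": 20},
]
print([ad["name"] for ad in fill_slot(60, candidates)])  # ['ad_x', 'ad_z']
```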
  • FIG. 6 shows an example of a generic computer device 600 and a generic mobile computer device 650, which may be used with the techniques described above. Computing device 600 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, television set-top boxes, servers, blade servers, mainframes, and other appropriate computers. Computing device 650 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, and other similar computing devices. The components shown here, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit the implementations described and/or the claims.
  • Computing device 600 includes a processor 602, memory 604, a storage device 606, a high-speed interface 608 connecting to memory 604 and high-speed expansion ports 610, and a low speed interface 612 connecting to low speed bus 614 and storage device 606. Each of the components 602, 604, 606, 608, 610, and 612, are interconnected using various buses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 602 can process instructions for execution within the computing device 600, including instructions stored in the memory 604 or on the storage device 606 to display graphical information for a GUI on an external input/output device, such as display 616 coupled to high speed interface 608. In other implementations, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple computing devices 600 may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).
  • The memory 604 stores information within the computing device 600. In one implementation, the memory 604 is a volatile memory unit or units. In another implementation, the memory 604 is a non-volatile memory unit or units. The memory 604 may also be another form of computer-readable medium, such as a magnetic or optical disk.
  • The storage device 606 is capable of providing mass storage for the computing device 600. In one implementation, the storage device 606 may be or contain a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations. A computer program product can be tangibly embodied in an information carrier. The computer program product may also contain instructions that, when executed, perform one or more methods, such as those described above. The information carrier is a computer- or machine-readable medium, such as the memory 604, the storage device 606, memory on processor 602, or a propagated signal.
  • The high speed controller 608 manages bandwidth-intensive operations for the computing device 600, while the low speed controller 612 manages lower bandwidth-intensive operations. Such allocation of functions is exemplary only. In one implementation, the high-speed controller 608 is coupled to memory 604, display 616 (e.g., through a graphics processor or accelerator), and to high-speed expansion ports 610, which may accept various expansion cards (not shown). In this implementation, low-speed controller 612 is coupled to storage device 606 and low-speed expansion port 614. The low-speed expansion port, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.
  • The computing device 600 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 620, or multiple times in a group of such servers. It may also be implemented as part of a rack server system 624. In addition, it may be implemented in a personal computer such as a laptop computer 622. Alternatively, components from computing device 600 may be combined with other components in a mobile device (not shown), such as device 650. Each of such devices may contain one or more of computing device 600, 650, and an entire system may be made up of multiple computing devices 600, 650 communicating with each other.
  • Computing device 650 includes a processor 652, memory 664, an input/output device such as a display 654, a communication interface 666, and a transceiver 668, among other components. The device 650 may also be provided with a storage device, such as a microdrive or other device, to provide additional storage. Each of the components 650, 652, 664, 654, 666, and 668, are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate.
  • The processor 652 can execute instructions within the computing device 650, including instructions stored in the memory 664. The processor may be implemented as a chipset of chips that include separate and multiple analog and digital processors. The processor may provide, for example, for coordination of the other components of the device 650, such as control of user interfaces, applications run by device 650, and wireless communication by device 650.
  • Processor 652 may communicate with a user through control interface 658 and display interface 656 coupled to a display 654. The display 654 may be, for example, a TFT (Thin-Film-Transistor Liquid Crystal Display) display or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology. The display interface 656 may comprise appropriate circuitry for driving the display 654 to present graphical and other information to a user. The control interface 658 may receive commands from a user and convert them for submission to the processor 652. In addition, an external interface 662 may be provided in communication with processor 652, so as to enable near area communication of device 650 with other devices. External interface 662 may provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces may also be used.
  • The memory 664 stores information within the computing device 650. The memory 664 can be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units. Expansion memory 674 may also be provided and connected to device 650 through expansion interface 672, which may include, for example, a SIMM (Single In Line Memory Module) card interface. Such expansion memory 674 may provide extra storage space for device 650, or may also store applications or other information for device 650. Specifically, expansion memory 674 may include instructions to carry out or supplement the processes described above, and may include secure information also. Thus, for example, expansion memory 674 may be provided as a security module for device 650, and may be programmed with instructions that permit secure use of device 650. In addition, secure applications may be provided via the SIMM cards, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner.
  • The memory may include, for example, flash memory and/or NVRAM memory, as discussed below. In one implementation, a computer program product is tangibly embodied in an information carrier. The computer program product contains instructions that, when executed, perform one or more methods, such as those described above. The information carrier is a computer- or machine-readable medium, such as the memory 664, expansion memory 674, memory on processor 652, or a propagated signal that may be received, for example, over transceiver 668 or external interface 662.
  • Device 650 may communicate wirelessly through communication interface 666, which may include digital signal processing circuitry where necessary. Communication interface 666 may provide for communications under various modes or protocols, such as GSM voice calls, SMS, EMS, or MMS messaging, CDMA, TDMA, PDC, WCDMA, CDMA2000, or GPRS, among others. Such communication may occur, for example, through radio-frequency transceiver 668. In addition, short-range communication may occur, such as using a Bluetooth, WiFi, or other such transceiver (not shown). In addition, GPS (Global Positioning System) receiver module 670 may provide additional navigation- and location-related wireless data to device 650, which may be used as appropriate by applications running on device 650.
  • Device 650 may also communicate audibly using audio codec 660, which may receive spoken information from a user and convert it to usable digital information. Audio codec 660 may likewise generate audible sound for a user, such as through a speaker, e.g., in a handset of device 650. Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, etc.) and may also include sound generated by applications operating on device 650.
  • The computing device 650 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a cellular telephone 680. It may also be implemented as part of a smartphone 682, personal digital assistant, or other similar mobile device.
  • The disclosed and other embodiments and the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. The disclosed and other embodiments can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, data processing apparatus. The computer-readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus.
  • A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
  • Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • To provide for interaction with a user, the disclosed embodiments can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • The disclosed embodiments can be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of what is disclosed here, or any combination of one or more such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
  • The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • While this specification contains many specifics, these should not be construed as limitations on the scope of what is being claimed or of what may be claimed, but rather as descriptions of features specific to particular embodiments. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
  • Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
  • Thus, particular embodiments have been described. Other embodiments are within the scope of the following claims.

Claims (20)

1. (canceled)
2. A computer-implemented method comprising:
receiving a request for sponsored content for presentation with a content item, wherein the content item is included in a document;
for each of one or more candidate sponsored content items:
calculating a first score based on, at least, a comparison of the candidate sponsored content item to one or more keywords for the content item;
calculating a second score based on, at least, a comparison of content of one or more second documents that link to the document to the candidate sponsored content item;
calculating a final score based on, at least, a combination of the respective first and second scores;
selecting one or more of the candidate sponsored content items based on, at least, the candidate sponsored content item's respective final score; and
providing the selected sponsored content items for presentation with the content item.
3. The method of claim 2 wherein the content of each of the second documents comprises anchor text of one or more hyperlinks that point to the document.
4. The method of claim 2 wherein calculating one of the scores is further based on a comparison of the candidate sponsored content item to content of the document.
5. The method of claim 2 wherein the sponsored content items comprise one or more advertisements.
6. The method of claim 2 wherein the content item comprises a video file or stream.
7. The method of claim 2 wherein at least one of the keywords for the content item was selected based on historical user interaction with the content item when the content item was provided as a search result responsive to a query comprising the at least one keyword.
8. The method of claim 2 wherein calculating the final score based at least partially on a combination of the respective first and second scores comprises weighting the first and second scores, wherein the first and second scores are weighted differently, and wherein a weight of the first or the second score is based at least partially on a measure of past user interactions with the candidate sponsored content item when the candidate sponsored content item was presented with the content item.
9. The method of claim 2 wherein one of the one or more keywords is determined by analysis of the content item.
10. The method of claim 2 wherein selecting one or more of the candidate sponsored content items comprises selecting one or more of the candidate sponsored content items based on respective bids associated with the candidate sponsored content items.
11. A system comprising:
data processing apparatus programmed to perform operations comprising:
receiving a request for sponsored content for presentation with a content item, wherein the content item is included in a document;
for each of one or more candidate sponsored content items:
calculating a first score based on, at least, a comparison of the candidate sponsored content item to one or more keywords for the content item;
calculating a second score based on, at least, a comparison of content of one or more second documents that link to the document to the candidate sponsored content item;
calculating a final score based on, at least, a combination of the respective first and second scores;
selecting one or more of the candidate sponsored content items based on, at least, the candidate sponsored content item's respective final score; and
providing the selected sponsored content items for presentation with the content item.
12. The system of claim 11 wherein the content of each of the second documents comprises anchor text of one or more hyperlinks that point to the document.
13. The system of claim 11 wherein calculating one of the scores is further based on a comparison of the candidate sponsored content item to content of the document.
14. The system of claim 11 wherein the sponsored content items comprise one or more advertisements.
15. The system of claim 11 wherein the content item comprises a video file or stream.
16. The system of claim 11 wherein at least one of the keywords for the content item was selected based on historical user interaction with the content item when the content item was provided as a search result responsive to a query comprising the at least one keyword.
17. The system of claim 11 wherein calculating the final score based at least partially on a combination of the respective first and second scores comprises weighting the first and second scores, wherein the first and second scores are weighted differently, and wherein a weight of the first or the second score is based at least partially on a measure of past user interactions with the candidate sponsored content item when the candidate sponsored content item was presented with the content item.
18. The system of claim 11 wherein one of the one or more keywords is determined by analysis of the content item.
19. The system of claim 11 wherein selecting one or more of the candidate sponsored content items comprises selecting one or more of the candidate sponsored content items based on respective bids associated with the candidate sponsored content items.
20. A program product stored on a machine readable storage medium that, when executed by data processing apparatus, causes the data processing apparatus to perform operations comprising:
receiving a request for sponsored content for presentation with a content item, wherein the content item is included in a document;
for each of one or more candidate sponsored content items:
calculating a first score based on, at least, a comparison of the candidate sponsored content item to one or more keywords for the content item;
calculating a second score based on, at least, a comparison of content of one or more second documents that link to the document to the candidate sponsored content item;
calculating a final score based on, at least, a combination of the respective first and second scores;
selecting one or more of the candidate sponsored content items based on, at least, the candidate sponsored content item's respective final score; and
providing the selected sponsored content items for presentation with the content item.
US13/619,961 2007-06-27 2012-09-14 Selection of advertisements for placement with content Abandoned US20130254802A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/619,961 US20130254802A1 (en) 2007-06-27 2012-09-14 Selection of advertisements for placement with content

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US94670207P 2007-06-27 2007-06-27
US11/850,652 US8433611B2 (en) 2007-06-27 2007-09-05 Selection of advertisements for placement with content
US13/619,961 US20130254802A1 (en) 2007-06-27 2012-09-14 Selection of advertisements for placement with content

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/850,652 Continuation US8433611B2 (en) 2007-06-27 2007-09-05 Selection of advertisements for placement with content

Publications (1)

Publication Number Publication Date
US20130254802A1 true US20130254802A1 (en) 2013-09-26

Family

ID=40161848

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/850,652 Active US8433611B2 (en) 2007-06-27 2007-09-05 Selection of advertisements for placement with content
US13/619,961 Abandoned US20130254802A1 (en) 2007-06-27 2012-09-14 Selection of advertisements for placement with content

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/850,652 Active US8433611B2 (en) 2007-06-27 2007-09-05 Selection of advertisements for placement with content

Country Status (3)

Country Link
US (2) US8433611B2 (en)
EP (1) EP2186047A2 (en)
WO (1) WO2009003179A2 (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140098715A1 (en) * 2012-10-09 2014-04-10 Tv Ears, Inc. System for streaming audio to a mobile device using voice over internet protocol
US8719865B2 (en) 2006-09-12 2014-05-06 Google Inc. Using viewing signals in targeted video advertising
US20140293129A1 (en) * 2013-03-26 2014-10-02 Panasonic Corporation Video reception device and image recognition method for received video
US20140293130A1 (en) * 2013-03-26 2014-10-02 Panasonic Corporation Video reception device and image recognition method for received video
US20150058113A1 (en) * 2013-08-23 2015-02-26 Yahoo! Inc. Dwell time based advertising
US20150058114A1 (en) * 2013-08-23 2015-02-26 Yahoo! Inc. Dwell time based advertising in a scrollable content stream
US9064024B2 (en) 2007-08-21 2015-06-23 Google Inc. Bundle generation
US9128981B1 (en) 2008-07-29 2015-09-08 James L. Geer Phone assisted ‘photographic memory’
US20170148049A1 (en) * 2015-11-25 2017-05-25 Yahoo! Inc. Systems and methods for ad placement in content streams
US9762951B2 (en) 2013-07-30 2017-09-12 Panasonic Intellectual Property Management Co., Ltd. Video reception device, added-information display method, and added-information display system
US9774924B2 (en) 2014-03-26 2017-09-26 Panasonic Intellectual Property Management Co., Ltd. Video reception device, video recognition method and additional information display system
US9792361B1 (en) 2008-07-29 2017-10-17 James L. Geer Photographic memory
US9900650B2 (en) 2013-09-04 2018-02-20 Panasonic Intellectual Property Management Co., Ltd. Video reception device, video recognition method, and additional information display system
US9906843B2 (en) 2013-09-04 2018-02-27 Panasonic Intellectual Property Management Co., Ltd. Video reception device, video recognition method, and display system for providing additional information to be superimposed on displayed image
US9955103B2 (en) 2013-07-26 2018-04-24 Panasonic Intellectual Property Management Co., Ltd. Video receiving device, appended information display method, and appended information display system
US10194216B2 (en) 2014-03-26 2019-01-29 Panasonic Intellectual Property Management Co., Ltd. Video reception device, video recognition method, and additional information display system
US10200765B2 (en) 2014-08-21 2019-02-05 Panasonic Intellectual Property Management Co., Ltd. Content identification apparatus and content identification method
EP3564888A1 (en) 2018-05-04 2019-11-06 Hotmart B.V. Methods and systems for displaying a form associated with a video
US10616613B2 (en) 2014-07-17 2020-04-07 Panasonic Intellectual Property Management Co., Ltd. Recognition data generation device, image recognition device, and recognition data generation method
WO2020113080A1 (en) * 2018-11-29 2020-06-04 Kingston Joseph Peter Systems and methods for integrated marketing
US11210058B2 (en) 2019-09-30 2021-12-28 Tv Ears, Inc. Systems and methods for providing independently variable audio outputs
WO2022150053A1 (en) * 2021-01-07 2022-07-14 Google Llc Selection and provision of digital components during display of content

Families Citing this family (160)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10354280B2 (en) 2004-12-27 2019-07-16 Blue Calypso, Llc System and method for distribution of targeted advertising between mobile communication devices
US9314697B2 (en) 2013-07-26 2016-04-19 Blue Calypso, Llc System and method for advertising distribution through mobile social gaming
US10755313B2 (en) 2004-12-27 2020-08-25 Andrew Levi System and method for distribution of targeted content between mobile communication devices
US20080065721A1 (en) * 2006-09-12 2008-03-13 Brian John Cragun Proximity-based web page content placement mechanism
US8831987B2 (en) * 2006-12-19 2014-09-09 The Rubicon Project Managing bids in a real-time auction for advertisements
KR20090089914A (en) 2006-12-19 2009-08-24 팍스 인터렉티브 미디어, 인크. Auction for each individual ad impression
US8667532B2 (en) 2007-04-18 2014-03-04 Google Inc. Content recognition for targeting video advertisements
US8433611B2 (en) 2007-06-27 2013-04-30 Google Inc. Selection of advertisements for placement with content
US8069414B2 (en) * 2007-07-18 2011-11-29 Google Inc. Embedded video player
US9553947B2 (en) 2007-07-18 2017-01-24 Google Inc. Embedded video playlists
US8091103B2 (en) * 2007-07-22 2012-01-03 Overlay.Tv Inc. Server providing content directories of video signals and linkage to content information sources
GB0717245D0 (en) * 2007-09-05 2007-10-17 Seeman Robert A method of displaying a webpage on a device
US9191450B2 (en) * 2007-09-20 2015-11-17 Disney Enterprises, Inc. Measuring user engagement during presentation of media content
US20090089830A1 (en) * 2007-10-02 2009-04-02 Blinkx Uk Ltd Various methods and apparatuses for pairing advertisements with video files
US20090119169A1 (en) * 2007-10-02 2009-05-07 Blinkx Uk Ltd Various methods and apparatuses for an engine that pairs advertisements with video files
US8416247B2 (en) 2007-10-09 2013-04-09 Sony Computer Entertaiment America Inc. Increasing the number of advertising impressions in an interactive environment
US8572112B2 (en) * 2007-11-02 2013-10-29 Microsoft Corporation Syndicating search queries using web advertising
US8275419B2 (en) * 2007-11-14 2012-09-25 Yahoo! Inc. Advertisements on mobile devices using integrations with mobile applications
US10210533B2 (en) * 2007-11-20 2019-02-19 Redgage Llc Revenue sharing system that optimizes ad revenue with preformatted page generator and preview distribution system
US9043828B1 (en) 2007-12-28 2015-05-26 Google Inc. Placing sponsored-content based on images in video content
US8156001B1 (en) 2007-12-28 2012-04-10 Google Inc. Facilitating bidding on images
US8315423B1 (en) 2007-12-28 2012-11-20 Google Inc. Providing information in an image-based information retrieval system
US8458598B1 (en) * 2008-01-23 2013-06-04 Goldmail, Inc. Customized advertising for online slideshow
US20110191809A1 (en) 2008-01-30 2011-08-04 Cinsay, Llc Viral Syndicated Interactive Product System and Method Therefor
US8312486B1 (en) 2008-01-30 2012-11-13 Cinsay, Inc. Interactive product placement system and method therefor
US11227315B2 (en) 2008-01-30 2022-01-18 Aibuy, Inc. Interactive product placement system and method therefor
US9824372B1 (en) 2008-02-11 2017-11-21 Google Llc Associating advertisements with videos
US8499247B2 (en) 2008-02-26 2013-07-30 Livingsocial, Inc. Ranking interactions between users on the internet
US20090231356A1 (en) * 2008-03-17 2009-09-17 Photometria, Inc. Graphical user interface for selection of options from option groups and methods relating to same
US8965786B1 (en) * 2008-04-18 2015-02-24 Google Inc. User-based ad ranking
WO2009137368A2 (en) 2008-05-03 2009-11-12 Mobile Media Now, Inc. Method and system for generation and playback of supplemented videos
GB0809631D0 (en) * 2008-05-28 2008-07-02 Mirriad Ltd Zonesense
US9183885B2 (en) * 2008-05-30 2015-11-10 Echostar Technologies L.L.C. User-initiated control of an audio/video stream to skip interstitial content between program segments
US20090319648A1 (en) * 2008-06-24 2009-12-24 Mobile Tribe Llc Branded Advertising Based Dynamic Experience Generator
US8458147B2 (en) * 2008-08-20 2013-06-04 Intel Corporation Techniques for the association, customization and automation of content from multiple sources on a single display
US8458053B1 (en) 2008-12-17 2013-06-04 Google Inc. Click-to buy overlays
US20100223126A1 (en) * 2009-03-02 2010-09-02 Tung Kevin W Advertising system and method
US20110071911A1 (en) * 2009-03-02 2011-03-24 Tung Kevin W Advertising system and method
US8600849B1 (en) 2009-03-19 2013-12-03 Google Inc. Controlling content items
US9170995B1 (en) * 2009-03-19 2015-10-27 Google Inc. Identifying context of content items
US9760906B1 (en) 2009-03-19 2017-09-12 Google Inc. Sharing revenue associated with a content item
US9723089B2 (en) * 2009-04-14 2017-08-01 Excalibur Ip, Llc Constructing a data pipeline having scalability and low latency
US20100274667A1 (en) * 2009-04-24 2010-10-28 Nexidia Inc. Multimedia access
US8036990B1 (en) 2009-04-28 2011-10-11 GumGum, Inc. Systems and methods for electronically managing content licenses
US10699235B2 (en) * 2009-05-05 2020-06-30 Oracle America, Inc. System, method and computer readable medium for placing advertisements into web pages
US20100318421A1 (en) * 2009-06-12 2010-12-16 Martin-Cocher Gaelle Christine Method and system to identify the contextual data that was used to perform an advertisement selection
US8719713B2 (en) * 2009-06-17 2014-05-06 Microsoft Corporation Rich entity for contextually relevant advertisements
WO2011004034A1 (en) * 2009-07-10 2011-01-13 Juan Ignacio Pita Riva Method for broadcasting audiovisual content
US8280408B2 (en) 2009-07-17 2012-10-02 At&T Intellectual Property I, Lp Methods, systems and computer program products for tailoring advertisements to a user based on actions taken using a portable electronic device
US20110071901A1 (en) * 2009-09-21 2011-03-24 Alexander Fries Online Advertising Methods and Systems and Revenue Sharing Methods and Systems Related to Same
US11257112B1 (en) 2009-10-15 2022-02-22 Livingsocial, Inc. Ad targeting and display optimization based on social and community data
US20110106615A1 (en) * 2009-11-03 2011-05-05 Yahoo! Inc. Multimode online advertisements and online advertisement exchanges
US9152708B1 (en) 2009-12-14 2015-10-06 Google Inc. Target-video specific co-watched video clusters
US8387086B2 (en) * 2009-12-14 2013-02-26 Microsoft Corporation Controlling ad delivery for video on-demand
US20110184807A1 (en) * 2010-01-28 2011-07-28 Futurewei Technologies, Inc. System and Method for Filtering Targeted Advertisements for Video Content Delivery
US20110197221A1 (en) * 2010-02-11 2011-08-11 Alan Rouse Ad selection based on promotional coupon redemption
US8516063B2 (en) 2010-02-12 2013-08-20 Mary Anne Fletcher Mobile device streaming media application
US20110251896A1 (en) * 2010-04-09 2011-10-13 Affine Systems, Inc. Systems and methods for matching an advertisement to a video
US20110251902A1 (en) * 2010-04-11 2011-10-13 Transaxtions Llc Target Area Based Content and Stream Monetization Using Feedback
US9443147B2 (en) * 2010-04-26 2016-09-13 Microsoft Technology Licensing, Llc Enriching online videos by content detection, searching, and information aggregation
WO2012012812A2 (en) * 2010-07-23 2012-01-26 Lourence Cornelius Johannes Greyvenstein Apparatus and methods of advertising on the internet
CN102232220B (en) * 2010-10-29 2014-04-30 华为技术有限公司 Method and system for extracting and correlating video interested objects
US20120173341A1 (en) * 2010-12-31 2012-07-05 Kun Li Information publishing method, apparatus and system
US9384408B2 (en) 2011-01-12 2016-07-05 Yahoo! Inc. Image analysis system and method using image recognition and text search
US20120251080A1 (en) * 2011-03-29 2012-10-04 Svendsen Jostein Multi-layer timeline content compilation systems and methods
US10739941B2 (en) 2011-03-29 2020-08-11 Wevideo, Inc. Multi-source journal content integration systems and methods and systems and methods for collaborative online content editing
US20120254929A1 (en) * 2011-04-04 2012-10-04 Google Inc. Content Extraction for Television Display
US9256888B2 (en) 2011-04-04 2016-02-09 Zynga Inc. Matching advertising to game play content
US9152984B1 (en) 2011-07-14 2015-10-06 Zynga Inc. Personal ad targeting
US8635519B2 (en) 2011-08-26 2014-01-21 Luminate, Inc. System and method for sharing content based on positional tagging
US8996650B2 (en) 2011-08-26 2015-03-31 Accenture Global Services Limited Preparing content packages
RU2604670C2 (en) 2011-08-29 2016-12-10 Синсэй, Инк. Containerized software for virally copying from one endpoint to another
US9262766B2 (en) 2011-08-31 2016-02-16 Vibrant Media, Inc. Systems and methods for contextualizing services for inline mobile banner advertising
US9966107B1 (en) 2011-09-28 2018-05-08 Amazon Technologies, Inc. Networked media consumption service
US20130086112A1 (en) 2011-10-03 2013-04-04 James R. Everingham Image browsing system and method for a digital content platform
US20130085866A1 (en) * 2011-10-04 2013-04-04 Ilya Levitis Floating smartphone icon messaging system
US8737678B2 (en) 2011-10-05 2014-05-27 Luminate, Inc. Platform for providing interactive applications on a digital content platform
USD736224S1 (en) 2011-10-10 2015-08-11 Yahoo! Inc. Portion of a display screen with a graphical user interface
USD737290S1 (en) 2011-10-10 2015-08-25 Yahoo! Inc. Portion of a display screen with a graphical user interface
US10002355B1 (en) 2011-10-19 2018-06-19 Amazon Technologies, Inc. Licensed media in a remote storage media consumption service
US8255495B1 (en) 2012-03-22 2012-08-28 Luminate, Inc. Digital image and content display systems and methods
US10181131B2 (en) * 2012-03-27 2019-01-15 Google Llc Conditional billing of advertisements based on determined user interest
US8234168B1 (en) 2012-04-19 2012-07-31 Luminate, Inc. Image content and quality assurance system and method
US8495489B1 (en) 2012-05-16 2013-07-23 Luminate, Inc. System and method for creating and displaying image annotations
US9965129B2 (en) 2012-06-01 2018-05-08 Excalibur Ip, Llc Personalized content from indexed archives
US9792285B2 (en) * 2012-06-01 2017-10-17 Excalibur Ip, Llc Creating a content index using data on user actions
US20150186386A1 (en) * 2012-06-13 2015-07-02 Joel Hilliard Video player with enhanced content ordering and method of acquiring content
US9607330B2 (en) 2012-06-21 2017-03-28 Cinsay, Inc. Peer-assisted shopping
US10789631B2 (en) 2012-06-21 2020-09-29 Aibuy, Inc. Apparatus and method for peer-assisted e-commerce shopping
US10373508B2 (en) * 2012-06-27 2019-08-06 Intel Corporation Devices, systems, and methods for enriching communications
JP5593352B2 (en) * 2012-07-10 2014-09-24 ヤフー株式会社 Information providing apparatus, information providing method, and information providing program
US9172999B2 (en) * 2012-08-08 2015-10-27 Verizon Patent And Licensing Inc. Behavioral keyword identification based on thematic channel viewing
US10482487B1 (en) 2012-08-13 2019-11-19 Livingsocial, Inc. Incentivizing sharing in social networks
US9479677B2 (en) * 2012-09-05 2016-10-25 Intel Corporation Protocol for communications between platforms and image devices
US20140136313A1 (en) * 2012-11-14 2014-05-15 Satyam Shaw Categorizing content selections
US10430839B2 (en) * 2012-12-12 2019-10-01 Cisco Technology, Inc. Distributed advertisement insertion in content-centric networks
US9245024B1 (en) * 2013-01-18 2016-01-26 Google Inc. Contextual-based serving of content segments in a video delivery system
US8612226B1 (en) * 2013-01-28 2013-12-17 Google Inc. Determining advertisements based on verbal inputs to applications on a computing device
US20140223271A1 (en) * 2013-02-04 2014-08-07 Google Inc. Systems and methods of creating an animated content item
US11748833B2 (en) 2013-03-05 2023-09-05 Wevideo, Inc. Systems and methods for a theme-based effects multimedia editing platform
US9066122B1 (en) 2013-03-08 2015-06-23 Google Inc. Serving video content segments
US8875177B1 (en) * 2013-03-12 2014-10-28 Google Inc. Serving video content segments
US20150025981A1 (en) * 2013-03-15 2015-01-22 David Zaretsky Url shortening computer-processed platform for processing internet traffic
WO2014150399A1 (en) * 2013-03-15 2014-09-25 Brandstetter Jeffrey D Systems and methods for defining ad spaces in video
US9405775B1 (en) * 2013-03-15 2016-08-02 Google Inc. Ranking videos based on experimental data
US9626691B2 (en) 2013-05-02 2017-04-18 Google Inc. Determining a bid modifier value to maximize a return on investment in a hybrid campaign
CN103327379B (en) * 2013-06-06 2016-08-10 合一信息技术(北京)有限公司 A kind of method and device carrying out advertising matches input according to online video time length
US9634910B1 (en) * 2013-06-14 2017-04-25 Google Inc. Adaptive serving companion shared content
US9986307B2 (en) * 2013-07-19 2018-05-29 Bottle Rocket LLC Interactive video viewing
US10373431B2 (en) 2013-07-26 2019-08-06 Blue Calypso, Llc System and method for advertising distribution through mobile social gaming
US9814985B2 (en) 2013-07-26 2017-11-14 Blue Calypso, Llc System and method for advertising distribution through mobile social gaming
US9367529B1 (en) * 2013-07-31 2016-06-14 Google Inc. Selecting content based on entities
US9720890B2 (en) * 2013-08-06 2017-08-01 Educational Testing Service System and method for rendering an assessment item
US10218954B2 (en) * 2013-08-15 2019-02-26 Cellular South, Inc. Video to data
US9940972B2 (en) * 2013-08-15 2018-04-10 Cellular South, Inc. Video to data
US20150066653A1 (en) * 2013-09-04 2015-03-05 Google Inc. Structured informational link annotations
US9953347B2 (en) 2013-09-11 2018-04-24 Cinsay, Inc. Dynamic binding of live video content
US9832505B2 (en) * 2013-09-24 2017-11-28 Telefonaktiebolaget Lm Ericsson (Publ) Method for inserting an advertisement into a video stream of an application on demand (AoD) service, AoD processing device and AoD server
KR102344237B1 (en) 2013-09-27 2021-12-27 에이아이바이, 인크. Apparatus and method for supporting relationships associated with content provisioning
JP6531105B2 (en) 2013-09-27 2019-06-12 アイバイ,インコーポレイテッド N-level duplication of supplemental content
US9489692B1 (en) * 2013-10-16 2016-11-08 Google Inc. Location-based bid modifiers
US8935247B1 (en) 2013-10-21 2015-01-13 Google Inc. Methods and systems for hierarchically partitioning a data set including a plurality of offerings
US10614491B2 (en) 2013-11-06 2020-04-07 Google Llc Content rate display adjustment between different categories of online documents in a computer network environment
US9275133B1 (en) * 2013-11-13 2016-03-01 Google Inc. Content request identification via a computer network
US20150170222A1 (en) * 2013-12-18 2015-06-18 MaxPoint Interactive, Inc. System and method for controlled purchasing of online advertisements in a real-time bidding environment
US9747618B1 (en) 2013-12-18 2017-08-29 MaxPoint Interactive, Inc. Purchasing pace control in a real-time bidding environment using a multi-loop control scheme
US9304799B2 (en) * 2013-12-27 2016-04-05 International Business Machines Corporation Placement of input / output adapter cards in a server
US9619470B2 (en) 2014-02-04 2017-04-11 Google Inc. Adaptive music and video recommendations
US9258589B2 (en) 2014-02-14 2016-02-09 Pluto, Inc. Methods and systems for generating and providing program guides and content
US10448075B2 (en) 2014-03-06 2019-10-15 Cox Communications, Inc. Content conditioning and distribution of conditioned media assets at a content platform
US11080777B2 (en) 2014-03-31 2021-08-03 Monticello Enterprises LLC System and method for providing a social media shopping experience
KR102137207B1 (en) * 2014-06-06 2020-07-23 삼성전자주식회사 Electronic device, contorl method thereof and system
US9852188B2 (en) * 2014-06-23 2017-12-26 Google Llc Contextual search on multimedia content
US9705832B2 (en) * 2014-08-27 2017-07-11 Lenovo (Singapore) Pte. Ltd. Context-aware aggregation of text-based messages
US9953646B2 (en) 2014-09-02 2018-04-24 Belleau Technologies Method and system for dynamic speech recognition and tracking of prewritten script
US10528982B2 (en) * 2014-09-12 2020-01-07 Facebook, Inc. Determining a prompt for performing an action presented to a user in association with video data
US10956936B2 (en) 2014-12-30 2021-03-23 Spotify Ab System and method for providing enhanced user-sponsor interaction in a media environment, including support for shake action
US20180268475A1 (en) * 2015-02-19 2018-09-20 Billionaired Labs Enabling a Personalized Conversation between Retailer and Customer at Scale
EP3086273A1 (en) * 2015-04-20 2016-10-26 Spoods GmbH A method for data communication between a data processing unit and an end device as well as a system for data communication
US10210218B2 (en) * 2015-06-16 2019-02-19 Salesforce.Com, Inc. Processing a file to generate a recommendation using a database system
US10062015B2 (en) 2015-06-25 2018-08-28 The Nielsen Company (Us), Llc Methods and apparatus for identifying objects depicted in a video using extracted video frames in combination with a reverse image search engine
WO2017132087A1 (en) 2016-01-25 2017-08-03 nToggle, Inc. Platform for programmatic advertising
US9984115B2 (en) * 2016-02-05 2018-05-29 Patrick Colangelo Message augmentation system and method
US10289732B2 (en) * 2016-06-13 2019-05-14 Google Llc Server-based conversion of autoplay content to click-to-play content
US10356480B2 (en) 2016-07-05 2019-07-16 Pluto Inc. Methods and systems for generating and providing program guides and content
US10992726B2 (en) 2016-08-01 2021-04-27 AdsWizz Inc. Detecting sensor-based interactions with client device in conjunction with presentation of content
US11126785B1 (en) * 2017-02-17 2021-09-21 Amazon Technologies, Inc. Artificial intelligence system for optimizing network-accessible content
CN106919690B (en) * 2017-03-03 2021-04-20 北京金山安全软件有限公司 Information shielding method and device and electronic equipment
US10726196B2 (en) * 2017-03-03 2020-07-28 Evolv Technology Solutions, Inc. Autonomous configuration of conversion code to control display and functionality of webpage portions
WO2018176081A1 (en) * 2017-03-28 2018-10-04 ALLT Technologies Pty Ltd A method and a system for associating an object contained in media content with an entity
US11140232B2 (en) * 2017-06-26 2021-10-05 Facebook, Inc. Analyzing geo-spatial data in layers
US11010791B1 (en) 2018-02-27 2021-05-18 Inmar Clearing, Inc. System for generating targeted advertisement content based upon influencer content and related methods
US11087369B1 (en) * 2018-03-16 2021-08-10 Facebook, Inc. Context-based provision of media content
US10560758B1 (en) * 2018-06-12 2020-02-11 Facebook, Inc. Two-stage content item selection process incorporating brand value
WO2020014712A1 (en) 2018-07-13 2020-01-16 Pubwise, LLLP Digital advertising platform with demand path optimization
US11232254B2 (en) * 2018-09-28 2022-01-25 Microsoft Technology Licensing, Llc Editing mechanism for electronic content items
US11163940B2 (en) * 2019-05-25 2021-11-02 Microsoft Technology Licensing Llc Pipeline for identifying supplemental content items that are related to objects in images
CN111163355A (en) * 2019-12-27 2020-05-15 深圳市九洲电器有限公司 Advertisement playing method, device and system
US11526912B2 (en) 2020-08-20 2022-12-13 Iris.TV Inc. Managing metadata enrichment of digital asset portfolios
US20230142904A1 (en) * 2021-11-09 2023-05-11 Honda Motor Co., Ltd. Creation of notes for items of interest mentioned in audio content

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6698020B1 (en) * 1998-06-15 2004-02-24 Webtv Networks, Inc. Techniques for intelligent video ad insertion
US20060218573A1 (en) * 2005-03-04 2006-09-28 Stexar Corp. Television program highlight tagging
US20070100690A1 (en) * 2005-11-02 2007-05-03 Daniel Hopkins System and method for providing targeted advertisements in user requested multimedia content
US20070101359A1 (en) * 2005-11-01 2007-05-03 Broadband Royalty Corporation Generating ad insertion metadata at program file load time
US20070112630A1 (en) * 2005-11-07 2007-05-17 Scanscout, Inc. Techniques for rendering advertisments with rich media
US20070204310A1 (en) * 2006-02-27 2007-08-30 Microsoft Corporation Automatically Inserting Advertisements into Source Video Content Playback Streams
US20070260597A1 (en) * 2006-05-02 2007-11-08 Mark Cramer Dynamic search engine results employing user behavior
US20070282893A1 (en) * 2006-04-24 2007-12-06 Keith Smith System for providing digital content and advertising among multiple entities
US20080010117A1 (en) * 2006-06-14 2008-01-10 Microsoft Corporation Dynamic advertisement insertion in a download service
US20080021775A1 (en) * 2006-07-21 2008-01-24 Videoegg, Inc. Systems and methods for interaction prompt initiated video advertising
US20080027798A1 (en) * 2006-07-25 2008-01-31 Shivkumar Ramamurthi Serving advertisements based on keywords related to a webpage determined using external metadata
US20080033806A1 (en) * 2006-07-20 2008-02-07 Howe Karen N Targeted advertising for playlists based upon search queries
US20080092159A1 (en) * 2006-10-17 2008-04-17 Google Inc. Targeted video advertising
US20080098420A1 (en) * 2006-10-19 2008-04-24 Roundbox, Inc. Distribution and display of advertising for devices in a network
US20080155588A1 (en) * 2006-12-21 2008-06-26 Verizon Data Services Inc. Content hosting and advertising systems and methods
US20080163071A1 (en) * 2006-12-28 2008-07-03 Martin Abbott Systems and methods for selecting advertisements for display over a communications network
US20080249855A1 (en) * 2007-04-04 2008-10-09 Yahoo! Inc. System for generating advertising creatives
US20080275763A1 (en) * 2007-05-03 2008-11-06 Thai Tran Monetization of Digital Content Contributions
US20080319827A1 (en) * 2007-06-25 2008-12-25 Microsoft Corporation Mining implicit behavior
US20120036117A1 (en) * 2005-06-16 2012-02-09 Richard Kazimierz Zwicky Selection of advertisements to present on a web page or other destination based on search activities of users who selected the destination

Family Cites Families (122)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2986345B2 (en) * 1993-10-18 1999-12-06 インターナショナル・ビジネス・マシーンズ・コーポレイション Voice recording indexing apparatus and method
US5664227A (en) * 1994-10-14 1997-09-02 Carnegie Mellon University System and method for skimming digital audio/video data
US5724521A (en) * 1994-11-03 1998-03-03 Intel Corporation Method and apparatus for providing electronic advertisements to end users in a consumer best-fit pricing manner
US5740549A (en) * 1995-06-12 1998-04-14 Pointcast, Inc. Information and advertising distribution system and method
US6026368A (en) * 1995-07-17 2000-02-15 24/7 Media, Inc. On-line interactive system and method for providing content and advertising information to a targeted set of viewers
US5848397A (en) 1996-04-19 1998-12-08 Juno Online Services, L.P. Method and apparatus for scheduling the presentation of messages to computer users
US5948061A (en) * 1996-10-29 1999-09-07 Double Click, Inc. Method of delivery, targeting, and measuring advertising over networks
US6078914A (en) * 1996-12-09 2000-06-20 Open Text Corporation Natural language meta-search system and method
ATE355662T1 (en) * 1997-01-06 2006-03-15 Bellsouth Intellect Pty Corp METHOD AND SYSTEM FOR NETWORK USAGE COLLECTION
US6044376A (en) * 1997-04-24 2000-03-28 Imgis, Inc. Content stream analysis
US6144944A (en) 1997-04-24 2000-11-07 Imgis, Inc. Computer system for efficiently selecting and providing information
AU8072798A (en) * 1997-06-16 1999-01-04 Doubleclick Inc. Method and apparatus for automatic placement of advertising
US6282548B1 (en) * 1997-06-21 2001-08-28 Alexa Internet Automatically generate and displaying metadata as supplemental information concurrently with the web page, there being no link between web page and metadata
US6091416A (en) * 1997-09-29 2000-07-18 International Business Machines Corporation Method, apparatus and computer program product for graphical user interface control and generating a multitool icon
US6167382A (en) 1998-06-01 2000-12-26 F.A.C. Services Group, L.P. Design and production of print advertising and commercial display materials over the Internet
US6141010A (en) * 1998-07-17 2000-10-31 B. E. Technology, Llc Computer interface method and apparatus with targeted advertising
US6754905B2 (en) * 1998-07-23 2004-06-22 Diva Systems Corporation Data structure and methods for providing an interactive program guide
US7185353B2 (en) 2000-08-31 2007-02-27 Prime Research Alliance E., Inc. System and method for delivering statistically scheduled advertisements
US20020083441A1 (en) * 2000-08-31 2002-06-27 Flickinger Gregory C. Advertisement filtering and storage for targeted advertisement systems
US6282713B1 (en) * 1998-12-21 2001-08-28 Sony Corporation Method and apparatus for providing on-demand electronic advertising
US6985882B1 (en) * 1999-02-05 2006-01-10 Directrep, Llc Method and system for selling and purchasing media advertising over a distributed communication network
US6621980B1 (en) * 1999-04-23 2003-09-16 Monkeymedia, Inc. Method and apparatus for seamless expansion of media
US6393158B1 (en) 1999-04-23 2002-05-21 Monkeymedia, Inc. Method and storage device for expanding and contracting continuous play media seamlessly
US6269361B1 (en) * 1999-05-28 2001-07-31 Goto.Com System and method for influencing a position on a search result list generated by a computer network search engine
US6188398B1 (en) * 1999-06-02 2001-02-13 Mark Collins-Rector Targeting advertising using web pages with video
US7165069B1 (en) * 1999-06-28 2007-01-16 Alexa Internet Analysis of search activities of users to identify related network sites
US7293280B1 (en) * 1999-07-08 2007-11-06 Microsoft Corporation Skimming continuous multimedia content
US20010003214A1 (en) * 1999-07-15 2001-06-07 Vijnan Shastri Method and apparatus for utilizing closed captioned (CC) text keywords or phrases for the purpose of automated searching of network-based resources for interactive links to universal resource locators (URL's)
US6401075B1 (en) * 2000-02-14 2002-06-04 Global Network, Inc. Methods of placing, purchasing and monitoring internet advertising
KR20030031471A (en) * 2000-03-31 2003-04-21 유나이티드 비디오 프로퍼티즈, 인크. System and method for metadata-linked advertisements
US7555557B2 (en) * 2000-04-07 2009-06-30 Avid Technology, Inc. Review and approval system
WO2002005140A1 (en) 2000-07-11 2002-01-17 Launch Media, Inc. Online playback system with community bias
US6990496B1 (en) * 2000-07-26 2006-01-24 Koninklijke Philips Electronics N.V. System and method for automated classification of text by time slicing
US20060015904A1 (en) * 2000-09-08 2006-01-19 Dwight Marcus Method and apparatus for creation, distribution, assembly and verification of media
US7925967B2 (en) * 2000-11-21 2011-04-12 Aol Inc. Metadata quality improvement
KR101548473B1 (en) * 2001-02-21 2015-08-28 로비 가이드스, 인크. Systems and methods for interactive program guides with personal video recording features
US20020116716A1 (en) * 2001-02-22 2002-08-22 Adi Sideman Online video editor
US8949878B2 (en) 2001-03-30 2015-02-03 Funai Electric Co., Ltd. System for parental control in video programs based on multimedia content information
US20030061305A1 (en) * 2001-03-30 2003-03-27 Chyron Corporation System and method for enhancing streaming media delivery and reporting
US6976028B2 (en) 2001-06-15 2005-12-13 Sony Corporation Media content creating and publishing system and process
US6996564B2 (en) * 2001-08-13 2006-02-07 The Directv Group, Inc. Proactive internet searching tool
US7055103B2 (en) * 2001-08-28 2006-05-30 Itzhak Lif Method of matchmaking service
KR100464075B1 (en) * 2001-12-28 2004-12-30 엘지전자 주식회사 Video highlight generating system based on scene transition
US7058963B2 (en) * 2001-12-18 2006-06-06 Thomson Licensing Method and apparatus for generating commercial viewing/listening information
US6978470B2 (en) 2001-12-26 2005-12-20 Bellsouth Intellectual Property Corporation System and method for inserting advertising content in broadcast programming
US20030154128A1 (en) * 2002-02-11 2003-08-14 Liga Kevin M. Communicating and displaying an advertisement using a personal video recorder
US7334251B2 (en) * 2002-02-11 2008-02-19 Scientific-Atlanta, Inc. Management of television advertising
JP4005820B2 (en) 2002-02-21 2007-11-14 株式会社東芝 Electronic merchandise distribution system, electronic merchandise distribution method and program
JP2003289521A (en) 2002-03-27 2003-10-10 Toshiba Corp Method of inserting advertisement, distributing system, transmitter, receiver, and program
EP1495635B1 (en) 2002-03-28 2019-05-08 Arris Group, Inc. Automatic advertisement insertion into an interactive television ticker
US7136875B2 (en) 2002-09-24 2006-11-14 Google, Inc. Serving advertisements based on content
US7716161B2 (en) * 2002-09-24 2010-05-11 Google, Inc. Methods and apparatus for serving relevant advertisements
US20050114198A1 (en) * 2003-11-24 2005-05-26 Ross Koningstein Using concepts for ad targeting
US7194527B2 (en) * 2002-06-18 2007-03-20 Microsoft Corporation Media variations browser
US7383258B2 (en) * 2002-10-03 2008-06-03 Google, Inc. Method and apparatus for characterizing documents based on clusters of related words
US7043746B2 (en) * 2003-01-06 2006-05-09 Matsushita Electric Industrial Co., Ltd. System and method for re-assuring delivery of television advertisements non-intrusively in real-time broadcast and time shift recording
KR20040096014A (en) 2003-05-07 2004-11-16 엘지전자 주식회사 Advertisement method in the digital broadcasting
US20050091311A1 (en) * 2003-07-29 2005-04-28 Lund Christopher D. Method and apparatus for distributing multimedia to remote clients
US20050034151A1 (en) * 2003-08-08 2005-02-10 Maven Networks, Inc. System and method of integrating video content with interactive elements
US8041601B2 (en) 2003-09-30 2011-10-18 Google, Inc. System and method for automatically targeting web-based advertisements
US7181447B2 (en) * 2003-12-08 2007-02-20 Iac Search And Media, Inc. Methods and systems for conceptually organizing and presenting information
US7519274B2 (en) * 2003-12-08 2009-04-14 Divx, Inc. File format for multiple track digital data
US20070159522A1 (en) * 2004-02-20 2007-07-12 Harmut Neven Image-based contextual advertisement method and branded barcodes
US7158966B2 (en) * 2004-03-09 2007-01-02 Microsoft Corporation User intent discovery
JP4285287B2 (en) 2004-03-17 2009-06-24 セイコーエプソン株式会社 Image processing apparatus, image processing method and program, and recording medium
US20060053470A1 (en) * 2004-04-30 2006-03-09 Vulcan Inc. Management and non-linear presentation of augmented broadcasted or streamed multimedia content
US20060013555A1 (en) * 2004-07-01 2006-01-19 Thomas Poslinski Commercial progress bar
SG119229A1 (en) * 2004-07-30 2006-02-28 Agency Science Tech & Res Method and apparatus for insertion of additional content into video
US7986372B2 (en) * 2004-08-02 2011-07-26 Microsoft Corporation Systems and methods for smart media content thumbnail extraction
US8156010B2 (en) 2004-08-31 2012-04-10 Intel Corporation Multimodal context marketplace
US20060059510A1 (en) * 2004-09-13 2006-03-16 Huang Jau H System and method for embedding scene change information in a video bitstream
US7657519B2 (en) * 2004-09-30 2010-02-02 Microsoft Corporation Forming intent-based clusters and employing same by search
US20060090182A1 (en) * 2004-10-27 2006-04-27 Comcast Interactive Capital, Lp Method and system for multimedia advertising
US7689458B2 (en) * 2004-10-29 2010-03-30 Microsoft Corporation Systems and methods for determining bid value for content items to be placed on a rendered page
US7266198B2 (en) * 2004-11-17 2007-09-04 General Instrument Corporation System and method for providing authorized access to digital content
JP2006155384A (en) 2004-11-30 2006-06-15 Nippon Telegr & Teleph Corp <Ntt> Video comment input/display method and device, program, and storage medium with program stored
US20060129533A1 (en) * 2004-12-15 2006-06-15 Xerox Corporation Personalized web search method
US20060179453A1 (en) * 2005-02-07 2006-08-10 Microsoft Corporation Image and other analysis for contextual ads
JP4232746B2 (en) 2005-02-24 2009-03-04 ソニー株式会社 Playback device and display control method
US9454762B2 (en) 2005-03-18 2016-09-27 Samuel Robert Gaidemak System and method for the delivery of content to a networked device
US20060214947A1 (en) * 2005-03-23 2006-09-28 The Boeing Company System, method, and computer program product for animating drawings
US20060224496A1 (en) 2005-03-31 2006-10-05 Combinenet, Inc. System for and method of expressive sequential auctions in a dynamic environment on a network
WO2006124193A2 (en) 2005-04-20 2006-11-23 Videoegg, Inc. Browser enabled video manipulation
US7769819B2 (en) * 2005-04-20 2010-08-03 Videoegg, Inc. Video editing with timeline representations
US8156176B2 (en) * 2005-04-20 2012-04-10 Say Media, Inc. Browser based multi-clip video editing
US7809802B2 (en) 2005-04-20 2010-10-05 Videoegg, Inc. Browser based video editing
US20060277567A1 (en) 2005-06-07 2006-12-07 Kinnear D S System and method for targeting audio advertisements
US20070073579A1 (en) * 2005-09-23 2007-03-29 Microsoft Corporation Click fraud resistant learning of click through rate
US8626588B2 (en) * 2005-09-30 2014-01-07 Google Inc. Advertising with audio content
US20070078708A1 (en) * 2005-09-30 2007-04-05 Hua Yu Using speech recognition to determine advertisements relevant to audio content and/or audio content relevant to advertisements
US7937724B2 (en) * 2005-10-27 2011-05-03 E-Cast Inc. Advertising content tracking for an entertainment device
US7484656B2 (en) * 2005-11-15 2009-02-03 International Business Machines Corporation Apparatus, system, and method for correlating a cost of media service to advertising exposure
US20070130602A1 (en) * 2005-12-07 2007-06-07 Ask Jeeves, Inc. Method and system to present a preview of video content
US20070157228A1 (en) * 2005-12-30 2007-07-05 Jason Bayer Advertising with video ad creatives
US9710818B2 (en) * 2006-04-03 2017-07-18 Kontera Technologies, Inc. Contextual advertising techniques for implemented at mobile devices
US8699806B2 (en) 2006-04-12 2014-04-15 Google Inc. Method and apparatus for automatically summarizing video
US7593965B2 (en) 2006-05-10 2009-09-22 Doubledip Llc System of customizing and presenting internet content to associate advertising therewith
US20070277205A1 (en) 2006-05-26 2007-11-29 Sbc Knowledge Ventures L.P. System and method for distributing video data
JP2009540770A (en) 2006-06-12 2009-11-19 インビディ テクノロジーズ コーポレイション System and method for media insertion based on keyword search
US7613691B2 (en) 2006-06-21 2009-11-03 Microsoft Corporation Dynamic insertion of supplemental video based on metadata
US20080004948A1 (en) * 2006-06-28 2008-01-03 Microsoft Corporation Auctioning for video and audio advertising
US20080005166A1 (en) * 2006-06-29 2008-01-03 International Business Machines Corporation Dynamic search result of audio-visual and related content
US8752086B2 (en) * 2006-08-09 2014-06-10 Carson Victor Conant Methods and apparatus for sending content to a media player
US8243017B2 (en) * 2006-09-11 2012-08-14 Apple Inc. Menu overlay including context dependent menu icon
US20080066107A1 (en) * 2006-09-12 2008-03-13 Google Inc. Using Viewing Signals in Targeted Video Advertising
KR101443404B1 (en) 2006-09-15 2014-10-02 구글 인코포레이티드 Capture and display of annotations in paper and electronic documents
US7559017B2 (en) * 2006-12-22 2009-07-07 Google Inc. Annotation framework for video
US20080155585A1 (en) * 2006-12-22 2008-06-26 Guideworks, Llc Systems and methods for viewing substitute media while fast forwarding past an advertisement
US20080189169A1 (en) * 2007-02-01 2008-08-07 Enliven Marketing Technologies Corporation System and method for implementing advertising in an online social network
US20080201361A1 (en) * 2007-02-16 2008-08-21 Alexander Castro Targeted insertion of an audio - video advertising into a multimedia object
US20080229353A1 (en) 2007-03-12 2008-09-18 Microsoft Corporation Providing context-appropriate advertisements in video content
US7912217B2 (en) * 2007-03-20 2011-03-22 Cisco Technology, Inc. Customized advertisement splicing in encrypted entertainment sources
US8667532B2 (en) 2007-04-18 2014-03-04 Google Inc. Content recognition for targeting video advertisements
US20080276266A1 (en) 2007-04-18 2008-11-06 Google Inc. Characterizing content for identification of advertising
US20090165041A1 (en) * 2007-12-21 2009-06-25 Penberthy John S System and Method for Providing Interactive Content with Video Content
US20080300974A1 (en) 2007-05-30 2008-12-04 Google Inc. Flexible Revenue Sharing and Referral Bounty System
US20080306999A1 (en) 2007-06-08 2008-12-11 Finger Brienne M Systems and processes for presenting informational content
US8433611B2 (en) 2007-06-27 2013-04-30 Google Inc. Selection of advertisements for placement with content
US7853601B2 (en) * 2007-11-19 2010-12-14 Yume, Inc. Method for associating advertisements with relevant content
US7966632B1 (en) * 2007-12-12 2011-06-21 Google Inc. Visual presentation of video recommendations
EP2285236A1 (en) 2008-05-30 2011-02-23 DSM IP Assets B.V. Use of succinic acid
US8365227B2 (en) * 2009-12-02 2013-01-29 Nbcuniversal Media, Llc Methods and systems for online recommendation

Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6698020B1 (en) * 1998-06-15 2004-02-24 Webtv Networks, Inc. Techniques for intelligent video ad insertion
US20060218573A1 (en) * 2005-03-04 2006-09-28 Stexar Corp. Television program highlight tagging
US20120036117A1 (en) * 2005-06-16 2012-02-09 Richard Kazimierz Zwicky Selection of advertisements to present on a web page or other destination based on search activities of users who selected the destination
US20070101359A1 (en) * 2005-11-01 2007-05-03 Broadband Royalty Corporation Generating ad insertion metadata at program file load time
US20070100690A1 (en) * 2005-11-02 2007-05-03 Daniel Hopkins System and method for providing targeted advertisements in user requested multimedia content
US20070112630A1 (en) * 2005-11-07 2007-05-17 Scanscout, Inc. Techniques for rendering advertisments with rich media
US20070204310A1 (en) * 2006-02-27 2007-08-30 Microsoft Corporation Automatically Inserting Advertisements into Source Video Content Playback Streams
US20070282893A1 (en) * 2006-04-24 2007-12-06 Keith Smith System for providing digital content and advertising among multiple entities
US20070260597A1 (en) * 2006-05-02 2007-11-08 Mark Cramer Dynamic search engine results employing user behavior
US20080010117A1 (en) * 2006-06-14 2008-01-10 Microsoft Corporation Dynamic advertisement insertion in a download service
US20080033806A1 (en) * 2006-07-20 2008-02-07 Howe Karen N Targeted advertising for playlists based upon search queries
US20080021775A1 (en) * 2006-07-21 2008-01-24 Videoegg, Inc. Systems and methods for interaction prompt initiated video advertising
US20080027798A1 (en) * 2006-07-25 2008-01-31 Shivkumar Ramamurthi Serving advertisements based on keywords related to a webpage determined using external metadata
US20080092159A1 (en) * 2006-10-17 2008-04-17 Google Inc. Targeted video advertising
US20080098420A1 (en) * 2006-10-19 2008-04-24 Roundbox, Inc. Distribution and display of advertising for devices in a network
US20080155588A1 (en) * 2006-12-21 2008-06-26 Verizon Data Services Inc. Content hosting and advertising systems and methods
US20080163071A1 (en) * 2006-12-28 2008-07-03 Martin Abbott Systems and methods for selecting advertisements for display over a communications network
US20080249855A1 (en) * 2007-04-04 2008-10-09 Yahoo! Inc. System for generating advertising creatives
US20080275763A1 (en) * 2007-05-03 2008-11-06 Thai Tran Monetization of Digital Content Contributions
US20080319827A1 (en) * 2007-06-25 2008-12-25 Microsoft Corporation Mining implicit behavior

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WO 2004/111771, Published on 12/23/2004, by Bharat et al. *

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8719865B2 (en) 2006-09-12 2014-05-06 Google Inc. Using viewing signals in targeted video advertising
US9064024B2 (en) 2007-08-21 2015-06-23 Google Inc. Bundle generation
US9569523B2 (en) 2007-08-21 2017-02-14 Google Inc. Bundle generation
US11086929B1 (en) 2008-07-29 2021-08-10 Mimzi LLC Photographic memory
US11782975B1 (en) 2008-07-29 2023-10-10 Mimzi, Llc Photographic memory
US11308156B1 (en) 2008-07-29 2022-04-19 Mimzi, Llc Photographic memory
US9128981B1 (en) 2008-07-29 2015-09-08 James L. Geer Phone assisted ‘photographic memory’
US9792361B1 (en) 2008-07-29 2017-10-17 James L. Geer Photographic memory
US8774172B2 (en) * 2012-10-09 2014-07-08 Heartv Llc System for providing secondary content relating to a VoIp audio session
US20140098715A1 (en) * 2012-10-09 2014-04-10 Tv Ears, Inc. System for streaming audio to a mobile device using voice over internet protocol
US20140293130A1 (en) * 2013-03-26 2014-10-02 Panasonic Corporation Video reception device and image recognition method for received video
US20140293129A1 (en) * 2013-03-26 2014-10-02 Panasonic Corporation Video reception device and image recognition method for received video
US9131184B2 (en) * 2013-03-26 2015-09-08 Panasonic Intellectual Property Management Co., Ltd. Video reception device and image recognition method for received video
US9148610B2 (en) * 2013-03-26 2015-09-29 Panasonic Intellectual Property Management Co., Ltd. Video reception device and image recognition method for received video
US9955103B2 (en) 2013-07-26 2018-04-24 Panasonic Intellectual Property Management Co., Ltd. Video receiving device, appended information display method, and appended information display system
US9762951B2 (en) 2013-07-30 2017-09-12 Panasonic Intellectual Property Management Co., Ltd. Video reception device, added-information display method, and added-information display system
US10600074B2 (en) 2013-08-23 2020-03-24 Oath Inc. Dwell time based advertising in a scrollable content stream
US10108977B2 (en) * 2013-08-23 2018-10-23 Oath Inc. Dwell time based advertising in a scrollable content stream
US20150058113A1 (en) * 2013-08-23 2015-02-26 Yahoo! Inc. Dwell time based advertising
US20150058114A1 (en) * 2013-08-23 2015-02-26 Yahoo! Inc. Dwell time based advertising in a scrollable content stream
US9906843B2 (en) 2013-09-04 2018-02-27 Panasonic Intellectual Property Management Co., Ltd. Video reception device, video recognition method, and display system for providing additional information to be superimposed on displayed image
US9900650B2 (en) 2013-09-04 2018-02-20 Panasonic Intellectual Property Management Co., Ltd. Video reception device, video recognition method, and additional information display system
US9906844B2 (en) 2014-03-26 2018-02-27 Panasonic Intellectual Property Management Co., Ltd. Video reception device, video recognition method and additional information display system
US10194216B2 (en) 2014-03-26 2019-01-29 Panasonic Intellectual Property Management Co., Ltd. Video reception device, video recognition method, and additional information display system
US9774924B2 (en) 2014-03-26 2017-09-26 Panasonic Intellectual Property Management Co., Ltd. Video reception device, video recognition method and additional information display system
US10616613B2 (en) 2014-07-17 2020-04-07 Panasonic Intellectual Property Management Co., Ltd. Recognition data generation device, image recognition device, and recognition data generation method
US10200765B2 (en) 2014-08-21 2019-02-05 Panasonic Intellectual Property Management Co., Ltd. Content identification apparatus and content identification method
US10832276B2 (en) * 2015-11-25 2020-11-10 Oath Inc. Systems and methods for ad placement in content streams
US20170148049A1 (en) * 2015-11-25 2017-05-25 Yahoo! Inc. Systems and methods for ad placement in content streams
EP3564888A1 (en) 2018-05-04 2019-11-06 Hotmart B.V. Methods and systems for displaying a form associated with a video
US11490147B2 (en) 2018-05-04 2022-11-01 Hotmart B.V. Methods and systems for displaying a (payment) form associated with a video
WO2020113080A1 (en) * 2018-11-29 2020-06-04 Kingston Joseph Peter Systems and methods for integrated marketing
US11341567B2 (en) 2018-11-29 2022-05-24 Joseph Peter Kingston Systems and methods for integrated marketing
US11210058B2 (en) 2019-09-30 2021-12-28 Tv Ears, Inc. Systems and methods for providing independently variable audio outputs
WO2022150053A1 (en) * 2021-01-07 2022-07-14 Google Llc Selection and provision of digital components during display of content

Also Published As

Publication number Publication date
US8433611B2 (en) 2013-04-30
US20090006375A1 (en) 2009-01-01
WO2009003179A2 (en) 2008-12-31
WO2009003179A3 (en) 2009-09-17
EP2186047A2 (en) 2010-05-19

Similar Documents

Publication Publication Date Title
US8433611B2 (en) Selection of advertisements for placement with content
US11915263B2 (en) Device functionality-based content selection
US10299015B1 (en) Time-based content presentation
US20080276266A1 (en) Characterizing content for identification of advertising
US8315423B1 (en) Providing information in an image-based information retrieval system
KR101531907B1 (en) Improved advertising with video ad creatives
US9043828B1 (en) Placing sponsored-content based on images in video content
US20080066107A1 (en) Using Viewing Signals in Targeted Video Advertising
US8346604B2 (en) Facilitating bidding on images
US20170213248A1 (en) Placing sponsored-content associated with an image
KR20110054074A (en) Improved advertising with audio content
AU2011203560B2 (en) Improved advertising with video ad creatives

Legal Events

Date Code Title Description
AS Assignment

Owner name: GOOGLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LAX, REUVEN;ARANKALLE, POORVA;SAMADI, SHAMIM;AND OTHERS;SIGNING DATES FROM 20070905 TO 20070906;REEL/FRAME:029065/0015

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION