US20130326521A1 - Method of associating multiple applications - Google Patents

Method of associating multiple applications

Info

Publication number
US20130326521A1
US20130326521A1 (application No. US 13/606,657)
Authority
US
United States
Prior art keywords
application
application program
related information
information
search
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/606,657
Inventor
Munetaka Tsuda
Ryoma AOKI
Yasuto KAKIMOTO
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nintendo Co Ltd
Original Assignee
Nintendo Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nintendo Co Ltd
Assigned to NINTENDO CO., LTD. Assignors: AOKI, RYOMA; KAKIMOTO, YASUTO; TSUDA, MUNETAKA (assignment of assignors' interest; see document for details)
Publication of US20130326521A1
Legal status: Abandoned

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 — Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 — Details of database functions independent of the retrieved data types
    • G06F 16/95 — Retrieval from the web
    • G06F 16/953 — Querying, e.g. by the use of web search engines
    • G06F 16/9538 — Presentation of query results
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 — Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 — Details of database functions independent of the retrieved data types
    • G06F 16/95 — Retrieval from the web
    • G06F 16/951 — Indexing; Web crawling techniques

Definitions

  • the technology relates to executing multiple application programs.
  • a so-called multitask function is known in which, while one application program is being executed, another application program may additionally be activated.
  • An exemplary embodiment provides an information-processing device including: an acquisition unit configured to acquire application-related information on a first application program; and a presentation unit configured, when a second application program capable of performing a search is activated, after activation of the first application program, to present to a user the application-related information acquired by the acquisition unit as a candidate for an item to be searched for.
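The acquisition/presentation arrangement above can be sketched as follows. This is a minimal illustration only; the class and method names (AcquisitionUnit, PresentationUnit, on_second_app_activated) are assumptions, not taken from the patent.

```python
class AcquisitionUnit:
    """Acquires application-related information on a first application program."""

    def __init__(self, related_info):
        # related_info maps an application id to its related information,
        # e.g. the title of a game (hypothetical storage layout).
        self._related_info = related_info

    def acquire(self, app_id):
        return self._related_info.get(app_id)


class PresentationUnit:
    """When a search-capable second application is activated, presents the
    acquired information as a candidate for an item to be searched for."""

    def __init__(self, acquisition_unit):
        self._acquisition_unit = acquisition_unit

    def on_second_app_activated(self, first_app_id):
        info = self._acquisition_unit.acquire(first_app_id)
        # Present the information to the user, e.g. as a balloon message.
        return f'SEARCH FOR "{info}"' if info else None


units = PresentationUnit(AcquisitionUnit({"game": "ADVENTURES OF A PRINCE"}))
print(units.on_second_app_activated("game"))  # SEARCH FOR "ADVENTURES OF A PRINCE"
```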
  • FIG. 1 shows an exemplary non-limiting diagram showing a configuration of a display system
  • FIG. 2 shows an exemplary non-limiting block diagram showing a hardware configuration of the controller
  • FIG. 3 shows an exemplary non-limiting block diagram showing a hardware configuration of a main device
  • FIG. 4 shows an exemplary non-limiting block diagram showing a principal functional configuration of the main device
  • FIG. 5 shows an exemplary non-limiting flowchart showing a process executed by the main device
  • FIG. 6 shows an exemplary non-limiting diagram showing exemplary images displayed on the monitor and the controller
  • FIG. 7 shows an exemplary non-limiting diagram showing exemplary images displayed on the monitor and the controller
  • FIG. 8 shows an exemplary non-limiting flowchart showing a process executed by the main device
  • FIG. 9 shows an exemplary non-limiting diagram showing an exemplary image displayed on the controller
  • FIG. 10 shows an exemplary non-limiting diagram showing an exemplary image displayed on the controller.
  • FIG. 11 shows an exemplary non-limiting diagram showing an exemplary image displayed on the controller.
  • FIG. 1 is a diagram showing a configuration of display system 10 .
  • Display system 10 is a system for displaying a variety of images in accordance with operations performed by a user.
  • Display system 10 includes controller 100 , monitor 200 , and main device 300 .
  • Controller 100 is a portable display device, and also is an operation terminal via which a user performs a variety of operations. Controller 100 is provided with an operation means such as buttons 161 , in addition to display region 141 . Controller 100 causes a variety of images to be displayed in display region 141 based on data transmitted from main device 300 , generates operation information indicating an operation state of buttons 161 , and transmits the generated operation information to main device 300 .
  • Monitor 200 is a stationary-type display device, and may be, for example, a television set for receiving a television broadcast or a display of a personal computer. It is assumed that monitor 200 has display region 201 having a larger size than display region 141 of controller 100 , though display region 201 may be the same size as or smaller than display region 141 of controller 100 .
  • Main device 300 is a computer that executes programs and serves as an information-processing device for controlling operation of controller 100 and monitor 200 .
  • Main device 300 is connected to each of controller 100 and monitor 200 via wired or wireless communication. It is assumed here that main device 300 conducts wireless communication with controller 100 and conducts wired communication with monitor 200 .
  • Main device 300 has a multitask function whereby it is capable of executing multiple application programs in parallel. In accordance with this multitask function, main device 300 executes in parallel, for example, a game program in use by a user and a browser program for allowing the user to browse web pages.
  • Main device 300 causes at least one of controller 100 and monitor 200 to display an image. Depending on a user operation and/or a type of an image to be displayed, main device 300 may cause only one of controller 100 and monitor 200 to display an image or may cause each of controller 100 and monitor 200 to display an image. It is to be noted that, in a case where an image is displayed on each of controller 100 and monitor 200 , main device 300 may cause the same image to be displayed on controller 100 and monitor 200 , or may cause different images to be displayed on controller 100 and monitor 200 .
  • main device 300 causes monitor 200 to display a captured image (a so-called “screen shot”), which was displayed on controller 100 and monitor 200 during playing of the game immediately before activation of the browser program, and causes controller 100 to display an image of the newly activated browser program.
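The display switch described above, in which the monitor keeps a screenshot of the game while the controller switches to the browser, can be sketched as a routing function. Names and the dict-based state are illustrative assumptions:

```python
def route_displays(state):
    """Decide which image each display shows, per the behaviour described above.

    `state` is a hypothetical dict with keys 'browser_active' and 'screenshot'.
    """
    if state["browser_active"]:
        # Monitor shows a screenshot captured just before browser activation;
        # the controller switches to the browser image.
        return {"monitor": state["screenshot"], "controller": "browser_image"}
    # Otherwise, both displays show their respective game images.
    return {"monitor": "game_image_monitor", "controller": "game_image_controller"}


print(route_displays({"browser_active": True, "screenshot": "game_screenshot"}))
```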
  • main device 300 presents to the user information relating to the game program (hereinafter referred to as application-related information), such as the title of the game being played, as a candidate for a keyword indicating an item of information to be searched for (hereinafter, referred to as a search item), together with the image of the browser program displayed on controller 100 .
  • This presentation may be achieved in any way, such as by display of an image representing the application-related information or by output of sound representing the application-related information, so long as the information is conveyed to the user.
  • explanation will be given taking display of an image as an example.
  • a search engine on the Internet or the like operates in response to a request from the browser program, and main device 300 can obtain a result of the search performed by the search engine.
  • FIG. 2 is a block diagram showing a hardware configuration of controller 100 .
  • Controller 100 includes control unit 110 , auxiliary storage unit 120 , communication unit 130 , display 140 , touch screen unit 150 , and operation unit 160 .
  • Control unit 110 is a means for controlling operations of various units of controller 100 .
  • Control unit 110 includes a processing device such as a CPU (Central Processing Unit), a memory serving as a main memory device, an input/output interface for communicating information with various units of controller 100 , and so on, and executes a program(s) to control display of images or data transmission and reception to and from main device 300 .
  • Auxiliary storage unit 120 is a means for storing data used by control unit 110 .
  • Auxiliary storage unit 120 is a flash memory, for example. It is to be noted that auxiliary storage unit 120 may include a detachable storage medium such as a so-called memory card.
  • Communication unit 130 is a means for communicating with main device 300 .
  • Communication unit 130 includes an antenna or the like for communicating with main device 300 wirelessly.
  • Display 140 is a means for displaying an image.
  • Display 140 includes a display panel having pixels formed by liquid crystal elements or organic EL (electroluminescence) elements, and a drive circuit for driving the display panel, and displays, in display region 141 , an image in accordance with image data provided from control unit 110 .
  • Touch screen unit 150 is a means for receiving an operation performed by a user, and generating coordinate information that represents a position in display region 141 , to supply the coordinate information to control unit 110 .
  • Touch screen unit 150 includes a sensor disposed to overlap display region 141 , and a control circuit for generating coordinate information representing a position detected by the sensor and providing the coordinate information to control unit 110 .
  • Touch screen unit 150 may be of resistive type, or may be of another type such as capacitive type.
  • Operation unit 160 is another means for receiving an operation performed by a user.
  • Operation unit 160 includes the aforementioned buttons 161 , and provides control unit 110 with operation information in accordance with an operation performed by a user.
  • FIG. 3 is a block diagram showing a hardware configuration of main device 300 .
  • Main device 300 includes control unit 310 , auxiliary storage unit 320 , disk drive unit 330 , network communication unit 340 , terminal communication unit 350 , and AV (Audio and Visual) interface unit 360 .
  • Control unit 310 is a means for controlling operations of various units of main device 300 by executing a program(s), and corresponds to a “computer” in the exemplary embodiment.
  • Control unit 310 includes a processing device such as a CPU, a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor) or the like, a memory serving as a main memory device or a VRAM (Video Random Access Memory), an input/output interface for communicating information with various units of main device 300 , and so on.
  • Auxiliary storage unit 320 is a means for storing data used by control unit 310 .
  • Auxiliary storage unit 320 is a flash memory or a hard disk, for example, but may include a detachable storage medium such as a memory card.
  • Auxiliary storage unit 320 is capable of storing programs to be executed by control unit 310 and data acquired via network communication unit 340 or terminal communication unit 350 .
  • the programs stored in auxiliary storage unit 320 include a game program for presenting a game to a user and a browser program for browsing web pages.
  • Disk drive unit 330 is a means for reading data stored in an optical disk (optical storage medium).
  • the optical disk may store data used for playing a game, such as a game program, for example. It is to be noted that disk drive unit 330 may read data stored in another storage medium such as a magneto-optical disk or a semiconductor memory.
  • Network communication unit 340 is a means for communicating via a network such as the Internet. The communication performed by network communication unit 340 may be wired or wireless communication. Network communication unit 340 receives data from an external server device or transmits data thereto in accordance with instructions from control unit 310 .
  • Terminal communication unit 350 is a means for communicating with controller 100 . In a case where a controller other than controller 100 is used, terminal communication unit 350 may communicate with the other controller.
  • the wireless communication performed by terminal communication unit 350 may utilize any communication technology such as Wi-Fi, Bluetooth, or infrared communication.
  • AV interface 360 is a means for supplying to monitor 200 image data, sound data, or the like.
  • AV interface 360 includes one or more interfaces such as an HDMI (High-Definition Multimedia Interface) terminal or the like.
  • FIG. 4 is a block diagram (functional block diagram) showing a principal functional configuration of main device 300 .
  • Main device 300 includes storage unit 301 , acquisition unit 302 , presentation unit 303 , and instruction unit 304 . The functions of these units are realized by execution of one or more programs by control unit 310 of main device 300 . It is to be noted that main device 300 does not have to include every unit shown in FIG. 4 . Further, main device 300 may realize the functions of the units shown in FIG. 4 by executing a single program or multiple programs.
  • Storage unit 301 is a means realized by auxiliary storage unit 320 , for example, and stores application-related information, which is information relating to a first application program, in association with a type of the first application program.
  • Acquisition unit 302 is a means realized by control unit 310 , for example, and acquires the application -related information, which is information relating to the first application program. For example, in a case where the first application program is a game program and the application-related information is the title of the game, acquisition unit 302 acquires the title of the game from storage unit 301 .
  • Presentation unit 303 is a means realized by control unit 310 and terminal communication unit 350 , for example, and, when a second application program, by which a search can be performed, is activated after activation of the first application program, presents to a user the application-related information relating to the first application program and acquired by the acquisition unit, as a candidate for a search item in the search.
  • presentation unit 303 transmits to controller 100 image data representing the title of the game, for example, to cause the title to be displayed on display 140 , thereby presenting to the user the title as a keyword serving as a candidate for a search item.
  • Instruction unit 304 is a means realized by control unit 310 , for example, and, in response to an operation performed by a user for the application-related information presented by presentation unit 303 , instructs the second application program to search for information relating to the application-related information.
  • In a case where the first application program is a game program, the second application program is a browser program, and the application-related information is the title of the game, instruction unit 304 instructs the browser program to search for information relating to the title of the game.
  • information relating to the title of the game such as an effective way to complete the game, is retrieved by a search engine on the Internet 400 or the like, as a search result.
  • the obtained search result is transmitted from main device 300 to controller 100 to be displayed on display 140 .
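The instruction-unit flow above, in which the browser is told to search for the related information and the result is returned for display, might look like the following sketch. The callable standing in for the browser's search-engine request is a stand-in, not a real API:

```python
def instruct_search(browser_search, related_info):
    """Instruction-unit sketch: ask the second application (browser) to search
    for information relating to the application-related information."""
    results = browser_search(related_info)
    # In the device, the result would be rendered as a search result image
    # and transmitted to controller 100 for display.
    return {"query": related_info, "results": results}


def fake_engine(query):
    # Stand-in for a search engine on the Internet; purely illustrative.
    return [f"walkthrough for {query}", f"tips for {query}"]


out = instruct_search(fake_engine, "ADVENTURES OF A PRINCE")
print(out["results"][0])  # walkthrough for ADVENTURES OF A PRINCE
```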
  • FIGS. 5 and 8 are flowcharts showing a process performed by main device 300 .
  • Upon activation of a game program, control unit 310 of main device 300 generates game data according to a procedure described in the game program, where the game data includes image data representing game images, sound data representing the sound output while each game image is displayed, and so on (step S 1 ).
  • the game data includes two types of game data: game data for the controller, which is to be output from controller 100 ; and game data for the monitor, which is to be output from monitor 200 .
  • Control unit 310 generates the two types of game data.
  • control unit 310 transmits the game data for the controller to controller 100 via terminal communication unit 350 , and transmits the game data for the monitor to monitor 200 via AV interface 360 (step S 2 ).
  • control unit 110 of controller 100 displays, in display region 141 , a game image based on the image data, and outputs sound according to the sound data.
  • monitor 200 displays, in display region 201 , a game image based on the image data, and outputs sound according to the sound data.
  • FIG. 6 is a diagram showing exemplary game images displayed on controller 100 and monitor 200 .
  • Displayed in display region 201 of monitor 200 is an image showing character C 1 , which is a main character in a game, and character C 2 , which is a monster against which the main character fights, viewed from the side in a virtual three-dimensional space.
  • displayed in display region 141 of controller 100 is an image showing characters C 1 and C 2 viewed from above in the virtual three-dimensional space.
  • If a user performs an operation with controller 100 during the progress of the game (step S 3 ; YES), control unit 110 of controller 100 generates operation information representing the operation and transmits the operation information to main device 300 .
  • Control unit 310 of main device 300 acquires the operation information from controller 100 .
  • Control unit 310 determines whether the operation represented by the operation information is an operation relating to progress of the game (step S 4 ). If the operation is an operation relating to the progress of the game (step S 4 ; YES), control unit 310 identifies an event corresponding to the operation (step S 5 ), and generates game data by applying the event to the procedure described in the game program (step S 1 ). While the game is in progress, step S 1 to step S 5 are repeated.
  • control unit 310 determines whether the operation represented by the operation information is an operation instructing return to display of a home screen (step S 6 ).
  • a home screen is an image displayed first to guide the user to various functions provided by main device 300 . If the operation is an operation for returning to the home screen (step S 6 ; YES), control unit 310 temporarily suspends execution of the game program, reads out image data of the home screen from auxiliary storage unit 320 , and transmits the image data of the home screen to controller 100 via terminal communication unit 350 and to monitor 200 via AV interface 360 (step S 7 ).
  • each of controller 100 (via control unit 110 ) and monitor 200 displays the home screen based on the image data of the home screen.
  • Arranged in this home screen are, in addition to a soft key for activating a browser program, a soft key for viewing television, a soft key for displaying a so-called help function, and soft keys for displaying various other menus, for example.
  • When an operation for activating a browser program is performed in the home screen (step S 8 ; YES), control unit 310 activates the browser program (step S 9 ). Then, control unit 310 transmits to monitor 200 game image data, which is image data for the game displayed on each of monitor 200 and controller 100 immediately before transition to the home screen (namely, immediately before temporary suspension of execution of the game program), and transmits to controller 100 browser image data representing an image of the browser program (step S 10 ).
  • control unit 310 reads out the title of the game stored in association with the game program (for example, “ADVENTURES OF A PRINCE”) from auxiliary storage unit 320 , and transmits the title of the game to controller 100 as a part of the browser image data.
  • monitor 200 displays a game image based on the game image data.
  • control unit 110 of controller 100 displays a browser image based on the browser image data. It is to be noted that, if the determination in steps S 6 and step S 8 is “NO,” control unit 310 of main device 300 performs a predetermined corresponding process (step S 11 ).
  • control unit 310 of main device 300 restarts execution of the game program from the point at which the procedure was suspended. Namely, in accordance with the procedure described in the game program, control unit 310 generates game data for the controller and game data for the monitor, and transmits them to controller 100 and monitor 200 , respectively. As described in the foregoing, control unit 310 serves as a return unit for restarting progress of a game, caused by execution of a game program, from a point at which the progress was suspended.
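The suspend/resume behaviour of the "return unit" above can be modelled minimally. Representing game state by a step counter is an assumption made for illustration:

```python
class GameProgram:
    """Minimal model of the suspend/resume ('return unit') behaviour."""

    def __init__(self):
        self.step = 0          # stand-in for game progress/state
        self.suspended = False

    def advance(self):
        if self.suspended:
            raise RuntimeError("game is suspended")
        self.step += 1

    def suspend(self):
        # Temporarily suspend on transition to the home screen.
        self.suspended = True

    def resume(self):
        # Restart from the point at which progress was suspended.
        self.suspended = False


game = GameProgram()
game.advance()
game.advance()
game.suspend()   # user returns to the home screen
game.resume()    # game restarts from the suspension point
game.advance()
print(game.step)  # 3
```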
  • FIG. 7 is a diagram showing exemplary images displayed on controller 100 and monitor 200 when a browser program has been activated.
  • Displayed in display region 201 of monitor 200 are game image g 1 , which was displayed in display region 201 of monitor 200 immediately before transition to the home screen, and game image g 2 , which was displayed in region 141 of controller 100 immediately before transition to the home screen.
  • displayed in display region 141 is a browser image.
  • message m ‘SEARCH FOR “ADVENTURES OF A PRINCE”’ is displayed in connection with soft key SK 5 for transition to a search image, in the form of a so-called balloon superimposed on the browser image.
  • a user recognizes this string of characters “ADVENTURES OF A PRINCE” as a candidate for a search item.
  • When a browser program is activated, in place of an image displayed as a result of execution of a game program, an image displayed as a result of execution of the browser program and message m including application-related information, such as ‘SEARCH FOR “ADVENTURES OF A PRINCE”’, are displayed in display region 141 of controller 100 .
  • control unit 310 acquires operation information from controller 100 , and determines an operation content (event) of the operation information (step S 13 ). If the operation content (event) is touching, by the user, of message m displayed as a balloon (step S 13 ; touch search item), control unit 310 determines that an operation instructing a search for the string of characters “ADVENTURES OF A PRINCE” included in message m is performed, and performs a search with “ADVENTURES OF A PRINCE” being a search item (step S 17 ).
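The step-S13 branching just described, in which touching message m triggers an immediate search while touching soft key SK5 opens the search image, can be sketched as a dispatcher. The event names are illustrative:

```python
def dispatch_event(event):
    """Sketch of the step-S13 branch: which action follows each touch event."""
    if event == "touch_search_item":
        # Touching balloon message m performs an immediate search with the
        # presented title as the search item (step S17).
        return "search_with_title"
    if event == "touch_soft_key_for_search":
        # Touching soft key SK5 transitions to the search image with the
        # title inputted automatically (step S14).
        return "show_search_image"
    # Other operations are handled by the browser program as usual.
    return "other"


print(dispatch_event("touch_search_item"))  # search_with_title
```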
  • control unit 310 notifies the browser program of an event corresponding to an operation performed by a user, to perform a process for searching in accordance with a procedure described in the browser program, while maintaining an image displayed as a result of execution of a game program on monitor 200 .
  • control unit 310 transmits to controller 100 search result image data representing the search result (step S 18 ).
  • control unit 110 of controller 100 displays a search result image based on the received search result image data.
  • the browser program is given an instruction to search for information relating to the string of characters “ADVENTURES OF A PRINCE” included in message m.
  • FIG. 9 is a diagram showing an exemplary search result image displayed on controller 100 .
  • In the search result image displayed in display region 141 of controller 100 , below the string of characters “ADVENTURES OF A PRINCE,” which is the search item, results of the search for information on the Internet 400 relating to the string of characters are displayed.
  • a user can select a desired result from the displayed search results to browse the information in detail.
  • In step S 13 , if the operation content (event) is touching, by a user, of soft key SK 5 for transition to a search image (step S 13 ; touch soft key for search), control unit 310 transmits to controller 100 search image data representing a search image (step S 14 ). At this time, control unit 310 reads out from auxiliary storage unit 320 the title of the game “ADVENTURES OF A PRINCE” stored in association with the game program, and transmits the title of the game to controller 100 as a part of the search image data. Upon receipt of the search image data, control unit 110 of controller 100 displays a search image based on the search image data.
  • FIG. 10 is a diagram showing an exemplary search image displayed on controller 100 .
  • the string of characters “ADVENTURES OF A PRINCE” is inputted automatically (i.e., without need for an input operation performed by a user) and displayed in input field A for inputting a search item.
  • Below input field A are displayed soft keys SK 7 resembling a keyboard.
  • the user can input desired characters into the input field in addition to the string of characters “ADVENTURES OF A PRINCE.” For example, if the user wishes to know options for attacking the monster displayed on monitor 200 , the user may input one or more strings of characters such as “dinosaur,” “monster exhaling fire,” or “attack,” in addition to the string of characters “ADVENTURES OF A PRINCE.” At this time, the game image shown on monitor 200 ( FIG. 7 ) may be viewed as reference or ‘help’ information when the user inputs a keyword(s) serving as a search item(s).
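Combining the automatically inputted title with keywords the user adds, as described above, amounts to simple query composition. Joining terms with spaces is an assumption for illustration:

```python
def build_search_query(title, extra_keywords=()):
    """Combine the pre-filled game title with user-added keywords."""
    return " ".join([title, *extra_keywords])


query = build_search_query(
    "ADVENTURES OF A PRINCE",
    ["monster exhaling fire", "attack"],
)
print(query)  # ADVENTURES OF A PRINCE monster exhaling fire attack
```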
  • control unit 310 receives operation information corresponding thereto (step S 16 ), and performs a search in accordance with the search instruction (step S 17 ).
  • control unit 310 transmits to controller 100 search result image data representing the search result (step S 18 ).
  • control unit 110 of controller 100 displays a search result image based on the search result image data (refer to FIG. 9 ).
  • control unit 310 transmits browser image data to controller 100 .
  • control unit 110 of controller 100 displays a browser image based on the image data (step S 19 ).
  • FIG. 11 is a diagram showing an exemplary browser image displayed on controller 100 .
  • message m ‘SEARCH FOR “ADVENTURES OF A PRINCE”’ is not displayed in connection with soft key SK 5 for transition to a search image. Namely, when compared to FIG. 7 , the browser image is displayed with message m being deleted.
  • control unit 310 transmits to controller 100 search image data representing a search image (step S 14 ).
  • control unit 310 reads out from auxiliary storage unit 320 the title of the game “ADVENTURES OF A PRINCE” stored in association with the game program, and transmits the title of the game to controller 100 as a part of the search image data.
  • an image identical to that shown in FIG. 10 is displayed on controller 100 .
  • deleted message m is presented again as a search item.
  • soft keys SK 7 the user can input desired characters into the input field in addition to the string of characters “ADVENTURES OF A PRINCE.”
  • The exemplary embodiment described in the foregoing is merely one embodiment of the technology.
  • The technology is not limited to the exemplary embodiment, and can be carried out in other embodiments, as shown by the following modifications. It is to be noted that multiple modifications may be combined in carrying out the exemplary embodiment.
  • a game program is described as an example of a first application program
  • a browser program is described as an example of a second application program.
  • the first application program may be any program
  • the second application program may be any program that can perform a search.
  • the second application program may be, for example, an application program, such as dictionary software, which pre-stores information and performs a search for information according to a request input by a user.
  • the search process performed by the second application program may be a search process performed via the Internet 400 , or it may be a search process performed by accessing a storage device without accessing the Internet 400 .
  • Display system 10 may be used without use of monitor 200 . Though an image is displayed on each of controller 100 and monitor 200 in the exemplary embodiment, it is possible to display an image only on controller 100 or only on monitor 200 .
  • In such a case, control unit 310 of main device 300 causes a game image of a first application program (e.g., a game program) to be displayed on either controller 100 or monitor 200 , and, when a second application program (e.g., a browser program) is activated, causes an image of the browser program to be displayed in place of the game image.
  • control unit 310 of main device 300 may display an image displayed by execution of a first application program (e.g., game program) and an image of a second application program (e.g., browser program) via which a search can be performed, simultaneously on a single display device, by dividing the display region of the display device into multiple parts, for example.
  • application-related information relating to a first application program is stored in storage unit 301 in association with a type of the first application program.
  • the stored application-related information is not limited to the title of a game, and may be any information so long as it relates to the first application program.
  • a state of execution here indicates a variable procedure executed in the program.
  • In a case where the first application program is a game program and the game presented by the first application program is played in a virtual three-dimensional space, a point of view and a viewing direction perceived by a user when viewing the virtual three-dimensional space may vary in accordance with an operation performed by the user, and this may result in various game images being displayed for the same scene in the game. Therefore, in this case, it can be said that the application-related information is stored for each scene of the game, rather than for each image displayed in the game.
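Storing application-related information per scene rather than per displayed image, as suggested above, could be sketched as a lookup keyed by (game, scene). The keys and values here are hypothetical examples:

```python
# Hypothetical per-scene store: many rendered images correspond to one scene
# in a 3D game, so related information is keyed by scene, not by image.
scene_related_info = {
    ("ADVENTURES OF A PRINCE", "boss_battle"): "how to defeat the fire monster",
    ("ADVENTURES OF A PRINCE", "castle"): "castle map",
}


def related_info_for(game, scene):
    return scene_related_info.get((game, scene))


print(related_info_for("ADVENTURES OF A PRINCE", "boss_battle"))
```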
  • control unit 310 of main device 300 causes monitor 200 to display a captured image (so-called “screen shot”), which was displayed on controller 100 and monitor 200 during playing of the game immediately before activation of the browser program.

Abstract

An example information-processing device includes: an acquisition unit configured to acquire application-related information relating to a first application program; and a presentation unit configured, when a second application program, by which a search can be performed, is activated after activation of the first application program, to present to a user the application-related information acquired by the acquisition unit as a candidate for an item to be searched for.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is based on and claims priority under 35 U.S.C. 119 from Japanese Patent Application No. 2012-124544, which was filed on May 31, 2012.
  • FIELD
  • The technology relates to executing multiple application programs.
  • BACKGROUND AND SUMMARY
  • A so-called multitask function is known in which, when one application program is executed, another application program additionally may be activated.
  • An exemplary embodiment provides an information-processing device including: an acquisition unit configured to acquire application-related information on a first application program; and a presentation unit configured, when a second application program capable of performing a search is activated, after activation of the first application program, to present to a user the application-related information acquired by the acquisition unit as a candidate for an item to be searched for.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Exemplary embodiments will now be described with reference to the following drawings, wherein:
  • FIG. 1 shows an exemplary non-limiting diagram showing a configuration of a display system;
  • FIG. 2 shows an exemplary non-limiting block diagram showing a hardware configuration of the controller;
  • FIG. 3 shows an exemplary non-limiting block diagram showing a hardware configuration of a main device;
  • FIG. 4 shows an exemplary non-limiting block diagram showing a principal functional configuration of the main device;
  • FIG. 5 shows an exemplary non-limiting flowchart showing a process executed by the main device;
  • FIG. 6 shows an exemplary non-limiting diagram showing exemplary images displayed on the monitor and the controller;
  • FIG. 7 shows an exemplary non-limiting diagram showing exemplary images displayed on the monitor and the controller;
  • FIG. 8 shows an exemplary non-limiting flowchart showing a process executed by the main device;
  • FIG. 9 shows an exemplary non-limiting diagram showing an exemplary image displayed on the controller;
  • FIG. 10 shows an exemplary non-limiting diagram showing an exemplary image displayed on the controller; and
  • FIG. 11 shows an exemplary non-limiting diagram showing an exemplary image displayed on the controller.
  • DETAILED DESCRIPTION OF NON-LIMITING EXAMPLE EMBODIMENTS
  • FIG. 1 is a diagram showing a configuration of display system 10. Display system 10 is a system for displaying a variety of images in accordance with operations performed by a user. Display system 10 includes controller 100, monitor 200, and main device 300.
  • Controller 100 is a portable display device, and also is an operation terminal via which a user performs a variety of operations. Controller 100 is provided with an operation means such as buttons 161, in addition to display region 141. Controller 100 causes a variety of images to be displayed in display region 141 based on data transmitted from main device 300, generates operation information indicating an operation state of buttons 161, and transmits the generated operation information to main device 300.
  • Monitor 200 is a stationary-type display device, and may be, for example, a television set for receiving a television broadcast or a display of a personal computer. It is assumed that monitor 200 has display region 201 having a larger size than display region 141 of controller 100, though display region 201 may be the same size as or smaller than display region 141 of controller 100.
  • Main device 300 is a computer that executes programs and serves as an information-processing device for controlling operation of controller 100 and monitor 200. Main device 300 is connected to each of controller 100 and monitor 200 via wired or wireless communication. It is assumed here that main device 300 conducts wireless communication with controller 100 and conducts wired communication with monitor 200. Main device 300 has a multitask function whereby it is capable of executing multiple application programs in parallel. In accordance with this multitask function, main device 300 executes in parallel, for example, a game program in use by a user and a browser program for allowing the user to browse web pages.
  • Main device 300 causes at least one of controller 100 and monitor 200 to display an image. Depending on a user operation and/or a type of an image to be displayed, main device 300 may cause only one of controller 100 and monitor 200 to display an image or may cause each of controller 100 and monitor 200 to display an image. It is to be noted that, in a case where an image is displayed on each of controller 100 and monitor 200, main device 300 may cause the same image to be displayed on controller 100 and monitor 200, or may cause different images to be displayed on controller 100 and monitor 200.
  • While a user is playing a game using this display system 10, if the user wishes to discover an effective way to complete the game, for example, the user may operate controller 100 to activate a browser program of main device 300. At this time, main device 300 causes monitor 200 to display a captured image (a so-called “screen shot”), which was displayed on controller 100 and monitor 200 during playing of the game immediately before activation of the browser program, and causes controller 100 to display an image of the newly activated browser program. Further, main device 300 presents to the user information relating to the game program (hereinafter referred to as application-related information), such as the title of the game being played, as a candidate for a keyword indicating an item of information to be searched for (hereinafter, referred to as a search item), together with the image of the browser program displayed on controller 100. This presentation may be achieved in any way, such as by display of an image representing the application-related information or by output of sound representing the application-related information, so long as the information is conveyed to the user. In this exemplary embodiment, explanation will be given taking display of an image as an example. When a user instructs the browser program to conduct a search using the presented application information as a search item, a search engine on the Internet or the like operates in response to a request from the browser program, and main device 300 can obtain a result of the search performed by the search engine.
  • FIG. 2 is a block diagram showing a hardware configuration of controller 100. Controller 100 includes control unit 110, auxiliary storage unit 120, communication unit 130, display 140, touch screen unit 150, and operation unit 160.
  • Control unit 110 is a means for controlling operations of various units of controller 100. Control unit 110 includes a processing device such as a CPU (Central Processing Unit), a memory serving as a main memory device, an input/output interface for communicating information with various units of controller 100, and so on, and executes a program(s) to control display of images or data transmission and reception to and from main device 300.
  • Auxiliary storage unit 120 is a means for storing data used by control unit 110. Auxiliary storage unit 120 is a flash memory, for example. It is to be noted that auxiliary storage unit 120 may include a detachable storage medium such as a so-called memory card.
  • Communication unit 130 is a means for communicating with main device 300. Communication unit 130 includes an antenna or the like for communicating with main device 300 wirelessly.
  • Display 140 is a means for displaying an image. Display 140 includes a display panel having pixels formed by liquid crystal elements or organic EL (electroluminescence) elements, and a drive circuit for driving the display panel, and displays, in display region 141, an image in accordance with image data provided from control unit 110.
  • Touch screen unit 150 is a means for receiving an operation performed by a user, and generating coordinate information that represents a position in display region 141, to supply the coordinate information to control unit 110. Touch screen unit 150 includes a sensor disposed to overlap display region 141, and a control circuit for generating coordinate information representing a position detected by the sensor and providing the coordinate information to control unit 110. Touch screen unit 150 may be of resistive type, or may be of another type such as capacitive type.
  • Operation unit 160 is another means for receiving an operation performed by a user. Operation unit 160 includes the aforementioned buttons 161, and provides control unit 110 with operation information in accordance with an operation performed by a user.
  • FIG. 3 is a block diagram showing a hardware configuration of main device 300. Main device 300 includes control unit 310, auxiliary storage unit 320, disk drive unit 330, network communication unit 340, terminal communication unit 350, and AV (Audio and Visual) interface unit 360.
  • Control unit 310 is a means for controlling operations of various units of main device 300 by executing a program(s), and corresponds to a “computer” in the exemplary embodiment. Control unit 310 includes a processing device such as a CPU, a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor) or the like, a memory serving as a main memory device or a VRAM (Video Random Access Memory), an input/output interface for communicating information with various units of main device 300, and so on.
  • Auxiliary storage unit 320 is a means for storing data used by control unit 310. Auxiliary storage unit 320 is a flash memory or a hard disk, for example, but may include a detachable storage medium such as a memory card. Auxiliary storage unit 320 is capable of storing programs to be executed by control unit 310 and data acquired via network communication unit 340 or terminal communication unit 350. The programs stored in auxiliary storage unit 320 include a game program for presenting a game to a user and a browser program for browsing web pages.
  • Disk drive unit 330 is a means for reading data stored in an optical disk (optical storage medium). The optical disk may store data used for playing a game, such as a game program, for example. It is to be noted that disk drive unit 330 may read data stored in another storage medium such as a magneto-optical disk or a semiconductor memory.
  • Network communication unit 340 is a means for communicating via a network such as the Internet. The communication performed by network communication unit 340 may be wired or wireless communication. Network communication unit 340 receives data from an external server device or transmits data thereto in accordance with instructions from control unit 310.
  • Terminal communication unit 350 is a means for communicating with controller 100. In a case where a controller other than controller 100 is used, terminal communication unit 350 may communicate with the other controller. The wireless communication performed by terminal communication unit 350 may utilize any communication technology such as Wi-Fi, Bluetooth, or infrared communication.
  • AV interface 360 is a means for supplying to monitor 200 image data, sound data, or the like. AV interface 360 includes one or more interfaces such as an HDMI (High-Definition Multimedia Interface) terminal or the like.
  • FIG. 4 is a block diagram (functional block diagram) showing a principal functional configuration of main device 300. Main device 300 includes storage unit 301, acquisition unit 302, presentation unit 303, and instruction unit 304. The functions of these units are realized by execution of one or more programs by control unit 310 of main device 300. It is to be noted that main device 300 does not have to include every unit shown in FIG. 4. Further, main device 300 may realize the functions of the units shown in FIG. 4 by executing a single program or multiple programs.
  • Storage unit 301 is a means realized by auxiliary storage unit 320, for example, and stores application-related information, which is information relating to a first application program, in association with a type of the first application program. In a case where the type of the first application program is a game program, for example, the application-related information is the title of the game.
  • Acquisition unit 302 is a means realized by control unit 310, for example, and acquires the application-related information, which is information relating to the first application program. For example, in a case where the first application program is a game program and the application-related information is the title of the game, acquisition unit 302 acquires the title of the game from storage unit 301.
  • Presentation unit 303 is a means realized by control unit 310 and terminal communication unit 350, for example, and, when a second application program, by which a search can be performed, is activated after activation of the first application program, presents to a user the application-related information relating to the first application program and acquired by the acquisition unit, as a candidate for a search item in the search. For example, in a case where the first application program is a game program, the second application program is a browser program, and the application-related information is the title of the game, presentation unit 303 transmits to controller 100 image data representing the title of the game, for example, to cause the title to be displayed on display 140, thereby presenting to the user the title as a keyword serving as a candidate for a search item.
  • Instruction unit 304 is a means realized by control unit 310, for example, and, in response to an operation performed by a user for the application-related information presented by presentation unit 303, instructs the second application program to search for information relating to the application-related information. For example, in a case where the first application program is a game program, the second application program is a browser program, and the application-related information is the title of the game, instruction unit 304 instructs the browser program to search for information relating to the title of the game. As a result of execution of the browser program in accordance with this instruction, information relating to the title of the game, such as an effective way to complete the game, is retrieved by a search engine on the Internet 400 or the like, as a search result. The obtained search result is transmitted from main device 300 to controller 100 to be displayed on display 140.
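The division of labor among storage unit 301, acquisition unit 302, presentation unit 303, and instruction unit 304 can be sketched in code. The following Python sketch uses hypothetical class and method names; the patent specifies only the functions of these units, not any particular implementation:

```python
class StorageUnit:
    """Stores application-related information keyed by application type."""
    def __init__(self):
        self._info = {}

    def store(self, app_type, info):
        self._info[app_type] = info

    def load(self, app_type):
        return self._info.get(app_type)


class AcquisitionUnit:
    """Acquires application-related information on the first application."""
    def __init__(self, storage):
        self._storage = storage

    def acquire(self, app_type):
        return self._storage.load(app_type)


class PresentationUnit:
    """Presents the acquired information as a candidate for a search item."""
    def present(self, info):
        # In the embodiment this becomes message m, a balloon on the controller.
        return f'SEARCH FOR "{info}"'


class InstructionUnit:
    """Instructs the second application program (a search function) to search."""
    def __init__(self, search_fn):
        self._search = search_fn

    def instruct(self, info):
        return self._search(info)


# Wiring the units together, as in the game-title example:
storage = StorageUnit()
storage.store("game", "ADVENTURES OF A PRINCE")
title = AcquisitionUnit(storage).acquire("game")
message = PresentationUnit().present(title)
```

Here the search function passed to `InstructionUnit` stands in for the browser program; in the embodiment it would issue a request to a search engine on the Internet 400.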
  • Next, explanation will be given of an operation of this exemplary embodiment. FIGS. 5 and 8 are flowcharts showing a process performed by main device 300. Upon activation of a game program, control unit 310 of main device 300 generates game data according to a procedure described in the game program, where the game data includes image data representing game images, sound data representing the sound output while each game image is displayed, and so on (step S1). The game data includes two types of game data: game data for the controller, which is to be output from controller 100; and game data for the monitor, which is to be output from monitor 200. Control unit 310 generates the two types of game data.
  • Next, control unit 310 transmits the game data for the controller to controller 100 via terminal communication unit 350, and transmits the game data for the monitor to monitor 200 via AV interface 360 (step S2). Upon receipt of the game data for the controller, control unit 110 of controller 100 displays, in display region 141, a game image based on the image data, and outputs sound according to the sound data. On the other hand, upon receipt of the game data for the monitor, monitor 200 displays, in display region 201, a game image based on the image data, and outputs sound according to the sound data.
  • FIG. 6 is a diagram showing exemplary game images displayed on controller 100 and monitor 200. Displayed in display region 201 of monitor 200 is an image showing character C1, which is a main character in a game, and character C2, which is a monster against which the main character fights, viewed from the side in a virtual three-dimensional space. On the other hand, displayed in display region 141 of controller 100 is an image showing characters C1 and C2 viewed from above in the virtual three-dimensional space.
  • Referring again to FIG. 5, if a user performs an operation with controller 100 during the progress of the game (step S3; YES), control unit 110 of controller 100 generates operation information representing the operation and transmits the operation information to main device 300. Control unit 310 of main device 300 acquires the operation information from controller 100. Control unit 310 determines whether the operation represented by the operation information is an operation relating to progress of the game (step S4). If the operation is an operation relating to the progress of the game (step S4; YES), control unit 310 identifies an event corresponding to the operation (step S5), and generates game data by applying the event to the procedure described in the game program (step S1). While the game is in progress, step S1 to step S5 are repeated.
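The loop of steps S1 through S5 can be modeled as a simple dispatch on the received operation information. This is a hedged sketch with hypothetical operation and event names, not the actual game-program procedure:

```python
def run_game_step(operation):
    """Return the branch taken for one iteration of steps S1-S5.

    `operation` is None when no operation information was received
    (step S3: NO), or a dict describing the user's operation.
    """
    if operation is None:
        return "generate_game_data"           # step S1 simply repeats
    if operation.get("relates_to_game"):      # step S4: YES
        event = operation["event"]            # step S5: identify the event
        return f"apply_event:{event}"         # the event is fed back into step S1
    return "leave_game_loop"                  # step S4: NO -> proceed to step S6


branch = run_game_step({"relates_to_game": True, "event": "attack"})
```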
  • If it is determined in step S4 that the operation performed by the user is not an operation relating to the progress of the game (step S4; NO), control unit 310 determines whether the operation represented by the operation information is an operation instructing return to display of a home screen (step S6). A home screen is an image displayed first, to guide a user to the various functions provided by main device 300. If the operation is an operation for returning to the home screen (step S6; YES), control unit 310 temporarily suspends execution of the game program, reads out image data of the home screen from auxiliary storage unit 320, and transmits the image data of the home screen to controller 100 via terminal communication unit 350 and to monitor 200 via AV interface 360 (step S7). Upon receipt of the image data of the home screen, each of control unit 110 of controller 100 and monitor 200 displays the home screen based on the image data.
  • Arranged in this home screen are, in addition to a soft key for activating a browser program, a soft key for viewing television, a soft key for displaying a so-called "help" function, and a soft key for displaying various other menus, for example.
  • When an operation for activating a browser program is performed in the home screen (step S8; YES), control unit 310 activates the browser program (step S9). Then, control unit 310 transmits to monitor 200 game image data, which is image data for the game displayed on each of monitor 200 and controller 100 immediately before transition to the home screen (namely, immediately before temporary suspension of execution of the game program), and transmits to controller 100 browser image data representing an image of the browser program (step S10).
  • At this time, control unit 310 reads out the title of the game stored in association with the game program (for example, “ADVENTURES OF A PRINCE”) from auxiliary storage unit 320, and transmits the title of the game to controller 100 as a part of the browser image data. Upon receipt of the game image data, monitor 200 displays a game image based on the game image data. On the other hand, upon receipt of the browser image data, control unit 110 of controller 100 displays a browser image based on the browser image data. It is to be noted that, if the determination in both step S6 and step S8 is “NO,” control unit 310 of main device 300 performs a predetermined corresponding process (step S11). For example, if an operation instructing return to the game is performed, control unit 310 of main device 300 restarts execution of the game program from the point at which the procedure was suspended. Namely, in accordance with the procedure described in the game program, control unit 310 generates game data for the controller and game data for the monitor, and transmits them to controller 100 and monitor 200, respectively. As is described in the foregoing, control unit 310 serves as a return unit for restarting progress of a game caused by execution of a game program from a point at which the progress was suspended.
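The branching of steps S6 through S11 amounts to another dispatch, this time on operations received outside the game loop. A sketch with hypothetical operation codes; the actual operations depend on the home-screen soft keys:

```python
def handle_operation(op):
    """Map an operation to the branch taken in steps S6-S11."""
    if op == "return_to_home":                       # step S6: YES
        return "suspend_game_show_home"              # step S7
    if op == "activate_browser":                     # step S8: YES (from home screen)
        return "activate_browser_show_screenshot"    # steps S9-S10
    if op == "return_to_game":                       # one case of step S11
        return "resume_game_from_suspension_point"
    return "other_predetermined_process"             # step S11, other cases


branch = handle_operation("activate_browser")
```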
  • FIG. 7 is a diagram showing exemplary images displayed on controller 100 and monitor 200 when a browser program has been activated. Displayed in display region 201 of monitor 200 are game image g1, which was displayed in display region 201 of monitor 200 immediately before transition to the home screen, and game image g2, which was displayed in region 141 of controller 100 immediately before transition to the home screen. On the other hand, displayed in display region 141 is a browser image. In this browser image, message m ‘SEARCH FOR “ADVENTURES OF A PRINCE”’ is displayed in connection with soft key SK5 for transition to a search image, as a form of a so-called balloon superimposed on the browser image. A user recognizes this string of characters “ADVENTURES OF A PRINCE” as a candidate for a search item. As is described in the foregoing, when a browser program is activated, in place of an image displayed as a result of execution of a game program, an image displayed as a result of execution of the browser program and message m including application-related information such as ‘SEARCH FOR “ADVENTURES OF A PRINCE”’ are displayed in display region 141 of controller 100.
  • Next, with reference to FIG. 8, if the user performs an operation with controller 100 (step S12; YES), control unit 310 acquires operation information from controller 100, and determines an operation content (event) of the operation information (step S13). If the operation content (event) is touching, by the user, of message m displayed as a balloon (step S13; touch search item), control unit 310 determines that an operation instructing a search for the string of characters “ADVENTURES OF A PRINCE” included in message m is performed, and performs a search with “ADVENTURES OF A PRINCE” being a search item (step S17). Namely, in this process, control unit 310 notifies the browser program of an event corresponding to an operation performed by a user, to perform a process for searching in accordance with a procedure described in the browser program, while maintaining, on monitor 200, display of an image generated by execution of the game program.
  • Subsequently, when a result of the search is obtained, control unit 310 transmits to controller 100 search result image data representing the search result (step S18). Upon receipt of the search result image data, control unit 110 of controller 100 displays a search result image based on the received search result image data. Thus, even when an image for inputting a search item is not displayed as a result of activation of a browser program, the browser program is given an instruction to search for information relating to the string of characters “ADVENTURES OF A PRINCE” included in message m.
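Step S13 distinguishes three touch targets: message m itself, soft key SK5, and any other area of the browser image. A sketch of that dispatch, using hypothetical target names:

```python
def dispatch_touch(target, presented_item="ADVENTURES OF A PRINCE"):
    """Return the action taken for a touch event, as in step S13."""
    if target == "message_m":                          # touch search item
        return ("search_immediately", presented_item)  # steps S17-S18
    if target == "soft_key_SK5":                       # touch soft key for search
        return ("show_search_image", presented_item)   # step S14, prefilled field
    return ("show_browser_image", None)                # step S19, message m removed


action = dispatch_touch("message_m")
```

Note that both the first two branches carry the presented item forward: touching message m uses it as the search item at once, while touching SK5 uses it as the initial value of input field A.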
  • FIG. 9 is a diagram showing an exemplary search result image displayed on controller 100. In a search result image displayed in display region 141 of controller 100, below the string of characters “ADVENTURES OF A PRINCE,” which is the search item, results of the search for information on the Internet 400 relating to the string of characters are displayed. A user can select a desired result from the displayed search results to browse the information in detail.
  • Referring again to FIG. 8, in step S13, if the operation content (event) is touching, by a user, of soft key SK5 for transition to a search image (step S13; touch soft key for search), control unit 310 transmits to controller 100 search image data representing a search image (step S14). At this time, control unit 310 reads out from auxiliary storage unit 320 the title of the game “ADVENTURES OF A PRINCE” stored in association with the game program, and transmits the title of the game to controller 100 as a part of the search image data. Upon receipt of the search image data, control unit 110 of controller 100 displays a search image based on the search image data.
  • FIG. 10 is a diagram showing an exemplary search image displayed on controller 100. In a search image displayed in display region 141 of controller 100, the string of characters “ADVENTURES OF A PRINCE” is inputted automatically (i.e., without need for an input operation performed by a user) and displayed in input field A for inputting a search item. Below input field A are displayed soft keys SK7 resembling a keyboard. As is described in the foregoing, when message m including the string of characters “ADVENTURES OF A PRINCE” shown in FIG. 7 is selected (step S13; touch search item), a search for the string of characters “ADVENTURES OF A PRINCE” is performed immediately, and, on the other hand, when soft key SK5, which is an operation element indicating search, is selected (step S13; touch soft key for search), the screen image is caused to transition to a search image in which the presented string of characters “ADVENTURES OF A PRINCE” has been input as an initial value.
  • By using soft keys SK7, the user can input desired characters into the input field in addition to the string of characters “ADVENTURES OF A PRINCE.” For example, if the user wishes to know options for attacking the monster displayed on monitor 200, the user may input one or more strings of characters such as “dinosaur,” “monster exhaling fire,” or “attack,” in addition to the string of characters “ADVENTURES OF A PRINCE.” At this time, the game image shown on monitor 200 (FIG. 7) may be viewed as reference or ‘help’ information when the user inputs a keyword(s) serving as a search item(s).
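The final search item is thus the presented title plus any keywords the user appends via soft keys SK7. A minimal sketch, assuming the keywords are simply joined with spaces:

```python
def build_search_query(initial_value, extra_keywords=()):
    """Combine the prefilled value of input field A with user-added keywords."""
    return " ".join([initial_value, *extra_keywords])


query = build_search_query("ADVENTURES OF A PRINCE", ["dinosaur", "attack"])
```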
  • Referring again to FIG. 8, since the string of characters “ADVENTURES OF A PRINCE” has been input automatically in the input field for inputting a search item, as described with reference to FIG. 10 (step S15; YES), upon touching, by the user, of soft key SK6 for instructing a search, control unit 310 receives operation information corresponding thereto (step S16), and performs a search in accordance with the search instruction (step S17). When a result of the search is obtained, control unit 310 transmits to controller 100 search result image data representing the search result (step S18). Upon receipt of the search result image data, control unit 110 of controller 100 displays a search result image based on the search result image data (refer to FIG. 9). Thus, irrespective of whether a search image for inputting a search item is displayed as a result of activation of a browser program, it is possible to instruct the browser program to search for information relating to the string of characters “ADVENTURES OF A PRINCE.”
  • Further, in a case where it is determined in step S13 that the operation content (event) is touching, by the user, of an area other than soft key SK5 and message m (step S13; touch another part), control unit 310 transmits browser image data to controller 100. Upon receipt of the browser image data, control unit 110 of controller 100 displays a browser image based on the image data (step S19).
  • FIG. 11 is a diagram showing an exemplary browser image displayed on controller 100. In a state where a browser image is displayed on controller 100, message m ‘SEARCH FOR “ADVENTURES OF A PRINCE”’ is not displayed in connection with soft key SK5 for transition to a search image. Namely, when compared to FIG. 7, the browser image is displayed with message m being deleted. In this image, when soft key SK5 for transition to a search image is touched (step S20; YES), control unit 310 transmits to controller 100 search image data representing a search image (step S14). At this time, control unit 310 reads out from auxiliary storage unit 320 the title of the game “ADVENTURES OF A PRINCE” stored in association with the game program, and transmits the title of the game to controller 100 as a part of the search image data. As a result, an image identical to that shown in FIG. 10 is displayed on controller 100. Namely, deleted message m is presented again as a search item. By using soft keys SK7, the user can input desired characters into the input field in addition to the string of characters “ADVENTURES OF A PRINCE.”
  • Modifications
  • The exemplary embodiment described in the foregoing is merely one way of carrying out the technology. The technology is not limited to this exemplary embodiment, and can be carried out in other embodiments, as shown by the following modifications. It is to be noted that multiple modifications may be combined in carrying out the technology.
  • Modification 1
  • In the exemplary embodiment, a game program is described as an example of a first application program, and a browser program is described as an example of a second application program. However, the first application program and the second application program are not limited to the examples described in the exemplary embodiment. The first application program may be any program, and the second application program may be any program that can perform a search. The second application program may be, for example, an application program, such as dictionary software, which pre-stores information and performs a search for information according to a request input by a user. Namely, the search process performed by the second application program may be a search process performed via the Internet 400, or it may be a search process performed by accessing a storage device without accessing the Internet 400.
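As Modification 1 notes, the second application program need not reach the Internet 400; a dictionary-style program may search pre-stored information. A sketch of such a local search, assuming a simple case-insensitive substring match over stored entry titles:

```python
def local_search(entries, query):
    """Return stored texts whose title contains the query (no network access)."""
    q = query.lower()
    return [text for title, text in entries.items() if q in title.lower()]


entries = {
    "ADVENTURES OF A PRINCE": "An action game set in a virtual three-dimensional space.",
    "ADVENTURES OF A PRINCE GUIDE": "Hints on effective ways to complete the game.",
}
results = local_search(entries, "prince")
```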
  • Modification 2
  • Display system 10 may be used without use of monitor 200. Though an image is displayed on each of controller 100 and monitor 200 in the exemplary embodiment, it is possible to display an image only on controller 100 or only on monitor 200. In this case, control unit 310 of main device 300 causes a game image of a first application program (e.g., game program) to be displayed on either controller 100 or monitor 200, and, when a second application program (e.g., a browser program) is activated, causes an image of the browser program to be displayed in place of the game image. Further, control unit 310 of main device 300 may display an image displayed by execution of a first application program (e.g., game program) and an image of a second application program (e.g., browser program) via which a search can be performed, simultaneously on a single display device, by dividing the display region of the display device into multiple parts, for example.
  • Modification 3
  • In the exemplary embodiment, application-related information relating to a first application program is stored in storage unit 301 in association with a type of the first application program. The stored application-related information is not limited to the title of a game, and may be any information so long as it relates to the first application program.
  • Further, it is also possible to store application-related information relating to the first application program in storage unit 301 in association with a state of execution of the first application program when the first application program is executed. A state of execution here indicates a variable procedure executed in the program. For example, in a case where the first application program is a game program, a string of characters representing the content of each image displayed in the game may be stored as the application-related information. Further, for each scene in the game, a string of characters representing the scene may be stored as the application-related information. In a case where the game presented by the first application program is played in a virtual three-dimensional space, for example, the point of view and viewing direction from which a user views the space may vary in accordance with operations performed by the user, so that various game images may be displayed for the same scene in the game. In such a case, it is more accurate to say that the application-related information is stored for each scene of the game, rather than for each image displayed in the game.
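The two storage keys described in Modification 3 (by type of the first application program, and by state of execution such as the current scene) can be sketched as a keyed lookup with a type-level fallback. The application types, scene names, and candidate strings below are invented for illustration and are not from the disclosure.

```python
# Sketch of Modification 3: application-related information stored in
# association with either the type of the first application program or its
# state of execution (here, the current game scene). Hypothetical data.

RELATED_INFO = {
    # keyed by application type only
    ("racing-game", None): ["Game Title X"],
    # keyed by (type, scene): stored per scene, since in a 3D game the
    # same scene may produce many different images
    ("racing-game", "desert-course"): ["desert course", "sandstorm shortcut"],
    ("racing-game", "final-boss"): ["final boss", "boss weak point"],
}

def acquire_candidates(app_type, scene=None):
    """Acquire search candidates for the current state of execution,
    falling back to type-level information when the scene is unknown."""
    return RELATED_INFO.get((app_type, scene),
                            RELATED_INFO.get((app_type, None), []))
```

The fallback mirrors claims 9 and 10: the acquisition unit reads from the same storage whether the association key is the program's type or its state of execution.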
  • Modification 4
  • In the foregoing exemplary embodiment, when a browser program is activated while the user is playing a game using display system 10, control unit 310 of main device 300 causes monitor 200 to display a captured image (a so-called “screen shot”) that was displayed on controller 100 and monitor 200 during play of the game immediately before activation of the browser program.
  • The image displayed on monitor 200 at this time is not limited to the image that was displayed on controller 100 and monitor 200 immediately before activation of the browser program (second application program). For example, in a case where multiple game scenes are provided by execution of the game program (first application program), a single image may be set as a screen shot for each game scene. In this case, control unit 310 may cause the image set for the game scene current at the time the browser program (second application program) is activated to be displayed.
  • Further, the image displayed on monitor 200 at this time is not limited to a stationary image such as a screen shot, and may be a moving image. For example, control unit 310 may cause a moving image that was displayed on controller 100 or monitor 200, in accordance with execution of the game program (first application program) during a predetermined period before activation of the browser program (second application program), to be displayed on monitor 200.
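The two variants of Modification 4 (a preset screen shot per scene, or the moving image from a predetermined period before activation) can be sketched with a small ring buffer of recent frames. The class, file names, and scene keys are hypothetical and chosen only to illustrate the idea.

```python
from collections import deque

# Sketch of Modification 4: what monitor 200 might show when the browser
# program is activated -- either the preset screen shot for the current
# scene, or the recent frames from a predetermined period, kept in a
# fixed-length ring buffer. Hypothetical names throughout.

SCENE_SHOTS = {"desert-course": "shot_desert.png"}  # one image per scene

class FrameRecorder:
    def __init__(self, period_frames=3):
        # deque with maxlen discards the oldest frame automatically,
        # bounding the buffer to the predetermined period.
        self.frames = deque(maxlen=period_frames)

    def push(self, frame):
        self.frames.append(frame)

    def on_browser_activated(self, scene):
        # Prefer the image set for the current scene; otherwise replay
        # the moving image recorded before activation.
        if scene in SCENE_SHOTS:
            return ("still", SCENE_SHOTS[scene])
        return ("movie", list(self.frames))
```

A bounded deque is a natural fit here because recording must run continuously during play without growing without limit.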
  • Modification 5
  • The exemplary embodiment need not be carried out only in main device 300, in a display system including main device 300 and controller 100, or in a display system (display system 10) further including monitor 200; it can also be carried out in an information-processing device that integrates a configuration corresponding to main device 300 with a configuration corresponding to controller 100. Further, the functions of the main device may be divided among, and realized by, multiple devices.
  • Further, the exemplary embodiment may be carried out not only as an information-processing device or a display system as described in the foregoing, but also as an information-processing method or a program for executing such a method. Furthermore, the program of the exemplary embodiment may be stored in a storage medium such as an optical disk or a semiconductor memory, or may be downloaded to an information-processing device via a network such as the Internet.
  • The foregoing description of the embodiments is provided for purposes of illustration and description, and is in no way to be taken as either exhaustive or specifically limitative. It will be obvious to those skilled in the art that a wide range of modifications and variations can be applied to the embodiments described herein, such embodiments having been chosen merely to provide a clear explanation of the underlying principles and their range of practical application, and thereby to enable others skilled in the art to understand them in the context of a variety of embodiments best suited to a contemplated use. The scope is intended to be defined by the claims that follow and equivalents thereof.

Claims (14)

What is claimed is:
1. An information-processing device comprising:
an acquisition unit configured to acquire application-related information on a first application program; and
a presentation unit configured, when a second application program capable of performing a search is activated after activation of the first application program, to present to a user the application-related information acquired by the acquisition unit as a candidate for an item to be searched for.
2. The information-processing device according to claim 1, wherein the second application program is a browser program.
3. The information-processing device according to claim 1, wherein the presentation unit is further configured, when the second application program is activated, to cause an image displayed on a display device by execution of the first application program to be replaced with the application-related information and an image displayed as a result of execution of the second application program.
4. The information-processing device according to claim 1, further comprising an instruction unit configured, in response to an instruction of searching for the application-related information presented by the presentation unit, to instruct the second application program to search for information relating to the application-related information.
5. The information-processing device according to claim 4, wherein the instruction unit is further configured to instruct the second application program to search for information relating to the application-related information, irrespective of whether an image for inputting an item to be searched for is displayed as a result of activation of the second application program.
6. The information-processing device according to claim 1, wherein the presentation unit is further configured to display automatically the application-related information in an input field for an item to be searched for, the input field being displayed as a result of activation of the second application program.
7. The information-processing device according to claim 1, wherein the presentation unit is further configured to display the application-related information on an image displayed as a result of activation of the second application program.
8. The information-processing device according to claim 7, wherein the presentation unit is further configured, in response to an instruction, to delete the application-related information displayed on the image displayed as a result of activation of the second application program, and, after deletion of the application-related information, to display the identical application-related information again in response to an instruction.
9. The information-processing device according to claim 1, wherein
the application-related information is stored in a storage unit in association with a type of the first application program, and
the acquisition unit is further configured to acquire the application-related information from the storage unit.
10. The information-processing device according to claim 1, wherein
the application-related information is stored in a storage unit in association with a state of execution of the first application program, and
the acquisition unit is further configured to acquire the application-related information from the storage unit.
11. An information-processing method comprising:
when a second application program, by which a search can be performed, is activated after activation of a first application program, presenting to a user application-related information, which is information relating to the first application program, as a candidate for an item to be searched for.
12. A computer-readable non-transitory storage medium storing a program for causing a computer to execute:
acquiring application-related information, which is information relating to a first application program; and
when a second application program capable of performing a search is activated after activation of the first application program, presenting to a user the application-related information relating to the first application program as a candidate for an item to be searched for.
13. A display system comprising:
a display;
an acquisition unit configured to acquire application-related information, which is information relating to a first application program; and
a presentation unit configured, when a second application program capable of performing a search is activated after activation of the first application program, to present to a user the application-related information relating to the first application program and acquired by the acquisition unit, as a candidate for an item to be searched for.
14. An information-processing device comprising:
an acquisition unit configured to acquire application-related information relating to a first application program; and
a presentation unit configured, when a state where an image of the first application program is displayed is caused to transition to a state where an image of a second application program capable of performing a search is displayed, to present to a user the application-related information acquired by the acquisition unit.
US13/606,657 2012-05-31 2012-09-07 Method of associating multiple applications Abandoned US20130326521A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012124544A JP6022214B2 (en) 2012-05-31 2012-05-31 Program, information processing method, information processing apparatus, and display system
JP2012-124544 2012-05-31

Publications (1)

Publication Number Publication Date
US20130326521A1 (en) 2013-12-05

Family

ID=49671961

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/606,657 Abandoned US20130326521A1 (en) 2012-05-31 2012-09-07 Method of associating multiple applications
US13/606,720 Active 2032-11-25 US9075880B2 (en) 2012-05-31 2012-09-07 Method of associating multiple applications

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/606,720 Active 2032-11-25 US9075880B2 (en) 2012-05-31 2012-09-07 Method of associating multiple applications

Country Status (2)

Country Link
US (2) US20130326521A1 (en)
JP (1) JP6022214B2 (en)


Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6973505B1 (en) * 1999-09-01 2005-12-06 Eric Schneider Network resource access method, product, and apparatus
US20070061487A1 (en) * 2005-02-01 2007-03-15 Moore James F Systems and methods for use of structured and unstructured distributed data
US20080183681A1 (en) * 2007-01-29 2008-07-31 Samsung Electronics Co., Ltd. Method and system for facilitating information searching on electronic devices
US20080220854A1 (en) * 2007-03-08 2008-09-11 Timothy Michael Midgley Method and apparatus for collecting user game play data and crediting users in an online gaming environment
US20080300060A1 (en) * 2007-05-31 2008-12-04 Kabushiki Kaisha Square Enix (Also Trading As Square Enix Co., Ltd.) Online game server, online game program product and game apparatus
US20080317346A1 (en) * 2007-06-21 2008-12-25 Microsoft Corporation Character and Object Recognition with a Mobile Photographic Device
US20090144262A1 (en) * 2007-12-04 2009-06-04 Microsoft Corporation Search query transformation using direct manipulation
US20100010987A1 (en) * 2008-07-01 2010-01-14 Barry Smyth Searching system having a server which automatically generates search data sets for shared searching
US7688324B1 (en) * 1999-03-05 2010-03-30 Zoran Corporation Interactive set-top box having a unified memory architecture
US20100100839A1 (en) * 2008-10-22 2010-04-22 Erick Tseng Search Initiation
US20100144439A1 (en) * 2008-12-05 2010-06-10 Microsoft Corporation Out-of-band voice communication with interactive voice response services during gameplay
US20110131195A1 (en) * 2007-12-21 2011-06-02 Yeo Joon La Network search method for providing search window during execution of application program
US20110187662A1 (en) * 2010-02-04 2011-08-04 Samsung Electronics Co. Ltd. Mobile device with dual display units and method for controlling the dual display units
US7997969B1 (en) * 2007-03-15 2011-08-16 Capital One Financial Corp System and method for implementing a game for financial data extraction
US20110289530A1 (en) * 2010-05-19 2011-11-24 Google Inc. Television Related Searching
US20120026069A1 (en) * 2009-03-31 2012-02-02 Yasunari Ohsaki Mobile terminal device, and control program and multiple display screen control method therefor
US20120060114A1 (en) * 2010-09-02 2012-03-08 Samsung Electronics Co., Ltd. Method for providing search service convertible between search window and image display window and display apparatus applying the same
US20120081268A1 (en) * 2010-10-01 2012-04-05 Imerj LLC Launching applications into revealed desktop
JP2012085823A (en) * 2010-10-19 2012-05-10 Sony Computer Entertainment Inc Information processing system, information processing method, information processing program, and computer-readable recording medium with information processing program recorded thereon
CN102469361A (en) * 2010-11-03 2012-05-23 Tcl集团股份有限公司 Method for automatically downloading interlude of television program and television
US20120214505A1 (en) * 2011-02-22 2012-08-23 Sony Computer Entertainment Inc. Communication system, communication method, program, and information storage medium
US20130017870A1 (en) * 2011-07-12 2013-01-17 Cbs Interactive Inc. Game navigation interface for electronic content
US20130073583A1 (en) * 2011-09-20 2013-03-21 Nokia Corporation Method and apparatus for conducting a search based on available data modes
US8954996B2 (en) * 2009-12-11 2015-02-10 Red Hat, Inc. Profiling the system providing performance statistics in real time

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002052256A (en) * 2000-08-07 2002-02-19 Konami Co Ltd Supporting device of game winning, terminal device, and recording medium
JP2002123539A (en) * 2000-10-16 2002-04-26 Ntt Docomo Tokai Inc Information retrieval method
JP2002153676A (en) * 2000-11-17 2002-05-28 Square Co Ltd Game machine, information providing server, record medium, and information providing method and program
JP2002191868A (en) * 2000-12-25 2002-07-10 Namco Ltd Capture information provision information, information memory medium, game system and capture information provision system
US7843491B2 (en) * 2005-04-05 2010-11-30 3Vr Security, Inc. Monitoring and presenting video surveillance data
US8732019B2 (en) * 2006-07-21 2014-05-20 Say Media, Inc. Non-expanding interactive advertisement
JP5136292B2 (en) 2008-08-26 2013-02-06 日本電気株式会社 Application starting method for information processing terminal, information processing terminal and program
JP5349101B2 (en) * 2009-03-23 2013-11-20 株式会社ソニー・コンピュータエンタテインメント Information processing apparatus, information processing method, program, and information storage medium
JP5553668B2 (en) * 2010-04-14 2014-07-16 株式会社ソニー・コンピュータエンタテインメント Information search method, information search server, and information search system
US8798684B2 (en) * 2010-04-19 2014-08-05 Lg Electronics Inc. Mobile terminal and controlling method thereof
JP5256265B2 (en) * 2010-09-17 2013-08-07 株式会社ソニー・コンピュータエンタテインメント Computer system, computer system control method, program, and information storage medium
US9436274B2 (en) * 2011-06-30 2016-09-06 International Business Machines Corporation System to overlay application help on a mobile device
US8271334B1 (en) * 2011-10-05 2012-09-18 Google Inc. Generating a media content availability notification


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Blogocio, Gameplay F1 2011 en Nitendo 3DS, 10/2/2011, YouTube, page 1 *
Camp, Nintendo World 2011 3DS lineup unveiled, 12/28/2010, digitaltrends.com, pages 1-4 *
DinosaursLoveExistence, Set-top box, 5/8/2012, Wikipedia, pages 1-7 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104199641A (en) * 2014-08-04 2014-12-10 联想(北京)有限公司 Information processing method and first electronic equipment
US20200026395A1 (en) * 2018-07-17 2020-01-23 Google Llc Methods and Systems for Input Suggestion
US10901577B2 (en) * 2018-07-17 2021-01-26 Google Llc Methods and systems for input suggestion
US11803290B2 (en) 2018-07-17 2023-10-31 Google Llc Methods and systems for input suggestion

Also Published As

Publication number Publication date
JP6022214B2 (en) 2016-11-09
US9075880B2 (en) 2015-07-07
US20130326540A1 (en) 2013-12-05
JP2013248121A (en) 2013-12-12


Legal Events

Date Code Title Description
AS Assignment

Owner name: NINTENDO CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TSUDA, MUNETAKA;AOKI, RYOMA;KAKIMOTO, YASUTO;SIGNING DATES FROM 20120806 TO 20120807;REEL/FRAME:028916/0856

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION