US20090069096A1 - Program, information storage medium, game system, and input instruction device

Info

Publication number: US20090069096A1
Authority: US (United States)
Prior art keywords: controller, moving, instruction image, movement, game
Legal status: Abandoned
Application number: US12/201,619
Inventor: Yasuhiro NISHIMOTO
Current Assignee: Bandai Namco Entertainment Inc.
Original Assignee: Namco Bandai Games Inc.
Priority claimed from JP2008211603A (see JP5410710B2)
Application filed by Namco Bandai Games Inc.
Assigned to NAMCO BANDAI GAMES INC. (assignment of assignors interest). Assignors: NISHIMOTO, YASUHIRO
Publication of US20090069096A1
Assigned to BANDAI NAMCO GAMES INC. (change of name). Assignors: NAMCO BANDAI GAMES INC.

Classifications

    • A: HUMAN NECESSITIES
      • A63: SPORTS; GAMES; AMUSEMENTS
        • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
          • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
            • A63F13/20: Input arrangements for video game devices
              • A63F13/21: characterised by their sensors, purposes or types
                • A63F13/211: using inertial sensors, e.g. accelerometers or gyroscopes
            • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
              • A63F13/42: by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
            • A63F13/80: Special adaptations for executing a specific game genre or game mode
              • A63F13/814: Musical performances, e.g. by evaluating the player's ability to follow a notation
          • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
            • A63F2300/10: characterized by input arrangements for converting player-generated signals into game device control signals
              • A63F2300/105: using inertial sensors, e.g. accelerometers, gyroscopes
              • A63F2300/1087: comprising photodetecting means, e.g. a camera
            • A63F2300/30: characterized by output arrangements for receiving control signals generated by the game device
              • A63F2300/303: for displaying additional data, e.g. simulating a Head Up Display
                • A63F2300/305: for providing a graphical or textual hint to the player
            • A63F2300/60: Methods for processing data by generating or executing the game program
              • A63F2300/6045: for mapping control signals received from the input arrangement into game commands
              • A63F2300/63: for controlling the execution of the game in time
                • A63F2300/638: according to the timing of operation or a time limit
            • A63F2300/80: specially adapted for executing a specific type of game
              • A63F2300/8047: Music games

Definitions

  • the present invention relates to a program, an information storage medium, a game system, and an input instruction device.
  • JP-A-2002-166051 and JP-A-2001-224729 disclose a game device which instructs movements in the forward direction, the leftward direction, the rightward direction, the diagonal leftward direction, and the diagonal rightward direction for the player on the game screen so that the player performs a Para Para dance.
  • a game system in which a controller (input section) including a physical quantity sensor (e.g., acceleration sensor) and a game device main body are separately provided has been known.
  • the player plays the game by performing an input operation of shaking the controller or an input operation of inclining the controller.
  • a program that causes a computer to function as:
  • an instruction image generation section that generates an instruction image that instructs a given movement of a controller including a physical quantity sensor
  • a detection/determination section that acquires a signal from the physical quantity sensor included in the controller, detects the movement of the controller, and determines the degree of conformity of the detected movement of the controller with the movement instructed by the instruction image.
  • a computer-readable information storage medium storing the above-described program.
  • a game system comprising:
  • an instruction image generation section that generates an instruction image that instructs a given movement of a controller including a physical quantity sensor
  • a detection/determination section that acquires a signal from the physical quantity sensor included in the controller, detects the movement of the controller, and determines the degree of conformity of the detected movement of the controller with the movement instructed by the instruction image.
  • FIG. 1 is an explanatory view showing an example of a game system according to one embodiment of the invention.
  • FIG. 2 is an explanatory view showing an example of a controller according to one embodiment of the invention.
  • FIG. 3 is a diagram for describing the principle of pointing instruction performed by a controller according to one embodiment of the invention.
  • FIG. 4 is a functional block diagram showing a game system according to one embodiment of the invention.
  • FIG. 5 is an explanatory view showing an example of a game screen including an instruction image.
  • FIG. 6A to FIG. 6C are explanatory views showing a series of changes in a game screen including a pointing instruction image.
  • FIG. 7A to FIG. 7C are explanatory views showing examples of a change in a game screen including an instruction image that instructs a turn or punch.
  • FIGS. 8A and 8B are diagrams for describing examples of an instruction image formed by combining a plurality of moving path instruction image parts.
  • FIG. 9A to FIG. 9D are explanatory views showing an example of a change in an instruction image.
  • FIGS. 10A and 10B are diagrams for describing the controller position detection principle.
  • FIG. 11 is a table for describing first type determination data.
  • FIG. 12 is a list for describing second type determination data.
  • FIGS. 13A to 13C are explanatory views showing production screens based on a determination result.
  • FIGS. 14A and 14B are explanatory views showing a game screen pointing operation using a controller.
  • FIGS. 15A to 15D are diagrams for describing an example of a pointing instruction image.
  • FIG. 16A to FIG. 16C are explanatory views showing a change in a game production screen in which the number of backing dancers increases.
  • FIGS. 17A and 17B are explanatory views respectively showing a two-player mode and a four-player mode.
  • FIG. 18 is a flowchart showing an example of a process performed by a game system according to one embodiment of the invention.
  • FIG. 19 is a flowchart showing an example of a process performed in a step S 14 in FIG. 18 .
  • FIG. 20 is a flowchart showing an example of a process performed in a step S 40 in FIG. 19 .
  • FIG. 21 is a flowchart showing an example of a process performed in a step S 42 in FIG. 19 .
  • FIG. 22 is a flowchart showing an example of a process performed in a step S 22 in FIG. 18 .
  • the invention may provide an input instruction device, a program, an information storage medium, and a game system that can visually instruct the movement of a controller including a physical quantity sensor and appropriately determine whether or not the controller has been operated according to the instructed movement so that a player can easily and appropriately play the game by using the controller, for example.
  • an input instruction device comprising:
  • an instruction image generation section that generates an instruction image that instructs a given movement of a controller including a physical quantity sensor
  • a detection/determination section that acquires a signal from the physical quantity sensor included in the controller, detects the movement of the controller, and determines the degree of conformity of the detected movement of the controller with the movement instructed by the instruction image.
  • a game system comprising the above-mentioned sections.
  • a program that causes a computer to function as the above-mentioned sections.
  • a computer-readable information storage medium storing a program that causes a computer to function as the above-mentioned sections.
  • The term “controller” refers to a controller that can be held by the user and moved in real space.
  • the controller is preferably a controller that can be held by the user and moved in the vertical direction, the horizontal direction, the forward and backward direction, and the like, for example.
  • the type of the physical quantity sensor included in the controller is not particularly limited insofar as the physical quantity sensor detects a physical quantity from which the movement of the controller in real space can be determined (acquired).
  • the physical quantity sensor is preferably formed as a sensor that detects a physical quantity from which a moving amount per unit time in an arbitrary moving direction can be detected. It is more preferable that the physical quantity sensor detect a three-dimensional movement in real space as moving amounts per unit time in three axial directions.
  • the acceleration sensor 210 detects the accelerations in three-axis (X axis, Y axis, and Z axis) directions. An acceleration sensor that detects a three-dimensional movement as accelerations in X axis, Y axis, and Z axis directions that intersect perpendicularly may be used.
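  • Purely as an illustration of how such per-unit-time moving amounts could be obtained from the triaxial accelerations, a minimal Python sketch follows; the sampling period, the assumption that gravity has already been removed, and all function names are hypothetical and not taken from the disclosure.

      # Hypothetical sketch: derive a per-unit-time moving amount and a dominant
      # moving direction from triaxial acceleration samples. The 5 msec sampling
      # period matches the embodiment; gravity removal is assumed to be done.

      SAMPLE_PERIOD = 0.005  # seconds

      def moving_amount_per_unit(samples):
          """samples: list of (ax, ay, az) accelerations with gravity removed.
          Returns the accumulated velocity vector and the dominant axis label."""
          vx = vy = vz = 0.0
          for ax, ay, az in samples:
              vx += ax * SAMPLE_PERIOD  # simple numerical integration
              vy += ay * SAMPLE_PERIOD
              vz += az * SAMPLE_PERIOD
          velocity = (vx, vy, vz)
          dominant_axis = max(range(3), key=lambda i: abs(velocity[i]))
          return velocity, "XYZ"[dominant_axis]

      # usage with made-up samples (controller swung mainly along the X axis)
      print(moving_amount_per_unit([(2.0, 0.1, 0.0)] * 10 + [(-0.5, 0.0, 0.1)] * 4))
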
  • The term “instruction image that instructs the movement” refers to an instruction image that instructs the user who holds the controller to move the controller in real space in various ways.
  • the user who holds the controller moves the controller in real space in accordance with the movement instructed by the instruction image while observing the instruction image.
  • the detection/determination section acquires the signal from the physical quantity sensor of the controller to detect the movement of the controller in real space.
  • the detection/determination section determines the degree of conformity of the detected movement of the controller in real space with the movement instructed by the instruction image.
  • a section that transmits the determination result to the user may be provided, if necessary.
  • the determination result may be transmitted as image data, may be transmitted aurally as a sound output, or may be transmitted as both image data and a sound output.
  • the embodiments in which the movement of the controller is instructed may be applied to a game system.
  • the movement of the controller may be appropriately instructed for the player who holds the controller by using the instruction image, and a game production effect such as a given event may be generated based on the determination result as to whether or not the controller has been accurately moved in real space in accordance with the instructions.
  • a game system that provides the player who holds the controller with given dance movement instructions as the instruction image so that the player can easily play the dance game may be implemented.
  • The invention may also be suitably applied to other applications, such as giving aerobic or exercise instructions to a student (or player) who holds the controller so that the student can perform appropriate aerobics or exercise, and determining the result.
  • the physical quantity sensor may detect a physical quantity from which a moving direction and a moving amount per unit time can be derived;
  • the instruction image generation section may generate the instruction image that instructs a moving direction and a moving timing of the controller as the movement;
  • the detection/determination section may acquire the signal from the physical quantity sensor, detect the moving direction and the moving timing of the controller, and determine the degree of conformity of the detected moving direction and moving timing of the controller with the instructions instructed by the instruction image.
  • the instruction image generation section may generate the instruction image that instructs a moving start timing as the moving timing and instructs a moving duration
  • the detection/determination section may acquire the signal from the physical quantity sensor, detect the moving direction, the moving start timing, and the moving duration of the controller, and determine the degree of conformity of the detected moving direction, moving start timing, and moving duration of the controller with the instructions instructed by the instruction image.
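  • As a non-authoritative sketch of how such an instruction (moving direction, moving start timing, and moving duration) and the corresponding conformity determination might be represented, consider the following Python fragment; the field names, tolerances, and scoring rule are assumptions made for illustration only.

      # Hypothetical sketch of a movement instruction and a conformity check.
      from dataclasses import dataclass

      @dataclass
      class MovementInstruction:
          direction: str     # e.g. "right", "up", "forward"
          start_time: float  # instructed moving start timing (seconds)
          duration: float    # instructed moving duration (seconds)

      def degree_of_conformity(instr, detected_dir, detected_start, detected_dur,
                               start_tol=0.2, dur_tol=0.3):
          """Return a score in [0, 1]; a direction mismatch yields 0."""
          if detected_dir != instr.direction:
              return 0.0
          score = 1.0
          score -= min(abs(detected_start - instr.start_time) / start_tol, 1.0) * 0.5
          score -= min(abs(detected_dur - instr.duration) / dur_tol, 1.0) * 0.5
          return max(score, 0.0)

      # usage: a slightly late but otherwise correct rightward swing
      print(degree_of_conformity(MovementInstruction("right", 12.0, 0.5),
                                 "right", 12.1, 0.45))
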
  • the instruction image may include:
  • a moving path instruction image part that instructs a given path along which the controller is to be moved; and
  • a timing instruction image part that instructs a moving timing of the controller along the path,
  • the instruction image being updated according to a change in content of instruction.
  • the path instructed by the moving path instruction image part may be a straight line or a curve.
  • the timing instruction image part may be arbitrarily formed insofar as the timing instruction image part instructs the moving timing of the controller along a given path instructed by the moving path instruction image part.
  • the timing instruction image part may be formed to move along the moving path corresponding to the moving timing of the controller, or may be integrally formed with the moving path instruction image part so that the color of the given path instructed by the moving path instruction image part is changed corresponding to the moving timing of the controller. It suffices that the timing instruction image part be displayed so that the moving timing can be visually recognized.
  • the timing instruction image part may move from a movement start instruction position to a movement finish instruction position along the moving path instruction image part to instruct a moving start timing and a moving duration of the controller along the instructed path.
  • the user can visually and easily determine the moving start timing and the moving duration of the controller along the instructed path by moving the timing instruction image part along the moving path instruction image part.
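  • A minimal sketch of how the on-screen position of such a timing instruction image part could be computed from the instructed moving start timing and moving duration is given below; the linear interpolation and the parameter names are assumptions for illustration, not the disclosed implementation.

      # Hypothetical sketch: move a timing marker from the movement start
      # instruction position to the movement finish instruction position over
      # the instructed moving duration (linear interpolation assumed).

      def timing_marker_position(start_pos, finish_pos, start_time, duration, now):
          """start_pos/finish_pos: (x, y) endpoints of the moving path image part.
          Returns the marker position at time 'now', clamped to the path."""
          t = (now - start_time) / duration
          t = max(0.0, min(1.0, t))  # keep the marker on the path
          return (start_pos[0] + (finish_pos[0] - start_pos[0]) * t,
                  start_pos[1] + (finish_pos[1] - start_pos[1]) * t)

      # usage: halfway through a 0.5 s movement along a horizontal path
      print(timing_marker_position((100, 200), (300, 200), 10.0, 0.5, 10.25))
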
  • At least one of the moving path instruction image part and the timing instruction image part may be displayed as a transition image that changes from a previous notice display that is displayed before the moving start timing to a main display when the moving start timing is reached.
  • the player can be appropriately notified of the moving start timing, the moving direction, and the like using the transition image that changes from the previous notice display before the moving start timing to the main display.
  • the instruction image may be displayed as a set of moving path instruction image parts that instruct a continuous moving path by combining a plurality of the moving path instruction image parts;
  • the timing instruction image part may move along the continuously combined moving path instruction image parts to instruct the moving timing of the controller along a moving path instructed by each of the moving path instruction image parts.
  • a simple but varied dance game can thus be implemented in which the two controllers held by the player with both hands are treated as two pompons held by a cheerleader, and a complex movement (e.g., shaking a pompon to draw the letter “8” or making a turn while holding the pompons) is instructed instead of merely moving the pompons in the vertical or horizontal direction.
  • the detection/determination process by the detection/determination section may be performed each time the movement instruction is issued using each moving path instruction image part that forms the set of moving path instruction image parts.
  • the detection/determination section may acquire the signal from the physical quantity sensor, and detect a timing and a duration when the moving amount of the controller per unit time exceeds a given value as the moving start timing and the moving duration of the controller;
  • the detection/determination section may determine the degree of conformity of the detected moving start timing and moving duration of the controller with a moving start timing for determination and a moving duration for determination that are related to the moving start timing and moving duration instructed by the instruction image, when the detection/determination section has determined that the detected moving direction of the controller coincides with the moving direction instructed by the instruction image.
  • the detection/determination section can acquire the signal from the physical quantity sensor, and detect the timing and the duration when the moving amount of the controller per unit time exceeds a given value as the moving start timing and the moving duration of the controller.
  • a given delay time occurs until the moving amount of the controller per unit time exceeds a given value after the controller has been moved, although the delay time differs depending on the user. Therefore, even if the player has moved the controller at the timing instructed by the instruction image, detection of the movement is delayed by the given delay time.
  • a moving start timing for determination and a moving duration for determination are therefore set in relation to the moving start timing and the moving duration instructed by the instruction image. Even if the detected moving start timing of the controller is delayed with respect to the instructed moving start timing, the degree of conformity of the movement of the controller with the instructed movement is determined to be high when the detected moving start timing and moving duration fall within the moving start timing and moving duration for determination.
  • the moving start timing for determination and the moving duration for determination may be set separately for a beginner player, an intermediate player, and a skilled player, corresponding to the level of the user.
  • the degree of difficulty can be set by reducing the allowed delay between the moving start timing instructed by the instruction image and the moving start timing for determination for a skilled player, and by increasing the delay for a beginner player.
  • the moving duration for determination may be reduced by the amount of the delay of the moving start timing for determination with respect to the moving start timing instructed by the instruction image.
  • the degree of difficulty can be decreased by increasing the moving duration for determination, and can be increased by decreasing the moving duration for determination.
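  • The timing determination described above can be pictured with the following illustrative Python sketch: the moving start timing is detected as the first moment the moving amount per unit time exceeds a given value, and the allowed delay of the determination window is scaled by the player level; every threshold and window value here is an assumption, not a value from the disclosure.

      # Hypothetical sketch: threshold-based detection of the moving start timing
      # and duration, judged against a level-dependent determination window.

      MOVE_THRESHOLD = 1.5  # moving amount per unit time regarded as "moving"
      ALLOWED_DELAY = {"beginner": 0.30, "intermediate": 0.20, "skilled": 0.10}

      def detect_onset(magnitudes, sample_period=0.005):
          """magnitudes: per-sample moving amounts. Returns (start_time, duration)
          of the first span above the threshold, or None if nothing is detected."""
          start = end = None
          for i, m in enumerate(magnitudes):
              if m > MOVE_THRESHOLD and start is None:
                  start = i
              elif m <= MOVE_THRESHOLD and start is not None:
                  end = i
                  break
          if start is None:
              return None
          end = len(magnitudes) if end is None else end
          return start * sample_period, (end - start) * sample_period

      def judge_timing(instructed_start, instructed_dur, detected_start,
                       detected_dur, level="intermediate"):
          delay = ALLOWED_DELAY[level]
          ok_start = instructed_start <= detected_start <= instructed_start + delay
          ok_duration = abs(detected_dur - instructed_dur) <= delay
          return ok_start and ok_duration

      # usage: movement begins 0.1 s in and lasts 0.3 s
      onset = detect_onset([0.1] * 20 + [2.0] * 60 + [0.2] * 10)
      print(onset, judge_timing(0.10, 0.30, *onset, level="beginner"))
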
  • the detection/determination section may perform a first process that compares the signal output from the physical quantity sensor when moving the controller with a first type determination database and determines the position of the controller that is moved, the first type determination database being related to the signal output from the physical quantity sensor when moving the controller in different positions in the moving direction instructed by the instruction image, and the first type determination database being used to determine the position of the controller; and
  • the detection/determination section may perform a second process that compares the signal output from the physical quantity sensor when moving the controller in the position determined by the first process in the moving direction instructed by the instruction image with a second type determination database to specify the movement of the controller including at least the moving direction, and determines the degree of conformity of the specified movement with the instructed movement, the second type determination database being related to the position of the controller determined based on the first type determination database, and being used to determine the movement of the controller including at least the moving direction from the signal output from the physical quantity sensor when moving the controller in the moving direction instructed by the instruction image.
  • the first determination process that determines the position of the controller using the first type determination database and the second process that determines the degree of conformity of the movement of the controller including at least the moving direction using the second type determination database are performed.
  • three acceleration sensors that detect accelerations in X, Y, and Z axial directions in the space based on the controller may be used as the physical quantity sensor included in the controller.
  • the values output from the sensors that detect accelerations in X, Y, and Z axial directions differ depending on the position of the controller even when moving the controller in an identical direction (e.g., rightward direction).
  • the position of the controller when the user moves the rod-shaped controller in accordance with the instructions instructed by the instruction image is classified into basic patterns, for example.
  • basic positions such as the case where the user holds the controller vertically, the case where the user holds the controller horizontally, and the case where the user inclines the controller are taken into consideration.
  • the signals output from the physical quantity sensor included in the controller are collected within the predetermined allowable range, and stored in the database, while being related to the different basic positions. In this case, even if the user holds the controller vertically, the controller may be inclined with respect to the vertical direction to some extent. If such a position were excluded from the vertical position, it would become impossible to classify the basic position of the controller.
  • the signals output from the sensor of the controller when the inclination of the controller from the basic position (e.g., vertical position) is within the predetermined allowable range (e.g., when the controller held by the player is inclined with respect to the vertical basic position within the predetermined allowable range) are collected as data within the predetermined allowable range and stored in the database corresponding to the basic position.
  • the data collected to be within the predetermined allowable range and stored in the database while being related to each basic position is the first type determination database.
  • the signals output from the physical quantity sensor included in the controller when moving the controller in the moving direction instructed by the instruction image are compared with the first type determination database, and the position of the controller that is moved in the moving direction is determined (first determination process). Therefore, the position (e.g., vertical or horizontal with respect to the screen) of the controller held by the user can be determined.
  • the second type determination database is provided in the same manner as the first type determination database. Specifically, the signals output from the physical quantity sensor of the controller when moving the controller in the direction instructed by the instruction image are collected corresponding to each basic position in which the player is considered to hold the controller.
  • the second type determination database for specifying the movement of the controller including the moving direction when moving the controller in a given position is provided in which the signals output from the physical quantity sensor when moving the controller in each position are associated with the moving direction of the controller.
  • the user may move the controller in a meandering path or a curved path with respect to the instructed direction. If it is determined that the movement along such a moving path does not conform to the movement in the instructed moving direction, a situation in which a wide range of users cannot enjoy the game may occur.
  • when the controller is moved along a moving path within the predetermined allowable range with respect to the moving path instructed by the instruction image (e.g., the moving direction instructed by the moving path instruction section), it is necessary to determine the movement to be a movement in the instructed moving direction.
  • the signals output from the physical quantity sensor when moving the controller in a specific basic position in the moving direction instructed by the instruction image are collected within the predetermined allowable range, and stored in the database.
  • the moving direction instructed by the instruction image is, for example, the vertical direction, the horizontal direction, the forward and backward direction, a predetermined diagonal direction, a semicircle or a circle along the vertical plane, or a semicircle or a circle along the horizontal plane.
  • the signals output from the physical quantity sensor when moving the controller in a specific basic position in each moving direction are collected as data within the predetermined allowable range.
  • the data in which the output data from the physical quantity sensor, collected within the predetermined allowable range when moving the controller in a given moving direction, is associated with that moving direction is collected corresponding to each moving direction instructed by the instruction image and stored in the database.
  • the second type determination database is thus generated.
  • the data relating to the signals within the predetermined allowable range output from the physical quantity sensor when moving the controller in the determined position is compared, using the second type determination database, with the signals actually output from the physical quantity sensor of the controller to specify the movement of the controller including the moving direction.
  • a process that specifies the moving direction and the moving amount per unit time of the controller held in the position determined by the first process is performed in real time.
  • the moving direction and the moving timing specified based on the moving direction and the moving amount per unit time of the controller are determined, and whether or not the movement coincides with the movement instructed by the instruction image is determined.
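  • A purely illustrative sketch of this two-stage determination (first classify the holding position of the controller against the first type determination database, then classify the moving direction against the second type determination database for that position) is shown below; the reference patterns and the nearest-pattern matching are assumptions, whereas the actual databases would be built from collected sensor data as described above.

      # Hypothetical sketch of the first and second determination processes.
      import math

      # First type determination database: representative (gravity-direction)
      # acceleration patterns for each basic holding position (made-up values).
      FIRST_TYPE_DB = {
          "vertical":   (0.0, -1.0, 0.0),
          "horizontal": (-1.0, 0.0, 0.0),
          "inclined":   (-0.7, -0.7, 0.0),
      }

      # Second type determination database: per holding position, representative
      # acceleration patterns observed when moving in each instructed direction.
      SECOND_TYPE_DB = {
          "vertical":   {"right": (1.0, 0.0, 0.0), "up": (0.0, 1.0, 0.0)},
          "horizontal": {"right": (0.0, 1.0, 0.0), "up": (1.0, 0.0, 0.0)},
          "inclined":   {"right": (0.8, 0.3, 0.0), "up": (0.3, 0.8, 0.0)},
      }

      def nearest(signal, patterns):
          """Return the key whose pattern is closest to 'signal' (Euclidean)."""
          return min(patterns, key=lambda k: math.dist(signal, patterns[k]))

      def determine(static_signal, motion_signal, instructed_direction):
          position = nearest(static_signal, FIRST_TYPE_DB)              # first process
          direction = nearest(motion_signal, SECOND_TYPE_DB[position])  # second process
          return position, direction, direction == instructed_direction

      # usage: controller held roughly vertical and swung to the right
      print(determine((0.1, -0.95, 0.05), (0.9, 0.1, 0.0), "right"))
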
  • the first determination process and the second determination process may be performed each time the movement of the controller is instructed by the instruction image that instructs one movement.
  • therefore, even when different movements of the controller are sequentially instructed, the movement of the controller can be detected accurately without being affected by a change in the position of the controller.
  • the first determination process and the second determination process may be performed in this order.
  • the signals output from the physical quantity sensor of the controller may be stored in a buffer, and the first determination process and the second determination process may be performed in parallel.
  • the first determination process and the second determination process may be performed in a period in which the controller must be moved to satisfy the moving path and the moving timing instructed by each moving path instruction image part.
  • the first determination process may be appropriately omitted so that the first determination process and the second determination process are performed in a specific period and only the second determination process is performed in a specific period.
  • the instruction image generation section may generate the instruction image that individually instructs a given movement for at least two controllers each having the physical quantity sensor.
  • one player may play the game while holding two controllers with both hands, or two players may play the game while respectively holding one controller.
  • a four-player game in which operation instructions are given to four players each holding one controller may be implemented, or a two-player game in which operation instructions are given to two players each holding two controllers may be implemented.
  • When a single player plays the game while holding two controllers, the operation of each controller may be individually determined and the determination result for each operation may be evaluated, or the operations of the two controllers may be evaluated as a whole.
  • Each of the input instruction device, the game system, the program and the information storage medium may further comprise:
  • a game calculation section that instructs a player to perform a dance action accompanying the movement of the controller, generates a game screen including a character that performs a dance related to the instructed dance action based on an input from the controller, and generates a dance background music signal, wherein the instruction image generation section generates the instruction image that instructs a given movement of the controller in the game screen.
  • a dance game which instructs a dance movement with one hand or both hands for the player (user) using the instruction image so that the player can easily enjoy dancing to the rhythm can be provided.
  • a dance character associated with the dance action instructed by the instruction image appears on the game screen according to these embodiments.
  • the dance character dances to the dance background music, and the instruction image that instructs a given movement of the controller associated with the dance movement is generated and displayed on the game image where the dance character appears. Therefore, the player can dance in accordance with the movement instructed by the instruction image in synchronization with the dance character and background music. As a result, a player-friendly dance game can be provided.
  • the game calculation section may include a subsection that performs game production related to a result of determination for the degree of conformity by the detection/determination section.
  • game production corresponding to the degree of conformity determination result is performed, such as generating effect sound or displaying a production effect image on the game screen to liven up the game.
  • a special event may be generated when the player has successfully moved the controller corresponding to the successive movements. This improves the game production effect.
  • the game calculation section may include a subsection that specifies a cause of an incorrect operation of the controller and notifies the player of the cause of the incorrect operation based on a result of determination for the degree of conformity by the detection/determination section.
  • the player can be urged to more accurately move the controller by notifying the player of the cause of the incorrect operation.
  • the game calculation section may include a subsection that traces the moving path of the controller detected by the detection/determination section in the game screen based on a given condition.
  • the user can thus visually observe the degree of similarity between the movement instructed by the instruction image and the actual movement of the controller so that the user can enjoy the game while further improving his dance skill.
  • Each of the input instruction device, the game system, the program and the information storage medium may further comprise:
  • a pointing position detection section that acquires an imaging signal from an imaging section provided in the controller and detects a pointing position of the controller on the game screen, the imaging section imaging a reference position recognition body disposed or displayed at a position related to the game screen,
  • the game calculation section includes a subsection that displays a position instruction image that instructs the player to point at a predetermined position on the game screen at a given timing during the game;
  • the game calculation section generates an event related to a pointing at the predetermined position when the pointing at the predetermined position has been detected at the timing.
  • the reference position recognition body allows the pointing position of the controller on the game screen to be specified from the captured image.
  • the reference position recognition body may be a recognition body provided at a position associated with the game screen (e.g., at least two light sources or recognizable objects), or may be a recognition image displayed on the game screen.
  • the two light sources may be provided around a display and imaged by using an imaging section provided in the controller so that a CPU can determine the relative positional relationship between the controller and the game screen and determine the pointing position of the controller on the game screen.
  • the position instruction image that instructs the player to point at a given position on the game screen at a given timing during the game is generated and displayed.
  • a predetermined event (e.g., the number of backing dancers appearing on the game screen increases, or a plurality of backing dancers who dance on the game screen give a special performance) may be generated to increase the range of the game.
  • the instruction image generation section may generate the instruction image that instructs a given movement of the controller related to the predetermined pointed position of the game screen as the event related to the pointing at the predetermined position.
  • FIG. 1 is a schematic external view showing a game system according to one embodiment of the invention.
  • the game system includes a display section 12 that displays a game image on a display screen 11 , a game device 10 (game device main body) that performs a game process and the like, a first controller 20 - 1 (operation input section), and a second controller 20 - 2 (operation input section), the first controller 20 - 1 and the second controller 20 - 2 being held by a player P with either hand so that their positions and directions within a predetermined range can be arbitrarily changed.
  • the game device 10 and each of the controllers 20 - 1 and 20 - 2 exchange various types of information via wireless communication.
  • FIG. 2 is a schematic external view showing the controller 20 according to this embodiment.
  • the controller 20 includes an arrow key 16 a and an operation button 16 b as an operation section.
  • the controller 20 also includes an acceleration sensor 210 as a physical quantity sensor that detects information which changes corresponding to the inclination and the movement of the controller so that information relating to the inclination and the movement of the controller in real space can be acquired.
  • the acceleration sensor 210 is formed as a triaxial acceleration sensor 210 (detection section).
  • the acceleration sensor 210 detects the direction and the degree of inclination of the controller as acceleration vectors (inclination information) in three axial directions applied to the controller.
  • the acceleration sensor 210 detects the movement of the controller (i.e., changes in speed and direction of the controller per unit time due to the movement of the controller) as acceleration vectors (movement information) in three axial directions applied to the controller.
  • the game device 10 detects and determines the inclination and the movement of each of the first controller 20 - 1 and the second controller 20 - 2 in real space based on the information that changes corresponding to the inclination and the movement of each controller, and controls the game.
  • the game system displays a dance game screen shown in FIG. 5 , and displays an instruction image 340 in the game screen with the progress of the game.
  • the instruction image 340 instructs the player who holds the controller to move the controller in real space in various ways.
  • the player who holds the controller 20 moves the controller 20 in real space in accordance with the movement of the controller instructed by the instruction image while observing the instruction image.
  • the game device 10 acquires signals from the acceleration sensor 210 of the controller 20 to detect the movement of the controller in real space.
  • the game device 10 determines the degree of conformity of the detected movement of the controller in real space with the movement instructed by the instruction image.
  • the game device 10 generates a given event or a game production effect based on the determination result.
  • the player P who holds the controller 20 is thus provided with given dance movement instructions or the like using the instruction image 340 so that the player can easily play the dance game.
  • the controller 20 has a pointing function of indicating (pointing) an arbitrary position on the display screen 11 .
  • a pair of light sources 198 R and 198 L (reference position recognition portions) is disposed around the display section 12 at a position associated with the display screen 11 .
  • the light sources 198 R and 198 L are disposed at a predetermined interval along the upper side of the display section 12 , and are formed to project infrared radiation (i.e., invisible light).
  • An imaging section 220 that captures an image in front of the controller 20 is provided on the front side of the controller 20 .
  • the pointing position of the controller 20 on the display screen 11 is calculated as follows.
  • the rectangular area shown in FIG. 3 indicates a captured image PA acquired by the imaging section 220 (image sensor).
  • the captured image PA is an image corresponding to the position and the direction of the controller 20 .
  • the position RP of an area RA corresponding to the light source 198 R and the position LP of an area LA corresponding to the light source 198 L included in the captured image PA are calculated.
  • the positions RP and LP are indicated by position coordinates specified in a two-dimensional coordinate system (XY-axis coordinate system) in the captured image PA.
  • the distance between the light sources 198 R and 198 L and the relative positions of the light sources 198 R and 198 L associated with the display screen 11 are known in advance. Therefore, the game device 10 calculates the indication position (pointing position) on the display screen 11 using the controller 20 from the coordinates of the positions RP and LP thus calculated.
  • the origin O of the captured image PA is determined to be the pointing position of the controller 20 .
  • the pointing position is calculated from the relative positional relationship between the origin O of the captured image PA, the positions RP and LP in the captured image PA, and a display screen area DA that is an area in the captured image PA corresponding to the display screen 11 .
  • the positions RP and LP are located slightly above the center of the imaging area PA in a state in which the line segment that connects the positions RP and LP is rotated clockwise by theta degrees with respect to a reference line L (X axis) of the imaging area PA.
  • the origin O corresponds to a predetermined position on the lower right of the display screen area DA so that the coordinates of the indication position (pointing position) of the controller 20 on the display screen 11 can be calculated.
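  • One possible way of turning the imaged positions RP and LP into a pointing position on the display screen is sketched below for illustration; the image and screen resolutions, the roll compensation, and the scaling are assumptions and are not taken from the disclosure.

      # Hypothetical sketch: estimate the pointing position on the display screen
      # from the positions RP and LP of the two light sources in the captured image.
      import math

      def pointing_position(rp, lp, image_size=(1024, 768), screen_size=(1920, 1080)):
          """rp, lp: (x, y) positions of the light-source areas in the captured image.
          Returns an estimated (x, y) pointing position on the display screen."""
          mid_x = (rp[0] + lp[0]) / 2.0  # midpoint between the light sources
          mid_y = (rp[1] + lp[1]) / 2.0
          cx, cy = image_size[0] / 2.0, image_size[1] / 2.0

          # Compensate for the roll of the controller (the line RP-LP is rotated
          # by theta with respect to the reference line of the imaging area).
          theta = math.atan2(rp[1] - lp[1], rp[0] - lp[0])
          dx, dy = cx - mid_x, cy - mid_y
          ux = dx * math.cos(-theta) - dy * math.sin(-theta)
          uy = dx * math.sin(-theta) + dy * math.cos(-theta)

          # Scale the offset into screen coordinates (light sources assumed to sit
          # along the upper side of the display).
          sx = screen_size[0] / 2.0 + ux * screen_size[0] / image_size[0]
          sy = screen_size[1] / 2.0 + uy * screen_size[1] / image_size[1]
          return sx, sy

      # usage: light sources imaged above-left of centre with a small clockwise roll
      print(pointing_position((530, 300), (430, 310)))
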
  • a game image shown in FIGS. 6A to 6C is displayed on the display screen 11 , for example.
  • a position instruction image 350 that instructs the player to point at a predetermined position in the game image at a given timing using the first controller 20 - 1 and the second controller 20 - 2 held with the left hand or the right hand with the progress of the dance game is displayed in the game image.
  • the game device 10 determines whether or not the predetermined position has been instructed at an appropriate timing.
  • the game device 10 performs a game production process that generates a predetermined event (e.g., the number of backing dancers appearing in the game image increases).
  • the reference position recognition body allows the pointing position of the controller on the game screen to be specified from the captured image.
  • the reference position recognition body may be a recognition body provided at a position associated with the game screen (e.g., at least two light sources or recognizable objects), or may be a recognition image displayed on the game screen.
  • two reference position recognition images may be displayed at predetermined positions on the game screen as the recognition bodies.
  • the number of recognition bodies need not necessarily be two.
  • the number of recognition bodies may be one.
  • FIG. 4 shows an example of a functional block diagram of the game system according to this embodiment. Note that the game system according to this embodiment need not necessarily include all of the elements shown in FIG. 1 . The game system according to this embodiment may have a configuration in which some of the elements are omitted.
  • the game system includes the game device 10 , the controller 20 as an input section, an information storage medium 180 , a display section (display device) 190 , a speaker 192 , and the light sources 198 R and 198 L.
  • the controller 20 includes the acceleration sensor 210 , the imaging section 220 , a speaker 230 , a vibration section 240 , a microcomputer 250 , and a communication section 260 .
  • the controller 20 may include an image input sensor, a sound input sensor, and a pressure sensor.
  • the acceleration sensor 210 detects the accelerations in three axial directions (X axis, Y axis, and Z axis). Specifically, the acceleration sensor 210 detects the accelerations in the vertical direction, the horizontal direction, and the backward or forward direction. The acceleration sensor 210 detects the accelerations at intervals of 5 msec. The acceleration sensor 210 may detect the accelerations in one axis, two axes, or six axes. The accelerations detected by the acceleration sensor are transmitted to the game device through the communication section 260 .
  • the imaging section 220 includes an infrared filter 222 , a lens 224 , an imaging element (image sensor) 226 , and an image processing circuit 228 .
  • the infrared filter 222 is disposed on the front side of the controller, and allows only the infrared radiation contained in light incident from the light source 198 disposed in association with the display section 190 to pass through.
  • the lens 224 condenses the infrared radiation that has passed through the infrared filter 222 , and emits the infrared radiation to the imaging element 226 .
  • the imaging element 226 is a solid-state imaging element such as a CMOS sensor or a CCD.
  • the imaging element 226 images the infrared radiation condensed by the lens 224 to generate a captured image.
  • the image processing circuit 228 processes the captured image generated by the imaging element 226.
  • the image processing circuit 228 processes the captured image from the imaging element 226 to detect a high luminance component, and detects light source position information (specific position) in the captured image.
  • the image processing circuit 228 detects the position information relating to the plurality of light sources in the captured image.
  • the detected position information is transmitted to the game device through the communication section 260 .
  • the controller 20 may be utilized as a pointing device that points at a position (position information) on the game screen.
  • the speaker 230 outputs sound acquired from the game device through the communication section 260 .
  • the speaker 230 outputs confirmation sound transmitted from the game device or effect sound corresponding to motion.
  • the vibration section (vibrator) 240 receives a vibration signal transmitted from the game device, and operates based on the vibration signal.
  • the microcomputer 250 outputs sound or operates the vibrator based on data received from the game device.
  • the microcomputer 250 causes the accelerations detected by the acceleration sensor 210 to be transmitted to the game device through the communication section 260 , or causes the position information detected by the imaging section 220 to be transmitted to the game device 10 through the communication section 260 .
  • the communication section 260 includes an antenna and a wireless module.
  • the communication section 260 exchanges data with the game device via wireless communication using the Bluetooth (registered trademark) technology, for example.
  • the communication section 260 according to this embodiment transmits the accelerations detected by the acceleration sensor 210 , the position information detected by the imaging section 220 , and the like to the game device at alternate intervals of 4 msec and 6 msec.
  • the communication section 260 may be connected to the game device via a communication cable, and exchange information with the game device via the communication cable.
  • the controller 20 may also include operating sections such as a button, a lever (analog pad), a mouse, an arrow key, and a touch panel display.
  • the controller 20 may include a gyrosensor that detects the angular velocity which changes due to the input operation of the player.
  • the game device 10 according to this embodiment is described below.
  • the game device 10 includes a storage section 170 , a processing section 100 , and a communication section 196 .
  • the storage section 170 serves as a work area for the processing section 100 , the communication section 194 , and the like.
  • the function of the storage section 170 may be implemented by hardware such as a RAM (VRAM).
  • the storage section 170 includes a main storage section 172 , a drawing buffer 174 , and a sound data storage section 176 .
  • the main storage section 172 serves as a work area for the processing section 100 , the communication section 194 , and the like.
  • the function of the main storage section 172 may be implemented by hardware such as a RAM (VRAM).
  • the main storage section 172 includes a storage area 173 that stores first and second type determination databases described later.
  • the drawing buffer 174 stores an image generated by a drawing section 120 .
  • the sound data storage section 176 stores confirmation sound that indicates the reaction of the controller to the input operation of the player, and effect sound output along with a game calculation process.
  • the sound data storage section 176 stores a plurality of types of confirmation sound corresponding to detected information.
  • the sound data storage section 176 stores a plurality of types of effect sound corresponding to motion and a given event.
  • the processing section 100 performs various processes according to this embodiment based on a program (data) stored in (read from) the information storage medium 180 .
  • the information storage medium 180 stores a program that causes a computer to function as each section according to this embodiment (i.e., a program that causes a computer to perform the process of each section).
  • the information storage medium 180 includes a memory card that stores a player's personal data, game save data, and the like.
  • the communication section 196 can communicate with another game device through a network (Internet).
  • the function of the communication section 196 may be implemented by hardware such as a processor, a communication ASIC, or a network interface card, by a program, or the like.
  • the communication section 196 can perform cable communication and wireless communication.
  • the communication section 196 includes an antenna and a wireless module, and exchanges data with the communication section 260 of the controller 20 using the Bluetooth (registered trademark) technology, for example.
  • the communication section 196 transmits sound data (e.g., confirmation sound and effect sound) and the vibration signal to the controller, and receives information detected by the acceleration sensor and the image sensor of the controller 20 at alternate intervals of 4 msec and 6 msec.
  • a program (data) that causes a computer to function as each section according to this embodiment may be distributed to the information storage medium 180 (or the storage section 170 ) from a storage section or an information storage medium included in a server through a network.
  • Use of the information storage medium of the server is also included within the scope of the invention.
  • the processing section 100 performs a game calculation process, an image generation process, and a sound control process based on detected information received from the controller 20 , a program loaded into the storage section 170 from the information storage medium 180 , and the like.
  • the processing section 100 functions as an instruction image generation section 102 , a pointing position instruction section 104 , a detection/determination section 110 , a game calculation section 112 , a drawing section 120 , a sound control section 130 , and a vibration control section 140 .
  • the instruction image generation section 102 generates an instruction image that instructs a given movement of the controller 20 on the game screen. Specifically, the instruction image generation section 102 generates the instruction image 340 that instructs the moving direction and the moving timing of the controller 20 as the given movement of the controller 20 .
  • the pointing position instruction section 104 generates a position instruction image (pointing instruction image) 350 that instructs the player to point a predetermined position on the game screen at a given timing during the game.
  • the detection/determination section 110 detects the movement of the controller 20 based on information obtained from the acceleration sensor 210 of the controller 20 , and determines the degree of conformity of the detected movement of the controller 20 with the movement instructed by the instruction image.
  • the detection/determination section 110 detects the pointing position of the controller 20 on the game screen based on information from the imaging section 220 of the controller 20 , and determines whether or not the detected pointing position has been appropriately pointed at the predetermined timing instructed by the position instruction image.
  • the game calculation section 112 performs game calculations based on the determination result of the detection/determination section 110 and a given program.
  • the game calculation section 112 disposes various objects (i.e., objects formed by a primitive such as a polygon, free-form surface, or subdivision surface) that represent display objects such as a character (player character or enemy character), a moving body (e.g., car or airplane), a building, a tree, a pillar, a wall, or a map (topography) in an object space.
  • the game calculation section 112 determines the position and the rotational angle (synonymous with orientation or direction) of the object in a world coordinate system, and disposes the object at the determined position (X, Y, Z) and the determined rotational angle (rotational angles around X, Y, and Z axes).
  • the game calculation section 112 controls a virtual camera (viewpoint) for generating an image viewed from a given (arbitrary) viewpoint in the object space. Specifically, the game calculation section 112 controls the position (X, Y, Z) or the rotational angle (rotational angles around X, Y, and Z axes) of the virtual camera (controls the viewpoint position, the line-of-sight direction, or the angle of view).
  • the game calculation section 112 controls the position or the rotational angle (direction) of the virtual camera so that the virtual camera follows a change in position or rotation of the object.
  • the game calculation section 112 may control the virtual camera based on information such as the position, the rotational angle, or the speed of the object obtained by a motion generation section 124 described later.
  • the game calculation section 112 may rotate the virtual camera at a predetermined rotational angle, or move the virtual camera along a predetermined path.
  • the game calculation section 112 controls the virtual camera based on virtual camera data for specifying the position (path) or the rotational angle of the virtual camera.
  • the game calculation section 112 calculates the movement/motion (movement/motion simulation) of a model (e.g., character, car, or airplane). Specifically, the game calculation section 112 causes the model to move in the object space or causes the object to perform a motion (animation) based on detected information determined to satisfy a predetermined condition, a program (movement/motion algorithm), motion data, and the like. Specifically, the game calculation section 112 performs a simulation process that sequentially calculates movement information (position, rotational angle, speed, or acceleration) and motion information (position or rotational angle of each part that forms the object) of the object in frame ( 1/60 sec) units. Note that the term “frame” refers to a time unit when performing the object movement/motion process (simulation process) and the image generation process.
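  • As a hedged illustration of the frame-by-frame simulation process described in the preceding item, the following sketch (in Python; the class name, the simple Euler integration, and the sample numbers are assumptions made for illustration, not the patented implementation) sequentially updates movement information once per 1/60-second frame:

        FRAME_TIME = 1.0 / 60.0  # one frame = 1/60 sec, as stated above

        class MovingObject:
            def __init__(self, position, velocity):
                self.position = list(position)  # (X, Y, Z) in the world coordinate system
                self.velocity = list(velocity)  # moving amount per unit time

            def update(self, acceleration):
                # Sequentially calculate the movement information of the object in frame units.
                for axis in range(3):
                    self.velocity[axis] += acceleration[axis] * FRAME_TIME
                    self.position[axis] += self.velocity[axis] * FRAME_TIME

        # usage: advance a character one frame under gravity
        dancer = MovingObject(position=(0.0, 1.0, 0.0), velocity=(0.0, 0.0, 0.0))
        dancer.update(acceleration=(0.0, -9.8, 0.0))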
  • the drawing section 120 performs a drawing process based on the results of various processes (game calculation process) performed by the processing section 100 to generate an image, and outputs the image to the display section 190 .
  • display object data (object data or model data) including vertex data (e.g., vertex position coordinates, texture coordinates, color data, normal vector, or alpha value) is input to the drawing section 120.
  • the drawing section 120 performs a vertex process based on the vertex data included in the input display object data.
  • the drawing section 120 may perform a vertex generation process (tessellation, curved surface division, or polygon division) for dividing the polygon, if necessary.
  • the drawing section 120 performs a vertex movement process and a geometric process such as coordinate transformation (world coordinate transformation or camera coordinate transformation), clipping, perspective transformation, or a light source process, and changes (updates or adjusts) the vertex data relating to the vertices that form the display object based on the processing results.
  • the drawing section 120 performs rasterization (scan conversion) based on the vertex data after the vertex process so that the surface of the polygon (primitive) is associated with pixels.
  • the drawing section 120 then performs a pixel process (fragment process) that draws pixels which form the image (fragments which form the display screen).
  • the drawing section 120 determines the final pixel drawing color by performing various processes such as a texture reading (texture mapping) process, a color data setting/change process, a translucent blending process, and an anti-aliasing process, and outputs (draws) the drawing color of the object subjected to perspective transformation to (in) the drawing buffer 174 (i.e., a buffer that can store image information in pixel units; VRAM or rendering target).
  • the pixel process includes a per-pixel process that sets or changes the image information (e.g., color, normal, luminance, and alpha value) in pixel units.
  • an image may be generated so that images (divided images) viewed from the respective virtual cameras can be displayed on one screen.
  • the vertex process and the pixel process performed by the drawing section 120 may be implemented by hardware that enables a programmable polygon (primitive) drawing process (i.e., programmable shader (vertex shader and pixel shader)) based on a shader program written using a shading language.
  • the programmable shader enables a programmable per-vertex process and per-pixel process to increase the degree of freedom relating to the drawing process so that the representation capability is significantly improved as compared with a fixed hardware drawing process.
  • the drawing section 120 performs a geometric process, a texture mapping process, a hidden surface removal process, an alpha blending process, and the like when drawing the display object.
  • the display object is subjected to a coordinate transformation process, a clipping process, a perspective transformation process, a light source calculation process, and the like.
  • the display object data (e.g., the display object's vertex position coordinates, texture coordinates, color data (luminance data), normal vector, or alpha value) after the geometric process (after perspective transformation) is stored in the main storage section 172.
  • the term “texture mapping process” refers to a process for mapping a texture (texel value) stored in the storage section 170 on the display object.
  • the drawing section 120 reads a texture (surface properties such as color (RGB) and alpha value) from the storage section 170 using the texture coordinates set (assigned) to the vertices of the display object, for example.
  • the drawing section 120 maps the texture (two-dimensional image) on the display object.
  • the drawing section 120 performs a pixel-texel association process, bilinear interpolation (texel interpolation), and the like.
  • the drawing section 120 may perform a hidden surface removal process by a Z buffer method (depth comparison method or Z test) using a Z buffer (depth buffer) that stores the Z value (depth information) of the drawing pixel.
  • the drawing section 120 refers to the Z value stored in the Z buffer when drawing the drawing pixel corresponding to the primitive of the object.
  • the drawing section 120 compares the Z value stored in the Z buffer with the Z value of the drawing pixel of the primitive.
  • when the Z value of the drawing pixel is nearer to the virtual camera than the Z value stored in the Z buffer, the drawing section 120 draws the drawing pixel and updates the Z value stored in the Z buffer with a new Z value.
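  • A minimal sketch of the Z test described in the preceding items, assuming a convention in which a smaller Z value is nearer to the virtual camera (the buffer layout and names are illustrative assumptions, not the actual drawing hardware):

        def draw_pixel(x, y, z, color, z_buffer, color_buffer):
            # Compare the stored Z value with the Z value of the drawing pixel.
            if z < z_buffer[y][x]:
                # The new pixel is nearer: draw it and update the Z buffer.
                color_buffer[y][x] = color
                z_buffer[y][x] = z

        # usage: a 2x2 buffer initialized to "infinitely far"
        z_buf = [[float("inf")] * 2 for _ in range(2)]
        c_buf = [[(0, 0, 0)] * 2 for _ in range(2)]
        draw_pixel(0, 0, 0.5, (255, 0, 0), z_buf, c_buf)
        draw_pixel(0, 0, 0.9, (0, 255, 0), z_buf, c_buf)  # hidden: farther than 0.5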
  • the term "alpha blending process" refers to a translucent blending process (e.g., normal alpha blending, additive alpha blending, or subtractive alpha blending) based on an alpha value (A value).
  • the drawing section 120 calculates a color obtained by blending two colors by performing linear interpolation using the alpha value as the degree of blending.
  • alpha value refers to information that can be stored while being associated with each pixel (texel or dot), such as additional information other than the color information that instructs the luminance of each RGB color component.
  • the alpha value may be used as mask information, translucency (equivalent to transparency or opacity), bump information, or the like.
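  • As a hedged illustration of normal alpha blending by linear interpolation as described above (the per-channel formula is the standard one; the function and variable names are assumptions):

        def alpha_blend(src, dst, alpha):
            # Blend source and destination colors using alpha (0.0..1.0) as the degree of blending.
            return tuple(alpha * s + (1.0 - alpha) * d for s, d in zip(src, dst))

        # usage: a half-transparent red drawn over blue
        print(alpha_blend((255, 0, 0), (0, 0, 255), 0.5))  # (127.5, 0.0, 127.5)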
  • the sound control section 130 causes at least one of the speaker 230 of the controller and the speaker 192 to output sound (including confirmation sound and effect sound) stored in the sound data storage section 176 based on the results of various processes (e.g., the determination process and the game calculation process) performed by the processing section 100 .
  • the sound control section 130 causes the speaker to output confirmation sound when the detection/determination section 110 has determined that the predetermined condition is satisfied.
  • the sound control section 130 may cause the speaker to output confirmation sound corresponding to the detected information.
  • the sound control section 130 may cause only the speaker 230 of the controller to output confirmation sound, and may cause the speaker 192 to output effect sound corresponding to the game calculation process (e.g., effect sound corresponding to the motion determined based on the detected information).
  • the vibration control section 140 causes the vibration section 240 of the controller to vibrate based on a predetermined condition.
  • the game system may be a system dedicated to a single-player mode in which only one player can play the game, or may be a system provided with a multi-player mode in which a plurality of players can play the game.
  • the game images and the game sound provided to the players may be generated using one game device and one display section.
  • the game images and the game sound may be generated by a distributed process using a plurality of game devices connected through a network (transmission line or communication line) or the like.
  • a determination as to whether or not a predetermined condition is satisfied based on the detected information, sound control based on the determination result, and vibration control are performed corresponding to the controller of each player.
  • the information storage medium 180 (computer-readable medium) stores a program, data, and the like.
  • the function of the information storage medium 180 may be implemented by hardware such as an optical disk (CD or DVD), a magneto-optical disk (MO), a magnetic disk, a hard disk, a magnetic tape, or a memory (ROM).
  • the display section 190 outputs an image generated by the processing section 100 .
  • the function of the display section 190 may be implemented by hardware such as a CRT display, a liquid crystal display (LCD), an organic EL display (OELD), a plasma display panel (PDP), a touch panel display, or a head mount display (HMD).
  • the speaker 192 outputs sound reproduced by the sound control section 130 .
  • the function of the speaker 192 may be implemented by hardware such as a speaker or a headphone.
  • the speaker 192 may be a speaker provided in the display section.
  • the speaker 192 may be a speaker provided in the television set.
  • the light source 198 is an LED that emits infrared radiation (i.e., invisible light), for example.
  • the light source 198 is disposed while being associated with the display section 190 .
  • a plurality of light sources (light source 198 R and light source 198 L) are provided.
  • the light source 198R and the light source 198L are disposed at a predetermined interval.
  • FIG. 5 shows an example of the game screen displayed according to this embodiment.
  • the game executed by the game system according to this embodiment is configured so that the player plays the leader of a cheerleading dance team and leads the dance of the entire team by giving dance instructions to the team members, aiming to make the dance a success.
  • floral beat characters 330 that instruct the beat of background music (BGM) are displayed on the game screen.
  • the beat characters 330 blink on the beat.
  • a main dancer 310 who holds pompons with both hands and a plurality of sub dancers 312 - 1 and 312 - 2 positioned behind the main dancer 310 are displayed on the game screen.
  • the main dancer 310 is a player character that reproduces a dance corresponding to the operation of the player as a cheerleader.
  • the instruction image generation section 102 generates and displays a pair of instruction images 340 - 1 and 340 - 2 that respectively instruct the movements of the first controller 20 - 1 and the second controller 20 - 2 held by the player with the progress of the game.
  • the instruction image 340 is displayed at given time intervals in a predetermined order with the progress of the game.
  • the instruction images 340 - 1 and 340 - 2 that respectively instruct the movements of the first controller 20 - 1 and the second controller 20 - 2 held by the player are displayed on either side of the main dancer 310 that is a player character.
  • the instruction image 340 - 1 instructs the operation of the first controller 20 - 1 held by the player with the right hand
  • the instruction image 340 - 2 instructs the operation of the second controller 20 - 2 held by the player with the left hand.
  • the instruction images 340 - 1 and 340 - 2 are displayed at given positions on the game screen with the progress of the game.
  • the instruction images 340 - 1 and 340 - 2 give instructions to the player with regard to given movements (i.e., moving direction, moving timing, and moving duration) of the controllers 20 - 1 and 20 - 2 .
  • the instruction image 340 includes a trace rail 342 (or a moving path instruction image part) that instructs a moving direction of the controller, a timing mark 344 (or a timing instruction image part) that instructs the moving timing, and an operation finish mark 346 that instructs the expiration of the moving duration.
  • the timing mark 344 is displayed on one end of the trace rail 342, and the operation finish mark 346 is displayed on the other end of the trace rail 342 in a fixed state.
  • the timing mark 344 starts to move along the trace rail 342 in the moving direction of the controller 20 in synchronization with the operation start timing of the controller 20 , and reaches the operation finish mark 346 at the finish timing of the moving duration.
  • the player can determine the moving timing and the moving direction of the controller by the trace rail 342 and the timing mark 344 that moves along the trace rail 342 , and can determine the expiration of the operation duration when the timing mark 344 has reached the operation finish mark 346 .
  • the next timing mark 344 may be displayed on the identical trace rail 342 and moved to give the identical movement instruction to the player.
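  • Purely as an illustration of the relationship between the trace rail, the timing mark, and the moving duration described above (the interpolation scheme and all names are assumptions, not taken from this embodiment), the on-screen position of the timing mark could be computed as follows:

        def timing_mark_position(start_point, finish_point, start_time, duration, now):
            # Clamp progress to [0, 1]; 1.0 means the mark has reached the operation finish mark.
            t = max(0.0, min(1.0, (now - start_time) / duration))
            return tuple(s + (f - s) * t for s, f in zip(start_point, finish_point))

        # usage: a rail that instructs an upward movement lasting 0.5 sec
        print(timing_mark_position((100, 300), (100, 100), start_time=2.0, duration=0.5, now=2.25))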
  • the trace rail 342 instructs the upward movement of the controller.
  • the instruction image 340 that instructs the movement in another reference direction (e.g., right direction, left direction, diagonally right upward direction, diagonally left upward direction, downward direction, right downward direction, left downward direction, and backward or forward direction (depth direction or front direction)) is generated and displayed.
  • the instruction image 340 that instructs the movement in the backward or forward direction may be formed by displaying the trace rail 342 in the game space displayed on the game screen in the forward direction and moving the timing mark 344 along the trace rail 342 in the forward direction or the backward direction, for example.
  • the instruction image 340 that instructs the player to make a right turn is displayed in FIGS. 7A and 7B
  • the instruction image 340 that instructs the player to perform a punch operation (i.e., move the controller forward) is displayed in FIG. 7C.
  • the instruction images 340 that instruct the player who gets into the rhythm of the background music to make a dance action are displayed one after another with the progress of the game.
  • the main dancer 310 who dances corresponding to the dance action instructed by the instruction image 340 appears on the game screen, and the main dancer 310 and the sub dancers 312 dance to the background music.
  • the player moves the first controller 20 - 1 and the second controller 20 - 2 (i.e., both hands) in real space in accordance with the instructions given by the instruction images 340 - 1 and 340 - 2 displayed one after another while listening to the background music and watching the main dancer 310 displayed on the game screen to enjoy dancing to the rhythm as if the player were the leader of the cheerleading dance team.
  • a set of moving path instruction image parts that instruct a continuous moving path is generated and displayed by combining a plurality of trace rails (i.e., moving path instruction image parts) so that complex movement instructions can be given to the player.
  • FIG. 8A shows an example of the instruction image 340 that instructs the player to move the controller 20 almost in the shape of the letter “8”.
  • Four trace rails 342 - 1 to 342 - 4 that differ in moving direction are displayed in combination.
  • the timing mark 344 sequentially moves along the four trace rails 342 - 1 to 342 - 4 so that the player can easily and visually determine the moving direction, the moving timing, and the moving duration of the corresponding controller.
  • FIG. 8B shows the instruction image 340 that instructs the player to circularly move the controller 20 clockwise by combining a plurality of arc-shaped trace rails 342 - 1 and 342 - 2 .
  • the instruction image shown in FIG. 8A can instruct the player to perform a dance operation that moves the pompon in the shape of the letter “8”, and the instruction image shown in FIG. 8B can instruct the player to perform a dance operation that swings the pompon clockwise.
  • FIGS. 9A to 9D show an example of the instruction image 340 that allows the player to be more easily notified of the moving start timing of the controller 20 .
  • the trace rail 342 (or a moving path instruction image part) and the timing mark 344 (or a timing instruction image part) are displayed as a transition image that changes from a previous notice display that is displayed before the moving start timing to a main display when the moving start timing has been reached.
  • the trace rail 342 shown in FIG. 9A is displayed in advance as a thin dotted line two beats before the moving start timing of the controller.
  • the trace rail 342 is displayed as a solid line (see FIG. 9B) one beat before the moving start timing, and the timing mark 344 is displayed to notify the player that the operation will start shortly.
  • the trace rail 342 is displayed in a shape that instructs that the operation timing has been reached, and the timing mark 344 moves along the trace rail 342 from the movement start point to the movement finish point. The player is thus instructed to move the controller 20 in the direction instructed by the trace rail 342 in synchronization with the movement of the timing mark 344 .
  • the path sequentially disappears along with the movement of the timing mark 344 , and the expiration of the moving duration is instructed when the timing mark 344 has reached the operation finish mark 346 .
  • the player can be appropriately notified of the moving start timing and the moving direction using the transition image that changes from the previous notice display before the moving start timing to the main display so that the player can make preparations for moving the controller 20 in the instructed direction.
  • the transition image may be displayed as an image that changes from a transparent display to the main display, or as an image that moves toward the display position along the depth direction while changing its transparency or size.
  • An appearance of the trace rail 342 (i.e., the moving path instruction image part) and an appearance of the timing mark 344 (i.e., the timing instruction image part) may be changed according to the timing of the instruction given to the player.
  • At least one of the color, form, and size of the timing mark 344 may be changed so that the visibility and the production effects of the timing mark 344 are gradually improved as it approaches the operation finish mark 346.
  • At least one of color, form, and size of at least one of the trace rail 342 and the timing mark 344 may be changed so that the visibility and the production effects are improved.
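  • The staged display described above (a thin dotted trace rail two beats before the moving start timing, a solid trace rail one beat before, and the main display when the timing is reached) might be driven by a simple state function such as the following sketch; the beat bookkeeping and state names are assumptions for illustration only:

        def instruction_display_state(beats_until_start):
            # Returns which appearance of the trace rail / timing mark to draw.
            if beats_until_start > 2:
                return "hidden"
            if beats_until_start > 1:
                return "previous_notice_dotted"   # thin dotted trace rail
            if beats_until_start > 0:
                return "previous_notice_solid"    # solid trace rail, timing mark shown
            return "main_display"                 # timing mark moves along the rail

        # usage
        for b in (3.0, 1.5, 0.5, 0.0):
            print(b, instruction_display_state(b))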
  • the detection/determination section 110 detects the actual movement of the controller 20 performed by the player, and determines the degree of conformity of the detected movement of the controller with the movement instructed by the instruction image.
  • the moving direction and the moving start timing of the controller 20 may be calculated based on the accelerations detected by the acceleration sensor 210 .
  • the acceleration sensor 210 detects the accelerations in three axial directions (x axis, y axis, and z axis) of the coordinate system based on the controller 20.
  • FIG. 10A is a diagram showing the gravitational acceleration in a real space coordinate system
  • FIG. 10B is a diagram showing the gravitational acceleration in the coordinate system of the controller 20 based on the controller 20 .
  • a gravity of 1 G is applied in the Y axis (downward) direction (gravitational acceleration direction).
  • a gravity of 1 G is also applied in the downward direction along the y axis in the coordinate system of the controller 20 .
  • a gravitational acceleration of 1 G is applied in the Y-axis direction in the real space coordinate system (FIG. 10A).
  • when the controller 20 is inclined, the gravitational acceleration of 1 G is decomposed into an x-axis component and a y-axis component in the coordinate system of the controller 20.
  • when the controller 20 is inclined by 45°, the x-axis component and the y-axis component are respectively 1/√2 G.
  • the inclination of the controller 20 is thus detected utilizing the gravitational acceleration.
  • as shown in FIG. 10B, when an acceleration of 1/√2 G has been detected in the negative direction along the x axis and an acceleration of 1/√2 G has been detected in the negative direction along the y axis, it is possible to detect that the controller 20 is inclined by 45° counterclockwise in the real space coordinate system (XY axes).
  • since the accelerations in three axial directions can be detected, the three-dimensional inclination of the controller 20 in real space can be calculated from the acceleration in each axial direction.
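  • As a sketch of the inclination calculation suggested by FIGS. 10A and 10B (the use of atan2 and the sign conventions are assumptions chosen for illustration):

        import math

        def controller_roll_degrees(ax, ay):
            # With the controller at rest, (ax, ay) is the gravitational acceleration
            # expressed in the controller coordinate system; the roll angle around the
            # z axis follows from its direction.
            return math.degrees(math.atan2(-ax, -ay))

        # usage: 1/sqrt(2) G on both axes (negative direction) ~= 45 degrees, as in the text
        g = 1.0 / math.sqrt(2.0)
        print(controller_roll_degrees(-g, -g))  # approx. 45.0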
  • the accelerations in three axial directions (X axis, Y axis, and Z axis) output from the acceleration sensor 210 differ between the case where the player vertically holds the controller 20 and moves the controller 20 in a given direction (e.g., rightward) and the case where the player horizontally holds the controller 20 and moves the controller 20 in the given direction.
  • a first type determination database for determining the position of the controller in real space is formed and stored in the storage area 173 of the main storage section 172, as shown in FIG. 11.
  • the position of the controller 20 in real space is classified into a plurality of basic positions taking variations in position when the player holds the controller 20 into consideration.
  • the position of the controller 20 in real space is classified into a vertical position, a diagonal rightward position, a diagonal leftward position, a horizontal position, and other basic positions.
  • the outputs of the acceleration sensor in the x, y, and z axial directions in the controller coordinate system are stored corresponding to each basic position. Therefore, the first process that determines the position of the controller 20 in real space can be performed based on the outputs of the acceleration sensor 210 in the x, y, and z axial directions.
  • the position of the controller 20 cannot be determined from the exact basic position data alone when the player holds the controller 20 in a position that differs from the basic position to some extent.
  • since the game according to this embodiment aims at a wide range of users (i.e., from children to adults), it is necessary to form the first type determination database so that a position that differs from a basic position within a predetermined allowable range is determined to be the corresponding basic position even if the player holds the controller 20 in a position that differs from the basic position to some extent.
  • the signals output from the acceleration sensor when the position of the controller 20 differs from a specific basic position within the allowable range are also collected as the sensor output corresponding to the basic position and stored in the database.
  • the data corresponding to each basic position within the predetermined allowable range is thus collected and stored in the first type determination database.
  • the signals in the x, y, and z axial directions output from the acceleration sensor of the controller 20 when moving the controller 20 in the moving direction instructed by the instruction image 340 are compared with the first type determination database, and the basic position that coincides with the position of the controller that is moved in the moving direction instructed by the instruction image 340 is determined (first determination process).
  • the position (e.g., vertical or horizontal with respect to the screen) of the controller 20 held by the player can be determined flexibly.
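  • A hedged sketch of the first determination process described above: the measured sensor output is compared against stored reference outputs for each basic position, and the nearest basic position within an allowable range is selected. The database contents, the distance measure, and the threshold below are assumptions, not the stored first type determination database:

        import math

        # Assumed excerpt of a first type determination database: basic position -> (x, y, z) at rest.
        FIRST_TYPE_DB = {
            "vertical":       (0.0, -1.0, 0.0),
            "horizontal":     (-1.0, 0.0, 0.0),
            "diagonal_right": (-0.7, -0.7, 0.0),
        }

        def determine_basic_position(sensor_xyz, allowable_range=0.4):
            best, best_dist = None, float("inf")
            for name, ref in FIRST_TYPE_DB.items():
                dist = math.dist(sensor_xyz, ref)
                if dist < best_dist:
                    best, best_dist = name, dist
            # Positions outside the allowable range are not assigned to any basic position.
            return best if best_dist <= allowable_range else None

        # usage: a controller held almost vertically
        print(determine_basic_position((0.05, -0.98, 0.1)))  # "vertical"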
  • a second type determination database shown in FIG. 12 is used to perform the second process.
  • the second type determination database is stored in the storage area 173 of the main storage section 172 .
  • the second type determination database is generated as follows.
  • the signals in the x, y, and z directions output from the acceleration sensor of the controller 20 when moving the controller 20 in the direction instructed by the instruction image 340 in real space are collected corresponding to each basic position of the controller 20 shown in FIG. 11 .
  • the signals output from the acceleration sensor 210 when the controller 20 is moved in each basic position are associated with the moving direction of the controller 20 in real space to create a second type determination database.
  • the player may move the controller 20 in a meandering path or a curved path with respect to the instructed direction. If it is determined that the movement along such a moving path does not conform to the movement in the instructed moving direction, a situation in which a wide range of users cannot enjoy the game may occur.
  • the signals in the x, y, and z axial directions output from the acceleration sensor 210 when moving the controller 20 in a specific basic position in the moving direction instructed by the instruction image are collected within the predetermined allowable range, and stored in the database.
  • the signals output from the acceleration sensor are collected as signals corresponding to the instructed basic position, and stored in the database.
  • data relating to the controller 20 held in each basic position is classified corresponding to each moving direction (i.e., rightward direction, diagonally right upward direction, upward direction, downward direction, diagonally right downward direction, diagonally left downward direction, forward direction, backward direction, clockwise direction, counterclockwise direction, and other directions), and the signals in the x, y, and z axial directions output from the acceleration sensor are collected within the predetermined allowable range, and stored in the database.
  • FIG. 12 shows the database corresponding to one basic position. Note that data is similarly collected and stored corresponding to other basic positions shown in FIG. 11 .
  • data corresponding to the determined position is compared with the signals output from the acceleration sensor of the controller 20 using the second type determination database to specify the movement (e.g., the moving direction) of the controller 20 .
  • a process that specifies the moving direction and the moving amount per unit time of the controller 20 held in the position determined by the first process is performed in real time.
  • the moving direction, the moving timing, and the moving duration of the controller 20 in real space are determined based on the moving direction and the moving amount per unit time of the controller 20 thus specified, and whether or not the movement coincides with the movement instructed by the instruction image 340 is evaluated.
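  • A minimal sketch, under stated assumptions, of the second determination process described above: once the basic position is known, the sensor signal is matched against per-direction reference data for that position, and the result is compared with the moving direction instructed by the instruction image. The template values and the scoring rule are illustrative only, not the stored second type determination database:

        # Assumed excerpt of a second type determination database for one basic position:
        # instructed moving direction -> representative acceleration direction (x, y, z).
        SECOND_TYPE_DB_VERTICAL = {
            "rightward": (1.0, 0.0, 0.0),
            "upward":    (0.0, 1.0, 0.0),
            "forward":   (0.0, 0.0, 1.0),
        }

        def dot(a, b):
            return sum(x * y for x, y in zip(a, b))

        def determine_moving_direction(sensor_xyz, db):
            # Pick the stored direction whose reference data best matches the measured signal.
            return max(db, key=lambda name: dot(sensor_xyz, db[name]))

        def conforms(sensor_xyz, instructed_direction, db, threshold=0.5):
            detected = determine_moving_direction(sensor_xyz, db)
            return detected == instructed_direction and dot(sensor_xyz, db[detected]) >= threshold

        # usage: the instruction image asked for an upward movement
        print(conforms((0.1, 0.9, 0.0), "upward", SECOND_TYPE_DB_VERTICAL))  # True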
  • the movements of the first controller 20 - 1 and the second controller 20 - 2 are individually evaluated with respect to the instructions given by the instruction images 340 - 1 and 340 - 2 shown in FIG. 4 .
  • the main dancer 310 displayed on the game screen operates to reproduce a dance corresponding to the operation of the player.
  • when the movements of both the first controller 20-1 and the second controller 20-2 have been determined to be good, the player's operation may be evaluated more highly than in the case where the movement of only one of the first controller 20-1 and the second controller 20-2 has been determined to be good.
  • An evaluation display area 322 that evaluates the operation of the player is provided on the game screen (upper right) shown in FIG. 5 . Characters “COOL” are displayed when the input direction and the input timing are correct, and characters “POOR” are displayed when the input direction or timing is incorrect.
  • Screens shown in FIGS. 13A to 13C may be displayed based on the evaluation result.
  • the cause of the incorrect operation of the controller 20 may be specified based on the incorrect determination result. As shown in FIGS. 13A and 13B , a screen that instructs that the input timing has been incorrect, or a screen that instructs that the controller has been moved inappropriately may be displayed.
  • the moving path of the controller 20 may be displayed near the corresponding instruction image 340 .
  • the player can thus visually observe the degree of similarity between the movement instructed by the instruction image 340 and the actual movement of the controller 20 so that the player can enjoy the game while further improving his dance skill.
  • the first process and the second process are performed.
  • the movement of the controller 20 is then detected, and whether or not the detected movement coincides with the instructed movement is then determined.
  • the velocity vector (i.e., moving amount per unit time) of the controller 20 may be calculated as a composite value of the velocity vectors in the x, y, and z axis directions obtained from the acceleration sensor 210 of the controller 20 .
  • even if the player has moved the controller 20 in the instructed direction at the moving start timing instructed by the instruction image 340, the velocity vector detected when starting the movement is small, and reaches a predetermined reference value only after several frames (e.g., in a frame T3 when the movement has been started in a frame T1). Therefore, a delay of a predetermined number of frames occurs between the timing at which the controller 20 has been operated and the timing at which the operation is detected.
  • a determination moving start timing and a determination moving duration are set taking the above-mentioned detection delay into consideration in addition to the moving start timing and the moving duration instructed by the instruction image 340 to determine the movement of the controller 20 .
  • the determination moving start timing is set at a timing delayed by several frames as compared with the moving start timing instructed by the instruction image 340, and the determination moving duration is set either to coincide with the finish timing of the moving duration instructed by the instruction image 340 or to expire a predetermined number of frames after that moving duration.
  • the degree of difficulty in the game may be set by appropriately setting the determination moving start timing and the determination moving duration.
  • the degree of difficulty can be decreased by increasing the delay of the determination moving start timing with respect to the moving start timing instructed by the instruction image, and can be increased by decreasing the delay of the determination moving start timing with respect to the moving start timing instructed by the instruction image.
  • the degree of difficulty can be increased by decreasing the determination moving duration, and can be decreased by increasing the determination moving duration.
  • the determination moving start timing and the moving duration may be set corresponding to the degree of difficulty selected by the player, for example.
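  • The relationship between the determination moving start timing, the determination moving duration, and the degree of difficulty described above might be expressed as in this sketch; the frame counts and the difficulty table are assumptions chosen only to illustrate that a larger delay and a longer window make the game easier:

        def determination_window(instructed_start_frame, instructed_duration_frames, difficulty):
            # A larger delay and a longer window make the game easier; a smaller delay
            # and a shorter window make it harder.
            delay = {"easy": 6, "normal": 4, "hard": 2}[difficulty]
            extra = {"easy": 6, "normal": 3, "hard": 0}[difficulty]
            start = instructed_start_frame + delay
            end = instructed_start_frame + instructed_duration_frames + extra
            return start, end

        def movement_accepted(detected_frame, instructed_start_frame, instructed_duration_frames, difficulty):
            start, end = determination_window(instructed_start_frame, instructed_duration_frames, difficulty)
            return start <= detected_frame <= end

        # usage: movement detected 5 frames after the instructed start, 30-frame duration
        print(movement_accepted(105, 100, 30, "normal"))  # True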
  • the pointing position instruction section 104 generates the position instruction image (pointing instruction image) 350 that instructs the player to point a predetermined position on the game screen at a given timing with the progress of the game.
  • FIGS. 6A to 6C show specific examples of the pointing instruction image 350 displayed on the game screen.
  • the pointing instruction image 350 is displayed corresponding to the character on the left of the game screen.
  • the pointing instruction image 350 includes a target board section 352 displayed at the pointing position, and a ring section 354 displayed around the target board section 352 .
  • the ring section 354 is displayed as a large ring that encloses the target board section 352 immediately after the pointing instruction image 350 has been displayed.
  • the ring section 354 shrinks with the lapse of time (see FIGS. 15A to 15D ).
  • the ring section 354 overlaps the target board section 352 when the pointing timing has been reached.
  • the player points the controller 20 (the first controller 20 - 1 held with the right hand in FIGS. 6A to 6C ) at the position of the target board section 352 displayed on the game screen within a predetermined period in accordance with the pointing timing instruction.
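  • As an illustration only (the linear shrink rate and the timing tolerance are assumptions), the ring section's radius and the pointing timing could be related as follows:

        def ring_radius(initial_radius, target_radius, appear_time, point_time, now):
            # The ring shrinks linearly from its initial size to the size of the target
            # board section, which it reaches exactly at the pointing timing.
            t = max(0.0, min(1.0, (now - appear_time) / (point_time - appear_time)))
            return initial_radius + (target_radius - initial_radius) * t

        def is_pointing_timing(point_time, now, tolerance=0.15):
            # The player should point within a small window around the instant the ring
            # overlaps the target board section.
            return abs(now - point_time) <= tolerance

        # usage
        print(ring_radius(120.0, 30.0, appear_time=0.0, point_time=2.0, now=1.0))  # 75.0
        print(is_pointing_timing(point_time=2.0, now=1.95))  # True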
  • the light sources 198 L and 198 R are provided around the display section 12 , as described above.
  • the player directs the imaging section 220 provided on the front side of the controller 20 toward the game screen to operate the controller 20 as a pointing device that points an arbitrary point on the game screen.
  • FIGS. 14A and 14B show an example of a state in which the player points the desired position on the game screen using the controller 20.
  • the detection/determination section 110 determines whether or not the player has pointed the target board section 352 instructed by the pointing instruction image 350 at the instructed timing based on a signal acquired from the imaging section 220 of the controller 20 .
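  • A hedged sketch of how a screen pointing position might be derived from the two infrared light sources captured by the controller's image sensor; the linear mapping, the axis inversion, and all constants are assumptions, since the calculation is not specified at this level of detail in the text:

        def pointing_position(spot_left, spot_right, sensor_size=(1024, 768), screen_size=(1280, 720)):
            # The midpoint of the two detected light-source spots, measured in the image
            # sensor's coordinate system, indicates where the controller is aimed.
            mid_x = (spot_left[0] + spot_right[0]) / 2.0
            mid_y = (spot_left[1] + spot_right[1]) / 2.0
            # Aiming right moves the spots left in the sensor image, so invert the axes
            # when mapping sensor coordinates to screen coordinates.
            sx = (1.0 - mid_x / sensor_size[0]) * screen_size[0]
            sy = (1.0 - mid_y / sensor_size[1]) * screen_size[1]
            return sx, sy

        # usage: spots detected slightly left of the sensor center -> pointing right of screen center
        print(pointing_position((400, 384), (480, 384)))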
  • when the player has pointed the target board section 352 at the instructed timing, a predetermined event (e.g., the backing dancer is raised as shown in FIG. 6B) occurs.
  • a new pointing instruction image 350 is displayed at the center of the game screen, as shown in FIG. 6B .
  • the instruction images 340 - 1 and 340 - 2 that instruct a given movement of the controller are displayed corresponding to the new pointing instruction image 350 .
  • the timing marks 344 of the instruction images 340-1 and 340-2 move along the trace rails 342 when a specific period of time has elapsed, to instruct the player to perform given movements of the controllers 20-1 and 20-2.
  • a screen in which an acrobatic dance has succeeded is displayed, as shown in FIG. 6C , for example.
  • a game event in which the number of backing dancers appearing on the screen is successively increased may be generated to increase the interest of the game, for example.
  • a multi-player mode (two-player mode or four-player mode) can be selected when starting the game.
  • FIG. 17A shows an example of a game screen when a two-player mode has been selected.
  • the two players respectively hold the first controller 20-1 and the second controller 20-2.
  • the players operate the controllers 20 - 1 and 20 - 2 in accordance with the instructions given by the instruction images 340 - 1 and 340 - 2 shown in FIG. 17A , and compete for the dance skill.
  • Each player holds the corresponding controller 20 , and observes the instruction image 340 displayed on the game screen.
  • two players corresponding to two spotlighted dance characters are given instructions on the movement of the controller 20 using the instruction images 340 - 1 and 340 - 2 .
  • the player visually recognizes his turn when the dance character corresponding to the player is spotlighted.
  • the player moves the controller in accordance with the movement instructed by the corresponding instruction image 340 to compete for the dance skill.
  • FIG. 18 shows an operation example when applying this embodiment to a game system.
  • When the player has selected a single-player mode or a multi-player mode and started the game, the game calculation starts (step S10).
  • a cheerleading dance game screen is displayed on the display section 12 , and background music (dance music) corresponding to the dance is output.
  • instructions may be given to the player as to the position of the controller 20 .
  • instructions may be given to the player P as to whether to hold the controller 20 vertically or horizontally.
  • the instruction images 340 - 1 and 340 - 2 that instruct the movements of the controllers 20 - 1 and 20 - 2 are displayed at a given timing with the progress of the game.
  • the detection/determination process that determines whether or not the player has moved each of the controllers 20 - 1 and 20 - 2 in accordance with the instructions given by the instruction images 340 - 1 and 340 - 2 is then performed (steps S 12 and S 14 ).
  • the pointing instruction image 350 is displayed on the game screen, as shown in FIGS. 6A to 6C , and the detection/determination process that determines whether or not the player has pointed the controller 20 at the area of the target board section 352 instructed by the pointing instruction image 350 at the instructed timing is performed (steps S 20 and S 22 ).
  • The above-described process is repeated until the game ends (step S30).
  • The final game result is displayed when the game ends (step S32).
  • the result display event based on the determination result occurs corresponding to each determination result of the detection/determination process performed in the steps S 14 and S 22 .
  • the total value (total score) of each determination result of the detection/determination process performed in the steps S 14 and S 22 is calculated and displayed as the final game result.
  • FIG. 19 shows a specific example of the process performed in the step S 14 shown in FIG. 18 .
  • the instruction images 340 - 1 and 340 - 2 are generated and displayed on the game screen (step S 40 ), and whether or not the player has accurately moved the controllers 20 - 1 and 20 - 2 in real space in accordance with the instructions given by the instruction images 340 - 1 and 340 - 2 is determined (step S 42 ).
  • the score 320 is updated based on the determination result, and characters “COOL” or “POOR” are displayed in the evaluation display area 322 .
  • the production image shown in FIGS. 13A to 13C is displayed corresponding to the evaluation result (step S 44 ).
  • the production image shown in FIGS. 13A to 13C may be displayed only in a predetermined scene during the game.
  • the production image shown in FIGS. 13A to 13C may be displayed only when the player or the operator has performed the production screen display setting before starting the game.
  • FIG. 20 shows a specific example of the process performed in the step S 40 shown in FIG. 19 .
  • the transition image shown in FIGS. 9A and 9B is displayed before the moving start timing of the controller 20. Therefore, the player can determine and prepare for the moving direction and the moving start timing of the controller before the moving start timing of the controller 20.
  • When the moving start timing of the controller has been reached (step S52), the timing mark 344 is moved toward the operation finish mark 346 along the trace rail 342, as shown in FIG. 9C.
  • the player moves the controller 20 in the direction instructed by the trace rail 342 at the moving timing of the timing mark 344 .
  • the player successively moves the controller 20 in accordance with the instructions until the timing mark 344 reaches the operation finish mark 346 .
  • When the successive movement finish timing has been reached (see FIG. 9D) (step S56), the display of the instruction image 340 is finished (step S58).
  • the above display control process makes it possible to give visual instructions to the player as to the movement of the controller in the desired direction at an appropriate timing.
  • FIG. 21 shows a specific example of the detection/determination process performed in the step S 42 shown in FIG. 19 .
  • When the moving start timing of the controller instructed by the instruction image 340 has been reached (FIG. 9C) (step S60), signals output from the acceleration sensor of the controller 20 are acquired (step S62), and the above-mentioned first process and second process are performed (steps S64 and S66).
  • the first process that determines the basic position which corresponds to the position of the controller 20 is performed, and the second process that determines the direction and the movement of the controller 20 and determines the degree of conformity with the movement instructed by the instruction image 340 is then performed.
  • FIG. 22 shows a specific example of the process performed in the step S 22 shown in FIG. 18 .
  • the pointing instruction image 350 is displayed at a given position of the game screen (step S 70 ).
  • the pointing instruction image 350 includes the target board section 352 that instructs the pointing area and the ring section 354 that shrinks toward the target board section 352 .
  • a timing at which the ring section 354 has shrunk to enclose the target board section 352 is the timing at which the player should point the controller 20 at the position instructed by the target board section 352 .
  • Whether or not the player has successfully pointed the controller 20 at the instructed position is determined (step S72).
  • An event corresponding to the pointing instruction is then generated (step S74).
  • In the example shown in FIG. 6A, an event in which the pointed backing dancer lifts the adjacent backing dancer is generated.
  • the instruction images 340 - 1 and 340 - 2 are displayed corresponding to the display position of the pointing instruction image 350 .
  • An operation that moves the controllers 20 - 1 and 20 - 2 in the direction instructed by the instruction images 340 - 1 and 340 - 2 is instructed after the player has successfully pointed the controller 20 at the instructed position, and the completion of an acrobatic dance shown in FIG. 6C is displayed when the player has successfully performed the successive operations.
  • the invention is not limited thereto.
  • the invention may be suitably applied to other applications, such as giving aerobic or exercise instructions to a student who holds a controller so that the student can perform appropriate aerobics or exercise and determining the result.
  • the invention is not limited thereto.
  • An arbitrary instruction image may be used insofar as a given movement of the controller can be instructed.
  • the color of a given path displayed by the trace rail 342 may be changed corresponding to the moving timing and the operation duration of the controller.
  • the moving direction may be displayed using an arrow or the like.
  • the operation start timing may be instructed by a countdown display, and the operation duration may also be instructed by a countdown display.

Abstract

A game system performs a process that generates and displays an instruction image that instructs a given movement of a controller including an acceleration sensor, and a process that acquires a signal from the acceleration sensor included in the controller to detect the movement of the controller, and determines the degree of conformity of the detected movement of the controller with the movement instructed by the instruction image.

Description

  • Japanese Patent Application No. 2007-237278, filed on Sep. 12, 2007, and Japanese Patent Application No. 2008-211603, filed on Aug. 20, 2008, are hereby incorporated by reference in their entirety.
  • BACKGROUND OF THE INVENTION
  • The present invention relates to a program, an information storage medium, a game system, and an input instruction device.
  • A game system which gives dance action instructions to the player so that the player enjoys dancing has been known. JP-A-2002-166051 and JP-A-2001-224729 disclose a game device which instructs movements in the forward direction, the leftward direction, the rightward direction, the diagonal leftward direction, and the diagonal rightward direction for the player on the game screen so that the player performs a Para Para dance.
  • However, since such a game device requires infrared sensors for detecting the movement of the player in the five directions at five positions around the player, the configuration becomes complicated and expensive.
  • In recent years, a game system in which a controller (input section) including a physical quantity sensor (e.g., acceleration sensor) and a game device main body are separately provided has been known. In such a game system, the player plays the game by performing an input operation of shaking the controller or an input operation of inclining the controller.
  • However, such a game system has not allowed the player to easily enjoy a given operation (e.g., dance game).
  • SUMMARY
  • According to a first aspect of the invention, there is provided a program that causes a computer to function as:
  • an instruction image generation section that generates an instruction image that instructs a given movement of a controller including a physical quantity sensor; and
  • a detection/determination section that acquires a signal from the physical quantity sensor included in the controller, detects the movement of the controller, and determines the degree of conformity of the detected movement of the controller with the movement instructed by the instruction image.
  • According to a second aspect of the invention, there is provided a computer-readable information storage medium storing the above-described program.
  • According to a third aspect of the invention, there is provided a game system comprising:
  • an instruction image generation section that generates an instruction image that instructs a given movement of a controller including a physical quantity sensor; and
  • a detection/determination section that acquires a signal from the physical quantity sensor included in the controller, detects the movement of the controller, and determines the degree of conformity of the detected movement of the controller with the movement instructed by the instruction image.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING
  • FIG. 1 is an explanatory view showing an example of a game system according to one embodiment of the invention.
  • FIG. 2 is an explanatory view showing an example of a controller according to one embodiment of the invention.
  • FIG. 3 is a diagram for describing the principle of pointing instruction performed by a controller according to one embodiment of the invention.
  • FIG. 4 is a functional block diagram showing a game system according to one embodiment of the invention.
  • FIG. 5 is an explanatory view showing an example of a game screen including an instruction image.
  • FIG. 6A to FIG. 6C are explanatory views showing a series of changes in a game screen including a pointing instruction image.
  • FIG. 7A to FIG. 7C are explanatory views showing examples of a change in a game screen including an instruction image that instructs a turn or punch.
  • FIGS. 8A and 8B are diagrams for describing examples of an instruction image formed by combining a plurality of moving path instruction image parts.
  • FIG. 9A to FIG. 9D are explanatory views showing an example of a change in an instruction image.
  • FIGS. 10A and 10B are diagrams for describing the controller position detection principle.
  • FIG. 11 is a table for describing first type determination data.
  • FIG. 12 is a list for describing second type determination data.
  • FIGS. 13A to 13C are explanatory views showing production screens based on a determination result.
  • FIGS. 14A and 14B are explanatory views showing a game screen pointing operation using a controller.
  • FIGS. 15A to 15D are diagrams for describing an example of a pointing instruction image.
  • FIG. 16A to FIG. 16C are explanatory views showing a change in a game production screen in which the number of backing dancers increases.
  • FIGS. 17A and 17B are explanatory views respectively showing a two-player mode and a four-player mode.
  • FIG. 18 is a flowchart showing an example of a process performed by a game system according to one embodiment of the invention.
  • FIG. 19 is a flowchart showing an example of a process performed in a step S14 in FIG. 18.
  • FIG. 20 is a flowchart showing an example of a process performed in a step S40 in FIG. 19.
  • FIG. 21 is a flowchart showing an example of a process performed in a step S42 in FIG. 19.
  • FIG. 22 is a flowchart showing an example of a process performed in a step S22 in FIG. 18.
  • DETAILED DESCRIPTION OF THE EMBODIMENT
  • The invention may provide an input instruction device, a program, an information storage medium, and a game system that can visually instruct the movement of a controller including a physical quantity sensor and appropriately determine whether or not the controller has been operated according to the instructed movement so that a player can easily and appropriately play the game by using the controller, for example.
  • (1) According to one embodiment of the invention, there is provided an input instruction device comprising:
  • an instruction image generation section that generates an instruction image that instructs a given movement of a controller including a physical quantity sensor; and
  • a detection/determination section that acquires a signal from the physical quantity sensor included in the controller, detects the movement of the controller, and determines the degree of conformity of the detected movement of the controller with the movement instructed by the instruction image.
  • According to another embodiment of the invention, there is provided a game system comprising the above-mentioned sections. According to still another embodiment of the invention, there is provided a program that causes a computer to function as the above-mentioned sections. According to a further embodiment of the invention, there is provided a computer-readable information storage medium storing a program that causes a computer to function as the above-mentioned sections.
  • The term “controller” refers to a controller that can be held by the user and moved in real space. The controller is preferably a controller that can be held by the user and moved in the vertical direction, the horizontal direction, the forward and backward direction, and the like, for example.
  • The type of the physical quantity sensor included in the controller is not particularly limited insofar as the physical quantity sensor detects a physical quantity from which the movement of the controller in real space can be determined (acquired). The physical quantity sensor is preferably formed as a sensor that detects a physical quantity from which a moving amount per unit time in an arbitrary moving direction can be detected. It is more preferable that the physical quantity sensor detect a three-dimensional movement in real space as moving amounts per unit time in three axial directions. The acceleration sensor 210 detects the accelerations in three-axis (X axis, Y axis, and Z axis) directions. An acceleration sensor that detects a three-dimensional movement as accelerations in X axis, Y axis, and Z axis directions that intersect perpendicularly may be used.
  • The term “instruction image that instructs the movement” refers to an instruction image that instructs the user who holds the controller to move the controller in real space in various ways.
  • According to these embodiments, the user who holds the controller moves the controller in real space in accordance with the movement instructed by the instruction image while observing the instruction image.
  • The detection/determination section acquires the signal from the physical quantity sensor of the controller to detect the movement of the controller in real space. The detection/determination section determines the degree of conformity of the detected movement of the controller in real space with the movement instructed by the instruction image.
  • In these embodiments, a section that transmits the determination result to the user may be provided, if necessary. The determination result may be transmitted as image data, may be transmitted aurally as a sound output, or may be transmitted as both image data and a sound output.
  • This enables the user to determine the accuracy of the movement of the controller in real space in accordance with the movement instructed by the instruction image.
  • The embodiments in which the movement of the controller is instructed may be applied to a game system. For example, the movement of the controller may be appropriately instructed for the player who holds the controller by using the instruction image, and a game production effect such as a given event may be generated based on the determination result as to whether or not the controller has been accurately moved in real space in accordance with the instructions. A game system that provides the player who holds the controller with given dance movement instructions as the instruction image so that the player can easily play the dance game may be implemented.
  • These embodiments may also be suitably applied to other applications, such as giving aerobic or exercise instructions to a student (or player) who holds the controller so that the student can perform appropriate aerobics or exercise and determining the result.
  • (2) In each of the input instruction device, the game system, the program and the information storage medium,
  • the physical quantity sensor may detect a physical quantity from which a moving direction and a moving amount per unit time can be derived;
  • the instruction image generation section may generate the instruction image that instructs a moving direction and a moving timing of the controller as the movement; and
  • the detection/determination section may acquire the signal from the physical quantity sensor, detect the moving direction and the moving timing of the controller, and determine the degree of conformity of the detected moving direction and moving timing of the controller with the instructions instructed by the instruction image.
  • This makes it possible to instruct the moving timing as the movement in addition to the moving direction of the controller.
  • (3) In each of the input instruction device, the game system, the program and the information storage medium,
  • the instruction image generation section may generate the instruction image that instructs a moving start timing as the moving timing and instructs a moving duration; and
  • the detection/determination section may acquire the signal from the physical quantity sensor, detect the moving direction, the moving start timing, and the moving duration of the controller, and determine the degree of conformity of the detected moving direction, moving start timing, and moving duration of the controller with the instructions instructed by the instruction image.
  • This makes it possible to prompt the user to successively move a hand or an arm in the instructed direction at a more appropriate timing.
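  • The following is a minimal sketch of such a determination; the field names, tolerances, and the 0-to-1 scoring rule are illustrative assumptions rather than the method defined by these embodiments.

```python
# Illustrative sketch only: grading a detected movement against one instructed
# movement. Field names, tolerances, and the 0-to-1 score are assumptions.
from dataclasses import dataclass


@dataclass
class InstructedMove:
    direction: str     # e.g. "right", "up", "forward"
    start_time: float  # seconds from the start of the tune
    duration: float    # seconds the movement should last


@dataclass
class DetectedMove:
    direction: str
    start_time: float
    duration: float


def conformity(instructed: InstructedMove, detected: DetectedMove,
               timing_tolerance: float = 0.2,
               duration_tolerance: float = 0.3) -> float:
    """Return a 0.0-1.0 score; the direction must match, timing and duration are graded."""
    if detected.direction != instructed.direction:
        return 0.0
    timing_score = max(0.0, 1.0 - abs(detected.start_time - instructed.start_time) / timing_tolerance)
    duration_score = max(0.0, 1.0 - abs(detected.duration - instructed.duration) / duration_tolerance)
    return 0.5 * timing_score + 0.5 * duration_score


print(conformity(InstructedMove("right", 8.0, 1.0), DetectedMove("right", 8.1, 0.9)))
```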
  • (4) In each of the input instruction device, the game system, the program and the information storage medium,
  • the instruction image may include:
  • a moving path instruction image part that instructs a moving direction of the controller along a given path; and
  • a timing instruction image part that instructs a moving timing of the controller along the path,
  • the instruction image being updated according to a change in content of instruction.
  • This makes it possible to visually instruct the moving direction of the controller along a given path by using the moving path instruction image part, and to visually instruct the moving timing of the controller along the path by using the timing instruction image part. Therefore, the player can visually, instantaneously, and easily determine the movement of the controller from the instruction image.
  • The path instructed by the moving path instruction image part may be a straight line or a curve. For example, the path may instruct the user who holds the controller to move the controller in the vertical direction, the horizontal direction, or the forward and backward direction, or to make a turn while holding the controller.
  • The timing instruction image part may be arbitrarily formed insofar as the timing instruction image part instructs the moving timing of the controller along a given path instructed by the moving path instruction image part. For example, the timing instruction image part may be formed to move along the moving path corresponding to the moving timing of the controller, or may be integrally formed with the moving path instruction image part so that the color of the given path instructed by the moving path instruction image part is changed corresponding to the moving timing of the controller. It suffices that the timing instruction image part be displayed so that the moving timing can be visually recognized.
  • (5) In each of the input instruction device, the game system, the program and the information storage medium,
  • the timing instruction image part may move from a movement start instruction position to a movement finish instruction position along the moving path instruction image part to instruct a moving start timing and a moving duration of the controller along the instructed path.
  • The user can visually and easily determine the moving start timing and the moving duration of the controller along the instructed path by moving the timing instruction image part along the moving path instruction image part.
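  • As a minimal sketch (positions, times, and the linear interpolation are assumptions chosen only for illustration), the on-screen position of such a timing instruction image part could be updated as follows:

```python
# Illustrative sketch only: placing the timing instruction image part on the
# moving path instruction image part by linear interpolation. The positions,
# times, and the straight-line path are assumptions for illustration.
def timing_mark_position(start_pos, finish_pos, start_time, duration, now):
    """Interpolate the on-screen position of the timing mark at time `now`."""
    if now <= start_time:
        return start_pos  # the movement has not started yet
    t = min((now - start_time) / duration, 1.0)
    return (start_pos[0] + (finish_pos[0] - start_pos[0]) * t,
            start_pos[1] + (finish_pos[1] - start_pos[1]) * t)


# Example: a horizontal sweep instructed to start at 12.0 s and last 0.8 s.
print(timing_mark_position((100, 240), (300, 240), 12.0, 0.8, 12.4))  # (200.0, 240.0)
```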
  • (6) In each of the input instruction device, the game system, the program and the information storage medium,
  • at least one of the moving path instruction image part and the timing instruction image part may be displayed as a transition image that changes from a previous notice display that is displayed before the moving start timing to a main display when the moving start timing is reached.
  • The player can be appropriately notified of the moving start timing, the moving direction, and the like using the transition image that changes from the previous notice display before the moving start timing to the main display.
  • (7) In each of the input instruction device, the game system, the program and the information storage medium,
  • the instruction image may be displayed as a set of moving path instruction image parts that instruct a continuous moving path by combining a plurality of the moving path instruction image parts; and
  • the timing instruction image part may move along the continuously combined moving path instruction image parts to instruct the moving timing of the controller along a moving path instructed by each of the moving path instruction image parts.
  • This makes it possible to instruct the movement and the moving timing of the controller in a more complex and varied manner by combining a plurality of moving path instruction image parts.
  • Therefore, when applying these embodiments to a dance game (e.g., a dance game in which the player dances as a cheerleader), a simple yet varied dance game can be implemented in which the two controllers held by the player with both hands are treated as the two pompons held by the cheerleader, and a complex movement (e.g., shaking a pompon to draw the letter “8” or making a turn while holding the pompons) can be instructed instead of merely moving the pompons in the vertical or horizontal direction.
  • The detection/determination process by the detection/determination section may be performed each time the movement instruction is issued using each moving path instruction image part that forms the set of moving path instruction image parts.
  • (8) In each of the input instruction device, the game system, the program and the information storage medium,
  • the detection/determination section may acquire the signal from the physical quantity sensor, and detect a timing and a duration when the moving amount of the controller per unit time exceeds a given value as the moving start timing and the moving duration of the controller; and
  • the detection/determination section may determine the degree of conformity of the detected moving start timing and moving duration of the controller with a moving start timing for determination and a moving duration for determination that are related to the moving start timing and the moving duration instructed by the instruction image, when the detection/determination section has determined that the detected moving direction of the controller coincides with the moving direction instructed by the instruction image.
  • The detection/determination section can acquire the signal from the physical quantity sensor, and detect the timing and the duration when the moving amount of the controller per unit time exceeds a given value as the moving start timing and the moving duration of the controller.
  • In this case, a given delay time occurs until the moving amount of the controller per unit time exceeds a given value after the controller has been moved, although the delay time differs depending on the user. Therefore, even if the player has moved the controller at the timing instructed by the instruction image, detection of the movement is delayed by the given delay time.
  • According to these embodiments, a moving start timing for determination and a moving duration for determination that are related to the moving start timing and the moving duration instructed by the instruction image have been set. Even if the detected moving start timing of the controller is delayed as compared with the instructed moving start timing, it is determined that the degree of conformity of the movement of the controller with the instructed movement is high when the detected moving start timing and moving duration correspond to the moving start timing for determination and the moving duration for determination.
  • This eliminates a problem due to the delay in detecting the moving start timing of the controller. Therefore, the movement of the controller can be appropriately determined.
  • The moving start timing for determination and the moving duration for determination may be set for a beginner player, an intermediate player, and a skilled player corresponding to the level of the user. For example, the degree of difficulty can be adjusted by reducing the delay between the moving start timing instructed by the instruction image and the moving start timing for determination for a skilled player, and increasing the delay for a beginner player. The moving duration for determination may be reduced by the amount of the delay of the moving start timing for determination with respect to the moving start timing instructed by the instruction image. The degree of difficulty can be decreased by increasing the moving duration for determination, and increased by decreasing it.
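  • The following sketch illustrates the above idea with assumed numbers (the 5 msec sampling interval follows the embodiment described later, but the threshold, the per-level delays, and the pass criterion are illustrative assumptions only):

```python
# Illustrative sketch only: detecting the moving start timing and moving
# duration as the period during which the moving amount per unit time exceeds
# a given value, then grading the result against a determination window that
# is delayed relative to the instructed timing. The threshold, the per-level
# delays, and the pass criterion are assumptions; only the 5 msec sampling
# interval follows the embodiment described later.
SAMPLE_INTERVAL = 0.005  # seconds between acceleration samples
MOVE_THRESHOLD = 3.0     # moving amount regarded as "the controller moved"

# Larger allowed delay -> easier determination (lower degree of difficulty).
LEVEL_DELAY = {"beginner": 0.30, "intermediate": 0.20, "skilled": 0.10}


def detect_start_and_duration(moving_amounts):
    """Return (start_time, duration) of the first over-threshold run, or None."""
    start = None
    for i, amount in enumerate(moving_amounts):
        if amount > MOVE_THRESHOLD and start is None:
            start = i * SAMPLE_INTERVAL
        elif amount <= MOVE_THRESHOLD and start is not None:
            return start, i * SAMPLE_INTERVAL - start
    if start is not None:
        return start, len(moving_amounts) * SAMPLE_INTERVAL - start
    return None


def conforms(detected, instructed_start, instructed_duration, level="beginner"):
    """True when the detected run falls inside the determination window."""
    if detected is None:
        return False
    delay = LEVEL_DELAY[level]
    start_for_determination = instructed_start + delay
    duration_for_determination = max(instructed_duration - delay, 0.0)
    start, duration = detected
    return (abs(start - start_for_determination) <= delay
            and duration >= 0.5 * duration_for_determination)


amounts = [0.5] * 20 + [5.0] * 60 + [0.5] * 20  # 0.5 s of samples at 5 msec
detected = detect_start_and_duration(amounts)
print(detected, conforms(detected, instructed_start=0.0, instructed_duration=0.4))
```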
  • (9) In each of the input instruction device, the game system, the program and the information storage medium,
  • the detection/determination section may perform a first process that compares the signal output from the physical quantity sensor when moving the controller with a first type determination database and determines the position of the controller that is moved, the first type determination database being related to the signal output from the physical quantity sensor when moving the controller in different positions in the moving direction instructed by the instruction image, and the first type determination database being used to determine the position of the controller; and
  • the detection/determination section may perform a second process that compares the signal output from the physical quantity sensor when moving the controller in the position determined by the first process in the moving direction instructed by the instruction image with a second type determination database to specify the movement of the controller including at least the moving direction, and determines the degree of conformity of the specified movement with the instructed movement, the second type determination database being related to the position of the controller determined based on the first type determination database, and being used to determine the movement of the controller including at least the moving direction from the signal output from the physical quantity sensor when moving the controller in the moving direction instructed by the instruction image.
  • According to these embodiments, the first determination process that determines the position of the controller using the first type determination database and the second process that determines the degree of conformity of the movement of the controller including at least the moving direction using the second type determination database are performed.
  • For example, three acceleration sensors that detect accelerations in X, Y, and Z axial directions in the space based on the controller may be used as the physical quantity sensor included in the controller. In this case, the values output from the sensors that detect accelerations in X, Y, and Z axial directions differ depending on the position of the controller even when moving the controller in an identical direction (e.g., rightward direction).
  • In order to detect the movement of the controller including at least the moving direction more accurately, it is important to determine in advance the position in which the controller is held when the controller is moved, and to specify the movement of the controller including at least the moving direction from the signals output from the sensors that detect accelerations in the X, Y, and Z axial directions while relating those signals to the position of the controller during movement.
  • Therefore, the position of the controller when the user moves the rod-shaped controller in accordance with the instructions given by the instruction image is classified into basic patterns. For example, basic positions such as the case where the user holds the controller vertically, the case where the user holds the controller horizontally, and the case where the user inclines the controller are taken into consideration.
  • The signals output from the physical quantity sensor included in the controller are collected within the predetermined allowable range and stored in the database while being related to the different basic positions. In this case, even if the user holds the controller vertically, the user may hold the controller in a state in which the controller is inclined with respect to the vertical direction to some extent. If such a position were excluded from the vertical position, it would become impossible to properly classify the basic position of the controller.
  • Therefore, the signals output from the sensor of the controller when the inclination of the controller from the basic position (e.g., vertical position) is within the predetermined allowable range (e.g., when the controller held by the player is inclined with respect to the vertical basic position within the predetermined allowable range) are collected as data within the predetermined allowable range and stored in the database corresponding to the basic position.
  • The data collected to be within the predetermined allowable range and stored in the database while being related to each basic position is the first type determination database.
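  • As a minimal sketch of how such a first type determination database might be organized (the data layout, the use of a mean vector, and the tolerance rule are assumptions for illustration, not the claimed structure):

```python
# Illustrative sketch only: one possible organization of the first type
# determination database. Sensor outputs collected while the controller is
# held within the allowable range of each basic position are reduced to a
# mean vector plus a tolerance; the layout and statistics are assumptions.
from statistics import mean


def build_first_type_db(labelled_samples):
    """labelled_samples: {"vertical": [(x, y, z), ...], "horizontal": [...], ...}"""
    db = {}
    for position, samples in labelled_samples.items():
        means = tuple(mean(axis) for axis in zip(*samples))
        # Tolerance: the largest per-axis deviation seen in the collected data.
        tolerance = max(abs(v - m) for s in samples for v, m in zip(s, means))
        db[position] = {"mean": means, "tolerance": tolerance}
    return db


# Example with made-up static readings (gravity mostly along one axis).
db = build_first_type_db({
    "vertical":   [(0.1, 9.7, 0.2), (0.0, 9.9, -0.1), (0.3, 9.6, 0.1)],
    "horizontal": [(9.8, 0.2, 0.1), (9.6, -0.1, 0.3), (9.9, 0.0, 0.0)],
})
print(db["vertical"]["mean"])
```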
  • In these embodiments, the signals output from the physical quantity sensor included in the controller when moving the controller in the moving direction instructed by the instruction image are compared with the first type determination database, and the position of the controller that is moved in the moving direction is determined (first determination process). Therefore, the position (e.g., vertical or horizontal with respect to the screen) of the controller held by the user can be determined.
  • When the position of the controller has been determined by the first process, it is necessary to determine whether or not the movement of the controller coincides with the instructions when moving the controller in this position in the instructed direction.
  • Therefore, the second type determination database is provided in the same manner as the first type determination database. Specifically, the signals output from the physical quantity sensor of the controller when moving the controller in the direction instructed by the instruction image are collected corresponding to each basic position in which the player is considered to hold the controller. The second type determination database for specifying the movement of the controller including the moving direction when moving the controller in a given position is provided in which the signals output from the physical quantity sensor when moving the controller in each position are associated with the moving direction of the controller.
  • Even if the user moves the controller in an identical direction (e.g., rightward direction), the user may move the controller in a meandering path or a curved path with respect to the instructed direction. If it is determined that the movement along such a moving path does not conform to the movement in the instructed moving direction, a situation in which a wide range of users cannot enjoy the game may occur.
  • Therefore, when the controller is moved along a moving path within the predetermined allowable range with respect to the moving path instructed by the instruction image (e.g., the moving direction instructed by the moving path instruction image part), it is necessary to determine that the movement is a movement in the instructed moving direction.
  • Therefore, the signals output from the physical quantity sensor when moving the controller in a specific basic position in the moving direction instructed by the instruction image (e.g., horizontal direction) are collected within the predetermined allowable range, and stored in the database. When the moving direction instructed by the instruction image is the vertical direction, the horizontal direction, the forward and backward direction, a predetermined diagonal direction, a semicircle or a circle along the vertical plane, or a semicircle or a circle along the horizontal plane, the signals output from the physical quantity sensor when moving the controller in a specific basic position in each moving direction are collected as data within the predetermined allowable range.
  • The data in which the output data from the physical quantity sensor, collected within the predetermined allowable range when moving the controller in a given moving direction, is associated with that moving direction is collected corresponding to each moving direction instructed by the instruction image and stored in the database. The second type determination database is thus generated.
  • When the position of the controller has been determined by the first process, the data relating to the signals within the predetermined allowable range output from the physical quantity sensor when moving the controller in the determined position is compared, using the second type determination database, with the signals actually output from the physical quantity sensor of the controller to specify the movement of the controller including the moving direction.
  • For example, a process that specifies the moving direction and the moving amount per unit time of the controller held in the position determined by the first process is performed in real time. The moving direction and the moving timing specified based on the moving direction and the moving amount per unit time of the controller are determined, and whether or not the movement coincides with the movement instructed by the instruction image is determined.
  • Therefore, whether or not the player has moved the controller in the instructed direction can be determined irrespective of the position of the controller.
  • It is preferable to perform the first determination process and the second determination process each time the movement of the controller is instructed by the instruction image that instructs one movement. In this case, different movements of the controller are sequentially instructed, and the movement of the controller can be detected accurately without being affected by a change in the position of the controller.
  • The first determination process and the second determination process may be performed in this order. Alternatively, the signals output from the physical quantity sensor of the controller may be stored in a buffer, and the first determination process and the second determination process may be performed in parallel.
  • When indicating a continuous moving path by combining a plurality of moving path instruction image parts, the first determination process and the second determination process may be performed in a period in which the controller must be moved to satisfy the moving path and the moving timing instructed by each moving path instruction image part. Alternatively, the first determination process may be appropriately omitted so that both the first determination process and the second determination process are performed in one period and only the second determination process is performed in another period.
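  • The two processes can be pictured with the following minimal sketch; the nearest-match rule and the data layout are illustrative assumptions, and any equivalent classifier relating sensor output to the stored first type and second type determination data could play the same role:

```python
# Illustrative sketch only: the first process picks the basic position whose
# first type entry best matches the current sensor output, and the second
# process compares the output against the second type entries collected for
# that position to specify the moving direction. The nearest-match rule and
# the data layout are assumptions; any equivalent classifier would do.
import math


def nearest(entries, observed):
    """Return the key whose stored vector is closest (Euclidean) to `observed`."""
    return min(entries, key=lambda k: math.dist(entries[k], observed))


def first_process(first_type_db, observed):
    # first_type_db: {"vertical": (x, y, z), "horizontal": (x, y, z), ...}
    return nearest(first_type_db, observed)


def second_process(second_type_db, position, observed):
    # second_type_db: {("vertical", "right"): (x, y, z), ("vertical", "up"): ...}
    candidates = {direction: vec for (pos, direction), vec in second_type_db.items()
                  if pos == position}
    return nearest(candidates, observed)


# Example with made-up representative vectors.
first_db = {"vertical": (0.0, 9.8, 0.0), "horizontal": (9.8, 0.0, 0.0)}
second_db = {("vertical", "right"): (8.0, 9.8, 0.0),
             ("vertical", "up"): (0.0, 15.0, 0.0)}
position = first_process(first_db, (0.2, 9.6, 0.1))
print(position, second_process(second_db, position, (7.5, 9.9, 0.3)))
```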
  • (10) In each of the input instruction device, the game system, the program and the information storage medium,
  • the instruction image generation section may generate the instruction image that individually instructs a given movement for at least two controllers each having the physical quantity sensor.
  • This makes it possible for the user to hold two controllers with both hands, and the movement of each controller can be instructed by the instruction image, so that more complex instructions relating to the movement of the controllers can be given visually.
  • When applying these embodiments to sport instructions (e.g., dance game or exercise), the movement of each arm of the user can be individually instructed as the movement of the controller. This makes it possible to visually and simply give various instructions on a dance or exercise in which a plurality of movements are combined to the user.
  • When applying these embodiments to a dance game or the like, one player may play the game while holding two controllers with both hands, or two players may play the game while respectively holding one controller. For example, when forming a system so that four controllers can be used, a four-player game in which operation instructions are given to four players each holding one controller may be implemented, or a two-player game in which operation instructions are given to two players each holding two controllers may be implemented.
  • When a single player plays the game while holding two controllers, the operation of each controller may be individually determined, and the determination result for each operation may be evaluated. Alternatively, the operations of the two controllers may be evaluated in combination.
  • (11) Each of the input instruction device, the game system, the program and the information storage medium may further comprise:
  • a game calculation section that instructs a player to perform a dance action accompanying the movement of the controller, generates a game screen including a character that performs a dance related to the instructed dance action based on an input from the controller, and generates a dance background music signal, wherein the instruction image generation section generates the instruction image that instructs a given movement of the controller in the game screen.
  • According to these embodiments, a dance game which instructs a dance movement with one hand or both hands for the player (user) using the instruction image so that the player can easily enjoy dancing to the rhythm can be provided.
  • Specifically, a dance character associated with the dance action instructed by the instruction image appears on the game screen according to these embodiments. The dance character dances to the dance background music, and the instruction image that instructs a given movement of the controller associated with the dance movement is generated and displayed on the game image where the dance character appears. Therefore, the player can dance in accordance with the movement instructed by the instruction image in synchronization with the dance character and background music. As a result, a player-friendly dance game can be provided.
  • (12) In each of the input instruction device, the game system, the program and the information storage medium,
  • the game calculation section may include a subsection that performs game production related to a result of determination for the degree of conformity by the detection/determination section.
  • According to these embodiments, when the movement of the controller instructed by the instruction image has been appropriately performed, game production corresponding to the degree of conformity determination result is performed, such as generating effect sound or displaying a production effect image on the game screen to liven up the game.
  • When successive movements are instructed by the instruction image, a special event may be generated when the player has successfully moved the controller corresponding to the successive movements. This improves the game production effect.
  • (13) In each of the input instruction device, the game system, the program and the information storage medium,
  • the game calculation section may include a subsection that specifies a cause of an incorrect operation of the controller and notifies the player of the cause of the incorrect operation based on a result of determination for the degree of conformity by the detection/determination section.
  • When the actual movement of the player does not coincide with the movement instructions instructed by the instruction image, the player can be urged to more accurately move the controller by notifying the player of the cause of the incorrect operation.
  • For example, when the player has diagonally moved the controller even though the instruction image instructs the horizontal movement of the controller, the player is notified to that effect so that the player can correct the operation to move the controller along a more accurate moving path. Therefore, when applying this embodiment to a dance game or the like, the above configuration prompts the player to correct the dance and challenge a more difficult dance game.
  • (14) In each of the input instruction device, the game system, the program and the information storage medium,
  • the game calculation section may include a subsection that traces the moving path of the controller detected by the detection/determination section in the game screen based on a given condition.
  • The user can thus visually observe the degree of similarity between the movement instructed by the instruction image and the actual movement of the controller so that the user can enjoy the game while further improving his dance skill.
  • (15) Each of the input instruction device, the game system, the program and the information storage medium may further comprise:
  • a pointing position detection section that acquires an imaging signal from an imaging section provided in the controller and detects a pointing position of the controller on the game screen, the imaging section imaging a reference position recognition body disposed or displayed at a position related to the game screen,
  • wherein the game calculation section includes a subsection that displays a position instruction image that instructs the player to point at a predetermined position on the game screen at a given timing during the game; and
  • wherein the game calculation section generates an event related to a pointing at the predetermined position when the pointing at the predetermined position has been detected at the timing.
  • It suffices that the reference position recognition body allow the pointing position of the controller on the game screen to be specified from the captured image. For example, the reference position recognition body may be a recognition body provided at a position associated with the game screen (e.g., at least two light sources or recognizable objects), or may be a recognition image displayed on the game screen.
  • For example, the two light sources may be provided around a display and imaged by using an imaging section provided in the controller so that a CPU can determine the relative positional relationship between the controller and the game screen and determine the pointing position of the controller on the game screen.
  • In these embodiments, the position instruction image that instructs the player to point at a given position on the game screen at a given timing during the game is generated and displayed.
  • When the player has successfully pointed at the game screen using the controller in accordance with the instruction, a predetermined event (e.g., the number of backing dancers appearing on the game screen increases, or a plurality of backing dancers who dance on the game screen give a special performance) may be generated to broaden the scope of the game.
  • (16) In each of the input instruction device, the game system, the program and the information storage medium,
  • the instruction image generation section may generate the instruction image that instructs a given movement of the controller related to the predetermined pointed position of the game screen as the event related to the pointing at the predetermined position.
  • Some embodiments of the invention will be described below. Note that the embodiments described below do not in any way limit the scope of the invention laid out in the claims herein. In addition, not all of the elements of the embodiments described below should be taken as essential requirements of the invention.
  • The following embodiments illustrate an example in which the invention is applied to a game system.
  • 1. Outline of System
  • FIG. 1 is a schematic external view showing a game system according to one embodiment of the invention.
  • The game system according to this embodiment includes a display section 12 that displays a game image on a display screen 11, a game device 10 (game device main body) that performs a game process and the like, a first controller 20-1 (operation input section), and a second controller 20-2 (operation input section), the first controller 20-1 and the second controller 20-2 being held by a player P with either hand so that their positions and directions within a predetermined range can be arbitrarily changed.
  • In the example shown in FIG. 1, the game device 10 and each of the controllers 20-1 and 20-2 exchange various types of information via wireless communication.
  • FIG. 2 is a schematic external view showing the controller 20 according to this embodiment.
  • The controller 20 includes an arrow key 16 a and an operation button 16 b as an operation section.
  • The controller 20 also includes an acceleration sensor 210 as a physical quantity sensor that detects information which changes corresponding to the inclination and the movement of the controller so that information relating to the inclination and the movement of the controller in real space can be acquired.
  • The acceleration sensor 210 according to this embodiment is formed as a triaxial acceleration sensor 210 (detection section). The acceleration sensor 210 detects the direction and the degree of inclination of the controller as acceleration vectors (inclination information) in three axial directions applied to the controller.
  • The acceleration sensor 210 detects the movement of the controller (i.e., changes in speed and direction of the controller per unit time due to the movement of the controller) as acceleration vectors (movement information) in three axial directions applied to the controller.
  • As shown in FIG. 1, when the player P has moved the first controller 20-1 and the second controller 20-2 while holding each controller to change the inclination and the movement of each controller, the game device 10 detects and determines the inclination and the movement of each of the first controller 20-1 and the second controller 20-2 in real space based on the information that changes corresponding to the inclination and the movement of each controller, and controls the game.
  • The game system according to this embodiment displays a dance game screen shown in FIG. 5, and displays an instruction image 340 in the game screen with the progress of the game. The instruction image 340 instructs the player who holds the controller to move the controller in real space in various ways.
  • The player who holds the controller 20 moves the controller 20 in real space in accordance with the movement of the controller instructed by the instruction image while observing the instruction image.
  • The game device 10 acquires signals from the acceleration sensor 210 of the controller 20 to detect the movement of the controller in real space. The game device 10 determines the degree of conformity of the detected movement of the controller in real space with the movement instructed by the instruction image.
  • The game device 10 generates a given event or a game production effect based on the determination result.
  • The player P who holds the controller 20 is thus provided with given dance movement instructions or the like using the instruction image 340 so that the player can easily play the dance game.
  • The controller 20 has a pointing function of indicating (pointing) an arbitrary position on the display screen 11.
  • A pair of light sources 198R and 198L (reference position recognition portions) is disposed around the display section 12 at a position associated with the display screen 11. The light sources 198R and 198L are disposed at a predetermined interval along the upper side of the display section 12, and are formed to project infrared radiation (i.e., invisible light). An imaging section 220 that captures an image in front of the controller 20 is provided on the front side of the controller 20.
  • The pointing position of the controller 20 on the display screen 11 is calculated as follows.
  • The rectangular area shown in FIG. 3 represents a captured image PA acquired by the imaging section 220 (image sensor). The captured image PA is an image corresponding to the position and the direction of the controller 20.
  • The position RP of an area RA corresponding to the light source 198R and the position LP of an area LA corresponding to the light source 198L included in the captured image PA are calculated. The positions RP and LP are indicated by position coordinates specified in a two-dimensional coordinate system (XY-axis coordinate system) in the captured image PA. The distance between the light sources 198R and 198L and the relative positions of the light sources 198R and 198L associated with the display screen 11 are known in advance. Therefore, the game device 10 calculates the indication position (pointing position) of the controller 20 on the display screen 11 from the coordinates of the positions RP and LP thus calculated.
  • In this embodiment, the origin O of the captured image PA is determined to be the pointing position of the controller 20. The pointing position is calculated from the relative positional relationship between the origin O of the captured image PA, the positions RP and LP in the captured image PA, and a display screen area DA that is an area in the captured image PA corresponding to the display screen 11.
  • In the example shown in FIG. 3, the positions RP and LP are positioned somewhat above the center of the imaging area PA, in a state in which the line segment that connects the positions RP and LP is rotated clockwise by theta degrees with respect to a reference line L (X axis) of the imaging area PA. In the example shown in FIG. 3, the origin O corresponds to a predetermined position on the lower right of the display screen area DA, so that the coordinates of the indication position (pointing position) of the controller 20 on the display screen 11 can be calculated.
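  • The following sketch illustrates one way such a calculation could be organized (the captured-image resolution, the placement and spacing of the light sources, the location of the origin O, and the sign conventions are all assumptions made for this example, not values taken from the embodiment):

```python
# Illustrative sketch only: converting the detected light-source positions LP
# and RP in the captured image into a pointing position on the display screen.
# The captured-image size, the location of the origin O, the source spacing
# expressed in screen pixels, the placement of the sources on the top edge,
# and the sign conventions are all assumptions made for this example.
import math

SCREEN_W, SCREEN_H = 640, 480        # display resolution (assumed)
SOURCE_SPACING_PX = 200              # spacing of 198L/198R in screen pixels (assumed)
SOURCES_CENTER = (SCREEN_W / 2, 0)   # the pair assumed centered on the top edge
ORIGIN_O = (512, 384)                # origin O of an assumed 1024x768 captured image


def pointing_position(lp, rp):
    # Midpoint of the two detected areas and the roll angle theta of LP->RP.
    mx, my = (lp[0] + rp[0]) / 2, (lp[1] + rp[1]) / 2
    theta = math.atan2(rp[1] - lp[1], rp[0] - lp[0])
    # Vector from the midpoint to the origin O, de-rotated by theta to cancel roll.
    dx, dy = ORIGIN_O[0] - mx, ORIGIN_O[1] - my
    ux = dx * math.cos(-theta) - dy * math.sin(-theta)
    uy = dx * math.sin(-theta) + dy * math.cos(-theta)
    # Scale from captured-image pixels to screen pixels using the known spacing.
    scale = SOURCE_SPACING_PX / math.hypot(rp[0] - lp[0], rp[1] - lp[1])
    # Signs depend on the sensor's coordinate conventions (assumed here).
    return (SOURCES_CENTER[0] + ux * scale, SOURCES_CENTER[1] + uy * scale)


print(pointing_position((400, 300), (600, 310)))
```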
  • In the game system according to this embodiment, a game image shown in FIGS. 6A to 6C is displayed on the display screen 11, for example. A position instruction image 350 that instructs the player to point at a predetermined position in the game image at a given timing using the first controller 20-1 or the second controller 20-2 held with the left hand or the right hand is displayed in the game image with the progress of the dance game. When the player has pointed at the position instructed by the position instruction image at a predetermined timing by using the first controller 20-1 or the second controller 20-2, the game device 10 determines whether or not the predetermined position has been pointed at with appropriate timing. When the pointing operation has been performed appropriately, the game device 10 performs a game production process that generates a predetermined event (e.g., the number of backing dancers appearing in the game image increases).
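  • A minimal sketch of the timing and position check described above (the circular target region, the timing window, and the event name are illustrative assumptions):

```python
# Illustrative sketch only: checking a pointing operation against the position
# instruction image. The circular target region, the timing window, and the
# event name are assumptions for illustration.
def pointing_event(pointing_pos, target_center, target_radius,
                   now, instructed_time, window=0.5):
    px, py = pointing_pos
    cx, cy = target_center
    inside = (px - cx) ** 2 + (py - cy) ** 2 <= target_radius ** 2
    on_time = abs(now - instructed_time) <= window
    if inside and on_time:
        return "add_backing_dancer"  # e.g. one more backing dancer appears
    return None


print(pointing_event((310, 205), (320, 200), 30, now=24.1, instructed_time=24.0))
```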
  • It suffices that the reference position recognition body allow the pointing position of the controller on the game screen to be specified from the captured image. For example, the reference position recognition body may be a recognition body provided at a position associated with the game screen (e.g., at least two light sources or recognizable objects), or may be a recognition image displayed on the game screen. For example, two reference position recognition images may be displayed at predetermined positions on the game screen as the recognition bodies. The number of recognition bodies need not necessarily be two. A recognition body having a shape for which the relative positional relationship with the display screen 11 can be specified may also be used, in which case the number of recognition bodies may be one.
  • 2. Configuration
  • FIG. 4 shows an example of a functional block diagram of the game system according to this embodiment. Note that the game system according to this embodiment need not necessarily include all of the elements shown in FIG. 4. The game system according to this embodiment may have a configuration in which some of the elements are omitted.
  • The game system according to this embodiment includes the game device 10, the controller 20 as an input section, an information storage medium 180, a display section (display device) 190, a speaker 192, and the light sources 198R and 198L.
  • The controller 20 includes the acceleration sensor 210, the imaging section 220, a speaker 230, a vibration section 240, a microcomputer 250, and a communication section 260. The controller 20 may include an image input sensor, a sound input sensor, and a pressure sensor.
  • The acceleration sensor 210 detects the accelerations in three axial directions (X axis, Y axis, and Z axis). Specifically, the acceleration sensor 210 detects the accelerations in the vertical direction, the horizontal direction, and the backward or forward direction. The acceleration sensor 210 detects the accelerations at intervals of 5 msec. The acceleration sensor 210 may detect the accelerations in one axis, two axes, or six axes. The accelerations detected by the acceleration sensor are transmitted to the game device through the communication section 260.
  • The imaging section 220 includes an infrared filter 222, a lens 224, an imaging element (image sensor) 226, and an image processing circuit 228. The infrared filter 222 is disposed on the front side of the controller, and allows only infrared radiation contained in light incident from the light source 198 disposed while being associated with the display section 190 to pass through. The lens 224 condenses the infrared radiation that has passed through the infrared filter 222, and emits the infrared radiation to the imaging element 226. The imaging element 226 is a solid-state imaging element such as a CMOS sensor or a CCD. The imaging element 226 images the infrared radiation condensed by the lens 224 to generate a captured image. The image processing circuit 228 processes the captured image generated by the imaging element 226. For example, the image processing circuit 228 processes the captured image from the imaging element 226 to detect a high luminance component, and detects light source position information (specific position) in the captured image. When a plurality of light sources are provided, the image processing circuit 228 detects the position information relating to the plurality of light sources in the captured image. The detected position information is transmitted to the game device through the communication section 260. In this embodiment, the controller 20 may be utilized as a pointing device that points at a position (position information) on the game screen.
  • The speaker 230 outputs sound acquired from the game device through the communication section 260. In this embodiment, the speaker 230 outputs confirmation sound transmitted from the game device or effect sound corresponding to motion.
  • The vibration section (vibrator) 240 receives a vibration signal transmitted from the game device, and operates based on the vibration signal.
  • The microcomputer 250 outputs sound or operates the vibrator based on data received from the game device. The microcomputer 250 causes the accelerations detected by the acceleration sensor 210 to be transmitted to the game device through the communication section 260, or causes the position information detected by the imaging section 220 to be transmitted to the game device 10 through the communication section 260.
  • The communication section 260 includes an antenna and a wireless module. The communication section 260 exchanges data with the game device via wireless communication using the Bluetooth (registered trademark) technology, for example. The communication section 260 according to this embodiment transmits the accelerations detected by the acceleration sensor 210, the position information detected by the imaging section 220, and the like to the game device at alternate intervals of 4 msec and 6 msec. The communication section 260 may be connected to the game device via a communication cable, and exchange information with the game device via the communication cable.
  • The controller 20 may also include operating sections such as a button, a lever (analog pad), a mouse, an arrow key, and a touch panel display. The controller 20 may include a gyrosensor that detects the angular velocity which changes due to the input operation of the player.
  • The game device 10 according to this embodiment is described below.
  • The game device 10 according to this embodiment includes a storage section 170, a processing section 100, and a communication section 196.
  • The storage section 170 serves as a work area for the processing section 100, the communication section 196, and the like. The function of the storage section 170 may be implemented by hardware such as a RAM (VRAM).
  • The storage section 170 according to this embodiment includes a main storage section 172, a drawing buffer 174, and a sound data storage section 176.
  • The main storage section 172 serves as a work area for the processing section 100, the communication section 196, and the like. The function of the main storage section 172 may be implemented by hardware such as a RAM (VRAM).
  • In this embodiment, the main storage section 172 includes a storage area 173 that stores first and second type determination databases described later.
  • The drawing buffer 174 stores an image generated by a drawing section 120.
  • The sound data storage section 176 stores confirmation sound that indicates the reaction of the controller to the input operation of the player, and effect sound output along with a game calculation process. The sound data storage section 176 stores a plurality of types of confirmation sound corresponding to detected information. The sound data storage section 176 stores a plurality of types of effect sound corresponding to motion and a given event.
  • The processing section 100 performs various processes according to this embodiment based on a program (data) stored in (read from) the information storage medium 180. Specifically, the information storage medium 180 stores a program that causes a computer to function as each section according to this embodiment (i.e., a program that causes a computer to perform the process of each section). The information storage medium 180 includes a memory card that stores a player's personal data, game save data, and the like.
  • The communication section 196 can communicate with another game device through a network (Internet). The function of the communication section 196 may be implemented by hardware such as a processor, a communication ASIC, or a network interface card, by a program, or the like. The communication section 196 can perform cable communication and wireless communication.
  • The communication section 196 includes an antenna and a wireless module, and exchanges data with the communication section 260 of the controller 20 using the Bluetooth (registered trademark) technology, for example. For example, the communication section 196 transmits sound data (e.g., confirmation sound and effect sound) and the vibration signal to the controller, and receives information detected by the acceleration sensor and the image sensor of the controller 20 at alternate intervals of 4 msec and 6 msec.
  • A program (data) that causes a computer to function as each section according to this embodiment may be distributed to the information storage medium 180 (or the storage section 170) from a storage section or an information storage medium included in a server through a network. Use of the information storage medium of the server is also included within the scope of the invention.
  • The processing section 100 (processor) performs a game calculation process, an image generation process, and a sound control process based on detected information received from the controller 20, a program loaded into the storage section 170 from the information storage medium 180, and the like.
  • The processing section 100 according to this embodiment functions as an instruction image generation section 102, a pointing position instruction section 104, a detection/determination section 110, a game calculation section 112, a drawing section 120, a sound control section 130, and a vibration control section 140.
  • The instruction image generation section 102 generates an instruction image that instructs a given movement of the controller 20 on the game screen. Specifically, the instruction image generation section 102 generates the instruction image 340 that instructs the moving direction and the moving timing of the controller 20 as the given movement of the controller 20.
  • The pointing position instruction section 104 generates a position instruction image (pointing instruction image) 350 that instructs the player to point at a predetermined position on the game screen at a given timing during the game.
  • The detection/determination section 110 detects the movement of the controller 20 based on information obtained from the acceleration sensor 210 of the controller 20, and determines the degree of conformity of the detected movement of the controller 20 with the movement instructed by the instruction image.
  • The detection/determination section 110 detects the pointing position of the controller 20 on the game screen based on information from the imaging section 220 of the controller 20, and determines whether or not the detected pointing position has been appropriately pointed at the predetermined timing instructed by the position instruction image.
  • The game calculation section 112 performs game calculations based on the determination result of the detection/determination section 110 and a given program.
  • For example, the game calculation section 112 disposes various objects (i.e., objects formed by a primitive such as a polygon, free-form surface, or subdivision surface) that represent display objects such as a character (player character or enemy character), a moving body (e.g., car or airplane), a building, a tree, a pillar, a wall, or a map (topography) in an object space. Specifically, the game calculation section 112 determines the position and the rotational angle (synonymous with orientation or direction) of the object in a world coordinate system, and disposes the object at the determined position (X, Y, Z) and the determined rotational angle (rotational angles around X, Y, and Z axes).
  • The game calculation section 112 controls a virtual camera (viewpoint) for generating an image viewed from a given (arbitrary) viewpoint in the object space. Specifically, the game calculation section 112 controls the position (X, Y, Z) or the rotational angle (rotational angles around X, Y, and Z axes) of the virtual camera (controls the viewpoint position, the line-of-sight direction, or the angle of view).
  • For example, when imaging the object (e.g., character) from behind using the virtual camera, the game calculation section 112 controls the position or the rotational angle (direction) of the virtual camera so that the virtual camera follows a change in position or rotation of the object. In this case, the game calculation section 112 may control the virtual camera based on information such as the position, the rotational angle, or the speed of the object obtained by a motion generation section 124 described later. Alternatively, the game calculation section 112 may rotate the virtual camera at a predetermined rotational angle, or move the virtual camera along a predetermined path. In this case, the game calculation section 112 controls the virtual camera based on virtual camera data for specifying the position (path) or the rotational angle of the virtual camera. When a plurality of virtual cameras (view points) are provided, the above-described control process is performed on each virtual camera.
  • The game calculation section 112 calculates the movement/motion (movement/motion simulation) of a model (e.g., character, car, or airplane). Specifically, the game calculation section 112 causes the model to move in the object space or causes the object to perform a motion (animation) based on detected information determined to satisfy a predetermined condition, a program (movement/motion algorithm), motion data, and the like. Specifically, the game calculation section 112 performs a simulation process that sequentially calculates movement information (position, rotational angle, speed, or acceleration) and motion information (position or rotational angle of each part that forms the object) of the object in frame ( 1/60 sec) units. Note that the term “frame” refers to a time unit when performing the object movement/motion process (simulation process) and the image generation process.
  • The drawing section 120 performs a drawing process based on the results of various processes (game calculation process) performed by the processing section 100 to generate an image, and outputs the image to the display section 190. When generating a three-dimensional game image, display object data (object data or model data) including vertex data (e.g., vertex position coordinates, texture coordinates, color data, normal vector, or alpha value) relating to each vertex that defines the display object (object or model) is input to the drawing section 120, and the drawing section 120 performs a vertex process based on the vertex data included in the input display object data. When performing the vertex process, the drawing section 120 may perform a vertex generation process (tessellation, curved surface division, or polygon division) for dividing the polygon, if necessary. In the vertex process, the drawing section 120 performs a vertex movement process and a geometric process such as coordinate transformation (world coordinate transformation or camera coordinate transformation), clipping, perspective transformation, or a light source process, and changes (updates or adjusts) the vertex data relating to the vertices that form the display object based on the processing results. The drawing section 120 performs rasterization (scan conversion) based on the vertex data after the vertex process so that the surface of the polygon (primitive) is associated with pixels. The drawing section 120 then performs a pixel process (fragment process) that draws pixels which form the image (fragments which form the display screen). In the pixel process, the drawing section 120 determines the final pixel drawing color by performing various processes such as a texture reading (texture mapping) process, a color data setting/change process, a translucent blending process, and an anti-aliasing process, and outputs (draws) the drawing color of the object subjected to perspective transformation to (in) the drawing buffer 174 (i.e., a buffer that can store image information in pixel units; VRAM or rendering target). Specifically, the pixel process includes a per-pixel process that sets or changes the image information (e.g., color, normal, luminance, and alpha value) in pixel units. This causes an image viewed from the virtual camera (given viewpoint) set in the object space to be generated. When a plurality of virtual cameras (viewpoints) are provided, an image may be generated so that images (divided images) viewed from the respective virtual cameras can be displayed on one screen.
  • The vertex process and the pixel process performed by the drawing section 120 may be implemented by hardware that enables a programmable polygon (primitive) drawing process (i.e., programmable shader (vertex shader and pixel shader)) based on a shader program written using a shading language. The programmable shader enables a programmable per-vertex process and per-pixel process to increase the degree of freedom relating to the drawing process so that the representation capability is significantly improved as compared with a fixed hardware drawing process.
  • The drawing section 120 performs a geometric process, a texture mapping process, a hidden surface removal process, an alpha blending process, and the like when drawing the display object.
  • In the geometric process, the display object is subjected to a coordinate transformation process, a clipping process, a perspective transformation process, a light source calculation process, and the like. The display object data (e.g., display object's vertex position coordinates, texture coordinates, color data (luminance data), normal vector, or alpha value) after the geometric process (after perspective transformation) is stored in the main storage section 172.
  • The term “texture mapping process” refers to a process for mapping a texture (texel value) stored in the storage section 170 on the display object. Specifically, the drawing section 120 reads a texture (surface properties such as color (RGB) and alpha value) from the storage section 170 using the texture coordinates set (assigned) to the vertices of the display object, for example. The drawing section 120 maps the texture (two-dimensional image) on the display object. In this case, the drawing section 120 performs a pixel-texel association process, bilinear interpolation (texel interpolation), and the like.
  • The drawing section 120 may perform a hidden surface removal process by a Z buffer method (depth comparison method or Z test) using a Z buffer (depth buffer) that stores the Z value (depth information) of the drawing pixel. Specifically, the drawing section 120 refers to the Z value stored in the Z buffer when drawing the drawing pixel corresponding to the primitive of the object. The drawing section 120 compares the Z value stored in the Z buffer with the Z value of the drawing pixel of the primitive. When the Z value of the drawing pixel is the Z value in front of the virtual camera (e.g., a small Z value), the drawing section 120 draws the drawing pixel and updates the Z value stored in the Z buffer with a new Z value.
  • The term “alpha blending process” refers to a translucent blending process (e.g., normal alpha blending, additive alpha blending, or subtractive alpha blending) based on an alpha value (A value). In normal alpha blending, the drawing section 120 calculates a color obtained by blending two colors by performing linear interpolation using the alpha value as the degree of blending.
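  • The per-pixel Z test and normal alpha blending described above can be summarized by the following minimal sketch (buffer layout and the smaller-Z-is-nearer convention are assumptions for illustration):

```python
# Illustrative sketch only: the per-pixel Z test followed by normal alpha
# blending. The buffer layout and the smaller-Z-is-nearer convention are
# assumptions for illustration.
def draw_pixel(color_buf, z_buf, index, src_rgb, src_z, alpha):
    if src_z < z_buf[index]:          # drawing pixel is nearer to the virtual camera
        z_buf[index] = src_z          # update the stored Z value
        dst = color_buf[index]
        # Normal alpha blending: linear interpolation with alpha as the blend degree.
        color_buf[index] = tuple((1.0 - alpha) * d + alpha * s
                                 for d, s in zip(dst, src_rgb))


color_buf = [(0.0, 0.0, 0.0)] * 4
z_buf = [1.0] * 4
draw_pixel(color_buf, z_buf, 2, (1.0, 0.5, 0.0), 0.3, 0.5)
print(color_buf[2])  # (0.5, 0.25, 0.0)
```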
  • The term “alpha value” refers to information that can be stored while being associated with each pixel (texel or dot), such as additional information other than the color information that indicates the luminance of each RGB color component. The alpha value may be used as mask information, translucency (equivalent to transparency or opacity), bump information, or the like.
  • The sound control section 130 causes at least one of the speaker 230 of the controller and the speaker 192 to output sound (including confirmation sound and effect sound) stored in the sound data storage section 176 based on the results of various processes (e.g., the determination process and the game calculation process) performed by the processing section 100.
  • The sound control section 130 according to this embodiment causes the speaker to output confirmation sound when the detection/determination section 110 has determined that the predetermined condition is satisfied. The sound control section 130 may cause the speaker to output confirmation sound corresponding to the detected information. The sound control section 130 may cause only the speaker 230 of the controller to output confirmation sound, and may cause the speaker 192 to output effect sound corresponding to the game calculation process (e.g., effect sound corresponding to the motion determined based on the detected information).
  • The vibration control section 140 causes the vibration section 240 of the controller to vibrate based on a predetermined condition.
  • The game system according to this embodiment may be a system dedicated to a single-player mode in which only one player can play the game, or may be a system provided with a multi-player mode in which a plurality of players can play the game. When a plurality of players play the game, the game images and the game sound provided to the players may be generated using one game device and one display section. The game images and the game sound may be generated by a distributed process using a plurality of game devices connected through a network (transmission line or communication line) or the like. In this embodiment, when a plurality of players play the game, a determination as to whether or not a predetermined condition is satisfied based on the detected information, sound control based on the determination result, and vibration control are performed corresponding to the controller of each player.
  • The information storage medium 180 (computer-readable medium) stores a program, data, and the like. The function of the information storage medium 180 may be implemented by hardware such as an optical disk (CD or DVD), a magneto-optical disk (MO), a magnetic disk, a hard disk, a magnetic tape, or a memory (ROM).
  • The display section 190 outputs an image generated by the processing section 100. The function of the display section 190 may be implemented by hardware such as a CRT display, a liquid crystal display (LCD), an organic EL display (OELD), a plasma display panel (PDP), a touch panel display, or a head mount display (HMD).
  • The speaker 192 outputs sound reproduced by the sound control section 130. The function of the speaker 192 may be implemented by hardware such as a speaker or a headphone. The speaker 192 may be a speaker provided in the display section. For example, when a television set (home television set) is used as the display section, the speaker 192 may be a speaker provided in the television set.
  • The light source 198 is an LED that emits infrared radiation (i.e., invisible light), for example. The light source 198 is disposed while being associated with the display section 190. In this embodiment, a plurality of light sources (light source 198R and light source 198L) are provided. The light source 198R and the light source 198L are disposed at a predetermined interval.
  • 3. Method According to this Embodiment
  • A method according to this embodiment is described below with reference to the drawings.
  • 3-1: Game Executed According to this Embodiment and Instruction Image Display Process
  • FIG. 5 shows an example of the game screen displayed according to this embodiment.
  • The game executed by the game system according to this embodiment is configured so that the player plays the leader of a cheerleading dance team, leading the dance of the entire team and giving dance instructions to the members of the team with the aim of making the dance succeed.
  • As shown in FIG. 5, floral beat characters 330 that instruct the beat of background music (BGM) are displayed on the game screen. The beat characters 330 blink on the beat.
  • A main dancer 310 who holds pompons with both hands and a plurality of sub dancers 312-1 and 312-2 positioned behind the main dancer 310 are displayed on the game screen.
  • The main dancer 310 is a player character that reproduces a dance corresponding to the operation of the player as a cheerleader.
  • The instruction image generation section 102 generates and displays a pair of instruction images 340-1 and 340-2 that respectively instruct the movements of the first controller 20-1 and the second controller 20-2 held by the player with the progress of the game. In this embodiment, the instruction image 340 is displayed at given time intervals in a predetermined order with the progress of the game.
  • In FIG. 5, the instruction images 340-1 and 340-2 that respectively instruct the movements of the first controller 20-1 and the second controller 20-2 held by the player are displayed on either side of the main dancer 310 that is a player character.
  • The instruction image 340-1 instructs the operation of the first controller 20-1 held by the player with the right hand, and the instruction image 340-2 instructs the operation of the second controller 20-2 held by the player with the left hand.
  • The instruction images 340-1 and 340-2 are displayed at given positions on the game screen with the progress of the game. The instruction images 340-1 and 340-2 give instructions to the player with regard to given movements (i.e., moving direction, moving timing, and moving duration) of the controllers 20-1 and 20-2.
  • The instruction image 340 according to this embodiment includes a trace rail 342 (or a moving path instruction image part) that instructs a moving direction of the controller, a timing mark 344 (or a timing instruction image part) that instructs the moving timing, and an operation finish mark 346 that instructs the expiration of the moving duration.
  • The timing mark 344 is displayed on one end of the trace rail 342, and the operation finish mark 346 is displayed on the other end of the trace rail 342 in a fixed state.
  • The timing mark 344 starts to move along the trace rail 342 in the moving direction of the controller 20 in synchronization with the operation start timing of the controller 20, and reaches the operation finish mark 346 at the finish timing of the moving duration.
  • The player can determine the moving timing and the moving direction of the controller by the trace rail 342 and the timing mark 344 that moves along the trace rail 342, and can determine the expiration of the operation duration when the timing mark 344 has reached the operation finish mark 346.
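  • A minimal sketch of how the timing mark 344 could be advanced along a straight trace rail 342 so that it reaches the operation finish mark 346 exactly when the moving duration expires; the screen coordinates, the linear rail, and the function name are assumptions for illustration only.
```python
def timing_mark_position(start, finish, move_start_time, moving_duration, now):
    """Screen position of the timing mark on a straight trace rail at time `now`.
    `start` and `finish` are 2D points; the mark reaches `finish` when the
    moving duration expires."""
    t = (now - move_start_time) / float(moving_duration)
    t = max(0.0, min(1.0, t))  # clamp before the moving start and after the finish
    return (start[0] + (finish[0] - start[0]) * t,
            start[1] + (finish[1] - start[1]) * t)
```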
  • When the timing mark 344 has moved along the trace rail 342 and reached the operation finish mark 346, the next timing mark 344 may be displayed on the identical trace rail 342 and moved to give the identical movement instruction to the player. In the game image shown in FIG. 5, the trace rail 342 instructs the upward movement of the controller. In this embodiment, the instruction image 340 that instructs the movement in another reference direction (e.g., right direction, left direction, diagonally right upward direction, diagonally left upward direction, downward direction, right downward direction, left downward direction, and backward or forward direction (depth direction or front direction)) is generated and displayed.
  • The instruction image 340 that instructs the movement in the backward or forward direction (depth direction or front direction) may be formed by displaying the trace rail 342 in the game space displayed on the game screen in the forward direction and moving the timing mark 344 along the trace rail 342 in the forward direction or the backward direction, for example.
  • The instruction image 340 that instructs the player to make a right turn is displayed in FIGS. 7A and 7B, and the instruction image 340 that instructs the player to perform a punch operation (i.e., move the controller forward) is displayed in FIG. 7C.
  • In this embodiment, when the game has started, the instruction images 340 that instruct the player to make dance actions in time with the rhythm of the background music are displayed one after another with the progress of the game.
  • In this embodiment, the main dancer 310 who dances corresponding to the dance action instructed by the instruction image 340 appears on the game screen, and the main dancer 310 and the sub dancers 312 dance to the background music. The player moves the first controller 20-1 and the second controller 20-2 (i.e., both hands) in real space in accordance with the instructions given by the instruction images 340-1 and 340-2 displayed one after another, while listening to the background music and watching the main dancer 310 displayed on the game screen. The player can thus enjoy dancing to the rhythm as if the player were the leader of the cheerleading dance team.
  • In this embodiment, a set of moving path instruction image parts that instruct a continuous moving path is generated and displayed by combining a plurality of trace rails (i.e., moving path instruction image parts) so that complex movement instructions can be given to the player.
  • FIG. 8A shows an example of the instruction image 340 that instructs the player to move the controller 20 almost in the shape of the letter “8”. Four trace rails 342-1 to 342-4 that differ in moving direction are displayed in combination.
  • The timing mark 344 sequentially moves along the four trace rails 342-1 to 342-4 so that the player can easily and visually determine the moving direction, the moving timing, and the moving duration of the corresponding controller.
  • FIG. 8B shows the instruction image 340 that instructs the player to circularly move the controller 20 clockwise by combining a plurality of arc-shaped trace rails 342-1 and 342-2.
  • Since a plurality of trace rails 342 are displayed in combination, the instruction image shown in FIG. 8A can instruct the player to perform a dance operation that moves the pompon in the shape of the letter “8”, and the instruction image shown in FIG. 8B can instruct the player to perform a dance operation that swings the pompon clockwise.
  • FIGS. 9A to 9D show an example of the instruction image 340 that allows the player to be more easily notified of the moving start timing of the controller 20.
  • In this embodiment, the trace rail 342 (or a moving path instruction image part) and the timing mark 344 (or a timing instruction image part) are displayed as a transition image that changes from a previous notice display that is displayed before the moving start timing to a main display when the moving start timing has been reached.
  • Specifically, the trace rail 342 shown in FIG. 9A is previously displayed by a thin dotted line two beats before the moving start timing of the controller. The trace rail 342 is displayed by a solid line (see FIG. 9B) one beat before the moving start timing, and the timing mark 344 is displayed to notify the player that the operation will occur shortly.
  • When the operation timing has been reached, the trace rail 342 is displayed in a shape that instructs that the operation timing has been reached, and the timing mark 344 moves along the trace rail 342 from the movement start point to the movement finish point. The player is thus instructed to move the controller 20 in the direction instructed by the trace rail 342 in synchronization with the movement of the timing mark 344.
  • When the timing mark 344 moves along the trace rail 342 (see FIG. 9D), the path sequentially disappears along with the movement of the timing mark 344, and the expiration of the moving duration is instructed when the timing mark 344 has reached the operation finish mark 346.
  • According to this embodiment, the player can be appropriately notified of the moving start timing and the moving direction using the transition image that changes from the previous notice display before the moving start timing to the main display so that the player can make preparations for moving the controller 20 in the instructed direction.
  • Note that the transition image may be displayed as a transition image that changes from transparent display to the main display, or may be displayed as a transition image that moves toward the display position along the depth direction while changing its transparency or size.
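  • A hedged sketch of the transition from the previous notice display to the main display described above, assuming the display state is selected from the number of beats remaining until the moving start timing; the state names are illustrative and not part of the embodiment.
```python
def trace_rail_display_state(beats_until_move_start):
    """Select how the trace rail and the timing mark are drawn relative to the
    moving start timing: thin dotted rail two beats before, solid rail and a
    visible timing mark one beat before, and the main (moving) display once
    the operation timing has been reached."""
    if beats_until_move_start >= 2:
        return {"trace_rail": "thin_dotted", "timing_mark": "hidden"}
    elif beats_until_move_start >= 1:
        return {"trace_rail": "solid", "timing_mark": "shown"}
    else:
        return {"trace_rail": "main_display", "timing_mark": "moving"}
```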
  • The appearance of the trace rail 342 (i.e., the moving path instruction image part) and the appearance of the timing mark 344 (i.e., the timing instruction image part) may be changed according to the timing of the instruction given to the player.
  • For example, at least one of the color, form, and size of the timing mark 344 (i.e., the timing instruction image part) may be changed so that the timing mark 344 becomes gradually more visible and produces a greater production effect as the timing mark 344 approaches the operation finish mark 346.
  • When the timing mark 344 moves on the trace rail 342, at least one of color, form, and size of at least one of the trace rail 342 and the timing mark 344 may be changed so that the visibility and the production effects are improved.
  • 3-2: Controller Movement Detection/Determination Process
  • When the instruction image 340 has instructed a given movement of the controller 20, the detection/determination section 110 according to this embodiment detects the actual movement of the controller 20 performed by the player, and determines the degree of conformity of the detected movement of the controller with the movement instructed by the instruction image.
  • The details are described below.
  • Position Detection of Controller 20 (First Process)
  • In this embodiment, the moving direction and the moving start timing of the controller 20 may be calculated based on the accelerations detected by the acceleration sensor 210.
  • The acceleration sensor 210 according to this embodiment detects the accelerations in three axial directions (X axis, Y axis, and Z axis) in the space based on the controller.
  • A method that calculates the gravitational accelerations of the controller 20 in two axes (XY axes) is described below for convenience with reference to FIGS. 10A and 10B. FIG. 10A is a diagram showing the gravitational acceleration in the real space coordinate system, and FIG. 10B is a diagram showing the gravitational acceleration in the controller coordinate system (i.e., the coordinate system based on the controller 20).
  • For example, when the controller 20 is placed horizontally (in this case, the acceleration sensor 210 of the controller 20 is also placed horizontally) (see FIG. 10A), a gravitational acceleration of 1 G is applied in the Y-axis (downward) direction (i.e., the gravitational acceleration direction). As shown in FIG. 10B, a gravitational acceleration of 1 G is also applied in the downward direction along the y axis in the coordinate system of the controller 20. When the player has inclined the controller 20 by 45° counterclockwise in the real space coordinate system (XY axes), a gravitational acceleration of 1 G is still applied in the Y-axis direction in the real space coordinate system (FIG. 10A). In the controller coordinate system, however, the gravitational acceleration of 1 G is decomposed into the x-axis direction and the y-axis direction. Specifically, the x-axis component and the y-axis component are each 1/√2 G.
  • In this embodiment, the inclination of the controller 20 is thus detected utilizing the gravitational acceleration. In the example shown in FIG. 10B, when an acceleration of 1/√2 G has been detected in the negative direction along the x axis and an acceleration of 1/√2 G has been detected in the negative direction along the y axis, it can be detected that the controller 20 is inclined by 45° counterclockwise in the real space coordinate system (XY axes). In this embodiment, since the accelerations in three axial directions can be detected, the three-dimensional inclination in real space can be calculated from the acceleration in each axial direction.
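  • The 45° example above can be checked numerically with the following sketch; the sign convention, in which both components read negative for a counterclockwise tilt, is an assumption for illustration.
```python
import math

def gravity_components_in_controller_frame(tilt_deg, g=1.0):
    """Gravity components along the controller's x and y axes when the controller
    is rotated counterclockwise by tilt_deg in the X-Y plane (assumed signs)."""
    t = math.radians(tilt_deg)
    return (-g * math.sin(t), -g * math.cos(t))

def tilt_from_gravity(ax, ay):
    """Recover the counterclockwise tilt angle from the sensed gravity components
    (valid while the controller is approximately at rest)."""
    return math.degrees(math.atan2(-ax, -ay))

# A 45-degree counterclockwise tilt gives about -1/sqrt(2) G on both axes,
# and the tilt angle is recovered from those readings.
ax, ay = gravity_components_in_controller_frame(45.0)  # ~(-0.707, -0.707)
print(round(ax, 3), round(ay, 3), round(tilt_from_gravity(ax, ay), 1))  # -0.707 -0.707 45.0
```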
  • Specifically, the accelerations in three axial directions (X axis, Y axis, and Z axis) output from the acceleration sensor 210 differ between the case where the player vertically holds the controller 20 and moves the controller 20 in a given direction (e.g., rightward) and the case where the player horizontally holds the controller 20 and moves the controller 20 in the given direction.
  • Therefore, in order to accurately detect the moving direction and the like of the controller 20, it is preferable to perform a first process that detects the position of the controller 20 in real space before detecting the moving direction and the like of the controller 20.
  • In this embodiment, a first type determination database for determining the position of the controller in real space is formed and stored in the storage area 173 of the main storage section 172, as shown in FIG. 1. Specifically, the position of the controller 20 in real space is classified into a plurality of basic positions taking variations in position when the player holds the controller 20 into consideration. In this embodiment, the position of the controller 20 in real space is classified into a vertical position, a diagonal rightward position, a diagonal leftward position, a horizontal position, and other basic positions. The outputs of the acceleration sensor in the x, y, and z axial directions in the controller coordinate system are stored corresponding to each basic position. Therefore, the first process that determines the position of the controller 20 in real space can be performed based on the outputs of the acceleration sensor 210 in the x, y, and z axial directions.
  • In this case, if the basic position of the controller 20 is strictly associated with the outputs of the acceleration sensor in the x, y, and z axial directions in the controller coordinate system, the position of the controller cannot be determined when the player holds the controller 20 in a position that differs from the basic position to some extent.
  • Since the game according to this embodiment aims at a wide range of users (i.e., from children to adults), it is necessary to form the first type determination database so that, even if the player holds the controller 20 in a position that differs from a basic position to some extent, a position within a predetermined allowable range of that basic position is determined to be the corresponding basic position.
  • Therefore, the signals output from the acceleration sensor when the position of the controller 20 differs from a specific basic position within the allowable range (e.g., when the controller held by the player is inclined with respect to the vertical basic position within the predetermined allowable range) are also collected as the sensor output corresponding to the basic position and stored in the database.
  • The data corresponding to each basic position within the predetermined allowable range is thus collected and stored in the first type determination database.
  • In this embodiment, the signals in the x, y, and z axial directions output from the acceleration sensor of the controller 20 when moving the controller 20 in the moving direction instructed by the instruction image 340 are compared with the first type determination database, and the basic position that coincides with the position of the controller that is moved in the moving direction instructed by the instruction image 340 is determined (first determination process).
  • Therefore, the position (e.g., vertical or horizontal with respect to the screen) of the controller 20 held by the player can be determined flexibly.
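  • A minimal sketch of the first process, assuming the first type determination database is reduced to one representative sensor output per basic position and a single distance tolerance standing in for the allowable range; the values and the dictionary layout are illustrative, not the database format of the embodiment.
```python
import math

# Illustrative entries: a representative accelerometer output (x, y, z, in G)
# for several basic positions of the controller.
FIRST_TYPE_DB = {
    "vertical":       (0.0, -1.0, 0.0),
    "horizontal":     (0.0, 0.0, -1.0),
    "diagonal_right": (-0.707, -0.707, 0.0),
    "diagonal_left":  (0.707, -0.707, 0.0),
}

def determine_basic_position(accel, tolerance=0.4):
    """First process: return the basic position whose stored output is closest to
    the sensed acceleration, or None when nothing falls within the allowable range."""
    best, best_dist = None, float("inf")
    for name, ref in FIRST_TYPE_DB.items():
        dist = math.dist(accel, ref)
        if dist < best_dist:
            best, best_dist = name, dist
    return best if best_dist <= tolerance else None
```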
  • Detection and Determination of Movement of Controller 20 (Second Process)
  • When the basic position of the controller 20 in real space has been determined by the first process, a second process that detects the movement of the controller 20 in real space including the moving direction is performed.
  • In this embodiment, a second type determination database shown in FIG. 12 is used to perform the second process. The second type determination database is stored in the storage area 173 of the main storage section 172.
  • The second type determination database is generated as follows.
  • The signals in the x, y, and z directions output from the acceleration sensor of the controller 20 when moving the controller 20 in the direction instructed by the instruction image 340 in real space are collected corresponding to each basic position of the controller 20 shown in FIG. 11.
  • The signals output from the acceleration sensor 210 when the controller 20 is moved in each basic position are associated with the moving direction of the controller 20 in real space to create a second type determination database.
  • Even if the player moves the controller 20 held in an identical basic position in an identical direction (e.g., rightward direction), the player may move the controller 20 in a meandering path or a curved path with respect to the instructed direction. If it is determined that the movement along such a moving path does not conform to the movement in the instructed moving direction, a situation in which a wide range of users cannot enjoy the game may occur.
  • Therefore, the signals in the x, y, and z axial directions output from the acceleration sensor 210 when moving the controller 20 in a specific basic position in the moving direction instructed by the instruction image are collected within the predetermined allowable range, and stored in the database. For example, when the player moves the controller 20 while drawing a path that differs from the direction instructed by the instruction image 340 within the predetermined allowable range, the signals output from the acceleration sensor are collected as signals corresponding to the instructed basic position, and stored in the database.
  • In this embodiment, data relating to the controller 20 held in each basic position is classified corresponding to each moving direction (i.e., rightward direction, diagonally right upward direction, upward direction, downward direction, diagonally right downward direction, diagonally left downward direction, forward direction, backward direction, clockwise direction, counterclockwise direction, and other directions), and the signals in the x, y, and z axial directions output from the acceleration sensor are collected within the predetermined allowable range, and stored in the database.
  • The second type determination database shown in FIG. 12 is generated in this manner. FIG. 12 shows the database corresponding to one basic position. Note that data is similarly collected and stored corresponding to the other basic positions shown in FIG. 11.
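  • The second type determination database can be pictured as a per-basic-position table keyed by moving direction, as in the following sketch; this is a simplified stand-in that matches a sensed acceleration against one reference pattern per direction, and the values are assumptions.
```python
# Illustrative slice of the second type determination database for a single
# basic position ("vertical"): each instructed moving direction is associated
# with a reference acceleration pattern for the controller held in that position.
SECOND_TYPE_DB = {
    "vertical": {
        "rightward": (1.0, 0.0, 0.0),
        "leftward":  (-1.0, 0.0, 0.0),
        "upward":    (0.0, 1.0, 0.0),
        "downward":  (0.0, -1.0, 0.0),
        "forward":   (0.0, 0.0, 1.0),
        "backward":  (0.0, 0.0, -1.0),
    },
}

def classify_moving_direction(basic_position, accel):
    """Second process (direction part): pick the stored moving direction whose
    reference pattern best matches the sensed acceleration (largest dot product)."""
    entries = SECOND_TYPE_DB[basic_position]
    return max(entries, key=lambda d: sum(a * r for a, r in zip(accel, entries[d])))
```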
  • When the position of the controller 20 has been determined by the first process, data corresponding to the determined position is compared with the signals output from the acceleration sensor of the controller 20 using the second type determination database to specify the movement (e.g., the moving direction) of the controller 20.
  • Specifically, a process that specifies the moving direction and the moving amount per unit time of the controller 20 held in the position determined by the first process is performed in real time. The moving direction, the moving timing, and the moving duration of the controller 20 in real space are determined based on the moving direction and the moving amount per unit time of the controller 20 thus specified, and whether or not the movement coincides with the movement instructed by the instruction image 340 is evaluated.
  • Therefore, whether or not the player moves the controller 20 in the instructed direction can be evaluated regardless of the position of the controller 20.
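  • A hedged sketch of the conformity determination itself, comparing the detected moving direction, moving start timing, and moving duration with the instruction; the dictionary keys and the frame tolerances are illustrative assumptions.
```python
def evaluate_movement(detected, instructed,
                      timing_tolerance_frames=5, duration_ratio=0.8):
    """Return True when the detected movement conforms to the instructed one.
    Both arguments are dicts with the keys 'direction', 'start_frame' and
    'duration_frames'."""
    direction_ok = detected["direction"] == instructed["direction"]
    timing_ok = abs(detected["start_frame"] - instructed["start_frame"]) <= timing_tolerance_frames
    duration_ok = detected["duration_frames"] >= duration_ratio * instructed["duration_frames"]
    return direction_ok and timing_ok and duration_ok
```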
  • In this embodiment, the movements of the first controller 20-1 and the second controller 20-2 are individually evaluated with respect to the instructions given by the instruction images 340-1 and 340-2 shown in FIG. 4.
  • When the movement of each of the first controller 20-1 and the second controller 20-2 has been determined to be good (i.e., the controller has been moved in the instructed direction at the instructed input timing and moved for the predetermined duration), the player scores 500 points corresponding to each determination result (score 320). When the movements of both of the first controller 20-1 and the second controller 20-2 have been determined to be good, the player scores 1000 points (score 320). In this case, the main dancer 310 displayed on the game screen operates to reproduce a dance corresponding to the operation of the player.
  • When the movements of both of the first controller 20-1 and the second controller 20-2 have been determined to be good, the player's operation may be evaluated more highly than in the case where the movement of only one of the first controller 20-1 and the second controller 20-2 has been determined to be good. An evaluation display area 322 that displays an evaluation of the player's operation is provided on the game screen (upper right) shown in FIG. 5. The characters "COOL" are displayed when the input direction and the input timing are correct, and the characters "POOR" are displayed when the input direction or the input timing is incorrect.
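  • The scoring and evaluation display described above can be summarized in a small sketch; the judgment is per controller, and the condition attached to "COOL" is assumed to be correct direction and timing, as stated above.
```python
def evaluate_controller(direction_ok, timing_ok, duration_ok):
    """Per-controller sketch: 500 points for a 'good' movement, so 1000 points
    are added to the score 320 when both controllers are judged good; 'COOL'
    is shown when the input direction and input timing are correct, 'POOR'
    otherwise."""
    good = direction_ok and timing_ok and duration_ok
    points = 500 if good else 0
    label = "COOL" if (direction_ok and timing_ok) else "POOR"
    return points, label
```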
  • Screens shown in FIGS. 13A to 13C may be displayed based on the evaluation result.
  • For example, the cause of the incorrect operation of the controller 20 may be specified based on the incorrect determination result. As shown in FIGS. 13A and 13B, a screen that instructs that the input timing has been incorrect, or a screen that instructs that the controller has been moved inappropriately may be displayed.
  • This prompts the player to move the controller 20 appropriately, so that the player corrects the dance and is encouraged to take on a more difficult dance game.
  • When the player has been determined to have appropriately moved the controller 20 in accordance with the movement instructed by the instruction image, a screen that displays the characters "Perfect" is shown (see FIG. 13C). Effect sound may also be generated to liven up the game.
  • When successive movements are instructed by the instruction image, a special event is generated when the player has successfully performed the successive movements. This improves the game production effect.
  • When the player has moved the controller 20 in accordance with the movement instructed by the instruction image 340, the moving path of the controller 20 may be displayed near the corresponding instruction image 340. The player can thus visually observe the degree of similarity between the movement instructed by the instruction image 340 and the actual movement of the controller 20 so that the player can enjoy the game while further improving his dance skill.
  • Determination Moving Start Timing and Determination Moving Duration
  • In this embodiment, when the movement of the controller in a predetermined direction and the moving start timing have been instructed by the instruction image 340, the first process and the second process are performed. The movement of the controller 20 is then detected, and whether or not the detected movement coincides with the instructed movement is then determined.
  • In this embodiment, the velocity vector (i.e., moving amount per unit time) of the controller 20 may be calculated as a composite value of the velocity vectors in the x, y, and z axis directions obtained from the acceleration sensor 210 of the controller 20. The velocity vector detected when starting the movement is small even if the player has moved the controller 20 in the direction at the moving start timing instructed by the instruction image 340, and reaches a predetermined reference value after several frames (e.g., a frame T3 when the movement has been started in a frame T1). Therefore, a delay by predetermined frames occurs between the timing at which the controller 20 has been operated and the timing at which the operation is detected.
  • In this embodiment, a determination moving start timing and a determination moving duration are set taking the above-mentioned detection delay into consideration in addition to the moving start timing and the moving duration instructed by the instruction image 340 to determine the movement of the controller 20.
  • Specifically, the determination moving start timing is set at a timing delayed by several frames as compared with the moving start timing instructed by the instruction image 340, and the determination moving duration is set to coincide with the finish timing of the moving duration instructed by the instruction image 340 or expire after the moving duration instructed by the instruction image 340 by predetermined frames.
  • This eliminates a problem due to the delay in detecting the moving start timing of the controller. Therefore, the movement of the controller can be appropriately determined.
  • According to this embodiment, the degree of difficulty in the game may be set by appropriately setting the determination moving start timing and the determination moving duration.
  • For example, the degree of difficulty can be decreased by increasing the delay of the determination moving start timing with respect to the moving start timing instructed by the instruction image, and can be increased by decreasing the delay of the determination moving start timing with respect to the moving start timing instructed by the instruction image.
  • The degree of difficulty can be increased by decreasing the determination moving duration, and can be decreased by increasing the determination moving duration.
  • Therefore, the determination moving start timing and the moving duration may be set corresponding to the degree of difficulty selected by the player, for example.
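  • A minimal sketch of how the determination moving start timing and the determination moving duration could be derived from the instructed values, with the delay and extension parameters doubling as difficulty settings; the specific frame counts are illustrative assumptions.
```python
def determination_window(instructed_start_frame, instructed_duration_frames,
                         start_delay_frames=3, finish_extension_frames=2):
    """The window used for the determination is delayed by a few frames relative
    to the instructed moving start timing to absorb the detection delay, and may
    extend a few frames past the instructed finish timing. A larger delay or a
    longer window lowers the degree of difficulty; a smaller delay or a shorter
    window raises it."""
    determination_start = instructed_start_frame + start_delay_frames
    determination_finish = (instructed_start_frame + instructed_duration_frames
                            + finish_extension_frames)
    return determination_start, determination_finish
```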
  • 3-3: Game Screen Position Pointing Instruction and Detection/Determination Process
  • The pointing position instruction section 104 according to this embodiment generates a pointing instruction image that instructs the player to point at a predetermined position on the game screen at a given timing with the progress of the game.
  • FIGS. 6A to 6C show specific examples of the pointing instruction image 350 displayed on the game screen. In FIG. 6A, the pointing instruction image 350 is displayed corresponding to the character on the left of the game screen.
  • As shown in FIGS. 15A to 15D, the pointing instruction image 350 includes a target board section 352 displayed at the pointing position, and a ring section 354 displayed around the target board section 352. As shown in FIG. 15A, the ring section 354 is displayed as a large ring that encloses the target board section 352 immediately after the pointing instruction image 350 has been displayed. The ring section 354 shrinks with the lapse of time (see FIGS. 15A to 15D). As shown in FIG. 15D, the ring section 354 overlaps the target board section 352 when the pointing timing has been reached.
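  • The shrinking of the ring section 354 toward the target board section 352 can be sketched as a simple interpolation of the ring radius over time; linear shrinking is an assumption, and only the requirement that the ring overlaps the target board at the pointing timing is taken from the description.
```python
def ring_radius(initial_radius, target_radius, display_time, pointing_time, now):
    """Radius of the ring section: it starts large around the target board section
    and shrinks so that it overlaps the target board exactly at the pointing timing."""
    t = (now - display_time) / float(pointing_time - display_time)
    t = max(0.0, min(1.0, t))
    return initial_radius + (target_radius - initial_radius) * t
```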
  • The player points the controller 20 (the first controller 20-1 held with the right hand in FIGS. 6A to 6C) at the position of the target board section 352 displayed on the game screen within a predetermined period in accordance with the pointing timing instruction.
  • The light sources 198L and 198R are provided around the display section 12, as described above. The player directs the imaging section 220 provided on the front side of the controller 20 toward the game screen to operate the controller 20 as a pointing device that points at an arbitrary point on the game screen. FIGS. 14A and 14B show an example of a state in which the player points at a desired position on the game screen using the controller 20.
  • The detection/determination section 110 according to this embodiment determines whether or not the player has pointed the target board section 352 instructed by the pointing instruction image 350 at the instructed timing based on a signal acquired from the imaging section 220 of the controller 20.
  • When the detection/determination section 110 has determined that the player has pointed the instructed position at the instructed timing, a predetermined event (e.g., the backing dancer is raised as shown in FIG. 6B) is generated, and a new pointing instruction image 350 is displayed at the center of the game screen, as shown in FIG. 6B.
  • As shown in FIG. 6B, the instruction images 340-1 and 340-2 that instruct a given movement of the controller are displayed corresponding to the new pointing instruction image 350.
  • In this case, when the player has successfully pointed the target board section 352 instructed by the newly displayed pointing instruction image 350 at the instructed timing, the timing marks 344 of the instruction images 340-1 and 340-2 move along the trace rails 342 when a specific period of time has elapsed to instruct given movements of the controllers 20-1 and 20-2 for the player.
  • When the player has successfully moved the controllers 20-1 and 20-2 in accordance with the instructions, a screen in which an acrobatic dance has succeeded is displayed, as shown in FIG. 6C, for example. When such successive movements have succeeded, a game event in which the number of backing dancers appearing on the screen is successively increased (see FIGS. 16A to 16C) may be generated to increase the interest of the game, for example.
  • 3-4: Multi-Player Mode
  • In this embodiment, a multi-player mode (two-player mode or four-player mode) can be selected when starting the game.
  • FIG. 17A shows an example of a game screen when a two-player mode has been selected.
  • In this case, two players hold the first controller 20-1 or the second controller 20-2. The players operate the controllers 20-1 and 20-2 in accordance with the instructions given by the instruction images 340-1 and 340-2 shown in FIG. 17A, and compete for the dance skill.
  • When four controllers 20 that differ in ID can be provided, a four-player mode shown in FIG. 17B can be selected.
  • Each player holds the corresponding controller 20, and observes the instruction image 340 displayed on the game screen.
  • In this embodiment, two players corresponding to two spotlighted dance characters are given instructions on the movement of the controller 20 using the instruction images 340-1 and 340-2.
  • Since the spotlighted dance character is changed with the progress of the game, the player visually recognizes his turn when the dance character corresponding to the player is spotlighted. The player moves the controller in accordance with the movement instructed by the corresponding instruction image 340 to compete for the dance skill.
  • This embodiment has been described taking an example in which one player utilizes one controller when performing a multi-player mode. Note that one player may utilize two controllers, if necessary.
  • 4. Process According to this Embodiment
  • An example of the process according to this embodiment is described below with reference to flowcharts shown in FIGS. 18 to 21.
  • FIG. 18 shows an operation example when applying this embodiment to a game system.
  • When the player has selected a single-player mode or a multi-player mode and started the game, the game calculation starts (step S10).
  • A cheerleading dance game screen is displayed on the display section 12, and background music (dance music) corresponding to the dance is output.
  • In this case, instructions may be given to the player as to the position of the controller 20. For example, instructions may be given to the player P as to whether to hold the controller 20 vertically or horizontally.
  • When the dance characters 310 and 312 displayed on the game screen start to dance, as shown in FIG. 5, the instruction images 340-1 and 340-2 that instruct the movements of the controllers 20-1 and 20-2 are displayed at a given timing with the progress of the game. The detection/determination process that determines whether or not the player has moved each of the controllers 20-1 and 20-2 in accordance with the instructions given by the instruction images 340-1 and 340-2 is then performed (steps S12 and S14).
  • When the display timing of the pointing instruction image 350 shown in FIGS. 6A to 6C has been reached, the pointing instruction image 350 is displayed on the game screen, as shown in FIGS. 6A to 6C, and the detection/determination process that determines whether or not the player has pointed the controller 20 at the area of the target board section 352 instructed by the pointing instruction image 350 at the instructed timing is performed (steps S20 and S22).
  • The above-described process is repeated until the game ends (step S30). The final game result is displayed when the game ends (step S32).
  • In this embodiment, the result display event based on the determination result (e.g., score calculation or screen display shown in FIGS. 17A and 17B) occurs corresponding to each determination result of the detection/determination process performed in the steps S14 and S22. In a step S32, the total value (total score) of each determination result of the detection/determination process performed in the steps S14 and S22 is calculated and displayed as the final game result.
  • FIG. 19 shows a specific example of the process performed in the step S14 shown in FIG. 18.
  • When the generation timing of the instruction image 340 has been reached with the progress of the game, the instruction images 340-1 and 340-2 are generated and displayed on the game screen (step S40), and whether or not the player has accurately moved the controllers 20-1 and 20-2 in real space in accordance with the instructions given by the instruction images 340-1 and 340-2 is determined (step S42).
  • The score 320 is updated based on the determination result, and characters “COOL” or “POOR” are displayed in the evaluation display area 322.
  • The production image shown in FIGS. 13A to 13C is displayed corresponding to the evaluation result (step S44).
  • The production image shown in FIGS. 13A to 13C may be displayed only in a predetermined scene during the game. The production image shown in FIGS. 13A to 13C may be displayed only when the player or the operator has performed the production screen display setting before starting the game.
  • FIG. 20 shows a specific example of the process performed in the step S40 shown in FIG. 19.
  • When the display timing of the instruction image 340 has been reached during the game, the transition image shown in FIGS. 9A and 9B is displayed before the moving start timing of the controller 20. Therefore, the player can determine the moving direction and the moving start timing of the controller and prepare for the movement before the moving start timing of the controller 20 is reached.
  • When the moving start timing of the controller has been reached (step S52), the timing mark 344 is moved toward the operation finish mark 346 along the trace rail 342, as shown in FIG. 9C.
  • The player moves the controller 20 in the direction instructed by the trace rail 342 at the moving timing of the timing mark 344. The player successively moves the controller 20 in accordance with the instructions until the timing mark 344 reaches the operation finish mark 346.
  • When the successive movement finish timing has been reached (see FIG. 9D) (step S56), the display of the instruction image 340 is finished (step S58).
  • The above display control process makes it possible to give visual instructions to the player as to the movement of the controller in the desired direction at an appropriate timing.
  • FIG. 21 shows a specific example of the detection/determination process performed in the step S42 shown in FIG. 19.
  • When the moving start timing of the controller instructed by the instruction image 340 has been reached (FIG. 9C) (step S60), signals output from the acceleration sensor of the controller 20 are acquired (step S62), and the above-mentioned first process and second process are performed (steps S64 and S66).
  • Specifically, the first process that determines the basic position which corresponds to the position of the controller 20 is performed, and the second process that determines the direction and the movement of the controller 20 and determines the degree of conformity with the movement instructed by the instruction image 340 is then performed.
  • FIG. 22 shows a specific example of the process performed in the step S22 shown in FIG. 18.
  • When the display timing of the pointing instruction image 350 shown in FIGS. 6A to 6C has been reached during the game, the pointing instruction image 350 is displayed at a given position of the game screen (step S70).
  • As shown in FIGS. 15A to 15D, the pointing instruction image 350 includes the target board section 352 that instructs the pointing area and the ring section 354 that shrinks toward the target board section 352. A timing at which the ring section 354 has shrunk to enclose the target board section 352 (i.e., the timing shown in FIG. 15D) is the timing at which the player should point the controller 20 at the position instructed by the target board section 352.
  • Whether or not the player has successfully pointed the controller 20 at the instructed position is determined (step S72). When it has been determined that the player has successfully pointed the controller 20 at the instructed position, an event corresponding to the pointing instruction is generated (step S74).
  • In the example shown in FIG. 6A, an event in which the pointed backing dancer lifts the adjacent backing dancer is generated. In the example shown in FIG. 6B, the instruction images 340-1 and 340-2 are displayed corresponding to the display position of the pointing instruction image 350. An operation that moves the controllers 20-1 and 20-2 in the direction instructed by the instruction images 340-1 and 340-2 is instructed after the player has successfully pointed the controller 20 at the instructed position, and the completion of an acrobatic dance shown in FIG. 6C is displayed when the player has successfully performed the successive operations.
  • Note that the invention is not limited to the above embodiments. Various modifications and variations may be made without departing from the scope of the invention.
  • For example, although the above embodiments have been described taking an example in which the invention is applied to a dance game, the invention is not limited thereto. The invention may be suitably applied to other applications, such as giving aerobic or exercise instructions to a student who holds a controller so that the student can perform appropriate aerobics or exercise and determining the result.
  • Although the above embodiments have been described taking an example in which the timing mark 344 is moved along the trace rail 342 as the instruction image 340, the invention is not limited thereto. An arbitrary instruction image may be used insofar as a given movement of the controller can be instructed. For example, the color of a given path displayed by the trace rail 342 may be changed corresponding to the moving timing and the operation duration of the controller. Alternatively, the moving direction may be displayed using an arrow or the like. The operation start timing may be instructed by a countdown display, and the operation duration may also be instructed by a countdown display.
  • Although only some embodiments of this invention have been described in detail above, those skilled in the art will readily appreciate that many modifications are possible in the embodiments without materially departing from the novel teachings and advantages of this invention. Accordingly, all such modifications are intended to be included within the scope of the invention.

Claims (18)

1. A program that causes a computer to function as:
an instruction image generation section that generates an instruction image that instructs a given movement of a controller including a physical quantity sensor; and
a detection/determination section that acquires a signal from the physical quantity sensor included in the controller, detects the movement of the controller, and determines the degree of conformity of the detected movement of the controller with the movement instructed by the instruction image.
2. The program as defined in claim 1, wherein:
the physical quantity sensor detects a physical quantity from which a moving direction and a moving amount per unit time can be derived;
the instruction image generation section generates the instruction image that instructs a moving direction and a moving timing of the controller as the movement; and
the detection/determination section acquires the signal from the physical quantity sensor, detects the moving direction and the moving timing of the controller, and determines the degree of conformity of the detected moving direction and moving timing of the controller with the instructions instructed by the instruction image.
3. The program as defined in claim 2, wherein:
the instruction image generation section generates the instruction image that instructs a moving start timing as the moving timing and instructs a moving duration; and
the detection/determination section acquires the signal from the physical quantity sensor, detects the moving direction, the moving start timing, and the moving duration of the controller, and determines the degree of conformity of the detected moving direction, moving start timing, and moving duration of the controller with the instructions instructed by the instruction image.
4. The program as defined in claim 1,
wherein the instruction image includes:
a moving path instruction image part that instructs a moving direction of the controller along a given path; and
a timing instruction image part that instructs a moving timing of the controller along the path,
the instruction image being updated according to a change in content of instruction.
5. The program as defined in claim 4,
wherein the timing instruction image part moves from a movement start instruction position to a movement finish instruction position along the moving path instruction image part to instruct a moving start timing and a moving duration of the controller along the instructed path.
6. The program as defined in claim 4,
wherein at least one of the moving path instruction image part and the timing instruction image part is displayed as a transition image that changes from a previous notice display that is displayed before the moving start timing to a main display when the moving start timing is reached.
7. The program as defined in claim 4, wherein:
the instruction image is displayed as a set of moving path instruction image parts that instruct a continuous moving path by combining a plurality of the moving path instruction image parts; and
the timing instruction image part moves along the continuously combined moving path instruction image parts to instruct the moving timing of the controller along a moving path instructed by each of the moving path instruction image parts.
8. The program as defined in claim 3,
wherein the detection/determination section acquires the signal from the physical quantity sensor, detects a timing and a duration when the moving amount of the controller per unit time exceeds a given value as the moving start timing and the moving duration of the controller; and
wherein the detection/determination section determines the degree of conformity of the detected moving start timing and moving duration of the controller with a moving start timing and determination moving duration for determination related to the instructed moving start timing and instructed moving duration instructed by the instruction image when the detection/determination section has determined that the detected moving direction of the controller coincides with the instructed moving direction instructed by the instruction image.
9. The program as defined in claim 1,
wherein the detection/determination section performs a first process that compares the signal output from the physical quantity sensor when moving the controller with a first type determination database and determines the position of the controller that is moved, the first type determination database being related to the signal output from the physical quantity sensor when moving the controller in different positions in the moving direction instructed by the instruction image, and the first type determination database being used to determine the position of the controller; and
wherein the detection/determination section performs a second process that compares the signal output from the physical quantity sensor when moving the controller in the position determined by the first process in the moving direction instructed by the instruction image with a second type determination database to specify the movement of the controller including at least the moving direction, and determines the degree of conformity of the specified movement with the instructed movement, the second type determination database being related to the position of the controller determined based on the first type determination database, and being used to determine the movement of the controller including at least the moving direction from the signal output from the physical quantity sensor when moving the controller in the moving direction instructed by the instruction image.
10. The program as defined in claim 1,
wherein the instruction image generation section generates the instruction image that individually instructs a given movement for at least two controllers each having the physical quantity sensor.
11. The program as defined in claim 1, the program further causing the computer to function as:
a game calculation section that instructs a player to perform a dance action accompanying the movement of the controller, generates a game screen including a character that performs a dance related to the instructed dance action based on an input from the controller, and generates a dance background music signal,
wherein the instruction image generation section generates the instruction image that instructs a given movement of the controller in the game screen.
12. The program as defined in claim 11,
the game calculation section including a subsection that performs game production related to a result of determination for the degree of conformity by the detection/determination section.
13. The program as defined in claim 11,
the game calculation section including a subsection that specifies a cause of an incorrect operation of the controller and notifies the player of the cause of the incorrect operation based on a result of determination for the degree of conformity by the detection/determination section.
14. The program as defined in claim 11,
the game calculation section including a subsection that traces the moving path of the controller detected by the detection/determination section in the game screen based on a given condition.
15. The program as defined in claim 11, the program further causing the computer to function as:
a pointing position detection section that acquires an imaging signal from an imaging section provided in the controller and detects a pointing position of the controller on the game screen, the imaging section imaging a reference position recognition body disposed or displayed at a position related to the game screen,
wherein the game calculation section includes a subsection that displays a position instruction image that instructs to point at a predetermined position of the game screen at a given timing during the game; and
wherein the game calculation section generates an event related to a pointing at the predetermined position when the pointing at the predetermined position has been detected at the timing.
16. The program as defined in claim 15,
wherein the instruction image generation section generates the instruction image that instructs a given movement of the controller related to the predetermined pointed position of the game screen as the event related to the pointing at the predetermined position.
17. A computer-readable information storage medium storing the program as defined in claim 1.
18. A game system comprising:
an instruction image generation section that generates an instruction image that instructs a given movement of a controller including a physical quantity sensor; and
a detection/determination section that acquires a signal from the physical quantity sensor included in the controller, detects the movement of the controller, and determines the degree of conformity of the detected movement of the controller with the movement instructed by the instruction image.
US12/201,619 2007-09-12 2008-08-29 Program, information storage medium, game system, and input instruction device Abandoned US20090069096A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2007237278 2007-09-12
JP2007-237278 2007-09-12
JP2008-211603 2008-08-20
JP2008211603A JP5410710B2 (en) 2007-09-12 2008-08-20 Program, information storage medium, game system

Publications (1)

Publication Number Publication Date
US20090069096A1 true US20090069096A1 (en) 2009-03-12

Family

ID=40326362

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/201,619 Abandoned US20090069096A1 (en) 2007-09-12 2008-08-29 Program, information storage medium, game system, and input instruction device

Country Status (2)

Country Link
US (1) US20090069096A1 (en)
EP (1) EP2039402B1 (en)

Cited By (55)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070255434A1 (en) * 2006-04-28 2007-11-01 Nintendo Co., Ltd. Storage medium storing sound output control program and sound output control apparatus
US20090310027A1 (en) * 2008-06-16 2009-12-17 James Fleming Systems and methods for separate audio and video lag calibration in a video game
US20100248833A1 (en) * 2009-03-31 2010-09-30 Nintendo Co., Ltd. Game apparatus and game program
US20100292005A1 (en) * 2009-05-12 2010-11-18 Takeshi Miyamoto Game apparatus and computer-readable storage medium having a game program stored thereon
US20110159959A1 (en) * 2009-12-24 2011-06-30 Sony Computer Entertainment Inc. Wireless Device Pairing Methods
US20110183765A1 (en) * 2010-01-27 2011-07-28 Namco Bandai Games Inc. Information storage medium, game system, and input determination method
US20110306422A1 (en) * 2010-06-11 2011-12-15 Namco Bandai Games Inc. Image generation system, image generation method, and information storage medium
US20120143358A1 (en) * 2009-10-27 2012-06-07 Harmonix Music Systems, Inc. Movement based recognition and evaluation
US20120229513A1 (en) * 2011-03-08 2012-09-13 Nintendo Co., Ltd. Storage medium having stored thereon information processing program, information processing apparatus, information processing system, and information processing method
US20120295705A1 (en) * 2011-05-20 2012-11-22 Konami Digital Entertainment Co., Ltd. Game device, game control method, and non-transitory information recording medium that records a program
US20120309512A1 (en) * 2011-06-03 2012-12-06 Nintendo Co., Ltd. Computer-readable storage medium having stored therein game program, game apparatus, game system, and game processing method
US8439733B2 (en) 2007-06-14 2013-05-14 Harmonix Music Systems, Inc. Systems and methods for reinstating a player within a rhythm-action game
US8444464B2 (en) 2010-06-11 2013-05-21 Harmonix Music Systems, Inc. Prompting a player of a dance game
US8449360B2 (en) 2009-05-29 2013-05-28 Harmonix Music Systems, Inc. Displaying song lyrics and vocal cues
US8465366B2 (en) 2009-05-29 2013-06-18 Harmonix Music Systems, Inc. Biasing a musical performance input to a part
US8550908B2 (en) 2010-03-16 2013-10-08 Harmonix Music Systems, Inc. Simulating musical instruments
US8599135B1 (en) * 2012-05-25 2013-12-03 Nintendo Co., Ltd. Controller device, information processing system, and communication method
US8663013B2 (en) 2008-07-08 2014-03-04 Harmonix Music Systems, Inc. Systems and methods for simulating a rock band experience
US8678896B2 (en) 2007-06-14 2014-03-25 Harmonix Music Systems, Inc. Systems and methods for asynchronous band interaction in a rhythm action game
US8686269B2 (en) 2006-03-29 2014-04-01 Harmonix Music Systems, Inc. Providing realistic interaction to a player of a music-based video game
US8702485B2 (en) 2010-06-11 2014-04-22 Harmonix Music Systems, Inc. Dance game and tutorial
WO2014081900A1 (en) * 2012-11-20 2014-05-30 Morinoske Co., Ltd. Curvate motion sensing and control system
US8749489B2 (en) 2012-05-25 2014-06-10 Nintendo Co., Ltd. Controller device, information processing system, and communication method
US9024166B2 (en) 2010-09-09 2015-05-05 Harmonix Music Systems, Inc. Preventing subtractive track separation
US9022862B2 (en) 2011-06-03 2015-05-05 Nintendo Co., Ltd. Computer-readable storage medium having stored therein information processing program, information processing apparatus, information processing system, and information processing method
US9030410B2 (en) 2012-05-25 2015-05-12 Nintendo Co., Ltd. Controller device, information processing system, and information processing method
US20150133206A1 (en) * 2012-04-30 2015-05-14 The Regents Of The University Of California Method and apparatus for mobile rehabilitation exergaming
US9070194B2 (en) 2012-10-25 2015-06-30 Microsoft Technology Licensing, Llc Planar surface detection
US9126114B2 (en) 2011-11-09 2015-09-08 Nintendo Co., Ltd. Storage medium, input terminal device, control system, and control method
US9262856B1 (en) * 2012-07-17 2016-02-16 Disney Enterprises, Inc. Providing content responsive to performance of available actions solicited via visual indications
US9358456B1 (en) 2010-06-11 2016-06-07 Harmonix Music Systems, Inc. Dance competition game
US9375640B2 (en) 2011-03-08 2016-06-28 Nintendo Co., Ltd. Information processing system, computer-readable storage medium, and information processing method
US20160196029A1 (en) * 2015-01-06 2016-07-07 LINE Plus Corporation Game system for providing rhythm game service and method therefor
US9539511B2 (en) 2011-03-08 2017-01-10 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for operating objects in a virtual world based on orientation data related to an orientation of a device
US9561443B2 (en) 2011-03-08 2017-02-07 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method
US20170072304A1 (en) 2015-06-12 2017-03-16 Nintendo Co., Ltd. Information processing system, information processing device, controller device and accessory
US9643085B2 (en) 2011-03-08 2017-05-09 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for controlling a virtual object using attitude data
US20170153709A1 (en) * 2015-11-26 2017-06-01 Colopl, Inc. Method of giving a movement instruction to an object in a virtual space, and program therefor
US20170266551A1 (en) * 2016-03-18 2017-09-21 Colopl, Inc. Game medium, method of using the game medium, and game system for using the game medium
US9925464B2 (en) 2011-03-08 2018-03-27 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for displaying an image on a display device using attitude data of a display device
US10076704B2 (en) 2013-03-11 2018-09-18 Capcom Co., Ltd. Game device
US10092829B2 (en) 2016-10-06 2018-10-09 Nintendo Co., Ltd. Attachment
US20180329215A1 (en) * 2015-12-02 2018-11-15 Sony Interactive Entertainment Inc. Display control apparatus and display control method
US20190030423A1 (en) * 2017-07-27 2019-01-31 Nintendo Co., Ltd. Game system, accessory, storage medium having stored therein game program, and game processing method
US10220303B1 (en) 2013-03-15 2019-03-05 Harmonix Music Systems, Inc. Gesture-based music game
US10258879B2 (en) 2015-06-12 2019-04-16 Nintendo Co., Ltd. Supporting device, charging device and controller system
US10328350B2 (en) 2016-10-06 2019-06-25 Nintendo Co., Ltd. Attachment and control system
US10357714B2 (en) 2009-10-27 2019-07-23 Harmonix Music Systems, Inc. Gesture-based user interface for navigating a menu
EP3591658A1 (en) * 2018-07-04 2020-01-08 Nokia Technologies Oy Method and apparatus for hand function monitoring
US11036987B1 (en) 2019-06-27 2021-06-15 Facebook Technologies, Llc Presenting artificial reality content using a mirror
US11055920B1 (en) * 2019-06-27 2021-07-06 Facebook Technologies, Llc Performing operations using a mirror in an artificial reality environment
US11145126B1 (en) * 2019-06-27 2021-10-12 Facebook Technologies, Llc Movement instruction using a mirror in an artificial reality environment
US11247121B2 (en) * 2019-08-07 2022-02-15 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having game program stored therein, game system, game apparatus, and game processing control method for correct direction determination
US11253776B2 (en) * 2017-12-28 2022-02-22 Bandai Namco Entertainment Inc. Computer device and evaluation control method
US11260286B2 (en) 2017-12-28 2022-03-01 Bandai Namco Entertainment Inc. Computer device and evaluation control method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5081964B2 (en) 2010-10-28 2012-11-28 Konami Digital Entertainment Co., Ltd. GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5982352A (en) * 1992-09-18 1999-11-09 Pryor; Timothy R. Method for providing human input to a computer
US6227968B1 (en) * 1998-07-24 2001-05-08 Konami Co., Ltd. Dance game apparatus and step-on base for dance game
US20020055383A1 (en) * 2000-02-24 2002-05-09 Namco Ltd. Game system and program
US6843726B1 (en) * 1999-09-07 2005-01-18 Konami Corporation Game system
US20050210418A1 (en) * 2004-03-23 2005-09-22 Marvit David L Non-uniform gesture precision
US20070060384A1 (en) * 2005-09-14 2007-03-15 Nintendo Co., Ltd. Storage medium storing video game program
US20070197290A1 (en) * 2003-09-18 2007-08-23 Ssd Company Limited Music Game Device, Music Game System, Operation Object, Music Game Program, And Music Game Method
US7331856B1 (en) * 1999-09-07 2008-02-19 Sega Enterprises, Ltd. Game apparatus, input device used in game apparatus and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001070640A (en) 1999-09-07 2001-03-21 Konami Co Ltd Game machine
JP3917456B2 (en) * 2001-08-09 2007-05-23 Konami Sports & Life Co., Ltd. Evaluation program, recording medium thereof, timing evaluation apparatus, timing evaluation system
US20050261073A1 (en) * 2004-03-26 2005-11-24 Smartswing, Inc. Method and system for accurately measuring and modeling a sports instrument swinging motion
JP4907128B2 (en) * 2005-08-30 2012-03-28 Nintendo Co., Ltd. Game system and game program
US8157651B2 (en) * 2005-09-12 2012-04-17 Nintendo Co., Ltd. Information processing program
EP1970104A4 (en) * 2005-12-12 2010-08-04 Ssd Co Ltd Training method, training device, and coordination training method

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5982352A (en) * 1992-09-18 1999-11-09 Pryor; Timothy R. Method for providing human input to a computer
US6227968B1 (en) * 1998-07-24 2001-05-08 Konami Co., Ltd. Dance game apparatus and step-on base for dance game
US6843726B1 (en) * 1999-09-07 2005-01-18 Konami Corporation Game system
US7331856B1 (en) * 1999-09-07 2008-02-19 Sega Enterprises, Ltd. Game apparatus, input device used in game apparatus and storage medium
US20020055383A1 (en) * 2000-02-24 2002-05-09 Namco Ltd. Game system and program
US20070197290A1 (en) * 2003-09-18 2007-08-23 Ssd Company Limited Music Game Device, Music Game System, Operation Object, Music Game Program, And Music Game Method
US20050210418A1 (en) * 2004-03-23 2005-09-22 Marvit David L Non-uniform gesture precision
US20070060384A1 (en) * 2005-09-14 2007-03-15 Nintendo Co., Ltd. Storage medium storing video game program
US20080318692A1 (en) * 2005-09-14 2008-12-25 Nintendo Co., Ltd. Storage medium storing video game program for calculating a distance between a game controller and a reference

Cited By (103)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8686269B2 (en) 2006-03-29 2014-04-01 Harmonix Music Systems, Inc. Providing realistic interaction to a player of a music-based video game
US20070255434A1 (en) * 2006-04-28 2007-11-01 Nintendo Co., Ltd. Storage medium storing sound output control program and sound output control apparatus
US7890199B2 (en) * 2006-04-28 2011-02-15 Nintendo Co., Ltd. Storage medium storing sound output control program and sound output control apparatus
US8678895B2 (en) 2007-06-14 2014-03-25 Harmonix Music Systems, Inc. Systems and methods for online band matching in a rhythm action game
US8444486B2 (en) 2007-06-14 2013-05-21 Harmonix Music Systems, Inc. Systems and methods for indicating input actions in a rhythm-action game
US8439733B2 (en) 2007-06-14 2013-05-14 Harmonix Music Systems, Inc. Systems and methods for reinstating a player within a rhythm-action game
US8678896B2 (en) 2007-06-14 2014-03-25 Harmonix Music Systems, Inc. Systems and methods for asynchronous band interaction in a rhythm action game
US20090310027A1 (en) * 2008-06-16 2009-12-17 James Fleming Systems and methods for separate audio and video lag calibration in a video game
US8663013B2 (en) 2008-07-08 2014-03-04 Harmonix Music Systems, Inc. Systems and methods for simulating a rock band experience
US20100248833A1 (en) * 2009-03-31 2010-09-30 Nintendo Co., Ltd. Game apparatus and game program
US20120094760A1 (en) * 2009-03-31 2012-04-19 Nintendo Co., Ltd. Game apparatus and game program
US8303412B2 (en) * 2009-03-31 2012-11-06 Nintendo Co., Ltd. Game apparatus and game program
US8353769B2 (en) * 2009-03-31 2013-01-15 Nintendo Co., Ltd. Game apparatus and game program
US20100292005A1 (en) * 2009-05-12 2010-11-18 Takeshi Miyamoto Game apparatus and computer-readable storage medium having a game program stored thereon
US9269175B2 (en) 2009-05-12 2016-02-23 Nintendo Co., Ltd. Game apparatus and computer-readable storage medium having a game program stored thereon
US8465366B2 (en) 2009-05-29 2013-06-18 Harmonix Music Systems, Inc. Biasing a musical performance input to a part
US8449360B2 (en) 2009-05-29 2013-05-28 Harmonix Music Systems, Inc. Displaying song lyrics and vocal cues
US10421013B2 (en) 2009-10-27 2019-09-24 Harmonix Music Systems, Inc. Gesture-based user interface
US9981193B2 (en) * 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
US10357714B2 (en) 2009-10-27 2019-07-23 Harmonix Music Systems, Inc. Gesture-based user interface for navigating a menu
US20120143358A1 (en) * 2009-10-27 2012-06-07 Harmonix Music Systems, Inc. Movement based recognition and evaluation
US8620213B2 (en) * 2009-12-24 2013-12-31 Sony Computer Entertainment Inc. Wireless device pairing methods
US20110159959A1 (en) * 2009-12-24 2011-06-30 Sony Computer Entertainment Inc. Wireless Device Pairing Methods
US8784201B2 (en) 2010-01-27 2014-07-22 Namco Bandai Games Inc. Information storage medium, game system, and input determination method
US20110183765A1 (en) * 2010-01-27 2011-07-28 Namco Bandai Games Inc. Information storage medium, game system, and input determination method
US8874243B2 (en) 2010-03-16 2014-10-28 Harmonix Music Systems, Inc. Simulating musical instruments
US9278286B2 (en) 2010-03-16 2016-03-08 Harmonix Music Systems, Inc. Simulating musical instruments
US8550908B2 (en) 2010-03-16 2013-10-08 Harmonix Music Systems, Inc. Simulating musical instruments
US8568234B2 (en) 2010-03-16 2013-10-29 Harmonix Music Systems, Inc. Simulating musical instruments
US8636572B2 (en) 2010-03-16 2014-01-28 Harmonix Music Systems, Inc. Simulating musical instruments
US9358456B1 (en) 2010-06-11 2016-06-07 Harmonix Music Systems, Inc. Dance competition game
US8444464B2 (en) 2010-06-11 2013-05-21 Harmonix Music Systems, Inc. Prompting a player of a dance game
US8655015B2 (en) * 2010-06-11 2014-02-18 Namco Bandai Games Inc. Image generation system, image generation method, and information storage medium
US8702485B2 (en) 2010-06-11 2014-04-22 Harmonix Music Systems, Inc. Dance game and tutorial
US20110306422A1 (en) * 2010-06-11 2011-12-15 Namco Bandai Games Inc. Image generation system, image generation method, and information storage medium
US8562403B2 (en) 2010-06-11 2013-10-22 Harmonix Music Systems, Inc. Prompting a player of a dance game
US9024166B2 (en) 2010-09-09 2015-05-05 Harmonix Music Systems, Inc. Preventing subtractive track separation
US9643085B2 (en) 2011-03-08 2017-05-09 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for controlling a virtual object using attitude data
US9539511B2 (en) 2011-03-08 2017-01-10 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for operating objects in a virtual world based on orientation data related to an orientation of a device
US20120229513A1 (en) * 2011-03-08 2012-09-13 Nintendo Co., Ltd. Storage medium having stored thereon information processing program, information processing apparatus, information processing system, and information processing method
US9925464B2 (en) 2011-03-08 2018-03-27 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for displaying an image on a display device using attitude data of a display device
US9375640B2 (en) 2011-03-08 2016-06-28 Nintendo Co., Ltd. Information processing system, computer-readable storage medium, and information processing method
US20120229455A1 (en) * 2011-03-08 2012-09-13 Nintendo Co., Ltd. Storage medium having stored thereon information processing program, information processing apparatus, information processing system, and information processing method
US9561443B2 (en) 2011-03-08 2017-02-07 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method
US8845430B2 (en) 2011-03-08 2014-09-30 Nintendo Co., Ltd. Storage medium having stored thereon game program, game apparatus, game system, and game processing method
US9526981B2 (en) 2011-03-08 2016-12-27 Nintendo Co., Ltd. Storage medium having stored thereon information processing program, information processing apparatus, information processing system, and information processing method
US9205327B2 (en) 2011-03-08 2015-12-08 Nintendo Co., Ltd. Storage medium having information processing program stored thereon, information processing apparatus, information processing system, and information processing method
US9522323B2 (en) * 2011-03-08 2016-12-20 Nintendo Co., Ltd. Storage medium having stored thereon information processing program, information processing apparatus, information processing system, and information processing method
US9492743B2 (en) * 2011-03-08 2016-11-15 Nintendo Co., Ltd. Storage medium having stored thereon information processing program, information processing apparatus, information processing system, and information processing method
US9492742B2 (en) * 2011-03-08 2016-11-15 Nintendo Co., Ltd. Storage medium having stored thereon information processing program, information processing apparatus, information processing system, and information processing method
US9345962B2 (en) 2011-03-08 2016-05-24 Nintendo Co., Ltd. Storage medium having stored thereon information processing program, information processing apparatus, information processing system, and information processing method
US20120229454A1 (en) * 2011-03-08 2012-09-13 Nintendo Co., Ltd. Storage medium having stored thereon information processing program, information processing apparatus, information processing system, and information processing method
US9370712B2 (en) 2011-03-08 2016-06-21 Nintendo Co., Ltd. Information processing system, information processing apparatus, storage medium having information processing program stored therein, and image display method for controlling virtual objects based on at least body state data and/or touch position data
US20120295705A1 (en) * 2011-05-20 2012-11-22 Konami Digital Entertainment Co., Ltd. Game device, game control method, and non-transitory information recording medium that records a program
US9101839B2 (en) * 2011-06-03 2015-08-11 Nintendo Co., Ltd. Computer-readable storage medium having stored therein game program, game apparatus, game system, and game processing method
US9022862B2 (en) 2011-06-03 2015-05-05 Nintendo Co., Ltd. Computer-readable storage medium having stored therein information processing program, information processing apparatus, information processing system, and information processing method
US20120309512A1 (en) * 2011-06-03 2012-12-06 Nintendo Co., Ltd. Computer-readable storage medium having stored therein game program, game apparatus, game system, and game processing method
US9126114B2 (en) 2011-11-09 2015-09-08 Nintendo Co., Ltd. Storage medium, input terminal device, control system, and control method
US20150133206A1 (en) * 2012-04-30 2015-05-14 The Regents Of The University Of California Method and apparatus for mobile rehabilitation exergaming
US9615048B2 (en) 2012-05-25 2017-04-04 Nintendo Co., Ltd. Controller device, information processing system, and communication method
US9030410B2 (en) 2012-05-25 2015-05-12 Nintendo Co., Ltd. Controller device, information processing system, and information processing method
US10429961B2 (en) 2012-05-25 2019-10-01 Nintendo Co., Ltd. Controller device, information processing system, and information processing method
US8749489B2 (en) 2012-05-25 2014-06-10 Nintendo Co., Ltd. Controller device, information processing system, and communication method
US8599135B1 (en) * 2012-05-25 2013-12-03 Nintendo Co., Ltd. Controller device, information processing system, and communication method
US9262856B1 (en) * 2012-07-17 2016-02-16 Disney Enterprises, Inc. Providing content responsive to performance of available actions solicited via visual indications
US9070194B2 (en) 2012-10-25 2015-06-30 Microsoft Technology Licensing, Llc Planar surface detection
WO2014081900A1 (en) * 2012-11-20 2014-05-30 Morinoske Co., Ltd. Curvate motion sensing and control system
US10076704B2 (en) 2013-03-11 2018-09-18 Capcom Co., Ltd. Game device
US10220303B1 (en) 2013-03-15 2019-03-05 Harmonix Music Systems, Inc. Gesture-based music game
US20160196029A1 (en) * 2015-01-06 2016-07-07 LINE Plus Corporation Game system for providing rhythm game service and method therefor
US10583356B2 (en) 2015-06-12 2020-03-10 Nintendo Co., Ltd. Information processing system, information processing device, controller device and accessory
US10258879B2 (en) 2015-06-12 2019-04-16 Nintendo Co., Ltd. Supporting device, charging device and controller system
US11951386B2 (en) 2015-06-12 2024-04-09 Nintendo Co., Ltd. Information processing system, information processing device, controller device and accessory
US10118093B2 (en) * 2015-06-12 2018-11-06 Nintendo Co., Ltd. Information processing system, information processing device, controller device and accessory
US11110344B2 (en) 2015-06-12 2021-09-07 Nintendo Co., Ltd. Information processing system, information processing device, controller device and accessory
EP3103532B1 (en) * 2015-06-12 2020-07-22 Nintendo Co., Ltd. Game controller
US10661160B2 (en) 2015-06-12 2020-05-26 Nintendo Co., Ltd. Game controller
US10543423B2 (en) 2015-06-12 2020-01-28 Nintendo Co., Ltd. Information processing system, information processing device, controller device and accessory
US10610776B2 (en) 2015-06-12 2020-04-07 Nintendo Co., Ltd. Supporting device, charging device and controller system
US10010789B2 (en) 2015-06-12 2018-07-03 Nintendo Co., Ltd. Information processing system, information processing device, controller device and accessory
US11141654B2 (en) 2015-06-12 2021-10-12 Nintendo Co., Ltd. Game controller
US20170072304A1 (en) 2015-06-12 2017-03-16 Nintendo Co., Ltd. Information processing system, information processing device, controller device and accessory
US20170136353A1 (en) * 2015-06-12 2017-05-18 Nintendo Co., Ltd. Information processing system, information processing device, controller device and accessory
US11724178B2 (en) 2015-06-12 2023-08-15 Nintendo Co., Ltd. Game controller
US9952679B2 (en) * 2015-11-26 2018-04-24 Colopl, Inc. Method of giving a movement instruction to an object in a virtual space, and program therefor
US20170153709A1 (en) * 2015-11-26 2017-06-01 Colopl, Inc. Method of giving a movement instruction to an object in a virtual space, and program therefor
US11042038B2 (en) * 2015-12-02 2021-06-22 Sony Interactive Entertainment Inc. Display control apparatus and display control method
US11768383B2 (en) 2015-12-02 2023-09-26 Sony Interactive Entertainment Inc. Display control apparatus and display control method
US20180329215A1 (en) * 2015-12-02 2018-11-15 Sony Interactive Entertainment Inc. Display control apparatus and display control method
US20170266551A1 (en) * 2016-03-18 2017-09-21 Colopl, Inc. Game medium, method of using the game medium, and game system for using the game medium
US10279256B2 (en) * 2016-03-18 2019-05-07 Colopl, Inc. Game medium, method of using the game medium, and game system for using the game medium
US10328350B2 (en) 2016-10-06 2019-06-25 Nintendo Co., Ltd. Attachment and control system
US10596454B2 (en) 2016-10-06 2020-03-24 Nintendo Co., Ltd. Attachment
US10092829B2 (en) 2016-10-06 2018-10-09 Nintendo Co., Ltd. Attachment
US10888770B2 (en) * 2017-07-27 2021-01-12 Nintendo Co., Ltd. Game system, accessory, storage medium having stored therein game program, and game processing method
US20190030423A1 (en) * 2017-07-27 2019-01-31 Nintendo Co., Ltd. Game system, accessory, storage medium having stored therein game program, and game processing method
US11253776B2 (en) * 2017-12-28 2022-02-22 Bandai Namco Entertainment Inc. Computer device and evaluation control method
US11260286B2 (en) 2017-12-28 2022-03-01 Bandai Namco Entertainment Inc. Computer device and evaluation control method
EP3591658A1 (en) * 2018-07-04 2020-01-08 Nokia Technologies Oy Method and apparatus for hand function monitoring
US11036987B1 (en) 2019-06-27 2021-06-15 Facebook Technologies, Llc Presenting artificial reality content using a mirror
US11055920B1 (en) * 2019-06-27 2021-07-06 Facebook Technologies, Llc Performing operations using a mirror in an artificial reality environment
US11145126B1 (en) * 2019-06-27 2021-10-12 Facebook Technologies, Llc Movement instruction using a mirror in an artificial reality environment
US11247121B2 (en) * 2019-08-07 2022-02-15 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having game program stored therein, game system, game apparatus, and game processing control method for correct direction determination

Also Published As

Publication number Publication date
EP2039402A3 (en) 2012-01-04
EP2039402A2 (en) 2009-03-25
EP2039402B1 (en) 2020-02-19

Similar Documents

Publication Publication Date Title
EP2039402B1 (en) Input instruction device, input instruction method, and dancing simulation system using the input instruction device and method
JP5410710B2 (en) Program, information storage medium, game system
JP6754678B2 (en) Simulation system and program
US8784201B2 (en) Information storage medium, game system, and input determination method
JP5081964B2 (en) GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
US7855713B2 (en) Program, input evaluation system, and input evaluation method
JP5491217B2 (en) Program, information storage medium, game system
US20090244064A1 (en) Program, information storage medium, and image generation system
US8655015B2 (en) Image generation system, image generation method, and information storage medium
US8520901B2 (en) Image generation system, image generation method, and information storage medium
US8684837B2 (en) Information processing program, information processing system, information processing apparatus, and information processing method
JP2008136694A (en) Program, information storage medium and game apparatus
JP6362634B2 (en) Image generation system, game device, and program
US8384661B2 (en) Program, information storage medium, determination device, and determination method
JP2016140378A (en) Game machine, game system, and program
JP2010233671A (en) Program, information storage medium and game device
JP2009279050A (en) Program, information storage medium, and game apparatus
EP2253358B1 (en) Game device, game device control method, program, information storage medium
JP6732463B2 (en) Image generation system and program
JPH11146978A (en) Three-dimensional game unit, and information recording medium
JP2006268511A (en) Program, information storage medium and image generation system
CN112104857A (en) Image generation system, image generation method, and information storage medium
JP7104539B2 (en) Simulation system and program
JP6918189B2 (en) Simulation system and program
JP2008067853A (en) Program, information storage medium and image generation system

Legal Events

Date Code Title Description
AS Assignment

Owner name: NAMCO BANDAI GAMES INC., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NISHIMOTO, YASUHIRO;REEL/FRAME:021745/0497

Effective date: 20081010

AS Assignment

Owner name: BANDAI NAMCO GAMES INC., JAPAN

Free format text: CHANGE OF NAME;ASSIGNOR:NAMCO BANDAI GAMES INC.;REEL/FRAME:033061/0930

Effective date: 20140401

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION