US20070197290A1 - Music Game Device, Music Game System, Operation Object, Music Game Program, And Music Game Method - Google Patents


Info

Publication number
US20070197290A1
Authority
US
United States
Prior art keywords
guide
operation article
cursor
information
music
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/572,429
Inventor
Hiromu Ueshima
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SSD Co Ltd
Original Assignee
SSD Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SSD Co Ltd
Priority to US 10/572,429
Priority claimed from PCT/JP2004/014025 (WO2005028053A1)
Assigned to SSD COMPANY LIMITED (assignment of assignors interest); assignor: UESHIMA, HIROMU
Publication of US20070197290A1

Classifications

    • A63F 13/814: Musical performances, e.g. by evaluating the player's ability to follow a notation
    • A63F 13/20: Input arrangements for video game devices
    • A63F 13/213: Input arrangements comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 13/219: Input arrangements for aiming at specific areas on the display, e.g. light-guns
    • A63F 13/245: Constructional details of game controllers specially adapted to a particular type of game, e.g. steering wheels
    • A63F 13/44: Processing input control signals involving timing of operations, e.g. performing an action within a time slot
    • A63F 13/45: Controlling the progress of the video game
    • A63F 13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene
    • A63F 13/5375: Visual indicators for graphically or textually suggesting an action, e.g. an arrow indicating a turn in a driving game
    • A63F 13/655: Generating or modifying game content automatically from real world data, e.g. by importing photos of the player
    • A63F 2300/10: Input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/1087: Input arrangements comprising photodetecting means, e.g. a camera
    • A63F 2300/55: Details of game data or player data management
    • A63F 2300/63: Methods for controlling the execution of the game in time
    • A63F 2300/6607: Rendering three dimensional images for animating game characters, e.g. skeleton kinematics
    • A63F 2300/695: Involving elements of the real world in the game world, e.g. imported photos of the player
    • G10H 1/0008: Details of electrophonic musical instruments; associated control or indicating means
    • G10H 1/0033: Recording/reproducing or transmission of music for electrophonic musical instruments
    • G10H 2220/201: User input interfaces for movement interpretation, i.e. capturing and recognizing a gesture or a specific kind of movement
    • G10H 2220/415: Input interfaces using infrared light beams
    • G10H 2220/455: Camera input, e.g. analyzing pictures from a video camera and using the analysis results as control data

Definitions

  • the present invention relates to a music game apparatus which displays images following the motion of an operation article and the related arts.
  • Patent document 1 (Japanese Patent Published Application No. 2002-263360) discloses a music conducting game apparatus.
  • This music conducting game apparatus is provided with a phototransmitter unit at the tip of a baton controller and a photoreceiver unit at a lower position of a monitor. The motion of the baton controller is detected by this configuration.
  • Patent document 2 Japanese Patent Published Application No. Hei 10-143151 discloses a conducting apparatus.
  • In this apparatus, music parameters such as tempo, accent and dynamics are calculated with reference to the trajectory of the mouse.
  • The calculated music parameters are reflected in the music and image as output. For example, in the case where a motion picture of a steam train is displayed, the speed of the steam train is controlled to follow the calculated tempo, the variation of the speed is controlled to follow the calculated accent, and the amount of smoke of the steam train is controlled to follow the calculated dynamics.
  • However, the displayed image (the steam train in the above example) is not interesting enough, and little importance is attached to providing an image that the player can enjoy.
  • Further, the baton controller of Patent document 1 is provided with the phototransmitter unit, so an electronic circuit is indispensable. Accordingly, the cost of the baton controller rises, and the circuit can be a cause of trouble. Still further, the manipulability is degraded. In particular, since the baton controller is used by swinging it, it is desirable to dispense with an electronic circuit and simplify the configuration.
  • The mouse of Patent document 2 can be moved only on a plane surface, so there are substantial restrictions on the manipulation; in addition, it suffers from the same problems as the baton controller of Patent document 1.
  • a music game apparatus operable to automatically play music comprises: a stroboscope operable to irradiate an operation article manipulated by a player with light in a predetermined cycle; an imaging unit operable to generate a lighted image signal and an unlighted image signal by capturing images of the operation article respectively when said stroboscope is lighted and unlighted; a differential signal generating unit operable to generate a differential signal between the lighted image signal and the unlighted image signal; a state information calculating unit operable to calculate the state information of the operation article on the basis of the differential signal; a guide control unit operable to control the display of a guide for the manipulation of a cursor, which moves in association with the operation article, in a timing on the basis of the music; a cursor control unit operable to control the display of the cursor on the basis of the state information of the operation article; and a follow-up image control unit operable to control the display of an image in accordance with guidance by the guide when the cursor is correctly manipulated by the operation article in correspondence with the guide.
  • the display of the image is controlled in accordance with the guidance by the guide.
  • the display of the image is controlled in accordance with the manipulation of the cursor.
  • the display of the image is controlled in accordance with the manipulation of the operation article.
  • the state information of the operation article is obtained by capturing the image of the operation article, which is intermittently lighted by the stroboscope. Because of this, no circuit which is driven by a power supply need be provided within the operation article for obtaining the state information of the operation article. Furthermore, this music game apparatus serves to automatically play music.
  • the operation article is manipulated in synchronization with music as long as the player manipulates the cursor in correspondence with the guide. Accordingly, the player can enjoy the manipulation of the operation article in synchronization with music.
  • the “manipulation” of the operation article means moving the operation article itself (for example, changing the position thereof), but does not mean pressing a switch, moving an analog stick, and so forth.
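  • As a non-authoritative illustration of the differential-imaging idea described above, the following Python sketch subtracts the unlighted frame from the lighted frame so that only the strobe-lit operation article survives, then takes the brightest pixel as a target point. The 32×32 frame size matches the embodiment described later; the threshold value and all function names are assumptions made for this sketch.

```python
import numpy as np

SENSOR_W, SENSOR_H = 32, 32   # the embodiment assumes a 32x32 gray-scale sensor
THRESHOLD = 64                # assumed luminance threshold for level filtering

def differential_signal(lighted: np.ndarray, unlighted: np.ndarray) -> np.ndarray:
    """Subtract the unlighted image signal from the lighted image signal so
    that only the light reflected by the operation article remains."""
    diff = lighted.astype(np.int16) - unlighted.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)

def target_point(diff: np.ndarray):
    """Return (x, y) of the brightest pixel of the differential signal, or
    None when nothing exceeds the threshold (operation article not imaged)."""
    y, x = np.unravel_index(np.argmax(diff), diff.shape)
    if diff[y, x] < THRESHOLD:
        return None
    return int(x), int(y)
```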
  • the guide is operable to guide the cursor to a destination position in a manipulation timing
  • said follow-up image control unit is operable to control the display of the image in correspondence with the direction of the destination position as guided by the guide when the cursor is correctly manipulated by the operation article in correspondence with the guide.
  • said state information calculating unit is operable to calculate the position of the operation article as the state information on the basis of the differential signal
  • said follow-up image control unit is operable to determine that the cursor, which moves in association with the operation article, is correctly manipulated in correspondence with the guide if the position of the operation article as calculated by said state information calculating unit is located in an area corresponding to the guidance by the guide within a period corresponding to the guidance by the guide.
  • the guide is operable to guide the moving path, moving direction and manipulation timing of the cursor.
  • said state information calculating unit is operable to calculate the position of the operation article as the state information on the basis of the differential signal
  • said follow-up image control unit is operable to determine that the cursor, which moves in association with the operation article, is correctly manipulated in correspondence with the guide if the position of the operation article as calculated by said state information calculating unit is moved through a plurality of predetermined areas guided by the guide in a predetermined order guided by the guide within a period guided by the guide.
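  • A minimal sketch of this judgment, assuming rectangular areas, a caller-supplied clock, and invented names (none of which come from the patent): the cursor position must enter each guided area in the guided order before the guided period expires. The single-area judgment of the earlier paragraph corresponds to a list containing one area.

```python
from dataclasses import dataclass

@dataclass
class Area:
    x0: int
    y0: int
    x1: int
    y1: int

    def contains(self, x: int, y: int) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

@dataclass
class PathGuideJudge:
    areas: list        # predetermined areas, in the order guided by the guide
    deadline: float    # end of the period guided by the guide (seconds)
    _next: int = 0     # index of the next area the cursor must enter

    def update(self, x: int, y: int, now: float) -> str:
        """Feed the latest cursor position; returns 'hit', 'miss' or 'pending'."""
        if now > self.deadline:
            return "miss"
        if self._next < len(self.areas) and self.areas[self._next].contains(x, y):
            self._next += 1
        return "hit" if self._next == len(self.areas) else "pending"
```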
  • the guide is displayed in each of a plurality of positions which are determined in advance on a screen, and the guide control unit is operable to change the appearance of the guide in a timing on the basis of the music.
  • the player can easily recognize the position and the direction to which the cursor is to be moved with reference to the change in appearance of the position guide.
  • the appearance of the guide is related to either or both of the shape and color of the guide.
  • the guide is expressed in an image with which it is possible to visually recognize the motion from a first predetermined position to a second predetermined position on a screen, and wherein the guide control unit is operable to control the display of the guide in a timing on the basis of the music.
  • the player can clearly recognize the direction and path of the cursor to be moved.
  • the guide is expressed by the change in appearance of a plurality of objects which are arranged in a path having a start point at the first predetermined position and an end point at the second predetermined position on the screen.
  • the player can easily recognize the direction and path of the cursor to be moved with reference to the change in appearance of the plurality of objects.
  • the guide is expressed by an object moving from the first predetermined position to the second predetermined position on the screen.
  • the player can easily recognize the direction and path of the cursor to be moved with reference to the motion of the object.
  • the guide is expressed by the change in appearance of a path having a start point at the first predetermined position and an end point at the second predetermined position on the screen.
  • the player can easily recognize the direction and path of the cursor to be moved with reference to the change in appearance of the path.
  • the state information of the operation article as calculated by said state information calculating unit is any one of or any combination of two or more of speed information, moving direction information, moving distance information, velocity vector information, acceleration information, movement locus information, area information, and positional information.
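  • The sketch below shows, under stated assumptions (a fixed frame interval and invented names), how each listed kind of state information can be derived from successive target-point positions: moving distance and speed from two positions, acceleration from three, and the accumulated position list serving as the movement locus.

```python
import math

FRAME_DT = 1 / 60.0   # assumed frame interval in seconds

class StateInfo:
    """Accumulates target-point positions and derives motion quantities."""

    def __init__(self):
        self.locus = []   # movement locus information: all positions so far

    def push(self, pos):
        self.locus.append(pos)
        info = {"position": pos}
        if len(self.locus) >= 2:
            (x0, y0), (x1, y1) = self.locus[-2], self.locus[-1]
            dx, dy = x1 - x0, y1 - y0
            dist = math.hypot(dx, dy)                 # moving distance
            info["moving_distance"] = dist
            info["speed"] = dist / FRAME_DT           # scalar speed
            info["velocity"] = (dx / FRAME_DT, dy / FRAME_DT)
            if dist > 0:
                info["moving_direction"] = (dx / dist, dy / dist)
        if len(self.locus) >= 3:
            (xa, ya), (xb, yb), (xc, yc) = self.locus[-3:]
            # second difference of position approximates acceleration
            info["acceleration"] = ((xc - 2 * xb + xa) / FRAME_DT ** 2,
                                    (yc - 2 * yb + ya) / FRAME_DT ** 2)
        return info
```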
  • FIG. 1 is a view showing the overall configuration of the music game system in accordance with an embodiment of the present invention.
  • FIG. 2 is a perspective view of the operation article of FIG. 1 .
  • FIG. 3 ( a ) is a top view showing the reflection ball of FIG. 2 .
  • FIG. 3 ( b ) is a side view for showing the reflection ball as seen from arrow A of FIG. 3 ( a ).
  • FIG. 3 ( c ) is a side view for showing the reflection ball as seen from arrow B of FIG. 3 ( a ).
  • FIG. 4 is a longitudinal section view of the reflection ball of FIG. 2 .
  • FIG. 5 is an explanatory schematic diagram for showing one example of the imaging unit of FIG. 1 .
  • FIG. 6 is a view showing the electric configuration of the music game apparatus of FIG. 1 .
  • FIG. 7 is a block diagram of the high speed processor of FIG. 6 .
  • FIG. 8 is a circuit diagram for showing an LED drive circuit and the configuration for transferring pixel data from the image sensor of FIG. 6 to a high speed processor.
  • FIG. 9 ( a ) is a timing diagram of a frame status flag signal FSF as output from the image sensor of FIG. 8 .
  • FIG. 9 ( b ) is a timing diagram of a pixel data strobe signal PDS as output from the image sensor of FIG. 8 .
  • FIG. 9 ( c ) is a timing diagram of the pixel data D (X, Y) as output from the image sensor of FIG. 8 .
  • FIG. 9 ( d ) is a timing diagram of an LED control signal LEDC as output from the image sensor of FIG. 8 .
  • FIG. 9 ( e ) is a timing diagram showing the lighting state of the infrared light emitting diodes of FIG. 8 .
  • FIG. 9 ( f ) is a timing diagram showing the exposure period of the image sensor of FIG. 8 .
  • FIG. 10 ( a ) is an expanded view showing the frame status flag signal FSF of FIG. 9 .
  • FIG. 10 ( b ) is an expanded view showing the pixel data strobe signal PDS of FIG. 9 .
  • FIG. 10 ( c ) is an expanded view showing the pixel data D (X, Y) of FIG. 9 .
  • FIG. 11 is a view for showing an example of the game screen as displayed on the screen of the television monitor of FIG. 1 .
  • FIG. 12 is a view for showing another example of the game screen as displayed on the screen of the television monitor of FIG. 1 .
  • FIG. 13 is a view for showing a further example of the game screen as displayed on the screen of the television monitor of FIG. 1 .
  • FIG. 14 is a view for explaining the sprites forming an object which is displayed on the screen of the television monitor of FIG. 1 .
  • FIG. 15 is an explanatory view for showing a background screen to be displayed on the screen of the television monitor of FIG. 1 .
  • FIG. 16 ( a ) is an explanatory view for showing the background screen of FIG. 15 before scrolling it.
  • FIG. 16 ( b ) is an explanatory view for showing the background screen after scrolling it.
  • FIG. 17 is a schematic representation of a program and data stored in the ROM of FIG. 6 .
  • FIG. 18 is a schematic representation of one example of the first musical score data of FIG. 17 .
  • FIG. 19 is a schematic representation of one example of the second musical score data of FIG. 17 .
  • FIG. 20 ( a ) is a view showing the correspondence between note numbers and the directions in which the cursor is guided.
  • FIG. 20 ( b ) is another view showing the correspondence between note numbers and the directions in which the cursor is guided.
  • FIG. 20 ( c ) is a further view showing the correspondence between note numbers and the directions in which the cursor is guided.
  • FIG. 21 ( a ) is a view for showing an example of the image which is captured by an ordinarily used image sensor and is not subjected to any particular processing.
  • FIG. 21 ( b ) is a view for showing an example of the image which is obtained by level filtering the image signal of FIG. 21 ( a ) by a certain threshold value.
  • FIG. 21 ( c ) is a view for showing an example of the image which is captured by the image sensor through the infrared filter with the illumination and level filtered by a certain threshold value.
  • FIG. 21 ( d ) is a view for showing an example of the image which is captured by the image sensor through the infrared filter without the illumination and is level filtered by a certain threshold value.
  • FIG. 21 ( e ) is a view for showing an example of the differential signal between the image signal with the illumination and the image signal without the illumination.
  • FIG. 22 is a view for explaining the process of calculating the coordinates of the target point of the operation article of FIG. 1 .
  • FIG. 23 ( a ) is a view for explaining the process of scanning in the X-direction when the coordinates of the target point of the operation article of FIG. 1 are calculated on the basis of the coordinates of the pixel having the maximum luminance value.
  • FIG. 23 ( b ) is a view for explaining the process of starting scanning in the Y-direction when the coordinates of the target point of the operation article of FIG. 1 are calculated on the basis of the coordinates of the pixel having the maximum luminance value.
  • FIG. 23 ( c ) is a view for explaining the process of scanning in the Y-direction when the coordinates of the target point of the operation article of FIG. 1 are calculated on the basis of the coordinates of the pixel having the maximum luminance value.
  • FIG. 23 ( d ) is an explanatory view for showing the result of the process of calculating the coordinates of the target point of the operation article on the basis of the coordinates of the pixel having the maximum luminance value.
  • FIG. 24 is a view for explaining a target point existence area determination process ( 1 ) performed by the CPU 201 .
  • FIG. 25 is a view for explaining a target point existence area determination process ( 2 ) performed by the CPU 201 .
  • FIG. 26 is a view for explaining the registration process of the animations of the direction guide “G”, the position guide “g” and the path guide “rg” in accordance with the present embodiment.
  • FIG. 27 is a view for showing an example of the animation table which is designated by the animation table storage location information of FIG. 26 .
  • FIG. 28 is a timing diagram for explaining the relationship among the first musical score data, the second musical score data, the direction guide “G”, the position guide “g”, the judgment of manipulation and the dance animation in accordance with the present embodiment.
  • FIG. 29 is a flow chart showing the entire process flow of the music game apparatus 1 of FIG. 1 .
  • FIG. 30 is a flow chart showing the process flow for the initial settings of the system in step S 1 of FIG. 29 .
  • FIG. 31 is a flow chart showing the process flow for sensor initial settings in step S 14 of FIG. 30 .
  • FIG. 32 is a flow chart showing the command transmission process in step S 21 of FIG. 31 .
  • FIG. 33 ( a ) is a timing diagram showing the register setting clock CLK of FIG. 8 .
  • FIG. 33 ( b ) is a timing diagram showing register data of FIG. 8 .
  • FIG. 34 is a flow chart showing the register setting process in step S 23 of FIG. 31 .
  • FIG. 35 is a flow chart showing the process of calculating the state information in step S 2 of FIG. 29 .
  • FIG. 36 is a flow chart showing the process flow of acquiring a pixel data group in step S 50 of FIG. 35 .
  • FIG. 37 is a flow chart showing the process flow of acquiring pixel data in step S 61 of FIG. 36 .
  • FIG. 38 is a flow chart showing the process flow of extracting a target point in step S 51 of FIG. 35 .
  • FIG. 39 is a flow chart showing the process flow of calculating the coordinates of a target point in step S 85 of FIG. 38 .
  • FIG. 40 is a flow chart showing the game process flow in step S 3 of FIG. 29 .
  • FIG. 41 is a flow chart showing the interrupt process in accordance with the present embodiment.
  • FIG. 42 is a flow chart showing the process flow of the playback of music in step S 150 of FIG. 41 .
  • FIG. 43 is a flow chart showing the process flow of registering guides in step S 151 of FIG. 41 .
  • FIG. 44 is a view for showing another example of the direction guide in accordance with the present embodiment.
  • FIG. 45 is a view for showing a further example of the direction guide in accordance with the present embodiment.
  • FIG. 46 is a view for showing a still further example of the direction guide in accordance with the present embodiment.
  • FIG. 1 is a view showing the overall configuration of the music game system in accordance with the embodiment of the present invention. As shown in FIG. 1 , this music game system includes a music game apparatus 1 , an operation article 150 and a television monitor 90 .
  • the housing 19 of the music game apparatus 1 includes an imaging unit 13 therein.
  • the imaging unit 13 includes four infrared light emitting diodes 15 and an infrared filter 17 .
  • the light emission units of the infrared light emitting diodes 15 are exposed from the infrared filter 17 .
  • the music game apparatus 1 is supplied with a DC power voltage from an AC adapter 92 .
  • a battery cell (not shown in the figure) can be used to apply the DC power voltage in place of the AC adaptor 92 .
  • the television monitor 90 includes a screen 91 at the front side thereof.
  • the television monitor 90 and the music game apparatus 1 are connected by an AV cable 93 .
  • the music game apparatus 1 is placed for example on the upper surface of the television monitor 90 .
  • When the player 94 turns on the power switch (not shown in the figure) which is provided on the back side of the music game apparatus 1 , a game screen is displayed on the screen 91 .
  • the player 94 manipulates the operation article 150 in accordance with the guidance of a game screen to run a game.
  • the “manipulation” of the operation article 150 means moving the operation article itself (for example, changing the position thereof), but does not mean pressing a switch, moving an analog stick, and so forth.
  • the infrared light emitting diodes 15 of the imaging unit 13 intermittently emit infrared light.
  • the infrared light emitted from the infrared light emitting diodes 15 is reflected by the reflection sheet (to be described below) attached to this operation article 150 , and input to the imaging device (to be described below) located inside the infrared filter 17 .
  • the image of the operation article 150 is intermittently captured.
  • the music game apparatus 1 can intermittently acquire an image signal of the operation article 150 which is moved by the player 94 .
  • the music game apparatus 1 analyzes the image signals and reflects the analysis result in the game.
  • the reflection sheet which is used in the present embodiment is for example a retroreflective sheet.
  • FIG. 2 is a perspective view for showing the operation article 150 of FIG. 1 .
  • the operation article 150 comprises the reflection ball 151 fixed to the tip of a stick 152 .
  • the infrared light from the infrared light emitting diodes 15 is reflected by this reflection ball 151 .
  • the details of the reflection ball 151 will be explained.
  • FIG. 3 ( a ) is a top view showing the reflection ball 151 of FIG. 2
  • FIG. 3 ( b ) is a side view for showing the reflection ball 151 as seen from arrow A of FIG. 3 ( a )
  • FIG. 3 ( c ) is a side view for showing the reflection ball 151 as seen from arrow B of FIG. 3 ( a ).
  • the reflection ball 151 comprises a spherical inner shell 154 which is fixedly located inside a spherical outer shell 153 of a transparent color (inclusive of a semi-transparent, a colored-transparent and colorless transparent).
  • the spherical inner shell 154 is provided with a reflection sheet 155 attached thereto. This reflection sheet 155 serves to reflect infrared light from the infrared light emitting diodes 15 .
  • FIG. 4 is a longitudinal section view taken through the reflection ball 151 of FIG. 2 .
  • the spherical outer shell 153 comprises two semispherical outer shells which are fixed together with bosses 156 and screws (not shown in the figure).
  • the spherical inner shell 154 comprises two semispherical inner shells which are fixed inside the spherical outer shell 153 with bosses 157 .
  • the stick 152 is fixed to the reflection ball 151 by inserting it thereinto.
  • the stick 152 is fixed to the reflection ball 151 by placing the stick 152 between the two semispherical outer shells forming the spherical outer shell 153 and the two semispherical inner shells forming the spherical inner shell 154 , fixing together the two semispherical outer shells with the bosses 156 and the screws, and fixing together the two semispherical inner shells with the bosses 157 .
  • FIG. 5 is an explanatory schematic diagram for showing one example of the imaging unit 13 of FIG. 1 .
  • the imaging unit 13 includes a unit base 35 which is molded for example from a plastic material, and a supporting cylinder 36 is attached to the inside of this unit base 35 .
  • the supporting cylinder 36 is provided with a horn-shaped opening 41 formed in its upper surface, with an inner surface shaped in the form of an inverted cone, and with an optical system located in a cylindrical portion below the opening 41 and including a concave lens 39 and a convex lens 37 , each of which is molded for example from a plastic material; an image sensor 43 as an imaging device is fixed below the convex lens 37 . Accordingly, the image sensor 43 can capture an image in accordance with light which comes in through the opening 41 via the lenses 39 and 37 .
  • the image sensor 43 is a low resolution CMOS image sensor (for example, 32 pixels × 32 pixels, gray scale). However, this image sensor 43 may be an image sensor having a larger number of pixels, a CCD, or a similar device. In the following explanation, it is assumed that the image sensor 43 comprises 32 pixels × 32 pixels.
  • the infrared light emitting diodes 15 are attached to the unit base 35 in such a manner that their light output directions are set to the upward direction. Infrared light is emitted to an area over the imaging unit 13 by these infrared light emitting diodes 15 .
  • the infrared filter 17 (a filter capable of passing only infrared light therethrough) is attached to the upper portion of the unit base 35 so as to cover the above opening 41 . Then, the infrared light emitting diodes 15 are repeatedly turned on (lighted) and off (non-lighted) in a continuous manner, as will be described below, so that they serve as a stroboscope.
  • the “stroboscope” is a generic term used to refer to a device serving to intermittently irradiate a moving object. Accordingly, the above image sensor 43 serves to capture an image of an object, which is moving in the scope of imaging, i.e., the operation article 150 in the case of the embodiment.
  • the stroboscope is composed mainly of the infrared light emitting diodes 15 , an LED drive circuit 75 and a high speed processor 200 .
  • the imaging unit 13 is incorporated in the housing 19 in order that the light receiving surface of the image sensor 43 is inclined from the horizontal surface at a predetermined angle (for example, 90 degrees). Also, the scope of imaging of the image sensor 43 is for example within 60 degrees as determined by the concave lens 39 and the convex lens 37 .
  • FIG. 6 is a view showing the electric configuration of the music game apparatus 1 of FIG. 1 .
  • the music game apparatus 1 includes the image sensor 43 , the infrared light emitting diodes 15 , a video signal output terminal 47 , an audio signal output terminal 49 , the high speed processor 200 , a ROM (read only memory) 51 , and a bus 53 .
  • the high speed processor 200 is connected to the bus 53 . Furthermore, the ROM 51 is connected to the bus 53 . Accordingly, the high speed processor 200 can access the ROM 51 through the bus 53 to read and execute a game program as stored in the ROM 51 , and read and process image data and music data as stored in the ROM 51 in order to generate a video signal and an audio signal, which are then output through the video signal output terminal 47 and the audio signal output terminal 49 respectively.
  • the operation article 150 is irradiated with infrared light emitted from the infrared light emitting diodes 15 , and reflects the infrared light by the reflection sheet 155 .
  • the image sensor 43 detects the reflected light from this retroreflective sheet 155 , and outputs an image signal which includes an image of the retroreflective sheet 155 .
  • the analog image signal output from the image sensor 43 is converted into digital data by an A/D converter (to be described below) incorporated in the high speed processor 200 . This process is performed also in the periods without infrared light.
  • the high speed processor 200 analyzes this digital data, and reflects the analysis result in the game processing.
  • FIG. 7 is a block diagram of the high speed processor 200 of FIG. 6 .
  • this high speed processor 200 includes a central processing unit (CPU) 201 , a graphics processor 202 , a sound processor 203 , a DMA (direct memory access) controller 204 , a first bus arbiter circuit 205 , a second bus arbiter circuit 206 , an internal memory 207 , an A/D converter (ADC: analog to digital converter) 208 , an input/output control circuit 209 , a timer circuit 210 , a DRAM (dynamic random access memory) refresh control circuit 211 , an external memory interface circuit 212 , a clock driver 213 , a PLL (phase-locked loop) circuit 214 , a low voltage detection circuit 215 , a first bus 218 , and a second bus 219 .
  • the CPU 201 takes control of the entire system and performs various types of arithmetic operations in accordance with the program stored in the memory (the internal memory 207 , or the ROM 51 ).
  • the CPU 201 is a bus master of the first bus 218 and the second bus 219 , and can access the resources connected to the respective buses.
  • the graphics processor 202 is also a bus master of the first bus 218 and the second bus 219 , and generates a video signal on the basis of the data as stored in the internal memory 207 or the ROM 51 , and outputs the video signal through the video signal output terminal 47 .
  • the graphics processor 202 is controlled by the CPU 201 through the first bus 218 . Also, the graphics processor 202 has the functionality of outputting an interrupt request signal 220 to the CPU 201 .
  • the sound processor 203 is also a bus master of the first bus 218 and the second bus 219 , and generates an audio signal on the basis of the data as stored in the internal memory 207 or the ROM 51 , and outputs the audio signal through the audio signal output terminal 49 .
  • the sound processor 203 is controlled by the CPU 201 through the first bus 218 . Also, the sound processor 203 has the functionality of outputting an interrupt request signal 220 to the CPU 201 .
  • the DMA controller 204 serves to transfer data from the ROM 51 to the internal memory 207 . Also, the DMA controller 204 has the functionality of outputting, to the CPU 201 , an interrupt request signal 220 indicative of the completion of the data transfer.
  • the DMA controller 204 is also a bus master of the first bus 218 and the second bus 219 . The DMA controller 204 is controlled by the CPU 201 through the first bus 218 .
  • the internal memory 207 may be implemented with one or any necessary combination of a mask ROM, an SRAM (static random access memory) and a DRAM in accordance with the system requirements.
  • a battery 217 is provided if an SRAM has to be powered by the battery for maintaining the data contained therein. In the case where a DRAM is used, a so-called refresh cycle is periodically performed to maintain the data contained therein.
  • the first bus arbiter circuit 205 accepts a first bus use request signal from the respective bus masters of the first bus 218 , performs bus arbitration among the requests for the first bus 218 , and issues a first bus use permission signal to one of the respective bus masters. Each bus master is permitted to access the first bus 218 after receiving the first bus use permission signal.
  • the first bus use request signal and the first bus use permission signal are shown as the first bus arbitration signals 222 in FIG. 7 .
  • the second bus arbiter circuit 206 accepts a second bus use request signal from the respective bus masters of the second bus 219 , performs bus arbitration among the requests for the second bus 219 , and issues a second bus use permission signal to one of the respective bus masters. Each bus master is permitted to access the second bus 219 after receiving the second bus use permission signal.
  • the second bus use request signal and the second bus use permission signal are shown as the second bus arbitration signals 223 in FIG. 7 .
  • the input/output control circuit 209 serves to perform the communication with an external input/output device(s) and/or an external semiconductor device(s) through input/output signals.
  • the read and write operations of the input/output signals are performed by the CPU 201 through the first bus 218 .
  • the input/output control circuit 209 has the functionality of outputting an interrupt request signal 220 to the CPU 201 .
  • This input/output control circuit 209 outputs an LED control signal LEDC for controlling the infrared light emitting diodes 15 .
  • the timer circuit 210 has the functionality of periodically outputting an interrupt request signal 220 to the CPU 201 on the basis of a time interval as preset.
  • the settings such as the time interval are performed by the CPU 201 through the first bus 218 .
  • the ADC 208 converts analog input signals into digital signals.
  • the digital signals are read by the CPU 201 through the first bus 218 .
  • the ADC 208 has the functionality of outputting an interrupt request signal 220 to the CPU 201 .
  • This ADC 208 receives pixel data (analog) from the image sensor 43 and converts it into digital data.
  • the PLL circuit 214 generates a high frequency clock signal by multiplication of the sinusoidal signal as obtained from a quartz oscillator 216 .
  • the clock driver 213 amplifies the high frequency clock signal as received from the PLL circuit 214 to a sufficient signal level to supply the respective blocks with the clock signal 225 .
  • the low voltage detection circuit 215 monitors the power potential Vcc and issues the reset signal 226 to the PLL circuit 214 and the reset signal 227 to the other circuit elements of the entire system when the power potential Vcc falls below a certain voltage. Also, in the case where the internal memory 207 is implemented with an SRAM requiring the power supply from the battery 217 for maintaining data, the low voltage detection circuit 215 serves to issue a battery backup control signal 224 when the power potential Vcc falls below the certain voltage.
  • the external memory interface circuit 212 has the functionality of connecting the second bus 219 to the external bus 53 and the functionality of controlling the bus cycle length of the second bus by issuing a cycle end signal 228 .
  • the DRAM refresh cycle control circuit 211 periodically and unconditionally gets the ownership of the first bus 218 to perform the refresh cycle of the DRAM at a certain interval. Needless to say, the DRAM refresh cycle control circuit 211 is provided in the case where the internal memory 207 includes a DRAM.
  • FIG. 8 is a circuit diagram for showing the LED drive circuit and the configuration of transferring pixel data from the image sensor 43 of FIG. 6 to the high speed processor 200 .
  • FIG. 9 is a timing diagram showing the operation of the high speed processor 200 which receives pixel data from the image sensor 43 of FIG. 6 .
  • FIG. 10 is an expanded timing diagram showing part of FIG. 9 .
  • the image sensor 43 is a sensor which outputs pixel data D (X, Y) as an analog signal
  • this pixel data D (X, Y) is input to an analog input port of the high speed processor 200 .
  • the analog input port is connected to the ADC 208 of this high speed processor 200 , and therefore the high speed processor 200 acquires therein pixel data converted into digital data from the ADC 208 .
  • the middle point of the analog pixel data D (X, Y) as described above is determined by a reference voltage given to a reference voltage terminal Vref of the image sensor 43 .
  • a reference voltage generation circuit 59 made of a resistance voltage divider is provided in order to supply a reference voltage which is always kept at a certain level to the reference voltage terminal Vref.
  • the respective digital signals for controlling the image sensor 43 are input to and output from the high speed processor 200 through the I/O ports thereof.
  • These I/O ports are digital ports capable of controlling input and output operations and connected to the input/output control circuit 209 inside of this high speed processor 200 .
  • a reset signal “reset” is output to the image sensor 43 from the I/O port of the high speed processor 200 for resetting the image sensor 43 .
  • a pixel data strobe signal PDS and a frame status flag signal FSF are output from the image sensor 43 , and supplied to the input ports of the high speed processor 200 .
  • the pixel data strobe signal PDS is a strobe signal as shown in FIG. 9 ( b ) which is used to read the pixel signal D (X, Y) as described above.
  • the frame status flag signal FSF is a flag signal which indicates the state of the image sensor 43 and is used for defining the exposure period of this image sensor 43 as illustrated in FIG. 9 ( a ). In other words, the exposure period is defined by the low level period of the frame status flag signal FSF, while the non-exposure period is defined by its high level period.
  • the high speed processor 200 outputs, from the I/O ports, a command (or command associated with data) to be set in a control register (not shown in the figure) of the image sensor 43 , outputs a register setting clock CLK which periodically and alternately takes high and low levels, and supplies the register setting clock CLK to the image sensor 43 .
  • the infrared light emitting diodes 15 as used are four infrared light emitting diodes 15 a , 15 b , 15 c and 15 d which are connected in parallel with each other as illustrated in FIG. 8 .
  • These four infrared light emitting diodes 15 a to 15 d are arranged around the image sensor 43 , as explained above, in order to irradiate the operation article 150 with infrared light emitted in the same direction as the viewpoint of the image sensor 43 is directed.
  • the individual infrared light emitting diodes 15 a to 15 d are referred to simply as the infrared light emitting diodes 15 unless it is necessary to distinguish them.
  • These infrared light emitting diodes 15 are turned on and off (lighted and non-lighted) by the LED drive circuit 75 .
  • the LED drive circuit 75 receives the frame status flag signal FSF as described above from the image sensor 43 , and this frame status flag signal FSF is passed through a differentiating circuit 67 , which is made up of a resistor 69 and a capacitor 71 , and given to the base of the PNP transistor 77 .
  • the base of this PNP transistor 77 is connected further to a pull-up resistor 79 which usually pulls up the base of the PNP transistor 77 to a high level. Then, when the frame status flag signal FSF is pulled down to a low level, the low level signal is input to the base through the differentiating circuit 67 so that the PNP transistor 77 is turned on only for the low level period of the frame status flag signal FSF.
  • the emitter of the PNP transistor 77 is grounded through resistors 73 and 65 .
  • the connecting point between the emitter resistors 73 and 65 is connected to the base of an NPN transistor 81 .
  • the collector of this NPN transistor 81 is connected commonly to the anodes of the respective infrared light emitting diodes 15 a to 15 d .
  • the emitter of the NPN transistor 81 is connected directly to the base of another NPN transistor 61 .
  • the collector of the NPN transistor 61 is connected commonly to the cathodes of the respective infrared light emitting diodes 15 a to 15 d , while the emitter of the NPN transistor 61 is grounded.
  • This LED drive circuit 75 turns on the infrared light emitting diodes 15 a to 15 d only within the period when the LED control signal LEDC which is output from the I/O port of the high speed processor 200 is activated (in a high level) while the frame status flag signal FSF which is output from the image sensor 43 is in a low level.
  • When the frame status flag signal FSF is pulled down to the low level as shown in FIG. 9 ( a ), the PNP transistor 77 is turned on for the low level period (in practice, there is a delay time corresponding to the time constant of the differentiating circuit 67 ). Accordingly, when the LED control signal LEDC shown in FIG. 9 ( d ) is output from the high speed processor 200 as a high level signal, the base of the NPN transistor 81 is pulled up to a high level and turned on. When the transistor 81 is turned on, the transistor 61 is also turned on. Accordingly, a current flows from the power supply (indicated by a small open circle in FIG. 8 ) through the infrared light emitting diodes 15 a to 15 d and the transistor 61 , so that the infrared light emitting diodes 15 a to 15 d are lighted.
  • the LED drive circuit 75 turns on the infrared light emitting diodes 15 only in the period when the LED control signal LEDC is activated as shown in FIG. 9 ( d ) while the frame status flag signal FSF is in a low level as shown in FIG. 9 ( a ); therefore, the infrared light emitting diodes 15 are turned on only in the exposure period (refer to FIG. 9 ( f )) of the image sensor 43 .
  • since the frame status flag signal FSF is given to the base through the coupling capacitor 71 , the transistor 77 is necessarily turned off after a certain period even when the flag signal FSF is fixed at a low level due to a runaway of the image sensor 43 or the like, so that the infrared light emitting diodes 15 are also necessarily turned off after the certain period.
  • the lighting period, non-lighting period, cycle of lighting/non-lighting, and so forth of the infrared light emitting diodes 15 , i.e., of the stroboscope, can be freely set and changed by adjusting the mark durations and the frequencies of the frame status flag signal FSF and the LED control signal LEDC.
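  • A toy model of the gating just described (the signal sequence below is illustrative, not taken from the patent): the diodes are lighted only while LEDC is active and FSF is at its low level, so activating LEDC on alternate exposures yields the alternating lighted and unlighted images used for the differential signal.

```python
def led_on(ledc_active: bool, fsf_low: bool) -> bool:
    """LED drive condition: LEDC active (high) AND FSF in its low level."""
    return ledc_active and fsf_low

# Two exposure cycles: LEDC is activated on the first exposure only,
# producing one lighted and one unlighted image.
for cycle, ledc in enumerate([True, False]):
    for fsf_low in (True, False):      # exposure period, then readout period
        phase = "exposure" if fsf_low else "readout"
        state = "on" if led_on(ledc, fsf_low) else "off"
        print(f"cycle {cycle}: {phase:8s} -> infrared LEDs {state}")
```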
  • the image sensor 43 is exposed to the light reflected from the operation article 150 .
  • the above pixel data D (X, Y) is output from the image sensor 43 . More specifically speaking, during the period in which the frame status flag signal FSF as shown in FIG. 9 ( a ) is in a high level (the infrared light emitting diodes 15 are not turned on), the image sensor 43 outputs analog pixel data D (X, Y) as shown in FIG. 9 ( c ) in synchronism with the pixel data strobe signal PDS as shown in FIG. 9 ( b ).
  • the high speed processor 200 acquires digital pixel data through the ADC 208 while monitoring the frame status flag signal FSF and the pixel data strobe signal PDS.
  • the pixel data is sequentially output as the zeroth line, the first line, . . . and the thirty-first line as illustrated in FIG. 10 ( c ).
  • the first pixel of each line is associated with dummy data.
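  • A sketch of how such a pixel stream might be consumed, with hypothetical read_adc()/wait_pds() primitives standing in for the real ADC 208 and strobe-synchronized I/O; per the description above, the first pixel of each line carries dummy data and is discarded.

```python
def acquire_frame(read_adc, wait_pds, width=32, height=32):
    """Read one frame, line by line, in sync with the pixel data strobe PDS."""
    frame = []
    for _ in range(height):
        line = []
        for x in range(width):
            wait_pds()              # wait for the pixel data strobe PDS
            value = read_adc()      # digitized pixel data D(X, Y) from ADC 208
            if x == 0:
                continue            # first pixel of each line is dummy data
            line.append(value)
        frame.append(line)
    return frame
```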
  • FIG. 11 is a view for showing an example of the game screen as displayed on the screen 91 of the television monitor 90 of FIG. 1 .
  • the game screen shown in FIG. 11 is a game start screen.
  • the game start screen displayed on the screen 91 includes a background 110 , position guides “G 1 ” to “G 4 ”, evaluation objects 107 to 109 , a cursor 105 , a dance object 106 , and masks 141 and 142 . Then, automatic playing of music is started.
  • the position guides “G 1 ” to “G 4 ” are displayed in the form of blooms
  • the evaluation objects 107 to 109 are displayed in the form of heart-shaped objects
  • the dance object 106 is displayed in the form of a male-female pair
  • the cursor 105 is displayed in the form of the operation article 150 .
  • the term “position guides G” is used to generally represent the position guides “G 1 ” to “G 4 ”.
  • the cursor 105 serves to indicate the position of the operation article 150 on the screen 91 , and moves on the screen 91 to follow the motion of the operation article 150 . Accordingly, as seen from the player 94 , the manipulation of the operation article 150 is equivalent to the manipulation of the cursor 105 .
  • the position guide “G” serves to guide the manipulation timing and destination position of the cursor 105 (the operation article 150 ) in terms of the timings relative to the music which is automatically played.
  • Direction guides “g 1 ” to “g 5 ”, which will be described below, serve to guide the manipulation timing and moving direction of the cursor 105 (the operation article 150 ) in terms of the timings relative to the music which is automatically played.
  • the evaluation objects 107 to 109 serve to indicate the evaluation of the manipulation of the cursor 105 (the operation article 150 ) by the player 94 in a visual way.
  • the term “direction guides g” is used to generally represent the direction guides “g 1 ” to “g 5 ”.
  • the term “path guides rg” is used to generally represent the path guides “rg 1 ” to “rg 10 ”.
  • FIG. 12 is a view for showing another example of the game screen as displayed on the screen 91 of the television monitor 90 of FIG. 1 .
  • the animation of the position guide “G”, in which a bloom gradually opens, indicates the position to which the cursor 105 is to be moved.
  • the player 94 is instructed to move the cursor 105 to the area in which the animation of the position guide “G 1 ”, in which a bloom is opening, is displayed.
  • the player 94 moves the operation article 150 in order to move the cursor 105 to the area in which the animation of the position guide “G”, in which a bloom is opening, is displayed.
  • at other times, the position guide “G” is displayed with the animation in which the bloom is closed.
  • the direction in which the cursor 105 is to be moved is indicated by the direction toward the animation of the position guide “G” in which a bloom is opening.
  • the player 94 is instructed to move the cursor 105 in the direction toward the animation of the position guide “G” in which a bloom is opening.
  • the direction in which the cursor 105 is to be moved is guided also by the direction guides “g 1 ” to “g 5 ”.
  • the direction guides “g 1 ” to “g 5 ” sequentially appear in the order that the direction guide “g 1 ” appears first, the direction guide “g 2 ” appears second, the direction guide “g 3 ” appears third, the direction guide “g 4 ” appears fourth, and then the direction guide “g 5 ” appears fifth. Accordingly, the direction in which the cursor 105 is to be moved is guided by the direction in which the direction guides “g 1 ” to “g 5 ” appear in sequence.
  • each of the direction guides “g 1 ” to “g 5 ” is displayed as a graphic form representing a small sphere just after it appears; the sphere gradually increases in size as time passes, and when the size is maximized an animation is performed as if the sphere shatters into fragments. Accordingly, the direction toward the graphic form of the sphere which appears is the direction in which the cursor 105 is to be moved.
  • the player 94 has to move the cursor 105 to the area in which the position guide “G 1 ” is displayed within a predetermined period in which the bloom serving as the position guide “G” is opened.
  • the position guide “G” serves to guide the manipulation timing of the cursor 105 by the animation that the bloom is opened.
  • the player 94 has to move the cursor 105 to the area in which the position guide “G” is displayed as an opening bloom within a predetermined period after the last direction guide “g” appears as the graphic form of the sphere.
  • the manipulation timing of the cursor 105 is guided also by the direction guide “g”.
  • the position guide “G” serves also to indicate in advance the manipulation direction of the cursor 105 . That is to say, if the bud of the bloom serving as the position guide “G” is coming to open, it enables the player 94 to know the direction in which the cursor 105 is to be moved next.
  • the direction guide “g” serves also to indicate in advance the manipulation direction of the cursor 105 . Namely, since the direction guide “g” appears in advance of the manipulation timing, the player 94 can know the direction in which the cursor 105 is to be moved next also by the direction guide “g”.
  • the position to which the cursor 105 is to be moved is indicated by the animation of the position guide “G 2 ” in which a bloom gradually opens.
  • the player 94 is instructed to move the cursor 105 to the area in which the animation of the position guide “G 2 ”, in which the bloom is opening, is displayed.
  • the direction toward the animation of the position guide “G 2 ” in which the bloom is opening is the direction in which the cursor 105 is to be moved.
  • the player 94 is instructed to move the cursor 105 in the direction toward the animation of the position guide “G 2 ” in which the bloom is opening.
  • the graphic forms of the spheres serving as the direction guides “g 1 ” to “g 5 ” successively appear from the position guide “G 1 ” toward the position guide “G 2 ”.
  • the motion of the cursor 105 is guided from the position guide “G 1 ” to the position guide “G 2 ”.
  • the player 94 has to move the cursor 105 to the area in which the position guide “G 2 ” is displayed within a predetermined period in which the bloom serving as the position guide “G 2 ” is opened. Also, the player 94 has to move the cursor 105 to the area in which the position guide “G 2 ” is displayed as an opening bloom within a predetermined period after the last direction guide “g 5 ” appears as the graphic form of the sphere. In other words, the manipulation timing of the cursor 105 is guided also by the direction guide “g”.
  • the player 94 appropriately manipulates the operation article 150 in accordance with the instruction by the position guide “G 2 ” and the direction guides “g 1 ” to “g 5 ” in order to move the cursor 105 from the position of the position guide “G 1 ” to the position of the position guide “G 2 ”.
  • animation is performed such that all the evaluation objects 107 to 109 are flashing.
  • if the cursor 105 is manipulated at the most appropriate timing, animation is performed such that all the evaluation objects 107 to 109 flash, and if the cursor 105 is manipulated at a timing which is not the most appropriate but within an acceptable range, animation is performed such that only the evaluation object 108 flashes.
  • each of the position guides “G 1 ”, “G 3 ” and “G 4 ” is displayed in the form of the bud of the bloom because the current time is out of the time slot for guiding the manipulation timing and destination position of the cursor 105 .
  • the direction guide “g” does not appear between the position guide “G 2 ” and the position guide “G 4 ”, between the position guide “G 4 ” and the position guide “G 3 ” and between the position guide “G 3 ” and the position guide “G 1 ”, because the current time is out of the time slot for guiding the manipulation timing and destination position of the cursor 105 .
  • the animation of dance is performed in the direction corresponding to the moving direction of the cursor 105 (the direction from the position guide “G 1 ” to the position guide “G 2 ”, i.e., the right direction as seen toward the screen 91 ).
  • the animation of the dance object 106 turning in the counter-clockwise direction is performed, while the background 110 is scrolled in the left direction as seen toward the screen 91 .
  • FIG. 13 is a view for showing a further example of the game screen as displayed on the screen 91 of the television monitor 90 of FIG. 1 .
  • animation is performed such that the position guides “G 1 ” to “G 4 ” open as blooms at the same time.
  • the player 94 is guided to move the cursor 105 in the direction and along the path in accordance with the path guides “rg 1 ” to “rg 10 ”.
  • the appearance positions of the path guides “rg 1 ” to “rg 10 ” indicate the guide path of the cursor 105 .
  • the path guides “rg 1 ” to “rg 10 ” appear in the order that the path guide “rg 1 ” appears first, the path guide “rg 2 ” appears second, the path guide “rg 3 ” appears third, the path guide “rg 4 ” appears fourth, . . . , and the path guide “rg 10 ” appears last. Accordingly, the direction in which the cursor 105 is to be moved is guided by the direction in which the path guides “rg 1 ” to “rg 10 ” appear in sequence.
  • each of the path guides “rg 1 ” to “rg 10 ” is displayed as a graphic form representing a small sphere just after it appears, the sphere gradually increases in size as time passes, and when the size is maximized an animation is performed as if the sphere shatters into fragments.
  • in FIG. 13 , it is indicated that the cursor 105 is to be moved in the counter clockwise direction from a start point in the vicinity of the position guide “G 3 ” along the path guides “rg 1 ” to “rg 10 ”.
  • the animation of the dance object 106 (for example, an animation of turning widely in the counter clockwise direction) is performed in correspondence with the path guides “rg 1 ” to “rg 10 ”.
  • each of the objects illustrated in FIG. 12 and FIG. 13 , such as the dance object 106 , is an image corresponding to a certain picture of an animation.
  • a series of the dance objects 106 are prepared for dance animation.
  • a series of object images in the graphic forms of blooms are prepared for the animation of the position guide “G”.
  • a series of object images in the graphic forms of spheres are prepared for the animation of the direction guide “g” and the path guide “rg”.
  • each of the dance object 106 , the position guide “G”, the evaluation objects 107 to 109 , the cursor 105 , the direction guide “g” and the path guide “rg” in the game screens as illustrated in FIG. 11 and FIG. 13 is composed of a single or a plurality of sprites.
  • a sprite comprises a rectangular pixel set and can be arranged in an arbitrary position of the screen 91 .
  • a generic term “object” is sometimes used to generally refer to the position guide “G”, the evaluation objects 107 to 109 , the cursor 105 , the direction guide “g” and the path guide “rg”.
  • FIG. 14 is a view for explaining the sprites forming an object which is displayed on the screen 91 .
  • the dance object 106 of FIG. 11 is composed, for example, of 12 sprites SP 0 to SP 11 .
  • Each of the sprites SP 0 to SP 11 consists, for example, of 16 pixels × 16 pixels.
  • the coordinates at which the center of the upper left corner sprite SP 0 is to be located are designated.
  • the coordinates at which the centers of the respective sprites SP 1 to SP 11 are to be located are calculated on the basis of the designated coordinates and the size of the sprites SP 0 to SP 11 , as illustrated in the sketch below.
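For illustration, the center-coordinate derivation can be written out in C. This sketch assumes the 12 sprites of FIG. 14 are laid out as a 4 × 3 grid of 16 × 16-pixel sprites whose centers are spaced exactly one sprite apart; the grid shape and the function names are assumptions, not details taken from the patent.

```c
#include <stdio.h>

#define SPR_W   16  /* sprite width in pixels  */
#define SPR_H   16  /* sprite height in pixels */
#define COLS     4  /* assumed sprites per row of the dance object 106 */
#define ROWS     3  /* assumed sprite rows of the dance object 106     */

/* Given the designated center (x0, y0) of the upper left sprite SP0,
   derive the centers of SP0..SP11 from the sprite size alone. */
void sprite_centers(int x0, int y0, int cx[ROWS * COLS], int cy[ROWS * COLS])
{
    for (int n = 0; n < ROWS * COLS; n++) {
        cx[n] = x0 + (n % COLS) * SPR_W;  /* shift right per column */
        cy[n] = y0 + (n / COLS) * SPR_H;  /* shift down per row     */
    }
}

int main(void)
{
    int cx[ROWS * COLS], cy[ROWS * COLS];
    sprite_centers(100, 80, cx, cy);
    for (int n = 0; n < ROWS * COLS; n++)
        printf("SP%-2d -> (%d, %d)\n", n, cx[n], cy[n]);
    return 0;
}
```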
  • FIG. 15 is an explanatory view for showing the background screen to be displayed on the screen 91 of the television monitor 90 of FIG. 1 .
  • the background screen 140 is composed, for example, of 32 × 32 blocks “ 0 ” to “ 1023 ”.
  • Each of the block “ 0 ” to the block “ 1023 ” is composed, for example, of a rectangular element comprising 8 pixels × 8 pixels.
  • An array element PA[ 0 ] to an array element PA[ 1023 ] and an array element CA[ 0 ] to an array element CA[ 1023 ] are prepared in correspondence respectively with the block “ 0 ” to the block “ 1023 ”.
  • the block “ 0 ” to the block “ 1023 ” are generally referred to, they are referred to simply as the “block”; in the case where the array element PA[ 0 ] to the array element PA[ 1023 ] are generally referred to, they are referred to as the “array element PA”; and in the case where the array element CA[ 0 ] to the array element CA[ 1023 ] are generally referred to, they are referred to as the “array element CA”.
  • data (pixel pattern data) for designating the pixel pattern of the corresponding block is assigned to the array element PA.
  • This pixel pattern data consists of the color information of the respective pixels of the 8 pixels × 8 pixels for making up a block.
  • the information for designating the color palette and the depth value for use in the corresponding block is assigned to the array element CA.
  • a color palette consists of a predetermined number of color information entries. The depth value indicates the depth position of the pixels, and if a plurality of pixels overlap each other in the same position, only the pixel having the largest depth value is displayed.
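A minimal C sketch of these per-block attributes and of the depth rule follows. The bit-level layout of the array elements PA and CA is not given here, so the representation below is an assumption made only for illustration.

```c
#include <stdint.h>

#define NBLOCKS 1024                 /* blocks "0" to "1023" (32 x 32) */

/* Assumed representation of one array element CA: the color palette
   used by the block and the depth value of its pixels. */
typedef struct {
    uint8_t palette;                 /* color palette number            */
    uint8_t depth;                   /* depth value (larger = in front) */
} BlockAttr;

const uint8_t *PA[NBLOCKS];          /* pixel pattern data per block    */
BlockAttr      CA[NBLOCKS];          /* palette/depth per block         */

/* Depth rule: of two pixels falling on the same screen position,
   the one with the larger depth value is the one displayed. */
uint8_t resolve_pixel(uint8_t color_a, uint8_t depth_a,
                      uint8_t color_b, uint8_t depth_b)
{
    return (depth_a >= depth_b) ? color_a : color_b;
}
```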
  • FIG. 16 ( a ) is an explanatory view for showing the background screen 140 in advance of scrolling it
  • FIG. 16 ( b ) is an explanatory view for showing the background screen 140 after scrolling it.
  • the size of the screen 91 of the television monitor 90 is 256 pixels × 224 pixels
  • an area of 256 pixels × 224 pixels in the background screen 140 is displayed on the screen 91 . It is considered here that the background screen 140 is scrolled to shift the center position thereof to the left by “k” pixels.
  • the width of the background screen 140 in the lateral direction (the horizontal direction) is equal to the width of the screen 91 in the lateral direction
  • the portion thereof (hatched portion) scrolled out of the screen 91 is displayed at the right edge as illustrated in FIG. 16 ( b ).
  • the same background screen 140 is repeatedly arranged in the lateral direction.
  • the image displayed near the right edge of the screen 91 is defined by the array elements PA[ 64 ], . . . , and PA[ 928 ] and the array elements CA[ 64 ], . . . , and CA[ 928 ] corresponding to these blocks.
  • the mask 141 is provided at the left edge of the screen 91 in order to avoid such shortcomings. For the same reason, there is the mask 142 provided at the right edge.
  • the scroll process in the rightward direction is performed in the same manner as the scroll process in the leftward direction. Also, in the case of the present embodiment, since the range of scrolling is limited within ±16 pixels in the longitudinal direction (vertical direction) of the background screen 140 , there is no mask at the top and bottom edges of the screen 91 .
  • the background 110 is scrolled by scrolling the background screen 140 .
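The wraparound arithmetic implied by FIG. 16 ( a ) and FIG. 16 ( b ) can be sketched in C: because the same background screen 140 repeats laterally, a leftward scroll of “k” pixels simply rotates the block-column index modulo the screen width in blocks. The function name and the block-granularity simplification are assumptions made for this illustration.

```c
#include <stdio.h>

#define BLOCK_PX   8   /* a block is 8 pixels x 8 pixels               */
#define BLK_COLS  32   /* the background screen 140 is 32 blocks wide  */

/* For a leftward scroll of k pixels, return the number of the block
   that supplies column `col` of the visible area.  Since the same
   background screen is repeated laterally, the column index simply
   wraps around modulo the screen width in blocks. */
int visible_block(int row, int col, int k)
{
    int shifted = (col + k / BLOCK_PX) % BLK_COLS;  /* wraparound   */
    return row * BLK_COLS + shifted;                /* block number */
}

int main(void)
{
    /* with k = 16, the leftmost visible column of row 2 is block 66 */
    printf("%d\n", visible_block(2, 0, 16));
    return 0;
}
```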
  • FIG. 17 is a schematic representation of a program and data stored in the ROM 51 of FIG. 6 .
  • the ROM 51 is used to store a game program 300 , image data 302 and music data 304 .
  • the image data 302 includes object image data (inclusive of image data such as the position guide “G”, the direction guide “g”, the path guide “rg”, the evaluation objects 107 to 109 and the cursor 105 ) and background image data.
  • the music data 304 includes first musical score data 305 , second musical score data 306 and sound source data (wave data) 307 .
  • the first musical score data 305 shown in FIG. 17 is the data in which music control information is arranged in a time series.
  • FIG. 18 is a schematic representation of one example of the first musical score data 305 of FIG. 17 .
  • the music control information contains a command, a note number/a waiting time information item, an instrument designation information item, a velocity value and a gate time.
  • “Note On” is a command to output sound
  • “Wait” is a command to set a waiting time.
  • the waiting time is the time period to elapse before reading the next command (the time period between one musical note and the next musical note).
  • the note number is information for designating the frequency of sound vibration (pitch).
  • the waiting time information item is information for designating a waiting time to be set.
  • the instrument designation information item is information for designating a musical instrument whose tone quality is to be used.
  • the velocity value is information for designating the magnitude of sound, i.e., a sound volume.
  • the gate time is information for designating a period for which a musical note is to be continuously output.
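For illustration, one item of music control information might be represented in C as follows; the field names and widths are assumptions, since the on-ROM byte layout is not specified in this part of the description.

```c
#include <stdint.h>

typedef enum { CMD_NOTE_ON, CMD_WAIT } Command;

/* One item of music control information from the first musical score
   data 305; field names and widths are illustrative only. */
typedef struct {
    Command  command;       /* "Note On" (sound) or "Wait" (delay)       */
    uint8_t  note_or_wait;  /* note number (pitch) or waiting time item  */
    uint8_t  instrument;    /* instrument (tone quality) designation     */
    uint8_t  velocity;      /* magnitude of sound, i.e. the volume       */
    uint16_t gate_time;     /* how long the musical note keeps sounding  */
} MusicControl;
```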
  • the second musical score data 306 is the data in which guide control information is arranged in a time series. This second musical score data 306 is used when the guides (the position guide “G”, the direction guide “g” and the path guide “rg”) are displayed on the screen 91 .
  • the first musical score data 305 is the musical score data for automatically playing music
  • the second musical score data 306 is the musical score data for displaying the guides in synchronization with the music.
  • FIG. 19 is a schematic representation of one example of the second musical score data 306 of FIG. 17 .
  • the guide control information contains a command, a note number/a waiting time information item, and an instrument designation information item.
  • the instrument designation information item of the second musical score data 306 is not a number indicating the instrument (tone quality) of which sound is to be output, but a number indicating that the second musical score data 306 is musical score data for displaying the guides (the position guide “G”, the direction guide “g” and the path guide “rg”).
  • “Note On” is not a command to output sound but a command to designate starting the animation of the position guide “G” or designate starting the display of the direction guide “g” and the path guide “rg”. Accordingly, the note number is not a command to designate the frequency of sound vibration (pitch) but information used to designate which of the animations of the position guides “G” is to be started and designate where the direction guide “g” and the path guide “rg” are displayed. This point will be explained in detail.
  • FIG. 20 ( a ) through FIG. 20 ( c ) are views showing the correspondence between note numbers and the directions in which the cursor 105 is guided.
  • the direction of each arrow indicates the direction in which the cursor 105 is guided
  • the start point of each arrow indicates the position of the position guide “G” which previously guided the cursor 105
  • the end point of each arrow indicates the position of the position guide “G” which currently guides the cursor 105 .
  • the note number “ 55 ” is used to direct the cursor 105 from the position guide “G 1 ” to the position guide “G 2 ”, and when the note number indicated by the musical score data pointer is “ 55 ” the position guide “G” and the direction guide “g” are displayed as illustrated in FIG. 12 .
  • the note number “ 57 ” is used to direct the cursor 105 so that it turns in the counter clockwise direction from the position guide “G 3 ” as the start point, and when the note number indicated by the musical score data pointer is “ 57 ” the position guide “G 1 ” and the path guide “rg” are displayed as illustrated in FIG. 13 .
  • the note number “ 81 ” is dummy data placed at the top of the second musical score data 306 (refer to FIG. 19 ) and is not information which is used to control the display of guidance.
  • the top positions of the first musical score data 305 and the second musical score data 306 are aligned with each other.
  • the note number “ 79 ” is data indicative of the end of music, and arranged at the end of the second musical score data 306 (refer to FIG. 19 ).
  • the note number “ 79 ” is not information which is used to control the display of guidance.
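The note-number conventions above suggest a simple dispatch, sketched here in C. The helper start_guide_display is hypothetical and stands in for the table lookup of FIG. 26 described further below.

```c
#include <stdbool.h>
#include <stdint.h>

#define NN_DUMMY 81        /* dummy data at the top of the score data */
#define NN_END   79        /* marks the end of the music              */

/* Hypothetical stand-in for the FIG. 26 table lookup that registers
   the guide animations tied to a note number. */
static void start_guide_display(uint8_t note_number) { (void)note_number; }

/* Interpret one guide-controlling note number; returns false once
   the end-of-music marker is reached. */
bool handle_guide_note(uint8_t note_number)
{
    if (note_number == NN_END)
        return false;                /* end of second musical score data 306 */
    if (note_number != NN_DUMMY)     /* dummy data controls no guidance      */
        start_guide_display(note_number);
    return true;
}
```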
  • the CPU 201 acquires digital pixel data by converting analog pixel data which is output from the image sensor 43 , and assigns it to the array element P[X][Y]. Meanwhile, it is assumed that the horizontal axis (in the lateral direction or the row direction) of the image sensor 43 is X-axis and the vertical axis (in the longitudinal direction or the column direction) is Y-axis.
  • the CPU 201 calculates the differential data between the pixel data P[X] [Y] acquired when the infrared light emitting diodes 15 are turned on and the pixel data P[X][Y] acquired when the infrared light emitting diodes 15 are turned off, and the differential data is assigned to the array element Dif[X] [Y].
  • the pixel data represents the luminance value. Accordingly, the differential data also represents the luminance value.
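A minimal C sketch of the differential computation, assuming two already-captured 32 × 32 frames (infrared LEDs lit and unlit):

```c
#include <stdint.h>

#define PIXELS 32
#define LINES  32

/* Differential data: luminance with the infrared LEDs lit minus
   luminance with them unlit.  Steady ambient light cancels out,
   leaving mainly the reflection sheet 155 of the operation article. */
void calc_differential(const uint8_t lit[PIXELS][LINES],
                       const uint8_t unlit[PIXELS][LINES],
                       int16_t Dif[PIXELS][LINES])
{
    for (int x = 0; x < PIXELS; x++)
        for (int y = 0; y < LINES; y++)
            Dif[x][y] = (int16_t)lit[x][y] - (int16_t)unlit[x][y];
}
```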
  • FIG. 21 ( a ) is a view for showing an example of the image which is captured by an ordinarily used image sensor and is not processed by any particular treatment
  • FIG. 21 ( b ) is a view for showing an example of the image which is obtained by level filtering the image signal of FIG. 21 ( a ) by a certain threshold value
  • FIG. 21 ( c ) is a view for showing an example of the image which is captured by the image sensor 43 through the infrared filter 17 with the illumination and is level filtered by a certain threshold value
  • FIG. 21 ( d ) is a view for showing an example of the image which is captured by the image sensor 43 through the infrared filter 17 without the illumination and is level filtered by a certain threshold value
  • FIG. 21 ( e ) is a view for showing an example of the differential signal between the image signal with the illumination and the image signal without the illumination.
  • the operation article 150 is irradiated with infrared light in order to capture an image by the reflected infrared light which is incident on the image sensor 43 through the infrared filter 17 .
  • an ordinary image sensor (corresponding to the image sensor 43 of FIG. 5 ) captures an image which includes not only light sources such as a fluorescent light source, an incandescent light source and a solar light source (window) but also any other objects located inside the room, in addition to the image of the operation article 150 , as illustrated in FIG. 21 ( a ).
  • a computer or a processor having a substantially high-speed processing capability is needed in order to extract only the image of the operation article 150 by processing the image of FIG. 21 ( a ).
  • a high-performance computer cannot be used in a device which must be manufactured at a low cost. Therefore, it is conceivable to lessen the load by the use of a variety of processing techniques.
  • since FIG. 21 ( a ) would have to be drawn as a gray-scale image, the illustration is omitted. Also, in each of FIG. 21 ( a ) through FIG. 21 ( e ), an image is captured of the reflection sheet 155 of the operation article 150 .
  • FIG. 21 ( b ) is an image signal after level filtering the image signal of FIG. 21 ( a ) by a certain threshold value. While such a level filtering process can be performed by a dedicated hardware circuit or by software control, it is possible to remove images having low luminance values other than the operation article 150 and the light sources by performing the level filtering process, which cuts off pixel data whose luminance value is no higher than a certain level. In the case of the image signal of FIG. 21 ( b ), the images other than the operation article 150 and the light sources can be eliminated so as to lessen the load on the computer; however, since high-luminance images including the light sources still remain, it is difficult to discriminate between the operation article 150 and the other light sources.
  • the infrared filter 17 is used as illustrated in FIG. 5 in order that the image sensor 43 does not capture the images other than the image of the infrared light.
  • as illustrated in FIG. 21 ( c ), it is possible to remove the fluorescent light source, which emits little infrared light.
  • however, the solar light source and the incandescent light source remain included in the image signal. Accordingly, the load is lessened by calculating the difference between the pixel data when the infrared light stroboscope is turned on and the pixel data when the infrared light stroboscope is turned off.
  • the difference is calculated between the pixel data of the image signal with the illumination as shown in FIG. 21 ( c ) and the pixel data of the image signal without the illumination as shown in FIG. 21 ( d ). Then, as illustrated in FIG. 21 ( e ), only the image corresponding to the difference can be acquired.
  • the image corresponding to the difference includes only the image corresponding to the operation article 150 as apparent from the comparison with FIG. 21 ( a ). Accordingly, while lessening the processing load, it is possible to acquire the state information on the operation article 150 .
  • the state information is any one of or any combination of two or more of speed information, moving direction information, moving distance information, velocity vector information, acceleration information, movement locus information, area information, and positional information.
  • the CPU 201 acquires differential data by calculating the difference between the pixel data acquired when the infrared light emitting diodes 15 are turned on and the pixel data acquired when the infrared light emitting diodes 15 are turned off.
  • the CPU 201 obtains the coordinates of the target point of the operation article 150 on the basis of the differential data Dif[X][Y] as calculated. This will be explained in detail.
  • FIG. 22 is a view for explaining the calculation process of the target point of the operation article 150 .
  • the image sensor 43 shown in FIG. 22 is an image sensor of 32 pixels × 32 pixels.
  • the CPU 201 scans the differential data in such a manner that it scans through 32 pixels in the X-direction (the horizontal direction, the lateral direction or the row direction), then increments the Y-coordinate, scans through the next 32 pixels in the X-direction, increments the Y-coordinate again, and so on.
  • the CPU 201 finds the differential data of the maximum luminance value from the differential data of the 32 pixels × 32 pixels as scanned, and compares the maximum luminance value with a predetermined threshold value “Th”. Then, if the maximum luminance value is larger than the predetermined threshold value “Th”, the CPU 201 calculates the coordinates of the target point of the operation article 150 on the basis of the coordinates of the pixel having the maximum luminance value. This point will be explained in detail.
  • FIG. 23 ( a ) is a view for explaining the process of scanning in the X-direction when the coordinates of the target point of the operation article 150 are calculated on the basis of the coordinates of the pixel having the maximum luminance value
  • FIG. 23 ( b ) is a view for explaining the process of starting scanning in the Y-direction when the coordinates of the target point of the operation article 150 are calculated on the basis of the coordinates of the pixel having the maximum luminance value
  • FIG. 23 ( c ) is a view for explaining the process of scanning in the Y-direction when the coordinates of the target point of the operation article 150 are calculated on the basis of the coordinates of the pixel having the maximum luminance value
  • FIG. 23 ( d ) is an explanatory view for showing the result of the process of calculating the coordinates of the target point of the operation article 150 on the basis of the coordinates of the pixel having the maximum luminance value.
  • the CPU 201 performs scanning the differential data in the X-direction from the coordinates of the pixel having the maximum luminance value as the center in order to detect pixels whose luminance values are larger than the predetermined threshold value “Th”.
  • the CPU 201 performs the process of calculating the coordinates (xc, yc) of the target point as described above each time the frame is updated. Then, the CPU 201 assigns “xc” and “yc” respectively to the array elements Px[M] and Py[M]. Meanwhile, “M” is an integer and incremented by one each time the frame displayed on the screen 91 is updated.
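A condensed C sketch of this extraction step follows. It only finds the maximum-luminance pixel and applies the threshold “Th”; the refinement scans of FIG. 23 ( a ) through FIG. 23 ( d ) and the conversion to screen coordinates are omitted, and the function name is an assumption.

```c
#include <stdint.h>
#include <stdbool.h>

#define PIXELS 32
#define LINES  32

/* Locate the pixel of maximum differential luminance; if it exceeds
   the threshold "Th", use its coordinates as the basis of the target
   point (xc, yc) of the operation article 150. */
bool find_target(const int16_t Dif[PIXELS][LINES], int16_t Th,
                 int *xc, int *yc)
{
    int16_t max = INT16_MIN;
    int mx = 0, my = 0;
    for (int y = 0; y < LINES; y++)        /* scan line by line        */
        for (int x = 0; x < PIXELS; x++)   /* 32 pixels per line       */
            if (Dif[x][y] > max) { max = Dif[x][y]; mx = x; my = y; }
    if (max <= Th)
        return false;                      /* operation article unseen */
    *xc = mx;                              /* refinement scans omitted */
    *yc = my;
    return true;
}
```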
  • the CPU 201 determines which of areas a 1 to a 4 includes the target point of the operation article 150 on the screen 91 . This point will be explained in detail.
  • FIG. 24 is a view for explaining the target point existence area determination process ( 1 ) performed by the CPU 201 .
  • a predetermined area a 1 including the position guide “G 1 ”, a predetermined area a 2 including the position guide “G 2 ”, a predetermined area a 3 including the position guide “G 3 ” and the predetermined area a 4 including the position guide “G 4 ” are defined on the screen 91 .
  • the CPU 201 determines, from among the predetermined areas a 1 to a 4 , the area in which the target point (xc, yc) of the operation article 150 is located and stores the result of determination in the array element J 1 [M].
  • the CPU 201 performs the determination process as described above each time the frame displayed on the screen 91 is updated.
  • FIG. 25 is a view for explaining the target point existence area determination process ( 2 ) performed by the CPU 201 .
  • the areas A 1 to A 4 are defined by dividing the screen 91 into four.
  • the CPU 201 determines, from among the areas A 1 to A 4 , the area in which the target point (xc, yc) of the operation article 150 is located and stores the result of determination in the array element J 2 [M].
  • the CPU 201 performs the determination process as described above each time the frame displayed on the screen 91 is updated.
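Both determination processes reduce to simple containment tests, sketched here in C. The quadrant layout of the areas A 1 to A 4 (A 1 upper-left, A 2 upper-right, A 3 lower-left, A 4 lower-right) is an assumption made for illustration; FIG. 25 defines the actual arrangement.

```c
/* Determination (2): which quarter A1..A4 of the 256 x 224 screen
   contains the target point.  The quadrant layout is assumed. */
int quadrant(int xc, int yc)
{
    int right = (xc >= 128);        /* screen is 256 pixels wide */
    int lower = (yc >= 112);        /* screen is 224 pixels tall */
    return 1 + right + 2 * lower;   /* 1..4 -> A1..A4            */
}

/* Determination (1): whether the target point lies inside one of the
   predetermined rectangles a1..a4 around the position guides. */
typedef struct { int x0, y0, x1, y1; } Rect;

int area_of(const Rect a[4], int xc, int yc)
{
    for (int i = 0; i < 4; i++)
        if (xc >= a[i].x0 && xc <= a[i].x1 &&
            yc >= a[i].y0 && yc <= a[i].y1)
            return i + 1;           /* areas a1..a4              */
    return 0;                       /* outside every guide area  */
}
```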
  • the CPU 201 registers (stores in the internal memory 207 ) the coordinates (xc, yc) of the current target point of the operation article 150 as the coordinates of the cursor 105 to be displayed in the next frame.
  • the CPU 201 assigns a note number (refer to FIG. 19 and FIG. 20 ( a ) through FIG. 20 ( c )), which is read from the second musical score data 306 in accordance with the musical score data pointer for guidance, to an array element NN[ 0 ] or an array element NN[ 1 ].
  • the number of elements of the array is two as described above because the guidance of a certain position guide “G” and the guidance of another position guide “G” are started in different timings but can be overlappingly continued in a certain period.
  • the musical score data pointer for guidance is a pointer pointing to the position of the second musical score data 306 from which data is read.
  • FIG. 26 is a view for explaining the registration process of the animations of the position guide “G”, the direction guide “g” and the path guide “rg”.
  • in the ROM 51 or the internal memory 207 , there is prepared a table in which the note numbers are associated with the animation information (the storage location information of the animation table of the position guide “G”, the display coordinate information of the position guide “G” on the screen 91 , the display timing information of the position guide “G”, the storage location information of the animation table of the direction guide “g”/the path guide “rg”, the display coordinate information of the direction guide “g”/the path guide “rg” on the screen 91 , and the display timing information of the direction guide “g”/the path guide “rg”).
  • Each of the note numbers in this table is a note number which is used to control the display of a guide and shown in FIG. 20 ( a ) through FIG. 20 ( c ).
  • the CPU 201 refers to this table and registers (stores in the predetermined area of the internal memory 207 ) the animation information (the storage location information of the animation table of the position guide “G”, the display coordinates of the position guide “G” on the screen 91 , the display timing information of the position guide “G”, the storage location information of the animation table of the direction guide “g”, the display coordinate information of the direction guide “g” in the screen 91 , and the display timing information of the direction guide “g”) associated with the note number “ 55 ”.
  • the display timing information is information indicative of when an object is to be displayed on the screen 91 .
  • the guide number “ 55 ” indicates that the position guide “G 2 ” is to be displayed at the coordinates (x 1 , y 1 ) in the next frame following the frame which is currently displayed since the display timing information of the position guide “G” is “0”.
  • the display timing information of the direction guide “g” is 0, 6, 12, . . .
  • the guide number “ 55 ” indicates that the direction guide “g 1 ” is to be displayed at the coordinates (x 3 , y 1 ) in the next frame following the frame which is currently displayed, that the direction guide “g 2 ” is to be displayed at the coordinates (x 4 , y 1 ) 6 frames after, . . . and that the direction guide “g 5 ” is to be displayed at the coordinates (x 7 , y 1 ) 24 frames after.
  • FIG. 27 is a view for showing an example of the animation table which is designated by the animation table storage location information of FIG. 26 .
  • the animation table is a table in which are associated: the storage location information of animation image data (a plurality of object image data items arranged in a time series); the reference numbers of the objects for use in performing the animation, arranged in a time series; information indicative of how many frames (the number of duration frames) an object is continuously displayed; the size of an object; the information on a color palette; the information on the depth value; and the size of a sprite.
  • the animation image data is pixel pattern data.
  • the pixel pattern data, the color palette and the depth value are related to sprites for forming objects, and the definitions thereof are the same as explained in conjunction with the blocks of FIG. 14 .
  • the animation table pointed to by the animation table storage location information “address 0 ” is an example of the animation table of the position guide “G”
  • the animation table pointed to by the animation table storage location information “address 1 ” is an example of the animation table of the direction guide “g”
  • the animation table pointed to by the animation table storage location information “address 2 ” is an example of the animation table of the path guide “rg”
  • the animation table pointed to by the animation table storage location information “address 3 ” is an example of the animation table of the position guide “G” which is used when the player 94 successfully manipulates the cursor 105 .
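The two tables might be rendered in C as follows. These struct layouts are assumed for illustration; the patent specifies the contents of the tables of FIG. 26 and FIG. 27 but not their in-memory representation.

```c
#include <stdint.h>

/* Assumed C rendering of one FIG. 26 entry: everything needed to
   register the animations tied to one guide-controlling note number. */
typedef struct {
    const void *anim_table;     /* storage location of the animation table */
    int16_t     x, y;           /* display coordinates on the screen 91    */
    uint16_t    timing;         /* display timing (frames from now)        */
} AnimInfo;

typedef struct {
    AnimInfo position_guide;    /* the position guide "G"               */
    AnimInfo dir_or_path[10];   /* the direction/path guides "g"/"rg"   */
} GuideEntry;

/* Assumed C rendering of one FIG. 27 animation-table row. */
typedef struct {
    const void *image_data;     /* animation image data (pixel patterns) */
    uint8_t     object_no;      /* reference number of the object        */
    uint8_t     duration;       /* number of duration frames             */
    uint8_t     obj_w, obj_h;   /* object size                           */
    uint8_t     palette, depth; /* color palette and depth value         */
    uint8_t     spr_w, spr_h;   /* sprite size                           */
} AnimRow;
```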
  • the CPU 201 determines whether or not the player 94 correctly manipulates the cursor 105 in correspondence with the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”. More specific description is as follows.
  • the CPU 201 determines whether or not the cursor 105 (i.e., the target point of the operation article 150 ) is located in the area which is currently designated by the position guide “G” and the direction guide “g” on the basis of the result of determination J 1 [M] performed by the target point existence area determination process ( 1 ) (refer to FIG. 24 ). For example, in the case where the area a 2 is currently designated by the position guide “G” and the direction guide “g”, if the cursor 105 is located in the area a 2 , the CPU 201 determines that the cursor 105 is correctly manipulated in correspondence with the position guide “G” and the direction guide “g”.
  • the CPU 201 determines whether or not the cursor 105 (i.e., the target point of the operation article 150 ) is moved along the path which is currently designated by the position guide “G” and the path guide “rg” on the basis of the result of determination J 2 [M] performed by the target point existence area determination process ( 2 ) (refer to FIG. 25 ).
  • the path which is designated by the guide number “ 53 ” as shown in FIG. 20 ( b ) is the path of the area A 3 ->the area A 1 ->the area A 2 ->the area A 4 as shown in FIG. 25 .
  • the path which is designated by the guide number shown in FIG. 20 ( c ) is the path of the area A 4 -> the area A 2 -> the area A 1 -> the area A 3 as shown in FIG. 25 . Accordingly, for example, in the case where the path corresponding to the guide number “ 53 ” is currently designated by the position guide “G” and the path guide “rg”, if the cursor 105 is moved along the path of the area A 3 -> the area A 1 -> the area A 2 -> the area A 4 , the CPU 201 determines that the cursor 105 is correctly manipulated in correspondence with the position guide “G” and the path guide “rg”.
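A tolerant C sketch of the path judgment: collapse the per-frame area results J 2 [M] into a sequence of entered areas and require the designated areas to be visited in order. The exact acceptance rule (for example, whether detours through other areas are penalized) is not spelled out here, so this is one plausible reading only.

```c
#include <stdbool.h>

/* Check that the quadrant history J2[0..n-1] visits the designated
   waypoints in order, e.g. {3, 1, 2, 4} for A3 -> A1 -> A2 -> A4. */
bool path_followed(const int J2[], int n, const int want[], int wn)
{
    int pos = 0;                      /* next waypoint to match         */
    int prev = 0;                     /* 0 = no area entered yet        */
    for (int i = 0; i < n && pos < wn; i++) {
        if (J2[i] != prev) {          /* the cursor entered a new area  */
            if (J2[i] == want[pos])
                pos++;                /* matched the next waypoint      */
            prev = J2[i];
        }
    }
    return pos == wn;                 /* all waypoints visited in order */
}
```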
  • the CPU 201 registers (stores in the predetermined area of the internal memory 207 ) the dance animation information corresponding to the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”.
  • a table is prepared in order to associate the dance animation information with the note numbers (refer to FIG. 20 ( a ) through FIG. 20 ( c )) for controlling the display of guides.
  • the note numbers indicating the same guide direction (for example, the note numbers “ 55 ” and “ 67 ”) are associated with the same dance animation information.
  • the dance animation information is designed in the same manner as the animation information of FIG. 26 and contains the storage location information of the dance animation table, the display coordinates of the dance object 106 on the screen 91 , and the display timing information of the dance object 106 . Also, the dance animation table is designed in the same manner as the animation table of FIG. 27 .
  • the dance animation image data is pixel pattern data.
  • the second musical score data 306 of FIG. 17 may contain a note number indicating that a high speed dance animation is to be performed and a note number indicating that a low speed dance animation is to be performed.
  • a high speed dance animation table and a low speed dance animation table are prepared respectively as the dance animation table.
  • a dance animation table is prepared for each dance speed.
  • the CPU 201 registers the storage location information (“address 3 ” in the case of the example of FIG. 27 ) of the animation table of the position guide “G” which is used when the player 94 successfully manipulates the cursor 105 .
  • the CPU 201 registers (stores in the predetermined area of the internal memory 207 ) the animation information for the evaluation objects 107 to 109 .
  • This animation information is designed in the same manner as the animation information of FIG. 26 . Accordingly, this animation information contains the storage location information of the animation table for the evaluation objects 107 to 109 .
  • This animation table is designed in the same manner as the animation table of FIG. 27 .
  • the CPU 201 performs the scrolling corresponding to the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”. More specifically speaking, the CPU 201 changes the center position of the background screen 140 to scroll the background 110 (refer to FIG. 16 ( a ) and FIG. 16 ( b )) in correspondence with the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”, and the note number for controlling the dance speed. Still further, in the case where the background screen 140 is scrolled in the lateral direction, the CPU 201 changes the data in the array elements PA and the array elements CA corresponding thereto.
  • FIG. 28 is a timing diagram for explaining the relationship among the first musical score data 305 , the second musical score data 306 , the position guide “G”, the direction guide “g”, the judgment of manipulation and the dance animation.
  • each thick line indicates the execution period for which the process continues
  • the filled circle at the left end of each thick line indicates the starting point of the process
  • the filled circle at the right end of each thick line indicates the end of the process.
  • the time point of starting to read the second musical score data 306 is set earlier, by a predetermined time “t”, than the time points T 1 , T 2 , . . . of starting to read the first musical score data 305 . Accordingly, the display of the direction guide “g” is started the predetermined time “t” before the corresponding note number of the first musical score data 305 is read at the corresponding time point T 1 to T 3 , and is continued until this corresponding time point T 1 to T 3 at which the corresponding note number of the first musical score data 305 is read (for example, for 60 frames).
  • the animation of the position guide “G” is started the predetermined time “t” before the corresponding note number of the first musical score data 305 is read at the corresponding time point T 1 to T 3 , and is continued until a short time after the corresponding note number of the first musical score data 305 is read at this corresponding time point T 1 to T 3 .
  • the CPU 201 starts the process of determining whether or not the cursor 105 is correctly manipulated in correspondence with the direction guide “g” and the position guide “G” a predetermined period (for example, 30 frames) after starting the display of the direction guide “g”, and completes the process at the corresponding time point of T 1 to T 3 in which the corresponding note number of the first musical score data 305 is read. Then, if it is determined that the cursor 105 is correctly manipulated in correspondence with the direction guide “g” and the position guide “G”, the CPU 201 registers the dance animation information at the time point when the determination period ends. Accordingly, in this case, dance animation is performed on the basis of the dance animation information which is registered.
  • the time point of starting to read the second musical score data 306 is earlier, by the predetermined time “t”, than the time points T 1 , T 2 , . . . of starting to read the first musical score data 305 . Namely, since the player 94 starts the manipulation of the operation article 150 after the guidance by the direction guide “g” and the position guide “G” is started, the direction guide “g” and the position guide “G” are displayed earlier than the timing of the music for the purpose of adjusting for the time lag.
  • the timing of displaying the path guide “rg” is provided in the same manner as the timing of displaying the direction guide “g”. However, for example, the determination of whether or not the cursor 105 is correctly manipulated in correspondence with the path guide “rg” is performed between the start and the end of the guidance by the path guide “rg” (for example, for 60 frames).
  • the CPU 201 provides the graphics processor 202 of FIG. 7 with the information required for drawing during the vertical blanking period on the basis of the information registered by the cursor control process, the guide control process and the dance control process. Then, the graphics processor 202 generates a video signal on the basis of the information as given, and outputs it to the video signal output terminal 47 . By this process, the game screen including the position guide “G”, the background 110 and so forth is displayed on the screen 91 of the television monitor 90 . More specific description is as follows.
  • the CPU 201 calculates the display coordinates of the respective sprites forming the cursor 105 on the basis of the coordinate information (the coordinate information of the target point of the operation article 150 ) which is registered by the cursor control process. Then, the CPU 201 provides the graphics processor 202 with the display coordinate information, the color palette information, the depth value, the size information and the pixel pattern data storage location information of the respective sprites for forming the cursor 105 . The graphics processor 202 generates the image signal of the cursor 105 on the basis of the respective information, and outputs it to the video signal output terminal 47 .
  • the CPU 201 acquires the size information of the object for forming the animation image of each guide (the position guide “G”, the direction guide “g” or the path guide “rg”) and the size information of the sprite for forming the object with reference to the animation table on the basis of the animation table storage location information contained in the animation information which is registered by the guide control process. Then, the CPU 201 calculates the display coordinates of the respective sprites for forming the object on the basis of the above respective information and the display coordinate information contained in the animation information as registered.
  • the CPU 201 calculates the pixel pattern data storage location information of the respective sprites for forming the object on the basis of the reference number of the position guide “G” to be displayed next, the size information of the object and sprite contained in the animation table, and the animation image data storage location information of the position guide “G” contained in the animation table.
  • the CPU 201 provides the graphics processor 202 with the color palette information, the depth value and the size information of the respective sprites for forming the position guide “G” together with the pixel pattern data storage location information and the display coordinate information of the respective sprites with reference to the animation table.
  • the CPU 201 provides the graphics processor 202 with the above respective information on the basis of the display timing information of the position guide “G” contained in the animation information as registered and the information on the number of duration frames of the animation table.
  • the information to be given to the graphics processor 202 by the CPU 201 has a similar content and is acquired in a similar manner as for the position guide “G”.
  • the CPU 201 provides the information on the direction guides “g 1 ” to “g 5 ” and the path guides “rg 1 ” to “rg 10 ” to the graphics processor 202 , when starting to display each of the direction guides “g 1 ” to “g 5 ” and each of the path guides “rg 1 ” to “rg 10 ”, with reference to the display coordinate information and the display timing information contained in the animation information as registered.
  • the graphics processor 202 generates the image signals of the guides (the position guide “G”, the direction guide “g”, the path guide “rg”) on the basis of the above information which is given as described above, and outputs them to the video signal output terminal 47 .
  • the CPU 201 acquires the size information of the dance object 106 for forming the dance animation image and the size information of the sprite for forming the dance object 106 with reference to the dance animation table on the basis of the dance animation table storage location information contained in the dance animation information which is registered by the dance control process. Then, the CPU 201 calculates the display coordinates of the respective sprites for forming the dance object 106 on the basis of the above respective information and the display coordinate information contained in the dance animation information as registered.
  • the CPU 201 calculates the pixel pattern data storage location information of the respective sprites for forming the dance object 106 on the basis of the reference number of the dance object 106 to be displayed next, the size information of the dance object 106 and the sprite contained in the dance animation table, and the dance animation image data storage location information contained in the dance animation table.
  • the CPU 201 provides the graphics processor 202 with the color palette information, the depth value and the size information of the respective sprites for forming the dance object 106 together with the pixel pattern data storage location information and the display coordinate information of the respective sprites with reference to the dance animation table.
  • the CPU 201 provides the above respective information to the graphics processor 202 on the basis of the display timing information contained in the dance animation information as registered and the information on the number of duration frames of the dance animation table.
  • the CPU 201 acquires the information required for generating image signals on the basis of the animation information and the animation table for the evaluation objects 107 to 109 which are registered by the dance control process, and provides the information to the graphics processor 202 .
  • the information to be given to the graphics processor 202 by the CPU 201 has a similar content and is acquired in a similar manner as for the dance object 106 .
  • the graphics processor 202 generates the image signals of the dance object 106 and the evaluation objects 107 to 109 on the basis of the above information which is given as described above, and outputs them to the video signal output terminal 47 .
  • the playback of music is performed by an interrupt operation.
  • the CPU 201 reads and interprets the music control information of FIG. 18 while incrementing the musical score data pointer for music.
  • the musical score data pointer for music is a pointer pointing to the position of the first musical score data 305 from which data is read.
  • the CPU 201 provides the sound processor 203 with the head address from which the wave data is stored in accordance with the frequency of sound vibration (pitch) designated by the note number contained in the music control information and the instrument (tone quality) designated by the instrument designation information. Furthermore, if the command contained in the music control information as read is “Note On”, the CPU 201 provides the sound processor 203 with the head address from which the envelope data as required is stored.
  • the CPU 201 provides the sound processor 203 with pitch control information corresponding to the frequency of sound vibration (pitch) designated by the note number contained in the music control information, and volume information contained in the music control information.
  • the pitch control information is used to perform the pitch conversion by changing the frequency of reading the wave data.
  • the sound processor 203 periodically reads the pitch control information at a certain interval and accumulates the pitch control information.
  • the sound processor 203 then processes this result of accumulation to obtain the address pointer to the wave data. Accordingly, if the pitch control information is set to a large value, the address pointer is quickly incremented by the large value to raise the frequency of the wave data. Conversely, if the pitch control information is set to a small value, the address pointer is slowly incremented by the small value to lower the frequency of the wave data. In this way, the sound processor 203 performs the pitch conversion of wave data.
  • the sound processor 203 reads the wave data stored in the location pointed to by the head address as given from the ROM 51 , while incrementing the address pointer on the basis of the pitch control information as given. Then, the sound processor 203 generates an audio signal by multiplying the wave data, which is successively read, by the envelope data and the volume data. In this way, an audio signal having the tone quality of the musical instrument, the frequency of sound vibration (pitch) and the sound volume which are designated by the first musical score data 305 is generated and output to the audio signal output terminal 49 .
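The pitch-conversion mechanism can be condensed into a few lines of C: the pitch control information is accumulated into a fixed-point address pointer, and the integer part of the accumulator addresses the wave data, which is then multiplied by the envelope and volume data. The 16.16 fixed-point format is an assumption; the sound processor 203 's actual precision is not given in this part of the description.

```c
#include <stdint.h>
#include <stddef.h>

/* One output tick of the wave readout.  The pitch control value is
   accumulated into a 16.16 fixed-point address pointer (assumed
   format); a larger value steps through the wave data faster
   (higher pitch), a smaller value slower (lower pitch). */
int16_t next_sample(const int8_t *wave, size_t len,
                    uint32_t *addr_fp, uint32_t pitch_fp,
                    uint8_t envelope, uint8_t volume)
{
    *addr_fp += pitch_fp;                 /* accumulate pitch control info */
    size_t idx = (size_t)(*addr_fp >> 16) % len;  /* integer part = address */
    /* audio sample = wave data x envelope x volume (rescaled) */
    return (int16_t)((wave[idx] * envelope * volume) / 256);
}
```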
  • the CPU 201 manages the gate times contained in the music control information as read. Accordingly, the CPU 201 outputs an instruction to the sound processor 203 in order that, when a gate time elapses, the output of the corresponding musical tone is terminated. In response to this, the sound processor 203 terminates the output of the corresponding musical tone as designated.
  • Music is played back as described above on the basis of the first musical score data 305 and output through a speaker (not shown in the figure) of the television monitor 90 .
  • FIG. 29 is a flow chart showing the entire process flow of the music game apparatus 1 of FIG. 1 .
  • the CPU 201 performs the initial settings of the system in step S 1 .
  • the CPU 201 calculates the state information of the operation article 150 .
  • the CPU 201 performs the game process on the basis of the state information of the operation article 150 as calculated in step S 2 .
  • the CPU 201 determines whether or not “M” is smaller than a predetermined value “K”. If “M” is greater than or equal to the predetermined value “K”, the CPU 201 proceeds to step S 5 , in which “0” is assigned to “M”, and proceeds to step S 6 . On the other hand, if “M” is smaller than the predetermined value “K”, the CPU 201 proceeds from step S 4 to step S 6 .
  • This value “M” will be explained in the following description.
  • in step S 6 , the CPU 201 determines whether or not it is waiting for the video system synchronous interrupt.
  • the CPU 201 provides the graphics processor 202 with the image information for updating the display screen of the television monitor 90 after starting the vertical blanking period (step S 7 ). Accordingly, after the process necessary for updating the display screen is completed, the process is halted until the next video system synchronous interrupt is issued. If “YES” is determined in step S 6 , i.e., while waiting for a video system synchronous interrupt (while there is no video system synchronous interrupt), the same step S 6 is repeated.
  • if “NO” is determined in step S 6 , i.e., if the CPU 201 gets out of the state of waiting for a video system synchronous interrupt (if there is a video system synchronous interrupt), the process proceeds to step S 7 .
  • in step S 7 , the CPU 201 provides the graphics processor 202 with the image information required for generating the game screen (refer to FIG. 11 through FIG. 13 ) during the vertical blanking period on the basis of the game process in step S 3 .
  • FIG. 30 is a flow chart showing the process flow for the initial settings of the system in step S 1 of FIG. 29 .
  • the CPU 201 initializes the musical score data pointer for guidance in step S 10 .
  • the CPU 201 sets an execution stand-by counter for guidance to “0”.
  • in step S 12 , the CPU 201 initializes the musical score data pointer for music.
  • in step S 13 , the CPU 201 sets an execution stand-by counter for music to “t”.
  • in step S 14 , the CPU 201 performs the initial settings of the image sensor 43 .
  • in step S 15 , the CPU 201 initializes various flags and various counters.
  • in step S 16 , the CPU 201 sets the timer circuit 210 as an interrupt source for outputting sound.
  • the sound processor 203 performs a process to output sound from the speaker of the television monitor 90 .
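The effect of steps S 10 through S 13 can be sketched in C: the guidance-side stand-by counter starts at “0” so guide display begins at once, while the music-side counter starts at “t”, which realizes the lead time discussed in connection with FIG. 28 . Decrementing the counters once per frame is an assumption made for this sketch.

```c
#include <stdbool.h>

/* Execution stand-by counters from steps S10-S13: guidance starts
   immediately while music is held back by "t", so the guides lead
   the music by exactly that time. */
typedef struct { int standby; } ScoreReader;

void init_readers(ScoreReader *guidance, ScoreReader *music, int t)
{
    guidance->standby = 0;   /* steps S10-S11: guidance runs at once */
    music->standby    = t;   /* steps S12-S13: music delayed by "t"  */
}

/* Assumed to be called once per frame; returns true when the reader
   should fetch and interpret its next score command. */
bool reader_due(ScoreReader *r)
{
    if (r->standby > 0) { r->standby--; return false; }
    return true;
}
```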
  • FIG. 31 is a flow chart showing the process flow for sensor initial settings in step S 14 of FIG. 30 .
  • the high speed processor 200 sets setting data to a command “CONF”.
  • this command “CONF” is a command used to inform the image sensor 43 that the high speed processor 200 enters a configuration mode in which a command is transmitted to the image sensor 43 .
  • a command transmission process is performed.
  • FIG. 32 is a flow chart showing the command transmission process in step S 21 of FIG. 31 .
  • the high speed processor 200 sets register data (I/O port) to the setting data (the command “CONF” in the case of step S 21 ) in the first step S 30 , and sets the register setting clock (I/O port) to a low level in the next step S 31 .
  • the register setting clock CLK is set to a high level in step S 33 .
  • the register setting clock CLK is set to a low level again in step S 35 .
  • the process of transmitting a command can be performed by changing the level of the register setting clock CLK to a low level, then to a high level and again to a low level while waiting for the predetermined time before each change.
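A C sketch of steps S 30 through S 35 follows. The port helpers are hypothetical and are stubbed with printf so the sketch runs; real firmware would drive the corresponding I/O ports of the high speed processor 200 .

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical port helpers, stubbed with printf so the sketch runs. */
static void set_register_data(uint8_t v) { printf("DATA <- 0x%02X\n", v); }
static void set_clock(int level)         { printf("CLK  <- %d\n", level); }
static void wait_fixed_time(void)        { /* predetermined settle time */ }

/* Steps S30-S35: place the setting data on the data lines, then drive
   the register setting clock CLK low -> high -> low, waiting for the
   predetermined time before each level change. */
void transmit_command(uint8_t setting_data)
{
    set_register_data(setting_data);  /* step S30: register data       */
    wait_fixed_time();
    set_clock(0);                     /* step S31: CLK to a low level  */
    wait_fixed_time();
    set_clock(1);                     /* step S33: CLK to a high level */
    wait_fixed_time();
    set_clock(0);                     /* step S35: CLK to a low level  */
}
```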
  • in step S 22 , the pixel mode is set as well as the exposure time.
  • since the image sensor 43 is, for example, a CMOS image sensor of 32 pixels × 32 pixels as described above, “0h” indicative of 32 pixels × 32 pixels is loaded into a pixel mode register at a setting address “0”.
  • in step S 23 , the high speed processor 200 performs a register setting process.
  • FIG. 34 is a flow chart showing the register setting process in step S 23 of FIG. 31 .
  • the high speed processor 200 sets the setting data to the command “MOV” associated with an address in the first step S 40 , and then performs the command transmission process in the next step S 41 as explained above with reference to FIG. 32 to transmit the command.
  • the high speed processor 200 sets the setting data to the command “LD” associated with data in the next step S 42 , and then performs the command transmission process in the next step S 43 to transmit the command.
  • the high speed processor 200 sets the setting data to the command “SET” in step S 44 , and transmits the command in the next step S 45 .
  • The command "MOV" is a command for transmitting the address of a control register, the command "LD" is a command for transmitting data, and the command "SET" is a command for actually loading the data into the address. Incidentally, the above process is repeated if there are a plurality of control registers to be set.
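  • Under the assumption of some command encoding (the actual bit patterns of "MOV", "LD" and "SET" depend on the image sensor), the register setting process can be sketched as follows.

```c
/* Steps S40-S45 of FIG. 34: set one control register of the image sensor.
   The CMD_* encodings are assumptions; transmit_command() is sketched above. */
enum { CMD_MOV = 0x00, CMD_LD = 0x40, CMD_SET = 0x80 };  /* hypothetical */

void transmit_command(unsigned char setting_data);

void register_set(unsigned char address, unsigned char data)
{
    transmit_command(CMD_MOV | address);  /* steps S40-S41: register address */
    transmit_command(CMD_LD  | data);     /* steps S42-S43: data */
    transmit_command(CMD_SET);            /* steps S44-S45: load data into address */
}
```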
  • In step S 24, the setting address is set to "1" (the address of the low nibble of an exposure time setting register), and "Fh" is loaded into the low nibble of the exposure time setting register as the low nibble data of "FFh" indicative of the maximum exposure time.
  • In step S 25, the register setting process of FIG. 34 is performed.
  • In step S 26, the setting address is set to "2" (the address of the high nibble of the exposure time setting register), and "Fh" is loaded into the high nibble of the exposure time setting register as the high nibble data of "FFh" indicative of the maximum exposure time; the register setting process is then performed in step S 27.
  • In step S 28, the setting data is set to a command "RUN", which indicates the completion of initialization and instructs the image sensor 43 to start outputting data, followed by step S 29 in which the command "RUN" is transmitted.
  • In this way, the sensor initialization process is performed in step S 14 of FIG. 30.
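  • Putting the above together, the whole sensor initialization of FIG. 31 reduces to a short sequence; as noted below, the concrete commands, addresses and values would change with the image sensor actually employed, and the CMD_* encodings remain assumptions.

```c
/* Steps S20-S29 of FIG. 31 in outline, using the helpers sketched above. */
enum { CMD_CONF = 0xC0, CMD_RUN = 0xE0 };  /* hypothetical encodings */

void sensor_initial_settings(void)
{
    transmit_command(CMD_CONF);  /* steps S20-S21: enter configuration mode */
    register_set(0, 0x0);        /* steps S22-S23: pixel mode "0h" = 32 x 32 */
    register_set(1, 0xF);        /* steps S24-S25: exposure time, low nibble of "FFh" */
    register_set(2, 0xF);        /* steps S26-S27: exposure time, high nibble of "FFh" */
    transmit_command(CMD_RUN);   /* steps S28-S29: start outputting pixel data */
}
```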
  • the specific examples as illustrated in FIG. 31 to FIG. 34 may be modified in accordance with the specification of the image sensor 43 actually employed.
  • FIG. 35 is a flow chart showing the process of calculating the state information in step S 2 of FIG. 29 .
  • the CPU 201 acquires digital pixel data from the ADC 208 in step S 50 .
  • This digital pixel data is data obtained by converting the analog pixel data, which is transmitted from the image sensor 43 , into digital data by the ADC 208 .
  • step S 51 the process of extracting a target point is performed. More specifically speaking, the CPU 201 acquires differential data by calculating the difference between the pixel data acquired when the infrared light emitting diodes 15 are turned on and the pixel data acquired when the infrared light emitting diodes 15 are turned off. Then, the CPU 201 finds the maximum value of the differential data and compares it with the predetermined threshold value “Th”. Furthermore, if the maximum value of the differential data is greater than the predetermined threshold value “Th”, the CPU 201 converts the coordinates of the pixel having the differential data corresponding to the maximum value into the coordinates on the screen 91 of the television monitor 90 and sets the coordinates of the target point of the operation article 150 to the coordinates as converted.
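  • The differential extraction of step S 51 can be sketched in C as follows; the array layout and names are assumptions, and the span refinement and coordinate conversion of the later steps are only noted in comments.

```c
#define SENSOR_N 32  /* the 32 x 32 pixel image sensor described above */

/* Sketch of steps S50-S51: differential image, maximum search, threshold test.
   Array layout and all names are assumptions.  Returns 1 when a target point
   is found; (Xc, Yc) are sensor coordinates, before conversion to the screen. */
int extract_target_point(const unsigned char lit[SENSOR_N][SENSOR_N],
                         const unsigned char unlit[SENSOR_N][SENSOR_N],
                         int Th, int *Xc, int *Yc)
{
    int Dif[SENSOR_N][SENSOR_N];
    int max = 0, mx = -1, my = -1;

    for (int X = 0; X < SENSOR_N; X++)
        for (int Y = 0; Y < SENSOR_N; Y++) {
            Dif[X][Y] = (int)lit[X][Y] - (int)unlit[X][Y];  /* lighted - unlighted */
            if (Dif[X][Y] > max) { max = Dif[X][Y]; mx = X; my = Y; }
        }

    if (max <= Th)
        return 0;    /* no differential value exceeds "Th": no target point */

    /* The patent then refines (mx, my) to the center of the bright span
       (FIG. 39) and converts the result to screen coordinates (step S 87). */
    *Xc = mx;
    *Yc = my;
    return 1;
}
```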
  • In step S 52, the CPU 201 determines which of the areas a 1 to a 4 in FIG. 24 includes the target point of the operation article 150, and stores the result of determination in the array element J 1 [M].
  • In step S 53, the CPU 201 determines which of the areas A 1 to A 4 in FIG. 25 includes the target point of the operation article 150, and stores the result of determination in the array element J 2 [M].
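  • Purely for illustration, a classifier for step S 52 might look like the sketch below; note that the actual geometry of the areas a 1 to a 4 (and A 1 to A 4) is defined by FIG. 24 and FIG. 25 and need not be the simple quadrant split assumed here.

```c
/* Hypothetical classifier for step S 52: returns which of four areas contains
   the target point.  The real areas a1-a4 of FIG. 24 (and A1-A4 of FIG. 25)
   are defined by the figures and need not be this simple quadrant split. */
int classify_area(int x, int y, int width, int height)
{
    int right = (x >= width / 2);   /* 0: left half,  1: right half  */
    int lower = (y >= height / 2);  /* 0: upper half, 1: lower half  */
    return 1 + right + 2 * lower;   /* an area number from 1 to 4 */
}
/* e.g. J1[M] = classify_area(Px[M], Py[M], screen_width, screen_height); */
```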
  • FIG. 36 is a flow chart showing the process flow of acquiring a pixel data group in step S 50 of FIG. 35 .
  • The CPU 201 sets "X" to "−1" and "Y" to "0" as element indices of a pixel data array in the first step S 60.
  • FIG. 37 is a flow chart showing the process flow of acquiring pixel data in step S 61 of FIG. 36 .
  • the CPU 201 checks the frame status flag signal FSF as input from the image sensor 43 in the initial step S 70 , and judges whether or not the rising edge thereof (from a low level to a high level) is detected in step S 71 . Then, if the rising edge of the frame status flag signal FSF is detected in step S 71 , in the next step S 72 , the CPU 201 instructs the ADC 208 to start the conversion of the analog pixel data input thereto into digital data.
  • In step S 73, the pixel strobe signal PDS as input from the image sensor 43 is checked, and it is judged in step S 74 whether or not the rising edge of the pixel strobe signal PDS from a low level to a high level is detected.
  • If "NO" is determined in step S 75, i.e., if it is the second or later pixel data constructing the current line, the current pixel data is acquired and saved in a temporary register (not shown in the figure) in steps S 76 and S 78. Thereafter, the process proceeds to step S 62 of FIG. 36.
  • In step S 62 of FIG. 36, the pixel data as saved in the temporary register is assigned to the pixel data array element P[Y][X].
  • In step S 63, "X" is incremented; while "X" is smaller than "32", the process from step S 61 to step S 63 is repeatedly performed. If "X" is equal to "32", i.e., if the acquisition of pixel data has reached the end of the current line, "X" is set to "−1" in the following step S 65, "Y" is incremented in step S 66, and the acquisition of pixel data is repeated from the top of the next line.
  • If "Y" is equal to "32" in step S 67, i.e., if the acquisition of pixel data has reached the last pixel data array element P[Y][X], the process proceeds to step S 51 of FIG. 35.
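  • A simplified C reconstruction of this acquisition loop follows; acquire_pixel() is a hypothetical stand-in for the strobe/ADC handshake of FIG. 37, and the dummy first read models why "X" starts at "−1".

```c
/* Simplified sketch of steps S60-S67: fill the 32 x 32 pixel array line by
   line.  The real flow starts "X" at -1 because the first strobe of each
   line carries no valid pixel data (see FIG. 37). */
unsigned char acquire_pixel(void);  /* FIG. 37: wait for a PDS edge, read ADC */

unsigned char P[32][32];

void acquire_pixel_data_group(void)
{
    for (int Y = 0; Y < 32; Y++) {    /* steps S66-S67: line by line */
        acquire_pixel();              /* dummy first read of the line ("X" == -1) */
        for (int X = 0; X < 32; X++)  /* steps S61-S65 */
            P[Y][X] = acquire_pixel();
    }
}
```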
  • FIG. 38 is a flow chart showing the process flow of extracting a target point in step S 51 of FIG. 35 .
  • the CPU 201 acquires differential data by calculating the difference between the pixel data acquired from the image sensor 43 when the infrared light emitting diodes 15 are turned on and the pixel data acquired from the image sensor 43 when the infrared light emitting diodes 15 are turned off.
  • In step S 82, the CPU 201 scans all the array elements Dif[X][Y].
  • In step S 83, the CPU 201 finds the maximum value of all the array elements Dif[X][Y]. If the maximum value is greater than the predetermined threshold value "Th", the CPU 201 proceeds to step S 85; if the maximum value is less than or equal to the predetermined threshold value "Th", the CPU 201 proceeds to step S 4 of FIG. 29 (step S 84).
  • In step S 85, the CPU 201 calculates the coordinates (Xc, Yc) of the target point of the operation article 150 on the basis of the coordinates corresponding to the maximum value.
  • In step S 87, the CPU 201 converts the coordinates (Xc, Yc) of the target point on the image sensor 43 into the coordinates (xc, yc) on the screen 91 of the television monitor 90.
  • In step S 88, the CPU 201 assigns "xc" to the array element Px[M] as the x-coordinate of the M-th target point, and "yc" to the array element Py[M] as the y-coordinate of the M-th target point.
  • FIG. 39 is a flow chart showing the process flow of calculating the coordinates of a target point in step S 85 of FIG. 38 .
  • In step S 100, the CPU 201 assigns the X-coordinate and the Y-coordinate, which are obtained in step S 83 in correspondence with the maximum value, respectively to "m" and "n".
  • In step S 103, the CPU 201 assigns the current value of "m" to "mr".
  • The endmost X-coordinate of the differential data greater than the predetermined threshold value "Th" is obtained by repeating steps S 101 to S 103 while scanning the X-axis from the maximum value position in the positive direction.
  • In step S 104, the CPU 201 assigns to "m" the X-coordinate which is obtained in step S 83 in correspondence with the maximum value.
  • In step S 105, the CPU 201 decrements "m" by one. If the differential data Dif[m][n] is greater than the predetermined threshold value "Th", the CPU 201 proceeds to step S 107; otherwise it proceeds to step S 108 (step S 106).
  • In step S 107, the CPU 201 assigns the current value of "m" to "ml". The endmost X-coordinate of the differential data greater than the predetermined threshold value "Th" is obtained by repeating steps S 105 to S 107 while scanning the X-axis from the maximum value position in the negative direction.
  • In step S 108, the CPU 201 calculates the center coordinate between the X-coordinate "mr" and the X-coordinate "ml", and assigns it to the X-coordinate (Xc) of the target point.
  • In step S 109, the CPU 201 assigns "Xc" which is obtained in step S 108 and the Y-coordinate which is obtained in step S 83 in correspondence with the maximum value, respectively to "m" and "n".
  • In step S 112, the CPU 201 assigns the current value of "n" to "md".
  • The endmost Y-coordinate of the differential data greater than the predetermined threshold value "Th" is obtained by repeating steps S 110 to S 112 while scanning the Y-axis from the maximum value position in the positive direction.
  • In step S 113, the CPU 201 assigns to "n" the Y-coordinate which is obtained in step S 83 in correspondence with the maximum value.
  • In step S 114, the CPU 201 decrements "n" by one. If the differential data Dif[m][n] is greater than the predetermined threshold value "Th", the CPU 201 proceeds to step S 116; otherwise it proceeds to step S 117 (step S 115).
  • In step S 116, the CPU 201 assigns the current value of "n" to "mu". The endmost Y-coordinate of the differential data greater than the predetermined threshold value "Th" is obtained by repeating steps S 114 to S 116 while scanning the Y-axis from the maximum value position in the negative direction.
  • In step S 117, the CPU 201 calculates the center coordinate between the Y-coordinate "md" and the Y-coordinate "mu", and assigns it to the Y-coordinate (Yc) of the target point.
  • In this way, the coordinates (Xc, Yc) of the target point of the operation article 150 are calculated.
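  • The whole span-midpoint calculation of FIG. 39 can be condensed into the following C sketch; function and parameter names are assumptions.

```c
/* Sketch of steps S100-S117: scan outwards from the maximum-value pixel along
   each axis while the differential data stays above "Th", then take the
   midpoint of the bright span.  Dif is indexed as Dif[X][Y], as in the text. */
void calc_target_point(int Dif[32][32], int Th, int maxX, int maxY,
                       int *Xc, int *Yc)
{
    int mr = maxX, ml = maxX, md = maxY, mu = maxY;

    while (mr + 1 < 32 && Dif[mr + 1][maxY] > Th) mr++;  /* steps S101-S103 */
    while (ml - 1 >= 0 && Dif[ml - 1][maxY] > Th) ml--;  /* steps S105-S107 */
    *Xc = (mr + ml) / 2;                                 /* step S108 */

    while (md + 1 < 32 && Dif[*Xc][md + 1] > Th) md++;   /* steps S110-S112 */
    while (mu - 1 >= 0 && Dif[*Xc][mu - 1] > Th) mu--;   /* steps S113-S116 */
    *Yc = (md + mu) / 2;                                 /* step S117 */
}
```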
  • FIG. 40 is a flow chart showing the game process flow in step S 3 of FIG. 29 .
  • The CPU 201 checks a music end flag (refer to step S 196 of FIG. 43); if the music has ended, the game process is terminated, and conversely, if the music has not ended yet, the process proceeds to step S 121.
  • In step S 121, the CPU 201 registers the x-coordinate Px[M] and the y-coordinate Py[M] of the target point of the operation article 150 as the display coordinates of the cursor 105 on the screen 91.
  • the CPU 201 repeats the process between step S 122 and step S 144 twice.
  • “j” represents a guidance display number “J” (refer to FIG. 43 ).
  • step S 123 the CPU 201 checks a guidance start flag GF[j] (refer to step S 194 of FIG. 43 ). If the guidance start flag GF[j] is turned on, the CPU 201 proceeds to step S 125 , and if it is turned off the CPU 201 proceeds to step S 144 (step S 124 ). In step S 125 , the CPU 201 checks a frame counter C[j]. If the frame counter C[j] is greater than “0”, the CPU 201 proceeds to step S 128 , conversely if the frame counter C[j] is equal to “0”, the CPU 201 proceeds to step S 127 (step S 126 ).
  • In step S 127, in accordance with the note number NN[j], the CPU 201 registers the animation information of the direction guide "g" or the path guide "rg" together with the animation information of the position guide "G".
  • The animation information is registered only when the frame counter C[j] is "0": once the animation information is registered, the animation proceeds in accordance with the registered information, so the registration is needed only when the animation starts.
  • In step S 128, the CPU 201 checks the note number NN[j]; if it is the note number designating the turn of the cursor 105 (refer to FIG. 20(b) and FIG. 20(c)), the process proceeds to step S 131, and otherwise (refer to FIG. 20(a)) it proceeds to step S 129.
  • In step S 129, the CPU 201 checks the frame counter C[j]. If the frame counter C[j] is greater than or equal to the predetermined number "f 1" of frames, the CPU 201 proceeds to step S 131; otherwise it proceeds to step S 141 (step S 130).
  • In step S 131, the CPU 201 determines whether or not the cursor 105 is correctly manipulated in correspondence with the guides (the position guide "G", the direction guide "g" and the path guide "rg") (success determination).
  • As understood from step S 128 and step S 131, in the case where the note number NN[j] designates the turn of the cursor 105, the success determination of the manipulation of the cursor 105 is performed from just after the position guide "G" and the path guide "rg" start being displayed (when the frame counter C[j] is "0"), irrespective of the value of the frame counter C[j].
  • On the other hand, in the other cases, the success determination of the manipulation of the cursor 105 is performed after the predetermined number "f 1" of frames (for example, 30 frames) has elapsed since the position guide "G" and the direction guide "g" started being displayed (when the frame counter C[j] was "0") (refer to FIG. 28).
  • In step S 133, the CPU 201 registers the dance animation information with reference to the note number NN[j] and a dance speed flag DF (refer to step S 193, step S 190 and step S 192 of FIG. 43). Also, in the case where the manipulation is successful, the CPU 201 changes the center position of the background screen 140 and modifies the corresponding data of the array elements PA and the array elements CA with reference to the note number NN[j] and the dance speed flag DF in order to scroll the background 110. Furthermore, the CPU 201 registers the storage location information of the animation table of the position guide "G" which is used when the manipulation succeeds.
  • In step S 134, the CPU 201 checks the note number NN[j]; if it is the note number designating the turn of the cursor 105, the process proceeds to step S 137, and otherwise it proceeds to step S 135.
  • In step S 135, the CPU 201 checks the frame counter C[j]. If the frame counter C[j] is greater than or equal to a predetermined number "f 2" of frames, the CPU 201 proceeds to step S 137; otherwise it proceeds to step S 138 (step S 136).
  • In step S 137, the CPU 201 adds "3" to a score "S". On the other hand, in step S 138, "1" is added to the score "S".
  • When the cursor 105 is located in the area of the position guide "G" within a predetermined period (for example, 10 frames) from the time point the predetermined number "f 2" of frames (for example, 50 frames) after the position guide "G" and the direction guide "g" start being displayed (when the frame counter C[j] is "0"), it is determined that the cursor 105 has been manipulated with the best timing, so "3" is added.
  • When the cursor 105 is located in the area of the position guide "G" after the predetermined number "f 1" of frames but before the predetermined number "f 2" of frames, it is determined that the cursor 105 has been manipulated in an ordinarily successful manner, so "1" is added. Also, when the manipulation is performed in correspondence with the position guide "G" and the path guide "rg" (the guide designating the turn of the cursor 105), "3" is equally added to the score "S".
  • In step S 139, the CPU 201 checks the frame counter C[j]. If the frame counter C[j] is equal to a predetermined number "f 3" of frames (for example, 60 frames), the CPU 201 proceeds to step S 142; otherwise it proceeds to step S 141 (step S 140). In step S 141, the CPU 201 increments the frame counter C[j] by one. On the other hand, in step S 142, the CPU 201 sets the frame counter C[j] to "0". In step S 143, the CPU 201 turns off the guidance start flag GF[j]. Incidentally, the predetermined number "f 3" is used to define the end of the success determination.
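  • The timing windows described above can be distilled into a small scoring helper; the sketch below uses the example values f 1 = 30, f 2 = 50 and f 3 = 60 frames, and its names are assumptions.

```c
/* Illustrative scoring windows derived from the description; called only when
   the success determination of step S 131 has succeeded. */
enum { F1 = 30, F2 = 50, F3 = 60 };  /* example frame counts from the text */

int score_for_success(int frame_counter, int designates_turn)
{
    if (designates_turn)
        return 3;            /* turn guides score "3" regardless of the counter */
    if (frame_counter < F1)
        return 0;            /* too early: success is not yet determined */
    if (frame_counter >= F2)
        return 3;            /* best timing: from f2 up to f3 (10 frames) */
    return 1;                /* ordinary success between f1 and f2 */
}
```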
  • FIG. 41 is a flow chart showing the interrupt process flow. As shown in FIG. 41 , in step S 150 , the CPU 201 performs the playback of music. In step S 151 , the CPU 201 performs the process of registering the guides (the position guide “G”, the direction guide “g” and the path guide “rg”).
  • FIG. 42 is a flow chart showing the process flow of the playback of music in step S 150 of FIG. 41 .
  • In step S 160, the CPU 201 checks the execution stand-by counter for music. If the value of the execution stand-by counter for music is "0", the process proceeds to step S 162; if it is not "0", the process proceeds to step S 170 (step S 161). In step S 170, the CPU 201 decrements the execution stand-by counter for music.
  • In step S 162, the CPU 201 reads and interprets the command pointed to by the musical score data pointer for music. If the command is "Note On", the process proceeds to step S 164 (step S 163). On the other hand, if the command is not "Note On", i.e., if it is "Waiting", the process proceeds to step S 165. In step S 165, the CPU 201 sets a waiting time to the execution stand-by counter for music.
  • In step S 164, the CPU 201 instructs the sound processor 203 to start outputting a sound corresponding to the note number as read.
  • In step S 166, the CPU 201 increments the musical score data pointer for music.
  • In step S 167, the CPU 201 checks the remaining sound outputting time corresponding to the note number associated with the sound being output. If the remaining sound outputting time is "0", the process proceeds to step S 169; otherwise it proceeds to step S 151 of FIG. 41 (step S 168). In step S 169, the CPU 201 instructs the sound processor 203 to perform the sound termination process for the note number having the remaining sound outputting time of "0".
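  • One tick of this score interpreter might be sketched as follows; the command representation and helper names are assumptions, and the sound termination of steps S 167 to S 169 is only noted in a comment.

```c
/* Sketch of one tick of the music playback of FIG. 42 (steps S160-S170). */
enum { CMD_NOTE_ON, CMD_WAITING };

typedef struct {
    int type;   /* CMD_NOTE_ON or CMD_WAITING */
    int value;  /* note number, or waiting time */
} ScoreCommand;

extern void sound_note_on(int note_number);  /* handed to sound processor 203 */

ScoreCommand *music_ptr;  /* musical score data pointer for music */
int music_standby;        /* execution stand-by counter for music */

void playback_music_tick(void)
{
    if (music_standby != 0) {              /* steps S160-S161 */
        music_standby--;                   /* step S170 */
        return;
    }
    if (music_ptr->type == CMD_NOTE_ON)    /* steps S162-S163 */
        sound_note_on(music_ptr->value);   /* step S164 */
    else
        music_standby = music_ptr->value;  /* step S165: "Waiting" */
    music_ptr++;                           /* step S166 */
    /* steps S167-S169 (terminating notes whose remaining time is 0) omitted */
}
```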
  • FIG. 43 is a flow chart showing the process flow of registering guides in step S 151 of FIG. 41 .
  • In step S 180, the CPU 201 checks the execution stand-by counter for guide. If the value of the execution stand-by counter for guide is "0", the process proceeds to step S 182; if it is not "0", the process proceeds to step S 198 (step S 181).
  • In step S 198, the CPU 201 decrements the execution stand-by counter for guide.
  • In step S 182, the CPU 201 reads and interprets the command pointed to by the musical score data pointer for guide. If the command is "Note On", the CPU 201 proceeds to step S 184 (step S 183). On the other hand, if the command is not "Note On", i.e., if it is "Waiting", the CPU 201 proceeds to step S 197. In step S 197, the CPU 201 sets the execution stand-by counter for guide to a waiting time.
  • If the note number as read designates the end of music (step S 184), the CPU 201 turns on the music end flag in step S 196.
  • If the note number designates the start of music, the CPU 201 proceeds to step S 195; otherwise the CPU 201 proceeds to step S 186 (step S 185).
  • If the guidance display number "J" is "1", the CPU 201 sets the guidance display number "J" to "0" in step S 188; conversely, if the guidance display number "J" is not "1" (i.e., it is "0"), the CPU 201 sets the guidance display number "J" to "1" in step S 187. Since the guidance of a certain position guide "G" and the guidance of another position guide "G" are started at different timings but can continue overlappingly for a certain period, the guidance display number "J" is used to perform the game process in step S 3 of FIG. 29.
  • In step S 190, the CPU 201 sets the dance speed flag DF to "1" (a high speed dance animation).
  • In step S 192, the CPU 201 sets the dance speed flag DF to "0" (a low speed dance animation).
  • In the case where the note number is none of the note number designating the end of music, the note number designating the start of music, the note number designating a high speed dance animation, and the note number designating a low speed dance animation, it is a note number which designates a type of a guide (FIG. 20(a) through FIG. 20(c)), and thereby the CPU 201 assigns the note number to the array element NN[J] in step S 193.
  • In step S 194, the CPU 201 turns on the guidance start flag GF[J].
  • In step S 195, the CPU 201 increments the musical score data pointer for guide.
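  • The note-number dispatch of FIG. 43 can be sketched as below; the special note-number constants are assumptions (the actual values are defined by the musical score data), and the toggling of "J" implements the two overlapping guidance slots described above.

```c
/* Sketch of steps S184-S196: dispatch on the note number read from the guide
   score.  The NOTE_* constants are hypothetical placeholder values. */
enum { NOTE_END_OF_MUSIC = 1, NOTE_START_OF_MUSIC = 2,
       NOTE_HIGH_SPEED_DANCE = 3, NOTE_LOW_SPEED_DANCE = 4 };

int music_end_flag, J, DF;  /* end flag, guidance display number, dance speed */
int NN[2], GF[2];           /* note number and guidance start flag per slot */

void register_guide(int note_number)
{
    switch (note_number) {
    case NOTE_END_OF_MUSIC:     music_end_flag = 1; break;  /* step S196 */
    case NOTE_START_OF_MUSIC:   break;                      /* nothing to register */
    case NOTE_HIGH_SPEED_DANCE: DF = 1; break;              /* step S190 */
    case NOTE_LOW_SPEED_DANCE:  DF = 0; break;              /* step S192 */
    default:                    /* a guide-type note number (FIG. 20) */
        J = 1 - J;              /* steps S186-S188: toggle the guidance slot */
        NN[J] = note_number;    /* step S193 */
        GF[J] = 1;              /* step S194: turn on the guidance start flag */
        break;
    }
    /* step S195: the caller then increments the guide score pointer */
}
```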
  • FIG. 44 is a view for showing an example of the game screen in which another example of the direction guide “g” is applied.
  • A direction guide "g 20" is displayed in the form of a belt extending from the position guide "G 1" toward the position guide "G 2".
  • This direction guide "g 20" grows from the position guide "G 1" toward the position guide "G 2" as time passes.
  • The direction in which the cursor 105 is to be manipulated is guided in terms of the direction in which this direction guide "g 20" grows.
  • The direction guide "g 20" is represented by gradually changing the color of the path from the position guide "G 1" to the position guide "G 2".
  • FIG. 45 is a view for showing an example of the game screen in which a further example of the direction guide “g” is applied.
  • a direction guide “g 30 ” is displayed on the game screen between one position guide “G” and another position guide “G”.
  • This direction guide “g 30 ” consists of five partial paths “g 31 ” to “g 35 ”.
  • the five partial paths “g 31 ” to “g 35 ” change color sequentially from the position guide “G 1 ” toward the position guide “G 2 ”.
  • the change in color is illustrated by hatching.
  • The manipulation direction of the cursor 105 can thus be guided by the direction in which the partial paths "g 31" to "g 35" change color in sequence. Also, in the case of this example, if a predetermined time period after the color change of the partial path "g 35" (the partial path adjacent to the position guide "G 2", which guides the destination position of the cursor 105) is used as the period for performing the success determination of manipulating the cursor 105, it is also possible to guide the manipulation timing of the cursor 105 by the direction guide "g 30".
  • FIG. 46 is a view for showing an example of the game screen in which a still further example of the direction guide “g” is applied.
  • a direction guide “g 40 ” is displayed on the game screen between the position guide “G 1 ” and the position guide “G 2 ”.
  • This direction guide “g 40 ” moves from the position guide “G 1 ” to the position guide “G 2 ” as time passes.
  • the direction in which the cursor 105 is to be moved is guided in terms of the direction in which this direction guide “g 40 ” moves.
  • the display of images (the dance object 106 and the background 110 in the above example) is controlled in accordance with the guidance by the guides.
  • the display of images is controlled in accordance with the manipulation of the cursor 105 .
  • the display of the images is controlled in accordance with the manipulation of the operation article 150 .
  • The image of the operation article 150, which is intermittently lighted by the stroboscope, is captured by the imaging unit 13 in order to obtain the state information of the operation article 150. Because of this, no circuit which is driven by a power supply need be provided within the operation article 150 for obtaining the state information of the operation article 150. Furthermore, this music game apparatus 1 serves to automatically play music.
  • the player 94 can enjoy, together with the music, images which are displayed in synchronization with the manipulation of the operation article 150 by manipulating the operation article 150 having a simple structure.
  • the operation article 150 is manipulated in synchronization with music as long as the player 94 manipulates the cursor 105 in correspondence with the guides. Accordingly, the player 94 can enjoy the manipulation of the operation article 150 in synchronization with music.
  • the high speed processor 200 scrolls the background 110 to the left, and prepares the dance animation information and the dance animation table for turning the dance object 106 in the counter clockwise direction. Also, for example, in correspondence with the note numbers “ 45 ” and “ 64 ”, the high speed processor 200 scrolls the background 110 to the right, and prepares the dance animation information and the dance animation table for turning the dance object 106 in the clockwise direction. Furthermore, for example, in correspondence with the note numbers “ 76 ” and “ 77 ”, the high speed processor 200 scrolls the background 110 in the downward direction, and prepares the dance animation information and the dance animation table for turning the dance object 106 in the counter clockwise direction.
  • the high speed processor 200 scrolls the background 110 in the upward direction, and prepares the dance animation information and the dance animation table for turning the dance object 106 in the clockwise direction.
  • The dance animation information and the dance animation table for widely turning the dance object 106 in the clockwise direction are prepared in correspondence with the note number "53".
  • The dance animation information and the dance animation table for widely turning the dance object 106 in the counter clockwise direction are prepared in correspondence with the note number "57".
  • The above types of note numbers are note numbers for controlling the display of the guides, and thereby the background 110 and the dance object 106 are controlled in accordance with the guides; in other words, the background 110 and the dance object 106 are controlled in accordance with the manipulation of the operation article 150.
  • the position guide “G” serves to guide the manipulation timing and the destination position of the cursor 105 .
  • the high speed processor 200 controls the display of images (the dance object 106 and the background 110 in the case of the above example) in correspondence with the direction toward the destination position which is guided by the position guide “G”.
  • the path guide “rg” serves to guide the moving path, the moving direction and the manipulation timing of the cursor 105 . Accordingly, when the player 94 manipulates the operation article 150 in order to move the cursor 105 in the manipulation timing guided by the path guide “rg”, in the moving direction guided by the path guide “rg” and along the moving path guided by the path guide “rg”, the display of images (the dance object in the case of the above example) is controlled in correspondence with the path guide “rg”. As a result, it is possible to enjoy, together with music, the images which are synchronized with the cursor 105 which is moving in association with the motion of the operation article 150 (refer to FIG. 13 ).
  • In this manner, it is determined whether or not the cursor 105, which moves in association with the operation article 150, is correctly manipulated in correspondence with the guidance of the path guide "rg" (refer to FIG. 25).
  • The position guide "G" is displayed in each of a plurality of positions which are determined in advance on the screen 91. Then, the high speed processor 200 changes the appearance of the position guide "G" in a timing on the basis of music (in the example of FIG. 12, the animation of a flower blooming). Accordingly, the player 94 can easily recognize the position and the direction to which the cursor 105 is to be moved with reference to the change in appearance of the position guide "G".
  • the direction guide “g” and the path guide “rg” are expressed in images with which it is possible to visually recognize the motion from the first predetermined position to the second predetermined position on the screen 91 .
  • the manipulation of the cursor 105 is guided not only by the position guide “G” but also by the direction guide “g” and the path guide “rg”. Accordingly, the player 94 can clearly recognize the direction and path of the cursor 105 to be moved. More specific description is as follows.
  • the direction guide “g” and the path guide “rg” are expressed by the change in appearance of a plurality of objects (in the form of spheres in the case of the examples of FIG. 12 and FIG. 13 ) which are arranged in the path having a start point at the first predetermined position and an end point at the second predetermined position on the screen 91 .
  • the player 94 can easily recognize the direction and the path to which the cursor 105 is to be moved with reference to the change in appearance of the plurality of the objects.
  • the direction guide “g” is expressed by the motion of an object (in the form of a bird in the case of the example of FIG. 46 ) from the first predetermined position to the second predetermined position on the screen 91 .
  • the player 94 can easily recognize the direction and the path to which the cursor 105 is to be moved with reference to the motion of the object.
  • the direction guide “g” is expressed by the change in appearance of the path having a start point at the first predetermined position and an end point at the second predetermined position on the screen 91 (refer to FIG. 44 and FIG. 45 ).
  • the player 94 can easily recognize the direction and the path to which the cursor 105 is to be moved with reference to the change in appearance of the path.
  • The high speed processor 200 can calculate, as the state information of the operation article 150, any one of or any combination of two or more of speed information, moving direction information, moving distance information, velocity vector information, acceleration information, movement locus information, area information, and positional information.
  • the state information of the operation article 150 can be obtained by intermittently emitting infrared light to the operation article 150 to which the reflection sheet 155 is attached and capturing the image thereof. Because of this, no circuit which is driven by a power supply need be provided within the operation article 150 for obtaining the state information of the operation article 150 . Accordingly, it is possible to improve the manipulability and reliability of the operation article 150 , and to reduce the cost.
  • the dance object 106 and the background 110 are explained as images (follow-up images) which are controlled to follow the motion of the operation article 150 .
  • the present invention is not limited thereto, but any arbitrary object can be selected as such a follow-up image.
  • the configuration of the operation article is not limited to those as described above as long as a reflecting object is provided.

Abstract

When a cursor (105: FIG. 12) is correctly manipulated in correspondence with guides (“G1” to “G4” and “g1” to “g5”: FIG. 12), the display of the dance object (106: FIG. 12) and the background (110: FIG. 12) is controlled in accordance with the manipulation direction of the cursor. The position of the operation article on a screen (91: FIG. 1) is obtained as the coordinates of the cursor by intermittently irradiating an operation article (150: FIG. 1) by a stroboscope and capturing the image thereof by an imaging unit (13: FIG. 1).

Description

    FIELD OF THE INVENTION
  • The present invention relates to a music game apparatus which displays images following the motion of an operation article and the related arts.
  • BACKGROUND ART
  • A music conducting game apparatus is disclosed in Patent document 1 (Japanese Patent Published Application No. 2002-263360). This music conducting game apparatus is provided with a phototransmitter unit at the tip of a baton controller, and a photoreceiver unit in a lower position of a monitor. The motion of the baton controller is detected by such a configuration.
  • When a game is started, an operation guidance image is displayed on the monitor in order to instruct the direction and timing of swinging the baton controller while the sound of music performance is output. This sound of performance is output irrespective of the manipulation of the baton controller. On the other hand, a baton responsive sound is output only when the baton controller is manipulated in accordance with the direction and timing as instructed. This baton responsive sound corresponds to fragments into which a certain performance part is divided by a predetermined length. As a result, each time the player manipulates the baton controller in accordance with the direction and timing as instructed, the baton responsive sound corresponding thereto is output.
  • Patent document 2 (Japanese Patent Published Application No. Hei 10-143151) discloses a conducting apparatus. In this conducting apparatus, while a mouse is manipulated in the same manner as a baton, music parameters such as a tempo, an accent and dynamics are calculated with reference to the trajectory of the mouse. Then, the music parameters as calculated are reflected in the music and image as output. For example, in the case where the motion picture of a steam train is displayed, the speed of the steam train is controlled to follow the tempo as calculated, the variation of the speed is controlled to follow the accent as calculated, and the amount of smoke of the steam train is controlled to follow the dynamics as calculated.
  • As explained above, the main purpose of the music conducting game apparatus of Patent document 1 is apparently for the player to perform music. Likewise, in the conducting apparatus of Patent document 2, since the main purpose is for the player to perform music, the moving information of the mouse is converted into music parameters which are then reflected in the music and image as output.
  • As has been discussed above, in the case of the conventional apparatuses having the main purpose of playing music performance by the player, the image which is displayed (a steam train in the case of the above example) is not interesting enough, and little importance is attached to such an image that can be enjoyed by the player.
  • Furthermore, with respect to the baton controller and the mouse which are the operation articles manipulated by the player, there are the following facts. The baton controller of Patent document 1 is provided with the phototransmitter unit, and thereby it is indispensable to use an electronic circuit. Accordingly, the cost of the baton controller rises, and it can be the cause of trouble. Still further, the manipulability is degraded. Particularly, since the baton controller is used by swinging, it is desirable to dispense with an electronic circuit and simplify the configuration. In addition to this, the mouse of Patent document 2 can be moved only on a plane surface, so that there are substantial restrictions on the manipulation; moreover, it has the same problems as the baton controller of Patent document 1.
  • SUMMARY OF THE INVENTION
  • Accordingly, it is an object of the present invention to provide a music game apparatus and the related arts in which the player can enjoy images, which are displayed in synchronization with the manipulation of an operation article, together with music by manipulating the operation article having a simple structure, while automatically playing the music without relation to the player.
  • In accordance with an aspect of the present invention, a music game apparatus operable to automatically play music comprises: a stroboscope operable to irradiate an operation article manipulated by a player with light in a predetermined cycle; an imaging unit operable to generate a lighted image signal and an unlighted image signal by capturing images of the operation article respectively when said stroboscope is lighted and unlighted; a differential signal generating unit operable to generate a differential signal between the lighted image signal and the unlighted image signal; a state information calculating unit operable to calculate the state information of the operation article on the basis of the differential signal; a guide control unit operable to control the display of a guide for the manipulation of a cursor, which moves in association with the operation article, in a timing on the basis of the music; a cursor control unit operable to control the display of the cursor on the basis of the state information of the operation article; and a follow-up image control unit operable to control the display of an image in accordance with guidance by the guide when the cursor is correctly manipulated by the operation article in correspondence with the guide, wherein said follow-up image control unit determines whether or not the cursor is correctly manipulated by the operation article in correspondence with the guide, on the basis of the state information of the operation article and the information about the guide.
  • In accordance with this configuration, if the cursor is correctly manipulated in correspondence with the guide, the display of the image is controlled in accordance with the guidance by the guide. In this case, since the cursor is manipulated in correspondence with the guidance by the guide, the display of the image is controlled in accordance with the manipulation of the cursor. In other words, since the cursor moves in association with the operation article, the display of the image is controlled in accordance with the manipulation of the operation article. The state information of the operation article is obtained by capturing the image of the operation article, which is intermittently lighted by the stroboscope. Because of this, no circuit which is driven by a power supply need be provided within the operation article for obtaining the state information of the operation article. Furthermore, this music game apparatus serves to automatically play music.
  • As a result, while automatically playing music without relation to the player, the player can enjoy, together with the music, images which are displayed in synchronization with the manipulation of the operation article by manipulating the operation article having a simple structure.
  • Also, since the guide is controlled in the timing on the basis of music, the operation article is manipulated in synchronization with music as long as the player manipulates the cursor in correspondence with the guide. Accordingly, the player can enjoy the manipulation of the operation article in synchronization with music.
  • In this case, the “manipulation” of the operation article means moving the operation article itself (for example, changing the position thereof), but does not mean pressing a switch, moving an analog stick, and so forth.
  • In the above music game apparatus, the guide is operable to guide the cursor to a destination position in a manipulation timing, and wherein said follow-up image control unit is operable to control the display of the image in correspondence with the direction of the destination position as guided by the guide when the cursor is correctly manipulated by the operation article in correspondence with the guide.
  • In accordance with this configuration, when the player manipulates the operation article in order to move the cursor to the destination position guided by the position guide in the manipulation timing guided by the guide, the display of images is controlled in correspondence with the direction toward the destination position of the cursor guided by the guide. As a result, it is possible to enjoy, together with music, the images which are synchronized with the cursor which is moving in association with the motion of the operation article.
  • In the above music game apparatus, said state information calculating unit is operable to calculate the position of the operation article as the state information on the basis of the differential signal, and wherein said follow-up image control unit is operable to determine that the cursor, which moves in association with the operation article, is correctly manipulated in correspondence with the guide if the position of the operation article as calculated by said state information calculating unit is located in an area corresponding to the guidance by the guide within a period corresponding to the guidance by the guide.
  • In accordance with this configuration, it is possible to determine the correctness of the manipulation of the cursor on the basis of the position of the operation article which can be calculated by a simple process.
  • In the above music game apparatus, the guide is operable to guide the moving path, moving direction and manipulation timing of the cursor.
  • In accordance with this configuration, when the player manipulates the operation article in order to move the cursor in the manipulation timing guided by the guide, in the moving direction guided by the guide and along the moving path guided by the guide, the display of images is controlled in correspondence with the guide. As a result, it is possible to enjoy, together with music, the images which are synchronized with the cursor which is moving in association with the motion of the operation article.
  • In the above music game apparatus, said state information calculating unit is operable to calculate the position of the operation article as the state information on the basis of the differential signal, and wherein said follow-up image control unit is operable to determine that the cursor, which moves in association with the operation article, is correctly manipulated in correspondence with the guide if the position of the operation article as calculated by said state information calculating unit is moved through a plurality of predetermined areas guided by the guide in a predetermined order guided by the guide within a period guided by the guide.
  • In accordance with this configuration, it is possible to determine the correctness of the manipulation of the cursor on the basis of the position of the operation article which can be calculated by a simple process.
  • In the above music game apparatus, the guide is displayed in each of a plurality of positions which are determined in advance on a screen, and wherein the guide control unit is operable to change the appearance of the guide in a timing on the basis of the music.
  • In accordance with this configuration, the player can easily recognize the position and the direction to which the cursor is to be moved with reference to the change of the position guide in appearance.
  • In the present specification, the appearance of the guide is related to either or both of the shape and color of the guide.
  • In the above music game apparatus, the guide is expressed in an image with which it is possible to visually recognize the motion from a first predetermined position to a second predetermined position on a screen, and wherein the guide control unit is operable to control the display of the guide in a timing on the basis of the music.
  • In accordance with this configuration, the player can clearly recognize the direction and path of the cursor to be moved.
  • For example, the guide is expressed by the change in appearance of a plurality of objects which are arranged in a path having a start point at the first predetermined position and an end point at the second predetermined position on the screen.
  • In accordance with this configuration, the player can easily recognize the direction and path of the cursor to be moved with reference to the change in appearance of the plurality of objects.
  • For example, the guide is expressed by an object moving from the first predetermined position to the second predetermined position on the screen.
  • In accordance with this configuration, the player can easily recognize the direction and path of the cursor to be moved with reference to the motion of the object.
  • For example, the guide is expressed by the change in appearance of a path having a start point at the first predetermined position and an end point at the second predetermined position on the screen.
  • In accordance with this configuration, the player can easily recognize the direction and path of the cursor to be moved with reference to the change in appearance of the path.
  • In the above music game apparatus, the state information of the operation article as calculated by said state information calculating unit is any one of or any combination of two or more of speed information, moving direction information, moving distance information, velocity vector information, acceleration information, movement locus information, area information, and positional information.
  • In accordance with this configuration, since a variety of information can be used as the state information of the operation article for determining whether or not the cursor is correctly manipulated in correspondence with the guides, the possibility of expression of guides is greatly expanded, and thereby the design freedom of the game content is also greatly increased.
  • The novel features of the invention are set forth in the appended claims. The invention itself, however, as well as other features and advantages thereof, will be best understood by reading the detailed description of specific embodiments in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a view showing the overall configuration of the music game system in accordance with an embodiment of the present invention.
  • FIG. 2 is a perspective view of the operation article of FIG. 1.
  • FIG. 3(a) is a top view showing the reflection ball of FIG. 2. FIG. 3(b) is a side view for showing the reflection ball as seen from arrow A of FIG. 3(a). FIG. 3(c) is a side view for showing the reflection ball as seen from arrow B of FIG. 3(a).
  • FIG. 4 is a longitudinal section view of the reflection ball of FIG. 2.
  • FIG. 5 is an explanatory schematic diagram for showing one example of the imaging unit of FIG. 1.
  • FIG. 6 is a view showing the electric configuration of the music game apparatus of FIG. 1.
  • FIG. 7 is a block diagram of the high speed processor of FIG. 6.
  • FIG. 8 is a circuit diagram for showing an LED drive circuit and the configuration for transferring pixel data from the image sensor of FIG. 6 to a high speed processor.
  • FIG. 9(a) is a timing diagram of a frame status flag signal FSF as output from the image sensor of FIG. 8. FIG. 9(b) is a timing diagram of a pixel data strobe signal PDS as output from the image sensor of FIG. 8. FIG. 9(c) is a timing diagram of the pixel data D (X, Y) as output from the image sensor of FIG. 8. FIG. 9(d) is a timing diagram of an LED control signal LEDC as output from the image sensor of FIG. 8. FIG. 9(e) is a timing diagram showing the lighting state of the infrared light emitting diodes of FIG. 8. FIG. 9(f) is a timing diagram showing the exposure period of the image sensor of FIG. 8.
  • FIG. 10(a) is an expanded view showing the frame status flag signal FSF of FIG. 9. FIG. 10(b) is an expanded view showing the pixel data strobe signal PDS of FIG. 9. FIG. 10(c) is an expanded view showing the pixel data D (X, Y) of FIG. 9.
  • FIG. 11 is a view for showing an example of the game screen as displayed on the screen of the television monitor of FIG. 1.
  • FIG. 12 is a view for showing another example of the game screen as displayed on the screen of the television monitor of FIG. 1.
  • FIG. 13 is a view for showing a further example of the game screen as displayed on the screen of the television monitor of FIG. 1.
  • FIG. 14 is a view for explaining the sprites forming an object which is displayed on the screen of the television monitor of FIG. 1.
  • FIG. 15 is an explanatory view for showing a background screen to be displayed on the screen of the television monitor of FIG. 1.
  • FIG. 16(a) is an explanatory view for showing the background screen of FIG. 15 before scrolling it. FIG. 16(b) is an explanatory view for showing the background screen after scrolling it.
  • FIG. 17 is a schematic representation of a program and data stored in the ROM of FIG. 6.
  • FIG. 18 is a schematic representation of one example of the first musical score data of FIG. 17.
  • FIG. 19 is a schematic representation of one example of the second musical score data of FIG. 17.
  • FIG. 20(a) is a view showing the correspondence between note numbers and the directions in which the cursor is guided. FIG. 20(b) is another view showing the correspondence between note numbers and the directions in which the cursor is guided. FIG. 20(c) is a further view showing the correspondence between note numbers and the directions in which the cursor is guided.
  • FIG. 21(a) is a view for showing an example of the image which is captured by an ordinarily used image sensor without any particular processing. FIG. 21(b) is a view for showing an example of the image which is obtained by level filtering the image signal of FIG. 21(a) by a certain threshold value. FIG. 21(c) is a view for showing an example of the image which is captured by the image sensor through the infrared filter with the illumination and level filtered by a certain threshold value. FIG. 21(d) is a view for showing an example of the image which is captured by the image sensor through the infrared filter without the illumination and level filtered by a certain threshold value. FIG. 21(e) is a view for showing an example of the differential signal between the image signal with the illumination and the image signal without the illumination.
  • FIG. 22 is a view for explaining the process of calculating the coordinates of the target point of the operation article of FIG. 1.
  • FIG. 23(a) is a view for explaining the process of scanning in the X-direction when the coordinates of the target point of the operation article of FIG. 1 are calculated on the basis of the coordinates of the pixel having the maximum luminance value. FIG. 23(b) is a view for explaining the process of starting scanning in the Y-direction when the coordinates of the target point of the operation article of FIG. 1 are calculated on the basis of the coordinates of the pixel having the maximum luminance value. FIG. 23(c) is a view for explaining the process of scanning in the Y-direction when the coordinates of the target point of the operation article of FIG. 1 are calculated on the basis of the coordinates of the pixel having the maximum luminance value. FIG. 23(d) is an explanatory view for showing the result of the process of calculating the coordinates of the target point of the operation article on the basis of the coordinates of the pixel having the maximum luminance value.
  • FIG. 24 is a view for explaining a target point existence area determination process (1) performed by the CPU 201.
  • FIG. 25 is a view for explaining a target point existence area determination process (2) performed by the CPU 201.
  • FIG. 26 is a view for explaining the registration process of the animations of the position guide "G", the direction guide "g" and the path guide "rg" in accordance with the present embodiment.
  • FIG. 27 is a view for showing an example of the animation table which is designated by the animation table storage location information of FIG. 26.
  • FIG. 28 is a timing diagram for explaining the relationship among the first musical score data, the second musical score data, the position guide "G", the direction guide "g", the judgment of manipulation and the dance animation in accordance with the present embodiment.
  • FIG. 29 is a flow chart showing the entire process flow of the music game apparatus 1 of FIG. 1.
  • FIG. 30 is a flow chart showing the process flow for the initial settings of the system in step S1 of FIG. 29.
  • FIG. 31 is a flow chart showing the process flow for sensor initial settings in step S14 of FIG. 30.
  • FIG. 32 is a flow chart showing the command transmission process in step S21 of FIG. 31.
  • FIG. 33(a) is a timing diagram showing the register setting clock CLK of FIG. 8. FIG. 33(b) is a timing diagram showing the register data of FIG. 8.
  • FIG. 34 is a flow chart showing the register setting process in step S23 of FIG. 31.
  • FIG. 35 is a flow chart showing the process of calculating the state information in step S2 of FIG. 29.
  • FIG. 36 is a flow chart showing the process flow of acquiring a pixel data group in step S50 of FIG. 35.
  • FIG. 37 is a flow chart showing the process flow of acquiring pixel data in step S61 of FIG. 36.
  • FIG. 38 is a flow chart showing the process flow of extracting a target point in step S51 of FIG. 35.
  • FIG. 39 is a flow chart showing the process flow of calculating the coordinates of a target point in step S85 of FIG. 38.
  • FIG. 40 is a flow chart showing the game process flow in step S3 of FIG. 29.
  • FIG. 41 is a flow chart showing the interrupt process in accordance with the present embodiment.
  • FIG. 42 is a flow chart showing the process flow of the playback of music in step S150 of FIG. 41.
  • FIG. 43 is a flow chart showing the process flow of registering guides in step S151 of FIG. 41.
  • FIG. 44 is a view for showing another example of the direction guide in accordance with the present embodiment.
  • FIG. 45 is a view for showing a further example of the direction guide in accordance with the present embodiment.
  • FIG. 46 is a view for showing a still further example of the direction guide in accordance with the present embodiment.
  • BEST MODE FOR CARRYING OUT THE INVENTION
  • In what follows, an embodiment of the present invention will be explained in conjunction with the accompanying drawings. Meanwhile, like references indicate the same or functionally similar elements throughout the drawings, and therefore redundant explanation is not repeated.
  • FIG. 1 is a view showing the overall configuration of the music game system in accordance with the embodiment of the present invention. As shown in FIG. 1, this music game system includes a music game apparatus 1, an operation article 150 and a television monitor 90.
  • The housing 19 of the music game apparatus 1 includes an imaging unit 13 therein. The imaging unit 13 includes four infrared light emitting diodes 15 and an infrared filter 17. The light emission units of the infrared light emitting diodes 15 are exposed from the infrared filter 17.
  • The music game apparatus 1 is supplied with a DC power voltage from an AC adapter 92. Alternatively, a battery cell (not shown in the figure) can be used to apply the DC power voltage in place of the AC adapter 92.
  • The television monitor 90 includes a screen 91 at the front side thereof. The television monitor 90 and the music game apparatus 1 are connected by an AV cable 93. Incidentally, as illustrated in FIG. 1, the music game apparatus 1 is placed for example on the upper surface of the television monitor 90.
  • When the player 94 turns on the power switch (not shown in the figure) which is provided in the back side of the music game apparatus 1, a game screen is displayed on the screen 91. The player 94 manipulates the operation article 150 in accordance with the guidance of a game screen to run a game. In the present specification, the “manipulation” of the operation article 150 means moving the operation article itself (for example, changing the position thereof), but does not mean pressing a switch, moving an analog stick, and so forth.
  • The infrared light emitting diodes 15 of the imaging unit 13 intermittently emit infrared light. The infrared light emitted from the infrared light emitting diodes 15 is reflected by the reflection sheet (to be described below) attached to this operation article 150, and input to the imaging device (to be described below) located inside the infrared filter 17. In this way, the image of the operation article 150 is intermittently captured. Accordingly, the music game apparatus 1 can intermittently acquire an image signal of the operation article 150 which is moved by the player 94. The music game apparatus 1 analyzes the image signals and reflects the analysis result in the game. The reflection sheet which is used in the present embodiment is for example a retroreflective sheet.
  • FIG. 2 is a perspective view for showing the operation article 150 of FIG. 1. As shown in FIG. 2, the operation article 150 comprises the reflection ball 151 fixed to the tip of a stick 152. The infrared light from the infrared light emitting diodes 15 is reflected by this reflection ball 151. The details of the reflection ball 151 will be explained.
  • FIG. 3(a) is a top view showing the reflection ball 151 of FIG. 2, FIG. 3(b) is a side view for showing the reflection ball 151 as seen from arrow A of FIG. 3(a), and FIG. 3(c) is a side view for showing the reflection ball 151 as seen from arrow B of FIG. 3(a).
  • As illustrated in FIG. 3(a) through FIG. 3(c), the reflection ball 151 comprises a spherical inner shell 154 which is fixedly located inside a spherical outer shell 153 of a transparent color (inclusive of a semi-transparent, a colored-transparent and colorless transparent). The spherical inner shell 154 is provided with a reflection sheet 155 attached thereto. This reflection sheet 155 serves to reflect infrared light from the infrared light emitting diodes 15.
  • FIG. 4 is a longitudinal section view taken through the reflection ball 151 of FIG. 2. As illustrated in FIG. 4, the spherical outer shell 153 comprises two semispherical outer shells which are fixed together with bosses 156 and screws (not shown in the figure). The spherical inner shell 154 comprises two semispherical inner shells which are fixed inside the spherical outer shell 153 with bosses 157. In addition, the stick 152 is fixed to the reflection ball 151 by inserting it thereinto. More specifically speaking, the stick 152 is fixed to the reflection ball 151 by placing the stick 152 between the two semispherical outer shells forming the spherical outer shell 153 and the two semispherical inner shells forming the spherical inner shell 154, fixing together the two semispherical outer shells with the bosses 156 and the screws, and fixing together the two semispherical inner shells with the bosses 157.
• FIG. 5 is an explanatory schematic diagram for showing one example of the imaging unit 13 of FIG. 1. As illustrated in FIG. 5, the imaging unit 13 includes a unit base 35 which is molded for example from a plastic material, and a supporting cylinder 36 is attached to the inside of this unit base 35. The supporting cylinder 36 is provided with a horn-shaped opening 41 formed in its upper surface, with an inner surface shaped in the form of an inverted cone, and with an optical system located in a cylindrical portion below the opening 41 and including a concave lens 39 and a convex lens 37 each of which is molded for example from a plastic material. An image sensor 43 as an imaging device is fixed below the convex lens 37. Accordingly, the image sensor 43 can capture an image in accordance with light which comes in through the opening 41 via the lenses 39 and 37.
  • The image sensor 43 is a low resolution CMOS image sensor (for example, 32 pixels×32 pixels: gray scale). However, this image sensor 43 may be an image sensor having a larger number of pixels, a CCD or the like device. In the following explanation, it is assumed that the image sensor 43 comprises 32 pixels×32 pixels.
• In addition, a plurality (four in this embodiment) of the infrared light emitting diodes 15 are attached to the unit base 35 such that their light output directions are set to the upward direction. Infrared light is emitted to an area over the imaging unit 13 by these infrared light emitting diodes 15. In addition, the infrared filter 17 (a filter capable of passing only infrared light therethrough) is attached to the upper portion of the unit base 35 in order to cover the above opening 41. Then, the infrared light emitting diodes 15 are repeatedly turned on and off (non-lighted) in a continuous manner, as will be described below, so that they serve as a stroboscope. Here, the “stroboscope” is a generic term used to refer to a device serving to intermittently irradiate a moving object. Accordingly, the above image sensor 43 serves to capture an image of an object which is moving in the scope of imaging, i.e., the operation article 150 in the case of this embodiment. Incidentally, as illustrated in FIG. 8 to be described below, the stroboscope is composed mainly of the infrared light emitting diodes 15, an LED drive circuit 75 and a high speed processor 200.
  • In this case, the imaging unit 13 is incorporated in the housing 19 in order that the light receiving surface of the image sensor 43 is inclined from the horizontal surface at a predetermined angle (for example, 90 degrees). Also, the scope of imaging of the image sensor 43 is for example within 60 degrees as determined by the concave lens 39 and the convex lens 37.
  • FIG. 6 is a view showing the electric configuration of the music game apparatus 1 of FIG. 1. As shown in FIG. 6, the music game apparatus 1 includes the image sensor 43, the infrared light emitting diodes 15, a video signal output terminal 47, an audio signal output terminal 49, the high speed processor 200, a ROM (read only memory) 51, and a bus 53.
  • The high speed processor 200 is connected to the bus 53. Furthermore, the ROM 51 is connected to the bus 53. Accordingly, the high speed processor 200 can access the ROM 51 through the bus 53 to read and execute a game program as stored in the ROM 51, and read and process image data and music data as stored in the ROM 51 in order to generate a video signal and an audio signal, which are then output through the video signal output terminal 47 and the audio signal output terminal 49 respectively.
• The operation article 150 is irradiated with infrared light emitted from the infrared light emitting diodes 15, and reflects the infrared light with the reflection sheet 155. The image sensor 43 detects the reflected light from this reflection sheet 155, and outputs an image signal which includes an image of the reflection sheet 155. The analog image signal output from the image sensor 43 is converted into digital data by an A/D converter (to be described below) incorporated in the high speed processor 200. This conversion is performed also in the periods in which infrared light is not emitted. The high speed processor 200 analyzes this digital data, and reflects the analysis result in the game processing.
  • FIG. 7 is a block diagram of the high speed processor 200 of FIG. 6. As illustrated in FIG. 7, this high speed processor 200 includes a central processing unit (CPU) 201, a graphics processor 202, a sound processor 203, a DMA (direct memory access) controller 204, a first bus arbiter circuit 205, a second bus arbiter circuit 206, an internal memory 207, an A/D converter (ADC: analog to digital converter) 208, an input/output control circuit 209, a timer circuit 210, a DRAM (dynamic random access memory) refresh control circuit 211, an external memory interface circuit 212, a clock driver 213, a PLL (phase-locked loop) circuit 214, a low voltage detection circuit 215, a first bus 218, and a second bus 219.
• The CPU 201 takes control of the entire system and performs various types of arithmetic operations in accordance with the program stored in the memory (the internal memory 207, or the ROM 51). The CPU 201 is a bus master of the first bus 218 and the second bus 219, and can access the resources connected to the respective buses.
• The graphics processor 202 is also a bus master of the first bus 218 and the second bus 219, and generates a video signal on the basis of the data as stored in the internal memory 207 or the ROM 51, and outputs the video signal through the video signal output terminal 47. The graphics processor 202 is controlled by the CPU 201 through the first bus 218. Also, the graphics processor 202 has the functionality of outputting an interrupt request signal 220 to the CPU 201.
• The sound processor 203 is also a bus master of the first bus 218 and the second bus 219, and generates an audio signal on the basis of the data as stored in the internal memory 207 or the ROM 51, and outputs the audio signal through the audio signal output terminal 49. The sound processor 203 is controlled by the CPU 201 through the first bus 218. Also, the sound processor 203 has the functionality of outputting an interrupt request signal 220 to the CPU 201.
  • The DMA controller 204 serves to transfer data from the ROM 51 to the internal memory 207. Also, the DMA controller 204 has the functionality of outputting, to the CPU 201, an interrupt request signal 220 indicative of the completion of the data transfer. The DMA controller 204 is also a bus master of the first bus 218 and the second bus 219. The DMA controller 204 is controlled by the CPU 201 through the first bus 218.
  • The internal memory 207 may be implemented with one or any necessary combination of a mask ROM, an SRAM (static random access memory) and a DRAM in accordance with the system requirements. A battery 217 is provided if an SRAM has to be powered by the battery for maintaining the data contained therein. In the case where a DRAM is used, a so-called refresh cycle is periodically performed to maintain the data contained therein.
• The first bus arbiter circuit 205 accepts a first bus use request signal from the respective bus masters of the first bus 218, performs bus arbitration among the requests for the first bus 218, and issues a first bus use permission signal to one of the respective bus masters. Each bus master is permitted to access the first bus 218 after receiving the first bus use permission signal. Here, the first bus use request signal and the first bus use permission signal are shown as the first bus arbitration signals 222 in FIG. 7.
• The second bus arbiter circuit 206 accepts a second bus use request signal from the respective bus masters of the second bus 219, performs bus arbitration among the requests for the second bus 219, and issues a second bus use permission signal to one of the respective bus masters. Each bus master is permitted to access the second bus 219 after receiving the second bus use permission signal. Here, the second bus use request signal and the second bus use permission signal are shown as the second bus arbitration signals 223 in FIG. 7.
  • The input/output control circuit 209 serves to perform the communication with an external input/output device(s) and/or an external semiconductor device(s) through input/output signals. The read and write operations of the input/output signals are performed by the CPU 201 through the first bus 218. Also, the input/output control circuit 209 has the functionality of outputting an interrupt request signal 220 to the CPU 201.
  • This input/output control circuit 209 outputs an LED control signal LEDC for controlling the infrared light emitting diodes 15.
  • The timer circuit 210 has the functionality of periodically outputting an interrupt request signal 220 to the CPU 201 on the basis of a time interval as preset. The settings such as the time interval are performed by the CPU 201 through the first bus 218.
  • The ADC 208 converts analog input signals into digital signals. The digital signals are read by the CPU 201 through the first bus 218. Also, the ADC 208 has the functionality of outputting an interrupt request signal 220 to the CPU 201.
  • This ADC 208 receives pixel data (analog) from the image sensor 43 and converts it into digital data.
  • The PLL circuit 214 generates a high frequency clock signal by multiplication of the sinusoidal signal as obtained from a quartz oscillator 216.
  • The clock driver 213 amplifies the high frequency clock signal as received from the PLL circuit 214 to a sufficient signal level to supply the respective blocks with the clock signal 225.
  • The low voltage detection circuit 215 monitors the power potential Vcc and issues the reset signal 226 of the PLL circuit 214 and the reset signal 227 to the other circuit elements of the entire system when the power potential Vcc falls below a certain voltage. Also, in the case where the internal memory 207 is implemented with an SRAM requiring the power supply from the battery 217 for maintaining data, the low voltage detection circuit 215 serves to issue a battery backup control signal 224 when the power potential Vcc falls below the certain voltage.
  • The external memory interface circuit 212 has the functionality of connecting the second bus 219 to the external bus 53 and the functionality of controlling the bus cycle length of the second bus by issuing a cycle end signal 228.
• The DRAM refresh control circuit 211 periodically and unconditionally gets the ownership of the first bus 218 to perform the refresh cycle of the DRAM at a certain interval. Needless to say, the DRAM refresh control circuit 211 is provided in the case where the internal memory 207 includes a DRAM.
  • In what follows, with reference to FIG. 8 and FIG. 10, the configuration of transferring pixel data from the image sensor 43 to the high speed processor 200 will be explained in detail.
  • FIG. 8 is a circuit diagram for showing the LED drive circuit and the configuration of transferring pixel data from the image sensor 43 of FIG. 6 to the high speed processor 200. FIG. 9 is a timing diagram showing the operation of the high speed processor 200 which receives pixel data from the image sensor 43 of FIG. 6. FIG. 10 is an expanded timing diagram showing part of FIG. 9.
  • Referring to FIG. 8, since the image sensor 43 is a sensor which outputs pixel data D (X, Y) as an analog signal, this pixel data D (X, Y) is input to an analog input port of the high speed processor 200. The analog input port is connected to the ADC 208 of this high speed processor 200, and therefore the high speed processor 200 acquires therein pixel data converted into digital data from the ADC 208.
  • The middle point of the analog pixel data D (X, Y) as described above is determined by a reference voltage given to a reference voltage terminal Vref of the image sensor 43. For this reason, in association with the image sensor 43, for example, a reference voltage generation circuit 59 made of a resistance voltage divider is provided in order to supply a reference voltage which is always kept at a certain level to the reference voltage terminal Vref.
  • The respective digital signals for controlling the image sensor 43 are input to and output from the high speed processor 200 through the I/O ports thereof. These I/O ports are digital ports capable of controlling input and output operations and connected to the input/output control circuit 209 inside of this high speed processor 200.
  • More specifically speaking, a reset signal “reset” is output to the image sensor 43 from the I/O port of the high speed processor 200 for resetting the image sensor 43. In addition, a pixel data strobe signal PDS and a frame status flag signal FSF are output from the image sensor 43, and supplied to the input ports of the high speed processor 200.
  • The pixel data strobe signal PDS is a strobe signal as shown in FIG. 9(b) which is used to read the pixel signal D (X, Y) as described above. The frame status flag signal FSF is a flag signal which indicates the state of the image sensor 43 and is used for defining the exposure period of this image sensor 43 as illustrated in FIG. 9(a). In other words, while the exposure period is defined by the low level period of the frame status flag signal FSF as illustrated in FIG. 9(a), the non-exposure period is defined by the high level period of the frame status flag signal FSF as illustrated in FIG. 9(a).
• Also, the high speed processor 200 outputs, from the I/O ports, a command (or a command associated with data) to be set in a control register (not shown in the figure) of the image sensor 43, outputs a register setting clock CLK which periodically and alternately takes high and low levels, and supplies the register setting clock CLK to the image sensor 43.
• Incidentally, the infrared light emitting diodes 15 as used are four infrared light emitting diodes 15 a, 15 b, 15 c and 15 d which are connected in parallel with each other as illustrated in FIG. 8. These four infrared light emitting diodes 15 a to 15 d are arranged so as to surround the image sensor 43, as explained above, in order to irradiate the operation article 150 with infrared light emitted in the same direction in which the viewpoint of the image sensor 43 is directed. However, the individual infrared light emitting diodes 15 a to 15 d are referred to simply as the infrared light emitting diodes 15 unless it is necessary to distinguish them.
• These infrared light emitting diodes 15 are turned on and off (non-lighted) by the LED drive circuit 75. The LED drive circuit 75 receives the frame status flag signal FSF as described above from the image sensor 43, and this frame status flag signal FSF is passed through a differentiating circuit 67, which is made up of a resistor 69 and a capacitor 71, and given to the base of the PNP transistor 77. The base of this PNP transistor 77 is connected further to a pull-up resistor 79 which usually pulls up the base of the PNP transistor 77 to a high level. Then, when the frame status flag signal FSF is pulled down to a low level, the low level signal is input to the base through the differentiating circuit 67 so that the PNP transistor 77 is turned on only for the low level period of the frame status flag signal FSF.
• The emitter of the PNP transistor 77 is grounded through resistors 73 and 65. On the other hand, the connecting point between the resistors 73 and 65 is connected to the base of an NPN transistor 81. The collector of this NPN transistor 81 is connected commonly to the anodes of the respective infrared light emitting diodes 15 a to 15 d. The emitter of the NPN transistor 81 is connected directly to the base of another NPN transistor 61. The collector of the NPN transistor 61 is connected commonly to the cathodes of the respective infrared light emitting diodes 15 a to 15 d, while the emitter of the NPN transistor 61 is grounded.
  • This LED drive circuit 75 turns on the infrared light emitting diodes 15 a to 15 d only within the period when the LED control signal LEDC which is output from the I/O port of the high speed processor 200 is activated (in a high level) while the frame status flag signal FSF which is output from the image sensor 43 is in a low level.
• When the frame status flag signal FSF is pulled down to the low level as shown in FIG. 9(a), the PNP transistor 77 is turned on for the low level period (in practice, there is a delay time corresponding to the time constant of the differentiating circuit 67). Accordingly, when the LED control signal LEDC shown in FIG. 9(d) is output from the high speed processor 200 as a high level signal, the base of the NPN transistor 81 is pulled up to a high level so that the NPN transistor 81 is turned on. When the transistor 81 is turned on, the transistor 61 is also turned on. Accordingly, a current flows from the power supply (indicated by a small open circle in FIG. 8) through the respective infrared light emitting diodes 15 a to 15 d and the transistor 61, and in response to this, the respective infrared light emitting diodes 15 a to 15 d are lighted as shown in FIG. 9(e).
• The LED drive circuit 75 turns on the infrared light emitting diodes 15 only during the period when the LED control signal LEDC is activated as shown in FIG. 9(d) while the frame status flag signal FSF is in a low level as shown in FIG. 9(a), and therefore the infrared light emitting diodes 15 are turned on only in the exposure period (refer to FIG. 9(f)) of the image sensor 43.
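• Expressed as a simple logical condition, the gating performed by the LED drive circuit 75 is as follows (a minimal sketch in C; the actual gating is performed by the analog transistor circuit described above, not by software, and the function name is illustrative):

    /* The infrared light emitting diodes 15 are lit only while the LED
       control signal LEDC is active (high) AND the frame status flag
       signal FSF is low, i.e., only during the exposure period of the
       image sensor 43. */
    int leds_lit(int ledc, int fsf)
    {
        return ledc && !fsf;
    }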
• Accordingly, unnecessary power consumption can be reduced. Furthermore, since the frame status flag signal FSF is given also to the coupling capacitor 71, the transistor 77 is necessarily turned off after a certain period even when the flag signal FSF is fixed at a low level due to the runaway of the image sensor 43 or the like, so that the infrared light emitting diodes 15 are also necessarily turned off after the certain period.
  • It is therefore possible to arbitrarily and freely change the exposure period of the image sensor 43 by adjusting the mark duration of the frame status flag signal FSF.
  • Furthermore, the lighting period, non-lighting period, cycles of lighting/non-lighting period and so forth of the infrared light emitting diodes 15, i.e., of the stroboscope can be arbitrarily and freely set and changed by adjusting the mark durations and the frequencies of the frame status flag signal FSF and LED control signal LEDC.
• As has been discussed above, when the operation article 150 is irradiated with the infrared light emitted from the infrared light emitting diodes 15, the image sensor 43 is exposed to the light reflected from the operation article 150. In response to this, the above pixel data D (X, Y) is output from the image sensor 43. More specifically speaking, during the period in which the frame status flag signal FSF as shown in FIG. 9(a) is in a high level (the infrared light emitting diodes 15 are not turned on), the image sensor 43 outputs the analog pixel data D (X, Y) as shown in FIG. 9(c) in synchronism with the pixel data strobe signal PDS as shown in FIG. 9(b).
  • The high speed processor 200 acquires digital pixel data through the ADC 208 while monitoring the frame status flag signal FSF and the pixel data strobe signal PDS.
• In this case, the pixel data is sequentially output as the zeroth line, the first line, . . . and the thirty-first line as illustrated in FIG. 10(c). However, as explained in the following, the first pixel of each line is associated with dummy data.
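• The acquisition sequence described above can be sketched in C as follows. The helper routines read_fsf(), wait_pds_edge() and adc_read() are hypothetical stand-ins for the I/O ports and the ADC 208 of the high speed processor 200, and the handling of the dummy pixel at the start of each line is an assumption made for the illustration:

    #define ROWS 32
    #define COLS 32

    extern int read_fsf(void);           /* level of the frame status flag FSF */
    extern void wait_pds_edge(void);     /* wait for the next PDS strobe       */
    extern unsigned char adc_read(void); /* digitized pixel from the ADC 208   */

    unsigned char P[COLS][ROWS];         /* P[X][Y]: acquired pixel data       */

    void acquire_frame(void)
    {
        while (read_fsf() == 0)
            ;  /* pixel data are output while FSF is high (non-exposure period) */

        for (int y = 0; y < ROWS; y++) {
            wait_pds_edge();
            (void)adc_read();            /* first pixel of each line is dummy data */
            for (int x = 0; x < COLS; x++) {
                wait_pds_edge();         /* PDS strobes each valid pixel */
                P[x][y] = adc_read();
            }
        }
    }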
  • Next, the details of the game played with the music game apparatus 1 will be explained with specific examples.
• FIG. 11 is a view for showing an example of the game screen as displayed on the screen 91 of the television monitor 90 of FIG. 1. The game screen shown in FIG. 11 is a game start screen. As shown in FIG. 11, the game start screen displayed on the screen 91 includes a background 110, position guides “G1” to “G4”, evaluation objects 107 to 109, a cursor 105, a dance object 106 and masks 141 and 142. Then, the automatic playing of music is started.
• Incidentally, in the case of the present embodiment, the position guides “G1” to “G4” are displayed in the form of blooms, the evaluation objects 107 to 109 are displayed in the form of heart-shaped objects, the dance object 106 is displayed in the form of a male-female pair, and the cursor 105 is displayed in the form of the operation article 150. In the following description, the term “position guides G” is used to generally represent the position guides “G1” to “G4”.
• The cursor 105 serves to indicate the position of the operation article 150 on the screen 91, and moves on the screen 91 to follow the motion of the operation article 150. Accordingly, as seen from the player 94, the manipulation of the operation article 150 is equivalent to the manipulation of the cursor 105. The position guide “G” serves to guide the manipulation timing and destination position of the cursor 105 (the operation article 150) in terms of the timings relative to the music which is automatically played. Direction guides “g1” to “g5”, which will be described below, serve to guide the manipulation timing and moving direction of the cursor 105 (the operation article 150) in terms of the timings relative to the music which is automatically played. Path guides “rg1” to “rg10”, which will be described below, serve to guide the manipulation timing, moving direction and moving path of the cursor 105 (the operation article 150) in terms of the timings relative to the music which is automatically played. The evaluation objects 107 to 109 serve to indicate, in a visual way, the evaluation of the manipulation of the cursor 105 (the operation article 150) by the player 94. In the following description, the term “direction guides g” is used to generally represent the direction guides “g1” to “g5”. In the same manner, the term “path guides rg” is used to generally represent the path guides “rg1” to “rg10”.
• FIG. 12 is a view for showing another example of the game screen as displayed on the screen 91 of the television monitor 90 of FIG. 1. As shown in FIG. 12, the animation of the position guide “G” in which a bloom is gradually opening indicates the position to which the cursor 105 is to be moved. By this guidance, the player 94 is instructed to move the cursor 105 to the area in which the animation of the position guide “G” in which the bloom is opening is displayed. The player 94 moves the operation article 150 in order to move the cursor 105 to that area. After the animation in which the bloom opens, the position guide “G” is displayed with an animation in which the bloom closes. Furthermore, the direction in which the cursor 105 is to be moved is indicated by the direction toward the animation of the position guide “G” in which the bloom is opening. By this guidance, the player 94 is instructed to move the cursor 105 in the direction toward that animation.
• In addition to this, the direction in which the cursor 105 is to be moved is guided also by the direction guides “g1” to “g5”. Specifically, the direction guides “g1” to “g5” appear in sequence: the direction guide “g1” appears first, the direction guide “g2” second, the direction guide “g3” third, the direction guide “g4” fourth, and the direction guide “g5” fifth. Accordingly, the direction in which the cursor 105 is to be moved is guided by the direction in which the direction guides “g1” to “g5” appear in sequence. In this case, while each of the direction guides “g1” to “g5” is displayed as a graphic form representing a small sphere just after it appears, the sphere gradually increases in size as time passes, and when the size is maximized an animation is performed as if the sphere shatters into fragments. Accordingly, the direction toward the graphic form of the sphere which newly appears is the direction in which the cursor 105 is to be moved.
• The player 94 has to move the cursor 105 to the area in which the position guide “G” is displayed within a predetermined period in which the bloom serving as the position guide “G” is open. In other words, the position guide “G” serves to guide the manipulation timing of the cursor 105 by the animation in which the bloom opens. Also, the player 94 has to move the cursor 105 to the area in which the position guide “G” is displayed as an opening bloom within a predetermined period after the last direction guide “g” appears as the graphic form of the sphere. In other words, the manipulation timing of the cursor 105 is guided also by the direction guide “g”.
• In addition to this, the position guide “G” serves also to indicate in advance the manipulation direction of the cursor 105. That is to say, if the bud of the bloom serving as the position guide “G” is coming to open, it enables the player 94 to know the direction in which the cursor 105 is to be moved next. Furthermore, the direction guide “g” serves also to indicate in advance the manipulation direction of the cursor 105. Namely, since the direction guide “g” appears in advance of the manipulation timing, the player 94 can know the direction in which the cursor 105 is to be moved next also by the direction guide “g”.
• This will be explained with reference to a specific example. In the case of the example as illustrated in FIG. 12, the position to which the cursor 105 is to be moved is indicated by the animation of the position guide “G2” in which a bloom is gradually opening. By this guidance, the player 94 is instructed to move the cursor 105 to the area in which the animation of the position guide “G2” in which the bloom is opening is displayed. Also, the direction toward the animation of the position guide “G2” in which the bloom is opening is the direction in which the cursor 105 is to be moved. By this process, the player 94 is instructed to move the cursor 105 in the direction toward the animation of the position guide “G2” in which the bloom is opening. In addition to this, the graphic forms of the spheres as the direction guides “g1” to “g5” sequentially appear from the position guide “G1” toward the position guide “G2”. As described above, also by the direction guides “g1” to “g5”, the motion of the cursor 105 is guided from the position guide “G1” to the position guide “G2”.
• The player 94 has to move the cursor 105 to the area in which the position guide “G2” is displayed within a predetermined period in which the bloom serving as the position guide “G2” is open. Also, the player 94 has to move the cursor 105 to the area in which the position guide “G2” is displayed as an opening bloom within a predetermined period after the last direction guide “g5” appears as the graphic form of the sphere. In other words, the manipulation timing of the cursor 105 is guided also by the direction guide “g”.
• The player 94 appropriately manipulates the operation article 150 in accordance with the instruction given by the position guide “G2” and the direction guides “g1” to “g5” in order to move the cursor 105 from the position of the position guide “G1” to the position of the position guide “G2”. As a result, an animation is performed in which all the evaluation objects 107 to 109 flash. Incidentally, if the cursor 105 is manipulated at the most appropriate timing, the animation is performed such that all the evaluation objects 107 to 109 flash, and if the cursor 105 is manipulated at a timing which is not the most appropriate but within an acceptable range, the animation is performed such that only the evaluation object 108 flashes. Meanwhile, each of the position guides “G1”, “G3” and “G4” is displayed in the form of the bud of the bloom because the current time is out of the time slot for guiding the manipulation timing and destination position of the cursor 105. Also, the direction guides “g” do not appear between the position guide “G2” and the position guide “G4”, between the position guide “G4” and the position guide “G3”, or between the position guide “G3” and the position guide “G1”, for the same reason.
• When the player 94 appropriately manipulates the cursor 105 in accordance with the guidance given by the position guide “G2” and the direction guides “g1” to “g5”, the animation of dance is performed in the direction corresponding to the moving direction of the cursor 105 (the direction from the position guide “G1” to the position guide “G2”, i.e., the rightward direction as seen toward the screen 91). For example, the animation of the dance object 106 turning in the counterclockwise direction is performed, while the background 110 is scrolled in the leftward direction as seen toward the screen 91. By this process, although the dance object 106 is positioned in the center of the screen 91, it appears that the dance object 106 is turning in the counterclockwise direction while moving in the rightward direction.
• FIG. 13 is a view for showing a further example of the game screen as displayed on the screen 91 of the television monitor 90 of FIG. 1. As shown in FIG. 13, an animation is performed in which the position guides “G1” to “G4” open as blooms at the same time. Taking this opportunity, the player 94 is guided to move the cursor 105 in the direction and along the path indicated by the path guides “rg1” to “rg10”. In this case, the appearance positions of the path guides “rg1” to “rg10” indicate the guide path of the cursor 105. Also, the path guides “rg1” to “rg10” appear in sequence: the path guide “rg1” appears first, the path guide “rg2” second, the path guide “rg3” third, the path guide “rg4” fourth, . . . and the path guide “rg10” last. Accordingly, the direction in which the cursor 105 is to be moved is guided by the direction in which the path guides “rg1” to “rg10” appear in sequence. In this case, while each of the path guides “rg1” to “rg10” is displayed as a graphic form representing a small sphere just after it appears, the sphere gradually increases in size as time passes, and when the size is maximized an animation is performed as if the sphere shatters into fragments. In FIG. 13, the player is instructed to move the cursor 105 in the counterclockwise direction from a start point in the vicinity of the position guide “G3” along the path guides “rg1” to “rg10”.
• When the player 94 manipulates the operation article 150 in accordance with the position guides “G1” to “G4” and the path guides “rg1” to “rg10” in order to move the cursor 105 in an appropriate manner, the animation of the dance object 106 (for example, an animation of widely turning in the counterclockwise direction) is performed in correspondence with the path guides “rg1” to “rg10”.
  • Meanwhile, as discussed above, the object illustrated in each of FIG. 12 and FIG. 13 such as the dance object 106 is an image corresponding to a certain picture for an animation. For example, a series of the dance objects 106 are prepared for dance animation. Also, for example, a series of object images in the graphic forms of blooms are prepared for the animation of the position guide “G”. Furthermore, for example, a series of object images in the graphic forms of spheres are prepared for the animation of the direction guide “g” and the path guide “rg”.
  • In this case, each of the dance object 106, the position guide “G”, the evaluation objects 107 to 109, the cursor 105, the direction guide “g” and the path guide “rg” in the game screens as illustrated in FIG. 11 and FIG. 13 is composed of a single or a plurality of sprites. A sprite comprises a rectangular pixel set and can be arranged in an arbitrary position of the screen 91. Incidentally, a generic term “object” (or “object image”) is sometimes used to generally refer to the position guide “G”, the evaluation objects 107 to 109, the cursor 105, the direction guide “g” and the path guide “rg”.
• FIG. 14 is a view for explaining the sprites forming an object which is displayed on the screen 91. As illustrated in FIG. 14, the dance object 106 of FIG. 11 is composed, for example, of 12 sprites SP0 to SP11. Each of the sprites SP0 to SP11 consists, for example, of 16 pixels×16 pixels. When the dance object 106 is arranged on the screen 91, for example, the coordinates at which the center of the upper left corner sprite SP0 is to be located are designated. Then, the coordinates at which the centers of the respective sprites SP1 to SP11 are to be located are calculated on the basis of the coordinates as designated and the size of the sprites SP0 to SP11, as in the sketch below.
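• For illustration, this calculation can be sketched in C as follows (the 3-sprite-wide by 4-sprite-tall arrangement of the 12 sprites is an assumption made for the example; the names Point and layout_sprites are illustrative):

    #define SPRITE_W 16
    #define SPRITE_H 16
    #define OBJ_COLS 3   /* assumed: the 12 sprites form a 3 x 4 grid */
    #define OBJ_ROWS 4

    typedef struct { int x, y; } Point;

    /* Given the designated center of the upper left corner sprite SP0,
       compute the centers of all sprites SP0 to SP11 forming the object. */
    void layout_sprites(Point sp0, Point centers[OBJ_COLS * OBJ_ROWS])
    {
        for (int n = 0; n < OBJ_COLS * OBJ_ROWS; n++) {
            centers[n].x = sp0.x + (n % OBJ_COLS) * SPRITE_W;
            centers[n].y = sp0.y + (n / OBJ_COLS) * SPRITE_H;
        }
    }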
  • Next, the scrolling of the background 110 will be explained. First, the background screen will be explained.
  • FIG. 15 is an explanatory view for showing the background screen to be displayed on the screen 91 of the television monitor 90 of FIG. 1. As illustrated in FIG. 15, the background screen 140 is composed, for example, of 32×32 blocks “0” to “1023”. Each of the block “0” to the block “1023” is composed, for example, of a rectangular element comprising 8 pixels×8 pixels. An array element PA[0] to an array element PA[1023] and an array element CA[0] to an array element CA[1023] are prepared in correspondence respectively with the block “0” to the block “1023”.
  • In this description, in the case where the block “0” to the block “1023” are generally referred to, they are referred to simply as the “block”; in the case where the array element PA[0] to the array element PA[1023] are generally referred to, they are referred to as the “array element PA”; and in the case where the array element CA[0] to the array element CA[1023] are generally referred to, they are referred to as the “array element CA”.
  • Incidentally, data (pixel pattern data) for designating the pixel pattern of the corresponding block is assigned to the array element PA. This pixel pattern data consists of the color information of the respective pixels of the 8 pixels×8 pixels for making up a block. On the other hand, the information for designating the color palette and the depth value for use in the corresponding block is assigned to the array element CA. A color palette consists of the predetermined number of color information entries. The depth value indicates the depth position of the pixels, and if a plurality of pixels overlap each other in the same position only the pixel having the largest depth value is displayed.
• FIG. 16(a) is an explanatory view for showing the background screen 140 before scrolling it, and FIG. 16(b) is an explanatory view for showing the background screen 140 after scrolling it. As illustrated in FIG. 16(a), since the size of the screen 91 of the television monitor 90 is 256 pixels×224 pixels, an area of 256 pixels×224 pixels in the background screen 140 is displayed on the screen 91. It is considered here that the background screen 140 is scrolled to shift the center position thereof to the left by “k” pixels. In this case, since the width of the background screen 140 in the lateral direction (the horizontal direction) is equal to the width of the screen 91 in the lateral direction, the portion thereof (hatched portion) scrolled out of the screen 91 is displayed at the right edge as illustrated in FIG. 16(b). In other words, when scrolling in the lateral direction, it can conceptually be thought that the same background screen 140 is repeatedly arranged in the lateral direction.
• For example, if it is assumed that the portion thereof (hatched portion) scrolled out of the screen 91 consists of the block “64”, the block “96”, . . . , the block “896” and the block “928” of FIG. 15, the image displayed near the right edge of the screen 91 is defined by the array elements PA[64], . . . , and PA[928] and the array elements CA[64], . . . , and CA[928] corresponding to these blocks. From this fact, in order to keep the background coherent while scrolling the background screen 140 in the left direction, it is necessary to update the data assigned to the array elements PA and the array elements CA corresponding to the blocks included in the portion thereof (hatched portion) scrolled out of the screen 91, as in the sketch below. By this process, the image defined by the array elements PA and the array elements CA which are updated is displayed at the right edge of the screen 91.
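• A minimal sketch of this update in C follows. The row-major numbering of the blocks (32 per row) follows FIG. 15, the visible block rows 2 to 29 follow from the 224-pixel (28-block) height of the screen 91, and fetch_new_pattern() and fetch_new_attr() are hypothetical sources of the new block data:

    #define BLOCKS_PER_ROW 32

    extern unsigned PA[1024];  /* pixel pattern data per block      */
    extern unsigned CA[1024];  /* palette and depth value per block */
    extern unsigned fetch_new_pattern(int block);
    extern unsigned fetch_new_attr(int block);

    /* Update the array elements of the block column that has just been
       scrolled out of the screen 91, so that the image displayed at the
       opposite edge is coherent with the rest of the background. */
    void update_scrolled_out_column(int col)
    {
        for (int row = 2; row <= 29; row++) {   /* visible block rows */
            int b = row * BLOCKS_PER_ROW + col;
            PA[b] = fetch_new_pattern(b);
            CA[b] = fetch_new_attr(b);
        }
    }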
• In order to make the background look smooth and continuous, it is necessary to update the relevant array elements PA and the relevant array elements CA before displaying them at the right edge of the screen 91. In this case, the relevant array elements PA and the relevant array elements CA must be updated while they are still being displayed at the left edge of the screen 91, and thereby the image near the left edge of the screen 91 becomes incoherent. Accordingly, as illustrated in FIG. 11 to FIG. 13, the mask 141 is provided at the left edge of the screen 91 in order to hide such incoherence. For the same reason, the mask 142 is provided at the right edge.
  • Incidentally, the scroll process in the rightward direction is performed in the same manner as the scroll process in the leftward direction. Also, in the case of the present embodiment, since the range of scrolling is limited within ±16 pixels in the longitudinal direction (vertical direction) of the background screen 140, there is no mask at the top and bottom edges of the screen 91.
  • As has been discussed above, the background 110 is scrolled by scrolling the background screen 140.
• Next, the details of the game process by the music game apparatus 1 will be explained. FIG. 17 is a schematic representation of a program and data stored in the ROM 51 of FIG. 6. As shown in FIG. 17, the ROM 51 is used to store a game program 300, image data 301 and music data 304. The image data 301 includes object image data (inclusive of the image data of the position guide “G”, the direction guide “g”, the path guide “rg”, the evaluation objects 107 to 109 and the cursor 105) and background image data. The music data 304 includes first musical score data 305, second musical score data 306 and sound source data (wave data) 307.
  • The first musical score data 305 shown in FIG. 17 is the data in which music control information is arranged in a time series.
  • FIG. 18 is a schematic representation of one example of the first musical score data 305 of FIG. 17. As shown in FIG. 18, the music control information contains a command, a note number/a waiting time information item, an instrument designation information item, a velocity value and a gate time.
  • “Note On” is a command to output sound, and “Wait” is a command to set a waiting time. The waiting time is the time period to elapse before reading the next command (the time period between one musical note and the next musical note). The note number is information for designating the frequency of sound vibration (pitch). The waiting time information item is information for designating a waiting time to be set. The instrument designation information item is information for designating a musical instrument whose tone quality is to be used. The velocity value is information for designating the magnitude of sound, i.e., a sound volume. The gate time is information for designating a period for which a musical note is to be continuously output.
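• A minimal sketch of how such music control information might be represented and interpreted follows, in C (the struct, field and function names are illustrative; note_on() stands in for the sound output performed through the sound processor 203, and end-of-data handling is omitted):

    enum Cmd { CMD_NOTE_ON, CMD_WAIT };

    typedef struct {
        enum Cmd       command;
        unsigned char  note_or_wait;  /* note number, or waiting time for CMD_WAIT */
        unsigned char  instrument;    /* instrument designation (tone quality)     */
        unsigned char  velocity;      /* sound volume                              */
        unsigned short gate_time;     /* how long the note is sustained            */
    } MusicEvent;

    extern void note_on(int note, int instrument, int velocity, int gate);

    /* Read events until a "Wait" command sets the delay before the next
       read; *wait is decremented elsewhere as playing time passes. */
    void step_score(const MusicEvent *score, int *ptr, int *wait)
    {
        while (*wait == 0) {
            const MusicEvent *e = &score[(*ptr)++];
            if (e->command == CMD_WAIT)
                *wait = e->note_or_wait;
            else
                note_on(e->note_or_wait, e->instrument, e->velocity, e->gate_time);
        }
    }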
• Returning to FIG. 17, the second musical score data 306 is the data in which guide control information is arranged in a time series. This second musical score data 306 is used when the guides (the position guide “G”, the direction guide “g” and the path guide “rg”) are displayed on the screen 91. In other words, while the first musical score data 305 is the musical score data for automatically playing music, the second musical score data 306 is the musical score data for displaying the guides in synchronization with the music.
  • FIG. 19 is a schematic representation of one example of the second musical score data 306 of FIG. 17. As shown in FIG. 19, the guide control information contains a command, a note number/a waiting time information item, and an instrument designation information item.
• The instrument designation information item of the second musical score data 306 is a number indicating that the second musical score data 306 is the musical score data for displaying the guides (the position guide “G”, the direction guide “g” and the path guide “rg”), rather than a number designating an instrument (tone quality) whose sound is to be output.
• Accordingly, “Note On” is not a command to output sound but a command to designate starting the animation of the position guide “G” or starting the display of the direction guide “g” and the path guide “rg”. Likewise, the note number is not information to designate the frequency of sound vibration (pitch) but information used to designate which of the animations of the position guides “G” is to be started and where the direction guide “g” and the path guide “rg” are to be displayed. This point will be explained in detail.
• FIG. 20(a) through FIG. 20(c) are views showing the correspondence between note numbers and the directions in which the cursor 105 is guided. As illustrated in FIG. 20(a) through FIG. 20(c), the direction of each arrow indicates the direction in which the cursor 105 is guided, the start point of each arrow indicates the position of the position guide “G” which previously guided the cursor 105, and the end point of each arrow indicates the position of the position guide “G” which currently guides the cursor 105. For example, as illustrated in FIG. 20(a), the note number “55” is used to direct the cursor 105 from the position guide “G1” to the position guide “G2”, and when the note number indicated by the musical score data pointer is “55” the position guide “G” and the direction guide “g” are displayed as illustrated in FIG. 12. Also, for example, as illustrated in FIG. 20(c), the note number “57” is used to direct the cursor 105 so that it turns in the counterclockwise direction from the position guide “G3” as the start point, and when the note number indicated by the musical score data pointer is “57” the position guide “G1” and the path guide “rg” are displayed as illustrated in FIG. 13.
• Meanwhile, for example, the note number “81” is dummy data placed at the top of the second musical score data 306 (refer to FIG. 19) and is not information which is used to control the display of guidance. By this configuration, the top positions of the first musical score data 305 and the second musical score data 306 are aligned with each other. Furthermore, for example, the note number “79” is data indicative of the end of music, and is arranged at the end of the second musical score data 306 (refer to FIG. 19). Incidentally, the note number “79” is also not information which is used to control the display of guidance.
  • Next is the explanation of the main process performed by the high speed processor 200.
• [Pixel Data Group Acquisition Process] The CPU 201 acquires digital pixel data obtained by converting the analog pixel data which is output from the image sensor 43, and assigns it to the array element P[X][Y]. Meanwhile, it is assumed that the horizontal axis (in the lateral direction or the row direction) of the image sensor 43 is the X-axis and the vertical axis (in the longitudinal direction or the column direction) is the Y-axis.
• [Differential Data Calculation Process] The CPU 201 calculates the differential data between the pixel data P[X][Y] acquired when the infrared light emitting diodes 15 are turned on and the pixel data P[X][Y] acquired when the infrared light emitting diodes 15 are turned off, and assigns the differential data to the array element Dif[X][Y]. In what follows, the advantages of obtaining the differential data will be explained with reference to the drawings. In this case, the pixel data represents a luminance value. Accordingly, the differential data also represents a luminance value.
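• This calculation can be sketched in C as follows (keeping separate arrays for the lit and unlit frames, and clamping negative differences at zero, are simplifying assumptions made for the illustration):

    #define N 32

    unsigned char P_on[N][N];   /* pixel data with the diodes turned on  */
    unsigned char P_off[N][N];  /* pixel data with the diodes turned off */
    int Dif[N][N];              /* differential data (luminance values)  */

    void calc_differential(void)
    {
        for (int x = 0; x < N; x++)
            for (int y = 0; y < N; y++) {
                int d = (int)P_on[x][y] - (int)P_off[x][y];
                Dif[x][y] = (d > 0) ? d : 0;  /* keep only the lit-frame excess */
            }
    }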
• FIG. 21(a) is a view for showing an example of the image which is captured by an ordinarily used image sensor and is not processed by a particular treatment, FIG. 21(b) is a view for showing an example of the image which is obtained by level filtering the image signal of FIG. 21(a) by a certain threshold value, FIG. 21(c) is a view for showing an example of the image which is captured by the image sensor 43 through the infrared filter 17 with the illumination and is level filtered by a certain threshold value, FIG. 21(d) is a view for showing an example of the image which is captured by the image sensor 43 through the infrared filter 17 without the illumination and is level filtered by a certain threshold value, and FIG. 21(e) is a view for showing an example of the differential signal between the image signal with the illumination and the image signal without the illumination.
• As has been discussed above, the operation article 150 is irradiated with infrared light in order to capture an image by the reflected infrared light which is incident on the image sensor 43 through the infrared filter 17. In the case where an image of the operation article 150 is stroboscopically captured by the use of an ordinary light source in an ordinary indoor environment, an ordinary image sensor (corresponding to the image sensor 43 of FIG. 5) captures an image which includes not only light sources such as a fluorescent light source, an incandescent light source and a solar light source (window) but also any other objects located inside the room, in addition to an image of the operation article 150, as illustrated in FIG. 21(a). Accordingly, a computer or a processor having a substantially high-speed processing capability is needed in order to extract only the image of the operation article 150 by processing the image of FIG. 21(a). However, such a high-performance computer cannot be used in a device which must be manufactured at a low cost. It is therefore conceivable to lessen the load by the use of a variety of processing techniques.
  • Incidentally, although the image of FIG. 21(a) would have to be drawn as a gray-scale image, the illustration is omitted. Also, in each of FIG. 21(a) through FIG. 21(e), an image is captured of the reflection sheet 155 of the operation article 150.
• Next, FIG. 21(b) is an image signal after level filtering the image signal of FIG. 21(a) by a certain threshold value. While such a level filtering process can be performed by a dedicated hardware circuit or by software control, it is possible to remove images having low luminance values, other than the operation article 150 and the light sources, by performing the level filtering process which cuts off pixel data whose luminance value is no higher than a certain level. In the case of the image signal of FIG. 21(b), the images other than the operation article 150 and the light sources can be eliminated so as to lessen the load on the computer; however, since high-luminance images including the light source images still remain, it is difficult to discriminate between the operation article 150 and the other light sources.
• Because of this, the infrared filter 17 is used as illustrated in FIG. 5 in order that the image sensor 43 does not capture images other than the image of the infrared light. By this process, as illustrated in FIG. 21(c), it is possible to remove the fluorescent light source which emits little infrared light. However, the solar light source and the incandescent light source are still included in the image signal. Accordingly, the load is lessened by calculating the difference between the pixel data when the infrared light stroboscope is turned on and the pixel data when the infrared light stroboscope is turned off.
  • For this purpose, the difference is calculated between the pixel data of the image signal with the illumination as shown in FIG. 21(c) and the pixel data of the image signal without the illumination as shown in FIG. 21(d). Then, as illustrated in FIG. 21(e), only the image corresponding to the difference can be acquired. The image corresponding to the difference includes only the image corresponding to the operation article 150 as apparent from the comparison with FIG. 21(a). Accordingly, while lessening the processing load, it is possible to acquire the state information on the operation article 150. The state information is any one of or any combination of two or more of speed information, moving direction information, moving distance information, velocity vector information, acceleration information, movement locus information, area information, and positional information.
  • For the reason as described above, the CPU 201 acquires differential data by calculating the difference between the pixel data acquired when the infrared light emitting diodes 15 are turned on and the pixel data acquired when the infrared light emitting diodes 15 are turned off.
  • [Target Point Extraction Process] The CPU 201 obtains the coordinates of the target point of the operation article 150 on the basis of the differential data Dif[X][Y] as calculated. This will be explained in detail.
  • FIG. 22 is a view for explaining the calculation process of the target point of the operation article 150. Incidentally, it is assumed that the image sensor 43 shown in FIG. 22 is an image sensor of 32 pixels×32 pixels.
• As illustrated in FIG. 22, the CPU 201 scans the differential data row by row: it scans the differential data through 32 pixels in the X-direction (the horizontal direction, the lateral direction or the row direction), then increments the Y-coordinate, scans the next 32 pixels in the X-direction, and so on, until all 32 rows have been scanned.
  • In this case, the CPU 201 finds the differential data of the maximum luminance value from the differential data of 32 pixels×32 pixels as scanned, and compares the maximum luminance value to a predetermined threshold value “Th”. Then, if the maximum luminance value is larger than the predetermined threshold value “Th”, the CPU 201 calculates the coordinates of the target point of the operation article 150 on the basis of the coordinates of the pixel having the maximum luminance value. This point will be explained in detail.
  • FIG. 23(a) is a view for explaining the process of scanning in the X-direction when the coordinates of the target point of the operation article 150 are calculated on the basis of the coordinates of the pixel having the maximum luminance value, FIG. 23(b) is a view for explaining the process of starting scanning in the Y-direction when the coordinates of the target point of the operation article 150 are calculated on the basis of the coordinates of the pixel having the maximum luminance value, FIG. 23(c) is a view for explaining the process of scanning in the Y-direction when the coordinates of the target point of the operation article 150 are calculated on the basis of the coordinates of the pixel having the maximum luminance value, and FIG. 23(d) is an explanatory view for showing the result of the process of calculating the coordinates of the target point of the operation article 150 on the basis of the coordinates of the pixel having the maximum luminance value.
• As illustrated in FIG. 23(a), the CPU 201 scans the differential data in the X-direction, starting from the pixel having the maximum luminance value, in order to detect pixels whose luminance values are larger than the predetermined threshold value “Th”. In the case of the example of FIG. 23(a), the pixels corresponding to X=11 to 15 are pixels whose luminance values are larger than the predetermined threshold value “Th”.
  • Next, as illustrated in FIG. 23(b), the CPU 201 obtains the center of X (=11 to 15). Then, it is determined that Xc=13 as the X-coordinate of the center.
• Next, as illustrated in FIG. 23(c), the CPU 201 scans the differential data in the Y-direction along the X-coordinate (=13) of the center as obtained in FIG. 23(b), and detects pixels whose luminance values are larger than the predetermined threshold value “Th”. In the case of the example of FIG. 23(c), the pixels corresponding to Y=5 to 10 are pixels whose luminance values are larger than the predetermined threshold value “Th”.
  • Next, as illustrated in FIG. 23(d), the CPU 201 obtains the center of Y (=5 to 10). Then, it is determined that Yc=7 as the Y-coordinate of the center.
• The CPU 201 converts the coordinates (Xc, Yc) (=(13, 7)) of the target point which is calculated as described above into the coordinates (xc, yc) on the screen 91. The CPU 201 performs the process of calculating the coordinates (xc, yc) of the target point as described above each time the frame is updated. Then, the CPU 201 assigns “xc” and “yc” respectively to the array elements Px[M] and Py[M]. Meanwhile, “M” is an integer and is incremented by one each time the frame displayed on the screen 91 is updated.
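• Putting the steps of FIG. 23(a) through FIG. 23(d) together, the extraction can be sketched in C as follows (the run detection around the maximum-luminance pixel and the linear conversion to the 256 pixels×224 pixels screen are simplifications of the described steps, not the exact implementation):

    extern int Dif[32][32];   /* differential data from the previous step */

    /* Returns 1 and writes the screen coordinates (xc, yc) of the target
       point if the operation article 150 is detected, 0 otherwise. */
    int extract_target(int Th, int *xc, int *yc)
    {
        int mx = 0, my = 0, max = -1;
        for (int y = 0; y < 32; y++)          /* find the brightest pixel */
            for (int x = 0; x < 32; x++)
                if (Dif[x][y] > max) { max = Dif[x][y]; mx = x; my = y; }
        if (max <= Th)
            return 0;

        int x0 = mx, x1 = mx;                 /* scan in the X-direction  */
        while (x0 > 0  && Dif[x0 - 1][my] > Th) x0--;
        while (x1 < 31 && Dif[x1 + 1][my] > Th) x1++;
        int Xc = (x0 + x1) / 2;               /* center of X, e.g. Xc = 13 */

        int y0 = my, y1 = my;                 /* scan in the Y-direction  */
        while (y0 > 0  && Dif[Xc][y0 - 1] > Th) y0--;
        while (y1 < 31 && Dif[Xc][y1 + 1] > Th) y1++;
        int Yc = (y0 + y1) / 2;               /* center of Y, e.g. Yc = 7  */

        *xc = Xc * 256 / 32;                  /* convert to the screen 91  */
        *yc = Yc * 224 / 32;
        return 1;
    }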
  • [Target Point Existence Area Determination Process (1)] The CPU 201 determines which of areas a1 to a4 includes the target point of the operation article 150 on the screen 91. This point will be explained in detail.
• FIG. 24 is a view for explaining the target point existence area determination process (1) performed by the CPU 201. As illustrated in FIG. 24, a predetermined area a1 including the position guide “G1”, a predetermined area a2 including the position guide “G2”, a predetermined area a3 including the position guide “G3” and a predetermined area a4 including the position guide “G4” are defined on the screen 91. The CPU 201 determines, from among the predetermined areas a1 to a4, the area in which the target point (xc, yc) of the operation article 150 is located, and stores the result of the determination in the array element J1[M]. The CPU 201 performs the determination process as described above each time the frame displayed on the screen 91 is updated.
  • [Target Point Existence Area Determination Process (2)] The CPU 201 determines which of areas A1 to A4 includes the target point of the operation article 150 on the screen 91. This point will be explained in detail.
  • FIG. 25 is a view for explaining the target point existence area determination process (2) performed by the CPU 201. As illustrated in FIG. 25, the areas A1 to A4 are defined by dividing the screen 91 into four. The CPU 201 determines, from among the areas A1 to A4, the area in which the target point (xc, yc) of the operation article 150 is located and stores the result of determination in the array element J2[M]. The CPU 201 performs the determination process as described above each time the frame displayed on the screen 91 is updated.
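• A minimal sketch of this determination in C follows (the assignment of the numbers 1 to 4 to the four quarters of the screen is an assumption; FIG. 25 defines the division into four but the numbering used here is illustrative):

    /* Determine which quarter of the 256 x 224 screen 91 contains the
       target point (xc, yc); the result (1 to 4) is stored in J2[M]. */
    int determine_area(int xc, int yc)
    {
        int right  = (xc >= 256 / 2);
        int bottom = (yc >= 224 / 2);
        return 1 + right + 2 * bottom;  /* assumed numbering of A1 to A4 */
    }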
  • [Cursor Control Process] The CPU 201 registers (stores in the internal memory 207) the coordinates (xc, yc) of the current target point of the operation article 150 as the coordinates of the cursor 105 to be displayed in the next frame.
• [Guide Type Registration Process] The CPU 201 assigns a note number (refer to FIG. 19 and FIG. 20(a) through FIG. 20(c)), which is read from the second musical score data 306 in accordance with the musical score data pointer for guidance, to an array element NN[0] or an array element NN[1]. The number of elements of the array is two because the guidance of a certain position guide “G” and the guidance of another position guide “G” are started at different timings but can continue in an overlapping manner for a certain period, as sketched below. Incidentally, the musical score data pointer for guidance is a pointer pointing to the position of the second musical score data 306 from which data is read.
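• The two-slot bookkeeping can be sketched in C as follows (the sentinel value used to mark a free slot is an assumption made for the illustration):

    #define SLOT_FREE (-1)

    int NN[2] = { SLOT_FREE, SLOT_FREE };

    /* Assign a note number read from the second musical score data 306
       to a free guide slot; at most two guides can be active at once. */
    void register_guide(int note_number)
    {
        if (NN[0] == SLOT_FREE)
            NN[0] = note_number;
        else if (NN[1] == SLOT_FREE)
            NN[1] = note_number;
    }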
• [Guide Control Process] The CPU 201 registers the animation information of the position guide “G”, the animation information of the direction guide “g” and the animation information of the path guide “rg” with reference to the array element NN[J] (guide display number “J”=0, 1) in accordance with the note number assigned to the array element NN[J]. This point will be explained in detail.
• FIG. 26 is a view for explaining the registration process of the animations of the position guide “G”, the direction guide “g” and the path guide “rg”. As illustrated in FIG. 26, in the ROM 51 or the internal memory 207, there is prepared a table in which the note numbers are associated with the animation information (the storage location information of the animation table of the position guide “G”, the display coordinate information of the position guide “G” on the screen 91, the display timing information of the position guide “G”, the storage location information of the animation table of the direction guide “g”/the path guide “rg”, the display coordinate information of the direction guide “g”/the path guide “rg” on the screen 91, and the display timing information of the direction guide “g”/the path guide “rg”).
  • Each of the note numbers in this table is a note number which is used to control the display of a guide and shown in FIG. 20(a) through FIG. 20(c). For example, if the note number assigned to the array element NN[J] is “55”, the CPU 201 refers to this table and registers (stores in the predetermined area of the internal memory 207) the animation information (the storage location information of the animation table of the position guide “G”, the display coordinates of the position guide “G” on the screen 91, the display timing information of the position guide “G”, the storage location information of the animation table of the direction guide “g”, the display coordinate information of the direction guide “g” on the screen 91, and the display timing information of the direction guide “g”) associated with the note number “55”.
  • In this case, the display timing information is information indicative of when an object is to be displayed on the screen 91. For example, since the display timing information of the position guide “G” is “0”, the guide number “55” indicates that the position guide “G2” is to be displayed at the coordinates (x1, y1) in the next frame following the frame which is currently displayed. Also, for example, since the display timing information of the direction guide “g” is 0, 6, 12, . . . and 24, the guide number “55” indicates that the direction guide “g1” is to be displayed at the coordinates (x3, y1) in the next frame following the frame which is currently displayed, that the direction guide “g2” is to be displayed at the coordinates (x4, y1) 6 frames later, . . . and that the direction guide “g5” is to be displayed at the coordinates (x7, y1) 24 frames later.
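  • As an illustration, the timing lookup described above might be sketched as follows. The table entry for the note number “55” uses placeholder coordinates, and the frame offsets 0, 6, 12, 18 and 24 follow the example in the text; this is a sketch of the mechanism, not the patent's data layout.

      # Sketch of how the registered animation information could drive the
      # display timing of the guides; the coordinates are placeholders.
      GUIDE_TABLE = {
          55: {
              "G": {"coords": (10, 100), "timing": [0]},           # position guide
              "g": {"coords": [(40, 100), (60, 100), (80, 100),    # direction guides
                               (100, 100), (120, 100)],
                    "timing": [0, 6, 12, 18, 24]},                  # frame offsets
          },
      }

      def guides_starting_at(note_number, frame):
          """Guide sub-objects whose display starts this many frames into the guidance."""
          entry, shown = GUIDE_TABLE[note_number], []
          if frame in entry["G"]["timing"]:
              shown.append(("G", entry["G"]["coords"]))
          for i, t in enumerate(entry["g"]["timing"]):
              if frame == t:
                  shown.append(("g%d" % (i + 1), entry["g"]["coords"][i]))
          return shown

      print(guides_starting_at(55, 0))    # the position guide and g1 start at once
      print(guides_starting_at(55, 6))    # g2 starts 6 frames later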
  • FIG. 27 is a view for showing an example of the animation table which is designated by the animation table storage location information of FIG. 26. As illustrated in FIG. 27, the animation table is a table in which are associated the storage location information of animation image data (a plurality of image object data items arranged in a time series), the reference numbers of objects for use in performing animation arranged in a time series, information indicative of how many frames (the number of duration frames) an object is continuously displayed, the size of an object, the information on a color palette, the information on the depth value, and the size of a sprite. Incidentally, the animation image data is pixel pattern data. In this case, the pixel pattern data, the color palette and the depth value are related to sprites for forming objects, and the definitions thereof are the same as explained in conjunction with the blocks of FIG. 14.
  • The animation table pointed to by the animation table storage location information “address 0” is an example of the animation table of the position guide “G”, the animation table pointed to by the animation table storage location information “address 1” is an example of the animation table of the direction guide “g”, the animation table pointed to by the animation table storage location information “address 2” is an example of the animation table of the path guide “rg”, and the animation table pointed to by the animation table storage location information “address 3” is an example of the animation table of the position guide “G” which is used when the player 94 successfully manipulates the cursor 105.
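  • A data-structure sketch of one animation table entry follows, under the assumption that the items FIG. 27 associates map one-to-one onto record fields; the names are illustrative, not the patent's identifiers.

      from dataclasses import dataclass
      from typing import List, Tuple

      @dataclass
      class AnimationTable:
          image_data_address: int        # storage location of the animation image data
          object_sequence: List[int]     # reference numbers of objects, in time series
          duration_frames: int           # frames each object stays displayed
          object_size: Tuple[int, int]   # size of one object
          color_palette: int             # color palette information of the sprites
          depth_value: int               # depth value of the sprites
          sprite_size: Tuple[int, int]   # size of one sprite

      def current_object(table: AnimationTable, frame: int) -> int:
          """Reference number of the object to display at the given animation frame."""
          index = min(frame // table.duration_frames, len(table.object_sequence) - 1)
          return table.object_sequence[index]

      blossom = AnimationTable(0x8000, [1, 2, 3, 4], 6, (16, 16), 2, 1, (8, 8))
      print(current_object(blossom, 13))   # the third object, 13 frames in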
  • [Dance Control Process] The CPU 201 determines whether or not the player 94 correctly manipulates the cursor 105 in correspondence with the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”. More specific description is as follows.
  • The CPU 201 determines whether or not the cursor 105 (i.e., the target point of the operation article 150) is located in the area which is currently designated by the position guide “G” and the direction guide “g” on the basis of the result of determination J1[M] performed by the target point existence area determination process (1) (refer to FIG. 24). For example, in the case where the area a2 is currently designated by the position guide “G” and the direction guide “g”, if the cursor 105 is located in the area a2, the CPU 201 determines that the cursor 105 is correctly manipulated in correspondence with the position guide “G” and the direction guide “g”.
  • Also, the CPU 201 determines whether or not the cursor 105 (i.e., the target point of the operation article 150) is moved along the path which is currently designated by the position guide “G” and the path guide “rg” on the basis of the result of determination J2[M] performed by the target point existence area determination process (2) (refer to FIG. 25). In this case, the path which is designated by the guide number “53” as shown in FIG. 20(b) is the path of the area A3->the area A1->the area A2->the area A4 as shown in FIG. 25. Also, the path which is designated by the guide number “57” as shown in FIG. 20(c) is the path of the area A4->the area A2->the area A1->the area A3 as shown in FIG. 25. Accordingly, for example, in the case where the path corresponding to the guide number “53” is currently designated by the position guide “G” and the path guide “rg”, if the cursor 105 is moved along the path of the area A3->the area A1->the area A2->the area A4, the CPU 201 determines that the cursor 105 is correctly manipulated in correspondence with the position guide “G” and the path guide “rg”.
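  • A minimal sketch of the path determination, assuming the per-frame quadrant results J2[0..M] are recorded over the guidance period; the quadrant orders for the guide numbers “53” and “57” follow the text above.

      # Guide number -> order in which the quadrants must be visited.
      PATHS = {53: [3, 1, 2, 4], 57: [4, 2, 1, 3]}

      def path_followed(j2_history, guide_number):
          """True if the cursor passed through the path's quadrants in order."""
          want, i = PATHS[guide_number], 0
          for area in j2_history:
              if i < len(want) and area == want[i]:
                  i += 1
          return i == len(want)

      # The cursor wandered but still hit A3 -> A1 -> A2 -> A4 in order:
      print(path_followed([3, 3, 1, 1, 2, 1, 2, 4], 53))   # True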
  • From the above results, in the case where it is determined that the player 94 correctly manipulates the cursor 105 in correspondence with the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”, the CPU 201 registers (stores in the predetermined area of the internal memory 207) the dance animation information corresponding to the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”. In the same manner as the table of FIG. 26, in the ROM 51 or the internal memory 207, a table is prepared in order to associate the dance animation information with the note numbers (refer to FIG. 20(a) through FIG. 20(c)) for controlling the display of guides. However, the note numbers indicating the same guide direction (for example, the note numbers “55” and “67”) are associated with the same dance animation information.
  • The dance animation information is designed in the same manner as the animation information of FIG. 26 and contains the storage location information of the dance animation table, the display coordinates of the dance object 106 on the screen 91, and the display timing information of the dance object 106. Also, the dance animation table is designed in the same manner as the animation table of FIG. 27 and provided as a table in which are associated the storage location information of dance animation image data (a plurality of image object data items of the dance object 106 arranged in a time series), the reference numbers of the dance objects 106 for use in performing animation arranged in a time series, information indicative of how many frames (the number of duration frames) the dance object is continuously displayed, the size of the dance object 106, the information on the color palette, the information on the depth value and the size of the sprite. The dance animation image data is pixel pattern data.
  • Meanwhile, the second musical score data 306 of FIG. 17 may contain a note number indicating that a high speed dance animation is to be performed and a note number indicating that a low speed dance animation is to be performed. In this case, a high speed dance animation table and a low speed dance animation table are prepared respectively as the dance animation table. Also, in this case, there is a table prepared in the ROM 51 or the internal memory 207 in order to associate the dance animation information with the note numbers for controlling the display of guides and the note numbers for controlling the speed of dance. In the same manner, a dance animation table is prepared for each dance speed.
  • Also, in the case where it is determined that the player 94 correctly manipulates the cursor 105 in correspondence with the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”, the CPU 201 registers the storage location information (“address 3” in the case of the example of FIG. 27) of the animation table of the position guide “G” which is used when the player 94 successfully manipulates the cursor 105.
  • Furthermore, in the case where it is determined that the player 94 correctly manipulates the cursor 105 in correspondence with the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”, the CPU 201 registers (stores in the predetermined area of the internal memory 207) the animation information for the evaluation objects 107 to 109. This animation information is designed in the same manner as the animation information of FIG. 26. Accordingly, this animation information contains the storage location information of the animation table for the evaluation objects 107 to 109. This animation table is designed in the same manner as the animation table of FIG. 27.
  • Still further, in the case where it is determined that the player 94 correctly manipulates the cursor 105 in correspondence with the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”, the CPU 201 performs the scrolling corresponding to the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”. More specifically speaking, the CPU 201 changes the center position of the background screen 140 to scroll the background 110 (refer to FIG. 16(a) and FIG. 16(b)) in correspondence with the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”, and the note number for controlling the dance speed. Still further, in the case where the background screen 140 is scrolled in the lateral direction, the CPU 201 changes the data in the array elements PA and the array elements CA corresponding thereto.
  • Incidentally, FIG. 28 is a timing diagram for explaining the relationship among the first musical score data 305, the second musical score data 306, the position guide “G”, the direction guide “g”, the judgment of manipulation and the dance animation. In FIG. 28, each thick line indicates the execution period for which the process continues, the filled circle at the left end of each thick line indicates the starting point of the process, and the filled circle at the right end of each thick line indicates the end of the process.
  • As illustrated in FIG. 28, the time point of starting to read the second musical score data 306 is set earlier than the time points T1, T2 . . . of starting to read the first musical score data 305 by a predetermined time “t”. Accordingly, the display of the direction guide “g” starts the predetermined time “t” before the corresponding note number of the first musical score data 305 is read at the corresponding time point of T1 to T3, and continues until this time point at which the corresponding note number of the first musical score data 305 is read (for example, for 60 frames). Likewise, the animation of the position guide “G” starts the predetermined time “t” before the corresponding note number of the first musical score data 305 is read, and continues until a short time after the corresponding note number of the first musical score data 305 is read at the corresponding time point of T1 to T3.
  • The CPU 201 starts the process of determining whether or not the cursor 105 is correctly manipulated in correspondence with the direction guide “g” and the position guide “G” a predetermined period (for example, 30 frames) after starting the display of the direction guide “g”, and completes the process at the corresponding time point of T1 to T3 in which the corresponding note number of the first musical score data 305 is read. Then, if it is determined that the cursor 105 is correctly manipulated in correspondence with the direction guide “g” and the position guide “G”, the CPU 201 registers the dance animation information at the time point when the determination period ends. Accordingly, in this case, dance animation is performed on the basis of the dance animation information which is registered.
  • Meanwhile, the reason why the time point of starting to read the second musical score data 306 is set earlier than the time points T1, T2 . . . of starting to read the first musical score data 305 by the predetermined time “t” is as follows. Namely, since the player 94 starts manipulating the operation article 150 only after the guidance by the direction guide “g” and the position guide “G” has started, the direction guide “g” and the position guide “G” are displayed ahead of the timing of the music in order to compensate for this time lag.
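  • The lead time “t” can be illustrated with two stand-by counters ticking once per frame, in the manner of steps S11 and S13 of FIG. 30; the concrete frame counts below are assumptions.

      T_FRAMES = 30   # assumed lead time "t", in frames

      guide_standby, music_standby = 0, T_FRAMES   # cf. steps S11 and S13
      events = []
      for frame in range(40):
          if guide_standby == 0:
              events.append((frame, "read second score / start guide"))
              guide_standby = 60        # assumed wait until the next guide command
          else:
              guide_standby -= 1
          if music_standby == 0:
              events.append((frame, "read first score / note on"))
              music_standby = 60        # assumed wait until the next music command
          else:
              music_standby -= 1

      print(events[:2])   # the guide at frame 0 precedes its note at frame T_FRAMES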
  • The timing of displaying the path guide “rg” is provided in the same manner as the timing of displaying the direction guide “g”. However, for example, the determination of whether or not the cursor 105 is correctly manipulated in correspondence with the path guide “rg” is performed between the start and the end of the guidance by the path guide “rg” (for example, for 60 frames).
  • [Image Display Process] The CPU 201 provides the graphics processor 202 of FIG. 7 with the information required for drawing during the vertical blanking period on the basis of the information registered by the cursor control process, the guide control process and the dance control process. Then, the graphics processor 202 generates a video signal on the basis of the information as given, and outputs it to the video signal output terminal 47. By this process, the game screen including the position guide “G”, the background 110 and so forth is displayed on the screen 91 of the television monitor 90. More specific description is as follows.
  • The CPU 201 calculates the display coordinates of the respective sprites forming the cursor 105 on the basis of the coordinate information (the coordinate information of the target point of the operation article 150) which is registered by the cursor control process. Then, the CPU 201 provides the graphics processor 202 with the display coordinate information, the color palette information, the depth value, the size information and the pixel pattern data storage location information of the respective sprites for forming the cursor 105. The graphics processor 202 generates the image signal of the cursor 105 on the basis of the respective information, and outputs it to the video signal output terminal 47.
  • Also, the CPU 201 acquires the size information of the object for forming the animation image of each guide (the position guide “G”, the direction guide “g” or the path guide “rg”) and the size information of the sprite for forming the object with reference to the animation table on the basis of the animation table storage location information contained in the animation information which is registered by the guide control process. Then, the CPU 201 calculates the display coordinates of the respective sprites for forming the object on the basis of the above respective information and the display coordinate information contained in the animation information as registered. Furthermore, the CPU 201 calculates the pixel pattern data storage location information of the respective sprites for forming the object on the basis of the reference number of the position guide “G” to be displayed next, the size information of the object and sprite contained in the animation table, and the animation image data storage location information of the position guide “G” contained in the animation table.
  • Still further, the CPU 201 provides the graphics processor 202 with the color palette information, the depth value and the size information of the respective sprites for forming the position guide “G” together with the pixel pattern data storage location information and the display coordinate information of the respective sprites with reference to the animation table. In this case, the CPU 201 provides the graphics processor 202 with the above respective information on the basis of the display timing information of the position guide “G” contained in the animation information as registered and the information on the number of duration frames of the animation table.
  • For the direction guide “g” and the path guide “rg”, the information to be given to the graphics processor 202 by the CPU 201 has a similar content and is acquired in a similar manner as for the position guide “G”. However, the direction guides “g1” to “g4” and the path guides “rg1” to “rg10” are sequentially displayed at a plurality of positions designated by the display coordinate information contained in the animation information, at the timing designated by the display timing information contained in the animation information. Accordingly, when starting to display each of the direction guides “g1” to “g4” and each of the path guides “rg1” to “rg10”, the CPU 201 provides the information on that guide to the graphics processor 202 with reference to the display coordinate information and the display timing information contained in the animation information as registered.
  • The graphics processor 202 generates the image signals of the guides (the position guide “G”, the direction guide “g”, the path guide “rg”) on the basis of the above information which is given as described above, and outputs them to the video signal output terminal 47.
  • Also, the CPU 201 acquires the size information of the dance object 106 for forming the dance animation image and the size information of the sprite for forming the dance object 106 with reference to the dance animation table on the basis of the dance animation table storage location information contained in the dance animation information which is registered by the dance control process. Then, the CPU 201 calculates the display coordinates of the respective sprites for forming the dance object 106 on the basis of the above respective information and the display coordinate information contained in the dance animation information as registered. Furthermore, the CPU 201 calculates the pixel pattern data storage location information of the respective sprites for forming the dance object 106 on the basis of the reference number of the dance object 106 to be displayed next, the size information of the dance object 106 and the sprite contained in the dance animation table, and the dance animation image data storage location information contained in the dance animation table.
  • Still further, the CPU 201 provides the graphics processor 202 with the color palette information, the depth value and the size information of the respective sprites for forming the dance object 106 together with the pixel pattern data storage location information and the display coordinate information of the respective sprites with reference to the dance animation table. In this case, the CPU 201 provides the above respective information to the graphics processor 202 on the basis of the display timing information contained in the dance animation information as registered and the information on the number of duration frames of the dance animation table.
  • Still further, the CPU 201 acquires the information required for generating image signals on the basis of the animation information and the animation table for the evaluation objects 107 to 109 which are registered by the dance control process, and provides the information to the graphics processor 202. Incidentally, in this case, the information to be given to the graphics processor 202 by the CPU 201 has a similar content and is acquired in a similar manner as for the dance object 106.
  • The graphics processor 202 generates the image signals of the dance object 106 and the evaluation objects 107 to 109 on the basis of the above information which is given as described above, and outputs them to the video signal output terminal 47.
  • [Music Playback] The playback of music is performed by an interrupt operation. The CPU 201 reads and interprets the music control information of FIG. 18 while incrementing the musical score data pointer for music. Incidentally, the musical score data pointer for music is a pointer pointing to the position of the first musical score data 305 from which data is read.
  • Then, if the command contained in the music control information as read is “Note On”, the CPU 201 provides the sound processor 203 with the head address at which the wave data is stored in accordance with the frequency of sound vibration (pitch) designated by the note number contained in the music control information and the instrument (tone quality) designated by the instrument designation information. Furthermore, if the command contained in the music control information as read is “Note On”, the CPU 201 provides the sound processor 203 with the head address at which the required envelope data is stored. Still further, if the command contained in the music control information as read is “Note On”, the CPU 201 provides the sound processor 203 with pitch control information corresponding to the frequency of sound vibration (pitch) designated by the note number contained in the music control information, and volume information contained in the music control information.
  • In what follows, the pitch control information will be explained. The pitch control information is used to perform the pitch conversion by changing the frequency of reading the wave data. Namely, the sound processor 203 periodically reads the pitch control information at a certain interval and accumulates the pitch control information. The sound processor 203 then processes this result of accumulation to obtain the address pointer to the wave data. Accordingly, if the pitch control information is set to a large value, the address pointer is quickly incremented by the large value to raise the frequency of the wave data. Conversely, if the pitch control information is set to a small value, the address pointer is slowly incremented by the small value to lower the frequency of the wave data. In this way, the sound processor 203 performs the pitch conversion of wave data.
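  • This is, in effect, a phase-accumulator pitch converter. A minimal sketch follows; the table length and the control values are arbitrary.

      import math

      # One stored cycle of wave data (a sine table, for illustration).
      WAVE = [math.sin(2 * math.pi * i / 64) for i in range(64)]

      def read_samples(pitch_control, n):
          """Read n samples, advancing the wave address pointer by pitch_control."""
          acc, out = 0.0, []
          for _ in range(n):
              out.append(WAVE[int(acc) % len(WAVE)])   # integer part indexes the wave
              acc += pitch_control                      # accumulate the control value
          return out

      one_cycle = read_samples(1.0, 64)   # nominal pitch
      octave_up = read_samples(2.0, 64)   # double read rate -> twice the frequency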
  • Next, the sound processor 203 reads the wave data stored in the location pointed to by the head address as given from the ROM 51, while incrementing the address pointer on the basis of the pitch control information as given. Then, the sound processor 203 generates an audio signal by multiplying the wave data, which is successively read, by the envelope data and the volume data. In this way, an audio signal having the tone quality of the musical instrument, the frequency of sound vibration (pitch) and the sound volume which are designated by the first musical score data 305 is generated and output to the audio signal output terminal 49.
  • On the other hand, the CPU 201 manages the gate times contained in the music control information as read. Accordingly, the CPU 201 outputs an instruction to the sound processor 203 in order that, when a gate time elapses, the output of the corresponding musical tone is terminated. In response to this, the sound processor 203 terminates the output of the corresponding musical tone as designated.
  • Music is played back as described above on the basis of the first musical score data 305 and output through a speaker (not shown in the figure) of the television monitor 90.
  • Next, the entire process flow of the music game apparatus 1 of FIG. 1 will be explained with reference to a flow chart.
  • FIG. 29 is a flow chart showing the entire process flow of the music game apparatus 1 of FIG. 1. As illustrated in FIG. 29, the CPU 201 performs the initial settings of the system in step S1. In step S2, the CPU 201 calculates the state information of the operation article 150. In step S3, the CPU 201 performs the game process on the basis of the state information of the operation article 150 as calculated in step S2. In step S4, the CPU 201 determines whether or not “M” is smaller than a predetermined value “K”. If “M” is greater than or equal to the predetermined value “K”, the CPU 201 proceeds to step S5, in which “0” is assigned to “M”, and proceeds to step S6. On the other hand, if “M” is smaller than the predetermined value “K”, the CPU 201 proceeds from step S4 to step S6. This value “M” will be explained in the following description.
  • In step S6, the CPU 201 determines whether or not it is waiting for the video system synchronous interrupt. The CPU 201 provides the graphics processor 202 with the image information for updating the display screen of the television monitor 90 after the vertical blanking period starts (step S7). Accordingly, after the process necessary for updating the display screen is completed, the process is halted until the next video system synchronous interrupt is issued. If “YES” is determined in step S6, i.e., while waiting for a video system synchronous interrupt (while there is no video system synchronous interrupt), the same step S6 is repeated. Conversely, if “NO” is determined in step S6, i.e., if the state of waiting for a video system synchronous interrupt is exited (if there is a video system synchronous interrupt), the process proceeds to step S7. In step S7, the CPU 201 provides the graphics processor 202 with the image information required for generating the game screen (refer to FIG. 11 through FIG. 13) during the vertical blanking period on the basis of the game process in step S3.
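  • The overall loop of FIG. 29 might be sketched as follows. The helpers and the wrap-around value “K” are placeholders; a real implementation would block on the hardware interrupt rather than on a stub.

      K, M = 60, 0   # assumed wrap-around value for the target point counter M

      def calculate_state():            # step S2 (stub); the real process also
          global M                      # increments M while extracting the
          M += 1                        # target point (cf. step S86 of FIG. 38)
          return (0, 0)

      def game_process(state):          # step S3 (stub)
          pass

      def wait_for_vsync():             # step S6 (stub for the synchronous interrupt)
          pass

      def send_image_info_to_gpu():     # step S7 (stub)
          pass

      def main_loop(frames):
          global M
          for _ in range(frames):
              state = calculate_state()     # step S2
              game_process(state)           # step S3
              if M >= K:                    # steps S4 and S5
                  M = 0
              wait_for_vsync()              # step S6
              send_image_info_to_gpu()      # step S7

      main_loop(3)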
  • FIG. 30 is a flow chart showing the process flow for the initial settings of the system in step S1 of FIG. 29. As shown in FIG. 30, the CPU 201 initializes the musical score data pointer for guidance in step S10. In step S11, the CPU 201 sets an execution stand-by counter for guidance to “0”.
  • In step S12, the CPU 201 initializes the musical score data pointer for music. In step S13, the CPU 201 sets an execution stand-by counter for music to “t”.
  • In step S14, the CPU 201 performs the initial settings of the image sensor 43. In step S15, the CPU 201 initializes various flags and various counters.
  • In step S16, the CPU 201 sets the timer circuit 210 as an interrupt source for outputting sound. Incidentally, as an interrupt handler, the sound processor 203 performs a process to output sound from the speaker of the television monitor 90.
  • FIG. 31 is a flow chart showing the process flow for sensor initial settings in step S14 of FIG. 30. As shown in FIG. 31, in the initial step S20, the high speed processor 200 sets setting data to a command “CONF”. In this case, this command “CONF” is a command used to inform the image sensor 43 that the high speed processor 200 enters a configuration mode in which a command is transmitted to the image sensor 43. Then, in the next step S21, a command transmission process is performed.
  • FIG. 32 is a flow chart showing the command transmission process in step S21 of FIG. 31. As shown in FIG. 32, the high speed processor 200 sets register data (I/O port) to the setting data (the command “CONF” in the case of step S21) in the first step S30, and sets the register setting clock (I/O port) to a low level in the next step S31. Then, after waiting for a predetermined time in step S32, the register setting clock CLK is set to a high level in step S33. Then, after further waiting for another predetermined time in step S34, the register setting clock CLK is set to a low level again in step S35.
  • As has been discussed above, as illustrated in FIG. 33, the process of transmitting a command (or a command associated with data) can be performed by changing the level of the register setting clock CLK to a low level, then to a high level and again to a low level while waiting for the predetermined time before each change.
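  • A sketch of this bit-banged transmission follows; the port accessors and the settling time are placeholders standing in for the actual I/O ports of the high speed processor 200.

      import time

      SETTLE_S = 0.000_01   # assumed settling time before each clock edge

      def set_register_data(value):     # placeholder for the register data port
          pass

      def set_register_clock(level):    # placeholder for the register setting clock
          pass

      def transmit_command(setting_data):
          set_register_data(setting_data)   # step S30: put the setting data on the port
          set_register_clock(0)             # step S31: CLK low
          time.sleep(SETTLE_S)              # step S32: wait
          set_register_clock(1)             # step S33: CLK high
          time.sleep(SETTLE_S)              # step S34: wait
          set_register_clock(0)             # step S35: CLK low again

      transmit_command("CONF")              # enter the configuration mode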
  • Returning to FIG. 31, the explanation continues. In step S22, the pixel mode is set as well as the exposure time. In the case of the present embodiment, since the image sensor 43 is for example a CMOS image sensor of 32 pixels×32 pixels as described above, “0h” indicative of 32 pixels×32 pixels is loaded into a pixel mode register at a setting address “0”. In the next step S23, the high speed processor 200 performs a register setting process.
  • FIG. 34 is a flow chart showing the register setting process in step S23 of FIG. 31. As shown in FIG. 34, the high speed processor 200 sets the setting data to the command “MOV” associated with an address in the first step S40, and then performs the command transmission process in the next step S41 as explained above with reference to FIG. 32 to transmit the command. Next, the high speed processor 200 sets the setting data to the command “LD” associated with data in the next step S42, and then performs the command transmission process in the next step S43 to transmit the command. Then, the high speed processor 200 sets the setting data to the command “SET” in step S44, and transmits the command in the next step S45. Incidentally, the command “MOV” is a command for transmitting the address of a control register; the command “LD” is a command for transmitting data; and the command “SET” is a command for actually loading the data into the address. Incidentally, the above process is repeated if there are a plurality of control registers to be set.
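  • The MOV/LD/SET sequence can be sketched as a three-command write; here the transmit function is simply print, standing in for the command transmission process of FIG. 32.

      def set_image_sensor_register(address, data, transmit=print):
          transmit(("MOV", address))   # steps S40-S41: send the register address
          transmit(("LD", data))       # steps S42-S43: send the data
          transmit(("SET",))           # steps S44-S45: load the data into the address

      # Steps S24-S27: maximum exposure time "FFh", one nibble per register.
      set_image_sensor_register(1, 0xF)   # low nibble of the exposure time
      set_image_sensor_register(2, 0xF)   # high nibble of the exposure time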
  • Returning to FIG. 31, the explanation continues. In step S24, the setting address is set to “1” (the address of the low nibble of an exposure time setting register), and “Fh” is loaded into the low nibble of the exposure time setting register as the low nibble data of “FFh” indicative of the maximum exposure time. Then, in step S25, the register setting process of FIG. 34 is performed. In the same manner, in step S26, the setting address is set to “2” (the address of the high nibble of the exposure time setting register), and “Fh” is loaded into the high nibble of the exposure time setting register as the high nibble data of “FFh” indicative of the maximum exposure time, and the register setting process is performed in step S27.
  • Thereafter, the setting data is set to a command “RUN” in step S28 for indicating the completion of initialization and having the image sensor 43 start outputting data, followed by step S29 in which the command “RUN” is transmitted. As has been discussed above, the sensor initialization process is performed in step S14 of FIG. 30. However, the specific examples as illustrated in FIG. 31 to FIG. 34 may be modified in accordance with the specification of the image sensor 43 actually employed.
  • FIG. 35 is a flow chart showing the process of calculating the state information in step S2 of FIG. 29. As shown in FIG. 35, the CPU 201 acquires digital pixel data from the ADC 208 in step S50. This digital pixel data is data obtained by converting the analog pixel data, which is transmitted from the image sensor 43, into digital data by the ADC 208.
  • In step S51, the process of extracting a target point is performed. More specifically speaking, the CPU 201 acquires differential data by calculating the difference between the pixel data acquired when the infrared light emitting diodes 15 are turned on and the pixel data acquired when the infrared light emitting diodes 15 are turned off. Then, the CPU 201 finds the maximum value of the differential data and compares it with the predetermined threshold value “Th”. Furthermore, if the maximum value of the differential data is greater than the predetermined threshold value “Th”, the CPU 201 converts the coordinates of the pixel having the differential data corresponding to the maximum value into the coordinates on the screen 91 of the television monitor 90 and sets the coordinates of the target point of the operation article 150 to the coordinates as converted.
  • In step S52, the CPU 201 determines which of the areas a1 to a4 in FIG. 24 includes the target point of the operation article 150, and stores the result of determination in the array element J1[M].
  • In step S53, the CPU 201 determines which of the areas A1 to A4 in FIG. 25 includes the target point of the operation article 150, and stores the result of determination in the array element J2[M].
  • FIG. 36 is a flow chart showing the process flow of acquiring a pixel data group in step S50 of FIG. 35. As shown in FIG. 36, the CPU 201 sets “X” to “−1” and “Y” to “0” as element indices of a pixel data array in the first step S60. In the case of the present embodiment, while the pixel data array is a two-dimensional array in which X=0 to 31 and Y=0 to 31, dummy data is output as the data of the initial pixel as described above so that the initial value of “X” is set to “−1”. In the next step S61, the process of acquiring pixel data is performed.
  • FIG. 37 is a flow chart showing the process flow of acquiring pixel data in step S61 of FIG. 36. As shown in FIG. 37, the CPU 201 checks the frame status flag signal FSF as input from the image sensor 43 in the initial step S70, and judges whether or not the rising edge thereof (from a low level to a high level) is detected in step S71. Then, if the rising edge of the frame status flag signal FSF is detected in step S71, in the next step S72, the CPU 201 instructs the ADC 208 to start the conversion of the analog pixel data input thereto into digital data. Thereafter, the pixel strobe signal PDS as input from the image sensor 43 is checked in step S73, and it is judged whether or not the rising edge of the pixel strobe signal PDS from a low level to a high level is detected in step S74.
  • If “YES” is determined in step S74, the CPU 201 determines in step S75 whether or not X=−1, i.e., whether or not it is the initial pixel. As has been discussed above, since the initial pixel of each line is set as a dummy pixel, if “YES” is determined in this step S75, the current pixel data is not acquired, but the element index “X” is incremented in the following step S77.
  • If “NO” is determined in step S75, since it is the second or later pixel data constructing the line, the current pixel data is acquired and saved in a temporary register (not shown in the figure) in steps S76 and S78. Thereafter, the process proceeds to step S62 of FIG. 36.
  • In step S62 of FIG. 36, the pixel data as saved in the temporary register is assigned to a pixel data element P[Y][X].
  • In the following step S63, “X” is incremented if “X” is smaller than “32”, and the process from step S61 to step S63 is repeatedly performed. If “X” is equal to “32”, i.e., if the acquisition process of pixel data reaches the end of the current line, “X” is set to “−1” in the following step S65, “Y” is incremented in step S66, and the acquisition process of pixel data is repeated from the top of the next line.
  • If “Y” is equal to “32” in step S67, i.e., if the acquisition process of pixel data reaches the last pixel data array element P[Y][X], the process proceeds to step S51 of FIG. 35.
  • FIG. 38 is a flow chart showing the process flow of extracting a target point in step S51 of FIG. 35. As shown in FIG. 38, in step S80, the CPU 201 acquires differential data by calculating the difference between the pixel data acquired from the image sensor 43 when the infrared light emitting diodes 15 are turned on and the pixel data acquired from the image sensor 43 when the infrared light emitting diodes 15 are turned off. In step S81, the CPU 201 assigns the differential data as calculated to the array elements Dif[X][Y]. In this case, since the image sensor 43 of 32 pixels×32 pixels is used in the case of the present embodiment, X=0 to 31 and Y=0 to 31.
  • In step S82, the CPU 201 scans all the array elements Dif[X][Y]. In step S83, the CPU 201 finds the maximum value of all the array elements Dif[X][Y]. If the maximum value is greater than the predetermined threshold value “Th”, the CPU 201 proceeds to step S85, and if the maximum value is less than or equal to the predetermined threshold value “Th”, the CPU 201 proceeds to step S4 of FIG. 29 (step S84).
  • In step S85, the CPU 201 calculates the coordinates (Xc, Yc) of the target point of the operation article 150 on the basis of the coordinates corresponding to the maximum value. In step S86, the CPU 201 increments the value of the count “M” by one (M=M+1).
  • In step S87, the CPU 201 converts the coordinates (Xc, Yc) of the target point on the image sensor 43 into the coordinates (xc, yc) on the screen 91 of the television monitor 90. In step S88, the CPU 201 assigns “xc” to the array element Px[M] as the x-coordinate of the M-th target point, and “yc” to the array element Py[M] as the y-coordinate of the M-th target point.
  • FIG. 39 is a flow chart showing the process flow of calculating the coordinates of a target point in step S85 of FIG. 38. As shown in FIG. 39, in step S100, the CPU 201 assigns the X-coordinate and the Y-coordinate, which are obtained in step S83 in correspondence with the maximum value, respectively to “m” and “n”. In step S101, the CPU 201 increments “m” by one (m=m+1). If the differential data Dif[m][n] is greater than the predetermined threshold value “Th”, the CPU 201 proceeds to step S103, otherwise proceeds to step S104 (step S102). In step S103, the CPU 201 assigns the current value of “m” to “mr”. The endmost X-coordinate of the differential data greater than the predetermined threshold value “Th” is obtained by repeating steps S101 to S103 while scanning the X-axis from the maximum value position in the positive direction.
  • In step S104, the CPU 201 assigns to “m” the X-coordinate which is obtained in step S83 in correspondence with the maximum value. In step S105, the CPU 201 decrements “m” by one. If the differential data Dif[m][n] is greater than the predetermined threshold value “Th”, the CPU 201 proceeds to step S107, otherwise proceeds to step S108 (step S106). In step S107, the CPU 201 assigns the current value of “m” to “ml”. The endmost X-coordinate of the differential data greater than the predetermined threshold value “Th” is obtained by repeating steps S105 to S107 while scanning the X-axis from the maximum value position in the negative direction.
  • In step S108, the CPU 201 calculates the center coordinate between the X-coordinate “mr” and the X-coordinate “ml”, and assigns it to the X-coordinate (Xc) of the target point. In step S109, the CPU 201 assigns “Xc” which is obtained in step S108 and the Y-coordinate which is obtained in step S83 in correspondence with the maximum value, respectively to “m” and “n”. In step S110, the CPU 201 increments “n” by one (n=n+1). If the differential data Dif[m][n] is greater than the predetermined threshold value “Th”, the CPU 201 proceeds to step S112, otherwise proceeds to step S113 (step S111). In step S112, the CPU 201 assigns the current value of “n” to “md”. The endmost Y-coordinate of the differential data greater than the predetermined threshold value “Th” is obtained by repeating steps S110 to S112 while scanning the Y-axis from the maximum value position in the positive direction.
  • In step S113, the CPU 201 assigns to “n” the Y-coordinate which is obtained in step S83 in correspondence with the maximum value. In step S114, the CPU 201 decrements “n” by one. If the differential data Dif[m][n] is greater than the predetermined threshold value “Th”, the CPU 201 proceeds to step S116, otherwise proceeds to step S117 (step S115). In step S116, the CPU 201 assigns the current value of “n” to “mu”. The endmost Y-coordinate of the differential data greater than the predetermined threshold value “Th” is obtained by repeating steps S114 to S116 while scanning the Y-axis from the maximum value position in the negative direction.
  • In step S117, the CPU 201 calculates the center coordinate between the Y-coordinate “md” and the Y-coordinate “mu”, and assigns it to the Y-coordinate (Yc) of the target point. As has been described above, the coordinates (Xc, Yc) of the target point of the operation article 150 are calculated.
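  • A compact sketch of this extent-scanning calculation follows. The threshold and the test blob are arbitrary, and the four scan results correspond to the patent's intermediate variables mr, ml, md and mu; using the rounded Xc for the vertical scans is a simplification of steps S109 onward.

      TH = 50   # assumed threshold

      def target_point(dif, x_max, y_max, th=TH):
          """Center of the above-threshold extents around the brightest pixel."""
          w, h = len(dif), len(dif[0])

          def last_above(x, y, dx, dy):
              # Walk from (x, y) in direction (dx, dy) while the data stays above th.
              while 0 <= x + dx < w and 0 <= y + dy < h and dif[x + dx][y + dy] > th:
                  x, y = x + dx, y + dy
              return x, y

          mr, _ = last_above(x_max, y_max, 1, 0)     # steps S101-S103
          ml, _ = last_above(x_max, y_max, -1, 0)    # steps S105-S107
          xc = (mr + ml) / 2                         # step S108
          _, md = last_above(int(xc), y_max, 0, 1)   # steps S110-S112
          _, mu = last_above(int(xc), y_max, 0, -1)  # steps S114-S116
          return xc, (md + mu) / 2                   # step S117

      dif = [[0] * 32 for _ in range(32)]
      for x in range(10, 15):
          for y in range(8, 13):
              dif[x][y] = 100                        # a bright 5x5 blob
      print(target_point(dif, 12, 10))               # approximately (12.0, 10.0)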
  • FIG. 40 is a flow chart showing the game process flow in step S3 of FIG. 29. As shown in FIG. 40, in step S120, the CPU 201 checks a music end flag (refer to step S196 of FIG. 43); if the music has ended, the game process is terminated, and conversely, if the music has not ended yet, the process proceeds to step S121.
  • In step S121, the CPU 201 registers the x-coordinate Px[M] and the y-coordinate Py[M] of the target point of the operation article 150 as the display coordinates of the cursor 105 on the screen 91.
  • The CPU 201 repeats the process between step S122 and step S144 twice. In this case, “j” represents a guidance display number “J” (refer to FIG. 43).
  • In step S123, the CPU 201 checks a guidance start flag GF[j] (refer to step S194 of FIG. 43). If the guidance start flag GF[j] is turned on, the CPU 201 proceeds to step S125, and if it is turned off, the CPU 201 proceeds to step S144 (step S124). In step S125, the CPU 201 checks a frame counter C[j]. If the frame counter C[j] is greater than “0”, the CPU 201 proceeds to step S128; conversely, if the frame counter C[j] is equal to “0”, the CPU 201 proceeds to step S127 (step S126). In step S127, in accordance with the note number NN[j], the CPU 201 registers the animation information of the direction guide “g” or the path guide “rg” together with the animation information of the position guide “G”. The animation information is registered only when the frame counter C[j] is “0” because, once the animation information is registered, the animation is performed in accordance with the registered information; therefore, the registration is needed only when the animation is started.
  • In step S128, the CPU 201 checks the note number NN[j], and if it is the note number designating the turn of the cursor 105 (refer to FIG. 20(b) and FIG. 20(c)), the process proceeds to step S131; otherwise (refer to FIG. 20(a)), the process proceeds to step S129. In step S129, the CPU 201 checks the frame counter C[j]. If the frame counter C[j] is greater than or equal to the predetermined number “f1” of frames, the CPU 201 proceeds to step S131, otherwise proceeds to step S141 (step S130). In step S131, the CPU 201 determines whether or not the cursor 105 is correctly manipulated in correspondence with the guides (the position guide “G”/the direction guide “g”/the path guide “rg”) (success determination).
  • Incidentally, as is apparent from step S128 and step S131, in the case where the note number NN[j] designates the turn of the cursor 105, the success determination of the manipulation of the cursor 105 is performed just after the display of the position guide “G” and the path guide “rg” is started (the frame counter C[j] is “0”), irrespective of the value of the frame counter C[j]. On the other hand, in the case where the note number NN[j] is a note number other than the note number designating the turn of the cursor 105, the success determination of the manipulation of the cursor 105 is performed a predetermined number “f1” of frames (for example, 30 frames) after the display of the position guide “G” and the direction guide “g” is started (the frame counter C[j] is “0”) (refer to FIG. 28).
  • By the way, as a result of the determination in step S131, if the manipulation of the cursor 105 has succeeded, the CPU 201 proceeds to step S133, otherwise proceeds to step S140 (step S132). In step S133, the CPU 201 registers the dance animation information with reference to the note number NN[j] and a dance speed flag DF (refer to step S193, step S190 and step S192 of FIG. 43). Also, in the case where the manipulation is successful, the CPU 201 changes the center position of the background screen 140 and modifies the corresponding data of the array elements PA and the array elements CA with reference to the note number NN[j] and the dance speed flag DF in order to scroll the background 110. Furthermore, the CPU 201 registers the storage location information of the animation table of the position guide “G” which is used when the manipulation has succeeded.
  • In step S134, the CPU 201 checks the note number NN[j], and if it is the note number designating the turn of the cursor 105, the process proceeds to step S137, otherwise proceeds to step S135. In step S135, the CPU 201 checks the frame counter C[j]. If the frame counter C[j] is greater than or equal to a predetermined number “f2” of frames, the CPU 201 proceeds to step S137, otherwise proceeds to step S138 (step S136). In step S137, the CPU 201 adds “3” to a score “S”. On the other hand, in step S138, “1” is added to the score “S”.
  • Incidentally, the following is the reason why “3” is added to the score “S” in step S137 while “1” is added in step S138. In the case where the cursor 105 is located in the area of the position guide “G” within a predetermined period (for example, 10 frames) from the time point the predetermined number “f2” of frames (for example, 50 frames) after the display of the position guide “G” and the direction guide “g” is started (the frame counter C[j] is “0”), it is determined that the cursor 105 is manipulated with the best timing, so that “3” is added. On the other hand, in the case where the cursor 105 is located in the area of the position guide “G” after the predetermined number “f1” of frames but before the predetermined number “f2” of frames, it is determined that the cursor 105 is manipulated in an ordinarily successful manner, so that “1” is added. Also, when the manipulation is performed in correspondence with the position guide “G” and the path guide “rg” (the guide designating the turn of the cursor 105), “3” is equally added to the score “S”.
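  • The scoring rule might be condensed as follows. The frame counts f1 = 30 and f2 = 50 are the examples from the text; treating every success at or after f2 as best-timed is a simplification, since the determination ends at f3 in any case.

      F1, F2 = 30, 50   # example frame counts from the text

      def points_for_success(frame_counter, is_turn_guide):
          """Score for a successful manipulation at the given frame count."""
          assert is_turn_guide or frame_counter >= F1   # determination starts at f1
          if is_turn_guide or frame_counter >= F2:
              return 3   # a turn guide, or the best-timed window around f2
          return 1       # an ordinary success between f1 and f2

      score = 0
      score += points_for_success(55, False)   # best timing: +3
      score += points_for_success(35, False)   # ordinary success: +1
      score += points_for_success(0, True)     # turn of the cursor: +3
      print(score)                              # 7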
  • Next, in step S139, the CPU 201 checks the frame counter C[j]. If the frame counter C[j] is equal to a predetermined number “f3” (for example, 60 frames), the CPU 201 proceeds to step S142 otherwise proceeds to step S141 (step S140). In step S141, the CPU 201 increments the frame counter C[j] by one. On the other hand, in step S142, the CPU 201 sets the frame counter C[j] to “0”. In step S143, the CPU 201 turns off the guidance start flag GF[j]. Incidentally, the predetermined number “f3” is used to define the end of success determination.
  • FIG. 41 is a flow chart showing the interrupt process flow. As shown in FIG. 41, in step S150, the CPU 201 performs the playback of music. In step S151, the CPU 201 performs the process of registering the guides (the position guide “G”, the direction guide “g” and the path guide “rg”).
  • FIG. 42 is a flow chart showing the process flow of the playback of music in step S150 of FIG. 41. As shown in FIG. 42, in step S160, the CPU 201 checks the execution stand-by counter for music. If the value of the execution stand-by counter for music is “0”, the process proceeds to step S162, conversely if it is not “0”, the process proceeds to step S170 (step S161). In step S170, the CPU 201 decrements the execution stand-by counter for music.
  • On the other hand, in step S162, the CPU 201 reads and interprets the command pointed to by the musical score data pointer for music. If the command is “Note On”, the process proceeds to step S164 (step S163). On the other hand, if the command is not “Note On”, i.e., “Waiting”, the process proceeds to step S165. In step S165, the CPU 201 sets a waiting time to the execution stand-by counter for music.
  • On the other hand, in step S164, the CPU 201 instructs the sound processor 203 to start outputting a sound corresponding to the note number which is read. In step S166, the CPU 201 increments the musical score data pointer for music.
  • In step S167, the CPU 201 checks the remaining sound outputting time corresponding to the note number of the sound being output. If the remaining sound outputting time is “0”, the process proceeds to step S169, otherwise proceeds to step S151 of FIG. 41 (step S168). In step S169, the CPU 201 instructs the sound processor 203 to perform the sound termination process for the note number whose remaining sound outputting time is “0”.
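  • A minimal per-interrupt sketch of this playback loop follows; the score data, gate times and waiting counts are invented for illustration.

      # First-score commands: ("note_on", note number, gate time) or ("waiting", frames).
      score_data = [("note_on", 55, 24), ("waiting", 12), ("note_on", 57, 24)]
      pointer, standby, sounding = 0, 0, {}

      def music_tick():
          global pointer, standby
          if standby > 0:                    # steps S160-S161 and S170
              standby -= 1
          elif pointer < len(score_data):
              cmd = score_data[pointer]      # step S162: read and interpret
              if cmd[0] == "note_on":        # steps S163-S164: start the tone
                  sounding[cmd[1]] = cmd[2]  # note number -> remaining gate time
              else:                          # "waiting": step S165
                  standby = cmd[1]
              pointer += 1                   # step S166: advance the pointer
          for nn in [n for n, t in sounding.items() if t == 0]:
              del sounding[nn]               # steps S167-S169: terminate the tone
          for nn in sounding:
              sounding[nn] -= 1              # the CPU manages the gate times

      for _ in range(5):
          music_tick()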
  • FIG. 43 is a flow chart showing the process flow of registering guides in step S151 of FIG. 41. As shown in FIG. 43, in step S180, the CPU 201 checks the execution stand-by counter for guide. If the value of the execution stand-by counter for guide is “0”, the process proceeds to step S182, conversely if it is not “0”, the process proceeds to step S198 (step S181). In step S198, the CPU 201 decrements the execution stand-by counter for guide.
  • On the other hand, in step S182, the CPU 201 reads and interprets the command pointed to by the musical score data pointer for guide. If the command is “Note On”, the CPU 201 proceeds to step S184 (step S183). On the other hand, if the command is not “Note On”, i.e., “Waiting”, the CPU 201 proceeds to step S197. In step S197, the CPU 201 sets the execution stand-by counter for guide to a waiting time.
  • If the note number designates the end of music, the CPU 201 proceeds to step S196 otherwise proceeds to step S185 (step S184). In step S196, the CPU 201 turns on the music end flag.
  • On the other hand, if the note number designates the start of music, the CPU 201 proceeds to step S195, otherwise proceeds to step S186 (step S185). If the guidance display number “J” is “1”, the CPU 201 sets the guidance display number “J” to “0” in step S188; conversely, if the guidance display number “J” is not “1” (i.e., it is “0”), the CPU 201 sets the guidance display number “J” to “1” in step S187. Since the guidance of a certain position guide “G” and the guidance of another position guide “G” are started at different timings but can continue overlappingly for a certain period, the guidance display number “J” is used to perform the game process in step S3 of FIG. 29.
  • By the way, if the note number designates that a high speed dance animation is to be performed, the CPU 201 proceeds to step S190 otherwise proceeds to step S191 (step S189). In step S190, the CPU 201 sets the dance speed flag DF to “1” (a high speed dance animation).
  • On the other hand, if the note number designates that a low speed dance animation is to be performed, the CPU 201 proceeds to step S192 otherwise proceeds to step S193 (step S191). In step S192, the CPU 201 sets the dance speed flag DF to “0” (a low speed dance animation).
  • By the way, in the case where the note number is none of the note number designating the end of music, the note number designating the start of music, the note number designating a high speed dance animation and the note number designating a low speed dance animation, such a note number is a note number which designates a type of guide (FIG. 20(a) through FIG. 20(c)); the CPU 201 therefore assigns the note number to the array element NN[J] in step S193. In step S194, the CPU 201 turns on the guidance start flag GF[J].
  • In step S195, the CPU 201 increments the musical score data pointer for guide.
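  • The dispatch of FIG. 43 can be sketched as follows; the special note-number values used here are placeholders, since the patent only states that such numbers exist.

      END, START, FAST, SLOW = 128, 129, 130, 131   # placeholder special numbers

      music_end, dance_fast, J = False, False, 0
      NN, GF = [None, None], [False, False]

      def dispatch(note_number):
          global music_end, dance_fast, J
          if note_number == END:        # step S196: turn on the music end flag
              music_end = True
          elif note_number == START:    # start of music: nothing more to register
              pass
          elif note_number == FAST:     # step S190: high speed dance animation
              dance_fast = True
          elif note_number == SLOW:     # step S192: low speed dance animation
              dance_fast = False
          else:                         # a guide-type note number
              J = 0 if J == 1 else 1    # steps S186-S188: toggle the slot
              NN[J] = note_number       # step S193
              GF[J] = True              # step S194: guidance start flag on

      dispatch(55)   # the guide "55" is registered in slot NN[1], GF[1] turned on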
  • Next is the explanation of another example of the direction guide “g”. FIG. 44 is a view for showing an example of the game screen in which another example of the direction guide “g” is applied. As shown in FIG. 44, a direction guide “g20” is displayed in the form of a belt extending from the position guide “G1” toward the position guide “G2”. This direction guide “g20” grows from the position guide “G1” to the position guide “G2” as time passes. The direction in which the cursor 105 is to be manipulated is guided in terms of the direction in which this direction guide “g20” grows. Also, if a predetermined time period after the direction guide “g20” reaches the position guide “G2” is used as the period for performing the success determination of manipulating the cursor 105, it is possible to guide the manipulation timing of the cursor 105 by the direction guide “g20”. Incidentally, it can also be said that the direction guide “g20” is represented by gradually changing the color of the path from the position guide “G1” to the position guide “G2”.
  • FIG. 45 is a view for showing an example of the game screen in which a further example of the direction guide “g” is applied. As shown in FIG. 45, a direction guide “g30” is displayed on the game screen between one position guide “G” and another position guide “G”. This direction guide “g30” consists of five partial paths “g31” to “g35”. Then, in the case of the example of FIG. 45, the five partial paths “g31” to “g35” change color sequentially from the position guide “G1” toward the position guide “G2”. The change in color is illustrated by hatching. By this method, the manipulation direction of the cursor 105 can be guided in terms of the direction in which the partial paths “g31” to “g35” change color in sequence. Also, in the case of this example, if a predetermined time period after changing the color of the partial path “g35” adjacent to the position guide “G2” guiding the destination position of the cursor 105 is used as the period for performing the success determination of manipulating the cursor 105, it is possible to guide the manipulation timing of the cursor 105 by the direction guide “g30”.
  • FIG. 46 is a view for showing an example of the game screen in which a still further example of the direction guide “g” is applied. As shown in FIG. 46, a direction guide “g40” is displayed on the game screen between the position guide “G1” and the position guide “G2”. This direction guide “g40” moves from the position guide “G1” to the position guide “G2” as time passes. The direction in which the cursor 105 is to be moved is guided in terms of the direction in which this direction guide “g40” moves. Also, if a predetermined time period after the direction guide “g40” reaches the position guide “G2” is used as the period for performing the success determination of manipulating the cursor 105, it is possible to guide the manipulation timing of the cursor 105 by the direction guide “g40”.
  • As has been discussed above, in the case of the present embodiment, if the cursor 105 is correctly manipulated in correspondence with the guides (the position guide “G”, the direction guide “g” and the path guide “rg”), the display of images (the dance object 106 and the background 110 in the above example) is controlled in accordance with the guidance by the guides. In this case, since the cursor 105 is correctly manipulated in correspondence with the guidance by the guides, the display of images is controlled in accordance with the manipulation of the cursor 105. In other words, since the cursor 105 moves in association with the operation article 150, the display of the images is controlled in accordance with the manipulation of the operation article 150. Also, the image of the operation article 150, which is intermittently lighted by the stroboscope, is captured by the imaging unit 13 in order to obtain the state information of the operation article 150. Because of this, no circuit which is driven by a power supply need be provided within the operation article 150 for obtaining the state information of the operation article 150. Furthermore, this music game apparatus 1 serves to automatically play music.
  • As a result, while the music is played automatically without requiring any action by the player 94, the player 94 can, by manipulating the operation article 150 having a simple structure, enjoy together with the music the images which are displayed in synchronization with the manipulation of the operation article 150.
  • Also, since the timing of the guides is controlled on the basis of the music, the operation article 150 is manipulated in synchronization with the music as long as the player 94 manipulates the cursor 105 in correspondence with the guides. Accordingly, the player 94 can enjoy manipulating the operation article 150 in synchronization with the music.
  • In this case, for example, in correspondence with the note numbers “55” and “67”, the high speed processor 200 scrolls the background 110 to the left, and prepares the dance animation information and the dance animation table for turning the dance object 106 in the counter clockwise direction. Also, for example, in correspondence with the note numbers “45” and “64”, the high speed processor 200 scrolls the background 110 to the right, and prepares the dance animation information and the dance animation table for turning the dance object 106 in the clockwise direction. Furthermore, for example, in correspondence with the note numbers “76” and “77”, the high speed processor 200 scrolls the background 110 in the downward direction, and prepares the dance animation information and the dance animation table for turning the dance object 106 in the counter clockwise direction. Still further, for example, in correspondence with the note numbers “65” and “74”, the high speed processor 200 scrolls the background 110 in the upward direction, and prepares the dance animation information and the dance animation table for turning the dance object 106 in the clockwise direction. Still further, for example, the dance animation information and the dance animation table for widely turning the dance object 106 in the clockwise direction are prepared in correspondence with the note number “53”. Still further, for example, the dance animation information and the dance animation table for widely turning the dance object 106 in the counter clockwise direction are prepared in correspondence with the note number “57”.
  • Incidentally, the above types of note numbers (refer to FIG. 20(a) through FIG. 20(c)) are note numbers for controlling the display of the guides, and the background 110 and the dance object 106 are thereby controlled in accordance with the guides. In other words, the background 110 and the dance object 106 are controlled in accordance with the manipulation of the operation article 150.
  • Also, in the case of the present embodiment, the position guide “G” serves to guide the manipulation timing and the destination position of the cursor 105. In addition, when the cursor 105 is correctly manipulated by the operation article 150 in correspondence with the guidance of the position guide “G”, the high speed processor 200 controls the display of images (the dance object 106 and the background 110 in the case of the above example) in correspondence with the direction toward the destination position which is guided by the position guide “G”.
  • Accordingly, when the player 94 manipulates the operation article 150 in order to move the cursor 105 to the destination position guided by the position guide “G” in the manipulation timing guided by the position guide “G”, the display of images is controlled in correspondence with the direction toward the destination position guided by the position guide “G”. As a result, it is possible to enjoy, together with music, the images which are synchronized with the cursor 105 which is moving in association with the motion of the operation article 150 (refer to FIG. 12).
  • Furthermore, in the case of the present embodiment, the path guide “rg” serves to guide the moving path, the moving direction and the manipulation timing of the cursor 105. Accordingly, when the player 94 manipulates the operation article 150 in order to move the cursor 105 in the manipulation timing guided by the path guide “rg”, in the moving direction guided by the path guide “rg” and along the moving path guided by the path guide “rg”, the display of images (the dance object in the case of the above example) is controlled in correspondence with the path guide “rg”. As a result, it is possible to enjoy, together with music, the images which are synchronized with the cursor 105 which is moving in association with the motion of the operation article 150 (refer to FIG. 13).
  • Furthermore, in the case of the present embodiment, if the position of the target point of the operation article 150 is found in the area guided by the position guide “G” within the period guided by the position guide “G”, it is determined that the cursor 105, which is moving in association with the operation article 150, is correctly manipulated in correspondence with the guidance of the position guide “G” (refer to FIG. 24). Also, if the position of the target point of the operation article 150 is moved through a plurality of predetermined areas guided by the path guide “rg” in the predetermined order guided by the path guide “rg” within the period guided by the path guide “rg”, it is determined that the cursor 105, which is moving in association with the operation article 150, is correctly manipulated in correspondence with the guidance of the path guide “rg” (refer to FIG. 25). As has been discussed above, it is possible to determine the correctness of the manipulation of the cursor 105 on the basis of the position of the target point of the operation article 150, which can be calculated by a simple process.
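The two determinations can each be reduced to a few lines. Here is a minimal Python sketch, assuming the target-point positions sampled during the guided period are available as a list and that each guided area is an axis-aligned rectangle; both are assumptions about data shapes, not the embodiment's internal format.

```python
def point_in_area(p, area):
    """True if point p = (x, y) lies inside area = (x0, y0, x1, y1)."""
    x, y = p
    x0, y0, x1, y1 = area
    return x0 <= x <= x1 and y0 <= y <= y1

def position_guide_success(samples, area):
    """Position guide G (cf. FIG. 24): some target point sampled within the
    guided period must lie in the guided area."""
    return any(point_in_area(p, area) for p in samples)

def path_guide_success(samples, areas):
    """Path guide rg (cf. FIG. 25): the target point must pass through the
    guided areas in the guided order within the guided period."""
    index = 0
    for p in samples:
        if index < len(areas) and point_in_area(p, areas[index]):
            index += 1
    return index == len(areas)
```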
  • Furthermore, in the case of the present embodiment, the position guide “G” is displayed in each of a plurality of positions which are determined in advance on the screen 91. Then, the high speed processor 200 changes the appearance of the position guide “G” at a timing based on the music (the animation of a blossom opening in the case of the example of FIG. 12). Accordingly, the player 94 can easily recognize the position and the direction to which the cursor 105 is to be moved with reference to the change in appearance of the position guide “G”.
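One simple way to realize such music-locked appearance changes, sketched below in Python under assumed names, is to derive the animation frame of the guide from the elapsed music time rather than from wall-clock rendering.

```python
def guide_animation_frame(music_time_s: float, beat_s: float,
                          frames_per_beat: int) -> int:
    """Animation frame index (e.g. of the opening-blossom animation) derived
    from the current music position; beat_s is the beat period in seconds."""
    phase = (music_time_s % beat_s) / beat_s   # 0.0 .. 1.0 within the beat
    return int(phase * frames_per_beat) % frames_per_beat
```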
  • Furthermore, in the case of the present embodiment, the direction guide “g” and the path guide “rg” are expressed by images with which it is possible to visually recognize the motion from the first predetermined position to the second predetermined position on the screen 91. As has been discussed above, the manipulation of the cursor 105 is guided not only by the position guide “G” but also by the direction guide “g” and the path guide “rg”. Accordingly, the player 94 can clearly recognize the direction and path in which the cursor 105 is to be moved. A more specific description follows.
  • The direction guide “g” and the path guide “rg” are expressed by the change in appearance of a plurality of objects (in the form of spheres in the case of the examples of FIG. 12 and FIG. 13) which are arranged in the path having a start point at the first predetermined position and an end point at the second predetermined position on the screen 91. In this case, the player 94 can easily recognize the direction and the path to which the cursor 105 is to be moved with reference to the change in appearance of the plurality of the objects.
  • Also, the direction guide “g” is expressed by the motion of an object (in the form of a bird in the case of the example of FIG. 46) from the first predetermined position to the second predetermined position on the screen 91. In this case, the player 94 can easily recognize the direction and the path to which the cursor 105 is to be moved with reference to the motion of the object.
  • In addition to this, the direction guide “g” is expressed by the change in appearance of the path having a start point at the first predetermined position and an end point at the second predetermined position on the screen 91 (refer to FIG. 44 and FIG. 45). In this case, the player 94 can easily recognize the direction and the path to which the cursor 105 is to be moved with reference to the change in appearance of the path.
  • Furthermore, in the case of the present embodiment, the high speed processor 200 can calculate, as the state information of the operation article 150, any one of or any combination of two or more of speed information, moving direction information, moving distance information, velocity vector information, acceleration information, movement locus information, area information, and positional information. As has been discussed above, since a variety of information can be used as the state information of the operation article 150 for determining whether or not the cursor 105 is correctly manipulated in correspondence with the guides (the position guide “G”, the direction guide “g” and the path guide “rg”), the expressive possibilities of the guides are greatly expanded, and the design freedom of the game content is thereby also greatly increased.
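Several of the listed kinds of state information follow directly from successive target-point positions. A minimal Python sketch, assuming one sample per video frame at 60 fps (both assumptions):

```python
import math

def state_info(p_prev2, p_prev, p_now, dt=1.0 / 60.0):
    """Speed, moving direction, velocity vector and acceleration from three
    consecutive target-point positions; dt is the assumed frame period."""
    vx, vy = (p_now[0] - p_prev[0]) / dt, (p_now[1] - p_prev[1]) / dt
    ux, uy = (p_prev[0] - p_prev2[0]) / dt, (p_prev[1] - p_prev2[1]) / dt
    return {
        "positional": p_now,
        "velocity_vector": (vx, vy),
        "speed": math.hypot(vx, vy),
        "moving_direction": math.atan2(vy, vx),
        "acceleration": ((vx - ux) / dt, (vy - uy) / dt),
    }
```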
  • Furthermore, in the case of the present embodiment, the state information of the operation article 150 can be obtained by intermittently emitting infrared light to the operation article 150 to which the reflection sheet 155 is attached and capturing the image thereof. Because of this, no circuit which is driven by a power supply need be provided within the operation article 150 for obtaining the state information of the operation article 150. Accordingly, it is possible to improve the manipulability and reliability of the operation article 150, and to reduce the cost.
  • Meanwhile, the present invention is not limited to the embodiments as described above, but can be applied in a variety of aspects without departing from the spirit thereof, and for example the following modifications may be effected.
  • (1) In the above embodiments, the dance object 106 and the background 110 are explained as images (follow-up images) which are controlled to follow the motion of the operation article 150. However, the present invention is not limited thereto, but any arbitrary object can be selected as such a follow-up image. Also, instead of scrolling the background 110 in order to express the motion of the dance object 106, it is possible to move the dance object 106 itself in the up, down, right and left directions.
  • (2) While the direct manipulation of the cursor 105 is guided by both the position guide “G” and the direction guide “g” in the above embodiments, it is possible to perform the guidance with only one of them. In the case where only the direction guide “g” is used to instruct the manipulation of the cursor 105, it is preferred that the position guide “G” in the form of a still image be arranged at each of the start point and the end point of the direction guide “g”. Also, while both the position guide “G” and the path guide “rg” are used to guide the turning manipulation of the cursor 105, it is possible to perform the guidance with only the path guide “rg”. Furthermore, while the guides (the position guide “G”, the direction guide “g” and the path guide “rg”) are expressed by animation in the above description, the present invention is not limited thereto. Still further, the implementation of a guide is not limited to those as described above.
  • (3) While the operation article 150 comprising the stick 152 and the reflection ball 151 is employed as an operation article in the above embodiments, the configuration of the operation article is not limited to those as described above as long as a reflecting object is provided.
  • (4) While the coordinates of the operation article 150 are calculated as illustrated in FIG. 23(a) and FIG. 23(d) in the above embodiments, it is possible to convert the coordinates of a pixel having the maximum luminance value greater than the predetermined threshold value “Th” (refer to step S83 of FIG. 38) into coordinates on the screen 91 and to make use of them as the coordinates of a target point.
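This modification amounts to a resolution conversion. A minimal Python sketch, assuming a 32x32 image sensor and a 256x224 game screen (both sizes are illustrative assumptions):

```python
SENSOR_W, SENSOR_H = 32, 32     # assumed imaging-unit resolution
SCREEN_W, SCREEN_H = 256, 224   # assumed resolution of the screen 91

def sensor_to_screen(px: int, py: int) -> tuple[int, int]:
    """Scale the coordinates of the maximum-luminance pixel (above the
    threshold Th) onto the screen, for use as the target-point coordinates."""
    return (px * SCREEN_W // SENSOR_W, py * SCREEN_H // SENSOR_H)
```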
  • (5) While an arbitrary type of a processor can be used as the high speed processor 200 of FIG. 6, it is preferred to make use of the high speed processor of which the present applicant has filed a patent application. This high speed processor is disclosed, for example, in Japanese Patent Published Application No. Hei 10-307790 and U.S. Pat. No. 6,070,205 corresponding thereto.
  • While the present invention has been described in terms of embodiments, it is apparent to those skilled in the art that the invention is not limited to the embodiments described in the present specification. The present invention can be practiced with modification and alteration within the spirit and scope defined by the appended claims. Accordingly, the description in this application is to be regarded as illustrative rather than limiting the present invention in any way.

Claims (15)

1. A music game apparatus operable to automatically play music, comprising:
a stroboscope operable to irradiate an operation article manipulated by a player with light in a predetermined cycle;
an imaging unit operable to generate a lighted image signal and an unlighted image signal by capturing images of the operation article respectively when said stroboscope is lighted and unlighted;
a differential signal generating unit operable to generate a differential signal between the lighted image signal and the unlighted image signal;
a state information calculating unit operable to calculate the state information of the operation article on the basis of the differential signal;
a guide control unit operable to control the display of a guide for the manipulation of a cursor, which moves in association with the operation article, in a timing on the basis of the music;
a cursor control unit operable to control the display of the cursor on the basis of the state information of the operation article; and
a follow-up image control unit operable to control the display of an image in accordance with guidance by the guide when the cursor is correctly manipulated by the operation article in correspondence with the guide,
wherein said follow-up image control unit determines whether or not the cursor is correctly manipulated by the operation article in correspondence with the guide, on the basis of the state information of the operation article and the information about the guide.
2. The music game apparatus as claimed in claim 1 wherein the guide is operable to guide the cursor to a destination position in a manipulation timing, and
wherein said follow-up image control unit is operable to control the display of the image in correspondence with the direction of the destination position as guided by the guide when the cursor is correctly manipulated by the operation article in correspondence with the guide.
3. The music game apparatus as claimed in claim 2 wherein said state information calculating unit is operable to calculate the position of the operation article as the state information on the basis of the differential signal, and
wherein said follow-up image control unit is operable to determine that the cursor, which moves in association with the operation article, is correctly manipulated in correspondence with the guide if the position of the operation article as calculated by said state information calculating unit is located in an area corresponding to the guidance by the guide within a period corresponding to the guidance by the guide.
4. The music game apparatus as claimed in claim 1 wherein the guide is operable to guide the moving path, moving direction and manipulation timing of the cursor.
5. The music game apparatus as claimed in claim 4 wherein said state information calculating unit is operable to calculate the position of the operation article as the state information on the basis of the differential signal, and
wherein said follow-up image control unit is operable to determine that the cursor, which moves in association with the operation article, is correctly manipulated in correspondence with the guide if the position of the operation article as calculated by said state information calculating unit is moved through a plurality of predetermined areas guided by the guide in a predetermined order guided by the guide within a period guided by the guide.
6. The music game apparatus as claimed in claim 1 wherein the guide is displayed in each of a plurality of positions which are determined in advance on a screen, and
wherein the guide control unit is operable to change the appearance of the guide in a timing on the basis of the music.
7. The music game apparatus as claimed in claim 1 wherein the guide is expressed in an image with which it is possible to visually recognize the motion from a first predetermined position to a second predetermined position on a screen, and
wherein the guide control unit is operable to control the display of the guide in a timing on the basis of the music.
8. The music game apparatus as claimed in claim 7 wherein the guide is expressed by the change in appearance of a plurality of objects which are arranged in a path having a start point at the first predetermined position and an end point at the second predetermined position on the screen.
9. The music game apparatus as claimed in claim 7 wherein the guide is expressed by an object moving from the first predetermined position to the second predetermined position on the screen.
10. The music game apparatus as claimed in claim 7 wherein the guide is expressed by the change in appearance of a path having a start point at the first predetermined position and an end point at the second predetermined position on the screen.
11. The music game apparatus as claimed in claim 1 wherein the state information of the operation article as calculated by said state information calculating unit is any one of or any combination of two or more of speed information, moving direction information, moving distance information, velocity vector information, acceleration information, movement locus information, area information, and positional information.
12. A music game system operable to automatically play music, comprising:
an operation article to be manipulated by a player;
a stroboscope operable to irradiate said operation article with light in a predetermined cycle;
an imaging unit operable to generate a lighted image signal and an unlighted image signal by capturing images of the operation article respectively when said stroboscope is lighted and unlighted;
a differential signal generating unit operable to generate a differential signal between the lighted image signal and the unlighted image signal;
a state information calculating unit operable to calculate the state information of the operation article on the basis of the differential signal;
a guide control unit operable to control the display of a guide for the manipulation of a cursor, which moves in association with the operation article, in a timing on the basis of the music;
a cursor control unit operable to control the display of the cursor on the basis of the state information of the operation article; and
a follow-up image control unit operable to control the display of an image in accordance with guidance by the guide when the cursor is correctly manipulated by the operation article in correspondence with the guide,
wherein said follow-up image control unit determines whether or not the cursor is correctly manipulated by the operation article in correspondence with the guide, on the basis of the state information of the operation article and the information about the guide.
13. An operation article to be manipulated by the player of the music game apparatus as recited in claim 1, comprising:
a stick-like grip portion to be gripped by the player; and
a reflecting portion provided at one end of said grip portion and operable to retroreflectively reflect incident light.
14. A music game program which makes a computer perform processing comprising:
automatically playing music;
irradiating an operation article manipulated by a player with light in a predetermined cycle;
generating a lighted image signal and an unlighted image signal by capturing images of the operation article respectively when the light is emitted and not emitted;
generating a differential signal between the lighted image signal and the unlighted image signal;
calculating the state information of the operation article on the basis of the differential signal;
controlling the display of a guide for the manipulation of a cursor, which moves in association with the operation article, in a timing on the basis of the music;
controlling the display of the cursor on the basis of the state information of the operation article; and
determining whether or not the cursor is correctly manipulated by the operation article in correspondence with the guide, on the basis of the state information of the operation article and the information about the guide, and controlling the display of an image in accordance with guidance by the guide when the cursor is correctly manipulated by the operation article in correspondence with the guide.
15. A music game method comprising:
irradiating an operation article manipulated by a player with light in a predetermined cycle;
generating a lighted image signal and an unlighted image signal by capturing images of the operation article respectively when the light is emitted and not emitted;
generating a differential signal between the lighted image signal and the unlighted image signal;
calculating the state information of the operation article on the basis of the differential signal;
controlling the display of a guide for the manipulation of a cursor, which moves in association with the operation article, in a timing on the basis of the music;
controlling the display of the cursor on the basis of the state information of the operation article; and
controlling the display of an image in accordance with guidance by the guide when the cursor is correctly manipulated by the operation article in correspondence with the guide, and wherein
in said step of controlling the display of the image, it is determined whether or not the cursor is correctly manipulated by the operation article in correspondence with the guide, on the basis of the state information of the operation article and the information about the guide.
US10/572,429 2003-09-18 2004-09-17 Music Game Device, Music Game System, Operation Object, Music Game Program, And Music Game Method Abandoned US20070197290A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/572,429 US20070197290A1 (en) 2003-09-18 2004-09-17 Music Game Device, Music Game System, Operation Object, Music Game Program, And Music Game Method

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2003325381 2003-09-18
JP2003-325381 2003-09-18
US51526703P 2003-10-29 2003-10-29
PCT/JP2004/014025 WO2005028053A1 (en) 2003-09-18 2004-09-17 Music game device, music game system, operation object, music game program, and music game method
US10/572,429 US20070197290A1 (en) 2003-09-18 2004-09-17 Music Game Device, Music Game System, Operation Object, Music Game Program, And Music Game Method

Publications (1)

Publication Number Publication Date
US20070197290A1 (en) 2007-08-23

Family

ID=38428925

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/572,429 Abandoned US20070197290A1 (en) 2003-09-18 2004-09-17 Music Game Device, Music Game System, Operation Object, Music Game Program, And Music Game Method

Country Status (1)

Country Link
US (1) US20070197290A1 (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5349371A (en) * 1991-06-04 1994-09-20 Fong Kwang Chien Electro-optical mouse with means to separately detect the changes in contrast ratio in X and Y directions
US5796387A (en) * 1994-08-16 1998-08-18 Smith Engineering Positioning system using infrared radiation
US5690492A (en) * 1996-07-18 1997-11-25 The United States Of America As Represented By The Secretary Of The Army Detecting target imaged on a large screen via non-visible light
US5741185A (en) * 1997-02-05 1998-04-21 Toymax Inc. Interactive light-operated toy shooting game
US6070205A (en) * 1997-02-17 2000-05-30 Ssd Company Limited High-speed processor system having bus arbitration mechanism
US6227968B1 (en) * 1998-07-24 2001-05-08 Konami Co., Ltd. Dance game apparatus and step-on base for dance game
US20030003991A1 (en) * 2001-06-29 2003-01-02 Konami Corporation Game device, game controlling method and program

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7682237B2 (en) * 2003-09-22 2010-03-23 Ssd Company Limited Music game with strike sounds changing in quality in the progress of music and entertainment music system
US20050096132A1 (en) * 2003-09-22 2005-05-05 Hiromu Ueshima Music game with strike sounds changing in quality in the progress of music and entertainment music system
US11027190B2 (en) 2005-08-24 2021-06-08 Nintendo Co., Ltd. Game controller and game system
US10137365B2 (en) 2005-08-24 2018-11-27 Nintendo Co., Ltd. Game controller and game system
US9533220B2 (en) 2005-08-24 2017-01-03 Nintendo Co., Ltd. Game controller and game system
US9498709B2 (en) 2005-08-24 2016-11-22 Nintendo Co., Ltd. Game controller and game system
US9227138B2 (en) 2005-08-24 2016-01-05 Nintendo Co., Ltd. Game controller and game system
US9227142B2 (en) * 2005-08-24 2016-01-05 Nintendo Co., Ltd. Game controller and game system
US9211475B2 (en) * 2007-03-30 2015-12-15 Nintendo Co., Ltd. Game device and storage medium storing game program for performing a game process based on data from sensor
US20080242385A1 (en) * 2007-03-30 2008-10-02 Nintendo Co., Ltd. Game device and storage medium storing game program
EP2039402A3 (en) * 2007-09-12 2012-01-04 Namco Bandai Games Inc. Input instruction device, input instruction method, and dancing simultation system using the input instruction device and method
US20090069096A1 (en) * 2007-09-12 2009-03-12 Namco Bandai Games Inc. Program, information storage medium, game system, and input instruction device
US20110003641A1 (en) * 2008-02-19 2011-01-06 Konami Digital Entertainment Co., Ltd. Game device, game control method, information recording medium, and program
US8298083B2 (en) * 2008-02-19 2012-10-30 Konami Digital Entertainment Co., Ltd. Game device, game control method, information recording medium, and program
US20120044141A1 (en) * 2008-05-23 2012-02-23 Hiromu Ueshima Input system, input method, computer program, and recording medium
US8419516B2 (en) * 2009-08-04 2013-04-16 Konami Digital Entertainment Co., Ltd. Game system and game program
US20110034247A1 (en) * 2009-08-04 2011-02-10 Konami Digital Entertainment Co., Ltd. Game system and game program
CN103083905A (en) * 2010-03-15 2013-05-08 科乐美数码娱乐株式会社 Game system and storage medium
US8696421B2 (en) * 2011-03-08 2014-04-15 Konami Digital Entertainment Co., Ltd. Game system and method of using operation timing of an operation unit in a musical input device
US20120231862A1 (en) * 2011-03-08 2012-09-13 Konami Digital Entertainment Co., Ltd. Game system and method of controlling computer
US20120277001A1 (en) * 2011-04-28 2012-11-01 Microsoft Corporation Manual and Camera-based Game Control
US20130053114A1 (en) * 2011-08-24 2013-02-28 Konami Digital Entertainment Co., Ltd. Game machine, storage medium storing computer program, and control method of controlling computer
US8647185B2 (en) * 2011-08-24 2014-02-11 Konami Digital Entertainment Co., Ltd. Game machine, storage medium storing computer program, and control method of controlling computer
US10328339B2 (en) * 2017-07-11 2019-06-25 Specular Theory, Inc. Input controller and corresponding game mechanics for virtual reality systems
US20190299090A1 (en) * 2017-07-11 2019-10-03 Specular Theory, Inc. Input controller and corresponding game mechanics for virtual reality systems
US11253776B2 (en) * 2017-12-28 2022-02-22 Bandai Namco Entertainment Inc. Computer device and evaluation control method
US11260286B2 (en) 2017-12-28 2022-03-01 Bandai Namco Entertainment Inc. Computer device and evaluation control method

Similar Documents

Publication Publication Date Title
JP5130504B2 (en) Information processing apparatus, information processing method, program, and storage medium
JP4742247B2 (en) GAME DEVICE AND GAME PROGRAM
US20070197290A1 (en) Music Game Device, Music Game System, Operation Object, Music Game Program, And Music Game Method
US8758132B2 (en) Methods and systems for enabling depth and direction detection when interfacing with a computer program
US5704836A (en) Motion-based command generation technology
JP5449859B2 (en) GAME PROGRAM, GAME DEVICE, AND GAME SYSTEM
TWI469813B (en) Tracking groups of users in motion capture system
US7297864B2 (en) Image signal generating apparatus, an image signal generating program and an image signal generating method
JP2002196855A (en) Image processor, image processing method, recording medium, computer program and semiconductor device
US8586852B2 (en) Storage medium recorded with program for musical performance, apparatus, system and method
JP2009009266A (en) Image processing program and image processing unit
US20080281597A1 (en) Information processing system and storage medium storing information processing program
JP5848520B2 (en) Music performance program, music performance device, music performance system, and music performance method
US7554545B2 (en) Drawing apparatus operable to display a motion path of an operation article
US9153071B2 (en) Game apparatus, game program and game system
JP5758202B2 (en) Image processing program, image processing apparatus, image processing method, and image processing system
JP2005230534A (en) Dress-up game device
JP4747334B2 (en) Drawing apparatus, operation article, drawing system, drawing program, and drawing method
JPH09311759A (en) Method and device for gesture recognition
JP4735802B2 (en) GAME DEVICE, GAME PROGRAM, AND GAME DEVICE CONTROL METHOD
JP4701411B2 (en) Music game device, music game system, operation article, music game program, and music game method
JP3159906U (en) Image processing module
JP2008076765A (en) Musical performance system
JP3098423U (en) Automatic performance device and automatic performance system

Legal Events

Date Code Title Description
AS Assignment

Owner name: SSD COMPANY LIMITED, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:UESHIMA, HIROMU;REEL/FRAME:018585/0633

Effective date: 20061122

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION