US20120172099A1 - Music game system, computer program of same, and method of generating sound effect data

Info

Publication number
US20120172099A1
Authority
US
United States
Prior art keywords: sound, data, musical, sound effect, input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/394,967
Inventor
Osamu Migitera
Yoshitaka Nishimura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Konami Digital Entertainment Co Ltd
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Assigned to KONAMI DIGITAL ENTERTAINMENT CO., LTD. Assignment of assignors interest (see document for details). Assignors: NISHIMURA, YOSHITAKA; MIGITERA, OSAMU
Publication of US20120172099A1

Classifications

    • A63F 13/54 - Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
    • G10H 1/368 - Recording/reproducing of accompaniment for use with an external source, e.g. karaoke systems, displaying animated or moving pictures synchronized with the music or audio part
    • A63F 13/424 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
    • A63F 13/44 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, involving timing of operations, e.g. performing an action within a time slot
    • A63F 13/814 - Musical performances, e.g. by evaluating the player's ability to follow a notation
    • G10H 1/40 - Rhythm
    • A63F 13/2145 - Input arrangements for video game devices characterised by their sensors, purposes or types, for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
    • A63F 13/215 - Input arrangements for video game devices characterised by their sensors, purposes or types, comprising means for detecting acoustic signals, e.g. using a microphone
    • A63F 13/92 - Video game devices specially adapted to be hand-held while playing
    • A63F 2300/1081 - Input via voice recognition
    • A63F 2300/206 - Game information storage, e.g. cartridges, CD ROM's, DVD's, smart cards
    • A63F 2300/6045 - Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • A63F 2300/6081 - Methods for processing data by generating or executing the game program for sound processing, generating an output signal, e.g. under timing constraints, for spatialization
    • A63F 2300/638 - Methods for processing data by generating or executing the game program for controlling the execution of the game in time according to the timing of operation or a time limit
    • A63F 2300/8047 - Music games
    • G10H 2210/066 - Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal, for pitch analysis as part of wider processing for musical purposes, e.g. transcription, musical performance evaluation; pitch recognition, e.g. in polyphonic sounds; estimation or use of missing fundamental
    • G10H 2220/106 - Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus, for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters
    • G10H 2220/135 - Musical aspects of games or videogames; musical instrument-shaped game input interfaces
    • G10H 2230/015 - PDA [personal digital assistant] or palmtop computing devices used for musical purposes, e.g. portable music players, tablet computers, e-readers or smart phones in which mobile telephony functions need not be used

Abstract

A music game system (1) includes a sound input device (9) that inputs sound, a speaker (8) that outputs game sound, and an external storage device (20) that stores sound effect data (27) for causing the speaker to output sound effects of different musical intervals and sequence data (29) describing the relationship between a player's operation and the sound effect to be output in response. The music game system determines the musical interval representing an input sound based on sound data of the sound captured by the sound input device (9), generates multiple pieces of tone data, each with a different musical interval, from the sound data based on the musical interval determination result so as to form a musical scale, and stores the set of tone data as at least a part of the sound effect data (27).

Description

    TECHNICAL FIELD
  • The present invention relates to a music game system and the like in which a sound input by a player is reflected in game contents.
  • BACKGROUND ART
  • Music game machines in which game content changes based on a sound input by a player are well known. For example, music game machines that reflect an input sound in the behavior of characters (refer to Patent Literature 1) and music game machines that capture and score a player's singing for competition (refer to Patent Literature 2) are known. Patent Literature 1: JP-A-2002-136764; Patent Literature 2: JP-A-H10-268876.
  • SUMMARY OF INVENTION Technical Problem
  • All of the above game machines change game content by capturing the player's voice: after the musical interval of the player's voice is detected, processing is performed so that the behavior of characters changes based on a comparison with a reference musical interval. However, no game machine has been configured to reflect a sound input by the player in game content as raw material so that the game itself can be enjoyed on the basis of the input sound.
  • The present invention aims to provide a music game system capable of determining a sound input by a player and forming a musical scale based on a determination result, a computer program thereof, and a method of generating sound effect data.
  • Solution to Problem
  • The music game system of the present invention is a game system comprising: a sound input device which inputs sound; an audio output device which outputs game sound; a sound effect data storage device which stores sound effect data for causing the audio output device to output sound effects of different musical intervals; a sequence data storage device which stores sequence data describing the relationship between a player's operation and the sound effect to be output in response; a musical interval determination device which determines the musical interval representing an input sound based on sound data of the sound input by the sound input device; a musical scale generating device which generates, from the sound data, multiple pieces of tone data each having a different musical interval, based on a musical interval determination result of the musical interval determination device, so as to form a musical scale; and a sound effect data storage control device which causes the sound effect data storage device to store the multiple pieces of tone data generated by the musical scale generating device as at least a part of the sound effect data.
  • The computer program of the present invention is a computer program for a music game system comprising: a sound input device which inputs sound; an audio output device which outputs game sound; a sound effect data storage device which stores sound effect data for causing the audio output device to output sound effects of different musical intervals; and a sequence data storage device which stores sequence data describing the relationship between a player's operation and the sound effect to be output in response; wherein the computer program causes the music game system to function as: a musical interval determination device which determines the musical interval representing an input sound based on sound data of the sound input by the sound input device; a musical scale generating device which generates, from the sound data, multiple pieces of tone data each having a different musical interval, based on a musical interval determination result of the musical interval determination device, so as to form a musical scale; and a sound effect data storage control device which causes the sound effect data storage device to store the multiple pieces of tone data generated by the musical scale generating device as at least a part of the sound effect data.
  • In the present invention, sound data is generated based on a sound input into the sound input device by a player, and the musical interval representing that sound data is determined by the musical interval determination device. Then, multiple pieces of tone data, each with a different musical interval, are generated by the musical scale generating device from the sound data whose musical interval has been determined, based on the musical interval determination result. Together, these pieces of tone data form a musical scale. They are stored in the sound effect data storage device as sound effect data and are used as sound effects to be output in response to a player's operation. Thus, a musical scale is formed based on a sound input arbitrarily by the player; a melody can therefore be played from the input sound, or the input sound can be reflected in game content as raw material so that the player can enjoy a game built on a sound of the player's own input.
  • As one aspect of the music game system of the present invention, the musical interval determination device determines the musical interval of the sound by identifying a frequency representing the sound data of the sound input by the sound input device. According to this, the musical interval of the sound is determined by, for example, identifying the frequency at which the distribution is maximum as a representative value with reference to a frequency spectrum of the sound data.
  • As one aspect of the music game system of the present invention, the musical scale generating device generates a musical scale of at least one octave. According to this, a melody can be played by generating a musical scale. If a larger number of pieces of tone data are generated, the musical scale grows in breadth and the number of melodies that can be played increases, so game content can be made more advanced.
  • One aspect of the music game system of the present invention further comprises an input device which has at least one operating device, wherein the sound effect is played by the audio output device in accordance with the description of the sequence data, based on the player's operations through the input device. According to this, by operating the operating device, the player can reproduce a sound effect constituted of a musical scale formed from a sound input by the player. Therefore, the input sound can be reflected in game content as raw material, and the player can enjoy a game built on a sound of the player's own input.
  • The method of the present invention is a method of generating sound effect data comprising: a musical interval determination step which determines the musical interval representing an input sound based on sound data of the sound input by a sound input device; a musical scale generating step which generates, from the sound data, multiple pieces of tone data each having a different musical interval, based on a musical interval determination result of the musical interval determination step, so as to form a musical scale; and a storing step which causes a storage device to store the multiple pieces of tone data generated by the musical scale generating step as sound effect data to be output from an audio output device.
  • The method of generating sound effect data of the present invention, together with the computer program thereof, achieves an operational effect similar to that of the music game system described above. The present invention is not limited to music game systems and is also applicable to various electronic devices such as electronic musical instruments.
  • Advantageous Effects of Invention
  • In the music game system according to the present invention and the computer program thereof, as described above, sound data is generated based on a sound input into the sound input device by a player, and the musical interval representing that sound data is determined by the musical interval determination device. Then, multiple pieces of tone data, each with a different musical interval, are generated by the musical scale generating device from the sound data whose musical interval has been determined, based on the musical interval determination result, and together they form a musical scale. These pieces of tone data are stored in the sound effect data storage device as sound effect data and are used as sound effects to be output in response to a player's operation. Thus, a musical scale is formed based on a sound input arbitrarily by the player; a melody can therefore be played from the input sound, or the input sound can be reflected in game content as raw material so that the player can enjoy a game built on a sound of the player's own input. A similar effect is achieved by the method of generating sound effect data.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a diagram showing an appearance of a game machine according to one aspect of the present invention.
  • FIG. 2 is a functional block diagram of the game machine according to one aspect of the present invention.
  • FIG. 3 is an enlarged view of an operation instruction screen displayed as part of a game screen.
  • FIG. 4 is a diagram showing one example of contents of sound effect data.
  • FIG. 5 is a diagram showing one example of contents of sequence data.
  • FIG. 6 is a flowchart showing a sequence processing routine executed by a game controller.
  • FIG. 7 is a flowchart showing a musical interval determination processing routine executed by the game controller.
  • FIG. 8 is a flowchart showing a musical scale generating processing routine executed by the game controller.
  • FIG. 9 is a graph showing one example of sound data.
  • FIG. 10 is a graph showing a frequency spectrum of the sound data in FIG. 9.
  • FIG. 11 is a graph showing tone data obtained by frequency conversion of the sound data in FIG. 9.
  • DESCRIPTION OF EMBODIMENTS
  • An embodiment obtained by applying the present invention to a mobile game machine will be described below. As shown in FIG. 1, a game machine 1 includes a housing 2 that can be held by a player (user) by hand, a first monitor 3 arranged on the right side of the housing 2, a second monitor 4 arranged on the left side of the housing 2, a plurality of push-button switches 5 arranged in the upper part of the first monitor 3, and a cross key 6 arranged in the lower part of the first monitor 3. A transparent touch panel 7 is laid on the surface of the first monitor 3. The touch panel 7 is a well-known input device that, when touched by a player through a touch pen or the like, outputs a signal in accordance with the touch position. In addition, the game machine 1 is provided with various input devices and output devices included in an ordinary mobile game machine such as a power switch, volume operation switch, and power lamp, but an illustration thereof is omitted in FIG. 1.
  • As shown in FIG. 2, a control unit 10 as a computer is provided inside the game machine 1. The control unit 10 includes a game controller 11 as a control body, a pair of display controllers 12, 13 that operate according to output from the game controller 11, and an audio output controller 14. The game controller 11 is configured as a unit combining a microprocessor and various peripheral devices such as internal storage devices (as an example, a ROM and a RAM) necessary for the operation of the microprocessor. The display controllers 12, 13 render an image in accordance with image data provided from the game controller 11 in a frame buffer to cause the monitors 3, 4 to display a predetermined image by outputting a video signal corresponding to the rendered image to the monitors 3, 4 respectively. The audio output controller 14 causes a speaker 8 to play predetermined sound (including music sound and the like) by generating audio playback signals in accordance with audio playback data provided from the game controller 11 and outputting them to the speaker 8.
  • The push-button switches 5, the cross key 6, and the touch panel 7 described above are connected to the game controller 11 as input devices, and a sound input device (microphone) 9 is also connected thereto. Other input devices may be connected to the game controller 11 as well. Further, an external storage device 20 is connected to the game controller 11. A storage medium capable of retaining its contents without power, such as a magnetic storage device or a nonvolatile semiconductor memory such as an EEPROM, is used as the external storage device 20. The storage medium of the external storage device 20 is removable from the game machine 1.
  • A game program 21 and game data 22 are stored in the external storage device 20. The game program 21 is a computer program needed to play a music game on the game machine 1 according to a predetermined procedure and contains a sequence control module 23, a musical interval determination module 24, and a musical scale generating module 25 that realize functions according to the present invention. When the game machine 1 is started, the game controller 11 performs the various initial settings necessary for operation as the game machine 1 by executing an operation program stored in its internal storage device, and then sets up the environment for playing the music game according to the game program 21 by reading the game program 21 from the external storage device 20 and executing it. A sequence process portion 15 is generated in the game controller 11 when the game controller 11 executes the sequence control module 23 of the game program 21. Likewise, a musical interval determination portion 16 is generated when the musical interval determination module 24 is executed, and a musical scale generating portion 17 is generated when the musical scale generating module 25 is executed.
  • The sequence process portion 15, the musical interval determination portion 16, and the musical scale generating portion 17 are logical devices realized by combining computer hardware and computer programs. The sequence process portion 15 performs music game processing such as issuing instructions of operation to a player in time to playback of music (musical piece) selected by the player or generating a sound effect in accordance with a player's operation. The musical interval determination portion 16 decides a representative value of a frequency by capturing any sound input into the sound input device 9 by the player and performing predetermined processing described later thereon. The musical scale generating portion 17 generates multiple tone data by changing the musical interval based on the representative value decided by the musical interval determination portion 16. These pieces of tone data form musical scales of a predetermined octave number and constitute sound effects. In addition to the above modules 23 to 25, various program modules necessary for playing the music game are contained in the game program 21 and logical devices corresponding to such modules are generated in the game controller 11, but an illustration thereof is omitted.
  • Various kinds of data to be referenced when the music game is played according to the game program 21 are contained in the game data 22. For example, music data 26, sound effect data 27, and image data 28 are contained in the game data 22. The music data 26 is data needed to cause the speaker 8 to play and output a musical piece intended for the game. Though one kind of the music data 26 is shown in FIG. 2, the player can actually select the musical piece from a plurality of musical pieces; the game data 22 records the plurality of pieces of the music data 26, each with information attached to identify its musical piece. The sound effect data 27 is data in which a plurality of kinds of sound effects to be output from the speaker 8 in response to a player's operation are recorded, each associated with a unique code. A sound effect may be an instrument sound or any of various other kinds of sounds; vocal sounds that cause the speaker 8 to output spoken text are also included as a kind of sound effect. The sound effect data 27 is prepared, for each kind of sound effect, over a predetermined number of octaves by changing the musical interval. The image data 28 is data for causing the monitors 3 and 4 to display the background image of the game screen, various objects, icons, and the like.
  • Further, sequence data 29 is contained in the game data 22. The sequence data 29 is data that defines operations and the like to be instructed to the player. At least one piece of the sequence data 29 is prepared for one piece of the music data 26.
  • Next, an overview of the music game played on the game machine 1 will be provided. As shown in FIG. 1, an operation instruction screen 100 of the game is displayed on the first monitor 3 and an information screen 110 of the game is displayed on the second monitor 4 while the music game is played on the game machine 1. As also shown in FIG. 3, the operation instruction screen 100 displays a first lane 101, a second lane 102, and a third lane 103 extending in the vertical direction, visually separated from one another, for example by division lines 104. An operation reference portion 105 is displayed at the bottom end of each of the lanes 101, 102 and 103. Objects 106 serving as operation indicators are displayed in the lanes 101, 102 and 103 according to the sequence data 29 while the music game is played, that is, while playback of a musical piece is in progress.
  • The objects 106 appear at the top end of the lanes 101, 102 and 103 at appropriate times in the musical piece and are scrolled downward, as indicated by an arrow A in FIG. 3, with the progress of the musical piece. The player is requested to perform a touch operation on the lane 101, 102 or 103 in which an object 106 is displayed, through an operation member such as a touch pen 120, coinciding with the arrival of the object 106 at the operation reference portion 105. When the player performs a touch operation, the difference between the time at which the object 106 matches the operation reference portion 105 and the time at which the player performs the touch operation is detected; the smaller the difference, the more highly the player's operation is evaluated. Moreover, a sound effect corresponding to each of the objects 106 is played by the speaker 8 in accordance with the touch operation. In the example of FIG. 3, the object 106 is immediately before arriving at the operation reference portion 105 in the second lane 102, and the player should perform a touch operation on the second lane 102 coinciding with its arrival. Anywhere inside the second lane 102 may be touched. That is, three operating devices are formed in the present embodiment by the combination of the lanes 101, 102 and 103 displayed on the first monitor 3 and the touch panel 7 laid thereon. Incidentally, each of the lanes 101, 102 and 103 may be used below as a term representing the corresponding operating device.
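  • By way of illustration only, the timing evaluation described above can be sketched as a comparison of the touch time against the operation time; in the following Python sketch the thresholds, grade labels and function name are assumptions chosen for the illustration and are not taken from the embodiment.

        # Illustrative sketch of the timing evaluation described above. The
        # thresholds and grade labels are assumptions, not values from the embodiment.
        def evaluate_touch(operation_time_s: float, touch_time_s: float) -> str:
            """Grade a touch by how far it lands from the moment the object 106
            reaches the operation reference portion 105; a smaller difference
            is evaluated more highly."""
            diff = abs(touch_time_s - operation_time_s)
            if diff <= 0.05:
                return "perfect"
            if diff <= 0.10:
                return "great"
            if diff <= 0.20:
                return "good"
            return "miss"

        # Example: the object reaches the reference portion at 12.50 s and the
        # player touches the lane at 12.53 s.
        print(evaluate_touch(12.50, 12.53))  # -> "perfect"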
  • The sound effect corresponding to each of the objects 106 played in accordance with a touch operation is selected from a plurality of sound effects recorded in the sound effect data 27. As shown in FIG. 4, the sound effect data 27 contains original data 27a pre-recorded in the game data 22 and user data 27b obtained from sound input into the sound input device 9 by the player. The original data 27a and the user data 27b each record a plurality of sound effects A1, B1, . . . ; taking the sound effect A1 as an example, the sound effect A1 records a set of tone data sd_000, sd_001, sd_002, . . . in which each tone constituting a musical scale is associated with a unique code. The other sound effects B1, C1, . . . have similar tone data. The user data 27b is similar to the original data 27a in the structure of the tone data included in the sound effects A1, B1, . . . , but differs from the pre-recorded original data 27a in that its tone data is generated from sound input into the sound input device 9 by the player.
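  • As an illustration of how the sound effect data 27 described above might be organised in memory (pre-recorded original data 27a and player-generated user data 27b, each holding sound effects whose tones are keyed by unique codes such as sd_000), a minimal Python sketch follows; the container types, field names and placeholder sample values are assumptions and do not appear in the embodiment.

        # Illustrative layout of the sound effect data 27. Container types and
        # field names are assumptions for this sketch; tone data is represented
        # by placeholder sample lists.
        from dataclasses import dataclass, field

        @dataclass
        class SoundEffect:
            """One sound effect: tone data forming a musical scale, keyed by a
            unique code such as 'sd_000'."""
            tones: dict[str, list[float]] = field(default_factory=dict)

        @dataclass
        class SoundEffectData:
            original: dict[str, SoundEffect] = field(default_factory=dict)  # pre-recorded (27a)
            user: dict[str, SoundEffect] = field(default_factory=dict)      # player-generated (27b)

        effect_data = SoundEffectData()
        effect_data.original["A1"] = SoundEffect(tones={"sd_000": [0.0, 0.1], "sd_001": [0.0, 0.2]})
        effect_data.user["A2"] = SoundEffect(tones={"sd_101": [0.0, 0.3], "sd_105": [0.0, 0.4]})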
  • Next, the sequence data 29 will be described in detail. As shown in FIG. 5, the sequence data 29 contains an initial setting portion 29a and an operation sequence portion 29b. The initial setting portion 29a describes, as initial settings for playing the game, information specifying play conditions that differ from musical piece to musical piece, such as the tempo of the music (for example, a BPM value), and information specifying the sound effects to be generated when each of the lanes 101 to 103 is operated.
  • The operation sequence portion 29b, on the other hand, describes operation specifying information 29c and sound effect switching instruction information 29d. The operation specifying information 29c associates the operation times of the lanes 101 to 103 with information specifying one of the lanes 101 to 103. That is, as partially illustrated in FIG. 5, the operation specifying information 29c is configured as a set of records, each associating a time (operation time) at which an operation should be performed during the musical piece with information specifying the operating device (lane). An operation time is described as comma-delimited values indicating the bar number in the musical piece, the beat number, and the time within the beat. The time within a beat is the elapsed time from the start of the beat; if the length of a beat is equally divided into n unit times, it is represented by the number of units elapsed from the start of the beat. If, for example, n = 100 and a time one quarter of the way through the second beat of the first bar of a musical piece should be specified as the operation time, the operation specifying information 29c is described as "01, 2, 025". When the first lane 101 should be specified as the operating device, "button1" is described; when the second lane 102 should be specified, "button2" is described; and when the third lane 103 should be specified, "button3" is described. In the example of FIG. 5, the operation times and operating devices are specified such that the first lane 101 is touched at the start (000) of the first beat of the first bar, the second lane 102 is touched at the start (000) of the second beat of the first bar, and the third lane 103 is touched when "025" has passed after the start of the second beat of the first bar.
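  • As an illustration of the operation time format described above, the following Python sketch converts a record such as "01, 2, 025" into seconds, assuming a 4/4 musical piece, n = 100 unit times per beat, and a tempo (BPM) taken from the initial setting portion 29a; the function name and default values are assumptions for the sketch.

        # Illustrative conversion of an operation time record ("bar, beat, unit")
        # into seconds. Four beats per bar, n = 100 units per beat and the BPM
        # argument are assumptions for this sketch.
        def operation_time_to_seconds(record: str, bpm: float,
                                      beats_per_bar: int = 4, units_per_beat: int = 100) -> float:
            bar, beat, unit = (int(x) for x in record.split(","))
            seconds_per_beat = 60.0 / bpm
            # Bars and beats are counted from 1 in the sequence data; units from 0.
            total_beats = (bar - 1) * beats_per_bar + (beat - 1) + unit / units_per_beat
            return total_beats * seconds_per_beat

        # "01, 2, 025": one quarter of the way through the second beat of the first bar.
        print(operation_time_to_seconds("01, 2, 025", bpm=120.0))  # -> 0.625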
  • The sound effect switching instruction information 29d is inserted at suitable positions within the operation specifying information 29c. The sound effect switching instruction information 29d associates a time in the musical piece at which the sound effects should be changed with the tone data to be generated when each of the lanes 101 to 103 is operated, thereby changing the sound effects generated when the specified lanes are touched in the subsequent operation specifying information 29c. The time in the musical piece is described in the same format as the operation time of the operation specifying information 29c. The sound effect switching instruction information 29d specifies, for each lane, tone data from either the original data 27a or the user data 27b recorded in the sound effect data 27. The sound effect switching instruction information 29d is inserted at the time in the musical piece at which the sound effects should be switched, and the setting of the sound effects is maintained until the next sound effect switching instruction information 29d gives a new instruction.
  • The sequence process portion 15 of the game controller 11 controls the display of each of the lanes 101 to 103 so that the object 106 and the operation reference portion 105 match at the operation time specified by the operation specifying information 29c. The sequence process portion 15 also exercises control so that the sound effects generated when the player touches the specified lanes 101 to 103 are switched at the time in the musical piece specified by the sound effect switching instruction information 29d.
  • Next, processing of the game controller 11 when a music game is played on the game machine 1 will be described. After completing initial settings necessary to play the music game by reading the game program 21, the game controller 11 waits in preparation for instructions to start the game from a player. Instructions to start the game include, for example, an operation to identify the musical piece to be played in the game or data to be used in the game such as the selection of the degree of difficulty. The procedure for receiving such instructions may be the same as the procedure for a well-known music game and the like.
  • If the start of the game is instructed, the game controller 11 reads the music data 26 corresponding to the music selected by the player and outputs the music data 26 to the audio output controller 14 to cause the speaker 8 to play the musical piece. Accordingly, the control unit 10 functions as a musical piece playback device. In synchronization with playback of the musical piece, the game controller 11 also reads the sequence data 29 corresponding to the player's selection to generate image data necessary for rendering of the operation instruction screen 100 and the information screen 110 while referencing the image data 28 and outputs the image data to the display controllers 12 and 13 to cause the monitors 3 and 4 to display the operation instruction screen 100 and the information screen 110 respectively. Further, while the music game is played, the game controller 11 repeatedly executes the sequence processing routine shown in FIG. 6 as processing necessary for the display of the operation instruction screen 100 and the like in a predetermined period.
  • When the sequence processing routine shown in FIG. 6 is started, in step S1 the sequence process portion 15 of the game controller 11 first obtains the current time in the musical piece. For example, timekeeping by an internal clock of the game controller 11 is started at the start of playback of the musical piece, and the current time is obtained from the value of that internal clock. In subsequent step S2, the sequence process portion 15 obtains, from the sequence data 29, the operation-time data present within the time length corresponding to the display range of the operation instruction screen 100. The display range is set, as an example, to the time range corresponding to two bars of the musical piece from the current time toward the future.
  • In the next step S3, the sequence process portion 15 calculates the coordinates of all the objects 106 to be displayed in the lanes 101 to 103 of the operation instruction screen 100. The calculation is carried out, as an example, as follows. Which of the lanes 101 to 103 an object 106 is to be placed in is determined from the lane designation associated with each operation time contained in the display range, that is, the designation of one of "button1" to "button3" in the example of FIG. 5. The position of each object 106 in the time-axis direction (namely, the direction of movement of the object 106) from the operation reference portion 105 is determined in accordance with the difference between its operation time and the current time. In this way, the coordinates needed to arrange each object 106 along the time axis from the operation reference portion 105 in the specified lane 101 to 103 are obtained.
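  • The coordinate calculation of step S3 can be illustrated by the following Python sketch, which places each object along the time axis in proportion to the difference between its operation time and the current time; the pixel values, the display range expressed in seconds, and all names are assumptions for the illustration.

        # Illustrative coordinate calculation for the objects 106 (step S3). The
        # lane geometry in pixels and the display range in seconds are assumptions.
        def object_positions(operation_times_s, lanes, current_time_s, display_range_s,
                             lane_top_y=0.0, reference_y=400.0):
            """Return (lane, y) for every object whose operation time falls inside
            the display range; y == reference_y means the object has reached the
            operation reference portion 105."""
            positions = []
            for op_time, lane in zip(operation_times_s, lanes):
                remaining = op_time - current_time_s
                if 0.0 <= remaining <= display_range_s:
                    # The farther the operation time lies in the future, the closer
                    # the object sits to the top of the lane.
                    y = reference_y - (remaining / display_range_s) * (reference_y - lane_top_y)
                    positions.append((lane, y))
            return positions

        # Example: two bars of a 120 BPM, 4/4 piece correspond to a 4-second range.
        print(object_positions([10.0, 11.0, 13.5], ["button1", "button2", "button3"],
                               current_time_s=9.5, display_range_s=4.0))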
  • After the calculation of the coordinates of the objects 106 is completed, the sequence process portion 15 proceeds to step S4 to determine whether sound effect switching instruction information 29d is present in the data obtained from the sequence data 29. If sound effect switching instruction information 29d is present, the sequence process portion 15 obtains the current time in step S5 and compares it with the time in the musical piece specified by the sound effect switching instruction information 29d to determine whether the current time corresponds to the timing of the switching instruction. If it does, then in step S6 the sequence process portion 15 changes the sound effects generated for the respective lanes 101 to 103 specified by the subsequent operation specifying information 29c to the sound effects specified by the sound effect switching instruction information 29d. In the example shown in FIG. 5, after the start of the third beat of the first bar of the musical piece, the tone data sd_101, sd_105 and sd_106 of the sound effect A2 of the user data 27b of the sound effect data 27 are allocated to the lanes 101, 102 and 103 respectively, and when the player touches the lane 101, 102 or 103, the corresponding tone data is played. If no sound effect switching instruction information 29d is present in step S4, or if the current time does not correspond to the switching timing in step S5, the sequence process portion 15 proceeds to step S7.
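  • The switching of step S6 amounts to updating a lane-to-tone-code mapping once the current time passes the time specified by a switching instruction; the record layout and names in the following Python sketch are assumptions for the illustration.

        # Illustrative application of sound effect switching instructions (step S6).
        # Each instruction is modelled as (switch_time_s, {lane: tone_code}); this
        # layout is an assumption for the sketch.
        def current_lane_tones(switch_instructions, current_time_s, initial_tones):
            """Return the lane-to-tone-code mapping in force at current_time_s;
            each instruction's setting is maintained until the next instruction."""
            tones = dict(initial_tones)
            for switch_time_s, lane_tones in sorted(switch_instructions, key=lambda r: r[0]):
                if switch_time_s <= current_time_s:
                    tones.update(lane_tones)
                else:
                    break
            return tones

        instructions = [
            (2.0, {"button1": "sd_101", "button2": "sd_105", "button3": "sd_106"}),
        ]
        initial = {"button1": "sd_000", "button2": "sd_001", "button3": "sd_002"}
        print(current_lane_tones(instructions, current_time_s=2.5, initial_tones=initial))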
  • When the switching of the sound effects is completed, the sequence process portion 15 proceeds to the next step S7 to generate the image data necessary for rendering the operation instruction screen 100 based on the coordinates of the objects 106 calculated in step S3. More specifically, the sequence process portion 15 generates image data in which the objects 106 are arranged at the calculated coordinates. The image of each object 106 may be obtained from the image data 28.
  • In subsequent step S8, the sequence process portion 15 outputs the image data to the display controller 12. Accordingly, the operation instruction screen 100 is displayed in the first monitor 3. When the processing in step S8 is completed, the sequence process portion 15 terminates this sequence processing routine. With the above processing being performed repeatedly, the objects 106 are displayed by scrolling in the lanes 101 to 103 in such a way that the objects 106 arrive at the operation reference portion 105 at operation times described in the sequence data 29.
  • Next, the processing by the musical interval determination portion 16 and the musical scale generating portion 17 when a sound effect is created from a sound input by a player into the game machine 1 will be described. A sound effect is created when, for example, the player instructs its creation in a waiting state in which no music game is being played. When the creation of a sound effect is started, the musical interval determination portion 16 first executes the musical interval determination processing routine shown in FIG. 7, and the musical scale generating portion 17 then executes the musical scale generating processing routine shown in FIG. 8 based on the result of the musical interval determination processing routine.
  • When the musical interval determination processing routine in FIG. 7 is started, in step S11 the musical interval determination portion 16 of the game controller 11 obtains the sound input by the player. When the player inputs sound while the sound input device 9 is ready to capture it, raw sound data is generated. In subsequent step S12, the musical interval determination portion 16 performs A/D conversion on the raw sound data. The analog signal of the raw sound data is thereby converted into a digital signal, creating sound data of the input sound. FIG. 9 shows an example of sound data. The sound data in FIG. 9 is a digital waveform of a guitar sound; the horizontal axis and the vertical axis represent the duration and the dynamic range, respectively. Incidentally, well-known technology may be used for the A/D conversion.
  • Then, in step S13, the musical interval determination portion 16 obtains a frequency spectrum of the sound data. FIG. 10 shows a frequency spectrum generated by a fast Fourier transform of the sound data obtained in step S12; the horizontal axis and the vertical axis represent the frequency and the degree of distribution at that frequency, respectively. Incidentally, the generation of the frequency spectrum is not limited to calculation by the fast Fourier transform, and various well-known technologies may be used. In subsequent step S14, the musical interval determination portion 16 decides the representative value from the frequency spectrum obtained in step S13. The representative value is defined as the frequency at which the distribution of the frequency spectrum is maximum; in the graph of FIG. 10, the frequency at the peak indicated by the arrow p becomes the representative value. Based on the frequency of the representative value decided in this way, the musical interval of the sound data based on the sound input by the player is determined. The representative value may also be calculated from the data of a band q occupying both sides of the crest containing the maximum peak; calculating the representative value from a fixed band in this way is useful when the peak is ambiguous, for example because the peak frequency has a width. When the processing in step S14 is completed, the musical interval determination portion 16 terminates the musical interval determination processing routine. With the above processing, the representative value of the sound data based on the sound input by the player is decided and its inherent musical interval is determined.
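  • As an illustration of steps S13 and S14, the representative value can be taken as the frequency at which the magnitude spectrum of the digitised sound data peaks; the following NumPy-based Python sketch, including the sampling rate and the synthetic 440 Hz test signal, is an assumption for the illustration and not the embodiment's implementation.

        # Illustrative determination of the representative value (steps S13 and S14):
        # take the frequency at which the magnitude spectrum is largest. NumPy usage,
        # the sampling rate and the synthetic signal are assumptions for this sketch.
        import numpy as np

        def representative_frequency(sound_data: np.ndarray, sample_rate: int) -> float:
            spectrum = np.abs(np.fft.rfft(sound_data))            # magnitude spectrum
            freqs = np.fft.rfftfreq(len(sound_data), d=1.0 / sample_rate)
            spectrum[0] = 0.0                                     # ignore the DC component
            return float(freqs[np.argmax(spectrum)])              # frequency of the peak p

        # Example: a synthetic 440 Hz tone standing in for an input sound.
        sample_rate = 44100
        t = np.arange(sample_rate) / sample_rate                  # one second of samples
        sound = np.sin(2 * np.pi * 440.0 * t)
        print(representative_frequency(sound, sample_rate))       # -> approximately 440.0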
  • When the representative value has been obtained by the musical interval determination processing routine, the musical scale generating portion 17 executes the musical scale generating processing routine in FIG. 8. In step S21, the musical scale generating portion 17 generates multiple pieces of tone data forming a musical scale from the sound data whose representative value has been decided. The musical scale generating portion 17 frequency-converts the sound data, based on the representative value, so that the representative value of each piece of tone data becomes the frequency of one of the tones forming a musical scale of the predetermined number of octaves. FIG. 11 shows an example of frequency-converted tone data; the waveform in FIG. 11 is obtained by frequency-converting the sound data in FIG. 9 one octave upward. Then, in step S22, the musical scale generating portion 17 stores the set of generated tone data in the sound effect data 27, specifically in the user data 27b. When the processing in step S22 is completed, the musical scale generating portion 17 terminates the musical scale generating processing routine. With the above processing, multiple pieces of tone data whose representative-value frequencies differ from one another are generated from the sound data whose representative value has been decided, so as to form a musical scale, and the set of tone data forming the musical scale is stored in the user data 27b of the sound effect data 27 as a sound effect.
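  • Step S21 can be illustrated by resampling the sound data so that its representative frequency lands on each tone of an equal-tempered scale; the simple linear-interpolation resampling below, the one-octave range of 13 tones, and all names are assumptions for the illustration and not the embodiment's frequency conversion method.

        # Illustrative musical scale generation (step S21): resample the sound data
        # so that its representative frequency is shifted to each semitone of a
        # one-octave equal-tempered scale. The resampling method and all names are
        # assumptions for this sketch (simple resampling also changes the duration).
        import numpy as np

        def shift_to_frequency(sound_data: np.ndarray, current_hz: float, target_hz: float) -> np.ndarray:
            ratio = target_hz / current_hz
            n_out = int(len(sound_data) / ratio)
            # Reading the samples back "faster" or "slower" moves the pitch by `ratio`.
            src_positions = np.arange(n_out) * ratio
            return np.interp(src_positions, np.arange(len(sound_data)), sound_data)

        def generate_scale(sound_data: np.ndarray, representative_hz: float, tones: int = 13) -> dict:
            """Return tone data covering one octave (13 tones including both ends),
            keyed by codes in the style of the sound effect data 27."""
            return {
                f"sd_{i:03d}": shift_to_frequency(sound_data, representative_hz,
                                                  representative_hz * 2 ** (i / 12))
                for i in range(tones)
            }

        # Example: build a scale from a synthetic 220 Hz tone.
        sr = 44100
        t = np.arange(sr // 2) / sr
        scale = generate_scale(np.sin(2 * np.pi * 220.0 * t), representative_hz=220.0)
        print(sorted(scale)[:3], len(scale))  # -> ['sd_000', 'sd_001', 'sd_002'] 13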
  • In the above embodiment, the external storage device 20 of the game machine 1 functions as a sound effect data storage device and a sequence data storage device. Also, the control unit 10 functions as a musical interval determination device by causing the musical interval determination portion 16 to perform the processing in steps S11 to S14 in FIG. 7, functions as a musical scale generating device by causing the musical scale generating portion 17 to perform the processing in step S21 in FIG. 8, and functions as a sound effect data storage control device by causing the musical scale generating portion 17 to perform the processing in step S22 in FIG. 8.
  • The present invention is not limited to the above embodiment and can be carried out in various forms. For example, the present embodiment has been described by taking the music game machine 1 as an example of the apparatus in which the musical interval determination device, the musical scale generating device, and the sound effect data storage control device function, but the invention is not limited to this example. For example, the present invention may be applied to various electronic devices such as electronic musical instruments. If the present invention is applied to an electronic musical instrument, a melody can be played based on any sound input by the player.
  • A music game system according to the present invention is not limited to game systems realized as mobile game machines and may be realized in an appropriate form such as home video game machines, business-use game machines installed in commercial facilities, and game systems realized by using a network. The input device is not limited to an example using the touch panel and input devices configured in various ways such as a push button, lever, and track ball can be used.

Claims (6)

1. A music game system comprising:
a sound input device which inputs sound;
an audio output device which outputs game sound;
a sound effect data storage device which stores sound effect data to cause the audio output device to output each of sound effects of different musical intervals;
a sequence data storage device which stores sequence data in which a relationship between a player's operation and the sound effect to be output correspondingly is described;
a musical interval determination device which determines the musical interval representing an input sound based on sound data of the sound input by the sound input device;
a musical scale generating device which generates, from the sound data, multiple pieces of tone data each having a different musical interval, based on a musical interval determination result of the musical interval determination device, so as to form a musical scale; and
a sound effect data storage control device which causes the sound effect data storage device to store the multiple tone data generated by the musical scale generating device as at least a part of the sound effect data.
2. The music game system of claim 1, wherein
the musical interval determination device determines the musical interval of the sound by identifying a frequency representing the sound data of the sound input by the sound input device.
3. The music game system of claim 1, wherein
the musical scale generating device generates a musical scale of at least one octave.
4. The music game system of claim 1, further comprising:
an input device which has at least one operating device; wherein
the sound effect is played by the audio output device in accordance with the description of the sequence data, based on operations of the player through the input device.
5. A storage medium storing a computer program for a music game system comprising:
a sound input device which inputs sound;
an audio output device which outputs game sound;
a sound effect data storage device which stores sound effect data to cause the audio output device to output each of sound effects of different musical intervals;
a sequence data storage device which stores sequence data in which a relationship between a player's operation and the sound effect to be output correspondingly is described; wherein
the computer program causes the music game system to function as:
a musical interval determination device which determines the musical interval representing an input sound based on sound data of the sound input by the sound input device;
a musical scale generating device which generates, from the sound data, multiple pieces of tone data each having a different musical interval, based on a musical interval determination result of the musical interval determination device, so as to form a musical scale; and
a sound effect data storage control device which causes the sound effect data storage device to store the multiple tone data generated by the musical scale generating device as at least a part of the sound effect data.
6. A method of generating sound effect data comprising:
a musical interval determination step which determines the musical interval representing an input sound based on sound data of the sound input by a sound input device;
a musical scale generating step which generates, from the sound data, multiple pieces of tone data each having a different musical interval, based on a musical interval determination result of the musical interval determination step, so as to form a musical scale; and
a storing step which causes a storage device to store the multiple tone data generated by the musical scale generating step as sound effect data for outputting from an audio output device.
US13/394,967 2009-09-11 2010-09-07 Music game system, computer program of same, and method of generating sound effect data Abandoned US20120172099A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2009210571A JP5399831B2 (en) 2009-09-11 2009-09-11 Music game system, computer program thereof, and method of generating sound effect data
JP2009210571 2009-09-11
PCT/JP2010/065337 WO2011030761A1 (en) 2009-09-11 2010-09-07 Music game system, computer program of same, and method of generating sound effect data

Publications (1)

Publication Number Publication Date
US20120172099A1 US20120172099A1 (en) 2012-07-05

Family

ID=43732433

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/394,967 Abandoned US20120172099A1 (en) 2009-09-11 2010-09-07 Music game system, computer program of same, and method of generating sound effect data

Country Status (4)

Country Link
US (1) US20120172099A1 (en)
JP (1) JP5399831B2 (en)
CN (1) CN102481488B (en)
WO (1) WO2011030761A1 (en)

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1106209C * 1989-01-10 2003-04-23 任天堂株式会社 Electronic gaming device with pseudo-stereophonic sound generating capabilities
JPH08123448A (en) * 1994-10-18 1996-05-17 Sega Enterp Ltd Image processor using waveform analysis of sound signal
AU4496797A (en) * 1997-04-14 1998-11-11 Thomson Consumer Electronics, Inc System for forming program guide information for user initiation of control and communication functions
US6464585B1 (en) * 1997-11-20 2002-10-15 Nintendo Co., Ltd. Sound generating device and video game device using the same
KR100424231B1 (en) * 1999-03-08 2004-03-25 파이쓰, 인크. Data reproducing device, data reproducing method, and information terminal
JP2001009152A (en) * 1999-06-30 2001-01-16 Konami Co Ltd Game system and storage medium readable by computer
JP2003509729A (en) * 1999-09-16 2003-03-11 ハンスルソフト コーポレーション リミテッド Method and apparatus for playing musical instruments based on digital music files
JP4497264B2 (en) * 2001-01-22 2010-07-07 株式会社セガ Game program, game apparatus, sound effect output method, and recording medium
JP2002351489A (en) * 2001-05-29 2002-12-06 Namco Ltd Game information, information storage medium, and game machine
CN1805003B (en) * 2006-01-12 2011-05-11 深圳市蔚科电子科技开发有限公司 Pitch training method
JP4108719B2 (en) * 2006-08-30 2008-06-25 株式会社バンダイナムコゲームス PROGRAM, INFORMATION STORAGE MEDIUM, AND GAME DEVICE
JP2008178449A (en) * 2007-01-23 2008-08-07 Yutaka Kojima Puzzle game system and numeric keypad character

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5978492A (en) * 1994-12-02 1999-11-02 Sony Corporation Sound source data generation method, recording medium, and sound source data processing device
US6509519B2 (en) * 1995-09-29 2003-01-21 Yamaha Corporation Method and apparatus for generating musical tone waveforms by user input of sample waveform frequency
US20010045154A1 (en) * 2000-05-23 2001-11-29 Yamaha Corporation Apparatus and method for generating auxiliary melody on the basis of main melody
US20050130740A1 (en) * 2003-09-12 2005-06-16 Namco Ltd. Input device, input determination method, game system, game system control method, program, and information storage medium
US20080125222A1 (en) * 2005-07-11 2008-05-29 Konami Digital Entertainment Co., Ltd. Game program, game device, and game method
US20080200224A1 (en) * 2007-02-20 2008-08-21 Gametank Inc. Instrument Game System and Method
US20080276793A1 (en) * 2007-05-08 2008-11-13 Sony Corporation Beat enhancement device, sound output device, electronic apparatus and method of outputting beats

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140106885A1 (en) * 2012-10-17 2014-04-17 Nintendo Co., Ltd. Storage medium having stored therein game program, game apparatus, game system, and game processing method
US10179278B2 (en) * 2012-10-17 2019-01-15 Nintendo Co., Ltd. Storage medium having stored therein game program, game apparatus, game system, and game processing method

Also Published As

Publication number Publication date
WO2011030761A1 (en) 2011-03-17
CN102481488B (en) 2015-04-01
JP2011056122A (en) 2011-03-24
JP5399831B2 (en) 2014-01-29
CN102481488A (en) 2012-05-30

Similar Documents

Publication Publication Date Title
US5296642A (en) Auto-play musical instrument with a chain-play mode for a plurality of demonstration tones
US8618404B2 (en) File creation process, file format and file playback apparatus enabling advanced audio interaction and collaboration capabilities
US20150103019A1 (en) Methods and Devices and Systems for Positioning Input Devices and Creating Control
WO2008004690A1 (en) Portable chord output device, computer program and recording medium
CN106128437B (en) Electronic musical instrument
WO2017028686A1 (en) Information processing method, terminal device and computer storage medium
JP2008253440A (en) Music reproduction control system, music performance program and synchronous reproduction method of performance data
JP6019803B2 (en) Automatic performance device and program
JP2007078751A (en) Concert system
CN105637579A (en) Technique for reproducing waveform by switching between plurality of sets of waveform data
JP6729052B2 (en) Performance instruction device, performance instruction program, and performance instruction method
JP3286683B2 (en) Melody synthesis device and melody synthesis method
US9947306B2 (en) Electric acoustic apparatus
CN105489209A (en) Electroacoustic musical instrument rhythm controllable method and improvement of karaoke thereof
US20120172099A1 (en) Music game system, computer program of same, and method of generating sound effect data
US10805475B2 (en) Resonance sound signal generation device, resonance sound signal generation method, non-transitory computer readable medium storing resonance sound signal generation program and electronic musical apparatus
JP2007271739A (en) Concert parameter display device
KR100841047B1 (en) Portable player having music data editing function and MP3 player function
JP5773956B2 (en) Music performance apparatus, music performance control method, and program
JP2017015957A (en) Musical performance recording device and program
JP6163755B2 (en) Information processing apparatus, information processing method, and program
JP2008165098A (en) Electronic musical instrument
JP2002301262A (en) Game device using music, its computer program and program storage medium
WO2022209557A1 (en) Electronic musical instrument, electronic musical instrument control method, and program
JP2012132991A (en) Electronic music instrument

Legal Events

Date Code Title Description
AS Assignment

Owner name: KONAMI DIGITAL ENTERTAINMENT CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MIGITERA, OSAMU;NISHIMURA, YOSHITAKA;SIGNING DATES FROM 20111129 TO 20111226;REEL/FRAME:027828/0174

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION