US20080156177A1 - Music search system and music search apparatus - Google Patents

Music search system and music search apparatus

Info

Publication number
US20080156177A1
Authority
US
United States
Prior art keywords
music
data
search
rhythm
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/073,381
Inventor
Naoki Iketani
Masanori Hattori
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Corp
Original Assignee
Toshiba Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Corp
Priority to US12/073,381
Publication of US20080156177A1
Status: Abandoned

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 19/00 Current supply arrangements for telephone systems
    • H04M 19/02 Current supply arrangements for telephone systems providing ringing current or supervisory tones, e.g. dialling tone or busy tone
    • H04M 19/04 Current supply arrangements for telephone systems providing ringing current or supervisory tones, e.g. dialling tone or busy tone, the ringing-current being generated at the substations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/60 Information retrieval; Database structures therefor; File system structures therefor of audio data
    • G06F 16/63 Querying
    • G06F 16/632 Query formulation
    • G06F 16/634 Query by example, e.g. query by humming
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/60 Information retrieval; Database structures therefor; File system structures therefor of audio data
    • G06F 16/68 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/683 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 1/00 Details of electrophonic musical instruments
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2210/00 Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
    • G10H 2210/031 Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
    • G10H 2210/071 Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal for rhythm pattern analysis or rhythm style recognition
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2230/00 General physical, ergonomic or hardware implementation of electrophonic musical tools or instruments, e.g. shape or architecture
    • G10H 2230/005 Device type or category
    • G10H 2230/021 Mobile ringtone, i.e. generation, transmission, conversion or downloading of ringing tones or other sounds for mobile telephony; Special musical data formats or protocols therefor
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2240/00 Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
    • G10H 2240/011 Files or data streams containing coded musical information, e.g. for transmission
    • G10H 2240/046 File format, i.e. specific or non-standard musical file format used in or adapted for electrophonic musical instruments, e.g. in wavetables
    • G10H 2240/056 MIDI or other note-oriented file format
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2240/00 Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
    • G10H 2240/121 Musical libraries, i.e. musical databases indexed by musical parameters, wavetables, indexing schemes using musical parameters, musical rule bases or knowledge bases, e.g. for automatic composing methods
    • G10H 2240/131 Library retrieval, i.e. searching a database or selecting a specific musical piece, segment, pattern, rule or parameter set
    • G10H 2240/141 Library retrieval matching, i.e. any of the steps of matching an inputted segment or phrase with musical database contents, e.g. query by humming, singing or playing; the steps may include, e.g. musical analysis of the input, musical feature extraction, query formulation, or details of the retrieval process
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2240/00 Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
    • G10H 2240/171 Transmission of musical instrument data, control or status information; Transmission, remote access or control of music data for electrophonic musical instruments
    • G10H 2240/201 Physical layer or hardware aspects of transmission to or from an electrophonic musical instrument, e.g. voltage levels, bit streams, code words or symbols over a physical link connecting network nodes or instruments
    • G10H 2240/241 Telephone transmission, i.e. using twisted pair telephone lines or any type of telephone network
    • G10H 2240/251 Mobile telephone transmission, i.e. transmitting, accessing or controlling music data wirelessly via a wireless or mobile telephone receiver, analog or digital, e.g. DECT, GSM, UMTS

Definitions

  • the present invention relates to a music search system and apparatus for searching for music, a musical piece, a song, etc.
  • a method of entering information associated with music, such as a title or a performer (artist), as a character string and searching for it is generally known.
  • a mobile telephone is held to the sound source of music produced in the surrounding of a searcher and the sound is input to an apparatus.
  • A voice waveform of a hummed tune is input.
  • a search is made for music based on performance information of a keyboard instrument, etc.
  • a scale string representing syllable names in a character string, such as do re mi fa, is input for searching for music.
  • a chord progression of music (series of harmony of C, G7, etc.,) is input for searching for music containing the chord progression.
  • the music search methods in the related arts are characterized by the fact that, among the components of music (rhythm, timbre, melody, and harmony), at least one of timbre, melody, or harmony is input.
  • a data searching method by inputting a time-series signal, namely, rhythm only is proposed in JP-A-2004-033492.
  • the rhythm identifying method is not refined, and the approach lacks the components needed to make the most of rhythm input as a music search method.
  • the conventional methods also lack the components needed to implement rhythm input as a search method for ringer music in a mobile telephone.
  • a music search system including: a music search apparatus; and a music search terminal, wherein the music search apparatus includes: an input unit that inputs a time-series signal represented by on/off signals; a data storage unit that stores a plurality of pieces of rhythm data in association with music-associated information associated with music corresponding to the rhythm data; a search unit that searches the plurality of pieces of rhythm data stored in the data storage unit for rhythm data having the same fluctuation pattern as or a similar fluctuation pattern to the time-series signal input to the input unit; and a search result output unit that reads the music-associated information stored in association with the rhythm data found by the search unit from the data storage unit and outputs the read music-associated information as the search result of the search, wherein the music search terminal includes: a communication unit that communicates with the music search apparatus via a communication line; an operation unit that inputs the time-series signal to the input unit through the communication unit; a receiving unit that receives the search result of the input time-series signal through the communication unit from the search result output unit; and a display unit that displays the received search result.
  • a music search apparatus including: an input unit that inputs a time-series signal represented by on/off signals; a data storage unit that stores a plurality of pieces of rhythm data in association with music-associated information associated with music corresponding to the rhythm data; a search unit that searches the plurality of pieces of rhythm data stored in the data storage unit for rhythm data having the same fluctuation pattern as or a similar fluctuation pattern to the time-series signal input to the input unit; and a search result output unit that reads the music-associated information stored in association with the rhythm data found by the search unit from the data storage unit and outputs the read music-associated information as the search result of the search.
  • a music search method including: inputting a time-series signal represented by on/off signals; storing a plurality of pieces of rhythm data in association with music-associated information associated with music corresponding to the rhythm data; searching the plurality of pieces of rhythm data for rhythm data having the same fluctuation pattern as or a similar fluctuation pattern to the time-series signal; reading the music-associated information found by the search; and outputting the read music-associated information as the search result of the search.
  • a computer-readable program product for causing a computer system to execute procedures for searching for music, including: inputting a time-series signal represented by on/off signals; storing a plurality of pieces of rhythm data in association with music-associated information associated with music corresponding to the rhythm data; searching the plurality of pieces of rhythm data for rhythm data having the same fluctuation pattern as or a similar fluctuation pattern to the time-series signal; reading the music-associated information found by the search; and outputting the read music-associated information as the search result of the search.
  • FIG. 1 is a drawing to show the configuration of a music search apparatus based on rhythm input according to an embodiment of the invention;
  • FIG. 2 is a function block diagram to show the configuration of a music search apparatus based on rhythm input according to the embodiment;
  • FIG. 3 is a flowchart to describe the mobile telephone operation and the search apparatus operation;
  • FIG. 4 is a drawing to show a composition example of data stored in a music data storage section 201;
  • FIG. 5 is a flowchart to show a processing flow of rhythm data generation;
  • FIG. 6 is a drawing to show a composition example of data stored in a rhythm data storage section 203;
  • FIG. 7 is a flowchart to show a processing flow of similar rhythm search conducted by a similar rhythm search section 206;
  • FIG. 8 is a drawing to describe a procedure of converting an input time-series signal train into rhythm data;
  • FIG. 9 is a drawing to show an example of the similar rhythm search result;
  • FIG. 10 is a drawing to show an example of information relevant to ringer tone stored in a music-associated information storage section 204;
  • FIG. 11 is a drawing to show an example of the finally output search result;
  • FIG. 12 is a flowchart to describe two-stage search; and
  • FIG. 13 is a block diagram to show the configuration when the music search apparatus according to the embodiment of the invention is implemented in a computer.
  • FIG. 1 is a drawing to show the configuration of a music search terminal, such as a mobile telephone, based on rhythm input according to an embodiment of the invention.
  • a music search terminal 101 of mobile telephone type is included.
  • the embodiment typically is implemented as a computer controlled by software.
  • the software in this case includes a program and data, the functions and effects of the invention are provided by making the most of computer hardware physically, and appropriate related arts are applied to portions where the related arts can be applied. Further, the specific types and configurations of hardware and software for embodying the invention, the software processing range, and the like can be changed as desired. Therefore, in the description that follows, a virtual function block diagram indicating the component functions of the invention as blocks is used. A program for operating a computer to embody the invention is also one form of the invention.
  • a music data storage section 201 has a function of storing the music data to be searched in formats such as ringer melody in SMAF (“.mmf”) format, a kind of ringer music format; MIDI format; “Chaku-uta (registered trademark in Japan)” audio data format; linear PCM code audio format; MPEG1 AUDIO Layer 3 format; etc.
  • the music data storage section 201 is implemented as a record medium such as memory or a hard disk, for example.
  • a rhythm data generation section 202 receives music data from the music data storage section 201 and, as required, edit data from a rhythm data editor, generates for each piece of music the rhythm data fitted to that piece, which is used as a search key in searching for music, and then registers the rhythm data in a rhythm data storage section 203 .
  • the rhythm data fitted to each piece of music is data formed as the rhythm assumed to be entered by the user in searching for that piece, or a rhythm group containing that rhythm; generally, the rhythm of the beginning portion, the characteristic portion, or the theme of the music becomes the rhythm data for the input music. That is, the rhythm data generation section 202 generally generates the rhythm data of the beginning, the characteristic portion, the theme, etc., of the input music data as the registered rhythm data.
  • the theme, however, is not necessarily the fitted rhythm, so rhythm data may be created for each of a plurality of parts contained in MIDI data.
  • to generate MIDI data from a WAVE file of linear PCM code audio format, a spectrum analysis of the audio is conducted and mechanical conversion may be executed, for example, using “MuseBook(R) Wav2Midi Version 1.0,” or using a technique employed in computer software called “Saifunotatsujin.”
  • alternatively, it is also possible to generate rhythm data directly from the WAVE file or the MPEG1 AUDIO Layer 3 format, depending on the music data.
  • if the music involves vocals, the vocal signal is extracted through an analog filter that performs signal processing to narrow the signal down to the voice frequency band; further, in music data of stereo audio, using the fact that the vocal part is fixed at the center while the chorus and other instrumental sounds are fixed at the left and right, the vocal signal can be extracted more precisely.
  • for the vocal signal thus extracted, a portion where the sound volume or the sound volume change rate is high is assumed to be vocalization and its time is recorded, whereby the song rhythm can be generated.
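  • As an illustration only (not the disclosed implementation), the following Python sketch records the times where the volume of the center (vocal) signal rises sharply. It assumes a 16-bit stereo WAVE file, omits the voice-frequency band filter, and uses illustrative threshold values.

        import wave
        import numpy as np

        def song_rhythm_ms(path, frame_ms=20, rise_factor=2.0, floor=500.0):
            """Return times (ms) assumed to be vocalization onsets in a stereo WAVE file."""
            with wave.open(path, "rb") as w:
                rate = w.getframerate()
                pcm = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
            left, right = pcm[0::2].astype(float), pcm[1::2].astype(float)
            center = (left + right) / 2.0        # the vocal part is fixed at the center
            hop = int(rate * frame_ms / 1000)    # frame length in samples
            env = np.array([np.abs(center[i:i + hop]).mean()
                            for i in range(0, len(center) - hop, hop)])
            onsets = []
            for i in range(1, len(env)):         # high volume change rate -> vocalization
                if env[i] > floor and env[i] > rise_factor * env[i - 1]:
                    onsets.append(i * frame_ms)  # record the time in milliseconds
            return onsets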
  • if characteristic portion detection technology is used, rhythm data of the characteristic portion assumed to be entered by the user can be generated with high accuracy.
  • the characteristic portion detection technology is described in the following document: “Masataka Gotou: “Real time musical scene description system: Sabikukan ken'shutsushuhou,” Jyouhoushori gakkai On'gakujyouhoukagaku ken'kyuukai ken'kyuu houkoku 2002-MUS-47-6, Vol. 2002, No. 100, pp. 27-34, October 2002.”
  • a rhythm data storage section 203 retains the rhythm data generated by the rhythm data generation section 202 and provides the rhythm data for a similar rhythm search section 206 .
  • the rhythm data storage section 203 is implemented as memory in a computer, for example.
  • a time-series signal input section 205 receives a music rhythm signal entered by a music searcher as a search key for searching for music (the input time-series signal) and outputs it to the similar rhythm search section 206 as the input time-series signal.
  • the time-series signal input section 205 may have a function capable of detecting time change of ON/OFF and can be implemented as any of various machines such as not only various keys and buttons of a keyboard of a personal computer (PC), a mouse, buttons of a mobile telephone, and buttons of a remote control, but also a touch panel and an infrared sensor.
  • the similar rhythm search section 206 inputs the input time-series signal input from the time-series signal input section 205 and the rhythm data retained in the rhythm data storage section 203 , searches for rhythm data similar to the signal pattern of the input time-series signal, and outputs the search result to a search result generation section 207 .
  • a music-associated information storage section 204 is a section for retaining titles, composer names, songwriters, performer names, a part or all of words, URL where music data exists, music data, etc., associated with music in association with the music data, and is implemented as memory in a computer, for example.
  • the music-associated information storage section 204 may further retain syllable names, score data, words, or sound data associated with music.
  • the search result generation section 207 inputs the search result of the similar rhythm search section 206 , references the music-associated information from the music-associated information storage section 204 in response to the search result as required, and generates output data for outputting the search result to the music searcher. For example, from the search result of the similar rhythm search section 206 , as relevant information to the found rhythm data, the title, the composer name, the songwriter, the performer name, a part or all of words, the URL where the music data corresponding to the rhythm data exists, the music data, and the like are output as a set to a search result output section 208 as output data.
  • the search result output section 208 outputs the output data generated by the search result generation section 207 to the searcher.
  • the data is output as screen information or sound information with the mobile telephone used for the search.
  • the mobile telephone may be any type of mobile telephone including PHS, a mobile telephone involved in radio LAN communications, etc.
  • the ringer tone refers to a sound produced to inform the mobile telephone user that a telephone call or mail comes in the mobile telephone, and contains not only a musical piece, but also a voice only.
  • the operation of the mobile telephone is executed according to a procedure shown in FIG. 3 .
  • Step S 301 Mobile telephone 101 transmits an input time-series signal to search apparatus via a radio communication line.
  • To input the input time-series signal, for example, rhythm is input using one button of the mobile telephone, and the input time-series signal is transmitted to the search apparatus in sequence.
  • Step S 302 The mobile telephone 101 receives the search result from the search apparatus via the radio communication line and outputs the search result to a display 102 or a speaker 111 of the mobile telephone 101 .
  • Step S 303 To terminate the ringer tone search processing, the mobile telephone 101 goes to step S 304 ; to continue the ringer tone search processing, the mobile telephone 101 returns to step S 301 .
  • Step S 304 The mobile telephone 101 sends a search processing termination notification to the search apparatus via the radio communication line and terminates the processing.
  • the operation of the search apparatus is executed according to a procedure shown in FIG. 3 .
  • Step S 351 The search apparatus receives the input time-series signal input in sequence from the mobile telephone via the radio communication line, passes the input time-series signal to the similar rhythm search section 206 in sequence, and makes a search request for searching for rhythm data similar to the input time-series signal.
  • Step S 352 The search apparatus executes search in sequence with the passage of time and outputs the search result produced using music-associated information related to the found rhythm data to the mobile telephone via the radio communication line.
  • the title and the performer name among the title, the composer name, the songwriter, the performer name, a part or all of words, the URL where the music data exists, and the music data are used as the music-associated information.
  • Step S 353 To continue the search, the search apparatus returns to step S 351 ; to terminate the search, the processing is terminated.
  • the search apparatus searches the rhythm data storage section 203 in sequence for the rhythm data having rhythm similar to the input time-series signal and transmits the rhythm data to the mobile telephone.
  • the music searcher can obtain the search results in sequence, playing back candidate ringer tones to be found, seeing the titles of the ringer tones, etc.
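  • As an illustration of this sequential flow (steps S 351 to S 353 ), the following Python sketch shows an apparatus-side loop that re-runs the similar rhythm search each time a tap arrives. The receive/send callables, the search_fn parameter, and the "END" message are hypothetical stand-ins for the radio communication line, not part of the disclosure; generate_search_result is sketched above.

        def search_apparatus_loop(receive, send, search_fn, music_info):
            taps_ms = []                         # tap times received so far
            while True:
                msg = receive()                  # next tap time (ms) or "END"
                if msg == "END":                 # termination notification (step S 304)
                    break
                taps_ms.append(msg)
                if len(taps_ms) < 3:             # need at least two intervals
                    continue
                rhythm = [b - a for a, b in zip(taps_ms, taps_ms[1:])]
                hits = search_fn(rhythm)         # similar rhythm search (step S 352)
                send(generate_search_result(hits, music_info))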
  • since the music-associated information storage section 204 can also retain the syllable names, the score data, the words, or the sound data associated with a piece of music, the music searcher can also check them as the search result.
  • the input time-series signal is transmitted in sequence from the mobile telephone to the search apparatus, which then executes search in sequence.
  • Search may be executed with input of a search execution button pressed on the mobile telephone as a trigger.
  • the music searcher using the mobile telephone 101 inputs the input time-series signal for a given time and then presses the search execution button for transmitting the input time-series signal for the given time to the search apparatus.
  • the search apparatus may search the rhythm data storage section 203 for the rhythm data having rhythm similar to the input time-series signal and may transmit the rhythm data to the mobile telephone.
  • the mobile telephone needs to have a function for acquiring the input time-series signal.
  • ringer tone data or data to acquire ringer tone is transmitted to the mobile telephone, whereby the user of the mobile telephone can download any desired ringer tone from the search result.
  • the data stored in the music data storage section 201 is data for enabling music to be played back using a computer; for example, it includes data in SMAF (Synthetic music Mobile Application Format) represented by extension “.mmf,” a typical data format of ringer melody, and data in SMF (Standard Midi File) format represented by extension “.mid,” a more general MIDI standard format, as shown in FIG. 4 .
  • the data covers general music data, such as ringer melody data known by extension “.pmd,” Chaku-uta (registered trademark in Japan) having extension “.amc,” etc., data with extension “.wav” called WAVE format, and data with extension “.mp3” in MPEG1 AUDIO Layer 3 format.
  • the music data is stored in association with music data ID.
  • the music data ID is called CID.
  • the music data storage section 201 can be implemented as a record medium such as memory or a hard disk in a server computer.
  • the rhythm data generation section 202 inputs music data from the music data storage section 201 and generates rhythm data fitted to each piece of music for each piece of music.
  • music data in the MIDI format, including SMF, is like digital data of score information and is made up of the tempo of the music and the timbre, duration, pitch, etc., of each separate sound.
  • the rhythm data generation section 202 excludes the timbre information and the pitch information from the data, extracts the duration information, and converts it into data.
  • Step S 501 MIDI data file in the SMF format is read together with the CID from the music data storage section 201 .
  • Step S 502 If a plurality of tracks exist in the MIDI data, one track is selected according to assistance input of the operator.
  • Step S 503 Time information (delta time) at which sound rising information (note-on message) occurs in the selected track is cut out, and the sound interval information is added to an array as a numeric string. This step is repeated, and the sound interval information string for the whole music data is retained.
  • Step S 504 For the sound interval information string, one or more portions used as rhythm data are selected according to assistance input of the operator.
  • Step S 505 One or more pieces of sound interval information selected at step S 504 are output to the rhythm data storage section 203 as the rhythm data in association with the CID.
  • the processing section described above may be accomplished using any of various known GUI functions in related arts.
  • as rhythm data, the time interval string of keying is indicated, separated by commas in millisecond units, and rhythm data r is represented as follows:
  • r = {r1, r2, r3, . . . , rn}
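  • The following Python sketch illustrates steps S 501 to S 505 under stated assumptions: it uses the third-party mido library as the SMF reader, assumes that tempo (set_tempo) events appear in the selected track, and skips the operator-assisted portion selection of step S 504, emitting the whole track as one rhythm data string in the comma-separated format just described.

        import mido

        def generate_rhythm_data(path, track_no):
            mid = mido.MidiFile(path)            # read the SMF file (step S 501)
            tempo = 500000                       # default tempo, microseconds per beat
            sec, onsets = 0.0, []
            for msg in mid.tracks[track_no]:     # the selected track (step S 502)
                sec += mido.tick2second(msg.time, mid.ticks_per_beat, tempo)
                if msg.type == "set_tempo":
                    tempo = msg.tempo
                if msg.type == "note_on" and msg.velocity > 0:
                    onsets.append(sec * 1000)    # sound rising time in ms (step S 503)
            intervals = [round(b - a) for a, b in zip(onsets, onsets[1:])]
            return ",".join(str(i) for i in intervals)   # rhythm data string (step S 505)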
  • rhythm data generation section 202 is implemented as a CPU and a program of a server computer.
  • the rhythm data storage section 203 is a function block for retaining the rhythm data in association with the identifier; typically, it is implemented as storage of a computer.
  • the rhythm data storage section 203 is constructed as memory or a hard disk of a server computer and memory of a mobile telephone.
  • the rhythm data storage section 203 retains not only the rhythm data generated and output in the rhythm data generation section 202 , but also the rhythm data created outside the search apparatus and the rhythm data manually entered by the user.
  • FIG. 6 shows a composition example of the data stored in the rhythm data storage section 203 .
  • the rhythm data storage section 203 stores the CID of the music data ID and the rhythm data in association with each other.
  • the time-series signal input section 205 outputs the time change of ON/OFF entered by the enterer to the similar rhythm search section 206 as an input time-series signal train.
  • the time-series signal input section 205 is an input unit that can detect at least one or more ON/OFF states; typically it is implemented as a button of a mechanical part involving electronic output. In the embodiment, it is a part of the mobile telephone of the searcher.
  • the button and the circuit for receiving the button signal correspond to the time-series signal input section 205 .
  • the time-series signal input section 205 can be implemented not only as the mobile telephone, but also as a keyboard, a mouse, etc., of PC, a touch panel, a remote control button, etc., if it can detect the ON/OFF state, as described above.
  • the similar rhythm search section 206 inputs the input time-series signal train, selects rhythm data similar to the fluctuation pattern of the signal, and outputs the similarity degree between one or more candidates and input patterns as a set to the search result generation section 207 .
  • the similar rhythm search section 206 is implemented as a server computer and a CPU and a program of a mobile telephone. Typically, it is implemented as a program of a computer for converting the input signal train into rhythm data and then performing calculation using an algorithm for calculating the similarity between two rhythms.
  • Step S 701 The similar rhythm search section 206 receives the input time-series signal from the time-series signal input section 205 and converts the input time-series signal into input rhythm data.
  • Step S 702 One piece of rhythm data is taken out from the rhythm data storage section 203 and the similarity between the rhythm data and the input rhythm data is calculated based on a similarity determination algorithm.
  • Step S 703 If the similarity has not yet been calculated for all rhythm data, the process returns to step S 702 and the processing is repeated; if the similarity has been calculated for all rhythm data, the process goes to step S 704 .
  • Step S 704 The CIDs and the similarity degrees s for the three high-order pieces of rhythm data in the descending order of the rhythm similarity of the calculation result are output to the search result generation section 207 and the processing is terminated.
  • the procedure of converting the input time-series signal train into rhythm data is simple: the time at which each OFF-to-ON state transition is made is detected, each time interval is described in millisecond units, and the time interval string is converted into comma-separated data, as shown in FIG. 8 .
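  • A Python sketch of this conversion (the sampled event representation is an assumption; any source of button-state samples works):

        def to_input_rhythm(events):
            """events: list of (time_ms, pressed) samples of the button state."""
            on_times, prev = [], False
            for t, pressed in events:
                if pressed and not prev:         # OFF-to-ON state transition
                    on_times.append(t)
                prev = pressed
            return [b - a for a, b in zip(on_times, on_times[1:])]

        # e.g. presses starting at 0, 300, 600 and 1200 ms yield [300, 300, 600]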
  • as the similarity determination algorithm, a calculation method according to the rhythm vector technique in non-patent documents 1 and 2, a similarity degree calculation method according to the differential square sum, a similarity degree calculation method using dynamic programming, etc., can be used.
  • the similarity degree calculation method according to the differential square sum technique is adopted, and the calculation method will be discussed below:
  • Ri = {ri1, ri2, ri3, . . . , rin}
  • Rj = {rj1, rj2, rj3, . . . , rjn, . . . , rjm}
  • where Ri is the input rhythm data with n intervals and Rj is a piece of stored rhythm data with m intervals (n ≤ m); the similarity degree s between Ri and Rj is then calculated from the differential square sum of their corresponding intervals.
  • the similar rhythm search result as shown in FIG. 9 is output as the result of the calculation.
  • FIG. 9 shows that the CID of the music data with the highest similarity degree is “MUS0003” and the similarity degree is “0.8.”
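  • Since the text names the differential square sum technique but does not reproduce the formula, the following Python sketch is one plausible reading, not the disclosed calculation: the n input intervals are slid across each stored rhythm, both windows are normalized by their mean so that the tapping tempo does not matter, and the smallest mean squared difference d is mapped to s = 1/(1 + d), giving s = 1.0 for an exact match (consistent with FIG. 9 and with 1 as the maximum value). Intervals are assumed positive.

        def similarity(input_rhythm, stored_rhythm):
            n, m = len(input_rhythm), len(stored_rhythm)
            if n == 0 or n > m:
                return 0.0
            def normalize(v):                    # tempo-invariant: scale to unit mean
                mean = sum(v) / len(v)
                return [x / mean for x in v]
            a, best_d = normalize(input_rhythm), float("inf")
            for off in range(m - n + 1):         # slide over the stored rhythm
                b = normalize(stored_rhythm[off:off + n])
                d = sum((x - y) ** 2 for x, y in zip(a, b)) / n   # differential square sum
                best_d = min(best_d, d)
            return 1.0 / (1.0 + best_d)          # 1.0 means an exact match

        def similar_rhythm_search(input_rhythm, rhythm_store, top=3):
            scored = [(cid, similarity(input_rhythm, r))          # steps S 702, S 703
                      for cid, r in rhythm_store.items()]
            scored.sort(key=lambda p: p[1], reverse=True)
            return scored[:top]                  # three high-order pieces (step S 704)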
  • the music-associated information storage section 204 is a function block for retaining information relevant to ringer tone and retaining associated information of the title, composer name, etc., for example, with the identifier for identifying the ringer tone as a key.
  • the music-associated information storage section 204 is implemented as memory of a computer, etc.
  • the music-associated information storage section 204 retains data of “title,” “music genre,” “composer name,” “songwriter name,” and “URL where music data exists” with the CID of the music identifier as a key, as shown in FIG. 10 .
  • the invention is not limited to the format; for example, all information relevant to music and its data, such as the performer name, the album name, the music play time, the data size of ringer tone data, a part or all of words, and the ringer tone data, can be stored.
  • the search result generation section 207 receives the pairs of the CIDs and the similarity degrees s input from the similar rhythm search section 206 , organizes them as output data in the search result output section 208 , and outputs the organized output data to the search result output section 208 .
  • any other information which needs to be output as the search result is acquired from the music-associated information storage section 204 based on the CID.
  • the output data is made up of text information, image information, sound information, the score of the similarity degree, etc.
  • the similarity degree may be normalized, as a score based on the similarity degree s, so that 0 points mean complete non-similarity and 100 points mean a complete match. Accordingly, the searcher can see at a glance how well each result matches the input.
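  • With s in (0, 1] as in the sketch above, this normalization can be as simple as:

        def to_score(s):
            return round(100 * max(0.0, min(1.0, s)))   # 0 = no similarity, 100 = complete match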
  • the search result output section 208 receives the search result from the search result generation section 207 and outputs the search result by displaying a character string or an image or playing back music, etc.
  • the ringer tone data is downloaded from URL1, where the data exists according to the music-associated information shown in FIG. 10 , is stored in the memory in the mobile telephone, and the voice is output from the speaker 111 connected to the mobile telephone 101 .
  • search for ringer tone in the mobile telephone is taken as an example, but the invention is not limited to it.
  • the invention can also be embodied as a music search apparatus in an apparatus, such as a mobile terminal or a PC, that can play back/download music.
  • the rhythm data storage section 203 and the music data storage section 201 are storage media such as memory in the mobile terminal, and the similar rhythm search section 206 is implemented as a CPU and a program of the mobile terminal for playing back music data as output.
  • the data of “title,” “music genre,” “composer name,” “songwriter name,” “URL where music data exists,” etc., is obtained with the CID of the music identifier as a key.
  • alternatively, the data of “title,” “music genre,” “composer name,” “songwriter name,” “URL where music data exists,” etc., may be obtained with the ID (identifier) of the rhythm data as a key.
  • the search of the invention can also be used for the purpose of calling to play back a musical piece retained in the mobile telephone as the search result.
  • at step S 503 , only the time information (delta time) at which sound rising information (note-on message) occurs in the selected track is cut out. However, sound falling information (note-off message) may also be cut out, and the sound rising information and the sound falling information may be added together to the array as a numeric string. In this case, the rising time and the falling time are recorded alternately in the numeric array, and the data becomes time interval information with the sound switched between ON and OFF.
  • to match the rhythm data thus created, not only the time at which the OFF-to-ON state transition is made, but also the time at which the ON-to-OFF state transition is made is detected at the step where the input rhythm data is generated from the time-series signal input section 205 (step S 701 ). If this data is used, search can be conducted as in the embodiment described above. Accordingly, search making the most of not only the sound rising timing but also the duration information can be conducted.
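  • A Python sketch of this variant, recording both transitions so that the interval array alternates sounding time and silent time (the event representation is the same assumption as in the earlier sketch):

        def to_on_off_rhythm(events):
            """events: list of (time_ms, pressed); returns alternating ON/OFF durations."""
            times, prev = [], False
            for t, pressed in events:
                if pressed != prev:              # any transition, OFF-to-ON or ON-to-OFF
                    times.append(t)
                    prev = pressed
            return [b - a for a, b in zip(times, times[1:])]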
  • the music search apparatus based on rhythm input can be made easier to use and easier to implement.
  • the functions of the function blocks shown in FIG. 2 are installed in both a mobile telephone serving as a music search terminal and a music search apparatus serving as a server. If a search result of a predetermined similarity degree is not obtained in the terminal such as the mobile telephone, search is made using the music search apparatus of the server.
  • the rhythm data storage section 203 T in the mobile telephone 101 stores a subset of the data retained in the rhythm data storage section 203 S in the music search apparatus of the server, and it is effective to store the rhythm data of musical pieces that are highly likely to become search results.
  • the rhythm data storage section 203 T is not limited to a subset; if the rhythm data of the musical pieces retained by the user and the rhythm data keyed in by the user are stored, search results unique to the user can also be provided.
  • FIG. 12 is a flowchart to describe the operation of the music search apparatus.
  • for the functions in the mobile telephone, a suffix of T (Terminal) is added to each reference numeral, and for the functions in the music search apparatus of the server, a suffix of S (Server) is added to each reference numeral.
  • a time-series signal input section 205 T receives a time-series signal with the passage of time and outputs the time-series signal to a similar rhythm search section 206 T in the mobile telephone (step S 1201 ).
  • This can be executed in the mobile telephone as an application, such as i-appli (registered trademark in Japan), as follows: for example, a program for detecting an event of a center button being pressed and receiving an input time-series signal forms the time-series signal input section 205 T, and rhythm data previously defined as constant data is stored in a scratch pad of a data storage mechanism of the i-appli, thereby forming a rhythm data storage section 203 T.
  • the similar rhythm search section 206 T, formed in the mobile telephone as an i-appli program, for example, references the rhythm data storage section 203 T in the mobile telephone and performs search processing similar to that in the first embodiment (step S 1202 ).
  • when a search result of at least the predetermined similarity degree is obtained, a search result generation section 207 T uses music-associated information obtained from a music-associated information storage section 204 T to generate, from the obtained result, the search result to be output as the final search result (step S 1204 ).
  • the predetermined similarity degree mentioned above can be set to 0.9, for example, in a similarity degree scale with 1 as the maximum value.
  • when a search result of the predetermined similarity degree is not obtained, the time-series signal is transmitted to the music search apparatus of the server as a search request (step S 1205 ).
  • in the music search apparatus of the server, a rhythm data storage section 203 S is referenced and search processing is performed (step S 1251 ); then the search result to be output as the final search result is generated using music-associated information obtained from a music-associated information storage section 204 S and is transmitted to the mobile telephone terminal (step S 1252 ) as in the first embodiment.
  • the process returns to step S 1251 .
  • the search result is received from the music search apparatus of the server (step S 1206 ).
  • the obtained search result is output to a display, etc., (step S 1207 ).
  • a search processing termination notification is sent to the search apparatus via a radio communication line and the processing is terminated (step S 1208 ).
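  • The two-stage flow can be summarized in a Python sketch; THRESHOLD and the query_server callable are illustrative assumptions standing in for the predetermined similarity degree and the radio communication line, and similar_rhythm_search is as sketched earlier.

        THRESHOLD = 0.9                          # predetermined similarity degree (max 1)

        def two_stage_search(input_rhythm, local_store, query_server):
            hits = similar_rhythm_search(input_rhythm, local_store)   # terminal (S 1202)
            if hits and hits[0][1] >= THRESHOLD:                      # check (S 1203)
                return hits                      # fast path: no communication delay
            return query_server(input_rhythm)    # server search (S 1205, S 1206)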
  • the music search apparatus based on rhythm input can be made easier to use and easier to implement.
  • when a search result of the predetermined similarity degree can be obtained in the mobile telephone terminal, the result is obtained without the delay caused by communications, etc., so the search result is obtained at high speed. When a search result of the predetermined similarity degree cannot be obtained, the search result is provided by the music search apparatus of the server, so a search result considered to be more correct can be obtained.
  • the server has a margin of the storage capacity as compared with the terminal and thus stores more rhythm data than the terminal.
  • when search is executed in the music search apparatus of the server, the user can also obtain the search result for a musical piece whose rhythm data is not retained in the terminal, although a delay is caused by communications, etc.
  • the user may be enabled to command the music search apparatus of the server to conduct search.
  • the search result is output together with the words, score, syllable names, and sound of the portion corresponding to the keyed input time-series signal.
  • a part or all of the words, a part or all of the syllable names of the notes of the musical piece, a part or all of the score, or voice audition data is played back so that the user can keep track of the portion corresponding to the input time-series signal. Accordingly, if the user enters the rhythm pattern of an impressive phrase without knowing the name of a musical piece and several candidates are displayed, the user can easily determine which candidate is the desired one.
  • for each piece of rhythm data, a time stamp indicating at which time position, in seconds from the start of the musical piece containing the rhythm, the rhythm appears is retained.
  • one time stamp is registered, but two or more time stamps can be registered.
  • the corresponding portions of the words, syllable names, score, and voice data are registered by time stamp, whereby each piece of music-associated information can be related to each piece of rhythm data for output.
  • the output is generated as a part of the search result in a search result generation section 207 and is output in a search result output section 208 , so that the user can rapidly determine whether or not the search result is as intended according to the words and the musical scale of the portion corresponding to the input time-series signal.
  • if the data is voice data, the time of the start position of each reasonable portion of the words, syllable names, score, etc., is tagged in advance as associated information, whereby the words, syllable names, score, etc., containing the input range of the user's time-series signal can be output.
  • rather than the output mode of only part of the words, syllable names, score, etc., an output mode in which all the words, syllable names, score, etc., are output and the input range of the user's time-series signal is then displayed in a different color, etc., is also effective.
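  • A Python sketch of selecting the words portion containing the input range, assuming each line of words has been tagged in advance with its start time and each piece of rhythm data carries a time stamp (the field layout is illustrative):

        def words_for_input(rhythm_stamp_s, input_len_s, tagged_words):
            """tagged_words: list of (start_s, text) sorted by start time."""
            lo, hi = rhythm_stamp_s, rhythm_stamp_s + input_len_s
            selected = []
            for i, (start, text) in enumerate(tagged_words):
                end = tagged_words[i + 1][0] if i + 1 < len(tagged_words) else float("inf")
                if start < hi and end > lo:      # the portion overlaps the input range
                    selected.append(text)
            return selected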
  • the music search apparatus based on rhythm input can be made easier to use and easier to implement.
  • An input time-series signal at the search time is converted into rhythm data, which is then stored in a rhythm data storage section 203 for reuse for the later search, thereby enhancing the search accuracy.
  • the portion wherein the search result of a similar rhythm search section based on a signal received at a time-series signal input section is output in a search result output section is similar to that in the first embodiment.
  • the input time-series signal is converted into the data format of rhythm data for retention in the rhythm data storage section 203 .
  • when the searcher selects a musical piece from the search result, the retained input time-series signal is converted into the data format of rhythm data and is additionally registered in the rhythm data storage section 203 as one piece of the rhythm data of the selected musical piece.
  • the data format conversion may be executed according to a method similar to the method previously described with reference to FIG. 8 in the first embodiment.
  • in response to a later search request with a similar input time-series signal, the possibility that the candidate musical piece selected this time is displayed as a higher-order candidate becomes high.
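  • A minimal Python sketch of this reuse, assuming the rhythm data storage section keeps a list of rhythm data per CID (an assumption; the disclosed storage layout is not specified at this level):

        def register_feedback(rhythm_store, cid, input_rhythm):
            """rhythm_store: dict mapping CID to a list of rhythm data for that piece."""
            rhythm_store.setdefault(cid, []).append(list(input_rhythm))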
  • the music search apparatus based on rhythm input can be made easier to use and easier to implement.
  • An information edit apparatus of the invention can be implemented as a program operated in a computer such as a workstation (WS) or a personal computer (PC).
  • FIG. 13 is a block diagram to show a configuration example when an information edit apparatus (music search apparatus) according to the invention is implemented in a computer.
  • This computer includes a central processing unit 1301 for executing a program, memory 1302 for storing the program and data being processed by the program, a magnetic disk drive 1303 for storing the program, the data to search for, and OS (Operating System), and an optical disk drive 1304 for reading/writing the program and data from/to an optical disk.
  • the computer further includes an image output section 1305 serving as an interface for displaying a screen on a display, etc., an input acceptance section 1306 for accepting input from a keyboard, a mouse, a touch panel, etc., and an output/input section 1307 serving as an output/input interface with an external machine (for example, USB (Universal Serial Bus), a voice output terminal, etc.).
  • the computer also includes a display 1308 such as an LCD, a CRT, or a projector, an input unit 1309 such as a keyboard or a mouse, and an external machine 1310 such as a memory card reader or a speaker.
  • the central processing unit 1301 reads the program from the magnetic disk drive 1303 and stores the program in the memory 1302 and then executes the program, thereby implementing the function blocks shown in FIG. 2 .
  • a part or all of the data to search for may be read from the magnetic disk drive 1303 and may be stored in the memory 1302 .
  • a search request made by the user is received through the input unit 1309 and a search is made for the data to search for stored in the magnetic disk drive 1303 or the memory 1302 in response to the search request.
  • the search result is displayed on the display 1308 .
  • the search result not only is displayed on the display 1308 , but also may be presented to the user by voice with a speaker connected to the computer as the external machine 1310 , for example. Alternatively, the search result may be presented to the user as printed matter with a printer connected to the computer as the external machine 1310 .
  • a music search apparatus including input means that inputs a time-series signal whose on state and off state are repeated alternately; data storage means for storing a plurality of pieces of rhythm data in association with music-associated information associated with music corresponding to the rhythm data; search means for searching the plurality of pieces of rhythm data stored in the data storage means for rhythm data having the same fluctuation pattern as or a similar fluctuation pattern to the time-series signal input to the input means; and search result output means for reading the music-associated information stored in association with the rhythm data found by the search means from the data storage means and outputting the read music-associated information as the search result of the search.
  • the invention relating to the apparatus also holds as the invention relating to a method and the invention relating to the method also holds as the invention relating to the apparatus.
  • the invention relating to the apparatus or the method also holds as a program for causing a computer to execute a procedure corresponding to the invention (or causing a computer to function as means corresponding to the invention or causing a computer to provide functions corresponding to the invention) and also holds as a computer-readable record medium recording the program.
  • the music search apparatus that makes it possible, in response to rhythmical ON/OFF change input of a time-series signal, to search for music containing a rhythm similar to the input rhythm and to inspect the music information or play back the music can be realized.

Abstract

A time-series signal input section 205 inputs a time-series signal whose on state and off state are repeated alternately. A similar rhythm search section 206 searches a plurality of pieces of rhythm data stored in a data storage section for rhythm data having the same fluctuation pattern as or a similar fluctuation pattern to the time-series signal input to the time-series signal input section 205. A music-associated information storage section 204 stores music-associated information associated with the piece of music corresponding to the rhythm data in association with the rhythm data. A search result generation section 207 generates the search result using the music-associated information (information of title, etc.,) stored in association with the found rhythm data and outputs the search result through a search result output section 208.

Description

    RELATED APPLICATIONS
  • The present disclosure relates to the subject matter contained in Japanese Patent Application No. 2004-288433 filed on Sep. 30, 2004, which is incorporated herein by reference in its entirety.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to a music search system and apparatus for searching for music, a musical piece, a song, etc.
  • 2. Description of the Related Art
  • As a music search method in a related art, a method of entering information associated with music, such as a title or a performer (artist), as a character string and searching for it is generally known.
  • In addition, as a method of searching for music or information concerning music (for example, the artist or performer, etc.), the following search methods based on input are known:
  • (1) A mobile telephone is held to the sound source of music produced in the surrounding of a searcher and the sound is input to an apparatus.
    (2) A voice waveform of a hummed tune is input.
    (3) A search is made for music based on performance information of a keyboard instrument, etc.
    (4) A scale string representing syllable names in a character string, such as do re mi fa, is input for searching for music.
    (5) A chord progression of music (series of harmony of C, G7, etc.,) is input for searching for music containing the chord progression.
  • The music search methods in the related arts are characterized by the fact that, among the components of music (rhythm, timbre, melody, and harmony), at least one of timbre, melody, or harmony is input.
  • In contrast, a method of playing back sound by inputting a time-series signal, namely, rhythm only, is proposed in JP-A-2003-142511. Similar methods are disclosed in the following documents:
      • Naoki Iketani, Masanori Hattori, Akihiko Oosuga: “Rhythm inputting interface “Ta-ta-ta-tap”,” Jyouhoushori gakkai dai66kai zen'kokutaikai 4A-4, 2004
      • Haruto Takeda, Kouichi Shinoda, Shigeki Sagayama, et al.: “Rhythm recognition using Rhythm vector,” Jyouhoushori gakkai ken'kyuu houkoku “On'gaku jyouhou kagaku” No. 46, 2002
  • A data searching method by inputting a time-series signal, namely, rhythm only is proposed in JP-A-2004-033492.
  • SUMMARY OF THE INVENTION
  • However, in the music search method based on input of a time-series signal in the documents described above, the rhythm identifying method is not refined, and the approach lacks the components needed to make the most of rhythm input as a music search method.
  • The conventional methods also lack the components needed to implement rhythm input as a search method for ringer music in a mobile telephone.
  • It is therefore one of objects of the invention to make a music search apparatus based on rhythm input easier to use and easier to implement.
  • According to a first aspect of the invention, there is provided a music search system including: a music search apparatus; and a music search terminal, wherein the music search apparatus includes: an input unit that inputs a time-series signal represented by on/off signals; a data storage unit that stores a plurality of pieces of rhythm data in association with music-associated information associated with music corresponding to the rhythm data; a search unit that searches the plurality of pieces of rhythm data stored in the data storage unit for rhythm data having the same fluctuation pattern as or a similar fluctuation pattern to the time-series signal input to the input unit; and a search result output unit that reads the music-associated information stored in association with the rhythm data found by the search unit from the data storage unit and outputs the read music-associated information as the search result of the search, wherein the music search terminal includes: a communication unit that communicates with the music search apparatus via a communication line; an operation unit that inputs the time-series signal to the input unit through the communication unit; a receiving unit that receives the search result of the input time-series signal through the communication unit from the search result output unit; and a display unit that displays the received search result.
  • According to a second aspect of the invention, there is provided a music search apparatus including: an input unit that inputs a time-series signal represented by on/off signals; a data storage unit that stores a plurality of pieces of rhythm data in association with music-associated information associated with music corresponding to the rhythm data; a search unit that searches the plurality of pieces of rhythm data stored in the data storage unit for rhythm data having the same fluctuation pattern as or a similar fluctuation pattern to the time-series signal input to the input unit; and a search result output unit that reads the music-associated information stored in association with the rhythm data found by the search unit from the data storage unit and outputs the read music-associated information as the search result of the search.
  • According to a third aspect of the invention, there is provided a music search method including: inputting a time-series signal represented by on/off signals; storing a plurality of pieces of rhythm data in association with music-associated information associated with music corresponding to the rhythm data; searching the plurality of pieces of rhythm data for rhythm data having the same fluctuation pattern as or a similar fluctuation pattern to the time-series signal; reading the music-associated information found by the search; and outputting the read music-associated information as the search result of the search.
  • According to a fourth aspect of the invention, there is provided a computer-readable program product for causing a computer system to execute procedures for searching for music, including: inputting a time-series signal represented by on/off signals; storing a plurality of pieces of rhythm data in association with music-associated information associated with music corresponding to the rhythm data; searching the plurality of pieces of rhythm data for rhythm data having the same fluctuation pattern as or a similar fluctuation pattern to the time-series signal; reading the music-associated information found by the search; and outputting the read music-associated information as the search result of the search.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above objects and advantages of the present invention will become more apparent by describing in detail exemplary embodiments thereof with reference to the accompanying drawings, wherein:
  • FIG. 1 is a drawing to show the configuration of a music search apparatus based on rhythm input according to an embodiment of the invention;
  • FIG. 2 is a function block diagram to show the configuration of a music search apparatus based on rhythm input according to the embodiment;
  • FIG. 3 is a flowchart to describe the mobile telephone operation and the search apparatus operation;
  • FIG. 4 is a drawing to show a composition example of data stored in a music data storage section 201;
  • FIG. 5 is a flowchart to show a processing flow of rhythm data generation;
  • FIG. 6 is a drawing to show a composition example of data stored in a rhythm data storage section 203;
  • FIG. 7 is a flowchart to show a processing flow of similar rhythm search conducted by a similar rhythm search section 206;
  • FIG. 8 is a drawing to describe a procedure of converting an input time-series signal train into rhythm data;
  • FIG. 9 is a drawing to show an example of the similar rhythm search result;
  • FIG. 10 is a drawing to show an example of information relevant to ringer tone stored in a music-associated information storage section 204;
  • FIG. 11 is a drawing to show an example of the finally output search result;
  • FIG. 12 is a flowchart to describe two-stage search; and
  • FIG. 13 is a block diagram to show the configuration when the music search apparatus according to the embodiment of the invention is implemented in a computer.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Referring now to the accompanying drawings, a description will be given in detail of embodiments of the invention.
  • First Embodiment
  • FIG. 1 is a drawing to show the configuration of a music search terminal, such as a mobile telephone, based on rhythm input according to an embodiment of the invention. In the embodiment, a music search terminal 101 of the mobile telephone type is taken as an example.
  • The embodiment typically is implemented as a computer controlled by software. The software in this case includes a program and data; the functions and effects of the invention are provided by making full use of the computer hardware, and appropriate related arts are applied to portions where they can be applied. Further, the specific types and configurations of the hardware and software for embodying the invention, the software processing range, and the like can be changed as desired. Therefore, in the description that follows, a virtual function block diagram indicating the component functions of the invention as blocks is used. A program for operating a computer to embody the invention is also one form of the invention.
  • FIG. 2 is a function block diagram to show the configuration of a music search apparatus based on rhythm input according to the embodiment of the invention.
  • First, an outline of each function block will be discussed before the detailed description of the embodiment.
  • In FIG. 2, a music data storage section 201 has a function of storing the music data to be searched, in formats such as ringer melody (a kind of ringer music) in the ".mmf" SMAF format, the MIDI format, the "Chaku-uta (registered trademark in Japan)" audio data format, the linear PCM audio format, and the MPEG1 Audio Layer 3 format. The music data storage section 201 is implemented as a record medium such as memory or a hard disk, for example.
  • A rhythm data generation section 202 receives music data input from the music data storage section 201, together with edit data from a rhythm data editor as required, and generates, for each piece of music, rhythm data fitted to that piece to be used as a search key in searching for music, and then registers the rhythm data in a rhythm data storage section 203. The rhythm data fitted to a piece of music is data formed from the rhythm assumed to be entered by the user in searching for that piece, or from a rhythm group containing that rhythm; generally, the rhythm of the beginning portion, a characteristic portion, or the theme of the music becomes the rhythm data for the input music. That is, the rhythm data generation section 202 generally generates, as the registered rhythm data, the rhythm data of the beginning, a characteristic portion, the theme, etc., of the input music data.
  • The theme is not necessarily the rhythm that fits best. Thus, rhythm data may be created for each of a plurality of parts contained in the MIDI data.
  • To generate MIDI data from a WAVE file in the linear PCM audio format, a spectrum analysis of the audio may be conducted and a mechanical conversion executed, for example, using "MuseBook(R) Wav2Midi Version 1.0" or a technique employed in computer software called "Saifunotatsujin." Alternatively, depending on the music data, rhythm data can also be generated directly from the WAVE file or from the MPEG1 Audio Layer 3 format. For example, if the music involves vocals, the vocal signal is extracted through an analog filter that performs signal processing narrowing the signal down to the voice frequency band; further, for stereo music data, using the fact that the vocal part is fixed at the center while the chorus and other instrumental sounds are placed to the left and right, the vocal signal can be extracted more precisely. For the vocal signal thus extracted, a portion where the sound volume or the rate of change of the sound volume is high is assumed to be vocalization and its time is recorded, whereby the song rhythm can be generated.
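  • As a rough illustration only (the specification names no concrete implementation), the vocal-onset idea above might be sketched in Python as follows; the center-channel assumption comes from the text, while the 300-3400 Hz band, the 20 ms frame, and the 0.5 threshold are illustrative assumptions:

    import numpy as np
    from scipy.signal import butter, filtfilt

    def vocal_onset_times(left, right, fs, frame_ms=20, threshold=0.5):
        """Return assumed vocalization onset times (ms) for stereo PCM arrays."""
        center = (left + right) / 2.0            # vocal part assumed fixed at the center
        b, a = butter(4, [300.0, 3400.0], btype="bandpass", fs=fs)
        voice = filtfilt(b, a, center)           # narrow down to the voice frequency band

        frame = int(fs * frame_ms / 1000)        # short-time frames for the volume measure
        n = len(voice) // frame
        energy = np.square(voice[:n * frame]).reshape(n, frame).mean(axis=1)
        energy /= energy.max() or 1.0            # normalize so the threshold is relative

        onsets, above = [], False
        for i, e in enumerate(energy):
            if e >= threshold and not above:     # rising volume assumed to be vocalization
                onsets.append(i * frame_ms)      # record the time, as the text describes
            above = e >= threshold
        return onsets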
  • Further, to automatically generate rhythm data from music data, a known art of detecting the time of a characteristic portion may be used in combination, whereby the rhythm data of the characteristic portion assumed to be entered by the user can be generated with high accuracy. The characteristic portion detection technology is described in the following document: "Masataka Gotou: "Real time musical scene description system: Sabikukan ken'shutsushuhou," Jyouhoushori gakkai On'gakujyouhoukagaku ken'kyuukai ken'kyuu houkoku 2002-MUS-47-6, Vol. 2002, No. 100, pp. 27-34, October 2002."
  • A rhythm data storage section 203 retains the rhythm data generated by the rhythm data generation section 202 and provides the rhythm data for a similar rhythm search section 206. The rhythm data storage section 203 is implemented as memory in a computer, for example.
  • A time-series signal input section 205 receives a music rhythm signal entered by a music searcher as a search key for searching for music, and outputs it to the similar rhythm search section 206 as the input time-series signal. The time-series signal input section 205 may be any device capable of detecting time changes of ON/OFF, and can be implemented as any of various machines: not only keys and buttons of a personal computer (PC) keyboard, a mouse, mobile telephone buttons, and remote control buttons, but also a touch panel or an infrared sensor.
  • The similar rhythm search section 206 inputs the input time-series signal input from the time-series signal input section 205 and the rhythm data retained in the rhythm data storage section 203, searches for rhythm data similar to the signal pattern of the input time-series signal, and outputs the search result to a search result generation section 207.
  • A music-associated information storage section 204 retains titles, composer names, songwriter names, performer names, a part or all of the words, URLs where music data exists, music data, etc., associated with music, in association with the music data, and is implemented as memory in a computer, for example.
  • The music-associated information storage section 204 may further retain syllable names, score data, words, or sound data associated with music.
  • The search result generation section 207 receives the search result of the similar rhythm search section 206, references the music-associated information in the music-associated information storage section 204 for that result as required, and generates output data for presenting the search result to the music searcher. For example, from the search result of the similar rhythm search section 206, the title, composer name, songwriter name, performer name, a part or all of the words, the URL where the music data corresponding to the found rhythm data exists, the music data, and the like are output as a set of output data to a search result output section 208.
  • The search result output section 208 outputs the output data generated by the search result generation section 207 to the searcher. For example, the data is output as screen information or sound information with the mobile telephone used for the search.
  • Next, as the most typical embodiment, the case where the invention is implemented as a ringer tone search apparatus for a mobile telephone user will be discussed, including the specific configurations of the function blocks. Throughout the specification, the mobile telephone may be any type of mobile telephone, including a PHS, a mobile telephone engaged in wireless LAN communications, etc.
  • Throughout the specification, the ringer tone refers to a sound produced to inform the mobile telephone user that a telephone call or mail has arrived at the mobile telephone, and covers not only a musical piece but also a voice alone.
  • Operation of Mobile Telephone
  • The operation of mobile telephone is executed according to a procedure shown in FIG. 3.
  • (Step S301) The mobile telephone 101 transmits an input time-series signal to the search apparatus via a radio communication line. To input the input time-series signal, for example, the rhythm is keyed on one button of the mobile telephone, and the input time-series signal is transmitted to the search apparatus in sequence.
  • (Step S302) The mobile telephone 101 receives the search result from the search apparatus via the radio communication line and outputs the search result to a display 102 or a speaker 111 of the mobile telephone 101.
  • (Step S303) To terminate the ringer tone search processing, the mobile telephone 101 goes to step S304; to continue the ringer tone search processing, the mobile telephone 101 returns to step S301.
  • (Step S304) The mobile telephone 101 sends a search processing termination notification to the search apparatus via the radio communication line and terminates the processing.
  • Operation of Search Apparatus
  • The operation of the search apparatus is executed according to a procedure shown in FIG. 3.
  • (Step S351) The search apparatus receives the input time-series signal input in sequence from the mobile telephone via the radio communication line, passes the input time-series signal to the similar rhythm search section 206 in sequence, and makes a search request for searching for rhythm data similar to the input time-series signal.
  • (Step S352) The search apparatus executes the search in sequence with the passage of time and outputs, to the mobile telephone via the radio communication line, the search result produced using music-associated information related to the found rhythm data. In this example, it is assumed that, of the title, composer name, songwriter name, performer name, part or all of the words, URL where the music data exists, and the music data, the title and the performer name are used as the music-associated information.
  • (Step S353) To continue the search, the search apparatus returns to step S351; to terminate the search, the processing is terminated.
  • Thus, the user of the mobile telephone continues to input rhythm with one button of the mobile telephone, whereby the search apparatus searches the rhythm data storage section 203 in sequence for the rhythm data having a rhythm similar to the input time-series signal and transmits the search result to the mobile telephone.
  • Consequently, with the mobile telephone, the music searcher can obtain search results in sequence, playing back the candidate ringer tones, viewing their titles, and so on.
  • If the music-associated information storage section 204 can also retain the syllable names, the score data, the words, or the sound data associated with the piece of music, the music searcher can also check them as the search result.
  • In the example, the input time-series signal is transmitted in sequence from the mobile telephone to the search apparatus, which then executes search in sequence. However, the invention is not limited to it. Search may be executed with input of a search execution button pressed on the mobile telephone as a trigger. In this case, for example, the music searcher using the mobile telephone 101 inputs the input time-series signal for a given time and then presses the search execution button for transmitting the input time-series signal for the given time to the search apparatus. With pressing the search execution button as a trigger, the search apparatus may search the rhythm data storage section 203 for the rhythm data having rhythm similar to the input time-series signal and may transmit the rhythm data to the mobile telephone. In this case, the mobile telephone needs to have a function for acquiring the input time-series signal.
  • As the search result, ringer tone data or data to acquire ringer tone (link information with ringer tone data, ringer tone, or the like) is transmitted to the mobile telephone, whereby the user of the mobile telephone can download any desired ringer tone from the search result.
  • The search apparatus will be discussed below in detail for each function block:
  • The data stored in the music data storage section 201 is data enabling music to be played back using a computer; for example, it includes data in SMAF (Synthetic music Mobile Application Format), represented by the extension ".mmf," a typical data format of ringer melody, and data in SMF (Standard MIDI File) format, represented by the extension ".mid," a more general MIDI standard format, as shown in FIG. 4. In addition, the data covers general music data, such as the ringer melody data known by the extension ".pmd," Chaku-uta (registered trademark in Japan) having the extension ".amc," data with the extension ".wav" in the WAVE format, and data with the extension ".mp3" in the MPEG1 Audio Layer 3 format. In the music data storage section 201, the music data is stored in association with a music data ID, here called the CID. The music data storage section 201 can be implemented as a record medium such as memory or a hard disk in a server computer.
  • Next, the rhythm data generation section 202 reads music data from the music data storage section 201 and generates, for each piece of music, rhythm data fitted to that piece. As a typical example, a method of inputting music data in the SMF format and generating rhythm data is shown. Briefly, music data in the MIDI format, including SMF, is digital data resembling score information, made up of the tempo of the music and the timbre, duration, pitch, etc., of each separate sound. The rhythm data generation section 202 excludes the timbre information and the pitch information from the data, extracts the duration information, and converts it into data.
  • A processing flow of the rhythm data generation will be discussed with FIG. 5.
  • (Step S501) MIDI data file in the SMF format is read together with the CID from the music data storage section 201.
  • (Step S502) If a plurality of tracks exist in the MIDI data, one track is selected according to assistance input of the operator.
  • (Step S503) The time information (delta time) at which sound rising information (note-on message) occurs in the selected track is cut out, and the sound interval information is added to an array as a numeric string. This step is repeated, and the sound interval information string for the whole music data is retained.
  • (Step S504) For the sound interval information string, one or more portions used as rhythm data are selected according to assistance input of the operator.
  • (Step S505) One or more pieces of sound interval information selected at step S504 are output to the rhythm data storage section 203 as the rhythm data in association with the CID.
  • The operator-assisted selection described above may be accomplished using any of various known GUI functions in the related arts.
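  • Assuming SMF input, steps S501 to S505 might be sketched in Python with the mido library as follows; the operator-assisted selections of steps S502 and S504 are reduced to plain parameters here, and tempo changes within a track are handled only approximately:

    import mido

    def rhythm_data_from_smf(path, track_index, start=0, count=None):
        """Extract a comma-separated note-on interval string (ms) from one SMF track."""
        mid = mido.MidiFile(path)                       # step S501: read the SMF file
        track = mid.tracks[track_index]                 # step S502: select one track

        tempo = 500000                                  # default 120 BPM (microseconds/beat)
        tick, onsets_ms = 0, []
        for msg in track:                               # step S503: cut out note-on delta times
            tick += msg.time
            if msg.type == "set_tempo":
                tempo = msg.tempo                       # simplification: latest tempo only
            elif msg.type == "note_on" and msg.velocity > 0:
                onsets_ms.append(1000 * mido.tick2second(tick, mid.ticks_per_beat, tempo))

        intervals = [round(b - a) for a, b in zip(onsets_ms, onsets_ms[1:])]
        end = start + count if count is not None else None
        portion = intervals[start:end]                  # step S504: select the portion used
        return ",".join(str(i) for i in portion)        # step S505: rhythm data for the CID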
  • In the embodiment, the rhythm data is represented as the string of keying time intervals in millisecond units, separated by commas, and rhythm data r is represented as follows:

  • r = {r1, r2, r3, ..., rn}
  • For example, if the rhythm is made up of four keyings, "ta, _, ta, ta, ta" (where _ denotes a one-beat rest), there are three keying intervals. Thus, assuming that one beat is 250 milliseconds, the rhythm data r is represented as r = {500, 250, 250}. In the embodiment, the rhythm data generation section 202 is implemented as a CPU and a program of a server computer.
  • The rhythm data storage section 203 is a function block for retaining the rhythm data in association with the identifier; typically, it is implemented as storage of a computer. In the embodiment, the rhythm data storage section 203 is constructed as memory or a hard disk of a server computer and memory of a mobile telephone.
  • The rhythm data storage section 203 retains not only the rhythm data generated and output in the rhythm data generation section 202, but also the rhythm data created outside the search apparatus and the rhythm data manually entered by the user.
  • FIG. 6 shows a composition example of the data stored in the rhythm data storage section 203. The rhythm data storage section 203 stores the CID of the music data ID and the rhythm data in association with each other.
  • The time-series signal input section 205 outputs the time change of ON/OFF entered by the enterer to the similar rhythm search section 206 as an input time-series signal train.
  • The time-series signal input section 205 is an input unit that can detect at least one ON/OFF state; typically it is implemented as a mechanical button with electronic output. In the embodiment, it is a part of the mobile telephone of the searcher: the button and the circuit receiving the button signal correspond to the time-series signal input section 205.
  • The time-series signal input section 205 can be implemented not only as the mobile telephone, but also as a keyboard, a mouse, etc., of PC, a touch panel, a remote control button, etc., if it can detect the ON/OFF state, as described above.
  • The similar rhythm search section 206 receives the input time-series signal train, selects rhythm data similar to the fluctuation pattern of the signal, and outputs one or more candidates together with their similarity degrees to the input pattern as a set to the search result generation section 207.
  • In the embodiment, the similar rhythm search section 206 is implemented as a CPU and a program of a server computer and/or a mobile telephone. Typically, it is implemented as a computer program that converts the input signal train into rhythm data and then performs a calculation using an algorithm for computing the similarity between two rhythms.
  • An example of a processing flow of similar rhythm search conducted by the similar rhythm search section 206 will be discussed with FIG. 7.
  • (Step S701) The similar rhythm search section 206 receives the input time-series signal from the time-series signal input section 205 and converts the input time-series signal into input rhythm data.
  • (Step S702) One piece of rhythm data is taken out from the rhythm data storage section 203 and the similarity between the rhythm data and the input rhythm data is calculated based on a similarity determination algorithm.
  • (Step S703) If the similarity has not yet been calculated for all rhythm data, the process returns to step S702 and the processing is repeated; if the similarity has been calculated for all rhythm data, the process goes to step S704.
  • (Step S704) The CIDs and similarity degrees s of the top three pieces of rhythm data, in descending order of the rhythm similarity in the calculation result, are output to the search result generation section 207 and the processing is terminated.
  • In the embodiment, the procedure of converting the input time-series signal train into the rhythm data is a simple one: the time at which each OFF-to-ON state transition is made is detected, each time interval is expressed in millisecond units, and the time interval string is converted into comma-separated data, as shown in FIG. 8.
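  • A minimal sketch of this conversion, assuming the input arrives as sampled (time in ms, ON/OFF state) pairs:

    def to_input_rhythm_data(samples):
        """samples: iterable of (time_ms, is_on) pairs in time order."""
        on_times, prev_on = [], False
        for time_ms, is_on in samples:
            if is_on and not prev_on:        # detect each OFF-to-ON state transition
                on_times.append(time_ms)
            prev_on = is_on
        intervals = [b - a for a, b in zip(on_times, on_times[1:])]
        return ",".join(str(i) for i in intervals)

    # Keyings at 0, 500, 750, and 1000 ms yield "500,250,250",
    # matching the "ta, _, ta, ta, ta" example given earlier.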
  • As the algorithm for determining the similarity between two pieces of rhythm data, a calculation method based on the rhythm vector technique of non-patent documents 1 and 2, a similarity degree calculation method based on the differential square sum, a similarity degree calculation method using dynamic programming, etc., can be used. In the embodiment, the similarity degree calculation method based on the differential square sum is adopted; the calculation is as follows:
  • Let the input rhythm data into which the input time-series signal train is converted be Ri and the rhythm data acquired from the rhythm data storage section 203 be Rj.

  • Ri = {ri1, ri2, ri3, ..., rin}

  • Rj = {rj1, rj2, rj3, ..., rjn, ..., rjm}
  • Distance d between the two rhythms is calculated as follows, where k runs over the n intervals of the input rhythm Ri:

  • d = Σ_k ((r_ik − r_jk)^2)
  • The similarity degree s is calculated as follows:

  • s = 1/(1 + d)
  • The larger the similarity degree s, the higher the similarity. For example, a pair of the CID and the similarity degree s for each of the top three pieces of rhythm data Rj, in descending order of the value of s, is output to the search result generation section 207.
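  • A minimal sketch of this differential square sum calculation and of the top-three selection at step S704; the (CID, rhythm data) store mirrors FIG. 6, and the function and variable names are assumptions:

    def similarity(ri, rj):
        """ri: input rhythm data; rj: stored rhythm data (may be longer than ri)."""
        d = sum((a - b) ** 2 for a, b in zip(ri, rj))  # zip stops after the n intervals of ri
        return 1.0 / (1.0 + d)                         # s = 1/(1 + d)

    def top_candidates(ri, store, limit=3):
        """store: iterable of (cid, rhythm_data) pairs, as in FIG. 6."""
        ranked = sorted(((similarity(ri, rj), cid) for cid, rj in store), reverse=True)
        return [(cid, s) for s, cid in ranked[:limit]]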
  • In the embodiment, the similar rhythm search result as shown in FIG. 9 is output as the result of the calculation. FIG. 9 shows that the CID of the music data with the highest similarity degree is “MUS0003” and the similarity degree is “0.8.”
  • The music-associated information storage section 204 is a function block for retaining information relevant to ringer tone and retaining associated information of the title, composer name, etc., for example, with the identifier for identifying the ringer tone as a key. The music-associated information storage section 204 is implemented as memory of a computer, etc.
  • In the embodiment, the music-associated information storage section 204 retains data of “title,” “music genre,” “composer name,” “songwriter name,” and “URL where music data exists” with the CID of the music identifier as a key, as shown in FIG. 10. The invention is not limited to the format; for example, all information relevant to music and its data, such as the performer name, the album name, the music play time, the data size of ringer tone data, a part or all of words, and the ringer tone data, can be stored.
  • The search result generation section 207 receives the pairs of CIDs and similarity degrees s from the similar rhythm search section 206, organizes them into output data for the search result output section 208, and outputs the organized data to the search result output section 208. At organizing time, any other information that needs to be output as the search result is acquired from the music-associated information storage section 204 based on the CID. The output data is made up of text information, image information, sound information, the score based on the similarity degree, etc.
  • To output the similarity degree, it may be normalized into a score based on the similarity degree s, such that 0 points mean complete non-similarity and 100 points mean a complete match. This makes it possible for the searcher to see at a glance how well each result matches the input.
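  • The specification does not fix the mapping, but one simple normalization consistent with this description is:

    def score(s):
        # s = 1 (complete match) -> 100 points; s near 0 (no similarity) -> 0 points
        return round(100 * s)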
  • The search result output section 208 receives the search result from the search result generation section 207 and outputs the search result by displaying a character string or an image or playing back music, etc.
  • In the embodiment, for example, to output an image as shown in FIG. 11 as the search result, the "title," "music genre," "composer name," "songwriter name," and "URL where music data exists" are obtained using the CID from the data stored in the music-associated information storage section 204 shown in FIG. 10, image data is generated in the format shown in FIG. 11, and the image data is output to the display 102 of the mobile telephone 101.
  • The ringer tone data is downloaded from URL1, where the data exists according to the music-associated information shown in FIG. 10, and is stored in the memory of the mobile telephone, and the sound is output from the speaker 111 of the mobile telephone 101.
  • In the description of the embodiment, the search for a ringer tone on a mobile telephone is taken as an example, but the invention is not limited to it. The invention can also be embodied as a music search apparatus in an apparatus such as a mobile terminal or a PC that can play back or download music.
  • In that case, if a search is made to play back music data in the mobile terminal, the rhythm data storage section 203 and the music data storage section 201 are storage media such as memory in the mobile terminal, and the similar rhythm search section 206 is implemented as a CPU and a program of the mobile terminal, which plays back the music data as output.
  • In the embodiment, the data of “title,” “music genre,” “composer name,” “songwriter name,” “URL where music data exists,” etc., is obtained with the CID of the music identifier as a key. However, instead, the data of “title,” “music genre,” “composer name,” “songwriter name,” “URL where music data exists,” etc., may be obtained with the ID (identifier) of rhythm data as a key.
  • The search of the invention can also be used for the purpose of calling to play back a musical piece retained in the mobile telephone as the search result.
  • At step S503, only the time information (delta time) at which sound rising information (note-on message) occurs is cut out. However, sound falling information (note-off message) may also be cut out, and the sound rising information and the sound falling information may be added together to the array as a numeric string. In this case, the rising times and falling times are recorded alternately in the numeric array, and the data becomes time interval information with the sound switched between ON and OFF. To use rhythm data created this way, not only the time at which the OFF-to-ON state transition is made but also the time at which the ON-to-OFF state transition is made is likewise detected at the step where the input rhythm data is generated from the signal of the time-series signal input section 205 (step S701). With such data, search can be conducted as in the embodiment described above. Accordingly, a search making the most of not only the sound rising timing but also the duration information can be conducted.
  • As described above, according to the invention, the music search apparatus based on rhythm input can be made easier to use and easier to implement.
  • Second Embodiment
  • Next, a second embodiment of the invention wherein search is made using both a terminal and a music search apparatus of a server will be discussed.
  • The difference from the first embodiment will be discussed below:
  • In the second embodiment, the functions of the function blocks shown in FIG. 2 are installed both in a mobile telephone serving as a music search terminal and in a music search apparatus serving as a server. If a search result of a predetermined similarity degree is not obtained in the terminal (the mobile telephone, etc.), a search is made using the music search apparatus of the server.
  • Generally, the rhythm data storage section 203 in the mobile telephone 101 stores a subset of the data retained in the rhythm data storage section of the music search apparatus of the server, and it is effective to store the rhythm data of musical pieces that are likely to become search results. However, the rhythm data storage section 203 is not limited to a subset; by storing the rhythm data of the musical pieces retained by the user and the rhythm data keyed by the user, it is also possible to provide search results unique to the user.
  • FIG. 12 is a flowchart to describe the operation of the music search apparatus. In the description that follows, for the functions in the mobile telephone (music search apparatus), a suffix of T (Terminal) is added to each reference numeral and for the functions in the music search apparatus of the server, a suffix of S (Server) is added to each reference numeral.
  • In FIG. 12, in the mobile telephone of the terminal (music search apparatus), a time-series signal input section 205T receives a time-series signal with the passage of time and outputs the time-series signal to a similar rhythm search section 206T in the mobile telephone (step S1201). This can be executed in the mobile telephone as an application, such as i-appli (registered trademark in Japan), as follows: for example, a program detecting the event of the center button being pressed and receiving an input time-series signal forms the time-series signal input section 205T, and rhythm data previously defined as constant data is stored in the scratch pad of the i-appli data storage mechanism, thereby forming a rhythm data storage section 203T.
  • In the terminal, the similar rhythm search section 206T formed as an i-appli program, for example, in the mobile telephone references the rhythm data storage section 203T in the mobile telephone and performs search processing similar to that in the first embodiment (step S1202).
  • In the terminal, if a result of a predetermined similarity degree or more is obtained at step S1202 (Yes at step S1203), a search result generation section 207T uses music-associated information obtained from a music-associated information storage section 204T to generate the search result to be output as the final search result based on the obtained result (step S1204). The predetermined similarity degree mentioned above can be set to, for example, 0.9, with 1 as the maximum similarity degree (a code sketch of this two-stage decision follows the step descriptions below).
  • In the terminal, if the result of a predetermined similarity degree or more is not obtained at step S1202 (No at step S1203), the time-series signal as a search request is transmitted to the music search apparatus of the server (step S1205).
  • In the music search apparatus of the server, the rhythm data storage section 203S is referenced and search processing is performed (step S1251); then, as in the first embodiment, the search result to be output as the final search result is generated using music-associated information obtained from a music-associated information storage section 204S and is transmitted to the mobile telephone terminal (step S1252). In the music search apparatus of the server, if the search processing in the terminal is not terminated (No at step S1253), the process returns to step S1251.
  • In the terminal, the search result is received from the music search apparatus of the server (step S1206).
  • In the terminal, the obtained search result is output to a display, etc., (step S1207).
  • In the terminal, to terminate the search processing, a search processing termination notification is sent to the search apparatus via a radio communication line and the processing is terminated (step S1208).
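  • A minimal sketch of the two-stage decision in steps S1201 to S1206; local_search and server_search are hypothetical stand-ins for the similar rhythm search sections 206T and 206S, each assumed to return (CID, similarity degree) pairs sorted best first:

    THRESHOLD = 0.9  # the "predetermined similarity degree"; 1 is the maximum

    def two_stage_search(input_rhythm_data, local_search, server_search):
        candidates = local_search(input_rhythm_data)  # steps S1201-S1202: search in the terminal
        if candidates and candidates[0][1] >= THRESHOLD:
            return candidates                         # steps S1203-S1204: local result suffices
        return server_search(input_rhythm_data)       # steps S1205-S1206: fall back to the server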
  • As described above, according to the invention, the music search apparatus based on rhythm input can be made easier to use and easier to implement.
  • Further, in the second embodiment, when a search result of the predetermined similarity degree can be obtained in the mobile telephone terminal, the result is obtained without the delay caused by communications, etc., so the search result is obtained at high speed. When a search result of the predetermined similarity degree cannot be obtained, the search result is provided by the music search apparatus of the server, so a result considered more correct can be obtained.
  • Generally, the server has more storage capacity than the terminal and thus stores more rhythm data. Although it is difficult to perform search calculations over large-scale data in the mobile telephone, by executing the search in the music search apparatus of the server, the user can obtain a search result even for a musical piece whose rhythm data is not retained in the terminal, although a delay is caused by communications, etc.
  • Even if a result of the predetermined similarity degree or more can be found in the mobile telephone, the user may be enabled to command the music search apparatus of the server to conduct a search.
  • Third Embodiment
  • Next, a third embodiment of the invention will be discussed. In the embodiment, the search result is output together with the words, score, syllable names, and sound of the keyed portion corresponding to the input time-series signal.
  • The difference from the first embodiment will be discussed below:
  • In the third embodiment, in the process of generating and outputting the search result, a part or all of the words, a part or all of the syllable names of the notes of the musical piece, a part or all of the score, or voice audition data is presented so that the user can keep track of the portion corresponding to the input time-series signal. Accordingly, if the user enters the rhythm pattern of an impressive phrase without knowing the name of a musical piece and several candidates are displayed, the user can easily determine which candidate is the desired one.
  • The embodiment is implemented as follows:
  • For each piece of rhythm data stored in a rhythm data storage section 203, a time stamp indicating at what time position, in seconds from the start of the musical piece, the rhythm appears is retained. Usually one time stamp is registered, but two or more time stamps can be registered. In a music-associated information storage section 204, the corresponding portions of the words, syllable names, score, and voice data are registered by time stamp, whereby each piece of music-associated information can be related to each piece of rhythm data for output. The output is generated as a part of the search result in a search result generation section 207 and is output by a search result output section 208, so that the user can rapidly determine whether or not the search result is as intended from the words and the musical scale of the portion corresponding to the input time-series signal.
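  • A minimal sketch of this time-stamp linkage; storing a start time on each rhythm entry and (time, text) pairs for the words is an assumed layout, since the specification only requires that the corresponding portions be registered by time stamp:

    def words_at(rhythm_time_stamps_ms, tagged_words):
        """tagged_words: (start_ms, line) pairs sorted by start time."""
        lines = []
        for stamp in rhythm_time_stamps_ms:
            line = None
            for start_ms, text in tagged_words:
                if start_ms <= stamp:
                    line = text              # the last portion starting at or before the stamp
                else:
                    break
            lines.append(line)
        return lines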
  • In addition to the method of registering the data by time stamp, if the data is voice data, for example, it is easy according to known art to cut out a several-second portion starting at the specified position and output it, so that voice audition data corresponding to the portion of the time-series signal input by the user can be provided.
  • Alternatively, the start time of each meaningful portion of the words, syllable names, score, etc., may be tagged in advance as associated information, whereby the words, syllable names, score, etc., containing the input range of the user's time-series signal can be output.
  • Alternatively, rather than outputting only some of the words, syllable names, score, etc., an output mode in which all of them are output and the input range of the user's time-series signal is then displayed in a different color, etc., is also effective.
  • As described above, according to the invention, the music search apparatus based on rhythm input can be made easier to use and easier to implement.
  • Fourth Embodiment
  • Next, a fourth embodiment of the invention will be discussed. In the embodiment, an input time-series signal at search time is converted into rhythm data, which is then stored in a rhythm data storage section for reuse in later searches.
  • The difference from the first embodiment will be discussed below:
  • In the fourth embodiment, the following mechanism is introduced: an input time-series signal at search time is converted into rhythm data, which is then stored in the rhythm data storage section 203 for reuse in later searches, thereby enhancing the search accuracy.
  • The flow in which the search result of the similar rhythm search section, based on a signal received at the time-series signal input section, is output by the search result output section is the same as in the first embodiment.
  • In the fourth embodiment, the input time-series signal is additionally converted into the data format of the rhythm data for retention in the rhythm data storage section 203.
  • If the user selects one musical piece from the search result, the retained input time-series signal is converted into the data format of the rhythm data and is additionally registered in the rhythm data storage section 203 as one piece of the rhythm data of the selected musical piece. The data format conversion may be executed by a method similar to that described with reference to FIG. 8 in the first embodiment.
  • Accordingly, the possibility becomes high that the candidate musical piece selected this time will be displayed as a higher-order candidate in response to a later search request with a similar input time-series signal. This takes advantage of the fact that, although a user's rhythm recognition and time-series signal input do not necessarily follow the musical score, each user's input tends to show a recognizable individual pattern. If the same user executes two or more searches for one musical piece, for example, to call up a musical piece retained in the terminal, the accuracy is enhanced compared with the usual case.
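  • A minimal sketch of this feedback registration; rhythm_store is a hypothetical CID-to-rhythm-data-list mapping in the layout of FIG. 6:

    def register_selection(rhythm_store, cid, input_rhythm_data):
        # the user's own keying becomes one more rhythm entry for the selected piece,
        # so later searches with a similar input rank that piece higher
        rhythm_store.setdefault(cid, []).append(input_rhythm_data)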
  • As described above, according to the invention, the music search apparatus based on rhythm input can be made easier to use and easier to implement.
  • Further, in the fourth embodiment, search with higher accuracy can be conducted.
  • The invention is not limited to the specific embodiment described above and various changes and modifications can be made without departing from the spirit and the scope of the invention.
  • An information edit apparatus of the invention can be implemented as a program operated in a computer such as a workstation (WS) or a personal computer (PC).
  • FIG. 13 is a block diagram to show a configuration example when an information edit apparatus (music search apparatus) according to the invention is implemented in a computer. This computer includes a central processing unit 1301 for executing a program, memory 1302 for storing the program and data being processed by the program, a magnetic disk drive 1303 for storing the program, the data to search for, and OS (Operating System), and an optical disk drive 1304 for reading/writing the program and data from/to an optical disk.
  • The computer further includes an image output section 1305 serving as an interface for displaying a screen on a display, etc.; an input acceptance section 1306 for accepting input from a keyboard, a mouse, a touch panel, etc.; and an output/input section 1307 serving as an input/output interface with external machines (for example, USB (Universal Serial Bus), a voice output terminal, etc.). The computer also includes a display 1308 such as an LCD, a CRT, or a projector; an input unit 1309 such as a keyboard or a mouse; and an external machine 1310 such as a memory card reader or a speaker.
  • The central processing unit 1301 reads the program from the magnetic disk drive 1303 and stores the program in the memory 1302 and then executes the program, thereby implementing the function blocks shown in FIG. 2. During the program execution, a part or all of the data to search for may be read from the magnetic disk drive 1303 and may be stored in the memory 1302.
  • As the basic operation, a search request made by the user is received through the input unit 1309, and the data to be searched, stored in the magnetic disk drive 1303 or the memory 1302, is searched in response to the request. The search result is displayed on the display 1308.
  • The search result not only is displayed on the display 1308 but may also be presented to the user by voice through a speaker connected to the computer as the external machine 1310, for example. Alternatively, the search result may be presented to the user as printed matter by a printer connected to the computer as the external machine 1310.
  • It is to be understood that the invention is not limited to the specific embodiments described above and that the invention can be embodied with the components modified without departing from the spirit and scope of the invention. The invention can be embodied in various forms according to appropriate combinations of the components disclosed in the embodiments described above. For example, some components may be deleted from all components shown in the embodiment. Further, the components in different embodiments may be used appropriately in combination.
  • As described with reference to the embodiments, there is provided a music search apparatus including input means that inputs a time-series signal whose on state and off state are repeated alternately; data storage means for storing a plurality of pieces of rhythm data in association with music-associated information associated with music corresponding to the rhythm data; search means for searching the plurality of pieces of rhythm data stored in the data storage means for rhythm data having the same fluctuation pattern as or a similar fluctuation pattern to the time-series signal input to the input means; and search result output means for reading the music-associated information stored in association with the rhythm data found by the search means from the data storage means and outputting the read music-associated information as the search result of the search.
  • The invention relating to the apparatus also holds as the invention relating to a method and the invention relating to the method also holds as the invention relating to the apparatus.
  • The invention relating to the apparatus or the method also holds as a program for causing a computer to execute a procedure corresponding to the invention (or causing a computer to function as means corresponding to the invention or causing a computer to provide functions corresponding to the invention) and also holds as a computer-readable record medium recording the program.
  • According to the invention, a music search apparatus can be realized that, in response to rhythmical ON/OFF change input of a time-series signal, searches for music containing a rhythm similar to the input rhythm and allows the music information to be inspected or the music to be played back.

Claims (5)

1.-19. (canceled)
20. A music search method comprising:
inputting a time-series signal represented by on/off signals;
storing a plurality of pieces of rhythm data in association with music-associated information associated with music corresponding to the rhythm data;
searching the plurality of pieces of rhythm data for rhythm data having the same fluctuation pattern as or a similar fluctuation pattern to the time-series signal;
reading the music-associated information stored in association with the rhythm data found by the searching;
outputting the read music-associated information as the search result of the search;
determining whether or not rhythm data of a predetermined similarity degree or more has been found as the result of the searching; and
displaying the search result when rhythm data of the predetermined similarity degree or more has been found as the result of the determining, and displaying the search result when rhythm data of the predetermined similarity degree or more has not been found as the result of the determining.
21. A music search method comprising:
inputting a time-series signal represented by on/off signals;
storing a plurality of pieces of rhythm data in association with music-associated information associated with music corresponding to the rhythm data;
searching the plurality of pieces of rhythm data for rhythm data having the same fluctuation pattern as or a similar fluctuation pattern to the time-series signal;
reading the music-associated information found by the search;
outputting the read music-associated information as the search result of the search;
calculating excess or deficiency of input time of the time-series signal; and
outputting the calculated excess or deficiency.
22. A music search method as claimed in claim 20, wherein the outputting includes outputting the rhythm data found by the searching in addition to the music-associated information as a search result of the searching, and
wherein the music search method further includes storing the rhythm data found by the searching.
23. A music search method as claimed in claim 20, wherein the storing includes storing at least one item of music data of syllable names, score, words, and music data as the information associated with music, and
wherein the outputting includes outputting the information associated with music as a part of a search result of the searching.
US12/073,381 2004-09-30 2008-03-05 Music search system and music search apparatus Abandoned US20080156177A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/073,381 US20080156177A1 (en) 2004-09-30 2008-03-05 Music search system and music search apparatus

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2004288433A JP2006106818A (en) 2004-09-30 2004-09-30 Music retrieval device, music retrieval method and music retrieval program
JPP2004-288433 2004-09-30
US11/239,119 US7368652B2 (en) 2004-09-30 2005-09-30 Music search system and music search apparatus
US12/073,381 US20080156177A1 (en) 2004-09-30 2008-03-05 Music search system and music search apparatus

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/239,119 Continuation US7368652B2 (en) 2004-09-30 2005-09-30 Music search system and music search apparatus

Publications (1)

Publication Number Publication Date
US20080156177A1 true US20080156177A1 (en) 2008-07-03

Family

ID=36097535

Family Applications (3)

Application Number Title Priority Date Filing Date
US11/239,119 Expired - Fee Related US7368652B2 (en) 2004-09-30 2005-09-30 Music search system and music search apparatus
US12/073,381 Abandoned US20080156177A1 (en) 2004-09-30 2008-03-05 Music search system and music search apparatus
US12/073,380 Abandoned US20080183320A1 (en) 2004-09-30 2008-03-05 Music search system and music search apparatus

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/239,119 Expired - Fee Related US7368652B2 (en) 2004-09-30 2005-09-30 Music search system and music search apparatus

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/073,380 Abandoned US20080183320A1 (en) 2004-09-30 2008-03-05 Music search system and music search apparatus

Country Status (4)

Country Link
US (3) US7368652B2 (en)
EP (1) EP1703488A3 (en)
JP (1) JP2006106818A (en)
CN (1) CN100429656C (en)

Families Citing this family (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005173938A (en) * 2003-12-10 2005-06-30 Pioneer Electronic Corp Musical piece search device, method and program and information recording media
SE527425C2 (en) * 2004-07-08 2006-02-28 Jonas Edlund Procedure and apparatus for musical depiction of an external process
KR20060073100A (en) * 2004-12-24 2006-06-28 삼성전자주식회사 Sound searching terminal of searching sound media's pattern type and the method
US7507898B2 (en) * 2005-01-17 2009-03-24 Panasonic Corporation Music reproduction device, method, storage medium, and integrated circuit
KR100735444B1 (en) * 2005-07-18 2007-07-04 삼성전자주식회사 Method for outputting audio data and music image
JP2007219178A (en) * 2006-02-16 2007-08-30 Sony Corp Musical piece extraction program, musical piece extraction device, and musical piece extraction method
US7772478B2 (en) * 2006-04-12 2010-08-10 Massachusetts Institute Of Technology Understanding music
KR20080043129A (en) * 2006-11-13 2008-05-16 삼성전자주식회사 Method for recommending photo using music of mood and system thereof
KR100775585B1 (en) * 2006-12-13 2007-11-15 삼성전자주식회사 Method for recommending music about character message and system thereof
CN101226526A (en) * 2007-01-17 2008-07-23 上海怡得网络有限公司 Method for searching music based on musical segment information inquest
CN101652807B (en) * 2007-02-01 2012-09-26 缪斯亚米有限公司 Music transcription method, system and device
EP2122510A2 (en) 2007-02-14 2009-11-25 Museami, Inc. Music-based search engine
US8283546B2 (en) * 2007-03-28 2012-10-09 Van Os Jan L Melody encoding and searching system
JP5076597B2 (en) * 2007-03-30 2012-11-21 ヤマハ株式会社 Musical sound generator and program
US8280539B2 (en) * 2007-04-06 2012-10-02 The Echo Nest Corporation Method and apparatus for automatically segueing between audio tracks
JP5088616B2 (en) * 2007-11-28 2012-12-05 ヤマハ株式会社 Electronic music system and program
US8494257B2 (en) 2008-02-13 2013-07-23 Museami, Inc. Music score deconstruction
JP4670885B2 (en) * 2008-03-28 2011-04-13 ブラザー工業株式会社 Time-series data management device and program
CN101567203B (en) * 2008-04-24 2013-06-05 深圳富泰宏精密工业有限公司 System and method for automatically searching and playing music
TWI426501B (en) * 2010-11-29 2014-02-11 Inst Information Industry A method and apparatus for melody recognition
US9053696B2 (en) 2010-12-01 2015-06-09 Yamaha Corporation Searching for a tone data set based on a degree of similarity to a rhythm pattern
JP5417365B2 (en) 2011-03-15 2014-02-12 株式会社東芝 Information distribution system, information distribution apparatus, information communication terminal, and information distribution method
JP5982980B2 (en) * 2011-04-21 2016-08-31 ヤマハ株式会社 Apparatus, method, and storage medium for searching performance data using query indicating musical tone generation pattern
JP5970934B2 (en) 2011-04-21 2016-08-17 ヤマハ株式会社 Apparatus, method, and recording medium for searching performance data using query indicating musical tone generation pattern
CN102387408A (en) * 2011-10-25 2012-03-21 深圳市同洲电子股份有限公司 Method for obtaining music information, set top box and related systems
US8586847B2 (en) 2011-12-02 2013-11-19 The Echo Nest Corporation Musical fingerprinting based on onset intervals
US20130324274A1 (en) * 2012-05-31 2013-12-05 Nike, Inc. Method and apparatus for indicating swing tempo
CN103514158B (en) * 2012-06-15 2016-10-12 国基电子(上海)有限公司 Musicfile search method and multimedia playing apparatus
JP6047985B2 (en) * 2012-07-31 2016-12-21 ヤマハ株式会社 Accompaniment progression generator and program
CN104102659A (en) * 2013-04-09 2014-10-15 华为技术有限公司 Music searching method and terminal
US10534806B2 (en) 2014-05-23 2020-01-14 Life Music Integration, LLC System and method for organizing artistic media based on cognitive associations with personal memories
KR20160020158A (en) * 2014-08-13 2016-02-23 삼성전자주식회사 Method and electronic device for providing media contents-related information
US10133537B2 (en) * 2014-09-25 2018-11-20 Honeywell International Inc. Method of integrating a home entertainment system with life style systems which include searching and playing music using voice commands based upon humming or singing
CN105513583B (en) * 2015-11-25 2019-12-17 福建星网视易信息系统有限公司 song rhythm display method and system
CN106292424A (en) * 2016-08-09 2017-01-04 北京光年无限科技有限公司 Music data processing method and device for anthropomorphic robot
US9934785B1 (en) 2016-11-30 2018-04-03 Spotify Ab Identification of taste attributes from an audio signal
CN107220330A (en) * 2017-05-24 2017-09-29 万业(天津)科技有限公司 Song intelligent search method and system
CN107016134A (en) * 2017-05-24 2017-08-04 万业(天津)科技有限公司 Can Auto-matching song intelligent search method and system
US11670322B2 (en) 2020-07-29 2023-06-06 Distributed Creation Inc. Method and system for learning and using latent-space representations of audio signals for audio content-based retrieval
CN112634843A (en) * 2020-12-28 2021-04-09 四川新网银行股份有限公司 Information generation method and device for expressing data in voice mode

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5963957A (en) * 1997-04-28 1999-10-05 Philips Electronics North America Corporation Bibliographic music data base with normalized musical themes
US20020073098A1 (en) * 2000-02-28 2002-06-13 Lei Zhang Methodology and system for searching music over computer network and the internet based on melody and rhythm input
JP2002014974A (en) * 2000-06-30 2002-01-18 Fuji Photo Film Co Ltd Retrieving device and system
JP2003142511A (en) 2001-11-06 2003-05-16 Hitachi Ltd Method of manufacturing semiconductor device
CN1623151A (en) * 2002-01-24 2005-06-01 皇家飞利浦电子股份有限公司 Music retrieval system for joining in with the retrieved piece of music
JP3807333B2 (en) * 2002-03-22 2006-08-09 ヤマハ株式会社 Melody search device and melody search program
JP2004033492A (en) 2002-07-03 2004-02-05 Commons Co Ltd Displaying audio visual booth
JP2004348275A (en) 2003-05-20 2004-12-09 Toshiba Corp Processing contents designating means, input method and data processor
US7193148B2 (en) * 2004-10-08 2007-03-20 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Apparatus and method for generating an encoded rhythmic pattern

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6678680B1 (en) * 2000-01-06 2004-01-13 Mark Woo Music search engine
US20030100967A1 (en) * 2000-12-07 2003-05-29 Tsutomu Ogasawara Contrent searching device and method and communication system and method
US6993532B1 (en) * 2001-05-30 2006-01-31 Microsoft Corporation Auto playlist generator
US7193149B2 (en) * 2002-05-17 2007-03-20 Northern Information Technology, Inc. System handling video, control signals and power
US6987221B2 (en) * 2002-05-30 2006-01-17 Microsoft Corporation Auto playlist generation with multiple seed songs

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090044688A1 (en) * 2007-08-13 2009-02-19 Sanyo Electric Co., Ltd. Musical piece matching judging device, musical piece recording device, musical piece matching judging method, musical piece recording method, musical piece matching judging program, and musical piece recording program
US7985915B2 (en) * 2007-08-13 2011-07-26 Sanyo Electric Co., Ltd. Musical piece matching judging device, musical piece recording device, musical piece matching judging method, musical piece recording method, musical piece matching judging program, and musical piece recording program
US20100313739A1 (en) * 2009-06-11 2010-12-16 Lupini Peter R Rhythm recognition from an audio signal
US8507781B2 (en) * 2009-06-11 2013-08-13 Harman International Industries Canada Limited Rhythm recognition from an audio signal
US20110192272A1 (en) * 2010-02-05 2011-08-11 Yamaha Corporation Tone data search apparatus and method
US8431812B2 (en) * 2010-02-05 2013-04-30 Yamaha Corporation Tone data search apparatus and method
US8492637B2 (en) * 2010-11-12 2013-07-23 Sony Corporation Information processing apparatus, musical composition section extracting method, and program
US9563701B2 (en) 2011-12-09 2017-02-07 Yamaha Corporation Sound data processing device and method
US20140000442A1 (en) * 2012-06-29 2014-01-02 Sony Corporation Information processing apparatus, information processing method, and program

Also Published As

Publication number Publication date
CN1755686A (en) 2006-04-05
CN100429656C (en) 2008-10-29
JP2006106818A (en) 2006-04-20
US7368652B2 (en) 2008-05-06
US20080183320A1 (en) 2008-07-31
EP1703488A3 (en) 2007-02-07
US20060065105A1 (en) 2006-03-30
EP1703488A2 (en) 2006-09-20

Similar Documents

Publication Publication Date Title
US7368652B2 (en) Music search system and music search apparatus
KR100267663B1 (en) Karaoke apparatus responsive to oral request of entry songs
JP2003519845A (en) Music search engine
JP2007256617A (en) Musical piece practice device and musical piece practice system
CN105740394A (en) Music generation method, terminal, and server
JP2007256618A (en) Search device
JP2002055695A (en) Music search system
JPH08160975A (en) Karaoke music selecting device
JP6835247B2 (en) Data generator and program
JP2003131674A (en) Music search system
JP4171680B2 (en) Information setting device, information setting method, and information setting program for music playback device
JP2007322933A (en) Guidance device, production device for data for guidance, and program
JP5847048B2 (en) Piano roll type score display apparatus, piano roll type score display program, and piano roll type score display method
JP6954780B2 (en) Karaoke equipment
JP6073618B2 (en) Karaoke equipment
JPH11249674A (en) Singing marking system for karaoke device
JP2007225916A (en) Authoring apparatus, authoring method and program
JP3637196B2 (en) Music player
JP3225817B2 (en) Image selection method for communication karaoke device
JP6439239B2 (en) Performance data file search method, system, program, terminal device, and server device
JP7425558B2 (en) Code detection device and code detection program
JP2003167576A (en) System for providing musical composition
JP3317127B2 (en) Karaoke equipment
JP2007233078A (en) Evaluation device, control method, and program
JP2001056817A (en) Music retrieval system

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION