IPC Classification Information
Country/Type | United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.) |
Application No. | US-0197184 (1998-11-20)
Priority | JP-0347016 (1997-12-02); JP-0018258 (1998-01-13)
Inventors / Address |
- Terada, Kosei
- Nakamura, Akitoshi
- Takahashi, Hiroaki
Applicant / Address |
Agent / Address |
Citation Info | Cited by: 54 / Citing patents: 31
Abstract
In a system for animating an object along with music, a sequencer module sequentially provides music control information and a synchronization signal in correspondence with the music to be played. A parameter setting module is operable to set motion parameters effective to determine movements of movable parts of the object. An audio module is responsive to the synchronization signal for generating a sound in accordance with the music control information to thereby play the music. A video module is responsive to the synchronization signal for generating a motion image of the object in matching with progression of the music. The video module utilizes the motion parameters to basically control the motion image, and utilizes the music control information to further control the motion image in association with the played music.
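The architecture described in the abstract can be sketched as follows. This is a minimal illustration, not the patent's implementation: all class and field names (Sequencer, AudioModule, VideoModule, the event dictionaries) are assumptions. The key point it shows is that one synchronization stream from the sequencer drives both the audio and video modules, while motion parameters set the baseline movement that the music control information then modulates.

```python
# Hypothetical sketch of the abstract's architecture. One sync tick stream
# from the sequencer drives both the audio and the video module, so sound
# and motion stay aligned with the music's progression.

class Sequencer:
    """Yields (tick, events) pairs in playback order."""
    def __init__(self, score):
        self.score = score  # list of (tick, [event_dict, ...])

    def run(self):
        for tick, events in self.score:
            yield tick, events

class AudioModule:
    def __init__(self):
        self.played = []

    def on_tick(self, tick, events):
        # Render sound for each music control event (stubbed as a log).
        for ev in events:
            self.played.append((tick, ev["type"]))

class VideoModule:
    def __init__(self, motion_params):
        self.motion_params = motion_params  # baseline movement per part
        self.frames = []

    def on_tick(self, tick, events):
        # Baseline pose comes from the motion parameters; the music control
        # information further modulates the affected parts.
        pose = dict(self.motion_params)
        for ev in events:
            part = ev.get("part")
            if part in pose:
                pose[part] *= ev.get("amplitude", 1.0)
        self.frames.append((tick, pose))

score = [(0, [{"type": "note_on", "part": "arm", "amplitude": 2.0}]),
         (480, [{"type": "note_off", "part": "arm", "amplitude": 1.0}])]
seq = Sequencer(score)
audio, video = AudioModule(), VideoModule({"arm": 0.5, "head": 1.0})
for tick, events in seq.run():   # one synchronization signal, two consumers
    audio.on_tick(tick, events)
    video.on_tick(tick, events)
```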
Representative Claims
1. A system for animating movable parts of an object along with music, said system comprising: a sequencer module that sequentially provides music control information in correspondence with the music to be played, the music control information including a plurality of types of music control event data for controlling a sound of the music to be played; a parameter setting module for generating a graphical user interface that is operable to select a type of music control event data from among the plurality of types of the music control event data that are graphically displayed for selection, said graphical user interface operable to assign a type of music control event data to each of the movable parts of the object such that each of the movable parts corresponds to an assigned type of music control event data, wherein the correspondence between each of the movable parts and the corresponding assigned type of music control event data is displayed; an audio module for generating the sound in accordance with each music control event data included in the music control information to thereby play the music; and a video module responsive to the music control information for controlling movements of the respective movable parts in correspondence to the types of music control event data included in the music control information sequentially provided from the sequencer module, thereby generating a motion image of the object in matching with progression of the music.

2. The system as claimed in claim 1, wherein the video module analyzes a data block of the music control information for preparing a frame of the motion image in advance of generation of the sound corresponding to the same data block by the audio module, so that the video module can generate the prepared frame timely when the audio module generates the sound according to the same data block used for preparation of the frame.

3. The system as claimed in claim 1, wherein the video module successively generates key frames of the motion image in response to the music control information, the video module further generating a number of sub frames inserted between the successive key frames by interpolation to smoothen the motion image while varying the number of the sub frames dependently on a resource of the system affordable to the interpolation.

4. The system as claimed in claim 1, wherein the video module generates the motion image of an object representing an instrument player, the video module sequentially analyzing the music control information to determine a rendition movement of the instrument player for controlling the motion image as if the instrument player plays the music.

5. The system as claimed in claim 1, wherein the parameter setting module sets motion parameters effective to determine the movements of the movable parts of the object, and the video module generates the motion image according to the motion parameters, the video module periodically resetting the motion image to revert the movable parts to the default positions in matching with the progression of the music.

6. The system as claimed in claim 1, wherein the video module is responsive to the synchronization signal, which is provided from the sequencer module and which is utilized to regulate a beat of the music so that the motion image of the object is controlled in synchronization with the beat of the music.

7. The system as claimed in claim 1, wherein the sequencer module provides the music control information containing the musical control event data specifying an instrument used to play the music, and wherein the video module generates the motion image of an object representing a player with the specified instrument to play the music.

8. The system as claimed in claim 1, wherein the parameter setting module sets motion parameters effective to determine the movements of the movable parts of the object, and the video module utilizes the motion parameters to control the motion image of the object such that the movement of each part of the object is determined by the motion parameter, and utilizes the music control information controlling an amplitude of the sound to further control the motion image such that the movement of each movable part determined by the motion parameter is scaled in association with the amplitude of the sound.

9. The system as claimed in claim 1, wherein the parameter setting module sets motion parameters effective to determine a posture of a dancer object, and wherein the video module is responsive to the synchronization signal provided from the sequencer module for generating the motion image of the dancer object according to the motion parameters such that the dancer object is controlled as if dancing in matching with progression of the music.

10. An apparatus for animating movable parts of an object along with music, said apparatus comprising: sequencer means for sequentially providing performance data of the music, the performance data including a plurality of types of music control event data for controlling a sound of the music to be played; setting means for generating a graphical user interface that is operable for selecting a type of music control event data from among the plurality of types of the music control event data that are graphically displayed for selection, said graphical user interface operable for assigning a type of music control event data to each of the movable parts of the object such that each of the movable parts corresponds to an assigned type of music control event data, wherein the correspondence between each of the movable parts and the corresponding assigned type of music control event data is displayed; audio means for generating the sound in accordance with each music control event data included in the performance data to thereby perform the music; and video means responsive to the performance data for controlling movements of the respective movable parts in correspondence to the types of music control event data included in the performance data sequentially provided from the sequencer means, thereby generating a motion image of the object in matching with the progression of the music.

11. The apparatus as claimed in claim 10, wherein the video means includes means for analyzing a block of the performance data to prepare a frame of the motion image in advance of generation of the sound corresponding to the same block by the audio means, so that the video means can generate the prepared frame timely when the audio means generates the sound according to the same block used for preparation of the frame.

12. The apparatus as claimed in claim 10, wherein the video means comprises means for successively generating key frames of the motion image in response to the performance data, and means for generating a number of sub frames inserted between the successive key frames by interpolation to smoothen the motion image while varying the number of the sub frames dependently on a resource of the apparatus affordable to the interpolation.

13. The apparatus as claimed in claim 10, wherein the setting means comprises means for setting the motion parameters to design a movement of the object representing a player of an instrument, and wherein the video means comprises means for utilizing the motion parameters to form the framework of the motion image of the player and means for utilizing the performance data to modify the framework for generating the motion image presenting the player playing the instrument to perform the music.

14. A method of animating movable parts of an object in association with music, said method comprising the steps of: sequentially providing performance data to perform the music, the performance data including a plurality of types of music control event data associated to the music to be played; displaying a graphical user interface operable for selecting and setting a type of music control event data from amongst the plurality of types of music control event data that are graphically displayed for selection, said graphical user interface operable for assigning a type of music control event data to each of the movable parts of the object such that the respective movable parts correspond to the assigned music control event data, wherein the correspondence between each of the movable parts and the corresponding assigned type of music control event data is displayed; generating a sound in accordance with the performance data to thereby perform the music; and generating a motion image of the object in matching with the progression of the music, wherein the step of generating a motion image is in response to the performance data for controlling movements of the respective movable parts in correspondence to the types of music control event data included in the performance data sequentially provided by said step of sequentially providing performance data.

15. The method as claimed in claim 14, wherein the step of generating a motion image includes analyzing a block of the performance data to prepare a frame of the motion image in advance of generation of the sound corresponding to the same block so that the prepared frame can be generated timely when the sound is generated according to the same block used for preparation of the frame.

16. The method as claimed in claim 14, wherein the step of generating a motion image comprises successively generating key frames of the motion image in response to the performance data, and generating a variable number of sub frames inserted between the successive key frames by interpolation to smoothen the motion image.

17. The method as claimed in claim 14, wherein the step of displaying a graphical user interface further comprises providing motion parameters to design a movement of the object representing a player of an instrument, and wherein the step of generating a motion image further comprises utilizing the motion parameters to form the framework of the motion image of the player and utilizing the performance data to modify the framework for generating the motion image presenting the player playing the instrument to perform the music.

18. A machine readable medium for use in a computer having a CPU and a display, said medium containing program instructions executable by the CPU for causing the computer system to perform a method for animating movable parts of an object along with music, said method comprising the steps of: sequentially providing music control information in correspondence with the music to be played, the music control information including a plurality of types of music control event data for controlling a sound of the music to be played; displaying a parameter setting graphical user interface, said parameter setting graphical user interface operable for selecting a type of music control event data from among the plurality of types of music control event data that are graphically displayed for selection and designating a type of music control event data to each of the movable parts of the object, wherein the correspondence between each of the movable parts and the corresponding designated type of music control data is displayed; receiving a selection from said parameter setting graphical user interface of a type of music control event data from among the plurality of types of music control event data and designation of the type of music control event data to each of the movable parts of the object such that the respective movable parts correspond to the types of music control event data; generating a sound in accordance with each music control event data included in the music control information to thereby play the music; and in response to the music control information for controlling movements of the respective movable parts in correspondence to the types of music control event data included in the music control information, generating a motion image of the object in matching with progression of the music.

19. The machine readable medium as claimed in claim 18, wherein the motion image is generated by analyzing a data block of the music control information for preparing a frame of the motion image in advance of generation of the sound corresponding to the same data block, so that the prepared frame is generated timely when the sound is generated according to the same data block used for preparation of the frame.

20. The machine readable medium as claimed in claim 18, wherein the method further comprises the steps of successively generating key frames of the motion image in response to the music control information, and generating a number of sub frames inserted between the successive key frames by interpolation to smoothen the motion image while varying the number of the sub frames dependently on a resource of the computer system affordable to the video module.

21. The machine readable medium as claimed in claim 18, wherein the method further comprises the steps of generating a motion image of an object representing an instrument player, and analyzing the music control information to determine a rendition movement of the instrument player for controlling the motion image as if the instrument player plays the music.

22. A system for animating movable parts of an object along with music, said system comprising: a sequencer module that sequentially provides music control information in correspondence with the music to be played such that the music control information is arranged into a plurality of channels; a parameter setting module manually operable to select a channel of music control information from among the plurality of the channels and operable to set the selected channel of the music control information to each of the movable parts of the object such that the respective movable parts correspond to the channels of the selected and set music control information; an audio module for generating a sound in accordance with the respective channels of the music control information to thereby play the music; and a video module responsive to the music control information for controlling movements of the respective movable parts in correspondence to the channels of the music control information sequentially provided from the sequencer module, thereby generating a motion image of the object in matching with progression of the music.
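The mapping that claims 1 and 8 describe (each movable part assigned a type of music control event via the GUI, with the motion parameter's baseline movement scaled by the sound's amplitude) can be sketched as follows. All names, event types, and the 0–127 amplitude range (typical of MIDI velocity) are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of claims 1 and 8: the parameter-setting GUI's outcome
# is a mapping from movable parts to event types; movement is the motion
# parameter scaled by the event's amplitude.

part_to_event_type = {        # result of the parameter-setting GUI
    "left_arm": "note_on",
    "right_arm": "pitch_bend",
    "head": "control_change",
}

motion_params = {"left_arm": 10.0, "right_arm": 4.0, "head": 2.0}

def move_parts(events):
    """Return per-part displacement for one batch of events."""
    movement = {}
    for part, ev_type in part_to_event_type.items():
        for ev in events:
            if ev["type"] == ev_type:
                # Baseline movement scaled by normalized amplitude (claim 8).
                movement[part] = motion_params[part] * ev["amplitude"] / 127.0
    return movement

print(move_parts([{"type": "note_on", "amplitude": 127}]))  # → {'left_arm': 10.0}
```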
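Claims 3, 16, and 20 describe generating key frames from the music events and inserting a variable number of interpolated sub-frames between them, throttled by the resources the system can spare. A minimal sketch, with the "affordable resource" faked as a simple frame budget (the patent does not specify how the resource is measured):

```python
# Hypothetical sketch of claim 3: interpolated sub-frames between key frames,
# with the sub-frame count varied by an assumed per-segment frame budget.

def interpolate(key_a, key_b, n_sub):
    """Linearly interpolate n_sub sub-frame poses between two key poses."""
    frames = []
    for i in range(1, n_sub + 1):
        t = i / (n_sub + 1)
        frames.append({k: key_a[k] + t * (key_b[k] - key_a[k]) for k in key_a})
    return frames

def render(key_frames, frame_budget):
    # Spread the affordable budget evenly across the gaps between key frames;
    # a loaded system passes a smaller budget and gets a coarser motion image.
    gaps = len(key_frames) - 1
    n_sub = max(0, frame_budget // gaps) if gaps else 0
    out = []
    for a, b in zip(key_frames, key_frames[1:]):
        out.append(a)
        out.extend(interpolate(a, b, n_sub))
    out.append(key_frames[-1])
    return out

keys = [{"arm": 0.0}, {"arm": 1.0}]
print(len(render(keys, frame_budget=3)))  # → 5 (2 keys + 3 sub-frames)
```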
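Claims 2, 11, 15, and 19 cover a look-ahead scheme: the video side prepares the frame for a data block before the audio side renders that block's sound, so the frame is ready exactly when the sound plays. A sketch of that pipelining, with frame and sound rendering stubbed as strings (purely illustrative):

```python
# Hypothetical sketch of claim 2's look-ahead: each block's frame is prepared
# one step ahead of the audio, so frame and sound for a block coincide.

def playback(blocks):
    timeline = []
    next_frame = f"frame({blocks[0]})"            # prepared before playback starts
    for i, block in enumerate(blocks):
        frame = next_frame                        # ready exactly on time
        if i + 1 < len(blocks):
            next_frame = f"frame({blocks[i + 1]})"  # prepare the next block now
        timeline.append((f"sound({block})", frame))
    return timeline

print(playback(["b0", "b1"]))
```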
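Claim 22 varies the scheme of claim 1 by assigning whole channels of music control information (rather than event types) to movable parts. A sketch of that routing, with MIDI-style channel numbers and part names made up for illustration:

```python
# Hypothetical sketch of claim 22: each movable part is set to a channel,
# and only that channel's events drive the part.

part_to_channel = {"torso": 0, "legs": 9}   # e.g. channel 9 often carries drums

def route(events):
    """Group events by the movable part whose channel they carry."""
    per_part = {part: [] for part in part_to_channel}
    for ev in events:
        for part, ch in part_to_channel.items():
            if ev["channel"] == ch:
                per_part[part].append(ev)
    return per_part

events = [{"channel": 9, "type": "note_on"}, {"channel": 0, "type": "note_on"},
          {"channel": 9, "type": "note_off"}]
print({p: len(evs) for p, evs in route(events).items()})  # → {'torso': 1, 'legs': 2}
```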