Electronic music controller using inertial navigation-2
IPC Classification
Country / Type
United States(US) Patent
Granted
International Patent Classification (IPC, 7th edition)
G10D-013/02
G10D-013/00
G10H-003/12
G10H-007/00
G10H-001/00
G10H-003/14
Application No.
US-0680591
(2015-04-07)
Registration No.
US-9773480
(2017-09-26)
Inventor / Address
Rapp, John W.
Applicant / Address
Rapp, John W.
Agent / Address
Kaplan Breyer Schwarz, LLP
Citation Information
Times cited: 0
Patents cited: 9
Abstract
A percussion controller comprises an instrumented striker including devices for obtaining inertial measurements and a wireless transmitter, a sensor-enabled striking surface that receives an impact from the instrumented striker, and a data processing system that receives the inertial measurements and predicts at least one of the force or location of impact of the instrumented striker on the sensor-enabled striking surface before impact actually occurs.
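The core idea of the abstract — predicting force and location of impact before the impact occurs — can be illustrated with simple dead-reckoning kinematics. The sketch below is not taken from the patent: it assumes a stick-tip state (height above the surface, downward velocity, downward acceleration) has already been estimated from the striker's inertial measurements, and all names are illustrative.

```python
import math

def predict_impact(height_m, velocity_mps, accel_mps2):
    """Constant-acceleration dead reckoning of a stick tip toward a surface.

    height_m     -- current distance above the striking surface (m)
    velocity_mps -- downward velocity (m/s), positive = toward the surface
    accel_mps2   -- downward acceleration (m/s^2), positive = toward the surface

    Returns (time_to_impact_s, impact_speed_mps), or None if the stick
    is not converging on the surface.
    """
    a, v, h = accel_mps2, velocity_mps, height_m
    if abs(a) < 1e-9:
        if v <= 0:
            return None              # not moving toward the surface
        t = h / v                    # uniform-velocity fall
    else:
        # Solve h = v*t + 0.5*a*t^2 for the smallest positive t.
        disc = v * v + 2.0 * a * h
        if disc < 0:
            return None              # decelerating stick never reaches the surface
        t = (-v + math.sqrt(disc)) / a
        if t < 0:
            return None
    return t, v + a * t              # impact speed is a proxy for impact force
```

For example, a tip 10 cm above the surface moving down at 1 m/s and accelerating at 10 m/s² is predicted to hit in about 73 ms at roughly 1.73 m/s, early enough for the data processing system to schedule the musical event before physical contact.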
Representative Claims
1. A percussion controller comprising: an instrumented striker; and a data processing system, wherein the data processing system: (a) generates a plurality of virtual impact zones, wherein each zone corresponds to a different musical event; (b) receives first signals that convey information pertaining to movement of the instrumented striker; (c) generates a location prediction and a force prediction based on information conveyed by the first signals, wherein: (i) the location prediction predicts a location of intersection of the instrumented striker and one of the virtual impact zones, and (ii) the force prediction predicts a force with which the instrumented striker would strike the location of intersection if the virtual impact zone were physically manifested; (d) relates the location of intersection to a musical event; and (e) generates a musical event message based on the musical event.

2. The percussion controller of claim 1, wherein the location prediction is based, at least in part, on inertial navigation computations.

3. The percussion controller of claim 1, further comprising a striking surface for striking with the instrumented striker, wherein the striking surface does not include any sensors.

4. The percussion controller of claim 3, wherein the data processing system maps at least some of the plurality of virtual impact zones to locations on the striking surface, thereby defining physical impact zones on the striking surface, wherein each physical impact zone corresponds to the musical event associated with the virtual impact zone that defined the physical impact zone.

5. The percussion controller of claim 4, wherein the striking surface comprises a resilient surface.

6. The percussion controller of claim 5, wherein the striking surface comprises a plurality of lights, wherein the data processing system is operable to selectively illuminate some of the lights to demarcate the physical impact zones.

7. The percussion controller of claim 4, further comprising an auxiliary instrumented mat that generates second signals, wherein the data processing system uses the second signals to perform at least one of the following tasks: (i) initialize inertial navigation computations, and (ii) provide ongoing corrections to inertial navigation computations.

8. The percussion controller of claim 1, further comprising a sensor-enabled striking surface including a resilient surface for striking with the instrumented striker and a plurality of sensors disposed beneath the sensor-enabled striking surface, wherein the data processing system: (f) receives second signals that convey information pertaining to the movement of the instrumented striker toward the sensor-enabled striking surface; (g) predicts, based on the information conveyed by the second signals, at least one of: (i) a force of impact of the instrumented striker on the sensor-enabled striking surface, and (ii) a location at which the instrumented striker will impact the sensor-enabled striking surface; (h) relates the location of impact to a musical event; and (i) generates a musical event message based on the musical event.

9. The percussion controller of claim 8, further comprising an instrumented mat that controls one or more attributes of the sensor-enabled striking surface.

10. The percussion controller of claim 9, wherein striking the instrumented mat at a first location changes the musical event that corresponds to a first location on the sensor-enabled striking surface.

11. The percussion controller of claim 9, wherein striking the instrumented mat at a first location changes an instrument that the sensor-enabled striking surface simulates in conjunction with the data processing system.

12. The percussion controller of claim 10, wherein striking the instrumented mat at a second location changes an instrument that the sensor-enabled striking surface simulates in conjunction with the data processing system.

13. The percussion controller of claim 9, wherein the sensor-enabled striking surface simulates a first instrument and the instrumented mat simulates a second instrument.

14. The percussion controller of claim 8, further comprising a foot switch, wherein the foot switch controls one or more attributes of the sensor-enabled striking surface.

15. The percussion controller of claim 1, wherein the data processing system alters a number of virtual impact zones in the plurality thereof.

16. The percussion controller of claim 15, wherein the data processing system increases the number of virtual impact zones, wherein additional virtual impact zones correspond to additional musical events.

17. The percussion controller of claim 1, wherein the data processing system changes the musical events that correspond to particular virtual impact zones.

18. The percussion controller of claim 1, wherein at least one of the virtual impact zones corresponds to a cymbal.

19. The percussion controller of claim 1, wherein the data processing system: (f) compares the movement of the instrumented striker, as conveyed by the information in the first signals, to predetermined striker motion patterns that correspond to musical events; (g) characterizes the movement of the instrumented striker as a non-throwing motion when the striker's movement matches one of the predefined striker motion patterns; and (h) generates a second signal that conveys second information about the musical event corresponding to the matched predefined striker motion pattern.

20. The percussion controller of claim 1, wherein the data processing system stores information related to acceleration and position of the instrumented striker, wherein the information is indicative of a user's striker-throwing technique.

21. The percussion controller of claim 20, wherein the data processing system: generates a visual representation of the user's striker-throwing technique from the information indicative thereof; and displays the visual representation for viewing.

22. The percussion controller of claim 20, wherein the data processing system assesses the user's striker-throwing technique.

23. The percussion controller of claim 22, wherein the data processing system assesses the user's striker-throwing technique by comparing the information indicative of the user's striker-throwing technique to reference information pertaining to throwing technique.

24. The percussion controller of claim 23, wherein the reference information comprises a prerecorded reference performance.

25. A method comprising: predicting a location of intersection of an instrumented striker with a virtual impact zone based on signals received from the instrumented striker; predicting, based on the signals received from the instrumented striker, a force with which the instrumented striker would strike the virtual impact zone if the virtual impact zone were physically manifested; relating the location of intersection with a musical event; generating a first signal that conveys first information about the musical event; and transmitting the first signal to a device that generates a second signal that can be converted to sound that is related to the musical event.

26. The method of claim 25, further comprising mapping the virtual impact zone onto a striking surface.

27. The method of claim 25, further comprising: mapping predefined motion patterns to musical events; comparing motion of the instrumented striker to the predefined motion patterns; when the motion matches one of the predefined motion patterns, generating a third signal that conveys second information about the corresponding musical event; and transmitting the third signal to the device for generating signals that can be converted to a sound that is related to the corresponding musical event.

28. The method of claim 25, further comprising storing information related to acceleration and position of the instrumented striker, wherein the information is indicative of a user's striker-throwing technique.

29. The method of claim 28, further comprising assessing the user's striker-throwing technique.

30. The method of claim 29, wherein assessing the user's striker-throwing technique further comprises comparing the information indicative of the user's striker-throwing technique to reference information pertaining to throwing technique.

31. The method of claim 30, wherein the reference information comprises a prerecorded reference performance.

32. The method of claim 29, wherein assessing the user's throwing technique further comprises: generating a visual representation of the user's technique from the information indicative thereof; and displaying the visual representation for viewing.

33. The method of claim 25, further comprising: generating, at the third device, the signals that can be converted to the sound that is related to the corresponding musical event; and generating the sound.

34. A method comprising: monitoring motion of a striker; predicting, using information obtained from the monitoring, at least one of a location or a force as follows: (a) a location at which the striker will impact a striking surface, (b) a location at which the striker will intersect a virtual impact zone, (c) a force with which the striker will impact the striking surface at the location, or (d) a force with which the striker would impact the virtual impact zone at the location of intersection, if the virtual impact zone were physically manifested; generating a visual representation of the monitored motion; displaying the visual representation for viewing; and generating a musical event message from the at least one predicted location or force.

35. The method of claim 34, further comprising assessing a throwing technique of a user that is using the striker.

36. The method of claim 35, wherein the throwing technique being assessed is selected from the group consisting of a wrist pivot, whether grip is slipping, whether the striker is being rolled, whether a pre-impact release of throwing force occurs, single stroke throw about wrist axis and bounce, and double stroke throw and bounce.

37. The method of claim 35, wherein assessing the throwing technique comprises comparing the throwing technique to reference information pertaining to striker throwing technique.

38. The method of claim 35, wherein predicting at least one of a location or a force is based, at least in part, on inertial navigation computations, and further wherein assessing the throwing technique is based on at least some of the inertial navigation computations.

39. A percussion controller comprising: an instrumented striker; a resilient striking surface for striking with the instrumented striker, wherein the striking surface does not include any sensors; and a data processing system, wherein the data processing system: (a) receives first signals that convey information pertaining to kinetics of the instrumented striker; and (b) processes the first signals using inertial navigation techniques to predict at least one of: (i) a future location of the instrumented striker; (ii) a force with which the instrumented striker will impact a surface at the future location; or (iii) a force with which the instrumented striker would impact a virtual impact zone at the future location, if the virtual impact zone were physically manifested.
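Claim 1's chain — virtual impact zones, each tied to a musical event, with a predicted intersection converted into a musical event message — can be sketched as follows. This is an illustrative reading, not the patent's implementation: the circular zone geometry, the note numbers, and the MIDI-style (note, velocity) message format are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class VirtualImpactZone:
    """A circular virtual impact zone tied to one musical event (hypothetical layout)."""
    cx: float        # zone centre x, metres
    cy: float        # zone centre y, metres
    radius: float    # zone radius, metres
    midi_note: int   # musical event this zone triggers (e.g. 38 = acoustic snare)

def musical_event_message(zones, x, y, impact_speed, max_speed=8.0):
    """Relate a predicted impact location and speed to a (note, velocity) message.

    Returns None when the predicted intersection lies outside every zone.
    """
    for z in zones:
        if (x - z.cx) ** 2 + (y - z.cy) ** 2 <= z.radius ** 2:
            # Scale predicted impact speed onto the 0-127 MIDI velocity range.
            velocity = min(127, int(127 * impact_speed / max_speed))
            return (z.midi_note, velocity)
    return None

# Two example zones: a "snare" pad at the origin and a "hi-hat" pad beside it.
zones = [VirtualImpactZone(0.0, 0.0, 0.15, 38),
         VirtualImpactZone(0.4, 0.0, 0.15, 42)]
```

A predicted strike at (0.05, 0.0) moving at 4 m/s would fall inside the first zone and yield the message (38, 63); because the zones are purely virtual, the same mapping works whether or not a physical striking surface (claims 3 and 39) is present.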
Patents cited by this patent (9)
Copeland, Brian R.; Byron, Marcel; Philip, Earle; Maynard, Keith, Apparatus for percussive harmonic musical synthesis utilizing MIDI technology.
Kay, Robert; Lesser, Ryan; LoPiccolo, Gregory B.; Schmidt, Daniel; McGinnis, Kevin Morris; Wright, Nathan H., Systems and methods for simulating a rock band experience.