IPC Classification Information

Country/Type: United States (US) Patent, Granted
International Patent Classification (IPC 7th ed.):
Application No.: US-0145455 (2008-06-24)
Registration No.: US-8323106 (2012-12-04)
Inventors / Address:
Applicant / Address: Sony Computer Entertainment America LLC
Agent / Address: Martine Penilla Group, LLP
Citation Info: cited by: 30 / cited patents: 250
Abstract
Game interface tracks the position of one or more game controllers in 3-dimensional space using hybrid video capture and ultrasonic tracking system. The captured video information is used to identify a horizontal and vertical position for each controller within a capture area. The ultrasonic tracking system analyzes sound communications to determine the distances between the game system and each controller and to determine the distances among the controllers. The distances are then analyzed by the game interface to calculate the depths within the capture area for each controller.
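The abstract's core ranging idea — converting the travel time of an ultrasonic signal into a distance, which becomes the depth coordinate within the capture area — can be sketched as follows. This is an illustrative example, not code from the patent; the function name and the 5 ms figure are invented, and a constant speed of sound in room-temperature air is assumed.

```python
# Illustrative sketch (not from the patent): one-way ultrasonic
# time-of-flight ranging, as described in the abstract.
SPEED_OF_SOUND_M_S = 343.0  # assumed: air at roughly 20 degrees C

def depth_from_time_of_flight(emit_time_s: float, receive_time_s: float) -> float:
    """Distance from the capture device to the controller, assuming
    synchronized clocks at both ends of the one-way sound link
    (claim 4 addresses this synchronization)."""
    return (receive_time_s - emit_time_s) * SPEED_OF_SOUND_M_S

# A chirp that takes 5 ms to arrive places the controller ~1.7 m away.
print(depth_from_time_of_flight(0.000, 0.005))
```

In practice the clock-synchronization step matters: a 1 ms clock offset would bias every depth estimate by about 34 cm.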
Representative Claims
1. A method for establishing communication between a computer program in a computing system and a controller held in a hand of a user, comprising: capturing image data in a capture area in front of a display, the capturing configured to identify a two-dimensional location of a first controller and a two-dimensional location of a second controller in the capture area; receiving data associated with bi-directional sound communication between the first controller and the second controller; capturing one-way sound communication between a location proximate to the display and the first controller; and computing a third dimensional location of the first controller in the capture area based on the bi-directional sound communication and the one-way sound communication.

2. The method as recited in claim 1, wherein a two-dimensional location of the first controller is identified by detecting a spherical shape within the image data, the spherical shape being associated with the first controller.

3. The method as recited in claim 1, wherein the captured image data corresponds to one of red-green-blue (RGB) pixel image data, black and white pixel image data, or infrared data.

4. The method as recited in claim 1, further including synchronizing clocks between two ends of the one-way sound communication before capturing the one-way sound communication.

5. The method as recited in claim 1, wherein the one-way sound communication includes a sender of the sound communication at the first controller.

6. The method as recited in claim 1, wherein the one-way sound communication includes a sender of the sound communication at the location proximate to the display.

7. The method as recited in claim 1, further including capturing one-way sound communication between the location proximate to the display and the second controller.

8. The method as recited in claim 1, wherein computing the third dimensional location further includes calculating a distance from the location proximate to the display to the location of the first controller based on the travel time of the sound in the one-way sound communication, and calculating a depth within the capture area based on the distance, wherein the depth is an orthogonal measure to both dimensions of the two-dimensional location.

9. The method as recited in claim 1, further including presenting, in a graphical user interface in the display, an object based on the two-dimensional location and the third dimensional location.

10. A system for establishing communication between a computer program and a controller, comprising: an image capture device for capturing image data in a capture area in front of a display; a sound capture device for capturing sound data in the capture area; and a computing system for executing the computer program, the computing system being connected to the display, the image capture device, and the sound capture device; wherein a first controller is operable to be held by a user with a single hand, the first controller including: a body with a first end and a second end; a grip area at about the first end; an input pad defined between the first end and the second end for entering input data to be transmitted to the computing system; a spherical-type section defined at the second end for facilitating image recognition by the image capture device of a spherical shape of the spherical-type section; a microphone; and a sound emitting device near the second end, the sound emitting device directed away from the user when held and configured for one-way sound communication with the sound capture device; wherein the computer program identifies a two-dimensional location of the first controller based on the captured image data and a third dimensional location of the first controller based on captured sound data; wherein the system further includes a second controller configured like the first controller, wherein a three-dimensional location of the second controller is identified by the computer program, wherein the first controller and the second controller are configured for two-way sound communication used to determine a distance between the first controller and the second controller, the distance being transmitted to the computing system, wherein the computer program uses the determined distance between controllers in the identification of the three-dimensional locations of the controllers.

11. The system as recited in claim 10, wherein the spherical-type section further includes a retro-reflective area.

12. The system as recited in claim 10, wherein the spherical-type section has a predetermined color to assist in identifying the spherical shape in the captured image data.

13. The system as recited in claim 10, wherein the first controller further includes a light source inside the spherical-type section, wherein the spherical-type section is translucent, wherein the light source illuminates the spherical-type section to assist in identifying the spherical shape in the captured image data.

14. The system as recited in claim 13, wherein the light source includes an infrared light source, wherein the image capture device further includes an infrared camera.

15. The system as recited in claim 14, wherein the image capture device further includes a visible-spectrum camera, wherein the image data from the infrared camera and the visible-spectrum camera are combined in the identification of the two-dimensional location.

16. The system as recited in claim 10, wherein the first controller and the second controller are physically connected serially along a longitudinal dimension, wherein both controllers are tracked by the computer program and the identified locations of the controllers are combined to improve accuracy.

17. The system as recited in claim 16, further comprising: an adapter defined between ends of the first controller and the second controller to enable connection.

18. The system as recited in claim 10, wherein the first controller and the second controller are physically connected in parallel along a longitudinal dimension, wherein a plate is used to connect the first controller and the second controller to form an integrated controller design to be held with two hands.

19. The system as recited in claim 10, further including a speaker connected to the computer system, wherein the first controller further includes a controller sound capturing device, wherein the first controller and the computer system establish an inverse one-way sound communication via the speaker and the controller sound capturing device.

20. The system as recited in claim 10, wherein the input pad is removable in order to accommodate different button configurations for the input pad.

21. The system as recited in claim 10, wherein the computer program tracks user movement by detecting a position of a user head within the captured image data or a position of a user torso within the captured image data.

22. The system as recited in claim 10, wherein the first controller further includes an inertial analyzer for tracking inertial activity of the first controller, the inertial activity transferred to the computer program to impact an action.

23. The system as recited in claim 10, wherein the sound emitting device further includes an acoustical chamber inside the first controller, the acoustical chamber defined to direct and receive sound in one or more directions.

24. The system as recited in claim 10, wherein the computer program samples some of the captured image and sound data, the computer program combining sampled data during the identification of the location of the first controller.

25. A system for establishing communication between a computer program and a controller, comprising: an image capture device for capturing image data in a capture area in front of a display; a sound capture device for capturing sound data in the capture area; and a computing system for executing the computer program, the computing system being connected to the display, the image capture device, and the sound capture device; wherein a first controller is operable to be held by a user with a single hand, the first controller including: a body with a first end and a second end; a grip area at about the first end; an input pad defined between the first end and the second end for entering input data to be transmitted to the computing system; a spherical-type section defined at the second end for facilitating image recognition by the image capture device of a spherical shape of the spherical-type section; and a sound emitting device near the second end, the sound emitting device directed away from the user when held and configured for one-way sound communication with the sound capture device; wherein the computer program identifies a two-dimensional location of the first controller based on the captured image data and a third dimensional location of the first controller based on captured sound data, wherein the identification of the third dimensional location further includes calculating a depth value of the third dimensional location by measuring time of flight and phase coherence of the one-way sound communication.

26. A system for establishing communication between a computer program and a controller, comprising: a first set and a second set of light emitters facing a capture area in front of a display, the first and second set of light emitters located near the display; a sound capture device for capturing sound data in the capture area; and a computing system for executing the computer program, the computing system being connected to the display and the sound capture device; wherein the controller is operable to be held by a user with a single hand, the controller including: a body with a first end and a second end; a grip area at about the first end; an input pad defined between the first end and the second end for entering input data to be transmitted to the computing system; an image capture device near the second end for capturing image data of an area where the first and second set of light emitters are located; a microphone; and a sound emitting device near the second end, the sound emitting device directed away from the user when held, the microphone and the sound emitting device being configured for two-way sound communication with the sound capture device; wherein the computer program identifies a two-dimensional location of the controller based on the captured image data and a third dimensional location of the controller based on sound data captured by the microphone and the sound emitting device.

27. The system as recited in claim 26, wherein a processor in the controller calculates a location of the controller based on the captured image data, wherein the controller transmits the calculated location to the computing system.
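Claims 1 and 8 combine a camera-derived two-dimensional location with an acoustically measured depth to produce a full three-dimensional position. A minimal sketch of one way this combination could work, assuming a simple pinhole camera model; the focal length, pixel coordinates, and function name here are invented for illustration and are not taken from the patent:

```python
# Hypothetical sketch: fusing an (x, y) pixel location from the image
# capture device with an ultrasonically measured depth, in the spirit
# of claims 1 and 8. Assumes a pinhole camera model; pixel offsets are
# measured from the optical center.
def to_3d(pixel_x: float, pixel_y: float, depth_m: float,
          focal_px: float = 600.0) -> tuple:
    """Back-project a pixel location to metric coordinates: lateral
    offsets scale linearly with the measured depth."""
    x = pixel_x / focal_px * depth_m
    y = pixel_y / focal_px * depth_m
    return (x, y, depth_m)

# A controller imaged 300 px right and 150 px above center, at 2 m depth:
print(to_3d(300.0, -150.0, 2.0))
```

This also illustrates why the acoustic channel is needed at all: the camera alone fixes only the ray on which the controller lies, and the time-of-flight distance selects the point along that ray.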