| Country / Type | United States (US) patent, granted |
|---|---|
| IPC (7th ed.) | |
| Application No. | US-0059326 (2013-10-21) |
| Registration No. | US-10220302 (2019-03-05) |
| Inventor / Address | |
| Applicant / Address | |
| Agent / Address | |
| Citation info | Cited by: 0 / Patents cited: 367 |
A controller (110) for controlling an execution of a game program by a processor for enabling an interactive game to be played by a user includes a body (111) having a section to be oriented towards a screen when a progress of a game provided via execution of the game apparatus is displayed upon the screen, and at least one photonically detectable (“PD”) element (e.g. 122, 124, 126, and/or 128) assembled with the body, a position of the photonically detectable element within an image being recordable by an image capture device (112) when the section is oriented at least partly towards the screen, wherein positions of the PD element at different points in time are quantifiable to quantify movement of the body in space.
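The abstract's core idea is that the image positions of a photonically detectable (PD) element, recorded at different points in time, can be differenced to quantify movement of the controller body. A minimal sketch of that computation (function and field names are mine, not the patent's):

```python
# Sketch of the abstract's idea: positions of a PD element captured
# at different points in time are differenced to quantify movement.

def quantify_movement(positions, timestamps):
    """Given per-frame 2D image positions of a PD element and their
    capture times, return per-interval displacement and speed."""
    results = []
    for (x0, y0), (x1, y1), t0, t1 in zip(
            positions, positions[1:], timestamps, timestamps[1:]):
        dx, dy = x1 - x0, y1 - y0
        dist = (dx * dx + dy * dy) ** 0.5  # Euclidean displacement in pixels
        results.append({"dx": dx, "dy": dy, "speed": dist / (t1 - t0)})
    return results

# Example: the element moves 3 px right and 4 px up over 0.1 s,
# i.e. a 5 px displacement at 50 px/s.
moves = quantify_movement([(10, 10), (13, 14)], [0.0, 0.1])
```

In the patent the positions come from an image capture device (112) viewing the body's screen-facing section; the sketch only shows the differencing step that turns recorded positions into a movement quantity.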
1. A method for use in obtaining input for an application, comprising: establishing communications with an image capture device that is capable of providing first output data that includes depth measurements; using the image capture device to capture images of an object as the object moves in a three-dimensional manner, wherein the object comprises a telephone comprising a hand-held computing and communication device that includes a display screen; receiving the first output data provided by the image capture device as the image capture device captures images of the object; receiving additional output data from at least one sensor other than the image capture device that can be used to determine at least one of motion and orientation of the object, wherein the at least one sensor other than the image capture device does not comprise an image capture device; tracking the three-dimensional movements of the object by mixing the first output data provided by the image capture device with the additional output data provided by the at least one sensor other than the image capture device; and causing a visual effect to occur on the display screen of the hand-held computing and communication device of the telephone in response to a movement of the object; wherein the image capture device comprises a three-dimensional camera that measures a depth of a pixel and determines a distance of the pixel from the camera.

2. The method of claim 1, further comprising: changing weights that are applied to the first output data and the additional output data in the mixing.

3. The method of claim 1, wherein the object comprises one or more light sources.

4. The method of claim 3, wherein the one or more light sources are arranged in an arcuate pattern.

5. The method of claim 3, wherein the one or more light sources are arranged in a geometric shape.

6. The method of claim 1, wherein the object comprises a controller.

7. The method of claim 1, further comprising: generating input for the application based on the tracked three-dimensional movements of the object.

8. The method of claim 1, wherein the tracking the three-dimensional movements of the object comprises: determining position information for the object.

9. The method of claim 1, wherein the tracking the three-dimensional movements of the object comprises: determining orientation information for the object.

10. The method of claim 1, wherein the tracking the three-dimensional movements of the object comprises: determining acceleration information for the object.

11. The method of claim 1, wherein the tracking the three-dimensional movements of the object comprises: determining velocity information for the object.

12. The method of claim 1, wherein the tracking the three-dimensional movements of the object comprises: determining at least one of a tilt, pitch, yaw, and roll for the object.

13. A computer program product comprising a non-transitory medium for embodying a computer program for input to a computer and a computer program embodied in the non-transitory medium for causing the computer to perform steps comprising: establishing communications with an image capture device that is capable of providing first output data that includes depth measurements; using the image capture device to capture images of an object as the object moves in a three-dimensional manner, wherein the object comprises a telephone comprising a hand-held computing and communication device that includes a display screen; receiving the first output data provided by the image capture device as the image capture device captures images of the object; receiving additional output data from at least one sensor other than the image capture device that can be used to determine at least one of motion and orientation of the object, wherein the at least one sensor other than the image capture device does not comprise an image capture device; tracking the three-dimensional movements of the object by mixing the first output data provided by the image capture device with the additional output data provided by the at least one sensor other than the image capture device; and causing a visual effect to occur on the display screen of the hand-held computing and communication device of the telephone in response to a movement of the object; wherein the image capture device comprises a three-dimensional camera that measures a depth of a pixel and determines a distance of the pixel from the camera.

14. The computer program product of claim 13, wherein the computer program further causes the computer to perform a step comprising: changing weights that are applied to the first output data and the additional output data in the mixing.

15. The computer program product of claim 13, wherein the object comprises one or more light sources.

16. The computer program product of claim 15, wherein the one or more light sources are arranged in an arcuate pattern.

17. The computer program product of claim 15, wherein the one or more light sources are arranged in a geometric shape.

18. The computer program product of claim 13, wherein the object comprises a controller.

19. The computer program product of claim 13, wherein the computer program further causes the computer to perform a step comprising: generating input for an application based on the tracked three-dimensional movements of the object.

20. The computer program product of claim 13, wherein the tracking the three-dimensional movements of the object comprises: determining position information for the object.

21. The computer program product of claim 13, wherein the tracking the three-dimensional movements of the object comprises: determining orientation information for the object.

22. The computer program product of claim 13, wherein the tracking the three-dimensional movements of the object comprises: determining acceleration information for the object.

23. The computer program product of claim 13, wherein the tracking the three-dimensional movements of the object comprises: determining velocity information for the object.

24. The computer program product of claim 13, wherein the tracking the three-dimensional movements of the object comprises: determining at least one of a tilt, pitch, yaw, and roll for the object.

25. An apparatus, comprising: an image capture device that is capable of providing first output data that includes depth measurements; and a processing system that is communicatively coupled to the image capture device, wherein the processing system is configured to execute steps comprising using the image capture device to capture images of an object as the object moves in a three-dimensional manner, receiving the first output data provided by the image capture device as the image capture device captures images of the object, receiving additional output data from at least one sensor other than the image capture device that can be used to determine at least one of motion and orientation of the object, and tracking the three-dimensional movements of the object by mixing the first output data provided by the image capture device with the additional output data provided by the at least one sensor other than the image capture device; wherein the object comprises a telephone comprising a hand-held computing and communication device that includes a display screen; wherein the processing system is further configured to execute a step comprising causing a visual effect to occur on the display screen of the hand-held computing and communication device of the telephone in response to a movement of the object; wherein the at least one sensor other than the image capture device does not comprise an image capture device; and wherein the image capture device comprises a three-dimensional camera that measures a depth of a pixel and determines a distance of the pixel from the camera.

26. The apparatus of claim 25, wherein the processing system is further configured to execute a step comprising: changing weights that are applied to the first output data and the additional output data in the mixing.

27. The apparatus of claim 25, wherein the object comprises one or more light sources.

28. The apparatus of claim 27, wherein the one or more light sources are arranged in an arcuate pattern.

29. The apparatus of claim 27, wherein the one or more light sources are arranged in a geometric shape.

30. The apparatus of claim 25, wherein the object comprises a controller.

31. The apparatus of claim 25, wherein the processing system is further configured to execute a step comprising: generating input for an application based on the tracked three-dimensional movements of the object.

32. The apparatus of claim 25, wherein the tracking the three-dimensional movements of the object comprises: determining position information for the object.

33. The apparatus of claim 25, wherein the tracking the three-dimensional movements of the object comprises: determining orientation information for the object.

34. The apparatus of claim 25, wherein the tracking the three-dimensional movements of the object comprises: determining acceleration information for the object.

35. The apparatus of claim 25, wherein the tracking the three-dimensional movements of the object comprises: determining velocity information for the object.

36. The apparatus of claim 25, wherein the tracking the three-dimensional movements of the object comprises: determining at least one of a tilt, pitch, yaw, and roll for the object.
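Claims 1 and 2 describe tracking an object by mixing the depth camera's output with an inertial sensor's output under changeable weights. A minimal sketch of such weighted mixing for one position estimate (the function and weight scheme are illustrative assumptions, not the patent's specified implementation):

```python
# Sketch of the mixing step in claims 1-2: blend a position estimate
# from the depth camera with one derived from a non-camera sensor,
# using weights that can be changed between updates (claim 2).

def mix_tracking(camera_pos, sensor_pos, camera_weight=0.7):
    """Blend two 3D position estimates with weights summing to 1."""
    w_cam = camera_weight
    w_sen = 1.0 - camera_weight
    return tuple(w_cam * c + w_sen * s
                 for c, s in zip(camera_pos, sensor_pos))

# Changing the weight shifts trust between the two data sources,
# e.g. leaning less on the camera when its view is partly occluded.
fused = mix_tracking((1.0, 2.0, 3.0), (1.2, 2.0, 2.8), camera_weight=0.5)
```

Real trackers typically make the weights time-varying (as claim 2 allows) and apply the same mixing to orientation, velocity, and acceleration, matching the quantities enumerated in claims 8 through 12.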
Copyright KISTI. All Rights Reserved.