| Field | Value |
|---|---|
| Country / Type | United States (US) patent, granted |
| International Patent Classification (IPC, 7th ed.) | |
| Application number | US-0473709 (1999-12-29) |
| Inventors / Address | |
| Agents / Address | |
| Citation information | Times cited: 387; patents cited: 6 |
The invention relates to a method of using a portable mobile communication device such as a mediaphone. In the method, an image of the user's hand is formed and the natural partition of the hand is recognized. An augmented reality image of a user interface of the device is formed by laying an image of a desired user interface, comprising input segments, onto the image of the user's hand in such a way that the segments of the user interface are separated from each other by the natural partition of the hand. The augmented reality image of the user interface is seen by the user, who selects a desired segment by touching a partition on the hand. The selection on the hand is recognized and a function related to the selected segment is performed by the device. MARISIL is defined and used to operate this class of devices.
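The abstract's pipeline — recognize the hand's natural partitions, overlay UI segments onto them, detect a touch, run the related function — can be sketched as follows. This is a minimal illustrative model, not the patent's implementation: the partitions are represented as named regions, image processing is simulated, and all class, function, and action names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    name: str        # input segment shown in the augmented reality overlay
    partition: str   # natural partition of the hand it is mapped onto
    action: str      # function the device performs when the segment is selected

class HandInterface:
    """Toy model of the claimed method: overlay UI segments onto hand partitions."""

    def __init__(self):
        self.segments = {}

    def recognize_partitions(self, hand_image):
        # Stand-in for image processing: the "image" is already a list of
        # partition names that the signal processing unit would extract
        # from a video frame of the hand.
        return list(hand_image)

    def overlay(self, partitions, ui_spec):
        # Lay the desired user interface onto the hand so that its segments
        # are separated by the hand's natural partitions.
        self.segments = {
            p: Segment(name=ui_spec[p], partition=p, action=f"do_{ui_spec[p]}")
            for p in partitions if p in ui_spec
        }
        return self.segments

    def select(self, touched_partition):
        # The user touches a partition; the device recognizes the selection
        # and returns the function related to that segment.
        return self.segments[touched_partition].action

# Hypothetical usage: two finger segments carry call-handling functions.
dev = HandInterface()
parts = dev.recognize_partitions(["index-tip", "index-mid", "middle-tip"])
dev.overlay(parts, {"index-tip": "answer_call", "index-mid": "hang_up"})
print(dev.select("index-tip"))  # do_answer_call
```

The mapping from partition to segment is the core of the claim: selection is resolved purely by which partition was touched, so the hand itself acts as the input surface.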
1. A method of using a portable mobile communication device, the method comprising: forming an image of a physical hand of a user and recognizing a natural partition of the user's physical hand; forming an augmented reality image by overlaying an image of a desired user interface on an image of the user's physical hand formed by rays of light originating from the user's physical hand and entering the user's eyes; selecting a desired segment by touching a suitable partition on the user's physical hand, the segment being recognized as a natural partition of the user's physical hand; and performing a function related to the segment in the device.

2. A method according to claim 1, wherein the desired segment is selected by the user.

3. A method according to claim 1, wherein the image of the desired user interface, overlaid on the image of the user's physical hand recognized by the user with his eyes, is formed by means of digital signal processing from data provided by virtual reality gloves.

4. A method according to claim 1, wherein the user interface is an alphanumerical user interface.

5. A method according to claim 1, wherein the user interface is an interface based on QFD symbols.

6. A method according to claim 1, wherein the image of the user's physical hand is formed as a video image.

7. A method according to claim 1, wherein the image of a desired user interface consists of computer graphics.

8. A method according to claim 1, wherein the user interface comprises input segments and is situated on the image of the user's physical hand in such a way that the segments of the user interface are separated from each other by the natural partition of the user's physical hand.

9. A method according to claim 1, wherein the device is operated by means of MARISIL sign language.

10. A method of using a portable mobile communication device, the method comprising: forming an image of a physical hand of a user and recognizing a natural partition of the user's physical hand; forming an augmented reality image of a user interface of the device by overlaying an image of a desired user interface on the image of the user's physical hand; showing a user the augmented reality image of the user interface; selecting a desired segment by touching a suitable partition on the user's physical hand, the segment being recognized as a natural partition of the user's physical hand; recognizing the selection on the user's physical hand; and performing a function related to the segment in the device.

11. A method according to claim 10, wherein the desired segment is selected by the user.

12. A method according to claim 10, wherein the image of the desired user interface, overlaid on the image of the user's physical hand, is formed by means of digital signal processing from data provided by virtual reality gloves.

13. A method according to claim 10, wherein the user interface is an alphanumerical user interface.

14. A method according to claim 10, wherein the user interface is an interface based on QFD symbols.

15. A method according to claim 10, wherein the image of the user's physical hand is formed as a video image.

16. A method according to claim 10, wherein the image of a desired user interface consists of computer graphics.

17. A method according to claim 10, wherein the user interface comprises input segments and is situated on the image of the user's physical hand in such a way that the segments of the user interface are separated from each other by the natural partition of the user's physical hand.

18. A method according to claim 10, wherein the device is operated by means of MARISIL sign language.

19. A portable mobile communication device comprising a video camera unit, a display unit, a transceiver unit and a digital signal processing unit that is arranged to control the operation of the portable mobile communication device, wherein: the video camera is arranged to form a video image of a user's physical hand and to feed the image into the digital signal processing unit, which is arranged to recognize a natural partition of the user's physical hand in the video image; the digital signal processing unit is arranged to feed an image of a user interface into the display and to form an augmented reality image of the user interface of the device by overlaying an image of a desired user interface on an image of the user's physical hand formed by rays of light originating from the user's physical hand, passing through the display, and entering the user's eyes; a desired segment, recognized as a natural partition of the user's physical hand, is selected by the user by touching a suitable partition on the user's physical hand; the digital signal processing unit is arranged to recognize the selection on the user's physical hand; and the device is arranged to perform a function related to the segment.

20. A device according to claim 19, wherein the user selects a desired segment.

21. A device according to claim 19, wherein the user interface is an alphanumerical user interface.

22. A device according to claim 19, wherein the user interface is an interface based on QFD symbols.

23. A device according to claim 19, wherein the image of a desired user interface consists of computer graphics.

24. A device according to claim 19, wherein the user interface comprises input segments and is situated on the image of the user's physical hand in such a way that the segments of the user interface are separated from each other by the natural partition of the user's physical hand.

25. A device according to claim 19, wherein the device is arranged to operate by means of MARISIL sign language.

26. A portable mobile communication device comprising a video camera unit, a display unit, a transceiver unit and a digital signal processing unit that is arranged to control the operation of the portable mobile communication device, wherein: the video camera is arranged to form a video image of a user's physical hand and to feed the image into the digital signal processing unit, which is arranged to recognize a natural partition of the user's physical hand in the video image; the digital signal processing unit is arranged to form an augmented reality image of a user interface of the device by overlaying an image of a desired user interface on the image of the user's physical hand; the digital signal processing unit is arranged to feed the augmented reality image of the user interface into the display, which is arranged to show the image to the user; a desired segment, recognized as a natural partition of the user's physical hand, is selected by the user by touching a suitable partition on the user's physical hand; the digital signal processing unit is arranged to recognize the selection on the user's physical hand; and the device is arranged to perform a function related to the segment.

27. A device according to claim 26, wherein the user selects a desired segment.

28. A device according to claim 26, wherein the user interface is an alphanumerical user interface.

29. A device according to claim 26, wherein the user interface is an interface based on QFD symbols.

30. A device according to claim 26, wherein the image of a desired user interface consists of computer graphics.

31. A device according to claim 26, wherein the user interface comprises input segments and is situated on the image of the user's physical hand in such a way that the segments of the user interface are separated from each other by the natural partition of the user's physical hand.

32. A device according to claim 26, wherein the device is arranged to operate by means of MARISIL sign language.

33. A portable mobile communication device comprising a virtual reality gloves unit, a position tracking unit, a display unit, a transceiver unit and a digital signal processing unit that is arranged to control the operation of the portable mobile communication device, wherein: the virtual reality gloves are arranged to feed information on a user's physical hand into the digital signal processing unit, which is arranged to form an image of the user's physical hand and to recognize a natural partition of the user's physical hand; the digital signal processing unit is arranged to form an augmented reality image of a user interface of the device by overlaying an image of a desired user interface on the image of the user's physical hand; the digital signal processing unit is arranged to feed the augmented reality image of the user interface into the display, which is arranged to show the image to the user; a desired segment, recognized as a natural partition of the user's physical hand, is selected by the user by touching a suitable partition on the user's physical hand; the digital signal processing unit is arranged to recognize the selection on the user's physical hand; and the device is arranged to perform a function related to the segment.

34. A device according to claim 33, wherein the user selects a desired segment.

35. A device according to claim 33, wherein the user interface is an alphanumerical user interface.

36. A device according to claim 33, wherein the user interface is an interface based on QFD symbols.

37. A device according to claim 33, wherein the image of a desired user interface consists of computer graphics.

38. A device according to claim 33, wherein the user interface comprises input segments and is situated on the image of the user's physical hand in such a way that the segments of the user interface are separated from each other by the natural partition of the user's physical hand.

39. A device according to claim 33, wherein the device is arranged to operate by means of MARISIL sign language.
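The device claims (19, 26, 33) describe the same unit composition: a capture unit feeds the digital signal processing unit, which builds the augmented reality overlay, drives the display, and resolves touch selections. A hedged sketch of that wiring, with all class and method names invented for illustration:

```python
class VideoCameraUnit:
    """Stand-in for the video camera unit: returns a fake frame whose
    hand partitions have already been 'detected'."""
    def capture(self):
        return {"partitions": ["thumb-base", "index-tip"]}

class DisplayUnit:
    """Stand-in for the display unit: records the last overlay shown."""
    def __init__(self):
        self.shown = None
    def show(self, augmented_image):
        self.shown = augmented_image

class DSPUnit:
    """Digital signal processing unit: controls the device, as in the claims.
    It recognizes partitions, forms the overlay, feeds it to the display,
    and recognizes selections made by touching the hand."""
    def __init__(self, camera, display):
        self.camera = camera
        self.display = display
        self.overlay = {}

    def update(self, ui_spec):
        # Overlay the desired user interface onto the recognized partitions
        # and feed the augmented reality image into the display.
        frame = self.camera.capture()
        self.overlay = {p: ui_spec.get(p) for p in frame["partitions"]}
        self.display.show(self.overlay)

    def on_touch(self, partition):
        # Recognize the selection and return the related function, if any.
        return self.overlay.get(partition)

# Hypothetical usage: map one segment onto the index fingertip.
dsp = DSPUnit(VideoCameraUnit(), DisplayUnit())
dsp.update({"index-tip": "open_menu"})
print(dsp.on_touch("index-tip"))  # open_menu
```

The claim variants differ only in the capture path (see-through optics in claim 19, a video image in claim 26, virtual reality gloves in claim 33); in this sketch that difference would live entirely inside the capture unit.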
Copyright KISTI. All Rights Reserved.