| Country / Status | United States (US) Patent, Granted |
| ---| --- |
| IPC (7th edition) | |
| Application No. | UP-0241839 (2005-09-30) |
| Patent No. | US-7653883 (2010-02-24) |
| Inventors / Address | |
| Applicant / Address | |
| Agent / Address | |
| Citation info | Cited by: 603 / Patents cited: 220 |
Proximity based systems and methods that are implemented on an electronic device are disclosed. The method includes sensing an object spaced away and in close proximity to the electronic device. The method also includes performing an action in the electronic device when an object is sensed.
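The abstract's method can be sketched as a simple sense-then-act loop. This is a minimal illustration, not the patent's actual implementation: the sensor callback, the distance threshold, and the action hook are all hypothetical assumptions.

```python
# Hypothetical cutoff for "close proximity"; the patent does not specify one.
PROXIMITY_THRESHOLD_MM = 30

def poll_proximity(read_distance_mm, perform_action):
    """Sense an object spaced away from but in close proximity to the device.

    read_distance_mm: hypothetical sensor callback returning the distance to
    the nearest object in millimetres, or None if nothing is in range.
    perform_action: callback invoked when an object is sensed nearby.
    Returns True if an object was sensed and the action performed.
    """
    distance = read_distance_mm()
    if distance is not None and distance < PROXIMITY_THRESHOLD_MM:
        perform_action()  # e.g. bring a GUI element into view
        return True
    return False
```

In a real device this would run on a timer or interrupt; here it is a single poll so the control flow stays visible.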
The invention claimed is:

1. An I/O platform, comprising: an I/O surface having one or more integrated I/O devices selected from input devices and output devices; a proximity detection system configured to: detect when a finger is in close proximity to but not contacting the I/O surface; detect a position of the finger above the I/O surface when the finger is detected in close proximity to but not contacting the I/O surface; select, based on an application appearing on the I/O surface when the finger is detected in close proximity to the I/O surface but independent of x and y components of the position of the finger over the application, a first graphical user interface element from a plurality of available graphical user interface elements operable to allow input to be provided at least by movement of the finger when the finger is detected in close proximity to but not contacting the I/O surface; display the first graphical user interface element on the I/O surface and below the finger; and detect a proximity gesture performed by the movement of the finger above the first displayed graphical user interface element.
2. The I/O platform as recited in claim 1 wherein the proximity detection system is further configured to detect when the finger is in close proximity to but not contacting a particular I/O device of the I/O surface.
3. The I/O platform as recited in claim 1 wherein the I/O surface includes a touch panel and a display.
4. The I/O platform as recited in claim 1 wherein a display is affected in a non-trivial manner when the finger is detected.
5. The I/O platform as recited in claim 1, wherein the graphical user interface element is a virtual scroll wheel, and wherein the movement of the finger is rotational movement that effectively rotates the virtual scroll wheel.
6. The I/O platform as recited in claim 1 wherein the graphical user interface element provides onscreen input tools and controls.
7. The I/O platform as recited in claim 1 wherein the proximity detection system is an optically based system, capacitively based system or an ultrasonic based system.
8. The I/O platform as recited in claim 1 wherein the proximity detection system generates one or more sensing fields above the I/O surface and produces signals when an object disturbs the one or more sensing fields.
9. The I/O platform as recited in claim 1 wherein a single sensing field that substantially covers the entire I/O surface is generated.
10. The I/O platform as recited in claim 9 wherein a single sensing field that only covers a portion of the I/O surface is generated.
11. The I/O platform as recited in claim 9 wherein multiple sensing fields that together substantially cover the entire I/O surface are generated.
12. The I/O platform as recited in claim 9 wherein multiple sensing fields that together only cover a portion of the I/O surface are generated.
13. The I/O platform as recited in claim 1 wherein the I/O platform is operated by: generating a proximity sensing field above one or more I/O devices; detecting disturbances in the proximity sensing field; translating disturbances to mean an object is present in the proximity sensing field; and generating a proximity control signal based on the presence of an object in the proximity sensing field.
14. A portable computing device, comprising: a housing; a user interface including an I/O surface having one or more integrated I/O devices, each of which is positioned at a surface of the housing; and a proximity detection system configured to: detect when an object is in close proximity but not touching the periphery of the portable computing device; detect a position of the object above the I/O surface when the object is detected in close proximity to but not touching the periphery of the portable computing device; select, based on an application appearing on the I/O surface when the object is detected in close proximity to the I/O surface but independent of x and y components of the position of the object over the application, a first graphical user interface element from a plurality of available graphical user interface elements that are each operable to allow input to be provided by movement of the object when the object is detected in close proximity to but not touching the periphery of the portable computing device; display the first graphical user interface element on the periphery of the portable computing device and below the object; and detect a proximity gesture performed by movement of the object above the first displayed graphical user interface element.
15. The portable computing device as recited in claim 14 wherein the proximity detection system is configured to detect an object above a first surface of the housing.
16. The portable computing device as recited in claim 14 wherein the proximity detection system includes a single sensing node.
17. The portable computing device as recited in claim 14 wherein the proximity detection system includes multiple sensing nodes.
18. The portable computing device as recited in claim 14 wherein at least one of the I/O devices is altered when an object is detected.
19. The portable computing device as recited in claim 18 wherein the user interface includes a display and wherein a GUI element is brought into view on the display when an object is detected.
20. The portable computing device as recited in claim 19 wherein the GUI element is an onscreen control box.
21. The portable computing device as recited in claim 19 wherein the GUI element is a virtual scroll wheel.
22. The portable computing device as recited in claim 14 wherein the user interface includes a touch panel that is distinct from the proximity detection system and wherein the proximity detection system detects objects above the touch panel.
23. A portable computing device, comprising: a display device configured to display a graphical user interface; an input means integrated with the display device and configured to provide inputs to the portable computing device; a proximity detector configured to detect a first object in a first position in space above the display device; and a processor operatively coupled to the display device, input means and the proximity detector, wherein the processor is operable to: select, based on an application appearing on the display device when the first object is detected above the display device but independent of x and y components of the first position of the first object above the application, a first GUI element from a plurality of GUI elements in response to the detection of the first object in the first position in the space above the display; cause the display device to display the first GUI element in response to the detection of the object; and perform one or more actions associated with the first GUI element when a proximity gesture is detected at the first GUI element via the input means.
24. A proximity based method implemented on a portable electronic device, comprising: sensing an object in a position spaced away and in close proximity to a user interface of the portable electronic device, the user interface integrated with the portable electronic device; displaying and enabling, based on an application appearing on a display portion of the user interface when the object was sensed but independent of x and y components of the position of the object over the application, a particular graphical user interface element on a display portion of the user interface; monitoring input features of the user interface for a proximity gesture detected over the particular graphical user interface element; and performing one or more actions in the particular graphical user interface element based on the proximity gesture.
25. The method as recited in claim 24 wherein the sensing is performed over an entire side of the electronic device.
26. The method as recited in claim 24 wherein the sensing is performed regionally at particular locations of the electronic device.
27. The method as recited in claim 24 wherein multiple areas are sensed in order to monitor the motion of an object.
28. The method as recited in claim 24 wherein performing an action includes activating some portion of a user interface.
29. The method as recited in claim 24 wherein a second action is performed when the object is no longer detected.
30. The method as recited in claim 24 wherein monitoring input features includes monitoring a touch sensing device for touches and differentiating between light touch interactions and hard touch interactions.
31. The method as recited in claim 24 wherein the portable electronic device is a handheld portable electronic device.
32. The method as recited in claim 31 wherein the handheld portable electronic device is a media player.
33. A method of providing a user interface for an electronic device that includes a display and a proximity detector integrated with the electronic device, the method comprising: determining if an object is detected in space above the electronic device; monitoring and analyzing current operating conditions when the object is detected, the current operating conditions including an application appearing on the display; determining, based on the monitoring and analyzing of the current operating conditions but independent of x and y components of a position of the object over the application, whether to activate a first GUI element or a second GUI element from a set of available GUI elements; activating the first GUI element for a first set of operating conditions when activating the first GUI element is determined; activating the second GUI element for a second set of operating conditions when activating the second GUI element is determined; and detecting a proximity gesture performed above the activated GUI element.
34. The method as recited in claim 33 wherein the display is a touch screen display, wherein determining if an object is detected in space above the electronic device comprises detecting an object in space above the touch screen display of the electronic device, and wherein the method further comprises monitoring a touch event at the touch screen, linking the touch event to the active GUI element, and controlling the GUI element with the touch event.
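A distinctive feature running through the independent claims is that the GUI element shown under a hovering finger is chosen from the active application, not from the x/y position of the finger. The sketch below illustrates that selection rule; the application names, element names, and lookup table are all illustrative assumptions, not part of the patent.

```python
# Hypothetical mapping from the on-screen application to the GUI element that
# should surface when a hover is detected (cf. claims 1, 14, 23, 33).
ELEMENT_FOR_APP = {
    "media_player": "virtual_scroll_wheel",   # claim 5 / 21 example element
    "text_editor": "onscreen_control_box",    # claim 20 example element
}

def select_gui_element(current_app, finger_xy=None):
    """Pick a GUI element from the available set when a finger hovers.

    Per the claims, the choice depends on the application appearing on the
    surface but is independent of the x and y components of the finger's
    position, so finger_xy is deliberately ignored.
    """
    return ELEMENT_FOR_APP.get(current_app, "default_control_box")
```

Because the position argument is unused, hovering anywhere over the same application yields the same element; only where the element is *drawn* (below the finger) depends on position.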
Copyright KISTI. All Rights Reserved.