IPC Classification Information

Country / Type | United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.) |
Application number | US-0424592 (filed 2012-03-20)
Registration number | US-8416217 (granted 2013-04-09)
Inventors / address |
- Eriksson, Thomas
- Leine, Per
- Laveno Mangelsdorff, Jochen
- Pettersson, Robert
- Jansson, Anders
- Goertz, Magnus
Applicant / address |
Agent / address |
Citation information | Times cited: 25; Patents cited: 135
Abstract
A light-based finger gesture user interface for an electronic device including a housing for an electronic device, a display mounted in the housing, a cavity, separated from the display, penetrating two opposite sides of the housing, a detector mounted in the housing operative to detect an object inserted in the cavity, and a processor connected to the detector and to the display for causing the display to render a visual representation in response to output from the detector.
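The abstract describes a detector that senses an object inserted into the cavity. As an illustrative sketch only (the patent does not disclose this code; the function name, threshold, and centroid heuristic are assumptions), one way a row of light receivers could report where an object interrupts the projected beams is:

```python
# Hypothetical sketch (not from the patent): estimating where an object
# sits inside the cavity from a row of light-beam intensity readings.
# Beam i is considered blocked when its measured intensity drops below
# a fraction of its unblocked baseline.

def detect_insertion(intensities, baselines, threshold=0.5):
    """Return the center index and width (in beams) of an inserted
    object, or None if no beam is blocked.

    intensities -- current readings from the light receivers
    baselines   -- unblocked reference intensity for each receiver
    """
    blocked = [i for i, (v, b) in enumerate(zip(intensities, baselines))
               if v < threshold * b]
    if not blocked:
        return None
    center = sum(blocked) / len(blocked)   # centroid of the shadowed beams
    width = blocked[-1] - blocked[0] + 1   # extent of the shadow
    return center, width

# A finger inserted over beams 3-5 of an 8-beam row:
print(detect_insertion([9, 9, 9, 2, 1, 2, 9, 9], [10] * 8))  # (4.0, 3)
```

Running the same estimate along two perpendicular sidewall arrays would yield the x and y locations the claims refer to; comparing shadows at different heights could supply the z location and tilt angles.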
Representative Claims
1. A light-based finger gesture user interface for an electronic device comprising: a housing for an electronic device comprising inner sidewalls extending from a bottom opening to a top opening to form a cavity; a touch screen display mounted in said housing such that the whole cavity is disposed along a side of the display, separate from the display; a detector mounted in said inner sidewalls, comprising a plurality of light emitters that project light beams into the cavity and a plurality of light receivers that measure intensities of blocked and unblocked ones of the projected light beams; and a processor connected to said detector and to said display for generating touch position data in response to a first object touching said display based on output received from said display, for generating cavity insertion data in response to a second object, different than the first object, being inside the cavity based on output received from said detector, and for combining the touch position data and the cavity insertion data into input for the electronic device, wherein the cavity insertion data includes location and orientation of the second object inside the cavity.

2. The user interface of claim 1, wherein said housing comprises first and second sections connected by a hinge that enables folding the two sections onto each other, and wherein said display is mounted in said first section and said cavity is located in said second section within a frame that fits around the edges of said first section when the sections are folded.

3. The user interface of claim 1, wherein said cavity is located within a handle for carrying the device.

4. The user interface of claim 1, wherein said cavity is formed by extending a retractable frame that surrounds the electronic device on three sides thereof.

5. The user interface of claim 1, wherein said processor determines at least five location and orientation attributes in response to the second object being inserted inside the cavity; namely, (i) location along an x-axis; (ii) location along a y-axis; (iii) location along a z-axis; (iv) polar angle in the xz-plane; and (v) polar angle in the yz-plane, wherein the locations and polar angles are with reference to x and y axes along the length and width of the top opening of the cavity and to a z axis along the height of the cavity from top to bottom.

6. The user interface of claim 5, wherein the second object comprises two fingers, and wherein said processor further distinguishes between four multi-finger gestures performed in said cavity; namely, (i) clockwise rotation in the xy-plane; (ii) counter-clockwise rotation in the xy-plane; (iii) finger pinch in the xy-plane; and (iv) finger spread in the xy-plane.

7. The user interface of claim 5, wherein said processor renders on said display a graphic being rotated in response to changes in polar angle of the second object inside the cavity.

8. A light-based finger gesture user interface for an electronic device comprising: a housing for an electronic device comprising inner sidewalls extending from a bottom opening to a top opening to form a cavity; a touch screen display mounted in said housing for rendering output generated by an application program running on the electronic device, wherein the whole cavity is disposed along a side of the display, separate from the display; a sensor mounted in said inner sidewalls, comprising a plurality of light emitters that project light beams into the cavity and a plurality of light receivers that measure intensities of blocked and unblocked ones of the projected light beams; and a processor in said housing connected to said display and to said sensor, for executing the application program, for generating touch position data in response to a first object touching said display based on output received from said display, for generating cavity insertion data in response to a second object, different than the first object, being inside the cavity based on output received from said sensor, and for combining the touch position data and the cavity insertion data into input for the application program, wherein the cavity insertion data includes location and orientation of the second object inside the cavity.

9. The user interface of claim 8, wherein the second object comprises two fingers, and wherein said processor distinguishes via the cavity insertion data between four multi-finger gestures performed in the cavity; namely, (i) clockwise rotation in the xy-plane; (ii) counter-clockwise rotation in the xy-plane; (iii) finger pinch in the xy-plane; and (iv) finger spread in the xy-plane.

10. The user interface of claim 8, wherein the second object comprises a finger, and wherein: said processor determines at least five location and orientation attributes in response to the finger being inside the cavity; namely, (i) location along an x-axis; (ii) location along a y-axis; (iii) location along a z-axis; (iv) polar angle in the xz-plane; and (v) polar angle in the yz-plane, wherein the locations and polar angles are with reference to x and y axes along the length and width of the top opening of the cavity and to a z axis along the height of the cavity from top to bottom; said processor causes said display to pan a displayed image in response to a translation gesture of the finger in the cavity; and said processor causes said display to rotate a displayed image in response to a change in one or both of the polar angles of the finger in the cavity.

11. The user interface of claim 8, wherein the second object comprises one or more fingers, and wherein said processor causes said display to render animated navigation of snapshots of widgets, documents, web page bookmarks, album artwork or photographs in response to detecting via the cavity insertion data a one-finger or a multi-finger gesture in the cavity, wherein a speed of the animated navigation is determined by the number of fingers performing the gesture.

12. The user interface of claim 8, wherein the second object comprises two fingers, and wherein said processor causes said display to render opening of a widget, document, web page, album, or photograph in response to detecting via the cavity insertion data a multi-finger gesture.

13. The user interface of claim 11, wherein said processor causes said display to render: fast navigation of the snapshots in response to detecting the one-finger gesture; and slow navigation of the snapshots in response to detecting the multi-finger gesture.

14. The user interface of claim 9, wherein said processor causes said display to render: opening of a widget, document, web page, album or photograph in response to detecting the finger spread in the xy-plane; and closing of an open widget, document, web page, album or photograph in response to detecting the finger pinch in the xy-plane.

15. The user interface of claim 8, wherein the object comprises a finger, wherein the application program defines a context, and wherein said processor activates a function based on the context in response to detecting via the cavity insertion data a gesture of the finger performed inside the cavity.

16. The user interface of claim 15, wherein the gesture comprises tilting the finger inside the cavity at an acute angle once, and wherein the function causes said display to render a pop-up menu of context-relevant options.

17. The user interface of claim 8, wherein the second object comprises a finger, and wherein said processor causes said display to render one or more graphics, identifies via the touch position data one of the rendered graphics as being a selected graphic, and instructs the application program to process the selected graphic in response to detecting via the cavity insertion data a gesture of the finger inside the cavity.

18. The user interface of claim 17, wherein the selected graphic is selected text within a document, and wherein the application program causes said display to maintain the selected text at its location within the document and to move unselected text surrounding the selected text within the document, thereby positioning the selected text at a different location within the document.

19. The user interface of claim 18, wherein the application program causes said display to fix the selected text at the different location in response to detecting via the cavity insertion data a second finger gesture in the cavity.

20. The user interface of claim 17, wherein the selected graphic is a selected icon, and wherein the application program causes said display to maintain the selected icon at its location on said display and to move unselected icons surrounding the selected icon on said display, thereby positioning the selected icon at a different location.

21. The user interface of claim 20, wherein the application program causes said display to fix the selected icon at the different location in response to detecting via the cavity insertion data a second finger gesture in the cavity.

22. The user interface of claim 17, wherein the application program is a communication program, wherein the selected graphic is a selected list entry in a contact list, and wherein the application program initiates a phone call to the selected entry in response to detecting via the cavity insertion data a first gesture, and opens an SMS dialog addressed to the selected entry in response to detecting via the cavity insertion data a second gesture.

23. The user interface of claim 17, wherein the application program is a calendar program that presents a time interval divided into units of a given resolution, wherein the selected graphic is a selected time interval, and wherein the application program causes said display to increase or decrease the resolution to present a respectively greater or lesser time interval centered at the selected time interval.

24. The user interface of claim 17, wherein the application program is a file manager that presents files, wherein the selected graphic is a selected file, wherein the gesture is extendable, and wherein the file manager presents properties of the selected file in response to detecting via the cavity insertion data an initial gesture, and opens the selected file in response to detecting via the cavity insertion data an extension of the initial gesture.

25. The user interface of claim 17, wherein the application program is a file manager that presents files within a file system that has multiple levels of folders, wherein the selected graphic is a selected file, wherein the gesture is extendable, and wherein the file manager presents properties of the selected file in response to detecting via the cavity insertion data an initial gesture, displays the selected file as an icon in its folder in response to detecting via the cavity insertion data an extension of the initial gesture, and displays its folder as an icon in its next level folder in response to detecting via the cavity insertion data a further extension of the initial gesture.

26. The user interface of claim 17, wherein the application program is a social network application, wherein the selected graphic is a person in the social network, wherein the gesture is extendable, and wherein the application program presents a first plurality of persons connected to the selected person in response to detecting via the cavity insertion data an initial gesture, and presents a second plurality of persons connected to the first plurality of persons in response to detecting via the cavity insertion data an extension of the initial gesture.

27. The user interface of claim 17, wherein the application program is a music library manager, wherein the selected graphic is a song or artist in the music library, wherein the gesture is extendable, and wherein the music library manager presents a first plurality of songs related to the selected song or artist in response to detecting via the cavity insertion data an initial gesture, and presents a second plurality of songs connected to the first plurality of songs in response to detecting via the cavity insertion data an extension of the initial gesture.

28. The user interface of claim 17, wherein the application program is a reservation program for restaurants or hotels, wherein the selected graphic is a selected restaurant or hotel, wherein the gesture is extendable, and wherein the reservation program presents a plurality of restaurants or hotels similar to the selected restaurant or hotel in terms of style, cost, location or other parameter, in response to detecting via the cavity insertion data an initial gesture, and presents additional similar restaurants or hotels in response to detecting via the cavity insertion data an extension of the initial gesture.

29. The user interface of claim 8, wherein the top and bottom openings of the cavity are uncovered, wherein the object comprises a finger, and wherein said processor causes said display to render an image, identifies via the touch position data a location within the image as being a selected location, and instructs the application program to process the selected location in response to detecting via the cavity insertion data a gesture of the finger in the cavity.

30. The user interface of claim 8, wherein said processor receives the output from said display and the output from said sensor substantially simultaneously.
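Claims 6 and 9 name four two-finger gestures (clockwise rotation, counter-clockwise rotation, pinch, spread) but not how to tell them apart. As a hedged illustration only, not the patented method: one simple classifier compares the distance between the two fingers before and after the gesture, and, when the distance is unchanged, the angle of the line joining them. It assumes a mathematical xy convention (y up, counter-clockwise angles positive); the function name and tolerance are invented for this sketch.

```python
import math

# Hypothetical sketch: classifying the four two-finger cavity gestures
# named in the claims from the fingers' (x, y) positions at the start
# and end of the gesture.

def classify_gesture(start, end, dist_tol=0.1):
    """start/end: ((x1, y1), (x2, y2)) finger positions in the xy-plane."""
    (a0, b0), (a1, b1) = start, end

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    d0, d1 = dist(a0, b0), dist(a1, b1)
    if d1 > d0 * (1 + dist_tol):
        return "spread"          # fingers moved apart
    if d1 < d0 * (1 - dist_tol):
        return "pinch"           # fingers moved together

    # Distance roughly unchanged: compare the angle of the line joining
    # the fingers before and after, wrapped to (-pi, pi].
    ang0 = math.atan2(b0[1] - a0[1], b0[0] - a0[0])
    ang1 = math.atan2(b1[1] - a1[1], b1[0] - a1[0])
    delta = (ang1 - ang0 + math.pi) % (2 * math.pi) - math.pi
    return "rotate_ccw" if delta > 0 else "rotate_cw"

print(classify_gesture(((0, 0), (1, 0)), ((0, 0), (2, 0))))  # spread
print(classify_gesture(((0, 0), (1, 0)), ((0, 0), (0, 1))))  # rotate_ccw
```

A real implementation would run this continuously over a stream of finger positions derived from the beam-intensity data, rather than on a single start/end pair.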