Light-based touch controls on a steering wheel and dashboard
IPC Classification
Country / Type
United States (US) Patent
Granted
International Patent Classification (IPC, 7th edition)
G06F-007/00
G06F-003/01
G06F-003/033
G06F-003/041
Application Number
US-0088458 (2013-11-25)
Registration Number
US-8775023 (2014-07-08)
Inventors / Address
Fröjdh, Gunnar Martin
Fellin, Simon
Eriksson, Thomas
Karlsson, John
Hedin, Maria
Berglind, Richard
Applicant / Address
Neonode Inc.
Agent / Address
Soquel Group LLC
Citation Information
Cited by: 8 patents / Cites: 144 patents
Abstract
A system for use in a vehicle, including a steering element situated opposite a driver seat in a vehicle, the steering element including a plurality of proximity sensors encased in the periphery of the steering element operable to detect hand gestures along the outer periphery of the steering element, an interactive deck housed in the vehicle, for providing at least one of radio broadcast, video broadcast, audio entertainment, video entertainment and navigational assistance in the vehicle, and a processor housed in the vehicle, coupled with the proximity sensors and the deck, operable to identify the hand gestures detected by the proximity sensors, and to control the deck in response to thus-identified hand gestures.
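The abstract describes a processor, coupled with the rim-mounted proximity sensors and the interactive deck, that identifies hand gestures and controls the deck in response. The following is a minimal sketch of that control loop in Python; every name (GestureEvent, Deck, the gesture labels) and every threshold is a hypothetical assumption introduced for illustration, not anything specified by the patent.

```python
# Illustrative sketch only: maps gestures identified from steering-wheel rim
# proximity sensors to commands on the interactive deck, as the abstract and
# claims 1, 13 and 15 describe. All names and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class GestureEvent:
    kind: str     # e.g. "slide_up", "slide_down", "tap" (hypothetical labels)
    speed: float  # slide speed along the rim, in zones/second (assumed unit)

class Deck:
    """Stand-in for the interactive deck (radio, media, navigation)."""
    def __init__(self):
        self.volume = 5
        self.muted = False

    def adjust_setting(self, delta: int):
        self.volume = max(0, min(10, self.volume + delta))

    def mute(self):
        self.muted = True

def on_gesture(deck: Deck, event: GestureEvent, wheel_angle_deg: float):
    # Claims 13/31/49/63/76: respond only while the steering element is not
    # substantially rotated. The 10-degree cutoff is an assumption.
    if abs(wheel_angle_deg) > 10.0:
        return
    # Claims 15/33/51/57: a sudden quick slide mutes the deck.
    if event.kind.startswith("slide") and event.speed > 5.0:
        deck.mute()
    # Claim 1: an upward slide increases, and a downward slide decreases,
    # the currently selected adjustable setting.
    elif event.kind == "slide_up":
        deck.adjust_setting(+1)
    elif event.kind == "slide_down":
        deck.adjust_setting(-1)

deck = Deck()
on_gesture(deck, GestureEvent("slide_up", speed=1.0), wheel_angle_deg=2.0)
assert deck.volume == 6 and not deck.muted
```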
Representative Claims
1. A system for use in a vehicle, comprising: a steering element situated opposite a driver seat in a vehicle, the steering element comprising a plurality of proximity sensors encased in the periphery of the steering element operable to detect hand gestures along the outer periphery of the steering element; an interactive deck housed in the vehicle, for providing at least one of radio broadcast, video broadcast, audio entertainment, video entertainment and navigational assistance in the vehicle; and a processor housed in the vehicle, coupled with said proximity sensors and said deck, operable (i) to identify an upward hand gesture detected by said proximity sensors, and to increase an adjustable setting for said deck in response to the thus-identified upward hand gesture, and (ii) to identify a downward hand gesture detected by said proximity sensors, and to decrease the adjustable setting in response to the thus-identified downward hand gesture.
2. The system of claim 1 wherein said deck operates in accordance with a plurality of features having adjustable settings, wherein said processor is operative to adjust a setting of a currently selected one of said features in response to identifying hand slide gestures, and wherein said processor is operative to change the currently selected feature in response to identifying at least one tap gesture on the outer periphery of said steering element.
3. The system of claim 2 wherein said deck comprises a display, and wherein, when said processor changes the selected feature, said processor also renders a graphic indicating the newly selected feature on said display.
4. The system of claim 1 wherein the adjustable settings are members of the group consisting of volume, radio channel, track in a music library, web page, map and picture.
5. The system of claim 1 further comprising a second plurality of proximity sensors encased in said steering element and facing the driver seat, for detecting hand wave gestures between said steering element and a driver, and wherein said processor is operable to identify the hand wave gestures detected by said second proximity sensors, and to change a mode of said deck in response to the thus-identified hand wave gestures.
6. The system of claim 5 wherein the vehicle comprises an airbag having an adjustable inflation setting, wherein said processor determines a distance between the driver and the steering element based on outputs from said second proximity sensors, and wherein said processor adjusts the inflation setting for the airbag based on the thus-determined distance.
7. The system of claim 1, wherein said processor identifies a series of objects detected by said proximity sensors as concurrently touching said steering element, as being respective fingers on one hand, wherein, when said proximity sensors detect that one of the objects is lifted from said steering element, said processor determines which of the fingers was lifted based on a relative location of the lifted object within the series of objects, and wherein said processor performs different functions in response to identifying gestures performed by different thus-determined ones of the fingers.
8. The system of claim 1 wherein said steering element further comprises: a cavity; an array of invisible-light emitters connected to said processor operable to project invisible light beams across said cavity; and an array of light detectors connected to said processor operable to detect the invisible light beams projected by said invisible-light emitters, and to detect gestures inside said cavity that interrupt the invisible light beams projected by said invisible-light emitters, and wherein said processor is operable to identify the gestures inside said cavity detected by said light detectors, and to control said deck in response to the thus-identified gestures inside said cavity.
9. The system of claim 8, further comprising: multiple arrays of invisible-light emitters connected to said processor and operable to project invisible light beams across different geometric planes inside said cavity; and multiple arrays of light detectors connected to said processor and operable to detect the invisible light beams projected by said invisible-light emitters, and to detect wave gestures across multiple geometric planes inside said cavity that interrupt the invisible light beams projected by said invisible-light emitters, and wherein said processor is operable to identify the wave gestures inside said cavity detected by said arrays of light detectors, and to control said deck in response to the thus-identified wave gestures.
10. The system of claim 8 further comprising visible-light emitters housed in said steering element and connected to said processor, and wherein said processor is operable to activate at least one of said visible-light emitters in response to said light detectors detecting that the invisible light beams projected by said invisible-light emitters are being interrupted.
11. The system of claim 8, wherein said deck comprises a display, and wherein said processor is operable to enlarge an image on said display in response to identifying a multi-finger spread gesture inside said cavity.
12. The system of claim 11, wherein said processor is operable to pan the image on said display in response to identifying a one-finger translation gesture inside said cavity.
13. The system of claim 1 wherein said processor controls said deck in response to the identified hand gestures, only when said steering element is not substantially rotated.
14. The system of claim 1 further comprising a wireless phone interface for receiving incoming phone calls, and wherein said processor is operable to reject an incoming phone call in response to identifying a single tap gesture on the outer perimeter of said steering element, and to accept an incoming call in response to identifying a double tap gesture on the outer perimeter of said steering element.
15. The system of claim 1 wherein said processor is operable to mute said deck in response to identifying a sudden quick hand slide gesture along the outer periphery of said steering element.
16. The system of claim 1, wherein the periphery of said steering element is separated into virtual input zones, wherein said processor is operable to perform a different control command on said deck in response to identifying hand gestures in different input zones, and to render an image of said steering element on a display mounted in the vehicle, the image indicating which control command is associated with each input zone, in response to identifying at least one tap gesture on the outer periphery of said steering element.
17. The system of claim 1, wherein said processor is further operable to open a gas tank in the vehicle, open a trunk in the vehicle, lock doors of the vehicle, and close windows in the vehicle, in response to identifying respective hand gestures.
18. The system of claim 1, wherein said steering element comprises at least one interactive display embedded therein, wherein the vehicle comprises a vehicle display comprising either a head-down display or a head-up display, and wherein information displayed on said at least one interactive display is replicated on the vehicle display.
19. The system of claim 1 wherein said interactive deck comprises: a display; and a frame surrounding said display comprising a second plurality of proximity sensors operable to detect hand gestures above the frame, wherein said processor is operable to present a graphic representing a group of related functions in a corner of said display, and to identify a diagonal hand wave gesture above said frame and beginning above the corner of said display detected by said second plurality of proximity sensors, and to translate the graphic across said display and thereby reveal icons for the related functions in response to the thus-identified diagonal hand wave gesture.
20. The system of claim 19, wherein said processor is operable to present multiple graphics in respective corners of said display.
21. A system for use in a vehicle, comprising: a steering element situated opposite a driver seat in a vehicle, the steering element comprising: a first plurality of proximity sensors encased in the periphery of the steering element operable to detect hand gestures along the outer periphery of the steering element; and a second plurality of proximity sensors encased in the steering element and facing the driver seat, for detecting hand wave gestures between the steering element and a driver; an interactive deck housed in the vehicle, for providing at least one of radio broadcast, video broadcast, audio entertainment, video entertainment and navigational assistance in the vehicle; an airbag having an adjustable inflation setting; and a processor housed in the vehicle, coupled with said proximity sensors and said deck, operable to (i) identify the hand gestures detected by said proximity sensors, and to control said deck in response to thus-identified hand gestures, (ii) to identify the hand wave gestures detected by said second proximity sensors, and to change a mode of said deck in response to the thus-identified hand wave gestures, and (iii) to determine a distance between the driver and the steering element based on outputs from said second proximity sensors, and to adjust the inflation setting for the airbag based on the thus-determined distance.
22. The system of claim 21 wherein said deck operates in accordance with a plurality of features having adjustable settings, wherein said processor is operative to adjust a setting of a currently selected one of said features in response to identifying hand slide gestures, and wherein said processor is operative to change the currently selected feature in response to identifying at least one tap gesture on the outer periphery of said steering element.
23. The system of claim 22 wherein said deck comprises a display, and wherein, when said processor changes the selected feature, said processor also renders a graphic indicating the newly selected feature on said display.
24. The system of claim 22 wherein the adjustable settings are members of the group consisting of volume, radio channel, track in a music library, web page, map and picture.
25. The system of claim 21 wherein said processor identifies a series of objects detected by said proximity sensors as concurrently touching said steering element, as being respective fingers on one hand, wherein, when said proximity sensors detect that one of the objects is lifted from said steering element, said processor determines which of the fingers was lifted based on a relative location of the lifted object within the series of objects, and wherein said processor performs different functions in response to identifying gestures performed by different thus-determined ones of the fingers.
26. The system of claim 21 wherein said steering element further comprises: a cavity; an array of invisible-light emitters connected to said processor operable to project invisible light beams across said cavity; and an array of light detectors connected to said processor operable to detect the invisible light beams projected by said invisible-light emitters, and to detect gestures inside said cavity that interrupt the invisible light beams projected by said invisible-light emitters, and wherein said processor is operable to identify the gestures inside said cavity detected by said light detectors, and to control said deck in response to the thus-identified gestures inside said cavity.
27. The system of claim 26 further comprising: multiple arrays of invisible-light emitters connected to said processor and operable to project invisible light beams across different geometric planes inside said cavity; and multiple arrays of light detectors connected to said processor and operable to detect the invisible light beams projected by said invisible-light emitters, and to detect wave gestures across multiple geometric planes inside said cavity that interrupt the invisible light beams projected by said invisible-light emitters, and wherein said processor is operable to identify the wave gestures inside said cavity detected by said arrays of light detectors, and to control said deck in response to the thus-identified wave gestures.
28. The system of claim 26 further comprising visible-light emitters housed in said steering element and connected to said processor, and wherein said processor is operable to activate at least one of said visible-light emitters in response to said light detectors detecting that the invisible light beams projected by said invisible-light emitters are being interrupted.
29. The system of claim 26 wherein said deck comprises a display, and wherein said processor is operable to enlarge an image on said display in response to identifying a multi-finger spread gesture inside said cavity.
30. The system of claim 29 wherein said processor is operable to pan the image on said display in response to identifying a one-finger translation gesture inside said cavity.
31. The system of claim 21 wherein said processor controls said deck in response to the identified hand gestures, only when said steering element is not substantially rotated.
32. The system of claim 21 further comprising a wireless phone interface for receiving incoming phone calls, and wherein said processor is operable to reject an incoming phone call in response to identifying a single tap gesture on the outer perimeter of said steering element, and to accept an incoming call in response to identifying a double tap gesture on the outer perimeter of said steering element.
33. The system of claim 21 wherein said processor is operable to mute said deck in response to identifying a sudden quick hand slide gesture along the outer periphery of said steering element.
34. The system of claim 21 wherein the periphery of said steering element is separated into virtual input zones, wherein said processor is operable to perform a different control command on said deck in response to identifying hand gestures in different input zones, and to render an image of said steering element on a display mounted in the vehicle, the image indicating which control command is associated with each input zone, in response to identifying at least one tap gesture on the outer periphery of said steering element.
35. The system of claim 21 wherein said processor is further operable to open a gas tank in the vehicle, open a trunk in the vehicle, lock doors of the vehicle, and close windows in the vehicle, in response to identifying respective hand gestures.
36. The system of claim 21 wherein said steering element comprises at least one interactive display embedded therein, wherein the vehicle comprises a vehicle display comprising either a head-down display or a head-up display, and wherein information displayed on said at least one interactive display is replicated on the vehicle display.
37. The system of claim 21 wherein said interactive deck comprises: a display; and a frame surrounding said display comprising a second plurality of proximity sensors operable to detect hand gestures above the frame, wherein said processor is operable to present a graphic representing a group of related functions in a corner of said display, and to identify a diagonal hand wave gesture above said frame and beginning above the corner of said display detected by said second plurality of proximity sensors, and to translate the graphic across said display and thereby reveal icons for the related functions in response to the thus-identified diagonal hand wave gesture.
38. The system of claim 37 wherein said processor is operable to present multiple graphics in respective corners of said display.
39. A system for use in a vehicle, comprising: a steering element situated opposite a driver seat in a vehicle, the steering element comprising: a plurality of proximity sensors encased in the periphery of the steering element operable to detect hand gestures along the outer periphery of the steering element; a cavity; an array of invisible-light emitters operable to project invisible light beams across said cavity; and an array of light detectors operable to detect the invisible light beams projected by said invisible-light emitters, and to detect gestures inside said cavity that interrupt the invisible light beams projected by said invisible-light emitters; an interactive deck housed in the vehicle, for providing at least one of radio broadcast, video broadcast, audio entertainment, video entertainment and navigational assistance in the vehicle; and a processor housed in the vehicle, coupled with said proximity sensors and said deck, operable (i) to identify the hand gestures detected by said proximity sensors, and to control said deck in response to thus-identified hand gestures, (ii) to identify the gestures inside said cavity detected by said light detectors, and to control said deck in response to the thus-identified gestures inside said cavity.
40. The system of claim 39 wherein said deck operates in accordance with a plurality of features having adjustable settings, wherein said processor is operative to adjust a setting of a currently selected one of said features in response to identifying hand slide gestures, and wherein said processor is operative to change the currently selected feature in response to identifying at least one tap gesture on the outer periphery of said steering element.
41. The system of claim 40 wherein said deck comprises a display, and wherein, when said processor changes the selected feature, said processor also renders a graphic indicating the newly selected feature on said display.
42. The system of claim 40 wherein the adjustable settings are members of the group consisting of volume, radio channel, track in a music library, web page, map and picture.
43. The system of claim 39 further comprising a second plurality of proximity sensors encased in said steering element and facing the driver seat, for detecting hand wave gestures between said steering element and a driver, and wherein said processor is operable to identify the hand wave gestures detected by said second proximity sensors, and to change a mode of said deck in response to the thus-identified hand wave gestures.
44. The system of claim 39 wherein said processor identifies a series of objects detected by said proximity sensors as concurrently touching said steering element, as being respective fingers on one hand, wherein, when said proximity sensors detect that one of the objects is lifted from said steering element, said processor determines which of the fingers was lifted based on a relative location of the lifted object within the series of objects, and wherein said processor performs different functions in response to identifying gestures performed by different thus-determined ones of the fingers.
45. The system of claim 39 further comprising: multiple arrays of invisible-light emitters connected to said processor and operable to project invisible light beams across different geometric planes inside said cavity; and multiple arrays of light detectors connected to said processor and operable to detect the invisible light beams projected by said invisible-light emitters, and to detect wave gestures across multiple geometric planes inside said cavity that interrupt the invisible light beams projected by said invisible-light emitters, and wherein said processor is operable to identify the wave gestures inside said cavity detected by said arrays of light detectors, and to control said deck in response to the thus-identified wave gestures.
46. The system of claim 39 further comprising visible-light emitters housed in said steering element and connected to said processor, and wherein said processor is operable to activate at least one of said visible-light emitters in response to said light detectors detecting that the invisible light beams projected by said invisible-light emitters are being interrupted.
47. The system of claim 39 wherein said deck comprises a display, and wherein said processor is operable to enlarge an image on said display in response to identifying a multi-finger spread gesture inside said cavity.
48. The system of claim 47 wherein said processor is operable to pan the image on said display in response to identifying a one-finger translation gesture inside said cavity.
49. The system of claim 39 wherein said processor controls said deck in response to the identified hand gestures, only when said steering element is not substantially rotated.
50. The system of claim 39 further comprising a wireless phone interface for receiving incoming phone calls, and wherein said processor is operable to reject an incoming phone call in response to identifying a single tap gesture on the outer perimeter of said steering element, and to accept an incoming call in response to identifying a double tap gesture on the outer perimeter of said steering element.
51. The system of claim 39 wherein said processor is operable to mute said deck in response to identifying a sudden quick hand slide gesture along the outer periphery of said steering element.
52. The system of claim 39 wherein the periphery of said steering element is separated into virtual input zones, wherein said processor is operable to perform a different control command on said deck in response to identifying hand gestures in different input zones, and to render an image of said steering element on a display mounted in the vehicle, the image indicating which control command is associated with each input zone, in response to identifying at least one tap gesture on the outer periphery of said steering element.
53. The system of claim 39 wherein said processor is further operable to open a gas tank in the vehicle, open a trunk in the vehicle, lock doors of the vehicle, and close windows in the vehicle, in response to identifying respective hand gestures.
54. The system of claim 39 wherein said steering element comprises at least one interactive display embedded therein, wherein the vehicle comprises a vehicle display comprising either a head-down display or a head-up display, and wherein information displayed on said at least one interactive display is replicated on the vehicle display.
55. The system of claim 39 wherein said interactive deck comprises: a display; and a frame surrounding said display comprising a second plurality of proximity sensors operable to detect hand gestures above the frame, wherein said processor is operable to present a graphic representing a group of related functions in a corner of said display, and to identify a diagonal hand wave gesture above said frame and beginning above the corner of said display detected by said second plurality of proximity sensors, and to translate the graphic across said display and thereby reveal icons for the related functions in response to the thus-identified diagonal hand wave gesture.
56. The system of claim 55 wherein said processor is operable to present multiple graphics in respective corners of said display.
57. A system for use in a vehicle, comprising: a steering element situated opposite a driver seat in a vehicle, the steering element comprising a plurality of proximity sensors encased in the periphery of the steering element operable to detect hand gestures along the outer periphery of the steering element; an interactive deck housed in the vehicle, for providing at least one of radio broadcast, video broadcast, audio entertainment, video entertainment and navigational assistance in the vehicle; and a processor housed in the vehicle, coupled with said proximity sensors and said deck, operable to identify a sudden quick hand slide gesture along the outer periphery of said steering element detected by said proximity sensors, and to mute said deck in response to the thus-identified sudden quick hand slide gesture.
58. The system of claim 57 wherein said deck operates in accordance with a plurality of features having adjustable settings, wherein said processor is operative to adjust a setting of a currently selected one of said features in response to identifying hand slide gestures, and wherein said processor is operative to change the currently selected feature in response to identifying at least one tap gesture on the outer periphery of said steering element.
59. The system of claim 58 wherein said deck comprises a display, and wherein, when said processor changes the selected feature, said processor also renders a graphic indicating the newly selected feature on said display.
60. The system of claim 58 wherein the adjustable settings are members of the group consisting of volume, radio channel, track in a music library, web page, map and picture.
61. The system of claim 57 further comprising a second plurality of proximity sensors encased in said steering element and facing the driver seat, for detecting hand wave gestures between said steering element and a driver, and wherein said processor is operable to identify the hand wave gestures detected by said second proximity sensors, and to change a mode of said deck in response to the thus-identified hand wave gestures.
62. The system of claim 57 wherein said processor identifies a series of objects detected by said proximity sensors as concurrently touching said steering element, as being respective fingers on one hand, wherein, when said proximity sensors detect that one of the objects is lifted from said steering element, said processor determines which of the fingers was lifted based on a relative location of the lifted object within the series of objects, and wherein said processor performs different functions in response to identifying gestures performed by different thus-determined ones of the fingers.
63. The system of claim 57 wherein said processor controls said deck in response to the identified hand gestures, only when said steering element is not substantially rotated.
64. The system of claim 57 further comprising a wireless phone interface for receiving incoming phone calls, and wherein said processor is operable to reject an incoming phone call in response to identifying a single tap gesture on the outer perimeter of said steering element, and to accept an incoming call in response to identifying a double tap gesture on the outer perimeter of said steering element.
65. The system of claim 57 wherein the periphery of said steering element is separated into virtual input zones, wherein said processor is operable to perform a different control command on said deck in response to identifying hand gestures in different input zones, and to render an image of said steering element on a display mounted in the vehicle, the image indicating which control command is associated with each input zone, in response to identifying at least one tap gesture on the outer periphery of said steering element.
66. The system of claim 57 wherein said processor is further operable to open a gas tank in the vehicle, open a trunk in the vehicle, lock doors of the vehicle, and close windows in the vehicle, in response to identifying respective hand gestures.
67. The system of claim 57 wherein said steering element comprises at least one interactive display embedded therein, wherein the vehicle comprises a vehicle display comprising either a head-down display or a head-up display, and wherein information displayed on said at least one interactive display is replicated on the vehicle display.
68. The system of claim 57 wherein said interactive deck comprises: a display; and a frame surrounding said display comprising a second plurality of proximity sensors operable to detect hand gestures above the frame, wherein said processor is operable to present a graphic representing a group of related functions in a corner of said display, and to identify a diagonal hand wave gesture above said frame and beginning above the corner of said display detected by said second plurality of proximity sensors, and to translate the graphic across said display and thereby reveal icons for the related functions in response to the thus-identified diagonal hand wave gesture.
69. The system of claim 68 wherein said processor is operable to present multiple graphics in respective corners of said display.
70. A system for use in a vehicle, comprising: a steering element situated opposite a driver seat in a vehicle, the steering element comprising a plurality of proximity sensors encased in the periphery of the steering element operable to detect hand gestures along the outer periphery of the steering element; an interactive deck housed in the vehicle, for providing at least one of radio broadcast, video broadcast, audio entertainment, video entertainment and navigational assistance in the vehicle; and a processor housed in the vehicle, coupled with said proximity sensors and said deck, operable to identify the hand gestures detected by said proximity sensors, and to open a gas tank in the vehicle, open a trunk in the vehicle, lock doors of the vehicle, and close windows in the vehicle, in response to respective thus-identified hand gestures.
71. The system of claim 70 wherein said deck operates in accordance with a plurality of features having adjustable settings, wherein said processor is operative to adjust a setting of a currently selected one of said features in response to identifying hand slide gestures, and wherein said processor is operative to change the currently selected feature in response to identifying at least one tap gesture on the outer periphery of said steering element.
72. The system of claim 71 wherein said deck comprises a display, and wherein, when said processor changes the selected feature, said processor also renders a graphic indicating the newly selected feature on said display.
73. The system of claim 71 wherein the adjustable settings are members of the group consisting of volume, radio channel, track in a music library, web page, map and picture.
74. The system of claim 70 further comprising a second plurality of proximity sensors encased in said steering element and facing the driver seat, for detecting hand wave gestures between said steering element and a driver, and wherein said processor is operable to identify the hand wave gestures detected by said second proximity sensors, and to change a mode of said deck in response to the thus-identified hand wave gestures.
75. The system of claim 70 wherein said processor identifies a series of objects detected by said proximity sensors as concurrently touching said steering element, as being respective fingers on one hand, wherein, when said proximity sensors detect that one of the objects is lifted from said steering element, said processor determines which of the fingers was lifted based on a relative location of the lifted object within the series of objects, and wherein said processor performs different functions in response to identifying gestures performed by different thus-determined ones of the fingers.
76. The system of claim 70 wherein said processor controls said deck in response to the identified hand gestures, only when said steering element is not substantially rotated.
77. The system of claim 70 further comprising a wireless phone interface for receiving incoming phone calls, and wherein said processor is operable to reject an incoming phone call in response to identifying a single tap gesture on the outer perimeter of said steering element, and to accept an incoming call in response to identifying a double tap gesture on the outer perimeter of said steering element.
78. The system of claim 70 wherein the periphery of said steering element is separated into virtual input zones, wherein said processor is operable to perform a different control command on said deck in response to identifying hand gestures in different input zones, and to render an image of said steering element on a display mounted in the vehicle, the image indicating which control command is associated with each input zone, in response to identifying at least one tap gesture on the outer periphery of said steering element.
79. The system of claim 70 wherein said steering element comprises at least one interactive display embedded therein, wherein the vehicle comprises a vehicle display comprising either a head-down display or a head-up display, and wherein information displayed on said at least one interactive display is replicated on the vehicle display.
80. The system of claim 70 wherein said interactive deck comprises: a display; and a frame surrounding said display comprising a second plurality of proximity sensors operable to detect hand gestures above the frame, wherein said processor is operable to present a graphic representing a group of related functions in a corner of said display, and to identify a diagonal hand wave gesture above said frame and beginning above the corner of said display detected by said second plurality of proximity sensors, and to translate the graphic across said display and thereby reveal icons for the related functions in response to the thus-identified diagonal hand wave gesture.
81. The system of claim 80 wherein said processor is operable to present multiple graphics in respective corners of said display.
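Claims 8 to 12, 26 to 30, 39 and 45 to 48 above describe a light-based detector inside a cavity of the steering element: arrays of invisible-light emitters project beams across the cavity, detector arrays report which beams an object interrupts, and the processor turns the blocked beams into touch positions and gestures such as a multi-finger spread (zoom). The following is a minimal sketch of that idea under simplifying assumptions: a rectangular grid of beams on two perpendicular axes (the claims only require beams across different geometric planes), and invented function names (blocked_runs, touch_points, is_spread_gesture) that do not come from the patent.

```python
# Hypothetical sketch of beam-interruption sensing in the steering-element
# cavity. Each axis is modeled as a list of booleans: True means that beam
# is currently blocked by an object. Geometry and names are assumptions.

def blocked_runs(blocked: list[bool]) -> list[tuple[int, int]]:
    """Group consecutive blocked beams into runs; each run is one object."""
    runs, start = [], None
    for i, b in enumerate(blocked):
        if b and start is None:
            start = i
        elif not b and start is not None:
            runs.append((start, i - 1))
            start = None
    if start is not None:
        runs.append((start, len(blocked) - 1))
    return runs

def touch_points(x_blocked: list[bool], y_blocked: list[bool]):
    """Estimate (x, y) centers of objects from blocked beams on both axes."""
    xs = [(a + b) / 2 for a, b in blocked_runs(x_blocked)]
    ys = [(a + b) / 2 for a, b in blocked_runs(y_blocked)]
    # Pairing x-runs with y-runs is ambiguous when several objects are
    # present; a real implementation would disambiguate, e.g. with beams
    # along additional planes as claim 9 suggests.
    return [(x, y) for x, y in zip(xs, ys)]

def is_spread_gesture(prev_pts, cur_pts, min_growth=2.0) -> bool:
    """Claims 11/29/47: two touch points moving apart indicate a spread
    (zoom) gesture; min_growth is an assumed threshold in beam pitches."""
    if len(prev_pts) != 2 or len(cur_pts) != 2:
        return False
    dist = lambda p, q: ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    return dist(*cur_pts) - dist(*prev_pts) > min_growth

pts = touch_points([False, True, True, False, False, True, False],
                   [True, True, False, False, True, True, False])
# -> [(1.5, 0.5), (5.0, 4.5)]: two estimated touch centers inside the cavity
```

Under this reading, a wave gesture across multiple planes (claims 9, 27, 45) would be recognized the same way, by observing the order in which runs appear and disappear on successive planes over time.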
Patents cited by this patent (144)
Denlinger, Michael B. (Haworth, NJ), Ambient-light-responsive touch screen data input method and system.
Moran, Thomas P.; Chiu, Patrick; Van Melle, William; Kurtenbach, Gordon (CA), Apparatus and method for implementing visual animation illustrating results of interactive editing operations.
Gottfurcht, Elliot A.; Gottfurcht, Grant E.; Long, Albert Michel C., Apparatus and method of manipulating a region on a wireless device screen for viewing, zooming and scrolling internet content.
Conrad, Thomas J.; Moller, Elizabeth Ann Robinson, Computer system with graphical user interface including windows having an identifier within a control region on the dis.
Beaton, Brian Finlay (CA); Smith, Colin Donald (CA); Blouin, Francois (CA); Comeau, Guillaume (CA); Craddock, Arthur Julian Patterson (CA), Contextual gesture interface.
Omura, Katsuyuki (JP); Inoue, Takao (JP), Coordinate position inputting/detecting device, a method for inputting/detecting the coordinate position, and a display board system.
Gruaz, Daniel (Montigny le Bretonneux, FR); Marchal, Claude (Garancieres, FR), Device for detecting the position of a control member on a touch-sensitive pad.
Bishop, Edward H.; Connor, Alfred William; Cox, Aaron Roger; Crompton, Dennis; McDonald, Mark Gehres, Front cover assembly for a touch sensitive device.
Beernink, Ernest H. (San Carlos, CA); Foster, Gregg S. (Woodside, CA); Capps, Stephen P. (San Carlos, CA), Gesture sensitive buttons for graphical user interfaces.
Inagaki, Takeo; Saito, Junko; Ihara, Keigo; Sueyoshi, Takahiko; Yamaguchi, Yoshihiro; Gomi, Shinichiro, Information processing apparatus, method of displaying movement recognizable standby state, method of showing recognizable movement, method of displaying movement recognizing process, and program sto.
Cavallucci, Gilles; Sylvestre, Julien P.; Plantier, Philippe G., Method and a device for optically detecting the position of an object by measuring light reflected by that object.
Naughton, Patrick J.; Clanton, Charles H., III; Gosling, James A.; Warth, Chris; Palrang, Joseph M.; Frank, Edward H.; LaValle, David A.; Sheridan, R. Michael, Method and apparatus for improved graphical user interface having anthropomorphic characters.
Gauthey, Darryl; Farine, Pierre Andre, Method of input of a security code by means of a touch screen for access to a function, an apparatus or a given location, and device for implementing the same.
Heikkinen, Teuvo (FI); Piippo, Petri (FI); Wikberg, Harri (FI); Silfverberg, Miika (FI); Korhonen, Panu (FI); Kiljander, Harri (FI), Mobile station with touch input having automatic symbol magnification function.
McCharles, Randy; Morrison, Gerald; Worthington, Steve; Akitt, Trevor, Synchronization of cameras in camera-based touch system to enhance position determination of fast moving objects.
Eliasson, Jonas Ove Philip; Østergaard, Jens Wagenblast Stubbe, System and method of determining a position of a radiation scattering/reflecting element.
Gough, Michael L. (Ben Lomond, CA); Holloway, Bruce V. (Marina, CA), System for entering data into an active application currently running in the foreground by selecting an input icon in a.
Hube, Randall R. (Rochester, NY); Jacobs, Craig W. (Fairport, NY); Moon, William J. (Marion, NY), Touch screen user interface with expanding touch locations for a reprographic machine.
Morrison, Gerald D.; McCharles, Randy; Tseng Su, Scott Yu; Singh, Manvinder, Touch system and method for determining pointer contacts on a touch surface.
Chew, Chee H.; Bastiaanse, Elizabeth A.; Blum, Jeffrey R.; Coomer, Christen E.; Enomoto, Mark H.; Keyser, Greg A.; Parker, Kathryn L.; Vong, William H.; Zuberec, Sarah E., User interface for palm-sized computing devices and method and apparatus for displaying the same.
Hanuschak, Gregor Z., Method and apparatus for interacting with a personal computing device such as a smart phone using portable and self-contained hardware that is adapted for use in a motor vehicle.