A steering wheel that identifies gestures performed on its surface, including a circular gripping element including a thumb-receiving notch disposed along its circumference, an array of light-based proximity sensors, mounted in the gripping element, that projects light beams through the notch radially outward from the gripping element, and detects light beams reflected back into the gripping element by a moving object at or near the notch, and a processor, coupled with the proximity sensor array, for determining polar angles along the circumference of the gripping element occupied by the object, responsive to light beams projected by the proximity sensor array and reflected back by the object being detected by the proximity sensor array.
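The abstract's core operation, mapping which proximity sensors detect a reflection to the polar angles occupied by the object, can be sketched as follows. This is a minimal illustration, not the patent's method: the sensor count, even angular spacing, and the contiguous-detection assumption are all hypothetical.

```python
# Hypothetical sketch: NUM_SENSORS proximity sensors evenly spaced around the
# rim, sensor i at polar angle i * (360 / NUM_SENSORS) degrees. The object's
# angular extent is taken from the set of sensors reporting a reflection.
# Wraparound at the 0/360-degree boundary is ignored for simplicity.
NUM_SENSORS = 72  # assumed density: one sensor every 5 degrees

def detected_polar_angles(detections):
    """Map indices of triggered sensors to (start, end, middle) polar angles."""
    step = 360.0 / NUM_SENSORS
    angles = sorted(i * step for i in detections)
    if not angles:
        return None
    start, end = angles[0], angles[-1]
    middle = (start + end) / 2.0
    return start, end, middle
```

With the assumed 5-degree spacing, sensors 10 through 12 detecting an object would place it between 50 and 60 degrees, with a middle polar angle of 55 degrees.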
Representative Claims
1. A steering wheel that identifies gestures performed on its surface, comprising: a circular gripping element comprising a thumb-receiving notch disposed along its circumference; an array of light-based proximity sensors, mounted in said gripping element, that projects light beams through said notch radially outward from said gripping element, and detects light beams reflected back into said gripping element by a moving object at or near said notch; and a processor, coupled with said proximity sensor array, for determining polar angles along the circumference of said gripping element occupied by the object, responsive to light beams projected by said proximity sensor array and reflected back by the object being detected by said proximity sensor array.

2. The steering wheel of claim 1 wherein said processor generates one or more data structures for the moving object, each data structure comprising: a) a time stamp, b) a polar angle at which the moving object starts, c) a polar angle at which the moving object ends, d) a middle polar angle of the moving object, and e) an assigned state being one of the group RECOGNIZED, UPDATED and ENDED, wherein said processor assigns the RECOGNIZED state to the data structure when the moving object is initially detected by said proximity sensor array, wherein said processor assigns the UPDATED state to the data structure when the moving object is further detected by said proximity sensor array after the initial detection, and wherein said processor assigns the ENDED state to the data structure when the moving object ceases to be detected by said proximity sensor array.

3. The steering wheel of claim 2 wherein said processor discriminates between large and small objects based on the polar angles in said data structure.

4. The steering wheel of claim 3 wherein said processor identifies a component gesture of the moving object as consisting of one of the types: (i) a small-object tap gesture, (ii) a small-object glide gesture, (iii) a small-object touch-and-hold gesture, (iv) a large-object tap gesture, and (v) a large-object grab gesture.

5. The steering wheel of claim 4 wherein said processor combines at least two component gestures of the same type, having assigned states ENDED, to further identify a compound gesture.

6. The steering wheel of claim 4 wherein said processor combines at least two component gestures of the same type, having assigned states RECOGNIZED and UPDATED, respectively, to further identify a compound gesture.

7. The steering wheel of claim 6 wherein the compound gesture is a double-tap gesture, combining two small-object tap gestures.

8. The steering wheel of claim 7 wherein said processor activates a cruise control application in response to identifying the double-tap gesture.

9. The steering wheel of claim 8 further comprising visible light illumination means, connected to said processor, for selectively illuminating portions of said notch, wherein the activated cruise control application illuminates a portion of said notch at which the double-tap gesture was performed, and wherein in response to a small-object tap gesture at one end of the illuminated portion, the cruise control application increases a cruise control speed by a first increment, and in response to a small-object tap gesture at the other end of the illuminated portion, the cruise control application decreases the cruise control speed by the first increment.

10. The steering wheel of claim 9 wherein in response to a small-object touch-and-hold gesture at one end of the illuminated portion, the cruise control application increases the cruise control speed by a second increment, different than the first increment, and in response to a small-object touch-and-hold gesture at the other end of the illuminated portion, the cruise control application decreases the cruise control speed by the second increment.

11. The steering wheel of claim 6 wherein the compound gesture is a swipe gesture, combining two small-object glide gestures having different middle polar angles.

12. The steering wheel of claim 11 wherein said processor is in communication with a telephone, and in response to a first swipe gesture said processor issues a command to the telephone to answer an incoming call.

13. The steering wheel of claim 12 wherein in response to a second swipe gesture said processor issues a command to the telephone to hang up the answered call.

14. The steering wheel of claim 13 wherein when the answered call continues for a time interval, said processor only issues the hang up command when the second swipe gesture is preceded by a large-object tap gesture.

15. The steering wheel of claim 12 further comprising visible light illumination means, connected to said processor, for selectively illuminating portions of said notch, wherein the processor illuminates at least a portion of said notch to indicate the incoming call.

16. The steering wheel of claim 6 wherein the compound gesture is a slide gesture, combining two consecutive large-object grab gestures having different middle polar angles.

17. The steering wheel of claim 16, further comprising visible light illumination means, connected to said processor, for selectively illuminating portions of said notch, wherein in response to a detected slide gesture the illuminated portions track the middle polar angles of the slide gesture.

18. The steering wheel of claim 6 wherein the compound gesture is an extended touch gesture, combining two consecutive small-object touch-and-hold gestures.

19. The steering wheel of claim 1 wherein said processor further determines a radial coordinate of the object, responsive to light beams projected by said proximity sensor array and reflected back by the object being detected by said proximity sensor array.

20. The steering wheel of claim 19 wherein said processor combines at least two component gestures of the same type, having similar middle polar angles and different radial coordinates, respectively, to further identify a radial swipe gesture.

21. The steering wheel of claim 19 wherein in response to the radial swipe gesture, the processor opens a menu of options for an active application on a display connected to the steering wheel.
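Claim 2's per-object data structure and its three-state lifecycle (RECOGNIZED on initial detection, UPDATED on further detection, ENDED when detection ceases) can be sketched as below. The field names, the `update`/`end` methods, and the size threshold used for claim 3's large/small discrimination are illustrative assumptions, not the patent's implementation.

```python
import time
from dataclasses import dataclass

RECOGNIZED, UPDATED, ENDED = "RECOGNIZED", "UPDATED", "ENDED"
LARGE_OBJECT_SPAN_DEG = 30.0  # assumed angular-span threshold for claim 3

@dataclass
class MovingObject:
    """Per-object record from claim 2: time stamp, angular extent, state."""
    timestamp: float
    start_angle: float
    end_angle: float
    middle_angle: float
    state: str = RECOGNIZED  # assigned on initial detection

    def update(self, start, end):
        """Further detection after the initial one: state becomes UPDATED."""
        self.timestamp = time.monotonic()
        self.start_angle, self.end_angle = start, end
        self.middle_angle = (start + end) / 2.0
        self.state = UPDATED

    def end(self):
        """Object ceased to be detected: state becomes ENDED."""
        self.state = ENDED

    def is_large(self):
        """Claim 3: discriminate large vs. small objects by angular span."""
        return (self.end_angle - self.start_angle) >= LARGE_OBJECT_SPAN_DEG
```

An object first seen spanning 50 to 60 degrees would start in the RECOGNIZED state as a small object; if a later scan widens it to 40 through 80 degrees, it transitions to UPDATED and is classified as large under the assumed threshold.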
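Claims 5 through 7 combine component gestures into compound gestures, such as two small-object taps forming a double-tap. A minimal combiner might look like the sketch below; the time window and angle tolerance are assumed values, and the real claims also condition on the components' assigned states, which this simplified check omits.

```python
# Hypothetical double-tap combiner for claim 7: two small-object tap gestures
# close together in both time and middle polar angle form a double-tap.
DOUBLE_TAP_WINDOW_S = 0.4   # assumed maximum time between the two taps
ANGLE_TOLERANCE_DEG = 10.0  # assumed maximum middle-polar-angle difference

def is_double_tap(tap1, tap2):
    """tap1, tap2: (timestamp, middle_angle) of two component tap gestures."""
    t1, a1 = tap1
    t2, a2 = tap2
    return (abs(t2 - t1) <= DOUBLE_TAP_WINDOW_S and
            abs(a2 - a1) <= ANGLE_TOLERANCE_DEG)
```

Under these assumed thresholds, taps 0.3 s apart at 55 and 58 degrees would combine into a double-tap (which, per claim 8, could then activate the cruise control application), while taps a full second apart would remain two separate component gestures.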
Patents cited by this patent (149)
Denlinger, Michael B. (Haworth, NJ), Ambient-light-responsive touch screen data input method and system.
Moran, Thomas P.; Chiu, Patrick; Van Melle, William; Kurtenbach, Gordon (CA), Apparatus and method for implementing visual animation illustrating results of interactive editing operations.
Gottfurcht, Elliot A.; Gottfurcht, Grant E.; Long, Albert Michel C., Apparatus and method of manipulating a region on a wireless device screen for viewing, zooming and scrolling internet content.
Conrad, Thomas J.; Moller, Elizabeth Ann Robinson, Computer system with graphical user interface including windows having an identifier within a control region on the dis.
Beaton, Brian Finlay (CA); Smith, Colin Donald (CA); Blouin, Francois (CA); Comeau, Guillaume (CA); Craddock, Arthur Julian Patterson (CA), Contextual gesture interface.
Omura, Katsuyuki (JP); Inoue, Takao (JP), Coordinate position inputting/detecting device, a method for inputting/detecting the coordinate position, and a display board system.
Gruaz, Daniel (Montigny le Bretonneux, FR); Marchal, Claude (Garancieres, FR), Device for detecting the position of a control member on a touch-sensitive pad.
Bishop, Edward H.; Connor, Alfred William; Cox, Aaron Roger; Crompton, Dennis; McDonald, Mark Gehres, Front cover assembly for a touch sensitive device.
Beernink, Ernest H. (San Carlos, CA); Foster, Gregg S. (Woodside, CA); Capps, Stephen P. (San Carlos, CA), Gesture sensitive buttons for graphical user interfaces.
Inagaki, Takeo; Saito, Junko; Ihara, Keigo; Sueyoshi, Takahiko; Yamaguchi, Yoshihiro; Gomi, Shinichiro, Information processing apparatus, method of displaying movement recognizable standby state, method of showing recognizable movement, method of displaying movement recognizing process, and program sto.
Cavallucci, Gilles; Sylvestre, Julien P.; Plantier, Philippe G., Method and a device for optically detecting the position of an object by measuring light reflected by that object.
Naughton, Patrick J.; Clanton, Charles H., III; Gosling, James A.; Warth, Chris; Palrang, Joseph M.; Frank, Edward H.; LaValle, David A.; Sheridan, R. Michael, Method and apparatus for improved graphical user interface having anthropomorphic characters.
Gauthey, Darryl; Farine, Pierre Andre, Method of input of a security code by means of a touch screen for access to a function, an apparatus or a given location, and device for implementing the same.
Heikkinen, Teuvo (FI); Piippo, Petri (FI); Wikberg, Harri (FI); Silfverberg, Miika (FI); Korhonen, Panu (FI); Kiljander, Harri (FI), Mobile station with touch input having automatic symbol magnification function.
McCharles, Randy; Morrison, Gerald; Worthington, Steve; Akitt, Trevor, Synchronization of cameras in camera-based touch system to enhance position determination of fast moving objects.
Eliasson, Jonas Ove Philip; Østergaard, Jens Wagenblast Stubbe, System and method of determining a position of a radiation scattering/reflecting element.
Gough, Michael L. (Ben Lomond, CA); Holloway, Bruce V. (Marina, CA), System for entering data into an active application currently running in the foreground by selecting an input icon in a.
Hube, Randall R. (Rochester, NY); Jacobs, Craig W. (Fairport, NY); Moon, William J. (Marion, NY), Touch screen user interface with expanding touch locations for a reprographic machine.
Morrison, Gerald D.; McCharles, Randy; Tseng Su, Scott Yu; Singh, Manvinder, Touch system and method for determining pointer contacts on a touch surface.
Chew, Chee H.; Bastiaanse, Elizabeth A.; Blum, Jeffrey R.; Coomer, Christen E.; Enomoto, Mark H.; Keyser, Greg A.; Parker, Kathryn L.; Vong, William H.; Zuberec, Sarah E., User interface for palm-sized computing devices and method and apparatus for displaying the same.
Watz, Christopher F.; King, Todd M.; Rouleau, James E., Retractable steering column assembly having lever, vehicle having retractable steering column assembly, and method.
Schulz, John F.; Bodtker, Joen C., Torque feedback system for a steer-by-wire vehicle, vehicle having steering column, and method of providing feedback in vehicle.