A user interface for a vehicle, including a steering wheel for the vehicle, including a grip, a sensor operable to detect objects at a plurality of locations along the grip, and an illuminator operable to illuminate different portions of the grip, a processor in communication with the sensor, with the illuminator and with a controller of vehicle functions, and a non-transitory computer readable medium storing instructions which cause the processor to identify, via the sensor, a location of a first object along the grip, to illuminate, via the illuminator, a portion of the grip, adjacent to the identified location, to further identify, via the sensor, a second object being at the illuminated portion of the grip, and to activate, via the controller, a vehicle function in response to the second object being at the illuminated portion of the grip.
Representative Claims
1. A user interface for a vehicle, comprising: a steering wheel for the vehicle, comprising: a grip; a sensor operable to detect objects at a plurality of locations along said grip; and an illuminator operable to illuminate different portions of said grip; a processor in communication with said sensor, with said illuminator and with a controller of vehicle functions; and a non-transitory computer readable medium storing instructions which cause said processor: to identify, via said sensor, a location of a first object along said grip, to illuminate, via said illuminator, a portion of said grip, adjacent to the identified location, to further identify, via said sensor, a second object being at the illuminated portion of said grip, and to activate, via the controller, a vehicle function in response to the second object being at the illuminated portion of said grip.

2. The user interface of claim 1 wherein the instructions cause said processor: to illuminate, via said illuminator, two portions of said grip, and to further identify, via said sensor, the second object as being at one of the illuminated portions.

3. The user interface of claim 2 wherein the instructions cause said processor to activate, via the controller, different vehicle functions in response to the second object being at one of the illuminated portions, according to which of the illuminated portions the second object is at.

4. The user interface of claim 1 wherein the instructions further cause said processor: to identify, via said sensor, that the first object has moved to a different location along said grip, and to move, via said illuminator, the illumination to a portion of said grip that is adjacent to the different location.

5. The user interface of claim 1 wherein the vehicle is an autonomous drive enabled vehicle, wherein the controller controls autonomous drive functions, and wherein said processor activates, via the controller, an autonomous drive function in response to the second object being at the illuminated portion of said grip.

6. A user interface method for a vehicle having a steering wheel that comprises (i) a grip, (ii) a sensor operable to detect objects at a plurality of locations along the grip, and (iii) an illuminator operable to illuminate different portions of the grip, the method comprising: identifying a location of a first object along the grip; illuminating a portion of the grip, adjacent to the identified location; further identifying a second object being at the illuminated portion of the grip; and activating a vehicle function in response to the second object being at the illuminated portion of the grip.

7. The user interface method of claim 6 wherein said illuminating comprises illuminating two portions of the grip, and wherein said further identifying identifies the second object as being at one of the illuminated portions.

8. The user interface method of claim 7 wherein said activating activates different vehicle functions in response to the second object being at one of the illuminated portions, according to which of the illuminated portions the second object is at.

9. The user interface method of claim 6 further comprising: identifying that the first object has moved to a different location along the grip; and moving the illumination to a portion of the grip that is adjacent to the different location.

10. The user interface method of claim 6 wherein the vehicle is an autonomous drive enabled vehicle, and wherein said activating activates an autonomous drive function in response to the second object being at the illuminated portion of the grip.
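The control flow claimed above (first touch detected on the grip, adjacent portions illuminated, a second touch on an illuminated portion activating a vehicle function) can be sketched as a minimal simulation. This is an illustrative model only, not an implementation from the patent; all names (`GripSensor`, `GripIlluminator`, `run_interface`) and the location-to-function mapping are hypothetical stand-ins.

```python
# Illustrative sketch of the claimed interaction flow (claims 1-3).
# The sensor and illuminator hardware are modeled as simple stubs;
# grip locations are represented as integer positions along the grip.

from dataclasses import dataclass, field

@dataclass
class GripSensor:
    """Stand-in for the sensor that detects objects along the grip."""
    detections: list = field(default_factory=list)  # queued detected locations

    def detect(self):
        return self.detections.pop(0) if self.detections else None

@dataclass
class GripIlluminator:
    """Stand-in for the illuminator; tracks which grip portions are lit."""
    lit_portions: set = field(default_factory=set)

    def illuminate(self, portions):
        self.lit_portions = set(portions)

def run_interface(sensor, illuminator, functions):
    """Claim-1 flow: first touch -> illuminate adjacent portions ->
    second touch on a lit portion -> activate the mapped vehicle function."""
    first = sensor.detect()              # location of first object (e.g. a thumb)
    if first is None:
        return None
    # Illuminate the two portions adjacent to the detected location (claim 2)
    illuminator.illuminate([first - 1, first + 1])
    second = sensor.detect()             # location of second object
    if second in illuminator.lit_portions:
        return functions.get(second)     # claim 3: function depends on portion
    return None

# Usage: thumb rests at grip location 5, lighting portions 4 and 6;
# a second tap at portion 6 activates the function mapped to it.
sensor = GripSensor(detections=[5, 6])
illum = GripIlluminator()
result = run_interface(sensor, illum,
                       functions={4: "decrease_speed", 6: "increase_speed"})
print(result)  # -> increase_speed
```

A tap outside the illuminated portions falls through and activates nothing, mirroring the claim language that activation occurs only "in response to the second object being at the illuminated portion."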
Patents cited by this patent (159)
Denlinger, Michael B. (Haworth, NJ), Ambient-light-responsive touch screen data input method and system.
Moran, Thomas P.; Chiu, Patrick; Van Melle, William; Kurtenbach, Gordon (CA), Apparatus and method for implementing visual animation illustrating results of interactive editing operations.
Gottfurcht, Elliot A.; Gottfurcht, Grant E.; Long, Albert Michel C., Apparatus and method of manipulating a region on a wireless device screen for viewing, zooming and scrolling internet content.
Conrad, Thomas J.; Moller, Elizabeth Ann Robinson, Computer system with graphical user interface including windows having an identifier within a control region on the dis.
Beaton, Brian Finlay (CA); Smith, Colin Donald (CA); Blouin, Francois (CA); Comeau, Guillaume (CA); Craddock, Arthur Julian Patterson (CA), Contextual gesture interface.
Omura, Katsuyuki (JP); Inoue, Takao (JP), Coordinate position inputting/detecting device, a method for inputting/detecting the coordinate position, and a display board system.
Gruaz, Daniel (Montigny le Bretonneux, FR); Marchal, Claude (Garancieres, FR), Device for detecting the position of a control member on a touch-sensitive pad.
Bishop, Edward H.; Connor, Alfred William; Cox, Aaron Roger; Crompton, Dennis; McDonald, Mark Gehres, Front cover assembly for a touch sensitive device.
Beernink, Ernest H. (San Carlos, CA); Foster, Gregg S. (Woodside, CA); Capps, Stephen P. (San Carlos, CA), Gesture sensitive buttons for graphical user interfaces.
Inagaki, Takeo; Saito, Junko; Ihara, Keigo; Sueyoshi, Takahiko; Yamaguchi, Yoshihiro; Gomi, Shinichiro, Information processing apparatus, method of displaying movement recognizable standby state, method of showing recognizable movement, method of displaying movement recognizing process, and program sto.
Cavallucci, Gilles; Sylvestre, Julien P.; Plantier, Philippe G., Method and a device for optically detecting the position of an object by measuring light reflected by that object.
Naughton, Patrick J.; Clanton, Charles H., III; Gosling, James A.; Warth, Chris; Palrang, Joseph M.; Frank, Edward H.; LaValle, David A.; Sheridan, R. Michael, Method and apparatus for improved graphical user interface having anthropomorphic characters.
Gauthey, Darryl; Farine, Pierre Andre, Method of input of a security code by means of a touch screen for access to a function, an apparatus or a given location, and device for implementing the same.
Heikkinen, Teuvo (FI); Piippo, Petri (FI); Wikberg, Harri (FI); Silfverberg, Miika (FI); Korhonen, Panu (FI); Kiljander, Harri (FI), Mobile station with touch input having automatic symbol magnification function.
McCharles, Randy; Morrison, Gerald; Worthington, Steve; Akitt, Trevor, Synchronization of cameras in camera-based touch system to enhance position determination of fast moving objects.
Eliasson, Jonas Ove Philip; Østergaard, Jens Wagenblast Stubbe, System and method of determining a position of a radiation scattering/reflecting element.
Gough, Michael L. (Ben Lomond, CA); Holloway, Bruce V. (Marina, CA), System for entering data into an active application currently running in the foreground by selecting an input icon in a.
Hube, Randall R. (Rochester, NY); Jacobs, Craig W. (Fairport, NY); Moon, William J. (Marion, NY), Touch screen user interface with expanding touch locations for a reprographic machine.
Morrison, Gerald D.; McCharles, Randy; Tseng Su, Scott Yu; Singh, Manvinder, Touch system and method for determining pointer contacts on a touch surface.
Chew, Chee H.; Bastiaanse, Elizabeth A.; Blum, Jeffrey R.; Coomer, Christen E.; Enomoto, Mark H.; Keyser, Greg A.; Parker, Kathryn L.; Vong, William H.; Zuberec, Sarah E., User interface for palm-sized computing devices and method and apparatus for displaying the same.