A computer readable medium storing instructions which cause a processor to generate data structures for an object moving along the perimeter of a curved touch-sensitive user input device, each data structure corresponding to a gesture and including a time stamp, polar angles at which the object starts and ends, a middle polar angle of the object, and an assigned state being one of the group RECOGNIZED, UPDATED and ENDED, wherein the instructions cause the processor to assign the RECOGNIZED state to the data structure when the moving object is initially detected on the perimeter of the device, to assign the UPDATED state to the data structure when the moving object is further detected on the perimeter of the device after the initial detection, and to assign the ENDED state to the data structure when the moving object ceases to be detected on the perimeter of the device.
Representative Claims
1. A non-transitory computer readable medium storing instructions which, when executed by a processor connected to a curved touch-sensitive user input device, cause the processor to generate one or more data structures for an object moving along the perimeter of the curved user input device, each data structure corresponding to a component gesture and comprising:
   a) a time stamp,
   b) a polar angle at which the moving object starts,
   c) a polar angle at which the moving object ends,
   d) a middle polar angle of the moving object, and
   e) an assigned state being one of the group RECOGNIZED, UPDATED and ENDED,
   wherein the instructions cause the processor to assign the RECOGNIZED state to the data structure when the moving object is initially detected on the perimeter of the user input device, to assign the UPDATED state to the data structure when the moving object is further detected on the perimeter of the user input device after the initial detection, and to assign the ENDED state to the data structure when the moving object ceases to be detected on the perimeter of the user input device.

2. The computer readable medium of claim 1 wherein the instructions cause the processor to discriminate between large and small objects based on the polar angles in the data structure.

3. The computer readable medium of claim 2 wherein the instructions cause the processor to identify a component gesture of the moving object as consisting of one of the types: (i) a small-object tap gesture, (ii) a small-object glide gesture, (iii) a small-object touch-and-hold gesture, (iv) a large-object tap gesture, and (v) a large-object grab gesture.

4. The computer readable medium of claim 3 wherein the instructions cause the processor to combine at least two component gestures of the same type, having assigned states ENDED, to further identify a compound gesture.

5. The computer readable medium of claim 4 wherein the compound gesture is a double-tap gesture, combining two small-object tap gestures.

6. The computer readable medium of claim 3 wherein the instructions cause the processor to combine at least two component gestures of the same type, having assigned states RECOGNIZED and UPDATED, to further identify a compound gesture.

7. The computer readable medium of claim 6 wherein the compound gesture is a swipe gesture, combining two small-object glide gestures having different middle polar angles.

8. The computer readable medium of claim 6 wherein the compound gesture is a slide gesture, combining two consecutive large-object grab gestures having different middle polar angles.

9. The computer readable medium of claim 8, wherein the user input device comprises visible light illumination means connected to the processor, and the instructions cause the processor to selectively illuminate locations on the user input device perimeter in accordance with the polar angles of the detected slide gesture.

10. The computer readable medium of claim 6 wherein the compound gesture is an extended touch gesture, combining two consecutive small-object touch-and-hold gestures.

11. The computer readable medium of claim 1 wherein the instructions cause the processor to determine a radial coordinate of the object, and to combine at least two component gestures having similar middle polar angles and different radial coordinates to identify a radial swipe gesture.

12. The computer readable medium of claim 11 wherein the instructions cause the processor to open a menu of options for an active application on a display in response to an identified radial swipe gesture.

13. The computer readable medium of claim 1 wherein the user input device comprises visible light illumination means connected to the processor, and the instructions cause the processor to combine at least two component gestures to further identify a compound gesture, and to selectively illuminate that portion of the user input device perimeter at which the compound gesture is performed.

14. The computer readable medium of claim 13 wherein the instructions cause the processor to issue a first command in response to a first gesture identified at one end of the illuminated portion, and cause the processor to issue a second command in response to a second gesture identified at the other end of the illuminated portion, the second command being an opposite of the first command.

15. The computer readable medium of claim 13 wherein the instructions cause the processor to issue a first command in response to a first sweep gesture in a first direction along the illuminated portion, and cause the processor to issue a second command in response to a second sweep gesture in the direction opposite the first direction along the illuminated portion, the second command being an opposite of the first command.
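For illustration only, the component-gesture data structure and the state assignment described in claim 1 could be sketched as below. The field names, units, and the `assign_state` helper are hypothetical reading aids, not language from the patent itself:

```python
from dataclasses import dataclass
from enum import Enum, auto

class GestureState(Enum):
    RECOGNIZED = auto()  # moving object initially detected on the perimeter
    UPDATED = auto()     # object still detected after the initial detection
    ENDED = auto()       # object ceases to be detected on the perimeter

@dataclass
class ComponentGesture:
    timestamp: float      # time stamp of the detection event (seconds, assumed)
    start_angle: float    # polar angle at which the moving object starts (degrees, assumed)
    end_angle: float      # polar angle at which the moving object ends
    middle_angle: float   # middle polar angle of the moving object
    state: GestureState   # one of RECOGNIZED, UPDATED, ENDED

def assign_state(previously_detected: bool, currently_detected: bool) -> GestureState:
    """State assignment per claim 1: RECOGNIZED on first detection,
    UPDATED while detection continues, ENDED when detection ceases."""
    if currently_detected and not previously_detected:
        return GestureState.RECOGNIZED
    if currently_detected and previously_detected:
        return GestureState.UPDATED
    return GestureState.ENDED
```

A gesture recognizer could then, for example, tag the first touch sample of a glide as `assign_state(False, True)` (RECOGNIZED), subsequent samples as UPDATED, and the lift-off sample as ENDED, and combine such ENDED records into the compound gestures of claims 4 and 5.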
Patents cited by this patent (154)
Denlinger Michael B. (Haworth NJ), Ambient-light-responsive touch screen data input method and system.
Moran Thomas P. ; Chiu Patrick ; Melle William Van ; Kurtenbach Gordon,CAX, Apparatus and method for implementing visual animation illustrating results of interactive editing operations.
Gottfurcht,Elliot A.; Gottfurcht,Grant E.; Long,Albert Michel C., Apparatus and method of manipulating a region on a wireless device screen for viewing, zooming and scrolling internet content.
Conrad Thomas J. ; Moller Elizabeth Ann Robinson, Computer system with graphical user interface including windows having an identifier within a control region on the dis.
Brian Finlay Beaton CA; Colin Donald Smith CA; Francois Blouin CA; Guillaume Comeau CA; Arthur Julian Patterson Craddock CA, Contextual gesture interface.
Katsuyuki Omura JP; Takao Inoue JP, Coordinate position inputting/detecting device, a method for inputting/detecting the coordinate position, and a display board system.
Gruaz Daniel (Montigny le Bretonneux FRX) Marchal Claude (Garancieres FRX), Device for detecting the position of a control member on a touch-sensitive pad.
Bishop Edward H. ; Connor Alfred William ; Cox Aaron Roger ; Crompton Dennis ; McDonald Mark Gehres, Front cover assembly for a touch sensitive device.
Beernink Ernest H. (San Carlos CA) Foster Gregg S. (Woodside CA) Capps Stephen P. (San Carlos CA), Gesture sensitive buttons for graphical user interfaces.
Inagaki,Takeo; Saito,Junko; Ihara,Keigo; Sueyoshi,Takahiko; Yamaguchi,Yoshihiro; Gomi,Shinichiro, Information processing apparatus, method of displaying movement recognizable standby state, method of showing recognizable movement, method of displaying movement recognizing process, and program sto.
Cavallucci, Gilles; Sylvestre, Julien P.; Plantier, Philippe G., Method and a device for optically detecting the position of an object by measuring light reflected by that object.
Naughton Patrick J. ; Clanton ; III Charles H. ; Gosling James A. ; Warth Chris ; Palrang Joseph M. ; Frank Edward H. ; LaValle David A. ; Sheridan R. Michael, Method and apparatus for improved graphical user interface having anthropomorphic characters.
Gauthey,Darryl; Farine,Pierre Andre, Method of input of a security code by means of a touch screen for access to a function, an apparatus or a given location, and device for implementing the same.
Heikkinen Teuvo,FIX ; Piippo Petri,FIX ; Wikberg Harri,FIX ; Silfverberg Miika,FIX ; Korhonen Panu,FIX ; Kiljander Harri,FIX, Mobile station with touch input having automatic symbol magnification function.
McCharles,Randy; Morrison,Gerald; Worthington,Steve; Akitt,Trevor, Synchronization of cameras in camera-based touch system to enhance position determination of fast moving objects.
Eliasson, Jonas Ove Philip; Østergaard, Jens Wagenblast Stubbe, System and method of determining a position of a radiation scattering/reflecting element.
Gough Michael L. (Ben Lomond CA) Holloway Bruce V. (Marina CA), System for entering data into an active application currently running in the foreground by selecting an input icon in a.
Hube Randall R. (Rochester NY) Jacobs Craig W. (Fairport NY) Moon William J. (Marion NY), Touch screen user interface with expanding touch locations for a reprographic machine.
Morrison, Gerald D.; McCharles, Randy; Tseng Su, Scott Yu; Singh, Manvinder, Touch system and method for determining pointer contacts on a touch surface.
Chew, Chee H.; Bastiaanse, Elizabeth A.; Blum, Jeffrey R.; Coomer, Christen E.; Enomoto, Mark H.; Keyser, Greg A.; Parker, Kathryn L.; Vong, William H.; Zuberec, Sarah E., User interface for palm-sized computing devices and method and apparatus for displaying the same.