A user interface apparatus has a flexible part including a one-dimensional analog sensor for sensing distortion of the flexible part, and a processor unit for detecting one of a plurality of input states based on the value of the sensed distortion and running a task related to the selected input state. The input states correspond to dynamic or static positive/negative distortion of the flexible part. The user interacts with the apparatus by physically manipulating the body of the apparatus.
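As a rough illustration of the mechanism in the abstract, the sketch below (all names hypothetical; the patent does not specify an implementation) classifies a signed one-dimensional distortion reading into a discrete input state and runs the task registered for that state.

```python
# Hypothetical sketch: a signed distortion value from a one-dimensional
# analog bend sensor is classified into discrete input states
# (neutral / static / dynamic, positive / negative), and the task
# related to the detected state is run.

NEUTRAL_BAND = 0.05  # readings inside this band count as "no distortion"

def detect_input_state(distortion: float, previous: float) -> str:
    """Map a signed distortion reading to an input-state name."""
    changing = abs(distortion - previous) > 0.02  # crude dynamic/static split
    if abs(distortion) < NEUTRAL_BAND:
        return "neutral"
    sign = "positive" if distortion > 0 else "negative"
    kind = "dynamic" if changing else "static"
    return f"{kind}_{sign}"

def run_task_for(state: str, tasks: dict) -> str:
    """Run the task related to the selected input state, if any."""
    task = tasks.get(state)
    return task() if task else "no-op"

tasks = {
    "static_positive": lambda: "zoom-in",
    "static_negative": lambda: "zoom-out",
    "neutral": lambda: "idle",
}
state = detect_input_state(0.4, 0.4)   # steady positive bend
print(run_task_for(state, tasks))      # -> zoom-in
```

The thresholds and state names here are illustrative only; the claims leave the mapping from distortion values to input states unspecified.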
Representative claims
What is claimed is:

1. A user interface apparatus having a flexible part, comprising: an analogue sensor for sensing distortion of the flexible part, the analogue sensor comprising a pair of pressure sensors; means for detecting one of a plurality of first-type input states based on a value of the distortion sensed by the analogue sensor and having a task run, the task being related to a selected first-type input state, wherein the means for detecting detects a direction of force applied to the pair of pressure sensors to cause distortion of the flexible part, and wherein the plurality of first-type input states are respectively related to different distortion statuses of the flexible part; a two-dimensional position sensor, distinct from and used in conjunction with the analogue sensor, for sensing at least one of a user touch position in a two-dimensional plane and a direction of movement of the user touch position, wherein at least one of the tasks is for controlling at least one graphical user interface object; and means for selecting a view of a plurality of items based on the analogue sensor sensing a distortion status of the flexible part greater than a predetermined value.

2. The user interface apparatus according to claim 1, wherein one of the plurality of first-type input states is related to a neutral state of the flexible part in which no distortion is detected.

3. The user interface apparatus according to claim 1, wherein: the user interface apparatus is configured as an electronic device of a single body including a flexible display panel as the flexible part, and the two-dimensional position sensor is disposed on the back of the flexible display panel.

4. An apparatus including a user interface unit, wherein the user interface unit includes the user interface apparatus according to claim 1 and one or a plurality of additional input devices.

5. The user interface apparatus according to claim 1, wherein a first level of a hierarchical structured menu includes a plurality of individually selectable characters and a second level of said hierarchical structured menu includes a different plurality of individually selectable characters.

6. The user interface apparatus according to claim 1, wherein the graphical user interface object is for a hierarchical structured menu, the depth of the hierarchical structured menu is selected by said first-type input states, and an item of the menu at the selected depth is selected by said touch position.

7. The user interface apparatus according to claim 1, including means for changing a transparency value of an integrated preview of an item based on the analogue sensor sensing a distortion status of the flexible part.

8. The user interface apparatus according to claim 1, including means for changing a zoom value of an integrated preview of an item based on the analogue sensor sensing a distortion status of the flexible part.

9. An apparatus configured to have a single body including a processing unit and a display unit, the apparatus comprising: an analogue sensor disposed on the body for detecting a user's analogue input applied on the body of the apparatus, the analogue sensor comprising a pair of pressure sensors, wherein the user's analogue input corresponds to distortion of a flexible portion of the single body, and wherein the analogue sensor is disposed on the body for detecting a direction of force applied to the body to cause distortion of the flexible portion, wherein the processing unit changes a screen view of a plurality of items displayed on the display unit based on an output value of the analogue sensor where the output value is greater than a predetermined value; and a two-dimensional position sensor, distinct from and used in conjunction with the analogue sensor, for sensing at least one of a user touch position in a two-dimensional plane and a direction of movement of the user touch position, wherein the processing unit changing the screen view includes controlling at least one graphical user interface object.

10. The apparatus according to claim 9, wherein: the screen view to be changed includes an image superposed on an existing view, and the processing unit changes one of the visual properties of the superposed image in accordance with the output value of the analogue sensor, and wherein the image replaces the existing view on the display unit.

11. The apparatus according to claim 9, wherein the screen view to be changed includes an image that provides a visual impression to a user that the image indicates selectable items and a selected item, and the processing unit changes the selectable items and the selected item included in the image in accordance with the output value of the analogue sensor.

12. The apparatus according to claim 9, further comprising: scroll means for controlling scrolling of the screen view in accordance with a user's input, wherein the user's input is input via the two-dimensional position sensor or the analogue sensor, and wherein the processing unit selects one of the selectable graphic user interface elements displayed in a current screen view by detecting whether a position of the graphic user interface element has reached a predetermined position of a screen of the display unit, and switches a mode of operation so as to accept a user input for confirming selection of the detected element.

13. The apparatus according to claim 9, wherein the graphical user interface object is for a hierarchical structured menu, wherein the depth of the hierarchical structured menu is selected in accordance with input from the analogue sensor, and an item of the menu at the selected depth is selected in accordance with input from the two-dimensional position sensor.

14. The apparatus according to claim 9, wherein the processing unit changes a transparency value of an integrated preview of an item based on the analogue sensor sensing a distortion status of the flexible part.

15. The apparatus according to claim 9, including means for changing a zoom value of an integrated preview based on the analogue sensor sensing a distortion status of the flexible part.

16. An apparatus configured to have a single body including a processing unit and a display unit, the apparatus comprising: an analogue sensor disposed on the body for detecting a user's analogue input applied on the body of the apparatus, the analogue sensor comprising a pair of pressure sensors disposed on the body for detecting a direction of force applied to the body; and a two-dimensional position sensor, distinct from and used in conjunction with the analogue sensor, for sensing at least one of a user touch position in a two-dimensional plane and a direction of movement of the user touch position; wherein the processing unit comprises an image processing unit having a plurality of operation modes to generate a screen view of a plurality of items displayed on the display unit, wherein the processing unit controls functionality of at least one of the operation modes based on an output value of the analogue sensor being greater than a predetermined value, wherein the processing unit causes an image to be superimposed over a portion of the screen view based on the output value of the analogue sensor, wherein the processing unit causes the image to replace the screen view on the display unit, and wherein the processing unit controls at least one graphical user interface object.

17. The apparatus according to claim 16, wherein the screen view and the image are non-identical portions of a map.

18. The apparatus according to claim 16, wherein the graphical user interface object is for a hierarchical structured menu in accordance with an analogue input from the analogue sensor and a touch position input from the two-dimensional position sensor, the depth of the hierarchical structured menu being controlled by said analogue input, and an item of the menu at the selected depth being controlled by said touch position input.

19. The apparatus according to claim 16, wherein the processing unit changes a transparency value of an item based on the analogue sensor sensing a distortion status of the flexible part.

20. The apparatus according to claim 16, wherein the processing unit changes a zoom value of an integrated preview based on the analogue sensor sensing a distortion status of the flexible part.

21. A portable information apparatus operated in response to a user input, comprising: a main body; gesture input means for obtaining physical interaction applied on the main body by a user, wherein the physical interaction applied on the main body causes the main body to bend, the gesture input means comprising an analogue sensor, the analogue sensor comprising a pair of pressure sensors for detecting a direction of force applied to cause distortion in an operation section with respect to the main body; processing means for executing processing in accordance with the user input; a visual display for visually displaying a result of the processing by the processing means; direction input means, distinct from and used in conjunction with the gesture input means, for inputting a direction in a display screen of the visual display in response to an operation performed with a user's finger, wherein the processing means controls at least one graphical user interface object; and selecting means for selecting a view of a plurality of items based on the analogue sensor sensing a distortion status of the flexible part greater than a predetermined value.

22. The portable information apparatus according to claim 21, wherein the visual display is placed in a front surface of the main body and the direction input means is placed in a back surface of the main body.

23. The portable information apparatus according to claim 21, further comprising a tactile presentation section for providing a tactile feedback indicating a processing result obtained in the processing means.

24. The portable information apparatus according to claim 21, wherein the gesture input means comprises: operation sections turnably connected to both right and left edge portions of the main body, respectively; a rotation sensor for detecting an operation amount of turning at least one of the operation sections with respect to the main body; and a data acquisition section for providing an output of the rotation sensor, as a gesture input, to the processing means.

25. The portable information apparatus according to claim 21, wherein: the main body is flexible; and the gesture input means comprises a bend sensor for detecting an amount of bend in the main body caused by the physical interaction by the user, and a data acquisition section for providing an output of the bend sensor, as a gesture input, to the processing means.

26. The portable information apparatus according to claim 25, wherein the bend sensor detects a direction of bending in the main body in addition to the amount thereof.

27. The portable information apparatus according to claim 25, further comprising: a flexible visual display, which is placed in a front surface of the main body, for visually displaying a result of the processing by the processing means; and a flexible direction input means, which is placed in a back surface of the main body, for inputting a direction in response to an operation performed with a user's finger.

28. The portable information apparatus according to claim 21, wherein the gesture input means comprises: operation sections attached to both right and left edge portions of the main body, respectively; a force sensor for detecting force applied to cause distortion in at least one of the operation sections with respect to the main body; and a data acquisition section for providing an output of the force sensor, as a gesture input, to the processing means.

29. The portable information apparatus according to claim 28, wherein the force sensor detects a value and a direction of the force applied on the main body.

30. The portable information apparatus according to claim 21, wherein the gesture input means comprises: a data acquisition section for providing an output of the pressure sensors, as a gesture input, to the processing means.

31. The portable information apparatus according to claim 30, wherein the pressure sensors are placed in both front and back surfaces of the main body, and detect pressure applied by the user, who is holding both right and left edge portions of the main body, so as to cause upward and/or downward bending in the main body.

32. The portable information apparatus according to claim 21, wherein the visual display is placed in a front surface of the main body; and wherein the processing means simultaneously and transparently processes the gesture input from the gesture input means and the direction input from the direction input means.

33. The portable information apparatus according to claim 32, wherein the processing means performs a process, which corresponds to the physical interaction accepted by the gesture input means, on an object in the display screen, the object being designated by using the direction input means.

34. The portable information apparatus according to claim 21, wherein: the gesture input means comprises a force sensor for detecting a strength of the physical interaction applied on the main body by the user; and the processing means uses an output of the force sensor, which is a continuous variable, as an analogue value for interface control.

35. The portable information apparatus according to claim 34, further comprising a tactile presentation section for providing a tactile feedback to the user, the tactile feedback indicating the analogue value accepted by the gesture input means.

36. The portable information apparatus according to claim 21, wherein: the gesture input means comprises a force sensor for detecting a strength of the physical interaction applied on the main body by the user; and the processing means controls a system operation in response to an output of the force sensor if the output of the force sensor exceeds a predetermined threshold.

37. The portable information apparatus according to claim 36, further comprising a tactile presentation section for providing a tactile feedback to the user so as to confirm validity of the physical interaction accepted by the gesture input means.

38. The portable information apparatus according to claim 21, wherein: the gesture input means comprises a force sensor for detecting a strength of the physical interaction applied on the main body by the user; and the processing means analyzes a pattern of the force detected by the force sensor, and uses the pattern of the force as a specific command.

39. The portable information apparatus according to claim 38, further comprising a tactile presentation section for providing a tactile feedback to the user so as to confirm successful analysis of the physical interaction accepted by the gesture input means and corresponding successful execution of the specific command.

40. The portable information apparatus according to claim 21, wherein the graphical user interface object is for a hierarchical structured menu, the depth of the hierarchical structured menu being selected by gesture input from said gesture input means, and an item of the menu at the selected depth being selected by direction input from said direction input means.

41. The portable information apparatus according to claim 21, wherein the processing means changes a transparency value of an item based on the gesture input means sensing a distortion status of the flexible part.

42. The portable information apparatus according to claim 21, wherein the processing means changes a zoom value of an integrated preview based on the gesture input means sensing a distortion status of the flexible part.
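Claims 1, 9 and 21 recite a pair of pressure sensors that yield a direction of force, and a view change gated on the sensor output exceeding a predetermined value. A minimal sketch of one way this could work, assuming front/back sensor placement and illustrative names and thresholds not given in the patent:

```python
# Hypothetical sketch: two pressure sensors on opposite faces of the body
# give a signed bend value; its sign indicates the direction of the applied
# force, and a view change fires only when the magnitude exceeds a
# predetermined value, as the claims describe.

PREDETERMINED_VALUE = 10.0  # threshold below which the view is unchanged

def bend_value(front_pressure: float, back_pressure: float) -> float:
    """Signed difference: positive = bent toward front, negative = toward back."""
    return front_pressure - back_pressure

def select_view(current_view: str, front: float, back: float) -> str:
    """Change the screen view only when |bend| exceeds the threshold."""
    value = bend_value(front, back)
    if abs(value) <= PREDETERMINED_VALUE:
        return current_view            # distortion too small: keep the view
    return "next_view" if value > 0 else "previous_view"

print(select_view("list", front=25.0, back=5.0))   # -> next_view
print(select_view("list", front=12.0, back=8.0))   # -> list
```

Taking the signed difference of the paired readings is one plausible reading of "detecting a direction of force"; the claims do not mandate any particular signal processing.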
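Claims 6, 13, 18 and 40 divide the two inputs between levels of a hierarchical structured menu: the analogue (bend/gesture) input selects the menu depth, while the two-dimensional touch or direction input selects an item at that depth. The sketch below illustrates this division of labour with a hypothetical three-level menu and normalized sensor values:

```python
# Hypothetical sketch of the claimed two-sensor menu navigation:
# analog bend input selects the depth (level) of a hierarchical menu,
# and the two-dimensional touch position selects an item within it.

MENU = [
    ["File", "Edit", "View"],     # depth 0
    ["Open", "Save", "Close"],    # depth 1
    ["A", "B", "C"],              # depth 2
]

def depth_from_bend(bend: float, levels: int) -> int:
    """Quantize a 0..1 bend value into a menu depth index."""
    bend = min(max(bend, 0.0), 1.0)
    return min(int(bend * levels), levels - 1)

def item_from_touch(x: float, items: list) -> str:
    """Pick the item whose horizontal slot contains the 0..1 touch position."""
    index = min(int(x * len(items)), len(items) - 1)
    return items[index]

def select(bend: float, touch_x: float) -> str:
    level = MENU[depth_from_bend(bend, len(MENU))]
    return item_from_touch(touch_x, level)

print(select(bend=0.5, touch_x=0.1))   # depth 1, leftmost slot -> Open
```

The quantization scheme and menu contents are invented for illustration; the claims only require that depth and item selection come from the two distinct sensors.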
Patents cited by this patent (15)
Fishkin Kenneth P. ; Harrison Beverly L. ; Want Roy, Computer user interface using a physical manipulatory grammar.
Adler Annette M. ; Fishkin Kenneth P. ; Harrison Beverly L. ; Howard Matthew E. ; Want Roy, Multiple interacting computers interfaceable through a physical manipulatory grammar.
Motosyuku Hiroshi (Hitachi JPX) Yokosuka Hirobumi (Hitachi JPX), Small-sized information processor capable of scrolling screen in accordance with tilt, and scrolling method therefor.
Chang Bay-Wei W. ; Fishkin Kenneth P. ; Harrison Beverly L. ; Igarashi Takeo,JPX ; Mackinlay Jock D. ; Want Roy ; Zellweger Polle T., Spinning as a morpheme for a physical manipulatory grammar.
Ulrich Karl T. (Belmont MA) Filerman Marc (Watertown MA) Sachs Emanuel (Somerville MA) Roberts Andrew (Charlestown MA) Siler Todd (Cambridge MA) Berkery Daniel J. (Boston MA) Robertson David C. (Wake, Three-dimensional tactile computer input device.
Bisset Stephen (Palo Alto CA) Miller Robert J. (Fremont CA) Allen Timothy P. (Los Gatos CA) Steinbach Gunter (Palo Alto CA 4), Touch pad driven handheld computing device.
Migos, Charles J.; Capela, Jay Christopher; Thimbleby, William John, Device, method, and graphical user interface for copying user interface objects between content regions.
Weeldreyer, Christopher Douglas; Rapp, Peter William; Marr, Jason Robert; Leffert, Akiva Dov; Capela, Jay Christopher, Device, method, and graphical user interface for manipulating user interface objects.
Missig, Julian; Koch, Jonathan; Cieplinski, Avi E.; Victor, B. Michael; Bernstein, Jeffrey Traer; Kerr, Duncan R.; Haggerty, Myra M., Device, method, and graphical user interface for manipulating workspace views.
Missig, Julian; Koch, Jonathan; Cieplinski, Avi E.; Victor, B. Michael; Bernstein, Jeffrey Traer; Kerr, Duncan R.; Haggerty, Myra M., Device, method, and graphical user interface for moving a calendar entry in a calendar application.
Capela, Jay Christopher; Migos, Charles J.; Thimbleby, William John; Weeldreyer, Christopher Douglas, Device, method, and graphical user interface for precise positioning of objects.
Migos, Charles J.; Capela, Jay Christopher; Weeldreyer, Christopher Douglas; Thimbleby, William John; Reid, Elizabeth Gloria Guarino, Device, method, and graphical user interface for reordering the front-to-back positions of objects.
Capela, Jay Christopher; Migos, Charles J.; Thimbleby, William John; Weeldreyer, Christopher Douglas, Device, method, and graphical user interface for resizing objects.
Capela, Jay Christopher; Migos, Charles J.; Thimbleby, William John; Weeldreyer, Christopher Douglas, Device, method, and graphical user interface for resizing objects.
Capela, Jay Christopher; Migos, Charles J.; Thimbleby, William John; Weeldreyer, Christopher Douglas, Device, method, and graphical user interface for selecting and moving objects.
Butcher, Gary Ian; Chaudhri, Imran; Dascola, Jonathan R.; Dye, Alan C.; Foss, Christopher Patrick; Gross, Daniel C.; Karunamuni, Chanaka G.; Lemay, Stephen O.; Maric, Natalia; Wilson, Christopher; Yang, Lawrence Y., Displaying relevant user interface objects.
Cox, Keith; Kapoor, Gaurav; Culbert, Michael, Method and apparatus for detecting conditions of a peripheral device including motion, and determining/predicting temperature(S) wherein at least one temperature is weighted based on detected conditions.
Wehrenberg, Paul J.; Leiba, Aaron; Williams, Richard C.; Falkenburg, David R.; Gerbarg, Louis G.; Chang, Ray L., Methods and apparatuses for operating a portable device based on an accelerometer.
Wehrenberg, Paul J.; Leiba, Aaron; Williams, Richard C.; Falkenburg, David R.; Gerbarg, Louis G.; Chang, Ray L., Methods and apparatuses for operating a portable device based on an accelerometer.
Wehrenberg, Paul J.; Leiba, Aaron; Williams, Richard C.; Falkenburg, David R.; Gerbarg, Louis G.; Chang, Ray L., Methods and apparatuses for operating a portable device based on an accelerometer.
Wehrenberg, Paul J.; Leiba, Aaron; Williams, Richard C.; Falkenburg, David R.; Gerbarg, Louis G.; Chang, Ray L., Methods and apparatuses for operating a portable device based on an accelerometer.
Wehrenberg, Paul J.; Leiba, Aaron; Williams, Richard C.; Falkenburg, David R.; Gerbarg, Louis G.; Chang, Ray L., Methods and apparatuses for operating a portable device based on an accelerometer.
Filiz, Sinan; Huppi, Brian Q.; Butler, Christopher J.; Grunthaner, Martin P.; Shahparnia, Shahrooz; Kang, Sunggu; Wang, Kai, Piezo based force sensing.
Levesque, Vincent; Modarres, Ali; Cruz-Hernandez, Juan Manuel; Weddle, Amaya Becvar; Birnbaum, David M.; Grant, Danny A., Systems and methods for generating friction and vibrotactile effects.
Hoen, Storrs T.; Augenbergs, Peteris K.; Brock, John M.; Harley, Jonah A.; Sarcia, Sam Rhea, Touch input device including a moment compensated bending sensor for load measurement on platform supported by bending beams.
Shin, Sang Hyun; Chae, Ji Suk; Park, Ho Joo; Ham, Young Ho; Yoo, Kyung Hee; Kim, Ji Ae; Kim, Yu Mi, Touch screen device and method of method of displaying images thereon.
Park, Ho Joo; Chae, Ji Suk; Ham, Young Ho; Yoo, Kyung Hee; Kim, Ji Ae; Kim, Yu Mi; Shin, Sang Hyun; Bae, Seung Jun; Koo, Yoon Hee; Kim, Jun Hee; Kang, Seong Cheol, Touch screen device and method of selecting files thereon.
Park, Ho Joo; Chae, Ji Suk; Ham, Young Ho; Yoo, Kyung Hee; Kim, Ji Ae; Kim, Yu Mi; Shin, Sang Hyun; Bae, Seung Jun; Koo, Yoon Hee; Kang, Seong Cheol, Touch screen device and operating method thereof.
Park, Ho Joo; Chae, Ji Suk; Ham, Young Ho; Yoo, Kyung Hee; Kim, Ji Ae; Kim, Yu Mi; Shin, Sang Hyun; Bae, Seung Jun; Koo, Yoon Hee; Kang, Seong Cheol, Touch screen device and operating method thereof.
Shin, Sang Hyun; Chae, Ji Suk; Park, Ho Joo; Ham, Young Ho; Kim, Jun Hee; Yoo, Kyung Hee; Kim, Yu Mi, Touch screen device and operating method thereof.
Shin, Sang Hyun; Chae, Ji Suk; Park, Ho Joo; Ham, Young Ho; Kim, Jun Hee; Yoo, Kyung Hee; Kim, Yu Mi, Touch screen device and operating method thereof.
Baudisch, Patrick M.; Petschnigg, Georg F.; Wykes, David H.; Shum, Albert Yiu-So; Geiger, Avi; Hinckley, Kenneth P.; Sinclair, Michael J.; Jacobs, Joel B.; Friedman, Jonathan D.; Ho, Rosanna H., Tracking input in a screen-reflective interface environment.