IPC Classification Information
Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.): (not listed)
Application No.: UP-0697056 (2003-10-30)
Registration No.: US-7532196 (2009-07-01)
Inventor / Address: (not listed)
Applicant / Address: (not listed)
Agent / Address: (not listed)
Citation Information: cited by 164 patents; cites 6 patents
Abstract
Methods and apparatus of the invention allow the coordination of resources of mobile computing devices to jointly execute tasks. In the method, a first gesture input is received at a first mobile computing device. A second gesture input is received at a second mobile computing device. In response, a determination is made as to whether the first and second gesture inputs form one of a plurality of different synchronous gesture types. If it is determined that the first and second gesture inputs form the one of the plurality of different synchronous gesture types, then resources of the first and second mobile computing devices are combined to jointly execute a particular task associated with the one of the plurality of different synchronous gesture types.
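The core determination described above is that two gesture inputs, received on different devices, count as a "synchronous gesture" when they are of corresponding types and arrive within a predetermined time period of each other. A minimal sketch of that predicate is below; the `GestureEvent` type, the gesture-type names, and the 0.5-second window are illustrative assumptions, not values taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class GestureEvent:
    device_id: str       # which mobile device reported the input
    gesture_type: str    # e.g. "bump", "stitch", "scribble" (illustrative names)
    timestamp: float     # arrival time in seconds

# Hypothetical "predetermined time period"; the patent does not specify a value.
SYNC_WINDOW_S = 0.5

def is_synchronous_gesture(a: GestureEvent, b: GestureEvent) -> bool:
    """True when two inputs from different devices are of corresponding
    types and are synchronized in time (received within the window)."""
    return (a.device_id != b.device_id
            and a.gesture_type == b.gesture_type
            and abs(a.timestamp - b.timestamp) <= SYNC_WINDOW_S)
```

On a match, a real implementation would then combine the devices' resources to run the task associated with that gesture type (e.g. sharing display real estate or transferring data, per the abstract).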
Representative Claims
What is claimed is:

1. A method of coordinating resources of mobile computing devices to jointly execute tasks, the method comprising: receiving a first user gesture input at a first mobile computing device; receiving a second user gesture input at a second mobile computing device, wherein the first and second gesture inputs are received respectively at the first and second mobile computing devices; determining whether the first and second user gesture inputs together form one of a plurality of different synchronous gesture types, wherein determining whether the first and second gesture inputs received at the first and second mobile computing devices form one of the plurality of different synchronous gesture types further comprises determining whether the first and second user gesture inputs received at the first and second mobile computing devices are of corresponding types and are synchronized in time by being received within a predetermined time period of each other; and combining resources of the first and second mobile computing devices, in response to a determination being made that the first and second user gesture inputs received at the first and second mobile computing devices are of corresponding types and are synchronized in time to form one of the plurality of synchronous gesture types, to jointly execute a particular task associated with the one of the plurality of different synchronous gesture types.

2. The method of claim 1, wherein receiving the first gesture input further comprises receiving an output of an accelerometer of the first mobile computing device, and wherein receiving the second gesture input further comprises receiving an output of an accelerometer of the second mobile computing device.

3. The method of claim 2, wherein the outputs of the accelerometers of the first and second mobile computing devices are indicative of whether the first and second mobile computing devices have been bumped against one another, thereby forming a bump type synchronous gesture.

4. The method of claim 3, and further comprising: receiving a touch sensor output from the first mobile computing device indicative of whether the first mobile computing device is being held during a potential bump type synchronous gesture; and wherein determining whether the first and second gesture inputs form the bump type synchronous gesture comprises determining that the first and second gesture inputs form the bump type synchronous gesture only if the touch sensor output indicates that the first mobile computing device is being held.

5. The method of claim 1, wherein receiving the first gesture input further comprises receiving an input which is indicative of proximity of a stylus to a screen of the first mobile computing device, and wherein receiving the second gesture input further comprises receiving an input which is indicative of proximity of a stylus to a screen on the second mobile computing device.

6. The method of claim 5, wherein proximity of the stylus to one or both of the first and second mobile computing devices includes contact of the stylus with one or both of the first and second mobile computing devices.

7. The method of claim 6, wherein the first and second gesture inputs are indicative of whether a stitch type synchronous gesture has been formed.

8. The method of claim 7, wherein the first and second gesture inputs are indicative of whether a scribble type synchronous gesture has been formed.

9. The method of claim 1, wherein combining resources of the first and second mobile computing devices to jointly execute the task associated with the one of the plurality of different synchronous gesture types further comprises combining resources of the first and second mobile computing devices to share display real estate.

10. The method of claim 9, wherein combining resources of the first and second mobile computing devices to share display real estate further comprises combining resources of the first and second mobile computing devices to jointly display the same image.

11. The method of claim 9, wherein combining resources of the first and second mobile computing devices to share display real estate further comprises combining resources of the first and second mobile computing devices to each display different portions of a single image.

12. The method of claim 1, wherein combining resources of the first and second mobile computing devices to jointly execute the task associated with the one of the plurality of different synchronous gesture types further comprises combining resources of the first and second mobile computing devices to transfer data from the first mobile computing device to the second mobile computing device.

13. A system which coordinates resources of mobile computing devices to jointly execute tasks, the system comprising: a first mobile computing device configured to receive a first user gesture input; a second mobile computing device configured to receive a second user gesture input; processing circuitry configured to determine whether the first and second user gesture inputs together form one of a plurality of different synchronous gesture types by determining whether the first and second gesture inputs are of corresponding types and are synchronized in time by being received within a predetermined time period of each other; and the first and second mobile computing devices being further configured to combine resources in response to a determination being made that the first and second user gesture inputs received at the first and second mobile computing devices are of corresponding types and are synchronized in time to form one of the plurality of synchronous gesture types, to jointly execute a particular task associated with the one of the plurality of different synchronous gesture types.

14. The system of claim 13, and further comprising a network communicatively coupling the first and second mobile computing devices.

15. The system of claim 14, wherein the processing circuitry comprises processing circuitry of one or both of the first and second mobile computing devices.

16. The system of claim 14, wherein the processing circuitry comprises processing circuitry of the network.

17. The system of claim 16, wherein the processing circuitry further comprises a proximity server.

18. The system of claim 13, and further comprising an accelerometer coupled to the first mobile computing device and an accelerometer coupled to the second mobile computing device, wherein the first gesture input is an output of the accelerometer coupled to the first mobile computing device, and wherein the second gesture input is an output of the accelerometer coupled to the second mobile computing device.

19. The system of claim 18, wherein the outputs of the accelerometers coupled to the first and second mobile computing devices are indicative of whether the first and second mobile computing devices have been bumped against one another, thereby forming a bump type synchronous gesture.

20. The system of claim 19, and further comprising: a touch sensor coupled to the first mobile computing device, the first mobile computing device being further configured to receive a touch sensor output indicative of whether the first mobile computing device is being held during a potential bump type synchronous gesture; and wherein the processing circuitry is further configured to determine whether the first and second gesture inputs form the bump type synchronous gesture only if the touch sensor output indicates that the first mobile computing device is being held.

21. The system of claim 13, wherein the first mobile computing device is configured to receive the first gesture input by receiving an input which is indicative of proximity of a stylus to a screen of the first mobile computing device, and wherein the second mobile computing device is configured to receive the second gesture input by receiving an input which is indicative of proximity of a stylus to a screen on the second mobile computing device.

22. The system of claim 21, wherein proximity of the stylus to one or both of the first and second mobile computing devices includes contact of the stylus with one or both of the first and second mobile computing devices.

23. The system of claim 22, wherein the first and second gesture inputs are indicative of whether a stitch type synchronous gesture has been formed.

24. The system of claim 23, wherein the first and second gesture inputs are indicative of whether a scribble type synchronous gesture has been formed.

25. The system of claim 13, wherein the first and second mobile computing devices are configured to combine resources by sharing display real estate.

26. The system of claim 25, wherein the first and second mobile computing devices are configured to share display real estate by jointly displaying the same image.

27. The system of claim 26, wherein the first and second mobile computing devices are configured to jointly display the same image by each displaying different portions of a single image.

28. The system of claim 13, wherein the first and second mobile computing devices are configured to combine resources to transfer data from the first mobile computing device to the second mobile computing device.
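Claims 3, 4, 19, and 20 describe a bump-type gesture: accelerometer outputs on both devices indicate a bump, and the gesture is accepted only if a touch sensor shows the first device is being held. A hedged sketch of that predicate follows; the `BumpReading` type, the 2.0 g threshold, and the 0.2-second window are illustrative assumptions, since the patent leaves these values unspecified.

```python
from dataclasses import dataclass

@dataclass
class BumpReading:
    device_id: str
    peak_accel_g: float   # peak acceleration magnitude from the accelerometer, in g
    is_held: bool         # touch sensor output: is the device being held?
    timestamp: float      # time of the acceleration spike, in seconds

BUMP_THRESHOLD_G = 2.0    # illustrative spike threshold, not from the patent
SYNC_WINDOW_S = 0.2       # illustrative "predetermined time period"

def forms_bump_gesture(first: BumpReading, second: BumpReading) -> bool:
    """Both devices register an accelerometer spike within the time window,
    and (per claims 4 and 20) the gesture counts only if the touch sensor
    indicates the first device is being held."""
    return (first.device_id != second.device_id
            and first.peak_accel_g >= BUMP_THRESHOLD_G
            and second.peak_accel_g >= BUMP_THRESHOLD_G
            and abs(first.timestamp - second.timestamp) <= SYNC_WINDOW_S
            and first.is_held)
```

The held-device check is a plausibility filter: a spike on a device lying on a table is more likely an accidental knock than a deliberate bump gesture.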