IPC Classification Information

Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.): (not listed)
Application number: US-0101527 (2008-04-11)
Registration number: US-8115753 (2012-02-14)
Priority information: NZ-554416 (2007-04-11)
Inventor / address: (not listed)
Applicant / address: (not listed)
Agent / address: Kilpatrick Townsend & Stockton LLP
Citation information: cited by 20 patents; cites 321 patents
Abstract
A touch screen system that can approximate tracking and dragging states regardless of the user's orientation and without reliance on direct sensing of touch pressure or area. A first detector generates a signal representing a first image of an object interacting with the touch screen. A second detector generates a signal representing a second image of the object. A signal processor processes the first signal to determine approximated coordinates of a first pair of outer edges of the object and processes the second signal to determine approximated coordinates of a second pair of outer edges of the object. The signal processor then calculates an approximated touch area based on the approximated coordinates of the first pair of outer edges and the approximated coordinates of the second pair of outer edges of the object. If the approximated touch area is less than or equal to a threshold touch area, the signal processor determines that the object interacting with the touch screen indicates a tracking state. If the approximated touch area is greater than the threshold touch area, the signal processor determines that the object interacting with the touch screen indicates a selection state. The threshold touch area may be established by calibrating the touch screen system when the object interacting with the touch screen is known to indicate the tracking state.
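The area-threshold classification described in the abstract can be sketched in a few lines. This is a hedged illustration, not the patented implementation: the rectangle formed by the two edge-to-edge widths stands in for the slope-line geometry the patent actually uses, and the coordinate values and threshold below are invented for the example.

```python
def approximate_touch_area(first_edges, second_edges):
    """Approximate the touch area from two pairs of outer-edge coordinates.

    Each argument is a pair of scalar edge positions along one detector's
    axis; the spanned rectangle is a simple stand-in for the patent's
    slope-line calculation.
    """
    width_1 = abs(first_edges[1] - first_edges[0])
    width_2 = abs(second_edges[1] - second_edges[0])
    return width_1 * width_2

def classify_state(area, threshold):
    """Tracking if the area is at or below the threshold, else selection."""
    return "tracking" if area <= threshold else "selection"

# A light fingertip graze images as narrow edges -> small area -> tracking.
print(classify_state(approximate_touch_area((10, 12), (40, 43)), threshold=10))  # tracking
# A firm press widens both images -> larger area -> selection.
print(classify_state(approximate_touch_area((10, 16), (40, 47)), threshold=10))  # selection
```

Per the abstract, the threshold itself would come from calibration performed while the object is known to be in the tracking state, rather than from a fixed constant as shown here.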
Representative Claims
1. A method of discerning between user interaction states in a touch screen system, comprising:
receiving a first signal from a first detector of said touch screen system, said first signal representing a first image of an object interacting with a touch screen;
receiving a second signal from a second detector, said second signal representing a second image of the object interacting with the touch screen;
processing the first signal to determine approximated coordinates of a first pair of outer edges of the object;
processing the second signal to determine approximated coordinates of a second pair of outer edges of the object;
calculating an approximated touch area based on the approximated coordinates of the first pair of outer edges and the approximated coordinates of the second pair of outer edges of the object;
if the approximated touch area is less than or equal to a threshold touch area, determining that the object interacting with the touch screen indicates a tracking state;
if the approximated touch area is greater than the threshold touch area, determining that the object interacting with the touch screen indicates a selection state;
if the object interacting with the touch screen indicates the selection state, determining whether the object moves relative to the touch screen;
if the object moves relative to the touch screen, re-calculating the approximated touch area and determining whether the re-calculated touch area remains greater than or equal to the threshold touch area; and
if the re-calculated touch area remains greater than the threshold touch area, determining that the object interacting with the touch screen indicates a dragging state.

2. The method of claim 1, wherein the approximated coordinates of the first pair of outer edges and the approximated coordinates of the second pair of outer edges of the object are determined using slope line calculations.

3. The method of claim 1, wherein the threshold touch area is established by calibrating the touch screen system when the object interacting with the touch screen is known to indicate the tracking state.

4. The method of claim 1, wherein the threshold touch area is established by an operator of the touch screen system.

5. The method of claim 1, further comprising:
if the object interacting with the touch screen indicates either the selection state or the tracking state, determining whether the object becomes undetected by the first detector and the second detector; and
if the object becomes undetected by the first detector and the second detector, determining that the object interacting with the touch screen indicates an out-of-range state.

6. The method of claim 1, further comprising if the re-calculated touch area does not remain greater than the threshold touch area, determining that the object interacting with the touch screen indicates the tracking state.

7. The method of claim 1, further comprising:
if the object interacting with the touch screen indicates either the selection state, the dragging state or the tracking state, determining whether the object becomes undetected by the first detector and the second detector; and
if the object becomes undetected by the first detector and the second detector, determining that the object interacting with the touch screen indicates an out-of-range state.

8. The method of claim 1, wherein the first detector and the second detector are each selected from the group consisting of: a line scan camera, an area scan camera and a phototransistor.

9.
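The conditional steps of claims 1, 5, 6 and 7 amount to a small state machine: selection becomes dragging when the object moves while the re-calculated area stays above the threshold, falls back to tracking when it does not, and any state becomes out-of-range once both detectors lose the object. The sketch below is an illustrative reading of that flow; the function and state names are invented here, not taken from the patent.

```python
def next_state(state, area, threshold, moved, detected):
    """One transition of the interaction-state flow read from claims 1 and 5-7.

    state    -- current state name ("tracking", "selection", "dragging")
    area     -- the (re-)calculated approximated touch area
    moved    -- whether the object moved relative to the touch screen
    detected -- whether either detector still sees the object
    """
    if not detected:
        # Claims 5 and 7: undetected by both detectors -> out-of-range.
        return "out-of-range"
    if state == "selection" and moved:
        # Claim 1: area still above threshold after movement -> dragging;
        # claim 6: otherwise fall back to the tracking state.
        return "dragging" if area > threshold else "tracking"
    # Claim 1's base comparison for a stationary or fresh contact.
    return "selection" if area > threshold else "tracking"

state = "selection"
state = next_state(state, area=42, threshold=10, moved=True, detected=True)
print(state)  # dragging
state = next_state(state, area=42, threshold=10, moved=True, detected=False)
print(state)  # out-of-range
```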
A touch screen system for discerning between user interaction states, comprising:
a touch screen;
a first detector in proximity to the touch screen for generating a first signal representing a first image of an object interacting with the touch screen;
a second detector in proximity to the touch screen for generating a second signal representing a second image of the object interacting with the touch screen; and
a signal processor for executing computer-executable instructions for:
processing the first signal to determine approximated coordinates of a first pair of outer edges of the object,
processing the second signal to determine approximated coordinates of a second pair of outer edges of the object,
calculating an approximated touch area based on the approximated coordinates of the first pair of outer edges and the approximated coordinates of the second pair of outer edges of the object,
if the approximated touch area is less than or equal to a threshold touch area, determining that the object interacting with the touch screen indicates a tracking state,
if the approximated touch area is greater than the threshold touch area, determining that the object interacting with the touch screen indicates a selection state,
if the object interacting with the touch screen indicates the selection state, determining whether the object moves relative to the touch screen,
if the object moves relative to the touch screen, re-calculating the approximated touch area and determining whether the re-calculated touch area remains greater than or equal to the threshold touch area, and
if the re-calculated touch area remains greater than or equal to the threshold touch area, determining that the object interacting with the touch screen indicates a dragging state.

10. The touch screen system of claim 9, wherein the approximated coordinates of the first pair of outer edges and the approximated coordinates of the second pair of outer edges of the object are determined using slope line calculations.

11. The touch screen system of claim 9, wherein the threshold touch area is established by calibrating the touch screen system when the object interacting with the touch screen is known to indicate the tracking state.

12. The touch screen system of claim 9, wherein the signal processor executes further computer-executable instructions for:
if the object interacting with the touch screen indicates either the selection state or the tracking state, determining whether the object becomes undetected by the first detector and the second detector; and
if the object becomes undetected by the first detector and the second detector, determining that the object interacting with the touch screen indicates an out-of-range state.

13. The touch screen system of claim 9, wherein the signal processor executes further computer-executable instructions for determining that the object interacting with the touch screen indicates the tracking state, if the re-calculated touch area does not remain greater than the threshold touch area.

14. The touch screen system of claim 9, wherein the signal processor executes further computer-executable instructions for:
if the object interacting with the touch screen indicates either the selection state, the dragging state or the tracking state, determining whether the object becomes undetected by the first detector and the second detector; and
if the object becomes undetected by the first detector and the second detector, determining that the object interacting with the touch screen indicates an out-of-range state.

15. The touch screen system of claim 9, wherein the first detector and the second detector are each selected from the group consisting of: a line scan camera, an area scan camera and a phototransistor.

16. The touch screen system of claim 9, further comprising a light source for illuminating the object; and wherein the first detector and the second detector detect illumination level variations caused by the object interacting with the touch screen.

17. The touch screen system of claim 9, wherein the object comprises a user's finger.

18. The touch screen system of claim 9, wherein the object comprises a stylus having a spring loaded plunger protruding from a tip of the stylus, said plunger producing a relatively small touch area when interacting with the touch screen; and wherein said plunger collapses into the tip of the stylus when sufficient compression is applied to the spring, causing the tip of the stylus to contact the touch screen and producing a relatively larger touch area.