User/object interactions in an augmented reality environment
IPC Classification Information
Country / Type
United States(US) Patent
Granted
International Patent Classification (IPC, 7th edition)
G06T-019/00
G06T-007/00
G06T-011/00
G06F-003/01
G06F-003/00
Application Number
US-0158062
(2011-06-10)
Registration Number
US-10008037
(2018-06-26)
Inventors / Address
Worley, III, William Spencer
Crump, Edward Dietz
Yuan, Robert A.
Coley, Christopher
Cederlof, Colter E.
Applicant / Address
Amazon Technologies, Inc.
Agent / Address
Lee & Hayes, PLLC
Citation Information
Times cited: 1
Patents cited: 4
Abstract
An augmented reality environment allows interaction between virtual and real objects. By monitoring user actions within the augmented reality environment, various functions are provided to users. Users may buy or sell items with a gesture, check inventory of objects in the augmented reality environment, view advertisements, and so forth.
Representative Claims
1. An augmented reality system comprising: a processor; a projector coupled to the processor, and configured to generate structured light; a camera coupled to the processor, and configured to receive at least a portion of the structured light that has been reflected off one or more objects in an environment; and computer-readable media storing computer-executable instructions that, when executed by the processor, cause the processor to perform acts comprising: identifying an object of the one or more objects within the environment at least in part with the at least the portion of the structured light; determining an identity of the user within the environment, based at least in part on the at least the portion of the structured light; associating the object with first content based at least in part on the identity of the user; and presenting the first content with use of the projector, the first content remaining within a line of sight of the user as the user moves within the environment and accepting input associated with the first content received via the camera.

2. The system of claim 1, wherein the acts further comprise presenting the first content on a wall within the environment.

3. The system of claim 1, wherein the acts further comprise determining that the user is interacting with another object and ceasing presentation of the first content associated with the object.

4. The system of claim 1, wherein the acts further comprise initiating an action based at least in part upon a user input received via the camera.

5. The system of claim 4, wherein the action comprises an affirmative response to the first content.

6. The system of claim 1, wherein the first content comprises both audio and visual content.

7. The system of claim 1, wherein the augmented reality system resides within an augmented reality environment, and wherein the acts further comprise tracking the object within the augmented reality environment and maintaining a current location of the object.

8. The system of claim 1, wherein the acts further comprise tracking the location of the user as the user moves within the environment and presenting the first content with use of the projector as the user moves within the environment.

9. A method comprising: under control of one or more computer systems configured with executable instructions, identifying an object based at least in part on interaction with structured light; identifying a user based at least in part on interaction with structured light; selecting first content for association with the object based at least in part on an identity of the user; and presenting, via a projector, the first content, the first content remaining within a line of sight of the user as the user moves within an environment.

10. The method of claim 9, further comprising accepting an input associated with the first content.

11. The method of claim 9, further comprising authorizing the user prior to initiating an action based at least in part upon user input.

12. The method of claim 11, wherein the user input comprises a gesture made by the user having one or more motions that occur at least in part free from contact with other objects.

13. The method of claim 11, further comprising initiating an action based at least in part upon the user input.

14. The method of claim 9, further comprising: receiving image data representing a gesture of the user in response to the first content; determining, based at least in part on the image data, a starting location of the gesture and an ending location of the gesture; determining, based at least in part on the image data, a distance between the starting location and the ending location; determining, based at least in part on the image data, whether the distance between the starting location and the ending location is greater than a predetermined threshold distance; and storing information associated with the gesture of the user based at least in part on the first content and the distance being greater than the predetermined threshold distance.

15. A method comprising: under control of one or more computer systems configured with executable instructions, associating a category of content with an object in an augmented reality environment; determining one or more physical characteristics of a user in the augmented reality environment; determining a first distance between the user and the object; associating first content from the category of content with the object based at least in part on the physical characteristics of the user and the first distance between the user and the object; presenting, via a projector, the first content within the augmented reality environment, the first content remaining within a line of sight of the user as the user moves within the augmented reality environment; determining a second distance between the user and the object, the second distance being different than the first distance; and presenting second content about the object within the augmented reality environment, based at least in part on the second distance.

16. The method of claim 15, wherein the first content comprises a projected image.

17. The method of claim 15, wherein the first content comprises an offer relating to a transaction involving goods or services.

18. The method of claim 15, wherein the first content comprises an advertisement related to the object.

19. The method of claim 15, further comprising limiting content based at least in part upon the identity of the user within the augmented reality environment.

20. The method of claim 15, wherein the one or more physical characteristics of the user comprises at least one of a gender of the user or an age of the user.

21. The method of claim 15, further comprising: receiving image data representing a gesture of the user in response to the first content; determining, based at least in part on the image data, whether the gesture is a predetermined gesture indicating a negative response to the first content; and ceasing presentation of the first content based at least in part on the gesture being the predetermined gesture.
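Claim 14 describes a simple gesture-qualification test: measure the distance between a gesture's starting and ending locations in the image data and act only when it exceeds a predetermined threshold. The patent does not specify a coordinate system or threshold value; the sketch below assumes 2-D image coordinates in pixels and a hypothetical threshold of 50 pixels, purely for illustration.

```python
import math

# Hypothetical threshold in pixels; the claim leaves the actual value unspecified.
THRESHOLD_PX = 50.0

def gesture_exceeds_threshold(start, end, threshold=THRESHOLD_PX):
    """Return True when the straight-line distance between the gesture's
    starting and ending locations is greater than the threshold,
    mirroring the determination recited in claim 14."""
    distance = math.dist(start, end)  # Euclidean distance between 2-D points
    return distance > threshold

# An 80 px sweep qualifies; a ~9 px twitch does not.
print(gesture_exceeds_threshold((100, 200), (180, 200)))  # True
print(gesture_exceeds_threshold((100, 200), (105, 208)))  # False
```

In the claim, a qualifying gesture triggers storage of information associated with the gesture and the first content; a sub-threshold movement is simply ignored, which filters out incidental hand motion.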
Patents cited by this patent (4)
Schott Eric G. (Mercer Island WA), Method and apparatus for data alteration by manipulation of representational graphs.