Augmented reality with direct user interaction
IPC Classification
Country/Type: United States (US) patent, granted
International Patent Classification (IPC, 7th edition): G06F-003/01; G06F-003/03; A63F-013/20; H04N-013/04
Application number: US-0940322 (filed 2010-11-05)
Registration number: US-9529424 (granted 2016-12-27)
Inventors: Hilliges, Otmar; Kim, David; Izadi, Shahram; Molyneaux, David; Hodges, Stephen Edward; Butler, David Alexander
Applicant: Microsoft Technology Licensing, LLC
Agent: Wong, Tom
Citation information: cited by 3 patents; cites 9 patents
Abstract
Augmented reality with direct user interaction is described. In one example, an augmented reality system comprises a user-interaction region, a camera that captures images of an object in the user-interaction region, and a partially transparent display device which combines a virtual environment with a view of the user-interaction region, so that both are visible at the same time to a user. A processor receives the images, tracks the object's movement, calculates a corresponding movement within the virtual environment, and updates the virtual environment based on the corresponding movement. In another example, a method of direct interaction in an augmented reality system comprises generating a virtual representation of the object having the corresponding movement, and updating the virtual environment so that the virtual representation interacts with virtual objects in the virtual environment. From the user's perspective, the object directly interacts with the virtual objects.
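The pipeline the abstract describes (capture images of the object, track its movement, calculate a corresponding movement in the virtual environment, update the environment) can be sketched roughly as follows. This is a minimal, hypothetical Python rendition for illustration only; the class and function names, and the identity coordinate mapping, are assumptions and do not come from the patent, which would instead use a calibrated camera-to-display transform.

```python
from dataclasses import dataclass


@dataclass
class Vec3:
    """A tracked 3-D position."""
    x: float
    y: float
    z: float


def to_virtual(p: Vec3, scale: float = 1.0) -> Vec3:
    """Map a tracked position in the user-interaction region to
    virtual-environment coordinates. A uniform scale stands in for the
    calibrated transform a real system would apply so that the virtual
    movement stays visually aligned with the real object."""
    return Vec3(p.x * scale, p.y * scale, p.z * scale)


class VirtualEnvironment:
    """Holds the virtual representation (proxy) of the tracked object."""

    def __init__(self) -> None:
        self.proxy = Vec3(0.0, 0.0, 0.0)

    def update(self, new_pos: Vec3) -> Vec3:
        """Move the proxy so it follows the real object's movement."""
        self.proxy = new_pos
        return self.proxy


def process_frame(env: VirtualEnvironment, tracked_pos: Vec3) -> Vec3:
    """One iteration of the loop: tracked_pos comes from the camera
    images; the corresponding virtual movement is computed and the
    environment is updated."""
    return env.update(to_virtual(tracked_pos))
```

From the user's perspective, because the display is partially transparent and the proxy's movement mirrors the object's, the real object appears to interact directly with the virtual objects.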
Representative Claims
1. An augmented reality system, comprising: a user-interaction region; an image sensor arranged to capture images of a user-controlled object in the user-interaction region; a display device which is at least partially transparent and arranged to combine a displayed virtual environment with a view of the user-interaction region, such that both the virtual environment and the object are concurrently visible to a viewing user on the display device; a processor arranged to receive the images and track movement of the object in the user-interaction region, calculate a corresponding movement within the virtual environment that is visually aligned with the movement of the object when viewed by the user on the display device, and control the display device to update the virtual environment based on the corresponding movement; and at least one of: the display device comprising a plurality of partitioned sections, the processor being further arranged to selectively control the display device to reduce the light emitted from at least one partition of the display device at a location coincident with a user's hand, such that at least one partition of the display device emits less light than at least one other partition of the display device and a portion of the user-interaction region corresponding to the at least one partition of the display device with reduced light emission is more visible through the display device than surrounding portions; or the processor being further arranged to selectively control the display device to render black pixels at locations in the virtual environment that are coincident with the user's hands.

2. An augmented reality system according to claim 1, wherein the user-controlled object is a body part of the user.

3. An augmented reality system according to claim 1, further comprising a tracking sensor arranged to monitor at least one of an eye, face and head of the user, and wherein the processor is further arranged to use data from the tracking sensor to track the user's viewing position, and adapt the display of the virtual environment in dependence on the viewing position such that the virtual environment and the user-interaction region are visually aligned from the user's perspective.

4. An augmented reality system according to claim 1, further comprising a projector arranged to selectively project light into the user-interaction region responsive to commands from the processor.

5. An augmented reality system according to claim 4, wherein the projector is controlled by the processor to selectively omit lighting in one or more portions of the user-interaction region, such that the one or more portions are less visible to the user.

6. An augmented reality system according to claim 1, further comprising a light sensor arranged to provide an ambient light level measurement to the processor, and wherein the processor is further arranged to control an output light intensity value for the display device responsive to the ambient light level measurement.

7. An augmented reality system according to claim 1, wherein the display device comprises at least one of: a transparent organic light emitting diode display; a liquid crystal display; an organic light emitting diode display; a stereoscopic display; an autostereoscopic display; and a volumetric display.

8. An augmented reality system according to claim 1, wherein the display device is arranged to be switchable between a first state in which both the user-interaction region and the virtual environment are visible to the user, and a second state in which only the virtual environment is visible to the user.

9. An augmented reality system according to claim 1, wherein the display device comprises: a display screen arranged to display the virtual scene; and an optical beam splitter positioned to reflect light from the display screen and transmit light from the user-interaction region, such that both the virtual environment and the object are concurrently visible to a viewing user on the beam splitter.

10. An augmented reality system according to claim 9, wherein the display screen comprises a first portion and second portion; and wherein the display device further comprises: a half-silvered mirror positioned to reflect light from the first portion of the display screen; and a full-silvered mirror positioned to reflect light from the second portion of the display and through the half-silvered mirror, such that images shown on the first and second portions appear overlaid but at different depths when viewed by the user.

11. An augmented reality system according to claim 9, wherein the optical beam-splitter comprises a half-silvered mirror.

12. An augmented reality system according to claim 9, wherein the optical beam-splitter is arranged so that the user-interaction region is located between a table-top and the optical beam-splitter.

13. An augmented reality system according to claim 1, wherein the image sensor is a depth camera.

14. A computer-implemented method of direct user-interaction in an augmented reality system, comprising: controlling, using a processor, an at least partially transparent display device to display a virtual environment combined with a view of a user-interaction region, such that both the virtual environment and an object are concurrently visible and visually aligned to a viewing user on the display device; tracking movement of a user-controlled object located in the user-interaction region using an image sensor; generating a virtual representation of the object having a corresponding movement in the virtual environment that is visually aligned with the movement of the object when viewed by a user on the display device; updating the virtual environment on the display device such that the virtual representation interacts with one or more virtual objects in the virtual environment, such that, from the perspective of the user, the movement of the object viewed through the display device directly interacts with the virtual objects in the virtual environment; and at least one of: the at least partially transparent display device comprising a plurality of partitioned sections, selectively controlling the at least partially transparent display device to reduce the light emitted from at least one partition of the display device at a location coincident with a user's hand, such that at least one partition of the display device emits less light than at least one other partition of the display device and a corresponding portion of the user-interaction region is more visible through the at least partially transparent display device than surrounding portions; or selectively controlling the at least partially transparent display device to render black pixels at locations in the virtual environment that are coincident with the user's hands, the black pixels making the user's hands more visible in the user-interaction region.

15. A method according to claim 14, wherein the virtual environment is a physics simulation-based user interface generated by the processor, and the virtual representation interacts with one or more virtual objects in the virtual environment by simulating forces imparted to the one or more virtual objects by the movement of the virtual representation.

16. A method according to claim 14, further comprising the step of the processor rendering the one or more virtual objects such that the virtual objects appear placed on top of the user-controlled object when viewed by the user on the display device, and the step of updating the virtual environment comprises moving the virtual objects in alignment with the movement of the user-controlled object.

17. A method according to claim 14, further comprising the step of receiving, at the processor, an image of the user's face from a tracking sensor, and determining the user's viewing position.

18. A method according to claim 17, wherein the step of controlling the display device comprises using the user's viewing position to visually align the virtual environment with the user-interaction region from the user's perspective.

19. A method according to claim 14, further comprising the step of the processor selectively controlling the display device to reduce the light emitted from at least a portion of the display device, such that a corresponding portion of the user-interaction region is more visible through the display device than surrounding portions.

20. An augmented reality system, comprising: a display screen arranged to display a three-dimensional virtual environment comprising one or more virtual objects; a user-interaction region; a depth camera arranged to capture depth images of a user's hand located within the user-interaction region; a half-silvered mirror positioned to reflect light from the display screen and transmit light from within the user-interaction region, such that both the virtual environment and the user's hand are concurrently visible to the user on a surface of the half-silvered mirror; a processor arranged to: control the display screen to display the virtual environment such that the virtual environment is visually aligned with the user-interaction region on a surface of the half-silvered mirror from the perspective of the user; receive the depth images and track movement of the user's hand in three dimensions in the user-interaction region; generate a virtual representation of the user's hand having a corresponding movement in the virtual environment that is visually aligned with the movement of the user's hand when viewed by the user on the half-silvered mirror; and update the virtual environment on the display screen such that the virtual representation interacts with the one or more virtual objects in the virtual environment, such that, from the perspective of the user, the movement of the user's hand viewed through the half-silvered mirror directly interacts with the virtual objects in the virtual environment; and the display screen comprising a plurality of partitioned sections, the processor being further arranged to selectively control the display device to reduce the light emitted from at least one partition of the display device at a location coincident with a user's hand, such that at least one partition of the display device emits less light than at least one other partition of the display device and a portion of the user-interaction region corresponding to the at least one partition of the display device with reduced light emission is more visible through the display device than surrounding portions.
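Claims 1, 14 and 20 all describe the same masking idea: dim (or render black) the display pixels coincident with the user's hand, so less light is emitted there and the real hand remains visible through the partially transparent display. A minimal sketch of that per-pixel step, assuming a boolean hand mask derived from the depth images; the function name, the frame representation as nested lists of RGB tuples, and the `dim` parameter are all illustrative assumptions, not the patent's implementation:

```python
def mask_hand_pixels(frame, hand_mask, dim=0.0):
    """Return a new frame in which pixels coincident with the user's hand
    are dimmed; dim=0.0 renders them black, so that region of the display
    emits no light and the real hand shows through the transparent display.

    frame     -- 2-D list of (r, g, b) tuples, channel values in 0..255
    hand_mask -- 2-D list of bools, True where the depth camera sees the hand
    dim       -- brightness multiplier applied inside the masked region
    """
    return [
        [
            tuple(int(c * dim) for c in px) if masked else px
            for px, masked in zip(row, mask_row)
        ]
        for row, mask_row in zip(frame, hand_mask)
    ]
```

The partition-based variant in the claims would apply the same reduction per display partition rather than per pixel, dimming only those partitions whose locations coincide with the tracked hand.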
Patents cited by this patent (9)
Ahmed, Syed Nadeem; Kockro, Ralf Alfons. Augmented reality system controlled by probe position.
Chen, Michael; Houde, Stephanie L.; Seidl, Robert H. Method and apparatus for direct manipulation of 3-D objects on computer displays.
Carmel, Ron; DesRosiers, Hugo J. C.; Gomez, Daniel; Kramer, James F.; Tian, Jerry; Tremblay, Marc; Ullrich, Christopher J. System, method and data structure for simulated interaction with graphical objects.
Bass, Robert; Bass, John. Three dimensional optical viewing system.
De La Riviere, Jean-Baptiste; Chartier, Christophe; Hachet, Martin; Bossavit, Benoit; Casiez, Gery. System for colocating a touch screen and a virtual object, and device for manipulating virtual objects implementing such a system.