In embodiments of an auto-stereoscopic augmented reality display, the display device is implemented with an imaging structure that includes a waveguide for see-through viewing of an environment. The waveguide also transmits light of a virtual image that is generated as a near-display object to appear at a distance in the environment. The imaging structure includes switchable diffractive elements that are integrated in the waveguide and configured in display zones. The switchable diffractive elements are switchable to independently activate the display zones effective to correct for an accurate stereopsis view of the virtual image that appears at the distance in the environment.
Representative Claims
1. An imaging structure implemented in a display device, the imaging structure comprising:
a waveguide configured for see-through viewing of an environment, the waveguide further configured to transmit light of a virtual image that is generated as a near-display object to appear at a distance in the environment when the environment is viewed through the waveguide;
one or more sensors configured to provide reference data related to at least a position and an orientation of the imaging structure in the environment with respect to a real object in the environment; and
switchable diffractive elements integrated in the waveguide and configured in display zones of the display device, the display zones including vector adjustments, based in part on the reference data, to account for the position and the orientation of the imaging structure and enable the virtual image that appears at the distance in the environment to be generated with an accurate viewing angle relative to a viewing angle of the real object in the environment, the switchable diffractive elements switchable to independently activate the display zones to correct for an accurate stereopsis view of the virtual image that appears at the distance in the environment,
wherein: one or more first display zones can be activated to provide a representation of the virtual image for a right eye of a user based on tracked pupil positions of the user; one or more second display zones can be activated to provide a different representation of the virtual image for a left eye of the user based on the tracked pupil positions of the user; and the one or more first display zones and the one or more second display zones are determined by calculating a ray-trace bisector for each of one or more tiles of the display device relative to a current bisector eye position.
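The zone-selection step recited in claim 1 — calculating a ray-trace bisector for each display tile relative to a current bisector eye position — can be pictured with a minimal sketch. This is only one plausible reading of the claim language, not the patent's actual method: the names (`Vec2`, `assign_zones`) and the angular-proximity comparison are assumptions introduced for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Vec2:
    x: float
    y: float

def ray_angle(src: Vec2, dst: Vec2) -> float:
    """Angle (radians) of the ray traced from src toward dst."""
    return math.atan2(dst.y - src.y, dst.x - src.x)

def assign_zones(tile_centers, left_pupil, right_pupil):
    """Route each display tile to a left-eye or right-eye zone by comparing
    the ray to each tracked pupil against the tile's ray-trace bisector
    (here taken as the ray toward the midpoint between the two pupils)."""
    bisector_eye = Vec2((left_pupil.x + right_pupil.x) / 2.0,
                        (left_pupil.y + right_pupil.y) / 2.0)
    left_zone, right_zone = [], []
    for i, tile in enumerate(tile_centers):
        ref = ray_angle(tile, bisector_eye)  # ray-trace bisector for this tile
        d_left = abs(ray_angle(tile, left_pupil) - ref)
        d_right = abs(ray_angle(tile, right_pupil) - ref)
        (left_zone if d_left < d_right else right_zone).append(i)
    return left_zone, right_zone
```

Each tile lands in exactly one zone, so the two zone lists partition the display, which is the property the claim's independent activation of first and second display zones relies on.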
2. An imaging structure as recited in claim 1, further comprising an element drive circuit that is controllable to selectively activate the switchable diffractive elements in respective display zones to project the virtual image for display.

3. An imaging structure as recited in claim 1, wherein the switchable diffractive elements in a display zone are configured for activation based on an eye distance of the user from the imaging structure and viewing angles of the right eye and the left eye to a center of the imaging structure.

4. An imaging structure as recited in claim 1, wherein the switchable diffractive elements are configured in sets of stacked elements, and each switchable diffractive element in a set of stacked elements is configured to diffract the light of the virtual image in a different field of view.

5. An imaging structure as recited in claim 4, wherein the different fields of view projected by each of the switchable diffractive elements in the set of stacked elements combine for a sequential field of view that spans an activated display zone.

6. An imaging structure as recited in claim 1, wherein the switchable diffractive elements comprise Switchable Bragg Gratings.
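Claims 4 and 5 describe stacked switchable elements whose individual fields of view combine into a sequential field of view spanning an activated zone. One way to picture that combination is as interval merging over each element's angular coverage — purely illustrative, since representing each grating's field of view as a (min, max) degree interval is an assumption not stated in the patent.

```python
def sequential_fov(element_fovs):
    """Merge per-element field-of-view intervals (min_deg, max_deg) into
    the contiguous span(s) covered when the stacked set is cycled through."""
    merged = []
    for lo, hi in sorted(element_fovs):
        if merged and lo <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], hi)  # extend the current span
        else:
            merged.append([lo, hi])                 # start a new span
    return [tuple(iv) for iv in merged]
```

Three stacked elements covering adjacent 20-degree sectors, for example, merge into a single 60-degree span, matching the claim's notion of fields of view that "combine for a sequential field of view."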
7. A computing device, comprising:
a see-through display device configured as an auto-stereoscopic augmented reality display to display a virtual image as a near-display object that appears at a distance in an environment that is viewable through the see-through display device;
one or more sensors configured to provide reference data related to at least a position and an orientation of the see-through display device in the environment with respect to a real object in the environment; and
a processing system to implement an imaging controller that is configured to control activation of switchable diffractive elements configured in display zones of the see-through display device, the display zones of the see-through display device including vector adjustments, based in part on the reference data, to account for the position and the orientation of the see-through display device and enable the virtual image that appears at the distance in the environment to be generated with an accurate viewing angle relative to a viewing angle of the real object in the environment, and the display zones independently controllable to correct for an accurate stereopsis view of the virtual image that appears at the distance in the environment, the see-through display device configured to activate one or more first display zones to display a representation of the virtual image for a right eye of a user based on tracked pupil positions of the user, and activate one or more second display zones to display a different representation of the virtual image for a left eye of the user based on the tracked pupil positions of the user,
wherein the one or more first display zones and the one or more second display zones are determined by calculating a ray-trace bisector for each of one or more tiles of the see-through display device relative to a current bisector eye position.
8. A computing device as recited in claim 7, further comprising: a camera configured to capture digital images of the left and right eyes of the user of the computing device, and wherein pupil positions of the left and right eyes are tracked based on the digital images of the left and right eyes of the user.

9. A computing device as recited in claim 8, wherein a distance from the left and right eyes to the see-through display device is determined and viewing angles of the left and right eyes to a center of the see-through display device are determined.

10. A computing device as recited in claim 9, wherein the imaging controller is configured to control activation of the switchable diffractive elements in a display zone based on the pupil positions of the left and right eyes, the distance from the left and right eyes to the see-through display device, and the viewing angles of the left and right eyes to the center of the see-through display device.

11. A computing device as recited in claim 7, further comprising an element drive circuit configured to selectively activate the switchable diffractive elements in the display zones of the see-through display device based on imaging controller inputs.

12. A computing device as recited in claim 7, wherein the switchable diffractive elements are configured in sets of stacked elements integrated in the see-through display device, and each switchable diffractive element in a set of stacked elements is configured to diffract light of the virtual image in a different field of view.

13. A computing device as recited in claim 12, wherein the different fields of view projected by each of the switchable diffractive elements in the set of stacked elements combine for a sequential field of view that spans an activated display zone.

14. A computing device as recited in claim 7, wherein the switchable diffractive elements comprise Switchable Bragg Gratings.
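Claims 8 through 10 derive, from camera-tracked pupil positions, each eye's distance to the see-through display and its viewing angle to the display center. A minimal sketch of that geometry, assuming a display-centered coordinate frame with +z as the display normal (the frame and the function name `eye_metrics` are assumptions, not taken from the patent):

```python
import math

def eye_metrics(pupil, display_center=(0.0, 0.0, 0.0)):
    """Return (distance, viewing angle in degrees) from a tracked pupil
    position to the display center, measured off the display normal (+z)."""
    dx, dy, dz = (pupil[i] - display_center[i] for i in range(3))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    angle = math.degrees(math.acos(abs(dz) / dist))  # 0 deg = looking straight on
    return dist, angle
```

An imaging controller in the sense of claim 10 could then feed these per-eye distances and angles, together with the pupil positions themselves, into the zone-activation decision.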
15. A method, comprising:
generating a virtual image for display on a see-through display device;
displaying the virtual image as a near-display object that appears at a distance in an environment that is viewable through the see-through display device; and
controlling activation of switchable diffractive elements configured in display zones of the see-through display device, the display zones independently controllable to correct for an accurate stereopsis view of the virtual image that appears at the distance in the environment, the controlling activation further comprising: tracking pupil positions of left and right eyes of a user; and controlling at least one of the display zones to be switched on to provide a representation of the virtual image for the right eye of the user based on the pupil positions and controlling at least one other of the display zones to be switched on to provide a different representation of the virtual image for the left eye of the user based on the pupil positions,
wherein the at least one of the display zones and the at least one other of the display zones are determined by calculating a ray-trace bisector for each of one or more tiles of the see-through display device relative to a current bisector eye position.

16. A method as recited in claim 15, wherein the tracking pupil positions of the left and right eyes of the user is based on digital images that capture user eye position, and wherein the method further comprises: determining a distance from the left and right eyes to the see-through display device; and determining viewing angles of the left and right eyes to a center of the see-through display device.
17. A method as recited in claim 16, wherein activation of the switchable diffractive elements in a display zone is controlled based on the pupil positions of the left and right eyes, the distance from the left and right eyes to the see-through display device, and the viewing angles of the left and right eyes to the center of the see-through display device.

18. A method as recited in claim 15, further comprising: generating a sequential field of view that spans an activated display zone, the sequential field of view generated from a combination of different fields of view that are each projected by respective switchable diffractive elements in sets of stacked elements.

19. A method as recited in claim 15, wherein the controlling activation further comprises controlling at least one of the display zones to be switched off based on the pupil positions.