Interactive augmented reality using a self-propelled device
IPC Classification Information
Country/Type
United States (US) Patent
Status: Granted
International Patent Classification (IPC, 7th edition)
A63F-013/00
G06K-009/32
G06T-011/00
A63F-013/65
A63F-013/327
G06T-007/246
G06T-007/269
Application No.
US-0054636 (2013-10-15)
Registration No.
US-9827487 (2017-11-28)
Inventors / Address
Polo, Fabrizio
Carroll, Jonathan
Castator-Smith, Skylar
Ingram, Ross
Applicant / Address
Sphero, Inc.
Agent / Address
Merchant & Gould P.C.
Citation Information
Times cited (forward citations): 0
Patents cited: 151
Abstract
A method is disclosed for operating a mobile computing device. The method may include establishing a communication link between the mobile computing device and a second computing device. The second computing device may provide a virtual environment for the mobile computing device. Furthermore, the mobile computing device may allow a user to control a self-propelled device, which may be rendered as a virtual entity upon the virtual environment.
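The abstract and representative claims describe a loop in which the mobile device sends control input to the self-propelled vehicle, forwards the vehicle's position to a second computing device, and receives virtual-entity data back for display. The following is a minimal Python sketch of that loop; all class and method names (`RCVehicle`, `VirtualEnvClient`, `control_step`) are hypothetical illustrations, not interfaces from the patent.

```python
from dataclasses import dataclass


@dataclass
class Position:
    x: float
    y: float


class RCVehicle:
    """Stand-in for the self-propelled device; tracks a simulated pose."""

    def __init__(self) -> None:
        self.pos = Position(0.0, 0.0)

    def apply_control(self, dx: float, dy: float) -> Position:
        # Transmitting control information moves the vehicle in the
        # real-world environment (simulated here as a pose update).
        self.pos = Position(self.pos.x + dx, self.pos.y + dy)
        return self.pos


class VirtualEnvClient:
    """Stand-in for the second computing device hosting the virtual environment."""

    def render_entity(self, pos: Position) -> dict:
        # Map the real-world pose onto a virtual-entity description.
        return {"entity": "vehicle-avatar", "x": pos.x, "y": pos.y}


def control_step(vehicle: RCVehicle, env: VirtualEnvClient,
                 user_input: tuple) -> dict:
    """One iteration of the loop described in the abstract:
    send control to the vehicle, forward its position to the second
    device, and receive entity data for the mobile device to display."""
    pos = vehicle.apply_control(*user_input)
    return env.render_entity(pos)
```

For example, `control_step(RCVehicle(), VirtualEnvClient(), (1.0, 2.0))` yields an entity description positioned at the vehicle's new pose, which the mobile device would then display.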
Representative Claims
1. A method for operating a mobile computing device to control a remote controlled vehicle, the method being performed by one or more processors of the mobile computing device and comprising: in response to user inputs on a user interface generated on a display of the mobile computing device, transmitting control information to the remote controlled vehicle to control movement of the remote controlled vehicle in a real-world environment; interfacing with a second computing device that provides, via execution of a program by one or more processors of the second computing device, a virtual environment; transmitting position information, corresponding to the remote controlled vehicle, to the second computing device to cause the second computing device to render a virtual entity upon the virtual environment, the virtual entity representing the remote controlled vehicle moving in the real-world environment; receiving, from the second computing device, data corresponding to the virtual entity within the virtual environment; and displaying, on the display of the mobile computing device, the virtual environment including the virtual entity representing the remote controlled vehicle.

2. The method of claim 1, further comprising: detecting an event relating to the remote controlled vehicle in the real-world environment; determining a virtual event based on the detected event; and incorporating the virtual event into the virtual environment.

3. The method of claim 1, further comprising: accessing saved data associated with the virtual environment stored on the second computing device, the saved data corresponding to previous interactions within the virtual environment; and congruently linking the saved data to a current control session comprising the virtual environment and control of the remote controlled vehicle in the real-world environment.

4. The method of claim 3, wherein the previous interactions and the current control session of the remote controlled vehicle correspond to gameplay, simultaneously controlling the remote controlled vehicle in the real-world environment and the virtual entity within the virtual environment.

5. The method of claim 4, wherein the gameplay is associated with one or more of a minigame, emergent gameplay, or a mobile game.

6. The method of claim 1, wherein the virtual environment displayed on the mobile computing device is linked to a display of the virtual environment through the second computing device.

7. The method of claim 1, wherein the virtual environment corresponds to augmented reality.

8. The method of claim 1, wherein the mobile computing device is one or more of a smart phone, a tablet computer, or a laptop computer.

9. The method of claim 1, wherein interfacing with the second computing device is performed by the mobile computing device.

10. The method of claim 1, wherein the step of interfacing is performed by the second computing device.

11. A system comprising: a remote controlled vehicle; and a mobile computing device comprising one or more processors executing a set of instructions that cause the one or more processors to: in response to user inputs on a user interface generated on a display of the mobile computing device, transmit control information to the remote controlled vehicle to control movement of the remote controlled vehicle in a real-world environment; interface with a second computing device that provides, via execution of a program by one or more processors of the second computing device, a virtual environment; transmit position information, corresponding to the remote controlled vehicle, to the second computing device to cause the second computing device to render a virtual entity upon the virtual environment, the virtual entity representing the remote controlled vehicle moving in the real-world environment; receive, from the second computing device, data corresponding to the virtual entity within the virtual environment; and display, on the display of the mobile computing device, the virtual environment including the virtual entity representing the remote controlled vehicle.

12. The system of claim 11, wherein the executed set of instructions further cause the one or more processors of the mobile computing device to: detect an event associated with the remote controlled vehicle in the real-world environment; determine a virtual event based on the detected event; and incorporate the virtual event in the virtual environment.

13. The system of claim 11, wherein the executed set of instructions further cause the one or more processors of the mobile computing device to: access saved data associated with the virtual environment stored on the second computing device, the saved data corresponding to previous interactions within the virtual environment.

14. The system of claim 13, wherein the executed set of instructions further cause the one or more processors of the mobile computing device to: congruently link the saved data to a current control session comprising the virtual environment and control of the remote controlled vehicle in the real-world environment.

15. The system of claim 14, wherein the previous interactions and the current control session correspond to gameplay within the virtual environment.

16. The system of claim 15, wherein the gameplay is associated with one or more of a minigame, emergent gameplay, or a mobile game.

17. The system of claim 11, wherein the virtual environment corresponds to augmented reality.

18. The system of claim 11, wherein the mobile computing device is one or more of a smart phone, a tablet computer, or a laptop computer.

19. The system of claim 11, wherein the mobile computing device performs one or more operations that enable the mobile computing device to interface with the second computing device.

20. The system of claim 11, wherein the second computing device performs one or more operations to interface with the mobile computing device.

21. A non-transitory computer readable medium storing instructions that, when executed by one or more processors of a mobile computing device, cause the one or more processors to: in response to user inputs on a user interface generated on a display of the mobile computing device, transmit control information to a remote controlled vehicle to control movement of the remote controlled vehicle in a real-world environment; interface with a second computing device that provides, via execution of a program by one or more processors of the second computing device, a virtual environment; transmit position information, corresponding to the remote controlled vehicle, to the second computing device to cause the second computing device to render a virtual entity upon the virtual environment, the virtual entity representing the remote controlled vehicle in the real-world environment; receive, from the second computing device, data corresponding to the virtual entity within the virtual environment; and display, on the display of the mobile computing device, the virtual environment including the virtual entity representing the remote controlled vehicle.

22. The non-transitory computer readable medium of claim 21, wherein the executed instructions further cause the one or more processors to: detect an event relating to the remote controlled vehicle in the real-world environment; determine a virtual event based on the detected event; and incorporate the virtual event into the virtual environment.

23. The non-transitory computer readable medium of claim 21, wherein the executed instructions further cause the one or more processors to: access saved data associated with the virtual environment stored on the second computing device, the saved data corresponding to previous interactions within the virtual environment; and congruently link the saved data to a current control session comprising the virtual environment and control of the remote controlled vehicle in the real-world environment.

24. The non-transitory computer readable medium of claim 21, wherein the executed instructions further cause the one or more processors to: render the virtual environment as augmented reality.
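Claims 2, 12, and 22 share a pattern: detect an event relating to the vehicle in the real-world environment, determine a corresponding virtual event, and incorporate it into the virtual environment. A minimal Python sketch of that pattern follows; the event names and the `EVENT_MAP` / `incorporate_event` identifiers are hypothetical illustrations, not part of the patent.

```python
# Hypothetical mapping from detected real-world events to virtual events.
EVENT_MAP = {
    "collision": "explosion-effect",
    "flip": "bonus-points",
}


def incorporate_event(detected: str, virtual_env: list) -> list:
    """Determine a virtual event from a detected real-world event and
    incorporate it into the virtual environment (modeled as an event list)."""
    virtual_event = EVENT_MAP.get(detected, "no-op")
    virtual_env.append(virtual_event)
    return virtual_env
```

Under this sketch, a detected real-world collision would be incorporated into the virtual environment as an "explosion-effect" virtual event, while unmapped events fall through to a no-op.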
Patents cited by this patent (151)
Chieffo Joseph M. (1023 Yates Ave. Marcus Hook PA 19061), Amusement device.
Garretson, Justin R.; Parker, Eric P.; Gladwell, T. Scott; Rigdon, J. Brian; Oppel, III, Fred J., Apparatus and method for modifying the operation of a robotic vehicle in a real environment, to emulate the operation of the robotic vehicle operating in a mixed reality environment.
Rankin David B. (Lowell MA) Roberts ; Jr. Edgar P. (Winston-Salem NC) Kluttz James W. (Winston-Salem NC), Apparatus and method for tracking the flight of a golf ball.
Rankin David B. (Lowell MA) Roberts ; Jr. Edgar P. (Winston-Salem NC) Kluttz James W. (Winston-Salem NC), Apparatus and method for tracking the flight of a golf ball.
Takayama Kuniharu,JPX ; Nakano Eiji,JPX ; Mori Yoshikazu,JPX ; Takahashi Takayuki,JPX, Apparatus for controlling motion of normal wheeled omni-directional vehicle and method thereof.
Matsuoka,Tsunetaro; Otsuki,Tadashi; Konishi,Tetsuya; Kasuga,Tomoaki; Takemoto,Kunio; Okita,Ayako; Fujita,Yaeko; Ogura,Toshiya, Automatic apparatus, information server, robotic apparatus and commercial transaction method for performing an action based on information.
Osawa, Hiroshi; Hosonuma, Naoyasu, Charging system for mobile robot, method for searching charging station, mobile robot, connector, and electrical connection structure.
McCulloch, Daniel J.; Navratil, Arnulfo Zepeda; Steed, Jonathan T.; Hastings, Ryan L.; Scott, Jason; Mount, Brian J.; Hirzel, Holly A.; Bennett, Darren; Scavezze, Michael J., Controlling a virtual object with a real controller device.
Bakholdin Daniel (14929 Sylvan St. Van Nuys CA 91411) Bosley Robert W. (18104 Hoffman Ave. Cerritos CA 90701) Rosen Harold A. (14629 Hilltree Rd. Santa Monica CA 90402) Grayer William (15720 Ventura , Flywheel rotor with conical hub and methods of manufacture therefor.
Eric Richard Bartsch ; Charles William Fisher ; Paul Amaat France ; James Frederick Kirkpatrick ; Gary Gordon Heaton ; Thomas Charles Hortel ; Arseni Velerevich Radomyselski ; James Randy Stig, Home cleaning robot.
Hoffberg, Steven M; Hoffberg-Borghesani, Linda I, Mobile system, a method of operating mobile system and a non-transitory computer readable medium for a programmable control of a mobile system.
Allen Ross R. (408 Hainline Dr. Belmont CA 94002) Beard David (842 Los Robles Palo Alto CA 94306) Smith Mark T. (726 Pico Ave. San Mateo CA 94403) Tullis Barclay J. (1795 Guinda St. Palo Alto CA 9430, Navigation technique for detecting movement of navigation sensors relative to an object.
Elangovan, Vidya; Cavallaro, Richard H.; White, Marvin S.; Milnes, Kenneth A., Providing virtual inserts using image tracking with camera and position sensors.
Kim, Dong yoon; Oh, Jong koo; Bang, Won chul; Cho, Joon kee; Kang, Kyoung ho; Cho, Sung jung; Choi, Eun sook; Chang, Wook, Remote robot control method using three-dimensional pointing procedure and robot control system using the remote robot control method.
Sawada, Tsutomu; Fujita, Masahiro; Takagi, Tsuyoshi, Robot behavior control based on current and predictive internal, external condition and states with levels of activations.
Nielsen, Curtis W.; Bruemmer, David J.; Walton, Miles C.; Hartley, Robert S.; Gertman, David I.; Kinoshita, Robert A.; Whetten, Jonathan, Robots, systems, and methods for hazard evaluation and visualization.
Matsuoka, Tsunetaro; Otsuki, Tadashi; Konishi, Tetsuya; Kasuga, Tomoaki; Takemoto, Kunio; Okita, Ayako; Fujita, Yaeko; Ogura, Toshiya, System and method for generating an action of an automatic apparatus.
Karlsson, L. Niklas; Pirjanian, Paolo; Goncalves, Luis Filipe Domingues; Di Bernardo, Enrico, Systems and methods for using multiple hypotheses in a visual simultaneous localization and mapping system.
Boyden, Edward S.; Hyde, Roderick A.; Ishikawa, Muriel Y.; Leuthardt, Eric C.; Myhrvold, Nathan P.; Rivet, Dennis J.; Weaver, Thomas Allan; Wood, Jr., Lowell L., Systems for autofluorescent imaging and target ablation.
Niemelä, Esko; Öberg, Pierre; Kjellsson, Jimmy; Strand, Martin; Grönqvist, Åsa; Tasala, Seija, Wireless controller and a method for wireless control of a device mounted on a robot.