Orienting a user interface of a controller for operating a self-propelled device
IPC Classification
Country/Type
United States (US) Patent
Granted
International Patent Classification (IPC, 7th edition)
G05D-001/00
A63H-030/04
A63H-033/00
G05D-001/02
B62D-061/00
G05D-001/08
Application Number
US-0040331
(2016-02-10)
Registration Number
US-9841758
(2017-12-12)
Inventor / Address
Bernstein, Ian H.
Wilson, Adam
Smith, Brian Keith
Applicant / Address
SPHERO, INC.
Citation Information
Cited by: 0
Patents cited: 150
Abstract
A self-propelled device determines an orientation for its movement based on a pre-determined reference frame. A controller device is operable by a user to control the self-propelled device. The controller device includes a user interface for controlling at least a direction of movement of the self-propelled device. The self-propelled device is configured to signal the controller device information that indicates the orientation of the self-propelled device. The controller device is configured to orient the user interface, based on the information signaled from the self-propelled device, to reflect the orientation of the self-propelled device.
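The orientation behavior the abstract describes can be illustrated with a short sketch. The function name, the 2-D heading convention, and the sign of the rotation below are illustrative assumptions, not taken from the patent: the idea is to rotate a joystick vector by the heading the self-propelled device reports, so the on-screen controls stay aligned with the device's actual orientation.

```python
import math

def rotate_input(x: float, y: float, device_heading_deg: float) -> tuple[float, float]:
    """Rotate a joystick vector (x, y) by the heading the device reports,
    so the controller's "up" tracks the device's orientation. The
    counter-clockwise sign convention is an assumption for illustration."""
    theta = math.radians(device_heading_deg)
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    # Standard 2-D rotation of the input vector by the reported heading.
    return (x * cos_t - y * sin_t, x * sin_t + y * cos_t)
```

With a heading of 0 the input passes through unchanged; as the device spins, the same on-screen gesture is remapped so it still drives the device in the direction the user sees.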
Representative Claims
1. A controller device for operating a self-propelled device, the controller device comprising: a touch-sensitive display; one or more processors; and one or more memory resources storing instructions that, when executed by the one or more processors, cause the one or more processors to: generate a user interface on the touch-sensitive display, the user interface comprising virtual controls to enable a user to remotely operate the self-propelled device; using a camera, detect the self-propelled device in image data captured by the camera; based on a location of the self-propelled device in a real-world environment, generate a virtual representation of the self-propelled device in a virtual environment on the user interface; receive one or more user interactions with the virtual controls, the one or more user interactions to maneuver the self-propelled device; and transmit one or more commands to the self-propelled device, the one or more commands to maneuver the self-propelled device in accordance with the one or more user interactions; wherein the executed instructions cause the one or more processors to generate the virtual representation of the self-propelled device in the virtual environment to reflect user control of the self-propelled device in the real-world environment, and wherein the executed instructions cause the one or more processors to generate the virtual representation of the self-propelled device in the virtual environment to reflect user control of the self-propelled device in the real-world environment in accordance with execution of a gaming application that provides a gaming environment.
2. The controller device of claim 1, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: directionally calibrate the user interface based on an orientation of the self-propelled device in relation to an initial frame of reference.
3. The controller device of claim 1, wherein the executed instructions cause the one or more processors to detect the self-propelled device in the image data as a fiducial marker for the virtual representation of the self-propelled device.
4. The controller device of claim 1, wherein the self-propelled device comprises a remotely operated aircraft.
5. The controller device of claim 1, wherein the executed instructions cause the one or more processors to detect the self-propelled device in the image data based on a pattern of light emitted by the self-propelled device.
6. The controller device of claim 1, wherein the gaming application corresponds to one of a plurality of gaming applications, executable by the controller device, and each providing a unique gaming environment.
7. A computer-implemented method for operating a self-propelled device, the method performed by one or more processors of a controller device and comprising: generating a virtual user interface on a touch-sensitive display of the controller device, the user interface comprising virtual controls enabling a user to remotely operate the self-propelled device; using a camera, detecting the self-propelled device in image data captured by the camera; based on a location of the self-propelled device in a real-world environment, generating a virtual representation of the self-propelled device in a virtual environment on the user interface; receiving one or more user interactions with the virtual controls, the one or more user interactions to maneuver the self-propelled device; and transmitting one or more commands to the self-propelled device, the one or more commands to maneuver the self-propelled device in accordance with the one or more user interactions; wherein the one or more processors generate the virtual representation of the self-propelled device in the virtual environment to reflect user control of the self-propelled device in the real-world environment, and wherein the one or more processors generate the virtual representation of the self-propelled device in the virtual environment to reflect user control of the self-propelled device in the real-world environment in accordance with execution of a gaming application that provides a gaming environment.
8. The method of claim 7, further comprising: directionally calibrating the user interface based on an orientation of the self-propelled device in relation to an initial frame of reference.
9. The method of claim 7, wherein the one or more processors detect the self-propelled device in the image data as a fiducial marker for the virtual representation of the self-propelled device.
10. The method of claim 7, wherein the self-propelled device comprises a remotely operated aircraft.
11. The method of claim 7, wherein the one or more processors detect the self-propelled device in the image data based on a pattern of light emitted by the self-propelled device.
12. The method of claim 7, wherein the gaming application corresponds to one of a plurality of gaming applications, executable by the controller device, and each providing a unique gaming environment.
13. A non-transitory computer readable medium storing instructions for operating a self-propelled device, wherein the instructions, when executed by one or more processors of a controller device, cause the one or more processors to: generate a user interface on a touch-sensitive display of the controller device, the user interface comprising virtual controls enabling a user to remotely operate the self-propelled device; using a camera, detect the self-propelled device in image data captured by the camera; based on a location of the self-propelled device in a real-world environment, generate a virtual representation of the self-propelled device in a virtual environment on the user interface; receive one or more user interactions with the virtual controls, the one or more user interactions to maneuver the self-propelled device; and transmit one or more commands to the self-propelled device, the one or more commands to maneuver the self-propelled device in accordance with the one or more user interactions; wherein the executed instructions cause the one or more processors to generate the virtual representation of the self-propelled device in the virtual environment to reflect user control of the self-propelled device in the real-world environment, and wherein the executed instructions cause the one or more processors to generate the virtual representation of the self-propelled device in the virtual environment to reflect user control of the self-propelled device in the real-world environment in accordance with execution of a gaming application that provides a gaming environment.
14. The non-transitory computer readable medium of claim 13, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: directionally calibrate the user interface based on an orientation of the self-propelled device in relation to an initial frame of reference.
15. The non-transitory computer readable medium of claim 13, wherein the executed instructions cause the one or more processors to detect the self-propelled device in the image data as a fiducial marker for the virtual representation of the self-propelled device.
16. The non-transitory computer readable medium of claim 15, wherein the self-propelled device comprises a remotely operated aircraft.
17. The non-transitory computer readable medium of claim 13, wherein the executed instructions cause the one or more processors to detect the self-propelled device in the image data based on a pattern of light emitted by the self-propelled device.
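Claims 5, 11, and 17 recite detecting the self-propelled device from a pattern of light it emits. A minimal sketch of that idea, assuming the pattern is a known on/off blink sequence sampled once per video frame; the helper name, the brightness threshold, and the per-frame sampling are hypothetical choices, not details from the patent:

```python
def matches_blink_pattern(brightness_samples: list[float],
                          pattern: list[int],
                          threshold: float = 0.5) -> bool:
    """Decide whether a candidate image region's brightness over successive
    frames matches the device's known on/off blink pattern. One brightness
    sample per frame; `threshold` separates "LED on" from "LED off"."""
    if len(brightness_samples) < len(pattern):
        return False  # not enough frames observed yet
    # Threshold the most recent frames and compare against the pattern.
    observed = [s > threshold for s in brightness_samples[-len(pattern):]]
    return observed == [bool(b) for b in pattern]
```

A tracker would run this over candidate bright regions each frame; only the region whose brightness history matches the device's advertised blink sequence is accepted as the device, which makes detection robust to other light sources in the scene.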
Patents Cited by This Patent (150)
Chieffo Joseph M. (1023 Yates Ave. Marcus Hook PA 19061), Amusement device.
Garretson, Justin R.; Parker, Eric P.; Gladwell, T. Scott; Rigdon, J. Brian; Oppel, III, Fred J., Apparatus and method for modifying the operation of a robotic vehicle in a real environment, to emulate the operation of the robotic vehicle operating in a mixed reality environment.
Rankin, David B. (Lowell MA); Roberts, Jr., Edgar P. (Winston-Salem NC); Kluttz, James W. (Winston-Salem NC), Apparatus and method for tracking the flight of a golf ball.
Takayama Kuniharu,JPX ; Nakano Eiji,JPX ; Mori Yoshikazu,JPX ; Takahashi Takayuki,JPX, Apparatus for controlling motion of normal wheeled omni-directional vehicle and method thereof.
Matsuoka,Tsunetaro; Otsuki,Tadashi; Konishi,Tetsuya; Kasuga,Tomoaki; Takemoto,Kunio; Okita,Ayako; Fujita,Yaeko; Ogura,Toshiya, Automatic apparatus, information server, robotic apparatus and commercial transaction method for performing an action based on information.
Osawa, Hiroshi; Hosonuma, Naoyasu, Charging system for mobile robot, method for searching charging station, mobile robot, connector, and electrical connection structure.
McCulloch, Daniel J.; Navratil, Arnulfo Zepeda; Steed, Jonathan T.; Hastings, Ryan L.; Scott, Jason; Mount, Brian J.; Hirzel, Holly A.; Bennett, Darren; Scavezze, Michael J., Controlling a virtual object with a real controller device.
Bakholdin Daniel (14929 Sylvan St. Van Nuys CA 91411) Bosley Robert W. (18104 Hoffman Ave. Cerritos CA 90701) Rosen Harold A. (14629 Hilltree Rd. Santa Monica CA 90402) Grayer William (15720 Ventura , Flywheel rotor with conical hub and methods of manufacture therefor.
Eric Richard Bartsch ; Charles William Fisher ; Paul Amaat France ; James Frederick Kirkpatrick ; Gary Gordon Heaton ; Thomas Charles Hortel ; Arseni Velerevich Radomyselski ; James Randy Stig, Home cleaning robot.
Hoffberg, Steven M; Hoffberg-Borghesani, Linda I, Mobile system, a method of operating mobile system and a non-transitory computer readable medium for a programmable control of a mobile system.
Allen Ross R. (408 Hainline Dr. Belmont CA 94002) Beard David (842 Los Robles Palo Alto CA 94306) Smith Mark T. (726 Pico Ave. San Mateo CA 94403) Tullis Barclay J. (1795 Guinda St. Palo Alto CA 9430, Navigation technique for detecting movement of navigation sensors relative to an object.
Kim, Dong yoon; Oh, Jong koo; Bang, Won chul; Cho, Joon kee; Kang, Kyoung ho; Cho, Sung jung; Choi, Eun sook; Chang, Wook, Remote robot control method using three-dimensional pointing procedure and robot control system using the remote robot control method.
Sawada, Tsutomu; Fujita, Masahiro; Takagi, Tsuyoshi, Robot behavior control based on current and predictive internal, external condition and states with levels of activations.
Nielsen, Curtis W.; Bruemmer, David J.; Walton, Miles C.; Hartley, Robert S.; Gertman, David I.; Kinoshita, Robert A.; Whetten, Jonathan, Robots, systems, and methods for hazard evaluation and visualization.
Matsuoka, Tsunetaro; Otsuki, Tadashi; Konishi, Tetsuya; Kasuga, Tomoaki; Takemoto, Kunio; Okita, Ayako; Fujita, Yaeko; Ogura, Toshiya, System and method for generating an action of an automatic apparatus.
Karlsson, L. Niklas; Pirjanian, Paolo; Goncalves, Luis Filipe Domingues; Di Bernardo, Enrico, Systems and methods for using multiple hypotheses in a visual simultaneous localization and mapping system.
Boyden, Edward S.; Hyde, Roderick A.; Ishikawa, Muriel Y.; Leuthardt, Eric C.; Myhrvold, Nathan P.; Rivet, Dennis J.; Weaver, Thomas Allan; Wood, Jr., Lowell L., Systems for autofluorescent imaging and target ablation.
Niemelä, Esko; Öberg, Pierre; Kjellsson, Jimmy; Strand, Martin; Grönqvist, Åsa; Tasala, Seija, Wireless controller and a method for wireless control of a device mounted on a robot.