Method and system for applying gearing effects to visual tracking
IPC Classification
Country/Type: United States (US) Patent — Granted
International Patent Classification (IPC, 7th ed.):
G09G-005/00
A63F-013/42
A63F-013/213
A63F-013/215
A63F-013/98
A63F-013/20
A63F-013/211
Application number: US-0283131 (2016-09-30)
Registration number: US-10099130 (2018-10-16)
Inventors / Address:
Zalewski, Gary M.
Marks, Richard
Mao, Crusoe Xiaodong
Applicant / Address: Sony Interactive Entertainment America LLC
Agent / Address: Martine Penilla Group, LLP
Citation information:
Times cited: 0
Patents cited: 253
Abstract
A method is provided, including: receiving inertial data from an input device, the inertial data being generated from one or more inertial sensors of the input device; receiving captured image data from an image capture device configured to capture images of an interactive environment in which the input device is disposed, the input device having a light emitting diode (LED) array that generates infrared light; processing the inertial data and the captured image data to determine a movement of the input device in the interactive environment; establishing a gearing that adjusts an amount by which the movement of the input device is mapped to movement of an image that is rendered to a display; changing the gearing to different settings during the movement of the image as the movement of the image is rendered to the display.
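The core of the abstract is the "gearing": an adjustable ratio between the input device's physical movement and the rendered image's movement, which can change to different settings while the image is still moving. A minimal sketch of that idea follows; all function and variable names are illustrative assumptions, not terms from the patent.

```python
def apply_gearing(device_delta, gearing):
    """Map a movement delta of the input device to an on-screen image delta.

    `gearing` is the adjustable amount described in the abstract: 1.0 maps
    device movement one-to-one, larger values amplify it, smaller values
    attenuate it. (Illustrative sketch, not the patent's implementation.)
    """
    return device_delta * gearing


def gearing_schedule(progress):
    """Change the gearing to different settings during the movement.

    `progress` in [0, 1] is the fraction of the gesture completed; here the
    gearing ramps from fine control (0.5) to coarse control (2.0). The ramp
    itself is an assumption for illustration.
    """
    return 0.5 + 1.5 * progress


# Four equal device movements produce unequal image movements because the
# gearing changes while the image movement is being rendered.
image_x = 0.0
for step, device_dx in enumerate([1.0, 1.0, 1.0, 1.0]):
    g = gearing_schedule(step / 3)
    image_x += apply_gearing(device_dx, g)
# image_x accumulates 0.5 + 1.0 + 1.5 + 2.0 = 5.0
```

The same shape works whether the geared quantity is a translation, a rotation, or (as in claim 10) an intermediate variable used to compute the image movement.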
Representative Claims
1. A method, comprising: receiving inertial data from an input device, the inertial data being generated from one or more inertial sensors of the input device; receiving captured image data from an image capture device configured to capture images of an interactive environment in which the input device is disposed, the input device having a light emitting diode (LED) array that generates infrared light; processing the inertial data and the captured image data to determine a movement of the input device in the interactive environment; establishing a gearing that adjusts an amount by which the movement of the input device is mapped to movement of an image that is rendered to a display; changing the gearing to different settings during the movement of the image as the movement of the image is rendered to the display.

2. The method of claim 1, wherein processing the captured image data includes tracking the LED array in the captured image data.

3. The method of claim 2, wherein tracking the LED array includes analyzing a position and deformation of the LED array in the captured image data to determine a position and orientation of the input device in the interactive environment.

4. The method of claim 2, wherein the input device is configured to modulate the LED array to enable the tracking of the LED array in the captured image data.

5. The method of claim 1, wherein changing the gearing to different settings is in accordance with a user selectable setting.

6. The method of claim 1, wherein the inertial sensors include one or more of an accelerometer and a gyroscope.

7. The method of claim 6, wherein processing the inertial data includes analyzing signals from the accelerometer or gyroscope to determine a position or orientation of the input device in the interactive environment.

8. The method of claim 1, wherein the movement of the image is a translational or rotational movement of the image.

9. The method of claim 1, wherein changing the gearing to different settings occurs during the movement of the input device.

10. A method, comprising: receiving inertial data from an input device, the inertial data being generated from one or more inertial sensors of the input device; receiving captured image data from an image capture device configured to capture images of an interactive environment in which the input device is disposed, the input device having a plurality of lights that generate infrared light; processing the inertial data and the captured image data to determine a movement of the input device in the interactive environment, wherein determining the movement of the input device includes tracking a position and orientation of the input device based on the inertial data and the captured image data; establishing a gearing that adjusts an amount by which the movement of the input device is mapped to a variable used to determine movement of an image that is rendered to a display; changing the gearing to different settings during the movement of the image as the movement of the image is rendered to the display and during the movement of the input device.

11. The method of claim 10, wherein processing the captured image data includes analyzing a position and deformation of the plurality of lights in the captured image data to determine the position and orientation of the input device in the interactive environment.

12. The method of claim 10, wherein the input device is configured to modulate the plurality of lights to enable the tracking of the position and orientation of the input device.

13. The method of claim 10, wherein changing the gearing to different settings is in accordance with a user selectable setting.

14. The method of claim 10, wherein the inertial sensors include one or more of an accelerometer and a gyroscope.

15. The method of claim 14, wherein processing the inertial data includes analyzing signals from the accelerometer or gyroscope to determine the position or orientation of the input device in the interactive environment.

16. The method of claim 10, wherein the movement of the image is a translational or rotational movement of the image.

17. A system, comprising: an input device, the input device including: one or more inertial sensors configured to generate inertial data in response to sensed motion of the input device, and a light emitting diode (LED) array that generates infrared light; an image capture device configured to capture images of an interactive environment in which the input device is disposed, the image capture device configured to generate captured image data that is processed with the inertial data to determine a movement of the input device in the interactive environment; wherein a gearing is established that adjusts an amount by which the movement of the input device is mapped to movement of an image that is rendered to a display, the gearing changing to different settings during the movement of the image as the movement of the image is rendered to the display.

18. The system of claim 17, wherein the captured image data is processed to track the LED array in the captured image data, including analyzing a position and deformation of the LED array in the captured image data to determine a position and orientation of the input device in the interactive environment.

19. The system of claim 18, wherein the input device is configured to modulate the LED array to enable the tracking of the LED array in the captured image data.

20. The system of claim 17, wherein the inertial sensors include one or more of an accelerometer and a gyroscope, wherein processing the inertial data includes analyzing signals from the accelerometer or gyroscope to determine a position or orientation of the input device in the interactive environment.
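Claims 1 and 10 both determine the device's movement by processing inertial data together with the optical tracking of the infrared LEDs in the captured image data. The patent does not specify a fusion algorithm; a common approach, sketched here purely as an assumption, is a complementary filter that uses the drift-free optical position to correct the dead-reckoned inertial position.

```python
def integrate_inertial(prev_pos, velocity, accel, dt):
    """Dead-reckon a 1-D position from accelerometer data.

    Double integration of acceleration is responsive but drifts over time,
    which is why a second, absolute measurement is useful. (Illustrative
    sketch; names and the 1-D simplification are assumptions.)
    """
    velocity = velocity + accel * dt
    position = prev_pos + velocity * dt
    return position, velocity


def complementary_filter(optical_pos, inertial_pos, alpha=0.9):
    """Blend the optically tracked position (from locating the LED array in
    the captured image) with the inertially integrated position.

    `alpha` weights the drift-free optical estimate; (1 - alpha) keeps some
    of the smoother, higher-rate inertial estimate. The weighting scheme is
    an assumption, not the patent's method.
    """
    return alpha * optical_pos + (1.0 - alpha) * inertial_pos
```

The fused position from each frame would then feed the gearing stage, which scales the resulting movement before it is applied to the rendered image.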
Patents cited by this patent (253)
Cipolla Roberto (Cambridge GBX) Okamoto Yasukazu (Chiba-ken JPX) Kuno Yoshinori (Osaka-fu JPX), 3D human interface apparatus using motion recognition based on dynamic image processing.
Marks, Richard L.; Mao, Xiadong; Zalewski, Gary M., Computer image and audio processing of intensity and input devices for interfacing with a computer program.
Nobuo Fukushima JP; Tomotaka Muramoto JP; Masayoshi Sekine JP, Display apparatus which detects an observer body part motion in correspondence to a displayed element used to input operation instructions to start a process.
Stoel Leon P. (Sioux Falls SD) Bankers David M. (Sioux Falls SD) Hills Vernon E. (Sioux Falls SD) Plucker Prentice J. (Chanceller SD) Cinco Christopher A. (Sioux Falls SD), Entertainment system and method for controlling connections between terminals and game generators and providing video ga.
Tamura Akihiro (Yawata JPX) Sakaue Shigeo (Moriguchi JPX), Gradation correction device and image sensing device therewith for supplying images with good gradation for both front-l.
Cartabiano Michael C. ; Curran Kenneth J. ; Dick David J. ; Gibbs Douglas R. ; Kirby Morgan H. ; May Richard L. ; Storer William J. A. ; Ullman Adam N., Hand-attachable controller with direction sensing.
Sata Hironori,JPX, Image generating system and information storage medium capable of changing angle of view of virtual camera based on object positional information.
Wallace,Jon K.; Luo,Yun; Dziadula,Robert; Khairallah,Farid, Method and apparatus for determining an occupant's head location in an actuatable occupant restraining system.
Florent Raoul (Valenton FRX) Lelong Pierre (Nogent-Sur-marne FRX), Method and device for processing an image in order to construct from a source image a target image with charge of perspe.
Maes Pattie E. (Somerville MA) Blumberg Bruce M. (Pepperell MA) Darrell Trevor J. (Cambridge MA) Starner Thad E. (Somerville MA) Johnson Michael P. (Cambridge MA) Russell Kenneth B. (Boston MA) Pentl, Method and system for facilitating wireless, full-body, real-time user interaction with a digitally represented visual e.
Matey James R. (Mercerville NJ) Aceti John G. (Cranbury NJ) Pletcher Timothy A. (East Hampton NJ), Method and system for object detection for instrument control.
Wergen, Gerhard; Franz, Klaus, Method for transferring characters especially to a computer and an input device which functions according to this method.
Kobayashi Hiroshi (3-15 Hanakoganei Kodaira-shi ; Tokyo JPX) Machida Haruhiko (10-7 Nakaochiai 4-chome Shinjuki-ku ; Tokyo JPX) Ema Hideaki (Shizuoka JPX) Akedo Jun (Tokyo JPX), Method of measuring the amount of movement of an object having uniformly periodic structure.
Everett ; Jr. Hobart R. (San Diego CA) Gilbreath Gary A. (San Diego CA) Laird Robin T. (San Diego CA), Navigational control system for an autonomous vehicle.
Elko Gary W. (Summit NJ) Sondhi Man M. (Berkeley Heights NJ) West James E. (Plainfield NJ), Noise reduction processing arrangement for microphone arrays.
Lake Royden J. (Armidale AUX) Moore John C. (Armidale AUX) Kowald Errol M. (Armidale AUX) Doerr Annegret (Armidale AUX), Optically readable coded target.
Krueger Myron W. (55 Edith Rd. Vernon CT 06066) Hinrichsen Katrin (81 Willington Oaks Storrs CT 06268) Gionfriddo Thomas S. (81 Willington Oaks Storrs CT 06268), Real time perception of and response to the actions of an unencumbered participant/user.
Mark John G. (Pasadena CA) Tazartes Daniel A. (West Hills CA) Ebner Robert E. (Tarzana CA) Dahlen Neal J. (Freiburg CA DEX) Datta Nibir K. (West Hills CA), Ring laser gyroscope enhanced resolution system.
Yen, Wei; Wright, Ian; Tu, Xiaoyuan; Reynolds, Stuart; Powers, III, William Robert; Musick, Charles; Funge, John; Dobson, Daniel; Bererton, Curt, Self-contained inertial navigation system for interactive control using movable controllers.
Addeo Eric J. (Long Valley NJ) Robbins John D. (Denville NJ) Shtirmer Gennady (Morris Plains NJ), Sound localization system for teleconferencing using self-steering microphone arrays.
Chang Bay-Wei W. ; Fishkin Kenneth P. ; Harrison Beverly L. ; Igarashi Takeo,JPX ; Mackinlay Jock D. ; Want Roy ; Zellweger Polle T., Spinning as a morpheme for a physical manipulatory grammar.
Dengler,John D.; Garci,Erik J.; Cox,Brian C.; Tolman,Kenneth T.; Weber,Hans X.; Hall,Gerard J., System and method for inserting content into an image sequence.
Lyons Damian M., System and method for permitting three-dimensional navigation through a virtual reality environment using camera-based gesture inputs.
Stam, Joseph S.; Bechtel, Jon H.; Reese, Spencer D.; Roberts, John K.; Tonar, William L.; Poe, G. Bruce; Newhouse, Douglas J., System for controlling exterior vehicle lights.
Stam, Joseph S.; Bechtel, Jon H.; Reese, Spencer D.; Roberts, John K.; Tonar, William L.; Poe, G. Bruce; Newhouse, Douglas J., System for controlling exterior vehicle lights.
Stam, Joseph S.; Bechtel, Jon H.; Reese, Spencer D.; Roberts, John K.; Tonar, William L.; Poe, G. Bruce; Newhouse, Douglas J., System for controlling exterior vehicle lights.
Wang John Y. A. (Cambridge MA) Adelson Edward H. (Cambridge MA), System for encoding image data into multiple layers representing regions of coherent motion and associated motion parame.
Freeman William T. ; Leventon Michael E., System for reconstructing the 3-dimensional motions of a human figure from a monocularly-viewed image sequence.
Oishi, Toshimitsu; Okubo, Toru; Domitsu, Hideyuki; Yamano, Tomoya, Video game apparatus, method and recording medium storing program for controlling viewpoint movement of simulated camera in video game.
Bouton Frank M. (Beaverton OR) Kaminsky Stephen T. (Salem OR), Video pinball machine controller having an optical accelerometer for detecting slide and tilt.
Fishkin Kenneth P. ; Goldberg David ; Gujar Anuj Uday ; Harrison Beverly L. ; Mynatt Elizabeth D. ; Stone Maureen C. ; Want Roy, Zoomorphic computer user interface.