An electronic device (100) includes a first processor (802) and a second processor (804) coupled to the first processor. The first processor (802) is to receive image data from a first imaging camera (114, 116) and to determine two-dimensional (2D) spatial feature data representing one or more 2D spatial features identified from the image data. The second processor (804) is to determine three-dimensional (3D) spatial feature data representing one or more 3D spatial features identified based on the 2D spatial feature data. Further, the first processor (802) can initiate detection of one or more 2D spatial features from a portion of an image frame prior to receiving the entire image frame.
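To make the division of labor concrete, here is a minimal sketch, assuming the two processors can be modeled as a producer thread (the first processor, detecting 2D features on each portion of a frame as it arrives) and a consumer thread (the second processor, lifting them to 3D). All function and variable names are illustrative stand-ins, not the patent's terminology, and the trivial detectors below are not the patent's algorithms.

```python
import queue
import threading

def detect_2d_features(portion):
    # Stand-in 2D detector: report (row, col) of bright pixels in this portion.
    rows, offset = portion
    return [(offset + r, c) for r, row in enumerate(rows)
            for c, px in enumerate(row) if px > 200]

def lift_to_3d(feats_2d, depth=1.0):
    # Stand-in 3D lift: pair each 2D feature with an assumed depth value.
    return [(r, c, depth) for r, c in feats_2d]

def first_processor(portions, out_q):
    # Process each image-data portion as it arrives, before the full frame
    # is available, and stream the resulting 2D feature lists downstream.
    for portion in portions:
        out_q.put(detect_2d_features(portion))
    out_q.put(None)  # end-of-frame marker

def second_processor(in_q):
    # Consume streamed 2D feature lists and fuse them into 3D features.
    feats_3d = []
    while (feats := in_q.get()) is not None:
        feats_3d.extend(lift_to_3d(feats))
    return feats_3d

if __name__ == "__main__":
    frame = [[0] * 8 for _ in range(8)]
    frame[2][3] = frame[6][5] = 255                           # two synthetic "features"
    portions = [(frame[i:i + 2], i) for i in range(0, 8, 2)]  # 2-row portions
    q = queue.Queue()
    t = threading.Thread(target=first_processor, args=(portions, q))
    t.start()
    print(second_processor(q))                                # -> [(2, 3, 1.0), (6, 5, 1.0)]
    t.join()
```

The point of the split is latency: 2D detection on the first portion starts while later portions are still in flight, so the second processor receives usable feature data before the frame readout finishes.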
Representative Claims
1. An electronic device comprising: a first processor to receive image data representing an image frame from a first imaging camera and to determine two-dimensional (2D) spatial feature data representing one or more 2D spatial features identified from the image data, wherein the first processor is to process a portion of the image data for identification of 2D spatial features while a next portion of the image data is being received from the first imaging camera; and a second processor, coupled to the first processor, to determine three-dimensional (3D) spatial feature data representing one or more 3D spatial features identified based on the 2D spatial feature data.

2. The electronic device of claim 1, further comprising: the first imaging camera, disposed at a first surface of the electronic device, and having a first field of view; and a second imaging camera, disposed at the first surface of the electronic device, and having a second field of view narrower than the first field of view.

3. The electronic device of claim 2, further comprising: a third imaging camera, disposed at a second surface of the electronic device, and having a third field of view greater than the second field of view; and wherein the first processor is to determine the 2D spatial feature data further based on one or more 2D spatial features identified from image data captured by the third imaging camera.

4. The electronic device of claim 1, further comprising: a depth sensor to capture depth data; and wherein the second processor is to determine the 3D spatial feature data further based on the depth data.

5. The electronic device of claim 4, wherein: the depth sensor includes a modulated light projector; and the depth data includes image data captured by the first imaging camera and representing a reflection of a modulated light pattern projected by the modulated light projector.

6. The electronic device of claim 1, further comprising: a sensor, coupled to the second processor, to provide non-image sensor data; and wherein the second processor is to determine the 3D spatial feature data further based on the non-image sensor data.

7. The electronic device of claim 6, wherein, for each image frame: the first processor is to capture at least one sensor state of the sensor; and the first processor is to determine a 2D spatial feature list of 2D spatial features identified in the image frame and to send the 2D spatial feature list and a representation of the at least one sensor state to the second processor.

8. The electronic device of claim 6, wherein the sensor comprises at least one selected from a group consisting of: an accelerometer; a gyroscope; an ambient light sensor; a magnetometer; a gravity gradiometer; a wireless cellular interface; a wireless local area network interface; a wired network interface; a near field communications interface; a global positioning system interface; a microphone; and a keypad.
9. A method comprising: receiving, at a first processor of an electronic device, a first image data portion from a first imaging camera of the electronic device, the first image data portion representing a first portion of a first image frame captured by the first imaging camera; receiving, at the first processor, a second image data portion from the first imaging camera after receiving the first image data portion, the second image data portion representing a second portion of the first image frame; determining, at the first processor, a first set of one or more two-dimensional (2D) spatial features from the first image data portion prior to receiving the second image data portion; and determining, at a second processor of the electronic device, a set of one or more three-dimensional (3D) spatial features using the first set of one or more 2D spatial features.

10. The method of claim 9, further comprising: receiving, at the first processor, a third image data portion from a second imaging camera of the electronic device, the third image data portion representing a portion of a second image frame captured by the second imaging camera; determining, at the first processor, a second set of one or more 2D spatial features from the third image data portion; and wherein determining the set of one or more 3D spatial features includes determining the set of one or more 3D spatial features based on correlations between the first set of one or more 2D spatial features and the second set of one or more 2D spatial features.

11. The method of claim 10, further comprising: aligning image data captured by the first imaging camera and image data captured by the second imaging camera to generate a combined image frame; and displaying the combined image frame at the electronic device.

12. The method of claim 9, further comprising: receiving, at the first processor, depth data captured by a depth sensor of the electronic device; and wherein determining the set of one or more 3D spatial features includes determining the set of one or more 3D spatial features further based on the depth data.

13. The method of claim 9, further comprising: determining, at the first processor, sensor data representative of a sensor state of at least one non-imaging sensor of the electronic device concurrent with the capture of the first image data; and wherein determining the set of one or more 3D spatial features includes determining the set of one or more 3D spatial features further based on the sensor data.

14. A method comprising: receiving, at a first processor of an electronic device, a sequence of image data portions for a first image frame captured by a first imaging camera of the electronic device, each image data portion representing a corresponding portion of the first image frame; and determining, at the first processor, a set of one or more two-dimensional (2D) spatial features for each image data portion of the sequence of image data portions, and sending first 2D spatial feature data representative of the set of one or more 2D spatial features to a second processor of the electronic device while continuing to receive at the first processor the next image data portion of the sequence of image data portions.

15. The method of claim 14, further comprising: determining, at the second processor, a first set of one or more three-dimensional (3D) spatial features based on the first 2D spatial feature data.
16. The method of claim 15, further comprising: receiving, at the first processor, depth data captured by a depth sensor of the electronic device; and wherein determining the first set of one or more 3D spatial features includes determining the first set of one or more 3D spatial features further based on the depth data.

17. The method of claim 15, further comprising: receiving sensor data representative of a sensor state of at least one non-imaging sensor of the electronic device concurrent with receiving the sequence of image data portions; and wherein determining the first set of one or more 3D spatial features includes determining the first set of one or more 3D spatial features further based on the sensor data.

18. The method of claim 17, wherein: the non-imaging sensor is a gyroscope; and wherein determining the first set of one or more 3D spatial features includes determining the first set of one or more 3D spatial features further based on an orientation reading from the gyroscope.

19. The method of claim 14, wherein: the first imaging camera includes a rolling shutter imaging camera having a plurality of rows of pixel sensors; receiving the sequence of image data portions includes receiving a row-by-row stream of image data captured by the rolling shutter imaging camera; and each image data portion comprises image data of a corresponding set of one or more rows of the rolling shutter imaging camera.

20. The method of claim 14, further comprising: receiving, at the first processor, image data representing a second image frame captured by a second imaging camera of the electronic device; and determining, at the first processor, a second set of one or more 2D spatial features for the second image frame and streaming second 2D spatial feature data representative of the second set of one or more 2D spatial features to the second processor.

21. The method of claim 20, further comprising: determining, at the second processor, a second set of one or more three-dimensional (3D) spatial features based on the second 2D spatial feature data; receiving, at the first processor, depth data captured by a depth sensor of the electronic device; and wherein determining the second set of one or more 3D spatial features includes determining the second set of one or more 3D spatial features further based on the depth data.

22. The electronic device of claim 1, wherein the second processor further is to: initiate determination of at least one of a position and an orientation of the electronic device using at least a portion of the 3D spatial feature data prior to the first processor receiving a last portion of the image data from the first imaging camera.
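Claim 7 pairs each frame's 2D spatial feature list with a captured sensor state before the hand-off to the second processor. A hedged sketch of such a per-frame packet, with invented field names (the patent does not specify a data layout):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class FramePacket:
    # Hypothetical hand-off record: one per image frame (field names invented).
    frame_id: int
    features_2d: List[Tuple[int, int]]                # 2D spatial feature list
    sensor_state: dict = field(default_factory=dict)  # captured sensor reading(s)

def build_packet(frame_id, features_2d, read_gyro):
    # Capture at least one sensor state alongside the frame's 2D feature list,
    # so the second processor can fuse both (claims 6-7).
    return FramePacket(frame_id, features_2d, {"gyro": read_gyro()})

packet = build_packet(0, [(2, 3), (6, 5)], lambda: (0.01, -0.02, 0.00))
print(packet)
```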
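Claim 10 derives 3D features from correlations between the 2D feature sets seen by two cameras. Assuming a rectified stereo pair, the standard disparity relation depth = focal_length × baseline / disparity is one plausible way to realize that step; the focal length and baseline below are illustrative assumptions, not values from the patent.

```python
FOCAL_PX = 500.0   # focal length in pixels (assumed)
BASELINE_M = 0.06  # camera separation in meters (assumed)

def triangulate(feats_left, feats_right):
    # Correlate features row by row (rectified stereo puts matches on the
    # same row), then convert disparity to depth.
    right_by_row = {r: c for r, c in feats_right}
    feats_3d = []
    for r, c_left in feats_left:
        c_right = right_by_row.get(r)
        if c_right is None or c_left <= c_right:
            continue                        # no correlated feature on this row
        disparity = c_left - c_right
        z = FOCAL_PX * BASELINE_M / disparity
        feats_3d.append((r, c_left, z))     # (row, col, depth in meters)
    return feats_3d

print(triangulate([(2, 40), (6, 55)], [(2, 30), (6, 45)]))
# disparity 10 px -> depth = 500 * 0.06 / 10 = 3.0 m for both features
```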
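Claim 19 shows how the image data portions arise in practice: a rolling-shutter sensor delivers one row at a time, and rows are grouped into portions, each handed to 2D feature detection as soon as it completes while later rows are still streaming in. A small sketch under those assumptions:

```python
def rows_to_portions(row_stream, rows_per_portion=4):
    # Group a row-by-row rolling-shutter readout into image-data portions;
    # each yielded portion can be processed before the frame finishes.
    buffer, offset = [], 0
    for row in row_stream:
        buffer.append(row)
        if len(buffer) == rows_per_portion:
            yield buffer, offset            # a complete image-data portion
            offset += rows_per_portion
            buffer = []
    if buffer:                              # trailing partial portion
        yield buffer, offset

# Usage: ten synthetic rows produce portions of 4, 4, then 2 rows.
for rows, offset in rows_to_portions([[0] * 8 for _ in range(10)]):
    print(f"portion starting at row {offset}: {len(rows)} rows")
```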