H04N13/271

Efficient sub-pixel disparity estimation for all sub-aperture images from densely sampled light field cameras
11257235 · 2022-02-22

A system for sub-pixel disparity estimation is described herein. The system includes memory circuitry to store image data and at least one processor to execute instructions to calculate a first disparity for a set of reference views. The reference views correspond to a first subset of views among a plurality of sub-aperture views represented in the image data. The at least one processor is to refine the first disparity to a second disparity for the reference views. The second disparity has higher precision than the first disparity. The at least one processor is to map the second disparity from the reference views to a second subset of views among the plurality of sub-aperture views different than the first subset of views.
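The coarse-to-fine scheme in this abstract — an integer disparity refined to sub-pixel precision for a few reference views, then mapped to the remaining sub-aperture views — can be sketched as below. The function names are illustrative; the parabola fit over neighboring matching costs and the linear scaling of disparity with baseline are standard light-field techniques, not necessarily the patent's exact method.

```python
def subpixel_refine(costs, d0):
    """Refine an integer disparity d0 to sub-pixel precision by fitting a
    parabola through the matching costs at d0-1, d0, and d0+1 and taking
    the parabola's minimum."""
    c_m, c_0, c_p = costs[d0 - 1], costs[d0], costs[d0 + 1]
    denom = c_m - 2.0 * c_0 + c_p
    if denom == 0.0:
        return float(d0)  # flat cost curve: keep the integer estimate
    return d0 + 0.5 * (c_m - c_p) / denom

def map_disparity(d_ref, baseline_ref, baseline_view):
    """Map a reference-view disparity to another sub-aperture view:
    disparity grows linearly with the view's baseline to the center view."""
    return d_ref * (baseline_view / baseline_ref)
```

Refining only a small reference subset and propagating the result by scaling is what makes the scheme cheap relative to running sub-pixel matching on every sub-aperture view.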

Systems and methods for controlling aliasing in images captured by an array camera for use in super resolution processing using pixel apertures

Imager arrays, array camera modules, and array cameras in accordance with embodiments of the invention utilize pixel apertures to control the amount of aliasing present in captured images of a scene. One embodiment includes a plurality of focal planes, control circuitry configured to control the capture of image information by the pixels within the focal planes, and sampling circuitry configured to convert pixel outputs into digital pixel data. In addition, the pixels in the plurality of focal planes include a pixel stack including a microlens and an active area, where light incident on the surface of the microlens is focused onto the active area by the microlens and the active area samples the incident light to capture image information, and the pixel stack defines a pixel area and includes a pixel aperture, where the size of the pixel aperture is smaller than the pixel area.

Light source module and display module

A light source module and a display module are provided. The light source module includes: a substrate; a plurality of first light sources spaced on the substrate, each of the plurality of first light sources being configured to emit visible light; a depth sensor on the substrate; and a first blocking member configured to shield visible light but transmit invisible light. The depth sensor includes a second light source configured to emit invisible light and a light receiving member configured to sense invisible light. An orthographic projection of the second light source on the substrate is located within an orthographic projection of a gap between two adjacent first light sources on the substrate, and an orthographic projection of the first blocking member on the substrate at least partially covers the orthographic projection of the second light source on the substrate.

Method for controlling depth of object in mirror display system

Disclosed is a mirror display system. The mirror display system includes a sensor module, a mirror display including an optical mirror and a display, and at least one control circuit electrically connected to the sensor module and the mirror display. The at least one control circuit may determine, through the sensor module, a first angle value indicating a gaze direction of a user reflected in the optical mirror, determine a distance value between a subject reflected in the optical mirror and the mirror display, determine a depth value of the subject on the mirror display based on the first angle value and the distance value, and display, through the display, an object having a depth value corresponding to the depth value of the subject.
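One plausible reading of the depth computation is sketched below, under the simplifying assumption (not stated in the abstract) that the subject's virtual image sits as far behind the mirror as the subject stands in front of it, so the rendered depth cue is the subject distance projected onto the user's gaze direction. The function name and the cosine model are both hypothetical.

```python
import math

def depth_on_mirror(angle_deg, distance):
    """Hypothetical depth model for a mirror display: project the
    subject-to-mirror distance onto the user's gaze direction, with the
    gaze angle measured from the mirror normal in degrees."""
    return distance * math.cos(math.radians(angle_deg))
```

Under this model a subject 2 m from the mirror viewed head-on gets a 2 m depth value, which shrinks as the gaze becomes more oblique.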

Imaging apparatus, imaging system, imaging method, and image processing method
09826219 · 2017-11-21

An imaging apparatus which captures an image used for obtaining depth data indicating a depth of a captured scene and restored image data with a reduced optical blur, includes: an optical device which causes an optical blur having a rotationally asymmetric shape; an actuator which rotates a part or a whole of the optical device; and an imaging unit configured to capture the image each time the optical device is rotated, to generate a plurality of captured image data items.

System and apparatus for co-registration and correlation between multi-modal imagery and method for same

The present disclosure provides an image capturing device that captures images using a first sensor that includes a first imaging modality, a second sensor that includes the first imaging modality, and a third sensor that includes a second imaging modality. A controller is connected with the first sensor, the second sensor, and the third sensor, and registers an image captured by the first sensor or the second sensor to an image captured by the third sensor.
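Registering one sensor's image to another's frame is commonly done with a calibrated planar homography; the abstract does not specify the registration method, so the sketch below is only one standard way such a cross-sensor mapping could be applied once the 3x3 matrix is known.

```python
def apply_homography(H, x, y):
    """Map a pixel (x, y) from a source sensor into a target sensor's
    frame using a 3x3 planar homography H given as row-major nested
    lists; the result is the perspective-divided target coordinate."""
    xs = H[0][0] * x + H[0][1] * y + H[0][2]
    ys = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xs / w, ys / w
```

A pure translation between co-planar sensors is the simplest case; the identity matrix leaves coordinates unchanged.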

Apparatus and method for providing 3-dimensional around view through a user interface module included in a vehicle

A three-dimensional around view providing apparatus for providing a 3D around view through a user interface module included in a vehicle may include a plurality of image pickup units mounted in the vehicle, a depth estimator configured to receive a plurality of images from the plurality of image pickup units and to acquire a plurality of depth maps corresponding to the plurality of images, and a controller configured to minimize a depth difference between a first boundary region of a first depth map and a second boundary region of a second depth map. At least one among an autonomous vehicle, a user terminal, and a server according to embodiments of the present disclosure may be associated or integrated with an artificial intelligence (AI) module, a drone (unmanned aerial vehicle (UAV)), a robot, an augmented reality (AR) device, a virtual reality (VR) device, a 5th-generation (5G) service-related device, and the like.
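Minimizing the depth difference between overlapping boundary regions of two depth maps can be sketched with a simple least-squares offset: for a constant bias between the maps, the squared-error-minimizing correction is the mean of the per-pixel differences. This is an illustrative baseline, not necessarily the controller's actual optimization.

```python
def align_boundary_depth(boundary_a, boundary_b):
    """Return the constant offset to add to boundary_b's depths so that
    the mean squared depth difference against boundary_a is minimized;
    for a pure bias, the least-squares solution is the mean difference."""
    diffs = [a - b for a, b in zip(boundary_a, boundary_b)]
    return sum(diffs) / len(diffs)
```

Applying the returned offset to the second depth map stitches the two maps so the seam between adjacent camera views has no systematic depth step.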
