H04N13/271

Timestamp calibration of the 3D camera with epipolar line laser point scanning

The same image sensor is used to capture both a two-dimensional (2D) image and three-dimensional (3D) depth measurements of a 3D object. A laser point-scans the surface of the object with light spots, which are detected by a pixel array in the image sensor to generate the 3D depth profile of the object using triangulation. Each row of pixels in the pixel array forms an epipolar line of the corresponding laser scan line. Timestamping provides a correspondence between the pixel location of a captured light spot and the respective scan angle of the laser, removing any ambiguity in triangulation. An Analog-to-Digital Converter (ADC) in the image sensor operates as a Time-to-Digital Converter (TDC) to generate timestamps. An on-board timestamp calibration circuit records the propagation delay of each column of pixels in the pixel array and applies the necessary corrections to the timestamp values generated during 3D depth measurements.
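The triangulation step described above can be sketched as follows. This is a minimal, hypothetical example, not the patent's implementation: it assumes a known laser-to-sensor baseline and recovers the laser scan angle linearly from the timestamp (all parameter names and values are illustrative assumptions):

```python
import math

def depth_from_triangulation(pixel_column: int,
                             timestamp: float,
                             baseline_mm: float = 50.0,
                             focal_px: float = 1000.0,
                             center_col: int = 640,
                             scan_rate_rad_s: float = 2.0,
                             scan_start_rad: float = -0.5) -> float:
    """Estimate the depth (mm) of a laser spot via triangulation.

    The timestamp tells us where the laser was pointing when the spot
    was captured; the pixel column tells us where the sensor saw it.
    Together they fix the triangle between laser, spot, and sensor.
    """
    # Scan angle of the laser at the moment the spot was captured.
    theta = scan_start_rad + scan_rate_rad_s * timestamp
    # Viewing angle of the pixel column that detected the spot.
    phi = math.atan((pixel_column - center_col) / focal_px)
    # Classic triangulation along the epipolar line:
    #   Z = B / (tan(theta) - tan(phi))
    denom = math.tan(theta) - math.tan(phi)
    if abs(denom) < 1e-9:
        raise ValueError("degenerate geometry: rays are parallel")
    return baseline_mm / denom
```

This is also where the timestamp calibration matters: a per-column propagation delay shifts `timestamp`, and hence `theta`, so uncorrected delays translate directly into depth error.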

SYSTEM AND METHOD FOR UPDATING AN AUTONOMOUS VEHICLE DRIVING MODEL BASED ON THE VEHICLE DRIVING MODEL BECOMING STATISTICALLY INCORRECT
20230259131 · 2023-08-17 ·

Systems and methods for implementing one or more autonomous features for autonomous and semi-autonomous control of one or more vehicles are provided. More specifically, image data may be obtained from an image acquisition device and processed utilizing one or more machine learning models to identify, track, and extract one or more image features used in decision-making processes for providing steering-angle and/or acceleration/deceleration input to one or more vehicle controllers. In some instances, techniques may be employed such that the autonomous and semi-autonomous control of a vehicle may change between vehicle-follow and lane-follow modes. In some instances, at least a portion of the machine learning model may be updated based on one or more conditions.
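The "statistically incorrect" trigger in the title can be sketched as a drift monitor: compare the model's recent steering-angle error against its baseline error distribution and flag an update when the two diverge. This is a hypothetical illustration, not the patent's method; the window size, threshold, and class names are all assumptions:

```python
from collections import deque
from statistics import mean

class ModelDriftMonitor:
    """Flags a driving model for update when its recent steering-angle
    error becomes statistically inconsistent with its baseline error."""

    def __init__(self, baseline_mean: float, baseline_std: float,
                 window: int = 100, z_threshold: float = 3.0):
        self.baseline_mean = baseline_mean
        self.baseline_std = baseline_std
        self.errors = deque(maxlen=window)  # most recent |error| values
        self.z_threshold = z_threshold

    def observe(self, predicted_angle: float, actual_angle: float) -> None:
        """Record one prediction vs. the driver's/controller's actual input."""
        self.errors.append(abs(predicted_angle - actual_angle))

    def needs_update(self) -> bool:
        """True when the windowed mean error sits z_threshold standard
        errors above the baseline mean (a simple one-sided z-test)."""
        n = len(self.errors)
        if n < self.errors.maxlen:
            return False  # not enough evidence yet
        standard_error = self.baseline_std / (n ** 0.5)
        z = (mean(self.errors) - self.baseline_mean) / standard_error
        return z > self.z_threshold
```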

CAMERA SYSTEM, MOBILE TERMINAL, AND THREE-DIMENSIONAL IMAGE ACQUISITION METHOD
20230260190 · 2023-08-17 ·

A camera system, a mobile terminal, and a three-dimensional image acquisition method are disclosed. The camera system may include a first photographing device, a second photographing device, a photographing assistance device, and a processor; the photographing assistance device is configured to emit a first feature light to an object; the first photographing device is configured to collect a second feature light reflected by the object; the second photographing device includes a main camera configured to collect a first image of the object and a secondary camera configured to collect a second image of the object; and the processor is configured to acquire depth information of the object according to the second feature light, perform feature fusion on the first and second images, and perform stereo registration on the result of the feature fusion and the depth information, to acquire a 3D image of the object.

Method and system of adaptable exposure control and light projection for cameras
11330199 · 2022-05-10 ·

Techniques related to a method and system of adaptable exposure control and light projection for cameras are described herein.

Combining light-field data with active depth data for depth map generation

Depths of one or more objects in a scene may be measured with enhanced accuracy through the use of a light-field camera and a depth sensor. The light-field camera may capture a light-field image of the scene. The depth sensor may capture depth sensor data of the scene. Light-field depth data may be extracted from the light-field image and used, in combination with the sensor depth data, to generate a depth map indicative of distance between the light-field camera and one or more objects in the scene. The depth sensor may be an active depth sensor that transmits electromagnetic energy toward the scene; the electromagnetic energy may be reflected off of the scene and detected by the active depth sensor. The active depth sensor may have a 360° field of view; accordingly, one or more mirrors may be used to direct the electromagnetic energy between the active depth sensor and the scene.
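The fusion of light-field depth with active-sensor depth can be sketched as a per-pixel confidence-weighted average. This is an illustrative assumption about the combination step, not the patented method; the function name and weighting scheme are hypothetical:

```python
def fuse_depth_maps(lightfield_depth, sensor_depth,
                    lf_confidence, sensor_confidence):
    """Combine two per-pixel depth estimates into one depth map by
    confidence-weighted averaging; where one source has no estimate
    (confidence 0), the other is used as-is.

    All four arguments are flat, equal-length sequences of per-pixel
    values (depths in any consistent unit, confidences >= 0).
    """
    fused = []
    for z_lf, z_sensor, c_lf, c_sensor in zip(
            lightfield_depth, sensor_depth, lf_confidence, sensor_confidence):
        total = c_lf + c_sensor
        if total == 0:
            fused.append(float("nan"))  # neither source saw this pixel
        else:
            fused.append((c_lf * z_lf + c_sensor * z_sensor) / total)
    return fused
```

In practice the confidences might come from light-field disparity consistency on one side and active-sensor return strength on the other, so each source dominates where it is most reliable.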

Laser speckle system and method for an aircraft

A system for registering multiple point clouds captured by an aircraft is disclosed. The system includes a speckle generator, at least one three-dimensional (3D) scanner, and a processor coupled thereto. In operation, the speckle generator projects a laser speckle pattern onto a surface (e.g., a featureless surface). The at least one 3D scanner scans the featureless surface to generate a plurality of point clouds of the featureless surface and to image at least a portion of the laser speckle pattern. The processor, which is communicatively coupled with the at least one 3D scanner, registers the plurality of point clouds to generate a complete 3D model of the featureless surface based at least in part on the laser speckle pattern.
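The registration step can be sketched with a planar (2D) Procrustes alignment: given speckle features matched between two scans, recover the rigid transform that maps one point cloud onto the other. A real system would work in 3D (e.g. with the Kabsch algorithm); this 2D version is an illustrative simplification, and all names are hypothetical:

```python
import math

def register_2d(src_pts, dst_pts):
    """Estimate the 2D rigid transform (rotation + translation) mapping
    src_pts onto dst_pts given point correspondences -- e.g. matched
    laser-speckle features shared by two overlapping point clouds.

    Returns (theta, tx, ty) such that dst ~= R(theta) @ src + (tx, ty).
    """
    n = len(src_pts)
    # Centroids of both correspondence sets.
    sx = sum(p[0] for p in src_pts) / n
    sy = sum(p[1] for p in src_pts) / n
    dx = sum(p[0] for p in dst_pts) / n
    dy = sum(p[1] for p in dst_pts) / n
    # Optimal rotation from the centered correspondences (2D Procrustes).
    num = den = 0.0
    for (ax, ay), (bx, by) in zip(src_pts, dst_pts):
        ax, ay = ax - sx, ay - sy
        bx, by = bx - dx, by - dy
        num += ax * by - ay * bx  # sum of cross products
        den += ax * bx + ay * by  # sum of dot products
    theta = math.atan2(num, den)
    # Translation aligning the rotated source centroid to the target.
    tx = dx - (sx * math.cos(theta) - sy * math.sin(theta))
    ty = dy - (sx * math.sin(theta) + sy * math.cos(theta))
    return theta, tx, ty
```

The speckle pattern's role is exactly to supply the `src_pts`/`dst_pts` correspondences on a surface that otherwise has no features to match.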

Dynamic vision sensor and projector for depth imaging
11330247 · 2022-05-10 ·

Systems, devices, and techniques related to matching features between a dynamic vision sensor and one or both of a dynamic projector or another dynamic vision sensor are discussed. Such techniques include casting a light pattern with projected features having differing temporal characteristics onto a scene and determining the correspondence(s) based on matching changes in detected luminance and temporal characteristics of the projected features.
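The temporal-matching idea can be sketched as follows: each projected feature blinks with its own period, and a pixel's luminance-change events are matched to the feature whose period best explains the event spacing. This is a hypothetical simplification of the technique (the feature table, tolerance, and names are all illustrative assumptions):

```python
def match_feature(event_timestamps, feature_periods, tolerance=0.05):
    """Match a pixel's luminance-change events to the projected feature
    whose blink period best explains the observed event spacing.

    event_timestamps: sorted times (s) at which the dynamic vision
        sensor reported a luminance change at this pixel.
    feature_periods: {feature_id: blink period in seconds}.
    Returns the best-matching feature id, or None if nothing fits
    within the tolerance.
    """
    if len(event_timestamps) < 2:
        return None
    # Mean interval between consecutive events at this pixel.
    gaps = [b - a for a, b in zip(event_timestamps, event_timestamps[1:])]
    mean_gap = sum(gaps) / len(gaps)
    # Pick the feature whose period is closest to the observed spacing.
    best_id, best_err = None, tolerance
    for feature_id, period in feature_periods.items():
        err = abs(period - mean_gap)
        if err < best_err:
            best_id, best_err = feature_id, err
    return best_id
```

Once each detected spot is matched to a projected feature this way, the projector-to-sensor correspondence is known and depth follows by triangulation, as in any structured-light system.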