Patent classifications
G01B11/2545
Systems and Methods for Estimating Depth from Projected Texture using Camera Arrays
Systems and methods for estimating depth from projected texture using camera arrays are described. A camera array includes a conventional camera and at least one two-dimensional array of cameras, where the conventional camera has a higher resolution than the cameras in the at least one two-dimensional array of cameras; an illumination system configured to illuminate a scene with a projected texture; and a processor, where an image processing pipeline application directs the processor to: utilize the illumination system controller application to control the illumination system to illuminate a scene with a projected texture, capture a set of images of the scene illuminated with the projected texture, and determine depth estimates for pixel locations in an image from a reference viewpoint using at least a subset of the set of images.
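The pipeline's final step, matching the projected texture across views to estimate depth, can be sketched as simple block matching over a rectified image pair; the window size, disparity range, and synthetic random texture below are illustrative assumptions, not details from the patent:

```python
import numpy as np

def block_match_disparity(ref, aux, max_disp=8, win=3):
    """Per-pixel disparity between two rectified views by sum-of-absolute-
    differences block matching, a simple stand-in for multi-camera cost
    aggregation over a projected texture."""
    h, w = ref.shape
    pad = win // 2
    disp = np.zeros((h, w), dtype=int)
    for y in range(pad, h - pad):
        for x in range(pad + max_disp, w - pad):
            patch = ref[y - pad:y + pad + 1, x - pad:x + pad + 1]
            costs = [np.abs(patch - aux[y - pad:y + pad + 1,
                                        x - d - pad:x - d + pad + 1]).sum()
                     for d in range(max_disp + 1)]
            disp[y, x] = int(np.argmin(costs))
    return disp

# Synthetic "projected texture": random intensities, with the second
# view shifted left by a known disparity of 3 pixels.
rng = np.random.default_rng(0)
ref = rng.random((20, 40))
aux = np.zeros_like(ref)
aux[:, :-3] = ref[:, 3:]
disp = block_match_disparity(ref, aux, max_disp=6)
```

For a calibrated pair, depth then follows from disparity as focal_length * baseline / disparity.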
RAY TRACKING FOR INTRAORAL 3D SCANNER
An intraoral scanning system comprises an intraoral scanner and a processor. The intraoral scanner comprises one or more cameras and one or more structured light projectors, the intraoral scanner to generate a series of images using the one or more cameras, each image including at least a portion of a pattern projected by the one or more structured light projectors onto an intraoral three-dimensional surface. The processor runs a correspondence algorithm to compute respective three-dimensional positions of a plurality of features of the pattern on the intraoral three-dimensional surface, as captured in the series of images. The processor identifies the computed three-dimensional position of a detected feature of the pattern as corresponding to a particular projector ray r in at least a subset of the series of images. The processor tracks the particular projector ray r across one or more additional images of the series of images.
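The association step behind the tracking can be sketched as a nearest-point-to-ray test over candidate feature positions in a new frame; the distance threshold and function names are illustrative assumptions, not the patent's correspondence algorithm:

```python
import numpy as np

def point_to_ray_distance(p, origin, direction):
    """Perpendicular distance from a 3D point to a ray."""
    d = direction / np.linalg.norm(direction)
    v = np.asarray(p, dtype=float) - origin
    return np.linalg.norm(v - np.dot(v, d) * d)

def track_ray(origin, direction, candidates, max_dist=0.5):
    """Associate projector ray r with the detected feature position in a
    new frame that lies closest to the ray; None if nothing is close."""
    dists = [point_to_ray_distance(c, origin, direction) for c in candidates]
    i = int(np.argmin(dists))
    return i if dists[i] <= max_dist else None

origin = np.zeros(3)
direction = np.array([0.0, 0.0, 1.0])   # ray r, pointing along +z
candidates = [[0.1, 0.0, 5.0], [2.0, 2.0, 4.0], [-3.0, 1.0, 6.0]]
match = track_ray(origin, direction, candidates)
```

Gating the association with a maximum distance lets the tracker report a missed detection rather than forcing a wrong match.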
IMAGE PROJECTION MEASURING APPARATUS
An image projection measuring apparatus includes a measurer, a storage, an image projection data generator, an image projector, and an image projection controller. The measurer is configured to measure a shape of a work by illuminating an outer surface of the work with a line laser from a laser emitter. The storage is configured to store measurement data acquired by the measurer. The image projection data generator is configured to generate image projection data from the measurement data. The image projector includes a camera and is configured to project the image projection data from the camera. The image projection controller is configured to cause the image projector to project the image projection data onto the outer surface of the work.
High contrast structured light patterns for QIS sensors
A structured-light pattern for a structured-light system includes a base light pattern that includes a row of a plurality of sub-patterns extending in a first direction. Each sub-pattern is adjacent to at least one other sub-pattern, and each sub-pattern is different from each other sub-pattern. Each sub-pattern includes a first number of portions in a sub-row and a second number of portions in a sub-column. Each sub-row extends in the first direction and each sub-column extends in a second direction that is substantially orthogonal to the first direction. Each portion may be a first-type portion or a second-type portion. A first-type portion is larger than a second-type portion in both the first direction and the second direction. In one embodiment, a first-type portion is a black portion and a second-type portion is a white portion.
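A base pattern in which every sub-pattern differs from every other can be sketched by enumerating all binary sub-patterns of a fixed size and laying them out in a row; the 2x2 sub-pattern size and the 0/1 encoding of the two portion types are illustrative assumptions:

```python
from itertools import product

def make_base_pattern(rows=2, cols=2):
    """Lay every distinct rows x cols binary sub-pattern side by side
    (1 = first-type/black portion, 0 = second-type/white portion), so
    each sub-pattern differs from every other in the base pattern."""
    subs = list(product((0, 1), repeat=rows * cols))
    assert len(set(subs)) == len(subs)   # pairwise distinct by construction
    pattern = []
    for r in range(rows):
        row = []
        for s in subs:
            row.extend(s[r * cols:(r + 1) * cols])
        pattern.append(row)
    return pattern

base = make_base_pattern()   # 16 distinct 2x2 sub-patterns -> 2 x 32 pattern
```

Because each sub-pattern is unique, observing any one sub-pattern in a camera image identifies its horizontal position within the base pattern.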
Intraoral 3D scanner employing multiple miniature cameras and multiple miniature pattern projectors
An apparatus for intraoral scanning includes an elongate handheld wand that has a probe. One or more light projectors and two or more cameras are disposed within the probe. The light projectors each have a pattern generating optical element, which may use diffraction or refraction to form a light pattern. Each camera may be configured to focus between 1 mm and 30 mm from a lens that is farthest from the camera sensor. Other applications are also described.
PHASE UNWRAPPING METHOD BASED ON MULTI-VIEW CONSTRAINTS OF LIGHT FIELD AND RELATED COMPONENTS
The present disclosure provides a phase unwrapping method based on multi-view constraints of a light field and related components. The method includes: sampling main view phases at different depths in a measurement scene; performing polynomial calibration on a mapping relation from the main view phase to the pixel coordinates of the corresponding points in auxiliary view images; calculating a candidate absolute phase set of each pixel in a main view image; traversing the candidate absolute phase set, calculating an error value between each candidate absolute phase and the wrapped phases at the pixel coordinates of the corresponding points in all the auxiliary view images by utilizing the calibrated mapping relation, and taking the candidate phase corresponding to the minimum error value as the absolute phase of each pixel in the main view image. The present disclosure can stably realize accurate phase unwrapping of the structured light field.
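The candidate-traversal step can be sketched for a single pixel: each candidate absolute phase wrapped + 2*pi*k is mapped into an auxiliary view, and the candidate whose wrapped value best matches the auxiliary wrapped phase there wins. The linear phase-to-pixel mapping and the auxiliary phase values below are toy assumptions standing in for the calibrated polynomial mapping:

```python
import numpy as np

TWO_PI = 2.0 * np.pi

def circ_err(a, b):
    """Circular distance between two wrapped phases."""
    d = abs(a - b) % TWO_PI
    return min(d, TWO_PI - d)

def unwrap_pixel(wrapped, n_periods, phase_to_aux_px, aux_wrapped):
    """Traverse the candidate absolute phases wrapped + 2*pi*k, map each
    into the auxiliary view with the calibrated mapping, and keep the
    candidate that best matches the auxiliary wrapped phase there."""
    best_k, best_err = 0, float("inf")
    for k in range(n_periods):
        cand = wrapped + TWO_PI * k
        u = phase_to_aux_px(cand)          # calibrated mapping (toy: linear)
        err = circ_err(cand % TWO_PI, aux_wrapped[u])
        if err < best_err:
            best_k, best_err = k, err
    return wrapped + TWO_PI * best_k

# Toy single-pixel setup; mapping and auxiliary phases are illustrative.
true_abs = 1.0 + TWO_PI * 3
phase_to_aux_px = lambda p: int(round(p * 2.0))
aux_wrapped = [(0.37 * u + 1.9) % TWO_PI for u in range(120)]
aux_wrapped[phase_to_aux_px(true_abs)] = true_abs % TWO_PI  # aux sees same point
est = unwrap_pixel(true_abs % TWO_PI, 8, phase_to_aux_px, aux_wrapped)
```

Only the correct period index maps to the auxiliary pixel that actually observes the same scene point, so its error is (near) zero while wrong candidates land on unrelated auxiliary phases.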
Systems and methods for augmentation of sensor systems and imaging systems with polarization
A multi-modal sensor system includes: an underlying sensor system; a polarization camera system configured to capture polarization raw frames corresponding to a plurality of different polarization states; and a processing system including a processor and memory, the processing system being configured to control the underlying sensor system and the polarization camera system, the memory storing instructions that, when executed by the processor, cause the processor to: control the underlying sensor system to perform sensing on a scene and the polarization camera system to capture a plurality of polarization raw frames of the scene; extract first tensors in polarization representation spaces based on the plurality of polarization raw frames; and compute a characterization output based on an output of the underlying sensor system and the first tensors in polarization representation spaces.
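One common choice of polarization representation space is the degree and angle of linear polarization derived from the Stokes parameters; the sketch below assumes the polarization raw frames were captured behind 0, 45, 90, and 135 degree polarizers, which is an assumption about the capture setup, not a detail stated in the abstract:

```python
import numpy as np

def polarization_tensors(i0, i45, i90, i135):
    """Degree (DOLP) and angle (AOLP) of linear polarization from four raw
    frames captured behind 0/45/90/135-degree polarizers."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity (Stokes S0)
    s1 = i0 - i90                        # Stokes S1
    s2 = i45 - i135                      # Stokes S2
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)
    aolp = 0.5 * np.arctan2(s2, s1)
    return dolp, aolp

# Fully linearly polarized light at 0 degrees (Malus' law intensities).
i0, i45, i90, i135 = (np.array([1.0]), np.array([0.5]),
                      np.array([0.0]), np.array([0.5]))
dolp, aolp = polarization_tensors(i0, i45, i90, i135)
```

DOLP and AOLP images of this kind are what a downstream network or characterization stage would consume alongside the underlying sensor's output.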
Compensation of three-dimensional measuring instrument having an autofocus camera
A three-dimensional (3D) measuring instrument includes a registration camera and a surface measuring system having a projector and an autofocus camera. For the instrument in a first pose, the registration camera captures a first registration image of first registration points. The autofocus camera captures a first surface image of first light projected onto the object by the projector and determines first 3D coordinates of points on the object. For the instrument in a second pose, the registration camera captures a second registration image of second registration points. The autofocus camera adjusts the autofocus mechanism and captures a second surface image of second light projected by the projector. A compensation parameter is determined based at least in part on the first registration image, the second registration image, the first 3D coordinates, the second surface image, and the projected second light.
INTRAORAL 3D SCANNER EMPLOYING MULTIPLE CAMERAS AND MINIATURE PATTERN PROJECTORS
An apparatus for intraoral scanning comprises an elongate handheld wand comprising a probe at a distal end, one or more light projectors, and two or more cameras. Each light projector comprises at least one light source configured to generate light and a pattern generating optical element configured to generate a pattern of light when the light is transmitted through the pattern generating optical element. Each camera comprises a camera sensor and one or more lenses and is configured to capture a plurality of images that depict at least a portion of the projected pattern of light on an intraoral surface, wherein each camera is configured to focus at an object focal plane that is located between about 1 mm and about 30 mm from a lens of the one or more lenses that is farthest from the camera sensor.
PROJECTION SYSTEM FOR DIRECTING AND MONITORING ACTIVITY IN LARGE SCALE FACILITIES
A first projection device includes a first laser projector and a first measurement system. A second projection device includes a second laser projector and a second measurement system. The first projection device and the second projection device are interconnected with a controller. The controller is programmed with computer aided design data representative of a large scale work area and coordinates electronic interaction between the first projection device and the second projection device. The first projection device projects a first indicia that is detectable by the second measurement system and the second projection device projects a second indicia that is detectable by the first measurement system. The controller is adapted for determining the relative position, within a three-dimensional coordinate system, of the first projection device with respect to the second projection device from the first indicia detected by the second measurement system and the second indicia detected by the first measurement system.
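Recovering the relative pose from mutually detected indicia can be sketched with the Kabsch algorithm over matched 3D points; the point sets below and the use of Kabsch here are illustrative assumptions, not the patent's stated method:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ~ src @ R.T + t
    (Kabsch algorithm), a standard way to express one device's pose in the
    other's coordinate system from matched indicia positions."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    h = (src - sc).T @ (dst - dc)
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T   # guard against reflections
    t = dc - r @ sc
    return r, t

# Example: recover a known pose from four matched indicia points.
theta = np.deg2rad(30.0)
R0 = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
t0 = np.array([1.0, 2.0, 3.0])
src = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
dst = src @ R0.T + t0
R, t = rigid_transform(src, dst)
```

At least three non-collinear matched points are needed for the rotation to be uniquely determined.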