Patent classification: G01B11/2545
Method and apparatus for vehicle inspection and safety system calibration using projected images
A vehicle service system and method for determining spatial parameters of a vehicle. A display system under processor control displays or projects visible indicia onto surfaces in proximity to a vehicle undergoing a safety system service or inspection, identifying one or more locations, relative to the determined vehicle centerline or thrust line, at which a calibration fixture, optical target, or simulated test-drive imagery is visible for observation by a sensor onboard the vehicle.
Methods and apparatus for optimizing image acquisition of objects subject to illumination patterns
The techniques described herein relate to methods, apparatus, and computer readable media configured to determine parameters for image acquisition. One or more image sensors are each arranged to capture a set of images of a scene, and each image sensor comprises a set of adjustable imaging parameters. A projector is configured to project a moving pattern on the scene, wherein the projector comprises a set of adjustable projector parameters. The set of adjustable projector parameters and the set of adjustable imaging parameters are determined, based on a set of one or more constraints, to reduce noise in 3D data generated based on the set of images.
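The parameter search the abstract describes can be sketched as a constrained exhaustive search over projector and imaging settings, scored by a predicted-noise proxy. The noise model, parameter names, and constraint below are illustrative assumptions, not the patent's actual method:

```python
import itertools

def noise_metric(exposure_us, gain, pattern_speed):
    # Hypothetical noise proxy: short exposures and high gain raise sensor
    # noise, while a fast-moving pattern blurs within one exposure.
    sensor_noise = gain / max(exposure_us, 1)
    motion_blur = pattern_speed * exposure_us * 1e-6
    return sensor_noise + motion_blur

def select_parameters(exposures_us, gains, pattern_speeds, max_exposure_us):
    """Exhaustively try parameter combinations, keep only those satisfying
    the constraint set, and return the combination with the lowest predicted
    noise in the reconstructed 3D data."""
    best = None
    for exp, gain, speed in itertools.product(exposures_us, gains, pattern_speeds):
        if exp > max_exposure_us:  # example constraint: per-frame time budget
            continue
        score = noise_metric(exp, gain, speed)
        if best is None or score < best[0]:
            best = (score, {"exposure_us": exp, "gain": gain, "pattern_speed": speed})
    return best[1] if best else None
```

A real system would likely replace the brute-force product with a guided search and use a noise model calibrated against measured 3D data.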
Three-dimensional shape measurement apparatus
A three-dimensional shape measurement apparatus includes main pattern illumination parts, main image-capturing parts and a control part. The main pattern illumination parts obliquely illuminate grating pattern light in different directions toward a measurement target. The main image-capturing parts obtain a grating pattern image of the measurement target by receiving reflection light of the grating pattern light illuminated from the main pattern illumination parts and obliquely reflected by the measurement target. The control part produces height data of the measurement target using grating pattern images of the measurement target, or produces height data of the measurement target using image positions of plane images for the measurement target and texture information of the measurement target. The control part employs a grating pattern illuminated on the measurement target as the texture information to produce height data of the measurement target. Thus, a three-dimensional shape may be measured more easily and accurately.
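Producing height data from grating pattern images is commonly done with phase-shifting profilometry. A minimal per-pixel sketch, assuming the standard four-step (90-degree shift) scheme rather than the patent's specific variant:

```python
import math

def phase_from_four_step(i1, i2, i3, i4):
    """Recover the wrapped phase of a sinusoidal grating at one pixel from
    four images captured with the pattern shifted by 90 degrees each time.
    Standard four-step relation: phi = atan2(I4 - I2, I1 - I3)."""
    return math.atan2(i4 - i2, i1 - i3)

def height_from_phase(phi_object, phi_reference, scale):
    """Height is (to first order) proportional to the phase difference
    between the object and a flat reference plane; 'scale' folds in the
    oblique-illumination triangulation geometry, assumed calibrated."""
    return scale * (phi_object - phi_reference)
```

With intensities I_k = A + B*cos(phi + k*pi/2), the differences cancel the offset A and modulation B, so the phase (and hence height) is insensitive to surface texture brightness.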
METHOD AND APPARATUS FOR DETERMINING 3D COORDINATES OF AT LEAST ONE PREDETERMINED POINT OF AN OBJECT
In a method for determining 3D coordinates of at least one predetermined point of an object, the object is arranged in a measurement region and a variable illumination source projects a variable pattern onto the object arranged in the measurement region. An image recording apparatus which is arranged in a previously known relationship with respect to the illumination source records an image of at least one section of the object illuminated by the variable illumination source. The at least one predetermined point is detected in the recorded image. The 3D coordinates of the at least one predetermined point are determined from the recorded image, taking into account the previously known relationship of the image recording apparatus with respect to the illumination source, if a check of the recorded image reveals that the at least one predetermined point is marked in the recorded image by a feature of the variable pattern.
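Once the predetermined point is found marked by a pattern feature, its 3D coordinates follow from triangulation against the known illumination geometry. A minimal sketch, assuming the feature identifies a specific projector light plane and everything is expressed in camera coordinates (both assumptions for illustration):

```python
def triangulate_point(cam_ray, plane_point, plane_normal):
    """Intersect the camera viewing ray through the detected image point
    (a ray from the camera origin along cam_ray) with the projector light
    plane marking the predetermined point. The plane's pose in camera
    coordinates comes from the previously known relationship between the
    image recording apparatus and the illumination source."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    t = dot(plane_normal, plane_point) / dot(plane_normal, cam_ray)
    return tuple(t * c for c in cam_ray)  # 3D point: X = t * cam_ray
```

For example, a ray toward (1, 0, 1) intersected with the plane x = 1 yields the point (1, 0, 1).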
THREE-DIMENSIONAL MEASUREMENT DEVICE
A scanner capable of determining 3D coordinates in the presence of bright background light includes a laser and a camera, the laser emitting light at a wavelength adjusted with a thermoelectric cooler, the camera passing the adjusted wavelength through a bandpass filter.
3D CAMERA SYSTEM WITH ROLLING-SHUTTER IMAGE SENSOR
The system comprises an array of addressable light sources, which is configured for an activation of the light sources individually or in groups, an image sensor comprising pixels, which are configured for the detection of a predefined light pattern, and a rolling shutter of the image sensor. The array of addressable light sources is configured for a consecutive activation of the light sources according to the predefined light pattern or part of the predefined light pattern, and the rolling shutter is configured to expose areas of the image sensor in accordance with the activation of the light sources, so that the pixels in an exposed area are illuminated and the pixels that are outside the exposed area are shielded from illumination.
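The rolling-shutter synchronization above hinges on knowing which sensor rows are integrating when a light-source group fires. A sketch under a simple timing model (row r integrates during [r*line_time, r*line_time + exposure]; the model and units are assumptions):

```python
import math

def rows_integrating(t_us, sensor_rows, line_time_us, exposure_us):
    """Return the band of rows integrating at time t_us, i.e. the rows a
    light source activated at that instant will illuminate. Rows outside
    the band are effectively shielded, which is the row/illumination
    synchronization the rolling-shutter scheme relies on."""
    first = max(0, math.ceil((t_us - exposure_us) / line_time_us))
    last = min(sensor_rows - 1, math.floor(t_us / line_time_us))
    return (first, last) if first <= last else None
```

Stepping t_us in sync with the consecutive activation of light-source groups walks the illuminated band down the sensor together with the shutter.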
Multiple camera microscope imaging with patterned illumination
An array of more than one digital micro-camera, along with the use of patterned illumination and a digital post-processing operation, jointly creates a multi-camera patterned illumination (MCPI) microscope. Each micro-camera includes its own unique lens system and detector. The field-of-view of each micro-camera unit at least partially overlaps with the field-of-view of one or more other micro-camera units within the array. The entire field-of-view of a sample of interest is imaged by the entire array of micro-cameras in a single snapshot. In addition, the MCPI system uses patterned optical illumination to improve its effective resolution. The MCPI system captures one or more images as the patterned optical illumination changes its distribution across space and/or angle at the sample. Then, the MCPI system digitally combines the acquired image sequence using a unique post-processing algorithm.
SYSTEM AND METHOD FOR MEASURING THREE-DIMENSIONAL COORDINATES
A three-dimensional (3D) measurement system, a method of measuring 3D coordinates, and a method of generating dense 3D data are provided. The method of measuring 3D coordinates uses a first 3D measurement device and a second 3D measurement device in a cooperative manner. The method includes acquiring a first set of 3D coordinates with the first 3D measurement device. The first set of 3D coordinates is transferred to the second 3D measurement device. A second set of 3D coordinates is acquired with the second 3D measurement device. The second set of 3D coordinates is registered to the first set of 3D coordinates in real time while the second 3D measurement device is acquiring the second set.
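Registering the second set of coordinates to the first can be illustrated with the simplest special case: with point correspondences assumed (i-th point matches i-th point) and alignment restricted to pure translation, the least-squares answer is the difference of centroids. This is a toy sketch; a real-time pipeline such as the one described would also solve for rotation, e.g. via ICP:

```python
def register_translation(first_set, second_set):
    """Align second_set to first_set by the translation that matches their
    centroids (least-squares optimal when only translation is estimated and
    correspondences are index-aligned). Points are (x, y, z) tuples."""
    n = len(first_set)
    cf = [sum(p[d] for p in first_set) / n for d in range(3)]   # centroid of first
    cs = [sum(p[d] for p in second_set) / n for d in range(3)]  # centroid of second
    t = tuple(cf[d] - cs[d] for d in range(3))                  # translation
    return [tuple(p[d] + t[d] for d in range(3)) for p in second_set]
```

Running this once per incoming scan fragment is what makes registration feasible "in real time while acquiring": each fragment is aligned as it arrives rather than in a batch afterward.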
Image recognition device, image recognition method and image recognition unit
An image recognition device, an image recognition method, and an image recognition unit capable of performing touch recognition with high accuracy. The image recognition device includes a measurement point determination section adapted to determine a fingertip from an image obtained by a camera; a pattern display section adapted to make a projector display a first pattern having a first linear pattern varying in luminance with a first pitch along a direction parallel to an epipolar line passing through the fingertip, and a second linear pattern varying in luminance with a second pitch along the same direction; and a position detection section adapted to perform touch recognition based on a variation of the first pattern in the image including the first pattern.
Systems and methods for estimating depth from projected texture using camera arrays
Systems and methods in accordance with embodiments of the invention estimate depth from projected texture using camera arrays. One embodiment of the invention includes: at least one two-dimensional array of cameras comprising a plurality of cameras; an illumination system configured to illuminate a scene with a projected texture; a processor; and memory containing an image processing pipeline application and an illumination system controller application. The illumination system controller application directs the processor to control the illumination system to illuminate a scene with a projected texture. The image processing pipeline application directs the processor to: utilize the illumination system controller application to control the illumination system to illuminate a scene with a projected texture; capture a set of images of the scene illuminated with the projected texture; and determine depth estimates for pixel locations in an image from a reference viewpoint using at least a subset of the set of images. Generating a depth estimate for a given pixel location in the image from the reference viewpoint includes: identifying pixels in the at least a subset of the set of images that correspond to the given pixel location in the image from the reference viewpoint, based upon expected disparity at a plurality of depths along a plurality of epipolar lines aligned at different angles; comparing the similarity of the corresponding pixels identified at each of the plurality of depths; and selecting, as the depth estimate for the given pixel location, the depth from the plurality of depths at which the identified corresponding pixels have the highest degree of similarity.
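The depth-hypothesis search in the last abstract is a plane sweep: for each candidate depth, the expected disparity in camera i is baseline_i * focal / depth; the corresponding pixels are sampled in every image and the depth whose samples agree best wins. A toy 1D sketch (single epipolar line, variance as the dissimilarity score; a real array sweeps 2D epipolar lines at many angles):

```python
def estimate_depth(ref_x, images, baselines, focal, candidate_depths):
    """Return the candidate depth at which pixels sampled across the array,
    at the disparity that depth predicts, are most similar. 'images' are
    toy 1D intensity rows sharing an epipolar line with the reference view."""
    best_depth, best_cost = None, float("inf")
    for depth in candidate_depths:
        samples = []
        for img, b in zip(images, baselines):
            x = ref_x + int(round(b * focal / depth))  # expected disparity
            if 0 <= x < len(img):
                samples.append(img[x])
        if len(samples) < 2:
            continue
        mean = sum(samples) / len(samples)
        cost = sum((s - mean) ** 2 for s in samples)   # dissimilarity (variance-like)
        if cost < best_cost:
            best_cost, best_depth = cost, depth
    return best_depth
```

The projected texture matters precisely here: on a textureless surface every depth hypothesis scores similarly, while the projected pattern makes the correct hypothesis's samples agree far better than the incorrect ones.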