G01B11/2545

Three-dimensional measurement system and three-dimensional measurement method

A three-dimensional measurement system capable of realizing high-speed processing while increasing measurement resolution is provided. The system includes: an image capture unit including first and second image capture units that are spaced apart; a first calculation unit that calculates a parallax at first feature points in the images using distance information from a three-dimensional measurement method other than a stereo camera method, or information for calculating a distance, using at least one of the first and second image capture units; and a second calculation unit that calculates a parallax at second feature points based on a corresponding point for each second feature point by using the stereo camera method with the first and second image capture units, and that specifies a three-dimensional shape based on the parallaxes at the first and second feature points. The second calculation unit sets a search area based on the parallax at the first feature points.
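The search-area narrowing described in this abstract can be sketched as follows. This is a generic illustration, not the patent's specific implementation: a coarse depth estimate (e.g., from a time-of-flight measurement) is converted to an expected disparity via the pinhole model d = f·B/Z, and block matching is then restricted to a small window around that disparity instead of the full epipolar line. The names `focal_px`, `baseline_mm`, and `margin` are illustrative.

```python
import numpy as np

def parallax_from_depth(depth_mm, focal_px, baseline_mm):
    """Convert a coarse depth estimate to an expected stereo
    parallax (disparity) in pixels: d = f * B / Z."""
    return focal_px * baseline_mm / depth_mm

def match_with_search_window(left_row, right_row, x_left, expected_d, margin, block=5):
    """Block-match the feature at column x_left of a rectified left image
    row against the right image row, searching only disparities in
    [expected_d - margin, expected_d + margin] rather than the whole line."""
    half = block // 2
    ref = left_row[x_left - half : x_left + half + 1]
    best_d, best_cost = None, float("inf")
    d_lo = max(0, int(round(expected_d - margin)))
    d_hi = int(round(expected_d + margin))
    for d in range(d_lo, d_hi + 1):
        x_right = x_left - d
        if x_right - half < 0:
            break
        cand = right_row[x_right - half : x_right + half + 1]
        cost = np.abs(ref - cand).sum()  # sum-of-absolute-differences cost
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d
```

Restricting the search to a few pixels around the expected disparity is what makes the combined approach fast: the per-feature cost drops from the full disparity range to `2 * margin + 1` candidates.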

THREE-DIMENSIONAL MEASUREMENT APPARATUS, SYSTEM, AND PRODUCTION METHOD
20220108464 · 2022-04-07

A three-dimensional measurement apparatus includes an attachment portion for attaching the three-dimensional measurement apparatus to a robot, a flange for attaching an end effector, a sensor configured to receive light from an object, and a calculation unit configured to determine three-dimensional information about the object by performing a calculation using data obtained by the sensor. A shortest distance among distances from a center of the flange to an outer peripheral edge of the calculation unit is less than or equal to a radius of the attachment portion or the flange, as viewed from the flange.

Three-dimensional measurement device

A system and method of determining three-dimensional coordinates are provided. The method includes, with a projector, projecting onto an object a projection pattern that includes a collection of object spots. With a first camera, a first image is captured that includes first-image spots. With a second camera, a second image is captured that includes second-image spots. Each first-image spot is divided into first-image spot rows. Each second-image spot is divided into second-image spot rows. Central values are determined for each first-image and second-image spot row. A correspondence is determined among first-image and second-image spot rows, the corresponding first-image and second-image spot rows being a spot-row image pair. Each spot-row image pair has a corresponding object spot row on the object. Three-dimensional (3D) coordinates of each object spot row are determined based on the central values of the corresponding spot-row image pairs. The 3D coordinates of the object spot rows are stored.
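The "central value" of a spot row is naturally computed as a sub-pixel, intensity-weighted centroid. The sketch below is one plausible reading of that step, not the patent's exact formulation: each pixel row of a small spot patch contributes one centroid, and empty rows are skipped.

```python
import numpy as np

def spot_row_centroids(spot_patch):
    """Divide an image spot (a small 2D intensity patch) into its pixel
    rows and return the intensity-weighted horizontal centroid (a
    sub-pixel 'central value') for each row containing signal."""
    cols = np.arange(spot_patch.shape[1])
    centroids = []
    for row in spot_patch:
        total = row.sum()
        if total > 0:  # ignore rows with no spot intensity
            centroids.append((row * cols).sum() / total)
    return centroids
```

Matching centroids row by row, rather than matching whole spots, is what allows triangulation at a finer vertical granularity than the spot size.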

System and method for measuring three-dimensional coordinates

A three-dimensional (3D) measurement system, a method of measuring 3D coordinates, and a method of generating dense 3D data are provided. The method of measuring 3D coordinates uses a first 3D measurement device and a second 3D measurement device in a cooperative manner. The method includes acquiring a first set of 3D coordinates with the first 3D measurement device. The first set of 3D coordinates is transferred to the second 3D measurement device. A second set of 3D coordinates is acquired with the second 3D measurement device. The second set of 3D coordinates is registered to the first set of 3D coordinates in real time while the second 3D measurement device is acquiring the second set of 3D coordinates.

AUGMENTED REALITY READY OPTICAL TRACKING SYSTEM
20220087749 · 2022-03-24

A computer assisted surgical system is disclosed that includes an optical tracking system. The optical tracking system includes an RGB sensor and is configured to capture color images of an environment in the visible light spectrum and tracking images of fiducials in the environment in a near-infrared spectrum. The computer assisted surgical system is configured to generate a color image of the environment using the color images, identify fiducial locations using the tracking images, register pre-operative and/or intra-operative data to the color images using the fiducial locations, and generate an augmented reality (AR) image of the environment by overlaying the data over the color image. The computer assisted surgical system can further include a monitor or a head-mounted display (HMD) configured to present the AR image.

THREE DIMENSIONAL IMAGING

Disclosed are a 3D scanner, an additive manufacturing system and an apparatus and method for identifying features of a 3D object manufactured in such a system. An apparatus comprises an optical projection assembly comprising a light source and an optical grating, for illuminating an object with first and second light patterns having different spatial frequencies, wherein the optical projection assembly provides a first light pattern in a first configuration of the optical projection assembly and provides a second light pattern in a second configuration of the optical projection assembly. An image capturing apparatus is used to capture images corresponding to reflections of the first and second light patterns from the illuminated object, and a processing unit is used to identify, from the captured reflections of the first and second light patterns, the effects of distortions in the reflected light patterns corresponding to features of the illuminated object.
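Projecting two patterns with different spatial frequencies is the classic way to resolve the period ambiguity of a fine fringe pattern. As a generic illustration of why two frequencies help (not the patent's specific processing), the wrapped phase of the high-frequency pattern can be unwrapped using the unambiguous phase of the low-frequency pattern; `n_periods` (the frequency ratio) is an assumed parameter:

```python
import numpy as np

def unwrap_with_coarse(phi_fine_wrapped, phi_coarse, n_periods):
    """Resolve the 2*pi ambiguity of a high-frequency wrapped phase
    using an unambiguous low-frequency phase. n_periods is the ratio
    of the two pattern spatial frequencies."""
    # Fringe order k: which period of the fine pattern each sample falls in.
    k = np.round((phi_coarse * n_periods - phi_fine_wrapped) / (2 * np.pi))
    return phi_fine_wrapped + 2 * np.pi * k
```

The fine pattern provides precision, the coarse pattern provides range; the rounding step absorbs moderate noise in the coarse phase as long as its error stays below half a fine-pattern period.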

Volume measuring apparatus and volume measuring method for box
11270454 · 2022-03-08

A volume measuring apparatus having a first camera, a second camera, an emitting unit, and a processing unit is disclosed. The processing unit controls the emitting unit to emit invisible structured light, and controls the first and second cameras to capture a left image and a right image, both containing a target box. The processing unit generates a depth graph according to the left and right images, and scans the depth graph along multiple scanning lines to determine a middle line, a bottom line, a left sideline, and a right sideline of the target box in the depth graph. The processing unit then scans within the range bounded by the middle line, the bottom line, the left sideline, and the right sideline to obtain a plurality of width information, height information, and length information. The processing unit computes the volume-related data of the target box according to the plurality of width information, height information, and length information.
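A simplified version of the box-measurement idea can be sketched as follows, assuming an idealized top-down depth map, a known floor depth, and pinhole intrinsics `fx`/`fy` (all hypothetical parameters; the patent's line-scanning procedure is more elaborate):

```python
import numpy as np

def box_volume_from_depth(depth, floor_depth, fx, fy, tol=5.0):
    """Estimate box volume (mm^3) from a top-down depth map in mm.
    Pixels closer to the camera than the floor by more than `tol` are
    treated as box surface; metric width/length follow the pinhole
    model (size = pixel extent * depth / focal length)."""
    mask = depth < floor_depth - tol          # box-top pixels
    ys, xs = np.nonzero(mask)
    top_depth = np.median(depth[mask])        # robust box-top depth
    height = floor_depth - top_depth
    width = (xs.max() - xs.min() + 1) * top_depth / fx
    length = (ys.max() - ys.min() + 1) * top_depth / fy
    return width * length * height
```

Scanning along lines, as the abstract describes, serves the same purpose as the bounding-extent step here: it localizes the box edges in the depth graph before dimensions are converted to metric units.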

Methods and apparatus for optimizing image acquisition of objects subject to illumination patterns

The techniques described herein relate to methods, apparatus, and computer readable media configured to determine parameters for image acquisition. One or more image sensors are each arranged to capture a set of images of a scene, and each image sensor comprises a set of adjustable imaging parameters. A projector is configured to project a moving pattern on the scene, wherein the projector comprises a set of adjustable projector parameters. The set of adjustable projector parameters and the set of adjustable imaging parameters are determined, based on a set of one or more constraints, to reduce noise in 3D data generated based on the set of images.

Estimation of spatial relationships between sensors of a multi-sensor device
11238616 · 2022-02-01

In one implementation, a device has a processor, a projector, a first infrared (IR) sensor, a second IR sensor, and instructions stored on a computer-readable medium that are executed by the processor to estimate the sensor-to-sensor extrinsic parameters. The projector projects IR pattern elements onto an environment surface. The first IR sensor captures a first image including first IR pattern elements corresponding to the projected IR pattern elements, and the device estimates 3D positions for the first IR pattern elements. The second IR sensor captures a second image including second IR pattern elements corresponding to the projected IR pattern elements, and the device matches the first IR pattern elements and the second IR pattern elements. Based on this matching, the device estimates an extrinsic parameter corresponding to a spatial relationship between the first IR sensor and the second IR sensor.
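Once pattern elements are matched and their 3D positions are known in each sensor's frame, the spatial relationship can be recovered with a standard rigid-alignment (Kabsch/SVD) solve. The sketch below illustrates that generic computation, not necessarily the patent's specific estimation procedure:

```python
import numpy as np

def estimate_extrinsics(pts_sensor1, pts_sensor2):
    """Estimate rotation R and translation t so that
    pts_sensor2 ~= R @ p + t for each matched 3D pattern element p,
    given (N, 3) arrays in each IR sensor's frame (Kabsch/SVD solve)."""
    c1, c2 = pts_sensor1.mean(axis=0), pts_sensor2.mean(axis=0)
    H = (pts_sensor1 - c1).T @ (pts_sensor2 - c2)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c2 - R @ c1
    return R, t
```

The quality of the estimate depends entirely on the matching step: a single wrong element-to-element correspondence can bias the least-squares solve, which is why the device first establishes matches from the known projected pattern.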