3D SCANNER WITH STRUCTURED LIGHT PATTERN PROJECTOR AND METHOD OF USING SAME FOR PERFORMING LIGHT PATTERN MATCHING AND 3D RECONSTRUCTION
20240288267 · 2024-08-29
Inventors
CPC classification
G01B11/2545
PHYSICS
G06T7/521
PHYSICS
G01B11/254
PHYSICS
International classification
Abstract
A scanner for generating 3D data relating to a surface of a target object includes a scanner frame on which is mounted a set of imaging modules including a light projector unit for projecting a structured light pattern onto the surface of the target object, wherein the structured light pattern includes a plurality of elongated light stripes arranged alongside one another, and further defining discrete coded elements extending from at least some of the elongated light stripes in the plurality of elongated light stripes. The set of imaging modules further includes a set of cameras positioned alongside the light projector unit for capturing data conveying a set of images including reflections of the structured light pattern projected onto the surface of the target object, and one or more processors in communication with the set of imaging modules for receiving and processing the data conveying the set of images. Related systems and methods are also described.
Claims
1.-124. (canceled)
125. A scanner for generating 3D data relating to a surface of a target object, the scanner comprising: a. a scanner frame; b. a set of imaging modules mounted to the scanner frame in an arrangement defining a plurality of epipolar planes, the set of imaging modules including: i. a light projector unit for projecting a structured light pattern onto the surface of the target object, wherein the projected structured light pattern includes a plurality of elongated light stripes arranged alongside one another and discrete coded elements extending from at least some elongated light stripes in the plurality of elongated light stripes and wherein, for a subset of adjacent elongated light stripes in the plurality of elongated light stripes, an even line corresponding to a specific epipolar plane in the plurality of epipolar planes intersects: A. only a single discrete coded element extending from the subset of adjacent elongated light stripes; or B. multiple discrete coded elements extending from the subset of adjacent elongated light stripes, each discrete coded element in said multiple discrete coded elements being of a different type; and ii. a set of cameras positioned alongside the light projector unit for capturing data conveying a set of images including reflections of the projected structured light pattern projected onto the surface of the target object; and c. one or more processors in communication with the set of imaging modules for receiving and processing the data conveying the set of images.
126. The scanner as defined in claim 125, wherein the light projector unit includes a light source and a pattern generator, the pattern generator including an optical element having translucent portions and opaque portions, the translucent portions and the opaque portions being arranged to shape light emitted by the light source into the projected structured light pattern.
127. The scanner as defined in claim 126, wherein the optical element includes a glass layer, the translucent portions and the opaque portions being defined upon the glass layer, and the opaque portions include a layer of material disposed on the glass layer, the layer of material being substantially opaque to the light source of the light projector unit.
128. The scanner as defined in claim 127, wherein the layer of material comprises at least one of metallic particles or a film.
129. The scanner as defined in claim 125, wherein the set of cameras includes a first camera and a second camera, wherein the first camera and the second camera are spaced from one another and oriented such as to define a baseline for the plurality of epipolar planes.
130. The scanner as defined in claim 125, wherein the discrete coded elements extending from the at least some elongated light stripes include a plurality of different types of discrete coded elements, wherein different types of discrete coded elements in the plurality of different types of discrete coded elements present different specific shapes when extending from the at least some elongated light stripes.
131. The scanner as defined in claim 130, wherein the plurality of different types of discrete coded elements includes at least two different types of discrete coded elements, at least three different types of discrete coded elements, or at least four different types of discrete coded elements.
132. The scanner as defined in claim 125, wherein: a. a first specific elongated light stripe of the at least some elongated light stripes includes a first set of discrete coded elements, each discrete coded element of the first set of discrete coded elements being of a first type; and b. a second specific elongated light stripe of the at least some elongated light stripes includes a second set of discrete coded elements, each discrete coded element of the second set of discrete coded elements being of a second type, wherein the first specific elongated light stripe is distinct from the second specific elongated light stripe.
133. The scanner as defined in claim 125, wherein: a. a first specific elongated light stripe of the at least some elongated light stripes includes a first set of discrete coded elements, at least some discrete coded elements of the first set of discrete coded elements being of different types and being arranged in accordance with a first coding pattern; and b. a second specific elongated light stripe of the at least some elongated light stripes includes a second set of discrete coded elements, at least some discrete coded elements of the second set of discrete coded elements being of different types and being arranged in accordance with a second coding pattern distinct from the first coding pattern.
134. The scanner as defined in claim 125, wherein specific elongated light stripes of the at least some elongated light stripes include respective sets of discrete coded elements, at least some of the discrete coded elements of each set being of different types, and the discrete coded elements of each set being arranged in accordance with a specific one of at least two distinct coding patterns.
135. The scanner as defined in claim 125, wherein a first discrete coded element extends from a first elongated light stripe of the subset of adjacent elongated light stripes and a second discrete coded element extends from a second elongated light stripe of the subset of adjacent elongated light stripes, wherein a position at which the first discrete coded element extends from the first elongated light stripe is diagonally offset from a position at which the second discrete coded element extends from the second elongated light stripe.
136. The scanner as defined in claim 135, wherein the first elongated light stripe is immediately adjacent the second elongated light stripe and wherein the first discrete coded element and the second discrete coded element are of a same type.
137. The scanner as defined in claim 125, wherein discrete coded elements extend from at least some elongated light stripes in the subset of adjacent elongated light stripes, and wherein the discrete coded elements extending from the at least some elongated light stripes in the subset of adjacent elongated light stripes are arranged to form an overall diagonally arranged pattern of discrete coded elements.
138. The scanner as defined in claim 125, wherein discrete coded elements extend from each elongated light stripe in the subset of adjacent elongated light stripes, and wherein the discrete coded elements extending from the each elongated light stripe in the subset of adjacent elongated light stripes are arranged to form an overall diagonally arranged pattern of discrete coded elements.
139. The scanner as defined in claim 125, wherein the even line intersects two discrete coded elements of a same type extending from two different elongated light stripes in the plurality of elongated light stripes, the two different elongated light stripes being separated from one another by at least a minimum number of elongated light stripes.
140. The scanner as defined in claim 139, wherein the minimum number of elongated light stripes is greater than a total number of elongated light stripes in the subset of adjacent elongated light stripes.
141. The scanner as defined in claim 125, wherein the subset of adjacent elongated light stripes includes at least three adjacent elongated light stripes, at least six adjacent elongated stripes or at least eight adjacent elongated light stripes.
142. The scanner as defined in claim 125, wherein discrete coded elements extending from a same specific elongated light stripe in the plurality of elongated light stripes are spaced apart from each other.
143. The scanner as defined in claim 125, wherein each discrete coded element of the discrete coded elements extending from the at least some elongated light stripes comprise at least one protrusion extending from the at least some elongated light stripes or at least one notch extending from the at least some elongated light stripes.
144. The scanner as defined in claim 125, wherein the projected structured light pattern includes discrete coded elements extending from fewer than all elongated light stripes in the plurality of elongated light stripes and includes discrete coded elements extending from at most one of ?, ?, ?, ? and ? of the plurality of elongated light stripes.
145. The scanner as defined in claim 125, wherein the plurality of elongated light stripes in the projected structured light pattern is comprised of non-intersecting elongated light stripes, wherein the non-intersecting elongated light stripes are substantially parallel to one another.
146. The scanner as defined in claim 125, wherein: a. discrete coded elements extending from one elongated light stripe in the plurality of elongated light stripes assist in identifying the one elongated light stripe amongst the plurality of elongated light stripes; or b. discrete coded elements extending from specific elongated light stripes of the plurality of elongated light stripes assist in identifying the specific elongated light stripes amongst the plurality of elongated light stripes.
147. The scanner as defined in claim 125, wherein the scanner is a handheld scanner.
148. The scanner as defined in claim 125, wherein the one or more processors are configured for processing the set of images including the reflections of the projected structured light pattern to perform a 3D reconstruction process of the surface of the target object, the 3D reconstruction process being performed at least in part using the discrete coded elements extending from the at least some elongated light stripes.
149. The scanner as defined in claim 125, wherein the one or more processors are configured for transmitting the data conveying the set of images including the reflections of the projected structured light pattern to a remote computing system distinct from the scanner, the remote computing system being configured for performing a 3D reconstruction process of the surface of the target object using the data conveying the set of images including the reflections of the projected structured light pattern, the 3D reconstruction process being performed at least in part using the discrete coded elements extending from the at least some elongated light stripes.
150. A scanning system comprising: a. the scanner as defined in claim 125; and b. a computing system in communication with said scanner, the computing system being configured for performing a 3D reconstruction process of the surface of the target object using the data conveying the set of images including the reflections of the projected structured light pattern captured by the scanner, the 3D reconstruction process being performed at least in part using the discrete coded elements extending from the at least some elongated light stripes.
151. A light projector unit for projecting a structured light pattern on a surface of an object, the light projector unit being configured for use in a scanner having a set of cameras for capturing data conveying a set of images including reflections of the projected structured light pattern projected on the surface of the object, wherein the set of cameras and the light projector unit are configured to be mounted to the scanner in an arrangement defining a plurality of epipolar planes, wherein the projected structured light pattern includes a plurality of elongated light stripes arranged alongside one another and discrete coded elements extending from at least some elongated light stripes in the plurality of elongated light stripes, and wherein, for a subset of adjacent elongated light stripes in the plurality of elongated light stripes, an even line corresponding to a specific epipolar plane in the plurality of epipolar planes intersects: a. only a single discrete coded element extending from the subset of adjacent elongated light stripes; or b. multiple discrete coded elements extending from the subset of adjacent elongated light stripes, each discrete coded element in said multiple discrete coded elements being of a different type.
152. The light projector unit as defined in claim 151, further comprising a light source and a pattern generator, the pattern generator including an optical element having translucent portions and opaque portions, the translucent portions and the opaque portions being arranged to shape light emitted by the light source into the projected structured light pattern.
153. The light projector unit as defined in claim 152, wherein the optical element includes a glass layer, the translucent portions and the opaque portions being defined upon the glass layer, the opaque portions including a layer of material disposed on the glass layer, the layer of material being substantially opaque to the light source.
154. The light projector unit as defined in claim 151, wherein the even line intersects two discrete coded elements of a same type extending from two different elongated light stripes in the plurality of elongated light stripes, the two different elongated light stripes being separated from one another by at least a minimum number of elongated light stripes.
155. The light projector unit as defined in claim 154, wherein the minimum number of elongated light stripes is greater than a total number of elongated light stripes in the subset of adjacent elongated light stripes.
156. The light projector unit as defined in claim 151, wherein the projected structured light pattern includes discrete coded elements extending from fewer than all elongated light stripes in the plurality of elongated light stripes and includes discrete coded elements extending from at most one of ?, ?, ?, ? and ? of the plurality of elongated light stripes.
157. A computer-implemented method for 3D measurement of a surface of an object, the method comprising: a. receiving at least one image acquired by a sensor that includes reflections of a structured light pattern projected from a light projector onto the surface of the object, wherein the sensor and the light projector are arranged to define a plurality of epipolar planes, wherein the projected structured light pattern comprises a plurality of elongated light stripes and discrete coded elements extending from at least some elongated light stripes in the plurality of elongated light stripes, and wherein, for a subset of adjacent elongated light stripes in the plurality of elongated light stripes, an even line corresponding to a specific epipolar plane in the plurality of epipolar planes intersects: i. only a single discrete coded element extending from the subset of adjacent elongated light stripes; or ii. multiple discrete coded elements extending from the subset of adjacent elongated light stripes, each discrete coded element in said multiple discrete coded elements being of a different type; b. extracting a specific image portion at least in part by identifying areas of the at least one image corresponding to continuous segments of the reflections of the projected structured light pattern; c. associating the specific image portion with at least one discrete coded element of the discrete coded elements; and d. determining a measurement relating to the surface of the object based on a correspondence between the specific image portion and the at least one discrete coded element.
158. The computer-implemented method as defined in claim 157, comprising labelling the specific image portion with a unique identifier.
159. The computer-implemented method as defined in claim 158, comprising: a. selecting a specific epipolar plane of the plurality of epipolar planes; and b. identifying plausible combinations on the specific epipolar plane, the plausible combinations including a light stripe label of the plurality of elongated light stripes and the unique identifier, for a plausible continuous segment of the reflections of the projected structured light pattern selected from the continuous segments of the reflections of the projected structured light pattern in the at least one image.
160. The computer-implemented method as defined in claim 159, comprising identifying the plausible combinations by proximity to the associated at least one continuous segment of the reflections of the projected structured light pattern and the at least one discrete coded elements.
161. The computer-implemented method as defined in claim 159, comprising: a. calculating a matching error for each of the plausible combinations; b. determining a most probable combination by computing a figure of merit for each of the plausible combinations using the matching error to find a most probable match; c. associating each continuous segment of the reflections of the projected structured light pattern with the most probable match; and d. calculating a set of 3D points using matching points of the most probable match.
162. The computer-implemented method as defined in claim 161, comprising validating the matching points to discard the matching points if the figure of merit fails to meet a quality of match threshold.
163. The computer-implemented method as defined in claim 157, wherein the even line intersects two discrete coded elements of a same type extending from two different elongated light stripes in the plurality of elongated light stripes, the two different elongated light stripes being separated from one another by at least a minimum number of elongated light stripes.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0030] The above-mentioned features and objects of the present disclosure will become more apparent with reference to the following description taken in conjunction with the accompanying drawings, wherein like reference numerals denote like elements and in which:
[0044] In the drawings, exemplary embodiments are illustrated by way of example. It is to be expressly understood that the description and drawings are only for the purpose of illustrating certain embodiments and are an aid for understanding. They are not intended to be a definition of the limits of the invention.
DETAILED DESCRIPTION OF EMBODIMENTS
[0045] A detailed description of one or more specific embodiments of the invention is provided below along with accompanying Figures that illustrate principles of the invention. The invention is described in connection with such embodiments, but the invention is not limited to any specific embodiment described. The scope of the invention is limited only by the claims. Numerous specific details are set forth in the following description in order to provide a thorough understanding of the invention. These details are provided for the purpose of describing non-limiting examples and the invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in great detail so that the invention is not unnecessarily obscured.
[0046] The present disclosure presents methods and systems that match specific continuous segments of light reflections (or blobs) observed in a frame capture of a surface of an object to specific corresponding light stripes from a plurality of light stripes in a structured light pattern projected on the surface of the object. With increasing numbers of projected light stripes (e.g., in the hundreds), an increased number of ambiguities is introduced when trying to match possible continuous segment-light stripe combinations; ambiguities that cannot be resolved will be discarded. The use of a structured light pattern including discrete coded elements extending from the projected light stripes may reduce the number of plausible combinations needed to resolve the ambiguities. In particular, the discrete coded elements may accelerate the matching of the continuous segments to projected stripes, improving the fluidity of the scan (e.g., faster scan speed and fewer dropped frames) and reducing bad matches or outliers on the measured scanned surface. Use of the discrete coded elements removes the need for a third camera to resolve ambiguities, allowing for a less costly two-camera system without compromising accuracy.
Definitions
[0047] Herein, light stripes refers to projected lines of light emitted by a projector and forming a pattern on an object's surface or scene.
[0048] Herein, light blobs refer to continuous segments of light in the images, reflected from a surface of an object. As the projected light stripes can be partially or wholly obfuscated and/or deformed depending on the shape of the object's surface, the cameras will detect these continuous segments of light (blobs) rather than elongated lines. Moreover, segments of light (blobs) that correspond to the same light stripe of the structured light pattern may or may not be connected to each other, and thus more than one segment of light (blob) may be matched to a same light stripe from the plurality of light stripes projected by the projector.
[0049] Herein, ambiguities refers to multiple possible matches between a continuous segment of light and multiple candidate light stripes in the structured light pattern. Ambiguities may arise for example if the light stripes in the structured light pattern are similar in position relative to the position of the continuous segment of light in an epipolar plane.
3D Measurements of a Surface
[0051] In some specific practical implementations, the light source of the light projector unit 36 may include one or more LEDs 38 configured to all emit the same type of light or configured to emit different types of light (e.g., IR and/or white light and/or blue light).
[0052] The first and second cameras 31, 32 are typically monochrome cameras, and the type of camera used will depend on the type of light source(s) used in the light projector unit 36. In some embodiments, the first and second cameras 31, 32 may be monochrome, visible color spectrum, or near-infrared cameras, and the light projector unit 36 is an infrared light projector or near-infrared light projector. The cameras 31, 32 may implement any suitable shutter technology, including but not limited to rolling shutters, global shutters, mechanical shutters, optical liquid crystal display (LCD) shutters and the like. In some implementations, the third camera 34 may be a color camera (also called a texture camera), which may likewise implement any suitable shutter technology. In other implementations, the third camera 34 may be of similar configuration to the first and second cameras 31, 32 and used to improve matching confidence and speed. In such embodiments, a fourth camera may be included, so that the scanner includes three near-infrared cameras and a color camera (in one example configuration). In further embodiments, a single camera can be used, and the second (and third and/or fourth) camera omitted.
[0053] As depicted in
[0054] The texture camera 34 is also positioned on the main member 52 of the frame structure 20 and, as depicted, may be positioned alongside the first camera 31, the second camera 32 and the light projector unit 36. The texture camera 34 is oriented in a third camera direction and is configured to have a third camera field of view at least partially overlapping with the field of projection, with the first field of view, and with the second field of view.
[0055] A data connection (such as a USB connection) between the scanner 10 and one or more computer processors (shown in
[0057] The light projector unit P may be configured to project a structured light pattern comprised of a plurality of sheets of light that are arranged alongside one another. The sheets of light may appear as elongated light stripes when projected onto a surface of an object. The elongated light stripes are non-intersecting elongated light stripes and, in some implementations, may be substantially parallel to each other. In some embodiments, the light projector unit P can be a programmable light projector unit that can project more than one pattern of light. For example, the light projector unit P can be configured to project different structured line pattern configurations. In some embodiments, the light projector unit P can emit light having wavelengths between 405 nm and 1100 nm.
[0058] The cameras C1, C2 and the light projector unit P are calibrated in a common coordinate system using methods known in the art. In some practical implementations, films performing bandpass filter functions may be affixed on the camera lenses to match the wavelength(s) of the projector P. Such films performing bandpass filter functions may help reduce interference from ambient light and other sources.
[0059] Using the set of imaging modules 100 with at least one computer processor 160 (shown in FIG. 1B), measurements of 3D points can be obtained by applying a triangulation-based computer-implemented method. In a typical process, two images of a frame are captured using the two cameras C1, C2. The two images are captured simultaneously, with no relative displacement (or negligible relative displacement) between the object being scanned (or sensed) and the set of imaging modules 100 occurring during the acquisition of the images. The cameras C1 and C2 may be synchronized to either capture the images at the same time or sequentially during a period of time in which the relative position of the set of imaging modules 100 with respect to the scene remains the same or varies within a predetermined negligible range. Both of these cases are considered to be a simultaneous capture of the images by the set of imaging modules 100.
[0060] Once the two images of a frame have been captured by C1 and C2, image processing may be applied to the images to derive 3D measurements of the surface of the object being scanned. The two images generated from the two respective viewpoints of the cameras C1, C2 contain reflections of the structured light pattern projected by the light projector unit P onto the object being scanned (the scene). The reflected structured light pattern may appear as a set of continuous segments of light reflection (sometimes referred to as blobs) in each image rather than as continuous light stripes. These segments (blobs) appear lighter than the background in the images and can be segmented using any suitable technique known in the art, such as thresholding the image signal and applying segmentation validation. To reduce the impact of noise in the image, a minimum length of a segment (blob) may be set to a predetermined number of pixels, such as 2 pixels, for example. The pixels that are part of the same continuous segment of light reflection may be indexed with a label.
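The segmentation step described above can be sketched as follows. This is a minimal, hypothetical illustration only (the function name, the 4-connected flood fill, and the default parameter values are assumptions, not part of the disclosure): bright pixels are thresholded, grouped into connected components, rejected when shorter than the minimum pixel count, and indexed with a label.

```python
from collections import deque

def segment_blobs(image, threshold=128, min_size=2):
    """Label continuous segments of light reflection (blobs) in a grayscale
    image: threshold the signal, group bright pixels by 4-connected
    component labeling, and reject blobs below a minimum pixel count.
    Rejected noise blobs are marked -1; background stays 0."""
    h, w = len(image), len(image[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 1
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and labels[y][x] == 0:
                # Flood-fill one connected bright region.
                queue, pixels = deque([(y, x)]), []
                labels[y][x] = next_label
                while queue:
                    cy, cx = queue.popleft()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] >= threshold
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = next_label
                            queue.append((ny, nx))
                if len(pixels) < min_size:
                    for py, px in pixels:  # too short: treat as noise
                        labels[py][px] = -1
                else:
                    next_label += 1
    return labels, next_label - 1
```

For example, an image containing a three-pixel bright streak and one isolated bright pixel yields a single labeled blob; the isolated pixel falls below `min_size` and is discarded as noise.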
[0061] Once continuous segments of light reflections have been identified in the two images of a frame captured by cameras C1 and C2, an epipolar plane may be selected in the next processing step.
[0062] In the case illustrated in
[0064] In
[0065] The one or more computer processors 160 (shown in
[0066] Since the light projector unit P and the cameras C1, C2 are calibrated in a same coordinate system, it is possible to derive triplets of indices where a triplet (I1, I2, IP) is composed of (i) the index of the curve in the first image I1 captured by camera C1; (ii) the index of a candidate corresponding curve in the second image I2 captured by camera C2; and (iii) the index of the elongated light stripe in the structured light pattern projected by light projector unit P. The number of possible combinations of triplets is O(N³) and grows with N, the number of light stripes in the projected structured light pattern. To limit the number of possible combinations, one may analyze the intersections of the line rays from the two cameras C1, C2 and the light projector unit P within the epipolar plane and attribute an error measure to a given intersection.
[0068] Different error measurements can be attributed to an intersection. For example, the error measure can be the minimal sum of distances between a point and each of the three rays. Alternatively, the error measure can be the distance between the intersection of the two camera rays and the projector ray. Other variants are possible. The number of plausible combinations can be reduced significantly after imposing a threshold to the obtained values. When the light stripes of the projector can be approximated by planes that are indexed by an angle, the second error measure can be computed efficiently while allowing one to keep only the closest plane. This will reduce the matching complexity to O(N²).
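The second error measure above can be sketched within one epipolar plane, where all rays reduce to 2D lines. This is a hypothetical sketch (the function names, the ray representation as (origin, direction) pairs, and the threshold value are illustrative assumptions): the two camera rays are intersected, the perpendicular distance from that intersection to the projector ray is taken as the error, and triplets above a threshold are discarded.

```python
import math

def intersect_rays_2d(o1, d1, o2, d2):
    """Intersection of two 2D lines o + t*d within a single epipolar
    plane. Returns None for (near-)parallel rays."""
    bx, by = o2[0] - o1[0], o2[1] - o1[1]
    det = d2[0] * d1[1] - d1[0] * d2[1]
    if abs(det) < 1e-12:
        return None
    t = (d2[0] * by - d2[1] * bx) / det
    return (o1[0] + t * d1[0], o1[1] + t * d1[1])

def point_line_distance(p, o, d):
    """Perpendicular distance from point p to the line through o along d."""
    cross = (p[0] - o[0]) * d[1] - (p[1] - o[1]) * d[0]
    return abs(cross) / math.hypot(d[0], d[1])

def triplet_error(cam1_ray, cam2_ray, projector_ray):
    """Error measure: distance between the intersection of the two camera
    rays and the projector ray (the second variant in the description)."""
    p = intersect_rays_2d(*cam1_ray, *cam2_ray)
    if p is None:
        return float("inf")
    return point_line_distance(p, *projector_ray)

def plausible_triplets(cam1_rays, cam2_rays, projector_rays, threshold):
    """Keep only (i1, i2, ip) index triplets whose error is below a threshold."""
    return [(i1, i2, ip)
            for i1, r1 in enumerate(cam1_rays)
            for i2, r2 in enumerate(cam2_rays)
            for ip, rp in enumerate(projector_rays)
            if triplet_error(r1, r2, rp) < threshold]
```

When the projector stripes are approximated by indexed planes, the innermost loop can instead keep only the closest plane per camera-ray pair, reducing the O(N³) enumeration sketched here to O(N²).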
[0069] After completing these operations, one obtains a list of triplets of potential matches, where each is attributed an error and an index corresponding to a specific epipolar plane. This operation is repeated for all epipolar planes crossing continuous light segments (or blobs) in the images captured by cameras C1 and C2, typically (although not necessarily) for all rows of pixels in the images.
[0070] The triplets along with their associated error and epipolar index are then mapped against the epipolar index. In
[0071] In
[0072] After completion of the matching step for images captured by cameras C1 and C2 for a given frame, measurements of 3D points may be calculated by processing the triplets. For that purpose, one may minimize the distance between the 3D point and each of the three rays in space. It is then assumed that the projected light sheets are very well calibrated, either parametrically or using a look-up table (LUT), to eventually obtain more accurate measurements. In practical applications, the projected light sheet produced through commercial optic components may not correspond exactly to a plane. For this reason, the use of a LUT may be more appropriate. Another possible approach consists of exploiting only the images from the two cameras for the final calculation of the 3D points.
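Minimizing the distance between a 3D point and each ray, as described above, is a standard least-squares problem. The following is an illustrative sketch (the function name and the ray representation are assumptions): for unit directions d_i and origins o_i, the minimizer solves the linear system Σ(I − d_i d_iᵀ) p = Σ(I − d_i d_iᵀ) o_i.

```python
import math

def closest_point_to_rays(rays):
    """Least-squares 3D point minimizing the sum of squared distances to a
    set of rays, each given as (origin, direction). Solves
    sum_i (I - d_i d_i^T) p = sum_i (I - d_i d_i^T) o_i."""
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for o, d in rays:
        n = math.sqrt(d[0] ** 2 + d[1] ** 2 + d[2] ** 2)
        d = (d[0] / n, d[1] / n, d[2] / n)  # normalize the direction
        for i in range(3):
            for j in range(3):
                m = (1.0 if i == j else 0.0) - d[i] * d[j]
                A[i][j] += m
                b[i] += m * o[j]
    # Solve the 3x3 system A p = b by Gaussian elimination with pivoting.
    for c in range(3):
        piv = max(range(c, 3), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, 3):
            f = A[r][c] / A[c][c]
            for k in range(c, 3):
                A[r][k] -= f * A[c][k]
            b[r] -= f * b[c]
    p = [0.0] * 3
    for r in (2, 1, 0):
        p[r] = (b[r] - sum(A[r][k] * p[k] for k in range(r + 1, 3))) / A[r][r]
    return tuple(p)
```

For a triplet, the input would be the two camera rays and the projector ray (or only the two camera rays, per the last variant mentioned above); three rays that all pass through a common point return exactly that point.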
Line Matching
[0073] To enable matching of light stripes, the light projector unit P can be programmed to emit a structured light pattern including elongated light stripes (e.g., lines of light that include rays 402) from which extend discrete coded elements.
[0075] From each of the reflected elongated light stripes 600 in the structured light pattern 700 protrudes a set of discrete coded elements 602 along the length of the light stripe 600. The differently shaped discrete coded elements 602 are located in repeating blocks at known positions along the length of each elongated light stripe 600, such that the combination of elongated light stripes forms a known pattern with the discrete coded elements 602 at known locations and isolated from each other. In the example image of light pattern 700, four differently shaped types of discrete coded elements 602b, 602c, 602d, and 602e are used. The four types of discrete coded elements 602b, 602c, 602d, and 602e are arranged to form a known overall pattern, in this case a diagonally arranged pattern. Instances of each of the four types of discrete coded elements 602b, 602c, 602d, and 602e are located at known intervals along each elongated light stripe 600 of the structured light pattern 700.
[0076] In some embodiments, each of the discrete coded elements 602 could appear, for example, at intervals of approximately 1/100th the total length of a light stripe.
[0077] In the light pattern 700 in
[0078] As depicted in the Figures, the structured light pattern 700 may include the discrete coded elements in an alternating sequence at regular intervals to form a diagonally arranged pattern; however, other suitable arrangements of the discrete coded elements may also be contemplated and will become apparent to the person skilled in the art in view of the present disclosure. For example, in
[0079] In some embodiments, discrete coded elements extend from each elongated light stripe in the structured light pattern, while in other embodiments discrete coded elements extend from fewer than all of the light stripes. For example, ?, ?, ?, ? or ? of the light stripes can include discrete coded elements extending therefrom.
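One way to realize the diagonally arranged repetition of four coded-element types described above is to shift the element type by one position per stripe. The sketch below is purely illustrative: the indexing scheme and modulus are assumptions for demonstration, not the pattern claimed in this specification.

```python
# Hypothetical layout: four coded-element types (cf. 602b-602e)
# repeating along each stripe, offset by one type per stripe so the
# overall arrangement reads diagonally across adjacent stripes.
NUM_TYPES = 4

def coded_element_type(stripe_index: int, block_index: int) -> int:
    """Type (0..3) of the coded element in a given repeating block
    of a given stripe; the per-stripe offset yields the diagonal
    overall pattern."""
    return (stripe_index + block_index) % NUM_TYPES
```

A useful consequence of this layout is that, within any window of four adjacent stripes, the elements occupying the same block position all carry distinct types, which is what lets a single observed element disambiguate its stripe.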
Method
[0080] When generating 3D data relating to a surface of a target object, the existence of a discrete coded element on an elongated light stripe (or the absence of a discrete coded element) is information that may be used to reduce the set of plausible combinations when matching continuous segments to a light stripe, and thus to reduce potential ambiguities. Given a specific epipolar plane, to reduce the number of possible matches, continuities and protrusions (indicating the potential presence of discrete coded elements) in the continuous light segments are identified over multiple epipolar planes, and these continuities and protrusions are used to find which set of light stripes has better correspondence. Finding a specific discrete coded element in the continuous light segments helps to identify the light stripe number and reduce the possible number of matches. In addition, a first continuous light segment near a second continuous light segment that has been assigned an identified marker can also be more easily matched to an elongated light stripe in the structured light pattern.
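The disambiguation step described above can be sketched as a simple candidate filter. The pattern table and type labels below are illustrative assumptions (a lookup from stripe index to the coded-element type at a given block position), not data from the specification.

```python
# Hypothetical table: stripe index -> coded-element type expected at
# the block position where the protrusion was observed.
PATTERN = {10: "b", 11: "c", 12: "d", 13: "e"}

def filter_candidates(candidate_stripes, observed_type):
    """Keep only the stripes whose known coded element at this
    position matches the type of protrusion detected on the
    observed continuous light segment."""
    return [s for s in candidate_stripes if PATTERN.get(s) == observed_type]
```

For instance, a segment that is geometrically compatible with stripes 10 through 13 along a given epipolar plane, but that shows a type-"d" protrusion, is resolved to stripe 12; segments with no protrusion can inherit the identification of a nearby identified segment, as the paragraph notes.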
[0081]
[0082]
Hardware
[0083]
[0084]
[0085] The pattern generator 924 can be an optical element such as a glass layer 926 with an opaque layer 928 that selectively transmits light from the light source 920 through the glass layer 926 in the desired structured pattern. For example, the glass layer 926 can be optical glass and the opaque layer 928 can be a metallic layer formed of metallic particles that forms a film on the optical glass. The metallic particles can be chromium. The opaque layer 928 can be deposited onto the glass layer 926 to form the pattern of lines and coded elements. The opaque layer 928 can be formed using techniques such as thin film physical vapor deposition techniques like sputtering (direct current (DC) or radio frequency (RF) sputtering), thermal evaporation, and etching, as is known in the art. In other embodiments, the pattern generator 924 may be a liquid crystal display-type including a liquid crystal screen, or other device for creating structured light passed from the light source 920, such as using diffractive or interferential light generation methods. The translucent portions of the glass layer 926 are free from the layer of material that is opaque to the light source of the light projector unit, and so act to shape the light being projected through the pattern generator 924.
[0086] The light projector unit 988 further includes a lens 948 for projecting the structured light generated by the light source 920 and shaped by the pattern generator 924 onto the surface of the object being measured.
[0087] Referring back to
[0088] In a non-limiting example, some or all the functionality of the computer processor 992 (e.g., the computer processor 160 of
[0089] As will be readily understood, although the method described herein is carried out with two images thereby forming triplet combinations, in alternative implementations more than two images could be acquired per frame using additional cameras positioned at additional different known viewpoints (such as 1 camera, 2 cameras, 3 cameras, 4 cameras or even more) and the combinations could contain more than three elements. Alternatively or additionally, if more than two images are acquired per frame, the triplet combinations for two of these images could be used to match the points and the additional image(s) could be used to validate the match.
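The validation variant mentioned above (matching with two images, validating with an additional one) can be sketched as a reprojection check. The projection matrix and tolerance below are illustrative assumptions; any real implementation would use the additional camera's calibrated parameters.

```python
import numpy as np

def validate_with_extra_camera(point_3d, P, observed_px, tol=1.0):
    """Reproject a triangulated 3D point through an additional
    camera's 3x4 projection matrix P and accept the match if the
    reprojection error is below tol (in pixels)."""
    ph = P @ np.append(point_3d, 1.0)   # homogeneous projection
    px = ph[:2] / ph[2]                 # perspective divide
    return np.linalg.norm(px - observed_px) < tol

# Toy camera at the origin looking down +Z (illustrative only):
P = np.hstack([np.eye(3), np.zeros((3, 1))])
```

A candidate triplet from the first two cameras is kept only if the third camera actually observed the reflected pattern near the predicted pixel, which cheaply rejects spurious matches.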
[0090] Those skilled in the art should appreciate that in some non-limiting embodiments, all or part of the functionality previously described herein with respect to the processing system of the scanner as described throughout this specification may be implemented using pre-programmed hardware or firmware elements (e.g., microprocessors, FPGAs, application specific integrated circuits (ASICs), electrically erasable programmable read-only memories (EEPROMs), etc.), or other related components.
[0091] In other non-limiting embodiments, all or part of the functionality previously described herein with respect to a computer processor 160 of the set of imaging modules 100 of the scanner 10 may be implemented as software consisting of a series of program instructions for execution by one or more computing units. The series of program instructions can be tangibly stored on one or more tangible computer readable storage media, or the instructions can be tangibly stored remotely but transmittable to the one or more computing unit via a modem or other interface device (e.g., a communications adapter) connected to a computer network over a transmission medium. The transmission medium may be either a tangible medium (e.g., optical or analog communications lines) or a medium implemented using wireless techniques (e.g., microwave, infrared or other transmission schemes).
[0092] The methods described above for generating 3D data relating to a surface of a target object, may be implemented, for example, in hardware, software tangibly stored on a computer-readable medium, firmware, or any combination thereof. For example, the techniques described above may be implemented in one or more computer programs executing on a programmable computer including a processor, a storage medium readable by the processor (including, for example, volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Program code may be applied to input entered using the input device to perform the functions described and to generate output. The output may be provided to one or more output devices, such as a display screen.
[0093] Those skilled in the art should further appreciate that the program instructions may be written in a number of suitable programming languages for use with many computer architectures or operating systems.
[0094] In some embodiments, any feature of any embodiment described herein may be used in combination with any feature of any other embodiment described herein.
[0095] Note that titles or subtitles may be used throughout the present disclosure for convenience of a reader, but these should in no way limit the scope of the invention. Moreover, certain theories may be proposed and disclosed herein; however, they, whether right or wrong, should in no way limit the scope of the invention so long as the invention is practiced according to the present disclosure without regard for any particular theory or scheme of action.
[0096] All references cited throughout the specification are hereby incorporated by reference in their entirety for all purposes.
[0097] It will be understood by those of skill in the art that throughout the present specification, the term "a" used before a term encompasses embodiments containing one or more of what the term refers to. It will also be understood by those of skill in the art that throughout the present specification, the term "comprising", which is synonymous with "including", "containing", or "characterized by", is inclusive or open-ended and does not exclude additional, un-recited elements or method steps.
[0098] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention pertains. In the case of conflict, the present document, including definitions, will control.
[0099] As used in the present disclosure, the terms "around", "about" or "approximately" shall generally mean within the error margin generally accepted in the art. Hence, numerical quantities given herein generally include such error margin such that the terms "around", "about" or "approximately" can be inferred if not expressly stated.
[0100] In describing embodiments, specific terminology has been resorted to for the sake of description, but the disclosure is not intended to be limited to the specific terms so selected, and it is understood that each specific term comprises all equivalents. In case of any discrepancy, inconsistency, or other difference between terms used herein and terms used in any document incorporated by reference herein, the meanings of the terms used herein are to prevail and be used.
[0101] Although various embodiments of the disclosure have been described and illustrated, it will be apparent to those skilled in the art in light of the present description that numerous modifications and variations can be made. The scope of the invention is defined more particularly in the appended claims.