SYSTEM AND METHOD FOR OBJECT RECOGNITION USING THREE DIMENSIONAL MAPPING TOOLS IN A COMPUTER VISION APPLICATION

20220319205 · 2022-10-06

    Abstract

    Described herein are a system and a method for object recognition via a computer vision application, the system including at least the following components: an object to be recognized, the object having object specific reflectance and luminescence spectral patterns, a light source which is configured to project at least one light pattern on a scene which includes the object to be recognized, a sensor which is configured to measure radiance data of the scene including the object when the scene is illuminated by the light source, a data storage unit which includes luminescence spectral patterns together with appropriately assigned respective objects, and a data processing unit.

    Claims

    1. A system for object recognition via a computer vision application, the system comprising at least the following components: an object to be recognized, the object having object specific reflectance and luminescence spectral patterns, a light source which is configured to project at least one light pattern on a scene which includes the object to be recognized, a sensor which is configured to measure radiance data of the scene including the object when the scene is illuminated by the light source, a data storage unit which comprises luminescence spectral patterns together with appropriately assigned respective objects, and a data processing unit which is configured to detect the object specific luminescence spectral pattern of the object to be recognized out of the radiance data of the scene and to match the detected object specific luminescence spectral pattern with the luminescence spectral patterns stored in the data storage unit, and to identify a best matching luminescence spectral pattern and, thus, its assigned object, and calculate a distance, a shape, a depth and/or surface information of the identified object in the scene by reflectance characteristics measured by the sensor.

    2. The system according to claim 1, wherein the at least one light pattern is a temporal light pattern, a spatial light pattern or a temporal and spatial light pattern.

    3. The system according to claim 2, wherein in the case that the light source is configured to project a spatial light pattern or a temporal and spatial light pattern on the scene, the spatial part of the light pattern is formed as a grid, an arrangement of horizontal, vertical, and/or diagonal bars, an array of dots or a combination thereof.

    4. The system according to claim 2, wherein in the case that the light source is configured to project a temporal light pattern or a temporal and spatial light pattern on the scene, the light source comprises a pulsed light source which is configured to emit light in single pulses thus providing the temporal part of the light pattern.

    5. The system according to claim 2, wherein the light source is selected from the group consisting of a dot matrix projector and a time of flight sensor.

    6. The system according to claim 1, wherein the sensor is a hyperspectral camera or a multispectral camera.

    7. The system according to claim 1, wherein the light source is configured to emit one or more spectral bands within UV, visible and/or infrared light simultaneously or at different times in the at least one light pattern.

    8. The system according to claim 1, wherein the object to be recognized is provided with a predefined luminescence material and the resulting object's luminescence spectral pattern is known and used as a tag.

    9. The system according to claim 1, further comprising a display unit which is configured to display at least the identified object and the calculated distance, shape, depth and/or surface information of the identified object.

    10. A method for object recognition via a computer vision application, the method comprising at least the following steps: providing an object to be recognized, the object having object specific reflectance and luminescence spectral patterns, projecting, by means of a light source, at least one light pattern on a scene which includes the object to be recognized, measuring, by means of a sensor, radiance data of the scene including the object when the scene is illuminated by the light source, providing a data storage unit which comprises luminescence spectral patterns together with appropriately assigned respective objects, and providing a data processing unit which is programmed to detect the object specific luminescence spectral pattern of the object to be recognized out of the radiance data of the scene and to match the detected object specific luminescence spectral pattern with the luminescence spectral patterns stored in the data storage unit, and to identify a best matching luminescence spectral pattern and, thus, its assigned object, and to calculate a distance, a shape, a depth and/or surface information of the identified object in the scene by reflectance characteristics measured by the sensor.

    11. The method according to claim 10, wherein the step of providing an object to be recognized comprises providing the object with a luminescence material, thus providing the object with an object specific luminescence spectral pattern.

    12. The method according to claim 10, further comprising the step of displaying via a display device at least the identified object and the calculated distance, shape, depth and/or surface information of the identified object.

    13. The method according to claim 10, wherein the matching step comprises identifying the best matching specific luminescence spectral pattern by using any number of matching algorithms between the estimated object specific luminescence spectral pattern and the stored luminescence spectral patterns.

    14. The method according to claim 10, wherein the detecting step comprises estimating, using the measured radiance data, the luminescence spectral pattern and the reflective spectral pattern of the object in a multistep optimization process.

    15. A non-transitory computer-readable medium storing instructions that, when executed by one or more processors, cause a machine to: provide an object to be recognized, the object having object specific reflectance and luminescence spectral patterns, project, by a light source, at least one light pattern on a scene which includes the object to be recognized, measure, by means of a sensor, radiance data of the scene including the object when the scene is illuminated by the light source, provide a data storage unit which comprises luminescence spectral patterns together with appropriately assigned respective objects, extract the object specific luminescence spectral pattern of the object to be recognized out of the radiance data of the scene, match the extracted object specific luminescence spectral pattern with the luminescence spectral patterns stored in the data storage unit, identify a best matching luminescence spectral pattern and, thus, its assigned object, and calculate a distance, a shape, a depth and/or surface information of the identified object in the scene by reflectance characteristics measured by the sensor.
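The separation of reflective and luminescent components underlying the detecting step (claims 10 and 14) can be illustrated, under strong simplifying assumptions, by a per-band linear model: if two known illuminant spectra E1 and E2 produce measured radiances m1 and m2, and each measurement satisfies m_i = r·E_i + L per wavelength band, then the reflectance pattern r and luminescence pattern L can be solved band by band. This sketch is not the patented multistep optimization; the linear model and all values are illustrative assumptions.

```python
import numpy as np

def separate(m1, m2, e1, e2):
    """Per-band solve of m1 = r*e1 + L and m2 = r*e2 + L
    for reflectance r and luminescence emission L."""
    r = (m1 - m2) / (e1 - e2)   # reflectance spectral pattern
    lum = m1 - r * e1           # luminescence spectral pattern
    return r, lum

# Toy 3-band example (arbitrary units)
e1 = np.array([1.0, 2.0, 1.0])      # illuminant 1 spectrum
e2 = np.array([2.0, 1.0, 3.0])      # illuminant 2 spectrum
r_true = np.array([0.5, 0.4, 0.2])  # assumed reflectance
l_true = np.array([0.0, 0.3, 0.1])  # assumed fluorescent emission
m1 = r_true * e1 + l_true           # simulated radiance, illuminant 1
m2 = r_true * e2 + l_true           # simulated radiance, illuminant 2
r_est, l_est = separate(m1, m2, e1, e2)
```

In practice the bands where the illuminants differ most constrain the solution best; a real system would solve a regularized optimization over many bands rather than this exact two-equation inversion.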

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0091] FIGS. 1a and 1b show schematically embodiments of the proposed system.

    DETAILED DESCRIPTION OF THE DRAWINGS

    [0092] FIG. 1a and FIG. 1b show schematically embodiments of the proposed system. In FIG. 1a the system 100 includes at least one object 130 to be recognized. Further, the system includes two sensors 120 and 121, each of which can be realized by an imager, such as a camera, particularly a multispectral or hyperspectral camera. The system 100 further includes a light source 110. The light source 110 is composed of different individual illuminants, the number and nature of which depend on the method used. The light source 110 may be composed, for example, of two illuminants or of three illuminants. The two illuminants could be chosen as custom LED illuminants; the three illuminants could be commonly available incandescent, compact fluorescent and white-light LED bulbs.

    [0093] The light source 110 in FIG. 1a is configured to project a light pattern on a scene 140 which includes the object 130 to be recognized. The light pattern projected by the light source 110 on the scene 140 is chosen here as a spatial light pattern, namely as a grid. That means that only some points within the scene 140 and, thus, only some points of the object 130 to be recognized are hit by the light emitted by the light source 110.

    [0094] The sensors shown in FIG. 1a are both configured to measure radiance data of the scene 140 including the object 130 when the scene 140 is illuminated by the light source 110. It is possible to choose two different sensors, namely one sensor which is configured to measure only light of the same wavelength as the emitted structured light. Thus, the effect of ambient lighting conditions is minimized, and the sensor can clearly measure a deviation from the known geometry of the light introduced to the scene 140 upon the return of the light reflected back to the sensor 120, 121, so that a data processing unit (not shown here) can use such distortions to calculate a distance, a shape, a depth and/or other object information of the object 130 to be recognized. The wavelength of light used by this sensor 120, 121 can lie anywhere in the UV, visible or near-IR regions of the light spectrum. The second sensor 120, 121 may be a multispectral or hyperspectral camera which is configured to measure radiance data of the scene 140 including the object 130 over the entire light spectrum, or over at least that part of the light spectrum that comprises the fluorescence spectral pattern of the object 130. Thus, the second sensor 120, 121 is also configured to measure radiance data of the scene 140 including the object 130 resulting not only from the reflective but also from the fluorescent response of the object 130. The data processing unit is configured to extract the object-specific luminescence spectral pattern of the object 130 to be recognized out of the radiance data of the scene 140, to match the extracted object-specific luminescence spectral pattern with luminescence spectral patterns stored in a data storage unit (not shown here), and to identify a best matching luminescence spectral pattern and, thus, its assigned object.
Further, as already mentioned above, the data processing unit is configured to calculate a distance, a shape, a depth and/or surface information of the identified object 130 in the scene 140 from the way the reflected light pattern deforms when striking a surface of the object 130. The system 100 shown here uses, on the one hand, structured light to calculate properties such as the distance to the object 130 or the object shape by means of the reflective response of the object 130 when hit by the light emitted from the light source 110. On the other hand, the proposed system 100 uses the separation of the fluorescent emission and reflective components of the object 130 to be recognized to identify the object 130 by its spectral signature, namely by its specific fluorescence spectral pattern. Thus, the proposed system 100 combines both methods: identifying the object 130 by its object-specific fluorescence pattern and, in addition, determining its distance, shape and other properties from the reflected portion of the light spectrum via the distortion of the structured light pattern. The data processing unit and the data storage unit are also components of the system 100.
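The database search described above can be sketched, under illustrative assumptions, as a nearest-neighbor match of the extracted luminescence spectrum against stored reference spectra using cosine similarity (the spectral angle); the labels, band count, and similarity measure here are hypothetical stand-ins for whatever matching algorithm the data processing unit employs.

```python
import numpy as np

def best_matching_object(measured, database):
    """Return the label whose stored luminescence spectrum is closest
    to the measured spectrum under cosine similarity (1.0 = identical
    spectral shape, independent of overall intensity)."""
    best_label, best_score = None, -1.0
    m = measured / np.linalg.norm(measured)
    for label, reference in database.items():
        r = reference / np.linalg.norm(reference)
        score = float(np.dot(m, r))
        if score > best_score:
            best_label, best_score = label, score
    return best_label, best_score

# Toy 4-band luminescence spectra for two hypothetical tagged objects
db = {"part_A": np.array([0.1, 0.8, 0.3, 0.0]),
      "part_B": np.array([0.0, 0.2, 0.9, 0.4])}
label, score = best_matching_object(np.array([0.1, 0.75, 0.35, 0.05]), db)
```

A deployed system would typically also apply a minimum-similarity threshold, so that an object whose spectrum matches nothing in the database is reported as unrecognized rather than forced onto the nearest entry.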

    [0095] FIG. 1b shows an alternative embodiment of the proposed system. The system 100′ comprises a light source 110′ which is configured to emit UV, visible or infrared light in a known pattern, such as a dot matrix as indicated in FIG. 1b. Generally, the light source 110′ may be configured to emit pulses of light into the scene 140′, thus generating a temporal light pattern, to emit light into only parts of the scene 140′, thus generating a spatial light pattern, or to emit a combination of the two. A combination of pulsed and spatially structured light can be emitted, for example, by a dot matrix projector, a LiDAR, etc. The system 100′ shown in FIG. 1b further comprises a sensor 120′ which is configured to sense/record radiance data/responses over the scene 140′ at different wavelength ranges. That means that not only the merely reflective response of the scene 140′ including the object 130′ to be recognized is recorded, but also the fluorescent response of the object 130′. The system 100′ further comprises a data processing unit and a data storage unit. The data storage unit comprises a database of fluorescence spectral patterns of a plurality of different objects. The data processing unit is in communicative connection with the data storage unit and also with the sensor 120′. Therefore, the data processing unit can calculate the luminescence emission spectrum of the object 130′ to be recognized and search the database of the data storage unit for a match with the calculated luminescence emission spectrum. Thus, the object 130′ to be recognized can be identified if a match within the database can be found.
Additionally, by using the structured light which has been emitted from the light source 110′ and projected onto the scene 140′ and, thus, also onto the object 130′ to be recognized, it is possible to derive from a measured distortion of the light pattern reflected back to the sensor 120′ further information about the object 130′, such as its distance, shape, or surface information. That means that by choosing a light source 110′ generally used for 3D mapping tools to accommodate luminescence responses from the object to be recognized, and by utilizing a sensor 120′ with specific spectral reading bands, the proposed system 100′ is able to determine not only a best matching spectral luminescent material but also a distance to the object 130′, an object shape and other 3D information about the object 130′. The proposed system thus enables the simultaneous use of a luminescent-color-based object recognition system and 3D space mapping tools. That means that the proposed system 100′ allows identifying the object 130′ by its spectral signature, such as its object-specific luminescence spectrum, in addition to calculating its distance, shape and other properties from the reflected portion of the light which has been projected into the scene 140′.
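The geometric side of the embodiment above can be illustrated by the standard projector-camera triangulation used in structured-light systems: for a rectified pair, the depth of the surface a dot landed on follows from how far the dot is displaced from its reference position. The pinhole model and all parameter values below are illustrative assumptions, not taken from the patent.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth z = f * b / d for a rectified projector-camera pair:
    disparity_px is the measured shift of a projected dot from its
    reference position, focal_px the camera focal length in pixels,
    baseline_m the projector-camera baseline in meters."""
    if disparity_px <= 0:
        raise ValueError("dot not displaced; depth is unconstrained")
    return focal_px * baseline_m / disparity_px

# A dot displaced 20 px, with an assumed 800 px focal length
# and a 5 cm projector-camera baseline:
z = depth_from_disparity(20.0, 800.0, 0.05)
```

Repeating this for every identified dot in the pattern yields a sparse depth map of the scene, from which shape and surface information of the object can be interpolated.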

    [0096] Further, the light source may emit a plurality of different light patterns one after the other, or it may emit a plurality of different light patterns simultaneously. By using different light patterns, detailed information about the shape, depth and distance of the object can be derived from the respective reflected responses of the scene and of the object within the scene. Each of the plurality of light patterns projected into the scene hits the object at different sections/areas of its surface and, therefore, each pattern provides different information which can be derived from the respective reflective response. The data processing unit, which is in communicative connection with the sensor recording all those reflective responses, can merge all the different reflective responses assigned to the different light patterns and can calculate therefrom a detailed 3D structure of the object to be recognized. In summary, the proposed system can identify the object by a measurement of the object-specific luminescence spectral pattern and can provide detailed information about the distance of the object to the sensor and, further, 3D information about the object from the distortion of the light pattern reflected back to the sensor. Not only can different light patterns be projected onto the object in order to hit all surface sections of the object, but different patterns of light at different wavelength ranges can also be projected onto the object, thus providing further information about the reflective and also the fluorescent nature of the surface of the object.
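The merging step described above can be sketched as combining the sparse per-pattern depth samples into one denser map; here each pattern contributes readings only at the pixels its dots actually hit (NaN elsewhere), and overlapping readings are averaged. The grid size and sample layout are assumptions for illustration.

```python
import numpy as np

def merge_pattern_depths(depth_maps):
    """Combine sparse per-pattern depth maps (NaN = no dot hit that
    pixel) by averaging wherever at least one pattern gave a reading."""
    stack = np.stack(depth_maps)
    counts = np.sum(~np.isnan(stack), axis=0)  # readings per pixel
    summed = np.nansum(stack, axis=0)          # NaNs treated as 0
    merged = np.full(stack.shape[1:], np.nan)
    hit = counts > 0
    merged[hit] = summed[hit] / counts[hit]
    return merged

nan = np.nan
p1 = np.array([[1.0, nan], [nan, 2.0]])  # depths from light pattern 1
p2 = np.array([[nan, 1.5], [nan, 2.2]])  # depths from light pattern 2
m = merge_pattern_depths([p1, p2])
```

Pixels hit by no pattern remain NaN, reflecting that the merged 3D structure is only as complete as the union of surface sections the projected patterns actually reached.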

    LIST OF REFERENCE SIGNS

    [0097] 100, 100′ system

    [0098] 110, 110′ light source

    [0099] 120, 121, 120′ sensor

    [0100] 130, 130′ object to be recognized

    [0101] 140, 140′ scene