Method and system for localizing mobile robot using external surveillance cameras
09849589 · 2017-12-26
Assignee
Inventors
Cpc classification
Y10S901/01
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
G06V10/25
PHYSICS
B25J9/1666
PERFORMING OPERATIONS; TRANSPORTING
G06V10/42
PHYSICS
International classification
Abstract
A mobile robot employing a method and a system for localizing the mobile robot using external surveillance cameras acquires images from surveillance cameras installed adjacent to each other indoors, recognizes objects included in the images by removing shadows from the images and applying a homography scheme, and avoids the recognized objects. Compared with a conventional image-based autonomous robot, the mobile robot employing the method and the system for localizing the mobile robot using the external surveillance cameras enables faster localization at a lower cost, so that the commercialization of a service robot is accelerated.
Claims
1. A system for localizing a mobile robot using external surveillance cameras, the system comprising: a control unit that recognizes an object and an obstacle included in received surveillance camera images through a shadow removing scheme and a homography scheme, creates a moving path to allow travelling while avoiding the object and the obstacle, and generates a corresponding control signal to control components other than the control unit; a sensing unit that senses a steering angle and rotation of a driving motor and transmits the steering angle and the rotation of the driving motor to the control unit; a traveling unit that generates driving force by the control signal; a steering unit that performs steering along the moving path by the control signal; a communication unit that transmits images, which are obtained from the surveillance cameras, to the control unit; and a power supply unit that supplies power to components other than the power supply unit.
2. The system of claim 1, further comprising a proximity sensor or a distance sensor provided on one side of an outer portion of the system.
3. The system of claim 1, wherein a communication protocol applied to the communication unit is a Zigbee wireless communication protocol.
4. The system of claim 1, wherein one wireless communication protocol or a plurality of wireless communication protocols are applied to the communication unit.
5. The system of claim 1, wherein the traveling unit comprises a moving unit including a wheel, a caterpillar, or a leg for walking.
6. The system of claim 1, wherein the sensing unit comprises a camera or a vision sensor.
7. The system of claim 2, wherein the sensing unit comprises a camera or a vision sensor.
8. The system of claim 3, wherein the sensing unit comprises a camera or a vision sensor.
9. The system of claim 4, wherein the sensing unit comprises a camera or a vision sensor.
10. The system of claim 5, wherein the sensing unit comprises a camera or a vision sensor.
11. A method of localizing a mobile robot using external surveillance cameras, the method comprising: converting original images acquired from indoor surveillance cameras installed adjacent to each other into binary images and removing shadows from the binary images; merging the binary images having no shadows with each other through a homography scheme; recognizing locations and sizes of objects included in the original images through a contour scheme; and compensating for errors of the recognized locations and sizes of the objects, and mapping the objects having the compensated locations and sizes onto an image of a real floor, which is merged with a grid.
12. The method of claim 11, wherein, in the merging of the binary images having no shadows with each other through the homography scheme, Equations 1 and 2 are applied to the binary images having no shadows,
q=HQ (Equation 1), in which q and Q represent the planes q and Q, respectively, and
13. The method of claim 11, further comprising planning a moving path based on the mapping after the mapping of the objects having the compensated locations and sizes.
14. The method of claim 12, further comprising planning a moving path based on the mapping after the mapping of the objects having the compensated locations and sizes.
15. The method of claim 13, further comprising: marking the planned moving path and an actual traveling path on a same map and calculating an error bound between the moving path and the traveling path; and transmitting the error bound to an external system, after the planning of the moving path and the actual traveling of the mobile robot along the planned moving path.
16. The method of claim 14, further comprising: marking the planned moving path and an actual traveling path on a same map and calculating an error bound between the moving path and the traveling path; and transmitting the error bound to an external system, after the planning of the moving path and the actual traveling of the mobile robot along the planned moving path.
17. The method of claim 15, wherein a square lattice or grid including gradations spaced at a predetermined interval is displayed together with the moving path and the traveling path on the map having the planned moving path and the actual traveling path marked thereon.
18. The method of claim 16, wherein a square lattice or grid including gradations spaced at a predetermined interval is displayed together with the moving path and the traveling path on the map having the planned moving path and the actual traveling path marked thereon.
19. The method of claim 15, wherein the external system, to which the error bound is transmitted, stores and processes error bounds transmitted from a plurality of mobile robots, and transmits the stored error bounds or a processing result to the outside.
20. The method of claim 16, wherein the external system, to which the error bound is transmitted, stores and processes error bounds transmitted from a plurality of mobile robots, and transmits the stored error bounds or a processing result to the outside.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE INVENTION
(15) Hereinafter, some exemplary embodiments of the present invention will be described in detail with reference to accompanying drawings.
(16) Before the detailed description of the present invention, terms and words used in the specification shall not be interpreted according to their common dictionary meanings, but shall be interpreted as relevant to the technical scope of the invention, based on the principle that the inventor may properly define the concepts of the terms to explain the invention in the best way.
(17) In other words, the terms are only used in the specification to explain the exemplary embodiment of the present invention, and not used to limit the scope of the present invention. In addition, those skilled in the art should understand that the terms are defined in consideration with various possibilities of the present invention.
(18) Further, in the following description, a predetermined component expressed in the singular may contain a plurality of components unless otherwise indicated. Similarly, components expressed in the plural may contain a singular concept.
(19) Throughout the whole detailed description, when a predetermined component includes another component, the predetermined component does not exclude other components, but may further include other components unless otherwise indicated.
(20) Further, those skilled in the art should understand the following. When it is described that a predetermined component exists in another component, or is installed in connection with another component, the predetermined component may be directly connected with another component, may be installed in contact with another component, or may be spaced apart from another component by a predetermined distance. If the predetermined component is spaced apart from another component by the predetermined distance, a third component or a third unit may be provided to fix or connect the predetermined component to another component, and the details of the third component or unit may be omitted.
(21) Meanwhile, those skilled in the art should understand that, when the predetermined component is directly linked to or directly connected with another component, the third component or the third unit is not provided.
(22) Similarly, other expressions used to explain the relationships among components, that is, "between", "immediately between", "adjacent to", and "directly adjacent to", should be interpreted with the same intent.
(23) In the following description, the terms of one surface, opposite surface, one side, opposite side, first, and second, if used, are used to clearly distinguish between one component and a different component, and the meanings of relevant components should not be limitingly interpreted due to the terms.
(24) Further, in the following description, the terms on, under, left, and right related to a position, if used, should be interpreted as representing the relative position of a relevant component in the relevant drawing. Unless the position is specified as an absolute position, such terms should not be interpreted as representing an absolute position.
(25) Moreover, those skilled in the art should understand that the terms part, unit, module, and device, if used in the detailed description of the present invention, mean a unit capable of processing at least one function or operation, and that such a unit may be realized in hardware, software, or a combination of the two.
(26) In the following description of the present invention, when assigning a reference numeral to each component in each drawing, the same component is assigned with the same reference numeral even if the same component is expressed in a different drawing. In other words, throughout the whole detailed description of the invention, the same reference numeral is assigned to the same component.
(27) The sizes and the positions of the components constituting the present invention, and the relationships between the components, may be partially exaggerated, reduced, or omitted for clarity. Accordingly, the proportions and scales are not strict.
(28) In addition, the details of the generally-known technology that makes the subject matter of the present invention unclear will be omitted in the following description.
(30) As shown in
(31) The control unit 100 is a component that controls the sensing unit 200, the traveling unit 300, the steering unit 400, and the communication unit 500 so that the localization system 1000 may travel while avoiding an obstacle. The control unit 100 receives images photographed by the surveillance cameras (reference number thereof is omitted) from the communication unit 500, removes shadows of objects from the images, perceives the objects and obstacles included in the images, forms a traveling path that allows the localization system 1000 and the mobile robot to travel while avoiding them, and controls a driving unit 450.
(32) The sensing unit 200 refers to an encoder (not shown) provided in the traveling unit 300 to count the rotation of a driving motor 360, and a turret encoder 420 provided in the steering unit 400 to measure a steering angle. The sensing unit 200 may further include a proximity sensor or a distance sensor.
(33) The traveling unit 300, which is a unit to move the system 1000 for localizing the mobile robot using the external surveillance camera, includes a motor driver 310, a driving wheel 320, driving pulleys 330 and 350, a driving belt 340, and the driving motor 360 according to the example embodiment of the present invention (see
(34) In this case, the motor driver 310 controls a rotation direction and a rotation degree of the driving motor 360 under the control of the control unit 100.
(35) The driving wheel 320 is directly linked to the driving pulley 330 to receive rotational force from the driving pulley 350, which is directly linked to the driving motor 360, through the driving belt 340 and thus rotate, thereby moving the localization system 1000.
(36) The steering unit 400 includes a turret pulley 410, the turret encoder 420, and a turret motor (not shown) (see
(37) The driving unit 450 refers to both of the traveling unit 300 and the steering unit 400 to move the localization system 1000.
(38) The communication unit 500 receives the images, which are photographed by the surveillance cameras (reference number thereof is omitted), from the surveillance cameras to transmit the images to the control unit 100. In this case, preferably, a communication protocol between each of the surveillance cameras and the communication unit 500 may be a protocol of a Zigbee scheme. However, various communication protocols other than the Zigbee protocol may be employed according to work objects or work environments.
(39) The power supply unit 600 supplies power to the other components, that is, the control unit 100, the sensing unit 200, the traveling unit 300, the steering unit 400, and the communication unit 500. The control unit 100, the sensing unit 200, the steering unit 400, and the communication unit 500 are directly connected with the power supply unit 600 to receive power, while the traveling unit 300 is not directly connected with the power supply unit 600 but is connected with the steering unit 400 to receive power. Alternatively, according to another embodiment, the traveling unit 300 may be directly connected with the power supply unit 600 to receive power.
(40) The system 1000 for localizing the mobile robot using the external surveillance camera, which includes the control unit 100, the sensing unit 200, the traveling unit 300, the steering unit 400, and the communication unit 500, may be preferably realized in the form of a mobile robot to travel while avoiding an obstacle.
(41) In addition, the distance sensor or the proximity sensor is further provided on one side of an outer portion of the system 1000 for localizing the mobile robot using the external surveillance cameras according to the exemplary embodiment of the present invention, to measure the distances among the localization system 1000, a mobile robot, and an obstacle, or to detect that the localization system 1000 and the mobile robot are in proximity to the obstacle.
(42) In addition, the traveling unit 300 includes a BLDC motor, and the steering unit 400 includes a stepping motor, so that the lifespan of the traveling unit 300 may be extended and the fine-steering capability of the steering unit 400 may be improved.
(43) In addition, although the above description has been made in that the system 1000 for localizing the mobile robot using the external surveillance cameras according to the exemplary embodiment travels with a wheel as a moving unit thereof, the present invention is not limited thereto. In other words, the system 1000 for localizing the mobile robot using the external surveillance cameras according to the present invention may be applied to various robots employing various moving units such as caterpillars or legs for walking.
(44) In addition, although
(45) In addition, the sensing unit 200 may further include a camera or a vision sensor, so that the activity ranges of both of the system 1000 for localizing the mobile robot using the external surveillance cameras according to the present invention and the mobile robot may be expanded to the outside as well as an interior.
(46) Hereinafter, lines to connect the components with each other will be described with reference to
(47) Lines having no arrows at terminals thereof are power lines to supply power, which is generated from the power supply unit 600, to the components.
(48) Lines having arrows at terminals thereof are lines to represent transmission directions of the control signal or data generated from the control unit 100, and the control signal or data are transmitted from a component connected with an end portion of a relevant line having no arrow to a component connected with an end portion of the relevant line having an arrow.
(50) In this case, the external surveillance cameras, which are installed indoors outside the robot, are preferably installed on two opposite sides of an indoor space, so that the information of the region covered by the object may be complemented.
(52) A corridor image obtained from the surveillance camera as shown in left side of
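The conversion of a surveillance-camera image to a binary image with shadows removed might be sketched as follows. This is only an illustrative reading: the background model, the single intensity threshold, and its value are assumptions, not the algorithm specified by the patent.

```python
import numpy as np

def binary_no_shadow(frame, background, obj_thresh=60):
    """Binarize a grayscale frame against a background model, dropping shadows.

    Assumption: shadow pixels only darken the floor moderately, while real
    objects differ strongly from the background, so a single high threshold
    on |frame - background| keeps object pixels and discards shadow pixels.
    """
    diff = np.abs(frame.astype(int) - background.astype(int))
    return (diff > obj_thresh).astype(np.uint8)
```

A pixel that differs strongly from the background (an object) maps to 1, while a moderate darkening typical of a cast shadow maps to 0.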
(54) In order to perform projection-conversion like the air view shown in right side of
(55) In the chess board shown in right side of
(58) A 3×3 homography matrix H may be found using four points of the plane Q and four points of the plane q, which are found as shown in
q=HQ (Equation 1)
(59) In Equation 1, q and Q represent the planes q and Q, respectively.
(60) If the coordinates of feature points of the two images, which are projection conversion results obtained through the homography scheme, are combined and expressed as coordinates of a new projected plane, as indicated by reference number ① of
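A minimal sketch of estimating the 3×3 matrix H from four point correspondences and applying q = HQ. The direct linear solve below (normalizing H's last entry to 1) is one standard way to do this, equivalent in spirit to OpenCV's `getPerspectiveTransform`; the point values are illustrative.

```python
import numpy as np

def find_homography(src, dst):
    """Estimate H such that dst ~ H @ src from exactly 4 point pairs.

    Builds the standard 8x8 linear system of the direct linear transform,
    fixing the bottom-right entry of H to 1.
    """
    A, b = [], []
    for (X, Y), (x, y) in zip(src, dst):
        A.append([X, Y, 1, 0, 0, 0, -x * X, -x * Y]); b.append(x)
        A.append([0, 0, 0, X, Y, 1, -y * X, -y * Y]); b.append(y)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def project(H, pt):
    """Apply q = H Q in homogeneous coordinates and dehomogenize."""
    q = H @ np.array([pt[0], pt[1], 1.0])
    return q[:2] / q[2]
```

With src taken as four floor points in the camera image and dst as the matching chessboard or grid corners in the top view, `project` maps any image point onto the projected floor plane.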
(62) In order to find out an object bottom region on the 2-D map, the original images of the surveillance cameras are acquired as shown in
(63) Next, the central position and the size of an object image in the 2-D map must be calculated.
(64) On the assumption that the coordinates of the two projection-converted images approximate each other, the size and the location of the object in contact with the floor included in the image may be substantially matched with those of the object in the real environment. Accordingly, if overlapping image parts are removed from the two images, except for the image parts in close contact with the floors of the two images, an effect that the objects are viewed from the top, similarly to a bird's-eye view, may be produced.
(65) The following Equation 2 is used for the homography projection.
(66) In this case, on the assumption that a projected image of an image by camera 1 is I.sub.1.sup.H(x,y), and a projected image of an image by camera 2 is I.sub.2.sup.H(x,y), the size and the location of an object on a projected plane are produced by Equation 2.
(67)
(68) In Equation 2, I.sub.1.sup.H(x,y) and I.sub.2.sup.H(x,y) represent the projected images by camera 1 and camera 2, respectively.
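Since the body of Equation 2 is not reproduced here, the following sketch assumes it denotes a pixelwise intersection of the two binary projected images, which matches the surrounding description of keeping only the image parts common to both views (those in contact with the floor); this reading is an assumption, not the patent's stated formula.

```python
import numpy as np

def merge_projected(i1_h, i2_h):
    # Assumed reading of Equation 2: keep only pixels set in both projected
    # binary images I1^H and I2^H, i.e. the floor-contact regions that the
    # two camera views agree on after homography projection.
    return np.logical_and(i1_h > 0, i2_h > 0).astype(np.uint8)
```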
(71) In order to detect the information of each object from the image, the following procedures are performed.
(72) First, an object region is detected. In order to detect the object from the image, a labeling or contour scheme is typically used. These schemes are appropriate for detecting the object region from the image of
(73) Second, a moment is calculated in order to calculate the central coordinates and the area of the object region. The moment is used when the size of the object region is calculated. According to the present invention, the size and the central coordinates of the object are calculated based on the information of the outline found through the contour scheme.
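The moment computation in the second step can be sketched directly on a binary region mask. In practice a library routine (for example, OpenCV's contour moments) would be used; this NumPy version computes the same zeroth and first raw moments, giving the area and the central coordinates described above.

```python
import numpy as np

def region_moments(mask):
    """Area (m00) and centroid (m10/m00, m01/m00) of a binary region."""
    ys, xs = np.nonzero(mask)
    m00 = xs.size                      # zeroth moment: pixel count (area)
    cx, cy = xs.mean(), ys.mean()      # first moments over area: centroid
    return m00, (float(cx), float(cy))
```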
(74) In order to compare the location of the object region of
(76) When the size and the central location of the object region obtained through the contour scheme are compared with the size and the central location of a real object, large errors are found. Accordingly, the difference in location between the detected object and the real object is measured through image processing based on the lattice shown in
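The lattice-based measurement can be sketched as a pixel-to-floor conversion; the cell size and pixels-per-cell parameters below are hypothetical, assuming only that the floor grid has a known real spacing visible in the projected image.

```python
def pixel_to_floor_cm(cx_px, cy_px, px_per_cell, cell_cm=10.0):
    # Assumption: each floor-grid cell is cell_cm wide in the real
    # environment and px_per_cell wide in the projected image, so a
    # detected centroid converts from pixels to centimeters by scaling.
    scale = cell_cm / px_per_cell
    return cx_px * scale, cy_px * scale
```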
(78) For the error measurement experiment, a cylindrical object having a diameter of 20 cm is used. The primary measurement error compensation in the detection of the object region is performed through the homography scheme. The error bound is 7.1 cm on the 2-D map obtained by mapping a grid image detected through image processing with the floor image on the air view.
(81) In this case, as shown in
(83) The error bound is 5 cm between the moving path planned by the mobile robot employing the method and system for localizing the mobile robot using the external surveillance cameras according to the exemplary embodiment of the present invention, and the actual traveling path along which the mobile robot moves.
(84) In this case, the mobile robot may calculate the error bound between the planned moving path and the traveling path that the mobile robot actually moves, and may transmit the calculated result to a specific external system. The external system, which is a computing system specified by a user of the mobile robot, may be a personal computer system, or a computer system for service, such as a server.
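The error-bound calculation between the planned and actual paths might be computed as a maximum pointwise deviation; this reading, and the representation of each path as a sequence of sampled (x, y) points, are assumptions for illustration.

```python
import math

def path_error_bound(planned, actual):
    # Assumption: both paths are sampled at corresponding points, and the
    # error bound is the maximum Euclidean deviation between samples.
    return max(math.dist(p, a) for p, a in zip(planned, actual))
```

The resulting scalar is what the robot would transmit to the external system for storage and processing.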
(86) As described above, although exemplary embodiments of the present invention have been described, the various embodiments disclosed in the DETAILED DESCRIPTION OF THE INVENTION are provided only for illustrative purposes. Those skilled in the art can understand that various modifications, variations, and equivalents of the present invention are possible based on the above description.
(87) In addition, since the present invention can be realized in various forms, the present invention is not limited to the above embodiments. The above description is provided only to allow those skilled in the art to fully understand the scope of the present invention, and those skilled in the art should know that the present invention is defined by the appended claims.