Vehicle path restoration system through sequential image analysis and vehicle path restoration method using the same
20230230391 · 2023-07-20
Inventors
CPC classification
G06V20/588
PHYSICS
International classification
Abstract
Disclosed is a vehicle path restoration system through sequential image analysis which includes: an image capturing unit that acquires sequential images from the front camera installed in the subject vehicle; an image analysis unit for generating multiple lanes that can be recognized from the sequential images of the video file acquired by the image capturing unit and multi-paths calculated using the geometric characteristics of the lanes recognized at the current time and the speed of the subject vehicle, that restores the path of the subject vehicle and restores the path of the front vehicle driving in front of the subject vehicle; a memory for storing path data of the subject vehicle and the front vehicle restored by the image analysis unit; and a display unit that expresses the path data of the subject vehicle and the front vehicle stored in the memory in the form of a top view.
Claims
1. Vehicle path restoration system through sequential image analysis, comprising: an image capturing unit that acquires sequential images from the front camera installed in the subject vehicle; an image analysis unit for generating multiple lanes that can be recognized from the sequential images of the video file acquired by the image capturing unit and multi-paths calculated using the geometric characteristics of the lanes recognized at the current time and the speed of the subject vehicle, that restores the path of the subject vehicle and restores the path of the front vehicle driving in front of the subject vehicle; a memory for storing path data of the subject vehicle and the front vehicle restored by the image analysis unit; and a display unit that expresses the path data of the subject vehicle and the front vehicle stored in the memory in the form of a top view.
2. The vehicle path restoration system through sequential image analysis of claim 1, wherein the multi paths are located by a plurality of indexes in a predefined memory space, so that it is possible to store path data even when the subject vehicle changes lanes left and right between the multi paths.
3. Vehicle path restoration method through sequential image analysis, comprising: acquiring sequential images from a front camera of an image capturing unit installed in a subject vehicle; generating a path of a subject vehicle and a front vehicle by receiving the sequential image and performing image analysis in an image analysis unit; and expressing the image analysis of the image analysis unit in the form of a top view on the display unit, wherein performing image analysis in the image analysis unit comprises: determining a lane of the current time; generating a multi-lane recognizable in the front camera image and a multi-path for each index calculated using geometric characteristics of the lane recognized at the current time and the speed of the subject vehicle; generating path data of the subject vehicle; and generating path data of the front vehicle.
4. The vehicle path restoration method through sequential image analysis of claim 3, wherein creating a path set of each index in the multi-path comprises: receiving input information that is input data for creating a path set; determining whether the path is a curved line or a straight lane; generating a curved path when the path is determined as a curved lane, and generating a straight path when the path is determined as a straight lane; and creating a path set of the corresponding index by integrating the curved path and the straight path.
5. The vehicle path restoration method through sequential image analysis of claim 4, wherein the input data for creating the path set are the speed of the subject vehicle, the path index number, the equation coefficient of a straight line and the equation coefficient of the curve suitable for the lane recognized in the image, the radius of curvature, and the curve discrimination coefficient.
6. The vehicle path restoration method through sequential image analysis of claim 3, wherein generating the multi-path for each index includes: determining an index of a path connected to the lane recognized in the current frame in the front camera image as an active path index, and determining an index of a path not connected to the lane recognized in the current frame in the front camera image as an inactive path index; the active path index is passed to the active multi-path to generate an active multi-path; and the inactive path index is transferred to the inactive multi-path, and in the inactive multi-path generation, a new path is not added, but according to the movement of the subject vehicle, correcting the location of the path stored in the past time.
7. The vehicle path restoration method through sequential image analysis of claim 6, wherein the active multi-path generation is the entire process of a method of creating a path set of each individual index, and the active single path generation step is performed several times as many as the number of lanes recognized in the current image.
8. The vehicle path restoration method through sequential image analysis of claim 3, wherein generating path data of the subject vehicle comprises: obtaining equation coefficients of straight lines and the equation coefficients of curve of a driving lane, which are both lanes of a subject vehicle; determining the equation coefficients of the straight line and the equation coefficients of curve for the path of the subject vehicle by using the equation coefficients of the straight line and the equation coefficients of the curve of the driving lane; determining whether the path is a curved line or a straight lane; generating a curved path when the path is determined as a curved lane, and generating a straight path when the path is determined as a straight lane; and generating a set of paths for the subject vehicle by integrating the curved path and the straight path.
9. The vehicle path restoration method through sequential image analysis of claim 8, further comprising: making the driving lane in a parallel state; determining a lane probability variable that is a probability of being a lane with respect to a right lane and a left lane; when the difference between the left lane and the right lane probability variable is less than or equal to a predetermined error tolerance, obtaining a median value of the slope and curvature of the two lane information, and calculating the equations of straight lines and curves for the path with respect to the left and right lanes; comparing the sizes of the left lane and right lane probability variables when the difference between the left and right lane probability variables is out of a predetermined error tolerance; if the right-lane probability variable is larger than the left-lane probability variable, calculate the equations of straight lines and curves for the left-lane path, such as the slope and curvature of the right-lane; and if the right-lane probability variable is smaller than the left-lane probability variable, calculate the equations of straight lines and curves for the right-lane path, such as the slope and curvature of the left-lane.
10. The vehicle path restoration method through sequential image analysis of claim 3, wherein when the location of the multi-lane is changed because the subject vehicle is changed to a lane, the information of the multi-lane is moved together.
11. The vehicle path restoration method through sequential image analysis of claim 3, wherein generating of the path data of the front vehicle includes: determining a vector b representing that the front vehicle moves from a specific position of an image of a previous frame to a specific position of an image of a current frame when the camera coordinates of the subject vehicle are taken as the origin; determining a motion vector h of the subject vehicle connecting the current coordinates and the coordinates of the previous frame in the path of the subject vehicle; and obtaining a motion vector v of the front vehicle that is a vector addition of the vector b and the vector h.
Description
BRIEF DESCRIPTION OF THE DRAWING FIGURES
[0024] The above object and advantages of the present invention will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings in which:
DETAILED DESCRIPTION OF THE EXEMPLARY EMBODIMENTS
[0044] Hereinafter, vehicle path restoration system through sequential image analysis and vehicle path restoration method using the same according to a preferred embodiment of the present invention will be described in detail with reference to the accompanying drawings.
[0046] Referring to
[0047] The image analysis unit 200 analyzes the sequential images of the video files obtained by the image capturing unit 100 to generate a multi-lane path and restore the path of the subject vehicle. In addition, the image analysis unit 200 restores the path of the front vehicle driving ahead.
[0048] The path data of the subject vehicle and the front vehicle are stored in the memory 300, indexed by the time frame of the sequential images.
[0049] Finally, the above data is expressed in the form of a top view on the display unit 400.
[0051] Referring to
[0052] Next, the image analysis unit 200 receives the sequential images as input and performs image analysis (S102). The image analysis unit 200 includes technologies such as lane recognition, vehicle recognition, and object recognition.
[0053] First, a lane of the current time for displaying the top view is determined (S1021).
[0054] A multi-lane path is generated for each index by using information on various objects on the road obtained through image analysis (S1022).
[0055] Path data of the subject vehicle are generated through image analysis (S1023).
[0056] Path data of the front vehicle are generated through image analysis (S1024).
[0057] Next, the information obtained in S1021-S1024 is expressed in the form of a top view (S103).
[0059] Referring to
[0060] Referring to
[0061] In addition, the vehicle picture located between the solid line and the dotted line shows the location of the subject vehicle 10 in which the camera is installed. The vehicle picture displayed on the upper end of the subject vehicle 10 shows the position of the front vehicle 20 (boundary box) detected in
[0062] The present invention analyzes the video in real time and aims to show as much information as possible at the moment of an accident. Therefore, the past multi-path 30 and the forward multi-lane 40 recognized at the present time are shown together.
[0063] (Multi-Lane and Multi-Path Structure)
[0064] Hereinafter, the structures of the multi-lane 40 and the multi-path 30 will be described.
[0065] The multi-lane 40 means the set of lanes recognizable in the front camera image, for example the 4 lanes in
[0067] Referring to
[0068] The circled numbers (⓪①②③) show that a number is assigned to each multi-lane 40.
[0069] The lanes ① and ② mean the left and right lanes recognized on the left and right sides of the subject vehicle 10, and these are called driving lanes.
[0070] Lanes ⓪ and ③ are lanes recognized on the left and right sides of the driving lanes and are called extended lanes.
[0071] In addition, four dotted lines ([0][1][2][3]) existing under the subject vehicle 10 mean the multi-path 30. Here, the past multi-path is calculated using the geometrical characteristics of the lane recognized at the present time and the speed of the subject vehicle.
[0072] For example, if the lane ① in the path [1] is a straight line, the predicted path can be calculated using the vehicle speed and the linear function. If the lane ① in the path [1] is a curve, the expected path can be calculated using the vehicle speed, the movement vector using the inscribed circle of the parabola, the correction of the entire path, etc. A detailed description of obtaining a movement path of a curved lane is disclosed in Korean Patent Registration No. 10-2296520 (corresponding U.S. patent application Ser. No. 17/159,150) of the present applicant.
[0075] Referring to
[0076] The orthogonal coordinate system displayed on the picture of the subject vehicle in
[0078] Referring to
[0079] In the present invention, a path set is formed for each multi-path index [0] to [11].
[0081] Referring to
[0082] The input information is input data for creating a path set. Specifically, the input data includes the speed of the subject vehicle, the path index number ([0] to [11] in
[0083] Next, it is determined whether the lane is curved or straight (S802). This is a conditional branch that selects the next step according to the value of the curve discriminant coefficient.
[0084] The curve discriminant coefficient is obtained as follows. For the skeleton coordinates constituting the lane, the equation of a straight line or curve is fitted using the least squares method. In this case, the shortest distance d.sub.S between each skeleton coordinate and the equation of the straight line can be obtained. This shortest distance can be obtained for all N skeleton coordinates, and the average value can be computed.
[0085] Applying the above, it can be expressed as in Equation 1, and μ.sub.S can be called the error rate of the straight line with respect to the lane.
[0086] Similarly, the shortest distance d.sub.c between each skeleton coordinate and the equation of the curve is obtained. This shortest distance can be obtained for all N skeleton coordinates, and the average value can be computed. Applying the above, it can be expressed as in Equation 2, and μ.sub.c can be called the error rate of the curve with respect to the lane.
[0087] If the error rate of the straight line is smaller than the error rate of the curve, the curve discriminant coefficient is set to 0, and if the error rate of the straight line is greater than the error rate of the curve, the curve discriminant coefficient is set to 1.
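By way of illustration, the error-rate comparison of paragraphs [0084] to [0087] can be sketched as follows. Function names are hypothetical, and the mean absolute residual in x is used as a simple stand-in for the averaged shortest distances of Equations 1 and 2:

```python
import numpy as np

def curve_discriminant(skeleton_xy):
    """Decide straight (0) vs curved (1) lane from skeleton coordinates.

    skeleton_xy: iterable of (x, y) lane skeleton points.  The lane
    equations follow the document's forms x = a + b*y (Equation 10)
    and x = c + d*y + e*y^2 (Equation 11).
    """
    pts = np.asarray(skeleton_xy, dtype=float)
    x, y = pts[:, 0], pts[:, 1]

    # Least-squares fits of x as a function of y (highest degree first).
    b, a = np.polyfit(y, x, 1)        # line:  x = a + b*y
    e, d, c = np.polyfit(y, x, 2)     # curve: x = c + d*y + e*y^2

    # Mean residual as a stand-in for the mean shortest distance:
    # mu_s / mu_c are the straight/curve error rates (Equations 1, 2).
    mu_s = np.mean(np.abs(x - (a + b * y)))
    mu_c = np.mean(np.abs(x - (c + d * y + e * y * y)))

    # Discriminant: 0 if the straight fit is at least as good, else 1.
    return 0 if mu_s <= mu_c else 1
```

A clearly parabolic skeleton yields a discriminant of 1, selecting curved-path generation (S803).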
[0088] Next, when it is determined as a curved lane, a curved path is generated (S803). A method of generating a curved path is disclosed in Korean Patent Registration No. 10-2296520 (corresponding U.S. patent application Ser. No. 17/159,150) of the present applicant.
[0089] In addition, when it is determined as a straight lane, a straight path is generated (S804). A method of generating a straight path will be described later.
[0090] Next, a path set of the corresponding index is generated by integrating the above-described curved path and straight path (S805).
[0092] Referring to
[0094] Referring to
L.sub.S=√{square root over ((X.sub.0−X.sub.E).sup.2+(Y.sub.0−Y.sub.E).sup.2)} [Equation 3]
X.sub.E=a+bY.sub.E [Equation 4]
[0095] L.sub.S in Equation 3 means the Euclidean length of the vector {right arrow over (P.sub.1P.sub.0)}. Here, since the unknown is (X.sub.E, Y.sub.E), by substituting Equation 4 into Equation 3, a quadratic equation (Equation 5) can be obtained.
(b.sup.2+1)Y.sub.E.sup.2−2{b(X.sub.0−a)+Y.sub.0}Y.sub.E+(X.sub.0−a).sup.2+Y.sub.0.sup.2−L.sub.S.sup.2=0 [Equation 5]
[0096] Among the two values of Y.sub.E obtained from Equation 5, the negative one is selected and substituted into Equation 4 to obtain X.sub.E. Through the above method, the path of a straight lane can be obtained.
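The quadratic of Equation 5 and the back-substitution into Equation 4 can be sketched as follows (function and variable names are hypothetical; the smaller root is taken as the negative Y.sub.E per paragraph [0096]):

```python
import math

def straight_path_point(x0, y0, a, b, ls):
    """Find (X_E, Y_E) on the line x = a + b*y at Euclidean distance ls
    from (x0, y0), taking the negative Y_E root."""
    # Quadratic in Y_E from substituting Equation 4 into Equation 3:
    # (b^2+1)*Y^2 - 2*{b*(x0-a) + y0}*Y + (x0-a)^2 + y0^2 - ls^2 = 0
    A = b * b + 1.0
    B = -2.0 * (b * (x0 - a) + y0)
    C = (x0 - a) ** 2 + y0 ** 2 - ls ** 2
    disc = B * B - 4.0 * A * C
    ye = (-B - math.sqrt(disc)) / (2.0 * A)   # smaller (negative) root
    xe = a + b * ye                            # Equation 4
    return xe, ye
```

For example, starting from a point on the line x = 1 + y and stepping a distance of √2 yields the point (0, −1), which again satisfies Equation 4.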
[0098] Referring to
[0099] What is important in the process of creating a multi-path is whether or not a lane is detected in the corresponding path. For example, in
[0100] The active path index is transferred to the active multi-path (S1102).
[0101] In the active multi-path, each active single path is generated (S1103, S1104). Here, each active single path generation (S1103, S1104) is the entire process of the method for creating the path set described in
[0102] When the index of a path not connected to a lane recognized in the current frame is determined as an inactive path index (S1105), it is transferred to the inactive multi-path (S1106). In the inactive single path generation (S1107, S1108), a new path is not added, but the location of the path stored in the past time is corrected according to the movement of the subject vehicle.
[0103] The remaining paths [0][1][2][3][8][9][10][11] other than the lane path [4][5][6][7] detected in
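A simplified sketch of the active/inactive update of paragraphs [0099] to [0103] follows. The data layout, function names, and the reduction of "active single path generation" to appending one new point are illustrative assumptions; stored points are shifted opposite to the subject vehicle's motion so past positions remain correct relative to the camera origin:

```python
def update_multipath(paths, active_indices, new_points, ego_motion):
    """paths: dict index -> list of (x, y) path points (world coords,
    camera at origin).  active_indices: indices whose lane is recognized
    in the current frame.  new_points: index -> newly generated path
    point for active paths.  ego_motion: (dx, dy) displacement of the
    subject vehicle since the previous frame."""
    dx, dy = ego_motion
    for idx, pts in paths.items():
        # All stored points move opposite to the subject vehicle's
        # motion (inactive single path generation, S1107/S1108).
        paths[idx] = [(x - dx, y - dy) for (x, y) in pts]
        if idx in active_indices:
            # Active path: a new point is appended (S1103/S1104).
            paths[idx].append(new_points[idx])
        # Inactive path: no new point is added.
    return paths
```

After one forward step of 2 m, an inactive path's stored point moves 2 m behind the vehicle while an active path additionally gains its new point.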
[0104] (How to Restore Multi-Lane to Parallel State)
[0105] A method is required to ensure that multiple lanes remain parallel.
[0106] If the result of lane recognition is used as it is for path generation, multi-lane paths cannot be maintained parallel to each other. For example, the slopes of the left and right lanes obtained by image recognition in a straight section cannot completely match. (See
[0107] To solve the above problem, lane information for a path is generated and managed separately from the lane information recognized from the image. Among the multi-lanes, the two lanes on either side of the subject vehicle, which serve as the reference, are called driving lanes.
[0108] Although the average value of the information of the two driving lanes may be used, in this case, incorrect lane information may be input and an error of the path may occur.
[0109] Hereinafter, a method for reducing a path error will be described.
[0110] In order to create a driving lane in a parallel state, a reference lane must be selected from the two lanes. The reference lane may mean the lane having the higher probability of being a true lane of the two.
[0111] The probability that the data detected in the image is a lane is defined as a lane probability variable.
[0113] Referring to
[0114] The lane probability variable of the present invention will be described.
[0115] The first probability is the ratio of the length of the detected skeleton line to the maximum skeleton line length. A method of making a skeleton line is disclosed in Korean Patent Registration No. 10-2296520 (corresponding U.S. patent application Ser. No. 17/159,150) of the present applicant.
[0116] The first probability expresses how far the lane remains visible in the image. This can be expressed as a mathematical expression as follows.
[0117] Here, N denotes the maximum length in which the lane skeleton line can exist in the image. n.sub.S means the length of the skeleton line of the lane detected in the current image.
[0118] The second probability relates to how close the y.sub.1.sup.d value of the lower coordinate c.sub.1.sup.d of the skeleton line L.sub.1 is to the subject vehicle. This means that the closer the skeleton line is to the subject vehicle, the higher the probability of obtaining the correct lane.
[0119] Equation 7 shows how to obtain the probability p.sub.2.
[0120] Here, y.sub.0.sup.d means the Y-coordinate of the lowermost end of the image.
[0121] As the y.sub.1.sup.d value of the dotted line gets closer to the bottom of the image, the value of p.sub.2 approaches 1, and as the y.sub.1.sup.d value moves away from the bottom of the image, the value of p.sub.2 approaches 0. w is a weight value for adjusting the probability value. For example, when y.sub.1.sup.d lies at about 30 m, if you want probability p.sub.2 to be 0.5, you can set w to 1/30.
[0122] The lane probability variable P for one lane means the joint probability of the probabilities p.sub.1 and p.sub.2.
[0123] Since the probabilities p.sub.1 and p.sub.2 are independent events, they can be expressed by Equation 8.
P=p.sub.1×p.sub.2 [Equation 8]
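The joint probability of Equation 8 can be sketched as follows. Since the exact form of Equation 7 does not survive in this text, the form p.sub.2 = 1/(1 + w·y) is assumed; it matches paragraph [0121] in that it approaches 1 near the bottom of the image, approaches 0 far away, and equals 0.5 at y = 1/w (30 m for w = 1/30). Function and parameter names are hypothetical:

```python
def lane_probability(n_s, N, y1, w=1.0 / 30.0):
    """Joint lane probability P = p1 * p2 (Equation 8).

    p1 = n_s / N        : detected skeleton length over the maximum length.
    p2 = 1 / (1 + w*y1) : ASSUMED form of Equation 7 (see lead-in); y1 is
                          the distance of the skeleton's lowest point.
    """
    p1 = n_s / N
    p2 = 1.0 / (1.0 + w * y1)
    return p1 * p2
```

With a fully visible skeleton (n_s = N) whose lowest point lies at 30 m, the assumed p.sub.2 gives P = 0.5.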
[0124] Since the driving lane consists of left and right lanes, the left lane probability variable is P.sub.L and the right lane probability variable is P.sub.R.
[0125] The relationship between the two lane probability variables (P.sub.L, P.sub.R) can be defined in three ways.
[0126] The first relationship is when two lane probability variables have similar values. It can be expressed as a conditional expression as in Equation 9.
|P.sub.L−P.sub.R|<θ [Equation 9]
[0127] In Equation 9, θ is an error tolerance for the difference between the lane probability variables of the left and right lanes. For example, if θ is 0.2 and the difference between the lane probability variables of the left lane and the right lane is 0.2 or less, it may be determined that they are similar.
[0128] The second relation is the case where P.sub.R is greater than P.sub.L, and the third relation is the case where P.sub.R is less than P.sub.L.
[0130] Referring to
[0131] If Equation 9 is satisfied, a median value of the slope and curvature of the two lane information is obtained, and the equations of the straight line and the curve for the path are calculated for the left and right lanes (S1302).
[0132] If Equation 9 is not satisfied, the values of P.sub.L and P.sub.R are compared (S1303).
[0133] When P.sub.R is greater than P.sub.L, the equations of the straight line and curve for the path of the left lane are calculated using the slope and curvature of the right lane (S1304).
[0134] When P.sub.R is less than P.sub.L, the equations of the straight line and curve for the path of the right lane are calculated using the slope and curvature of the left lane (S1305).
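The three-way branch S1301 to S1305 can be sketched as follows (function names and the dictionary layout are hypothetical; b, d, e are the slope/curvature coefficients of Equations 10 and 11):

```python
def path_lane_coefficients(P_L, P_R, left, right, theta=0.2):
    """Select the slope/curvature fixed for path generation from the two
    driving lanes.  left/right: dicts with the recognized coefficients
    b (line slope), d (curve slope), e (curvature).  P_L/P_R are the
    lane probability variables."""
    if abs(P_L - P_R) < theta:
        # Equation 9 satisfied: take the median (here mid-point) of the
        # two lanes' slope and curvature (S1302).
        return {k: (left[k] + right[k]) / 2.0 for k in ("b", "d", "e")}
    # Otherwise the higher-probability lane becomes the reference lane
    # and its slope/curvature are fixed (S1304 / S1305).
    ref = right if P_R > P_L else left
    return {k: ref[k] for k in ("b", "d", "e")}
```

When the two probabilities are close, the mid-point coefficients are returned; otherwise the reference lane's coefficients are used unchanged.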
[0135] The form of the equation of the straight line (Equation 10) and the equation of the curve (Equation 11) used in the present invention is as follows.
x=a+by [Equation 10]
x=c+dy+ey.sup.2 [Equation 11]
[0136] Coefficients a, b, c, d, and e in Equations 10 and 11 denote the results of image recognition, where a and b are coefficients of a straight line, and c, d, and e are coefficients of a parabola.
[0137] a.sup.v, b.sup.v, c.sup.v, d.sup.v, and e.sup.v, which will be described later, are defined as the equation coefficients of a straight line and a curve for a path.
[0138] In the case of step S1302 in
[0139] In the case of a straight line, the intermediate value of the slope b of the straight lines of the two lanes is set as the slope of the reference lane. This slope is set to b.sup.v, and the coefficient a.sup.v for a straight line passing through the skeleton line coordinates of the lane is obtained using the least squares method. At this time, the equations of the straight lines of the left and right lanes can be obtained, and these are used for path generation. The coefficient of the left path straight line is denoted a.sub.L.sup.v, and the coefficient of the right path straight line is denoted a.sub.R.sup.v.
[0140] In the case of the curve, the median value of the slope d of the curve of the two lanes is set as the slope d.sup.v of the reference lane. Also, an intermediate value of the curvature e of the curves of the two lanes is set as the curvature e.sup.v of the reference lane. The slope d.sup.v and the curvature e.sup.v are fixed as constant values, and the curve coefficient for a path passing through the skeleton coordinates of the lane is calculated using the least squares method. At this time, the equation of the curve of the left and right lanes can be obtained, and this is used for path generation. The coefficient of the left path curve is c.sub.L.sup.v, and the coefficient of the right path curve is denoted as c.sub.R.sup.v.
[0141] Steps S1304 and S1305 in
[0142] In the case of step S1304, the lane probability variable of the right lane is larger, and the coefficients b, d, and e of the right lane are fixed as the constants b.sup.v, d.sup.v, e.sup.v with the right lane as the reference lane. Using the least squares method, the straight line and curve of the left lane for the path are set by obtaining the lane coefficients a.sub.L.sup.v, c.sub.L.sup.v for the path passing through the skeleton line coordinates of the left lane.
[0143] In the case of step S1305, the lane probability variable of the left lane is larger, and the coefficients b, d, and e of the left lane are fixed as the constants b.sup.v, d.sup.v, e.sup.v with the left lane as the reference lane. Using the least squares method, the straight line and curve of the right lane for the path are set by obtaining the lane coefficients a.sub.R.sup.v, c.sub.R.sup.v for the path passing through the skeleton line coordinates of the right lane.
[0144] The lane information for the path of the extended lanes uses the lane information of the driving lanes generated above. By inputting the coefficients b.sub.L.sup.v, d.sub.L.sup.v, e.sub.L.sup.v of the left driving lane and the coordinates of the skeleton line, the coefficients a.sub.L.sup.x, c.sub.L.sup.x of the left extended lane may be calculated using the least squares method. The remaining coefficients b.sub.L.sup.x, d.sub.L.sup.x, e.sub.L.sup.x of the left extended lane are equal to the coefficients b.sub.L.sup.v, d.sub.L.sup.v, e.sub.L.sup.v of the left driving lane. By inputting the coefficients b.sub.R.sup.v, d.sub.R.sup.v, e.sub.R.sup.v of the right driving lane and the coordinates of the skeleton line, the coefficients a.sub.R.sup.x, c.sub.R.sup.x of the right extended lane can be calculated using the least squares method. The remaining coefficients b.sub.R.sup.x, d.sub.R.sup.x, e.sub.R.sup.x of the right extended lane are equal to the coefficients b.sub.R.sup.v, d.sub.R.sup.v, e.sub.R.sup.v of the right driving lane.
[0145] (Interlocking Method of Multi-Lane and Multi-Path when Changing Lanes)
[0146] Hereinafter, when a lane change occurs, a method of interlocking multi-lanes and multi-paths will be described.
[0148] Referring to
[0149] Referring to
[0150] Referring to
[0151] Referring to
[0152] A method of detecting a lane change is as follows.
[0153] For example, in
[0154] Summarizing the above in a formula, it is as follows:
(X.sub.k−1.sup.L<0)∩(X.sub.k.sup.L>0) [Equation 12]
[0155] When the condition of Equation 12 is satisfied, it may be determined that the lane has changed to the left. Here, X.sub.k−1.sup.L is the X value of the left lane of the previous frame, and X.sub.k.sup.L is the X value of the left lane of the current frame.
[0156] In the world coordinate system, (X, Y)=(0, 0) is the position of the camera installed on the subject vehicle. In other words, if the left lane is located to the left of this reference value in the previous frame and to its right in the current frame, it may be determined that the lane is changed to the left.
[0157] The following Equation 13 is a discriminant for occurrence of a lane change in the subject vehicle to the right.
(X.sub.k−1.sup.R>0)∩(X.sub.k.sup.R<0) [Equation 13]
[0158] Here, X.sub.k−1.sup.R is the X value of the right lane of the previous frame, and X.sub.k.sup.R is the X value of the right lane of the current frame.
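The lane-change discriminants can be sketched as follows, under the sign convention of paragraph [0156] (camera at the origin, points left of the vehicle having negative X; the right-lane condition mirrors the left-lane one). Function names are hypothetical:

```python
def detect_lane_change(x_prev_L, x_cur_L, x_prev_R, x_cur_R):
    """Detect a lane change of the subject vehicle from the world-frame
    X values of the left/right driving lanes in the previous and current
    frames.  Returns 'left', 'right', or None."""
    if x_prev_L < 0 and x_cur_L > 0:   # left lane crossed left-to-right:
        return "left"                  # the vehicle moved left
    if x_prev_R > 0 and x_cur_R < 0:   # right lane crossed right-to-left:
        return "right"                 # the vehicle moved right
    return None
```

For example, a left lane jumping from X = −1.5 m to X = +0.3 m between frames signals a lane change to the left.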
[0159] When the lane change of the subject vehicle occurs, the location of the multi-lane changes with respect to the path as shown in
[0161] Referring to
[0162] When explaining the meaning of each row in
[0163] Referring to
[0164] When explaining the meaning of each row in
[0165] (Generate Path of Subject Vehicle)
[0166] Hereinafter, a method for generating a path of a subject vehicle will be described. Since the path of the subject vehicle can be estimated using the path of the driving lane, it can be obtained using the method of creating one path of
[0167] The average value of the coefficients of the equation of the driving lane is used as input information in step S801 of
[0168] Equation 14 shows a method of calculating path input information of a subject vehicle.
[0169] In Equation 14, a.sub.S.sup.v and b.sub.S.sup.v denote the coefficients of the equation of a straight line (Equation 10), and c.sub.S.sup.v, d.sub.S.sup.v, and e.sub.S.sup.v denote the coefficients of the equation of a curve (Equation 11). b.sub.S.sup.v, d.sub.S.sup.v, and e.sub.S.sup.v denote the slope and curvature of the path of the subject vehicle. The average value of the slope and curvature of the driving lanes may be taken as the coefficients of the equation of the path of the subject vehicle. a.sub.S.sup.v and c.sub.S.sup.v mean the x-axis position of the subject vehicle, and since the coordinate system of the present invention is the world coordinate system with the position of the camera of the subject vehicle as the origin, a.sub.S.sup.v and c.sub.S.sup.v have a value of 0.
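Equation 14 itself does not survive in this text; from the description in paragraph [0169], a consistent reconstruction in terms of the left/right driving-lane path coefficients is:

```latex
a_S^v = 0,\quad
b_S^v = \tfrac{1}{2}\bigl(b_L^v + b_R^v\bigr),\quad
c_S^v = 0,\quad
d_S^v = \tfrac{1}{2}\bigl(d_L^v + d_R^v\bigr),\quad
e_S^v = \tfrac{1}{2}\bigl(e_L^v + e_R^v\bigr)
\tag{14}
```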
[0170] If the subject vehicle information obtained in Equation 14 is input in step S801 of
[0172] Referring to
[0173] (How to Create a Path for the Front Vehicle)
[0174] Hereinafter, a method for generating a path for the front vehicle will be described.
[0175] A front vehicle seen by the front camera of the subject vehicle can be recognized using a deep learning method, for example one implemented with TensorFlow. World coordinate values can be obtained from the position information of the front vehicle, but these are coordinate values with the subject vehicle as the origin of the coordinate axes.
[0177] Referring to
[0178] In
[0179] Referring to
[0180] In the case of
[0181] Referring to
[0183] Referring to
[0184] In the embodiment, the subject vehicle and the front vehicle are moving at a constant speed of about 70 km/h, and since the vector b has a relatively small magnitude, the vector component of the subject vehicle dominates the vector calculation of the front vehicle. Additionally, the motion vector v denotes the instantaneous velocity of the front vehicle.
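The vector addition of claim 11 and paragraph [0183], v = b + h, can be sketched as follows (the function name is hypothetical; b is the front vehicle's displacement measured in the subject vehicle's camera frame, h the subject vehicle's own displacement along its restored path):

```python
def front_vehicle_motion(b, h):
    """Motion vector v of the front vehicle in the world frame, as the
    vector sum of b (displacement observed relative to the camera) and
    h (the subject vehicle's own frame-to-frame displacement)."""
    return (b[0] + h[0], b[1] + h[1])
```

For example, a small relative displacement b = (1, 2) combined with the subject vehicle's motion h = (0, 19) gives v = (1, 21).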
[0185] The vector v of the front vehicle described above is for the current frame. The vectors v.sub.k obtained in the past time are stored in the memory in the form of a set.
[0186] Hereinafter, a method of concatenating vector sets into a path set will be described.
[0188] Referring to
{(x.sub.0,y.sub.0),(x.sub.1,y.sub.1), . . . ,(x.sub.n,y.sub.n)}.
[0189] The vector component of v.sub.1 is (a.sub.1, b.sub.1) and the vector component of v.sub.n is (a.sub.n, b.sub.n).
[0190] In
x.sub.2=x.sub.1−a.sub.2
y.sub.2=y.sub.1−b.sub.2 [Equation 15]
[0191] If Equation 15 is expanded into a general expression that can be calculated for all path coordinates, Equation 16 is obtained.
x.sub.k+1=x.sub.k−a.sub.k+1
y.sub.k+1=y.sub.k−b.sub.k+1
1≤k≤(n−1) [Equation 16]
[0192] In Equation 16, the range of k is limited to 1 to n−1. n means the number of total vectors. Here, the starting point (x.sub.1, y.sub.1) and the ending point (x.sub.0, y.sub.0) of the vector v.sub.1 should be input as initial values. The starting point (x.sub.1, y.sub.1) and the ending point (x.sub.0, y.sub.0) mean v.sub.S and v.sub.E in
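The recurrence of Equations 15 and 16 can be sketched as follows (function names are hypothetical; the initial values are the start and end points of the first vector v.sub.1 as in paragraph [0192]):

```python
def restore_front_path(x0, y0, x1, y1, vectors):
    """Rebuild the front vehicle's path coordinates from its stored
    motion vectors (Equations 15 and 16).

    (x0, y0), (x1, y1): end and start points of the first vector v_1.
    vectors: [(a_2, b_2), ..., (a_n, b_n)], components of v_2..v_n.
    Returns [(x0, y0), (x1, y1), ..., (x_n, y_n)]."""
    coords = [(x0, y0), (x1, y1)]
    xk, yk = x1, y1
    for a, b in vectors:
        xk, yk = xk - a, yk - b   # x_{k+1} = x_k - a_{k+1}
        coords.append((xk, yk))   # y_{k+1} = y_k - b_{k+1}
    return coords
```

Starting from (x0, y0) = (0, 0) and (x1, y1) = (0, −2), the vectors (0, 2) and (1, 2) yield the past positions (0, −4) and (−1, −6).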
[0193] All path coordinates of the front vehicle can be obtained using the above method, and red arrows in
While the present invention has been particularly shown and described with reference to preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims.