ORIENTATION CONTROL METHOD FOR DRONE
20180067503 · 2018-03-08
Inventors
CPC classification
B64U2101/30
PERFORMING OPERATIONS; TRANSPORTING
G01S5/06
PHYSICS
G06F3/0338
PHYSICS
G06F3/0346
PHYSICS
International classification
G05D1/00
PHYSICS
Abstract
An orientation control method for a drone (3) is disclosed. The method receives a wireless signal sent from a target device (5) by at least three receivers (33) arranged on the drone (3), and detects the phase of the wireless signal respectively received by each of the three receivers (33) with a phase detector (34) of the drone. Next, the method calculates a current relative orientation of the drone (3) with respect to the target device (5) through the phase differences among the three phases of the three receivers (33). Next, the method controls the drone (3) to move and orientate toward a user-designated direction according to the calculated current relative orientation and a directional signal sent wirelessly from a remote control (4).
Claims
1. An orientation control method applied to a drone (3) having at least three receivers (33), the method comprising: a) using the at least three receivers (33) to receive a set of wireless signals sent from a target device (5) respectively; b) using a phase detector (34) to detect the phases of the wireless signals received by the at least three receivers (33); c) calculating a relative orientation of the drone (3) with respect to the target device (5) based on phase differences between the wireless signals received by the at least three receivers (33); d) receiving a directional signal sent from a remote control (4); and e) controlling a movement of the drone (3) based on the relative orientation and the directional signal.
2. The method in claim 1, wherein the at least three receivers (33) are radio wave receivers (33) and the wireless signals are radio wave signals.
3. The method in claim 2, wherein the at least three receivers (33) comprise a first receiver (331), a second receiver (332) and a third receiver (333), and a wavelength of the wireless signals is equal to or larger than two times a distance between the first receiver (331) and the second receiver (332).
4. The method in claim 3, wherein the three receivers (33) are arranged at vertices of an isosceles triangle or vertices of an equilateral triangle.
5. The method in claim 3, further comprising: f) determining whether the wireless signals are no longer received; and g) repeatedly executing step a) to step f) until the wireless signals are no longer received.
6. The method in claim 3, wherein the step c) calculates the relative orientation based on a first formula: cos θ = λ(φ₁ − φ₂)/(2πD), wherein θ is an included angle between a plane connecting the first receiver (331) and the second receiver (332) and the target device (5), φ₁ and φ₂ are the phases of the wireless signals received by the first receiver (331) and the second receiver (332) respectively, D is the distance between the first receiver (331) and the second receiver (332), and λ is the wavelength of the wireless signal.
7. The method in claim 6, wherein step c) calculates the relative orientation based on a second formula: cos α = λ(φ₃ − (φ₁ + φ₂)/2)/(2πL), wherein α is an included angle between a plane connecting a midpoint (334) of the first receiver (331) and the second receiver (332) and the third receiver (333) and the target device (5), φ₃ is the phase of the wireless signal received by the third receiver (333), and L is the distance between the midpoint (334) and the third receiver (333).
8. The method in claim 7, further comprising following steps: h) calculating a horizontal azimuth angle (A) and a vertical elevation angle (B) of the target device (5) with respect to the drone (3) based on the relative orientation calculated; i) determining a moving direction and a destination coordinate of the drone (3) based on the horizontal azimuth angle (A), the vertical elevation angle (B) and the directional signal; j) moving the drone (3) along the moving direction to reach the destination coordinate.
9. The method in claim 8, wherein step h) calculates the horizontal azimuth angle (A) based on a third formula: A = tan⁻¹(2L(φ₁ − φ₂)/(D(2φ₃ − φ₁ − φ₂))).
10. The method in claim 9, wherein the step h) calculates the vertical elevation angle (B) based on a fourth formula: B = cos⁻¹ √((λ(φ₁ − φ₂)/(2πD))² + (λ(2φ₃ − φ₁ − φ₂)/(4πL))²).
11. The method in claim 8, wherein the remote control (4) comprises a geomagnetic meter (44) for sensing an azimuth angle, a gyroscope (45) for sensing an elevation angle and an acceleration meter (46) for sensing an acceleration, wherein the remote control (4) is configured to generate a directional azimuth angle based on the azimuth angle and to generate a directional elevation angle based on the elevation angle and/or a moving acceleration of the remote control (4), the remote control (4) is configured to generate the directional signal based on the directional azimuth angle and the directional elevation angle; wherein in the step i) the remote control (4) is configured to determine the moving direction and the destination coordinate based on an azimuth angle difference between the directional azimuth angle and the horizontal azimuth angle (A) and an elevation angle difference between the directional elevation angle and the vertical elevation angle (B).
12. The method in claim 11, wherein the remote control (4) further comprises a human-machine interface (42) to receive a user input and modify the directional signal based on the user input to rotate the drone (3) toward a designated angle.
Description
BRIEF DESCRIPTION OF DRAWING
[0013] One or more embodiments of the present disclosure are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements. These drawings are not necessarily drawn to scale.
DETAILED DESCRIPTION OF THE INVENTION
[0026] The drone 3 comprises a processor 31, a driving unit 32, a plurality of receivers (or transceivers) 33, a phase detector 34, a memory unit 35 and a camera 36. The processor 31 electrically connects with the driving unit 32, the receivers 33, the phase detector 34, the memory unit 35 and the camera 36. The driving unit 32 is used to control the movement (horizontal or vertical movement) and the rotation of the drone 3. The memory unit 35 is used to store data and the camera 36 is used to capture images. The processor 31 controls the drone 3 by controlling the above units.
[0027] The receivers 33 respectively receive wireless signals sent from the remote control 4 and the target device 5. The phase detector 34 is used to detect the phase differences between the wireless signals received by the receivers 33. In this application, the drone 3 first obtains, through the phase detector 34 and the processor 31, the phase differences between the wireless signals received by the receivers 33, and then determines the orientation of the drone 3 with respect to the target device 5 (generally worn by the user or by the object to be photographed by the camera 36). The drone 3 can then be orientated to facilitate the user's control of the drone 3 through the remote control 4.
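The phase-difference principle of paragraph [0027] can be illustrated with a short simulation. The following Python sketch is illustrative only; the function names, the receiver layout and the numeric values are assumptions, not part of the patent. It models the phase a plane wave from the target would show at each receiver position, and shows that a target straight ahead of the front-face triangle yields zero phase differences.

```python
import math

def received_phase(rx_pos, target_dir, wavelength):
    """Phase (radians) of a plane wave arriving from unit direction
    `target_dir`, observed at receiver position `rx_pos` (metres):
    the phase advances by 2*pi per wavelength of path difference."""
    path = sum(p * d for p, d in zip(rx_pos, target_dir))
    return 2 * math.pi * path / wavelength

# Assumed layout (not from the patent): receivers 331/332 on a horizontal
# baseline of length D on the drone's front face, receiver 333 a distance
# L above their midpoint, so the triangle faces forward.
D, L = 0.10, 0.08
wavelength = 2 * D                      # the patent's preferred lambda = 2D
rx1, rx2, rx3 = (-D/2, 0.0, 0.0), (D/2, 0.0, 0.0), (0.0, 0.0, L)

target = (0.0, 1.0, 0.0)                # target straight ahead of the drone
p1, p2, p3 = (received_phase(r, target, wavelength) for r in (rx1, rx2, rx3))
midpoint_phase = (p1 + p2) / 2
print(p1 - p2, p3 - midpoint_phase)     # 0.0 0.0 -> facing the target
```

Tilting the `target` direction off-axis makes one or both differences non-zero, which is the quantity the phase detector 34 hands to the processor 31.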
[0028] The remote control 4 comprises a processing module 41, a human-machine interface 42, a transceiver module (or receiver module) 43, a geomagnetic meter 44, a gyroscope 45 and an acceleration meter 46, where the processing module 41 electrically connects with the human-machine interface 42, the transceiver module 43, the geomagnetic meter 44, the gyroscope 45 and the acceleration meter 46. In this application, the user holds the remote control 4 to perform the orientation operation directly. In an embodiment, after receiving an orientation operation from the user, the remote control 4 may obtain its azimuth angle (for example, north-east or south-west) by sensing the geomagnetic change through the geomagnetic meter 44, obtain its elevation angle through the gyroscope 45 (for example, a three-axis gyroscope) and obtain its moving acceleration through the acceleration meter 46 (for example, a three-axis acceleration meter).
[0029] In this embodiment, the processing module 41 of the remote control 4 performs calculation to generate a directional azimuth angle based on the azimuth angle and to generate a directional elevation angle based on the elevation angle and/or the moving acceleration. The processing module 41 generates a directional signal based on the directional azimuth angle and the directional elevation angle. The transceiver module 43 sends the directional signal outward.
[0030] The human-machine interface 42 (such as a knob, a button, a joystick or a combination thereof) receives operations of the user and modifies the directional signal based on the contents of the received operations (such as rotating the drone 3 clockwise by 20 degrees or approaching the user by 10 centimeters, and so on), thus controlling the drone 3 to move or rotate by the corresponding amount. In another embodiment, the human-machine interface 42 may also include a signal indicator such as a display screen, a loudspeaker or an indication lamp to feed back information of the drone 3 to the user.
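As an illustration of how a user operation from the human-machine interface 42 might patch the directional signal, here is a hypothetical Python sketch. The operation encoding, the dictionary fields and the function name are assumptions for illustration, not the patent's actual signal format.

```python
def apply_user_operation(signal, operation):
    """Patch a directional signal (plain dict, assumed format) with a
    user operation, e.g. the 20-degree clockwise rotation or the
    10-centimeter approach mentioned in paragraph [0030]."""
    kind, amount = operation
    if kind == "rotate_cw":
        # Rotate clockwise: advance the commanded azimuth, wrapping at 360.
        signal["azimuth"] = (signal["azimuth"] + amount) % 360.0
    elif kind == "approach":
        # Approach the user: shorten the commanded range by `amount` metres.
        signal["range_offset"] = signal.get("range_offset", 0.0) - amount
    return signal

sig = apply_user_operation({"azimuth": 350.0}, ("rotate_cw", 20.0))
print(sig["azimuth"])   # 10.0
```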
[0031] The processing module 41 generates the directional signal based on the sensed information from the geomagnetic meter 44, the gyroscope 45 and the acceleration meter 46 and the operation information from the human-machine interface 42. Moreover, the transceiver module 43 sends the directional signal outward. The processing module 41 further controls the transceiver module 43 to receive related information of the drone 3 sent from external devices and displays/indicates the related information through the human-machine interface 42.
[0032] Preferably, the casing of the remote control 4 is shaped for single-handed holding and operation by the user. The user can smoothly operate the remote control 4 to point to the desired orientation and more intuitively control the drone 3 to move to the desired direction/orientation or position.
[0033] The target device 5 mainly includes a processing unit 51 and a transceiver unit 52 electrically connected with the processing unit 51.
[0034] Preferably, the target device 5 is worn by a user, where the user is the user holding the remote control 4 or an object to be photographed by the drone 3 (for example, when the drone 3 conducts a selfie mode to take a photo of the user). The target device 5 uses the processing unit 51 to calculate its position information (namely, the position information of the user) and generate a wireless signal based on the position information. The wireless signal is sent by the transceiver unit 52. The drone 3 identifies the position of the target device 5 by the received wireless signal and then performs positioning (orientating), movement and rotation with the target device 5 as axial origin.
[0035] More particularly, the receivers 33, the transceiver module 43 and the transceiver unit 52 are wireless radio wave receivers (transceivers), and the directional signal and the wireless signal are wireless radio wave signals, where the wireless signal sent by the target device 5 has a wavelength of λ.
[0036] In the present invention, the receivers 33 include at least a first receiver 331, a second receiver 332 and a third receiver 333. The drone 3 may directly calculate the orientation of the drone 3 with respect to the target device 5 based on phase differences of the wireless signals received by the three receivers 33. Therefore, the drone 3 and the target device 5 can dispense with GPS devices and need not receive external positioning signals via Bluetooth or other wireless transmission schemes. The cost of the drone system 2 is reduced and the positioning (orientating) accuracy of the drone 3 is enhanced.
[0038] In a preferred embodiment, the receivers 33 are arranged at vertices of an isosceles triangle (namely, the distance between the first receiver 331 and the third receiver 333 is equal to the distance between the second receiver 332 and the third receiver 333) or arranged at vertices of an equilateral triangle, but the scope of the present invention is not limited to the above examples. In the present invention, the wavelength of the wireless signal is equal to or slightly larger than two times the distance D between the first receiver 331 and the second receiver 332, namely, λ ≥ 2D.
[0039] As mentioned above, the present invention performs positioning (orientating) by using the phase differences of the received wireless signals. When the phase difference of the wireless signals received by the first receiver 331 and the second receiver 332 is zero, and when the phase difference between the wireless signal corresponding to the midpoint and the wireless signal received by the third receiver 333 is also zero, the drone 3 orientates toward (faces) the target device 5 directly. The camera 36 arranged on the front face of the drone 3 also orientates toward (faces) the target device 5 directly (namely, faces the user directly) when the drone 3 orientates toward the target device 5 directly. Therefore, the user can control the drone 3 for selfies.
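The facing condition of paragraph [0039] reduces to a simple predicate on the three measured phases. A minimal Python sketch (the function name and tolerance are assumptions):

```python
def faces_target(phi1, phi2, phi3, tol=1e-3):
    """True when both phase differences used by the method vanish:
    receivers 331 and 332 agree (phi1 == phi2), and the virtual
    midpoint phase (phi1 + phi2)/2 agrees with receiver 333's phase."""
    midpoint_phase = (phi1 + phi2) / 2
    return abs(phi1 - phi2) < tol and abs(midpoint_phase - phi3) < tol

print(faces_target(1.0, 1.0, 1.0))   # True: drone faces the target directly
print(faces_target(1.0, 1.2, 1.0))   # False: a corrective rotation is needed
```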
[0040] The method of the present invention will be described in detail below. It should be noted that the method described in the various embodiments of the present invention can be implemented by the drone system 2 shown in the accompanying drawings.
[0041] First, in step S10, the at least three receivers 33 respectively receive the wireless signals sent from the target device 5. Next, in step S12, the phase detector 34 detects the phases of the wireless signals received by the receivers 33.
[0042] Afterward, the processor 31 calculates a relative orientation of the drone 3 with respect to the target device 5 in step S14. In this embodiment, the relative orientation comprises an included angle between a plane connecting the first receiver 331 and the second receiver 332 and the target device 5, and an included angle between a plane connecting the midpoint 334 and the third receiver 333 and the target device 5.
[0043] During reception of the wireless signals, the drone 3 uses one of or all of the three receivers 33 to receive the directional signal from the remote control 4. The processor 31 of the drone 3 controls the movement of the drone 3 based on the relative orientation and the directional signal.
[0044] More particularly, the processor 31 calculates a relative orientation of the drone 3 with respect to the target device 5 based on the above-mentioned phase differences, and automatically controls the movement (or rotation) of the drone 3 based on the calculated relative orientation such that the drone 3 orientates toward (faces) the target device 5 directly. If the user operates the remote control 4 to send the directional signal and the directional signal includes a signal controlling the drone 3 to rotate, the processor 31 fine-tunes the direction of the drone 3 based on the directional signal. In one embodiment, the drone 3 mainly provides a selfie function. Therefore, the above-mentioned movement or rotation renders the drone 3 to move or rotate to orientate toward (face) the target device 5 directly, or renders the camera 36 to orientate toward (face) the target device 5 directly.
[0045] After the step S16, the processor 31 determines whether the three receivers 33 stop receiving the wireless signals (step S18), namely, whether the target device 5 is turned off or the communication between the drone 3 and the target device 5/the remote control 4 is terminated. If the three receivers 33 stop receiving the wireless signals, the processor 31 ends the orientation control method. If the three receivers 33 continually receive the wireless signals, the processor 31 repeatedly executes step S10 to step S16 to continually orientate the drone 3 and control the movement of the drone 3.
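The loop of steps S10–S18 can be sketched as a small control skeleton. This Python sketch is an assumption-laden illustration: the function parameters stand in for the hardware-facing operations (receivers, phase detector 34, driving unit 32) and are not part of the patent.

```python
def orientation_control_loop(receive, detect_phases, compute_orientation,
                             get_directional_signal, move):
    """Skeleton of steps S10-S18; every argument is a placeholder
    callable for a hardware operation (assumed names)."""
    while True:
        signals = receive()                          # S10: receivers listen
        if signals is None:                          # S18: signals stopped
            break                                    # -> end the method
        phases = detect_phases(signals)              # S12: phase detector 34
        orientation = compute_orientation(phases)    # S14: relative orientation
        move(orientation, get_directional_signal())  # S16: drive the drone

# Minimal dry run with stubbed hardware: two rounds of signals, then loss.
queue = [("s", "s", "s"), ("s", "s", "s"), None]
moves = []
orientation_control_loop(
    receive=lambda: queue.pop(0),
    detect_phases=lambda s: (0.0, 0.0, 0.0),
    compute_orientation=lambda p: "facing",
    get_directional_signal=lambda: "hold",
    move=lambda o, d: moves.append((o, d)),
)
print(len(moves))   # 2
```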
[0047] Based on the inventors' experimental results, the phase difference between the two wireless signals can be optimally identified if the wavelength of the wireless signal is equal to or slightly larger than two times the distance D between the first receiver 331 and the second receiver 332. In this invention, the wavelength of the wireless signal is preset to be two times the distance D. It should be noted that the directional signal may have an arbitrary wavelength and need not be equal to or slightly larger than two times the distance D, because the drone 3 is not orientated with reference to the directional signal sent by the remote control 4.
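The λ ≥ 2D preference of paragraph [0047] has a standard interferometry rationale: with λ = 2D the geometric phase difference never exceeds π in magnitude, so the detector's wrapped reading is unambiguous, whereas a shorter wavelength lets distinct angles fold onto the same measurement. The following Python sketch (names and values assumed for illustration) demonstrates this:

```python
import math

def phase_difference(theta_deg, D, wavelength):
    """Wrapped phase difference between two receivers spaced D apart,
    for a source at included angle theta. Real detectors report phase
    modulo 2*pi; atan2 folds the raw value into (-pi, pi]."""
    raw = 2 * math.pi * D * math.cos(math.radians(theta_deg)) / wavelength
    return math.atan2(math.sin(raw), math.cos(raw))

D = 0.10
angles = range(0, 181, 15)   # 13 sample angles across the half-circle

# With wavelength = 2D the raw difference stays within [-pi, pi]:
# no wrapping, every sampled angle maps to a distinct measurement.
unambiguous = {round(phase_difference(t, D, 2 * D), 6) for t in angles}
# With wavelength = D/2 the difference wraps, and distinct angles
# collide on the same measured value (e.g. 0, 60 and 90 degrees).
ambiguous = {round(phase_difference(t, D, D / 2), 6) for t in angles}
print(len(unambiguous), len(ambiguous))
```

The first set keeps all 13 angles distinct; the second collapses several of them, which is why orientation from a single wrapped reading would be ambiguous there.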
[0048] Accordingly, the processor 31 may apply the first formula below to calculate the relative orientation of the drone 3 with respect to the target device 5 once the processor 31 calculates the phase difference between the wireless signals received by the first receiver 331 and the second receiver 332:

cos θ = λ(φ₁ − φ₂)/(2πD)  (first formula)
[0049] In the above first formula, θ is the included angle between the plane connecting the first receiver 331 and the second receiver 332 and the target device 5, φ₁ is the phase of the wireless signal received by the first receiver 331, φ₂ is the phase of the wireless signal received by the second receiver 332, D is the distance between the first receiver 331 and the second receiver 332, and λ is the wavelength of the wireless signal.
[0050] In the present invention, the processor 31 may control the drone 3 to rotate based on the above calculation result such that the phase difference between the phase φ₁ of the wireless signal received by the first receiver 331 and the phase φ₂ of the wireless signal received by the second receiver 332 is zero. The plane connecting the first receiver 331 and the second receiver 332 then faces the target device 5 directly, namely, the drone 3 orientates toward the target device 5 in this dimension.
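The phase-to-angle inversion described here can be checked numerically. A minimal Python sketch of the relation reconstructed above (a path difference of D·cos θ produces a phase difference of 2πD·cos θ/λ); function name and numbers are assumptions:

```python
import math

def included_angle_theta(phi1, phi2, D, wavelength):
    """Invert the two-receiver phase-difference relation:
    theta = acos(wavelength * (phi1 - phi2) / (2 * pi * D))."""
    return math.acos(wavelength * (phi1 - phi2) / (2 * math.pi * D))

D = 0.10
wavelength = 2 * D          # lambda = 2D, as preset in the description
# Round trip: a source at 60 degrees yields a phase difference of
# pi * cos(60 deg) = pi/2; the inversion recovers 60 degrees.
diff = math.pi * math.cos(math.radians(60))
theta = included_angle_theta(diff, 0.0, D, wavelength)
print(round(math.degrees(theta), 6))   # 60.0
```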
[0052] In this embodiment, the phase of the wireless signal received by the third receiver 333 is φ₃, while the phase of the wireless signal corresponding to the midpoint 334 is (φ₁ + φ₂)/2; the difference thereof is φ₃ − (φ₁ + φ₂)/2.
[0053] Accordingly, the processor 31 may apply the second formula below to calculate the other relative-orientation component of the drone 3 with respect to the target device 5 once the processor 31 calculates the phase difference between the wireless signal of the midpoint 334 and the wireless signal received by the third receiver 333, namely, applying the second formula to execute the step S14:

cos α = λ(φ₃ − (φ₁ + φ₂)/2)/(2πL)  (second formula)
[0054] In the above second formula, α is the included angle between the plane connecting the midpoint 334 and the third receiver 333 and the target device 5, φ₃ is the phase of the wireless signal received by the third receiver 333, φ₁ and φ₂ are the phases of the wireless signals received by the first receiver 331 and the second receiver 332, L is the distance between the midpoint 334 and the third receiver 333, and λ is the wavelength of the wireless signal.
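The second baseline works the same way, with the midpoint's virtual phase (φ₁ + φ₂)/2 as reference. A short Python sketch of the relation as reconstructed above (names and values assumed):

```python
import math

def included_angle_alpha(phi1, phi2, phi3, L, wavelength):
    """Second-baseline angle: compare the phase at the third receiver
    with the virtual phase at the midpoint of the first two receivers,
    over the baseline length L."""
    midpoint_phase = (phi1 + phi2) / 2
    return math.acos(wavelength * (phi3 - midpoint_phase) / (2 * math.pi * L))

L, wavelength = 0.08, 0.20
# Equal phases everywhere: the target is broadside to this baseline,
# so the included angle alpha is 90 degrees.
alpha = included_angle_alpha(0.8, 0.8, 0.8, L, wavelength)
print(round(math.degrees(alpha), 6))   # 90.0
```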
[0055] By exploiting the phase difference between two receivers (for example, the first receiver 331 and the second receiver 332), the orientation of the drone 3 with respect to the target device 5 can only be calibrated in two-dimensional space (namely, along a plane). However, when the number of receivers is equal to or more than three, the orientation of the drone 3 with respect to the target device 5 can be calibrated in three-dimensional space, thus orientating the drone 3 more accurately. Therefore, in the present invention, the number of receivers is at least three.
[0059] The above-mentioned orientation approach may also be applied to the midpoint 334 and the third receiver 333; the detailed description thereof is omitted here for brevity. After the drone 3 is orientated, the user may effectively control the drone 3 for selfies, or control the drone 3 to move/rotate along a designated direction such that the drone 3 is orientated to the direction designated by the user.
[0060] In another embodiment, in step S30 the receivers 33 receive the wireless signals sent from the target device 5, in step S32 the phase detector 34 detects the phases of the received wireless signals, and in step S34 the processor 31 calculates the relative orientation of the drone 3 with respect to the target device 5.
[0061] In this embodiment, in step S36 the processor 31 further calculates a horizontal azimuth angle and a vertical elevation angle of the target device 5 with respect to the drone 3 based on the relative orientation calculated in step S34 (which will be detailed later). Moreover, the processor 31 determines a moving direction and a destination coordinate based on the horizontal azimuth angle, the vertical elevation angle and the directional signal sent from the remote control 4 (step S38). Finally, the processor 31 controls the drone 3 to move along the moving direction and reach the destination coordinate (step S40) to comply with the directional operation. Therefore, the drone 3 keeps orientating with the target device 5 as axial origin and moves based on the directional operation of the user, so that the drone 3, the remote control 4 and the target device 5 interact effectively.
[0062] After step S40, the processor 31 determines in step S42 whether the wireless signals are no longer received, namely, whether the receivers 33 have stopped receiving the wireless signals. The processor 31 ends the orientation control method if the wireless signals are not received. The processor 31 repeatedly executes step S30 to step S40 to continually orientate the drone 3 and control the movement of the drone 3 if the wireless signals are continually received.
[0063] The processor 31 may apply the third formula below to calculate the horizontal azimuth angle A of the target device 5 with respect to the drone 3:

A = tan⁻¹( 2L(φ₁ − φ₂) / (D(2φ₃ − φ₁ − φ₂)) )  (third formula)
[0064] In the above third formula, A is the horizontal azimuth angle, D is the distance between the first receiver 331 and the second receiver 332, L is the distance between the midpoint 334 and the third receiver 333, φ₁ is the phase of the wireless signal received by the first receiver 331, φ₂ is the phase of the wireless signal received by the second receiver 332 and φ₃ is the phase of the wireless signal received by the third receiver 333.
[0065] More particularly, according to the relative position of the target device 5 with respect to the plurality of receivers, we can derive the result: tan A = cos θ / cos α. The values of cos θ and cos α can be obtained by the above orientation calculation; therefore, the third formula can be derived as above.
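Because both direction cosines carry the same λ factor, the wavelength cancels out of tan A = cos θ / cos α, leaving only the phases and baseline lengths. A Python sketch of the azimuth relation as reconstructed here (names and values are assumptions):

```python
import math

def horizontal_azimuth(phi1, phi2, phi3, D, L):
    """tan A = cos(theta) / cos(alpha); after the wavelength cancels,
    only the three phases and the baselines D and L remain.  atan2
    preserves the quadrant of A."""
    return math.atan2(L * (phi1 - phi2), D * (phi3 - (phi1 + phi2) / 2))

D, L, wavelength = 0.10, 0.08, 0.20
# Construct phases whose two direction cosines are both 0.5 -> A = 45 deg.
phi1, phi2 = 0.25 * math.pi, -0.25 * math.pi          # cos(theta) = 0.5
phi3 = (2 * math.pi * L / wavelength) * 0.5           # cos(alpha) = 0.5
A = horizontal_azimuth(phi1, phi2, phi3, D, L)
print(round(math.degrees(A), 6))   # 45.0
```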
[0066] The processor 31 may further apply the fourth formula below to calculate the vertical elevation angle B:

B = cos⁻¹ √( (λ(φ₁ − φ₂)/(2πD))² + (λ(2φ₃ − φ₁ − φ₂)/(4πL))² )  (fourth formula)

In the above fourth formula, B is the vertical elevation angle, D is the distance between the first receiver 331 and the second receiver 332, L is the distance between the midpoint 334 and the third receiver 333, φ₁ is the phase of the wireless signal received by the first receiver 331, φ₂ is the phase of the wireless signal received by the second receiver 332, φ₃ is the phase of the wireless signal received by the third receiver 333 and λ is the wavelength of the wireless signal.
[0067] More particularly, according to the relative position of the target device 5 with respect to the plurality of receivers, we can derive the result: cos B = √(cos²θ + cos²α). The values of cos θ and cos α can be obtained by the above orientation calculation; therefore, the fourth formula can be derived as above.
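The elevation relation cos B = √(cos²θ + cos²α) can likewise be exercised numerically. A Python sketch of the fourth-formula computation as reconstructed here (names and values are assumptions):

```python
import math

def vertical_elevation(phi1, phi2, phi3, D, L, wavelength):
    """cos B = sqrt(cos^2 theta + cos^2 alpha), with both direction
    cosines recovered from the measured phases."""
    cos_theta = wavelength * (phi1 - phi2) / (2 * math.pi * D)
    cos_alpha = wavelength * (phi3 - (phi1 + phi2) / 2) / (2 * math.pi * L)
    return math.acos(math.sqrt(cos_theta ** 2 + cos_alpha ** 2))

D, L, wavelength = 0.10, 0.08, 0.20
# Choose phases so cos(theta) = 0.6 and cos(alpha) = 0, hence cos B = 0.6.
phi1, phi2, phi3 = 0.6 * math.pi, 0.0, 0.3 * math.pi
B = vertical_elevation(phi1, phi2, phi3, D, L, wavelength)
print(round(math.degrees(B), 2))   # 53.13
```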
[0068] According to the above third formula and fourth formula, the processor 31 can precisely calculate the relative position (orientation) of the drone 3 with respect to the target device 5 without using an additional positioning device (such as a GPS device), thus moving the drone 3 according to the user's directional operation.
[0069] As mentioned above, in the step S38 the processor 31 determines the moving direction and the destination coordinate of the drone 3 based on the horizontal azimuth angle A, the vertical elevation angle B and the directional signal sent from the remote control 4.
[0070] More particularly, upon receiving the directional operation from the user, the remote control 4 mainly uses the geomagnetic meter 44 to sense the geomagnetic change and generate the current azimuth angle of the remote control 4. The remote control 4 obtains its elevation angle through the gyroscope 45 and obtains its moving acceleration through the acceleration meter 46. Moreover, the remote control 4 uses the processing module 41 therein to calculate the directional azimuth angle based on the azimuth angle and to calculate the directional elevation angle based on the elevation angle and/or the moving acceleration. The remote control 4 generates the directional signal based on the directional azimuth angle and the directional elevation angle and then sends the directional signal through the transceiver module 43 therein.
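One plausible way the processing module 41 might fuse the three sensors is a complementary blend: the geomagnetic azimuth taken directly, the gyroscope elevation nudged by the tilt implied by the measured gravity vector. The following Python sketch is hypothetical throughout; the fusion rule, class, and names are assumptions, not the patent's algorithm.

```python
import math
from dataclasses import dataclass

@dataclass
class DirectionalSignal:
    azimuth: float      # directional azimuth angle, degrees
    elevation: float    # directional elevation angle, degrees

def make_directional_signal(geomagnetic_azimuth, gyro_elevation, accel):
    """Hypothetical sketch of the processing module 41: azimuth from
    the geomagnetic meter 44; elevation from the gyroscope 45, blended
    with the tilt inferred from the acceleration meter 46 (the
    'and/or moving acceleration' of paragraph [0070])."""
    ax, ay, az = accel
    # Tilt of the remote control inferred from the gravity vector.
    accel_elevation = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    # Complementary blend: trust the gyroscope mostly (weights assumed).
    elevation = 0.98 * gyro_elevation + 0.02 * accel_elevation
    return DirectionalSignal(geomagnetic_azimuth, elevation)

# Remote pointing 30 degrees up, gravity reading consistent with that tilt.
sig = make_directional_signal(120.0, 30.0,
                              (-math.sin(math.radians(30)) * 9.81,
                               0.0,
                               math.cos(math.radians(30)) * 9.81))
print(round(sig.azimuth, 1), round(sig.elevation, 1))   # 120.0 30.0
```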
[0071] In the above-mentioned step S38, the processor 31 receives the directional signal from the remote control 4. The processor 31 calculates the moving direction and the destination coordinate for the drone 3 based on an azimuth angle difference between the directional azimuth angle and the horizontal azimuth angle A and an elevation angle difference between the directional elevation angle and the vertical elevation angle B. Therefore, the drone 3 can be orientated according to the wireless signals sent from the target device 5 without needing a GPS device. The drone 3 can be moved according to the directional signal sent from the remote control 4 to achieve precise interaction between the user and the object to be photographed.
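The angle differences of step S38 should be computed on the circle, so the drone always turns the short way round. A small Python sketch (function name and the sample angles are assumptions):

```python
def angle_difference(target_deg, current_deg):
    """Signed difference folded into (-180, 180], so a command of
    30 degrees against a measured 350 degrees yields +40, not -320."""
    d = (target_deg - current_deg) % 360.0
    return d - 360.0 if d > 180.0 else d

# Step S38 sketch: compare the directional angles from the remote
# control 4 with the measured angles A and B.
az_error = angle_difference(30.0, 350.0)    # directional azimuth vs. A
el_error = angle_difference(10.0, 25.0)     # directional elevation vs. B
print(az_error, el_error)   # 40.0 -15.0
```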
[0072] Thus, particular embodiments have been described. Other embodiments are within the scope of the following claims. For example, the actions recited in the claims may be performed in a different order and still achieve desirable results.