Autonomous Flying Fun
Previously, Raul has written articles introducing Ardupilot and PX4 for drone development. Here, he follows that trend with an intro to crafting drone applications with the MAVSDK MAVLink library and the PX4 autopilot ecosystem. He also discusses how to set up a development environment and install a basic toolchain, including PX4 and MAVSDK software.
In this article, I share a quick introduction to drone application development with the MAVSDK MAVLink library and the PX4 autopilot ecosystem. I also discuss how to set up a development environment in the Ubuntu operating system by installing a basic toolchain that includes the PX4 Software-In-The-Loop (SITL) flight stack, the jMAVSim and Gazebo simulators, MAVSDK-Python (the Python wrapper for the MAVSDK core C++ library) and the QGroundControl ground station software. I then explain how to write your first quadrotor autonomous flight application, by showing an example in which the vehicle follows a predetermined path to write your name’s initials in the air. SITL simulation is used to run the example, but I also briefly discuss how to run it with a real drone.
This article is a follow-up to my two-part article “An Introduction to Ardupilot and PX4 - Part 1” (Circuit Cellar 357, April 2020) and “An Introduction to Ardupilot and PX4 - Part 2” (Circuit Cellar 358, May 2020), which was a general introduction to quadrotor drones and the PX4 platform. Although this article is intended for beginners who have never experimented with drone autonomous flight and/or the MAVSDK library, it requires at least general introductory knowledge of how quadcopter drones work, of the PX4 flight controller software ecosystem—including the MAVLink protocol and QGroundControl—and, of course, of the Python programming language.
DEVELOPMENT TOOLCHAIN
Mac OS, Linux and Windows operating systems are officially supported for PX4 development. Having tried the toolchain installation in Windows and Linux (unfortunately not in Mac OS), I found that Windows is not really recommended for beginners. That’s because the installation process can be cumbersome, unless you have previous experience setting up compiling environments in Windows. The Linux installation seems to be easier and has better official support.
Apparently, Linux is used extensively as a preferred platform for PX4 development. The custom example presented in this article was tested in Ubuntu 18.04, both natively and also in a virtual machine using VirtualBox in a Windows host. It should work in Ubuntu 16.04 as well. A Docker container with the development toolchain installed beforehand is also available. However, at the time of publishing this article, it was only supported for Linux.
Now let’s look at how to set up the development toolchain in Ubuntu 18.04. It will include the PX4 SITL flight stack with the jMAVSim and Gazebo simulators, the MAVSDK-Python library and QGroundControl ground station software. I will not go into minute detail on all the steps involved. That said, there’s a text file INSTALL_RUN_SIMULATION.md included with the source code for this article that describes all the steps I took to install the toolchain. It also contains links to PX4’s official pages covering detailed procedures. All such files are available on Circuit Cellar’s article code and files webpage.
If you only have a Windows machine on hand, you can use VirtualBox to create a virtual Linux Ubuntu machine inside Windows. Check the RESOURCES section at the end of the article for some recommended Web links about tools, procedures and concepts outlined in this article. Once you have your Ubuntu operating system ready, follow these steps to set up the development toolchain. First, install the PX4 SITL jMAVSim/Gazebo simulation environment. There may be some prerequisites; check the INSTALL_RUN_SIMULATION.md text file for detailed steps.
After the installation is complete, the procedure will leave you at the default compilation subdirectory, which is ~/src/Firmware. From this subdirectory you can compile and run the PX4 SITL flight stack with both the jMAVSim and Gazebo vehicle simulators. For instance, to compile and run PX4 SITL with Gazebo, run make px4_sitl gazebo in the same command terminal. The compilation may fail due to some missing libraries in your Ubuntu installation; check the INSTALL_RUN_SIMULATION.md file for how to fix some compilation errors you might get.

Once the compilation finishes without errors, the Gazebo simulator will start and show a simulated 3DR Iris+ quadcopter (Figure 1). In the same command terminal, run commander takeoff, and you will see the drone begin to take off to a default altitude. In some cases, the quadcopter will land by itself after a few seconds. If not, run commander land to make it land at its home position. To stop the simulation, press <Ctrl+c> in the command terminal. You can try PX4 SITL with the jMAVSim simulator as well, by running make px4_sitl jmavsim. Figure 2 shows the jMAVSim simulation after commander takeoff has been executed.
Next, Install MAVSDK-Python. Download and install QGroundControl and clone the MAVSDK-Python GitHub repository to run the examples described in the next section. These last steps are straightforward (see the INSTALL_RUN_SIMULATION.md file for details). If you managed to get up to this point, you are ready to try some Python code to make the quadcopter fly by itself!
RUNNING BASIC EXAMPLES
Let’s try some examples from the MAVSDK-Python repository we have just cloned. First, run one of the two simulators (for instance, make px4_sitl gazebo). Second, run the QGroundControl ground station software. Third, open another command terminal window and run cd ~/MAVSDK-Python/examples to go to MAVSDK-Python’s examples directory (assuming you cloned MAVSDK-Python in your home directory). Next, run python3 takeoff_and_land.py to try the “take off and land” example. You should see the quadcopter take off first, and then land automatically after a few seconds. Figure 3 shows this example running. If this worked well for you, then you have the toolchain correctly installed on your machine.

In a similar way, you can also try the example offboard_position_ned.py, which gives the drone some navigation coordinates in its local coordinate frame. Run python3 offboard_position_ned.py to try this example as well.
If you check the code for the offboard_position_ned.py example, you will notice that the code lines in charge of commanding the drone to a certain pose look like this:
await drone.offboard.set_position_ned(PositionNedYaw(5.0, 10.0, -5.0, 90.0))
Note that the set_position_ned() function is being used to command the drone, and the arguments it receives are coordinates in the “North, East, Down” (NED) coordinate system. So, the first number in the argument (5.0) is the distance along the northern axis, the second number (10.0) is the distance along the eastern axis and the third number (-5.0) is the altitude. If the altitude is negative, it means the drone will actually climb up, because in NED, “Down -5.0” is really “Up +5.0.” Altitudes and distances are in meters.
The fourth number (90.0) is the desired “bearing angle” in degrees for the drone, once it reaches the desired position. The bearing angle is the angle around the drone’s vertical axis, known also as the “yaw” angle. Therefore, the PositionNedYaw() function takes these four numbers as arguments and converts them to a data structure of type PositionNedYaw, which is defined and used in the MAVSDK library for poses in the NED coordinate system.
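As a quick illustration of this convention, the snippet below (a minimal sketch, not taken from the example’s code) builds the same pose and shows how the negative Down value corresponds to a positive altitude:

from mavsdk.offboard import PositionNedYaw

# 5m North, 10m East, 5m above the home altitude (Down = -5.0),
# with the drone facing East (yaw/bearing = 90 degrees)
pose = PositionNedYaw(5.0, 10.0, -5.0, 90.0)

# Inside a coroutine, the pose would then be sent with:
#   await drone.offboard.set_position_ned(pose)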
NED and another variant called ENU (“East, North, Up”) are two geographical local coordinate systems used for aircraft navigation. Their axes are oriented along the geodetic directions defined by the Earth’s surface, with their origins fixed at the aircraft’s center of gravity [1]. Both follow the “right hand” rule.
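If you ever need to move between the two conventions, the relation is just a swap of the horizontal axes and a sign change on the vertical one. The small sketch below is my own illustration, not part of the article’s code:

def enu_to_ned(east, north, up):
    # ENU (East, North, Up) -> NED (North, East, Down)
    return (north, east, -up)

def ned_to_enu(north, east, down):
    # NED (North, East, Down) -> ENU (East, North, Up)
    return (east, north, -down)

# Example: 3m East, 4m North, 5m above ground in ENU
# becomes (4.0, 3.0, -5.0) in NED
print(enu_to_ned(3.0, 4.0, 5.0))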
PYTHON ASYNCHRONOUS PROGRAMMING
Another characteristic worth noting in the examples we just ran is the use of the asyncio Python library and its corresponding async/await keywords. Even if you have programmed in Python before, there’s a chance you have never used the asyncio asynchronous programming library. When using this library, the async keyword lets us declare an asynchronous function, typically called a “coroutine.”

A coroutine is basically a function whose execution can be paused at some point—generally, to wait for some event or data on which its subsequent execution depends. In other words, asynchronous functions can stop to wait for some kind of event, but without blocking other tasks in the same program [2]. So, the Python asyncio library with its async/await syntax enables concurrent asynchronous programming.

In summary, you declare a function async to make it a coroutine, and at some point in its body, by using the await keyword, the function cooperatively hands control back to the event loop. It then waits for some external event to occur—such as the arrival of data it needs to resume its task, or a return value from another function.
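A tiny, self-contained example (unrelated to MAVSDK) shows the idea: while one coroutine is suspended at its await, the event loop is free to run another one.

import asyncio

async def wait_for_data():
    print("waiting for data...")
    await asyncio.sleep(2)      # hands control back to the event loop
    print("data received")

async def heartbeat():
    for _ in range(4):
        print("still alive")
        await asyncio.sleep(0.5)

async def main():
    # Run both coroutines concurrently in the same event loop
    await asyncio.gather(wait_for_data(), heartbeat())

asyncio.run(main())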
WRITE MY INITIALS
The custom example write_my_initials.py that I’ve prepared for this article is based on the offboard_position_ned.py and telemetry_takeoff_and_land.py examples found in the MAVSDK-Python/examples folder. In the write_my_initials.py example, we make the drone autonomously follow a certain path to write the initials of our name in the air. For the sake of the example, we will write Circuit Cellar magazine’s initials (CCM), but you can later modify the code to write any letters you want. Let’s see what I did to accomplish this task.
First, I drew the “CCM” initials on grid paper and determined the minimum number of points required to clearly define them. Figure 4 shows how the drawing looked. The graph’s scale is 1m per division, so every letter is 4m by 5m in size. All defined points are in an XY Cartesian plane, but by adding a fixed 5m altitude to each one of them, we get all points defined in an XY plane that is 5m above the ground. We could also think of them as defined in an XYZ Cartesian coordinate system, with all points having the same Z coordinate (Figure 5).
However, because we will also use the set_position_ned() function to command the drone (as in the offboard_position_ned.py example), we need to give the drone all navigation points in the NED coordinate system. For that reason, each of our defined points in the XYZ coordinate system must be mapped to the NED coordinate system. The axis labels in Figure 5 already show how the XYZ and NED coordinate systems are defined, and how their axes correspond to each other. As you can see, the X coordinate maps to East, the Y coordinate maps to North and the Z coordinate maps to -Down.
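In code, that mapping is a one-liner. The helper below is my own sketch of the idea (in the example itself, the points are stored already converted, as described for Listing 2):

def xyz_to_ned(x, y, z):
    # X maps to East, Y maps to North and Z maps to -Down
    return (y, x, -z)

# A hypothetical vertex at X = 4m, Y = 5m and 5m altitude
# becomes (5.0, 4.0, -5.0) in NED
print(xyz_to_ned(4.0, 5.0, 5.0))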
Much of the initialization code for this example is the same as found in the offboard_position_ned.py example. The program’s “main” entry function is also async def run() (Listing 1), and the first code lines in charge of the initialization steps are self-explanatory. In code line 10, with drone = System() we instantiate a System object, which in the MAVSDK API represents the vehicle and allows us to interact with it. With line 11 immediately below, we make a UDP connection to the drone, to a default IP address and port 14540. With “udp://:14540” we expect to find the drone at “localhost:14540.” That will be the case when our Python script runs in the same machine as the SITL simulation.

Next, with an async for loop (line 14), we poll the drone’s connection state until a connected state is received. Subsequently, we print the drone’s identifier code (UUID) and break the async for loop—otherwise, the loop would run indefinitely, because in MAVSDK, PX4 telemetry data is provided as continuous streams. Line 20 arms the drone and line 23 lets us wait until we have acquired the home position coordinates (more on the get_drone_home_pos(drone) function later). Line 26 sets an initial pose for the drone before changing to offboard flight mode in lines 29 to 35. offboard flight mode requires that we begin sending some setpoints before switching to it. If, for any reason, the flight mode change fails, the drone is disarmed and a return from the main function is executed to abort the whole program. PX4’s offboard flight mode lets us control the drone by using position, velocity, acceleration (thrust) and attitude setpoint commands via MAVLink.
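Listing 1 is not reproduced in full here, but the condensed sketch below, adapted from the stock offboard_position_ned.py example, captures the same initialization sequence just described (the line numbers and some details of the real listing will differ):

import asyncio
from mavsdk import System
from mavsdk.offboard import OffboardError, PositionNedYaw

async def run():
    drone = System()
    await drone.connect(system_address="udp://:14540")  # SITL on localhost

    # Poll the connection-state telemetry stream until the drone is found,
    # then break out, because the stream would otherwise run indefinitely
    async for state in drone.core.connection_state():
        if state.is_connected:
            print("Drone discovered!")
            break

    await drone.action.arm()

    # offboard mode requires a setpoint to be streamed before it is started
    await drone.offboard.set_position_ned(PositionNedYaw(0.0, 0.0, 0.0, 0.0))
    try:
        await drone.offboard.start()
    except OffboardError as error:
        print(f"Starting offboard mode failed: {error._result.result}")
        await drone.action.disarm()
        return

asyncio.run(run())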
Listing 2 shows how each of the three initials is defined as a Python list (actually, a “list of lists”), in which every vertex point for a letter is an inner list of NED coordinates in meters and a bearing angle in degrees. The bearing angle is the angle measured in a clockwise direction from North, and is the angle the drone must have after reaching the given NED coordinates. For instance, a bearing angle of 270 degrees means the drone will be pointing toward the West. In line 3, we have the first “C” letter defined as a list containing eight vertex points.
COORDINATES AND ANGLES
The NED coordinates for each point were copied from Figure 4 and Figure 5. The bearing angles were picked at random, so their values don’t have major significance in this particular example. For the second “C” letter, in a for loop, I just added 6 to the East coordinate of each point in the first “C” letter, to make a copy of it that’s displaced 6 units to the right (see lines 7-9). Figure 4 shows how that makes sense intuitively. For the last letter, “M,” defined in lines 11-12, I also copied the coordinates from the grid graph and gave each point a random bearing.
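To make the data layout concrete, here is a toy version of those definitions with placeholder coordinates (the real “C” has eight vertices; see Listing 2 and Figure 4 for the actual numbers). Each inner list is [North, East, Down, bearing]:

# Placeholder vertices only -- the real coordinates are in Listing 2
first_c = [
    [5.0, 4.0, -5.0, 270.0],
    [5.0, 0.0, -5.0, 180.0],
    [0.0, 0.0, -5.0,  90.0],
    [0.0, 4.0, -5.0,   0.0],
]

# Second "C": the same shape displaced 6 units to the East
second_c = []
for north, east, down, bearing in first_c:
    second_c.append([north, east + 6.0, down, bearing])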
In line 14, the three letter lists are concatenated into one list called path_points_list by using the Python + concatenation operator. Next, in the for loop, we iterate over each point contained in that list and make the drone navigate to each one of them in the given order. For instance, to go to the current point, in lines 18-21 we first retrieve the North, East and Down coordinates, along with the bearing angle. Then, in lines 23-24, we calculate the point’s angle with respect to the East axis and its horizontal distance from the origin (the drone’s initial position), by using the arctangent trigonometric function and the Pythagorean formula, as if we were in the XY Cartesian plane (Figure 6). Then, in lines 26-29, we convert the point’s angle (with respect to the eastern axis) to a bearing (with respect to the northern axis), and with the bearing, the distance and the drone’s home latitude and longitude, we call the get_dest_latlong() function to calculate the GPS coordinates for the current goal point.
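The sketch below shows that geometry in code. The point_bearing_and_distance() helper is my own name for the calculation just described, and get_dest_latlong() is one plausible implementation using the textbook spherical “destination point” formula; the article’s actual code may differ in the details:

import math

EARTH_RADIUS_M = 6371000.0

def point_bearing_and_distance(north_m, east_m):
    # Angle with respect to the East axis and horizontal distance from the
    # origin, as if we were in the XY Cartesian plane
    angle_deg = math.degrees(math.atan2(north_m, east_m))
    distance_m = math.hypot(north_m, east_m)
    # Convert to a bearing: measured clockwise from North
    bearing_deg = (90.0 - angle_deg) % 360.0
    return bearing_deg, distance_m

def get_dest_latlong(lat_deg, lon_deg, distance_m, bearing_deg):
    # Destination point given start latitude/longitude, horizontal distance
    # and bearing, assuming a spherical Earth
    lat1, lon1 = math.radians(lat_deg), math.radians(lon_deg)
    brg = math.radians(bearing_deg)
    d = distance_m / EARTH_RADIUS_M   # angular distance
    lat2 = math.asin(math.sin(lat1) * math.cos(d) +
                     math.cos(lat1) * math.sin(d) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)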
Because the drone’s distance to the current goal will be measured in the async def check_is_at_goal(drone) function (more on this function later), to check if the drone has reached the current goal point, we populate the cur_goal_point_pos global variable (a Python dictionary, actually) with the current goal’s latitude, longitude and altitude. Next, in line 35 we call the drone.offboard.set_position_ned() function to set the current NED coordinates and yaw angle as the next drone pose. The await keyword in this line ensures the for loop will pause until the drone receives this new goal pose. Finally, in line 38 we call the check_is_at_goal(drone) function, which returns after the drone reaches the current goal point. Figure 7 shows a screen capture of the write_my_initials.py example running.
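Putting those pieces together, the body of the navigation loop looks roughly like the sketch below, which reuses the helper names from the previous sketches and belongs inside the async run() coroutine; it paraphrases Listing 2 rather than reproducing it:

# The three letters concatenated with the + operator
path_points_list = first_c + second_c + letter_m

for north, east, down, yaw in path_points_list:
    # Convert the NED goal into GPS coordinates relative to the home position
    goal_bearing, distance = point_bearing_and_distance(north, east)
    lat, lon = get_dest_latlong(drone_home_pos["lat"], drone_home_pos["lon"],
                                distance, goal_bearing)

    # Stored globally so check_is_at_goal() can measure the remaining distance
    cur_goal_point_pos["lat"] = lat
    cur_goal_point_pos["lon"] = lon
    cur_goal_point_pos["alt"] = -down   # altitude above home, in meters

    await drone.offboard.set_position_ned(PositionNedYaw(north, east, down, yaw))
    await check_is_at_goal(drone)       # returns once the goal is reached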
AUXILIARY FUNCTIONS
Listing 3 shows the two asynchronous auxiliary functions that are part of write_my_initials.py. Let’s explain what these functions do. The first function, async def get_drone_home_pos(drone), is in charge of obtaining the drone’s home position GPS coordinates and relative altitude. With the async for loop in lines 5-11, this function starts iterating over the “home” position incoming telemetry stream, and after receiving the first reading and storing it in the drone_home_pos Python dictionary, the function returns in line 11. The home position is the spot at which the drone takes off, and it won’t change during the execution of our Python script, so we only need to read it once at the beginning. If you remember, we call this function in the initialization steps of the main async def run() function (see line 23 in Listing 1).
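A minimal version of such a function, written from the description above (the telemetry.home() stream is standard MAVSDK-Python API, but the dictionary keys are placeholders of my own), could look like this:

drone_home_pos = {}

async def get_drone_home_pos(drone):
    # Read the "home" telemetry stream once and store the first reading
    async for home in drone.telemetry.home():
        drone_home_pos["lat"] = home.latitude_deg
        drone_home_pos["lon"] = home.longitude_deg
        drone_home_pos["rel_alt"] = home.relative_altitude_m
        return   # the home position won't change, one reading is enough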
The second function, async def check_is_at_goal(drone), is in charge of monitoring the drone’s progress toward the current goal point. To achieve this, it iteratively reads the drone’s current position by calling the drone.telemetry.position() function in the async for loop (line 21). After obtaining the drone’s current position, the latitude, longitude and relative altitude are stored in the drone_current_pos Python dictionary (lines 23-25). In line 27, it computes the horizontal distance in meters between the drone’s current position and the current goal’s position by using the get_haversine_distance() function. The “Haversine” formula used in the latter function lets us calculate “circular distances” (arcs) over the Earth’s surface between two sets of latitude and longitude coordinates.
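For reference, a textbook haversine implementation looks like the sketch below; the function name mirrors the article’s helper, but the body here is the standard formula rather than the listing’s exact code:

import math

EARTH_RADIUS_M = 6371000.0

def get_haversine_distance(lat1_deg, lon1_deg, lat2_deg, lon2_deg):
    # Great-circle (arc) distance in meters between two lat/long points
    lat1, lon1 = math.radians(lat1_deg), math.radians(lon1_deg)
    lat2, lon2 = math.radians(lat2_deg), math.radians(lon2_deg)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2.0) ** 2 +
         math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2.0) ** 2)
    return 2.0 * EARTH_RADIUS_M * math.asin(math.sqrt(a))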
In line 30, we compute the vertical distance as well, which is useful when the drone is going to a point at a different altitude than the previous one. Then, with the first if statement (line 33), we check whether the drone has reduced the horizontal distance to the current goal by at least 1m, and print the current (rounded) distance value to the terminal window, just to visualize the drone’s horizontal navigation progress. With the second if statement (line 37), we check whether both the horizontal and vertical distances to the current goal are less than or equal to the maximum allowed distance errors, MAX_HOR_DIST_ERROR and MAX_VER_DIST_ERROR, to consider the current goal point reached. If this is true, after printing a message, the function returns to the main caller function, in which the for loop in charge of controlling the navigation (see line 16 in Listing 2) will pick the next point and command the drone to go to it.
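In code, the monitoring loop reduces to something like the sketch below; the telemetry.position() stream is standard MAVSDK-Python API, while the threshold values and dictionary keys are placeholders rather than the listing’s exact ones:

MAX_HOR_DIST_ERROR = 0.5   # meters, placeholder value
MAX_VER_DIST_ERROR = 0.5   # meters, placeholder value

async def check_is_at_goal(drone):
    async for pos in drone.telemetry.position():
        # Horizontal distance to the goal over the Earth's surface
        hor_dist = get_haversine_distance(pos.latitude_deg, pos.longitude_deg,
                                          cur_goal_point_pos["lat"],
                                          cur_goal_point_pos["lon"])
        # Vertical distance, using altitude relative to the home position
        ver_dist = abs(pos.relative_altitude_m - cur_goal_point_pos["alt"])

        if hor_dist <= MAX_HOR_DIST_ERROR and ver_dist <= MAX_VER_DIST_ERROR:
            print("Goal point reached")
            return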
RUNNING ON A REAL DRONE
This very same example can be run with a real quadcopter drone connected to the PC via telemetry modules, after changing line 11 in Listing 1 to:

await drone.connect(system_address="serial:///dev/ttyUSB0:57600")

where /dev/ttyUSB0 is the port name assigned by your Linux system to the USB telemetry module connected to your PC.
Obviously, in this case you don’t need to run the Gazebo or jMAVSim simulators anymore, nor will you be able to use QGroundControl. If QGroundControl is open, it will automatically connect to the drone by opening the USB serial port, and your Python script won’t be able to open the same port to connect to the drone. You can first use QGroundControl to check if the drone connects properly with your computer via the telemetry modules, and then close it before running the Python script.
Please bear in mind that I could be leaving out some additional details (beyond the scope of this article) for the process of running MAVSDK-Python code to control a real drone. The point I’m trying to make here is that, in general, once you have an application correctly running in simulation, testing it with a real drone won’t require major changes to the code you wrote or the software tools you’ve been using. Nevertheless, I don’t recommend running this example with a real drone, unless you have at least reasonably good experience configuring and flying real quadcopters, and you really understand how PX4 flight modes work. If you do want to run it with a real drone, I recommend you first try a simpler example such as takeoff_and_land.py to be sure your workflow is okay. Also, remember to account for all the space the drone will need to cover the whole flight path.
CONCLUSIONS
I hope you saw how relatively easy it is to get into autonomous drone application development with MAVSDK-Python. Sure enough, you have to know Python beforehand and also get the toolchain installed and working properly. And because MAVSDK-Python uses the asyncio library, you also need to have some familiarity with asynchronous programming—nothing fancy, just a general understanding of how concurrency works and specifically how to use the async/await keywords.
If you want to further experiment with the MAVSDK-Python API, there are a couple of things you should do. First, get access to PX4’s “Slack” workspace and “Discuss” forum [3], where you can get support for all things related to the PX4 ecosystem, not just MAVSDK. Second, try to study all the examples included in the MAVSDK-Python repository. Some of them are intuitive and easy to understand, whereas some others would require you to research a bit more. Read the MAVSDK API documentation as well, for a detailed reference about the API’s functions and data structures used in the examples.
In follow-ups to this article, I will be presenting some more advanced examples of autonomous flight with MAVSDK, and also discussing how to upgrade a quadcopter with a companion computer to do autonomous object tracking with computer vision. I will also talk about how to integrate PX4 with the Robot Operating System (ROS), by using MAVROS (the MAVLink-to-ROS bridge). Until the next time!
RESOURCES
References:
[1] https://www.mathworks.com/help/aeroblks/about-aerospace-coordinate-systems.html
[2] https://snarky.ca/how-the-heck-does-async-await-work-in-python-3-5/
[3] PX4 Support (Slack and Discuss): https://dev.px4.io/v1.9.0/en/contribute/support.html
MAVSDK
https://mavsdk.mavlink.io/develop/en/index.html
PX4 SITL simulation
https://dev.px4.io/v1.9.0/en/simulation/index.html
Development Environment Installation
https://dev.px4.io/v1.9.0/en/setup/dev_env.html
Getting started with MAVSDK-Python
https://auterion.com/getting-started-with-mavsdk-python/
MAVSDK-Python: easy asyncio
https://auterion.com/mavsdk-python-easy-asyncio/
PX4 platform
https://px4.io/
QGroundControl
http://qgroundcontrol.com/
PX4 Support (Slack and Discuss)
https://dev.px4.io/v1.9.0/en/contribute/support.html
Local Tangent Plane Coordinates (NED Coordinate System)
https://en.wikipedia.org/wiki/Local_tangent_plane_coordinates
Speed Up Your Python Program With Concurrency
https://realpython.com/python-concurrency/
Install Ubuntu 18.04 LTS in a VirtualBox
https://linuxhint.com/install_ubuntu_18-04_virtualbox/
Ardupilot | https://ardupilot.org
Dronecode | www.dronecode.org
PX4 Autopilot | https://px4.io
Mathworks | www.mathworks.com
PUBLISHED IN CIRCUIT CELLAR MAGAZINE • AUGUST 2020 #361
Raul Alvarez Torrico has a B.E. in electronics engineering and is the founder of TecBolivia, a company offering services in physical computing and educational robotics in Bolivia. In his spare time, he likes to experiment with wireless sensor networks, robotics and artificial intelligence. He also publishes articles and video tutorials about embedded systems and programming in his native language (Spanish) at his company’s website, www.TecBolivia.com. You may contact him at raul@tecbolivia.com.