Design Solutions Research & Design Hub

Writing MAVSDK/PX4 Drone Applications

Autonomous Flying Fun

Previously, Raul has written articles introducing Ardupilot and PX4 for drone development. Here, he follows that trend with an intro to crafting drone applications with the MAVSDK MAVLink library and the PX4 autopilot ecosystem. He also discusses how to set up a development environment and install a basic toolchain, including PX4 and MAVSDK software.

  • How to develop drone applications with the MAVSDK MAVLink library and the PX4 autopilot ecosystem

  • How to set up a development environment and install a basic toolchain, including PX4 and MAVSDK software

  • How to set up the development toolchain in Ubuntu 18.04

  • How to run some examples from the MAVSDK-Python repository

  • How to work with NED coordinates

  • How to understand the asynchronous auxiliary functions

  • How to run the software on a real drone

  • MAVSDK MAVLink library

  • PX4 autopilot ecosystem

  • PX4 Software-In-The-Loop (SITL) flight stack

  • jMAVSim and Gazebo simulators

  • MAVSDK-Python

  • QGroundControl ground station software

In this article, I share a quick introduction to drone application development with the MAVSDK MAVLink library and the PX4 autopilot ecosystem. I also discuss how to set up a development environment in the Ubuntu operating system by installing a basic toolchain that includes the PX4 Software-In-The-Loop (SITL) flight stack, the jMAVSim and Gazebo simulators, MAVSDK-Python (the Python wrapper for the MAVSDK core C++ library) and QGroundControl ground station software. I then explain how to write your first quadrotor autonomous flight application, with an example in which the vehicle follows a predetermined path to write your name's initials in the air. SITL simulation is used to run the example, but I also discuss briefly how to run it with a real drone.

This article is a follow-up to my two-part article “An Introduction to Ardupilot and PX4 – Part 1” (Circuit Cellar 357, April 2020) and “An Introduction to Ardupilot and PX4 – Part 2” (Circuit Cellar 358, May 2020), which was a general introduction to quadrotor drones and the PX4 platform. Although this article is intended for beginners who have never experimented with drone autonomous flight and/or the MAVSDK library, it requires at least general introductory knowledge of how quadcopter drones work, the PX4 flight controller software ecosystem—including the MAVLink protocol and QGroundControl—and, of course, the Python programming language.


Mac OS, Linux and Windows operating systems are officially supported for PX4 development. Having tried the toolchain installation in Windows and Linux (unfortunately not in Mac OS), I found that Windows is not really recommended for beginners. That’s because the installation process can be cumbersome, unless you have previous experience setting up compiling environments in Windows. The Linux installation seems to be easier and has better official support.

Apparently, Linux is used extensively as a preferred platform for PX4 development. The custom example presented in this article was tested in Ubuntu 18.04, both natively and also in a virtual machine using VirtualBox in a Windows host. It should work in Ubuntu 16.04 as well. A Docker container with the development toolchain installed beforehand is also available. However, at the time of publishing this article, it was only supported for Linux.

Now let’s look at how to set up the development toolchain in Ubuntu 18.04. It will include the PX4 SITL flight stack with the jMAVSim and Gazebo simulators, the MAVSDK-Python library and QGroundControl ground station software. I will not go into minute detail on all the steps involved. That said, there’s a text file included with the source code for this article that describes all the steps I took to install the toolchain. It also contains links to PX4’s official pages covering detailed procedures. All such files are available on Circuit Cellar’s article code and files webpage.

If you only have a Windows machine on hand, you can use VirtualBox to create a virtual Linux Ubuntu machine inside Windows. Check the RESOURCES section at the end of the article for some recommended Web links about tools, procedures and concepts outlined in this article. Once you have your Ubuntu operating system ready, follow these steps to set up the development toolchain. First, install the PX4 SITL jMAVSim/Gazebo simulation environment. There may be some prerequisites; check the text file for detailed steps.

After the installation is complete, the procedure will leave you at the default compilation subdirectory, which is ~/src/Firmware. From this subdirectory you can compile and run the PX4 SITL flight stack with both the jMAVSim and Gazebo vehicle simulators. For instance, to compile and run PX4 SITL with Gazebo, in the same command terminal run make px4_sitl gazebo. It is possible the compilation may fail due to some missing libraries in your Ubuntu installation. Check the file for how to fix some compilation errors you might get.

Once the compilation finishes without errors, the Gazebo simulator will start and show a simulated 3DR Iris+ quadcopter (Figure 1). In the same command terminal, run commander takeoff, and you will see the drone begin to take off to a default altitude. In some cases, the quadcopter will land by itself after a few seconds. If not, run commander land to make it land at its home position. To stop the simulation, press <Ctrl+c> in the command terminal. You can try PX4 SITL with the jMAVSim simulator as well, by running make px4_sitl jmavsim. Figure 2 shows the jMAVSim simulation after commander takeoff has been executed.

FIGURE 1 – PX4 SITL with Gazebo simulator
FIGURE 2 – PX4 SITL with jMAVSim simulator

Next, install MAVSDK-Python. Then download and install QGroundControl, and clone the MAVSDK-Python GitHub repository to run the examples described in the next section. These last steps are straightforward (see the file for details). If you managed to get up to this point, you are ready to try some Python code to make the quadcopter fly by itself!


Let’s try some examples from the MAVSDK-Python repository we have just cloned. First, run one of the two simulators (for instance, you can run make px4_sitl gazebo). Second, run the QGroundControl ground station software. Third, open another command terminal window and run cd ~/MAVSDK-Python/examples to go to MAVSDK-Python’s examples directory (assuming you cloned MAVSDK-Python in your home directory). Next, use python3 to run the “take off and land” example script. You should see the quadcopter take off first, and then land automatically after a few seconds. Figure 3 shows this example running. If this worked well for you, then the toolchain is correctly installed on your machine.

FIGURE 3 – Running the “take off and land” example

In a similar way, you can also try the offboard NED-position example, which gives the drone navigation coordinates in its local coordinate frame. Use python3 to run this example as well.

If you check the code for that example, you will notice that the code lines in charge of commanding the drone to a certain pose look like this:

await drone.offboard.set_position_ned(PositionNedYaw(5.0, 10.0, -5.0, 90.0))

Note that the set_position_ned() function is being used to command the drone, and the arguments it receives are coordinates in the “North, East, Down” (NED) coordinate system. So, the first number in the argument (5.0) is the distance along the northern axis, the second number (10.0) is the distance along the eastern axis and the third number (-5.0) is the altitude. If the altitude is negative, it means the drone will actually climb up, because in NED, “Down -5.0” is really “Up +5.0.” Altitudes and distances are in meters.

The fourth number (90.0) is the desired “bearing angle” in degrees for the drone, once it reaches the desired position. Bearing angle is the angle around the drone’s vertical axis, known also as the “yaw” angle. Therefore, the PositionNedYaw() function will take these four numbers as arguments and convert them to a data structure of type PositionNedYaw, which is defined and used in the MAVSDK library for poses in the NED coordinate system.

NED and another variant called ENU (“East, North, Up”) are two geographical local coordinate systems used for aircraft navigation. Their axes are oriented along the geodetic directions defined by the Earth’s surface, with their origins fixed at the aircraft’s center of gravity [1]. Both follow the “right hand” rule.
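To make the relationship between the two conventions concrete, here is a small illustrative helper (my own sketch, not part of MAVSDK): converting between NED and ENU just swaps the two horizontal axes and negates the vertical one.

```python
# Hypothetical helpers (not part of MAVSDK), for illustration only.

def ned_to_enu(north, east, down):
    """Convert a (North, East, Down) point to (East, North, Up)."""
    return (east, north, -down)

def enu_to_ned(east, north, up):
    """Convert an (East, North, Up) point to (North, East, Down)."""
    return (north, east, -up)

# 5 m north, 10 m east, 5 m above the ground (Down = -5):
print(ned_to_enu(5.0, 10.0, -5.0))  # (10.0, 5.0, 5.0)
```

Note that applying the two conversions back-to-back returns the original point, as expected for a simple axis relabeling.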


Another characteristic also worth noting in the examples we just ran is the use of the asyncio Python library and its corresponding async/await keywords. Even if you have programmed in Python before, there’s a chance you never used the asyncio asynchronous programming library. When using this library, the async keyword lets us declare an asynchronous function, typically called a “coroutine.”

A coroutine is basically a function whose execution can be paused at some point—generally, to wait for some event or data on which its subsequent execution depends. In other words, asynchronous functions can stop to wait for some kind of event, but without blocking other tasks in the same program [2]. So, the Python asyncio library with its async/await syntax enables concurrent asynchronous programming.

In summary, you declare a function async to make it a coroutine, and in some part of its body, by using the await keyword, the function cooperatively hands the control back to the event loop. It then waits for some external event to occur—such as the arrival of data it needs to resume its task, or a returning value from another function.
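If you have never used asyncio, this minimal, self-contained sketch (unrelated to MAVSDK) shows the async/await mechanics just described:

```python
import asyncio

# While wait_for_data() is paused at its 'await', the event loop is free
# to run other coroutines concurrently.

async def wait_for_data():
    await asyncio.sleep(0.1)   # Pretend we're waiting for telemetry
    return "data ready"

async def main():
    result = await wait_for_data()  # Hands control back to the event loop
    print(result)

asyncio.run(main())  # Prints: data ready
```

In the MAVSDK examples, the same pattern appears everywhere: the `await` points are where the script waits for the drone (connection, telemetry, command acknowledgments) without blocking the rest of the program.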


The custom example that I’ve prepared for this article is based on two of the examples found in the MAVSDK-Python/examples folder. In the example, we will make the drone autonomously follow a certain path to write the first initials of our name in the air. For the sake of the example, we will write Circuit Cellar magazine’s initials (CCM), but you can later modify the code to write any letters you want. Let’s see what I did to accomplish this task.

First, I drew the “CCM” initials on a grid paper and determined the minimum amount of points required to clearly define them. Figure 4 shows how the drawing looked. The graph’s scale is 1m per division. So, every letter is 4m by 5m in size. All defined points are in an XY Cartesian plane. But by adding a fixed 5m altitude to each one of them, we get all points defined in an XY plane that is 5m above the ground. We could also think of them as defined in an XYZ Cartesian coordinate system, with all points having the same Z coordinate (Figure 5).

FIGURE 4 – Grid graph with defined letter coordinates
FIGURE 5 – Letters in XYZ and NED coordinate systems

However, because we will also use the set_position_ned() function to command the drone (as in the example we ran earlier), we need to give the drone all navigation points in the NED coordinate system. For that reason, each of our defined points in the XYZ coordinate system must be mapped to the NED coordinate system. The axis labels in Figure 5 already show how the XYZ and NED coordinate systems are defined, and how their axes correspond to each other. As you can see, the X coordinate maps to East, the Y coordinate maps to North and the Z coordinate maps to -Down.
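In code, that mapping is a one-liner. The following helper is a hypothetical illustration of it (my own, not part of the article’s listings):

```python
# Hypothetical helper: map a point (x, y, z) from the XYZ drawing frame
# to (north, east, down). X maps to East, Y to North and Z to -Down.

def xyz_to_ned(x, y, z):
    north = y
    east = x
    down = -z
    return (north, east, down)

# A letter vertex drawn at x=1 m, y=-1 m, flown 5 m above the ground:
print(xyz_to_ned(1.0, -1.0, 5.0))  # (-1.0, 1.0, -5.0)
```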

Much of the initialization code for this example is the same as that found in the example we ran earlier. The program’s “main” entry function is also async def run() (Listing 1), and the first code lines in charge of the initialization steps are self-explanatory. In code line 10, with drone = System() we instantiate a System object, which in the MAVSDK API represents the vehicle and allows us to interact with it. With line 11 immediately below, we make a UDP connection to the drone, at a default IP address and port 14540. With “udp://:14540” we expect to find the drone at “localhost:14540.” That will be the case when our Python script runs on the same machine as the SITL simulation.

# There are some additional code lines before this point...

### ---------- This is the application's 'main' asynchronous function ----------
async def run():
    """ Does Offboard control using position NED coordinates. """

    global drone_home_pos
    global cur_goal_point_pos

    drone = System()
    await drone.connect(system_address="udp://:14540")

    print("Waiting for drone to connect...")
    async for state in drone.core.connection_state():
        if state.is_connected:
            print(f"Drone discovered with UUID: {state.uuid}")
            break

    print("-- Arming")
    await drone.action.arm()

    print("Awaiting to get home position...")
    await get_drone_home_pos(drone)

    print("-- Setting initial setpoint")
    await drone.offboard.set_position_ned(PositionNedYaw(0.0, 0.0, 0.0, 0.0))

    print("-- Starting offboard")
    try:
        await drone.offboard.start()
    except OffboardError as error:
        print(f"Starting offboard mode failed with error code: {error._result.result}")
        print("-- Disarming")
        await drone.action.disarm()
        return

# There are some additional code lines after this point...

Next, with an async for loop (line 14), we poll the drone’s connection state until a connected state is received. Subsequently, we print the drone’s identifier code (UUID) and break out of the async for loop—otherwise, the loop would run indefinitely, because in MAVSDK, PX4 telemetry data is provided as continuous streams. Line 20 arms the drone, and line 23 waits until we have acquired the home position coordinates (more on the get_drone_home_pos(drone) function later). Line 26 sets an initial pose for the drone before changing to offboard flight mode in lines 29 to 35. Offboard flight mode requires that we begin sending some setpoints before switching to it. If, for any reason, the flight mode change fails, the drone is disarmed and a return from the main function is executed to abort the whole program. PX4’s offboard flight mode lets us control the drone by sending position, velocity, acceleration (thrust) and attitude setpoint commands via MAVLink.

Listing 2 shows how each of the three initials is defined as a Python list (actually, a “list of lists”), in which every vertex point of a letter is an inner list of NED coordinates in meters and a bearing angle in degrees. The bearing angle is the angle measured in a clockwise direction from the northern line, and is the angle the drone must have after reaching the given NED coordinates. For instance, a bearing angle of 270 degrees means the drone will be pointing toward the West. In line 3, we have the first “C” letter defined as a list containing eight vertex points.

# There are some additional code lines before this point...

c_letter_1 = [[0.0, 0.0, -5.0, 0.0], [1.0, -1.0, -5.0, 30.0], [1.0, -3.0, -5.0, 60.0],
              [0.0, -4.0, -5.0, 90.0], [-3.0, -4.0, -5.0, 120.0], [-4.0, -3.0, -5.0, 180.0],
              [-4.0, -1.0, -5.0, 210.0], [-3.0, 0.0, -5.0, 0.0]]

c_letter_2 = []
for point in c_letter_1:
    c_letter_2.append([point[0], point[1]+6, point[2], point[3]])

m_letter = [[-4.0, 8.0, -5.0, 0.0], [1.0, 8.0, -5.0, 0.0], [-1.0, 10.0, -5.0, 0.0],
            [1.0, 12.0, -5.0, 0.0], [-4.0, 12.0, -5.0, 0.0]]

path_points_list = c_letter_1 + c_letter_2 + m_letter

for goal_point in path_points_list:

    north_coord = goal_point[0]
    east_coord = goal_point[1]
    down_coord = goal_point[2]
    yaw_angle = goal_point[3]

    goal_point_angle = atan2(north_coord, east_coord)
    distance_to_goal_point = sqrt((east_coord**2) + (north_coord**2)) / 1000.0  # In kilometers

    goal_point_bearing = (450 - degrees(goal_point_angle)) % 360
    dest_latlong = get_dest_latlong(drone_home_pos['lat'], drone_home_pos['lon'],
                                    distance_to_goal_point, goal_point_bearing)

    cur_goal_point_pos['lat'] = dest_latlong[0]
    cur_goal_point_pos['lon'] = dest_latlong[1]
    cur_goal_point_pos['rel_alt'] = -down_coord

    await drone.offboard.set_position_ned(
        PositionNedYaw(north_coord, east_coord, down_coord, yaw_angle))

    await check_is_at_goal(drone)

# There are some additional code lines after this point...


The NED coordinates for each point were copied from Figure 4 and Figure 5. The bearing angles were picked at random, so their values don’t have major significance in this particular example. For the second “C” letter, in a for loop, I just added 6 to the East coordinate of each point in the first “C” letter, to make a copy of it displaced 6 units to the right (see lines 7-9). Figure 4 shows how that makes sense intuitively. For the last letter, “M,” defined in lines 11-12, I also copied the coordinates from the grid graph and gave each point a random bearing.

In line 14, the three letter lists are concatenated into one list called path_points_list by using the Python + concatenation operator. Next, in the for loop, we iterate over each point contained in the aforementioned list and make the drone navigate to each one of them in the given order. For instance, to go to the current point, in lines 18-21, first we retrieve the North, East and Down coordinates, along with the bearing angle. Then, in lines 23-24, we calculate the point’s angle with respect to the East and its horizontal distance from the origin (the drone’s initial position), by using the arctangent trigonometric function and the Pythagorean formula, as if we were in the XY Cartesian plane (Figure 6). Then, in lines 26-29, we convert the point’s angle (with respect to the eastern axis) to bearing (with respect to the northern axis), and with the bearing, the distance and the drone’s home latitude and longitude, we call the get_dest_latlong() function to calculate the GPS coordinates for the current goal point.

FIGURE 6 – Point distance and bearing calculation
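The get_dest_latlong() helper is not reproduced in the listings, but a typical implementation uses the standard spherical-Earth “destination point given distance and bearing” formula. Here’s a minimal sketch of what it could look like, assuming the distance is given in kilometers and the bearing in degrees (my own sketch, not necessarily the article’s exact code):

```python
from math import radians, degrees, sin, cos, asin, atan2

EARTH_RADIUS_KM = 6371.0  # Mean Earth radius

def get_dest_latlong(lat, lon, distance_km, bearing_deg):
    """Destination latitude/longitude from a start point, a great-circle
    distance and a bearing, using the standard spherical-Earth formula."""
    lat1 = radians(lat)
    lon1 = radians(lon)
    brg = radians(bearing_deg)
    d = distance_km / EARTH_RADIUS_KM  # Angular distance in radians

    lat2 = asin(sin(lat1) * cos(d) + cos(lat1) * sin(d) * cos(brg))
    lon2 = lon1 + atan2(sin(brg) * sin(d) * cos(lat1),
                        cos(d) - sin(lat1) * sin(lat2))
    return (degrees(lat2), degrees(lon2))

# Roughly 111.19 km due north from the equator moves the latitude
# by about one degree:
print(get_dest_latlong(0.0, 0.0, 111.19, 0.0))
```

This also explains why Listing 2 divides the goal-point distance by 1000: the helper expects kilometers, while the NED coordinates are in meters.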

Because the drone’s distance to the current goal will be measured in the async def check_is_at_goal(drone) function (more on this function later) to check whether the drone has reached the current goal point, we populate the cur_goal_point_pos global variable (a Python dictionary, actually) with the current goal’s latitude, longitude and altitude. Next, in line 35 we call the drone.offboard.set_position_ned() function to set the current NED coordinates and yaw angle as the next drone pose. The await keyword in this line ensures the for loop pauses until the drone receives this new goal pose. Finally, in line 38 we call the check_is_at_goal(drone) function, which returns after the drone reaches the current goal point. Figure 7 shows a screen capture of the example running.

FIGURE 7 – The custom example running

Listing 3 shows the two asynchronous auxiliary functions that are part of the custom example. Let’s explain what these functions do. The first function, async def get_drone_home_pos(drone), is in charge of obtaining the drone’s home position GPS coordinates and relative altitude. With the async for loop in lines 5-11, this function starts iterating over the “home” position incoming telemetry stream, and after receiving the first reading and storing it in the drone_home_pos Python dictionary, in line 11 the function returns. The home position is the spot at which the drone takes off, and it won’t change during the execution of our Python script, so we need to read it just once at the beginning. If you remember, we call this function in the initialization steps of the main async def run() function (see line 23 in Listing 1).

async def get_drone_home_pos(drone):

    global drone_home_pos

    async for home_position in drone.telemetry.home():

        drone_home_pos['lat'] = home_position.latitude_deg
        drone_home_pos['lon'] = home_position.longitude_deg
        drone_home_pos['rel_alt'] = home_position.relative_altitude_m
        return


async def check_is_at_goal(drone):

    global cur_goal_point_pos

    drone_current_pos = {'lat': None, 'lon': None, 'rel_alt': None}

    prev_round_hor_dist_to_goal = None

    async for position in drone.telemetry.position():

        drone_current_pos['lat'] = position.latitude_deg
        drone_current_pos['lon'] = position.longitude_deg
        drone_current_pos['rel_alt'] = position.relative_altitude_m

        hor_dist_to_goal = 1000.0 * get_haversine_distance(drone_current_pos['lat'],
            drone_current_pos['lon'], cur_goal_point_pos['lat'], cur_goal_point_pos['lon'])

        ver_dist_to_goal = abs(cur_goal_point_pos['rel_alt'] - drone_current_pos['rel_alt'])

        round_hor_dist_to_goal = round(hor_dist_to_goal)
        if round_hor_dist_to_goal != prev_round_hor_dist_to_goal:
            prev_round_hor_dist_to_goal = round_hor_dist_to_goal
            print(f"...round_hor_dist_to_goal: {round_hor_dist_to_goal}")

        if hor_dist_to_goal <= MAX_HOR_DIST_ERROR and ver_dist_to_goal <= MAX_VER_DIST_ERROR:
            print(f">>> Current goal reached!, hor_dist_to_goal={hor_dist_to_goal:.2f} mts")
            return

The second function, async def check_is_at_goal(drone), is in charge of monitoring the drone’s progress toward the current goal point. To achieve this, it iteratively reads the drone’s current position by calling the drone.telemetry.position() function in the async for loop (line 21). After obtaining the drone’s current position, the latitude, longitude and relative altitude are stored in the drone_current_pos Python dictionary (lines 23-25). In line 27, it computes the horizontal distance in meters between the drone’s current position and the current goal’s position by using the get_haversine_distance() function. The “Haversine” formula used in the latter function lets us calculate great-circle distances (arcs) over the Earth’s surface between two sets of latitude and longitude coordinates.
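The body of get_haversine_distance() isn’t shown in the listings either, but a typical Haversine implementation returning kilometers looks roughly like this (a sketch, not necessarily the article’s exact code; note that Listing 3 multiplies its result by 1000 to get meters):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # Mean Earth radius

def get_haversine_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two latitude/longitude
    points, using the Haversine formula."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)

    a = sin(dphi / 2.0)**2 + cos(phi1) * cos(phi2) * sin(dlmb / 2.0)**2
    return 2.0 * EARTH_RADIUS_KM * asin(sqrt(a))

# One degree of latitude spans roughly 111 km:
print(get_haversine_distance(0.0, 0.0, 1.0, 0.0))  # ~111.19
```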

In line 30, we compute the vertical distance as well, which is useful when the drone is going to a point at a different altitude from the previous one. Then, with the first if statement (line 33), we check if the drone has reduced the horizontal distance to the current goal by at least 1m, and print the current (rounded) distance value to the terminal window, just for the purpose of visualizing the drone’s horizontal navigation progress. With the second if statement (line 37), we check if both the horizontal and vertical distances to the current goal are less than or equal to the maximum allowed distance errors MAX_HOR_DIST_ERROR and MAX_VER_DIST_ERROR, to consider the current goal point reached. If this is true, after printing a message, the function returns to the caller, in which the for loop in charge of controlling the navigation (see line 16 in Listing 2) picks the next point and commands the drone to go to it.

This very same example can be run with a real quadcopter drone connected to the PC via telemetry modules, after changing line 11 in Listing 1 to:

await drone.connect(system_address="serial:///dev/ttyUSB0:57600")

where /dev/ttyUSB0 is the port name assigned by your Linux system to the USB telemetry module connected to your PC.

Obviously, in this case you don’t need to run the Gazebo or jMAVSim simulators anymore, nor will you be able to use QGroundControl. If QGroundControl is open, it will automatically connect to the drone by opening the USB serial port, and your Python script won’t be able to open the same port to connect to the drone. You can first use QGroundControl to check if the drone connects properly with your computer via the telemetry modules, and then close it before running the Python script.

Please bear in mind that I could be leaving out some additional details (beyond the scope of this article) about the process of running MAVSDK-Python code to control a real drone. The point I’m trying to make here is that, in general, once you have an application running correctly in simulation, testing it with a real drone won’t require major changes to the code you wrote or the software tools you’ve been using. Nevertheless, I don’t recommend running this example with a real drone, unless you have at least reasonably good experience configuring and flying real quadcopters, and you really understand how PX4 flight modes work. However, if you do want to run it with a real drone, I recommend you first try a simpler example, such as the take-off-and-land example, to be sure your workflow is okay. Also, remember to consider all the space the drone would need to cover the whole flying path.


I hope you saw how relatively easy it is to get into autonomous drone application development with MAVSDK-Python. Sure enough, you have to know Python beforehand and also get the toolchain installed and working properly. And because MAVSDK-Python uses the asyncio library, you also need to have some familiarity with asynchronous programming—nothing fancy, just a general understanding of how concurrency works and specifically how to use the async/await keywords.

If you want to further experiment with the MAVSDK-Python API, there are a couple of things you should do. First, get access to PX4’s “Slack” workspace and “Discuss” forum [3], where you can get support for all things related to the PX4 ecosystem, not just MAVSDK. Second, try to study all the examples included in the MAVSDK-Python repository. Some of them are intuitive and easy to understand, whereas some others would require you to research a bit more. Read the MAVSDK API documentation as well, for a detailed reference about the API’s functions and data structures used in the examples.

In follow-ups to this article, I will be presenting some more advanced examples of autonomous flight with MAVSDK, and also discussing how to upgrade a quadcopter with a companion computer to do autonomous object tracking with computer vision. I will also talk about how to integrate PX4 with the Robot Operating System (ROS), by using MAVROS (the MAVLink-to-ROS bridge). Until the next time! 


[3] PX4 Support (Slack and Discuss)




PX4 SITL simulation

Development Environment Installation

Getting started with MAVSDK-Python

MAVSDK-Python: easy asyncio

PX4 platform


PX4 Support (Slack and Discuss)

Local Tangent Plane Coordinates (NED Coordinate System)

Speed Up Your Python Program With Concurrency

Install Ubuntu 18.04 LTS in a VirtualBox

Ardupilot |
Dronecode |
PX4 Autopilot |
Mathworks |



Raul Alvarez Torrico has a B.E. in electronics engineering and is the founder of TecBolivia, a company offering services in physical computing and educational robotics in Bolivia. In his spare time, he likes to experiment with wireless sensor networks, robotics and artificial intelligence. He also publishes articles and video tutorials about embedded systems and programming in his native language (Spanish) at his company’s web site. You may contact him at


Copyright © KCK Media Corp.
All Rights Reserved


by Raul Alvarez Torrico