Building the System
Andrei continues his two-part article series about his CovidTestDrone project—a drone delivery system that allows individuals to receive a self-administered lower nasal swab COVID-19 test at home via a drone and then return the sample to the lab via the same drone after administering the swab. In Part 2, he describes the construction steps, the code architecture and the dashboard development.
To review from Part 1 (Circuit Cellar 372, July 2021), my CovidTestDrone project enables self-administered COVID-19 tests to be delivered within minutes to patients’ homes via drone delivery and returned to the lab to be analyzed (Figure 1). Patients do not need to leave their homes or come into contact with other individuals to get tested. Tests can be provided on-demand—no waiting for a doctor to check your symptoms in person. The system provides testing that is precise, non-invasive, and user-friendly. In Part 2, we’ll dive into the application in depth, and look at all the steps required to build the complete CovidTestDrone system.
This section will walk readers through the technical aspects of the drone application and all assisting applications. The solution can be split into two types of applications: frontend and backend. The frontend refers to the drone system that delivers test kits. The backend encompasses digital assisting applications such as the IoT Central dashboard.
I bought a drone development kit that included all the components and materials necessary to put together a customizable drone, assembled it, and ran open-source firmware on it. This provided me with a starting point for developing the solution. For the project, I chose the NXP Hovergames drone kit, developed by NXP to provide the user with everything needed to build a robust and modular drone. The drone is powered by a flight management unit (FMU)—the RDDRONE-FMUK66—which runs PX4 firmware. The kit also contains all the hardware (sensors, motors and so forth) as well as the parts needed to construct a fully operational drone. The drone uses MavLink to communicate with a base computer while flying, allowing the drone to be managed from the base.
The drone runs PX4, an open-source firmware stack that can be regarded as the operating system of the FMU. The drone ships without any firmware installed and expects a bootloader to be flashed before it can operate. The PX4 bootloader is open source and available on GitHub. The firmware is the code run by the FMU on the drone: the software that interfaces with all the sensors and motors on the device, as well as handling communications and GPS.
A crucial aspect of the drone implementation is its ability to be monitored during flight. This ensures that its operation is optimal, and action can be taken if the drone is experiencing issues. The drone is equipped with both a telemetry radio transceiver and a radio controller.
These devices are used for different purposes. The telemetry radio transceiver is used to communicate with the base while the drone is in flight; it has a range of 2km to 3km when equipped with powerful antennae. One telemetry radio is placed on the drone while the other is connected to a computer running QGroundControl at the base. The drone uses the MavLink protocol to report its geolocation, yaw, pitch and roll, other sensor readings and its mission status to QGroundControl over the radios. This system is used for monitoring the drone while it flies autonomously.
The drone is also equipped with a radio controller with a range of about 250 meters. The drone can be manually controlled via the radio controller. This will not be used in this project because the drone will only fly autonomously.
RANGE AND FLIGHT MODES
The drone does have a limited range because it uses radio technology to communicate with the base. Its range is limited by that of the telemetry transceiver, so the drone has a range of 2km to 3km. It is crucial that the drone always stays in range so it can be monitored from the base. Note that the base uploads all flight data at the start of the flight, not during flight. Therefore, if the drone were to go out of range, it would continue its flight plan unless a failsafe was implemented.
PX4 comes with many flight modes, which can be changed via QGroundControl. The flight modes are split into two categories: autonomous and manual. The project uses a single flight mode—mission mode. Mission mode allows users to prepare a flight plan for the drone to execute by dropping pins (waypoints) on a map in QGroundControl. The drone goes from one waypoint to the next until it reaches the last one, where it can be programmed to land or return to base.
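Conceptually, a mission is just an ordered list of waypoints. The sketch below (plain C++ with hypothetical coordinates, not code from the project) models a mission and sums its leg lengths with the haversine formula, which is a quick sanity check that a planned route stays within the 2km to 3km telemetry range:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// A mission waypoint: latitude/longitude in degrees, altitude in meters.
struct Waypoint {
    double lat, lon, alt;
};

// Great-circle distance between two waypoints in meters (haversine formula).
double distanceM(const Waypoint &a, const Waypoint &b) {
    const double R = 6371000.0;                 // mean Earth radius, m
    const double rad = 3.14159265358979323846 / 180.0;
    double dLat = (b.lat - a.lat) * rad;
    double dLon = (b.lon - a.lon) * rad;
    double h = std::sin(dLat / 2) * std::sin(dLat / 2) +
               std::cos(a.lat * rad) * std::cos(b.lat * rad) *
               std::sin(dLon / 2) * std::sin(dLon / 2);
    return 2.0 * R * std::asin(std::sqrt(h));
}

// Sum the leg lengths of a mission. Useful for checking a route against
// the telemetry radio's 2km to 3km limit before uploading it.
double missionLengthM(const std::vector<Waypoint> &wps) {
    double total = 0.0;
    for (size_t i = 1; i < wps.size(); ++i)
        total += distanceM(wps[i - 1], wps[i]);
    return total;
}
```

A mission such as `{{46.7700, 23.5900, 0}, {46.7740, 23.5950, 30}, {46.7760, 23.6000, 30}}` (take-off, intermediate waypoint, landing spot) can then be checked with `missionLengthM` before it is planned in QGroundControl; the altitude field is carried along but does not enter the ground-distance estimate.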
In the case of CovidTestDrone, when consumers purchase a test, they must specify where the drone should land. At the base, a person oversees planning routes to people’s homes; they must analyze the landscape and plan a route and altitude to reach the home, then identify the landing spot and ensure it is adequate for landing. Certain obstacles can be avoided by adding waypoints around them or changing the flight altitude. After the plan is complete, it can be uploaded to the drone using the radio transceivers. The drone will then take off and execute the plan autonomously, and can be monitored on a map in QGroundControl.
BATTERY LIFE AND ENCLOSURE
The battery life of the drone is about 40 minutes of flight and 1 hour of standby before it needs recharging. I am using a 5,000mAh, 3S, 11.1V battery. Note that all batteries require XT60 connectors. This battery life is about right for one trip to and from the consumer plus standby while the consumer is administering the test. It can be extended by using a bigger battery, perhaps 6,500mAh, if need be.
The container is the device that stores the COVID-19 test while the drone is flying. I developed this container from scratch to ensure that it is perfectly suited to the project. The enclosure refers to the physical construction of the container, which is made up of four parts, visible in Figure 1: a bottom plate, a side plate, a top plate and a door.
The inside of the enclosure is split into two parts separated by a plate. One of the sides is designed to hold all the components while the other side holds the test kit. The test kit is accessible via the door, which can be opened to access the test container. The container has a series of holes for different components such as the servo, LED and antenna. There is also a hole for the USB port of the microcontroller. The enclosure was designed to be either 3D printed or CNC machined. I used a CNC machine to cut out the different pieces and then glued them together. All design files are open source and available on GitHub. These are ready to be 3D printed.
MCU AND OTHER HARDWARE
The Arduino MKR GSM 1400 is the microcontroller board I chose for the container. It is built around a SAMD21 microcontroller with an Arm Cortex-M0+ core—the architecture shared across the Arduino MKR family—and offers a low-power solution that extends the battery life of the device while maintaining good performance.
There are a few other components used in the project, shown in Figure 2: an RGB LED, an antenna, a servo, a button, a logic level converter, a keypad, a GY-21 temperature and humidity sensor, an Adafruit IR break beam module, a 4.7µF capacitor, a 1kΩ resistor, three 100Ω resistors, a breadboard and jumper wires. The RGB LED is used to indicate to the consumer and operator whether the container is open or closed. The button and keypad are used for authentication and the servo is used to open and close the container.
A logic level converter is needed in the project because the servo requires 5V signals while the MKR GSM’s I/O operates at 3.3V. The servo is powered from the MKR GSM’s 5V pin, which connects directly to the power source, and the converter translates the logic level between the servo’s data pin and the MKR GSM’s 3.3V pin.
The MKR GSM uses GSM to communicate with the backend. This protocol was chosen because of its wide availability throughout the world and its data transfer speeds. The MKR GSM can also estimate its geolocation from nearby cell towers. This data is sent to the backend so that the location of the container can be monitored live. This can be useful if, for instance, the container separated from the drone, allowing its location to still be found.
The application uses Microsoft Azure IoT Central (Figure 3). This is a comprehensive, industry-leading backend solution for IoT devices provided by Microsoft. IoT Central allows for thousands of devices to stream data to the cloud. This data can then be processed and displayed on dashboards online. IoT Central allows for bidirectional communication. The IoT device (the container) can send data to the service and receive commands from it. This allows comprehensive monitoring of the container remotely from the base.
The container reports its geolocation and if it is locked or unlocked to the backend together with other sensor readings. The operator can change properties such as the unlock PIN and lock and unlock the container remotely from the dashboard. The container has a locking mechanism to ensure that only authorized individuals have access to the test kit inside. It is important that the test is received by the right person. A keypad unlock mechanism is implemented for the container.
The PIN can be set via the IoT Central dashboard by the operator. The 6-digit PIN will then be sent to the device, which will store it securely. To open the container, the individual must press the red button and then input the PIN. They have three attempts to get the PIN right before they will need to restart the process. This feature is optional and can be disabled through the backend so that the container can be opened without a PIN. When closing the container, only the red button must be pressed. All these actions are sent back to the dashboard where they can be monitored in real time. An RGB LED has been added to the design for user feedback.
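As an illustration, the attempt-limited PIN check could be modeled like this. This is a plain C++ sketch of the behavior described above, not the project’s actual firmware:

```cpp
#include <cassert>
#include <string>

// Sketch of the container's PIN-check logic. Assumes a 6-digit PIN set from
// the dashboard and three attempts before the process must be restarted;
// the PIN requirement can be disabled via the backend.
class PinLock {
public:
    explicit PinLock(std::string pin, bool pinRequired = true)
        : pin_(std::move(pin)), required_(pinRequired) {}

    // Called after the red button is pressed, with the digits the user typed.
    // Returns true if the container should unlock.
    bool tryUnlock(const std::string &entered) {
        if (!required_) return true;       // PIN disabled through the backend
        if (attempts_ >= 3) return false;  // locked out; restart the process
        if (entered == pin_) {
            attempts_ = 0;                 // success resets the counter
            return true;
        }
        ++attempts_;
        return false;
    }

    void reset() { attempts_ = 0; }        // restarting the process
    int attemptsUsed() const { return attempts_; }

private:
    std::string pin_;
    bool required_;
    int attempts_ = 0;
};
```

In the real device, each outcome (failed attempt, lockout, successful unlock) would also be reported to the dashboard so it can be monitored in real time.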
BATTERY AND POWER
The device is powered by a power bank through its USB port. This approach was taken to add modularity to the design; the power bank can be easily replaced and recharged when it runs out. I am using a 2,200mAh power bank, which gives the container a battery life of about 48 hours. Some power banks include a screen that indicates how much charge is left. As previously mentioned, the container is equipped with a temperature and humidity sensor. The climate in the container must be kept within specific parameters to ensure the integrity of the test kit and specimen collected.
The device is also equipped with an IR break beam module. This module consists of a transmitter and a receiver. The transmitter emits infrared light in a straight line and the receiver detects this light. The two components are placed directly in front of each other on either side of the container. If the test kit is placed in the container, the light beam is interrupted and the receiver no longer detects any light. This allows the device to detect the presence of the test kit.
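The detection logic amounts to treating a broken beam as “kit present,” ideally with a little debouncing so a momentary flicker is not mistaken for a state change. Below is a plain C++ sketch of that idea; the debounce count is an assumption for illustration, not a value from the project:

```cpp
#include <cassert>

// Debounced test-kit detection from the IR break-beam receiver.
// The receiver stops detecting light when the kit blocks the beam, so a
// broken beam means "kit present". A few consecutive identical readings
// are required before the state is trusted.
class KitDetector {
public:
    // Feed one raw reading per poll: true = beam received (no kit blocking).
    // Returns the debounced "kit present" state.
    bool update(bool beamReceived) {
        bool rawPresent = !beamReceived;   // broken beam => kit present
        if (rawPresent == candidate_) {
            if (++stable_ >= kDebounce) present_ = candidate_;
        } else {
            candidate_ = rawPresent;       // state changed; restart count
            stable_ = 1;
        }
        return present_;
    }

    bool kitPresent() const { return present_; }

private:
    static constexpr int kDebounce = 3;    // consecutive readings required
    bool present_ = false;
    bool candidate_ = false;
    int stable_ = 0;
};
```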
The COVID test kit that the drone carries is smaller than a typical kit. The dimensions of the box containing the kit are 70mm × 70mm × 40mm. This is just about right to fit everything in. The swab may traditionally be bigger than the size of the kit. A custom swab that folds in two can be used to save space, this may also be easier to break in half for it to fit in the test tube.
Figure 4 showcases the container’s code architecture. This section will briefly walk through it. When the button on the container is pressed, the container will check if it is locked or unlocked. If the container is unlocked, the container will be locked and the change will be reported to the backend.
If the container is locked, the device will check if a PIN is required to unlock the container. If one is, the device will wait for the user to input the correct PIN and then unlock the container. Otherwise, it will unlock the container without a PIN.
The device reports its geolocation, temperature, humidity and if the test kit is inside to the backend at a fixed interval of time that can be customized via the dashboard. The device will also check for property changes in the backend such as a change in the PIN and reflect these changes locally.
Finally, the device will check if it received any commands from the backend. The backend can send a command to open or close the container, which will be executed by the container. All code is open source and available on the GitHub repository.
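The flow above can be condensed into a small model. The sketch below is plain C++ with a simulated clock and a stand-in backend rather than the actual firmware, but it captures the loop’s structure: report telemetry at a fixed interval, sync twin-property changes, then execute pending commands:

```cpp
#include <cassert>
#include <cstdint>

// Stand-in for the IoT Central connection: flags set when the backend has
// a pending twin-property change or lock/unlock command for the device.
struct Backend {
    bool pinChanged = false;
    bool lockCommand = false;
    bool unlockCommand = false;
};

class Container {
public:
    explicit Container(uint32_t intervalMs) : intervalMs_(intervalMs) {}

    // One pass of the main loop at time `now` (ms since boot).
    void tick(uint32_t now, Backend &backend) {
        // 1) Report telemetry (geolocation, temperature, kit presence...)
        //    at the configurable fixed interval.
        if (now - lastReport_ >= intervalMs_) {
            lastReport_ = now;
            ++telemetrySent_;
        }
        // 2) Reflect twin-property changes locally (e.g. a new PIN).
        if (backend.pinChanged) backend.pinChanged = false;
        // 3) Execute any pending lock/unlock command.
        if (backend.lockCommand)   { locked_ = true;  backend.lockCommand = false; }
        if (backend.unlockCommand) { locked_ = false; backend.unlockCommand = false; }
    }

    bool locked() const { return locked_; }
    int telemetrySent() const { return telemetrySent_; }

private:
    uint32_t intervalMs_;
    uint32_t lastReport_ = 0;
    bool locked_ = true;
    int telemetrySent_ = 0;
};
```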
NAV-Q ONBOARD COMPUTER
The Nav-Q onboard computer is the device responsible for capturing and streaming footage live from the drone to the operator. The device is equipped with a camera facing the front of the drone. Nav-Q is an onboard computer developed by NXP and EMCraft. It is equipped with a quad core processor and 2GB of RAM. The device also comes with a Google Coral Camera, which is interfaced in the project to stream video live back to the operator.
Having a live video feed from the drone is of great help when monitoring the drone remotely. It allows the operator to visualize where the drone is going and identify unexpected obstacles in its way. The operator can even take over manual control of the drone, if need be, or make it hover if an unexpected event happens. This allows the operator to manually fly the drone out of danger or change the flight plan to accommodate the unexpected event. Overall, this makes the drone flight safer and easier to monitor.
The Nav-Q is programmed to run a GStreamer application that captures video from the camera and streams it live to the operator’s computer running QGroundControl. This way, the video feed is visible together with all the other tools in QGroundControl (Figure 5). The Nav-Q starts streaming the video automatically after it connects to the Internet. This ensures full autonomy and allows the device to set up and work autonomously after it is plugged in.
The Nav-Q computer is powered by the drone’s onboard battery. This eliminates the need for a separate battery, which would increase costs. At present, GStreamer is configured only to stream over Wi-Fi to another device on the same network. This is clearly not ideal for deployment in the field. A GSM module should be added to the device when deployed in the field, and the code should be altered to allow streaming the video to destinations outside the network. This could be achieved by streaming the video live to a suitable server and then querying it from the operating computer.
IoT CENTRAL BACKEND
The backend comprises two applications: IoT Central and QGroundControl. Azure IoT Central is the backend of the container device. The service is offered by Microsoft and allows IoT devices to send data and receive commands via a customizable dashboard. IoT Central allows developers to create an application and then connect devices to it using Azure IoT Hub DPS. Figure 6 illustrates the data communication protocol in IoT Central. The IoT device (the container) establishes a connection to IoT Central via GSM. The device authenticates via SAS (shared access signature) authentication.
After a connection is established, the IoT device will send telemetry data to the backend. These are data points reported by the device to the backend at fixed intervals of time. All this data is visualized on a custom dashboard in IoT Central (Figure 7). The following telemetry data is sent:
• Geolocation
• Geolocation accuracy
• Container open/close
• Temperature and humidity
• Presence of test kit
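As a concrete illustration, a telemetry message could be assembled as a JSON string before being published to the backend. The sketch below is plain C++ with hypothetical field names, not the project’s exact schema:

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Build one telemetry payload as a JSON string. Field names here are
// illustrative; the real device template defines the actual schema.
std::string buildTelemetry(double lat, double lon, double accuracyM,
                           float tempC, float humidity, bool open,
                           bool kitPresent) {
    char buf[256];
    std::snprintf(buf, sizeof(buf),
        "{\"lat\":%.5f,\"lon\":%.5f,\"accuracy\":%.1f,"
        "\"temperature\":%.1f,\"humidity\":%.1f,"
        "\"open\":%s,\"kitPresent\":%s}",
        lat, lon, accuracyM, tempC, humidity,
        open ? "true" : "false", kitPresent ? "true" : "false");
    return std::string(buf);
}
```

On the MKR GSM, a string like this would be published over MQTT at each telemetry interval; IoT Central then matches the field names against the device template to populate the dashboard.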
Besides telemetry data, there is another type of data called properties. Properties come in two types: normal and twin properties. Normal properties are constant values that do not change. The container will send details such as its IMEI (International Mobile Equipment Identity) number and firmware version to the backend when it boots up. This helps operators identify devices and ensure they are running the correct firmware (Figure 8).
Twin properties are properties that are present in IoT Central and as local variables on the device. These properties can be changed via the dashboard. When a twin property is changed, IoT Central will send a message to the device to let it know the desired value of the variable. The following are twin properties:
• Container Unlock Option (PIN unlock or no auth)
• Container Unlock PIN (6-digit PIN)
• LED Feedback (if the RGB LED is being used)
• Telemetry send interval (the interval of time when telemetry data is reported to the backend)
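The desired/reported handshake behind twin properties can be sketched as follows. This is a simplified plain C++ model with hypothetical property names, not code from the repo:

```cpp
#include <cassert>
#include <string>

// A desired-property message from IoT Central: the backend names the
// property, supplies the new value, and attaches a version number.
struct DesiredProperty {
    std::string name;     // e.g. "unlockPin" or "telemetryInterval"
    std::string value;
    int version;
};

// Local mirror of the twin properties on the device. Applying a desired
// property updates the local variable and records the version, which the
// device would echo back as a "reported" acknowledgment so the dashboard
// can show the sync state.
struct DeviceTwin {
    std::string unlockPin = "000000";
    int telemetryIntervalS = 60;
    int lastAckedVersion = 0;

    // Returns false for properties this device does not recognize.
    bool apply(const DesiredProperty &p) {
        if (p.name == "unlockPin") {
            unlockPin = p.value;
        } else if (p.name == "telemetryInterval") {
            telemetryIntervalS = std::stoi(p.value);
        } else {
            return false;
        }
        lastAckedVersion = p.version;
        return true;
    }
};
```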
All these datapoints can be changed via the settings tab on the dashboard, making it very simple for the operator to change these properties remotely. The container can also be locked and unlocked remotely through the commands tab on the dashboard. The container state is represented by a switch; the operator can toggle this switch to send a command to lock or unlock the container. A global dashboard could be implemented if multiple drones are in operation. This would plot the location of all containers on the same map, allowing the operator to monitor all containers at the same time.
QGroundControl is an application that allows remote interfacing and monitoring of the drone. QGroundControl is ideal because of its seamless integration with PX4 and the MavLink communication protocol. The application is also free to use. In a nutshell, QGroundControl enables everything from programming missions for the drone to monitoring its flight, battery level, video stream and other parameters. The application works on Windows, Android, and iOS. It is also easy to use and provides an intuitive interface. The drone will continuously broadcast the geolocation and sensor data. The drone is always shown on a map on the app and the sensor readings are displayed in the upper right corner (Figure 9).
QGroundControl is also used to plan routes for the drone. These are called missions. To plan a mission, the operator can select a take-off location (this can be set to the current location of the drone) and then plot waypoints on the map. The drone flies from one waypoint to the next in straight lines, adjusting its heading and/or altitude as needed before proceeding to the next waypoint. Finally, a return waypoint is placed at the destination, which makes the drone land when it reaches it. Figure 10 walks through the steps to take when planning the route.
This process is repeated when the drone needs to take off from the consumer and return to base/to the lab (if in range). After the route is planned, it can be uploaded to the drone via the transceivers. The operator can then start the mission through QGroundControl. In the following sections, we will walk through constructing the project, providing a guide covering all the steps needed to fully build the prototype. I split this guide into three parts covering the drone, the container and finally the Nav-Q.
CONSTRUCTING THE DRONE
This section will give an overview of the steps needed to get the drone kit up and running. Most of it will be links to external guides from NXP.
Step 1: Buying the drone kit: The drone kit can be purchased from the NXP website. The kit includes all components needed to assemble the drone, together with a tool kit to help you get going. Start off by buying the kit.
Note that you will also have to buy a radio transceiver set appropriate for your region; this can be purchased together with the drone on NXP’s website. You will also have to buy a battery, as this does not ship with the drone. It is recommended that you buy a 5,000mAh, 3S, 11.1V battery with an XT60 connector.
Step 2: Assembling the drone: After you receive the kit, you can start putting everything together. NXP has a great series of videos and articles that detail assembling the drone, guiding you through putting it together in a few hours.
Step 3: Finishing setup: After you have finished building the drone, it’s time to get everything connected and ready to go. There are several NXP articles that must be followed to get the drone ready to fly, covering setting up the radio controllers, programming the FMUK66 for first use and PX4 configuration using QGroundControl. Make sure to go through all of them. You should also follow the guides on the subpages and follow any external links provided to ensure everything is ready to go.
Step 4: Test flight: Now that everything is ready, you can move on to testing the drone. Refer to NXP’s flight guide detailing what should be done preflight, during flight and after flight. After you have finished the preflight checks and are ready to go, make sure you check the drone flying regulations in your area. I have personally been stopped from flying drones in multiple parks and other areas. Make sure you are flying the drone in an open area with minimal obstacles and away from property, such as vehicles, that you could end up destroying.
Once you are ready to go, make sure you bring your computer outside with you so you can connect the drone to QGroundControl. Connect the drone to the battery, connect the telemetry transceiver to your computer, turn the radio controller on, and get flying. I personally tested manual flight first by flying the drone in position flight mode; the available flight modes are documented by PX4. After getting used to the controls, I started playing around with QGroundControl, first getting the drone to take off and land and then sending simple missions for it to execute.
There really is a lot you can do. I would read the Hovergames GitBook in detail and then scroll through the Dronecode website to get a taste of all the flight modes and options you can try. As a general rule of thumb, don’t fly the drone higher than 4m when starting off, and keep it well within your line of sight. If something goes wrong or unexpected happens, put it in land mode (I would customize a switch on the controller to land the drone). If you start to lose control, ground it immediately. It’s better to deal with the damage to the drone than the damage it could do to you.
There are a couple of issues to be aware of. I, for one, mounted the motors wrong and the drone would essentially shoot propellers at me. NXP’s troubleshooting guide covers a lot of problems. If you can’t find your issue there, the Hovergames discussion forum on Hackster is the place to go. It’s most likely that someone has had the problem before you and found a fix for it. Enjoy flying!
CONSTRUCTING THE CONTAINER
After you’re familiar with the drone, it’s time to move on to the container. This is the part of the project that carries the test kits. This section will walk you through its construction.
Step 1: Getting the components: Unlike with the drone kit, the components for the container cannot be found in a kit and have to be bought separately. All the components can be bought off the shelf at different retailers. A list of these components is found in the earlier MCU and Other Hardware section.
Step 2: Connecting the Circuit: After you have all the components, it is time to put them together. Begin by connecting the antenna to your MKR GSM and then inserting your Hologram SIM card in the slot below the device. After you have the MKR GSM ready, move on to connecting the other components to the device (Figure 11). You may choose to do this using a breadboard as I did or soldering the components together. Ensure the servo is connected to the MKR GSM via the logic level converter appropriately. Great, now we will move on with the backend.
Step 3: Preparing Hologram IoT: Next, you need to register the SIM card you are using with Hologram and get your account set up. You can follow Hologram’s getting-started guide to do this. The process is quite straightforward. Note that you will have to input your credit card details to pay for using the SIM card. Make sure you add a balance of at least €2 (about $2.50) to your account before continuing.
Step 4: Preparing IoT Central: Before you can move on, ensure you have a Microsoft account and a Microsoft Azure account; create your free accounts if required. Now that you’re ready, go to apps.azureiotcentral.com/myapps.
1) When you reach the page, you may be asked to sign into your account. Ensure you sign in with the Microsoft Azure account you created. You will see a button in the top left of the screen saying New App. Click this button to create a new application.
2) Input a name for your application in the next window and ensure that Standard 2 is selected for the pricing plan (it’s free for two devices). Select your subscription and make sure to select the region you’re in. Now click Create. And that’s it, your app is ready for configuration. I will explain that in the next step.
Step 5: Setting up the device template: Before moving on, please make sure to clone the project’s GitHub repo to your machine.
1) Navigate to the /iotcentral folder where you will find a JSON file. You will need this file for this stage.
2) Start off by navigating to the Device Templates tab from the menu on the left of the screen. Create a device template by pressing the button on the page. On the next screen, select IoT Device and click Next.
3) Click on Import a Model on the next screen and upload the JSON file from the GitHub repo. This essentially keeps a list of all variables the device will send and receive from the backend.
4) You should see the menu on the left populating. The next thing to do is to go into the Customize tab. This allows you to set nicknames for different property states. For example, the device represents the presence of the test kit inside the container through the boolean Test Kit Presence. If you expand the respective variable from the list, you can change the value displayed when the variable is true or false, as in Figure 12. This is not necessary but does improve the UX.
5) Finally, click on the Publish button from the top of the page to publish the template.
Step 6: Creating a device: You are nearly there. Now you need to create a device in the backend. First, navigate to the Devices tab from the menu on the left of the screen. Now click on Create a Device. In the dialog box, give the device a name (avoid using spaces) and select the device template you just created. Click on Create.
MORE CONTAINER STEPS
Step 7: Preparing the container code: Let’s get back to the frontend for a minute. Locate the /container folder in the GitHub repo; this contains all the code you will be running on the MKR GSM. Before you go any further, open the code either in the Arduino IDE or in VS Code if you have the Arduino extension. First, make sure you install the board support needed to operate the MKR GSM: install the Arduino SAMD board package, following Arduino’s guide if needed, and allow the installation of all drivers when requested.
The code relies on several libraries you will need to install: Keypad, Servo, SparkFunHTU21D, MKRGSM, Arduino_MKRENV, RTCZero, PubSubClient and ArduinoHttpClient. If you are not familiar with installing libraries, Arduino’s library installation guide walks you through the process. Some of the libraries are available directly through the Arduino Library Manager.
After all the libraries are successfully installed, compile the code. If the code compiles without any errors, you are ready to go. If you get an error, read through the message given. Arduino error messages are quite good so you should be able to identify the problem. If you need help, feel free to get in touch with me and I will do my best to assist you.
Step 8: Adding authentication: Now that everything is more or less in place, you need to add your authentication tokens in the code so the device can connect to the backend.
1) Start off by navigating to your device in IoT Central.
2) Click on the device and then select the Connect button from the top right of the screen. You will see a pop-up appear with credentials that will be used to authenticate the device’s connection (Figure 13).
3) You are interested in the first two values: the ID scope and Device ID. Now open the /container folder from the project’s GitHub repo and navigate to /container/configure.h. This file keeps all the keys and settings as well as global variables the project uses.
4) The file should look something like Figure 14. You are interested in lines 15 to 17. These hold the credentials used to connect to the backend. Copy the ID scope field from IoT Central and paste it in the iotc_scopeId definition. Then copy the Device ID into the iotc_modelId definition.
5) Now you need to get the enrolment key for the device. In IoT Central, navigate to the Administration tab from the bottom of the menu on the left of the screen. Click on Device Connection from the next menu.
6) Now click on the SAS-IoT-Devices option from the list that appears. In the next window, copy the Primary Key into the iotc_enrollmentKey definition. And that’s it! You have all the variables you need to connect.
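For reference, the credentials portion of /container/configure.h ends up looking roughly like the excerpt below. The definition names come from the steps above, but the values are placeholders and the real file’s layout may differ:

```cpp
// configure.h (excerpt, illustrative) -- IoT Central DPS credentials.
// Replace each placeholder with the value from the Connect dialog
// (Step 8.3-8.4) and the SAS-IoT-Devices primary key (Step 8.5-8.6).
#define iotc_scopeId       "0ne0XXXXXXX"      // ID scope from IoT Central
#define iotc_modelId       "my-container"     // Device ID created in Step 6
#define iotc_enrollmentKey "base64-primary-key-from-SAS-IoT-Devices"
```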
Step 9: Flashing and troubleshooting: You are now ready to flash the code to the MKR GSM. To do this, connect it to your computer via a micro-B to A cable and open the code in the Arduino IDE or VS Code. Ensure the right port is selected and the MKR GSM device is also available. Flash the code to the device and open the serial monitor.
The device will set up its GSM connection, get the time from a server, lock onto its location and start connecting to IoT Central. The debug output is simple and clear, so you can keep track of this along the way. Ideally, your device should now be connected to the backend. If the authentication succeeded, the device will sync its twin properties and start streaming telemetry data to the backend. But there is a chance that you encountered an error along the way. Here are some errors and quick fixes:
Not connecting to GSM Network—If you are using a SIM provider other than Hologram, you may need to input your APN data and your SIM PIN. Add this data to the respective variables in /container/configure.h. Also, make sure you’re close to a window when connecting.
Not getting the time or getting the wrong time—If you are not receiving a timestamp from the server, try another time server or use GPS time instead. If the timestamp is offset by a number of hours, you can adjust this through the timeZone variable in /container/configure.h.
Failed to connect to IoT Central (rc-2, rc-5 errors)—These errors are caused by IoT Central refusing to sign the connection certificate of the device. There are two things to check here: first, ensure the timestamp is correct. The MKR GSM will print this out in the console at different stages. Second, double check your credentials to make sure everything checks out.
Weird data, components not working—Ensure the wiring is correct. Be careful with this as there is voltage regulation involved and connecting components incorrectly can damage the MKR GSM and the respective components.
Finally, there are several places you can check if data is being sent. First, check in the console. The data points collected will be printed out in a JSON format when they are collected. If data is printed out here, you will know that the device successfully collected data locally.
The next place you can check is your Hologram dashboard. If your device is sending data via the cellular network, pings will be visible there. Click on the device and see if messages are being sent. If that checks out, you can check the data being received in IoT Central by clicking on the device and then navigating to Raw Data. This will show all data sent in a list over time. If the table is populated, it shows that the data is getting to IoT Central. Well, I hope that helped. If you have a bug that you can’t seem to fix, feel free to get in touch with me and I will try my best to help you.
Step 10: Deploying: Now that the device is streaming data to the backend, we are ready to deploy it. In /container/configure.h, set the debugging variable to false so that data is no longer printed to the serial monitor (note that some data may still be printed). Flash this to the device and then connect it to the power bank. That’s it!
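If you prefer to flip the flag from the command line rather than in the editor, a one-line sed does the job. The declaration below is a stand-in; check the actual line in /container/configure.h before adapting this:

```shell
# Sketch: flipping the debugging flag before a release build.
# The declaration is a stand-in for the real line in /container/configure.h.
printf 'bool debugging = true;\n' > configure.h   # stand-in for the real header
sed -i 's/debugging = true/debugging = false/' configure.h
cat configure.h                                   # prints: bool debugging = false;
```

This is handy if you script your builds and want to guarantee debug output never ships.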
Step 11: The enclosure: Now that the container is ready, we need to enclose all the components while leaving room for the test kit. All the design files are in the GitHub repo under /Enclosure. These parts were designed in SolidWorks. You can either 3D print the parts or CNC-machine them. The assembly is quite straightforward. Finally, place the components in the container and glue it together.
The drone kit comes with a stand that can be attached to the front of the drone. I glued this to the top of the enclosure and slid the enclosure into place on the drone. I then attached the power bank onto the back of the drone and connected it to the container via a USB wire.
Step 12: The dashboard: One final thing you need to do is create a dashboard where you can monitor the data streamed by the container and interact with it remotely. In this project, the dashboard is comprised of three parts. I will go through all of them in this section.
1) Start off by adding a settings view to the device. This is a page that allows you to change the PIN and other settings on the device. Navigate to your device template and then click on Views from the menu. In this window, create the first option available.
2) Now you can customize your view. Name it “Settings” and choose the datapoints you want to display.
3) After you click Add, all the selected datapoints will be added. That’s it, you’re done! By the way, you can find these views populated with data if you click on the device.
Now, let’s deal with the device specific dashboard. This is optional. This dashboard will display the data specific to the respective device that sent it.
1) Start off by going back into the Views option from the menu and this time select the second option.
2) You will be directed to a richer interface with multiple options. On the left, you will see a menu allowing you to select telemetry and properties to display. To add a card, click on the data point and then click Add Tile at the bottom of the screen.
3) The design of the dashboard is up to you. Spend some time experimenting with it. You can add maps to map the location of the device, and cards and graphs to display data. Each card will have options available in the top right corner of the card. Play around with the card type and interpolation. Start adding more cards to the canvas to display all the information in a neat manner. Mine ended up looking something like Figure 15.
Remember that this view is accessible from the device in the cloud and will be populated with the data sent by the specific device. Now, finally, you can create a dashboard that can display data from multiple devices at a time.
1) Start off by going into the Dashboard from the menu on the left. Click the New button to create a new dashboard. This interface is very similar to the device specific dashboard.
2) The only difference is that you need to select the device group and devices you want to display the data from. If there are multiple devices, the data from all of them can be displayed on one dashboard.
3) Continue by selecting the telemetry and properties to display like in the device specific dashboard. You can make the design identical to the device specific dashboard if you wish. Note that there are different tiles such as Markdown tiles and image tiles that can be used. And that’s it! The container is done!
CONSTRUCTING THE NAV-Q
Finally, the last thing you need to do is set up the NavQ companion computer to stream video from the drone using GStreamer.
Step 1: Getting the kit and setting things up: First, the kit can be purchased from the EMCraft website. Before going any further, please read through the NavQ GitBook and set the device up using the 8MMNavQ Quick Start Guide. The key sections to concentrate on in that guide are Powering the NavQ, Using NavQ as Desktop Computer, Downloading and Flashing Latest Linux Image, and Connecting to WiFi. Make sure to run sudo apt-get update and sudo apt-get upgrade. Also ensure the nano editor is installed.
Step 2: SSHing into the device remotely: It can be quite inconvenient to have to connect the NavQ to an external monitor and connect peripherals to it. We can SSH into it remotely using PuTTY.
1) Install the appropriate version of PuTTY from the official website.
2) The next thing you want to do is get the IP address of your NavQ device. You can do this by running ifconfig -a in the terminal on the NavQ or checking your router home page.
3) Open PuTTY on your computer, paste the IP address of the NavQ into the Host Name (or IP address) field, and make sure port 22 is selected. You will be asked for a username and password (both are “navq”) to connect.
4) After you input the credentials, you will be connected to the device remotely. Now let’s configure GStreamer.
Step 3: Running the GStreamer pipeline: Before running the pipeline, we need to disable the firewall on the base computer so the stream can get through. Open the firewall settings on your computer and disable the public network firewall if you’re on a hotspot, or the private network firewall if you’re on Wi-Fi.
1) We need to make a small adjustment in QGroundControl now. Go into application settings and find the Video Settings section. Enable it by changing the source to UDP h.264 Video Stream. This may be in a different place depending on your version of QGroundControl.
2) All the packages needed to run the pipeline are already included. Run the following script in the terminal to turn the pipeline on. Make sure you get the IP address of your computer (using the same method as before) and input it in place of the xxx.xxx.xxx.xxx placeholder:
sudo gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480,framerate=30/1 ! vpuenc_h264 bitrate=500 ! rtph264pay ! udpsink host=xxx.xxx.xxx.xxx port=5600 sync=false
3) If everything worked, you should get footage from your camera. If the stream does not appear, double-check the IP address used for the computer and that the script is correct. Make sure the firewall is disabled, too.
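Because the pipeline string is long and easy to mistype, one convenient pattern is to keep the ground-station IP in a variable and build the command from it. This is a convenience sketch; GCS_IP is a placeholder you must replace with your computer’s actual address:

```shell
# Build the gst-launch-1.0 command from one variable so the IP lives in one place.
GCS_IP="192.168.1.10"   # placeholder: replace with your computer's IP address
CMD="gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480,framerate=30/1 ! vpuenc_h264 bitrate=500 ! rtph264pay ! udpsink host=${GCS_IP} port=5600 sync=false"
echo "$CMD"             # inspect the full command before running it
# eval "sudo $CMD"      # uncomment on the NavQ to actually start the stream
```

Printing the command before running it lets you spot a wrong IP or a dropped pipeline element at a glance.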
Step 4: Automating the GStreamer pipeline: Although it is great that the pipeline is online, you don’t want to have to SSH into the device and run the script every time it turns on. You need the NavQ to start streaming video automatically when it connects to the internet. You can use systemd’s systemctl tool to do this. Listing 1 shows the steps and code to set this up.
Listing 1
These are the steps for automating the GStreamer pipeline using the systemctl tool.

1. Start off by making a new directory, /home/covidtestdrone. Next, cd into the directory and create a new file using sudo nano. The code for the file is available in the GitHub repo (/NavQ) and shown below.

#!/bin/sh
sudo gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480,framerate=30/1 ! vpuenc_h264 bitrate=500 ! rtph264pay ! udpsink host=xxx.xxx.xxx.xxx port=5600 sync=false

2. Paste the script above into the file, once again replacing xxx.xxx.xxx.xxx with your computer’s IP address. Save the file as run-gstreamer.sh. Now make the script executable by pasting the following command in the terminal:

sudo chmod u+x run-gstreamer.sh

3. Great, now we want to integrate systemctl. Run the following command in the terminal:

sudo systemctl edit --force --full covidtestdrone.service

4. A nano window will open, prompting you to type. Copy and paste the following text into the terminal:

[Unit]
Description=My Script Service
Wants=network-online.target
After=network-online.target

[Service]
Type=simple
WorkingDirectory=/home/covidtestdrone
ExecStart=/home/covidtestdrone/run-gstreamer.sh

[Install]
WantedBy=multi-user.target

5. This is a unit file that tells the system which script to execute and under which conditions. Note that the ExecStart path must match the script name exactly (run-gstreamer.sh).

6. Now save the file with the default name. Run the following to check the status of the service:

sudo systemctl status covidtestdrone.service

7. You should get a response showing the service is loaded. Now we need to enable the service so it runs on boot:

sudo systemctl enable covidtestdrone.service

8. Now we want to see if the service functions. Start the service with the following command:

sudo systemctl start covidtestdrone.service

9. This will force-start the service. Run the status command again:

sudo systemctl status covidtestdrone.service

10. You should get a video feed in QGroundControl. You can now reboot the device, and the video feed should come in automatically:

sudo systemctl reboot

That’s it! You’re done!
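As an alternative to typing the unit file by hand, you can generate it from the script path, so the ExecStart line can never drift out of sync with where the script actually lives. This is a sketch, writing the unit file to the current directory; in practice, systemctl edit places the real one for you:

```shell
# Generate the systemd unit from the script path so ExecStart always matches.
# Written to the current directory here; `systemctl edit` manages the real file.
SCRIPT=/home/covidtestdrone/run-gstreamer.sh
cat > covidtestdrone.service <<EOF
[Unit]
Description=My Script Service
Wants=network-online.target
After=network-online.target

[Service]
Type=simple
WorkingDirectory=$(dirname "$SCRIPT")
ExecStart=$SCRIPT

[Install]
WantedBy=multi-user.target
EOF
grep 'ExecStart' covidtestdrone.service   # prints: ExecStart=/home/covidtestdrone/run-gstreamer.sh
```

Keeping the path in one variable is a small guard against the classic failure mode where the service silently does nothing because the script was renamed.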
Step 5: Attaching the NavQ to the drone: Finally, the NavQ has to be attached to the drone. The corresponding section of the Quick Start Guide in the NavQ GitBook walks through attaching the NavQ to the drone using components that come with the device.
Nice! Everything is done now, and the drone is ready to fly! Thank you for reading and I hope you enjoyed it! If you have any questions or need help, feel free to reach out to me.
 NXP Hovergames drone kit
 GitHub for COVIDTestDrone project
 Arduino MKR GSM 1400
 Microsoft Azure IoT Central
Adafruit | www.adafruit.com
Arduino | www.arduino.cc
Arm | www.arm.com
Dronecode Foundation | www.dronecode.org
Fritzing | www.fritzing.org
Hologram | www.hologram.io
MavLink | https://mavlink.io/en
Microsoft Azure | www.azure.microsoft.com
NXP Semiconductors | www.nxp.com
PX4 Autopilot | https://px4.io
QGroundControl | http://qgroundcontrol.com
PUBLISHED IN CIRCUIT CELLAR MAGAZINE • SEPTEMBER 2021 #374
Andrei Florian is a student in Dublin, Ireland. He has been working on tightening the connection between humans and technology by designing applications that will help us in our lives. This includes working on projects that combat pollution and climate change as well as monitoring our natural environment and our cities. He has also been working on personal security and big data. Andrei can be contacted at firstname.lastname@example.org