Affordable 3D Map Creation
360-degree LIDAR devices are expensive. Find out how these four engineering students crafted an affordable solution by attaching a flat-plane LIDAR to a mounted stepper motor, and then combining the planes and data to represent a 360-degree sweep. They then developed software that virtually builds the space detected by the LIDAR.
A gamer, two engineers and a construction worker walk into a bar. What do they make? Well, after a few too many drinks, one of them asks, “Why can’t we digitally map this bar, so that we can turn it into our own meeting area?” This question was the impetus behind the Calron360 Point Cloud Mapping Tool: an innovative way to map out and transmit a virtual rendering of any room. Users can recreate any space they scan, and use it for a plethora of applications.
All bad “guys walk into a bar” jokes aside, our team realized it would be extremely useful to be able to digitally map any space, such as a busy construction site or office building. Relying on our experiences working in the armed forces, we came up with the idea to use a Light Detection and Ranging (LIDAR) device as the medium for scanning a space. LIDAR emits a concentrated beam of light and measures its reflection, creating a coordinate data point that represents where the reflecting object is. LIDAR meets all our needs for a safe scanning tool on a busy job site, and it has the accuracy to provide a clear scan of any room and a good virtual representation of the space.
LIDARs are expensive. We ran into an issue sourcing a LIDAR that was affordable, and this, in turn, led us to develop our project—with cost being a key consideration. Most LIDARs on the market and in use as scanning tools are well outside the average person’s budget. The average price for a small unit that scans a space at a 360-degree sweep is well over $10,000. The LIDAR we found was just $400, but only scanned on a flat plane. We came up with the idea to attach it to a mounted stepper motor, and then combine the planes and data to represent an affordable 360-degree sweep. This idea also led us to develop our Unity software, so that we could build the space virtually and even walk through the space using a virtual reality helmet.
The ideas from our four intrepid characters in this story resulted in the creation of a package that contains everything needed to map a small space and create a point cloud representation of the space—a scanner using a LIDAR module from the construction worker, a customized control node and comma separated value (CSV) file from the engineers, and a custom Unity engine from the gamer.
THE CONSTRUCTION WORKER
First, our construction worker wanted both accuracy and affordability in the scanner. Being the most prudent and cost-minded of the group, he made affordability a major point. To this end, we built the scanner at a cost that is within even a hobbyist’s budget. He also wanted it to scan a space and replace the large amount of labor needed to post-check the construction against the schematics. The hardware description of this scanner is shown in the flow chart in Figure 1. The accuracy of the scan will, of course, be dependent on the LIDAR module used in the project.
Because we could not afford a LIDAR capable of a full 3D sweep—and we wanted to keep the entire unit affordable—we chose Slamtec’s RPLIDAR A2M8 360-degree Laser Range Scanner. This unit is affordable (only around $400), and it has enough range to scan most rooms or indoor spaces with reasonable accuracy. Our LIDAR unit has only 10 meters of range for an accurate scan, and it scans on only one plane. This can produce a 2D data set, but not the 3D set we required. Now that we had our LIDAR, we needed a reliable way to rotate it through the other plane to create that 3D data set.
This is where a stepper motor comes into play. We selected the 28BYJ-48 stepper motor from MikroElektronika to control our mount. It divides its rotation into steps, in our case 2,048 steps per 180 degrees of rotation. By dividing the rotation into steps and calculating the number of steps per degree, we can determine the angle for each scan on the vertical axis as the stepper motor rotates.
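The step-to-angle arithmetic is straightforward. As a minimal sketch (assuming the 2,048-steps-per-180-degrees figure above), the vertical tilt angle for any motor position follows directly:

```python
# Convert a stepper-motor position into a tilt angle, using the
# article's figure of 2,048 steps per 180 degrees of rotation.
STEPS_PER_HALF_TURN = 2048
DEGREES_PER_STEP = 180.0 / STEPS_PER_HALF_TURN  # ~0.088 degrees per step

def step_to_angle(step_count):
    """Return the vertical tilt angle (in degrees) for a step count."""
    return step_count * DEGREES_PER_STEP
```

With roughly 0.088 degrees per step, the motor's resolution is far finer than the LIDAR's angular resolution, so the stepper is not the limiting factor in scan accuracy.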
Using the stepper motor to provide the second axis of rotation was key to keeping our unit affordable, because most 3D LIDARs include a full vertical and horizontal sweep, whereas our model does only a 2D sweep. The stepper motor requires a driver to provide enough torque and keep the windings energized. We used the ULN2003 motor driver (available from Texas Instruments and others), which supplies enough current to drive the stepper motor and provide the torque to turn our pintle mount. To control the motor driver and the LIDAR, we brought our processing requirements together and implemented a Raspberry Pi 4.
MCU AND CUSTOM Pi HAT
(Construction Worker Continued). We selected the Raspberry Pi 4, because we needed a processor that could source enough amperage to run the LIDAR, and enough processing power to create the point cloud data set in a reasonable amount of time. The Pi 4 was perfect for this, because the manufacturer had upped the current it was capable of sourcing to 2A, which is just above what we needed to run the stepper motor and LIDAR without overdrawing the Pi. The Pi 4 also provides us with an abundance of processing power that not only allows us to easily handle the current processing, but also has a good deal of room for expansion going forward.
Even with all the power the Pi 4 offered, considerable customization was required to adapt it to our needs. First, we needed a reliable way to attach our stepper motor driver and all its control pins and power to the Pi 4. We designed a custom expansion board for the Pi 4, also known as a Pi HAT, because of the way it is seated on top of the Pi like a little hat (Figure 2).
During testing, we discovered that the Pi 4 would heat up more than desired. To combat this, we added a mounting for a fan in the center of the Pi HAT, which served to cool the CPU on the Pi (see our Altium design in Figure 3). The other main reason for selecting the Pi 4 was for the USB C input that could handle the data transfer rate of the LIDAR module. Having a fast connection enables all the data to be read accurately at a near-real-time pace.
Another requirement is that scans must run uninterrupted, which means we needed a battery that could last through a complete scan. We selected a 6-cell Li-ion battery pack with a built-in charge controller, which allows the pack to be charged externally and keeps the power supplied to our unit stable and clean.
We included space for two of these battery packs to be mounted in the unit, and ensured they can be run in parallel, essentially doubling the charge lifespan. Using this approach to power our unit, we have ensured that the pack can easily run the entire unit, including the stepper motor and LIDAR, for several full, continuous scans. The battery packs also serve as a weight to counterbalance the pintle mount and LIDAR module on top of the custom housing.
Our unit sits inside a custom housing (Figure 4) that consists of a top pintle mount, on which our LIDAR is mounted and rotates with the stepper motor. Below this is the main housing that contains the stepper motor, drive controller, Pi 4, Pi HAT and both battery packs. We 3D-printed the housing, and everything is fitted to our components. We also added intakes for airflow to the fan cooling the Pi 4. Figure 5 shows a panned-out view of the scanner, with the housing disassembled and the Pi 4 plus Pi HAT on display.
THE ENGINEERS
The engineers understood that the data manipulation needed to be powerful enough to retain the accuracy they wanted for the space, but also fit into a file small enough to be emailed easily. To accomplish this, they did most of the computations on the device, created a point cloud data array, and loaded it into a CSV file for transmission. To start, they needed a way to pull the data from the LIDAR.
We selected the Robot Operating System (ROS) to pull the data, since it had been used in conjunction with robots to communicate with LIDAR sensors in the past. Unfortunately, we had quite a bit of trouble getting it to work with our module. In the end, we had to write our own control node to communicate with our LIDAR and pull the necessary data to create our point cloud data set.
With our point cloud data from the LIDAR gathered, we still had to combine it with the rotational data from our stepper motor, to place the points at the proper angle from the origin. This proved to be a tough design decision, because there were many ways to combine all this information. Ultimately, we used an amalgamation of several methods. In our customized method, each rotation of the stepper motor generated a scan file, and we would stitch these scan files together to create a full set of data incorporating our rotational angle and the LIDAR XYZ data array. This stitched file took the form of a customized CSV file.
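The core geometry of the stitching can be sketched as follows. This is not the team's exact amalgamated method, just a minimal illustration of the idea: each planar scan gives an angle and a distance within the scan plane, the stepper gives the tilt of that plane, and rotating each in-plane point by the tilt places it in 3D. The function name and data layout are hypothetical:

```python
import math

def scan_to_points(tilt_deg, scan):
    """Place one planar LIDAR scan into 3D space.

    tilt_deg: stepper-motor tilt of the scan plane, in degrees.
    scan: iterable of (angle_deg, distance_m) pairs from the LIDAR.
    Returns a list of (x, y, z) tuples in meters.
    """
    phi = math.radians(tilt_deg)
    points = []
    for angle_deg, r in scan:
        theta = math.radians(angle_deg)
        # Point in the LIDAR's own 2D scan plane.
        x, y = r * math.cos(theta), r * math.sin(theta)
        # Rotate that plane about the x-axis by the tilt angle.
        points.append((x, y * math.cos(phi), y * math.sin(phi)))
    return points
```

One such call per stepper position yields one scan file's worth of points; concatenating the results is the "stitching" step.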
The final part of our hardware end was to take all the data we had generated and turn it into a custom CSV file that contained properly converted data. To accomplish this, we attempted to use a combination of NumPy and Pandas to convert our point cloud data into a format that worked with our Unity engine. Both are Python libraries that enable us to apply higher-level math to convert the data sets to standard distances and angles.
To our dismay, this did not work, and neither of these libraries could do what we required. Therefore, we wrote our own libraries as substitutes, and they worked perfectly for our project. With the new libraries in place, we were able to generate a CSV file that would load into our Unity software. All this data coming together created our finalized file, which could be sent to any Unity-capable machine and, with our software, create the virtual space.
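The file-writing step itself is simple even without those libraries. As an illustrative sketch only (the team's actual customized CSV layout is their own), Python's standard csv module can emit one point per row:

```python
import csv

def write_point_cloud(path, points):
    """Write (x, y, z) points, one per row, to a CSV file.

    The x/y/z column layout here is illustrative; the actual file
    format used by the Calron360 team is their own customized CSV.
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["x", "y", "z"])
        writer.writerows(points)
```

Because CSV is plain text, the resulting file compresses well and attaches to an email without any special handling.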
THE GAMER
Our gamer wanted to create a virtual environment that lets you move around in the space. He had been using the Unity game engine to great effect lately, and decided it should be used to read the point cloud data and render it in 3D. Then a mesh was added to the space (Figure 6) to give it solid, virtual surfaces.
The first hurdle was reading the point cloud data into the engine. To read the CSV file into Unity, our gamer had to create a customized script to translate our XYZ format into Unity’s expected XYZ format, which was slightly different from ours. With this data loaded, we could create the framework for our Unity space. The point cloud data served as the base from which we could build the space, almost like building a house (Figure 7). Once the origin was set for the space, we plotted the data, creating points around the origin. Then we layered meshes over top of these points to reduce the number of points needed to generate a wall or a solid surface. These meshes enabled us to represent a large amount of data in Unity with the available hardware.
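The axis translation mentioned above comes from Unity using a left-handed, Y-up coordinate convention, while scanner data is commonly Z-up. The article does not give the team's exact script, but one common mapping simply swaps two axes, sketched here in Python:

```python
def to_unity_xyz(point):
    """Map a Z-up scanner point to Unity's Y-up convention.

    This swap (y and z exchanged) is one common conversion; the exact
    translation the Calron360 script performs is not specified in the
    article, so treat this as an assumption.
    """
    x, y, z = point
    return (x, z, y)
```

Whatever the exact swap, doing it once at load time means the rest of the Unity code can treat the data as native.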
Unity works well for our purpose because, as a game engine, it has many tools for implementing meshes, which made creating our surfaces more streamlined. Representing each point from our point cloud data as a solid object proved resource-heavy and not feasible for the average computer. For this reason, we implemented a mesh rendering that replaced each point with triangles. A mesh consists of two arrays: one of points known as vertices, and one of triangle indices. For our mesh, we combined two triangles at each point to create a square, for better surface coverage. By allowing the two triangles to share two points, we can seam them together.
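The two-array idea can be made concrete with a small sketch. Unity meshes are built in C#, but to keep one language in this article, here is the same structure in Python: one square around a point, built from four vertices and two triangles that share two corners. The function name and sizing are illustrative, not the team's actual code:

```python
def quad_mesh(center, size):
    """Build the two-array mesh representation (vertices, triangle
    indices) for one square centered on a point.

    Two triangles share corners 1 and 2, seaming into a single quad,
    mirroring how Unity's Mesh stores vertices and triangle indices.
    """
    x, y, z = center
    h = size / 2.0
    vertices = [
        (x - h, y - h, z),  # 0: bottom-left
        (x + h, y - h, z),  # 1: bottom-right
        (x - h, y + h, z),  # 2: top-left
        (x + h, y + h, z),  # 3: top-right
    ]
    # Each group of three indices is one triangle.
    triangles = [0, 1, 2, 2, 1, 3]
    return vertices, triangles
```

Replacing each cloud point with one quad like this covers a surface with far fewer rendered objects than drawing every point as a solid.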
We created a custom control panel that gives the user many options for accessing Unity’s tools, to control the space and apply meshes. Controls for orbiting the space for demonstration purposes include speed options, so you can slow the orbit for a close examination when showing the space to clients, for example. We also included mesh-rendering options, so users can add or remove meshes. The three options for the meshes are: no mesh (just the raw point cloud data), 2D square meshes, and 3D cube meshes. Finally, there is the option to use an attached VR headset compatible with the Unity engine, for a virtual walkthrough of your space.
RESULTS AND CONCLUSION
The results of the first generation of our scanner surpassed our expectations. We succeeded in accurately scanning a room and creating a point cloud array that represented it. We kept enough points to make the representation accurate while keeping the file small and manageable. The point cloud array feeds into our CSV file through our customized ROS node, along with the positional data, to create a file that can be emailed anywhere. To top it off, our Unity software quickly and accurately loads and renders the point cloud data via its tool set and control panel. Figure 8 shows our team working out of our cupboard at Camosun College. We turned the cupboard into a working space with a power source and parts kits.
Calron360 is the successful product of a collaborative effort whose humble beginnings are the stuff of “guys walk into a bar” jokes. Our first-generation scanner, the Calron360, scans in a timely manner and produces a beautiful rendering of the space in our 3D environment. Then we can step into our virtual world: whether we are across the city or across the world from each other, we can now email a virtual bar to each other and meet in a place we know and recognize.
28BYJ-48 stepper motor
ULN2003 Motor Driver
Raspberry Pi 4
RPLIDAR A2M8 360° Laser Range Scanner
PUBLISHED IN CIRCUIT CELLAR MAGAZINE • APRIL 2020 #357