Raspberry Pi-Based Network Monitoring Device

In 2012, Al Anderson, IT director at Salish Kootenai College in Pablo, MT, and his team wired the dorms and student housing units at the small tribal college with fiber and outdoor CAT 5 cable to provide reliable Internet service to students. “Our prior setup was wireless and did not provide very good service,” Anderson says.

The 25 housing units, each with a small unmanaged Ethernet switch, were daisy-chained along several different paths. Anderson needed a way to monitor the links from the system’s Simple Network Management Protocol (SNMP) network monitoring software, Help/Systems’s InterMapper. He also wanted to ensure the switches installed inside the sun-exposed utility boxes wouldn’t get too hot.

Photo 1: The Raspberry Pi is a small SBC based on an ARM processor. Its many I/O ports make it very useful for embedded devices that need a little more power than the typical 8-bit microcontroller.

His Raspberry Pi-based solution is the subject of an article appearing in Circuit Cellar’s April issue. “We chose the Raspberry Pi because it was less expensive, we had several on hand, and I wanted to see what I could do with it,” Anderson says (see Photo 1).

The article walks readers through each phase of the project:

“I installed a Debian Linux distro, added an I2C TMP102 temperature sensor from SparkFun Electronics, wrote a small Python program to get the temperature via I2C and convert it to Fahrenheit, installed an SNMP server on Linux, added a custom SNMP rule to display the temperature from the script, and finally wrote a custom SNMP MIB to access the temperature information as a string and integer.”
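
A minimal sketch of that temperature read, assuming the python-smbus bindings and the TMP102 at its default 0x48 address (Anderson’s complete program appears in the April issue):

import smbus

bus = smbus.SMBus(0)  # I2C bus 0 on early Raspberry Pi boards; bus 1 on later revisions
TMP102_ADDR = 0x48    # default address with the ADD0 pin tied to ground

# The first register holds the 12-bit temperature reading, 0.0625 degrees C per bit
msb, lsb = bus.read_i2c_block_data(TMP102_ADDR, 0, 2)
raw = (msb << 4) | (lsb >> 4)
if raw & 0x800:       # sign-extend negative temperatures (two's complement)
    raw -= 4096
celsius = raw * 0.0625
fahrenheit = celsius * 9.0 / 5.0 + 32.0
print(fahrenheit)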

Setting up the SBC and Linux was simple, Anderson says. “The prototype Raspberry Pi has now been running since September 2012 without any problems,” he says in his article. “It has been interesting to see how the temperature fluctuates with the time of day and the level of network activity. As budget and time permit, we will be installing more of these onto our network.”

In the following excerpt, Anderson discusses the project’s design, implementation, and OS installation and configuration. For more details on a project inspired, in part, by the desire to see what a low-cost SBC can do, read Anderson’s full article in the April issue.

DESIGN AND IMPLEMENTATION
Figure 1 shows the overall system design. The TMP102 is connected to the Raspberry Pi via I2C. The Raspberry Pi is connected to the network via its Ethernet port. The monitoring system uses TCP/IP over the Ethernet network to query the Raspberry Pi via SNMP. The system is encased in a small acrylic Adafruit Industries case, which we used because it is inexpensive and easy to customize for the sensor.

Figure 1: The system is designed around the Raspberry Pi SBC. The Raspberry Pi uses the I2C protocol to query the Texas Instruments TMP102 temperature sensor. The Raspberry Pi is queried via SNMP.

Our first step was to set up the Raspberry Pi. We started by installing the OS and the various software packages needed. Next, we wrote the Python script that queries the I2C temperature sensor. Then we configured the SNMP daemon to run the Python script when it is queried. With all that in place, we then set up the SNMP monitoring software that is configured with a custom MIB and a timed query. Finally, we modified the Raspberry Pi case to expose the temperature sensor to the air and installed the device in its permanent location.

OS INSTALLATION AND CONFIGURATION
The Raspberry Pi has no hard drive; a Linux OS compiled to run on its ARM processor, the brain of the device, must be installed on an SD card. Setting up the SD card is straightforward, but you cannot simply copy the files onto the card. The OS has to be copied in such a way that the SD card has a boot sector and the Linux partitioning and file structure are properly maintained. Linux and Mac OS X users can use the dd command-line utility to copy from the OS’s ISO image. Windows users can use a utility (e.g., Win32DiskImager) to accomplish the same thing. A couple of other utilities can also copy the OS onto the SD card, but I prefer using the command line.

A Debian-based distribution of Linux seems to be the most commonly used on the Raspberry Pi, with Raspbian “wheezy” as the recommended distribution. However, for this project I chose Adafruit Learning Systems’s Occidentalis V0.2 Linux distribution because it had several hardware-hacker features rolled in, including the kernel modules for the temperature sensor. This saved me some work getting those installed and debugged.

Before you can copy the OS to the SD card, you need to download the ISO image. The Resources section of this article lists several sources including a link to the Adafruit Linux distribution. Once you have an ISO image downloaded, you can copy it to the SD card. The Resources section also includes a link to an Embedded Linux Wiki webpage, “RPi Easy SD Card Setup,” which details this copying process for several OSes.

The quick-and-dirty instructions are to get the SD card hooked up to your computer, using either a built-in SD reader or a peripheral card reader. I used a USB-attached reader. Then you need to format the card. FAT32 is a fine choice, although the copy command will overwrite the formatting anyway. Next, use your chosen method to copy the OS onto the card. On Linux or Mac OS X, the command:

dd bs=4M if=~/linux_distro.img of=/dev/sdd

will properly copy the OS onto the SD card.

You will need to change two important things in this command for your system. First, the if parameter (the “in file,” i.e., your ISO image) needs to match the file you downloaded. Second, the of parameter (the “out file,” i.e., your SD card device) needs to match the SD card. In case you are wondering why your SD drive is considered a file: in Linux, everything, including devices, is a file. We will see this again in a bit with the I2C device. Be warned: you can toast your hard drive if you put the wrong device path here. If you are unsure, you may want to use a GUI utility so you don’t overwrite your hard drive.

Once the OS is copied onto the SD card, it is time to boot up the Raspberry Pi. A default username and password are available from wherever you download the OS. With our OS, the defaults are “pi” and “raspberry.” Make it your first mission to change that password and maybe even add a new account if your project is going to be in production.

Another thing you may have to change is the IP address configuration on the Ethernet interface. By default, these distributions use DHCP to obtain an address. Unless you have a need otherwise, it is best to leave that be. If you need to use a static IP address, I have included a link in the Resources section with instructions on how to do this in Linux.
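
For reference, on a Debian-based distribution of this era a static address is set in /etc/network/interfaces; a typical stanza looks something like the following (the addresses are placeholders you would adapt to your network):

auto eth0
iface eth0 inet static
    address 192.168.1.50
    netmask 255.255.255.0
    gateway 192.168.1.1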

To access your Raspberry Pi, hook up a local keyboard and monitor to get to a command line. Once you have the network running and you know the IP address, you can use the SSH utility to gain access via the network.

To get SNMP working on the Raspberry Pi, you need to install two Debian packages: snmpd and snmp. The snmpd package is the actual SNMP server software that will enable other devices to query for SNMP on this device. The second package, snmp, is the client. It is nice to have this installed for local troubleshooting.

We used the Debian package manager, apt-get, to install these packages. The commands must also be run as root, the superuser.

The sudo apt-get install snmpd command installs the snmpd software. The sudo part runs the apt-get command as the superuser. The install and snmpd parts of the command are the arguments for the apt-get command.

Next we issued the sudo apt-get install snmp command, which installed the SNMP client. Issue the ps -ax | grep snmpd command to see if the snmpd daemon is running after the install. You should see something like this:

1444 ? S 14:22 /usr/sbin/snmpd -Lsd -Lf /dev/null -u snmp -g snmp -I -smux -p /var/run/snmpd.pid

If you do not see a line similar to this, you can issue the sudo /etc/init.d/snmpd start command to start the service. Once it is running, it is time to turn your attention to the Python script that reads the temperature sensor. Configure the SNMP daemon after you get the Python script running.
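
The full article details the exact SNMP rule and custom MIB. As a rough sketch of the idea, Net-SNMP’s extend directive in /etc/snmp/snmpd.conf can expose a script’s output over SNMP (the script path here is a hypothetical placeholder):

extend temperature /usr/bin/python /home/pi/read_tmp102.py

After restarting snmpd, the script’s output should be readable with the client, e.g., snmpget -v 2c -c public localhost 'NET-SNMP-EXTEND-MIB::nsExtendOutput1Line."temperature"'.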

The Raspberry Pi’s final installation is shown. The clear acrylic case can be seen along with the Texas Instruments TMP102 temperature sensor, which is glued below the air hole drilled into the case. We used a modified ribbon cable to connect the various TMP102 pins to the Raspberry Pi.

An Organized Space for Programming, Writing, and Soldering

Photo 1—This is Anderson’s desk when he is not working on any project. “I store all my ‘gear’ in a big plastic bin with several smaller bins inside, which keeps the mess down. I have a few other smaller storage bins as well hidden here and there,” Anderson explained.

Photo 2—Here is Anderson’s area set up for soldering and running his oscilloscope. “I use a soldering mat to protect my desk surface,” he says. “The biggest issue I have is the power cords from different things getting in my way.”

Al Anderson’s den is the location for a variety of ongoing projects—from programming to writing to soldering. He uses several plastic bins to keep his equipment neatly organized.

Anderson is the IT Director for Salish Kootenai College, a small tribal college based in Pablo, MT. He described some of his workspace features via e-mail:

I work on many different projects. Lately I have been doing more programming. I am getting ready to write a book on the Xojo development system.

Another project I have in the works is using a Raspberry Pi to control my hot tub. The hot tub is about 20 years old, and I want to have better control over what it is doing. Plus I want it to have several features. One feature is a wireless interface that would be accessible from inside the house. The other is a web control of the hot tub so I can turn it on when we are still driving back from skiing to soak my tired old bones.

I am also working on a home yard sprinkler system. I laid some of the pipe last fall and have been working on and off with the controller. This spring I will put in the sprinkler heads and rest of the pipe. I tend to like working with small controllers (e.g., the Raspberry Pi, BeagleBoard’s BeagleBone, and Arduino) and I have a lot of those boards in various states.

Anderson’s article about a Raspberry Pi-based monitoring device will appear in Circuit Cellar’s April issue. You can follow him on Twitter at @skcalanderson.

Q&A: Andrew Godbehere, Imaginative Engineering

Engineers are inherently imaginative. I recently spoke with Andrew Godbehere, an Electrical Engineering PhD candidate at the University of California, Berkeley, about how his ideas become realities, his design process, and his dream project. —Nan Price, Associate Editor

Andrew Godbehere

NAN: You are currently working toward your Electrical Engineering PhD at the University of California, Berkeley. Can you describe any of the electronics projects you’ve worked on?

ANDREW: In my final project at Cornell University, I worked with a friend of mine, Nathan Ward, to make wearable wireless accelerometers and find some way to translate a dancer’s movement into music, in a project we called CUMotive. The computational core was an Atmel ATmega644V connected to an Atmel AT86RF230 802.15.4 wireless transceiver. We designed the PCBs, including the transmission line to feed the ceramic chip antenna. Everything was hand-soldered, though I recommend using an oven instead. We used Kionix KXP74 tri-axis accelerometers, which we encased in a lot of hot glue to create easy-to-handle boards and to shield them from static.

This is the central control belt-pack to be worn by a dancer for CUMotive, the wearable accelerometer project. An Atmel ATmega644V and an AT86RF230 were used inside to interface to the synthesizer. The plastic enclosure has holes for the belt to attach to a dancer. Wires connect to accelerometers, which are worn on the dancer’s limbs.

The dancer had four accelerometers connected to a belt pack with an Atmel chip and transceiver. On the receiver side, a musical instrument digital interface (MIDI) communicated with a synthesizer. (Design details are available at http://people.ece.cornell.edu/land/courses/ece4760/FinalProjects/s2007/njw23_abg34/index.htm.)

I was excited about designing PCBs for 802.15.4 radios and making them work. I was also enthusiastic about trying to figure out how to make some sort of music with the product. We programmed several possibilities, one of which was a sort of theremin; another was a sort of drum kit. I found that this was the even more difficult part—not just the making, but the making sense.

When I got to Berkeley, my work switched to the theoretical. I tried to learn everything I could about robotic systems and how to make sense of them and their movements.

NAN: Describe the real-time machine vision-tracking algorithm and integrated vision system you developed for the “Are We There Yet?” installation.

ANDREW: I’ve always been interested in using electronics and robotics for art. Having a designated emphasis in New Media on my degree, I was fortunate enough to be invited to help a professor on a fascinating project.

This view of the Yud Gallery is from the installed camera with three visitors present. Note the specular reflections on the floor. They moved throughout the day with the sun, and this movement needed to be discerned from a visitor’s typical movement.

For the “Are We There Yet?” installation, we used a PointGrey FireFlyMV camera with a wide-angle lens. The camera was situated a couple hundred feet away from the control computer, so we used a USB-to-Ethernet range extender to communicate with the camera.

We installed a color camera in a gallery in the Contemporary Jewish Museum in San Francisco, CA. We used Meyer Sound speakers with a high-end controller system, which enabled us to “position” sound in the space and to sweep audio tracks around at (the computer’s programmed) will. The Meyer Sound D-Mitri platform was controlled by the computer with Open Sound Control (OSC).

This view of the Yud Gallery is from the perspective of the computer running the analysis. This is a probabilistic view, where the brightness of each pixel represents the “belief” that the pixel is part of an interesting foreground object, such as a pedestrian. Note the hot spots corresponding nicely with the locations of the visitors in the image above.

The hard work was to then program the computer to discern humans from floors, furniture, shadows, sunbeams, and cloud reflections. The gallery had many skylights, which made the lighting very dynamic. Then, I programmed the computer to keep track of people as they moved and found that this dynamic information was itself useful to determine whether detected color-perturbance was human or not.

Once complete, the experience of the installation was beautiful, enchanting, and maybe a little spooky. The audio tracks were all questions (e.g., “Are we there yet?”) and they were always spoken near you, as if addressed to you. They responded to your movement in a way that felt to me like dancing with a ghost. You can watch videos about the installation at www.are-we-there-yet.org.

The “Are We There Yet?” project opens itself up to possible use as an embedded system. I’ve been told that the software I wrote works on iOS devices by the start-up company Romo (www.kickstarter.com/projects/peterseid/romo-the-smartphone-robot-for-everyone), which was evaluating my vision-tracking code for use in its cute iPhone rover. Further, I’d say that if someone were interested, they could create a similar pedestrian, auto, pet, or cloud-tracking system using a Raspberry Pi and a reasonable webcam.
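
As a flavor of how approachable that would be, a bare-bones foreground view of a webcam feed takes only a few lines of Python with OpenCV’s stock MOG2 background subtractor (a generic stand-in, not the installation’s actual algorithm):

import cv2

cap = cv2.VideoCapture(0)                          # any attached webcam
subtractor = cv2.createBackgroundSubtractorMOG2()  # adaptive background model
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)   # pixel brightness ~ foreground "belief"
    cv2.imshow("foreground", mask)
    if cv2.waitKey(30) == 27:        # press Esc to quit
        break
cap.release()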

I may create an automatic cloud-tracking system to watch clouds. I think computers could be capable of this capacity for abstraction, even though we think of the leisurely pastime as the mark of a dreamer.

NAN: Some of the projects you’ve contributed to focus on switched linear systems, hybrid systems, wearable interfaces, and computation and control. Tell us about the projects and your research process.

ANDREW: I think my research is all driven by imagination. I try to imagine a world that could be, a world that I think would be nice, or better, or important. Once I have an idea that captivates my imagination in this way, I have no choice but to try to realize the idea and to seek out the knowledge necessary to do so.

For the wearable wireless accelerometers, it began with the thought: Wouldn’t it be cool if dance and music were inherently connected the way we try to make it seem when we’re dancing? From that thought, the designs started. I thought: The project has to be wireless and low power, it needs accelerometers to measure movement, it needs a reasonable processor to handle the data, it needs MIDI output, and so forth.

My switched linear systems research came about in a different way. As I was in class learning about theories regarding stabilization of hybrid systems, I thought: Why would we do it this complicated way, when I have this reasonably simple intuition that seems to solve the problem? I happened to see the problem a different way as my intuition was trying to grapple with a new concept. That naive accident ended up as a publication, “Stabilization of Planar Switched Linear Systems Using Polar Coordinates,” which I presented in 2010 at Hybrid Systems: Computation and Control (HSCC) in Stockholm, Sweden.

NAN: How did you become interested in electronics?

ANDREW: I always thought things that moved seemingly of their own volition were cool and inherently attention-grabbing. I would think: Did it really just do that? How is that possible?

Andrew worked on this project when computers still had parallel ports. a—This photo shows manually etched PCB traces for a digital EKG (the attempted EEG) with 8-bit LED optoisolation. The rainbow cable connects to a computer’s parallel port. The interface code was written in C++ and ran on DOS. b—The EKG circuitry and digitizer are shown on the left. The 8-bit parallel computer interface is on the right. Connecting the two boards is an array of coupled LEDs and phototransistors, encased in heat shrink tubing to shield against outside light.

Electric rally-car tracks and radio-controlled cars were a favorite of mine. I hadn’t really thought about working with electronics or computers until middle school. Before that, I was all about paleontology. Then, I saw an episode of Scientific American Frontiers, which featured Alan Alda excitedly interviewing RoboCup contestants. Watching RoboCup [a soccer game involving robotic players], I was absolutely enchanted.

While my childhood electronic toys moved and somehow acted as their own entities, they were puppets to my intentions. Watching RoboCup, I knew these robots were somehow making their own decisions on-the-fly, magically making beautiful passes and goals not as puppets, but as something more majestic. I didn’t know about the technical blood, sweat, and tears that went into it all, so I could have these romantic fantasies of what it was, but I was hooked from that moment.

That spurred me to apply to a specialized science and engineering high school program. It was there that I was fortunate enough to attend a fabulous electronics class (taught by David Peins), where I learned the basics of electronics, the joy of tinkering, and even PCB design and assembly (drilling included). I loved everything involved. Even before I became academically invested in the field, I fell in love with the manual craft of making a circuit.

NAN: Tell us about your first design.

ANDREW: Once I’d learned something about designing and making circuits, I jumped in whole-hog, to a comical degree. My very first project without any course direction was an electroencephalograph!

I wanted to make stuff move on my computer with my brain, the obvious first step. I started with a rough design and worked on tweaking parameters and finding components.

In retrospect, I think that first attempt was actually an electromyograph that read the movements of my eye muscles. And it definitely was an electrocardiograph. Success!

Someone suggested that it might not be a good idea to have a power supply hooked up in any reasonably direct path with your brain. So, in my second attempt, I digitized the signal on the brain side and hooked it up to eight white LEDs. On the other side, I had eight phototransistors coupled with the LEDs and covered with heat-shrink tubing to keep out outside light. That part worked, and I was excited about it, even though I was having some trouble properly tuning the op-amps in that version.

NAN: Describe your “dream project.”

ANDREW: Augmented reality goggles. I’m dead serious about that, too. If given enough time and money, I would start making them.

I would use some emerging organic light-emitting diode (OLED) technology. I’m eyeing the start-up MicroOLED (www.microoled.net) for its low-power “near-to-eye” display technologies. They aren’t available yet, but I’m hopeful they will be soon. I’d probably hook that up to a Raspberry Pi SBC, which is small enough to be worn reasonably comfortably.

Small, high-resolution cameras have proliferated with modern cell phones, which could easily be mounted into the sides of goggles, driving each OLED display independently. Then, it’s just a matter of creativity for how to use your newfound vision! The OpenCV computer vision library offers a great starting point for applications such as face detection, image segmentation, and tracking.
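
As an illustration of that starting point, a minimal face-detection sketch with OpenCV’s bundled Haar cascade fits in a few lines of Python (this assumes the opencv-python package, which ships its cascade files under cv2.data; the input filename is a placeholder):

import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
frame = cv2.imread("snapshot.jpg")              # placeholder input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # detector works on grayscale
for (x, y, w, h) in cascade.detectMultiScale(gray, 1.3, 5):
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("faces.jpg", frame)                 # boxes drawn around detected faces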

Google Glass is starting to get some notice as a sort of “heads-up” display, but in my opinion, it doesn’t go nearly far enough. Here’s the craziest part—please bear with me—I’m willing to give up directly viewing the world with my natural eyes: I would wear full field-of-vision goggles whose high-resolution OLED displays show stereoscopic views from two high-resolution smartphone-style cameras. (At least until the technology gets better, as described in Rainbows End by Vernor Vinge.) I think, for this version, all the components are just now becoming available.

Augmented reality goggles would do a number of things for vision and human-computer interaction (HCI). First, 3-D overlays in the real world would be possible.

Crude example: I’m really terrible with faces and names, but computers are now great with that, so why not get a little help and overlay nametags on people when I want? Another fascinating thing for me is that this concept of vision abstracts the body from the eyes. So, you could theoretically connect to the feed from any stereoscopic cameras around (e.g., on an airplane, in the Grand Canyon, or on the back of some wild animal), or you could even switch points of view with your friend!

Perhaps reality goggles are not commercially viable now, but I would unabashedly use them for myself. I dream about them, so why not make them?

Member Profile: Walter O. Krawec

Walter O. Krawec

LOCATION:
Upstate New York

OCCUPATION:
Research Assistant and PhD Student, Stevens Institute of Technology

MEMBER STATUS:
Walter has been reading Circuit Cellar since he got his first issue in 1999. Free copies were available at the Trinity College Fire Fighting Robot Contest, which was his first experience with robotics. Circuit Cellar was the first magazine for which he wrote an article (“An HC11 File Manager,” two-part series, issues 129 and 130, 2001).

TECH INTERESTS:
Robotics, among other things. He is particularly interested in developmental and evolutionary robotics (where the robot’s strategies, controllers, and so forth are evolved instead of programmed in directly).

RECENT TECH ACQUISITION:
Walter is enjoying his Raspberry Pi. “What a remarkable product! I think it’s great that I can take my AI software, which I’ve been writing on a PC, copy it to the Raspberry Pi, compile it with GCC, then off it goes with little or no modification!”

CURRENT PROJECTS:
Walter is designing a new programming language and interpreter (for Windows/Mac/Linux, including the Raspberry Pi) that uses a simulated quantum computer to drive a robot. “What better way to learn the basics of quantum computing than by building a robot around one?” The first version of this language is available on his website (walterkrawec.org). He has plans to release an improved version.

THOUGHTS ON EMBEDDED TECH:
Walter said he is amazed with the power of the latest embedded technology, for example the Raspberry Pi. “For less than $40 you have a perfect controller for a robot that can handle incredibly complex programs. Slap on one of those USB battery packs and you have a fully mobile robot,” he said. He used a Pololu Maestro to interface the motors and analog sensors. “It all works and it does everything I need.” However, he added, “If you want to build any of this yourself by hand it can be much harder, especially since most of the cool stuff is surface mount, making it difficult to get started.”

Low-Cost SBCs Could Revolutionize Robotics Education

For my entire life, my mother has been a technology trainer for various educational institutions, so it’s probably no surprise that I ended up as an engineer with a passion for STEM education. When I heard about the Raspberry Pi, a diminutive $25 computer, my thoughts immediately turned to creating low-cost mobile computing labs. These labs could be easily and quickly loaded with a variety of programming environments, walking students through a step-by-step curriculum to teach them about computer hardware and software.

However, my time in the robotics field has made me realize that this endeavor could be so much more than a traditional computer lab. By adding actuators and sensors, these low-cost SBCs could become fully fledged robotic platforms. Because the I2C protocol is so widely supported, adding chains of sensors would be incredibly easy. The SBCs could even be paired with microcontrollers to add more functionality and introduce students to embedded design.
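
As a taste of how little code that takes, a quick scan for devices on an I2C bus needs only a few lines of Python (assuming the python-smbus bindings and bus 1; addresses outside 0x03-0x77 are reserved):

import smbus

bus = smbus.SMBus(1)             # bus 1 on recent Raspberry Pi boards
for addr in range(0x03, 0x78):
    try:
        bus.read_byte(addr)      # a device that ACKs its address is present
        print("found device at", hex(addr))
    except OSError:
        pass                     # no device at this address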

There are many ways to introduce students to programming robot-computers, but I believe that a web-based interface is ideal. By setting up each computer as a web server, students can easily access the interface for their robot directly through the computer itself, or remotely from any web-enabled device (e.g., a smartphone or tablet). Through a web browser, these devices provide a uniform interface for remote control and even programming of robotic platforms.

A server-side language (e.g., Python or PHP) can handle direct serial/I2C communications with actuators and sensors. It can also wrap more complicated robotic concepts into easily accessible functions. For example, the server-side language could handle PID and odometry control for a small rover, then provide the user functions such as “right,” “left,” and “forward” to move the robot. These functions could be accessed through an AJAX interface directly controlled through a web browser, enabling the robot to perform simple tasks.
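
A minimal sketch of that architecture in Python, using only the standard library (the motor-control function is an illustrative placeholder for real serial/I2C code, and the command set mirrors the functions described above):

from http.server import BaseHTTPRequestHandler, HTTPServer

def drive_motors(left, right):
    # Placeholder: real code would write to a serial port or I2C motor controller
    print("motor speeds:", left, right)

COMMANDS = {
    "forward": lambda: drive_motors(1.0, 1.0),
    "left":    lambda: drive_motors(-0.5, 0.5),
    "right":   lambda: drive_motors(0.5, -0.5),
    "stop":    lambda: drive_motors(0.0, 0.0),
}

class RobotHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        cmd = self.path.strip("/")   # e.g., an AJAX request for /forward
        if cmd in COMMANDS:
            COMMANDS[cmd]()
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")
        else:
            self.send_response(404)
            self.end_headers()

HTTPServer(("", 8000), RobotHandler).serve_forever()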

This web-based approach is great for an educational environment, as students can systematically pull back programming layers to learn more. Beginning students would be able to string preprogrammed movements together to make the robot perform simple tasks. Each movement could then be dissected into more basic commands, teaching students how to make their own movements by combining, rearranging, and altering these commands.

By adding more complex commands, students can even introduce autonomous behaviors into their robotic platforms. Eventually, students can be given access to the HTML user interfaces and begin to alter and customize the user interface. This small superficial step can give students insight into what they can do, spurring them ahead into the next phase.

Students can start as end users of this robotic framework, but they can eventually graduate to become its developers. By mapping different commands to different functions in the server-side code, students can begin to understand the links between the web interface and the code that runs it.

Kyle Granat

Kyle Granat, who wrote this essay for Circuit Cellar, is a hardware engineer at Trossen Robotics, headquartered in Downers Grove, IL. Kyle graduated from Purdue University with a degree in computer engineering. Kyle, who lives in Valparaiso, IN, specializes in embedded system design and is dedicated to STEM education.

Students will delve deeper into the server-side code, eventually directly controlling actuators and sensors. Once students begin to understand the electronics at a much more basic level, they will be able to improve this robotic infrastructure by adding more features and languages. While the Raspberry Pi is one of today’s more popular SBCs, a variety of SBCs (e.g., the BeagleBone and the pcDuino) lend themselves nicely to building educational robotic platforms. As the cost of these platforms decreases, it becomes even more feasible for advanced students to recreate the experience on many platforms.

We’re already seeing web-based interfaces (e.g., ArduinoPi and WebIOPi) lay down the beginnings of a web-based framework to interact with hardware on SBCs. As these frameworks evolve and the cost of hardware drops even further, I’m confident we’ll see educational robotic platforms built by the open-source community.