Innovations in Mobile Robotics: An Interview with Nick Kohut

Nick Kohut and a lab mate turned their academic interest in mobile robotics into an exciting business—Dash Robotics, which sells a small, insect-like running robot that you can control with a smartphone. We recently asked Nick about advances in running robot technology, the benefits of aerodynamic turning, and his thoughts on the future of robotics.

Nick Kohut (Co-Founder, Dash Robotics)

CIRCUIT CELLAR: When did you become interested in robotics? Can you tell us about your first robotics project?

NICK: I actually first became interested in robotics in 2010, which was my third year of graduate school. I had become an engineer originally because I was really interested in cars, specifically vehicle dynamics. I had just wrapped up my Master’s working on a research project at Cal with Audi, and I needed a new project for my PhD.

I looked around at different labs, and the work being done in Ron Fearing’s robotics lab seemed really interesting—basically vehicles with legs. Believe it or not, I had never done any robotics or even soldered a single joint until that point. I had a steep learning curve in the lab, and my first robotics project was MEDIC, a 4 cm walking robot. It was pretty tough but I learned a lot, and fell in love with the subject.

CIRCUIT CELLAR: Why did you decide to focus your studies on control systems? Whose work inspired you to focus on control systems?

NICK: At the University of Illinois, Prof. Andrew Alleyne was one of my advisors, and I took his intro controls course junior year. I really liked it—controls and dynamics are definitely my favorite subjects; anything that moves keeps my interest. It also had a lot of math, which I was pretty decent at for an engineer, so I did really well in the class. I decided I should study it in grad school. What they don’t tell you is that grad school controls is totally different, but I ended up liking that too.

The TAYLRoACH (tail-actuated yaw locomotion roach)

CIRCUIT CELLAR: Tell us about the work you did in the Biomimetics and Dexterous Manipulation Laboratory under Professor Mark Cutkosky.

NICK: I was only in Mark’s lab for about seven months. It was a great place to work, but I had founded Dash Robotics between taking the postdoc position and actually starting the postdoc. Because of that I only worked on one project, and in seven months there’s only so much you can do. I was trying to scale up an electroactive polymer (EAP) actuator for use in Honda’s ASIMO robot. It’s an interesting challenge that involves a lot of rapid prototyping, materials research, and solid mechanics. Also quality control, which is hard to do in a lab setting.

CIRCUIT CELLAR: How did you come to use aerodynamic forces to turn running robots? What led you to this field of research?

NICK: This actually started with biologists like Robert Full and Tom Libby studying lizards. Bob and Tom had discovered that when lizards jump they use their tail as a form of attitude control. They had also shown that in a wind tunnel they will use their tail to turn. I was tasked with getting a robot to turn using a tail, which I did with some pretty good success. TaylRoACH (the robot I built in 2012) ended up being the fastest turning legged robot in the world. It could turn 90° in 1/4 of a second.

After I had shown that, I started to wonder what else the tail could do. I tried a lot of things, mostly back-of-the-envelope ideas, like stability on inclines or using it as a “7th leg” in confined places. A lot of those didn’t work out, and someone suggested, half-joking, that I use it as a helicopter blade. It got me thinking: what if you used it as a sail? I ran the numbers in about an hour and realized, man, this might actually work.

CIRCUIT CELLAR: What are the benefits of aerodynamic turning?

NICK: There are a few benefits. One interesting thing is that it will only work at small scales, but that’s probably where you want it. When robots start to get smaller and smaller, you become really limited with what you can do. You can’t add a lot of sensors, actuators, or computing power (though this is changing every day!). So you probably have a very simple robot, maybe only a few actuators. The SailRoACH has six legs, and only three actuators, but can make wide turns, rapid turns, and pretty much everything in between. So it can keep things simple.

It also can be used in a research setting to study the dynamics of the robot. If you want to add a constant yaw disturbance to the robot and measure how that affects its running ability, this is a way to do that. This may sound like an esoteric need but it’s how research gets done, and it helps us understand running robots better.

CIRCUIT CELLAR: Tell us about the Millirobot Enabled Diagnostic of Integrated Circuits (MEDIC) project. Why did you start the project and what were the results?

NICK: MEDIC was an interesting project because it was my first robotics project and we were trying to solve a very difficult problem, which was “Can you build a robot to navigate inside a computer motherboard?” We were contracted to work on this with Lockheed Martin, and they supplied the software end of things.

Basically we built this incredibly small robot (~5 cm and 5 g) that had legs and a hull that would allow it to scoot around a motherboard, turn, and climb over basic, short obstacles (like a microchip). I worked on the mechanics and design of the robot, with a lot of help from other lab members on the electronics, and Lockheed provided the software that allowed MEDIC (called “Adorable Turtle Bot” by us) to navigate. It actually had a little camera on it, so it would take a picture, send that information to a laptop, the laptop would send back a few instructions (“go forward two steps, then turn left for two steps”), the robot would execute the instructions, take another picture, and repeat the process.
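The picture-plan-execute loop Nick describes can be sketched as a toy pose update on a grid. The instruction format below (“F2” for two steps forward, “L1”/“R1” for quarter-turns) is purely hypothetical, not the protocol Lockheed actually used:

```cpp
#include <cassert>
#include <sstream>
#include <string>

// Toy pose for a robot on a grid: position plus heading (0=N, 1=E, 2=S, 3=W).
struct Pose { int x = 0, y = 0, heading = 0; };

// Execute one batch of instructions, e.g. "F2 L1 F1".
// F<n>: move n steps forward; L<n>/R<n>: turn left/right by n quarter-turns.
Pose execute(Pose p, const std::string& program) {
    static const int dx[4] = {0, 1, 0, -1};  // east-west step per heading
    static const int dy[4] = {1, 0, -1, 0};  // north-south step per heading
    std::istringstream in(program);
    std::string tok;
    while (in >> tok) {
        char op = tok[0];
        int n = std::stoi(tok.substr(1));
        if (op == 'F')      { p.x += dx[p.heading] * n; p.y += dy[p.heading] * n; }
        else if (op == 'L') { p.heading = (p.heading + 3 * n) % 4; }
        else if (op == 'R') { p.heading = (p.heading + n) % 4; }
    }
    return p;
}
```

In the real system each batch would be preceded by a fresh camera frame and a new plan from the laptop; here only the execution step is modeled.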

It was pretty cool because you had this tiny robot doing SLAM and navigating autonomously inside a computer motherboard. Unfortunately it was slower than oatmeal running downhill and didn’t work most of the time, but that’s research. By the end we had some results we were very happy with and wrote two solid publications on it.

You can control Dash robots with a cell phone

CIRCUIT CELLAR: What led you and your co-founders to launch Dash Robotics in 2013? And can you tell us about your current team?

NICK: My co-founder Andrew and I were lab mates in grad school and climbing buddies. We both knew that we wanted to run our own business, because we couldn’t stand working in a cubicle; it’s why we were in grad school in the first place.

This idea of starting a business bounced around for a couple years but we never did anything with it. In the meantime, we had been to various events at schools and museums and saw that people loved the robot, they just went wild for it. Everyone asked to buy it but we always told them, “no, this is just a research tool.” In late 2012 we saw the first beginnings of all these smart toys and thought “well, what we have here is way cooler than that.” So we formed Dash Robotics, Inc. We hadn’t even graduated yet but we got a lot of support from the University and friends and family and were able to make it until February 2015 without taking any venture investment. Now I’m very happy we have that.

The robots ship flat. Simply fold and assemble them.

CIRCUIT CELLAR: Dash’s first robot is a phone-controlled, insect-like running robot. It is shipped “origami”-style for people to assemble themselves. Tell us a bit about the process of planning and designing the robot.

NICK: This is pretty tough to answer in one question. The “origami” style is a process called SCM that was originally developed at UC Berkeley. The design is all done in 2-D and then cut out and folded up to 3-D, so it takes a bit of experience to become good at designing mechanisms using this process. You can’t just build it in 3-D CAD and see what it will look like before making it.

There are some people who are trying to change that, like Dan Aukes from Harvard. Right now we still do it all on intuition and experience. The original robot was developed in 2009, and it saw incremental changes over the next 4 years or so. In 2013 when we founded the company we had a whole new set of requirements for the robot (a research tool and consumer product are vastly different) so we started making a lot of changes. There have probably been at least 50 revisions since 2013—maybe 100. Each time it gets a little better, and we do a lot of testing to make sure we’re on the right track.

CIRCUIT CELLAR: Is DIY, hobby robotics your main focus at Dash Robotics? Do you plan to branch out, perhaps into robot systems for industry, military, or medical applications?

NICK: That’s our main focus right now, along with making a product that kids will love as well. I think there are a lot of potential directions like agriculture, infrastructure inspection, search and rescue, etc. That’s much further down the road though.

CIRCUIT CELLAR: What’s next for Dash Robotics? Where would you like to see the company in 12 months?

NICK: With its products flying off store shelves and a great team in place making it all happen!

CIRCUIT CELLAR: What are your thoughts on the future of robotics?

NICK: This is a great, and of course difficult, question. It also depends on how you define robotics. I think on one end you’re going to see a lot of jobs displaced by self-driving cars and trucks, robotic dishwashers, housecleaners, etc. On the other end AI is going to be able to do a lot of knowledge work now done by lawyers, doctors, and engineers. Both of those advances are going to be a major challenge for society.

If you’re talking about mobile robotics specifically, where a lot of my interest lies, there is a major challenge in actuators and power density. Boston Dynamics builds some amazing machines but the internal combustion engine is loud and dirty, and current lithium batteries are only going to get you so far. Tesla is working very hard on the battery problem, and hopefully its new Gigafactory will bring prices down. If Tesla makes a big advance in battery technology I think you may see a whole new category of mobile robots breaking out.

This interview appears in Circuit Cellar 304 (November 2015).


Matrix Launches Formula AllCode Kickstarter Campaign (sponsored)

Matrix TSL has launched a Kickstarter campaign for its Formula AllCode robotics course, which features a high-specification, Bluetooth-enabled robot. You can program the robot via Python, AppBuilder, Flowcode, MATLAB, LabVIEW, C, and more. It is compatible with Raspberry Pi, Android, iPhone, and Windows devices.


Formula AllCode is a platform for both novice and advanced electronics enthusiasts to learn and test their robotics skills. Participate in the campaign: Formula AllCode

The funds raised from this Kickstarter project will allow Matrix to take the current prototype shown in the project videos to the next level, with a technical specification to beat any like-for-like robot buggy, and to manufacture 1,000 units for worldwide launch.

By backing the Kickstarter campaign, you are supporting a project that allows users to develop their robotics understanding on a platform of their choice. Whether you’re starting out with your first robotics project or you’re a fully fledged robotics developer, the Formula AllCode will work for you. The project must be funded by Sunday, September 6, 2015.



Inputs and sensors:
  • 2 push-to-make switches
  • 8 IR distance sensors
  • Light sensor
  • Microphone
  • I2C accelerometer/compass
  • 2 line-following sensors
  • Audio gain

Outputs:
  • 8 LEDs (one port)
  • Speaker
  • Expansion port (8-bit)
  • 4 servo outputs
  • E-blocks expansion port

Motors:
  • Left and right
  • Integrated gearbox
  • Integrated encoders

System:
  • Reset switch
  • 16-bit PIC24 microcontroller
  • USB-rechargeable lithium battery
  • 4 × 40 character backlit LCD
  • Micro SD card
  • Integrated Bluetooth
  • Crystal oscillator
  • Micro USB socket

Wireless Data Link

In 2001, while working on a self-contained robot system called “Scout,” Tom Dahlin and Donald Krantz developed an interesting wireless data link. Scout is a tubular, wheeled robot, and its wireless data link is divided into two separate boards, one for radio control and another containing the RF hardware.

Dahlin and Krantz write:

This article will describe the hardware and software design and implementation of a low-power, wireless RF data link. We will discuss a robotic application in which the RF link facilitates the command and control functions of a tele-operated miniature robot. The RF Monolithics (RFM) TR-3000 chip is the core of the transceiver design. We use a straightforward interface to a PIC controller, so you should be able to use or adapt much of this application for your needs…

Photo 1: The robot measures a little over 4″. Designed for tele-operated remote surveillance, it contains a video camera and transmitter. Scout can hop over obstacles by hoisting its tail spring (shown extended) and quickly releasing it to slap the ground and propel the robot into the air.

The robot, called Scout, is packed in a 38-mm-diameter tube with coaxial-mounted wheels at each end, approximately 110 mm long. The robot is shown in Photo 1. (For additional information, see the “Key Specifications for Scout Robot” sidebar.) Scout carries a miniature video camera and video transmitter, allowing you to tele-operate the robot by sending it steering commands while watching video images sent back from Scout. The video transmitter and data transceiver contained on the robot are separate devices, operating at 915 MHz and 433 MHz, respectively. Also contained on Scout are dual-axis magnetometers (for compass functions) and dual-axis accelerometers (for tilt/inclination measurement).

Figure 1: For the radio processor board, a PIC16F877 provides the horsepower to perform transceiver control, Manchester encoding, and packet formatting.

Scout’s hardware and software were designed to be modular. The wireless data link is physically partitioned onto two separate boards, one containing a PIC processor for radio control, message formatting, and data encoding (see Figure 1). The other board contains the RF hardware, consisting of the RFM TR3000 chip and supporting discrete components. By separating the two boards, we were able to keep the digital noise and trash away from the radio.
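The Manchester encoding the PIC performs before handing data to the TR3000 can be sketched in a few lines. Each data bit becomes a pair of half-bit “chips” so the transmitted stream is DC-balanced, which keeps the receiver’s data slicer centered. This is a generic illustration using the IEEE 802.3 convention (0 encodes as 10, 1 as 01), not code from the article, and Scout’s actual convention may differ:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Manchester-encode one byte, MSB first: each data bit becomes a pair of
// half-bit "chips". IEEE 802.3 convention shown here (0 -> 10, 1 -> 01).
std::vector<int> manchesterEncode(uint8_t byte) {
    std::vector<int> chips;
    for (int i = 7; i >= 0; --i) {
        int bit = (byte >> i) & 1;
        chips.push_back(bit ? 0 : 1);  // first half-bit
        chips.push_back(bit ? 1 : 0);  // second half-bit (always the inverse)
    }
    return chips;
}
```

For any input byte exactly half the chips are ones, which is the DC-balance property the radio’s data slicer relies on.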

Read the full article.

Advances in Haptics Research

Katherine J. Kuchenbecker is an Associate Professor in Mechanical Engineering and Applied Mechanics at the University of Pennsylvania, with a secondary appointment in Computer and Information Science. She directs the Penn Haptics Group, which is part of the General Robotics, Automation, Sensing, and Perception (GRASP) Laboratory. In this interview, she tells us about her research, which centers on the design and control of haptic interfaces for applications such as robot-assisted surgery, medical simulation, stroke rehabilitation, and personal computing.

Katherine J. Kuchenbecker

Katherine J. Kuchenbecker

CIRCUIT CELLAR: When did you first become interested in haptics and why did you decide to pursue it?

KATHERINE: I chose to become an engineer because I wanted to create technology that helps people. Several topics piqued my interest when I was pursuing my undergraduate degree in mechanical engineering at Stanford, including mechatronics, robotics, automotive engineering, product design, human-computer interaction, and medical devices. I was particularly excited about areas that involve human interaction with technology. Haptics is the perfect combination of these interests because it centers on human interaction with real, remote, or virtual objects, as well as robotic interaction with physical objects.

My first exposure to this field was a “haptic paddle” lab in a Stanford course on system dynamics, but that alone wouldn’t have been enough to make me fall in love with this field. Instead, it was conversations with Günter Niemeyer, the professor who advised me in my PhD at Stanford. I knew I wanted a doctorate so that I could become a faculty member myself, and I was inspired by the work he had done as an engineer at Intuitive Surgical, Inc., the maker of the da Vinci system for robotic surgery. Through my early research with Günter, I realized that it is incredibly satisfying to create computer-controlled electromechanical systems that enable the user to touch virtual objects or control a robot at a distance. I love demonstrating haptic systems because people make such great faces when they feel how the system responds to their movements. Another great benefit of studying haptics is that I get to work on a wide variety of applications that could potentially impact people in the near future: robotic surgery, medical training, stroke rehabilitation, personal robotics, and personal computing, to name a few.

CIRCUIT CELLAR: What is haptography? What are its benefits?

KATHERINE: I coined the term “haptography” (haptic photography) to proclaim an ambitious goal for haptics research: we should be able to capture and reproduce how surfaces feel with the same acuity that we can capture and reproduce how surfaces look.

When I entered the field of haptics in 2002, a lot of great research had been done on methods for letting a user feel a virtual three-dimensional shape through a stylus or thimble. Essentially, the user holds on to a handle attached to the end of a lightweight, back-drivable robot arm; the 3D Systems Touch device is the most recent haptic interface of this type. A computer measures the motion that the person makes and constantly outputs a three-dimensional force vector to give the user the illusion that they are touching the object shown on the screen. I was impressed with the haptics demonstrations I tried back in 2002, but I was also deeply disappointed with how the virtual surfaces felt. Everything was soft, squishy, and indistinct compared to how real objects feel. That’s one of the benefits of being new to a field; you’re not afraid to question the state of the art.

I started working to improve this situation as a doctoral student, helping invent a way to make hard virtual surfaces like wood and metal feel really hard and realistic. The key was understanding that the human haptic perceptual system keys in on transients instead of steady-state forces when judging hardness. I had to write a research statement to apply for faculty positions at the end of 2005, so I wrote all about haptography. Rather than trying to hand-program how various surfaces should feel, I wanted to make it all data driven. The idea is to use motion and force sensors to record everything a person feels when using a tool to touch a real surface. We then analyze the recorded data to make a model of how the surface responds when the tool moves in various ways. As with hardness, high-frequency vibration transients are also really important to human perception of texture, which is a big part of what makes different surfaces feel distinct. Standard haptic interfaces weren’t designed to output high-frequency vibrations, so we typically attach a voice-coil actuator (much like an audio speaker) to the handle, near the user’s fingertips. When the user is touching a virtual surface, we output data-driven tapping transients, friction forces, and texture vibrations to try to fool them into thinking they are touching the real surface from which the model was constructed.
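The insight about transients can be illustrated with a toy one-axis haptic servo loop: a penalty spring supplies the steady wall force, and a brief decaying vibration, scaled by impact speed, is superimposed at the moment of contact. All constants here are illustrative, not values from the published research:

```cpp
#include <cassert>
#include <cmath>

// Toy one-axis haptic "virtual wall" at x = 0 (penetration when x < 0).
// A penalty spring gives the steady force; a decaying vibration transient,
// scaled by impact speed, is added at the moment of contact.
struct VirtualWall {
    double k = 2000.0;     // wall stiffness, N/m (illustrative)
    double freq = 300.0;   // transient frequency, Hz
    double decay = 80.0;   // transient decay rate, 1/s
    double amp = 0.0;      // current transient amplitude, N
    double t = 0.0;        // time since contact, s
    bool inContact = false;

    // Call once per servo tick with position x (m), velocity v (m/s), dt (s).
    double force(double x, double v, double dt) {
        if (x >= 0.0) { inContact = false; return 0.0; }  // free space
        if (!inContact) {               // impact detected this tick
            inContact = true;
            t = 0.0;
            amp = 30.0 * std::fabs(v);  // harder hits ring louder
        }
        t += dt;
        double spring = -k * x;         // steady penalty force
        double transient = amp * std::exp(-decay * t)
                               * std::sin(2.0 * 3.14159265358979 * freq * t);
        return spring + transient;
    }
};
```

A plain spring alone is what made early virtual surfaces feel soft and squishy; the added high-frequency transient is what sells the impression of tapping something hard.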

After many years of research by my PhD students Heather Culbertson and Joe Romano, we’ve been able to create the most realistic haptic surfaces in the world. My work in haptography is motivated by a belief that there are myriad applications for highly realistic haptic virtual surfaces.

One exciting use is in recording what doctors and other clinical practitioners feel as they use various tools to care for their patients, such as inserting an epidural needle or examining teeth for decay (more on this below). Haptography would enable us to accurately simulate those interactions so that trainees can practice critical perceptual-motor skills on a computer model instead of on a human patient.

Another application that excites us is adding tactile feedback to online shopping. We’d love to use our technology to let consumers feel the fabrics and surfaces of products they’re considering without having to visit a physical store. Touch-mediated interaction plays an important role in many facets of human life; I hope that my team’s work on haptography will help bring highly realistic touch feedback into the digital domain.

Read Circuit Cellar’s interviews with other engineers, academics, and innovators.

CIRCUIT CELLAR: Which of the Penn Haptics Group’s projects most interest you at this time?

KATHERINE: That’s a hard question! I’m excited about all of the projects we are pursuing. There are a few I can’t talk about, because we’re planning to patent the underlying technology once we confirm that it works as well as we think it does. Two projects that are public have been fascinating me recently.

Tactile Teleoperation: My lab shares a Willow Garage PR2 (Personal Robot 2) humanoid robot with several of the other faculty in Penn’s GRASP Lab. Our PR2’s name is Graspy.

This wearable device allows the user to control the motion of the PR2 robot’s hand and also feel what the PR2 is feeling. The haptic feedback is delivered via a geared DC motor and two voice-coil actuators.

While we’ve done lots of fun research to enable this robot to autonomously pick up and set down unknown objects, I’d always dreamed of having a great system for controlling Graspy from a distance. Instead of making the operator use a joystick or a keyboard, we wanted to let him or her control Graspy using natural hand motions and also feel what Graspy was feeling during interactions with objects.

My PhD student Rebecca Pierce recently led the development of a wearable device that accomplishes exactly this goal. It uses a direct-drive geared DC motor with an optical encoder to actuate and sense a revolute joint that is aligned with the base joint of the operator’s index finger. Opening and closing your hand opens and closes the robot’s parallel-jaw gripper, and the motor resists the motion of your hand if the robot grabs onto something. We supplement this kinesthetic haptic feedback with tactile feedback delivered to the pads of the user’s index finger and thumb. A voice-coil actuator mounted in each location moves a platform into and out of contact with the finger to match what the robot’s tactile sensors detect. Each voice coil presses with a force proportional to what the corresponding robot finger is feeling, and the voice coils also transmit the high-frequency vibrations (typically caused by collisions) that are sensed by the MEMS-based accelerometer embedded in the robot’s hand. We track the movement of this wearable device using a Vicon optical motion tracking system, and Graspy follows the movements of the operator in real time. The operator sees a video of the interaction taking place. We’re in the process of having human participants test this teleoperation setup right now, and I’m really excited to learn how the haptic feedback affects the operator’s ability to control the robot.
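The feedback mapping described here, a voice-coil push proportional to the sensed grip force plus the high-frequency content of the robot’s accelerometer, can be approximated with a proportional term and a one-pole high-pass filter. The gains and filter constant below are illustrative guesses, not the values used in the actual device:

```cpp
#include <cassert>
#include <cmath>

// Voice-coil command = gain * sensed fingertip force
//                    + gain * high-pass-filtered accelerometer signal.
class TactileFeedback {
public:
    TactileFeedback(double forceGain, double vibGain, double alpha)
        : kF(forceGain), kV(vibGain), a(alpha) {}

    // One-pole high-pass filter: y[n] = a * (y[n-1] + x[n] - x[n-1]).
    // Steady acceleration (e.g. gravity) decays away; vibration gets through.
    double update(double sensedForce, double accel) {
        double hp = a * (prevOut + accel - prevIn);
        prevIn = accel;
        prevOut = hp;
        return kF * sensedForce + kV * hp;
    }
private:
    double kF, kV, a;
    double prevIn = 0.0, prevOut = 0.0;
};
```

Under a constant acceleration input, the filter’s contribution shrinks each tick and the output settles toward the pure proportional grip-force term.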

The high-bandwidth MEMS-based accelerometer records the sensations a dentist feels as she probes an extracted human tooth. Feeling these recordings lets dental trainees practice diagnosing dental decay before they treat live patients.

CIRCUIT CELLAR: In your TEDYouth talk, you describe a project in which a dental tool is fitted with an accelerometer to record what a dentist feels and then replay it back for a dental student. Can you tell us a bit about the project?

KATHERINE: This project spun out of my haptography research, which I described above. While we were learning to record and model haptic data from interactions between tools and objects, we realized that the original recordings had value on their own, even before we distilled them into a virtual model of what the person was touching. One day I gave a lab tour to two faculty members from the Penn School of Dental Medicine who were interested in new technologies. I hit it off with Dr. Margrit Maggio, who had great experience in teaching general dentistry skills to dental students. She explained that some dental students really struggled to master some of the tactile judgments needed to practice dentistry, particularly in discerning whether or not a tooth surface is decayed (in popular parlance, whether it has a cavity). A few students and I went over to her lab to test whether our accelerometer-based technology could capture the subtle details of how decayed vs. healthy tooth tissue feels. While the recordings are a little creepy to feel, they are super accurate. We refined our approach and conducted several studies on the potential of this technology to be used in training dental students. The results were really encouraging, once again showing the potential that haptic technology holds for improving clinical training.

CIRCUIT CELLAR: What is the “next big thing” in the field of haptics? Is there a specific area or technology that you think will be a game changer?

KATHERINE: Of course this depends on where you’re looking. While cell phones and game controllers have had vibration alerts for a long time, I think we’re just starting to see high-quality haptic feedback emerge in consumer products. Haptics can definitely improve the user experience, which will give haptic products a market advantage, but their cost and implementation complexity need to be low enough to keep the product competitive. On the research side, I’m seeing a big move toward tactile feedback and wearable devices. Luckily there are enough interesting open research questions to keep my students and me busy for 30 more years, if not longer!

The complete interview appears in Circuit Cellar 296 (March 2015).

DIY Interactive Robots: An Interview with Erin Kennedy

Erin “RobotGrrl” Kennedy designs award-winning robots. Her RoboBrrd DIY robot-building kit successfully launched in 2012 and was featured in IEEE Spectrum, Forbes, Wired, and on the Discovery Channel. Erin was recognized as one of the 20 Intel Emerging Young Entrepreneurs. In this interview she tells us about her passion for robotics, early designs, and future plans.

CIRCUIT CELLAR: How and when did Erin Kennedy become “RobotGrrl?”

ERIN: I used to play an online game, but didn’t want to use my nickname from there. I was building LEGO robots at the time, so my friend suggested “RobotGrrl.” It sounds like a growl without the “ow.”

CIRCUIT CELLAR: Tell us about your blog. Why and when did you decide to start blogging?

ERIN: I started around 2006 to document my adventures in the world of robotics. I would post updates on my projects there, similar to a log book. It helped me gain a community that would follow my adventures.

CIRCUIT CELLAR: Your RoboBrrd company is based on the success of your RoboBrrd beginner robot-building kit, which was funded by Indiegogo in 2012. How does the robot work? What is included in the kit?

ERIN: RoboBrrd works by using three servos, a laser-cut chassis, and an Arduino derivative for its brain. Two of the servos are used for the robot’s wings and the third one is used for the beak mechanism. To construct the chassis, all you need is glue. The brains are on a custom-designed Arduino derivative, complete with RoboBrrd doodles on the silkscreen.



The first prototype of RoboBrrd was created with pencils and popsicle sticks. Adafruit sent me the electronics and in return I would make weekly videos about building the robot. People seemed to like the robot, so I kept making newer prototypes that would improve on problems and add more to the design.

Eventually I started working on a laser-cut kit version. I won the WyoLum Open Hardware grant and, with the money, I was able to order PCBs I designed for RoboBrrd.

I had enough money for a flight out to California (for RoboGames and Maker Faire Bay Area) where I was an artist in residence at Evil Mad Scientist Laboratories. It was helpful to be able to use their laser cutter right when a new design was ready. Plus, I was able to build a really old and cool Heathkit.

RoboBrrd chassis

Afterward, I worked on the design a little more. SpikenzieLabs helped laser cut it for me, and eventually it was all finished. It was such an awesome feeling to finally have a solid design!

In 2012, RoboBrrd launched on Indiegogo and luckily there were enough friends out there who were able to help the project and back it. They were all very enthusiastic about the project. I was really lucky.

Now I am working on a newer version of the 3-D printed RoboBrrd and some iOS applications that use Bluetooth Low Energy (BLE) to communicate with it. The design has come a long way, and it has been fun to learn many new things from RoboBrrd.

CIRCUIT CELLAR: RoboBrrd has had widespread popularity. The robots have been featured on The Discovery Channel, Forbes, MAKE, and WIRED. To what do you attribute your success?

ERIN: The success of RoboBrrd is attributed to everyone who is enthusiastic about it, especially those who have bought a kit or made their own RoboBrrds. It is always fun to see whenever people make modifications to their RoboBrrds.

All I did was make and deliver the kit. It’s all of the “friends of RoboBrrd” who bring their own creative ideas to make it really shine. Also, from the previous question, the readers can see that I had a lot of help along the way.

Having the robots featured on many websites required some luck. You never know if your e-mail pitch is what the journalists are looking for. I was really lucky that websites featured RoboBrrd; it gave the robot a little more credibility.

In my opinion, the quirkiness of RoboBrrd helps as well. Sometimes people view it as the “open-source hardware (OSHW) Furby.” It’s a robotic bird, not your regular wheeled robot.

CIRCUIT CELLAR: What was the first embedded system you designed? Where were you at the time? What did you learn from the experience?

ERIN: There were systems that I designed using the LEGO Mindstorms RCX 2.0, but my very first design from scratch was a robot called BubbleBoy. The outer appearance looked like a pink snowman. It sat on a green ice cream container and wore a top hat. It was very rudimentary. At the time I was in Grade 11.

Inside the body sphere were two servos. The servos would push/pull on paper clips that were attached to the head. Inside the head there was a DC motor to spin the top hat around. There was also a smaller DC motor inside the body, attached to a hula hoop to wiggle it. The electronics were enclosed in the container. The robot used an Arduino Diecimila microcontroller board (limited-edition prototype version) and some transistors to control the motors from battery power. There was also an LCD to display the robot’s current mood and its water and food levels. Buttons on each side of the screen incremented the water and food levels.

There’s a 2009 video of me showing BubbleBoy on Fat Man & Circuit Girl. (Jeri Ellsworth co-hosted the webcast.)

There was not as much documentation online about the Arduino and learning electronics as there is now. I gained many skills from this experience.

The biggest thing I learned from BubbleBoy was how to drive DC motors using transistors. I also learned how not to mount servos: the hot glue on polystyrene was never rigid enough and kept moving. It was a fun project; the hands-on making of a robot character can really help you kick off bigger projects.

You can read the entire interview in Circuit Cellar 293 (December 2014).

Robotics & Intelligent Gaming

When Alessandro Giacomel discovered Arduino in 2009, he quickly became hooked. Since then, he’s been designing “little robots” around Arduino and blogging about his work and findings. In this interview, Alessandro tells us about his most interesting projects and shares his thoughts on the future of robotics, 3-D printing, and more.

CIRCUIT CELLAR: How long have you been designing embedded systems, and what sparked your interest?

ALESSANDRO: I have been designing embedded systems for about five years. My interest arose from the possibility of building robots. When I was a kid, I found robots extremely fascinating. The ability to make matter do what we decide has always seemed to me one of humanity’s main ambitions.

CIRCUIT CELLAR: Tell us about your first design.

ALESSANDRO: My first embedded system was an Arduino 2009. The availability of a huge range of shields, sensors, and actuators enabled me to design many applications at a price acceptable for an amateur like me.


Alessandro’s first robot

I started like many people, with a robot on wheels that moves around avoiding obstacles. It’s a standard robot that almost all beginners build, and it’s simple because it needs only a few components and a standard Arduino 2009. The design used servomotors modified for continuous 360° rotation, connected to the wheels to move the robot, plus another servomotor to pan a little head carrying an ultrasonic distance sensor. The distance sensor lets the robot know when it is in front of an obstacle and helps it decide the most convenient way to escape.

In its simplicity, this robot enables one to understand the basics for the development of a microcontroller-based robot: the need to have separate power supplies for the motors’ power circuits and for the microcontroller’s logic, the need to have precise sensor reading timing, and the importance of having efficient algorithms to ensure that the robot moves in the desired mode.
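That sense-decide-act cycle can be sketched in a few lines. The following Python sketch is illustrative only, not Alessandro’s actual code: the distance readings are simulated values rather than pings from a real ultrasonic sensor, and the 20-cm threshold is an assumed figure.

```python
# Illustrative sense-decide-act loop for an obstacle-avoiding robot.
# Not Alessandro's actual code; readings are simulated, and the
# 20-cm threshold is an assumed value.

OBSTACLE_CM = 20  # turn away when an obstacle is closer than this

def decide(distance_cm):
    """Choose a motion command from a single ultrasonic reading."""
    if distance_cm < OBSTACLE_CM:
        return "turn"     # obstacle ahead: rotate to find a clear path
    return "forward"      # path clear: keep driving

def run(readings):
    """Apply the decision rule to a sequence of sensor readings."""
    return [decide(d) for d in readings]

if __name__ == "__main__":
    # Simulated readings in cm: clear, clear, obstacle, clear again
    print(run([100, 45, 12, 80]))  # ['forward', 'forward', 'turn', 'forward']
```

On real hardware the loop would also need the timing discipline Alessandro mentions: reading the sensor at regular intervals and keeping the motor and logic power supplies separate.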

My first robot took me a long time to build. But all the elements of the robot (hardware and software) were developed by me, and this was important because it forced me to face the real problems that arise when you are building a robot. Today there are many resources on the Internet that enable one to build a robot by simply replicating a set of steps someone else has described. These guides should be used as a source of inspiration, never followed exactly step-by-step; otherwise, while it is true that in the end you can build a robot, you don’t own the knowledge of what has been done.

My robot evolved with the ability to speak, thanks to a sound module. When I build a robot the goal is always to experiment with a technology and to have fun. My friends have enjoyed seeing the robot turning around, speaking, and telling funny stories.

CIRCUIT CELLAR: Your blog, Robottini, is described as “little robots with Arduino.” What inspired you to begin the blog?

ALESSANDRO: I strongly believe in sharing knowledge and open-source hardware and software. I thought it was normal to try to share what I was designing when I started to build robots. When I started, I had the benefit of what others had made and published on the Internet. I thought about writing a blog in my language, Italian, but I thought also it would be a good exercise for me to try to write in English and, most importantly, this enabled me to reach a much wider audience.

The site description includes the philosophy at the basis of the blog: small robots built using Arduino. I build small robots because I’m an amateur and my house isn’t very big, so I only build robots that I can put in an armoire. I use Arduino because it is a microcontroller developed in Italy, it was obvious for me to use it, and it is really a great board for a beginner—inexpensive and robust.


Alessandro’s first robot at the Arduino Day 2011 event

The community has developed thousands of applications that can be reused. When I started the blog in 2011, I had already been building small robots for a few years. In the beginning, finding information was much more complicated, and the few shields available were not cheap. So I always tried to use “poor” materials (e.g., recovered or recycled ones). Decreasing the cost of implementation and imagining new purposes for things already available in a normal house seemed like a good way to work.

My achievements documented in the blog are never step-by-step guides to build the robot. I include a list of components to buy, the source code, and sometimes the wiring diagram. But I never provide a complete guide, since I think everyone should try to build their own robot because, once built, the satisfaction is enormous.

Through my blog I am available to help with problems people encounter when they are building robots, but I think it is important to give people the tools to build, rather than providing detailed explanations. Everyone can learn only by fighting the difficulties, without having someone preparing everything perfectly.

CIRCUIT CELLAR: Robottini obviously includes quite a few robotics projects. Why did you build them? Do you have a favorite?

ALESSANDRO: Many times people ask me what is the meaning of the robots I build. The answer that I give them leaves people puzzled. The answer is this: My robots are useless. They are useful only as fun—as a passion. I’m happy when I see my little son, Stefano, who is three years old, watching and laughing at a robot turning around in our house. But this does not mean I don’t follow a branch of research when I build robots.

Initially, I built robots to understand how motor drivers work, how sensors work, and the problems related to the robot’s logic. Afterward, my first branch of research was control: how to tune proportional-integral-derivative (PID) control to follow a line or to balance a robot. This led me to the management of complex sensors, such as the inertial measurement unit (IMU).
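The discrete PID controller behind both line following and balancing can be sketched as below; the gains and timestep are illustrative placeholders, not values from Alessandro’s robots.

```python
# Minimal discrete PID controller. The gains (kp, ki, kd) and timestep
# dt are illustrative, not values from Alessandro's robots.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        """Return the control output for the current error sample."""
        self.integral += error * self.dt                  # accumulate I term
        derivative = (error - self.prev_error) / self.dt  # finite-difference D term
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

For a line follower, `error` would be the line’s offset from center; for a balancer, the tilt angle from vertical supplied by the IMU.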

To balance a robot on two wheels, it is important to measure how much the robot tilts from the vertical. To do this, a cluster of sensors called an IMU is typically used, based on multi-axis combinations of precision gyroscopes, accelerometers, magnetometers, and pressure sensors. In a simpler version, the IMU uses just an accelerometer and a gyroscope, and both signals must be combined to obtain a correct value of the tilt angle from the vertical (this is called sensor fusion).

The most common method is based on the Kalman filter, a mathematical tool that enables you to combine two or more signals to estimate the angle. But it is highly sophisticated, difficult for an amateur to understand, and requires fairly advanced mathematics. A rather simple alternative, called the “complementary filter,” has been proposed in recent years.
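The complementary filter is simple enough to state in one line: integrate the gyro for short-term accuracy and blend in the accelerometer angle to cancel long-term drift. This sketch is a generic illustration; the blend factor `alpha` and the timestep are assumed values, not parameters from the blog post.

```python
# One update step of a complementary filter (generic illustration;
# alpha and dt are assumed values, not taken from the blog post).

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the gyro-integrated angle with the accelerometer angle.

    angle, accel_angle: degrees; gyro_rate: degrees/second; dt: seconds.
    """
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

With `alpha` close to 1, the gyro dominates over short timescales while the accelerometer slowly pulls the estimate back, canceling gyro drift.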

One of the studies I performed and posted on my blog compares the two filters’ signals in practice, to verify whether the complementary filter can approximate the Kalman filter in typical robotics situations. This post has had a great following, and I’ve been surprised to see several university-level scientific publications link to it. I only wrote the post because I was curious about a simple, almost trivial method, and it has become helpful to researchers and hobbyists. It has been a pleasure for me.

In the last year, I have followed the trend of art and interaction (i.e., the possibility of building something that can somehow marry art with technology). It was the theme of the stall I had at Maker Faire Europe in Rome, Italy, in October 2013. Arduino is an electronic circuit without a heart and without a soul. Can an Arduino be an artist? I’m trying to do something with Arduino that could be “art.” The arts include painting, poetry, music, sculpture, and so on. I’m trying to do something in different fields of art.

My first experiment is the Dadaist Poetry Box, a box capable of composing and printing Dadaist poems. It’s made with an Arduino and uses a receipt printer to write the poems, which it composes autonomously with an algorithm. You push the button, and out comes your Dadaist poem.


Dadaist poetry box design

Normally, a poem is a precious thing, the result of an intimate moment when the poet transposes onto paper the emotions of his soul. It is an inspired act, an act of concentration and transport; it’s not immediate. The poem box instead is trivial; it seems almost “anti-poem.” But it’s not; it’s a Dadaist poem. A user can push the button and have an original poem. I like the machine because it gives everyone something material to take home. In this way, the experience of interaction with the machine lasts beyond the moment.

Another of my favorite robots is one that is capable of drawing portraits. I’ve never been good at drawing, and I’ve always been envious of those who can easily use a pencil to make a portrait. So I tried using my technical skills to fill this gap.


Portrait-drawing robot

The search for an algorithm that, starting from a picture, can detect the most important lines of the face was particularly long and difficult. I used the open-source OpenCV libraries for computer vision and image processing, which are very powerful but hard to handle: installing them is not a simple undertaking, and using them is even more complicated. I used OpenCV for Processing. Processing is an open-source programming language and integrated development environment (IDE) built for the electronic arts, new media art, and visual design communities, with the purpose of teaching the fundamentals of computer programming in a visual context.

I initially tried to find the facial lines using standard edge-detection algorithms. I used the Canny edge detector, the Sobel edge detector, and all the other main edge-detection algorithms, but none proved adequate for drawing a face. Then I changed course and used a Laplacian filter with a threshold. I think I reached a good result: it takes less than 10 minutes to draw a portrait, which enables me to take pictures of people and make their portraits before they lose their patience.
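The thresholded-Laplacian idea can be illustrated without OpenCV at all. The sketch below, in Python with NumPy rather than the OpenCV-for-Processing setup described above, convolves a 3×3 Laplacian kernel and keeps only strong responses; the kernel and threshold handling are a generic textbook version, not the portrait robot’s actual code or parameters.

```python
# Thresholded Laplacian edge detection, sketched with NumPy. A generic
# textbook version, not the portrait robot's actual code or parameters.
import numpy as np

LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def laplacian_edges(img, threshold):
    """Return a binary map: 1 where |Laplacian response| exceeds threshold."""
    h, w = img.shape
    edges = np.zeros((h, w), dtype=int)
    for y in range(1, h - 1):          # skip the 1-pixel border
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            response = (patch * LAPLACIAN).sum()
            edges[y, x] = int(abs(response) > threshold)
    return edges

if __name__ == "__main__":
    # A vertical step edge: left half dark, right half bright
    img = np.zeros((5, 5)); img[:, 2:] = 255.0
    print(laplacian_edges(img, 100))
```

The Laplacian responds on both sides of an intensity step, one plausible reason it suited the portrait task better than the directional gradient detectors tried first.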

CIRCUIT CELLAR: What new technologies excite you and why?

ALESSANDRO: I work almost exclusively with Arduino microcontrollers. I was excited by the arrival of Linux-embedded mini-PCs (e.g., the Raspberry Pi, the pcDuino, and the BeagleBone Black). Naturally, I’m very intrigued by the new Arduino Tre, which joins a Linux mini-PC with an Arduino Leonardo. Combining a PC’s processing power under Linux with an Arduino’s real-time management of sensors and actuators is an interesting road. It offers the possibility of processing video streams in real time (through, for example, the OpenCV libraries) while acquiring signals from analog sensors and driving motors. This enables, for example, a completely autonomous 3-D printer that performs its own slicing and management. It also opens up new perspectives in robotics and computer vision. The main limitation of today’s embedded systems is their limited processing capacity. Having on the same board a Linux system, with its world of applications and drivers already available, linked to the ability to manage physical devices brings a revolution. I’m already excited to see the next results.

Read the complete interview in Circuit Cellar 292 (November 2014).

Book: Advanced Control Robotics

When it comes to robotics, the future is now! With the ever-increasing demand for robotics applications—from home control systems to animatronic toys to unmanned planet rovers—it’s an exciting time to be a roboticist. Whether you’re a weekend DIYer, a computer science student, or a professional engineer, you’ll find this book to be a valuable reference tool.

Advanced Control Robotics, by Hanno Sander

It doesn’t matter if you’re building a line-following robot toy or tasked with designing a mobile system for an extraterrestrial exploratory mission: the more you know about advanced robotics technologies, the better you’ll fare at your workbench. Hanno Sander’s Advanced Control Robotics (Elektor/Circuit Cellar, 2014) is intended to help roboticists of various skill levels take their designs to the next level with microcontrollers and the know-how to implement them effectively.

Advanced Control Robotics simplifies the theory and best practices of advanced robot technologies. You’re taught basic embedded design theory and presented handy code samples, essential schematics, and valuable design tips (from construction to debugging).

Sponsored by Circuit Cellar — Read the Table of Contents for Advanced Control Robotics. Ready to start learning? Purchase a copy of Advanced Control Robotics today!

You will learn about:

  • Control Robotics: robot actions, servos, and stepper motors
  • Embedded Technology: microcontrollers and peripherals
  • Programming Languages: machine level (Assembly), low level (C/BASIC/Spin), and human (12Blocks)
  • Control Structures: functions, state machines, multiprocessors, and events
  • Visual Debugging: LED/speaker/gauges, PC-based development environments, and test instruments
  • Output: sounds and synthesized speech
  • Sensors: compass, encoder, tilt, proximity, artificial markers, and audio
  • Control Loop Algorithms: digital control, PID, and fuzzy logic
  • Communication Technologies: infrared, sound, and XML-RPC over HTTP
  • Projects: line following with vision and pattern tracking

Hanno Sander at Work

About the author: Hanno Sander earned a degree in Computer Science from Stanford University, where he built one of the first hybrid cars, collaborated on a microsatellite, and studied artificial intelligence. He later founded a startup to develop customized information services and then transitioned to product marketing in Silicon Valley with Oracle, Yahoo, and Verity. Today, Hanno’s company, HannoWare, seeks to make sophisticated technology—robots, programming languages, debugging tools, and oscilloscopes—more accessible. Hanno lives in Christchurch, New Zealand, where he enjoys his growing family and focuses on his passion for improving education with technology.

Self-Reconfiguring Robotic Systems & M-Blocks

Self-reconfiguring robots are no longer science fiction. Researchers at MIT are rapidly innovating shape-shifting robotic systems. In the August 2014 issue of Circuit Cellar, MIT researcher Kyle Gilpin presents M-Blocks, which are 50-mm cubic modules capable of controlled self-reconfiguration.

The creation of autonomous machines capable of shape-shifting has been a long-running dream of scientists and engineers. Our enthusiasm for these self-reconfiguring robots is fueled by fantastic science fiction blockbusters, but it stems from the potential that self-reconfiguring robots have to revolutionize our interactions with the world around us.

Source: Kyle Gilpin


Imagine the convenience of a universal toolkit that can produce even the most specialized tool on demand in a matter of minutes. Alternatively, consider a piece of furniture, or an entire room, that could change its configuration to suit the personal preferences of its occupant. Assembly lines could automatically adapt to new products, and construction scaffolding could build itself while workers sleep. At MIT’s Distributed Robotics Lab, we are working to make these dreams into reality through the development of the M-Blocks.

The M-Blocks are a set of 50-mm cubic modules capable of controlled self-reconfiguration. Each M-Block is an autonomous robot that can not only move independently, but can also magnetically bond with other M-Blocks to form larger reconfigurable systems. When part of a group, each module can climb over and around its neighbors. Our goal is that a set of M-Blocks, dispersed randomly across the ground, could locate one another and then independently move to coalesce into a macro-scale object, like a chair. The modules could then reconfigure themselves into a sphere and collectively roll to a new location. If, in the process, the collective encounters an obstacle (e.g., a set of stairs to be ascended), the sphere could morph into an amorphous collection in which the modules climb over one another to surmount the obstacle.  Once they have reached their final destination, the modules could reassemble into a different object, like a desk.

The M-Blocks move and reconfigure by pivoting about their edges using an inertial actuator. The energy for this actuation comes from a 20,000-RPM flywheel contained within each module. Once the motor speed has stabilized, a servomotor-driven, self-tightening band brake decelerates the flywheel to a complete stop in 15 ms. All of the momentum accumulated in the flywheel is transferred to the frame of the M-Block. Consequently, the module rolls forward from one face to the next or, if the flywheel velocity is high enough, rapidly shoots across the ground or even jumps several body lengths through the air.
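Those numbers imply a violent braking event, which a quick back-of-the-envelope calculation makes concrete. Only the 20,000 RPM and 15 ms figures come from the text; the flywheel’s moment of inertia is not given here, so the value below is purely an assumed placeholder.

```python
# Back-of-the-envelope estimate of the M-Block braking torque.
# Only the 20,000 RPM and 15 ms figures come from the article;
# the moment of inertia is an ASSUMED placeholder.
import math

rpm = 20_000                 # flywheel speed (from the article)
stop_time_s = 0.015          # braking time (from the article)
inertia_kg_m2 = 5e-6         # assumed flywheel moment of inertia

omega = rpm * 2 * math.pi / 60               # angular velocity in rad/s
angular_momentum = inertia_kg_m2 * omega     # L = I * omega, dumped into the frame
avg_torque = angular_momentum / stop_time_s  # average torque = delta-L / delta-t

print(f"omega = {omega:.0f} rad/s; average braking torque = {avg_torque:.2f} N·m")
```

Whatever the true inertia, stopping in 15 ms concentrates the flywheel’s entire angular momentum into a large impulsive torque on the frame, which is what flips the cube from face to face or launches it into the air.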

While the M-Blocks are capable of independent movement, their true potential is only realized when many modules operate as a group. Permanent magnets on the outside of each M-Block serve as un-gendered connectors. In particular, each of the 12 edges holds two cylindrical magnets that are captive, but free to rotate, in a semi-enclosing cage. These magnets are polarized through their radii, not through their long axes, so as they rotate, they can present either magnetic pole. The benefit of this arrangement is that as two modules are brought together, the magnets will automatically rotate to attract. Furthermore, as one and then two additional M-Blocks are added to form a 2 × 2 grid, the magnets will always rotate to realign and accommodate the additional modules.

The same cylindrical magnets that bond neighboring M-Blocks together form excellent pivot axes, about which the modules may roll over and around one another. We have shown that the modules can climb vertically over other modules, move horizontally while cantilevered from one side, traverse while suspended from above, and even jump over gaps. The permanent magnet connectors are completely passive, requiring no control and no planning. Because all of the active components of an M-Block are housed internally, the modules could be hermetically sealed, allowing them to operate in extreme environments where other robotic systems may fail.

While we have made significant progress, many exciting challenges remain. In the current generation of modules, there is only a single flywheel, and it is fixed to the module’s frame, so the modules can only move in one direction along a straight line. We are close to publishing a new design that enables the M-Blocks to move in three dimensions, makes the system more robust, and ensures that the modules’ movements are highly repeatable. We also hope to build new varieties of modules that contain cameras, grippers, and other specialized, task-specific tools. Finally, we are developing algorithms that will allow for the coordinated control of large ensembles of hundreds or thousands of modules. With this continued development, we are optimistic that the M-Blocks will be able to solve a variety of practical challenges that are, as of yet, largely untouched by robotics.

Kyle Gilpin



Kyle Gilpin, PhD, is a Postdoctoral Associate in the Distributed Robotics Lab at the Massachusetts Institute of Technology (MIT) where he is collaborating with Professor Daniela Rus and John Romanishin to develop the M-Blocks. Kyle works to improve communication and control in large distributed robotic systems. Before earning his PhD, Kyle spent two years working as a senior electrical engineer at a biomedical device start-up. In addition to working for MIT, he owns a contract design and consulting business, Crosscut Prototypes. His past projects include developing cellular and Wi-Fi devices, real-time image processing systems, reconfigurable sensor nodes, robots with compliant SMA actuators, integrated production test systems, and ultra-low-power sensors.

Circuit Cellar 289 (August 2014) is now available.

24-Channel Digital I/O Interface for Arduino & Compatibles

SCIDYNE Corp. recently expanded its product line by developing a digital I/O interface for Arduino hardware. The DIO24-ARD makes it easy to connect to solid-state I/O racks, switches, relays, LEDs, and many other commonly used peripheral devices. Target applications include industrial control systems, robotics, IoT, security, and education.

The board provides 24 nonisolated I/O channels across three 8-bit ports. Each channel’s direction can be individually configured as either an input or an output using standard SPI library functions. Outputs can sink 85 mA at 5 V. External devices attach by means of a 50-position ribbon-cable-style header.

The DIO24-ARD features stack-through connectors with long leads, allowing systems to be built around multiple Arduino shields. It costs $38.

[Source: SCIDYNE Corp.]

Artisan’s Asylum (Somerville, MA, USA)

Artisan’s Asylum in Somerville, MA, has a mission to promote and support the teaching, learning, and practice of crafts of all varieties. Soumen Nandy, who holds the titles of Front Desk, General Volunteer, and Village Idiot at Artisan’s Asylum, told us a little bit more about it.


Photo courtesy of Artisan’s Asylum Facebook page

Location: 10 Tyler St, Somerville, MA 02143
Members: 400 active members

Tell us about your meeting space!

We have around 40,000 sq. ft. that includes more than 150 studio spaces ranging from 50 sq. ft. to 200+ sq. ft. Our storage includes: lockers, 2 x 2 x 2 rack space, 40″ x 44″ pallets (up to 10′ tall), flexspace and studios. We have a truck-loading dock and a rail stop — yup, entire trains can pull up to our back doors for delivery. Can any other Maker Space say that? We also host a large roster of formal training courses in practical technologies, trades, crafts and arts, to help our members and the general community learn skills, and increase their awesomeness. (And not incidentally: become certified to safely use our gear.)

What are you working with?

Fully equipped wood, metal, machine, robotics, electronics, and jewelry/glass shops; 12 sewing stations; and a computer lab with all major professional modeling, CNC, and simulation packages (via direct partnerships with the respective companies). Multiple types of 3-D printers, laser cutters, CNC routers, lathes, mills, etc. There’s too much more to list; if the Asylum doesn’t own or lease it, often a member, a member’s business, or an institutional member can get it for you or get you access. And yet, it’s never enough.

Are there any tools your group really wants or needs?

Quite a few things, but it’s a delicate balance between sustainable operations, growth, and space for member studios versus facilities. We’ve spun off or attracted many companies, so the empty factory complex we moved into (until recently the world’s largest envelope factory) has almost completely filled up.

Does your group work with embedded tech (Arduino, Raspberry Pi, embedded security, MCU-based designs, etc.)?

Many of our members do. The group itself is too diverse to easily characterize.

What has your group been up to?


Hanging with a giant robot

We’re not purely a technological space. We have artists, artisans, tradespeople, crafters, hobbyists, and technologists. I know of at least two million-dollar Kickstarters that launched from here. Hmmm… How about the 18-foot-wide rideable hexapod robot that’s nearing completion (we call it “Stompy”), or the 4′ × 8′ large-format laser cutter that should be operational any day now? These are just some notably big projects, not necessarily our most awesome.

Oh, wait. We did an Ides of March Festival, dressing up Union Square as a Roman Forum.

What’s the craziest project your group or group members have completed?

Well, a few weeks ago, I went home at 10 PM and woke to a tweeted photo announcing that this had been built in our social area. It’s actually not among our most surprising events, but it has reappeared several times since (fast disassembly and reassembly), and a reporter caught it once. I just happened to receive this link a couple of hours ago, so it was handy to forward to you. We do a lot of art and participation projects around Boston.

What does Artisan’s social calendar look like these days?

Too many events to list! We’re really looking to stabilize our base and seek congruent funding donors (we are a nonprofit, but thus far have mostly run on internally earned income). I’d be happy to arrange an interview with one of our honchos if you like—the goings-on around here are really too much to fit in one brain. Those of us who give tours actually regularly take each other’s tours to learn things about the place we never knew.

What would you like to say to fellow hackers out there?

Keep getting awesomer. We love you!

Also, any philanthropists out there? Our members and facilities could be an excellent way to multiply your awesome impact.

Keep up with Artisan’s Asylum! Check out their website!

Show us your hackerspace! Tell us about your group! Where does your group design, hack, create, program, debug, and innovate? Do you work in a 20′ × 20′ space in an old warehouse? Do you share a small space in a university lab? Do you meet at a local coffee shop or bar? What sort of electronics projects do you work on? Submit your hackerspace and we might feature you on our website!

Robotics, Hardware Interfacing, and Vintage Electronics

Gerry O’Brien, a Toronto-based robotics and electronics technician at R.O.V. Robotics, enjoys working on a variety of projects in his home lab. His projects are largely driven by his passion for electronics hardware interfacing.

Gerry’s background includes working at companies such as Allen-Vanguard Corp., which builds remotely operated vehicle (ROV) robots and unmanned ground vehicles (UGVs) for military and police bomb disposal units worldwide. “I was responsible for the production, repair, programming and calibration of the robot control consoles, VCU (vehicle control unit) and the wireless communication systems,” he says.

Gerry recently sent Circuit Cellar photos of his home-based electronics and robotics lab. (More images are available on his website.) This is how he describes the lab’s layout and equipment:

In my lab I have various designated areas with lab benches that I acquired from the closing of a local Nortel R&D office more than 10 years ago.

All of my electronics benches have ESD mats and grounded wrist straps. All of my test gear I have purchased on eBay over the years.

PCB flip-rack

To start, I have my “Electronics Interfacing Bench” with a PCB flip-rack, which allows me to interface PCBs while they are powered (in-system testing). I can connect my Tektronix TLA715 logic analyzer and various other test equipment to the boards under test. My logic analyzer currently has two logic I/O modules with 136 channels each, so combined I have 272 channels for logic analysis. I also have a four-channel digital oscilloscope module for this machine. I can expand this even further with my newly acquired expansion box, which allows me to interface many more modules to the logic analyzer mainframe.

Gerry’s lab bench

Gerry recently upgraded his Tektronix logic analyzer with an expansion box.

Logic analyzer interface probes

I also have a soldering bench where I have all of my soldering gear, including a hot-air rework station and 90x dissecting microscope with a video interface.

Dissecting microscope with video interface

My dedicated robotics bench holds several robotic arm units, Scorbot and CRS robots, with their controllers and pneumatic interface control boards.

Robotics bench and CRS robot

On my testing bench, I currently have an Agilent/HP 54610B 500-MHz oscilloscope with a GPIB-to-RS-232 adapter for image capture. I also have an Advantest R3131A spectrum analyzer (9-kHz-to-3-GHz bandwidth), a Tektronix AFG3021 function generator, an HP/Agilent 34401A multimeter, and an HP four-channel programmable power supply. For the HP power supply, I built a display panel with four separate voltage-output LCDs so I can monitor all four outputs simultaneously. The stock monochrome LCD on the HP unit itself is very small and dim and shows only one output at a time.

Anyhow, my current testing bench setup will allow me to perform various signal mapping and testing on chips with a large pin count, such as the older Altera MAX9000 208-pin CPLDs and many others that I enjoy working with.

The testing bench

And last but not least, I have my programming and interfacing bench, dedicated to VHDL programming, PCB design, FPGA hardware programming (JTAG), memory programming (EEPROM and flash), web design, and video editing.

Interfacing bench and “octo-display”

I built a PC using separate graphics display cards, one of them an older Matrox four-port SVGA card, to create an “octo-display” setup. It seamlessly spans eight monitors, providing a total screen resolution of 6,545 × 1,980 pixels.

If you care to see how my monitor mounting assembly was built, I have posted pictures of its construction here.

A passion for electronics interfacing drives Gerry’s work:

I love projects that involve hardware interfacing. My focus is on electronics hardware rather than software programming, which is one of the reasons I have concentrated on VHDL (a hardware description language) for FPGAs and CPLDs.

I leave the computer-software programming of GUIs to others. I usually team up with other hobbyists who have more of a knack for the software side of things. They usually prefer to leave the electronics design and hardware production to someone else anyhow, so it is a mutual arrangement.

I love to design and build projects involving vintage Altera CPLDs and FPGAs, such as the MAX7000 and MAX9000 series. Over the years, I have managed to collect a large arsenal of vintage Altera programming hardware from the late ’80s and early ’90s, mainly for the Altera master programming unit (MPU) released in the early ’90s. I have been building up a collection of programming adapters for this system; certain models are very hard to find. Because of the rarity of this Altera programming system, I am currently designing my own custom adapter interface that will essentially allow me to connect any compatible Altera component to the system without needing each unique adapter: a custom-made universal adapter, essentially. It’s not too complicated really; it’s just a lot of fun to build, and then I have the glory of trying out other components.

I love to design, build, and program FPGA projects using the VHDL hardware description language, and also to interface with external memory and sensors. I have a devoted website and YouTube channel where I post hardware repair and instructional videos for many of my electronics projects. Each project has a devoted webpage with the instructional videos, written procedures, and other related information. The videos range from “Robotic Arm Repair” to a “DIY SEGA Game Gear Flash Cartridge” project. I even have VHDL software tutorials.

The most recent project I shared on my website helps students dive into a VHDL-based VGA Pong game using the Altera DE1 development board.


Client Profile: Integrated Knowledge Systems

Integrated Knowledge Systems’ NavRanger board

Phoenix, AZ

CONTACT: James Donald,

EMBEDDED PRODUCTS: Integrated Knowledge Systems provides hardware and software solutions for autonomous systems.
FEATURED PRODUCT: The NavRanger-OEM is a single-board high-speed laser ranging system with a nine-axis inertial measurement unit for robotic and scanning applications. The system provides 20,000 distance samples per second with a 1-cm resolution and a range of more than 30 m in sunlight when using optics. The NavRanger also includes sufficient serial, analog, and digital I/O for stand-alone robotic or scanning applications.

The NavRanger uses USB, CAN, RS-232, analog, or wireless interfaces for operation with a host computer. Integrated Knowledge Systems can work with you to provide software, optics, and scanning mechanisms to fit your application. Example software and reference designs are available on the company’s website.

EXCLUSIVE OFFER: Enter the code CIRCUIT2014 in the “Special Instructions to Seller” box at checkout and Integrated Knowledge Systems will take $20 off your first order.


Circuit Cellar prides itself on presenting readers with information about innovative companies, organizations, products, and services relating to embedded technologies. This space is where Circuit Cellar enables clients to present readers useful information, special deals, and more.

Q&A: Robotics Mentor and Champion

Peter Matteson, a Senior Project Engineer at Pratt & Whitney in East Hartford, CT, has a passion for robotics. We recently discussed how he became involved with mentoring a high school robotics team, the types of robots the team designs, and the team’s success.—Nan Price, Associate Editor


NAN: You mentor a FIRST (For Inspiration and Recognition of Science and Technology) robotics team for a local high school. How did you become involved?

Peter Matteson

PETER: I became involved in FIRST in late 2002 when one of my fraternity brothers who I worked with at the time mentioned that FIRST was looking for new mentors to help the team the company sponsored. I was working at what was then known as UTC Power (sold off to ClearEdge Power Systems last year) and the company had sponsored Team 177 Bobcat Robotics since 1995.

After my first year mentoring the kids and experiencing the competition, I got hooked. I loved the competition and strategy of solving a new game each year and designing and building a robot. I enjoyed working with the kids, teaching them how to design and build mechanisms and strategize the games.

The FIRST team’s 2010 robot is shown.

A robot’s articulating drive train is tested on an obstacle (bump) at the 2010 competition.

NAN: What types of robots has your team built?

A temporary control board was used to test the drive base at the 2010 competition.

PETER: Every robot we make is purposely built for a specific game the year we build it. The robots have varied from arm robots with a 15’ reach to catapults that launch a 40” diameter ball, to Frisbee throwers, to Nerf ball shooters.

They have varied in drive train from 4 × 4 to 6 × 6 to articulating 8 × 8. Their speeds have varied from 6 to 16 fps.

NAN: What types of products do you use to build the robots? Do you have any favorites?

PETER: We use a variant of the Texas Instruments (TI) cRIO electronics kit for the controller, as is required per the FIRST competition rules. The motors and motor controllers we use are also mandated to a few choices. We prefer VEX Robotics VEXPro Victors, but we also design with the TI Jaguar motor controllers. For the last few years, we used a SparkFun CMUcam webcam for the vision system. We build with Grayhill encoders, various inexpensive limit switches, and gyro chips.

The team designed a prototype minibot.

For pneumatics we utilize compressors from Thomas and VIAIR. Our cylinders are primarily from Bimba, but we also use Parker and SMC. For valves we use SMC and Festo. We usually design with Clippard plastic or stainless accumulator tanks. Our gears and transmissions come from AndyMark, VEX Robotics’s VEXPro, and BaneBots.

The AndyMark shifter transmissions were a mainstay of ours until last year when we tried the VEXPro transmissions for the first time. Over the years, we have utilized many of the planetary transmissions from AndyMark, VEX Robotics, and BaneBots. We have had good experience with all the manufacturers. BaneBots had a shaky start, but it has vastly improved its products.

We have many other odds and ends we’ve discovered over the years for specific needs of the games. Those are a little harder to describe because they tend to be very specific, but urethane belting is useful in many ways.

NAN: Has your team won any competitions?

Peter’s FIRST team is pictured at the 2009 championship at the Georgia Dome in Atlanta, GA. (Peter is standing fourth from the right.)

PETER: My team is considered one of the most successful in FIRST. We have won four regional-level competitions. We have always shined at the competition’s championship level, where the 400 teams from the nine-plus countries that qualify vie for the championship.

In my years on the team, we have won the championship twice (2007 and 2010), been the championship finalist once (2011), won our division, made the final four a total of six times (2006–2011), and were division finalists in 2004.

A FIRST team member works on a robot “in the pits” at the 2011 Hartford, CT, regional competition.

Team 177 was the only team to make the final four more than three years in a row, setting the bar at six consecutive trips. It was also the only team to make seven trips to the final four, including in 2001.

NAN: What is your current occupation?

PETER: I am a Senior Project Engineer at Pratt & Whitney. I oversee and direct a team of engineers designing components for commercial aircraft propulsion systems.

NAN: How and when did you become interested in robotics?

PETER: I have been interested in robotics for as long as I can remember. The tipping point was probably when I took an industrial robotics course in college. That was when I really developed a curiosity about what I could do with robots.

The industrial robotics course started with programming robots for basic tasks. We had a welding robot that we taught the weld path; it determined on its own how to move between points.

We also worked with programming a robot to install light bulbs and then determine if the bulbs were working properly.

In addition to practical labs such as those, we also had to design the optimal robot for painting a car and figure out how to program it. We basically had to come up with a proposal for how to design and build the robot from scratch.

This robot from the 2008 competition holds a 40” diameter ball for size reference.

NAN: What advice do you have for engineers or students who are designing robots or robotic systems?

PETER: My advice is to clearly set your requirements at the beginning of the project and then do some research into how other people have accomplished them. Use that inspiration as a stepping-off point. From there, you need to build a prototype. I like to use wood, cardboard, and other materials to build prototypes. After this you can iterate to improve your design until it performs exactly as expected.

Ohio-Based “Design Dungeon”

“Steve Ciarcia had a ‘Circuit Cellar.’ I have a ‘Design Dungeon,’” Steve Lubbers says about his Dayton, OH-based workspace.

“An understanding wife and a spare room in the house provided a nice place for a workshop. Too bad the engineer doesn’t keep it nice and tidy! I am amazed by the nice clean workspaces that have previously been published! So for those of you who need a visit from FEMA, don’t feel bad. Take a look at my mess.”

Steve Lubbers describes his workbench as a “work in progress.”

The workspace is a creative mess that has produced dozens of projects for Circuit Cellar contests. From the desk to the floor to the closet, the space is stocked with equipment and projects in various stages.

Lubbers writes:

The doorway is marked “The Dungeon.” The first iteration of The Dungeon was in my parents’ basement. When I bought a house, the workshop and the sign moved to my new home.

The door is a requirement when company comes to visit. Once you step inside, you will see why. The organizational plan seems to be a pile for everything, and everything in a pile. Each new project seems to reduce the amount of available floor space.


Lubbers’s organization plan is simple: “A pile for everything, and everything in a pile.”

“High-tech computing” is accomplished on a PDP-11/23. This boat anchor still runs to heat the room, but my iPod has more computing abilities! My nieces and nephews don’t really believe in 8” disks, but I have proof.

The desk (messy of course) holds a laptop computer and a ham radio transceiver. Several of my Circuit Cellar projects have been related to amateur radio. A short list of my ham projects includes a CW keyer, an antenna controller, and a PSK-31 (digital communications) interface.


Is there a desk under there?

My workbench has a bit of clear space for my latest project and fragments of previous projects are in attendance. The skull in the back right is wearing the prototype for my Honorable Mention in the Texas Instruments Design Stellaris 2010 contest. It’s a hands-free USB mouse. The red tube was the fourth-place winner in the microMedic 2013 National Contest.

Front and center is the prototype for my March 2014 Circuit Cellar article on robotics. Test equipment is a mix of old and new. Most of the newer equipment has been funded by past Circuit Cellar contests and articles.


“My wife allows my Hero Jr. robot to visit the living room. He is housebroken after all,” Lubbers says.

The closet is a “graveyard” for all of the contest kits I have received, models I would like to build, and other contraptions the wife doesn’t allow to invade the rest of the house. (She is pretty considerate because you will find my Hero Jr. robot in the living room.)

At one time, The Dungeon served as my home office. For about five years I had the ideal “down the hall” commute. A stocked lab helped justify my ability to work from home.

When management pulled the plug on working remotely, the lab got put to work developing about a dozen projects for Circuit Cellar contests. There has been a dry spell since my last contest entry, so these days I am helping develop the software for the ham radio Satellite FOX-1. My little “CubeSat” will operate as a ham radio transponder and a platform for university experiments when it launches in late 2014. Since I will probably never go to space myself, the next best thing is launching my code into orbit. It’s a good thing that FOX-1 is smaller than a basketball. If it was bigger, it might not fit on my workbench!

Lubbers’s article about building a swarm of robots will appear in Circuit Cellar’s March issue. To learn more about Lubbers, read our 2013 interview.

Q&A: Hacker, Roboticist, and Website Host

Dean “Dino” Segovis is a self-taught hardware hacker and maker from Pinehurst, NC. In 2011, he developed the Hack A Week website, where he challenges himself to create and post weekly DIY projects. Dino and I recently talked about some of his favorite projects and products. —Nan Price, Associate Editor


NAN: You have been posting a weekly project on your website, Hack A Week, for almost three years. Why did you decide to create the website?

Dean “Dino” Segovis at his workbench

DINO: One day on the Hack A Day website I saw a post that caught my attention. It was seeking a person to fill a potential position as a weekly project builder and video blogger. It was offering a salary of $35,000 a year, which was pretty slim considering you had to live in Santa Monica, CA. I thought, “I could do that, but not for $35,000 a year.”

That day I decided I was going to challenge myself to come up with a project and video each week and see if I could do it for at least one year. I came up with a simple domain name, bought it, and put up a website within 24 h.

My first project was a 555 timer-based project that I posted on April 1, 2011, on my YouTube channel, “Hack A Week TV.” I made it through the first year and just kept going. I currently have more than 3.2 million video views and more than 19,000 subscribers from all over the world.

NAN: Hack A Week features quite a few robotics projects. How are the robots built? Do you have a favorite?

Dino’s very first toy robot hack was the Rumble robot. The robot featured an Arduino that sent PWM to the on-board H-bridge in the toy to control the motors for tank steering. A single PING))) sensor helped with navigation.

The Rumble robot

DINO: I usually use an Arduino as the robot’s controller and Roomba gear motors for locomotion. I have built a few others based on existing wheeled motorized toys and I’ve made a few with the Parallax Propeller chip.

My “go-to” sensor is usually the Parallax PING))) ultrasonic sensor. It’s easy to connect and work with and the code is straightforward. I also use bump sensors, which are just simple contact switches, because they mimic the way some insects navigate.
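The PING))) reports range as the width of an echo pulse; Parallax’s rule of thumb is roughly 58 µs of round-trip echo time per centimeter. A minimal sketch of that conversion (the function names, threshold, and sample values are illustrative, not taken from Dino’s code):

```python
def echo_to_cm(echo_us: float) -> float:
    """Convert a PING))) round-trip echo pulse width (in microseconds)
    to distance in centimeters.

    Sound travels about 343 m/s at room temperature, so a round trip
    costs roughly 58 us per centimeter of distance to the target.
    """
    return echo_us / 58.0


def obstacle_ahead(echo_us: float, threshold_cm: float = 30.0) -> bool:
    """True if the measured range is inside the avoidance threshold."""
    return echo_to_cm(echo_us) < threshold_cm


# A 5,800 us echo corresponds to about 100 cm.
print(echo_to_cm(5800))       # 100.0
print(obstacle_ahead(580))    # True (about 10 cm away)
```

On a real Arduino the pulse width would come from `pulseIn()` on the sensor’s signal pin; the conversion is the same.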

Nature is a great designer and much can be learned from observing it. I like to keep my engineering simple because it’s robust and easy to repair. The more you complicate a design, the more it can do. But it also becomes more likely that something will fail. Failure is not a bad thing if it leads to a better design that overcomes the failure. Good design is a balance of these things. This is why I leave my failures and mistakes in my videos to show how I arrive at the end result through some trial and error.

My favorite robot would be “Photon: The Video and Photo Robot” that I built for the 2013 North Carolina Maker Faire. It’s my masterpiece robot…so far.

NAN: Tell us a little more about Photon. Did you encounter any challenges while developing the robot?

Photon awaits with cameras rolling, ready to go forth and record images.

DINO: The idea for Photon first came to me in February 2013. I had been playing with the Emic 2 text-to-speech module from Parallax and I thought it would be fun to use it to give a robot speech capability. From there the idea grew to include cameras that would record and stream to the Internet what the robot saw and then give the robot the ability to navigate through the crowd at Maker Faire.

I got a late start on the project and ended up burning the midnight oil to get it finished in time. One of the bigger challenges was in designing a motorized base that would reliably move Photon across a cement floor.

The problem was in dealing with elevation changes on the floor covering. What if Photon encountered a rug or an extension cord?

I wanted to drive it with two gear motors salvaged from a Roomba 4000 vacuum robot to enable tank-style steering. A large round base with a caster at the front and rear worked well, but it would only enable a small change in surface elevation. I ended up using that design and made sure that it stayed away from anything that might get it in trouble.

The next challenge was giving Photon some sensors so it could navigate and stay away from obstacles. I used one PING))) sensor mounted on its head and turned the entire torso into a four-zone bump sensor, along with a ring around the base. The ring pushed on a series of 42 momentary contact switches connected together in four zones. All these sensors were connected to an Arduino running some simple code that turned Photon away from obstacles it encountered. Power was supplied by a motorcycle battery mounted on the base inside the torso.
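The avoidance behavior described above can be sketched as a simple priority rule: a bump contact means the robot is already touching something, so it outranks the ultrasonic range reading. A hypothetical version of that decision logic (the zone names, commands, and threshold are illustrative, not Photon’s actual code):

```python
def avoid(front: bool, left: bool, right: bool, rear: bool,
          ping_cm: float, clear_cm: float = 40.0) -> str:
    """Return a drive command for a robot with a four-zone bump ring
    and a head-mounted ultrasonic range sensor.

    Bump contacts take priority over the range reading, since a
    closed switch means contact has already happened.
    """
    if front:
        return "reverse"
    if left:
        return "turn_right"
    if right:
        return "turn_left"
    if rear:
        return "forward"
    # No contact: turn away before the ultrasonic range closes.
    if ping_cm < clear_cm:
        return "turn_left"
    return "forward"


print(avoid(False, False, False, False, 120.0))  # forward
print(avoid(True, False, False, False, 120.0))   # reverse
```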

The head held two video cameras, two smartphones in camera mode, and one GoPro camera. One video camera and the GoPro were recording in HD; the other video camera was recording in time-lapse mode. The two smartphones streamed live video, one via 4G to a Ustream channel and the other via Wi-Fi. The Ustream worked great, but the Wi-Fi failed due to interference.

Photon’s voice came from the Emic 2 connected to another Arduino sending it lines of text to speak. The audio was amplified by a small 0.5-W LM386 amplifier driving a 4” speaker. An array of blue LEDs mounted on the head illuminated when Photon spoke, its brightness modulated by the audio signal. The speech was just a lot of lines of text running in a timed loop.

Photon’s brain includes two Arduinos and an LM386 0.5-W audio amplifier with a sound-to-voltage circuit added to drive the mouth LED array. Photon’s voice comes from a Parallax Emic 2 text-to-speech module.

Connecting all of these things together was very challenging. Each component needed a regulated power supply, which I built using LM317T voltage regulators. The entire current draw with motors running was about 1.5 A. The battery lasted about 1.5 h before needing a recharge. I had an extra battery so I could just swap them out during the quick charge cycle and keep downtime to a minimum.
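Each LM317T’s output is set by a pair of resistors: Vout ≈ 1.25 V × (1 + R2/R1), plus a small term from the roughly 50-µA adjust-pin current. A quick check of that resistor math (the component values are generic examples, not the ones Dino used):

```python
def lm317_vout(r1_ohms: float, r2_ohms: float,
               vref: float = 1.25, i_adj: float = 50e-6) -> float:
    """Output voltage of an LM317 adjustable regulator.

    vref is the reference between OUT and ADJ (typically 1.25 V);
    i_adj (~50 uA) flows out of the ADJ pin through r2.
    """
    return vref * (1.0 + r2_ohms / r1_ohms) + i_adj * r2_ohms


# 240 ohms / 720 ohms gives about 5 V.
print(round(lm317_vout(240, 720), 2))   # 5.04
```

The adjust-pin term is usually negligible; with R2 shorted the output sits at the 1.25-V reference.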

I finished the robot around 11:00 PM the night before the event. It was a hit! The videos Photon recorded are fascinating to watch. The look of wonder on people’s faces, the kids jumping up to see themselves in the monitors, the smiles, and the interaction are all very interesting.

NAN: Many of your Hack A Week projects include Parallax products. Why Parallax?

DINO: Parallax is a great electronics company that caters to the DIY hobbyist. It has a large knowledge base on its website as well as a great forum with lots of people willing to help and share their projects.

About a year ago Parallax approached me with an offer to supply me with a product in exchange for featuring it in my video projects on Hack A Week. Since I already used and liked the product, it was a perfect offer. I’ll be posting more Parallax-based projects throughout the year and showcasing a few of them on the ELEV-8 quadcopter as a test platform.

NAN: Let’s change topics. You built an Electronic Fuel Injector Tester, which is featured on your website. Can you explain how the 555 timer chips are used in the tester?

DINO: 555 timers are great! They can be used in so many projects in so many ways. They’re easy to understand and use and require only a minimum of external components to operate and configure.

The 555 can run in two basic modes: monostable and astable.

Dino keeps this fuel injector tester in his tool box at work. He’s a European auto technician by day.

An astable circuit produces a square wave. This is a digital waveform with sharp transitions between low (0 V) and high (+ V). The durations of the low and high states may be different. The circuit is called astable because it is not stable in any state: the output is continually changing between “low” and “high.”

A monostable circuit produces a single output pulse when triggered. It is called a monostable because it is stable in just one state: “output low.” The “output high” state is temporary.

The injector tester, which is a monostable circuit, is triggered by pressing the momentary contact switch. The single-output pulse turns on an astable circuit that outputs a square-wave pulse train that is routed to an N-channel MOSFET. The MOSFET turns on and off and outputs 12 V to the injector. A flyback diode protects the MOSFET from the electrical pulse that comes from the injector coil when the power is turned off and the field collapses. It’s a simple circuit that can drive any injector up to 5 A.
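The timing relations behind both modes are the standard 555 design equations: a monostable pulse lasts T ≈ 1.1·R·C, while an astable oscillates at f ≈ 1.44 / ((R1 + 2·R2)·C) with a high-time fraction of (R1 + R2)/(R1 + 2·R2). A small sketch of the math (the example values are illustrative, not taken from the tester):

```python
def monostable_pulse_s(r_ohms: float, c_farads: float) -> float:
    """One-shot pulse width of a 555 in monostable mode: T = 1.1 * R * C."""
    return 1.1 * r_ohms * c_farads


def astable_freq_hz(r1_ohms: float, r2_ohms: float, c_farads: float) -> float:
    """Oscillation frequency of a 555 in astable mode:
    f = 1.44 / ((R1 + 2*R2) * C)."""
    return 1.44 / ((r1_ohms + 2.0 * r2_ohms) * c_farads)


def astable_duty(r1_ohms: float, r2_ohms: float) -> float:
    """Fraction of each cycle the output spends high:
    (R1 + R2) / (R1 + 2*R2)."""
    return (r1_ohms + r2_ohms) / (r1_ohms + 2.0 * r2_ohms)


# 10 kohms and 100 nF hold a one-shot output high for ~1.1 ms.
print(monostable_pulse_s(10e3, 100e-9))
```

Because R2 appears twice in the denominator, a plain astable can never reach a 50% duty cycle; it always stays above it.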

This is a homebrew PCB for Dino’s fuel injector tester. Two 555s drive a MOSFET that switches the injector.

NAN: You’ve been “DIYing” for quite some time. How and when did your interest begin?

DINO: It all started in 1973 when I was 13 years old. I used to watch a TV show on PBS called ZOOM, which was produced by WGBH in Boston. Each week they had a DIY project they called a “Zoom-Do,” and one week the project was a crystal radio. I ordered the Zoom-Do instruction card and set out to build one. I got everything put together but it didn’t work! I checked and rechecked everything, but it just wouldn’t work.

I later realized why. The instructions said to use a “cat’s whisker,” which I later found out was a thin piece of wire. I used a real cat’s whisker clipped from my cat! Anyway, that project sparked something inside me (pun intended). I was hooked! I started going house to house asking people if they had any broken or unwanted radios and or TVs I could have so I could learn about electronics and I got tons of free stuff to mess with.

My mom and dad were pretty cool about letting me experiment with it all. I was taking apart TV sets, radios, and tape recorders in my room and actually fixing a few of them. I was in love with electronics. I had an intuition for understanding it. I eventually found some ham radio guys who were great mentors and I learned a lot of good basic electronics from them.

NAN: Is there a particular electronics engineer, programmer, or designer who has inspired the work you do today?

DINO: Forrest Mims was a great inspiration in my early 20s. I got a big boost from his “Engineer’s Notebooks.” The simple way he explained things and his use of graph paper to draw circuit designs really made learning about electronics easy and fun. I still use graph paper to draw my schematics during the design phase and for planning when building a prototype on perf board. I’m not interested in any of the software schematic programs because most of my projects are simple and easy to draw. I like my pencil-and-paper approach.

NAN: What was the last electronics-design related product you purchased and what type of project did you use it with?

DINO: An Arduino Uno. I used two of these in the Photon robot.

NAN: What new technologies excite you and why?

DINO: Organic light-emitting diodes (OLEDs). They’ll totally change the way we manufacture and use digital displays.

I envision a day when you can go buy your big-screen TV that you’ll bring home in a cardboard tube, unroll it, and place it on the wall. The processor and power supply will reside on the floor, out of the way, and a single cable will go to the panel. The power consumption will be a fraction of today’s LCD or plasma displays and they’ll be featherweight by comparison. They’ll be used to display advertising on curved surfaces anywhere you like. Cell phone displays will be curved and flexible.

How about a panoramic set of virtual reality goggles or a curved display in a flight simulator? Once the technology gets out of the “early adopter” phase, prices will come down and you’ll own that huge TV for a fraction of what you pay now. One day we might even go to a movie and view it on a super-huge OLED panorama screen.

NAN: Final question. If you had a full year and a good budget to work on any design project you wanted, what would you build?

DINO: There’s a project I’ve wanted to build for some time now: A flight simulator based on the one used in Google Earth. I would use a PC to run the simulator and build a full-on seat-inside enclosure with all the controls you would have in a jet airplane. There are a lot of keyboard shortcuts for a Google flight simulator that could be triggered by switches connected to various controls (e.g., rudder pedals, flaps, landing gear, trim tabs, throttle, etc.). I would use the Arduino Leonardo as the controller for the peripheral switches because it can emulate a USB keyboard. Just program it, plug it into a USB port along with a joystick, build a multi-panel display (or use that OLED display I dream of), and go fly!

Google Earth’s flight simulator also lets you fly over the surface of Mars! Not only would this be fun to build and fly, it would also be a great educational tool. It’s definitely on the Hack A Week project list!

Editor’s Note: This article also appears in Circuit Cellar’s upcoming March issue, which focuses on robotics. The March issue will soon be available for membership download or single-issue purchase.