AAR Arduino Autonomous Mobile Robot

The AAR Arduino Robot is a small autonomous mobile robot designed both for those new to robotics and for experienced Arduino designers. The robot is well suited for hobbyists and school projects. Built on the Arduino open-source prototyping platform, the robot is easy to program and run.

The AAR, which is delivered fully assembled, comes with a comprehensive CD that includes all the software needed to write, compile, and upload programs to your robot. It also includes firmware and hardware self-tests. For wireless control, the robot offers optional Bluetooth and 433-MHz RF modules.

The AAR robot’s features include an Atmel ATmega328P 8-bit AVR RISC processor with a 16-MHz clock, Arduino open-source software, two independently controlled 3-VDC motors, an I2C bus, 14 digital I/Os on the processor, eight analog input lines, a USB programming interface, odometer sensors on both wheels, a line-tracker sensor, and an ISP connector for bootloader programming.

The AAR’s many example programs help you get your robot up and running. With many expansion kits available, your creativity is unlimited.

Contact Global Specialties for pricing.

Global Specialties
http://globalspecialties.com

Using Socially Assistive Robots to Address the Caregiver Gap

David Feil-Seifer

Editor’s Note: David Feil-Seifer, a Postdoctoral Fellow in the Computer Science Department at Yale University, wrote this essay for Circuit Cellar. Feil-Seifer focuses his research on socially assistive robotics (SAR), particularly the study of human-robot interaction for children with autism spectrum disorders (ASD). His dissertation work addressed autonomous robot behavior so that socially assistive robots can recognize and respond to a child’s behavior in unstructured play. He recently was hired as Assistant Professor of Computer Science at the University of Nevada, Reno.

There are looming health care and education crises on the horizon. Baby boomers are getting older and requiring more care, which puts pressure on caregivers. The US nursing shortage is projected to worsen. Similarly, the rapid growth of diagnoses of developmental disorders suggests a greater need for educators, one that the education system is struggling to meet. These great and growing shortfalls in the number of caregivers and educators may be addressed (in part) through the use of socially assistive robotics.

In health care, non-contact repetitive tasks make up a large part of a caregiver’s day. Tasks such as monitoring instruments only require a check to verify that readings are within norms. By offloading these tasks to an automated system, a nurse or doctor could spend more time doing work that better leverages their medical training. A robot can effectively perform simple repetitive tasks (e.g., monitoring breath spirometry exercises or post-stroke rehabilitation compliance).

I coined the term “socially assistive robotics” (SAR) to describe robots that provide such assistance through social rather than physical interaction. My research is the development of SAR algorithms and complete systems relevant to domains such as post-stroke rehabilitation, elder care, and therapeutic interaction for children with autism spectrum disorders (ASD). A key challenge for such autonomous SAR systems is the ability to sense, interpret, and properly respond to human social behavior.

One of my research priorities is developing a socially assistive robotic system for children with ASD. Children with ASD are characterized by social impairments, communication difficulties, and repetitive and stereotyped behaviors. Significant anecdotal evidence indicates that some children with ASD respond socially to robots, which could have therapeutic ramifications. We envision a robot that could act as a catalyst for social interaction, both human-robot and human-human, thus aiding ASD users’ human-human socialization. In such a scenario, the robot is not specifically generating social behavior or participating in social interaction, but instead behaves in a way known to provoke human-human interaction.

David Feil-Seifer developed an autonomous robot that recognizes and appropriately responds to a child’s free-form behavior in play contexts, similar to those seen in some more traditional autism spectrum disorder (ASD) therapies.

Enabling a robot to exhibit and understand social behavior with a child is challenging. Children are highly individual and thus technology used for social interaction needs to be robust to be effective. I developed an autonomous robot that recognizes and appropriately responds to a child’s free-form behavior in play contexts, similar to those seen in some more traditional ASD therapies.

To detect and mitigate child distress, I developed a methodology for learning and then applying a data-driven spatiotemporal model of social behavior based on distance-based features to automatically differentiate between typical vs. aversive child-robot interactions. Using a Gaussian mixture model learned over distance-based feature data, the developed system was able to detect and interpret social behavior with sufficient accuracy to recognize child distress. The robot can use this to change its own behavior to encourage positive social interaction.

To encourage human-human interaction once human-robot interaction was achieved, I developed a navigation planner that used the above spatiotemporal model. This was used to maintain the robot’s spatial relationship with a child to sustain interaction while also guiding the child to a particular location in a room. This could be used to encourage a child to move toward another interaction partner (e.g., a parent). The desired spatial interaction behavior is achieved by modifying an established trajectory planner to weigh candidate trajectories based on conformity to a trained model of the desired behavior.

I also developed a methodology for robot behavior that provides autonomous feedback for a robot-child imitation and turn-taking game. This was accomplished by incorporating an established therapeutic model of feedback along with a trained model of imitation behavior. This is used as part of an autonomous system that can play Simon Says, recognize when the rules have been violated, and provide appropriate feedback.

A growing body of data supports the hypothesis that robots have the potential to aid in addressing people’s needs through non-contact assistance. My research, along with that of many others, has resulted in technical advances for robots providing assistance to people. However, there is a long way to go before these systems can be deployed as therapeutic platforms. Given that the beneficiary populations are growing, and that therapeutic needs are increasing far more rapidly than the existing resources to address them, SAR could provide lasting benefits to people in need.

David Feil-Seifer, a Postdoctoral Fellow in the Computer Science Department at Yale University, focuses his research on socially assistive robotics (SAR), particularly the study of human-robot interaction for children with autism spectrum disorders (ASD). His dissertation work addressed autonomous robot behavior so that socially assistive robots can recognize and respond to a child’s behavior in unstructured play. David received his MS and PhD in Computer Science from the University of Southern California and a BS in Computer Science from the University of Rochester, NY. He recently was hired as Assistant Professor of Computer Science at the University of Nevada, Reno.

Electrostatic Cleaning Robot Project

How do you clean a clean-energy generating system? With a microcontroller (and a few other parts, of course). An excellent example is US designer Scott Potter’s award-winning, Renesas RL78 microcontroller-based Electrostatic Cleaning Robot system that cleans heliostats (i.e., solar-tracking mirrors) used in solar energy-harvesting systems. Renesas and Circuit Cellar magazine announced this week at DevCon 2012 in Garden Grove, CA, that Potter’s design won First Prize in the RL78 Green Energy Challenge.

This image depicts two Electrostatic Cleaning Robots set up on two heliostats. (Source: S. Potter)

The nearby image depicts two Electrostatic Cleaning Robots mounted vertically so they can sweep across the two heliostats horizontally, left to right and back again.

The Electrostatic Cleaning Robot in place to clean

Potter’s design can quickly clean heliostats in Concentrating Solar Power (CSP) plants. The heliostats must be clean in order to maximize steam production, which generates power.

The robot cleaner prototype

Built around an RL78 microcontroller, the Electrostatic Cleaning Robot provides a reliable cleaning solution that’s powered entirely by photovoltaic cells. The robot traverses the surface of the mirror and uses a high-voltage AC electric field to sweep away dust and debris.

Parts and circuitry inside the robot cleaner

Object-oriented C++ software, developed with the IAR Embedded Workbench and the RL78 Demonstration Kit, controls the device.

IAR Embedded Workbench IDE

The RL78 microcontroller uses the following for system control:

• 20 digital I/Os as system control lines
• One ADC channel to monitor the solar-cell voltage
• One interval timer to provide the controller time tick
• A timer array unit (four timers) to capture the width of sensor pulses
• A watchdog timer for system reliability
• Low-voltage detection for reliable operation in intermittent solar conditions
• An RTC for time-stamping diagnostic logs
• One UART for diagnostics
• Flash memory for storing diagnostic logs

The complete project (description, schematics, diagrams, and code) is now available on the Challenge website.


Autonomous Mobile Robot (Part 2): Software & Operation

I designed a microcontroller-based mobile robot that can cruise on its own, avoid obstacles, escape from inadvertent collisions, and track a light source. In the first part of this series, I introduced my TOMBOT robot’s hardware. Now I’ll describe its software and how to achieve autonomous robot behavior.

Autonomous Behavior Model Overview
The TOMBOT is a minimalist system with just enough components to demonstrate some simple autonomous behaviors: Cruise, Escape, Avoid, and Home (see Figure 1). All of the behaviors require the left and right servos for maneuverability. In general, “Cruise” just keeps the robot in motion in the absence of any stimulus. “Escape” uses the bumper to sense a collision and then performs a 180° spin after backing up. “Avoid” uses the continuous forward-looking IR sensors to veer left or right upon approaching a close obstacle. Finally, “Home” uses the front photocells to guide the robot toward a strong, highly directional light source.

Figure 1: High-level autonomous behavior flow

Figure 2 shows more details. The diagram captures the interaction of TOMBOT hardware and software. On the left side of the diagram are the sensors, power sources, and command override (the XBee radio command input). All analog sensor inputs and bumper switches are sampled automatically (every 100 ms) during the Microchip Technology PIC32’s Timer 1 interrupt. The left and right bumper switches are debounced using 100 ms as the timer increment. The analog sensor inputs are digitized using the PIC32’s 10-bit ADC, with each sensor assigned its own ADC channel. The collected data is averaged in some cases and then made available to the different behaviors; processing other than averaging is done within the behavior itself.

Figure 2: Detailed TOMBOT autonomous model
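To make this sampling flow concrete, here is a minimal sketch of such a 100-ms Timer 1 tick handler, assuming the legacy PIC32 peripheral library (plib.h). The bumper pin assignments and the readSensorChannel() helper are hypothetical stand-ins, not the actual TOMBOT source.

#include <plib.h>           /* legacy PIC32 peripheral library */
#include <sys/attribs.h>    /* __ISR macro */

#define NUM_SENSORS 5

unsigned int readSensorChannel(int ch);        /* hypothetical ADC helper */

volatile unsigned int sensorRaw[NUM_SENSORS];  /* latest 10-bit readings  */
volatile int bumpLeft, bumpRight;              /* debounced switch states */

void __ISR(_TIMER_1_VECTOR, ipl2) Timer1Tick(void)
{
    static int lastLeft, lastRight;
    int rawLeft  = PORTDbits.RD4;   /* hypothetical bumper inputs */
    int rawRight = PORTDbits.RD5;
    int i;

    /* Debounce: accept a switch state only if it matches the
       previous 100-ms sample. */
    if (rawLeft  == lastLeft)  bumpLeft  = rawLeft;
    if (rawRight == lastRight) bumpRight = rawRight;
    lastLeft  = rawLeft;
    lastRight = rawRight;

    /* Digitize each analog sensor on its own ADC channel. */
    for (i = 0; i < NUM_SENSORS; i++)
        sensorRaw[i] = readSensorChannel(i);

    mT1ClearIntFlag();   /* clear the Timer 1 interrupt flag */
}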

All behaviors are implemented as state machines. If a behavior requests motor control, the request is arbitrated against all other behaviors before motor action is taken. Escape has the highest priority (the power behavior is not yet implemented), and its state machine dominates all the other behaviors. If Escape is not active, Avoid dominates whenever its IR detectors sense an object less than 8″ in front of the TOMBOT. If neither Escape nor Avoid is active, Home takes over steering to track a light source immediately in front of the TOMBOT. Finally, Cruise assumes command and temporarily takes the TOMBOT in a forward direction.

A command received from the XBee RF module can stop and start autonomous operation remotely, which is very handy for system debugging. Complete values of all sensors and battery power can be viewed on the graphics display via remote command, with the LEDs and buzzer announcing remote command acceptance and execution.

Currently, the green LED signals that the TOMBOT is ready to accept a command, and the red LED indicates that the TOMBOT is executing one. The buzzer indicates that the remote command has been completed, coincident with the red LED turning off.

With behavior programming, there are a lot of considerations. For successful autonomous operation, calibration of the photocells, IR sensors, and servos is required. The good news is that each of these behaviors can be isolated (selectively comment out what is not needed prior to compile time), so that phenomena can be isolated and the proper calibrations made. We’ll discuss this as we get a little deeper into the library API, but in general, behavior modeling does not require precise calibration and is fairly robust under less-than-ideal conditions.

TOMBOT Software Library
The TOMBOT robot library is modular. Some experience with C programming is required to use it (see Figure 3).

Figure 3: TOMBOT Library

The entire library is written using Microchip’s PIC32 C compiler. Both the compiler and Microchip’s MPLAB 8.xx IDE are available as free downloads at www.microchip.com. At the highest level, the library has three main sections: Motor, I/O, and Behavior. We’ll cover each of these areas in some detail.

TOMBOT Motor Library
All functions controlling the servos’ (left and right wheel) operation are contained in this part of the library (see Listing 1, Motor.h). The Microchip PIC32 peripheral library is also used. Motor initialization is required before any other library function; it starts both the left and right servos in the idle position, using PIC32 PWM peripherals OC3 and OC4 and the dual Timer34 (32 bits) for period setting. C #define statements set the pulse period and duty cycle for both wheels: the pulse width varies from 1.5 ms up to 2 ms for increasing-speed CCW rotation and from 1.5 ms down to 1 ms for CW rotation, over a 20-ms period.

Listing 1: All functions controlling the servos are in this part of the library.

V_LEFT and V_RIGHT (velocity left and right) use the PIC32 peripheral library functions to set the duty cycle. The other motor functions, in turn, use V_LEFT and V_RIGHT with the #define statements. See the FORWARD and BACKWARD functions as an example (see Listing 2).

Listing 2: Motor function code examples
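The actual code appears in Listing 2; as a rough sketch of the pattern, the helpers might look like the following, assuming plib’s SetDCOC3PWM()/SetDCOC4PWM() duty-cycle calls (the tick values are hypothetical and depend on the timer prescaler).

#define PWM_STOP     1500u   /* center pulse width: no rotation */
#define PWM_FULL_CCW 2000u   /* 2-ms pulse: full-speed CCW      */
#define PWM_FULL_CW  1000u   /* 1-ms pulse: full-speed CW       */

void V_LEFT(unsigned int pulse)  { SetDCOC3PWM(pulse); }   /* left servo duty  */
void V_RIGHT(unsigned int pulse) { SetDCOC4PWM(pulse); }   /* right servo duty */

/* The wheels are mirrored, so driving forward spins them in
   opposite senses. */
void FORWARD(void)  { V_LEFT(PWM_FULL_CCW); V_RIGHT(PWM_FULL_CW);  }
void BACKWARD(void) { V_LEFT(PWM_FULL_CW);  V_RIGHT(PWM_FULL_CCW); }
void STOP(void)     { V_LEFT(PWM_STOP);     V_RIGHT(PWM_STOP);     }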

In the idle setting, both PWM outputs are at the 1.5-ms center position, which should cause the servos not to turn. A servo calibration process is required to ensure the center position does not result in any rotation. Each servo has a set screw that can be adjusted with a small Phillips screwdriver until there is no rotation at idle.

TOMBOT I/O Library

This is a collection of different low-level library functions. Let’s deal with these by examining their files and describing each function set, starting with the timer (see Listing 3). It uses the Timer45 combination (a full 32 bits) as a precision timer for behaviors. C #define statements set the different time values. The routine is not interrupt-driven at this time; it simply waits for the timer to expire before returning.

Listing 3: Low-level library functions
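Listing 3 holds the actual routine; a minimal sketch of such a blocking wait on the 32-bit Timer45 pair, using the plib timer calls, might look like this (TICKS_PER_MS is a hypothetical constant that depends on the peripheral clock and prescaler).

#define TICKS_PER_MS 40000u   /* hypothetical: peripheral-clock dependent */

void TIMER_INIT(void)
{
    /* Free-running 32-bit timer (Timer4/5 pair), maximum period. */
    OpenTimer45(T4_ON | T4_32BIT_MODE_ON | T4_PS_1_1, 0xFFFFFFFF);
}

void WAIT_MS(unsigned int ms)
{
    WriteTimer45(0);                         /* restart the count       */
    while (ReadTimer45() < ms * TICKS_PER_MS)
        ;                                    /* block until the timeout */
}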

The next I/O library function is the ADC. There are a total of five analog inputs, all defined in this file. Each sensor definition corresponds to an integer (32-bit number) designating the specific input channel to which that sensor is connected. The five are: right IR, left IR, battery, left photocell, and right photocell.

The initialization function initializes the ADC peripheral for a specific channel. The read function performs a 10-bit ADC conversion and returns the result. To facilitate operation across the five sensors, we use the SCAN_SENSORS function, which initializes and converts each sensor in turn. The results are placed in global memory where the behavior functions can access them. SCAN_SENSORS also performs a running average of the last eight samples of the left and right photocells (see Listing 4).

Listing 4: SCAN_SENSORS also performs a running average of the last eight samples
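As an illustration, here is a minimal sketch of the scan-and-average scheme; the channel names, the adcRead() helper, and the globals are hypothetical stand-ins for the library’s definitions.

#define AVG_N 8
enum { CH_IR_RIGHT, CH_IR_LEFT, CH_BATTERY, CH_PHOTO_LEFT, CH_PHOTO_RIGHT };

unsigned int adcRead(int channel);          /* init channel, 10-bit conversion */

unsigned int irLeft, irRight, battery;      /* raw results in global memory */
unsigned int photoLeftAvg, photoRightAvg;   /* 8-sample running averages    */

void SCAN_SENSORS(void)
{
    static unsigned int histL[AVG_N], histR[AVG_N];
    static int idx;
    unsigned long sumL = 0, sumR = 0;
    int i;

    irLeft  = adcRead(CH_IR_LEFT);
    irRight = adcRead(CH_IR_RIGHT);
    battery = adcRead(CH_BATTERY);

    /* Keep the last eight photocell samples in circular buffers. */
    histL[idx] = adcRead(CH_PHOTO_LEFT);
    histR[idx] = adcRead(CH_PHOTO_RIGHT);
    idx = (idx + 1) % AVG_N;

    for (i = 0; i < AVG_N; i++) { sumL += histL[i]; sumR += histR[i]; }
    photoLeftAvg  = sumL / AVG_N;
    photoRightAvg = sumR / AVG_N;
}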

The next I/O library function is Graphics (see Listing 5). The TOMBOT uses a 102 × 64 monochrome graphics display module that has both red and green LED backlights. There are also red and green LEDs on the module that are independently controlled. The module is driven by the PIC32 SPI2 interface and has several control lines: CS (chip select) and A0 (command/data).

Listing 5: The Graphics I/O library function

The graphics display relies on an 8 × 8 font, stored as a project file, for character generation. The library also contains cursor-position macros, functions to write characters or text strings, and functions to draw 32 × 32 bitmaps. The graphic primitives are shown for initialization, module control, and writing to the module. The library writes to a RAM Vmap memory area; the screen is then updated from this RAM area using the dumpVmap function. The LED and backlight controls are also included within the graphics library.
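To illustrate the Vmap approach, here is a minimal sketch; the lcdCommand()/lcdData() helpers (which would toggle CS and A0 around SPI2 transfers) and the command bytes are hypothetical.

#define LCD_W     102
#define LCD_H      64
#define LCD_PAGES (LCD_H / 8)   /* 8 vertical pixels per byte */

static unsigned char Vmap[LCD_PAGES][LCD_W];   /* RAM copy of the screen */

void lcdCommand(unsigned char c);   /* A0 low: command write */
void lcdData(unsigned char d);      /* A0 high: data write   */

void dumpVmap(void)
{
    int page, col;
    for (page = 0; page < LCD_PAGES; page++) {
        lcdCommand(0xB0 | page);              /* set page address   */
        lcdCommand(0x10);  lcdCommand(0x00);  /* column address = 0 */
        for (col = 0; col < LCD_W; col++)
            lcdData(Vmap[page][col]);         /* push one column byte */
    }
}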

The next part of the I/O library is delay (see Listing 6). It is just a series of different software delays that can be used by other library functions. They were included only because of legacy use with the graphics library.

Listing 6: Series of different software delays

The next I/O library function is UART-XBEE (see Listing 7). This is the serial driver that configures and transfers data through the XBee radio on the robot side. The library is fairly straightforward: it has an initialization function to set up UART1B for 9600-bps, 8N1 transmit and receive.

Listing 7: XBee library functions

Transmission is done one character at a time. Reception is handled via an interrupt service routine, where the received character is retrieved and a semaphore flag is set. For this communication, I use a SparkFun XBee dongle configured through USB as a COM port and then run HyperTerminal or an equivalent application on a PC. The XBee’s default settings are all that is required (see Photo 1).

Photo 1: XBee PC to TOMBOT communications
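As a sketch, the receive path described above might look like this with the plib UART macros; the vector and flag names vary by PIC32 device, and rxChar/rxFlag are hypothetical globals.

volatile char rxChar;   /* last character received                        */
volatile int  rxFlag;   /* semaphore: set by ISR, cleared by the consumer */

void __ISR(_UART1_VECTOR, ipl1) Uart1Rx(void)
{
    if (mU1RXGetIntFlag()) {
        rxChar = (char)ReadUART1();   /* retrieve the received character */
        rxFlag = 1;                   /* signal the Remote behavior      */
        mU1RXClearIntFlag();
    }
}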

The next I/O library function is the buzzer (see Listing 8). It uses a simple digital output (Port F, bit 1) to control the buzzer. The functions initialize buzzer control and turn the buzzer on and off.

Listing 8: The functions initialize buzzer control

TOMBOT Behavior Library
The Behavior library is the heart of the autonomous TOMBOT and where integrated behavior happens. All of these behaviors require the use of left and right servos for autonomous maneuverability. Each behavior is a finite state machine that interacts with the environment (every 0.1 s). All behaviors have a designated priority relative to the wheel operation. These priorities are resolved by the arbiter for final wheel activation. Listing 9 shows the API for the entire Behavior Library.

Listing 9: The API for the entire behavior library

Let’s briefly cover the specifics.

  • “Cruise” just keeps the robot in motion in the absence of any stimulus.
  • “Escape” uses the bumper to sense a collision, and then performs a 180° spin after backing up.
  • “Avoid” uses the continuous forward-looking IR sensors to veer left or right upon approaching a close obstacle.
  • “Home” uses the front photocells to guide the robot toward a strong, highly directional light source.
  • “Remote operation” allows the TOMBOT to respond to the PC via XBee communications to enter/exit autonomous mode, report status, or execute a predetermined motion scenario (e.g., spin X times, run back and forth X times).
  • “Dump” is an internal function used within Remote.
  • “Arbiter” is an internal function, an intrinsic part of the behavior library, that resolves the different behavior priorities for wheel activation.

Here’s an example of the main function invoking different behaviors using the API (see Listing 10). Note that this is part of a main loop. Behaviors can be called within the main loop or “stacked up.” You can remove or stack behaviors as you choose (simply comment out what you don’t need and recompile). Keep in mind that Remote is the way for a remote operator to control operation or view status.

Listing 10: TOMBOT API Example
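Listing 10 shows the actual code; structurally, the main loop has roughly this shape (a minimal sketch with hypothetical initialization and behavior function names):

void systemInit(void);
void REMOTE(void), ESCAPE(void), AVOID(void), HOME(void), CRUISE(void), ARBITER(void);

int main(void)
{
    systemInit();      /* motors, ADC, graphics, UART (hypothetical) */

    while (1) {
        REMOTE();      /* XBee command handling                  */
        ESCAPE();      /* highest-priority wheel behavior        */
        AVOID();
        HOME();        /* comment out any behavior to isolate it */
        CRUISE();
        ARBITER();     /* resolve priorities, drive the wheels   */
    }
    return 0;
}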

Let’s now examine the detailed state machine associated with each behavior to gain a better understanding of behavior operation (see Listing 11).

Listing 11:The TOMBOT’s arbiter

The arbiter is simple for the TOMBOT: it is a fixed arbiter. If either Escape or Avoid is active, it abdicates to those behaviors and lets them resolve motor control internally. Home and Cruise motor-control requests are handled directly by the arbiter. The Home behavior is shown in Listing 12.
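A minimal sketch of such a fixed-priority arbiter might look like this; the request flags and motor helpers are hypothetical.

extern int escapeActive, avoidActive;   /* set by Escape/Avoid          */
extern int homeRequest,  homeSteer;     /* Home: want wheels, direction */
extern int cruiseRequest;               /* Cruise: want forward motion  */

void LEFT(void), RIGHT(void), FORWARD(void);

void ARBITER(void)
{
    if (escapeActive || avoidActive)
        return;                 /* those behaviors drive the motors themselves */

    if (homeRequest) {          /* steer toward the light source */
        if (homeSteer < 0)       LEFT();
        else if (homeSteer > 0)  RIGHT();
        else                     FORWARD();
    } else if (cruiseRequest) {
        FORWARD();              /* default: keep moving */
    }
}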

Listing 12: Home behavior

Home is still being debugged and is not yet final. The goal of Home is to steer the robot toward a strong light source when it is not engaged in higher-priority behaviors.

The Cruise behavior sets the motors to forward operation for one second if no other higher-priority behaviors are active (see Listing 13).

Listing 13: Cruise behavior

The Escape behavior tests the bumper switch state to determine whether a bump has been detected (see Listing 14). Once a bump is detected, it runs through a series of states: first an immediate backup, then a turn-around, and then a move away from the obstacle.

Listing 14: Escape behavior
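A minimal sketch of that state sequence, advanced on the 0.1-s tick, might look like this; the state names and tick counts are hypothetical.

extern volatile int bumpLeft, bumpRight;   /* debounced bumper states */
void BACKWARD(void), FORWARD(void), SPIN_180(void);

enum { ESC_IDLE, ESC_BACKUP, ESC_SPIN, ESC_RUN };
static int escState = ESC_IDLE;
static int escTicks;
int escapeActive;                          /* read by the arbiter */

void ESCAPE(void)
{
    switch (escState) {
    case ESC_IDLE:
        if (bumpLeft || bumpRight) {       /* collision detected  */
            escapeActive = 1;
            BACKWARD();                    /* immediate backup    */
            escTicks = 10;                 /* ~1 s at 0.1-s ticks */
            escState = ESC_BACKUP;
        }
        break;
    case ESC_BACKUP:
        if (--escTicks == 0) { SPIN_180(); escTicks = 10; escState = ESC_SPIN; }
        break;
    case ESC_SPIN:
        if (--escTicks == 0) { FORWARD(); escTicks = 10; escState = ESC_RUN; }
        break;
    case ESC_RUN:                          /* move away, then release the wheels */
        if (--escTicks == 0) { escapeActive = 0; escState = ESC_IDLE; }
        break;
    }
}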

The dump function is a response to the remote capture (“C”) command. It formats the left and right IR, left and right photocell, and battery readings in floating-point format and dumps them to the graphics display (see Listing 15).

Listing 15: The dump function

The Avoid behavior uses the IR sensors to determine whether an object is within 8″ of the front of the TOMBOT (see Listing 16).

Listing 16: Avoid behavior

If both sensors detect a target within 8″, the robot simply turns around and moves away (much like Escape). If only the right sensor detects an object in range, the robot spins away from the right side; if only the left sensor does, it spins away from the left side (see Listing 17).
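A minimal sketch of that decision logic follows; IR_THRESHOLD is a hypothetical calibrated ADC value corresponding to roughly 8″, assuming sensors whose readings rise as an object gets closer.

#define IR_THRESHOLD 300u   /* hypothetical calibrated ~8" reading */

extern unsigned int irLeft, irRight;   /* from SCAN_SENSORS */
void TURN_AROUND(void), SPIN_LEFT(void), SPIN_RIGHT(void);
int avoidActive;                       /* read by the arbiter */

void AVOID(void)
{
    int nearLeft  = (irLeft  > IR_THRESHOLD);
    int nearRight = (irRight > IR_THRESHOLD);

    avoidActive = (nearLeft || nearRight);
    if (nearLeft && nearRight)
        TURN_AROUND();     /* both blocked: behave much like Escape */
    else if (nearRight)
        SPIN_LEFT();       /* object on the right: spin away left   */
    else if (nearLeft)
        SPIN_RIGHT();      /* object on the left: spin away right   */
}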

Listing 17: Remote part 1

Remote behavior is fairly comprehensive (see Listing 18). There are 14 different cases, each driven by a different character received over the XBee radio. Once a character is received, the red LED is turned on. Once the behavior is complete, the red LED is turned off and the buzzer is sounded.

Listing 18: Remote part 2

The first case toggles autonomous mode on and off. The other 13 are prescribed actions. Seven of the 13 were written to demonstrate the TOMBOT’s mobile agility with multiple spins and back-and-forth runs. The final six are standard single-step debug commands such as stop, backward, and capture; capture dumps all sensor output to the display screen (see Table 1).

Table 1: TOMBOT remote commands
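Structurally, the handler is a single switch over the received character, bracketed by the LED and buzzer signaling described above. Here is a minimal sketch; the command characters are hypothetical stand-ins for the actual assignments in Table 1.

extern volatile char rxChar;   /* set by the UART receive ISR */
extern volatile int  rxFlag;
void RED_LED(int on), STOP(void), BACKWARD(void), DUMP(void), BUZZER_BEEP(void);
int autonomous;

void REMOTE(void)
{
    if (!rxFlag) return;      /* nothing received this pass */
    rxFlag = 0;
    RED_LED(1);               /* command accepted, executing */

    switch (rxChar) {
    case 'A': autonomous = !autonomous; break;   /* toggle autonomous mode */
    case 'S': STOP();     break;                 /* single-step debug      */
    case 'B': BACKWARD(); break;
    case 'C': DUMP();     break;                 /* capture: dump sensors  */
    /* ...ten more prescribed-action cases... */
    default:  break;
    }

    RED_LED(0);               /* command complete */
    BUZZER_BEEP();
}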

Early Findings & Implementation
Implementation always presents choices. In my particular case, I was interested in rapid development, so I elected to use non-interrupt code with a linear flow for easy debugging. This amounts to “blocking code.” Blocking code is used throughout the behavior implementation and makes the robot nonresponsive while a block is in effect; whenever a timeout function runs, the robot is “blind” to outside environmental conditions. Using a real-time operating system (e.g., FreeRTOS) to eliminate this problem is recommended.

The TOMBOT also uses photocells for homing. These sensitive devices have different responses and need to be calibrated to ensure correct operation. A photocell calibration routine is included in the baseline and should be run prior to operation.

TOMBOT Demo

The TOMBOT was successfully demoed to a large first-grade class in southern California as part of a Science, Technology, Engineering and Mathematics (STEM) program. The main behaviors were limited to Remote, Avoid, and Escape. With autonomous operation off, the robot demonstrated mobility and maneuverability. With autonomous operation on, the robot could interact with a student to demo avoid and escape behavior.

Tom Kibalo holds a BSEE from City College of New York and an MSEE from the University of Maryland. He has 39 years of engineering experience with a number of companies in the Washington, DC area. Tom is an adjunct EE faculty member at a local community college, and he is president of Kibacorp, a Microchip Design Partner.

DIY Green Energy Design Projects

Ready to start a low-power or energy-monitoring microcontroller-based design project? You’re in luck. We’re featuring eight award-winning, green energy-related designs that will help get your creative juices flowing.

The projects listed below placed at the top of Renesas’s RL78 Green Energy Challenge.

Electrostatic Cleaning Robot: Solar-tracking mirrors, called heliostats, are an integral part of Concentrating Solar Power (CSP) plants. They must be kept clean to help maximize the production of steam, which generates power. Using an RL78, the innovative Electrostatic Cleaning Robot provides a reliable cleaning solution that’s powered entirely by photovoltaic cells. The robot traverses the surface of the mirror and uses a high-voltage AC electric field to sweep away dust and debris.

Parts and circuitry inside the robot cleaner

Cloud Electrofusion Machine: Using approximately 400 times less energy than commercial electrofusion machines, the Cloud Electrofusion Machine is designed for welding 0.5″ to 2″ polyethylene fittings. The RL78-controlled machine is designed to read a barcode on the fitting, which determines the fusion parameters and traceability. Along with the barcode data, the system logs the GPS location to an SD card, if present, and transmits the data for each fusion to a cloud database for tracking and quality control.

Inside the electrofusion machine (Source: M. Hamilton)

The Sun Chaser: A GPS Reference Station: The Sun Chaser is a well-designed, solar-based energy-harvesting system that automatically recalculates the direction of a solar panel to ensure it is always facing the sun. Mounted on a rotating disc, the solar panel’s orientation is calculated using the registered GPS position. Using an external compass, an internal accelerometer, a DC motor, and a stepper motor, the system determines the solar panel’s exact position. The system uses the Renesas RDKRL78G13 evaluation board running the Micrium µC/OS-III real-time kernel.


Water Heater by Solar Concentration: This solar water heater is powered by the RL78 evaluation board and designed to deflect concentrated amounts of sunlight onto a water pipe for continual heating. The deflector, armed with a counterweight for easy tilting, automatically adjusts the angle of reflection for maximum solar energy using the lowest power consumption possible.

RL78-based solar water heater (Source: P. Berquin)

Air Quality Mapper: Want to make sure the air along your daily walking path is clean? The Air Quality Mapper is a portable device designed to track levels of CO2 and CO gases for constructing “smog maps” to determine the healthiest routes. Constructed with an RDKRL78G13, the Mapper receives location data from its GPS module, takes readings of the CO2 and CO concentrations along a specific route, and stores the data on an SD card. Using a PC, you can parse the SD-card data, plot it, and upload it automatically to an online MySQL database that presents the data on a Google map.

Air quality mapper design (Source: R. Alvarez Torrico)

Wireless Remote Solar-Powered “Meteo Sensor”: You can easily measure meteorological parameters with the “Meteo Sensor.” The RL78 MCU-based design takes cyclical measurements of temperature, humidity, atmospheric pressure, and supply voltage, and shares them using digital radio transceivers. Receivers are configured to listen for incoming data on the same radio channel. The design simplifies the way weather data is gathered and eases construction of local measurement networks, while being optimized for low energy usage and long battery life.

The design takes cyclical measurements of temperature, humidity, atmospheric pressure, and supply voltage, and shares them using digital radio transceivers. (Source: G. Kaczmarek)

Portable Power Quality Meter: Monitoring electrical usage is becoming increasingly popular in modern homes. The Portable Power Quality Meter uses an RL78 MCU to read power factor, total harmonic distortion, line frequency, voltage, and electrical consumption information and stores the data for analysis.

The portable power quality meter uses an RL78 MCU to read power factor, total harmonic distortion, line frequency, voltage, and electrical consumption information and stores the data for analysis. (Source: A. Barbosa)

High-Altitude Low-Cost Experimental Glider (HALO): The “HALO” experimental glider project consists of three main parts: a weather balloon (the carrier section), a glider (the balloon’s payload and the return section), and a ground-base section used for communication and displaying telemetry data (not part of the contest project). Tested with the REFLEX flight simulator, the glider has its own micro-GPS receiver, sensors, and low-power MCU unit. It can take off, climb to a pre-programmed altitude, and return to a given coordinate.

High-altitude low-cost experimental glider (Source: J. Altenburg)