Member Profile: Walter O. Krawec

LOCATION:
Upstate New York

OCCUPATION:
Research Assistant and PhD Student, Stevens Institute of Technology

MEMBER STATUS:
Walter has been reading Circuit Cellar since he got his first issue in 1999. Free copies were available at the Trinity College Fire Fighting Robot Contest, which was his first experience with robotics. Circuit Cellar was the first magazine for which he wrote an article (“An HC11 File Manager,” two-part series, issues 129 and 130, 2001).

TECH INTERESTS:
Robotics, among other things. He is particularly interested in developmental and evolutionary robotics (where the robot’s strategies, controllers, and so forth are evolved instead of programmed in directly).

RECENT TECH ACQUISITION:
Walter is enjoying his Raspberry Pi. “What a remarkable product! I think it’s great that I can take my AI software, which I’ve been writing on a PC, copy it to the Raspberry Pi, compile it with GCC, then off it goes with little or no modification!”

CURRENT PROJECTS:
Walter is designing a new programming language and interpreter (for Windows/Mac/Linux, including the Raspberry Pi) that uses a simulated quantum computer to drive a robot. “What better way to learn the basics of quantum computing than by building a robot around one?” The first version of this language is available on his website (walterkrawec.org). He has plans to release an improved version.
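
To give a flavor of the idea, here is a minimal Python sketch (not Walter's language itself) in which a simulated qubit is put into superposition with a Hadamard gate and its measurement outcome picks the robot's next turn; all of the names and numbers are illustrative.

    # Simulate a single qubit as a pair of amplitudes, apply a Hadamard
    # gate, and let the measurement outcome choose the robot's next move.
    import math
    import random

    def hadamard(state):
        """Apply a Hadamard gate to a single-qubit state [a0, a1]."""
        a0, a1 = state
        s = 1 / math.sqrt(2)
        return [s * (a0 + a1), s * (a0 - a1)]

    def measure(state):
        """Collapse the state: return 0 or 1 with probability |amplitude|^2."""
        p0 = abs(state[0]) ** 2
        return 0 if random.random() < p0 else 1

    state = [1.0, 0.0]        # qubit starts in |0>
    state = hadamard(state)   # now an equal superposition
    outcome = measure(state)  # 0 or 1, each with probability 1/2
    print("turn left" if outcome == 0 else "turn right")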

THOUGHTS ON EMBEDDED TECH:
Walter said he is amazed by the power of the latest embedded technology, such as the Raspberry Pi. “For less than $40 you have a perfect controller for a robot that can handle incredibly complex programs. Slap on one of those USB battery packs and you have a fully mobile robot,” he said. He used a Pololu Maestro to interface with the motors and analog sensors. “It all works and it does everything I need.” However, he added, “If you want to build any of this yourself by hand it can be much harder, especially since most of the cool stuff is surface mount, making it difficult to get started.”
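
As a rough illustration of that kind of setup, the Python sketch below sends the Pololu Maestro's compact-protocol “Set Target” command from a Raspberry Pi over the controller's USB virtual serial port; the port name, channel number, and pulse width are assumptions for illustration, not details from Walter's build.

    # Drive one servo channel on a Pololu Maestro using the compact
    # protocol's "Set Target" command (0x84). Targets are in 0.25 us units.
    import serial  # pip install pyserial

    def set_target(port, channel, target_us):
        """Set a Maestro channel to the given pulse width in microseconds."""
        target = int(target_us * 4)        # convert to quarter-microseconds
        port.write(bytes([
            0x84,                          # "Set Target" command byte
            channel,
            target & 0x7F,                 # low 7 bits
            (target >> 7) & 0x7F,          # high 7 bits
        ]))

    with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as maestro:
        set_target(maestro, 0, 1500)       # center a servo on channel 0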

Low-Cost SBCs Could Revolutionize Robotics Education

For my entire life, my mother has been a technology trainer for various educational institutions, so it’s probably no surprise that I ended up as an engineer with a passion for STEM education. When I heard about the Raspberry Pi, a diminutive $25 computer, my thoughts immediately turned to creating low-cost mobile computing labs. These labs could be easily and quickly loaded with a variety of programming environments, walking students through a step-by-step curriculum to teach them about computer hardware and software.

However, my time in the robotics field has made me realize that this endeavor could be so much more than a traditional computer lab. By adding actuators and sensors, these low-cost SBCs could become fully fledged robotic platforms. Leveraging the common I2C protocol, adding chains of these sensors would be incredibly easy. The SBCs could even be paired with microcontrollers to add more functionality and introduce students to embedded design.
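
As a sketch of how simple that chaining can be in practice, the snippet below polls a few hypothetical I2C sensors from Python on a Raspberry Pi; the addresses and register number are placeholders for whatever devices are actually wired to the bus.

    # Read one byte from each sensor on an I2C chain attached to a Raspberry Pi.
    from smbus2 import SMBus  # pip install smbus2

    SENSOR_ADDRESSES = [0x48, 0x49, 0x4A]  # hypothetical devices on the chain
    DATA_REGISTER = 0x00                   # hypothetical data register

    with SMBus(1) as bus:                  # I2C bus 1 on most Raspberry Pi models
        for addr in SENSOR_ADDRESSES:
            value = bus.read_byte_data(addr, DATA_REGISTER)
            print(f"sensor 0x{addr:02X}: {value}")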

There are many ways to introduce students to programming robot-computers, but I believe that a web-based interface is ideal. By setting up each computer as a web server, students can easily access the interface for their robot directly through the computer itself, or remotely from any web-enabled device (e.g., a smartphone or tablet). Through a web browser, these devices provide a uniform interface for remote control and even programming of robotic platforms.

A server-side language (e.g., Python or PHP) can handle direct serial/I2C communications with actuators and sensors. It can also wrap more complicated robotic concepts into easily accessible functions. For example, the server-side language could handle PID and odometry control for a small rover, then provide the user functions such as “right,” “left,” and “forward” to move the robot. These functions could be accessed through an AJAX interface directly controlled through a web browser, enabling the robot to perform simple tasks.
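
A minimal sketch of that idea, using Python and Flask (one reasonable server-side choice), might look like the following; the motor-control call is stubbed out, and the route names simply mirror the “forward,” “left,” and “right” functions described above.

    # Expose stubbed robot movements as URLs that an AJAX call or a browser
    # button can hit. Real code would talk to a motor controller here.
    from flask import Flask, jsonify

    app = Flask(__name__)

    def drive(left_speed, right_speed):
        # Placeholder for serial/I2C communication with the motor driver.
        print(f"drive left={left_speed} right={right_speed}")

    @app.route("/forward")
    def forward():
        drive(100, 100)
        return jsonify(status="ok", action="forward")

    @app.route("/left")
    def left():
        drive(-100, 100)
        return jsonify(status="ok", action="left")

    @app.route("/right")
    def right():
        drive(100, -100)
        return jsonify(status="ok", action="right")

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)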

This web-based approach is great for an educational environment, as students can systematically pull back programming layers to learn more. Beginning students would be able to string preprogrammed movements together to make the robot perform simple tasks. Each movement could then be dissected into more basic commands, teaching students how to make their own movements by combining, rearranging, and altering these commands.

By adding more complex commands, students can even introduce autonomous behaviors into their robotic platforms. Eventually, students can be given access to the HTML user interface and begin to alter and customize it. This small, superficial step can give students insight into what they can do, spurring them ahead into the next phase.

Students can start as end users of this robotic framework, but they can eventually graduate to become its developers. By mapping different commands to different functions in the server-side code, students can begin to understand the links between the web interface and the code that runs it.

Kyle Granat

Kyle Granat, who wrote this essay for Circuit Cellar, is a hardware engineer at Trossen Robotics, headquartered in Downers Grove, IL. Kyle graduated from Purdue University with a degree in computer engineering. Kyle, who lives in Valparaiso, IN, specializes in embedded system design and is dedicated to STEM education.

Students will delve deeper into the server-side code, eventually directly controlling actuators and sensors. Once students begin to understand the electronics at a much more basic level, they will be able to improve this robotic infrastructure by adding more features and languages. While the Raspberry Pi is one of today’s more popular SBCs, a variety of SBCs (e.g., the BeagleBone and the pcDuino) lend themselves nicely to building educational robotic platforms. As the cost of these platforms decreases, it becomes even more feasible for advanced students to recreate the experience on many platforms.

We’re already seeing web-based interfaces (e.g., ArduinoPi and WebIOPi) lay down the beginnings of a web-based framework for interacting with hardware on SBCs. As these frameworks evolve, and as the cost of hardware drops even further, I’m confident we’ll see educational robotic platforms built by the open-source community.

I/O Raspberry Pi Expansion Card

The RIO is an I/O expansion card intended for use with the Raspberry Pi SBC. The card stacks on top of a Raspberry Pi to create a powerful embedded control and navigation computer in a small 20-mm × 65-mm × 85-mm footprint. The RIO is well suited for applications requiring real-world interfacing, such as robotics, industrial and home automation, and data acquisition and control.

The RIO adds 13 inputs that can be configured as digital inputs, 0-to-5-V analog inputs with 12-bit resolution, or pulse inputs capable of pulse width, duty cycle, or frequency capture. Eight digital outputs are provided to drive loads up to 1 A each at up to 24 V.

The RIO includes a 32-bit ARM Cortex-M4 microcontroller that processes and buffers the I/O and creates seamless communication with the Raspberry Pi. The RIO processor can be user-programmed with a simple BASIC-like programming language, enabling it to perform logic, conditioning, and other I/O processing in real time. On the Linux side, the RIO comes with drivers and a function library to quickly configure and access the I/O and to exchange data with the Raspberry Pi.

The RIO features several communication interfaces, including an RS-232 serial port to connect to standard serial devices, a TTL serial port to connect to Arduino boards and other microcontrollers that aren’t equipped with an RS-232 transceiver, and a CAN bus interface.

The RIO is available in two versions: the RIO-BASIC costs $85 and the RIO-AHRS costs $175.

Roboteq, Inc.
www.roboteq.com

Electrical Engineering and Artistic Expression

I think we’re on the verge of the next artistic renaissance. This time, instead of magnificent architecture, beautifully painted portraits, and the rise of humanism, I think engineering (specifically electrical engineering) will begin to define exciting new forms of artistic expression.

Cornell University graduate and electrical engineer Jeremy Blum, in a 2011 blog post

Regular Circuit Cellar readers will recognize Jeremy Blum as our November issue interview subject. Blum’s post sums up a philosophy that seems to be shared by some other recent EE graduates and aspiring electrical engineers: they view their work as art, or at least they like to work in art from time to time.

For example, Circuit Cellar’s January issue will feature an interview with Andrew Godbehere, an Electrical Engineering PhD candidate at the University of California, Berkeley. He has intertwined engineering and art more than once.

This is the central control belt pack worn by a dancer for CUMotive, the wearable accelerometer project. An Atmel ATmega644V and an AT86RF230 were used inside to interface with a synthesizer. The plastic enclosure has holes for the belt to attach to a dancer. Wires connect to accelerometers, which are worn on the dancer’s limbs.

When he was a Cornell student, he collaborated with Nathan Ward on a final project to translate a dancer’s movement into music. They created a central control belt pack for the dancer, which connected to four wearable wireless accelerometers to measure the dancer’s movements. Inside the belt pack, an ATmega644V connected to an Atmel AT86RF230 wireless transceiver interfaced with a musical instrument digital interface (MIDI) synthesizer.
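
To give a feel for the concept (this is not the actual CUMotive firmware), the short Python sketch below maps a single accelerometer sample to a three-byte MIDI note-on message; the scaling constants are purely illustrative.

    # Map an acceleration sample (in g) to a MIDI note-on message.
    import math

    def accel_to_note_on(ax, ay, az, channel=0):
        """Return a 3-byte MIDI note-on message from one accelerometer sample."""
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        note = min(127, int(magnitude * 32))    # bigger motion -> higher pitch
        velocity = min(127, int(abs(az) * 64))  # vertical motion -> louder
        status = 0x90 | (channel & 0x0F)        # 0x90 = note-on, low nibble = channel
        return bytes([status, note, velocity])

    # Example: a moderate gesture measured on one limb
    print(accel_to_note_on(0.4, 0.8, 1.2).hex())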

When Godbehere graduated from Cornell and headed to UC Berkeley, his focus shifted to theoretical topics and robotic systems. But he jumped at a professor’s invitation to become involved in the “Are We There Yet?” art installation in 2011 at the Contemporary Jewish Museum in San Francisco.

During the four-month exhibit, visitors entered a nearly empty gallery to encounter recorded questions emanating from numerous floor speakers. A camera followed each visitor’s moves and robotic algorithms enabled it to determine which floor speaker to activate. The questions heard could range from “What Is My Purpose?” to “What’s Up Doc?”

How a visitor moved through the interactive installation triggered the combination of questions he or she heard.

Video documentary of “Are We There Yet?” 

Godbehere was the computer vision system engineer working with artists Gil Gershoni and Ken Goldberg, who is also a robotics and new media professor at UC Berkeley.

“We installed a color camera in a beautiful gallery in the Contemporary Jewish Museum… and a set of speakers with a high-end controller system from Meyer Sound that enabled us to ‘position’ sound in the space and to sweep audio tracks around at (the computer’s programmed) will,” Godbehere says. “The Meyer Sound System is the D-Mitri control system, controlled by the computer with Open Sound Control (OSC).

“The hard work was then to program the computer to discern humans from floors, furniture, shadows, sunbeams, and reflections of clouds. The gallery had many skylights, making the lighting very dynamic. Then, I programmed the computer to keep track of people as they moved and found that this dynamic information was itself useful in determining if detected color-perturbance was human or not.”

Behind the technology of “Are We There Yet?”

Can such art also have “practical” consumer applications? Godbehere says there are elements of the project that could be used in an embedded system.

“I’ve been told that the software I wrote works on iOS devices by the startup company Romo, which was evaluating my vision-tracking code for use in its cute iPhone rover. Further, I’d say that if someone were interested, they could create a similar pedestrian, auto, pet, or cloud tracking system using a Raspberry Pi and a reasonable webcam.”
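
As a rough starting point for that kind of experiment, the sketch below uses OpenCV’s stock background subtractor (not Godbehere’s own algorithm) to flag and box moving objects seen by a webcam attached to a Raspberry Pi.

    # Box moving objects in a webcam feed using background subtraction.
    import cv2

    cap = cv2.VideoCapture(0)                        # first attached webcam
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)               # foreground = moving pixels
        _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)  # drop shadows
        mask = cv2.medianBlur(mask, 5)               # suppress speckle noise
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            if cv2.contourArea(c) > 2000:            # ignore small perturbations
                x, y, w, h = cv2.boundingRect(c)
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()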

If you’re interested in learning more about Godbehere’s engineering and artistic work, be sure to check out the January issue of Circuit Cellar.

And if you have an opinion on electrical engineering and art, please post your comments below.

MIT’s Self-Assembling Robots

Calling it a low-tech solution to a high-tech challenge, MIT researchers have received a lot of attention recently for their modular system of self-assembling robot cubes. The video of the so-called M-Blocks in action, which MIT posted on YouTube earlier this month, has itself become a hit: a recent tally put it at nearly 1.5 million views and counting.

The text accompanying the video explains how the cubes are able to move around and climb over each other, jump into the air, and roll across surfaces as they connect in a variety of configurations. And they do all this without any external moving parts. Instead, each M-Block contains a flywheel that can reach speeds of 20,000 rpm. When the flywheel brakes, it imparts angular momentum to the cube. Precisely placed magnets on every face and edge of each M-Block enable any two cubes to attach to each other.
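
A back-of-the-envelope conservation-of-angular-momentum estimate shows why a small, fast flywheel is enough to pivot a cube. Only the 20,000 rpm figure comes from MIT; the masses and dimensions in the Python sketch below are assumed round numbers.

    # Estimate how fast a cube would pivot about its edge if the braking
    # flywheel transferred all of its angular momentum. Only the 20,000 rpm
    # figure comes from the article; every other number is an assumption.
    import math

    omega_fly = 20000 * 2 * math.pi / 60       # flywheel speed in rad/s

    m_fly, r_fly = 0.05, 0.02                  # assumed flywheel: 50 g disk, 2 cm radius
    I_fly = 0.5 * m_fly * r_fly ** 2           # solid disk: (1/2) m r^2

    m_cube, a = 0.15, 0.05                     # assumed cube: 150 g, 5 cm on a side
    I_cube_edge = (2 / 3) * m_cube * a ** 2    # uniform cube pivoting about one edge

    L = I_fly * omega_fly                      # angular momentum released by braking
    omega_cube = L / I_cube_edge               # cube spin if all of it is transferred

    print(f"flywheel angular momentum: {L:.4f} kg*m^2/s")
    print(f"cube angular velocity about its edge: {omega_cube:.1f} rad/s")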

The simple design holds short- and long-term promise. According to an October 4 article by Larry Hardesty of the MIT News Office, it is hoped that the blocks can be miniaturized someday, perhaps into swarming microbots that can self-assemble with a purpose. Even at their current size, further development of the M-Blocks might lead to “armies of mobile cubes” that can help repair bridges and buildings in emergencies, raise scaffolding, reconfigure into heavy equipment or furniture as needed, or head into environments hostile to humans to diagnose and repair problems, the article suggests.

While it may not rise to “cooperative group behavior,” the ability of one cube to drag another and influence its alignment is impressive. What could 100 or more of these robots accomplish as MIT researchers continue to develop algorithms to control them?

A prototype of the new modular robot, with its interior and flywheel exposed. (Photo: M. Scott Brauer)