In Memoriam: Richard Alan Wotiz

Richard Alan Wotiz—a multitalented electronics engineer, inventor, and author—provided the international embedded design community with creative projects and useful electronics engineering lessons from the early 1980s, when he graduated from Princeton University, onward. Sadly, Richard passed away unexpectedly on May 30, 2012, while hiking with a group of friends (a group called “Take a Hike”) in Santa Cruz County, California.

Richard Alan Wotiz

Richard started writing his “Embedded Unveiled” column for Circuit Cellar magazine in 2011.

Prior to becoming a columnist, Richard placed highly in several international embedded design challenges. Amazingly, he won First Prize in both the Texas Instruments 2010 DesignStellaris Challenge and the 2010 WIZnet iMCU Challenge. That’s right—he won First Prize in both of Circuit Cellar’s 2010 design challenges!

Richard published intriguing feature articles about some of his prize-winning projects. Interestingly, he liked combining his passion for engineering with his love of the outdoors. When he did so, the results were memorable designs intended for outdoor use: a backpack water level monitor, an earth field magnetometer, and an ABS brake system for a mountain bike.

Richard’s ABS system is built around a Texas Instruments EKK-LM3S9B96 evaluation board, which contains the Stellaris LM3S9B96 microcontroller and support circuitry. The mechanism mounts to the front fork in place of the reflector, and the control unit sits on a bracket that’s also attached to the handlebars. A veritable maze of wires runs to the various sensors on the brake levers and wheels.

His other projects were either well-built systems intended to solve real-world problems, such as his single-phase, variable-speed drive for AC induction motors, or handy DIY designs he could use in his daily life, such as his “Net Butler” network control system.

Richard’s single-phase, variable-speed drive for AC induction motors is an excellent device for powerful, yet quiet, pump operation. Designed for use with a capacitor-start/capacitor-run motor, it includes active power factor correction (PFC) and inrush current limiting. This is the drive unit. A Microchip Technology dsPIC30F2020 and all of the control circuitry are at the upper right, with all of the power components below. The line filter and low-voltage supplies are in a separate box to the left. It’s designed to sit vertically with the three large filter capacitors at the bottom, so they stay as cool as possible.

Richard named his finished network control system the “Net Butler.” This innovative multifunctional design can control, monitor, and automatically maintain a home network. Built around a WIZnet iMCU7100EVB, the design has several functions, such as reporting on connected network devices and downloading Internet-based content.

I last saw Richard in March 2012 at the Design West Conference in San Jose, CA. As usual, he stopped by our booth to chat about his work and Circuit Cellar magazine in general. He had a great passion for both, and it showed whenever I spoke with him. He was a true believer of this magazine and its mission. During our chat, he asked if he could write about the seven-processor Intel Industrial Control Robotic Orchestra system on display at the conference. I agreed, of course! His enthusiasm for doing such an article was apparent. Soon thereafter he was at the Intel booth taking photos and notes for his column.

I’m happy to announce that the column—which he titled “EtherCAT Orchestra”—will appear in Circuit Cellar 264 (July 2012).

Richard’s work was a wonderful contribution to this magazine, and we’re grateful to have published his articles. We’re sure Richard’s inventive design ideas and technical insight will endure to help countless more professionals, academics, and students to excel at electronics engineering for years to come.

Wireless Data Control for Remote Sensor Monitoring

Circuit Cellar has published dozens of interesting articles about handy wireless applications over the years. And now we have another innovative project to report on. Circuit Cellar author Robert Bowen contacted us recently with a link to information about his iFarm-II controller data acquisition system.

The iFarm-II controller data acquisition system (Source: R. Bowen)

The design features two main components. Bowen’s “iFarm-Remote” and “iFarm-Base” controllers work together as an accurate remote wireless data acquisition system. The former has six digital inputs (for monitoring relay or switch contacts) and six digital outputs (for energizing a relay’s coil). The latter is a stand-alone, wireless, Internet-ready controller. Its LCD displays sensor readings from the iFarm-Remote controller. When you connect the base to the Internet, you can monitor data readings via a browser. In addition, you can have the base e-mail you notifications pertaining to the sensor input channels.

You can connect the system to the Internet for remote monitoring. The Network Settings Page enables you to configure the iFarm-Base controller for your network. (Source: R. Bowen)

Bowen writes:

The iFarm-II Controller is a wireless data acquisition system used to monitor temperature and humidity conditions in a remote location. The iFarm consists of two controllers, the iFarm-Remote and the iFarm-Base. The iFarm-Remote is located in the remote location with various sensors connected (it supports sensors that output ±10 VDC). The iFarm-Remote also provides the user with six digital inputs and six digital outputs. The digital inputs may be used to detect switch closures, while the digital outputs may be used to energize a relay coil. The iFarm-Base supports either a 2.4-GHz or 900-MHz RF module.

The iFarm-Base controller is responsible for sending commands to the iFarm-Remote controller to acquire the sensor and digital input status readings. These readings may be viewed locally on the iFarm-Base controller’s LCD or remotely via an Internet connection using your favorite web browser. Alarm conditions can be set on the iFarm-Base controller. An active upper or lower limit condition will notify the user through either an e-mail or a text message sent directly to the user. Alternatively, the user may view and control the iFarm-Remote controller via a web browser. The iFarm-Base controller’s web server is designed to support viewing pages from a PC, laptop, iPhone, iTouch, BlackBerry, or any mobile device/telephone that has a Wi-Fi Internet connection.—Robert Bowen

iFarm-Host/Remote PCB Prototype (Source: R. Bowen)

Robert Bowen is a senior field service engineer for MTS Systems Corp., where he designs automated calibration equipment and develops testing methods for customers involved in the material and simulation testing fields. Circuit Cellar has published three of his articles since 2001.

Design West Update: Intel’s Computer-Controlled Orchestra

It wasn’t the Blue Man Group making music by shooting small rubber balls at pipes, xylophones, vibraphones, cymbals, and various other sound-making instruments at Design West in San Jose, CA, this week. It was Intel and its collaborator Sisu Devices.

Intel's "Industrial Controller in Concert" at Design West, San Jose

The innovative Industrial Controller in Concert system on display featured seven Atom processors, four operating systems, 36 paintball hoppers, 2,300 rubber balls, a video camera for motion sensing, a digital synthesizer, a multi-touch display, and more. PVC tubes connect the various instruments.

Intel's "Industrial Controller in Concert" features seven Atom processors and 2,300 rubber balls

Once running, the $160,000 system played a 2,372-note song and captivated the Design West audience. The nearby photo shows the system on the conference floor.

Click here to learn more and watch a video of the computer-controlled orchestra in action.

Robot Design with Microsoft Kinect, RDS 4, & Parallax’s Eddie

Microsoft announced on March 8 the availability of Robotics Developer Studio 4 (RDS 4) software for robotics applications. RDS 4 was designed to work with the Kinect for Windows SDK. To demonstrate the capabilities of RDS 4, the Microsoft robotics team built the Follow Me Robot with a Parallax Eddie robot, a laptop running Windows 7, and the Kinect.

In the following short video, Microsoft software developer Harsha Kikkeri demonstrates Follow Me Robot.

Circuit Cellar readers are already experimenting with the Kinect and developing embedded systems to work with it in interesting ways. In an upcoming article about a Kinect-based project, designer Miguel Sanchez describes an interesting Kinect-based 3-D imaging system.

Sanchez writes:

My project started as a simple enterprise that later became a bit more challenging. The idea of capturing the silhouette of an individual standing in front of the Kinect was based on isolating those points that are between two distance thresholds from the camera. As the depth image already provides the distance measurement, all the pixels of the subject will fall within a narrow range of distances, while other objects in the scene will be outside of it. But I wanted to have just the contour line of a person and not all the pixels that belong to that person’s body.

OpenCV is a powerful computer vision library. I used it for my project because of its blobs function, which extracts the contours of the different isolated objects in a scene. As my image would only contain one object—the person standing in front of the camera—the blobs function would return the exact list of coordinates of the contour of the person, which was what I needed. Please note that this function makes heavy image processing easy for the user. It provides not just one, but a list of all the different objects that have been detected in the image. You can also specify whether holes inside a blob are permitted, as well as the minimum and maximum areas of detected blobs. But for my project, I was only interested in the biggest blob returned, which will be the one with index zero, as blobs are stored in decreasing order of area in the array returned by the blobs function.

Though it is not a fault of the blobs function, I quickly realized that I was getting more detail than I needed and that there was a bit of noise in the edges of the contour. Filtering noise out of a bitmap can be easily accomplished with a blur function, but smoothing out a contour did not sound so obvious to me.

A contour line can be simplified by removing certain points. A clever algorithm can do this by removing those points that are close enough to the overall contour line. One of these algorithms is the Douglas-Peucker recursive contour simplification algorithm. The algorithm starts with the two endpoints and accepts one point in between whose orthogonal distance from the line connecting the first two points is larger than a given threshold. Only the point with the largest distance is selected (or none if the threshold is not met). The process is repeated recursively, as new points are added, to create the list of accepted points (those that are contributing the most to the general contour given a user-provided threshold). The larger the threshold, the rougher the resulting contour will be.
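The recursion described above can be captured in a short pure-Python sketch (the function name and thresholds here are ours, not Sanchez's code):

```python
import math

def douglas_peucker(points, threshold):
    """Simplify a polyline: keep only points whose perpendicular distance
    from the chord between the endpoints exceeds the threshold."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = math.hypot(dx, dy) or 1e-12
    # Find the interior point farthest from the chord between the endpoints.
    best_i, best_d = 0, 0.0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        d = abs(dy * (px - x1) - dx * (py - y1)) / norm
        if d > best_d:
            best_i, best_d = i, d
    if best_d <= threshold:
        # No interior point matters: the chord alone represents this stretch.
        return [points[0], points[-1]]
    # Keep the farthest point and recurse on both halves around it.
    left = douglas_peucker(points[:best_i + 1], threshold)
    right = douglas_peucker(points[best_i:], threshold)
    return left[:-1] + right
```

As the quote notes, raising the threshold discards more interior points and yields a rougher outline.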

By simplifying the contour, the human silhouettes now look better and the noise is gone, but they look a bit synthetic. The last step I did was to perform a cubic-spline interpolation so the contour becomes a set of curves between the different original points of the simplified contour. It seems a bit twisted to simplify first only to later add back more points with the spline interpolation, but doing so creates a more visually pleasant and curvy result, which was my goal.
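The final smoothing step could be sketched like this, assuming SciPy's CubicSpline with periodic boundary conditions as one way to implement the cubic-spline interpolation Sanchez describes (his actual implementation may differ):

```python
import numpy as np
from scipy.interpolate import CubicSpline

def smooth_contour(points, samples=200):
    """Fit a closed cubic spline through simplified contour points and
    resample it densely for a smooth, curvy outline."""
    pts = np.asarray(points, dtype=float)
    # Close the loop so the spline wraps around the silhouette.
    closed = np.vstack([pts, pts[:1]])
    # Parameterize by cumulative chord length for even spacing along the curve.
    d = np.sqrt((np.diff(closed, axis=0) ** 2).sum(axis=1))
    t = np.concatenate([[0.0], np.cumsum(d)])
    spline = CubicSpline(t, closed, bc_type='periodic')
    return spline(np.linspace(0.0, t[-1], samples))
```

This mirrors the "simplify, then re-curve" pipeline in the quote: the spline passes exactly through the few surviving points while filling in smooth arcs between them.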


(Source: Miguel Sanchez)
(Source: Miguel Sanchez)

The nearby images show aspects of the process Sanchez describes in his article, where an offset between the human figure and the drawn silhouette is apparent.

The entire article is slated to appear in the June or July edition of Circuit Cellar.

Aerial Robot Demonstration Wows at TEDTalk

In a TEDTalk Thursday, engineer Vijay Kumar presented an exciting innovation in the field of unmanned aerial vehicle (UAV) technology. He detailed how a team of UPenn engineers retrofitted compact aerial robots with embedded technologies that enable them to swarm and operate as a team to take on a variety of remarkable tasks. A swarm can complete construction projects, orchestrate a nine-instrument piece of music, and much more.

The 0.1-lb aerial robot Kumar presented on stage—built by UPenn students Alex Kushleyev and Daniel Mellinger—consumed approximately 15 W, he said. The 8-inch design—which can operate outdoors or indoors without GPS—featured onboard accelerometers, gyros, and processors.

“An on-board processor essentially looks at what motions need to be executed, and combines these motions, and figures out what commands to send to the motors 600 times a second,” Kumar said.

Watch the video for the entire talk and demonstration. Nine aerial robots play six instruments at the 14:49 mark.

Zero-Power Sensor (ZPS) Network

Recently, we covered two notable projects featuring Echelon’s Pyxos technology: one about solid-state lighting solutions and one about a radiant floor heating zone controller. Here we present another innovative project: a zero-power sensor (ZPS) network on polymer.

The Zero Power Switch (Source: Wolfgang Richter, Faranak M.Zadeh)

The ZPS system—which was developed by Wolfgang Richter and Faranak M. Zadeh of Ident Technology AG—doesn’t require a battery or RF energy for operation. The sensors, developed on polymer foils, are fed by an electrical alternating field with a 200-kHz frequency. A Pyxos network enables you to transmit the wireless sensor data to various devices.

In their documentation, Wolfgang Richter and Faranak M. Zadeh write:

“The developed wireless zero-power sensors (ZPS) do not need power, a battery, or radio frequency (RF) energy in order to operate. The system is realized on polymer foils in a printing process and/or additional silicon and is very eco-friendly in production and use. The sensors are fed by an electrical alternating field with a frequency of 200 kHz at up to a 5-m distance. The ZPS sensors can be mounted anywhere they are needed, e.g., on the body, in a room, a machine, or a car. One ZPS server can work for a number of ZPS sensor clients and can be connected to any net to communicate with network intelligence and other servers. By modulating the electric field, the ZPS sensors can transmit a type of “sensor=o.k. signal” command. ZPS sensors can also be carried by humans (or animals) for vital signs monitoring, so they are ideal for wireless monitoring systems (e.g., “aging at home”). The ZPS system is a wireless, powerless, and cordless system that works simultaneously, so it is a self-organized system …

The wireless Skinplex zero-power sensor network is a very simply structured but reliably functioning multiple-sensor system that combines classical physics, as taught by Kirchhoff, with the latest advances in (smart) sensor technology. It works with a virtually unlimited number of sensor nodes in inertial space, without a protocol, and without batteries, cables, or connectors. A chip no bigger than a particle of dust will be fabricated this year with the assistance of Cottbus University and Prof. Wegner. The system is ideal for communicating via Pyxos/Echelon with other instances and servers.

A Pyxos network helps bring wireless ZPS sensor data over distances to external instances, nets, and servers. With the advanced Echelon technology, even the AC power line (PL) can be used.

As most of a ZPS server is realized in software, it can be easily programmed into a Pyxos network device, a very cost-saving effect! Applications range from machine controls, smart office solutions, and smart homes up to homes of the elderly and medical facilities, as well as everywhere else where a power line (PL) exists.”

Inside the ZPS project (Source: Wolfgang Richter, Faranak M.Zadeh)

For more information about Pyxos technology, visit Echelon’s website.

This project, as well as others, was promoted by Circuit Cellar based on a 2007 agreement with Echelon.

Robot Nav with Acoustic Delay Triangulation

Building a robot is a rite of passage for electronics engineers. And thus this magazine has published dozens of robotics-related articles over the years.

In the March issue, we present a particularly informative article on robot navigation. Larry Foltzer tackles the topic of robot positioning with acoustic delay triangulation. It’s more of a theoretical piece than a project article, but we’re confident you’ll find it intriguing and useful.

Here’s an excerpt from Foltzer’s article:

“I decided to explore what it takes, algorithmically speaking, to make a robot that is capable of discovering its position on a playing field and figuring out how to maneuver to another position within the defined field of play. Later on I will build a minimalist-like platform to test the algorithms’ performance.

In the interest of hardware simplicity, my goal is to use as few sensors as possible. I will use ultrasonic sensors to determine range to ultrasonic beacons located at the corners of the playing field and wheel-rotation sensors to measure distance traversed, if wheel-rotation rate times time proves to be unreliable.

From a software point of view, the machine must be able to determine robot position on a defined playing field, determine robot position relative to the target’s position, determine robot orientation or heading, calculate robot course change to approach target position, and periodically update current position and distance to the target. Because of my familiarity with Microchip Technology’s 8-bit microcontrollers and instruction sets, the PIC16F627A is my choice for the microcontrollers (mostly because I have them in my inventory).

To this date, the four goals listed—in terms of algorithm development and code—are complete and are the main subjects of this article. Going forward, focus must now shift to the hardware side, including software integration to test beyond pure simulation.

A brief survey of ultrasonic ranging sensors indicates that most commercially available units have a range capability of 20’ or less. This is for a sensor type that detects the echo of its own emission. However, in this case, the robot’s sensor will not have to detect its own echoes, but will instead receive the response to its query from an addressable beacon that acts like an active mirror. For navigation purposes, these mirrors are located at three of the four corners of the playing field. By using active mirrors or beacons, received signal strength will be significantly greater than in the usual echo ranging situation. Further, the use of the active mirror approach to ranging should enable expansion of the effective width of the sensor’s beam to increase the sensor’s effective field of view, reducing cost and complexity.

Taking the former into account, I decided the size of the playing field will be 16’ on a side, subdivided into 3” squares forming an (S × S) = (64 × 64) = (2^6 × 2^6) unit grid. I selected this size to simplify the binary arithmetic used in the calculations. For the purpose of illustration here, the target is considered to be at the center of the playing field, but it could very well be anywhere within the defined boundaries of the playing field.

Figure 1: Square playing field (Source: Larry Foltzer CC260)

Referring to Figure 1, the corners of the square playing field are labeled in clockwise order from A to D. Ultrasonic sonar transceiver beacons/active mirrors are placed at three of the corners of the playing field, at the corners marked A, B, and D.”

The issue in which this article appears will be available here in the coming days.