Microcontroller-Based Wireless Pedometer & Pace Tracker Project

Anyone can easily order a pedometer or GPS sports watch on the Internet. But engineers like challenges, right? With the right parts and a little know-how, you can engineer your own microcontroller-based wireless Bluetooth pedometer and pace tracker.

In the article, “Run With It” (Circuit Cellar 293, December 2014), Ellen Chuang and Julie Wang explain how they built an Atmel ATmega1284P microcontroller-based wireless pedometer and pace tracker. They wrote:

There’s a simple question most runners, walkers, and joggers ask themselves: How fast am I going? There are various tools for measuring pace, including step counters, GPS units, and smartphone applications. Pedometers are common tools for tracking physical activity. However, pedometers are typically either self-contained units that are out of sight and out of mind or expensive products like the FitBit and Nike+. So, for our culminating design project for Cornell University’s ECE 4760 Microcontrollers course, we decided to create a low-budget wireless pedometer and pace tracker.

The wireless pedometer includes a foot module (a) and a wrist module (b).

Chuang and Wang’s design is notable because it is Bluetooth-enabled, so it can connect with other Bluetooth devices. In addition, they separated the user interface from the measurement unit.

Our system contains two separate modules: a foot-mounted module that captures acceleration data and a wrist-mounted module with a user interface. The foot module captures your acceleration data, lightly processes it to find values of interest, and sends the data over Bluetooth to the wrist module. The wrist module then further processes the information to determine if a step has occurred. The wrist module also handles user input and displays information on an LCD.
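
The excerpt doesn’t spell out the wrist module’s step-detection logic, so the following is only a sketch of one common approach: threshold crossing with a refractory period so a single footfall isn’t counted twice. The function name, threshold, and timing constants are illustrative assumptions, not values from the article.

    #include <stdint.h>

    #define STEP_THRESHOLD  600   /* ADC counts; illustrative value  */
    #define REFRACTORY_MS   250   /* ignore re-crossings for 250 ms  */

    static uint32_t last_step_ms = 0;
    static uint32_t step_count   = 0;

    /* Call once per filtered z-axis sample; now_ms is a millisecond tick. */
    void detect_step(uint16_t z_sample, uint32_t now_ms)
    {
        if (z_sample > STEP_THRESHOLD &&
            (now_ms - last_step_ms) > REFRACTORY_MS) {
            step_count++;           /* one step detected       */
            last_step_ms = now_ms;  /* start refractory window */
        }
    }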

To use the device, the foot unit is strapped to your lower leg and the wrist unit is either strapped to your wrist or held in your hand. At power-up, the units automatically pair over a Bluetooth link. The wrist unit powers up in Configuration mode, which enables you to use the two push buttons to adjust your stride length and your desired pace in minutes per mile. You are first prompted for your stride length—the average length of one step—and then for your target speed in minutes per mile. The Enter button enables you to confirm the parameters, exit Configuration mode, and enter Pace Display mode. In Pace Display mode, the LCD displays the cumulative number of steps, your calculated pace, and your desired pace. At any point, you can reenter Configuration mode by toggling a switch on the wrist module.
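
Given a stride length and a running step count, the pace arithmetic is simple. Here is a minimal sketch, assuming stride length is entered in inches and elapsed time comes from a millisecond tick; neither detail is specified in the excerpt.

    #include <stdint.h>

    /* Pace in minutes per mile from steps, stride length, and elapsed time. */
    float pace_min_per_mile(uint32_t steps, float stride_in, uint32_t elapsed_ms)
    {
        float miles   = (steps * stride_in) / 63360.0f; /* 63,360 in per mile */
        float minutes = elapsed_ms / 60000.0f;
        if (miles <= 0.0f)
            return 0.0f;    /* avoid dividing by zero before the first step */
        return minutes / miles;
    }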

Both modules contain an Atmel ATmega1284P microcontroller and an HC-05 Bluetooth master/slave module.

The foot unit contains a ±1.5-g, single-axis Freescale Semiconductor MMA2260D accelerometer oriented with its positive measurement axis pointed away from the center of the Earth. We’ll refer to this direction as the z-axis. The analog data coming from the accelerometer is very noisy. Human stride frequency typically falls between 185 and 200 strides per minute, or around 3.33 Hz. To capture this frequency, as well as the higher-frequency signals that correspond to other foot movements, while filtering out the excess noise, we pass the accelerometer data through a low-pass filter with a cutoff frequency of about 33 Hz, built from a 10-kΩ resistor and a 470-nF capacitor. Additionally, to fully utilize the internal ADC’s 10-bit range, we dropped the voltage across a 20-kΩ resistor…
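
As a quick check on those component values: a single-pole RC filter’s cutoff is fc = 1/(2πRC) = 1/(2π × 10 kΩ × 470 nF) ≈ 33.9 Hz, which matches the roughly 33-Hz figure the authors cite and sits about a decade above the 3.33-Hz stride frequency.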

This is the foot unit. We used a MAX233/RS-232 interface for debugging purposes; it is not part of the final foot module.

The wrist unit includes a 16 × 2 LCD, a number of user-input buttons and switches, status LEDs, and a Bluetooth transceiver (see Figure 2). The three buttons and the switch are used in Configuration mode. Additionally, we included the following LEDs: a red LED on the board that toggles every time a step is detected; a yellow LED that toggles every time a packet is received via Bluetooth; and a green “keep alive” LED. The software on the microcontroller manages all of these tasks, which we cover in the next section.

The complete article appears in Circuit Cellar 293 (December 2014).

Freescale High-Sensitivity Accelerometer Family

Freescale recently introduced a new range of three-axis accelerometers offering high sensitivity at low power consumption. According to Freescale, the FXLN83xxQ family is capable of detecting acceleration information often missed by less accurate sensors commonly used in consumer products such as smartphones and exercise activity monitors. In conjunction with appropriate software algorithms, its improved sensitivity allows the new sensor to be used for equipment fault prognostication (for predictive maintenance), condition monitoring, and medical tamper detection applications.

Source: Freescale

The 3 mm × 3 mm chip has a bandwidth of 2.7 kHz and uses analog output signals for direct connection to a microcontroller’s ADC input. Each chip has two levels of sensitivity that can be changed on the fly. The complete family covers acceleration ranges of ±2, ±4, ±8, and ±16 g, with gains of 229.0, 114.5, 57.25, and 28.62 mV/g, respectively. Zero g is indicated by an output level of 0.75 V.
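
Converting the analog output back into acceleration is just a matter of subtracting the 0.75-V zero-g level and dividing by the range-dependent gain. Below is a minimal sketch using the 229.0-mV/g gain of the ±2-g setting; the 10-bit ADC and 3.3-V reference are assumptions for illustration.

    #include <stdint.h>

    /* Convert a 10-bit ADC reading of the sensor output to acceleration in g. */
    float counts_to_g(uint16_t adc_counts)
    {
        const float v_ref  = 3.3f;     /* assumed ADC reference (V)   */
        const float v_zero = 0.75f;    /* zero-g output level (V)     */
        const float gain   = 0.2290f;  /* V per g at the ±2-g setting */
        float v_out = (adc_counts / 1023.0f) * v_ref;
        return (v_out - v_zero) / gain;
    }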

The FXLN83xxQ family:

  • FXLN83x1Q: ±2- or ±8-g range
  • FXLN83x2Q: ±4- or ±16-g range
  • FXLN836xQ: 1.1-kHz x- and y-axis bandwidth (600-Hz z-axis)
  • FXLN837xQ: 2.7-kHz x- and y-axis bandwidth (600-Hz z-axis)

The sensors operate from 1.71 to 3.6 V (typically 180 µA; 30 nA in shutdown). The company has also made available the DEMOFXLN83xxQ evaluation breakout board with a ready-mounted sensor to simplify device integration into a test and development environment.

Q&A: Andrew Godbehere, Imaginative Engineering

Engineers are inherently imaginative. I recently spoke with Andrew Godbehere, an Electrical Engineering PhD candidate at the University of California, Berkeley, about how his ideas become realities, his design process, and his dream project. —Nan Price, Associate Editor

Andrew Godbehere

NAN: You are currently working toward your Electrical Engineering PhD at the University of California, Berkeley. Can you describe any of the electronics projects you’ve worked on?

ANDREW: In my final project at Cornell University, I worked with a friend of mine, Nathan Ward, to make wearable wireless accelerometers and find some way to translate a dancer’s movement into music, in a project we called CUMotive. The computational core was an Atmel ATmega644V connected to an Atmel AT86RF230 802.15.4 wireless transceiver. We designed the PCBs, including the transmission line to feed the ceramic chip antenna. Everything was hand-soldered, though I recommend using an oven instead. We used Kionix KXP74 tri-axis accelerometers, which we encased in a lot of hot glue to create easy-to-handle boards and to shield them from static.

This is the central control belt-pack to be worn by a dancer for CUMotive, the wearable accelerometer project. An Atmel ATmega644V and an AT86RF230 were used inside to interface to a synthesizer. The plastic enclosure has holes for the belt to attach to a dancer. Wires connect to accelerometers, which are worn on the dancer’s limbs.

The dancer had four accelerometers connected to a belt pack with an Atmel chip and transceiver. On the receiver side, a musical instrument digital interface (MIDI) communicated with a synthesizer. (Design details are available at http://people.ece.cornell.edu/land/courses/ece4760/FinalProjects/s2007/njw23_abg34/index.htm.)

I was excited about designing PCBs for 802.15.4 radios and making them work. I was also enthusiastic about trying to figure out how to make some sort of music with the product. We programmed several possibilities, one of which was a sort of theremin; another was a sort of drum kit. I found this to be the even more difficult part—not just the making, but the making sense.
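
None of CUMotive’s actual mapping code appears in the interview, but the gist of an accelerometer-to-MIDI mapping fits in a few lines. The three-byte note-on message below is standard MIDI; the mapping itself and the send_byte() transmit routine are illustrative guesses, not the project’s code.

    #include <stdint.h>

    extern void send_byte(uint8_t b);  /* placeholder for a UART transmit */

    /* Map an accelerometer magnitude to a MIDI note-on message. */
    void accel_to_midi(uint16_t accel_mag)
    {
        uint8_t note     = 36 + (accel_mag >> 4) % 48;  /* pick a pitch */
        uint8_t velocity = 100;
        send_byte(0x90);      /* note-on, MIDI channel 1 */
        send_byte(note);      /* note number, 0-127      */
        send_byte(velocity);  /* velocity, 0-127         */
    }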

When I got to Berkeley, my work switched to the theoretical. I tried to learn everything I could about robotic systems and how to make sense of them and their movements.

NAN: Describe the real-time machine vision-tracking algorithm and integrated vision system you developed for the “Are We There Yet?” installation.

ANDREW: I’ve always been interested in using electronics and robotics for art. Having a designated emphasis in New Media on my degree, I was fortunate enough to be invited to help a professor on a fascinating project.

This view of the Yud Gallery is from the installed camera with three visitors present. Note the specular reflections on the floor. They moved throughout the day with the sun. This movement needed to be discerned from a visitor’s typical movement.

For the “Are We There Yet?” installation, we used a PointGrey FireFlyMV camera with a wide-angle lens. The camera was situated a couple hundred feet away from the control computer, so we used a USB-to-Ethernet range extender to communicate with the camera.

We installed a color camera in a gallery in the Contemporary Jewish Museum in San Francisco, CA. We used Meyer Sound speakers with a high-end controller system, which enabled us to “position” sound in the space and to sweep audio tracks around at (the computer’s programmed) will. The Meyer Sound D-Mitri platform was controlled by the computer with Open Sound Control (OSC).

This view of the Yud Gallery is from the perspective of the computer running the analysis. This is a probabilistic view, where the brightness of each pixel represents the “belief” that the pixel is part of an interesting foreground object, such as a pedestrian. Note the hot spots corresponding nicely with the locations of the visitors in the image above.

The hard work was then to program the computer to discern humans from floors, furniture, shadows, sunbeams, and cloud reflections. The gallery had many skylights, which made the lighting very dynamic. I then programmed the computer to keep track of people as they moved and found that this dynamic information was itself useful for determining whether a detected color perturbation was human or not.

Once complete, the experience of the installation was beautiful, enchanting, and maybe a little spooky. The audio tracks were all questions (e.g., “Are we there yet?”) and they were always spoken near you, as if addressed to you. They responded to your movement in a way that felt to me like dancing with a ghost. You can watch videos about the installation at www.are-we-there-yet.org.

The “Are We There Yet?” project opens itself up to possible use as an embedded system. I’ve been told by the start-up company Romo (www.kickstarter.com/projects/peterseid/romo-the-smartphone-robot-for-everyone), which was evaluating my vision-tracking code for use in its cute iPhone rover, that the software I wrote works on iOS devices. Further, I’d say that if someone were interested, they could create a similar pedestrian, auto, pet, or cloud-tracking system using a Raspberry Pi and a reasonable webcam.
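
For anyone tempted by that suggestion, the skeleton of such a system is only a few lines of OpenCV. The sketch below uses the library’s stock MOG2 background subtractor rather than Godbehere’s own algorithm, so treat it as a starting point, not a reproduction of the installation’s code.

    #include <opencv2/opencv.hpp>

    int main()
    {
        cv::VideoCapture cap(0);  // default webcam
        auto bg = cv::createBackgroundSubtractorMOG2(500, 16.0, true);
        cv::Mat frame, fgmask;
        while (cap.read(frame)) {
            bg->apply(frame, fgmask);          // per-pixel foreground "belief"
            cv::imshow("foreground", fgmask);  // bright = likely a visitor
            if (cv::waitKey(30) == 27)         // Esc to quit
                break;
        }
        return 0;
    }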

I may create an automatic cloud-tracking system to watch clouds. I think computers could be capable of this kind of abstraction, even though we think of the leisurely pastime as the mark of a dreamer.

NAN: Some of the projects you’ve contributed to focus on switched linear systems, hybrid systems, wearable interfaces, and computation and control. Tell us about the projects and your research process.

ANDREW: I think my research is all driven by imagination. I try to imagine a world that could be, a world that I think would be nice, or better, or important. Once I have an idea that captivates my imagination in this way, I have no choice but to try to realize the idea and to seek out the knowledge necessary to do so.

For the wearable wireless accelerometers, it began with the thought: Wouldn’t it be cool if dance and music were inherently connected the way we try to make it seem when we’re dancing? From that thought, the designs started. I thought: The project has to be wireless and low power, it needs accelerometers to measure movement, it needs a reasonable processor to handle the data, it needs MIDI output, and so forth.

My switched linear systems research came about in a different way. As I was in class learning about theories regarding stabilization of hybrid systems, I thought: Why would we do it this complicated way, when I have this reasonably simple intuition that seems to solve the problem? I happened to see the problem a different way as my intuition was trying to grapple with a new concept. That naive accident ended up as a publication, “Stabilization of Planar Switched Linear Systems Using Polar Coordinates,” which I presented in 2010 at Hybrid Systems: Computation and Control (HSCC) in Stockholm, Sweden.

NAN: How did you become interested in electronics?

ANDREW: I always thought things that moved seemingly of their own volition were cool and inherently attention-grabbing. I would think: Did it really just do that? How is that possible?

Andrew worked on this project when computers still had parallel ports. a—This photo shows manually etched PCB traces for a digital EKG (the attempted EEG) with 8-bit LED optoisolation. The rainbow cable connects to a computer’s parallel port. The interface code was written in C++ and ran on DOS. b—The EKG circuitry and digitizer are shown on the left. The 8-bit parallel computer interface is on the right. Connecting the two boards is an array of coupled LEDs and phototransistors, encased in heat shrink tubing to shield against outside light.

Electric rally-car tracks and radio-controlled cars were favorites of mine. I hadn’t really thought about working with electronics or computers until middle school. Before that, I was all about paleontology. Then, I saw an episode of Scientific American Frontiers, which featured Alan Alda excitedly interviewing RoboCup contestants. Watching RoboCup [a soccer game involving robotic players], I was absolutely enchanted.

While my childhood electronic toys moved and somehow acted as their own entities, they were puppets to my intentions. Watching RoboCup, I knew these robots were somehow making their own decisions on the fly, magically making beautiful passes and goals not as puppets, but as something more majestic. I didn’t know about the technical blood, sweat, and tears that went into it all, so I could have these romantic fantasies of what it was, but I was hooked from that moment.

That spurred me to apply to a specialized science and engineering high school program. It was there that I was fortunate enough to attend a fabulous electronics class (taught by David Peins), where I learned the basics of electronics, the joy of tinkering, and even PCB design and assembly (drilling included). I loved everything involved. Even before I became academically invested in the field, I fell in love with the manual craft of making a circuit.

NAN: Tell us about your first design.

ANDREW: Once I’d learned something about designing and making circuits, I jumped in whole-hog, to a comical degree. My very first project without any course direction was an electroencephalograph!

I wanted to make stuff move on my computer with my brain, the obvious first step. I started with a rough design and worked on tweaking parameters and finding components.

In retrospect, I think that first attempt was actually an electromyograph that read the movements of my eye muscles. And it definitely was an electrocardiograph. Success!

Someone suggested that it might not be a good idea to have a power supply hooked up in any reasonably direct path with your brain. So, in my second attempt, I digitized the signal on the brain side and hooked it up to eight white LEDs. On the other side, I had eight phototransistors coupled with the LEDs and covered with heat-shrink tubing to keep out outside light. That part worked, and I was excited about it, even though I was having some trouble properly tuning the op-amps in that version.

NAN: Describe your “dream project.”

ANDREW: Augmented reality goggles. I’m dead serious about that, too. If given enough time and money, I would start making them.

I would use some emerging organic light-emitting diode (OLED) technology. I’m eyeing the start-up MicroOLED (www.microoled.net) for its low-power “near-to-eye” display technologies. They aren’t available yet, but I’m hopeful they will be soon. I’d probably hook that up to a Raspberry Pi SBC, which is small enough to be worn reasonably comfortably.

Small, high-resolution cameras have proliferated with modern cell phones, which could easily be mounted into the sides of goggles, driving each OLED display independently. Then, it’s just a matter of creativity for how to use your newfound vision! The OpenCV computer vision library offers a great starting point for applications such as face detection, image segmentation, and tracking.
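
As a taste of that starting point, OpenCV’s bundled Haar cascades can detect faces in a handful of calls. A minimal sketch follows; the cascade file ships with OpenCV, and its path on your system is the only thing you’d need to adjust.

    #include <opencv2/opencv.hpp>
    #include <vector>

    // Detect faces in one frame and return their bounding boxes.
    std::vector<cv::Rect> find_faces(const cv::Mat& frame,
                                     cv::CascadeClassifier& cascade)
    {
        cv::Mat gray;
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::equalizeHist(gray, gray);  // normalize lighting first
        std::vector<cv::Rect> faces;
        cascade.detectMultiScale(gray, faces, 1.1, 3);
        return faces;
    }

    // Usage: cv::CascadeClassifier c("haarcascade_frontalface_default.xml");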

Google Glass is starting to get some notice as a sort of “heads-up” display, but in my opinion, it doesn’t go nearly far enough. Here’s the craziest part—please bear with me—I’m willing to give up directly viewing the world with my natural eyes: I would be willing to wear full field-of-vision goggles with high-resolution OLED displays showing stereoscopic views from two high-resolution smartphone-style cameras. (At least until the technology gets better, as described in Rainbows End by Vernor Vinge.) I think, for this version, all the components are just now becoming available.

Augmented reality goggles would do a number of things for vision and human-computer interaction (HCI). First, 3-D overlays in the real world would be possible.

Crude example: I’m really terrible with faces and names, but computers are now great with that, so why not get a little help and overlay nametags on people when I want? Another fascinating thing for me is that this concept of vision abstracts the body from the eyes. So, you could theoretically connect to the feed from any stereoscopic cameras around (e.g., on an airplane, in the Grand Canyon, or on the back of some wild animal), or you could even switch points of view with your friend!

Perhaps reality goggles are not commercially viable now, but I would unabashedly use them for myself. I dream about them, so why not make them?

Real-Time Trailer Monitoring System

Dean Boman, a retired electrical engineer and spacecraft communications systems designer, noticed a problem during vacations towing the family’s RV trailer—tire blowouts.

“In every case, there were very subtle changes in the trailer handling in the minutes prior to the blowouts, but the changes were subtle enough to go unnoticed,” he says in his article appearing in January’s Circuit Cellar magazine.

So Boman, whose retirement hobbies include embedded system design, built the trailer monitoring system (TMS), which monitors the vibration of each trailer tire, displays the information to the driver, and sounds an alarm if tire vibration or heat exceeds a certain threshold. The alarm feature gives the driver time to pull over before a dangerous or damaging blowout occurs.

Photo 1—The Trailer Monitoring System consists of the display unit and a remote data unit (RDU) mounted in the trailer. The top bar graph shows the right rear axle vibration level and the lower bar graph is for the left rear axle. Numbers on the right are the axle temperatures. The vertical bar to the right of the bar graph is the driver-selected vibration audio alarm threshold. Placing the toggle switch in the other position displays the front axle data.

Boman’s article describes the overall layout and operation of his system.

“The TMS consists of accelerometers mounted on each tire’s axles to convert the gravitational (g) level vibration into an analog voltage. Each axle also contains a temperature sensor to measure the axle temperature, which is used to detect bearing or brake problems. Our trailer uses the Dexter Torflex suspension system with four independent axles supporting four tires. Therefore, a total of four accelerometers and four temperature sensors were required.

“Each tire’s vibration and temperature data is processed by a remote data unit (RDU) that is mounted in the trailer. This unit formats the data into RS-232 packets, which it sends to the display unit, which is mounted in the tow vehicle.”
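
The excerpt doesn’t define the packet layout, so the structure below is purely hypothetical: one fixed-length frame per update, with a sync byte and a checksum so the display unit can resynchronize on a noisy serial link. It is a sketch of the idea, not Boman’s actual format.

    #include <stdint.h>

    /* Hypothetical RDU-to-display frame; not the article's actual format. */
    struct rdu_packet {
        uint8_t header;        /* fixed sync byte, e.g., 0xA5       */
        uint8_t vibration[4];  /* one vibration level per axle      */
        int8_t  temp_c[4];     /* one axle temperature, in Celsius  */
        uint8_t checksum;      /* simple sum of the preceding bytes */
    };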

Photo 1 shows the display unit. Figure 1 is the complete system’s block diagram.

Figure 1—This block diagram shows the remote data unit accepting data from the accelerometers and temperature sensors and sending the data to the display unit, which is located in the tow vehicle for the driver display.

The remote data unit’s (RDU’s) hardware design includes a custom PCB with a Microchip Technology PIC18F2620 processor, a power supply, an RS-232 interface, temperature sensor interfaces, and accelerometers. Photo 2 shows the final board assembly. A 78L05 linear regulator implements the power supply, and the RS-232 interface utilizes a Maxim Integrated MAX232. The RDU’s custom software design is written in C with the Microchip MPLAB integrated development environment (IDE).

Photo 2—The remote data unit’s board assembly is shown.

The display unit’s hardware includes a Microchip Technology PIC18F2620 processor, a power supply, a user-control interface, an LCD interface, and an RS-232 data interface (see Figure 1). Boman chose a Hantronix HDM16216H-4 16 × 2 LCD, which is inexpensive and offers a simple parallel interface. Photo 3 shows the full assembly.

Photo 3—The display unit’s completed assembly is shown with the enclosure opened. The board on top is the LCD’s rear view. The board on bottom is the display unit board.

Boman used the Microchip MPLAB IDE to write the display unit’s software in C.

“To generate the display image, the vibration data is first converted into an 11-element bar graph format and the temperature values are converted from Centigrade to Fahrenheit. Based on the toggle switch’s position, either the front or the rear axle data is written to the LCD screen,” Boman says.

“To implement the audio alarm function, the ADC is read to determine the driver-selected alarm level as provided by the potentiometer setting. If the vibration level for any of the four axles exceeds the driver-set level for more than 5 s, the audio alarm is sounded.

“The 5-s requirement prevents the alarm from sounding for bumps in the road, but enables vibration due to tread separation or tire bubbles to sound the alarm. The audio alarm also sounds if any of the temperature readings exceeds 160°F, which could indicate a possible bearing or brake failure.”
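
The quoted rules map directly to code. Here is a minimal sketch of the two alarm conditions, assuming the check runs once per second; the sampling details and data types are illustrative, not from the article.

    #include <stdint.h>

    /* Alarm rules: vibration above the driver-set level for more than */
    /* 5 s, or any axle hotter than 160°F. Call once per second.       */
    int alarm_needed(const uint8_t vib[4], const int8_t temp_c[4],
                     uint8_t vib_limit, uint8_t *over_secs)
    {
        int any_over = 0;
        for (int i = 0; i < 4; i++) {
            if (vib[i] > vib_limit)
                any_over = 1;
            int temp_f = (temp_c[i] * 9) / 5 + 32;  /* Centigrade to F */
            if (temp_f > 160)
                return 1;   /* possible bearing or brake failure */
        }
        *over_secs = any_over ? (uint8_t)(*over_secs + 1) : 0;
        return *over_secs > 5;  /* sustained vibration, not a bump */
    }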

The comprehensive monitoring gives Boman peace of mind behind the wheel. “While the TMS cannot prevent tire problems, it does provide advance warning so the driver can take action to prevent serious damage or even an accident,” he says.

For more details about Boman’s project, including RDU and display unit schematics, check out the January issue.

High-Speed Laser Range Finder Board with IMU

The NavRanger-OEM

The NavRanger-OEM combines a 20,000-samples-per-second laser range finder with a nine-axis inertial measurement unit (IMU) on a single 3" × 6" (7.7 × 15.3 cm) circuit board. The board features I/O resources and processing capability for application-specific control solutions.

The NavRanger’s laser range finder measures the time of flight of a short light pulse from an IR laser. The time-to-digital converter has a 65-ps resolution (i.e., approximately 1 cm). The Class 1M laser has a 10-ns pulse width, a 0.8-mW average power, and a 9° × 25° divergence without optics. The detector comprises an avalanche photodiode with a two-point variable-gain amplifier and a variable-threshold digitizer. These features enable a 10-cm × 10-cm piece of white paper to be detected at 30 m with a laser collimator and 25-mm receiver optics.
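
That resolution figure checks out: light travels about 0.3 mm per picosecond, so a 65-ps step corresponds to roughly 19.5 mm of path length. Because the pulse travels out and back, the one-way range resolution is half of that, or about 1 cm.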

The range finder includes enough I/O to build a complete robot or scanning solution. The wide-range 9-to-28-V input supply voltage enables operation in 12- and 24-V battery environments. The NavRanger’s IMU is an InvenSense nine-axis MPU-9150, which combines an accelerometer, a gyroscope, and a magnetometer on one chip. A 32-bit Freescale ColdFire MCF52255 microcontroller provides the processing power and additional I/O. USB and CAN buses provide the board’s high-speed interfaces. The board also has connectors and power to mount a Digi International XBee wireless module and a TTL GPS module.

The board comes with embedded software and a client application that runs on a Windows PC or Mac OS X. It also includes modifiable source code for the embedded and client applications. The NavRanger-OEM costs $495.

Integrated Knowledge Systems, Inc.
www.iknowsystems.com