Build a Persistence-of-Vision Display

Using LEDs and PIC32

Learn how these two Cornell University students created a persistence-of-vision (POV) display. They found a way to fit an LED strip onto the mechanically rigid base of a box window fan. The POV display creates the illusion of an image and shows anything from an analog clock to ASCII text and complex images.

By Han Li and Emily Sun

Visual feedback is a key aspect of human interaction in everyday life. With technology, the beauty of the visual world can be preserved with images and videos. We set out to create a persistence-of-vision (POV) display that is both multifunctional and easy to use, through the use of a large box fan. Box fans are often found by the window on hot summer days, and can be made quite unique with the integration of a “cool” POV display. For our project, we found a way to fit a DotStar LED strip onto the mechanically rigid base of a box fan. As such, the box fan serves as an ideal platform for a POV display, without the need to construct a well-calibrated rotating setup with a DC motor. The box fan also has preset speed settings, which is convenient for testing.

The novelty of this POV display makes it a good conversation starter, and it can be easily assembled and customized. The display creates an illusion of an image and shows anything from an analog clock to ASCII text and complex images.

In designing our POV fan display, the first thing we measured was the fan’s speed of rotation, which we determined by flashing a strobe light through the fan blades. On the slowest setting, the fan rotates at approximately 7 Hz, which is equivalent to 143 ms per rotation. The angular resolution of the display’s image generator is limited by timing constraints, so we defined 100 tick locations around the periphery of the fan. Since the LEDs are programmed to draw the image twice per rotation, the refresh rate of the display doubles to around 14 Hz—each pixel blinks 14 times per second. For the human eye, the POV effect is achieved at around 15 Hz, so our setup delivers a decent result.
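As a rough sketch of that timing arithmetic (the names and helpers here are illustrative, not the authors’ firmware), the per-tick interval and the effective refresh rate fall directly out of the measured rotation period:

/* Timing sketch: at 7 Hz, one rotation takes ~143,000 us.
   With 100 ticks per rotation, the display advances one angular
   position roughly every 1,430 us; drawing the image twice per
   rotation doubles the effective refresh rate to ~14 Hz. */
#define NUM_TICKS       100   /* angular positions per rotation */
#define IMAGES_PER_REV    2   /* image drawn twice per rotation */

static unsigned int tick_interval_us(unsigned int rotation_period_us)
{
    return rotation_period_us / NUM_TICKS;     /* 143000 / 100 = 1430 us */
}

static unsigned int refresh_rate_hz(unsigned int rotation_period_us)
{
    return (1000000u * IMAGES_PER_REV) / rotation_period_us;   /* ~14 Hz */
}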

With an interrupt time of approximately 1 ms, and with the Hall effect sensor updating the period estimate on each rotation, the positioning of displayed elements on the fan varies by at most 2.5 degrees. During testing, we observed no rotational jitter greater than 2.5 degrees with 100 display angles.
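A minimal sketch of that period-tracking scheme might look like the following, assuming a hypothetical free-running microsecond counter (on a PIC32, one could derive this from the core timer). The names are illustrative rather than the authors’ actual code:

volatile unsigned int rotation_period_us = 143000;  /* seeded at ~7 Hz */
static volatile unsigned int last_edge_us;

/* Fires once per rotation when the magnet passes the Hall sensor.
   Refreshing the period on every pass keeps the 100-tick schedule
   locked to the actual fan speed, bounding positional drift. */
void hall_sensor_isr(void)
{
    unsigned int now = micros_now();   /* hypothetical us-counter read */
    rotation_period_us = now - last_edge_us;
    last_edge_us = now;
}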

HARDWARE DESIGN

The hardware components are a box fan, DotStar LED strip, tri-state buffer, Hall effect sensor, 5 V battery bank, 9 V battery, a piece of 0.635 cm × 2.54 cm × 50 cm plywood and a Microchip Technology PIC32 microcontroller on a custom PCB [1].

Figure 1
Hardware setup with a closeup of Hall effect sensor and magnet placement

The custom PCB with the mounted PIC32 is secured onto the protoboard above a piece of Styrofoam to prevent short circuits (Figure 1). The protoboard itself contains the power distribution and level shifting required for the LED strip. The DotStar LED strip must be driven at 5 V and draws about 60 mA per LED at full intensity. Because the PIC32’s outputs are 3.3 V, an ON Semiconductor 74LS125 tri-state buffer [2] is used as a level shifter: the buffer’s gate (enable) pins are tied to ground and the buffer is powered from the 5 V rail (Figure 2). The 9 V battery is then connected directly to the custom PCB through its adapter, and the 5 V battery pack is connected to the power rail on the protoboard (Figure 1).

Figure 2
Schematic of hardware setup
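For reference, DotStar (APA102-style) strips are driven by clocking frames over SPI, here through the level shifter. The sketch below shows the general frame format, assuming a hypothetical spi_write() byte-out helper; it illustrates the protocol, not the authors’ driver:

#define NUM_LEDS 32                            /* strip length (assumed) */

static void dotstar_show(const unsigned char rgb[][3], int n)
{
    int i;
    for (i = 0; i < 4; i++) spi_write(0x00);   /* 32-bit start frame */
    for (i = 0; i < n; i++) {
        spi_write(0xE0 | 31);                  /* 0b111 header + 5-bit brightness */
        spi_write(rgb[i][2]);                  /* blue  */
        spi_write(rgb[i][1]);                  /* green */
        spi_write(rgb[i][0]);                  /* red   */
    }
    /* End frame: extra clock edges (at least n/2) to latch all LEDs. */
    for (i = 0; i < (n + 15) / 16; i++) spi_write(0xFF);
}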

In terms of mechanical setup, the front-facing grill on the box fan is removed for easy access. A piece of plywood is mounted onto the fan with two wood screws on the opposite side of the fan’s plastic centerpiece. The DotStar LED strip is secured to the plywood with zip ties. The metal ridges that secure the front-facing grill are bent outward to allow for smooth rotation of the mounted plywood piece.  …

Read the full article in the October issue of Circuit Cellar (#339)

Don’t miss out on upcoming issues of Circuit Cellar. Subscribe today!

Note: We’ve made the October 2017 issue of Circuit Cellar available as a free sample issue. In it, you’ll find a rich variety of the kinds of articles and information that exemplify a typical issue of the current magazine.

LIDAR 3D Imaging on a Budget

PIC-32-Based Design

Demand is on the rise for 3D image data in a variety of applications. That has spurred research into LIDAR systems capable of keeping pace. Learn how this Cornell student leveraged inexpensive LIDAR sensors to build a 3D imaging system—all within a budget of around $200.

By Chris Graef

There’s a growing demand for 3D image data in a variety of applications, from autonomous cars to military base security. This has prompted research into high-precision LIDAR systems capable of creating extremely clear 3D images. While these high-end systems can produce accurate and precise images, they can cost anywhere from several thousand to tens of thousands of dollars. A side effect of this research, however, is the increasing availability of LIDAR devices at prices much more affordable for tinkerers, students, hobbyists and budget-constrained embedded system developers. Using this new supply of inexpensive LIDAR sensors, I was able to build a 3D imaging system on a budget of around $200. The major parts used for the system are listed in Table 1.

Table 1 Shown here are the cost and quantity of the major components used in the project. Not included are some smaller components such as wires, resistors and op amps.

At a glance, my LIDAR scanner works by turning a single-point LIDAR range finder through a scan pattern. I use a Microchip PIC32 microcontroller to control two analog feedback servos—one setting azimuth angle and one setting altitude angle—to move a mounted LIDAR distance sensor through a scan pattern. By synchronizing the feedback data of these two servos with the distance readings from the LIDAR sensor, the system defines one point in 3D space in a spherical coordinate format. After allowing the system time to create 10,000 to 20,000 points, the result is a 3D image made up of distinct spatial points. These points are stored in a point cloud data file format, which can be displayed by graphing software such as MATLAB.
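As a sketch of that coordinate conversion (assuming the sensor sits exactly at the origin, with azimuth measured in the horizontal plane and altitude measured up from it), each synchronized sample maps to one Cartesian point:

#include <math.h>

typedef struct { float x, y, z; } point3_t;

/* Convert one (azimuth, altitude, distance) sample into a Cartesian
   point. Angles are in radians; distance is in whatever unit the
   LIDAR reports. */
static point3_t spherical_to_cartesian(float azimuth, float altitude,
                                       float distance)
{
    point3_t p;
    p.x = distance * cosf(altitude) * cosf(azimuth);
    p.y = distance * cosf(altitude) * sinf(azimuth);
    p.z = distance * sinf(altitude);
    return p;
}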

MECHANICAL DESIGN

A CAD model of the imaging system is shown in Figure 1. The servos are shown in blue, the LIDAR is shown in red and the 3D printed mounts are shown in gray. All the components are connected using nuts and machine screws. The lower (azimuth) servo rotates the entire apparatus above it. The upper (altitude) servo rotates just the LIDAR sensor. The combined motion of the two servos results in the scan pattern of the system.

Figure 1
A CAD model of the LIDAR sensor and servo mounting. The LIDAR sensor is shown in red, the servos are shown in blue and the mounting brackets are shown in gray.

One thing to note in this design is the slots used on the mounting brackets to fasten both the altitude servo and the LIDAR sensor. One of the biggest requirements for the mechanical design of this project was to ensure that the center of rotation for the LIDAR sensor was at the center of the scanner. If the LIDAR sensor is positioned away from either axis of rotation, error is introduced into the system. Here’s why: when converting raw data to Cartesian points, we assume that the LIDAR sensor is reporting the distance to a point in 3D space from the origin of our spherical coordinate space. Deviation from the center of rotation for the azimuth or altitude angle would mean that we are recording a distance from somewhere else in our coordinate space.

It’s still possible to get accurate 3D points if the LIDAR sensor is not at the center of rotation, but this requires precise measurement of where the LIDAR sensor actually sits in our coordinate space, and the use of more complex mathematics to transform the measured data into accurate 3D position points. Adding a couple of slots to a 3D-printed bracket seemed easier and more effective. These slots allow micro-adjustments in two dimensions, so that the LIDAR sensor lies at the direct center of both axes of rotation.

ELECTRICAL DESIGN

There are two main electrical circuits in this design: the power/servo control circuit and the feedback amplifier circuit. The power/servo control circuit shown in Figure 2 was designed to let the PIC32 MCU send a pulse-width modulation (PWM) signal to the servos, while protecting the MCU from potentially harmful electrical noise generated by the servo motors. The first step in reducing noise was to use an opto-isolator as a switch for each servo motor’s control pin. By driving pins RPB9 and RPB7 high, the MCU connects the servo motors’ control pins to the 5-V source. This converts the PIC32’s 3.3-V PWM output into a 5-V PWM usable by the two servos, while isolating the PIC32’s output pins from any electrical noise.

Figure 2
Shown here is the circuit of the power supply module. The servos are shown as motors. RPB9 and RPB7 are connected to output pins on the PIC32.

If the servos were the only devices connected to the MCU, the opto-isolator configuration alone would have been enough to protect the PIC32 from the servo motors’ electrical noise. However, the MCU must share a common ground with the LIDAR sensor to be able to read the sensor’s analog output. This means electrical noise on the power/servo control circuit can travel to the PIC32 through this common ground. To reduce the chance of electrical damage, two capacitors—one ceramic and one electrolytic—were connected across the 5-V source and ground. The smaller ceramic capacitor attenuates small-amplitude, high-frequency noise, and the larger electrolytic capacitor attenuates the lower-frequency noise. Together, the two capacitors ideally stop any damaging noise from travelling through the power connection by shunting it to ground instead.  …

Read the full article in the September issue of Circuit Cellar (#338)



Money Sorting Machines (Part 3)

Bill Validation

In this final article of his money sorting machine series, Jeff wraps up his coin sorting project and examines how a bill validator can tell one bill’s denomination from another.

By Jeff Bachiochi

Most of us connect Ben Franklin with kites and lightning. He was also a printer, and he might be best known for Poor Richard’s Almanack—a yearly publication he produced from 1732 to 1758 under the pseudonym of Richard Saunders. It was a best-seller, and thanks to his wit and wisdom, his portrait was added to the cover of The Old Farmer’s Almanac in 1851—appearing opposite the founder, Robert B. Thomas. It remains there today.

As a master printer and engraver, in 1730 Franklin began printing all paper money issued by Pennsylvania, New Jersey and Delaware. Paper money was first introduced in the region in 1723, but it remained a hot political issue. That’s because it helped farmers and tradesmen, while merchants and landowners wanted it eliminated or limited in its circulation. Paper money printed from ordinary type was easy to counterfeit, but Ben’s ingenuity solved that problem by printing pictures of leaves on every piece of money. Counterfeiters could not duplicate—or even imitate—the fine lines and irregular patterns. The process by which he made the printing plates was secret, but the plates were probably cast in type metal from molds made by pressing leaves into plaster of Paris. Thus began the government’s vigilant effort to thwart counterfeiters.

Today every aspect of our paper currency is controlled—from its design to its printing, as well as its monitoring and destruction. The paper (which is not actually paper) and ink (multiple types and formulas) are fabricated for the express use of the Bureau of Engraving and Printing, the Treasury bureau responsible for paper money—as opposed to the U.S. Mint, the Treasury bureau responsible for coinage. US currency consists of 25% linen and 75% cotton and contains small, randomly dispersed red and blue security fibers embedded throughout the material. Depending on the denomination, the material is further enhanced with embedded security threads, ribbons and watermarks. Since 1996, colored and color-changing inks have made the new currency pop. While older black-and-green currency is rather drab in comparison, it is still legal tender and remains the target of most counterfeiters.

The previous two parts of this article series (issues #329 and #330, December and January) centered around coinage. Before we look at bill validation for paper money, I need to finish up that project. I had purchased a few Coin Acceptors and showed how they are used to identify coinage, especially but not limited to US coins. The acceptance and dispensing of money is used in many ways today, including vending machines and ATMs. The discussion also covered the National Automatic Merchandising Association (NAMA), the organization that developed the international specification for the Multi-Drop Bus/Internal Communication Protocol (MDB/ICP), released in July 2010. The MDB/ICP enables communication between a master controller and peripheral hardware such as Coin Acceptors and bill validators. By adhering to these guidelines, any manufacturer’s equipment is interchangeable.

It turns out the Coin Acceptors I purchased don’t have the MDB interface necessary to communicate with a Vending Machine Controller (VMC). I reviewed the protocol and the VMC/Peripheral Communication Specifications required by the Coin Acceptor/Changer peripheral, and began work on developing an MDB interface that would bridge my Coin Acceptor with the multi-drop bus.
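As a hedged sketch of what such a bridge has to speak: MDB runs at 9,600 bps with a ninth “mode” bit that the master sets on the address byte of each command, and each block ends with an 8-bit additive checksum. The helper below assumes a hypothetical mdb_putc9() that writes one 9-bit word out a UART configured for 9-bit mode; it illustrates the framing, not the actual interface code from this series:

/* Send one master-to-peripheral MDB block: the address byte with
   the mode (9th) bit set, data bytes with it clear, then a checksum
   byte (8-bit sum of everything sent). */
void mdb_putc9(unsigned short word9);   /* hypothetical 9-bit UART write */

void mdb_send_block(unsigned char address, const unsigned char *data, int len)
{
    unsigned char chk = address;
    int i;

    mdb_putc9(0x100u | address);        /* mode bit marks the address byte */
    for (i = 0; i < len; i++) {
        mdb_putc9(data[i]);
        chk += data[i];
    }
    mdb_putc9(chk);                     /* checksum, mode bit clear */
}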

Read the full article in the February issue of Circuit Cellar (#331)


New PIC32 MPLAB Harmony Ecosystem Development Program

Microchip Technology’s new MPLAB Harmony Ecosystem Program is for the developers of embedded middleware and operating systems who are seeking to unlock the business potential of the 32-bit PIC32 microcontroller customer base. Ecosystem partners also gain early and easy access to the complete and current set of MPLAB Harmony tools. MPLAB Harmony is a 32-bit microcontroller firmware development framework, which integrates licensing, resale, and support of both Microchip and third-party middleware, drivers, libraries, and RTOSes. The new Ecosystem Program builds on that framework by offering an open and structured method to become certified as “Harmony Compatible,” using the embedded-control industry’s only set of test harnesses, checklists, and reference validation points. By accessing this broader, code-compatible ecosystem when creating complementary value-added software, developers can reduce risks and overall costs, accelerate time to market, and grow their businesses by gaining opportunities to market to thousands of PIC32 MPLAB Harmony users.

As today’s applications get increasingly sophisticated, embedded developers need to rapidly bring complex solutions to market. Microchip’s MPLAB Harmony framework for its PIC32 microcontrollers can reduce the development time of a typical project by a minimum of 20–35%, by providing a single integrated, abstracted, and flexible source of tested, debugged, and interoperable code. Additionally, MPLAB Harmony provides a modular architecture that enables the efficient integration of multiple drivers, middleware and libraries, while offering an RTOS-independent environment. Not only does this pre-verification and integration speed development, it also increases reuse. On the hardware side, the MPLAB Harmony framework makes it even easier to port code, thereby simplifying migration among all of Microchip’s 32-bit PIC32 microcontrollers, enabling a highly profitable, multitiered end equipment offering with minimal code redevelopment. Middleware and Operating System developers who take advantage of Microchip’s Ecosystem Development Program will be better able to offer customers solutions that leverage MPLAB Harmony’s efficiency and reliability advantages.

The MPLAB Harmony Integrated Software Framework is also supported by Microchip’s free MPLAB Harmony Configurator and MPLAB XC32 Compiler v1.40, all of which operate within Microchip’s MPLAB X IDE, and all of which are available for free download. Additionally, MPLAB Harmony and the PIC32 are supported with a comprehensive set of low-cost development boards available from microchipDIRECT or by contacting one of Microchip’s authorized distribution partners.

Source: Microchip Technology

New PIC32 Bluetooth Starter Kit

Microchip Technology recently announced the new PIC32 Bluetooth Starter Kit, which is intended for low-cost applications such as a Bluetooth thermostat, wireless diagnostic tools, and Bluetooth GPS receivers. According to Microchip, the kit includes “a PIC32 microcontroller, HCI-based Bluetooth radio, Cree high-output multi-color LED, three standard single-color LEDs, an analog three-axis accelerometer, analog temperature sensor, and five push buttons for user-defined inputs.”

PIC32 Bluetooth Starter Kit (Source: Microchip Technology)

The kit’s PICkit On Board (PKOB) eliminates the need for an external debugger/programmer, while USB connectivity and GPIOs support rapid development of Bluetooth Serial Port Profile (SPP), USB and general-purpose applications. The starter kit also features a plug-in interface for an audio CODEC daughter card. The kit’s PIC32MX270F256D microcontroller operates at 83 DMIPS with 256-KB flash memory and 64-KB RAM.

The PIC32 Bluetooth Starter Kit is supported by Microchip’s free MPLAB X IDE and MPLAB Harmony Integrated Software Framework. Additionally, the free Quick Start Package is available with an Android application development environment. It also includes a free SDK with the application source code and binary for Microchip’s Bluetooth SPP library. Both are optimized for the on-board PIC32 MCU and are available for free at www.microchip.com/get/1AVL.


The PIC32 Bluetooth Starter Kit costs $79.99.

Autonomous Mobile Robot (Part 1): Overview & Hardware

Welcome to “Robot Boot Camp.” In this two-part article series, I’ll explain what you can do with a basic mobile machine, a few sensors, and behavioral programming techniques. Behavioral programming provides distinct advantages over other programming techniques: it is independent of any environmental model, it is more robust in the face of sensor error, and its behaviors can be stacked and run concurrently.

My objectives for my recent robot design were fairly modest. I wanted to build a robot that could cruise on its own, avoid obstacles, escape from inadvertent collisions, and track a light source. I knew that if I could meet those objectives, other more complex behaviors would be possible (e.g., self-docking on low power). There are certainly many commercial robots on the market that could have met my requirements. But I decided that my best bet would be to roll my own. I wanted to keep things simple, and I wanted to fully understand the sensors and controls required for behavioral autonomous operation. The TOMBOT is the fruit of that labor (see Photo 1a). A colleague came up with the name TOMBOT in honor of its inventor, and the name kind of stuck.

Photo 1a—The complete TOMBOT design. b—The graphics display is a nice feature.

In this series of articles, I’ll present lessons learned and describe the hardware/software design process. The series will detail TOMBOT-style robot hardware and assembly, as well as behavior programming techniques using C code. By the end of the series, I’ll have covered a complete behavior programming library and API, which will be available for experimentation.

DESIGN BASICS

The TOMBOT robot is certainly minimal, no frills: two continuous-rotation, variable-speed control servos; two IR (850 nm) analog distance measurement sensors (4- to 30-cm range); two CdS photoconductive cells with good lux response in the visible spectrum; and, finally, a front bumper (switch-activated) for collision detection. The platform is simple: servos and sensors on the left and right sides of two level platforms. The bottom platform houses the bumper, batteries, and servos. The top platform houses the sensors and microcontroller electronics. The back part of the bottom platform uses a central skid for balance between the two servos (see Photo 1).

Given my background as a Microchip Developer and Academic Partner, I used a Microchip Technology PIC32 microcontroller, a PICkit 3 programmer/debugger, and a free Microchip IDE and 32-bit compiler for TOMBOT. (Refer to the TOMBOT components list at the end of this article.)

It was a real thrill to design and build a minimal-capability robot that can—with stacking programming behaviors—emulate some “intelligence.” TOMBOT is still a work in progress, but I recently had the privilege of demoing it to a first-grade class in El Segundo, CA, as part of a Science Technology Engineering and Mathematics (STEM) initiative. The results were very rewarding, but more on that later.

BEHAVIORAL PROGRAMMING

A control system for a completely autonomous mobile robot must perform many complex information-processing tasks in real time, even for simple applications. The traditional method of building control systems for such robots is to separate the problem into a series of sequential functional components. An alternative approach is behavioral programming. The technique was introduced by Rodney Brooks of the MIT Robotics Lab, and it has been very successful in the implementation of many commercial robots, such as the popular Roomba vacuuming robot. It was even adopted for space applications like NASA’s Mars rovers and for military seekers.

Programming a robot according to behavior-based principles makes the program inherently parallel, enabling the robot to attend simultaneously to all hazards it may encounter as well as any serendipitous opportunities that may arise. Each behavior functions independently through sensor registration, perception, and action. In the end, all behavior requests are prioritized and arbitrated before action is taken. By stacking the appropriate behaviors, using arbitrated software techniques, the robot appears to show (broadly speaking) “increasing intelligence.” The TOMBOT modestly achieves this objective using selective compile configurations to emulate a series of robot behaviors (i.e., Cruise, Home, Escape, Avoid, and Low Power). Figure 1 is a simple model illustration of a behavior program.

Figure 1: Behavior program
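In code, the model of Figure 1 reduces to a fixed-priority arbitration loop. The sketch below is a minimal illustration of the idea; the behavior names match the TOMBOT set, but the types and helpers are assumptions, not the article’s actual library:

typedef struct {
    int active;                  /* does this behavior want control? */
    int left_speed, right_speed;
} request_t;

request_t escape(void);   /* bumper collision: back up and spin     */
request_t avoid(void);    /* IR sees a close obstacle: veer away    */
request_t home(void);     /* strong light detected: steer toward it */
request_t cruise(void);   /* default: always active, drive straight */

void set_servos(int left, int right);   /* hypothetical motor call */

void arbitrate_once(void)
{
    /* Highest priority first: Escape > Avoid > Home > Cruise. Each
       behavior senses, perceives, and either requests an action or
       abstains; the winning request drives the servos. */
    request_t r = escape();
    if (!r.active) r = avoid();
    if (!r.active) r = home();
    if (!r.active) r = cruise();
    set_servos(r.left_speed, r.right_speed);
}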

Joseph Jones’s Robot Programming: A Practical Guide to Behavior-Based Robotics (TAB Electronics, 2003) is a great reference book that helped guide me in this effort. It turns out that Jones was part of the design team for the Roomba product.

Debugging a mobile platform that is executing a series of concurrent behaviors can be a daunting task. So, to make things easier, I implemented complete remote control using a wireless link between the robot and a PC. With this link, I can enable or disable autonomous behavior, retrieve the robot’s sensor status and mode of operation, and curtail or avoid potential robot hazards. In addition, I implemented some operator feedback using a small graphics display, LEDs, and a simple sound buzzer. Note the TOMBOT’s power-up display in Photo 1b. We take Robot Boot Camp very seriously.

Minimalist System

As you can see in the robot’s block diagram (see Figure 2), the TOMBOT is very much a minimalist system with just enough components to demonstrate autonomous behaviors: Cruise, Escape, Avoid, and Home. All these behaviors require the use of left and right servos for autonomous maneuverability.

Figure 2: The TOMBOT system

The Cruise behavior simply keeps the robot in motion in the absence of any stimulus. The Escape behavior uses the bumper to sense a collision, then backs up and spins 180°. The Avoid behavior uses the continuous forward-looking IR sensors to veer left or right when approaching a close obstacle. The Home behavior uses the front photocells to guide the robot toward a strong, highly directional light source. It all adds up to some very distinct “intelligent” operation. Figure 3 depicts the basic sensor and electronic layout.

Figure 3: Basic sensor and electronic layout
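To make one of those behaviors concrete, here is what Avoid might look like under the arbitration sketch shown earlier; the threshold and sensor helpers are illustrative assumptions, not TOMBOT’s actual code:

#define OBSTACLE_CM 10               /* assumed "too close" threshold */

int ir_distance_cm(int channel);     /* hypothetical ADC-based read */
enum { IR_LEFT, IR_RIGHT };

request_t avoid(void)                /* request_t from the earlier sketch */
{
    request_t r = { 0, 0, 0 };
    int left_cm  = ir_distance_cm(IR_LEFT);
    int right_cm = ir_distance_cm(IR_RIGHT);

    if (left_cm < OBSTACLE_CM || right_cm < OBSTACLE_CM) {
        r.active = 1;
        if (left_cm < right_cm) {                 /* obstacle closer on left */
            r.left_speed = 40; r.right_speed = 10;   /* veer right */
        } else {
            r.left_speed = 10; r.right_speed = 40;   /* veer left  */
        }
    }
    return r;                        /* inactive request if path is clear */
}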

TOMBOT Assembly

The TOMBOT uses the low-cost robot platform (ArBot Chassis) and wheel set (X-Wheel assembly) from Budget Robotics (see Figure 4).

Figure 4: The platform and wheel set

A picture is worth a thousand words. Photo 2 shows two views of the TOMBOT prototype.

Photo 2a: The TOMBOT’s Sharp IR sensors, photocell assembly, and more. b: The battery pack, right servo, and more.

Photo 2a shows the dual Sharp IR sensors. Just below them is the photocell assembly, a custom board with dual CdS GL5528 photoconductive cells and 2.2-kΩ current-limiting resistors. Below this is a bumper assembly consisting of two SPDT snap-action switches with levers (All Electronics Corp. CAT# SMS-196, left and right) fixed to a custom prefab plastic front bumper. Also shown are the solderless breadboard and the left servo. Photo 2b shows the rechargeable battery pack that resides on the lower base platform and its associated power switch. The electronics stack is also visible: the XBee/buzzer and graphics card modules reside on the 32-bit Experimenter, which is plugged into a custom carrier board that interconnects through the solderless breadboard to the rest of the system. Finally, note that the right servo is highlighted. The total TOMBOT package is not ideal; but remember, I’m talking about a prototype, and this particular configuration has held up nicely in several field demos.

I used Parallax (Futaba) continuous-rotation servos. They use a three-wire connector (+5 V, GND, and Control).

Figure 5 depicts a second-generation bumper assembly. The same snap-action switches with extended levers are bent and fashioned to form the bumper assembly as shown.

Figure 5: Second-generation bumper assembly

TOMBOT Electronics

A 32-bit Micro Experimenter is used as the CPU. This board is based on the high-end Microchip Technology PIC32MX695F512H, a 64-pin TQFP with 128-KB RAM, 512-KB flash memory, and an 80-MHz clock. I did not want to skimp on this component during the prototype phase. In addition, the 32-bit Experimenter supports a 102 × 64 monochrome graphics card with green/red backlight controls and LEDs. Since a full graphics library was already bundled with this Experimenter graphics card, it also represented good risk reduction during the prototyping phase. Details for both cards are available on the Kiba website.

The Experimenter supports six basic board-level connections to the outside world using the JP1, JP2, JP3, JP4, BOT, and TOP headers. A custom carrier board interfaces to the Experimenter via these connections and provides power and signal connections to the sensors and servos. The custom carrier accepts battery voltage and regulates it to +5 VDC. This +5 V is then further regulated by the Experimenter down to its native +3.3-VDC operation. The solderless breadboard carries a resistor network that divides the +9-V battery voltage down for sensing by the +3.3-V PIC processor. The breadboard also contains an LM324 quad op-amp to provide a buffer between the +3.3-V logic of the processor and the required +5-V operation of the servos. Figure 6 is a detailed schematic diagram of the electronics.

Figure 6: The design’s circuitry
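The battery-sense arithmetic is simple divider math. The sketch below assumes illustrative divider values (the actual ones are in the Figure 6 schematic) and the PIC32’s 10-bit ADC:

#define R_TOP     10000.0f   /* ohms, battery side (assumed value) */
#define R_BOTTOM   4700.0f   /* ohms, ADC side (assumed value)     */
#define VREF          3.3f   /* ADC reference voltage              */
#define ADC_MAX      1023    /* 10-bit converter full scale        */

/* Scale a raw ADC reading back up to the battery voltage. With the
   assumed divider, 9 V appears as 9 * 4700/14700 = 2.88 V at the
   pin, safely under the +3.3-V limit. */
static float battery_volts(unsigned int counts)
{
    float v_pin = ((float)counts * VREF) / ADC_MAX;
    return v_pin * (R_TOP + R_BOTTOM) / R_BOTTOM;
}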

A custom carrier card for the XBee radio and buzzer was built that plugs into the Experimenter’s TOP and BOT connections. Photo 3 shows the modules and the carrier board. The robot uses a rechargeable 1,600-mAh battery system (typical of mid-range wireless toys) that provides hours of uninterrupted operation.

Photo 3: The modules and the carrier board

PIC32 On-Chip Peripherals

The TOMBOT uses PWM for the servos, a UART for the XBee, SPI and digital I/O for the LCD, analog input channels for all the sensors, and digital I/O for the buzzer and bumper detection. The key peripheral connections from the Experimenter to the rest of the system are shown in Figure 7.

Figure 7: Peripheral usage

The PIC32 pinouts and their associated Experimenter connections are detailed in Figure 8.

Figure 8: PIC32 peripheral pinouts and EXP32 connectors

The TOMBOT Motion Basics and the PIC32 Output Compare Peripheral

Let’s review the basics of TOMBOT motor control. The robot uses Parallax (Futaba) continuous-rotation servos; with two-wheel differential control, robot motion is commanded as shown in Table 1.

Table 1: Robot motion

The servos are controlled using a 20-ms (50-Hz) PWM pattern, where the pulse width can range from 1.0 ms to 2.0 ms. The effects of the different pulse widths on the servos are shown in Figure 9.

Figure 9: Servo PWM control

The PIC32 microcontroller (used in the Experimenter) has five Output Compare modules (OCX, where X = 1, 2, 3, 4, 5). We use two of these peripherals, specifically OC3 and OC4, to generate the PWM that controls servo speed and direction. Each OCX module can use either 16-bit Timer2 (TMR2) or 16-bit Timer3 (TMR3), or the two combined as the 32-bit Timer23, as a time base and for the period (PR) setting of the output pulse waveform. In our case, we use Timer23 with PR set for 20 ms (50 Hz). The OCXRS and OCXR registers are loaded with a value that controls the width of the pulse generated during each output period. This value is compared against the timer during each period cycle. The OCX output starts high; when a match occurs, the OCX logic drives the output low. This repeats on a cycle-by-cycle basis (see Figure 10).

Figure 10: PWM generation
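Putting those registers together, a setup sketch might look like the following. It assumes a 40-MHz peripheral bus clock and a 1:8 prescaler (5,000 timer ticks per millisecond); those clock numbers, and the omitted output-pin mapping, are assumptions rather than the article’s exact configuration:

#include <xc.h>

#define TICKS_PER_MS 5000u         /* 40-MHz PBCLK / 8 prescale (assumed) */

void servo_pwm_init(void)
{
    T2CON = 0;
    T2CONbits.T32   = 1;           /* pair Timer2/3 into 32-bit Timer23 */
    T2CONbits.TCKPS = 3;           /* 1:8 prescaler                     */
    PR2 = 20u * TICKS_PER_MS - 1;  /* 20-ms (50-Hz) PWM period          */

    OC3CON = 0;
    OC3CONbits.OC32 = 1;           /* compare against 32-bit Timer23    */
    OC3CONbits.OCM  = 6;           /* PWM mode, fault pin disabled      */
    OC3R  = 3u * TICKS_PER_MS / 2; /* initial 1.5-ms pulse: stopped     */
    OC3RS = 3u * TICKS_PER_MS / 2;

    T2CONbits.ON  = 1;
    OC3CONbits.ON = 1;
}

/* width_us in [1000, 2000]. OC3RS is the buffered compare value,
   latched into OC3R at the start of each period. */
void servo_set_pulse_us(unsigned int width_us)
{
    OC3RS = (width_us * TICKS_PER_MS) / 1000u;
}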

Next Comes Software

We set the research goals and objectives for our autonomous robot, and we covered the hardware associated with this robot. In the next installment, we will describe the software and operation.

Tom Kibalo holds a BSEE from City College of New York and an MSEE from the University of Maryland. He has 39 years of engineering experience with a number of companies in the Washington, DC area. Tom is an adjunct EE faculty member at a local community college, and he is president of Kibacorp, a Microchip Design Partner.