Talking Hands: American Sign Language Gesture Recognition Glove

Roberto developed a glove that enables communication between the user and those
around him. While the design is intended for use by people communicating in American Sign Language, you can apply what you learn in this article to a variety of communications applications.
PHOTO 1-Here you see the finished product with all of the sensors sewn in. The use of string as opposed to adhesive for the sensors allowed the components to smoothly slide back and forth as the hand was articulated.

By Roberto Villalba

While studying at Cornell University in 2014, my lab partner Monica Lin and I designed and built a glove to be worn on the right hand that uses a machine learning (ML) algorithm to translate sign language into spoken English (see Photo 1). Our goal was to create a way for the speech impaired to communicate with the general public more easily. Since every person’s hand is a unique size and shape, we aimed to create a device that could provide reliable translations regardless of those differences. Our device relies on a variety of sensors, such as flex sensors, a gyroscope, an accelerometer, and touch sensors, to quantify the state of the user’s hand. These sensors allow us to capture the flex of each finger as well as the hand’s orientation, rotation, and points of contact. By collecting a moderate amount of this data for each sign and feeding it into an ML algorithm, we are able to learn the association between sensor readings and their corresponding signs. We use a microcontroller to read, filter, and send the data from the glove to a PC. Initially, some data is gathered from the users, and that information is used to train a classifier that learns to differentiate between signs. Once the training is done, the user is able to put on the glove and make gestures, which the computer then turns into audible output.

FIGURE 1-After performing some calculations and characterizing our flex sensors, we decided to use a 10-kΩ resistor. Note that the rightmost point goes into one of the microcontroller’s ADC inputs.

We use the microcontroller’s analog-to-digital converter (ADC) to read the voltage drop across each of the flex sensors. We then read the linear acceleration and rotation values from the accelerometer and gyro sensor over I²C. Finally, we get a binary reading from each of the touch sensors indicating whether or not there is contact. We perform as many readings as possible within a given window of time and use all of this data to do some smoothing. This information is then sent over a serial link to the PC, where it is gathered and processed. A Python program listens for information coming in from the microcontroller and either stores the data or predicts based on already learned information. Our code includes scripts for gathering data, loading stored data, classifying the data that is being streamed live, and some additional scripts to help with visualization of sensor readings.

The design comprises an Atmel ATmega1284P microcontroller and a glove onto which the various sensors and necessary wires were sewn. Each finger has one Spectra Symbol flex sensor stitched on the backside of the glove. The accelerometer and gyro sensors are attached to the center of the back of the glove. The two contact sensors were made out of copper tape and wire that was affixed to four key locations.

Since each flex sensor has a resistance that varies depending on how much the finger is bent, we attached each flex sensor as part of a voltage divider circuit in order to obtain a corresponding voltage that can then be input into the microcontroller.
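For reference, the relation used in the analysis that follows is the standard voltage divider equation. Here we assume the flex sensor is the upper element and R1 the lower, grounded element, with the ADC reading the voltage across R1:

Vout = Vin × R1 / (R1 + Rflex)

Under this assumption, Vout falls as the finger bends and Rflex rises.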


We determined a good value for R1 by analyzing expected values from the flex sensor. Each one has a flat resistance of 10 kΩ and a maximum expected resistance (obtained by measuring its resistance on a clenched fist) of about 27 kΩ. In order to obtain the maximum range of possible output voltages from the divider circuit given an input voltage of 5 V, we plotted the expected ranges using the voltage divider equation and values of R1 in the range of 10 to 22 kΩ. We found that the differences between the ranges were negligible and opted to use 10 kΩ for R1 (see Figure 1).
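The R1 sweep described above can be reproduced in a few lines of Python. This is a sketch of the analysis, not the authors' code; it assumes the ADC reads the voltage across R1:

```python
# Sketch of the R1 selection analysis. Values come from the article:
# flat resistance ~10 kΩ, clenched-fist resistance ~27 kΩ, Vin = 5 V.
VIN = 5.0
R_FLAT, R_BENT = 10e3, 27e3

def vout(r1, rflex, vin=VIN):
    """Voltage divider output measured across R1 (assumed ADC tap point)."""
    return vin * r1 / (r1 + rflex)

for r1 in (10e3, 14e3, 18e3, 22e3):
    swing = vout(r1, R_FLAT) - vout(r1, R_BENT)
    print(f"R1 = {r1 / 1e3:.0f} kΩ: output swing ≈ {swing:.2f} V")
```

For every candidate R1, the swing comes out near 1.1 to 1.2 V, so the choice of R1 matters little, consistent with picking the flat 10-kΩ value.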

Our resulting voltage divider has an output range of about 1 V. We were initially concerned that the resulting values from the microcontroller’s ADC would be too close together for the learning algorithm to sufficiently discern between different signs. We planned to address this by increasing the input voltage to the voltage divider if necessary, but we found that the range of voltages described earlier was sufficient and performed extremely well.

The InvenSense MPU-6050 accelerometer and gyro sensor package operates on a lower VCC (3.3 V) compared to the microcontroller’s 5 V. So as not to burn out the chip, we created a voltage regulator using an NPN transistor and a trimpot, connected as shown. The trimpot was adjusted so that the output of the regulator reads 3.3 V. This voltage also serves as the source for the pull-up resistors on the SDA and SCL wires to the microcontroller. Since the I²C devices are capable only of driving the bus lines low, we connect them to VCC via two 4.7-kΩ pull-up resistors (see Figure 2).

As described later, we found that we needed to add contact sensors to several key spots on the glove (see Figure 3). These essentially function as switches that pull the microcontroller input pins to ground to signal contact (be sure to set up the microcontroller pins to use the internal pull-up resistors).

Figure 2: Here we see the schematic of the voltage regulator circuit that we created in order to obtain 3.3 V. The bottom of the schematic shows how this same regulator was used to pull up the signals at SCL and SDA.

Figure 3: The contact sensor circuitry was quite simple. The input pins of the microcontroller are set to use the internal pull-up resistors, and whenever the two corresponding copper ends on the fingers touch, the input is pulled low.

Interfacing with the MPU-6050 required I²C communication, for which we chose to use Peter Fleury’s public I²C library for AVR microcontrollers. I²C is designed to support multiple devices using a single dedicated data bus (SDA) and a single clock bus (SCL). Even though we were only using the interface for the microcontroller to regularly poll the MPU-6050, we had to adhere to the I²C protocol. Fleury’s library provided us with macros for issuing start and stop conditions from the microcontroller (which signal that the microcontroller is requesting data from the MPU-6050 or releasing control of the bus). These macros allowed us to easily initialize the I²C interface, set up the MPU-6050, and request and receive the accelerometer and gyroscope data (described later).

Figure 4: The image is the visual output received from plotting sequences of sensor readings. The clear divisions across the horizontal axis mark the different signs A, B, C, and D, respectively.

While testing our I²C communication with the MPU-6050, we found that the microcontroller would on rare occasions hang while waiting for data from the I²C bus. To prevent this from stalling our program, we enabled a watchdog timer that would reset the system after 0.5 s unless our program continued to reach regular checkpoints, at which point we would reset the watchdog timer to prevent it from unnecessarily resetting the system. We could take this approach because our microcontroller’s work consists primarily of continuously collecting sensor data and sending packets to a separate PC.

Photo 2: In this image we see the hand gestures for R, U, and V. As you can tell, there is not much difference in the hand’s orientation or the amount of flex on the fingers. However, note that the copper pieces make different kinds of contact for each of the signs.

For the majority of the code, we used Dan Henriksson and Anton Cervin’s TinyRealTime kernel. The primary reason for using this kernel is that we wanted to take advantage of the already implemented non-blocking UART library in order to communicate with the PC. While we only had a single thread running, we tried to squeeze in as much computation as possible while the data was being transmitted.

The program first initializes the I²C interface, the MPU-6050, and the ADC. It then enters an infinite loop in which it resets the watchdog timer and gets 16 readings from all of the sensors: accelerometer, gyroscope, flex sensors, and touch sensors. We then compute filtered values by summing all 16 readings from each sensor. Since summing the IMU sensors’ 16-bit values can overflow, we make sure to shift all of their values by 8 before summing them up. The data is then wrapped into a byte-array packet organized as a header (0xA1B2C3D4), the data, and a checksum of the data. Each sensor value is stored in 2 bytes, and the checksum is calculated by summing the unsigned representation of each byte in the data portion of the packet into a 2-byte integer. Once the packet has been created, it is sent over the USB cable to the computer and the process repeats.

Communication with the microcontroller was established through the use of Python’s socket and struct libraries. We created a class called SerialWrapper whose main goal is to receive data from the microcontroller. It does so by opening a port and running a separate thread that waits for new data to become available. The incoming stream is scanned for the header, and a packet of the right length is removed when available. The checksum is then calculated and verified; if it is valid, the data is unpacked into the appropriate values and fed into a queue for other processes to extract. Since we know the format of the packet, we can use the struct library to extract all of the data from the packet, which is in a byte-array format. We then provide the user with two modes of use: one that continuously captures and labels data in order to build a dataset, and another that continuously tries to classify incoming data.

Support vector machines (SVMs) are a widely used family of ML algorithms that learn to classify by using a kernel. While the kernel can take various forms, the most common kind is the linear SVM. Simply put, the classification, or sign, for a set of readings is decided by taking the dot product of the readings and the classifier’s weight vector. While this may seem like a simple approach, the results are quite impressive. For more information, take a look at scikit-learn’s “Support Vector Machines” documentation.
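The packet handling just described (header scan, checksum check, struct unpacking) might look something like the following minimal sketch. The function name, byte order, and exact field layout are our assumptions, not the authors' code; the layout follows the article's description of a 4-byte header, 2 bytes per sensor, and a 2-byte checksum:

```python
import struct

HEADER = b"\xa1\xb2\xc3\xd4"
N_SENSORS = 13                                 # 3 accel + 3 gyro + 5 flex + 2 touch
PACKET_LEN = len(HEADER) + 2 * N_SENSORS + 2   # header + data + checksum

def try_parse(buf):
    """Scan buf for a header, verify the checksum, and unpack one packet.

    Returns (values, remaining_buffer), or (None, buf) if no full packet
    is available yet. Assumes big-endian signed 16-bit sensor fields.
    """
    start = buf.find(HEADER)
    if start < 0 or len(buf) - start < PACKET_LEN:
        return None, buf
    pkt = buf[start:start + PACKET_LEN]
    data = pkt[len(HEADER):-2]
    # Checksum: sum of the unsigned data bytes, truncated to 16 bits.
    expected = sum(data) & 0xFFFF
    (received,) = struct.unpack(">H", pkt[-2:])
    rest = buf[start + PACKET_LEN:]
    if received != expected:
        return None, rest                      # drop the corrupt packet
    values = struct.unpack(">%dh" % N_SENSORS, data)
    return values, rest
```

A real reader would call try_parse repeatedly on an accumulating buffer and push each valid tuple of values into a queue, as the SerialWrapper class does.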

For the purposes of this project, we chose to focus primarily on the alphabet, a through z, and we added two more labels, “nothing” and “relaxed,” to the set. Our rationale for the “nothing” label was to have a class made up of mostly noise. This class would not only provide negative instances to help learn the other classes, but it also gave the classifier a way of signaling that a gesture is not recognized as one of the signs we care about. In addition, we didn’t want the classifier to try to predict letters when the user was simply standing by, so we taught it what a “relaxed” state was. This state was simply the position of the user’s hand when not signing anything. In total, there were 28 signs, or labels.

For our project, we made extensive use of Python’s scikit-learn library. Since we were using various kinds of sensors with drastically different ranges of values, it was important to scale all of our data so that the SVM would have an easier time classifying. To do so, we made use of the preprocessing tools available in scikit-learn. We took all of our data and scaled it so that each sensor’s readings had zero mean and unit variance. This approach brought about drastic improvements in performance and is strongly recommended. The classifier that we ended up using was the SVM provided by scikit-learn under the name SVC.
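A minimal sketch of this pipeline using scikit-learn, with synthetic stand-in data in place of real glove readings (the 13-sensor packet width and 28-label count follow the article; everything else is illustrative):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in training set: rows are packets of 13 sensor readings,
# labels are the 28 signs (a-z plus "nothing" and "relaxed").
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 13))
y = rng.integers(0, 28, size=200)

# Zero-mean, unit-variance scaling followed by an SVM, as described above.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(X, y)
pred = clf.predict(X[:5])
```

With real data, X would come from the parsed packets and y from the labeling mode of the capture script.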

Figure 5: The confusion matrix demonstrates how many times each label is predicted and how many times that prediction is accurate. We would like to see a perfect diagonal line, but we see that one square does not adhere to this. This square corresponds to “predicted V when it was really U” and it shows about a 66% accuracy.

Another tool that was crucial to us as developers was plotting, which we used to visualize the data and gauge how well a learning algorithm should be able to predict the various signs. The main tool we developed for this was the plotting of a sequence of sensor readings as an image (see Figure 4). Since each packet contains a value for each of the 13 sensors, we can concatenate multiple packets to create a matrix: each packet makes up a column, so each row corresponds to one sensor, and reading a row from left to right gives progressively later sensor readings. This matrix can then be plotted with instances of the same sign grouped together, and the differences between one sign and the others can be observed. If the difference is clear to us, then the learning algorithm should have no issue telling them apart. If not, the algorithm might struggle, and changes to the approach might be necessary.
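The matrix construction can be sketched as follows; the packets here are synthetic stand-ins for real sensor data:

```python
import numpy as np

# Stand-in stream of packets: each packet holds 13 sensor values.
packets = [np.arange(13) + t for t in range(50)]

# Each packet becomes a column, so row i traces sensor i through time.
matrix = np.stack(packets, axis=1)   # shape (13, 50)

# Plotting is then a single call, e.g. with matplotlib:
#   plt.imshow(matrix, aspect="auto")
print(matrix.shape)
```

Grouping columns by sign before plotting produces the banded image shown in Figure 4.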

The final step in classification is to pass the output of the classifier through a final level of filtering and debouncing before it reaches the user. To accomplish this, we fill a buffer with the last 10 predictions and only consider a prediction valid if it appears in at least nine of those 10. Furthermore, we debounce this output and only notify the user if it is a novel prediction, not just a continuation of the previous one. We print the result on the screen and also use Peter Parente’s pyttsx cross-platform text-to-speech library to output the result as audio, provided it is neither “nothing” nor “relaxed.”
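A sketch of the 9-of-10 voting and debouncing described above; the class name and the exact behavior at the margins are our own assumptions:

```python
from collections import deque

class PredictionFilter:
    """Emit a sign only when it newly wins at least 9 of the last 10 votes."""

    def __init__(self, size=10, threshold=9):
        self.buffer = deque(maxlen=size)
        self.threshold = threshold
        self.current = None   # last sign reported to the user

    def update(self, raw_prediction):
        """Feed one raw classifier output; return a sign or None."""
        self.buffer.append(raw_prediction)
        winner = max(set(self.buffer), key=self.buffer.count)
        if (self.buffer.count(winner) >= self.threshold
                and winner != self.current):
            self.current = winner        # debounce: report each sign once
            return winner
        return None
```

Each raw classifier output is fed through update(); only a freshly dominant sign comes back, which is then printed and spoken.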

Our original glove did not have contact sensors on the index and middle fingers. As a result, it had a hard time predicting “R,” “U,” and “V” properly. These signs are actually quite similar to each other in terms of hand orientation and flex. To mitigate this, we added two contact sensors: one set on the tips of the index and middle fingers to detect “R,” and another pair in between the index and middle fingers to discern between “U” and “V.”

As you might have guessed, the speed of our approach is limited by the rate of communication between the microcontroller and the computer and by the rate at which we are able to poll the ADC on the microcontroller. We determined how quickly we could send data to the PC by sending data serially and increasing the send rate until we noticed a difference between the rate at which data was being received and the rate at which it was being sent. We then reduced the send frequency to a reasonable value and converted it into a loop interval (about 3 ms).

We then aimed to gather as much sensor data as possible between packet transmissions. In addition to each packet, the microcontroller also sent the number of readings that it had performed. We used this number to choose a reasonable number of values to poll before aggregating the data and sending it to the PC. We concluded that the microcontroller was capable of reading each of the sensors 16 times between packets, which for our purposes provided enough room to do some averaging.

The Python side is currently limited by the rate at which the microcontroller sends data to the PC and the time it takes the speech engine to say the word or letter. The transfer rate is currently about 30 Hz, and we wait to fill a buffer with about 10 agreeing predictions. This means the fastest we could output a prediction is about three times per second, which was suitable for our needs. Of course, one can adjust these values to get faster but slightly less accurate predictions. However, we felt that the glove was responsive enough at three predictions per second.

While we were able to get very accurate predictions, we did see some slight variations in accuracy depending on the size of the person’s hands. Each flex sensor’s resolution is limited beyond a certain degree of bend, and smaller hands produce a larger degree of bend. As a result, the difference between slightly different signs with a lot of flex tends to be smaller for users with more petite hands. For example, consider the signs for “M” and “S.” The only difference between them is that “S” elicits slightly more flex in the fingers. For smaller hands, however, the change in the flex sensor’s resistance is small, and the algorithm may be unable to discern the difference between these signs.

Figure 6: We can see that even with very small amounts of data the classifier does quite well. After gathering just over 60 readings per sign it achieves an accuracy of over 98%.

In the end, our classifier was able to achieve an accuracy of 98% (the error being composed almost solely of “U”/“V” confusion) on a task of 28 signs: the full alphabet as well as “relaxed” and “nothing” (see Figure 5). A random classifier would guess correctly only about 4% of the time, clearly indicating that our device is quite accurate. It is worth noting, however, that the algorithm could greatly benefit from improved touch sensors (seeing as the most common mistake is confusing “U” for “V”), from being trained on a larger population of users, and especially from larger datasets. With a broad enough dataset, we could provide new users with a small training script that covers only the letters that are difficult to predict and relies on the already available data for the rest. The software has currently been trained on the two team members, and it has been tested on some users outside of the team. The results were excellent for the team members who trained the glove, and mostly satisfying, though not perfect, for the other volunteers. Since the volunteers did not have a chance to train the glove and were not very familiar with the signs, it is hard to say whether their lower accuracy was a result of overfitting, individual variations in signing, or inexperience with American Sign Language. Regardless, the accuracy of the software was near perfect for users who trained it, and mostly accurate for users who neither knew American Sign Language beforehand nor trained the glove.

Lastly, it is worth noting that the amount of data necessary for training the classifier was surprisingly small. With about 60 instances per label, the classifier was able to reach the 98% mark. Given that we receive 30 samples per second and that there are 28 signs, gathering the training data can be done in under a minute (see Figure 6).
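The arithmetic behind that claim is quick to check:

```python
SIGNS = 28               # a-z plus "nothing" and "relaxed"
READINGS_PER_SIGN = 60   # instances per label needed for ~98% accuracy
RATE_HZ = 30             # packets received per second

total_seconds = SIGNS * READINGS_PER_SIGN / RATE_HZ
print(total_seconds)     # 56.0 seconds, i.e. under a minute
```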

The project met our expectations. Our initial goal was to create a system capable of recognizing and classifying gestures, and we were able to do so with more than 98% average accuracy across all 28 classes. While we did not have a firm requirement for the rate of prediction, the resulting speed made using the glove comfortable, and it did not feel sluggish. Looking ahead, it would make sense to improve our approach to the touch sensors, since the majority of the ambiguity between signs comes from the difference between “U” and “V.” We want to use materials that lend themselves more seamlessly to clothing and provide a more reliable connection. In addition, it would be beneficial to test and train the glove on a large group of people, since this would provide richer data and more consistency. Lastly, we hope to make the glove wireless, which would allow it to easily communicate with phones and other devices and make the system truly portable.

Arduino, “MPU-6050 Accelerometer + Gyro.”

Atmel Corp., “8-bit AVR Microcontroller with 128K Bytes In-System Programmable Flash: ATmega1284P,” 8059D-AVR-11/09, 2009, www.atmel.com/images/doc8059.pdf.

P. Fleury, “AVR-Software,” 2006.

Lund University, “Tiny Real Time,” 2006, www.control.lth.se/~anton/tinyrealtime/.

P. Parente, “pyttsx – Text-to-speech x-platform.”

Python Software Foundation, “struct – Interpret Strings as Packed Binary Data.”

scikit-learn, “Preprocessing Data.”

scikit-learn, “Support Vector Machines.”

Spectra Symbol, “Flex Sensor FS,” 2015.

R. Villalba and M. Lin, “Sign Language Glove,” ECE4760, Cornell University, 2014.

ATmega1284P Microcontroller: Atmel

MPU-6050 MEMS MotionTracking Device: InvenSense

Article originally published in Circuit Cellar June 2016, Issue #311

BitCloud 4.0 Complete ZigBee Software Development Kit

Microchip Technology recently announced the industry’s first ZigBee Alliance-certified ZigBee platform with ZigBee PRO and Green Power features (formerly ZigBee 3.0). The software stack and corresponding BitCloud 4.0 software development kit are well-suited for the design of home automation and Internet of Things (IoT) applications. Enabling cross-functional device support, the solution is backward-compatible with existing ZigBee-certified products for seamless interoperability.

Features, specs, and benefits:

  • Low latency suitable for RF remote applications
  • Mesh networking for large networks such as lighting applications
  • The ZigBee PRO Green Power feature enables battery-less devices to securely join a network
  • ZigBee Light Link and ZigBee Home Automation devices are fully supported.

The BitCloud 4.0 Software Development Kit (SDK) enables application development on the SAM R21 Xplained Pro Evaluation Kit, a Cortex M0+-based 32-bit microcontroller with an integrated 2.4-GHz 802.15.4-compliant radio. When used with the newly certified software stack, the SDK provides a complete ZigBee-certified development platform.

Microchip currently offers two platforms to begin ZigBee development. The SAM R21 Xplained Pro Evaluation Kit (ATSAMR21-XPRO) is available for $58. The SAM R21 ZigBee Light Link Evaluation Kit (ATSAMR21ZLL-EK) costs $92.

Source: Microchip Technology

D-Stick Simplifies Project Development

The ME Labs Standard D-Stick provides all the functionality of Microchip Technology’s 40-pin PIC16F1937 in a hardware module that includes an on-board USB programmer and virtual COM port. It’s a compact, easy-to-use alternative to connecting a serial port, programmer, and power supply to a solderless breadboard for project development. After development, you can simply replace the D-Stick with the pinout-compatible, production-ready PIC16F1937 microcontroller. You can combine the Standard D-Stick with the free PICBASIC PRO Compiler Student Edition for a comprehensive development system that includes a code editor, BASIC compiler, in-circuit debugger, and device programmer for under $30.

Features, benefits, and specs:

  • Identical pinout to Microchip Technology’s standard 40-pin DIP
  • Round, machined pins are easy on spring contacts allowing for multiple insertion cycles
  • Built-in micro-USB port supplies power, a programming connection, and a virtual COM port
  • Suitable for serial in-circuit debugging
  • Compatible with all ME Labs Trainer programs

The standard version, which is based on the PIC16F1937 and is compatible with the free PBP Student Edition, costs $29.99. (It will be available for purchase by February 15, 2017.) The advanced version, which is based on a PIC18F and is compatible with the PBP Gold Edition (sold separately), is coming soon.

Source: ME Labs

Next-Generation 8-bit tinyAVR Microcontrollers

Microchip Technology recently launched a new generation of 8-bit tinyAVR microcontrollers. The four new devices range from 14 to 24 pins and 4 KB to 8 KB of flash memory. Furthermore, they are the first tinyAVR microcontrollers to feature Core Independent Peripherals (CIPs). The new devices will be supported by Atmel START, an innovative online tool for intuitive, graphical configuration of embedded software projects.

The new ATtiny817/816/814/417 devices provide features to help drive product innovation, including small, low-pin-count, feature-rich packages with 4 or 8 KB of flash memory. Other integrated features include:

  • A CIP called Peripheral Touch Controller (PTC)
  • Event System for peripheral co-operation
  • Custom programmable logic blocks
  • Self-programming for firmware upgrades
  • Nonvolatile data storage
  • 20-MHz internal oscillator
  • High-speed serial communication with USART
  • Operating voltages ranging from 1.8 to 5.5 V
  • 10-bit ADC with internal voltage references
  • Sleep currents at less than 100 nA in power down mode with SRAM retention

CIPs allow the peripherals, including serial communication and analog peripherals, to operate independently of the core. Together with the Event System, which lets peripherals communicate without using the CPU, applications can be optimized at the system level. This lowers power consumption and increases throughput and system reliability.

Accompanying the release of the four new devices, Microchip is adding support for the new AVR family in Atmel START, the online tool to configure software components and tailor embedded applications. This tool is free of charge and offers an optimized framework that allows the user to focus on adding differentiating features to their application.

To help accelerate evaluation and development, a new Xplained Mini Kit is now available for $8.88 USD. The Xplained Mini Kit is also compatible with the Arduino kit ecosystem. The kit can be used for standalone development and is fully supported by the Atmel START and Atmel Studio 7 software development tools.

The new generation of 8-bit tinyAVR MCUs is now available in QFN and SOIC packaging. Devices are available in 4 KB and 8 KB Flash variants, with volume pricing starting at $0.43 for 10,000-unit quantities.

Source: Microchip Technology

IAR Workbench Supports Next-Generation AVR MCUs

IAR Systems recently announced that its Embedded Workbench now supports a new generation of Microchip Technology 8-bit AVR microcontrollers. You can use Workbench with AVR microcontrollers to develop a wide range of low-power systems.

IAR Embedded Workbench’s features, benefits, and specs:

  • The compiler and debugger toolchain IAR Embedded Workbench incorporates the IAR C/C++ Compiler, which can create compact code for Microchip’s 8-bit AVR MCUs
  • IDE tools (editor, project manager, and library tools)
  • Build tools (compiler, assembler, and linker)
  • C-SPY debugger (simulator driver, hardware debugging, power debugging, and RTOS plugins)
  • IAR Embedded Workbench offers full-scale support for AVR 32-bit MCUs as well as Microchip’s ARM®-based and 8051-based MCUs families.

Source: IAR Systems

Low-Power BLE Sensor Node for IoT Applications

Microchip Technology recently released a demonstration platform for the lowest-power Bluetooth Low Energy (BLE) sensor node. The platform features an ultra-low-power BTLC1000-certified module, a SMART SAM L21 Cortex-M0+ MCU, Bosch sensor technology, and a complete software solution. The BLE demonstration platform includes source code, hardware design files, a user guide, and Android application source code.

Features, benefits, and specs:

  • An integrated BTLC1000-MR110CA BLE module, delivering at least 30% more power savings compared to existing solutions.
  • An ultra-tiny 2.2 mm × 2.1 mm Wafer Level Chipscale Package (WLCP).
  • A SAM L21 that achieves a ULPBench score of 185, with power consumption down to 35 µA/MHz in active mode and 200 nA in sleep mode.
  • Bosch six-axis motion (BHI160) and environment (BME280) sensors that can be used for a wide variety of sensing applications.

The Ultra-Low-Power Connected Demonstrator Platform costs $39.

Source: Microchip Technology

New Low-Power Embedded Wi-Fi Solutions for the IoT

Microchip Technology recently launched four low-power, highly integrated solutions that enable Wi-Fi and networking capability to be embedded into a wide variety of devices, including Internet of Things (IoT) applications. These four modules provide complete solutions for 802.11b/g/n and are industry-certified in a variety of countries.

The new RN1810 and RN1810E are stand-alone, surface-mount WiFly radio modules that include a TCP/IP stack, cryptographic accelerator, power management subsystem, 2.4-GHz 802.11b/g/n-compliant transceiver, and 2.4-GHz RF power amplifier. You can pair them with any microcontroller and configure them using simple ASCII commands. WiFly provides a simple data pipe for sending data over a Wi-Fi network, requiring no prior Wi-Fi experience to get a product connected. Once configured, the device automatically accesses a Wi-Fi network and sends and receives serial data. The RN1810 features an integrated PCB antenna. The RN1810E supports an external antenna.

The new MRF24WN0MA and MRF24WN0MB are Wi-Fi modules that interface with Microchip’s PIC32 microcontrollers and support Microchip’s MPLAB Harmony integrated software framework with a TCP/IP stack that can be downloaded for free. The modules connect to the microcontroller via a four-wire SPI. They are an ideal solution for low-power, low-data-rate Wi-Fi sensor networks, home automation, building automation, and consumer applications. In addition, the MRF24WN0MA has an integrated PCB antenna, while the MRF24WN0MB supports an external antenna.

The RN1810/E and MRF24WN0MA/B are now available and start at $13.05 each in 1,000-unit quantities. Also available is the $34.95 MRF24WN0MA Wi-Fi PICtail/PICtail Plus Daughter Board, a demonstration board for evaluating Wi-Fi connectivity using PIC microcontrollers and the MRF24WN0MA module (part # AC164153). In addition, a $49.95 RN1810 Wi-Fi PICtail/PICtail Plus Daughter Board is available today with a fully integrated TCP/IP stack and USB interface for easy plug-and-play development with a PC (part # RN-1810-PICTAIL).

Source: Microchip Technology

Execute Open-Source Arduino Code in a PIC Microcontroller Using the MPLAB IDE

The Arduino single-board computer is a de facto standard tool for developing microcomputer applications within the hobbyist and educational communities. It provides an open-source hardware (OSH) environment based on a simple microcontroller board, as well as an open-source (OS) development environment for writing software for the board.

Here’s an approach that enables Arduino code to be configured for execution with the Microchip Technology PIC32MX250F128B small-outline 32-bit microcontroller. It uses the Microchip Technology MPLAB X IDE and MPLAB XC32 C Compiler and the Microchip Technology Microstick II programmer/debugger.


Your own reasons for using this approach will depend on your personal needs and background. Perhaps as a long-term Arduino user, you want to explore a new processor performance option with your existing Arduino code base. Or, you want to take advantage of or gain experience with the Microchip advanced IDE development tools and debug with your existing Arduino code. All of these goals are easily achieved using the approach and the beta library covered in this article.

Several fundamental open-source Arduino code examples are described using the beta core library of Arduino functions I developed. The beta version is available, for evaluation purposes only, as a free download from the “Arduino Library Code for PIC32” link on my company website, KibaCorp. From there, you can also download a short description of the Microstick II hardware configuration used for the library.

To illustrate these capabilities in their simplest form, here is a Blink LED example from my book Beginner’s Guide to Programming the PIC32. The example shows how this custom library makes it easy to convert Arduino code to a PIC32 binary file.

The Arduino code example is as follows: wire an LED through a 1-kΩ resistor to pin 13 (D13) of the Arduino. An output pin is configured to drive the LED using the pinMode() function under setup(). Then, under loop(), this output is set high and then low using the digitalWrite() and delay() functions to blink the LED. The community open-source Arduino code is:
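For reference, a representative version of the community Blink sketch (the article’s exact listing may differ slightly in pin constant or delay values):

```c
// Representative community Blink sketch for an LED on D13
void setup() {
  pinMode(13, OUTPUT);      // configure D13 as an output driving the LED
}

void loop() {
  digitalWrite(13, HIGH);   // LED on
  delay(1000);              // wait 1 s
  digitalWrite(13, LOW);    // LED off
  delay(1000);              // wait 1 s
}
```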

The open-source example uses D13, physical pin 13 on the Arduino. On the PIC32MX, D13 corresponds to physical pin 25, which will be used in the prototype wiring.

Now, let’s review and understand the PIC32 project template and its associated “wrapping functions.” The Arduino environment uses two principal functions: setup() to initialize the system and loop() to run a continuous execution loop; there is no Main function. Using the Microchip Technology XC32 C compiler, we are constrained to having a Main function. The Arduino setup() and loop() functions can be accommodated, but only as part of an overall template Main “wrapping” function. So within our PIC32 template, we accommodate this as follows:

Listing 2
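In sketch form, the wrapping Main described above looks roughly like the following (header names are taken from the article; the exact template may differ):

```c
#include "System.h"     // PIC32MX250F128B support functions (article's template header)
#include "Arduino.h"    // Arduino-style core function library (beta)

int main(void)
{
    setup();            // called once, as in Arduino
    while (1) {
        loop();         // called forever, simulating Arduino's loop()
    }
    return 0;           // never reached
}
```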

This piece of code is a small but essential part of the template. Note that in this critical wrapping function, setup() is called once, as in Arduino, and loop() is called continuously (simulating Arduino’s loop() behavior) through the use of a while loop in Main.

The second critical wrapping aspect of our template is the use of C header files at the beginning of the code. The XC32 C compiler uses the C preprocessor directive #include to reference files within the Main code. Arduino uses import, a similar construct found in higher-level languages such as Java and Python, which cannot be used with the MPLAB XC32 C compiler.

The two include files necessary for our first example are as follows:

Listing 3
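Per the description that follows, the two include directives at the top of the template are simply:

```c
#include "System.h"     // critical Microchip library support for the PIC32MX250F128B
#include "Arduino.h"    // Arduino-specific library function set
```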

System.h references all the critical Microchip library functions supporting the PIC32MX250F128B. Arduino.h provides the Arduino-specific library function set. Given these two key “wrapper” aspects, where does the Arduino code go? This is best illustrated with a side-by-side comparison between Arduino code and its Microchip equivalent: the Arduino code is essentially positioned between the wrapper code as part of the Main function.

Blink side-by-side comparison


This approach enables Arduino code to execute on a Microchip PIC32 within an MPLAB X environment. Note that the Arduino code void setup() now appears as void setup(void), and void loop() appears as void loop(void). This is a minor inconvenience, but necessary to satisfy C syntax for function prototype definitions. Once the code is successfully compiled, the environment gives you access to the entire built-in tool suite of MPLAB X and its debugger.
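In sketch form, the PIC32 side of the Blink comparison, assuming the wrapper library described above, would read roughly:

```c
// PIC32 side of the Blink comparison -- a sketch, assuming the article's
// Arduino.h wrapper library; only the function signatures change
void setup(void)                 // note the explicit void parameter list
{
    pinMode(13, OUTPUT);         // D13 maps to physical pin 25 on the PIC32MX250F128B
}

void loop(void)
{
    digitalWrite(13, HIGH);
    delay(1000);
    digitalWrite(13, LOW);
    delay(1000);
}
```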

Configure the Microstick II prototype as in the following schematic. Both the schematic and prototype are shown below:

Exercise 1 schematic


Exercise 1 prototype


Table 1 compares Arduino core functionality to what is contained in the Microchip PIC32 expanded beta library. In the beta version, I added additional C header files to accomplish the necessary library functionality. Table 2 compares Arduino and PIC32 variable types. Together, Tables 1 and 2 show that the current beta version has a high degree of Arduino core library functionality. Current limitations are the use of only one serial port, interrupt support via INT0 only, and no stream capability. In addition, in C the “!” operator is used for conditional tests only and not as a complement function, as in Arduino; for the complement function in C, the “~” operator is used. The library is easily adapted to other PIC32 devices or board types.


Table 1: Arduino vs Microchip Technology PIC32 core library function comparison


Table 2: Arduino vs Microchip Technology PIC32 core library variable types

If you use interrupts, you must identify to C the name of your interrupt service routine as used in your Arduino script. See below:

Interrupt support

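Under XC32, an interrupt service routine is typically declared with the __ISR attribute macro from <sys/attribs.h>; the vector name, priority level, and handler name below are illustrative, not the article’s exact listing:

```c
#include <sys/attribs.h>   // provides the __ISR() attribute macro

// Declare myISR as the handler for external interrupt INT0 at
// interrupt priority level 3 (vector/priority names are illustrative)
void __ISR(_EXTERNAL_0_VECTOR, IPL3SOFT) myISR(void)
{
    // ... user interrupt code, as written in the Arduino script ...
    IFS0CLR = _IFS0_INT0IF_MASK;   // clear the INT0 interrupt flag before returning
}
```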

For more information on the beta release or to send comments and constructive criticism, or to report any detected problems, please contact me here.

Four test case examples demonstrating additional core library functions are shown below as illustrations.

Serial communications


Serial find string test case


Serial parse INT

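To give a flavor of these test cases, the Arduino-side code for a parse-integer exercise might look like the following (a representative sketch using standard Arduino serial calls, not the article’s exact listing):

```c
void setup() {
  Serial.begin(9600);               // the beta library supports one serial port
}

void loop() {
  if (Serial.available() > 0) {
    int value = Serial.parseInt();  // read the next integer from the stream
    Serial.print("Parsed: ");
    Serial.println(value);          // echo it back for verification
  }
}
```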



Editor’s Note: Portions of this post first appeared in Tom Kibalo’s book Beginner’s Guide to Programming the PIC32 (Electronic Products, 2013). They are reprinted with permission from Chuck Hellebuyck, Electronic Products. If you are interested in reading more articles by Kibalo, check out his two-part Circuit Cellar “robot boot camp” series posted in 2012: “Autonomous Mobile Robot (Part 1): Overview & Hardware” and “Autonomous Mobile Robot (Part 2): Software & Operation.”


Tom Kibalo


Tom Kibalo is principal engineer at a large defense firm and president of KibaCorp, a company dedicated to DIY hobbyist, student, and engineering education. Tom, who is also an Engineering Department adjunct faculty member at Anne Arundel Community College in Arnold, MD, is keenly interested in microcontroller applications and embedded designs. To find out more about Tom, read his 2013 Circuit Cellar member profile.

New IC for Real-Time Power Monitoring of Multiple Loads

Microchip Technology recently expanded its power-monitoring IC portfolio with the addition of the MCP39F511N, which provides standard power calculations and event monitoring of two electrical loads. It includes three ADCs for voltage and two current load measurements, a 16-bit calculation engine, EEPROM, and a flexible two-wire interface.

An integrated low-drift voltage reference, in addition to 94.5 dB of SINAD performance on each current measurement channel, enables the MCP39F511N to monitor two current loads with only 0.5% error across a wide 4,000:1 dynamic range. The ability to measure active, reactive, and apparent power; active and reactive energy accumulation; RMS current and RMS voltage; line frequency; and power factor, combined with advanced integrated features, enables you to reduce bill-of-materials cost and time to market.

The MCP39F511N is supported by Microchip Technology’s $200 MCP39F511N Power Monitor Demonstration Board (ADM00706). The MCP39F511N is available for sampling and volume production in a 28-lead, 5 × 5 mm QFN package. It costs $1.82 in 5,000-unit quantities.

Source: Microchip Technology

Microchip and SiS Offer PCAP and 3D-Gesture Interface Modules

Microchip Technology and Silicon Integrated Systems Corp. (SiS) recently partnered to offer complete projected-capacitive touch (PCAP) and 3D-gesture interface modules. The modules are intended to simplify the design of multi-touch and 3D gesture displays with Microchip’s GestIC technology.

Microchip’s GestIC is intended to be combined with multi-touch PCAP controllers. The modules from SiS integrate 2D PCAP and 3D gesture technologies. SiS modules with Microchip’s GestIC technology will enable engineers to deliver innovative 3D control displays in the consumer, home-automation, and Internet of Things markets.

Source: Microchip Technology

New PIC32 MPLAB Harmony Ecosystem Development Program

Microchip Technology’s new MPLAB Harmony Ecosystem Program is for the developers of embedded middleware and operating systems who are seeking to unlock the business potential of the 32-bit PIC32 microcontroller customer base. Ecosystem partners also gain early and easy access to the complete and current set of MPLAB Harmony tools. MPLAB Harmony is a 32-bit microcontroller firmware development framework, which integrates licensing, resale, and support of both Microchip and third-party middleware, drivers, libraries, and RTOSes. The new Ecosystem Program builds on that framework by offering an open and structured method to become certified as “Harmony Compatible,” using the embedded-control industry’s only set of test harnesses, checklists, and reference validation points. By accessing this broader, code-compatible ecosystem when creating complementary value-added software, developers can reduce risks and overall costs, accelerate time to market, and grow their businesses by gaining opportunities to market to thousands of PIC32 MPLAB Harmony users.

As today’s applications get increasingly sophisticated, embedded developers need to rapidly bring complex solutions to market. Microchip’s MPLAB Harmony framework for its PIC32 microcontrollers can reduce the development time of a typical project by a minimum of 20–35%, by providing a single integrated, abstracted, and flexible source of tested, debugged, and interoperable code. Additionally, MPLAB Harmony provides a modular architecture that enables the efficient integration of multiple drivers, middleware and libraries, while offering an RTOS-independent environment. Not only does this pre-verification and integration speed development, it also increases reuse. On the hardware side, the MPLAB Harmony framework makes it even easier to port code, thereby simplifying migration among all of Microchip’s 32-bit PIC32 microcontrollers, enabling a highly profitable, multitiered end equipment offering with minimal code redevelopment. Middleware and Operating System developers who take advantage of Microchip’s Ecosystem Development Program will be better able to offer customers solutions that leverage MPLAB Harmony’s efficiency and reliability advantages.

The MPLAB Harmony Integrated Software Framework is also supported by Microchip’s free MPLAB Harmony Configurator and MPLAB XC32 Compiler v1.40, all of which operate within Microchip’s MPLAB X IDE, and all of which are available for free download. Additionally, MPLAB Harmony and the PIC32 are supported with a comprehensive set of low-cost development boards available from microchipDIRECT or by contacting one of Microchip’s authorized distribution partners.

Source: Microchip Technology

New Industrial-Grade Ethernet Physical-Layer Transceiver

Microchip Technology recently announced the KSZ8061, a single-chip 10BASE-T/100BASE-TX automotive- and industrial-grade Ethernet physical-layer transceiver. Intended for data communication over low-cost unshielded twisted pair (UTP) cables, it is the first of a new family based on the programmable Quiet-WIRE enhanced EMC technology, providing reduced line emissions and superior receiver immunity. LinkMD+ advanced cable diagnostics improves system reliability.

For energy-efficient applications, Microchip’s integrated EtherGREEN technology includes a unique Ultra Deep Sleep mode with signal-detect wakeup, which lowers standby power consumption to the sub-microampere range. With fast boot and linkup in less than 20 ms, the KSZ8061 is well suited for applications where startup time is critical. The KSZ8061 family is available with an extended temperature range of −40° to 105°C for harsh-environment applications (e.g., industrial sensor networks and robotics). This Ethernet PHY transceiver family supports both the MII and RMII processor interfaces for easy integration with numerous processors, MCUs, and SoCs.

Microchip also has a new evaluation board, to enable functional and performance testing of the KSZ8061.  The $115 KSZ8061MNX evaluation board is now available for pre-ordering.

The KSZ8061 costs $1.16 each in 10,000-unit quantities for industrial grade. Volume-production availability is expected in early 2016.

Source: Microchip Technology

Low Power PIC MCUs Extend Battery Life, Eliminate External Memory via Flash

Microchip Technology recently expanded its Low Power PIC microcontroller portfolio. The new PIC24F GB6 family includes up to 1 MB of flash memory with Error Correction Code (ECC) and 32 KB of RAM. This new 16-bit MCU family is Microchip’s first to offer such a large memory size. Featuring dual-partition flash with Live Update capability, the devices can hold two independent software applications, permitting the simultaneous programming of one partition while executing application code from the other. This combination of features makes the PIC24F GB6 family well suited to a wide variety of applications (e.g., industrial, computer, medical/fitness, and portable applications).

Microcontrollers in the PIC24F GB6 family have active current consumption as low as 190 µA/MHz and 3.2 µA in Sleep mode. With the ability to perform over-the-air firmware updates, designers can provide a cost-effective, reliable, and secure method for updating their applications. The robust peripheral set for these devices includes a 200-ksps, 24-channel, 12-bit analog-to-digital converter (ADC).

The PIC24F GB6 family is supported by Microchip’s standard suite of development tools. The new PIC24FJ1024GB610 Plug-In Module (part # MA240023, $25) is available today for the Explorer 16 Development Board (part # DM240001, $129).

All eight members of the PIC24F GB6 microcontroller family are released for volume production, and are available within normal lead times. Pricing starts at $1.74 each, in high volumes. Product variants are available in 64-, 100-, and 121-pin packages, with flash memory ranging from 128 KB to 1 MB.

Source: Microchip Technology

New Dual-Channel USB Port Power Controller

Microchip Technology recently expanded its programmable USB-port power controller portfolio with the dual-channel UCS2112. This UCS2112 port power controller supports two ports, with eight programmable continuous current limits for each port, ranging from 0.53 to 3 A for faster charging times at higher currents. You can use it as is or with USB hubs to create a complete charging or USB communication system.


The UCS2112 port power controller is supported by Microchip’s new $140 UCS2112 Evaluation Board. The UCS2112 is available for sampling and volume production in a 20-pin QFN package. Pricing starts at $1.80 each, in 5,000-unit quantities.

Source: Microchip Technology

Next-Gen Bluetooth Low Energy Solutions

Microchip Technology recently launched next-generation Bluetooth Low Energy (LE) solutions intended for Internet of Things (IoT) and Bluetooth Beacon applications: the IS1870 Bluetooth LE RF module, the IS1871 Bluetooth LE RF module, and the BM70 module.

The Bluetooth LE devices include an integrated, certified Bluetooth 4.2 firmware stack. Data is transmitted over the Bluetooth link using Transparent UART mode, which you can integrate with any processor or PIC microcontroller with a UART interface. The module also supports standalone “hostless” operation for beacon applications.

The optimized power profile of these new devices minimizes current consumption for extended battery life, in compact form factors as small as 4 × 4 mm for the RF ICs and 15 × 12 mm for the module. The module options include RF regulatory certifications, or noncertified (unshielded/antenna-less) for smaller and more remote antenna designs that will undergo end-product emission certifications.

The BM70 Bluetooth Low Energy PICtail/PICtail Plus daughter board enables code development via USB interface to a PC. Or you can connect to Microchip’s existing microcontroller development boards, such as the Explorer 16, PIC18 Explorer and PIC32 I/O Expansion Board. The BM-70-PICTAIL costs $89.99.

The IS1870 Bluetooth LE RF IC (6 × 6 mm, 48-pin QFN package) costs $1.79 in 1,000-unit quantities. The IS1871 (4 × 4 mm, 32-pin QFN package) costs $1.76 in 1,000-unit quantities. The 30-pin BM70 Bluetooth LE modules are available with or without built-in PCB antennas, starting at $4.99 each in 1,000-unit quantities.

Source: Microchip Technology