March Circuit Cellar: Sneak Preview

The March issue of Circuit Cellar magazine is coming soon. And we’ve got a healthy serving of embedded electronics articles for you. Here’s a sneak peek.

Not a Circuit Cellar subscriber? Don’t be left out! Sign up today:

 

Here’s a sneak preview of March 2018 Circuit Cellar:

TECHNOLOGY FOR THE INTERNET-OF-THINGS

IoT: From Device to Gateway
The Internet of Things (IoT) is one of the most dynamic areas of embedded systems design today. This feature focuses on the technologies and products from edge IoT devices up to IoT gateways. Circuit Cellar Chief Editor Jeff Child examines the wireless technologies, sensors, edge devices and IoT gateway technologies at the center of this phenomenon.

Texting and IoT Embedded Devices
Texting has become a huge part of our daily lives. But can texting be leveraged for use in IoT Wi-Fi devices? Jeff Bachiochi lays the groundwork for a project that will involve texting. In this part, he gets into the details of getting started with a look at Espressif Systems’ ESP8266EX SoC.

Exploring the ESP32’s Peripheral Blocks
What makes an embedded processor suitable as an IoT or home control device? Wi-Fi support is just part of the picture. Brian Millier has done some Wi-Fi projects using the ESP32, so here he shares his insights about the peripherals on the ESP32 and why they’re so powerful.

MICROCONTROLLERS HERE, THERE & EVERYWHERE

Designing a Home Cleaning Robot (Part 4)
In this final part of his four-part article series about building a home cleaning robot, Nishant Mittal discusses the firmware part of the system and gets into the system’s actual operation. The robot is based on Cypress Semiconductor’s PSoC microcontroller.

Apartment Entry System Uses PIC32
Learn how a Cornell undergraduate built a system that enables an apartment resident to enter when keys are lost or to grant access to a guest when there’s no one home. The system consists of a microphone connected to a Microchip PIC32 MCU that controls a push solenoid to actuate the unlock button.

Posture Corrector Leverages Bluetooth
Learn how these Cornell students built a posture corrector that helps remind you to sit up straight. Using vibration and visual cues, this wearable device is paired with a phone app and makes use of Bluetooth and Microchip PIC32 technology.

INTERACTING WITH THE ANALOG WORLD

Product Focus: ADCs and DACs
Makers of analog ICs are constantly evolving their DAC and ADC chips, pushing the boundaries of resolution and speed. This new Product Focus section updates readers on this technology and provides a product album of representative ADC and DAC products.

Stepper Motor Waveforms
Using inexpensive microcontrollers, motor drivers, stepper motors and other hardware, columnist Ed Nisley built himself a Computer Numerical Control (CNC) machine. In this article Ed examines how the CNC’s stepper motors perform, then pushes one well beyond its normal limits.

Measuring Acceleration
Sensors are a fundamental part of what makes smart machines smart. And accelerometers are one of the most important of these. In this article, George Novacek examines the principles behind accelerometers and how the technology works.

SOFTWARE TOOLS AND PROTOTYPING

Trace and Code Coverage Tools
Today it’s not uncommon for embedded devices to have millions of lines of software code. Trace and code coverage tools have kept pace with these demands, making it easier for embedded developers to analyze, debug and verify complex embedded software. Circuit Cellar Chief Editor Jeff Child explores the latest technology trends and product developments in trace and code coverage tools.

Manual Pick-n-Place Assembly Helper
Prototyping embedded systems is an important part of the development cycle. In this article, Colin O’Flynn presents an open-source tool that helps you assemble prototype devices by making the placement process even easier.

Chipsets Provide Low Power LoRa Solutions

Semtech has announced its next-generation LoRa devices and wireless radio frequency (RF) technology (LoRa Technology) chipsets, enabling innovative LPWAN use cases for consumers. Addressing the need for cost-effective and reliable sensor-to-cloud connectivity in any type of RF environment, the new features and capabilities significantly improve the performance and capability of IoT sensor applications that demand ultra-low power, small form factor and long-range wireless connectivity with a shortened product development cycle.

The next-generation LoRa radios extend Semtech’s industry-leading link budget by 20% with a 50% reduction in receiver current (4.5 mA) and a high-power +22 dBm option. This extends the battery life of LoRa-based sensors by up to 30%, which reduces the frequency of battery replacement. The extended connectivity range, with the ability to reach deep indoor and outdoor sensor locations, will create new markets as different verticals integrate LoRa Technology into their IoT applications, including healthcare and pharmaceuticals, media and advertising, logistics/shipping and asset tracking.

The new platform has a command interface that simplifies radio configuration and shortens the development cycle, needing only 10 lines of code to transmit or receive a packet, which allows users to focus on their applications. The devices have a footprint 45% smaller than the current generation and are highly configurable to meet different application requirements utilizing the global LoRaWAN open standard. The chipsets also support FSK modulation for compatibility with legacy protocols that are migrating to the LoRaWAN open protocol for the performance benefits LoRa Technology provides.

Three new devices, the SX1262 (+22 dBm), SX1261 (+15 dBm) and SX1268 (+22 dBm, China frequency bands), are currently sampling to lead customers and partners and will be available in full production in late Q1 2018. Development kits for various regions and associated software will also be available at that time.

LoRa Technology New Features:

  • 50% less power in receive mode
  • 20% more extended range
  • +22 dBm transmit power
  • A 45% reduction in size: 4 mm x 4 mm
  • Global continuous frequency coverage: 150 MHz to 960 MHz
  • Simplified user interface with implementation of commands
  • New spreading factor of SF5 to support dense networks
  • Protocol compatible with existing deployed LoRaWAN networks

 

Semtech | www.semtech.com/iot

Partner Program to Focus on Security

Microchip Technology has established a Security Design Partner Program for connecting developers with third-party partners that can enhance and expedite secure designs. Along with the program, the company has also released its ATECC608A CryptoAuthentication device, a secure element that allows developers to add hardware-based security to their designs.

According to Microchip, the foundation of secured communication is the ability to create, protect and authenticate a device’s unique and trusted identity. By keeping a device’s private keys isolated from the system in a secured area, coupled with its industry-leading cryptography practices, the ATECC608A provides a high level of security that can be used in nearly any type of design. The ATECC608A includes the Federal Information Processing Standard (FIPS)-compliant Random Number Generator (RNG) that generates unique keys that comply with the latest requirements from the National Institute of Standards and Technology (NIST), providing an easier path to a whole-system FIPS certification.

Other features include:

  • Boot validation capabilities for small systems: New commands facilitate the signature validation and digest computation of the host microcontroller firmware for systems with small MCUs, such as an ARM Cortex-M0+ based device, as well as for more robust embedded systems.
  • Trusted authentication for LoRa nodes: The AES-128 engine also makes security deployments for LoRa infrastructures possible by enabling authentication of trusted nodes within a network.
  •  Fast cryptography processing: The hardware-based integrated Elliptical Curve Cryptography (ECC) algorithms create smaller keys and establish a certificate-based root of trust more quickly and securely than other implementation approaches that rely on legacy methods.
  •  Tamper-resistant protections: Anti-tampering techniques protect keys from physical attacks and attempted intrusions after deployment. These techniques allow the system to preserve a secured and trusted identity.
  •  Trusted in-manufacturing provisioning: Companies can use Microchip’s secured manufacturing facilities to safely provision their keys and certificates, eliminating the risk of exposure during manufacturing.

In addition to providing hardware security solutions, customers have access to Microchip’s Security Design Partner Program. These industry-leading companies, including Amazon Web Services (AWS) and Google Cloud Platform, provide complementary cloud-driven security models and infrastructure. Other partners are well-versed in implementing Microchip’s security devices and libraries. Whether designers are looking to secure an Internet of Things (IoT) application or add authentication capabilities for consumables, such as cartridges or accessories, the expertise of the Security Design Partners can reduce both development cost and time to market.

For rapid prototyping of secure solutions, designers can use the new CryptoAuth Xplained Pro evaluation and development kit (ATCryptoAuth-XPRO-B) which is an add-on board, compatible with any Microchip Xplained or Xplained Pro evaluation board. The ATECC608A is available for $0.56 each in 10,000 unit quantities. The ATCryptoAuth-XPRO-B add-on development board is available for $10.00 each.

Microchip Technology | www.microchip.com

Infineon MCUs Serve Audi’s Autonomous Car Functionality

Infineon Technologies has announced that it supplies key components for the Audi A8, the first series production car featuring level 3 automated driving. The ability of cars to drive themselves is split into a number of levels: at level 3, drivers can temporarily take their hands off the steering wheel under certain conditions. The Audi A8 allows this when parking and exiting, in slow-moving traffic or in traffic congestion. Using microelectronics from Infineon Technologies, the car can take over driving in these situations.

Various types of chips from Infineon enable safe automated driving in the Audi A8: sensors, microcontrollers and power semiconductors. Radar sensor chips from the RASIC family are installed in the front and corner radars. They send and receive high-frequency 77-GHz signals and forward them to the central driver assistance controller (zFAS).

A microcontroller from the AURIX family is a key component of the zFAS for reliable automated driving. AURIX secures the connection to the vehicle data bus. It assesses and prioritizes data packets and initiates their processing in the fastest possible time. For example, it initiates emergency braking based on data from radar and other sensor systems. The AURIX family of microcontrollers is especially well suited for this purpose thanks to its high processing power and extensive safety features.

AURIX microcontrollers are used in several controllers in the Audi A8: On the one hand, they control the functions for the engine. On the other, they operate in the Audi AI active chassis and in the electronic chassis platform, which controls the shock absorption. The microcontrollers also support activation of the airbag.

In addition to the electronics for drive, driver assistance and chassis, other semiconductor solutions from Infineon are installed in the comfort and body electronics, such as for example LED drivers from the LITIX Basic family in the tail lights as well as bridge drivers from the Embedded Power family in the windscreen wipers.

Infineon Technologies | www.infineon.com

MCU Vendors Embrace Amazon FreeRTOS

In a flurry of announcements concurrent with Amazon’s release of its new Amazon FreeRTOS operating system, microcontroller vendors are touting their collaborative efforts to support the OS. Amazon FreeRTOS extends the FreeRTOS kernel, a popular open source RTOS for microcontrollers, and includes software libraries for security, connectivity and updateability. Here’s a selection of announcements from the MCU community:

Microchip PIC32MZEF MCUs Support Amazon FreeRTOS
Microchip Technology has expanded its collaboration with Amazon Web Services (AWS) to support cloud-connected embedded systems from the node to the cloud. Microchip’s PIC32MZ EF series of microcontrollers now support Amazon FreeRTOS.

STMicro Adds Amazon FreeRTOS to its IoT MCU Tool Suite
STMicroelectronics has announced its collaboration with Amazon Web Services (AWS) on Amazon FreeRTOS, the latest addition to the AWS Internet of Things (IoT) solution.

 

NXP MCU IoT Card with Wi-Fi Supports Amazon FreeRTOS
NXP Semiconductors has introduced the LPC54018 MCU-based IoT module with onboard Wi-Fi and support for the new Amazon FreeRTOS on Amazon Web Services (AWS), offering developers universal connections to AWS.

 

TI Integrates SimpleLink MCU Platform with Amazon FreeRTOS
Texas Instruments (TI) has announced the integration of the new Amazon FreeRTOS into the SimpleLink microcontroller platform.

Renesas IoT Sandbox Supports RX65N MCU

Renesas Electronics America has expanded its Renesas IoT Sandbox lineup with the new RX65N Wi-Fi Cloud Connectivity Kit. The RX65N Wi-Fi Cloud Connectivity Kit provides an easy-to-use platform for connecting to the cloud, evaluating IoT solutions and creating IoT applications through cloud services and real-time workflows. The RX65N Wi-Fi Cloud Connectivity Kit integrates the high-performance Renesas RX65N microcontroller (MCU) and Medium One’s Smart Proximity demo with the data intelligence featured in Renesas IoT Sandbox.

The Renesas IoT Sandbox provides a fast path from IoT concept to prototype. It enables personalized data intelligence for system developers working with the Renesas Synergy™ Platform, the Renesas RL78 Family and RX Family of MCUs, and the Renesas RZ Family of microprocessors. The new RX65N Wi-Fi Cloud Connectivity Kit is based on the Renesas RX65N Group of MCUs, which is part of the high-performance RX600 Series of MCUs.

The new kit features the Smart Proximity demo implemented by Medium One. System developers can use workflows to extract data from the Ultrasonic Range Finder Sensor and then transmit the distance and duration of objects close to the sensor to provide intelligence on end-user engagement with those objects. For instance, when deployed in retail environments, business owners can leverage the data to determine when and for how long shoppers view specific merchandise, providing greater insight into shoppers’ selection behaviors.

Developers can sign up for a Renesas IoT Sandbox account at www.renesas.com/iotsandbox. The data intelligence developer area is ready for immediate prototyping use. The RX65N Wi-Fi Connectivity Kit is available for order at Amazon for $59 per kit.

Renesas Electronics | www.renesas.com

NXP MCU IoT Card with Wi-Fi Supports Amazon FreeRTOS

NXP Semiconductors has introduced the LPC54018 MCU-based IoT module with onboard Wi-Fi and support for newly launched Amazon FreeRTOS on Amazon Web Services (AWS), offering developers universal connections to AWS. Amazon FreeRTOS provides tools for users to quickly and easily deploy an MCU-based connected device and develop an IoT application without having to worry about the complexity of scaling across millions of devices. Once connected, IoT device applications can take advantage of the capabilities of the cloud or continue processing data locally with AWS Greengrass.

Amazon FreeRTOS enables strongly secured orchestration with the edge cluster to further leverage the low latencies of edge computing configurations, extending the reach of AWS Greengrass core devices to the nodes. Distributed and autonomous computing architectures become possible through the consistent interface provided between the nodes and their gateways, in both online and offline scenarios.

NXP’s IoT module, co-developed with Embedded Artists and based on the LPC54018 MCU, offers unlimited memory extensibility, a root of trust built on the embedded SRAM physical unclonable function (PUF) and on-chip cryptographic accelerators. Together, the LPC MCU and Amazon FreeRTOS, with easy-to-use software libraries, bring multiple layers of network transport security and simplify cloud onboarding and over-the-air device management.

NXP enables node-to-cloud AWS connectivity with its LPC54018-based IoT module available on Amazon.com and EmbeddedArtists.com at $35 direct to consumers.

NXP Semiconductors | www.nxp.com

Microchip PIC32MZEF MCUs Support Amazon FreeRTOS

Microchip Technology has expanded its collaboration with Amazon Web Services (AWS) to support cloud-connected embedded systems from the node to the cloud. Supporting Amazon Greengrass, Amazon FreeRTOS and AWS Internet of Things (IoT), Microchip provides all the components, tools, software and support needed to rapidly develop secure cloud-connected systems.

Microchip’s PIC32MZ EF series of microcontrollers now support Amazon FreeRTOS, an operating system that makes compact low-powered edge devices easy to program, deploy, secure and maintain. These high-performance MCUs incorporate industry-leading connectivity options, ample Flash memory, rich peripherals and a robust toolchain which empower embedded designers to rapidly build complex applications. Amazon FreeRTOS includes software libraries which make it easy to securely deploy over-the-air updates as well as the ability to connect devices locally to AWS Greengrass or directly to the cloud, providing a variety of data processing location options.

For systems requiring data collection and analysis at a local level, developers can use Microchip’s SAMA5D2 series of microprocessors with integrated AWS Greengrass software. This enables systems to run local compute, messaging, data caching and sync capabilities for connected devices in a secure way. This type of execution provides improved event response, conserves bandwidth and enables more cost-effective cloud computing. The SAMA5D2 devices, also available in System-in-Package (SiP) variants, offer full Amazon Greengrass compatibility in a low-power, small form factor MPU targeted at industrial and long-life gateway and concentrator applications. Additionally, the integrated security features and extended temperature range allow these MPUs to be deployed in physically insecure and harsh environments.

In any cloud-connected design, security and ease of use are vital pieces of the puzzle. Microchip’s ATECC608A CryptoAuthentication device enables enhanced system security as well as easy-to-use registration. The secure element provides a unique, trusted and protected identity to each device that can be securely authenticated to protect a brand’s intellectual property and revenue. In addition to enhancing system security, the ATECC608A allows AWS customers to instantly connect to the cloud through the device’s Just-in-Time-Registration (JITR) powered by AWS IoT.

Microchip has an extensive toolchain for rapid and reliable development. The Curiosity PIC32MZ EF development board (shown), which kick-starts Amazon FreeRTOS-based designs, is a fully integrated 32-bit development platform that also includes two mikroBUS expansion sockets, enabling designers to easily add capabilities, such as Wi-Fi with the WINC1510 click board, to their designs. The SAMA5D2 Xplained Ultra board, which can be used for AWS Greengrass designs, is a fast prototyping and evaluation platform for the SAMA5D2 series of MPUs. Additionally, the CryptoAuth Xplained Pro evaluation and development kit is an add-on board for rapid prototyping of secure solutions on AWS IoT and is compatible with any Microchip Xplained or Xplained Pro evaluation board. AWS is also a part of Microchip’s Design Partner Program, which provides technical expertise and cost-effective solutions in a timely manner.

PIC32MZ EF MCUs are available starting at $5.48 each in 10,000 unit quantities. The PIC32MZ EF Curiosity board (DM320104) is available for $47.99 each. SAMA5D2 MPUs are available starting at $4.42 each in 10,000 unit quantities. The SAMA5D2 Xplained Ultra board (ATSAMA5D2C-XULT) is available for $150 each. ATECC608A secure elements are available starting at $0.56 each in 10,000 unit quantities. The CryptoAuth Xplained Pro evaluation and development kit (ATCryptoAuth-XPRO-B) is available for $10 each.

Microchip Technology | www.microchip.com

STMicro Adds Amazon FreeRTOS to its IoT MCU Tool Suite

STMicroelectronics has announced its collaboration with Amazon Web Services (AWS) on Amazon FreeRTOS, the latest addition to the AWS Internet of Things (IoT) solution. Amazon FreeRTOS provides everything one needs to easily and securely deploy microcontroller-based connected devices and develop an IoT application without having to worry about the complexity of scaling across millions of devices. Once connected, IoT device applications can take advantage of all of the capabilities the cloud has to offer or continue processing data locally with AWS Greengrass.

ST’s collaboration with AWS speeds designers’ efforts to create easily connectable IoT nodes with the combination of ST’s semiconductor building blocks and Amazon FreeRTOS, which extends the leading free and open-source real-time operating-system kernel for embedded devices (FreeRTOS) with the appropriate libraries for local networking, cloud connectivity, security, and remote software updates.

For the STM32, ST’s family of 32-bit Arm Cortex-M microcontrollers, the modular and interoperable IoT development platform spans state-of-the-art semiconductor components, ready-to-use development boards, free software tools and common application examples. At the official release of Amazon FreeRTOS, a version of the OS and libraries were immediately made available to run on the ultra-low-power STM32L4 series of microcontrollers.

The starter kit for Amazon FreeRTOS is ST’s B-L475E-IOT01A Discovery kit for IoT node, a fully integrated development board that exploits low-power communication, multiway sensing, and a raft of features provided by the STM32L4 series microcontroller to enable a wide range of IoT-capable applications. The Discovery kit’s support for Arduino Uno V3 and PMOD connectivity ensures unlimited expansion capabilities with a large choice of specialized add-on boards.

STMicroelectronics | www.st.com

TI Integrates SimpleLink MCU Platform with Amazon FreeRTOS

Texas Instruments (TI) has announced the integration of the new Amazon FreeRTOS into the SimpleLink microcontroller platform. Amazon Web Services (AWS) has worked with TI in the development of an integrated hardware and software solution that enables developers to quickly establish a connection to AWS IoT service out-of-the-box and immediately begin system development.

TI’s SimpleLink Wi-Fi CC3220SF wireless MCU LaunchPad development kit, which now supports Amazon FreeRTOS, offers embedded security features such as secure storage, cloning protection, secure bootloader and networking security. Developers can now take advantage of these security features to help them protect cloud-connected IoT devices from theft of intellectual property (IP) and data or other risks.

TI offers a broad portfolio of building blocks for IoT nodes and gateways spanning wired and wireless connectivity, microcontrollers, processors, sensing technology, power management and analog solutions, along with a community of cloud service providers, such as AWS, to help developers get connected to the cloud faster.

The SimpleLink MCU platform from Texas Instruments is a single development environment that delivers flexible hardware, software and tool options for customers developing Internet of Things (IoT) applications. With a single software architecture, modular development kits and free software tools for every point in the design life cycle, the SimpleLink MCU ecosystem allows 100 percent code reuse across the portfolio of microcontrollers, which supports a wide range of connectivity standards and technologies including RS-485, Bluetooth low energy, Wi-Fi, Sub-1 GHz, 6LoWPAN, Ethernet, RF4CE and proprietary radio frequencies. SimpleLink MCUs help manufacturers easily develop and seamlessly reuse resources to expand their portfolio of connected products.

Texas Instruments | www.ti.com

Talking Hands: American Sign Language Gesture Recognition Glove

Roberto developed a glove that enables communication between the user and those around him. While the design is intended for use by people communicating in American Sign Language, you can apply what you learn in this article to a variety of communications applications.

Photo 1: Here you see the finished product with all of the sensors sewn in. The use of string as opposed to adhesive for the sensors allowed the components to smoothly slide back and forth as the hand was articulated.

By Roberto Villalba

While studying at Cornell University in 2014, my lab partner Monica Lin and I designed and built a glove to be worn on the right hand that uses a machine learning (ML) algorithm to translate sign language into spoken English (see Photo 1). Our goal was to create a way for the speech impaired to be able to communicate with the general public more easily. Since every person’s hand is a unique size and shape, we aimed to create a device that could provide reliable translations regardless of those differences. Our device relies on a variety of sensors, such as flex sensors, a gyroscope, an accelerometer, and touch sensors to quantify the state of the user’s hand. These sensors allow us to capture the flex on each of the fingers, the hand’s orientation, rotation, and points of contact. By collecting a moderate amount of this data for each sign and feeding it into a ML algorithm, we are able to learn the association between sensor readings and their corresponding signs. We make use of a microcontroller to read, filter and send the data from the glove to a PC. Initially, some data is gathered from the users and the information is used to train a classifier that learns to differentiate between signs. Once the training is done, the user is able to put on the glove and make gestures which the computer then turns into audible output.

Figure 1: After performing some calculations and characterizing our flex sensors, we decided to use a 10-kΩ resistor. Note that the rightmost point goes into one of the microcontroller’s ADC inputs.

HIGH-LEVEL DESIGN
We use the microcontroller’s analog-to-digital converter (ADC) to read the voltage drop across each of the flex sensors. We then move on to reading the linear acceleration and rotation values from the accelerometer and gyro sensor using I2C. Finally, we get binary readings from each of the touch sensors indicating whether or not there is contact. We perform as many readings as possible within a given window of time and use all of this data to do some smoothing. This information is then sent over serial to the PC, where it is gathered and processed. The Python code listens for information coming in from the microcontroller and either stores data or predicts based on already learned information. Our code includes scripts for gathering data, loading stored data, classifying the data that is being streamed live, and some additional scripts to help with visualization of sensor readings and so on.

MCU & SENSORS
The design comprises an Atmel ATmega1284P microcontroller and a glove onto which the various sensors and necessary wires were sewn. Each finger has one Spectra Symbol flex sensor stitched on the backside of the glove. The accelerometer and gyro sensors are attached to the center of the back of the glove. The two contact sensors were made out of copper tape and wire that was affixed to four key locations.

Since each flex sensor has a resistance that varies depending on how much the finger is bent, we attached each flex sensor as part of a voltage divider circuit in order to obtain a corresponding voltage that can then be input into the microcontroller.

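The divider relation used for this analysis is the standard voltage divider formula, reconstructed here for reference; whether the flex sensor forms the upper or lower leg of the divider depends on the wiring shown in Figure 1:

V_out = V_in × R1 / (R1 + R_flex)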

We determined a good value for R1 by analyzing expected values from the flex sensor. Each one has a flat resistance of 10 kΩ and a maximum expected resistance (obtained by measuring its resistance on a clenched fist) of about 27 kΩ. In order to obtain the maximum range of possible output voltages from the divider circuit given an input voltage of 5 V, we plotted the expected ranges using the above equation and values of R1 in the range of 10 kΩ to 22 kΩ. We found that the differences between the ranges were negligible and opted to use 10 kΩ for R1 (see Figure 1).

Our resulting voltage divider has an output range of about 1 V. We were initially concerned that the resulting values from the microcontroller’s ADC would be too close together for the learning algorithm to discern between different values sufficiently. We planned to address this by increasing the input voltage to the voltage divider if necessary, but we found that the range of voltages described earlier was sufficient and performed extremely well.

The InvenSense MPU-6050 accelerometer and gyro sensor package operates on a lower VCC (3.3 V) than the microcontroller’s 5 V. So as not to burn out the chip, we created a voltage regulator using an NPN transistor and a trimpot, connected as shown. The trimpot was adjusted so that the output of the regulator reads 3.3 V. This voltage also serves as the source for the pull-up resistors on the SDA and SCL wires to the microcontroller. Since the I2C devices are capable only of driving the lines low, we connect them to VCC via two 4.7-kΩ pull-up resistors (see Figure 2).

As described later, we found that we needed to add contact sensors to several key spots on the glove (see Figure 3). These essentially function as switches that pull the microcontroller input pins to ground to signal contact (be sure to set up the microcontroller pins to use the internal pull-up resistors).

Figure 2: Here we see the schematic of the voltage regulator circuit that we created in order to obtain 3.3 V. The bottom of the schematic shows how this same regulator was used to pull up the signals at SCL and SDA.

Figure 3: The contact sensor circuitry was quite simple. The input pins of the microcontroller are set to the internal pull-up resistors and whenever the two corresponding copper ends on the fingers touch the input is pulled low.

I2C COMMUNICATIONS
Interfacing with the MPU-6050 required I2C communication, for which we chose to use Peter Fleury’s public I2C library for AVR microcontrollers. I2C is designed to support multiple devices using a single dedicated data (SDA) bus and a single clock (SCL) bus. Even though we were only using the interface for the microcontroller to regularly poll the MPU-6050, we had to adhere to the I2C protocol. Fleury’s library provided us with macros for issuing start and stop conditions from the microcontroller (which signal that the microcontroller is requesting data from the MPU-6050 or releasing control of the bus). These macros allowed us to easily initialize the I2C interface, set up the MPU-6050, and request and receive the accelerometer and gyroscope data (described later).

Figure 4: The image is the visual output received from plotting sequences of sensor readings. The clear divisions across the horizontal axis mark the different signs A, B, C, and D, respectively.

While testing our I2C communication with the MPU-6050, we found that the microcontroller would on rare occasions hang while waiting for data from the I2C bus. To prevent this from stalling our program, we enabled a watchdog timer that would reset the system every 0.5 seconds, unless our program continued to progress to regular checkpoint intervals, at which time we would reset the watchdog timer to prevent it from unnecessarily resetting the system. We were able to leverage the fact that our microcontroller’s work consists primarily of continuously collecting sensor data and sending packets to a separate PC.

Photo 2: In this image we see the hand gestures for R, U, and V. As you can tell, there is not much difference in the hand’s orientation or the amount of flex on the fingers. However, note that the copper pieces make different kinds of contact for each of the signs.

TINYREALTIME
For the majority of the code, we used Dan Henriksson and Anton Cervin’s TinyRealTime kernel. The primary reason for using this kernel is that we wanted to take advantage of the already implemented non-blocking UART library in order to communicate with the PC. While we only had a single thread running, we tried to squeeze in as much computation as possible while the data was being transmitted.

The program first initializes the I2C interface, the MPU-6050, and the ADC. It then enters an infinite loop in which it resets the watchdog timer and gets 16 readings from all of the sensors: accelerometers, gyroscopes, flex sensors, and touch sensors. We then take all of the sensor values and compute filtered values by summing the 16 readings from each sensor. Since summation of the IMU sensor values can produce overflow, we make sure to shift all of their values by 8 bits before summing them up. The data is then wrapped up into a byte-array packet organized as a header (0xA1B2C3D4), the data, and a checksum of the data. Each sensor value is stored in 2 bytes, and the checksum is calculated by summing the unsigned representation of each of the bytes in the data portion of the packet into a 2-byte integer. Once the packet has been created it is sent through the USB cable to the computer and the process repeats.

PYTHON COMMUNICATION
Communication with the microcontroller was established through the use of Python’s socket and struct libraries. We created a class called SerialWrapper whose main goal is to receive data from the microcontroller. It does so by opening a port and running a separate thread that waits for new data to become available. The data is scanned for the header, and a packet of the right length is removed when available. The checksum is then calculated and verified and, if valid, the data is unpacked into the appropriate values and fed into a queue for other processes to extract. Since we know the format of the packet, we can use the struct library to extract all of the data from the packet, which is in a byte-array format. We then provide the user with two modes of use: one that continuously captures and labels data in order to build a dataset, and another that continuously tries to classify incoming data.

Support Vector Machines (SVMs) are a widely used set of ML algorithms that learn to classify by using a kernel. While the kernel can take various forms, the most common kind is the linear SVM. Simply put, the classification, or sign, for a set of readings is decided by taking the dot product of the readings and the classifier. While this may seem like a simple approach, the results are quite impressive. For more information about SVMs, take a look at scikit-learn’s “Support Vector Machines” (http://scikit-learn.org/stable/modules/svm.html).
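
The sketch below illustrates the header scan, checksum verification and struct unpacking described above for the SerialWrapper’s reader thread. The header value, the 13 two-byte sensor fields and the 16-bit checksum come from this article; the byte order and the signedness of the fields are assumptions that must match the actual firmware.

# Minimal sketch of the host-side packet handling. Assumes big-endian,
# unsigned sensor fields; adjust the struct formats to match the firmware.
import struct

HEADER = struct.pack(">I", 0xA1B2C3D4)   # assumed byte order of the header
PACKET_LEN = len(HEADER) + 13 * 2 + 2    # header + sensor data + checksum

def extract_packets(buf):
    """Scan buf for valid packets; return (list of 13-value tuples, leftover bytes)."""
    readings = []
    while True:
        start = buf.find(HEADER)
        if start < 0 or len(buf) - start < PACKET_LEN:
            # Keep the unconsumed tail so a packet split across reads survives.
            return readings, buf if start < 0 else buf[start:]
        pkt = buf[start:start + PACKET_LEN]
        data = pkt[len(HEADER):-2]
        expected = sum(data) & 0xFFFF                    # unsigned byte sum, 16 bits
        (received,) = struct.unpack(">H", pkt[-2:])
        if received == expected:
            readings.append(struct.unpack(">13H", data)) # 13 two-byte sensor values
        buf = buf[start + PACKET_LEN:]

A reader thread can append incoming serial bytes to a buffer, call this function, and push each returned tuple onto the queue that the rest of the program consumes.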

PYTHON MACHINE LEARNING
For the purposes of this project we chose to focus primarily on the alphabet, a-z, and we added two more labels, “nothing” and “relaxed”, to the set. Our rationale for providing the “nothing” class was to have a class made up mostly of noise. This class would not only provide negative instances to help learn the other classes, but it also gave the classifier a way of reporting that the gestured sign is not recognized as one of the ones we care about. In addition, we didn’t want the classifier to be trying to predict any of the letters when the user was simply standing by, so we taught it what a “relaxed” state was. This state was simply the position of the user’s hand when not signing anything. In total there were 28 signs or labels.

For our project we made extensive use of Python’s scikit-learn library. Since we were using various kinds of sensors with drastically different ranges of values, it was important to scale all of our data so that the SVM would have an easier time classifying. To do so we made use of the preprocessing tools available in scikit-learn. We chose to scale all of our data so that the mean for each sensor was centered at zero and the readings had unit variance. This approach brought about drastic improvements in our performance and is strongly recommended. The classifier that we ended up using was the SVM provided by scikit-learn under the name SVC.
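
As a concrete illustration of the scaling and classifier choices above, here is a minimal training sketch using scikit-learn. The file names and array shapes are placeholders, and the linear kernel reflects our reading of the text rather than a confirmed setting.

# Minimal training sketch following the approach described above.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X = np.load("glove_readings.npy")   # one row of 13 sensor values per packet (illustrative)
y = np.load("glove_labels.npy")     # 28 labels: a-z plus "nothing" and "relaxed"

scaler = StandardScaler()           # zero mean, unit variance per sensor channel
X_scaled = scaler.fit_transform(X)

clf = SVC(kernel="linear")          # scikit-learn's SVC, assumed linear kernel
clf.fit(X_scaled, y)

# Live packets must be scaled with the statistics learned from the training set.
live_packet = X[:1]
print(clf.predict(scaler.transform(live_packet)))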

Figure 5: The confusion matrix demonstrates how many times each label is predicted and how many times that prediction is accurate. We would like to see a perfect diagonal line, but we see that one square does not adhere to this. This square corresponds to “predicted V when it was really U” and it shows about a 66% accuracy.

Another part that was crucial to us as developers was the use of plotting in order to visualize the data and gauge how well a learning algorithm should be able to predict the various signs. The main tool that we developed for this was the plotting of a sequence of sensor readings as an image (see Figure 4). Since each packet contains a value for each of the sensors (13 in total), we can concatenate multiple packets to create a matrix. Each row thus corresponds to one of the sensors, and reading a row from left to right gives progressively later readings from that sensor; every packet makes up a column. This matrix can then be plotted with instances of the same sign grouped together, and the differences between these and the other signs can be observed. If the difference is clear to us, then the learning algorithm should have no issue telling them apart. If this is not the case, then it is possible that the algorithm could struggle and changes to the approach might be necessary.
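
A small sketch of this visualization is shown below: packets become columns of a 13-row matrix that is rendered as an image with matplotlib. The data loading is illustrative.

# Sketch of the plotting tool described above.
import numpy as np
import matplotlib.pyplot as plt

packets = np.load("glove_readings.npy")    # assumed shape (n_packets, 13)
matrix = packets.T                         # rows = sensors, columns = time

plt.imshow(matrix, aspect="auto", interpolation="nearest")
plt.xlabel("packet index (time)")
plt.ylabel("sensor channel")
plt.title("Sensor readings, instances of the same sign grouped together")
plt.show()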

The final step of classification is to pass the output of the classifier through a final level of filtering and debouncing before it reaches the user. To accomplish this, we fill a buffer with the last 10 predictions and only consider something a valid prediction if it has been predicted in at least nine of the last 10. Furthermore, we debounce this output and only notify the user if this is a novel prediction and not just a continuation of the previous one. We print this result on the screen and also make use of Peter Parente’s pyttsx cross-platform text-to-speech library to output the result as audio, as long as it is neither “nothing” nor “relaxed.”
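
Here is a minimal sketch of that filtering and debouncing step. The 9-of-10 threshold and the suppression of “nothing” and “relaxed” come from the article; the surrounding structure is illustrative, and the engine calls follow the documented pyttsx interface (pyttsx3 on Python 3 uses the same calls).

# Sketch of the output filtering: 9 of the last 10 predictions must agree,
# repeats are suppressed, and novel results are spoken aloud.
from collections import Counter, deque
import pyttsx  # or: import pyttsx3 as pyttsx

engine = pyttsx.init()
recent = deque(maxlen=10)
last_output = None

def handle_prediction(label):
    """Call with each raw classifier output; emits only debounced, novel results."""
    global last_output
    recent.append(label)
    top, count = Counter(recent).most_common(1)[0]
    if count >= 9 and top != last_output:
        last_output = top
        print(top)
        if top not in ("nothing", "relaxed"):
            engine.say(top)
            engine.runAndWait()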

RESULTS
Our original glove did not have contact sensors on the index and middle fingers. As a result, it had a hard time predicting “R,” “U,” and “V” properly. These signs are actually quite similar to each other in terms of hand orientation and flex. To mitigate this, we added two contact sensors: one set on the tips of the index and middle fingers to detect “R,” and another pair in between the index and middle fingers to discern between “U” and “V.”

As you might have guessed, the speed of our approach is limited by the rate of communication between the microcontroller and the computer and by the rate at which we are able to poll the ADC on the microprocessor. We determined how quickly we could send data to the PC by sending data serially and increasing the send rate until we noticed a difference between the rate at which data was being received and the rate at which data was being sent. We then reduced the send frequency back to a reasonable value and converted this into a loop interval (about 3 ms).

We then aimed to gather as much data as possible from the sensors in between packet transmissions. In addition to sending a packet, the microcontroller also sent the number of readings it had performed. We used this number to come up with a reasonable number of values to poll before aggregating the data and sending it to the PC. We concluded that the microcontroller was capable of reading each of the sensors 16 times between packets, which for our purposes provided enough room to do some averaging.

The Python algorithm is currently limited by the rate at which the microcontroller sends data to the PC and the time that it takes the speech engine to say the word or letter. The rate of transfer is currently about 30 Hz and we wait to fill a buffer with about ten unanimous predictions. This means that the fastest we could output a prediction is about three per second, which for our needs was suitable. Of course, one can tweak the values to get faster but slightly less accurate predictions. However, we felt that the glove was responsive enough at three predictions per second.

While we were able to get very accurate predictions, we did see some slight variations in accuracy depending on the size of the person’s hands. The accuracy of each flex sensor is limited beyond a certain point. Smaller hands result in a larger degree of bend. As a result, the difference between slightly different signs with a lot of flex tends to be smaller for users with more petite hands. For example, consider the signs for “M” and “S.” The only difference between these signs is that “S” elicits slightly more flex in the fingers. However, for smaller hands, the change in resistance from the flex sensor is small, and the algorithm may be unable to discern the difference between these signs.

Figure 6: We can see that even with very small amounts of data the classifier does quite well. After gathering just over 60 readings per sign it achieves an accuracy of over 98%.

In the end, our classifier was able to achieve an accuracy of 98% (the error being composed almost solely of U/V sign confusion) on a task of 28 signs, the full alphabet as well as “relaxed” and “nothing” (see Figure 5). A random classifier would guess correctly only about 4% of the time, clearly indicating that our device is quite accurate. It is, however, worth noting that the algorithm could greatly benefit from improved touch sensors (seeing as the most common mistake is confusing U for V), from being trained on a larger population of users, and especially from larger datasets. With a broad enough dataset we could provide new users with a small test script that covers only the difficult-to-predict letters and relies on the already available data for the rest. The software has currently been trained on the two team members and has been tested on some users outside of the team. The results were excellent for the team members that trained the glove and mostly satisfying, though not perfect, for the other volunteers. Since the volunteers did not have a chance to train the glove and were not very familiar with the signs, it is hard to say whether their accuracy was a result of overfitting, individual variations in signing, or inexperience with American Sign Language. Regardless, the accuracy of the software was near perfect for users who trained it and mostly accurate for users who did not previously know American Sign Language and did not train the glove.

Lastly, it is worth noting that the amount of data necessary for training the classifier was surprisingly small. With about 60 instances per label the classifier was able to reach the 98% mark. Given that we receive 30 samples per second and that there are 28 signs, gathering the training data can be done in under a minute: 28 signs × 60 samples ÷ 30 samples per second is roughly 56 seconds (see Figure 6).

FUTURE UPGRADES
The project met our expectations. Our initial goal was to create a system capable of recognizing and classifying gestures, and we were able to do so with more than 98% average accuracy across all 28 classes. While we did not have a firm requirement for the rate of prediction, the resulting speed made using the glove comfortable and it did not feel sluggish. Looking ahead, it would make sense to improve our approach to the touch sensors, since the majority of the ambiguity in signs comes from the difference between U and V. We want to use materials that lend themselves more seamlessly to clothing and provide a more reliable connection. In addition, it would be beneficial to test and train our project on a large group of people, since this would provide us with richer data and more consistency. Lastly, we hope to make the glove wireless, which would allow it to easily communicate with phones and other devices and make the system truly portable.

RESOURCES
Arduino, “MPU-6050 Accelerometer + Gyro,” http://playground.arduino.cc/Main/MPU-6050.

Atmel Corp., “8-Bit AVR Microcontroller with 128K Bytes In-System Programmable Flash: ATmega1284P,” 8059D-AVR-11/09, 2009, www.atmel.com/images/doc8059.pdf.

Fleury, “AVR-Software,” 2006, http://homepage.hispeed.ch/peterfleury/avrsoftware.html.

Lund University, “Tiny Real Time,” 2006, www.control.lth.se/~anton/tinyrealtime/.

Parente, “pyttsx – Text-to-speech x-platform.”

“struct – Interpret Strings as Packed Binary Data,” https://docs.python.org/2/library/struct.html.

scikit-learn, “Preprocessing Data,” http://scikit-learn.org/stable/modules/preprocessing.html.

scikit-learn, “Support Vector Machines,” http://scikit-learn.org/stable/modules/svm.html.

Spectra Symbol, “Flex Sensor FS,” 2015, www.spectrasymbol.com/wp-content/themes/spectra/images/datasheets/FlexSensor.pdf.

R. Villalba and M. Lin, “Sign Language Glove,” ECE4760, Cornell University, 2014, http://people.ece.cornell.edu/land/courses/ece4760/FinalProjects/f2014/rdv28_mjl256/webpage/.

SOURCES
ATmega1284P Microcontroller Atmel | www.atmel.com

MPU-6050 MEMS MotionTracking Device InvenSense | www.invensense.com

Article originally published in Circuit Cellar June 2016, Issue #311

Battery-Free IoT Start Up Raises $19 Million

Wiliot, a fabless semiconductor start-up company, has closed an investment round with Qualcomm Ventures and M Ventures. The announcement was made in conjunction with the opening of the Active & Intelligent Packaging Industry Association (AIPIA) Conference in Amsterdam where the company will make its first public presentation to leaders in the packaging industry.

The latest investment round comes on the heels of a Series A round that yielded $14 million from forward-thinking strategic technology investors Grove Ventures, Norwest Venture Partners, and 83North Venture Capital. This first round closed in January, the month Wiliot was founded. In all, Wiliot has raised a total of $19 million in its first 10 months as a semiconductor company.

Wiliot, whose research and development arm is based in Israel, is on course to develop a wireless technology that will eliminate the reliance on batteries or wired power to vastly accelerate the Internet of Things, with the vision of creating a world of “Smart Everything.” The new technology, which powers itself by harvesting energy from radio waves, enables a sensor as small as a fingernail and as thin as a sheet of paper, with an order-of-magnitude reduction in price and cost of maintenance.

With proof of concepts scheduled to start in 2H 2018, and a delivery to market date in early 2019, Wiliot’s technology aims to revolutionize the current Bluetooth beacon marketplace, which after more than five years has hit a floor on reductions in cost, size and maintenance effort that has hindered widespread adoption.

Wiliot | www.wiliot.com

Sensor Node Gets LoRaWAN Certification

Advantech offers its standardized M2.COM IoT LoRaWAN-certified sensor node, the WISE-1510, with an integrated ARM Cortex-M4 processor and LoRa transceiver. The module provides multiple interfaces for sensors and I/O control, including UART, I2C, SPI, GPIO, PWM and ADC. The WISE-1510 sensor node is well suited for smart cities, agriculture, metering, street lighting and environmental monitoring. With optimized power consumption and wide-area reception, LoRa sensors or applications with low data rate requirements can achieve years of battery life and kilometers of connection distance.

The WISE-1510 has received LoRaWAN certification from the LoRa Alliance. Depending on deployment requirements, developers can choose to use public LoRaWAN network services or build a private LoRa system with the WISE-3610 LoRa IoT gateway. Advantech’s WISE-3610 is a Qualcomm ARM Cortex-A7 based hardware platform with a private LoRa ecosystem solution that can connect up to 500 WISE-1510 sensor node devices. Powered by Advantech’s WISE-PaaS IoT software platform, the WISE-3610 features automatic cloud connection through its WISE-PaaS/WISE Agent service, manages wireless nodes and data via WSN management APIs, and helps customers streamline their IoT data acquisition development through sensor service APIs and WSN drivers.

Developers can leverage the microprocessor on the WISE-1510 to build their own applications. The WISE-1510 offers unified software: ARM Mbed OS and an SDK for easy development, with APIs and related documentation. Developers can also find extensive resources on GitHub, such as code review, library integration and free core tools. The WISE-1510 also carries worldwide certifications, which allow developers to deploy their IoT devices anywhere. Using Advantech’s WISE-3610 LoRa IoT gateway, the WISE-1510 can be connected to WISE-PaaS/RMM or the ARM Mbed Cloud service with IoT communication protocols including LWM2M, CoAP and MQTT. End-to-end integration helps system integrators overcome complex challenges and build IoT applications quickly and easily.

WISE-1510 features and specifications:

  • ARM Cortex-M4 core processor
  • Compatible support for public LoRaWAN or private LoRa networks
  • Great for low power/wide range applications
  • Multiple I/O interfaces for sensor and control
  • Supports a wide temperature range of -40 °C to 85 °C

Advantech | www.advantech.com

Microcontroller Family Provides 25 Sensing Functions for 25 Cents

Texas Instruments (TI) has unveiled its lowest-cost ultra-low-power MSP430 microcontrollers for sensing applications. Developers can now implement simple sensing solutions through a variety of integrated mixed-signal features in this family of MSP430 value line sensing MCUs, available for as low as US $0.25 in high volumes. Additions to the family include two new entry-level devices and a new TI LaunchPad development kit for quick and easy evaluation.

Developers now have the flexibility to customize 25 common system-level functions including timers, input/output expanders, system reset controllers, electrically erasable programmable read-only memory (EEPROM) and more, using a library of code examples. A common core architecture, a tools and software ecosystem, and extensive documentation including migration guides make it easy for developers to choose the best MSP430 value line sensing MCU for each of their designs. Designers can scale from the 0.5-kB MSP430FR2000 MCU to the rest of the MSP430 sensing and measurement MCU portfolio for applications that require up to 256 kB of memory, higher performance or more analog peripherals.

The new MSP430FR2000 and MSP430FR2100 MCUs (with 0.5 kB and 1 kB of memory, respectively) and the new development kit join the MSP430 value line sensing family which includes the MSP430FR2111, MSP430FR2311, MSP430FR2033, MSP430FR2433 and MSP430FR4133 microcontroller families and their related development tools and software.

Developers can purchase the value line sensing portfolio through the TI store, priced as low as US$0.29 in 1,000-unit quantities and US $0.25 in higher volumes. Additionally, the new MSP430FR2433 LaunchPad development kit (MSP-EXP430FR2433) is available from the TI store and authorized distributors for US $9.99. Today through Dec. 31, 2017, the TI store is offering the LaunchPad kit for a promotional price of US $4.30.

Texas Instruments | www.ti.com

Mini Sensor Dies Target IoT and Autos

TDK has announced new miniaturized EPCOS MEMS pressure sensor dies. The automotive versions of the C33 series boast dimensions of just 1 mm x 1 mm x 0.4 mm. They are designed for absolute pressures of 1.2 bar to 10 bar and are qualified based on AEC-Q101. The typical operating voltage is 3 V. With a supply voltage of 5 V they offer sensitivities of between 15 mV/bar and 80 mV/bar, depending on the type. The miniaturized pressure sensors are suitable for a temperature range from -40 °C to +135 °C and can even withstand 140 °C for short periods. They also offer long-term stability of ±0.35% FS (full scale).

The C39 type, with its footprint of just 0.65 mm x 0.65 mm, is especially suitable for IoT and consumer applications. One noteworthy feature of the C39 is its low insertion height of just 0.24 mm, which makes the low-profile MEMS pressure sensor die ideal for applications in smartphones and wearables, for example, where space requirements are critical. The C39 is designed for an absolute pressure of 1.2 bar and, like the C33 series, offers long-term stability of ±0.35% FS. All the pressure sensor dies operate on the piezoresistive principle and deliver, via a Wheatstone bridge, an analog signal that is proportional to the applied pressure and the supply voltage.

Further information on the products is available at www.epcos.com/pressure_sensor_elements

TDK-Lambda | www.us.tdk-lambda.com