Biodesign by Design: A Look at Soft Robotics

Mouser examines a human-centric approach to robotics. Soft robotic designs are less metal and mechanical, instead using gentle, compliant mechanisms and actuators built from fluids and flexible materials. These enable a wide range of motion and are well suited to exoskeletons and wearables.

Low-Power MCUs Extend Battery Life for Wearables

Maxim Integrated Products has introduced the ultra-low power MAX32660 and MAX32652 microcontrollers. These MCUs are based on the ARM Cortex-M4 with FPU processor and give designers the means to develop advanced applications under restrictive power constraints. Maxim’s DARWIN family of MCUs combines its wearable-grade power technology with the largest embedded memories in their class and advanced embedded security.

Memory, size, power consumption, and processing power are critical features for engineers designing more complex algorithms for smarter IoT applications. According to Maxim, existing solutions offer two extremes—they either have decent power consumption but limited processing and memory capabilities, or they have higher power consumption with more powerful processors and more memory.
The MAX32660 (shown) offers designers enough memory to run some advanced algorithms and manage sensors (256 KB flash and 96 KB SRAM). It also offers excellent power performance (down to 50 µW/MHz), small size (1.6 mm x 1.6 mm in a WLP package) and a cost-effective price point. Engineers can now build more intelligent sensors and systems that are smaller and lower in cost, while also providing longer battery life.
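To get a feel for what a figure like 50 µW/MHz means for battery life, here is a back-of-the-envelope sketch in C. The clock rate, duty cycle and coin-cell capacity are assumptions chosen only for illustration (and sleep-mode current is ignored), so treat the result as a rough order-of-magnitude estimate rather than a Maxim specification.

#include <stdio.h>

int main(void)
{
    /* Only the 50 uW/MHz active-power figure comes from the announcement;
     * everything else below is an illustrative assumption. */
    const double uw_per_mhz  = 50.0;          /* quoted active power efficiency   */
    const double clock_mhz   = 96.0;          /* assumed operating clock          */
    const double duty_cycle  = 0.01;          /* assumed 1% active, sleep ignored */
    const double battery_mwh = 3.0 * 225.0;   /* assumed CR2032: ~225 mAh at 3 V  */

    double avg_mw = (uw_per_mhz * clock_mhz / 1000.0) * duty_cycle;
    double hours  = battery_mwh / avg_mw;

    printf("average draw ~%.3f mW, estimated life ~%.0f days\n",
           avg_mw, hours / 24.0);
    return 0;
}

With these assumptions the average draw works out to roughly 0.05 mW, or on the order of a year and a half from a coin cell; in a real design, sleep current and radio activity would dominate the budget.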

As IoT devices become more intelligent, they start requiring more memory and additional embedded processors, each of which can be very expensive and power hungry. The MAX32652 offers an alternative for designers who want the low power consumption of an embedded microcontroller combined with the capabilities of a higher-powered applications processor.

With 3 MB flash and 1 MB SRAM integrated on-chip and running at up to 120 MHz, the MAX32652 offers a highly integrated solution for IoT devices that strive to do more processing and provide more intelligence. Integrated high-speed peripherals such as high-speed USB 2.0, a secure digital (SD) card controller, a thin-film transistor (TFT) display controller, and a complete security engine position the MAX32652 as the low-power brain for advanced IoT devices. With the added capability to run from external memories over HyperBus or XcellaBus, the MAX32652 can be designed to do even more tomorrow, providing designers with a future-proof memory architecture that anticipates the increasing demands of smart devices.

The MAX32660 and MAX32652 are both available at Maxim’s website and select authorized distributors. MAX32660EVKIT# and MAX32652EVKIT# evaluation kits are also both available at Maxim’s website.

Maxim Integrated | www.maximintegrated.com

The Dick Tracy Wristwatch TV

Input Voltage

–Jeff Child, Editor-in-Chief


At my first technology editor job back in 1990, my boss at the time was obsessed with the concept of the Dick Tracy wristwatch. Dick Tracy was a popular comic strip that debuted in the early 1930s and remained a fixture of popular culture for decades. Now, let me be clear, even I’m not old enough to be from the era when Dick Tracy was part of popular culture. But my boss was. For those of you who don’t know, the 2-Way Wrist Radio was one of the comic strip’s most iconic items. It was worn by Tracy and members of the police force, and in 1964 the 2-Way Wrist Radio was upgraded to a 2-Way Wrist TV. When chip companies came to visit our editorial offices—this is back when press tours were still a thing—my boss would quite often ask the hypothetical question: “When are we going to get the Dick Tracy wristwatch?”

Confident that Moore’s Law would go on forever, semiconductor companies back then were always hungry to get their share of the mobile electronic device market—although the “device” of the day kept changing. My boss’s Dick Tracy wristwatch question was a clever way to spur discussion about chip integration, extreme low power, wireless communication and even full-motion video. Full-motion video on a mobile device in particular was a technology that many were skeptical could ever happen. In that early-90s period, DRAM was the main driver of semiconductor process technology, and, in turn, the desktop PC was by far the dominant market for DRAMs. As a result, there was a tendency to view all future computing through the lens of the PC. It would be more than a decade before flash memory surpassed DRAM as the main driver of the chip business, and that happened because the market size of mobile devices began to eclipse PCs.

As most of you know, Circuit Cellar has BYTE magazine as part of its origin story. Steve Ciarcia had a popular column called Circuit Cellar in BYTE magazine, and when Steve founded this magazine three decades ago, he gave it the Circuit Cellar name. The April 1981 issue of BYTE famously had a cover picture of what was essentially a wristwatch with a CRT screen and keyboard, with a mini-floppy disk being inserted into its side. That’s a vivid example of how notoriously bad we humans are at predicting what future technologies will look like. We have an inherent bias toward imposing what we have now on our view of the future.

Fast forward to today and obviously we have the Dick Tracy wristwatch and so much more—the Apple Watch being the most vivid example. Today’s wearable devices span the consumer, fitness and medical markets, and all need a mix of low-power, low-cost and high-speed processing. But even though technology has come a long way, the design challenges are still tricky. Wearable electronic devices of today all share some common aspects. They have an extremely low budget for power consumption; they tend not to be suited to replaceable batteries and therefore must be rechargeable; and they usually require some kind of wireless connectivity.

Today’s wearables include a variety of products such as smartwatches, physical activity monitors, heart rate monitors, smart headphones and more. Microcontrollers for these devices have to combine extremely low power with high integration. At the same time, power solutions serving this market require mastery of low quiescent current design techniques and high integration. To meet those needs, chip vendors—primarily from the microcontroller and analog markets—keep advancing solutions that consume extremely low levels of power and manage that power.

One amusing aspect of the Dick Tracy wristwatch was that it was referred to as a 2-Way Radio (and later a 2-Way TV). With Internet connectivity, today’s smartwatches are effectively connected to an unlimited number of network nodes. I can’t claim to be a better predictor of the future than the editors of 1981’s BYTE. But now I need to come up with a new question to ask chip vendors, and I don’t know what the question should be. Perhaps: “When are we going to get the Star Wars holographic 3D image messaging system?” And in wristwatch form, please.

This appears in the May (334) issue of Circuit Cellar magazine


Wireless MCUs are Bluetooth Mesh Certified

Cypress Semiconductor has announced that its single-chip solutions for the Internet of Things (IoT) have received Bluetooth mesh connectivity certification from the Bluetooth Special Interest Group (SIG) and have been designed into a consumer product. LEDVANCE announced the market’s first Bluetooth mesh qualified LED lighting products, which leverage Cypress’ Bluetooth mesh technology. Three Cypress wireless combo chips and the latest version of its Wireless Internet Connectivity for Embedded Devices (WICED) software development kit (SDK) support Bluetooth connectivity with mesh networking capability. Cypress’ solutions enable a low-cost, low-power mesh network of devices that can communicate with each other–and with smartphones, tablets and voice-controlled home assistants–via simple, secure and ubiquitous Bluetooth connectivity.

Previously, users needed to be in the immediate vicinity of a Bluetooth device to control it without an added hub. With Bluetooth mesh networking technology, the devices within the network can communicate with each other to easily provide coverage throughout even the largest homes, allowing users to conveniently control all of the devices via apps on their smartphones and tablets.

Market research firm ABI Research forecasts there will be more than 57 million Bluetooth smart lightbulbs by 2021. Cypress’ CYW20719, CYW20706, and CYW20735 Bluetooth and Bluetooth Low Energy (BLE) combo solutions and CYW43569 and CYW43570 Wi-Fi and Bluetooth combo solutions offer fully compliant Bluetooth mesh networking. Cypress also offers Bluetooth mesh certified modules and an evaluation kit. The solutions share a common, widely deployed Bluetooth stack and are supported in version 6.1 of Cypress’ all-inclusive WICED SDK, which streamlines the integration of wireless technologies for developers of smart home lighting and appliances, as well as healthcare applications.

Cypress Semiconductor | www.cypress.com

Analyst 2017 Review: Mobile Devices Dominated GPU Market

Jon Peddie Research (JPR), a market research and consulting firm focused on graphics and multimedia, offers its annual review of GPU developments for 2017. In spite of the slow overall decline of the PC market, PC-based GPU sales, which include workstations, have been increasing, according to the review. In the mobile market, integrated GPUs have risen at the same rate as mobile devices and the SoCs in them. The same is true for the console market, where integrated graphics are in every console and they too have increased in sales over the year.

Nearly 28% of the world’s population bought a GPU device in 2017, and that’s in addition to the systems already in use. And yet, probably less than half of them even know what the term GPU stands for, or what it does. To them the technology is invisible, and that means it’s working—they don’t have to know about it.

The market for, and use of, GPUs stretches from supercomputers and medical devices to gaming machines, mobile devices, automobiles, and wearables. Just about everyone in the industrialized world has at least a half dozen products with a GPU in them, and technophiles can easily count a dozen or more. The manufacturing of GPUs approaches science fiction, with features that will move below 10 nm next year and have a glide path to 3 nm, and some think even 1 nm—Moore’s law is far from dead, but is getting trickier to coax out of the genie’s bottle as we drive into subatomic realms that can only be modeled and not seen.

Over the past 12 months, JPR has seen a few new GPUs, and some clever adaptations of existing ones, that show the path for future developments and subsequent applications. 2017 was an amazing year for GPU development, driven by games, eSports, AI, cryptocurrency mining, and simulations. Autonomous vehicles started to become a reality, as did augmented reality. The over-hyped consumer PC VR market explosion didn’t happen, and had little to no impact on GPU developments or sales. Most of the participants in VR already had a high-end system, and the HMD was just another display to them.

Mobile GPUs, exemplified by products from Qualcomm, ARM and Imagination Technologies, are key to amazing devices with long battery life and screens at or approaching 4K, and in 2017 people started talking about and showing HDR.

JPR’s review says that many, if not all, of the developments we will see in 2018 were started as early as 2015, and that three- to four-year lead time will continue. Lead times could get longer as we learn how to deal with chips built from billions of transistors manufactured at feature sizes smaller than X-ray wavelengths. Ironically, buying cycles are also accelerating, ensuring strong competition as players try to leapfrog each other in innovation. According to JPR, we’ll see considerable innovation in 2018, with AI being the leading application that will permeate every sector of our lives.

The JPR GPU Developments in 2017 Report is free to all subscribers of JPR. Individual copies of the report can be purchased for $100.

Jon Peddie Research | www.jonpeddie.com

Quantum Leaps

Input Voltage

–Jeff Child, Editor-in-Chief


Throughout my career, I’ve always been impressed by Intel’s involvement in a wide spectrum of computing and electronics technologies. These range from the mundane and practical on one hand, to forward-looking and disruptive advances on the other. A lot of these weren’t technologies Intel ever intended to take direct advantage of over the long term. I think a lot about how Intel facilitated the creation of and early advances in USB. Intel even sold USB chips in the first couple years of USB’s emergence, but stepped aside from that with the knowledge that its main focus was selling processors.

USB made computers and a myriad of consumer electronic devices better and easier to use, and that, Intel knew, advanced the whole industry in which their microprocessors thrived. Today, look around your home, your office and even your car and count the number of USB connectors there are. It’s pretty obvious that USB’s impact has been truly universal.

Aside from mainstream, practical solutions like USB, Intel also continues to participate in the most forward-looking compute technologies. Exemplifying that, in January at the Consumer Electronics Show (CES) in Las Vegas, Intel announced two major milestones in its efforts to develop future computing technologies. In his keynote address, Intel CEO Brian Krzanich announced the successful design, fabrication and delivery of a 49-qubit superconducting quantum test chip. The keynote also focused on the promise of neuromorphic computing.

In his speech, Krzanich explained that, just two months after delivery of a 17-qubit superconducting test chip, Intel that day unveiled “Tangle Lake,” a 49-qubit superconducting quantum test chip. The chip is named after a chain of lakes in Alaska, a nod to the extreme cold temperatures and the entangled state that quantum bits (or “qubits”) require to function.

According to Intel, achieving a 49-qubit test chip is an important milestone because it will allow researchers to assess and improve error correction techniques and simulate computational problems.

Krzanich predicts that quantum computing will solve problems that today might take our best supercomputers months or years to resolve, such as drug development, financial modeling and climate forecasting. While quantum computing has the potential to solve problems conventional computers can’t handle, the field is still nascent.

Mike Mayberry, VP and managing director of Intel Labs, weighed in on the progress of the efforts. “We expect it will be 5 to 7 years before the industry gets to tackling engineering-scale problems, and it will likely require 1 million or more qubits to achieve commercial relevance,” said Mayberry.

Krzanich said the need to scale to greater numbers of working qubits is why Intel, in addition to investing in superconducting qubits, is also researching another type called spin qubits in silicon. Spin qubits could have a scaling advantage because they are much smaller than superconducting qubits. Spin qubits resemble a single electron transistor, which is similar in many ways to conventional transistors and potentially able to be manufactured with comparable processes. In fact, Intel has already invented a spin qubit fabrication flow on its 300-mm process technology.

At CES, Krzanich also showcased Intel’s research into neuromorphic computing—a new computing paradigm inspired by how the brain works that could unlock exponential gains in performance and power efficiency for the future of artificial intelligence. Intel Labs has developed a neuromorphic research chip, code-named “Loihi,” which includes circuits that mimic the brain’s basic operation.

While the concepts seem futuristic and abstract, Intel is thinking of the technology in terms of real-world uses. Intel says neuromorphic chips could ultimately be used anywhere real-world data needs to be processed in evolving real-time environments. For example, these chips could enable smarter security cameras and smart-city infrastructure designed for real-time communication with autonomous vehicles. In the first half of this year, Intel plans to share the Loihi test chip with leading university and research institutions while applying it to more complex data sets and problems.

For me to compare quantum and neuromorphic computing to USB is about as apples-and-oranges as you can get. But, who knows? When the day comes that quantum or neuromorphic chips are in our everyday devices, maybe my comparison won’t seem far-fetched at all.

This appears in the February (331) issue of Circuit Cellar magazine


The Quest for Extreme Low Power

Input Voltage

–Jeff Child, Editor-in-Chief


Over the next couple of years, power will clearly rank as a major design challenge for the myriad of edge devices deployed in Internet of Things (IoT) implementations. Such IoT devices are wireless units that need to be always on and connected. At the same time, they need low power consumption while still delivering the processing power needed to enable machine intelligence. The need for extreme low power in these devices goes beyond the need for long battery life. Instead, the hope is for perpetually powered solutions providing uninterrupted operation—and, if possible, without any need for battery power. For their part, microcontroller vendors have been doing a lot in recent years within their own labs to craft extreme low power versions of their MCUs. But the appetite for low power at the IoT edge is practically endless.

Offering a fresh take on the topic, I recently spoke with Paul Washkewicz, vice president and co-founder of Eta Compute, about the startup’s extreme low power technology for microcontrollers. The company claims to offer the lowest power MCU intellectual property (IP) available today, with operating voltages as low as 0.3 V. Eta Compute has developed and implemented a unique low power design methodology that delivers up to a 10x improvement in power efficiency. Its IP and custom designs operate over severe variations in conditions such as temperature, process, voltage and power supply. Eta Compute’s approach is a self-timed technology supporting dynamic voltage scaling (DVS) that is insensitive to process variations, inaccurate device models and path delay variations.
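To see why dropping the supply voltage is so effective, recall that the dynamic switching energy of CMOS logic scales roughly with CV². The short C sketch below works through that ideal ratio for a drop from 1.2 V to 0.3 V; the capacitance value is an arbitrary placeholder, and the calculation ignores leakage and overheads, which is why a real-world figure such as Eta Compute’s claimed 10x sits below the ideal one.

#include <stdio.h>

/* Ideal dynamic switching energy per operation: E ~ C * V^2.
 * C is an arbitrary placeholder; leakage and overheads are ignored. */
static double energy_per_op(double c_farads, double v_volts)
{
    return c_farads * v_volts * v_volts;
}

int main(void)
{
    const double c_eff = 100e-12;                  /* assumed effective capacitance */
    double e_nominal = energy_per_op(c_eff, 1.2);  /* conventional supply voltage   */
    double e_scaled  = energy_per_op(c_eff, 0.3);  /* near-threshold supply voltage */

    printf("ideal energy-per-operation improvement: %.0fx\n",
           e_nominal / e_scaled);                  /* (1.2 / 0.3)^2 = 16x */
    return 0;
}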

The technology has been implemented in a variety of chip functions. Among these are ARM Cortex-M0+ and Cortex-M3 cores scaling from 0.3 V to 1.2 V operation, with additional low-voltage logic support functions such as a real-time clock (RTC), Advanced Encryption Standard (AES) and digital signal processing. The technology has also been implemented in an analog-to-digital converter sensor interface that consumes less than 5 µW. The company has also crafted an efficient power management device that supports dynamic voltage scaling down to 0.25 V with greater than 80% efficiency.

According to the company, Eta Compute’s technology can be implemented in any standard foundry process with no modifications to the process. This allows ease of adoption of any IP and is immune to delays and changes in process operations. Manufacturing is straightforward with the company’s IP able to port to technology nodes at any foundry. Last fall at ARM TechCon, David Baker, Ph.D. and Fellow at Eta Compute, did a presentation that included a demonstration of a small wireless sensor board that can operate perpetually on a small 1 square inch solar cell.

Attacking the problem from a different direction, another startup, Nikola Labs, is applying its special expertise in antenna design and advanced circuitry to build power harvesting into products ranging from wearables to sensors to battery-extending phone cases. Wi-Fi routers, mobile phones and other connected devices are continually emitting RF waves for communication. According to the company, radio wave power is strongest near the source—but devices transmit in all directions, saturating the surrounding area with stray waves. Nikola Labs’ high-performance, compact antennae capture this stray RF energy. Efficient electronics are then used to convert it into DC electricity that can be used to charge batteries or energize ultra-low power devices.

Nikola’s technology can derive usable energy from a wide band of frequencies, ranging from LTE (910 MHz) to Wi-Fi (2.4 GHz) and beyond (up to 6 GHz). Microwatts of power can be harvested in an active ambient RF area, and this can rise to milliwatts for harvesters placed directly on transmitting sources. Nikola Labs has demonstrated energy harvesting from a common source of RF communication waves: an iPhone. Nikola engineers designed a case for the iPhone 6 that captures waste RF transmissions, producing up to 30 mW of power to extend battery life by as much as 16% without impacting the phone’s ability to send and receive data.
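For a rough sense of how much stray RF power is actually on offer, the Friis free-space relation Pr = Pt * Gt * Gr * (lambda / (4 * pi * d))^2 gives the power available at a receiving antenna a distance d from the source. The C sketch below evaluates it for a 2.4 GHz source; the transmit power, antenna gains and distances are assumptions for illustration only, not Nikola Labs figures.

#include <stdio.h>
#include <math.h>

/* Friis free-space link budget. All values below are illustrative assumptions. */
int main(void)
{
    const double pi = 3.14159265358979;
    const double c  = 3.0e8;            /* speed of light, m/s            */
    const double f  = 2.4e9;            /* Wi-Fi band, Hz                 */
    const double pt = 0.1;              /* assumed 100 mW transmit power  */
    const double gt = 1.0, gr = 2.0;    /* assumed antenna gains (linear) */
    const double lambda = c / f;

    for (double d = 0.1; d <= 2.0; d *= 2.0) {
        double pr = pt * gt * gr * pow(lambda / (4.0 * pi * d), 2.0);
        printf("d = %.2f m -> available RF power ~%.1f uW\n", d, pr * 1e6);
    }
    return 0;
}

Even with these generous assumptions, the available power falls from the low-milliwatt range right next to the transmitter to a few microwatts beyond a meter, consistent with the microwatts-to-milliwatts range quoted above.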

Whether you address the challenge of extreme low power from the inside out or the outside in—or by advancing battery capabilities—there’s no doubt that the demand for such technologies will only grow within the coming years. With all that in mind, I look forward to covering developments on this topic in Circuit Cellar throughout 2018.

This appears in the January (330) issue of Circuit Cellar magazine


Massage Vest Uses PIC32


Controlled with an iOS App

These Cornell graduates designed a low-cost massage vest that pairs seamlessly with a custom iOS app. Using the Microchip PIC32 for its brains, the massage vest has sixteen vibration motors that the user can control to create the best massage possible.

By Harry Freeman, Megan Leszczynski and Gargi Ratnaparkhi

As technology continues to make its way into every aspect of our lives, we are increasingly bombarded with more information and given more tools to organize our busy days. For our final project in the Digital Design Using Microcontrollers class at Cornell University, we sought to build technology to help us slow down, enjoy the moment and appreciate our senses. With that in mind, we built a low-cost massage vest that pairs seamlessly with a custom iOS app. The massage vest embeds 16 vibration motors and users can control the vest to create the most comfortable and soothing massage possible. The user first provides their input through the iOS app, which allows for multiple input modes—including custom or preset. The iOS app communicates to a PIC32 microcontroller via a Bluetooth Low Energy (BLE) module and ultimately the PIC32 turns on the vibration motors to complete the user’s requests. A block diagram is shown in Figure 1. Throughout the massage, users can update their settings to adjust to their desires. The complete massage vest costs less than $100—competitive with mass produced massage vests.
Figure 1: Block diagram of the massage vest system
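
As a rough sketch of the control path described above, the C fragment below assumes the BLE module is bridged to a PIC32 UART and that the app sends hypothetical two-byte commands of the form (motor index, intensity). The actual packet format, pin mapping and PWM configuration in the project will differ; the hardware write is stubbed out here so the parsing logic stays self-contained.

#include <stdint.h>
#include <stdbool.h>

#define NUM_MOTORS 16

/* Hypothetical two-byte command from the iOS app via the BLE-to-UART bridge:
 * byte 0 = motor index (0..15), byte 1 = vibration intensity (0..255). */

static uint8_t motor_duty[NUM_MOTORS];

/* Stub: map an intensity to a PWM duty cycle on the pin driving motor idx.
 * On real hardware this would write the output-compare registers. */
static void set_motor_pwm(uint8_t idx, uint8_t intensity)
{
    motor_duty[idx] = intensity;
}

/* Feed each byte received from the UART into this parser. */
void handle_uart_byte(uint8_t byte)
{
    static bool have_index = false;
    static uint8_t index;

    if (!have_index) {
        if (byte < NUM_MOTORS) {       /* first byte selects a motor     */
            index = byte;
            have_index = true;
        }
    } else {                            /* second byte sets its intensity */
        set_motor_pwm(index, byte);
        have_index = false;
    }
}
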
Massage vests have historically been used for both pleasure and therapeutic purposes. Several known iOS-controlled massage vests include the iMusic BodyRhythm from iCess Labs and the i-Massager from E-Tek—both presented at the Consumer Electronics Show (CES) in 2013. The former syncs a massage to music for the user’s enjoyment, while the latter provides Transcutaneous Electrical Nerve Stimulation (TENS) as a certified medical device to relieve chronic pain. A group of Cornell students also won an Innovation Award in 2013 from the Cornell University School of Electrical and Computer Engineering for a massage vest called the Sonic Destressing Vest. The Sonic Destressing vest claimed to reduce the serum cortisol levels of its users, potentially reducing the risk of heart disease and depression—among many other chronic issues related to high serum cortisol levels. Those three vests motivated us to build a multi-purpose massage vest that could be extended to provide the particular features of those vests if desired—serving an existing base of users.

This article describes the details of how our massage vest worked so you can build one for yourself. First, we’ll discuss the hardware design that creates the comforting experience the user has with the vest. This will be followed by a discussion of the software that integrates the components together and provides a friendly user interface. Finally, we will conclude with testing and results. …

Read the full article in the January 330 issue of Circuit Cellar


Wearables Drive Low Power Demands


MCUs & Analog ICs Meet Needs

Wearable devices put extreme demands on the embedded electronics that make them work. Devices spanning across the consumer, fitness and medical markets all need a mix of low-power, low-cost and high-speed processing.

By Jeff Child, Editor-in-Chief

Designers of new wearable, connected devices are struggling to extend battery life for next-generation products, while at the same time increasing functionality and performance in smaller form factors. These devices include a variety of products such as smartwatches, physical activity monitors, heart rate monitors, smart headphones and more. The microcontrollers embedded in these devices must blend extreme low power with high integration. Meanwhile, analog and power solutions for wearables must likewise be highly integrated while serving up low quiescent currents.

Modern wearable electronic devices all share some common requirements. They have an extremely low budget for power consumption. They tend not to be suited to replaceable batteries and therefore must be rechargeable. They also usually require some kind of wireless connectivity. To meet those needs, chip vendors—primarily from the microcontroller and analog markets—keep advancing solutions that consume extremely low levels of power and manage that power. These technology vendors are tasked with keeping up with a wearable device market that IDC forecasts will grow at a compound annual growth rate (CAGR) of 18.4% through 2020.

MCU and BLE Combo

Following all those trends at once is Cypress Semiconductor’s PSoC 6 BLE. In September the company publicly released the PSoC 6 BLE Pioneer Kit and version 4.2 of its PSoC Creator Integrated Design Environment (IDE) software, which enable designers to begin developing with the PSoC 6. The PSoC 6 BLE has built-in Bluetooth Low Energy (BLE) wireless connectivity and integrated hardware-based security.


Photo 1
The PSoC 6 BLE Pioneer Kit features a PSoC 63 MCU with BLE connectivity. The kit enables development of modern touch and gesture-based interfaces that are robust and reliable, with a linear slider, touch buttons and proximity sensors based on Cypress’ CapSense capacitive-sensing technology.

According to Cypress, more than 2,500 embedded engineers registered for the PSoC 6 BLE early adopter program in just a few months. Early adopters are taking advantage of the flexible dual-core architecture of PSoC 6, using the ARM Cortex-M4 core as a host processor and the Cortex-M0+ core to manage peripheral functions such as capacitive sensing, BLE connectivity and sensor aggregation. Early adopter applications include wearables, personal medical devices, wireless speakers and more. Designers are also using the built-in security features in PSoC 6 to help guard against unwanted access to data. …

Read the full article in the December 329 issue of Circuit Cellar


Mini Sensor Dies Target IoT and Autos

TDK has announced new miniaturized EPCOS MEMS pressure sensor dies. The automotive versions of the C33 series boast dimensions of just 1 mm x 1 mm x 0.4 mm. They are designed for absolute pressures of 1.2 bar to 10 bar and are qualified based on AEC-Q101. The typical operating voltage is 3 V. With a supply voltage of 5 V they offer sensitivities of between 15 mV/bar and 80 mV/bar, depending on the type. The miniaturized pressure sensors are suitable for a temperature range from -40 °C to +135 °C and can even withstand 140 °C for short periods. They also offer very good long-term stability of ± 0.35% FS (full scale).

The C39 type, with its footprint of just 0.65 mm x 0.65 mm, is especially suitable for IoT and consumer applications. One noteworthy feature of the C39 is its low insertion height of just 0.24 mm, which makes the low-profile MEMS pressure sensor die ideal for applications in smartphones and wearables, for example, where space requirements are critical. The C39 is designed for an absolute pressure of 1.2 bar and, like the C33 series, offers long-term stability of ± 0.35% FS. All the pressure sensor dies operate on the piezoresistive principle and deliver, via a Wheatstone bridge, an analog signal that is proportional to the applied pressure and the supply voltage.
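Because the bridge output is ratiometric to the supply and the sensitivities are specified at 5 V, converting a measured output voltage to pressure is mostly a matter of rescaling the sensitivity for the actual supply. The C snippet below sketches that conversion; the offset, sensitivity and supply values are illustrative assumptions, and a real design would use the datasheet figures plus a calibration step.

#include <stdio.h>

/* Convert a ratiometric bridge output to pressure in bar.
 * sens_mv_per_bar_at_5v is the sensitivity specified at a 5 V supply
 * (15 mV/bar to 80 mV/bar for the C33 types); because the output scales
 * with the supply, it is rescaled for the actual supply voltage. */
static double bridge_to_bar(double v_out_mv, double v_offset_mv,
                            double sens_mv_per_bar_at_5v, double vdd_volts)
{
    double sens = sens_mv_per_bar_at_5v * (vdd_volts / 5.0);
    return (v_out_mv - v_offset_mv) / sens;
}

int main(void)
{
    /* Example with assumed values: 3 V supply, a 40 mV/bar (at 5 V) die,
     * 2 mV bridge offset, and a measured output of 50 mV. */
    double p = bridge_to_bar(50.0, 2.0, 40.0, 3.0);
    printf("pressure ~%.2f bar\n", p);   /* ~2.00 bar */
    return 0;
}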

Further information on the products at www.epcos.com/pressure_sensor_elements

TDK-Lambda | www.us.tdk-lambda.com

New Development Tool for Bluetooth 5

Nordic Semiconductor’s Bluetooth 5 developer solution for its nRF52840 SoC comprises the Nordic S140 v5.0 multi-role, concurrent protocol stack, which brings Bluetooth 5’s long-range and high-throughput modes to developers on the nRF52840 for immediate use. The Nordic nRF5 SDK offers application examples that implement this new long-range, high-throughput functionality. The existing Nordic nRF52832 SoC is also complemented with a Bluetooth 5 protocol stack.

Bluetooth 5’s high-throughput mode not only enables new use cases for wearables and other applications, but also significantly improves the user experience with Bluetooth products. Time on air is reduced, which leads to faster, more robust communication as well as reduced overall power consumption. In addition, at 2 Mbps, audio over Bluetooth Low Energy becomes a possibility.

The new Preview Development Kit (nRF52840-PDK) is a versatile, single-board development tool for Bluetooth 5, Bluetooth Low Energy, ANT, 802.15.4, and 2.4-GHz proprietary applications using the nRF52840 SoC. The kit is hardware compatible with the Arduino Uno Revision 3 standard, making it possible to use third-party-compatible shields. An NFC antenna can be connected to enable NFC tag functionality. The kit gives access to all I/O and interfaces via connectors and has four user-programmable LEDs and four user-programmable buttons.

Source: Nordic Semiconductor

New Scalable Biometric Sensor Platform for Wearables and the IoT

Valencell and STMicroelectronics recently launched a new development kit for biometric wearables. Featuring STMicro’s compact SensorTile turnkey multi-sensor module and Valencell’s Benchmark biometric sensor system, the platform offers a scalable solution for designers building biometric hearables and wearables.

The SensorTile IoT module’s specs and features:

  • 13.5 mm × 13.5 mm
  • STM32L4 microcontroller
  • Bluetooth Low Energy chipset
  • Wide spectrum of MEMS sensors (accelerometer, gyroscope, magnetometer, pressure, and temperature sensor)
  • Digital MEMS microphone

Valencell’s Benchmark sensor system’s specs and features:

  • PerformTek processor communicates with host processor using a simple UART or I2C interface protocol
  • Acquires heart rate, VO2, and calorie data
  • Standard flex connector interface

Source: Valencell

Mini Multi-Sensor Module for Wearables & IoT Designs

STMicroelectronics’ miniature SensorTile sensor board comprises a MEMS accelerometer, gyroscope, magnetometer, pressure sensor, and a MEMS microphone. With the on-board low-power STM32L4 microcontroller, the SensorTile can be used as a sensing and connectivity hub for developing products ranging from wearables to Internet of Things (IoT) devices.

The 13.5 mm × 13.5 mm SensorTile features a Bluetooth Low-Energy (BLE) transceiver including an onboard miniature single-chip balun, as well as a broad set of system interfaces that support use as a sensor-fusion hub or as a platform for firmware development. You can plug it into a host board. At power-up, it immediately starts streaming inertial, audio, and environmental data to STMicro’s BlueMS free smartphone app.

Software development is simple with an API based on the STM32Cube Hardware Abstraction Layer and middleware components, including the STM32 Open Development Environment. It’s fully compatible with the Open Software eXpansion Libraries (Open.MEMS, Open.RF, and Open.AUDIO), as well as numerous third-party embedded sensing and voice-processing projects. Example programs are available (e.g., software for position sensing, activity recognition, and low-power voice communication).

The complete kit includes a cradle board, which carries the 13.5 mm × 13.5 mm SensorTile core system in standalone or hub mode and can be used as a reference design. This compact yet fully loaded board contains a humidity and temperature sensor, a micro-SD card socket, as well as a lithium-polymer battery (LiPo) charger. The pack also contains a LiPo rechargeable battery and a plastic case that provides a convenient housing for the cradle, SensorTile, and battery combination.

SensorTile kit’s main features, specs, and benefits:

  • Cradle/expansion board with an analog audio output, a micro-USB connector, and an Arduino-like interface that can be plugged into any STM32 Nucleo board to expand developers’ options for system and software development.
  • Programming cable
  • LSM6DSM 3-D accelerometer and 3-D gyroscope
  • LSM303AGR 3-D magnetometer and 3-D accelerometer
  • LPS22HB pressure sensor/barometer
  • MP34DT04 digital MEMS microphone
  • STM32L476 microcontroller
  • BlueNRG-MS network processor with integrated 2.4-GHz radio

Source: STMicroelectronics

Ultra-Small hSensor Platform for Wearable Apps

Maxim Integrated Products’ ultra-small hSensor Platform enables you to quickly develop wearable fitness and wellness-related prototypes. With it, you have all the necessary hardware on one PCB, along with hardware functionality that’s readily accessible through the ARM mbed hardware development kit (HDK).

The hSensor Platform (MAXREFDES100# reference design) comprises an hSensor board that comes complete with firmware and drivers, a debugger board, and a graphical user interface (GUI). The platform enables you to load algorithms for different applications.

The hSensor Platform includes the following: a MAX30003 ultra-low power, single-channel integrated biopotential AFE; a MAX30101 high-sensitivity pulse oximeter and heart-rate sensor; a MAX30205 clinical-grade temperature sensor; a MAX32620 ultra-low power ARM Cortex-M4F microcontroller optimized for wearables; a MAX14720 power management integrated circuit (PMIC); inertial sensors (three-axis accelerometer, six-axis accelerometer/gyroscope); a barometric pressure sensor; flash memory; and a Bluetooth Low Energy (BLE) radio.

The MAXREFDES100# costs $150. Hardware and firmware files are free.

Source: Maxim Integrated Products

The Future of Ultra-Low Power Signal Processing

One of my favorite quotes comes from IEEE Signal Processing Magazine in 2010. The authors attempted to answer the question: What does ultra-low power consumption mean? They came to the conclusion that it is where the “power source lasts longer than the useful life of the product.”[1] It’s a great answer because it’s scalable. It applies equally to signal processing circuitry inside an embedded IoT device that can never be accessed or recharged and to signal processing inside a car, where the petrol for the engine dominates the operating lifetime, not the signal processing power. It also describes exactly what a lot of science fiction has always envisioned: no changing or recharging of batteries, which people forget to do or never have enough batteries for. Rather, we have devices that simply always work.

My research focuses on healthcare applications and creating “wearable algorithms”—that is, signal processing implementations that fit within the very small power budgets available in wearable devices. Historically, this focused on data reduction to save power. It’s well known that wireless data transmission is very power intensive. By using some power to reduce the amount of data that has to be sent, it’s possible to save lots of power in the wireless transmission stage and so to increase the overall battery lifetime.

This argument has been known for a long time. There are papers dating back to at least the 1990s based on it. It’s also readily achievable. Inevitably, it depends on the precise situation, but we showed in 2014 that the power consumption of a wireless sensor node could be brought down to the level of a node without a wireless transmitter (one that uses local flash memory) using easily available, easy-to-use, off-the-shelf devices.[2]
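A toy energy budget makes the argument concrete: compressing on the sensor node pays off whenever the processing energy spent is smaller than the radio energy saved. The per-bit costs and compression ratio in the C sketch below are assumptions chosen only for illustration, not figures from the cited work.

#include <stdio.h>

int main(void)
{
    /* Illustrative, assumed energy costs (not from the cited papers). */
    const double e_tx_per_bit_nj  = 200.0;   /* radio energy per transmitted bit */
    const double e_cpu_per_bit_nj = 5.0;     /* processing energy per input bit  */
    const double bits_raw         = 1.0e6;   /* raw data per reporting period    */
    const double reduction_ratio  = 10.0;    /* assumed data reduction factor    */

    double e_send_raw = bits_raw * e_tx_per_bit_nj;
    double e_compress = bits_raw * e_cpu_per_bit_nj
                      + (bits_raw / reduction_ratio) * e_tx_per_bit_nj;

    printf("send raw: %.0f mJ, reduce then send: %.0f mJ\n",
           e_send_raw * 1e-6, e_compress * 1e-6);
    return 0;
}

With these numbers the reduced transmission costs roughly an eighth of sending the raw stream, and the balance shifts further toward on-node processing as the radio’s per-bit cost grows relative to the processor’s.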

This essay appears in Circuit Cellar 316, November 2016. Subscribe to Circuit Cellar to read project articles, essays, interviews, and tutorials every month!

Today, there are many additional benefits that are being enabled by the emerging use of ultra-low power signal processing embedded in the wearable itself, and these new applications are driving the research challenges: increased device functionality; minimized system latency; reliable, robust operation over unreliable wireless links; reduction in the amount of data to be analyzed offline; better quality recordings (e.g., with motion artifact removal to prevent signal saturations); new closed-loop recording—stimulation devices; and real-time data redaction for privacy, ensuring personal data never leaves the wearable.

It’s these last two that are the focus for my research now. They’re really important for enabling new “bioelectronic” medical devices which apply electrical stimulation as an alternative to classical pharmacological treatments. These “bioelectronics” will be fully data-driven, analyzing physiological measurements in real time and using this to decide when to optimally trigger an intervention. Doing such analysis on a wearable sensor node, though, requires ultra-low power signal processing that has all of the feature extraction and signal classification operating within a power budget of a few hundred microwatts or less.

To achieve this, most works do not use any specific software platform. Instead, they achieve very low power consumption by using only dedicated and highly customized hardware circuits. While there are many different approaches to realizing low-power fully custom electronics, the hardware design trends are reasonably established: very low supply voltages, typically in the 0.5 to 1 V range; highly simplified circuit architectures, where a small reduction in processing accuracy leads to substantial power savings; and the use of extensive analogue processing in the very lowest power consumption circuits.[3]

Less well established are the signal processing functions for ultra-low power. Focusing on feature extractions, our 2015 review highlighted that the majority (more than half) of wearable algorithms created to date are based upon frequency information, with wavelet transforms being particularly popular.[4] This indicates a potential over-reliance on time–frequency decompositions as the best algorithmic starting points. It seems unlikely that time–frequency decompositions would provide the best, or even suitable, feature extraction across all signal types and all potential applications. There is a clear opportunity for creating wearable algorithms that are based on other feature extraction methods, such as the fractal dimension or Empirical Mode Decomposition.

Investigating this requires studying the three-way trade-off between algorithm performance (e.g., correct detections), algorithm cost (e.g., false detections), and power consumption. We know how to design signal processing algorithms, and we know how to design ultra-low power circuitry. However, combining the two opens many new degrees of freedom in the design space, and there are many opportunities and work to do in mapping feature extractions and classifiers into sub-1-V power supply dedicated hardware.


[1] G. Frantz, et al, “Ultra-low power signal processing,” IEEE Signal Processing Magazine, vol. 27, no. 2, 2010.
[2] S. A. Imtiaz, A. J. Casson, and E. Rodriguez-Villegas, “Compression in Wearable Sensor Nodes,” IEEE Transactions on Biomedical Engineering, vol. 61, no. 4, 2014.
[3] A. J. Casson, et al, “Wearable Algorithms,” in E. Sazonov and M. R. Neuman (eds.), Wearable Sensors, Elsevier, 2014.
[4] A. J. Casson, “Opportunities and Challenges for Ultra Low Power Signal Processing in Wearable Healthcare,” 23rd European Signal Processing Conference, Nice, 2015.


Alex Casson is a lecturer in the Sensing, Imaging, and Signal Processing Department at the University of Manchester. His research focuses on creating next-generation human body sensors, developing both the required hardware and software. Dr. Casson earned an undergraduate degree at the University of Oxford and a PhD from Imperial College London.