Highly Integrated, Precision ADCs and DACs Feature Small Footprint

Texas Instruments (TI) has introduced four tiny precision data converters. The new data converters enable designers to add more intelligence and functionality, while shrinking system board space. The DAC80508 and DAC70508 are eight-channel precision digital-to-analog converters (DACs) that provide true 16- and 14-bit resolution, respectively. The ADS122C04 and ADS122U04 are 24-bit precision analog-to-digital converters (ADCs) that feature a two-wire, I2C-compatible interface and a two-wire, UART-compatible interface, respectively. The devices are optimized for a variety of small-size, high-performance or cost-sensitive industrial, communications and personal electronics applications. Examples include optical modules, field transmitters, battery-powered systems, building automation and wearables.

Both DACs include a 2.5-V, 5-ppm/°C internal reference, eliminating the need for an external precision reference. Available in a 2.4-mm-by-2.4-mm die-size ball-grid array (DSBGA) package or wafer chip-scale package (WCSP) and a 3-mm-by-3-mm quad flat no-lead (QFN)-16 package, these devices are up to 36 percent smaller than the competition. The new DACs eliminate the typical trade-off between high performance and small size, enabling engineers to achieve the best system accuracy, while reducing board size or increasing channel density.

In addition to their compact size, the DAC80508 and DAC70508 provide true 1 least-significant-bit (LSB) integral nonlinearity for the highest level of accuracy at 16- and 14-bit resolution, up to 66 percent better linearity than the competition. They are fully specified over the -40°C to +125°C extended temperature range and provide features such as a cyclic redundancy check (CRC) to increase system reliability.
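
To give a feel for the underlying math, the sketch below computes the 16-bit code needed for a target output voltage, assuming an ideal straight-binary transfer function and the 2.5-V internal reference; the gain factor and the helper's name are illustrative assumptions rather than details from TI's register map.

```c
#include <stdint.h>
#include <math.h>

/* Ideal straight-binary DAC transfer: Vout = code / 2^16 * Vref * gain.
 * Solve for code and clamp to the 16-bit range.  The gain factor and the
 * rounding policy here are illustrative assumptions, not TI register details. */
static uint16_t dac_code_for_voltage(double vout, double vref, double gain)
{
    double full_scale = vref * gain;               /* e.g. 2.5 V x 1 = 2.5 V */
    double code = round(vout / full_scale * 65536.0);

    if (code < 0.0)     code = 0.0;
    if (code > 65535.0) code = 65535.0;            /* clamp to 16-bit range  */
    return (uint16_t)code;
}

/* Example: 1.25 V with the 2.5-V internal reference and unity gain maps to
 * mid-scale, code 0x8000. */
```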

The tiny, 24-bit precision ADCs are available in 3-mm-by-3-mm very thin QFN (WQFN)-16 and 5-mm-by-4.4-mm thin-shrink small-outline package (TSSOP)-16 options. The two-wire interface requires fewer digital isolation channels than a standard serial peripheral interface (SPI), reducing the overall cost of an isolated system. These precision ADCs eliminate the need for external circuitry by integrating a flexible input multiplexer, a low-noise programmable gain amplifier, two programmable excitation current sources, an oscillator and a precision temperature sensor.

Both ADC devices feature a low-drift 2.048-V, 5-ppm/°C internal reference. Their internal 2 percent accurate oscillators help designers improve power-line cycle noise rejection, enabling higher accuracy in noisy environments. With gains from 1 to 128 and noise as low as 100 nV, designers can measure both small-signal sensors and wide input ranges with one ADC. These device families, which also include pin-to-pin-compatible 16-bit options, give designers the flexibility to meet various system requirements by scaling performance up or down.
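
As a companion to the DAC example above, the snippet below converts a raw 24-bit two's-complement result back to input volts, assuming a bipolar full-scale range of ±Vref/gain, which is typical for precision delta-sigma ADCs of this class; the function name and scaling details are illustrative, not taken from the data sheet.

```c
#include <stdint.h>

/* Convert a raw 24-bit two's-complement ADC result to input volts.  Vref is
 * 2.048 V when the internal reference is used; gain is the PGA setting
 * (1 to 128).  The +/-(Vref/gain) full-scale assumption is a simplification. */
static double adc_code_to_volts(uint32_t raw24, double vref, double gain)
{
    int32_t code = (int32_t)(raw24 & 0xFFFFFF);
    if (code & 0x800000)                 /* sign-extend 24-bit two's complement */
        code -= 0x1000000;
    return (double)code / 8388608.0 * (vref / gain);   /* 2^23 = 8388608 */
}

/* Example: with gain = 128 and the 2.048-V internal reference, a raw reading
 * of 0x400000 (half of positive full scale) corresponds to 8 mV at the input. */
```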

Engineers can evaluate the new data converters with the DAC80508 evaluation module, the ADS122C04 evaluation module and the ADS122U04 evaluation module, all available today for $99.00 from the TI store and authorized distributors.

TI’s new tiny DACs and ADCs are available now with pricing ranging from $3.95 to $9.99 (1,000s).

Texas Instruments | www.ti.com

IoT Module Family Features Ultra-Compact Form Factor

Telit has announced the xE310 family of miniature IoT modules. With initial models planned in LTE-M, NB-IoT and European 2G, the new form factor will enable Telit to meet growing demand for ultra-small, high-performance modules for wearable medical devices, fitness trackers, industrial sensors, smart metering and other mass-production, massive-deployment applications. Telit will start shipping xE310 modules in Q4 this year.

Telit claims the xE310 family is one of the smallest LGA form factors on the market, with a flexible perimeter footprint supporting various sizes from compact to smaller than 200 mm². The xE310's 94 pads include spares that give Telit the flexibility to quickly deliver support for additional features as technologies, applications and markets evolve. Spares can be used for modules supporting Bluetooth, Wi-Fi or enhanced location technologies—in addition to cellular—while maintaining compatibility with cellular-only models. They can also be used for additional connections that may be required for new 5G-enabled features.

The new form factor also gives OEMs greater flexibility, efficiency and yield during design and manufacturing. The xE310 family provides easy PCB routing while minimizing manufacturing process issues such as planarity and bending. The unique circular pad facilitates correct package orientation for automated assembly.

To learn more about the new xE310 family, visit Telit at stand 431 at the IoT Solutions World Congress in Barcelona, Spain, October 16-18.

For a look at how this new design is enabling smart metering applications, register for the Telit webinar on November 15: “From 2G to 5G: 5 things you need to know for smarter utilities”: https://www.smart-energy.com/industry-sectors/data_analytics/webinar-15-november-5-things-you-need-to-know-for-smarter-utilities/.

Telit | www.telit.com

Low-Power MCUs Extend Battery Life for Wearables

Maxim Integrated Products has introduced the ultra-low-power MAX32660 and MAX32652 microcontrollers. These MCUs are based on the ARM Cortex-M4 processor with FPU and give designers the means to develop advanced applications under restrictive power constraints. Maxim's family of DARWIN MCUs combines its wearable-grade power technology with the biggest embedded memories in their class and advanced embedded security.

Memory, size, power consumption and processing power are critical features for engineers designing more complex algorithms for smarter IoT applications. According to Maxim, existing solutions offer two extremes: they either have decent power consumption but limited processing and memory capabilities, or they have higher power consumption with more powerful processors and more memory.

The MAX32660 (shown) offers designers enough memory to run some advanced algorithms and manage sensors (256 KB flash and 96 KB SRAM), excellent power performance (down to 50 µW/MHz), small size (1.6 mm x 1.6 mm in a WLP package) and a cost-effective price point. Engineers can now build more intelligent sensors and systems that are smaller and lower in cost, while also providing longer battery life.
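
To put a figure like 50 µW/MHz into perspective, here is a back-of-the-envelope battery-life estimate for a duty-cycled sensor node. The clock rate, duty cycle, sleep power, battery capacity and supply voltage below are hypothetical values chosen for illustration, not Maxim specifications.

```c
#include <stdio.h>

/* Rough battery-life model: the MCU is active for a small fraction of the
 * time and sleeps otherwise.  Only the 50 uW/MHz figure comes from the text;
 * everything else is an illustrative assumption. */
int main(void)
{
    const double uw_per_mhz  = 50.0;          /* active efficiency from the text */
    const double clock_mhz   = 24.0;          /* assumed operating frequency     */
    const double duty_cycle  = 0.01;          /* assumed 1% active, 99% asleep   */
    const double sleep_uw    = 2.0;           /* assumed sleep-mode power, uW    */
    const double battery_mwh = 3.0 * 225.0;   /* assumed CR2032: ~225 mAh at 3 V */

    double active_uw  = uw_per_mhz * clock_mhz;                 /* 1200 uW */
    double average_uw = active_uw * duty_cycle
                      + sleep_uw * (1.0 - duty_cycle);          /* ~14 uW  */
    double hours      = battery_mwh * 1000.0 / average_uw;

    printf("average power: %.1f uW, estimated life: %.0f days\n",
           average_uw, hours / 24.0);
    return 0;
}
```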

As IoT devices become more intelligent, they start requiring more memory and additional embedded processors, each of which can be expensive and power hungry. The MAX32652 offers an alternative for designers who can benefit from the low power consumption of an embedded microcontroller combined with the capabilities of a higher-powered applications processor.

With 3 MB of flash and 1 MB of SRAM integrated on-chip and operation up to 120 MHz, the MAX32652 offers a highly integrated solution for IoT devices that strive to do more processing and provide more intelligence. Integrated high-speed peripherals such as high-speed USB 2.0, a secure digital (SD) card controller, a thin-film transistor (TFT) display controller and a complete security engine position the MAX32652 as the low-power brain for advanced IoT devices. With the added capability to run from external memories over HyperBus or XcellaBus, the MAX32652 can be designed to do even more tomorrow, providing designers a future-proof memory architecture that anticipates the increasing demands of smart devices.

The MAX32660 and MAX32652 are both available at Maxim’s website and select authorized distributors. MAX32660EVKIT# and MAX32652EVKIT# evaluation kits are also both available at Maxim’s website.

Maxim Integrated | www.maximintegrated.com

The Dick Tracy Wristwatch TV

Input Voltage

–Jeff Child, Editor-in-Chief


At my first technology editor job back in 1990, my boss at the time was obsessed with the concept of the Dick Tracy wristwatch. Dick Tracy was a popular comic strip that debuted in 1931 and ran for decades. Now, let me be clear, even I'm not old enough to be from the era when Dick Tracy was part of popular culture. But my boss was. For those of you who don't know, the 2-Way Wrist Radio was one of the comic strip's most iconic items. It was worn by Tracy and members of the police force, and in 1964 the 2-Way Wrist Radio was upgraded to a 2-Way Wrist TV. When chip companies came to visit our editorial offices—this was back when press tours were still a thing—my boss would quite often ask the hypothetical question: "When are we going to get the Dick Tracy wristwatch?"

Confident that Moore's Law would go on forever, semiconductor companies back then were always hungry to get their share of the mobile electronic device market—although the "device" of the day kept changing. My boss's Dick Tracy wristwatch question was a clever way to spur discussion about chip integration, extreme low power, wireless communication and even full-motion video. Full-motion video on a mobile device in particular was a technology that many were skeptical could ever happen. In that early-90s period, DRAM was the main driver of semiconductor process technology, and, in turn, the desktop PC was by far the dominant market for DRAMs. As a result, there was a tendency to view all future computing through the lens of the PC. It would be more than a decade before flash memory surpassed DRAM as the main driver of the chip business, and that happened because the market for mobile devices began to eclipse that for PCs.

As most of you know, Circuit Cellar has BYTE magazine as part of its origin story. Steve Ciarcia had a popular column called Circuit Cellar in BYTE magazine. When Steve founded this magazine three decades ago, he gave it the Circuit Cellar name. The April 1981 issue of BYTE famously had a picture of, basically, a wristwatch with a CRT screen and keyboard, with a mini-floppy disk being inserted into its side. That's a vivid example of how notoriously bad we humans are at predicting what future technologies will look like. We have an inherent bias toward imposing what we have now on our view of the future.

Fast forward to today and obviously we have the Dick Tracy wristwatch and so much more—the Apple Watch being the most vivid example. Today's wearable devices span the consumer, fitness and medical markets, and all need a mix of low-power, low-cost and high-speed processing. But even though technology has come a long way, the design challenges are still tricky. Wearable electronic devices of today all share some common aspects. They have an extremely low budget for power consumption. They tend not to be suited for replaceable batteries and therefore must be rechargeable. They also usually require some kind of wireless connectivity.

Today's wearables include a variety of products such as smartwatches, physical activity monitors, heart rate monitors, smart headphones and more. Microcontrollers for these devices must combine extremely low power with high integration. At the same time, power solutions serving this market require mastery of low-quiescent-current design techniques and high integration. To meet those needs, chip vendors—primarily from the microcontroller and analog markets—keep advancing solutions that consume extremely low levels of power and manage that power.

One amusing aspect of the Dick Tracy wristwatch was that it was referred to as a 2-Way Radio (and later a 2-Way TV). With Internet connectivity, today's smartwatches are basically connected to an infinite number of network nodes. I can't claim to be a better predictor of the future than the editors of 1981's BYTE. But now I need to come up with a new question to ask chip vendors, and I don't know what the question should be. Perhaps: "When are we going to get the Star Wars holographic 3D image messaging system?" And in wristwatch form, please.

This appears in the May (334) issue of Circuit Cellar magazine

Not a Circuit Cellar subscriber?  Don’t be left out! Sign up today:

Wireless MCUs are Bluetooth Mesh Certified

Cypress Semiconductor has announced that its single-chip solutions for the Internet of Things (IoT) are bringing Bluetooth mesh connectivity, certified by the Bluetooth Special Interest Group (SIG), to a consumer product. LEDVANCE announced the market's first Bluetooth mesh qualified LED lighting products, which leverage Cypress' Bluetooth mesh technology. Three Cypress wireless combo chips and the latest version of its Wireless Internet Connectivity for Embedded Devices (WICED) software development kit (SDK) support Bluetooth connectivity with mesh networking capability. Cypress' solutions enable a low-cost, low-power mesh network of devices that can communicate with each other–and with smartphones, tablets and voice-controlled home assistants–via simple, secure and ubiquitous Bluetooth connectivity.

Previously, users needed to be in the immediate vicinity of a Bluetooth device to control it without an added hub. With Bluetooth mesh networking technology, the devices within the network can communicate with each other to easily provide coverage throughout even the largest homes, allowing users to conveniently control all of the devices via apps on their smartphones and tablets.

Market research firm ABI Research forecasts there will be more than 57 million Bluetooth smart lightbulbs by 2021. Cypress' CYW20719, CYW20706 and CYW20735 Bluetooth and Bluetooth Low Energy (BLE) combo solutions and CYW43569 and CYW43570 Wi-Fi and Bluetooth combo solutions offer fully compliant Bluetooth mesh. Cypress also offers Bluetooth mesh certified modules and an evaluation kit. The solutions share a common, widely deployed Bluetooth stack and are supported in version 6.1 of Cypress' all-inclusive WICED SDK, which streamlines the integration of wireless technologies for developers of smart home lighting and appliances, as well as healthcare applications.

Cypress Semiconductor | www.cypress.com

Analyst 2017 Review: Mobile Devices Dominated GPU Market

Jon Peddie Research (JPR), a market research and consulting firm focused on graphics and multimedia, offers its annual review of GPU developments for 2017. In spite of the slow decline of the overall PC market, PC-based GPU sales, which include workstations, have been increasing, according to the review. In the mobile market, integrated GPUs have risen at the same rate as mobile devices and the SoCs in them. The same is true for the console market, where integrated graphics are in every console and have also increased in sales over the year.

Nearly 28% of the world’s population bought a GPU device in 2017, and that’s in addition to the systems already in use. And yet, probably less than half of them even know what the term GPU stands for, or what it does. To them the technology is invisible, and that means it’s working—they don’t have to know about it.

The market for, and use of, GPUs stretches from supercomputers and medical devices to gaming machines, mobile devices, automobiles and wearables. Just about everyone in the industrialized world has at least a half dozen products with a GPU inside, and technophiles can easily count a dozen or more. The manufacturing of GPUs approaches science fiction, with features that will move below 10 nm next year and a glide path to 3 nm, and some think even 1 nm—Moore's Law is far from dead, but it is getting trickier to coax out of the genie's bottle as we drive into subatomic realms that can only be modeled and not seen.

Over the past 12 months, JPR has seen a few new GPUs, and some clever adaptations of existing ones, that show the path for future developments and subsequent applications. 2017 was an amazing year for GPU development, driven by games, eSports, AI, cryptocurrency mining and simulations. Autonomous vehicles started to become a reality, as did augmented reality. The over-hyped consumer PC VR market explosion didn't happen, and it had little to no impact on GPU developments or sales. Most of the participants in VR already had a high-end system, and the HMD was just another display to them.

Mobile GPUs, exemplified by products from Qualcomm, ARM and Imagination Technologies, are key to amazing devices with long battery life and screens at or approaching 4K, and in 2017 people started talking about and showing HDR.

JPR's review says that many, if not all, of the developments we will see in 2018 were started as early as 2015, and that three- to four-year lead time will continue. Lead times could get longer as we learn how to deal with chips built from billions of transistors manufactured at feature sizes smaller than X-ray wavelengths. Ironically, buying cycles are also accelerating, ensuring strong competition as players try to leapfrog each other in innovation. According to JPR, we'll see considerable innovation in 2018, with AI being the leading application that will permeate every sector of our lives.

The JPR GPU Developments in 2017 Report is free to all subscribers of JPR. Individual copies of the report can be purchased for $100.

Jon Peddie Research | www.jonpeddie.com

Quantum Leaps

Input Voltage

–Jeff Child, Editor-in-Chief


Throughout my career, I've always been impressed by Intel's involvement in a wide spectrum of computing and electronics technologies. These range from the mundane and practical on one hand to the forward-looking and disruptive on the other. A lot of these weren't technologies that Intel ever intended to take direct advantage of over the long term. I think a lot about how Intel facilitated the creation of and early advances in USB. Intel even sold USB chips in the first couple of years of USB's emergence, but stepped aside from that with the knowledge that its main focus was selling processors.

USB made computers and a myriad of consumer electronic devices better and easier to use, and that, Intel knew, advanced the whole industry in which their microprocessors thrived. Today, look around your home, your office and even your car and count the number of USB connectors there are. It’s pretty obvious that USB’s impact has been truly universal.

Aside from mainstream, practical solutions like USB, Intel also continues to participate in the most forward-looking compute technologies. Exemplifying that, in January at the Consumer Electronics Show (CES) in Las Vegas, Intel announced two major milestones in its efforts to develop future computing technologies. In his keynote address, Intel CEO Brian Krzanich announced the successful design, fabrication and delivery of a 49-qubit superconducting quantum test chip. The keynote also focused on the promise of neuromorphic computing.

In his speech, Krzanich explained that, just two months after delivery of a 17-qubit superconducting test chip, Intel that day unveiled “Tangle Lake,” a 49-qubit superconducting quantum test chip. The chip is named after a chain of lakes in Alaska, a nod to the extreme cold temperatures and the entangled state that quantum bits (or “qubits”) require to function.

According to Intel, achieving a 49-qubit test chip is an important milestone because it will allow researchers to assess and improve error correction techniques and simulate computational problems.

Krzanich predicts that quantum computing will solve problems that today might take our best supercomputers months or years to resolve, such as drug development, financial modeling and climate forecasting. While quantum computing has the potential to solve problems conventional computers can’t handle, the field is still nascent.

Mike Mayberry, VP and managing director of Intel Labs, weighed in on the progress of the efforts. "We expect it will be 5 to 7 years before the industry gets to tackling engineering-scale problems, and it will likely require 1 million or more qubits to achieve commercial relevance," said Mayberry.

Krzanich said the need to scale to greater numbers of working qubits is why Intel, in addition to investing in superconducting qubits, is also researching another type called spin qubits in silicon. Spin qubits could have a scaling advantage because they are much smaller than superconducting qubits. A spin qubit resembles a single-electron transistor, which is similar in many ways to a conventional transistor and potentially able to be manufactured with comparable processes. In fact, Intel has already invented a spin qubit fabrication flow on its 300-mm process technology.

At CES, Krzanich also showcased Intel’s research into neuromorphic computing—a new computing paradigm inspired by how the brain works that could unlock exponential gains in performance and power efficiency for the future of artificial intelligence. Intel Labs has developed a neuromorphic research chip, code-named “Loihi,” which includes circuits that mimic the brain’s basic operation.

While the concepts seem futuristic and abstract, Intel is thinking of the technology in terms of real-world uses. Intel says neuromorphic chips could ultimately be used anywhere real-world data needs to be processed in evolving, real-time environments. For example, these chips could enable smarter security cameras and smart-city infrastructure designed for real-time communication with autonomous vehicles. In the first half of this year, Intel plans to share the Loihi test chip with leading university and research institutions while applying it to more complex data sets and problems.

For me to compare quantum and neuromorphic computing to USB is about as apples-and-oranges as you can get. But, who knows? When the day comes that quantum or neuromorphic chips are in our everyday devices, maybe my comparison won't seem far-fetched at all.

This appears in the February (331) issue of Circuit Cellar magazine

Not a Circuit Cellar subscriber?  Don’t be left out! Sign up today:

The Quest for Extreme Low Power

Input Voltage

–Jeff Child, Editor-in-Chief


Over the next couple of years, power will clearly rank as a major design challenge for the myriad of edge devices deployed in Internet of Things (IoT) implementations. Such IoT devices are wireless units that need to be always on and connected. At the same time, they need low power consumption, while still providing the processing power needed to enable machine intelligence. The need for extreme low power in these devices goes beyond the need for long battery life. Instead, the hope is for perpetually powered solutions providing uninterrupted operation—and, if possible, without any need for battery power. For their part, microcontroller vendors have been doing a lot in recent years within their own labs to craft extreme low power versions of their MCUs. But the appetite for low power at the IoT edge is practically endless.

Offering a fresh take on the topic, I recently spoke with Paul Washkewicz, vice president and co-founder of Eta Compute, about the startup's extreme low power technology for microcontrollers. The company claims to offer the lowest power MCU intellectual property (IP) available today, with operating voltages as low as 0.3 V. Eta Compute has developed and implemented a unique low-power design methodology that delivers up to a 10x improvement in power efficiency. Its IP and custom designs operate over severe variations in temperature, process, voltage and power supply. Eta Compute's approach is a self-timed technology supporting dynamic voltage scaling (DVS) that is insensitive to process variations, inaccurate device models and path delay variations.
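
The kind of savings claimed for near-threshold operation follows from the dynamic-power relation P ≈ C × V² × f. The short sketch below compares nominal 1.2-V operation with 0.3-V operation at a reduced clock; the effective capacitance and the frequencies are placeholders chosen for illustration, not Eta Compute data.

```c
#include <stdio.h>

/* Dynamic switching power: P = C * V^2 * f.  Voltage scaling alone gives a
 * (1.2/0.3)^2 = 16x reduction; in practice the clock usually drops too, which
 * is why the example also scales f.  C and f are arbitrary placeholders. */
int main(void)
{
    const double c_eff = 20e-12;                          /* effective switched capacitance, F */
    const double p_nominal = c_eff * 1.2 * 1.2 * 50e6;    /* 1.2 V at an assumed 50 MHz */
    const double p_scaled  = c_eff * 0.3 * 0.3 * 10e6;    /* 0.3 V at an assumed 10 MHz */

    printf("nominal: %.0f uW, scaled: %.0f uW, ratio: %.0fx\n",
           p_nominal * 1e6, p_scaled * 1e6, p_nominal / p_scaled);
    return 0;
}
```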

The technology has been implemented in a variety of chip functions. Among these are ARM Cortex-M0+ and Cortex-M3 cores scaling from 0.3-V to 1.2-V operation, along with low-voltage logic support functions such as a real-time clock (RTC), Advanced Encryption Standard (AES) encryption and digital signal processing. The technology has also been implemented in an analog-to-digital converter sensor interface that consumes less than 5 µW. The company has also crafted an efficient power management device that supports dynamic voltage scaling down to 0.25 V with greater than 80% efficiency.

According to the company, Eta Compute’s technology can be implemented in any standard foundry process with no modifications to the process. This allows ease of adoption of any IP and is immune to delays and changes in process operations. Manufacturing is straightforward with the company’s IP able to port to technology nodes at any foundry. Last fall at ARM TechCon, David Baker, Ph.D. and Fellow at Eta Compute, did a presentation that included a demonstration of a small wireless sensor board that can operate perpetually on a small 1 square inch solar cell.

Attacking the problem from a different direction, another startup, Nikola Labs, is applying its special expertise in antenna design and advanced circuitry to build power harvesting into products ranging from wearables to sensors to battery-extending phone cases. Wi-Fi routers, mobile phones and other connected devices are continually emitting RF waves for communication. According to the company, radio wave power is strongest near the source—but devices transmit in all directions, saturating the surrounding area with stray waves. Nikola Labs’ high-performance, compact antennae capture this stray RF energy. Efficient electronics are then used to convert it into DC electricity that can be used to charge batteries or energize ultra-low power devices.

Nikola’s technology can derive usable energy from a wide band of frequencies, ranging from LTE (910 MHz) to Wi-Fi (2.4 GHz) and beyond (up to 6 GHz). Microwatts of power can be harvested in an active ambient RF area and this can rise to milliwatts for harvesters placed directly on transmitting sources. Nikola Labs has demonstrated energy harvesting from a common source of RF communication waves: an iPhone. Nikola engineers designed a case for iPhone 6 that captures waste RF transmissions, producing up to 30 mW of power to extend battery life by as much as 16% without impacting the phone’s ability to send and receive data.
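
The strong distance dependence described above is captured by the Friis free-space relation P_r = P_t·G_t·G_r·(λ/4πd)². The sketch below estimates the RF power arriving at a harvesting antenna one meter and five meters from a 100-mW, 2.4-GHz source with unity-gain antennas, before any rectifier losses; the transmit power, gains and distances are illustrative assumptions, not Nikola Labs figures.

```c
#include <stdio.h>
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Friis free-space estimate of RF power arriving at a harvesting antenna.
 * Transmit power, antenna gains and distance are illustrative assumptions;
 * rectifier and conversion losses are ignored. */
static double friis_received_watts(double p_tx_w, double g_tx, double g_rx,
                                   double freq_hz, double dist_m)
{
    double lambda = 3.0e8 / freq_hz;                 /* wavelength, m */
    double path   = lambda / (4.0 * M_PI * dist_m);
    return p_tx_w * g_tx * g_rx * path * path;
}

int main(void)
{
    /* ~100 mW (20 dBm) Wi-Fi source at 2.4 GHz, unity-gain antennas. */
    double p1 = friis_received_watts(0.1, 1.0, 1.0, 2.4e9, 1.0);   /* at 1 m */
    double p2 = friis_received_watts(0.1, 1.0, 1.0, 2.4e9, 5.0);   /* at 5 m */
    printf("at 1 m: %.1f uW, at 5 m: %.2f uW\n", p1 * 1e6, p2 * 1e6);
    return 0;
}
```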

Whether you address the challenge of extreme low power from the inside out or the outside in—or by advancing battery capabilities—there’s no doubt that the demand for such technologies will only grow within the coming years. With all that in mind, I look forward to covering developments on this topic in Circuit Cellar throughout 2018.

This appears in the January (330) issue of Circuit Cellar magazine

Not a Circuit Cellar subscriber?  Don’t be left out! Sign up today:

Massage Vest Uses PIC32


Controlled with an iOS App

These Cornell graduates designed a low-cost massage vest that pairs seamlessly with a custom iOS app. Using the Microchip PIC32 for its brains, the massage vest has sixteen vibration motors that the user can control to create the best massage possible.

By Harry Freeman, Megan Leszczynski and Gargi Ratnaparkhi

As technology continues to make its way into every aspect of our lives, we are increasingly bombarded with more information and given more tools to organize our busy days. For our final project in the Digital Design Using Microcontrollers class at Cornell University, we sought to build technology to help us slow down, enjoy the moment and appreciate our senses. With that in mind, we built a low-cost massage vest that pairs seamlessly with a custom iOS app. The massage vest embeds 16 vibration motors, and users can control the vest to create the most comfortable and soothing massage possible. The user first provides input through the iOS app, which allows for multiple input modes—including custom or preset. The iOS app communicates with a PIC32 microcontroller via a Bluetooth Low Energy (BLE) module, and ultimately the PIC32 turns on the vibration motors to carry out the user's requests. A block diagram is shown in Figure 1. Throughout the massage, users can update their settings to adjust the experience to their liking. The complete massage vest costs less than $100—competitive with mass-produced massage vests.
Figure 1: System block diagram
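
To make the data path from the app to the motors concrete, here is a minimal firmware sketch of the kind of control loop the article describes: a two-byte command arrives from the BLE module over UART and is mapped to a PWM duty cycle for one of the 16 motors. The two-byte protocol and the uart_read_byte()/pwm_set_duty() helpers are hypothetical stand-ins, not the authors' actual code or Microchip's library API.

```c
#include <stdint.h>

#define NUM_MOTORS 16

/* Hypothetical two-byte command: [motor index 0-15][intensity 0-255].
 * uart_read_byte() and pwm_set_duty() stand in for whatever UART and
 * output-compare drivers the real PIC32 firmware uses. */
extern uint8_t uart_read_byte(void);                    /* blocking read from BLE module */
extern void    pwm_set_duty(uint8_t channel, uint8_t duty);

static uint8_t motor_duty[NUM_MOTORS];

void massage_control_loop(void)
{
    for (;;) {
        uint8_t motor     = uart_read_byte();
        uint8_t intensity = uart_read_byte();

        if (motor < NUM_MOTORS) {
            motor_duty[motor] = intensity;
            pwm_set_duty(motor, intensity);   /* vibration strength tracks duty cycle */
        }
    }
}
```
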
Massage vests have historically been used for both pleasure and therapeutic purposes. Several known iOS-controlled massage vests include the iMusic BodyRhythm from iCess Labs and the i-Massager from E-Tek—both presented at the Consumer Electronics Show (CES) in 2013. The former syncs a massage to music for the user’s enjoyment, while the latter provides Transcutaneous Electrical Nerve Stimulation (TENS) as a certified medical device to relieve chronic pain. A group of Cornell students also won an Innovation Award in 2013 from the Cornell University School of Electrical and Computer Engineering for a massage vest called the Sonic Destressing Vest. The Sonic Destressing vest claimed to reduce the serum cortisol levels of its users, potentially reducing the risk of heart disease and depression—among many other chronic issues related to high serum cortisol levels. Those three vests motivated us to build a multi-purpose massage vest that could be extended to provide the particular features of those vests if desired—serving an existing base of users.

This article describes the details of how our massage vest worked so you can build one for yourself. First, we’ll discuss the hardware design that creates the comforting experience the user has with the vest. This will be followed by a discussion of the software that integrates the components together and provides a friendly user interface. Finally, we will conclude with testing and results. …

Read the full article in the January 330 issue of Circuit Cellar

Don’t miss out on upcoming issues of Circuit Cellar. Subscribe today!
Note: We’ve made the October 2017 issue of Circuit Cellar available as a free sample issue. In it, you’ll find a rich variety of the kinds of articles and information that exemplify a typical issue of the current magazine.

Wearables Drive Low Power Demands


MCUs & Analog ICs Meet Needs

Wearable devices put extreme demands on the embedded electronics that make them work. Devices spanning across the consumer, fitness and medical markets all need a mix of low-power, low-cost and high-speed processing.

By Jeff Child, Editor-in-Chief

Designers of new wearable, connected devices are struggling to extend battery life for next-generation products, while at the same time increasing functionality and performance in smaller form factors. These devices include a variety of products such as smartwatches, physical activity monitors, heart rate monitors, smart headphones and more. The microcontrollers embedded in these devices must blend extreme low power with high integration. Meanwhile, analog and power solutions for wearables must likewise be highly integrated while serving up low quiescent currents.

Modern wearable electronic devices all share some common requirements. They have an extremely low budget for power consumption. They tend not to be suited for replaceable batteries and therefore must be rechargeable. They also usually require some kind of wireless connectivity. To meet those needs, chip vendors—primarily from the microcontroller and analog markets—keep advancing solutions that consume extremely low levels of power and manage that power. These technology vendors are tasked with keeping pace with a wearable device market that IDC forecasts will grow at a compound annual growth rate (CAGR) of 18.4% through 2020.

MCU and BLE Combo

Following all those trends at once is Cypress Semiconductor's PSoC 6 BLE. In September the company made its public release of the PSoC 6 BLE Pioneer Kit and PSoC Creator Integrated Design Environment (IDE) software version 4.2, which enable designers to begin developing with the PSoC 6. The PSoC 6 BLE has built-in Bluetooth Low Energy (BLE) wireless connectivity and integrated hardware-based security.

Photo 1
The PSoC 6 BLE Pioneer Kit features a PSoC 63 MCU with BLE connectivity. The kit enables development of modern touch- and gesture-based interfaces that are robust and reliable, with a linear slider, touch buttons and proximity sensors based on Cypress' CapSense capacitive-sensing technology.

According to Cypress, more than 2,500 embedded engineers registered for the PSoC 6 BLE early adopter program in just a few months. Early adopters are taking advantage of the flexible dual-core architecture of the PSoC 6, using the ARM Cortex-M4 core as a host processor and the Cortex-M0+ core to manage peripheral functions such as capacitive sensing, BLE connectivity and sensor aggregation. Early adopter applications include wearables, personal medical devices, wireless speakers and more. Designers are also using the built-in security features in PSoC 6 to help guard against unwanted access to data. …
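
As a conceptual sketch of the dual-core split described above, the fragment below shows one way a host core and a peripheral-manager core might exchange sensor data through a shared-memory mailbox. It is a generic illustration, not Cypress' PSoC 6 IPC API; the structure, names and polling scheme are assumptions.

```c
#include <stdint.h>
#include <stdbool.h>

/* Generic dual-core mailbox: the Cortex-M0+ side aggregates sensor readings
 * and flags the Cortex-M4 host when a fresh sample is ready.  A real design
 * would use the device's hardware IPC and interrupts rather than polling. */
typedef struct {
    volatile uint32_t touch_position;   /* e.g. capacitive slider position */
    volatile int16_t  sensor_value;     /* aggregated sensor reading       */
    volatile bool     ready;            /* set by M0+, cleared by M4       */
} mailbox_t;

static mailbox_t shared_mailbox;        /* placed in memory both cores can see */

/* Cortex-M0+ side: publish a new sample. */
void m0p_publish(uint32_t touch, int16_t sensor)
{
    shared_mailbox.touch_position = touch;
    shared_mailbox.sensor_value   = sensor;
    shared_mailbox.ready          = true;
}

/* Cortex-M4 side: consume the sample if one is pending. */
bool m4_poll(uint32_t *touch, int16_t *sensor)
{
    if (!shared_mailbox.ready)
        return false;
    *touch  = shared_mailbox.touch_position;
    *sensor = shared_mailbox.sensor_value;
    shared_mailbox.ready = false;
    return true;
}
```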

Read the full article in the December 329 issue of Circuit Cellar

Don’t miss out on upcoming issues of Circuit Cellar. Subscribe today!
Note: We’ve made the October 2017 issue of Circuit Cellar available as a free sample issue. In it, you’ll find a rich variety of the kinds of articles and information that exemplify a typical issue of the current magazine.

Mini Sensor Dies Target IoT and Autos

TDK has announced new miniaturized EPCOS MEMS pressure sensor dies. The automotive versions of the C33 series boast dimensions of just 1 mm x 1 mm x 0.4 mm. They are designed for absolute pressures of 1.2 bar to 10 bar and are qualified based on AEC-Q101. The typical operating voltage is 3 V. With a supply voltage of 5 V they offer sensitivities of between 15 mV/bar and 80 mV/bar, depending on the type. The miniaturized pressure sensors are suitable for a temperature range from -40 °C to +135 °C and can even withstand 140 °C for short periods. They also offer very good long-term stability of ±0.35% FS (full scale).

The C39 type, with its footprint of just 0.65 mm x 0.65 mm, is especially suitable for IoT and consumer applications. One noteworthy feature of the C39 is its low insertion height of just 0.24 mm, which makes the low-profile MEMS pressure sensor die ideal for applications in smartphones and wearables, for example, where space requirements are critical. The C39 is designed for an absolute pressure of 1.2 bar and, like the C33 series, offers long-term stability of ±0.35% FS. All the pressure sensor dies operate on the piezoresistive principle and deliver, via a Wheatstone bridge, an analog signal that is proportional to the applied pressure and the supply voltage.
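
Because the bridge output is ratiometric (proportional to both pressure and supply voltage), converting it back to pressure just requires rescaling the data-sheet sensitivity to the actual supply. The helper below assumes the sensitivity is specified at a 5-V supply, as in the text, and that the zero-pressure offset has already been calibrated out; the specific numbers in the example are illustrative.

```c
/* Convert a MEMS bridge output voltage to pressure.  The sensitivity is
 * assumed to be specified at a 5-V supply and to scale linearly with the
 * actual supply; offset error is assumed to be calibrated out. */
static double bridge_to_bar(double v_out_mv, double v_supply,
                            double sens_mv_per_bar_at_5v)
{
    double sens = sens_mv_per_bar_at_5v * (v_supply / 5.0);
    return v_out_mv / sens;
}

/* Example: a 40 mV/bar (at 5 V) die run from 3 V has an effective sensitivity
 * of 24 mV/bar, so a 12-mV bridge reading corresponds to about 0.5 bar. */
```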

Further information on the products is available at www.epcos.com/pressure_sensor_elements

TDK-Lambda | www.us.tdk-lambda.com