About Circuit Cellar Staff

Circuit Cellar’s editorial team comprises professional engineers, technical editors, and digital media specialists. You can reach the Editorial Department at editorial@circuitcellar.com, on Twitter at @circuitcellar, and at facebook.com/circuitcellar.

Innovative Product Design: An Interview with Rich Legrand

Rich Legrand founded Charmed Labs in 2002 to develop and sell innovative robotics-related designs, including the Xport Robot Kit, the Qwerk robot controller, the GigaPan robotic camera mount, and the Pixy vision sensor. He recently told us about his background, passion for robotics, and interest in open-source hardware.

CIRCUIT CELLAR: Tell us a bit about your background. When did you first get started with electronics and engineering?

RICH: Back in 1982 when I was 12, one of my older brother’s friends was what they called a “whiz kid.” I would show up uninvited at his place because he was always creating something new, and he didn’t treat me like a snotty-nosed kid (which I was). On one particular afternoon he had disassembled a Big Trak toy (remember those?) and connected it to his Atari 800, so the Atari could control its movements. He wrote a simple BASIC program to read the joystick movements and translate them to Big Trak movements. You could then hit the return key and the Atari would play back the motions you just made. There were relays clicking and LEDs flashing, and the Big Trak did exactly what you told it to do. I had never seen a computer do this before, and I was absolutely amazed. I wanted to learn as much as I could about electronics after that. And I’m still learning, of course.

CIRCUIT CELLAR: You studied electrical engineering at both Rice University and North Carolina State University. Why electrical engineering?

RICH: I think it goes back to when I was 12 and trying to learn more about robotics. With a limited budget, it was largely a question of what I could get my hands on. Back then you could go into Radio Shack and buy a handful of 7400-series parts and create something simple, but pretty amazing. Forrest Mims’s books (also available at Radio Shack) were full of inspiring circuit designs. And Steve Ciarcia’s “Circuit Cellar” column in Byte magazine focused on seat-of-the-pants electronics projects you could build yourself. The only tools you needed were a soldering iron, a voltmeter, and a logic probe. I think young people today see a similar landscape where it’s easier to get involved in electrical engineering than, say, mechanical engineering (although 3-D printing might change this). The Internet is full of source material, and the hardware (computers, microcontrollers, power supplies, etc.) is lower-cost and easier to find. The Arduino is a good example of this. It has its own ecosystem from which you can launch practically any project or idea.

CIRCUIT CELLAR: Photography factors into a lot of your work and work history. Is photography a passion of yours?

RICH: I don’t think so, but I enjoy photography. Image processing, image understanding, machine vision—the idea that you can extract useful information from a digital image with a piece of software, an algorithm. It’s a cool idea to me because you can have multiple vision algorithms and effectively have several sensors in one package. Or in the case of Gigapan, being able to create a gigapixel imager from a fairly low-cost point-and-shoot camera, some motors, and customized photo stitching software. I’m a hardware guy at heart, but hardware tends to be expensive. Combining inexpensive hardware with software to create something that’s lower-cost—it sounds like a pretty niche idea, but these are the projects that I seem to fall for over and over again. Working on these projects is what I really enjoy.

CIRCUIT CELLAR: Prior to your current gig at Charmed Labs, you were with Gigapan Systems, which you co-founded. Tell us about how you came to launch Gigapan.

RICH: Gigapan is a robotic camera mount that allows practically anyone with a digital camera to make high-resolution panoramas. The basic idea is that you use a camera with high resolution but a narrow field of view (high zoom) to capture a mosaic of pictures that can later be stitched together with software to form a much larger, highly detailed panorama of the subject, whether it’s the Grand Canyon or the cockpit of the Space Shuttle. This technique is used by the Mars rovers, so it’s not surprising that a NASA engineer (Randy Sargent) first conceived Gigapan. Charmed Labs got a chance to bid on the hardware, and we designed and manufactured the first Gigapan units as part of a public beta program. (The beta was funded by Carnegie Mellon University through donations from NASA and Google.) The beta garnered enough attention to attract investors and start a company focused on Gigapan, which we did. We were on CNN, we were mentioned by Jay Leno. It was a fun and exciting time!

The first Xport was a simple circuit board with flash for program storage and an FPGA for programmable I/O.

CIRCUIT CELLAR: In a 2004 article, “Closed-Loop Motion Control for Mobile Robotics“ (Circuit Cellar 169), you introduced us to your first product, the Xport. How did you come to design the Xport?

RICH: When the Gameboy Advance was announced back in 1999, I thought it was a perfect robot platform. It had a color LCD and a powerful 32-bit processor, it was optimized for battery power, and it was low-cost. The pitch went something like: “For $40 you can buy a cartridge for your Gameboy that allows you to play a game. For $99 you can buy a cartridge with motors and sensors that turns your Gameboy into a robot.” So the Gameboy becomes the “brains” of the robot, if you will. I didn’t know what the robot would do exactly, other than be cool and robot-like, and I didn’t know how to land a consumer electronics product on the shelves of Toys “R” Us, so I tackled some of the bigger technical problems instead, like how to turn the Gameboy into an embedded system with the required I/O for robotics. I ordered a Gameboy from Japan through eBay prior to the US release and reverse-engineered the cartridge port. The first “Xport” prototype was working not long after the first Gameboys showed up in US stores, so that was pretty cool. It was a simple circuit board that plugged into the Gameboy’s cartridge port. It had flash for program storage and an FPGA for programmable I/O. The Xport seemed like an interesting product by itself, so I decided to sell it. I quit my job as a software engineer and started Charmed Labs.

CIRCUIT CELLAR: Tell us about the Xport Botball Controller (XBC).

RICH: The Xport turned the Gameboy into an embedded system with lots of I/O, but my real goal was to make a robot. So I added more electronics around the Xport for motor control, sensor inputs, a simple vision system, even Bluetooth. I sold it online for a while before the folks at Botball expressed interest in using it for their robot competition, which is geared for middle school and high school students. Building a robot out of a Gameboy was a compelling idea, especially for kids, and tens of thousands of students used the XBC to learn about engineering—that was really great. I never got the Gameboy robot on the shelves of Toys “R” Us, but it was a really happy ending to the project.

CIRCUIT CELLAR: Charmed Labs collaborated with the Carnegie Mellon CREATE Lab on the Qwerk robot controller. How did you end up collaborating with CMU?

RICH: I met Illah Nourbakhsh, who runs the CREATE Lab, at a robot competition back when he was a grad student. His lab’s Telepresence Robotics Kit (TeRK) was created in part to address the falling rate of computer science graduates in the US. The idea was to create a curriculum that featured robotics to help attract more students to the field. Qwerk was an embedded Linux system that allowed you to make a telepresence robot easily. You could literally plug in some motors, a webcam, and a battery, fire up a web browser, and become “telepresent” through the robot. We designed and manufactured Qwerk for a couple of years before we licensed it.

The Qwerk

CIRCUIT CELLAR: Pixy is a cool vision sensor for robotics that you can teach to track objects. What was the impetus for that design?

RICH: Pixy is actually the fifth version of the CMUcam. The first CMUcam was invented at Carnegie Mellon by Anthony Rowe back in 2000 when he was a graduate student. I got involved on a bit of a lark. NXP Semiconductors had just announced a processor that looked like a good fit for a low-cost vision sensor, so I sent Anthony a heads-up, that’s all. He was looking for someone to help with the next version of CMUcam, so it was a happy coincidence.

The Pixy vision sensor

CIRCUIT CELLAR: You launched Pixy in 2013 on Kickstarter. Would you recommend Kickstarter to Circuit Cellar readers who are thinking of launching a hardware product?

RICH: Before crowdfunding was a thing, you either had to self-fund or convince a few investors to contribute a decent amount of cash based on the premise that you had a good idea. And the investors typically didn’t have your background or perspective, so it was usually a difficult sell. With crowdfunding, a couple hundred people with similar backgrounds and perspectives contribute $50 (or so) in exchange for becoming the very first customers. It’s an easier path I think, and it’s a great fit for products like Pixy that have a limited but enthusiastic audience. I think of crowdfunding as a cost-effective marketing strategy. Sites like Kickstarter get huge amounts of traffic, and getting your idea in front of such a large audience is usually expensive—cost-prohibitive in my case. It also answers two important questions for hardware makers: Are enough people interested in this thing to make it worthwhile? And if it is worthwhile, how many should I make?

But I really didn’t think many people would be interested in a vision sensor for hobbyist robotics, so when faced with the task of creating a Kickstarter for Pixy, I thought of lots of excuses not to move forward with it. Case in point—if your Kickstarter campaign fails, it’s public Internet knowledge. (Yikes!) But I’m always telling my boys that you learn more from your mistakes than from your successes, so it seemed pretty lame that I was dragging my heels on the Kickstarter thing because I wanted to avoid potential embarrassment. I eventually got the campaign launched, it was a success, and Pixy got a chance to see the light of day, so that was good. It was a lot of work, and it was psychologically exhausting, but it was really fun to see folks excited about my idea. I’d totally do it again, and I’d like to crowdfund my next project.

CIRCUIT CELLAR: Can you tell us about one or two of the more interesting projects you’ve seen featuring Pixy?

RICH: Ben Heck used Pixy in a couple of episodes of The Ben Heck Show (www.element14.com/community/community/experts/benheck). He used Pixy to create a camera that can automatically track what he’s filming. And Microsoft used Pixy for a Windows 10 demo that played air hockey. IR-Lock (www.irlock.com) is a small company that launched a successful Kickstarter campaign featuring Pixy as a beacon detector for use in autonomous drones. All of these projects have a high fun factor, which I really enjoy seeing.

CIRCUIT CELLAR: What’s next for Charmed Labs?

RICH: I’ll tell you about one of my crazier ideas. My wife gets on my case every holiday season to hang lights on the house. It wouldn’t be that bad, except our next-door neighbors go all-out. They hang lights on every available surface of their house—think Griswolds from the Christmas Vacation movie. So anything I do to our house looks pretty sad by comparison. I’m competitive. But I had the idea that if I created a computer-controlled light show that’s synchronized to music, it might be a good face-saving technology, a way to possibly one-up the neighbors, because that’s what it’s all about, right? (Ha!) So I’ve been working on an easy-to-set-up and low-cost way to make your own holiday light show. It’s way outside of my robotics wheelhouse. I’m learning about high-voltage electronics and UL requirements, and there’s a decent chance it won’t be cost-competitive, or even work, but my hope is to launch a crowdfunding campaign in the next year or so.

CIRCUIT CELLAR: What are your thoughts on the future of open-source hardware?

RICH: We can probably thank the Arduino folks because before they came along, very few were talking about open hardware. They showed that you can fully open-source a design (including the hardware) and still be successful. Pixy was my first open hardware project and I must admit that I was a little nervous moving forward with it, but open hardware principles have definitely helped us. More people are using Pixy because it’s fully open. If you’re interested in licensing your software or firmware, open hardware is an effective marketing strategy, so I don’t think it’s about “giving it all away” as some might assume. That is, you can still offer closed-source licenses to customers that want to use your software, but not open-source their customizations. I’ve always liked the idea of open vs. proprietary, and I’ve learned plenty from fellow engineers who choose to share instead of lock things down. It’s great for innovation.

On a different robot, a flapping-wing ornithopter, we had a PC104 computer running MATLAB as the controller. It probably weighed about 2 pounds, which forced us to build a huge wingspan – almost 6 feet. We dreamed about adding some machine vision to the platform as well. Having just built a vision-based robot for MIT’s MASLAB competition using an FPGA paired with an Arduino, the PC104 solution started to look pretty stupid to me. That was what really got me interested in embedded work. FPGAs and microcontrollers gave you an insane amount of computing power at comparatively minuscule power and weight footprints. And so died the PC104 standard.

CIRCUIT CELLAR: Tell us about your internship at Analog Devices. Can you provide a bit info about what you worked on?

ANDREW: I got the job at Analog after some folks there saw the MASLAB robot I mentioned earlier. I worked in the MEMS group that was responsible for the XL345. This was the accelerometer used in the Nintendo Wii, so we all felt a bit like rockstars. The iPhone had just come out, so everyone was dreaming about using these types of devices in smartphones. Analog was where I really cut my teeth on the Cortex-M3, which we used in our test hardware for the part.

CIRCUIT CELLAR: What was the most important thing you learned during your internship at Analog Devices?

ANDREW: Certainly the most surprising thing I learned was that in the land of digital logic and RTL, verification engineers outnumber design engineers by about 6 to 1. When going to fab costs you six or seven figures, you need to be *sure* that things are going to work. Despite the enormous amount of simulation, you still never get it on the first try though. I won’t mention how many tries it took on that part.

CIRCUIT CELLAR: What is LeafLabs? How did it start? Who comprises your team today?

ANDREW: LeafLabs is an R&D firm specializing in embedded and distributed systems. Projects start as solving specific problems for a client, but the idea is to turn those relationships into product opportunities. To me, that’s what separates R&D from consulting.

I started LeafLabs with a handful of friends in 2009. It was an all-MIT cast of engineers, and it took four or five years before I understood how much we were holding ourselves back by not embracing some marketing and sales talent. The original concept was to try to design ICs that were optimized for running certain machine learning algorithms at low power. The idea was that smartphones might want to do speech-to-text some day without sending the audio off to the cloud. This was way too ambitious for a group of 22-year-olds with no money.

Our second overly ambitious idea was to try to solve the “FPGA problem.” I’m still really passionate about this, but it too was too big a bite for four kids in a basement. The problem is that FPGA vendors like Xilinx and Altera have loads of expertise in silicon, but great software is just not in their DNA. Imagine if the x86 instruction set had never been published. What if Intel insisted on owning not just the processors, but the languages, compilers, libraries, IDEs, debuggers, operating systems, and the rest of it? Would we ever have gotten to Linux? What about Python? FPGAs have enormous potential to surpass even the GPU as a completely standard technology in computer systems. There should be some gate fabric in my phone. The development tools just suck, suck, suck. If any FPGA executives are reading this: Please open up your bitstream formats. The FSF and the rest of the community will get the ball rolling on an open toolchain that will far exceed what you guys are doing internally. You will change the world.

CIRCUIT CELLAR: How did the Maple microcontroller board come about? How long did it take to develop?

ANDREW: Arduino was really starting to come up at the time. I had just left Analog, where we had been using the 32-bit Cortex-M3. We started asking, “Chips like the STM32 are clearly the way of the future, so why on earth is Arduino using a chip from the ’90s?” Perry, another LeafLabs founder, was really passionate about this: ARM is taking over the world, and the community deserves a product that is as easy to use as Arduino but built on top of modern technology.

CIRCUIT CELLAR: Can you define “minimalist data acquisition” for our readers? What is it and why does it interest you?

ANDREW: More and more fields, but particularly neuroscience, are having to deal with outrageously huge real-time data sets. There are 100 billion neurons in the human brain; if we want to listen to just 1,000 of them, we are already talking about ~1 Gb/s. Ed Boyden, a professor at MIT, asked us if we could build some hardware to help handle the torrent. Could we scale to 1 Tb/s? Could we build something that researchers on a budget could actually afford? That mere mortals could use?
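A quick back-of-the-envelope check shows where that figure comes from. The per-channel sampling rate and resolution below are illustrative assumptions (typical for extracellular spike recording), not numbers from the interview:

```python
channels = 1_000        # electrodes being recorded (from the interview)
sample_rate = 30_000    # samples/s per channel (assumed)
bits_per_sample = 16    # ADC resolution (assumed)

bits_per_second = channels * sample_rate * bits_per_sample
print(f"{bits_per_second / 1e9:.2f} Gb/s")  # 0.48 Gb/s of raw samples
```

Add packet framing, timestamps, or higher sampling rates, and the ~1 Gb/s figure quoted above is easy to reach.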

Willow is a hardware platform for capturing, storing, and processing neuroscience data at this scale. We had to be “minimalist” to keep costs down and ensure our system is easy to use. Since we need to use an FPGA anyway to interface with a data source (like a bank of ADCs or an array of image sensors), we thought, “Why not use the same chip for interfacing to storage?” With a single $150 FPGA and a couple of $200 SSDs, we can record at 12 Gbps, put guarantees on throughput, and record for a couple of hours!

CIRCUIT CELLAR: Tell us about the Willow minimalist data acquisition system. How did the project come about? Are you still beta testing?

ANDREW: If you need to capture, store, and process real-time data at the scale of 10 Gbps, or 1,000 Gbps, and you want an open-source tool that is not going to cost you six figures, we would love to talk with you about our beta program. We will probably be coming out of beta in early 2016.

CIRCUIT CELLAR: What are your goals for LeafLabs for the next 6 to 12 months?

ANDREW: Including our superb remote contractors, our team is pushing 20. A year from now, it could be double that. This is a really tricky transition – where company culture really starts to solidify, where project management becomes a first-order problem, and where people’s careers are on the line. My first goal for LeafLabs is to make sure we nail this transition and build off of a really solid foundation.

Besides that, we are always looking for compelling new problems to work on and new markets to play in. Getting into neuroscience has been an absolute blast.

CIRCUIT CELLAR: Can you tell us about any new products you are working on?

ANDREW: We just started a new project in the consumer electronics space. I think we caught that bug from working on Ara and how exciting it is to work on something people immediately understand without being domain experts. Put an Ara phone in someone’s hands and they immediately say “Wow.” Unfortunately, we aren’t ready to talk about the new project yet.

CIRCUIT CELLAR: Think big picture. What is the “next big thing” in electrical engineering or game changer on the horizon? For instance, what excites you the most? The Internet of Things? Innovations in open-source technology? 3-D printing?

ANDREW: Chip-to-chip networking with UniPro, of course! I think that we have a real opportunity to make hardware more like software during this next decade. Look at how web companies operate. They are design-focused, iterate swiftly, and deploy continuously. We can do this in hardware too. With each new tool – be it Android or UniPro or whatever – we get a bit closer to the ideal where product development is more about users and less about plumbing. All of the cheap silicon coming down the pipe from the smartphone industry is truly a revolution for anyone in the hardware business. I can run Android now on a $5 part. In a few years, it will be less than $1. With all that horsepower, we can move embedded development away from assembly hacking and debugging TCP stacks and toward the much more interesting problem of how to make the billions of devices comprising the Internet of Things secure, flexible, and most importantly, useful!

This interview appears in Circuit Cellar 305 (December 2015).

Low Power PIC MCUs Extend Battery Life, Eliminate External Memory via Flash

Microchip Technology recently expanded its Low Power PIC microcontroller portfolio. The new PIC24F GB6 family includes up to 1 MB of flash memory with Error Correction Code (ECC) and 32 KB of RAM. The new 16-bit MCU family is Microchip’s first to offer such a large memory size. Featuring dual-partition flash with Live Update capability, the devices can hold two independent software applications, permitting the simultaneous programming of one partition while executing application code from the other. This useful combination of features makes the PIC24F GB6 family ideal for a wide variety of applications (e.g., industrial, computer, medical/fitness, and portable designs).
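Conceptually, a dual-partition live update writes the new image to the idle partition while the active one keeps executing, verifies it, and only then swaps roles. The Python below is a toy model of that flow for illustration, not Microchip’s actual implementation; the class and the hash check are invented for the sketch:

```python
import hashlib

class DualPartitionFlash:
    """Toy model of dual-partition flash with live update."""

    def __init__(self):
        self.partitions = {0: b"firmware-v1", 1: b""}
        self.active = 0  # partition the CPU is currently executing from

    def live_update(self, image: bytes, expected_sha256: str) -> bool:
        inactive = 1 - self.active
        self.partitions[inactive] = image            # program the idle partition
        if hashlib.sha256(image).hexdigest() != expected_sha256:
            return False                             # bad image: keep old firmware
        self.active = inactive                       # swap roles on next reset
        return True

flash = DualPartitionFlash()
new_fw = b"firmware-v2"
ok = flash.live_update(new_fw, hashlib.sha256(new_fw).hexdigest())
print(ok, flash.active)  # True 1
```

The key property is that a failed verification leaves the running firmware untouched, which is what makes in-field updates safe.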

Microcontrollers in the PIC24F GB6 family have active current consumption as low as 190 µA/MHz and 3.2 µA in Sleep mode. With the ability to perform over-the-air firmware updates, designers can provide a cost-effective, reliable, and secure method for updating their applications. The robust peripheral set for these devices includes a 200-ksps, 24-channel, 12-bit analog-to-digital converter (ADC).
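Those two current figures translate directly into battery life for a duty-cycled design. In this rough estimate, the 8 MHz clock, 1% duty cycle, and 220 mAh cell are illustrative assumptions; only the 190 µA/MHz and 3.2 µA numbers come from the announcement:

```python
active_uA_per_mhz = 190    # active current per MHz (from the announcement)
sleep_uA = 3.2             # Sleep-mode current (from the announcement)
clock_mhz = 8              # assumed operating frequency
duty = 0.01                # assumed fraction of time awake (1%)
battery_uAh = 220_000      # assumed 220 mAh cell

avg_uA = duty * (active_uA_per_mhz * clock_mhz) + (1 - duty) * sleep_uA
days = battery_uAh / avg_uA / 24
print(f"{avg_uA:.1f} uA average, ~{days:.0f} days")  # 18.4 uA average, ~499 days
```

Even at a 1% duty cycle the average draw stays in the tens of microamps, which is why single-digit-microamp sleep currents matter for year-scale battery life.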

The PIC24F GB6 family is supported by Microchip’s standard suite of development tools. The new PIC24FJ1024GB610 Plug-In Module (part # MA240023, $25) is available today for the Explorer 16 Development Board (part # DM240001, $129).

All eight members of the PIC24F GB6 microcontroller family are released for volume production, and are available within normal lead times. Pricing starts at $1.74 each, in high volumes. Product variants are available in 64-, 100-, and 121-pin packages, with flash memory ranging from 128 KB to 1 MB.

Source: Microchip Technology

Low-Power RS-485 Transceiver with Low-Voltage Interface

Exar Corp. recently announced the XR33202, a half-duplex, 20-Mbps RS-485 (TIA/EIA-485) transceiver optimized to operate over a wide 3-to-5.5-V supply voltage range. The transceiver includes an adjustable low-voltage logic interface and features the industry’s lowest standby current of 3 µA (maximum), 0.05 µA (typical). The device’s wide operating voltage range, flexible logic interface, and low standby current make it ideal for battery-powered and multi-voltage systems.

The XR33202 exceeds the highest ESD rating of IEC 61000-4-2 Level 4. It also includes protection features such as hot-swap glitch protection, overload protection, and enhanced receiver fail-safe for open, shorted, or terminated idle data lines.

Specified over an extended temperature range of –40°C to 125°C, the XR33202 is offered in a RoHS-compliant, green/halogen-free, space-saving 3 mm × 3 mm DFN package. One-thousand-piece pricing starts at $1.60 each.

Summary of features:

  • Wide 3-to-5.5-V supply operation
  • 1.65 to 5.5 V I/O logic interface VL pin
  • Less than 3 µA (max) standby current
  • 20 Mbps maximum data rate
  • Robust ESD protection for RS-485 bus pins
  • –40°C to 125°C ambient operating temperature range

Source: Exar Corp.

Bare Metal Security Extends On-Chip Analytics for SoCs

UltraSoC recently announced Bare Metal Security capabilities that extend its on-chip monitoring and analytics to deliver security functionality required in a broad range of embedded products (e.g., IoT appliances and enterprise systems). Bare Metal Security features are implemented as hardware running below the operating system, so they remain effective even if the system’s conventional security measures are compromised. This adds an entirely new level of protection for the system-on-chip (SoC).

Bare Metal Security functionality uses the UltraSoC monitors to watch for unexpected behaviors such as suspicious memory accesses or processor activity, at hardware speed and nonintrusively, with minimal silicon overhead. Because it is an orthogonal on-chip hardware infrastructure independent of the main system functionality and software, there is no negative impact on system performance and it is very difficult for an attacker to subvert or tamper with. Although it functions below and outside of the operating system, the technology also provides a means of communicating with software on the device as part of a holistic security system, if this is necessary. Bare Metal Security features also provide visibility of the whole system, making it extremely difficult to camouflage or hide an attack.

By offering resource-efficient and highly effective protection against malicious attack and malfunction, the UltraSoC on-chip analytics and monitoring system provides both development support and functionality enhancement from the same on-chip blocks. Teams already using UltraSoC to accelerate the debug, silicon validation, and bring-up process can use the same infrastructure for security processing. Designers who need Bare Metal Security features get the development benefits of a vendor-independent on-chip debug infrastructure at zero additional cost.

Although originally developed for debug and silicon validation, UltraSoC’s IP also enables a broad range of value-added functionality in-service, of which security is just one example. Other applications include in-field monitoring, performance optimization, reducing power utilization and SLA enforcement.

Source: UltraSoC Technologies

Expanded Auto Test Capabilities for Scopes with Support for HDMI v2.0 and Embedded DisplayPort

Teledyne LeCroy recently announced the availability of QPHY-HDMI2 and QPHY-eDP, which expand its automated transmitter test solutions for display standards to include HDMI Version 2.0 and Embedded DisplayPort. The QPHY-HDMI2 software option for the WaveMaster/SDA/DDA 8 Zi series of oscilloscopes provides validation/verification and debug tools in accordance with version 2.0 of the HDMI electrical test specification.

The QPHY-eDP software option for the WaveMaster/SDA/DDA 8 Zi series of oscilloscopes provides an automated test environment for running all of the real-time oscilloscope tests for sources in accordance with Version 1.4a of the Video Electronics Standards Association (VESA) Embedded DisplayPort PHY Compliance Test Guideline. QPHY-eDP supports testing at up to 5.4 Gbps for full coverage of all bit rates included in the eDP 1.4 compliance test guideline. As with QPHY-HDMI2, optional RF switching and de-embedding is also supported by QPHY-eDP.

The QPHY-HDMI2 and QPHY-eDP each cost $7,000. Both are available on WaveMaster 8Zi, LabMaster 9Zi, and LabMaster 10Zi oscilloscopes with bandwidths of 13 GHz or higher and running firmware version 7.9.x or later.

Source: Teledyne LeCroy

December Electrical Engineering Challenge Update (Sponsor: NetBurner)

Spot the schematic error? Take the December Electrical Engineering Challenge (sponsored by NetBurner) now!

This month, find the error in the schematic posted below (and on the Challenge webpage) for a chance to win a NetBurner MOD54415 LC Development Kit ($129 value) or a Circuit Cellar Digital Subscription (1 year).


Find the error in this schematic and submit your answer by December 20, 2015. Submit via the Challenge webpage. Click image to access submission form.


Out of each month’s group of entrants who correctly find the error in the code or schematic, one person will be randomly selected to win a NetBurner IoT Cloud Kit and another person will receive a free 1-year digital subscription to Circuit Cellar.

  • NetBurner MOD54415 LC Development Kit: You can add Ethernet connectivity to an existing product or use it as your product’s core processor! The NetBurner Ethernet Core Module contains everything design engineers need to add network control and to monitor a company’s communications assets. The module solves the problem of network-enabling devices with 10/100 Ethernet, including those requiring digital, analog, and serial control.
  • Circuit Cellar Digital Subscription (1 year): Each month, Circuit Cellar magazine reaches a diverse international readership of professional electrical engineers, EE/ECE academics, students, and electronics enthusiasts who work with embedded technologies on a regular basis. Circuit Cellar covers a variety of essential topics, including embedded development, wireless communications, robotics, embedded programming, sensors & measurement, analog tech, and programmable logic.


Read the Rules, Terms & Conditions


NetBurner solves the problem of network-enabling devices, including those requiring digital, analog, and serial control. NetBurner provides complete hardware and software solutions that help you network-enable your devices.

NetBurner, Inc.
5405 Morehouse Dr.
San Diego, CA 92121 USA

The Future of Hardware Design

The future of hardware design is in the cloud. Many companies are already focused on the Internet of Things (IoT) and creating hardware to be interconnected in the cloud. However, can we get to a point where we build hardware itself in the cloud?

Traditional methods of building hardware in the cloud recall the large industry of EDA software packages—board layouts, 3-D circuit assemblies, and chip design. It’s arguable that this industry emphasizes mechanical design, focusing on intricate chip placement, 3-D space, and connections. There are also cloud-based SPICE simulators for electronics—a less-than-user-friendly experience with limited libraries of generic parts. Simulators that do have a larger library also tend to have a larger associated cost. Finding exact parts can be a frustrating experience. A SPICE transistor typically does not have a BOM part number, so turning a working design into real hardware becomes a sourcing hunt amongst several vendor offerings.

What if I want to create real hardware in the cloud and build a project like those in Circuit Cellar articles? This is where I see the innovation that is changing the future of how we make electronics. We now have cloud platforms that provide the experience of using actual parts from vendors and interfacing them with a microcontroller. Component lists including servo motors, IR remotes with buttons, LCDs, buzzers with sound, and accelerometers are needed if you’re actually building a project. Definitive parts carried by vendors—not just generic ICs—are crucial. Ask any design engineer: they have their typical parts that they reuse and trust in every design. They need to verify that these parts move and work, so having an online platform with these parts allows for a real-world simulation.

An Arduino IDE that allows for real-time debugging and stepping through code in the cloud is powerful. Advanced microcontroller IDEs do not include external components in their simulators or environments. A platform that can interconnect a controller with external components in simulation mirrors real life more closely than anything else. And as computer processing power continues to rise, the same approach may extend to other, more complex MCUs.

Most hardware designers are unaware of the newest cloud offerings or have not worked with a platform enough to evaluate it as a game-changer. But imagine if new electronics makers and existing engineers could learn and innovate without hardware for free in the cloud.

I remember spending considerable time working on circuit boards to learn the hardware "maker" side of electronics. I would typically start with a breadboard to build basic circuits, then migrate the design to a protoboard for a smaller, more robust circuit that could be soldered together. Several confident projects later, I jumped to designing and producing PCBs, which eventually led to an entirely different level in the semiconductor industry. Once the boards were designed, all the motors, sensors, and external parts could be assembled onto the board for testing.

Traditionally, an assembled PCB was needed to run a hardware design—to test it for reliability, to program it, and to verify it works as desired. Parts could be implemented separately, but in the end, a final assembled design was required for software testing, peripheral integration, and quality testing. Imagine how different this is with a hardware simulation. The quality aspect will always be tied to actual hardware testing, but the design phase is definitely undergoing disruption. A user can simply modify and test until the design works to their liking—through any number of failed online iterations, all without consequence—and only then commit it to a PCB.

With an online simulation platform, aspiring engineers can now have experiences different from my traditional one. They don't need labs or breadboards to blink LEDs. The cloud equalizes access to technology regardless of background. Hardware designs can flow like software. Instead of sending electronics kits to countries with importation issues, hardware designs can be shared online, and people can toggle buttons and user-test them. Students do not have to buy expensive hardware, batteries, or anything more than a computer.

An online simulation platform also affects the design cycle. Hardware design cycles can be fast when needed, but not software-fast. Merging the two sides means thousands of people can access a design and provide feedback overnight, just like a Facebook update. Changes to a design can be made instantly and deployed at the same time—an unheard-of cycle time. That's software's contribution to the traditional hardware one.

There are other possibilities for hardware simulation on the end-product side of the market. For instance, crowdfunding websites have become popular destinations for funding projects. But should we trust a simple video of a working prototype and buy the hardware ahead of production? Why can't we play with the real hardware online? With an online simulation of actual hardware, even less needs to be invested in hardware costs, and in the virtual environment, potential customers can experience the end product built on a real electronic design.

Subtle changes tend to build up and then avalanche to make dramatic changes in how industries operate. Seeing the early signs—realizing something should be simpler—allows you to ask questions and determine where market gaps exist. Hardware simulation in the cloud will change the future of electronics design, and it will provide a great platform for showcasing your designs and teaching others about the industry.

John Young is the Product Marketing Manager for Autodesk’s 123D Circuits (https://123d.circuits.io/) focusing on building a free online simulator for electronics. He has a semiconductor background in designing products—from R&D to market launch for Freescale and Renesas. His passion is finding the right market segment and building new/revamped products. He holds a BSEE from Florida Atlantic University, an MBA from the Thunderbird School of Global Management and is pursuing a project management certification from Stanford.

Industrial Drive Control SoC to Support Digital and Analog Position Sensors

Texas Instruments’s new TMS320F28379D and TMS320F28379S microcontrollers are an expansion to the C2000 Delfino microcontroller portfolio. When combined with DesignDRIVE Position Manager technology, they enable simple interfacing to position sensors. Based on the real-time control architecture of C2000 microcontrollers, the DesignDRIVE platform is ideal for the development of industrial inverter and servo drives used in robotics, elevators, and other industrial manufacturing applications.

With the C2000 DesignDRIVE development kit, you can investigate a variety of motor drive topologies. DesignDRIVE is supported by the C2000 controlSUITE package and includes specific examples of vector control of motors, incorporating current, speed, and position loops, to help developers jumpstart their evaluation and development. In addition, users can download Texas Instruments’s Code Composer Studio integrated development environment (IDE), which provides code generation and debugging capabilities. You can also download reference interface and power supply designs for motor drives.

The TMS320F28379D and TMS320F28379S microcontrollers are now sampling starting at $17.20. The DesignDRIVE Kit (TMDXIDDK379D) costs $999.

Source: Texas Instruments

Boost Arduino Mega Capability with 512-KB SRAM & True Parallel Bus Expansion

The Arduino MEGA-2560 is a versatile microcontroller board, but it has only 8 KB SRAM. SCIDYNE recently developed the XMEM+ to enhance a standard MEGA in two ways. It increases SRAM up to 512 KB and provides True Parallel Bus Expansion. The XMEM+ plugs on top using the standard Arduino R3 stack-through connector pattern. This enables you to build systems around multiple Arduino shields. Once enabled in software, the XMEM+ becomes an integral part of the accessible MEGA memory.

The XMEM+ also provides a fixed 23K Expansion Bus for connecting custom parallel type circuitry. Buffered Read, Write, Enable, Reset, 8-bit Data, and 16-bit Address signals are fully accessible for off-board prototyping. The XMEM+ makes any Arduino MEGA system much better suited for memory-intensive applications involving extended data logging, deep memory buffers, large arrays, and complex data structures. Target applications include industrial control systems, signage, robotics, IoT, product development, and education.

The introductory price is $39.99.

Source: SCIDYNE Corp.

Dual-Core, Runtime-Reconfigurable Processor for Low-Power Applications

Imsys has developed a dual-core, runtime-reconfigurable processor that can run at 350 MHz with an active power consumption of 19.7 µW/MHz using one core. Intended for low-power applications, 97% of the processor’s transistors are used in memory blocks. The cores share memories and a five-port grid network router (NoC). Memory management is handled by microcode, and memory is closely integrated with the processor without the need for an ordinary cache controller. The active consumption of each core—executing from RAM, and including the RAM’s consumption—is 6.9 mW at 350 MHz.

Imsys’s processor is suitable for sensor nodes powered by energy harvesting in the Internet of Things (IoT), as well as in many-core chips for microservers and robotics. Microcode, as opposed to logic gates, is compact and energy efficient. Imsys uses extensive microprogramming to accomplish a rich set of instructions, thereby reducing the number of cycles needed without energy-inefficient speculative activity and duplicated hardware logic. Each core has two instruction sets, one of which executes Java and Python directly from the dense JVM bytecode representation. C code is compiled to the other set with unparalleled density. Internal microcode is used for computationally intensive standard routines, such as crypto algorithms, which would otherwise be assembly-coded library routines or even special hardware blocks. Optimizing CPU-intensive tasks in microcode can reduce execution time and energy consumption by more than an order of magnitude compared to C code.

The rich instruction set optimized for the compiler reduces the memory needed for software. And just like the microcoded algorithms, it reduces the number of clock cycles needed for execution. This platform has a certified JVM and uses an RTOS kernel certified to ISO 26262 safety standard for automotive applications. The development tools will be enhanced with the support enabled by the LLVM infrastructure. A new instruction set optimized for an LLVM backend has been developed and is being implemented in the coming hardware generation.

Source: Imsys

Low-Power Apollo Microcontroller Now in Volume Production

Ambiq Micro’s Apollo MCU—which was demonstrated to consume less than half the energy of other microcontrollers in real-world applications (EEMBC ULPBench benchmark)—is now shipping for high-volume consumer applications. The microcontroller features active mode current around 34 µA/MHz when running from flash memory and sleep mode current less than 150 nA. Built around an ARM Cortex-M4 core with a floating-point unit, it’s available with 64 to 512 KB of embedded flash memory. In addition, it includes a 10-bit ADC and a variety of serial interfaces. Available in both BGA and WLCSP packages, the Apollo MCU is available for immediate delivery with prices starting at $1.50 in 10,000-unit quantities.

Source: Ambiq Micro

An Introduction to Verilog

If you are new to programming FPGAs and CPLDs or looking for a new design language, Kareem Matariyeh has the solution for you. In this article, he introduces you to Verilog. Although the hardware description language has been used in the ASIC industry for years, it has all the tools to help you implement complex designs, such as creating a VGA interface or writing to an Ethernet controller. Matariyeh writes:

Programmable logic has been around for well over two decades. Today, due to larger and cheaper devices on the market, FPGAs and CPLDs are finding their way into a wide array of projects, and there is a plethora of languages to choose from. VHDL is the popular choice outside of the U.S. and is preferred if you need a strongly typed language. The focus of this article, however, is another popular language called Verilog, a hardware description language similar to C.

Typically, Verilog is used in the ASIC design industry. Companies such as Sun Microsystems, Advanced Micro Devices, and NVIDIA use Verilog to verify and test new processor architectures before committing to physical silicon and post-fab verification. However, Verilog can be used in other ways, including implementing complex designs such as a VGA interface. Another complex design such as an Ethernet controller can also be written in Verilog and implemented in a programmable device.

This article is mostly tailored to engineers who need to learn Verilog and know little or nothing about the language. Those who know VHDL will benefit from reading it as well and should be able to pick up Verilog fairly quickly after reviewing the example listings and referring to the Resources at the end of the article. This article does not cover hardware, but at the end I have included some links that will help you learn how the hardware interacts with the language.


First, it is best to know what variable types are available in Verilog. The basic types are binary, integer, and real. Other types are available, but they are not used as often as these three. Keep everything in the binary number system as much as possible, because type casting can cause post-implementation issues—though not all tools handle it the same way. Binary and integer types can also take the values “z” (high impedance) and “x” (don’t care). Both are nice to have around when you want a shared bus between designs or a bus to the outside world. Binary types can be assigned from an integer value, but there are times when you want to assign or examine specific bits, and some of the listings use sized-literal notation for this. It looks like X’wY, where X is the word size, w is the number base—b for binary, h for hex—and Y is the value. Any value without this prefix is considered an integer by default. Keeping everything in binary, however, can become a pain in the neck, especially when dealing with numbers larger than 8 bits.
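As a rough sketch of the notation just described (the module and signal names here are mine, not from the article's listings):

```verilog
module literal_demo;
  reg [7:0] byte_reg;   // an 8-bit binary type
  integer   count;      // 32 bits, signed

  initial begin
    byte_reg = 8'b1010_0001; // X'wY form: 8 bits, binary base
    byte_reg = 8'hA1;        // the same value in hex
    byte_reg = 161;          // a plain integer is cast to the register width
    byte_reg = 8'bzzzz_zzzz; // high impedance on every bit
    count    = -42;          // integers are signed by default
  end
endmodule
```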

Table 1 shows some of the variable types available in Verilog. Integer is probably the most useful to have around because it is 32 bits long and makes it easy to keep track of numbers. Note that integer is a signed type but can also be set to all “z” or “x.” Real is not used much; when a real value is converted to an integer, the number is truncated. Keep this in mind when using the real type, granted it is the least popular of the three. When a design is initialized in a simulator, binary and integer values start out as all “x”; real starts at 0.0 because it cannot hold “x.” There are other types used for interconnecting within and outside of a design. They are included in the table but won’t be introduced until later.

Some, but not all, C operators exist in Verilog; Table 2 lists most of the commonly used ones. Like C, Verilog can perform implicit casting (e.g., adding an integer to a 4-bit word and storing the result in a binary register or even a real). This is typically frowned upon because implicit casting in Verilog can open a can of worms and cause issues when the code runs in hardware. As long as casting does not give erroneous results during an operation, there should be no show-stoppers in a design. Signed operation happens only when integer and real types are used in arithmetic (add, subtract, multiply) operations.
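A small sketch, with invented names, of implicit casting and a few of the operators discussed above:

```verilog
module ops_demo;
  reg  [3:0] nibble = 4'hC;  // a 4-bit word
  integer    total;
  reg  [7:0] sum;

  initial begin
    // implicit cast: the 4-bit word is widened before the integer add
    total = nibble + 10;       // 12 + 10 = 22
    sum   = nibble << 2;       // shift operator, as in C
    if (&nibble == 1'b0)       // reduction AND, which C lacks
      total = total | 32'h100; // bitwise OR
  end
endmodule
```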


In Verilog, designs are called modules. A module defines its ports and contains the implementation code. If you think of the design as a black box, Verilog code typically looks like a black box with the top missing. Languages like Verilog and VHDL encourage black-box usage because it can make code more readable, make debugging easier, and encourage code reuse. In Verilog, multiple code implementations cannot share the same module name—in stark contrast to VHDL, where multiple architectures can share the same entity name. The only way around this in Verilog is to copy a module and rename it.
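A minimal module skeleton makes the black-box picture concrete (the names here are illustrative only):

```verilog
// A module is the black box: ports on the outside, implementation inside.
module widget (
  input        clk,      // port directions are declared with the ports
  input  [7:0] data_in,
  output [7:0] data_out
);
  // implementation code goes here
endmodule
```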

In Listing 1, a fairly standard shift register inserts a binary value at the end of a byte every clock cycle. If you’re experienced with VHDL, you will notice that there aren’t any library declarations; this is mainly because Verilog originated from an interpretive foundation. However, there are include directives that can be used to pull in external modules and features. The first lines after the module statement define the module’s port directions and types with the reserved words input and output. There is another declaration, inout, which is bidirectional but does not appear in the listing. A module’s input and output ports can use integer and real, but binary is recommended for a top-level module.

The reg statement essentially acts like a storage unit. Because it has the same name as the output port, the two act as one item. Using reg this way is helpful because its storage ability lets the output remain constant while system inputs change between clock cycles. There is another kind of statement called wire, used to tie multiple modules together or to drive combinational designs; it appears in later listings.

The next line of code is the always statement, or block. You want begin and end statements for it. If you know VHDL, this is the same as the process statement and works in the same fashion. If you are completely new to programmable logic, it works like this: “for every event X on the signals in the sensitivity list, follow these instructions.” The begin and end statements are the equivalent of the curly braces in C/C++. It’s best to use them with decision structures (i.e., always, if, and case) as much as possible.

Finally, the last statement is a logical left-shift operation. In some instances, Verilog bitwise operators need the keyword assign for the operation to happen; the compiler will tell you if an assign statement is missing. From there, the code performs its insertion operation and then waits for the next positive edge of the clock. This is a straightforward example; unfortunately, it doesn’t do much. The best way around that is to add more features using functions, tying in more modules, or using parameters to increase flexibility.
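A shift register along the lines the article describes might look like this (a sketch, not the article's actual Listing 1):

```verilog
// On every rising clock edge, shift left and insert serial_in at the low end.
module shift8 (
  input            clk,
  input            serial_in,
  output reg [7:0] q          // reg keeps the output stable between edges
);
  always @(posedge clk)
  begin
    q <= {q[6:0], serial_in}; // logical left shift with insertion
  end
endmodule
```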


Tasks and functions make module implementation clearer. Both are best used when redundant code or complex actions need to be split up from the main source. There are some differences between tasks and functions.

A task can call other tasks and functions, while a function can call only other functions. A task does not return a value; it modifies a variable that is passed to it as an output. Passing items to a task is also optional. Functions, on the other hand, must return one and only one value and must have at least one value passed to them to be valid. Tasks are well suited for test benches because they can hold delay and control statements. Functions, however, have to run within one time unit, which means they should not be used for test benches or simulations that require delays or use sequential designs. Experiment with both; these constructs are genuinely helpful.

There is one cardinal rule when using a function or task: it must be defined within the module—unlike VHDL, where functions are defined in a package for maximum flexibility. Tasks and functions can, however, be written in a separate file and attached to a module with an include statement, which lets you reuse code within a project or across multiple projects. Both tasks and functions can use types other than binary for their input and output ports, giving you even more flexibility.

Listing 2 contains a function that essentially acts like a basic ALU. Depending on what is passed in, the function processes the information and returns the calculated integer value. Tasks work in much the same way, though the structure differs a little when dealing with inputs and outputs. As noted before, one of the major differences between a task and a function is that the former can have multiple outputs rather than just one, which lets you make a task more complicated internally if need be.
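In the spirit of the ALU-style function the article describes (this is a hedged sketch with invented names, not the article's Listing 2):

```verilog
module alu_demo;
  // A function must be defined inside the module and returns exactly one value.
  function integer mini_alu;
    input integer a, b;
    input [1:0]   op;
    begin
      case (op)
        2'b00:   mini_alu = a + b;
        2'b01:   mini_alu = a - b;
        2'b10:   mini_alu = a & b;
        default: mini_alu = a | b;
      endcase
    end
  endfunction

  integer r;
  initial r = mini_alu(6, 3, 2'b00); // r = 9
endmodule
```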

Listing 3 is an example of a task in action with more than one output. Note that it is used the same way as a function: it has to be defined and called within the module to work. Rather than defining the task explicitly within the module, however, the listing defines it in a separate file and adds an include directive in the module code—just to show how functions and tasks can be defined outside a module and made available for other modules to use.
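One way the separate-file pattern might look (file and task names are invented for illustration):

```verilog
// ---- min_max.vh (a separate file) ----
task min_max;
  input  integer a, b;
  output integer lo, hi;  // a task may return through several outputs
  begin
    lo = (a < b) ? a : b;
    hi = (a < b) ? b : a;
  end
endtask

// ---- in the module's source file ----
module stats_demo;
  `include "min_max.vh"   // the task still ends up defined inside the module
  integer lo, hi;
  initial min_max(7, 3, lo, hi); // lo = 3, hi = 7
endmodule
```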


If too much is added to a module, it can become so large that debugging and editing become a chore. An oversized module also minimizes code reuse, to the point where new counters and state machines get recreated when small modules or functions from a previous project would be more than adequate. A good way around these issues is to create multiple modules—in the same file or across several files—and instantiate them within an upper-level module to use their abilities. Multiple modules are good to have in a pipelined system, since the same kind of module can serve in several areas of the system. Older modules can be reused this way as well, so less time is spent on constant recreation.

That is the idea of code reuse in a nutshell. Now consider an example of code reuse and multiple modules: the shift register from Listing 1 feeds its data into an even-parity generator, and the results from both modules are output through the top-level module in Listing 4. Everything is shown in one listing for easier reading, although in practice it would span multiple files. In any modular design, there is always a module called the top-level entity, where all of the inputs and outputs of a system connect to the physical world. It is also where lower-level entities are spawned—and subordinates can spawn entities below themselves as well (see Figure 1).
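A self-contained sketch of that hierarchy (module and signal names are mine, not the article's Listing 4):

```verilog
// Shift register: shifts left, inserting serial_in on each rising edge.
module parity_shift8 (
  input            clk,
  input            serial_in,
  output reg [7:0] q
);
  always @(posedge clk)
    q <= {q[6:0], serial_in};
endmodule

// Even-parity generator over the byte.
module even_parity (
  input [7:0] data,
  output      p
);
  assign p = ^data; // reduction XOR of all eight bits
endmodule

// Top-level entity: spawns the lower-level modules and wires them up.
module top (
  input        clk,
  input        serial_in,
  output [7:0] data,
  output       parity
);
  parity_shift8 u_shift  (.clk(clk), .serial_in(serial_in), .q(data));
  even_parity   u_parity (.data(data), .p(parity));
endmodule
```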

Think of it as a large black box containing smaller black boxes connected with wires, where those smaller black boxes hold either logic or even smaller black boxes. Pretty neat, but it can get annoying. Imagine a memory controller built for 10-bit addressing whose address length must be extended to 16 bits: that can be a lot of files to search through to change 10 to 16. With parameters, all that needs to change is one value in one file.


Parameters are great to have around in Verilog and make code reuse even more attractive. A parameter lets a word take the place of a numerical value, like #define in C, but with extra features such as overriding. Parameters can be used in length descriptors, making it easy to change the size of an output, input, or variable. For example, if a VGA generator with 8-bit color depth needed to move to 32-bit color, then instead of editing every location where the value occurs, only the parameter’s value would change; recompile the module and it can display 32-bit color. The same goes for memory controllers and other modules with ports, wires, or registers of one bit or more. Parameters can also be overridden, either just before or when a module is instantiated. This is helpful when a module must stay the same across separate projects sharing the same source but needs to differ slightly for one of them. Parameters can be used in functions and tasks as well, as long as the parameter lives in the same file as the implementation code. Parameters with functions and tasks give Verilog something like the flexibility of a VHDL package—granted, it isn’t really a package, because the implementation sits in a module rather than a separate construct.

There are several ways to override parameters. One is the defparam keyword, which explicitly changes the parameter’s value in the instantiated module before it is invoked. Another is overriding the parameter as the module is invoked. Listing 5 shows both, using dummy modules with predefined parameters. The defparam method comes from an older version of the language, so pick the method that matches the Verilog version you’re using.
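Both override styles can be sketched with an invented register module (this is illustrative, not the article's Listing 5):

```verilog
// A register whose width is set by a parameter.
module regN #(parameter WIDTH = 8) (
  input                  clk,
  input      [WIDTH-1:0] d,
  output reg [WIDTH-1:0] q
);
  always @(posedge clk) q <= d;
endmodule

module param_demo (
  input         clk,
  input  [15:0] d16,
  output [15:0] q16,
  input  [9:0]  d10,
  output [9:0]  q10
);
  // 1) override at instantiation time
  regN #(.WIDTH(16)) r0 (.clk(clk), .d(d16), .q(q16));

  // 2) defparam, the older style
  regN r1 (.clk(clk), .d(d10), .q(q10));
  defparam r1.WIDTH = 10;
endmodule
```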

Download the entire article.

Rad Tolerant 3.3-V CAN Transceivers for Satellite Communications

Intersil Corp. recently announced the industry’s first radiation-tolerant, 3.3-V controller area network (CAN) transceivers that are fully QML-V qualified and compliant with the ISO11898-2 physical layer standard. The three new ISL7202xSEH CAN transceivers provide reliable serial data transmission between a CAN controller and the CAN bus at speeds up to 1 Mbps. Up to 120 of Intersil’s ISL7202xSEH transceivers can be connected to a single CAN bus to reduce cabling/harness size, weight and power (SWAP) costs. This weight and mass reduction of up to 18% allows system engineers to add millions of dollars in satellite functionality and eliminate the extra cabling and tradeoffs associated with current point-to-point interface solutions.

The ISL72026SEH, ISL72027SEH, and ISL72028SEH 3.3-V CAN transceivers deliver ultra-high performance in the most demanding environments by leveraging Intersil’s proprietary silicon on insulator process, which provides single event latch-up (SEL) and single event burn-out (SEB) robustness in heavy ion environments. With the emergence of all-electric propulsion satellites that maximize payload but take longer to reach final orbit, customers require higher total dose testing for mission assurance. Intersil’s CAN transceivers are low dose rate tested up to 75 krad on a wafer-by-wafer basis, and apply single event transient (SET) mitigation techniques to reduce system level bit error rates, providing predictable performance. They are also “cold spare” redundant capable, allowing the connection of additional unpowered transceivers to the CAN bus. This mission-critical capability maximizes system life.

The ISL7202xSEH family offers a number of unique features. The ISL72026SEH includes a loopback test capability that allows node diagnostics and reporting while the system is transmitting data. The ISL72027SEH offers split termination output using the VREF pin to provide a VCC/2 output reference, which improves network electromagnetic compatibility and stabilizes the bus voltage, preventing it from drifting to a high common-mode voltage during inactive periods. The ISL72028SEH includes a low-power shutdown mode that switches off the driver and receiver, drawing just 50 µA for power conservation.

Key features and specs:

  • Electrically screened to SMD 5962-15228, and compatible with ISO11898-2
  • Delivers 4-kV human body model (HBM) ESD protection on all pins
  • 3- to 3.6-V supply range, –7 to 12 V common-mode input voltage range, 5-V tolerant logic inputs, and bus pin fault protection to ±20-V terrestrial and ±18 V in orbit
  • Cold spare powered down devices do not affect active devices operating in parallel
  • Three selectable driver rise and fall times
  • Glitch free bus I/O during power-up and power-down
  • Full fail-safe receiver: open, short, terminated/undriven
  • Hi Z input allows for 120 nodes on the bus and data rates up to 1 Mbps
  • Low quiescent supply current of 7 mA
  • Thermal shutdown
  • Low dose rate (0.01 rad(Si)/s) radiation tolerance of 75 krad(Si)
  • SEL/B immune up to LET 60 MeV·cm²/mg

The ISL72026SEH, ISL72027SEH, and ISL72028SEH 3.3-V CAN transceivers are available in eight-lead ceramic flatpack packages.

Source: Intersil Corp.

STMicro’s New Advanced 32-Bit Secure Microcontroller

STMicroelectronics has introduced the first member of the third generation of its ST33 series of secure microcontrollers, based on the 32-bit ARM SecurCore SC300 processor. The ST33J2M0, which provides 2 MB of flash program memory, is intended for secure applications including embedded Secure Element (eSE), Single Wire Protocol (SWP) SIMs for NFC applications, and embedded Universal Integrated Circuit Card (UICC). Integrated crypto-accelerators, together with the industry’s fastest clock speed in a secure microcontroller, enable fast application execution. The device also features a new hardware architecture with strong, multiple fault-protection mechanisms covering the CPU, memories, and buses to facilitate the development of highly secure software.

The ST33J2M0 features multiple hardware accelerators for advanced cryptographic functions. The EDES peripheral provides a secure Data Encryption Standard (DES) algorithm implementation, while the NESCRYPT crypto-processor efficiently supports the public key algorithm. The AES peripheral ensures secure and fast AES algorithm implementation.

ST33J2M0 samples are available as wafers or housed in VQFN and WLCSP packages.

Connected Home Solutions with ZigBee and Thread-Ready Connectivity

Silicon Labs recently introduced a series of comprehensive reference designs that reduce time to market and simplify the development of ZigBee-based home automation, connected lighting, and smart gateway products. The first in a series of Internet of Things (IoT) solutions, the new reference designs include hardware, firmware, and software tools for developing high-quality connected home solutions based on Silicon Labs’s ZigBee “Golden Unit” Home Automation (HA 1.2) software stack and ZigBee SoC mesh networking technology.

Silicon Labs’ ZigBee connected lighting reference designs feature wireless lighting boards and a plug-in demo board. The Golden Unit ZigBee stack allows LED lights to reliably join, interoperate, and leave a mesh network. The connected lights can support white, color temperature tuning, and RGB color settings as well as dimming.

Silicon Labs’ ZigBee-based home automation reference designs include a capacitive-sense dimmable light switch and a small door/window contact sensor. The light switch provides color, color tuning, and dimming control capabilities. As opposed to conventional switches, the wireless, battery-powered switches have no moving parts and are easy to place anywhere in a home. The switch design includes Silicon Labs’s EFM8 capacitive sensing microcontroller to detect different user gestures (touch, hold, and swipe). The contact sensor reference design provides all the tools needed to create wireless, battery-powered sensors used to monitor door and window positions (open or closed).

Silicon Labs offers two ZigBee gateway options to complement the reference designs:

  • A plug-and-play USB virtual gateway that works with any PC development platform and supports the Windows, OS X, and Linux environments as a virtual machine
  • An out-of-the-box Wi-Fi/Ethernet gateway reference design based on an embedded Linux computer system

Both gateway options enable you to control and monitor ZigBee HA 1.2-compliant end nodes over Wi-Fi from any device with a web browser, such as a smartphone or tablet. With an intuitive, web-based user interface, you can easily create rules between ZigBee end devices including lights, dimmable light switches, and contact sensors.

Silicon Labs’ connected lighting, home automation, and smart gateway reference designs are currently available. The RD-0020-0601 and RD-0035-0601 connected lighting reference designs cost $49. The RD-0030-0201 contact sensor reference design is $39. The RD-0039-0201 capacitive-sense dimmable light switch reference design is $29. The USB virtual gateway is $49. The out-of-the-box Wi-Fi/Ethernet gateway reference design is $149.

Source: Silicon Labs