The Future of Automation

The robot invasion isn’t coming. It’s already here. One would be hard-pressed to find anything in modern “industrialized” society that doesn’t rely on a substantial level of automation during its life cycle—whether in its production, consumption, use, or (most probably) all of the above. Regardless of the definition du jour, “robots” are indeed well on their way to taking over—and not in the terrifying, apocalyptic, “Skynet” kind of way, but in a way that will universally improve the human condition.

Of course, the success of this r/evolution relies on an almost incomprehensible level of compounding innovations and advancements accompanied by a mountain of associated technical complexities and challenges. The good news is many of these challenges have already been addressed—albeit in a piecemeal manner—by focused professionals in their respective fields. The real obstacle to progress, therefore, ultimately lies in the compilation and integration of a variety of technologies and techniques from heterogeneous industries, with the end goal being a collection of cohesive systems that can be easily and intuitively implemented in industry.

Two of the most promising and critical aspects of robotics and automation today are human-machine collaboration and flexible manufacturing. Interestingly (and, perhaps, fortuitously), their problem sets are virtually identical, as the functionality of both inherently revolves around constantly changing and wildly unpredictable environments and tasks. These machines, therefore, have to adapt readily to, and remain continuously "aware" of, their surroundings, not only to maintain a high level of performance but also to perform safely and reliably at all times.

Not unlike humans, machines rely on their ability to collect, analyze, and act on external data, oftentimes in a deterministic fashion—in other words, certain things must happen in a pre-defined amount of time for the system to perform as intended. These data can range from the very slow and simple (e.g., calculating temperature by reading the voltage of a thermocouple once a second) to the extremely fast and complex (e.g., running control loops for eight brushless electric motors 25,000-plus times a second). Needless to say, giving a machine the ability to perceive—be it through sight, sound, and/or touch—and act on its surroundings in “real time” is no easy task.


Computer vision (sight and perception), speech recognition (sound and language), and precision motion control (touch and motor skills) are things most people take for granted, as they are collectively fundamental to survival. Machines, however, are not "born" with these abilities, nor have they evolved them. Pile on additional layers of complexity, like communication and the ability to acquire knowledge and learn new tasks, and it becomes menacingly apparent how substantial the challenge of creating intelligent, connected automated systems really is.

While the laundry list of requirements might seem nearly impossible to address, the tools used for integrating these exceedingly complex systems have fortunately undergone their own period of hypergrowth in the last several years. In much the same way that developers, researchers, engineers, and entrepreneurs have picked off industry- and application-specific problems related to the aforementioned technical hurdles, so too have the people behind the hardware and software made it possible for these independently developed, otherwise standalone solutions to be combined and interwoven into truly world-changing innovations.

For developers, only in the last few years has it become practical to leverage the combination of embedded technologies like the power-efficient, developer-friendly mobile application processor with the flexibility and raw “horsepower” of programmable logic (i.e., field-programmable gate arrays, which have historically been reserved for the aerospace/defense and telecommunication industries) at scales never previously imagined. And with rapidly growing developer communities, the platforms built around these technologies are directly facilitating the advancement of automation, and doing it all under a single silicon “roof.” There’s little doubt that increasing access to these new tools will usher in a more nimble, intelligent, safe, and affordable wave of robotics.

Looking forward, automation will undoubtedly play an increasingly central role in day-to-day life. As a result, careful consideration must be given to facilitating human-machine (and machine-machine) collaboration in order to accelerate innovation and manage the technical and societal impacts bred from the disruption of the status quo. The pieces are already there; now it's time to assemble them.

This article appears in Circuit Cellar 320.

Ryan Cousins is cofounder and CEO of krtkl, Inc. ("critical"), a San Francisco-based embedded systems company. The krtkl team created snickerdoodle—an affordable and highly reconfigurable platform for developing mechatronic, audio/video, computer vision, networking, and wireless communication systems. Ryan has a BS in mechanical engineering from UCLA. He has experience in R&D, project management, and business development in the medical and embedded systems industries.

Member Profile: Walter O. Krawec

Walter O. Krawec

Upstate New York

Research Assistant and PhD Student, Stevens Institute of Technology

Walter has been reading Circuit Cellar since he got his first issue in 1999. Free copies were available at the Trinity College Fire Fighting Robot Contest, which was his first experience with robotics. Circuit Cellar was the first magazine for which he wrote an article (“An HC11 File Manager,” two-part series, issues 129 and 130, 2001).

His interests include robotics, among other things. He is particularly interested in developmental and evolutionary robotics (where the robot's strategies, controllers, and so forth are evolved rather than programmed in directly).

Walter is enjoying his Raspberry Pi. “What a remarkable product! I think it’s great that I can take my AI software, which I’ve been writing on a PC, copy it to the Raspberry Pi, compile it with GCC, then off it goes with little or no modification!”

Walter is designing a new programming language and interpreter (for Windows/Mac/Linux, including the Raspberry Pi) that uses a simulated quantum computer to drive a robot. "What better way to learn the basics of quantum computing than by building a robot around one?" The first version of this language is available on his website, and he has plans to release an improved version.

Walter said he is amazed by the power of the latest embedded technology, such as the Raspberry Pi. "For less than $40 you have a perfect controller for a robot that can handle incredibly complex programs. Slap on one of those USB battery packs and you have a fully mobile robot," he said. He used a Pololu Maestro to interface with the motors and analog sensors. "It all works and it does everything I need." However, he added, "If you want to build any of this yourself by hand it can be much harder, especially since most of the cool stuff is surface mount, making it difficult to get started."