32-bit MCUs Optimized for Motor Control in Robotics and More

Renesas Electronics has unveiled the RX66T Group of microcontrollers (MCUs). The chips are the first members of Renesas’ flagship 32-bit RX MCU family based on the new third-generation RXv3 CPU core. The new MCUs leverage advanced CPU core technology to achieve substantially improved performance, as much as 2.5 times better than previous RX family MCUs.

Combining the powerful new RXv3 core with the strengths of the current RX62T and RX63T MCUs, the new RX66T MCUs address the real-time performance and enhanced stability required by inverter control. The new MCUs are ideal for use in industrial applications in next-generation smart factory equipment, such as industrial motors, power conditioners and robots, as well as smart home appliances, including air conditioners and washing machines.

When operating at 160 MHz, the RX66T MCUs achieve best-in-class performance of 928 CoreMark, enabling more precise inverter control. The MCUs can control up to four motors simultaneously, making them well-suited for conventional motor control and applications requiring multi-axis motor control, such as compact industrial robots and personal robots, which are quickly growing in popularity.

In addition, the RX66T’s extra processing capacity allows developers to add programs utilizing embedded AI (e-AI) for motor fault detection. Such programs can detect motor faults and identify fault location in real time based on the motor’s current or vibration characteristics. Providing this capability offers developers the significant value-add of productivity, safety, and quality. The RX66T MCUs also integrate a 5V power supply that delivers excellent noise tolerance.
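
As a rough illustration of the kind of current-signature analysis such a fault-detection program might perform (this is a generic sketch, not Renesas’ e-AI implementation; the read_phase_current_sample() hook, the sample rate, the fault frequency, and the threshold are all illustrative assumptions), the following C fragment extracts two features from a window of phase-current samples, the RMS level and the magnitude of a single fault-related frequency bin computed with the Goertzel algorithm, and flags a fault when their ratio exceeds a threshold:

/* Minimal sketch of current-signature fault detection.  Not the Renesas
 * e-AI implementation: read_phase_current_sample(), the sample rate, the
 * fault frequency, and the threshold are illustrative assumptions.     */
#include <math.h>
#include <stdbool.h>

#define N_SAMPLES      1024        /* samples per analysis window         */
#define SAMPLE_RATE_HZ 20000.0f    /* assumed current-loop sampling rate  */
#define FAULT_FREQ_HZ  37.5f       /* hypothetical fault-related sideband */
#define FAULT_RATIO    0.05f       /* fault magnitude / RMS threshold     */
#define PI_F           3.14159265f

extern float read_phase_current_sample(void);  /* hypothetical ADC hook */

/* Goertzel algorithm: magnitude of one frequency bin of the window. */
static float goertzel_mag(const float *x, int n, float freq_hz, float fs_hz)
{
    const float coeff = 2.0f * cosf(2.0f * PI_F * freq_hz / fs_hz);
    float s_prev = 0.0f, s_prev2 = 0.0f;

    for (int i = 0; i < n; i++) {
        float s = x[i] + coeff * s_prev - s_prev2;
        s_prev2 = s_prev;
        s_prev  = s;
    }
    return sqrtf(s_prev * s_prev + s_prev2 * s_prev2
                 - coeff * s_prev * s_prev2);
}

/* Collect one window of phase current and decide whether the energy at
 * the fault frequency is abnormally large relative to the RMS current. */
bool motor_fault_detected(void)
{
    static float window[N_SAMPLES];
    float sum_sq = 0.0f;

    for (int i = 0; i < N_SAMPLES; i++) {
        window[i] = read_phase_current_sample();
        sum_sq += window[i] * window[i];
    }

    float rms = sqrtf(sum_sq / (float)N_SAMPLES);
    float fault_mag = goertzel_mag(window, N_SAMPLES, FAULT_FREQ_HZ,
                                   SAMPLE_RATE_HZ) / (float)N_SAMPLES;

    return (rms > 0.0f) && (fault_mag / rms > FAULT_RATIO);
}

In a real e-AI deployment, the fixed threshold would typically be replaced by an inference model trained offline and updated from the cloud, but the on-chip feature-extraction stage would look broadly similar.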

With more and more devices ranging from robots and power conditioners to washers and dryers joining the Internet of Things, motorized devices in the field will require online firmware updates throughout their life cycles. Applying e-AI for predictive failure diagnostics requires endpoint MCUs to be securely updated with learning results generated in the cloud. The RX66T MCU Group incorporates Renesas’ Trusted Secure IP (TSIP), which has a track record of CAVP certification and provides secure firmware updates and encrypted communication.

Key Features of the RX66T MCU Group:

  • Supports inverter control with a maximum operating frequency of 160 MHz, 928 CoreMark, on-chip floating-point unit (FPU), and 5V power supply
  • High-speed flash memory with 120 MHz maximum read operation to reduce the speed differential with the CPU, delivering both high performance and consistent execution
  • Reduces footprint and component count by generating three-phase complementary pulse width modulation (PWM) output for up to four motors using 112-pin and 144-pin package MCUs, and up to three motors using 64-pin, 80-pin and 100-pin package MCUs (see the dead-time sketch after this list)
  • Configurations available with 16 KB of error correction code (ECC) SRAM, and up to 128 KB of SRAM with single-cycle access and single-bit error detection (parity checking) for high reliability
  • Ability to generate high-resolution PWM signals with a minimum state change duration of 195 picoseconds (1.6 times better than existing RX products) for power conditioner or digital power supply control applications
  • Renesas’ Trusted Secure IP (TSIP) provides secure firmware updates and encrypted communication with a track record of CAVP certification
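
To make the complementary-PWM item above more concrete, here is a minimal, vendor-neutral C sketch of how a duty-cycle request for one phase can be turned into high-side and low-side compare values with dead-time insertion on a center-aligned (up/down counting) timer. It does not use the actual RX66T timer registers; pwm_set_compare() and the tick constants are assumptions, and the RX66T’s motor-control timers perform this dead-time insertion in dedicated hardware.

/* Vendor-neutral sketch of complementary PWM compare-value generation
 * with dead-time insertion for a center-aligned (up/down counting) timer.
 * Assumed semantics: the high-side switch is driven while the counter is
 * below high_cmp, the low-side switch while the counter is at or above
 * low_cmp.  pwm_set_compare() and the constants below are assumptions;
 * the RX66T's motor-control timers implement this in hardware.          */
#include <stdint.h>

#define HALF_PERIOD_TICKS 4000u /* counter peak: 20 kHz carrier at 160 MHz  */
#define DEAD_TIME_TICKS    160u /* dead time per switching edge: 1 us at 160 MHz */

/* Hypothetical hook that writes one phase's high/low compare registers. */
extern void pwm_set_compare(int phase, uint16_t high_cmp, uint16_t low_cmp);

/* duty: requested high-side duty cycle for this phase, 0.0 .. 1.0 */
void set_phase_duty(int phase, float duty)
{
    if (duty < 0.0f) duty = 0.0f;
    if (duty > 1.0f) duty = 1.0f;

    uint32_t on_ticks  = (uint32_t)(duty * HALF_PERIOD_TICKS);
    uint32_t half_dead = DEAD_TIME_TICKS / 2u;

    /* Turn the high side off a little early and the low side on a little
     * late, so both switches of the half-bridge are never on at once.    */
    uint32_t high_cmp = (on_ticks > half_dead) ? on_ticks - half_dead : 0u;
    uint32_t low_cmp  = on_ticks + half_dead;
    if (low_cmp > HALF_PERIOD_TICKS) low_cmp = HALF_PERIOD_TICKS;

    pwm_set_compare(phase, (uint16_t)high_cmp, (uint16_t)low_cmp);
}

A three-phase inverter repeats this for each of its three half-bridges with duties produced by the current-control loop, so driving four motors means keeping twelve such channels updated every carrier period.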

The Renesas Motor Workbench 2.0 development support tool, which supports 20 kHz real-time debugging and adds 10 new functions, and an RX66T CPU card for the 24V Motor Control Evaluation Kit are both available now.

The new RX66T Group comprises 80 MCUs with pin counts ranging from 64 to 144 and on-chip flash memory sizes from 256 KB to 1,024 KB. Mass production starts today for the widely used 100-pin package MCU with 256 KB or 512 KB of program flash and 64 KB of SRAM. Other versions will be released over time. Pricing for the RX66T MCU Group starts at $3.25 per unit in 10,000-unit quantities.

Renesas Electronics | www.renesas.com

The Future of Intelligent Robots

Robots have been around for over half a century now, making constant progress in terms of their sophistication and intelligence levels, as well as their conceptual and literal closeness to humans. As they become smarter and more aware, it becomes easier to get closer to them both socially and physically. That leads to a world where robots do things not only for us but also with us.

Not-so-intelligent robots made their debut in factory environments in the late ’50s. Their main role was merely to handle tasks that humans were either not very good at or that were dangerous for them. Traditionally, these robots have had very limited sensing; they have essentially been blind despite being extremely strong, fast, and repeatable. Given the likely consequences of letting humans wander freely in the close vicinity of these strong, fast, and blind machines, it seemed a good idea to isolate them from their environment by placing them in safety cages.

Advances in the fields of sensing and compliant control made it possible to get a bit closer to these robots, again both socially and physically. Researchers have started proposing frameworks that would enable human-robot collaborative manipulation and task execution in various scenarios. Bi-manual collaborative manufacturing robots like YuMi by ABB and service robots like HERB by the Personal Robotics Lab of Carnegie Mellon University[1] have started emerging. Various modalities of learning from demonstration and programming by demonstration, such as kinesthetic teaching and imitation, make it very natural to interact with these robots and teach them the skills and tasks we want them to perform, much the way we teach a child. For instance, the Baxter robot by Rethink Robotics heavily utilizes these capabilities and technologies to potentially bring a teachable robot to every small company with basic manufacturing needs.

As robots get smarter, more aware, and safer, it becomes easier to socially accept and trust them as well. This reduces the physical distance between humans and robots even further, leading to assistive robotic technologies, which literally “live” side by side with humans 24/7. One such project is the Assistive Dexterous Arm (ADA)[2] that we have been carrying out at the Robotics Institute and the Human-Computer Interaction Institute of Carnegie Mellon University. ADA is a wheelchair-mountable, semi-autonomous manipulator arm that utilizes the sliding autonomy concept to assist people with disabilities in performing their activities of daily living. Our current focus is on assistive feeding, where the robot is expected to help users eat their meals in a very natural and socially acceptable manner. This requires the ability to predict the user’s behaviors and intentions as well as spatial and social awareness to avoid awkward situations in social eating settings. Safety also becomes our utmost concern, as the robot has to be very close to the user’s face and mouth during task execution.

In addition to assistive manipulators, there have also been giant leaps in the research and development of smart and lightweight exoskeletons that make it possible for paraplegics to walk by themselves. These exoskeletons make use of the same set of technologies, such as compliant control, situational awareness through precise sensing, and even learning from demonstration to capture the walking patterns of a healthy individual.

These technologies, combined with recent developments in neuroscience, have made it possible to get even closer to humans than an assistive manipulator or an exoskeleton can, and literally unite with them through intelligent prosthetics. An intelligent prosthetic limb uses learning algorithms to map the received neural signals to the user’s intentions as the user’s brain constantly adapts to the artificial limb. It also needs to be highly compliant to handle the vast variance and uncertainty of the real world, and, of course, to keep its user safe.

Extrapolating from the aforementioned developments and many others, we can easily say that robots are going to be woven into our lives. Laser technology used to be unreachable and cutting-edge from an average person’s perspective a couple of decades ago. However, as Rodney Brooks notes in his book Robot: The Future of Flesh and Machines (Penguin Books, 2003), we no longer know exactly how many laser devices we have in our houses, and more importantly, we don’t even care! That will be the case for robots. In the not-so-distant future, we will be enjoying the ride in our autonomous vehicles while nanobots in our bloodstreams deliver drugs and fix problems, and we will feel good knowing that our older relatives are getting great care from their assistive companion robots.

[1] http://www.cmu.edu/herb-robot/
[2] https://youtu.be/glpCAdKEWAA

Tekin Meriçli, PhD, is a well-rounded roboticist with in-depth expertise in machine intelligence and learning, perception, and manipulation. He is currently a Postdoctoral Fellow at the Human-Computer Interaction Institute at Carnegie Mellon University, where he leads efforts to build intuitive and expressive interfaces for interacting with semi-autonomous robotic systems intended to assist the elderly and people with disabilities. Previously, he was a Postdoctoral Fellow at the National Robotics Engineering Center (NREC) and the Personal Robotics Lab of the Robotics Institute at Carnegie Mellon University. He received his PhD in Computer Science from Bogazici University, Turkey.

This essay appears in Circuit Cellar 298, May 2015.