Machine vision is a field of electrical engineering that’s changing how we interact with our environment, as well as how machines communicate with each other. Circuit Cellar has been publishing articles on the subject since the mid-1990s. The technology has come a long way since then. But it’s important (and exciting) to regularly review past projects to learn from the engineers who paved the way for today’s ground-breaking innovations.
In Circuit Cellar 92, a team of engineers (Bill Bailey, Jon Reese, Randy Sargent, Carl Witty, and Anne Wright) from Newton Labs, a pioneer in robot engineering, introduced readers to the M1 color-sensitive robot. The robot’s main functions were to locate and carry tennis balls. But as you can imagine, the underlying technology was also used to do much more.
The engineering team writes:
Machine vision has been a challenge for AI researchers for decades. Many tasks that are simple for humans can only be accomplished by computers in carefully controlled laboratory environments, if at all. Still, robotics is benefiting today from some simple vision strategies that are achievable with commercially available systems.
In this article, we fill you in on some of the technical details of the Cognachrome vision system and show its application to a challenging and exciting task—the 1996 International AAAI Mobile Robot Competition in Portland, Oregon… In 1996, the contest was for an autonomous robot to collect 10 tennis balls and 2 quickly and randomly moving, self-powered squiggle balls and deliver them to a holding pen within 15 min.
In M1’s IR sensor array, each LED is fired in turn and detected reflections are latched into a single byte by the 74HC259.
At the time of the conference, we had already been manufacturing the Cognachrome for a while and saw this contest as an excellent way to put our ideas (and our board) to the test. We outfitted a general-purpose robot called M1 with a Cognachrome and a gripper and wrote software for it to catch and carry tennis balls… M1 follows the wall using an infrared obstacle detector. The code drives two banks of four infrared LEDs one at a time, each modulated at 40 kHz.
The left half of M1’s infrared sensor array is composed of a Sharp GP1U52X infrared detector sandwiched between four infrared LEDs.
Two standard Sharp GP1U52X infrared remote-control reception modules detect reflections. The 74HC163/74HC238 combination fires each LED in turn, and the ’HC259 latches detected reflections. This system provides reliable obstacle detection in the 8–12″ range.
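The scan-and-latch sequence described above can be sketched in C. This is a hypothetical software model, not the M1’s actual firmware: the 74HC163 counter, 74HC238 decoder, and GP1U52X detector are replaced by a caller-supplied probe function, while the loop reproduces the fire-one-LED, read-detector, latch-one-bit pattern that the ’HC259 performs in hardware.

```c
#include <stdint.h>

/* Hypothetical sketch of the Cognachrome-era scan logic: a counter steps
 * through the eight LED positions, one LED fires per step, and the
 * detector output is latched into the matching bit of a result byte
 * (the job the 74HC259 does in hardware). */
typedef int (*ir_probe_fn)(int direction); /* returns 1 if a reflection is seen */

uint8_t ir_scan(ir_probe_fn probe)
{
    uint8_t latched = 0;
    for (int dir = 0; dir < 8; dir++) {       /* '163 counter: steps 0..7 */
        if (probe(dir))                       /* fire LED dir, read detector */
            latched |= (uint8_t)(1u << dir);  /* '259: latch the bit */
    }
    return latched;
}

/* Demo probe: pretend directions 2 and 3 see a reflection. */
static const int demo_hits[8] = { 0, 0, 1, 1, 0, 0, 0, 0 };
static int demo_probe(int dir) { return demo_hits[dir]; }
```

Calling `ir_scan(demo_probe)` returns `0x0C`, the byte with bits 2 and 3 set, which is the same yes/no-per-direction encoding the article describes.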
The figure above shows the schematic, and the photo shows the IR sensors themselves.
The system provides only yes/no information about obstacles in the eight directions around the front half of the robot. However, M1 can crudely estimate distance to large obstacles (e.g., walls) via patterns in the reflections. The more adjacent directions with detected reflections, the closer the obstacle probably is.
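The distance heuristic above can be sketched as a small C function. This is an illustrative guess at the idea, not the authors’ code: it measures the longest run of adjacent directions that all report a reflection, on the reasoning that a nearby wall lights up several neighboring sensors while a distant or narrow obstacle lights up only one.

```c
#include <stdint.h>

/* Hypothetical proximity heuristic: the longest run of adjacent set bits
 * in the eight-direction reflection byte. A larger run suggests a wide,
 * close obstacle such as a wall. */
int ir_longest_run(uint8_t reflections)
{
    int best = 0, run = 0;
    for (int dir = 0; dir < 8; dir++) {
        if (reflections & (1u << dir)) {
            run++;
            if (run > best)
                best = run;
        } else {
            run = 0;
        }
    }
    return best;
}
```

For example, a single detected reflection (`0x01`) yields 1, while four adjacent reflections (`0x0F`) yield 4, suggesting a large obstacle close by.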