Using PIC32s and IMUs
Simulating human body motion is a key concept in robotics development. With that in mind, learn how three Cornell graduates accurately simulate the movement of a human arm on a small-sized robotic arm. The Microchip PIC32 MCU-based system enables the motion-controlled, 3-DoF robotic arm to take a user’s throwing motion as a reference for its own throw. In this way, they created a robotic arm that can throw a ping pong ball and thus play beer pong.
Because the electronics industry is advancing so rapidly, it’s now easier than ever to build interesting systems without hurting your wallet. This has also led to an overwhelming variety of projects suitable as a final project in a college class. One of the types of systems we were interested in from the start was wearables—but wearables and what? We could attempt to create something meaningful, something that would help humanity. Or not. We opted for something fun for college students—something you could talk about at parties without losing everyone’s attention. That’s how our idea of the “Pong Bot” came to life.
The project’s focus was to simulate the movement of a human arm—aiming and throwing a small object such as a ping pong ball—with a small robotic arm. We built a motion-controlled, 3-DoF (degrees of freedom) robotic arm that takes the user’s throwing motion as a reference for its own throw. A robotic arm that mimics the user’s arm motion has many different applications. For example, it could lift heavy objects that human arms can’t handle, or be operated remotely from a distance.
With all that in mind, we hope that, with small modifications, readers could take the concepts from this project and create something more useful—although perhaps less fun. Integrating both mechanical and electrical components, we set out to control a beer-pong catapult robot that simulates the user’s throwing gesture. With this system, a mini-scale beer pong game can be played using a robotic arm that throws the ping pong ball for you—a fun twist enabling you to play beer pong in style. For those unfamiliar with the game, beer pong is a drinking game in which players throw a ping pong ball across a table with the intent of landing the ball in a cup of beer on the other end.
THE USER INTERFACE
The user interface for our device relies on a sleeve (Figure 1) worn on the user’s arm and adjusted so that the IMUs (inertial measurement units) align with the wrist and elbow. This allows gesture control. The aiming of the catapult and the start position for the throw are determined from the IMU readings. The IMUs deliver two angles from the elbow and one angle from the wrist, giving us the 3 degrees of freedom needed for our robotic arm. By combining the data from the gyroscopes and accelerometers attached to the elbow and wrist, the controller sends out three current angles measured from the calibrated zero. This mapping of the user’s arm to the robotic arm is discussed in a later section, along with the implementation of a complementary filter using the IMU readings.
The controller device itself consists of two IMUs, a Digi International XBee RF module, an inexpensive pressure sensor and a Microchip PIC32 microcontroller (MCU) on the sleeve. The pressure sensor is used as a digital button. When the user pinches the pressure sensor past a certain threshold, the robotic catapult starts to move according to the movement of the user’s arm. This is when the user is expected to aim. As soon as the pressure sensor is released, the robotic arm swings very quickly as it throws the ball—similar to what a catapult would do. Figure 2 shows the development board and the circuitry mounted on the sleeve. The development board used is Sean Carroll’s PIC32 Small Development Board [1]—a link with the details of this resource is provided on the Circuit Cellar article materials webpage.
The release mechanism was designed after initial testing. Because of a short delay between the user’s movement and the movement of the servos, it was difficult to produce a motion rapid enough to throw the ball. For maximum enjoyment, the user is encouraged to make the throwing movement after the pressure sensor has been released. This won’t affect the throw, but it makes it seem more like the user is also controlling the throw itself.
The controller communicates wirelessly with the robotic arm: all the sensors on the sleeve are hooked up to a local PIC32 MCU that sends signals via an RF transmitter. On the robotic arm, an RF receiver picks up the signals from the controller and moves the robotic arm accordingly in real time. Because of what we believe to be hardware constraints with the XBee modules, we were only able to send a signal from the controller to the robotic arm every 200 ms. This means that the servos’ position signals were updated every 200 ms, which sometimes made the robotic arm seem a little shaky and unresponsive. For a more reliable system, our final version didn’t use wireless communication. Instead, we used two very long wires to directly connect the UART pins of the two boards. This modified, final version let us establish a stable and fast communication interval of 65 ms, providing smooth control of the robotic arm. The long wires won’t interfere with the user’s movement as long as the arm is used as intended.
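As a minimal sketch, a fixed-interval packet of this kind could be framed as shown below. The start byte, the checksum and the uart_putc() and delay_ms() helpers are illustrative placeholders rather than our exact routines; any byte-write and delay functions the UART driver provides would serve the same role.

/* Sketch: frame the three joint angles and the pressure flag into a
   small packet sent every 65 ms. uart_putc() and delay_ms() are
   illustrative placeholders for the UART byte-write and delay routines. */
#include <stdint.h>

#define PKT_START 0xA5   /* arbitrary start byte so the receiver can re-sync */

extern void uart_putc(uint8_t b);    /* placeholder: write one byte to UART */
extern void delay_ms(uint32_t ms);   /* placeholder: blocking delay         */

static void send_packet(int8_t elbow_z, int8_t elbow_y,
                        int8_t wrist_z, uint8_t throw_flag)
{
    uart_putc(PKT_START);
    uart_putc((uint8_t)elbow_z);
    uart_putc((uint8_t)elbow_y);
    uart_putc((uint8_t)wrist_z);
    uart_putc(throw_flag);
    /* simple additive checksum lets the receiver drop corrupted packets */
    uart_putc((uint8_t)(elbow_z + elbow_y + wrist_z + throw_flag));
}

/* in the sleeve's main loop: send_packet(...); delay_ms(65); */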
ROBOTIC ARM
Due to time and budget constraints, we had to get creative with our materials and assembly. Most of the materials were collected in a cafeteria during a break from working on the project. The assembled robotic arm is shown in Figure 3. Despite the commonplace materials, it behaves just as we intended and serves its purpose. The system requires three servos to translate the IMU readings to motion in different axes. We constrained each servo so that it can’t exceed the corresponding joint’s physical limits. In other words, if a human elbow can’t provide 360 degrees of rotation, our robotic arm shouldn’t be able to do that either. Servo 1 rotates approximately 180 degrees, and Servo 2 and Servo 3 rotate 40 degrees and 90 degrees, respectively.
The most sophisticated component in our robotic arm was the 3D-printed base used to support Servo 1. We realized at the outset that we needed a solid, stable base to prevent movement from the other servos from destabilizing our system. The 3D rendering of the base is shown in Figure 4. Servo 1, which rotates the base, was fitted into a laser-cut casing to minimize jittering. Later, we attached the base casing to a plank of wood for greater stability, because the arm’s center of gravity is offset and would otherwise unbalance the base. Then Servo 2 was screwed into a 3D-printed mold that fit right into Servo 1. The arms of the robot were extended with wooden coffee sticks. Finally, the holder, or bowl, for the ball was made from a plastic spoon. The “shoulder joint” was replicated by Servos 1 and 2. The elbow movement was replicated by Servo 3, which was attached at the end of the stick extending from Servo 2. The servos were connected to a board, which was connected to the PIC32 MCU used for the robotic arm station (Figure 5). The development board used here is Sean Carroll’s PIC32 Large Development Board [2]—a link with the details of this resource is provided on the Circuit Cellar article materials webpage.
COMPONENT BREAKDOWN
The IMUs used in this project were MPU6050 units from TDK InvenSense, which use the I2C protocol to communicate with the host device. The PIC32 provides two I2C channels. However, we only used one, because the MPU6050 has a selectable address bit (0 or 1) that is used to talk to a specific IMU on the shared bus. We used a helper file, i2c_helper.h, from another project that also communicated successfully with this sensor over I2C [3].
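As a minimal sketch, reading one of the two IMUs on the shared bus could look like the code below. The i2c_read_regs() helper is an illustrative placeholder for the burst-read routine in i2c_helper.h; the 0x68/0x69 addresses and the 0x3B register offset come from the MPU6050 datasheet.

/* Sketch: two MPU6050s share one I2C bus; the AD0 pin selects the
   7-bit address (0x68 or 0x69). i2c_read_regs() is a placeholder for
   the burst-read routine in i2c_helper.h. */
#include <stdint.h>

#define MPU_ADDR_LOW     0x68   /* AD0 tied low                      */
#define MPU_ADDR_HIGH    0x69   /* AD0 tied high                     */
#define REG_ACCEL_XOUT_H 0x3B   /* first of six accel data registers */

extern void i2c_read_regs(uint8_t dev_addr, uint8_t reg,
                          uint8_t *buf, int n);   /* placeholder */

static void read_accel(uint8_t dev_addr,
                       int16_t *ax, int16_t *ay, int16_t *az)
{
    uint8_t raw[6];
    i2c_read_regs(dev_addr, REG_ACCEL_XOUT_H, raw, 6);
    /* data registers are big-endian: high byte first */
    *ax = (int16_t)((raw[0] << 8) | raw[1]);
    *ay = (int16_t)((raw[2] << 8) | raw[3]);
    *az = (int16_t)((raw[4] << 8) | raw[5]);
}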
When we started experimenting with the data being collected, we faced a few problems with accuracy and drift. This is a common issue with IMUs, and it was remedied by sensor fusion: using the information from both types of sensors in the IMU unit to correct the error. The Kalman filter is the standard method for solving these problems, but it is computationally heavy and at times difficult to implement. A simpler algorithm is the complementary filter, which is easier to implement and is “often applied in systems of limited resources,” such as this project [4].
The complementary filter provided a simple way of getting accurate data and reducing drift. This was accomplished by combining the gyroscope and accelerometer data on each axis. This approach proved to be an efficient, computationally lightweight alternative to a Kalman filter for our system. We implemented the complementary filter on the angle data of each joint. On the PIC32 attached to the sleeve, we extracted the most recent accelerometer and gyroscope values, scaled both of them and then processed them through our algorithm.
Listing 1 shows a snippet of the C code our system uses to calculate the “elbow” angle from the sleeve. The previous angle estimate, advanced by the integrated gyroscope reading, is weighted by the constant COMP_FILTER_G_COEF, and the angle derived from the accelerometer is weighted by COMP_FILTER_A_COEF; the sum of the two terms is the final, filtered angle. The values of COMP_FILTER_G_COEF and COMP_FILTER_A_COEF (0.98 and 0.02, respectively) weight the gyroscope and accelerometer differently for two reasons. First, the gyroscope drifts considerably when the IMU is not detecting movement. Second, the accelerometer is easily disturbed, causing spikes in its readings, especially during movement, but its data is useful for counteracting the gyroscope’s drift. These coefficients can be adjusted, but 0.98 and 0.02 worked fairly well for us.
// Elbow IMU: complementary filter for the elbow tilt angle
// Tilt angle derived from the accelerometer, in degrees
accTilty_elbow = -atan2f(xAccel_elbow, zAccel_elbow) * 180.0 / M_PI;
// Weighted sum: gyro rate integrated over the read period (ms scaled
// to seconds), plus the accelerometer tilt
tilty_elbow = COMP_FILTER_G_COEF * (tilty_elbow + yGyro_elbow * IMU_READ_PERIOD * 0.001)
            + COMP_FILTER_A_COEF * accTilty_elbow;
LISTING 1 – This snippet shows the C code used to accurately calculate the elbow angle for the robotic arm. The variable tilty_elbow is passed to the robotic arm.
The pressure sensor is a variable resistor. To use it as a digital button, we built a simple voltage divider circuit and set a threshold: any time the output of the circuit exceeded the threshold, we set a flag. We used the ADC on the PIC32 to read the voltage and detect when to follow the arm and when to release the ball (launch the catapult).
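A minimal sketch of that logic follows. The adc_read() helper is a placeholder for the PIC32 ADC read routine, and the threshold value is something to tune for the specific sensor and divider, not the value we used.

/* Sketch: treat the force-sensor voltage divider as a digital button.
   adc_read() is a placeholder for the 10-bit ADC read; the threshold
   is a tuning value. */
#include <stdint.h>

#define PRESS_THRESHOLD 512u    /* placeholder: tune for the divider */

extern uint16_t adc_read(void); /* placeholder: 10-bit ADC sample    */

static uint8_t pinched = 0;

void poll_pressure(void)
{
    /* flag is set while the user pinches (aiming); a 1 -> 0 transition
       triggers the throw on the robotic-arm side */
    pinched = (adc_read() > PRESS_THRESHOLD) ? 1u : 0u;
}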
When we started debugging and testing our system, we needed a more reliable communication channel between the PIC32 MCUs than RF, because we also had to debug the RF link itself. For this, we used UART between the PIC32s. This was convenient because our RF modules also use UART to send data, so, in theory, we could treat them as a transparent UART bridge.
The servo motor controls were all written on the PIC32 MCU hooked up to the robotic arm. Its purpose was to extract all the data sent from the PIC32 MCU on the sleeve and turn them into PWM signals. Four variables were extracted from the PIC32 on the sleeve: the tilt of the elbow about the Z axis, the tilt of the elbow about the Y axis, the tilt of the wrist about the Z axis, and a flag indicating whether or not the pressure sensor was pressed.
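As a rough sketch, the receiving side can clamp each angle to its servo’s range (the 180/40/90-degree limits described earlier) and map it to a pulse width. The set_servo_pulse_us() helper is a placeholder for the output-compare PWM update on the PIC32, and the 1000-to-2000 µs span is the typical range for SG90-class hobby servos.

/* Sketch: clamp a joint angle to the servo's mechanical range and map
   it to a hobby-servo pulse width. set_servo_pulse_us() stands in for
   the output-compare PWM update. */
#include <stdint.h>

extern void set_servo_pulse_us(int servo, uint16_t us);  /* placeholder */

static const int16_t max_deg[3] = { 180, 40, 90 };  /* Servo 1-3 limits */

void update_servo(int servo, int16_t angle_deg)     /* servo index 0..2 */
{
    /* clamp to the joint's physical range */
    if (angle_deg < 0)               angle_deg = 0;
    if (angle_deg > max_deg[servo])  angle_deg = max_deg[servo];

    /* typical SG90 range: ~1000 us at 0 deg up to ~2000 us at 180 deg */
    uint16_t us = (uint16_t)(1000 + (angle_deg * 1000L) / 180);
    set_servo_pulse_us(servo, us);
}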
RESULTS AND USABILITY
We performed several sets of tests to derive numerical specifications for our system. Drift tests were done experimentally for each servo to observe how the angles drifted. For each test we reset the system and performed 20 cycles of the maximum range of motion allowed for each degree of freedom. On Servo 1, which was getting data from the IMU on the elbow, we observed approximately an 8-degree difference when returning to the original position. Servo 2 used data from the same IMU, and we found no measurable drift after our test. This was reasonable given that the range of motion for this servo was only 40 degrees. On Servo 3, which used data from the IMU on the wrist, we also saw no significant drift after the 20 cycles and returning to the original position.
While testing the Servo 2 and Servo 3 rotation, we noticed that Servo 1 was also rotating slightly, which should not have been the case. This may have been due to the IMU’s position on the elbow, which wasn’t securely fixed in one spot. The arm movement stretched and relaxed the elastic sleeve, and in the process the IMU may effectively have shifted around. This could also have added movement to the base rotation and accumulated drift in its angle.
To determine how much weight our delightfully crafted robotic arm could handle, we performed a series of tests with increasingly heavy materials. Although the arm’s intended use was to throw lightweight objects such as ping pong balls, we found that any object lighter than 15 g could also be thrown acceptably.
Finally, we tested the range of the throw to specify the shortest and longest distances it could cover. We found that by adjusting the initial position of the throw, we could cover a range of 6″ to 13″. This might not be a lot, especially if you’re trying to play a real game of beer pong against a real person, but it can certainly bring some extra entertainment to the game.
As for speed of execution, as noted previously, wired communication was more effective. Combining that with interrupt-driven UART, we were able to get our robotic arm to mimic the user’s arm with no noticeable delay, mainly because we don’t expect users to make sudden movements while aiming. We noticed that quick, sudden movements made the robotic arm struggle to catch up, and in some cases it reacted in strange ways that disappeared once the user aimed more slowly.
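A minimal sketch of that interrupt-driven receive path is shown below. The uart_getc() helper and the packet framing mirror the illustrative sender earlier and are placeholders rather than our exact code; on the PIC32, this logic would be called from the UART receive ISR.

/* Sketch: rebuild the 6-byte packet one byte at a time inside the UART
   receive interrupt, so the servo loop never blocks waiting for data.
   uart_getc() is a placeholder for reading the received byte. */
#include <stdint.h>

#define PKT_START 0xA5
#define PKT_LEN   6   /* start byte, 3 angles, flag, checksum */

extern uint8_t uart_getc(void);   /* placeholder: read RX byte */

volatile uint8_t packet[PKT_LEN];
volatile uint8_t packet_ready = 0;

void on_uart_rx_byte(void)        /* call from the UART receive ISR */
{
    static uint8_t idx = 0;
    uint8_t b = uart_getc();

    if (idx == 0 && b != PKT_START)
        return;                   /* wait for the start byte to re-sync */

    packet[idx++] = b;
    if (idx == PKT_LEN) {
        packet_ready = 1;         /* main loop validates and applies it */
        idx = 0;
    }
}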
Anyone with an arm and a finger can use our controller, as long as the IMUs are adjusted to match the locations of the elbow and the wrist. The robotic arm itself is quite fragile, because it’s made of birchwood sticks and cheap generic servos that jitter slightly in motion. Its robustness could be improved by replacing the birchwood sticks with 3D-printed parts and by using more reliable servos. The controller on the sleeve could also be made more robust by strapping the components more tightly to the user’s arm. Lastly, the user must account for the slight drift in the robot’s base rotation and the maximum load the system can throw.
RESOURCES
References:
[1] S. Carroll, “The Small Board,” November 2016, http://people.ece.cornell.edu/land/courses/ece4760/PIC32/target_board.html
[2] S. Carroll, “The Big Board,” November 2016, http://people.ece.cornell.edu/land/courses/ece4760/PIC32/target_board.html
[3] D. Caulley, N. Nehoran, and S. Zhao, “Self-Balancing Robot,” December 2015, http://people.ece.cornell.edu/land/courses/ece4760/FinalProjects/f2015/dc686_nn233_hz263/final_project_webpage_v2/dc686_nn233_hz263/index.html
[4] M. Nowicki, J. Wietrzykowski, and P. Skrzypczynski, “Simplicity or flexibility? Complementary Filter vs. EKF for orientation estimation on mobile devices,” 2015 IEEE 2nd International Conference on Cybernetics (CYBCONF), 2015, http://ieeexplore.ieee.org/document/7175926/
Parts List:
Robotic Arm:
● Microcontroller: PIC32MX250F128B
● Servo Motors: SG90 (x3)
● RF: XBEE S2 (not used)
Sleeve:
● Microcontroller: PIC32MX250F128B
● IMU: MPU6050
● Force Sensor: Pololu 2128260
● RF: XBEE S2 (not used)
Digi International | www.digi.com
Microchip Technology | www.microchip.com
TDK InvenSense | www.invensense.com
PUBLISHED IN CIRCUIT CELLAR MAGAZINE • MAY 2019 #346
Daniel Fayad graduated from Cornell in May 2018 with a degree in Electrical and Computer Engineering. He is now a Software Development Engineer on Amazon's Alexa team. He is interested in embedded systems, speech recognition and computer vision, and he isn’t particularly good at beer pong.
Harrison Hyundong Chang graduated from Cornell University in 2017 with bachelor’s and master’s degrees in Mechanical and Aerospace Engineering. He worked for a year as an antenna engineer in Samsung Electronics’ Mobile Communications business, and is now working as a hardware engineer for an internal corporate venture called C-Lab at Samsung Electronics.
Justin Choi obtained a bachelor’s degree in Mechanical Engineering from Cornell in 2017, followed by a master's degree in Aerospace Engineering from Cornell in 2018. He now works at Northrop Grumman Innovation Systems as a Guidance, Navigation and Control Engineer, developing flight computer algorithms for commercial and science satellites.