Using a PIC32 MCU
Can a PIC32 MCU solve a Rubik’s cube faster than a human? To find out, these two Cornell students built a self-contained robot that can solve a Rubik’s cube. The system uses an image-processing Python script and specialized algorithms to solve the puzzle.
While hanging around at home with a couple of friends, we came across an old Rubik’s cube. None of us actually knew how to solve a Rubik’s cube, so we decided to set ourselves a challenge. Two of our friends would try to learn how to solve a Rubik’s cube manually, and we would attempt to build a robot to solve one. Once we were both done, we would hold a competition to see who could solve the Rubik’s cube the fastest.
This project is a self-contained robot that solves a Rubik’s cube. The robot only requires human interaction to initially place the cube, press “capture” for our camera and make minor readjustments of the cube throughout the solving phase. After the initial cube placement, our robot follows a predetermined sequence to take a photo of each side of the cube. These photos are passed to an image-processing Python script, which generates an array of the cube by color to be used by Kociemba’s algorithm. Once Kociemba’s algorithm identifies the moves needed to solve the cube, it sends the moves, stored as an array, through a serial connection to our Microchip PIC32 microcontroller (MCU). This solution is sent in a condensed form of “Right Side Clockwise Rotation,” “Bottom Side Turn 180 Degrees” and so forth. Our software then instructs the robot’s four arms to execute the required rotations, using specific motor command signals.
The arms, themselves, consist of 3D-printed parts that use two servos—one to extend and retract the arm, and the other, equipped with a claw, to rotate the cube. This system can be visualized with the state diagram shown in Figure 1.
Because our project was limited to a budget of $125, our plans for the structure of the robot had to be adjusted accordingly. Although having six arms would have been the most efficient setup, it was not fiscally possible to buy the 12 servos necessary to fit this ideal configuration. Additionally, having six arms would present numerous complications for taking photos of the cube. This left our final design with four arms, all in one plane. However, this arrangement also presented an efficiency issue of not being able to interact directly with the Front and Back faces of the cube. Looking directly at the cube in our setup (the same view as that seen by the camera taking the photos in Figure 2), the arms were designated “Down,” “Up,” “Left” and “Right.”
Another limitation of the budget was that we had to downgrade the quality of the physical structure holding the arms. In our final design, a sturdy piece of wood (11″ × 24″) serves as the base, with two thin pieces of wood (0.25″ × 1.5″ × 14″) jutting up vertically, and another piece (0.25″ × 1.5″ × 11″) connecting the two vertical pieces. Each of these structures had an arm bolted into it, including two very small pieces (0.25″ × 1.5″ × 5″) that held the Down arm in place. Consequently, building our structure was tedious, and it took multiple attempts to get the claws to grab the cube with the necessary precision.
The arms, themselves, consisted of an outer sleeve housing a sliding inner piece (Figure 3) that could extend from and retract into the outer sleeve. This was made possible by attaching a servo, with a gear connected, to the side of the outer sleeve. The teeth of the gear grip a toothed piece inserted into the sliding inner piece. The sliding inner piece also carried a strong (20kg-cm torque) servo fitted with a 3D-printed claw, which was the point of contact between the robot and the cube. To meet our budget constraints, we got a servo that had just enough range of motion (about 60 degrees) to extend the claw to the cube and retract off the cube enough to rotate without hitting it. In total, each arm consisted of five 3D-printed parts, two servos and a decent amount of lubricant to help the pieces move.
The servos were driven using variable PWM signals that we controlled through our PIC32 MCU. The schematic of the circuit we used is shown in Figure 4. The circuit features an opto-isolator that separates the MCU from the servos, to prevent any large inductive spikes from the servo from destroying the transistors in the MCU port. The bench power supply, the Keysight (formerly Agilent) E3630A, provided enough current to drive all the servos, as long as multiple servos did not try to move simultaneously. If we attempted to move multiple servos at the same time, the motors drew more power than the supply could provide, and the servos acted erratically and did not move properly. This somewhat limited how we could interact with the cube, and caused the movements to be less specific than we could have made them.
The PWM (pulse-width modulation) control signals for the servos had a period of 20ms, with a duty ratio between 0.025 and 0.125 (a pulse width of 0.5ms to 2.5ms). The PWM control for the claw servos had to be a lot more precise than for the gear servos, so four output compare units on the PIC32 were used to generate these signals. Since we had only five output compare units on our PIC32 MCU, we needed to use an interrupt service routine (ISR) to create the remaining four PWM control signals.
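The relationship between pulse width, duty ratio and the compare value loaded into an output compare unit can be sketched as follows. This is a Python illustration of the timing math only; the tick count per period is an assumed example value, not the actual timer configuration we used.

```python
# Servo PWM timing: a 20ms period with a 0.5ms-to-2.5ms pulse width.
PERIOD_MS = 20.0          # servo PWM period
MIN_PULSE_MS = 0.5        # one end of travel (duty ratio 0.025)
MAX_PULSE_MS = 2.5        # other end of travel (duty ratio 0.125)
PERIOD_TICKS = 40000      # assumed timer ticks per 20ms period

def duty_ratio(pulse_ms):
    """Fraction of the 20ms period that the signal is high."""
    return pulse_ms / PERIOD_MS

def compare_value(pulse_ms):
    """Timer ticks the output compare unit should hold the pin high."""
    return round(duty_ratio(pulse_ms) * PERIOD_TICKS)

print(duty_ratio(MIN_PULSE_MS))   # 0.025
print(duty_ratio(MAX_PULSE_MS))   # 0.125
print(compare_value(1.5))         # 3000 ticks for a mid-range pulse
```

Calibrating a claw position then amounts to finding the pulse width, and therefore the compare value, that puts the servo at the desired angle.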
The ISR, which was triggered by a timer running at 100kHz, contained a counter that was incremented on every iteration to set certain pins on the board high or low. This manually created the required PWM control signals. The counter was reset every 2,000 iterations of the ISR, to give us our 20ms period. Although this ISR method worked, we noticed that the PWM control signals generated were not as precise or responsive as the ones generated by the output compare units. However, this was not a huge problem for us, since the gear servos only ever needed to be in one of two positions.
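The counter logic in the ISR can be sketched like this. It is shown as a Python simulation rather than firmware: the real routine sets PIC32 port pins instead of appending to a list, and the tick counts below follow from the 100kHz timer rate (10µs per tick).

```python
# Software PWM: a 100kHz timer ISR increments a counter and holds a pin
# high for the first `high_ticks` ticks of each 2,000-tick (20ms) period.
TICKS_PER_PERIOD = 2000   # 20ms at a 100kHz ISR rate

def simulate_pwm(high_ticks, n_ticks=TICKS_PER_PERIOD):
    """Return the pin level at each ISR tick over one period."""
    levels = []
    counter = 0
    for _ in range(n_ticks):
        levels.append(1 if counter < high_ticks else 0)
        counter += 1
        if counter >= TICKS_PER_PERIOD:
            counter = 0               # reset once per 20ms period
    return levels

# A 1.5ms pulse is 150 ticks at 10us per tick.
levels = simulate_pwm(150)
print(sum(levels))                 # 150 high ticks out of 2,000
print(sum(levels) / len(levels))   # duty ratio 0.075
```

The 10µs resolution also makes the precision trade-off concrete: the full 0.5ms-to-2.5ms range spans only 200 counter steps, far coarser than what the hardware output compare units can resolve.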
Most of our software consisted of the functions required to actually move the claws and make them perform the moves required to solve a Rubik’s Cube. For the claws to retract, extend and rotate without interfering with one another, they had to be almost perfectly aligned. The PWM signals allowed us to set our claw servos to either vertical or horizontal, and our gear servos to either extended or retracted, by modifying the duty cycle of the signals. These values were calibrated roughly at first, and fine-tuned throughout the entire time we were building the robot.
We used an incremental design process to program and test all the robot’s movement functions. First, we got the most basic moves (the extension and rotation of the claws) working and tested. Once we were confident that these moves were working properly, we combined these basic moves to form the more complex moves, such as setting the claws on the cube in the correct orientation. These, in turn, constituted the even more complex moves such as rotating the cube and turning each of its faces. This incremental design process made debugging the movement functions a lot easier than if we had just tried to program the complex moves immediately.
Our final move list consisted of turn moves, rotate moves and a show_Face move, where CW stands for clockwise and CCW for counterclockwise in our naming scheme. The turn moves actually turned, in both directions, each of the four faces of the cube that the claws had contact with. The rotate moves were required to interface with the Front and Back faces of the cube, and to enable each face of the cube to point toward the camera. The show_Face function held the cube in such a way that none of the sections on its face was obscured by the claws. Together, these functions allowed us to perform all the moves required to photograph and solve the cube.
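The incremental composition described above can be sketched as follows. The function and command names here are illustrative, not the actual firmware's; primitives append command strings to a log that stands in for the servo control calls.

```python
# Sketch of composing complex cube moves from basic claw primitives.
log = []

def extend(arm):
    log.append(f"extend {arm}")       # slide the inner piece out to the cube

def retract(arm):
    log.append(f"retract {arm}")      # pull the claw clear of the cube

def rotate(arm, deg):
    log.append(f"rotate {arm} {deg}")  # twist the claw servo by deg degrees

def turn_cw(arm):
    """Turn one face 90 degrees clockwise: grip, twist, release, reset."""
    extend(arm)
    rotate(arm, 90)
    retract(arm)
    rotate(arm, -90)   # return the claw to its home orientation

def turn_180(arm):
    """Two 90-degree turns chained from the simpler move."""
    turn_cw(arm)
    turn_cw(arm)

turn_cw("Right")
print(log)
```

Testing each primitive in isolation before composing them is exactly what made debugging tractable: a failing complex move could be traced to one bad primitive or one bad calibration value.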
To have a self-contained robot, we had to develop a way for the robot to retrieve the initial state of the cube. The most obvious solution was to use a camera to take photos of each side. Our Python script started by taking a photo every time the “Enter” key was pressed, and stored these images in the same directory as the script itself.
Once all the photos were captured, we used a Python library called OpenCV to figure out what color each square was. This worked by checking each pixel of the image and comparing it to a “mask,” which was just a range of BGR values (OpenCV’s channel order) that corresponded to the six colors of the Rubik’s cube. The values we used were: [45, 55, 225] to [120, 120, 255] for red; [70, 132, 245] to [177, 220, 255] for orange; [120, 220, 225] to [219, 255, 255] for yellow; [78, 112, 0] to [160, 200, 70] for green; [180, 61, 39] to [255, 145, 120] for blue; and [220, 220, 220] to [255, 255, 255] for white.
Figuring out what ranges to use for each color was an entire project-long calibration. Depending on the lighting, the colors that the photos captured could change, so we tried to limit the variations in the lighting setup. If a pixel didn’t fit into the mask ranges, we knew that it was not a relevant pixel and could be discarded. We chose to make an irrelevant pixel black. Once the mask was applied to the image, we took an average of roughly 3,000 pixels from an area we identified to be the cube. The cube was never in the same spot, due to imperfect rotations. As a result, sometimes our averaged square landed a bit outside the cube and in the area deemed irrelevant by the mask. These pixels were simply ignored, and we turned them green so we could tell when this had occurred.
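The masking and averaging steps can be sketched in pure Python. OpenCV's `cv2.inRange` performs the same per-channel comparison over a whole image at once; the sample pixels below are made up for illustration.

```python
# Classify a sampled patch of pixels by checking each against per-color
# BGR ranges and picking the color whose mask matched the most pixels.
MASKS = {  # (low, high) inclusive BGR ranges from our calibration
    "red":    ((45, 55, 225), (120, 120, 255)),
    "orange": ((70, 132, 245), (177, 220, 255)),
    "yellow": ((120, 220, 225), (219, 255, 255)),
    "green":  ((78, 112, 0), (160, 200, 70)),
    "blue":   ((180, 61, 39), (255, 145, 120)),
    "white":  ((220, 220, 220), (255, 255, 255)),
}

def in_range(pixel, low, high):
    """True if every channel of the BGR pixel falls inside the range."""
    return all(l <= p <= h for p, l, h in zip(pixel, low, high))

def classify(pixels):
    """Pick the mask matching the most pixels in the sampled area.

    Pixels matching no mask are irrelevant and simply don't count,
    mirroring how out-of-range pixels were blacked out by the mask."""
    counts = {c: sum(in_range(p, lo, hi) for p in pixels)
              for c, (lo, hi) in MASKS.items()}
    best = max(counts, key=counts.get)
    return best if counts[best] > 0 else None

patch = [(50, 60, 230), (100, 100, 250), (0, 0, 0)]  # mostly red + noise
print(classify(patch))  # red
```

The overlap between the red and orange ranges in the high-red channel is visible here, which is why those two colors were the hardest to separate under glare.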
This script in action is shown in Figure 5. You’ll notice the output to our command prompt has the colors in terms of “Back,” “Front,” “Up,” “Down,” “Left” and “Right,” since this is how Kociemba’s algorithm takes its input. You will also notice that we do not take the average of the center square of each face. This is because we place the cube into the structure the same way every run, and our code specifies the order of the faces we are viewing. This means that we could hard-code the color of each center square, because we knew exactly what we were looking at without the image processing.
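The translation from detected colors to the solver's face letters can be sketched as follows, assuming the Python kociemba implementation's 54-character facelet-string input (faces in U, R, F, D, L, B order); the solved-cube example is purely illustrative.

```python
# Map detected sticker colors to Kociemba face letters. Our fixed cube
# orientation meant each center color was known in advance:
# Up=blue, Right=yellow, Front=red, Down=green, Left=white, Back=orange.
COLOR_TO_FACE = {"blue": "U", "yellow": "R", "red": "F",
                 "green": "D", "white": "L", "orange": "B"}

def facelet_string(faces):
    """faces: dict mapping each face letter to its 9 detected color
    names, read in the solver's expected sticker order. Returns the
    54-character string in U, R, F, D, L, B face order."""
    out = []
    for face in "URFDLB":
        out.extend(COLOR_TO_FACE[c] for c in faces[face])
    return "".join(out)

# A solved cube (every face uniformly its center color):
solved = {"U": ["blue"] * 9, "R": ["yellow"] * 9, "F": ["red"] * 9,
          "D": ["green"] * 9, "L": ["white"] * 9, "B": ["orange"] * 9}
print(facelet_string(solved))
# UUUUUUUUURRRRRRRRRFFFFFFFFFDDDDDDDDDLLLLLLLLLBBBBBBBBB
```

Because the centers are hard-coded, a misread sticker shows up as an invalid facelet string, which is one way a color-calibration error was caught before the solver ran.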
We decided to use Kociemba’s algorithm instead of the optimal solution (commonly referred to as “God’s Algorithm”), because it gives us a “reasonable” solution in a short amount of computing time. Kociemba’s algorithm guarantees a solution of under 30 moves, whereas the optimal-solution algorithm could take a very long time to actually run.
Kociemba’s algorithm takes a two-phase approach that allows it to search for a solution faster. The first phase is to get the cube to “state 1.” In layman’s terms, state 1 is any state of the cube that can be reached from a solved cube by running only the moves: Up clockwise, Down clockwise, Right 180, Left 180, Front 180 and Back 180. Kociemba’s algorithm uses an extensive set of pruning tables to formulate the most efficient “moveset” to transform a scrambled cube to state 1. Then, it uses the same restricted set of moves to transform the cube from state 1 to a “solved state.” In our implementation, “Up” refers to the blue side, “Right” refers to yellow, “Left” to white, “Front” to red, “Down” to green, and “Back” to orange; and “180” refers to two rotations of a face.
From state 1, you can get to a solved cube using only those same moves: quarter turns of the Up and Down faces, and 180-degree turns of the Right, Left, Front and Back faces. The algorithm will check whether a longer phase 1 can lead to a shorter phase 2, and thus a shorter overall solution, and it relies on extensive lookup tables to create that solution. Our original plan was to run this algorithm on the PIC32, but we simply did not have the memory needed to store the lookup tables. Instead, we ran the algorithm on a laptop and sent the result over to the PIC32.
The output of Kociemba’s algorithm was a string consisting of the moves required to solve the Rubik’s cube from its initial state. To transfer this string from the laptop to the PIC32 MCU, we connected the PIC32 to the laptop using a USB-to-serial cable, and used the serial link to communicate with the Python script running on the laptop. The serial setup on the PIC32 was quite simple, done primarily through the functions provided in the Protothreads library. Similarly, the serial setup on the laptop side was done easily using a simple serial library.
Before being sent over, the output string of Kociemba’s algorithm was modified slightly to delineate the beginning and end of the moves. Once this string was received by the PIC32, we decoded it and converted the string into an array of moves, which could then be iterated through and performed by the claws to solve the cube.
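The framing and decoding can be sketched like this. The delimiter characters and the (face, quarter-turns) encoding below are illustrative assumptions, since the article doesn't give the exact bytes used; the real decoding ran in C on the PIC32.

```python
# Frame the solver's output string for the serial link, then decode it
# on the receiving end into an array of (face, rotation) moves.
START, END = "<", ">"   # assumed start/end delimiter characters

def frame(solution):
    """Wrap the solver output so the receiver can find its boundaries."""
    return START + solution + END

def decode(frame_str):
    """Strip the delimiters and split tokens like R, U', F2 into
    (face, turns) pairs: ' means counterclockwise, 2 means 180 degrees."""
    body = frame_str[frame_str.index(START) + 1:frame_str.index(END)]
    moves = []
    for token in body.split():
        face = token[0]
        if token.endswith("'"):
            turns = -1
        elif token.endswith("2"):
            turns = 2
        else:
            turns = 1
        moves.append((face, turns))
    return moves

framed = frame("R U' F2")
print(decode(framed))  # [('R', 1), ('U', -1), ('F', 2)]
```

On the PIC32 side, the resulting array is simply iterated, dispatching each (face, turns) pair to the corresponding claw movement function.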
RESULTS AND DISCUSSION
All in all, our robot met most of our expectations for the project. Although the 3D-printed parts for the arm were printed a bit too large, after hand-sanding down the interlocking pieces and adding some silicone lubricant, they moved flawlessly. The arms had to be calibrated so that they could all extend to meet on the cube and retract enough to allow the claws to rotate, while not disrupting the cube. Because the claws also were printed too large and did not fit into the servo attachments as expected, the claws were a little more fragile than we would have liked. This caused them to break a few times throughout the progress of the project, and each time they broke, we had to restart the calibration process.
After calibrating using different PWM signals, we got the claws to rotate consistently. The only issue was the physical support structure itself, which made some minor human interaction necessary to reset the cube if it wasn’t sitting right. Overall, we estimated that roughly a third of the moves required minor tapping of the cube back into place, though the claws probably would have ignored the slight offsets and powered through with their moves. On fewer than 10% of the moves, the cube had to be moved significantly to reset it in the correct position. The claw servos we purchased promised 180 degrees of rotation but, much to our disappointment, turned out to have only about 160 degrees. This took away the ability to do 180-degree turns, costing us efficiency by forcing each one to be substituted with two 90-degree turns.
Another major difficulty with this project was getting consistently correct results from the image-processing Python script. The overhead lighting was extremely bright, so the resulting glare on the cube sometimes made the orange and yellow indistinguishable. After attempting to calibrate to this glare using redefined RGB ranges, we found that it would be a better use of time to control the lighting instead. We used a cardboard shield to block the glare, which worked slightly better. But the program then confused the orange and red. Even so, it worked much better than our original setup. To compensate for a potential mismatch, we created a manual user override that allowed us to pass our own input to Kociemba’s Algorithm, if needed.
All things considered, our robot worked fairly well. It completed the cube with minor human assistance in a majority of attempts, taking 4-5 minutes on average, depending on the number of moves the solution required. A lot of this time was due to delays between moves, when we had to reorient the cube. With a more robust support system, these delays would not occur, and our completion time would drop substantially.
To be completely honest, it took our friends all of 20 minutes to learn how to solve a Rubik’s cube, compared to the entire semester of work that it took us to design, build and test the robot. So, while this was a fun and interesting project that taught us a lot, if your main aim is to solve a Rubik’s cube fast, we would recommend learning how to do so manually.
References:
Kociemba, Herbert. “Kociemba’s Algorithm.” The Two-Phase Algorithm, kociemba.org/cube.htm
“Opencv-Python.” PyPI, pypi.org/project/opencv-python/
Dunkels, Adam. Protothreads. dunkels.com/adam/pt/
Land, Bruce R. “Cornell University ECE 4760.” ECE 4760 Course Website, Cornell University, people.ece.cornell.edu/land/courses/ece4760/
Tsoy, Maxim. Kociemba. github.com/muodov/kociemba
“Fully 3D-Printed Rubik’s Cube Solving Robot.” Thingiverse, 2019, thingiverse.com/thing:2471044.
Keysight Technologies | www.keysight.com
Microchip Technology | www.microchip.com
PUBLISHED IN CIRCUIT CELLAR MAGAZINE • MARCH 2021 #368
James Connelly (firstname.lastname@example.org) graduated with a B.S. in Electrical and Computer Engineering from Cornell University in 2019. He has been working as a Software Integration Engineer since graduating.
Shivansh Gupta (email@example.com) graduated with a double B.S. in Electrical & Computer Engineering and Computer Science from Cornell University in 2020. He is currently pursuing an M.Eng in Computer Science from Cornell Tech.