Sensors and Servos
Setting out to create an inexpensive prosthetic, these students at Cornell University developed a robotic hand that can be controlled with the user’s forearm. The system uses flex sensors to measure changes in forearm topography, which are then translated into finger movement.
Modern prosthetic hands are expensive and can involve invasive, bulky technologies. The increased availability of 3D printing, microcontrollers (MCUs) and sensors inspired us to build a simplified alternative that is both portable and inexpensive.
Our design was also inspired by an interest in human physiology. The sensing method for this project is based on the visible deformation between the wrist and forearm when a finger is flexed. Each finger is connected by a tendon to a muscle in the lower forearm. To flex a finger, the flexor muscles contract in distinct patterns to pull specific tendons, while applying minimal force to the others. Most importantly, these movements can be replicated even by a person without a hand. By monitoring tendon and muscle flexes, we can predict individual finger movements, which can form the basis of a prosthetic device.
Our design consists of an array of flex sensors on a cuff worn by the user. The sensors are connected to a Microchip PIC32 MCU. The system receives information about changes in surface flexure, and uses it to determine the most likely intended motion. It then translates this to a 3D-printed robotic hand by sending signals to the servo motors that control the fingers. This method—while prone to sensor alignment issues—gives a good estimate of motion. We believe that incorporating these measurements into prosthetics can increase their portability and decrease cost.
The robotic hand consists of a 3D-printed model that is mounted on a board and connected to motors. The 3D model was sourced from a design posted by Ryan Gross under a Creative Commons license. The fingers are connected to the base of the model using a length of paracord, which forces them to an open position while still allowing flexibility. Each finger is also connected to a servo motor, using fishing line strung through the center of the hand. The middle and ring fingers are controlled by a single motor, because they often move in unison, and their independent motion is not required for a functioning hand.
The four motors are powered by a 9V battery connected to a 5VDC converter, with a capacitor across the power and ground rails to eliminate noise. Each servo is capable of rotating up to 120 degrees, giving us enough range to fully open and close each finger of the model hand. This rotation is controlled via pulse-width modulation (PWM) signals from the MCU. The hand is attached to the end of a wooden board, and the servos are mounted with screws to the middle of the same board. This setup is shown in Figure 1.
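As a rough illustration of the servo control math, the sketch below maps a commanded angle in the servo’s 0–120 degree range to a PWM pulse width, assuming the common hobby-servo convention of a 1ms–2ms pulse inside a 20ms (50Hz) frame. The exact pulse limits and the PIC32 output-compare setup are assumptions, not details from our build.

```c
#include <stdint.h>

#define SERVO_MIN_US    1000u  /* assumed pulse width at 0 degrees   */
#define SERVO_MAX_US    2000u  /* assumed pulse width at 120 degrees */
#define SERVO_RANGE_DEG  120u  /* rotation range of the servos       */

/* Map an angle in degrees to a PWM pulse width in microseconds,
 * clamping out-of-range requests to the servo's mechanical limit. */
uint32_t servo_pulse_us(uint32_t angle_deg)
{
    if (angle_deg > SERVO_RANGE_DEG)
        angle_deg = SERVO_RANGE_DEG;
    return SERVO_MIN_US +
           (angle_deg * (SERVO_MAX_US - SERVO_MIN_US)) / SERVO_RANGE_DEG;
}
```

The returned width would then be loaded into an output-compare register to generate the actual pulse.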
The cuff is worn over the user’s forearm. It consists of seven flex sensors attached to a sleeve made from an athletic sock (Figure 2). The flex sensors are thin-film piezoelectric sensors (piezos) covered in a plastic laminate, with two leads crimped at one end. They generate voltage in proportion to the speed at which they are flexed. The extreme sensitivity of these piezos made them attractive for measuring forearm deformation from muscle and tendon flexure.
The sensor placement was determined through a series of experimental trials. Based on oscilloscope readings, we found the largest response on the inner forearm, just below the wrist. However, the majority of the differentiable, finger-by-finger tendon movement in this area occurs within a surface area of approximately 2cm x 4cm—meaning we could not assign an individual flex sensor to each tendon. Instead, we used the combinations of four different sensor readings at the wrist to differentiate finger movements.
Additionally, we attached three other sensors to the lower forearm, where muscle flexure was most prominent when opening and closing the full hand, thumb and middle two fingers. To accommodate this large number of sensors, we connected each one to an input of a Texas Instruments (TI) CD4051BE analog multiplexer, whose output was wired to a single pin on our board. The MCU reads the voltage for each sensor through an analog-to-digital converter (ADC) channel.
When flexed, the piezos are capable of generating up to 70V, which could damage the MCU. To resolve this issue, we created an overvoltage protection circuit for each sensor. This consisted of two back-biased diodes—one connected to the supply voltage VDD (3.3V), and the other connected to ground. The positive end of each flex sensor was attached between these two diodes, so that if the generated voltage leaves the safe range of 0-3.3V, one of the diodes becomes forward-biased and shorts the excess voltage to power or ground.
Another feature of the signal generated by flexing the piezos is that it includes both negative and positive voltages. To allow for the capture of the negative portion of the signal, we created a rail with voltage VDD/2 using a voltage divider. The piezos were then connected at their negative ends to this half-voltage rail instead of ground, so that the signal would be normalized about half the input voltage. This allowed us to read the negative voltages as values between 0V and VDD/2. For noise elimination, the flex sensors themselves were attached to these circuits using shielded wire. A full schematic of the circuits used in this project is shown in Figure 3.
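In software, a reading biased about VDD/2 can be converted back to a signed sensor voltage. The sketch below assumes a 10-bit ADC and VDD = 3.3V; the actual ADC resolution used on the board is an assumption here.

```c
#include <stdint.h>

#define ADC_MAX   1023.0f   /* assumed 10-bit converter full scale */
#define VDD_VOLTS 3.3f      /* supply voltage                      */

/* Recover the signed sensor voltage from a raw ADC count. With the
 * piezo's negative lead tied to the VDD/2 rail, "no flex" reads
 * mid-scale, so subtracting VDD/2 re-centers the signal on zero. */
float sensor_volts(uint16_t adc_raw)
{
    float v = (adc_raw / ADC_MAX) * VDD_VOLTS;  /* 0 .. VDD          */
    return v - (VDD_VOLTS / 2.0f);              /* signed, 0-centered */
}
```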
READING AND PROCESSING SIGNALS
To read and process the signals on the software side, we used an interrupt service routine (ISR), whose main task was to read the voltages generated by the flex sensors. Because each sensor is connected indirectly through a multiplexer, we implemented a digital counter that counts from zero up to the number of flex sensors, incrementing with wrap-around each time the program enters the ISR. The value of the counter is written to the mux select pins, causing the corresponding sensor to be read on the ADC. The ISR reads this value, checks whether it is a minimum or maximum value for that sensor over the previous 500ms, and stores it in the sensor’s data structure if it meets either criterion.
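The counter-driven sampling scheme can be sketched as follows. The hardware accesses (writing the CD4051BE select pins and triggering the PIC32 ADC) are stubbed out by passing the reading in as a parameter, and the window structure is a simplified stand-in for our actual data structure.

```c
#include <stdint.h>

#define NUM_SENSORS 7

/* Per-sensor running extrema over the current 500ms window */
typedef struct {
    uint16_t min;
    uint16_t max;
} sensor_window_t;

sensor_window_t win[NUM_SENSORS];
uint8_t mux_sel;   /* counter: 0 .. NUM_SENSORS-1, with wrap-around */

/* Reset the extrema at the start of each 500ms analysis window */
void window_reset(void)
{
    for (int i = 0; i < NUM_SENSORS; i++) {
        win[i].min = 0xFFFF;
        win[i].max = 0;
    }
}

/* Sketch of the ISR body: track min/max for the currently selected
 * sensor, then advance the mux select counter with wrap-around. */
void sensor_isr(uint16_t adc_value)
{
    if (adc_value < win[mux_sel].min) win[mux_sel].min = adc_value;
    if (adc_value > win[mux_sel].max) win[mux_sel].max = adc_value;
    mux_sel = (mux_sel + 1) % NUM_SENSORS;   /* next sensor */
}
```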
To react to human motion, which requires a rapid response to input, our software consists of multiple concurrent threads and an interrupt service routine, each responsible for different calculations and events. The ISR, which collects flex sensor values from the ADC, runs at about 800Hz, which allows us to sample each sensor quickly enough to ensure that we will not miss a significant value from any sensor. It is important to sample at a faster rate than the capabilities of human motion, because the signals we receive from the sensors rapidly change shape, and therefore have transient maximum and minimum values. Testing by outputting the values gathered by the ISR on an on-board digital-to-analog converter (DAC) channel confirmed that we can recreate the approximate flex sensor signal waveforms for any given movement.
After collection by the ISR, the flex sensor values are analyzed by the calculation thread to determine whether a movement has occurred and to categorize it if possible. This thread runs every 500ms, allowing enough time for the user to perform a complete motion, and the analysis is based on the values collected by the ISR during this time frame. The motor thread, which generates the pulse-width modulation (PWM) signals that actually move the hand, runs every 20ms, so that any action predicted by the calculation thread can be converted to movement almost immediately.
ANALYSIS OF SIGNALS
The analysis of the signals is done by a separate thread in our program running every 500ms. Our primary algorithm performs an L1 template match between calibration data and real-time values collected from the flex sensors. This is carried out in a series of steps: analyzing the last set of data gathered by the ISR, normalizing these values, and calculating the ratios of the peak and minimum values for each sensor. The algorithm then attempts to find the minimal sum of errors between these values and a set of values obtained during calibration. Based on this, the system predicts the corresponding hand movement, and translates this into motion on the 3D model.
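Stripped of the weighting and feature extraction, the core of an L1 template match can be sketched as below. The vector sizes are illustrative placeholders, not the dimensions used in our system.

```c
#define NUM_FEATURES 3   /* illustrative feature-vector length      */
#define NUM_ACTIONS  2   /* illustrative number of calibrated actions */

/* Sum the absolute differences between the live feature vector and
 * each action's calibration template; the closest template wins. */
int match_action(const float live[NUM_FEATURES],
                 float templates[NUM_ACTIONS][NUM_FEATURES])
{
    int best = 0;
    float best_err = -1.0f;
    for (int a = 0; a < NUM_ACTIONS; a++) {
        float err = 0.0f;
        for (int f = 0; f < NUM_FEATURES; f++) {
            float d = live[f] - templates[a][f];
            err += (d < 0.0f) ? -d : d;   /* L1 (absolute) error term */
        }
        if (best_err < 0.0f || err < best_err) {
            best_err = err;
            best = a;
        }
    }
    return best;   /* index of the predicted action */
}
```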
To obtain the calibration dataset, each user must put the device into calibration mode after putting on the cuff. This mode, which is triggered by a button press, walks the user through a series of motions by displaying instructions on a screen. The user is asked to open and close each finger (with the middle and ring moving simultaneously) and the whole hand, and to leave the hand at rest. Each motion is repeated five times. This step is necessary due to differences in the arm length and physiology of users. It helps us ensure that the readings we take from the sensors are calibrated, such that the predictions we eventually make are personalized for the current user.
During calibration mode, data are collected from each flex sensor for every movement that the user is asked to perform. These values are normalized with respect to the measurements taken when the user’s hand is steady. The average of the sensor values across the trials is taken. This average is used to determine each sensor’s peak and minimum value during each action, and the ratio of each sensor’s values to the sum across all sensors. The ratio measurements, which are fractional values between 0 and 1, are multiplied by a fixed scalar to ensure that they are on the same order of magnitude as the measured peak and minimum sensor values. The absolute values and the ratio measurements are then given different weights, based on how well they minimize errors across each of the calibration trials. Due to computational limitations, these weights are chosen from a small discrete space instead of a continuous one. Possible weights for any measurement are 0, 0.25, 0.5, 0.75 and 1, with the constraint that the weights for each action must sum to 1. After this step, the measurements and weights are associated with their corresponding action until the next time the device is calibrated.
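The discrete weight search for a single action might look like the sketch below, assuming the per-trial absolute-value and ratio errors have already been computed. Trying each weight w for the absolute-value term and pairing it with 1-w for the ratio term keeps the weights summing to 1.

```c
#define NUM_WEIGHTS 5
#define NUM_TRIALS  5

/* Pick the weight from {0, 0.25, 0.5, 0.75, 1} that minimizes the
 * combined error over all calibration trials for one action. */
float best_weight(const float abs_err[NUM_TRIALS],
                  const float ratio_err[NUM_TRIALS])
{
    static const float w_opts[NUM_WEIGHTS] = {0.0f, 0.25f, 0.5f, 0.75f, 1.0f};
    float best_w = 0.0f;
    float best_total = -1.0f;
    for (int i = 0; i < NUM_WEIGHTS; i++) {
        float total = 0.0f;
        for (int t = 0; t < NUM_TRIALS; t++)  /* weighted error per trial */
            total += w_opts[i] * abs_err[t] + (1.0f - w_opts[i]) * ratio_err[t];
        if (best_total < 0.0f || total < best_total) {
            best_total = total;
            best_w = w_opts[i];
        }
    }
    return best_w;
}
```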
Detailed pseudocode for this analysis is given in Listing 1. The calibration thread executes slowly, allowing the user 2 seconds to complete each movement. This is done mainly to minimize human error, because in this time frame, the user must read the instruction prompts and move their hand accordingly. This slow runtime does not affect the rest of our system, however, since calibration mode runs in a blocking thread and is completely independent of the rest of the program.
LISTING 1 - Pseudocode for the analysis algorithm

Calibration analysis:
1. For each of the 5 trials:
   a. For each action:
      i. Prompt user to perform action
      ii. For each sensor:
         1. Store the max and min values obtained during this time
2. For each action:
   a. Calculate the average max and min values for each sensor
   b. Sum the average min and max values over all sensors
   c. For each sensor, calculate and store the ratio of its min and max to the sum
3. For each of the 5 trials:
   a. For each possible weight w:
      i. Calculate the differences between the average and trial max and min values for each sensor, multiplied by w
      ii. Calculate the differences between the average and trial ratio values for each sensor, multiplied by 1-w
      iii. Save the weight w that minimizes the sum of these errors

Calculation analysis:
1. For each sensor:
   a. Determine the current max, min and ratio values
   b. Calculate the weighted error between each sensor's current value and its saved value for each action
2. Predict the action corresponding to the minimum sum of these errors
Finger movement is determined by comparing the data obtained during calibration with the data measured by the ISR in real-time. For each sensor, we again calculate the maximum and minimum values and the ratios between sensor values that were collected by the ISR during the previous 500ms time period. Error calculations are then performed between the current values and the calibration set, with the absolute value and ratio measurements weighted according to the values determined during calibration. The movement corresponding to the minimal error between the two sets is predicted. This motion is then translated to the hand by turning on the open or close PWM signals for the appropriate servo motors. A flow chart for this algorithm and the interrupt service routine is shown in Figure 4.
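The final translation step can be sketched as a mapping from a predicted action code to per-servo commands, which the 20ms motor thread then turns into PWM. The action encoding here (+n closes finger n, -n opens it, 0 is rest) is an invented convention for illustration, not the one our firmware uses.

```c
#define NUM_SERVOS 4   /* thumb, index, middle+ring (shared), pinky */

typedef enum { CMD_HOLD, CMD_OPEN, CMD_CLOSE } servo_cmd_t;

/* Commands consumed by the motor thread on its next 20ms tick */
servo_cmd_t servo_cmd[NUM_SERVOS];

/* Translate a predicted action code into a per-servo command. */
void apply_action(int action)
{
    int idx = (action < 0) ? -action - 1 : action - 1;
    if (action == 0 || idx >= NUM_SERVOS)
        return;   /* rest, or unknown action: hold current position */
    servo_cmd[idx] = (action > 0) ? CMD_CLOSE : CMD_OPEN;
}
```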
RESULTS AND DISCUSSION
We found that measuring tendon flexure in the wrist and forearm is a valid approach to differentiating among finger motions. Throughout analysis, we determined that certain areas of the forearm are more useful for recognizing different movements. The locations of these “hot spots” are shown in Figure 5.
Additionally, some motions were easier to predict when only absolute values of the sensors were considered, whereas others were easier to differentiate using ratio-based comparisons. Differentiation of large motions—such as opening and closing the whole hand—relied heavily on the maximum values of a few sensors. Prediction of smaller movements, such as closing the thumb, often depended more on the ratios of the peak and minimum values of a specific sensor to those of the other sensors on the cuff.
Due to the nature of the materials used in our design, many factors that affected accuracy varied from trial to trial. These included mainly the positioning of the sensors on the cuff, how tightly they could be taped down and the positioning and stretchiness of the cuff itself. The only actions for which we were able to obtain accuracy of almost 100% consistently were opening and closing the hand. Actions involving specific finger movements could be predicted with up to 75% accuracy, provided that the wearer performed these motions consistently and deliberately during calibration.
We discovered that many people encounter a learning curve when attempting to calibrate the device. However, this is not uncommon with prosthetics. In our case, this learning curve was short. During a demo at the Cornell Engineering Project Showcase, five people attempted to control the robotic hand for the first time. After just one run of calibration mode, almost everyone was able to perform hand open and close motions, and to control the two middle fingers. Success with the thumb and index finger motions was far less common. In addition to human error during calibration, this was likely due to length and physiology differences in the arms of different users, since the places we used to measure these two motions were confined to small areas near the wrist. In contrast, control of the whole hand and two middle fingers can be measured across a larger area of the forearm, and thus was more easily generalized (Figure 5).
Another limitation we found during the demo was the sensitivity of the flex sensors to arm movement. If a user’s arm bumped into the table, for example, the hand would move in unpredictable ways. Additionally, accuracy decreased as the user’s arm was moved further away from its initial position during calibration.
Despite the accuracy limitations, we believe this project is a successful prototype. However, there are multiple future improvements that we would like to make. Throughout development, we explored an alternate method of analysis, in which the MCU sent all the sensor data over a USB serial link to a computer running a Python script. The idea was to train a machine learning model on a calibration data set, and use this model to predict all future movements based on the incoming data.
Unfortunately, speed limitations with our current hardware made it impossible to perform this type of analysis and still react to human movement in a timely fashion. We decided to stick with the simpler, error-minimization-based analysis, which gives us a real-time response with somewhat less reliability. In the future, integrating a Raspberry Pi or another similar device over a faster link could allow us to use machine learning to perform a more in-depth analysis of the data.
Further improvements could also be made in the areas of hardware and overall system design. Using a different type of piezo film might allow higher spatial resolution, and integrating a greater number of sensors would enable more complete mapping of forearm topography. A more rigid cuff design—such as a 3D-printed exoskeleton—could prevent displacement of sensors from their positions on the cuff, and help eliminate transient signals during arm movement. Currently, the hand is only capable of complete opens and closes.
Because the piezos respond to velocity rather than displacement, determination of intermediate positions between open and closed is not feasible. The inclusion of electromyography (EMG) and other sensors could enable better categorization accuracy for individual fingers, and could also collect data for mapping partial finger flexes onto the robotic hand. Additionally, improvements could be made to the calibration process to make it more streamlined and user-friendly.
Altogether, our prototype cost about $85 to build. This is significantly less expensive than the current prosthetic hand options on the market, which range from $7,500 to $100,000. Although our design is not entirely reliable as a stand-alone product, existing issues can be resolved by incorporating other technologies. A prosthetic utilizing this design might allow a person whose hand is severed above the wrist to regain some functionality.
REFERENCES
[1] Tendon use in amputees: Smith, D.G., "General Principles of Amputation Surgery," Chapter 2 in Atlas of Amputation and Limb Deficiencies. https://orthop.washington.edu/patient-care/limb-loss/general-principles-of-amputation-surgery.html#muscle
[2] 3D hand model
[3] Flex sensor datasheet
[4] Multiplexer datasheet
PUBLISHED IN CIRCUIT CELLAR MAGAZINE • SEPTEMBER 2020 #362