Time-of-flight sensors have become small and affordable in the last few years. In this article, learn how these three Cornell graduates created a travel aid for the visually impaired with a few time-of-flight sensors, coin vibration motors, an Arduino Pro Mini, a Microchip PIC32 MCU, a flashlight and a sock.
We designed a low-cost, handheld Electronic Travel Aid (ETA). This type of device is typically used by visually impaired people to navigate more easily on foot. It could also be used to assist anyone in the dark or even just for a cursory look around corners or into difficult-to-access areas. This prototype uses six time-of-flight (ToF) sensors to gain information about the user’s immediate path. Time of flight is a method for measuring the distance between a sensor and an object, based on the time difference between the emission of a signal and its return to the sensor, after being reflected by an object. In our application, ToF technology conveys the information to the user by varying the intensity of six vibration motors against the user’s skin, with a one-to-one mapping between sensors and motors.
Our device senses objects up to 1.2 m away with high accuracy. Many travel aids rely on sound to give the user a picture of their surroundings, whether crude echolocation from the tapping of a white cane or the complex auditory feedback given by the most popular ETAs on the market today. Those solutions are difficult for a visually impaired pedestrian who is also hard of hearing, in a noisy environment or simply not fond of the distraction of an in-ear device. Instead of sound, this device conveys the information to an area of sensitive skin on the forearm.
The sensor module is held in one hand, and is linked by a cable to the feedback module incorporated into a wristband. As the users sweep the sensor module in front of and around them, they form a mental picture of their surroundings. For ToF sensors, we used VL53L0X devices from STMicroelectronics on breakout boards from Adafruit. To take advantage of an existing VL53L0X Arduino library, we used an Arduino Pro Mini to read from the sensors using I2C. The sensor data were sent via UART through the cable to a Microchip PIC32MX250F128B microcontroller (MCU), which processed them and created corresponding feedback through the vibration motors.
ToF sensors were chosen for their speed. For the ETA to be usable, the delay between sensing and feedback must be as small as possible. A read from one of these ToF sensors takes a total of only 33 ms, making them very appealing for our project. ToF sensor breakout boards from vendors that provide good documentation cost about $15.
The sensing module contains six time-of-flight distance sensors arranged in a line, each tilted to a specific angle within a common plane. The total angle from the first sensor to the last is about 25 degrees, which gives information across a spread of 1 m at 1 m of distance from the sensor module—good coverage of a pedestrian’s immediate path. This means the centers of the sensing “cones” of adjacent sensors are 20 cm apart (about a hand’s width) at that distance from the sensors. The distance between sensors is 0.8″ (2.03 cm), and the sensor array is centered between the middle two sensors. Using these dimensions, the desired angles of the ToF sensors relative to the horizontal were calculated to be—from left to right in Figure 1—12.7, 7.7, 2.6, -2.6, -7.7 and -12.7 degrees. The field of view of each sensor is 25 degrees, so it is unlikely that increasing the density of ToF sensors would improve resolution, and the additional time needed to read from more sensors would delay feedback.
Sensor mounting was achieved by soldering header socket segments to a long section of perfboard at the requisite angles. This was the trickiest part of the build. Each header’s legs were angled with a pair of pliers to allow angled integration with the perfboard. The profile of the array was drawn on a piece of cardstock and used as a guide for a team of three assemblers to align each header with the perfboard and solder it at the correct angle. The perfboard was secured to a repurposed flashlight case (acting as the handle of the ETA) with a rubber band. The flashlight electronics and plastic opening cover were removed, allowing a board that holds the Arduino Pro Mini, sensor wiring and two 9 V batteries to fit inside. These components are accessed easily when the rubber band is unhooked and the sensor array moved aside. Figure 2 shows the sensing module opened to access components.
The VL53L0X breakout board we used has seven pins (VIN, 2v8, GND, GPIO, SHDN, SCL, SDA), five of which (VIN, GND, SHDN, SCL, SDA) are used to connect to the Arduino. The sensors share the SCL and SDA lines, as is typical for I2C connections with one master and multiple slaves. All sensor GND pins are connected to the Arduino board ground, and all VIN pins are connected to the 5 V Arduino VCC. Each SHDN pin, which is used to set a new address for each sensor, is connected to a different digital I/O pin on the Arduino board. We used pins A4 to A9. Figure 3 is the schematic of the system. As the schematic shows, we used Sean Carroll’s PIC32MX250F128B Small Development Board in our design. The Circuit Cellar article materials webpage has a link to the details of that board.
The perfboard holding the Arduino Pro Mini also supports the connections from the sensors to the Arduino, the connections to both 9 V batteries and the wiring to the external flashlight button (which turns the device on and off). The perfboard also holds the connections to the cable that links the handheld part of the device to the wristband. The cable serves as both a physical and an electrical connection, carrying power and ground from both 9 V batteries and the transmit UART signal from the Arduino to the PIC32 MCU.
The vibration motors were attached to a wristband made from the ankle of a sock, with the motors arranged in a line across the top of the user’s forearm (Figure 4). The forearm was chosen both for its density of nerve endings—allowing the vibration motors to be closer together—and for convenience: the user can slip the wristband on and off with ease. The information from each sensor is mapped to a specific motor: with the module held flashlight-button up, the rightmost sensor maps to the rightmost motor on the wristband, and so on sequentially leftward. This creates an intuitive perception of where a sensor’s information comes from in relation to the user. As there are six ToF sensors, six vibration motors convey sensor information to the user. Vibration intensity is varied through pulse-width modulation (PWM); higher intensity indicates closer proximity.
The motors’ wires were sewn to the wristband. The wristband is elastic to accommodate the user’s wrist size. The distance between vibration motors is about 1” when the wristband is slack, increasing to around 2” on a larger arm. A separation of about 1” is required for the user to distinguish the sensation of one vibrating motor from the next. The motors were easier to tell apart when they dangled from a short length of wire.
The motors were standard 3 V eccentric rotating mass (ERM) motors—the coin type commonly used in cell phones. The motors required more current than could be sourced from the PIC32, so a separate power source was used, and the motors were optically isolated from the PIC32 to prevent damage to it. A single opto-isolator circuit and its connections to the PIC32 are shown in Figure 3. An identical opto-isolator circuit was used for each of the six motors. The power source used for the motors was a standard 9 V battery regulated with the Darlington transistor voltage regulator circuit shown in Figure 3. The output of the voltage regulator is “VMOT_REGULATED” in the opto-isolator circuit. The motor circuit board containing the opto-isolators and voltage regulators forms the bottom layer of a stack, with the PIC32 mounted on top. The stack was secured to the wristband and wrist with two rubber bands (Figure 4).
A second 9 V battery powered both the Arduino Pro Mini and the PIC32. Both batteries were contained in the flashlight case. The cable connecting the sensing module to the feedback module included wires with a separate power and ground from each of the batteries and the transmit UART line from the Arduino Pro Mini to the PIC32. The schematic in Figure 3 shows the pin mapping from the Arduino to the PIC32 and from the PIC32 to the opto-isolators.
The data from the six distance sensors is retrieved by the Arduino Pro Mini via I2C. The Arduino sends the data to the PIC32 through a UART transmission. The PIC32 processes the data and scales it to a PWM value, which it outputs to the vibration motors through an opto-isolator circuit. A block diagram of this can be seen in Figure 5.
We used the Pololu VL53L0X library (Version 1.0.2 (2017.06.27)) to read from the VL53L0X sensor. This library is a simplified version of the VL53L0X API from STMicroelectronics. It includes basic functions to initialize the sensors, set them in different modes of operation, start measurements and read data from the sensors. We used the default settings of the sensor, which reads distance measurements up to 120 cm, because it has the fastest read time: 33 ms. This library was the motivation for including an Arduino in this device. The Arduino program consists of a setup in which the sensors are initialized and an infinite while loop in which each sensor is read from, the data are filtered of out-of-range readings, and then all data are sent by UART to the PIC32.
Each sensor has a default address of 0x29. To read measurements from all six sensors, we set a unique address for each sensor during setup. First, every sensor is reset by pulling its SHDN pin low (the pin defaults high), putting it into shutdown mode. The Arduino digital output pins A4 to A9 are each connected to a sensor SHDN pin, and all are pulled low when the program begins. The sensors are then brought out of shutdown one by one: each SHDN pin is pulled high again, and that sensor is programmed with a new address using the setAddress() function provided by the Pololu VL53L0X library. The new addresses are 0x30 through 0x35.
Once the addresses were set, we entered a loop in which we used the readRangeSingleMillimeters() function to take and store a single distance reading from each sensor in turn. The readings were checked for out-of-range values—readings higher than the 1,200 mm that can be sensed in default mode. This was the only filtering of data performed at any point. Since the total time to read from all sensors is about 200 ms, the device can update the intensity of the motor vibration 4-5 times per second. It wasn’t worthwhile to take the time to perform running averages or reject outliers: the number one priority was real-time performance, and any single outlier was unlikely to be noticed by the user.
After the error filtering, the measurements are sent by UART in the order they were taken, from the sensor with address 0x30 to the one with 0x35. This is done through a series of Serial.print statements. Each sensor reading is followed by an asterisk (“*”) to relate the reading to a specific sensor, and each round of readings ends with a return character (“\r”) to indicate to the PIC32 that all six sensor readings have been sent. The Arduino continues this loop of taking sensor readings and transmitting them to the PIC32 indefinitely.
The PIC32 program uses a stackless threading library called “protothreads.” These threads have very little memory overhead. The program uses two protothreads and two interrupt service routines (ISRs). The main function schedules a single thread round-robin. That thread spawns a second thread to read data from the UART; when the second thread dies, the first thread processes the data from the UART and scales them into values that define the PWM duty cycles. Meanwhile, two ISRs continually update the PWM duty cycles and output six PWM signals.
First, the main function schedules a thread to repeat perpetually. The first thing this thread does is spawn a protothread called GetSerialBuffer (originated by Cornell professor Bruce Land) to receive and store the data sent by UART from the Arduino. This protothread builds a string from UART2 (which receives data on the RX pin connected through the cable to the Arduino TX pin) and accumulates it in a character array. The character “\r” is sent by the Arduino after the data from all six sensors have been transmitted, and it tells the PIC32 to break from GetSerialBuffer, having stored in the character array six number strings separated by asterisks. When the return character is received, the thread GetSerialBuffer dies and control returns to the original protothread.
We iterated through the character array one character at a time, accumulating each character into a second buffer until a “*” character was identified. In this way, one sensor reading at a time was accumulated in the second buffer. When the delimiting character is reached, the protothread converts the reading from characters to an int, scales the value into a PWM value and stores the integer in the corresponding index of an integer array. The sensor-data character buffer is then reset to accumulate the next sensor reading.
The scaling used was of the form A × (B − measure)^C, where “measure” is a sensor reading (in millimeters), B is a maximum distance (in millimeters), C is the exponent and A is a scaling coefficient that restricts the maximum duty cycle to the chosen range. After testing the strength of the vibration motors, we chose a maximum duty cycle of 37.5%. This maximum intensity is generated when an object is right in front of the sensor. Exponential scaling was chosen through testing, and it follows the rule that perceived vibration intensity is proportional to the log of the actual vibration intensity. We tried powers between 1 and 2 in our conversions to PWM, and settled on a power of 1.5. Our maximum measurable distance was 1,200 mm. To signal the user when an object enters the reach of the sensors, there is a sudden jump from no vibration to a low-intensity vibration.
CREATING THE JUMP
To create this jump we used a B value of 2,000 instead of 1,200, creating an offset of about a 10% duty cycle at 1,200 mm. Based on these parameters, the coefficient A is given a value that scales the distance reading to a certain number of cycles, since that is what the program uses to define the duty cycle. In this case, with a clock speed of 40 MHz and a period of 1 ms, a 37.5% duty cycle was 15,000 cycles long, leading to a coefficient A value of 0.1677. A graph of the scaling between distance and duty cycle is shown in Figure 6. The portion showing measured distances above 1,200 mm in reality maps to a zero percent duty cycle instead of the curve shown. That’s because values above 1,200 mm are never received. When no object is detected by the ToF sensor, an error value will be read and converted by the Arduino to a value that the PIC32 interprets as an out-of-range distance. The out-of-range distance is given a zero percent duty cycle, leaving the vibration motor off.
When the end of the character array is reached, scaled data from all six sensors have been stored in the int array. At this point, the protothread ends. Because it is the only thread scheduled, it immediately begins again. In this way, the PIC32 continuously reads the sensor data and updates the PWM signals.
The last part of the PIC32 code is devoted to generating PWM signals. We set the period of our PWM cycle to be 1 ms. Output compare channels can be configured to a PWM mode that handles generating the pulses. However, the duty cycle must still be updated from the new sensor readings, which is done in one of the ISRs. The Timer 3 ISR was used to update the five output compare modules, using the values generated by the scaling in the protothread. Because only five output compare channels are available on this version of the PIC32 MCU, an additional PWM signal must be generated using an ISR. The Timer 2 ISR is used to generate the last PWM signal. This is accomplished by flipping a bit based on a counter value that is calculated from the distance sensor reading, using the scaling detailed above.
TESTING AND RESULTS
The ToF sensors were tested at various distances and incidence angles to confirm that readings would be accurate enough to be useful. All six sensors used in the project were tested against a white painted wall in a room well-lit by sun and fluorescent lights. The distances tested were every 20 cm between 20 cm and 120 cm from the wall. These tests were repeated at zero, 20 and 40 degrees of incidence. Five readings were averaged together for each point and sensor. The normalized absolute error of these data is graphed in Figure 7. This testing showed a mean error under 5% over all conditions tested. This was a positive result, especially considering that angles in the test conditions could not be measured with high precision, and it was consistent with the manufacturer’s claim of approximately 3%-7% accuracy indoors, depending on object reflectance. An uncertainty of 5% at distances less than 2 m still allowed plenty of warning about objects in the user’s path.
System-level (product) testing was conducted indoors, on a white, green, yellow and blue tiled floor in a hallway. The hallway was lit by sunlight and fluorescent lights. Obstacles were placed at various points within a 15’ stretch of hallway. The tester—unaware of the locations of the objects—attempted to navigate through the stretch of hallway with eyes closed, using only the ETA. One of these tests is shown in Figure 8. In 10 tests, the user was able to navigate around the objects successfully six times. The results indicated that with more practice using the device, navigating around objects would be achievable. Practice helped gain a sense of what angle to point the sensors, what speed to walk and what scanning motion to perform to gain a picture of the environment. In addition, tuning the scaling of the sensor readings to the haptic feedback could improve sensitivity.
Additional informal testing involved navigating halls, doorways and staircases using the ETA. This testing supported the conclusion that practiced users can navigate around most objects without difficulty, as long as they don’t move more quickly than the environmental feedback updates. In practice, that means an ordinary walking pace.
Other system components were also tested. The output generated by the Timer 2 ISR was examined to ensure proper behavior, and waveforms of the PWM signals generated by the PIC32 were checked with an oscilloscope. The opto-isolator circuits were built and examined closely against the circuit schematic by two people, and the connections were tested using a multimeter. The circuit was then powered from a 4.5 V bench supply to ensure that no catastrophic failure or shorts would occur. After this, PWM signals generated by the PIC32 were input to the opto-isolator circuit, and the output PWM signal from the opto-isolator circuit was examined using an oscilloscope.
After examination of the waveform showed it had the correct shape, magnitude and frequency, the vibration motors themselves were tested by connecting them to the output of the opto-isolator circuit. At this stage of testing, the maximum duty cycle of 37.5% was chosen. Increasing the duty cycle beyond this point can cause the motors to overheat. The oscilloscope capture in Figure 9 shows the output of the voltage regulator while the vibration motors were operating at a high duty cycle—sensing an object very close to the sensors. The output of the regulator was set to about 4.5 V. The vibration motors draw about 200 mA when all six motors are operating at maximum duty cycle (37.5%). There was a voltage spike of about 3 V on the rising edge of the PWM. Though somewhat noisy, the regulator performed sufficiently well for this application.
The Electronic Travel Aid was successful in terms of how intuitively the vibration motors convey distance information to the user. Each tester that put on the wristband could point the sensor module arbitrarily and immediately know from the intensity of the vibration motors how far away the sensed objects were. Learning to navigate successfully with the device required some practice. After all, using only the ETA to navigate is like seeing the world only in the narrow spotlight created by the beam of a dim flashlight. A user has to develop a pattern and rhythm of sweeping the sensor module to gain information efficiently to build a picture of the surroundings.
Overall, the design of the ETA was simple and inexpensive, and the results of testing were positive. The efficiency of the transactions between the Arduino and the PIC32, and between the PIC32 and the motors, allowed the user to receive information without a noticeable delay, which is crucial for real-time operation. Any number of interesting modifications or extensions could be made to this project. Though this device is not likely to assist anyone in its current form, both haptic feedback and ToF cameras are exciting areas of development. We look forward to seeing continued advancements in assistive device technology.
For detailed article references and additional resources, go to the Circuit Cellar article materials webpage. The references marked in the article can be found there.
PUBLISHED IN CIRCUIT CELLAR MAGAZINE • June 2019 #347
Naomi Hess graduated from Cornell University with a BS in Electrical and Computer Engineering in May 2018. Her primary interests are power system planning and operation. She can be contacted at firstname.lastname@example.org.
Jun Ko is an M.Eng candidate in Electrical and Computer Engineering at Cornell University. He holds a BSEE from Korea Advanced Institute of Science and Technology.
Aaheli Chattopadhyay is currently pursuing her Master of Science in Computer Science at the Georgia Institute of Technology. She completed her undergraduate studies in electrical and computer engineering at Cornell University. Her interests lie at the intersection of machine learning and computer vision.