A Twist on Modern Instruments
Normally, you’d think that taking the strings out of a harp would be a downgrade. But in this article, Cornell student Alex Hatzis presents a system that does just that—replacing the harp strings with red lasers. Phototransistors are used to detect when the beams are intercepted by a person’s hand playing the harp, and some convincing real-time sound synthesis helps to create a new, high tech instrument.
My lab partner, Glenna Zhang, and I had the opportunity to work together on a few final projects during our time studying electrical and computer engineering at Cornell University. Even though those projects varied wildly in both design and complexity, we always tried to tie in one thematic constant, something for which we both share a passion: music. Technology has already given rise to a number of new ways to produce music. The unique sound of the electric guitar would not be possible without its amplifier, and computer synthesizers can replicate most instruments with remarkable accuracy, or can be used to produce entirely new sounds. Despite these advancements, we both feel there is plenty of untapped potential for technology to change how we produce music. Our final project for the course “ECE 4760: Designing with Microcontrollers,” now dubbed the “Laser Harp,” is one such attempt to tap into that potential.
Anyone equally interested in the world of technology and music will probably realize that our project is not the first device to be called a “laser harp.” While researching the idea, we found that laser harps already take on several different forms. The most common is a single unit placed on the ground that beams several coplanar, outward-fanning lasers toward the sky.
Our idea took closer inspiration from a display at a local museum, the Ithaca Sciencenter, which featured a traditional harp frame without any strings. However, if you ran your hand through the middle, sounds would play as if the strings were still present. Our project was our own take on this example. At the highest level, our harp can be broken into a few different subcomponents: the laser emitter and receiver subsystem, the string control/sound synthesis subsystem, and the physical harp frame.
Our laser harp is extremely intuitive to use. A block diagram of the harp subsystems is shown in Figure 1. When powered on, the eight laser pointers built into the top of the frame illuminate. They emit visible red light, but the beams themselves are not visible without the use of a fog machine or something similar. The harp is small enough to rest comfortably in one’s lap (Figure 2). This is how we designed it to be played—instead of resting on the floor like a typical full-size harp. Each of the lasers is carefully aligned to point at its own dedicated phototransistor circuit, which allows us to tell whether or not the corresponding laser pointer is currently shining on it. These lasers represent the strings of the harp.
In the current implementation, each of the eight “strings” corresponds to one of the notes in a C major scale. When you run your hand through the harp, you intercept the laser beams, and the phototransistor circuits register this. The control system interprets which strings are being plucked, and produces the correct waveform to be played through a speaker connected externally to the harp using a standard 3.5 mm aux cable. The sounds produced are designed to closely replicate plucked strings, and the result is quite pleasing to hear.
Not all the design choices we ended up with were obvious from the beginning, and the changes we made in response to issues we found along the way are responsible for a few of the most distinctive aspects of the final design.
Our initial design for the harp didn’t specify the use of lasers, because the display we were inspired by at the Ithaca Sciencenter didn’t have any lasers visible. Rather, our goal was simply to have invisible or intangible strings. Our first idea, and the first step we took in this project, was to try ultrasonic or infrared range sensors. This seemed like the obvious choice, because, even using units with mediocre accuracy, we would be able to easily register the presence of a hand in the harp. The issue arose when we began experimenting with the region of sensitivity, which, with most sensors of this type, extends outward like a cone. This meant that closer to the sensors, they were quite precise in telling whether a hand or finger was in front of them.
As you moved farther away from the sensors, however, the cone where the sensor could “see” became wider. For many applications, this would not be an issue, but when trying to emulate several slim strings running parallel to one another, this made differentiating among those strings very difficult. A person trying to play the middle string, for instance, might accidentally end up tripping the sensors both to its right and left. A conceptual comparison of different sensor detection areas is shown in Figure 3.
It was this concern that led us to begin experimenting with laser pointers. The pointers we chose were originally designed for pets, so they are not very high power, but their beam divergence was easily small enough for the dimensions of our harp. Each pointer we acquired was typically powered with a single AA battery, but the additional length added to accommodate the battery made it difficult to fit the pointers into the frame. We used pipe cutters to remove the section of the pointers dedicated to holding the battery, and soldered wires to the leads so they could be connected to a single power supply (Figure 4). Each pointer draws about 150 mA at 1.5 V, and since we wired the entire string of eight in parallel, the line draws 1.2 A at 1.5 V. In hindsight, this was not an optimal setup and required thicker, lower-gauge wire, but the power supplies in our lab still worked with this power draw.
The challenge then became that these lasers had no inherent way, as range sensors did, of telling whether their beams were being intercepted, so a separate system was required to check this. For this purpose, we opted to use BPW40 phototransistors, which have a peak response around 780 nm. This is somewhat removed from the 650 nm emitted by the lasers, but when properly aligned with the beam, we found the phototransistor response was still enormously pronounced. The change in the circuit’s voltage was so strong that we could almost treat it as a digital signal, but we still ended up working with it as an analog voltage for robustness. The strong response did, however, make it much easier to tell whether light was reaching the phototransistor.
Each phototransistor was built into an extremely simple circuit with Vcc (supply voltage) connected to its collector terminal, and a 1 kΩ resistor to ground on its emitter terminal. When light shines on the phototransistor, current flows through it and the resistor, causing the voltage across the resistor to rise. This means that a high voltage at the node between the phototransistor and the resistor corresponds to a string not being plucked, since the beam is not being broken. Consequently, when a beam is intercepted, comparatively little light shines on the phototransistor, very little current flows and the voltage across the resistor falls. We correlate this with a string being plucked.
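In sketch form, the plucked/unplucked decision reduces to a threshold test on the voltage read at that node. The 10-bit range and midpoint threshold below are illustrative assumptions, not the exact values from our code:

```c
#include <stdbool.h>

#define PLUCK_THRESHOLD 512   /* ADC counts, 10-bit scale; illustrative value */

/* A beam shining on the phototransistor pulls the resistor node high,
 * so a LOW reading means the beam is blocked, i.e. the string is plucked. */
bool string_plucked(unsigned adc_value)
{
    return adc_value < PLUCK_THRESHOLD;
}
```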
The full circuit schematic is shown in Figure 5. Each node between each phototransistor-resistor pair is connected to an eight-to-one, two-way analog multiplexer, the CD74HC4051E from Texas Instruments (TI). Analog multiplexers are much slower than their digital counterparts, but because the strings are used only for user input, this was not an issue. All eight phototransistor circuits are multiplexed to the same single ADC (analog-to-digital converter) on our microcontroller (MCU) board, which further simplified our design later on. We used simple I/O (input-output) pins to toggle the three address bits on this multiplexer and sequentially cycle through the phototransistor circuits—checking them one at a time. Using this method, the MCU was able to read more or less continuously from a single ADC, to determine which strings were being played and which were not.
We used a PIC32 MCU on a custom development board designed for our class, ECE 4760. If our software found that one of the polled strings had been plucked by the user, then the PIC would output the waveform corresponding to that string (or multiple strings) using the DAC (digital-to-analog converter) present on the development board. That DAC output would then be fed through a very simple, passive, RC low-pass filter with a cutoff frequency of a few thousand hertz. This filter was intended to reduce some of the quantization that is inherent in the DAC. Once past the filter, the signal was passed to a 3.5 mm audio jack, which could be used to play the sound on external powered speakers.
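For reference, the cutoff frequency of a first-order RC low-pass filter is f_c = 1/(2πRC). A quick sketch of the arithmetic (the component values in the test are illustrative, not our exact parts):

```c
/* Cutoff frequency of a first-order RC low-pass filter: fc = 1 / (2*pi*R*C). */
double rc_cutoff_hz(double r_ohms, double c_farads)
{
    const double two_pi = 6.283185307179586;
    return 1.0 / (two_pi * r_ohms * c_farads);
}
```

With a 1 kΩ resistor and a 33 nF capacitor, for example, this lands near 4.8 kHz, in the "few thousand hertz" range described above.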
Perhaps one of the most distinctive aspects of our project is its visual design. The harp has a handmade wooden frame that is designed to house all the electronics. Spending as much time as we did on the physical aspects of the harp was certainly not our original intent, but it made for a far better result than we had anticipated.
The first prototype we created was a cardboard box with holes at the top for mounting laser pointers, and spots at the bottom for affixing the phototransistors. We quickly realized that this type of frame, or any frame made from a weak material, would not be suitable for our needs. The main issue we encountered was with aligning the laser pointers to the phototransistors. The very reason we chose to use them in the first place—their precision—was making it nearly impossible to set them up reliably. Not only did we have to align the laser pointers to have their beams land on the phototransistors, but also the phototransistors themselves have a sharp angular response curve and had to be angled precisely to get consistent, easily discernible data. The response of these particular phototransistors begins to fall off sharply after 15 degrees, and is almost nonexistent past 20 degrees.
These factors, combined, meant that the specifications we had laid out for our project demanded a much more robust frame. The frame would have to keep the parts aligned when stationary, and it should not wobble or flex significantly when held. We decided that wood was the easiest material to work with that met our requirements for strength and stability. We used a few pencil sketches and a rough idea of how large the harp should be to create a 3D CAD model of the current frame in Autodesk’s Fusion 360 (Figure 6).
We had limited woodworking experience, so we decided it would be easiest for us to represent the harp as a series of two-dimensional pieces that we could cut out of plywood. We achieved this by projecting several of the profiles on the 3D model into a flat .dxf file (Figure 7). This type of file would normally allow us to cut out the pieces using a laser cutter, but the plywood we chose was too thick to allow this.
Instead, we used the profiles we generated as guides to cut out the pieces manually with a jigsaw, and used wood glue to assemble the pieces into the full frame. We sanded the sides where the different pieces did not align well, and added wood filler, mainly to help with the aesthetics, but also to make the frame more comfortable to hold. The last step was to spray paint the entire frame gold, which again was for purely aesthetic reasons.
With the frame fully assembled (pre-assembly pieces shown in Figure 8), we were able to install the eight laser pointers into the topmost arch, which has a gap in it designed to house them. From here, they shine down to the trapezoidal sound box. Once the lasers were fixed in place, we drilled holes for the phototransistors in the sound box, permanently aligning them with the laser pointers. The rest of the electronics also fit into the trapezoidal section of the frame, making for an extremely clean final design.
Sanding and painting the frame, along with packing all the electronics inside it, made for a much more enjoyable user experience. The harp was actually quite easy and pleasant to use as a result of several of the functional decisions we made during the assembly process.
Many small components in the code ensure our harp is working as intended, but the most important parts can be broadly divided into two sections: string polling and sound synthesis.
String Polling: The string polling section’s most basic task is exactly what the name implies—it has one thread dedicated entirely to repeatedly checking the state of the laser strings. It uses the I/O pins RB3, RB10 and RB8 to set the address bits and toggle through the eight inputs to the analog multiplexer, reading the value passed through to the ADC for each address. The way this thread is written, it reads the value on the ADC before updating the address I/O pins and yielding from the thread. This gives the ADC more time to set up while the thread is yielded, ensuring that each reading is accurate. When any of the strings is determined to be plucked, the polling thread sets a variable that lets the sound synthesis section of the code know which note to play. The code checks one string every millisecond, so all eight strings are checked every 8 ms, for an effective polling rate of 125 Hz per string. This is comparatively low, but more than sufficient for reading human inputs, while leaving enough CPU time for the real-time synthesis.
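The scan logic can be sketched in isolation by abstracting the hardware behind a function pointer. The address-bit math corresponds to the multiplexer select lines; the threshold value and the fake ADC are illustrative, not our actual driver code:

```c
#include <stdint.h>

#define NUM_STRINGS     8
#define PLUCK_THRESHOLD 512   /* ADC counts; illustrative value */

/* Scan all eight strings once. read_adc(ch) stands in for setting the
 * CD74HC4051E address bits (driven by RB3, RB10, RB8 on our board) and
 * sampling the shared ADC. Bit n of the result is set if string n is plucked.
 * The select bits for channel ch would be: ch & 1, (ch >> 1) & 1, (ch >> 2) & 1. */
uint8_t scan_strings(unsigned (*read_adc)(int channel))
{
    uint8_t plucked = 0;
    for (int ch = 0; ch < NUM_STRINGS; ch++) {
        if (read_adc(ch) < PLUCK_THRESHOLD)   /* low voltage = beam blocked */
            plucked |= (uint8_t)(1u << ch);
    }
    return plucked;
}

/* A fake ADC for demonstration: strings 2 and 5 are "blocked". */
unsigned fake_adc(int channel)
{
    return (channel == 2 || channel == 5) ? 100u : 900u;
}
```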
Sound Synthesis: The sound synthesis section is the key to our harp’s ability to create a convincing string sound in real time. We used the Karplus-Strong algorithm to synthesize our sounds. Karplus-Strong is a sort of “shortcut” for simulating an oscillating string using the wave equation.
The wave equation itself is a second-order, partial differential equation, which can be used to describe wave-like behavior in any situation by relating several of the features of the oscillating system. If we want to accurately model sound on a string, then the wave equation is the place to start.
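In its standard one-dimensional form, writing y(x, t) for the string’s transverse displacement and c for the wave speed, the equation reads:

```latex
\frac{\partial^2 y}{\partial t^2} = c^2 \, \frac{\partial^2 y}{\partial x^2}
```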
The wave equation in this form describes the relationship among the curvature, vertical acceleration and horizontal velocity of the wave. The details of how to use or solve this equation are beyond the scope of this article, but the important takeaway is that using a solution to this equation, we can model a vibrating string.
The algorithm takes advantage of the fact that one possible solution to the wave equation is a travelling wave on a string:
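One such travelling-wave solution, shown here in a standard form (the exact expression in the original figure may differ), is:

```latex
y(x, t) = A \sin(kx - \omega t), \qquad \omega = ck
```

Here A is the amplitude, k the wavenumber and ω the angular frequency.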
Notice that the sin() function makes the solution periodic. When a wave travels along a string with fixed ends (that is, the ends do not move, as would be the case in a harp or other string instrument), it reflects and inverts upon reaching the end of the string (Figure 9).
We represented this behavior in the code using a circular buffer shift register, where the length of the circular buffer is analogous to the length of the string. Using this technique, we replicated the periodic nature of the travelling wave. We started with two buffers travelling in opposite directions, with a factor of -1 applied at both ends to represent the waves being inverted at a fixed end of a string (Figure 10). Any real string has a certain “resonant” frequency. When a string is plucked, energy is added to it in the form of noise comprising all frequencies. All other frequencies die out as the resonant one lingers, which gives each string its distinct tone. The resonant frequency for each of our simulated strings is defined by the length of the buffer. Because a larger buffer means that it takes more time for the travelling wave to traverse it, a larger buffer corresponds to a lower frequency.
To begin implementing this in the code, we set up a hardware timer interrupt to run at 20 kHz, which is our synthesis frequency. The buffers shift their entries each time the timer interrupt occurs. Because the two factors of -1 at the ends cancel each other out, the pair of buffers collapses into a single buffer, cycling in one direction without any inversions. To set the length of the buffer for a certain note, we divide our sampling frequency by the frequency of the note we want to play. At a high enough sampling frequency, this circular buffer could always have an integer number of cells, but for most typical synthesis frequencies—in our case 20 kHz—we require a fractional number. For example, to play a C4 note (261.6 Hz) at a synthesis rate of 20 kHz, we would need 76.45 cells. This means that in addition to the 76 integer locations in our circular buffer, we needed a fractional time delay of some kind to achieve the required 76.45 cells and maintain tonal accuracy.
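The arithmetic for that example can be sketched directly; the integer part sets the buffer length and the remainder is what the fractional delay stage must supply:

```c
#define SYNTH_RATE_HZ 20000.0   /* our 20 kHz synthesis rate */

/* Number of delay cells needed for a note: synthesis rate / note frequency.
 * The integer part is the circular buffer length; the fractional remainder
 * must be produced by a separate fractional delay stage. */
double buffer_cells(double note_hz)
{
    return SYNTH_RATE_HZ / note_hz;
}
```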
Finally, low frequencies die out more slowly than higher ones on a real string, so a low-pass filter replicates this characteristic (Figure 11). Ideally, this filtering would happen to every cell at every time step, but applying it to only one transition in the circular buffer—when it loops back to the beginning—is not audibly different to a human listener, both because the loop happens so quickly and because, mathematically, these operations commute without issue.
In the actual ISR (interrupt service routine), we do not constantly resize the buffer (represented by an array) to change the note played, nor do we shift all the entries in the buffer. Instead, we simply move two pointers every time the ISR is called, and a separate variable tracks when those pointers should roll back to zero. We change that variable as necessary, effectively setting the length of the buffer. At this loop-back, we first low-pass filter the signal by taking the difference between the previous and current index, multiplying it by a low-pass coefficient, and then adding that value to the current index. This shifts the value downward if the previous index was lower, or upward if it was higher, thereby smoothing the waveform.
The next step implements the fractional time offset. The goal of the code in this section is to create a phase-shift all-pass filter, where the specified phase shift results in the required fractional delay. It takes the difference between the previous index and the current one, multiplies it by a corrector value specific to the note and corresponding to the necessary fractional component, and adds it to the previous index, to obtain the corrected current index. This corrector value does not depend on the length of the string; instead, each full loop through the buffer delays the waveform by the correct fraction. This results in accurate frequency readings with a tuning app, and the notes sound correct to the ear. The overall string sound is quite convincing as well. You can see and listen to a sample of the results in our project video (Figure 12).
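A minimal floating-point sketch of the Karplus-Strong loop captures the idea. This is the textbook variant, which averages adjacent cells at every step; our fixed-point PIC32 version instead applies the low-pass only at the buffer's loop-back point and adds the all-pass corrector described above:

```c
#define MAX_CELLS 256

static double buf[MAX_CELLS];
static int buf_len;
static int idx;

/* "Pluck" the string: fill the buffer with a zero-mean drive signal
 * (real code would use random noise; alternating +/-1 keeps this testable). */
void ks_pluck(int cells)
{
    buf_len = cells;
    idx = 0;
    for (int i = 0; i < buf_len; i++)
        buf[i] = (i % 2 == 0) ? 1.0 : -1.0;
}

/* Produce one output sample. Averaging adjacent cells acts as the
 * low-pass filter that makes high frequencies die out first, so the
 * resonant frequency (set by buf_len) is what lingers. */
double ks_step(void)
{
    double out = buf[idx];
    int next = (idx + 1) % buf_len;
    buf[idx] = 0.5 * (buf[idx] + buf[next]);
    idx = next;
    return out;
}
```

Repeatedly calling ks_step() at the synthesis rate yields a tone at roughly (synthesis rate)/buf_len that decays over time, just as a plucked string does.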
In the final implementation of this ISR, we used fixed-point arithmetic and a few small optimizations to synthesize four strings simultaneously, with each string using about 20% of the CPU. This means that at every timer interrupt, all these steps happen separately for four different strings.
If a string is determined to have been plucked by the string polling section, then another variable is set to add drive to the plucked string. After the drive has been added, vibrations continue to pass through the string until the damping coefficient causes them to die out. This creates a convincing string sound, ringing out for some time before dying out, independent of what the other strings are doing.
We encountered a snag at this point. Eight laser strings were built into our harp, but our MCU was limited to synthesizing only four strings simultaneously. We effectively had four “slots” that could be used for producing sound. The obvious solution seemed to be mapping two strings to each synthesis slot, so that plucking either the first or second laser string would play using the first synthesis slot, the third and fourth strings would use the second synthesis slot, and so on. The problem with this idea was that playing the two strings assigned to the same synthesis slot would require one of them to stop playing, and it would be impossible for them to play simultaneously.
We also found that changing the note assigned to a synthesis slot produced an unpleasant audible cutoff sound, since this required stopping the sound altogether to change the size of the buffer. However, re-plucking a string already being synthesized did not have this issue, so we wanted to change the notes assigned to the synthesis slots as infrequently as possible. Additionally, we wanted to allow recently played notes to continue playing for as long as possible, so they could overlap realistically with other played notes, regardless of which strings were chosen.
LRU DATA STRUCTURES
To achieve these objectives, we created a system of data structures resembling an LRU (least recently used) cache, as an expansion to the thread where we poll the laser string inputs. An LRU cache of size four, for example, remembers the four most recent unique items that were accessed. If you request item A, item A is also placed into the cache. If an item already in the cache is requested, that item is provided, and it becomes the most recently used. If a requested item is not already in the cache, the least recently used item (the one that has been in the cache the longest without being accessed) is removed, and the newly requested item is added in the most recent spot. This type of priority structure is exactly what was needed for the strings and synthesis slots.
We started with the cache itself, which we implemented as a 2 by 4 array. Each column of the cache is a single entry, but for each entry we require two pieces of information—the string synthesis slot being used (ranging from 0 to 3), and the note currently assigned to that slot. If, for example, you play the notes A, B, C and D, they appear in the cache in the order of D, C, B, A. If you play A again, it moves to the front of the cache, changing the order to A, D, C, B. If you then play an E note, B is kicked out of the cache, and the cache becomes E, A, D, C.
In this example, while a note remains in the cache, the synthesis slot it uses remains the same. This minimizes the number of times we switch the synthesized notes, and also ensures that, when we do switch synthesized notes, we choose the least recently used synthesis slot. This plays more softly than any of the more recently played notes, minimizing how noticeable the cut-off sound is. It makes the harp sound more realistic, and also makes it simpler to change which notes are playing.
Finally, to see if a note is in the cache—and if so, where it is located in the cache—we have a map array. This maps each of the eight notes corresponding to the laser strings, if present, to its index in the cache. Otherwise it is mapped to the value -1. An illustration of how these data structures look after the aforementioned series of events is shown in Figure 13.
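These structures can be sketched together as follows. Note numbers 0 through 7 stand in for the eight strings; the exact array layout differs from our code, but the behavior matches the example above (playing A, B, C, D, then A again, then E):

```c
#define NSLOTS 4   /* simultaneous synthesis slots */
#define NNOTES 8   /* one note per laser string */

typedef struct { int slot; int note; } Entry;

static Entry cache[NSLOTS];   /* cache[0] = most recently used */
static int note_map[NNOTES];  /* note -> cache index, or -1 if absent */

void lru_init(void)
{
    for (int i = 0; i < NSLOTS; i++) { cache[i].slot = i; cache[i].note = -1; }
    for (int n = 0; n < NNOTES; n++) note_map[n] = -1;
}

/* Returns the synthesis slot to use for `note`, updating recency order.
 * A note already in the cache keeps its slot (no audible cutoff); a new
 * note evicts the least recently used entry and takes over its slot. */
int lru_play(int note)
{
    int idx = note_map[note];
    if (idx < 0) {                              /* miss: evict the LRU entry */
        idx = NSLOTS - 1;
        if (cache[idx].note >= 0)
            note_map[cache[idx].note] = -1;     /* evicted note leaves the map */
        cache[idx].note = note;
    }
    Entry hit = cache[idx];
    for (int i = idx; i > 0; i--) {             /* shift more recent entries back */
        cache[i] = cache[i - 1];
        if (cache[i].note >= 0)
            note_map[cache[i].note] = i;
    }
    cache[0] = hit;                             /* promote to most recently used */
    note_map[note] = 0;
    return hit.slot;
}
```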
The data structures used to control sound synthesis were, in some senses, overengineered for the tasks they were required to perform. We even had presets left over in the code for several additional octaves of notes and half notes that were never enabled. These artifacts hint at how we originally intended the harp to have slightly more functionality. You could imagine how something like an octave key could be a logical addition to a harp like ours. There could even be some interactive way to set which notes are played by each string.
Most of our software was designed to be easily expandable for these features, but what really caused these trade-offs for us was the structural design and construction of the harp. Not only did the construction require a significant time investment on its own, but also our decision to package all the electronics neatly into the interior of the frame meant that adding an additional push button or something similar would require modifications to the frame. Nonetheless, adding such additional features is a logical extension of the project.
Even without these features, at a base level we achieved the goal that we set for ourselves during our project proposal. From a product standpoint it was even better than we had hoped. Our harp turned out to be an extremely enjoyable design challenge, as well as a product that is pleasant both to the eye and the ear.
Author’s Note: My sincerest appreciation and thanks to my lab partner, Glenna Zhang, without whom this project would have been impossible to create and much less enjoyable to work on.
For detailed article references and additional resources go to:
References as marked in the article can be found there.
Autodesk | www.autodesk.com
Microchip Technology | www.microchip.com
Texas Instruments | www.ti.com
PUBLISHED IN CIRCUIT CELLAR MAGAZINE • OCTOBER 2019 #351
Alex Hatzis is a Cornell class of 2020 student pursuing a BS in Electrical and Computer Engineering. His main interest is in embedded systems design.