
Guitar Video Game Uses PIC32

Realism Revamp

While music-playing video games are fun, their user interfaces tend to leave a lot to be desired. Learn how these two Cornell students designed and built a musical video game that’s interfaced using a custom-built wireless guitar controller. The game is run on a Microchip PIC32 MCU and has a TFT LCD display to show notes that move across the screen toward a strum region.

While many popular video games involve playing a musical instrument, the controllers used by the player are not the greatest. These controllers are often made of cheap plastic, and poorly reflect the feeling of playing the real instrument. We have created a fun and competitive musical video game that is played with a custom-built wireless guitar controller (Figure 1 and Figure 2). The motivation for the project was to experiment with video game interfaces that simulate the real-world objects that inspired them.

FIGURE 1
Front of the guitar controller. Note the strings and plectrum.

FIGURE 2
Back of the guitar controller

The video game is run on a Microchip PIC32 microcontroller [1]. We use a thin-film-transistor (TFT) LCD to display notes that move across the screen toward a strum region. The user plays notes on a wireless mock guitar, which is built with carbon-impregnated elastic as strings and a conducting plectrum for the guitar pick. The game program running on the PIC32 produces guitar plucks and undertones of the song, while keeping track of the user's score. The guitar is connected to an Arduino Uno and a Bluetooth control center, which communicates wirelessly with the PIC32.

The controller was designed to simulate the natural motion of playing a guitar as closely as possible. We broke down that motion on a real guitar into two parts. First, users select the sound they want to play by holding the appropriate strings down. Second, the users play the sound by strumming the strings. To have a controller that resembled a real guitar, we wanted to abide by those two intuitive motions.

FRET & STRUM CIRCUITS
At the top of the guitar controller is the fret board. This is where the users can select the sounds they want to play. Throughout the system, the sound is represented as a nibble (4 bits), so we use 4 strings to select the sound.

Each string works as an active-low push-button. The strings are made of carbon-impregnated elastic, which feels and moves like elastic but is also conductive. Each string was wrapped in 30-gauge copper wire, to ensure solid contact with any conductive surfaces. The strings are each connected to screws that run through the fret board and connect the strings to the fret circuit (Figure 3).

FIGURE 3
Complete controller circuit schematic (on guitar).

The purpose of the fret circuit is to detect changes in voltage across four lines. Each line is branched off a power rail and connected across a string to an input pin on an Arduino Uno. Current runs from the power rail across each string to its respective input pin, which reads a HIGH signal. To detect a push on the string, we grounded the surface into which the string is pushed. By wrapping the fret board in a grounded conductive pad and pushing the string into the fret board, we are able to ground our signal before it can reach the input pin. When this occurs, the associated pin reads a LOW signal, which is interpreted as a press of the string by our system.
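As a concrete illustration, reading the four fret lines into the nibble used by the rest of the system might look like the following Arduino-style sketch (the pin numbers are assumptions, not the authors' actual wiring):

// Hypothetical pin assignments for the four fret strings (active-low)
const int fretPins[4] = {2, 3, 4, 5};

void setupFrets() {
  for (int i = 0; i < 4; i++) {
    pinMode(fretPins[i], INPUT);   // each line is held HIGH through the powered string
  }
}

// Pack the four strings into a nibble: bit i = 1 when string i is pressed (reads LOW)
byte readFretNibble() {
  byte nibble = 0;
  for (int i = 0; i < 4; i++) {
    if (digitalRead(fretPins[i]) == LOW) {
      nibble |= (1 << i);
    }
  }
  return nibble;
}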

Along with the fret circuit, we needed a way to detect strums. The strum circuit is similar in its use of a copper-wrapped, carbon-impregnated elastic string. The string is connected through the fret board to an input pin on the Arduino, but is not powered. Without any external contact, the pin reads LOW. When voltage is applied to the string, the pin reads HIGH, detecting the strum. To mimic the strumming motion most accurately, we used a guitar pick to apply the voltage to the string. The pick is wrapped in a conductive material (aluminum foil), which is connected to the power rail. Contact between the pick and the string applies voltage to the string, and the resulting rising edge denotes a strum.
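A minimal sketch of that rising-edge detection, again with an assumed pin number:

// Hypothetical strum-string pin; it reads LOW until the powered pick touches the string
const int strumPin = 6;
int lastStrumState = LOW;

// Returns true exactly once per strum, on the LOW-to-HIGH transition
bool strumDetected() {
  int state = digitalRead(strumPin);
  bool rising = (lastStrumState == LOW && state == HIGH);
  lastStrumState = state;
  return rising;
}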

As shown in Figure 4, the direct user interface for the player is the guitar controller. The physical interaction with the guitar is converted to an encoded signal by an Arduino mounted to the back of the guitar. The Arduino Uno polls for a signal that denotes a strum, and then reads the strum pattern across the four strings. The signal is sent over USB serial to a Bluetooth control station, which uses a Python script to broadcast the signal to an Adafruit Bluetooth LE module. The laptop that we used as the Bluetooth control station established the link between the controller and the Bluetooth receiver, and proved invaluable for debugging and testing the system. Finally, the Bluetooth module communicated over UART with the PIC, which interpreted the user's signal in the context of the game [2].
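Putting the pieces together, the Arduino's main loop plausibly looks something like the following sketch, building on the hypothetical helpers above (the baud rate and the one-byte message format are assumptions):

void setup() {
  Serial.begin(9600);        // USB serial link to the Bluetooth control station (baud rate assumed)
  setupFrets();
  pinMode(strumPin, INPUT);
}

void loop() {
  // Poll for a strum; on a rising edge, sample the fret pattern and ship it out
  if (strumDetected()) {
    byte pattern = readFretNibble();
    Serial.write(pattern);   // one byte per strum; the Python script relays it over BLE
  }
}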

FIGURE 4
Shown here is a block diagram of the controller signals.

To enable communication between the Bluetooth module and the PIC, we connected the TX and RX lines on the module to the RX and TX lines on the PIC, respectively. These lines are shown in the diagram of the development board (Figure 5) as RA1 for RX and RB10 for TX. The Bluetooth module can be found as U7.
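A sketch of the corresponding UART2 initialization on the PIC32, assuming Microchip's legacy peripheral library, a 40-MHz peripheral bus clock and a 9600 baud rate (the authors' exact setup may differ):

#include <plib.h>

#define PB_CLOCK 40000000                     // assumed 40-MHz peripheral bus clock

// UART2 mapped onto the pins called out in Figure 5 (peripheral pin select)
void initBluetoothUART(void) {
    PPSInput(2, U2RX, RPA1);                  // RA1  <- Bluetooth module TX
    PPSOutput(4, U2TX, RPB10);                // RB10 -> Bluetooth module RX

    UARTConfigure(UART2, UART_ENABLE_PINS_TX_RX_ONLY);
    UARTSetLineControl(UART2, UART_DATA_SIZE_8_BITS | UART_PARITY_NONE | UART_STOP_BITS_1);
    UARTSetDataRate(UART2, PB_CLOCK, 9600);   // baud rate assumed
    UARTEnable(UART2, UART_ENABLE_FLAGS(UART_PERIPHERAL | UART_RX | UART_TX));
}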

FIGURE 5
PIC32 Development Board Schematic. Note the RX (RA1) and TX (RB10) lines used for UART communication, and the DAC outputs used for sound generation.

SOFTWARE DESIGN
Animation, score tracking and song generation were accomplished in two threads in the main C program on the PIC32 [3]. The animation thread was run at a constant frame rate, such that the user experiences a uniform rate during the game. It is slow enough, however, to allow for the game processing and UART communication protocol. The game thread is run as quickly as possible to check for correct notes played and to update scoring.
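One plausible way to structure these two threads, assuming the Cornell protothreads library commonly paired with this PIC32 setup (the drawing and scoring helpers are hypothetical placeholders):

#include "pt_cornell_1_2_3.h"        // assumed protothreads header

void drawNotes(void); void drawScorePanel(void); void checkStrumAndScore(void);
static struct pt pt_animate, pt_game;

static PT_THREAD(protothread_animate(struct pt *pt)) {
    PT_BEGIN(pt);
    while (1) {
        drawNotes();                 // erase old note positions, draw new ones
        drawScorePanel();            // score, score factor, game time
        PT_YIELD_TIME_msec(60);      // fixed frame period gives a roughly constant ~16 frames/s
    }
    PT_END(pt);
}

static PT_THREAD(protothread_game(struct pt *pt)) {
    PT_BEGIN(pt);
    while (1) {
        checkStrumAndScore();        // compare strums against the current note's timing window
        PT_YIELD(pt);                // run as often as possible between animation frames
    }
    PT_END(pt);
}

// round-robin scheduling in main():
//   while (1) { PT_SCHEDULE(protothread_animate(&pt_animate)); PT_SCHEDULE(protothread_game(&pt_game)); }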

The game thread handles all scoring and checking whether the correct note is played at the correct time. The thread keeps track of the current game time, and upon detecting a strum, compares it with a window of acceptable times to be strumming for the current note. For any note, this window starts at the note's "start time," and has a constant duration, which we refer to as the "user error tolerance." After a note's window has passed, we increment a pointer to the current note, and reset all variables related to sound generation.

Upon detecting a strum, we check that the current game time is within the window defined by the current note's start time and the user error tolerance. If it is, we then check whether the string pattern matches the correct value for the note's frequency. If both conditions are met, we increment the score by the score factor and increment the score factor by 1, unless the latter has already reached its maximum value of 30. If the note being played is a "G," which requires strumming two strings, the increment to the score is doubled. If a strum is detected at a time that is not within any note's timing window, or if the string pattern is incorrect, then we reset the score factor to 1 and decrement the score by 2, unless the score is already at its minimum value of 0.

If a note's time window has passed (that is, the current game time is greater than the note's start time plus the user error tolerance) and the user never made an attempt to play the note, then the user is penalized. As in the previous case, we reset the score factor and decrement the score.
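These rules condense into a sketch like the following (the note structure, variable names and the tolerance value are illustrative assumptions; the score-factor cap of 30 and the 2-point penalty come from the description above):

#define MAX_SCORE_FACTOR 30
#define USER_ERROR_TOLERANCE 300          // assumed window length, in game-time units

typedef struct { int startTime, pattern, isTwoString, y, yprev; } note_t;
extern note_t notes[];                    // song data, indexed by noteIndex
extern int noteIndex, score, scoreFactor;

void onStrum(int gameTime, int strummedPattern) {
    note_t *n = &notes[noteIndex];
    int inWindow = (gameTime >= n->startTime) &&
                   (gameTime <= n->startTime + USER_ERROR_TOLERANCE);

    if (inWindow && strummedPattern == n->pattern) {
        int points = n->isTwoString ? 2 * scoreFactor : scoreFactor;  // "G" doubles the increment
        score += points;
        if (scoreFactor < MAX_SCORE_FACTOR) scoreFactor++;
    } else {
        scoreFactor = 1;                  // wrong time or wrong strings
        score = (score >= 2) ? score - 2 : 0;
    }
}

void onNoteMissed(void) {                 // window passed with no strum attempt
    scoreFactor = 1;
    score = (score >= 2) ? score - 2 : 0;
}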

The cases above are shown in the game logic block diagram (Figure 6), which represents the states we will be in during the game. Note that at all points in the game, we are either in a correct or incorrect time period, and in each of those periods, we continuously check the other conditions previously mentioned.

FIGURE 6
Game logic utilized in game thread.

Once the last note has been played, the game state enters a final score display screen (Figure 7) and waits until the user restarts the game. The final display screen shows the correct number of notes played out of the total possible and the final score. It also prompts the user to restart the game if desired.

FIGURE 7
TFT display at the end of the game. Note the total score is displayed and the number of correct notes played out of the total possible notes.

The movement of all the notes across the TFT display is managed by the animation thread. This thread begins by initializing the TFT in a vertical (portrait) orientation. The notes start at the bottom of the screen and then move across the screen toward a "strum region." When the notes hover over that area, the user is supposed to strum the correct strings. The strum region, shown in Figure 8, is represented by empty circles of the same color as the note that will be played. We display the score, score factor and current game time on the sides of the TFT. We also display the musical note representation of the current string pattern.

FIGURE 8
TFT display mid game. Note the score, score factor, and current string pattern shown on the left side, the game time shown on the right, and the notes being displayed in the middle.

We had a hard requirement of at least 15 frames/s for drawing the notes and updating the scoring metrics, to eliminate tearing in the display. We therefore weighed trade-offs among the audio sampling frequency, the game-logic processing, and the number and frequency of updates on the TFT display. To meet both the audio and animation requirements, we implemented our sound generation with a sampling frequency of 8,000 Hz instead of a higher frequency. The final design allowed for a constant 16+ frames/s.

POSITIONING THE NOTES

Each time we enter the animation thread, we must update the y position of the next four notes to be played. The y position follows a function that ensures that the note is in the strum region when the current game time matches the note's start time. We erase the previous animation of a passed note by keeping track of the note's position before y is updated. The calculation for these positions is shown in the code in Listing 1. Active notes are drawn to the screen with colors matching their frequencies, as follows:

“C” = YELLOW; “D” = RED; “E” = GREEN; “F” = BLUE; “G” = RED & BLUE.

for (i = 0; i < 4; i++) {
    // remember where the note was drawn last frame so it can be erased
    notes[noteIndex+i].yprev = notes[noteIndex+i].y;
    // move the note toward the strum region as game time approaches its start time
    notes[noteIndex+i].y = 20 - ((currentGameTime - notes[noteIndex+i].startTime) / 10);
}

LISTING 1
Shown here is code for calculating the note y positions.

When a note has been played correctly, there is auditory feedback matching the feedback of the guitar. The sound generation occurs in an interrupt service routine (ISR) at a sampling frequency (Fs) of 8,000 Hz. Sound is generated using Direct Digital Synthesis (DDS) for the undertones of each note, and Karplus-Strong string synthesis for the guitar plucking noise when a note is played correctly [4]. The sound is output to a two-channel speaker using the two channels of a DAC (A for the undertones and B for Karplus-Strong). We write to both DAC channels through SPI.
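As a sketch of how such a dual-channel write might look, assuming an MCP4822-style dual 12-bit SPI DAC, the legacy PIC32 peripheral-library SPI calls, and a guessed chip-select pin (RB4):

#include <plib.h>

#define DAC_CONFIG_A 0b0011000000000000   // channel A, 1x gain, output active
#define DAC_CONFIG_B 0b1011000000000000   // channel B, 1x gain, output active

static inline void dacWrite(unsigned short config, unsigned short sample12) {
    mPORTBClearBits(BIT_4);               // chip select low (CS pin assumed on RB4)
    while (TxBufFullSPI2());              // wait for space in the SPI transmit buffer
    WriteSPI2(config | (sample12 & 0x0FFF));
    while (SPI2STATbits.SPIBUSY);         // wait for the transfer to complete
    mPORTBSetBits(BIT_4);                 // chip select high
}

// In the 8-kHz ISR:
//   dacWrite(DAC_CONFIG_A, DAC_A_data);  // undertone
//   dacWrite(DAC_CONFIG_B, DAC_B_data);  // Karplus-Strong pluck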

Our configuration includes setting up a timer (Timer2), which gives us a DDS sample rate of 8 kHz on a 40-MHz clock. We set the timer to trigger our DDS logic ISR. We set up Timer2 by first opening it to interrupt every 5,000 cycles, then configuring its interrupt priority, and finally clearing the interrupt flag, as shown in Listing 2.

OpenTimer2(T2_ON | T2_SOURCE_INT | T2_PS_1_1, 5000);
// set up timer with priority 1
ConfigIntTimer2(T2_INT_ON | T2_INT_PRIOR_1);
mT2ClearIntFlag(); // clear interrupt flag

LISTING 2
Code used to open Timer2, configure its interrupt priority and clear the interrupt flag for the 8-kHz DDS sample rate.

For the undertones, we looped through a 256-entry sine table using a 32-bit phase accumulator. Because the sine table is only 256 entries long, we used bit shifting to get the top 8 bits of the accumulator before doing a lookup. The phase increment value is determined by:

increment = F_out × 2^32 / Fs

where F_out is the note's output frequency and our sampling frequency Fs is 8 kHz. The calculated phase increment is added to the phase accumulator on every sample. The output frequency is changed based on the current note the user is supposed to play, regardless of whether the note is played correctly. This allows for audio feedback if the note is played incorrectly or if the user is unable to determine visually what note to play.
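In code, the DDS update in the ISR might look like the following sketch (variable names are illustrative):

// Sketch of the DDS update at the 8-kHz sample rate
static unsigned int phaseAccum = 0;
static unsigned int phaseIncr  = 0;        // set from the current note's frequency

#define TWO_32 4294967296.0                // 2^32
#define FS     8000.0

void setNoteFrequency(float fout) {
    phaseIncr = (unsigned int)(fout * TWO_32 / FS);
}

// inside the ISR:
//   phaseAccum += phaseIncr;
//   DAC_A_data = sineTable[phaseAccum >> 24];   // top 8 bits index the 256-entry table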

We also modulated the amplitude of the sound linearly from 0 to 1 at the start of each note, to avoid any annoying clicking at the start or end of a note. We chose linear ramping because of the simplicity of the calculation, and to maintain a nearly continuous sound for the user. The undertone is outputted to channel A of the DAC and played on one channel of a connected speaker.
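A minimal sketch of such a linear ramp, with an assumed ramp length:

#define RAMP_SAMPLES 256                   // assumed ramp length (~32 ms at 8 kHz)

int rampCount = 0;                         // reset to 0 whenever a new note starts

int applyRamp(int sample) {
    if (rampCount < RAMP_SAMPLES) {
        sample = (sample * rampCount) / RAMP_SAMPLES;   // amplitude rises linearly from 0 to 1
        rampCount++;
    }
    return sample;
}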

For the acoustic guitar string plucking, the basic filter for a Karplus-Strong implementation is shown in the block diagram in Figure 9. We implemented this for each note using fix16 arithmetic to utilize fewer CPU cycles in the interrupt [9]. An example of the C implementation of this filter is shown in Listing 3.

FIGURE 9
Block diagram of the Karplus-Strong filter implementation [10].

DAC_B_data = (C_table[pluckIndexIn] >> 16);   // sample out: convert fix16 to int
// low-pass stage: average the two delay-line samples and apply the damping constant
lowpass_out = multfix16(damping, (C_table[pluckIndexIn] + C_table[pluckIndexOut]) >> 1);
// all-pass tuning stage: replace the current entry with the time-delayed,
// averaged value adjusted by the tuning constant
C_table[pluckIndexIn] = multfix16(tune, (lowpass_out - last_tune_out)) + last_tune_in;
// update all-pass state variables
last_tune_out = C_table[pluckIndexIn];
last_tune_in = lowpass_out;

LISTING 3
C implementation of the basic Karplus-Strong filter used for the acoustic guitar string pluck.

Each note’s table is originally set to white noise, to simulate the initial randomness of an acoustic guitar string pluck that converges to a single frequency. We did not dampen, to simulate a high energy pluck and to allow the note to play continuously at the same energy level until it ends. The frequency of the note determines the correct amount of delay we require for each update cycle. Whenever a note is finished, its circular buffer array is refilled with white noise for the next time it is played.
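Refilling a finished note's buffer might look like the following sketch (the buffer type and noise scaling are assumptions based on the fix16 format mentioned earlier):

#include <stdlib.h>

// Fill a note's circular buffer with white noise to simulate the initial pluck energy
void refillPluckBuffer(int *table, int length) {
    for (int i = 0; i < length; i++) {
        table[i] = (rand() & 0xFFFF) - 0x8000;   // random fix16 values centered on zero
    }
}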

RESULTS AND DISCUSSION
We used unit testing for each component of the project, to ensure that every part of the system worked separately. We then combined each of these components and tested edge cases to ensure the implementation worked according to the specifications we set out to fulfill.

First, we constructed the physical set-up of the mock guitar using carbon-impregnated elastic and a conducting plectrum connected to an Arduino Uno [5]. We tested the physical set-up by first making a mock-up out of cardboard, to test the length and width for user comfort. After prototyping the design, we made modifications and then constructed the final guitar out of 1/8-inch plywood, with carbon-impregnated elastic for the strings and a conducting plectrum.

Second, we tested the note structures’ animation and timing visually, to ensure that the notes would appear with ample time for the user to play each note correctly and enter the strum region of the TFT at the correct time. We caught timing and display errors and corrected the physics and display of the note movement accordingly.

Third, the song construction and note structures were tested by printing values to the TFT and debugging according to the specification. By using this method, we caught timing errors when updating the note index of the song, and corrected them accordingly.

Fourth, we tested the sound generation by probing the DAC output with an oscilloscope and listening to the undertones. By listening, we could adjust the fine tuning and damping of the string generation to more closely resemble the frequency and sound of an acoustic guitar played with a high-energy pluck.

Fifth, we tested the Bluetooth communication between the Bluetooth control station and the Adafruit Bluefruit LE UART Friend to the PIC32, and caught additional timing errors for updating the strum value [6]. To correct these issues, we modified the UART library by updating a timer flag.

MORE ON GAME PLAY
The final game-play runs consistently at a little over 16 frames/s and shows no tearing or jumping in notes, scoring or timing on the display. The score factor updates consistently and accurately when notes are played correctly, when an incorrect note is played, or when there is no strum by the correct time. When playing a note that requires two strings at once, double score is added correctly.

The frequencies of our undertone for each of the five notes (C4, D4, E4, F4, G4) have errors of less than 1%, according to measurements from the oscilloscope. The errors for the frequencies from the Karplus-Strong string synthesis vary from note to note, because the integer-length shift register (delay line) quantizes the achievable frequencies at the fixed 8-kHz sampling rate. The synthesized frequency for each note, its natural frequency and the percent error are shown in Table 1 [7].

Note   Natural Freq. (Hz)   Synth. Freq. (Hz)   Error (%)
C4     261.6                258.0               1.4
D4     293.7                296.2               0.8
E4     329.6                333.3               1.1
F4     349.2                347.8               1.4
G4     392.0                400.0               2.0

TABLE 1
Frequency errors for each of the five notes

Further error reduction is feasible by adding fractional-delay stages to the Karplus-Strong algorithm and tuning each note individually. We instead chose a fixed tuning constant that minimizes the error across all notes, verified by audibly testing the sound generation.

There are many opportunities to implement further functionality in the game. Although we found a good medium for the user tolerance error in the game, adding a feature that allows the user to choose the difficulty would be possible in our current implementation. We could also extend the game to allow multiple-string chords to be played. This feature could be extended for playing multiple notes simultaneously, simply by adding and averaging the components of their Karplus-Strong buffers. There are also opportunities for extending the implementation of the mock guitar. While we used carbon-impregnated elastic to create a connection for the strings and strumming, an extension could allow fretting on the same string by measuring the resistance change when stretched.

The real bottom line for this project is that the game is exciting and engaging to play! We set out to create a fun video game designed to simulate the basic hand movements of playing a guitar. Our end result was a successful, functional video game that allows the user to play various note patterns for our implementation of “Ode to Joy” [8].

The scoring component and error noise create a competitive and educational feature that encourages the user to improve through audio-visual feedback. Check out the YouTube video of our project.

For detailed article references and additional resources go to:
www.circuitcellar.com/article-materials
References [1] through [10] as marked in the article can be found there.

RESOURCES
Adafruit | www.adafruit.com
Microchip Technology | www.microchip.com

PUBLISHED IN CIRCUIT CELLAR MAGAZINE • MARCH 2019 #344


Jake Podell (jhp246@cornell.edu) is a Junior at Cornell University studying Computer Science in the School of Engineering. He is also a member of the CUAir, Cornell’s Unmanned Aerial Vehicle project team, and will be interning at Facebook over the summer.

Jonah Wexler (jfw95@cornell.edu) is a Senior at Cornell University studying Electrical and Computer Engineering with a Dyson Business Minor for Engineers. Next year, he will be working at BlackRock in the Portfolio Analytics Group.

Jake and Jonah have been friends throughout college and have worked on various engineering projects with each other.
