Projects Research & Design Hub

Building a Smart Weather Cube

Using a Raspberry Pi 3B+

The Raspberry Pi, combined with compatible RPi add-ons (HATs), is a versatile technology for all types of projects. Here, learn how two Cornell students built SmartCube, a wirelessly controlled device for your desk that displays weather information graphically with LEDs. While showing weather info, it also functions as a desk lamp or night light, displaying calming, colorful animations.

  • How to build a wirelessly controlled desktop device that displays weather information graphically with LEDs

  • How to design the hardware subsystem

  • How to integrate the OpenWeatherMap API into the code

  • How to do the interfacing between the systems

  • How to set up the LED matrix

  • How to develop the multi-threaded software

  • How to understand the timing requirements

  • Microchip PIC32 MCU

  • Raspberry Pi 3B+ SBC

  • WS2812 RGB LEDs

  • Touch sensor

  • Adafruit piTFT touchscreen

  • Adafruit Pi Camera

  • Raspberry Pi OS

  • Pygame library

  • HM10 module

  • 3D printers

Our project, SmartCube, is an easily accessible source of weather information for your desktop. It displays temperature and animations based on the weather conditions (clouds, rain, thunderstorm, snow), becomes a desk lamp by illuminating a plethora of hues (Figure 1), and can be used as a night light by displaying calming animations in your preferred colors. Additionally, it can be conveniently controlled wirelessly via Bluetooth.

FIGURE 1 – The fully integrated system of the SmartCube weather display and desk lamp

SmartCube combines the strengths of the Raspberry Pi and the PIC32 (Figure 2). The Microchip PIC32 microcontroller (MCU) handles all timing-critical tasks, such as reading input from a touch sensor, driving 200 individually addressable RGB LEDs and updating a TFT display (Figure 3). The Raspberry Pi was used mainly for its user interface design tools, its connectivity capabilities and its processing power. It allows the user to see live weather information, communicate commands to the PIC32 via Bluetooth Low Energy (BLE), and detect shapes and colors to display on the lamp.

FIGURE 2 – Block diagram of the SmartCube system
FIGURE 3 – Schematic diagram. It includes the MCU (PIC32); the input devices (touch sensor and Bluetooth module); and the output devices (TFT display and LED strip).

The Raspberry Pi used the built-in Wi-Fi module, a piTFT touchscreen and a Pi Camera as the input interfaces. With the built-in Wi-Fi module, we were able to connect to the Internet and download live weather information from a weather forecast website, along with the time and date. The touchscreen allows the user to interact with buttons in a custom-built GUI, which controls the function modes of the lamp. The Pi Camera lets us implement shape and color detection, to be later displayed in the lamp (computer vision).

For output interfaces, the Pi system had the piTFT display and the built-in BLE module. The piTFT displayed live weather information, a forecast of the weekly weather and the live feed of the camera when in use. The built-in BLE module communicated the functioning modes to an HM10 BLE module connected to the PIC32 system, enabling it to make the appropriate adjustments to the lamp.

The hardware of this subsystem consisted of the Raspberry Pi 3B+ board, an Adafruit piTFT touch display, an Adafruit Pi Camera and a 5V power bank. The piTFT has a resolution of 320×240 pixels, and measures 2.8″ diagonally. The camera has a resolution of 8Mpixels, and is capable of capturing 720p video at 60fps. Most of the hardware was in the PIC32 subsystem. The addition of a 5V power bank made it possible for the entire subsystem to be portable.

Our Pi ran Raspberry Pi OS (Buster release). The program on this subsystem consists of four subroutines: GUI, API integration, Bluetooth connectivity and computer vision. With the Pygame library [1], we created a graphical user interface to display weather information and to control the functioning modes of the lamp. The design was inspired by mobile weather apps, to make it appealing and familiar to the end user (Figure 4).

FIGURE 4 – GUI developed with Pygame [1]

We integrated the OpenWeatherMap API [2] into our code to obtain live weather information. This API offers a free subscription that provides the current weather, a weekly forecast and up to 60 calls per minute, which allows the weather to be refreshed as often as once per second. We used temperature, sky condition, pressure and humidity. Each API call returns a JSON string with extensive information about the current weather. The string is parsed using Python’s JSON library to obtain the desired parameters and filter out unnecessary information. The API includes an icon pack to illustrate different sky conditions. The temperature is reported in kelvins, so we wrote a function to convert it to degrees Fahrenheit.
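A minimal sketch of this parsing and conversion step is shown below. The field names follow OpenWeatherMap's documented current-weather response; the function names and the trimmed sample payload are ours, for illustration only.

```python
import json

def kelvin_to_fahrenheit(k):
    """Convert a temperature from kelvins to degrees Fahrenheit."""
    return (k - 273.15) * 9.0 / 5.0 + 32.0

def parse_weather(payload):
    """Extract the fields SmartCube uses from a current-weather JSON string."""
    data = json.loads(payload)
    return {
        "temp_f": round(kelvin_to_fahrenheit(data["main"]["temp"])),
        "condition": data["weather"][0]["main"],   # e.g. "Clouds", "Rain"
        "pressure": data["main"]["pressure"],      # hPa
        "humidity": data["main"]["humidity"],      # percent
    }

# A trimmed example response (real responses carry many more fields):
sample = '{"weather":[{"main":"Clouds"}],"main":{"temp":283.15,"pressure":1012,"humidity":81}}'
print(parse_weather(sample))   # 283.15 K is 50 degrees Fahrenheit
```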

An initial objective of ours was to implement face recognition, so the lamp would recognize different users and adjust preferences accordingly. Due to time constraints, however, we resorted to implementing shape/color detection using Python’s OpenCV library. Following a tutorial in [3], we were able to successfully detect shapes and colors. Contour detection and color masking worked together in our algorithm, because we needed to isolate colors effectively before detecting the shapes in those colors. To achieve this, we converted the color format into HSV, and optimized the thresholds for detection before we actually ran the code. We recommend performing the calibration in lighting conditions dimmer than the environment where it will be used, so that it will be capable of detecting and masking out colors more effectively.

We used a predefined function to reduce the noise in the edges of the detected shapes and then approximate the number of sides. Based on this number, we determined whether a shape was a triangle or a rectangle. Sometimes the system mistakenly detected triangles as rectangles. The issue surfaced mainly under poor lighting conditions, which led to a noisy image with inconsistent color, and to the detection of a false fourth side. To avoid this, we added some debouncing. The system would then determine the shape to be a rectangle only after several dozen consecutive detections of rectangles. This process increased the detection delay by approximately 1 second, but also increased the detection accuracy to approximately 95%!
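The debouncing idea above can be sketched as a simple run-length counter. The class name and the threshold value are illustrative (the article says only "several dozen" consecutive detections):

```python
class ShapeDebouncer:
    """Report 'rectangle' only after N consecutive rectangle detections."""

    def __init__(self, threshold=36):   # "several dozen" frames; exact value is an assumption
        self.threshold = threshold
        self.run = 0

    def update(self, detected):
        """Feed one per-frame detection ('triangle' or 'rectangle');
        return the debounced shape."""
        if detected == "rectangle":
            self.run += 1
        else:
            self.run = 0   # any non-rectangle frame resets the streak
        return "rectangle" if self.run >= self.threshold else "triangle"
```

A single noisy frame that falsely shows a fourth side resets the streak, which is why spurious rectangle detections no longer leak through.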


After many unsuccessful attempts at connecting the Pi with the HM10 module, we got it to work with Python’s Bluepy library. It offers functions to open a connection to other BLE devices with a known MAC address. Once connected, it can send information by writing to specific handles in the receiver. In the case of the HM10, handle 0x0012 corresponds to characteristic 0000ffe1-0000-1000-8000-00805f9b34fb. This characteristic is used by the HM-10 to read/write up to 20 bytes at a time. Once the data are sent from the Pi to the HM10, they are ready to be processed by the PIC32.
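Since the HM-10 characteristic accepts at most 20 bytes per write, longer payloads must be split. A sketch of that framing is below; the Bluepy calls are shown commented out because they require BLE hardware and the module's real MAC address (the one shown is hypothetical):

```python
HM10_HANDLE = 0x0012   # handle for characteristic 0000ffe1-... on the HM-10
MAX_WRITE = 20         # the HM-10 reads/writes up to 20 bytes at a time

def chunk_payload(data: bytes, size: int = MAX_WRITE):
    """Split a payload into pieces the HM-10 can accept in one write."""
    return [data[i:i + size] for i in range(0, len(data), size)]

# from bluepy import btle
# dev = btle.Peripheral("AA:BB:CC:DD:EE:FF")      # hypothetical MAC address
# for piece in chunk_payload(b"t0905"):
#     dev.writeCharacteristic(HM10_HANDLE, piece)
```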

We built upon the example code for serial UART communication in Cornell’s ECE 4760 course website [4]. Since we were receiving information from another machine, we used the function PT_GetMachineBuffer_aux() to properly receive and process the messages. To use it, we first needed to set termination conditions to tell the PIC32 when the data transmission had finished. We decided to use a termination count to end the transmission after five characters had been received.

Upon receiving the string, it is placed in a buffer, ready to be decoded. Our initial encoding used only one character, because the number of animation modes could be expressed with 8 bits. However, we increased the message length to five characters to ease the process of receiving the current time and temperature. Only one character is needed for each animation, so in the instructions that change animation modes, we send four additional characters to keep the termination count correct. The usable character containing the instruction is retrieved from the first position in the buffer array.

The moment the lamp turns on, it receives the time directly from the Pi. The time arrives as a 5-char string with the format “thhmm”, where “t” indicates that the string brings time information, and the rest of the string comprises the individual digits necessary to display in a 12-hour clock. For instance, a received string of “t0905” tells the PIC to update the time to 09:05. The temperature was received after the Pi sent the command to display “Weather” mode. It was also received as a 5-char string, but only the first three characters would be used. The format of the temperature message is “xTTss”, where “x” indicates that the message brings the temperature, “TT” is the two digits of the temperature in degrees Fahrenheit, and “ss” is “stuffing” characters.
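The 5-character protocol can be sketched as a pair of encode/decode helpers. The "t" and "x" sentinels and the "ss" stuffing come from the article; the function names are ours:

```python
def encode_time(hour, minute):
    """Build the 'thhmm' time message, e.g. (9, 5) -> 't0905'."""
    return "t{:02d}{:02d}".format(hour, minute)

def encode_temperature(temp_f):
    """Build the 'xTTss' temperature message; 'ss' is stuffing to reach 5 chars."""
    return "x{:02d}ss".format(temp_f)

def decode(msg):
    """Decode a 5-char message into (kind, payload)."""
    if msg[0] == "t":
        return ("time", (int(msg[1:3]), int(msg[3:5])))
    if msg[0] == "x":
        return ("temperature", int(msg[1:3]))
    return ("mode", msg[0])   # mode commands carry the instruction in the first char
```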

Whenever data are received, this thread updates the global variable bt_data with the first character in the receive buffer array. Inside the thread, there is a switch statement controlled by bt_data to update the global variables that control the current time and temperature. This variable is also used throughout the rest of the code to control animations.


The Bluetooth module and a soft potentiometer are the two input interfaces of the PIC32 (Figure 3). Once the instructions received by the HM10 are decoded, the output interfaces respond appropriately. The soft potentiometer changes the colors of the LEDs when the lamp is in the Fullbright, Waterfall and Color Fade modes.

For output interfaces, we have a TFT display and an individually addressable RGB LED strip (Figure 3). On the TFT, the current time and temperature update consistently and accurately. The LED strip has 200 LEDs distributed equally in 5×10 matrices among the sides of the cube. All four sides are connected together and display the same animation based on the command received from the Pi.

We designed the lamp’s casing using Autodesk Inventor (from Autodesk). The casing consists of three parts: the lamp base, the LED matrix mount and the lamp cover (Figure 5). The dimensions of all these parts were dictated by the dimensions of the LED matrix. The parts were 3D printed using a transparent PLA (polylactic acid) filament for the cover and a silver filament for the base and LED mount. To obtain the right percentage of light diffusion, the case was 3D printed with a 60% infill and 0.2mm layer height. We made the 3D printable files available for downloading [5].

FIGURE 5 – 3D-printed lamp casing (measurements in mm). a) base, (b) LED mount/holder and (c) cover

The main technical detail to consider for the LED matrix design was the maximum current that the LEDs can potentially draw. In our project we used a NeoPixel WS2812B strip from Adafruit. One LED draws a maximum current of 60mA when all three colors are illuminated at maximum luminosity (white light). Keeping the total current of the system in mind, we limited the number of LEDs to 200, resulting in a 10-row by 5-column LED matrix per side (Figure 6). In addition, we decided never to set the intensity of the LEDs to its maximum value. In our case, we limited the maximum total current of the LED matrix to 3A.

FIGURE 6 – LED matrix. The lamp has one matrix per side, for a total of four matrices.
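The budget arithmetic works out as follows. The 3A limit and 60mA-per-LED figure are from the article; the resulting 25% brightness cap is our inference, not a value the article states:

```python
LEDS = 200
MAX_MA_PER_LED = 60      # worst-case full-white draw per WS2812B
BUDGET_A = 3.0           # chosen total-current budget for the matrix

worst_case_a = LEDS * MAX_MA_PER_LED / 1000.0   # 12 A if driven flat out
max_fraction = BUDGET_A / worst_case_a          # fraction of full brightness allowed
max_value = int(255 * max_fraction)             # per-channel cap on the 0-255 scale

print(worst_case_a, max_fraction, max_value)    # 12.0 0.25 63
```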

All sides of the cube replicate the same information using the same data line. To accomplish this, we connected all the power, ground and data leads to one another. The data lead sends the information bit-wise to the LED matrix. In effect, we send the information intended for one side of the cube, which, in turn, is replicated on all the other sides.

The 2.2″, 18-bit color TFT LCD display has 320×240 color pixels, and is located on the top of the cube. To understand what’s going on, it’s helpful to watch the project video at this point (see the YouTube link below, Figure 7). On the display, the current time flashes on and off every second, with precision up to the minute. In addition, the display provides the outside temperature in degrees Fahrenheit. The display communicates with the MCU over a 4-wire SPI.

The circular soft potentiometer on top of the cube allows the user to select the color shown by the lamp. The potentiometer presents a 10kΩ resistance across its ground and VCC (3.3V) leads. The resistance from the middle pin to ground and to VCC varies depending on where the user presses. When not pressed, the middle pin floats, so to obtain accurate readings, a 220Ω resistor ties it to ground; the reading is then 0 whenever the potentiometer is not pressed.


We used a rate scheduler to schedule the different threads in our program. This ensured that time-critical tasks were given more time in the processor. We set the thread protothread_timer to have the highest priority. This thread contains all the conditions under which the LEDs light up and show animations, and needed the highest priority to ensure that the LEDs worked correctly. (Driving these LEDs requires high-precision timing when sending bit streams.)

The second highest priority belongs to the thread protothread_demo, which handles the demo mode. It also needs high priority relative to the other threads, because it interacts closely with the LED setup and usage. The thread protothread_pot has the third highest priority, because it needs to make precise ADC (analog-to-digital converter) readings while in use. It has lower priority because these readings, though precise, do not occur often or quickly compared to the CPU clock, since they are user dependent.

Finally, with lowest priority we have threads protothread_bluetooth and protothread_clock. The Bluetooth thread handles information sent from the Raspberry Pi to set mode, time and weather. Since it does not have any time-critical task, running it with low priority allows us to reallocate time needed in the threads with higher priority. The thread protothread_clock sends the time to be displayed on the TFT display. The only time-sensitive aspect is to update every second, and this time is large compared to the CPU computations. Therefore, the thread always ends up executing precisely.

We based our implementation of controlling the LEDs on code found in Cornell’s ECE 4760 course website [6]. To address each LED in the matrix individually, we used a NeoMove function, which receives as parameters the LED’s X and Y coordinates and the RGB values that the LED will take. The algorithm decodes the received X-Y coordinate to the corresponding LED number in the strip. The matrix comprises 50 LEDs arranged in 5 columns by 10 rows, where each column and row represents the X and Y coordinate, respectively. An X-Y coordinate of (1,10) corresponds to the first LED, and an X-Y coordinate of (5,1) corresponds to the last (50th) LED on the physical strip.
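A mapping consistent with those two endpoints can be sketched as below. The article does not state the wiring order within each column, so the assumption here (each column wired from the top row down, with no serpentine reversal) is one possibility, shown in Python although the firmware is C:

```python
ROWS, COLS = 10, 5   # 10 rows (Y) by 5 columns (X) per side

def xy_to_led(x, y):
    """Map a 1-based (x, y) coordinate to the 1-based LED number on the strip.

    Assumes each column is wired from the top row (y = ROWS) down to the
    bottom row (y = 1), column after column, with no serpentine reversal.
    """
    return (x - 1) * ROWS + (ROWS - y) + 1
```

Under this assumption, (1,10) maps to LED 1 and (5,1) to LED 50, matching the endpoints described above.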


The LED strip has a complex way of receiving and interpreting the data line sent by the MCU, and it operates under critical timing requirements. To satisfy them, we created an array with the bit stream that will be sent to the LED strip, using the function NeoDraw. The data transfer for each LED is green, then red, then blue, 8 bits each, with the MSB (most significant bit) first. Using the array, we decompose the values into bits and send them according to the timing requirements stated in the datasheet. The data transfer time chart is shown in Table 1 [6].

TABLE 1 – Data transfer time chart [6]
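The bit ordering can be illustrated with a short helper (shown in Python for clarity, although the firmware is C; the function name is ours). Only the ordering is modeled here; the nanosecond-scale pulse widths in Table 1 are what the real firmware must honor:

```python
def led_bits(r, g, b):
    """Expand one LED's color into its 24-bit WS2812 frame: G, R, B, MSB first."""
    bits = []
    for byte in (g, r, b):                 # note the green-red-blue byte order
        for shift in range(7, -1, -1):     # most significant bit first
            bits.append((byte >> shift) & 1)
    return bits

# Pure red (255, 0, 0) yields 8 zero bits (green), 8 one bits (red), 8 zero bits (blue).
```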

Timing is very important when sending data on this particular device, so minimizing tasks that waste time is critical to obtaining good performance and reducing bugs (unwanted flickers). Throughout our code we often opted for time delays between updating variables and sending them, so that the data stream was not corrupted with unwanted bit values. Refer to Adafruit’s WS2812B specification sheet for precise timing [7].

To create animations, we began by developing a function that can be invoked when all LEDs need to be illuminated (or turned off) with the same colors simultaneously. It takes as input parameters three integer values that correspond to the RGB values the LEDs will take. We use this function in combination with individual LED addressing to create the custom-made animations listed in Table 2. The constructed animations for specific weather conditions and modes can be seen by scanning the QR code in Figure 7 and watching the project video.

TABLE 2 – Encoding LED animations

The thread handling all animation and modes is called protothread_timer. Its function is to decode the Bluetooth value received, and translate it into animations using the LEDs. Table 2 shows how each character received via Bluetooth maps to a different animation. Under each case, we use the LED matrix control function to generate different combinations of LEDs that result in the perceived animations. We also set the color of the LEDs here, and the timing at which we want animations to run.

When creating moving graphics, timing needs to be accounted for, so that the animation can be correctly perceived and distinguished by the human eye. To slow the updates, which the CPU could otherwise execute far faster than the eye can follow, into noticeable changes in a graphic, we use yield times within the code. They keep each frame on long enough to be seen and the changes slow enough to be perceived. This yield time is also crucial for our scheduling algorithm and the concurrency of our code.

An additional thread was created to read the potentiometer and update the appropriate variables. Inside this thread we handle all inputs from the potentiometer and translate them into colors that animations will use. We wanted to be able to gradually change through multiple color combinations ranging from blue to pink (Figure 8). White was added after the pink color. Using the ADC, we get a value reading from the potentiometer in the range of 0 to 1,023.

FIGURE 8 – Color wheel based on RGB LEDs [8]

When the user is not pressing the potentiometer, the ADC reading will always be 0 (as discussed earlier). The ADC values are divided into 35 possible sections. Each section or division corresponds to values set in global variables for the colors Red, Green and Blue that the LEDs will have. They are set as global variables because they are used inside the threads where animations are called and executed. This allows us to change the color of animations instantaneously, while allowing the animations to be concurrently executed.
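The quantization into 35 sections can be sketched as follows. The article specifies only a blue-through-pink palette ending in white, so the exact RGB ramp and the function names here are assumptions, shown in Python although the firmware is C:

```python
SECTIONS = 35
ADC_MAX = 1023

def adc_to_section(adc):
    """Quantize a 0-1023 ADC reading into one of 35 sections (0 means not pressed)."""
    if adc == 0:
        return 0
    return min(SECTIONS, 1 + adc * SECTIONS // (ADC_MAX + 1))

def section_to_rgb(section):
    """One illustrative blue-to-pink ramp with white in the last section."""
    if section == 0:
        return (0, 0, 0)               # potentiometer untouched: no color change
    if section == SECTIONS:
        return (255, 255, 255)         # white comes after pink, per the article
    red = 255 * (section - 1) // (SECTIONS - 2)   # ramp red up while blue stays full
    return (red, 0, 255)
```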

We wanted a thread to handle the updating of the time on the TFT. This thread ran every second for accurate timekeeping. Since the time was received digit by digit, the clock had to be updated the same way. For instance, when 60 seconds had elapsed, the second counter was reset, and the right-hand minute digit was incremented. If the right-hand minute digit was 9, it was reset, and the left-hand minute digit was incremented. The same applied to the digits composing the hour. The clock was implemented as a 12-hour clock (Figure 9). The time was displayed on the TFT, flashing on and off every second.

FIGURE 9 – Clock on TFT display
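The rollover logic amounts to the following. The firmware carries digits individually; this sketch computes the equivalent result with integers for clarity, and the function name is ours:

```python
def tick_minute(hh, mm):
    """Advance a 12-hour clock, stored as two-character digit strings, by one minute.

    hh and mm mirror the digit-by-digit form received over Bluetooth,
    e.g. ('09', '05') for 09:05.
    """
    minute = int(mm) + 1
    hour = int(hh)
    if minute == 60:       # minute rollover carries into the hour
        minute = 0
        hour += 1
        if hour == 13:     # 12-hour clock: 12:59 rolls over to 01:00
            hour = 1
    return "{:02d}".format(hour), "{:02d}".format(minute)
```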


[1] Pygame library:
[2] OpenWeatherMap API:
[3] Tutorial for detecting shapes and colors:
[4] Example code for serial UART communication:
[5] Files for the parts of the 3D-printed casing:
[6] Code for controlling the LEDs (includes data transfer time chart):
[7] Adafruit WS2812B spec sheet:
[8] The RGB color wheel:

Adafruit |
Autodesk |
Microchip Technology |



Alberto Lopez Delgado is a Cornell University alumnus with a Bachelor’s degree in Electrical and Computer Engineering. He currently works as an SoC Design Verification Engineer at Intel. He is mostly interested in computer systems and MCUs. He can be reached at

Carlos Gutierrez is a Senior studying Electrical and Computer Engineering at Cornell University. He has worked as an intern with Lockheed Martin for the past two summers. He is interested in embedded systems and power electronics. He can be reached at

Copyright © KCK Media Corp.
All Rights Reserved
by Alberto Lopez Delgado and Carlos Gutierrez