
Build an Automated Pest Deterrent System

Using Raspberry Pi

Garden pests are a threat to your flowers, vegetables and fruit. Learn how these Camosun College students designed an automated pest deterrent system that uses an ML-trained Raspberry Pi to detect and identify pests, and Bluetooth to communicate to its user. Their prototype base unit displays relevant information gathered by two field units.

  • How to build an advanced pest deterrent system

  • How to optimize a passive infrared sensor (PIR) setup

  • How to use machine learning to avoid false triggering

  • How to select the right sounds and frequencies to deter pests

  • How to use a mesh BLE network to report pest detection

  • How to design a PCB, enclosure and GUI for a pest deterrent system

  • Panasonic EKMB1306112K PIR sensor

  • Raspberry Pi 3B boards

  • OpenCV and TensorFlow

  • Mesh BLE (Bluetooth Low Energy) network

  • PyBeacon program

  • TI LM2596 DC buck regulator module by Lanpu

  • 7″ LCD Raspberry Pi touchscreen

  • 3D printer

  • Node-RED development tool for wiring hardware

What inspired us to choose this project? It was a conversation Simon had with his father last summer, while sitting and having lunch. His father told him how the deer had come again in the night and eaten the flowers off all 28 of his tomato plants. His father said, “Son, could you invent something to keep these pesky deer out of my garden?” Simon knew that his dad was not alone with this issue. Being a farmer himself, he had experienced just how much damage deer can do.

When considering crop security and pest deterrence, the current market for the average homeowner is oversaturated with inefficient, ineffective products. By combining machine learning (ML) with high-precision, passive infrared sensors (PIRs) and Bluetooth Low-Energy (BLE), we created a more sophisticated deterrent system that touched on each group member’s technological area of interest.

Based on our project scope and requirements, we divided our product into a four-phase system: Detection, Distinguishing, Deterrence and Disclosure.

DETECTION
Reliable and continuous scouting of your property is critical for garden and home security. Knowing that your target is 40+ feet away and approaching allows the system plenty of time to exit sleep mode and enact its security measures. After researching the passive infrared sensor (PIR) market, we found the best choice to get this job done was the Panasonic EKMB1306112K. This high-precision PIR, made of top-of-the-line materials from a trusted manufacturer, boasts one of the densest detection zones available.

It’s able to achieve a 17-meter detection range, with as little as a 4°C temperature difference between the detected target and ambient temperature. This means that a single node could cover the average garden space of a home, or, if used close to the perimeter of the property, the node could see and begin deterring the target before it reaches your garden.

There is one downside to using PIRs—false triggering. We have gone to great lengths in the design to eliminate any possibility of false triggering. To achieve this, we started with stacked PIRs. These PIRs have a 60-degree horizontal viewing angle. We used two PIRs for each angle of approach—one above the other. To explain how this helps prevent false triggering, let’s take a common example: An outdoor security system.


You’ve got your motion-triggered sensor set up outside, and have left it overnight to watch over things. Then, unbeknownst to you, a moth or some other light-seeking insect lands on your sensor, causing it to trigger constantly. You wake up the next day to find a night-long recording of nothing! What a waste of time and energy! By stacking two PIRs and requiring both to be triggered at once, we effectively reduce the likelihood of false triggering. But what if this proactive design still were not enough? That question led us to the next phase of our system.
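To make the coincidence requirement concrete, here is a minimal sketch of how two stacked PIRs could be AND-gated in software on the Pi. The RPi.GPIO pin numbers, polling window and loop timing are illustrative assumptions, not taken from the authors' code.

```python
# Hedged sketch: require BOTH stacked PIRs to report motion before waking
# the rest of the pipeline. Pin numbers are illustrative, not from the article.
import time
import RPi.GPIO as GPIO

PIR_UPPER = 17   # upper PIR of one approach angle (hypothetical pin)
PIR_LOWER = 27   # lower PIR of the same approach angle (hypothetical pin)

GPIO.setmode(GPIO.BCM)
GPIO.setup([PIR_UPPER, PIR_LOWER], GPIO.IN)

def coincident_trigger(window_s=0.5):
    """Return True only if both PIRs go high within the same short window."""
    deadline = time.time() + window_s
    upper = lower = False
    while time.time() < deadline:
        upper |= GPIO.input(PIR_UPPER) == GPIO.HIGH
        lower |= GPIO.input(PIR_LOWER) == GPIO.HIGH
        if upper and lower:
            return True
        time.sleep(0.01)
    return False

while True:
    if coincident_trigger():
        print("Both PIRs agree: wake the camera and classifier")
    time.sleep(0.1)
```

A single insect sitting on one lens can hold one input high indefinitely, but it cannot satisfy the second sensor, so the system stays asleep.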

DISTINGUISHING
Let’s return to the moth example. You now have moths on both of your PIRs, causing your alarm to run all night. How do you prevent false triggering in this situation? Machine learning comes to the rescue by enabling the system to detect, recognize and act judiciously instead of blindly.

Software: We had two initial contenders for our product’s “brain”: OpenCV and Tensorflow. We opted to use TensorFlow, because of its larger documentation base, its readily available tutorials, and—most importantly—because it is optimized for IoT and mobile applications. The latter makes it great for the Raspberry Pi family, which was important because we planned on using the Raspberry Pi 3B. However, TensorFlow had its own challenges. During install, some of its dependencies required updating. And since we were using Raspbian Lite OS, we had to install these dependencies one at a time—at times having to revert applications to older versions.

Training: Once we were able to run the basic tutorial example, the next step was to check the trained model’s accuracy. Unfortunately, Google’s stock model was too general for our needs; it identified a raccoon as a red panda. We completed the next tutorial, retraining the model for our own needs. This meant creating categorized folders (cougar, deer, raccoon, cat, dog, human, tractor and tree), and telling the training script where to locate them.
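For readers who want to try the same workflow, the sketch below shows a present-day tf.keras equivalent of that retraining tutorial: transfer learning over a folder-per-category image set. The directory name pest_photos/, the MobileNetV2 backbone and all hyperparameters are assumptions for illustration; the authors followed TensorFlow's own retraining tutorial rather than this exact script.

```python
# Hedged sketch of retraining an image classifier on categorized folders
# (cougar/, deer/, raccoon/, ...). Folder name and hyperparameters are
# illustrative assumptions, not the authors' actual training script.
import tensorflow as tf

IMG_SIZE = (224, 224)
train_ds = tf.keras.utils.image_dataset_from_directory(
    "pest_photos/", validation_split=0.2, subset="training",
    seed=42, image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "pest_photos/", validation_split=0.2, subset="validation",
    seed=42, image_size=IMG_SIZE, batch_size=32)

# Reuse a small ImageNet-trained backbone and train only a new classifier head.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # scale pixels to [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(len(train_ds.class_names), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
model.save("pest_classifier.h5")   # copy the saved model over to the Pi
```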

After our first retraining, we realized that the Raspberry Pi 3B (Pi) was not ideal for this processor-intensive task. We decided to do all future training on a desktop computer, and then copy the retrained model over to our Pi. After retraining, we tested our model’s accuracy and found it to be lacking. Results were between 60% and 76% accurate, using roughly 100 photos per category. We learned that you need at least 300 photos per category for consistent accuracy.

In addition to that, you need your photos to portray different perspectives of the object. Consider the appearance of your favorite dog. If you had seen this dog only from the front at its eye level, is there any way you could say with confidence what it might look like from the side? Or from behind? For the AI to be truly effective at recognition, it needs images in different resolutions, lighting, perspectives and views.

On our third training attempt, we used more than 300 images per category, making certain to use many different elements to help flex our AI’s brain muscle. However, gathering and manipulating 2,500 pictures is not easy, and it consumes time. We used websites such as Unsplash and Pexels as sources of free images. After finishing the retraining, we resumed testing and saw accuracies greater than 90% in some categories. It was a good day at the office.

DETERRENCE
We felt confident at this point to move forward with the Deterrence phase of the system. Because we intended to use sound as a deterrent, we researched how sound is perceived by different species. The average human hearing range is 20Hz to 20kHz, and most middle-aged humans can only hear up to about 15kHz. While human hearing is limited to sounds below 20kHz, other species are capable of hearing in the ultrasonic range (frequencies greater than 20kHz). For example, dogs, deer and raccoons can hear up to 40kHz, and cats can hear up to 80kHz.

This gave us a large band, roughly 10kHz wide, above human hearing to work with, so an alarm going off at night would not wake you or your neighbor. The simplest way to guarantee a sound stays at a specified frequency is to use pure tones, single-frequency waveforms such as sine or square waves. If you are unfamiliar with pure tones, several free pure-tone generators are available on the Internet to experiment with.
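As a concrete example of a pure tone, the following sketch generates a single-frequency sine wave above the human hearing range and writes it to a WAV file. The 25kHz frequency and 96kHz sample rate are illustrative values we chose; reproducing ultrasonic content also requires a DAC, amplifier and tweeter rated for those frequencies.

```python
# Hedged sketch: generate a pure (single-frequency) sine tone above the human
# hearing range and save it as a WAV file. Frequency and sample rate are
# illustrative choices, not values given in the article.
import wave
import numpy as np

SAMPLE_RATE = 96_000   # must be at least twice the tone frequency (Nyquist)
FREQ_HZ = 25_000       # inaudible to humans, audible to deer, raccoons and dogs
DURATION_S = 5.0

t = np.arange(int(SAMPLE_RATE * DURATION_S)) / SAMPLE_RATE
tone = 0.8 * np.sin(2 * np.pi * FREQ_HZ * t)      # pure sine wave
samples = (tone * 32767).astype(np.int16)         # convert to 16-bit PCM

with wave.open("deterrent_tone.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(samples.tobytes())
```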


Pure Tones and Man’s Best Friend: While pure tones make an effective deterrence option for urban settings, we thought that rural options could use a more creative touch. After reading several articles on predatory vocalizations and how they are used to deter urban pests, we learned that the pests had one definitive predator in common: Dogs. Because there are so few cougars and bears in urban areas, pests have few, if any, encounters with them. However, the pests are more than likely to have several run-ins with dogs.

One of us (Simon) has 50-150 chickens on his farm at any given time. If they are not in their coop at night, raccoons will come to hunt them. Generally, the raccoons aren’t afraid when he tries to chase them off. But if his Labrador retriever gets out there and starts barking, they immediately scatter and run for the property line. Based on research, we decided to incorporate plenty of variety in the barking sounds used, and we added some other predator vocalizations for good measure.

DISCLOSURE
We were moving forward smoothly in all three of the initial phases for our system. It was time to discuss the final phase of the system, Disclosure. In the Disclosure phase, we convey important information about the alarm to the user via our mesh BLE network. BLE is on the same 2.4GHz frequency band as regular Bluetooth. However, BLE uses a different modulation system from that of regular Bluetooth. Here is what this difference meant to us: BLE has a greater potential range, but a lower over-the-air data rate.

While regular Bluetooth is limited to seven slaves, BLE can have more than seven active slaves. That means we can have a multitude of potential field nodes on the same network, covering a wide area of land (the perimeter of a farming acreage). Most importantly, BLE offers a lower power consumption (often half that of regular Bluetooth), making it ideal, because we wanted our design to be optionally battery powered.

We had considered using Long Range (LoRa), but it was more range than we needed and had a smaller packet size. We also considered Wi-Fi, but the power required was too great to provide a long-term battery-powered option. Also, it required nodes to be within range of a Wi-Fi router, which might not be the case on larger properties.

Bluetooth Low-Energy 4.2: Once we had landed on BLE, we could begin writing code to transmit (Figure 1) and receive the desired data. Unfortunately, writing our own code based on the BLE protocols was too involved for our given time frame and required intensive documentation review. After some failed attempts, we opted to use a Python-based program called PyBeacon. This program is designed to advertise and scan for Eddystone-URLs. An Eddystone-URL is a 17-byte packet in the form of an HTTPS URL. Because we are not trying to send a URL, we needed to alter the provided code to fit our needs. The process of testing PyBeacon’s code to discern what it deemed necessary for a valid packet took a while. But once we knew what the packet was expected to look like, we could begin modifying the code. We were able to remove the unwanted packet data, and now had nearly 17 bytes of data length to pack as much useful information as possible into our message.
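To illustrate the framing involved (this is not the authors' modified PyBeacon code), an Eddystone-URL-shaped frame consists of a frame-type byte, a TX-power byte, a URL-scheme byte and up to 17 bytes that normally carry the encoded URL. A custom payload can be dropped into that 17-byte body, roughly like this:

```python
# Hedged sketch: keep the framing that Eddystone-URL scanners expect, but
# repurpose the bytes that normally carry the encoded URL for our own payload.
# Field layout follows the public Eddystone-URL spec; payload contents and the
# TX power value are placeholders.
EDDYSTONE_URL_FRAME_TYPE = 0x10
TX_POWER = 0xEB            # calibrated TX power byte (illustrative value)
URL_SCHEME_HTTPS = 0x03    # "https://" prefix byte, kept so scanners accept the frame

def build_frame(payload: bytes) -> bytes:
    """Wrap up to 17 bytes of our own data in an Eddystone-URL-shaped frame."""
    if len(payload) > 17:
        raise ValueError("Eddystone-URL leaves at most 17 bytes for the URL body")
    return bytes([EDDYSTONE_URL_FRAME_TYPE, TX_POWER, URL_SCHEME_HTTPS]) + payload

frame = build_frame(b"EXAMPLE-PAYLOAD")
print(frame.hex())
```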

FIGURE 1 – This transmission flow chart begins as the system powers on. The system waits for a PIR to sense a target, and when it has detected something, it photographs the target using the appropriate directional camera. If, through image processing, the subject is determined to be a pest (with a certainty greater than 70%), the system triggers its alarm and communicates the event to the user. Otherwise, the system returns to waiting for the next PIR event.
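The transmit flow in Figure 1 can be summarized in a short sketch. Every helper function below is a hypothetical placeholder standing in for the stacked-PIR check, camera capture, classifier and BLE broadcast described in the article; only the 70% confidence threshold comes from the text.

```python
# Hedged sketch of the Figure 1 transmit flow (not the authors' code).
import time

PEST_CLASSES = {"cougar", "deer", "raccoon", "cat", "dog"}
CONFIDENCE_THRESHOLD = 0.70               # alarm only above 70% certainty

def pir_triggered(direction):             # placeholder: stacked-PIR coincidence check
    return False

def capture_photo(direction):             # placeholder: grab a frame from that camera
    return b""

def classify(image):                      # placeholder: run the retrained model
    return "tree", 0.0                    # (top category, confidence)

def sound_alarm(category):                # placeholder: play the deterrent sound
    pass

def broadcast_event(category, confidence):  # placeholder: BLE advertisement
    pass

while True:
    for direction in ("left", "right"):   # one camera per approach angle
        if pir_triggered(direction):
            category, confidence = classify(capture_photo(direction))
            if category in PEST_CLASSES and confidence > CONFIDENCE_THRESHOLD:
                sound_alarm(category)
                broadcast_event(category, confidence)
    time.sleep(0.1)                       # return to waiting for the next PIR event
```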

It’s important to note that we originally planned on sending images taken by the field node to the base node, but it was clear that this would not be possible with only 17 bytes. Because we were no longer concerned with sending large amounts of data, we decided on four important details to send.

WHAT, WHEN, WHERE?
First is the opcode, or operation code, which tells our program what type of signal was received: Alarm, status, time sync or settings update. Second is the node of origin, the node at which the alarm was triggered. This is important because we are using a mesh network topology. In this arrangement, all nodes must be in constant communication with all other nodes within range, and the program can identify which of the nodes on the property is detecting pests. Third is the date and time of the event. This information is paramount to relay, because it lets the user know what time of day the pests are coming to feed, and whether any patterns emerge. Finally, we send the machine learning outcome: which category rated highest, and the AI’s confidence in that category as a percentage.

After in-house testing, we had reliable transmission and reception (Figure 2) on our two field nodes and base node. We then tested the connection at the required 40 feet, and could both transmit and receive without issue. However, to encode and decode this packet, an appropriate codec was required. We designed the codec to take all the necessary information from the transmitting field node and encode it into our 17-byte packet. On the receiving end, the codec neatly decodes the packet, interprets the string of characters and displays the four aforementioned pieces of data.
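A minimal codec along these lines, assuming our own field widths (the article does not give the actual packet layout), could pack the opcode, origin node, event time and ML outcome into 8 of the available 17 bytes:

```python
# Hedged sketch of a codec for the four pieces of information. Field widths,
# byte order and codes are illustrative choices, not the authors' packet layout.
import struct
import time

OPCODES = {"alarm": 0, "status": 1, "time_sync": 2, "settings": 3}
CATEGORIES = ["cougar", "deer", "raccoon", "cat", "dog", "human", "tractor", "tree"]

def encode(opcode, node_id, timestamp, category, confidence):
    """Pack opcode, origin node, event time and ML outcome into 8 bytes."""
    return struct.pack(">BBIBB", OPCODES[opcode], node_id, int(timestamp),
                       CATEGORIES.index(category), int(confidence * 100))

def decode(packet):
    op, node_id, ts, cat, conf = struct.unpack(">BBIBB", packet[:8])
    opcode = next(k for k, v in OPCODES.items() if v == op)
    return {"opcode": opcode, "node": node_id, "time": ts,
            "category": CATEGORIES[cat], "confidence": conf / 100}

pkt = encode("alarm", node_id=1, timestamp=time.time(),
             category="deer", confidence=0.93)
assert len(pkt) <= 17            # fits comfortably in the 17-byte body
print(decode(pkt))
```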

FIGURE 2 – This receive flow chart begins as the system powers on. The system constantly listens for other broadcasting sources. When the system receives a broadcasted message, the codec determines if the packet is valid, and compares the message to ensure it is new and relevant data. If the data is new, the system decodes and relays the packet, and stores the data. After the data is stored, or if the packet is invalid, the system returns to listening for broadcasts.

DESIGN DECISIONS
PCB Design: With the network up and running, it was time to design our PCB. Because we are using a Pi as the center for our design, we already had most of our functionality covered. All the PIRs could be connected via the Pi’s GPIOs, and the two cameras would be connected via USB and the 15-pin FFC header. The purpose of the PCB (Figure 3) was to allow us to accept a variety of input voltages and supply types. To accomplish this task, a power rectification and distribution system was needed.

FIGURE 3 – A 3D rendering of our custom-designed Pi-hat PCB (green). On the board are one Lanpu LM2596 buck power supply module (blue), one P-channel MOSFET, four bulk capacitors, seven 3-pin jumpers and access for the 40-pin Pi header.

We used an LM2596 DC buck regulator module by Lanpu to do this job. Based on Texas Instruments’ (TI’s) LM2596 chip, the regulator module met all our requirements for power distribution: a 3A output current, enough to power the board in its awake state, the PIRs and the camera modules with current to spare. It also provides a 3V to 40V input voltage range, allowing us to accept a wide variety of battery compositions (lead-acid or lithium-ion) and setups (large series battery banks). Finally, the module can be powered by a common AC adapter.

Enclosure Design: Once we had chosen and designed all our components, we had a good understanding of the dimensions we needed to work with for the field and base node enclosures. The requirements for the field node enclosure design (Figure 4) included a small overhang from the lid to cover the camera lenses from direct rainfall, and evenly spaced mounting holes for the cameras and PIRs to achieve 180 degrees of horizontal view coverage. Also needed was an optional battery compartment (Figure 5) for off-grid applications, and the ability to be mounted in a variety of ways for a simple installation process.

FIGURE 4 – A top-down 3D rendering of our node enclosure. Custom holes for both the cameras and PIRs were placed for maximum coverage and consistency. To the right of our stand-off mounted Pi is a DC barrel jack that connects the battery pack. At the back is the mounting bracket for the enclosure.
FIGURE 5 – A 3D rendering of our battery pack. This pack is attached to the bottom of the node enclosure. We designed this feature to add versatility to our design, making it optional to expand to off-grid solutions.

To achieve these requirements, we decided to 3D print our enclosures out of polylactic acid (PLA) using an Ultimaker 3. This allowed us to paint, shape and design our enclosures with relative ease, and helped to create a professional representation that portrayed our desired final product (Figure 6). Another benefit of 3D printing the enclosures was our ability to reprint an entirely new enclosure in less than 24 hours. If we didn’t like the look, or if we found there wasn’t enough space for the Raspberry Pi inside, we could make the necessary adjustments and be ready by the next day. The mounting bracket on the back of the enclosure allows it to be placed on a stake in the ground, or tied with a strap to a fence or house.

FIGURE 6 – Our 3D-printed node enclosure fully assembled with the battery pack. We opted for a classic camouflage paint job for the field node, since it seemed appropriate for the market. Painting it with stencils took several days.

We designed the base node (Figure 7) to be placed on a countertop, take up minimal space and display useful information to the user. With those objectives in mind—and because we used a 7″ LCD Raspberry Pi touchscreen—we designed the enclosure to frame the screen and slant it backward slightly for good readability and a stylish, welcoming appearance.

GUI Design: Once we had finished the enclosures, we began creating the GUI for the base node. The program we used to design the GUI was Node-RED, a web browser-based development tool for wiring hardware, APIs and other IoT devices. We chose Node-RED (Figure 8) because it is designed for Raspberry Pi applications and offers a simple and familiar flowchart-style IDE. We achieved an easy-to-use GUI that offers the user relevant data such as field node battery life (Figure 9) and alarm status (Figure 10) when interacted with, and otherwise displays the current time and date.
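The article does not describe how the decoded packets reach the Node-RED flow, so the following is purely an assumption on our part: one simple pattern is to POST each decoded event to an "http in" node listening on Node-RED's default port 1880. The /pest-event endpoint name is hypothetical.

```python
# Hedged sketch (our assumption, not detailed in the article): hand a decoded
# packet from a Python BLE listener to a Node-RED flow via an "http in" node.
import json
import urllib.request

def report_to_dashboard(event: dict):
    """POST a decoded alarm/status event to a local Node-RED HTTP endpoint."""
    req = urllib.request.Request(
        "http://localhost:1880/pest-event",          # hypothetical endpoint
        data=json.dumps(event).encode(),
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

report_to_dashboard({"opcode": "alarm", "node": 1, "time": 1589000000,
                     "category": "deer", "confidence": 0.93})
```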


FIGURE 8 – Node-RED flow chart, showing the operating functions of our base node. The operations are battery life for nodes 1 and 2, node alarm status, or date and time. The flow lines starting with a “timestamp” begin with received data from the field nodes.
FIGURE 9 – Screen capture of our base node’s display. The node is displaying the battery life of field node 1 in percentage, and the most recent check time. This information is under the “Node 1” tab.
FIGURE 10 – Screen capture of our base node’s display of the most recent alarm status. The information we considered most important to the user includes the sending node, the detected animal, and the time of alarm. This information is under the “Warning” tab.

DISCUSSION
We originally set out to design an automated pest deterrent solution for use in gardens and on farms. In the end, we created a powerful and sophisticated deterrent system with the potential for growth and adaptation to the circumstances of any property owner. We believe this achievement gives our design great value and merit, particularly if we decide to produce it commercially.

In future iterations of the design, however, we would make hardware changes and quality-of-life improvements. First, we would use a metal enclosure rated IP64 (dust-tight and protected from water splashes from all directions), to allow the field nodes to operate in a farmer’s field. Second, we would replace the Raspberry Pi 3B and all its excess features with a processor with a lower power consumption. This processor would be embedded into a custom PCB design encompassing only the necessary functionality. And finally, we would switch from BLE to Bluetooth 5.0, to achieve a larger packet size. This would allow images to be sent from the field node to the base node.

We owe much of our success to proper project management. We took the time early in the development process to address the faults and failures of the market for pest deterrence. As a result, we were able to overcome them by using new technologies and our current knowledge base, developed through Camosun College’s Electronics & Computer Engineering Technology – Renewable Energy program. 

Author’s Note: Cole, Cameron, and Simon are graduates of Camosun College’s Electronics & Computer Engineering Technology – Renewable Energy program, and they plan on continuing part-time development of their project to manufacture an affordable and effective pest deterrent product.

RESOURCES
Panasonic Electronic Components | na.industrial.panasonic.com
PyBeacon | www.pypi.org
Node-RED | www.nodered.org
TensorFlow | www.Tensorflow.org
Texas Instruments | www.ti.com

PUBLISHED IN CIRCUIT CELLAR MAGAZINE • MAY 2020 #358


Cole Gamborski is a Camosun College graduate with many years of experience in the field of property security. He has an aptitude for programming and is continuing at Camosun in its Engineering Bridge program to earn an Electrical Engineering degree from UVic.

Cameron Phillips is a Camosun College graduate with many years of mechanical experience. He is well versed in machine learning and 3D design, and he will be taking on a co-op with FTS Forest Tech before continuing his education.

Simon Fowler is a Camosun College graduate with experience designing solar PV installations. He hopes to find work in the solar PV field before continuing his education in the Camosun Engineering Bridge program.

