- Priansh drilling a hole in the initial Raspberry Pi case, an Altoids tin.
- Sebastian and Priansh setting up the second Raspberry Pi case, using a plastic water bottle rather than an Altoids tin.
- Priansh using LabelEditor so the model properly detects chicken and non-chicken objects.
- The flowchart outlining the methodology of Coop Colonel.
- The Raspberry Pi in its bottle case, with camera facing downwards.
- The five chickens to be monitored, sleeping in a row.
- Andrew setting up a camera for a shot.
- Roasting marshmallows at midnight.
- The setup at night.
- The filming setup for one scene in the video, recorded at 3:30 AM.
- Setting up lighting for the introduction, recorded at 4:45 AM.
- Editing the video, approx. 5:30 AM to 8:30 AM.
The Problem
Since 2024, the US has been under a bird flu epidemic. In that time, 166 million chickens have been killed, driving egg prices up and causing massive losses for farmers. Unfortunately, current state-of-the-art chicken health monitoring tools like AgriStats require minimum production volumes that exceed virtually all small farms, and charge annual fees of $25,000-$100,000, ultimately leaving small farmers without the tools they need.
In fact, this dominance is so extensive that AgriStats is being sued by the Justice Department for monopolistic behavior: the health benefits it offers specifically to unethical factory farms allow AgriStats' clients to control over 90% of chicken sales in the US. Furthermore, the DOJ report alleges that, using their data, AgriStats "encouraged meat processors to raise prices and reduce supply."
What it solves
Coop Colonel is a first-of-its-kind, low-power, low-cost, open-source chicken monitoring solution for a backyard coop or small farm. It provides farmers with live views of their coops, and uses machine vision to monitor chicken health through anomaly-based detection. Instead of paying annual subscriptions in the tens of thousands of dollars, farmers can leverage ARM's efficient architecture to run state-of-the-art health detection, with their only recurring cost being electricity.
How it works
We use two Raspberry Pi Zero Ws, each connected to an Arducam 5 MP OV5647 camera, each sending a TCP stream over the network. We capture and re-encode these streams with OpenCV, feeding the camera footage both to the web view and to our fine-tuned YOLOv8 model. The model scans the feed for chickens; based on their positions, and the time spent in those positions, it returns data on the chickens' feeding, drinking, and egg-laying behavior. From this data we detect anomalies in chicken behavior, letting us determine whether the chickens may be sick.
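The pipeline above can be sketched roughly as follows. This is a minimal illustration rather than our exact code: the stream URL, weights filename, feeder-zone coordinates, and idle threshold are all hypothetical, and the real system handles multiple streams and more behaviors than feeding.

```python
# Illustrative values -- the real URLs, zones, and thresholds differ per coop.
STREAM_URL = "tcp://192.168.1.50:8554"  # hypothetical Pi Zero W stream
FEEDER_ZONE = (100, 300, 250, 420)      # x1, y1, x2, y2 in pixels
MAX_IDLE_SECONDS = 3600                 # flag if nobody feeds for an hour

def in_zone(box, zone):
    """True if a detection box's center point falls inside a zone."""
    cx = (box[0] + box[2]) / 2
    cy = (box[1] + box[3]) / 2
    return zone[0] <= cx <= zone[2] and zone[1] <= cy <= zone[3]

def monitor():
    # Imported here so the geometry helper above stays dependency-free.
    import time
    import cv2
    from ultralytics import YOLO

    model = YOLO("coop_colonel.pt")  # hypothetical fine-tuned weights
    cap = cv2.VideoCapture(STREAM_URL)
    last_feed = time.monotonic()
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # Run detection; any chicken whose center is inside the feeder
        # zone counts as feeding activity.
        for result in model(frame, verbose=False):
            for box in result.boxes.xyxy.tolist():
                if in_zone(box, FEEDER_ZONE):
                    last_feed = time.monotonic()
        if time.monotonic() - last_feed > MAX_IDLE_SECONDS:
            print("anomaly: no feeding activity at the feeder zone")
```

The same zone-plus-timer pattern extends to drinking and egg laying by adding more zones.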
Challenges we ran into
Creating the housing for the Pi cameras was a challenge. We originally thought recycled Altoids tins would work, but quickly realized that the aluminum case drastically degraded the Pi's wireless network connection. We ended up recycling plastic water bottles for the housing instead, and they worked quite well.
We also had some issues with training our machine vision model. This was a pretty intensive task, but by renting a server on Vultr, we made quick work of the training process.
We also had some issues with false positives: the feeding and drinking stations themselves were being categorized as chickens. Label Studio let us manually label the stations as non-chickens, and label partially obstructed chickens (which were producing false negatives) as chickens. Using Vultr, we could quickly retrain and test the models, which significantly improved accuracy.
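To give a concrete sense of the relabeling step: Label Studio exports rectangle annotations as top-left coordinates in percent of the image, while YOLO training expects normalized center coordinates. A small converter along these lines bridges the two; the class map is illustrative, and we assume Label Studio's rectangle fields `x`, `y`, `width`, `height`.

```python
# Convert a Label Studio rectangle annotation (percent coordinates, top-left
# origin) into a YOLO label row: "class_id cx cy w h", all normalized to 0..1.
CLASS_IDS = {"chicken": 0, "feeder": 1, "waterer": 2}  # illustrative labels

def to_yolo_row(label, x, y, width, height):
    """x, y, width, height are Label Studio percentages (0..100);
    (x, y) is the box's top-left corner."""
    cx = (x + width / 2) / 100   # box center, normalized
    cy = (y + height / 2) / 100
    return (f"{CLASS_IDS[label]} {cx:.6f} {cy:.6f} "
            f"{width / 100:.6f} {height / 100:.6f}")
```

For example, a chicken box at 10% from the left, 20% from the top, spanning 30% by 40% of the frame becomes `0 0.250000 0.400000 0.300000 0.400000`.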
Accomplishments that we're proud of
We were honestly a bit shocked at seeing the framerate. I certainly didn't expect amazing framerate, and seeing a video feed so smooth was a welcome surprise after a lot of struggles setting up the cameras.
I was also shocked at how well the models ran. We originally expected just 1 FPS detection feeding entirely into the backend for anomaly detection, but in testing, our model ran at nearly the same framerate as the video itself. We really only have Vultr to credit for that: being able to retrain so frequently on powerful cloud compute left us with a final model that I can only describe as beautiful.
Another thing we're proud of is how well we utilized the Pi's ARM architecture. I know it seems a bit cliché, but we quite literally could not have done this project without properly using ARM. The BCM2835 SoC is essentially a VideoCore IV GPU with an ARM core attached, and by using both the VideoCore IV and the ARM CPU, we were able to process video in a way that wouldn't have been possible on a different architecture at such low specs.
What we learned
Through a whole lot of trial and error, I can confidently say we've learned more than we ever wanted to know about network streams. It was frustrating at times, but we came away knowing a great deal.
We also learned a lot about agritech, and what it means for something to be reliable. We wanted these tools to be maintenance-free, durable, and dependable, because in the real world, unreliable agritech is unusable agritech.
What's next for Coop Colonel
In the future, adding an IR camera would give us another method for detecting illness. Additionally, more sophisticated anomaly detection that tracks individual chickens, rather than diagnosing the flock as a whole, could catch diseases before they even have a chance to spread.
Built With
- arducam
- cloudflare
- ffmpeg
- label-studio
- opencv
- python
- raspberry-pi
- ultralytics
- yolo

