Inspiration

In recent years, pollinators like bees and other insects have been dying off at a rapidly accelerating pace. While this decline has many causes, a major contributor is the increased prevalence of lawns, which replace the native plants that pollinators rely on with non-pollinator-friendly Kentucky bluegrass. Pollinators are cornerstones of agriculture and food webs, so protecting them is essential for promoting sustainability. Beyond supporting pollinators, maintaining local plants is extremely important for preserving biodiversity. Pollution, climate change, and urban sprawl have strained native plant populations, leading to habitat loss and reduced ecological resilience. PlantGo! aims to incentivize users to find and protect these native plants, and to raise awareness of native plants and their natural ranges.

What it does

PlantGo! allows users to upload images of plants they find in order to classify them, and gives users information about the plant they’ve found, including its conservation status and its meaning in Victorian flower language. From there, users can navigate to a map to see whether the plant they’ve found is native to their region or is an invasive species. PlantGo! also includes an AR garden, which is a fun way to incentivize users to keep finding plants. All of this is populated into a database that lets users track their progress and see which plants they’ve found in their profile. Lastly, users have the option to chat with a chatbot if they want next steps for promoting biodiversity or additional information about sustainability and plants.

How we built it

We built PlantGo! using Streamlit as our primary tech stack. PlantGo! is a full-stack website, so for the backend we used MongoDB to store user information, including each user’s username, password, and the plants they’ve found. This becomes the user’s profile, which they can view once they’ve signed in. To determine which plant a user has found, we let users upload images to the web app and classified those images with the PlantNet API. The map of plant ranges was generated with Streamlit’s built-in map functionality: we render a world map and add markers based on the longitude and latitude of the native range of each plant that PlantGo! supports. The chatbot is built on the Gemini API, letting users converse with Gemini for next steps or additional information. Lastly, for the AR garden, we used a TensorFlow model that maps out the hand, together with React to access the webcam in the browser. We then drew the detected hand landmarks and placed flowers at those landmark positions.
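The classification step above can be sketched in a few lines of Python. The response shape here is an assumption modeled on the PlantNet v2 identification API (a `results` list ranked by `score`); the exact field names and the threshold value are illustrative, not a definitive implementation.

```python
# Sketch: reduce a PlantNet-style identification response to a best guess.
# The response layout ("results" with "score" and "species") is an assumed
# shape based on the PlantNet v2 API; a real integration should check the
# official docs for exact field names.

def best_match(response: dict, min_score: float = 0.2):
    """Return (species name, score) for the top result, or None if no
    result clears the confidence threshold."""
    results = response.get("results", [])
    if not results:
        return None
    top = max(results, key=lambda r: r["score"])
    if top["score"] < min_score:
        return None
    return top["species"]["scientificNameWithoutAuthor"], top["score"]

# Hypothetical response with made-up scores:
sample = {
    "results": [
        {"score": 0.91,
         "species": {"scientificNameWithoutAuthor": "Eschscholzia californica"}},
        {"score": 0.04,
         "species": {"scientificNameWithoutAuthor": "Papaver rhoeas"}},
    ]
}
print(best_match(sample))  # ('Eschscholzia californica', 0.91)
```

Thresholding on the score keeps low-confidence guesses out of the user’s profile; the cutoff would need tuning against real uploads.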

Challenges we ran into

One challenge we ran into was using MongoDB as a database: none of us were familiar with MongoDB prior to AthenaHacks, so we had to learn it on the fly while constructing our website. Learning Streamlit was a challenge as well, since we weren’t familiar with it either; we spent a significant amount of time reading through Streamlit’s documentation and experimenting. Streamlit also created difficulties in designing the visuals of our web app: we were used to styling with CSS, but Streamlit handles layout and styling differently. The biggest challenge we ran into was the AR. To map flowers onto the hand, we needed to figure out how to translate between three coordinate systems (the camera feed, the 2D screen, and the 3D hand landmarks) and place them on the same screen so that the flower images render in the right positions.

Accomplishments that we're proud of

We’re proud of figuring out motion detection through the web browser camera in order to track hand movement, and of combining computer vision with multiple coordinate systems to make the AR Garden functional. Our biggest accomplishment at this hackathon is how much we learned about new databases and tech stacks: none of us were familiar with MongoDB or Streamlit prior to AthenaHacks, so we’re proud of learning them on the fly and still creating a functional website.

What we learned

MongoDB and Streamlit were both completely new to all of us before AthenaHacks, so we learned how to use both. We also learned how to use computer vision for AR and how to translate between three coordinate systems so that images render correctly on screen.

What's next for PlantGo!

PlantGo!’s plant database was unfortunately limited by our time constraints; the web app currently only supports plant information and maps for 5 plant species, so a next step would definitely be to expand the database to a greater number of plant species and increase the usability of PlantGo! in a variety of locations. We also want to let users directly log the location where they found a plant, so the app can compare that location with the plant’s native range. This would be a major improvement over the current flow, which relies on users navigating to a separate map page and estimating their own location to determine whether the plant they’ve found is invasive or in its native range.
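The planned location check could be sketched with a great-circle distance test. Representing a native range as a (center, radius) circle is a simplification assumed here for illustration; real ranges would need polygons. The coordinates in the example are hypothetical.

```python
import math

# Sketch: decide whether a user's logged sighting falls inside a plant's
# native range, with the range simplified to a center point and radius.

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def in_native_range(user_lat, user_lon, center_lat, center_lon, radius_km):
    return haversine_km(user_lat, user_lon, center_lat, center_lon) <= radius_km

# Hypothetical check: a sighting in Los Angeles against a range centered
# near California's Central Valley with a 500 km radius.
print(in_native_range(34.05, -118.24, 36.78, -119.42, 500))  # True
```

A sighting outside the radius would flag the plant as potentially out of its native range, replacing the current manual comparison on the map page.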

Built With

gemini, mongodb, plantnet-api, react, streamlit, tensorflow
