Inspiration

We were thinking about global issues that scare us, but where communities could come together to make a change. With a projected population increase of 2 billion people and the loss of a third of Earth's arable land by 2040, feeding our ever-growing population is quickly becoming a critical issue [1]. In addition, transporting produce across national and international boundaries is expensive and environmentally harmful. Helping out with a garden can be daunting at first, but this app aims to simplify getting involved. This way, even new gardeners can be part of the process -- which is known to improve dietary habits, support mental health and relaxation, and build welcoming, safer communities [2, 3]. Our solution is particularly relevant to (though not limited to) urban gardens, where close proximity makes plants more prone to disease -- recognizing these diseases early on will improve the health of these gardens.

What it does

Our application is designed to make community gardening easier. First, it can distinguish diseased plants from healthy ones from a single image of a leaf. The trained model can classify the health of a plant at various stages of its lifetime, allowing the user to identify diseased plants early on. The application also has a forum where community gardeners can share their experiences and ask for help, creating a strong network of resident farmers of varying experience levels. Finally, a map feature visualizes the locations of community gardens in the area, with a health bar showing the health levels of plants at each selected garden. Overall, our app aims to prevent the spread of disease, help inexperienced farmers, and connect the community.

How we built it

We gathered images of leaves from healthy and sick tomato plants from an online database [4]. We used ML Kit on Firebase to build a model that classifies plant leaves as sick or healthy. We then brought this model into Android Studio and built a simple interface to demonstrate its effectiveness in a minimum viable product.
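For the curious, running a bundled custom model through ML Kit's custom image-labeling API looks roughly like the sketch below. This is an illustrative sketch, not our exact code: the model filename, threshold, and `classifyLeaf` helper are placeholders we made up for this example.

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.common.model.LocalModel
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.custom.CustomImageLabelerOptions

// Load the TFLite classifier bundled in the app's assets (filename is a placeholder).
val localModel = LocalModel.Builder()
    .setAssetFilePath("leaf_classifier.tflite")
    .build()

val options = CustomImageLabelerOptions.Builder(localModel)
    .setConfidenceThreshold(0.5f)  // drop low-confidence labels
    .setMaxResultCount(2)          // two classes: sick vs. healthy
    .build()

val labeler = ImageLabeling.getClient(options)

// Classify a single leaf photo and hand the top label to the UI.
fun classifyLeaf(bitmap: Bitmap, onResult: (label: String, confidence: Float) -> Unit) {
    val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)
    labeler.process(image)
        .addOnSuccessListener { labels ->
            labels.firstOrNull()?.let { onResult(it.text, it.confidence) }
        }
        .addOnFailureListener { e -> /* surface the error to the UI */ }
}
```

The asynchronous listener style is what keeps classification off the UI thread, so the camera view stays responsive while the model runs.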

Challenges we ran into

Integrating a simple front end with our ML model was a challenge. To that end, we had to learn how to use Android Studio.

We also had trouble obtaining enough data to train the model until we found the online database [4]. Having a large amount of image data (1000+ images) meant that our model would likely be more accurate.

Since we were new to Android Studio, making a relatively decent-looking front end was a challenge. Looking at sample code helped us along the way.

Accomplishments that we're proud of

We are proud of our ability to quickly pivot and adapt during development: issues in the later stages required us to diverge rapidly on ideas, converge on feasible products, and learn new development skills and software on the fly.

We are also proud of connecting the backend functionality to a simple frontend design, resulting in an Android app that streamlines and demystifies the process of analyzing plant health.

We are also proud of our flexibility, being able to work on multiple aspects of the design simultaneously and cohesively while staying available to help one another when issues arose.

What we learned

From a non-technical standpoint, we learned how to divide a project into smaller chunks and work on them productively as a team.

From a technical standpoint, we learned how to use Firebase, Android Studio, ML Kit, Kotlin, and TensorFlow to create a working machine learning model and Android application that identifies sick and healthy plant leaves. We also learned how to quickly pick up and develop machine learning models, adapting to ML Kit and Firebase during the hackathon. Finally, we became more familiar with backend development using Google APIs and Firebase.

What's next for GreenThumbs

More functionality! Expanding the mobile app to include the components discussed in the mockups, along with new features, would make helping out easier for those new to gardening! We would also like to make it easier to distinguish plants from weeds.

A stronger ML model! The current model was trained only on sick and healthy tomato leaves. Training on additional plant types and specific disease names could help us create a model that classifies leaves with higher specificity.

A nicer front end! Since we were new to Android Studio, we weren't completely sure how to make things look the way we wanted. With more practice and patience, the front end could look as polished as our mockups, with all the functionality we envisioned.

A help forum! Adding a help forum would let less experienced gardeners and farmers learn from the community and ease the transition into gardening.

What else we can do

Our model learns its categories directly from how the training images are allocated to labeled groups. As such, the underlying algorithm can be used to identify any category of objects and is not limited to plants.
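Because the categories come entirely from how the training data is grouped, retargeting the classifier is just a matter of relabeling folders. A hypothetical sketch of that idea in plain Kotlin (the `labeledImages` helper and paths are illustrative, not part of our app):

```kotlin
import java.io.File

// Categories are defined purely by directory names: each subfolder of the
// dataset root becomes one label. Point this at tomato leaves, dog breeds,
// or recycling types and the training pipeline is otherwise unchanged.
fun labeledImages(datasetRoot: File): Map<String, List<File>> =
    datasetRoot.listFiles { f -> f.isDirectory }
        .orEmpty()
        .associate { dir ->
            dir.name to dir.listFiles { f -> f.extension.lowercase() in setOf("jpg", "jpeg", "png") }
                .orEmpty()
                .toList()
        }
```

Swapping `dataset/sick` and `dataset/healthy` for any other set of folders would yield a classifier for those categories instead.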

Built With

Firebase, ML Kit, TensorFlow, Android Studio, Kotlin
