

Inspiration

Farmers and gardeners often face the same frustrating problem: a healthy-looking plant suddenly starts showing mysterious spots, color changes, or leaf curling. Identifying plant diseases quickly can mean the difference between saving a crop and losing an entire harvest. We wanted to create a tool that empowers anyone, from backyard gardeners to agricultural researchers, to detect and understand plant health issues instantly, using just a photo.

What it does

Leafylitics is an AI-powered plant-diagnostics app. Users simply upload a photo of a leaf, and the app detects the crop type (tomato, potato, corn, etc.), analyzes the leaf for diseases using an image-recognition model trained on the PlantVillage dataset, and generates an explainable analysis report. The report includes the disease name and confidence level, environmental factors that may be contributing, treatment recommendations, and a user-friendly summary written in plain language. The app also stores previous analyses so users can compare results and track crop health over time.
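As a concrete illustration, a report payload built from those pieces might look something like the following. The field names and values here are hypothetical, not Leafylitics' actual schema:

```python
# Hypothetical analysis-report payload; field names and values are
# illustrative, not the app's actual schema.
report = {
    "crop": "tomato",
    "disease": "early_blight",
    "confidence": 0.94,  # model's score for the top predicted class
    "environmental_factors": [
        "high humidity",
        "poor air circulation",
    ],
    "treatment_recommendations": [
        "remove and destroy affected leaves",
        "apply a copper-based fungicide",
    ],
    "summary": (
        "Your tomato plant is likely showing early blight. Prune the "
        "affected leaves and improve airflow to slow its spread."
    ),
}
```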

How we built it

We trained a convolutional neural network (CNN) on the PlantVillage dataset, a comprehensive open-source collection of tens of thousands of labeled plant leaf images. The frontend is built with Expo and React Native, featuring an intuitive UI for image upload and results visualization across a multi-page flow (from Upload through Analysis Result). The backend uses Python and FastAPI to run ML model inference and return detailed analysis results as JSON. For training we used TensorFlow and Keras, fine-tuning a pretrained model on PlantVillage images with image augmentation for better generalization. Google Cloud Storage handles photo uploads, and Gemini 2.5 Flash generates the natural-language summaries and recommendations.
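The fine-tuning setup described above follows the standard Keras transfer-learning pattern. A minimal sketch is below; the choice of MobileNetV2 as the backbone, the input size, and the augmentation parameters are assumptions for illustration, not details from our writeup:

```python
import tensorflow as tf

NUM_CLASSES = 38  # PlantVillage is commonly organized into 38 crop/disease classes
IMG_SIZE = (224, 224)

# Light augmentation for better generalization (parameters are illustrative)
augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),
    tf.keras.layers.RandomZoom(0.1),
])

# Pretrained backbone, frozen for the initial training phase
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet"
)
base.trainable = False

inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = augment(inputs)
x = tf.keras.applications.mobilenet_v2.preprocess_input(x)
x = base(x, training=False)  # keep BatchNorm layers in inference mode
x = tf.keras.layers.GlobalAveragePooling2D()(x)
x = tf.keras.layers.Dropout(0.2)(x)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = tf.keras.Model(inputs, outputs)
model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-3),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```

After the new head converges, the usual next step is to unfreeze some of the top backbone layers and continue training at a much lower learning rate.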

Challenges we ran into

Balancing accuracy and inference speed was difficult: high-resolution leaf images slowed down inference, so we optimized preprocessing and batching. The dataset was also imbalanced, with some crops having significantly fewer examples, which required careful augmentation and sampling. UI design was challenging too; early prototypes showed results in modals (ugly Apple-style popups), which we later refactored into a full analysis page with its own navigation. Finally, we wanted model explainability, not just "what disease" but why, so we added AI-generated summaries that explain the symptoms and confidence levels.
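One common way to counter that kind of class imbalance (a sketch of the general technique, not necessarily our exact sampling scheme) is inverse-frequency class weighting, which up-weights the loss contribution of under-represented crops:

```python
from collections import Counter

def inverse_frequency_weights(labels):
    """Weight each class by total / (num_classes * count), so that
    rarer classes contribute more to the training loss."""
    counts = Counter(labels)
    total = len(labels)
    return {c: total / (len(counts) * n) for c, n in counts.items()}

# Toy example with one heavily over-represented class
labels = ["tomato_blight"] * 80 + ["corn_rust"] * 20
weights = inverse_frequency_weights(labels)
# corn_rust receives a larger weight than tomato_blight
```

After mapping class names to integer indices, a dictionary like this can be passed to Keras via `model.fit(..., class_weight=...)`.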

Accomplishments that we're proud of

We achieved over 95% classification accuracy on validation data and built a fully functional mobile app that integrates ML inference and a natural-language layer. We created a clean UX flow from snapping a photo to getting actionable insights in under 10 seconds. We also designed our own AI-generated "analysis report" format that is readable by both farmers and researchers.

What we learned

We learned how to fine-tune and deploy real-world ML models efficiently on-device and in the cloud. We discovered the importance of UI flow in trust, as users engage more with results when they look clean and detailed, not like a popup. Using Gemini 2.5 Flash for post-processing results improved user comprehension dramatically. We also learned that data collection and validation are just as hard as model building.

What's next for Leafylitics

We plan to add a real-time camera mode for diagnosing crops while scanning them in the field, and to incorporate weather and soil data APIs for environmental correlation. We also want to enable offline inference for rural regions with limited connectivity, expand to multi-plant detection, and support custom crop training by local communities. Finally, we hope to explore partnerships with agricultural universities and sustainability NGOs.
