masaok/flowerid-next

Flower Identifier

A mobile-first web app that identifies flowers using machine learning. Take a photo of a flower and get instant identification with confidence scores.

Features

  • Camera Integration - Uses device camera with back-camera preference for mobile
  • Real-time ML Inference - Runs entirely in the browser using TensorFlow.js
  • Offline Capable - PWA-ready with no server-side processing required
  • 5 Flower Types - Identifies daisy, dandelion, rose, sunflower, and tulip

Tech Stack

Layer       Technology
----------  -------------------------------
Framework   Next.js 16 (App Router)
Language    TypeScript
Styling     Tailwind CSS 4
ML Runtime  TensorFlow.js
Model       MobileNetV2 (transfer learning)
Testing     Jest + React Testing Library
Hooks       Husky (pre-commit CI)

Architecture

┌─────────────────────────────────────────────────────┐
│                    Browser                          │
├─────────────────────────────────────────────────────┤
│  Camera API  →  Canvas Crop  →  TensorFlow.js      │
│     ↓              ↓                ↓               │
│  Video Stream   Square Image    MobileNetV2        │
│                 (282×282px)     Predictions        │
└─────────────────────────────────────────────────────┘

The model runs entirely client-side:

  1. Camera captures video stream
  2. On capture, image is cropped to the viewfinder circle
  3. TensorFlow.js preprocesses and runs inference
  4. Results display with confidence percentages
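The crop in step 2 can be sketched as a pure geometry helper (a minimal sketch; the function and parameter names here are illustrative, not the actual `useCamera`/`CameraView` API, and `diameterRatio` is an assumed knob):

```typescript
// Compute the square crop region matching a centered circular
// viewfinder on a video frame of arbitrary aspect ratio.
interface Rect {
  x: number;
  y: number;
  size: number;
}

// `diameterRatio` is the viewfinder diameter as a fraction of the
// shorter frame edge (an assumption for illustration, not from the repo).
function cropRect(
  frameWidth: number,
  frameHeight: number,
  diameterRatio = 0.8
): Rect {
  const size = Math.round(Math.min(frameWidth, frameHeight) * diameterRatio);
  return {
    x: Math.round((frameWidth - size) / 2),
    y: Math.round((frameHeight - size) / 2),
    size,
  };
}

// Example: a 1280×720 landscape frame with an 80% viewfinder
// yields a 576px square centered in the frame.
const r = cropRect(1280, 720);
// r.size === 576, r.x === 352, r.y === 72
```

The resulting square would then be drawn to a canvas and handed to TensorFlow.js, which resizes it to the model's input resolution before inference.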

Getting Started

# Install dependencies
pnpm install

# Run development server
pnpm dev

# Open http://localhost:3000

Scripts

Command     Description
----------  ------------------------
pnpm dev    Start development server
pnpm build  Production build
pnpm test   Run Jest tests
pnpm ci     Type check + lint + test

Training the Model

The model is pre-trained and included in public/tfjs_model/. To retrain:

cd training

# Create virtual environment (recommended)
python -m venv venv
source venv/bin/activate

# Install dependencies
pip install -r requirements.txt

# Run training (~5 minutes on CPU)
python train.py

# Copy output to public directory
cp tfjs_model/* ../public/tfjs_model/

Training uses the TensorFlow Flowers dataset (3,670 images across 5 classes) with MobileNetV2 transfer learning.

Project Structure

├── public/
│   ├── tfjs_model/          # TensorFlow.js model files
│   └── manifest.json        # PWA manifest
├── src/
│   ├── app/
│   │   ├── page.tsx         # Home page
│   │   └── camera/page.tsx  # Camera + identification
│   ├── components/
│   │   ├── CameraView.tsx   # Viewfinder with circle guide
│   │   ├── ResultCard.tsx   # Prediction results display
│   │   └── ...
│   ├── hooks/
│   │   └── useCamera.ts     # Camera access + capture
│   └── lib/
│       ├── model.ts         # TensorFlow.js inference
│       └── flowers.ts       # Flower metadata
└── training/
    ├── train.py             # Model training script
    └── requirements.txt     # Python dependencies

Model Details

  • Base Model: MobileNetV2 (ImageNet pretrained)
  • Input Size: 224×224×3
  • Output: 5-class softmax (daisy, dandelion, roses, sunflowers, tulips — the TensorFlow Flowers dataset label names)
  • Size: ~8.7MB (TFJS graph model)
  • Accuracy: ~87% on test set
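The mapping from the 5-class output to the confidence percentages shown in the UI can be sketched as follows (a minimal sketch: the label order is assumed to follow the dataset's alphabetical class names, and `toPredictions` is an illustrative name, not the repo's `model.ts` API):

```typescript
// Class labels in the (assumed) order the model outputs them.
const LABELS = ["daisy", "dandelion", "roses", "sunflowers", "tulips"];

// Softmax over raw logits, subtracting the max for numerical stability.
function softmax(logits: number[]): number[] {
  const max = Math.max(...logits);
  const exps = logits.map((v) => Math.exp(v - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((v) => v / sum);
}

interface Prediction {
  label: string;
  confidence: number; // percentage, 0–100
}

// Pair each probability with its label and sort best-first.
function toPredictions(logits: number[]): Prediction[] {
  return softmax(logits)
    .map((p, i) => ({ label: LABELS[i], confidence: p * 100 }))
    .sort((a, b) => b.confidence - a.confidence);
}
```

If the exported TFJS graph model already ends in a softmax layer, its outputs are probabilities and the explicit `softmax` step can be skipped.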

License

MIT
