SmartStock AI

Real-time inventory management and demand forecasting system powered by Confluent Kafka streaming and Google Cloud Vertex AI.

Live Demo: https://smartstock-frontend-1064261519338.europe-west1.run.app

Backend API: https://smartstock-backend-1064261519338.europe-west1.run.app


Overview

SmartStock AI is an enterprise-grade inventory intelligence platform that combines real-time streaming data with generative AI to transform reactive inventory management into predictive operations. The system continuously ingests sales transactions, analyzes demand patterns, and provides actionable insights through an AI-powered assistant.

Problem Statement

Traditional inventory management relies on historical averages and manual forecasting, leading to:

  • Stockouts causing lost revenue (estimated 4% of retail sales annually)
  • Overstocking resulting in capital tied up in excess inventory
  • Delayed response to demand shifts and market changes
  • Lack of real-time visibility across multiple locations

Solution

SmartStock AI addresses these challenges through:

  • Real-time data streaming via Confluent Kafka for immediate visibility
  • AI-powered demand forecasting using Google Gemini for predictive insights
  • Automated alerting when inventory levels approach critical thresholds
  • Intelligent recommendations for restocking quantities and timing

Data Flow

  1. Data Generation: A POS simulator (or demo generator) produces retail transactions
  2. Streaming: Events are published to Confluent Kafka topics
  3. Persistence: Confluent BigQuery Sink Connector(s) ingest Kafka topics into BigQuery
  4. Analytics: Backend queries BigQuery for real-time analytics and historical trends
  5. AI Processing: Vertex AI (Gemini) generates demand forecasts and recommended actions
  6. Presentation: React frontend shows dashboards, insights, and guided workflows
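As a concrete illustration of step 2 (Streaming), the sketch below builds a sales event and shows roughly how it would be published to the `sales-transactions` topic. The field names and the kafkajs snippet are illustrative assumptions, not the repository's actual schema or producer code.

```javascript
// Hypothetical shape of a sales event for the sales-transactions topic.
function buildSaleEvent(storeId, sku, quantity, unitPrice) {
  return {
    eventType: 'SALE',
    storeId,
    sku,
    quantity,
    revenue: +(quantity * unitPrice).toFixed(2),
    timestamp: new Date().toISOString(),
  };
}

// With a Kafka client such as kafkajs, publishing would look roughly like:
// await producer.send({
//   topic: 'sales-transactions',
//   messages: [{ key: event.sku, value: JSON.stringify(event) }],
// });

const event = buildSaleEvent('downtown-market', 'SKU-042', 3, 4.5);
```

Keying messages by SKU keeps all events for one product on the same partition, preserving per-product ordering downstream.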

Technology Stack

| Layer | Technology | Purpose |
| --- | --- | --- |
| Frontend | React 18, Vite, TailwindCSS | Single-page application |
| Backend | Node.js, Express.js | REST API server |
| Streaming | Apache Kafka (Confluent Cloud) | Real-time event streaming |
| Database | Google BigQuery | Analytics data warehouse |
| AI/ML | Google Vertex AI (Gemini 2.0 Flash) | Demand forecasting, AI assistant |
| Deployment | Google Cloud Run | Serverless container hosting |
| Scheduling | Google Cloud Scheduler | Automated data generation |
| Containerization | Docker, nginx | Production builds |

Features

Dashboard

  • Real-time metrics: revenue, sales count, inventory value, low stock alerts
  • Sales activity charts with historical trends
  • Revenue breakdown by product category
  • Integrated AI assistant for natural language queries

Demand Heatmap

  • Visual representation of sales patterns across time
  • Hourly, daily, and category-based views
  • Peak demand identification
  • Data sourced directly from BigQuery
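The hourly view boils down to bucketing transactions by hour of day. The helper below is an illustrative sketch of that aggregation (not the repository's BigQuery query), assuming each transaction carries a `timestamp` and `quantity`:

```javascript
// Aggregate transaction quantities into 24 hourly buckets (UTC).
function hourlySales(transactions) {
  const buckets = new Array(24).fill(0);
  for (const t of transactions) {
    buckets[new Date(t.timestamp).getUTCHours()] += t.quantity;
  }
  return buckets;
}
```

In production the same grouping is more naturally expressed in SQL (`GROUP BY EXTRACT(HOUR FROM timestamp)`) so BigQuery does the heavy lifting.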

AI Predictions

  • Product-level demand forecasting
  • Days until stockout calculations
  • Confidence scores and trend indicators
  • Powered by Gemini 2.0 Flash model
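A minimal sketch of the days-until-stockout idea, assuming current stock divided by recent average daily demand (the actual service layers Gemini forecasts on top of this):

```javascript
// Estimate days of cover: current stock / average recent daily sales.
function daysUntilStockout(currentStock, dailySales) {
  const avg = dailySales.reduce((a, b) => a + b, 0) / dailySales.length;
  if (avg <= 0) return Infinity; // no recent demand: no stockout projected
  return Math.floor(currentStock / avg);
}
```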

Smart Restock

  • Automated restock suggestions based on current inventory and predicted demand
  • Purchase order generation with cost calculations
  • Urgency-based prioritization
  • Supplier lead time considerations
  • Purchase order workflow (Kafka-first event sourcing + BigQuery views) with supplier email sending
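A hedged sketch of how restock quantity, lead time, and a safety buffer can combine; the formula and parameter names are illustrative assumptions, not the repository's pricing logic:

```javascript
// Order enough to cover expected demand through the supplier lead time
// plus a safety buffer, net of stock already on hand.
function suggestRestockQty(onHand, avgDailyDemand, leadTimeDays, safetyDays = 3) {
  const needed = avgDailyDemand * (leadTimeDays + safetyDays) - onHand;
  return Math.max(0, Math.ceil(needed));
}
```

For example, with 20 units on hand, demand of 10 units/day, and a 5-day lead time, the suggestion is 60 units; a store already holding 200 units gets 0.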

AI Assistant

  • Natural language interface for inventory queries
  • Real-time data from BigQuery
  • Actionable recommendations
  • Context-aware responses using current inventory state

Anomaly Detection

  • Rapid depletion alerts
  • Unusual sales velocity detection
  • Volume spike identification
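Volume-spike detection of this kind is often a z-score check against recent history. The function below is a minimal sketch of that approach (the threshold of 3 standard deviations is an assumption, not the system's tuned value):

```javascript
// Flag the latest observation if it sits more than `threshold` standard
// deviations above the historical mean.
function isVolumeSpike(history, latest, threshold = 3) {
  const mean = history.reduce((a, b) => a + b, 0) / history.length;
  const variance =
    history.reduce((a, b) => a + (b - mean) ** 2, 0) / history.length;
  const std = Math.sqrt(variance) || 1; // avoid divide-by-zero on flat history
  return (latest - mean) / std > threshold;
}
```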

Multi-Store Transfers

  • Inter-location inventory optimization (Downtown Market, Suburban Plaza, Airport Express)
  • Transfer recommendations based on stock levels
  • Cost-benefit analysis
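A simple way to frame the transfer recommendation is surplus-to-deficit rebalancing between the three stores. This sketch is illustrative only; the gap threshold and half-split policy are assumptions:

```javascript
// Recommend moving stock from the store richest in a SKU to the poorest,
// splitting the difference, when the gap is worth a transfer.
function suggestTransfer(stores, sku, minGap = 10) {
  const ranked = [...stores].sort(
    (a, b) => (b.stock[sku] || 0) - (a.stock[sku] || 0)
  );
  const from = ranked[0];
  const to = ranked[ranked.length - 1];
  const gap = (from.stock[sku] || 0) - (to.stock[sku] || 0);
  if (gap < minGap) return null; // not worth the transfer cost
  return { from: from.name, to: to.name, sku, quantity: Math.floor(gap / 2) };
}
```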

AI Action Center

  • Centralized hub for AI-driven inventory actions
  • Action types: Critical Restock, Promotion Opportunity, Stock Transfer, Clearance
  • Full lifecycle management: Pending → Acknowledged → Executed
  • ROI estimates and priority scoring
  • Real-time action generation powered by Gemini AI

AI Agent Pipeline (ADK Multi-Agent System)

  • Autonomous inventory management running every 10 minutes
  • Multi-agent pipeline using Google ADK:
    • SalesDataFetcher: Queries BigQuery for sales data
    • MarketIntelAgent: Analyzes market context and seasonal factors
    • InventoryAnalyzer: Identifies stock issues
    • DemandPredictor: AI-powered demand forecasting
    • RecommendationValidator: Quality checks recommendations
    • ActionPublisher: Publishes to Kafka
  • Auto-Actions triggered after each pipeline run:
    • Auto-generates Action Center recommendations
    • Auto-resolves alerts (restock, markdown, transfer)
    • Auto-balances store inventory transfers

Project Structure

SmartStock-AI/
├── agents/                          # ADK Multi-Agent Pipeline (Python)
│   ├── smartstock_agents/
│   │   ├── agent.py                 # Agent definitions & pipeline
│   │   └── tools.py                 # BigQuery, Kafka tools
│   ├── main.py                      # FastAPI server
│   ├── Dockerfile
│   └── requirements.txt
├── backend/                         # Node.js Backend
│   ├── src/
│   │   ├── api/
│   │   │   └── routes.js           # REST API endpoints
│   │   ├── gemini/
│   │   │   └── gemini-service.js   # Vertex AI integration
│   │   ├── kafka/
│   │   │   └── kafka-service.js    # Confluent Kafka producer
│   │   └── services/
│   │       ├── agent-service.js    # Agent pipeline orchestrator
│   │       ├── action-center-service.js  # AI Action Center
│   │       ├── bigquery-service.js # BigQuery client
│   │       ├── inventory-service.js
│   │       └── prediction-service.js
│   ├── Dockerfile
│   └── package.json
├── frontend/                        # React Frontend
│   ├── src/
│   │   ├── components/
│   │   │   ├── ActionCenter.jsx    # AI Action Center
│   │   │   ├── AgentPipeline.jsx   # Agent monitoring
│   │   │   ├── AIAssistant.jsx
│   │   │   ├── Dashboard.jsx
│   │   │   ├── DemandHeatmap.jsx
│   │   │   ├── MultiStoreTransfer.jsx
│   │   │   ├── SmartRestockOrder.jsx
│   │   │   └── ...
│   │   ├── hooks/
│   │   │   └── useApi.js
│   │   └── App.jsx
│   ├── Dockerfile
│   ├── nginx.conf
│   └── package.json
├── ARCHITECTURE.md                  # Detailed architecture diagrams
└── README.md

API Reference

Inventory Endpoints

| Method | Endpoint | Description |
| --- | --- | --- |
| GET | /api/inventory | Current inventory state |
| GET | /api/metrics | Dashboard metrics |
| GET | /api/alerts | Active alerts |

Analytics Endpoints

| Method | Endpoint | Description |
| --- | --- | --- |
| GET | /api/analytics/sales | Recent sales from BigQuery |
| GET | /api/analytics/summary | Sales summary statistics |
| GET | /api/analytics/hourly | Hourly sales patterns |

AI Endpoints

| Method | Endpoint | Description |
| --- | --- | --- |
| POST | /api/ai/chat | AI assistant conversation |
| GET | /api/predictions | Demand predictions |
| GET | /api/anomalies | Detected anomalies |
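A hedged sketch of calling the assistant endpoint from a client; the request body shape (`{ message }`) is an assumption about the API, and `fetchImpl` is injectable only so the call can be exercised without a live backend:

```javascript
// POST a natural-language question to the AI assistant endpoint.
async function askAssistant(baseUrl, message, fetchImpl = fetch) {
  const res = await fetchImpl(`${baseUrl}/api/ai/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message }),
  });
  return res.json();
}
```

Usage: `await askAssistant(API_URL, 'Which products are close to stockout?')`.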

Restock Endpoints

| Method | Endpoint | Description |
| --- | --- | --- |
| GET | /api/restock/suggestions | Restock recommendations |
| POST | /api/restock/order | Generate purchase order |
| POST | /api/restock/simulate | Supply chain simulation |

Simulator Endpoints

| Method | Endpoint | Description |
| --- | --- | --- |
| POST | /api/simulator/trigger | Trigger sales generation |

Deployment

Prerequisites

  • Google Cloud Platform account with billing enabled
  • Confluent Cloud account
  • gcloud CLI installed and configured

Environment Variables

Backend

GOOGLE_CLOUD_PROJECT=your-project-id
CONFLUENT_BOOTSTRAP_SERVERS=pkc-xxxxx.region.gcp.confluent.cloud:9092
CONFLUENT_API_KEY=your-api-key
CONFLUENT_API_SECRET=your-api-secret

# Purchase orders + email
[email protected]
[email protected]
EMAIL_DRY_RUN=false

# Optional: demo traffic generator for sales-transactions
ENABLE_KAFKA_SALES_GENERATOR=false
KAFKA_SALES_GENERATOR_INTERVAL_MS=15000

# Topics (defaults shown)
KAFKA_TOPIC_SALES=sales-transactions
KAFKA_TOPIC_INVENTORY_UPDATES=inventory-updates
KAFKA_TOPIC_DEMAND_PREDICTIONS=demand-predictions
KAFKA_TOPIC_RESTOCK_ALERTS=restock-alerts
KAFKA_TOPIC_PURCHASE_ORDER_EVENTS=purchase_order_events

PORT=3001

Frontend

VITE_API_URL=https://your-backend-url.run.app

Deploy to Cloud Run

# Backend
cd backend
gcloud builds submit --tag gcr.io/PROJECT_ID/smartstock-backend
gcloud run deploy smartstock-backend \
  --image gcr.io/PROJECT_ID/smartstock-backend \
  --region europe-west1 \
  --allow-unauthenticated \
  --set-env-vars "GOOGLE_CLOUD_PROJECT=PROJECT_ID,CONFLUENT_BOOTSTRAP_SERVERS=..." \
  --memory 512Mi

# Frontend
cd frontend
gcloud builds submit --tag gcr.io/PROJECT_ID/smartstock-frontend
gcloud run deploy smartstock-frontend \
  --image gcr.io/PROJECT_ID/smartstock-frontend \
  --region europe-west1 \
  --allow-unauthenticated \
  --port 80 \
  --memory 256Mi

Configure Cloud Scheduler

gcloud scheduler jobs create http smartstock-simulator \
  --location=europe-west1 \
  --schedule="*/2 * * * *" \
  --uri="https://your-backend-url.run.app/api/simulator/trigger" \
  --http-method=POST \
  --oidc-service-account-email=your-service-account@PROJECT_ID.iam.gserviceaccount.com

Local Development

Installation

# Clone repository
git clone https://github.com/N-45div/SmartStock-AI.git
cd SmartStock-AI

# Backend setup
cd backend
npm install
cp .env.example .env
# Configure environment variables
npm run dev

# Frontend setup (new terminal)
cd frontend
npm install
npm run dev

Running Tests

# Backend tests
cd backend
npm test

# Frontend tests
cd ../frontend
npm test

Configuration

Confluent Cloud Setup

  1. Create a Confluent Cloud cluster
  2. Create topics: sales-transactions, inventory-updates, demand-predictions, restock-alerts, purchase_order_events
  3. Configure BigQuery Sink Connector(s) to ingest the topics into BigQuery
  4. Generate API keys for producer access

BigQuery Setup

  1. Create dataset: smartstock_analytics
  2. Tables are auto-created by the BigQuery Sink Connector
  3. Ensure service account has bigquery.dataEditor and bigquery.jobUser roles

Vertex AI Setup

  1. Enable Vertex AI API in Google Cloud Console
  2. Ensure service account has aiplatform.user role
  3. Model used: gemini-2.0-flash-001 in us-central1

Architecture

For detailed architecture diagrams (Mermaid flowcharts, sequence diagrams, data models), see ARCHITECTURE.md


Performance Considerations

  • Polling Interval: Frontend polls every 15 seconds for updates; polling keeps the Cloud Run services stateless instead of relying on long-lived WebSocket connections
  • Data Retention: BigQuery stores all historical transactions for analytics
  • Caching: Predictions are cached to reduce AI API calls
  • Cold Starts: Cloud Run instances may have initial latency; consider min-instances for production
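The prediction-caching point above amounts to a small TTL cache in front of the AI calls. This is a minimal sketch of the idea, not the repository's implementation:

```javascript
// Tiny TTL cache: repeated requests inside the window reuse the last
// AI result instead of issuing a new Vertex AI call.
function makeTtlCache(ttlMs) {
  const store = new Map();
  return {
    get(key) {
      const hit = store.get(key);
      if (!hit || Date.now() - hit.at > ttlMs) return undefined; // expired
      return hit.value;
    },
    set(key, value) {
      store.set(key, { value, at: Date.now() });
    },
  };
}
```

A wrapper would then check `cache.get(productId)` before calling Gemini and `cache.set(productId, prediction)` after.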

License

MIT License - see LICENSE file


Acknowledgments

Built for the AI Partner Catalyst Hackathon, demonstrating the integration of Confluent streaming platform with Google Cloud AI infrastructure for real-time inventory intelligence.
