lucapug/modern-python-learning-companion


Modern Python Learning Companion

An AI-assisted educational tool that analyzes Python tools used in GitHub-hosted educational courses and suggests modern alternatives with clear explanations.

Overview

This project is designed to help educators and learners discover modern Python tooling alternatives. Unlike static catalogs, our recommendations are signal-driven and time-dependent, reflecting the evolving nature of the Python ecosystem while remaining focused on learning-oriented use cases.

Key Features

  • Signal-Driven Recommendations: Tool suggestions are based on external signals (GitHub Trending, repository metrics) rather than fixed opinions
  • Transparent Decision Making: Each recommendation includes clear explanations of why a tool is suggested
  • Dynamic "Modern" Definition: "Modern" is a measurable, time-dependent property (trending → emerging → stable)
  • Educational Focus: Provides practical, step-by-step instructions without altering course content
  • Non-Invasive: Designed as a learning companion, not a production refactoring engine

AI System Development

After an initial brainstorming phase conducted in ChatGPT, the project was implemented in the VS Code IDE with Roo Code, alternating between Architect and Code modes. The LLM endpoint provider was Zhipu AI, with GLM-4.7 as the selected model.

  • In Architect Mode, the AI assistant created the documentation in docs/ and plans/
  • In Code Mode, Roo Code implemented the project, strictly following the architectural and development instructions (spec-driven development)

Architecture

The system consists of:

  • Backend: FastAPI with Pydantic v2 for data validation
  • Frontend: React + TypeScript + Vite + Tailwind CSS
  • Signal Sources: GitHub Trending API, repository metadata
  • Orchestration: n8n for scheduled workflows, Temporal for fault-tolerant AI pipelines
  • Database: SQLite for development, PostgreSQL for production
  • Cache: Redis for performance optimization

The components above are connected as shown in this Mermaid diagram:

graph TB
    subgraph "User Interface"
        UI[Frontend UI]
    end
    
    subgraph "API Layer"
        API[FastAPI Backend]
    end
    
    subgraph "Core Services"
        ToolService[Tool Discovery Service]
        CertService[Certification Service]
        RepoService[Repo Analysis Service]
    end
    
    subgraph "Signal Sources"
        Trending[GitHub Trending]
        RepoMeta[Repository Metadata]
        Context7[Context7 MCP]
    end
    
    subgraph "Orchestration"
        n8n[n8n Workflows]
        Temporal[Temporal Workflows]
    end
    
    subgraph "Data Layer"
        DB[(SQLite dev / PostgreSQL prod)]
        Cache[(Redis Cache)]
    end
    
    UI --> API
    API --> ToolService
    API --> CertService
    API --> RepoService
    
    CertService --> Trending
    CertService --> RepoMeta
    CertService --> Context7
    
    n8n --> Trending
    n8n --> CertService
    
    Temporal --> RepoService
    Temporal --> CertService
    
    ToolService --> DB
    CertService --> DB
    RepoService --> DB
    
    ToolService --> Cache
    CertService --> Cache

Technology Stack

Backend (Modern Python)

Category           | Classic Tool   | Modern Tool
Package Management | pip            | uv
Environment        | venv           | uv venv
Linting/Formatting | flake8 + black | ruff
Type Checking      | mypy           | pyright
Web Framework      | Flask/Django   | FastAPI
Data Validation    | dataclasses    | pydantic v2
CLI                | argparse       | typer
Testing            | unittest       | pytest
Infrastructure

Component        | Technology
Database         | SQLite (dev) / PostgreSQL (prod)
Cache            | Redis
Containerization | Docker + Docker Compose
CI/CD            | GitHub Actions
Deployment       | Fly.io / Render / Railway
Orchestration    | n8n, Temporal
MCP              | Context7

Frontend

Category   | Technology
Framework  | React + TypeScript
Build Tool | Vite
Styling    | Tailwind CSS
Testing    | Vitest + React Testing Library

Credentials

Required Credentials

Credential   | Environment Variable            | Purpose                                                  | Required
LLM API Key  | MPC_LLM_API_KEY / LLM_API_KEY   | AI-powered features via LLM provider (OpenAI by default) | Yes
GitHub Token | MPC_GITHUB_TOKEN / GITHUB_TOKEN | Fetch trending repos and metadata from GitHub API        | Yes

Optional Credentials

Credential   | Environment Variable | Purpose                 | Default Value
Database URL | MPC_DATABASE_URL     | Database connection URL | sqlite+aiosqlite:///./modern_python_companion.db (dev)
Redis Host   | MPC_REDIS_HOST       | Redis server address    | localhost
Redis Port   | MPC_REDIS_PORT       | Redis server port       | 6379
Redis DB     | MPC_REDIS_DB         | Redis database number   | 0
LLM Provider | MPC_LLM_PROVIDER     | LLM service provider    | openai
LLM Model    | MPC_LLM_MODEL        | LLM model to use        | gpt-4

Future Credentials (Not Currently Required)

Credential                               | Purpose                             | Status
Context7 MCP API Key                     | Fetch up-to-date tool documentation | Mock implementation - no credentials needed yet
n8n Credentials (N8N_USER, N8N_PASSWORD) | Orchestration service               | Optional; not in current docker-compose
Temporal Credentials                     | Temporal server connection          | Optional; not in current docker-compose

How to Obtain Credentials

  • GitHub Token: create a personal access token on GitHub under Settings → Developer settings → Personal access tokens (read-only scopes are typically sufficient for fetching public repository data)
  • LLM API Key: create an API key in your LLM provider's dashboard (for the default provider, openai, at platform.openai.com)

Important Notes

  1. GitHub Token: If not provided, GitHub API features are gracefully disabled and return empty results.
  2. Context7 MCP: Currently uses a mock implementation. In production, this would require API credentials for fetching real tool documentation.
  3. Redis: No authentication is configured in the current implementation.
  4. Frontend: Only requires VITE_API_BASE_URL which points to the backend API - no credentials needed.

Quick Start

Prerequisites

  • Python 3.11+
  • Node.js 18+ or 20+
  • uv installed
  • Docker (optional, for containerized deployment)

Backend Setup

# Navigate to backend directory
cd backend

# Initialize with uv (already done)
# uv init --name modern-python-companion --lib .

# Install dependencies
uv sync

# Run development server
uv run uvicorn modern_python_companion.main:app --reload --host 0.0.0.0 --port 8000

Frontend Setup

# Navigate to frontend directory
cd frontend

# Install dependencies
npm install

# Run development server
npm run dev

Docker Setup

# Copy environment variables
cp .env.example .env

# Edit .env with your values

# Start all services
docker compose up

# Start specific service
docker compose up backend

# View logs
docker compose logs -f backend

# Stop services
docker compose down

API Documentation

Once the backend is running, visit the interactive API documentation that FastAPI serves by default:

  • Swagger UI: http://localhost:8000/docs
  • ReDoc: http://localhost:8000/redoc

API Endpoints

Tools Endpoints

GET  /api/v1/tools                     # List all classic tools
GET  /api/v1/tools/{name}/alternatives # Get modern alternatives
GET  /api/v1/tools/{name}/signals      # Get certification signals

Analysis Endpoints

POST /api/v1/analyze/repo              # Analyze a GitHub repository (saves to database)
GET  /api/v1/analyze/{id}              # Get analysis results (retrieves from database)

Health Endpoint

GET  /api/v1/health                    # Health check
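As a sketch of how a client composes these paths (the base URL matches the dev server command above; the helper functions are hypothetical, and the actual frontend builds URLs from VITE_API_BASE_URL):

```python
BASE_URL = "http://localhost:8000"  # dev default from the uvicorn command above

def tools_url() -> str:
    return f"{BASE_URL}/api/v1/tools"

def alternatives_url(name: str) -> str:
    return f"{BASE_URL}/api/v1/tools/{name}/alternatives"

def signals_url(name: str) -> str:
    return f"{BASE_URL}/api/v1/tools/{name}/signals"

print(alternatives_url("flake8"))  # http://localhost:8000/api/v1/tools/flake8/alternatives
```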

Development

Running Tests

# Run all tests
cd backend
uv run pytest tests/ -v

# Run unit tests only
uv run pytest tests/unit/ -v

# Run integration tests only
uv run pytest tests/integration/ -v

# Run with coverage
uv run pytest tests/ --cov=src --cov-report=html

Code Quality

# Format code
cd backend
uv run ruff format .

# Check code
uv run ruff check .

# Type check
uv run pyright src/

Pre-commit Hooks

Install pre-commit for automatic code quality checks:

# Install pre-commit
pip install pre-commit

# Set up hooks
pre-commit install

# Run on all files
pre-commit run --all-files

Project Structure

modern-python-learning-companion/
├── backend/                    # FastAPI backend
│   ├── src/modern_python_companion/
│   │   ├── api/v1/             # API endpoints
│   │   ├── models/              # Pydantic models
│   │   ├── services/             # Business logic
│   │   │   └── signal_fetchers/  # External data sources
│   │   ├── db/                  # Database layer (SQLite dev / PostgreSQL prod)
│   │   │   ├── __init__.py       # Database setup
│   │   │   ├── models.py        # SQLAlchemy models
│   │   │   └── repositories/    # Repository pattern
│   │   ├── config.py            # Configuration
│   │   └── main.py             # Application entry
│   ├── tests/                   # Test suite
│   │   ├── unit/               # Unit tests
│   │   └── integration/          # Integration tests
│   ├── Dockerfile                # Backend container
│   ├── pyproject.toml           # Project config
│   └── .env.example             # Environment template
├── frontend/                   # React + TypeScript frontend
│   ├── src/
│   │   ├── pages/              # Page components
│   │   ├── services/            # API clients
│   │   └── types/              # TypeScript types
│   ├── Dockerfile              # Frontend container
│   ├── nginx.conf              # Nginx configuration
│   ├── package.json           # Dependencies
│   └── vite.config.ts         # Vite config
├── .github/workflows/           # GitHub Actions CI/CD
│   ├── pr-checks.yml         # PR validation
│   ├── scheduled-tests.yml    # Daily tests
│   └── deploy.yml            # Build and deploy
├── docker-compose.yml           # All services
└── README.md                 # This file

Tool Certification Levels

The system classifies tools into three maturity levels:

Level    | Criteria                             | Example
Trending | Present in GitHub Trending (monthly) | marimo
Emerging | Trending + positive maturity signals | polars (with good stars)
Stable   | High adoption + reliable maintainer  | ruff, uv
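Read as code, the table might look like this. The thresholds and rule ordering here are assumptions for illustration, not the project's actual certification logic:

```python
def certification_level(trending: bool, stars: int, reliable_maintainer: bool) -> str:
    """Map raw signals to a certification level.

    Illustrative only: the star thresholds and rule ordering are
    assumptions, not the project's actual scoring logic.
    """
    if stars >= 10_000 and reliable_maintainer:
        return "stable"
    if trending and stars >= 1_000:
        return "emerging"
    if trending:
        return "trending"
    return "unclassified"

print(certification_level(trending=True, stars=500, reliable_maintainer=False))    # trending
print(certification_level(trending=True, stars=5_000, reliable_maintainer=False))  # emerging
print(certification_level(trending=False, stars=30_000, reliable_maintainer=True)) # stable
```

Because the inputs are time-dependent signals, a tool's level can change between runs, which is exactly the "dynamic modern definition" described above.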

Signal Types

The system considers multiple signal types:

  • trending: GitHub Trending presence
  • stars: Repository star count (normalized)
  • maintainer: Maintainer/organization reputation
  • ecosystem: Ecosystem consolidation
  • docs: Documentation quality
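For example, the stars signal could be normalized on a log scale so that very large repositories do not dominate. The cap value below is an assumption, not the project's actual normalization:

```python
import math

def normalize_stars(stars: int, cap: int = 100_000) -> float:
    """Map a raw star count to [0, 1] on a log scale.

    The cap (100k stars -> 1.0) is an illustrative assumption; the
    project's actual normalization may differ.
    """
    if stars <= 0:
        return 0.0
    return min(1.0, math.log10(stars + 1) / math.log10(cap + 1))

print(normalize_stars(0))                    # 0.0
print(round(normalize_stars(100_000), 2))    # 1.0
```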

Non-Goals

This project explicitly does NOT:

  • Migrate production systems
  • Perform automatic large-scale refactoring
  • Guarantee backward compatibility
  • Provide enterprise-grade migration support
  • Replace the need for manual code review

License

See LICENSE file for details.

Contributing

Contributions are welcome! Please see our contributing guidelines.

Documentation

For detailed documentation, see the docs/ directory.
