An open-source private AI scribe platform. It currently supports running locally on Mac, Windows, or Linux, using Flask for the backend, SQLite for the database, Vite for the frontend, and Ollama for local AI inference.
- Backend: Flask, SQLAlchemy, SQLite
- Frontend: Vite
- AI Model Inference: Ollama (defaulting to the Llama 3.2 model)
Make sure you have the following installed on your machine:
- Python 3.8+
- Node.js 16+
- npm
- Ollama (installed and running locally)
```bash
git clone https://github.com/yourusername/private-ai-scribe.git
cd private-ai-scribe
```

Create a virtual environment and install Python dependencies:

```bash
python -m venv venv
source venv/bin/activate  # On Windows use `venv\Scripts\activate`
pip install -r requirements.txt
```

Initialize the SQLite database:

```bash
flask db upgrade  # if using migrations, otherwise your custom init command
```

Navigate to the frontend directory and install dependencies:
```bash
cd frontend
npm install
```

From the project root:

```bash
source venv/bin/activate
flask run
```

The backend will start at http://127.0.0.1:5000.
With the Flask server running, run the following command to create an admin user:

```bash
flask create-admin
```

You will be prompted to enter an email and password for the admin account. Once created, you can log in.
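Conceptually, an admin-creation command like this boils down to prompting for credentials, hashing the password, and inserting a user row. Below is a minimal stdlib-only sketch of that idea; the table name, columns, and PBKDF2 hashing scheme are illustrative assumptions (the real command runs inside Flask and uses SQLAlchemy):

```python
import hashlib
import os
import sqlite3

def hash_password(password, salt=None):
    # Salted PBKDF2 hash; the scheme here is illustrative, not the project's actual one.
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt.hex() + ":" + digest.hex()

def verify_password(password, stored):
    # Recompute the hash with the stored salt and compare digests.
    salt_hex, digest_hex = stored.split(":")
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), bytes.fromhex(salt_hex), 100_000)
    return digest.hex() == digest_hex

def create_admin(conn, email, password):
    # Hypothetical schema: one users table with an is_admin flag.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users (email TEXT PRIMARY KEY, password_hash TEXT, is_admin INTEGER)"
    )
    conn.execute(
        "INSERT INTO users (email, password_hash, is_admin) VALUES (?, ?, 1)",
        (email, hash_password(password)),
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    create_admin(conn, "admin@example.com", "s3cret")
    stored = conn.execute("SELECT password_hash FROM users").fetchone()[0]
    print(verify_password("s3cret", stored))  # prints True
```

Storing only a salted hash (never the plaintext password) is the important part; the real implementation can swap in any equivalent hashing library.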
From the frontend directory:

```bash
npm run dev
```

The frontend will be available at http://127.0.0.1:3000.
Ensure Ollama is installed and running:

```bash
ollama serve
```

Download the default Llama 3.2 model:

```bash
ollama pull llama3.2
```

You can test the model locally with:

```bash
ollama run llama3.2
```

Make sure to configure your Flask backend to connect to the local Ollama API.
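For the backend-to-Ollama connection, Ollama exposes a local HTTP API, by default at http://localhost:11434. A minimal sketch of how the Flask backend might call its `/api/generate` endpoint, using only the standard library (the prompt text and helper names are illustrative):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(prompt, model="llama3.2"):
    # stream=False asks Ollama to return one JSON object instead of a token stream.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt, model="llama3.2"):
    # POST the JSON payload to the local Ollama server and return the generated text.
    data = json.dumps(build_payload(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires `ollama serve` running and the model pulled):
#   text = generate("Summarize this visit note in one sentence.")
```

Since everything stays on localhost, no transcript data leaves the machine, which is the point of running inference through Ollama rather than a hosted API.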
Contributions are welcome! Please open an issue or submit a pull request.
This project is licensed under the MIT License - see the LICENSE file for details.