California Accountability Panel is a dashboard for viewing school standards. Contributions are welcome!
Note: Learn about the project on the documentation website (under development).
Note: Academic performance data for 2024 and 2025 can be downloaded as a zip file from Nate's Google Drive: https://drive.google.com/drive/folders/1ifRu7gL8OVxN7oHKadEydS3e7c85ityr?usp=sharing
The easiest way to get started is to use the Docker container (under development). If you want to contribute to the project, we recommend installing the PostgreSQL, Python, and Node.js requirements and running each part separately. For help, see the developer documentation.
For support and to stay up to date on project news:
- Attend a virtual Community Hack Night.
- Join our Slack channel (updated 2026-02-01).
- Email us at [email protected] or [email protected].
All parts of the application can be started by running docker compose up. Before the first run, update the configuration in the .env files. For help, see the deployment documentation.
The minimum required environment variables are:
SECRET_KEY=changethis
FIRST_SUPERUSER=[email protected]
FIRST_SUPERUSER_PASSWORD=changethis
POSTGRES_PASSWORD=changethis
View all environment variables.
On first Docker startup, the prestart job now also attempts to import academic indicator data using:
backend/app/scripts/import_ela_data.py
backend/app/scripts/import_indicators.py
For local development (compose.override.yml), mount or place files under backend/resources/ (default expected folder: backend/resources/cde).
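For example, a minimal way to stage a downloaded data file where the local import expects it (the xlsx filename here is only the example used by IMPORT_ELA_DATA_FILE below; use whatever file you actually downloaded):

```shell
# Create the default import folder and copy a downloaded file into it.
# The filename is an assumption taken from the IMPORT_ELA_DATA_FILE example.
mkdir -p backend/resources/cde
src="$HOME/Downloads/eladownload2025.xlsx"
if [ -f "$src" ]; then
  cp "$src" backend/resources/cde/
fi
```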
Optional .env controls:
RUN_DATA_IMPORTS=true
IMPORT_ELA_DATA_FILE=/app/backend/resources/cde/eladownload2025.xlsx
IMPORT_INDICATORS_SOURCE=cde
IMPORT_INDICATORS_PATH=/app/backend/resources/cde
IMPORT_INDICATORS_INDICATOR=
IMPORT_INDICATORS_BATCH_SIZE=1000
Imports are skipped automatically if academicindicator already has rows, to avoid duplicate inserts on restart.
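If you want to check the guard yourself, you can count the table's rows directly. This sketch assumes the Postgres compose service is named db (as referenced later in this README) and that your POSTGRES_* values are loaded into the shell environment:

```shell
# Count rows in academicindicator; a nonzero count means prestart will skip imports.
# Assumes a running compose stack with a "db" service and .env values exported.
docker compose exec db \
  psql -U "$POSTGRES_USER" -d "$POSTGRES_DB" \
  -c "SELECT count(*) FROM academicindicator;"
```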
Detached mode (docker compose up -d) does not stream container logs. To see live import progress from prestart (including heartbeat messages while files parse), run:
docker compose logs -f prestart
Some environment variables in the .env file have a default value of changethis.
You must replace them with secure values. To generate a secret key, run:
python -c "import secrets; print(secrets.token_urlsafe(32))"
Copy the output and use it as the password / secret key. Run the command again for each additional key.
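To replace all three changethis placeholders at once, a small loop like this (a convenience sketch, not part of the repo) prints fresh KEY=value lines you can paste into .env:

```shell
# Print a fresh, independent secret for each placeholder variable.
for key in SECRET_KEY FIRST_SUPERUSER_PASSWORD POSTGRES_PASSWORD; do
  echo "$key=$(python -c 'import secrets; print(secrets.token_urlsafe(32))')"
done
```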
Cloud Run + Cloud SQL support is available through backend/app/scripts/deploy_cloud_run.py.
Runtime secrets for Cloud Run are sourced from GCP Secret Manager:
capanel-secret-key -> SECRET_KEY
capanel-postgres-password -> POSTGRES_PASSWORD
FIRST_SUPERUSER and FIRST_SUPERUSER_PASSWORD stay local .env values for local development and are not managed in Secret Manager.
For Cloud Run, the deploy script now supports syncing local resources from ~/Downloads/resources to GCS before the image deploy:
IMPORT_RESOURCES_LOCAL_PATH=~/Downloads/resources
IMPORT_GCS_URI=gs://ca-panel-001-resources/resources
SYNC_LOCAL_IMPORTS_TO_BUCKET=true
By default, running:
uv run --env-file .env backend/app/main.py
uses local Postgres from .env:
DB_CONNECTION_MODE=auto
POSTGRES_SERVER=localhost
POSTGRES_PORT=5432
POSTGRES_DB=capanel_f65b
POSTGRES_USER=nateb
POSTGRES_PASSWORD=...
DB_CONNECTION_MODE=auto selects local Postgres in ENVIRONMENT=local/staging when POSTGRES_SERVER is set, and prefers Cloud SQL in ENVIRONMENT=production when CLOUD_SQL_INSTANCE_CONNECTION_NAME is set.
You can force behavior with:
DB_CONNECTION_MODE=local
# or
DB_CONNECTION_MODE=cloudsql
Use:
docker compose up --build
This starts the db Postgres container and configures backend services to use it.
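The selection behavior described above can be sketched as a small shell function. This is a paraphrase of the rules stated in this README, not the backend's actual code:

```shell
# Mirror of the documented DB_CONNECTION_MODE behavior (sketch, not repo code).
resolve_db_target() {
  case "$DB_CONNECTION_MODE" in
    local)    echo local; return ;;
    cloudsql) echo cloudsql; return ;;
  esac
  # auto: prefer Cloud SQL in production when the instance name is set,
  # otherwise fall back to local Postgres when POSTGRES_SERVER is set.
  if [ "$ENVIRONMENT" = "production" ] && [ -n "$CLOUD_SQL_INSTANCE_CONNECTION_NAME" ]; then
    echo cloudsql
  elif [ -n "$POSTGRES_SERVER" ]; then
    echo local
  fi
}
```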
Backend supports Cloud SQL via either:
CLOUD_SQL_INSTANCE_CONNECTION_NAME=ca-panel-001:us-west1:capanel-pg
POSTGRES_DB=capanel
POSTGRES_USER=capanel_app
# POSTGRES_PASSWORD is injected at runtime from Secret Manager in Cloud Run
or:
DATABASE_URL=postgresql+psycopg://USER:PASSWORD@/DB?host=/cloudsql/PROJECT:REGION:INSTANCE
To deploy to Cloud Run:
- Set Cloud Run values in your local .env file.
- Create/update runtime secrets in Secret Manager:
python backend/app/scripts/create_secrets.py
- Load env vars and run:
set -a
source .env
set +a
python backend/app/scripts/provision_cloud_run.py
python backend/app/scripts/deploy_cloud_run.py
Notes:
- Cloud Run deploy uses one service (capanel-full) with two containers: frontend (NGINX ingress on 8080) and a backend sidecar (FastAPI on 8000).
- The ${BACKEND_SERVICE}-init Cloud Run Job is deployed but never auto-executed during deploy.
- The init job runs only backend/app/scripts/initial_data.py.
- A manual HTTP Cloud Function ${INIT_TRIGGER_FUNCTION_NAME} is deployed and can trigger the init job on demand.
- Ensure callers have roles/cloudfunctions.invoker on the trigger function.
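Granting that role could look like the following. The function name variable and the member are placeholders, and the region is assumed from the resource list below; substitute your own values:

```shell
# Grant a caller permission to invoke the init trigger function.
# $INIT_TRIGGER_FUNCTION_NAME, the member, and the region are placeholders.
gcloud functions add-iam-policy-binding "$INIT_TRIGGER_FUNCTION_NAME" \
  --region=us-west1 \
  --member="user:[email protected]" \
  --role="roles/cloudfunctions.invoker"
```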
Suggested Google Cloud resource names:
- Artifact Registry repository: capanel-repo (region: us-west1)
- Cloud SQL instance: capanel-pg (PostgreSQL 18, private IP only, network default)
- Cloud SQL database: capanel
- Cloud SQL user: capanel_app
- Cloud Run full service: capanel-full
- Backend image name (Artifact Registry): capanel-backend
- Frontend image name (Artifact Registry): capanel-frontend
- Runtime service account: [email protected]
- Private services IP range: google-managed-services-default
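As one concrete example of these names in use, the Artifact Registry repository could be created manually like this (a sketch of the one-time setup; the provisioning script is expected to handle most resources for you):

```shell
# One-time creation of the Docker image repository named above.
gcloud artifacts repositories create capanel-repo \
  --repository-format=docker \
  --location=us-west1 \
  --description="CA Accountability Panel images"
```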
We strive to make this application as secure as possible. Some highlights include:
- Hashed passwords.
- Based on an actively maintained open-source project (full-stack-fastapi-postgres), so we can mirror its pyproject dependency versions and know when things need upgrading.
The application does not enforce https by default. You can enable it by setting SECURE_SSL_REDIRECT to True in the .env file.
See Security.md for information on reporting security vulnerabilities. For other security-related topics, see the security documentation page. You can also email [email protected].


