This is my solution to the NinetyOne graduate assessment. The task was to read a CSV file of test scores and print out the top scorer(s). I extended it with a SQLite database and a REST API as required.
- Reads a CSV file and finds the person (or people) with the highest score
- If there's a tie, it prints all the top scorers in alphabetical order
- Saves every record from the CSV into a SQLite database
- Exposes a REST API to add new scores and query the results
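The tie-handling behaviour above can be sketched as follows. The real logic lives in `app/scorer.py`; the record shape and function name here are illustrative:

```python
def top_scorers(records):
    """records: iterable of (first, second, score) tuples (assumed shape).
    Returns the top score and every full name that achieved it, alphabetically."""
    best = max(score for _, _, score in records)
    names = sorted(
        f"{first} {second}"
        for first, second, score in records
        if score == best
    )
    return best, names

best, names = top_scorers([
    ("George", "Of The Jungle", 78),
    ("Sipho", "Lolo", 78),
    ("Ada", "Lovelace", 62),
])
# best == 78, names == ["George Of The Jungle", "Sipho Lolo"]
```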
Install dependencies first:

```
pip install -r requirements.txt
```

Run the CLI:

```
python main.py TestData.csv
```

Output:

```
George Of The Jungle
Sipho Lolo
Score: 78
```
Run the tests:

```
pytest tests/ -v
```

Start the API:

```
uvicorn app.api:app --reload
```

```
├── app/
│   ├── api.py          # REST API (FastAPI)
│   ├── csv_parser.py   # Custom CSV parser
│   ├── database.py     # SQLite logic
│   └── scorer.py       # Finding the top scorers
├── tests/
│   └── test_all.py     # 16 unit tests
├── main.py             # CLI entry point
└── TestData.csv
```
All endpoints require the header X-API-Key: changeme-supersecret-key.
| Method | Endpoint | Description |
|---|---|---|
| POST | /scores | Add a new score |
| GET | /scores/top | Get the top scorer(s) |
| GET | /scores/{first}/{second} | Get a specific person's score |
Interactive docs available at http://localhost:8000/docs when the server is running.
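For calling the API from a script, a minimal stdlib client might look like this. It only builds the authenticated request object; `urllib.request.urlopen` would send it against a running server:

```python
import json
import urllib.request

API_KEY = "changeme-supersecret-key"   # static key from this README
BASE = "http://localhost:8000"         # default uvicorn address

def build_request(path, payload=None):
    """Build an authenticated request; it becomes a POST when a payload is given."""
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(f"{BASE}{path}", data=data)
    req.add_header("X-API-Key", API_KEY)
    if data is not None:
        req.add_header("Content-Type", "application/json")
    return req

# e.g. urllib.request.urlopen(build_request("/scores/top")) fetches the top scorer(s)
req = build_request("/scores/top")
```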
Custom CSV parser – the brief said not to use standard CSV libraries, so I wrote my own in csv_parser.py. It handles quoted fields, commas inside quotes, and both Windows and Unix line endings.
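The core of such a parser is a small state machine that tracks whether it is inside quotes. This is a simplified sketch, not the code in `app/csv_parser.py` (line endings are assumed to be stripped before the line is parsed):

```python
def parse_line(line):
    """Split one CSV line into fields, honouring double-quoted fields
    with embedded commas and "" as an escaped quote."""
    fields, field, in_quotes = [], [], False
    i = 0
    while i < len(line):
        ch = line[i]
        if ch == '"':
            if in_quotes and i + 1 < len(line) and line[i + 1] == '"':
                field.append('"')   # escaped quote inside a quoted field
                i += 1
            else:
                in_quotes = not in_quotes
        elif ch == ',' and not in_quotes:
            fields.append(''.join(field))
            field = []
        else:
            field.append(ch)
        i += 1
    fields.append(''.join(field))
    return fields

# parse_line('a,"b,c",d') == ['a', 'b,c', 'd']
```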
Module structure – I kept the CSV parsing, scoring logic, database, and API in separate files. The main reason was testability — I wanted to be able to test the scoring logic without needing a database or a running server.
SQLite – simple, no setup needed, and ships with Python. For a production system I'd swap it out for PostgreSQL or similar.
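The SQLite side can be exercised end to end with nothing but the stdlib. Table and column names here are assumptions for the sketch; `app/database.py` holds the real schema:

```python
import sqlite3

# In-memory database for the sketch; the project would use a file path instead.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS scores ("
    "first_name TEXT, second_name TEXT, score INTEGER)"
)
conn.executemany(
    "INSERT INTO scores VALUES (?, ?, ?)",
    [("George", "Of The Jungle", 78), ("Sipho", "Lolo", 78), ("Ada", "Lovelace", 62)],
)
# The top score and everyone who achieved it, in alphabetical order
rows = conn.execute(
    "SELECT first_name, second_name, score FROM scores "
    "WHERE score = (SELECT MAX(score) FROM scores) "
    "ORDER BY first_name, second_name"
).fetchall()
```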
FastAPI – I chose it because it's lightweight, generates the Swagger docs automatically, and handles request validation through Pydantic models.
Right now the API uses a static key in the X-API-Key header. That's fine for local development but not for production. If this were going live I'd:
- Move the key into an environment variable so it's never in the code
- Switch to short-lived JWT tokens issued by an identity provider like AWS Cognito or Auth0
- Enforce HTTPS at the load balancer so the key is never sent in plain text
- Add rate limiting to prevent brute-force attempts
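The first of those steps is a one-liner to sketch. `SCORES_API_KEY` is an assumed variable name, and `secrets.compare_digest` is used so the comparison doesn't leak timing information:

```python
import os
import secrets

def check_key(provided):
    """True only if the provided key matches the one set in the environment.
    Returns False when the environment variable is missing entirely."""
    expected = os.environ.get("SCORES_API_KEY")  # assumed variable name
    return expected is not None and secrets.compare_digest(provided, expected)
```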
I'd use AWS for this:
- API – AWS Lambda + API Gateway for the endpoints (or ECS Fargate if traffic is consistent enough to justify it)
- Database – Amazon RDS (PostgreSQL) for a managed, backed-up database
- Frontend – React app hosted on S3 with CloudFront in front for HTTPS and caching
- Auth – Amazon Cognito for user management and JWT issuance
- Secrets – AWS Secrets Manager to store the DB credentials and API keys
- CI/CD – GitHub Actions to run tests and deploy on every push to main
The main reason for this stack is that it keeps operational overhead low while still being easy to scale. Lambda in particular makes sense here because the API is stateless and traffic is likely to be bursty rather than constant.