Optifiner
Inspiration
Darwinian evolution and the recent rise of multi-agent AI frameworks made us wonder whether software optimization could follow the same principles as natural selection.
Instead of relying on a single “smart” agent or a human-in-the-loop optimizer, what if we let multiple agents compete? What if codebases could evolve, generation by generation, with only the strongest improvements surviving?
The name Optifiner is inspired by OptiFine, the iconic Minecraft optimization mod. Just as OptiFine squeezes performance out of Minecraft without changing the game itself, Optifiner is designed to optimize real-world codebases without rewriting them from scratch. We wanted to carry that same spirit forward and apply it to any codebase.
What It Does
Optifiner is a Darwinian, multi-agent code optimization framework.
You point Optifiner at a GitHub repository, provide your API keys, and let your agents go wild.
At a high level:
- Multiple AI agents independently explore different improvement strategies, including refactoring, caching, restructuring, and performance tuning.
- Each generation produces multiple candidate code variants.
- The system automatically builds and evaluates each variant using user-defined metrics such as FPS, runtime, or benchmark scores.
- Only the best-performing variant survives. All other candidates are discarded.
- The winning variant becomes the parent of the next generation.
This process repeats across generations, allowing the codebase to evolve toward measurable performance improvements.
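The survival loop described above can be sketched as a simple hill-climbing routine (the function names and the toy fitness below are illustrative, not Optifiner's actual API):

```python
import random

def evolve(parent, mutate, fitness, agents=4, generations=5):
    """Each generation, several agents independently propose variants of the
    parent; only the fittest survives, and only if it actually beats the
    parent. The winner becomes the next generation's parent."""
    for _ in range(generations):
        candidates = [mutate(parent) for _ in range(agents)]
        best = max(candidates, key=fitness)
        if fitness(best) > fitness(parent):  # discard non-improvements
            parent = best
    return parent

# Toy usage: the "codebase" is a number, and fitness rewards closeness to 100.
result = evolve(
    parent=0.0,
    mutate=lambda x: x + random.uniform(-1, 10),
    fitness=lambda x: -abs(100 - x),
    agents=8,
    generations=50,
)
```

In the real system, `mutate` is an LLM agent rewriting code and `fitness` is a build-and-benchmark run; the selection logic stays this simple.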
For our demo, we optimized a deliberately inefficient web-based horror game. Across generations, Optifiner evolved the code to deliver dramatic performance gains that exceeded what conventional AI coding tools achieved.
How We Built It
Optifiner is built as a modular, end-to-end system:
- LangGraph for orchestrating multi-agent workflows and evolutionary control
- FastAPI powering the Python backend for agent execution, evaluation, and coordination
- React for a responsive frontend to visualize generations, performance metrics, and progress
- PostgreSQL for persisting experiment state, agent outputs, and evolution history
To enable full autonomy:
- Agents can automatically stage, commit, and push changes to GitHub repositories
- We implemented GitHub App–based authentication to safely grant write access without exposing personal tokens
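The GitHub App flow boils down to exchanging a short-lived, app-signed JWT for an installation access token. A minimal sketch, assuming the PyJWT and requests libraries (the endpoint follows GitHub's REST API; variable names are ours):

```python
import time

import jwt        # PyJWT
import requests

def installation_token(app_id: str, private_key_pem: str, installation_id: str) -> str:
    """Exchange a GitHub App's signed JWT for a short-lived installation
    token that grants repo-scoped write access, so agents can push commits
    without ever holding a personal access token."""
    now = int(time.time())
    app_jwt = jwt.encode(
        {"iat": now - 60, "exp": now + 540, "iss": app_id},  # <= 10 min expiry
        private_key_pem,
        algorithm="RS256",
    )
    resp = requests.post(
        f"https://api.github.com/app/installations/{installation_id}/access_tokens",
        headers={
            "Authorization": f"Bearer {app_jwt}",
            "Accept": "application/vnd.github+json",
        },
    )
    resp.raise_for_status()
    return resp.json()["token"]  # usable as the password in HTTPS git pushes
```

Because the installation token expires after about an hour and is scoped to the repositories the app is installed on, a misbehaving agent can never touch anything outside the experiment.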
Challenges We Ran Into
Building Optifiner surfaced several difficult engineering challenges:
- Integrating the frontend and backend while streaming real-time agent progress and evaluation results
- Designing agent interfaces that allowed multiple agents to work on different parts of the codebase simultaneously
- Safely enabling agents to auto-stage, commit, and push code changes using GitHub Apps
- Designing fair and stable evaluation metrics that accurately reflected real performance improvements
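For the last point, one common way to make a noisy benchmark stable enough for selection decisions (illustrative of the general approach, not Optifiner's exact metric code) is to compare medians over repeated runs and require a minimum relative gain:

```python
import statistics

def measure(run_benchmark, trials=5, warmup=1):
    """Run a benchmark several times and return the median score, which is
    far less sensitive to a one-off slow run than a single measurement."""
    for _ in range(warmup):          # warm caches / JITs before measuring
        run_benchmark()
    scores = [run_benchmark() for _ in range(trials)]
    return statistics.median(scores)

def is_improvement(candidate_score, parent_score, min_gain=0.02):
    """Require at least a 2% relative gain so that measurement noise alone
    can never promote a variant to the next generation."""
    return candidate_score >= parent_score * (1 + min_gain)
```

Without a threshold like `min_gain`, run-to-run jitter can cause the population to "evolve" on noise rather than on real performance differences.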
Accomplishments That We’re Proud Of
- Ran Optifiner on the Flask codebase and measured a 64.3% throughput improvement (from 10,086 to 16,570 requests per second)
- Optimized a slow-running web-based horror game from 2 FPS to 60 FPS, a 2900% performance improvement
- Compared against Cursor on the same codebase: Cursor achieved a 650% improvement (2 FPS to 15 FPS), while Optifiner's final frame rate was 4x higher (60 FPS vs. 15 FPS)
- Built a fully autonomous system that can modify, evaluate, and evolve real GitHub repositories end to end
- Demonstrated that evolutionary, multi-agent optimization can outperform single-agent coding assistants on complex codebases
What We Learned
Through building Optifiner, we learned:
- How to orchestrate and coordinate multi-agent frameworks at scale
- How to safely automate Git operations using GitHub Apps
- The practical limits of single-agent optimization tools on large, highly coupled codebases
- That diversity of strategies matters more than a single “smart” solution when optimizing software
What’s Next for Optifiner
Optifiner currently focuses on web-based and locally runnable projects, but this is just the beginning.
Next, we plan to:
- Expand support to more languages and frameworks beyond web stacks
- Allow users to define custom fitness functions, such as memory usage, load time, or energy efficiency
- Introduce long-term agent memory so successful strategies can transfer across projects
- Provide a plug-in system for defining new agents and evolutionary rules
Our long-term goal is to turn Optifiner into a general-purpose evolutionary engine for optimizing software, one generation at a time.
Built With
- langgraph
- postgresql
- pydantic
- python
- react
- tailwind
- typescript
- vite