This chat app is built on top of the go-openai library.
It talks to the local Ollama server by default, but you can also hook it up to OpenAI or DeepSeek.
Basically: if it works with the OpenAI SDK, it works here.
It’s not just a pretty chat box: the bot can also connect to GitHub (using MCP tools). That means it can:
- Peek at your issues (even private ones)
- Work on them
- And open a pull request from a fresh branch
I’ve mostly tested this with gpt-oss:20b, and it’s been surprisingly good at tool use.
Still early days though, so don’t expect magic on mid-sized or large codebases just yet.
It needs a GitHub access token to work.
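The token goes in the server’s `.env` file (copied from `.env.example` during setup). The key name below is a hypothetical placeholder, so check `.env.example` for the real one:

```shell
# Hypothetical key name -- check server/.env.example for the actual one
export GITHUB_ACCESS_TOKEN="ghp_xxxxxxxxxxxx"
```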
These are all the setups I have tried:
- ✅ Local Ollama models (Qwen, LLaMA, gpt-oss)
- ✅ OpenAI
- ✅ DeepSeek
- Basically anything OpenAI-compatible (this one I haven’t tested myself)
```
git clone https://github.com/adistrim/gollama
cd gollama
```

Backend:

```
cd server
go mod tidy           # install dependencies
cp .env.example .env  # put your stuff here
go build -o gollama . # build the server
./gollama             # run the server
```

Frontend:

```
cd ../app
pnpm install          # install node dependencies
cp .env.example .env
pnpm dev              # dev server
```

Other frontend scripts:

```
pnpm dev     # dev server
pnpm build   # prod build
pnpm preview # preview build
```

If you get stuck, open an issue or just email me [email protected]
This project is licensed under the MIT License - see the LICENSE file for details.