
Important

This project is in the early stages of development

Unvibe

Run LLMs in your local projects and files, with full configuration and control over context.

Features

  • 🧠 Memory-first: Remembers key project context and conversation threads to improve responses
  • 🔗 Automatic context: Plugins auto-detect relevant files, dependencies, and history for smarter suggestions
  • 🛠️ Custom tools: Extend Unvibe with your own developer tools and integrations
  • 📜 Custom system instructions: Fine-tune assistant behavior and problem-solving style
  • 🧑‍💻 Deep diagnostics: Git, TypeScript, and ESLint integration for actionable insights
  • 🎨 Visual mode: Iterate on UI visually and preview changes in real time
  • 📂 Files mode: See before/after code edits and review changes safely
  • 🌈 Theming: Personalize with custom colors, fonts, and editor styles
  • ⚡ Easy model swapping: Add and configure new LLMs with minimal setup

Requirements

  • Node.js: version 23 or higher.
  • gh CLI: the GitHub CLI is used for various git/GitHub operations.
  • OpenAI API key: required to use the LLMs.
  • Playwright: used to convert a link to markdown (npm i -g playwright).
  • Ollama: required to run Ollama models and to power the LLM memory system.

Install and Run

Note

Recommended workflow: since this is still the exploration phase, there are a lot of rough edges, coming-soon features, and missing features. For now, use GPT-4.1 and its variants (mini | nano) only.

To install:

npx unvibe

This will:

  1. Pull the latest version of Unvibe from GitHub into ~/.unvibe
  2. Install the dependencies
  3. Build the app
  4. Start the server at http://localhost:54495

If this is the first time you run it, you will be greeted with a welcome message that prompts for your OpenAI API key; once you enter it, you are ready to go! Optionally, follow the instructions to set up AWS S3 to enable LLM image input/output (later it will be used for video/audio too).

To update:

npx unvibe --update

or update from the app itself by clicking on the "Update" button in the sidebar.

To delete:

Warning

This will delete everything related to Unvibe, including all the data in its SQLite database. Unvibe does not store or send any data to a remote server; everything is stored locally in the database.

rm -rf ~/.unvibe

How does it work?

The home screen acts as an entry point to the folders on your computer. By default it shows the folders in ~/projects, and you can add more sources with an easy-to-navigate UI over your local files and folders.

Once you click on a folder, Unvibe parses it and registers the plugins associated with it: if you use Tailwind, a Tailwind plugin is registered; if you use TypeScript, a TypeScript plugin is registered; and so on.

A plugin is a simple interface with a detect function that determines whether the plugin applies to the current folder. If it does, the plugin adds one or more of the following:

  • llm-tools: tools that can be used by the LLM to interact with the project, like running scripts, searching files, etc.

  • llm-system-instructions: system instructions that will be used to guide the LLM in understanding the project structure and context.

  • code-hooks: hooks that run on the LLM response, like ts-check, eslint, etc. A hook either transforms or diagnoses a file generated by the LLM: transformations, like prettier, format or otherwise mutate the code, while diagnostics, like ts-check, check the code for errors and report them back to the LLM.

Later on the roadmap, a plugin will also be able to provide a structured-output configuration that guides the LLM in generating structured output, such as a JSON object with specific fields.
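To make the shape concrete, here is a minimal TypeScript sketch of what such a plugin could look like. The interface and field names below are illustrative assumptions, not the actual Unvibe API:

```typescript
import { access } from "node:fs/promises";
import { join } from "node:path";

// Hypothetical shapes -- the real Unvibe plugin interface may differ.
interface LLMTool {
  name: string;
  description: string;
  run: (args: Record<string, unknown>) => Promise<string>;
}

interface CodeHook {
  name: string;
  kind: "transform" | "diagnose";
  run: (file: { path: string; content: string }) => Promise<string>;
}

interface Plugin {
  name: string;
  // Returns true when the plugin applies to the opened folder.
  detect: (projectRoot: string) => Promise<boolean>;
  tools?: LLMTool[];
  systemInstructions?: string[];
  codeHooks?: CodeHook[];
}

// Example: a minimal TypeScript plugin that activates when tsconfig.json exists.
const typescriptPlugin: Plugin = {
  name: "typescript",
  detect: async (projectRoot) => {
    try {
      await access(join(projectRoot, "tsconfig.json"));
      return true;
    } catch {
      return false;
    }
  },
  systemInstructions: ["This project uses TypeScript; prefer typed APIs."],
  codeHooks: [
    {
      name: "ts-check",
      kind: "diagnose",
      // A real diagnose hook would run tsc and return its diagnostics.
      run: async (file) => `tsc --noEmit diagnostics for ${file.path}`,
    },
  ],
};
```

The key design point is that detection is cheap (a file-existence check), so every registered plugin can be probed when a folder is opened without noticeable cost.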

All of this is meant to give the LLM the best possible context and tools, maximizing its chances of understanding and successfully interacting with your project.

In addition to registered plugins, you can also add ad-hoc tools, plugins, and system instructions.

Core Plugin

The core plugin provides the basic functionality to any project, like reading files, writing files, running scripts, etc.

It includes the following tools:

  • fuzzy_search_files: Search for files by (partial) name.
  • get_file_content: Read file content (optionally by line range).
  • get_file_metadata: Get file size and line count.
  • search_in_files: Search for strings or symbols inside files.
  • node_scratch_pad: Run JavaScript code using Node.js with optional network access.
  • shell_command: Safely execute read-only shell commands in the project.
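As an illustration of what one of these tools might do internally, here is a sketch of get_file_metadata in TypeScript. This is an assumption about the behavior implied by the description above, not the actual Unvibe implementation:

```typescript
import { readFile, stat } from "node:fs/promises";

// Illustrative sketch of a core tool like get_file_metadata;
// not the actual Unvibe implementation.
async function getFileMetadata(
  path: string,
): Promise<{ sizeBytes: number; lineCount: number }> {
  // Read size and content in parallel, then count lines.
  const [info, content] = await Promise.all([
    stat(path),
    readFile(path, "utf8"),
  ]);
  return { sizeBytes: info.size, lineCount: content.split("\n").length };
}
```

Returning size and line count together lets the LLM decide whether to request the whole file or only a line range via get_file_content.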

Additionally, some system instructions:

  • character: Defines the LLM's role and capabilities.
  • files_summary: Summarizes the project files and structure.
  • os_info: Provides information about the operating system.

Project Status

This started as a sketch of a platform to run all kinds of LLMs in your local projects and files, with a nice way to add, remove, and update context and tools. When you first open a project in the app, it parses that project and registers the plugins associated with its stack.

Let's say you open a TypeScript React project; Unvibe will then register TypeScript and various web tooling that will:

  • aid the model in understanding the project structure via system instructions
  • let the model run scripts against your codebase
  • provide tools to manage various aspects of your stack
  • run diagnostics hooks on LLM responses and analyze the project structure, code, and files

And more; everything is customizable. For now the only supported stack is TypeScript/Node.js-based projects, though plugin stubs exist for later support of AWS, Docker, Go, Python, etc.

There are a lot of rough edges throughout the app; this is an experimental phase, so expect some bugs and missing features.

TODOs

  • Support the visual flow (the LLM views the project with you and is ready to be prompted with the correct context)
  • Support the files flow, for focused work on a file (jump to a file and prompt away in a few keystrokes)
  • Support Structured Output configuration; it's modular, but hardcoded for now
  • Finish all Coming Soon features marked by <ComingSoon /> component
  • More quality-of-life improvements in the llm-input UI component in the continue-thread flow
  • Add a Structured Output playground for testing various behaviors independent of the current project
  • Support search-enabled queries; currently limited to the web_scraping tool
  • Add Home docs (user guide to various features, document all cases of usage)
  • Fix add project/github pull flow (which source to add to?)
  • Enable remove/add source in home/projects
  • More informative home/environment page and inputs (what does each var do?)
  • Fix archived threads
  • Add delete|edit hover actions to the threads list items
  • Enable per-project themes
  • Enable per-project and per-thread context settings
  • Make next.js plugin
  • Make react-router plugin
  • Show Structured Output status (outdated/accepted/...)
  • Autocomplete file paths in the llm-input component

Contributing

Check the contributing guide for more details.

Roadmap

These are the hardest problems to solve right now, so for the moment I'm sticking to the TODOs above; once those are done, I will start working on the roadmap items.

  • Fix all LLM providers (Gemini, Ollama, Anthropic, etc.)
  • Enable LLM image output
  • Enable audio/video input and output
  • Enable custom workflows (more than just the visual/files/threads flows)

Final Thoughts

I started this project because I wanted full control over what can be done with LLMs. It showed promise: I can compose contexts however I want and manage my own costs. Although this is more expensive than a $20-a-month AI product subscription, I think it is worth it.

I am mainly interested in fully building out the JS/TS web stacks (a plugin for Next.js, react-router, etc.) and then working on other stacks like Python, Go, and so on. That's a tall order, so my hope is that if people find the idea useful, they will contribute to the project and help me build it, so Unvibe becomes a fully open-source community project.
