Inspiration
We wanted to build a virtual coding assistant that focuses on learning rather than just giving away the answer, as most LLM-based assistants do. This would encourage deeper learning, particularly for new developers who are less familiar with software fundamentals.
What it does
Debug.Me is a multi-purpose extension for Visual Studio Code whose primary goal is helping users learn rather than just copy code. It has two modes: Debug and Brainstorm. Debug reads terminal errors along with the code responsible for them and explains the issues in simple, understandable terms. Brainstorm is a place where users can bounce ideas off the assistant, which guides them with questions but ultimately lets the users make their own decisions.
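In Debug mode, the request sent to the model can be thought of as a prompt assembled from the captured terminal error and the code that produced it. A minimal sketch of that idea (the `build_debug_prompt` helper and its instruction text are hypothetical illustrations, not Debug.Me's actual implementation):

```python
def build_debug_prompt(terminal_error: str, code: str) -> str:
    """Combine a terminal error and the responsible code into a
    teaching-oriented prompt (hypothetical sketch, not the real prompt)."""
    return (
        "You are a coding tutor. Explain the error below in simple, "
        "understandable terms. Do NOT write the fixed code for the user.\n\n"
        f"Terminal error:\n{terminal_error}\n\n"
        f"Code responsible:\n{code}\n"
    )

# Example: a NameError captured from the integrated terminal.
prompt = build_debug_prompt(
    "NameError: name 'total' is not defined",
    "print(total)",
)
```

The key design point is that the instruction asks for an explanation rather than a fix, which is what keeps the assistant in "tutor" rather than "autocomplete" territory.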
How we built it
The assistant was built with Mistral, an open-source model, together with LangChain, an LLM framework that chains conversation steps and makes it easy to recall earlier parts of the conversation.
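The memory-recollection pattern LangChain provides boils down to storing past turns and replaying them in front of each new prompt. A rough plain-Python sketch of that pattern (a simplified stand-in for illustration only; LangChain's own conversation-memory classes handle this for you):

```python
class ConversationMemory:
    """Toy stand-in for LangChain-style conversation memory:
    stores past turns and replays them before each new prompt."""

    def __init__(self) -> None:
        self.turns: list[tuple[str, str]] = []

    def add_turn(self, user: str, assistant: str) -> None:
        # Record one completed exchange.
        self.turns.append((user, assistant))

    def build_prompt(self, new_message: str) -> str:
        # Prepend the full history so the model "remembers" context.
        history = "\n".join(
            f"User: {u}\nAssistant: {a}" for u, a in self.turns
        )
        return f"{history}\nUser: {new_message}\nAssistant:".lstrip()

memory = ConversationMemory()
memory.add_turn("What is a list comprehension?", "A concise way to build lists.")
prompt = memory.build_prompt("Can you give a hint, not the answer?")
```

This is why the Brainstorm mode can ask follow-up questions that build on earlier ones: every new request carries the prior exchanges with it.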
The VS Code extension was built with the Webview API, which provides an interactive panel inside the VS Code interface, using JavaScript, TypeScript, HTML, and CSS as a base.
Challenges we ran into
For the backend models, we initially tried running the assistant on an entirely locally hosted model so that it could work fully offline. However, we did not have GPU access, and the best models we could run on a CPU alone were insufficient for our purposes. We had to pivot rather late to a hosted open-source model, which could power a more complex assistant while maintaining much of the customizability of a locally hosted one.
For the frontend, initializing the VS Code extension in the workspace was more difficult than expected, and several workarounds were needed to get the functionality we wanted.
Accomplishments that we're proud of
A working VSCode extension that hosts an LLM agent to help you with your code!
What we learned
A lot about LLMs and the various models out there, a lot of LangChain, and also VS Code extension development and full-stack integration.
What's next for Debug.Me
Definitely more personalization of the models to specific users, as well as more interactive functionality.
Built With
- css
- fastapi
- html
- javascript
- llama
- mistral
- python
- typescript