Inspiration
Hitachi Track for TAMU Datathon
What it does
A document-classification analyzer that is hosted locally to protect your personal information, streamlining the process of parsing documents for sensitive information with both ease and privacy.
How we built it
We used llama.cpp's Python bindings, along with HTML, CSS, and JavaScript, to connect the front end to the back end and to our AI models: IBM Granite models, which are strong at classification. To keep inference fast, we chose small model files quantized to 4 bits or below.
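As a rough sketch of the pipeline described above, the snippet below loads a quantized Granite GGUF model through llama-cpp-python and asks it to pick one label for a document. The model path, the label set, and the prompt wording are illustrative placeholders, not the project's actual implementation.

```python
# Sketch (assumed setup): local document classification with a quantized
# Granite model via llama-cpp-python. Nothing leaves the machine.


def build_prompt(text: str, labels: list[str]) -> str:
    """Build a classification prompt that constrains the model to known labels."""
    return (
        "Classify the document below into exactly one of these categories: "
        + ", ".join(labels)
        + ".\nAnswer with the category name only.\n\nDocument:\n"
        + text
    )


def classify(text: str, labels: list[str], model_path: str) -> str:
    """Run the quantized model fully locally and return its chosen label."""
    from llama_cpp import Llama  # imported lazily so build_prompt stays standalone

    llm = Llama(model_path=model_path, n_ctx=2048, verbose=False)
    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": build_prompt(text, labels)}],
        max_tokens=8,
        temperature=0.0,  # deterministic, label-only output
    )
    return out["choices"][0]["message"]["content"].strip()
```

Keeping the model quantized to 4 bits or below (e.g. a `Q4_K_M` GGUF file) is what makes this practical on laptop-class hardware.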
Challenges we ran into
- Hosting a local/offline LLM model to protect sensitive information and prevent data privacy leak.
- Implementing different features such as real-time status update and designing clear UI/UX that's easy to navigate.
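One way to sketch the real-time status feature mentioned above is a small stdlib-only endpoint the front end polls while a document is being processed. The `STATUS` dict, its field names, and the port are placeholders assumed for illustration, not the project's actual design.

```python
# Sketch (assumed design): a /status endpoint the JavaScript front end can
# poll for progress updates while the local model works through a document.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Shared state the processing code would update as it runs (placeholder fields).
STATUS = {"stage": "idle", "pages_done": 0, "pages_total": 0}


def status_payload(status: dict) -> bytes:
    """Serialize the shared status dict as JSON for the front end."""
    return json.dumps(status).encode("utf-8")


class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/status":
            body = status_payload(STATUS)
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)


# To serve (blocks forever, so left commented here):
# HTTPServer(("127.0.0.1", 8000), StatusHandler).serve_forever()
```

The front end would then `fetch("/status")` on an interval and render the stage and page counts, keeping everything on localhost.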
Accomplishments that we're proud of
A document-classification analyzer that is hosted offline and locally, preventing the sensitive-information leaks that could otherwise occur when using online, non-private APIs.
What we learned
- How to run an LLM model locally/offline
- How to design clear, easy-to-navigate UI/UX
What's next for YourLocalDocBruh
Improving processing accuracy and speed, as well as making the design more user-friendly.
Built With
- css
- docker
- docling
- granite
- html
- javascript
- llama
- python
