Inspiration

Our team is passionate about using AI technologies to solve real-world problems and create amazing experiences along the way. We were inspired by the Rohde & Schwarz challenge, which asked us to help employees monitor and make sense of ever-growing log files in the modern age.

What it does

Our solution lets the user upload a log file and presents a short summary of it, both to inform the user and to inspire further questions. Through an interactive chatbot experience, it keeps updating this dynamic summary based on the user's interests and the conversation history so far.

The intelligent AI-powered backend parses the logs, filters them accordingly, and maintains a database generated from the uploaded file. In parallel to the chat, the system predicts queries to retrieve the most relevant entries (which correspond to log rows) from the database, presents them to the user, and incorporates them into the current summary iteratively.
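As a rough sketch of the parse-and-filter step (the log format, regex, and function names here are illustrative, not our exact implementation):

```python
import re

# Hypothetical log format: "2024-11-16 10:32:01 ERROR Disk quota exceeded"
LOG_LINE = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<level>DEBUG|INFO|WARNING|ERROR) "
    r"(?P<message>.*)"
)

def parse_log(text, min_level="INFO"):
    """Parse raw log text into structured rows, dropping levels below min_level."""
    order = ["DEBUG", "INFO", "WARNING", "ERROR"]
    threshold = order.index(min_level)
    rows = []
    for line in text.splitlines():
        m = LOG_LINE.match(line)
        if m and order.index(m["level"]) >= threshold:
            rows.append(m.groupdict())
    return rows
```

Each resulting dict maps straight onto a database row, which is what the retrieval queries later run against.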

How we built it

We have a Flask backend with a PostgreSQL server that stores the filtered log rows as entries. Each database row also has a column holding an embedding of the log message, generated quickly by a local model.
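The embeddings make similarity search over log rows cheap. A minimal pure-Python sketch of the idea (in practice this ranking happens against the database; the function names and tiny vectors here are illustrative):

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query_emb, rows, k=5):
    """rows: list of (log_message, embedding) pairs as stored in the DB.
    Returns the k messages most similar to the query embedding."""
    ranked = sorted(rows, key=lambda r: cosine(query_emb, r[1]), reverse=True)
    return [msg for msg, _ in ranked[:k]]
```

The top-ranked rows are what get surfaced as references and folded into the running summary.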

The frontend is built with React, styled with Tailwind CSS, and infused with love and passion. It is designed to be self-explanatory and to provide a smooth experience for users.

Challenges we ran into

We hadn't worked extensively with local large language models before. Getting them to run fast enough for a smooth user experience was very challenging: generating the initial summary quickly was simply not feasible (about a minute for roughly 200 log lines), and even a 4-bit quantized model was too slow. To solve this, we switched to OpenAI API calls and came up with clever tricks to keep the experience smooth.

Accomplishments that we're proud of

The gradient in the log-layer selection after uploading the file is beautiful to look at, the references integrate seamlessly and beautifully into the continuously updating summary, and our hard work designing prompts for the OpenAI models has paid off!

What we learned

How to stay focused on the goal, and how to spend hours trying to run local models before deciding to go with existing APIs for the chat interface.

What's next for Logzilla

First and foremost, probably some sleep. Then, we hope to make the initial summary more comprehensive, the chatbot even cleverer, and the whole system even more robust.
