Important
The chatbot is currently not running on my preferred language model, and you can only ask it about one question per minute (see the Challenges section for details). I am very saddened by this, but the decision had to be made at the last minute and there was no real alternative. I am working on fixing this as soon as possible.
Inspiration
My main reason for joining the Hackathon was twofold: first, I love e-sports and have always wanted to combine them with my software engineering work. Secondly, I wanted to explore the AWS Bedrock services.
What it does
As per the specs, Valchat is a chatbot with specific tools available to help out with Valorant-related questions and, most importantly, with questions about building a dream team for a competition.
How I built it
The app is built using Java and Spring Boot, with Bootstrap and HTMx for the frontend. I chose these technologies because it has been a while since I did anything in Java, and I wanted to refresh my memory and see what kind of challenges I would encounter.
For the data store, I currently use a file-based H2 database. Once the core data was extracted from the JSON files, it turned out to be the easiest solution: it lets me ship the data easily and version it by simply keeping multiple files. In a production application I would probably replace it with a different database technology, but for the purposes of the Hackathon it turned out to be a good choice.
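To illustrate, a file-based H2 store in Spring Boot only needs a JDBC URL pointing at a file on disk; the path and credentials below are hypothetical examples, not the project's actual configuration:

```properties
# Hypothetical application.properties entries for a file-based H2 store.
# The database lives in ./data/valchat.mv.db, so shipping or versioning
# the data is just a matter of copying that file.
spring.datasource.url=jdbc:h2:file:./data/valchat
spring.datasource.driver-class-name=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=
```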
Extracting the data from the provided JSON files is currently a batch operation: I have run it once, and if updates are needed I run it again. It would not be all that difficult to repurpose the existing code into an event-based system, but I very deliberately avoided adding complexity to the project so I could make the deadline.
This database is used by a Lambda function that transforms the normalized schema into something an LLM can consume. That Lambda is in turn invoked by a Bedrock agent to answer the user's questions.
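The flattening step can be sketched in plain Java. The record type, field names, and ratings below are hypothetical stand-ins for the real schema; the actual Lambda would implement the RequestHandler interface from aws-lambda-java-core and read these rows from the H2 database rather than from an in-memory list:

```java
import java.util.List;
import java.util.Locale;
import java.util.stream.Collectors;

public class TeamSummarizer {

    // Hypothetical normalized row: one entry per (team, player) pair.
    public record PlayerRow(String team, String player, String role, double rating) {}

    // Flatten the rows for one team into a single plain-text block, which is
    // far easier for an LLM to consume than a normalized relational schema.
    public static String summarize(String team, List<PlayerRow> rows) {
        String roster = rows.stream()
                .filter(r -> r.team().equals(team))
                .map(r -> String.format(Locale.ROOT,
                        "- %s (%s, rating %.2f)", r.player(), r.role(), r.rating()))
                .collect(Collectors.joining("\n"));
        return "Team " + team + " roster:\n" + roster;
    }

    public static void main(String[] args) {
        List<PlayerRow> rows = List.of(
                new PlayerRow("Alpha", "aspas", "duelist", 1.21),
                new PlayerRow("Alpha", "Less", "sentinel", 1.10),
                new PlayerRow("Beta", "Demon1", "duelist", 1.25));
        System.out.println(summarize("Alpha", rows));
    }
}
```

The agent then only ever sees the flattened text, which keeps the prompt small and avoids exposing the database schema to the model.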
Challenges I ran into
The main challenge has been the sudden reduction of the model request quota on AWS. Initially my account had the default quota of 500 requests per minute, but that was suddenly reduced to 10. This made it pretty much impossible to use the agent, as the agent needs several requests to come up with a plan and process its different steps.
I have been communicating with AWS support, but this quota cannot be changed easily, and it has been difficult to get in touch with the right person.
For now, I have had to fall back to Anthropic Claude 3 Haiku, for which I have a quota of 20 requests per minute. That makes it barely possible to ask one question per minute. The disadvantage is that the answers are not as good as those from the more advanced Claude models.
Another challenge I faced was that some Bedrock services, such as the knowledge base, were prohibitively expensive for a hobby project like this one. I ended up using an H2 database instead, as discussed above.
Accomplishments that I'm proud of
I am proud that it actually works. I had to do a little bit of everything (data engineering, back end, front end, prompt engineering), but I was able to make it all work by my self-imposed deadline.
What we learned
Once the data was available, the AWS Bedrock services made it easy to integrate with the foundation models they provide. Definitely a recommendation from me. I just wish there were a way to run the Knowledge Base in a cheaper development mode.
What's next for Valchat - the manager
There are quite a few things I want to fix up in the next week:
- I really, really want to switch back to the Sonnet version of Claude, so I hope the quota issue can be resolved soon.
- Some of the queries the Lambda uses need to be tweaked and optimized.
- Some of the data seems to be incorrect (for instance, teams with six players); I need to investigate and fix this.
- The UI is very barebones and could be made more attractive.
- The prompts need general improvements.
- Find a way to give public access without incurring excessive costs.