AvaFront

- Ever received a sales pitch from an AI agent? Trigger a sales pitch on request, or conditionally based on user questions.
- AI agents have unlimited patience. Don't feel bad about asking questions or asking for clarification if you don't understand. Ask for recommendations.
- Customize your agent to fit your brand! Choose a look and voice that fits.
The Next Evolution Is Here
Users interfacing with technology has come a long way. With computers we began with the Command Line Interfaces (CLI): Early user interfaces were predominantly text-based, requiring users to input commands through a command line. This type of interface was not very intuitive and required specific knowledge of command syntax.
Next, the Graphical User Interfaces (GUI): This was a huge leap forward for being user-friendly. Users could interact with software through visual elements like buttons and icons, making operations more intuitive.
With the advent of the internet, web-based interfaces became prominent. Everyone needed a dotcom/website to legitimize their business. What is the next evolution? Natural conversation interface! A time is coming when everyone will need an AI agent that we can interface with natural conversation. Instead of navigating through web pages, users could simply ask for what they need, and the AI would deliver instant, personalized results.
Enter AvaFront, the next generation in customer experience. Imagine having a 24/7 avatar that can greet customers, answer inquiries, manage appointments, resolve complaints, and even assist with sales support, all through natural conversation.
AvaFront is an AI-powered avatar that can be displayed at your front desk or integrated into your website, providing a seamless and personalized experience for your customers. With its ability to handle various tasks and provide accurate information, AvaFront ensures exceptional customer service round the clock. Elevate your business with AvaFront and revolutionize the way your customers interact with you.
Context Diagram

How it Was Built

For the front end, I used Angular 18 and a library named TalkingHead to sync the avatar's mouth movements to the speech.
For the backend, I used AutoGen.Net. I started this project trying to learn Python and use the original AutoGen; thanks to the great developers at Microsoft for porting it to .NET. You guys/gals rock! The conversation state is stored in CosmosDb. I implemented API Management for rate limiting and, in the future, possibly to utilize subscription keys for certain bots. The solution is broken up into four projects in an attempt to follow clean architecture: an API project, an AutoGen project, a Shared project, and an Infrastructure project. Unfortunately there is no domain layer, as it wasn't clear from the start what I was building.
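Since the API layer is stateless, each request loads the conversation document from CosmosDb, appends the new turn, and writes it back. A rough sketch of what such a document might look like (the field names here are assumptions, not the actual schema from the Shared/Infrastructure projects):

```typescript
// Hypothetical shape of the per-conversation document persisted in CosmosDb.
interface ChatMessage {
  role: "user" | "assistant";
  agent?: string;      // which agent produced an assistant message
  content: string;
}

interface ConversationState {
  id: string;          // document id, one per conversation
  activeAgent: string; // agent currently handling the conversation
  messages: ChatMessage[];
  updatedAt: string;   // ISO timestamp of the last turn
}

// Append a turn immutably: the stateless API builds the updated document
// and persists the result, leaving the loaded copy untouched.
function appendTurn(state: ConversationState, msg: ChatMessage): ConversationState {
  return {
    ...state,
    messages: [...state.messages, msg],
    updatedAt: new Date().toISOString(),
  };
}
```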
The avatar is actually multiple agents. I tried using the AutoGen group chat but couldn't trust the results, so I used more of a chat manager pattern to decide which agent should handle each prompt.
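The chat manager idea can be sketched as a simple router that inspects the prompt and picks an agent, falling back to the general agent when nothing matches. The real routing runs in AutoGen.Net on the backend; the agent names and keyword matching below are purely illustrative:

```typescript
// Hypothetical chat-manager routing: pick a specialized agent for each
// prompt, defaulting to the general AvaFront agent.
interface AgentRoute {
  name: string;        // e.g. "RestaurantAgent" (illustrative name)
  keywords: string[];  // terms that signal this agent should respond
}

const routes: AgentRoute[] = [
  { name: "RestaurantAgent", keywords: ["restaurant", "order", "reservation", "menu"] },
  { name: "SalesAgent", keywords: ["sales", "pitch"] },
  { name: "AuthAgent", keywords: ["authenticate", "login", "verify"] },
];

function routeAgent(prompt: string, table: AgentRoute[] = routes): string {
  const lower = prompt.toLowerCase();
  for (const route of table) {
    if (route.keywords.some((k) => lower.includes(k))) {
      return route.name;
    }
  }
  return "AvaFrontAgent"; // fallback: the general-purpose agent
}
```

In practice the selection could itself be an LLM call rather than keyword matching, but the control flow (one manager deciding, specialized agents answering) is the same.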
Agent prompts are located here:
link
link
Safety/Responsible prompts:
link
High Level Features
Please reference the context diagram. The user can interact by clicking, typing, or speaking. The AvaFront application can respond through speech, opening new tabs in the browser, showing images, showing video, sending emails, sending texts, and sending notifications to the customer. It understands and responds in natural language. It also has the capability to authenticate a user via multiple factors: something they know (a secret) and something they have (a phone). Authentication is important if a user wants updates on anything customer-specific.
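The response channels listed above lend themselves to a discriminated union: the backend returns a list of actions and the frontend dispatches each one. This is a sketch under assumed names, not the actual wire format:

```typescript
// Hypothetical model of AvaFront's response channels as a tagged union.
type AgentAction =
  | { kind: "speak"; text: string }
  | { kind: "openTab"; url: string }
  | { kind: "showImage"; url: string }
  | { kind: "showVideo"; url: string }
  | { kind: "sendEmail"; to: string; body: string }
  | { kind: "sendText"; phone: string; body: string }
  | { kind: "notify"; message: string };

// Exhaustive dispatch: the compiler flags any action kind left unhandled,
// which is useful as new channels are added.
function describeAction(action: AgentAction): string {
  switch (action.kind) {
    case "speak": return `Avatar says: ${action.text}`;
    case "openTab": return `Opening ${action.url}`;
    case "showImage": return `Showing image ${action.url}`;
    case "showVideo": return `Playing video ${action.url}`;
    case "sendEmail": return `Emailing ${action.to}`;
    case "sendText": return `Texting ${action.phone}`;
    case "notify": return `Notification: ${action.message}`;
  }
}
```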
AvaFront has a stateless API layer, which makes it easily deployable. The AI agent is multi-agent, multimodal, and multi-model. The steps to deploy are: gather and prepare the data, feed the data to the agent, craft any domain-specific prompts, and deploy.
The current modalities are text and speech. The next step would be to implement vision through a web/mobile camera. The models currently being used are gpt-3.5-turbo and gpt-4-32k (the larger context window was needed to feed the training data).
How to Test AvaFront
Visit https://avafront.com
Note: Response times can vary, especially when you are asking informational questions, due to the amount of data being fed to the LLM. Feel free to open the console to see the logging and network traffic. The video was cut and sped up to meet the video length requirements.
AvaFront Agent
The first agent is the AvaFront agent. It has been prompted via RAG on the AvaFront training data here: link You can ask it all sorts of questions about AvaFront. You can also test throwing curveballs at it to see how it handles the inquiries. There's no need to hunt for navigation to reach another agent; just ask the first agent to send you to the Sales Demo, the Restaurant Demo, or the Authentication Demo, and it will redirect you.
Restaurant Agent
The restaurant agent has been prompted on this training data: link Feel free to ask it general information about the restaurant, or ask to place an order or make a reservation. Currently the reservation and ordering systems are mocked; future implementations would include integrations with ordering providers like DoorDash.
Sales Agent
You can ask the AvaFront agent to take you to the sales demo, where you can see the agent give a sales pitch. It is a prototype showing off the capability of displaying different images while speaking; it does not listen for input.
Authentication Agent
The authentication agent is a proof of concept showing how an agent can perform multifactor authentication. After asking you for a name or phone number, it will also ask for information you should know (such as a zip code or full address). In the future, it will be able to send a text to verify that you have the phone that matches your records. (Implementation not fully completed.)
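The flow described above can be sketched as a small state machine: find the record, check the knowledge factor, then (as future work) confirm possession via SMS. Step names are assumptions for illustration; the SMS leg is not fully implemented in the current build:

```typescript
// Hypothetical state machine for the multifactor authentication flow.
type AuthStep = "askIdentifier" | "askKnowledgeFactor" | "sendSmsCode" | "verified" | "failed";

interface AuthSession {
  step: AuthStep;
}

// Advance one step; `matchesRecord` is whether the user's answer matched
// the customer record on file.
function advance(session: AuthSession, input: { matchesRecord: boolean }): AuthSession {
  if (!input.matchesRecord && session.step !== "verified" && session.step !== "failed") {
    return { step: "failed" };
  }
  switch (session.step) {
    case "askIdentifier":
      // Name or phone number located a record: ask for something they know.
      return { step: "askKnowledgeFactor" };
    case "askKnowledgeFactor":
      // Zip code / full address matched: text a code to the phone on record.
      return { step: "sendSmsCode" };
    case "sendSmsCode":
      // Something they have: code entered correctly (future work).
      return { step: "verified" };
    default:
      return session; // terminal states stay put
  }
}
```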
If the agent is speaking and you don't have the patience to wait for it to finish, you can interrupt it by saying "stop" or pressing the "Listening" button.
What's Next
Performance improvements are needed to make the avatar respond faster. As much as I wanted to explore Azure AI Search's semantic ranker and vectorize the data, it was too much of a time sink for this hackathon. That would be the next step in improving agent performance: feeding smaller, precise bits of RAG data to the LLM. Right now I'm feeding all the data to the LLM, and sometimes the response can take 10+ seconds. Not a very natural........................ conversation.
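The retrieval step I have in mind would look roughly like this: embed the question, score each pre-embedded chunk of training data by cosine similarity, and send only the top-k chunks to the LLM instead of everything. The embedding calls are omitted here; the vectors stand in for real embeddings:

```typescript
// Sketch of top-k RAG retrieval by cosine similarity (a minimal sketch,
// not the Azure AI Search implementation).
interface Chunk {
  text: string;
  embedding: number[];
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k chunks most similar to the query embedding.
function topK(query: number[], chunks: Chunk[], k: number): Chunk[] {
  return [...chunks]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k);
}
```

Sending only these k chunks keeps the prompt small, which should cut both latency and token cost compared with feeding the whole dataset on every turn.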

I'm super excited to continue to explore this idea. There are still so many things I didn't have time to implement. I'm also excited to build ideas for the future when AI agents become normal. There will be a need to build AI bot registries and we will need protocols for AI agent discoverability. So many new ideas. Exciting times!
Also, the code is very messy and needs refactoring.
Special Thanks
First and foremost, I'm very grateful to my loving and supportive wife who took on extra baby duties to allow me to explore my passion for life changing technology. My baby is awesome too!
Thanks to Mika Suominen (https://github.com/met4citizen) for his amazing work on the TalkingHead library that made the avatar possible. When I started the hackathon two weeks ago, I was really excited to find and try Azure Speech's in-preview "Text to Speech Avatar" (https://learn.microsoft.com/en-us/azure/ai-services/speech-service/text-to-speech-avatar/what-is-text-to-speech-avatar). It was super easy to use and implement, and it was photorealistic. It would have been an easy choice for a hackathon, except a day later I realized it's $1 per minute! Here's some feedback, MS: that is not something I would be willing to pay. That cost does not scale well.
Shout out to https://readyplayer.me/ for easy Avatars.
Also, thanks to Renatto Garro, whom I met at the eMerge conference in Miami. Hearing one of his talks sparked the idea for AvaFront. He also introduced me to AutoGen.
This project has two repositories. One for the front end and one for the backend: https://github.com/vangjv/avafrontfrontend https://github.com/vangjv/avafrontbackend
Built With
- angular.js
- autogen
- azure
- c#
- cloud
- cosmosdb
- openai
- redis
- talkinghead