Inspiration

In college, I have to cite legitimate scholarly sources for essays and papers, and it was honestly a hassle to find relevant, peer-reviewed information about whatever topic I was writing on, which, mind you, was usually a niche one. Google didn't help much with sourcing and finding information at the time, so I thought it would be really cool and helpful for students like me and other beginner researchers to use Timeonar to learn about the topic they're going to write about and get past that first frustrating bump in the road. Big thanks to the Perplexity team for making this possible with the Sonar API.

What it does

Timeonar builds a timeline of any research topic you're interested in, showing how work on it evolved over time while pointing you to authentic sources for learning. It simplifies the research process by helping users quickly understand any topic they want to learn about: instead of spending hours browsing databases or figuring out where to start, users just enter a topic and click "Search". Our backend uses the Sonar API to first fetch a timeline of the topic, showing how it evolved over time. For each year, it provides key insights, a summary of what happened, and what was discovered during that time. Then, a sequence of follow-up API calls fetches deeper information, such as the research methodology used, the theoretical paradigms applied, and any major breakthroughs or field evolutions. Finally, a last API call retrieves the sources behind the discoveries. Everything is displayed sequentially so the website feels fast and responsive without overwhelming the user.
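The call sequence above (timeline, then deeper details, then sources, each call fed the previous call's output) can be sketched roughly like this. This is a minimal illustration, not the actual Timeonar backend: the names `runPipeline`, `callSonar`, and the stage labels are all made up for the example.

```typescript
// Illustrative sketch of a sequential LLM call pipeline.
// All names here are hypothetical; callSonar stands in for a real Sonar API request.

type Stage = "timeline" | "details" | "sources";

interface StageResult {
  stage: Stage;
  content: string;
}

async function runPipeline(
  topic: string,
  callSonar: (prompt: string) => Promise<string>
): Promise<StageResult[]> {
  const stages: Stage[] = ["timeline", "details", "sources"];
  const results: StageResult[] = [];
  let context = ""; // output of the previous call, fed into the next prompt

  for (const stage of stages) {
    const prompt =
      `Topic: ${topic}\nStage: ${stage}\n` +
      (context ? `Previous findings:\n${context}` : "");
    const content = await callSonar(prompt);
    results.push({ stage, content });
    context = content; // chain the context forward to the next call
  }
  return results;
}
```

Because each prompt embeds the previous response, later stages stay anchored to the same timeline instead of answering from scratch.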

How we built it

The tech stack used to build Timeonar:

Frontend: Vite + React with Tailwind CSS, hosted on Vercel
Backend: .NET (C#), hosted on Azure

For the Sonar API calls, a single call lacked context and often hallucinated, so I implemented sequential API calls that pass the output of each call into the next one. Since Sonar reasoning supports a large context window, this wasn't an issue. Sequential calls also significantly improved the frontend experience: previously, users had to wait too long for the timeline to render, but by combining sequential API calls with a Server-Sent Events (SSE) connection, data is now streamed and displayed almost in real time, which makes the user experience much smoother and faster.
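For context, SSE is just a long-lived HTTP response where each event is a block of `field: value` lines ending in a blank line. The frontend can consume it with the built-in `EventSource`; the parser below is only a sketch of the wire format to show what the browser is actually receiving (the function name and event labels are illustrative, not from Timeonar's code).

```typescript
// Minimal sketch of parsing the SSE wire format: events are separated by a
// blank line, and each event carries "event:" and "data:" fields.

interface SseEvent {
  event: string;
  data: string;
}

function parseSse(raw: string): SseEvent[] {
  const events: SseEvent[] = [];
  for (const block of raw.split("\n\n")) {
    let event = "message"; // SSE's default event type
    const data: string[] = [];
    for (const line of block.split("\n")) {
      if (line.startsWith("event:")) event = line.slice(6).trim();
      else if (line.startsWith("data:")) data.push(line.slice(5).trim());
    }
    if (data.length > 0) events.push({ event, data: data.join("\n") });
  }
  return events;
}
```

Because each backend API call can emit its own event as soon as it finishes, the timeline, details, and sources render stage by stage instead of all at once.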

Challenges we ran into

The Sonar API initially hallucinated: instead of linking to the correct primary source for a specific year, it repeatedly linked to a general Wikipedia article covering the entire topic's timeline for every entry. This was resolved by implementing sequential API calls, which gave Sonar more context through previous responses and allowed for more precise and robust prompting. Another challenge was the slow API response time when requesting everything at once. To solve this, I split the request into sequential API calls. While the total processing time remained the same, delivering information to the user in real time made the experience feel significantly faster and more responsive. Setting up the backend on Azure was also tricky, as this was my first time using the platform.
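The "same total time, but feels faster" point comes down to time-to-first-content. A back-of-the-envelope sketch (durations are made-up example numbers, not measurements from Timeonar):

```typescript
// Perceived latency: with streaming, the first content appears after the
// first stage; with a monolithic request, nothing appears until all stages finish.

function timeToFirstResult(stageDurations: number[], streamed: boolean): number {
  return streamed
    ? stageDurations[0]
    : stageDurations.reduce((a, b) => a + b, 0);
}
```

With illustrative stage times of 4, 6, and 5 seconds, the user waits 15 seconds for anything in the monolithic case but only 4 seconds for the first timeline entries when streaming, even though the full result still takes 15 seconds either way.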

Accomplishments that we're proud of

I’m really proud that I was able to improve the user experience of the website far beyond what I initially intended. I also really like the way I handled the Sonar API in the backend and the prompts I used.

What we learned

I learned how to handle API calls more effectively, including different strategies for interacting with LLMs to work around token limits and improve response speed. I also learned about SSE and WebSockets and the differences between them. Finally, I picked up the basics of Azure.
