Inspiration
Our inspiration for noogie stems from the growing concern about media bias and the polarization of news consumption. We recognized that people often get trapped in information bubbles, only seeing news that confirms their existing beliefs. Noogie was created to address this critical need by providing a news platform that presents stories from an unbiased perspective, allowing users to see multiple sides of every story. By leveraging AI to analyze and summarize news articles from various sources, we aim to give readers a more complete and balanced understanding of current events.
What it does
Noogie is an intelligent news aggregation platform that collects articles from multiple news sources and uses advanced AI to provide unbiased summaries. The platform presents users with comprehensive overviews of news stories that highlight different perspectives and viewpoints, rather than pushing a single narrative. Users can explore various angles of the same story, helping them form more informed opinions based on a fuller picture of the facts and different interpretations of events.
How we built it
We built noogie with a tech stack that includes TypeScript for type-safe development, d3.js for dynamic data visualizations, and Supabase as our database solution. The system uses OpenAI's API for intelligent article summarization and custom classification models for efficient content clustering. Our scalable data pipeline scrapes news sources, processes the content through our AI models, normalizes articles from different sources into a common shape, and stores them in the database.
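To illustrate the normalization step in a pipeline like this, here is a minimal sketch. The field names (`RawArticle`, `NormalizedArticle`, `sourceId`, etc.) are our own illustrative assumptions, not the actual noogie schema:

```typescript
// Sketch: collapsing heterogeneous scraped articles into one shape
// before storage. All field names here are assumptions for illustration.

interface RawArticle {
  headline?: string;      // some sources use "headline"
  title?: string;         // others use "title"
  body: string;
  source: string;
  publishedAt: string;    // ISO timestamp string
}

interface NormalizedArticle {
  title: string;
  content: string;
  sourceId: string;
  publishedAt: Date;
}

function normalizeArticle(raw: RawArticle): NormalizedArticle {
  return {
    // Prefer "headline", fall back to "title", and trim whitespace.
    title: (raw.headline ?? raw.title ?? "").trim(),
    content: raw.body.trim(),
    // Lowercase the source name so it can serve as a stable key.
    sourceId: raw.source.toLowerCase(),
    publishedAt: new Date(raw.publishedAt),
  };
}
```

A shared shape like this is what lets downstream steps (summarization, clustering, visualization) stay agnostic about which outlet an article came from.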
Challenges we ran into
Throughout development, we encountered several significant technical challenges:
- Building reliable API endpoints proved more complex than anticipated, requiring careful rate limiting and error handling.
- Scraping news sources efficiently while respecting their rate limits and avoiding blocks was another major hurdle.
- Developing accurate classification models for clustering similar articles from different sources required extensive experimentation and fine-tuning.
- Summarizing articles efficiently while maintaining accuracy and neutrality was difficult.
- Building effective data visualization components and normalizing data across news sources with different formats and structures added complexity.
- Optimizing our database upload process to handle large volumes of articles required careful architecture planning.
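To give a flavor of the rate-limiting problem, here is a minimal per-host rate limiter sketch. This is a simplified stand-in, not our production code, and the interval parameter is an assumption:

```typescript
// Sketch: enforce a minimum interval between requests to the same host,
// so a scraper does not hammer any single news source.

class RateLimiter {
  private lastRequest = new Map<string, number>();

  constructor(private minIntervalMs: number) {}

  // How many more milliseconds to wait before `host` may be hit again.
  delayFor(host: string, nowMs: number): number {
    const last = this.lastRequest.get(host);
    if (last === undefined) return 0; // never seen: go immediately
    const elapsed = nowMs - last;
    return elapsed >= this.minIntervalMs ? 0 : this.minIntervalMs - elapsed;
  }

  // Record that a request to `host` was just made.
  record(host: string, nowMs: number): void {
    this.lastRequest.set(host, nowMs);
  }
}
```

A scraper would call `delayFor` before each fetch, sleep for the returned duration, then `record` the request; keying by host keeps fast sources fast while slow ones are throttled independently.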
Accomplishments that we're proud of
We're particularly proud of overcoming most of the technical challenges we faced during development. Our team successfully integrated multiple complex technologies, from AI models to data visualization frameworks, and the workload was distributed evenly among team members, fostering collaborative development and shared learning. Along the way we gained hands-on familiarity with d3.js, TypeScript, API development, classification models, and Supabase, and built a user experience that is engaging and intuitive to navigate. Most importantly, we built a consistent, scalable data pipeline that can handle growing volumes of news content while maintaining performance and classification accuracy.
What we learned
This project provided extensive learning opportunities across multiple domains. We developed proficiency in d3.js for creating dynamic and interactive data visualizations, deepened our understanding of TypeScript for building robust applications, and gained hands-on experience with API endpoint creation and management. We learned about machine learning classification models and their practical applications in content clustering. Additionally, we became familiar with Supabase as a modern database solution and gained valuable experience in server configuration and deployment strategies.
What's Next for noogie
noogie.ai.....(YC '28)