What it does
Our website was designed with a primary mission: to raise awareness about climate change through an interactive, user-friendly platform. It includes a variety of tools to inform, engage, and educate visitors about the global climate crisis. Key features include:
- Climate Reports Tab: A dedicated space where users can access up-to-date reports from credible sources like the UN and climate activist organizations, offering detailed insights into the state of the environment, policy actions, and ongoing climate-related challenges.
- AI Chatbot: The chatbot serves as a virtual assistant, answering climate-related queries in real-time. It provides users with information on topics ranging from carbon emissions to renewable energy, and more.
- AI Animated Reporter: The animated reporter delivers dynamic video presentations of climate news and reports, combining artificial intelligence with creative design to present content in a way that feels both informative and entertaining.
How we built it
To build this platform, we combined several technologies that work together to create a seamless user experience:
Backend (Flask and Node.js): For the backend, we used Flask to power our chatbot. The Flask API was connected to the Gemini API to enable AI-powered interactions with users. Additionally, we used Node.js to manage the backend logic for the website's other components.
Web Scraping for Reports (BeautifulSoup and MongoDB): To gather climate reports, we turned to web scraping. Using the BeautifulSoup library, we scraped relevant data from respected websites, including UN climate reports and documents from climate activism groups. This scraped data was then stored in a MongoDB database, allowing us to dynamically serve the most up-to-date reports on our site.
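The scraping step looks roughly like the sketch below. The `extract_reports()` name and the `report-link` CSS class are illustrative only; in practice each source site needed its own selectors.

```python
# Hedged sketch of the report scraper; the selector is an example,
# not the one used for any specific site.
from bs4 import BeautifulSoup

def extract_reports(html: str):
    """Parse a listing page and return report titles and URLs."""
    soup = BeautifulSoup(html, "html.parser")
    reports = []
    for link in soup.select("a.report-link"):
        reports.append({
            "title": link.get_text(strip=True),
            "url": link.get("href"),
        })
    return reports

# Each resulting dict can then be upserted into MongoDB, e.g.:
#   db.reports.update_one({"url": doc["url"]}, {"$set": doc}, upsert=True)
```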
Frontend (React.js): For the user interface, we used React.js to create a clean, interactive, and responsive design. The front end connects smoothly with the backend and provides a user-friendly experience that makes it easy for visitors to navigate through reports, interact with the chatbot, and watch the AI-powered animated reporter.
Challenges we ran into
Connecting the Flask and Node.js Backends: One of the primary challenges we faced was integrating Flask with the Node.js backend. Initially, we struggled with routing and with keeping communication between the two backends reliable. We overcame this by carefully defining API endpoints in Flask and using RESTful calls to exchange data between the two parts of the backend.
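What helped most was pinning down the JSON contract the two backends exchange. The validator below is a hypothetical sketch of such a contract (the field names are examples, not our exact schema), shown from the Flask side:

```python
# Illustrative cross-backend contract check: both backends agree on
# the fields a chat request must carry.
REQUIRED_FIELDS = {"question": str, "session_id": str}

def validate_chat_request(payload: dict):
    """Return (True, "") if payload matches the contract,
    else (False, reason)."""
    for field, expected in REQUIRED_FIELDS.items():
        if field not in payload:
            return False, f"missing field: {field}"
        if not isinstance(payload[field], expected):
            return False, f"wrong type for {field}"
    return True, ""
```

Rejecting malformed requests at the boundary made mismatches between the two services show up immediately instead of as silent failures downstream.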
Database Issues: Another issue was storing and retrieving the scraped data in MongoDB. We hit data-consistency and performance problems as the stored data grew. We resolved these by optimizing our MongoDB queries and structuring documents so they could be easily retrieved and displayed in the frontend.
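The consistency fix amounts to coercing every scraped record into one canonical document shape before it is stored. A sketch, with illustrative field names:

```python
# Normalize a raw scraped record into the canonical document shape
# before inserting it into MongoDB (field names are examples).
from datetime import datetime, timezone

def normalize_report(raw: dict) -> dict:
    return {
        "title": str(raw.get("title", "")).strip(),
        "url": str(raw.get("url", "")).strip(),
        "source": str(raw.get("source", "unknown")).lower(),
        "fetched_at": raw.get("fetched_at")
                      or datetime.now(timezone.utc).isoformat(),
    }

# On the MongoDB side, an index on the fields the frontend sorts by
# keeps retrieval fast, e.g.:
#   db.reports.create_index([("source", 1), ("fetched_at", -1)])
```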
Web Scraping Problems: One of the challenges we faced with web scraping was that different websites have different ways of organizing their content, which made it hard to extract the right information. Some websites also had measures in place to prevent scraping, which made it more difficult to collect data. To solve this, we had to adjust our code to handle these differences and make sure we only grabbed the most relevant and accurate information.
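One way to handle sites that organize their content differently is to map each domain to its own selector and fall back to a generic one. The domains and selectors below are examples, not our real configuration:

```python
# Per-site parser dispatch: pick a CSS selector by domain, with a
# generic fallback (all domains/selectors here are illustrative).
from urllib.parse import urlparse

from bs4 import BeautifulSoup

SELECTORS = {
    "example-un-site.org": "div.report h2 a",
    "example-activist.org": "article a.title",
}
FALLBACK_SELECTOR = "a"

def extract_titles(page_url: str, html: str):
    """Choose the selector for this site, then pull link texts."""
    domain = urlparse(page_url).netloc
    selector = SELECTORS.get(domain, FALLBACK_SELECTOR)
    soup = BeautifulSoup(html, "html.parser")
    return [a.get_text(strip=True) for a in soup.select(selector)]
```

Adding a new source then only means adding one entry to the selector table rather than writing a new scraper.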
Accomplishments that we're proud of
Successfully Integrating Multiple Technologies: We are particularly proud of how we managed to integrate several technologies into a cohesive platform. From combining Flask, Node.js, React.js, and BeautifulSoup to utilizing the Gemini API for the chatbot, the technical integration was a key achievement.
Dynamic and Interactive Features: The AI chatbot and animated reporter are unique features that we’re extremely proud of. By using AI, we were able to provide an interactive and engaging experience that helps users understand the complex issue of climate change in a way that feels both informative and personal.
Real-time Climate Reports: The ability to scrape live data from authoritative climate reports and display it on our website gives users access to up-to-date, relevant information about the climate crisis. This feature ensures that our platform stays current and provides value to our users.
What we learned
The Importance of API Integration: Throughout the project, we learned how important it is to ensure proper integration between different technologies. Whether it was connecting the backend APIs or ensuring that data flows smoothly from the database to the frontend, API integration was critical to the project’s success.
User Experience (UX) Design: Creating an intuitive and engaging user experience was one of our biggest takeaways from this project. We spent a lot of time iterating on the design to make sure that visitors could easily navigate the platform and find the information they needed.
Handling Real-time Data and Scalability: Working with live data, especially when it comes to climate reports, taught us valuable lessons about data management. Ensuring that we could scale the storage and retrieval process efficiently while maintaining the website's performance was a significant learning experience.
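One scalability lesson can be sketched concretely: instead of querying the database on every page view, cache the report list for a short TTL. This is an illustrative in-process cache, not the exact mechanism we shipped:

```python
# Tiny TTL cache: the report-list route can consult this before
# querying MongoDB, refreshing only when the entry expires.
import time

class TTLCache:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (expires_at, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None or entry[0] < time.monotonic():
            return None  # missing or expired
        return entry[1]

    def set(self, key, value):
        self._store[key] = (time.monotonic() + self.ttl, value)
```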