OceanPulse: Navigating Safer Paths for Marine Life
Inspiration
The oceans contain some of the most vast and beautiful ecosystems on planet Earth, yet they remain poorly protected. According to Friend of the Sea, ship strikes kill over 20,000 whales annually, and many of these collisions could be avoided through better route planning. Furthermore, the critically endangered North Atlantic right whale population has dwindled to fewer than 400 individuals according to NOAA Fisheries, largely due to these preventable accidents. Despite advancements in both marine tracking technology and shipping logistics, there has been no integrated system connecting wildlife data with maritime route planning. That disconnect sparked our idea for OceanPulse - a platform that bridges this technological gap to protect marine life while keeping shipping lanes efficient.
What it does
OceanPulse transforms how we visualize the intersection of marine life and maritime commerce through a three-layer approach:
Layer 1: Data Visualization
- Renders real-time marine migration patterns with fluid animations in both 2D maps and immersive 3D globe views
- Maps global shipping lanes with heat maps showing vessel density and traffic patterns
- Colors indicate species populations, endangered status, and collision risk probability
Layer 2: Conflict Analysis
- Identifies high-risk zones where migrations and shipping corridors overlap
- Calculates collision probabilities based on timing, species movement patterns, and vessel traffic density
- Generates risk heatmaps that update dynamically as new data flows into the system
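The Layer 2 collision-probability idea can be sketched as a per-grid-cell score. The `GridCell` fields and the multiplicative form below are illustrative assumptions; the actual formula OceanPulse uses is not published here.

```python
from dataclasses import dataclass

# Hypothetical sketch of a per-cell collision risk score. Field names and
# the multiplicative combination are assumptions for illustration only.

@dataclass
class GridCell:
    migration_density: float   # normalized 0-1, animals per unit area
    vessel_density: float      # normalized 0-1, vessels per unit area
    seasonal_overlap: float    # 0-1, fraction of the year both are present

def collision_risk(cell: GridCell) -> float:
    """Combine densities into a 0-1 risk score for one grid cell."""
    # Risk requires both wildlife and traffic to be present, so the product
    # dominates; seasonal overlap scales how often they actually co-occur.
    return cell.migration_density * cell.vessel_density * cell.seasonal_overlap

cell = GridCell(migration_density=0.8, vessel_density=0.9, seasonal_overlap=0.5)
print(round(collision_risk(cell), 2))  # 0.36
```

A multiplicative score has the useful property that risk collapses to zero whenever either wildlife or traffic is absent, which matches the intuition that conflict zones require overlap.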
Layer 3: Solution Generation
- Recommends optimized shipping routes that reduce wildlife encounters while minimizing added travel time and fuel costs
- Provides quantifiable impact metrics showing potential lives saved vs. economic considerations
- Offers seasonal planning tools based on historical migration patterns
Bringing all this together is our AI assistant powered by Google Gemini, which answers nuanced questions about marine conservation, shipping practices, and optimized routing all via natural language - making complex data accessible to everyone from marine biologists to shipping executives.
How we built it
We engineered OceanPulse with a technology stack optimized for processing and visualizing complex geospatial data in real-time:
Core Architecture
- Developed a modular microservice architecture using Node.js that separates data ingestion, processing, and visualization
- Implemented a WebSocket pipeline for real-time data streaming with fallback mechanisms for unstable connections
- Created a custom ETL (Extract, Transform, Load) pipeline that normalizes data from various marine conservation APIs
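The "transform" step of an ETL pipeline like this can be sketched as a mapping from raw API records onto one unified schema. The `vessel_feed` field names below are invented for illustration; the OBIS field names follow the Darwin Core terms OBIS publishes, but the unified schema itself is an assumption.

```python
# Illustrative sketch of ETL normalization: map records from different
# sources onto a single schema. The "vessel_feed" field names are
# hypothetical; the unified schema is an assumption for this example.

def normalize_record(record: dict, source: str) -> dict:
    """Map a raw API record onto a unified visualization schema."""
    if source == "obis":
        # OBIS occurrence records use Darwin Core field names.
        return {
            "species": record["scientificName"],
            "lat": float(record["decimalLatitude"]),
            "lon": float(record["decimalLongitude"]),
            "observed_at": record["eventDate"],
        }
    if source == "vessel_feed":
        return {
            "species": None,  # vessel rows carry no species information
            "lat": float(record["LAT"]),
            "lon": float(record["LON"]),
            "observed_at": record["TIMESTAMP"],
        }
    raise ValueError(f"unknown source: {source}")

row = normalize_record(
    {"scientificName": "Eubalaena glacialis",
     "decimalLatitude": "41.5", "decimalLongitude": "-69.9",
     "eventDate": "2024-03-01"},
    source="obis",
)
print(row["species"], row["lat"])  # Eubalaena glacialis 41.5
```

Normalizing coordinates to floats at ingestion time is what lets every downstream layer (heatmaps, risk scoring, routing) assume one coordinate convention.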
Frontend Implementation
- Built a responsive UI with React and TailwindCSS that adapts to different screen sizes and user roles
- Implemented Leaflet.js for lightweight 2D mapping with custom layer controls
- Integrated Cesium.js for the immersive 3D globe view with WebGL acceleration
Data Integration Pipeline
- Developed custom connectors to the OBIS (Ocean Biodiversity Information System) API for real-time marine species tracking
- Created a unified data format that harmonizes disparate marine databases into a consistent visualization schema
- Implemented a caching layer that reduces API calls while maintaining data freshness
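The caching-layer idea, reducing API calls while keeping data fresh, is commonly implemented with a time-to-live (TTL) cache. This is a minimal sketch under that assumption; the TTL value and fetch function are placeholders, not OceanPulse's actual code.

```python
import time

# Minimal TTL cache sketch: responses are reused until a time-to-live
# expires, cutting repeat API calls while keeping data reasonably fresh.
# The TTL and fetch function are illustrative placeholders.

class TTLCache:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (expiry_timestamp, value)

    def get_or_fetch(self, key, fetch):
        entry = self._store.get(key)
        if entry and entry[0] > time.monotonic():
            return entry[1]                      # fresh hit: skip the API call
        value = fetch()                          # miss or stale: refetch
        self._store[key] = (time.monotonic() + self.ttl, value)
        return value

calls = 0
def fake_api():
    global calls
    calls += 1
    return {"sightings": 12}

cache = TTLCache(ttl_seconds=60)
cache.get_or_fetch("right-whale", fake_api)
cache.get_or_fetch("right-whale", fake_api)   # second call served from cache
print(calls)  # 1
```

Choosing the TTL is the freshness trade-off: migration data that updates hourly can tolerate a much longer TTL than live vessel positions.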
AI and Analytics
- Built a specialized context manager for the Google Gemini API that understands maritime terminology and ecological concepts
- Developed a custom routing algorithm that balances shortest path requirements with collision avoidance
- Created a predictive model that estimates migration pattern shifts based on historical data and current ocean conditions
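One standard way to balance shortest-path requirements with collision avoidance is a Dijkstra search whose edge cost blends distance with a risk penalty. The graph, risk scores, and the `alpha` weighting below are assumptions for illustration, not the routing algorithm's actual parameters.

```python
import heapq

# Sketch of risk-weighted shortest-path routing: a Dijkstra search whose
# edge cost is distance plus a penalty for entering risky cells. The graph,
# risk values, and alpha weight are illustrative assumptions.

def safest_short_path(graph, risk, start, goal, alpha=5.0):
    """graph: node -> [(neighbor, distance)]; risk: node -> 0-1 risk score."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        cost, node = heapq.heappop(heap)
        if node == goal:
            break
        if cost > dist.get(node, float("inf")):
            continue
        for nbr, d in graph.get(node, []):
            # Blend travel distance with a risk penalty on the next cell;
            # larger alpha trades extra distance for fewer wildlife encounters.
            new_cost = cost + d + alpha * risk.get(nbr, 0.0)
            if new_cost < dist.get(nbr, float("inf")):
                dist[nbr] = new_cost
                prev[nbr] = node
                heapq.heappush(heap, (new_cost, nbr))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]

graph = {"A": [("B", 1), ("C", 2)], "B": [("D", 1)], "C": [("D", 1)], "D": []}
risk = {"B": 0.9, "C": 0.0, "D": 0.0}   # B crosses a migration corridor
print(safest_short_path(graph, risk, "A", "D"))  # ['A', 'C', 'D']
```

Here the shorter route through B is rejected because its risk penalty outweighs the extra mile via C, which is exactly the distance-vs-safety trade-off the write-up describes.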
Documentation and Accessibility
- Generated comprehensive API documentation using Mintlify
- Used W3C Markup Validators for accessibility compliance
Challenges we ran into
Building OceanPulse pushed us to solve several complex technical problems:

Data Integration Complexities
We discovered that marine tracking data comes in dozens of different formats with inconsistent coordinate systems. We evaluated multiple APIs and conversion methods, eventually normalizing all of them into our unified data model.

Real-time Performance Bottlenecks
When we first deployed our visualization, rendering thousands of vessels and migration paths simultaneously brought browsers to a crawl. We implemented a dynamic level-of-detail system that reduces complexity at different zoom levels, using quadtree data structures to efficiently manage which elements to render.

AI Context Management
The Google Gemini API initially struggled with complex maritime queries because it lacked domain-specific context. We developed a specialized prompt engineering system that dynamically adds relevant marine conservation facts based on query topics, dramatically improving response accuracy.

Edge Case Route Handling
Our optimization algorithm initially suggested routes that, while safer for marine life, would occasionally create navigational hazards in narrow waterways. We integrated nautical chart data to ensure all suggested routes maintain proper safety distances from shorelines and underwater obstacles.

Scalability Under Peak Loads
During stress testing, we found performance degradation when simulating high user counts. We refactored our backend to use worker threads for computationally intensive tasks and implemented Redis-based caching to maintain responsiveness even with hundreds of simultaneous users.
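The level-of-detail idea from the performance fix can be sketched in a few lines: at coarse zoom, nearby points collapse into one marker per spatial cell, so the renderer only draws what a pixel can actually show. This flat-grid version is a simplification of the quadtree approach the write-up mentions; the cell-sizing rule is an assumption.

```python
# Simplified level-of-detail sketch: bucket points into a grid whose cells
# shrink as zoom increases, keeping at most one representative per cell.
# A flat grid stands in here for the quadtree described in the write-up.

def thin_points(points, zoom):
    """points: list of (lat, lon). Returns at most one point per grid cell."""
    cell_size = 360.0 / (2 ** zoom)       # cells shrink as the user zooms in
    seen = {}
    for lat, lon in points:
        key = (int(lat // cell_size), int(lon // cell_size))
        seen.setdefault(key, (lat, lon))  # keep the first point per cell
    return list(seen.values())

pts = [(10.1, 20.1), (10.2, 20.2), (50.0, -60.0)]
print(len(thin_points(pts, zoom=2)))   # 2 - coarse zoom: nearby points merge
print(len(thin_points(pts, zoom=12)))  # 3 - fine zoom: all points survive
```

A quadtree improves on this by reusing the same spatial index across all zoom levels instead of re-bucketing, which matters when the dataset holds thousands of moving vessels.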
What we learned
This project was a deep dive (get it?) into both technical skills and environmental understanding. On the technical side, we worked with advanced geospatial visualization techniques, learning how to optimize WebGL for thousands of animated elements while maintaining performance, and we discovered the power of spatial indexing for efficiently querying large geospatial datasets. Working with the Google Gemini API taught us effective prompt engineering for domain-specific applications. The environmental side was equally valuable: it was rewarding to know we were building something impactful, relevant, and scalable across the globe.
What's next for OceanPulse
We're just getting started with OceanPulse, and our roadmap includes several exciting expansions:
Technical Enhancements
- Developing a mobile companion app with offline capabilities for field researchers
- Implementing machine learning models to predict migration pattern shifts due to climate change
Feature Expansion
- Building a real-time alerting system that notifies vessels entering high-risk areas
- Adding support for smaller marine species like sea turtles and manatees
- Developing policy simulation tools to model the impact of potential shipping regulations
Partnership Development
- Collaborating with major shipping companies to implement our route recommendations
- Working with marine conservation organizations to incorporate their specialized datasets
- Partnering with maritime navigation software providers to integrate our collision risk overlays
Research Applications
- Providing anonymized data access for marine researchers studying population dynamics
- Creating specialized visualization tools for ecological research
- Developing educational modules for oceanography students
Overall, we know that with backing, effort, and time, OceanPulse can become the global standard for marine conservation planning - and, more importantly, show that protecting marine life and maintaining efficient shipping aren't competing goals, but compatible ones.
Built With
- azure
- gemini
- next.js
- python
- typescript
