Inspiration
The inspiration comes from xnode's architecture and its aim of integrating all types of data formats.
What it does
Our application allows users to fetch real-time prices from multiple decentralized and/or centralized exchanges, ensuring they get the best price at the time they swap tokens.
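The core of the best-price guarantee is a simple comparison across venues. Here is a minimal sketch of that selection step; the exchange names and quote values are illustrative, not real API data:

```python
# Hypothetical sketch: pick the best quote across exchanges.
# When buying, the lowest price wins; when selling, the highest.

def best_quote(quotes: dict[str, float], side: str = "buy") -> tuple[str, float]:
    """Return (exchange, price): lowest price to buy, highest to sell."""
    if side == "buy":
        return min(quotes.items(), key=lambda kv: kv[1])
    return max(quotes.items(), key=lambda kv: kv[1])

# Illustrative quotes for one token pair.
quotes = {"uniswap": 1999.50, "sushiswap": 2001.25, "binance": 1998.75}
print(best_quote(quotes, "buy"))   # → ('binance', 1998.75)
print(best_quote(quotes, "sell"))  # → ('sushiswap', 2001.25)
```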
How we built it
When a user selects a specific API for analysis, their real-time data flows through a series of interconnected systems. Initially, Apache Kafka acts as the primary data ingestion platform, where the user's data is stored in a dedicated topic. Simultaneously, for resilience and backup, the data is duplicated and safeguarded in AWS S3.

To ensure consistency and ease of analysis, the stored information is then processed into a standardized format. The processed data lands in a MongoDB database, leveraging its NoSQL capabilities for flexible and scalable storage. Meanwhile, the metadata associated with each data topic, containing essential details for reference, is managed through a Kademlia node. Kademlia, functioning as a decentralized distributed hash table (DHT), ensures the accessibility of metadata by providing a fault-tolerant and decentralized approach to data storage and retrieval.

Ultimately, the processed data is streamed to a WebSocket server, offering real-time and efficient access for analytical purposes.
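The standardization step can be sketched as a small mapping layer: each exchange's raw payload uses different field names, and the processor maps them onto one standard record before it is written to MongoDB. The field names and mappings below are assumptions for illustration, not the project's actual schema:

```python
# Sketch of the normalization step: raw exchange payloads (whose field
# names vary per venue -- these mappings are made up for illustration)
# are converted into one standard record shape.

def normalize(raw: dict, exchange: str) -> dict:
    """Map a raw exchange payload onto the standard record format."""
    # Per-exchange field mappings (illustrative, not real API schemas).
    mappings = {
        "uniswap": {"pair": "token_pair", "price": "spot_price", "ts": "block_time"},
        "binance": {"pair": "symbol", "price": "lastPrice", "ts": "closeTime"},
    }
    m = mappings[exchange]
    return {
        "exchange": exchange,
        "pair": raw[m["pair"]],
        "price": float(raw[m["price"]]),
        "ts": int(raw[m["ts"]]),
    }

record = normalize(
    {"symbol": "ETHUSDT", "lastPrice": "1999.5", "closeTime": 1700000000},
    "binance",
)
print(record)  # → {'exchange': 'binance', 'pair': 'ETHUSDT', 'price': 1999.5, 'ts': 1700000000}
```

Because every record leaving this step has the same shape, the MongoDB consumer and the WebSocket server never need to know which exchange the data came from.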
All the services, such as the frontend and backend, are deployed independently using Docker containers.
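An independent-container deployment like this is typically wired together with a compose file; the sketch below is an assumption about the layout (service names and images are illustrative, not the project's actual configuration):

```yaml
# Illustrative docker-compose sketch -- services deploy and restart
# independently; names and images are assumptions.
services:
  frontend:
    build: ./frontend
    ports: ["3000:3000"]
  backend:
    build: ./backend
    depends_on: [kafka, mongo]
  kafka:
    image: bitnami/kafka:latest
  mongo:
    image: mongo:7
```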
Challenges we ran into
Figuring out how to store large volumes of data quickly, and connecting databases to Kafka in real time.
Accomplishments that we're proud of
Understanding the Kademlia network and how it makes this solution highly scalable.
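The scalability comes from Kademlia's XOR distance metric: node and key IDs live in the same space, and each lookup hop moves to nodes whose IDs are closer (by XOR) to the target key, so lookups take logarithmically many hops. A minimal sketch of that metric and a routing-table-style lookup, with made-up 4-bit IDs:

```python
# Minimal sketch of Kademlia's XOR distance metric.
# Node IDs here are tiny 4-bit integers for readability; real Kademlia
# uses 160-bit IDs, but the metric is the same.

def xor_distance(a: int, b: int) -> int:
    """Kademlia distance between two node/key IDs."""
    return a ^ b

def closest_nodes(target: int, node_ids: list[int], k: int = 3) -> list[int]:
    """Pick the k known nodes closest to the target key (a lookup step)."""
    return sorted(node_ids, key=lambda n: xor_distance(n, target))[:k]

nodes = [0b0001, 0b0100, 0b0101, 0b1100]
print(closest_nodes(0b0111, nodes, k=2))  # → [5, 4]
```

Because closeness is deterministic and symmetric, any node can compute where a piece of metadata should live without a central index, which is what makes the metadata layer fault-tolerant.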
What we learned
How to work with decentralized APIs, understanding new distributed hash table systems, and deploying services.
What's next for Data cloud development
Build out the web UI and complete the remaining parts of the plan described above.
