Three computer science students with no prior cloud development experience came together with the aim of building easily deployable cloud infrastructure, and of bringing transparency to crypto APIs in the process.

As a team we planned to build a working prototype of a data cloud on our local network, featuring data connectivity, custom APIs, and analytics. With no prior cloud experience, we were thrilled to learn as much as possible along the way. We set out to make cloud infrastructure deployment easier by reducing costs and accelerating the speed of production. For this challenge we had to create a decentralized data cloud system mirroring the key features of Xnode. Ideally, this solution enables rapid development and deployment of data clouds, emphasizing ease of use, affordability, and scalability.

Because this was a project where we learned while developing, we hit the ground running on the problem space itself, developing the solution incrementally and ideating throughout the process. Early on we decided to work with Kafka, and this formed the core of our development and ideation: how to build a design and solution on top of that base technology.

The proposed solution uses a node-based cloud cluster as the infrastructure for the ETL of multiple real-time data streams. To create the cloud of nodes, we used Apache Kafka to streamline the process and reduce the amount of manual coding. Through various scripts and automations, our team was able to leverage Kafka's existing features, such as load balancing, partitioning, and replication, to create a seamless user experience. Our solution lets users easily generate a mini data cloud capable of ingesting and storing large amounts of data in a fault-tolerant, robust manner, ensuring that data is not lost should small faults occur in the system. This data cloud can be integrated with various APIs and serve niche entities that need a low-cost, accessible cloud infrastructure solution, making it adaptable to a wide range of use cases. One such use is as a data ingestion and analytics platform for cryptocurrency exchanges.
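To illustrate the partitioning behaviour we relied on: Kafka routes records with the same key to the same partition, which is what preserves per-key ordering (e.g. per trading pair) as data is spread across the node cluster. The sketch below mimics that idea with a simple stable hash; the topic layout, symbols, and hash function are illustrative assumptions, not Kafka's actual murmur2 partitioner.

```python
# Sketch of key-based partitioning, the mechanism Kafka uses to keep all
# records for one key (e.g. a trading pair) on a single partition.
# The hash here is a simplified stand-in for Kafka's murmur2.

def assign_partition(key: str, num_partitions: int) -> int:
    """Deterministically map a record key to a partition index."""
    h = 0
    for b in key.encode("utf-8"):
        h = (h * 31 + b) & 0x7FFFFFFF  # simple stable string hash
    return h % num_partitions

# Hypothetical stream of crypto trade events, keyed by trading pair.
trades = ["BTC/USD", "ETH/USD", "BTC/USD", "DOGE/USD"]
partitions = [assign_partition(symbol, num_partitions=6) for symbol in trades]

# Records sharing a key always land on the same partition,
# so per-symbol ordering is preserved across the cluster.
assert partitions[0] == partitions[2]
```

In a real deployment, a Kafka producer does this routing automatically whenever a message key is supplied, and the replication factor set on the topic provides the fault tolerance described above.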

Having no experience with cloud development, we took this as an opportunity to learn a new technical skill. Finding what to learn and which resources to use was part of the challenge. Once we decided on the tech stack, we were put to the test to upskill quickly and produce a working MVP. Moreover, integrating the Kafka cloud with Grafana proved difficult due to local environment issues and the time required to set up the JMX Exporters. Additionally, due to time constraints and our limited understanding of containerization tools, we were unable to containerize our product.
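For context on the JMX Exporter work mentioned above: the exporter attaches to a Kafka broker as a Java agent and translates JMX MBean names into Prometheus-style metrics via pattern rules, which Grafana can then chart. The fragment below is a minimal, illustrative sketch of that translation step (the MBean name and rule are assumptions modelled on Kafka's `BrokerTopicMetrics`, not our exact configuration).

```python
import re

# Illustrative JMX Exporter-style rule: rewrite a Kafka JMX MBean name
# into a flat Prometheus metric name. Real exporters do this from a YAML
# rules file; this sketch shows the same pattern-matching idea in code.
RULE = re.compile(
    r"kafka\.server<type=BrokerTopicMetrics, name=(\w+)><>Count"
)

def to_prometheus_name(mbean: str) -> str:
    """Map a JMX MBean counter to a Prometheus-style metric name."""
    m = RULE.match(mbean)
    if not m:
        raise ValueError(f"no rule matches: {mbean}")
    return f"kafka_server_{m.group(1).lower()}_total"

name = to_prometheus_name(
    "kafka.server<type=BrokerTopicMetrics, name=MessagesInPerSec><>Count"
)
```

Getting these rules right for every metric we wanted on a dashboard, per broker, was where most of the setup time went.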
