Inspiration
The project draws inspiration from the Hash Calendar, Keyless Signature Infrastructure, and Apache Kafka.
What they do
A hash calendar is a simple concept: an append-only list of hash values computed over data at regular intervals. Each entry in the list corresponds to the hash of the entire dataset at a specific point in time. By comparing hash values between entries, you can quickly detect any changes or tampering in the data.
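The idea above can be sketched in a few lines of Python. This is a minimal illustration, not a production hash calendar: each entry chains the previous entry's hash with the current dataset snapshot, so altering any past snapshot changes every later hash. The class and method names here are invented for the example.

```python
import hashlib


def entry_hash(prev_hash: str, snapshot: bytes) -> str:
    # Each calendar entry commits to both the previous entry and the
    # current dataset snapshot, forming a tamper-evident chain.
    return hashlib.sha256(prev_hash.encode() + snapshot).hexdigest()


class HashCalendar:
    def __init__(self):
        self.entries = []  # one hash per time period, append-only

    def append(self, snapshot: bytes) -> str:
        prev = self.entries[-1] if self.entries else ""
        h = entry_hash(prev, snapshot)
        self.entries.append(h)
        return h

    def verify(self, snapshots) -> bool:
        # Recompute the chain from the claimed snapshots and compare
        # entry by entry; any modified snapshot breaks the chain.
        prev = ""
        for h, snap in zip(self.entries, snapshots):
            expected = entry_hash(prev, snap)
            if expected != h:
                return False
            prev = expected
        return True
```

Comparing recomputed entries against the stored calendar is what makes tampering in historical data immediately detectable.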
KSI removes the need for private keys in digital signature verification entirely. It stands out by associating hash values with precise timestamps and real-world events, and it publishes them through a distributed ledger, reducing reliance on centralized authorities. This approach lets users independently verify data integrity without depending on a single point of control. A KSI-backed public global blockchain offers compelling evidence of time, integrity, and identity.
Apache Kafka is a distributed streaming platform for real-time processing of data streams. It acts as a highly scalable, fault-tolerant message broker, enabling efficient communication and integration of data between different systems and applications. Kafka also improves visibility into data origins and facilitates API bundling, letting services combine data from various endpoints.
How we should build it
Keyless signature technology provides mass-scale, non-expiring data validation while eliminating the need for secrets or other forms of trust. It thereby avoids complex certificate-based solutions and their certificate-management issues, including expiration and revocation. Any client using the keyless signature service can request a signature for any data item it has access to, be it a log file, XML file, office document, database record, SWIFT transaction, FpML message, eDiscovery product, and so on. In return, the client receives a keyless signature which can be stored alongside the signed data, within the signed data, or in a separate repository for backup and archival purposes.
Our solution is to build a public blockchain with KSI and integrate it with Apache Kafka producer and consumer applications. The KSI distributed ledger signs the data before it is produced to Kafka, and the signature is verified upon consumption. This design helps organisations prove the time, authenticity, and origin (machine, organization, individual) of the input data.
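The sign-before-produce / verify-on-consume flow could look like the sketch below. Note the hedge: this does not use the real KSI gateway API; an HMAC over the document hash stands in for a keyless signature purely for illustration, and the `sign_envelope` / `verify_envelope` names are invented for the example. The comments show where the functions would plug into kafka-python as serializers.

```python
import hashlib
import hmac
import json
import time

# Stand-in for a KSI gateway: a real deployment would request a keyless
# signature from the KSI infrastructure. Here a demo HMAC key plays that
# role so the flow can be shown end to end.
_DEMO_KEY = b"demo-only"


def sign_envelope(payload: bytes) -> bytes:
    # Producer side: hash the record, timestamp it, attach a signature,
    # and wrap everything in a JSON envelope that travels through Kafka.
    doc_hash = hashlib.sha256(payload).hexdigest()
    envelope = {
        "payload": payload.decode(),
        "hash": doc_hash,
        "signed_at": int(time.time()),
        "signature": hmac.new(_DEMO_KEY, doc_hash.encode(), "sha256").hexdigest(),
    }
    return json.dumps(envelope).encode()


def verify_envelope(raw: bytes) -> bytes:
    # Consumer side: recompute the hash and check the signature before
    # handing the payload to the application.
    env = json.loads(raw)
    doc_hash = hashlib.sha256(env["payload"].encode()).hexdigest()
    if doc_hash != env["hash"]:
        raise ValueError("payload hash mismatch")
    expected = hmac.new(_DEMO_KEY, doc_hash.encode(), "sha256").hexdigest()
    if not hmac.compare_digest(expected, env["signature"]):
        raise ValueError("signature mismatch")
    return env["payload"].encode()

# With kafka-python these would plug in as serializers, e.g.
#   KafkaProducer(value_serializer=sign_envelope)
#   KafkaConsumer(..., value_deserializer=verify_envelope)
```

Keeping the signature in the message envelope means any consumer can verify integrity and origin without a side channel, which is the core of the design.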
Due to time constraints, this is only a high-level design concept. The detailed design and implementation will vary across use cases, given Apache Kafka's diverse applications in different data streaming scenarios. The attached picture demonstrates how the idea might work in one use case.
Challenges we ran into
I joined a challenging team marked by poor communication and a reluctance among members to express opinions openly; there was no consensus for decision-making or action. Frustrated by the lack of communication, planning, and progress visibility, I opted to leave and submit my own solution.
Accomplishments that we're proud of
This is the first time I've participated in a hackathon that lasted for such an extended period, and I'm pleased that I followed through with most of the program.
What we learned
I've learned a lot about web3, blockchain, and data streaming in a brief time throughout the hackathon program. I have also learned from the mistakes I made as part of a team working with other members, and I can certainly do better in my next hackathon or startup.
What's next for NoëlTree
Due to the time limitation, I couldn't build a prototype, but I am planning to work with some talented people to create an MVP for this concept if there is validated commercial value.
Built With
- apache
- blockchain
- kafka
- ksi
- zookeeper