dMegle (decentralized Megle) - Real-time live video streaming powered by Arkiv blockchain subscriptions. Stream live camera feeds directly from browsers without centralized servers.
This project demonstrates real-time video streaming using Arkiv subscriptions. Instead of traditional video streaming protocols, video is chunked into small entities stored on Arkiv, and viewers subscribe to receive chunks in real-time as they're published.
- Video Chunking: Video files are split into small chunks (~40KB each, base64 encoded)
- Publishing: Each chunk is published as an Arkiv entity with metadata (streamId, chunkIndex, timestamp)
- Real-time Subscriptions: Viewers subscribe to Arkiv events and receive chunks as they're created
- Chunk Assembly: Received chunks are buffered and assembled for playback
- TTL Management: Chunks automatically expire after a set time to manage storage costs
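The chunking step can be sketched as a small helper. This is illustrative only: the `chunkVideo` name and `VideoChunk` shape are hypothetical, not the project's actual API.

```typescript
// Sketch: split a video buffer into ~40KB pieces and base64-encode each,
// attaching the metadata fields described above (streamId, chunkIndex, timestamp).
const CHUNK_SIZE = 40 * 1024; // ~40KB raw per chunk

interface VideoChunk {
  streamId: string;
  chunkIndex: number; // sequential, used by viewers to reorder
  timestamp: number;
  data: string;       // base64-encoded slice of the video
}

function chunkVideo(video: Buffer, streamId: string): VideoChunk[] {
  const chunks: VideoChunk[] = [];
  for (let offset = 0; offset < video.length; offset += CHUNK_SIZE) {
    chunks.push({
      streamId,
      chunkIndex: chunks.length,
      timestamp: Date.now(),
      data: video.subarray(offset, offset + CHUNK_SIZE).toString('base64'),
    });
  }
  return chunks;
}
```

Each resulting `VideoChunk` would then be published as one Arkiv entity.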
- Real-time Streaming: Video chunks streamed via Arkiv subscriptions
- CRUD Operations: Create streams, publish chunks, query existing chunks
- TTL/Expiration: Automatic expiration of video chunks to manage costs
- Live Subscriptions: Real-time chunk delivery using Arkiv's subscription system
- Query System: Fetch chunks by stream ID, index range, or timestamp
- Web UI: Interactive interface for creating streams, uploading videos, and watching
- `video-streamer.ts`: Arkiv SDK wrapper for video streaming
  - Stream creation and metadata management
  - Chunk publishing with size limits
  - Real-time subscription handling
  - Chunk querying and retrieval
- `server.ts`: Express server with WebSocket bridge
  - REST API for stream management
  - WebSocket server for browser clients
  - Arkiv subscription → WebSocket bridge
  - File upload handling
- `public/`: Web UI
  - Stream creation interface
  - Video upload and streaming
  - Real-time chunk reception display
  - Stream watching interface
```typescript
// Subscribe to video chunks as they're created
await streamer.subscribeToStream(streamId, {
  onChunkReceived: (chunk) => {
    // Receive chunks in real-time
    console.log(`Chunk ${chunk.chunkIndex} received`);
  },
});
```

```typescript
// Create stream metadata
const streamId = await streamer.createStream({
  title: 'My Video',
  expiresIn: 3600,
});

// Publish video chunk
await streamer.publishChunk(streamId, chunkIndex, chunkData);
```

```typescript
// Chunks expire after 1 hour
await walletClient.createEntity({
  payload: chunkData,
  expiresIn: 3600, // seconds
});
```

```typescript
// Query chunks for a stream
const chunks = await streamer.getStreamChunks(streamId, startIndex, endIndex);
```

- Node.js 18+ (LTS recommended) or Bun 1.x
- TypeScript 5+
- Ethereum wallet with test ETH for Arkiv Testnet
- RPC endpoint access (Mendoza testnet)
- Clone the repository:

```bash
git clone <your-repo-url>
cd arkhiv-test
```

- Install dependencies:

```bash
npm install
```

- Set up environment variables:

```bash
cp .env.example .env
```

Edit `.env`:

```
PRIVATE_KEY=0x... # Your testnet private key
RPC_URL=https://mendoza.hoodi.arkiv.network/rpc
WS_URL=wss://mendoza.hoodi.arkiv.network/rpc/ws
PORT=3000
```

Run the streaming demo:

```bash
npm run stream
```

This will:
- Create a test video stream
- Set up real-time subscription
- Publish video chunks sequentially
- Receive chunks via subscription in real-time
Start the server:

```bash
npm run server
```

Open http://localhost:3000:
- Create Stream: Enter title/description and create a new stream
- Upload Video: Select stream and upload a video file (small files recommended for demo)
- Watch Stream: Enter stream ID to watch and receive chunks in real-time
- `GET /api/account` - Get connected account address
- `POST /api/streams` - Create a new video stream
- `GET /api/streams/:streamId` - Get stream metadata
- `POST /api/streams/:streamId/upload` - Upload and stream a video file
- `GET /api/streams/:streamId/chunks` - Get chunks for a stream
- `subscribe` - Subscribe to a stream's chunks
- `unsubscribe` - Unsubscribe from a stream
- `chunk` - Receive chunk data
- `chunk:received` - Chunk notification
- `stream:complete` - Stream finished
- `error` - Error occurred
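A browser client can route these message types through a small handler. This is a sketch, not the project's actual client code: the payload shapes (`streamId`, `chunkIndex`, `data` fields) are assumptions inferred from the event names above.

```typescript
// Hypothetical message shapes for the WebSocket events listed above
type ServerMessage =
  | { type: 'chunk'; streamId: string; chunkIndex: number; data: string }
  | { type: 'chunk:received'; streamId: string; chunkIndex: number }
  | { type: 'stream:complete'; streamId: string }
  | { type: 'error'; message: string };

// Pure routing function, so the logic is testable without a live socket
function handleMessage(msg: ServerMessage, buffer: Map<number, string>): string {
  switch (msg.type) {
    case 'chunk':
      buffer.set(msg.chunkIndex, msg.data); // stash for ordered assembly
      return 'buffered';
    case 'chunk:received':
      return 'notified';
    case 'stream:complete':
      return 'complete'; // a real client would send `unsubscribe` here
    case 'error':
      return 'failed';
  }
}
```

In the browser, `ws.onmessage = (ev) => handleMessage(JSON.parse(ev.data), buffer)` would route every incoming frame through this handler after sending a `subscribe` message on open.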
- Video file is chunked into ~40KB pieces
- Each chunk is published as an Arkiv entity with:
  - `type: 'video-chunk'`
  - `streamId`: identifies the stream
  - `chunkIndex`: sequential chunk number
  - Base64-encoded video data in the payload
- Viewer subscribes to Arkiv entity creation events
- Filter events for `type='video-chunk'` and matching `streamId`
- Receive chunks in real-time as they're published
- Buffer chunks and maintain order using `chunkIndex`
- Assemble chunks for playback
Using Arkiv subscriptions for video streaming means:
- No traditional streaming server needed
- Decentralized chunk delivery via blockchain
- Real-time updates as chunks are published
- Automatic expiration for cost management
- Queryable chunk history
- Raw chunk: ~40KB (before base64 encoding)
- Base64 encoded: ~50KB (fits in Arkiv entity)
- Why small: Gas costs scale with entity size
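The encoded-size figure follows from base64's 4-characters-per-3-bytes expansion; a quick check of the ~40KB raw size above:

```typescript
// Base64 encodes every 3 raw bytes as 4 ASCII characters, so a chunk
// grows by roughly a third when encoded.
const rawBytes = 40 * 1024;                       // 40960 bytes raw
const encodedBytes = Math.ceil(rawBytes / 3) * 4; // 54616 bytes encoded
console.log(`${(encodedBytes / 1024).toFixed(1)} KB encoded`); // ~53.3 KB
```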
- Chunks include a `chunkIndex` attribute for ordering
- Subscriptions may receive chunks out of order
- Client buffers and sorts by index before playback
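The buffer-and-sort step can be sketched as follows; `ChunkBuffer` is a hypothetical helper, not the project's actual class.

```typescript
// Sketch: buffer out-of-order chunks and release them in chunkIndex order.
class ChunkBuffer {
  private chunks = new Map<number, string>();
  private nextIndex = 0; // playback position

  add(chunkIndex: number, data: string): void {
    this.chunks.set(chunkIndex, data);
  }

  // Drain every chunk that is contiguous from the playback position;
  // anything after a gap stays buffered until the missing chunk arrives.
  drain(): string[] {
    const ready: string[] = [];
    while (this.chunks.has(this.nextIndex)) {
      ready.push(this.chunks.get(this.nextIndex)!);
      this.chunks.delete(this.nextIndex);
      this.nextIndex++;
    }
    return ready;
  }
}
```

Calling `drain()` after each received chunk yields data in playback order even when the subscription delivers chunks out of order.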
- Each chunk expires after 1 hour (configurable)
- Stream metadata expires after stream duration
- Old chunks automatically cleaned up
- Reduces long-term storage costs
- Video Format: Demo uses raw binary chunks (not a proper video codec)
- Chunk Size: Limited to ~50KB per chunk (Arkiv entity size)
- Latency: Blockchain writes have inherent latency
- Cost: Each chunk requires gas fees
For production use, consider:
- Using a proper video codec (H.264, VP9) with chunking
- Hybrid approach: Store large video files off-chain (IPFS), use Arkiv for metadata/chunks
- CDN integration for final delivery
- Chunk size optimization based on gas costs
This project demonstrates:
- Real-time Data Streams: Using Arkiv subscriptions for live data delivery
- Chunked Storage: Breaking large data into manageable entities
- TTL Management: Cost-efficient storage with automatic expiration
- Query Patterns: Filtering and retrieving chunks by stream/index
- Subscription Architecture: Building real-time apps on blockchain
- ✅ Public live demo: Web UI for video streaming
- ✅ Public repo: This repository with comprehensive documentation
- ✅ Demo video: Recording showing:
- Creating a stream
- Uploading video (chunks published)
- Real-time chunk reception via subscriptions
- Multiple viewers receiving same stream
- Pushes Boundaries: Uses Arkiv subscriptions for video streaming (novel use case)
- Real-time Focus: Heavily leverages subscription feature for live delivery
- Multiple Features: Uses CRUD, TTL, subscriptions, and queries together
- Practical Demo: Working web UI that demonstrates real-time streaming
- Clear Architecture: Well-documented streaming protocol
MIT
Built for the Arkiv Hackathon at DevConnect 2025. Powered by Arkiv Network.
Note: This is a proof-of-concept demonstrating real-time video streaming via Arkiv subscriptions. For production video streaming, consider hybrid architectures combining Arkiv with traditional CDN/video infrastructure.