Amberdata Engineering Blog https://engineering.amberdata.io The Amberdata Engineering Blog discusses the technical challenges faced in developing and scaling a digital asset data infrastructure. Our focus is on delivering real-time analytics and managing petabytes of data efficiently. Join us as we explore these engineering hurdles and share solutions that drive innovation in the digital asset space. en Sat, 07 Feb 2026 15:52:41 GMT 2026-02-07T15:52:41Z en Building a Cost-Aware Data Platform at Petabyte Scale https://engineering.amberdata.io/building-a-cost-aware-data-platform-at-petabyte-scale <div class="hs-featured-image-wrapper"> <a href="proxy.php?url=https://engineering.amberdata.io/building-a-cost-aware-data-platform-at-petabyte-scale" title="" class="hs-featured-image-link"> <img src="proxy.php?url=https://engineering.amberdata.io/hubfs/AI-Generated%20Media/Images/The%20image%20depicts%20a%20modern%20bustling%20office%20environment%20filled%20with%20engineers%20and%20data%20scientists%20engaged%20in%20collaborative%20work%20Large%20screens%20display%20c.png" alt="Building a Cost-Aware Data Platform at Petabyte Scale" class="hs-featured-image" style="width:auto !important; max-width:50%; float:left; margin:0 15px 15px 0;"> </a> </div> <p>To understand the challenge at Amberdata, you have to start with scale.
Today, our platform:</p> <p><strong>Produces 8–10 TB</strong> of data per day</p> <p><strong>Consumes 24–30 TB</strong> daily via Redpanda</p> <p><strong>Stores 2.6 PB</strong> in object storage</p> <p><strong>Handles 50B messages</strong> per day</p> Big Data Blockchain Startups Real-Time Analytics Technical Debt Scale-Up Fri, 06 Feb 2026 21:20:53 GMT [email protected] (Stefan Feissli)
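A quick back-of-envelope check on the scale figures above (rounded inputs, so treat the outputs as orders of magnitude rather than exact values):

```python
# Rough arithmetic from the published figures: ~50B messages/day,
# consumed as ~24-30 TB/day (midpoint 27 TB). Inputs are rounded.
messages_per_day = 50e9
bytes_consumed_per_day = 27e12

avg_message_bytes = bytes_consumed_per_day / messages_per_day
messages_per_second = messages_per_day / 86_400  # seconds in a day

print(f"~{avg_message_bytes:.0f} bytes per message")       # ~540 bytes
print(f"~{messages_per_second:,.0f} messages per second")  # ~578,704
```

At over half a million messages per second sustained, even tiny per-message overheads compound into real infrastructure cost, which is why cost-awareness has to be designed in rather than bolted on.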
https://engineering.amberdata.io/building-a-cost-aware-data-platform-at-petabyte-scale 2026-02-06T21:20:53Z Re-imagining Blockchain Architecture at Amberdata - Part 1 https://engineering.amberdata.io/re-imagining-blockchain-architecture-at-amberdata-part-1 <div class="hs-featured-image-wrapper"> <a href="proxy.php?url=https://engineering.amberdata.io/re-imagining-blockchain-architecture-at-amberdata-part-1" title="" class="hs-featured-image-link"> <img src="proxy.php?url=https://engineering.amberdata.io/hubfs/AI-Generated%20Media/Images/The%20image%20depicts%20a%20modern%20hightech%20control%20room%20filled%20with%20large%20screens%20displaying%20complex%20data%20visualizations%20and%20blockchain%20metrics%20In%20the%20foregr.png" alt="Re-imagining Blockchain Architecture at Amberdata - Part 1" class="hs-featured-image" style="width:auto !important; max-width:50%; float:left; margin:0 15px 15px 0;"> </a> </div> <p>When I joined Amberdata in early 2023, we covered four major blockchains: Bitcoin, Bitcoin Cash, Litecoin, and Ethereum. However, we struggled with one big problem: onboarding new blockchains quickly and cost-effectively. Several factors contributed to this, but the most critical was <strong>collection speed</strong>. Sweeping an entire chain from genesis to the latest block requires processing tens of terabytes of data.
In this post, we will look at how a process that once took months and cost over $50,000 per chain was reduced to just a few hours and under $5,000.</p> Big Data Blockchain Startups Tue, 11 Nov 2025 18:00:05 GMT [email protected] (Cory VanHooser) https://engineering.amberdata.io/re-imagining-blockchain-architecture-at-amberdata-part-1 2025-11-11T18:00:05Z Amberdata's Architecture: Our Technical Foundations https://engineering.amberdata.io/amberdata-architecture-our-technical-foundations <div class="hs-featured-image-wrapper"> <a href="proxy.php?url=https://engineering.amberdata.io/amberdata-architecture-our-technical-foundations" title="" class="hs-featured-image-link"> <img src="proxy.php?url=https://engineering.amberdata.io/hubfs/AI-Generated%20Media/Images/An%20image%20of%20an%20engineering%20team%20drawing%20up%20a%20software%20architecture%20on%20a%20white%20board%20for%20a%20new%20product%20team%20should%20be%20diverse-1.jpeg" alt="Amberdata's Architecture: Our Technical Foundations" class="hs-featured-image" style="width:auto !important; max-width:50%; float:left; margin:0 15px 15px 0;"> </a> </div> <p><span>Over the last two years, we have completely re-architected our system to meet the evolving demands of customers, digital asset markets, and our business.
The broader adoption of crypto assets, recent all-time trading highs, and persistent market volatility have tested the limits of our data infrastructure. To meet these ever-increasing needs, we’ve built a modern architecture designed for scalability, reliability, and agility.</span></p> <p>We’re proud to share a few key milestones achieved by the new system:</p> <ol> <li>Processing over <strong>$500B in daily notional transaction value</strong></li> <li>Ingesting around <strong>2 million events per second</strong> into Apache Pinot</li> <li>Handling <strong>~2 TB of new data daily</strong></li> <li>Storing over <strong>2 petabytes</strong> of compressed data in our object store</li> <li>Serving <strong>three billion</strong> REST API calls over the past year</li> </ol> <p>Amberdata provides institutional-grade digital asset intelligence. Our customers rely on us for different use cases, from real-time market data and blockchain protocol information to normalized order book events and spot analytics.</p> <p>Despite this variety, our core capabilities are:</p> <ul> <li><strong>Collecting</strong> relevant and timely data from diverse sources</li> <li><strong>Normalizing, enriching, and transforming</strong> market and blockchain data into proprietary, industry-leading insights</li> <li>Delivering data <strong>consistently and accurately</strong> across all channels</li> <li>Providing historical data through <strong>batch exports and marketplace integrations</strong> to clients’ AI and machine learning platforms</li> <li>Supporting discovery and insight through an <strong>intuitive UI and APIs</strong></li> </ul> <p><strong>Our goal is simple yet ambitious.</strong></p> <p>Build a scalable and resilient platform optimized for <strong>adding new datasets, deriving insights,
and publishing them</strong> across delivery channels on demand while maintaining impeccable <strong>data quality, high availability, simplicity, and cost-efficiency</strong>.</p> <h5>System Architecture</h5> <p><span>Our distributed system comprises three main technical components:</span></p> <ol> <li><span><strong>Redpanda and Redpanda Connect </strong>form the core of our high-throughput event ingestion and stream processing architecture.</span></li> <li><strong>Databricks and Delta Lake </strong>serve as the core infrastructure of our data processing pipeline, supporting both real-time and batch workloads, enforcing centralized governance, and providing a unified view across diverse storage tiers.</li> <li><strong>Apache Pinot</strong> powers our real-time analytics.</li> </ol> <p>All services are deployed across multiple availability zones in AWS to ensure fault tolerance and high availability.</p> <p><strong>Redpanda: High-Throughput Streaming</strong></p> <p>We rely on Redpanda as the core of our ingestion pipeline, with Redpanda Connect handling data integration and transformation. While Redpanda is compatible with the Kafka API, it offers faster performance, easier management, and significant cost savings.
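Because Redpanda speaks the Kafka protocol, standard Kafka clients work against it unchanged. A minimal producer sketch, assuming a broker at `localhost:9092` and a hypothetical `trades_raw` topic (both illustrative, not Amberdata's actual setup):

```python
import json

def encode_trade(trade: dict) -> bytes:
    """Serialize a trade event to compact JSON bytes for the wire."""
    return json.dumps(trade, separators=(",", ":"), sort_keys=True).encode("utf-8")

def publish_trade(trade: dict, broker: str = "localhost:9092") -> None:
    """Send one trade to Redpanda via an ordinary Kafka client
    (pip install kafka-python); broker and topic are illustrative."""
    from kafka import KafkaProducer
    producer = KafkaProducer(bootstrap_servers=broker)
    producer.send("trades_raw", encode_trade(trade))
    producer.flush()
```

Pointing an existing Kafka producer at a Redpanda broker is all the migration a client needs; no Redpanda-specific SDK is involved.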
This enables us to ingest and process millions of events per second from blockchains, exchanges, and third-party sources with high throughput and low latency.</p> <ul> <li>Redpanda Connect transforms, enriches, and routes events in-stream.</li> <li>Following a Lambda-style architecture, data is ingested simultaneously into low-latency and long-term storage, ensuring real-time access and historical retention.</li> </ul> <p><strong>Databricks + Delta Lake: Our Unified Data Platform</strong></p> <p>Delta Lake serves as our single source of truth, following a <strong>Medallion Architecture</strong> that organizes data across three layers—<strong>raw (Bronze)</strong>, <strong>cleaned and enriched (Silver)</strong>, and <strong>business-level, analytics-ready (Gold)</strong>—all centrally governed through <strong>Unity Catalog</strong>. It supports <strong>ACID transactions and schema enforcement</strong>, ensuring data consistency and reliability throughout the pipeline. Shared <strong>notebooks, orchestrated workflows, and interactive dashboards</strong> promote collaboration.</p> <p>As the foundation of our <strong>data mesh architecture</strong>, the data lakehouse facilitates <strong>domain ownership</strong>, data-as-a-product thinking, and self-serve access. Meanwhile, Unity Catalog enforces <strong>fine-grained access control, audit logging</strong>, and enterprise-grade security and governance.</p> <p><span>Databricks has <strong>significantly increased execution speed</strong>, allowing our domain teams to ship new datasets faster with less overhead, while simplifying data quality and governance.
With the adoption of Unity Catalog’s Iceberg Interface, we gain <strong>seamless interoperability</strong> with external systems like Snowflake and Trino, further reducing integration overhead and accelerating time to insight.</span></p> <p><strong>Apache Pinot: Real-Time Analytics Engine</strong></p> <p>Pinot powers all our <strong>low-latency data</strong> needs. Its ability to sustain high ingest rates while serving sub-second queries makes it well suited to near <strong>real-time, complex analytical queries</strong>. Additionally, its <strong>query-side scalability</strong> enables us to maintain high performance even as data volumes and user concurrency scale.</p> <p>With its columnar storage format, innovative indexing strategies (e.g., star-tree index), and real-time ingestion capabilities, Pinot is ideal for scenarios that demand fast filtering, aggregation, and slicing of large volumes of time-series data.</p> <h5>Accelerating Time to Market</h5> <p>Scalability and reliability are foundational, but the real differentiator is <strong>how quickly we can analyze, transform, and generate proprietary</strong> insights from a new data source—a blockchain or a crypto exchange. Reducing this <strong>lead time</strong> has been a central focus in designing our new system, enabling <strong>faster integration and value delivery.</strong></p> <p>Architectural <strong>simplicity is fundamental</strong>—we prioritize straightforward, composable systems that lower operational overhead. Each team manages its <strong>domain</strong> and shares data via clear contracts, allowing for <strong>autonomy</strong> and reducing integration friction.</p> <p><strong>Loosely coupled</strong> services allow teams to deploy independently and recover smoothly when issues arise.
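The star-tree index mentioned above speeds up slicing and aggregation by pre-aggregating a metric across combinations of dimension values, with a wildcard standing in for "any value". A toy sketch of that idea, using hypothetical trade rows (illustrative only, not Pinot's actual implementation):

```python
from collections import defaultdict
from itertools import combinations

def star_tree_sums(rows, dims, metric):
    """Pre-aggregate `metric` over every subset of `dims`,
    with '*' as the 'any value' wildcard (toy version of the
    pre-aggregation behind a star-tree index)."""
    table = defaultdict(float)
    for row in rows:
        for r in range(len(dims) + 1):
            for kept in combinations(dims, r):
                key = tuple(row[d] if d in kept else "*" for d in dims)
                table[key] += row[metric]
    return dict(table)

rows = [
    {"exchange": "gdax", "pair": "BTC_USD", "volume": 2.0},
    {"exchange": "gdax", "pair": "ETH_USD", "volume": 3.0},
    {"exchange": "binance", "pair": "BTC_USD", "volume": 5.0},
]
sums = star_tree_sums(rows, dims=("exchange", "pair"), metric="volume")

# A query like "total volume on gdax, any pair" is now a single lookup:
print(sums[("gdax", "*")])     # 5.0
print(sums[("*", "BTC_USD")])  # 7.0
print(sums[("*", "*")])        # 10.0
```

Pinot builds and prunes these trees far more selectively than this exhaustive version, but the payoff is the same: an aggregation over billions of rows collapses into a handful of pre-computed lookups.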
We have integrated observability, automation, and cost-awareness at every level, enabling us to move quickly while maintaining control.</p> <h5>Concluding Thoughts</h5> <p>Amberdata’s new platform has been in <strong>development for two years</strong>. It’s now a production-grade system providing institutional-quality insights across the digital asset ecosystem.</p> <p>By integrating <strong>real-time streaming, a governed lakehouse architecture, and low-latency analytics</strong>, we’ve established a scalable foundation to support the next wave of growth in crypto and beyond.</p> <p>Whether you’re developing crypto trading applications, auditing blockchain operations, or assessing liquidity across exchanges, Amberdata offers the infrastructure to support them with confidence.</p> Big Data Blockchain Startups Real-Time Analytics Fri, 09 May 2025 19:15:38 GMT [email protected] (Stefan Feissli) https://engineering.amberdata.io/amberdata-architecture-our-technical-foundations 2025-05-09T19:15:38Z Transitioning Amberdata to a Growth Company https://engineering.amberdata.io/engineering-challenges-transitioning-amberdata-to-a-growth-company <div class="hs-featured-image-wrapper"> <a href="proxy.php?url=https://engineering.amberdata.io/engineering-challenges-transitioning-amberdata-to-a-growth-company" title="" class="hs-featured-image-link"> <img
src="proxy.php?url=https://engineering.amberdata.io/hubfs/AI-Generated%20Media/Images/The%20image%20depicts%20a%20bustling%20modern%20office%20environment%20filled%20with%20diverse%20teams%20of%20engineers%20and%20data%20scientists%20gathered%20around%20large%20screens%20displa.jpeg" alt="Transitioning Amberdata to a Growth Company" class="hs-featured-image" style="width:auto !important; max-width:50%; float:left; margin:0 15px 15px 0;"> </a> </div> <p>Two years ago, when I joined Amberdata as VP of Engineering, we processed roughly <strong>$1 billion</strong> in daily notional transaction value. Today, that number exceeds <strong>$500 billion</strong>.</p> Big Data Blockchain Startups Real-Time Analytics Technical Debt Scale-Up Thu, 13 Feb 2025 23:57:53 GMT [email protected] (Stefan Feissli) https://engineering.amberdata.io/engineering-challenges-transitioning-amberdata-to-a-growth-company 2025-02-13T23:57:53Z
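The growth figure in that last post is worth quantifying: going from $1B to $500B in daily notional value over two years implies, as rough arithmetic on the rounded figures above, a compounded annual factor of:

```python
# $1B/day -> $500B/day over two years (rounded figures from the post).
overall_growth = 500e9 / 1e9       # 500x overall
per_year = overall_growth ** 0.5   # compounded annual factor over 2 years

print(f"{overall_growth:.0f}x overall, ~{per_year:.0f}x per year compounded")
```

That is roughly a 22x year-over-year increase in processed notional value, which frames why the infrastructure described in the posts above had to be rebuilt rather than incrementally scaled.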