
Optalysys strengthens Advisory Board with Phil Cheetham and former Intel Chief Scientist Ro Cammarota

To support US expansion and technology commercialisation

Leeds, 10 March 2026: Optalysys, a photonics company enabling the future of secure computing and sustainable AI, has appointed Rosario “Ro” Cammarota and Phil Cheetham to its Advisory Board, effective 1 March 2026. 

Ro is an established leader in Fully Homomorphic Encryption (FHE) and secure AI systems, and former Chief Scientist at Intel. He founded and scaled Intel’s encrypted computing programme and led the development of the first advanced-node platform enabling FHE at scale. His expertise spans translating breakthrough cryptography into high-performance, deployable computing infrastructure. 

Phil brings deep expertise in cloud-based secure compute and large-scale infrastructure for AI and data-intensive applications. He is currently Group Head of Data Content at LSEG (London Stock Exchange Group), responsible for sourcing and content across the group. Previously, he was Vice President, EC2 Instance Platforms at AWS, where he led the evolution of existing and introduction of new instance types for EC2, such as Graviton and Trainium. Phil has spent several years working with Optalysys, both as an advisor and a Non-Executive Board member. 

Earlier this year, Optalysys announced it had secured £23m to advance its photonic computing capabilities and US expansion. Its advances in silicon photonics bring massive efficiency gains to AI, high-performance and secure computing, telecoms and cryptography. 

FHE is one of its key application areas – a post-quantum-ready form of cryptography that allows data to be processed securely while remaining encrypted – and a critical requirement for the growth of secure AI, cloud and enterprise computing. 

Dr Nick New, CEO of Optalysys, said, “Ro and Phil bring outstanding and highly complementary expertise to our Board. Ro has been at the forefront of advancing FHE from theory to deployable platforms, while Phil offers deep insight into cloud-scale compute and the realities of adopting secure systems in the industries we’re prioritising. Their guidance will help us accelerate product and commercial execution to support next-generation computing workloads – across AI, security, intelligence and discovery – and raise Optalysys’ profile in the US as we ramp up our international market presence.” 

Ro Cammarota said, “Optalysys is tackling one of the most important problems in secure AI: making computation on encrypted data practical at meaningful scale. I’m excited to join the Advisory Board and support the team as they translate their technology into deployable products and expand their impact globally.” 

Phil Cheetham said, “Optalysys is focused on solving one of the most challenging problems in confidential AI at scale. Having worked with Optalysys over the past couple of years, I have seen the strength of the technology and the team first-hand. As global demand grows for secure, sustainable, high-performance computing across all industry verticals, I’m delighted to join the Advisory Board and continue to help Optalysys.” 

– ENDS –  

About Optalysys 

Founded in Leeds, UK, in 2013 by Dr. Nick New (CEO) and Robert Todd (CTO), Optalysys is a photonic computing company. Optalysys’ unique approach integrates data movement and processing on a single chip, combining silicon photonics with cutting-edge digital technologies to deliver the immense computational power required for today’s workloads and next-generation cloud infrastructure. Optalysys has built the world’s first dedicated hardware solution designed for encrypted blockchain applications — LightLocker™ Node. 

9 March 2026

Accelerated Privacy by Design: Turning encrypted compute into deployable, repeatable services 

by Marcella Arthur
CRO in Residence at Optalysys

For the last few years, the landscape of Privacy Enhancing Technologies (PETs) has been fragmented. Zero-Knowledge (ZK) for scaling and proofs, Multi-Party Computation (MPC) for custody, Fully Homomorphic Encryption (FHE) for computing directly on encrypted data.  

While each tool is powerful in isolation, the lack of integration has created silos. Developers are forced to act as systems integrators, stitching together incompatible cryptographic primitives. This fragmentation slows down the entire ecosystem, introduces security gaps, and makes compliance a nightmare of coordination. 

Simultaneously, blockchain technology has moved from the crypto fringes to become the technology of choice for serious institutions exploring the future of financial systems. 

This shift demands that privacy, performance and compliance come together.  

Ending the PETs silo

Right now, the privacy-enhancing technologies that complement distributed ledger technologies exist and are understood largely in isolation. 

  • ZK proofs handle one class of problems: proving correctness without revealing inputs 
  • FHE unlocks computation over encrypted data 
  • MPC lets multiple parties collaborate without revealing their secrets 
  • TEEs offer sealed execution environments 

Each of these is powerful on its own. Each has its own ecosystem, libraries, hardware assumptions and best practices. And each tends to be deployed as a one-off solution to a specific problem. 
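The "computation over encrypted data" property can be illustrated with a toy additively homomorphic scheme. The sketch below uses textbook Paillier encryption with deliberately tiny primes – an illustration of the principle only, not of Optalysys hardware or a production FHE scheme (which would use lattice-based cryptography and far larger parameters):

```python
import math
import random

def keygen(p=10007, q=10009):
    # Textbook Paillier with toy primes; real keys use ~2048-bit primes.
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)              # valid because we pick g = n + 1
    return (n, n + 1), (lam, mu, n)   # (public key, private key)

def encrypt(pk, m):
    n, g = pk
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:        # r must be invertible mod n
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(sk, c):
    lam, mu, n = sk
    L = (pow(c, lam, n * n) - 1) // n
    return (L * mu) % n

pk, sk = keygen()
n = pk[0]
c1, c2 = encrypt(pk, 17), encrypt(pk, 25)

# Multiplying ciphertexts adds the underlying plaintexts -
# the computation happens without ever seeing 17 or 25 in the clear.
c_sum = (c1 * c2) % (n * n)
print(decrypt(sk, c_sum))   # 42
```

Paillier only supports addition; fully homomorphic schemes extend this to arbitrary computation, which is where the performance cost – and the case for dedicated acceleration – comes in.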

When combined, they can solve the privacy and compliance barriers to widespread, institutional adoption of blockchain and decentralised technologies. 

We are moving toward a unified architecture where confidentiality, verifiable execution, and high-performance compute are tightly integrated. Privacy isn’t a bolted-on feature or a tax on performance – with convergence, it is the native state of the network. 

Imagine a single transaction that is:  

  • Encrypted (via FHE) to protect the input data from the validator 
  • Verifiable (via ZK) to prove regulatory compliance without exposing the data 
  • Collaborative (via MPC) to allow multiple parties to compute on or analyse the result 

This is accelerated privacy by design, which transforms confidentiality and compliance into innovation enablers.
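The MPC leg of that picture rests on a simple building block: additive secret sharing, where each input is split into random shares, no single share reveals anything, yet the parties can still compute an aggregate. A minimal sketch with toy parameters (no networking or malicious-party handling):

```python
import random

MOD = 2**61 - 1   # Mersenne prime modulus for the toy share arithmetic

def share(secret, parties=3):
    # Split a value into additive shares that sum to it mod MOD.
    shares = [random.randrange(MOD) for _ in range(parties - 1)]
    shares.append((secret - sum(shares)) % MOD)
    return shares

# Three organisations each hold a private figure they will not disclose.
inputs = [52_000, 61_000, 48_000]
all_shares = [share(v) for v in inputs]

# Party i sums only the i-th share of every input (its local view)...
partials = [sum(column) % MOD for column in zip(*all_shares)]

# ...and only the combined total is ever reconstructed.
total = sum(partials) % MOD
print(total)   # 161000
```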

Convergence is what makes blockchain privacy operationally dependable

A converged architecture reduces moving parts and makes performance scalable. 

In practical terms, it means: 

  • Sensitive data stays encrypted through the workflow 
  • Execution produces verifiable evidence without exposing raw inputs 
  • Policy constraints can be enforced without turning the platform into a surveillance tool 
  • Performance remains stable enough to support SLAs 

This is privacy by design that’s engineered for real deployments, at a global scale, rather than pilots. 

The commercial impact of dedicated acceleration  

Enterprise buyers don’t optimise for peak performance. They optimise for predictability. 

If encrypted compute introduces cost or latency spikes then margins become unstable, confidence erodes and managing services or pricing becomes complicated. 

Dedicated acceleration, like LightLocker™ Node for FHE, is what stabilises the performance envelope so privacy can be productised. Not as a one-off feature, but as a service tier. 

Once you have predictable confidential execution, partners can build repeatable offerings: 

  • Confidential execution tiers with defined SLAs 
  • Vertical reference architectures for tokenised assets, private marketplaces, secure data-sharing 
  • Managed services for monitoring, evidence generation, policy updates and performance tuning

Convergence is the key to adoption 

When you remove the performance penalty, privacy shifts from being a compliance hurdle to a competitive advantage. It allows for new business models or use cases – like dark pool exchanges, private on-chain medical diagnostics, and confidential AI training – that simply cannot exist in a transparent or siloed world. 

The future will be built on infrastructure that can support regulated workflows without forcing firms to choose between transparency and confidentiality. That is what enterprises can trust, and integrators can build business around. 

If you are evaluating how to incorporate privacy-preserving blockchain into your service offerings, get in touch with us to explore how to make the economics work: responsibly, repeatably and at scale → 


4 March 2026

Tokenised RWAs are a trillion-dollar opportunity, but only if privacy scales 

by Marcella Arthur
CRO in Residence at Optalysys

The tokenisation of Real World Assets (RWAs) has moved from pilots to platforms in the last couple of years. 

Tokenising real estate, bonds, private credit and treasuries can improve liquidity, broaden access, and streamline asset management. This represents one of the most compelling applications of blockchain technology – transforming the global financial landscape as we know it and bridging the gap between TradFi markets and the efficiencies of DeFi infrastructure. 

The scale of the opportunity is immense: at the end of 2025, the total market value of on-chain RWAs stood at over $36 billion, and some estimates project growth to $30–$50 trillion by 2030. 

But realising this opportunity depends on overcoming a fundamental infrastructure constraint – not token standards or smart contract tooling, but privacy at scale. 

RWA tokenisation cannot scale commercially without confidentiality that is predictable, auditable, and operationally dependable. 

Privacy isn’t a feature, it’s a prerequisite 

Bringing swathes of traditionally illiquid or inaccessible assets onto blockchain simply doesn’t marry with the inherent transparency of distributed ledger technology. 

The entire surface area of the data involved in RWA tokenisation is highly sensitive: 

  • Asset ownership: revealing the identities of asset owners is often unacceptable due to privacy regulations and commercial sensitivities 
  • Asset details & valuation: exposing specific details, underlying performance data, or real-time valuations of assets can undermine competitive positions or negotiation leverage  
  • Transaction data: publicly visible transaction amounts, counterparty details, or investment flows related to RWAs can leak sensitive strategic information 

Broadcasting these – even in semi-public permissioned approaches – creates risk that major firms won’t touch and regulators won’t accept. 

This privacy gap is arguably the single largest barrier preventing the RWA tokenisation market (and indeed blockchain technology itself) from scaling exponentially and achieving its widely projected potential.  

“The biggest barrier to widespread enterprise adoption of blockchain is privacy” – Jeremy Allaire, CEO of Circle (issuer of USDC stablecoin) 

The participation of major financial institutions is the credibility signal: real deployments are being explored and launched by BlackRock, JPMorgan, Franklin Templeton and Société Générale, to name a few – but the bar for compliant, scalable privacy is high. 

What confidential RWA rails require in production 

The direction is toward models where RWA information is recorded, managed, and transacted while encrypted. 

FHE offers an ideal solution because it enables computation on encrypted data (policy checks, transfer logic, portfolio valuation, analytics) without ever decrypting it. 

Practically, this supports shared infrastructure and programmability while keeping identities, asset attributes, values, and counterparties confidential when needed, and auditable when required. 
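The portfolio-valuation case can be made concrete: with an additively homomorphic scheme, encrypted holdings can be weighted by public prices and totalled without decryption. The sketch below again uses textbook Paillier with toy primes (raising a ciphertext to a plaintext scalar multiplies the underlying value); a production FHE stack is far more complex, but the shape of the computation is the same:

```python
import math
import random

# Textbook Paillier with toy primes (illustration only).
def keygen(p=10007, q=10009):
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    return (n, n + 1), (lam, pow(lam, -1, n), n)

def encrypt(pk, m):
    n, g = pk
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(sk, c):
    lam, mu, n = sk
    return (((pow(c, lam, n * n) - 1) // n) * mu) % n

pk, sk = keygen()
n = pk[0]

# Holdings stay encrypted; unit prices are public reference data.
enc_quantities = [encrypt(pk, q) for q in (120, 45, 300)]   # confidential
prices = [50, 200, 10]                                      # public

# Enc(q)^price = Enc(q * price); multiplying ciphertexts adds values.
total = 1
for c, price in zip(enc_quantities, prices):
    total = (total * pow(c, price, n * n)) % (n * n)

print(decrypt(sk, total))   # 120*50 + 45*200 + 300*10 = 18000
```

Only the holder of the private key ever learns the valuation; the party doing the arithmetic sees ciphertexts throughout.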

Predictability determines monetisation 

Encrypted compute introduces heavy overhead. On standard CPU/GPU infrastructure, cost can scale too quickly and become prohibitive at high transaction rates. 

For GSIs/MSPs, that translates directly into commercial constraints: 

  • If performance and cost are unpredictable, you can’t offer meaningful SLAs. 
  • If you can’t offer SLAs, institutions won’t move production workloads. 
  • If every deployment is bespoke, you don’t get repeatable revenue. 

This is why dedicated acceleration for encrypted compute matters: it turns confidentiality from a bespoke project risk into a service tier you can package and sell.  

LightLocker™ Node is purpose-built to enable low-latency confidential transactions at scale on blockchains, making it ideally positioned for RWA workflows. 

Your RWA tokenisation partner evaluation checklist 

If you’re building an RWA tokenisation practice or managed service, evaluate partners with the same discipline you’d use for regulated infrastructure: 

  1. Predictability under load 
    Request p50/p95/p99 latency and cost curves for encrypted operations, including burst scenarios 
  2. Audit posture 
    Can the system execute policy checks over encrypted data and produce audit evidence without exposing raw identities/positions? 
  3. Operational model 
    How are encrypted compute components deployed, monitored, scaled, and updated? What are the failure modes and recovery paths? 
  4. Repeatability 
    Can you package this into reference architectures and managed confidential tiers with defined SLAs, plus ongoing services (monitoring, policy updates, performance optimisation)? 
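The first checklist item – percentile latency under load – is easy to operationalise. A sketch of the kind of analysis to request, with synthetic latency data standing in for real benchmark samples:

```python
import random

def percentile(samples, p):
    # Nearest-rank percentile over the sorted samples.
    s = sorted(samples)
    k = min(len(s) - 1, max(0, round(p / 100 * len(s)) - 1))
    return s[k]

# Synthetic per-operation latencies (ms): a steady baseline plus a
# small fraction of slow outliers, mimicking burst-load behaviour.
random.seed(7)
latencies = [random.gauss(40, 5) for _ in range(950)]
latencies += [random.gauss(400, 50) for _ in range(50)]

p50 = percentile(latencies, 50)
p95 = percentile(latencies, 95)
p99 = percentile(latencies, 99)

# A tight p50 alongside a p99 an order of magnitude higher is exactly
# the tail behaviour an SLA review needs to surface before signing.
print(f"p50={p50:.0f}ms  p95={p95:.0f}ms  p99={p99:.0f}ms")
```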

Accelerating the future of asset management 

The tokenisation of RWAs holds transformative potential for financial markets, promising increased efficiency, liquidity, and accessibility.   

Major institutions are already participating, and market projections point towards a multi-trillion dollar future. However, unlocking this potential requires solving the critical challenge of privacy on blockchains.  

Fully Homomorphic Encryption offers a powerful pathway to achieving this, enabling computation on encrypted data directly within blockchain environments. Dedicated hardware acceleration is the critical enabling technology required to break the performance barrier and set the pathway to that $30 trillion projection. 

This synergy between advanced cryptography and specialised hardware is fundamental to building the trusted infrastructure needed for the future of asset management on blockchain. 

If you are evaluating how to incorporate privacy-preserving blockchain into your service offerings, get in touch with us to explore how to make the economics work: responsibly, repeatably and at scale → 


23 February 2026

The economics of scaling blockchain privacy

by Marcella Arthur
CRO in Residence at Optalysys

Global finance is transforming. In a world where tokenised assets are scaling fast (Citi estimates tokenised deposits could support over $100 trillion in annual flows by 2030) and stablecoins are already processing trillions in yearly volume, it’s no surprise that banks around the world are actively exploring and deploying blockchain technology.

But public ledgers are inherently transparent. Institutional, regulatory and consumer demands for data confidentiality are forcing firms to investigate privacy solutions that can work within blockchain technologies and perform predictably on a global scale.

Encrypted computation is often described as the holy grail of privacy.

The ability to process data while it remains encrypted unlocks powerful use cases for enterprises exploring blockchain: confidential transactions, staking and positioning, data-sharing between organisations, and new forms of digital identity. 

So why hasn’t it become standard? 

The answer isn’t a lack of cryptographic capability; it’s the economics. Advanced privacy technologies can be incredibly compute-intensive, and performance and costs have historically behaved too unpredictably for serious adoption as usage grows. 

This unpredictability is what kills monetisation. You cannot reliably price services, design products, or commit critical workflows to infrastructure that you cannot model effectively. 

For any institution, the single most important metric is not the fastest possible encrypted transaction. It is the variance of performance and cost as usage grows. 

The opportunity for GSIs and MSPs is to flip the narrative: own the pattern for predictable encrypted compute, and package it as a premium, repeatable service. 

When volatility stands in the way of adoption 

On conventional hardware, advanced privacy technologies such as Fully Homomorphic Encryption (FHE) and complex zero-knowledge proofs are extremely resource-intensive. Their performance can fluctuate based on system load, contention with other workloads and the complexity of the underlying operations. 

In practice, this volatility creates massive risk in three critical areas for your clients:

  • Operational – If the network gets busy – a volatile market event, a product launch, a burst of user activity – and encrypted flows slow or stall, clients will notice. Transactions that used to settle in milliseconds now take seconds; infrastructure costs for the same workload suddenly multiply; firms lose credibility and users lose trust 
  • Regulatory – Risk teams don’t solely judge systems on average behaviour; they must look at how they behave under stress. If encrypted paths collapse under load, privacy starts to look like a new source of operational and conduct risk, rather than a shield. With regulators closing in, these paths must be watertight. 
  • Commercial – If a bank cannot predict performance and cost, it won’t move mission-critical workflows onto that platform – no matter how compelling the use case 

This is not hypothetical.

Deloitte found that regulatory complexity and operational risk are top concerns for digital-asset initiatives, ahead of pure technology questions, reflecting how much weight decision-makers put on predictable behaviour and governance.

For service providers, volatility translates directly into projects that can’t be priced and services that can’t be guaranteed. This risk is a direct threat to the repeatability and profitability of any privacy-enabled service line. 

Dedicated acceleration is the stabilising foundation for blockchain privacy 

This is where architecture becomes a critical commercial consideration. 

At Optalysys, we develop dedicated, FPGA-based encrypted compute servers specifically architected for deploying FHE on blockchain. Instead of running heavy cryptography on general-purpose CPUs and GPUs alongside everything else, we move the core operations into specialised hardware that is tuned for encrypted workloads. 

This approach delivers several concrete benefits: 

  • More predictable performance – FPGAs allow us to implement the critical operations at the hardware level, with fixed pipelines and parallelism tailored to the workload. That reduces latency variability and makes behaviour under load easier to model – the foundation for credible SLAs 
  • Improved efficiency – Specialised acceleration can significantly reduce the resources required per encrypted operation, lowering the effective privacy tax on each transaction or query. Lower, more stable operating costs translate into healthier, more predictable margins 
  • Clearer capacity planning – Because the performance characteristics of the accelerator are well understood, it becomes easier to forecast how encrypted demand will translate into infrastructure requirements. This means you can design service tiers and pricing models that scale sensibly 
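The capacity-planning point reduces to simple arithmetic once per-operation behaviour is fixed: known latency per lane, known lanes per unit, a planning headroom, and demand translates into unit counts. The figures below are hypothetical, purely to show the shape of the model (they are not measured LightLocker Node numbers):

```python
import math

def units_required(demand_ops_per_sec, ms_per_op, lanes_per_unit, headroom=0.7):
    # Hypothetical capacity model for a fixed-pipeline accelerator:
    # each lane completes one encrypted op every ms_per_op milliseconds,
    # and we plan to run units at `headroom` of theoretical throughput.
    unit_throughput = lanes_per_unit * (1000 / ms_per_op) * headroom
    return math.ceil(demand_ops_per_sec / unit_throughput)

# Illustrative demand tiers (forecast encrypted ops/sec).
for demand in (500, 2_000, 10_000):
    units = units_required(demand, ms_per_op=2.5, lanes_per_unit=8)
    print(f"{demand:>6} ops/s -> {units} unit(s)")
```

Because the inputs are stable, the outputs are stable – which is precisely what lets a provider design tiers and price them.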

LightLocker™ Node is a dedicated encrypted-compute engine designed to sit alongside existing infrastructure, providing a stable, scalable foundation for privacy-preserving workloads. 

This turns blockchain privacy from an implementation detail into a productised service that you can wrap, sell and repeat. 

Why is blockchain privacy critical now? 

For blockchain-based systems to support meaningful, regulated workloads, privacy must become part of their fabric. Something end-users do not notice, and architecture teams can rely on without constant intervention. 

Blockchain can only be deployed responsibly and realise its trillion-dollar potential with robust privacy measures. And advanced privacy will only go mainstream when it is: 

  • Economically predictable 
  • Operationally dependable 
  • Efficient enough to scale sustainably   

The infrastructure that uses fewer resources per encrypted operation, scales more gracefully and behaves more consistently will win out. FPGA-based acceleration is a practical way to reach that point today: efficient by design, tuned for encrypted workloads, and capable of delivering the predictable performance that real-world adoption demands. 

Get in touch with us to explore what a compliance-grade ledger operations offering looks like for your clients: SLAs, runbooks, and a managed enforcement service you can monetise.

We’ll work with you to provide access to our test environment, a guided evaluation plan and reference blueprints and artefacts for evidence, reporting, and integration → 


18 February 2026

Turning compliance from blockchain design constraint to competitive edge

by Marcella Arthur
CRO in Residence at Optalysys

Blockchain has grown up. 

What began as an experiment in open, permissionless networks is now part of board-level conversations about market infrastructure, digital assets, cross-border payments, and data-sharing.  

From the EU’s MiCA regulation fully coming into force to the GENIUS Act, strict new stablecoin frameworks in the US and Asia, and the FCA finalising a UK crypto regulatory framework, the message is clear: 

If a blockchain is going to carry regulated value, compliance can’t sit at the edges – it must be designed into the core. Policy is becoming code that executes before value moves, not after. 

The fundamentals are still the same: KYC, AML and CTF checks, transaction monitoring and reporting of suspicious activity. But baking these processes into distributed ledger technologies is new terrain. 

For enterprises and institutions, this isn’t optional. 

For MSPs, it’s an opportunity. 

From reactive compliance to embedded enforcement 

Historically, financial compliance has been reactive: a breach is detected or a rogue transaction flagged, then investigated, reported, analysed and corrected. 

Architecting on-chain enforcement – embedding compliance by design – allows firms to catch and prevent non-compliant activity before it executes. This becomes the enabler for scaling systems safely and cementing the global shift to blockchain-based finance.   

Rather than relying solely on external monitoring and manual processes, the system itself becomes an active control mechanism. 

This does not replace governance, risk and compliance functions. It strengthens them with: 

  • Clear, enforceable rules expressed as code 
  • Automated checks at transaction or contract level 
  • A consistent set of controls that apply across participants and jurisdictions 

Turning blockchain compliance into a product surface 

Alongside policymakers’ demands for oversight, the blockchain ecosystem is seeing mounting pressure to implement more robust privacy measures, from users and institutions alike. 

The winning architecture pattern we’re starting to see is:  

Enforce → Attest → Prove, without exposing identities, positions, or raw data. 

That means designing architectures where: 

  • Enforce: policy is expressed as code, at smart-contract level (e.g. who may transact, under what conditions, with which assets and jurisdictions) 
  • Attest: privacy-preserving technologies (like ZK or FHE) are used to enforce and attest to those policies without exposing underlying data 
  • Prove: the system emits clear, auditable evidence on how decisions were made while protecting sensitive information 
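A minimal sketch of the enforce and attest steps, with policy expressed as code and a hash commitment standing in for the evidence a real system would produce via ZK proofs or FHE. All names, rules and data structures here are hypothetical:

```python
import hashlib
import json

# Hypothetical policy: KYC-approved senders in permitted jurisdictions,
# up to a per-transaction limit.
POLICY = {"jurisdictions": {"GB", "EU", "US"}, "max_amount": 1_000_000}

def enforce(tx, registry):
    sender = registry.get(tx["sender"], {})
    checks = {
        "kyc_approved": sender.get("kyc") is True,
        "jurisdiction_ok": sender.get("jurisdiction") in POLICY["jurisdictions"],
        "amount_ok": tx["amount"] <= POLICY["max_amount"],
    }
    allowed = all(checks.values())
    # Attest: commit to the decision and rule outcomes without embedding
    # the raw identity or position data in the evidence itself.
    evidence = hashlib.sha256(json.dumps(
        {"tx": tx["id"], "checks": checks, "allowed": allowed},
        sort_keys=True).encode()).hexdigest()
    return allowed, evidence

registry = {"acct-1": {"kyc": True, "jurisdiction": "GB"}}

ok, ev = enforce({"id": "tx-9", "sender": "acct-1", "amount": 250_000}, registry)
print(ok)    # True - the transaction passes every check
bad, _ = enforce({"id": "tx-10", "sender": "acct-2", "amount": 50}, registry)
print(bad)   # False - an unknown sender fails KYC and jurisdiction
```

The evidence digest can be recomputed by an auditor shown the check outcomes, without ever handling the underlying customer records.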

For enterprises, that makes on-chain systems easier to justify internally: 

  • Legal and compliance teams can see how obligations are operationalised 
  • Risk functions can understand and test worst-case behaviour 
  • Audit teams have a clear trail of what the system allowed and denied 

For GSIs and MSPs, it creates a deliverable you can take to market: a compliance-ready ledger blueprint, credential, contract and evidencing templates and robust privacy measures. But without a performance and reliability envelope and clear ownership boundaries, it doesn’t translate into a repeatable, scalable deployment pattern to roll out across clients. 

What breaks when blockchain compliance is an afterthought 

What happens when this enforce → attest → prove pattern is not in place? Let’s look at a regulated firm piloting tokenised assets on a shared ledger: 

  • Compliance defines requirements 
  • Only eligible, AML/KYC-satisfied users can participate 
  • Jurisdictional constraints must be adhered to 
  • Full auditability of processes and decisions is required 

When compliance is bolted-on – via manual whitelists maintained off-chain, post-trade monitoring and reporting or separate privacy flows that aren’t tied into the transaction path – the pilot breaks. 

  • Onboarding stalls: each new participant requires manual checks, list updates, and reconfiguration. Sales cycles stretch by months 
  • Rule changes are brittle: updating a policy means retesting every integration and custom component; each change feels like a one-off project 
  • Investigations are painful: when something looks suspicious, teams have to reconcile raw logs, external tools, and multiple systems just to reconstruct what happened 

Over time, three critical things break: Commercial momentum, operational resilience and regulatory confidence. 

Deloitte found that regulatory complexity is viewed as the greatest challenge to firms’ compliance risk management efforts for digital assets, with lack of leadership support for changes or investments and difficulty in identifying illicit digital asset use coming second and third respectively. 

If you can’t offer a service that reduces that complexity and delivers on enforcement, privacy and performance, you will lose out to providers that can. 

You need: 

  • Packaged deliverables: 
    • Compliance-grade ledger blueprint 
    • Credentialed onboarding and policy templates 
    • Evidence & audit reporting layer 
    • Performance and reliability envelope with clear ownership boundaries 
  • Commercial motion: where to price this (per project, subscription, usage), and confidence in how it repeats, scales and performs across industries and clients. 

The performance problem 

It’s clear that privacy-preserving enforcement paves the way forward, but it is incredibly compute-heavy.  

Fully Homomorphic Encryption – the ‘holy grail’ of cryptography that enables processing and analysis of data that remains encrypted at all times – offers a range of compliance benefits: 

  • Data minimisation & confidentiality by design 
  • Verifiable rule execution 
  • Controlled disclosure/governance-friendly decryption 
  • Reduced information leakage that can create misconduct risk 

But its computational demands have hindered deployment: the performance trade-offs have been too great for firms to seriously consider and for providers to reliably offer. 

This is where Optalysys comes in: our role is to make enforce → attest → prove practical, predictable and repeatable by delivering: 

  • Accelerated encrypted compute (via our confidential blockchain server LightLocker™ Node) to pull the most intensive enforcement logic out of generic CPU/GPU pools and onto dedicated rails, which enables… 
  • An execution model where policy checks over encrypted data have stable, measurable latency and cost — the kind you can build SLAs and services around 
  • Patterns and reference designs that help partners wrap this into compliance-grade offerings, not just custom builds 

Acceleration is what turns the pattern into something you can industrialise. 
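As an illustrative sketch (not Optalysys's implementation), the enforce → attest part of the pattern can be reduced to a policy check that appends a hash-chained evidence record, so auditors can later prove the log is complete. All names here, including the `enforce_and_attest` function and the policy version tag, are hypothetical; in a real deployment the policy would be evaluated over encrypted fields rather than plaintext:

```python
import hashlib
import json
import time

def enforce_and_attest(tx: dict, policy, evidence_log: list) -> bool:
    """Run a policy check on a transaction and append a tamper-evident
    evidence record. Toy sketch: a production system would evaluate the
    policy over encrypted data (e.g. under FHE), not plaintext."""
    allowed = policy(tx)
    prev_hash = evidence_log[-1]["hash"] if evidence_log else "0" * 64
    record = {
        "tx_id": tx["id"],
        "allowed": allowed,
        "policy_version": "sanctions-v1",  # hypothetical identifier
        "timestamp": time.time(),
        "prev": prev_hash,
    }
    # Hash-chain the records so gaps or edits in the evidence are detectable
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    evidence_log.append(record)
    return allowed

# Example policy: block transfers above an illustrative jurisdiction limit
limit_policy = lambda tx: tx["amount"] <= 10_000

log = []
enforce_and_attest({"id": "tx1", "amount": 5_000}, limit_policy, log)
enforce_and_attest({"id": "tx2", "amount": 50_000}, limit_policy, log)
```

Each record carries the hash of its predecessor, which is what makes the evidence layer attestable rather than merely logged.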

How to evaluate a compliance-ready blockchain stack 

As a critical infrastructure provider, you’ll need to assess: 

  • Workload definition 
    • Which flows need on-chain policy enforcement? 
    • What data must remain encrypted end-to-end? 
  • Success criteria 
    • Target latency budget for “enforce + attest” per transaction 
    • Acceptable overhead vs. non-enforced flows 
  • Benchmark plan 
    • Scenarios (normal load, stress, upgrade events) 
    • Metrics (p95/p99 latency, throughput, evidence quality) 
  • Operating cadence 
    • How often rules change (sanctions, product sets, jurisdictions) 
    • Who is authorised to update policies, and how is this audited?
  • Runbooks and SLAs 
    • What happens if the enforcement layer degrades? 
    • How quickly must you detect and correct evidence gaps? 
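The p95/p99 metrics in the benchmark plan above can be computed directly from per-transaction timings. A minimal sketch using only the Python standard library (the sample data is synthetic):

```python
import statistics

def latency_percentiles(samples_ms):
    """Compute p95/p99 latency from a list of per-transaction
    'enforce + attest' timings in milliseconds."""
    # quantiles(n=100) returns 99 cut points; index 94 is p95, 98 is p99
    q = statistics.quantiles(samples_ms, n=100)
    return {"p95": q[94], "p99": q[98]}

# Example: 1,000 synthetic timings spread across a 40-65 ms window
samples = [40 + (i % 50) * 0.5 for i in range(1000)]
result = latency_percentiles(samples)
```

Tracking the tail (p99) rather than the mean is what makes the latency budget something you can actually write an SLA against.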

Compliance has moved from a box-ticking exercise to a design constraint and service opportunity

Partners who action that — and turn it into enforceable, attestable, repeatable architectures where privacy, performance and policy reinforce each other — will be the ones clients call when they’re ready to move from pilots to production. 

Get in touch with us to explore what a compliance-grade ledger operations offering looks like for your clients: SLAs, runbooks, and a managed enforcement service you can monetise. 

We’ll work with you to provide access to our test environment, a guided evaluation plan and reference blueprints and artefacts for evidence, reporting, and integration → 


The post Turning compliance from blockchain design constraint to competitive edge appeared first on Optalysys.

Scalable privacy as a managed service: Turning confidential blockchain into a repeatable revenue line https://optalysys.com/resource/scalable-privacy-as-a-managed-service-turning-confidential-blockchain-into-a-repeatable-revenue-line/ Wed, 11 Feb 2026 09:32:25 +0000 https://optalysys.com/?p=3055 Enterprises exploring blockchain are not short on ideas. They are short on confidence. They see the promise of tokenised assets, shared ledgers, and programmable workflows. What stops adoption is not ambition. It is operational risk. Privacy sits at the centre of that risk. Most blockchain platforms are secure, but not confidential. Retrofitting privacy introduces complexity, […]


Scalable privacy as a managed service: Turning confidential blockchain into a repeatable revenue line

by Marcella Arthur
CRO in Residence at Optalysys

Enterprises exploring blockchain are not short on ideas. They are short on confidence.

They see the promise of tokenised assets, shared ledgers, and programmable workflows. What stops adoption is not ambition. It is operational risk.

Privacy sits at the centre of that risk.

Most blockchain platforms are secure, but not confidential. Retrofitting privacy introduces complexity, performance volatility, and cost uncertainty that enterprises are not equipped to manage alone.

This creates a clear opportunity for MSPs.

From experimental tech to managed infrastructure 

The early blockchain privacy story was one of cypherpunk workarounds and novelty technologies: pooled funds, and the introduction of ZK (Zero Knowledge) proving systems.

These have evolved into the privacy toolkit that explorative, forward-thinking builders are using today: zk-SNARKs, zk-STARKs, MPC (Multi-Party Computation), TEEs (Trusted Execution Environments), and more recently, FHE (Fully Homomorphic Encryption).

But privacy has moved from the experimental fringes to the required default, and the tech must follow suit.

However, much of this innovation assumed conditions that rarely exist in real deployments:

  • Small, well-defined data sets 
  • Simple, controlled user journeys 
  • Limited interaction with the wider network 

Enterprise deployments look nothing like that.

They require:

  • Predictable performance under variable load
  • Stable cost profiles that can be budgeted
  • Clear operational ownership
  • Support for compliance and audit requirements

In this landscape, privacy technologies must behave like core infrastructure – dependable, well-understood and compatible with the rest of the stack.

This is where MSPs become essential.

The MSP value opportunity

Confidential blockchain is not a one-off project. It is an ongoing operational capability.

Enterprises do not want to build and maintain specialised encrypted compute stacks internally. They want a service.

This creates a new managed category:

Confidential blockchain acceleration as a service.

MSPs that can offer this gain:

  • A differentiated revenue line tied to privacy and compliance
  • Long-lived infrastructure contracts rather than project work
  • Clear attach points for monitoring, SLAs, upgrades, and optimisation
  • A defensible position as regulated workloads move on-chain

Predictability is what makes this sellable

For enterprises, predictability matters more than novelty.

Let’s take a bank exploring tokenised assets on a shared ledger, or a market operator assessing a private order book. They cannot adopt an approach where: 

  • The time required to process an encrypted transaction varies dramatically with network load 
  • The cost of maintaining confidentiality becomes unpredictable at scale 
  • Privacy features impact overall system stability or user experience  

They need to know:

  • How encrypted workloads perform as usage grows
  • How costs scale with transaction volume
  • How upgrades are handled over time
  • How operational risk is managed

In this context, reliability becomes more important than novelty. Organisations are not looking for the most experimental technology; they are looking for technology that behaves in a predictable, repeatable way under real-world conditions. 

If privacy introduces volatility, adoption stalls.

If privacy behaves predictably, it becomes deployable. 

Managing the privacy tax

There is still a privacy tax associated with encrypted compute: additional processing time, higher resource consumption and increased complexity.  

If this tax cannot be controlled, organisations are forced to trade security against performance, or vice versa.  

Scalable privacy means reducing and stabilising that tax so that confidential processing becomes a standard, budgetable part of the architecture. The role of infrastructure and acceleration is to make that cost:

  • Visible
  • Controllable
  • Optimisable

When the privacy tax is stabilised, MSPs can price services confidently and maintain margin as customers scale.
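As a hedged illustration of what "budgetable" means in practice, a toy cost model is sketched below. The overhead multiplier is an assumed figure an MSP would measure for its own workload, not a vendor benchmark:

```python
def encrypted_workload_cost(tx_per_month: int, base_cost_per_tx: float,
                            overhead_multiplier: float) -> dict:
    """Toy model of the 'privacy tax': the extra cost of running a
    workload under encryption versus in plaintext. The multiplier is a
    measured, workload-specific assumption, not a fixed property."""
    plaintext = tx_per_month * base_cost_per_tx
    encrypted = plaintext * overhead_multiplier
    return {
        "plaintext_cost": plaintext,
        "encrypted_cost": encrypted,
        "privacy_tax": encrypted - plaintext,
    }

# Example: 1M transactions/month at £0.002 each, with an assumed
# 3x encrypted-compute overhead after acceleration
costs = encrypted_workload_cost(1_000_000, 0.002, 3.0)
```

Once the multiplier is stable and measurable, the privacy tax becomes a line item rather than a risk, which is precisely what makes the service priceable.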

If you’re exploring how to bring sensitive or regulated workloads on-chain, get in touch with us to find out more about how accelerated encrypted compute can support that journey, and how to assess readiness for scalable privacy in your own architecture →

The post Scalable privacy as a managed service: Turning confidential blockchain into a repeatable revenue line appeared first on Optalysys.

Computing for the future: Silicon photonics and sustainability https://optalysys.com/resource/computing-for-the-future-silicon-photonics-and-sustainability/ Fri, 23 Jan 2026 10:19:36 +0000 https://optalysys.com/?p=3019 In 1965, Intel co-founder Gordon Moore noticed that transistors – the building blocks of microchips – were shrinking.  He predicted that every two years, the number of transistors within a chip would double, and remarkably, this prediction came true. It became known as Moore’s Law, and it drove phenomenal growth in computing power that changed the world as we know it and fuelled a […]


Computing for the future:
Silicon photonics and sustainability 

Our world needs more compute and less carbon. By moving from electrons to light, silicon photonics unlocks high performance with lower energy and stronger privacy. The next era of sustainable computing starts now.

In 1965, Intel co-founder Gordon Moore noticed that transistors – the building blocks of microchips – were shrinking. 

He predicted that every two years, the number of transistors within a chip would double, and remarkably, this prediction came true. It became known as Moore’s Law, and it drove phenomenal growth in computing power that changed the world as we know it and fuelled a vast market for computing hardware. 

As these silicon transistors got smaller, computers became faster, cheaper and more powerful. Early transistor computers in the late 1950s typically held fewer than 100 transistors; the first Macintosh computer (1984) contained 68,000 transistors, while an iPhone 17 (2025) has a whopping 19 billion. 

The end of computing as we know it 

The modern world is built on the results of this exponential growth, but we have hit the limits of physics when it comes to shrinking silicon. Transistors are now so tiny – just a few atoms wide – that it is impossible to shrink them further and have them operate predictably and precisely in the way that we have come to rely on.  

At the same time, a related principle – Dennard scaling, which held that as transistors shrink, their power density stays constant – has also broken down.  

The result is a phenomenon known as dark silicon: chips packed with billions of transistors that can’t all be switched on at once without overheating or drawing excessive power. 

This all marks the end of computing as we have come to know it. Energy demands from expanding data centres are surging, AI models are demanding orders of magnitude more compute, and silicon electronics are hitting hard physical and thermal limits. And all of this comes at a huge environmental cost. 

This is why we are pioneering the use of silicon photonics: computing with light, rather than electricity. 

Beyond the limits of electronic compute – silicon photonics 

Photonics (or optical computing) replaces electrons with photons to move and process data. Light travels faster, carries more information simultaneously, and produces minimal heat compared to electricity. 

By integrating photonics directly onto silicon, we combine the scalability of semiconductor manufacturing with the speed and efficiency of optics.  

This results in dramatically higher compute density, lower power consumption, and superior thermal performance, enabling us to overcome the limits that the collapse of Moore’s Law and the rise of dark silicon have imposed on conventional chips. 

Sustainable, scalable intelligence that doesn’t cost the earth it serves 

Sustainability is now one of the defining challenges of modern computing. The rapid growth underpinned by Moore’s Law has left a significant carbon footprint, and energy demands are only increasing.  

Data centres already account for nearly 3% of global electricity use, and AI workloads could double that in the next few years. Every watt saved at the chip level has a multiplier effect across the world’s digital infrastructure. 

By using light instead of current, we can deliver orders of magnitude performance gains per watt, reducing power draw and cooling requirements while enabling far greater computational throughput.  

This creates a foundation for truly sustainable AI and data processing, one that is faster, cleaner, and fundamentally scalable.

A new class of applications 

The benefits of silicon photonics extend beyond sustainability gains. The same parallelism and efficiency that make photonic systems greener also enable new types of computing once thought impractical. 

One of these is Fully Homomorphic Encryption (FHE), a cryptographic technique that allows data to be processed securely while remaining encrypted. Typically, FHE has been confined to the lab – too computationally intensive for most practical uses. Our tech changes that, offering the computational power and energy efficiency to make this groundbreaking, privacy-preserving computing viable at scale. 

This breakthrough has far-reaching implications for sectors such as defence, finance, and healthcare, where both performance and data privacy are critical. The same principles also apply to AI acceleration, scientific modelling, and edge computing, all of which benefit from high bandwidth and low-power processing. 

Meeting the world’s demands 

As the constraints of electronic silicon become ever tighter, the global appetite for alternatives is growing fast.  

With silicon photonics, bottlenecks fall. Energy use drops. Security strengthens. The limits that held electronic compute in place no longer define what is possible.

The collapse of Moore’s Law and the end of Dennard scaling do not signal an end to progress. They mark the start of a new path where light, not electrons, carries the work of the world.

We are expanding internationally to collaborate with leading research institutions, semiconductor manufacturers, and technology partners across AI, cybersecurity, and data infrastructure. Our investors and government partners stand with us as we bring this future to market.

We are proud to help lead this transformation to faster, greener, and inherently secure computing.

Now is the time to choose the future of compute. Join us to test, validate, and scale silicon photonics. Partner with us on research. Build with us in production. Invest with us to accelerate adoption. Support policies that move light-based computing forward.

The future runs on light. Build it with us.

The post Computing for the future: Silicon photonics and sustainability appeared first on Optalysys.

Optalysys secures £23m to advance photonic computing and US expansion https://optalysys.com/resource/optalysys-secures-23m-funding-to-advance-photonic-computing/ Thu, 22 Jan 2026 08:00:00 +0000 https://optalysys.com/?p=3020 Leeds, UK, 22nd January 2026: Optalysys, the photonic computing company, has raised £23 million in a Series A extension round led by Northern Gritstone with participation from imec.xpand, Lingotto Horizon, and the UK government’s National Security Strategic Investment Fund (NSSIF). The investment will be used to accelerate the commercialisation of Optalysys’ proprietary photonic chips and […]


Optalysys secures £23m to advance photonic computing and US expansion

Leeds, UK, 22nd January 2026: Optalysys, the photonic computing company, has raised £23 million in a Series A extension round led by Northern Gritstone with participation from imec.xpand, Lingotto Horizon, and the UK government’s National Security Strategic Investment Fund (NSSIF). The investment will be used to accelerate the commercialisation of Optalysys’ proprietary photonic chips and support expansion into the US.

As AI and cloud workloads continue to grow exponentially, conventional electronic computing is reaching its physical limits. Optalysys’ unique approach integrates data movement and processing on a single chip, combining silicon photonics with cutting-edge digital technologies to deliver immense computational power whilst reducing the carbon footprint of today’s energy-intensive digital methods.

The company is developing a programmable, high-density layer designed to run compute-intensive workloads, including GenAI and post-quantum algorithms, forming a foundation for next-generation cloud infrastructure. One key application is fully homomorphic encryption (FHE), which allows data to be processed securely while remaining encrypted — an increasingly important capability for secure cloud and enterprise computing. Early forms of FHE technology are being used in digital form in Optalysys’ LightLocker™ Node servers launched last year — the world’s first dedicated hardware solution designed for encrypted blockchain applications.

Dr Nick New, CEO and Co-Founder of Optalysys, said: “We are at a defining moment in the evolution of computing. Photonic computing opens up fundamentally new capabilities, allowing data to be moved and processed with far greater speed and efficiency. This investment validates both the scale of the opportunity ahead and our ability to execute against it. It allows us to expand into new markets and take an important step towards making photonic computing a mainstream part of cloud infrastructure.”

Robert Todd, CTO and Co-Founder of Optalysys, said: “Recent acquisitions in the semiconductor industry have highlighted the role that photonics can play in addressing the limits of electronic computing, particularly in processing capability and power consumption, resulting from the demands of training and running ever larger AI models. Optalysys’ approach uniquely combines data movement and compute within the same package. Expanding to the US is an exciting and natural next step for us, so that we can tap into its strong photonics ecosystem and the immense talent located in Silicon Valley.”

Duncan Johnson, CEO of Northern Gritstone, said: “Optalysys is scaling towards global success. The company is building technology for the next generation of computing and has the team, technology and commercial traction that we, as investors, want. We’re excited to support the team as they continue to commercialise their technology and deliver real-world impact across multiple industries.”

ENDS

About Optalysys

Founded in Leeds, UK, in 2013 by Dr. Nick New (CEO) and Robert Todd (CTO), Optalysys is a photonic computing company. Optalysys’ unique approach integrates data movement and processing on a single chip, combining silicon photonics with cutting-edge digital technologies to deliver immense computational power required for today’s workloads and next generation cloud infrastructure. Optalysys has built the world’s first dedicated hardware solution designed for encrypted blockchain applications — LightLocker™ Node.

About Northern Gritstone

Northern Gritstone is a venture capital firm unlocking the North of England’s innovation potential by investing in science and technology startups and university spinouts tackling global challenges. Founded in 2021, Northern Gritstone partners with the Universities of Leeds, Liverpool, Manchester and Sheffield to identify world-leading research and develop deeptech and life sciences companies with the potential to scale worldwide. Backed by leading institutional investors including Legal & General, M&G, Aviva, Columbia Threadneedle, Bruntwood, local authority pension funds, and the British Business Bank, Northern Gritstone has supported over 40 companies to date, with more than £380 million invested alongside co-investors.

The post Optalysys secures £23m to advance photonic computing and US expansion appeared first on Optalysys.

Confidential computing decoded: TEEs vs FHE  https://optalysys.com/resource/confidential-computing-decoded-tees-vs-fhe/ Fri, 09 Jan 2026 08:59:06 +0000 https://optalysys.com/?p=2983 For decades, the cybersecurity industry has been preoccupied with two states of data. We have built elaborate fortresses around data at rest (encrypting files on disks) and secured data in transit (wrapping network traffic in Transport Layer Security).  But when it comes to data in use, many organisations find themselves vulnerable to exploits.  To actually […]


Confidential computing decoded: TEEs vs FHE 

Confidential computing is becoming a critical component in cybersecurity strategies, but how do Trusted Execution Environments stack up against Fully Homomorphic Encryption?

For decades, the cybersecurity industry has been preoccupied with two states of data. We have built elaborate fortresses around data at rest (encrypting files on disks) and secured data in transit (wrapping network traffic in Transport Layer Security). 

But when it comes to data in use, many organisations find themselves vulnerable to exploits. 

To actually do anything with data – like run an AI model, calculate a risk score, or retrieve a record from a database – it typically must be decrypted. For that brief window, the data is exposed and vulnerable. If a hacker, a rogue administrator, or a compromised operating system accesses the data in its plaintext state, the encryption at rest and in transit becomes irrelevant.  

This is the gap that confidential computing closes. It is the reason Gartner has named it a top strategic technology trend for 2026, predicting that by 2029, over 75% of operations in untrusted infrastructure will use it.  

The current standard: Trusted Execution Environments (TEEs)

At the heart of today’s confidential computing is the Trusted Execution Environment (TEE). Available in chips from major manufacturers like Intel and AMD, a TEE acts as a hardware-enforced secure enclave within the processor.  

The premise is elegant:  

  • Isolation: The TEE carves out a secure region of memory. Even the cloud provider’s hypervisor cannot look inside 
  • Attestation: Before you send data, the chip provides a cryptographic “ID card” proving it is genuine and running the correct code 

The flaw: recent TEE exploits

However, the TEE model relies on a critical assumption: absolute trust in the hardware. You are trusting that the physical silicon is impervious to attack. Unfortunately, recent history – and specifically the events of late 2025 – has shown that hardware is far from bulletproof.  

In October 2025, researchers unveiled TEE.fail and WireTap, two devastating physical attacks on confidential computing infrastructure. Using cheap, off-the-shelf electronics (the required kit could be bought for $1000), attackers were able to place an interposer between the CPU and the memory. By listening to the electrical signals on the memory bus, they could exploit the deterministic nature of the memory encryption to extract the very keys meant to protect the TEE.  

This follows a long line of side-channel exploits like Downfall, Hertzbleed, and CacheWarp, which allow attackers to decode secrets by analysing power consumption or execution timing. 

The lesson is clear: As long as data is decrypted somewhere on the chip, it remains vulnerable to exploitation.

From hardware to mathematics: cryptographic certainty with FHE 

While TEEs rely on specialised hardware enclaves to protect data and code while it’s in use, Fully Homomorphic Encryption (FHE) uses advanced cryptography to perform computations directly on the encrypted data and return an encrypted result. 

This means that, with FHE, even if the system is compromised, all an attacker can access is encrypted data and noise. The data is never exposed nor accessible in its plaintext state – not by the CPU, not in the cache and not in the system’s memory. 
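Real FHE schemes are lattice-based and far more involved, but the underlying homomorphic principle can be demonstrated with a toy: unpadded textbook RSA is multiplicatively homomorphic, so multiplying two ciphertexts yields a ciphertext of the product (a classroom sketch only; textbook RSA with these tiny parameters is neither secure nor fully homomorphic):

```python
# Toy demonstration of the homomorphic principle using textbook RSA:
# Enc(a) * Enc(b) mod n == Enc(a * b). Deliberately insecure parameters.
p, q = 61, 53
n = p * q                           # modulus: 3233
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (modular inverse)

enc = lambda m: pow(m, e, n)
dec = lambda c: pow(c, d, n)

a, b = 7, 6
c = (enc(a) * enc(b)) % n   # multiply ciphertexts; inputs never decrypted
result = dec(c)             # only the key holder learns the product
```

The party doing the multiplication never sees 7 or 6, only ciphertexts; FHE extends this idea to arbitrary additions and multiplications, which is what makes general computation on encrypted data possible.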

Historically, the barrier to FHE has been speed; software-only implementations are computationally intense. But just as GPUs revolutionised graphics processing, the emergence of dedicated hardware acceleration, like that developed by us at Optalysys, is solving FHE performance bottlenecks and making it a commercially viable way to deploy confidential computing.  

Gartner is right to highlight confidential computing as the trend of the decade. But smart leaders should view TEEs as one component in the mix of privacy and security. The future belongs to infrastructure that protects data even when the hardware fails, with the right mix of the right technologies for the job. 


At Optalysys we’re developing the future of secure AI through pioneering the use of silicon photonics to accelerate Fully Homomorphic Encryption. Get in touch with us to find out how we can accelerate your FHE use case → 

The post Confidential computing decoded: TEEs vs FHE  appeared first on Optalysys.

AI is a choice: 5 steps to make it privacy-safe  https://optalysys.com/resource/ai-is-a-choice-5-steps-to-make-it-privacy-safe/ Fri, 19 Dec 2025 17:10:22 +0000 https://optalysys.com/?p=2985 A recent Gartner report on AI and privacy states: “using AI is a choice and never an obligation”.   However, in the current enterprise landscape, the buzz around AI is leading to ever-increasing pressure on teams across organisations to adopt new forms of machine learning.   The report confirms what security and privacy leaders already know: AI […]


AI is a choice: 5 steps to make it privacy-safe 

How do you unlock the value of AI without exposing your customers, and your organisation, to unacceptable risk? It starts by treating privacy as a foundational component, not an add-on. 

A recent Gartner report on AI and privacy states: “using AI is a choice and never an obligation”.  

However, in the current enterprise landscape, the buzz around AI is leading to ever-increasing pressure on teams across organisations to adopt new forms of machine learning.  

The report confirms what security and privacy leaders already know: AI adoption amplifies privacy risk, especially when foundational privacy controls are weak. The challenge is that AI models are data-hungry, and that data is often full of sensitive, identifiable information.  

So how do you innovate responsibly? How do you unlock the value of AI without exposing your customers, and your organisation, to unacceptable risk? It starts by treating privacy as a foundational component, not an add-on.  

Here are five practical steps to bake privacy into your AI operations from day one. 

1. Start with proactive governance (not panic!) 

The rapid rise of AI has many boards demanding new, sweeping AI governance committees. A more pragmatic approach, as suggested by Gartner, is to integrate AI governance into your existing privacy programs. Don’t start from scratch; evolve what you already have.  

Your existing Privacy Impact Assessment (PIA) process is the perfect place to start.  

Evolve it into an “AI and Privacy Impact Assessment” (AIIA). This ensures that before any new AI model is deployed, your privacy and security teams have already asked the hard questions: What data was this model trained on? What is the data lineage? How will we monitor it for bias and drift?  

By embedding AI risk into proven workflows, you make governance a repeatable process, not a state of panic. 

2. Embed Privacy-by-Design (PbD) into every AI project

The core principles of PbD center around designing systems so that privacy isn’t a feature, it’s the architecture. 

That means: 

  • Making privacy impact assessments part of your model development workflow 
  • Documenting what data you collect, why it’s needed, and how long it’s retained 
  • Treating privacy as a default setting, not a user-dependent option 

In practice, this could mean automatically de-identifying training datasets, restricting model inputs to non-sensitive attributes, or adding runtime enforcement to prevent unnecessary personal data exposure. 

3. You can use AI to implement and scale your privacy measures responsibly 

AI is a source of risk, but it can also be a remedy. The sheer scale of data in a modern enterprise makes manual governance impossible.  

This is where automation comes in. Gartner notes that AI can be leveraged to enhance your privacy program by automating data discovery, classification, and life cycle controls.  

It can also streamline the handling of Subject Rights Requests (SRRs) and help assess re-identification risk, freeing up your human experts to focus on high-level strategy and risk management. 

4. Equip yourself with the right PET for the job

Privacy Enhancing Technologies (PETs) are the building blocks of safely and securely utilising AI. Your strategy must involve selecting the right tool for the right use case. Are you training a model on historical data? Synthetic data might work. Are you performing broad statistical analysis for a public report? Differential privacy could be a fit.  

But what if you need to process live, sensitive, individual data? That requires a different, more robust solution, like Fully Homomorphic Encryption (FHE). 

PET | Best for | Example
Synthetic data | Model training & collaboration | Replace sensitive datasets with statistically valid, non-identifiable versions
Differential privacy | Analytics & reporting | Add noise to outputs to prevent re-identification
Privacy-Aware Machine Learning (PAML) | AI model training | Rules that prevent singling out individuals from large datasets
Confidential computing (like FHE) | High-sensitivity workloads | Protect confidentiality by processing encrypted data without needing to decrypt it

Choosing the right PET depends on data sensitivity, use case, and computational constraints. The wrong choice risks either overprotecting (reducing model utility) or under-protecting (exposing sensitive data).  
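As a small illustration of the differential-privacy option above, the standard Laplace mechanism for a counting query can be sketched in a few lines (the epsilon value and counts are illustrative, not recommendations):

```python
import math
import random

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count under differential privacy by adding Laplace noise
    with scale 1/epsilon (the sensitivity of a counting query is 1).
    Smaller epsilon means stronger privacy and a noisier answer."""
    scale = 1.0 / epsilon
    u = random.random() - 0.5   # uniform in [-0.5, 0.5)
    # Inverse-CDF sampling of the Laplace(0, scale) distribution
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Example: publish how many records matched an analytics query, privately
noisy = dp_count(1_000, epsilon=0.5)
```

This trade-off is exactly the over/under-protection balance described above: a large epsilon barely perturbs the answer, while a small one protects individuals at the cost of utility.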

5. Strengthen your foundations before you scale 

Think of AI as an amplifier: if your privacy programme is strong, it enhances it. If it’s weak, it exposes it. 
 
Before training a model, make sure your organisation has mastered the basics: 

  • Data minimisation and purpose limitation: Only collect what’s needed, and only use it for what’s declared 
  • Clear retention and deletion policies: Know when and how to retire data 
  • Impact assessments that matter: Don’t treat PIAs as tickbox exercises – they are your early warning system for hidden risk. 

If these aren’t in place, AI adoption will magnify every weak control, from insecure data pipelines to unclear lawful bases for processing. 

The bottom line: AI is a choice

The Gartner report ends with a reminder worth repeating: 

“Though pressure may feel intense to adopt every new form of AI technology available today, remember: using AI is a choice and never an obligation.” 

Choosing AI means choosing responsibility. Building privacy-safe AI isn’t just good ethics – it’s good engineering, good compliance, and good business. 

At Optalysys we’re developing the future of secure AI through pioneering the use of optical computing to accelerate Fully Homomorphic Encryption. Get in touch with us to find out how we can accelerate your FHE use case → 


The post AI is a choice: 5 steps to make it privacy-safe  appeared first on Optalysys.
