Data protection digest 18 Feb – 2 Mar 2026: ‘Conditional Consent’ for meaningful user control over cookie preferences
https://techgdpr.com/blog/data-protection-digest-04032026-conditional-consent-for-meaningful-user-control-over-cookie-preferences/ (4 March 2026)

Conditional consent vs cookie fatigue

On 10 February, the EDPB and EDPS, in a joint opinion, strongly welcomed the regulatory solution to address cookie fatigue and the proliferation of consent banners. This follows the European Commission’s proposal to switch to automated, machine-readable indications of data subjects’ choices under the Digital Omnibus package. The EU regulators welcome that, pursuant to the proposed Article 88b of the GDPR, harmonisation standards will be developed.

Such standards should cover the communication of data subjects’ choices, from browsers to websites, from mobile phone applications to web services, and ensure that all involved actors use the same automated machine-readable indications and are not simply repackaging consent in a new technical format. 


Anticipating that data controllers and browser providers will soon need to accept and enable automated signals, TechGDPR has published Conditional Consent, an open concept paper proposing what automated signalling should look like for meaningful user control, based on three dimensions:

  • Cookie purpose
  • Website category
  • Third-party processing

The concept paper sets out the main principles, the legal basis and exceptions, technical specifications, a comparison with existing tools, and a proposed implementation, all available at conditionalconsent.com.

Main developments 

Prohibited AI practices: A Future of Privacy Forum analysis draws “red lines” around prohibited practices in the new EU AI Act. They concern harmful manipulation and deception, social scoring, individual risk assessment, untargeted scraping of facial images, emotion recognition, biometric categorisation, and real-time remote biometric identification for law enforcement. Prohibited AI practices are regulated by Article 5 of the AI Act, which became applicable in February 2025 and enforceable from 2 August 2025.

AI-generated images: The EDPB has signed a Joint Statement on AI-Generated Imagery and the Protection of Privacy. The statement, coordinated by the Global Privacy Assembly, represents the united position of 61 authorities across the world. The statement addresses serious concerns about AI systems that generate realistic images and videos depicting identifiable individuals without their knowledge or consent. The co-signatories are especially concerned about potential harm to children and other vulnerable groups, such as cyber-bullying and/or exploitation. Fundamental principles should guide all organisations developing and using AI content generation systems, including:

  • Implement robust safeguards to prevent the misuse of personal information.
  • Ensure meaningful transparency about AI system capabilities, safeguards, acceptable uses and the consequences of misuse. 
  • Provide effective and accessible mechanisms for individuals to request the removal of harmful content involving personal information and respond rapidly to such requests. 
  • Address specific risks to children through implementing enhanced safeguards and providing clear, age-appropriate information to children, parents, guardians and educators.

Digital Omnibus legal study

The European Parliament published a study identifying interlinks and possible overlaps between different legal acts in the field of digital legislation. It analyses the European Commission’s Digital Omnibus package proposals published on 19 November 2025, distinguishing administrative simplification from more substantive recalibration of safeguards across data, privacy, cybersecurity and AI areas. The study highlights key areas of controversy (legal certainty, enforcement capacity, and impacts on rights) and sets out areas for consideration for parliamentary scrutiny, including:

  • Debate over the definition of personal data in the GDPR
  • Integrating ePrivacy into GDPR (cookie fatigue)
  • Concerns about restricting data access rights
  • Data Act consolidation
  • Centralised incident notification submission (single entry point)
  • AI timelines, burden reduction and centralisation.

Ransomware statistics

In 2025, 65 ransomware incidents were reported to the police in the Netherlands. Incident response companies responded to 40 incidents. Access is usually gained through exploiting vulnerabilities and account takeovers. In a ransomware attack, malicious software encrypts computer systems and data; hard drives, databases, backups, USB drives, and cloud data can also be affected. The victim is then blackmailed: the attacker offers the decryption code in exchange for payment.

Reporting the incident is crucial if you, as a business or individual, have been a victim of ransomware. Even if the criminals have already been paid, filing a report provides the police with vital information. A report can contain information the police can use to unlock the system, and it also helps them identify suspects.

More from supervisory authorities

GDPR survey in Germany: The North Rhine-Westphalia data protection commissioner has used a recent survey by the business association Bitkom as an opportunity to reject discussions about the complete or partial centralisation of data protection supervision.

The survey of 603 companies clearly shows that businesses in the state primarily view data protection laws as too complicated: 85% of the companies surveyed in Germany want more understandable data protection regulations, 79% call for a reform of the GDPR, and 69% demand better coordination with other regulations.

Just 33% believe that decision-making processes would be faster within a federal agency, while 44% are concerned about losing proximity to their local supervisory authority and thus a direct contact person (which implies the need for additional staff to handle a sharply increasing number of complaints and consultation services).

Session replay tools: The French data protection regulator CNIL is launching a public consultation on its draft recommendation concerning session replay tools that allow the monitoring and analysis of users’ online behaviour. The objective is to support the actors who design these tools and those who use them in their compliance. Session replay tools are used to reconstruct the complete browsing path of an Internet user on a website or a mobile app. They can, for example, be used to detect and fix bugs or optimise the structure or ergonomics of a website or mobile application. 


More official guidance

GDPR certification criteria: The North Rhine-Westphalia data protection commissioner also approved a nationwide catalogue (available in English and German) of criteria for IT solutions. Companies that meet these criteria will receive a certificate confirming their compliance with European data protection law, which they can then use for advertising purposes. The catalogue was developed by TÜV Nord Group. This is the third such approval issued by the NRW regulator.

Specifically, it addresses so-called information processing services – online banking, accounting, and AI systems, as well as search engines. The certification process, conducted by a specialised certification body, typically involves a detailed audit of the processing operations within the respective company. This audit verifies the technical and organisational measures in place, as well as compliance with the principles of the GDPR. 

Health screening campaigns via phone are possible: In Italy, the data protection authority Garante has approved the use of telephone numbers for screening, provided that adequate safeguards are respected. Healthcare companies may use adult patients’ telephone numbers, provided during previous healthcare services, to promote participation in screening campaigns required by national or regional regulations, even if the privacy notice did not expressly state this purpose at the time the data was collected.

Specifically, healthcare companies will be required to update their privacy notices, specifying that the most recent contact details collected for treatment purposes, subject to verification of their accuracy, may be used exclusively for the promotion of public prevention programs and not for other purposes (for example, scientific research or administrative activities).

In other news

Employee data access rights: The LewisSilkin legal blog analyses a recent decision from the French Court of Appeal, which confirmed that employees cannot rely on their right of access to obtain copies of entire work email correspondence or business files, merely because their name or email address appears in them. Where the material contains no substantive personal data beyond identifying information, the right of access does not extend to wholesale document disclosure.

Furthermore, the right of access cannot be used as a litigation discovery mechanism (for example, in the context of an employee dismissal, as in the case above). The court decision also reflects the ICO guidance on the Right of Access.

Reddit fine: In the UK, Reddit was fined 14.47 million pounds for children’s privacy failures. The Information Commissioner’s investigation found that Reddit did not apply any robust age assurance mechanism. The company did not have a lawful basis for processing the personal information of children under the age of 13. It also failed to carry out a data protection impact assessment to assess and mitigate risks to children before 2025. In the past year, Reddit introduced age assurance measures that include age verification to access mature content and asked users to declare their age when opening an account. The commissioner once again informed Reddit that relying on self-declaration presents risks to children, as it is easy to bypass. 

Samsung consent case: The Texas Attorney General reached an agreement with Samsung Electronics America, concerning the collection of Automated Content Recognition (ACR) viewing data from Texas consumers through Samsung smart televisions. Under the agreement, Samsung must cease collecting or processing ACR viewing data without obtaining Texas consumers’ express consent and must update its smart televisions to implement clear and conspicuous disclosures and consent screens, digitalpolicyalert.org reports.

More enforcement decisions

Ransomware attack followed by privacy fine: In Spain, data protection agency AEPD fined Sprinter Megacentros del Deporte (a sporting goods retailer) 2.6 million euros for a data breach, DataGuidance reports. A ransomware attack encrypted systems and exfiltrated data, affecting 6.3 million individuals. Notification of the data breach to data subjects was also not delivered ‘without undue delay’ and lacked specific mitigation information.


Biometric data fine: The Italian Garante has fined eCampus University 50,000 euros for unlawfully processing the biometric data of numerous participants in its online courses. The investigations revealed the lack of a suitable legal basis to justify the use of biometric systems, especially given the availability of less invasive tools.

It also emerged that the University had not conducted a data protection impact assessment before implementing the system. The violations affected a very high number of participants, over 450 students for each lesson.

Data processing agreement fine: The Polish data protection authority UODO has fined DPD Polska more than 2.75 million euros after finding serious failures in how the courier company structured its relationships with external carriers, according to an analysis by grcreport.com. These carriers participated in loading and unloading parcels and had access to address labels containing personal data. In some cases, shipments were transported in vehicles not owned by DPD Polska and for which it held no other legal title. Despite this third-party access, the company did not conclude personal data processing agreements with the carriers.

GDPR does not prevent authorities from being notified of social fraud

The Danish data protection regulator, Datatilsynet, explains that the GDPR does not contain a general prohibition on disclosing information to public authorities. On the contrary, the rules allow data to be disclosed when there is a lawful basis for processing. This may be if the disclosure is necessary to comply with a legal obligation. The question of whether, for example, an insurance company may or must disclose information on possible fraud to a public authority, therefore, depends on the specific legal basis in national legislation, including rules on confidentiality and sector-specific regulations. 

And Finally


AI models and GDPR audit tool: The French CNIL, together with other actors in the digital data domain (the ANSSI, the PEReN and Inria), is launching a call for expressions of interest to test an audit tool called PANAME that makes it possible to assess the confidentiality of AI models and their compliance with the GDPR. This project aims to develop a tool to audit the privacy of AI models. It will take the form of a library for performing data extraction and/or re-identification tests on AI models.

For more than a decade, research has shown that it is possible to extract data that was included in the training dataset, including personal data, from an AI model. This extraction can be carried out via:

  • statistical techniques at the model level, with full or partial access to the model, or
  • in the case of generative AI, direct querying of the model by instruction (prompt).
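
To make the first, statistical route more concrete, below is a minimal, illustrative sketch of a loss-threshold membership-inference test. It is a generic, textbook-style technique, not the PANAME library (which has not been published); the losses, threshold and numbers are made up.

import numpy as np

def flag_members(losses: np.ndarray, threshold: float) -> np.ndarray:
    # Models tend to assign lower loss to examples seen during training,
    # so losses below a calibrated threshold hint at training-set membership.
    return losses < threshold

rng = np.random.default_rng(0)
member_losses = rng.normal(0.3, 0.1, 1000)      # records the model was trained on
non_member_losses = rng.normal(0.9, 0.3, 1000)  # records it has never seen

threshold = 0.5  # in practice calibrated on shadow models or held-out data
tpr = flag_members(member_losses, threshold).mean()
fpr = flag_members(non_member_losses, threshold).mean()
print(f"true positive rate {tpr:.2f}, false positive rate {fpr:.2f}")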

AI geolocation: Privacy International explains that one of the most concerning capabilities of the newest AI systems is the ability to infer geographic location from images. Vision-Language Models (VLMs) can now determine where in the world any given photo was taken with striking speed and accuracy. Most people are unaware that widely accessible AI tools can identify the location of their personal photos, even when Global Positioning System (GPS) metadata has been removed. Inferring location from images without GPS data may support beneficial activities, such as robotics development or investigative journalism, but these capabilities are not free of privacy risks.
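
As a small practical aside, stripping metadata before sharing a photo remains a reasonable baseline, even though, as noted above, it does not stop content-based location inference. A minimal sketch using the Pillow library (file names are hypothetical, and the approach assumes a standard RGB photo):

from PIL import Image

def strip_metadata(src_path: str, dst_path: str) -> None:
    # Re-save only the pixel data, dropping EXIF (including GPS tags).
    with Image.open(src_path) as img:
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))
        clean.save(dst_path)

strip_metadata("holiday_photo.jpg", "holiday_photo_clean.jpg")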

Conditional Consent: an Open Proposal for How Article 88b Consent Signalling Should Work
https://techgdpr.com/blog/conditional-consent-article-88b-consent-signalling-proposal/ (25 February 2026)

Cookie consent is broken, and everyone knows it. Europeans spend an estimated 575 million hours per year clicking through consent banners. Research shows that up to 80% of users click “Accept All” when dark patterns push them toward it, which 72% of banners do. Half of websites set cookies before users make any choice at all, and 57.5% keep advertising cookies running even after users revoke consent. This is not informed consent. It is consent theatre, and the European Commission has finally acknowledged it.

The Digital Omnibus proposal, published in November 2025, introduces Article 88b to the GDPR. For the first time, EU law will require websites to accept automated, machine-readable consent signals from browsers. Users would set their preferences once, their browser would communicate those preferences to every site they visit, and controllers would be legally obliged to respect them. No more banners. No more clicking. No more dark patterns.

But here is the catch: the standards for how these signals should work have not been written yet. Article 88b delegates the technical specification to implementing acts and standardisation bodies. The decisions made in that process — what signals can express, who controls the interface, how much granularity users get — will shape consent for a generation of internet users.

That is why we published Conditional Consent: an open concept paper and technical specification proposing what Article 88b signalling should look like, designed from the user’s perspective.

The core idea: consent as conditions, not clicks

Today, consent is binary. Accept or reject, site by site, visit by visit. Conditional Consent proposes that users define rules across three dimensions:

  • Cookie purpose: functional, analytics, advertising, social media, personalisation
  • Website category: e-commerce, news, government, banking, healthcare
  • Third-party processor: first-party only, exclude specific companies, allow named providers

A user might say: “Allow functional cookies everywhere. Allow analytics on shopping sites, first-party only. Deny all advertising cookies. Block any processing involving Meta.”

This level of granularity does not exist in any consent tool today. Current Consent Management Platforms offer purpose toggles at best. Global Privacy Control — the most successful browser privacy signal, now mandated in twelve US states — can only express a binary “do not sell.” The Advanced Data Protection Control specification developed by noyb and the Vienna University of Economics and Business came closest to what we propose, supporting granular purpose-based HTTP header signalling, but never achieved real-world adoption and lacks the website category and processor dimensions.
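
For comparison, detecting today’s GPC signal server-side takes one line, because the signal is a single binary request header (Sec-GPC: 1) under the GPC proposal; the snippet below is a minimal illustration:

def gpc_opt_out(request_headers: dict[str, str]) -> bool:
    # GPC can only say "do not sell/share"; it carries no purposes,
    # site categories or processor exclusions.
    return request_headers.get("Sec-GPC", "").strip() == "1"

print(gpc_opt_out({"Sec-GPC": "1"}))  # True
print(gpc_opt_out({}))                # False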

Conditional Consent builds on all of these. It proposes an open HTTP header protocol for Article 88b signalling, combined with automated CMP interaction as a fallback — so it works on existing websites from day one, without requiring website operators to change anything.
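
To illustrate the idea of a preference engine evaluating rules across the three dimensions, here is a deliberately simplified sketch. The rule schema and matching logic are illustrative assumptions only: the Article 88b implementing standards have not been written, and this is not the Conditional Consent specification itself.

from dataclasses import dataclass

@dataclass
class Rule:
    purpose: str        # e.g. "functional", "analytics", "advertising", or "*"
    site_category: str  # e.g. "e-commerce", "news", or "*"
    processor: str      # e.g. "first-party", a named company, or "*"
    allow: bool

# The example rule set from the text: functional everywhere, analytics on
# shopping sites first-party only, no advertising, nothing involving Meta.
rules = [
    Rule("*", "*", "Meta", allow=False),
    Rule("advertising", "*", "*", allow=False),
    Rule("analytics", "e-commerce", "first-party", allow=True),
    Rule("functional", "*", "*", allow=True),
]

def decide(purpose: str, site_category: str, processor: str) -> bool:
    # First matching rule wins; anything unmatched is denied by default.
    for r in rules:
        if (r.purpose in (purpose, "*")
                and r.site_category in (site_category, "*")
                and r.processor in (processor, "*")):
            return r.allow
    return False

print(decide("analytics", "e-commerce", "first-party"))  # True
print(decide("analytics", "news", "first-party"))        # False (no matching rule)
print(decide("functional", "news", "Meta"))              # False (Meta excluded)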

What we published

The concept paper sets out the problem, the legal basis in Article 88b, six core principles for user-centric consent signalling, a detailed comparison with existing tools (GPC, ADPC, Consenter, Consent-O-Matic, IAB TCF), and a proposed architecture for a browser extension MVP.

The technical specification (pending) goes deeper: browser extension architecture, a preference engine for evaluating conditional rules, an HTTP header protocol, a CMP automation layer, chatbot-guided onboarding, and a compatibility analysis with every relevant existing standard.

These are (or will be) published under CC BY 4.0 at conditionalconsent.com. They are designed to be forked, extended, critiqued, and adopted by anyone — browser vendors, CMP providers, privacy advocates, standardisation bodies.

Why now

Article 88b has a staged timeline. Controllers must accept automated signals within 24 months of entry into force. Browser providers must enable signalling within 48 months. But the implementing standards — the technical specifications that define what those signals can actually carry — need to be developed now. Once a standard is set, it will be extremely difficult to change.

The risk is that the advertising industry shapes these standards toward the simplest possible signal — a binary accept/reject that perpetuates the current model in machine-readable form. The opportunity is to establish that the standard should support genuine conditional granularity: rules that reflect how people actually think about their privacy.

What we are asking for

We are not launching a product. We are putting a proposal on the table — early, openly, and with full documentation — so that the conversation about Article 88b implementation includes a concrete, user-centric option.

If you work in privacy, policy, browser development, or consent management, we would like your input. Read the papers. Challenge the assumptions. Propose improvements. Tell us what we got wrong. The specification is deliberately open because getting this right requires more perspectives than any single consultancy can provide.

The concept paper and technical specification are available at conditionalconsent.com.

Data protection digest 3-17 Feb 2026: When using anonymisation for deletion, controllers have differing degrees of success – EDPB
https://techgdpr.com/blog/data-protection-digest-19022026-when-using-anonymisation-for-deletion-controllers-have-differing-degrees-of-success/ (19 February 2026)

Data deletion requests

Throughout 2025, 32 supervisory authorities across the EU/EEA launched coordinated investigations into controllers’ compliance with the right to erasure under the GDPR. Now, the EDPB has published a report of the findings. As the right to deletion is not absolute, some controllers face difficulties in assessing and applying the conditions for exercising this right, including in conducting the balancing tests between the right to erasure and other rights and freedoms. Many regulators raised concerns regarding controllers not having:

  • an internal procedure or practice in place to handle erasure requests, or having an incomplete or irregularly reviewed procedure,
  • specific procedures and measures to handle erasure requests in the context of back-ups,
  • staff training,
  • adequate information provided to data subjects,
  • legal certainty on the exceptions for denying erasure requests, and
  • defined data retention periods, etc.

Multiple regulators found that controllers relying on anonymisation for deletion have varying degrees of success in correctly implementing it. In some cases, they only apply basic pseudonymisation or partial masking, although such a process would not fulfil the requirements of the GDPR regarding deletion.
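
A toy illustration of the point, with made-up data: hashing or masking an identifier keeps the record linkable to the person (pseudonymisation), whereas erasure means the identifying record is actually gone.

import hashlib

record = {"email": "jane.doe@example.com", "orders": 12}

# "Deletion" done wrong: the identifier is hashed but the row is kept.
pseudonymised = {
    "email_hash": hashlib.sha256(record["email"].encode()).hexdigest(),
    "orders": record["orders"],
}

# Anyone who knows the original email can re-create the hash and re-link it.
probe = hashlib.sha256("jane.doe@example.com".encode()).hexdigest()
print(probe == pseudonymised["email_hash"])  # True: still identifiable

# Erasure means the record (or every identifying element) no longer exists.
database = [pseudonymised]
database.clear()
print(database)  # []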


Interestingly, the majority of the polled controllers (out of 764) had not received a single request for erasure in the last two years. While controllers were often chosen because they were in particular situations (processing sensitive data, processing a very large amount of data, etc.), about 70% of controllers still received fewer than 10 requests per year. Also, it appears that certain profiles are less likely to exercise their rights (e.g., applicants in public services, citizens toward public services, contractors, or job applicants/employees), while others seem less hesitant to do so (e.g., potential customers).

Main developments 

Digital omnibus and GDPR simplification: The EDPB and EDPS issued a long-awaited statement on the simplification of the digital legislative framework in the EU. Among other things, they advised against the proposed changes to the definition of personal data. The changes go far beyond a targeted modification of the GDPR, a ‘technical amendment’ or a mere codification of CJEU jurisprudence.

Defining what is no longer personal data directly affects and narrows the scope of application of EU data protection legislation and should not be addressed in an implementing act, say the regulators. The full opinion in the context of GDPR, AI Act, and ePrivacy Directive can be read here.

UK data reform: Meanwhile, in the UK, the main provisions of the Data Use and Access Act 2025 came into force on 5 February, amending the UK GDPR and the Data Protection Act 2018. These include a new ‘recognised legitimate interests’ legal basis for data controllers, cookie consent exemptions, data reuse permissions, rules on the use of automated decision-making, more relaxed international transfers of personal data, and, in some cases, limits on data subject access requests.

Age-appropriate design code


On February 5, South Carolina signed an Age-Appropriate Design Code into law, following similar laws previously adopted by California, Maryland, Nebraska, and Vermont. According to a JD Supra analysis, covered online services must exercise “reasonable care” in the use of a minor’s personal data and in the design and operation of the covered online service. This includes features that:

  • Decrease minors’ time and activity on the service to prevent compulsive usage, severe psychological harm, and privacy intrusions;
  • Opt minors out of “personalisation recommendation systems” by default;
  • Set personal data settings to the highest level of protection by default; and
  • Collect, use, share, or retain the minimum amount of a minor’s personal data “necessary” to provide the specific elements of the covered online service, etc.

More from supervisory authorities

DPO role: Under EU law, all EU institutions, bodies, offices and agencies (EUIs) are required to appoint a data protection officer (DPO). To strengthen the effectiveness and independence of this function, the EDPS has adopted two key documents clarifying the role and protection of DPOs within EUIs.

They provide practical and up-to-date guidance on the designation of DPOs, their institutional positioning, the guarantees of independence attached to the function, and the responsibilities entrusted to them. 

Cybersecurity exercise: ENISA offers a methodology providing an end-to-end theoretical framework for planning, running and evaluating cybersecurity exercises. It ensures the right profiles and stakeholders are involved at the right time, and provides theoretical material based on lessons identified, industry best practices and cybersecurity expertise. Download the guide and the support toolkit templates here.

Games age limitation: On 4 February, the French government adopted a decree on the experimentation of games with monetisable digital objects. Among other controls, it requires operators to refuse to open a player account for any minor, or before the identity and age of the applicant have been verified. It also requires the enterprise offering a game to document the arrangements used for verification, to carry out regular checks, and to be able to demonstrate the effectiveness and compliance of those arrangements to the National Gaming Authority.

How to deal with data protection complaints


The updated UK ICO guidance reminds organisations what they need to do to meet the new requirements for handling people’s data protection complaints, as set out in the new Data Use and Access Act, although these requirements do not come into force until 19 June 2026. At a glance, the law says organisations must:

  • Give people a way of making data protection complaints;
  • Acknowledge receipt of complaints within 30 days of receiving them;
  • Without undue delay, take appropriate steps to respond to complaints, including making appropriate enquiries, and keep people informed;
  • Without undue delay, tell people the outcome of their complaints.

Read practical advice on each of these points in the original publication.

In other news

CNIL sanctions statistics: Cookies, employee surveillance and data security were the main subjects of the penalties imposed by the French data protection authority CNIL in 2025, the cumulative amount of which totalled 486,839,500 euros. Also, insufficient security of personal data, lack of cooperation with the CNIL and non-respect for the rights of individuals were the three main reasons for sanctions under the recently introduced simplified procedures. Numerous formal notices targeted websites that allowed the deposit of cookies and other trackers without respecting the consent of individuals, either by not allowing them to refuse the deposit in a simple way, or by not taking into account the withdrawal of users’ consent.

In addition, the regulator often sanctioned non-compliance with the obligations of subcontractors (processors) concerning the data entrusted to them, in particular:

  • implementing appropriate technical and organisational measures to ensure an adequate level of security;
  • only processing data on the instructions of the data controller;
  • deleting the data at the end of their contractual relationship with the data controller.

OpenClaw AI: The Dutch data protection authority AP warns against the use of OpenClaw, an AI agent tool that has become popular since last year. The platform provides users with an AI assistant to install, which can perform tasks autonomously. For that, the user has to give it full access to their computer and programs, including email, files and online services. The platform can also be vulnerable to hidden commands in websites, emails and chat messages, which can lead to account takeovers, the reading of personal data and the theft of access codes.


More enforcement decisions

Amazon Italy investigation: On 9 February, the Italian data protection authority Garante and the National Labour Inspectorate announced an investigation into Amazon regarding the processing of workers’ personal data and the use of video surveillance systems. The investigation will examine the company’s logistics hubs, with a particular focus on the distribution centres in Passo Corese and Castel San Giovanni, to determine the extent to which monitoring practices comply with the legal requirements stipulated within the Workers’ Statute, digitalpolicyalert.org reports. 

Dutch municipalities fined: The Dutch data protection authority AP fined 10 municipalities 250,000 euros for processing sensitive information without consent, according to DataGuidance. Violations included processing data on religious beliefs, family relationships, political views, and criminal or terrorism-related information. The municipalities processed this sensitive information (obtained from an external research bureau amid national counter-radicalisation efforts) without valid consent.

Swiss cookie redress case: Digitec Galaxus informed the Swiss privacy regulator FDPIC that it had implemented its formal recommendation that customers be given the option to object to the processing of their personal data for marketing purposes. Following criticism over excessive data processing, users can now disable personalisation with one click (one-click opt-out), whereby the corresponding cookies are automatically disabled. To that end, the registration form now explicitly mentions personalisation and the right to object, and the privacy policy has been updated accordingly.

And Finally

Data brokers warning in the US: The Federal Trade Commission sent letters to 13 data brokers warning them of their responsibility to comply with the Protecting Americans’ Data from Foreign Adversaries Act of 2024. It prohibits data brokers from selling, releasing, disclosing, or providing access to personally identifiable sensitive data about Americans to any foreign adversary (North Korea, China, Russia, and Iran) or any entity controlled by those countries.

The law defines personally identifiable sensitive data to include health, financial, genetic, biometric, geolocation, and sexual behaviour information, etc.

Does the GDPR apply to my US company?
https://techgdpr.com/blog/does-the-gdpr-apply-to-my-us-company/ (10 February 2026)

Introduction

The usual assumption of most US businesses is: “the GDPR is an EU regulation, hence it does not impact my organisation.” This belief most often results in unnecessary risk. The US equivalent of this misconception would be a company registered in Texas thinking its services don’t fall under the scope of the CCPA.

The GDPR has extraterritorial effect; that is, it can apply to, and more often than not does affect, organisations outside the European Union.

Note that since Brexit, the UK has maintained the GDPR’s provisions but adapted them to its own body of law. This is known as the UK GDPR, which adds a small additional layer of complexity for transfers of data outside the UK. For the sake of simplicity, the term GDPR in this article also covers the UK GDPR.

What is the GDPR and why it has global reach

The GDPR is the common shorthand for the EU’s (and the UK’s) General Data Protection Regulation. It protects the personal data of individuals who are within the European Union, provides rights to data subjects (i.e. individuals) and lays out obligations for the organisations handling that data. It has a broad territorial scope, such that it may apply to organisations outside of the EU if certain conditions are fulfilled.

A US company may be subject to the GDPR if it is:

  1. Providing goods or services to data subjects in the European Union (EEA and UK)

This trigger is independent of payment or contractual terms. A business will be deemed to be targeting or envisaging an EU audience if it engages in any of the following activities:

  • Sending physical goods or providing access to digital services into a member state of the EU/EEA/UK;
  • Taking payments in a European currency such as Euros;
  • Running campaigns that market to email recipients in the EU/EEA/UK; and
  • Providing a website or service in a language that is widely spoken across the EU/EEA/UK.
  2. Tracking the behavior of users in the European Union

This trigger is particularly relevant to digital-first companies today. If your business is tracking or profiling users in the European Union, the GDPR will most likely apply. This includes practices like:

  • Tracking European Union website and app users with analytics tools;
  • Placing cookies or other tracking tags on the devices of users in the European Union which triggers additional requirements from the ePrivacy Directive and other local laws; and
  • Running targeted advertisement campaigns against users within the European Union on the basis of their online behavior.

Article 3 of the GDPR expressly sets out these conditions. These are detailed in additional guidance by the European Data Protection Board (Guidelines 05/2021). Registration of an organization outside of the EU does not necessarily remove a business from scope.

What constitutes personal data under the GDPR?

The GDPR defines personal data as any information relating to an identified or identifiable natural person. This definition is deliberately broad, encompassing a wider range of data than the concept of “personally identifiable information” (PII) used in other jurisdictions. It is critical for any organisation to understand what information falls under this comprehensive definition in order to determine its compliance obligations.

Personal data includes, but is not limited to:

  • Direct identifiers: a person’s name, email address, physical address, or telephone number.
  • Online identifiers: an individual’s Internet Protocol (IP) address, browser cookies, and device identifiers (IP/MAC addresses, IMEIs, etc.).
  • Pseudonyms: user IDs, vehicle identification numbers (VINs), randomly chosen usernames, hashes, etc.
  • Metadata in context: for example, timestamps.
  • Special categories of data: biometric data, such as fingerprints or facial recognition information. Sensitive data is addressed in Art. 9 of the GDPR; see also our blog article detailing the differences between PII and personal data.
  • Other information: video or photo recordings, and an individual’s location data.
  • IoT data associated with a device purchaser, owner, user, maintenance person, etc.

If your organization collects any of this information from individuals in the European Union, it is processing personal data and must assess its compliance obligations under the GDPR.
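
As a starting point for that assessment, a rough scan of logs or exports for obvious identifiers can help surface personal data you did not know you held. The sketch below only catches email addresses and IPv4 addresses; the absence of a match does not mean the data is not personal data, since indirect identifiers also count.

import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
IPV4 = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def find_identifiers(text: str) -> dict[str, list[str]]:
    # Flags only the most obvious direct/online identifiers.
    return {"emails": EMAIL.findall(text), "ipv4": IPV4.findall(text)}

log_line = "2026-02-10 login ok user=jane.doe@example.com ip=203.0.113.42"
print(find_identifiers(log_line))
# {'emails': ['jane.doe@example.com'], 'ipv4': ['203.0.113.42']}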

What if my business doesn’t comply?

Non-compliance with the GDPR can result in massive financial and reputational losses. Supervisory authorities can impose fines of up to twenty million euros or four percent of an organization’s annual global turnover, whichever is greater. The GDPR structures these administrative fines in two tiers:

  • Tier 1: up to €10 million, or 2% of the company’s total worldwide annual turnover in the preceding financial year, whichever is greater.
  • Tier 2: up to €20 million, or 4% of the company’s total worldwide annual turnover in the preceding financial year, whichever is greater.
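
A quick worked example of those ceilings: the applicable cap is the greater of the fixed amount and the percentage of worldwide annual turnover (the turnover figure below is hypothetical).

def max_fine(turnover_eur: float, tier: int) -> float:
    # Tier 1: EUR 10M or 2% of worldwide annual turnover; Tier 2: EUR 20M or 4%.
    fixed, pct = (10_000_000, 0.02) if tier == 1 else (20_000_000, 0.04)
    return max(fixed, pct * turnover_eur)

turnover = 2_000_000_000  # a company with EUR 2 billion global turnover
print(f"Tier 1 cap: EUR {max_fine(turnover, 1):,.0f}")  # EUR 40,000,000
print(f"Tier 2 cap: EUR {max_fine(turnover, 2):,.0f}")  # EUR 80,000,000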

Enforcement is also a legitimate concern for U.S. companies. For example, Clearview AI, a U.S.-based firm, was the subject of enforcement action and fines by multiple EU data protection authorities for processing EU individuals’ personal data lacking a sufficient legal basis. 

Along with fines, organizations can anticipate loss of customer trust, damage to their reputation, and legal restrictions on their data processing activities. Enforcement action against household names demonstrates that regulators are willing to act against organizations outside the European Union when the GDPR applies. 

A simple checklist for your U.S. company

To allow you to consider at a glance whether the GDPR applies to your business, ask yourself the following questions:

  • Does your company’s website, app, or service deliver goods or services to individuals in the European Union?
  • Do you use instruments that monitor the online behavior of individuals in the European Union?
  • Does your company process the personal data of any of your staff members working in the European Union?
  • Do you use any vendor tool to carry out any of that data processing for you?

If you answered yes to any of these questions, it is highly likely that your company is subject to the GDPR.
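
The same screening logic, as a tiny helper you could drop into an internal assessment script; the parameter names are ours, and a “no” across the board is not legal advice.

def gdpr_likely_applies(*, offers_goods_or_services_to_eu: bool,
                        monitors_eu_individuals: bool,
                        processes_eu_employee_data: bool,
                        vendors_process_such_data: bool) -> bool:
    # Any single "yes" is enough to bring the GDPR into play.
    return any([offers_goods_or_services_to_eu, monitors_eu_individuals,
                processes_eu_employee_data, vendors_process_such_data])

print(gdpr_likely_applies(offers_goods_or_services_to_eu=False,
                          monitors_eu_individuals=True,
                          processes_eu_employee_data=False,
                          vendors_process_such_data=False))  # True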

Real-life examples of when the GDPR applies

  • An online store in the United States accepting payment in euros and shipping goods to customers in the European Union;
  • A company processing payroll for a remote employee working in the European Union;
  • A marketing company running targeted campaigns aimed at audiences within the European Union.

Conversely, a strictly internal website with no European customer targeting and only incidental EU visits generally will not be subject to the GDPR.

Special Case: United States companies with EU-based employees

The processing of employees’ personal data in the European Union triggers GDPR obligations. Examples include maintaining personnel records, processing sensitive information, and monitoring work performance. Paying an employee in the European Union without additional data processing might not necessarily trigger full GDPR compliance requirements; even so, HR processes need to be carefully reviewed. Please see our blog article on how the GDPR affects HR data for non-EU companies for further information.

Your next steps toward compliance

If your business is subject to the GDPR, it’s essential to take a proactive approach to compliance.

  • Carry out a data mapping exercise: this will lead to Records of Processing Activities, the details of which are outlined in Art. 30 of the GDPR (a minimal record structure is sketched after this list). Record all personal data your organization gathers and processes, the purpose of the processing, and where the data is stored;
  • Determine a lawful basis for all your data processing activities: this provides a documented and valid legal rationale for collecting and using personal data, e.g., user consent, contractual necessity with the person, a legitimate interest of your organization, or an EU legal obligation;
  • Draft accessible privacy notices: provide an intelligible and accessible privacy notice describing data collection, purposes, storage, and data sharing practices;
  • Respect the rights of data subjects: enable individuals to exercise their rights under the GDPR, including access, rectification, erasure, restriction, and objection;
  • Appoint a Data Protection Officer (DPO) where required, for example, if you process vast volumes of sensitive personal data or conduct systematic monitoring of individuals;
  • Consider an EU Representative: if your business is established outside of the European Union, you may need to appoint a representative within one of the member states under Article 27; and/or
  • Seek expert advice: the GDPR is complex. For complete compliance, it is often best to obtain a professional GDPR compliance audit.
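
As promised above, here is a minimal sketch of what one Art. 30 record could look like in practice, loosely following the items listed in Art. 30(1) GDPR; the fields and example values are illustrative, not a complete or authoritative RoPA template.

from dataclasses import dataclass, field

@dataclass
class ProcessingActivity:
    name: str
    purpose: str
    lawful_basis: str
    data_categories: list[str]
    data_subjects: list[str]
    recipients: list[str]
    retention: str
    security_measures: list[str] = field(default_factory=list)

newsletter = ProcessingActivity(
    name="Newsletter mailing list",
    purpose="Sending a fortnightly digest to subscribers",
    lawful_basis="Consent (Art. 6(1)(a) GDPR)",
    data_categories=["email address"],
    data_subjects=["subscribers"],
    recipients=["email service provider (processor)"],
    retention="Until consent is withdrawn",
    security_measures=["TLS in transit", "access limited to the marketing team"],
)
print(newsletter.name, "-", newsletter.lawful_basis)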

Conclusion

Whether the GDPR affects an American business is not a matter of the business’s physical presence, but of whether it has a connection with individuals in the European Union. If your business offers goods or services to EU residents or monitors their activities, it is very likely the GDPR will affect you. The penalties for failure to comply can be extremely high, both financially and in terms of reputation.

It is suggested that all U.S. businesses conduct an internal examination of their data processing operations. If unsure, a professional GDPR compliance assessment can provide a clear and secure path forward.

Data protection digest 19 Jan – 2 Feb 2026: New PETs guide, Digital identities ecosystem & employees’ surveillance fine
https://techgdpr.com/blog/data-protection-digest-04022026-new-pets-guide-digital-identities-ecosystem-employees-surveillance-fine/ (4 February 2026)

Privacy Enhancing Technologies (PETs)

The Israeli data protection authority published a technical guide to Privacy Enhancing Technologies, available in English. PETs are a diverse family of methods, processes, and digital tools that are appropriate for different stages in the information life cycle:

  • Data collection and preparation for use: Obfuscating personal data and reducing its level of detail by removing identifiers, altering data values, or masking exact figures.
  • Data use and processing: Reducing exposure of personal data during processing, and in some cases, enabling data use without the need for viewing it during processing.
  • Control over data use: Defining rules and permissions for access to personal data and displaying data relating to the identity of the person accessing the data, the type of data, and the time of access. 
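
A toy example of the first stage above (reducing the level of detail before data is used); the record, thresholds and bands are made up for illustration.

def generalise(record: dict) -> dict:
    # The direct identifier ("name") is simply not carried over;
    # exact values are replaced with coarser bands.
    decade = (record["age"] // 10) * 10
    return {
        "age_band": f"{decade}-{decade + 9}",
        "city": record["city"],
        "salary_band": "<50k" if record["salary"] < 50_000 else ">=50k",
    }

patient = {"name": "Jane Doe", "age": 34, "city": "Berlin", "salary": 48_200}
print(generalise(patient))
# {'age_band': '30-39', 'city': 'Berlin', 'salary_band': '<50k'}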

Main developments 

Brazil adequacy decision: On 28 January, the European Commission recognised that Brazil ensures an adequate level of protection for personal data under the EU GDPR. The decision, now in force, confirms that Brazil provides comparable levels of data protection, allowing the free transfer of personal data between the two jurisdictions without additional authorisations or safeguards. The Commission also recognises the independence of the Brazilian Data Protection Authority (ANPD) and the safeguards governing public authorities’ access to personal data for law enforcement and national security purposes.


Data Privacy Framework: The EDPB has published a new version of the EU-US Data Privacy Framework FAQ for European individuals. “European individuals” means any natural person, regardless of their nationality, whose personal data has been transferred to a US company under this framework. It applies to any type of personal data processed for commercial or health purposes, and to human resources data collected in the context of employment, as long as the recipient company in the US is self-certified under the DPF.

If you believe that a company in the US has violated its obligations or your rights under the EU-US Data Privacy Framework, several redress avenues are available.

Digital omnibus: The EDPB and EDPS also adopted a joint opinion on simplification of the implementation of harmonised rules on AI. Among other things, the EDPB and the EDPS recommend maintaining the standard of strict necessity currently applying to the processing of special categories of personal data for bias detection and correction in relation to high-risk AI systems. They also support the creation of EU-level AI regulatory sandboxes to promote innovation and help SMEs, as well as AI literacy obligations for system providers and deployers. The full opinion can be read here.

HIPAA Notice

In the US, if your company provides health benefits or qualifies as a covered entity under the Health Insurance Portability and Accountability Act (HIPAA), it is important to update your Notice of Privacy Practices (NPP) by 16 February to remain compliant. The notice must reflect new and more restrictive requirements related to protected health information (PHI), in particular on the disclosure of patients’ substance use disorder records. Follow-up steps may include assessing related policies, training, materials, and business associate agreements (BAAs) for consistency.

You can also read the latest epic.org report on the health data privacy crisis in the US here.

More from supervisory authorities

M&A: Before a planned company sale, large amounts of data are often processed as part of a due diligence review. This can include personal data, particularly of employees, customers, and suppliers. The Liechtenstein Data Protection Authority has compiled information (in German) regarding which data protection regulations must be observed. This information does not replace an individual assessment and is not exhaustive. 

Camera surveillance in public transport: The Dutch data protection authority states that permanent camera surveillance at employees’ designated workstations is not permitted. Cameras may only be used when strictly necessary, for example, for safety during incidents, and not for systematic monitoring or evaluation of employees. For the data controller, this includes technical adjustments to cameras, adapting internal protocols, and providing clear instructions to employees.

AI tools safe usage: The Spanish AEPD has published the main principles of safe, responsible, and conscious use of AI. Among the recommendations, the privacy regulator advises against sharing personal data with AI – full name, address, telephone number, ID/NIE, images of people, or sensitive or delicate information – medical, financial or contractual details, geolocation. In the workplace, the agency emphasises the importance of following the information and security policies of each organisation and, in particular, of not including information that reveals confidential data of the entity, its staff or clients.

Digital identities ecosystem

Verifiable Digital Credentials (VDCs) can represent a wide range of data, from a driver’s license to a diploma to proof of age, explains America’s NIST. However, their interoperability requires a common set of standards and protocols for issuing, using, and verifying VDCs. As VDCs gain traction for both in-person and online identity verification, two key standards are helping to define this space; see their comparison in the original publication.

In parallel, the German Federal Office for Information Security (BSI) has issued the updated Technical Guideline for Biometric Authentication Systems (in German), which can be used for significantly more use cases of facial and fingerprint recognition through smartphones or access control systems. 

Cookie policy

The Latvian data protection authority reminds us of the essentials of a cookie policy, which provides the user with clear information about how their data is processed when cookies are used. The document, published on the website, must explain in a user-friendly way: a) what cookies the website uses; b) for what purpose they are used; and c) who their recipients are.

The multi-layered approach ensures that the most important information about the use of cookies on the website is provided in a concentrated manner (in the cookie pop-up notification or banner), including an indication of where more detailed information can be found (cookie policy). Cookie policies are often confused with privacy policies (by briefly including information about cookies among what is described in the privacy policy). However, to ensure transparency, information should be provided to users separately – in two documents or at least in clearly separated “blocks” of information. 

Shopping cart reminder e-mail

According to the Saxony data protection commissioner, retailers often send a reminder email pointing out an incomplete purchase process. Despite regular complaints about such communication, there are no data protection concerns regarding a one-time shopping cart status update via email. These automatically generated messages must be distinguished from unsolicited advertising and are considered technical support.

Given the customer’s expectations and the recipient’s perspective, it is at least realistic to expect a technically triggered status update during the contract negotiation phase, in accordance with Art. 6 of the GDPR. At the same time, the data processing known as reminder emails is subject to information requirements and must be appropriately indicated in the notices pursuant to Art. 13 of the GDPR.

In other news


Excel file disclosure: The Romanian regulator ANSPDCP imposed fines totalling 15,000 euros against Continental Automotive Products SRL for breaches of the GDPR principles of data minimisation, accountability, and the security of processing. The investigation followed the controller submitting a personal data breach notification concerning the repeated internal distribution of an Excel file containing a consolidated list of employees, including medical data from medical certificates relating to numerous employees and former employees over a period of time. 

GM driver data ban: America’s Federal Trade Commission finalised an order against General Motors and its OnStar subsidiary after the automaker secretly collected and sold detailed driving data from millions of vehicles without consumer consent.  The final order approved by the Commission imposes a five-year ban on GM disclosing consumers’ geolocation and driver behaviour data to consumer reporting agencies. And for the entire 20-year life of the order, GM will be required to:

  • obtain affirmative express consent from consumers before collecting, using, or sharing connected vehicle data, with some exceptions, such as for providing location data to emergency first responders;
  • create a way for all US consumers to request a copy of their data and seek its deletion;
  • give consumers the ability to disable the collection of precise geolocation data from their vehicles if their vehicle has the necessary technology; and
  • provide a way for consumers to opt out of the collection of geolocation and driver behaviour data, with some limited exceptions.


Chromebook case

The Danish data protection authority decided in the Chromebook case regarding 51 municipalities’ use of Google’s products for teaching in primary schools. The regulator issues serious criticism and warns the municipalities about their setup of the programs in question and about the use of sub-processors outside the EU. In addition, it states that, as data controllers, municipalities cannot lawfully use products that contain unclear processing constructs. Finally, they must have access to the necessary resources to ensure lawful processing of personal data, including in situations where the contractual basis for the product changes.

Microsoft 365 Education

The Austrian data protection authority upheld a complaint filed by a pupil, represented by the European Centre for Digital Rights (NOYB), against Microsoft regarding the use of tracking cookies in Microsoft 365 Education. The decision relates to the installation and use of non-essential cookies on the device of a minor using Microsoft 365 Education at an Austrian school.  The authority also found that no valid consent had been obtained, digitalpolicyalert.org reports.

More enforcement decisions

Employees’ geolocation: The Italian regulator Garante fined a company in the agricultural seed selection and production sector 120,000 euros for unlawfully processing the personal data of five employees. As part of a multinational group, at the direction of its Swiss parent company, it installed a device on its company vehicles that unlawfully collected data on employees’ business and private travel (time, mileage, fuel consumption, and driving style) for the purpose of assigning a monthly score. The collected data was retained for 13 months and used to evaluate employee driving behaviour and to implement any corrective measures. 

Access to a fired worker’s email: Garante also ruled that the content of emails, contact information, and any attachments fall within the definition of correspondence and are therefore protected by the right to confidentiality. In the related case, the regulator fined a company 40,000 euros for violating the confidentiality of a CEO’s email account after his employment ended. After receiving a disciplinary letter that resulted in dismissal,  he asked the company to disable the email account, forward any messages received in the meantime to his personal email address, and activate an automatic reply. However, this request remained unanswered. 

France Travail: The French CNIL, meanwhile, fined France Travail 5 million euros for failing to ensure the security of job seekers’ data. In 2024, attackers managed to break into the agency’s information system. They used social engineering techniques to take over the accounts of CAP EMPLOI advisors, who are responsible for supporting people with disabilities. The attackers accessed the data of everyone currently registered or registered at any point over the past 20 years. However, they did not gain access to the complete files of job seekers, which may include health data.

And finally

Change your password: According to the German BSI, a blanket password change is no longer an effective security measure. Frequent password changes often lead consumers to use weak, easily predictable passwords. Password managers help to keep track of passwords. However, even a complex password does not offer 100% protection. Instead, the BSI recommends activating two-factor authentication (2FA).

Australia child accounts ban: According to the Guardian, Snapchat banned or disabled the accounts of around 415,000 Australian users detected as being under the age of 16, in order to comply with the new under-16 social media prohibition. In December, Snapchat was one of ten platforms required to restrict people under 16 (4.7 million accounts in total) from using their services. However, further allegations have surfaced since the prohibition took effect, with some claiming that Snapchat's facial age verification was easily circumvented by teens.

Data protection digest 4-18 Jan 2026: Legitimate Interests Assessment, AWS Europe Sovereign Cloud, Google settlement over child data (https://techgdpr.com/blog/data-protection-digest-22012026-legitimate-interests-aws-europe-sovereign-cloud-google-settlement-over-child-data/, 22 Jan 2026)

Legitimate Interests Assessment (LIA)

The Hamburg Data Protection Commissioner has provided a comprehensive questionnaire for determining whether the legitimate interests legal basis can be relied on for processing. It helps controllers examine and document precisely what their interest in the data processing is and whether the rights and interests of the data subjects are adequately considered. It guides users step by step through the most important checkpoints:

  • Determination: What objectives are pursued with the data processing, and are these legally permissible?
  • Necessity: Is the processing necessary, and is only the required personal data collected?
  • Balancing: Are the rights and interests of the individuals concerned sufficiently considered and protected?
  • Documentation and compliance: Are the audit procedures recorded and regularly updated?

You can download the LIA questionnaire in German or the LIA questionnaire in English.

EDPB updates

The European Data Protection Board welcomes comments on the recommendations on the elements and principles to be found in Processor Binding Corporate Rules (BCR-P). Comments should be sent by 2 March. BCRs are a tool for providing appropriate safeguards for transfers of personal data, by a group of undertakings engaged in a joint economic activity, to third countries that do not ensure an adequate level of protection pursuant to the GDPR. The recommendations clarify when BCR-P can be used, namely only for intra-group transfers between processors, where the controller is not part of the group. Read more about the scope of BCR-P and its interplay with data processing agreements here.

Other developments

AWS Europe Sovereign Cloud: The German Federal Office for Information Security (BSI) has announced its support for US cloud provider Amazon Web Services in the design of security and sovereignty features for its new European Sovereign Cloud (ESC): an independent cloud infrastructure located entirely within the EU, whose operation will be technically and organisationally independent of the global AWS instance.

Later this year, the BSI will publish general sovereignty criteria for cloud computing solutions based on the new framework. It will serve as a basis for assessing the degree of autonomy of cloud solutions and can also be used in procurement processes. 

HIPAA Security Rule: In the US, for HIPAA-covered entities and business associates, the HIPAA Security Rule requires ensuring the confidentiality, integrity, and availability of all electronic protected health information (ePHI) that the regulated entity creates, receives, maintains, or transmits. To that end, the US Department of Health and Human Services has published the latest recommendations on System Hardening and Protecting ePHI. The measures include: 

  • patching known vulnerabilities
  • removing or disabling unneeded software and services
  • enabling and configuring security measures that sometimes intersect with some of the technical safeguard standards and implementation specifications of the HIPAA Security Rule, such as access controls, encryption, audit controls, and authentication.

GDPR certifications and codes of conduct

France's CNIL maps the deployment of GDPR compliance tools across Europe. Two maps list the certifications and codes of conduct approved by national supervisory authorities or by the European Data Protection Board since the entry into force of the GDPR. These instruments may operate at either the national or European level. Certification (Art. 42 of the GDPR) makes it possible to demonstrate that a product, service, or data processing activity meets the data protection criteria set out in an approved certification scheme. A code of conduct (Art. 40 of the GDPR) translates the Regulation's obligations into concrete, sector-specific rules and becomes binding on its members.

UK international transfers

The UK Information Commissioner published updated guidance on international transfers of personal data, making it quicker for businesses to understand and comply with the transfer rules under the UK GDPR. It sets out a clear 'three-step test' that organisations can use to identify whether they are making restricted transfers. New content also provides clarity on areas where organisations have questions, such as roles and responsibilities, reflecting the complexity of multi-layered transfer scenarios.

Multi-device consent

The French regulator also published its recommendations (in French) on the collection of cross-device consent. When a user accesses a website or a mobile app, they express their choices about the use of cookies or other trackers on a device connected to their account. These choices are then automatically applied to all devices connected to that account, including, but not limited to, their phone, tablet, computer or connected TV, and the browser or app they are using. Users must therefore be clearly informed that their choices are tied to their account in this way.
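
A minimal sketch of the account-level mechanism described above might look like the following; the class and field names are hypothetical and are not part of the CNIL recommendations.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AccountConsent:
    """Cookie/tracker choices stored once per account and applied to every logged-in device."""
    account_id: str
    choices: dict[str, bool]      # e.g. {"analytics": False, "advertising": False}
    recorded_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


def choices_for_device(consent: AccountConsent, device_id: str) -> dict[str, bool]:
    # The same account-level choices apply regardless of which device or app is used;
    # the user must be clearly informed that their choices follow their login.
    return consent.choices


# Usage: a refusal recorded on the phone is also honoured on the connected TV.
consent = AccountConsent("user-42", {"analytics": False, "advertising": False})
print(choices_for_device(consent, "connected-tv"))
```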

More from supervisory authorities

Remote job interviews: According to the Latvian regulator DVI, an employer may collect the content of a remote job interview using AI tools if an appropriate legal basis can be applied. Such data processing may be carried out based on the candidate’s consent or the legitimate interests of the company. Consent must be freely given, specific, unambiguous and informed. If the processing is carried out based on legitimate interests, a balancing test of the interests of both parties must be carried out before such processing is initiated.

Regardless of the chosen legal basis, the data controller is obliged to inform the candidate before the interview about the planned data processing during the interview, including the use of AI tools, the purposes of processing, the data retention period and the candidate’s rights. The candidate has the right to object, and such objections must be taken into account; in the event of potential harm, the processing must be stopped.

Cybersecurity guide: The Australian Cyber Security Centre published guidance with a checklist on managing cybersecurity risks of artificial intelligence for small businesses when adopting cloud-based AI technologies. Reportedly, more small businesses are using AI through applications, websites and enterprise systems hosted in the public cloud like OpenAI’s ChatGPT, Google Gemini, Anthropic’s Claude, and Microsoft Copilot. Before adopting AI tools, small businesses should understand the related risks and ways to mitigate them, including: 

  • data leaks and privacy breaches
  • reliability and manipulation of AI outputs
  • supply chain vulnerabilities.

Data subject rights in the event of a bankruptcy

The Norwegian data protection authority has imposed a fine on Timegrip AS. The case concerns a retail chain that went bankrupt, whose employees needed to document the hours they had worked. Timegrip had been the data processor for the retail chain until the bankruptcy and stored this data, but it would not provide the data to either the bankruptcy estate or the employees themselves.

Timegrip argued that the company did not have the right to provide the complainant with a copy because a data processor can only process personal data on the basis of an instruction from the controller. Since the controller retail chain had gone bankrupt, Timegrip claimed that no one could give them such an instruction. At the same time, Timegrip refused access requests from 80 different individuals, despite the company being aware that they were in a vulnerable situation and dependent on the timesheets to document their salary claims. 

In addition, it was Timegrip that made decisions about essential aspects of the processing, such as what the data could be used for, the storage period and who could have access to the personal data. In other words, it was clear that it was Timegrip that exercised the real control over the personal data.

Google multimillion-dollar settlement over child data

In the US, a federal judge granted final approval for a 30 million dollar class action settlement against Google, after six years of litigation with parents claiming the tech giant violated children’s privacy by collecting data while they watched YouTube videos. Although Google doesn’t charge for access to YouTube, the company does use it as a revenue source. It collaborates with advertisers and the owners of popular YouTube channels to advertise on specific videos, with Google and the channel owners splitting the payments received from advertisers.

In other news 

Free mobile fine: The French CNIL issued two sanctions against the companies FREE MOBILE and FREE, imposing fines of 27 and 15 million euros respectively, over the inadequacy of the measures taken to ensure the security of their subscribers' data. In October 2024, an attacker managed to infiltrate the companies' information systems and access personal data concerning 24 million subscriber contracts, including, for people who were customers of both companies, their IBANs.

The investigation has shown that the authentication procedure for connecting to the VPN of both companies, used in particular for the remote work of the company’s employees, was not sufficiently robust. In addition, the measures deployed by the companies in order to detect abnormal behaviour on their information system were ineffective.

Major university data breach: In Australia, a cyberattack compromised the personal information of students from all Victorian government schools. An unauthorised external third party accessed a database containing information about current and past school student accounts, including student names, school-issued email addresses, and encrypted passwords. In the opinion of an Australian legal expert from Moores, who analysed the breach, certain factors tend to correlate with such incidents. These include:

  • Adoption of new CRMs and platforms (including leaving administrator access open, and having incorrect privacy settings, which make online forms publicly searchable);
  • Keeping old information which is no longer required;
  • A spike in emails sent to incorrect recipients on Fridays and in the lead-up to school holidays;
  • Spreadsheets sent via email (instead of SharePoint, for example).

Business email compromise

Business Email Compromise (BEC) is currently one of the fastest-growing forms of digital fraud, according to the Dutch National Cybersecurity Centre. In BEC, criminals pose as trusted individuals within an organisation, often a director or manager, but also a colleague, supplier, or customer.

The criminals’ goals can vary, such as changing account numbers, obtaining login credentials, stealing sensitive information, or using compromised accounts for new phishing campaigns. The power of BEC lies not in its technical complexity but in exploiting the principles of social influence. BEC fraudsters cleverly utilise subtle social pressure, for example, by capitalising on scarcity by creating a sense of urgency, exploiting reciprocity by first building trust or asking for small favours, or relying on an authority figure. 

And finally 

AI prompting guide: IAB Europe has published its AI Prompting Guide. It provides practical, reusable techniques that can be applied immediately, including managing risks such as hallucinations, sensitive data exposure, bias, and prompt injection. Some of these risks can be addressed through careful prompting, review, and user judgement, while others require more structural safeguards such as validation, monitoring, and clear boundaries around how models are used.

For instance, sensitive data exposure occurs when confidential, personal, or proprietary information is included in prompts or generated in outputs inappropriately. This can involve personal data, commercial secrets, or information subject to legal or contractual restrictions. The mitigation strategy would include: 

  • removing or anonymising sensitive information before including it in prompts 
  • limiting the amount of context shared to what is strictly necessary for the task 
  • following organisational guidance on approved tools and data handling, and 
  • applying access controls where models are integrated into workflows. 

For sensitive use cases, ensure outputs are reviewed before being stored, shared, or acted upon.
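
As a rough illustration of the first two mitigation bullets above, the sketch below strips obvious direct identifiers from text before it is included in a prompt. The patterns and names are illustrative assumptions only; real deployments need broader, context-aware detection and remain subject to organisational data-handling rules.

```python
import re

# Simple patterns for obvious direct identifiers; real deployments need wider,
# context-aware detection (and often human review) before prompts leave the organisation.
PATTERNS = {
    "iban": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{10,30}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}


def redact(text: str) -> str:
    """Replace matches with a placeholder so sensitive values never reach the prompt."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label.upper()} REDACTED]", text)
    return text


prompt = "Summarise the complaint from jane.doe@example.com, IBAN DE89370400440532013000."
print(redact(prompt))  # identifiers are replaced before the prompt is sent to any model
```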

Data protection digest 3 Jan 2026: Improvements are being made to GDPR enforcement, US consumer privacy, and emerging "Shadow AI" concerns (https://techgdpr.com/blog/data-protection-digest-03012026-improvements-are-being-made-to-gdpr-enforcement-us-consumer-privacy-and-emerging-shadow-ai/, 7 Jan 2026)

GDPR enforcement simplified

A new regulation came into force on 1 January, supplementing the GDPR. It speeds up the work of data protection authorities in enforcement cases that involve multiple countries in the EU/EEA. The regulation provides, among other things, for time limits, stages of investigation, the exchange of information between authorities, and the rights of the parties concerned. In future, data protection authorities will have to issue a resolution proposal on a cross-border case as a rule within 12-15 months. In the most complex cases, the deadline can be extended by 12 months. The regulation will apply from April 2027. 

UK Adequacy decision

The European Commission adopted two new adequacy decisions for the UK – one under the GDPR and the other under the Law Enforcement Directive – valid until 27 December 2031. Under the new decisions, personal data can continue to flow from the EU to the UK without the need for additional safeguards. Following Brexit, the Commission adopted two adequacy decisions vis-à-vis the UK in 2021, each containing a sunset clause. Those decisions were due to expire in mid-2025 but were extended until the end of the year. The EDPS has since issued an opinion on the new decisions.

More legal updates

US consumer privacy updates: In Kentucky, as well as Indiana, Rhode Island and several other states, new GDPR-inspired consumer data privacy legislation took effect on 1 January. In Kentucky in particular, the new law establishes the rights to confirm whether data is being processed, to correct inaccuracies, to delete personal data provided by the consumer, to obtain a copy of the consumer's data, and to opt out of targeted advertising, the sale of data, or profiling, along with requirements for the entities that control and process that data.

Similarly, in January, new regulations became effective in California regarding a risk-assessment framework for certain high-risk data processing activities, as well as transparency and notice requirements, disclosure of sensitive personal information, data breach reporting, consumer rights requests, and data collection and deletion by data brokers

AI use by banks

The Hungarian data protection regulator issued a report on the processing of personal data by AI systems used by banks in Hungary (available in English). Some good practices indicated by the report include:

  • AI recognition of images, voices and texts must be reliable, without compromising data security. Principles of data minimisation and storage limitation must be observed.
  • The quality of the data used for AI training is important, as is identifying whether the training data needs to be linked to a specific natural person. In many cases, pseudonymisation or anonymisation can be used to mitigate privacy risks before training (a minimal pseudonymisation sketch follows this list).
  • The use of 'Shadow AI' is becoming a new phenomenon. It covers all cases where users within an organisation use AI systems, either for work or for some personal use, on the organisation's IT infrastructure, in a way that is unregulated, non-transparent and uncoordinated from the organisation's point of view.
  • Certain banks under review also use analytical models to analyse and predict creditworthiness and product affinity, whose precise classification may raise questions. These models often operate on a statistical basis but may also have an AI-based component, and appropriate safeguards must be applied.
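
The pseudonymisation sketch referenced in the list above could, in its simplest form, replace direct identifiers with keyed hashes before records are used for training. The key, field and function names are hypothetical, and keyed hashing is pseudonymisation rather than anonymisation, since whoever holds the key can still re-identify individuals.

```python
import hashlib
import hmac

# Hypothetical key; it should be stored and rotated outside the training environment.
SECRET_KEY = b"rotate-and-store-this-key-separately"


def pseudonymise(customer_id: str) -> str:
    """Replace a direct identifier with a keyed hash so training data cannot be
    linked back to a person without access to the key."""
    return hmac.new(SECRET_KEY, customer_id.encode(), hashlib.sha256).hexdigest()


record = {"customer_id": "AT-000123", "age_band": "30-39", "product_affinity": 0.72}
training_record = {**record, "customer_id": pseudonymise(record["customer_id"])}
print(training_record)
```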

More from supervisory authorities

EU Data Act: The French privacy regulator CNIL explained how the EU Data Act is going to reform the EU digital economy, gradually implemented through 2026-2027. The Act sets fair rules on the access and use of personal or non-personal data generated by connected objects. It allows anyone who owns or uses connected products to access the data generated by this object. It also facilitates their sharing with other actors, in particular by prohibiting unfair contractual clauses.

The implementation of this regulation must be done in conjunction with the GDPR. In particular, it provides that in the event of a contradiction between the two texts, it is the GDPR that prevails when personal data is concerned.

Similarly, the Data Governance Act, which has set up new trusted intermediaries to encourage voluntary data sharing, should be taken into account.

Bodycam use: At the end of December, the CJEU ruled in a case regarding a data controller's obligation to provide information when collecting personal data via body-worn cameras used by ticket inspectors on public transport. The collection of personal data by means of body-worn cameras constitutes collection directly from the data subject, so the information obligation under Article 13 of the GDPR must be respected at the time of collection. The information obligation can operate at several levels: the most important information can, for example, be stated on a warning sign, while the remaining information can be provided in another appropriate (and easily accessible) way.

Disney US settlement

On 31 December, a federal judge required Disney to pay 10 million dollars to settle FTC allegations that the company allowed personal data to be collected from children who viewed child-directed videos on YouTube without notifying parents or obtaining their consent, as required by the Children's Online Privacy Protection Rule (COPPA Rule). A complaint alleged that Disney violated the COPPA Rule by failing to properly label some videos that it uploaded to YouTube as "Made for Kids".

The complaint alleged that by mislabeling these videos, Disney allowed for the collection, through YouTube, of personal data from children under 13 who viewed child-directed videos and used that data for targeted advertising to children.

More enforcement decisions

TikTok investigations: According to vitallaw.com, the Spanish and Norwegian data protection authorities have issued warnings to TikTok users regarding the company’s transfer of personal data to China, where national laws could require that data be shared with Chinese authorities. TikTok already faces EU fines over violations of the GDPR and was ordered to stop transferring personal data to China. 

So far, TikTok has been granted an interim injunction that allows the company to continue transferring personal data to China until the case is resolved. As a result, regulators are warning users to read the online platform’s notifications and privacy policies, check their privacy settings and think about what they share in the app. It is also recommended that businesses consider whether to continue using TikTok and conduct risk assessments.

PCRM software fine: Finally, the French CNIL has fined Nexpublica 1,700,000 euros for failing to implement sufficient security measures in a tool for managing relationships with users in the field of social action. Nexpublica (formerly Inetum Software) specialises in the design of computer systems and PCRM software used, in particular, by homes for people with disabilities.

At the end of 2022, Nexpublica customers filed data breach notifications with the CNIL because users of the portal had access to documents concerning third parties. The CNIL then carried out inspections of the company, which revealed the inadequacy of its technical and organisational measures. It considered that the vulnerabilities found:

  • were mostly the result of a lack of knowledge of the state of the art and basic security principles;
  • were known and identified by the company through several audit reports.

Despite this, the flaws were only patched after the data breaches.

Data protection digest 3-18 Dec 2025: E-commerce websites should offer a choice between 'guest' mode, or voluntary account creation (https://techgdpr.com/blog/data-protection-digest-22122025-e-commerce-websites-should-offer-a-choice-between-guest-mode-or-voluntary-account-creation/, 22 Dec 2025)

E-commerce user data

As a general rule, users should have the option to engage with e-commerce websites, including the ability to make purchases, without creating an account. In such cases, the EDPB recommends that e-commerce websites offer a choice: either a 'guest' mode, allowing users to make purchases without creating an account, or the option to voluntarily create an account. This approach minimises the collection and processing of personal data and therefore aligns with the GDPR's principle of data protection by design and by default. However, mandatory account creation can be justified in a limited number of cases, for example, offering a subscription service or providing access to exclusive offers.

Google antitrust investigation

The EU Commission has opened an investigation into possible anticompetitive conduct by Google in the use of online content for AI purposes – using the content of web publishers, as well as content uploaded on the online video-sharing platform YouTube. The investigation will notably examine whether Google is distorting competition by imposing unfair terms and conditions on publishers and content creators, or by granting itself privileged access to such content, thereby placing developers of rival AI models at a disadvantage. It should be noted that there is no legal deadline in the EU for bringing an antitrust investigation to an end. 

More legal updates

US AI national policy: On 11 December, President Trump signed an Executive Order on  establishing a national policy framework for AI and lifting barriers to innovation. According to digitalpolicyalert.org, the US Administration will work with Congress to establish a single national AI standard that avoids conflicting state legislation. This standard would override any state laws that contradict the policy and would include protections for children, respect for copyrights, prevention of censorship, and measures to keep communities safe. 

US immigration data: According to Privacy International, the US Government also intends to force visitors who are not required to get visas, such as British and French citizens, to submit their digital history and even DNA as the price of entry. With this much data AI tools will likely be deployed to unlock details of your life for border and immigration agencies. In particular, it wants to know all about: 

  1. ‘telephone numbers used in the last five years’
  2. ‘email addresses used in the last ten years’
  3. ‘family number telephone numbers (sic) used in the last five years’
  4. biometrics – face, fingerprint, DNA, and iris
  5. business telephone numbers used in the last five years
  6. business email addresses used in the last ten years.

If the proposed changes, published on 10 December, are adopted after the 60-day consultation, travellers will have to use dedicated apps for their ESTA application and provide biometric proof of their departure. The latter will disclose the user's location once they have left the US and run live detection on the selfie photo.

Password managers

The German Federal Office for Information Security (BSI) examined this product category and investigated the IT security features of ten selected password managers. Three out of ten stored passwords in a way that theoretically allows manufacturers access. This increases the attack surface on the manufacturer’s side, which must be mitigated by additional compensatory measures. Users must trust these additional measures.

If the password manager stores data in the cloud, consumers should be informed about the storage location and data protection measures. This information can be included, for example, on the manufacturer’s website, in the terms and conditions for using the product, or in the privacy policy.

AI Training guidance

The Swedish data protection authority IMY has investigated the possibility of using personal data to create synthetic data for AI training purposes. Such data is created to resemble the original data without being traceable to individuals. This can be very positive from a privacy perspective, although the synthesis itself involves processing personal data and therefore must comply with the GDPR. The particular project IMY investigated concerned custody cases. It therefore involved a large amount of data of a very sensitive nature, which requires special considerations and measures.
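
A deliberately naive sketch of the idea, assuming purely numeric columns: per-column statistics are estimated from the real data and new rows are sampled from them. The dataset and function are hypothetical; real projects, especially those involving sensitive custody data, rely on dedicated generation methods and must still assess re-identification risk, since column-wise sampling alone is not a privacy guarantee.

```python
import random
import statistics

# Toy "real" dataset: numeric attributes only, no direct identifiers.
real_rows = [
    {"age": 34, "income": 41000},
    {"age": 41, "income": 52000},
    {"age": 29, "income": 38000},
    {"age": 55, "income": 61000},
]


def synthesise(rows: list[dict], n: int) -> list[dict]:
    """Sample new rows from per-column normal distributions fitted to the originals."""
    columns = rows[0].keys()
    stats = {c: (statistics.mean(r[c] for r in rows), statistics.stdev(r[c] for r in rows)) for c in columns}
    return [{c: round(random.gauss(*stats[c])) for c in columns} for _ in range(n)]


print(synthesise(real_rows, 3))  # resembles the originals without copying any single row
```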

More from supervisory authorities

Medical research: The Hessian data protection commissioner has published a guide to data protection in medical research (in German). The guide presents four concrete use cases from the practice of medical research and classifies them from a data protection perspective. In particular, the cases describe the use of AI in cancer screening, pathology, intensive care, and the distinction between quality assurance and scientific research. The guide pays particular attention to the question of under what circumstances data can be considered anonymous. The use of anonymised data is especially relevant for medical research and the training of AI models. For research projects where anonymisation is not practical, the guide presents alternative legal bases under data protection law.

Consent forms: Consent is one of the lawful grounds for processing personal data. It means that a person freely, specifically and unambiguously agrees to the processing of their data for one or more purposes. Consent has to be verifiable, so that the controller can demonstrate that it was obtained in accordance with the requirements. In situations where consent is requested in person, a written form is therefore useful, as it provides clarity for both the organisation and the customer. It can be limited to the minimum information that matters most at the time of consent, so as not to overload the person with information or delay the service or process itself. The consent form must state:

  • Who will process the data (company, individual entrepreneur), with their name
  • Why the data is needed
  • What data is needed
  • How to withdraw consent
  • Customer identification (data subject's first name, last name)
  • Date and signature
  • Where to find more information about the data processing, including the duration of data storage and how to contact the controller
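
As a minimal illustration of how these elements might be captured in a system of record, the sketch below models one consent entry; the field names are hypothetical and do not reproduce any regulator's template.

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    """Captures the minimum information a written consent form should state."""
    controller_name: str          # who will process the data (company or individual entrepreneur)
    purpose: str                  # why the data is needed
    data_categories: list[str]    # what data is needed
    withdrawal_method: str        # how to withdraw consent
    data_subject_name: str        # customer identification (first and last name)
    signed_at: datetime           # date and moment of signature
    more_info_url: str            # where to find retention periods and controller contact details
    signed: bool = False


record = ConsentRecord(
    controller_name="Example OU",
    purpose="Appointment reminders by SMS",
    data_categories=["name", "phone number"],
    withdrawal_method="Email privacy@example.com or ask in store",
    data_subject_name="Jane Doe",
    signed_at=datetime.now(timezone.utc),
    more_info_url="https://example.com/privacy",
    signed=True,
)
print(record)
```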

Cambridge Analytica compensations

Eligible Australian Facebook users impacted by the Cambridge Analytica affair have until 31 December to register under a payment program established in a landmark settlement. The 50-million-dollar payment program was established by Meta Platforms as part of an enforceable undertaking the Australian Information Commissioner accepted from Meta in December 2024. This brings to an end seven years of investigation and litigation related to the Cambridge Analytica matter in Australia.

Meta data access

The Austrian Supreme Court ordered Meta to respond in full to users' personal data access requests within 14 days, including the sources, recipients and purposes for which each piece of information was used, privacy advocacy group NOYB reports. Meta's claims of trade secrets or other limitations were rejected. The company had argued that this would lead to unprecedented access to the inner systems of the platform.

Meta must also ensure that sensitive information (political views, sexual orientation, or health) is not processed together with other data unless a valid legal basis under Art. 9 GDPR applies, even if the data was collected unintentionally or would be technically impossible to distinguish. The case was brought by NOYB activist Max Schrems in 2014 and spent 11 years in the Austrian courts and before the CJEU. The plaintiff was awarded 500 euros in damages.

American Express cookie fine

The French privacy regulator CNIL fined American Express Carte France, the French subsidiary of the American Express group, 1.5 million euros for non-compliance with the rules applicable to cookies: a) by depositing trackers without having user consent, or b) despite their refusal to consent, or c) by continuing to read the trackers previously deposited despite subsequent consent withdrawal. 

In other news

Germany telecommunications fine: Due to massive violations of data protection rights, the North Rhine-Westphalia data protection commissioner has imposed a fine of 300,000 euros on a local telecommunications company. Since 2022, consumers have repeatedly contacted the regulator for the same reason: they received personalised ad letters promoting a contract for an internet and telephone connection. The recipients consistently stated that they had never had any prior contact with this company. However, the advertising letters were remarkably detailed. The recipients were only required to add their IBAN and sign the form.

Due to the design of the letters and the similarity of the company's name to that of a very well-known telecommunications provider, many consumers were unaware that this was not an offer for a different tariff with their existing provider, but rather an offer to switch providers. As a result, those affected often signed the contract documents. Only when they later realised they had switched providers did they cancel or revoke the contracts – and were then hit with a demand for a flat-rate compensation fee by the company.

Direct marketing fine: The Italian data protection authority has fined Verisure Italia for unlawful processing of personal data for marketing purposes. The measure stems from a complaint from a former customer who continued to receive unwanted promotional text messages even after objecting to the processing of his data, and from a report from a potential customer who, after requesting a quote, began receiving promotional phone calls, emails, and text messages. The communications continued despite the exercise of the right to object provided for by the GDPR. Furthermore, the regulator deemed the retention period for potential customer data envisaged for telemarketing (12 months) to be excessive. 

More enforcement actions

Data processor breach: The French CNIL imposed a fine on Mobius Solutions, the processor behind a data breach affecting users of Deezer. The company was fined 1 million euros for failing to comply with the applicable rules regarding subcontracting. In 2022, Deezer reported that its users’ data had been posted on the dark web and that its former processor, Mobius Solutions, whose services it used to carry out personalised advertising campaigns for its customers, was involved.

The processor retained a copy of the data of more than 46 million Deezer users after the end of their contractual relationship, despite its obligation to delete all such data at the end of the contract.

University data breach: The Dutch AP imposed a 175,000-euro fine on HAN University of Applied Sciences for breaching the GDPR's data security rules. A hacker used SQL injection through a web form to access HAN's database. The individual threatened to make personal data, including addresses, names, passwords, and citizen service numbers, public and unsuccessfully demanded a ransom from the university.
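
SQL injection is a long-known technique with a standard defence: parameterised queries. The sketch below contrasts the two approaches using Python's built-in sqlite3 module and a hypothetical students table; it is illustrative only and unrelated to HAN's actual systems.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (id INTEGER, name TEXT, email TEXT)")
conn.execute("INSERT INTO students VALUES (1, 'Alice', 'alice@example.edu')")

user_input = "x' OR '1'='1"  # a classic injection payload submitted through a web form

# Vulnerable: the input is concatenated into the SQL text, so the payload rewrites the query.
vulnerable = f"SELECT * FROM students WHERE name = '{user_input}'"
print(conn.execute(vulnerable).fetchall())   # returns every row

# Safe: placeholders keep the input as data, never as SQL.
safe = "SELECT * FROM students WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())  # returns nothing
```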

Password manager data breach: The UK Information Commissioner fined password manager provider LastPass 1.2 million pounds following a 2022 data breach that compromised the personal information of up to 1.6 million of its UK users. LastPass failed to implement sufficiently robust technical and security measures, which ultimately enabled a hacker to gain unauthorised access to its backup database. The incidents occurred when a hacker gained access first to a corporate laptop of an employee based in Europe and then to a US-based employee’s personal laptop on which the hacker implanted malware and was then able to capture the employee’s master password.

In case you missed it

Meta personalised ads: On 8 December, the European Commission acknowledged Meta's undertaking to offer users in the EU an alternative version of its Facebook and Instagram services that shows them fewer personalised ads, in order to comply with the Digital Markets Act. This is the first time that such a choice is offered on Meta's social networks. Meta will give users an effective choice between the existing service with personalised ads and the new alternative with fewer personalised ads.

Meta will present these new options to users in the EU in January 2026. This follows a close dialogue between the Commission and Meta after the Commission found Meta in breach of the Digital Markets Act and issued Meta a non-compliance decision related to Meta’s “consent or pay” model in April 2025.

TikTok usage risks in the EU: The Dutch AP urges users and organisations to carefully consider whether they wish to continue using TikTok and other services that transfer personal data to countries outside the EU, including China. The Irish data protection authority DPC has previously ruled that this transfer is in breach of the GDPR. In addition, the Irish court required TikTok to better inform users on data processing activities. Users can still decide whether they want to continue using TikTok under these circumstances. If not, they can (temporarily) delete the app or deactivate an account.

Data protection digest 18 Nov-2 Dec 2025: "Digital omnibus" package latest & market price of personal data already estimated (https://techgdpr.com/blog/data-protection-digest-4122025-digital-omnibus-latest-and-market-price-of-personal-data/, 4 Dec 2025)

“Digital omnibus” package latest

On 19 November, the European Commission presented proposals for amendments in the digital area legislation, including the GDPR, the Data Act, the EU AI Act, and the NIS 2 Directive. According to digitalpolicyalert.org analysis, the Digital Omnibus would amend the GDPR by:

  • changing the definition of personal data to specify any entity that is reasonably likely to have the means to identify a person,
  • exempting certain biometric data and data used by AI from the restrictions on processing special categories of personal data,
  • clarifying on further processing of personal data in the public interest or for scientific research purposes, and
  • specifying that processing of personal data that is necessary for the interests of a controller in the development or operation of an AI system can be pursued on the basis of "legitimate interests".

The Digital Omnibus would also exempt personal data processing from the cookie requirements under the ePrivacy Directive. Instead, it would amend the GDPR to maintain the consent requirement, while specifying that certain processing activities, such as electronic communications transmissions, service provision, audience measurement solely for an online service provider, and maintaining or restoring security, would be considered lawful. Websites and apps would have to allow data subjects to consent through automated, machine-readable mechanisms; browser manufacturers must likewise enable users to grant or refuse consent.
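
What such an automated, machine-readable indication would carry is exactly what future harmonisation standards are meant to define. Purely as an illustration of the idea, and not of any proposed format, a browser-conveyed signal could resemble the hypothetical structure below (all field and header names are assumptions).

```python
import json

# Hypothetical per-purpose choices a browser could transmit with each request,
# so websites and apps can honour them without showing a banner.
consent_signal = {
    "version": 1,
    "subject_choices": {
        "strictly_necessary": True,
        "audience_measurement": True,
        "advertising": False,
        "personalisation": False,
    },
    "recorded_at": "2026-03-01T09:00:00Z",
}

# e.g. serialised into a (hypothetical) request header the receiving service can parse
header_value = json.dumps(consent_signal, separators=(",", ":"))
print({"X-Consent-Signal": header_value})
```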

Finally, personal data breaches that are likely to result in a high risk to the rights and freedoms of natural persons would need to be reported to the single-entry point within 96 hours of the controller becoming aware of them. Similarly, there would be unified lists of processing activities that do or do not require a Data Protection Impact Assessment, along with a standard DPIA template and methodology.

GDPR enforcement

On 17 November, the Council of the EU adopted new rules to improve cooperation between national data protection bodies when they enforce the GDPR to speed up the process of handling cross-border data protection complaints. Main elements of the new EU regulation include:

  • Admissibility: Regardless of where in the EU a complaint is filed, admissibility will be judged based on the same information/conditions. 
  • Rights of complainants and parties under investigation: Common rules will apply for the involvement of the complainant in the procedure, and the right to be heard for the company or organisation that is being investigated.
  • Simple cooperation procedure: For straightforward cases, data protection authorities can decide, to avoid administrative burden, to settle actions without resorting to the full set of cooperation rules.
  • Deadlines: In the future, an investigation should not take more than 15 months. For the most complex cases, this deadline can be extended by 12 months. In the case of a simple cooperation procedure between national data protection bodies, the investigation should be wrapped up within 12 months.

The regulation will enter into force 20 days after its publication in the Official Journal of the EU. It will become applicable 15 months after it enters into force.

More legal updates

The European Commission has launched a whistleblower tool for the AI Act. Whistleblowers can provide relevant information in any of the EU official languages and in any relevant format. The tool provides a secure means to report potential law violations that could compromise fundamental rights, health, or public trust. The highest level of confidentiality and data protection is guaranteed through certified encryption mechanisms. Anyone can access the AI Act Whistleblower Tool and read more information about the tool and the frequently asked questions

California privacy updates: California has enacted a bill which amends the state’s data breach notification law to establish strict new reporting timelines. Beginning January 1, 2026, businesses must notify affected California residents within 30 calendar days of discovering a security incident involving personal information. For incidents affecting more than 500 residents, notice to the California Attorney General must be provided within 15 calendar days of the consumer notice. The amendment allows limited exceptions for law enforcement needs or when necessary to determine the scope of the incident and restore system integrity, JD Supra lawblog reports. 

In parallel, starting 1 January 2027, California will prohibit a business from developing or maintaining a browser, as defined, that does not include consumer-configurable functionality enabling the browser to send an opt-out preference signal to businesses with which the consumer interacts through the browser. The bill requires a business that develops or maintains a browser to make clear to consumers, in its public disclosures, how the opt-out preference signal works and its intended effect. It also grants a business that develops or maintains a browser with this functionality immunity from liability for a violation of those provisions by a business that receives the opt-out preference signal.
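
An existing example of such an opt-out preference signal is Global Privacy Control (GPC), which participating browsers convey through the Sec-GPC request header. The sketch below simply shows a request carrying that signal; the target URL is a placeholder, and nothing here reflects the bill's exact technical requirements.

```python
import urllib.request

# Global Privacy Control: the browser sends "Sec-GPC: 1" with each request to signal
# that the user opts out of the sale or sharing of their personal information.
req = urllib.request.Request(
    "https://example.com/",                 # placeholder URL
    headers={"Sec-GPC": "1"},
)

with urllib.request.urlopen(req) as resp:   # the receiving business must honour the signal
    print(resp.status)
```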

Child data protection in the EU

On 26 November, the European Parliament adopted a resolution on the protection of minors online as part of an own-initiative procedure on the topic. The resolution calls, among other things, for the implementation of an EU-wide harmonised digital minimum age of 16 for accessing social media, video-sharing platforms and AI companions without parental consent, with 13 as the minimum age for any social media use by children, even with parental consent. 

In parallel, the German Data Protection Conference, DSK, adopted a resolution calling for amendments to the GDPR to strengthen protections for children. It proposes a ban on children’s consent for profiling and advertising, limits on children’s ability to consent to special-category data processing, and clearer rights for children to access counselling and medical services privately. It also focuses on a prohibition on children consenting to automated decisions, attention to children in breach notifications, data protection by design and default, and consideration of children’s risks in data protection impact assessments, digitalpolicyalert.org sums up. 

Cloud computing

The European Commission has published non-binding Model Contractual Terms for data access and use and Standard Contractual Clauses for cloud computing contracts. They have been developed to help parties, especially SMEs, implement the provisions of the Data Act. Their use is voluntary and open to users’ possible amendments. Although they were mainly drafted for business-to-business contracts, they can also be used in relations between businesses and consumers, if relevant consumer protection rules are added. 

Three sets of Model Contractual Terms (MCTs) were drafted to cover the relationships where data sharing is mandatory, between data holders, users and data recipients of data generated when using connected products. Plus, proposed Standard Contractual Clauses (SCCs) translate the provisions of ‘cloud switching’ into ready-to-use contractual terms that can be inserted in data processing contracts:

  • SCC Switching & Exit
  • SCC Termination 
  • SCC Security & Business continuity (including provider notification of significant incidents).

Email security

The German Federal Office for Information Security (BSI) has published a white paper on requirements for the protection, transparency, and user-friendliness of webmail services, aimed at systematically and sustainably increasing consumer security. The paper considers not only technical security functions, but also usability, transparency and trust as essential components of digital sovereignty. A fundamental part of email security currently still rests on the shoulders of users, who should be familiar with two-factor authentication, passkeys and encryption. The BSI, however, sees responsibility primarily with the providers: they must provide effective procedures for authentication, encryption, spam protection and account recovery that work without major user intervention.

Data Act implementation

The Data Act has been in effect since September 2025. This new European regulation is intended to give consumers within the EU more control over the use of their data. For instance, a car owner will have the right to access the data their car collects. If repairs are needed, they can share the data with a garage of their choice, explains the Dutch data protection agency AP, which will jointly oversee the implementation process at a national level, starting from 21 November.

The Data Act and the implementing laws do not override the rules of the GDPR. In the event of conflicting rules, the GDPR takes precedence. This means that any data sharing involving personal data must comply with the GDPR, stresses the regulator. 

More from supervisory authorities

Market research data processing: In Poland, the data protection regulator UODO approved the “Code of Conduct on the Processing of Personal Data by Private Research Agencies”. The reason for the development of the code was numerous discrepancies in the processing of the personal data of research participants. As a result, in the case of identical surveys, their participants, depending on the entity conducting the study, could receive divergent information, for instance, on the legal basis for the processing of personal data. Information obligations were also fulfilled differently. The Code also provides guidance to help carry out a risk assessment or, where justified, a data protection impact assessment.

It is worth noting that the code obliges all entities that join it to appoint a Data Protection Officer (DPO).

Sound recording and CCTV: Organisations often choose to conduct video surveillance with sound recording. Sometimes they also fail to disable the camera manufacturer's default audio function. As a result, the additional risks posed not only by image capture but also by sound recording are not sufficiently assessed. In addition, the related processing of personal data is not always carried out lawfully: recording sound and recording images are two different data processing operations, so audio and video each require their own legal basis.

The processing of personal data by performing video surveillance with audio recording is not justified in most cases. There are rare situations where it is legal and permissible, mainly when it is associated with an increased risk to the essential interests of the organisation or society. Often, the legal basis for such processing can be found in the special regulatory framework applicable to a particular industry in which the organisation operates.

Employment clauses and personal data processing

Labour clauses are widely used by both public and private contracting authorities to ensure fair wages and working conditions at their suppliers. Contracting entities often require the supplier to provide documentation of its compliance with the labour clauses, typically in the form of employees' salaries, timesheets and employment contracts. This gives rise to questions about the supplier's legal basis for disclosing such personal data to the contracting authority, notes Denmark's data protection agency. To that end, there will generally be an overriding legitimate interest that can form the basis for the disclosure of the information in question.

TechSonar 2025-2026

The EDPS's latest technology-foresight guidance, the TechSonar report 2025-2026, explores six trends: agentic AI, AI companions, automated proctoring, AI-driven personalised learning, coding assistants and confidential computing. While each of these technologies serves a distinct purpose, they are deeply interconnected. Together, they illustrate how AI is progressively reshaping not only business processes and common daily tasks, but also the human experience of technology. Continue reading the full report here.

In other news

Data security in cloud-based EdTech: The US Federal Trade Commission will require education technology provider Illuminate Education, Inc. (Illuminate) to implement a data security program and delete unnecessary data to settle allegations that the company’s data security failures led to a major data breach, which allowed hackers to access the personal data of more than 10 million students

Illuminate sells cloud-based technology products and collects and maintains personal information about students on behalf of schools and school districts. In its complaint, the FTC alleged that in 2021, a hacker used the credentials of a former employee, who had departed Illuminate three and a half years prior, to breach Illuminate’s databases stored on a third-party cloud provider. 

Medical data breach: The Norwegian data protection regulator upheld its fine on Argon Medical Devices. In 2023, it issued the American company Argon Medical Devices an infringement fee of approximately 127,000 euros for violating the GDPR. In 2021, Argon discovered a security breach that affected the personal data of all of its European employees, including those in Norway. Argon sent the Norwegian regulator a breach notification long after the 72-hour deadline for reporting such breaches.

Argon believed that they did not need to report the breach until they had a complete overview of the incident and all its consequences. This view was enshrined in their procedures, and this was the basis for the delay.  The case is an important reminder that controllers must have appropriate measures in place to determine whether a breach has occurred and to promptly notify the supervisory authority and the data subject.

Mobile app gaming company fine

California’s Attorney General settled with Jam City, Inc., resolving allegations that the mobile app gaming company violated the state’s Consumer Privacy Act (CCPA) by failing to offer consumers methods to opt out of the sale or sharing of their personal information across its popular gaming apps. Jam City creates games for mobile platforms, including games based on popular franchises such as Frozen, Harry Potter, and Family Guy. In addition to 1.4 million dollars in civil penalties, Jam City must provide in-app methods for consumers to opt out of the sale or sharing of their data and must not sell or share the personal information of consumers under 16 years old without their affirmative “opt-in” consent.

Data brokers fine

The Belgian data protection authority GBA, meanwhile, has imposed a 40,000-euro fine on data broker Infobel for illegally reselling data for marketing purposes, cybernews.com reports. A consumer complained to the GBA after receiving a marketing brochure in the mail from a firm of which he was not a customer. The complainant asked how the company had obtained his information and was told that it had been provided by a media agency. The agency had obtained his information via Infobel, a data broker that in turn received it from a telecom operator.

Infobel said it had permission to sell the complainant's information to the media agency since it had secured approval from the data subjects. However, the data protection authority found that there was no explicit, informed, or unambiguous consent.

Cookie consent fine

On November 20, the French regulator CNIL fined the French company Conde Nast Publications 750,000 euros for non-compliance with the rules applicable to cookies deposited on the devices of users visiting the "vanityfair.fr" site. In particular, cookies subject to consent were placed on users' devices as soon as they arrived on the site, even before they interacted with the cookie banner to express a choice. Also, when a user clicked on the "Refuse all" button in the banner, or when they decided to withdraw their consent to the registration of trackers on their device, new cookies subject to consent were nevertheless deposited, and other cookies, already present, continued to be read.
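
The pattern sanctioned here is setting or reading non-essential cookies before, or despite, the user's choice. A server-side sketch of the compliant order of operations, using Flask purely for illustration (route, cookie and value names are hypothetical), could look like this:

```python
from flask import Flask, make_response, request

app = Flask(__name__)


@app.route("/")
def homepage():
    resp = make_response("<html>...page with a consent banner...</html>")
    # Only strictly necessary cookies may be set before the user has made a choice.
    consent = request.cookies.get("consent_choice")  # "accepted", "refused" or absent
    if consent == "accepted":
        resp.set_cookie("analytics_id", "abc123", max_age=60 * 60 * 24 * 30)
    else:
        # No choice yet, refusal, or withdrawal: no non-essential cookies are set, and any
        # previously set ones are expired rather than silently read again.
        resp.delete_cookie("analytics_id")
    return resp
```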

And finally…

Meta multi-million ruling: A Spanish court has ordered Meta to pay 479 million euros to Spanish digital media outlets for unfair competition practices and infringing the GDPR, a ruling the company will appeal, Reuters reports. The award, which will be shared among 87 digital press publishers and news organisations, relates to Meta's use of personal data for behavioural advertising.

The complaint filed by the Spanish outlets centred on Meta’s shift in the legal basis for processing personal data after the GDPR went into effect in May 2018. Meta changed “user consent” to “performance of a contract” to support behavioural advertising. Later, regulators judged that it was insufficient. Meta returned to consent as its legal foundation in 2023. The judge assessed that Meta generated at least 5.3 billion euros in advertising income during those five years.

Personal data monetisation: The French CNIL commissioned a survey on the perception of the French people regarding the use of their personal data. From a representative sample of 2,082 people aged 15 and over, 65% of them say they are willing to sell their data. Of these, only 6% would be willing to sell it for less than 1 euro per month, while 14% preferred a fee of more than 200 euros per month. 

The most common valuation was between 10 and 30 euros per month, preferred by 28% of respondents. This is consistent with recent market research based on estimates for Meta's services, according to which, at a price of 5 euros, 20% of people would be willing to sell their data and 90% of companies would be willing to buy it. Taken together, these results make it possible to approximate a market price for data of around 40 euros per month (and per subscribed service).

AI Data Retention Strategy under the GDPR and the EU AI Act: Reconciling the Regulatory Clock https://techgdpr.com/blog/reconciling-the-regulatory-clock/ Wed, 26 Nov 2025 15:11:23 +0000 https://techgdpr.com/?p=11361 Artificial Intelligence (AI) is reshaping industries, but organizations developing AI systems face a critical, often overlooked strategic risk: managing the retention of training data in compliance with European Union (EU) law. The GDPR emphasizes rapid deletion of personal data, while the EU AI Act requires long-term archival of system documentation. Navigating these conflicting requirements is […]

The post AI Data Retention Strategy under the GDPR and the EU AI Act: Reconciling the Regulatory Clock appeared first on TechGDPR.

]]>
Artificial Intelligence (AI) is reshaping industries, but organizations developing AI systems face a critical, often overlooked strategic risk: managing the retention of training data in compliance with European Union (EU) law. The GDPR emphasizes rapid deletion of personal data, while the EU AI Act requires long-term archival of system documentation. Navigating these conflicting requirements is essential for legal compliance, operational efficiency, and risk mitigation. An effective AI data retention strategy under the GDPR and the EU AI Act is now essential for organisations developing, deploying, or governing artificial intelligence systems in the European Union.

Executive Summary: The Dual Compliance Imperative and Strategic Findings

Organisations that leverage advanced data processing, particularly those developing complex Artificial Intelligence (AI) systems, face a critical and often unrecognized strategic risk: the prolonged retention of training data. European Union (EU) law establishes conflicting imperatives regarding data lifecycle management, creating a fundamental compliance challenge. The General Data Protection Regulation (GDPR) mandates personal data erasure as soon as the data is no longer required for its established purpose, while the newly implemented EU AI Act demands lengthy archival of system documentation.

The GDPR is the primary constraint on personal data, and the AI Act governs long-term retention of non-personal audit and system records.

The Inescapable Regulatory Conflict: Delete Now vs. Document for a Decade

The core of the conflict lies in the tension between personal data protection and system accountability. The GDPR is clear: personal data must be erased once its specific processing purpose is fulfilled. This is enforced by the Storage Limitation Principle (Article 5(1)(e)). Retention beyond this defined necessity, even if the data might be useful for future research or system retraining, is deemed a direct violation unless a new, distinct, and lawful purpose is established.

Conversely, the EU AI Act introduces stringent requirements for system traceability, particularly for High-Risk AI Systems (HRAS). Providers of HRAS must keep comprehensive technical documentation, quality management system records, and conformity declarations for 10 years after the system is placed on the market (Article 18, EU AI Act). This requirement applies to system records, ensuring long-term accountability, but does not override the fundamental protection afforded to individuals’ data under the GDPR.

The GDPR Foundation: The “Storage Limitation” Principle 

The entire framework of data retention under EU law rests on the GDPR’s Storage Limitation Principle (Article 5(1)(e)). This foundational rule dictates that personal data must be kept “for no longer than is necessary for the purposes for which the personal data are processed.” This is the core principle driving all retention decisions.

Personal data shall be:
(e) kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed; personal data may be stored for longer periods insofar as the personal data will be processed solely for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with Article 89(1) subject to implementation of the appropriate technical and organisational measures required by this Regulation in order to safeguard the rights and freedoms of the data subject (‘storage limitation’); 
GDPR Article 5(1)(e)

The GDPR does not set generic retention times, instead placing the full burden on the data controller to define, document, and justify a specific deletion timeline for every category of data. If personal data (which is defined broadly to include information beyond PII, such as cookie IDs) is used to train a system, the retention clock starts ticking. Organisations leveraging advanced data processing face a critical strategic risk: retaining training data for too long. The GDPR is unambiguous: personal data must be erased once its specific processing purpose is fulfilled. Retention beyond that point, even for potential future research, is a direct violation unless a new, distinct, and lawful purpose is established.

Defining the Critical Strategic Risk for GDPR non-compliance

The strategic risk is precisely defined by failing to establish, document, and legally justify a specific deletion timeline for every category of personal data used in the training process. The absence of generic retention times in the GDPR places the full burden of definition and justification squarely upon the data controller. 
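To make this concrete, a controller can keep its retention decisions in a machine-readable schedule and flag anything overdue for deletion. The following is a minimal sketch only: the categories, retention periods, and the overdue_for_deletion helper are illustrative assumptions, not values prescribed by the GDPR.

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention schedule: each personal-data category gets a documented
# purpose and a justified maximum retention period (assumed example values).
RETENTION_SCHEDULE = {
    "raw_training_records": {"purpose": "model training", "max_retention": timedelta(days=180)},
    "support_tickets":      {"purpose": "customer support", "max_retention": timedelta(days=365)},
    "web_analytics_ids":    {"purpose": "audience measurement", "max_retention": timedelta(days=90)},
}

def overdue_for_deletion(category: str, collected_at: datetime, now: datetime | None = None) -> bool:
    """Return True if data in this category has exceeded its documented retention period."""
    now = now or datetime.now(timezone.utc)
    limit = RETENTION_SCHEDULE[category]["max_retention"]
    return now - collected_at > limit

# Example: records collected 200 days ago in a 180-day category should be flagged.
collected = datetime.now(timezone.utc) - timedelta(days=200)
assert overdue_for_deletion("raw_training_records", collected)
```

A schedule like this also doubles as documentation: it records the purpose and justification the controller must be able to show for each category.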

This environment forces organizations to confront a critical trade-off: is the unproven, speculative future value of raw personal data worth the risk of fines and potential data breaches? The calculation strongly favors deletion, because:

  • Failing to define and document specific deletion timelines exposes organizations to GDPR violations.
  • Retaining data for future retraining or academic purposes is legally indefensible once the initial training purpose is fulfilled.
  • Financial penalties for non-compliance can exceed the cost of implementing compliant, minimal-data systems.

The EU AI Act Layer: Traceability and Documentation 

The EU AI Act introduces a layered approach to retention centered on system accountability rather than individual personal data. The rules are tied to the system’s risk profile, with High-Risk AI Systems (HRAS) (EU AI Act, Chapter 3) having the most stringent obligations.

Data Governance (Article 10) for HRAS requires that training, validation, and testing data sets be relevant, sufficiently representative and, to the best extent possible, free of errors and complete. While not a direct retention rule, this implicitly requires maintaining data sets for the period necessary for auditing and quality checks during the development phase.

The most critical requirement is Documentation Retention (Article 18): HRAS providers must keep key records (technical documentation, quality management system records, conformity declarations, etc.) for 10 years after the system is placed on the market. This 10-year rule applies to documentation and metadata, not to the raw personal data itself, which must be deleted sooner under the GDPR, and it does not override the Storage Limitation Principle (Article 5(1)(e)).

Raw personal data used for training must still be deleted sooner. However, the requirement for Record-Keeping (Logging) (Article 12) means that systems must automatically record events and usage logs. While these logs should ideally be anonymised, their retention period must be “appropriate”, which extends the non-personal record-keeping timeline. This mandates a long-term, non-personal data retention strategy that must be carefully integrated with the strict, short deletion cycles the GDPR requires for raw personal data.
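One way to keep such logs outside the GDPR’s scope is to design them so that direct identifiers never enter the log in the first place. The sketch below is illustrative only: the field names, the allowed schema, and the log_event helper are assumptions, not an Article 12 template.

```python
import json
import time

# Fields deliberately limited to non-personal system metadata (assumed schema).
ALLOWED_FIELDS = {"event", "model_version", "timestamp", "outcome", "latency_ms"}

def log_event(record: dict, sink) -> None:
    """Write a usage-log entry, dropping anything outside the allowed non-personal schema."""
    safe = {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
    safe.setdefault("timestamp", time.time())
    sink.write(json.dumps(safe) + "\n")

# Usage: user_id and free-text input are silently discarded before the entry is written.
with open("system_events.jsonl", "a") as f:
    log_event({"event": "inference", "model_version": "1.4.2",
               "outcome": "approved", "latency_ms": 82,
               "user_id": "12345", "input_text": "free text that must not be logged"}, f)
```

Filtering at write time, rather than scrubbing logs afterwards, keeps the long-lived audit trail non-personal by construction.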

Blending the GDPR and EU AI Act Requirements

The intersection of the GDPR and the EU AI Act necessitates a blended compliance strategy, particularly concerning purpose and identification. The GDPR’s Purpose Limitation principle (Article 5(1)(b)) demands that the purpose for processing, such as system training, be explicitly defined. This definition directly dictates the maximum legal retention period for personal data.

Personal data shall be:
(b) collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes; further processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes shall, in accordance with Article 89(1), not be considered to be incompatible with the initial purposes (‘purpose limitation’);
GDPR Article 5(1)(b)

Implementing De-Identification in Your AI Data Retention Strategy under the GDPR and the EU AI Act

The best path for long-term data use is de-identification:

  • Pseudonymisation only reduces identifiability; the data remains personal data under the GDPR, and the Storage Limitation Principle still applies.
  • Anonymisation is the only legal release valve: if the data is permanently and irreversibly stripped of identifiers, it is no longer considered personal data (GDPR Recital 26) and can therefore be retained indefinitely (see the sketch below).
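The distinction matters in practice. The sketch below, which assumes a small pandas dataframe with a user_id column and a spend metric, contrasts pseudonymisation (a keyed hash, still reversible for whoever holds the key) with one simple anonymisation approach via aggregation; whether any given technique is truly irreversible always needs a case-by-case assessment.

```python
import hashlib
import pandas as pd

df = pd.DataFrame({"user_id": ["alice", "bob", "alice"],
                   "country": ["DE", "FR", "DE"],
                   "spend":   [12.0, 7.5, 3.0]})

# Pseudonymisation: identifiers are replaced by a keyed hash, but the mapping is
# reproducible for whoever holds the key, so this remains personal data under the GDPR.
SECRET_KEY = b"rotate-me"  # assumed key management, out of scope here
df["pseudo_id"] = df["user_id"].map(
    lambda u: hashlib.sha256(SECRET_KEY + u.encode()).hexdigest()[:16])

# One simple anonymisation approach: drop identifiers and keep only aggregates
# over groups, so that (if groups are large enough) no individual can be singled out.
aggregate = (df.drop(columns=["user_id", "pseudo_id"])
               .groupby("country", as_index=False)["spend"].mean())
print(aggregate)  # country-level averages only; retainable long term if truly non-identifying
```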

It’s critical to remember that while the raw personal data must be deleted, the trained system itself (the output) can be retained.

Reconciling the GDPR’s Right to Erasure with the EU AI Act Traceability

The most direct legal challenge is reconciling the GDPR’s Right to Erasure (Article 17) with the ongoing need for system traceability under the AI Act. If a system is trained on personal data, the controller must maintain the technical ability to honor an erasure request.

This is the Purpose Limitation Conflict: if the initial purpose (training) is complete, retaining the raw personal data is a violation of the GDPR. Developers must implement technical solutions like secure deletion protocols immediately after a system is finalised. Using robust, irreversible anonymisation is the only way to retain data sets without triggering the GDPR’s strict retention clock.
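As a rough sketch of such a protocol, the training pipeline could finish with an explicit deletion step that removes the raw dataset and verifies it is gone. The paths and the finalise_model wrapper below are hypothetical, and a production system would also need to cover backups, replicas, and any copies held by processors.

```python
import shutil
from pathlib import Path

def secure_delete_training_data(dataset_dir: Path) -> None:
    """Remove the raw training dataset once the model is finalised and verify removal."""
    if dataset_dir.exists():
        shutil.rmtree(dataset_dir)  # delete the primary copy
    assert not dataset_dir.exists(), "raw training data still present after deletion"

def finalise_model(model_path: Path, dataset_dir: Path) -> Path:
    """Training is complete: keep the trained artefact, delete the personal data it was built from."""
    secure_delete_training_data(dataset_dir)
    return model_path  # the trained model (the output) may be retained

# Example call (paths are placeholders):
# finalise_model(Path("artifacts/model.bin"), Path("data/raw_training_set"))
```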

When facing overlapping regulations, the GDPR always acts as the primary constraint on personal data. Its Storage Limitation Principle sets the hard ceiling for raw personal data retention. This is regardless of the EU AI Act’s documentation rules.

The crucial legal distinction is that PII and other personal data used to create the system must be subject to rigorous deletion procedures the moment the training purpose ends. The technical documentation, metadata, and system logs (which should contain no personal data) are then subject to the EU AI Act’s extended 10-year retention rules. This hierarchy demands that the deletion process (the GDPR) must happen first, leaving only the audit trail (EU AI Act) behind.

The documentation required under the EU AI Act must serve dual purposes: it must confirm the system’s data quality (EU AI Act) and must also provide evidence of the deletion or robust anonymization event, confirming that the GDPR timeline was honored.
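One way to satisfy that dual purpose is to write a small, personal-data-free record into the technical documentation store at the moment of erasure or anonymisation. The JSON fields below are an assumed schema chosen for illustration, not a template prescribed by the AI Act.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def record_deletion_event(dataset_name: str, method: str, audit_dir: Path) -> Path:
    """Append a non-personal 'deletion certificate' to the long-term documentation trail."""
    entry = {
        "dataset": dataset_name,  # label only, must contain no personal data
        "action": method,         # e.g. "secure_deletion" or "irreversible_anonymisation"
        "performed_at": datetime.now(timezone.utc).isoformat(),
        "gdpr_basis": "Article 5(1)(e) storage limitation",
    }
    entry["checksum"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    audit_dir.mkdir(parents=True, exist_ok=True)
    path = audit_dir / f"deletion_{dataset_name}.json"
    path.write_text(json.dumps(entry, indent=2))
    return path

# record_deletion_event("raw_training_set_2025Q3", "secure_deletion", Path("audit/retention"))
```

Kept alongside the technical documentation, such a record evidences both the system’s data governance and the fact that the GDPR deletion timeline was honoured.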

Table: GDPR vs EU AI Act retention obligations at a glance

Summary | GDPR (Personal Data Protection) | EU AI Act (HRAS Accountability)
Asset | Raw PII, pseudonymous data, identifiable metadata | Technical documentation, QMS records, system logs (non-personal), conformity records
Core Principle | Storage limitation (delete when the purpose ends) | Accountability and traceability (document for 10 years)
Max Retention Period | Defined by the controller’s justified purpose (short/medium term) | 10 years after the system is placed on the market
Legal Hierarchy | Primary binding constraint on identifiable data | Governs the necessary audit trail after GDPR constraints are met
Highest Penalty Risk | Up to 4% of global annual turnover (financial) | Operational disruption, market access denial

The Financial & Operational Cost of AI Data

Compliance is not just a cost, but a powerful risk mitigator. Storing raw personal data beyond the necessary period is a direct violation of the GDPR’s Storage Limitation Principle, exposing an organisation to fines of up to 4% of global annual turnover or 20 million euros, whichever is higher (GDPR Article 83).

Beyond the fines, excessive data retention creates significant operational liability. Longer storage times mean higher infrastructure costs and a larger attack surface for security breaches. Every day the data is held, the probability of a costly Data Subject Request (DSR) increases, each of which requires legal and technical effort to fulfil. Compliant, timely deletion is ultimately the most financially responsible strategy.

Should you store raw personal data for training?

Organisations often retain raw data for perceived future utility, perhaps for retraining a system. The GDPR forces a hard strategic trade-off: is the speculative future value of that raw personal data worth the immediate, tangible risk of massive fines and data breaches?

The EU AI Act demands auditable records, but these should be built from fully anonymised data or non-personal metadata. The cost calculation is simple: the financial-penalty risk of retaining personal data for too long far outweighs the cost of developing a compliant, data-minimal system. A mature data strategy prioritises de-identification and deletion over retention, significantly reducing the organisation’s regulatory and financial exposure.

Data Type | Legal Status | Retention Requirement | Effect on AI Systems
Raw Personal Data (PII) | Personal data under the GDPR | Must be deleted as soon as the training purpose ends (Article 5(1)(e)) | Limits availability for retraining; requires technical deletion pipelines; increases compliance complexity if data spans multiple systems
Pseudonymised Data | Still personal data under the GDPR | Same as raw personal data; cannot be retained for the 10-year audit | Limited utility for internal processing; retention beyond purpose is legally risky; still triggers Data Subject Requests and fines if not deleted
Irreversibly Anonymised Data | Non-personal data (Recital 26) | Can be retained indefinitely | Supports long-term model auditing, retraining, bias checks, and EU AI Act traceability; safe to store for 10-year audit requirements
Metadata / Technical Documentation | Non-personal data | Retention required for up to 10 years under the EU AI Act (Articles 10, 18) | Supports HRAS compliance; ensures traceability without exposing personal data; must be designed to exclude PII
System Logs | Non-personal / anonymised | Retention period must be “appropriate”, often aligned with the EU AI Act’s 10-year audit horizon | Enables audit and monitoring; must be anonymised to avoid GDPR violations; operational impact includes storage and secure access management

Strategic Recommendations

The regulatory landscape governing AI development in the EU is defined by a critical tension:

  1. the immediate obligation to protect individual privacy (GDPR) and
  2. the extended obligation to ensure system safety and traceability (EU AI Act).

Compliant data management requires recognizing the GDPR’s Storage Limitation Principle as the absolute constraint on personal data retention. This is regardless of the EU AI Act’s documentation timelines. The solution is architectural separation, where raw personal data is subject to automated deletion, and the audit trail is constructed exclusively from non-personal, irreversibly anonymized assets.
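In infrastructure terms, this separation can be expressed as two storage zones with different lifecycle rules. The declarative sketch below is an assumption-laden illustration of the idea; an actual implementation would map these policies onto whatever bucket lifecycle or database TTL mechanism the organisation already uses.

```python
# Illustrative storage-zone policy: personal data lives only in a short-lived zone,
# audit artefacts only in a long-lived one. Zone names and periods are assumptions.
STORAGE_ZONES = {
    "personal_data_zone": {
        "contents": ["raw training records", "pseudonymised datasets"],
        "auto_delete_after_days": 180,        # controller-justified GDPR retention
        "deletion": "automated, verified",
    },
    "audit_zone": {
        "contents": ["technical documentation", "QMS records", "anonymised logs",
                     "deletion certificates"],
        "auto_delete_after_days": 365 * 10,   # EU AI Act 10-year documentation horizon
        "deletion": "manual review after retention ends",
    },
}

def zone_for(asset_type: str) -> str:
    """Route an asset to exactly one zone so personal data never enters the audit trail."""
    for zone, policy in STORAGE_ZONES.items():
        if asset_type in policy["contents"]:
            return zone
    raise ValueError(f"unclassified asset type: {asset_type}")
```

Routing every asset through a single classification step like zone_for helps ensure the audit trail is built only from non-personal artefacts.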

TL;DR

  • Under the GDPR, personal data must be deleted once its specific purpose is fulfilled. This limits how long raw training data can be stored.
  • For AI developers, this means models cannot indefinitely rely on historical raw personal data. This can potentially impact retraining strategies and model evolution.

The post AI Data Retention Strategy under the GDPR and the EU AI Act: Reconciling the Regulatory Clock appeared first on TechGDPR.

]]>