The Top 7 GDPR Violations and How to Avoid Them

PrivaSift Team · Apr 01, 2026 · gdpr · compliance · data-privacy · data-breach · security


Since the General Data Protection Regulation took effect in May 2018, European data protection authorities have issued more than €4.5 billion in fines. And the trend is accelerating — 2025 alone saw record-breaking penalties against companies of every size, from global tech giants to mid-market SaaS providers that simply lost track of where personal data lived in their systems.

If you're a CTO, DPO, or security engineer, the message is clear: GDPR enforcement is no longer theoretical. Regulators have moved past the "grace period" mentality. They're auditing proactively, responding aggressively to breach notifications, and handing down fines that scale to 4% of global annual turnover. A single overlooked database column containing email addresses or IP addresses can trigger an investigation.

The good news is that most GDPR violations fall into a surprisingly small number of categories — and nearly all of them are preventable with the right processes and tooling. This guide breaks down the seven most common violations, explains how regulators evaluate them, and gives you concrete steps to close each gap before an audit finds it for you.

1. Insufficient Legal Basis for Data Processing

![1. Insufficient Legal Basis for Data Processing](https://max.dnt-ai.ru/img/privasift/top-gdpr-violations_sec1.png)

The single most fined violation under GDPR is processing personal data without a valid legal basis. Article 6 of the regulation requires that every processing activity be grounded in one of six lawful bases: consent, contract, legal obligation, vital interests, public task, or legitimate interest.

Real-world example: In January 2023, Ireland's Data Protection Commission fined Meta €390 million for forcing users to accept personalized advertising as a condition of using Facebook and Instagram. Meta claimed "contractual necessity" as its legal basis, but regulators ruled that targeted ads are not necessary to deliver a social media service.

How to avoid it:

  • Map every data flow to a legal basis. Maintain a processing register (Article 30) that explicitly documents which legal basis applies to each category of personal data you collect.
  • Don't default to consent when you mean legitimate interest — and vice versa. Each basis has different requirements. Consent must be freely given, specific, and withdrawable. Legitimate interest requires a documented balancing test (LIA).
  • Audit your data inventory regularly. You can't assign a legal basis to data you don't know you have.
```text
Example: Processing Register Entry
─────────────────────────────────────────────────────
Data Category: User email addresses
Purpose:       Account authentication, password reset
Legal Basis:   Contract (Art. 6(1)(b))
Retention:     Duration of account + 30 days
Storage:       PostgreSQL (eu-west-1), Redis cache
PII Fields:    email, hashed_password, last_login_ip
─────────────────────────────────────────────────────
```

Automating PII discovery across your databases and file stores is the fastest way to ensure your processing register is actually complete. Tools like PrivaSift can scan structured and unstructured data sources to surface personal data you may not realize you're storing.

2. Non-Compliant Consent Mechanisms

![2. Non-Compliant Consent Mechanisms](https://max.dnt-ai.ru/img/privasift/top-gdpr-violations_sec2.png)

Consent violations are among the easiest for regulators to spot — and among the hardest for organizations to fix retroactively. Under GDPR, consent must be freely given, specific, informed, unambiguous, and as easy to withdraw as it is to give. Pre-ticked checkboxes, bundled consent, and "consent walls" that block access unless users agree to all processing are all non-compliant.

Real-world example: France's CNIL fined Google €150 million in 2022 for making it significantly harder for users to refuse cookies than to accept them. The "Accept All" button was prominent, while rejecting cookies required multiple clicks through nested menus.

How to avoid it:

  • Separate consent requests by purpose. Don't bundle marketing, analytics, and functional cookies into a single opt-in.
  • Make refusing as easy as accepting. Regulators now explicitly check that "Reject All" is as prominent and accessible as "Accept All."
  • Store consent receipts with timestamps. You must be able to prove when and how consent was obtained.
  • Implement a withdrawal mechanism that is no more complex than the consent mechanism itself.
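A minimal way to satisfy the consent-receipt requirement is to store one receipt per user per purpose, with timestamps for both grant and withdrawal. The following Python sketch illustrates the idea; the class and field names are illustrative, not a standard schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentReceipt:
    """Illustrative consent record: one receipt per user per purpose."""
    user_id: str
    purpose: str       # one purpose per receipt, e.g. "marketing_email"
    mechanism: str     # how consent was captured, e.g. "signup_form_v3"
    granted_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )
    withdrawn_at: Optional[datetime] = None

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        """Withdrawal must be as simple as granting: a single call."""
        self.withdrawn_at = datetime.now(timezone.utc)
```

Because each receipt carries its own timestamps, you can prove when and how consent was obtained, and when it was withdrawn, without reconstructing history from logs.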

3. Data Breach Notification Failures

![3. Data Breach Notification Failures](https://max.dnt-ai.ru/img/privasift/top-gdpr-violations_sec3.png)

Article 33 requires that data controllers notify their supervisory authority within 72 hours of becoming aware of a personal data breach — unless the breach is unlikely to result in a risk to individuals' rights. Article 34 further requires notifying affected individuals when the breach poses a "high risk."

The 72-hour clock is strict. Regulators have fined organizations not for the breach itself, but for the delayed or incomplete notification that followed.

Real-world example: In 2020, British Airways was fined £20 million (reduced from an initially proposed £183 million) after a breach exposed the personal and financial data of 429,612 customers. The ICO found that BA failed to detect the breach for over two months and did not have adequate security measures in place.

How to avoid it:

  • Build a breach detection pipeline. Monitor access logs, anomalous queries, and data exfiltration signals in real time.
  • Pre-draft your notification templates. When a breach occurs, you don't want to spend 24 of your 72 hours figuring out what to say.
  • Run tabletop exercises quarterly. Your incident response team should be able to classify a breach, assess risk, and begin notification within hours, not days.
  • Know where your PII lives before a breach happens. The fastest way to assess breach impact is to already have an up-to-date inventory of what personal data exists in the affected systems.
```text
Breach Response Timeline (Article 33 Compliance)
─────────────────────────────────────────────────
Hour 0:      Breach detected / reported internally
Hour 0-4:    Incident response team activated
Hour 4-12:   Scope assessment — which PII is affected?
Hour 12-24:  Risk classification (low / high risk to individuals)
Hour 24-48:  Draft supervisory authority notification
Hour 48-72:  Submit notification to DPA (Article 33)
Hour 48-72:  If high risk → notify affected individuals (Article 34)
─────────────────────────────────────────────────
```

4. Violations of Data Subject Rights

![4. Violations of Data Subject Rights](https://max.dnt-ai.ru/img/privasift/top-gdpr-violations_sec4.png)

GDPR grants individuals a suite of rights: access (Article 15), rectification (Article 16), erasure (Article 17), data portability (Article 20), and the right to object (Article 21). Organizations must respond to these requests within one month — and failing to do so is one of the most common enforcement triggers.

Real-world example: In 2021, the Hamburg Commissioner for Data Protection fined Vattenfall Europe Sales €900,000 for failing to properly respond to a data subject access request (DSAR). The company could not locate all personal data related to the individual because it was scattered across disconnected systems.

How to avoid it:

  • Centralize your DSAR workflow. Use a ticketing system or dedicated tool to track requests, deadlines, and responses.
  • Implement PII discovery across all data stores. You cannot fulfill a right-to-erasure request if you don't know that the user's data also lives in a legacy MongoDB instance, an analytics warehouse, and three CSV exports on a shared drive.
  • Automate where possible. For access and portability requests, build internal APIs that can pull a user's data from all systems into a structured export.
```python
# Example: Automated DSAR data collection across multiple stores

import asyncio
from datetime import datetime, timedelta

DSAR_DEADLINE = timedelta(days=30)

async def collect_user_data(user_id: str, request_date: datetime):
    """Collect all PII for a data subject across registered stores."""
    deadline = request_date + DSAR_DEADLINE

    sources = [
        query_postgres(user_id),
        query_mongodb(user_id),
        query_s3_exports(user_id),
        query_analytics_warehouse(user_id),
        query_email_service(user_id),
    ]

    results = await asyncio.gather(*sources, return_exceptions=True)

    report = {
        "subject_id": user_id,
        "request_date": request_date.isoformat(),
        "deadline": deadline.isoformat(),
        "data_sources": len(sources),
        "data": [r for r in results if not isinstance(r, Exception)],
        "errors": [str(r) for r in results if isinstance(r, Exception)],
    }
    return report
```

The Vattenfall case illustrates a critical point: the biggest risk is not that you refuse a request — it's that you can't find the data in the first place. Automated PII scanning is the foundation of reliable DSAR fulfillment.

5. Inadequate Data Protection by Design and Default

Article 25 requires that data protection be embedded into the design of systems and business processes from the outset — not bolted on as an afterthought. "By default" means that the strictest privacy settings should apply automatically, without requiring user intervention.

Real-world example: The Swedish DPA fined Klarna €720,000 in 2022 for failing to provide adequate information to data subjects and for not implementing privacy by design in its payment processing systems. The company was collecting more personal data than necessary for the stated purpose.

How to avoid it:

  • Apply data minimization at the schema level. Before adding a column or field, ask: is this data necessary for the stated purpose? If not, don't collect it.
  • Default to the most restrictive access. New features should ship with data collection off by default, not on.
  • Pseudonymize or anonymize where possible. If your analytics pipeline doesn't need real email addresses, hash them at ingestion.
  • Conduct Data Protection Impact Assessments (DPIAs) for any new processing activity that involves high-risk data, profiling, or large-scale monitoring.
```yaml
# Example: DPIA checklist for a new feature

dpia:
  feature: "User behavior analytics dashboard"
  data_controller: "Product Engineering"
  date: "2026-03-15"
  checks:
    - question: "Is processing necessary and proportionate?"
      answer: "Yes — aggregated usage patterns for product improvement"
    - question: "Can we achieve the purpose with less data?"
      answer: "Yes — anonymize user IDs before aggregation"
    - question: "What risks exist for data subjects?"
      answer: "Re-identification risk if granular timestamps are retained"
    - question: "What mitigations are in place?"
      answer: "k-anonymity threshold of 50, 24-hour timestamp bucketing"
    - question: "Is a full DPIA required?"
      answer: "No — mitigations reduce risk below threshold"
```

6. Insufficient Technical and Organizational Security Measures

Article 32 requires organizations to implement security measures "appropriate to the risk," including encryption, pseudonymization, access controls, and regular testing. This is intentionally broad — regulators evaluate your security posture relative to the sensitivity of data you process and the state of the art.

Real-world example: The Italian Garante fined the Lazio region's health authority €120,000 after a ransomware attack in 2021 exposed health data of millions of citizens. The investigation revealed outdated systems, lack of multi-factor authentication, and inadequate network segmentation.

How to avoid it:

  • Encrypt PII at rest and in transit. Use AES-256 for storage and TLS 1.2+ for transmission. This is table stakes.
  • Enforce least-privilege access. Developers should not have production database access by default. Use role-based access control (RBAC) and audit access logs.
  • Implement MFA across all systems that touch personal data — especially admin panels, database consoles, and cloud provider accounts.
  • Test your defenses. Conduct penetration testing at least annually and run vulnerability scans on a continuous basis.
  • Monitor for PII in unexpected locations. Personal data frequently leaks into log files, error messages, analytics events, and staging environments. Automated PII scanning catches this drift before an attacker does.
```bash
# Quick check: Are you logging PII accidentally?
# Search application logs for common PII patterns.

# Email addresses in logs
grep -rP '[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}' /var/log/app/

# Credit card numbers (basic pattern)
grep -rP '\b\d{4}[\s-]?\d{4}[\s-]?\d{4}[\s-]?\d{4}\b' /var/log/app/

# If either returns results, you have a PII leak in your logs.
```

7. Non-Compliant International Data Transfers

Since the Schrems II ruling invalidated the EU-US Privacy Shield in 2020, international data transfers have been a minefield. Articles 44-49 of GDPR require that transfers outside the EEA occur only with "adequate safeguards" — typically Standard Contractual Clauses (SCCs), Binding Corporate Rules (BCRs), or an adequacy decision.

Real-world example: In May 2023, Ireland's DPC fined Meta a record €1.2 billion for transferring EU users' data to the United States without adequate protections against US government surveillance. This remains the largest GDPR fine ever issued.

How to avoid it:

  • Map all cross-border data flows. Know exactly which personal data leaves the EEA, to which countries, and through which processors.
  • Update your SCCs. The European Commission published modernized SCCs in June 2021, and the old clauses ceased to be valid for existing contracts at the end of December 2022. If you're still relying on them, your transfers lack a valid safeguard.
  • Conduct Transfer Impact Assessments (TIAs) for each transfer to a country without an adequacy decision. Document the legal framework of the receiving country and the supplementary measures you've implemented.
  • Consider data localization. For high-risk data, hosting within the EEA eliminates transfer risk entirely. Most major cloud providers now offer EU-only regions.
  • Monitor the EU-US Data Privacy Framework. While the current framework (adopted July 2023) provides a legal basis for transfers to certified US companies, its long-term viability remains uncertain.
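The flow-mapping step above can be sketched as a simple classification pass over a vendor inventory. The country sets and record fields below are illustrative and deliberately abbreviated; always check the European Commission's current adequacy register before relying on any such list:

```python
# Illustrative, abbreviated country sets -- verify against the European
# Commission's current adequacy register before relying on them.
EEA = {"AT", "BE", "DE", "ES", "FR", "IE", "IT", "NL", "PL", "SE"}
ADEQUACY = {"CH", "GB", "JP", "KR", "CA", "NZ", "IL"}

def transfer_risk(processor: dict) -> str:
    """Classify one record from a (hypothetical) processor inventory."""
    country = processor["country"]
    if country in EEA:
        return "none: data stays in the EEA"
    if country in ADEQUACY:
        return "low: covered by an adequacy decision"
    if processor.get("sccs_version") == "2021":
        return "medium: SCCs in place, Transfer Impact Assessment required"
    return "high: no valid safeguard, block or remediate"

# Hypothetical vendor inventory
vendors = [
    {"name": "eu-host", "country": "DE"},
    {"name": "uk-crm", "country": "GB"},
    {"name": "us-analytics", "country": "US", "sccs_version": "2021"},
    {"name": "legacy-backup", "country": "US"},
]
for v in vendors:
    print(f"{v['name']}: {transfer_risk(v)}")
```

Even a crude classifier like this surfaces the records that need attention first: any transfer in the "high" bucket has no legal basis at all.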

Frequently Asked Questions

What is the maximum GDPR fine, and how is it calculated?

GDPR fines are capped at the higher of €20 million or 4% of global annual turnover. Supervisory authorities consider several factors when determining the amount: the nature and severity of the violation, whether it was intentional or negligent, the number of data subjects affected, what measures were taken to mitigate damage, the organization's degree of cooperation with the authority, and any prior violations. In practice, fines vary enormously — from a few thousand euros for small businesses to billions for multinational corporations. The €1.2 billion Meta fine in 2023 demonstrated that regulators will use the full range of their powers.
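In code, the cap described above is simply the larger of two figures:

```python
def max_gdpr_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound on an Article 83(5) fine: the higher of EUR 20
    million or 4% of global annual turnover."""
    return max(20_000_000.0, 0.04 * global_annual_turnover_eur)

# EUR 100M turnover: 4% is EUR 4M, so the EUR 20M floor applies.
print(max_gdpr_fine(100_000_000))     # 20000000.0
# EUR 10B turnover: 4% is EUR 400M, which exceeds the floor.
print(max_gdpr_fine(10_000_000_000))  # 400000000.0
```

The fixed floor means the cap is meaningful even for small companies: below EUR 500 million in turnover, the EUR 20 million figure governs.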

How do I know if my organization is processing PII that I'm not aware of?

This is more common than most engineering teams realize. PII accumulates in unexpected places: application logs, error tracking services (like Sentry), analytics pipelines, staging databases cloned from production, spreadsheet exports shared via email, and third-party integrations that cache user data. The only reliable way to get a complete picture is automated PII discovery — scanning your databases, file systems, and cloud storage with tools that recognize patterns like email addresses, national ID numbers, phone numbers, and health data. Manual audits are too slow and too incomplete to keep up with how fast data spreads in modern architectures.
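As a toy illustration of the pattern-based discovery described above, a regex pass over text might look like the following. These patterns are illustrative only; production scanners use validated, per-country detectors and checksum validation (e.g. Luhn for card numbers):

```python
import re

# Illustrative patterns only -- real PII scanners use stricter,
# validated detectors per data type and per country.
PII_PATTERNS = {
    "email": re.compile(r"[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}"),
    "credit_card": re.compile(r"\b\d{4}[\s-]?\d{4}[\s-]?\d{4}[\s-]?\d{4}\b"),
    "e164_phone": re.compile(r"\+\d{7,15}\b"),
}

def scan_text(text: str) -> dict:
    """Return every match per PII category found in a blob of text."""
    hits = {name: pat.findall(text) for name, pat in PII_PATTERNS.items()}
    return {name: matches for name, matches in hits.items() if matches}
```

Running this over log files, CSV exports, and staging dumps is a quick way to confirm the problem exists; fixing it at scale is where dedicated discovery tooling comes in.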

Do GDPR rules apply to companies outside the EU?

Yes. GDPR applies to any organization that processes personal data of individuals in the EEA, regardless of where the organization is based. If your SaaS product has a single user in Germany, GDPR applies to your processing of that user's data. This extraterritorial scope (Article 3) is one of GDPR's most consequential provisions. Non-EU companies must appoint an EU representative (Article 27) and comply with all obligations, including breach notification, data subject rights, and lawful processing requirements.

What's the difference between a data controller and a data processor under GDPR?

A data controller determines the purposes and means of processing personal data — they decide why and how data is processed. A data processor processes data on behalf of the controller, following the controller's instructions. For example, if your company uses a cloud email provider to send newsletters, your company is the controller (you decide to send marketing emails and to whom) and the email provider is the processor. Both have GDPR obligations, but controllers bear primary responsibility for compliance. The relationship must be governed by a Data Processing Agreement (DPA) under Article 28. Critically, if a processor processes data beyond the controller's instructions, they become a controller for that processing — and assume full liability.

How often should we audit our GDPR compliance?

There is no mandated frequency, but best practice among mature organizations is to conduct a comprehensive compliance audit annually, with continuous monitoring in between. Specific triggers should also prompt a review: launching a new product or feature that processes personal data, onboarding a new third-party processor, expanding into a new market, experiencing a data breach, or receiving guidance from your supervisory authority. The organizations that get fined are almost never the ones with rigorous audit schedules — they're the ones that set up compliance once and assumed it would hold. Data environments change constantly. Your compliance posture needs to keep pace.

Start Scanning for PII Today

PrivaSift automatically detects PII across your files, databases, and cloud storage — helping you stay GDPR and CCPA compliant without the manual work.

[Try PrivaSift Free →](https://privasift.com)
