EU Advocate General backs keyword-based RFIs in Meta probes

In an opinion delivered to Europe’s top court on February 26, 2026, Advocate General Athanasios Rantos concluded that the European Commission’s sweeping document demands in its probes of Facebook are lawful, proportionate and properly safeguarded. If the Court of Justice adopts that view, it will hand regulators a major tactical and legal win in the EU’s sustained campaign to force Big Tech to account for how it designs, bundles and monetizes social platforms.

Background

The legal fight stems from two long-running EU competition investigations into Meta Platforms Ireland: one into the alleged tying and self‑preferencing that benefitted Facebook Marketplace, and a separate probe into Meta’s use of Facebook‑generated data for advertising and competitive advantage. The Commission’s original requests—issued in 2020—required Meta to run large-scale keyword searches across its internal repositories and to furnish documents identified by those keywords. The General Court upheld the Commission’s approach in May 2023, and Meta appealed to the Court of Justice.
In his February 26, 2026 opinion in cases C‑496/23 P (Facebook Marketplace) and C‑497/23 P (Facebook Data), Advocate General Rantos concluded the General Court had not erred in law when it found the requests to be necessary, proportionate and covered by appropriate safeguards. His recommendation: dismiss Meta’s appeals and uphold the General Court judgments. While the Advocate General’s opinion is non‑binding, CJEU judges often follow it—meaning a final ruling could arrive in the months ahead that cements the legal standard for how the Commission seeks digital evidence in high‑stakes competition probes.

What the opinion says and why it matters

Key legal points from the Advocate General

  • The Commission enjoys broad investigative powers under EU competition law and is afforded a margin of discretion in selecting investigative techniques.
  • Requests framed as combinations of electronic search terms are lawful where they are reasonably connected to the subject matter of the investigation and where proportionality and privacy safeguards are assessed.
  • The General Court did not err in concluding that amended procedures—including the use of a virtual data room and specific safeguards for sensitive personal data—reduced the risk of unlawful intrusion.
Those conclusions matter because they resolve, at a high level, a fundamental tension: can competition authorities use modern, keyword‑driven discovery tools to assemble digital evidence about dominant platforms without infringing data‑protection guarantees under the EU Charter of Fundamental Rights and the GDPR? The Advocate General’s answer leans toward yes, provided the authorities show a factual link between the terms requested and the suspected anti‑competitive conduct, and they implement strict handling safeguards.

Why the industry is watching

Meta framed the Commission’s requests as a digital dragnet—alleging that thousands of search phrases pulled back vast quantities of irrelevant and highly sensitive personal material. The company warned such tactics could chill internal communications, expose private data and grant regulators de facto access to broad swathes of corporate and personal information.
Regulators countered that keyword requests are routine, that many terms originated with the company itself during the initial inquiry process, and that procedural safeguards (including restricted virtual data room access and anonymization where appropriate) mitigated privacy risks. The Advocate General’s opinion vindicates that view in principle, elevating the Commission’s tactical playbook into accepted investigative practice unless the Court says otherwise.

The technical mechanics: how keyword RFIs and virtual data rooms work

What investigators ask for

  • Authorities typically issue a Request for Information (RFI) that enumerates:
      • custodians (employees or systems whose documents are relevant),
      • date ranges,
      • and keyword lists—terms and Boolean combinations used to filter enterprise email, chat, documents and file systems.
  • A company runs those keywords across its archives, produces the set matching the criteria, and furnishes the results to the authority—or uploads them to a secured “virtual data room” for controlled review.
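The three filters above can be sketched as a short Python fragment. This is a minimal illustration, not a real e‑discovery tool: the `Doc` class, the `matches_rfi` helper and the sample documents are invented for the example, and `any_terms`/`all_terms` stand in for the richer Boolean combinations an actual RFI would specify.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Doc:
    custodian: str   # employee or system the document belongs to
    created: date
    text: str

def matches_rfi(doc, custodians, start, end, any_terms, all_terms=()):
    """Apply the three RFI filters: custodian, date range, keywords.

    any_terms is an OR list; all_terms is an AND list -- a simplified
    stand-in for the Boolean combinations a real RFI would enumerate."""
    if doc.custodian not in custodians:
        return False
    if not (start <= doc.created <= end):
        return False
    text = doc.text.lower()
    return (any(t in text for t in any_terms)
            and all(t in text for t in all_terms))

docs = [
    Doc("alice", date(2019, 3, 1), "Marketplace launch will shut out rivals"),
    Doc("bob",   date(2021, 6, 5), "Lunch plans for Friday"),
]
hits = [d for d in docs
        if matches_rfi(d, {"alice", "bob"},
                       date(2018, 1, 1), date(2020, 12, 31),
                       any_terms=["marketplace", "rivals"])]
# only the first document is in the date range and matches a term
```

In practice these filters run over enterprise archives (email, chat, file shares) rather than an in‑memory list, and the matching set is what gets produced or uploaded to the data room.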

Why keyword RFIs are controversial

  • Keywords can be overinclusive: terms used in non‑investigative contexts (e.g., “not good for us”) can appear in personal notes, HR records or unrelated files.
  • Historic, cross‑functional repositories can include sensitive content—medical records, security assessments, personal correspondence—that regulators do not need.
  • The combinatorial nature of Boolean searches can balloon result sets: each additional OR term can only widen the match, creating heavy review burdens and privacy‑review problems.
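Both problems can be seen in a toy example. The corpus below is invented, mixing relevant business material with personal and HR content; the point is that a generic phrase like “not good for us” sweeps in unrelated documents, and that each extra OR term monotonically grows the hit set.

```python
# Hypothetical corpus mixing business and personal/HR material.
corpus = [
    "Marketplace fees undercut rivals",           # relevant
    "This weather is not good for us",            # personal chat
    "HR note: medical leave request",             # sensitive, unrelated
    "Rival classifieds are not good for us",      # relevant
]

def hits(terms):
    """Return the documents matching any of the given terms (OR search)."""
    return {doc for doc in corpus if any(t in doc.lower() for t in terms)}

# A narrow term matches only the relevant documents...
print(len(hits(["rival"])))                       # 2
# ...but adding a generic phrase pulls in a personal chat message too.
print(len(hits(["rival", "not good for us"])))    # 3
```

At the scale of a real investigation, with thousands of terms and millions of documents, this widening effect is what drives the review-burden and privacy objections.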

Safeguards and mitigation

  • Virtual data rooms with tiered access, redaction protocols and strict logging are meant to reduce privacy exposure.
  • Data minimization and cooperation between competition authorities and data‑protection agencies can carve out rules for handling sensitive documents.
  • Judicial oversight—either pre‑production review by a court or post‑production remedies—can serve as a check against abuse.
Advocate General Rantos emphasized that where such safeguards exist and the RFI is properly reasoned (identifying the investigation’s subject matter and suspected infringement), the Commission’s approach falls within its discretionary powers.
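The tiered-access-plus-logging idea behind a virtual data room can be sketched in a few lines. This is an illustrative sketch only: the role names, the `ACCESS` table and the `open_document` helper are assumptions for the example, not a description of the Commission’s actual data-room software.

```python
import logging
from datetime import datetime, timezone

# Illustrative tiers: which roles may open which document classes.
ACCESS = {
    "case_team":       {"business"},
    "cleared_counsel": {"business", "sensitive"},
}

log = logging.getLogger("data_room")

def open_document(role, doc_id, doc_class):
    """Grant or deny access per tier, and write an audit-log entry either way."""
    allowed = doc_class in ACCESS.get(role, set())
    log.info("%s role=%s doc=%s class=%s allowed=%s",
             datetime.now(timezone.utc).isoformat(),
             role, doc_id, doc_class, allowed)
    return allowed
```

The essential properties are that access decisions are made centrally against a fixed policy and that every attempt, allowed or not, leaves an audit trail that a court or data-protection authority can later review.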

The wider regulatory landscape: fines, DMA enforcement and the cost of non‑compliance

This legal dispute is not isolated. It sits atop a wave of EU regulatory enforcement that has reshaped the operational and legal environment for gatekeeper platforms.
  • In 2024 the Commission fined Meta roughly €797.7 million for tying Facebook Marketplace to the Facebook social network and imposing unfair conditions on rival classifieds—an order that targeted self‑preferencing and cross‑service data use.
  • Under the Digital Markets Act (DMA) and related enforcement initiatives, gatekeepers have faced specific obligations on data combination, user choice for personalized ads, and non‑discrimination. Enforcement actions and fines under the DMA have already disciplined platform behavior and generated targeted compliance costs.
  • Regulators are increasingly coordinating across competition, privacy and digital‑services frameworks, tightening the interplay between antitrust probes and data protection oversight.
For businesses that rely on personalized advertising, these developments are material: regulatory constraints on data combination and stricter judicial acceptance of Commission evidence‑gathering could both increase enforcement risk and force business model changes.

Practical stakes for Meta and other platform operators

Immediate legal stakes

If the Court of Justice follows the Advocate General, Meta will have exhausted a significant avenue of judicial relief against the Commission’s evidence‑gathering techniques. That outcome would:
  • Make it harder for dominant platforms to shield themselves from discovery‑style RFIs built on broad keyword lists.
  • Reduce the threshold for competition authorities to deploy electronic search terms in cross‑service and behavioral investigations.
  • Limit the protections available at the evidence‑collection stage, shifting battles toward assessing relevance and proportionality during proceedings rather than at the point of collection.

Commercial stakes

  • Operational costs: responding to exhaustive keyword RFIs is expensive—companies must sift, redact and host large data sets under tight deadlines.
  • Strategic implications: regulators empowered to collect and analyze internal communications increase the likelihood of findings that support structural or behavioral remedies (e.g., forced disentanglement of services, data‑sharing obligations).
  • Advertising model risk: the DMA and parallel regulatory pressure on personalized targeting mean platforms may need to redesign ad stacks, invest in consent management, or offer lower‑personalization alternatives—each with measurable revenue consequences.

Reputation and privacy risk

  • Even where authorities implement safeguards, the prospect of millions of internal messages being reviewed—even in a secure data room—raises public‑relations and employee‑privacy concerns.
  • For enterprise customers, the perception that a platform’s internal data is subject to broad regulatory trawls could undermine trust.

Critical analysis: strengths, weaknesses and unanswered questions

Strengths of the Advocate General’s analysis

  • Realism about investigatory needs: modern digital markets generate business conduct that is technically and behaviorally complex; regulators need modern tools to test complex hypotheses.
  • Procedural balance: the opinion does not grant carte blanche. It insists on factual linkage between search terms and the subject matter of the investigation and recognizes the role of safeguards.
  • Preserves an enforceable remedy: by endorsing virtual data rooms and redaction procedures, the opinion gives regulators a workable path for evidence collection without forcing them into intrusive on‑site seizures.

Weaknesses and risks

  • The proportionality line remains vague: determining when a keyword list crosses from necessary to disproportionate is inherently fact‑sensitive and susceptible to inconsistent application across cases.
  • Potential for mission creep: investigators with broad discretion could drift toward overbroad requests unless courts develop robust, harmonized judicial standards for electronic RFIs.
  • Privacy enforcement gap: even where safeguards are promised, the practical burden of identifying and redacting sensitive personal data may fall primarily on the responding company—creating an asymmetry of cost and privacy risk.

Unanswered questions and litigation flashpoints

  • Where will the Court draw the line on who must show the individual relevance of each keyword? The Advocate General suggested the Commission need not demonstrate relevance for every document at the outset—but courts could require more granular justification in future cases.
  • How will national data‑protection authorities respond when competition RFIs implicate large volumes of personal data? The interplay between GDPR and competition law remains a potential source of conflicting obligations unless coordinated guidance becomes binding.
  • Will other jurisdictions adopt the EU’s approach, or carve out stricter safeguards? Global harmonization is unlikely; multinational platforms may face divergent standards and increased compliance fragmentation.

What this means for businesses that depend on platform advertising

Advertisers, publishers and small businesses should not treat this as a purely legal dispute between Meta and Brussels. The rulings and broader regulatory momentum have practical implications:
  • Plan for lower personalization effectiveness: DMA constraints and privacy pressure could reduce signal fidelity for targeted ads.
  • Diversify acquisition channels: reliance on a single gatekeeper for scalable audience targeting is riskier; invest in direct channels, first‑party data capture and contextual targeting.
  • Invest in consent and data governance: robust consent frameworks, consent‑aware measurement and privacy‑preserving analytics will be competitive advantages.
  • Pressure‑test contracts: assess contractual commitments with platform partners for resilience in the face of changing ad‑tech and regulation.
For enterprise IT teams, a ruling along these lines would mean investments in consent tooling, first‑party customer data platforms (CDPs), and on‑premises or cloud data architectures that support compliance while preserving measurement.

Ten concrete steps Meta and similar platforms should consider now

  • Strengthen legal and technical justification for each potential keyword or search term before it’s used in any RFI response.
  • Build automated, auditable pipelines for redaction and privacy‑preserving review that reduce manual burden.
  • Publish clearer internal playbooks explaining data retention, classification and custodianship to narrow scope in future inquiries.
  • Expand the use of privacy by design—segregating highly sensitive HR, health and security files from operational product data to reduce spillover.
  • Accelerate investments in first‑party data and client‑side measurement to reduce dependency on cross‑service combination.
  • Launch transparent user choices for personalized ads, including less‑personalized alternatives, to comply with DMA obligations.
  • Invest in independent audits of RFI‑response processes to produce defensible records in court.
  • Engage proactively with European regulators and data‑protection authorities to seek common protocols for keyword RFIs.
  • Evaluate business models that uncouple essential social services from ad inventory critical to competitive advantage.
  • Prepare crisis communications and employee guidance to mitigate the reputational fallout of large‑scale document productions.

Broader implications for antitrust and data governance

  • This dispute highlights a systemic shift: antitrust enforcers are treating internal platform data not merely as evidence but as an asset that can demonstrate the mechanics of dominance. That shift elevates evidence‑gathering from checkbox litigation tactics to central levers of remedy design.
  • Court acceptance of keyword RFIs (with safeguards) makes it more likely that future antitrust cases will be supported by large‑scale internal analytics, machine‑learning models and behavioral correlation evidence drawn from platform metadata.
  • Regulators will increasingly demand technical transparency: access to log files, ad‑tech pipelines and algorithmic decision points could become standard features of investigations—requiring platforms to architect systems for forensic explainability.

What to watch next

  • The Court of Justice’s final ruling in these appeals: whether it adopts the Advocate General’s reasoning in full, narrows it, or introduces new constraints.
  • Coordinated guidance from the Commission, national data‑protection authorities and courts on the DMA–GDPR interface: clearer protocols would reduce the present friction between privacy and competition aims.
  • Follow‑on enforcement actions: a judicial green light for keyword RFIs will likely encourage regulators to bring more complex, data‑intensive cases against gatekeepers.
  • Industry responses: expect renewed legal challenges, product redesigns and defensive compliance spending by platforms to insulate revenue streams and preserve user trust.

Conclusion

Advocate General Rantos’s February 26, 2026 opinion marks a significant inflection point in how Europe’s competition authorities can investigate dominant digital platforms. By accepting keyword‑based RFIs and the use of virtual data rooms as legitimate investigatory tools—so long as there is a nexus to the suspected conduct and safeguards are in place—the opinion tilts the balance toward more powerful, data‑centric enforcement.
That shift presents both a clarifying moment and a new frontier of regulatory friction. For platforms like Meta, it raises the stakes of internal governance, evidence hygiene and business‑model design. For advertisers, publishers and IT leaders, it signals that the era of frictionless, hyper‑targeted ad stacks in Europe is evolving into one where compliance, first‑party strategy and contextual alternatives are not optional but essential.
Ultimately, the Court’s forthcoming decision will not just shape the outcome of Meta’s appeals—it will help write the next chapter of digital competition law, privacy governance and the architecture of online services across Europe and beyond.

Source: techi.com EU Court Adviser Backs Antitrust Data Demands Against Meta
 
