
Thursday, June 19, 2025

DMA (Team) Sudans, when will Meta's compliance with Article 5(2) finally flow?


Don't look for it in Rome...

  Nearly two months on, the Commission’s DMA non-compliance decision against Meta  was finally published yesterday [18 June]. Coincidentally or not, a German court also published a ruling yesterday, offering its own reading of the same DMA obligation with which, according to the Commission, Meta is not (yet?) compliant. As Wavesblog Readers will appreciate, both are of considerable interest. What follows are a few preliminary observations. Let me begin, however, with a brief disclosure. During the eight months in which I had the pleasure of consulting for Article 19, I had occasion to focus on Meta twice: first, in writing about an intriguing German judgment applying Article 102 TFEU (not the one you're thinking of, another one entirely); and second, precisely in relation to the application of Article 5(2) of the DMA.  As for the latter, in the course of that work I had reached the conclusion, now confirmed by the Commission in its recently published decision, that there was simply no way Meta could be considered compliant, a conclusion I had the pleasure of presenting in the Commission’s presence in beautiful Fiesole.
 

Why, then, did it take an 80-page decision, and why is Meta, in all likelihood, still not compliant more than a year after it was first required to be? The most obvious answer, of course, is that the obligation in question strikes at the very heart of Meta’s business model, not only as it stands today, but also with implications for future developments, given the EU legislator's insistence on DMA compliance by design, a recurring theme here on Wavesblog. Moreover, in this case, the Commission is not simply tasked with interpreting and enforcing a 'standard' DMA obligation in relative isolation; to do so, it must also apply the General Data Protection Regulation in conjunction with the DMA. That these two legislative instruments converge in more than one respect is also confirmed by the German ruling mentioned earlier, to which we shall return in due course. On that note, we are still awaiting the joint EDPB/Commission guidance on the interplay between the DMA and the GDPR, which, according to the Commission’s remarks this week in Gdańsk, is expected imminently. Interestingly, the Commission takes this into account as a mitigating factor in determining the fine for non-compliance ("the Commission acknowledges that the interplay between Regulation (EU) 2022/1925 and Regulation (EU) 2016/679 created a multifaceted regulatory environment and added complexity for Meta to design its advertising model in a manner compliant with both regulations"). That might be perfectly understandable, were it not for the fact that we are, after all, dealing with Meta, which has elevated non-compliance with the GDPR to something of an art form worthy of Bernini. This raises both a puzzle and a question: will Meta be left to do much the same under the DMA? And might other gatekeepers be allowed to match, or even surpass, it? Is there, perhaps, a structural flaw in the DMA’s enforcement apparatus, as there quite plainly is in the GDPR’s, that gatekeepers can be expected to exploit at every opportunity? Or is it simply a matter of DMA enforcement resources falling well short of what would actually be required? What, then, can be inferred from this particular decision in response to that question? Serious reflection is clearly needed here. For now, I can offer you, Wavesblog Readers, only a few very first impressions, but I’d be all the more keen to hear yours.

Even before Compliance Day (7 March 2024), it is clear from the decision that the Commission already had serious reservations about the "Consent or Pay" advertising model that Meta was in the process of grinding out, and which had been presented to the Commission as early as 7 September 2023. The decision makes clear not only that the Commission was in close dialogue with Meta, but also that it engaged with several consumer associations and other interested third parties, on both the DMA and privacy sides of the matter. On that note, a further question, though perhaps it’s only me. Should there not be, if not a formal transparency requirement then at least a Kantian one, for the Commission to list, even in a footnote, all the interested parties with whom it bilaterally discussed the matter? On this point, one almost hears the Commission suggesting that such a transparency obligation might discourage others from speaking up, for fear of retaliation by the gatekeeper. The point is well taken, but one wonders whether some form of protected channel might be devised, a kind of "privileged observer’s window with shielding" available where reasonably requested, providing clear assurances that the identities of those coming forward will be safeguarded (short of being a whistleblower). Moreover, as is well known, this point tangentially touches on a broader issue. The EU legislator, likely with a view to streamlining enforcement, left limited formal room for well-meaning third-party involvement. The Commission-initiated compliance workshops, the 2025 edition of which has just begun, are a welcome addition, but they are, of course, far from sufficient. In particular, without access to fresh data provided by the gatekeepers, available only to the Commission, how are third parties expected to contribute anything genuinely useful at this point of the "compliance journey"? As we shall see, such concrete data was also an important point in the very process that led to the adoption of the non-compliance decision in this case (Meta knew that its 'compliance model' was producing exactly the result it wanted). The lawyers, economists and technologists on the DMA team have clearly had their hands full in the wrestling ring with Meta (hence, of course, the Sudans in the title). Even a quick reading of the decision reveals, between the lines and squarely on them, the array of tactics deployed by Meta to throw a spanner in the works of effective DMA compliance, all carefully orchestrated and calculated with precision, and surprising no one. But one does wonder whether the DMA’s hat might not still conceal other tools, better suited to crowdsourcing and channelling constructive efforts, particularly from those third parties who stand to benefit from the DMA (as we also heard in Gdańsk), from conflict-free academics, and from what might be called real civil society, genuinely committed to effective and resolute DMA enforcement, rather than the usual crop of gatekeeper-supported associations.

To loosely paraphrase Microsoft at the 2025 Enforcement Workshop, Meta’s “compliance model from day one” was a marble (code)-carved binary choice, one that could scarcely have been further from what the EU legislator had in mind. Unlike the strategies adopted by certain other gatekeepers, Meta didn’t even bother to kick the compliance can just far enough down the road to create an illusion of movement. The opening of non-compliance proceedings came swiftly and had, by all appearances, been fully anticipated. The decision that brings them to a close contains no epiphanic surprises, and is lengthy only because Meta's counsel deployed the full repertoire of legal ingenuity, focusing in particular on arguments forged in the intersection between the DMA and data protection law. Time, then, for a modest walkthrough, dear Wavesblog Readers.
 
As one assumes it must by now form part of general digital literacy (until one realises it doesn’t, even when speaking to students born to parents who were themselves, perhaps, already digital natives), Meta "generates almost the entirety of its revenues from its advertising service." On Facebook and Instagram, both designated under the DMA, end users "post and consume personalised content." Upon registering, each user receives "a dedicated, unique user identifier... and all data tied to that user identifier is part of a unified user account for that environment, i.e., the Facebook environment or the Instagram environment." Meta combines the personal data it collects from that user within the same social network environment and further merges it with data gathered through another designated core platform service, Meta Ads, to display personalised advertising. It is solely this latter data combination that is at the heart of the non-compliance decision.
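For readers who think better in toy models, here is a deliberately simplified, purely illustrative Python sketch of the kind of data combination the decision is about. The class names, fields and consent flag are my own inventions for illustration, not Meta's actual systems or the decision's wording: data tied to a single user identifier within one environment is merged with data gathered through the advertising service only where the end user has actually agreed to that combination.

```python
# Purely illustrative sketch (not Meta's actual architecture or the decision's text):
# data tied to one user identifier in one "environment" (e.g. Facebook) being merged
# with data collected through a separate advertising service for ad personalisation.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class EnvironmentAccount:
    """A unified user account within one social network environment."""
    user_id: str                                          # the dedicated, unique user identifier
    environment: str                                      # "Facebook" or "Instagram"
    activity: list[str] = field(default_factory=list)     # posts, likes, follows...

@dataclass
class AdsServiceData:
    """Hypothetical data gathered through the advertising service."""
    user_id: str
    ad_interactions: list[str] = field(default_factory=list)  # e.g. clicks on ads

def combine_for_personalised_ads(account: EnvironmentAccount,
                                 ads_data: AdsServiceData,
                                 user_agreed: bool) -> dict | None:
    """The combination at the heart of the decision: under Article 5(2) it presupposes
    a genuine choice for the end user, modelled here by a single boolean flag."""
    if not user_agreed:
        # serve less personalised ads based on the environment's own data alone
        return None
    return {
        "user_id": account.user_id,
        "signals": account.activity + ads_data.ad_interactions,
    }
```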

[I pick this up again now, not without noting that while I'm still writing this post, the 60-day deadline to comply with the cease-and-desist order has, according to my calculations (which are apparently wrong, though I don't understand why), lapsed, and that DMA enforcement may well have found its way into the cauldron of Trump-era trade negotiations].

Let us suppose that you, dear Wavesblog Reader, use Facebook in the EU. Meta, as a designated gatekeeper, must under the DMA (and commencing for real no later than today, 24 June, or on the 26th at the latest) present you with the specific choice of a less personalised but equivalent alternative to the advertising-based Facebook service that you would use if you had given your consent to the combination of your personal data from Facebook with data from Meta Ads.

[Let's have another go at this, shall we? A few rather significant developments have intervened since I last wrote here. First off, it appears DMA enforcement hasn't quite ended up in the cauldron of Trump trade negotiations. Second, we've had the last DMA Compliance Workshop, with Meta taking centre stage. During the workshop, we heard that from 27 June Meta has rolled out yet another iteration of its "compliance journey." The third development is that the Commission has launched a public consultation on the first review of the Digital Markets Act, which will keep many of us rather pleasantly occupied this summer. The fourth, and of less general interest (though I'm mentally preparing for it; we'll record the episode rather soon), is that I'll draw on thoughts from this piece to discuss the decision "Chez Oles" with two more Invitees. Finally, I should mention that it's not merely Meta and the DMA Team feeling the heat at the moment: the entire European continent is currently enduring what can only be described as a perfectly ghastly heatwave. Rather puts everything into perspective, doesn't it?]
 
Data cocktails beyond the decision 

Perhaps the best place to restart is with what was clearly explained during the Compliance Workshop, and you, dear Wavesblog Readers, who’ve perhaps already pored over the decision, will have caught all its nuances. The non-compliance decision at issue concerns a single type of data combination covered by Article 5(2). Many other cocktails Meta shakes up with your data fall outside the decision’s direct scope, but are still covered by the same article and have been extensively discussed in the ongoing regulatory dialogue between the DMA Team and Meta. Data cocktails can be mixed between CPSs themselves (e.g., Facebook and Messenger), but also between CPSs and the gatekeeper’s other services (e.g., Instagram and Threads). During the workshop, we heard from the Commission that these other 5(2)-related compliance discussions largely centred on what qualifies as an equivalent service for users opting out of data combination, without slipping into degradation beyond what is strictly necessary due to the reduced use of data. These, as we understood, are discussions about the kind of service a user who declines data combination is entitled to expect, and whether it is genuinely equivalent, as required by Article 5(2): for example, what Messenger or Threads look like when data combination with Facebook and Instagram, respectively, is not allowed by the end user. On this front, the Commission noted, with some satisfaction, that improvements have been made.

Increasingly hot...
 
Equally outside the scope of the non-compliance decision, but still very much part of the ongoing regulatory dialogue with Meta on Article 5(2) was, unsurprisingly, AI. The Commission has been actively following the rollout of Meta’s shiny new AI services, as well as the more familiar ones, like WhatsApp, where AI has been quietly slipped in, while considering what all this might mean for compliance with Article 5(2). 
We heard that the Commission is looking specifically at "data flows across services, how the data is sourced and used [to train, ground or fine-tune the models] and if that involves the combination of personal data." Moreover, the Commission is "looking at data for personalising or grounding AI both within designated services and other services offered by Meta." As we've witnessed throughout the rest of the Compliance Workshops, Series 2, a robust regulatory dialogue on AI is already well underway with all designated gatekeepers, a fact that appears to have been somewhat downplayed by Meta before the German court mentioned earlier, in Cologne, to whose recent ruling we now (very briefly) turn.
 
Source: Proton Drive on X
The case was brought by a German consumer protection association seeking to prevent Meta from using data publicly shared by adult users on Facebook and Instagram (such as your photos posted on a public Instagram account - so-called First Party Data), as well as user interactions with its AI (e.g., the questions you put to Meta AI, reused to fine-tune and improve those very systems - so-called Flywheel Data), to train its own large language models. In summer 2024, Meta had already informed its end users of its intention to train its own large language model using the data of adult users from the EU/EEA, with the training scheduled to begin on 26 June 2024, though this was later postponed following concerns raised by the Irish data protection authority, as well as, among others, by the claimant, who issued a formal warning. Less than a year later, after what one can only assume was a rather lively exchange between Meta, the regulators, and other stakeholders, Meta tried again. This time, no one stopped it. The German court had an eleventh-hour opportunity to do so, but decided not to take it. For our purposes, it's worth noting that the claimant argued not only that Meta was in breach of the GDPR, but also that it was in direct violation of its obligation under Article 5(2) of the DMA. At the workshop, we heard from the DMA Team that this very issue is part of their ongoing regulatory dialogue with Meta: how data flows across services, how it's sourced and used to train AI models, and whether this involves the combination of personal data. It’s a fascinating and extremely relevant question (a legal one, too), but one that would carry this blog post well off course. So, let’s return to the shore we’ve been meaning to stay anchored to: the non-compliance decision.
 
According to the Commission’s stated preference, the DMA Workshop was not meant to be the forum for discussing the non-compliance decision itself, but rather a space for Meta to illustrate where it currently stands on compliance with Article 5(2), also following its latest post-27 June tweaks, and to gather feedback from third parties. Wavesblog Readers can judge for themselves whether that’s quite how things actually played out. In any case, for context, the Commission did briefly outline the content of the non-compliance decision itself, offering a very helpful summary: one I’m grateful for and will return to as a running thread, gradually weaving in my own first reflections along the way. The decision concerns whether what we might call “1.0” of personal data combination in Meta's advertising services was compliant with the DMA. “2.0” was rolled out in November 2024, followed by “3.0” on 27 June 2025. 1.0 was squarely a "consent or pay" advertising model and this is what the decision addresses. 
 
End user's journey: 1.0
As noted above, Meta combines personal data for what the Commission, rather benevolently, refers to as ads personalisation on the Facebook and Instagram services. So this takes us straight to the heart of Meta’s economic engine, which has remained more or less unchanged since time immemorial, or at least since the dorm-room days. By 7 March 2024, it’s hard to argue that Meta could've been caught unprepared, as Article 5(2) itself drew initial inspiration from a long-running German antitrust saga in which Meta has been entangled since at least 2019. In its DMA non-compliance decision, "the Commission finds that Meta’s consent-or-pay advertising model fails to present end users of Meta's Facebook and Instagram platforms with a specific choice of a less personalised but equivalent alternative to the fully personalised option." At this point, it’s worth recalling that between March and November 2024, Facebook and Instagram users were asked to “choose” between Scylla (a six-headed sea monster who would snatch and devour sailors from passing ships on one side of the Strait of Messina) and Charybdis (a massive, gaping whirlpool that could swallow entire ships, on the other side of the Strait): only by paying could they escape the fully personalised option.
 
Penelope waiting, Chiusi Etruscan Museum 

Based on the Commission's reading of Article 5(2) DMA, the legal reasoning underpinning the non-compliance decision is twofold. First of all, the less personalised but paid-for (Scylla) and the fully personalised (Charybdis) options cannot be considered equivalent alternatives, as they exhibit different conditions of access. Second, the binary configuration of Meta’s consent-or-pay model (Scylla or Charybdis) doesn’t ensure that end users freely give consent to the personalised ads option, falling short of the GDPR requirements for the combination of personal data for that purpose. But what about ('data combination in Meta's advertising services') 2.0? In November 2024, Meta charted an additional ads option: a free-of-charge, advertising-based version of Instagram and Facebook. Further tweaks to this option followed just as the 60-day compliance period set out in the non-compliance decision was about to expire — enter 3.0. Is it finally the Ithaca Option, the one that ensures compliance with the DMA by giving the end user a real chance to exercise the data right enshrined in Article 5(2)? In its non-compliance decision, specifically on Meta’s data combination 1.0, the Commission, without delving into detail, also indicates what a DMA-compliant solution would, in its view, require, making explicit reference to elements introduced in 2.0:
1) the end user should be presented with a neutral choice with regard to the combination of personal data for ads so that he/she can make a free decision in this respect;
2) valid consent should be obtained for all the relevant personal data combined;
3) the less personalised alternative should be equivalent in terms of user experience, performance and conditions of access - except for the amount of personal data used.
In any case, the Commission has already informed Meta that it is still assessing whether 3.0 is sufficient to meet those main parameters of compliance, while also hinting that one still outstanding issue might be whether the end user is truly being presented with a neutral choice (1). 
 

Before diving into the more legal aspects of the non-compliance decision, while taking on board, too, at least some of the arguments aired by Meta during the workshop (soon to be set out in more precise legal terms in the upcoming appeal against the decision), I’d like to surf briefly above the currents of DMA enforcement. Allow me, dear Wavesblog Reader, a quick self-quotation (then I’ll stop, promise). Reflecting on Article 5(2) and other DMA data-related provisions in something I wrote early in 2021, I reached a conclusion I still stand by, namely that this provision, at its core, and with regard to online advertising specifically, should provide end users with a real choice over the level of ‘creepiness’ they’re comfortable with. That may sound rather underwhelming — and it is. In fact, in the same piece, I wondered whether it wasn’t finally time to regulate personalised advertising more broadly, and for real, without losing sight of the broader picture: that private digital infrastructures, especially informational ones, ought, depending on the case, to be regulated decisively, dismantled, made interoperable, and so on. At the same time, I do believe this end user's right to choose, as clearly carved into Article 5(2) of the DMA without tying it to one’s capacity to pay in order to escape a level of surveillance one finds uncomfortable (for oneself and/or the society one lives in), is immensely important.

 Bravissima, whatever. 

It’s a glimpse of something different: a very narrow but real incentive in favour of an "economic engine of tech" that starts rewarding services built around data minimisation rather than surveillance and data collection (and its combination). From this perspective, offering the mass of Facebook and Instagram users a genuinely neutral choice, and thus the concrete possibility of making a free decision in favour of a less personalised, but equivalent, alternative (e.g., reflecting their preferred level of creepiness), could also have a welcome educational effect. However, we can’t pretend that the current AI age hasn’t, for now at least, further accelerated the shift towards surveillance and data accumulation — and what’s being done to counter it still feels far too limited (see also the above-mentioned ruling by the Cologne court).

Hannibal in Italy, Palazzo dei Conservatori - Rome
(incidentally, where the Treaty of Rome was signed in 1957)
And so we come to those distinctly legal parts of the non-compliance decision which I found most interesting and thought-provoking — and which may well be among the points challenged in the forthcoming appeal, judging by the amuse-bouche offered by Meta’s representative during the compliance workshop. What some, Meta included, saw, and still see, as the unwieldy elephant in the Article 5(2) enforcement room is the namesake ruling from the CJEU. This isn’t just any elephant, however — it’s a war elephant, deployed by Meta in the front line to defend its 'data combination in Meta's advertising services' 1.0, namely its consent or pay model. The first thing to note is that the ruling, of course, isn’t about the DMA, but it stems from the German Facebook antitrust saga we’ve already touched on. According to Meta, however, that ruling has significant implications for how the DMA itself should be interpreted. Specifically, the ruling is meant to justify the binary option presented to end-users under the consent or pay model. It is beyond doubt, as recalled by the Meta representative, that the Court was directly referring to Meta—here acting in its capacity as an online social network holding a dominant position within the meaning of EU competition rules. Equally indisputable is that, although the case arose from a competition law context, part of the ruling addressed the conditions under which consent for the processing of personal data for personalised advertising can be considered validly given. The Commission, however, does not consider the Meta ruling to provide a valid justification for the consent or pay model under Article 5(2) of the DMA. 

Distinct legal notions 
The main argument supporting the Commission's view is rather straightforward: namely, that the Court could not have ruled on compliance with Article 5(2) of the DMA, as the latter introduces a specific test based on two distinct, cumulative legal notions—only one of which is the concept of consent within the meaning of Article 4, point (11), and Article 7 of the GDPR, while the other, that of a specific choice, must be understood as an autonomous notion under the DMA, with no equivalent in the GDPR or in any other legal instrument. Meta, as designated gatekeeper, is under the DMA obligation to present end users of Facebook and Instagram "with the specific choice of a less personalised but equivalent alternative to the advertising-based service it offers where those end users consent to the combination of their personal data from Meta’s Non-Ads Services in and with data from the [Online Advertising Service] CPS Meta Ads." So, under the DMA, it’s not enough for the end user to have given valid consent under the GDPR, as the user must also be presented with the specific choice of a less personalised alternative: only then can the end user exercise a real specific choice. This alternative should not rely on the combination of personal data. And this is precisely where this new data right granted to end users under Article 5(2) starts to take form. End users are entitled to choose whether, in order to use the online social networks Facebook and/or Instagram, they want their data to be combined or not — in this case, with data from the other CPS used by Meta for advertising purposes. The consent-or-pay model clearly didn’t offer this kind of option — something Meta began to address with its advertising model 2.0, though likely still not sufficiently for effective compliance, in the Commission’s view, given that Meta has since moved to 3.0. We will return later to the original scope and contours of this particular right granted to end users under the DMA, as further clarified by the EU legislator. For now, however, we must still address other aspects of the decision that are undoubtedly of interest going forward.

As just noted, the enforcement of the DMA is not necessarily bound by the contours of the GDPR where the DMA goes beyond it, or simply moves in a different direction. However, for those parts where the EU legislator has drawn on legal concepts from the GDPR, such as the requirement that end users give valid consent to the combination of their personal data for the purpose of serving personalised advertisements,  reference must be made to Article 4(11) and Article 7 of  the GDPR. This, in turn, triggers a duty of sincere cooperation with the supervisory authorities responsible for monitoring the application of that Regulation. Here, however, a possible complication arises, which in the ideal world of EU law enforcement perhaps shouldn’t exist. And yet, it does. We won’t dwell on it for long, but it’s worth highlighting (and it may at least partly explain Mario Draghi’s well-documented allergy to the GDPR, reaffirmed only a few days ago). The cast of characters on the GDPR enforcement stage has often found it difficult to offer a consistent reading of this Regulation. Unsurprisingly, this has weakened enforcement and worked to the advantage of those less inclined to embrace it. This risk has been avoided under the DMA, notably by assigning quasi-exclusive enforcement power to a single entity: the Commission (though arguably creating others in the process, which we won’t go into here). Echoes of the considerable effort supervisory authorities expend to ensure even a modicum of consistent GDPR application also surface in the text of the DMA non-compliance decision, where the Commission finds itself almost compelled to justify having taken into account the EDPB’s opinion on the matter when applying the GDPR (e.g., explaining that "the fact that some members of the Board voted differently or expressed reservations about the Opinion 08/2024 does not diminish the value of that Opinion, in the same way that the Court’s judgments produce their effects irrespective of whether they were decided by majority or unanimously"). What matters is that DMA enforcement not be delayed or undermined by such tensions, which, one imagines, some gatekeepers could hope to exploit, should a tempting opportunity arise. 

Series 1
In the decision, the Commission doesn’t merely argue for the legal autonomy of the notion of “specific choice” — which, in itself, would already be enough, and quite rightly so, in my view, especially given the legislative history of the provision (one recalls that, in the DMA Trialogue, the alternative under serious consideration was a straightforward ban on data combination, given the broadly recognised limitations of GDPR consent in light of the DMA's specific objectives). Of some significance, also for the wider GDPR community, should be the Commission’s careful reading of the very (few) words of the Court’s ruling that Meta has been brandishing since shortly after it was issued, in defence of its consent-or-pay advertising model. I’ve just rewatched last year’s Compliance Workshop, partly to check whether I remembered it correctly: there was a certain weary note, as if Meta had found itself navigating a 'forest dark' of regulators and judges. From the ashes of that multi-directional, conceptual crossfire, the consent-or-pay model seemed to arise like a phoenix. The words of the Meta ruling interpreted by Meta as a green light for the model are: “if necessary for an appropriate fee” (paragraph 150). As previously recalled, the Court’s words must be read in their specific context: the Court was addressing whether an imbalance of power resulting from a dominant position could invalidate consent when tied to data processing not strictly necessary for contract performance (in this case, processing for behavioural advertising). The CJEU held that, in such a situation, consent may still be valid only if users are free to refuse processing and are nonetheless allowed access to an equivalent service without such processing, if necessary for an appropriate fee. In that context, the Commission makes clear in the decision that interpreting valid consent under the GDPR starts with a key premise: it is for the controller in a dominant position to demonstrate that such a 'fee for no tracking' is necessary to continue providing online networking services to the non-consenting end user. That is clearly not the case where other, more favourable options to provide that service exist. In any case, the onus to demonstrate that necessity was on the controller — on Meta — and here, it simply didn’t meet it. Moreover, a further point that the Commission does not leave unaddressed is the question of whether the fee was appropriate at all. In this respect, the Commission ventures into a discussion of the term fee itself — noting that, in the language of the proceedings (German), the word used is Entgelt, which is more accurately understood in English as remuneration. This implies, according to the Commission, that remuneration need not necessarily take the form of a monetary fee "which has to be paid directly by the end user if Meta could be appropriately remunerated in a different manner." In fact, data controllers may, where necessity is duly demonstrated, receive another appropriate consideration for the online service they provide.

Zama? 
That specific elephant may not have survived the crossing of the Alps, but the legal battle between the Commission and Meta over whether consent can be deemed freely given in the context of a ‘Consent or Pay’ model, at this point in the non-compliance decision, is far from over. On the contrary, without the specific cover offered by those *six* words from the Meta ruling, the legal discussion necessarily broadens and becomes, frankly, even more engaging. This, however, calls for a deeper analysis, going well beyond the scope of this blog post, which is already out of proportion by my modest standards. There are surely several commentaries in the making. What is clear, though, is that the Commission, as sole enforcer of the DMA, has worked in cooperation with data protection authorities, both within the High-Level Group and beyond, relying in particular on the already mentioned EDPB Opinion. It’s also worth noting that while the debate over what constitutes freely given consent to the processing of one’s data by a dominant undertaking of Meta’s calibre and outreach has been ongoing at least since the Bundeskartellamt took a keen interest in the question, the notion of "specific choice" under the DMA is both novel and distinctive, thereby making the Commission’s first take on it all the more compelling. This is also because, even though the decision is clearly focused on why, between March and November 2024 (when the consent or pay model was in effect), Meta was not DMA-compliant, it nonetheless engages in tracing, affirmatively, the contours of the specific data right conferred on end users under Article 5(2). This is a particularly important discussion both now and going forward, especially in light of the gatekeepers’ obligation to future-proof the services they roll out. The judges who will soon be called upon to interpret this provision, following Meta’s appeal against the decision, are unlikely to disregard this key dynamic. 
 
Time, then, to look at things squarely from the end user’s perspective, because it is your rights in the digital age that are at stake. That is, you, dear Wavesblog Reader, using Instagram, Facebook, or both. According to the letter of the DMA, Meta is obliged to offer you a specific choice. Such a specific choice requires Meta to offer you a less personalised, but equivalent, alternative Facebook/Instagram service that does not rely on the combination of your personal data from Non-Ads Services with data from Meta Ads for the purpose of displaying personalised advertisements to you on the Facebook or Instagram 'environments.' But let’s take a moment to clarify a couple of points that, unfortunately, surface only with difficulty in the non-confidential versions of Meta's compliance reports. The public and researchers not privy to more confidential exchanges can, however, get a clearer picture if someone asks the right questions during the compliance workshops and if Meta answers them fully and clearly. Otherwise, one must wait for the Commission to do so in an official and informed position, as it does in the non-compliance decision. First of all, it’s important to understand that what gets combined with data in Meta Ads is not just personal data from Facebook and Instagram, as it also includes your data from Messenger, Marketplace, and even Dating or Gaming Play, if you use them. You might then ask yourself which of your personal data are actually processed within Meta Ads. The decision mentions "personal data [Meta] collects from end users interacting with its OAS CPS Meta Ads"..."such as engagement with an advertisement or ad account." At its core, it’s about the data flow: specifically, the movement of your personal data from one service to another, in both directions. For instance, the personalisation of your Facebook feed may be based not only on your activity within Facebook, but also on how you engage with ads. More fundamentally, your personal data is the fuel powering Meta’s targeted advertising machine, used in various ways to deliver the ads you see on your Facebook/Instagram environments. As part of Meta Ads, Meta runs an advertising auction that uses first party data (data collected by Meta from you, across its Non-Ads Services), third party data (your personal data that Meta receives from third parties, such as your activity on third party websites, and also offline interactions), as well as campaign information provided by advertisers, to determine the selection and ranking of the ads "surfaced" to you.
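To make those mechanics a little more concrete, here is a deliberately simplified, hypothetical Python sketch of how an auction of this kind might rank candidate ads using those three inputs. The scoring rule, field names and numbers are invented purely for illustration; they are not drawn from the decision or from any Meta documentation.

```python
# Hypothetical sketch of an ad auction ranking candidate ads using first party data,
# third party data and advertiser campaign information. Invented for illustration only.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class CandidateAd:
    advertiser: str
    bid: float            # campaign information supplied by the advertiser
    topic: str

def relevance(topic: str, first_party: set[str], third_party: set[str]) -> float:
    """Crude relevance signal: does the ad topic match the user's observed interests?"""
    score = 0.0
    if topic in first_party:   # interests inferred from activity on the gatekeeper's own services
        score += 1.0
    if topic in third_party:   # interests inferred from off-platform or offline activity
        score += 0.5
    return score

def rank_ads(ads: list[CandidateAd],
             first_party: set[str],
             third_party: set[str]) -> list[CandidateAd]:
    """Order ads by bid weighted by estimated relevance (a stand-in for 'total value')."""
    return sorted(
        ads,
        key=lambda ad: ad.bid * (1 + relevance(ad.topic, first_party, third_party)),
        reverse=True,
    )

ads = [CandidateAd("A", bid=2.0, topic="running shoes"),
       CandidateAd("B", bid=1.2, topic="holiday rentals")]

# With third party data combined, ad A wins; without it, ad B wins.
print(rank_ads(ads, first_party={"holiday rentals"}, third_party={"running shoes"}))
print(rank_ads(ads, first_party={"holiday rentals"}, third_party=set()))
```

The point of the toy example is simply that removing one data source can change which ads "win", which is one way to picture the difference between the fully personalised option and a less personalised, but otherwise equivalent, one.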
 

 

 TBC