Editor’s Note: This article highlights the European Commission’s preliminary findings against X for alleged breaches of the Digital Services Act (DSA). These findings underscore critical issues of transparency and accountability that are central to the DSA, particularly in relation to account verification, advertising transparency, and data access for researchers. This development is particularly relevant for cybersecurity, information governance, and eDiscovery professionals, as it emphasizes the importance of compliance with regulatory standards designed to protect users and maintain trust in digital platforms. The outcome of this case could set significant precedents for the industry, influencing future practices and regulatory approaches.


Content Assessment: Digital Services Act Enforcement Heats Up: X Faces Preliminary Findings

Information - 90%
Insight - 92%
Relevance - 88%
Objectivity - 91%
Authority - 94%

Overall Rating - 91% (Excellent)

A short, percentage-based assessment of the qualitative benefit and positive reception of the recent article from ComplexDiscovery OÜ titled, "Digital Services Act Enforcement Heats Up: X Faces Preliminary Findings."


Industry News – Data Privacy and Protection Beat

Digital Services Act Enforcement Heats Up: X Faces Preliminary Findings

ComplexDiscovery Staff

The European Commission has taken a significant step in enforcing the Digital Services Act (DSA), issuing preliminary findings against X, formerly known as Twitter, for alleged breaches of the regulation. This move marks a pivotal moment in the EU’s efforts to regulate large online platforms and ensure digital transparency and accountability.

The Commission’s investigation into X’s practices has revealed potential non-compliance in three key areas: the platform’s “verified accounts” system, advertising transparency, and data access for researchers. These findings underscore the challenges faced by social media giants in adapting to the stringent requirements of the DSA, which aims to create a safer and more transparent online environment.

The first area of concern revolves around X’s implementation of its “verified accounts” system, denoted by the familiar blue checkmark. According to the Commission, the current design and operation of this feature deviate from industry standards and potentially mislead users. The crux of the issue lies in the fact that anyone can subscribe to obtain a “verified” status, which the Commission argues compromises users’ ability to make informed decisions about the authenticity of accounts and the content they encounter. This situation has apparently led to instances of malicious actors exploiting the system to deceive users, raising serious questions about the platform’s approach to user trust and content credibility.

Advertising transparency forms the second pillar of the Commission’s preliminary findings. The DSA mandates that very large online platforms (VLOPs) provide a searchable and reliable repository of advertisements. However, the Commission contends that X has fallen short of this requirement. The platform’s current ad repository allegedly contains design features and access barriers that render it unsuitable for its intended transparency purpose. This deficiency not only affects users’ ability to understand the advertising landscape on the platform but also hinders the necessary supervision and research into emerging risks associated with online advertising distribution.
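To make the repository requirement more concrete, the sketch below outlines, in Python, the general categories of information the DSA expects a very large online platform's ad repository to expose in searchable form. The field names, types, and search function are illustrative assumptions made for this example; they do not represent X's actual schema or interface.

```python
# Illustrative data model only. The fields paraphrase the categories of ad
# information the DSA directs very large online platforms to publish in a
# searchable repository; names and types are assumptions for this sketch,
# not X's actual schema.
from __future__ import annotations

from dataclasses import dataclass
from datetime import date


@dataclass
class AdRepositoryRecord:
    ad_id: str
    ad_content: str                       # content of the advertisement
    advertiser: str                       # person or entity on whose behalf it was presented
    payer: str                            # who paid for the ad, if different from the advertiser
    display_start: date                   # period during which the ad was presented
    display_end: date
    targeting_parameters: dict[str, str]  # main parameters used to target (or exclude) groups, if any
    total_recipients: int                 # aggregate reach, broken down by Member State in practice


def search_by_advertiser(records: list[AdRepositoryRecord], advertiser: str) -> list[AdRepositoryRecord]:
    """Minimal example of the kind of searchable access the repository is meant to support."""
    return [r for r in records if r.advertiser.lower() == advertiser.lower()]
```

The point of the sketch is the design requirement itself: the repository must be structured and queryable enough that users, supervisors, and researchers can reliably retrieve who advertised what, when, and to whom.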

The third area of concern relates to X’s approach to providing data access for researchers. The DSA emphasizes the importance of allowing eligible researchers to access public data from VLOPs to conduct studies and analyses. However, the Commission’s findings suggest that X has placed undue restrictions on this access. Specifically, X’s terms of service prohibit independent access to public data through methods such as scraping. Furthermore, the process for granting researchers access to X’s application programming interface (API) appears to be designed in a way that discourages research projects or forces researchers to pay exorbitant fees for access. This situation potentially stifles important research that could contribute to a better understanding of the platform’s impact and dynamics.
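For illustration only, the following sketch shows the general shape of the programmatic, API-based access to public platform data that researcher access under the DSA contemplates. The base URL, endpoint path, parameters, and credential handling are assumptions invented for this example; they are not drawn from X's documentation or from the Commission's findings.

```python
# Illustrative sketch of API-based researcher access to public platform data.
# The base URL, endpoint, parameters, and credential tier below are hypothetical.
from __future__ import annotations

import os

import requests

API_BASE = "https://api.example-platform.com/v2"          # hypothetical API root
BEARER_TOKEN = os.environ.get("RESEARCH_API_TOKEN", "")    # hypothetical researcher credential


def search_recent_public_posts(query: str, max_results: int = 10) -> list[dict]:
    """Fetch recent public posts matching a query from a hypothetical research endpoint."""
    response = requests.get(
        f"{API_BASE}/posts/search/recent",
        headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
        params={"query": query, "max_results": max_results},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("data", [])


if __name__ == "__main__":
    for post in search_recent_public_posts("digital services act"):
        print(post.get("id"), str(post.get("text", ""))[:80])
```

The Commission's concern is precisely about the terms and cost of this kind of access: if scraping is prohibited and API credentials are priced or gated in ways that deter research projects, eligible researchers are effectively shut out of the public data the DSA intends them to study.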

These preliminary findings represent a crucial juncture in the enforcement of the DSA. X now has the opportunity to examine the documents in the Commission’s investigation file and respond to the preliminary findings, exercising its rights of defense. Concurrently, the European Board for Digital Services will be consulted on the matter.

The potential consequences for X, should these preliminary views be confirmed, are substantial. The Commission could adopt a non-compliance decision, finding X in breach of Articles 25, 39, and 40(12) of the DSA. Such a decision could result in fines of up to 6% of X’s total worldwide annual turnover, along with orders to implement measures addressing the identified breaches. Moreover, a non-compliance decision might trigger an enhanced supervision period to ensure X’s adherence to the remedial measures.
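A back-of-the-envelope illustration of that fine ceiling follows, using a deliberately hypothetical turnover figure rather than X's actual financials.

```python
# Back-of-the-envelope illustration of the DSA's fine ceiling described above:
# up to 6% of a provider's total worldwide annual turnover. The turnover figure
# used here is a placeholder, not X's actual revenue.
DSA_FINE_CAP = 0.06  # 6% of total worldwide annual turnover


def max_dsa_fine(worldwide_annual_turnover_eur: float) -> float:
    """Return the maximum fine the Commission could impose under the 6% cap."""
    return DSA_FINE_CAP * worldwide_annual_turnover_eur


# Hypothetical example: EUR 3 billion in worldwide annual turnover
print(f"Maximum possible fine: EUR {max_dsa_fine(3_000_000_000):,.0f}")  # EUR 180,000,000
```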

This development is part of a broader effort by the European Commission to enforce the DSA across major online platforms. X was designated as a Very Large Online Platform under the DSA on April 25, 2023, following its declaration of reaching over 45 million monthly active users in the EU. The formal proceedings against X were opened on December 18, 2023, encompassing not only the areas addressed in these preliminary findings but also concerns related to the dissemination of illegal content and the effectiveness of measures to combat information manipulation.

The Commission’s actions extend beyond X, with formal proceedings opened against other major platforms such as TikTok, AliExpress, and Meta in recent months. This widespread scrutiny underscores the EU’s commitment to enforcing the DSA and shaping a more accountable digital landscape.

For professionals in cybersecurity, information governance, and eDiscovery, these developments highlight the increasing regulatory pressure on social media platforms and the potential far-reaching implications for data management, user privacy, and content moderation. The focus on transparency in advertising and data access for researchers is particularly relevant, as it may lead to new standards and practices in how platform data is collected, stored, and made available for scrutiny.

The emphasis on combating deceptive practices, such as the misuse of verified account systems, also underscores the growing importance of robust identity verification and trust mechanisms in online environments. This could potentially influence future developments in digital identity management and online authentication systems.

Furthermore, the Commission’s approach to enforcing the DSA, including the use of a whistleblower tool for anonymous reporting, sets a precedent for regulatory oversight in the digital age. This may prompt organizations to reassess their internal compliance mechanisms and whistleblowing policies to align with evolving regulatory expectations.

As the situation unfolds, industry professionals must closely monitor the outcomes of these proceedings and their potential impact on digital platform governance, data access policies, and online advertising practices. The resolution of these issues may well shape the future landscape of online interactions, data transparency, and platform accountability in the European Union and beyond.

Assisted by GAI and LLM Technologies

Source: ComplexDiscovery OÜ



ComplexDiscovery OÜ is a highly recognized digital publication focused on providing detailed insights into the fields of cybersecurity, information governance, and eDiscovery. Based in Estonia, a hub for digital innovation, ComplexDiscovery OÜ upholds rigorous standards in journalistic integrity, delivering nuanced analyses of global trends, technology advancements, and the eDiscovery sector. The publication expertly connects intricate legal technology issues with the broader narrative of international business and current events, offering its readership invaluable insights for informed decision-making.

For the latest in law, technology, and business, visit ComplexDiscovery.com.


Generative Artificial Intelligence and Large Language Model Use

ComplexDiscovery OÜ recognizes the value of GAI and LLM tools in streamlining content creation processes and enhancing the overall quality of its research, writing, and editing efforts. To this end, ComplexDiscovery OÜ regularly employs GAI tools, including ChatGPT, Claude, DALL-E 2, Grammarly, Midjourney, and Perplexity, to assist, augment, and accelerate the development and publication of both new and revised content in published posts and pages, a practice initiated in late 2022.

ComplexDiscovery also provides a ChatGPT-powered AI article assistant for its users. This feature leverages LLM capabilities to generate relevant and valuable insights related to specific page and post content published on ComplexDiscovery.com. By offering this AI-driven service, ComplexDiscovery OÜ aims to create a more interactive and engaging experience for its users, while highlighting the importance of responsible and ethical use of GAI and LLM technologies.