Preserving Democracy Against the Abuse of Power: A Concrete Example of Manipulation on Social Media

Legal Representation Request for Lawsuit Against Platform X

1. Introduction and Request

I am reaching out to my friend and lawyer, David Carter, to ask whether he would be willing to represent me in a lawsuit in California against Platform X (formerly Twitter). The lawsuit concerns a violation of the First Amendment of the U.S. Constitution, specifically the right to free speech. During a live broadcast featuring Elon Musk and Donald Trump, I observed selective content filtering: pro-Trump posts were highlighted, while posts critical of Kamala Harris were suppressed. This creates a biased narrative that could be interpreted as propaganda aimed at influencing public opinion and, ultimately, the election outcome.

I believe this is a clear violation of the democratic process, and I would appreciate his assistance in taking legal action against Elon Musk and Platform X.

Request for Legal Representation: Potential Lawsuit Against Platform X for First Amendment Violation

Date: September 13, 2024

Subject: Legal Representation Request for Lawsuit Against Platform X

Email content:

Dear David Carter,

I hope you're doing well. I'm writing to see if you would consider representing me in a potential lawsuit in the state of California against Platform X, formerly known as Twitter. The lawsuit pertains to a violation of the First Amendment of the U.S. Constitution—specifically, the right to free speech.

The issue arose during a live call featuring Donald Trump, in which I observed questionable, seemingly biased filtering of comments. Pro-Trump posts were visibly highlighted, while posts critical of other candidates, particularly Kamala Harris, were suppressed. This behavior appears to be a form of propaganda and manipulation of public opinion, potentially aimed at influencing the outcome of the upcoming election.

In addition, I have contacted the European Commission regarding my concerns about X's infringement of the Digital Services Act (DSA). After filing my complaint, I received a response from the Head of Unit acknowledging the Commission's ongoing actions. However, I firmly believe that justice must also be pursued in California, where X is currently registered. There is an urgency to the matter as well: in July, Mr. Musk announced plans to move the company to Texas, which could, in part, be an attempt to evade legal scrutiny by relocating to a more conservative state.

I’ve already posted an informal letter of complaint on my X account, which you can find here.

Additionally, I’ve written a blog post explaining my correspondence with the European Commission, which you can review here.

I appreciated your representation in a previous case and would be grateful for your time in considering this request. I need an expert lawyer to draft the formal complaint and proceed with legal action, not only against Elon Musk but also against the platform itself for violating democratic processes in the United States.

For your reference, the blog article is titled: “Holding Twitter/X Accountable for Fairness and Transparency” and was published on September 10, 2024. The article covers my concerns about biased moderation practices during the Trump call, which favored one political narrative over another during this critical election period.

Thank you for considering my request. I look forward to hearing from you.

Best regards,  

Kevin Bihan-Poudec  

Founder, Voice For Change Foundation

Advocate for Ethical AI, Workforce Preservation and Human Rights

2. Concerns Regarding the Digital Services Act (DSA) and Communication with the European Commission

I have already communicated with the European Commission regarding my concerns about X’s potential violation of the Digital Services Act (DSA). After submitting my formal complaint, I received a response from the Head of Unit, Prabhat Agarwal, acknowledging the Commission’s ongoing actions against Platform X. These actions focus on the platform’s compliance with the DSA, particularly around content moderation, algorithmic manipulation, and transparency during the election period.

Original Complaint to the European Commission:

Date: August 14, 2024

Subject: Concern Regarding Potential Violation of Digital Services Act on Twitter/X

Letter Content:

"Dear Members of the European Commission,

I hope this message finds you well. I am writing to bring to your attention a matter of great concern regarding the recent actions of Twitter/X, a platform now under the ownership of Elon Musk. Specifically, I would like to address the apparent filtering of "relevant comments" during a live phone call between former U.S. President Donald Trump and Elon Musk. You can find more information in the article below, “Democracy in Danger in America Led by the Most Prominent Tech CEO: Elon Musk.”

As a user of the platform, I observed that the comments marked as "most relevant" during the live broadcast appeared to be overwhelmingly one-sided, favoring the narrative of a single political figure. This selective filtering, if intentional, raises serious concerns about the integrity of the platform and its compliance with the Digital Services Act (DSA), particularly regarding the amplification of content that may not be in the public interest and could potentially mislead or influence public opinion during a crucial election period.

As you are aware, the DSA mandates that platforms, especially those designated as Very Large Online Platforms (VLOPs), must adhere to strict due diligence obligations. These include ensuring that content moderation practices are transparent, non-discriminatory, and do not unduly favor any political ideology or candidate. The biased selection of comments, which seemingly promoted one viewpoint over others, could be seen as a violation of these principles.

Moreover, the ability of one individual to exert such control over the public discourse on a platform with over 300 million users worldwide, one-third of which are in the EU, is alarming. This concentration of power poses a significant risk to the democratic process, as it allows for the manipulation of public opinion on a large scale.

Given the gravity of this situation, I kindly request that the European Commission conduct a thorough investigation into the actions of Twitter/X during this broadcast. It is essential to ensure that the platform is not being used to unfairly influence the outcome of elections or to propagate a specific political agenda, particularly in a manner that could violate EU law.

I appreciate your attention to this matter and look forward to your response on the steps that will be taken to address these concerns. 

Thank you for your commitment to upholding the principles of transparency and fairness in the digital space.

Sincerely,

Kevin Bihan-Poudec

Founder, Voice For Change Foundation | Advocate for Ethical AI and Workforce Preservation”

European Commission’s Original Complaint to Elon Musk:

Date: August 12, 2024

From: Thierry Breton, Member of the European Commission

Letter Content:

"Dear Mr. Musk,

I am writing to you in the context of recent events in the United Kingdom and in relation to the planned broadcast on your platform X of a live conversation between a U.S. presidential candidate and yourself, which will also be accessible to users in the EU.

I understand that you are currently doing a stress test of the platform. In this context, I am compelled to remind you of the due diligence obligations set out in the Digital Services Act (DSA), as outlined in my previous letter. As the individual entity ultimately controlling a platform with over 300 million users worldwide, of which one-third in the EU, that has been designated as a Very Large Online Platform, you have the legal obligation to ensure X's compliance with EU law and in particular the DSA in the EU.

This notably means ensuring, on one hand, that freedom of expression and of information, including media freedom and pluralism, are effectively protected and, on the other hand, that all proportionate and effective mitigation measures are put in place regarding the amplification of harmful content in connection with relevant events, including live streaming, which, if unaddressed, might increase the risk profile of X and generate detrimental effects on civic discourse and public security. This is important against the background of recent examples of public unrest brought about by the amplification of content that promotes hatred, disorder, incitement to violence, or certain instances of disinformation.

It also implies:

i) informing EU judicial and administrative authorities without undue delay on the measures taken to address their orders against content considered illegal, according to national and/or EU law,

ii) taking timely, diligent, non-arbitrary, and objective action upon receipt of notices by users considering certain content illegal,

iii) informing users concerning the measures taken upon receipt of the relevant notice, and

iv) publicly reporting about content moderation measures.

In this respect, I note that the DSA obligations apply without exceptions or discrimination to the moderation of the whole user community and content of X (including yourself as a user with over 190 million followers) which is accessible to EU users and should be fulfilled in line with the risk-based approach of the DSA, which requires greater due diligence in case of a foreseeable increase of the risk profile.

As you know, formal proceedings are already ongoing against X under the DSA, notably in areas linked to the dissemination of illegal content and the effectiveness of the measures taken to combat disinformation.

As the relevant content is accessible to EU users and being amplified also in our jurisdiction, we cannot exclude potential spillovers in the EU. Therefore, we are monitoring the potential risks in the EU associated with the dissemination of content that may incite violence, hate, and racism in conjunction with major political or societal events around the world, including debates and interviews in the context of elections.

Let me clarify that any negative effect of illegal content on X in the EU, which could be attributed to the ineffectiveness of the way in which X applies the relevant provisions of the DSA, may be relevant in the context of the ongoing proceedings and of the overall assessment of X's compliance with EU law. This is in line with what has already been done in the recent past, for example in relation to the repercussions and amplification of terrorist content or content that incites violence, hate, and racism in the EU, such as in the context of the recent riots in the United Kingdom.

I therefore urge you to promptly ensure the effectiveness of your systems and to report measures taken to my team.

My services and I will be extremely vigilant to any evidence that points to breaches of the DSA and will not hesitate to make full use of our toolbox, including by adopting interim measures, should it be warranted to protect EU citizens from serious harm.

Yours sincerely,

Thierry Breton

Member of the European Commission"

European Commission Response to My Complaint:

Date: September 9, 2024

Subject: Ares(2024)5846153 Concern Regarding Potential Violation of Digital Services Act on Twitter/X

Letter Content:

“Dear Kevin Bihan-Poudec,

Thank you for your message addressed to Commissioner Breton, who has asked me to respond on his behalf, as Head of Unit in charge of the Digital Services Act.

As you mention in your letter, the European Commission proposed the Digital Services Act (DSA), which we are currently implementing to address risks related to illegal content, fundamental rights and disinformation and to ensure that decisions by online platforms are transparent and predictable.

Very Large Online Platforms (VLOPs), such as X, are subject to regulatory oversight by the European Commission.

On 18 December 2023, the Commission opened formal proceedings against X to assess whether X may have breached the DSA in areas linked to risk management, content moderation, dark patterns, advertising transparency and data access for researchers. You can find the decision here, and you can read more about the issues that are being investigated at this link.

On 14 March 2024, the Commission sent X a request for information on its mitigation measures for risks linked to generative AI.

On 8 May 2024, the Commission sent an additional request for information to obtain more details on X's content moderation activities and resources, on the risk assessment conducted by X in relation to the implementation of generative AI tools in the EU as well as on other areas covered by the ongoing proceedings.

On 12 July 2024, the Commission informed X of its preliminary view that it is in breach of the Digital Services Act (DSA) in areas linked to dark patterns, advertising transparency and data access for researchers.

In relation to content moderation, the ongoing investigation focuses on systemic issues, such as effectiveness of X’s content moderation measures. The Commission does not have direct competence to intervene in individual content moderation decisions.

Nevertheless, under the new DSA system, users will always need to address the online platforms first when encountering a problem with content moderation, such as the one you are facing. However, the DSA puts several obligations on online platforms to make this process more efficient, transparent, and fair. Firstly, they are required to have a point of contact for users, such as email addresses, instant messages, or chatbots. Online platforms also have to ensure that contact is quick and direct and cannot solely rely on automated tools, making it easier for users to reach platforms if they wish to make a complaint. Secondly, online platforms must provide clear and specific reasons for their decisions concerning accessing and suspending a user’s account. Thirdly, if a user chooses to have the decision reviewed, this must be handled free of charge via a platform’s internal complaints system. And lastly, the Digital Services Act will allow users to use certified out-of-court dispute settlement bodies to resolve disputes relating to platforms’ decisions about users’ content.

If the platforms do not respect any of the rights and obligations under the DSA, the user has the right to file complaints with the Digital Services Coordinator in their EU Member State, which can start supervisory actions against the platform.

You can read more about the Digital Services Act at this link.

Yours sincerely,

Prabhat AGARWAL

Head of Unit”

My Response to the European Commission:

Date: September 10, 2024

Letter Content:

Dear Prabhat Agarwal,

Thank you for your prompt and detailed response to my concerns regarding potential violations of the Digital Services Act (DSA) by Twitter/X. I appreciate the ongoing efforts of the European Commission to ensure transparency, accountability, and fairness within very large online platforms (VLOPs), particularly those with the systemic influence that Twitter/X holds.

As mentioned in your correspondence, the formal proceedings against Twitter/X include concerns about content moderation, risk management, transparency in advertising, and algorithmic manipulation. Based on my own observations, I would like to further emphasize the pressing nature of these issues, particularly with regard to the manipulation of public opinion during crucial election periods.

As mentioned in my earlier correspondence, during a live broadcast featuring Elon Musk and 2024 U.S. presidential candidate Donald Trump, I observed a significant bias in comment moderation. Comments supporting Trump were highlighted as "most relevant," while negative comments about Kamala Harris were suppressed, creating a skewed narrative that clearly favored one political figure over another. This is deeply concerning given the influence such platforms wield in shaping political discourse and public perception.

While platforms like Twitter/X are private entities, their immense power over digital communication means they play an outsized role in the free exchange of ideas—especially during elections. If algorithms are used to selectively promote certain viewpoints, this undermines the spirit of free speech and threatens the integrity of democratic processes.

While I have addressed this issue with the platform directly, see attached, as required under the DSA, I would appreciate your guidance on the next steps if the platform fails to meet its obligations. Specifically, as a resident of California, what would be the appropriate course of action regarding filing complaints with the Digital Services Coordinator in an EU Member State, given the international scope of this issue?

I would like to formally share my findings and opinion as part of the legal actions that may arise from these proceedings. It is imperative that platforms of this size are held accountable, particularly when their actions have the potential to influence elections and erode public trust in democratic institutions.

Thank you once again for your commitment to upholding the values of transparency, fairness, and accountability within the digital space. I look forward to engaging further on this matter and contributing to the ongoing proceedings against Twitter/X and its owner, Elon Musk.

Kind regards,
Kevin Bihan-Poudec
Founder, Voice For Change Foundation

In conclusion, this case is not about personal monetary compensation, but rather about holding the powerful accountable and ensuring that no one is above the law.

The actions of Platform X under Elon Musk’s control present a clear threat to the democratic process, and it is imperative that we take action to safeguard free speech and transparency, especially during critical times like elections.

With the legal expertise of a lawyer, I am confident that we can set a strong precedent for responsible digital governance and demonstrate that these platforms, no matter how influential, must adhere to the principles of fairness, accountability, and the rule of law as outlined in the Constitution of the United States.
