After a conflict over how the social network's content should be regulated in the European Union, Elon Musk shut down Twitter's entire Brussels office.
The closure of the Brussels office may strain Twitter's relationship with the European Union, which has some of the most robust regulations governing the digital world and is frequently at the forefront of global rule-making in the sector.
One rule requires platforms like Twitter to remove content that is illegal in any of the bloc's member states. For instance, tweets aimed at influencing elections, or content amounting to hate speech, would need to be removed in jurisdictions where such communication is prohibited.
Another obligation is that social media sites like Twitter must demonstrate to the European Commission, the EU's executive arm, that they are making a sufficient effort to stop the spread of content that is not illegal but may be harmful, such as disinformation. This summer, companies will need to show how they are handling such posts.
Musk will also need to comply with the General Data Protection Regulation (GDPR), the EU's ground-breaking data protection law, which requires Twitter to have a data protection officer in the EU.
A current EU proposal forbids the use of algorithms that have been demonstrated to be biased against individuals. This could affect Twitter's face-cropping tools, which have been shown to favor young, slim women.
Under the EU's proposal on child sexual abuse material, Twitter might also be obligated to scan private conversations for grooming or images of child sexual abuse. The proposal is still under discussion in the EU.
To comply with the Digital Services Act (DSA), Twitter will need to do considerably more, such as creating a system that lets users easily flag illegal content and hiring enough moderators to review content in every EU member state.
Twitter won't have to publish a risk assessment until next summer, but it will have to disclose its user count in February, which starts the Commission's oversight process.
Two lawsuits that could hold social media companies accountable for algorithms that promote dangerous or unlawful content are scheduled for hearings before the US Supreme Court. The outcome could fundamentally alter how US companies moderate content.