Under Elon Musk’s ownership, Twitter is failing to protect users from trolling, harassment, and child sexual exploitation, according to insiders.

    In this article, we’ll delve into the recent allegations made against social media platform Twitter.

    Key Takeaways:

    • Twitter is facing allegations that it is no longer able to protect its users from trolling, state-coordinated disinformation, and child sexual exploitation.
    • Changes made under the ownership of Elon Musk have reportedly deprioritized protecting users from harm in favor of championing freedom of expression and profitability.
    • Former employees have revealed that features designed to protect users from hate, such as the “harmful reply nudge,” have been abandoned, and the team responsible for creating safety measures has been sacked.
    • Research by the University of Sheffield found a 69% increase in new accounts following known misogynistic and abusive profiles, adding to concerns about targeted harassment campaigns and about child sexual exploitation on the platform.
    • Although Twitter claims that it values “defending and respecting the user’s voice” as one of its core principles, neither the platform nor Musk has responded to the concerns raised in the BBC Panorama investigation.

    Allegations of Twitter’s Inability to Protect Users

    Twitter, the popular social media platform, has been under fire for reportedly being unable to protect its users from trolling, state-coordinated disinformation, and child sexual exploitation. 

    Insiders have claimed that this decline followed the layoffs and policy changes made under Elon Musk’s ownership. 

    These allegations are supported by academic research and testimonies from Twitter users, highlighting the rise of hate on the platform.

    Abandonment of Features to Protect Users from Hate

    According to former employees, features introduced to protect users from hate, such as the “harmful reply nudge,” have been abandoned. 

    A former head of content design revealed that her entire team was sacked, and she herself resigned. 

    Twitter’s internal research showed that the safety measures created by this team, such as the harmful reply nudge, reduced trolling by 60%. 

    An engineer still working at Twitter said that no one is now responsible for this work, describing the platform as one that appears to be in good shape on the surface but has serious problems underneath.

    Silence from Twitter on Allegations

    Twitter has not yet responded to the BBC’s request for comment. There are concerns about an increase in child sexual exploitation, about targeted harassment campaigns that aim to suppress freedom of expression, and about foreign influence operations. 

    Additionally, there has been a rise in misogynistic online hate aimed at a reporter, with a 69% increase in new accounts following abusive profiles. 

    Rape survivors have also been targeted by increasingly active accounts that may have been newly created or reinstated since the takeover.

    Musk’s Prioritization of Profit Over User Safety

    Insiders claimed that, since his takeover, Elon Musk has prioritized making the social media company profitable and championing freedom of expression over protecting users from harm. 

    Elon Musk shared a set of internal documents known as the “Twitter Files” to justify his belief that the previous leadership of Twitter did not apply its moderation and suspension policies fairly.

    However, those who have been on the inside feel that Musk has used this to deprioritize protecting users from harm altogether.

    Disregard for Twitter’s Core Values

    Although Twitter’s policies state that “defending and respecting the user’s voice” is a core value, that value does not appear to be upheld: neither the platform nor Musk has addressed the concerns raised in the BBC Panorama investigation.

    Layoffs and Rise of Misogynistic and Abusive User Profiles

    The BBC Panorama investigation has uncovered that approximately half of Twitter’s 7,500 employees have either been dismissed or left their positions since Musk purchased the platform in October 2022. 

    Research by the University of Sheffield also found a 69% surge in new accounts that were following known misogynistic and abusive user profiles.

    Undermined Ability to Prevent Child Sexual Exploitation

    One insider working to prevent child sexual exploitation on Twitter revealed that job cuts had undermined the team’s ability to do so, despite Musk labeling it a priority. 

    The insider’s team used to detect and report child sexual exploitation content daily. 

    However, due to the job cuts, the team fears it may not be able to report every instance to the police, as its staff has been reduced from 20 to just six or seven.

    Twitter’s Attempts to Make the Platform Safer

    Twitter claimed to have deleted 400,000 accounts within a month to enhance safety on the platform. 

    However, Lisa Jennings Young, a former head of content design, disclosed that her entire team, responsible for developing safety measures to shield users from hate, was dismissed. 

    She also revealed that the team’s most effective feature was the “harmful reply nudge,” which used AI to detect harmful language and prompt users to reconsider before sending a tweet, deterring many people from posting abusive replies.

    Concerns for User Safety on Social Media Platforms

    With the rise of hateful content and targeted harassment campaigns on social media platforms, concerns for user safety have become increasingly pressing. 

    It is crucial for these platforms to prioritize the safety of their users and take action to prevent further harm.

    Twitter’s inability to protect users from hate, trolling, state-coordinated disinformation, and child sexual exploitation highlights the need for social media platforms to implement effective safety measures. 

    Musk’s prioritization of profit over user safety and disregard for Twitter’s core values have undermined the platform’s ability to protect its users.

    The rise of misogynistic and abusive user profiles on Twitter is also alarming, with the University of Sheffield’s research showing a 69% increase in new accounts following these profiles. 

    Job cuts have undermined the ability of Twitter’s teams to prevent child sexual exploitation and detect harmful content, further adding to the safety concerns on the platform.

    While Twitter has stated that it has removed accounts to make the platform safer, the sacking of the team responsible for creating safety measures raises questions about the effectiveness of these actions. 

    The harmful reply nudge, one of the most successful features designed to protect users from hate, has been abandoned, further highlighting the need for better safety measures.

    Conclusion

    It is crucial for social media platforms to prioritize the safety of their users and implement effective safety measures to prevent harm. The rise of hate, targeted harassment campaigns, and child sexual exploitation on Twitter underscores the need for immediate action to protect users and maintain the integrity of the platform.