How to Improve X

The social media platform X, formerly known as Twitter, is at a crossroads. It's the platform many of its users love to hate - though that may be true of other platforms, too, given that the House passed a bill this week aimed at banning TikTok.

Ever since Elon Musk purchased Twitter about 15 months ago, the platform has undergone tremendous internal and external shocks. While X remains a leading platform, a May 2023 Pew Research Center study reported that a quarter of users plan not to use the site this year and that 60% had taken a break from it for several weeks or more. Wired reported that daily active users fell from 140 million to 121 million.

Considering that engagement is essential to any social media business model, that's bad news for X. Worse, the Supreme Court is hearing five cases this term involving the First Amendment and internet platforms, including arguments last month over a pair of Florida and Texas laws. These cases may limit social media platforms' ability to manage their sites, and they may be, in part, the result of a lack of transparency and consistency in moderation policies like X's.

Here are five important steps the company should take to improve users' experience on X.

Improve moderation to combat the toxic culture that's the worst part of the X experience.

The platform's toxic culture has grown more pronounced since Musk took control, but X doesn't seem to care, despite concerns from advertisers, who have cut spending by 55%, according to Reuters. It's one thing to claim free speech, but when users are being bullied, harassed, and threatened, it's incumbent upon the platform to implement measures to protect them.

One way could be to introduce features - such as requiring posters to use their real names, as LinkedIn and Facebook do - to facilitate respectful online interactions. While X allows users to mute others, it's an imperfect solution; I sometimes see posts from people I've already muted because someone else has retweeted them. It feels like the X version of whack-a-mole.

Give people more control over what they see.

In last month's Supreme Court hearing, lawyers defending the Florida and Texas laws argued that platforms should not be able to regulate content and that users should be able to decide for themselves. The problem is that we have very little control over what's served up on X - or on Facebook, for that matter.

X should let us customize the topics we want to follow. Instead, X offers only a limited option through an "Interests" menu buried in its Privacy and Safety settings, which most users don't know about. From there, a user can uncheck a list of pre-selected choices that, for me, included topics I don't like, follow, or search for.

Meanwhile, X, along with other platforms, collects massive amounts of user data. Despite a new privacy policy, which went into effect last year and claims to "help keep X more secure and respectful for everyone, and more relevant to you," X continues to share data with third parties and has made it complicated to shield your data or limit who gets access to it. All social media platforms should enhance user privacy features.

Give advertisers more control over where their ads are placed.

After reports that their ads had appeared, without their knowledge, alongside accounts promoting bigotry and hatred, dozens of organizations halted their advertising on X until brand safety could be improved. The issue may have cost X more than $75 million in revenue, according to the New York Times.

Put a muzzle on Elon.

Instead of creating value for the platform, Musk's own posts may be reducing it because they are erratic, confusing, and frequently wrong. Community Notes are community-based fact checks - backed by objective sources and unambiguous explanations - that are appended to posts to correct wrong information. According to the Community Notes Leaderboard, an unaffiliated site that ranks people by the number of notes appended to their posts, Musk currently ranks #30 with 71 notes, 11 of them accumulated since January.

Implement robust content moderation algorithms to curb misinformation and hate speech.

Experts already expect record levels of disinformation around the 2024 elections, so X, along with other social media platforms, should ramp up its content moderation. Instead, Musk has decimated X's content moderation teams.

The company can use technology to help address the problem. X should develop a more transparent recommendation algorithm to reduce the impact of echo chambers, build tools to verify information and promote credible sources, and tackle bots and other automated amplifiers of problematic content.

In fact, the European Union is so concerned about disinformation on X, along with a lack of transparency about its advertising, that it opened an investigation into the platform's practices in December. But don't expect Congress to provide a solution. Despite holding hearings every year, Congress hasn't figured out social media, and even its recent hearings don't appear to have produced positive change.

The good news for Musk is that people still use X because alternatives such as Threads, Mastodon, Bluesky, Tribel, and Truth Social remain unproven or problematic. But if X continues on its current course, users and advertisers may finally give up on it.

Norman Birnbach

Norman Birnbach is the president of Birnbach Communications, www.birnbachcom.com, a 23-year-old Boston-based PR and social media agency that helps clients navigate trends and raise awareness through earned media and thought leadership. A regular contributor to CommPro, he writes the blog PR BackTalk, which offers insights and attitudes about PR, journalism, and traditional and social media.
