
A month into Musk’s Twitter: exploring the social media takeover

Since his Oct. 27 takeover of Twitter, Elon Musk has made radical changes to the popular social media platform. Under the rule of the multibillionaire and self-proclaimed “free speech absolutist,” many people, including users at Country Day, are worried about the platform’s new content moderation policies.

Content moderation 

In an April tweet before he took control of Twitter, Musk said, “By free speech, I simply mean that which matches the law. I am against censorship that goes far beyond the law. If people want less free speech, they will ask the government to pass laws to that effect.”

Senior Brynne Barnard-Bahn, a Twitter user, disagrees with Musk’s free speech argument.

She said although social media is a fun way to converse with others online, companies should take extra care to foster a safe and inclusive space for their users.

“I think fun should definitely be involved, but fun shouldn’t involve discrimination,” she said. “Elon Musk is using the government as a shield for making a public stance on hate speech.”

Despite heated public discussion about the company’s role in content moderation, Musk followed through with his approach, immediately laying off nearly 50% of Twitter’s employees across various departments, including those responsible for accessibility, content moderation and human rights.

Even before the layoffs, Twitter was already struggling to take down posts that users had identified as problematic, according to a Nov. 24 European Union study.

Now, Musk’s drastic cuts decrease the number of employees responsible for reviewing the content that users report, which ranges from hate speech to child pornography. Musk’s decisions are part of his larger plans to reform the company’s work culture, cut perceived waste and generate more revenue through Twitter Blue, a newly introduced monthly verification subscription for users.

Senior Karabelo Bowsky, who occasionally uses Twitter, said she could understand Musk’s staffing changes from a business perspective but is concerned with the implications of a reduced moderation team.

“He’s only seeing this from a business side and is not seeing it from the humanity side,” she said. “The influence social media has on how we interact with each other is so insanely huge.”

The U.S. Constitution’s First Amendment protects general expression of opinion or belief but includes exceptions for incitement, defamation, obscenity, child pornography, threats, fighting words and fraud.

Because hate speech and slurs do not fall under these exceptions, they are legal speech and thus permitted under Musk’s law-based standard, and their use on Twitter has risen sharply under his leadership.

Assistant to the Head of High School Grace Strumpfer disagrees with the loosening of Twitter’s content moderation. Strumpfer said more manual, case-by-case review is needed to protect against hate speech and to evaluate certain gray areas regarding the use of slurs under general law.

“I think it would be great to have policies against hate speech, but then you run into censorship problems of whether we should ban books written by people in those minorities who are reclaiming those words,” Strumpfer said. “There’s a certain kind of danger in allowing hate speech, allowing people to throw slurs around for targeted harassment.”

Hate speech

Musk revisited his vision for free speech in a Nov. 18 tweet, stating that negative or hateful tweets would be deboosted and demonetized. In a separate Nov. 23 tweet, Musk claimed that overall engagement with hate speech on Twitter had decreased by a third.

Despite his claims, reports from the New York Times and the Center for Countering Digital Hate found a significant increase in both the engagement levels and the quantity of hate speech on the platform.

Following Musk’s takeover in late October, the average daily usage of the N-word on Twitter increased to 3,876 from the 2022 average of 1,282.

In addition, the report found a 58% increase in slurs toward members of the gay community, a 33% increase in the use of misogynistic slurs and a 62% increase in hate speech directed toward transgender individuals on the site.

Although Strumpfer takes care to choose whom they follow in an effort to curate the content on their Twitter feed, they still occasionally see instances of hate speech.

“I go on Twitter for a specific reason, and that’s to look at cute cats and cool art. I don’t go on there to get my news because it’s a social media website,” Strumpfer said.

Even though Strumpfer doesn’t seek out the political side of Twitter, they have noticed that the bigger art accounts, especially those belonging to people of color, have had to deal with hate comments and troll accounts.

In recent months, viral antisemitic rants of rapper Ye, formerly known as Kanye West, caused additional controversy and online discussion. Ye has repeatedly used his Twitter account and media presence to highlight antisemitic conspiracy theories and sympathize with alt-right and Nazi figures. 

Bowsky, who is part of the Black community and a follower of Judaism, is concerned about the general rise of hate speech across all of the social media platforms she uses, including Twitter, Instagram and TikTok.

“As a Black woman in this day and age, I get nervous engaging in such conversations, just out of fear for myself. It might not be the best way to go about it, but it’s just what makes me feel safe,” Bowsky said. “I think with this hate speech, it’ll make me less inclined to want to engage in such discussions over social media.”

Although Bowsky said celebrities and notable figures with influence have a moral obligation to positively shape their online spheres, she said there is more to be done to get to the root of conflict instead of putting the weight of social change on celebrities.

“Putting all the blame on them is also unfair, no matter how much I do or don’t agree with them,” Bowsky said. “I think the biggest issue is cancel culture, because the issue isn’t addressed when a post is deleted.”

Twitter suspended Ye’s account on Dec. 2 after he posted an image of a swastika merged with the Jewish Star of David. Although Musk said Ye violated Twitter policy by inciting violence, many users raised concerns about the consistency of rule enforcement.

Barnard-Bahn agrees with the decision to deplatform Ye. However, she finds the vague qualifications for a ban problematic.

“It’s hypocritical for Musk to stop things like that, but not stop racism, homophobia or transphobia,” Barnard-Bahn said. “You can’t pick and choose which minorities you are cool with protecting.”

Barnard-Bahn said that, in addition to following the law, social media companies like Twitter should actively curate posts to minimize harm.

“Freedom Friday”

In addition to changes in content moderation, Musk’s controversial decisions to bring back formerly banned Twitter accounts caused disagreement in the community.

Musk dubbed Nov. 19 “Freedom Friday” and reinstated former President Donald J. Trump’s Twitter account, which had been taken down over concerns of inciting violence in the wake of the Jan. 6 insurrection. Musk also reinstated the account of kickboxer-turned-personality Andrew Tate, who had lost his Twitter account in 2017 as a result of his remarks about victims of rape.

Musk’s decision to reinstate Trump and Tate angered many Twitter users, especially because Musk used a public poll on his personal Twitter account to decide whether Trump’s account should be reinstated.

“I think, for somebody who has a business, that was really unprofessional of him,” Bowsky said, pointing to the flaws of Musk’s one-step process. “People who follow him will probably believe him too; that creates a skewed perspective.”

Strumpfer said Twitter should restore the number of staff on the company’s content moderation team, state the platform’s rules with more clarity and improve consistency in enforcing rules going forward. 

“I think the rules should be clear and written,” they said. “I think a great start to this whole mess is not firing the entire content moderation team and paying your workers their money.”

To read more about the New York Times and Center for Countering Digital Hate reports referenced in this story, visit:

https://drive.google.com/drive/folders/196e5RahRp-WUTK-unG7cxTuuUXoVvcR7?usp=sharing

By Garman Xu

This story was originally published in the Dec. 13 issue of The Octagon.
