This Is The Major Problem With Twitter’s New 280 Character Limit

On Tuesday, Twitter announced its plans to gradually roll out a new character limit that will allow people to post longer messages. When it's implemented site-wide, users will have 280 characters at their disposal, which is double the current limit. Jack Dorsey, Twitter's CEO and co-founder, said in a tweet that 140 characters "was an arbitrary choice based on the 160 character SMS limit." By adding more characters, he said, Twitter is solving a problem people run into while trying to tweet.

Of course, change is never easy, and people were quick to make fun of Twitter for the seemingly random adjustment. Several mocked Dorsey's announcement tweet, pointing out that it could easily have been condensed into 140 characters. But some users raised a more serious concern, all asking the same question: Why is Twitter changing its character limit instead of protecting the users who have long asked for a stricter anti-harassment policy?

Over its 11 years, the social media platform has faced wave after wave of criticism for not doing enough to stop harassment. Twitter was founded on the premise that it would allow unrestricted free speech, but critics say this approach is why the website seems to be filled with trolls, harassers, and white supremacists.

Earlier this year, comedian Shahak Shapira traveled to one of Twitter's offices with stencils and spray-painted some of the homophobic and anti-Semitic tweets he'd been sent onto the sidewalk outside the building. He'd received no response when he reported the offensive messages to Twitter, he said, so he resorted to something drastic.

For its part, Twitter released new harassment rules earlier this year. The company said it is working harder to stop abusive accounts from being created and to remove hateful messages from search results. Dorsey has said abusive language has no place on the platform. And the site's official rules include explicit warnings against hate speech: "You may not promote violence against or directly attack or threaten other people on the basis of race, ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, age, disability, or disease," the policy reads. "We also do not allow accounts whose primary purpose is inciting harm towards others on the basis of these categories."

But many say Twitter still isn't doing enough. Last year, Twitter suspended alt-right provocateur Milo Yiannopoulos, but only after he led a racist and sexist harassment campaign against actress and comedian Leslie Jones following the release of Ghostbusters. The abuse became so extreme that Jones contemplated quitting Twitter entirely.

Twitter has even offered its coveted blue check mark to self-proclaimed white supremacists. (Twitter's verification policy states that the badge "lets people know that an account of public interest is authentic," and that it "does not imply an endorsement by Twitter.") Richard Spencer, who self-identifies as a white nationalist and wants to see the creation of a "white ethno-state," is verified on Twitter. One of his recent tweets about kneeling for the national anthem says, "If only Blacks read Wikipedia, they wouldn't do this, guys." He's not the only controversial figure who is still on the platform. Tim Gionet, who goes by "Baked Alaska," is an alt-righter who attended the Charlottesville neo-Nazi rally and was temporarily suspended by Twitter for sharing a meme of a Jewish woman in a gas chamber. He is also verified.

Dorsey addressed concerns about the company's handling of white supremacists and harassment late Tuesday night. Sports writer and political activist Dave Hogg tweeted, "It would be even better if you paid attention to and acted on the issues relating to abuse and harrassment [sic]." Dorsey responded, "We are."

Only time will tell whether Dorsey will implement anything new on the harassment front, but users are making their voices heard: They think Twitter clearly has some work to do.