Twitter has a rampant toxicity problem. It's hard to scroll through the platform without coming across hate speech, misogyny, homophobia, racism, or just people being awful, and the only practical way to police a user base that large is with AI.
Now, a group of researchers might have just the tool for the job: a new algorithm that runs in the background and can identify Twitter accounts exhibiting bullying or troll-like behaviour with 90 percent accuracy.
The machine learning program comes courtesy of researchers at Binghamton University in the United States. It uses a combination of natural language processing and sentiment analysis to classify tweets as containing cyberbullying or cyberaggression.
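To give a rough sense of what that combination can look like in practice, here is a minimal sketch in Python. The library choices (scikit-learn and NLTK's VADER sentiment analyzer) and the toy data are assumptions for illustration, not the researchers' actual pipeline.

```python
# Sketch only: combining word-level text features with a sentiment score
# to flag aggressive tweets. Libraries and data are illustrative assumptions,
# not the researchers' actual toolkit.
from nltk.sentiment.vader import SentimentIntensityAnalyzer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from scipy.sparse import hstack, csr_matrix

# Hypothetical labelled training data: 1 = cyberaggression, 0 = normal.
tweets = ["you are pathetic, just quit", "great game last night!"]
labels = [1, 0]

sia = SentimentIntensityAnalyzer()            # needs nltk.download("vader_lexicon")
vectorizer = TfidfVectorizer(ngram_range=(1, 2))

text_features = vectorizer.fit_transform(tweets)
sentiment = csr_matrix([[sia.polarity_scores(t)["compound"]] for t in tweets])

# Stack the word-level features and the sentiment score into one matrix.
X = hstack([text_features, sentiment])
clf = LogisticRegression().fit(X, labels)

# Score a new tweet the same way.
new = ["nobody wants you here"]
X_new = hstack([vectorizer.transform(new),
                csr_matrix([[sia.polarity_scores(new[0])["compound"]]])])
print(clf.predict_proba(X_new))  # probability of the aggressive class
```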
Jeremy Blackburn, one of the computer scientists on the research team, says the new algorithm uses information from people's Twitter profiles, as well as connections to the accounts they might be bullying.
"We built crawlers - programs that collect data from Twitter via a variety of mechanisms," he said. "We gathered tweets of Twitter users, their profiles, as well as (social) network-related things, like who they follow and who follows them." Apparently, that extra bit of context is crucial for the bot to differentiate between regular tweets and those that are aggressive.
"In a nutshell, the algorithms 'learn' how to tell the difference between bullies and typical users by weighing certain features as they are shown more examples."
Of course, while this tool could be a big help to a platform like Twitter when it comes to policing online bullying, even the researchers behind it agree it's not enough by itself. It is reactive by nature: it can help identify and remove users who bully others, but it does nothing to prevent the abuse in the first place.
"The unfortunate truth is that even if bullying accounts are deleted, even if all their previous attacks are deleted, the victims still saw and were potentially affected by them," Blackburn says.