Algorithms may fail at times, but “squads” of friends can help people fight online harassment and cyberbullying better, says a study.
While current anti-harassment tools such as blocking users or filtering trigger words partially help, much online mistreatment is so subtle that an algorithm might not pick up on the cues.
To reach this conclusion, a team from the Massachusetts Institute of Technology (MIT) used “Squadbox”, a new crowdsourcing tool that enables people who have been the targets of harassment to coordinate “squads” of friends to filter messages and support them during attacks.
According to senior author and MIT Professor David Karger, “Squadbox” aims to make the so-called “friend-sourcing” more efficient and less work for its moderators.
“If you just give moderators the keys to your inbox, how does the moderator know when there’s an email to moderate, and which email has already been handled by other moderators?” said Karger from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL).
“Squadbox” allows users to customise how incoming email is handled, divvying up the work to make sure there’s no duplication of effort.
According to researchers, harassment has become ubiquitous on social media and in the online world.
Twitter has been under fire for how it handles harassment, YouTube’s trending algorithm has occasionally promoted offensive videos and reporting abuse on Instagram is still quite difficult.
The team interviewed scientists, activists and YouTube personalities, and found that many people who are harassed over email rely on friends and family to shield them from abusive messages.
With Squadbox, the “owner” of the squad can set up filters to automatically forward incoming content to its moderation pipeline.
Once an email arrives, a moderator decides whether it is harassment or can be forwarded back to the person’s inbox.
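The workflow described above can be sketched in a few lines of code. This is a minimal illustration of a friend-sourced moderation queue, not Squadbox’s actual implementation; all class and method names here are hypothetical.

```python
# Sketch of a friend-sourced moderation pipeline: incoming mail is held in a
# queue, moderators claim messages one at a time (so no two moderators
# duplicate effort), and only approved mail reaches the owner's inbox.
# All names are illustrative, not Squadbox's real API.
from dataclasses import dataclass
from typing import Optional, List


@dataclass
class Message:
    sender: str
    body: str
    status: str = "pending"          # pending -> claimed -> approved/rejected
    claimed_by: Optional[str] = None


class ModerationQueue:
    def __init__(self, owner: str, moderators: List[str]):
        self.owner = owner
        self.moderators = set(moderators)
        self.queue: List[Message] = []
        self.inbox: List[Message] = []   # messages forwarded back to the owner

    def receive(self, msg: Message) -> None:
        """A filter forwards incoming content into the moderation pipeline."""
        self.queue.append(msg)

    def claim(self, moderator: str) -> Optional[Message]:
        """Hand one unclaimed message to a moderator, avoiding duplication."""
        for msg in self.queue:
            if msg.status == "pending":
                msg.status = "claimed"
                msg.claimed_by = moderator
                return msg
        return None

    def decide(self, msg: Message, is_harassment: bool) -> None:
        """Moderator verdict: drop harassment, forward the rest to the inbox."""
        if is_harassment:
            msg.status = "rejected"
        else:
            msg.status = "approved"
            self.inbox.append(msg)
```

For example, with two moderators the second claim automatically picks up the next unhandled message, so no email is reviewed twice:

```python
squad = ModerationQueue("alice", ["bob", "carol"])
squad.receive(Message("troll@example.com", "abusive text"))
squad.receive(Message("friend@example.com", "lunch tomorrow?"))
m1 = squad.claim("bob")
m2 = squad.claim("carol")          # gets the *second* message, not the first
squad.decide(m1, is_harassment=True)
squad.decide(m2, is_harassment=False)
# only the benign email ends up in alice's inbox
```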
While the team plans to extend the capabilities of Squadbox to work with social media platforms, they say that email is a particularly useful system for studying harassment.
The team will present the work at ACM’s CHI Conference on Human Factors in Computing Systems in Montreal, Canada, later this month.