How Women Coders Make Online Platforms Safer

To combat online abuse, a research team in Australia found a solution — bring more women to the table during the development process.

Illustration of a woman sitting in front of two screens as she writes code. Illustration by Jonathan Carrington

“Women are now taking part in the solution as agents of change.”

Alka Lamba, a politician from India, tells Amnesty International, “When you are on social media, you face trolls, threats, abuses, and challenges 100% of the time. Their purpose is to silence you. It makes you want to cry. They talk about your personal life, your looks, and your family.” The same Amnesty International study goes on to describe how Twitter has turned into a ‘battlefield’ for women, who are subjected to a constant barrage of physical and sexual threats, degrading content, profanity, slurs, and insulting epithets. Online harassment, in other words, is an overt extension of the violence that exists offline. The good news is that technology can help create safeguards in online spaces.

To Bring Solutions to the Table, Bring Women to the Table

Women are now taking part in the solution as agents of change. In light of how rampant and distressing online harassment has become, Dr. Richi Nayak, a professor at the Queensland University of Technology in Australia, and her team have developed a deep learning algorithm aimed at making the virtual world, specifically Twitter, safer for women by detecting misogynistic tweets. She spoke with me about her team’s research and development process.

The Algorithm Wasn’t Built in a Day

“Designing the algorithm from scratch is where the innovation of this project lies,” Nayak shares. “We started with a blank network and taught the model standard language by training it on datasets from Wikipedia. Next, we trained it to learn somewhat abusive language through content related to user reviews of movies, products, and more. Finally, we trained the model on a large dataset of tweets. The language and semantics are very different on Twitter — there are slang terms and abbreviations — all of which the algorithm had to be taught in order to develop its linguistic capability.”
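
To make that staged training concrete, here is a minimal sketch of a three-phase curriculum in PyTorch. It is purely illustrative, not the team’s actual system: the toy classifier, the fake_corpus stand-in datasets, and all hyperparameters are assumptions, and each phase is simplified to ordinary supervised fine-tuning rather than the language modeling the researchers describe.

```python
# Minimal sketch of staged ("curriculum") training, assuming PyTorch.
# Everything here is illustrative: real phases would use Wikipedia text,
# user reviews, and labeled tweets instead of random stand-in data.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

VOCAB_SIZE = 30_000  # assumed tokenizer vocabulary size

class ToyTweetClassifier(nn.Module):
    """Tiny text classifier: embedding -> mean pooling -> linear head."""
    def __init__(self, vocab_size: int, embed_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.head = nn.Linear(embed_dim, 2)  # misogynistic vs. not

    def forward(self, token_ids):  # token_ids: (batch, seq_len)
        pooled = self.embed(token_ids).mean(dim=1)
        return self.head(pooled)

def train_phase(model, loader, epochs=1, lr=1e-3):
    """One curriculum phase: plain supervised fine-tuning on one corpus."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for tokens, labels in loader:
            opt.zero_grad()
            loss_fn(model(tokens), labels).backward()
            opt.step()

def fake_corpus(n_examples=256, seq_len=32):
    """Random stand-in for a tokenized, labeled dataset."""
    x = torch.randint(0, VOCAB_SIZE, (n_examples, seq_len))
    y = torch.randint(0, 2, (n_examples,))
    return DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)

model = ToyTweetClassifier(VOCAB_SIZE)
# Phase 1: standard language, phase 2: mildly abusive reviews,
# phase 3: tweets with their slang and abbreviations.
for corpus in (fake_corpus(), fake_corpus(), fake_corpus()):
    train_phase(model, corpus)
```

The point of the ordering is that each phase starts from the weights the previous one produced, so by the time the model reaches the noisy language of tweets, it already understands ordinary usage.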

“A nasty comment made online might not strike as threatening to a man going through the data, but a woman may be able to better pick up on the subverted tropes.”

It’s in these cracks and crevices that there is scope for subjectivity and interpretation: who gets to decide whether a tweet is misogynistic? A nasty comment made online might not strike a man going through the data as threatening, but a woman may be better able to pick up on the subverted tropes. This is where having a woman’s voice at the table becomes crucial, and it opens up the conversation to the importance of incorporating lived experiences, particularly the lived experiences of those at the receiving end of sexist verbal abuse. Even more vital is having that conversation at the forefront of the development process rather than as an afterthought.

Passing the Mic

According to HR consulting firm Mercer, the Diversity & Inclusion (D&I) technology market, which encompasses tools that prevent and address bias, harassment, and discrimination, is worth approximately $100 million. Such tools are gaining momentum as the industry confronts the realities Ellen Pao describes in her book Reset: one-off bias trainings that are fervent yet ineffectual, a tech system with exclusion built into its design, and “diversity solutions” that, up until 2016, were largely PR-oriented gimmicks.

The Future Is Bright and Abuse-Free

The algorithm is an evolving tool: much like a person engaged in lifelong learning, it will keep brushing up on its context-identifying skills. It can potentially also be adapted to identify other brackets of abusive content, whether racist, homophobic, ableist, classist, or transphobic, and nip it in the bud. There is also scope to expand it to accommodate linguistic specificities across a plethora of global and regional languages. The development of each of these extensions should, therefore, involve the voices of people from marginalized communities for a safer online experience.
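
As a rough illustration of the kind of extension described above, here is a hypothetical multi-label variant of the toy classifier sketched earlier: one independent output per abuse category, so a single post can be flagged as, say, both misogynistic and racist. The category list and dimensions are assumptions for the example, not details of Nayak’s system.

```python
# Hypothetical multi-label extension: one independent sigmoid output
# per abuse category rather than a single binary decision.
import torch
import torch.nn as nn

CATEGORIES = ["misogynistic", "racist", "homophobic", "ableist"]  # assumed

class MultiAbuseHead(nn.Module):
    """Replaces the binary head with one logit per category."""
    def __init__(self, embed_dim: int = 128):
        super().__init__()
        self.head = nn.Linear(embed_dim, len(CATEGORIES))

    def forward(self, pooled):    # pooled: (batch, embed_dim)
        return self.head(pooled)  # raw logits, one per category

# BCEWithLogitsLoss treats each category independently, so labels are
# 0/1 per category and a single post can trigger several at once.
head = MultiAbuseHead()
logits = head(torch.randn(4, 128))  # 4 pooled example embeddings
labels = torch.randint(0, 2, (4, len(CATEGORIES))).float()
loss = nn.BCEWithLogitsLoss()(logits, labels)
```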
