Technology is often seen as part of the solution to gendered violence – but as Beauty Dhlamini explains, it also presents a new set of challenges and chilling effects
Content note: This article contains descriptions of methods of committing violence against women
Throughout my lifetime, the news has been dominated by stories of people, especially women, who have lost their lives as a result of male domestic violence. The reactions to domestic violence and femicide are predictable: every year a new story, a new name to say, a new name to remember, as we watch society explode in outrage at the callous death of another one of us. And every year, it is only a matter of time before society forgets and lets the issue slip back into obscurity without any meaningful change taking place.
Over the last year we’ve witnessed encouragement from the government and the public and private sectors to ‘innovate’ our way out of this crisis through tech. Suggestions have included plans for increased surveillance through CCTV, dedicated phone lines, smartwatches, software and a government-backed app, all of which are focused on allowing friends, family and, ironically, partners to track users’ whereabouts and recognise when they’re in danger.
Using technology as the main solution to domestic violence is short-sighted and completely disregards what sits at the core of domestic violence and femicides. It is also risky: the use of technology to perpetrate domestic abuse, referred to as tech abuse, has become increasingly common, yet research on it is still sparse.
The news of Sarah Everard’s murder in March 2021 gripped us. For the first time, there was a shift. People were not just talking about the issue at hand, but also about the asymmetry in people’s reactions to domestic violence cases, especially where minoritised women are concerned. Coalition building and organising amongst womanist, feminist and LGBTQ+ movements pressured the government and police to stop being complicit and to offer tangible solutions to protect women and girls.
However, despite the government committing to launch a multimillion-pound communications campaign “with a focus on targeting perpetrators and harmful misogynistic attitudes” as part of its Tackling Violence Against Women and Girls strategy, released in July last year, little to no progress has been made in making this a reality.
It’s no coincidence that, since then, technology has become the focus of the search for solutions, as if tech naturally has all the answers, while ignoring the fact that technology abuse itself enables gendered violence and, consequently, femicides. Tech abuse involves abusive messages or calls, hacking of phones or email accounts, cyberstalking, posting malicious messages on social media, controlling device passwords and usage, or controlling online banking, among other behaviours.
Because technology is part of every aspect of our lives, tech abuse can take many different forms. Common devices such as tablets, along with the growing use of internet-connected home devices such as smart speakers, may provide perpetrators with a wider and more sophisticated range of tools to harm victims.
Tactics typically employed by perpetrators include omnipresence, isolation, punishment and humiliation. One extreme example of this is perpetrators using physical tracking devices like Apple AirTags, or installing malicious spyware, to track someone’s movements, to intercept them, harm them – even kill them. More insidious tech abuse may involve mimicking someone’s identity, using their accounts and sending messages impersonating them in order to sabotage their work and friendships and to isolate them.
The number of domestic abuse cases has increased dramatically since the UK’s Covid lockdown – and tech has played a role. According to research by domestic violence charity Refuge, complex tech abuse cases including high-level stalking, spyware, tracking devices and images being circulated on numerous platforms rose by 97% in the first year of lockdown (from April 2020 to May 2021), and they’re still growing. Furthermore, Refuge confirmed that more than 70% of those it provides support to have reported tech-related abuse within a relationship.
Unfortunately, many technological devices are grounded in a premise of transparency: everyone in the home is happy to share their information with one another. However, this is idealistic and fails to recognise the bleak reality for some families. The Cost of Covid-19 report by Surviving Economic Abuse found that online living rapidly increased people’s vulnerabilities. The shift from cash to card payments made it possible for abusers to monitor, track and locate someone based on their spending. Banks shutting down during the lockdown facilitated financial abuse – another form of domestic violence – as it was easier for abusers to take out loans and create debt in a partner’s name.
A further problem with putting technological initiatives at the forefront of solving domestic violence, and consequently femicides, is the monopoly the tech world holds in other areas of our lives. The industry has promoted the idea that, because of the efficiency technology has allowed us to experience, it can solve all our problems with neutrality, free of discrimination or bias towards anyone.
This mindset creates even more issues for society. Most of these apps are geared towards, and marketed at, women, so these technological initiatives reinforce the lopsided view that the onus of safety falls on women rather than men. There is no need for, or expectation of, men downloading these apps at all.
Additionally, the adoption of these data-driven apps criminalises poorer areas, even without the apps being used. Areas with poorer infrastructure, such as a lack of streetlights, could be earmarked as unsafe for women, justifying over-policing and surveillance, which could lead to further stigmatisation of marginalised communities.
The result of these ‘safety’ apps is a spatial politics of exclusion, enabled by imaginaries of who belongs where. The question then stops being how many people are committing crime and becomes where police are directing their law-enforcement resources. The policing of drug-related crime, for example, follows a pattern set by how judicial powers approach public safety: Westminster (or any other gentrified area) is not labelled a drug-use or crime hotspot, despite politicians admitting to continuous drug use there, whereas minoritised and low-socioeconomic areas face far greater targeting. This shows there will always be an unbalanced approach to dealing with safety.
As the controversies surrounding safety apps have shown in the past, enabling anonymous perspectives on “safety” only serves to channel people’s prejudices through technology, rather than helping us catch perpetrators or maintain a genuine sense of safety.
That’s not to say that tech does not have a place in protecting people, especially women, from domestic violence. Technology has been a lifeline for many of those who have experienced abuse, helping them record evidence and, most importantly, access support.
For example, Refuge created a website dedicated to tech abuse that helps survivors learn about and recognise it. The site features an animation illustrating common forms and experiences of tech abuse, 17 step-by-step support guides covering a range of device and account settings, and an interactive chatbot with video guides for securing accounts and devices, available in other languages including Spanish and Urdu.
Industries need to do better in the design and monitoring of their technology. For example, sharing even a small, innocuous piece of data such as battery level should be at the discretion of users. Banks should be working with the tech industry to develop intelligent monitoring of reference descriptions and free-text fields, to block the abusive messages that abusers often attach to bank transfers.
Whilst aspects of tech abuse have been addressed in several UK Government policies and legal frameworks, including domestic abuse law, online harms policy and the draft Online Safety Bill, these need to be explicit about the role technology plays in domestic abuse and violence, especially against women and girls.
However, the safety of women should not solely be their responsibility; there needs to be a shift. The onus should fall on the government, on the technology industry to design more thoughtfully, and on society at large, especially in holding abusers (mainly men) accountable for their violent behaviour. Technology does not create abusers, but it has given them more tools to facilitate their unhinged behaviour, abuse and violence.
Beauty Dhlamini is a global health scholar with a focus on health inequalities and co-hosts the podcast, Mind the Health Gap.