Monday, November 25, 2024

Even Before Deepfakes, Tech Was a Tool of Abuse and Control

Of the many “profound risks to society and humanity” that have tech experts worried about artificial intelligence (AI), the spread of fake images is one that everyday internet users will be familiar with.

Deepfakes – videos or photographs where someone’s face or body has been digitally altered so that they appear to be doing something they are not – have already been used to spread political disinformation and fake pornography.

These images are typically malicious and are used to discredit the subject. When it comes to deepfake pornography, the vast majority of victims are women. Generative AI – technology used to create text, images and video – is already making image-based sexual abuse easier to perpetrate.

A new set of laws in the UK will criminalise the sharing of deepfake pornography. But with the attention on AI and deepfakes, we cannot forget how less sophisticated technology can be used as a tool of abuse, with devastating consequences for victims.

Tech and control

When I began my research into technology in abusive relationships, deepfakes were barely on the horizon. My work focused on the role of smartphones in the abuse of women who had fled controlling relationships. I found that perpetrators of domestic abuse were using technology to extend the reach of their power and control over their partners, a modern take on abuse tactics that were used long before smartphones were in every pocket.

Mobile phones can be used directly to monitor and control, whether through GPS tracking or by bombarding a victim with texts, videos and voice calls. One participant in my research in 2019 explained how her abusive partner used his phone to access social media, sending her offensive pictures via Instagram and persistent and offensive WhatsApp messages.

When she was out with her friends, he would first text, ring and then video call her constantly to check where she was and to see who she was with. When the participant turned off her phone, her then-partner contacted her friends, bombarding them with texts and calls.

This participant felt too embarrassed to make arrangements to meet with her peer group and so stopped going out. Others in similar situations might be excluded from social plans if friends want to avoid being contacted by their friend’s abuser. Such social isolation is a frequent part of domestic abuse and an important indicator of controlling relationships.

According to the domestic violence charity Refuge, more than 72% of people who use its services report abuse involving technology.

Mobile phones are a gateway to other gadgets, via the “internet of things” – devices that are web-connected and able to exchange data. These tools can also be weaponised by abusers. For example, an abuser can use a mobile phone to change the temperature settings on a household smart thermostat, creating extremes from one hour to the next.

Confused by this, people seek explanations from their partner, only to be told that it must be a figment of their imagination. Gaslighting techniques such as this make victims question their own sanity, undermining their confidence in their own judgment.

Deepfakes are often used as a tool of abuse against women.

A modern panopticon

With the click of a button, mobile phones allow for unprecedented surveillance of others. In the pocket of a perpetrator, they can be used to keep tabs on current and former partners any time, any place and – signal permitting – anywhere. This gives perpetrators an air of omnipotence, leaving victims believing that they are being watched even when they are not.

This brings to mind the work of the 18th-century philosopher Jeremy Bentham, who introduced the concept of the “panopticon”. Bentham proposed a “perfect” prison system, where a guard tower sits in the centre, surrounded by individual cells.

Isolated from one another, prisoners would see only the tower – a constant reminder that they are permanently watched, even though they cannot see the guard within it. Bentham believed such a structure would result in the prisoners’ self-surveillance until eventually no locks or bars were needed.

My most recent research shows that mobile phones have created similar dynamics within abusive relationships. Phones take the role of the tower, and perpetrators the guards within it.

In this modern panopticon, victims can be out and about, visible to strangers, friends and family. Yet because of the presence of the phone, they feel they are still being watched and controlled by their abusive partners.

As one participant put it: “You feel there’s no freedom even when you’re out. You feel like you are locked up somewhere, you don’t have freedom, someone is controlling you.”

Survivors of abuse continue to monitor themselves even when the perpetrators are not there. They act in ways that they believe will please (or at least not anger) their abusers.

This behaviour is often viewed by others as strange, and too readily dismissed as paranoia, anxiety or more serious mental health issues. The focus shifts to the victim’s behaviour and ignores the cause – abusive or criminal behaviour by their partner.

As technology becomes more sophisticated, the tools and strategies available to abusers will continue to evolve. This will extend perpetrators’ reach and present new opportunities for surveillance, gaslighting and abuse.

Until tech companies consider the experiences of domestic abuse survivors and build safety mechanisms into the design of their products, abuse will remain hidden in plain sight.

The article ‘Even Before Deepfakes, Tech Was a Tool of Abuse and Control’ by Tirion E. Havard was originally published on The Conversation and has been republished under a Creative Commons license.

Featured image: Deepfakes have been used as a tool for abuse. Source: Brian / Adobe Stock.
