Deepfake videos are the latest cruel form of school bullying


Imagine your child comes home from school with their eyes red and swollen from crying. When they open up about what’s wrong, they tell you a video has been going around at school, mocking and humiliating them. They are in the video – yet it’s showing something that never actually happened.

What they’re talking about is a deepfake, a video created with artificial intelligence that allows its maker to show anyone doing anything they want – doctored with such lifelike realism that nobody would even question whether it’s real footage.

Deepfakes are already rampant on the internet. At the moment they are mostly known for nonconsensual pornography, with internet forums full of fabricated sex tapes apparently showing real people – people who never made one, who probably have no idea these videos exist and would be horrified if they did.

But during research for my book, Trust No One: Inside the World of Deepfakes – which covers the threats and benefits they pose – I heard stories of how the technology is being weaponised by bullies at school. In their hands, these videos take another form.

For instance, a heavyset student might see their face edited onto the body of a thin, sexualised performer in an R&B video. Or someone who struggles with their studies may find their face has been placed over Dustin Hoffman’s autistic character from Rain Man, Raymond Babbitt. The intent of both is to mock, to humiliate, to get laughs from peers at the bullied student’s expense.

This sounds like Hollywood-level CGI, I know. But any bully who knows where to look online can make a deepfake these days. Though the underlying artificial intelligence is complex, the technology is so accessible – alarmingly so – because the AI does all the work.

The user needs no understanding of computer programming or special effects. There are plenty of smartphone apps that make it dead easy to create a deepfake – and this is where many of those shared in classrooms and corridors come from.

Feed the app a photo or video of your target and tell it which video you want their face inserted into – and shortly, you’ll have your custom deepfake to spread from phone to phone.

Any kind of bullying is horrible. I know this from my experiences as a teenager in the early 90s, when I was targeted over my weight and my grades. The abuse about my size was humiliating, but more hurtful was being bullied as the dumbest kid in school.

I had a significant learning disability that made concentrating all but impossible – one that wasn’t diagnosed until my second year of high school. Retaining what I studied for more than a few hours was hopeless, and I was tormented over this. It didn’t help that teachers openly stated in class that they didn’t believe I had done any studying.

I quickly learned it was easier to go along with the taunts by lying to my bullies. When they said I was lazy and hadn’t studied, I stopped refuting their claims. Instead, I told them I simply didn’t care about studying or school. Supporting a bully’s narrative always lessens the abuse, though using such a technique to avoid increased suffering is a horrible and unfair “choice” to have to make.

As bad as my bullying was, respite always came with the final bell of the day. The bullying didn’t follow me home. Today’s kids aren’t afforded that luxury. With the internet, smartphones and social media, cyberbullying can continue 24 hours a day.

For years now, cyberbullying has taken the form of ignorant social media posts or jokes at the expense of others in a school-wide WhatsApp group. The worst might have been having a custom, hurtful meme made of you and shared among cliques. But in 2021, memes are old-school when it comes to cyberbullying. You really want to make fun of someone? You make a deepfake of them.

So how do we stop this? We can’t. Not totally, anyway. There will always be kids who are bullied and kids who do the bullying. Deepfake apps aren’t the problem themselves, I would argue – the bullies are. The apps are just the latest tool in their arsenal.

While popular “face swap” apps, like Reface and FaceMagic, aren’t responsible for the bullying, they can and should play a more significant role in helping stem its tide. At the very least, any deepfake app available for smartphones should have an automatic warning that’s displayed when the app is used: “Remember, this can be fun – but never use someone’s face without their consent.” Whether this would stop bullies, of course, is doubtful.

Ultimately, stopping this new variety of bullying comes down to parents and teachers. If you see your son or daughter joking with their friends, showing off how they deepfaked a classmate into a video, ask them whether they got that pupil’s consent. Ask them what the intent was – to mock the person? It’s up to parents not to raise a bully.

Teachers also have a responsibility. If deepfake bullying is discovered, the school should act quickly to reprimand the perpetrator. Students should also be taught that misusing the image of another person’s face is a form of identity theft, much like stealing someone’s driving licence or bank details. What’s funny to one person is devastating to another and it should not be tolerated.

We now live in a world where people can take our likeness, our face – our identity – and make a video showing us doing anything. The technology will represent a serious threat to individuals, governments and social cohesion in the years ahead. We must not let it run rampant in our schools, too.

Michael Grothaus is the author of Trust No One: Inside the World of Deepfakes, on sale from 11 November (£18.99, Hodder Studio)


