- Sextortion – in which predators blackmail teen victims online – is growing during COVID, experts say.
- Snapchat is particularly vulnerable, but has not prioritized the issue enough, former employees and child-safety advocates say.
- Snapchat says upper management has always been committed to safety, and continues to add more safety features.
“After I’m done using you I’ll give back your Snap,” Joseph Isaiah Woodson messaged a teenager he was extorting, according to records from his 2020 trial.
Woodson then demanded she commit a sex act on camera, or he would post explicit photos of her from her Snapchat account, which he had taken over.
“I have family members on my Snapchat,” she pleaded in a Skype response. “Please don’t do it.”
But the records show Woodson had little mercy for his victims, demanding they perform degrading sex acts on camera and hurt themselves, or he would post nude photos from their hijacked accounts. He commanded one victim to write "owned" on her breasts with a pen, pressing so hard that they bled, records show. Another teen asked in a message, "What have I done to deserve this?" Woodson replied, "Slap tits now."
"Sextortion," in which predators blackmail victims into sending them sexual images and videos of themselves, is spiking, experts say. The Canadian Centre for Child Protection, a non-profit, says sextortion reports have jumped 88% since April. Last week Congressional Democrats re-introduced legislation to address online sexual exploitation of minors, citing an increase during COVID-19.
The results can be tragic. “Sextortion frequently leads to self-harm and suicide,” researchers have found. “It’s difficult for people to understand the emotional and psychological trauma victims go through,” US Attorney Erica MacDonald of Minnesota told Insider. Minc Law, an Ohio firm specializing in representing victims of online defamation and harassment, has seen a deluge of sextortion cases. “We’ve had to start an on-call service with an attorney until 11 pm,” attorney Dan Powell told Insider. “I deal with people every month who are on the ledge, suicidal.”
Snapchat, operated by parent company Snap Inc., is often at the center of high-profile cases such as Woodson's; he was sentenced to 50 years in prison and is appealing. Former Snap employees, child-safety advocates, sextortion victims, and a US senator told Insider that Snapchat is not doing enough to help young sextortion victims, who often don't know where to turn. Snapchat fails to make reporting sextortion simple, they say, and last year it featured content from a publishing partner giving teens guidance on how to send nude photos of themselves during COVID-19. Snapchat disagrees, and points to its in-app reporting features and the reporting process on its support page.
Sextortion plays out across many internet platforms, including Instagram and various messaging apps. But Snapchat's features and abundance of teens make it a magnet for predators, law enforcement agencies including the FBI say. And according to a former Snapchat employee who left within the past few years, Snapchat just didn't take the issue seriously.
“I left because of that. It was an ethical reason why I left,” the former employee said. Snapchat “started out as a sexting app, and they want to keep that. That kind of stuff keeps all these young kids on the platform.”
Other former Snapchat employees who worked on trust and safety, as well as some child-safety advocates, agreed. The employees asked to remain anonymous because they feared Snapchat would sue them over non-disclosure agreements. Insider knows their identities.
The employees, all of whom worked in trust and safety and related areas at other tech companies, say Snap has placed growing the user base, and profits, ahead of protecting young users from predators. Last week Snap reported 265 million daily users, a 22% increase over last year, and fourth-quarter revenue of $911.3 million, up 62% from a year earlier.
Snap disputes this characterization and says “it’s very clear that we place the safety and wellbeing of our community above making profits, as evidenced by the many decisions we have made over the years which are in stark contrast to the way many other social media platforms operate.” Snap also says the safety and privacy of its users is the company’s “top priority,” and pointed Insider to various features on its app to report safety issues and to educate parents about how to keep teenagers safe.
“While we have made progress in combating these types of online threats, we are constantly evaluating how this activity is evolving and where we can continue to improve, and we partner closely with safety experts on both,” the company says.
Snap says it’s made progress in combating online threats
The FBI, which even before the pandemic said it was seeing “an alarming increase” in sextortion and related crimes, told Insider that Snapchat is “particularly vulnerable” to sexploitation because teens often share and save their own private, explicit photos on their accounts. Snapchat users often believe any explicit photos they share will auto-delete and that their private photos are safe, an FBI spokesman said.
“Snapchat promoted the belief that messages are ephemeral,” said Justin Patchin, a University of Wisconsin professor who has researched teen sexploitation. “They have a moral obligation to address” the mistaken belief that private explicit media can’t be stolen, Patchin said. Snap responds that it takes its responsibility to help its users protect their privacy seriously and does so through its Privacy Center.
While photos shared in Snapchat messages do disappear, that is not the case with a user’s own photos, which they can save in their Snapchat account. Extortion hackers often use “social engineering” techniques to trick minors into giving them their passwords, experts and court records show. Once in the account they threaten to post the private photos and blackmail teens into producing more compromising media of themselves.
Nearly 21% of global Snapchat users are minors, Statista shows, versus 7% of Instagram users, 9% of Twitter users, and nearly 6% of Facebook users.
Two sextortion victims, one a teen and one in her early 20s, as well as the attorney for a third teen victim, described to Insider what they felt were complicated and unsympathetic processes when they tried to report sextortion to Snapchat. The two victims showed Insider back-and-forth emails with the company in which they struggled to get help as their blackmailers made threats.
"I was just scared and didn't know what to do," said one of the victims, a Southern California teen whose Snapchat account had been taken over by sextortion hackers twice in the past six months.
Snapchat's safety team "wanted me to put in a recovery code I didn't know existed," she said. In both cases, she was so frustrated trying to get help from Snapchat that she felt she had no choice but to try to resolve the situation on her own, in one instance paying the hacker $50 to regain access to her account.
Snap responded that it must require users “to verify their identity so we can be sure we are not giving any additional information or access to an unauthorized person. We know this can be a frustrating first step, but it is a crucial one to be able to protect their account from any further abuse.”
Some experts point to Instagram as a company addressing sextortion well. The Facebook-owned photo-sharing app shares traits with Snapchat that make it attractive to sextortion predators, namely a broad intermixing of young and adult users and built-in tools for creating and sending private images. But Instagram benefits from Facebook's state-of-the-art reporting tools and safety resources, the experts say, such as easy in-app reporting, large security teams, and close connections with authorities. Like Instagram, Snapchat works with the National Center for Missing & Exploited Children, and says it applies many of these same processes.
“It should be very easy to push a button and report it”
According to former Snap employees, child-safety advocates, and victims, there is one glaring problem with the company's current approach: it is difficult for users to report sextortion as it is occurring.
“It should be very easy to push a button and report it,” said Patchin, the professor who has researched teen sexploitation. Stephen Sauer, the analyst of online sex crimes at the Canadian Centre for Child Protection, echoes the sentiment, saying Snap should be doing more to make reporting the crime easy within the app. Snap said that it does offer easy, in-app reporting for a variety of abuses, including sextortion. The company also said it has a robust trust and safety team focused on these issues, and that it collaborates frequently with authorities.
"Snap receives thousands of reports a day about sextortion attacks," according to one former senior Snapchat employee, who said it was difficult to communicate safety issues to the highest levels of the company. The most recent data from Snapchat shows the company removed 47,136 accounts for child sexual exploitation and abuse offenses from January to June of last year, a 26% increase over the previous six-month period.
The employee said his team “fought tooth and nail to get basic safety functionality included.” Other former employees said it was difficult to get safety issues in front of a small core of top executives more focused on design of the app than safety features they considered clunky or off-putting to users. Snap CEO Evan Spiegel earned a reputation in past years for “sometimes refusing to listen to the feedback of others,” as The Verge reported in 2018, though in recent years he’s worked on his management style and brought on a new team of executives to support him.
“It was incredibly difficult to get Evan’s buy-in,” the former senior employee said. Other employees said they had more access to and support from upper management at other companies.
The former employees said their teams advocated for basic safety measures, such as displaying an unchangeable username so victims could easily identify harassers. Verification of users, robust reporting of sexual predators to authorities, and sharing trust and safety information with other companies were concepts less familiar to Snap's upper management than to leadership at other companies where they had worked, they said.
“There is definitely more that could be done, but upper management was not thrilled about that because it would mean terminating people from the platform,” said one of the former Snap employees. “Sometimes they say it’s a priority, but once the hype and these articles went away, they would roll it back.”
Snap said that its upper management is highly invested in trust and safety issues, and practices “safety by design” that builds in features to protect users.
Snap recently unveiled a new initiative to educate young users about internet privacy and security. A special channel called "Safety Snapshot" within Snapchat's Discover section of curated content will host episodes with easy-to-digest lessons and tips on various online safety issues, produced in partnership with outside experts.
Some people Insider spoke to defended Snapchat’s efforts handling safety issues. Roy Sinclair, a former UK law enforcement officer who investigated sextortion, told Insider that Snapchat was “really, really good” to work with, and called its trust and safety team “more than cooperative.” Two former employees said they thought Snap upper management took the problem seriously, but asked not to be quoted in this article.
‘Early notoriety as the so-called app built for sexting’
Part of the challenge for Snapchat is rooted in the company’s corporate culture and history.
“Snapchat gained early notoriety as the so-called app built for ‘sexting’, its quick-to-vanish imagery being seen as the ideal medium for transmitting explicit content,” UK researchers Sarah Handyside and Jessica Ringrose wrote in 2017.
The app was partly inspired by the Anthony Weiner sexting scandal, Spiegel reportedly told TechCrunch in 2012, though he said racy messages were a small part of its content. (Weiner, a former congressman, was later imprisoned for sexting with a minor.) That attitude carried over, former employees say, even as Snapchat grew into the social media powerhouse it is today. From January to June 2020, users reported sexually explicit content on Snapchat, an app for users 13 and older, 8,522,585 times.
Some critics say Snapchat has taken a lax attitude towards “sexualized” content in its Discover feed, which features stories created by media partners and publishers.
Last March, for instance, Snapchat’s Discover feed featured posts by publishing partner Teen Vogue giving teens guidance on sexting and sharing nude photos. “Send nudes the right way,” one of the posts read. “Sexting takes practice,” read another. The posts and the Teen Vogue articles they quoted included cautions for teens about sharing nude photos. But child-safety advocates and lawmakers say they were stunned that a platform that has struggled to address a traumatizing crime would feature content advising teens on how to share nude photos of themselves at all.
“It was a shock to all of us,” said Chris McKenna, founder of Protect Young Eyes, who testified to a Senate committee about online exploitation of children in April. The National Center on Sexual Exploitation blasted Snapchat and Teen Vogue on its blog for the posts and said the incident shows the need for regulation.
US Senator Marsha Blackburn, a Tennessee Republican who met with Snapchat to discuss child sexual exploitation in 2019, was equally surprised and disappointed by the Teen Vogue posts. “If Snap refuses to see the incentive in protecting young kids and teenagers online, those of us in Congress will make it our priority to hold the company accountable,” she said. “Snapchat’s demise into a child predator’s paradise can’t be good for its reputation or long-term bottom line.”
Snapchat does not review publishing partners’ content before it is posted in the Discover feed.
Blackburn supports reforms to Section 230 of the Communications Decency Act, the law that prevents platforms like Snapchat from being held legally liable for content posted by their users.
US Senator Ron Wyden, an Oregon Democrat, told Insider the bill submitted last week aims to help address issues such as sextortion on social media. “It is also critical for every tech company to be consulting closely with experts to ensure they promptly act in response to reports of exploitation on their products,” Wyden said.
As legislation seeks to rein it in, sextortion continues to claim new victims. Those whom Ohio attorney Powell serves are often stunned to find their private photos used against them. “Snapchat,” he said, “gives them a false sense of security.”