Discord Chat App Is Safer Now for Kids but Still Lacks Parental Controls



Discord, the popular chat service used largely by teenage and young-adult gamers, is trying to reinvent itself as a friendly place to hang out—a virtual space where anyone can gather for karaoke nights, group painting sessions or yoga classes.

On Discord, anyone can create a private chat room—a “server,” in Discord parlance—and invite friends in to talk live via video, audio or text and to share pictures and videos. This made Discord a content free-for-all that many parents have been leery of allowing their children to access.

But over the past six months, the service, with more than 140 million monthly active users, has instituted changes and is taking a more proactive approach to policing the site for child predators and blocking minors from seeing porn.

Discord shows this screen when users try to enter a platform ‘server’ that has pornographic content; people registered as minors are blocked from entering. Photo: Discord

These changes come as social-media platforms are increasingly scrutinized, from both the political right and left, for their handling of content. They are seen as having played a role in the riot at the U.S. Capitol earlier this month; Facebook and Twitter are also receiving criticism for deciding to deplatform President Trump and some of his supporters. As Trump supporters look to congregate elsewhere, options are limited: One preferred network, Parler, was evicted by web-service provider Amazon and by app stores run by Apple and Google, on the grounds that it didn’t do enough to stop the promotion of violence. Parler said its volunteer moderators were overwhelmed with the number of posts as the service experienced rapid growth.

Discord has stayed out of the headlines this month, but that hasn’t always been the case.

“Discord did not come to safety right away; it took a bumpy road for them to acknowledge that trust and safety were absolutely core to their business model because if not, as in the case of Parler, you’re run out of town,” said Stephen Balkam, chief executive of the nonprofit Family Online Safety Institute. The organization, which works with tech companies and lawmakers to make the internet safer for children, allowed Discord to become a member earlier this month after what Mr. Balkam described as a rigorous application process.

Discord’s bumpy road included a role in the 2017 “Unite the Right” rally in Charlottesville, Va., that erupted in violence and left one counterprotester dead. The FBI obtained a warrant for the Discord account of the leader of a white supremacist group, after chats suggesting the leader encouraged violence at the rally were leaked on a left-wing media site. Discord said afterward that it banned servers promoting neo-Nazi ideology.

There have also been some high-profile cases in which child predators were alleged to have used Discord to share child pornography or communicate with minors.

“We’re having a moment over trust and safety on social-media platforms,” Mr. Balkam said. “Because Discord had one earlier with Charlottesville, they’re better prepared now for the world we’re in.”

But are parents prepared to let their children loose in what I described back in 2019 as a virtual “Lord of the Flies,” where adolescent boys ruled and where harassment and porn were commonplace?

In my earlier Discord column, the company described a laissez-faire approach to enforcement: It wouldn’t interfere in a private server unless someone made a complaint. That has changed. Discord has created a team of seven machine-learning experts who are developing algorithms to detect harmful activity and identify suspected child predators. Upon detection, Discord deletes the servers used by the suspected predators and bans members of those groups from rejoining its platform.

Last year, Discord shut down more servers for sexual content based on its own policing than on user complaints. It is also doing more to train human moderators to spot trouble; it says 15% of its employees now work on trust and safety and monitor certain servers for problematic activity.

“Discord is not about growth at all costs,” said Clint Smith, Discord’s chief legal officer, who joined the company in August and oversees its trust and safety team. “We want to be an inclusive platform for healthy engagement with friends and communities, and we have no problem banning people who don’t come here for that reason.”

Discord launched an online safety center, where parents and users can find tools to filter explicit content and block direct messages from strangers. Parents also can request to see all of their teen’s account data.


However, parents tell me what they really want is a way to control which Discord servers their children can and can’t enter. Mr. Smith said the company doesn’t plan to introduce parental controls, because its philosophy is to put users first—not their parents.

One mom, who told me that a lack of parental controls is why she won’t let her children join Discord, said, “I think putting the user first is fine for people 18 and older but not for anyone under 18.”

Discord terminated more than four million accounts in the first half of 2020 for violating its terms of service or community guidelines, with the majority of the violations related to spam, according to its most recent transparency report. Of the non-spam violations, most account deletions were for exploitative content, including child sexual-abuse material and nonconsensual pornography. While few terminations were for harassment—Discord says harassment is better addressed by users blocking other users—harassment did make up the largest share of complaints in the first half of 2020.

Discord now also limits what minors can access on the platform. It previously didn’t ask people to enter a date of birth when they created an account—social-media sites aren’t supposed to have users under age 13, due to a children’s privacy protection law. Now, though, Discord requires new users to enter a birth date. Kids can, and do, lie about their age, but if they register for an account with a birth date that puts them anywhere between ages 13 and 18, they’re blocked from entering servers dubbed “NSFW,” aka “not safe for work.” They also can’t receive direct messages with images containing nudity, as determined by Discord’s filtering algorithms. Discord says it will respond to any child-safety complaint within two hours.
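Discord hasn’t published how its age gate is implemented, but the rule it describes reduces to simple date arithmetic. Here is a minimal sketch in Python of that kind of check; the function names and structure are illustrative assumptions, not Discord’s actual code:

```python
# Hypothetical sketch of the age-gate rule described above; not Discord's code.
from datetime import date
from typing import Optional

MINIMUM_AGE = 13  # accounts under 13 aren't allowed at all
ADULT_AGE = 18    # registered minors (13-17) are blocked from NSFW servers

def age_on(birth_date: date, today: date) -> int:
    """Whole years between birth_date and today."""
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)

def may_enter_nsfw_server(birth_date: date, today: Optional[date] = None) -> bool:
    """Return True only if the registered birth date makes the user an adult."""
    today = today or date.today()
    age = age_on(birth_date, today)
    if age < MINIMUM_AGE:
        raise ValueError("accounts under 13 are not permitted")
    return age >= ADULT_AGE
```

The obvious weakness, as the column notes, is that the whole gate rests on a self-reported birth date.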

If a moderator suspects that someone under the age of 13 is in a server and reports it, Discord suspends the suspected child’s account until he or she can show proof of age. Discord also has created a training program to teach moderators how to look out for younger children in their servers, how to encourage good behavior and how to block or report bad actors. Moderators can also create bots to detect inappropriate content, along the lines of the sketch below.
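Discord’s public bot API supports this kind of tooling. As an illustration only, here is a minimal keyword-filter bot written with the community-maintained discord.py library (version 2.x assumed); the banned-terms list and token are placeholders, and real moderation bots are considerably more sophisticated:

```python
# Minimal moderation-bot sketch using discord.py (v2.x); illustrative only.
import discord

BANNED_TERMS = {"example-slur", "example-spam-link"}  # placeholder terms
BOT_TOKEN = "YOUR-BOT-TOKEN"  # placeholder; keep real tokens out of source code

intents = discord.Intents.default()
intents.message_content = True  # required in v2.x to read message text

client = discord.Client(intents=intents)

@client.event
async def on_message(message: discord.Message) -> None:
    if message.author.bot:  # ignore other bots, including this one
        return
    if any(term in message.content.lower() for term in BANNED_TERMS):
        await message.delete()  # remove the offending message
        await message.channel.send(
            f"{message.author.mention}, that message broke the server rules."
        )

client.run(BOT_TOKEN)
```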


While Discord currently only scans images for nudity and porn, it plans to expand the scope of the inappropriate imagery it can detect. The company recently joined the Global Internet Forum to Counter Terrorism, which will give it access to examples of violent and extreme content to train its filters.

Since the Nov. 3 election, Discord has been monitoring its servers for signs of activity that might indicate organized violence. After the Capitol riot, Discord banned a Trump-themed server. Although there was no evidence it had been used to organize the riots, it was connected to another online platform that had been used to incite violence. “Discord was not used by the organizers of the January 6th attack on the Capitol,” Mr. Smith said. “For us this validates several years of our persistent commitment to make our platform an unfriendly place for anyone who promotes or glorifies violent extremism.”

The changes appear to be showing some results, albeit minor ones. Bark, a parental-control and online-monitoring service that monitors Discord’s direct messages on Android and Amazon devices, ranked Discord the fourth most-abusive platform last year. Before that, it was No. 2. Bark measures abusiveness by the amount of activity it flags for bullying, sexual content and violence, among other issues. Last year, 8.2% of all Discord activities that Bark monitored were flagged for being abusive, down from 12.6% in 2019. Bark just issued its latest annual report on children and technology.

Bark can’t monitor activity within the servers, nor any text chats on Apple devices. Titania Jordan, Bark’s marketing chief, cautioned that it is too soon to celebrate.

Mr. Balkam, of the Family Online Safety Institute, said Discord should do still more to educate its youngest users on how to protect themselves from dangers. He pointed to the way TikTok has had some of its most popular content creators share videos on how to stay safe on the TikTok app.

“I would give them a B+ for the improvements they’re making to the platform,” he said.

Write to Julie Jargon at julie.jargon@wsj.com



