How to Make YouTube Kids Safer for Your Kids

On Friday, a pediatrician and parenting blogger named Free N. Hess published a post about a series of disturbing videos she found on YouTube Kids, a stand-alone app that is supposed to make it “safer and simpler” for those under 13 to browse videos online. A number of news outlets quickly picked up on the clips Hess discovered, which included one where Minecraft-inspired characters carry out a school shooting. In another, an animated girl with long brown hair attempts suicide after her dad dies and her boyfriend breaks up with her.

These cartoons weren’t created for YouTube Kids—they were uploaded to the main YouTube platform and later slipped past filters designed to keep inappropriate content away from minors. Some of the videos have racked up hundreds of thousands, if not millions, of views. “We work to ensure the videos in YouTube Kids are family-friendly and take feedback very seriously,” YouTube said in a statement. “We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video.”

Hess is far from the first parent to find alarming videos on YouTube Kids. When the app was released in 2015, YouTube said parents could “rest a little easier” knowing their children would only be shown “content appropriate for kids.” But just three months after it launched, child consumer advocates complained to the Federal Trade Commission about “potentially harmful” YouTube Kids videos that depicted violence, suicide, and drug use.

Two years later, a Medium post about the plethora of disturbing children’s content on YouTube went viral, leading to a wave of media investigations, which often focused on YouTube’s reliance on automated systems to moderate videos.

YouTube soon announced that “content featuring inappropriate use of family characters,” such as upsetting Peppa Pig knockoffs, would be age-restricted—meaning not permitted on YouTube Kids—and ineligible for advertising. In 2018, the company went further, introducing a new “trusted channels” feature, where parents can choose from collections of curated videos that have been cherry-picked by YouTube Kids and its partners. Parents can now also restrict kids to watching only the videos and channels they handpick themselves.

It’s these control settings, rather than the overall safety of YouTube Kids, that the company now emphasizes. “We use a mix of filters, user feedback and human reviewers to keep the videos in YouTube Kids family friendly,” the YouTube Kids landing page says. “But no system is perfect and inappropriate videos can slip through.” YouTube then highlights the various parental features it offers. It’s not clear whether Hess had any of these settings enabled when she published her blog post. (Hess did not immediately return a request for comment.)

“It takes a lot of effort to curate a space for your kid,” says Jill Murphy, the editor in chief of Common Sense Media, a nonprofit that promotes safe technology for kids.

But YouTube Kids can be a great and convenient way to keep kids entertained. It provides a nearly endless source of educational content, as well as innocuous cartoons. If you’re a parent considering using YouTube Kids, or already have the app and worry about what your kids are seeing, here are the safety features you should know about.

Getting Started

When parents first download YouTube Kids, the app walks them through a tutorial explaining how it works, the options they have, and the kind of information YouTube collects. (In the US, the Children’s Online Privacy Protection Act, or COPPA, regulates how companies can track children under 13. However, a complaint filed to the FTC last year pointed out that many kids still watch regular YouTube.) Parents are then prompted to choose whether they want their child to be able to explore all of YouTube Kids, where “automated systems select content from the broader universe of videos on YouTube,” or be limited to channels that have been “verified” by YouTube. At multiple points in this process, the company admits it “can’t manually review all videos.”
