APPLE HAS REVEALED that it removed Telegram from the App Store last week after learning that the app was serving child pornography to users.
Telegram was purged from the iOS App Store, with Apple claiming at the time that the messaging app was being used to spread “inappropriate content”.
Now, the company has confirmed that the inappropriate content in question was, specifically, child pornography.
In a verified email sent to a 9to5Mac reader, Apple’s marketing chief Phil Schiller explained: “The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps.
“After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).”
Schiller notes that Telegram removed the inappropriate content and has since put controls in place to ensure this illegal content is no longer being shown to users.
“The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content,” he added.
“Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store.”
As noted by 9to5Mac, Telegram – like Apple’s own iMessage – relies on end-to-end encryption to protect the privacy of messages sent between users, which suggests that this illegal content was likely being served up through a third-party plug-in.
Telegram has previously been accused of harbouring violent and extremist content on its platform, and was recently singled out by Prime Minister Theresa May, who said the app acts as a place where criminals can hide their activities.
“No-one wants to be known as the terrorists’ platform or the first-choice app for paedophiles,” she said.