Apple abruptly pulled Telegram last week when it learned the app was serving child pornography
https://9to5mac.com/2018/02/05/apple-telegram-illegal-content/
We now have the answer to why the popular messaging app Telegram was pulled from the App Store last week. Telegram for iOS notably disappeared from the App Store for several hours without explanation before the service’s CEO attributed the removal to Apple pulling the app over ‘inappropriate content’ appearing in the app.
According to an email shared by 9to5Mac reader Alijah that includes a response from Phil Schiller, who oversees the App Store, Telegram was abruptly pulled when Apple learned that the app was serving child pornography to users.
The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).
The response also explains what Telegram CEO Pavel Durov referenced when responding to a user last week who asked why the app was pulled:
We were alerted by Apple that inappropriate content was made available to our users and both apps were taken off the App Store. Once we have protections in place we expect the apps to be back on the App Store.
Similar to Apple’s iMessage, Telegram offers a secure messaging feature that relies on end-to-end encryption to protect the privacy of messages sent between users. This means the illegal content was likely not simply media shared directly between users, but more likely content served up through a third-party plug-in used within Telegram.
Within hours of Telegram being pulled, the secure messaging app returned to the App Store with fixes in place to prevent the illegal content from being served to users.
You can read the full email below.
The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).
The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content. Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store.
We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk – child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral.
I hope you appreciate the importance of our actions to not distribute apps on the App Store while they contain illegal content and to take swift action against anyone and any app involved in content that puts children at risk.