Apple Removes Controversial Child Abuse Detection Tool From Webpage


Apple has removed all references to its controversial child sexual abuse material (CSAM) detection feature from its Child Safety webpage.

Announced in August, the CSAM feature aimed to protect children from predators who use communication tools to recruit and exploit them, and to limit the dissemination of child pornography.

It was one of a set of features that included scanning users’ iCloud photo libraries for child sexual abuse material (CSAM), communication safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

Two of the three safety features, which were released earlier this week with iOS 15.2, are still present on the page titled “Extended protections for children”.

However, references to the CSAM detection, the launch of which was delayed following backlash from nonprofit and advocacy groups, researchers and others, have been removed, MacRumors reports.

The tech giant, however, said its stance had not changed since September, when it first announced it would delay the launch of CSAM detection.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in September.

Following the announcement, the features drew criticism from a wide range of individuals and organizations, including security researchers, whistleblower Edward Snowden, Facebook’s former security chief and politicians.

Apple sought to dispel misunderstandings and reassure users by publishing detailed information and FAQs, releasing new documents, and giving interviews with company executives.

According to reports, the update to iOS allows parents to protect their children and help them learn to navigate online communication in Messages.

iOS 15.2 includes support for the new communication safety feature in Messages.

With this update, the Messages app can use on-device machine learning to analyze image attachments and determine whether a shared photo is sexually explicit, TechCrunch reported.

–IANS


(Only the title and image of this report may have been reworked by Business Standard staff; the rest of the content is automatically generated from a syndicated feed.)
