Apple opens the encryption Pandora’s box – Axios

Apple's plan to scan iPhones for child sexual abuse material (CSAM) provoked immediate criticism that it was opening a door to much broader efforts by governments seeking a way into citizens' devices.

Between the lines: That debate is important, but Apple is also laying out a technical approach that's worthy of the industry's attention.

Driving the news: Apple last week announced its plan to begin scanning iPhones in the U.S. for images that match a database of known child sexual abuse material maintained by the National Center for Missing and Exploited Children. A separate change would allow parents to be notified if children under 13 are sent nude images.
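The article doesn't spell out the mechanics, but the general idea behind this kind of scanning is on-device matching against a database of hashes of known abuse images. The sketch below is a deliberately simplified, hypothetical illustration in Swift: it uses a plain SHA-256 exact match and a placeholder database loader, whereas Apple's announced design uses a perceptual image hash (NeuralHash) and a cryptographic matching protocol rather than anything this direct.

```swift
import Foundation
import CryptoKit

// Hypothetical, simplified sketch of client-side matching against a database of
// hashes of known flagged images. This is NOT Apple's actual protocol: the real
// system uses a perceptual hash and cryptographic blinding, not plain SHA-256.

/// Placeholder for the flagged-hash list; in practice such a list would ship
/// with the OS in an encoded form rather than as readable hashes.
func loadFlaggedHashDatabase() -> Set<String> {
    return []
}

/// Hex-encoded SHA-256 of a file's bytes.
func sha256Hex(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

/// Returns the local photo URLs whose hash appears in the flagged database.
func matchesAgainstFlaggedDatabase(photoURLs: [URL]) -> [URL] {
    let flagged = loadFlaggedHashDatabase()
    return photoURLs.filter { url in
        guard let bytes = try? Data(contentsOf: url) else { return false }
        return flagged.contains(sha256Hex(of: bytes))
    }
}
```

Even in this toy form, the key design point is visible: the comparison happens on the device itself, before anything is uploaded, which is exactly what critics mean when they talk about "client-side scanning."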

The big picture: Much of the debate mirrors past encryption controversies, in which encryption proponents have argued that any kind of exception or back door creates vulnerabilities that bad actors will exploit, undermining the point of using encryption in the first place.

Indeed, critics of Apple's approach say that once it starts scanning devices on the client side, it is no longer offering true end-to-end encryption at all.

My thought bubble: The immediate blowback suggests that Apple either didn't get the balance right in this instance, or did a bad job of communicating its system, or both.

Apple has explored this kind of hybrid on-device/cloud approach in other areas as well, including the system it created with Google to notify users of potential COVID-19 exposure. A mix of information on the device and in the cloud ensured that only a narrow amount of new data about users' health and location was created, and even less was shared.

Even those who criticize Apple over its new CSAM detection feature acknowledge there is some benefit to Apple's approach.
