Apple will soon begin checking iPhone and iCloud photos for evidence of child abuse


Apple announced that, starting Thursday, it will begin testing a new system that automatically matches photos taken on iPhones and uploaded to iCloud against a database of known child sexual abuse images. If the system detects such images, it will alert the authorities.

At a press conference, the company explained that the new service converts photos stored on a device into an unreadable set of hashes (complex numbers) that remain on the user's device, and checks those numbers for matches against a database of hashes provided by the National Center for Missing and Exploited Children.
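As a rough sketch of that on-device step, the snippet below reduces a photo to a fingerprint and tests it for membership in a pre-distributed set of known fingerprints. It is an illustration only, not Apple's implementation: a plain SHA-256 stands in for Apple's perceptual "NeuralHash," and KNOWN_HASHES is a hypothetical stand-in for the NCMEC-derived database.

```python
import hashlib

# Hypothetical stand-in for the on-device database of known-image hashes
# that the article says is provided by the National Center for Missing
# and Exploited Children.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def photo_hash(image_bytes: bytes) -> str:
    # SHA-256 keeps this sketch self-contained; Apple's system instead
    # uses a perceptual hash so that near-identical images produce the
    # same value (see the later sketch).
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    # The on-device check: is this photo's hash in the distributed set?
    return photo_hash(image_bytes) in KNOWN_HASHES
```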

Apple (AAPL) is following in the footsteps of other major technology companies, such as Google (GOOG) and Facebook (FB), in taking this step. However, it is also attempting to strike a balance between security and privacy, the latter of which Apple has emphasized as a key selling point for its devices.

Some privacy advocates were quick to express their reservations about the project

"Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the United States, but around the world," says Greg Nojeim, co-director of the Security & Surveillance Project at the Center for Democracy & Technology. "Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship," says Nojeim. According to the report, Apple should abandon its plans to make these changes and restore users' confidence in the security and integrity of their data stored on Apple devices and services."

According to Apple's explanation of the changes on its website, "Apple's method... is designed with user privacy in mind." The company also stressed that the tool does not "scan" user photos and that only images that appear in the database will be flagged as matches. A user's harmless picture of their child in the bathtub, for example, should not be flagged.

The company also stated that for each photo uploaded, the device will generate a doubly-encrypted "safety voucher," a packet of information about the image, and send it to Apple's servers along with the photo. Once a certain number of safety vouchers have been flagged, Apple's review team will be notified. It will then decrypt the vouchers, disable the user's account, and notify the National Center for Missing and Exploited Children, which will in turn notify law enforcement authorities. Users who believe their accounts have been mistakenly flagged can file an appeal to have them reinstated.
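The threshold behavior described above can be pictured with a much-simplified server-side counter, sketched below. In Apple's actual design the vouchers are protected with threshold cryptography, so the server can read nothing until enough matches accumulate; here a plain counter and an arbitrary threshold of 30 stand in for that machinery, and every name is illustrative rather than Apple's API.

```python
from collections import defaultdict

MATCH_THRESHOLD = 30  # arbitrary illustrative value, not a number Apple published

flagged_vouchers = defaultdict(list)  # account id -> vouchers received so far

def receive_safety_voucher(account_id: str, voucher: bytes) -> None:
    """Accumulate flagged vouchers and escalate once past the threshold."""
    flagged_vouchers[account_id].append(voucher)
    if len(flagged_vouchers[account_id]) >= MATCH_THRESHOLD:
        escalate_for_human_review(account_id, flagged_vouchers[account_id])

def escalate_for_human_review(account_id: str, vouchers: list) -> None:
    # In the flow the article describes: decrypt the vouchers, have human
    # reviewers confirm, disable the account, and report to the National
    # Center for Missing and Exploited Children.
    print(f"Escalating account {account_id} ({len(vouchers)} flagged vouchers)")
```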

Apple's goal is for identical and visually similar images to produce the same hash, even if an image has been slightly cropped, resized, or converted from color to black and white.
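A perceptual hash is the standard way to get that property. The classic "average hash" below, built on the Pillow imaging library, is not Apple's NeuralHash, which is derived from a neural network, but it demonstrates the same idea: shrink the image, grayscale it, and threshold each pixel at the mean, so a resized or recolored copy lands within a few bits of the original.

```python
from PIL import Image

def average_hash(image: Image.Image, size: int = 8) -> int:
    """Classic perceptual "average hash": shrink, grayscale, threshold.
    Visually similar images produce (nearly) identical 64-bit values."""
    small = image.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p > mean)
    return bits

def hamming_distance(a: int, b: int) -> int:
    # Number of differing bits between two hashes.
    return bin(a ^ b).count("1")

# A photo and its half-size copy should differ by only a few bits,
# so one database entry covers both.
original = Image.open("photo.jpg")  # any local image file
shrunk = original.resize((original.width // 2, original.height // 2))
print(hamming_distance(average_hash(original), average_hash(shrunk)))
```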

Privacy and child protection can coexist, John Clark, president and CEO of the National Center for Missing & Exploited Children, said in a statement on Tuesday. "Apple deserves to be congratulated, and we look forward to working with them to make the world a safer place for children."

Finally

The company's announcement is part of a larger push around child safety. Apple also announced Thursday a new communication tool that will warn users under the age of 18 when they are about to send or receive a message containing a sexually explicit image. The tool, which must be enabled in Family Sharing, uses on-device machine learning to analyze image attachments and determine whether they contain sexually explicit content. Parents of children under the age of 13 can additionally turn on a notification for when their child is about to send or receive a nude image. Apple has stated that it will have no access to the messages.
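Apple has not published this classifier or any API for it, so the sketch below is purely hypothetical; it only lays out the decision flow the paragraph describes, with is_sexually_explicit standing in for the on-device machine-learning model.

```python
def is_sexually_explicit(image_bytes: bytes) -> bool:
    # Stand-in for the on-device ML classifier; a trivial stub here.
    return False

def handle_attachment(image_bytes: bytes, user_age: int,
                      parent_alerts_enabled: bool) -> str:
    """Simplified decision flow for the communication-safety feature."""
    # The feature applies only to users under 18 with it enabled.
    if user_age >= 18 or not is_sexually_explicit(image_bytes):
        return "deliver"
    # Under-18 users are warned before sending or viewing the image.
    action = "warn_user"
    # Parents of children under 13 can opt in to a notification.
    if user_age < 13 and parent_alerts_enabled:
        action += "+notify_parents"
    return action
```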

According to the company, that tool will be made available as part of a future software update.
