Security & Privacy




Apple Announces New iPhone Features To Detect Child Sex Abuse


Photo: Apple

 


August 7th, 2021  |  12:57 PM

CALIFORNIA, UNITED STATES

 

The Messages, Photos and Siri features will also come to watchOS and macOS.

 

Following a report on work the company was doing to create a tool that scans iPhones for child abuse images, Apple has published a post that provides more details on its efforts related to child safety. With the release of iOS 15, watchOS 8 and macOS Monterey later this year, the company says it will introduce a variety of child safety features across Messages, Photos and Siri.

 

To start, the Messages app will include new notifications that will warn children, as well as their parents, when they either send or receive sexually explicit photos. When someone sends a child an inappropriate image, the app will blur it and display several warnings. "It's not your fault, but sensitive photos and videos can be used to hurt you," says one of the notifications, per a screenshot Apple shared.

 

As an additional precaution, the company says Messages can also notify parents if their child decides to go ahead and view a sensitive image. "Similar protections are available if a child attempts to send sexually explicit photos," according to Apple. The company notes the feature uses on-device machine learning to determine whether a photo is explicit. Moreover, Apple does not have access to the messages themselves. This feature will be available to family iCloud accounts.
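To make that flow concrete, here is a minimal sketch in Python of the logic Apple describes. It is illustrative only: Apple has not published code or an API for this feature, so the function names (classify_explicit, handle_incoming_photo) and the returned fields are hypothetical stand-ins for the on-device model and the Messages UI logic.

def classify_explicit(image_bytes: bytes) -> bool:
    """Stand-in for Apple's on-device ML classifier; the real model
    runs locally, so the photo never has to leave the device."""
    return False  # placeholder verdict

def handle_incoming_photo(image_bytes: bytes,
                          is_child_account: bool,
                          parent_alerts_enabled: bool) -> dict:
    """Return display instructions for the Messages UI (hypothetical)."""
    if is_child_account and classify_explicit(image_bytes):
        return {
            "blur": True,  # blur the image before it is shown
            "warning": ("It's not your fault, but sensitive photos "
                        "and videos can be used to hurt you."),
            # Parents are alerted only if the child taps through and views it.
            "notify_parents_on_view": parent_alerts_enabled,
        }
    return {"blur": False, "warning": None, "notify_parents_on_view": False}

The point the sketch captures is that the verdict is produced and acted on entirely on the device; the photo itself is never sent to Apple for classification.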

 

Apple will also introduce new software tools in iOS and iPadOS that will allow the company to detect when someone uploads content to iCloud that shows children involved in sexually explicit acts. The company says it will use the technology to notify the National Center for Missing and Exploited Children (NCMEC), which will in turn work with law enforcement agencies across the US. "Apple’s method of detecting known CSAM [Child Sexual Abuse Material] is designed with user privacy in mind," the company claims.

 

Rather than scanning photos when they're uploaded to the cloud, the system will use an on-device database of "known" images provided by NCMEC and other organizations. The database stores not the images themselves but a hash of each one, which acts as a kind of digital fingerprint for it.
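The fingerprinting idea can be illustrated in a few lines of Python. One simplification to flag: Apple's system reportedly uses a perceptual hash that tolerates resizing and recompression, while the SHA-256 stand-in below only matches byte-identical files. The database entry and function names are invented for the example.

import hashlib

# In practice the database is supplied by NCMEC and other child-safety
# organizations and is opaque on the device; this sample entry is fake.
KNOWN_HASHES = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}

def fingerprint(image_bytes: bytes) -> str:
    # SHA-256 here is a stand-in; a perceptual hash would map visually
    # similar images to the same fingerprint.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    return fingerprint(image_bytes) in KNOWN_HASHES

print(matches_known_database(b"example image bytes"))  # False for this sample

In Apple's real design even this true/false result is hidden from the device itself, which is where the cryptography described next comes in.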

 

A cryptographic technology called private set intersection allows Apple to determine if there's a match without seeing the result of the process. In the event of a match, an iPhone or iPad will create a cryptographic safety voucher that will encrypt the upload, along with additional data about it. Another technology called threshold secret sharing makes it so that the company can't see the contents of those vouchers unless someone passes an unspecified threshold of CSAM content. "The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account," according to the company.
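The threshold property is easier to see with a toy example. The sketch below uses textbook Shamir secret sharing rather than Apple's actual, unpublished construction: a secret, standing in here for the key that would decrypt the safety vouchers, is split into shares so that it can only be reconstructed once a minimum number of them come together.

import random

PRIME = 2**127 - 1  # prime field modulus for the arithmetic

def make_shares(secret: int, threshold: int, count: int):
    # Random polynomial of degree threshold-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, threshold=3, count=5)
print(reconstruct(shares[:3]) == 123456789)  # True: three shares suffice
print(reconstruct(shares[:2]) == 123456789)  # almost surely False: two do not

Below the threshold the shares reveal essentially nothing about the secret; at the threshold, reconstruction is exact. That is the behaviour Apple ascribes to the vouchers: unreadable individually, reviewable only once enough matches accumulate.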

 

It's only when that threshold is passed that the technology Apple plans to implement will allow the company to review the contents of the vouchers. At that point, the tech giant says it will manually review each report to confirm there's a match. In cases where there is one, it will disable the individual's iCloud account and forward a report to NCMEC. Users can appeal a suspension if they believe their account has been mistakenly flagged.

 

 

 

Photo: Apple

 

Lastly, Siri, as well as the built-in search feature found in iOS and macOS, will point users to child safety resources. For instance, you'll be able to ask the company's digital assistant how to report child exploitation. Apple also plans to update Siri to intervene when someone tries to conduct any CSAM-related searches. The assistant will explain "that interest in this topic is harmful and problematic," as well as point the person to resources that offer help with the issue.

 

Apple's decision to effectively work with law enforcement agencies is likely to be seen as something of an about-face for the company. In 2016, it refused to help the FBI unlock the iPhone that had belonged to the man behind the San Bernardino terror attack. Although the government eventually turned to an outside firm to access the device, Tim Cook called the episode "chilling" and warned it could create a backdoor for more government surveillance down the road.

 


 

Source: courtesy of ENGADGET

by Igor Bonifacic

 


 
