Apple defends new photo scanning child protection technology


Apple has defended its new system that scans users’ phones for child sexual abuse material (CSAM), after a backlash from customers and privacy advocates.


The technology checks images for matches against known abuse material before they are uploaded to Apple's iCloud storage.
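In plain terms, the check happens on the device: each photo headed for iCloud is reduced to a fingerprint (Apple calls its perceptual hash NeuralHash) and compared against a list of fingerprints of already-known abuse images, and nothing is flagged unless enough matches are found. The sketch below is only a rough illustration of that general idea, not Apple's implementation: it uses an ordinary SHA-256 file digest and a made-up hash list in place of Apple's perceptual hashing and cryptographic matching protocol.

```python
import hashlib
from pathlib import Path

# Hypothetical list of fingerprints of already-known abuse images.
# In Apple's real system these are perceptual "NeuralHash" values supplied by
# child-safety organizations and matched cryptographically on the device;
# a plain SHA-256 digest is used here purely for illustration.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(image_path: Path) -> str:
    """Return a hex digest of the file's bytes (a stand-in for a perceptual hash)."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def matches_known_material(image_path: Path) -> bool:
    """Check an image against the known-hash list before it would be uploaded."""
    return fingerprint(image_path) in KNOWN_HASHES
```

The key point the sketch tries to capture is that the comparison is against a fixed list of known images, not a general analysis of what is in your photos.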

Critics warned it could be a “backdoor” to spy on people, and more than 5,000 people and organizations have signed an open letter against the technology.

Thoughts

Do not use iCloud storage. Or, better yet, do not have any images of child sexual abuse material (CSAM) at all.

My cousin Julie Cordua is the CEO of Thorn (https://www.thorn.org/). She is an expert in child sexual abuse material (CSAM) prevention software built to go after child sex traffickers and the people who share these illegal images.

About Thorn

Thorn was born in 2012. Thorn’s co-founders Ashton Kutcher and Demi Moore had learned about the issue of child sex trafficking from a documentary highlighting what was happening to children in Cambodia.

They describe it as the moment where you learn something about the world that you can't un-know. As they started learning more, they realized that it is just as prolific a problem here in the United States as it is overseas.

Thorn CEO Julie Cordua joined soon after to dig further into the issue of child sex trafficking. A common theme emerged from those working in the field: technology was playing a role in extending the crime.

However, technology had yet to play a significant part in the solution. The Thorn team started out focused on investing in the innovation phase of potential tech-led approaches to ending online child sexual abuse.

Within the first two years, the team realized it needed to shift its model to achieve scalable, long-term change.

Thorn did, and it has been highly successful in saving these abused children from pure evil.

Read More: More About Thorn

We can all agree that child sexual abuse material (CSAM) must be stopped, and I believe that organizations like my cousin's team at Thorn, along with companies like Apple that are working hard to create technology to protect these abused children, are heroes.

The critics need to really think about who they are as humans, and whether they are truly good humans, before complaining about this issue. Evil always blames its victims.

References:

Reference 1: BBC News (bbc.com) – "Apple defends new photo scanning child protection tech."
