Back in 2014, hundreds of nude, semi-nude and revealing pictures of female celebrities were leaked after being stolen from their private collections. I can only imagine how it must have felt to have personal images stolen and then shared for the entire world to see. We all know what happened, but is there a way to prevent it? I've heard some say that you shouldn't keep that kind of sensitive data on your phone at all. What I hear in that, though, is judgment of people who chose to take those kinds of photos. Is that fair? Why can't you keep whatever you want on your phone? It is, after all, a personal computer in your pocket.

A new app called Nude aims to automatically scan your iPhone for nude photos and move them into a protected vault inside the app so that no one else can see them. What's neat is that it then deletes the originals from both your camera roll and iCloud. When the two developers, Jessica Chiu and Y.C. Chen, described the app, almost everyone insisted they didn't have any nude photos themselves, but they all wanted to know more.
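Nude's source isn't public, but the camera-roll cleanup described here maps onto Apple's Photos framework. The sketch below is a minimal illustration of that step, assuming the app has already obtained photo-library authorization and copied the image into its own vault; the function name and the way assets are selected are placeholders, not the app's actual code.

```swift
import Photos

/// Deletes the given assets from the user's photo library after they have been
/// copied into the app's local vault. Because the library syncs with iCloud Photos,
/// removing an asset here also removes it from iCloud (it sits in "Recently Deleted"
/// for up to 30 days before it is gone for good).
func removeFromCameraRoll(_ assets: [PHAsset], completion: @escaping (Bool) -> Void) {
    PHPhotoLibrary.shared().performChanges({
        // The system shows its own confirmation dialog before anything is deleted.
        PHAssetChangeRequest.deleteAssets(assets as NSArray)
    }, completionHandler: { success, error in
        if let error = error {
            print("Deletion failed: \(error.localizedDescription)")
        }
        completion(success)
    })
}
```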

The impetus for the app came from Chiu's conversations with Hollywood actresses. Every one of them had sensitive images on a phone or laptop, and every one of them had doubts about how to keep those images secure. Of course, there's no point in building an app to keep nude photos out of the cloud if it needs an online server to detect them, which is why Apple's CoreML framework was key.

Crucially, the images on your device are never sent to Nude itself. That's possible thanks to CoreML, the machine learning framework Apple introduced with iOS 11, which lets developers run machine-learning-intensive tasks such as image recognition on the device itself, without ever transmitting the image to a server.
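The developers haven't published their model, but on-device detection of this kind typically goes through the Vision wrapper around CoreML. Here's a minimal sketch assuming a hypothetical bundled classifier called NudityClassifier; the class name, its "nude" label and the confidence threshold are placeholders, not the app's actual model.

```swift
import Vision
import CoreML
import UIKit

/// Classifies an image entirely on-device using a bundled CoreML model.
/// `NudityClassifier` is a stand-in for whatever compiled .mlmodel the app ships;
/// nothing leaves the phone because Vision runs the model locally.
func isLikelyNude(_ image: UIImage, threshold: Float = 0.8) throws -> Bool {
    guard let cgImage = image.cgImage else { return false }

    // Wrap the compiled CoreML model for use with the Vision framework.
    let coreMLModel = try NudityClassifier(configuration: MLModelConfiguration()).model
    let visionModel = try VNCoreMLModel(for: coreMLModel)
    let request = VNCoreMLRequest(model: visionModel)

    // Run the classification synchronously on the supplied image.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    // Look for a "nude" label above the confidence threshold.
    guard let results = request.results as? [VNClassificationObservation] else { return false }
    return results.contains { $0.identifier == "nude" && $0.confidence >= threshold }
}
```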

One challenge Chiu and Chen faced was that CoreML needed a lot of example nude photos to learn from. They first tried existing, open-sourced data sets for detecting nudes, but found the results were often inaccurate, especially for people of colour, which is why they ended up scouring sites like PornHub for more representative images. That part is interesting to me. We often talk about how artificial intelligence can't always recognize people with darker skin tones. So will this be any different? What causes this kind of confusion for an AI? More often than not, it comes down to training data that doesn't represent everyone, which is exactly the gap Chiu and Chen were trying to close.

Chiu and Chen caught and fixed this problem, but that doesn't always happen. Sometimes these things go unchecked, and the developer is left embarrassed when the app launches and doesn't work for some people. While no one wants to admit to having nude pictures on their devices, I suspect a lot of people do, which makes you wonder whether Nude is the answer to their problems, or whether they shouldn't keep that data on there in the first place. Nude is a free download from the App Store, but it requires a $0.99/month subscription as an in-app purchase. Is it worth it? That depends on what you have to hide, I guess.

By Staff Writer
