Apple Isn't Looking at Your Photos, but your iPhone Is

Machine learning is amazing, but it also has some flaws.  It was recently discovered that your iPhone has an image category specifically for pictures of you in your bra.  More specifically, it seems to be sorting photos that show a lot of skin, and not always photos of women in bras.  Now, on one hand, the fact that the technology knows what a bra is and can identify one is pretty amazing.  On the other hand, the fact that it's singling out women in bras is kind of scary.  Why isn't there a category for men in skimpy underwear?  I'm not making a big deal about it being sexist, but I do want to point it out.

Apparently, your phone has been doing this for over a year now, but no one knew about it until the other day.  Many people tried the search themselves and ended up posting the results online.  What's interesting is that the technology couldn't necessarily distinguish between a bikini and a bra.  So perhaps the "brassiere" category itself was crafted by a human, while the machine learned on its own what a bra looks like.  Most of the photos, women revealed, simply had a lot of cleavage in them.  They weren't necessarily pictures of bras at all, just "scandalous" photos.

Is this even a big deal?  Yes and no. I think it's kind of creepy that Apple is categorizing your images this way, but I don't think it's a big deal.  I don't think it's a deliberate move by Apple, either.  You have to keep in mind that this is being put together through machine learning.  The machine figures out for itself what a bra is.  Perhaps it has learned that bra photos show a lot of skin, and so it takes any photo with a lot of skin, even one that only shows cleavage, and categorizes it as a "bra" photo.

But Apple does this with all kinds of images.  So does Google.  They use AI to identify what's in an image so that they can categorize your photos.  It's no different from your images being categorized by other things, like beaches or mountains (no pun intended).  The only reason people are concerned, I think, is what this particular category happens to be.  But, like I said, it's doing this to all your photos, not just the ones that are maybe a bit more "scandalous".  (Note: I'm using the word "scandalous" to label these images, but I don't necessarily think of them that way.)

Further to that point, my iPhone recognizes images of my nephew.  It knows who he is and puts all of his pictures into one folder.  How is this any different? It's not.  Again, the reason people are outraged is that we didn't realize our photos were being categorized in the first place.  It should also be pointed out that this machine learning happens on your iPhone itself.  It's not cloud-based. So if you're storing your images on your phone, Apple isn't looking at them.  It's simply an algorithm on your phone working out what your images are.  You should still be careful, but this is the way our phones are being built now, so find another way to keep your images safe if you're concerned.
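
To make that concrete, here is a minimal sketch of what on-device image classification looks like, using Apple's Vision framework (available since iOS 13).  To be clear, this is not the Photos app's actual internal pipeline, which Apple doesn't publish; it just illustrates how an app can ask the phone's built-in model to label a photo without the image ever leaving the device.  The function name and the 0.5 confidence cutoff are my own choices for the example.

```swift
import Vision

// A hypothetical helper: classify one photo entirely on-device.
// VNClassifyImageRequest runs the system's built-in classification
// model; no image data is sent to a server.
func printLabels(for imageURL: URL) throws {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])

    // Each observation pairs a label (e.g. "beach") with a confidence
    // score; the 0.5 cutoff is arbitrary, just to drop weak guesses.
    for observation in request.results ?? [] where observation.confidence > 0.5 {
        print("\(observation.identifier): \(observation.confidence)")
    }
}
```

Notably, the set of labels this built-in model knows comes with the operating system, not from the app developer, and you can even list them with VNClassifyImageRequest's knownClassifications(forRevision:) method.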

By Staff Writer
