Flickr - My "Restricted Content" and the Nipple Algorithm!

DwF

I found this file among images I made last summer in NYC during Pride. I rather liked it, with lots going on, so I processed it. When I put it on Flickr and tried to add it to Groups, each one rejected it as "restricted content". Further, the image triggers a lock on the "share" icon. Am I left to surmise from this that there is a nipple algorithm at play here? What am I missing? And we all know Flickr has much more provocative imagery than this.

David


Face Off.jpg
 
So-called AI, which acts like this when no real intellect is involved. I have the maturity filter turned on there, yet I run into explicit content on Flickr in almost every group related to women.

It's the same AI the greedy FB guy is using. I posted a Nikkormat for sale and the AI banned it for a gibberish reason. No humans are involved any more; when I asked for a review, I got the same gibberish answers.
 
I had the same kind of issue on Facebook recently. In 1990 I travelled on sailing ship Eye of the Wind through the Solomon Islands and outer islands of New Guinea. In the South Pacific men and women go topless. It is just how it is. I recently scanned some of my old slide film images and attempted to post some of those on FB.

Oddly, some were accepted and some rejected, even when these were of women too. I do not recall any photos of males being rejected. Up to then I had never heard of their "nipple police" or, more correctly as you call it, "nipple algorithm". All the photos I tried to post were legitimate "ethnographic" studies of village people going about their daily lives in the village and fields.

Not only were some images rejected, I was threatened that if it happened again I would be permanently banned from the site. I was asked if I disagreed with their decision, and when I clicked yes (anticipating it would flag my photos for review) another page popped up saying, essentially, "tough luck and hard shit old son, due to COVID we are not doing this - so bugger off and suck it up". It seems that even here COVID is being used as an excuse for bad-faith processes and arbitrary, authoritarian behavior, just like everywhere else at this time.

This is not altogether surprising with FB, who seem to believe they are now Masters of the Universe, but it seems from your experience that Flickr is just as arbitrary, arrogant and buggy as the people and processes at FB. It is particularly egregious at Flickr, it seems to me, given we as users have to pay for them to host our images, whereas at FB it is sort of free (though we pay in another way, I suppose, by tolerating their relentless adverts). Although some of my images were arbitrarily rejected by FB, I posted the above photos from the South Pacific trip to my Flickr page and had no problem. But admittedly this was at least 6 months ago, so things may have changed since then.

At one level, though, I am not totally surprised to hear that Flickr may have gone down this route of restricting content. Perhaps 1-2 years back, when searching groups for "women's portraits", "glamour photos" etc. (I was thinking of trying my hand at this a bit more), I did stumble on quite a few groups where individuals were posting outright home-made porn. This was not artistic stuff - it was graphic and biological. In fact, entire groups existed for this sole purpose. As a result, I later heard that Flickr had a blitz to clear these groups and people off the site to prevent it becoming just another means of disseminating porn.

I do not particularly have anything against porn in principle myself (I suppose I am a bit of a libertarian, though I do believe in some limits to that), but I can readily see why Flickr may wish not to become just another porn site. I can also see why they may wish to use an algorithm to help filter images being posted, given the volume of images held on their servers. But it is aggravating if their algorithm is too facile to discern the difference between real porn and legitimate glamour or other images, especially when it thinks that men have women's breasts. At the very least they should have a mechanism for appeal and human review - after all, there are legitimate and, in my view, wholly acceptable artistic images of naked women which I would never think of as porn. I imagine a poorly implemented algorithm might well flag a photo taken of a nude painting at the Louvre as porn. (Although in the example below I cannot see the female subject's nipples - did they have "nipple police" back in Manet's day?)

Édouard Manet, Le Déjeuner sur l'herbe (Luncheon on the Grass) - Wikipedia

In the case of your photo, I suspect that two things have drawn the algorithm's attention - the guy's nipples and the swelling of his breasts. (He is a little porky.) I believe that these algorithms look for both the presence of nipples and actual breast tissue. To be honest, based on what I saw in the Flickr groups I mentioned, if Flickr is serious about blocking actual porn they would do much better by focusing the algorithm's attention lower down the human body, where the focus of most of the porn pictures is - and usually in close-up and graphic color. :confused:
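Flickr has not published how its moderation bot works, but the behavior described above is what you would expect from a detector that scores individual features and flags anything over a threshold, with no notion of context or gender. A toy sketch of that idea - every label, score and threshold here is hypothetical, purely to illustrate why a bare male chest can trip a "breast" detector while a painting gets flagged too:

```python
# Toy illustration of threshold-based content flagging.
# All labels, scores and thresholds are hypothetical; real moderation
# systems use trained neural networks, not hand-set numbers.

def flag_image(scores: dict, threshold: float = 0.7) -> list:
    """Return the detector labels whose confidence meets the threshold."""
    return [label for label, score in scores.items() if score >= threshold]

# A detector sees only pixel patterns, not context or gender, so a
# shirtless man at a parade can score high on the same features as a
# female subject, and a painted nude scores like a photographed one.
parade_photo = {"exposed_nipple": 0.81, "breast_tissue": 0.74, "genitalia": 0.02}
manet_painting = {"exposed_nipple": 0.35, "breast_tissue": 0.88, "genitalia": 0.05}

print(flag_image(parade_photo))    # ['exposed_nipple', 'breast_tissue'] -> "restricted"
print(flag_image(manet_painting))  # ['breast_tissue'] -> flagged despite being art
```

The point of the sketch is only that per-feature thresholds have no appeal to intent or artistic context; that judgment would have to come from a human reviewer downstream.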

PS: one way around it I have seen on FB is to clone the actual nipples out. Looks weird! But it works to avoid the glare of Big Brother.

Also, it is a little ironic (and not lost on me, given the topic of this thread) that even on this site, RFF, when I typed sh#t in quotes above, the software automatically censored it, changing that word to "****".

:confused::confused::confused::confused::confused::confused::confused::confused::confused::confused::confused::confused::confused::confused::confused:
 
Thank you all for this.....reassurance, as it were, that it's not just me. AI is evolving, when maybe devolution would be a better fit for some of these sites that it polices. We in Washington State just had an initiative about allowing police to use new facial recognition software. I'm pretty sure it didn't pass, basically because it's just not that good, as this policing of my image illustrates.

David
 
It sounds nuts, but I would believe that it happens. And Facebook is a sprawling pox on the backside of humanity; I'd have nothing to do with it if it weren't for work.
 

Facial recognition software has been in use for a long time. The son of my first television boss and mentor wrote face recognition software between the eighties and nineties. It was "sold" to France, for border services. But back then he couldn't receive the money, due to the USSR, so instead of money the author went to live in France for an extended period - full board, food and booze. He got so used to it that he emigrated to Australia...
A few years later this face recognition feature was in use by a French media asset management company: face and voice recognition to search digitized archives.

It is not about the software, but the regime. In China, and now Russia, they use it to get at those who do not agree. In Israel, it is also in use to get at those who agree even less...
 

As an aside, have you read much about Black Cube, the private intelligence company staffed by ex-Israeli intelligence? It's remarkable what modern day espionage techniques can do when applied in civilian matters.
 

I get that "it" ...AI...has been out there, but not that it was employed to this extent by Flickr and especially considering some of the blatant full frontal content I have stumbled on there...operative word...stumbled!

And Ko.Fe, yes - it is put to use to more stringent degrees by certain regimes.

David
 
I think this Flickr AI thing is pretty recent, and I had a similar experience. I posted a photograph and then was unable to post it into the usual Groups I post in. I didn't realize it was AI, and I didn't see there was a notification explaining what was going on.
This is the remedy in the notification:
"What if the bot is wrong?

Mistakes happen & we realize the bot can sometimes miss the mark and over-restrict content that is actually "Moderate", or even "Safe."

If your photo is incorrectly flagged by the Moderation bot, you may manually override it by changing your content back to the appropriate safety level. Please keep in mind the guidelines when doing so.
We will continue to monitor bot performance & make adjustments as necessary."

This is the picture, taken at a rock and roll Halloween show, that the bot made "restricted":
Shakin' It for Mark Malibu and the Wasagas by sevres babylone, on Flickr
 
That photo of the guy at the parade is fundamentally no different from what you’d see on any public beach or city park - or maybe even on a warm day in any neighborhood. Because the photos are flagged by stupid AI and humans don’t review them, these stupid bans occur.

If you were to go back and look at 1950s and some 1960s Popular Photography or Modern Photography magazines, you would see both frontal and rear nudity in both the magazine content and the ads. No big deal. Somehow, during the “enlightened” 1970s, this disappeared, and censorship has been getting worse ever since.
 
My pet theory of the moment is the “AI” desires global domination but isn’t yet smart enough, so it tries these awkward attempts to suppress human sexuality as an ineffective form of birth control. Eventually it’ll figure out the more effective way is global nuclear war, except to destroy modern infrastructure also destroys the ability of the AI to multiply and learn. So it really isn’t all that intelligent at the moment. ;)
 
Nah - it’s all about fear of litigation. That’s what got the for-the-most-part artistic nudism out of the mainstream photo magazines and it is what causes American companies and corporations to kowtow to the silliest shakedowns by a small group of loud and visible rabble-rousers. So photo websites are afraid of being accused of “hosting pornography” and they choose an automated way of detecting “skin” where they think it shouldn’t be exposed.
 
I'm a moderator in a local cinephile FB group. A few weeks ago someone posted a review of the recent Minamata film, starring Johnny Depp as Gene Smith. For the benefit of those unfamiliar with Smith's work, I added a photo of "Tomoko and Mother in the Bath", including credit to Smith as well as year and place, and promptly got an account warning for violating FB's rules on "adult nudity and sex activity". Out of sheer disbelief I didn't contest the charge. It's striking: the FB algorithm can look, but sometimes it fails "to see". I take it this is the case also with Flickr and other "image recognition" algorithms.
 
I think you're half right, but the biggest fear is probably loss of advertising revenue. If I remember right, this is what caused Tumblr to clamp down on NSFW content after the Yahoo buy-out - they didn't want to scare off advertisers.
 
In 2016, a FB 'editor' took down, for the same reason, the Pulitzer Prize-winning 1972 photo of a naked 9-year-old Vietnamese girl running down the road after being severely burnt in a napalm bombardment. After a storm of protest, including the reposting of the photo by the Norwegian Prime Minister, FB backed down and allowed the photo on its site.
 
…A few weeks ago someone posted a review of the recent Minamata film, starring Johnny Depp as Gene Smith. For the benefit of those unfamiliar with Smith's work, I added a photo of "Tomoko and Mother in the Bath", including credit to Smith as well as year and place, and promptly got an account warning for violating FB's rules on "adult nudity and sex activity". Out of sheer disbelief I didn't contest the charge. ..
That is an extremely important photo. I would contest the charge if only to educate them on how ignorant they are and how their policies are censoring important information.
 