Google has been messing with their search algorithms again. Specifically, image search. The change has not been for the better, in my opinion.
Adult bloggers and content providers have long had issues with their approach to sexually explicit content. For example, here’s Bacchus back in 2008 and then again in 2011 highlighting the strangeness of the auto-suggest feature. Now they’ve changed their image search to make it far more reluctant to display sexual images. The algorithm used to be based on classification of the content and your ‘safe search’ settings. With safe search on it filtered porn; with safe search off it didn’t filter. Seemed sensible enough. Now they’re also classifying the query you use. Unless the query shows specific intent to search for porn, they suppress those images regardless of your safe search setting. This leads to some pretty bad results.
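To spell out the difference, here’s the old-versus-new behavior as I understand it, sketched in Python. This is purely an illustration of what I described above; the function and flag names are mine, not Google’s:

```python
def old_filter(image_is_adult, safe_search_on):
    """Old behavior: your safe search setting alone decides."""
    if image_is_adult and safe_search_on:
        return False  # suppress the image
    return True       # show it

def new_filter(image_is_adult, safe_search_on, query_shows_adult_intent):
    """New behavior: adult images only appear when the query itself
    signals adult intent, no matter what your safe search setting is."""
    if image_is_adult and (safe_search_on or not query_shows_adult_intent):
        return False  # suppress the image
    return True       # show it
```

The practical effect: with safe search off, `old_filter` would show you an adult image, but `new_filter` still hides it unless their query classifier decides you really meant to ask for porn.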
For example, try image searching for one of my favorite female bloggers and adult stars, Mistress T. Here’s the search on Google and the same search on Bing (you need safe search off in both cases). The Google results are basically garbage, with John Edwards and even Mitt Romney featuring for some bizarre reason. The Bing ones are pretty much what you’d expect for a popular porn star/producer. Or try the famous bondage model Ashley Renee on Google and on Bing. One gives you what you’d expect and one gives you a bunch of police mugshots.
Searching for site names is equally weird. For example, searching for ‘captive male’ gives you a bunch of animal pictures, rather than shots from the site itself. Searching for ‘men in pain’ gives you men with migraines, and the query ‘whipped ass’ returns random junk. Of course you can always refine your query to really make it clear what you’re after. For example, searching ‘men in pain bdsm’ does return shots from the kink.com site. However, that refinement is going to change the results returned. Now it’s not giving me the top ranked men in pain images. It’s giving the top ranked ones that also feature the word BDSM near them. Before, I could always filter out porn by simply changing my safe search settings to be stricter. Now I have to try to force it to show up by guessing the right query to use. What a stupid change.
Here’s an image from someone else affected by the change. That’s Mistress Madeline, who now barely features in her own image search result page.
Hopefully they will add a tag term like *adult* that will alter the search algorithms. I can understand their desire to tweak it; I sometimes wonder whether it’s even possible to reliably prevent a child from seeing adult images.
I think they should really have just added another level to their safe search options, e.g. strict, moderate, and off. Strict shows no adult images, moderate shows them only if it’s really clear that’s what you want (the current behavior), and off would be the old behavior. That’d give something for kids, for adults who don’t want to see porn, and for us perverts.
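That three-level scheme could be sketched like this (again, just an illustration of my proposal, and the names are mine):

```python
from enum import Enum

class SafeSearch(Enum):
    STRICT = "strict"      # never show adult images
    MODERATE = "moderate"  # show them only when the query clearly asks
    OFF = "off"            # old behavior: setting alone decides, show everything

def show_adult_image(setting, query_shows_adult_intent):
    """Decide whether an adult image appears in the results."""
    if setting is SafeSearch.STRICT:
        return False
    if setting is SafeSearch.MODERATE:
        return query_shows_adult_intent
    return True  # SafeSearch.OFF
```

The point is that the query classifier only ever kicks in at the moderate level; turning safe search off would actually turn it off.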
As for children, it’s obviously impossible to be 100% perfect about filtering images, but they’re actually pretty good at it. If you tick the ‘filter explicit results’ option it’s rare to see porn. The stupid thing is they’ve now effectively only got the filtered option, unless your query happens to trigger their adult query classifier. And that classifier appears to be pretty dumb.
-paltego
BTW, here’s a cNet article about it:
http://news.cnet.com/8301-1023_3-57558795-93/google-tweaks-image-search-to-make-porn-harder-to-find/
So I’m NOT losing my mind, and Google search actually has become crap. That’s good to know!
Yes, they’ve really screwed it up. So far standard web search doesn’t seem to be suffering from anything quite so dumb, at least from what I can tell. If you search for the examples I gave on the web, the top links returned are all the ones you’d expect. e.g. For ‘Ashley Renee’ the first 5 links are all the bondage model. Yet somehow on the image search they’ve decided that’s the wrong thing to show! Horribly inconsistent.
-paltego