…PORNOGRAPHY, in that it is difficult to define.*
Sometimes it is all too similar to pornography, as we discovered last week in Metadata class when our well-intentioned instructor did a Google image search for “teach teens metadata” and ended up with ALL kinds of anatomy displayed ALL OVER the projector screen. She maintained her composure like a champ, but we were all wondering how such seemingly innocent search terms returned results that could be described neither as innocent nor as having anything to do with teaching teens metadata.
So, research-minded librarians-to-be that we are, we repeated the query on someone’s laptop in different variations, with and without quotation marks (e.g. “teach teens metadata”, “metadata teach teens”, metadata teach teens, etc.). We found that the results for “teach teens” and “teach teens metadata” seemed to be the same, leading us to believe that Google was only using the first two (presumably more popular) query terms to search for results. The entire fiasco made me think of this article from Wired, “Google Kills Its Other Plus, and How to Bring It Back”, which talks about Boolean search operators in Google queries and how they are changing over time.
(As an aside, the old Google logo is so quaint. The letters have lost weight and gained some height over the last decade or so, particularly that commanding first capital G. And I’d forgotten it used to have an exclamation point! Oh, the Google logo…)
Anyway, Boolean operators are fabulously helpful for tricky or complicated search queries, but as this article describes, Google has been “experimenting with silently ignoring search terms completely” and making algorithmic assumptions about your query based on other more common queries.
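For anyone who hasn’t played with these operators, here’s a rough cheat sheet of the syntax the Wired article is talking about (the exact behavior is Google’s to change, as our class discovered, so treat these as approximate):

```text
teach teens metadata       loose match; Google may quietly drop or substitute terms
"teach teens metadata"     exact-phrase match on the whole string
teach teens +metadata      the old way to require a single term (since retired)
teach teens "metadata"     the current way: quote an individual term to require it
teach teens -anatomy       exclude results containing a term
teach OR instruct          match either term (OR must be uppercase)
```

The retirement of the lone “+” in favor of quoting individual terms is precisely the change the article laments, and it’s plausibly related to why unquoted terms like “metadata” get second-class treatment in a query.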
Could this be why “teach teens” was searched instead of “teach teens metadata”? It’s difficult to say. One variable yet to be considered is the SafeSearch level. For example, when I search “teach teens” with SafeSearch set to Moderate, I get results that are basically a lot of smiling teachers’ headshots, leading me to believe that there was no SafeSearch setting during the escapade in our class. However, and this is curious, when I search “teach teens metadata”, still with SafeSearch at Moderate, three of the images in the first row are pornographic (and at a glance seem to be from the same production).
All of this leaves me with so many questions! How is the term “metadata” associated with these pornographic images? And is implementing metadata prevalent in the porn industry, or just in this one random production that’s showing up? Is there any way to find out without looking at the source code of a bunch of pornographic websites? Is this research I even want to delve into? Is all of this pondering and consideration about metadata and porn going to bring all kinds of spam commentary on my blog about teaching teens illicit acts and fr33 iPhon3s NOW?