Thursday, 24 October 2013

Google Misogyny?


A recent campaign by UN Women has caused some controversy on the web. The campaign depicts women gagged by the Google search box, with the auto-complete function suggesting endings for the phrase "women should ...". The suggestions it comes up with are not things you would want women to do, nor to be subjected to.

What UN Women wanted to show with this campaign is that there are still a lot of misogynistic ideas circulating the web, and to draw attention to them. However, the campaign has also raised a question: how much blame should Google carry in this debate?

To unpack this we first have to be clear about what the auto-complete function actually does, and to do so we need to understand how the search function is set up. Google's search does not actually return the most accurate answer, but rather the most popular one; in short, the most frequently visited answer.

There is a lot more to this, but Google's search ranking parameters are a different story, so for now we will simplify the process and treat the result as simply the most popular one.


The effect this has on the auto-complete function is that it will also display the most commonly used search terms that start with a specific prefix, such as "women should ...". As a result, if a large number of people search for "women should obey men", that phrase will automatically earn a high position among the auto-complete suggestions (a toy sketch of this mechanism follows after this paragraph).
On a personal note: I use Google to find answers to questions, and in the examples above it seems to me that the asker already has the answer settled. These are not so much questions as they are statements.
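To make the mechanism concrete, here is a minimal sketch of frequency-based prefix completion. This is a toy illustration only, not Google's actual system: the query_log, the suggest function and the cut-off of five suggestions are all assumptions made for the example.

```python
from collections import Counter

# A toy log of past searches; in reality this would be billions of queries.
query_log = [
    "women should have equal pay",
    "women should have equal pay",
    "women should be independent",
    "weather tomorrow",
]

query_counts = Counter(query_log)

def suggest(prefix, max_suggestions=5):
    """Return the most frequent past queries that start with the given prefix."""
    matches = [(count, query) for query, count in query_counts.items()
               if query.startswith(prefix)]
    matches.sort(reverse=True)          # most popular queries first
    return [query for count, query in matches[:max_suggestions]]

print(suggest("women should"))
# ['women should have equal pay', 'women should be independent']
```

The point of the sketch is simply that the suggestions are a mirror of what people have already typed; nothing in the ranking knows or cares whether a popular phrase is a reasonable one.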

So, how much blame does Google carry in this? While it is true that Google has developed a way to censor this feature (searching for "bitch ..." will not yield any auto-complete suggestions), one cannot really blame the company for what its users search for, and therefore for how the system continually builds itself up. If users, en masse, started searching for "women should be independent", that would yield a totally different result. This may seem like a fruitless observation, but it is worth making since it underlines the fact that the system is built for the user, and the user also, to some extent, dictates how it behaves.
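The censoring mentioned above can be thought of as a filter layered on top of the frequency ranking. The sketch below extends the toy suggest() function from the earlier example; the blocked_terms set is invented purely for illustration, since Google's real filtering rules are not public.

```python
# Builds on the suggest() sketch above.
blocked_terms = {"bitch"}  # hypothetical blocklist, for illustration only

def suggest_with_filter(prefix, max_suggestions=5):
    """Suppress all suggestions when the prefix contains a blocked term."""
    if any(term in prefix.lower() for term in blocked_terms):
        return []                        # no auto-complete suggestions at all
    return suggest(prefix, max_suggestions)

print(suggest_with_filter("bitch "))        # []
print(suggest_with_filter("women should"))  # same results as before
```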

Personally, I think this is a case of missed screening on Google's part. Every day more than 5 billion searches are made on Google, and demanding that Google police all of them is asking too much. However, one must also ask oneself: is that really the way we want the system to work?

I think what has happened here is a perfect example of the system working the way it is supposed to work. Google provides the tool for us to browse an almost infinite amount of knowledge, and we, as the users, then help Google point out when that knowledge is flawed or, as in this case, outright derogatory. Google can then review the claim, and possibly remove or censor the offending suggestions.

Right now we just have to wait and see how Google reacts to the attention drawn by UN Women. Hopefully (and most likely, since anything else would be brand suicide) they will censor these ill-informed auto-complete suggestions.

The ball is in your court, Google. Let's see that you still want to play fair!

Note: to read more about this story, I recommend this post by Jessica Lee. As always, please let me know what you think in the comments below!
