The Landmark Blog

A Researcher's Experience with Google Consumer Surveys

Posted by Kevan Oswald on Mar 13, 2014 1:25:00 PM

Jupiter is big.  With a diameter of 88,694 miles, it is roughly eleven times wider than the Earth.  Comparing the diameter of Jupiter to the diameter of the Earth is like comparing the 2013 revenue of Google to the 2013 revenue of Nielsen (the largest player in the marketing research industry, according to the Honomichl Top 50).  To continue the analogy, GfK (another Honomichl Top 50 firm) would be roughly the width of our moon, and Discovery Research Group?  We’d probably be the width of that asteroid Bruce Willis blew up in the movie Armageddon years ago.

My point is that when Google announced it was invading the marketing research universe with Google Consumer Surveys back in 2012, many of us may have felt like the Death Star was rounding the planet on its way to destroy the rebel base.  So big and so capable, Google presented an obvious concern.  Two years later, the Death Star may still be out there, with all its technological potential, but based on my experience with Google Consumer Surveys, it doesn’t pose much of a threat, at least in its current form.

About a month ago a new client came to us asking for consumer feedback on several different versions of a logo they were considering for their new startup.  I proposed a fairly comprehensive research plan, which they promptly turned down, stating that it was outside their budget.  They then told me what their budget was and asked what they could get for it.  My first thought was nothing, at least nothing that would give them anything meaningful.  But then I thought about Google Consumer Surveys, ran the numbers, and decided that we could at least give them something instead of nothing, and so the experiment began.

My client was okay with the limitations of Google Consumer Surveys, as I explained them, but since this was my first foray into their universe, I wasn’t fully aware of all the limitations until I actually wrote and programmed the survey.  The following is some of the good, the bad, and the ugly that I encountered during this process:

First the good:

  • The interface of the Google Consumer Surveys platform is simple and easy, colorful and intuitive.  Pick one of ten question types, replace the text, upload your pictures, and you’re good to go.  You can build a survey in minutes.
  • Looking at results is just as simple.  They are straightforward, graphical, include the margin of error, and you can easily see how responses are trending while the survey is running.  With the click of a mouse you can compare the results of different demographic cohorts and geographies, switch between percentages and counts, and easily share results.
  • With prices ranging from $0.10 to $3.50 per complete, depending on the number of questions, Google Consumer Surveys is very affordable.  We were able to stay within our client’s budget.
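
To put the affordability in perspective, here is a rough back-of-the-envelope calculation in Python.  The budget and per-complete prices below are hypothetical, not my client’s actual numbers; they just show how the pricing tiers translate into sample size for a fixed budget.

```python
# Hypothetical figures -- not the client's actual budget or pricing tier.
budget = 500.00            # total dollars available for fieldwork
price_one_question = 0.10  # per complete at the bottom of the pricing range
price_full_survey = 3.50   # per complete at the top of the pricing range

completes_cheap = int(budget / price_one_question)
completes_full = int(budget / price_full_survey)

print(f"{completes_cheap} completes at $0.10 each")  # 5000 completes
print(f"{completes_full} completes at $3.50 each")   # 142 completes
```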

The bad:

  • Exporting your data is easy, but if you want it in any form other than .csv, you’re out of luck.  Also, expect to spend a fair amount of time converting the text of the responses to numeric codes if you want to run any type of meaningful analysis on the data (a sketch of what that recoding looks like follows this list).  Even with a short survey, this ended up being a tedious and time-consuming process.
  • Have a lengthy survey?  Sorry, ten questions is the max.  Participants encounter your survey request when they want to gain access to information that they may otherwise have to pay for, like premium news sites or entertainment content.  In order to get to the content, they must take the survey.  Because of this, they won’t take the survey if it is too long, so Google keeps the length in check.
  • You can target demographically and geographically, but the targeting is “inferred.”  This means that while you can specify that you want a certain age group or gender, there is no guarantee that is what you will get.  Targeting is based on browsing history and uses the same algorithm Google uses for targeting ads.  Google does allow for one multiple-choice screener question, but if you use it, you can’t also target by age, gender, or geography.
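
Since the recoding chore mentioned in the first bullet above is probably the most hands-on part of working with the export, here is a minimal sketch of how it might look in Python with pandas.  The file name, question text, and answer labels are all hypothetical; the real export will carry whatever wording you used in the survey.

```python
import pandas as pd

# Load the raw .csv export (file name and column names below are hypothetical).
df = pd.read_csv("gcs_export.csv")

# Map the verbatim answer text to numeric codes so the data can be
# used in a stats package or a crosstab.
style_codes = {
    "Style A": 1,
    "Style B": 2,
    "Style C": 3,
    "Style D": 4,
}
df["style_code"] = df["Which logo style do you prefer?"].map(style_codes)

# Open-ended answers (like typed-in color names) need cleanup before
# coding -- trim whitespace and lowercase first.
df["color_clean"] = df["Which color do you prefer?"].str.strip().str.lower()
color_codes = {"red": 1, "blue": 2, "green": 3}
df["color_code"] = df["color_clean"].map(color_codes)

df.to_csv("gcs_recoded.csv", index=False)
```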

The ugly:

  • Don’t plan on being able to add any logic to your survey.  No skip logic, no display logic, no validation, nothing beyond a straight up question.
  • Ten question types is a little deceiving; you really have more like four: different versions of multiple choice, star rating, image comparison, and open end.
  • You can have a maximum of only five options for a multiple-choice question.  My client had nine different logos to compare (four unique styles and nine different color combinations), making it impossible to display all of them at once.  I had to break it up into one question on style, showing the four style options, and one question on color.  However, I had to make the color question open-ended and instruct participants to type in “red”, “blue”, “green”, etc., because there were six color options.  This worked out in the end, but I’m sure it came across as unprofessional and a little weird for the participant.

There’s more to my bad and ugly list, like a limit of only two open-ended questions and a max of 175 characters per question (this sentence has 175), but I’ll stop for now.

I really like Google.  Chrome is my browser of choice.  I have an Android phone and a Gmail account.  Whether it’s YouTube, Maps, Google+, or Drive, like millions of others, I interact with Google in some form or fashion several times a day.  If Google decides to put forth the effort to develop Google Consumer Surveys into something more, as a “hard core” researcher I’ll embrace it, but until then, the product has a long, long way to go to become a legitimate survey research tool for the masses of market research providers.

Topics: Survey, Online Survey