Dangerous Research

As a planner, I always feel a little guilty for not being a staunch defender of research. I have no doubt that primary research can be a rich source of valuable and unexpected perspectives. For example, we have a really interesting approach within the Y&R network called “Xploring”, which is a rough merging of ethnography, cultural immersion and good old-fashioned chats with real people. It involves no discussion guides and no hypotheses to test and verify. At its simplest, it is just about getting out there and talking to people with an open mind, so you can genuinely understand them. It’s about gaining insight.

In that context, I love research.

However, that is not the type of research which most companies are doing. Instead, we’re seeing more and more bad research.

Most of us have conducted enough research in our careers to know just how subjective it is. Everything from the questions you include to their order, their phrasing and the stimulus you display can heavily influence and skew the responses.

Good qualitative research requires a huge amount of moderator skill: to probe beyond the first responses and get to the bottom of tensions and contradictions. To interpret everything that goes unspoken in the room, as well as what is said. To know that there is a gulf of difference between what people say and what people do. To recognise that consumer decision-making is not always conscious and reflective. And to take all of this into account before offering conclusions which could have far-reaching repercussions. These skills are not easy to come by.

And the really, really scary thing about much of this research, which makes it so unbelievably dangerous, is that it is masquerading as science.

Which means that too many of our clients feel bound to trust and implement the research findings and conclusions. They’re becoming locked into relying on research to validate creative ideas, whether or not they believe in them. Their CEOs look for numbers or external testing to safeguard against marketing mistakes.

Research is increasingly being used to decide pitches. The decision about a long-term working relationship is now in the hands of ten people being paid 50 pounds to discuss cheese for two hours, people who aren’t best placed to judge whether a creative execution would make them buy more cheese because, in all honesty, they have zero expertise in this area.

The type of research that we conduct in our business is not science. It is deeply flawed methodologically. None of that really matters if you are using it in the right way – as one tool for gathering insight which provides a springboard for creative ideas.

So this is an appeal. Most of us genuinely believe that this research is not suitable for measuring and validating creative ideas. We don’t believe that consumers can consciously know how they make unconscious decisions. And we certainly feel overwhelmed by the pervasiveness of pseudo-science overriding common sense and intuition.

It’s time for the advertising community to stand up and speak up against dangerous research. Or we’ll soon find our voices silenced by yet another weighty PowerPoint deck.

  1. Great piece! The merits of Qual are limited at best. There has NEVER been a focus group since the beginning of focus groups that hasn’t been led by one of the participants in the group – consciously or otherwise.

    If brands were truthful with themselves, and if brand managers were able to chart the success of various campaigns against the levels of ‘qual research’ conducted, they too would see that the merits are void. I understand that spend has to be warranted, of course – this just isn’t the way to do it.

    If people want proof – drumming monkeys in chocolate ads, to name the obvious; I like to call that case study ‘Gorillas in the Mist’ – let good creatives (when you know they are good, that is) lead. Bought opinion (especially that led by one person in a room of 12) inhibits the brave.

    • Neasa Cunniffe, June 26th, 2011

    Thanks Adam!

    I just came across an interesting build from John Wilshire (PHD), who said that IPA research on the effectiveness of campaigns found that those which weren’t pre-tested were more effective. He also said that processes have a tendency to average things out, which I think can also apply to research. So research will make a bad idea better, but a great idea merely good.

    http://neilperkin.typepad.com/only_dead_fish/2011/06/firestarters-2-design-thinking-in-planning-the-event.html
