
What are the Benefits and Strengths of Social Media Research?

Social Media Research has been around as a distinct and established tool for about ten years, so it is probably a good time to take stock of what it is offering and what it could be offering.


Editor’s Note: We’ve been talking about social media research within the MR industry (and specifically here on the blog) for many years, with both promoters and detractors voicing their perspectives on the topic. However, what has been harder to lock down is actual proof of how social media is being used effectively within the insights space and an unbiased view of emerging best practices, use cases, and overall strengths and weaknesses. Ray Poynter aims to change that and has launched a global project to bring in players from across the industry to explore these questions.

We’re very proud to be a participant in this initiative and  I hope some of our readers will jump in too. Details on the program and how to get involved are below.


By Ray Poynter

Social Media Research has been around as a distinct and established tool for about ten years, so it is probably a good time to take stock of what it is offering and what it could be offering. In order to do that NewMR and GreenBook have created a collaborative project to investigate and highlight the benefits of Social Media Research.

The project is open to anybody conducting Social Media Research and companies currently signed up to the project include:

  • TNS
  • CultrDig
  • Mass Cognition
  • Bakamo Social
  • Connect 4 Marketing
  • MMR International
  • Interact RDT
  • Susan Bell Research

The range of companies taking part means that we should get a 360° picture of Social Media Research. We have companies from North America, Europe, and APAC, and one global company. We have companies who will focus on quantitative approaches and large scale listening and we have companies who will be focusing on smaller-scale and qualitative approaches. And, as a by-product we might learn some pretty interesting things about the prospects for market research!

However, the door is not closed. If you and your company want to take part, there is still time to join, just let us know – but you need to move quickly.

The project is based on data collection in May and reporting in June. The companies taking part have been challenged to research the topic of “Market Research” using any Social Media Research tools and approaches they wish. The reports will reflect the different approaches adopted by the different companies. Drawing on the different reports and elements of the project, Lenny Murphy and Ray Poynter will produce an overarching summary in July – highlighting the benefits, strengths, and opportunities of Social Media Research.

If you want to take part in the project, or if you just want to read the full spec, click here.


How To Conduct Online Qualitative Research To Test Ads

The confluence of broadband connectivity, digital technology, and the comfort level and skill of consumers in communicating so easily and so well within computer-mediated environments has made the need to conduct F2F qualitative research increasingly threadbare.



By Paul Rubenstein, Ph.D.

For decades, qualitative research has been used to inform advertising and communication strategies and to test ads before they are made public. Steps in the development process usually have organizations (and their agencies) moving from creating advertising concepts, to designing storyboards and animatics, to assembling final executions. All of this is often fueled by the insights gathered at each point using traditional focus groups.

And it is the focus group that has been the predominant method used to achieve ad-testing-related study objectives. Data collected in these studies are used to inform strategies and tactics in terms of what to say and how to say it. That is because consumers’ opinions about the subject matter and the specific stimuli on which they are asked to opine are used as the foundation of advertising and communications.

Particularly when brainstorming is needed among consumers, conditions that foster creativity can be facilitated by a group setting. Being able to hear others’ ideas helps individuals come up with ones of their own. Furthermore, role playing and other projective techniques represent tried-and-true exercises that have been used successfully in focus groups conducted for ad-testing purposes for many years. Simply put, a moderator is not worth his salt if he has never used some of these techniques, at least sometimes, and certainly for ad development purposes.

However, there are several highly effective projective techniques and other kinds of exercises that are highly problematic to execute in an in-person setting such as a focus group facility. Whether it is perception mapping or sorting, story-telling projectives, image tracking or text tracking, or dial-testing exercises that need to be deployed – using printed materials, markers, scrap paper, glue, pins, dials, or spit and tissue paper – successfully managing these kinds of research techniques is among the greatest challenges a moderator faces.

In contrast, shifting data collection from face-to-face (F2F) to an (asynchronous) online qualitative research platform that has these functions will not only produce more data at lower cost than focus groups, but will also enable these sorts of advanced research techniques to be utilized more easily, and the data gathered can be analyzed with greater precision.

To begin, let’s understand the entire research agenda involved at various steps in the development and execution of advertising campaigns.  The figure below may serve to do this in one graphic.


Consideration Research

As stated earlier, the entire end-to-end process involved in advertising development can be boiled down to three (3) simple questions:

  1. What should the ad say?
  2. How should it be said?
  3. How well did the ad work?

These fundamental questions can be mapped to specific types of research studies, namely, consideration, articulation and execution, and evaluation studies, respectively.  Beginning with consideration research, the main objective is to understand the equity of a given brand within the context of its competitive landscape.  As such, the researcher must uncover and reveal how consumers conceptualize the industry in question and which key characteristics drive their consideration of one brand over others.

A useful technique for consideration research is perception mapping, which can be executed in both quantitative as well as qualitative research.  In quantitative research, a set of multivariate techniques are brought to bear, including discriminant function analysis, correspondence analysis, or multi-dimensional scaling.  The resulting perception map that is formed from the pattern in the quantitative data shows how target consumers view the similarities and differences between brands and which brand characteristics define each best.

But a highly useful, first phase of consideration research may warrant a qualitative study that is used to inform the subsequent quantitative phase.  In this case, a perception mapping exercise, similar to what has been used in focus groups for so many years, may be improved in its execution if done online.  Through simple and fun “drag-and-drop” motions that participants are instructed to perform, a resulting perception map may be determined accurately and easily, as shown below in a mock example:




As the map shows, the position of each logo is automatically determined by the programming of the platform which is set to place each one at the average X and Y bi-variate coordinates calculated from the pool of participants in the study.  To do so accurately using F2F methods is almost impossible.  Furthermore, using an online qualitative research platform for this purpose enables the moderator to filter the data with a button push or two in order to test any study objectives that warrant subgroup analyses.
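The aggregation described above is straightforward to sketch. The following is a minimal, hypothetical illustration – the brand names and coordinates are invented, not output from any actual platform – of averaging participants’ drag-and-drop placements into a single map position per brand:

```python
# Hypothetical sketch: place each brand at the average (x, y) position
# assigned by participants in a drag-and-drop perception mapping exercise.
# Brand names and coordinates are illustrative, not real study data.

def average_positions(placements):
    """placements: list of per-participant dicts mapping brand -> (x, y)."""
    totals = {}
    for participant in placements:
        for brand, (x, y) in participant.items():
            sx, sy, n = totals.get(brand, (0.0, 0.0, 0))
            totals[brand] = (sx + x, sy + y, n + 1)
    # Divide the running sums by the number of participants who placed each brand
    return {brand: (sx / n, sy / n) for brand, (sx, sy, n) in totals.items()}

participants = [
    {"Brand A": (2.0, 8.0), "Brand B": (7.0, 3.0)},
    {"Brand A": (4.0, 6.0), "Brand B": (9.0, 1.0)},
]
print(average_positions(participants))
# {'Brand A': (3.0, 7.0), 'Brand B': (8.0, 2.0)}
```

Filtering for subgroup analyses, as mentioned above, would amount to running the same averaging over a subset of the participant list.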

For that matter, other projectives that require drag and drop may also be used.  These include various sorting exercises in which the number of or labels for the categories may or may not be provided in advance, such as shown below:



And yet another projective technique that has been utilized for decades in focus groups is what is known as “Story-Telling.” This technique uses a set of images that evoke various positive and negative emotions in people. These images are shown to participants, who are asked to choose, from the set, ones that they then put in some order and use to tell a story about the subject matter. This exercise is most easily facilitated online through drag-and-drop functions built into the platform and yields individual responses that are captured along with (text-based) stories. Moreover, these stories integrating images and text can be readily copied and pasted into the moderator’s report with a few button pushes, as opposed to the tallying (by hand) of each story, each image, and each participant’s ID number, as would be the case in a traditional focus group setting.

Articulation Research

In Articulation research, the main objective is to inform “how to say it.”  This, too, can entail integrated phases of both qualitative and quantitative studies, one informing the other and each designed to do just that.  Regardless of the type of study, it will include gathering the reactions from target consumers to a distinct advertising/communication stimulus, e.g., print ad, advertising concept description.

For these types of qualitative studies, moderating discussion among participants to generate their ideas for ads is usually part of the process. In addition, participants are also shown various articulations that have been developed by the organization and/or its ad agency. These stimuli then become grist for the mill – to be sharpened, clarified, improved, or even eliminated – using functions such as image tracking or text tracking. As can be seen in the example below, participants can isolate portions of the stimuli according to the moderator’s needs, such as “draw a red circle around the portions you like.”




Likewise, in the case of text tracking shown below, the instructions to the participant may be to “highlight the specific words you find confusing.”
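The by-hand tallying that text tracking replaces can be sketched in a few lines. This is a hedged, hypothetical illustration – the highlighted words are invented examples, not output from any real platform – of counting how many participants flagged each word as confusing:

```python
from collections import Counter

# Hypothetical sketch: tally which words participants highlighted as
# confusing in a text-tracking exercise. Highlights are invented examples.

def highlight_counts(highlights):
    """highlights: list of per-participant lists of highlighted words."""
    counts = Counter()
    for words in highlights:
        counts.update(words)
    return counts

participants = [
    ["synergy", "leverage"],   # participant 1's highlights
    ["synergy", "paradigm"],   # participant 2's highlights
]
print(highlight_counts(participants).most_common())
# [('synergy', 2), ('leverage', 1), ('paradigm', 1)]
```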




And if the stimulus is a video as opposed to a static image, such as a TV commercial, dial-testing is arguably the best and most sophisticated technique for evaluating this kind of stimulus. It shows directly at what point during the span of the commercial (e.g., 30 seconds, 60 seconds) participants liked or disliked the ad, and it is usually utilized after a finished commercial has been developed and is about to air.
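The aggregation behind a dial-test trace can also be sketched simply: each participant moves a dial (say, 0 to 100) throughout the spot, and the readings are averaged moment by moment. The following is a minimal, hypothetical illustration – the readings and the 0–100 scale are invented for the example, not real study data:

```python
# Hypothetical sketch: average per-second dial readings (0-100 scale)
# from several participants into a single trace for a short spot.
# All readings below are invented for illustration.

def average_trace(readings):
    """readings: equal-length per-second score lists, one per participant."""
    n = len(readings)
    # zip(*readings) groups all participants' scores for each second together
    return [sum(second) / n for second in zip(*readings)]

traces = [
    [50, 60, 70, 40],   # participant 1
    [70, 80, 50, 60],   # participant 2
]
print(average_trace(traces))
# [60.0, 70.0, 60.0, 50.0]
```

A dip in the averaged trace points the moderator to the exact seconds of the commercial that participants disliked.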




All of these articulation testing exercises have been used in focus groups for at least the past 20 years. I remember, many years ago, having to carry the heavy, sometimes unreliable dial-testing equipment to the special facility with auditorium seating to run dial-testing focus groups. These groups were very expensive for the client and nerve-racking for the moderator, as dials, wires, and their connections would sometimes be faulty.

But nowadays, thanks to online qualitative research platforms, all these functions are facilitated very easily, and far less expensively.  Aggregating the data, creating output, and including it in final report deliverables is handled by some clicks on the screen.

This is a great time in the history of the market research industry to be in this business.  The confluence of broadband connectivity, digital technology, and the comfort level and skill of consumers in communicating so easily and so well within computer-mediated environments has made the need to conduct F2F qualitative research increasingly threadbare.  Truly, the net effect of these factors in combination has opened up a whole new world of possibilities in social research in general and market research in particular.  Considering that focus groups have been done the same way for over four decades, qualitative research was very much in need of some fresh approaches.  Conversely, quantitative research has undergone radical shifts and improvements as a function of advances in technology over this same period and that have improved data quality and reduced cost and time requirements.  Indeed, it is a rare occasion when quantitative surveys need to be administered F2F, e.g., mall intercepts.

Indeed, moderators will be ever more hard pressed to rationalize to clients their choice of F2F and why they are not shifting to an online method for studies like these. Better, cheaper, and faster is a compelling argument that forces client-side research managers to embrace online methods as they, too, will need to rationalize their choices to the clients they serve within their organizations.


How To Avoid Disruption In Business And In Life

Market research has historically done a poor job of helping companies avoid disruption. We are good at tweaking the known but not so good at spotting the next chapter. To underline this fact, 89% of the firms that were on the Fortune 500 list in 1955 are gone. Companies like National Sugar, Detroit Steel, and Studebaker were all disrupted out of existence. This post, and the accompanying TED talk, will argue that traditional market research techniques focus too much on easy, rational (System 2) analysis vs. the more emotional (System 1) world that we live in. It will also share a new approach to exploring those primal System 1 motivations in order to avoid disruption.




By David Rabjohns

Disruption isn’t just reserved for old companies like Studebaker. Companies and categories continue to disappear every day. Goodbye Borders, goodbye Blockbuster. Our analysis showed that it takes an average of 16 years from the arrival of a new idea to disruption. Kodak, for example, went bankrupt 16 years after the arrival of the first affordable digital camera, and Borders disappeared 16 years after the arrival of Amazon. Because of this, the people being disrupted often don’t even realize it is happening until it is too late, like the story of the poor frog slowly boiling to death in his pot.

The exceptions to the rule

There are some companies that seem to avoid this problem. Take Nike for example. They have managed to grow despite intense competition and the perpetual development of new technology. What did they do differently? Did they make better shoes? Did they hire smarter shoemakers?

We have been researching what causes companies to be disrupted and what survivors do differently. The results have brought us back to the fundamental drivers of humanity.

Nike understood that they wanted to build an emotional experience for their customer. They focused on building a coaching brand that helped people find athletic success. The company embraces new technology, such as the iPhone, as a means of furthering their goal of helping their customers succeed. They put the ‘who’ before the ‘how’.

Kodak, on the other hand, was focused on the product. They advertised a product and brand that focused around innovation and testing new film concepts. The company understood too late the value of helping people create memories.

What everyone wants

People generally share simple, universal, common, motivational needs. People want:

  • to feel successful
  • to feel connected
  • to feel creative
  • etc

These motivational needs exist because of our DNA. These traits have driven us forward as a human race. They have allowed us to fight opponents from the woolly mammoth to modern smog.

Studies have identified 12 fundamental human drivers.

In this new world of research, we can now understand the causes of disruption and how to avoid it. Online consumer conversations can be the key we need to gain a clear window into the lives and unmet needs of our customers.

Using these needs

There are many ways to look at these needs. We have developed a software tool called MotiveScape, which scans thousands of conversations and automatically categorizes linguistic patterns that relate to the 12 fundamental human drivers.
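MotiveScape’s internals are not described here, but a simple keyword-lexicon approach gives the general flavor of categorizing conversation text against motivational drivers. Everything below – the lexicon, the keywords, and the snippet – is an invented illustration, not the tool’s actual linguistics model:

```python
# Hypothetical sketch: match conversation snippets against a small,
# invented lexicon of motivational drivers. A real tool would use a far
# richer linguistics model than bare keyword overlap.

DRIVER_LEXICON = {
    "success": {"win", "achieve", "accomplish", "goal"},
    "connection": {"friends", "together", "share", "community"},
    "creativity": {"create", "design", "imagine", "invent"},
}

def tag_drivers(text):
    """Return the drivers whose keywords appear in the text, sorted by name."""
    words = set(text.lower().split())
    return sorted(d for d, kws in DRIVER_LEXICON.items() if words & kws)

print(tag_drivers("We love to create and share ideas together"))
# ['connection', 'creativity']
```

Tallying these tags across thousands of snippets would yield the kind of System 1 profile of a brand’s conversations that the approach above describes.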

Using this kind of System 1 style of analysis we find that people do not buy the things a company sells; they buy how the brand makes them feel. The company’s core personality, the ‘who’, is what helps consumers work out how they feel.

These major brands that have failed have all made the mistake of focusing too much on what they sold and forgot about their ‘who’. They missed their motivational need and found themselves disrupted.

The companies that succeed are the ones that take the motivational message to heart. These companies have avoided disruption and continue to grow strongly.

New research methods can help us better explore the core motivational drivers, understand them, categorize them, and use them to develop the archetypal story that each brand tells.


For more on this topic, explore this TED talk https://www.youtube.com/watch?v=AuOIYwHv0I8 by David Rabjohns, Founder of MotiveQuest.


Why “Patient-Centricity” Requires Relationships

Posted by Corey Schwartz Tuesday, April 21, 2015, 8:00 am
It seems like most conversations at healthcare industry conferences center around the notion of “patient-centricity.” Yet everyone seems to define it differently.

Photo credit: COD Newsroom

By Corey Schwartz

Corey Schwartz is the Managing Director of Communispace Health and will be presenting at IIeX Health 2015 about Why Relationships are Critical for Better Health Outcomes.

It seems like most conversations at healthcare industry conferences center around the notion of “patient-centricity.” Yet everyone seems to define it differently.

In reality, the term “patient-centricity” points to a fundamental flaw in our thinking – it artificially relegates (and inherently isolates) patients to a single focal point within a company. However, when we think about only patients as the epicenter of our practice, we are missing an opportunity to holistically understand all the factors that affect the patient.

“Patient-centricity” is about building real, ongoing relationships – not just with patients, but with all the people who nurture and support the patient journey. The doctors, nurses, caregivers, and pharmacists. The pharma companies, retailers, health insurers, medical device makers, and CPG manufacturers. It’s a complex, interconnected symbiosis that requires – and deserves – a deep and broad understanding of every stakeholder.

It’s through relationships that we are able to access and engage with the raw emotions, candid stories, challenges, triumphs, and day-to-day realities of everyone in the patient care ecosystem. Relationships shatter preconceived notions, challenge the status quo, and help us see the world – and the patients, customers, and employees living and shopping and working in it – in new and inspiring ways, spurring innovation and encouraging progress.

Relationships, especially long-term ones, create intimacy and authentic conversation. They get people talking, doing, and sharing with you in ways that you never imagined. For example, a real, ongoing relationship can reveal that nighttime is consistently the scariest time for a diabetic teenager and his mother. Insight like this can reveal the opportunities for numerous organizations – providers, payers, physicians, retailers, and more – to work towards improving outcomes.

When we’re in an honest relationship, we feel it. We can’t ignore it. We internalize it. And that goes for organizations, too. That empathetic gut-feeling gets internalized into the hearts and minds of people across every department – from insights to R&D to marketing to the C-level – guiding every decision and every action. When a company has ongoing access to first-hand accounts of a Parkinson’s patient’s daily struggles, for instance – that access spawns empathy and carries with it enormous power to galvanize alignment and incite action.

Even companies with the best intentions to be patient-led can still make decisions based on the company’s needs, not the patient’s. In the healthcare sector, especially, relationships with patients or physicians may seem scary or riddled with compliance issues. But they don’t have to be. Innovation, expertise, and regulatory prowess can lead to safe, systematic engagement. In reality, it’s scarier to innovate in silos, detached from the people you serve. So let them be your guide. Because your relationship with them is the truest and fastest way to “patient-centricity.”


Checking Your Blind Spots: Take A Look At The “Aren’ts”

It is important not to forget how much we can learn from those that aren’t currently purchasers and users, or the “Aren’ts”. There are a few benefits of getting to know your Aren’ts.



Guest Post By Will Pirkey of iModerate

Today, businesses have access to a wealth of information that allows (among many other things) an easier and quicker view of the audience that’s purchasing one’s product or service. Brands have more insight into the ages, genders, geographies, and more that are putting their products in their physical or virtual shopping carts.

Concurrently, we have been noticing a trend regarding the desired audiences clients want to engage in research. These audiences are becoming more restricted as marketing, insights, and research teams seek to narrow their focus and learn more about solely the audiences that are buying their products and using their services.

There is absolutely nothing wrong with learning as much as you can about your target audience – brands should keep trying to get to know these people in as much depth as possible. However, we can’t focus so narrowly on the target audience, as if we are looking through blinders. Big Data tells us who our core consumers are, but it leaves blind spots that need to be checked. It is important not to forget how much we can learn from those who aren’t currently purchasers and users, or the “Aren’ts”. There are a few benefits of getting to know your Aren’ts:

Get a clear picture of the barriers, limitations, and gaps that are preventing people from becoming consumers. These barriers could range from brand perceptions to price to product placement to lack of awareness and anything in between. You will never know what your barriers are if you never ask, and you can’t address your barriers until you know what they are and why they exist. Digging deep into why people aren’t using your product will allow you to work to bridge these gaps and overcome limitations. For example, what if people perceive your brand as for experts only, but you want to reach the general public (e.g. Brand Wars: Nike v. Under Armour).  You wouldn’t know this until you talk to the people that aren’t currently purchasing your brand’s products.

Collect some competitive intelligence. If they’re not buying and using your products and services, there’s a good chance they’re customers of a competitor of yours. Take the opportunity to learn about what they do consume in your category. What do they like/dislike? What motivates them to make the choices that they do? What would they change if they could? Having a more open-ended conversation with someone about why they are currently choosing to use X product or service can be a very enlightening experience that you can leverage to improve your own offerings.

Identify potential opportunities. Connecting with your Aren’ts gives you license to investigate opportunities. Conduct open-ended research with your “Aren’ts” to explore whether you potentially have something, or could develop something, that would make them more likely to become an “are.” One of the major benefits of conducting open-ended research is the ability to minimize interjecting your own assumptions and allow people to speak for themselves as much as possible. As specific themes or trends start to develop, they can become topics for additional research.

Stay ahead of trends in the market. By speaking with a much larger swath of people about what they want, what they aren’t currently getting, what they like and dislike, their excitement and/or fears about the future, and their perceptions of larger socio-cultural forces (e.g. the economy, politics, etc.), you can gain foresight into what is on the horizon. By doing this you can get out in front of trends and anticipate rather than react. We all would like to be seen as trendsetters rather than trend followers.

One question that should be asked of all research, but especially Big Data, is not only what our data is telling us, but also what our data is NOT telling us. Big Data lacks the intimacy to allow you to really connect with people and get to know what makes them tick. This is true for both your customers and those who aren’t. However, if you don’t focus on both, you really are only seeing half of the picture. Put another way, you’re not going to change lanes without checking your blind spots first; everything may seem clear looking straight ahead, but you never know what is coming up on you. Getting to know your Aren’ts will provide you with some incredibly valuable information. Even if, in the end, you learn that you’re targeting the right people, you’ll be able to strengthen your differentiators, validate your current direction, and act with confidence. We all like to focus on the things we are doing right, but sometimes it’s the things we are doing wrong, or not doing at all, that we can learn from the most.


Change In Marketing Will Never Be As Slow As It Is Today

Is MarTech improving the CMO’s performance? If so, what is the nature of the improvement: tactical or strategic?



By Peter Orban

MarTech entered the marketing subconscious with a bang when Gartner predicted that – in a span of five years, by 2017 – the CMO would spend more on IT than the CIO. Interestingly, the average tenure of the CMO doubled in the past six years – despite increased expectations of marketing. Is MarTech improving the CMO’s performance? If so, what is the nature of the improvement: tactical or strategic?

I was pondering these and similar questions while attending the second MarTech conference in early April, chaired by Scott Brinker, the creator of the MarTech Lumascape. Based on the presentations and conversations with participants and exhibitors alike, here are a few observations.

  • The big change in how marketing operates and fits into the enterprise is not coming – it is here.
  • The foundation of the emerging new marketing function is a combination of internal & external data.
  • Sitting on top of the data is the “decisioning layer,” the intelligence which – combining deep user/consumer insights with business logic and machine learning – provides direction.
  • Finally comes the “execution layer” which carries out the interaction with customers and captures the resulting data generated in the process, feeding it back to the first layer.

David Raab’s presentation about Customer Data Platforms illuminated the new architecture. Corey Craig of HP gave a great example of combining business logic with an understanding of the user journey. Tony Ralph shared a story about Netflix building its own execution layer when the ones for sale were not good enough.

  • To successfully implement the emerging function, you cannot piecemeal it: you need to adopt systems thinking covering structure, processes, and people at the same time.
  • A new organizational blueprint is emerging, incorporating Marketing, IT, and even Sales. It is not a new silo replacing old silos, but a flexible and agile organizational “quicksand,” adapting as necessary, frequently in real time.

Laura Ramos, Cynthia Gumbert, and others spoke about a merger between Marketing & IT. Jill Rowley had similar thoughts but related them to Marketing & Sales, arguing that sales is becoming more social, and that marketing is capable of not just delivering a lead, but also converting it. Jeff Cram included customer service and the principles of service design to integrate thinking around people, content, processes, and platforms when designing customer experiences. In any case, most in the room thought that – to paraphrase David Packard – IT is too important to leave to IT.

  • With the death of “the campaign,” processes also need to become flexible, and able to handle exceptions to deliver “uninterrupted experience” – just like a computer program.

Advocating Lean and Agile approaches, Jeff Gothelf (Neo.com) underscored the need to combine customer-centric experimentation, data-informed iteration, and humility. Isaac Wyatt pointed out the increasing similarity between creating a marketing program and creating a computer program. According to Pat Spenner, traditional marketing is like an orchestra (or waterfall), while the new one is more like a jazz band (agile).

  • The new org will need new frameworks, tools, and trained marketing technologists. The road to this is via culturally fitting experiments, pointing to the many variables on the path to becoming a marketing technologist and specifying the role of the new MarTech department.

Joseph Kurian (Aetna) and Saad Hameed (LinkedIn) described their respective journeys of identifying or ‘creating’ the hybrid talent necessary for their operations.

However, the other big change – a tectonic shift in consumer/customer behavior – was not nearly as well represented on the agenda. Granted, most speakers suggested as a starting point that consumer and user intent is well understood, signal and noise have been separated, and the ingredients of user experience have been successfully isolated/attributed. But only a few actually dissected the benefits of new insights and a deeper understanding of consumer behavior.

Maybe because of the relative scarcity, the few “outside in” sessions were eye opening.

  • With the right consumer insights MarTech can effectively extend the business model of the organization into the digital realm. Also inevitably, return on technology will start to diminish at some point, but return on creativity – rooted in ever improving consumer insights – will not. (Gerry Murray, IDC)

Perhaps it is not surprising that Venture investors interviewed on the closing panel agreed that future investments will target the “decision layer” while the other layers are more likely to experience consolidation.

This tells me that the most exciting days of MarTech are yet to come.


Why I Don’t Miss Angry MR Client

Fortunately, Angry MR Client, Angry MR Respondent, and Angry MR Vendor seem to have faded away. Unfortunately, too often the complaints vendors and clients have about “the other side” are still there. Don’t get angry – get better. Or buy a saddle (and you’ll have to read to the end to know what that means).



By Ron Sellers

You know what I don’t miss?  The multiple blog posts from Angry MR Client and Angry MR Respondent that hit the industry a couple of years ago.  There were also a few posts by those claiming to be an Angry MR Vendor, along with plenty of angry vendor rebuttals to the other angry folks.  Remember those?

Why was everyone so angry?  Are they still in a rage, or have they simmered down to being mildly annoyed or maybe just dyspeptic?

While these specific Angry bloggers seem to have stepped away from the keyboard, there are still a lot of complaints vendors have about clients, and clients about vendors.  Too often, these complaints are directed at an entire class of researchers or at the industry as a whole.

Having been both a vendor and a client, I understand many of the complaints from each perspective – shoddy service from vendors, repeated abuse of respondents, unreasonable client demands, payment problems, unhelpful reporting, stale methodological approaches, awful sales attempts, micromanagement, lack of project management, etc.  In fact, I’ve complained about many of these myself, either here in the Greenbook blog or in informal gripe sessions with other researchers.

I get the desire for our industry to be better, but I just don’t get the anger so often directed at an entire sector of our industry, or the tendency to lump all clients or all vendors together.  Too many times, sentences start with “What vendors fail to do is…” or “What clients don’t understand is…”

Many times, I’ve been a survey respondent, and I’ve experienced the horrors of what’s out there masquerading as research.  But I don’t get angry – I just refuse to waste my time with poorly designed questionnaires.  I see no reason to get angry; instead, I just walk away and let the researchers get angry at the low completion rate and high abandonment rate on their shoddy surveys (if they even know or care what those problems are).

I’ve been badly treated by clients.  I’ve had people fail to pay me on time (or at all, to the tune of $40,000 in one case).  I’ve lost a $30,000 project solely because someone underbid me by $100.  I’ve had a client ask me to falsify data.  I’ve had a client try to frame me for a major mistake she made (fortunately I had a copy of her mistake in her own handwriting).  I’ve had a client scream at me because he never provided me with approval on the questionnaire and I did not field it without his approval (leading to one of my all-time favorite lines from a client:  “I don’t care if it’s right!  I need data!  Just get me data!”).

And certainly I got angry.  But I only got angry at those people.  Not even at their companies, because there were other very fine people in those firms.  Just at those people.  I’m not angry at clients in general.  I’m not expecting the worst from every client with whom I work, and I’m happy to say I truly enjoy working with most of my clients.

I also learned early in my career that anger just isn’t worth it.  My boss at that time (who was the owner of the company) heard about the client screaming at me.  Her response taught me a lot about how to conduct my business:  “Do your best to finish up this project and once it’s done, don’t even accept any phone calls from him in the future.  I will not have my employees treated that way.  We don’t need his business that badly.  I refuse to work with him again.”

This was why, a decade later, I fired my biggest client.  Well, I didn’t specifically tell them not to contact me any more – I just stopped asking for their business and instead concentrated on finding replacement business.  They sort of quietly went away to inflict torture on other vendors.

I got tired of the fact that they rotated people in and out of the research department so often that I never worked with the same person twice and had no chance to build relationships.  I got tired of their ridiculous demands (like me begging them for two weeks to allow me to over-recruit a project, then having them demand the night before the focus groups that we add more recruits; or having their people wake me up early on a Sunday morning to discuss something insignificant).  I got tired of their nickel-and-dime approach to work, like the fact that they refused to reimburse vendors for lunch in their travel expenses (because if I were in my Phoenix office rather than on the road, I’d be going out to lunch anyway, so I could darn well pay for my own lunch in Atlanta or Detroit just like I would in Phoenix).  Toward the end, one of their analysts confided to me that one-third of the RFPs they sent out would come back marked “declined to bid.”  Seems that many others had the same perspectives I did.

But other than swapping funny war stories with other researchers, I’m not angry at that client, nor at clients in general.  Rather than getting angry, I got better.  I got better at finding people I could respect and who would respect me, and I concentrated on doing my best to serve them.

I’ve also used scores of vendors over the years, and found some of them to be so incompetent as to boggle my mind.  Like the project that was supposed to go into the field on a Thursday night, but I couldn’t get through to the company at all Friday morning for an update.  Right before noon, someone finally picked up the phone, and told me that his company had just bought the vendor I had contracted with.  They had fired the entire staff (giving them one hour to clean out their desks), and he had taken over.  When I asked about my project, his response was, “We don’t have time to do it, and even if we did, they underbid the project, so we would charge you three times the amount for it.”  And in those days of printed phone lists, he had no idea where my one copy of the list was, nor when he could be bothered to return it.

Or how about the company that was recruiting clergy for my focus groups?  When I asked them to send me a list of local churches, moments later I was the surprised recipient of a list of every Church’s Fried Chicken restaurant in their market.

Or maybe the three different field vendors I’ve had who have utterly falsified data they tried to give me.  One simply made up quantitative surveys and tried to pass them off to me as completed interviews; when I started questioning some of the oddities, a junior staff member admitted their attempted swindle.  Two were qualitative recruiters who didn’t want to face up to the fact that they weren’t getting recruits, so they plied me with fictitious reports until the day before the groups, when they finally admitted they had almost no recruits.  Yes, I was very angry – even to the point of getting one person fired for the ruse.  But I don’t lay the blame at the feet of all vendors for these misdeeds.

If you simply cannot find good vendors or good clients, maybe you need to reconsider your approach.  For difficult clients, consider charging them more (at least to be compensated for your misery), or standing up to them (in a nice way) and explaining why you are not going to work all weekend because they forgot to give you something until Friday at 4:59 p.m.  If it’s bad enough, maybe you need to find other clients.

If you simply do not have good vendors who can meet your needs, maybe you need to look harder for new vendors, or do a better job of vetting them and investigating their work before you trust them with a project.  If your vendors are consistently underwhelming you with their work, why do you continue to use those vendors?

Or maybe you need to reconsider your expectations.  I would love to find a car that has 500 horsepower and gets 100 miles to the gallon, but I’m not going to curse all car companies because no one is giving me what I want.

Constant complaints about how vendors are incompetent remind me of the single guy who gripes that there are no good women available – but his definition of “good women” is that they are rich, gorgeous, compliant, and willing to devote themselves entirely to his needs.  (Ironically, this also is usually the guy who hasn’t come within ten yards of a stick of deodorant in days and who thinks a classy date is buying the name-brand pork rinds instead of the store brand for when the two of them watch Married with Children reruns.)

Or maybe you need to reconsider how you work with vendors.  Vendors can’t write a strategically meaningful report if you refuse to tell them what your strategic needs are, or you make them do their work in a vacuum.  Vendors are unlikely to move heaven and earth to meet your deadlines if they know you sat on the RFP for two weeks, or that you’re likely to pay them six weeks late.  Vendors are not going to wrack their brains to come up with innovative ways of getting you what you need if you’ve previously used their ideas but given the actual work to someone less expensive, or if you’re constantly micromanaging their work (or taking credit for it).  Vendors probably won’t absorb unexpected costs or gladly do extra work if you make a habit of demanding they reduce their bid by 20%, or requiring line-item bids so you can question every expense individually.

I have been extremely angry at individual clients and vendors at different times, and I have no doubt I have made some clients and vendors angry at me over the course of my career.  I’ve messed up royally a few times (although I also try to acknowledge and correct the mistakes and make things right with the affected party).  But I don’t see any of this as a reason to feel there are no good clients or no good vendors, or that our industry is a morass of feeblemindedness, group think, and incompetence.  There are plenty of bad clients and vendors in our world, but also plenty of very, very good ones.  That’s why I have a variety of clients I’ve worked with for multiple decades:  we both make a habit of working towards a great partnership where everybody benefits.

I received a valuable piece of wisdom years ago from my pastor, who said this:  “If one person calls you a horse’s behind, don’t worry about it.  If two people call you a horse’s behind, take a good hard look in the mirror.  If three people call you a horse’s behind…buy a saddle.”

If your vendors invariably fall short of your expectations or your clients generally make your life miserable, don’t get angry, get better.  Get better at finding good people to partner with, and get better at giving them what they need to succeed.

Or take a good hard look at the price of saddles these days.


The 2nd Edition Of the GRIT Consumer Participation in Research (CPR) Report Is Available

Now in its second year, the GRIT Consumer Participation in Research (CPR) report is our effort to answer the who, what, when, where, and why of global consumer participation.


Respondents are the lifeblood of market research. Whether it’s qual or quant, surveys or communities, neuromarketing or ‘Big Data’, or anything in between, knowing how to reach, engage, and understand people is the very bedrock of insights.

In our interconnected world, achieving that goal is in some ways easier and in many more ways harder. Until now, little data have existed to help researchers answer this basic question: how do we get consumers to engage with us, and what do those folks look like?

Now in its second year, the GRIT Consumer Participation in Research (CPR) report is our effort to answer the who, what, when, where, and why of global consumer participation.


The report includes the most up-to-date data in the world on the profiles of fresh vs. frequent responders. It answers questions such as:


  • Are “Frequent Responders” categorically different from “Fresh Responders”, and, if so, in what ways? Does this matter? Why?
  • Is the difference significant enough that it should be of concern, or be of strategic benefit, to different stakeholders in the research process?
  • Do the differences necessitate a form of ‘data triangulation’ whereby customers need to receive a blend of respondents, some “fresh”, and some less so? Or should all respondents be “fresh”? Why?
  • Is there a confounding factor at play? If a majority of all responders online share a more dominant characteristic about which we do not know, such as intellectual curiosity (no matter how frequently they answer a survey), how much weight should we assign to the “freshness” findings shown here?
  • The people who were intercepted are likely somewhat biased toward heavier Web users. Since one can make this same observation of all Web-based respondent data capture modalities, does this matter? Why?
  • What are the implications that need to be addressed as an industry from these findings, specifically, for those who make data-based decisions?

We hope this report will become the go-to resource that researchers globally can use to validate and benchmark their own research. Enjoy!


How Addressable TV Changes Media Measurement Forever (Infographic)

If you haven’t heard of addressable TV, it’s time to start getting familiar with the concept.

If you haven’t heard of addressable TV, it’s time to start getting familiar with the concept. Addressable TV is a technology and marketing practice that selectively segments the ads seen by TV viewers, allowing groups of people to watch the same program yet see different, more effectively targeted ads, regardless of their physical distance from one another.

Here’s how it works: marketers use data-driven household profiles to send targeted ads to specific households. With information like income, family composition, and even car leases and mobile contracts, marketers can designate specific ads to be shown to certain families. This targeting increases sales ROI and enhances analytic potential, just as targeted online advertising does.
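The matching logic is conceptually simple. Here is a rough sketch in Python; the household fields, campaign names, and targeting rules below are entirely hypothetical, invented only to illustrate the idea of routing different ads to different households:

```python
# Hypothetical household profiles built from third-party data
households = [
    {"id": "hh-1", "income": 45_000, "has_children": True, "car_lease_ends_months": 2},
    {"id": "hh-2", "income": 120_000, "has_children": False, "car_lease_ends_months": 18},
]

# Hypothetical campaigns, each with a targeting rule; order encodes priority
campaigns = [
    {"name": "minivan_spot",
     "match": lambda h: h["has_children"] and h["car_lease_ends_months"] <= 6},
    {"name": "luxury_sedan_spot",
     "match": lambda h: h["income"] >= 100_000 and not h["has_children"]},
    {"name": "generic_spot", "match": lambda h: True},  # fallback for everyone else
]

def pick_ad(household):
    """Return the first campaign whose targeting rule matches this household."""
    for campaign in campaigns:
        if campaign["match"](household):
            return campaign["name"]

for hh in households:
    print(hh["id"], "->", pick_ad(hh))
```

Two households watching the same program would each be served the spot their profile matches, which is exactly what makes the approach "programmatic for television."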

It’s programmatic advertising for television, which also means it’s driven by a virtuous cycle of consumer-centric data, including viewing behavior and impact.

Sounds complicated, but as it turns out, setting up the technology is simple. Set-top boxes have their own IP addresses, which allows a TV’s Nielsen data to be integrated with data from other devices and databases.

Addressable TV advertising has several advantages that set it apart from other forms of advertising. Television has the largest audience reach of any medium today at 96%, and it draws more than $70 billion annually in media spending. And even though spend is trending away from broadcast media (radio and television), the average adult in 2014 spent five hours a day watching television, despite the growth of mobile.

In 2014, addressable TV was estimated to only represent $200-$300 million of the $70 billion ad spend on TV. However, industry leaders predict that 25% of TV ad budgets will be spent on addressable TV within three years. Addressable TV will revolutionize the way that advertisers plan their campaigns, and the focus will change from quantity of ads to quality.

As that shift happens, it will speed the transition from panel-based measurement to real-time, single-source market measurement. The implications for researchers, marketers, consumers, and advertisers are simply immense.

Check out this great infographic (click on it to make it bigger) developed by the fine folks at Signal, a tech company that is aiming to play a central role in this brave new world. It’s well worth a read and paints a compelling picture of the data-driven marketing world of Addressable TV we’re entering now.



“Analytics is Easy”

Posted by Kevin Gray Thursday, April 9, 2015, 10:37 am
Posted in category General Information
Analytics is a lot harder than some seem to realize.



By Kevin Gray

Erroneous thinking about analytics continues to hang on in the marketing research community.  Often it is tacit, but at times it is articulated candidly. This is worrisome given that marketing research is a research industry and no longer a young one.  Some, for example, see analytics as little more than cross tabs and charting that can be done by anyone who has point-and-click software installed on their PC.  This is a bit like saying that if you can talk, you can do qualitative research.  Others think it’s “just programming.”  There are other misperceptions as well, and one consequence of all this confusion is shoddy analytics which, in turn, raises doubts about the value of analytics.1  In this short article, I will show that analytics is, in fact, not easy, and why this mistaken belief is potentially costly for any marketing researcher to hold.

Cross tabulations and graphics are an indispensable part of analytics but only part of it, and marketing researchers have long had a vast assortment of sophisticated tools at their disposal.  Even basic analyses should not be undertaken in a slapdash fashion, however.  Churning out stacks of cross tabs is not unheard of in our business but is very risky because even with big data there always will be fluke results.  Instead of placing our bets on shotgun empiricism, as researchers, we should plan cross tabulations and other analyses when designing the research, and interpret the patterns of our findings in the context of other pertinent information, not simply highlight isolated results.  The Improbability Principle: Why Coincidences, Miracles, and Rare Events Happen Every Day by David Hand, a past president of the Royal Statistical Society, is a great read and I can recommend it to marketing researchers.
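To see how easily fluke results arise, here is a minimal simulation (synthetic data, Python standard library only): it cross-tabulates pairs of genuinely unrelated yes/no variables and counts how many 2x2 tables a routine chi-square test flags as "significant" purely by chance.

```python
import random

random.seed(42)

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    if 0 in (row1, row2, col1, col2):
        return 0.0
    expected = [row1 * col1 / n, row1 * col2 / n, row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

CRITICAL = 3.841  # chi-square critical value, 1 df, alpha = 0.05

def count_flukes(n_tables=200, n_respondents=500):
    """Run many tests on pairs of unrelated random variables; count 'hits'."""
    flukes = 0
    for _ in range(n_tables):
        x = [random.random() < 0.5 for _ in range(n_respondents)]
        y = [random.random() < 0.5 for _ in range(n_respondents)]
        a = sum(1 for xi, yi in zip(x, y) if xi and yi)
        b = sum(1 for xi, yi in zip(x, y) if xi and not yi)
        c = sum(1 for xi, yi in zip(x, y) if not xi and yi)
        d = sum(1 for xi, yi in zip(x, y) if not xi and not yi)
        if chi_square_2x2(a, b, c, d) > CRITICAL:
            flukes += 1
    return flukes

# Roughly 5% of the 200 tables will look "significant" despite there being
# nothing to find -- exactly the shotgun-empiricism trap described above
print(count_flukes())
```

Two hundred cross tabs of pure noise will typically yield around ten "findings," which is why planned analyses and context matter more than sheer volume.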

Another example of substandard analytics can be found in mapping.  Nowadays mapping, in practice, frequently seems to mean junior research execs or even clerical personnel mass producing correspondence analysis maps, usually with the software’s default settings.  The maps are nearly always brand maps and user maps and other kinds of mapping are underutilized, in my opinion.  Moreover, though correspondence analysis is a wonderful technique it is just one of many appropriate for mapping, and biplots, MDPREF, MDS, factor analysis, discriminant analysis, canonical mapping or other methods may be better suited to the problem at hand.  What’s more, I still see maps being interpreted incorrectly.
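As one illustration of an alternative to default correspondence analysis, here is a minimal multidimensional scaling (MDS) sketch using scikit-learn; the brand names and the dissimilarity matrix are invented for the example, standing in for aggregated "how similar are these brands?" judgments:

```python
import numpy as np
from sklearn.manifold import MDS

# Hypothetical brand dissimilarity matrix (symmetric, zero diagonal)
brands = ["A", "B", "C", "D"]
dissim = np.array([
    [0.0, 0.2, 0.8, 0.9],
    [0.2, 0.0, 0.7, 0.8],
    [0.8, 0.7, 0.0, 0.3],
    [0.9, 0.8, 0.3, 0.0],
])

# Embed the brands in 2-D so that map distances approximate the
# stated dissimilarities -- a basic perceptual map
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissim)

for name, (x, y) in zip(brands, coords):
    print(f"{name}: ({x:+.2f}, {y:+.2f})")
```

Even in this toy case, interpretation requires care: MDS maps are invariant to rotation and reflection, so only the relative distances, not the axes, carry meaning.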

A somewhat more elaborate but nonetheless debatable practice is psychographic segmentation with what has been called the tandem approach.  Though it began to be seriously questioned many years ago this method is still quite popular and, put simply, consists of K-means or hierarchical cluster analysis of factor scores derived from attitudinal ratings.  Tandem refers to the dual use of factor and cluster analysis in the segmentation.  The psychographic statements respondents rate are often improvised, making matters worse.  Poor questionnaire design plagues many kinds of marketing research and items that make little sense to respondents or mean different things to different people will sink a segmentation whatever statistical methods are used.  In the tandem approach, segments obtained from the cluster analysis are cross tabulated with demographics and other data in the hope meaningful and actionable segments will materialize.  They often do not and, accordingly, I sometimes call this the “Factor, Cluster & Pray” method.
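The tandem approach itself takes only a few lines to sketch, which is part of the problem. Here is a minimal version using scikit-learn on hypothetical data; the "ratings" are pure noise, so any segments that emerge are, by construction, meaningless:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical data: 300 respondents rating 12 psychographic
# statements on a 1-5 scale (here, random noise)
ratings = rng.integers(1, 6, size=(300, 12)).astype(float)

# Step 1 ("Factor"): reduce the ratings to a handful of latent scores
scores = FactorAnalysis(n_components=3, random_state=0).fit_transform(ratings)

# Step 2 ("Cluster"): K-means on the factor scores
segments = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)

# Step 3 ("Pray"): cross-tabulate the segments against demographics and
# hope they turn out meaningful -- with noise ratings, they will not
print(np.bincount(segments))
```

The software will dutifully return four segments no matter what it is fed, which is exactly why the statistical mechanics alone cannot rescue improvised items or a weak design.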

Regression is perhaps the most widely-used statistical method of them all but is also deceptively complex.  Many books have been written which detail how regression analysis can be badly abused and Frank Harrell’s Regression Modeling Strategies is the most comprehensive and hard-hitting I’ve read.  Marketing researchers seem to make the sorts of mistakes people working in other disciplines do, though perhaps more often.  Some examples are using highly correlated predictors, neglecting residual analyses, ignoring correlations across time (e.g., in weekly sales data) or space (e.g., regions of a country), categorizing the dependent variable and confusing correlation with causation.
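The correlated-predictors problem, for instance, can be demonstrated in a few lines of NumPy. This is a synthetic example, not a real dataset: two nearly identical predictors make the individual coefficients swing wildly across repeated samples, even though their sum stays near the true value.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)  # nearly identical predictor

# True model: y = 2 * x1 + noise.  Refit on repeated noise draws and
# watch the individual coefficients become unstable.
coefs = []
for _ in range(5):
    y_rep = 2.0 * x1 + rng.normal(scale=0.5, size=n)
    X = np.column_stack([x1, x2])
    beta, *_ = np.linalg.lstsq(X, y_rep, rcond=None)
    coefs.append(beta)

for b1, b2 in coefs:
    # b1 and b2 vary by several units from draw to draw,
    # yet b1 + b2 stays close to the true value of 2
    print(f"b1={b1:+.1f}  b2={b2:+.1f}  sum={b1 + b2:+.2f}")
```

A researcher reading the individual coefficients as "importance" scores here would reach a different conclusion from every sample, one of the classic traps Harrell's book warns about.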

Another concern I have, in fact, pertains to causation.  Whenever we say things like “This sort of consumer does this because of that,” we are making a statement about causation whether or not we are conscious of it.  Causal analysis is a subject even bigger than regression and one bible is Experimental and Quasi-Experimental Designs for Generalized Causal Inference (Shadish et al.).  Trying to establish causation can be likened to walking though a minefield, to paraphrase a comment once made to me by a Marketing professor with a PhD in Statistics.  We need to tread carefully!

The next time you’re in a very brave mood, ask your senior finance director if they are no better at their job than they were 10 years ago.  Common sense should tell us that experience counts, particularly in highly technical professions.  Formal education only lays the groundwork for statisticians and even veterans are constantly learning new things and new tricks.  The list of viable analytic options continues to grow (for examples see Analytics Revolution) and we’ve reached the point where we now have so many tools that skill levels are becoming diluted.  Over-specialization, on the other hand, is also something we need to be wary of, and some less-experienced analysts lean on a pet method for nearly any situation: if all you have is a hammer, everything looks like a nail…

Now, here comes the bad news: The math stuff can actually be the easiest part of analytics!  Every so often I’m asked questions such as “If I give you 10 million customer records, what technique would you use?”  To characterize questions like these as naive would be too diplomatic, as they reveal little grasp of the fundamentals of research.  The Cross Industry Standard Process for Data Mining (CRISP-DM), illustrated in the diagram below, will help make clear what I mean by this.




Here are very succinct definitions of each CRISP-DM component, courtesy of Wikipedia.2 

Business Understanding:

This initial phase focuses on understanding the project objectives and requirements from a business perspective, and then converting this knowledge into a data mining problem definition, and a preliminary plan designed to achieve the objectives.

Data Understanding:

The data understanding phase starts with an initial data collection and proceeds with activities in order to get familiar with the data, to identify data quality problems, to discover first insights into the data, or to detect interesting subsets to form hypotheses for hidden information.

Data Preparation:

The data preparation phase covers all activities to construct the final dataset (data that will be fed into the modeling tool(s)) from the initial raw data. Data preparation tasks are likely to be performed multiple times, and not in any prescribed order. Tasks include table, record, and attribute selection as well as transformation and cleaning of data for modeling tools.


Modeling:

In this phase, various modeling techniques are selected and applied, and their parameters are calibrated to optimal values. Typically, there are several techniques for the same data mining problem type. Some techniques have specific requirements on the form of data. Therefore, stepping back to the data preparation phase is often needed.


Evaluation:

At this stage in the project you have built a model (or models) that appears to have high quality, from a data analysis perspective. Before proceeding to final deployment of the model, it is important to more thoroughly evaluate the model, and review the steps executed to construct the model, to be certain it properly achieves the business objectives. A key objective is to determine if there is some important business issue that has not been sufficiently considered. At the end of this phase, a decision on the use of the data mining results should be reached.


Deployment:

Creation of the model is generally not the end of the project. Even if the purpose of the model is to increase knowledge of the data, the knowledge gained will need to be organized and presented in a way that the customer can use it. Depending on the requirements, the deployment phase can be as simple as generating a report or as complex as implementing a repeatable data scoring (e.g. segment allocation) or data mining process. In many cases it will be the customer, not the data analyst, who will carry out the deployment steps. Even if the analyst deploys the model it is important for the customer to understand up front the actions which will need to be carried out in order to actually make use of the created models.

Bravo!  Properly understood, analytics is not just cross tabs, visualization or programming, or even fancy statistical techniques.  It is a process intended to enhance decision-making.  The first step listed above, Business Understanding, is often the most demanding and, along with Data Understanding and Data Preparation, can absorb the bulk of a project’s time and energy.  CRISP-DM was not developed specifically for marketing research but is applicable to our business and drives home the point that analytics is a multifaceted, iterative process which involves more than narrow technical skills…or the ability to use a mouse.  Serious errors can occur anywhere, anytime and even simple mistakes can have important consequences.
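The iterative, loop-back character of CRISP-DM can be sketched as control flow. In the toy Python version below, every "phase" is a trivial stub standing in for real work; the point is only the shape of the process, not its content:

```python
def crisp_dm(raw_data, goal, max_iterations=3):
    """Toy walk through the CRISP-DM loop with stub phases.

    Business Understanding is assumed to have produced `goal`; each pass
    revisits the data phases, models, evaluates, and either deploys or loops.
    """
    for iteration in range(1, max_iterations + 1):
        profile = {"rows": len(raw_data)}                         # Data Understanding
        dataset = [row for row in raw_data if row is not None]    # Data Preparation
        model = {"mean": sum(dataset) / len(dataset)}             # Modeling (trivial)
        meets_goal = abs(model["mean"] - goal) < 1.0              # Evaluation
        if meets_goal:
            return {"phase": "Deployment", "model": model,
                    "iterations": iteration}
    return {"phase": "abandoned", "iterations": max_iterations}

result = crisp_dm([1.0, 2.0, None, 3.0], goal=2.0)
print(result)
```

Even this caricature makes the essential point: the model-fitting line is one of many, and the surrounding understanding, preparation, and evaluation steps are where most of the effort, and most of the risk, actually live.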

So, the next time someone even suggests that analytics is easy, I’d advise you to be on guard.  It just ain’t so.



1 Some other reactions I have come across are that analytics is “too complicated,” or that it isn’t needed, or that it doesn’t work.

2 For a brief summary of CRISP-DM see Wikipedia: http://en.wikipedia.org/wiki/Cross_Industry_Standard_Process_for_Data_Mining.  For a more in-depth look, see Data Mining Techniques: For Marketing, Sales, and Customer Relationship Management (Linoff and Berry), a popular, non-technical introduction to Data Mining and Predictive Analytics.