
Increasing intimacy through rapport 

With intimacy becoming the new buzzword in the industry, it is important to step back and think not only about the opportunities we have to promote closer relationships, but also about the barriers in front of us by way of past and current practices.

By Will Pirkey of iModerate

Reading the title, you might think this is a self-help article on relationships. In a way it is, but rather than the relationships you typically think about (romantic, family, friendship), this is about the relationship with your respondents. Much like our personal relationships, our connection with those who participate in research can be improved by increasing intimacy. Intimacy produces greater trust, understanding, and knowledge of another person’s experiences, perceptions, and perspectives. It’s what allows us to truly get to know another person or group of people. And isn’t that what we are after as researchers? If we are going to build intimacy, we have to build rapport with the people we are asking to take part in our research projects.

The concept of intimacy is receiving increased attention in market research. Recently, large companies such as General Mills[1] and Intel[2] have expressed their commitment to building more intimate relationships and achieving a better understanding of consumers’ stories as essential aspects of their research approach. Additionally, a recent study on the desires of CMOs cites the crucial need to build and maintain intimacy with customers[3]. Our blog has also recently opined about the importance of intimacy in a Big Data world.

With intimacy becoming the new buzzword in the industry, I believe it is important to step back and think not only about the opportunities we have to promote closer relationships, but also about the barriers in front of us by way of past and current practices. That is the thing with buzzwords: they sound great as ideas, but too often reality makes turning those ideas into action difficult. If we are to achieve greater intimacy in our research and with our consumers, then we need to start by thinking about what changes and new innovations are necessary.

One of the first things I noticed when I started working in online market research is the lack of a relationship we have with our respondents. We bid with sample vendors to supply respondents, who are sent through an exercise once and usually never engaged again. We take no time to get to know them. Even if we wanted to build longer-term relationships with respondents, the high turnover rates on most panels make it difficult. We also treat people as a commodity for which we can get the lowest price per complete. In turn, many respondents are engaged only by the incentive they receive for completing a survey. This can result in bad data (e.g. straight-lining, poor open ends) and does little to create intimacy. I have to ask: is this the best way to engage and build relationships with respondents? Can we find intimacy in a purely commoditized relationship? While incentives will always be part of the deal with respondents, can we deepen the relationship by creating more of a community feel and a reciprocity that goes beyond dollars, cents, and panel points?

Coming from a background in cultural anthropology, the impersonal relationships plaguing online market research were definitely a new experience. As an anthropologist, the backbone of any research started and finished with building strong relationships with the people participating in, or better yet collaborating in, my research. On day one of grad school, developing rapport was stressed as one of the most important aspects of successful, quality research. By rapport I mean building a close relationship through trust, communication, and emotional connection. I found that once people realized you were truly interested in learning about their experiences, opinions, and lives, they opened up and began to share on a much deeper level. While monetary incentives were used, they were not the basis of the relationship that developed. I realize that online market research and ethnographic fieldwork are two very different animals; however, if our industry wants more intimate relationships with the consumers in its research, we can learn a lot from the latter.

When conducting fieldwork, the connections I created were on a personal level. I thought of these people more like friends and family than research subjects (and I think they felt similarly toward me). This type of relationship allowed me to learn about the lives of the people I worked with (in my case, in Belize), and allowed them to learn about my life back home and my experience living in their community. It was a two-way street where knowledge flowed in both directions. This might be a larger point for another time, but through building greater rapport and more long-term research collaborations (which, I think, is the best way to think about the research process), the act of conducting research can itself be an important touch point. If the groundwork is laid properly, people begin to feel a sense of belonging, and intimacy is created beyond the research process. It may just be that conducting more thoughtful research leads to stronger relationships between brand and consumer, as people feel more connected to the companies they truly engage with.

Now for the difficult part: how do we actualize more intimate research in an environment currently structured to promote the opposite? Is it possible to develop true rapport in an online environment? What would it look like? How do we make respondents feel more like collaborators? Are there new ways to keep participants engaged, keep them engaged for longer periods, and make re-contacting them easier? As an industry, answering questions such as these is crucial for developing deeper, longer-term relationships.

While research and panel companies need to change how we sample if we are going to achieve greater intimacy, we can also design new research projects intended to engage participants in a more collaborative relationship. This will ultimately involve conducting more qualitative research, since it is the approach best suited to developing rapport. We should also ask: how would trackers look if we qualitatively gauged changing perceptions and attitudes with the same set of participants over a whole year? How could the R&D phase improve if we had future consumers collaborating in the entire process? Can we work together with research participants to create concepts, instead of showing them a set of previously developed options? Can we find new ways to connect on a more emotional level?

To wrap things up: if we want to cultivate greater intimacy with our research participants and consumers, we have to think of new and innovative ways to create more human relationships with them. This is essentially what rapport is all about: connecting with someone on a personal level in order to deepen the relationship and develop mutual respect and understanding. In my mind, you can’t separate intimacy and rapport; they’re two sides of the same coin. If greater intimacy is the goal, then rapport is the path to it.

 

[1] http://www.warc.com/LatestNews/News/General_Mills_seeks_consumer_intimacy.news?ID=32423

[2] http://www.nytimes.com/2014/02/16/technology/intels-sharp-eyed-social-scientist.html?nl=todaysheadlines&emc=edit_th_20140216&_r=1

[3] http://www-935.ibm.com/services/us/cmo/cmostudy2011/downloads.htm


80% Of Your Research Should Be With Your Customers

For most brands and services my research has led me to suggest that about 80% of research budgets should be spent researching customers.


 

By Ray Poynter

When I joined the market research industry in the 1970s, most market research was conducted with the whole market, i.e. with nationally representative samples. But that approach reflected the times. There were fewer products, fewer brands, and fewer channels for advertising. Markets were less mature, brands were establishing themselves, they often had genuine product differences, and market researchers were like explorers, mapping an unfamiliar land.

The late 1980s and the 1990s saw a shift to researching target groups and customers. Ad and brand tracking focused on target groups, and customer satisfaction focused almost exclusively on customers. Concept and product testing, which had previously used whole-market samples, started to focus on heavy users versus light users versus non-users; that is, most of the sample were users, even for brands with, say, a 10% share. This change in focus reflected changes in the marketplace: the number of brands and lines had grown, product advantages were proving illusory or temporary, and the battleground was shifting to logistics, sourcing, and image-based advertising.

Since 2000 the focus in marketing has moved on again. Most brands manage to achieve product, service and advertising parity. Organisations have become much smarter about calculating the cost of customer acquisition, lifetime value, and the problem of churn. For many brands the issue has become increasing share of throat, size of shopping basket, and total usage, ahead of growing the customer base.

Over the last 15 years, much of the business literature has focused on the use of customers, and co-creation, as a key source of competitive advantage. Authors such as Mark Earls (Herd) and Rijn Vogelaar (The Superpromoter) have highlighted that brands tend to succeed through social copying, rather than through non-users being ‘persuaded’ by marketing or advertising.

In many cases, perhaps most cases, the best way to grow a brand is to increase the number of customers who ‘love’ it, because these people will recommend it, use it ostentatiously, and offer it in group settings. In most cases, a new line, a new campaign, a new service will only succeed if existing customers respond positively to it.

According to reports such as the GreenBook GRIT study, the fastest-growing major new research methodology is the use of research communities, such as insight communities and MROCs. Given the shift from the whole market to customers in the wider research world, it is not surprising that most research communities focus on customers. There is a community of interest between a brand and its customers: both benefit if the products and services are improved. Customers know the strengths and weaknesses of the brand, and they are in a position to give insight into where the brand should go next.

What proportion of research should be with customers?

For most brands and services (I will mention some exceptions in a moment) my research has led me to suggest that about 80% of research budgets should be spent researching customers. This would include measuring satisfaction, usage, the largest part of the ad and brand tracking sample, testing product and service concepts, product and service refinements, and co-creating the future.

The 20% conducted with the wider market would include market sizing, mapping needs in the market, and competitive intelligence (for example why do users of competitive brands use those brands).

This 80:20 prediction is based on two key points:

  1. The brand is most likely to grow through social copying/recommendation/word of mouth.
  2. Most good ideas for the brand will be seen as good ideas by customers.

The exceptions?

The main exception to the 80:20 rule is where the main focus is to massively grow the number of users, either from a zero start (a product launch) or from a very small base. Examples of this situation would include Apple when it launched the iPod, iPhone, and iPad. When these products were launched Apple had no customers in these segments, and the users of existing MP3 players, smartphones, and tablets were not their primary target – so researching customers was not a viable strategy.

Sometimes a brand tries to regain customers: a sugar-loaded soft drink launches a diet version, a popular beer tries an alcohol-free option, a coffee brand tries a decaf option. In these sorts of cases there is scope to research non-users, particularly lapsed users, but success tends to occur (when it occurs) by appealing to current users who are considering defecting. For example, Diet Coke is mostly drunk by people who moved from regular Coke to Diet Coke, not people who moved from not drinking Coke to Diet Coke.

In summary

Most brands and services focus on customer retention, providing the right products and services to delight their customers. The thinking behind Fred Reichheld’s Net Promoter Score is based on data that shows that brands that do well have more people who recommend them. A key finding from Andrew Ehrenberg’s double-jeopardy model is that dominant brands have customers who are more loyal.

Most market research, for most brands, most of the time, should focus on customers. I believe that this customer focus is one of the key reasons why insight communities are currently so popular. Insight communities are not pushing brands to focus on customers; the focus on customers is pushing brands and organisations to use communities in order to get closer to their customers.

So, what are your thoughts?

I’d love to hear your comments, or perhaps you can vote on the poll below.

 



 


Contextual Infographics Made Possible

Instead of delivering periodic PowerPoint reports, new-age market research organizations will start developing more ongoing collection mechanisms and processing data in a more continuous fashion.


 

By Rachel Grassity & Anup Surendran

We were really happy to see this article by Ron Sellers (http://www.greenbookblog.org/2014/01/28/context-is-the-key-to-research/) on the GreenBook blog. We are true believers in not only telling a story with data, but telling as accurate a story as possible. Quoting Ron: “It’s important to understand the context you need before you begin to measure anything through research.” We believe in an approach some people call agile, or iterative, market research. Research differs by the way you ask your questions, by context, and by market position. Once you start understanding the context in which the research is done, the insights you derive from it are much more accurate.

Market research is sometimes defined as a service, a strategy, a discipline, or even the way you use a collection of tools. Some businesses define it as a core business capability. The reason is simple: business-objective-driven market research yields results. Businesses have to adapt to market conditions, and market research has to help with both the adaptation and the pace of change. What we refer to as iterative research is radically different from traditional market research in terms of cost and lead time. The essence of iterating is speed, which gives you the ability to add more related or contextual data quickly. For an industry that prides itself on giving clients guidance on where things are headed, market research has been alarmingly slow at adapting to the pace of today’s business cycles. Practices for designing, implementing, and analyzing market research have moved forward only when pushed by the inevitable forces of technological advance.

What we have been doing at SecondPrism is providing the ability to iterate with data and visualize it quickly. For example, you could upload an SPSS file and easily visualize the data on the web and on mobile. After that, you can add additional context (additional data), which enables your stakeholders to compare datasets and actually interact with the data.
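In spirit, that iterate-and-add-context workflow can be sketched in a few lines. The data, column names, and waves below are illustrative inventions, not SecondPrism’s actual pipeline; a real SPSS export would be loaded with a reader such as `pyreadstat.read_sav` rather than typed inline:

```python
import pandas as pd

# Wave 1 of a tracker, as it might look after loading an SPSS export.
wave1 = pd.DataFrame({
    "respondent": [1, 2, 3, 4],
    "brand_rating": [7, 5, 8, 6],
})

# Added context: a second wave collected later, e.g. after a campaign.
wave2 = pd.DataFrame({
    "respondent": [1, 2, 3, 4],
    "brand_rating": [8, 6, 8, 7],
})

# Stack the waves with a label so stakeholders can compare the datasets.
combined = pd.concat(
    [wave1.assign(wave="pre"), wave2.assign(wave="post")],
    ignore_index=True,
)
summary = combined.groupby("wave")["brand_rating"].mean()
print(summary)
```

The point of the iteration is that each new dataset is just another labeled slice appended to the same frame, so comparisons arrive as fast as the data does.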

Traditional market research efforts will slowly make way for contextual insight. Instead of delivering periodic PowerPoint reports, new-age market research organizations will develop more ongoing collection mechanisms and process data in a more continuous fashion. They will tailor reports and analysis for different roles across the organization, letting people spend more time on their roles’ objectives. The reality is that most people in an organization have very limited visibility into what’s happening in the marketplace. Providing them with relevant data (even data that isn’t necessarily statistically significant) will help them make better, faster decisions.

Relative-sized, infographic-style elements (what we call Contextual Infographics) are another thing we have made possible. Here are some examples: a branded dashboard (https://ebony.secondprism.com/s/cl) and a dashboard with an infographic look and feel (https://demo.secondprism.com/s/d4).

Infographics have always been sexy, but now we have made these types of visualizations reflect the actual data underneath, instead of a designer’s interpretation. Contextualizing can also mean providing an experience that reflects the brand, which now seems to be standard practice among market research companies.

 

About the Authors: Rachel Grassity is a Customer Experience Specialist at SecondPrism. Anup Surendran is the co-founder of SecondPrism.


Friday Rewind: Google, Twitter & Facebook… Oh My!

In 2012 the MR space was buzzing with the news of Twitter, Google, and other tech platforms entering MR. Nearly two years later, Facebook, Rakuten, and Twitter (again) have announced major research offerings. The message is the same now as then: MR must adapt or face marginalization.

Editor’s Note: With the news this week of Twitter’s launch of a new “everyday moments” research tool, Facebook’s ascension as THE de facto global media measurement platform and its deal with Nielsen, and the acquisition of AIP Panels by Japanese online retail giant Rakuten, I was reminded of this post, which I wrote in October 2012 when another spate of announcements signaled the movement of tech companies into the MR space. We’re now witnessing the next phase of this trend unfold, and the message to the MR industry remains the same today as it was almost two years ago: “The bottom line is if you are a market researcher, and especially if you are in a senior role within a supplier organization, you must adapt and get ahead of the curve or face marginalization and eventual irrelevance.”

I’m not particularly happy about being right on this one folks, but here we are.  I played amateur futurist this week at an ARF Webinar on The Changing Face of MR and discussed the entrance of tech platforms into the marketplace as alternative suppliers, as well as how a host of new technologies are primed to further transform the insights function. If you’re interested in my musings on what else we can expect in the next few years, here is the slideshare of my presentation.

 

With all of these signals in mind, I think this is a post worth re-reading: the industry is changing very fast indeed, and the need to adapt to the new world we find ourselves in is more important now than ever.

 

By Leonard Murphy, originally published October 13, 2012 

And another piece of the future of research puzzle is in place. Twitter announced today that they were partnering with Nielsen to introduce brand surveys directly into Twitter. As usual, my friends at Research-live had the scoop:


Twitter has announced plans to run “Twitter Surveys” on behalf of brands in the format of “Promoted Tweets”. The social media site will run the initiative in conjunction with Nielsen. In a blog post today, Joel Lunenfeld, Twitter VP for brand strategy, says that the surveys are currently being tested by a small set of advertisers. Twitter plans to make it available to additional partners early next year.

The service is being marketed as an addition to the site’s advertising tools for its advertising partners and is the first to offer brand impact measurement for Twitter.


In his post Lunenfeld says: “With 400 million tweets occurring every day throughout the world, consumers and brands on Twitter have a unique opportunity to listen and engage in a variety of topics and conversations. As marketers invest in opportunities to connect with users through Twitter’s Promoted Products, we are focused on delivering tools to help brands measure and understand the value of those campaigns.

“Brand surveys will appear to users just like a Promoted Tweet — right within the user’s timeline on both mobile devices and desktop. Users may see a tweet by @TwitterSurveys, inviting them to fill out a survey directly within the tweet itself.

“Building on Twitter’s mobile heritage, we’re giving brands the ability to deliver and measure the impact of mobile and traditional desktop campaigns through these surveys. This is a native experience for the user, and we believe it will give brands better insights to determine purchase intent, overall awareness, and other advertising metrics and analytics that can lead to greater engagement on Twitter.”

So why do these companies need each other?  For Twitter I think it’s primarily an issue of convenient sales channels. Nielsen is much better positioned to sell through on this; they add credibility to the effort and they have the relationships in place, saving Twitter a lot of work. It’s like printing money for Twitter.

For Nielsen they get access to another “exclusive” data channel; this dovetails well with their similar efforts with Facebook, and it’s a step closer to their goal of being the primary conduit for all things related to media and brand measurement.  It’s really a match made in heaven.

Congratulations are in order to Nielsen for showing real vision and leadership here. They pioneered a similar model a while back with their partnership with Facebook, and in the past week or so have also made announcements embracing mobile media measurement. There is a reason they hold the position they do in the industry, and this is one more example of it. It’s going to be very challenging for their competitors (regardless of size) to usurp their leading position, not because of their product and service portfolio but because of their business model and organizational architecture. They have the right mix of leadership talent, vision, organizational flexibility, clout, and funding to adapt to a changing market and build new revenue streams while others decline. As the industry struggles to find its path in this new world emerging around us, we could do far worse than emulating some (but by no means all) of the qualities of Nielsen.

So, we now have Google, Twitter, and Facebook with research offerings, and I suspect LinkedIn will make another foray into the space soon. As if that weren’t enough, this week we saw the launch of Verizon Insights, a new employee sentiment product by Wayin, as well as a new consumer-facing and research-centric “data bank” offering by Tesco.

As if the entrance of new technology-centric data providers were not a loud enough clarion call, we also have business and strategy consulting firms increasingly “in-sourcing” their insights functions or establishing close partnerships with new research providers to offer their clients a robust, business-issue-focused research capability. Two key pieces of the MR value chain, data collection and analysis, have been broken off by multiple new players staking their claim to various pieces of what was previously the domain of market research.

The game has changed.

Regular readers of this blog should not find any of this surprising; we’ve been predicting shifts like this for some time and warning that the pace of change is only accelerating. That isn’t meant to be an “I told you so!” nor am I even particularly pleased that we called it because it’s not good news for most of my friends and colleagues in the mid to long term. At this point all I can do is ask: do you get it yet?

I was chatting about the Twitter/Nielsen deal with my #mrx tweeps earlier, as well as over the phone and email with my colleagues Gregg Archibald and Jason Anderson. Here is an excerpt of what was being discussed on twitter and a few quotes from Gregg and Jason:

Jason: “That explosion you just heard was the industry BLOWING ITS MIND. Though we knew this was inevitable.”

Gregg: “So why do we have such a hard time predicting our own trends when we do it so well for others?”

That is a critical question here. The few analysts who focus on the MR industry (myself, Robert Moran, Ray Poynter, Cambiar, Forrester) as well as most of the blogosphere and many speakers at conferences have been urging the industry to step back, take a look at the current state of play in the wider world of the digital era, and assess where the real value of market research is. The idea that MR is the collector & keeper of data isn’t our value proposition. That function truly has been disintermediated to a very large degree. Yes, specialty functions will continue to exist and some will even thrive (any type of biometrics/neuromarketing, emotional measurement, text analytics, ethnography, gamification, etc… look likely to do well), but ASKING, OBSERVING, LISTENING, MONITORING, TRACKING, METERING, & ANALYZING simply are no longer owned by traditional MR. We do and will continue to play a role in those things, but building or maintaining a supplier business model based on them is going to be a zero sum game.

So where is the white space? What path can MR firms walk to be viable in the future? The winners will be firms that offer the methods I mentioned above, but a few more characteristics come to mind:

  • Own proprietary data sources
  • Have deeply integrated norms or benchmarks
  • Offer technology that collects and delivers data
  • Pure play insight consultancies
  • Primarily focused on qualitative research
  • High end analytics and data modelling
  • Specialists in niche markets

The future of the market research industry simply is NOT based on data collection as a driver of business for the traditional full-service firms, and that is a big problem for the majority of suppliers. It’s also going to be a big adjustment for clients used to the status quo. Everything from employee profiles to business models, research designs to budgets, and analysis to business impact will continue to change as a result of this transformation. We can engage in the age-old arguments about sampling and science all we want, and it will not mean a thing; the genie is out of the bottle.

The bottom line is if you are a market researcher, and especially if you are in a  senior role within a supplier organization, you must adapt and get ahead of the curve or face marginalization and eventual irrelevance.

Nielsen saw this truth a while back and has been working to build a new model. The rest of the industry would do well to learn from that lesson and do whatever it takes to learn how to compete or cooperate with the new players that soon may dominate the industry.


Data Driven Marketing: Knowing The Consumer And Then Doing Something With It

Marketing research needs to start thinking at scale. This will change how we research customers and profile brands from a small number of segments intended to inspire brand ideas to a large number of targetable audience members.

By Joel Rubinson

The former chief marketing officer at Procter and Gamble, James Stengel, said in 2012 responding to a question about how to achieve brand growth, “Get to know the consumer and then do something with it.”  Marketing research has always had more success with the first part of this statement than the “doing something with it” part.

When marketing research conducts a survey among 1,000 people, you get to know 1,000 people. Those are important insights, but they are limited to what you can ask in a survey, and then of limited use for media targeting, where placement strategy always seems to deteriorate to the lowest common denominator of age and gender. In fact, to be a hard grader: if standard research doesn’t demonstrate a repeatable ability to improve advertising productivity or innovation success rates, there really isn’t much proof that the insights are worth the billions of dollars we pay for them.

Marketing research needs to start thinking at scale, and this will change how we research customers and profile brands: from a small number of segments intended to inspire brand ideas to a large number of targetable audiences intended to drive advertising ROI, from brand attribute profiles to response profiles, from survey tracker measures to integrated brand KPIs.

Big data promises to be the game changer. Marketers can now get to know, in actionable ways, tens of millions of consumers who interact with their brand, by using big data and data science approaches: leveraging the transactional and digital data you naturally capture, and the social, third-party, and sensing data you can easily bring in. Tens of millions? Yes. When you consider all the consumers who buy something from you, visit your website, interact with you in social media, or have profiles in third-party databases that can be used for lookalike modeling, we have found a way to scale research and analytics and do something with it.
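As a toy illustration of the lookalike idea mentioned above (the features, sample sizes, and scoring rule here are my own simplification, not any specific vendor’s method): score each prospect by similarity to the “centroid” of known customers, then target the most similar slice.

```python
import numpy as np

rng = np.random.default_rng(7)

# Behavioral feature vectors (e.g. visit frequency, basket size, category mix).
# In practice these come from transactional, digital, and third-party data;
# here they are synthetic.
seed_customers = rng.normal(loc=1.0, scale=0.5, size=(500, 4))    # known brand buyers
prospects = rng.normal(loc=0.0, scale=1.0, size=(10_000, 4))      # third-party profiles

# Lookalike scoring: cosine similarity of each prospect to the seed centroid.
centroid = seed_customers.mean(axis=0)
scores = prospects @ centroid / (
    np.linalg.norm(prospects, axis=1) * np.linalg.norm(centroid) + 1e-12
)

# Target the top 5% most similar prospects.
cutoff = np.quantile(scores, 0.95)
audience = prospects[scores >= cutoff]
print(len(audience))
```

Production lookalike modeling typically uses supervised models over matched profiles rather than a single centroid, but the shape of the computation is the same: a seed audience in, a ranked list of prospects out.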

And what WILL a marketer do with this knowledge?

  • Strengthen brand-consumer relationships via hyper-relevant content and experiences
  • Improve short-term advertising ROI
  • Basically, establish once and for all the value that the marketing function brings to the enterprise!

So why are marketers behind where they need to be? Most marketers know they need to leverage their data assets but have not made much progress. Different data streams are usually siloed, poorly matched, and often unstructured, so marketers analyze one data stream at a time, and it feels impossible to get a handle on synthesis. They fail to capitalize on opportunities to capture important data via tagging, and usually do not database all the naturally occurring experiment-style results about digital marketing effects. Most importantly, they have not worked effectively with IT to link business use cases and marketing activities to data streams.

Consumer knowledge based on using big data to scale insights into the tens of millions changes the ways that marketing works:

  • Personalization of content
  • Ad optimization (right message, right screen, right time via programmatic)
  • Optimize programmatic ad bidding across brands in a portfolio
  • Cross-selling products and service offers based on lookalike modeling and predictive analytics
  • Retail optimization (right product in the right store)
  • Powerful marketing research insights by tracking the brand across all signals of brand health
  • Marketing ROI knowledge management

I am starting to see a number of players in the marketing research ecosystem emphasize platforms that integrate data sources, so where this is headed is unmistakable. Rubinson Partners and IIS, a top-100 big data technology solution provider, have mapped about 25 data sources to business purposes and developed a process to move you from point A to point B. We encourage clients to consider these data sources as organized by level of sophistication, or readiness to fully compete based on data: understanding millions of consumers and then doing something with it via personalized and precision marketing approaches.

With these shifts in mind, my strong suggestion is that your organization move quickly to assess its readiness to compete on data and to build a plan for taking your enterprise to the next level.

Share

The Investment Outlook for Insights: A View from the Capital Markets

The U.S. venture capital funding market has been on a tear. But what is going on specifically in the Marketing Technology & Services sector?


 

Editor’s Note: One of the major sources of insight into thinking about the future of any industry is watching capital markets. It’s easy to get tunnel vision when our heads are down working to complete the tasks in front of us, but that is a dubious luxury during times of rapid change. Where money is flowing in and around our industry should be something all leaders in our space are cognizant of on a regular basis.

Following what investors are betting on (especially large VC firms) as well as M&A activity is a powerful tool for foresight. In the MR space there are only a handful of firms that pay attention to this: Cambiar Consulting (their Capital Funding Index is required reading), Forrester, and investment bankers England & Company. I also pay attention to Outsell and their work around market sizing and segmentation, which helps gauge the total size of MR and closely aligned or overlapping industries. There may be a few more, but these are the folks I am aware of and pay attention to.

At IIeX in Atlanta we invited Simon Chadwick from Cambiar, Harry Henry of Outsell and the author of today’s article, Corey Luskin of England & Company, to present on their views of the marketplace and where it is going. They were amazing sessions, chock full of great information for everyone in the MR value chain, especially entrepreneurs and investors. I asked Corey to build on the themes he presented in Atlanta in what I hope will be the first of a series of posts exploring this macro view of trends from an investment perspective. The result is today’s post: it is an important one and I think you’ll enjoy it.

 

By Corey Luskin, England & Company

The U.S. venture capital funding market has been on a tear. The first quarter saw roughly $10 billion in VC investments across all sectors, making it the busiest quarter in several years. The second quarter shattered this level with almost $14 billion invested. It’s fair to say that the early-stage investing scene is brisk.

But what is going on specifically in the Marketing Technology & Services sector? At England & Company, we track this activity. It can give advertisers and marketers a feel for the pace of new innovation in the industry. This directly impacts how these professionals will do their jobs in the future. The data also provides new technology and services companies (who might be seeking investment or acquisition) with an understanding of where the commitments are being made.

New venture investment into Marketing Technology and Services companies has mirrored the overall acceleration. In 2014 so far, more than $1.7 billion in new venture capital has been invested into early-stage companies that, one way or another, are trying to transform the practice of marketing.

This $1.7 billion practically matches the total level in all of 2013 and we’re only halfway through the year. This year will inevitably cap off a growth trend that has been underway since the markets began their recovery in 2010.

 

Insights Investment Trends

 

Who received all this capital? It went to almost 120 companies ranging from seed-stage startups through later-stage, pre-IPO success stories. Common sense tells us that there is no practical way for marketers to interact with this many tech and service companies, particularly since the incumbents are also innovating. Many of these 120 will fail, some will be acquired and consolidated, and a few will emerge as new leaders in their respective marketing niches.

But what sorts of new companies, specifically, are we seeing in this mix? It’s an assortment we can broadly drill down into. It’s impossible to delineate categories that neatly separate every company, but we have been grouping them as follows:

  • Advertising Technology: Companies that are primarily concerned with display and search advertising.
  • Analytics, CRM, Database: Companies whose offering is primarily marketing data and/or platforms for managing and analyzing that data.
  • Social Analytics & Marketing: Companies that provide analytics, campaign management or other services specifically within the major social media platforms.
  • Customer Experience: Companies that address specific aspects of the customer interaction such as Customer Service, Product Recommendations, Loyalty, etc.
  • Marketing Automation: Software companies that deliver platforms for the planning, execution and evaluation of complex, multi-channel marketing campaigns.
  • Content Marketing: An emerging category of companies that help advertisers and marketers create, collect, publish and evaluate content at scale.
  • Market Research: Companies that, one way or another, are in the business of asking consumer questions, measuring consumer behavior in a direct fashion, or providing platforms to facilitate these processes.

Anyone who hasn’t been living under a rock will expect that these categories are not all getting equal treatment and, in fact, this is exactly what the numbers show:

 

Market Research related investment levels

 

 

A few things emerge from this segmentation:

  1. Despite various challenges, Ad Tech remains a very strong category. Many feel the channel is over-invested; privacy is an open issue; there is evidence regarding “banner blindness”, “viewability” and other forms of ineffectiveness; and Google, with its grip on the search market, periodically changes the rules on providers, sometimes whipsawing them right out of business. Still, the promise of efficiently targeting consumers and deploying ad dollars is great enough to outweigh these concerns. Also, significant capital is being dedicated to the retooling of infrastructure from desktop to mobile. AdTech remains one of the most active niches, although the 2014 contribution is somewhat skewed by a single, large investment.
  2. As far as VCs are concerned, Big Data lives up to the hype. While there is still no universal definition for this label, we perceive some loose consistency in the kinds of companies that receive it. Many companies in our Analytics and Social categories could be thought of as “Big Data” companies. The specific applications include influence analysis, social CRM, online psychometrics, social listening, data integration, predictive analytics and numerous others. Collectively, they signal a forceful trend for the Market Research community. If research and insights professionals are being pummeled for “better, faster, cheaper”, they can thank Big Data for the challenge. 
  3. Content Marketing has emerged as an important technology niche. While the idea of “content marketing” is as old as the hills, the context has changed: With consumers in a constantly-connected state, brands and agencies are turning to new technology and services to keep up with the volume requirements and “refresh” rates inherent in social media. They also need new tools to try and measure the effectiveness of this content and adjust campaigns accordingly.
  4. Market Research is pretty small. The industry is still big, but in terms of new investment, there just isn’t a lot going on. Now, some might lump one or two of our other categories under the MR umbrella and argue that there is a great deal of new activity. That’s fine. But the distinction I drew earlier is useful. From the investment community’s perspective, “Market Research” connotes a company that works with a sample group to directly ask questions or measure behavior. While this is still important (and probably underappreciated), it isn’t demanding the kind of new capital that we’re seeing in the other categories. There have been very few major investments in the space over the past few years.

This divergence between “old” and “new” is also evident in the market when you look at M&A activity, Private Equity activity and public company valuations. We covered these topics at the IIeX Conference in Atlanta a few weeks ago and can come back to them for an update another day.

Share

Lessons From Amazon vs. Hachette: Focusing On What Counts

To win any game, you have to know what counts. Then, you have to execute better than everyone.


 

Editor’s Note: Market Research is a business, and one facing many changes from multiple angles in the marketplace. It’s important to be reminded that the keys to success in business are fairly universal, and in today’s post master strategist Larry Gorkin reminds us of a few of those core principles, using the recent standoff between Amazon and Hachette publishing as an example. It’s good stuff.

 

By Larry Gorkin

Leaders that want to win long term need to clearly define their business’ key success factors, and consistently deliver them with excellence. Failure to do so can undermine the business’ core strategic intent, weaken its competitive position, and erode results. That’s the lesson from the current stand-off between Amazon and book publisher Hachette.

For those unfamiliar, Hachette is a leading global book publisher, representing popular authors like Malcolm Gladwell, James Patterson, and Stephen Colbert. Its dispute with Amazon centers on the pricing and other terms under which the online retailer will sell and promote its books. Amazon wants terms more favorable than it has had historically, to reflect its dominant size and the changed economics of e-books.

With the two sides unable to reach agreement, Amazon has delayed shipping Hachette books, refused to take pre-orders on upcoming releases, and reduced marketing promotions. While Hachette’s business has suffered, the dispute has been a public relations black eye for Amazon. Critics have accused it of abusing its market power, and hurting both customers and authors.

Amazon has responded with a hard line, positioning the negotiation as part of its on-going effort to give consumers the best possible prices and service. The company has apologized for the inconvenience, even suggesting that customers buy impacted books elsewhere until the dispute is over.

What’s important here is the clarity with which Amazon has identified and executed on its key success factors. Amazon needs the best possible terms from every vendor to maintain its own competitive price position. As part of this, they see continued opportunity to disrupt the book market, particularly with the growth of e-books. They are willing to take a short term hit to win this battle long-term.

Hachette has similarly crystallized what counts, albeit from a clearly defensive perspective. The company sees Amazon as a fundamental business threat, and has defined preservation of its traditional business model as essential. From this view, Hachette may be focused on a critical issue, but its goal of preserving the status quo may not be realistic or achievable.

Of course, the idea that companies should define and execute against the key requirements of their strategy is not a new one. Wal-Mart established supply chain leadership as a foundation of its strategy and continuously invested to maintain advantage there. Steve Jobs made ease of use and elegant design the basis upon which Apple would compete; the company still benefits from that.

Yet, many companies fail to crystallize their key requirements for success. Others know what counts, but lack the organizational discipline to deliver them with excellence.

Using today’s book market example, Hachette sees its future success dependent on preserving its historic business model with Amazon and other retailers. But, I’d argue that’s mistaken. Instead what is essential for Hachette is to create a new business model that reflects the reality of today’s changed and evolving book market. Even if Hachette strikes an acceptable deal with Amazon, their long term outlook won’t change; they may survive, but won’t thrive.

Importantly, a company will have multiple success drivers, all of which should flow from its current strategy and market reality. Both the strategy and success imperatives should change over time. And once defined, leaders must ensure the resources are in place to deliver them successfully.

Given this important issue, here are five ways to identify and focus on what counts for your business.

1. Identify Dependencies – What does your strategy depend on? What must happen for the strategy to succeed? What elements are in place versus missing?

2. Define Drivers – What’s driving growth in your market and business? What capabilities and assets are needed to support that growth? Where are your strengths and gaps?

3. Examine Changes – What is changing in your market? What are the implications for your business? What are the opportunities to target and threats to defend?

4. Evaluate Progress – Where do you stand on previously identified imperatives? What gaps remain? How should priorities change?

5. Align Resources – What resources are needed to deliver the key requirements? What gaps are there? How can they be filled?

To win any game, you have to know what counts. Then, you have to execute better than everyone.

Questions: Do you have a clear definition of what counts for your business? How well do you deliver against those factors? How could you do better?

Share

Volunteered Personal Information (VPI) and valuing your personal data

Volunteered Personal Information (VPI) plays a critical role in moving the debate about permission marketing forward. It’s a shift that requires everyone to see their profile as an asset – where individuals are actively engaged in valuing and protecting their own data.


 

Editor’s Note: Presenting the flip side of the data privacy debate started with yesterday’s post by ESOMAR, today we look at a potential solution for privacy hawks: the personal data economy. This model empowers consumers to leverage their personal data as an asset via a variety of online exchange models, and it holds much opportunity for researchers, since data collection, aggregation, and synthesis are by default permission-driven. My friends at Pureprofile (full disclosure: I am on their advisory board) are one of the few companies with their roots in research (they started as a panel provider) that has gone far down the path of embracing this new paradigm and thinking through the value proposition for consumers. With that in mind, I thought it would be worthwhile to get their view on how this shift looks today and its impact on privacy in the future.

 

By Kim Anderson 

Data anxiety is normal. When Google posts a quarterly profit of $15.4 billion, we shake our heads in disbelief. Just how much money are they making from selling big data to advertisers? And not just big data, but data about us – obtained in exchange for the ‘search’ service most of us use daily. Personal data is a hugely successful and growing asset – one that many brands are profiting handsomely from.

However, what interests us at Pureprofile is not so much the data being traded behind closed doors (such as the data sets used in programmatic marketing). For us, it’s the information (data) people are happily volunteering that’s really interesting.

Volunteered Personal Information (VPI) plays a critical role in moving the debate about permission marketing forward. It’s a shift that requires everyone to see their profile as an asset – where individuals are actively engaged in valuing and protecting their own data.

There aren’t too many consumers who don’t want to participate in digital life to some extent. Whether it’s offering our details in exchange for a discount, receiving convenience in the form of online buying, or sharing our latest thoughts and moments via Facebook and Instagram. What we tell the world about ourselves through forms, sign ups, petitions, participation and personal web pages is a powerful thing. It allows us to express ourselves and have a voice. It also allows advertisers or anyone in the business of selling, to do their job with greater precision.

VPI is good because it moves us from interruption marketing (TV, radio and pre-roll video) to content such as newsletters, reports, brand monographs or books that are delivered as a result of permission being gained from the individual.

Innovation in content marketing of value has risen greatly in the last couple of years, as organisations recognise that focusing on helping, not selling to, their consumers is the key to obtaining VPI.

The new market for data

Peak consumer research body Ctrl-Shift has identified a huge ‘shift’ back to people power in the Personal Information Economy marketplace. Through extensive research, they’ve identified key trends that place consumers in the driver’s seat.

First and foremost they see increased agency for the individual – painting them more as collaborators than consumers. Audiences of the future play a powerful role in reclaiming their data and their right to participate in the free market.

Secondly, they are keen to communicate that our biggest currency as a consumer is not just money, but critically, our time and attention. In this new consumer paradigm everyone’s data profile is regarded as an asset. That is, something of value to be traded, given, or volunteered in return for something tangible and meaningful to the individual.

Profit flows from data

We know that data brokers make their money out of selling individuals’ data. That data mostly lacks legitimacy due to its very nature – the fact that it was not volunteered by the individual. Collected without informed consent, and used out of context, it fast becomes irrelevant and devoid of utility – pushing undesired marketing messages at people who rapidly reject them.

Ctrl-Shift’s research demonstrates that by giving people power and more control over online identity and personal data, we create immense new value – within a decade, worth ten times today’s Google business model.

This impacts the format and rules of this market, and most organisations are scrambling to prepare for the transformation. Brands must find new ways to engage consumers who seek to better manage their data – adapting the services they deliver and how they create value for their audience, adopting new strategies for managing customer information, and all the while still driving efficiency and growth for their business.

New flows of personal data will take over and emerging fourth-party services (i.e. those on the side of the individual) will be widely deployed. Businesses need to understand this framework, or be left behind as consumers become advanced in their knowledge of storage, management and data privacy.

Passionate about the role of fourth-party services that place control back in consumer hands, Ctrl-Shift predicts enormous growth in decision support services that help individuals research choices and manage their affairs using digital technologies. Examples range from storing your own data securely and blocking advertising to managing identity and providing insights. Within the next ten years, they believe this will evolve into a new UK market for personal data worth £20bn a year.

Indeed, many new services are poised to enter the marketplace in 2014, and some have already emerged – MiiCard, Mydex, nFluence, Pureprofile, Qiy, Reevoo and VisualDNA. The one thing they all have in common is that they use information volunteered by the consumer to add value to the consumer, but also to address particular challenges, stripping out high levels of cost and waste.

What’s next?

With this in mind, Pureprofile believes customers will transform to become active participants in the world of brands, rather than passive consumers.

A great many positives will flow from this data economy –  a landscape where people will be equipped to store, manage and selectively share their own data. A marketplace where consumers (and brands) will greatly value people’s time and attention. We see this evolution as fair and just.  There are many possibilities and implications of users being empowered to better manage their personal data.

One thing is for sure – the value generated will also be shared with the right person. The individual who shared their data.

Share

Is A Digital First World War Looming – And Would We Survive It?

The free-flow of information is critical, not just for us as market, social, and opinion researchers but for the whole of society. By working together, we can ensure that the smoke over EU/US Safe Harbour does not turn into a real fire.


 

Editor’s Note: Regardless of your position on digital privacy laws, the reality is that many legislative bodies are enacting laws that are often complex, contradictory, and inconsistent. This is new territory for us all, and as an industry that is based on handling consumer data it is very easy for insights pros to get caught in the morass of these disparate regulations. Our trade organizations, most notably ESOMAR and the Global Research Business Network (comprised of most of the national trade orgs around the world), are attempting to help MR firms navigate the minefields of the rapidly changing digital privacy landscape.

Today’s guest post by Kim Smouter of ESOMAR is an example of the type of leadership and assistance they can provide to researchers who may be (and rightfully so!) confused by the various laws we need to comply with in different parts of the world. We’re very pleased to post it here on GBB and hope you find it helpful and interesting.

 

By Kim Smouter

For centuries, European and US historical paths have been inextricably linked. In war and in peace, Europeans and Americans have found many reasons to trade, talk, and even wage war together as allies in a tireless effort to impose a shared worldview built on the principles of democracy and self-determination.

Between the clichéd stereotypes is mutual admiration and a fascination with each other’s histories and achievements. Few societies in this world are quite so intertwined.

Yet the whole topic of personal privacy seems to be a case where the bonds of brotherly love are increasingly giving way to mutual suspicion, jealousy, and a desire to impose a world view designed and defined by “one camp.”

The situation is not only driven by economic concerns but also by real fundamental values resulting from differences in historical, cultural, and social experiences. One does not need to look very far to see how visible the cracks of discord are when Europe responded to the revelations of the US spying on its allies by calling for immediate changes to the EU/US Safe Harbour framework in place since 2000.

The ripple effects of the loss of the EU/US Safe Harbour framework should not be under-estimated. The framework was put in place to enable transfers of data between the EU and the US. It was an important legal fix, as EU data protection law permits data transfers outside of Europe only to countries offering equivalent levels of protection (adequacy), or through complex company contractual structures which most small and medium enterprises find difficult to implement.

The US has adopted a very different data protection approach compared to the EU’s own global coverage approach. It has elected to respond only in sectors where there are specific concerns, using primarily consumer protection and unfair commercial practice law as the legal basis for action, with the Federal Trade Commission (FTC) as the enforcement body. This sector-specific approach to privacy and data protection is considered inadequate in light of the EU’s global coverage approach, and it is only through the EU/US Safe Harbour scheme that data has been able to flow freely between the two markets. The scheme offers a voluntary self-certification model whereby US companies commit to providing certain levels of redress that comply with the requirements of EU law. Without the Safe Harbour, most cloud services, and any projects involving the transfer of data out of the EU into the US, would be unable to operate legally.

The Snowden revelations woke Europe to the fact that its citizens benefited from lower levels of protection (and particularly levels of redress in the event of abuse from either public authorities or companies) on US soil. Additionally, it was also clear that the EU/US Safe Harbour had been laxly enforced in recent years.

So when Europe’s leading officials on data protection called for the strengthening of the EU/US Safe Harbour scheme or its suspension, leading companies on both sides of the ocean were deeply concerned. These calls emanated from numerous places: from the European Commission [the closest thing Europe has to a federal government], from the European Parliament [its Congress], as well as from the European equivalent of the FTC – the Article 29 Working Party.

The EU followed up by presenting a shopping list of recommendations to its US “partners,” expecting the issue to be resolved by this summer. These recommendations included requirements that (1) privacy policies be disseminated to the public at large and (2) US regulatory authorities step up their non-compliance enforcement and beef up the redress options offered to EU residents whose data is sent to the US for processing.

The FTC’s first response was to step up enforcement action, taking 12 companies to task because they had failed to renew their EU/US Safe Harbour certificates (which must be renewed every year) while falsely claiming compliance. The companies have been hit with 20-year orders and face additional civil penalties if they fail to meet the orders’ requirement not to misrepresent their compliance with schemes like the EU/US Safe Harbour.

At a recent meeting of ESOMAR’s Legal Affairs Committee, companies present at the table were asked whether the loss of the EU/US Safe Harbour scheme would impact their business. Every company around the table agreed on how important the EU/US Safe Harbour is to enabling market, social, and opinion research to be conducted effectively across all our operating bases. This is especially important to small and mid-sized companies, who stand to lose the simplified processes that the EU/US Safe Harbour affords them, saving them from having to make major investments in legal support to draft and implement the other, more burdensome schemes available under EU law.

Whether the FTC’s recent enforcement actions will appease Europe remains to be seen, but they are clearly the latest in a series of tit-for-tat actions that highlight the differences in approach and attitude towards privacy and data protection on the two sides of the Atlantic. There is not much market research can do about this, but there are some concrete steps that research companies, and the associations tasked with representing them, are and should be taking.

Market, social, and opinion research companies must be careful to ensure that when transferring data between the EU and the US, they do take the time to self-certify through the EU/US Safe Harbour and to renew their certifications every year. Ensuring that a company’s entire supply chain is EU/US Safe Harbour compliant is also extremely important (this can be guaranteed through contracts and periodic audits). Offering comprehensive redress in the face of respondent complaints or requests to remove their personal data is also an extremely important requirement for self-certified companies. Losing your EU/US Safe Harbour coverage would mean that the data transfer is illegal and could mean facing legal actions both in the EU and in the US.

ESOMAR, and partner national associations on both sides of the ocean are also working hard to remind legislators of the importance of getting the EU/US Safe Harbour right and not escalating the situation into a full digital world war where we would all lose. Don’t hesitate to let us know how your companies would be affected by the loss of such a scheme so that we can reinforce our messaging to decision makers.

The key decisions that societies make, both in the private and public sector, are increasingly driven by data, both big and small. The free-flow of information is critical, not just for us as market, social, and opinion researchers but for the whole of society. By working together, we can ensure that the smoke over EU/US Safe Harbour does not turn into a real fire.

 

Kim Smouter is Government Affairs Manager at ESOMAR. For more information on legislative developments in your region visit www.esomar.org/government-affairs

Share

#MRX Top 10: Visualizing World Trade, Consumer Attitudes and Digital Cameras

Of the 2,347 unique links shared on the Twitter #MRX channel in the past two weeks, here are 10 of the most retweeted.


 By Jeffrey Henning

Of the 2,347 unique links shared on the Twitter #MRX channel in the past two weeks, here are 10 of the most retweeted.

  1. The Patterns of World Trade – Euromonitor shares a stunning data visualization of leading exporters: http://euromonitor.typepad.com/.a/6a01310f54565d970c01a3fd2240af970b-800wi
  2. Secrets to Surviving the Customer Revolution – Megan Clothier of Vision Critical recaps a recent webinar involving author John C. Havens. This image, illustrating how the word “consumer” makes people feel, went viral on #MRX: https://pbs.twimg.com/media/BqbrsdbIIAAwhlY.png
  3. Public Views on Ethical Retail – According to a survey that Ipsos MORI conducted for the Department for Business, Innovation and Skills, 49% of UK adults 16 and up believe that UK retailers aren’t very ethical.
  4. Quality of Own-Label Brands on a Par with Branded Goods – Jane Bainbridge of Research describes a survey of 1,000 UK “consumers” questioned by Perception Research Services: 63% consider store brands to be the same quality as national brands, with 14% believing they are better quality; only 3% are embarrassed to buy store brands.
  5. IIeX Atlanta 2014 – Zoë Dowling of Added Value recaps the Insights Innovation Exchange conference for North America: “The industry is on the cusp of a new era; one that is causing a lot of soul searching but also a lot of excitement.”
  6. Picture This: A Smartphone That Satisfies All Your Photo Needs – Adelynne Chao of GfK shares the results of a survey of German and UK consumers about when they prefer to use digital cameras versus the built-in camera of smartphones. http://blog.gfk.com/wp-content/uploads/2014/06/Smartphone-camera-experience-1.jpg
  7. What Does Gamification Offer Healthcare Research? – Joanna Thompson of Adelphi Research, Paola Franco of Janssen, and Jon Puleston of GMI summarize the paper they presented at the British Healthcare Business Intelligence Association annual conference. In a test of a gamified survey of doctors against a conventional survey format, respondents to the gamified survey had a better experience, provided more information, and yet completed the surveys more quickly.
  8. Driving Change: Public Concerned About Safety of Young Drivers and Back Licence Restrictions: 68% of UK adults support a “graduated driver licencing scheme” for new drivers, although young people are less persuaded, according to an Ipsos MORI survey.
  9. Embracing change, cultivating opportunities – Magali Geens and Saartje Van den Branden write, “Researchers are still largely preoccupied stuffing over-abundant PowerPoint decks with sensible graphs drawn from respectable representative samples of meticulously screened participants; thus maximizing the chances of bringing nothing new to the professional who is in dire need of true insights to challenge the status-quo.” Ouch!
  10. Turning Social Media Monitoring into Research: Don’t Be Afraid to Engage – Margaret Roller argues that confining ourselves to monitoring social media handicaps our ability to gain the fuller understanding that comes from asking questions of social-media users. 

Note: This list is ordered by the relative measure of each link’s influence in the first week it debuted in the weekly Top 5. A link’s influence is a tally of the influence of each Twitter user who shared the link and tagged it #MRX. Only market research links are considered, although the #MRX hashtag is occasionally used for other types of tweets, including – recently – tweets about Mr. X, an upcoming Indian 3D thriller film.

Share