
Market Research at a turning point

 

Report of the IREP-CESP Joint Workshop on the Transformation of Market Research

Paris, June 27th 2013


The business of market research is experiencing a turning point due to two concomitant changes: a change in the behavior of individuals (behavioral change) and a change in the technical means to analyze this behavior (technological change).

Behavioral change

Individuals are more and more difficult to query these days, primarily because of their declining willingness to participate. The proliferation of surveys by phone, email, Internet, in public spaces and in retail outlets, combined with the end of the model of the ideal respondent (a housewife, under 50, always available and happy to answer), has resulted in a steady decrease in recruitment rates. For example, in a French study carried out continuously by telephone over five years, the recruitment rate fell from 22.5% in 2009 to 19.2% in 2010, 18.3% in 2011, 18.9% in 2012, and 17.9% in 2013… a decline so persistent that it raises the question of whether telephone surveys will disappear within a few years. And there is no guarantee that what is true for the phone today will not be true for email and the Internet tomorrow.

In addition to this proliferation, the life of an individual has become so fragmented that it is more and more difficult for him or her to remember which media he or she was consuming, on which device, and at what time. As a result, surveys and panels (in particular diary-based ones) are becoming less and less accurate at fully capturing the life of individuals.

Technological change

The advent of new technologies, such as Big Data and social networks, has allowed new, smaller, and more agile players to enter the market with solutions, methods, and costs that often prove more relevant. Before the Big Data era, data was a scarce commodity. With Big Data it has become abundant, and it will become even more abundant with the Internet of Things, when each object will have its own IP address, its sensors, and its connection to large data centers where not only the data is stored but also the equally powerful software to analyze, visualize, and transform it. Google, Amazon, Microsoft, and Rackspace already make available to new entrants the same infrastructure they offer to the traditional big players: Salesforce.com, SAS, SPSS, QlikView, and Tableau Software are now available as services delivered over the cloud, on demand and for small fees.

Analyzing these two changes and their impact on market research was the aim of the workshop that IREP and CESP jointly organized in Paris on June 27th 2013. Professionals with different perspectives were invited to present their views: academics, auditors, advertisers, researchers, and high-tech companies.

This report summarizes the salient points of the speakers’ presentations, followed by a synthesis of the lessons about the future of market research… Whatever that future looks like, it will be data-rich and technology-driven.

Minutes

Zysla Belliat, President of IREP, and Denis Bied-Charreton, Managing Director of CESP, each in turn presented technological innovation as the trigger of the two changes. Technology within the reach of consumers has changed their behavior towards advertising, media, loyalty, and more generally towards the products and services they consume. The new technology has also changed the way consumer behavior is tracked, analyzed, and anticipated.

For Zysla Belliat, market research must adapt to these two changes, starting with the understanding that questionnaires and samples are now only tools among others. On the one hand, the consumer has become elusive, recalcitrant towards protocols, and present on several “devices”. On the other hand, the new technologies, especially Big Data, provide new tools for the collection, processing, analysis, and delivery of data. The two changes combined have a big impact on the way studies are conceived, planned, and executed, all in an economic context that forces the community to go faster and cheaper. Zysla Belliat calls on research professionals to innovate and evolve, but without sacrificing the rigor and quality that are their mark, their raison d’être.

For Denis Bied-Charreton, individuals, being perpetually connected, have become producers of data through their actions on all the “devices” available to them: TVs, game consoles, PCs, smartphones, and tablets.

Passive data is now available en masse, but can it substitute for all or part of the data collected in a declarative and conscious manner? Should we abandon samples in favor of mass data? Is this the end of questionnaires and the beginning of sensors? For Denis Bied-Charreton, studies need to adapt their analyses or build new ones, but they must first go back to the data, its value, and its quality as priority #1, which he summarizes with the phrase “back to basics, regardless of the data source.”

The presentations by Zysla Belliat and Denis Bied-Charreton show that the “technologization” of market research and the availability of mass data bring challenges not only to professionals but also to academics: new protocols to collect, process, and analyze data from varied sources… a real research program for academics in behavioral economics and data science.

Anne-Marie Dussaix, Honorary Professor at ESSEC, focused on the quality to be expected from this technology-led turn. To Anne-Marie Dussaix, the relevance, accuracy, completeness, timeliness, accessibility, interpretability, coherence, and comparability of data collected in mass are now more than ever essential to ensure the quality of a survey.

The growing amount of data should not hide the fact that 21% of the French population does not have Internet access, that more and more households have completely abandoned their fixed lines, and that a growing number of households no longer appear in public directories (they frequently change operator)… To Anne-Marie Dussaix, a good study should not make quantity its priority #1, but quality.

She sees the risk to a study’s quality as the consequence of shorter deadlines, smaller budgets, lower response rates, and changing behaviors. To reduce this risk, Anne-Marie Dussaix recommends the use of injection and fusion, two techniques long used in media audience studies. An injection method replaces the missing responses in a questionnaire with the responses of similar respondents who did answer. A fusion method, as its name suggests, combines the results of two or more complementary studies.
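
To make the idea of injection concrete, here is a minimal sketch of donor-based imputation: a missing answer is replaced by the answer of the most similar respondent who did reply. The profile attributes, similarity rule, and data below are illustrative assumptions, not the actual procedure used in audience studies.

```python
# A minimal, hypothetical sketch of "injection" (donor imputation): a missing
# answer is replaced by the answer of the most similar respondent who did reply.
# Variable names and the similarity rule are illustrative assumptions.

def similarity(a, b, keys):
    """Count how many matching profile attributes two respondents share."""
    return sum(a[k] == b[k] for k in keys)

def inject(respondents, target_question, profile_keys):
    """Fill missing answers to target_question from the closest complete respondent."""
    donors = [r for r in respondents if r.get(target_question) is not None]
    for r in respondents:
        if r.get(target_question) is None and donors:
            best = max(donors, key=lambda d: similarity(r, d, profile_keys))
            r[target_question] = best[target_question]   # injected answer
    return respondents

sample = [
    {"age": "25-34", "region": "IDF", "tv_daily_min": 40},
    {"age": "25-34", "region": "IDF", "tv_daily_min": None},   # non-respondent
    {"age": "50-64", "region": "Sud", "tv_daily_min": 120},
]
print(inject(sample, "tv_daily_min", ["age", "region"]))
```

Fusion rests on the same matching principle, but across studies: respondents of one study are matched with statistically similar respondents of the other, and the variables of the second study are grafted onto the first.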

Pascale Carle, Director of Research and Prospecting at Auchan, presented the expectations of retailers. Big Data specialists promise retailers a new and rich era, a wealth the new entrants compute from the data consumers generate through store cards, outlets, websites, travel agencies, and insurance products. Meanwhile, traditional suppliers continue their “business as usual”: sampling, questionnaires, collection, processing… For Pascale Carle, there is a large gap between the two value propositions. While Big Data opens new horizons and new opportunities, it often comes with answers that are not actionable; Big Data tools do not yet help retailers much. She gives some illustrative examples of this gap: What does the word “bravo!” mean on a social network? What diagnosis should be made if this “bravo!” is picked up and multiplied again and again across the Web? Is it a sign of total satisfaction or rather a sign of total dissatisfaction? What does the word “Auchan” mean? For Pascale Carle, what Auchan customers or fans have written has a greater value than the number of times they have written the word “Auchan”.

She invites market research professionals to quickly integrate Big Data tools into their data collection and analytics platforms and to add to those tools the know-how they lack: asking the right questions and looking for the right information. What is of the highest value to Pascale Carle are tools that help her forecast sales and/or adjust shelf assortments based on weather conditions, the day, prices, sporting and cultural events, buyers’ browsing history, tweets, likes, and payments… Pascale Carle’s call to action sounds like an invitation for traditional players to join efforts with the new entrants to design, develop, and propose solutions combining the expertise of the former with the technology of the latter… a call for a data-rich, technology-based research era!

Jean Thibaud, Director of Research and Segmentation at SFR, presented another fresh view, this one coming from actors outside market research: the big consulting firms such as McKinsey, Accenture, Deloitte, BCG, and Bain & Company. These firms bring a new vision: the business that was data-driven is now action-driven. Data is everywhere (in logs, tweets, emails, recommendations…) but its true value lies in contextualizing it, comparing it, merging it, and transforming it into actionable plans. Accenture calls this migration the Age of Aggregation.

Like Anne-Marie Dussaix of ESSEC and Pascale Carle of Auchan before him, Jean Thibaud asks the profession to focus on quality and relevance, not only on volume. A large amount of data does not mean that the information sought is inside it. For example, 15 to 20% of the French population is off the radar of Big Data tools. They do not use Facebook, Twitter, and the like, but they too have a voice, an opinion, something to say about their consumption. Capturing the voice of these silent individuals is a real challenge for researchers.

Yannick Carriou, Global CEO at Ipsos MediaCT, presented a general research firm’s point of view. For Yannick Carriou, media studies have been overtaken by the flood of new technologies within the reach of individuals, and collecting audience measurements based on individuals’ recall is no longer relevant. Media consumption is split across several “devices” (radio, TV, PC, smartphone, and tablet), and each device supports several media at once: newspapers are read on smartphones and tablets, movies are watched on PCs and tablets. But the real paradigm shift is perhaps the new way of purchasing advertising space in the digital universe. Programmatic buying, based on analyzing individuals’ digital traces, on complex algorithms, and on churning through millions of data points, coupled with auction systems, will one day reach television, as piloted by Sky in the UK, delivering campaigns directly according to the known characteristics of the household.

He cites new entrants from the high-tech sector (Rocket Fuel in the U.S., Criteo in France) that bring new ways, new metrics, and new methods completely unknown to market research companies, to media agencies, and to their clients. The intelligence of data will grow strongly, and questionnaires will be only part of the global picture. And when they are used, they will get shorter. Google even offers questionnaires with just two questions! But it combines the answers to these two questions with thousands or even millions of data points collected by other means, such as the Google search engine, the Gmail mail service, and the Google+ social network. The era is one of passive measurement (via smartphones, tablets, and in the near future Google Glass) and of artificial intelligence to collect, store, fuse, and make sense of large volumes of data.

In this fragmented world, the implicit or explicit fusion of information coming from various sources will be the rule. As far as data quality is concerned, CESP and other quality monitoring and control bodies can analyze the quality of raw data and compare it with a quality reference, but the quality of aggregated data and of the “statistical glue” between heterogeneous sources is still a largely open question. And here, the new entrants must be part of the solution.

Benoît Cassaigne, Executive Director at Médiamétrie, presented the perspective of an audience measurement actor (radio, TV, and Internet). Benoît Cassaigne summarizes the new media context with a formula: digital world = bloated world. We are in a world bloated with offerings, equipment, screens, and data connections. In a few years, the number of TNT channels in France increased from 7 to 24, the number of Internet connections doubled, and the number of set-top boxes quadrupled. The new media environment brings with it new challenges for measurement professionals.

First challenge: Scope. Any new media study must now blend panel data (individuals) with Big Data (devices). It must also consider multiple screens… TV is no longer watched only on TV sets, and radio is no longer listened to only on radio sets. And processing is no longer executed periodically but increasingly in real time: listening, viewing, and browsing data should be available all the time and for all devices.

Second challenge: Quality. Unlike panel data, Big Data has no standards. Logs, tweets, and the like are not all certified and do not always represent the total universe studied… Big Data doesn’t equal Big Picture!

Third challenge: Myopia, which Benoît Cassaigne defines through a telling example. Comparing video time spent on YouTube on the one hand and on TF1 on the other, one might conclude that YouTube’s total of 96,527,000 viewing hours per month is “superior” to TF1’s 50 minutes per viewer per day. The comparison is obviously nonsense: an aggregate total and a per-viewer average are not the same metric.
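
A quick back-of-the-envelope calculation shows why. The YouTube total below is the figure cited in the talk; the TF1 daily audience is a purely hypothetical number used only to illustrate the unit mismatch.

```python
# Hypothetical check of the "myopia" example. The YouTube total is the figure
# cited in the talk; the TF1 daily audience below is an ASSUMED, illustrative
# number, not real Médiamétrie data.

youtube_total_hours_per_month = 96_527_000          # aggregate, all viewers
tf1_minutes_per_viewer_per_day = 50                 # per-viewer average

assumed_tf1_daily_viewers = 5_000_000               # hypothetical audience size
days_per_month = 30

tf1_total_hours_per_month = (
    assumed_tf1_daily_viewers * tf1_minutes_per_viewer_per_day / 60 * days_per_month
)
print(f"TF1 (assumed audience): {tf1_total_hours_per_month:,.0f} hours/month")
# => 125,000,000 hours/month, which would exceed YouTube's 96,527,000 hours —
# the comparison flips once both figures are put on the same (aggregate) basis.
```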

As for the future of media research, Benoît Cassaigne foresees the use of multiple channels to collect data (mobile phones, the Web) and the fusion of panel data with Big Data, all with more constrained budgets and in a shorter time… in one sentence, “do more with less.”

Thibaut Munier, CEO of 1000mercis, brought the viewpoint of a high-tech company. Thibaut Munier defines Big Data as the enrichment of the customer database at each contact the customer has with the brand, company, product, or service: a visit to a website, access to a social network, a click on an ad banner, a response to an email…

To Thibaut Munier, Big Data expands the scope of market research. Not only can studies be customer-centric, they can also be site-centric or ad-centric. With Big Data, segmentation is no longer based on socio-professional criteria or previous purchases; it is based on the behavior of the individual: search, compare, purchase or abandon, advise or recommend… He cites the example of real-time bidding, a new kind of algorithm that tracks Web users, monitors the sites they visit, and detects which products interested them. If a user moves from one site to another, the algorithm will show ads related to the items viewed on the first site. The purchase of advertising space on the Web has become real-time, depending on the site, the day, major events in progress, the visitor, and so on. Media planning has become to advertising what high-frequency trading is to finance; marketplaces called “Ad Exchanges” have already appeared, named after their equivalents in finance, the trading exchanges. Without always knowing it or being aware of it, web pages change for you and you change web pages according to what you do… Welcome to the Web that learns and adapts to everyone, in real time!
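
As an illustration of the retargeting logic described above, here is a deliberately simplified sketch: when a bid request arrives, the buyer looks up what this browser viewed elsewhere and bids higher with a product-specific banner if there is something to re-show. All identifiers, bid values, and the decision rule are invented for the example; real-time bidding platforms are far more elaborate.

```python
# A minimal, illustrative sketch of a retargeting rule: re-show the last item a
# browser viewed on a partner site and bid higher for that impression.
# All names, thresholds, and bid values are hypothetical.

viewed_products = {}   # cookie_id -> list of product ids seen on partner sites

def record_view(cookie_id, product_id):
    viewed_products.setdefault(cookie_id, []).append(product_id)

def bid(cookie_id, base_bid=0.10, retarget_bid=0.80):
    """Return (bid_in_euros, creative) for an incoming bid request."""
    seen = viewed_products.get(cookie_id)
    if seen:
        return retarget_bid, f"banner_for_{seen[-1]}"   # re-show the last item viewed
    return base_bid, "generic_banner"

record_view("user-42", "smartphone-x")
print(bid("user-42"))    # higher bid, product-specific banner
print(bid("user-99"))    # default bid, generic banner
```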

Synthesis

The most important thing is not whether we are living through an evolution or a revolution. The most important thing is to remember that, until now, Information Technology (IT) was considered a cost center or a support function. IT has now become a differentiator and a key driver for the actors who master it best. Digital technologies have become the DNA of a new generation of market research, in which the consumer is changing, global, recalcitrant, both consumer and producer of data, and battling to keep his or her private life invisible to others, in particular to market researchers and advertisers.

We focus a lot on data, but with Big Data and social networks, data is no longer a rare commodity. As noted by Jean Thibaud and Yannick Carriou, traditional market research is being challenged, new players are already present (high-tech companies on the one hand, consulting firms on the other), and new skills are required… It would not be surprising to see, a few years from now, currently unknown companies in the “Honomichl Global Top 25, 50 or 100.” The next leaders will be those that combine the expertise of market research, the computational power of Big Data, and the art of presenting insights and actionable recommendations.


Big Data and Market Research

 

Report of the IREP Workshop on Big Data Big Analytics

Paris, May 30th 2013


In March 2011, McKinsey published a report entitled “Big data: The next frontier for innovation, competition, and productivity”. Since then, not a day goes by without an article, an announcement, or a new book on Big Data appearing on the Web. All industries and businesses are looking at the phenomenon. Big Data has become the technological Holy Grail that will change everything and turn everything upside down. A new job title (Data Scientist) and new training programs (Data Science) are announced in the press, even in the most established newspapers.

It is time to demystify the phenomenon and separate realities from promises! It is in this context that the members of IREP started to ask questions about the impact of Big Data on the upstream and downstream activities of market research and decided to organize a dedicated workshop on May 30th 2013. The main objective of the workshop was not to redefine the movement or the underlying technologies of Big Data; conferences, exhibitions, and general seminars are regularly organized on that theme. The goal was to respond to questions posed by industry players in advertising, marketing, and media research. Questions such as: Is Big Data just another wave of software-vendor hype, as ERP, CRM, and e-business were before it? What differentiates Big Data from the mere Data we already use for our research and survey projects? What is the impact on our business, our knowledge, our processes, our methods, and our IT systems?

To answer these questions and many others, IREP invited leading French experts. Most had already faced Big Data well before the name became one of the most sought-after, most indexed, and most documented subjects since the advent of Web 2.0 and social networks.

This document reports the key points of each speaker’s presentation and summarizes the lessons learned from the workshop, including the points participants raised during the Q&A sessions.

Minutes

Philippe Tassi, Executive Vice President of Médiamétrie and a member of the IREP Scientific Council, gave a brief yet rich history of the concept of data. In the 19th century, exhaustiveness ruled the world of data: government organizations collected all the data they could on all individuals, and the results came years later. It was the age of manual work, paper and pencil; but it was already the Big Data of its time. The difference compared to Big Data in the 21st century lies in the frequency of data collection, the speed of data processing, and the sophistication of data visualization. The 20th century brought the reign of sampling, a true innovation at the time: with only a portion of the population, one could obtain results for the general population, and in record time compared to the exhaustive approach. Isn’t Big Data just the rebirth of the exhaustive approach? Which of these two approaches should be chosen in future studies? For Philippe Tassi, each complements the other, and he therefore proposes to combine both whenever possible. To support this with facts, he cited two studies by Médiamétrie measuring Web traffic on fixed and mobile Internet: Médiamétrie used a Big Data approach centered on sites and complemented it with a panel approach centered on users, and the results were much better.

Arnaud Laroche, President and CEO of Bluestone and Board Member of ENSAI, presented the underlying technologies of Big Data, initially developed by Google and Yahoo and since adopted by other leading websites such as Facebook, LinkedIn, and Twitter. The initial idea of Google and Yahoo was to parallelize data storage and processing across thousands of commodity servers; branded servers at the time were too expensive to achieve this level of parallelism for both storage and processing. For Arnaud Laroche, Big Data does not create new mathematical or statistical models for analyzing data. Big Data is an increased capacity to collect, store, and process data, and because of this it is the ultimate field of application for sophisticated data mining and machine learning algorithms that previously lacked the critical mass of data needed to reach their full potential.
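
The split-then-aggregate idea behind these technologies can be sketched in a few lines. The toy example below simulates the map and reduce steps with local worker processes standing in for commodity servers; it only illustrates the principle, and is not Hadoop or Google’s implementation.

```python
# A toy sketch of the map/reduce idea: split the data, process the chunks in
# parallel (local processes standing in for commodity servers), then merge the
# partial results. Purely illustrative.
from collections import Counter
from multiprocessing import Pool

def map_count(chunk_of_lines):
    """Map step: count words in one chunk."""
    c = Counter()
    for line in chunk_of_lines:
        c.update(line.split())
    return c

def reduce_counts(partials):
    """Reduce step: merge the partial counts."""
    total = Counter()
    for p in partials:
        total.update(p)
    return total

if __name__ == "__main__":
    logs = ["big data big analytics", "panel data", "big data"] * 1000
    chunks = [logs[i::4] for i in range(4)]          # 4 "servers"
    with Pool(4) as pool:
        partials = pool.map(map_count, chunks)
    print(reduce_counts(partials).most_common(3))
```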

While Philippe Tassi addressed the changing nature of data, Gilles Santini, President and Director of Hippo IREP, addressed the question of whether we need to change, extend, or abandon our traditional models of probabilistic analysis. Do we keep, extend, or abandon Benzécri’s correspondence analysis, Tukey’s exploratory data analysis, Gauss’s linear regression, Fienberg’s response patterns, and Hartigan’s classification methods and time-series analysis? In short, do we need to change, extend, or abandon the mathematical and statistical foundations built over centuries? Like Philippe Tassi, Gilles Santini proposes to combine the two approaches mentioned above. For Gilles Santini there is a modeling level, which seeks meaningful relationships in a group of individuals, and a “probabilization” level, which seeks to assign each individual a probability of action. Big Data can be a tremendous asset for modeling the group for targeting purposes (what is the group of individuals who buy BMW cars in general?). Probabilization then uses the results of Big Data modeling for finer targeting (within that group, which individuals are the most likely buyers?). Gilles Santini concluded his presentation with a warning: we have the technical means to process massive data in every way, but we must keep common sense. Do all the pieces of the collected data have a meaning? For Gilles Santini, always trying to make sense of the available data is the real business challenge for Big Data to gain acceptance; otherwise, we run the risk of what IT professionals call Garbage In, Garbage Out.
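
A small sketch may help separate the two levels Gilles Santini distinguishes. In the hypothetical example below, a logistic regression stands in for the “modeling” level (characterizing the group of buyers from behavioral data) and its predicted probabilities stand in for the “probabilization” level (scoring each individual); the features and data are invented for illustration.

```python
# Hypothetical illustration of "modeling" (learn what characterizes the group)
# vs "probabilization" (assign each individual a probability of action).
# Features, data, and the choice of logistic regression are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy behavioral features per individual: [visits_to_car_sites, income_index]
X = np.array([[12, 0.9], [1, 0.3], [8, 0.7], [0, 0.2], [15, 0.8], [2, 0.4]])
y = np.array([1, 0, 1, 0, 1, 0])          # 1 = bought a premium car (group label)

model = LogisticRegression().fit(X, y)     # modeling level: describe the group

prospects = np.array([[10, 0.85], [3, 0.5]])
scores = model.predict_proba(prospects)[:, 1]   # probabilization: score individuals
for p, s in zip(prospects, scores):
    print(f"prospect {p} -> purchase probability {s:.2f}")
```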

For Gilbert Grenié and Zouhir Guedri, respectively Partner and Director at PwC, the Big Data movement comes with challenges of a different order than the technical or business ones: regulatory, legal, and security challenges. They cite the reporting requirements to the CNIL on personal data and its use, the obligation to give people access to their data so they can correct or delete it, the prohibition in principle on transferring personal data of European Union residents outside Europe, etc. Add to these constraints intellectual property and copyright protection. All of this will make it difficult to use, distribute, and/or monetize data collected from social networks or information-sharing platforms. Unlike what we read on the Web in general and in technical and scientific publications in particular, the presentation by Gilbert Grenié and Zouhir Guedri enlarges the definition of a Big Data project to encapsulate these and other legislative aspects, which can slow the deployment or increase the cost of the original project. From now on, any Big Data project, especially one focused on consumers, must be considered a mix of a business project, a technical project, and a legal project.

Jean-Charles Cointot, Telecom Industry Leader at IBM France, presented the results of a survey of 1,100 IT and business professionals worldwide. The results showed that 28% of respondents have implemented at least one Big Data pilot project, 47% plan at least one pilot project, and 24% have not yet started. As these figures indicate, the opportunities and challenges are still ahead; at the business level, we are still in the experimental phase. This is normal: as with any major innovation, there is first the vision, then the experimentation which, if successful, becomes a deployment and later a standard, in the sense that it is no longer seen as an innovation. Other results of the study show the relative weight of the expected benefits: as with ERP, CRM, and the Web, it is the efficiency and effectiveness of the customer relationship that companies seek most (49% of Big Data project objectives). Another important point of Jean-Charles Cointot’s presentation is the use of the underlying Big Data technologies on data already in house: a new Big Data project does not automatically mean new data or online data, but rather new ways to get more from the old, already available data.

For Bruno Walther, Co-founder of Captain Dash, Big Data is both a revolution and an evolution. It is an exponential expansion of the variety and complexity of the concept of data, now composed of text, numbers, charts, sounds, images, and videos from various sources such as the Web, RFID chips, databases, etc. But it is also a natural evolution of data processing, just more sophisticated. Big Data is the logical continuation of the Web, with its product catalogs, price comparisons, and visit logs, which is itself a continuation of CRM, with its customer segmentations, contacts, and promotional offers, itself following ERP, with its receipts, purchase records, and payment records. In agreement with Gilles Santini, Bruno Walther holds that the goal of Big Data is not to understand the Why but the How of relationships: with Big Data, it is not causality that matters but correlation. It is the reign of business rules technology, heuristics, and machine learning, more generally called Artificial Intelligence. Like Gilbert Grenié and Zouhir Guedri above, Bruno Walther states that the current rules that provided privacy protection (consent, opt-out, and anonymity) are no longer fully guaranteed: if you use all the Google services on all your devices, Google knows everything about you! To gain acceptance by end consumers, Big Data projects must evaluate the reuse of collected data and the impact of that reuse on consumers. The latter have to find their own benefit in handing over part of their intimacy to advertisers and their suppliers.

Michel Bellanger, Head of Marketing at Carrefour Médias, presented the history of Big Data at Carrefour, which started with the opening of the first hypermarket in 1963, 50 years ago. Today, Carrefour has 13.7 million loyalty cardholders representing 76% of sales. Each purchase is logged for 24 months, and the log details go down to the individual product reference. The example of Carrefour in France is proof that Big Data was not born, as is often thought, two years ago. It is also evidence that Big Data does not only come from the Web: if you total all transactions across all of Carrefour’s physical outlets (2,108 shopping centers), you arrive at a gigantic figure of 957 million transactions per year, or more than 2.8 million transactions per day. The example also shows that Carrefour’s Big Data is not based on Hadoop and its derived technologies. Instead, Carrefour uses technologies from the 1990s to collect, process, and analyze the massive data from the physical outlets mentioned above, plus 5 websites totaling 6 million unique visitors per month.

Lisa Labatut, Head of Traffic Generation at Bouygues Telecom, and Alain Levy, Chairman of Weborama, presented the results of a study they conducted using data collected from online users to increase the rate at which Web ad exposures convert into visits to points of sale. 27 million users were scored against Bouygues Telecom’s offering portfolio; compared to using CRM data only, the conversion rate was multiplied by 3.1. The use of Big Data can significantly increase the performance of an advertising campaign by bringing it more precision and more detail. Here, the Big Data collected by Weborama enriched Bouygues Telecom’s CRM: further proof that Big Data (external data) does not replace but complements the data already available in the business (internal data).
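
The mechanics of such an enrichment can be sketched as a simple join between internal and external scores. The example below is hypothetical: the field names, weights, and scores are invented, and it is not the actual Bouygues Telecom/Weborama pipeline; it only illustrates how an external behavioral score can complement a CRM score when ranking prospects.

```python
# A minimal, hypothetical sketch of CRM enrichment: join an external behavioral
# score onto internal CRM records and rank prospects on the combined score.
# Field names, weights, and values are invented for illustration.

crm = {                      # internal data: customer id -> CRM score (0..1)
    "c1": 0.20, "c2": 0.70, "c3": 0.40,
}
web_score = {                # external behavioral score (0..1) from browsing data
    "c1": 0.90, "c3": 0.10,  # c2 has no external score
}

def combined_score(cid, w_crm=0.5, w_web=0.5):
    # Fall back to the CRM score when no external score is available.
    return w_crm * crm[cid] + w_web * web_score.get(cid, crm[cid])

ranked = sorted(crm, key=combined_score, reverse=True)
print([(cid, round(combined_score(cid), 2)) for cid in ranked])
```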

Synthesis

The presentations by the different stakeholders show that, at present, the concept of Big Data is not formally defined. Each speaker gave his or her own definition according to his or her activity: a research firm, a software company, a consultancy, an FMCG distributor. But there is a common finding in all the presentations: Big Data is not new.

Big Data is a revival of the exhaustive approach that was the norm before the advent of sampling in the late 19th and early 20th centuries. New IT, with its capacity for collecting, processing, and analyzing masses of data, has made the exhaustive approach economically feasible again, at lower cost and in a shorter time. Nor was Big Data born with the Web: it has been present and used for decades by consumer goods distribution channels, airlines, car rental and hotel companies, telecom operators, banks and insurance companies, payment card networks, etc.

Big Data will not replace sampling and panels. Big Data can capture precisely the behavior and actions of consumers, but not their demographics such as age, gender, socio-professional level, income, etc. Big Data is centered on devices (PCs, set-top boxes, laptops, smartphones, tablets…) while panels are centered on people (men, women, baby boomers, teenagers…). For a complete analysis and richer information, we need to combine the two approaches: Big Data for behavioral data, panels for intentional data. This leads us to wonder whether Big Data is not, after all, simply the digitization of all our searching, selecting, comparing, and purchasing of products and services, both in physical outlets and on the Web.

Big Data is not a new technology either, although it started with Google and Yahoo and their development of Hadoop, MapReduce, BigTable, etc. The technologies of the 1990s are still used and will continue to be used with great success. Big Data will not replace the ERPs, CRMs, and e-business sites already in operation; it will be an evolution of the systems already in production at B2C, B2B, and government organizations.

Big Data projects are not projects of a new type, even if the legal aspect is more important. Conventional approaches, in particular the waterfall and agile methods, remain valid. All that is needed is the inclusion of a legal expert in the project team.

Like any technological innovation, Big Data was initiated by a few pioneers for internal purposes: to parallelize the storage and processing of massive data at very high velocity and at lower cost. Then came the experimenters with pilot projects. Gradually, the number of these pilot projects will increase; some will give birth to larger projects, others will fade away. In a few years, Big Data will become a regular innovation, in turn overtaken by a new innovation seeking to improve on it or even replace it. The force of creative destruction (defined by Joseph Schumpeter in 1942, revisited by Everett Rogers in 1962, by Norbert Alter in 1985, by Geoffrey Moore in 1991, and by Clayton Christensen in 1995) continues its work until the next innovation…
