The Future of Data Analytics for Retail

Source: The Future of Data Analytics for Retail

Late last year, Amazon premiered a system that may well be the future of shopping. Nicknamed Amazon Go, it looks just like a regular brick and mortar store, except there are no lines, no self-checkout machines, and no cashiers. The items you buy are tracked by sensors, your account is charged through your mobile Amazon Go app, and you can simply walk out of the store whenever you please.

Amazon Go is a revolutionary spin on retail, commerce, and the experience of going to a store. What’s really special about Amazon Go, however, is what it represents in terms of data.

All across the retail universe, the rapidly widening Internet of Things is becoming equipped for high-frequency event analytics. Across the board, that means faster decision-making, more helpful data, and smarter, more cost-efficient businesses.


Retail and event-driven analytics

The “event-driven” company, according to VC Tom Tunguz, is one that consumes events as they occur, in real-time, from whatever data sources are available.

Rather than record data manually—making all your data liable to corruption—event-driven companies have set up the pipelines they need to always be collecting up-to-date, quality information.
(Figure: the event-driven SaaS company)

(Source: Tom Tunguz)

The first stage in this process—“events occur”—is the most important one to consider in the retail context.

On a website, those events are fairly easy to understand. They might be clicks, button presses, or scrolling behavior. We’ve been trained to think about the web in terms of events—not so with brick and mortar. And yet, the number of events that could conceivably be collected as data from a single retail experience is tremendous.

When people enter the store, what items they pick up, which they take with them and which they put down, what order they shop in, even how they navigate the store down to the most infinitesimal of details—all of this is information that could help companies increase revenues, lower costs, and build more efficient businesses. That’s also just the front-end of the retail experience.

The new retail nervous system

The Internet of Things has spread rapidly up and down the production supply chain, laying the foundation for the future of retail.

RFID chips on products allow companies to track their inventory with an unprecedented degree of precision, even as their shipments rattle around in shipping containers, cargo ships move in and out of port, and trucks travel across the country.

Companies like Flexport make it possible to manage and visualize those complex supply chains, many of which were barely even digitized years ago. Others help optimize last-mile delivery, manage the capacities of warehouses, and plan out routes for truck drivers bringing goods to market.

In stores, the same tags that help track goods as they move around the world can be used to optimize pricing in response to changing local conditions or sudden surges of demand.
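
The idea of adjusting prices from tag and sensor data can be sketched in a few lines. This is a toy illustration, not any retailer's actual pricing logic; the surge threshold and the 10%/5% adjustments are invented for the example:

```python
def adjust_price(base_price, demand_index, stock_on_hand, reorder_point):
    """Nudge a shelf price upward when local demand surges or stock runs low.

    demand_index: recent demand relative to the norm (1.0 = typical).
    The thresholds and percentages here are illustrative only.
    """
    price = base_price
    if demand_index > 1.5:             # sudden surge of demand
        price *= 1.10
    if stock_on_hand < reorder_point:  # scarce local inventory
        price *= 1.05
    return round(price, 2)

# A $20 item during a demand surge, with only 4 units left on the shelf:
print(adjust_price(20.00, demand_index=1.8, stock_on_hand=4, reorder_point=10))
```

The same function, fed live RFID counts instead of hand-typed numbers, is the kernel of the store-level repricing the paragraph above describes.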

This network of physical/digital infrastructure is just the substratum, however, of the true analytics-enabled future of retail.

When data analytics meets retail

Event data is the foundation of all behavioral analytics.

When you’re tracking every discrete click, scroll, or other web action, you can start to look for patterns in the data that you’re collecting. You can see which pieces of content on your blog engage the most users, which version of your checkout flow is the best for conversions, and so on.
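
As a rough sketch of what pattern-finding on event data looks like, the snippet below counts which pieces of content draw the most views. The event dictionaries and field names are illustrative, not the schema of any particular analytics product:

```python
from collections import Counter

def top_content(events, n=3):
    """Count 'view' events per content item and return the n most viewed.

    `events` is assumed to be a list of dicts with 'type' and 'item' keys;
    these field names are assumptions made for the example.
    """
    views = Counter(e["item"] for e in events if e["type"] == "view")
    return [item for item, _ in views.most_common(n)]

events = [
    {"type": "view", "item": "blog/a"},
    {"type": "view", "item": "blog/b"},
    {"type": "click", "item": "blog/a"},
    {"type": "view", "item": "blog/a"},
]
print(top_content(events, 2))  # ['blog/a', 'blog/b']
```

Swap "view" for "picked up item" or "entered aisle" and the same aggregation applies to in-store events once sensors emit them.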

There’s already technology out there to help investors like those at CircleUp analyze data around small businesses and predict those that will succeed based on a large corpus of historical data.

With the infrastructure of the Internet of Things in place, the same kind of analysis becomes possible on a physical scale. You can start to find patterns in what people buy, when people order, and how to build a more efficient goods-delivery system.

The possibilities are extensive and powerful. In Amazon’s concept store, you can easily imagine sensors that take notice whenever your gaze rests on a particular item for longer than usual, or when you pick something up only to put it back down afterwards.

The decision to not purchase an item would be just as important for Amazon’s recommendation engine as a confirmed sale—that data could even be fed back to the supplier for their marketing team to analyze the lost sale. Visual recognition systems could be used to show you an ad in the evening for that dress you were eyeing at the store in the afternoon.

That’s just scratching the surface of an extensive universe of possibilities. Already today, IoT-enabled retail is allowing companies to:

  • identify fraud before anyone from Loss Prevention even notices it’s happening
  • systematically reduce shrinkage by analyzing exactly where it’s coming from
  • give estimated delivery times in windows as small as 10 minutes

A few years ago, Amazon patented the idea of “anticipatory shipping”—moving goods around based on their predictive analysis of likely consumer behavior. Because of your history, in other words, Amazon could predict that you were about to order a pack of toilet paper—and make sure it was in stock at the closest distributor well before you even clicked on the order button.
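
A toy version of that prediction, projecting the average gap between a customer's past purchases forward, might look like this (Amazon's actual models are of course far richer):

```python
from datetime import date, timedelta

def predict_next_order(purchase_dates):
    """Estimate when a customer will reorder a staple item by projecting
    the average interval between past purchases. A deliberately simple
    stand-in for anticipatory-shipping analytics, not Amazon's method.
    """
    gaps = [(b - a).days for a, b in zip(purchase_dates, purchase_dates[1:])]
    avg_gap = sum(gaps) / len(gaps)
    return purchase_dates[-1] + timedelta(days=round(avg_gap))

# A customer who reorders roughly every 30 days:
history = [date(2017, 1, 1), date(2017, 1, 31), date(2017, 3, 2)]
print(predict_next_order(history))  # 2017-04-01
```

Run across millions of customers, estimates like this tell a distributor which items to pre-position near which neighborhoods.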

In the retail world of the future, innovations like these won’t be cutting-edge. In the age of data analytics, they’ll be little more than table stakes.

The data analytics long tail

The free flow of event data in retail depends on the proliferation of data sources. The more sources of data that can be cross-referenced, the more patterns that can be found and the more intelligence that can be produced.

Fortunately, the retail space is in a great position for data sources. There are not only a massive number of in-store sources of data, from sensors to registers to RFID tags, but there are complementary online sources as well.

(Figure: big data sources)

(Source: IBM)

For businesses that exist only as brick and mortar, the proliferation of IoT components and data analysis will mean a massive step forward in terms of business intelligence.

For those that are both brick and mortar stores and online, the confluence of the IoT and traditional behavioral analytics will mean an unprecedented wealth of data and an unprecedented set of options for customer engagement.

For those of us who have thrown in our lot with data, it is an exciting and fascinating time to be around.


Data science and our magical mind: Scott Mongeau at TEDxRSM


During TEDx at the Rotterdam School of Management, Scott Mongeau, analytics manager at Deloitte, gave an interesting talk about how we can be misled by analysis done with big data. He explains that the models used in big data analytics should be driven by the data, not the other way around. Essentially, all models are wrong, but some are useful; data analytics can help solve some of the problems we currently face, but better diagnostics and semantics are important.

This will ensure that the models used are sound and can better describe to computers what we are trying to achieve. He says that we should not become so focused on the engineering side of big data analytics that we forget the social context, which must also be understood in order to create true insights.

Human factors limit smart cities more than technology – (RW)

While the latest smart gizmo tends to grab headlines, industry experts are urging urban leaders to focus more on smart city challenges with their citizens, rather than the technology. That’s according to attendees at the recent VERGE 16 conference in Santa Clara, Calif. where leaders in the smart cities space gathered.

A key sentiment that emerged from the conference was that leaders in government and industry need to stay focused on the larger smart city picture and not get caught up in the latest gee-whiz technology.

Specifically, there needs to be greater focus on meshing emerging tech with the current political and economic systems that affect citizens.

“The technology solutions are there,” said Kiran Jain, Chief Resilience Officer for the City of Oakland. “What we’re really looking at are governance issues.”

The proliferation of new smart city platforms and equipment is driven partly by the increasing ease with which they are integrated into city infrastructure.

However, government leaders are being urged to develop technology strategies around citizens’ needs first, rather than prioritizing the technology and working out the public benefits later.

“We just put out an RFP last week that had the words ‘user-centric design,’” said Jain.

Cities need to evaluate their strategies

The shift from technology-centric strategies to user-centric mindsets also requires a realistic assessment of which populations of the city are actually benefiting from these innovations.

Specifically, local leaders must recognize that many smart city innovations are providing benefits to the better-off segments of society. Meanwhile, those citizens struggling with poverty may not see much benefit at all from technology that makes the morning commute more pleasant.

“A lot of our focus has been on moving the top 20% of the market,” said Kimberly Lewis, senior vice president of the U.S. Green Building Council. “We thought the trickle-down effects would really begin to affect low- and moderate-income communities.”

She says key challenges are being exacerbated by assumptions that any smart city technological advancement automatically creates mass impact on the entire city population. However, it’s becoming clear that smart city technology is not a magic wand that can be waved to eliminate persistent challenges faced by poorer citizens.

For example the community solar concept is beginning to gain traction in various markets, depending on the resources of those who wish to invest. However, this raises the issue of how to increase accessibility to financing for those communities who lack the resources to develop solar projects.

Companies currently underutilize most of their IoT data: Creating a successful Internet of Things data marketplace | McKinsey & Company

Source: Creating a successful Internet of Things data marketplace | McKinsey & Company

By Johannes Deichmann, Kersten Heineke, Thomas Reinbacher, and Dominik Wee

Monetizing the flood of information generated by the Internet of Things requires a well-executed strategy that creates value.

The Internet of Things (IoT) will turn the current rush of industrial data into a rogue wave of truly colossal proportions, threatening to overwhelm even the best-prepared company. As the gigabytes, terabytes, and petabytes of unstructured information pile up, most organizations lack actionable methods to tap into, monetize, and strategically exploit this potentially enormous new value. McKinsey research reveals that companies currently underutilize most of the IoT data they collect. For instance, one oil rig with 30,000 sensors examines only 1 percent of the data collected because it uses the information primarily to detect and control anomalies, ignoring its greatest value, which involves supporting optimization and prediction activities. One effective way to put IoT data to work and cash in on the growing digital bounty involves offering the information on data marketplaces to third parties.

How a digital marketplace creates value

Digital marketplaces are platforms that connect providers and consumers of data sets and data streams, ensuring high quality, consistency, and security. The data suppliers authorize the marketplace to license their information on their behalf following defined terms and conditions. Consumers can play a dual role by providing data back to the marketplace (Exhibit 1).

Aggregated data can be an incentive for providers to share information.

Third parties can offer value-added solutions on top of the data the marketplace offers. For example, real-time analytics can make consumer insights more actionable and timelier than ever before. The marketplace also has an exchange platform as a technical base for the exchange of data and services, including platform-as-a-service offers. Six key enablers of the data marketplace can help companies put their data to work more effectively:

  • Building an ecosystem. By assembling multitudes of third-party participants, companies can increase the relevance of their own digital platforms.
  • Opening up new monetization opportunities. Today’s interconnected and digitized world increases the value of high-quality data assets while creating innovative revenue streams. One digital marketplace, for example, adds value to Europe’s electric-automobile market by providing information and transactional gateways for businesses such as charging-infrastructure providers, mobility-service players, and vehicle manufacturers. Charging-station operators, for example, are free to determine their own pricing structures based on data available about customer habits and market trends.
  • Enabling crowdsourcing. Data marketplaces make it possible to share and monetize different types of information to create incremental value. By combining information and analytical models and structures to generate incentives for data suppliers, more participants will deliver data to the platform.
  • Supporting interoperability. Data marketplaces can define metaformats and abstractions that support cross-device and cross-industry use cases.
  • Creating a central point of “discoverability.” Marketplaces offer customers a central platform and point of access to satisfy their data needs.
  • Achieving consistent data quality. Service-level agreements can ensure that marketplaces deliver data of consistently high quality.
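
To make the enablers above concrete, here is a minimal sketch of a marketplace catalog with a "discoverability" lookup. The schema, suppliers, and categories are invented for illustration; the article does not prescribe any particular data model:

```python
from dataclasses import dataclass

@dataclass
class Listing:
    """One data set offered through the marketplace, with the licensing
    terms its supplier authorised. All field names are hypothetical."""
    supplier: str
    category: str        # e.g. "ev-charging" telemetry
    fee_per_query: float  # transaction-related fee, in dollars
    quality_sla: str      # service-level agreement on data quality

def discover(listings, category):
    """Central point of 'discoverability': find all listings in a category."""
    return [l for l in listings if l.category == category]

catalog = [
    Listing("GridCo", "ev-charging", 0.02, "99.5% completeness"),
    Listing("MapCo", "road-traffic", 0.01, "hourly refresh"),
]
print([l.supplier for l in discover(catalog, "ev-charging")])  # ['GridCo']
```

Even this skeleton shows two of the enablers at work: a central point of access for consumers, and per-listing SLAs that anchor consistent data quality.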

Designing a data-sharing platform

As they consider the process of setting up a data marketplace, company leaders need to work through a number of critical questions. An enterprise might ponder the following issues as it clarifies its data-market strategy:

What is the data marketplace’s scope? In most cases, a data marketplace begins when companies set up a central exchange for data within their own organizations. Later, they determine which categories of information within that internal exchange are appropriate (from a security and a profitability perspective) and then allow other players outside their organization (and perhaps outside their industry) to access that data.

How is the marketplace best structured? To foster a dynamic ecosystem, the data marketplace needs to assume a neutral position regarding participants. The legal/tax entity that the marketplace becomes and the structures that govern and finance it are key to this neutrality. Among the guiding principles that players follow in setting up data marketplaces are that a) the marketplace must finance itself through transaction-related fees and commissions, and b) neutrality must extend to future participants that provide or receive data or services, offering indiscriminate access to all interested players under fair terms and conditions. And while the data marketplace will support the creation and definition of data licenses, the data suppliers must nevertheless take responsibility for enforcing and legally auditing them. With respect to the marketplace’s governance, two business models are leading the way. Data marketplaces tend to be either independent platforms or limited ownership hybrids. Under the former model, data sets are bought and sold, while fully owned data-as-a-service providers sell primary data in specific segments or with services and solution wraps. Under the latter, the marketplace collects and aggregates data from multiple publishers or data owners and then sells the data.

Who are the data marketplace’s customers? Once the marketplace is commercially viable, customers will include all types of data providers, and the marketplace system should actively source new kinds of data to become more attractive. The key providers of data will be the companies that capture it, own it, and authorize its sharing. At some point, however, application developers will offer infrastructure and support services that further increase the value of the data by offering a relevant analysis of it and facilitating its delivery.

What are the marketplace’s overall terms and conditions, and data categories? During the marketplace’s technical setup phase, data suppliers define their licensing conditions independently, and the platform provides benchmarks for licensing conditions. The overall terms and conditions of the marketplace apply to all traded data. In the subsequent commercialization phase, the marketplace relies on centrally defined data categories and related licensing agreements as expressed in its general terms and conditions. This strategy enables players to license crowdsourced data independently of specific suppliers.

How does the marketplace relate to other licensing models? When dealing with proprietary data, suppliers usually hold certain information apart and do not share it in the marketplace. However, data suppliers that also offer services can make use of their proprietary data to create services they can trade on the marketplace. For other licensed data, information suppliers can freely create licensing agreements that extend beyond the marketplace—for example, with their strategic partners. Both data amount and type, along with the scope of licenses for using the information, can vary from that of marketplace-supplied data. Likewise, suppliers can also impose separate licensing arrangements for data already traded in the marketplace if buyers intend to use it under different conditions.

What are the role and value-creation potential of the marketplace company or participating data brokers? The potential value of the data will differ depending on whether the marketplace is in the technical start-up phase or has achieved full commercialization (Exhibit 2). In the former, the marketplace acts as a data normalizer, defining standard data models, formats, and attributes for all of the traded information. It syntactically verifies all incoming data compared with the defined standard and continuously manages and extends the data inventory. Once the marketplace enters the commercial stage, it becomes a data aggregator. At this point, in addition to normalizing data and verifying incoming information, it aggregates data and organizes it into logical bundles. For instance, it will enable users to combine data for a given region and offer it to service providers.
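
The data-normalizer role described above amounts to syntactically verifying each incoming record against the marketplace's standard data model. A minimal sketch, assuming an invented three-field standard:

```python
# The marketplace's standard data model: required fields and their types.
# This three-field standard is an assumption made for the example.
REQUIRED = {"supplier_id": str, "timestamp": str, "value": float}

def normalize(record):
    """Verify an incoming record against the standard, coercing values
    where safe and rejecting records that miss a required field."""
    out = {}
    for field, typ in REQUIRED.items():
        if field not in record:
            raise ValueError(f"missing field: {field}")
        out[field] = typ(record[field])
    return out

# A supplier sends the value as a string; normalization coerces it:
print(normalize({"supplier_id": "s-17",
                 "timestamp": "2017-03-01T12:00Z",
                 "value": "21.5"}))
```

In the commercial stage, the aggregator role layers on top of this: records that pass normalization get grouped into logical bundles (by region, for instance) before being offered onward.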

Depending on the role of the marketplace, depth of value added will vary.

Choosing a monetization model

While traditional licensing will provide marketplace revenue streams, participants can also develop transactional models to monetize data and services, with on-demand models currently the preferred approach. With traditional licensing, companies can pursue either perpetual or one-off deals and collect customer fees using several approaches. For example, they can sign contracts with fixed fees and run times, renegotiate expired contracts, or earn revenues at the time of sale (this final approach typically provides less stability in revenue forecasting). At the transactional level, the two primary alternatives are on-demand and subscription services. With on-demand services, customers either pay as they go or choose volume pricing and pay charges based on metrics such as usage volume, the number of incidents, or hardware-related fees. Subscriptions can involve flat fees—typically applied on a monthly or yearly basis—or free/premium (“freemium”) offers, which provide the basics free of charge while offering additional features for a flat fee.
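
The difference between the two transactional models is easy to see in code. The per-query rate and flat fee below are hypothetical numbers chosen for the example:

```python
def on_demand_cost(queries, rate_per_query):
    """Pay-as-you-go: charges scale with usage volume."""
    return queries * rate_per_query

def subscription_cost(months, flat_fee):
    """Flat periodic fee, independent of usage."""
    return months * flat_fee

# A buyer expecting ~40,000 queries a month can compare the two models:
monthly_on_demand = on_demand_cost(40_000, 0.002)
monthly_subscription = subscription_cost(1, 50.0)
print(monthly_on_demand, monthly_subscription)  # 80.0 50.0
```

At that usage level the subscription wins; a lighter user would prefer on-demand. This break-even logic is what makes offering both models attractive to a marketplace.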

Another monetization option is the “give and take” model, which offers incentives to data providers to share their information. The incentive can be monetary or take the form of something like highly relevant, aggregated data as an enticement to share. The marketplace then aggregates and anonymizes the data and offers it along with associated data-focused services to customers.

One give-and-take example is an Internet-based service that offers geolocated real-time aircraft flight information. The service reportedly has one of the largest online aviation databases, covering hundreds of thousands of aircraft and flights as well as large numbers of airports and airlines. Data suppliers receive free radio equipment that collects and transmits aircraft data and a free business-level membership to the service worth $500 a year for as long as they transmit data. In another case, a large European credit bureau offers credit-rating information for consumers and corporations. Data suppliers provide information that includes banking activities, credit and leasing agreements, and payment defaults. In return, they receive credit-ranking data for individuals or businesses. Yet another give-and-take marketplace focuses on data and performance analytics on mobile-operator network coverage. It trades apps and coverage information to data suppliers in exchange for crowdsourced data that can generate mobile-network coverage maps and reveal a mobile operator’s performance by region and technology (for example, 3G or 4G networks).

Assessing the competition

A wide variety of traditional commercial data services currently exists, although these services are largely in silos that focus on specific topics, such as healthcare, finance, retail, or marketing. This balkanization provides an opportunity for new, more holistic data-business models. One advantage of the current ubiquity of data providers is that most companies are already familiar with dealing with them. In fact, some sources estimate that 70 percent of large organizations already purchase external data, and all of them are likely to do so by the end of the decade. The value potential inherent in data marketplaces is attracting key players from a variety of advanced industries. A number of aerospace companies, for example, offer systems that provide guidance to customers in areas such as maintenance and troubleshooting. Similar efforts are also under way in the agricultural and mining-equipment industries, among others.

The IoT’s big data promises to help companies understand customer needs, market dynamics, and strategic issues with unmatched precision. But in pursuing this goal, organizations will amass previously unimaginable quantities of information. The data marketplace offers them an innovative way to turn some of that data into cash and reap the benefits that will accrue from building a self-reinforcing ecosystem, enabling crowdsourcing, supporting interoperability, satisfying customer data needs, and improving data quality.

About the author(s)

Johannes Deichmann is a consultant in McKinsey’s Stuttgart office, Kersten Heineke is an associate partner in the Frankfurt office, and Thomas Reinbacher is a consultant in the Munich office, where Dominik Wee is a partner.

The authors wish to thank Mark Patel for his contributions to this article.


7 lessons the Tour de France can teach us about digital business

How professional cycling is transforming itself through data storytelling

Source: 7 lessons the Tour de France can teach us about digital business

In all their many shapes and forms, sports entertain us, bring us joy, and showcase the pinnacle of our physical and mental abilities. We love following sports! But in all the excitement and exhilaration they offer us fans, it’s sometimes easy to forget that they’re also businesses. For event organisers, sponsors, the media, and professional athletes, sports must also be lucrative, profitable, and sustainable.

The Tour de France is organised each year by Amaury Sport Organisation (A.S.O.). It’s a great example of how a professional sports business saw the massive potential that the digital era holds for all organisations … and how it’s now transforming itself in several ways to leverage that opportunity through technology. A.S.O. is revolutionising the viewing experience of pro cycling and is doing so through data storytelling. Here are seven important lessons we can learn about digital business from A.S.O.’s journey so far.

Lesson 1: Know what it is you’re really selling
As a sports media organisation, A.S.O.’s product is not necessarily the sport of cycling itself, but the stories it’s able to tell about it. The newspaper L’Auto created the Tour de France back in 1903 with the sole purpose of selling more copies. The newspaper featured fantastic stories of this exciting cycling event: the trials, tribulations, crashes, punishing climbs, daring descents, and dashing sprints to the finish line. That was the product they were selling … and it hasn’t changed much throughout the evolution of A.S.O., nor in the more than a century of the Tour’s existence.
Cycling today is still a sport in which fitness, teamwork, strategy, mental endurance, and sheer skill all play a part in ensuring victory and glory. The more compelling, intriguing, and entertaining that story is, the more attractive the product that A.S.O. has to offer, and the greater the audience it will attract to consume that product.

Lesson 2: Change will always happen – recognise it when it does
What did change over the last century, however – and massively so – was the buyer of A.S.O.’s product, the consumers of sports stories around the world. And much of that had to do with technology. As technology evolved over the last 100 years, so did the ways in which sports audiences wanted to consume their stories. With the dawn of the digital era came an explosion of new devices, applications, and social and digital channels. Along came a new type of sports fan, too, for whom traditional TV, radio, and print simply weren’t good enough anymore. The new sports audiences wanted more ways of consuming sports stories, more information, more involvement, more interaction, and more conversation about the sport they were passionate about.
Read this blog or download this infographic for more about the close relationship between the evolution of technology and sport.

Lesson 3: Technology has the power to transform entire industries

A.S.O. therefore faced a challenge: they knew they needed to adapt, enrich, and enhance how they told the Tour de France story, in order to leverage the opportunity that technology and new audiences had to offer. And there was only one way of doing so: through the power of technology itself. As a result, A.S.O. dipped its toes into the world of live-tracking and data analytics for the first time during the 2015 race, with the help of Official Technology Partner, Dimension Data. By 2016, A.S.O. was ready to use data to tell truly exciting stories about pro cycling, and deliver this newly enriched product to fans, professionals, and the media around the world – not only via traditional channels, but also online and through social media.
But how exactly did they go about it and how was Dimension Data able to help?

Lesson 4: Digital business begins and ends with data
Data is the lifeblood of digital business. But making the most of your data means more than collecting, storing, and processing bits and bytes. A reliable flow of data across a secure, stable infrastructure also helps to turn data into useful information. At the 2016 Tour de France, it was all about data in motion. Our mobile data centre, called the big data truck, was literally on the road every day to follow wherever the greatest cycling race in the world went. Watch this video for more about how the big data truck formed the analytics hub of the data that flowed from each bike in the race, to eventually help A.S.O. tell great stories of pro cycling.

Lesson 5: People who work for digital businesses work in new and exciting ways
It takes a fantastic group of passionate, willing, and able people to pull off a massive technology project like the Tour de France. It also takes some great technology to bring them all together, in real-time, from around the world. Our Tour de France team worked every day of the event from inside the big data truck, their office on the road. Through a 24-hour technical development cycle, the team ensured the solution kept up with the dynamics of the race. The truck itself is a great example of a workspace for tomorrow. It’s completely mobile and incorporates an impressive range of communication and collaboration tools that bring people together from all over the world, while they work. If you wonder what it takes to operate within a global team in an office that moves every day, watch this video.

Lesson 6: A cornerstone technology of digital business is cloud
Cloud gives you the ability to scale your digital business instantly in order to respond smarter and faster to massive data volumes and dynamic market conditions. This was also true for the Tour de France. This year, we delivered valuable race information to A.S.O. through a unified digital platform hosted in the cloud, in parallel with a big data truck disaster recovery solution. The quality of data and real-time availability of that data were critical components of the technology solution. Dimension Data’s cloud ensured we kept the data secure, and encrypted connectivity allowed the race information to be transmitted to broadcasters, the media, the teams, and viewers securely. Watch this video to learn more:

Lesson 7: It always takes a team
As much as technology stands central to digital business, business itself is always about people. The Tour de France technology solution was no exception. Behind all the buzz and excitement of one of the greatest sporting events in the world was a large, global team of dedicated, passionate people working together to deliver something great. They’re the ones who made it all happen. Watch this video to meet some of them, and find out what it was like to help A.S.O. tell great stories of pro cycling.

Lastly, if you’re wondering what stories we were able to help A.S.O. tell at the greatest cycling race in the world this year, take a look at this infographic. It’s a great example of how data analytics can help revolutionise the viewing experience of professional sport through effective data storytelling.




Pokémon Go: Huge targeting potential through building a database of high-potential real-world locations

Several media experts have been left speechless by the meteoric growth of the augmented-reality game Pokémon Go

Source: La réalité augmentée séisme pour les publicitaires ?


Several media experts have been left speechless by the meteoric growth of the augmented-reality game Pokémon Go which, despite launching only last week, has already overtaken Twitter in daily active users and is even pushing Facebook into the background.


When the free mobile app launched, the details of its monetization and its openness to advertisers were rather vague. But we now know that location-based advertising will account for a large part of the gaming platform’s funding. In an interview with the Times, Niantic CEO John Hanke said that advertisers will pay on a “cost per visit” basis, similar to the “cost per click” used for ads in Google searches.

Early signs suggest this approach could prove extremely lucrative for Niantic, the company behind Pokémon Go. But in reality the potential is even greater than that, and Pokémon Go is only the tip of the iceberg for a massive revolution in location-based advertising.

Advances in geolocation, geofencing, and programmatic technology mean that advertisers can now reach a consumer inside a specific building and serve them ads at the most relevant time and place. Leading location-based mobile advertising technology companies let advertisers draw boundaries around specific buildings and then detect the signals of mobile devices inside the targeted area.
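
At its core, geofencing reduces to a distance test: draw a boundary around a venue and check whether a device's reported position falls inside it. The sketch below uses a circular fence and the haversine formula; real ad platforms draw polygon boundaries around buildings, and the coordinates here are illustrative:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * asin(sqrt(a))  # Earth radius ≈ 6,371 km

def inside_geofence(device, venue, radius_m=50):
    """True if a device's position falls inside a circular fence
    around a venue. A circle keeps the sketch short; production
    systems use polygon fences."""
    return haversine_m(*device, *venue) <= radius_m

showroom = (48.8566, 2.3522)      # a Paris location, for illustration
nearby_phone = (48.8567, 2.3523)  # ~15 m away
print(inside_geofence(nearby_phone, showroom))  # True
```

Tagging each fenced location with keywords, as described next, turns this yes/no signal into an audience segment advertisers can buy.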

This technology is helping build a huge database of hundreds of millions of real-world locations, each tagged with keywords such as "fast food fans" or "car showroom" that marketers can use to target specific groups of consumers with mobile ads.

If Niantic fully exploits these technologies, targeting capabilities would become extremely sophisticated and the opportunities truly massive. And if used well, these ads will not drive the Pokémon Go audience away. In fact, they may even improve the overall user experience. With a "native" approach to advertising (something programmatic is also set to revolutionize), brands can deliver far more creative, relevant messages that fit the content and the overall user experience.

Advertisers have also been exploring the possibilities of augmented reality (AR) for some time, which will of course have a big impact on a gaming platform like Pokémon Go. One example is Net-A-Porter, which used augmented reality to reinvent window shopping.

The luxury fashion e-tailer created AR shop windows in Paris, New York, London, Munich, and Sydney to promote the new Karl Lagerfeld collection. While the window looked perfectly ordinary, viewing it through the Net-A-Porter Karl mobile app let visitors watch runway videos, see product information and 360-degree product visualizations, and buy the items they wanted directly.

[The example is a bit old, launched at LeWeb in 2011, but the idea is there]

NET-A-PORTER / ‘KARL’ brand launch – case study from fablez on Vimeo.

Net-A-Porter shows us that any location can be an opportunity for brands to serve augmented advertising experiences, with endless creative possibilities. As the technology continues to evolve, we might even see brands create their own Pokémon characters, or features that use programmatic technology to blend into the game and ultimately become an integral part of it, for example by offering targeted electronic vouchers or other perks. As long as advertisers learn the lessons of the rise of ad blockers and create ads that blend into the experience rather than interrupt it, digital advertising has a real place in the gaming revolution Niantic is leading.

In short, if you think Pokémon Go is an earthquake, get ready for even bigger opportunities that are only just hatching. A programmatic-enhanced Pokémon experience may well be coming to a location near you.

ROI, data matching, customer volatility: Carrefour monetizes its data | La Revue du Digital

Source: ROI, data matching, customer volatility: Carrefour monetizes its data | La Revue du Digital

Carrefour is monetizing the audiences of its superstores and its websites. The European retail leader is betting on the link between digital and its CRM, which covers 13.7 million loyalty card holders. It is a new line of business that is slowly finding its place at the retailer, built on technology.

With revenues above €40 billion in France, Carrefour is a giant of traditional retail, and while the chain already has a strong digital presence, it makes the bulk of its revenue in its physical stores.

Playing the Data Card

Nevertheless, the retailer now intends to play the "data" card, notably to win over advertiser brands and establish itself on the advertising market alongside the likes of Google and Facebook, adding a "C" to GAFA.

This ambition may seem outsized, but Carrefour has some arguments to weigh in the balance against the big advertisers on the French market, including one major asset: its data. "Today we have 13.7 million loyalty card holders and more than two years of history, with data precise down to the EAN (barcode)," says Michel Bellanger, marketing director of Carrefour Médias.

He spoke at the EBG general assembly on June 21. He continues: "Our challenge is to meet advertisers' expectations. Today, a point of sale is not just a store or an e-commerce site; it is an audience that we must be able to target."

A Vast CRM Onboarding Project

"90% of Carrefour's business today is done in physical stores, yet at the same time our customers are increasingly digital," the marketing director continues. To reconcile these two worlds, Carrefour chose Acxiom to "onboard" its CRM onto their LiveRamp platform.

Linking consumers identified by their loyalty card to their web browsing is a work in progress. This matching could eventually cover 60% of loyalty card holders, Michel Bellanger estimates.

100% is unattainable, since statistically 20% of Carrefour customers have no Internet access, and a further 20% do not wish to be reached with commercial messages through digital channels.

Linking a Banner to the Basket at Checkout

This onboarding enables new kinds of promotional operations. "It's what lets us offer our advertisers digital operations outside the store today." By "onboarding" Carrefour's CRM base, Michel Bellanger aims to link a banner pushed on one of the chain's websites (or on any site via retargeting) to a physical customer, a basket, a purchase.

"This lets us offer our advertisers genuinely measurable operations, whether simple media operations or promotional operations through our CRM."

Goal: Measure a Campaign's Real Impact Well After It Ends

For the marketing director of Carrefour's media sales house, this link between digital metrics, CRM data, and checkout data is crucial. "You need to be able to measure the standard effectiveness KPIs of digital campaigns, such as the number of people exposed, the number of clicks, the number of impressions, but you also need to understand the impact on in-store revenue."

And measured it has been. "In 2015 we ran nearly 40 campaigns for our FMCG (fast-moving consumer goods) clients, and the ROIs range from 0.5 to 11," he announces.

In practice, a communication operation may well be a digital success but a failure in terms of sales growth. Conversely, the effects of a digital campaign can extend well beyond the duration of the campaign itself, Michel Bellanger stresses.

€700,000 Gained for a €70,000 Outlay

Another case cited by Michel Bellanger is a campaign run last December to promote Carrefour's deli products. "We did something absolutely extraordinary: between exposed and non-exposed customers, we managed to raise the average basket from €90 to €91. And €1 of incremental revenue across 700,000 people represents an increment of €700,000, for a media investment of €70,000. The ROI calculation of such a campaign is obvious."
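The arithmetic behind that quote can be checked in a few lines. The figures are the article's; the 10x return per media euro is simply the division they imply.

```python
# Worked version of the deli-campaign arithmetic quoted above.
exposed_customers = 700_000
basket_before_eur = 90.0
basket_after_eur = 91.0
media_spend_eur = 70_000

incremental_per_customer = basket_after_eur - basket_before_eur     # €1
incremental_revenue = exposed_customers * incremental_per_customer  # €700,000
roi_multiple = incremental_revenue / media_spend_eur                # 10x

print(f"incremental revenue: €{incremental_revenue:,.0f}")
print(f"return per media euro: {roi_multiple:.0f}x")
```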

If the marketing head of Carrefour Médias quotes such precise figures, it is also because he wants to shift the mindset of advertisers, which remains heavily focused on the classic KPIs of digital marketing.

"We need to offer more powerful, higher-performing tools that give advertisers added value in understanding how the business is built. Big data gives us very fine data granularity. It makes it possible to follow an individual over time and thus, ultimately, to reconcile individual CRM data with business data on behalf of brands."

25% of Big Advertisers' Market Share

He adds: "Then you have to think about the best way to address them. The sales house gives advertisers several different ways to address a customer depending on the objective: email, direct mail, checkout coupon, and now an additional channel, targeted display."

Carrefour is selling its audiences. "Today we can offer advertisers a CRM that is already operational within the Carrefour ecosystem and represents 25% of the market share of big advertisers in France."

50% More Sales Thanks to the Digital Link

The effects of a digital campaign can extend well beyond the duration of the campaign itself, as illustrated by an example given by Michel Bellanger, marketing director of Carrefour Médias.

"We ran a campaign for a well-known white spirit [editor's note: rum] that goes into Mojitos. We targeted our customers and sent them recipes for Mojito and Cuba Libre cocktails," he explains.

The campaigns performed well. "They generated several million impressions and very satisfactory click-through rates compared with market benchmarks," he says. These are very classic digital marketing KPIs. Carrefour was then able to track the campaign's real impact at the checkout by targeting its "physical" loyalty-card-holding customers.

"The revenue increment reached 15 points during the 8 weeks of the campaign. And since we were tracking our customers' purchases through their loyalty card, we could establish the campaign's medium- and long-term performance on sales," he points out.

The campaign's performance could be measured according to whether or not customers had been exposed to the advertising. "So we watched the campaign's impact over the 8 weeks that followed, and the revenue increment between our exposed and non-exposed customers reached 50%!" he says, delighted.

Onboarding refines the calculation of campaign ROI. This link between digital metrics and sales data showed that people who received the recipes began drinking Mojitos and Cuba Libres regularly and did indeed go back to buy another bottle in the weeks following the campaign. This new approach should provide a medium- and long-term view of a digital campaign.
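The exposed-versus-control measurement Carrefour describes can be sketched as a simple comparison of average spend between the two groups. The data and field names below are hypothetical; a real pipeline would join ad-exposure logs to loyalty-card checkout data via the onboarding described above.

```python
# Toy exposed-vs-control uplift measurement.
purchases = [
    # (loyalty_card_id, exposed_to_campaign, spend_eur_over_8_weeks)
    ("c1", True, 30.0),
    ("c2", True, 24.0),
    ("c3", False, 18.0),
    ("c4", False, 18.0),
]

def mean_spend(rows, exposed):
    """Average spend for the exposed (or control) group."""
    spends = [spend for _, was_exposed, spend in rows if was_exposed is exposed]
    return sum(spends) / len(spends)

exposed_avg = mean_spend(purchases, True)    # €27.00
control_avg = mean_spend(purchases, False)   # €18.00
uplift_pct = 100 * (exposed_avg - control_avg) / control_avg

print(f"uplift: {uplift_pct:.0f}%")  # 50% on this toy data
```

In practice the groups must be comparable (matched or randomized) for the uplift to be attributable to the campaign rather than to selection effects.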

50% of a Brand's Customer Base Turns Over Every Year!

Analysis of checkout data has allowed Carrefour to reveal the extreme volatility of customers. "Coming from a media agency, I discovered on arriving at Carrefour that in one year, FMCG brands as a whole lose and recruit on average 50% of their customer base," says Michel Bellanger, marketing director of Carrefour Médias.

The customers do not, however, disappear from Carrefour. "Their customers stay with Carrefour, but they switch brands frequently, even though each brand's market share barely moves. Panels do not reflect the huge movements hidden behind their aggregated figures," the executive notes.

Faced with these phenomena, Carrefour is now betting on what is known as "people-based marketing": no longer the mere study of very general KPIs and aggregated metrics, but a big data approach. A single customer view based on the analysis of highly detailed data, covering both all the purchases a customer has made over several years and the data from digital media.

This single customer view is now operational. "It must be agnostic and include former customers, lapsed customers, those losing momentum, customers to reactivate, and of course new customers," he describes.


The Safe Haven: Pooling Data Without Sharing It

To further improve the effectiveness of the campaigns Carrefour Médias runs for its advertisers, Michel Bellanger, marketing director of Carrefour Médias, now hopes to push advertisers to open up their own CRMs, in order to match their customer data with Carrefour's and run better-targeted campaigns.

Since obviously no manufacturer would let retailers plunder its databases, Michel Bellanger is banking on a Safe Haven approach (DSSH / Data Science Safe Haven) as implemented on Acxiom's LiveRamp platform.

"This Safe Haven capability lets us respect each party's data. As an ad sales house, we know there is a real issue around data ownership. Carrefour has data it does not want to share with Groupe Seb, for example, and Seb does not want to share all its data with Carrefour. Yet we must be able to match our data to produce operations that make sense," Michel Bellanger sums up.

For the marketing director, this Safe Haven function, which makes it possible to match data in the cloud without either partner having direct access to the other's data, should allow Carrefour Médias to forge new partnerships, whether with manufacturers or with telecom operators and their geolocation data, and to refine marketing campaigns further for even greater impact.
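One simple way to grasp the "match without sharing" idea is pseudonymous matching: each party hashes its customer identifiers with a shared secret key held by a neutral matching party, so records can be joined on pseudonyms without either side exposing its raw CRM. This is only a toy illustration of the principle; the actual mechanisms used by platforms such as LiveRamp are proprietary and far more elaborate, and all names and data below are hypothetical.

```python
import hashlib
import hmac

# Secret agreed out of band and held by the neutral matching party.
SHARED_KEY = b"agreed-out-of-band"

def pseudonym(email: str) -> str:
    """Keyed hash of an identifier; irreversible without the key."""
    return hmac.new(SHARED_KEY, email.lower().encode(), hashlib.sha256).hexdigest()

# Each party pseudonymizes its own CRM before anything is pooled.
retailer_crm = {pseudonym("alice@example.com"), pseudonym("bob@example.com")}
brand_crm = {pseudonym("alice@example.com"), pseudonym("carol@example.com")}

# The overlap can be measured and targeted without exchanging raw emails.
shared_audience = retailer_crm & brand_crm
print(f"customers in both CRMs: {len(shared_audience)}")
```

Neither party ever sees the other's raw identifiers; only the neutral party can relate a pseudonym back to a real customer, which is the property the Safe Haven model relies on.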