Guillaume Planet, VP of media & digital marketing at Groupe SEB and a former media-agency executive (Havas, Fullsix, Dentsu Aegis…), shares his analysis of the upheaval in the advertising sector caused by the roll-out of Amazon's advertising business.
By disclosing, in its successive financial communications since early 2018, advertising sales figures showing very strong, not to say stunning, growth (roughly $4.2 billion over the first six months of the year), Amazon has made official its arrival among the major players in the advertising market. The research firm eMarketer even predicts that the group will become the third-largest online advertising player in the United States as early as 2018, ahead of Microsoft and Oath.
And this is only the beginning, because the bridges between its online retail business and its advertising business offer it enormous prospects.
Looking specifically at the retail sector, one can expect Amazon's moves to inspire imitators among other players, so clever is this evolution of the model. It rests on several levers:
1 – A competitive advantage
Amazon holds a double competitive advantage over other sellers of advertising space: massive holdings of transactional data, combined with a position as customer, rather than supplier, vis-à-vis the brands that buy that advertising space.
2 – A virtuous purchase-data-advertising circle
Amazon benefits from a virtuous circle: advertising investments funded by brands drive highly qualified traffic thanks to data; that traffic feeds the retailer's core sales business, which in turn generates even more data to fuel the advertising arm.
3 – A margin lever
Above all, the advertising business, especially given the assets described above, offers Amazon prospects of high profitability, whereas the retail trading business is inherently low-margin.
What impact on the advertising market?
Amazon's development has a threefold impact on the advertising sector:
1 – The platforms are following the same path
The other rising players will have to move closer to the retail world. Google has made this a priority, as shown by its recent partnerships with Walmart and Carrefour and its investment in JD.com. Tencent is doing the same, and Facebook is very probably looking into it too, as shown by the marketplace currently being tested on its platform.
Indeed, the opportunities for these players are considerable in terms of highly relevant data to fuel the effectiveness of the solutions they offer. Retail also opens access to other types of brand marketing budgets: the famous below-the-line ("BTL") budgets dedicated to points of sale, which are often larger than advertising budgets.
2 – Opportunities for new players
These market developments in Amazon's wake create opportunities for new types of players: pure players of retail advertising, such as Criteo.
3 – Traditional media pushed further to the margins
Above all, Amazon pushes the historical sellers of advertising space, the traditional media, a little further toward a marginal role in this business model. Already shaken by Google and Facebook, they face a growing number of competitors better armed to profit from the transformation of the advertising sector: rich in highly relevant data, mature in digital and data expertise, owners of sophisticated technical infrastructure, and financially ultra-powerful.
Faced with this new reality, the media are increasingly taking the subject in the right direction. After a period of denial and demonisation of the GAFA, they are now increasingly exploring new business models and rethinking their relationship with Google and Facebook, which they should treat as partners in engaging their audiences as effectively as possible.
How are the large retailers reacting?
Retailers are reacting in different ways. Surprisingly, they tend to move straight into the arms of their new competitors: Walmart and Carrefour partner with Google, Carrefour with Tencent, Auchan with Alibaba, Monoprix with Amazon… Their goal is to learn through these partnerships, but the risks are clearly significant.
What are those risks? That Monoprix, by joining Amazon's marketplace, loses access to the data that drives retail's new virtuous model. That Carrefour and Walmart hand Google, a potential future competitor (at the very least on the advertising side), the opportunity to climb the experience curve in retail. And that Auchan gives Alibaba the keys to understanding new target markets.
Partnering with players that are mature in digital and data yet less threatening (Criteo, for example, on the advertising side) would probably be a less risky way to learn the new codes of this sector.
On top of setting up an intensive education programme, Havas Media has also recruited numerous talents, who today strengthen all of the agency's entities.
At the start of the year, the agency had already welcomed experienced profiles such as Marc Dewulf (ex-Dentsu Aegis) as COO in charge of all expertise areas; Ruben Ceuppens (ex-Social Lab) as Head of Socialyse; and Arnaud Destrée (ex-GroupM) as Head of Programmatic.
Since May, no fewer than 10 new talents have joined the agency!
Patricia Lo Presti (ex-UM) joined the Commercial (Consulting) team as Account Director, as did Maurine Piette (ex-MediaCom) as Account Manager and Josiane Uwimana as Account Executive.
Caroline Grangé (ex-IPM) has taken charge of Publishing (Press and Digital), supported by Aurélie Renquet (ex-IPM) and Séfana Zoufir, Publishing Account Manager and Publishing Account Executive respectively.
Sandra Ruiz-Pelaez (ex-Havas Media Barcelona, ex-OMD España) brings her international experience as a Performance Expert.
Gaetan Ickx, PhD in biomedical sciences from UCL (CSA, Data Analyst), Céline Denoiseux (Broadcast, Account Executive) and Julien Droulans (Operations Coordinator) round out the recruitments of these first eight months.
Hugues Rey, CEO of Havas Media Group: "We have been operating in a serene environment for many months now, which translates into optimised recruitment; we are investing more in finding and training the right profiles, people who bring real added value to the organisation. We are delighted to welcome talents from diverse backgrounds who enrich our teams with their experience."
Late last year, Amazon premiered a system that may well be the future of shopping. Nicknamed Amazon Go, it looks just like a regular brick-and-mortar store, except there are no lines, no self-checkout machines, and no cashiers. The items you buy are tracked by sensors, your account is charged through your mobile Amazon Go app, and you can simply walk out of the store whenever you please.
Amazon Go is a revolutionary spin on retail, commerce, and the experience of going to a store. What’s really special about Amazon Go, however, is what it represents in terms of data.
All across the retail universe, the rapidly widening Internet of Things is becoming equipped for high-frequency event analytics. Across the board, that means faster decision-making, more helpful data, and smarter, more cost-efficient businesses.
The first stage in this process—“events occur”—is the most important one to consider in the retail context.
On a website, those events are fairly easy to understand. They might be clicks, button presses, or scrolling behavior. We've been trained to think about the web in terms of events; not so with brick and mortar. And yet the number of events that could conceivably be collected as data from a single retail experience is tremendous.
When people enter the store, what items they pick up, which they take with them and which they put down, what order they shop in, even how they navigate the store down to the most infinitesimal of details—all of this is information that could help companies increase revenues, lower costs, and build more efficient businesses. That’s also just the front-end of the retail experience.
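As a hedged sketch, in-store events like those could be modeled much the way web-analytics events already are. The event types and the metric below are invented for illustration, not an actual Amazon Go schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StoreEvent:
    """One discrete in-store event, the physical analogue of a click or scroll."""
    shopper_id: str
    event_type: str                  # e.g. "enter", "pick_up", "put_back", "purchase"
    item_sku: Optional[str] = None   # which item the event concerns, if any

def pickup_to_purchase_rate(events, sku):
    """Share of pick-ups of a SKU that ended in a purchase."""
    picked = sum(1 for e in events if e.event_type == "pick_up" and e.item_sku == sku)
    bought = sum(1 for e in events if e.event_type == "purchase" and e.item_sku == sku)
    return bought / picked if picked else 0.0
```

A metric like this (how often an item is handled but not bought) has no clean web-era equivalent, which is exactly the kind of front-of-store signal described above.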
The new retail nervous system
The Internet of Things has spread rapidly up and down the production supply chain, laying the foundation for the future of retail.
RFID chips on products allow companies to track their inventory with an unprecedented degree of precision, even as their shipments rattle around in shipping containers, cargo ships move in and out of port, and trucks travel across the country.
Companies like Flexport make it possible to manage and visualize those complex supply chains, many of which were barely digitized just a few years ago. Others help optimize last-mile delivery, manage warehouse capacity, and plan routes for truck drivers bringing goods to market.
In stores, the same tags that help track goods as they move around the world can be used to optimize pricing given alterations in local conditions or sudden surges of demand.
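A minimal sketch of such a pricing rule, assuming hourly sales and stock counts are available from the tag data; the thresholds and multipliers are invented for the example, not a real pricing policy:

```python
def adjusted_price(base_price, stock_on_hand, sales_last_hour):
    """Nudge price up when demand outpaces remaining stock, down when it lags.
    Thresholds and multipliers are purely illustrative."""
    if stock_on_hand == 0:
        return base_price  # nothing left to sell; leave the listed price alone
    demand_ratio = sales_last_hour / stock_on_hand
    if demand_ratio > 0.5:       # selling fast relative to stock: small surge
        return round(base_price * 1.10, 2)
    if demand_ratio < 0.05:      # barely moving: small discount
        return round(base_price * 0.95, 2)
    return base_price
```

A real system would of course fold in margins, competitor prices, and legal constraints; the point is only that tag-level inventory data makes the demand ratio observable in near real time.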
This network of physical/digital infrastructure is just the substratum, however, of the true analytics-enabled future of retail.
When data analytics meets retail
Event data is the foundation of all behavioral analytics.
When you’re tracking every discrete click, scroll, or other web action, you can start to look for patterns in the data that you’re collecting. You can see which pieces of content on your blog engage the most users, which version of your checkout flow is the best for conversions, and so on.
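The checkout-flow comparison, for instance, can be sketched as a small aggregation over event tuples; the variant and action names here are illustrative:

```python
from collections import defaultdict

def conversion_by_variant(events):
    """events: (variant, action) pairs, e.g. ("A", "start_checkout"), ("A", "purchase").
    Returns the purchase rate per checkout-flow variant."""
    starts = defaultdict(int)
    buys = defaultdict(int)
    for variant, action in events:
        if action == "start_checkout":
            starts[variant] += 1
        elif action == "purchase":
            buys[variant] += 1
    return {v: buys[v] / starts[v] for v in starts if starts[v]}
```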
There’s already technology out there to help investors like those at CircleUp analyze data around small businesses and predict those that will succeed based on a large corpus of historical data.
With the infrastructure of the Internet of Things in place, the same kind of analysis becomes possible on a physical scale. You can start to find patterns in what people buy, when people order, and how to build a more efficient goods-delivery system.
The possibilities are extensive and powerful. In Amazon’s concept store, you can easily imagine sensors that take notice whenever your gaze rests on a particular item for longer than usual, or when you pick something up only to put it back down afterwards.
The decision to not purchase an item would be just as important for Amazon’s recommendation engine as a confirmed sale—that data could even be fed back to the supplier for their marketing team to analyze the lost sale. Visual recognition systems could be used to show you an ad in the evening for that dress you were eyeing at the store in the afternoon.
That’s just scratching the surface of an extensive universe of possibilities. Already today, IoT-enabled retail is allowing companies to:
identify fraud before anyone from Loss Prevention even notices it’s happening
systematically reduce shrinkage by analyzing exactly where it’s coming from
A few years ago, Amazon patented the idea of “anticipatory shipping”—moving goods around based on their predictive analysis of likely consumer behavior. Because of your history, in other words, Amazon could predict that you were about to order a pack of toilet paper—and make sure it was in stock at the closest distributor well before you even clicked on the order button.
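A toy version of that prediction, assuming nothing more than a customer's past order dates; real anticipatory shipping would fold in far more signals than this:

```python
from datetime import date, timedelta

def predict_next_order(order_dates):
    """Naive heuristic: assume the customer reorders at their average
    historical interval. Returns None with fewer than two past orders."""
    if len(order_dates) < 2:
        return None
    dates = sorted(order_dates)
    gaps = [(b - a).days for a, b in zip(dates, dates[1:])]
    avg_gap = round(sum(gaps) / len(gaps))
    return dates[-1] + timedelta(days=avg_gap)
```

Stocking the predicted item at the nearest distribution center a few days before that date is the "anticipatory" part of the patent.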
In the retail world of the future, innovations like these won’t be cutting-edge. In the age of data analytics, they’ll be little more than table stakes.
The data analytics long tail
The free flow of event data in retail depends on the proliferation of data sources. The more sources of data that can be cross-referenced, the more patterns that can be found and the more intelligence that can be produced.
Fortunately, the retail space is in a great position for data sources. There are not only a massive number of in-store sources of data, from sensors to registers to RFID tags, but there are complementary online sources as well.
For businesses that exist only as brick and mortar, the proliferation of IoT components and data analysis will mean a massive step forward in terms of business intelligence.
For those that are both brick and mortar stores and online, the confluence of the IoT and traditional behavioral analytics will mean an unprecedented wealth of data and an unprecedented set of options for customer engagement.
For those of us who have thrown in our lot with data, it is an exciting and fascinating time to be around.
During a TEDx talk at the Rotterdam School of Management, Scott Mongeau, analytics manager at Deloitte, gave an interesting talk about how we can be tricked by analysis done with big data. He explains that the models used in big data analytics should be driven by the data, and not the other way around. Essentially, all models are wrong, but some are useful; data analytics can help solve some of the problems we currently have, but it is important to have better diagnostics and semantics.
This will ensure that the models used are sound and can better describe to computers what we are trying to achieve. He says that we should not become so focused on the engineering part of big data analytics that we forget the social context, which is equally important to understand in order to create true insights.
While the latest smart gizmo tends to grab headlines, industry experts are urging urban leaders to focus on tackling smart city challenges with their citizens, rather than on the technology itself. That's according to attendees at the recent VERGE 16 conference in Santa Clara, Calif., where leaders in the smart cities space gathered.
A key sentiment that emerged from the conference was that leaders in government and industry need to stay focused on the larger smart city picture and not get caught up in the latest gee-whiz technology.
Specifically, there needs to be greater focus on meshing emerging tech with the current political and economic systems that affect citizens.
“The technology solutions are there,” said Kiran Jain, Chief Resilience Officer for the City of Oakland. “What we’re really looking at are governance issues.”
The proliferation of new smart city platforms and equipment is driven partly by the increasing ease at which they are integrated into city infrastructure.
“We just put out an RFP last week that had the words ‘user-centric design,’” said Jain.
Cities need to evaluate their strategies
The shift from technology-centric strategies to user-centric mindsets also requires a realistic assessment of which populations of the city are actually benefiting from these innovations.
Specifically, local leaders must recognize that many smart city innovations are providing benefits to the better off segments of society. Meanwhile, those citizens struggling with poverty may not see much benefit at all from technology that makes the morning commute more pleasant.
“A lot of our focus has been on moving the top 20% of the market,” said Kimberly Lewis, senior vice president of the U.S. Green Building Council. “We thought the trickle-down effects would really begin to affect low- and moderate-income communities.”
She says key challenges are being exacerbated by assumptions that any smart city technological advancement automatically creates mass impact on the entire city population. However, it’s becoming clear that smart city technology is not a magic wand that can be waved to eliminate persistent challenges faced by poorer citizens.
For example, the community solar concept is beginning to gain traction in various markets, depending on the resources of those who wish to invest. This, however, raises the issue of how to make financing more accessible to communities that lack the resources to develop solar projects.
By Johannes Deichmann, Kersten Heineke, Thomas Reinbacher, and Dominik Wee
Monetizing the flood of information generated by the Internet of Things requires a well-executed strategy that creates value.
The Internet of Things (IoT) will turn the current rush of industrial data into a rogue wave of truly colossal proportions, threatening to overwhelm even the best-prepared company. As the gigabytes, terabytes, and petabytes of unstructured information pile up, most organizations lack actionable methods to tap into, monetize, and strategically exploit this potentially enormous new value. McKinsey research reveals that companies currently underutilize most of the IoT data they collect. For instance, one oil rig with 30,000 sensors examines only 1 percent of the data collected because it uses the information primarily to detect and control anomalies, ignoring its greatest value, which involves supporting optimization and prediction activities. One effective way to put IoT data to work and cash in on the growing digital bounty involves offering the information on data marketplaces to third parties.
How a digital marketplace creates value
Digital marketplaces are platforms that connect providers and consumers of data sets and data streams, ensuring high quality, consistency, and security. The data suppliers authorize the marketplace to license their information on their behalf following defined terms and conditions. Consumers can play a dual role by providing data back to the marketplace (Exhibit 1).
Third parties can offer value-added solutions on top of the data the marketplace offers. For example, real-time analytics can make consumer insights more actionable and timelier than ever before. The marketplace also has an exchange platform as a technical base for the exchange of data and services, including platform-as-a-service offers. Several key enablers of the data marketplace can help companies put their data to work more effectively:
Opening up new monetization opportunities. Today’s interconnected and digitized world increases the value of high-quality data assets while creating innovative revenue streams. One digital marketplace, for example, adds value to Europe’s electric-automobile market by providing information and transactional gateways for businesses such as charging-infrastructure providers, mobility-service players, and vehicle manufacturers. Charging-station operators, for example, are free to determine their own pricing structures based on data available about customer habits and market trends.
Enabling crowdsourcing. Data marketplaces make it possible to share and monetize different types of information to create incremental value. By combining information and analytical models and structures to generate incentives for data suppliers, more participants will deliver data to the platform.
Supporting interoperability. Data marketplaces can define metaformats and abstractions that support cross-device and cross-industry use cases.
Creating a central point of “discoverability.” Marketplaces offer customers a central platform and point of access to satisfy their data needs.
Achieving consistent data quality. Service-level agreements can ensure that marketplaces deliver data of consistently high quality.
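The interoperability enabler, in practice, comes down to mapping each supplier's native fields onto one shared metaformat. A minimal sketch, with invented vendor and field names:

```python
def normalize_reading(raw, source):
    """Map device-specific field names onto one shared metaformat so
    cross-device, cross-industry consumers see a single schema.
    The vendor mappings here are invented for the example."""
    mappings = {
        "vendor_a": {"ts": "timestamp", "val": "value", "u": "unit"},
        "vendor_b": {"time": "timestamp", "reading": "value", "unit": "unit"},
    }
    return {common: raw[native] for native, common in mappings[source].items()}
```

A marketplace acting as "data normalizer" (see the start-up phase described below under value creation) would run exactly this kind of translation, plus syntactic validation, on every incoming feed.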
Designing a data-sharing platform
As they consider the process of setting up a data marketplace, company leaders need to work through a number of critical questions. An enterprise might ponder the following issues as it clarifies its data-market strategy:
What is the data marketplace’s scope? In most cases, a data marketplace begins when companies set up a central exchange for data within their own organizations. Later, they determine which categories of information within that internal exchange are appropriate (from a security and a profitability perspective) and then allow other players outside their organization (and perhaps outside their industry) to access that data.
How is the marketplace best structured? To foster a dynamic ecosystem, the data marketplace needs to assume a neutral position regarding participants. The legal/tax entity that the marketplace becomes and the structures that govern and finance it are key to this neutrality. Among the guiding principles that players follow in setting up data marketplaces are that a) the marketplace must finance itself through transaction-related fees and commissions, and b) neutrality must extend to future participants that provide or receive data or services, offering nondiscriminatory access to all interested players under fair terms and conditions. And while the data marketplace will support the creation and definition of data licenses, the data suppliers must nevertheless take responsibility for enforcing and legally auditing them. With respect to the marketplace’s governance, two business models are leading the way. Data marketplaces tend to be either independent platforms or limited-ownership hybrids. Under the former model, data sets are bought and sold, while fully owned data-as-a-service providers sell primary data in specific segments or with services and solution wraps. Under the latter, the marketplace collects and aggregates data from multiple publishers or data owners and then sells the data.
Who are the data marketplace’s customers? Once the marketplace is commercially viable, customers will include all types of data providers, and the marketplace system should actively source new kinds of data to become more attractive. The key providers of data will be the companies that capture it, own it, and authorize its sharing. At some point, however, application developers will offer infrastructure and support services that further increase the value of the data by offering a relevant analysis of it and facilitating its delivery.
What are the marketplace’s overall terms and conditions, and data categories? During the marketplace’s technical setup phase, data suppliers define their licensing conditions independently, and the platform provides benchmarks for licensing conditions. The overall terms and conditions of the marketplace apply to all traded data. In the subsequent commercialization phase, the marketplace relies on centrally defined data categories and related licensing agreements as expressed in its general terms and conditions. This strategy enables players to license crowdsourced data independently of specific suppliers.
How does the marketplace relate to other licensing models? When dealing with proprietary data, suppliers usually hold certain information apart and do not share it in the marketplace. However, data suppliers that also offer services can make use of their proprietary data to create services they can trade on the marketplace. For other licensed data, information suppliers can freely create licensing agreements that extend beyond the marketplace—for example, with their strategic partners. Both data amount and type, along with the scope of licenses for using the information, can vary from that of marketplace-supplied data. Likewise, suppliers can also impose separate licensing arrangements for data already traded in the marketplace if buyers intend to use it under different conditions.
What are the role and value-creation potential of the marketplace company or participating data brokers? The potential value of the data will differ depending on whether the marketplace is in the technical start-up phase or has achieved full commercialization (Exhibit 2). In the former, the marketplace acts as a data normalizer, defining standard data models, formats, and attributes for all of the traded information. It syntactically verifies all incoming data compared with the defined standard and continuously manages and extends the data inventory. Once the marketplace enters the commercial stage, it becomes a data aggregator. At this point, in addition to normalizing data and verifying incoming information, it aggregates data and organizes it into logical bundles. For instance, it will enable users to combine data for a given region and offer it to service providers.
Choosing a monetization model
While traditional licensing will provide marketplace revenue streams, participants can also develop transactional models to monetize data and services, with on-demand approaches constituting the preferred approach. With traditional licensing, companies can pursue either perpetual or one-off deals and collect customer fees using several approaches. For example, they can sign contracts with fixed fees and run times, renegotiate expired contracts, or earn revenues at the time of sale (this final approach typically provides less stability in revenue forecasting). At the transactional level, the two primary alternatives are on-demand and subscription services. With on-demand services, customers either pay as they go or choose volume pricing and pay charges based on metrics such as usage volume, the number of incidents, or hardware-related fees. Subscriptions can involve flat fees—typically applied on a monthly or yearly basis—or free/premium (“freemium”) offers, which provide the basics free of charge while offering additional features for a flat fee.
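The break-even logic between the two transactional models can be sketched in a few lines; the prices are placeholders, not real marketplace rates:

```python
def on_demand_cost(events, price_per_event):
    """Pay-as-you-go: the charge scales with usage volume."""
    return events * price_per_event

def subscription_cost(months, flat_monthly_fee):
    """Flat fee, typically billed on a monthly or yearly basis."""
    return months * flat_monthly_fee

def cheaper_model(monthly_events, price_per_event, flat_monthly_fee):
    """Which model costs a data consumer less at a given monthly usage level?"""
    if monthly_events * price_per_event < flat_monthly_fee:
        return "on-demand"
    return "subscription"
```

Light users come out ahead on-demand, heavy users on subscription, which is why marketplaces typically offer both alongside freemium tiers.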
Another monetization option is the “give and take” model, which offers incentives to data providers to share their information. The incentive can be monetary or take the form of something like highly relevant, aggregated data as an enticement to share. The marketplace then aggregates and anonymizes the data and offers it along with associated data-focused services to customers.
One give-and-take example is an Internet-based service that offers geolocated real-time aircraft flight information. The service reportedly has one of the largest online aviation databases, covering hundreds of thousands of aircraft and flights as well as large numbers of airports and airlines. Data suppliers receive free radio equipment that collects and transmits aircraft data and a free business-level membership to the service worth $500 a year for as long as they transmit data. In another case, a large European credit bureau offers credit-rating information for consumers and corporations. Data suppliers provide information that includes banking activities, credit and leasing agreements, and payment defaults. In return, they receive credit-ranking data for individuals or businesses. Yet another give-and-take marketplace focuses on data and performance analytics on mobile-operator network coverage. It trades apps and coverage information to data suppliers in exchange for crowdsourced data that can generate mobile-network coverage maps and reveal a mobile operator’s performance by region and technology (for example, 3G or 4G networks).
Assessing the competition
A wide variety of traditional commercial data services currently exists, although these services are largely in silos that focus on specific topics, such as healthcare, finance, retail, or marketing. This balkanization provides an opportunity for new, more holistic data-business models. One advantage of the current ubiquity of data providers is that most companies are already familiar with dealing with them. In fact, some sources estimate that 70 percent of large organizations already purchase external data, and all of them are likely to do so by the end of the decade. The value potential inherent in data marketplaces is attracting key players from a variety of advanced industries. A number of aerospace companies, for example, offer systems that provide guidance to customers in areas such as maintenance and troubleshooting. Similar efforts are also under way in the agricultural and mining-equipment industries, among others.
The IoT’s big data promises to help companies understand customer needs, market dynamics, and strategic issues with unmatched precision. But in pursuing this goal, organizations will amass previously unimaginable quantities of information. The data marketplace offers them an innovative way to turn some of that data into cash and reap the benefits that will accrue from building a self-reinforcing ecosystem, enabling crowdsourcing, supporting interoperability, satisfying customer data needs, and improving data quality.
In all their many shapes and forms, sports entertain us, bring us joy, and showcase the pinnacle of our physical and mental abilities. We love following sports! But in all the excitement and exhilaration they offer us fans, it’s sometimes easy to forget that they’re also businesses. For event organisers, sponsors, the media, and professional athletes, sports must also be lucrative, profitable, and sustainable.
The Tour de France is organised each year by Amaury Sport Organisation (A.S.O.). It’s a great example of how a professional sports business saw the massive potential that the digital era holds for all organisations … and how it’s now transforming itself in several ways to leverage that opportunity through technology. A.S.O. is revolutionising the viewing experience of pro cycling and is doing so through data storytelling. Here are seven important lessons we can learn about digital business from A.S.O.’s journey so far.
Lesson 1: Know what it is you’re really selling
As a sports media organisation, A.S.O.’s product is not necessarily the sport of cycling itself, but the stories it’s able to tell about it. The newspaper L’Auto created the Tour de France back in 1903 with the sole purpose of selling more copies. The newspaper featured fantastic stories of this exciting cycling event: the trials, tribulations, crashes, punishing climbs, daring descents, and dashing sprints to the finish line. That was the product they were selling … and it hasn’t changed much throughout A.S.O.’s evolution, nor in the Tour’s more than a century of existence.
Cycling today is still a sport in which fitness, teamwork, strategy, mental endurance, and sheer skill all play a part in ensuring victory and glory. The more compelling, intriguing, and entertaining that story is, the more attractive the product that A.S.O. has to offer, and the greater the audience it will attract to consume that product.
Lesson 2: Change will always happen – recognise it when it does
What did change over the last century, however – and massively so – was the buyer of A.S.O.’s product, the consumers of sports stories around the world. And much of that had to do with technology. As technology evolved over the last 100 years, so did the ways in which sports audiences wanted to consume their stories. With the dawn of the digital era came an explosion of new devices, applications, and social and digital channels. Along came a new type of sports fan, too, for whom traditional TV, radio, and print simply weren’t good enough anymore. The new sports audiences wanted more ways of consuming sports stories, more information, more involvement, more interaction, and more conversation about the sport they were passionate about.
Read this blog or download this infographic for more about the close relationship between the evolution of technology and sport.
Lesson 3: Technology has the power to transform entire industries
A.S.O. therefore faced a challenge: they knew they needed to adapt, enrich, and enhance how they told the Tour de France story, in order to leverage the opportunity that technology and new audiences had to offer. And there was only one way of doing so: through the power of technology itself. As a result, A.S.O. dipped its toes into the world of live-tracking and data analytics for the first time during the 2015 race, with the help of Official Technology Partner, Dimension Data. By 2016, A.S.O. was ready to use data to tell truly exciting stories about pro cycling, and deliver this newly enriched product to fans, professionals, and the media around the world – not only via traditional channels, but also online and through social media.
But how exactly did they go about it and how was Dimension Data able to help?
Lesson 4: Digital business begins and ends with data
Data is the lifeblood of digital business. But making the most of your data means more than collecting, storing, and processing bits and bytes. A reliable flow of data across a secure, stable infrastructure also helps to turn data into useful information. At the 2016 Tour de France, it was all about data in motion. Our mobile data centre, called the big data truck, was literally on the road every day to follow wherever the greatest cycling race in the world went. Watch this video for more about how the big data truck formed the analytics hub of the data that flowed from each bike in the race, to eventually help A.S.O. tell great stories of pro cycling.
Lesson 5: People who work for digital businesses work in new and exciting ways
It takes a fantastic group of passionate, willing, and able people to pull off a massive technology project like the Tour de France. It also takes some great technology to bring them all together, in real time, from around the world. Our Tour de France team worked every day of the event from inside the big data truck, their office on the road. Through a 24-hour technical development cycle, the team ensured the solution kept up with the dynamics of the race. The truck itself is a great example of a workspace for tomorrow. It’s completely mobile and incorporates an impressive range of communication and collaboration tools that bring people together from all over the world while they work. If you wonder what it takes to operate within a global team in an office that moves every day, watch this video.
Lesson 6: A cornerstone technology of digital business is cloud
Cloud gives you the ability to scale your digital business instantly in order to respond smarter and faster to massive data volumes and dynamic market conditions. This was also true for the Tour de France. This year, we delivered valuable race information to A.S.O. through a unified digital platform hosted in the cloud, in parallel with a big data truck disaster recovery solution. The quality of data and real-time availability of that data were critical components of the technology solution. Dimension Data’s cloud ensured we kept the data secure, and encrypted connectivity allowed the race information to be transmitted to broadcasters, the media, the teams, and viewers securely. Watch this video to learn more:
Lesson 7: It always takes a team
As much as technology stands central to digital business, business itself is always about people. The Tour de France technology solution was no exception. Behind all the buzz and excitement of one of the greatest sporting events in the world was a large, global team of dedicated, passionate people working together to deliver something great. They’re the ones who made it all happen. Watch this video to meet some of them, and find out what it was like to help A.S.O. tell great stories of pro cycling.
Lastly, if you’re wondering what stories we were able to help the A.S.O. tell at the greatest cycling race in the world this year, take a look at this infographic. It’s a great example of how data analytics can help revolutionise the viewing experience of professional sport through effective data storytelling.