Artificial intelligence, machine learning, and smart things promise an intelligent future.

Source: Gartner’s Top 10 Strategic Technology Trends for 2017 – Smarter With Gartner

Top 10 Strategic Technology Trends 2017

Intelligent

AI and machine learning have reached a critical tipping point and will increasingly augment and extend virtually every technology-enabled service, thing or application. Creating intelligent systems that learn, adapt and potentially act autonomously, rather than simply execute predefined instructions, is the primary battleground for technology vendors through at least 2020.

Trend No. 1: AI & Advanced Machine Learning

AI and machine learning (ML), which include technologies such as deep learning, neural networks and natural-language processing, can also encompass more advanced systems that understand, learn, predict, adapt and potentially operate autonomously. Systems can learn and change future behavior, leading to the creation of more intelligent devices and programs.  The combination of extensive parallel processing power, advanced algorithms and massive data sets to feed the algorithms has unleashed this new era.

In banking, you could use AI and machine-learning techniques to model current real-time transactions, as well as predictive models of transactions based on their likelihood of being fraudulent. Organizations seeking to drive digital innovation with this trend should evaluate a number of business scenarios in which AI and machine learning could drive clear and specific business value and consider experimenting with one or two high-impact scenarios.
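To make the fraud-modeling idea concrete, here is a minimal sketch (not Gartner's or any bank's method): flag transactions that deviate sharply from an account's historical baseline, with a simple z-score standing in for a trained ML model. The account history, amounts and threshold are all invented for illustration.

```python
# Illustrative fraud flagging: score each new transaction by how far its
# amount sits from the account's historical mean, in standard deviations.
from statistics import mean, stdev

def fraud_scores(history, new_transactions, threshold=3.0):
    """Return (amount, is_suspicious) pairs via a simple z-score test."""
    mu, sigma = mean(history), stdev(history)
    results = []
    for amount in new_transactions:
        z = abs(amount - mu) / sigma if sigma else 0.0
        results.append((amount, z > threshold))
    return results

# Hypothetical account history and two incoming transactions.
history = [12.5, 40.0, 23.1, 18.7, 35.2, 22.0, 27.9, 31.4]
print(fraud_scores(history, [25.0, 900.0]))  # the 900.0 outlier is flagged
```

A production system would learn from labeled fraud outcomes and many more features, but the shape is the same: model normal behavior, then score deviations in real time.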

Trend No. 2: Intelligent Apps

Intelligent apps, which include technologies like virtual personal assistants (VPAs), have the potential to transform the workplace by making everyday tasks easier (prioritizing emails) and their users more effective (highlighting important content and interactions). However, intelligent apps are not limited to new digital assistants – every existing software category, from security tooling to enterprise applications such as marketing or ERP, will be infused with AI-enabled capabilities. Using AI, technology providers will focus on three areas — advanced analytics, AI-powered and increasingly autonomous business processes, and AI-powered immersive, conversational and continuous interfaces. By 2018, Gartner expects most of the world’s largest 200 companies to exploit intelligent apps and utilize the full toolkit of big data and analytics tools to refine their offers and improve customer experience.

Trend No. 3: Intelligent Things

New intelligent things generally fall into three categories: robots, drones and autonomous vehicles. Each of these areas will evolve to impact a larger segment of the market and support a new phase of digital business, but these represent only one facet of intelligent things. Existing things, including IoT devices, will become intelligent things, delivering the power of AI-enabled systems everywhere, including the home, office, factory floor and medical facility.

As intelligent things evolve and become more popular, they will shift from a stand-alone to a collaborative model in which intelligent things communicate with one another and act in concert to accomplish tasks. However, nontechnical issues such as liability and privacy, along with the complexity of creating highly specialized assistants, will slow embedded intelligence in some scenarios.

Digital

The lines between the digital and physical world continue to blur, creating new opportunities for digital businesses. Look for the digital world to be an increasingly detailed reflection of the physical world, and for the digital world to appear as part of the physical world, creating fertile ground for new business models and digitally enabled ecosystems.

Trend No. 4: Virtual & Augmented Reality

Virtual reality (VR) and augmented reality (AR) transform the way individuals interact with each other and with software systems, creating an immersive environment. For example, VR can be used for training scenarios and remote experiences. AR, which enables a blending of the real and virtual worlds, means businesses can overlay graphics onto real-world objects, such as hidden wires on the image of a wall. Immersive experiences with AR and VR are reaching tipping points in terms of price and capability but will not replace other interface models. Over time, AR and VR will expand beyond visual immersion to include all human senses. Enterprises should look for targeted applications of VR and AR through 2020.

Trend No. 5: Digital Twin

Within three to five years, billions of things will be represented by digital twins, a dynamic software model of a physical thing or system. Using physics data on how the components of a thing operate and respond to the environment, as well as data provided by sensors in the physical world, a digital twin can be used to analyze and simulate real-world conditions, respond to changes, improve operations and add value. Digital twins function as proxies for the combination of skilled individuals (e.g., technicians) and traditional monitoring devices and controls (e.g., pressure gauges). Their proliferation will require a cultural change, as those who understand the maintenance of real-world things collaborate with data scientists and IT professionals. Digital twins of physical assets, combined with digital representations of facilities and environments as well as people, businesses and processes, will enable an increasingly detailed digital representation of the real world for simulation, analysis and control.
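The twin concept can be sketched in a few lines: a software object that mirrors sensor readings from a physical asset and can simulate "what if" changes without touching the real thing. The pump model and pressure threshold below are illustrative assumptions, not a real product's API.

```python
# Minimal digital-twin sketch: mirror a physical pump's sensor readings
# and simulate responses to hypothetical changes.
class PumpTwin:
    def __init__(self, max_pressure=8.0):
        self.max_pressure = max_pressure
        self.state = {"pressure": 0.0, "status": "ok"}

    def ingest(self, sensor_reading):
        """Update the twin from a real-world pressure sensor reading."""
        self.state["pressure"] = sensor_reading
        self.state["status"] = "alert" if sensor_reading > self.max_pressure else "ok"

    def simulate(self, delta):
        """Predict the status if pressure changed by `delta`, leaving state untouched."""
        projected = self.state["pressure"] + delta
        return "alert" if projected > self.max_pressure else "ok"

twin = PumpTwin()
twin.ingest(6.5)
print(twin.state["status"], twin.simulate(2.0))  # ok alert
```

Real twins add physics models and historical analytics, but the division of labor is the same: ingest real telemetry, then analyze and simulate against the mirrored state.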

Trend No. 6: Blockchain

Blockchain is a type of distributed ledger in which value exchange transactions (in bitcoin or another token) are sequentially grouped into blocks. Blockchain and distributed-ledger concepts are gaining traction because they hold the promise of transforming operating models in industries such as music distribution, identity verification and title registry. They promise a model to add trust to untrusted environments and reduce business friction by providing transparent access to the information in the chain. While there is a great deal of interest, the majority of blockchain initiatives are in alpha or beta phases and significant technology challenges exist.
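The core structure described here, transactions grouped into blocks that are chained by hashes, can be shown in a bare-bones sketch (omitting consensus, signatures and networking, which real blockchains require):

```python
# Toy ledger: each block stores transactions plus the hash of its
# predecessor, so tampering with any earlier block breaks the chain.
import hashlib
import json

def make_block(transactions, prev_hash):
    body = json.dumps({"tx": transactions, "prev": prev_hash}, sort_keys=True)
    return {"tx": transactions, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def valid(chain):
    """A chain is valid when every block points at its predecessor's hash."""
    return all(chain[i]["prev"] == chain[i - 1]["hash"] for i in range(1, len(chain)))

genesis = make_block(["alice->bob:5"], "0" * 64)
chain = [genesis, make_block(["bob->carol:2"], genesis["hash"])]
print(valid(chain))                             # the intact chain verifies

chain[0]["tx"].append("mallory->mallory:99")    # tamper with block 0...
chain[0]["hash"] = make_block(chain[0]["tx"], chain[0]["prev"])["hash"]
print(valid(chain))                             # ...and block 1's back-pointer no longer matches
```

This is what makes the ledger transparent and tamper-evident: any rewrite of history invalidates every later block's back-pointer.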

Mesh

The mesh refers to the dynamic connection of people, processes, things and services supporting intelligent digital ecosystems.  As the mesh evolves, the user experience fundamentally changes and the supporting technology and security architectures and platforms must change as well.

Trend No. 7: Conversational Systems

Conversational systems can range from simple informal, bidirectional text or voice conversations such as an answer to “What time is it?” to more complex interactions such as collecting oral testimony from crime witnesses to generate a sketch of a suspect.  Conversational systems shift from a model where people adapt to computers to one where the computer “hears” and adapts to a person’s desired outcome.  Conversational systems do not use text/voice as the exclusive interface but enable people and machines to use multiple modalities (e.g., sight, sound, tactile, etc.) to communicate across the digital device mesh (e.g., sensors, appliances, IoT systems).
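A toy sketch of that shift, from rigid command syntax to a system that listens for the user's intent in free-form text, might look as follows. The intent patterns and canned replies are invented for illustration; real systems use trained language models rather than regular expressions.

```python
# Minimal intent-matching sketch: map free-form utterances to handlers
# instead of requiring an exact command syntax.
import re

INTENTS = [
    (re.compile(r"\btime\b", re.I), lambda: "It is 10:42."),      # canned reply for the demo
    (re.compile(r"\bweather\b", re.I), lambda: "Sunny, 22 degrees."),
]

def respond(utterance):
    for pattern, handler in INTENTS:
        if pattern.search(utterance):
            return handler()
    return "Sorry, I did not understand that."

print(respond("What time is it?"))   # It is 10:42.
print(respond("hello"))              # Sorry, I did not understand that.
```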

Trend No. 8: Mesh App and Service Architecture

The intelligent digital mesh will require changes to the architecture, technology and tools used to develop solutions. The mesh app and service architecture (MASA) is a multichannel solution architecture that leverages cloud and serverless computing, containers and microservices as well as APIs and events to deliver modular, flexible and dynamic solutions. Solutions ultimately support multiple users in multiple roles using multiple devices and communicating over multiple networks. However, MASA is a long-term architectural shift that requires significant changes to development tooling and best practices.

Trend No. 9: Digital Technology Platforms

Digital technology platforms are the building blocks for a digital business and are necessary to break into digital. Every organization will have some mix of five digital technology platforms: information systems, customer experience, analytics and intelligence, the Internet of Things and business ecosystems. In particular, new platforms and services for IoT, AI and conversational systems will be a key focus through 2020. Companies should identify how industry platforms will evolve and plan ways to evolve their platforms to meet the challenges of digital business.

Trend No. 10: Adaptive Security Architecture

The evolution of the intelligent digital mesh and digital technology platforms and application architectures means that security has to become fluid and adaptive. Security in the IoT environment is particularly challenging. Security teams need to work with application, solution and enterprise architects to consider security early in the design of applications or IoT solutions.  Multilayered security and use of user and entity behavior analytics will become a requirement for virtually every enterprise.
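The user-and-entity behavior analytics mentioned above can be sketched simply: build a per-user baseline of normal activity, then flag events that fall far outside it. The login-hour model and tolerance below are illustrative assumptions, not a description of any real UEBA product.

```python
# UEBA-style sketch: learn each user's typical login hours, then flag
# logins far outside that baseline as anomalous.
from collections import defaultdict

class LoginMonitor:
    def __init__(self):
        self.history = defaultdict(list)   # user -> list of past login hours

    def record(self, user, hour):
        self.history[user].append(hour)

    def is_anomalous(self, user, hour, tolerance=3):
        """Flag a login more than `tolerance` hours from every prior login."""
        past = self.history[user]
        return bool(past) and all(abs(hour - h) > tolerance for h in past)

monitor = LoginMonitor()
for h in (8, 9, 9, 10):                   # alice normally logs in mid-morning
    monitor.record("alice", h)
print(monitor.is_anomalous("alice", 3))   # a 3 a.m. login is flagged
print(monitor.is_anomalous("alice", 9))   # a 9 a.m. login is not
```

The adaptive part in practice is that the baseline keeps updating as behavior changes, so security decisions track the entity rather than a static rule set.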

David Cearley is vice president and Gartner Fellow in Gartner Research and is a leading authority on information technology. Mr. Cearley analyzes emerging and strategic business and technology trends and explores how these trends shape the way individuals and companies derive value from technology.

CES 2017: A glimpse into the future of marketing

Posterscope’s Jeff Tan peers into the crystal ball that is CES to discover the latest trends for tech and brands.

Source: CES 2017: A glimpse into the future of marketing

CES is a crystal ball providing an exciting peek at how mainstream consumers will interact with technology and brands. It is a prophetic look into the future of marketing and there are several themes that are of importance.

Voice AI is here to stay

The huge volume of gadgets that offered integration with Amazon Alexa (including televisions, fridges and alarm clocks) showed that Amazon is clearly ahead in the voice-activated speaker market. We will witness the battle of the voice platforms including Google, Microsoft and the rumoured Apple assistant.

Voice AI developments are coming as fast as autonomous vehicles. Only 1% of digital integrations are currently voice-activated; this will rise to 30% by 2020.

The platform battle will be won by whoever can provide seamless interaction and developer integration. This will require enormous data processing capabilities – the average person can type 40 wpm, but can speak 145 wpm.
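The data-processing gap implied by those words-per-minute figures is easy to quantify. Assuming roughly six characters per word (five letters plus a space, an assumption for illustration):

```python
# Compare the raw text throughput of typing vs. speaking.
CHARS_PER_WORD = 6  # assumption: 5 letters + 1 space

typing_cps = 40 * CHARS_PER_WORD / 60    # characters per second while typing
speech_cps = 145 * CHARS_PER_WORD / 60   # characters per second while speaking

print(typing_cps, speech_cps, speech_cps / typing_cps)  # 4.0 14.5 3.625
```

Speech produces over 3.6 times the input stream of typing, before the platform even begins interpreting intent, which is where the heavy processing lies.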

Marketing implication: Brands will compete to be the first default recommendation in voice AI engines, e.g. asking for restaurants, coffee brands or movies to watch. A parallel with the early days of search engine marketing could lead to a resurgence in audio advertising via optimized suggestions and paid bidding for voice-activated keywords.

Face and gesture recognition

An increasing number of technologies are specializing in facial and gesture tracking with enormous potential for marketers, including Netatmo, an outdoor camera that recognizes people, cars and animals, and eyeSight, a gesture detection unit that allows the control of experiences via finger tracking and hand gestures.

Such technologies will lead to the retail store of the future that can scan a shopper’s eyelids and irises to detect what skirt she is looking at, and understand the facial cues that indicate emotion and whether she has a strong visceral reaction to the colour red.

Shopping malls will be able to detect personality type better than a real human and direct shoppers via digital OOH to certain aspects within the mall.

Restaurants will know who you are and your favourite wine as you enter, allowing waiters (or robotic waiters) to recommend pairing options accordingly.

Marketing implication: Retailers can capture and analyze data, providing real-time personalized recommendations for products based on current emotions or actions. Posterscope USA created the world’s first responsive facial recognition campaign for General Motors that displayed one of 30 videos to shoppers based on their age, gender and facial expression.
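The selection logic behind such a campaign reduces to mapping detected audience attributes onto a pre-cut creative. The attribute buckets and file names below are hypothetical, invented only to show the shape of the mapping:

```python
# Hypothetical creative selector: detected attributes -> one of several
# pre-produced videos (a real campaign would have finer-grained buckets).
def pick_video(age, gender, expression):
    bucket = "young" if age < 35 else "older"
    mood = "happy" if expression == "smile" else "neutral"
    return f"gm_{bucket}_{gender}_{mood}.mp4"

print(pick_video(28, "f", "smile"))   # gm_young_f_happy.mp4
print(pick_video(52, "m", "frown"))   # gm_older_m_neutral.mp4
```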

Security, privacy and the issue of trust

Large-scale hacking in 2016 drew attention to privacy issues; Yahoo, Verizon, Dropbox, and even the Democratic National Committee were targeted.

Today’s 11 billion global connected devices will increase to 80 billion by 2025. Companies that make these devices are typically not security companies, and popular culture such as Netflix’s Black Mirror has painted an image of a distrustful, connected society.

CES featured companies dedicated to security including Bit Defender Box, a network device that prevents hacking into connected home devices.

Marketing implication: Increased scrutiny of privacy is a good thing and needs to be taken seriously by marketers. Data protection should be treated as paramount, and marketers need to respect the individual with the continued evolution of data-driven, programmatic media.

Automation and the connected-everything

The old companies you thought you knew have transformed into smart-technology companies intent on making our lives easier via automating and connecting our utilities.

The Panasonic smart kitchen features a digital kitchen wall with video recipes based on your refrigerator’s contents. A smooth, marble bench surface is transformed seamlessly into a heated stove top as a pot is moved around the surface. Once dinner is finished, the Whirlpool Zera food recycler can keep a garden healthy by producing 25 lbs of compost a week.

Connected cars continued their dominance at CES with new models including the breathtaking Faraday, electric ride sharing Honda and Alexa-integrated Ford. In ten years most new cars will be autonomous, and for everyone else there are after-market retrofitted autonomous kits such as Delphi.

Even the sport of fishing didn’t escape automation with the PowerRay underwater robot combining fish-detection with VR live-streamed video.

Marketing implication: Automation is changing all aspects of our lives, both as consumers and marketers. The businesses we work in today need to transform to become technology and data led.

Our job titles in as little as five years’ time will be vastly different from today’s. The savvy marketer will adapt, retool and retrain today to stay relevant in the future. Taken individually, these trends are exciting. When combined, they’re mind-blowing.

CES gives us a glimpse into the future of marketing, one of utility, automation and deep personalization. As marketers it will no longer be acceptable to blast consumers with a one-size-fits-all approach. Our role is to provide valuable interactions, hyper-relevant to the micro moments in consumers’ lives.

In the near future, the car I’m driving will detect that I’m drowsy by analyzing my face and driving patterns. She will say “Hello Jeff, you’ve been driving for eight hours. Why don’t you stop for a coffee? There is a Starbucks 1.5 miles ahead.” As I pass a digital billboard that triggers Starbucks content, I will turn into a parking lot to speak to a voice activated digital barista who already knows my order. The future of marketing is exciting.

Jeff Tan is vice president of strategy at Posterscope.
Read more at http://www.campaignlive.co.uk/article/ces-2017-glimpse-future-marketing/1420255#s1dUTJJjt0wgJ6UM.99

The 500 objects that will connect the “smart home” by 2022 | FrenchWeb.fr

Source: Ces 500 objets qui vont connecter la «maison intelligente» d’ici 2022 | FrenchWeb.fr

Nearly half of consumers plan to buy one or more objects to make their home “smart,” according to a study by Joshfire, a company specializing in the design of connected objects, published on Tuesday, August 30. In the smart-home market, three major trends currently appeal to consumers: security, optimization of energy consumption, and entertainment, according to the same study, which aggregates data from various published studies on the subject.

The smart home will be able to anticipate its inhabitants’ needs

Although our homes have in fact been “connected” since the arrival of ADSL and Wi-Fi, it is the rise of the IoT that is bringing the smart-home market to the mainstream. By 2020, nearly 50 billion objects should be connected, or 15% of all objects produced, and by 2022 a “typical” home should contain more than 500 different connected objects, according to the study’s authors.

Connected locks, smart thermostats and connected televisions are already starting to equip our homes, bringing with them questions about the security of the data these objects collect. Indeed, 71% of consumers fear the theft of their personal data, while 64% worry about that data being sold, according to an iControl Networks study cited by Joshfire.

Unsurprisingly, 25- to 34-year-olds are the most willing to equip themselves: to be more productive in household chores (40%), for entertainment (26%), or because they appreciate these objects’ ability to anticipate their needs (24%). It is in this capacity of the smart home to anticipate needs that the study’s authors see the future of the market. Beyond security or control of energy consumption, the home of tomorrow will also look after its inhabitants’ health, help them with household chores, entertain them and help them communicate with one another.

Human factors limit smart cities more than technology – (RW)

While the latest smart gizmo tends to grab headlines, industry experts are urging urban leaders to focus more on tackling smart city challenges with their citizens than on the technology itself. That’s according to attendees at the recent VERGE 16 conference in Santa Clara, Calif., where leaders in the smart cities space gathered.

A key sentiment that emerged from the conference was that leaders in government and industry need to stay focused on the larger smart city picture and not get caught up in the latest gee-whiz technology.

Specifically, there needs to be greater focus on meshing emerging tech with the current political and economic systems that affect citizens.

“The technology solutions are there,” said Kiran Jain, Chief Resilience Officer for the City of Oakland. “What we’re really looking at are governance issues.”

The proliferation of new smart city platforms and equipment is driven partly by the increasing ease with which they can be integrated into city infrastructure.

However, government leaders are being urged to develop technology strategies around citizens’ needs first, rather than prioritizing the technology and figuring out the public benefits later.

“We just put out an RFP last week that had the words ‘user-centric design,’” said Jain.

Cities need to evaluate their strategies

The shift from technology-centric strategies to user-centric mindsets also requires a realistic assessment of which populations of the city are actually benefiting from these innovations.

Specifically, local leaders must recognize that many smart city innovations are providing benefits to the better off segments of society. Meanwhile, those citizens struggling with poverty may not see much benefit at all from technology that makes the morning commute more pleasant.

“A lot of our focus has been on moving the top 20% of the market,” said Kimberly Lewis, senior vice president of the U.S. Green Building Council. “We thought the trickle-down effects would really begin to affect low- and moderate-income communities.”

She says key challenges are being exacerbated by assumptions that any smart city technological advancement automatically creates mass impact on the entire city population. However, it’s becoming clear that smart city technology is not a magic wand that can be waved to eliminate persistent challenges faced by poorer citizens.

For example, the community solar concept is beginning to gain traction in various markets, depending on the resources of those who wish to invest. However, this raises the issue of how to increase access to financing for communities that lack the resources to develop solar projects.

Understanding Artificial Intelligence – eMarketer

Artificial intelligence (AI) is already becoming entrenched in many facets of everyday life, and is being tapped for a growing array of core business applications, including predicting market and customer behavior, automating repetitive tasks and providing alerts when things go awry. As technology becomes more sophisticated, the use of AI will continue to grow quickly in the coming years.

Source: Understanding Artificial Intelligence – eMarketer

US Business/IT Executives Who Are Aware of Select Emerging Technologies, June 2016 (% of respondents)

In its most widely understood definition, AI involves the ability of machines to emulate human thinking, reasoning and decision-making. A May 2015 survey of US business executives by Narrative Science found that 31% of respondents believed AI was “technology that thinks and acts like humans.” Other conceptions included “technology that can learn to do things better over time,” “technology that can understand language” and “technology that can answer questions for me.”

At a deeper level, however, there is confusion in the marketplace around AI technology and the terminology used to describe it. Similar-sounding terms—such as cognitive computing, machine intelligence, machine learning, deep learning and augmented intelligence—are used interchangeably, though there are subtle differences among them. Many companies that have been involved with AI for years don’t even call it AI, for various reasons. “In essence we call it machine learning, because I think AI sometimes can spook some folks,” said Mahesh Tyagarajan, chief product officer at ecommerce personalization platform RichRelevance.

Many people also don’t realize that AI powers some of today’s most buzzed-about technologies. For example, a June 2016 survey by CompTIA found surprisingly low awareness of AI among US business and IT executives: Just 54% said they were aware of AI, compared with 78% who were aware of 3-D printing and 71% who knew of drones and virtual reality. However, some of the higher-ranking technologies on the list—including virtual reality, self-driving vehicles and robotics—are underpinned by different types of AI, though they were not identified as such.

Narrative Science also found that 58% of US business executives polled were already using AI—particularly in conjunction with big data technologies. Of those, nearly one-third (32%) said voice recognition and voice response solutions were the AI technologies they used most. The study showed that organizations also used AI for machine learning (24%) and as virtual personal assistants (15%). Smaller percentages cited decision support systems, automated written reporting and communications, analytics-focused applications and robotics.

Most Widely Used Artificial Intelligence (AI) Technology at Their Company According to US Executives, May 2015 (% of respondents)

Businesses in all industries are also making choices about how they will acquire AI technologies. For example, a January 2016 survey of global executives in the financial industry by Euromoney Institutional Investor Thought Leadership found that 42% of respondents said their organization used internal R&D to develop its AI/machine learning capabilities. Other approaches included employing consultants and research firms, participating in innovation hubs and incubators, partnering with other businesses and/or academia, crowdsourcing, and joint ventures, mergers and acquisitions.

Companies currently underutilize most of the IoT data: Creating a successful Internet of Things data marketplace | McKinsey & Company

Source: Creating a successful Internet of Things data marketplace | McKinsey & Company

By Johannes Deichmann, Kersten Heineke, Thomas Reinbacher, and Dominik Wee

Monetizing the flood of information generated by the Internet of Things requires a well-executed strategy that creates value.

The Internet of Things (IoT) will turn the current rush of industrial data into a rogue wave of truly colossal proportions, threatening to overwhelm even the best-prepared company. As the gigabytes, terabytes, and petabytes of unstructured information pile up, most organizations lack actionable methods to tap into, monetize, and strategically exploit this potentially enormous new value. McKinsey research reveals that companies currently underutilize most of the IoT data they collect. For instance, one oil rig with 30,000 sensors examines only 1 percent of the data collected because it uses the information primarily to detect and control anomalies, ignoring its greatest value, which involves supporting optimization and prediction activities. One effective way to put IoT data to work and cash in on the growing digital bounty involves offering the information on data marketplaces to third parties.

How a digital marketplace creates value

Digital marketplaces are platforms that connect providers and consumers of data sets and data streams, ensuring high quality, consistency, and security. The data suppliers authorize the marketplace to license their information on their behalf following defined terms and conditions. Consumers can play a dual role by providing data back to the marketplace (Exhibit 1).

Aggregated data can be an incentive for providers to share information.

Third parties can offer value-added solutions on top of the data the marketplace offers. For example, real-time analytics can make consumer insights more actionable and timely than ever before. The marketplace also has an exchange platform as a technical base for the exchange of data and services, including platform-as-a-service offers. Six key enablers of the data marketplace can help companies put their data to work more effectively:

  • Building an ecosystem. By assembling multitudes of third-party participants, companies can increase the relevance of their own digital platforms.
  • Opening up new monetization opportunities. Today’s interconnected and digitized world increases the value of high-quality data assets while creating innovative revenue streams. One digital marketplace, for example, adds value to Europe’s electric-automobile market by providing information and transactional gateways for businesses such as charging-infrastructure providers, mobility-service players, and vehicle manufacturers. Charging-station operators, for example, are free to determine their own pricing structures based on data available about customer habits and market trends.
  • Enabling crowdsourcing. Data marketplaces make it possible to share and monetize different types of information to create incremental value. By combining information and analytical models and structures to generate incentives for data suppliers, more participants will deliver data to the platform.
  • Supporting interoperability. Data marketplaces can define metaformats and abstractions that support cross-device and cross-industry use cases.
  • Creating a central point of “discoverability.” Marketplaces offer customers a central platform and point of access to satisfy their data needs.
  • Achieving consistent data quality. Service-level agreements can ensure that marketplaces deliver data of consistently high quality.
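The mechanics behind these enablers can be sketched minimally: providers list data sets under license terms, and the marketplace brokers each transaction while retaining a fee (the self-financing principle discussed below). The provider names, data set and 25% fee rate are illustrative assumptions.

```python
# Minimal data-marketplace sketch: listings plus fee-based brokering.
class DataMarketplace:
    def __init__(self, fee_rate=0.25):
        self.fee_rate = fee_rate
        self.listings = {}   # data-set name -> (provider, price per purchase)

    def list_data(self, provider, name, price):
        """A provider authorizes the marketplace to license this data set."""
        self.listings[name] = (provider, price)

    def purchase(self, name):
        """Return (provider, provider_payout, marketplace_fee) for one transaction."""
        provider, price = self.listings[name]
        fee = price * self.fee_rate
        return provider, price - fee, fee

market = DataMarketplace()
market.list_data("SensorCo", "turbine-telemetry", 100.0)
print(market.purchase("turbine-telemetry"))   # ('SensorCo', 75.0, 25.0)
```

Everything a real marketplace adds (quality checks, license enforcement, aggregation) layers onto this basic broker role.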

Designing a data-sharing platform

As they consider the process of setting up a data marketplace, company leaders need to work through a number of critical questions. An enterprise might ponder the following issues as it clarifies its data-market strategy:

What is the data marketplace’s scope? In most cases, a data marketplace begins when companies set up a central exchange for data within their own organizations. Later, they determine which categories of information within that internal exchange are appropriate (from a security and a profitability perspective) and then allow other players outside their organization (and perhaps outside their industry) to access that data.

How is the marketplace best structured? To foster a dynamic ecosystem, the data marketplace needs to assume a neutral position regarding participants. The legal/tax entity that the marketplace becomes and the structures that govern and finance it are key to this neutrality. Among the guiding principles that players follow in setting up data marketplaces are that a) the marketplace must finance itself through transaction-related fees and commissions, and b) neutrality must extend to future participants that provide or receive data or services, offering indiscriminate access to all interested players under fair terms and conditions. And while the data marketplace will support the creation and definition of data licenses, the data suppliers must nevertheless take responsibility for enforcing and legally auditing them. With respect to the marketplace’s governance, two business models are leading the way. Data marketplaces tend to be either independent platforms or limited ownership hybrids. Under the former model, data sets are bought and sold, while fully owned data-as-a-service providers sell primary data in specific segments or with services and solution wraps. Under the latter, the marketplace collects and aggregates data from multiple publishers or data owners and then sells the data.

Who are the data marketplace’s customers? Once the marketplace is commercially viable, customers will include all types of data providers, and the marketplace system should actively source new kinds of data to become more attractive. The key providers of data will be the companies that capture it, own it, and authorize its sharing. At some point, however, application developers will offer infrastructure and support services that further increase the value of the data by offering a relevant analysis of it and facilitating its delivery.

What are the marketplace’s overall terms and conditions, and data categories? During the marketplace’s technical setup phase, data suppliers define their licensing conditions independently, and the platform provides benchmarks for licensing conditions. The overall terms and conditions of the marketplace apply to all traded data. In the subsequent commercialization phase, the marketplace relies on centrally defined data categories and related licensing agreements as expressed in its general terms and conditions. This strategy enables players to license crowdsourced data independently of specific suppliers.

How does the marketplace relate to other licensing models? When dealing with proprietary data, suppliers usually hold certain information apart and do not share it in the marketplace. However, data suppliers that also offer services can make use of their proprietary data to create services they can trade on the marketplace. For other licensed data, information suppliers can freely create licensing agreements that extend beyond the marketplace—for example, with their strategic partners. Both data amount and type, along with the scope of licenses for using the information, can vary from that of marketplace-supplied data. Likewise, suppliers can also impose separate licensing arrangements for data already traded in the marketplace if buyers intend to use it under different conditions.

What are the role and value-creation potential of the marketplace company or participating data brokers? The potential value of the data will differ depending on whether the marketplace is in the technical start-up phase or has achieved full commercialization (Exhibit 2). In the former, the marketplace acts as a data normalizer, defining standard data models, formats, and attributes for all of the traded information. It syntactically verifies all incoming data compared with the defined standard and continuously manages and extends the data inventory. Once the marketplace enters the commercial stage, it becomes a data aggregator. At this point, in addition to normalizing data and verifying incoming information, it aggregates data and organizes it into logical bundles. For instance, it will enable users to combine data for a given region and offer it to service providers.
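The "data normalizer" role described above amounts to syntactically verifying every incoming record against a defined standard before it enters the inventory. A minimal sketch, with an invented schema for illustration:

```python
# Normalizer sketch: validate and coerce incoming records against a
# standard data model before admitting them to the marketplace inventory.
SCHEMA = {"device_id": str, "timestamp": int, "value": float}

def normalize(record):
    """Return a schema-conformant record, or raise ValueError on a mismatch."""
    if set(record) != set(SCHEMA):
        raise ValueError(f"unexpected fields: {sorted(record)}")
    return {field: coerce(record[field]) for field, coerce in SCHEMA.items()}

# A supplier sends the value as a string; the normalizer coerces it.
ok = normalize({"device_id": "pump-7", "timestamp": 1483228800, "value": "6.5"})
print(ok["value"], type(ok["value"]).__name__)   # 6.5 float
```

The aggregator stage then bundles normalized records, e.g. grouping them by region, before offering them to service providers.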

Depending on the role of the marketplace, the depth of value added will vary.
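The two roles described above can be sketched in a few lines of code. This is a minimal illustration only: the field names, types, and records below are assumptions chosen for the example, not part of any real marketplace's data model.

```python
# Illustrative sketch of the marketplace's two roles:
# 1. Data normalizer: syntactically verify incoming records against a standard model.
# 2. Data aggregator: organize verified records into logical bundles, e.g. by region.
# All field names and values are hypothetical.

# Standard data model: required attributes and their expected types.
STANDARD_MODEL = {"supplier_id": str, "region": str, "timestamp": int, "value": float}

def normalize(record):
    """Return the record if it matches the standard model exactly, else None."""
    if set(record) != set(STANDARD_MODEL):
        return None  # missing or extra attributes: reject
    for field, expected_type in STANDARD_MODEL.items():
        if not isinstance(record[field], expected_type):
            return None  # wrong type: reject
    return record

def aggregate_by_region(records):
    """Bundle verified records by region so buyers can license a region at once."""
    bundles = {}
    for record in filter(None, map(normalize, records)):
        bundles.setdefault(record["region"], []).append(record)
    return bundles

incoming = [
    {"supplier_id": "s1", "region": "EU", "timestamp": 1700000000, "value": 21.5},
    {"supplier_id": "s2", "region": "EU", "timestamp": 1700000060, "value": 19.8},
    {"supplier_id": "s3", "region": "US", "timestamp": 1700000120, "value": 25.1},
    {"supplier_id": "s4", "region": "EU"},  # incomplete record: rejected
]
print({region: len(items) for region, items in aggregate_by_region(incoming).items()})
```

The normalizer-then-aggregator pipeline mirrors the article's phasing: syntactic verification is useful on its own in the start-up phase, and the regional bundling layer is added once the marketplace commercializes.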

Choosing a monetization model

While traditional licensing will provide marketplace revenue streams, participants can also develop transactional models to monetize data and services, with on-demand models the preferred option. With traditional licensing, companies can pursue either perpetual or one-off deals and collect customer fees using several approaches. For example, they can sign contracts with fixed fees and run times, renegotiate expired contracts, or earn revenues at the time of sale (this final approach typically provides less stability in revenue forecasting). At the transactional level, the two primary alternatives are on-demand and subscription services. With on-demand services, customers either pay as they go or choose volume pricing, with charges based on metrics such as usage volume, the number of incidents, or hardware-related fees. Subscriptions can involve flat fees—typically applied on a monthly or yearly basis—or free/premium (“freemium”) offers, which provide the basics free of charge while offering additional features for a flat fee.
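The transactional pricing structures above can be made concrete with a toy calculator. All rates, thresholds, and fees below are illustrative assumptions, not figures from any actual marketplace.

```python
# Toy sketch of the transactional monetization models described in the text.
# Every rate, threshold, and fee here is a hypothetical placeholder.

def on_demand_charge(usage_units, rate=0.10, volume_threshold=1000, volume_rate=0.07):
    """Pay-as-you-go, with a cheaper volume rate once usage passes a threshold."""
    if usage_units > volume_threshold:
        return usage_units * volume_rate
    return usage_units * rate

def subscription_charge(months, flat_fee=99.0):
    """Flat fee, typically applied on a monthly or yearly basis."""
    return months * flat_fee

def freemium_charge(premium_features_used, feature_fee=19.0):
    """Basics are free of charge; each additional feature carries a flat fee."""
    return premium_features_used * feature_fee

print(on_demand_charge(500))    # light user pays the standard per-unit rate
print(on_demand_charge(5000))   # heavy user benefits from volume pricing
print(subscription_charge(12))  # yearly subscription billed monthly
print(freemium_charge(0))       # free tier costs nothing
```

The contrast the article draws is visible in the functions' shapes: on-demand revenue scales with customer activity (harder to forecast), while subscription revenue is fixed per period (more stable).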

Another monetization option is the “give and take” model, which offers incentives to data providers to share their information. The incentive can be monetary or take the form of something like highly relevant, aggregated data as an enticement to share. The marketplace then aggregates and anonymizes the data and offers it along with associated data-focused services to customers.

One give-and-take example is an Internet-based service that offers geolocated real-time aircraft flight information. The service reportedly has one of the largest online aviation databases, covering hundreds of thousands of aircraft and flights as well as large numbers of airports and airlines. Data suppliers receive free radio equipment that collects and transmits aircraft data and a free business-level membership to the service worth $500 a year for as long as they transmit data. In another case, a large European credit bureau offers credit-rating information for consumers and corporations. Data suppliers provide information that includes banking activities, credit and leasing agreements, and payment defaults. In return, they receive credit-ranking data for individuals or businesses. Yet another give-and-take marketplace focuses on data and performance analytics on mobile-operator network coverage. It trades apps and coverage information to data suppliers in exchange for crowdsourced data that can generate mobile-network coverage maps and reveal a mobile operator’s performance by region and technology (for example, 3G or 4G networks).

Assessing the competition

A wide variety of traditional commercial data services currently exists, although these services are largely in silos that focus on specific topics, such as healthcare, finance, retail, or marketing. This balkanization provides an opportunity for new, more holistic data-business models. One advantage of the current ubiquity of data providers is that most companies are already familiar with dealing with them. In fact, some sources estimate that 70 percent of large organizations already purchase external data, and all of them are likely to do so by the end of the decade. The value potential inherent in data marketplaces is attracting key players from a variety of advanced industries. A number of aerospace companies, for example, offer systems that provide guidance to customers in areas such as maintenance and troubleshooting. Similar efforts are also under way in the agricultural and mining-equipment industries, among others.


The IoT’s big data promises to help companies understand customer needs, market dynamics, and strategic issues with unmatched precision. But in pursuing this goal, organizations will amass previously unimaginable quantities of information. The data marketplace offers them an innovative way to turn some of that data into cash and reap the benefits that will accrue from building a self-reinforcing ecosystem, enabling crowdsourcing, supporting interoperability, satisfying customer data needs, and improving data quality.

About the author(s)

Johannes Deichmann is a consultant in McKinsey’s Stuttgart office, Kersten Heineke is an associate partner in the Frankfurt office, and Thomas Reinbacher is a consultant in the Munich office, where Dominik Wee is a partner.

The authors wish to thank Mark Patel for his contributions to this article.