The latent promise of the Internet of Things will be realized only if machines can detect complex signal patterns in data from a wide range of sources, both within and beyond existing data silos. Today, progress is held back by monolithic application silos and wasted data.
Report Introduction:

Nearly every business is under persistent threat of digital disruption. Successful business leaders are those who consistently lean into the possibilities of digital transformation: the strategic application of technologies to improve customer experience, increase operational efficiency, create new digital business models and adapt to threats. Artificial intelligence (AI) and the Internet of Things (IoT) will be the two most important drivers of digital disruption.

Basic AI has existed for decades via rules-based programs that could deliver rudimentary displays of ‘intelligence’ in specific, narrow contexts. But progress was limited because algorithms were too complex for people to program by hand and the costs of computing were too high. In the past few years, that situation has changed. The same can be said for basic IoT: industrial machinery has included programmability and remote control for decades, while the earliest machine-to-machine (M2M) connections can be traced back 30 years to the first ‘smart’ meters. Those early M2M systems were expensive and proprietary, and the digital transport of data was constricted by bandwidth and cost.

The field of AI has progressed in leaps and bounds in the past few years, driven by several factors: the effects of Moore’s Law on computing capacity and cost; GPUs, TPUs and emerging AI-optimized CPUs; the growing ranks of data scientists with the know-how to build value-creating systems; the increasing capacity of the cloud and edge for computing AI workloads; and the proliferation of open source tools and platforms to support AI experimentation and operations.

IoT has also progressed: the same gains in computational power and proximity apply equally well to IoT datasets. Costs associated with connecting physical objects to the internet are approaching zero, while our ability to analyze that data in real time has jumped with a plethora of open source and proprietary platforms. The networking challenges of IoT have been largely solved, with low-power WAN devices now able to achieve a useful lifespan of 10 years on a single AA battery. Progress in the key enablers underlying IoT and AI will not abate, but rather accelerate. Progress at the intersection of AI and IoT can also create a self-reinforcing flywheel effect, as more IoT data fed into machine learning (ML)/AI models yields better outcomes that fuel more investment, and so on.

We believe the discrete application and eventual full convergence of these technologies will reshape how societies work, govern, conduct war, consume goods and services, and pursue leisure. In turn, the confluence of these technologies will generate new ethical, moral, legal and regulatory challenges. This report sets out to explore the situation today, where things are heading, what will get us there and what will stand in the way.

This Technology & Business Insight report on the intersection of AI and IoT is based on insights and data gathered through direct interviews with the vendors mentioned in the report (with a few exceptions), spanning a wide variety of vertical applications, combined with our analysts' deep experience in the IoT and AI industries. The full report includes:
  • An exploration of the possibilities of IoT today and in the future
  • An analysis of IoT and AI use cases
  • An examination of IT infrastructure running AI and IoT at the edge

Let us know if you're interested in the full report!