'Robot' has become a catch-all term loaded with expectations, such as having some degree of autonomous operation, but many robots in operation today – e.g., industrial robots – are not autonomous at all. They may become autonomous and more connected in an IIoT world, but there is a spectrum of autonomy and functionality to consider. As the mechanics of robotics and developments in AI continue to merge, along with the IIoT evolution, robotics will become an interesting high-function case of edge computing. Collaborative robots (cobots) that recognize the importance of the human workforce are emerging, and autonomous vehicles are making inroads into both the enterprise and consumer spaces. Because not all robots are the same, however, it's helpful to establish a degree of classification to understand and define what a particular robot is.


The 451 Take

A robotic tool in any sector is composed of multiple facets. One is physical capability: robots are starting to benefit from advances in software engineering and data science, such as machine learning, to sense and respond to the world around them. These capabilities can be as simple as a collision-avoidance system or as complicated as learning to walk and balance. Another facet is the set of actions the physical device can perform based on its apparent 'intelligence.' Making a robot walk and navigate is the result of high-end research and development, but our brains are wired to perceive an associated high degree of intelligence that is not actually there. We based our nine-level robot classification on autonomy to incorporate this concept and, specifically, to help differentiate autonomy from intelligence. For example, a level 0 robot arm could be hooked up to a Grand Master-beating chess AI as a way to move the pieces on the board. Although choosing the plays and moving the chess pieces are two separate actions, it appears to the observer that the articulated arm is making the decisions. In investigating robotics as IIoT devices, we are defining robots in a way that is similar to what the autonomous vehicle industry has done.



The early industrial robots of the 1970s brought flexibility and precision of movement, powered by advances in servos, motors and hydraulics coupled with early computer control. Robots were typically programmed and tasked with a single role in a production line. The aim was repetition and consistency of movement and output quality. An early automotive spot-welding arm may have appeared to operate with a degree of autonomy, but it generally followed the same path each time with little awareness of anything around it; hence, it typically operated inside a safety cage because it lacked sensory awareness of the human workforce. Like many industrial plant machines, these robots are becoming increasingly instrumented and are emerging as significant industrial IoT components. Standard robot arms are becoming more flexible and dynamic, and the robotics field is introducing new form factors and abilities as well.

The transition from spot-welding robots to fully autonomous rolling or walking devices that are still labeled 'robots' is littered with science fiction references that confuse artificial intelligence with the mechanics of movement. The term 'bot' has been appropriated by the software industry to refer to automated helpdesk responses and chatbots, or to the mass dispersal of questionable facts and opinions on social media. We cover robotic process automation (RPA) separately; you can find more information in our Automation and integration: Essential tech for digital business, 2019 Research Agenda.

In the evolving world of autonomous vehicles, a standard has emerged to delineate stages of development in the industry. This classification seeks to clarify the levels of autonomy as the industry progresses. The aim is for fully autonomous vehicles, known as level 5, to be able to transport people and goods with no driver intervention at all. Autonomous vehicles can be considered a specific, well-described use case within robotics, but we currently lack any classification for the many other levels of robotic interaction. Figure 1 below shows the levels for vehicle automation; a fuller evaluation of this classification can be found in our Technology & Business Insight report, The Changing Automotive Industry.

Figure 1: Society of Automotive Engineers Taxonomy and Definitions for Automated Driving

Source: Society of Automotive Engineers, January 2014

It is possible to similarly classify robotic applications along the following levels.

Figure 2: 451 Research Autonomous Robot Classification Index

Source: 451 Research

Level 0 robots are the original industrial robots, along with other computer numerically controlled (CNC) tooling. They are placed at 0 as an anchor point to indicate the path toward autonomous robotics. These devices can be retrofitted as IoT-enabled devices, and their physical attributes can be made to work at the higher levels described, as is the case with all legacy industrial equipment across the shop floor. Brownfield retrofits and upgrades are the order of the day in the digital transformation of long-standing operational plants.

Level 1 brings in remote-controlled devices, including piloted drones and bomb-disposal robots that provide a degree of telepresence, shielding operators from danger. These devices are controlled via movement-translation devices such as joysticks. A key feature is the introduction of more advanced connectivity in both control and monitoring than could be expected at level 0. These can be considered the start of IoT device interactions.

Level 2 introduces smart tooling: complex devices that assist humans directly and with which interaction involves gesture, voice and body movement. In particular, this level covers exoskeleton systems (see below), but also higher-end telepresence, including VR-based control systems that use more natural human gestures to interact.

Level 3 brings a degree of autonomy onboard an industrial robot, with more processing power than level 0 and some ability to sense and respond to its environment. These devices may adjust as they engage with a task – e.g., realigning to pick up an object that is in a different orientation rather than assuming the same positioning each time.

Level 4 introduces mesh robotics, which are coordinated devices that use high-function code and machine learning to engage with one another to perform tasks with high variability in areas such as picking and assembly.

Level 5 covers cobots, an emerging segment of autonomous robots designed to work with humans in a way that assists and understands the roles they are performing. We described many of the features of cobots in a previous report.

Level 6 robots are equivalent to SAE level 5 autonomous vehicles. The degree of autonomy in a self-driving vehicle that requires no human input makes these AVs high up on the autonomous robot scale. They deal with a known task (transport) in a complex but known domain (the road system). This also extends to the next wave of drones using flight for delivery and inspection tasks.

Level 7 robots move into areas previously reserved for science fiction, but they are starting to appear in demonstrations and videos. These fully autonomous, self-perambulating devices use a variety of means to traverse the world and engage with tasks. They remain primarily high-end research projects but are starting to make their way into real-world settings. At this level, they are focused on movement to achieve a limited task.

Level 8 robots extend the movement and navigation abilities of level 7 robots and are starting to gain the ability to perform many different tasks and adjust to the situation or role they are given.

Level 9 represents robots that are autonomous, highly mobile and versatile, and that work in cooperation with others, combining elements of level 4 robot-to-robot communication and level 5 human cooperation as a cobot.
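To illustrate how the index might be applied in an IIoT context, the following minimal Python sketch encodes the levels as simple device metadata. It is purely illustrative: the member names and the tag_device helper are hypothetical constructs of this sketch, not part of the 451 classification itself.

```python
from enum import IntEnum


class RobotAutonomyLevel(IntEnum):
    """Illustrative encoding of the 451 Research Autonomous Robot Classification Index."""
    LEGACY_CNC = 0                 # Original industrial robots and CNC tooling (anchor point)
    REMOTE_CONTROLLED = 1          # Teleoperated devices: bomb-disposal robots, piloted drones
    SMART_TOOLING = 2              # Exoskeletons and gesture/VR-driven telepresence
    ONBOARD_ADJUSTMENT = 3         # Senses its environment and adjusts to the task at hand
    MESH_ROBOTICS = 4              # Coordinated robot-to-robot task sharing
    COBOT = 5                      # Works alongside and understands human co-workers
    AUTONOMOUS_VEHICLE = 6         # Equivalent to SAE level 5 vehicles and delivery drones
    SELF_MOBILE_SINGLE_TASK = 7    # Fully self-perambulating, focused on a limited task
    SELF_MOBILE_MULTI_TASK = 8     # Adds multi-task flexibility to level 7 mobility
    COOPERATIVE_GENERALIST = 9     # Mobile, versatile and cooperative with robots and humans


def tag_device(device_id: str, level: RobotAutonomyLevel) -> dict:
    """Hypothetical helper: attach a classification level to an IIoT device record."""
    return {"device_id": device_id, "autonomy_level": int(level), "label": level.name}


if __name__ == "__main__":
    # Example: a retrofitted welding arm stays at the level 0 anchor point.
    print(tag_device("weld-arm-07", RobotAutonomyLevel.LEGACY_CNC))
```

A device record tagged this way could carry its autonomy level alongside other asset metadata, making it straightforward to track how a fleet moves up the levels over time.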

What most people consider 'robots' falls into levels 7-9: a fully functioning, AI-powered 'being' with a generic set of skills. Usually, we assume these will have a humanoid shape, matching our various social and religious creator myths; sometimes they are the focus of dystopian science fiction as they turn against their makers. Research and development will continue to create these devices, primarily as a front end for AI abilities, and the classification will eventually have to be extended to incorporate levels of intelligence, not just autonomy of movement.

Just as the level 0 original industrial robots can be integrated and adapted to be part of an IIoT implementation, other robots may advance up the levels, depending on the needs of the task to be performed. Our categorization will help establish where a particular robotic device currently sits, and it may also help explain how its level of autonomy could increase as the field develops. An example of this can be found in the branch of robotics related to exoskeletons.

Exoskeletons

At its core, an exoskeleton is a metallic frame that is used to help people coordinate and optimize their movements with robotic support. More specifically, exoskeletons assist humans with arduous tasks such as lifting or squatting by augmenting certain movements to reduce the strain on the body during the activity. While the concept of an exoskeleton frame is not new – its roots trace back to the 1960s, when GE developed the first prototype – its development and application have advanced over the years. Exoskeletons now have widespread applications in medicine, industry, the military and, more recently, consumer goods.

Perhaps the most well-known exoskeleton use cases are in the industrial and medical verticals. Companies such as Ekso Bionics and suitX are developing wearable technology to help people with medical disabilities get back on their feet and to ease the burden on industrial workers by giving them a helping hand with heavy lifting. Outside of industrial and medical applications, military divisions are also starting to develop use cases for exoskeleton frames. The Defense Advanced Research Projects Agency (DARPA) has been experimenting with its Warrior Web program since 2016, which is designed to help soldiers carry heavy equipment in the field without expending too much energy.


Because they assist human work rather than replacing it entirely, exoskeletons currently fall under the definition of level 2 autonomy. Over time, however, we expect them to reach level 5 autonomy, at which point the exoskeleton will have a greater awareness of the task being performed, to the extent that it may move the operator into a suitable position to perform it rather than only reacting to an impulse or movement. Similar to the deployment of collaborative robots, the next stage of development for exoskeletons is to incorporate sensor technology, computing power, machine learning and analytics to further enhance mobility and strength.

Conclusion

We have created a classification for the autonomous nature of robotics in the industrial sector. Clearly, there will be challenges in many of the domains in which robots will operate. There is less need for a surgical robot to move around a hospital, but its reactions and awareness must work at a much higher precision than many other robot types. Our broad definitions can be combined and drilled into, as with the example of exoskeletons and the potential evolutionary path they may take. The actual intelligence level of a robot will be more closely related to classifications that will emerge from the AI industry over time.

Ian Hughes
Senior Analyst, Internet of Things

Ian Hughes is a Senior Analyst for the Internet of Things practice at 451 Research. He has more than 27 years of experience in emerging technology as a developer, architect and consultant through key technology trends, including 20 years at IBM in cross-industry application development spanning automotive, global sporting events, retail and telecoms.

Raymond Huo
Research Associate

Raymond Huo is a Research Associate at 451 Research. Prior to joining 451 Research, Raymond worked as a Resource Operations Associate at healthcare nonprofit Health Leads, helping to develop the company’s customer relationship management tool to meet the business needs of clients.

Aaron Sherrill
Senior Analyst

Aaron Sherrill is a Senior Analyst for 451 Research covering emerging trends, innovation and disruption in the Managed Services and Managed Security Services sectors. Aaron has 20+ years of experience across several industries including serving in IT management for the Federal Bureau of Investigation.
