Introduction
The 'video game' concept means many things to many people – not unlike the term 'robot,' which can mean anything from a welding arm to a science-fiction-like sentient being (something we have explored in our autonomous robot levels classification). The term 'game,' or the use of game technology, often leads to some suspicion that a business proposition is somehow not serious, despite the fact that games themselves have spawned a multibillion-dollar market.
The 451 Take
To many, games may not appear relevant to enterprise technology, but they represent some of the most challenging, quickly evolving use cases, alongside the technologies for general digital transformation that are causing disruption. For example, a fully autonomous vehicle needs to be self-contained to provide a safe and useful experience for its passengers. But it must still communicate with its road environment (vehicle-to-infrastructure) and with other autonomous vehicles (vehicle-to-vehicle), which presents some of the same challenges and problems that game developers are looking to solve. It is also the infrastructure that communication networks, cloud and edge providers have to address to deliver experiences that attract gamers/customers and keep them happy. Enterprise IT infrastructure is built with detailed knowledge of application requirements and quality-of-service needs, and public communication infrastructure has evolved to deliver streaming content, a relatively easily defined asymmetric use case. Games are not merely a payload or a single use case; they involve a complex set of interactions and trade-offs on a case-by-case basis. Understanding at this level of detail is where we see similar tension across industrial IoT – with IT providing services to operational technology (OT), where the intricacies in areas such as manufacturing and process control are not fully understood or addressed by traditional IT.
Not all Games, or Gamers, are Equal
Edge First
For many of the early arcade games, the software was locked into the hardware. Often these were single-purpose machines, which later evolved into generic boards with plug-in hardware for different games. The natural evolution of the industry in the 1980s and 1990s was toward home consoles and PCs, which meant gamers could run many types of games from plug-in cartridges or installed from disks. This very much maintained the edge-only model of the amusement arcade: all the game code, logic, visuals and audio are computed and rendered by the device the game is played on.
Games tend to be very interactive – the point is to present something to a player, who then alters the flow of what comes next through their actions. This purity of model was altered by some games that evolved with the advent of DVD and the earlier video disc formats. Here, some games had visuals that were pre-generated, or were real-world film, which offered much higher visual quality than the consoles and PCs of the time could generate.
Full-motion video (FMV) was played with very simple computer-generated visuals overlaid, or different FMV scenes were triggered to play based on player input. You may recognize this pattern, as it resurfaced with the recent choose-your-own-adventure film Black Mirror: Bandersnatch. FMV is still used in some installed games for cut scenes that explain narrative elements, or to provide direct tie-ins to movie franchises, but increasingly game engines and graphics cards are able to generate very high-end visuals live, totally under the control of the developer and player.
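As a purely illustrative sketch (the scene names and structure below are hypothetical, not drawn from any particular title), an FMV game is essentially a branching table of pre-rendered clips, with player input selecting which clip plays next:

```python
# Hypothetical sketch of FMV-style branching: pre-rendered clips are
# selected by player input rather than rendered live on the device.
scenes = {
    "intro":       {"fight": "battle_clip", "flee": "escape_clip"},
    "battle_clip": {"win": "victory_clip", "lose": "game_over"},
    "escape_clip": {"hide": "cave_clip", "run": "game_over"},
}

def next_scene(current_clip, player_choice):
    """Return the name of the pre-rendered clip to play for this choice."""
    return scenes.get(current_clip, {}).get(player_choice, "game_over")

print(next_scene("intro", "fight"))  # -> battle_clip
```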
Two-player Local Networks
The car itself is not being sent across the network cable, just regular position updates. This approach can be compared with other types of games in the physical world. Consider chess played by mail or text. Each player has their own chess board, and the physical form of the pieces matters only to each individual set. A player can send a chess move to an opponent as a small algebraic statement, such as E4E5, which the receiving player applies to their board before returning their own move.
While slower, this is very much the model of a basic multiplayer video game. There may be a subtle difference, in that often in such a multiplayer game one of the machines is regarded as the keeper of the truth – a server – and the other offers and receives deltas to its own version of the truth. This is still two edge devices with moderate interaction. Hardly any data is flowing, but any delay in that data can be the difference between appearing in front of or behind the opponent's vehicle on-screen – when, in the server's data, the positions may in fact be the other way around. Even a simple two-player game needs low-latency communication to avoid lag. To mitigate lost or delayed messages, or the need for large data transfers, a game engine predicts where something should be frame by frame, and makes minor corrections based on incoming data.
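A minimal sketch of this pattern (in Python, with an invented message format and smoothing factor – an illustration of the idea rather than any engine's actual code): only small position updates cross the wire, and between updates the receiving machine predicts the opponent's car forward, correcting gently when real data arrives.

```python
TICK = 1 / 60  # one simulation frame at 60 frames per second

class RemoteCar:
    """Locally simulated copy of an opponent's car, driven by small network updates."""

    def __init__(self):
        self.position = 0.0  # distance along the track (arbitrary units)
        self.velocity = 0.0  # last reported speed (units per second)

    def apply_update(self, reported_position, reported_velocity):
        # An update arrived from the other machine. Ease toward the reported
        # position rather than snapping, so the correction is not jarring.
        error = reported_position - self.position
        self.position += error * 0.5  # smoothing factor chosen arbitrarily
        self.velocity = reported_velocity

    def predict(self):
        # Dead reckoning: with no new data this frame, assume the car keeps
        # moving at its last known velocity.
        self.position += self.velocity * TICK


def encode_update(position, velocity):
    # The car itself is never sent - only a tiny message, akin to a chess move.
    return f"POS {position:.1f} VEL {velocity:.1f}"


opponent = RemoteCar()
opponent.apply_update(1042.5, 38.0)  # values decoded from an incoming message
for _ in range(3):                   # three frames pass with no new data
    opponent.predict()
```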
Downloads and App Stores
For developers, digital distribution has enabled constant improvement through over-the-network patches and enhancements. It has also enabled economies in game purchases and extended a game's life with new downloadable content. This side of the games industry needs high bandwidth rather than low latency – the sort of network requirement that streaming media such as movies demands. The faster and smoother the download, the sooner the game is being played.
Multiplayer Networks
Players require low-latency networks in order to have a smooth multiplayer experience. With a cloud- or near-edge-hosted server (the gamer will neither know nor care about the difference), each edge device is able to send and receive updates more effectively, and to do so with many more players in an environment. Lag – a mismatch between what should be seen and what is – is a game-experience killer.
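As a rough sketch (with a hypothetical tick loop and message handling, not any particular engine's API), the server-hosted pattern keeps the authoritative world state centrally and fans small state snapshots out to every connected player each frame:

```python
# Hypothetical server-authoritative loop: the cloud/near-edge server holds
# the true world state and sends small snapshots to every player each tick.
world = {}  # player_id -> authoritative position along the track


def handle_input(player_id, movement_delta):
    """Apply a player's reported movement to the authoritative state."""
    world[player_id] = world.get(player_id, 0.0) + movement_delta


def tick(send):
    """Once per frame, fan the (small) state snapshot out to all players."""
    snapshot = dict(world)  # a real engine would send only what changed
    for player_id in world:
        send(player_id, snapshot)


handle_input("player_1", 2.5)
handle_input("player_2", 1.8)
tick(lambda pid, snap: print(f"to {pid}: {snap}"))
```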
It should also be noted that, with the rise of e-sports, competitive gaming can now offer millions in prize money and often involves large arena-based tournaments. Although the competitors may have practiced or qualified over internet-based games, for these events organizers bring servers and infrastructure on-site to host the games and avoid any lag, even when those games are only sending basic positional and event data.
Cloud Gaming
A video, whether a download or a stream, is fixed. When the viewer presses play, a few milliseconds of buffering may occur to soak up any lag or jitter; the viewer will not notice, or the resolution simply drops for a few frames. A game, on the other hand, whether multiplayer or single-player, is driven by player input, and every frame of video streamed back to the player differs based on that input, with 60 or more frames rendered per second. Any lag between pressing a button and an action happening, or any frames dropped in delivering that action to the player, destroys the experience.
The round trip – from a player's device to the cloud game environment, which renders an image with complex real-time computations for lighting and textures, then encodes that video and returns it to the player 60 times a second at 4K resolution – is a challenge based simply on physics. The solution requires an ultralow-latency network connection, and one of the options for achieving that is to shorten the distance the signal has to travel. The shortest distance, of course, is right at the edge again, or at least the near-edge. This then affects the proximity benefit when it comes to multiplayer games with players anywhere on the planet.
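A back-of-the-envelope budget illustrates the physics; the per-stage timings below are illustrative assumptions, not measurements of any particular service. At 60 frames per second, each frame lasts roughly 16.7 ms, and the full loop of input, render, encode, return and display has to fit inside only a few of those frames to feel responsive.

```python
# Back-of-the-envelope cloud-gaming latency budget. All stage timings are
# illustrative assumptions, not measurements of any particular service.
FRAME_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps

budget_ms = {
    "input upstream to near-edge site": 5,
    "game simulation and 4K render":    10,
    "video encode":                     5,
    "video downstream to player":       5,
    "decode and display":               8,
}

total = sum(budget_ms.values())
print(f"Frame time at 60 fps: {FRAME_MS:.1f} ms")
print(f"Estimated button-to-photon delay: {total} ms "
      f"(~{total / FRAME_MS:.1f} frames)")
# Move the rendering a continent away (add, say, 60 ms of round-trip network
# time) and the delay grows by several frames - the lag players notice.
```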
Virtual and Augmented Reality
Some VR experiences in the media world involve streaming 360-degree video, such as from a live event, but most are still very edge-first, with some underlying data sent for multiuser experiences. It's a big-download, install-then-play-locally model. AR has to do a lot of processing to understand the world around it in order to place digital artifacts in view, which is another very local/edge processing task.
Conclusion
The challenge for enterprise IT and the communication industry is in recognizing the complexity of the requirements of these types of applications. Notably, this comes at a time when the core of how communications operates is shifting to more dynamic, software-defined approaches, such as public 5G and time-sensitive networking (TSN) in industrial systems. The patterns and requirements of games therefore form a real-world set of requirements that can be addressed in this context, not merely treated as content payload.
Applications that allow a team of field engineers to interact with an IoT-instrumented manufacturing plant through AR interfaces – built on game technology – are seen as a way to attract a modern workforce to industries dealing with an experience drain as the bulk of older workers retire. The financial impact of supporting any type of IT application is always key – people pay to play games, but are paid to fix machinery.
Ian Hughes is a Senior Analyst for the Internet of Things practice at 451 Research. He has 30 years of experience in emerging technology as a developer, architect and consultant through key technology trends.
Raymond Huo is a Research Associate at 451 Research. Prior to joining 451 Research, Raymond worked as a Resource Operations Associate at healthcare nonprofit Health Leads, helping to develop the company’s customer relationship management tool to meet the business needs of clients.
Aaron Sherrill is a Senior Analyst for 451 Research covering emerging trends, innovation