Published: June 9, 2020


Augmented World Expo (AWE) is not the first virtual conference, but it did mark a significant milestone: the spatial computing industry using the very tools and social experiences that augmented, virtual and extended reality (AR, VR, XR) have produced. It also gave many people who would not ordinarily be able to make the journey to Santa Clara, California, an opportunity to join in. Furthermore, with so many people from all walks of life working from home and feeling the differences between forms of communication, some more stress-inducing than others, many are asking whether there are better ways to interact; AWE provided a way to hear about and experience spatial computing as a whole.

We had previously written about some of the potential challenges for events in COVID-19: Turning the virtual into the virtue – how coronavirus will reshape industry conferencing and VR and AR in the context of COVID-19: An opportunity for virtual and augmented reality to shine. Here we detail how some of the products were shown at AWE, how they were used and how this felt as an attendee. We often document the facts of products and companies, but in spatial computing, the experiential element is very much part of its affordances. It becomes an element of worker health and safety to use the right tools to help people communicate and work efficiently without online burnout.

The 451 Take

We covered AWE 2019 on its 10th anniversary, where it was clear that there was a new wave of both technology and enthusiasm for spatial computing blending with long-established leaders in the field, all waiting for the right opportunity to deliver results. It was clear from everyone presenting at AWE 2020 that many flavors of the industry were in a prime position to help a locked-down world. The world of work (and play) is in a state of flux and some significant changes will emerge even as we come out of lockdown. We have been covering this with 2020 Trends in Workforce Productivity & Collaboration: Workforce Transformation and more recently, The coronavirus impact: 20 long-lasting changes we expect. Spatial computing in all its shades will have a long-term impact. Different virtual environments suit different use cases, but there is scope to use different spatial approaches at different times.

Out in the field, AR is the user interface for IoT, and VR provides distraction-free relocation to a new space. There are opportunities for companies of all sizes to find a niche and a way to help people come to terms with the world as it will be in the next few years. Not everyone has to be put in a VR or AR headset; although the technology is improving all the time, it is in short supply. However, those devices, along with elements of human movement, emotion and different ways to approach information, all benefit from the underlying technology.


AWE 2020 was delivered along the same schedule and track format as the physical conference would have been. Many of the presentations were in a standard video format of slides and speaker, generally pre-recorded. Pre-recording helped some presenters and companies apply a greater degree of editing and production values, but these sat alongside rawer presentations done in one take with a webcam. Once a presentation was completed, a live feed allowed real-time Q&A with the presenter. Just as at the physical event, multiple tracks running at the same time led to conflicts in choosing which to attend. Unlike at a physical event, it was much simpler to jump from one URL to another, or, as this analyst found, to have two or more presentations running at once across multiple devices. The presentations are available for delegates to watch again for the next 12 months, but with other virtual conferences and meetings now running back to back, the chance to catch up on content is more of a challenge. The conference ran on Pacific time; shifting from UK time produced a different form of jet lag from flying there. The streaming nature of the key events meant they could also be experienced on a tablet or smartphone.

Enterprise Gatherings

One of the keynote presentations, by Charlie Fink, was 'Remote Collaboration, Virtual Conferences, The End of Distance and the Future of Work.' He took the opportunity to present using the standard video stream but from a virtual environment. For this, Fink used a tool from SPACES that bridges a VR environment and, using a soft camera, allows the presenter to push that content instead of via a webcam. An avatar of Fink from the mobile application Loom.AI stood at a lectern before a giant screen. Using a VR headset and controllers, he was able to turn, walk and wave as he presented.

SPACES had itself been affected by the pandemic. Because the company builds and runs immersive experiences in physical locations, it chose to engage with the wave of video applications while providing some of the benefits of VR in delivering information. As a college lecturer in VR and AR, Fink had also been affected by the stay-at-home orders and decided to take a field trip with his students to as many virtual environments as possible, leading him to document about 120 different applications in multiple categories, some of which were shared at the event.

In the panel section of the presentation, all the panelists were virtual environment exponents with their own platforms. On the panel were VirBELA, Glue Collaboration, Spatial Labs and HTC with its recent investment in Engage. Each platform had a slightly different core use case and ways to access it, some for larger events, some for small meetings. Each revolved around avatar interactions in a virtual space, typically with access to dynamic content creation. The discussions among the company representatives tended to cover many of the same topics that arose in 2006 with Second Life and enterprises; now, of course, a wider range of technology, including accessible VR headsets, controllers or hand sensing, and an acceptance that video is not the solution to everything, added some layers to consider.

Developer Tools

The event was also about some of the tools and platforms emerging to help people build virtual experiences, ranging from the simplest point-and-click no-code/low-code offerings to full high-end developer tooling. In the latter category, Unity Technologies announced its MARS tool set for the game/experience development platform. Designing interactive experiences in 2-D poses layout challenges for designers and programmers, and in 3-D this gets more challenging still, although there the designer at least controls the environment.

In AR, the physical space in which an experience will run is not guaranteed to be the same from user to user. AR objects must interact with a contextual awareness of the physical world. Part of what was demonstrated with MARS was providing that context to digital media and being able to test that media in a variety of virtual spaces – living room, office, classroom – and see how it might work. It was interesting to see that in order to run the tests the Unity environment simulated what an AR headset or handset would do by building its own internal scan of the environment, even though the test environment was already virtual.


Among all the channels of content, games, enterprise, healthcare, developers, artists and ethics, the side events in VR provided the most interesting set of user experiences and some of the most memorable content. There were events in Microsoft's AltspaceVR, Mozilla Hubs, HTC-funded Engage, Glue Technology and Museum of Other Realities. Having dropped into most of these events in a full VR headset, even though some worked on regular screens, this analyst can say it was a great snapshot of where the level of experience is today. It also highlighted the importance of trying these for comparison, especially against our potential video burnout and stresses.

In a guided tour of Engage, a group of 30 of us had about an hour of presentations and interactive sessions across a variety of environments. The platform has developed from an education background, so it has many moderator/teacher controls. It was not a free-for-all conversation; we requested to talk by waving or raising a hand, not by clicking a button. In a biology lab setting, the whole group was able to touch and explore models of a heart or lungs. In another venue, we all wrote on a whiteboard where we came from at the same time, spaced out across a large wall board. This ability to interact in any virtual environment, and to not feel locked into broadcasting via a camera, is something that must be experienced to judge whether it is better than the current norm. It works best in VR initially, but many of the platforms offer a chance to test them out for free.

Another event was a visit to the Museum of Other Realities. It is a VR-only art museum of 3-D creations made in many different tools and styles. Another indication of how important immersion and real-time events are: there was a large gathering in AltspaceVR, with talks and a party, but because it happened so late in this analyst's time zone, he missed it. Whether or not it was subsequently streamed as video, missing a live gathering has a very different emotional impact from missing a video and watching it later in a different time zone. The same happens at physical events; an attendee goes to one gathering yet misses another. That feels like an important element of the human experience too.


Three full days of at least three concurrent tracks and side events is clearly not something to cover in a short report. Hundreds of collaboration tools, several new headset announcements and product releases filled the event. Now it is up to the companies to be in the right place at the right time to make communication at distance a more satisfying and useful experience.
Ian Hughes
Senior Analyst, Internet of Things

Ian Hughes is a Senior Analyst for the Internet of Things practice at 451 Research. He has 30 years of experience in emerging technology as a developer, architect and consultant through key technology trends.

