Published: June 9, 2020
Augmented World Expo (AWE) is not the first virtual conference, but it did mark a significant milestone: the spatial computing industry using the very tools and social experiences that augmented, virtual and extended reality (AR, VR, XR) have produced to host its own event. It also gave many who would not ordinarily be able to make the journey to Santa Clara, California, a chance to join in. Furthermore, with so many people from all walks of life working from home and feeling the differences among forms of communication, some more stress-inducing than others, many are likely asking whether there are better ways to interact; AWE provided a way to hear about and experience spatial computing as a whole.
The 451 Take
We covered AWE 2019 on its 10th anniversary, when it was clear that a new wave of both technology and enthusiasm for spatial computing was blending with long-established leaders in the field, all waiting for the right opportunity to deliver results. Everyone presenting at AWE 2020 showed that many flavors of the industry are in a prime position to help a locked-down world. The world of work (and play) is in a state of flux, and some significant changes will emerge even as we come out of lockdown. We have been covering this in 2020 Trends in Workforce Productivity & Collaboration: Workforce Transformation and, more recently, The coronavirus impact: 20 long-lasting changes we expect. Spatial computing in all its shades will have a long-term impact. Different virtual environments suit different use cases, but there is scope to use different spatial approaches at different times.
SPACES had itself been affected by the pandemic. Because the company builds and runs immersive experiences in physical locations, it chose to engage with the wave of video applications while delivering some of the benefits of VR in how information is presented. As a college lecturer in VR and AR, Fink had also been affected by the stay-at-home orders and decided to take a field trip with his students to as many virtual environments as possible, leading him to document about 120 different applications across multiple categories, some of which he shared at the event.
In the panel section of the presentation, all the panelists were virtual environment exponents with their own platforms: VirBELA, Glue Collaboration, Spatial Labs and HTC, with its recent investment in Engage. Each platform had a slightly different core use case and means of access, some suited to larger events, some to small meetings. Each revolved around avatar interactions in a virtual space, typically with access to dynamic content creation. The discussions among the company representatives covered many of the same themes we saw in 2006 with Second Life and enterprises; now, of course, a wider range of technology, including accessible VR headsets and controller- or hand-based input, along with an acceptance that video is not the solution to everything, adds new layers to consider.
In AR, the physical space in which an experience will run is not guaranteed to be the same from user to user, so AR objects must interact with a contextual awareness of the physical world. Part of what was demonstrated with MARS was providing that context to digital media and being able to test that media in a variety of virtual spaces – living room, office, classroom – to see how it might work. It was interesting to see that, in order to run the tests, the Unity environment simulated what an AR headset or handset would do by building its own internal scan of the environment, even though the test environment was already virtual.
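To make that workflow concrete, here is a minimal sketch of the idea, not Unity MARS itself: a test harness synthesizes the plane-detection data a headset or handset scan would produce for each simulated room, and the same content-placement rule runs unchanged against every environment. All names here (Plane, ROOMS, find_surface_for) are hypothetical illustrations, not part of any Unity API.

```python
# Hypothetical sketch of AR content testing against simulated environments.
# A real tool like Unity MARS works differently; this only illustrates the
# concept of running one placement rule over many synthesized room scans.
from dataclasses import dataclass

@dataclass
class Plane:
    """A flat surface that the simulated 'scan' has detected."""
    label: str        # e.g., "floor", "desk", "wall"
    width_m: float    # usable extent in meters
    horizontal: bool  # True for floors/tables, False for walls

# Each room stands in for the internal scan an AR device would build.
ROOMS = {
    "living_room": [Plane("floor", 5.0, True), Plane("coffee_table", 1.2, True)],
    "office": [Plane("floor", 4.0, True), Plane("desk", 1.6, True), Plane("wall", 4.0, False)],
    "classroom": [Plane("floor", 8.0, True), Plane("whiteboard_wall", 6.0, False)],
}

def find_surface_for(content_width_m, needs_vertical, planes):
    """Placement rule: first plane with the right orientation and enough room."""
    for p in planes:
        if p.horizontal != needs_vertical and p.width_m >= content_width_m:
            return p
    return None

# Test a 1m-wide wall poster against every simulated environment.
for room, planes in ROOMS.items():
    match = find_surface_for(1.0, needs_vertical=True, planes=planes)
    print(f"{room}: {'place on ' + match.label if match else 'no suitable surface'}")
```

Running this shows the same digital asset succeeding in the office and classroom but failing in the living room, which is the kind of cross-environment check the MARS demonstration made easy to perform.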
In a guided tour of Engage, a group of 30 of us had about an hour of presentations and interactive sessions across a variety of environments. The platform has developed from an education background, so it has many moderator/teacher controls. It was not a free-for-all conversation; we requested to talk by waving or raising a hand, not by clicking a button. In a biology lab setting, the whole group was able to touch and explore models of a heart or lungs. In another venue, we all wrote where we came from on a large wall-mounted whiteboard at the same time, spread out across the board. This ability to interact within a virtual environment, without feeling locked into broadcasting via a camera, is something that must be experienced to judge whether it is better than the current norm. It works best in VR initially, but many of the platforms offer a chance to test them out for free.
Another event was a visit to the Museum of Modern Realities, a VR-only art museum of 3-D creations made with many different tools and in many styles. Another indication of how important immersion and real-time events are: there was a large gathering in AltSpaceVR, with talks and a party, but because it happened so late in this analyst's time zone, it was missed. Whether or not it was subsequently streamed as video, missing a live gathering carries a very different emotional impact from missing a video and watching it later in a different time zone. The same happens at physical events; an attendee goes to one gathering yet misses another. That feels like an important element of the human experience too.
Ian Hughes is a Senior Analyst for the Internet of Things practice at 451 Research. He has 30 years of experience in emerging technology as a developer, architect and consultant through key technology trends.
Paige is a Senior Analyst for the Data, AI and Analytics channel at 451 Research, covering data management, including data integration, data governance, data quality and master data management. She has experience covering a broad range of information management technologies spanning database functionality and self-service analytics to regulatory policy and compliance.
Scott Crawford is Research Vice President for the Information Security Channel at 451 Research, where he leads coverage of emerging trends, innovation and disruption in the information security market. Scott is also a member of 451 Research’s Center of Excellence for Quantum Technologies.