Thursday 4 June 2020

AWE 2020: Leading AR Conference Goes Online, With an Assist from VR

Like other industry conferences, AWE (Augmented World Expo) 2020 went entirely virtual this year. Unlike conferences on, say, security or IT infrastructure, AWE and its companion VR Expo rely very heavily on in-person experiences. So in addition to moving all the main conference tracks to video, AWE ran a large-scale experiment with “Side Events” held in various VR settings like AltspaceVR. There were far too many interesting talks, insights, and announcements to cover in a single article, but here are some of the ones I found most interesting.

AR/VR Business Should Be Gangbusters, but Probably Not Soon

As always, along with various product introductions, keynotes consisted of speakers forecasting the future of the XR industry. (For those who haven’t caught up with the latest in acronyms, XR, for eXtended Reality, is the popular new shorthand for AR/VR/MR.) Intuitively, keeping most of the world’s population at home should be a huge boon to XR. Lots of otherwise unoccupied eyeballs, and plenty of companies looking for better remote collaboration tools.

However, industry luminaries pointed out some headwinds that the current stay-at-home orders and economic slowdown have created. First, budgets are being slashed across the board, for many households and most companies alike. With quality XR rigs still being expensive, it will be hard to make room for them anytime soon. And brands have less money to spend creating consumer experiences. Second, the supply chain for XR hardware has also been disrupted. Currently, decent HMDs are selling for scalper prices due to limited supply. Longer-term, as those issues recede, online collaboration using XR should have a rosy future.

Smartphone AR Is a Gateway to the Immersive Web

8th Wall’s Erik Murphy-Chutorian was one of the conference’s opening speakers and sounded an upbeat message about XR — not about HMD-based VR, but about smartphones running webAR apps — as a way to get to a more immersive web. Purists might argue that a lot of webAR experiences aren’t “real” AR, but some of them are impressive, and the company says they can run on nearly 3 billion current smartphones with no app download required. The company’s customers aren’t really the app users; they’re the brands using the tools to create experiences. As part of the appeal to those enterprises, Murphy-Chutorian said that webAR experiences have more than double the dwell time of more traditional 2D experiences.
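
For readers curious what “no app download required” looks like in practice, here is a minimal, hypothetical sketch of the first step nearly every browser-based AR experience takes: requesting the phone’s rear camera with standard web APIs. This is generic WebRTC/DOM code for illustration, not 8th Wall’s actual SDK.

```typescript
// Hypothetical sketch: how a browser-based AR page typically grabs the
// rear ("environment") camera with no app install. Standard web APIs only.
async function startCameraFeed(video: HTMLVideoElement): Promise<void> {
  // Ask for the rear-facing camera; the user grants permission in-browser.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: "environment" },
    audio: false,
  });
  video.srcObject = stream;
  await video.play();
  // From here, frames can be fed to a tracking engine (e.g. via a canvas)
  // that estimates the camera pose and anchors virtual content over the feed.
}

const video = document.createElement("video");
video.setAttribute("playsinline", "true"); // needed for inline playback on iOS Safari
document.body.appendChild(video);
startCameraFeed(video).catch((err) => console.error("Camera unavailable:", err));
```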

8th Wall powers hundreds of different phone-based webAR experiences for major brands.

The company also used AWE to launch its new Face Effects AR tools, which — as you might expect from the name — allow AR developers to add special effects to faces being videoed using a phone’s front-facing camera. Because their platform is web-based, developers can also integrate other services such as speech recognition into their webAR apps. Asked about the impact of COVID on their business, Murphy-Chutorian said they had seen an understandable shift from the creation of outdoor “with friends” experiences to more indoor and insular offerings.
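
As an illustration of the kind of integration Murphy-Chutorian described, here is a hedged sketch of wiring the browser’s built-in speech recognition (the Web Speech API, where available) into a web-based AR page. Again, this uses generic browser APIs and placeholder names, not 8th Wall’s platform.

```typescript
// Hypothetical sketch: adding browser speech recognition to a webAR page.
// Uses the Web Speech API (Chrome exposes it as webkitSpeechRecognition);
// support varies by browser, so feature-detect first.
const SpeechRecognitionImpl =
  (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;

if (SpeechRecognitionImpl) {
  const recognizer = new SpeechRecognitionImpl();
  recognizer.continuous = true;
  recognizer.interimResults = false;
  recognizer.lang = "en-US";

  recognizer.onresult = (event: any) => {
    const phrase = event.results[event.results.length - 1][0].transcript.trim();
    // Placeholder hook: react to the spoken phrase in the AR scene,
    // e.g. trigger a face effect when the user says "sparkles".
    if (phrase.toLowerCase().includes("sparkles")) {
      console.log("Would trigger the sparkle face effect here");
    }
  };

  recognizer.start();
} else {
  console.warn("Speech recognition not available in this browser");
}
```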

Unity Launches MARS Intelligent AR Development Environment

Unity MARS editor screenshot from the talk.

One of the show’s biggest announcements was the MARS graphical development environment from Unity. It allows app developers to drag and drop objects and characters into simulated scenes. The developer can then modify and test the behavior of characters without needing to actually build scenes and deploy the app on a real device.

MARS comes with a large number of pre-built simulated scenes, but you can also add your own based on scans of the physical world. Unity demoed one imported from Realities.io. Part of what makes MARS “intelligent” is that it’s capable of automatically placing objects in appropriate places in a user’s actual environment when the finished app is run. That dovetails nicely with the environment’s support for a wide variety of simulated environments, as it allows for much more streamlined testing of character behaviors. One impressive-looking marquee project for the platform is a Wallace & Gromit interactive experience called “The Big Fix Up,” which is planned for a fall launch.
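
To give a flavor of what “automatically placing objects in appropriate places” means at run time, here is a rough sketch using the browser’s WebXR hit-test module to find a real-world surface along the viewer’s gaze and place content there. This is the general concept only, expressed with web APIs rather than Unity’s MARS tooling, and the placement callback is a placeholder.

```typescript
// Hypothetical sketch of runtime surface placement using the WebXR
// hit-test module (a browser API), not Unity MARS. Types are loosened
// with `any` because WebXR typings aren't in the default TS DOM lib.
async function placeOnDetectedSurface(placeObjectAt: (pose: any) => void) {
  const xr = (navigator as any).xr;
  if (!xr || !(await xr.isSessionSupported("immersive-ar"))) return;

  const session = await xr.requestSession("immersive-ar", {
    requiredFeatures: ["hit-test"],
  });
  const viewerSpace = await session.requestReferenceSpace("viewer");
  const refSpace = await session.requestReferenceSpace("local");
  const hitTestSource = await session.requestHitTestSource({ space: viewerSpace });

  session.requestAnimationFrame(function onFrame(_time: number, frame: any) {
    const hits = frame.getHitTestResults(hitTestSource);
    if (hits.length > 0) {
      // First result is the nearest detected surface along the viewer ray.
      const pose = hits[0].getPose(refSpace);
      placeObjectAt(pose); // placeholder: position your content at this pose
    } else {
      session.requestAnimationFrame(onFrame); // keep looking for a surface
    }
  });
}
```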

Qualcomm used this image to illustrate the claim that 5G + XR could be better than reality. In one sense, I get that. But in another, it seems spooky.

Challenges in VR-based Collaboration

With more of us working from home, and a reduction in business and personal travel, VR-based collaboration seems like a natural evolution for many types of interaction. But it isn’t without its challenges. Many are fairly obvious, like the need for lots of bandwidth and low latency, coupled with a robust backend system. The limitations were even on display at the conference, where various VR “side events” experienced technical glitches.

One interesting requirement that might not be so obvious is the need for spatial audio in group settings. Unlike a typical multi-player game, where voice chat behaves more like a shared radio channel, if you are wandering around a virtual meeting or cocktail party you expect to be able to focus on what the people immediately around you are saying.
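
For a sense of what spatial audio involves under the hood, here is a small, hedged sketch using the browser’s Web Audio API: each remote participant’s voice is routed through a PannerNode positioned at their avatar’s location, so nearby voices are louder and directional while distant ones fade. This illustrates the general technique, not any particular collaboration platform; a real system would also update the listener’s position and orientation every frame.

```typescript
// Hypothetical sketch: positional audio for one remote participant using
// the Web Audio API. A collaboration app would create one of these per
// speaker and update positions as avatars move around the space.
const audioCtx = new AudioContext();

function attachSpatialVoice(stream: MediaStream, x: number, y: number, z: number) {
  const source = audioCtx.createMediaStreamSource(stream);

  const panner = new PannerNode(audioCtx, {
    panningModel: "HRTF",     // head-related transfer function for realistic direction cues
    distanceModel: "inverse", // volume falls off with distance
    refDistance: 1,
    maxDistance: 50,
    positionX: x,
    positionY: y,
    positionZ: z,
  });

  source.connect(panner).connect(audioCtx.destination);
  return panner; // caller updates panner.positionX/Y/Z as the avatar moves
}

// Keep the listener (the local user) in sync with their own head position.
function updateListener(x: number, y: number, z: number) {
  audioCtx.listener.positionX.value = x;
  audioCtx.listener.positionY.value = y;
  audioCtx.listener.positionZ.value = z;
}
```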

It’s going to be a while before the industry can solve all these challenges in a cost-effective way. In addition, the lack of a common platform that can be instantly deployed and scaled will keep XR-based collaboration a small niche compared with streaming video solutions such as Zoom, Skype, and Webex, in large part for the simple reason that everyone can play video. But when it works, VR-based collaboration, like VR-based gaming, does provide a unique and compelling experience.

XR in Medicine

Dr. Walter Greenleaf, from the Virtual Human Interaction Lab at Stanford University, provided attendees with a thought-provoking look at the ways AR is beginning to transform health care. He believes those changes will build on the already-in-progress shift of the “center of gravity” of health care from institutions to wherever the individual is located. Digital health-related wearables and telemedicine are two important pieces of that transition. Coupled with cloud aggregation of the data, they provide powerful new tools for both patients and healthcare providers.

Greenleaf sees both VR and AR already making an impact across all aspects of healthcare, powered in part by over 200 companies providing or developing relevant XR products. We’ve covered how VR is used to train surgeons before, but the solutions are becoming more powerful and are being extended to include entire virtual simulated patients for training on human interactions and bedside manner. Room-scale XR environments also allow for improved cognitive assessment tools.

After a procedure, VR tools can also help with physical and occupational therapy. In particular, XR’s ability to assist in treating mental health issues may be especially important. For example, VR experiences have an impressive ability to activate neuroplastic changes in the human brain, according to Greenleaf. Since those experiences typically need to be repeated many times to be effective, having software deliver them is ideal.

One of the next developments Greenleaf foresees is combining prescription medicines with XR experiences to improve their effectiveness. An intriguing possibility he teased would be the ability to have a conversation with your “future self” that could be set up to provide additional motivation for behavior changes. One silver lining of the current pandemic is that it has increased investment and lowered regulatory barriers in this area, so progress is accelerating.

You Can Still Be Part of AWE2020

Conference organizers have made the main stage presentations freely available online (those who registered can also stream any of the other sessions). That, in turn, raises another interesting question about XR-based events: Will there eventually be a way to record and replay them that’s as effective as simply playing a video of a talk?
