
Real benefits in a virtual world


Although many may think of augmented reality (AR) and virtual reality (VR) as relatively new phenomena, one of the flagship events in the calendar, Augmented World Expo, will celebrate its 10th anniversary next year. Walking through the halls of AWE EU in Munich this month put into perspective just how far these technologies have come.

This year’s event differed significantly from earlier editions in the number of real-world deployments of both technologies under discussion, with applications in training, remote support, and visualisation delivering tangible results.

Although start-ups still made up a disproportionately large share of attendees, this year’s presentations came from a growing number of major multinationals that have moved beyond the drawing board and are now deploying VR and AR applications in the field. In some cases these are mission-critical tasks, such as aircraft construction at Boeing, and there was also an excellent presentation from Seb Canniff of FundamentalVR on the use of VR to help train the next generation of surgeons.

Across multiple presentations, passionate speakers shared hard numbers. Boeing, for example, reported a 20% reduction in wiring time using HoloLens and an 80% reduction in error rate. Centrica, the UK’s largest gas and electricity supplier, reported a 60% drop in error rates and benefits in sales training.

Bosch reported a 50-80% increase in knowledge retention for training using AR-enabled ActiveSchematics on an automotive repair project, plus a 75% reduction in expert travel requirements. The overall sentiment across these presentations was clear: for the many that had taken the leap of faith, the business benefits of AR and VR are real.

Listening to popular sessions from the likes of HTC, Microsoft, and Google, another message was clear: our society is moving to images as the preferred method of communication. Described by one speaker as the “Snapchat generation”, millennials see extended reality (XR) as a natural progression across both their personal and professional lives.

This feeds into another prevalent trend that emerged across a number of sessions. As AR and VR move from niche to mainstream, inclusion of the widest potential user community is now vital. The announcement that the new Vive Pro Eye and Cosmos headsets can be worn with glasses was a welcome highlight. But given that roughly 65% of the world’s adult population needs corrective lenses, a scenario in which only a third of aerospace engineers or prospective surgeons can benefit from AR and VR training is clearly unacceptable for enterprise use cases.

Another area where the optical challenges of AR and VR were explored is the growing use of haptics. In use cases where physical feedback is critical, the requirement for visual accuracy is also heightened. A user in a passive viewing scenario or leisure activity may not notice, or particularly care, that an object is “felt” to be in a slightly different position from where it appears in the virtually generated world. That illusion is broken when the object needs to be grasped or manipulated, for example during a surgical procedure. Accurately mapping visual representation to physical interaction demands millimetre accuracy, and a fascinating presentation from Manuela Chesso at the University of Genoa highlighted the work underway to ensure this is not forgotten.

Interesting sessions from Kevin Williams of KWP and Eozin Che of the American Museum of Natural History also touched on how to scale for mass adoption within Location Based Entertainment (LBE). Whether it’s a simulated roller-coaster or a tour around a museum, operators have similar requirements for inclusivity but also for simplified operation. Current headsets can require a significant setup process to be tuned to each user’s visual requirements, and a process that takes several minutes per participant with a trained operator is unlikely to be commercially viable, or popular with users, in an LBE environment.

Our own presentation on variable power optics, which stimulate our eyes to respond and focus as they do in the real world, proved popular. It was followed by a number of interesting discussions on how adaptive lens technology can be integrated into headsets that automatically configure themselves to the user through a simplified, self-service methodology. This is part of our ongoing research scope, and it was encouraging to see that several hardware manufacturers were keen to engage with us on the topic at a practical level.

As AR and VR continue their rise from niche technology to mass-market application, it’s evident that the industry is starting to take a more holistic view of how to create a user experience that is not just comfortable but also accessible to the widest possible audience.