By: Cuyler Gibbons, Images: Courtesy The Ayzenberg Group
“For the last 50 years we’ve been looking at media through a flat screen,” says Joey Jones, lead designer and director of the “a.new reality” team at Pasadena ad agency the Ayzenberg Group. Jones’ mission is to change that paradigm and bring media—and the stories his clients hope to tell—into a virtual 3D experience that can integrate with the real world.
It turns out that the virtual reality ecosystem is something of a continuum, defined by the degree of “virtual” reality the experience provides and the flexibility the user has to move between realities. At one end, virtual reality, or VR, provides a sight-occluded experience that is an artificial, computer-generated simulation or re-creation of a real-life environment or situation. With VR, users immerse themselves and, divorced from the real world, experience the simulated reality firsthand, primarily through stimulation of vision and hearing. Alternatively, with augmented and mixed reality (AR and MR, respectively), the virtual elements are integrated into the user’s actual reality.
I’ve come to the Ayzenberg Group to get some insight into where this technology is headed. “Today we’re in the baby steps of seeing all that [digital content] come out of those flat screens so that we are able to interact with it in real time. It’s not only a processor being able to generate 3D objects, but AI (artificial intelligence) being able to have those objects come out and understand the weather, the space, where you are at, where you are looking, and what you’re saying,” Jones tells me.
While gaming technology, by virtue of a massive game market that dwarfs even the movie industry, helps push the demand for faster processors and higher fidelity, other commercial implications are far-reaching and perhaps more profound. “Architects use it to study scale models that then they can easily blow up to a 1-to-1 scale, and medical students are using it … to simulate operations on holographic cadavers,” Jones says.
Jones himself comes to the tech world as a self-described storyteller, via architecture school. As a student he was deeply involved with the development of what was, at that time, a nascent software program called Form-Z, intended to allow architects and engineers to design in 3D. Realizing he was more enamored with software development than the practice of architecture, Jones soon found himself working in 3D animation. It’s a path he deems linear.
Jones now leads a team developing Moonbloom, a mixed-reality app experienced through the Magic Leap virtual retinal display. Using a digital light field projected into the user’s eye, Magic Leap superimposes computer-generated 3D imagery over real-world objects. It’s leading-edge technology that allows real-world interaction with virtual objects and spaces in a mixed-reality experience.
With Moonbloom, Jones and his team have created a world within a world containing flowing water, gleaming towers, arches, and waterfalls in a hypnotic, rainbow-hued milieu that exists in space, while keeping the real world visible “outside.” An original story, the narrative asks you and a lone fox to find and restore the pieces of the shattered moon, and in doing so, you walk around while interacting with the virtual environment. No controller or special sensors are required, as the system recognizes natural hand gestures, allowing you to manipulate virtual objects with intuitive movements.
“The laws of this game follow how you engage with objects in your natural life,” Jones says. Moonbloom and its creators are at something of a “If you build it, they will come” moment as they wait for the retinal display technology to truly take off. “When these devices get out, when they become ubiquitous, clients will become more confident … Moonbloom is a proof of concept,” Jones says. “When enough people want these experiences, we want to be able to create enough content.”