Players can be virtually splashed by a whale and chat with a hobbit. Credits: Air New Zealand
Air New Zealand is taking tourism promotion to the tabletop with a new augmented reality board game.
The Air New Zealand Fact or Fantasy Game of New Zealand sees players wear Magic Leap One headsets to view and interact with a 3D map of Aotearoa.
Using Magic Leap technology, users can virtually watch the growth of a kauri tree, interact with a rather grumpy hobbit and get splashed by a breaching whale.
The game was on display at the first L.E.A.P conference in Los Angeles this week.
Magic Leap’s technology works by layering digital objects onto the real world so that light enters the eye as it would with a real object. This means users can see detail both up close and from afar.
Air NZ has been working with the creative team at Framestore for the last 18 months to create its board game.
The airline’s Jodi Williams says it’s important Air NZ continues to discover new technologies to improve the customer experience.
“By getting in early and being both a developer and creator, we have been able to test and learn, creating an incredible platform,” she says.
Ms Williams also says the Magic Leap technology may be used to “reframe customers’ perceptions of the physical cabin environment”.
The Air New Zealand Fact or Fantasy Game of New Zealand can be played by four people and is aimed at educating and promoting New Zealand as a destination.
Unlike VR, augmented reality experiences can be incredibly difficult to describe, primarily because they are even more contextual than the relatively static virtual worlds that don’t involve real-world settings.
In AR, everything is about how “you” see things interacting with your real environment. Such is the case with what I’m calling the most important demonstration of Magic Leap technology to date in the form of an AI assistant called Mica.
Together, the team described a world in which a Magic Leap user will be able to interact with intelligent assistants in the form of fully realized augmented reality humans that can recognize your position in a room, as well as items in that room. Having mapped the area and your position within it, the AI assistant will then interact with you to help you do any number of things.
For example, as detailed in the presentation, the AI assistant might scan the Magic Leap One wearer’s eyes to detect his mood and then suggest an appropriate song to play through the home’s music system. Similarly, the AI assistant may access the Magic Leap One user’s preferences to adjust things such as the level of light in a room at a certain time of day.
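To make the described behavior concrete, here is a minimal sketch of that kind of decision loop. It is an assumption for illustration only: the class names, fields, and logic are invented and do not come from Magic Leap’s SDK or from anything shown in the demo.

```python
# Hypothetical sketch of a context-aware assistant step, loosely based on the
# behavior described above. All names and signals here are illustrative
# placeholders, not Magic Leap APIs.
from dataclasses import dataclass


@dataclass
class RoomState:
    time_of_day: str       # e.g. "evening"
    ambient_light: float   # 0.0 (dark) to 1.0 (bright)


@dataclass
class UserState:
    mood: str              # inferred, say, from eye tracking: "relaxed"
    preferences: dict      # e.g. {"evening_light": 0.3, "genre": "jazz"}


def assistant_step(user: UserState, room: RoomState) -> list[str]:
    """Return the actions the assistant might take for this user and room."""
    actions = []
    # Suggest music that matches the detected mood.
    if user.mood == "relaxed":
        actions.append(f"play {user.preferences.get('genre', 'ambient')} playlist")
    # Nudge the lighting toward the user's stored preference for this time of day.
    target = user.preferences.get(f"{room.time_of_day}_light")
    if target is not None and abs(room.ambient_light - target) > 0.1:
        actions.append(f"dim lights to {target:.0%}")
    return actions


print(assistant_step(
    UserState(mood="relaxed", preferences={"evening_light": 0.3, "genre": "jazz"}),
    RoomState(time_of_day="evening", ambient_light=0.8),
))
# -> ['play jazz playlist', 'dim lights to 30%']
```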
We’re already becoming accustomed to such interactions on the audio plane via digital assistants like Amazon’s Alexa, Google Home, and, to a lesser extent, Apple’s Siri. But what Magic Leap is describing is putting an even more robust and responsive version of such a digital assistant in the form of a human that inhabits the same space as you, thus taking the assistant metaphor to its highest level. It sounds and looks a bit like science fiction, but it’s not.
What Magic Leap is describing is so close to reality that the company now feels comfortable enough to offer demonstrations of a rudimentary version of the dynamic at work, using the Magic Leap One in conjunction with its intelligent assistant Mica.
The result is a stunning experience that takes AR into brand new and exciting territory.
I met Mica for the first time earlier this week. And if you get a chance to meet her, she will fundamentally change how you view the Magic Leap One and augmented reality in general.
When Magic Leap’s team brought me into an empty room hidden deep in the bowels of an LA event center, I didn’t know what to expect. The space was designed to look like a normal room, complete with a table, two chairs, and other furniture situated around the table. Nothing looked particularly futuristic or tech-enabled, so I wasn’t expecting much. Wow, was I wrong.
Upon donning the Magic Leap One, I’m greeted by a virtual woman (Mica) sitting at the very real wooden table. Then, Mica, with an inviting smile, gestures for me to join her and sit in the chair opposite her. I oblige, and then a very weird interaction begins — she starts smiling at me, seemingly looking for a reaction.
I’ll admit, I deliberately avoided smiling (though it was really hard, Mica seems so nice) and kept a poker face in an attempt to see if I could somehow throw the experience off by not doing the expected, that is, returning the smile.
Undaunted, Mica continued to look into my eyes and go through a series of “emotions” that, surprisingly, made me feel a bit guilty about being so stoic.
It’s at this point that I should mention that she doesn’t speak yet, so all of our interactions were conducted in silence; instead of using words, she communicated through gestures, eye movements, and various body language. At first, I thought this might be a limitation, but in retrospect, I think it served to make the experience even more impactful.
That would have been enough to mildly impress me, but what came next was the kicker. She then pointed to a real wooden picture frame on the table, gesturing for me to hang it on a pin on the wall next to us. I did as asked, and… it was the eureka moment. Here was a virtual human sitting at a real-world table, and she had just gotten me to change something in the real world at her direction.
But then it got better. Once I’d hung the empty frame, Mica got up (she’s about five feet six inches tall) and began writing a message inside the frame, which in context looked about as real as if an actual person had begun writing on the space.
Alas, I don’t remember what the message was (honestly, I was too blown away by what was happening), but I’m assuming it was somewhat profound, as Mica then looked to me in a way that seemed to ask that I consider the meaning of the message. After a few beats, the life-sized, augmented reality human walked out of the room. But she didn’t just disappear into a wall in a flurry of sparkly AR dust. Instead, she walked behind a real wall in the room leading to a hallway. It was a subtle but powerful touch that increased the realism of the entire interaction.
As I said earlier, it’s incredibly difficult to describe just how profound this experience was, but if and when it’s made available to the public, you’ll be doing yourself a grave disservice if you pass the opportunity up.
I’ve been trying to think of a tool or app that would compel me to wear the Magic Leap One for an entire day. And while I’ve had the device for months now, I haven’t been able to think of anything that would get me to wear it beyond one-hour spurts of activity. That’s all changed now. Although battery life and the experience itself aren’t quite ready for such rigorous and extended use, I could easily see coming home and slipping on the Magic Leap One for the rest of the night if it meant having access to a fully realized AI assistant like Mica.
After meeting Mica, I have no doubt that this is what the virtual assistant future will look like for most people in the very near future. It’s not assured that it will be Magic Leap that delivers it, but whichever company does, I think it’s safe to say that Magic Leap was first to show us that future in this particular way, and it’s incredible.
Hevolus unveils the innovative mixed reality solutions developed for the Würth Group, a global player in the distribution of fastening and assembly systems.
In this sector, Hevolus offers HoloWarehouse and HoloMaintenance, platforms that enable maximum interaction between operators and information systems. It is a genuine paradigm shift for people working in the field, all built on Microsoft HoloLens headsets. Specifically, HoloWarehouse is a mixed reality app for presenting and configuring logistics solutions, while HoloMaintenance handles post-sales maintenance and remote assistance.
Thanks to HoloWarehouse, for example, it will be easier for a mechanical engineering company to understand how to improve workplace safety. Würth’s current and prospective customers can now do just that by installing Würth’s automatic safety-equipment dispensers (stocking gloves, masks, safety glasses, and so on) in their production facilities.
Thanks to Hevolus technology and Microsoft’s HoloLens, a 3D hologram of the dispenser shows how it works: when an electronic badge is scanned, the dispenser hands the worker the safety equipment tailored to their role, giving the company daily tracking of stocked products and enabling real-time management of reorders.
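As a rough illustration of that flow, the badge-driven dispensing, tracking, and reorder logic could be sketched as follows. The roles, stock levels, and reorder threshold are assumptions made for the example, not details of the Würth or Hevolus system.

```python
# Illustrative sketch of the badge-driven PPE dispensing flow described above.
# Roles, stock levels, and the reorder threshold are invented for this example.
from collections import defaultdict

PPE_BY_ROLE = {
    "welder": ["gloves", "safety glasses"],
    "assembler": ["gloves", "mask"],
}
REORDER_THRESHOLD = 5

stock = defaultdict(lambda: 20)   # current units per item in the dispenser
daily_log = []                    # per-badge record of what was dispensed


def dispense(badge_id: str, role: str) -> list[str]:
    """Dispense the PPE kit for the worker's role, log it, and flag reorders."""
    items = PPE_BY_ROLE.get(role, [])
    for item in items:
        stock[item] -= 1
        daily_log.append((badge_id, item))
        if stock[item] <= REORDER_THRESHOLD:
            print(f"reorder needed: {item} (remaining: {stock[item]})")
    return items


dispense("badge-0042", "welder")  # -> ['gloves', 'safety glasses']
```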
In the event of faults or malfunctions, HoloMaintenance lets Würth handle dispenser maintenance or repair quickly, either directly, by pulling the necessary information from the 3D hologram, or remotely, by video-calling specialized technical support, who can see exactly what the on-site operator sees through the HoloLens and guide the diagnosis and repair procedure.
Hevolus will be present at SMAU Milano, where it will preview further developments in the works, including its most recent research programs and the innovative Photoplanner solution designed for Natuzzi, Italy’s largest furniture company.
Antonella La Notte, CEO of Hevolus: “We are proud to take part in SMAU again this year, continuing to bring technological innovation to a sector where until recently it seemed unthinkable: retail. Mixed reality is an incredible opportunity for companies to give their customers a unique shopping experience, and we are delighted that valued partners like Würth place their trust in us to take the customer experience to a new level.”
OnSight is mixed-reality software that allows scientists and engineers to virtually walk and meet on Mars. It was created by NASA’s Jet Propulsion Laboratory, in collaboration with Microsoft, for the HoloLens. The software won NASA’s Software of the Year Award 2018. For more about NASA’s exploration of Mars, visit https://mars.nasa.gov
When you work at a factory that pumps out thousands of a single item, like iPhones or shoes, you quickly become an expert in the assembly process. But when you are making something like a spacecraft, that comfort level doesn’t come quite so easily.
“Just about every time, we are building something for the first time,” says Brian O’Connor, the vice president of production operations at Lockheed Martin Space.
Traditionally, aerospace organizations have relied on thousand-page paper manuals to relay instructions to their workers. In recent years, firms like Boeing and Airbus have started experimenting with augmented reality, but it’s rarely progressed beyond the testing phase. At Lockheed, at least, that’s changing. The firm’s employees are now using AR to do their jobs every single day.
Spacecraft technician Decker Jory uses a Microsoft HoloLens headset on a daily basis for his work on Orion, the spacecraft intended to one day sit atop the powerful—and repeatedly delayed—NASA Space Launch System. “At the start of the day, I put on the device to get accustomed to what we will be doing in the morning,” says Jory. He takes the headset off when he is ready to start drilling. For now, the longest he can wear it without it getting uncomfortable or too heavy is about three hours. So he and his team of assemblers use it to learn a task or check the directions in 15-minute increments rather than for a constant feed of instructions.
In the headset, the workers can see holograms displaying models that are created through engineering design software from Scope AR. Models of parts and labels are overlaid on already assembled pieces of spacecraft. Information like torquing instructions—how to twist things—can be displayed right on top of the holes to which they are relevant, and workers can see what the finished product will look like.
The virtual models around the workers are even color-coded to the role of the person using the headset. For Jory’s team, which is currently constructing the heat shield skeleton of Orion, the new technology takes the place of a 1,500-page binder full of written work instructions.
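The article doesn’t describe Scope AR’s data format, so as a purely hypothetical sketch, a single work-instruction overlay (a torque callout anchored to a fastener hole and color-coded by role) might be structured like this. The field names and role colors are assumptions, not Scope AR’s or Lockheed’s actual schema.

```python
# Minimal sketch of an AR work-instruction record, loosely based on the
# description above. Field names and role colors are illustrative assumptions.
from dataclasses import dataclass

ROLE_COLORS = {"heat_shield": "orange", "avionics": "blue"}  # illustrative


@dataclass
class TorqueCallout:
    hole_id: str
    position_mm: tuple     # (x, y, z) anchor point on the assembled part
    torque_nm: float       # target torque for the fastener
    role: str              # which team's headset should highlight it

    def render_hint(self) -> str:
        """Build the text hint shown next to the hole in the headset."""
        color = ROLE_COLORS.get(self.role, "white")
        return f"[{color}] hole {self.hole_id}: torque to {self.torque_nm} N·m"


step = TorqueCallout("H-17", (120.0, 45.5, 0.0), torque_nm=12.5, role="heat_shield")
print(step.render_hint())  # -> [orange] hole H-17: torque to 12.5 N·m
```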
Lockheed is expanding its use of augmented reality after seeing some dramatic effects during testing. Technicians needed far less time to get familiar with and prepare for a new task or to understand and perform processes like drilling holes and twisting fasteners.
These results are prompting the organization to expand its ambitions for the headsets: one day it hopes to use them in space. Lockheed Martin’s head of emerging technologies, Shelley Peterson, says the way workers use the headsets back here on Earth gives insight into how augmented reality could help astronauts maintain the spacecraft the firm helped build. “What we want astronauts to be able to do is have maintenance capability that’s much more intuitive than going through text or drawing content,” says Peterson.
For now, these headsets still need some adjustments to increase their wearability and ease of use before they can be used in space. Creating the content the workers see is getting easier, but it still takes a lot of effort. O’Connor sees these as obstacles that can be overcome quickly, though.
“If you were to look five years down the road, I don’t think you will find an efficient manufacturing operation that doesn’t have this type of augmented reality to assist the operators,” he says.