TRAHAN ARCHITECTS TRANSFORMS A HISTORIC VENUE INTO AN INTERACTIVE DESTINATION MADE POSSIBLE THROUGH THE COMBINATION OF ARTISTRY, DESIGN, AND LASERS.
The sculptural furniture and objects Brooklyn artist Matthias Pliessnig handcrafted from steam-bent wood had long captured the eye of Trey Trahan, FAIA. In 2015, when his firm, New Orleans–based Trahan Architects, was commissioned to renovate Atlanta’s Alliance Theatre, he seized the opportunity to bring Pliessnig’s sinuous designs into an architectural context.
Not only would the custom-shaped and -positioned slats provide outstanding acoustics inside the theater, but steam-bending the wood would be more efficient than milling it on a lathe, which Trahan knew could be wasteful based on previous work with precision-milled wood. “I was fascinated with how one could go about the process of creating complicated shapes in a more ecological way,” he says.
The challenge was scaling the artist’s handcrafted quality to outfit a 650-seat theater. After some iteration, Pliessnig and Trahan’s team devised a technique: using steam to soften hundreds of reclaimed white oak slats, each ½ inch square in section, and then bending them into place to create a serpentine surface along the theater’s balcony railing and side terraces.
To achieve Pliessnig’s vision, Trahan collaborated with Plaistow, N.H.–based wood fabricator CW Keller Associates. Working in Rhino, the team devised a model that called for approximately 100,000 linear feet of wood slats placed around the theater. Where the acoustics needed a reflective surface, the slats were spaced close together; where absorption was desired, they were set farther apart. Thanks to the model’s accuracy and precision, CW Keller could specify the placement of each strand to a 1⁄32-inch tolerance.
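The spacing logic is simple to express computationally. Below is a hypothetical Python sketch of the idea, not CW Keller’s actual Rhino model: an invented `absorption` parameter drives the gap between ½-inch slats, and each centerline is snapped to the 1⁄32-inch grid the fabricators could hold.

```python
# Hypothetical sketch of acoustically driven slat spacing; the spacing rule
# and the `absorption` parameter are illustrative, not the firm's real model.
TOL = 1.0 / 32.0   # fabrication tolerance, inches
SLAT_W = 0.5       # each slat is 1/2 inch square in section

def slat_centerlines(run_length_in, absorption):
    """Place slats along a run of wall.

    absorption: 0.0 for a reflective zone (slats nearly touching) up to
    1.0 for an absorptive zone (widest gaps). Purely illustrative scale.
    """
    gap = SLAT_W * (0.25 + 2.0 * absorption)  # assumed spacing rule
    pitch = SLAT_W + gap
    count = int(run_length_in // pitch)
    # snap every centerline to the 1/32-inch grid
    return [round((i * pitch + SLAT_W / 2) / TOL) * TOL for i in range(count)]

print(slat_centerlines(24.0, 0.1)[:4])  # reflective zone: tight spacing
print(slat_centerlines(24.0, 0.9)[:4])  # absorptive zone: wide spacing
```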
CW Keller’s engineers then went to the shop and used the model to laser-project the exact location of each strand onto a wooden jig framework, which in turn was attached to a steel armature. The fabricators used a similar augmented-reality environment to install the completed framework panel in the theater itself. The wood strips are stained a rich, dark brown, enhancing the warmth and ambiance of the interior. “Over time, as audience members touch the surface, it will take on a beautiful patina,” Trahan says.
The result is human-scale, handcrafted millwork made possible with the latest 3D technology, which merges design, sustainable construction, and acoustical performance to challenge the relationship between a theater and its audience. “I want to just hug this thing and touch it,” says juror James Garrett Jr., AIA.
Alliance Theatre’s leadership could not agree more. “The design,” says Jennings Hertz Artistic Director Susan V. Booth, “inherently unites each performance’s audience into a fostered and connected community, and provides not [simply] a frame for the work we do, [but moreover] a graceful conduit for the work to land in the heads and hearts of those folks.”
THE BALL AND THE RACKETS EXIST IN AN AUGMENTED REALITY WORLD
Developed with Stereolabs’ ZED Mini and the HTC Vive, an augmented-reality game of table tennis has the internet amazed and gearing up for the future.
The ZED Mini is the world’s first mixed-reality camera, combining virtual and augmented reality. Virtual reality is a wholly artificial world created through computer graphics, which the user navigates and interacts with as they would the real world.
Augmented reality, however, is a scenario like this one, where players can see the virtual table, rackets, and balls, but also the real-world room they’re in. And the HTC Vive is a headset that “pulls virtual worlds off your computer screen and into your home.”
After debuting its virtual Pocket Gallery last year with the works of Johannes Vermeer, Google Arts & Culture has released a sequel that brings even more artists into your home via augmented reality.
Available in the Google Arts & Culture app for iOS and Android, “The Art of Color” features 33 famous paintings from around the world organized into wings by color palette, with Pablo Picasso and Vincent van Gogh among the featured artists.
As with the Vermeer gallery, users can anchor a miniature version of the virtual gallery in their physical environment via ARKit or ARCore.
To access “The Art of Color,” open the app, then tap the camera icon at the bottom of the screen. The next screen presents a menu that includes the Pocket Gallery option. Once you tap Pocket Gallery, you’ll be prompted to find a well-lit surface on which to place the virtual gallery.
Once that surface is tracked, tap the “Art of Color” icon at the bottom of the screen to download the new gallery. When the download finishes, tap the Enter button and you’ll be immersed in a virtual gallery in your real-world location. The experience verges on VR, except that users can still see the real world through the gallery’s exit doors.
Once immersed in the gallery, users can walk around the virtual halls to view works of art more closely or double-tap to transport themselves to various wings of the digital museum. Also, tapping on a painting brings up a card with more information on the piece.
“One of the goals of the Google Arts & Culture team is to find new or unexpected ways to bring people closer to art. From renowned masterpieces to hidden gems, ‘The Art of Color’ brings together artworks like Georgia O’Keeffe’s ‘Red Cannas’ and Amrita Sher-Gil’s ‘Mother India’ or Hokusai’s ‘South Wind, Clear Dawn,’” said Andy Joslin, design lead for Google Arts & Culture, in a blog post.
While Google has begun using augmented reality in many of its existing products, like Google Maps and Google Search, it seems the Google Arts & Culture team has gone “all in” on AR, so much so that it has consolidated all of its AR tools under the Camera tab in the app.
In recent years, the Google Arts & Culture initiative has been best known for its VR experiments, but augmented reality is increasingly front and center for the team, including an Art Projector tool that brings life-sized individual works of art into the user’s personal space.
Outside of its mobile app, the team has also partnered with other organizations to tell their stories in augmented reality. For example, the team assisted CERN in using AR to explore the Big Bang. The Google team also spearheaded the Notable Women project, which featured an experience that used AR to digitally insert historically famous women onto real currency.
Despite these wide-ranging uses, showing off art in AR through a mobile app appears to be becoming one of Google’s favorite palettes for immersive experimentation. And, until teleportation becomes a thing, it’s the only way to see the world’s most famous works of art in one space.
On stage at Microsoft’s Inspire 2019 event, under way this week in Las Vegas, Julia White (Corporate Vice President of Azure) addressed the audience in Japanese to describe a newly developed technology. Julia White, however, does not speak Japanese. Her hologram did it for her: a complex, detailed three-dimensional model that faithfully replicated her appearance, voice, movements, and even her clothing.
A hologram to translate what we say
It is the product of pairing the Mixed Reality of the second-generation HoloLens headset with artificial-intelligence algorithms running on the cloud servers of the Azure infrastructure. The speech was rendered with a neural-network-based text-to-speech system. It is worth noting that the human-to-hologram conversion does not happen in real time: it requires a prior full-body scan as well as a recording of what the virtual speaker will say. That said, the result shown in the demo is rather convincing.
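For a rough sense of the speech side of such a pipeline, here is a minimal Python sketch using Microsoft’s Azure Speech SDK: English speech is recognized, machine-translated to Japanese, and re-synthesized with a neural text-to-speech voice. Note the assumptions: the keynote used a custom neural voice trained on White’s own recordings, which is not publicly available, so the subscription key, region, and stock `ja-JP-NanamiNeural` voice below are stand-ins.

```python
# Minimal sketch of a speech-to-speech translation pipeline on Azure.
# Assumptions: placeholder key/region; a stock neural voice stands in for
# the custom voice model Microsoft built from Julia White's recordings.
import azure.cognitiveservices.speech as speechsdk

KEY, REGION = "YOUR_SPEECH_KEY", "westus"  # placeholders

# 1. Recognize English speech from the default microphone and translate it.
translation_config = speechsdk.translation.SpeechTranslationConfig(
    subscription=KEY, region=REGION)
translation_config.speech_recognition_language = "en-US"
translation_config.add_target_language("ja")
recognizer = speechsdk.translation.TranslationRecognizer(
    translation_config=translation_config)
result = recognizer.recognize_once()
japanese_text = result.translations["ja"]

# 2. Re-synthesize the translation with a neural text-to-speech voice.
speech_config = speechsdk.SpeechConfig(subscription=KEY, region=REGION)
speech_config.speech_synthesis_voice_name = "ja-JP-NanamiNeural"
synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config)
synthesizer.speak_text_async(japanese_text).get()
```

In the keynote, the synthesized Japanese audio was then played through the volumetric capture of White’s body inside HoloLens 2, the rendering step this sketch leaves out.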
HoloLens 2, announced a few months ago at MWC 2019 in Barcelona, is for the moment a business-only product. By the end of the year a Developer Edition will also arrive, available to developers at the same price of $3,500 (or $99 per month). Microsoft’s intent is to push the evolution of Mixed Reality and succeed where virtual and augmented reality have partially failed, offering not just concepts or design exercises, but products and services that prove genuinely useful both for professionals and in the consumer segment.
THE SCIENCE BEHIND TIME’S NEW APOLLO 11 MOON LANDING AUGMENTED REALITY EXPERIENCE
TIME this week launched TIME Immersive, a new iPhone and Android app that we’ll use to deliver groundbreaking augmented reality and virtual reality experiences. First up: the TIME Moon Landing experience, the world’s most accurate 3D re-creation of the Apollo 11 mission, which took place 50 years ago this month. Users can watch an approximately five-minute AR simulation of the Apollo 11 landing, narrated by TIME’s Jeffrey Kluger and featuring original NASA audio from the mission, then explore the surface of the moon on their own.
What makes the TIME Moon Landing hyper-accurate? At the experience’s core lies incredibly precise data meticulously collected over the last 20 years by John Knoll, the chief creative officer and visual effects supervisor at Industrial Light and Magic, a top Hollywood special effects company founded by George Lucas.
“I’m old enough to remember seeing the Apollo 11 landing live as a kid,” says Knoll, who gave his data to TIME. “That really left a big impression on me. In the years that followed, I was always fascinated with the space program.”
Knoll began collecting Apollo 11 landing data after stumbling upon a transcript of radio calls between the spacecraft and mission control. Those transcripts, he says, underscored the harrowing few minutes just before the “Eagle” lander touched down on the lunar surface, when it was running dangerously low on fuel. That moment, says Knoll, was largely glossed over in the Apollo 11 documentaries of his youth. “In reading the timestamped transcripts, this is white-knuckle time,” he says.
Knoll’s commitment to accuracy came in part from his disappointment with some Hollywood directors who pay lip service to scientific precision but abandon it in favor of what they or the studios believe is better storytelling. “I was very committed to making the re-creation as technically accurate as I could make it, in getting everything right about the motion of the spacecraft, the lighting conditions, the lunar terrain, where individual rocks and craters were,” says Knoll. “And to figure out if there were clever or sneaky ways to extract data from unlikely sources.”
To that end, Knoll relied on a handful of data sources, including NASA telemetry graphs, footage from a descent camera on the lunar module (LEM), and data from the Lunar Reconnaissance Orbiter (LRO), a probe orbiting the moon that was launched in 2009. He made up for shortcomings in the data with advanced computer vision techniques, including one process whereby the altitude of moon surface features can be estimated based on how bright or dark they appear in photographs.
“When you look at a photograph of the moon, and you see all that light and shadow, what you’re seeing is the orientation of the surface relative to the sun,” says Knoll. “If a surface is brighter, it’s because it’s inclined more towards the illuminance, and if it’s darker, it’s because it’s inclined more away. If you start on one end of an image, and if a surface is lighter than the average then it’s inclined up, so you accumulate the altitude, and if it’s darker, it’s declined, and so you decrement the altitude. By doing that, you can integrate an approximation of the terrain.”
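As a rough illustration of the technique Knoll describes, here is a minimal NumPy sketch (not ILM’s actual pipeline): each pixel’s brightness relative to its row’s average is treated as a slope toward or away from the sun, and a running sum along the row integrates those slopes into relative altitude. Real shape-from-shading work also accounts for sun angle and surface albedo, which this toy version ignores.

```python
import numpy as np

def integrate_terrain(brightness, scale=1.0):
    """Toy shape-from-shading: pixels brighter than their row's average are
    read as slopes inclined toward the sun (accumulate altitude), darker
    pixels as slopes inclined away (decrement altitude), integrated
    left-to-right along each row.
    """
    slope = brightness - brightness.mean(axis=1, keepdims=True)
    return scale * np.cumsum(slope, axis=1)  # running sum ~ integration

# A bright band followed by a dark band reads as a ridge: the lit,
# sun-facing slope climbs and the shadowed, sun-averted slope descends.
row = np.array([[0.8, 0.8, 0.8, 0.8, 0.2, 0.2, 0.2, 0.2]])
print(integrate_terrain(row))  # altitude rises, peaks mid-row, falls back
```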
Knoll hopes that the experience helps people better understand and take pride in the complexity of the Apollo project.
“I’m a big champion of science education, and people really understanding what we achieved,” says Knoll. “Those Apollo missions were great and amazing, and especially in these very divisive times, everyone regardless of their political affiliation can look back with some pride and look back at the accomplishment.”
The TIME Moon Landing experience was co-produced by TIME, John Knoll, the Smithsonian’s National Air and Space Museum and Smithsonian’s Digitization Program Office, Trigger, RYOT, and the Yahoo News XR Program. It is available within the TIME Immersive app, which you can download for iPhone in Apple’s App Store, or for Android in the Google Play Store. Look out for more TIME Immersive projects in the near future.