The Medal of the President of the Republic honours the event dedicated to “Quantum Art and Augmented Reality”

The Medal of the President of the Republic

On Saturday 9 November the town of Pietrasanta (Lucca) hosted, in the Chiostro di Sant’Agostino, the “quARte” conference entitled “Quantum art, science and augmented reality: synergies of innovation”.

The Head of State, Giorgio Napolitano, awarded the “quARte” group the Medal of the President of the Republic as a token of official recognition for the event dedicated to “Quantum Art and Augmented Reality”, organized by Egocreanet and its collaborators to foster scientific and artistic creativity and to develop communication through augmented reality.

The town of Pietrasanta once again proved a splendid setting for the event. The capital of historic Versilia, Pietrasanta is renowned for marble and bronze working, which has made it a crossroads for sculptors from all over the world. With a historic centre rich in sculptures and monuments, the town is known internationally as a city of art, with numerous galleries and seasonal exhibitions in Piazza del Duomo and in the Church of Sant’Agostino.

The MuSA, a real and virtual museum of sculpture and architecture, illustrates the process of extracting marble, the town’s main resource.

Since the 1980s, Fernando Botero, an icon of modern art, has spent part of each year in Pietrasanta because of its proximity to the marble quarries. His prized sculptures and frescoes, scattered through the streets, churches and cloisters, give the town a singular charm.

The “quARte” initiative found in this setting an ideal stage for its goal of fostering scientific and artistic creativity and developing communication through augmented reality.

Professor Paolo Manzelli opened the proceedings with a talk on the future horizons of quantum art.

The subsequent speakers explored:

1. The implications of chemistry in art and literature (Vincenzo Schettino);

2. Quantum art and its intuitions towards a holistic science (Daniela Biganzoli);

3. Interdependence in the holographic reality (Andrea Moroni);

4. Imagination, creativity and colour (Fabrizio Chemeri);

5. Models of quantum neo-figurativism (Luciano Annichini);

6. Being mind, spirit and soul in quantum art (Salvatore Cilio);

7. Which art? (Adelio Schieroni);

8. Chemistry as a bridge between two cultures (Luigi Campanella);

9. Forms and the void (Giuseppe Guanci);

10. Augmented reality in the arts through play-performance and manipulation (Giuliana Guazzaroni and Mirco Compagno);

11. Ecto music at 432 Hz (Fabio Bottaini);

12. The performance “I mattoidi” (Pietro Antonio Bernabei);

13. Mathematics and art (Fabrizio Ferracin) and, to close, the evocative open-air performance in the cloister on natural rhythms, dance therapy and mirror neurons (Anna Paola Desiderio and Martin Comploi).

The conference-exhibition can be considered a milestone in the conceptual innovation of quantum art, science and augmented reality. The shared viewpoint of the participants encourages and stimulates a renewed, innovative dimension of creativity in artistic, scientific and technological expression.

Augmented reality, in this context, offers fertile ground for a communication that is itself innovative, playful and performative, and above all capable of adding value to various fast-growing strategic sectors, from publishing to art marketing, from engineering to territorial promotion.

Communication through augmented reality can drive a local cultural revival. It is an experience capable of reaching a wider audience and of promoting the development of artistic movements of the quantum era, through forms of territorial AR marketing that foster innovation in eco-economy systems. A systematic and methodologically sound use of the many forms of augmented reality is indeed among the most desirable developments, including in the construction of future smart cities.

As decided during the Pietrasanta event, the Medal donated by the President of the Republic will be made accessible through an augmented reality experience, to be defined on the basis of the ideas the participants put forward.

The ideas will be collected in the arxlab community.

 

Giuliana Guazzaroni

 

Creative Commons License
“The Medal of the President of the Republic honours the event dedicated to ‘Quantum Art and Augmented Reality’” by Giuliana Guazzaroni – THE ROUND is distributed under a Creative Commons Attribution – NonCommercial – ShareAlike 3.0 Unported License.

 

Augmented reality glasses ‘translate foreign menus as you read’

Glasses that can automatically translate foreign menus into the wearer’s own language have been unveiled in Japan.

The next-generation spectacles were revealed at a gadget fair on the outskirts of Tokyo and could be available in time for the city’s hosting of the Olympics in 2020.

An engineer for NTT Docomo demonstrates a headset which can translate foreign-language menus


By using augmented reality the glasses can project text in the wearer’s native tongue over unfamiliar signs and menus, potentially proving invaluable for British tourists whose grasp of Japanese is limited.

The invention may be particularly useful for those who journey beyond the most popular destinations in Japan where foreign-language menus are rarely found.

In a statement Japanese telecoms firm NTT Docomo, the company behind the glasses, said: “Character recognition technology enables instant language translation for users travelling abroad and reading restaurant menus and other documents.”

Another function the smart glasses can perform is turning any flat surface into a touchscreen with the wearer using a finger ring to activate animated tags.
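NTT Docomo has not published how its prototype works, but the pipeline sketched in the article — recognize the text, translate it, and draw the result at the same screen position — can be illustrated roughly as follows. This is a minimal sketch, not the company's implementation: the OCR step is stubbed out with pre-detected regions, and the coordinates and phrasebook are invented for illustration.

```python
# Hypothetical sketch of a menu-translation AR pipeline.
# A real system would run character recognition on live camera
# frames; here the OCR output is stubbed with fixed regions.
from dataclasses import dataclass

@dataclass
class TextRegion:
    text: str  # string recognized by OCR
    x: int     # top-left corner of the region in the camera frame
    y: int

# Toy phrase table standing in for a real translation service.
PHRASEBOOK = {
    "ramen": "noodle soup",
    "yakitori": "grilled chicken skewers",
}

def translate_regions(regions, phrasebook):
    """Swap each recognized phrase for its translation, keeping the
    screen coordinates so the AR layer can draw the translated text
    over the original menu line."""
    overlays = []
    for r in regions:
        translated = phrasebook.get(r.text.lower())
        if translated is not None:
            overlays.append(TextRegion(translated, r.x, r.y))
    return overlays

detected = [TextRegion("Ramen", 40, 120), TextRegion("Yakitori", 40, 180)]
overlays = translate_regions(detected, PHRASEBOOK)
for o in overlays:
    print(f"draw '{o.text}' at ({o.x}, {o.y})")
```

The key design point is that translation keeps the original coordinates, which is what makes the result "augmented" rather than a plain translated list.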

(Source: Telegraph)

Spaceglasses promise mobile augmented reality

The startup, Meta, is developing so-called augmented-reality glasses that combine the power of a laptop and smartphone in a pair of stylish specs that map virtual objects into the physical world, controlled by your hands, similar to the movie portrayals of app control via gestures in “Iron Man” and “Avatar.”

Meta, which came out of Columbia University rather than the Homebrew Computer Club, is trying to usher in the next era of computing. Apple inspired the modern personal computer revolution with the Macintosh and the mobile revolution with the iPod, iPhone, and iPad. Meta wants to lead the transition from mobile to wearable, augmented-reality computing.

Gribetz, a 27-year-old former neuroscience graduate student who grew up in Israel, bears some resemblance to Jobs (as much as Ashton Kutcher with a scraggly beard). He also shares with Jobs the ability to seize on an idea and make visionary claims, but he has yet to ship a product.

“There is no other future of computing other than this technology, which can display information from the real world and control objects with your fingers at low latency and high dexterity,” Gribetz told CNET in May. “It’s the keyboard and mouse of the future.”

The video below offers Meta’s vision of a future enabled by augmented-reality technology:

[sz-youtube url="http://www.youtube.com/watch?feature=player_embedded&v=b7I7JuQXttw" caption="SpaceGlasses are the future of computing" /]

Meta sprang to life in December 2012 with $1 million in seed funding and the blessing of Y Combinator startup guru Paul Graham, as well as wearable computing pioneer Steve Mann, who signed on as the company’s chief scientist.

Gribetz and his band of fewer than 25 employees are ensconced in a Los Altos mansion, filled with mattresses, cables, and aluminum bins of takeout food, building glasses, an operating system, and apps to show off the capabilities of the platform. He took pride in noting the number of people, most likely competitors, checking out Meta’s employees on LinkedIn.

“We are hacking 24-7,” Gribetz said, “and making less than McDonald’s wages. We have many Ph.D.s, which is indicative of their desire to build the ‘Iron Man’ experience.” He also said that his team is frustrated by the limits of Google Glass.

 

(Source: CNET)

 

Microsoft Research uses Kinect to translate between spoken and sign languages in real time

Microsoft’s Kinect is a wonderful piece of technology that seems to know no bounds. Microsoft Research is now using it to bridge the gap between folks who don’t speak the same language, whether they can hear or not.

[sz-youtube url="http://www.youtube.com/watch?feature=player_embedded&v=HnkQyUo3134" caption="Kinect captures the gestures" /]

As you can see in the video above, the Kinect Sign Language Translator is a research prototype that can translate sign language into spoken language and vice versa. The best part? It does it all in real time.

In short, Kinect captures the gestures, while machine learning and pattern recognition help interpret their meaning. The system can capture a given conversation from both sides: a deaf person who is signing and a person who is speaking. Visual signs are converted to written and spoken translation rendered in real time, while spoken words are turned into accurate visual signs.
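Microsoft has not released the translator's models, but the capture-and-match stage described above can be illustrated with a toy sketch: compare a captured frame of joint coordinates against per-sign templates and pick the closest. The joint coordinates and sign templates below are invented, and a nearest-neighbour matcher stands in for the far more sophisticated pattern recognition the real system uses.

```python
# Illustrative sketch only: a toy nearest-neighbour matcher over
# hand-joint coordinates, standing in for the machine-learning
# stage of the Kinect Sign Language Translator (not Microsoft's code).
import math

# Hypothetical training data: one averaged skeleton snapshot
# (x, y pairs for a few tracked hand joints) per sign.
SIGN_TEMPLATES = {
    "hello":     [0.1, 0.9, 0.2, 0.8, 0.3, 0.9],
    "thank_you": [0.5, 0.4, 0.6, 0.3, 0.7, 0.4],
}

def classify_sign(frame, templates):
    """Return the sign whose template is closest (Euclidean
    distance) to the captured joint-coordinate frame."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda sign: dist(frame, templates[sign]))

captured = [0.12, 0.88, 0.21, 0.79, 0.31, 0.92]  # a noisy "hello"
print(classify_sign(captured, SIGN_TEMPLATES))
```

The article's note that five people are needed per word hints at why: each template must be built from many carefully labelled capture sessions before matching like this becomes reliable.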

While this is clearly a massive achievement, there is still a huge amount of work ahead. It currently takes five people to establish the recognition patterns for just one word. So far, only 300 Chinese sign language words have been added out of a total of 4,000.

Guobin Wu, the program manager of the Kinect Sign Language Translator project, explains that recognition is by far the most challenging part of the project. After trying data gloves and webcams, however, the Kinect was picked as the clear winner.

Wu says there are more than 20 million people in China who are hard of hearing, and an estimated 360 million such people around the world. As a result, this project could be a huge boon for millions around the globe, if it ever makes it out of the lab.

 

(Source: The Next Web)

Background of an AR experience

Musical itineraries to celebrate the 200th anniversary of Giuseppe Verdi’s birth

“Muri e divisioni” (“Walls and divisions”) is the title of the Macerata Opera Festival 2013, an edition entirely devoted to the composer Giuseppe Verdi to celebrate the 200th anniversary of his birth. A call for artists was organized by Adam Accademia delle Arti, a local non-profit organization of 40 artists, to collect suitable artworks for the exhibition. An augmented reality (AR) experience was planned to engage visitors with Verdi’s music from Nabucco and Il Trovatore. The exhibition “Muri e divisioni” took place from July 18th to September 29th 2013 in the Palazzo Galeotti gallery in the city centre. To deliver the augmented reality performance, seven auras (AR overlays triggered by the artworks) were created, arranged in two musical paths: one based on “Dio di Giuda”, “S’appressan gl’istanti” and “Va pensiero” from Nabucco; the other on “Di quella pira”, “Tacea la notte placida”, “Terzetto Anima mia!” and “Stride la vampa” from Il Trovatore. The music augmented seven artworks by the painters Simona Breccia, Hernàn Chavar, Dorian X, Gabriella Gattari, Luna Simoncini, Marco Temperini and Tomas.
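The aura mechanism described above — a recognized artwork triggering its assigned Verdi track — can be sketched roughly as follows. The artwork identifiers and track assignments are invented for illustration; the exhibition used a commercial AR browser, not custom code like this.

```python
# Hypothetical sketch of the trigger-to-content mapping behind an
# AR "aura": when the camera recognizes an artwork, the associated
# Verdi track is overlaid. Artwork IDs and assignments are invented.
NABUCCO_PATH = ["Dio di Giuda", "S'appressan gl'istanti", "Va pensiero"]
TROVATORE_PATH = ["Di quella pira", "Tacea la notte placida",
                  "Terzetto Anima mia!", "Stride la vampa"]

# Which musical path and track each augmented artwork belongs to.
AURAS = {
    "artwork_1": ("Nabucco", 0),
    "artwork_2": ("Il Trovatore", 2),
}

def trigger_aura(artwork_id, auras):
    """Return the opera and track title to play when the AR browser
    recognizes the given artwork."""
    opera, index = auras[artwork_id]
    path = NABUCCO_PATH if opera == "Nabucco" else TROVATORE_PATH
    return opera, path[index]

print(trigger_aura("artwork_2", AURAS))
```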

AR

The augmented reality layer was created to offer visitors a unique experience in a contemporary art gallery. The public can dynamically experience something special while appreciating the paintings and installations: Giuseppe Verdi’s music augments the pieces of art while specific gestures appear at the same time. Visitors have to abandon their apathy, perform something unexpected and enjoy the two itineraries.
The augmented reality experience was designed and realized by Mirco Compagno (AR techno-scientific researcher) and Giuliana Guazzaroni (AR researcher) for Adam Accademia delle Arti.

The AR Performance

The artists were not told in advance about the augmented reality that would modify their works. Some painters were enthusiastic, others showed interest in the experience, and one was not entirely satisfied with the music selected for his canvas. The public reacted in different ways: in general, they were amazed and engaged by the first performer, a person who modelled the AR experience for visitors, and they joined in. An ex-post questionnaire was also distributed to 50 users to collect information for the evaluation. Most participants said they had truly enjoyed the AR performance (60%); 20% said they had enjoyed it and 20% said they had hardly enjoyed it. For 48% of the public, the activity strongly promoted an emotional bond with contemporary art; for 36% it promoted such a bond, and for 16% it scarcely did.

Most visitors declared that they would strongly recommend AR experiences to other people (68%); 16% would advise caution about such experiences and 16% would hardly recommend them. As for technological difficulties, most of the public could easily enjoy the performance (80%), while 20% reported problems using the AR browser on their smartphones or tablets (e.g. an obsolete operating system). One visitor interviewed said: “It was engaging and exciting to participate in the exhibition ‘Walls and divisions’, which allows people to deal with art in a different and touching way. Thanks to AR there is no longer a simple contemplation, but a real opportunity to experience the exhibition as a whole. Moreover, it becomes a real chance to share feelings with other spectators. A new way to enjoy contemporary art, merged in a new surreal dimension.”

Video of the Opening Day: http://youtu.be/IWgyGxw_2Ac

Giuliana Guazzaroni

 

Creative Commons License
“Background of an AR experience” by Giuliana Guazzaroni – THE ROUND is distributed under a Creative Commons Attribution – NonCommercial – ShareAlike 3.0 Unported License.