Sep 26, 2014 | News on Augmented, Mixed and Virtual Reality

Airbus wants to put your head in a bowl to ensure you have no more Total Recall of your dreadful flight experience.
Source: www.cntraveler.com
The aircraft manufacturer has now been granted U.S. Patent 8,814,266 B2 for a headrest design that converts into a full-immersion, multisensory, sensory-deprivation, virtual-reality, headrest-hat-astronaut-helmet thingie.
According to the aircraft manufacturer:
"During aircraft flights, certain passengers have periods when they are bored either during a wait phase preceding take-off or following landing or during a cruise phase. Moreover, it is known that aircraft flights generate stress for certain passengers.
"To solve this problem, activities are currently proposed onboard the aircraft. Thus, music is broadcast or films shown, access is given to video games or, again, catering services are ensured. However, these various occupations are in some cases insufficient to relieve boredom or stress.
"An aim of the invention is to improve in this respect the comfort of aircraft passengers.
"Thus, the helmet in which the passenger houses his/her head offers him/her sensorial isolation with regard to the external environment. This isolation can be more or less pronounced according to the configuration of the helmet and the functionalities which are associated with it.
"This isolation allows the passenger to better profit from some of the distractions offered, for example: listening to music, watching films, etc. If the passenger is sensitive to stress, this isolation, possibly associated with one of the above-mentioned activities, allows him/her to more easily calm down and relax. In all cases, the invention improves therefore the comfort of the passengers and the pleasantness of their flight.
"According to the embodiments, the isolation can be a sound, visual and/or olfactory isolation. It can be total or partial for any of these aspects, or several of them or all."
Sep 26, 2014 | News on Augmented, Mixed and Virtual Reality

Augmented Reality is a stage in the evolution of Virtual Reality: instead of the disembodied occupation of virtual worlds, physical and virtual space are perceived as one contiguous, layered and dynamic whole. Matsuda's Hyper Reality is augmented not because it adds a new layer to the real, but because it shows us a reality in which we are already immersed. The integration of the physical and the virtual brought about by new technologies has social consequences for what surrounds us: the architecture of the contemporary city is no longer seen simply as a physical space made of buildings and landscapes, but becomes a space shaped by the digital information we collect, consume and organize.
Source: milano.mentelocale.it
Matsuda's research on these themes takes a cultural and social rather than a technical approach. Out of it comes the Augmented (Hyper)Reality project, a cycle of three films, the first of which is in production, which recounts and visualizes, through the images and language of film, the information-saturated dimension in which we are immersed, giving visual form and expression to a change in which we are already protagonists.
The strength of his work is its ability to represent, in a language as comprehensible and accessible as that of cinema, the world of augmented hyperreality, which we often hear about but whose scope we do not always grasp.
In his films Matsuda is careful to highlight how Augmented Reality can be conceived as a "human-centered" technology, one able to return us to the world by removing the sense of disconnection that leaves us connected and distant at the same time. "Today," Matsuda points out, "we can take a huge amount of information and put it anywhere; we no longer need to look at a screen, sit at our desk or stare at our phone. We can simply be in the world.
"This means we can free ourselves from our devices and, instead of focusing on those rectangles, suddenly go back to looking at the world. And that is a wonderful thing. Augmented Reality is a human-centered technology; it is natural and intuitive."
Sep 26, 2014 | News on Augmented, Mixed and Virtual Reality

T-rays are low-energy millimeter waves in the electromagnetic spectrum, located between microwaves and infrared frequencies; being lower in energy than X-rays, they are far less likely to damage living tissue.
With THz spectrometry technology we will be able to detect, at a distance and in greater detail, the presence of many polluting molecules, bacteria and other poisons in the air we breathe. – Paolo Manzelli
Read more: http://gizmodo.com/a-new-graphene-sensor-will-let-us-see-through-walls-1632459897
Source: invirtual01.tumblr.com
The new sensor is remarkable for its ability to detect the terahertz radiation spectrum (aka T-rays) at room temperature. This unique part of the light spectrum can be tuned to see through surfaces: anything from concrete to human skin. Historically, scientists haven't been able to make use of T-rays, though, because the sensors needed to be kept at extremely low temperatures to work. But the Maryland team found a way to make a room-temperature sensor with graphene.
Immediate applications of this technology will almost certainly involve the military; just imagine how useful goggles that see through walls would be in a war zone. The sensors could also let us use T-rays instead of harmful X-rays for medical applications. Consumers getting hold of these sensors is a dubious prospect from a privacy perspective, not to mention that graphene is still wildly expensive.
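
As a rough illustration of the "lower energy than X-rays" point above, the short Python sketch below compares photon energies using Planck's relation E = hf; the 1 THz and ~10 keV X-ray figures are representative values chosen for the example, not numbers from the article.

```python
# Back-of-the-envelope comparison of photon energies: T-ray vs. X-ray.
# The frequencies below are representative values, not from the article.

PLANCK_H = 6.626e-34   # Planck constant, J*s
EV = 1.602e-19         # one electronvolt, in joules

def photon_energy_ev(frequency_hz: float) -> float:
    """Photon energy E = h*f, expressed in electronvolts."""
    return PLANCK_H * frequency_hz / EV

thz_photon = photon_energy_ev(1.0e12)    # 1 THz, in the T-ray band
xray_photon = photon_energy_ev(2.4e18)   # ~2.4 EHz, a ~10 keV medical X-ray

print(f"1 THz photon: {thz_photon * 1e3:.2f} meV")   # about 4 meV
print(f"X-ray photon: {xray_photon / 1e3:.1f} keV")  # about 10 keV
print(f"Energy ratio: ~{xray_photon / thz_photon:.0e}x")
```

The several-orders-of-magnitude gap is why terahertz radiation is non-ionizing, which is what the excerpt is getting at when it calls T-rays less damaging to living tissue than X-rays.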
Sep 5, 2014 | News on Augmented, Mixed and Virtual Reality

Chris Kluwe wants the NFL to adopt emerging technologies and add augmented reality to the football-watching experience. In his TED Talk "How augmented reality will change sports… and build empathy," Kluwe outlines the steps he feels the league should take in the near future.
Source: www.engineering.com
The talk begins with the idea that augmented reality is coming soon, a point punctuated by the Google Glass Kluwe wears throughout the talk. Glass, worn under a helmet, can give fans a better sense of the player's experience.
Footage is shown of Kluwe with a teammate during tackling practice, and some game play at the University of Washington. Visual feedback that tells you about someone else’s experience is good, but Kluwe says we can go further.
Oculus Rift can bring a more immersive experience. The idea of being a football running back, or a soccer player, or a downhill skier is the next step.
Still, virtual reality isn’t the same as augmented reality.
When coaches and managers start using this technology to gain a competitive advantage, augmented reality will see wider exposure. Kluwe outlines previous technology upgrades in NFL history, noting that each one made the game more exciting and helped the league grow.
Kluwe predicts that by 2023 the plastic visors players already wear will display coverage assignments and playbook information. Cameras in each corner of the stadium, together with accelerometers in the players' helmets, could feed a continuous stream of information to the players, along the lines of the sketch below.
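
The talk stops at the idea; purely as an illustration of the kind of data fusion Kluwe is describing, a minimal sketch might combine one camera position fix and one helmet accelerometer sample into a line of visor text. Every field name, unit and threshold below is an assumption made for the example.

```python
# Hypothetical sketch: fusing stadium-camera tracking and helmet
# accelerometer data into a simple visor (HUD) message.
# All data structures and thresholds here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class CameraFix:
    player_id: int
    x_yards: float   # position along the field
    y_yards: float   # position across the field

@dataclass
class HelmetSample:
    player_id: int
    accel_g: float   # instantaneous acceleration, in g

def hud_message(fix: CameraFix, sample: HelmetSample, assignment: str) -> str:
    """Combine one camera fix and one helmet sample into a HUD line."""
    warning = " | HIGH-G HIT" if sample.accel_g > 8.0 else ""
    return (f"P{fix.player_id} at ({fix.x_yards:.0f}, {fix.y_yards:.0f}) yd "
            f"| assignment: {assignment}{warning}")

print(hud_message(CameraFix(7, 35.0, 12.5),
                  HelmetSample(7, 2.1),
                  "cover the tight end on the seam route"))
```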
Sep 5, 2014 | News on Augmented, Mixed and Virtual Reality

According to a pair of patent applications published on Thursday, Apple is investigating augmented reality systems for iOS capable of providing users with enhanced virtual overlays of their surroundings, including an “X-ray vision” mode that peels away walls.
Source: appleinsider.com
Apple filed two applications with the U.S. Patent and Trademark Office, titled "Federated mobile device positioning" and "Registration between actual mobile device position and environmental model," both describing an advanced augmented reality solution that harnesses an iPhone’s camera, onboard sensors and communications suite to offer a real-time world view overlaid with rich location data.
The system first uses GPS, Wi-Fi signal strength, sensor data or other information to determine a user's location. From there, the app downloads a three-dimensional model of the surrounding area, complete with wireframes and image data for nearby buildings and points of interest. Registering that digital representation with the real world, however, is difficult with sensors alone.
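
The applications describe this flow only in prose; a minimal sketch of the first step, assuming an invented tile grid and URL scheme rather than anything Apple specifies, might look like this:

```python
# Illustrative sketch of the "locate the device, then fetch a 3D model" step.
# The positioning fallback and the tile/URL scheme are invented for the
# example; the patent applications describe the idea, not an API.
import math

def coarse_position(gps=None, wifi_estimate=None):
    """Prefer a GPS fix; fall back to a Wi-Fi based (lat, lon) estimate."""
    return gps if gps is not None else wifi_estimate

def model_tile_url(lat: float, lon: float, tile_deg: float = 0.01) -> str:
    """Map a position to a hypothetical 3D-model tile to download."""
    row = math.floor(lat / tile_deg)
    col = math.floor(lon / tile_deg)
    return f"https://example.com/city-model/{row}/{col}.obj"

lat, lon = coarse_position(gps=(37.3318, -122.0312))
print(model_tile_url(lat, lon))
```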
To accurately place the model, Apple proposes the virtual frame be overlaid atop live video fed by an iPhone’s camera. Users can align the 3D asset with the live feed by manipulating it onscreen through pinch-to-zoom, tap-and-drag and other gestures, providing a level of accuracy not possible through machine reckoning alone.
Alternatively, users can issue audible commands like "move left" and "move right" to match up the images. The wireframe can be "locked in" once a point or points are correctly aligned, thus calibrating the augmented view.
In yet another embodiment, the user can interact directly with the wire model by placing their hand into the live view area and "grabbing" parts of the virtual image, repositioning them with a special set of gestures. This third method requires object recognition technology to determine when and how a user’s hand is interacting with the environment directly in front of the camera.
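
Again as an illustration rather than the method the filings claim, the manual alignment step can be thought of as accumulating the user's corrections (drags, pinches, voice nudges) into an offset and scale for the overlaid wireframe until it is locked in; all of the names below are hypothetical:

```python
# Hypothetical sketch of the manual alignment and "lock in" idea.
# The filings describe gestures and voice commands conceptually; this
# 2D offset-and-scale model is an illustrative simplification.
from dataclasses import dataclass

@dataclass
class WireframeAlignment:
    dx: float = 0.0      # horizontal overlay offset, in pixels
    dy: float = 0.0      # vertical overlay offset, in pixels
    scale: float = 1.0   # pinch-to-zoom factor
    locked: bool = False

    def drag(self, dx: float, dy: float) -> None:
        """Tap-and-drag correction from the user."""
        if not self.locked:
            self.dx += dx
            self.dy += dy

    def pinch(self, factor: float) -> None:
        """Pinch-to-zoom correction from the user."""
        if not self.locked:
            self.scale *= factor

    def voice(self, command: str, step: float = 5.0) -> None:
        """Apply a spoken nudge such as 'move left' or 'move right'."""
        nudges = {"move left": (-step, 0.0), "move right": (step, 0.0),
                  "move up": (0.0, -step), "move down": (0.0, step)}
        if command in nudges:
            self.drag(*nudges[command])

    def lock_in(self) -> None:
        """Freeze the calibration once the wireframe matches the video."""
        self.locked = True

alignment = WireframeAlignment()
alignment.drag(12.0, -4.0)     # nudge the wireframe with a drag gesture
alignment.voice("move left")   # fine-tune with a voice command
alignment.pinch(1.05)          # zoom slightly to match the live view
alignment.lock_in()
print(alignment)
```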