There’s a reason why the PolyEyes 2.0, an augmented reality project by the Interactive Architecture Lab, looks like the helmet of an alien invader.

Thanks to a pair of Raspberry Pi Camera Modules built into the headset and spinning around, there’s no need for complex eye movements to see everything around us.

Their video perfectly sums up what the user sees:

The Polymelia Project considers the human body as an assemblage; a collection of heterogeneous components, a material-informational entity whose boundaries undergo continuous construction and reconstruction. We think of the body as the original prosthesis we all learn to manipulate, so that any replacement or extension becomes part of a continuing process of upgrading the human entity. The Polymelia Suit (PolyEyes, PolyLimbs, Exoskeleton, Sensing Suit) suggests a new communication language for the future of prosthesis and of humanity.

“You are alone in the room, except for two Raspberry Pi Camera Module spinning in the dim light. You use PolyEyes (aka Hammerhead Vision System) and through the Raspberry Pi Compute Module, you communicate with some other entities in another room, whom you cannot see. Relying solely on the Exo-skeletical Suit Controller, you must decide whether to share or receive stimuli. One of the entities wants to share its own visual field. The other entity wants to send you signals from its sensing body. He/she/it will reproduce through the PolyLimbs the body movement of the other entity. Your job is to explore alternative ways for communicating that distinguish your current performance from an embodied augmented reality.”


Sounds fun!

Top gif: Prosthetic Knowledge

To contact the author of this post, write to: gergovas@kotaku.com