I love presenting on stage – it's always fun to show people things they didn't know before, and to spark new ideas and conversations.
But I hate PowerPoint.
I'm a researcher in the field of Tangible User Interfaces, and so I decided to do a project on making presentations better: more usable, more intuitive, less abstract.
The result is a presentation system that's based on how we speak about presentations in everyday language: we 'pick up' a topic, and we 'walk through' it, sometimes 'step by step'.

Video Demonstration
That's also how DataTouch works:

Step by step instructions
Technically, the rings are re-wound antennas of RFID readers, which are connected to an Arduino, which in turn talks to the computer. A Kinect tracks the user on stage, and their hands.
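On the computer side, this boils down to parsing the events the Arduino sends over the serial port. The sketch below is a minimal illustration, assuming a simple line-based message format of `"<ring>:<tag>"` (the format and names are hypothetical, not the actual DataTouch protocol):

```python
# Hypothetical parser for serial messages from the Arduino.
# Assumed message format: "<ring>:<tag>", e.g. "left:topic-3".
# The real DataTouch wire format may differ.

def parse_event(line):
    """Split one serial line into (ring, tag); return None if malformed."""
    parts = line.strip().split(":")
    if len(parts) != 2 or not all(parts):
        return None
    ring, tag = parts
    return ring, tag
```

In a real setup, each parsed event would then be matched against the Kinect's hand-tracking data to decide which hand is holding which object.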
Being able to detect which objects a user holds in their hands, as well as how they are held in relation to each other, opens up a lot of opportunities for interaction design:
While I was happy with the concept, the system turned out to be quite prone to detection errors, so I looked for an alternative.
As nice as the rings are, they also caused a lot of problems – hence, I decided to move the technology away from the body and into a small pedestal on which all the objects (each of which can represent one topic) are stored. Every object contains an RFID tag, and the pedestal contains one RFID reader for every object spot on it. As soon as a tag goes missing from its spot, the system assumes that the user has picked up the corresponding object and is holding it in their hand.

Conclusions
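The pick-up detection described above can be sketched as a simple set comparison between polling cycles, assuming the pedestal reports the set of tag IDs it currently sees (the function names here are illustrative, not the actual DataTouch code):

```python
# Minimal sketch of the pedestal's pick-up/put-back logic.
# Assumption: each polling cycle yields the set of tag IDs
# currently detected by the pedestal's readers.

def detect_picked_up(previous_tags, current_tags):
    """Tags seen on the last poll but missing now are assumed
    to have been picked up by the presenter."""
    return previous_tags - current_tags

def detect_put_back(previous_tags, current_tags):
    """Tags that reappear are assumed to have been put back
    onto the pedestal."""
    return current_tags - previous_tags
```

For example, if the pedestal saw the tags for 'intro', 'demo', and 'outro' on one poll and only 'intro' and 'outro' on the next, the 'demo' object is assumed to be in the presenter's hand.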
The result is a much more robust version of the system, following the same interaction principles. I enjoy using it on stage, and I won't switch back to PowerPoint.