Yawn and the lights turn off... Can we really control our homes with gestures?
Interactions need to be shaped around people's natural preferences, writes Kevin O'Mahony
It may seem like a scene from a science fiction film; however, such interactions could become a reality in the near future.
As technology evolves, the use of gestures to control gadgets and computers is becoming increasingly commonplace.
Many people will be familiar with Kinect for controlling the Xbox, but my research is not limited to gaming: it also covers gesture control for "smart" environments.
Emerging gesture recognition technologies, such as Microsoft Kinect, Leap Motion and Thalmic Labs' MYO, will offer even more advanced gesture interaction opportunities. So, the possibility of controlling our homes with gestures may not be too far off!
In recent years, the design of gesture interaction methods has gathered momentum, but inconsistencies across platforms have become a persistent problem.
Many people will understand the frustration of switching between a Windows PC and a Mac, or of having to learn new interaction methods for each platform and application.
Interactions should be shaped around users' preferences and natural abilities – what's good for the designer or engineer may not necessarily be good for the person who ultimately has to use the system.
My research focuses on personalisation approaches for enabling people to determine how they control conditions in their environment through gesture interaction.
The aim is to improve usability and user experience by enabling people to interact in a consistent way, reflective of their own gesture preferences.
We carried out gesture analysis studies in the Cork IT Nimbus Centre Lab, where participants performed their preferred gesture interactions for controlling conditions such as lights, sound and temperature within home living environments.
Microsoft Kinect motion-sensing cameras were set up to capture users' gestures and I also recorded participants' reasons for selecting each gesture.
My research findings have revealed a wide variety of gesture preferences, along with the most popular gesture patterns performed for each task.
By storing participants' user profiles and associated gesture preferences in a database, personalised gesture sets can be predicted. These predictions become more sophisticated and accurate as more profiles and preferences are gathered.
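The profile-and-prediction idea can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not the actual system used in the research: the class name, task labels and gesture labels are all assumptions, and the "prediction" here is simply a popularity vote across stored profiles, falling back to a user's own recorded preference where one exists.

```python
from collections import Counter, defaultdict

class GesturePreferenceStore:
    """Illustrative sketch: stores each participant's preferred gesture per
    task and predicts a personalised gesture set from aggregate popularity."""

    def __init__(self):
        # task label -> Counter of gesture labels across all stored profiles
        self.task_counts = defaultdict(Counter)
        # user id -> {task label: preferred gesture label}
        self.profiles = {}

    def add_profile(self, user, preferences):
        """Record one participant's gesture preferences."""
        self.profiles[user] = dict(preferences)
        for task, gesture in preferences.items():
            self.task_counts[task][gesture] += 1

    def predict(self, user, tasks):
        """Return a gesture set for the given tasks: the user's own stored
        preference where known, otherwise the most popular gesture observed
        across all profiles so far."""
        own = self.profiles.get(user, {})
        predicted = {}
        for task in tasks:
            if task in own:
                predicted[task] = own[task]
            elif self.task_counts[task]:
                predicted[task] = self.task_counts[task].most_common(1)[0][0]
        return predicted
```

As the article notes, a scheme like this improves as more profiles arrive: with only a handful of participants the popularity vote is noisy, but each added profile sharpens the estimate of the most natural gesture for each task.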
Microsoft recently offered my colleague, John Sheehy, and me a place on its upcoming Microsoft Kinect for Windows developer kit programme. The aim is to explore the possibilities of applying the technology to situations that extend beyond gaming and introduce it into everyday life.
Kevin O'Mahony is a PhD student and Interaction designer at the Nimbus Centre, an embedded systems research centre at Cork IT. He graduated with a BA (Hons) in Multimedia in 2007 and an MA in Media Design in 2009. He began a PhD in 2010 at the Nimbus Centre.