
Project: Real-Time 3D Gesture Analysis for Natural Interaction with Smart Devices

This project will help develop the next generation of interfaces: interacting with mobile phones and other smart devices using three-dimensional hand and finger gestures. The applications are many and diverse: virtual and augmented reality (VR and AR), medical settings, robotics, e-learning, 3D games, and much more.

Project information

Project manager
Shahrouz Yousefi
Project members
Shahrouz Yousefi, project management, technical development and supervision
Aris Alissandrakis, design and analysis of the studies, development of methodologies
Marcelo Milrad, senior advisor, empirical studies
Haibo Li, guest professor from KTH, senior advisor
Nico Reski, project development
Participating organizations
Linnaeus University, together with Screen Interaction AB, MindArk PE AB, and Globalmouth AB from industry
The Knowledge Foundation (KK-stiftelsen), the HÖG 16 programme
February 2017–February 2020
Media Technology (Department of Media Technology, Faculty of Technology)

More about the project

The rapid development and wide adoption of mobile devices in recent years have been driven mainly by the introduction of novel interaction and visualization technologies. Although touchscreens have significantly enhanced human-device interaction, for the next generation of smart devices (future smartphones, smartwatches, virtual/augmented reality glasses, etc.) users will no longer be satisfied with interaction confined to a 2D touchscreen. They will demand more natural interaction, performed with the bare hands in the free space in front of or around the device. The next generation of smart devices, such as augmented reality glasses and smartwatches, will therefore require a gesture-based interface that lets the bare hands manipulate digital content directly. Users will then be able to interact, through the mobile device, with physical space, objects, and information in virtual/augmented environments, medical applications, robotics, 3D games, and so on.

In general, 3D hand gesture recognition and tracking have been treated as classical computer vision and pattern recognition problems. Although substantial research has been carried out in this area, state-of-the-art results are mainly limited to global hand tracking and low-resolution gesture analysis. Enabling natural gesture-based interaction, however, requires a full analysis of the hand and fingers. The problem becomes even more challenging when multiple hands must be analyzed in complex scenarios, such as collaborative applications involving several users.

The vision behind the research ideas described in this project is to substantially enhance the way people interact with smart devices. We believe that 3D gestural interaction, through fingers and hands, is the next big thing in user experience. 3D interaction connects directly to the realistic experiences users have in real life: the goal is to reproduce in digital space, through highly accurate hand gesture analysis, the same experience users have when they grab and rotate an object in the physical world, but via a smart device. The main objectives of this research project are to develop new concepts and introduce innovative methodologies for hand gesture recognition and tracking that can support gesture-based interfaces in mobile and smart devices. To fulfill these objectives, technical challenges such as the efficiency and effectiveness of the solutions to be developed, as well as their design and usability aspects, will be considered.

Our initial research results and findings enable large-scale, real-time 3D gesture tracking and analysis. The proposed framework and technical solutions support user-device interaction in real-time applications on smartphones, tablets, wearable devices, etc., where natural and fast 3D interaction is essential. Medical applications, virtual reality (VR), augmented reality (AR), e-learning, and 3D gaming are among the areas that will directly benefit from this 3D interaction technology.

Specifically, this research project will investigate the following research questions:

  • How can new approaches to big data analysis be developed and employed for gesture recognition and tracking, and how can they be integrated with existing computer vision and pattern recognition solutions to tackle high-degree-of-freedom 3D gesture analysis?
  • How can gesture detection, recognition, and analysis be extended to multiple hands, and how can the developed gesture-based solutions be integrated into AR/VR scenarios with the support of the industrial partners?
  • What design and usability issues must be considered to employ 3D gesture-based interaction effectively in the desired application scenarios on mobile and wearable devices, and how can mobile applications be designed to support effective 3D gestures?
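The search-based, "big data" angle of the first question can be illustrated with a minimal sketch: instead of fitting a hand model directly, a query descriptor extracted from the camera image is matched against a large database of gesture descriptors annotated with known 3D pose parameters, and the pose of the nearest match is returned. Everything below (function names, descriptor dimension, random data, the three pose angles) is a hypothetical toy setup, not the project's actual pipeline:

```python
import numpy as np

def build_gesture_database(n_entries=10_000, descriptor_dim=64, seed=0):
    """Toy stand-in for a large annotated gesture database.

    Each entry pairs an appearance descriptor (e.g. an edge- or
    HOG-like feature vector) with the 3D pose parameters it was
    labeled with. Here both are random placeholders.
    """
    rng = np.random.default_rng(seed)
    descriptors = rng.normal(size=(n_entries, descriptor_dim))
    # Pretend each pose is (azimuth, elevation, roll) in degrees.
    poses = rng.uniform(-90.0, 90.0, size=(n_entries, 3))
    return descriptors, poses

def query_pose(query_descriptor, descriptors, poses):
    """Return the pose of the nearest database entry (L2 distance)."""
    dists = np.linalg.norm(descriptors - query_descriptor, axis=1)
    best = int(np.argmin(dists))
    return poses[best], dists[best]

descriptors, poses = build_gesture_database()
# Simulate a query: a slightly noisy copy of entry 42 should
# retrieve entry 42's annotated pose.
noise = 0.01 * np.random.default_rng(1).normal(size=descriptors.shape[1])
pose, dist = query_pose(descriptors[42] + noise, descriptors, poses)
```

In a realistic system the linear scan would be replaced by an approximate nearest-neighbor index so that very large databases can be queried at interactive frame rates; the retrieval idea itself stays the same.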

Three industrial partners will join the project as active collaborators. The companies will contribute unique knowledge and expertise, human resources, and equipment to speed up the research, development, implementation, and demonstration of the technology in the desired application scenarios. The interaction between the project partners is expected to result in an effective and productive collaboration, generating novel ideas and solutions to the challenges mentioned above. From a research perspective, the aim is to introduce novel methods for 3D hand gesture analysis that let users interact with smart devices intuitively, significantly improving the current user experience. A complementary and experienced team of research scientists in media technology, together with the different competences of each industrial partner, will conduct the project with a focus on interaction technologies in industrial applications.

See also the news feature about the project from 4 April 2017.

Image: CC0, HammerandTusk

