Development of new human-machine interaction systems using Deep Learning techniques
Human-machine interaction systems allow people to interact with machines (computers, Virtual or Augmented Reality devices, etc.). This interaction can take various forms, from remote controls to touch screens or voice interaction.
Today’s technological advances enable a variety of interaction metaphors: they not only let people provide information to the machine, but also let machines acquire information about their users (e.g., their location).
I3B has the capacity to develop new human-machine interaction systems in various formats (screens, smart speakers, Virtual Reality or Augmented Reality devices), making this capability applicable across many sectors and contexts.
In addition, applications based on Artificial Vision techniques acquire information from images and transform it into numerical or symbolic information that a computer can process and analyse. Interpreting images automatically and in real time is a major technological advance. Among other things, it makes it possible to detect people and objects in images and estimate their pose, reconstruct and semantically analyse 3D scenes in real time using RGB-D cameras, perform thermographic analysis, and train object detection models on synthetic images rendered from a 3D model.
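As a concrete illustration of the RGB-D 3D reconstruction mentioned above, the first step is usually to back-project each depth pixel into a 3D point using the pinhole camera model. The sketch below is a minimal example with NumPy; the camera intrinsics (fx, fy, cx, cy) and the toy depth map are hypothetical values, not parameters of any I3B system.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (in metres) into a 3D point cloud using
    the pinhole camera model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # Stack into an (N, 3) array and drop pixels with no depth reading.
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]

# Toy 2x2 depth map with one invalid (zero-depth) pixel.
depth = np.array([[1.0, 1.0],
                  [0.0, 2.0]])
cloud = depth_to_point_cloud(depth, fx=1.0, fy=1.0, cx=0.5, cy=0.5)
print(cloud.shape)  # → (3, 3): three valid points, each with (x, y, z)
```

In a real pipeline the resulting point cloud would then be fused across frames and fed to a semantic segmentation model; this snippet only covers the geometric step.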
Applications:
• Gesture recognition.
• Speech recognition.
• Development of multi-touch technology.
• Analysis of the movement of people in a controlled space, e.g. a physical store.
• Identification of physical elements for integration with Augmented Reality applications.
• Semantic analysis to identify the presence of specific objects in images.
• Identification of real objects and analysis of user interaction with these elements.
These models can be applied for different purposes: clinical (identification of Autism Spectrum Disorder), organizational neuroscience (evaluation of capabilities, leadership style, decision-making under risk), or consumer profiling (predicting trajectories within a physical store, predicting gender from movements within a physical space).
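To make the trajectory-prediction use case concrete, the sketch below shows the standard constant-velocity baseline against which learned models are typically compared: given a shopper's observed floor positions, it extrapolates the last displacement into the future. This is only an illustrative baseline with made-up coordinates, not the deep model the text refers to.

```python
import numpy as np

def predict_next_positions(track, n_steps):
    """Constant-velocity baseline for trajectory prediction: extrapolate
    the last observed displacement n_steps into the future. A deep model
    would learn richer motion patterns from many observed trajectories."""
    track = np.asarray(track, dtype=float)   # shape (T, 2): x, y per frame
    velocity = track[-1] - track[-2]         # last observed displacement
    steps = np.arange(1, n_steps + 1)[:, None]
    return track[-1] + steps * velocity

# A shopper observed at four floor positions (metres, hypothetical data).
observed = [(0.0, 0.0), (1.0, 0.5), (2.0, 1.0), (3.0, 1.5)]
future = predict_next_positions(observed, n_steps=2)
print(future)  # → [[4.  2. ] [5.  2.5]]
```

A learned predictor (e.g., a recurrent network trained on in-store tracking data) replaces the velocity extrapolation with a model that can anticipate turns toward shelves or checkouts, which a constant-velocity baseline cannot.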