On 18th August 2015, I presented my research project “Enabling Audiovisual User Interfaces” at the Instruments and Interactions event at Queen Mary University of London (3pm to 4pm, room G2, Engineering Building).
Nuno Correia will present the research project “Enabling Audiovisual User Interfaces”, which he is conducting at EAVI (Embodied AudioVisual Interaction group), Goldsmiths, University of London. It is a two-year project, started in mid-2014 and supported by a Marie Curie EU fellowship.

The main research question is: how can interconnected sound and image be used to create more usable, accessible, playful and engaging user interfaces? To address this question, a new UI paradigm is proposed – the AVUI (AudioVisual User Interface). The AVUI links interaction, sound and image, building upon the concept of the Graphical User Interface (GUI) by adding interconnected sonic and visual feedback. The research hypothesis is that introducing the AVUI – with interrelated sonic and visual feedback reacting to user interactions – will lead to more usable, accessible, playful and engaging UIs than a traditional GUI, particularly in use cases where accessibility and/or engagement are decisive.

He will present the main research threads he has been developing as part of the project, partly in collaboration with Queen Mary University of London. Project link: http://avuis.goldsmithsdigital.com