Audiovisual relationships have been explored in Human-Computer Interaction (HCI) to enhance user experience and usability and to facilitate content generation. Application fields focusing on sound visualization include: increasing engagement and creativity in media players; sound synthesis; musician training; accessibility and interfaces for users with hearing impairments; cross-modal information display to improve driving performance; and game development.
 Collins, K. and Taillon, P.J. 2012. Visualized sound effect icons for improved multimedia accessibility: A pilot study. Entertainment Computing. 3, 1 (Jan. 2012), 11–17.
 Ferguson, S. et al. 2005. Seeing sound: real-time sound visualisation in visual feedback loops used for training musicians. International Conference on Information Visualisation, 97–102.
 Grierson, M.S. 2011. Making music with images: interactive audiovisual performance systems for the deaf. International Journal on Disability and Human Development. 10, 1 (Jan. 2011).
 Ho-Ching, F.W. et al. 2003. Can you see what I hear?: the design and evaluation of a peripheral sound display for the deaf. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (New York, NY, USA, 2003), 161–168.
 Kim, M. et al. 2013. Sound Sketchbook: Synthetic Synesthesia on a Mobile Platform. Leonardo. 46, 3 (Mar. 2013), 284–285.
 Kim, S. and Lee, W. 2013. SOUND BOUND: making a graphic equalizer more interactive and fun. CHI ’13 Extended Abstracts on Human Factors in Computing Systems (New York, NY, USA, 2013), 2963–2966.
 Onimaru, S. et al. 2008. Cross-modal information display to improve driving performance. Proceedings of the 2008 ACM symposium on Virtual reality software and technology (New York, NY, USA, 2008), 281–282.
 Sauer, D. and Yang, Y.-H. 2009. Music-driven character animation. ACM Transactions on Multimedia Computing, Communications, and Applications. 5, 4 (Nov. 2009), 27:1–27:16.