About
EVOLVE-QoE
Ecological Validity Evaluation of Interactive Virtual Environments: A QoE Framework for Audiovisual Scenes
Project duration: 2024-2027
The proposed project EVOLVE-QoE aims to provide appropriate schemata for characterizing interactive virtual environments (IVEs) and IVE systems/media in terms of ecological validity (EV), in the context of auditory cognition and QoE. During the project QoEVAVE in AUDICTIVE Phase 1, the underlying IVE scene content was identified as a key factor for this research. EVOLVE-QoE is planned as a renewal proposal of QoEVAVE for AUDICTIVE Phase 2, building on the developed conceptual QoE framework, the QExE Tool, and the VR scenes. One main emphasis of this renewal proposal lies with the AUDICTIVE Priority (b), Interactive Virtual Environments. To bridge the EV gap between IVEs and interactive real environments (IREs), as illustrated in Figure 1, Priority (a) is addressed through systematic analysis via top-down and bottom-up characterizations of IVEs, focusing on auditory cognition and audiovisual interaction. Finally, quality evaluation methods are further expanded, building on the foundation laid in Phase 1, addressing Priority (c).
Project description excerpted from the original DFG proposal (SPP 2236 — Audictive).
QoEVAVE
Quality of Experience Evaluation of Interactive Virtual Environments with Audiovisual Scenes
Project duration: 2021-2024
Interactive virtual environments (IVEs) aim to replace real-world sensory input with corresponding streams of artificial stimulation. If successful, such a replacement will make the technology transparent and allow the user to interact naturally in a virtual world. Hence, how close to real life the IVE experience can get represents a key criterion for IVE quality. IVEs bring new challenges to quality evaluation and render current evaluation approaches in the audio and video communities partially inapplicable. Traditionally, the quality judgment is a dedicated cognitive process, and the participant is constantly aware of the quality judgment task. In contrast, IVEs rely on presence through immersion. Introducing a quality judgment task may reduce immersion in the IVE experience, which in turn affects the quality.
The QoEVAVE project draws inspiration from the virtual reality (VR) community and the long history of using indirect methods to investigate cognitive functions of immersion, presence, and performance in IVEs. The project aims at finding and closing the gaps in current quality evaluation methodologies for audio and video, and examines the feasibility of inferring quality from human behavior in an IVE. IVEs are multimodal and allow three- or six-degrees-of-freedom movement in the virtual scene. State-of-the-art research shows that, compared with a unimodal scenario, multimodal sensory stimulation significantly affects perceptual aspects such as object localization, attention, and quality evaluation. Regardless, quality evaluation today is mostly conducted within a specific sensory modality and without interaction. QoEVAVE builds upon the foundation of quality of experience (QoE) research and integrates methodologies from the VR community to develop the first QoE framework for IVEs. Here, the aim is to achieve an integrated view of IVE quality perception as a cognitive process and of cognitive performances on specific tasks as IVE-quality indicators.
The QoEVAVE project addresses the three AUDICTIVE priorities. Its main emphasis is on the quality evaluation priority area (c), developing a taxonomy and framework of IVE-quality evaluation methods. Next, the project addresses the auditory cognition priority area (a), since it aims to expand classical QoE-evaluation approaches by systematically including aspects of auditory and audiovisual cognition, and hence task performance, into the assessment schemes. Finally, it addresses the IVEs priority area (b) by investigating QoE and cognitive performance in multiple instances of IVEs. The project builds upon state-of-the-art technologies of audiovisual IVEs and explores the possibilities for advancing audio rendering in virtual environments. In summary, the project recognizes the divergence between the VR community and the media technology community and aims to unify the field with regard to QoE evaluation in IVEs.
Project description excerpted from the original DFG proposal (SPP 2236 — Audictive).
More information is available on the AUDICTIVE Priority Program website.
Project Partners
International Audio Laboratories Erlangen*
*A joint institution of the Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) and Fraunhofer Institute for Integrated Circuits (IIS).
Team

Thomas M. Robotham
Post-doctoral Researcher
Emanuël A. P. Habets
Principal Investigator
Institut für Nachrichtentechnik, RWTH Aachen
Team

Abhinav Bhattacharya
PhD Student
Alexander Raake
Principal Investigator
Alumni

William Menz
PhD Student, TUIL (2023-2024)
Olli S. Rummukainen
Associated Researcher, AudioLabs (until April 2021)

Ashutosh Singla
PhD Student, TUIL (until 2022)

Daniela Rebmann
Student, FAU (2022-2024)

Dominik Fintineanu-Anghelescu
Student, FAU (2022-2024)
Funding
This research was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation), project number 444832250 — SPP 2236.