
Immersive music experience in surround sound music reproduction

This repository includes data collected in the main study of the research project "Richard Wagner 3.0", which investigates immersive musical experience in multi-loudspeaker music reproduction.

The dataset contains immersion ratings given by participants in a listening experiment, sound field features of the stimuli, and the results of the statistical evaluation of these data.
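As a rough illustration only: the tabular parts of the dataset (ratings and sound field features) can typically be inspected with standard tooling such as pandas. This is a minimal sketch under the assumption that these tables are provided as CSV files; the file names and the "stimulus" identifier column are hypothetical placeholders, so check the repository's file listing for the actual names and formats.

    # Minimal sketch, assuming the ratings and features ship as CSV files.
    # File names and the "stimulus" column are hypothetical placeholders.
    import pandas as pd

    ratings = pd.read_csv("immersion_ratings.csv")       # per-participant immersion ratings
    features = pd.read_csv("sound_field_features.csv")   # sound field features of the stimuli

    # Join ratings with the corresponding stimulus features, assuming both
    # tables share a common stimulus identifier column.
    merged = ratings.merge(features, on="stimulus")
    print(merged.describe())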

This is a companion repository to the paper:

Roman Kiyan, Jakob Bergner, Stephan Preihs, Yves Wycisk, Daphne Schössow, Kilian Sander, Jürgen Peissig, Reinhard Kopiez: "Towards predicting immersion in surround sound music reproduction from sound field features". Details TBA.

Code implementing the paper's methodology can be found at https://gitlab.uni-hannover.de/roman.kiyan.jr/immersionmodeling/.

Contact: Roman Kiyan


Cite this as

Roman Kiyan, Jakob Bergner, Stephan Preihs, Yves Wycisk, Daphne Schössow, Kilian Sander, Jürgen Peissig, Reinhard Kopiez (2023). Dataset: Immersive music experience in surround sound music reproduction. https://doi.org/10.25835/3vx9ls5h

DOI retrieved: July 7, 2023

Additional Info

Imported on: August 4, 2023
Last update: August 4, 2023
License: CC-BY-SA-3.0
Source: https://data.uni-hannover.de/dataset/immersive-music-experience-in-surround-sound-music-reproduction
Author: Roman Kiyan
More Authors: Jakob Bergner, Stephan Preihs, Yves Wycisk, Daphne Schössow, Kilian Sander, Jürgen Peissig, Reinhard Kopiez
Author Email: Roman Kiyan
Maintainer: Roman Kiyan
Maintainer Email: Roman Kiyan
Source Creation: July 7, 2023, 16:59 (UTC+0000)
Source Modified: July 7, 2023, 17:13 (UTC+0000)