An eye-tracking experiment is conducted in an immersive virtual reality environment with 6 degrees of freedom, using a head-mounted display. The users interact with 3D point cloud models following a task-dependent protocol, while their gaze and head trajectories are recorded.
On this webpage, we make publicly available a data set consisting of the tracked behavioural data, post-processing results, saliency maps in the form of importance weights, a re-distribution of a subset of the contents, scripts to generate the exact versions of the point clouds that were used in the study, and usage examples.
You can download the data set from the following FTP server using a dedicated FTP client, such as FileZilla or FireFTP:
The total size of the dataset is ~710 MB.
Please consult the README file in the data set for further information on the structure and usage of the material.
You may also read the publication listed below for more details about the experiment and the methodologies that were defined and used.
Conditions of use
If you wish to use any of the provided material in your research, we kindly ask you to cite the following publication:
- E. Alexiou, P. Xu and T. Ebrahimi, “Towards Modelling of Visual Saliency in Point Clouds for Immersive Applications,” 2019 IEEE International Conference on Image Processing (ICIP), Taipei, Taiwan, 2019, pp. 4325–4329. doi: 10.1109/ICIP.2019.8803479