What is HDR? Perceptual Impact of Luminance and Contrast in Immersive Displays

Kenneth Chen¹, Nathan Matsuda², Jon McElvain², Yang Zhao², Thomas Wan², Qi Sun¹†, Alexandre Chapiro²†

† Equal contribution  

¹ NYU   ² Intel

project page

A model to predict perceptual impact (in Just-Objectionable-Differences, or JODs) is derived from HDR preference data for combinations of display contrast and peak luminance, with predictions visualized as a heatmap (left). In this plot, the baseline 0 JOD condition is set to values similar to commercially available VR displays: 100 nits peak luminance and 64:1 contrast. In addition, we simulate 3 displays with different dynamic ranges. Our model allows us to examine the perceived improvement from increased peak luminance and contrast. For example, both displays 2 and 3 provide a 1 JOD improvement over display 1. Note that HDR content cannot be displayed in PDF format, so all images in this manuscript are tone-mapped for presentation. See our supplementary webpage for representative content.

Abstract

The contrast and luminance capabilities of a display are central to the quality of the image. High dynamic range (HDR) displays have high luminance and contrast, but it can be difficult to ascertain whether a given set of characteristics qualifies for this label. This is especially unclear for new display modes, such as virtual reality (VR). This paper studies the perceptual impact of peak luminance and contrast of a display, including characteristics and use cases representative of VR. To achieve this goal, we first developed a haploscope testbed prototype display capable of achieving 1k nits peak luminance and 1M:1 contrast with high precision. We then collected a novel HDR video dataset targeting VR-relevant content types. We also implemented custom tone mapping operators to map between display parameter sets. Finally, we collected subjective preference data spanning 3 orders of magnitude in each dimension. Our data was used to fit a model, which was validated using a subjective study on an HDR VR prototype head-mounted display (HMD). Our model helps provide guidance for future display design, and helps standardize the understanding of HDR.

Data & Code

Dependencies

The plotting code requires the MATLAB library legendUnq from the File Exchange.

Data

User study data and the results scaled to Just-Objectionable-Difference (JOD) units for our SIGGRAPH 2025 work can be found in data/siggraph25/:

  • data/siggraph25/study_results.csv: pairwise comparison ASAP study results for all participant trials.
  • data/siggraph25/scaled_results.csv: pairwise comparison results scaled to JODs using the pwcmp library.

Data from our HVEI 2026 direct-view HDR study are found in data/hvei26/.
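For a quick first look at the data, the CSVs can be loaded directly. The sketch below is a minimal MATLAB example that assumes only the file paths listed above; the column layouts are not documented in this README, so the tables are summarized rather than indexed by column name.

% Minimal MATLAB sketch: load the SIGGRAPH 2025 study data shipped with the repo.
% Only the file paths above are assumed; no column names are referenced here.
raw    = readtable('data/siggraph25/study_results.csv');   % per-trial pairwise comparisons (ASAP study)
scaled = readtable('data/siggraph25/scaled_results.csv');  % results scaled to JODs with pwcmp
summary(raw);      % print variable names and basic statistics
summary(scaled);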

The model parameters are found in data/model_param.csv:

  • fixed: model parameters for the fixed TMO.
  • contentAware: model parameters for the content-aware TMO.
  • hvei26: model parameters for the direct-view HDR study published at HVEI 2026.
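A minimal sketch for inspecting these parameters is shown below. It assumes only the file path above; since the exact column layout of model_param.csv is not documented here, the table is printed rather than indexed by column name.

% Minimal MATLAB sketch: inspect the fitted model parameters shipped with the repo.
params = readtable('data/model_param.csv');
disp(params);   % expect one parameter set each for fixed, contentAware, and hvei26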

Code

A quick start for using our model can be found in src/eval_model.py.

The function for evaluating the model is located in src/model.m, and a simple example is located in src/example.m.
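The snippet below is only a hypothetical sketch of what such a call might look like; the actual calling convention of src/model.m is defined in src/example.m and may differ, so treat the function signature here as an assumption rather than the repository's documented API.

% Hypothetical MATLAB sketch; see src/example.m for the repository's real example.
% Assumption (not documented behavior): model() maps a display's peak luminance (nits)
% and contrast ratio to a JOD score relative to the baseline condition (100 nits, 64:1),
% and takes the fitted parameter table as its first input.
addpath('src');
params = readtable('data/model_param.csv');
jod = model(params, 1000, 1e6);   % e.g., a display with 1k nits peak and 1M:1 contrast
fprintf('Predicted difference vs. baseline: %.2f JOD\n', jod);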

To plot a figure similar to our teaser, use the function below:

  • src/plot_isolines.m: plot heatmaps, similar to Fig. 9 in the manuscript. The inputs to the function plot_isolines are as follows:
    • displayNames [string array] - names of displays to plot
    • displayLuminances [numerical array] - peak luminances of displays to plot
    • displayContrasts [numerical array] - contrasts of displays to plot
    • showIsolines [boolean array] - whether to plot isolines for a display
    • baseline [numerical array] - the [peak luminance, contrast] of a baseline, where JOD is pegged to 0
    • baselineName [string] - name of the baseline display

An example of this function being used to plot heatmaps is in src/heatmap_plot.m, which produces a figure similar to the teaser above.
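For reference, a minimal call sketch is given below. It assumes the inputs are passed positionally in the order listed above; the three displays and their values are purely illustrative, and src/heatmap_plot.m remains the authoritative example.

% Sketch of a plot_isolines call in MATLAB. Argument order follows the list above
% and is an assumption; the display names and values are illustrative only.
addpath('src');
displayNames      = ["Display 1", "Display 2", "Display 3"];
displayLuminances = [100, 1000, 250];        % peak luminance in nits
displayContrasts  = [64, 1000, 100000];      % contrast ratios (x:1)
showIsolines      = [true, true, true];      % draw isolines for each display
baseline          = [100, 64];               % [peak luminance, contrast] pegged to 0 JOD
baselineName      = "Baseline VR display";
plot_isolines(displayNames, displayLuminances, displayContrasts, ...
    showIsolines, baseline, baselineName);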

Acknowledgements

We thank Ken Koh for creating HDR productivity content and Maurizio Nitti for rendering and designing the HDR teddy bear scenes. Thanks to Dennis Pak for designing and constructing the haploscope and the mirror setup. Calibration of the EIZO display could not have been accomplished without the support of Yuta Asano. Thank you to Ben Mills for building the enclosure of our haploscope, calibrating the displays, and performing the binocular calibration and alignment of the mirrors. Thanks go to Will McCann and Xin Li, who supported the construction of the hardware and the mirror fabrication. This project would not have been successful without the support of Fartun Sheygo and Alex Gherman, who conducted the main study, Nour Shoora, who organized it, the user study participants for their time, and John Hill and Romain Bachy for help with logistics. Thanks go to Henry Milani for providing a PR-745 for validation of our displays, and to Reza Saeedpour for support with using the device. Thank you to Dounia Hammou for providing pointers to HDR video datasets, Professor Rafał Mantiuk for the many discussions related to tone mapping and more, and Daryn Blanc-Goldhammer for comments on our work. We are grateful to Alexis Terterov for conducting the validation study, and to the EIZO support team for help debugging HDR displays. Thanks to Doug Lanman for discussions. Finally, thank you to Jenna Kang, Niall Williams, and Colin Groth for help with figure style. This work is partially supported by National Science Foundation grant #2225861 and a DARPA ICS program.

Contact

Contact Kenneth Chen (kennychen@nyu.edu) with any questions about the codebase or to request dataset access.

Related Projects

Perceptual Impact of Peak Luminance and Contrast in Direct View HDR Display, HVEI 2026. Kenneth Chen*, Yunxiang Zhang*, Qi Sun, Alexandre Chapiro.

Citation

If you find this project helpful to your research, please consider citing the following papers:

@inproceedings{
    chen2025whatishdr,
    author = {Chen, Kenneth and Matsuda, Nathan and McElvain, Jon and Zhao, Yang and Wan, Thomas and Sun, Qi and Chapiro, Alexandre},
    title = {What is HDR? Perceptual Impact of Luminance and Contrast in Immersive Displays},
    year = {2025},
    isbn = {9798400715402},
    publisher = {Association for Computing Machinery},
    address = {New York, NY, USA},
    url = {https://doi.org/10.1145/3721238.3730629},
    doi = {10.1145/3721238.3730629},
    booktitle = {Proceedings of the Special Interest Group on Computer Graphics and Interactive Techniques Conference Conference Papers},
    articleno = {40},
    numpages = {11},
    keywords = {High Dynamic Range, Displays, Visual Perception, Virtual Reality},
    series = {SIGGRAPH Conference Papers '25}
 } 
@article{
    chen2026wihdirectview,
    author = {Kenneth Chen and Yunxiang Zhang and Qi Sun and Alexandre Chapiro},
    title = {Perceptual Impact of Peak Luminance and Contrast in Direct View HDR Display},
    journal = {Electronic Imaging},
    volume = {38},
    number = {10},
    pages = {222-1--222-1},
    keywords = {Applied Perception, High Dynamic Range, Display Technology, Tone Mapping, Psychophysics},
    doi = {10.2352/EI.2026.38.10.HVEI-222},
    url = {https://library.imaging.org/ei/articles/38/10/HVEI-222},
    year = {2026},
} 

About

Official code release for "What is HDR? Perceptual Impact of Luminance and Contrast in Immersive Displays", presented at SIGGRAPH 2025
