Original article: https://learnopengl.com/Advanced-Lighting/HDR
HDR
Brightness and color values are, by default, clamped between 0.0 and 1.0 when stored into a framebuffer.
This seemingly innocent statement has caused us to always specify light and color values somewhere in this range, trying to make them fit into the scene.
This works okay and gives decent results, but what happens if we walk through a particularly bright area with multiple bright light sources whose summed contribution exceeds 1.0?
The answer is that all fragments with a brightness or color sum over 1.0 get clamped to 1.0, which is not pretty to look at:

Because a large number of fragment color values get clamped to 1.0, each of the bright fragments has the exact same white color over a large region, losing a significant amount of detail and giving the scene a fake look.
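The clamping described above can be sketched on the CPU. This is a minimal illustration, not part of the tutorial's demo; `clampToLDR` and `storeFragment` are hypothetical names standing in for what a fixed-point colorbuffer does when a fragment is written:

```cpp
#include <vector>

// Hypothetical helper mirroring what a normalized fixed-point colorbuffer
// does on store: every channel is clamped to [0.0, 1.0].
float clampToLDR(float value)
{
    return value < 0.0f ? 0.0f : (value > 1.0f ? 1.0f : value);
}

// Sum the contributions of several lights on one fragment channel, as a
// lighting pass would, then store the result into an LDR buffer.
float storeFragment(const std::vector<float>& lightContributions)
{
    float sum = 0.0f;
    for (float c : lightContributions)
        sum += c;
    return clampToLDR(sum);
}
```

Two fragments whose summed light contributions are 2.0 and 5.0 both store exactly 1.0: the relative difference between them, the detail, is gone.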

One solution would be to reduce the strength of the light sources and ensure no area of fragments in the scene ends up brighter than 1.0;
this is not a good solution as it forces us to use unrealistic lighting parameters. A better approach is to allow color values to temporarily exceed 1.0 and transform them back to the original range of 0.0 to 1.0 as a final step, without losing detail.

Monitors are limited to displaying colors in the range of 0.0 to 1.0, but there is no such limitation in lighting equations.
By allowing fragment colors to exceed 1.0 we have a much larger range of color values available to work in, known as high dynamic range (HDR).
With high dynamic range, bright things can be really bright, dark things can be really dark, and details are visible in both.

High dynamic range was originally used only in photography, where a photographer takes multiple pictures of the same scene at varying exposure levels, capturing a large range of color values.
Combined, these images form an HDR image in which a large range of detail is visible, depending on the exposure it is viewed with.
For instance, the image below shows a lot of detail in brightly lit regions at a low exposure (look at the window), but those details are gone at a high exposure. A high exposure, however, reveals a great amount of detail in darker regions that was not previously visible.


This is also very similar to how the human eye works, and it is the basis of high dynamic range rendering.
When there is little light the human eye adapts so the darker parts become much better visible, and similarly for bright areas; it is as if the human eye has an automatic exposure slider based on the scene's brightness.

High dynamic range rendering works a bit like that.
We allow a much larger range of color values to render to, collecting a large range of dark and bright details of a scene, and at the end we transform all the HDR values back to the low dynamic range (LDR) of [0.0, 1.0].
This process of converting HDR values to LDR values is called tone mapping, and a large collection of tone mapping algorithms exists, each aiming to preserve most HDR detail during the conversion. These algorithms often involve an exposure parameter that selectively favors dark or bright regions.

When it comes to real-time rendering, high dynamic range not only allows us to exceed the LDR range of [0.0, 1.0] and preserve more detail, it also lets us specify light sources by their real intensities.
For instance, the sun has a much higher intensity than something like a flashlight, so why not configure the sun as such (e.g. with a diffuse brightness of 10.0)?
This allows us to configure a scene's lighting with far more realistic parameters,
something that would not be possible with LDR rendering, as those values would directly get clamped to 1.0.

As monitors only display colors in the range of 0.0 to 1.0, we do need to transform the high dynamic range of color values back to the monitor's range.
Simply re-transforming the colors with a simple average would not do us much good, as brighter areas would become far more dominant. What we can do, however, is use different equations and/or curves to transform the HDR values back to LDR, giving us complete control over the scene's brightness.
This is the process denoted earlier as tone mapping, and it is the final step of HDR rendering.

Floating point framebuffers
To implement high dynamic range rendering we need some way to prevent color values from being clamped after each fragment shader run.
When framebuffers use a normalized fixed-point color format (like GL_RGB) as their colorbuffer's internal format, OpenGL automatically clamps the values between 0.0 and 1.0 before storing them in the framebuffer.
This operation holds for most framebuffer formats, except floating point formats, which are used precisely for their extended range of values.

When the internal format of a framebuffer's colorbuffer is specified as GL_RGB16F, GL_RGBA16F, GL_RGB32F or GL_RGBA32F, the framebuffer is known as a floating point framebuffer and can store floating point values outside the default range of 0.0 to 1.0. This is perfect for rendering in high dynamic range!

To create a floating point framebuffer the only thing we need to change is its colorbuffer's internal format parameter:

glBindTexture(GL_TEXTURE_2D, colorBuffer);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, SCR_WIDTH, SCR_HEIGHT, 0, GL_RGBA, GL_FLOAT, NULL);

The default framebuffer of OpenGL (by default) uses only 8 bits per color component.
With a floating point framebuffer at 32 bits per color component (GL_RGB32F or GL_RGBA32F) we use 4 times more memory for storing color values. As 32 bits is not really necessary unless you need a high level of precision, GL_RGBA16F will suffice.
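The memory comparison above can be made concrete. A minimal sketch, assuming the usual component sizes (8-bit channels for GL_RGBA8, 16-bit half floats for GL_RGBA16F, 32-bit floats for GL_RGBA32F); the helper names are our own, not OpenGL API:

```cpp
#include <cstddef>

// Bytes per pixel for a few colorbuffer internal formats:
// 4 channels times the per-channel size in bytes.
std::size_t bytesPerPixelRGBA8()   { return 4 * 1; }  //  8-bit fixed point
std::size_t bytesPerPixelRGBA16F() { return 4 * 2; }  // 16-bit half float
std::size_t bytesPerPixelRGBA32F() { return 4 * 4; }  // 32-bit float

// Total colorbuffer size in bytes for a given resolution.
std::size_t colorbufferBytes(std::size_t width, std::size_t height,
                             std::size_t bytesPerPixel)
{
    return width * height * bytesPerPixel;
}
```

At 1280x720, GL_RGBA8 takes about 3.7 MB, GL_RGBA16F about 7.4 MB, and GL_RGBA32F about 14.7 MB, which is the 4x factor mentioned above.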

With a floating point colorbuffer attached to a framebuffer we can now render the scene into it, knowing color values will not get clamped between 0.0 and 1.0.
In this tutorial's example demo we first render a lit scene into the floating point framebuffer and then display the framebuffer's colorbuffer on a screen-filling quad; it looks a bit like this:

glBindFramebuffer(GL_FRAMEBUFFER, hdrFBO);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // [...] render (lit) scene
glBindFramebuffer(GL_FRAMEBUFFER, 0);

// now render hdr colorbuffer to 2D screen-filling quad with different shader
hdrShader.use();
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, hdrColorBufferTexture);
RenderQuad();

Here a scene's color values are rendered into a floating point colorbuffer that can contain any arbitrary color value, possibly exceeding 1.0.
For this tutorial a simple demo scene was created with a large stretched cube acting as a tunnel and four point lights, one of them extremely bright and positioned at the tunnel's end:

std::vector<glm::vec3> lightColors;
lightColors.push_back(glm::vec3(200.0f, 200.0f, 200.0f));
lightColors.push_back(glm::vec3(0.1f, 0.0f, 0.0f));
lightColors.push_back(glm::vec3(0.0f, 0.0f, 0.2f));
lightColors.push_back(glm::vec3(0.0f, 0.1f, 0.0f));

Rendering into the floating point framebuffer is exactly the same as rendering into any other framebuffer. What is new is hdrShader's fragment shader that renders the final 2D quad with the floating point colorbuffer texture attached. Let us first define a simple pass-through fragment shader:

#version 330 core
out vec4 FragColor;

in vec2 TexCoords;

uniform sampler2D hdrBuffer;

void main()
{
    vec3 hdrColor = texture(hdrBuffer, TexCoords).rgb;
    FragColor = vec4(hdrColor, 1.0);
}

Here we directly sample the floating point colorbuffer and use its color value as the fragment shader's output.
However, as the 2D quad's output is rendered directly into the default framebuffer, all of the fragment shader's output values are clamped between 0.0 and 1.0, even though several values in the floating point color texture exceed 1.0.

It becomes clear the intense light values at the end of the tunnel are clamped to 1.0, as a large portion of it is completely white; we effectively lose all lighting detail that exceeds 1.0.
As we directly transform HDR values to LDR values, it is as if we have no HDR enabled in the first place.
What we need to do is transform all the floating point color values back into the [0.0, 1.0] range without losing detail: we need to apply a process called tone mapping.

Tone mapping
Tone mapping is the process of transforming floating point color values into the expected [0.0, 1.0] low dynamic range without losing too much detail, often accompanied by a specific stylistic color balance.

The simplest tone mapping algorithm is known as Reinhard tone mapping: each HDR color value is divided by that value plus one, evenly balancing all brightness values onto LDR.
We include Reinhard tone mapping in the previous fragment shader and also add a gamma correction filter for good measure (including the use of sRGB textures):

void main()
{
    const float gamma = 2.2;
    vec3 hdrColor = texture(hdrBuffer, TexCoords).rgb;

    // reinhard tone mapping
    vec3 mapped = hdrColor / (hdrColor + vec3(1.0));
    // gamma correction
    mapped = pow(mapped, vec3(1.0 / gamma));

    FragColor = vec4(mapped, 1.0);
}
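The shader's per-channel math is easy to verify on the CPU. A minimal C++ sketch of the same two steps (function names are ours, not part of the demo code):

```cpp
#include <cmath>

// Reinhard tone mapping: hdr / (hdr + 1). Maps [0, inf) into [0, 1) while
// staying strictly increasing, so no two distinct HDR values collapse.
float reinhard(float hdr)
{
    return hdr / (hdr + 1.0f);
}

// Gamma correction: raise the linear value to 1/gamma for display.
float gammaCorrect(float linear, float gamma = 2.2f)
{
    return std::pow(linear, 1.0f / gamma);
}

// Full pipeline for one channel, mirroring the fragment shader above.
float toneMap(float hdr)
{
    return gammaCorrect(reinhard(hdr));
}
```

Note that even an extreme input like 200.0 lands below 1.0, and it still maps above any smaller input, which is exactly why the clamped-white regions regain their detail.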

With Reinhard tone mapping applied we no longer lose any detail in the bright areas of our scene. It does tend to slightly favor brighter areas, making darker regions seem less detailed and distinct:

Here you can again see details at the end of the tunnel, as the wood texture pattern becomes visible again.
With this relatively simple tone mapping algorithm we can properly see the entire range of HDR values stored in the floating point framebuffer, giving us precise control over the scene's lighting without losing detail.

Note that we could also tone map directly at the end of our lighting shader, not needing any floating point framebuffer at all!
However, as scenes get more complex you will frequently need to store intermediate HDR results in floating point buffers, so this is a good exercise.

Another interesting use of tone mapping is to allow an exposure parameter.
You probably remember from the introduction that HDR images contain a lot of detail visible at different exposure levels.
If a scene features a day-night cycle it makes sense to use a lower exposure in daylight and a higher exposure at night, similar to how the human eye adapts.
Such an exposure parameter lets us configure lighting that works under both day and night conditions, as we only have to change the exposure.

A relatively simple exposure tone mapping algorithm looks as follows:

uniform float exposure;

void main()
{
    const float gamma = 2.2;
    vec3 hdrColor = texture(hdrBuffer, TexCoords).rgb;

    // exposure tone mapping
    vec3 mapped = vec3(1.0) - exp(-hdrColor * exposure);
    // gamma correction
    mapped = pow(mapped, vec3(1.0 / gamma));

    FragColor = vec4(mapped, 1.0);
}
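The exposure curve can likewise be checked per channel on the CPU. A minimal sketch (the function name is ours):

```cpp
#include <cmath>

// Exposure tone mapping: mapped = 1 - exp(-hdr * exposure).
// The curve asymptotically approaches but never reaches 1.0, so no HDR value
// is hard-clamped, and raising the exposure brightens the output everywhere.
float exposureToneMap(float hdr, float exposure)
{
    return 1.0f - std::exp(-hdr * exposure);
}
```

For a dark input like 0.1, an exposure of 5.0 yields a noticeably brighter result than an exposure of 1.0, which is the "focus on dark regions" effect described below.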

Here we define an exposure uniform that defaults to 1.0 and allows us to specify more precisely whether we want to focus on the dark or the bright regions of the HDR color values.
For instance, with high exposure values the darker areas of the tunnel show significantly more detail.
In contrast, a low exposure largely removes the dark-region detail, but lets us see more detail in the bright areas of a scene.
Low exposure: bright details are visible, dark details are lost.
High exposure: dark details are visible, bright details are lost.
Take a look at the image below to see the tunnel at multiple exposure levels:

This image clearly shows the benefit of high dynamic range rendering.
By changing the exposure level we get to see many details of our scene that would otherwise be lost with low dynamic range rendering.
Take the end of the tunnel: at a normal exposure the wood structure is barely visible, but at a low exposure the detailed wooden patterns are clearly visible. The same holds for the wooden patterns close by, which are much better visible at a high exposure.
You can find the source code of the demo here: https://learnopengl.com/code_viewer_gh.php?code=src/5.advanced_lighting/6.hdr/hdr.cpp

More HDR
The two tone mapping algorithms shown here are only a few out of a large collection of (more advanced) tone mapping algorithms, each with its own strengths and weaknesses.
Some favor certain colors/intensities over others, and some display both the low- and high-exposure colors at the same time to create more colorful and detailed images.

There is also a collection of techniques known as automatic exposure adjustment, or eye adaptation, that determine the brightness of the scene in the previous frame and (slowly) adapt the exposure parameter so the scene gets brighter in dark areas or darker in bright areas, mimicking the human eye.

The real benefit of HDR rendering shows itself in large and complex scenes with heavy lighting algorithms.
As it is difficult to create such a complex demo scene for teaching purposes while keeping it accessible, this tutorial's demo scene is small and lacks detail.
While relatively simple, it does show some of the benefits of HDR rendering:
no details are lost in bright or dark regions as they can be regained with tone mapping,
the addition of multiple lights does not cause clamped regions, and light values can be specified by their
original brightness, not limited by LDR values.
Furthermore, HDR rendering also makes several interesting effects more feasible and realistic;
one of these effects is bloom, which we discuss in the next tutorial.
