A Deep Dive into the UE / Frostbite Atmosphere System Internals
First, the usual house rule:
No reposting without permission (to keep this from being reposted around until it ends up on aggregator sites like Manew).
Bilibili: Heskey0
A Few Words First
I've been busy lately: online classes in the morning, work during the day, homework after work. Even so, within a week I went through dozens of papers, put the slides together, and recorded the tutorial. Surely that earns a triple-like?
I'm about to leave Tencent and return to normal campus life: homework and exams (tears).
Pre. Volume Rendering Basics
https://www.bilibili.com/video/BV1EL4y1u7Aq
I'll assume you have already watched the volume rendering lessons of the 百人計划 course (my lecturing is rough, so if you don't feel like watching the videos, the slides cover the same ground). This post is supplementary material for the engine episode; feel free to pick and choose the parts you need.
1. Transmittance LUT
The LUT is updated in real time, so it must be both readable and writable:
UAV (Unordered Access View): slightly more expensive performance-wise, but supports simultaneous reads and writes to a texture.
RWTexture2D: a read-write Texture2D.
RWTexture2D<float3> TransmittanceLutUAV;
[numthreads(THREADGROUP_SIZE, THREADGROUP_SIZE, 1)]
void RenderTransmittanceLutCS(uint3 ThreadId : SV_DispatchThreadID)
{
//return;
float2 PixPos = float2(ThreadId.xy) + 0.5f;
// Compute camera position from LUT coords
float2 UV = (PixPos) * SkyAtmosphere.TransmittanceLutSizeAndInvSize.zw;
float ViewHeight;
float ViewZenithCosAngle;
UvToLutTransmittanceParams(ViewHeight, ViewZenithCosAngle, UV);
// A few extra needed constants
float3 WorldPos = float3(0.0f, 0.0f, ViewHeight);
float3 WorldDir = float3(0.0f, sqrt(1.0f - ViewZenithCosAngle * ViewZenithCosAngle), ViewZenithCosAngle);
SamplingSetup Sampling = (SamplingSetup)0;
{
Sampling.VariableSampleCount = false;
Sampling.SampleCountIni = SkyAtmosphere.TransmittanceSampleCount;
}
const bool Ground = false;
const float DeviceZ = FarDepthValue;
const bool MieRayPhase = false;
const float3 NullLightDirection = float3(0.0f, 0.0f, 1.0f);
const float3 NullLightIlluminance = float3(0.0f, 0.0f, 0.0f);
const float AerialPespectiveViewDistanceScale = 1.0f;
SingleScatteringResult ss = IntegrateSingleScatteredLuminance(
float4(PixPos,0.0f,1.0f), WorldPos, WorldDir,
Ground, Sampling, DeviceZ, MieRayPhase,
NullLightDirection, NullLightDirection, NullLightIlluminance, NullLightIlluminance,
AerialPespectiveViewDistanceScale);
float3 transmittance = exp(-ss.OpticalDepth);
TransmittanceLutUAV[int2(PixPos)] = transmittance;
}
If you're familiar with CUDA or compute shaders, ThreadId will look familiar, and the CS suffix settles it: this is a Compute Shader; UE computes its LUTs in compute shaders. If you haven't met compute shaders yet, start here:
https://zhuanlan.zhihu.com/p/468861191
This series is about dissecting UE itself, not about teaching CS basics, so let's get straight to the point.
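As an aside, the CPU side just dispatches enough threadgroups to cover every LUT texel. A hypothetical Python sketch of that arithmetic; the group size of 8 and the helper name are my assumptions, not UE's exact values:

```python
def dispatch_size(lut_width, lut_height, group_size=8):
    """Threadgroup counts so that width*height threads cover every texel.

    group_size mirrors THREADGROUP_SIZE in the shader (8 is an assumed value).
    """
    groups_x = (lut_width + group_size - 1) // group_size  # ceiling division
    groups_y = (lut_height + group_size - 1) // group_size
    return groups_x, groups_y

gx, gy = dispatch_size(256, 64)  # a 256x64 LUT -> (32, 8) groups
```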
1.1. Parameter Setup
As mentioned in the tutorial, the Transmittance LUT's (u, v) corresponds to (altitude, zenith angle); in Unreal's code these are the two variables:
float ViewHeight;
float ViewZenithCosAngle;
The LUT size can be adjusted via Unreal's console; dividing the pixel position by the LUT size yields the UV:
// SkyAtmosphere.TransmittanceLutSizeAndInvSize.xy holds the LUT size
// SkyAtmosphere.TransmittanceLutSizeAndInvSize.zw holds 1/size
float2 UV = (PixPos) * SkyAtmosphere.TransmittanceLutSizeAndInvSize.zw;
Passing the UV into the function below fills in both variables:
UvToLutTransmittanceParams(ViewHeight, ViewZenithCosAngle, UV);
From these we then derive the camera's world position and world direction. Note that this world position is not the Actor's position but a position in the atmosphere coordinate system, whose origin is the planet's center:
float3 WorldPos = float3(0.0f, 0.0f, ViewHeight);
float3 WorldDir = float3(0.0f, sqrt(1.0f - ViewZenithCosAngle * ViewZenithCosAngle), ViewZenithCosAngle);
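The inverse mapping done by UvToLutTransmittanceParams follows the parameterization described in Hillaire's 2020 paper "A Scalable and Production Ready Sky and Atmosphere Rendering Technique". The Python version below is my reconstruction of that mapping, not UE's exact code, and the radii are illustrative:

```python
import math

def uv_to_lut_transmittance_params(uv, bottom_radius, top_radius):
    """Invert the transmittance-LUT parameterization: (u, v) -> (view_height, cos_zenith)."""
    x_mu, x_r = uv                       # u encodes the zenith angle, v the altitude
    # Distance to the horizon as seen from the top of the atmosphere...
    H = math.sqrt(top_radius**2 - bottom_radius**2)
    # ...and from the current altitude.
    rho = H * x_r
    view_height = math.sqrt(rho**2 + bottom_radius**2)
    # Distance along the ray to the top boundary, remapped so the LUT spends
    # its resolution between the minimum and maximum possible distances.
    d_min = top_radius - view_height
    d_max = rho + H
    d = d_min + x_mu * (d_max - d_min)
    if d <= 0.0:
        cos_zenith = 1.0                 # degenerate ray: already at the boundary
    else:
        cos_zenith = (H * H - rho * rho - d * d) / (2.0 * view_height * d)
    return view_height, max(-1.0, min(1.0, cos_zenith))

# Earth-like radii in km (illustrative)
h, mu = uv_to_lut_transmittance_params((0.5, 0.5), 6360.0, 6460.0)
```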
1.2. Core
With the parameters ready, the core of this code is the function IntegrateSingleScatteredLuminance(). It computes the Optical Depth (what I called Optical Thickness in the tutorial; they are the same thing), derives the transmittance from it, and writes the result to the UAV.
So next, let's take IntegrateSingleScatteredLuminance() apart.
The function's outputs are:
- L: radiance
- OpticalDepth: optical thickness
- Transmittance
The execution goes roughly like this:
- Intersect the ray with the ground and with the atmosphere boundary; this yields the distance tMax the subsequent ray march has to cover.
- Prepare the ray-marching parameters.
- tMax / SampleCount gives the step length dt of the march.
- Write the light direction as wi and the world direction as wo (as noted earlier, the world direction is the camera's direction in atmosphere space); the cosine of the angle between them is cosTheta.
- Evaluate the Mie and Rayleigh phase functions:
float MiePhaseValueLight0 = HgPhase(Atmosphere.MiePhaseG, -cosTheta);
float RayleighPhaseValueLight0 = RayleighPhase(cosTheta);
- Ray march the atmosphere to integrate optical depth: at every step of the march, compute the optical depth of that segment, then accumulate all segments' optical depths.
In the intro episode of the videos I covered the transmittance of a homogeneous medium: \(T_r(x\rightarrow y)=e^{-\sigma_t||x-y||}\)
and the optical thickness: \(\sigma_t||x-y||\)
In code, this corresponds to:
// Optical depth of this segment: extinction coefficient * distance * scale
const float3 SampleOpticalDepth = Medium.Extinction * dt * AerialPespectiveViewDistanceScale;
// Accumulate optical depth
OpticalDepth += SampleOpticalDepth;
// Transmittance of this segment, while we're at it
const float3 SampleTransmittance = exp(-SampleOpticalDepth);
What AerialPespectiveViewDistanceScale does should be obvious at a glance; it is a parameter from SkyAtmosphereComponent.cpp:
AerialPespectiveViewDistanceScale = 1.0f;
- Compute the radiance and the throughput. I'll just drop the code here and explain it in detail later, because at this point the Transmittance LUT computation is finished: we have the Optical Depth, obtain the Transmittance from it, and store it in the UAV.
float3 S = ExposedLight0Illuminance * (PlanetShadow0 * TransmittanceToLight0 * PhaseTimesScattering0 + MultiScatteredLuminance0 * Medium.Scattering);
float3 Sint = (S - S * SampleTransmittance) / Medium.Extinction;
L += Throughput * Sint;
Throughput *= SampleTransmittance;
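To recap section 1, the whole Transmittance LUT march boils down to summing per-segment optical depths and exponentiating. A minimal Python sketch with a constant extinction coefficient (all values illustrative):

```python
import math

def march_optical_depth(extinction, t_max, sample_count, view_distance_scale=1.0):
    """Mirror of the march loop: accumulate optical depth segment by segment."""
    dt = t_max / sample_count            # fixed step length
    optical_depth = 0.0
    for _ in range(sample_count):
        # extinction coefficient * distance * scale, as in the shader
        optical_depth += extinction * dt * view_distance_scale
    return optical_depth

od = march_optical_depth(extinction=0.01, t_max=100.0, sample_count=40)
transmittance = math.exp(-od)            # exp(-OpticalDepth)
```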
2. Sky-View LUT
RWTexture2D<float3> SkyViewLutUAV;
[numthreads(THREADGROUP_SIZE, THREADGROUP_SIZE, 1)]
void RenderSkyViewLutCS(uint3 ThreadId : SV_DispatchThreadID)
{
//return;
float2 PixPos = float2(ThreadId.xy) + 0.5f;
float2 UV = PixPos * SkyAtmosphere.SkyViewLutSizeAndInvSize.zw;
float3 WorldPos = GetCameraPlanetPos();
// For the sky view lut to work, and not be distorted, we need to transform the view and light directions
// into a referential with UP being perpendicular to the ground. And with origin at the planet center.
// This is the local referencial
float3x3 LocalReferencial = GetSkyViewLutReferential(View.SkyViewLutReferential);
// This is the LUT camera height and position in the local referential
float ViewHeight = length(WorldPos);
WorldPos = float3(0.0, 0.0, ViewHeight);
// Get the view direction in this local referential
float3 WorldDir;
UvToSkyViewLutParams(WorldDir, ViewHeight, UV);
// And also both light source direction
float3 AtmosphereLightDirection0 = View.AtmosphereLightDirection[0].xyz;
AtmosphereLightDirection0 = mul(LocalReferencial, AtmosphereLightDirection0);
float3 AtmosphereLightDirection1 = View.AtmosphereLightDirection[1].xyz;
AtmosphereLightDirection1 = mul(LocalReferencial, AtmosphereLightDirection1);
// Move to top atmosphere
if (!MoveToTopAtmosphere(WorldPos, WorldDir, Atmosphere.TopRadiusKm))
{
// Ray is not intersecting the atmosphere
SkyViewLutUAV[int2(PixPos)] = 0.0f;
return;
}
SamplingSetup Sampling = (SamplingSetup)0;
{
Sampling.VariableSampleCount = true;
Sampling.MinSampleCount = SkyAtmosphere.FastSkySampleCountMin;
Sampling.MaxSampleCount = SkyAtmosphere.FastSkySampleCountMax;
Sampling.DistanceToSampleCountMaxInv = SkyAtmosphere.FastSkyDistanceToSampleCountMaxInv;
}
const bool Ground = false;
const float DeviceZ = FarDepthValue;
const bool MieRayPhase = true;
const float AerialPespectiveViewDistanceScale = 1.0f;
SingleScatteringResult ss = IntegrateSingleScatteredLuminance(
float4(PixPos, 0.0f, 1.0f), WorldPos, WorldDir,
Ground, Sampling, DeviceZ, MieRayPhase,
AtmosphereLightDirection0, AtmosphereLightDirection1, View.AtmosphereLightColor[0].rgb, View.AtmosphereLightColor[1].rgb,
AerialPespectiveViewDistanceScale);
SkyViewLutUAV[int2(PixPos)] = ss.L;
}
2.1. Parameter Setup
Similar to the Transmittance LUT's parameter setup: pass the UV into the function below and it fills in the two variables WorldDir and ViewHeight:
UvToSkyViewLutParams(WorldDir, ViewHeight, UV);
2.2. Core
It's that function again, IntegrateSingleScatteredLuminance(), except this time we take L from its return value. (Honestly, why pile the computations for different LUTs into a single function...)
So let's continue analyzing it:
float3 S = ExposedLight0Illuminance * (PlanetShadow0 * TransmittanceToLight0 * PhaseTimesScattering0 + MultiScatteredLuminance0 * Medium.Scattering);
float3 Sint = (S - S * SampleTransmittance) / Medium.Extinction;
L += Throughput * Sint;
Throughput *= SampleTransmittance;
Start with the fourth line: why are the SampleTransmittance values multiplied together? Define the distance measure in space as \(d\).
Transmittance is \(T_r(x\rightarrow y)=e^{-\sigma_t||x-y||}=e^{-\sigma_t d_{x,y}}\)
and multiplying transmittances gives \(T_1T_2T_3=e^{-\sigma_t(d_1+d_2+d_3)}=e^{-\sigma_t d}\)
so accumulating transmittance is done by multiplication.
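That multiplicative accumulation is easy to verify numerically; a tiny Python check with illustrative values:

```python
import math

sigma_t = 0.3                 # extinction coefficient (illustrative)
segments = [1.0, 2.5, 0.7]    # segment lengths d1, d2, d3

# Throughput-style accumulation: multiply per-segment transmittances...
throughput = 1.0
for d in segments:
    throughput *= math.exp(-sigma_t * d)

# ...which equals a single transmittance over the total distance.
direct = math.exp(-sigma_t * sum(segments))
```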
Next, let's put the code into formulas.
This may look intimidating at first, but it is exactly the approach from the SIGGRAPH 2015 Advances in Real-Time Rendering course: integrate the froxels' scattering and extinction along the view ray to solve for \(L_i\) and \(T_r\) (froxel: short for frustum voxel).
Concretely, for each ray-marching step it is the product integral of the single scattered light and the transmittance over that step:
\(\int_0^{dt} S\, e^{-\sigma_t x}\, dx = \frac{S\,(1-e^{-\sigma_t\, dt})}{\sigma_t}\)
which is exactly what the Sint line computes. As for the single scattering equation itself, I covered it in the tutorial; and as also mentioned there, after simplification an isotropic phase function suffices. Pulling the constants out of the integral, what remains inside is just the product of the single scattered light and the transmittance.
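The Sint line is the closed form of that per-step product integral, assuming the in-scattered light S and the extinction are constant within the step. A quick numerical cross-check in Python (function names are mine, not UE's):

```python
import math

def analytic_inscatter(S, extinction, dt):
    """Closed form used by the shader: (S - S * T) / sigma_t, with T = exp(-sigma_t * dt)."""
    sample_transmittance = math.exp(-extinction * dt)
    return (S - S * sample_transmittance) / extinction

def numeric_inscatter(S, extinction, dt, n=100000):
    """Midpoint-rule evaluation of the integral of S * exp(-sigma_t * x) over [0, dt]."""
    h = dt / n
    return sum(S * math.exp(-extinction * (i + 0.5) * h) * h for i in range(n))

a = analytic_inscatter(S=1.0, extinction=0.5, dt=2.0)
b = numeric_inscatter(S=1.0, extinction=0.5, dt=2.0)
```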
At this point the origin of the name IntegrateSingleScatteredLuminance should be clear.
And with that, the Sky-View LUT computation is done; L is then stored in the UAV.
3. Aerial Perspective LUT
SingleScatteringResult ss = IntegrateSingleScatteredLuminance(
float4(PixPos, 0.0f, 1.0f), RayStartWorldPos, WorldDir,
Ground, Sampling, DeviceZ, MieRayPhase,
View.AtmosphereLightDirection[0].xyz, View.AtmosphereLightDirection[1].xyz, View.AtmosphereLightColor[0].rgb, View.AtmosphereLightColor[1].rgb,
AerialPespectiveViewDistanceScale,
tMaxMax);
const float Transmittance = dot(ss.Transmittance, float3(1.0f / 3.0f, 1.0f / 3.0f, 1.0f / 3.0f));
CameraAerialPerspectiveVolumeUAV[ThreadId] = float4(ss.L, Transmittance);
- One point is obvious from the code: CameraAerialPerspectiveVolumeUAV[ThreadId] is indexed with a 3D vector.
- The Aerial Perspective LUT is two volume textures. The concrete steps: first use the previously covered IntegrateSingleScatteredLuminance to compute the radiance and the transmittance, then store both in a single float4.
// +0.5 to always have a distance to integrate over
float Slice = ((float(ThreadId.z) + 0.5f) * SkyAtmosphere.CameraAerialPerspectiveVolumeDepthResolutionInv);
Slice *= Slice; // squared distribution
Slice *= SkyAtmosphere.CameraAerialPerspectiveVolumeDepthResolution;
- Recall from the tutorial: the Aerial Perspective LUT slices the frustum into many slices, much like CSM.
- From the way Slice is computed, ThreadId.z serves as the volume texture's depth index.
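The slice remapping above can be reproduced in Python to see the squared distribution at work (the resolution of 16 is illustrative):

```python
def slice_to_depth(thread_id_z, depth_resolution):
    """Map a volume-texture slice index to its depth slice.

    +0.5 so even slice 0 has a non-zero distance to integrate over; the
    squaring packs more slices close to the camera, where aerial
    perspective changes fastest.
    """
    s = (thread_id_z + 0.5) / depth_resolution   # normalize to (0, 1)
    s *= s                                        # squared distribution
    return s * depth_resolution

depths = [slice_to_depth(z, 16.0) for z in range(16)]
```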
4. Multiple Scattering LUT
Remember the formula I mentioned in the tutorial?
\(F_{ms}\) is computed as follows; both the code and the formula are straightforward (I'll defer the origin of MultiScatAs1 and InScatteredLuminance; that's covered below):
// For a series, sum_{n=0}^{+inf} r^n = 1 + r + r^2 + r^3 + ... = 1 / (1 - r)
const float3 R = MultiScatAs1;
const float3 SumOfAllMultiScatteringEventsContribution = 1.0f / (1.0f - R);
Then comes the computation of \(L_{2^{nd}order}\):
float3 L = InScatteredLuminance * SumOfAllMultiScatteringEventsContribution;
Finally, \(\Psi_{ms}\) is written to the LUT:
MultiScatteredLuminanceLutUAV[int2(PixPos)] = L * Atmosphere.MultiScatteringFactor;
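The 1/(1-R) in the code is just the limit of the geometric series in the comment; a quick numerical check (the r value is illustrative, and convergence requires r < 1):

```python
r = 0.4  # stand-in for MultiScatAs1: energy passed on per extra scattering order

# The partial sum of 1 + r + r^2 + ... converges to the closed form 1 / (1 - r)
partial = sum(r ** n for n in range(64))
closed_form = 1.0 / (1.0 - r)
```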
Core
SingleScatteringResult r0 = IntegrateSingleScatteredLuminance(float4(PixPos, 0.0f, 1.0f), WorldPos, WorldDir, Ground, Sampling, DeviceZ, MieRayPhase,
LightDir, NullLightDirection, OneIlluminance, NullLightIlluminance, AerialPespectiveViewDistanceScale);
SingleScatteringResult r1 = IntegrateSingleScatteredLuminance(float4(PixPos, 0.0f, 1.0f), WorldPos, -WorldDir, Ground, Sampling, DeviceZ, MieRayPhase,
LightDir, NullLightDirection, OneIlluminance, NullLightIlluminance, AerialPespectiveViewDistanceScale);
float3 IntegratedIlluminance = (SphereSolidAngle / 2.0f) * (r0.L + r1.L);
float3 MultiScatAs1 = (1.0f / 2.0f)*(r0.MultiScatAs1 + r1.MultiScatAs1);
float3 InScatteredLuminance = IntegratedIlluminance * IsotropicPhase;
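These last lines approximate integrals over the full sphere with just two samples (±WorldDir). The same arithmetic in Python (names are mine, not UE's; IsotropicPhase is 1/(4π)):

```python
import math

SPHERE_SOLID_ANGLE = 4.0 * math.pi
ISOTROPIC_PHASE = 1.0 / (4.0 * math.pi)

def two_sample_sphere_integrals(L_up, L_down, ms_up, ms_down):
    """Two-direction estimate of the spherical integrals, as in the shader."""
    integrated_illuminance = (SPHERE_SOLID_ANGLE / 2.0) * (L_up + L_down)
    multi_scat_as_1 = 0.5 * (ms_up + ms_down)
    in_scattered = integrated_illuminance * ISOTROPIC_PHASE
    return in_scattered, multi_scat_as_1

# With the same radiance in both directions, the phase-weighted integral
# collapses back to that radiance: (4*pi/2) * 2L * (1/(4*pi)) = L
ins, ms = two_sample_sphere_integrals(0.25, 0.25, 0.1, 0.1)
```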