Unity CommandBuffer


  CommandBuffer has been available since Unity 5, but because most display-effect requirements in our projects could be covered by existing plugins, and the designers tended to follow mainstream designs, there was never a real development need for it, so I had never paid attention to it.

  Let's start by downloading the official samples. Three examples are provided; the most interesting is the second, DeferredCustomLights, which also doubles as a review of the Unity Built-In Shader API. This fake-light example covers drawing simple geometry, fake lighting, and targeting a specific rendering path:

  The fake light uses a common rendering trick: the position and volume of a 3D object in world space bound the area to be rendered, then each pixel's actual world position is reconstructed from screen space and used to compute the lighting a second time. Some dynamic decal plugins use exactly this method. Setting that aside for now, let's see how the sample builds its rendering flow.

  PS: I once used Unity's built-in Projector to display a spell-range indicator (think of a Blizzard-style AoE circle), and its performance was appalling.

  Just this thing:

  

  First, let's pull up the list of lifecycle events where a CommandBuffer can be inserted:
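  Whichever event you choose, the attach/detach pattern is the same. A minimal sketch (class name, buffer name, and the chosen CameraEvent are illustrative; deferred-only events such as AfterGBuffer will not fire on a forward camera):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class LifecycleHook : MonoBehaviour
{
    CommandBuffer m_Buffer;
    Camera m_Camera;

    void OnEnable()
    {
        m_Camera = GetComponent<Camera>();
        m_Buffer = new CommandBuffer { name = "MyBuffer" };
        // Commands recorded into the buffer replay every frame at this event.
        m_Camera.AddCommandBuffer(CameraEvent.BeforeImageEffects, m_Buffer);
    }

    void OnDisable()
    {
        // Remove with the same event used when adding, then release the buffer.
        m_Camera.RemoveCommandBuffer(CameraEvent.BeforeImageEffects, m_Buffer);
        m_Buffer.Release();
    }
}
```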

  I have shipped on mobile, on PC -> HTC Vive, and on Microsoft HoloLens, so I know mobile hardware is all over the place: whether a feature is supported depends not only on the graphics API version but also on the GPU. For example, from the iOS docs:

Apple GPU Hardware

Together, the A7, A8, A9, A10, and A11 GPUs create a new generation of graphics hardware that support both Metal and OpenGL ES 3.0. To get the most out of a 3D, graphics-dominated app running on these GPUs, use Metal. Metal provides extremely low-overhead access to these GPUs, enabling incredibly high performance for your sophisticated graphics rendering and computational tasks. Metal eliminates many performance bottlenecks—such as costly state validation—that are found in traditional graphics APIs. Both Metal and OpenGL ES 3.0 incorporate many new features, such as multiple render targets and transform feedback, that have not been available on mobile processors before. This means that advanced rendering techniques that have previously been available only on desktop machines, such as deferred rendering, can now be used in iOS apps. Refer to the Metal Programming Guide for more information about what features are visible to Metal apps.

   Android is even more fragmented. Across different rendering paths, only the steps from the Skybox onward are shared between them, so odds are you will end up implementing both rendering paths...
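  Before committing to one path, it is worth checking what the device actually gives you at runtime. A sketch of the checks implied above (a camera set to Deferred silently falls back to Forward on unsupported hardware, so `actualRenderingPath` can differ from the inspector setting):

```csharp
using UnityEngine;

public static class PathCheck
{
    public static void Log(Camera cam)
    {
        Debug.Log(SystemInfo.graphicsDeviceType);   // e.g. Metal, OpenGLES3, Direct3D11
        Debug.Log(SystemInfo.graphicsShaderLevel);  // 30/35/45/50 -> supported shader model
        Debug.Log(cam.actualRenderingPath);         // may differ from cam.renderingPath
    }
}
```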

  Creating a CommandBuffer (code taken from the official sample, heavily abridged) is very simple:

  buf.m_AfterLighting = new CommandBuffer();
  buf.m_AfterLighting.name = "Deferred custom lights";
  cam.AddCommandBuffer(CameraEvent.AfterLighting, buf.m_AfterLighting);

  Since this is fake lighting, it can simply be rendered additively after the built-in lighting pass. The code below then pushes the per-light data so that light positions and parameters stay up to date:

        // construct command buffer to draw lights and compute illumination on the scene
        foreach(var o in system.m_Lights)
        {
            // light parameters we'll use in the shader
            param.x = o.m_TubeLength;
            param.y = o.m_Size;
            param.z = 1.0f / (o.m_Range * o.m_Range);
            param.w = (float)o.m_Kind;
            buf.m_AfterLighting.SetGlobalVector(propParams, param);
            // light color
            buf.m_AfterLighting.SetGlobalColor(propColor, o.GetLinearColor());

            // draw sphere that covers light area, with shader
            // pass that computes illumination on the scene
            var scale = Vector3.one * o.m_Range * 2.0f;
            trs = Matrix4x4.TRS(o.transform.position, o.transform.rotation, scale);
            buf.m_AfterLighting.DrawMesh(m_SphereMesh, trs, m_LightMaterial, 0, 0);
        }
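  Since the light data changes every frame, the sample rebuilds the buffer rather than appending to it. A sketch of that pattern, reusing the sample's `buf`/`system` names, with the method placement simplified for illustration:

```csharp
// Rebuild the command buffer every frame: Clear() discards last frame's
// commands; without it the DrawMesh calls would pile up frame after frame.
void Update()
{
    buf.m_AfterLighting.Clear();
    foreach (var o in system.m_Lights)
    {
        // ... SetGlobalVector / SetGlobalColor / DrawMesh as shown above ...
    }
}
```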

  And with that, a render pass is complete. That is... pretty much everything there is to say about CommandBuffer... really...

  There genuinely isn't much more to it. It offers many Draw and Blit methods and lets you hook into every lifecycle event; you can use it to replace ordinary post-processing, or stash intermediate results and carry them across lifecycle stages. It's simple but powerful, and what you get out of it depends on how you use it. Take Unity's built-in FrameDebugger: if you captured the image at every lifecycle event, you could build something similar yourself...

  Let me give it a try. First, create a few CommandBuffers and attach them at different lifecycle events:

using UnityEngine;
using UnityEngine.Rendering;

public class CommandBufferTest : MonoBehaviour
{
    public Shader imageShader;

    void Start()
    {
        var cam = Camera.main;
        var mat = new Material(imageShader);

        {
            var tex0 = Shader.PropertyToID("GetTex0");

            CommandBuffer cmd1 = new CommandBuffer();
            cmd1.name = "GetTex0";

            cmd1.GetTemporaryRT(tex0, Screen.width + 1, Screen.height + 1, 0, FilterMode.Bilinear, RenderTextureFormat.ARGB32);   // temporary RTs of identical size resolve to the same underlying RT, hence the odd sizes...
            cmd1.Blit(BuiltinRenderTextureType.GBuffer0, tex0);
            cmd1.SetGlobalTexture(Shader.PropertyToID("_Tex0"), tex0);
            cmd1.ReleaseTemporaryRT(tex0);

            cam.AddCommandBuffer(CameraEvent.AfterGBuffer, cmd1);
        }

        {
            var tex1 = Shader.PropertyToID("GetTex1");

            CommandBuffer cmd2 = new CommandBuffer();
            cmd2.name = "GetTex1";

            cmd2.GetTemporaryRT(tex1, Screen.width + 2, Screen.height + 2, 0, FilterMode.Bilinear, RenderTextureFormat.ARGB32);   // temporary RTs of identical size resolve to the same underlying RT, hence the odd sizes...
            cmd2.Blit(BuiltinRenderTextureType.CameraTarget, tex1);
            cmd2.SetGlobalTexture(Shader.PropertyToID("_Tex1"), tex1);
            cmd2.ReleaseTemporaryRT(tex1);

            cam.AddCommandBuffer(CameraEvent.BeforeLighting, cmd2);
        }

        {
            CommandBuffer cmd3 = new CommandBuffer();
            cmd3.name = "_PostEffect";

            cmd3.Blit(BuiltinRenderTextureType.CurrentActive, BuiltinRenderTextureType.CameraTarget, mat);
            cam.AddCommandBuffer(CameraEvent.AfterImageEffects, cmd3);
        }
    }
}

  cmd1 grabs the GBuffer0 texture once the G-buffer pass has finished, cmd2 grabs the output right before lighting, and they store the results in the global variables _Tex0 and _Tex1. cmd3 then samples both textures during the image-effects stage and writes to the screen, displaying the output of different lifecycle stages just like the FrameDebugger does. The shader is below: GBuffer0 is drawn on the left half, the pre-lighting output on the right half:

        sampler2D _MainTex;
        sampler2D _Tex0;
        sampler2D _Tex1;

        fixed4 frag (v2f i) : SV_Target
        { 
            if (i.uv.x < 0.5) 
            {
                return float4(tex2D(_Tex0, float2(i.uv.x * 2.0, i.uv.y)).rgb, 1);
            }
            else
            {
                return float4(tex2D(_Tex1, float2((i.uv.x - 0.5) * 2.0, i.uv.y)).rgb,1);
            }
        }

  The original scene and the output:

 

  This is information captured across lifecycle stages, something a shader alone cannot do: even though the G-buffer textures are accessible, the pre-lighting image is not. Once you see this, the usefulness of CommandBuffer becomes clear. Opening the FrameDebugger shows:

  The GBuffer0 captured here does not match what is displayed on the right: the displayed GBuffer0 is the opposite of what I captured, while GBuffer1 is what I actually got. It looks like BuiltinRenderTextureType.GBuffer0 and BuiltinRenderTextureType.GBuffer1 are swapped......

 

   In any case, capturing the pre-lighting image works correctly.

   A few notes on function behavior:

1. CommandBuffer's Blit changes the active RenderTarget

CommandBuffer.Blit(source, dest, mat) // leaves the RenderTarget changed
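A sketch of the workaround: if later commands in the same buffer assume the camera target is still bound, rebind it explicitly after the Blit (`cmd`, `sourceID`, and `destID` are illustrative names):

```csharp
// Blit(source, dest, mat) leaves "dest" bound as the active render target,
// so restore the camera target before recording commands that depend on it.
cmd.Blit(sourceID, destID, mat);
cmd.SetRenderTarget(BuiltinRenderTextureType.CameraTarget);
```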

2. How to reproduce GL-style full-screen drawing with a CommandBuffer:

// GL
    Graphics.SetRenderTarget(destination);

    GL.PushMatrix();
    GL.LoadOrtho();

    GL.Begin(GL.QUADS);
    {
        // Quad ...
    }
    GL.End();

// CommandBuffer -- on OpenGL the y axis must be flipped here
    quad = new Mesh();            
    quad.vertices = new Vector3[]
    {
        new Vector3(-1f, y1, 0f), // Bottom-Left
        new Vector3(-1f, y2, 0f), // Upper-Left
        new Vector3( 1f, y2, 0f), // Upper-Right
        new Vector3( 1f, y1, 0f)  // Bottom-Right
    };
    
    quad.uv = new Vector2[]
    {
        new Vector2(0f, 0f), 
        new Vector2(0f, 1f), 
        new Vector2(1f, 1f), 
        new Vector2(1f, 0f)
    };
    
    quad.colors = new Color[]
    {
        ...
    };
    
    quad.triangles = new int[] { 0, 1, 2, 2, 3, 0 };
    CommandBuffer.SetRenderTarget(...)
    CommandBuffer.DrawMesh(quad, Matrix4x4.identity, ...);

 

=================================================================================

  Some calculations worth keeping; they may come in handy elsewhere:

  1. Simple light computation (the drawn mesh itself acts as the light volume)

    struct v2f {
        float4 pos : SV_POSITION;
        float4 uv : TEXCOORD0;
        float3 ray : TEXCOORD1;
    };    
    
    // Common lighting data calculation (direction, attenuation, ...)
    void DeferredCalculateLightParams(
        v2f i,
        out float3 outWorldPos,
        out float2 outUV,
        out half3 outLightDir,
        out float outAtten,
        out float outFadeDist)
    {
        i.ray = i.ray * (_ProjectionParams.z / i.ray.z);
        float2 uv = i.uv.xy / i.uv.w;

        // read depth and reconstruct world position
        float depth = SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, uv);
        depth = Linear01Depth(depth);
        float4 vpos = float4(i.ray * depth, 1);
        float3 wpos = mul(unity_CameraToWorld, vpos).xyz;

        float fadeDist = UnityComputeShadowFadeDistance(wpos, vpos.z);

        float3 lightPos = float3(unity_ObjectToWorld[0][3], unity_ObjectToWorld[1][3], unity_ObjectToWorld[2][3]);
        float3 tolight = wpos - lightPos;
        half3 lightDir = -normalize(tolight);

        float att = dot(tolight, tolight) * _LightPos.w; // _LightPos.w = 1 / (range * range), range = the light's maximum reach
        float atten = tex2D (_LightTextureB0, att.rr).UNITY_ATTEN_CHANNEL;

        atten *= UnityDeferredComputeShadow(tolight, fadeDist, uv);

        outWorldPos = wpos;
        outUV = uv;
        outLightDir = lightDir;
        outAtten = atten;
        outFadeDist = fadeDist;
    }
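  The attenuation line above can be checked with plain numbers: multiplying the squared distance by 1/(range*range), the same value `param.z` carries on the C# side, yields (d/range)^2, which reaches exactly 1.0 at the light's range; `_LightTextureB0` then maps that coordinate to a falloff value. A pure-math sketch, outside any Unity type (the function name is illustrative):

```csharp
// att = dot(tolight, tolight) * (1 / range^2) = (d / range)^2
static float Atten01(float d, float range)
{
    float invSqrRange = 1.0f / (range * range);
    return d * d * invSqrRange; // fed as .rr into the _LightTextureB0 lookup
}
// Atten01(5f, 10f) == 0.25f, Atten01(10f, 10f) == 1.0f
```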

 

