Wednesday, February 19, 2014

Global Illumination - lifting the curtain

As promised, I will elaborate on the topic of global illumination and provide a bit of source code to illustrate the method mentioned in a previous post.
Light probes are spread across critical areas, manually or automatically. Incoming lighting at these points in space is captured in cube maps, which are then converted to spherical harmonics.
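To give a feel for the capture step, here is a minimal CPU-side sketch (not the engine's actual capture code): a radiance function, standing in for the cube-map fetch, is projected onto 9 spherical harmonics coefficients by Monte Carlo integration over the sphere. The basis ordering matches the shader code further down; the toy environment is a made-up example.

```python
# Project a spherical radiance function onto 3rd-order (9-coefficient) SH
# by Monte Carlo integration. Same basis ordering as the shader code.
import math, random

def sh9_basis(x, y, z):
    return [
        0.282095,                            # band 0
        0.488603 * y,                        # band 1
        0.488603 * z,
        0.488603 * x,
        1.092548 * x * y,                    # band 2
        1.092548 * y * z,
        0.315392 * (3.0 * z * z - 1.0),
        1.092548 * x * z,
        0.546274 * (x * x - y * y),
    ]

def project_sh9(radiance, n_samples=200000, seed=1):
    rng = random.Random(seed)
    coeffs = [0.0] * 9
    for _ in range(n_samples):
        z = rng.uniform(-1.0, 1.0)           # uniform direction on the sphere
        phi = rng.uniform(0.0, 2.0 * math.pi)
        s = math.sqrt(max(0.0, 1.0 - z * z))
        x, y = s * math.cos(phi), s * math.sin(phi)
        L = radiance(x, y, z)                # stands in for a cube-map fetch
        for k, b in enumerate(sh9_basis(x, y, z)):
            coeffs[k] += L * b
    # Monte Carlo estimate: scale by sphere area / sample count
    return [c * 4.0 * math.pi / n_samples for c in coeffs]

# Toy environment: light arriving from +Z only, intensity max(z, 0)
coeffs = project_sh9(lambda x, y, z: max(z, 0.0))
print(coeffs[0], coeffs[2])   # DC term and the z-linear term dominate
```

In practice one coefficient set is stored per color channel, which is why the shader below takes a `float3[9]` array.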
A short video shows these probes in action.




The HLSL code for sampling the spherical harmonics coefficients looks like this: you provide a normal direction and the function returns the illumination coming from that direction.
// 'lightingSH' is the lighting environment projected onto SH (3rd order in this case),
// and 'n' is the surface normal
float3 ProjectOntoSH9(in float3 lightingSH[9], in float3 n)
{
    float3 result = 0.0f;

    // Cosine kernel
    const float A0 = 1.0f;
    const float A1 = 2.0f / 3.0f;
    const float A2 = 0.25f;

    // Band 0
    result += lightingSH[0] * 0.282095f * A0;

    // Band 1
    result += lightingSH[1] * 0.488603f * n.y * A1;
    result += lightingSH[2] * 0.488603f * n.z * A1;
    result += lightingSH[3] * 0.488603f * n.x * A1;

    // Band 2
    result += lightingSH[4] * 1.092548f * n.x * n.y * A2;
    result += lightingSH[5] * 1.092548f * n.y * n.z * A2;
    result += lightingSH[6] * 0.315392f * (3.0f * n.z * n.z - 1.0f) * A2;
    result += lightingSH[7] * 1.092548f * n.x * n.z * A2;
    result += lightingSH[8] * 0.546274f * (n.x * n.x - n.y * n.y) * A2;

    return result;
}
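For readers who want to play with the weights outside a shader, here is a direct Python port of the function above (scalar coefficients instead of float3, for brevity), with a quick sanity check: when only the band-0 (DC) coefficient is set, the result is the same for every normal, as a constant environment should light all directions equally.

```python
# Python port of the shader's ProjectOntoSH9, scalar coefficients.
def project_onto_sh9(lighting_sh, n):
    x, y, z = n
    A0, A1, A2 = 1.0, 2.0 / 3.0, 0.25   # cosine kernel
    result  = lighting_sh[0] * 0.282095 * A0
    result += lighting_sh[1] * 0.488603 * y * A1
    result += lighting_sh[2] * 0.488603 * z * A1
    result += lighting_sh[3] * 0.488603 * x * A1
    result += lighting_sh[4] * 1.092548 * x * y * A2
    result += lighting_sh[5] * 1.092548 * y * z * A2
    result += lighting_sh[6] * 0.315392 * (3.0 * z * z - 1.0) * A2
    result += lighting_sh[7] * 1.092548 * x * z * A2
    result += lighting_sh[8] * 0.546274 * (x * x - y * y) * A2
    return result

# Sanity check: DC-only lighting is independent of the normal.
dc_only = [1.0] + [0.0] * 8
print(project_onto_sh9(dc_only, (0.0, 0.0, 1.0)))
print(project_onto_sh9(dc_only, (1.0, 0.0, 0.0)))
```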


Here is the code for rendering the light probes (for debugging purposes):
technique RenderSH
{
    pass p0
    {
        VertexShader = compile vs_3_0 SimpleVSTransformed();
        PixelShader = compile ps_3_0 psLightingRenderSH();
        CullMode = CCW;
        FillMode = solid;
        ZEnable = true;
        StencilEnable = true;
        AlphaBlendEnable = false;
        AlphaTestEnable = false;
        ZWriteEnable = true;
    }
};
void SimpleVSTransformed(in float4 inPos: POSITION, in float2 inTex: TEXCOORD0,
                         out float4 outPos: POSITION, out float2 outTex: TEXCOORD0,
                         out float4 wPos: TEXCOORD1)
{
    outTex = inTex;
    outPos = mul(float4(inPos.xyz, 1), c_mViewProjection);
    wPos = mul(float4(inPos.xyz, 1), c_mWorld);
}
float4 psLightingRenderSH(PS_INPUT_LIGHT i, in float4 wPos: TEXCOORD1) : COLOR0
{
    // Direction from the probe center to the current pixel
    float3 vLightDir = normalize(wPos.xyz - lightProbePos);

    // Sample the SH coefficients along the reversed direction
    float3 probeCol = ProjectOntoSH9(SHarmonicsCoefficients, -vLightDir);
    return float4(probeCol, 1.0);
}
As you can see, a sphere mesh is rendered for each probe. Briefly, this is what happens: the vertex shader sends world-space vertex positions (yep, a sphere mesh has vertices spread around its center) to the pixel shader via the TEXCOORD1 slot. The pixel shader then runs for every pixel: it takes the position of the light probe currently being rendered, takes the pixel position in world space, subtracts the two to form a direction, and samples the spherical harmonics coefficients along that direction to obtain the pixel color.
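Stripped of the shader plumbing, the per-pixel work is just a few vector operations. A sketch of that math (the positions here are made-up illustration values; `lightProbePos` and `wPos` correspond to the shader variables above):

```python
# Per-pixel math of psLightingRenderSH: direction from the probe center
# to the shaded point, negated before the SH lookup.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

light_probe_pos = (10.0, 2.0, 5.0)   # hypothetical probe center
w_pos = (10.0, 2.0, 5.5)             # world-space pixel position on the sphere

v_light_dir = normalize(tuple(w - p for w, p in zip(w_pos, light_probe_pos)))
lookup_dir = tuple(-c for c in v_light_dir)   # what gets passed to ProjectOntoSH9
print(lookup_dir)
```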
