
Light Map in "Seal Guardian"

Introduction
Light mapping is a common technique used in games for storing lighting data. "Seal Guardian" uses light maps because of their low run-time cost, which lets us support a large variety of hardware from iOS and Mac to PC. There are many methods to bake a light map, such as photon mapping and radiosity. Our baking method is similar to the radiosity hemicube [1], but we render a full cube map for each light map texel to store the incoming lighting data instead.

Scene with light map
Scene without light map


Light Map Atlas
In each level, a light map is built for all static meshes, each of which has a second, unique UV set. We gather all those static meshes and pack them into one large light map atlas using this method [2]; other methods could be chosen, we just picked a simple one.

Packing a single large light map atlas for all static meshes in the scene
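The packer from [2] is a small binary-tree algorithm, simple enough to sketch. Below is an illustrative Python outline (a hedged sketch, not the "Seal Guardian" tool code; the atlas and chart sizes are made up): each node is free, used, or split in two, and a chart goes into the first free leaf that fits.

```python
# Binary-tree rectangle packer in the style of [2]. Illustrative only.
class Node:
    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h
        self.children = None   # (left, right) after a split
        self.used = False

    def insert(self, w, h):
        """Return (x, y) of a free region for a w*h chart, or None."""
        if self.children:                      # internal node: recurse
            return self.children[0].insert(w, h) or \
                   self.children[1].insert(w, h)
        if self.used or w > self.w or h > self.h:
            return None
        if w == self.w and h == self.h:        # perfect fit: claim it
            self.used = True
            return (self.x, self.y)
        # split along the axis with more leftover space
        dw, dh = self.w - w, self.h - h
        if dw > dh:
            self.children = (Node(self.x, self.y, w, self.h),
                             Node(self.x + w, self.y, dw, self.h))
        else:
            self.children = (Node(self.x, self.y, self.w, h),
                             Node(self.x, self.y + h, self.w, dh))
        return self.children[0].insert(w, h)

atlas = Node(0, 0, 1024, 1024)
placements = [atlas.insert(w, h)
              for (w, h) in [(512, 512), (512, 256), (256, 256)]]
```

Each static mesh's second UV set is then offset/scaled into the region the packer returns for its chart.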

Compute Light Map Texel Position
Then we render all the meshes into an RGBA32Float world-position render target using the light map atlas layout created before (a vertex shader transforms each vertex's 3D world position to its unique 2D light map UV). We then read back the render target and store all the written texels, which correspond to the world position of each light map texel. Those positions will be used for rendering the cube maps for radiosity.

Each square represents a single light map texel;
we read back each texel's world space position to render a cube map for radiosity
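In spirit, this pass boils down to "write each world position at its lightmap UV, then read the target back". Below is a minimal Python stand-in for that GPU pass (every name here is hypothetical; the real version interpolates positions across triangles in a vertex/pixel shader rather than writing only at vertices):

```python
def uv_to_texel(u, v, size):
    # lightmap UV in [0,1] -> integer texel coordinate
    return (int(u * size), int(v * size))

def bake_positions(vertices, size):
    """vertices: list of (u, v, world_pos). Writes each world position
    into a size*size 'RGBA32Float' target keyed by texel coordinate."""
    target = {}
    for u, v, world in vertices:
        target[uv_to_texel(u, v, size)] = world
    return target

# two sample points of a mesh, placed in a tiny 4x4 lightmap
texels = bake_positions([(0.25, 0.25, (1.0, 0.0, 2.0)),
                         (0.75, 0.25, (3.0, 0.0, 2.0))], size=4)
```

The keys of `texels` are exactly the texels that need a radiosity cube map render.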

Radiosity Baking
As mentioned before, we use a method similar to the hemicube but render a full cube map instead: at each light map texel we render a cube map with all post-processing effects and tone mapping turned off, storing only the lighting data. Because our light map is intended to store the incoming indirect static lighting for each texel, we convert the incoming-lighting cube map rendered at each texel to 2nd order spherical harmonics coefficients (i.e. 4 coefficients for each RGB channel); the conversion method can be found in "Stupid Spherical Harmonics (SH) Tricks" [3]. So we need 1 RGBA32Float (or RGBA16Float) cube map and 3 temporary RGBA32Float (or RGBA16Float) textures for each radiosity iteration.

No light map, direct lighting and emissive materials only 
Lighting without albedo texture
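The cube-map-to-SH step can be sketched as a weighted sum over texels, following the projection in [3]. This assumes the cube map has already been unpacked into (direction, solid angle, RGB) samples, and uses the common unsigned band 0/1 basis constants (sign conventions vary between sources, so treat this as an outline):

```python
import math

def sh_basis4(d):
    x, y, z = d
    return (0.282095,            # Y_0^0
            0.488603 * y,        # Y_1^-1
            0.488603 * z,        # Y_1^0
            0.488603 * x)        # Y_1^1

def project_to_sh4(samples):
    """samples: iterable of (direction, solid_angle, rgb).
    Returns 4 SH coefficients for each of the RGB channels."""
    coeffs = [[0.0] * 4 for _ in range(3)]       # [channel][coeff]
    for d, dw, rgb in samples:
        basis = sh_basis4(d)
        for c in range(3):
            for i in range(4):
                coeffs[c][i] += rgb[c] * basis[i] * dw
    return coeffs

# sanity check: a constant white environment sampled along the 6 axes
w = 4.0 * math.pi / 6.0
axes = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
sh = project_to_sh4([(d, w, (1.0, 1.0, 1.0)) for d in axes])
# DC term ~ 0.282095 * 4*pi, band-1 terms cancel to ~0
```

The 12 resulting values per texel are what get written into the 3 temporary float textures.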

Radiosity pass 1
In the first pass, we render all the meshes into the cube map without any analytical light sources (e.g. directional lights). Only emissive materials such as the sky and static lights placed in the scene get rendered, injecting the initial lighting into the radiosity iterations. We support sphere- and box-shaped static lights, which are rendered into the cube map just like emissive meshes. Once the cube map render is complete, we convert the cube map to SH coefficients and store the values. After all the texels are rendered, we have an incoming-lighting light map from emissive meshes and static light sources ready for the next pass.
Light map baked with the emissive sky material
Lighting with light map using the emissive sky

Radiosity pass 2
In the second pass, we render all the meshes with only the analytical light sources and the SH light map from the previous pass into a new cube map to calculate the first-bounce incoming lighting, then convert the cube map to SH coefficients. After all the texels are rendered and converted to an SH light map, we add this SH light map to the previous pass's SH light map, accumulating the lighting of passes 1 and 2 into another 3 accumulated SH light maps for our final storage (the accumulated light maps are not used in the radiosity iterations, only for the final radiosity output).
Light map baked with direct lighting and emissive material
Lighting with light map using direct lighting and emissive sky

Radiosity pass >= 3
For the subsequent passes, we use the SH light map from the previous iteration to render the cube maps, and repeat the convert-to-SH and accumulate steps to get the incoming indirect lighting for each light map texel.

Final baked result, showing both direct and indirect lighting
Lighting using light map, without albedo texture 

Storage Format
To store the light map data for runtime and reduce memory usage (3 SH light maps in float format, i.e. 12 values for each texel, is too much data to store...), we decompose the incoming lighting color data into luma and chroma. We only store the luma data in SH format, and compute an average chroma value by integrating the SH RGB incoming lighting with an SH cosine transfer function along the static mesh normal direction; this gives the reflected Lambertian lighting, from which we compute the chroma value. By doing this, we preserve the directional variation of indirect lighting, keep an average color of the incoming lighting, and reduce the light map storage to 6 values per texel. To further reduce memory usage, we clamp the incoming SH luma values to a predefined range so that we can store them in an 8-bit texture. However, using block compression like DXT results in artifacts, so we simply store the light map data in 2 RGBA8 textures.

Final light map used in run-time, storing SH luma and average chroma
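A rough sketch of this luma/chroma split is below. It is assumption-laden rather than the shipped code: the luma weights, the cosine-lobe constants (π and 2π/3 from the standard irradiance convolution), and the normalization used here for "chroma" are all illustrative choices.

```python
import math

LUMA = (0.299, 0.587, 0.114)            # assumed RGB -> luma weights
A0, A1 = math.pi, 2.0 * math.pi / 3.0   # cosine-lobe convolution constants

def sh_basis4(d):
    x, y, z = d
    return (0.282095, 0.488603 * y, 0.488603 * z, 0.488603 * x)

def split_luma_chroma(sh_rgb, normal):
    """sh_rgb: [3][4] SH coefficients. Returns (sh_luma[4], chroma[3])."""
    # 4 stored luma coefficients: weighted sum of the RGB coefficient sets
    sh_luma = [sum(LUMA[c] * sh_rgb[c][i] for c in range(3))
               for i in range(4)]
    # Lambertian-reflected color along the normal: integrate the RGB SH
    # against the cosine transfer, then normalize to an average chroma
    b = sh_basis4(normal)
    rgb = []
    for c in range(3):
        e = A0 * sh_rgb[c][0] * b[0] + A1 * (sh_rgb[c][1] * b[1] +
                                             sh_rgb[c][2] * b[2] +
                                             sh_rgb[c][3] * b[3])
        rgb.append(max(0.0, e))
    total = sum(rgb) or 1.0
    chroma = [v / total for v in rgb]
    return sh_luma, chroma
```

The 4 luma values plus the chroma give 6 values per texel, which after range clamping fit into 2 RGBA8 textures.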

Conclusion
In this post, we have briefly outlined how the light maps are created in "Seal Guardian". The method is based on a modified radiosity hemicube, using SH as an intermediate representation for baking and to reduce the storage size (by splitting the lighting data into luma and chroma). We skipped some of the baking details, such as padding the lighting data around each UV shell in each radiosity iteration to avoid leaking light from empty light map texels. Also, "Seal Guardian" is rendered using PBR, which means we have metallic materials that don't work well with radiosity. Instead of converting the metallic materials to diffuse, we pre-filter all the environment probes in each radiosity pass to get the lighting for metallic materials. We would also like to improve the light map baking in the future: reducing the baking time, fixing the compression problem (we may try BC6H, but would need to find another method for iOS...), or using a smaller texture size for the chroma light map than for the luma SH light map...

Lastly, if you are interested in "Seal Guardian", feel free to check it out; its Steam store page is live now. It will be released on 8th Dec, 2017 on iOS/Mac/PC. Thank you.

The yellow light bounce on the floor comes from the yellow metallic wall, via pre-filtering the environment map in each radiosity pass
Showing only the indirect lighting



References

[1] https://www.siggraph.org/education/materials/HyperGraph/radiosity/overview_2.htm
[2] http://blackpawn.com/texts/lightmaps/
[3] http://www.ppsloan.org/publications/StupidSH36.pdf

Photon Mapping Part 2

Introduction
Continuing from the previous post, this post describes how the light map is calculated from the photon map. My light map stores the incoming radiance of indirect lighting on a surface, projected into the Spherical Harmonics (SH) basis. 4 SH coefficients are used for each color channel, so 3 textures are used for the RGB channels (12 coefficients in total).

Baking the light map
To bake the light map, the scene must have a set of unique, non-overlapping texture coordinates (UV) that correspond to unique world space positions, so that the incoming radiance at a world position can be represented. This UV set can be generated inside a modeling package or using UVAtlas. In my simple case, the UVs are mapped manually.
To generate the light map, given a mesh with unique UVs and the light map resolution, we rasterize the mesh (using scan-line or half-space rasterization) into texture space with world space positions interpolated across the triangles, so we can associate a world space position with each light map texel. Then, for each texel, we sample the photon map at the corresponding world space position by performing a final gather step, just like the offline rendering in the previous post. This gives the incoming radiance at that world space position, and hence at that texel in the light map. The data is then projected into SH coefficients and stored in 3 16-bit floating point textures. Below is a light map showing the dominant light color extracted from the SH coefficients:

The baked light map showing the dominant
light color from SH coefficients
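The texture-space rasterization step can be sketched as a small half-space rasterizer: test each texel centre against the triangle's edges in UV space, and interpolate world position with the barycentric weights. Purely illustrative, and it skips conservative rasterization and padding concerns.

```python
def rasterize_world_positions(tri_uv, tri_world, size):
    """tri_uv: three (u, v) in [0,1]; tri_world: three (x, y, z).
    Returns {texel: world_position} for texels covered by the triangle.
    Assumes counter-clockwise UV winding."""
    (u0, v0), (u1, v1), (u2, v2) = tri_uv

    def edge(ax, ay, bx, by, px, py):
        # signed area test of point p against edge a->b
        return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

    area = edge(u0, v0, u1, v1, u2, v2)
    out = {}
    for ty in range(size):
        for tx in range(size):
            u, v = (tx + 0.5) / size, (ty + 0.5) / size   # texel centre
            w0 = edge(u1, v1, u2, v2, u, v) / area
            w1 = edge(u2, v2, u0, v0, u, v) / area
            w2 = edge(u0, v0, u1, v1, u, v) / area
            if w0 >= 0 and w1 >= 0 and w2 >= 0:           # inside triangle
                out[(tx, ty)] = tuple(
                    w0 * tri_world[0][i] + w1 * tri_world[1][i]
                    + w2 * tri_world[2][i] for i in range(3))
    return out
```

Each returned world position is where the final gather rays are shot to sample the photon map for that texel.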

Using the light map
After baking the light map, at run-time the direct lighting is rendered in the usual way. A point light is used to approximate the area light of the ray traced version, so the difference is more noticeable at the shadow edges.

direct lighting only, real time version
direct lighting only, ray traced version

Then we sample the SH coefficients from the light map to calculate the indirect lighting:
indirect lighting only, real time version
indirect lighting only, ray traced version

Combining the direct and indirect lighting, the final result becomes:
direct + indirect lighting, real time version
direct + indirect lighting, ray traced version

As we store the light map in SH, we can apply a normal map to the mesh to change the reflected radiance.
Rendered with normal map
Indirect lighting with normal map
We can also apply some tessellation and add some ambient occlusion (AO) to make the result more interesting:
Rendered with light map, normal map, tessellation and AO
Rendered with light map, normal map, tessellation and AO
Conclusion
This post gives an overview of how to bake a light map of indirect lighting data by sampling the photon map. I use SH to store the incoming radiance, but other data could be stored, such as the reflected diffuse radiance of the surface, which would reduce texture storage and not require floating point textures. Besides, the SH coefficients can be stored per vertex in the static mesh instead of in a light map. Lastly, by sampling the photon map with final gather rays, light probes for dynamic objects can also be baked using similar methods.

References
March of the Froblins: http://developer.amd.com/samples/demos/pages/froblins.aspx
Lighting and Material of HALO 3: http://www.bungie.net/images/Inside/publications/presentations/lighting_material.zip

Extracting dominant light from Spherical Harmonics

Introduction
Spherical Harmonics (SH) functions can represent low frequency data such as diffuse lighting; high frequency details are lost after projection to SH. Luckily, we can extract a dominant directional light from the SH coefficients to fake specular lighting. More than 1 directional light can be extracted from the SH coefficients, but this post will only focus on extracting 1 dominant light; those interested can read Stupid Spherical Harmonics (SH) Tricks for the details. A WebGL demo that extracts 1 directional light is provided in the last section.


Extracting dominant light direction
We can get a single dominant light direction from the SH projected environment lighting, Le. Consider approximating the environment light up to band 1 (i.e. l = 1):

Finding the dominant light direction is equivalent to choosing an incoming direction, ω, such that Le(ω) is maximized. In other words, cos θ should equal 1:


So we can extract the dominant light direction for a single color channel. Finally, the overall dominant light direction can be calculated by scaling each RGB channel's dominant direction using the ratios that convert color to gray scale:
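Put together, the direction extraction might look like the sketch below. The coefficient layout and the absence of the Condon-Shortley sign flips are assumptions; the gray-scale weights are the usual 0.299/0.587/0.114 ratios.

```python
import math

def dominant_direction(sh_r, sh_g, sh_b):
    """Each input is (c_1^-1, c_1^0, c_1^1), the band-1 SH coefficients
    of one channel. Returns a unit dominant direction (x, y, z)."""
    weights = (0.299, 0.587, 0.114)       # gray-scale ratios
    d = [0.0, 0.0, 0.0]
    for w, (cm1, c0, c1) in zip(weights, (sh_r, sh_g, sh_b)):
        # band-1 basis is proportional to (y, z, x), so the maximising
        # direction for one channel is proportional to (c11, c1m1, c10)
        d[0] += w * c1
        d[1] += w * cm1
        d[2] += w * c0
    n = math.sqrt(d[0] ** 2 + d[1] ** 2 + d[2] ** 2) or 1.0
    return (d[0] / n, d[1] / n, d[2] / n)
```

For example, an environment whose band-1 energy points along +z in every channel yields the direction (0, 0, 1).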


Extracting dominant light intensity
After extracting the light direction, the remaining problem is to calculate the light intensity. That means we want to calculate an intensity, s, such that the error between the extracted light and the lighting environment is minimized (Le is the original environment light, while Ld is the directional light):

To minimize the error, differentiate the equation and solve for where the derivative equals zero:

If both lighting functions are projected into SH, the intensity can be simplified to:

The next step is to project the directional light (with unit intensity) into the SH basis (ci is the SH coefficient of the projected directional light):

Therefore, the SH coefficients of the projected directional light can be calculated by substituting the light direction into the corresponding SH basis functions.

As the SH projected directional light has unit intensity, we want to scale it by a factor so that the extracted light intensity, s, is a light color ready for use in the direct lighting equation, which is defined as follows (a detailed explanation can be found in [4]):
For artist convenience, clight does not correspond to a direct radiometric measure of the light’s intensity; it is specified as the color a white Lambertian surface would have when illuminated by the light from a direction parallel to the surface normal (lc = n).
So we need to calculate a scaling factor, c, that scales the SH projected directional light such that:


We can project both L(ω) and (n · ω) into SH to calculate the integral. To project the transfer function (n · ω) into SH, we can first align n with the +Z axis, making it zonal harmonics, then rotate the ZH coefficients into any direction using the equation:

The ZH coefficients of (n · ω) are as follows (note that the result differs from the Normalization section of Stupid Spherical Harmonics (SH) Tricks, as we have taken the π term outside the integral):


Then, rotating the ZH coefficients so that the normal direction equals the light direction, ld (because we need ld = n as stated above), we have:

Finally, we can go back and compute the scaling factor, c, for the SH projected directional light (we calculate up to band l = 2):

Therefore, the steps to extract the dominant light intensity are: first, project the directional light into SH with the scaling factor c; then the light color, s, can be calculated by:
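The least-squares core of the derivation (before the clight scaling factor c is applied) can be sketched per channel as below; the band 0/1 basis constants are the usual unsigned ones, and this outline stops at band 1 while the article computes up to band 2.

```python
def sh_basis4(d):
    x, y, z = d
    return (0.282095, 0.488603 * y, 0.488603 * z, 0.488603 * x)

def dominant_intensity(sh_env, direction):
    """sh_env: SH coefficients of one channel of Le; direction: unit
    dominant light direction. Returns the least-squares intensity s
    minimizing the SH-space error |Le - s * Ld|^2."""
    ld = sh_basis4(direction)      # unit directional light projected to SH
    num = sum(e * d for e, d in zip(sh_env, ld))
    den = sum(d * d for d in ld)
    return num / den
```

As a sanity check, the SH projection of a directional light of intensity 2 along +z yields s = 2 for that channel.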



WebGL Demo
A WebGL demo (requires a WebGL-enabled browser such as Chrome) is provided to illustrate how to extract a single directional light from the SH coefficients to fake specular lighting. The specular lighting is calculated using the basic Blinn-Phong specular term for simplicity; other specular lighting equations, such as physically plausible ones, could be used. (The source code can be downloaded from here.)
Conclusion
Extracting the dominant directional light from SH projected lighting is easy to compute with the following steps: first, calculate the dominant light direction; second, project the dominant light into SH with a normalization factor; third, calculate the light color. The extracted light can be used for specular lighting to give an impression of high frequency lighting.

References
[1] Stupid Spherical Harmonics (SH) Tricks: http://www.ppsloan.org/publications/StupidSH36.pdf
[5] PI or not to PI in game lighting equation: http://seblagarde.wordpress.com/2012/01/08/pi-or-not-to-pi-in-game-lighting-equation/
[6] March of the Froblins: Simulation and Rendering Massive Crowds of Intelligent and Detailed Creatures on GPU: http://developer.amd.com/documentation/presentations/legacy/Chapter03-SBOT-March_of_The_Froblins.pdf
[7] Pick dominant light from sh coeffs: http://sourceforge.net/mailarchive/message.php?msg_id=28778827




Spherical Harmonic Lighting

Introduction
Spherical Harmonics (SH) functions are a set of orthogonal basis functions defined in spherical coordinates using imaginary numbers. In this post, we use the standard conversion between spherical and cartesian coordinates, (x, y, z) = (sin θ cos φ, sin θ sin φ, cos θ).
Since we are dealing with real valued functions, we only need the real spherical harmonics functions, which are of the form:
The index l of an SH function is called the band index, an integer l >= 0, and the index m is an integer in the range -l <= m <= l, so there are (2l + 1) functions in a given band. You may refer to Appendix A2 of Stupid Spherical Harmonics (SH) Tricks to look up the evaluated SH basis function for a given pair of (l, m).

A linear combination of SH basis functions with scalar coefficients can be used to approximate a function as below:
With an approximation up to band l = n - 1, n × n coefficients are needed.
So the remaining problem in approximating a function is to compute the coefficients c, which can be solved either analytically or numerically by Monte Carlo integration.

Monte Carlo Integration
To compute a definite integral numerically, we can consider the Monte Carlo estimator:
When the number of samples, N, is large enough, the estimator F converges to the definite integral by the law of large numbers, which can be seen by considering the expected value of F:
Therefore, we can calculate the coefficients of the SH basis functions by using the Monte Carlo estimator.
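The estimator is easy to sketch for SH projection: draw uniform directions on the sphere (pdf 1/4π), evaluate the function and the basis, and average. A hedged Python outline limited to the first two bands:

```python
import math
import random

def sh_basis4(d):
    x, y, z = d
    return (0.282095, 0.488603 * y, 0.488603 * z, 0.488603 * x)

def uniform_sphere(rng):
    """Uniform direction on the unit sphere."""
    z = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def project(f, n=20000, seed=1):
    """Monte Carlo projection of f(direction) onto the first 4 SH basis
    functions: average f * basis over uniform samples, divide by the pdf."""
    rng = random.Random(seed)
    coeffs = [0.0] * 4
    for _ in range(n):
        d = uniform_sphere(rng)
        val = f(d)
        for i, b in enumerate(sh_basis4(d)):
            coeffs[i] += val * b
    return [c * 4.0 * math.pi / n for c in coeffs]   # / (N * pdf)
```

Projecting the constant function 1 recovers the DC analytic value 0.282095 · 4π, with the band-1 coefficients converging to zero.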

Properties of Spherical Harmonics Function
There are 2 important properties of SH functions:
First, they are rotationally invariant,
where the rotated function g is still an SH projected function whose coefficients can be computed from the coefficients of f. For details on rotating general SH functions, you can refer to the section 'Rotating Spherical Harmonics' in Spherical Harmonics Lighting: The Gritty Details.

Second, integrating the product of 2 SH projected functions over the spherical domain equals the dot product of their SH coefficients (because the SH basis functions are orthogonal):
This is a nice property: we can evaluate an integral over the spherical domain with a simple dot product of SH coefficients.

Lighting with SH functions
When performing lighting calculations, we need to solve the rendering equation:
For shading a Lambertian diffuse surface without shadow, we can simplify the rendering equation to:
To solve this integral, we project the functions L(x, ω) and max(N · ω, 0) into SH using Monte Carlo integration; then, by property 2 described above, the integral equals the dot product of the SH coefficients of the 2 projected functions.
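In code, property 2 turns the per-point diffuse integral into a single dot product. A tiny sketch (the coefficient values are made up for illustration):

```python
def sh_dot(light_coeffs, transfer_coeffs):
    """Integral over the sphere of the product of two SH projected
    functions = dot product of their coefficient vectors (property 2)."""
    return sum(a * b for a, b in zip(light_coeffs, transfer_coeffs))

# per-point diffuse shading: one dot product instead of an integral
light    = [3.5, 0.0, 1.2, 0.4]   # SH projected L(x, w), made-up values
transfer = [0.9, 0.0, 1.0, 0.3]   # SH projected max(N.w, 0), made-up
radiance = sh_dot(light, transfer)
```

This is why SH lighting is cheap at run time: the expensive projections happen offline, and shading is a short dot product per band set.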

Zonal Harmonics
If an SH projected function is rotationally symmetric about a fixed axis, it is called Zonal Harmonics (ZH). If this axis is the z-axis, the ZH function depends only on θ, so there is only one non-zero coefficient in each band (the one with m = 0), and rotation of the ZH function can be greatly simplified. When the ZH function is rotated to a new axis d, the coefficients of the rotated SH function equal:
which is faster than the general SH rotation. The ZH function is well suited to approximating max(N · ω, 0) in the diffuse rendering equation above, since the SH projection of L(x, ω) is usually done in world space, and the shading surface can be re-oriented to the same space to perform the lighting calculation.
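The ZH rotation above, c_l^m = sqrt(4π / (2l + 1)) · z_l · y_l^m(d), can be sketched for bands 0 and 1 (basis constants and sign convention assumed, as in common graphics references):

```python
import math

def sh_basis4(d):
    x, y, z = d
    return (0.282095, 0.488603 * y, 0.488603 * z, 0.488603 * x)

def rotate_zh(z0, z1, d):
    """Rotate ZH coefficients (z0 for band 0, z1 for band 1) so the
    symmetry axis points along the unit direction d."""
    b = sh_basis4(d)
    k0 = math.sqrt(4.0 * math.pi / 1.0)      # l = 0: sqrt(4*pi / (2l+1))
    k1 = math.sqrt(4.0 * math.pi / 3.0)      # l = 1
    return (k0 * z0 * b[0],
            k1 * z1 * b[1], k1 * z1 * b[2], k1 * z1 * b[3])
```

Rotating to d = +z reproduces the original zonal coefficients, which is a quick sanity check for the scaling factors.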

WebGL Demo
Below is a WebGL demo (requires a WebGL-enabled browser such as Chrome) that uses the cube map on the right as the light source, projected to SH using Monte Carlo integration.

Both the white and the blue color on the model are reflected from the sun and the blue sky, computed using the SH coefficients generated from the cube map and the ZH coefficients of max(N · ω, 0) rotated to world space according to the surface normal. The approximation is done up to band l = 2. You can drag in the viewport to rotate the camera.
The source code of the WebGL demo can be downloaded here.

Conclusion
SH functions can approximate the rendering equation with only a few coefficients and evaluate lighting with a simple dot product at run time. The disadvantage is that SH can only approximate low frequency functions well, as it would need a large number of bands to represent high frequency details.

References
[1] Spherical Harmonics Lighting: The Gritty Details: http://www.research.scea.com/gdc2003/spherical-harmonic-lighting.pdf
[2] Stupid Spherical Harmonics (SH) Tricks: http://www.ppsloan.org/publications/StupidSH36.pdf
[3] Physically Based Rendering: http://www.amazon.com/gp/product/0123750792/ref=pd_lpo_k2_dp_sr_1?pf_rd_p=486539851&pf_rd_s=lpo-top-stripe-1&pf_rd_t=201&pf_rd_i=012553180X&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=09AG8FQQWKJHC2AEFPD1
[4] Sky box texture downloaded from: http://www.codemonsters.de/home/content.php?show=cubemaps