A new Magnum example implements an analytic method for area light shading presented in the paper “Real-Time Polygonal-Light Shading with Linearly Transformed Cosines” by Eric Heitz, Jonathan Dupuy, Stephen Hill and David Neubelt.

The code is available through the Area Lights example page in the documentation, and the example also has a live web version linked below. This blog post explains the basics of shading with linearly transformed cosines (LTCs) and how Magnum was used for the implementation.

Shading with LTCs

To understand linearly transformed cosines, I will start off by explaining some basics. If you already know what BRDFs are, you may want to skip the next paragraph or two.

Bidirectional Reflectance Distribution Functions

When shading a point on a surface, physically, you need to take all incoming rays from every direction into account. Some light rays affect the final color of the shaded point more than others, depending on their direction, the view direction and the properties of the material.

A perfect mirror, for example, may take only the exact reflection of the view vector into account, as can be seen in figure (a), whereas a more diffuse material will be affected by all or most incoming rays similarly or equally, as visualized in figure (b).

These figures show spherical distributions: imagine you want to shade a point on a surface (the point where the view vector is reflected). A ray of light hitting this point will pierce the colored sphere at some location. The color at that location on the sphere indicates how much this ray will affect the color of the surface point: the more red, the higher the effect.

Visualization of the BRDF of a nearly perfect mirror
(a) Visualization of what the BRDF of a nearly perfect mirror may look like.
Visualization of the BRDF of a diffuse material
(b) Visualization of what the BRDF of a more diffuse material may look like.

The function that describes how much effect an incoming light ray has for given viewing and incoming light angles is called a BRDF. This function is very specific to every material. As this is very impractical for real-time rendering and art pipelines, it is common to instead use a so-called parametric BRDF: a function which is able to approximate the BRDFs of many different materials using intuitive parameters, e.g. roughness or metalness.

There are many parametric BRDFs out there: the GGX microfacet BRDF, the Schlick BRDF and the Cook-Torrance BRDF, for example. I recommend playing around with them in Disney’s BRDF Explorer.

Shading area lights

With point lights, shading is really simple, as you only have a single incoming ray (assuming you do not want to take indirect rays into account). You get the appropriate factor (how much of that ray will be reflected in the view direction) from the BRDF using the view angle and light angle, multiply it with the light intensity, and that is already it.

With area lights, it is a lot more complicated, as you have an infinite amount of incoming rays. The final intensity of the shaded point is the integral of the BRDF over the domain of the light's polygon (which, projected onto the spherical distribution, is a spherical polygon).
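Written out (in my notation, not the paper's): with ρ the BRDF, ω_v the view direction, θ_l the incident angle and P the spherical polygon subtended by the light, the quantity to evaluate is

```latex
I(\omega_v) = \int_{P} \rho(\omega_v, \omega_l)\,\cos\theta_l \,\mathrm{d}\omega_l
```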

This is a problem, because we do not have an analytic solution for integrating arbitrary spherical distributions over a spherical polygon. Such a solution is known only for very specific distributions, for example the uniform sphere or the cosine distribution.
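For the cosine distribution, for instance, the closed form is the classic edge integral over the polygon's vertices p_1, …, p_n projected onto the unit sphere (with p_{n+1} = p_1 and **n** the surface normal), evaluated one edge at a time:

```latex
E(P) = \frac{1}{2\pi} \sum_{i=1}^{n} \arccos(p_i \cdot p_{i+1})
       \left( \frac{p_i \times p_{i+1}}{\lVert p_i \times p_{i+1} \rVert} \cdot \mathbf{n} \right)
```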

So, how can we still do it without radically approximating the area light?

Linearly transformed cosines

The genius of the paper is that the authors realized they can transform spherical distributions using linear transforms (scaling, rotation and skewing) and that this leaves the value of the integral unchanged.
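Concretely: if D_o is the original (e.g. cosine) distribution and M the linear transform, the transformed distribution D evaluates D_o at the back-transformed direction, weighted by the Jacobian of the change of variables, so integrating either distribution over the whole sphere gives the same value:

```latex
D(\omega) = D_o(\omega_o)\,\frac{\partial\omega_o}{\partial\omega},
\qquad \omega_o = \frac{M^{-1}\omega}{\lVert M^{-1}\omega\rVert},
\qquad \int_{S^2} D(\omega)\,\mathrm{d}\omega = \int_{S^2} D_o(\omega)\,\mathrm{d}\omega
```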

Visualization of a cosine distribution
(Untransformed) cosine distribution
Animation for scaling a cosine distribution
Uniformly scaled cosine distribution
Results in more/less roughness.
Animation for scaling a cosine distribution on one axis
Cosine distribution scaled on one axis
Results in anisotropy.
Animation for skewing a cosine distribution
Skewed cosine distribution

Image source: Eric Heitz’s Research Page

You can therefore transform a spherical distribution to look like another spherical distribution. This means that you can transform something like the cosine distribution to look like a specific BRDF at a certain view angle. You can then, because the integral is unaffected by the linear transform, integrate over the cosine distribution, to which an analytic solution is known, instead of integrating over the BRDF.

As the BRDF is view dependent, you need a transformation for every incident view angle and for every parameter of a parametric BRDF. In the paper, the authors achieve this by fitting a 3×3 transformation matrix for a set of sampled values of the viewing angle and of the roughness parameter alpha of the GGX microfacet BRDF.

The 3×3 matrices have only four significant components, so they can be stored in a single RGBA texture:

M = \left(\begin{matrix} a & 0 & b \\ 0 & c & 0 \\ d & 0 & 1 \end{matrix}\right)

For shading we need the inverse matrices to transform the polygon of the light. Originally the polygon is in the space of the BRDF, over which we do not know how to integrate. If we apply the inverse matrix to the polygon, it ends up in the space of the cosine distribution, over which we can integrate instead.

Animation for transforming the polygonal light into cosine distribution space
Transforming the BRDF and light polygon into a cosine distribution

Image source: Eric Heitz’s Research Page


Magnum Implementation

To aid my understanding of the method, I implemented a basic version of LTC shading using Magnum. The C++ example provided with the paper uses the Schlick BRDF and already contains textures with the fitted inverse LTC matrices.

The code of the Magnum example is well documented and, if you are interested, I recommend you go check it out. Instead of giving a thorough line-by-line explanation, I will point out some of the features in Magnum that were most helpful to me. They are more generally applicable to other projects as well.

Loading LTC matrix textures

The original C++ implementation provided with the paper already contained .dds files for the fitted inverse LTC matrices. Many thanks to Eric Heitz, who was kind enough to let me use these for the Magnum example.

I packed these .dds files as a resource into the binary (which makes porting to the web easier later). It was simply a matter of adding the resources.conf and telling Corrade to compile it in your CMakeLists.txt …

corrade_add_resource(AreaLights_RESOURCES resources.conf)

add_executable(magnum-arealights AreaLightsExample.cpp ${AreaLights_RESOURCES})

… and then loading the texture from the resource memory using DdsImporter:

/* Load the DdsImporter plugin */
PluginManager::Manager<Trade::AbstractImporter> manager;
Containers::Pointer<Trade::AbstractImporter> importer =
    manager.loadAndInstantiate("DdsImporter");
if(!importer) std::exit(1);

/* Get the resource containing the images */
const Utility::Resource rs{"arealights-data"};
if(!importer->openData(rs.getRaw("ltc_amp.dds")))
    std::exit(2);

/* Set texture data and parameters; _ltcAmp is a GL::Texture2D member */
Containers::Optional<Trade::ImageData2D> image = importer->image2D(0);
_ltcAmp.setWrapping(GL::SamplerWrapping::ClampToEdge)
    .setMagnificationFilter(GL::SamplerFilter::Linear)
    .setMinificationFilter(GL::SamplerFilter::Linear)
    .setStorage(1, GL::TextureFormat::RGBA32F, image->size())
    .setSubImage(0, {}, *image);

/* Bind the texture for use in the shader */
_ltcAmp.bind(LtcAmpTextureUnit);

Shader Hot-Reload

During shader development, you will not want to restart your application every time you make a change to the GLSL shader code. It is rather nice to be able to just hit F5 and see the changes immediately instead.

It turns out that if you implemented a GL::AbstractShaderProgram, hot-reloading is just a matter of reinstantiating it:

/* Reload the shader */
_shader = AreaLightShader{};

Yes, it is that simple.

Often you will compile your shader files as resources in Magnum (as done in the example). To use shaders from a resource in your GL::AbstractShaderProgram you would again make use of Utility::Resource:

GL::Shader vert{version, GL::Shader::Type::Vertex};
GL::Shader frag{version, GL::Shader::Type::Fragment};

/* Load shaders from compiled-in resource */
Utility::Resource rs("arealights-data");
vert.addSource(rs.get("AreaLights.vert"));
frag.addSource(rs.get("AreaLights.frag"));

In this case, before hot-reloading, you will need to override the resource group using Utility::Resource::overrideGroup() so that the resource is loaded from the original file rather than from memory:

/* Reload the shader */
Utility::Resource::overrideGroup("arealights-data", "<path>/resources.conf");
_shader = AreaLightShader{};


Final appreciations go to Eric Heitz, Jonathan Dupuy, Stephen Hill and David Neubelt for publishing an incredibly well written paper with a ton of supplemental material and effort around it, and of course Magnum for making it viable to quickly get this basic implementation running.

Thank you for reading! I’ll be back.

Jonathan Hale

About the author

Jonathan Hale lives for Virtual Reality. Developer and project manager at Vhite Rabbit. Follow him on Twitter: @Squareys

Guest Posts

This is a guest post by an external author. Do you also have something interesting to say? A success story worth sharing? We’ll be happy to publish it. See the introductory post for details.