Practical and theoretical implementation discussion.
dawelter
Posts: 41
Joined: Sun Oct 29, 2017 3:15 pm
Location: Germany

Hi there,

I seek help for robustly implementing Veach's correction for shading normals for the adjoint BSDF, i.e. for photon mapping or light tracing.

The problem: According to Veach we should add a correction factor that compensates for using shading normals instead of geometry normals:
[attachment eq5.18.PNG: Veach's shading-normal correction factor, |wo·Ns| |wi·Ng| / (|wo·Ng| |wi·Ns|)]
However, the ratio wo·Ns / wo·Ng can become arbitrarily large, since the denominator does not cancel and becomes very small at grazing angles. I have seen this factor go up to 1000, and it introduced ugly fireflies in my render. As a band-aid fix I clamp the factor to 10. That is certainly not pretty, but it did help, with no noticeable change in the image.
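For concreteness, here is a minimal sketch of that clamped correction for the importance/light-transport direction. The Vec3 type, the absDot helper, and the clamp threshold of 10 are illustrative choices of mine, not taken from any particular renderer:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static float absDot(const Vec3 &a, const Vec3 &b) {
    return std::fabs(a.x * b.x + a.y * b.y + a.z * b.z);
}

// Veach's adjoint-BSDF correction for shading normals, with a band-aid clamp
// against fireflies at grazing angles. Assumes unit-length normals.
float shadingNormalCorrection(const Vec3 &wo, const Vec3 &wi,
                              const Vec3 &ns, const Vec3 &ng,
                              float maxFactor = 10.f) {
    float denom = absDot(wo, ng) * absDot(wi, ns);
    if (denom == 0.f) return 0.f;
    float factor = (absDot(wo, ns) * absDot(wi, ng)) / denom;
    // The unclamped factor can reach the hundreds at grazing angles.
    return std::min(factor, maxFactor);
}
```

When shading and geometric normals agree, the factor is exactly 1, so the clamp only kicks in where the two normals diverge strongly.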

How do production renderers deal with this?

I took a look at PBRT's implementation and found that BDPT applies the correction straightforwardly:
https://github.com/mmp/pbrt-v3/blob/mas ... pt.cpp#L55

To my surprise, I did not find the correction in the photon mapping code!
https://github.com/mmp/pbrt-v3/blob/mas ... m.cpp#L403
There is just the regular

Code:

Spectrum bnew = beta * fr * AbsDot(wi, isect.shading.n) / pdf;

Did I manage to find a bug? Or is it a conscious decision to omit the factor to prevent fireflies?

shocker_0x15
Posts: 75
Joined: Sun Aug 19, 2012 3:24 pm
Contact:

I don’t have a clear answer to your question, but I think shading normals are inherently limited because they have no physical basis (as Veach noted).
One obvious way to mitigate the arbitrarily large values is to use a well-tessellated model whose shading normals stay close to the geometric normals...

I completely agree with your surprise; I felt the same when I saw the implementation in PBRT v3.
In my opinion the correction factor should be taken into account in the integrators other than BDPT too, because PBRT is education-oriented, not production-oriented.

dawelter
Posts: 41
Joined: Sun Oct 29, 2017 3:15 pm
Location: Germany

Hi.

Thanks for your reply. At least it is good to know that I didn't totally misunderstand something there.

For some reason the correction factor went way down when I used another mesh (a higher-res version of the Stanford bunny). I could swear it has a similar amount of jagged edges. I am starting to think that either I have a weird bug somewhere, or the mesh has errors.

Apart from investigating this, I will stick to clamping. It should be fine.

shocker_0x15
Posts: 75
Joined: Sun Aug 19, 2012 3:24 pm
Contact:

It might help you to visualize the relation between the geometric normal and the shading normal like this:
Code:

clamp(
    RGB(
        0.5 + 10 * (dot(shadingNormal, geometricNormal) - 1),
        0.5 + 100 * (length(geometricNormal) - 1),
        0.5 + 100 * (length(shadingNormal) - 1)
    ),
    0, 1)

This shows a completely gray image in the ideal, correct situation (shadingNormal == geometricNormal, and both normals have unit length).
Multipliers like 10 and 100 exaggerate the deviation from the ideal.
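Spelled out as a small self-contained function (the Vec3 type and the dot/length/clamp01 helpers are illustrative; in practice this would run per shading point in your renderer):

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static float dot(const Vec3 &a, const Vec3 &b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static float length(const Vec3 &v) { return std::sqrt(dot(v, v)); }
static float clamp01(float v) { return std::min(std::max(v, 0.f), 1.f); }

// Debug color: mid-gray (0.5, 0.5, 0.5) means the shading and geometric
// normals agree and both have unit length; the 10x/100x multipliers
// exaggerate any deviation from that ideal.
Vec3 normalDebugColor(const Vec3 &shadingNormal, const Vec3 &geometricNormal) {
    return {
        clamp01(0.5f + 10.f  * (dot(shadingNormal, geometricNormal) - 1.f)),
        clamp01(0.5f + 100.f * (length(geometricNormal) - 1.f)),
        clamp01(0.5f + 100.f * (length(shadingNormal) - 1.f))
    };
}
```

A red channel pushed toward black flags regions where the shading normal diverges from the geometric normal, which is exactly where the correction factor blows up.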

dawelter
Posts: 41
Joined: Sun Oct 29, 2017 3:15 pm
Location: Germany

https://github.com/mmp/pbrt-v3/issues/209

@Shocker: Yeah... it's a bit overkill, but if I cannot understand the differences between the meshes otherwise, why not.

T.C. Chang
Posts: 6
Joined: Tue Nov 29, 2016 3:59 am

Haha, I was the one who filed the issue. In my implementation I keep the correction term, since I do not like the idea of turning a consistent method into an inconsistent one. If I recall correctly, Jakob said that (in the Mitsuba renderer) a better importance sampling technique is needed for the term. I felt the same surprise as you did.

charles
Posts: 1
Joined: Tue Mar 26, 2019 10:25 am

You can find an alternative formulation in this paper as well: https://blogs.unity3d.com/2017/10/02/mk ... 1501401260

I haven’t implemented it myself, but it is supposed to be symmetric and more stable.

dawelter
Posts: 41
Joined: Sun Oct 29, 2017 3:15 pm
Location: Germany