
An error in pbrt? About MLT start-up bias

Posted: Fri Oct 03, 2014 8:45 am
by shiqiu1105
Hi

I have been reading the MLT implementation in pbrt and trying to understand the math as much as I can.

One thing that confuses me is that, in the start-up (bootstrap) phase, the initial sample X0 is drawn as follows:
[attachment: 1.jpg — the start-up sampling code from pbrt]

The text there says that all contributions need to be weighted by w.

However, when the contribution is actually added, w is ignored. Is this a mistake?
[attachment: 2.jpg — the code that adds the contribution]

I derived it myself, and it seems that w should equal b. Should all the contributions be multiplied by b?
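To make my question concrete, here is a minimal Python sketch of how I understand the start-up phase. The function f, the 1-D state space, and all the names are mine, not pbrt's: draw N candidates uniformly, estimate the normalization constant b as their average contribution, and select X0 with probability proportional to its contribution.

```python
import random

# Stand-in for the scalar image contribution I(x) of a full light
# path; any nonnegative function on [0, 1) works for this sketch.
def f(x):
    return x * x

def bootstrap(n_samples, rng):
    """Start-up phase: draw n_samples candidates uniformly, estimate
    b = (1/N) * sum f(x_i), and pick the initial state X0 with
    probability proportional to f(x_i) -- the proportional selection
    is what removes the start-up bias in expectation."""
    xs = [rng.random() for _ in range(n_samples)]
    ws = [f(x) for x in xs]
    total = sum(ws)
    b = total / n_samples
    # Select X0 proportionally to its contribution f(x_i).
    u = rng.uniform(0.0, total)
    cum = 0.0
    for x, w in zip(xs, ws):
        cum += w
        if cum >= u:
            return x, b
    return xs[-1], b
```

As I read it, every Metropolis contribution then has to carry the factor b (my derived w) somewhere, either per splat or folded into a final image scale; my question is where pbrt applies it.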

Also, Kelemen's paper seems to use a different weighting scheme for large-step and rejected samples.
Compared to the approach in pbrt, which one is better?
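For reference, here is how I understand Kelemen's scheme, again as a 1-D toy with my own names (b is assumed known): each mutation deposits both the current and the proposed sample with MIS-style weights, rejected samples still contribute via the (1 - a) term, and large steps get an extra +1 in the proposed sample's weight because they double as independent samples.

```python
import random

def f(x):
    return x * x  # stand-in for the scalar path contribution

def kelemen_mlt(b, n_mutations, p_large, rng):
    """Toy Kelemen-style chain on [0, 1); returns an estimate of the
    integral of f. The weights combine the Metropolis estimator with
    the large-step (independent-sample) estimator via the balance
    heuristic, so rejected samples are not wasted."""
    # Pick X0 proportionally to f from a few uniform candidates,
    # mirroring the start-up phase.
    cands = [rng.random() for _ in range(100)]
    x = rng.choices(cands, weights=[f(c) for c in cands])[0]
    fx = f(x)
    estimate = 0.0
    for _ in range(n_mutations):
        large = rng.random() < p_large
        if large:
            y = rng.random()                       # independent resample
        else:
            y = (x + rng.gauss(0.0, 0.05)) % 1.0   # symmetric small step
        fy = f(y)
        a = 1.0 if fx == 0 else min(1.0, fy / fx)  # acceptance probability
        # Kelemen weights: both samples contribute every mutation.
        w_new = (a + (1.0 if large else 0.0)) / (fy / b + p_large)
        w_old = (1.0 - a) / (fx / b + p_large)
        estimate += w_new * fy + w_old * fx
        if rng.random() < a:
            x, fx = y, fy
    return estimate / n_mutations
```

In the p_large -> 0 limit each mutation contributes exactly a*b + (1 - a)*b = b, i.e. the plain Metropolis estimator scaled by b, which is where my w = b above comes from.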

Thanks,