I actually wonder how you get your reference image so clear with just 16 spp.

In this case I used closed-form tracking as described in the volume rendering course. (I think this is valid even for spectrally refined media, as long as there is no scattering. More on that below.)
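To make concrete what I mean by closed-form tracking here: for a homogeneous, purely absorbing medium the free-flight distance can be sampled analytically. A minimal sketch (this is not the gist code; the function name and constants are made up):

```python
import math
import random

def transmittance_estimate(mu_t, distance, rng=random.random):
    """One-sample estimator of exp(-mu_t * distance) for a homogeneous,
    purely absorbing medium: sample a free-flight distance in closed form
    and return 1 if it exceeds the segment length, 0 otherwise."""
    t = -math.log(1.0 - rng()) / mu_t  # analytic free-flight sampling
    return 1.0 if t >= distance else 0.0

# Averaging many samples converges to the analytic transmittance:
n = 200_000
est = sum(transmittance_estimate(0.5, 2.0) for _ in range(n)) / n
# est ≈ exp(-0.5 * 2.0) ≈ 0.368, up to sampling noise
```

No majorant and no rejection loop is needed, which is why this converges so quickly compared to the stochastic trackers.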

Try the weighting scheme from "5.1.2 Incorporating Path History". It should reduce the noise somewhat.

Indeed. Massive difference!

"5.1.3 Reduced Termination Rates" also works well for me, but it is only suitable for non-emissive media.(This is not yet included in the images further down)

I spot one error:

- t is updated before calling direct_light. But p still points to the previous interaction point.

Excellent find! I introduced this error recently while shuffling some things around and didn't think about it again.

After fixing this problem the results already look very promising. Isn't the transmittance along the ray handled by the probabilities already? The delta tracking described in the volume rendering course doesn't need this either: there you multiply the incoming light by the scattering albedo. But this doesn't seem to be necessary in spectral tracking, because of the weights?!

Anyway, here are some new images (16 samples per pixel across all schemes) and thoughts:

Spectrally refined absorption and scattering. The spectral tracking seems to converge to the same result as the ray marched reference. Closed-form tracking is slightly off.

Spectrally refined absorption with zero scattering. All three schemes seem to converge to the same result. Closed-form tracking converges much faster than spectral tracking. (Judging from my experiments, spectrally refined absorption with gray scattering will also be handled incorrectly by closed-form tracking)

Do you think it is possible to modify closed-form tracking in such a way that it handles spectrally refined absorption/scattering for homogeneous media correctly? Or would you end up at spectral scattering anyway, because of the null collision thing we talked about earlier?

Statistics: Posted by b_old — Mon Mar 19, 2018 10:53 am


Do you think this looks reasonable?

Not sure. The method is pretty noisy, indeed. I actually wonder how you get your reference image so clear with just 16 spp.

Try the weighting scheme from "5.1.2 Incorporating Path History". It should reduce the noise somewhat.

I spot one error:

- t is updated before calling direct_light. But p still points to the previous interaction point.

And a potential error:

- You need to factor in the transmittance along the ray to the light source.

Apart from that your code looks fine to me. Very much like my "spectral_tracking" function.

Statistics: Posted by dawelter — Mon Mar 19, 2018 8:25 am


I think this is incorrect w.r.t. spectral tracking. ...

I think this is a very valuable hint! I did not get this part at all.

I updated my code and it now looks very similar to your python implementation. (Apart from the scattering part.)

The gist also contains updated images.

Visual inspection seems to suggest that closed-form tracking and delta tracking might converge to the same result now. But spectral tracking is incredibly noisy, at least for this absorption-only case. Do you think this looks reasonable?

I'm still unsure about the scattering part. I don't really understand what your python code does in this case, but simply multiplying the estimated in-scattered light by the accumulated weight, like I do in the gist, doesn't seem to do the trick.

Statistics: Posted by b_old — Sun Mar 18, 2018 5:27 pm


1. The data is homogeneous, so null collisions don't play a role at the moment

I think this is incorrect w.r.t. spectral tracking. Because

mu^bar = max_lambda { mu_t(lambda) },

you get

mu_n(lambda) = mu^bar - mu_t(lambda) > 0 for some lambda, if not all mu_t(lambda) are equal.

Moreover, the weights are multiplied "on top" of the previous weights. So there should be a variable w with

w = wa * w,

and

w = ws * w

somewhere around line 37 and 41, respectively.
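Roughly, the weight accumulation for a homogeneous, absorption-only medium looks like this. This is only a simplified sketch, not my actual code: it uses fixed 0.5/0.5 event probabilities where the paper picks them based on the coefficients, and the function name is made up:

```python
import math
import random

def spectral_transmittance(mu_a, seg_len, rng=random.random):
    """One-sample spectral-tracking estimate of the per-wavelength
    transmittance exp(-mu_a * seg_len) through a homogeneous, purely
    absorbing medium. mu_a is a list, one coefficient per wavelength."""
    mu_bar = max(mu_a)                  # majorant over all wavelengths
    mu_n = [mu_bar - m for m in mu_a]   # null-collision coefficients
    w = [1.0] * len(mu_a)               # path-history weights
    t = 0.0
    while True:
        t += -math.log(1.0 - rng()) / mu_bar
        if t >= seg_len:
            return w                    # escaped: the weights are the estimate
        p_a, p_n = 0.5, 0.5             # fixed event probabilities (simplified)
        if rng() < p_a:
            return [0.0] * len(mu_a)    # absorption: the path terminates
        # null collision: multiply the new weight on top of the previous one
        w = [wi * (mn / (mu_bar * p_n)) for wi, mn in zip(w, mu_n)]

# Averaged over many samples, component k converges to exp(-mu_a[k] * seg_len).
```

The important part is the last line of the loop: without the `w = ... * w` accumulation you only account for the most recent null collision, not the whole history.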

I happen to have implemented exactly the scheme you are working on. Maybe this helps:

https://github.com/DaWelter/ToyTrace/bl ... racking.py

Or if you don't mind wading through my atmosphere code

https://github.com/DaWelter/ToyTrace/bl ... re.cxx#L72

Statistics: Posted by dawelter — Sat Mar 17, 2018 3:09 pm


To make debugging easier I make the following restrictions at the moment:

1. The data is homogeneous, so null collisions don't play a role at the moment

2. The scattering coefficient is zero, so lighting should not factor into the result at the moment

3. Only single scattering is considered

My implementation and the results can be viewed in this gist.

Because my data is homogeneous I can compare against plain tracking, and find that the results match for cases where the absorption coefficient is not spectrally refined (apart from noise). In this case the spectral tracking weights are simply vec(1), so I guess it is the same algorithm as delta tracking.

As soon as I introduce a spectrally refined absorption coefficient, however, I no longer understand what is going on. With the data I'm using, I basically only have to care about absorption, but even for that case I cannot figure out how I have to apply the weight to the computed transmittance. For normal delta tracking the transmittance is always zero when a collision happens, but for spectral tracking a compensation is necessary. Assuming the compensation weights are calculated correctly, something is wrong with the way I apply them.

Any idea how I should apply the weights to the transmittance?

Statistics: Posted by b_old — Fri Mar 16, 2018 11:32 am


A perfectly parallel beam sent through a scattering medium. The sphere is made of a much denser medium. It is subsurface scattering with brute-force BDPT.

Statistics: Posted by dawelter — Fri Mar 16, 2018 8:34 am


So this is really a question about the density of the end points in the left-hand-side case. I suppose the density does not change no matter how far I move the target surface up or down. So I would omit the r^2 term in the density.

I tried to implement that, but to my surprise I didn't see a difference from the baseline version. It is probably bugged, but the renderings look all right.

Statistics: Posted by dawelter — Fri Mar 16, 2018 8:27 am


Light path construction is performed as follows:

- sample a point y0 on the light with pdf p_A(y0) and get the emittance (spatial component of emission) L_e^0(y0) [W/m^2]

Monte Carlo estimate:

L_e^0(y0) / p_A(y0)

- sample a direction along which a ray emits with pdf p_w(y0->y1) and get the directional component of emission L_e^1(y0->y1) [1/sr]

Cumulative Monte Carlo estimate:

L_e^0(y0) * L_e^1(y0->y1) * |dot(n0, y0->y1)| / (p_A(y0) * p_w(y0->y1)) =

L_e(y0->y1) * |dot(n0, y0->y1)| / (p_A(y0) * p_w(y0->y1))

However we notice that it implicitly contains 1/r^2 if it is written with respect to surface area:

L_e(y0->y1) * |dot(n0, y0->y1)| / (p_A(y0) * p_w(y0->y1)) =

L_e(y0->y1) * G(y0<->y1) / (p_A(y0) * p_w(y0->y1) / |dot(n0, y0->y1)| * G(y0<->y1)) =

L_e(y0->y1) * G(y0<->y1) / (p_A(y0) * p_A(y1))

The numerator is the measurement contribution function from y0 to y1.

The function contains 1 / r^2 term but it is cancelled by corresponding G term for the pdf.
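The cancellation is easy to check numerically. A small sketch with made-up point positions, normals, radiance, and pdfs (none of these values come from the discussion above):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    l = math.sqrt(dot(a, a))
    return [x / l for x in a]

# Two surface points with normals (arbitrary example values)
y0, n0 = [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]
y1, n1 = [1.0, 0.5, 2.0], norm([-0.3, 0.0, -1.0])
w = norm([y1[i] - y0[i] for i in range(3)])
r2 = sum((y1[i] - y0[i]) ** 2 for i in range(3))

L_e, p_A_y0, p_w = 3.0, 0.25, 0.4  # made-up emitted radiance and pdfs

# Solid-angle form of the estimator
est_sa = L_e * abs(dot(n0, w)) / (p_A_y0 * p_w)

# Area form: G contains 1/r^2, but so does p_A(y1) = p_w * |dot(n1, w)| / r^2
G = abs(dot(n0, w)) * abs(dot(n1, w)) / r2
p_A_y1 = p_w * abs(dot(n1, w)) / r2
est_area = L_e * G / (p_A_y0 * p_A_y1)

# The two forms agree: the 1/r^2 in G cancels against the one in the pdf
assert abs(est_sa - est_area) < 1e-9
```

So as long as the pdf is written in the same measure as the contribution, no "parallel beam flag" is needed; the 1/r^2 terms appear and cancel together.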

Statistics: Posted by shocker_0x15 — Mon Mar 05, 2018 5:23 pm


The question is: when should the 1/r^2 factor in the geometry term be added?

For illustration, my Gedankenexperiment:

* Case on lhs: The source projects a parallel beam. Its cross section is fixed, even after going through the mirror. Thus the power received by the target at the bottom is independent of how far we move it away from the mirror. So I would omit the r^-2 term.

* Case on rhs: Light spreads out the further it travels from the light source and/or the mirror. The decreasing power density must be accounted for by the r^-2 term.

So, tracing a path from a parallel source, I would omit the r^-2 term until the path hits a non-specular surface.

I wonder if I have the wrong idea in mind because I don't recall reading anything about propagating a "parallel beam flag".

Statistics: Posted by dawelter — Sat Mar 03, 2018 9:48 am


Now this looks so much better!

Statistics: Posted by dawelter — Sat Mar 03, 2018 7:51 am


dawelter wrote:

Will investigate. Hints appreciated


Yeah, that doesn't look correct. The most likely reason is that the MIS weights corresponding to the different techniques for constructing a path don't sum to one. This in turn is probably due to incorrect or inconsistent path pdf computation. What has helped me greatly in the past with such issues is to debug one path length at a time.

Here are some more specific tips:

1. Start with an entirely diffuse scene. The pdfs are simpler.

2. Have a unidirectional path tracer that you're sure converges to the right solution.

3. Have a unidirectional light tracer that you're sure converges to the right solution. Compare against the images produced by the path tracer.

4. In the bidirectional path tracer, start by rendering direct illumination only, i.e. paths of length two (segments). There are already three ways to construct each such path. Make sure you compute the correct path pdfs, with solid-angle-to-area conversion factors (Jacobians) etc.

5. Only after you're sure the direct illumination converges to the same result as the path tracer and light tracer, move on to debugging one path longer. And one longer, etc.

6. Once you can correctly render a diffuse scene with your bidirectional tracer, add more complex materials.

Hope this helps!

Statistics: Posted by ingenious — Thu Mar 01, 2018 1:48 am


Meanwhile, I implemented what I intend to be MIS weighting straight from the definition p_k^a / sum_i p_i^a ... I also fixed the reflector. It has 10k triangles now and increased tessellation near the focal point. The light source was a point light; now I use a very small sphere light to "smooth" the irregularities of the model. I moved it a slight bit, too. And tada! The reflector focuses the light much more cleanly.
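For reference, the weighting I mean, straight from that definition (a = 2 gives the power heuristic, a = 1 the balance heuristic). A quick sketch with made-up pdfs:

```python
def mis_weight(pdfs, k, a=2.0):
    """MIS weight for technique k from the definition p_k^a / sum_i p_i^a.
    pdfs holds the path pdfs of every technique that can generate the path."""
    return pdfs[k] ** a / sum(p ** a for p in pdfs)

# Sanity check that is useful while debugging: for any fixed path, the
# weights of all techniques that can generate it must sum to one.
pdfs = [0.8, 0.1, 0.05]  # made-up path pdfs for three techniques
total = sum(mis_weight(pdfs, k) for k in range(len(pdfs)))
assert abs(total - 1.0) < 1e-12
```

If that sum is not one for some path length, the pdf computation (e.g. a missing solid-angle-to-area Jacobian) is the first place to look.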

Unfortunately, it does not converge. The noise around the sphere is gone, which is great. But near the corners? No way this is correct. Also, the brightness seems to increase from sweep to sweep, as you can see at the line where I interrupted the rendering.

Will investigate. Hints appreciated

That's 512 samples per pixel btw. Took the night to render.

Statistics: Posted by dawelter — Wed Feb 28, 2018 8:43 am


Statistics: Posted by graphicsMan — Thu Feb 22, 2018 2:55 pm


High variance at 128 bidirectional random walks per pixel. Paths are weighted by one over the number of techniques that can generate the path. Implementation of MIS pending ...

Statistics: Posted by dawelter — Thu Feb 22, 2018 8:44 am


SIMDy can also be used from c++.

I do not agree that this is interesting only for Python developers, because it is very easy to embed a Python interpreter in a C++ application.

For example, in the context of renderers: if you embed a Python interpreter in C++, you can use SIMDy as a very flexible shading language, like OSL (Open Shading Language), so users can write scripts that are very, very fast.

Statistics: Posted by Tahir007 — Thu Feb 15, 2018 9:37 am
