CODE:

`( fr, fg, fb ) / ( (pr + pg + pb) / 3 )`

like in Eq. 4 was the key to making the algorithm practical. I "only" had to carefully handle edge cases in the arithmetic, such as 0/0 and 0 × inf. As explained in my previous post, I do handle the case where the delta tracking "hits" a surface: when the tracking step crosses a surface, I multiply the transmittance T(x,y) from the last node x to the surface node y into both f and p.
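A minimal sketch of those two pieces, the division guard and the surface-hit update, in Python (helper names are mine; the renderer itself is C++):

```python
def safe_div(f, p):
    """Per-channel f/p with the convention 0/0 -> 0: where the pdf is
    zero, the contribution is zero too, so the ratio is taken as 0."""
    return [0.0 if pi == 0.0 else fi / pi for fi, pi in zip(f, p)]

def surface_hit_update(f, p, T):
    """When a tracking step crosses a surface, fold the transmittance
    T(x, y) from the last tracking node x to the surface node y into
    both the contribution f and the pdf p."""
    f = [fi * Ti for fi, Ti in zip(f, T)]
    p = [pi * Ti for pi, Ti in zip(p, T)]
    return f, p
```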

I think it is working now! Unidirectional path tracing + NEE, chromatic heterogeneous media.

Statistics: Posted by dawelter — Fri Aug 07, 2020 12:56 pm


It would be interesting to see a chart of build times vs incoherent rays/s for several architectures.

Statistics: Posted by graphicsMan — Fri Jul 31, 2020 6:50 pm


on CPUs today.

Statistics: Posted by mpeterson — Fri Jul 31, 2020 2:29 pm


Are there any comparisons?

For example, World of Tanks enCore RT used Embree (CPU) to build the BVH (1.5M triangles) for its RT shadows, and DX11 compute shaders to trace rays against that BVH.

from here,

https://gamegpu.com/mmorpg-/-%D0%BE%D0%BD%D0%BB%D0%B0%D0%B9%D0%BD-%D0%B8%D0%B3%D1%80%D1%8B/world-of-tanks-encore-rt-test-gpu-cpu

Statistics: Posted by xma — Thu Jul 30, 2020 2:27 am


Built quasi-Monte Carlo sampling into the forward path tracer. I'm using the 21201-dimensional Sobol sequence from Joe & Kuo https://web.maths.unsw.edu.au/~fkuo/sobol/index.html. I tried a 2D base pattern with rotations and xor-scrambling, but I got biased images compared to the random sampler. A fixed number of dimensions is allocated per BSDF sample, NEE light sample, distance sample, and so on. Some things are still randomly sampled, like the BSDF component. Due to the use of delta tracking, distance sampling can consume an unbounded number of random numbers, so I limit it to 10 dimensions or so; after that, pseudo-random samples are drawn.

The base sequence is rotated for each pixel by a different amount based on a blue-noise pattern, like in the Arnold paper. Correlations between pixels don't matter here, as each pixel should converge individually.
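The per-pixel rotation amounts to a Cranley-Patterson shift: add an offset and wrap into [0,1). A minimal sketch, assuming the offsets come from a tiled blue-noise texture (names are mine):

```python
def rotate(u, offset):
    """Cranley-Patterson rotation: shift one [0,1) sample value by a
    per-pixel offset and wrap around."""
    return (u + offset) % 1.0

def shifted_sobol_point(sobol_point, offsets):
    """Apply per-dimension offsets (e.g. looked up from a tiled
    blue-noise texture at this pixel) to one Sobol point."""
    return [rotate(u, o) for u, o in zip(sobol_point, offsets)]
```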

And it actually works. Some more pics:

* Basic wine glass. Rendered with photon mapping. Nothing fancy

https://www.dropbox.com/s/owyeb63x1zhkms4/wineglass.jpg

* Dragon with more stuff. Forward path tracing with QMC. Using the maximum-roughness trick from Arnold to prevent fireflies. https://www.dropbox.com/s/q2is7g0ub8bxixp/xyzrgb_dragon_extended_qmc1024spp.jpg

Statistics: Posted by dawelter — Wed Jul 29, 2020 8:40 am


Statistics: Posted by dawelter — Wed Jul 08, 2020 6:20 pm


Statistics: Posted by dawelter — Wed Jul 08, 2020 4:00 pm


Anyway, just found this tech report (haven't read it in detail yet):

[url]https://github.com/honzukka/null-scattering[/url]

Also, there is a link to the author's pbrt code for that paper:

[url]https://github.com/baileymiller/nullpath/blob/master/pbrtv3/src/integrators/nullpath.cpp[/url]

I will check it out myself a bit later. Please share any thoughts if it sheds some light.

cheers

Statistics: Posted by gigacore — Tue May 26, 2020 2:28 pm


https://apply.workable.com/tangent-animation/j/B40BB5C7FF/

Statistics: Posted by stefan — Fri May 15, 2020 5:55 am


Nice paper you found there. I'll try to answer, since I may attempt to implement this in my renderer, too.

It's just a wild guess, but the authors omitted from Algorithm 1 what happens when the particle hits a surface. In Eq. (9) there is the last term T(x,z)L^s(z,ω), which adds the light from the surface. In regular tracking, T cancels with the probability of not scattering in front of the surface. I think in the MIS algorithm the terms do not cancel, so they should be present.

Moreover, my gut feeling is that something may be wrong where the "hero wavelength" is picked, because for the hero-wavelength MIS scheme of Wilkie et al. [2014] to work, one would have to trace one path distributed according to the pdf for red, one according to the pdf for green, and so on. My understanding is that only the combined estimator will result in "clean" air.
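If that reading is right, the combined estimator boils down to dividing the contribution by the channel-averaged pdf (the one-sample balance heuristic, assuming the hero channel is drawn uniformly). A tiny sketch, with names of my own choosing:

```python
def spectral_mis_estimate(f, p):
    """One-sample balance heuristic over the per-channel pdfs:
    dividing by the average pdf is what the combined estimator
    reduces to when the hero channel is chosen uniformly."""
    p_avg = sum(p) / len(p)
    if p_avg == 0.0:
        return [0.0] * len(f)
    return [fi / p_avg for fi in f]
```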

Please let us know any tips the authors share. I'm very interested, too.

Best regards

Statistics: Posted by dawelter — Fri May 08, 2020 7:07 am


I started with a pure-scattering volume: some density in the center of a larger volume box, so there are many empty areas. I matched results against the reference delta tracker in the monochromatic case; however, the spectral (RGB, actually) version has issues:

https://i.gyazo.com/32dd6d8a42fc54601e2cd62d28185634.png (volume box)

https://i.gyazo.com/3ddf9e9183cd84d54f260f981c2ff200.png (monochrome pure scattering)

https://i.gyazo.com/9bf4f2f997d62366a01058836f54e57c.png (problematic chromatic pure scattering)

In the last, problematic case, I set up my RGB majorant (combined extinction) with a somewhat red-dominant value (density in the volume is restricted to [0,1]) and got that weird red-filled air.

Once the new position x is outside the volume box, I use the same return-value expression as on an absorption event: return EnvironmentValue * f / ((p.r + p.g + p.b) / 3)

As I can see from the algorithm

https://i.gyazo.com/c2e70967e1a2c674ca5be5695a0ae57d.png

those air zones without density will always be null-scattered until the ray gets outside the box. But according to lines 9 and 10, the null coefficient Un, which equals Umajorant in the absence of Uabs and Uscatt, influences the subpath contribution and pdf. These f and p cancel each other perfectly in the monochromatic case, but in the color case the average of p is no longer red-dominant and can't compensate for f. I am sure I got something wrong in the above derivations, but I can't understand what exactly.

Just for clarity: I assume p and f are RGB values which, on each subpath, get multiplied with themselves, with T (the transmittance for the subpath length t and the Umajorant RGB extinction), and with the RGB extinction sampled from the volume (absorption, scattering, or null, depending on which event type is chosen).

And this produces one more issue: once I go with scattering/null/absorption coefficients above 1, p and f quickly grow so large that a 32-bit float can't hold them (after something like 100 bounces I get INF).
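Here is the per-collision update spelled out the way I described it above, as a minimal Python sketch (names are simplified placeholders of mine, not the paper's code):

```python
import math

def collision_update(f, p, t, u_maj, u_event, event_prob):
    """One collision of the tracking loop: both the contribution f and
    the pdf p are multiplied per channel by the transmittance
    T = exp(-u_maj * t); f additionally by the RGB coefficient of the
    chosen event (absorption, scattering, or null), and p by the
    majorant (the distance-sampling pdf density) and the scalar
    probability of picking that event."""
    T = [math.exp(-um * t) for um in u_maj]
    f = [fi * Ti * ue for fi, Ti, ue in zip(f, T, u_event)]
    p = [pi * Ti * um * event_prob for pi, Ti, um in zip(p, T, u_maj)]
    return f, p
```

In the monochromatic case, with event_prob = u_event / u_maj, f and p stay equal at every collision and their ratio is exactly 1; in RGB only the products themselves keep growing.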

Could anyone point me in the right direction? (I have contacted the authors directly, but they are busy at the moment and suggested asking here.)

Thanks.

Statistics: Posted by gigacore — Tue Apr 21, 2020 9:29 am


Statistics: Posted by JasonSmith — Sun Apr 12, 2020 11:25 pm


I haven't been very concerned with memory so far. However, I can tell that this scene uses about 2k cells, each of which has to store the coefficients for two mixtures, plus some statistics for the EM algorithm. To be frank, I haven't tested the algorithm on much other than this one scene ... It's still very much work in progress.

And regarding the weighting, you mean in the expectation maximization? Yes, I absolutely needed that, because I only do forward path tracing. Hence, initially the distribution from which directions are sampled has nothing to do with the actual distribution of incident light. If you want to see the code, it's pretty terrible, but take a look if you like: https://github.com/DaWelter/ToyTrace/blob/master/src/distribution_mixture_models.cxx#L407
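For context, the weighting enters EM as a per-sample factor on the responsibilities. A stripped-down 1D Gaussian-mixture version (the real code fits directional mixtures; all names here are mine):

```python
import math

def weighted_em_step(samples, weights, means, sigmas, mix):
    """One EM iteration for a 1D Gaussian mixture where each sample
    carries a weight (e.g. a Monte Carlo path contribution) that is
    folded into the responsibilities."""
    K = len(means)

    def pdf(x, m, s):
        return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2.0 * math.pi))

    # E-step: responsibilities scaled by the per-sample weights.
    resp = []
    for x, w in zip(samples, weights):
        comp = [mix[k] * pdf(x, means[k], sigmas[k]) for k in range(K)]
        total = sum(comp)
        resp.append([w * c / total for c in comp])

    # M-step: re-estimate parameters from the weighted statistics.
    total_w = sum(sum(r) for r in resp)
    new_means, new_sigmas, new_mix = [], [], []
    for k in range(K):
        wk = sum(r[k] for r in resp)
        mk = sum(r[k] * x for r, x in zip(resp, samples)) / wk
        vk = sum(r[k] * (x - mk) ** 2 for r, x in zip(resp, samples)) / wk
        new_means.append(mk)
        new_sigmas.append(max(math.sqrt(vk), 1e-3))
        new_mix.append(wk / total_w)
    return new_means, new_sigmas, new_mix
```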

Edit: Obligatory Cornell box with water. This one is unfortunate, as my (poor) implementation of brute-force path tracing (still with NEE) beats my (also poor) implementation of the guided algorithm in an equal-time rendering. The guided algorithm seems more sample-efficient, though. P.S. A cell containing the mixtures and other data takes about 1.1 kB.

Statistics: Posted by dawelter — Sat Mar 14, 2020 2:48 pm


Statistics: Posted by koiava — Tue Mar 10, 2020 8:48 am
