Last Time? Monte Carlo Rendering

Overview:
Monte-Carlo Integration
Probabilities and Variance
Analysis of Monte-Carlo Integration
Monte-Carlo in Graphics
Stratified Sampling
Importance Sampling
Advanced Monte-Carlo Rendering

Monte-Carlo computation of π
Take a square.
Take a random point (x, y) in the square.
Test if it is inside the quarter disc (x^2 + y^2 < 1).
The probability of landing inside is π/4.

Monte-Carlo computation of π
The probability is π/4.
Count the inside ratio: n = (# inside) / (total # trials).
Then π ≈ n × 4.
The error depends on the number of trials.

Convergence & Error
Let's compute 0.5 by flipping a coin:
1 flip: 0 or 1; average error = 0.5
2 flips: 0, 0.5, 0.5, or 1; average error = 0.25
4 flips: 0 (×1), 0.25 (×4), 0.5 (×6), 0.75 (×4), 1 (×1); average error = 0.1875
This does not converge very fast: doubling the number of samples does not double the accuracy.
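The quarter-disc experiment above can be sketched in a few lines of Python. This is a minimal illustration, not code from the slides; the function name `estimate_pi` is ours.

```python
import random

def estimate_pi(num_samples, seed=0):
    """Estimate pi by drawing random points in the unit square and
    counting how many fall inside the quarter disc x^2 + y^2 < 1.
    The inside ratio approximates pi/4, so we multiply by 4."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(num_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y < 1.0:
            inside += 1
    return 4.0 * inside / num_samples
```

With a few hundred thousand samples the estimate typically lands within a few hundredths of π, consistent with the slow convergence discussed below.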
Questions?

Review of Probability (discrete)
A random variable can take discrete values x_i, with probability p_i for each x_i; 0 ≤ p_i ≤ 1 and Σ p_i = 1.
Expected value: E[x] = Σ_i p_i x_i.
A function f(x_i) of a random variable is also a random variable.

Variance & Standard Deviation
Variance σ^2: deviation from the expected value, i.e. the expected value of the squared difference:
σ^2 = E[(x − E[x])^2]
Also: σ^2 = E[x^2] − (E[x])^2.
Standard deviation σ: the square root of the variance (a notion of error, RMS).

Monte Carlo Integration
To compute ∫ f(x) dx, consider N random samples x_i drawn over the domain with probability density p(x).
Define the estimator: F_N = (1/N) Σ_i f(x_i) / p(x_i).

Example
For a test integrand we know the answer should be 1.0.
The probability p allows us to sample the domain more intelligently.
In practice, with uniform samples the variance of the estimator is σ^2 / N, so the error decreases as σ / √N.
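The estimator F_N = (1/N) Σ f(x_i)/p(x_i) translates directly into code. A small sketch, with names of our choosing; the test integrand 2x over [0, 1] has integral exactly 1.0, matching the slide's example value.

```python
import random

def mc_integrate(f, p, sample, num_samples, seed=0):
    """Monte Carlo estimator: average f(x_i) / p(x_i) over N samples
    x_i drawn from the density p.  `sample` draws one x from p."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(num_samples):
        x = sample(rng)
        total += f(x) / p(x)
    return total / num_samples

# Uniform sampling of [0, 1]: p(x) = 1 everywhere.
# The integral of 2x over [0, 1] is 1.0.
estimate = mc_integrate(lambda x: 2.0 * x,        # integrand f
                        lambda x: 1.0,            # uniform density p
                        lambda rng: rng.random(), # draw x ~ p
                        num_samples=100_000)
```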
Monte Carlo Analysis
We want to compute the average of a function. We can pick a random value of this function and hope to fall in the middle; the chances are slim, but on average we'll be right. For N tries, the expectation stays the same, but the variance decreases by a factor of N, so the standard deviation (the error) decreases by a factor of √N. That is, quadrupling the number of sampled points will halve the error, regardless of the number of dimensions.

Advantages of MC Integration
Few restrictions on the integrand: it doesn't need to be continuous, smooth, etc.; we only need to be able to evaluate it at a point.
Extends to high-dimensional problems with the same convergence.
Conceptually straightforward.
Efficient for solving at just a few points.

Disadvantages of MC
Noisy.
Slow convergence.
A good implementation is hard: debugging the code, debugging the math, choosing appropriate techniques.
Punctual technique: no notion of the smoothness of the function.

Monte Carlo Recap
Turn the integral into a finite sum.
Use random samples.
1/√n convergence, independent of dimension.
Very flexible.
Tweak the sampling/probabilities for optimal results.
A lot of integration and probability theory to get things right.

Questions?
(Image comparison: 1 glossy sample per pixel vs. 256 glossy samples per pixel.)
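The 1/√N claim can be checked empirically on the coin-flip example from the first slide: the RMS error of averaging N fair coin flips is 0.5/√N, so quadrupling N should roughly halve it. A small experiment (our own construction, not from the slides):

```python
import random

def rms_error(num_flips, num_trials=20000, seed=1):
    """RMS error of estimating 0.5 by averaging `num_flips` fair coin
    flips, measured over many independent trials."""
    rng = random.Random(seed)
    total_sq = 0.0
    for _ in range(num_trials):
        mean = sum(rng.random() < 0.5 for _ in range(num_flips)) / num_flips
        total_sq += (mean - 0.5) ** 2
    return (total_sq / num_trials) ** 0.5

# Quadrupling the number of samples should roughly halve the RMS error:
err_n = rms_error(16)    # theory: 0.5 / sqrt(16) = 0.125
err_4n = rms_error(64)   # theory: 0.5 / sqrt(64) = 0.0625
```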
What can we integrate?
Pixel: antialiasing
Light sources: soft shadows
Lens: depth of field
Time: motion blur
BRDF: glossy reflection
Hemisphere: indirect lighting

Domains of integration
Pixel, lens (Euclidean 2D domain)
Time (1D)
Hemisphere: work is needed to ensure uniform probability.
Light source: same thing; make sure that the probabilities and the measures are right.

Example: Light source
Integrate over the surface or over the solid angle.
Be careful to get the probabilities and the integration measure right!
(Images from the ARNOLD Renderer by Marcos Fajardo: sampling the source uniformly vs. sampling the hemisphere uniformly.)

Questions?

Important issues in MC rendering
Reduce variance!
Choose a smart probability distribution.
Choose smart sampling patterns.
And of course, cheat to make it faster without being noticed.
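"Work needed to ensure uniform probability" on the hemisphere is worth making concrete: picking the angle θ uniformly would bunch samples near the pole, because the solid-angle measure is sinθ dθ dφ. Picking z = cosθ uniformly instead gives a constant density of 1/(2π) over the hemisphere. A sketch under that standard construction:

```python
import math
import random

def sample_hemisphere_uniform(rng):
    """Uniformly sample a direction on the unit hemisphere (z >= 0).
    Picking theta uniformly would be WRONG (too many samples near the
    pole); picking z = cos(theta) uniformly in [0, 1] gives a constant
    pdf of 1/(2*pi) with respect to solid angle."""
    z = rng.random()                        # cos(theta), uniform in [0, 1]
    phi = 2.0 * math.pi * rng.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))    # sin(theta)
    return (r * math.cos(phi), r * math.sin(phi), z)
```

The same care applies to area lights: sampling uniformly over the source's surface uses the area measure dA, not the solid-angle measure, and the estimator must divide by the matching density.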
Stratified sampling
With uniform sampling, we can get unlucky, e.g. all samples in a corner.
To prevent it, subdivide the domain Ω into non-overlapping regions Ω_i. Each region is called a stratum.
Take one random sample per Ω_i.

Example
(Borrowed from Henrik Wann Jensen: unstratified vs. stratified sampling.)

Stratified Sampling Recap
Cheap and effective.
Typical example: jittering for antialiasing.
Signal-processing perspective: better than uniform because there is less aliasing (spatial patterns).
Monte-Carlo perspective: better than random because of lower variance (error for a given pixel).

Glossy Rendering
Integrate the BRDF times the cosine times the incoming light over the hemisphere.

Sampling a BRDF
(Slides from Jason Lawrence: 5 samples/pixel and 25 samples/pixel, comparing uniform sampling with BRDF-proportional sampling.)
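The "one random sample per stratum" idea, applied to jittering a pixel for antialiasing, can be sketched as follows (function name ours):

```python
import random

def jitter_2d(nx, ny, rng):
    """Stratified (jittered) 2D sampling over the unit square: split it
    into an nx-by-ny grid of strata and take one uniform random sample
    inside each, so no stratum is left empty and no corner can receive
    all the samples."""
    return [((i + rng.random()) / nx, (j + rng.random()) / ny)
            for j in range(ny) for i in range(nx)]
```

Each of the nx*ny samples is guaranteed to lie in its own cell, which is exactly what rules out the unlucky "all samples in a corner" configuration.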
Sampling a BRDF
(Slide from Jason Lawrence: 75 samples/pixel, uniform sampling vs. BRDF-proportional sampling.)

Importance sampling
Choose p wisely to reduce variance: a p that resembles f.
It does not change the convergence rate (still 1/√N), but it decreases the constant.
(A uniform p is bad; a p that follows f is good.)

Results
(1200 samples/pixel; images by Veach and Guibas.)
Traditional importance function vs. better importance by Lawrence et al.
Naïve sampling strategy vs. optimal sampling strategy.

Questions?

The Rendering Equation
L(x′, ω′) = E(x′, ω′) + ∫ ρ_x′(ω, ω′) L(x, ω) G(x, x′) V(x, x′) dA
where E is the emission, ρ_x′ the BRDF, L(x, ω) the incoming light, G the geometric term, and V the visibility.
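A p that resembles f can reduce variance dramatically; in the extreme case where p is exactly proportional to f, every sample of f(x)/p(x) is identical and the variance drops to zero. A small sketch of our own (integrand 2x on [0, 1], true integral 1; sampling p(x) = 2x via inversion, x = √u):

```python
import math
import random

def estimate(num_samples, rng):
    """Compare uniform sampling with importance sampling for
    f(x) = 2x on [0, 1] (true integral = 1.0)."""
    f = lambda x: 2.0 * x

    # Uniform sampling: p(x) = 1, estimator averages f(x).
    uniform = sum(f(rng.random()) for _ in range(num_samples)) / num_samples

    # Importance sampling with p(x) = 2x, sampled by inversion x = sqrt(u).
    # p is exactly proportional to f, so f(x)/p(x) = 1 for every sample:
    # the estimator has zero variance.
    total = 0.0
    for _ in range(num_samples):
        x = math.sqrt(max(rng.random(), 1e-16))  # guard against u == 0
        total += f(x) / (2.0 * x)
    importance = total / num_samples
    return uniform, importance
```

In rendering, p can never match f exactly (f contains the unknown incoming light), which is why importance sampling improves the constant but not the 1/√N rate.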
Ray Casting
Cast a ray from the eye through each pixel.

Ray Tracing
Cast a ray from the eye through each pixel.
Trace secondary rays (light, reflection, refraction).

Monte-Carlo Ray Tracing
Cast a ray from the eye through each pixel.
Cast random rays from the visible point.
Accumulate the radiance contributions.

Monte-Carlo Ray Tracing
Cast a ray from the eye through each pixel.
Cast random rays from the visible point.
Recurse.

Monte-Carlo
Cast a ray from the eye through each pixel.
Cast random rays from the visible point.
Recurse.
Systematically sample the primary light.
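The "cast random rays from the visible point and accumulate radiance contributions" step can be sketched for the simplest case that has a known answer: a diffuse point under a constant-radiance sky, where the irradiance integral ∫ L cosθ dω equals π times the sky radiance. This toy example is ours; a real renderer would trace each ray against scene geometry instead of reading a constant.

```python
import math
import random

def irradiance_constant_sky(sky_radiance, num_samples, rng):
    """Estimate the diffuse irradiance integral over the hemisphere by
    casting uniformly distributed random rays (pdf = 1/(2*pi)) and
    accumulating each ray's contribution L * cos(theta) / pdf.
    For a constant sky the exact answer is pi * sky_radiance."""
    total = 0.0
    for _ in range(num_samples):
        cos_theta = rng.random()  # uniform hemisphere: cos(theta) ~ U[0, 1]
        total += sky_radiance * cos_theta * 2.0 * math.pi  # divide by 1/(2*pi)
    return total / num_samples
```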
Importance of sampling the light
(1 path per pixel: without explicit light sampling vs. with explicit light sampling.)

Monte Carlo Path Tracing
Trace only one secondary ray per recursion.
But send many primary rays per pixel (this performs antialiasing as well).
(4 paths per pixel.)

Results: 10 paths/pixel.
Results: 10 paths/pixel, glossy.
Results: 100 paths/pixel, glossy.

Questions?
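The one-secondary-ray-per-recursion structure can be illustrated with a deliberately abstract toy: the scene is collapsed to a single emission value and a single albedo (both hypothetical parameters), so geometry and BRDF sampling are out of scope. Only the recursion and its unbiased Russian-roulette termination are shown; the expected value is the geometric series emitted / (1 − albedo).

```python
import random

def path_radiance(emitted, albedo, rng, max_depth=64):
    """Toy path-tracing recursion: every bounce hits a surface that
    emits `emitted` and reflects a fraction `albedo` of the light from
    the next bounce.  One secondary ray per recursion; Russian roulette
    terminates the path without biasing the estimate."""
    q = 0.5                       # Russian-roulette continuation probability
    radiance = 0.0
    throughput = 1.0
    for _ in range(max_depth):
        radiance += throughput * emitted
        if rng.random() >= q:     # terminate the path
            break
        # Dividing by the continuation probability keeps the estimator
        # unbiased even though most paths are short.
        throughput *= albedo / q
    return radiance
```

Averaging many such paths per pixel (the "many primary rays" of the slide) converges to emitted / (1 − albedo); a single path is noisy, which is exactly the 1-path-per-pixel noise seen in the comparison images.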