Abstract
We want to understand the physics of color. In this project, we built a spectral ray tracer that traces rays of individual wavelengths sampled from the human eye's spectral sensitivity distributions. By modeling light dispersion we reproduced effects such as the chromatic aberration of a lens, thin-film interference in soap bubbles, and light sources of different color temperatures based on blackbody radiation.
We also provide our proposal link here: Project Proposal
Technical approach
Spectral Ray Tracer System
In order to build a spectral ray tracer, we make each ray carry a single wavelength. The system first samples a wavelength and assigns it to a ray; it then traces the ray into the scene, evaluates wavelength-dependent BSDFs, and returns a single radiance value. Intuitively, a spectral ray tracer requires more samples per pixel, since multiple rays are needed to recover the RGB value of a single pixel. In this project we take three color samples, so at least 3x as many rays are required to reach a reasonably converged image compared with a monochromatic ray tracer. More details of our spectral ray tracer follow.
- Color Sampling
We sample wavelengths according to the spectral sensitivity distributions of human vision. The human visual system has red, green, and blue receptors whose sensitivities can be approximately modeled as normal distributions centered at different wavelengths with different standard deviations, as shown below.
Human vision sensitivity
We approximate the three distributions as Red N(600, 25), Green N(550, 25), and Blue N(450, 15), and randomly sample from them to get a 'red', a 'green', and a 'blue' wavelength.
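The sampling step above can be sketched as follows; this is a minimal illustration, not our actual implementation, and the function and dictionary names are hypothetical:

```python
import random

# Approximate receptor sensitivities as normal distributions
# (mean wavelength in nm, standard deviation in nm), as described above.
CHANNELS = {"red": (600.0, 25.0), "green": (550.0, 25.0), "blue": (450.0, 15.0)}

def sample_wavelength(channel, rng=random):
    """Draw one wavelength (in nm) for the given color channel."""
    mean, sigma = CHANNELS[channel]
    return rng.gauss(mean, sigma)

# One 'red', one 'green', and one 'blue' wavelength for a pixel sample
wavelengths = {c: sample_wavelength(c) for c in CHANNELS}
```

Each sampled wavelength is then attached to a ray before it is traced into the scene.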
- Dispersion Function
The dispersion function describes the relationship between wavelength and the refractive index of a medium. For a specific medium, 6 parameters determine its dispersion function, which models the medium's dispersive power. For a ray traveling through a given medium, we apply the dispersion function shown below to calculate the refractive index. In general, the dispersion function is an inverse-like curve: the refractive index decreases with increasing wavelength. Intuitively, red light (longer wavelength) bends less than blue light (shorter wavelength). The figure below on the right plots the dispersion function for the glass type N-BK7 (SCHOTT).
(LEFT) Dispersion Formula and (RIGHT) the dispersion curve for a particular type of glass.
More specifically, we searched for different types of lens materials to get their dispersion formulas, and then modified the parser of the lens file to read in the 6 parameters for each lens element.
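A 6-parameter dispersion formula of this kind is the Sellmeier equation, n²(λ) = 1 + Σᵢ Bᵢλ²/(λ² − Cᵢ); the sketch below evaluates it with the published SCHOTT N-BK7 coefficients (λ in micrometers). This is an illustration of the calculation, not our parser code:

```python
import math

# Published Sellmeier coefficients for SCHOTT N-BK7 (wavelength in micrometers)
B = (1.03961212, 0.231792344, 1.01046945)
C = (0.00600069867, 0.0200179144, 103.560653)

def refractive_index(wavelength_nm, B=B, C=C):
    """Sellmeier dispersion: n^2 = 1 + sum_i B_i * l^2 / (l^2 - C_i)."""
    l2 = (wavelength_nm * 1e-3) ** 2  # nm -> micrometers, then squared
    n2 = 1.0 + sum(b * l2 / (l2 - c) for b, c in zip(B, C))
    return math.sqrt(n2)
```

As expected from the dispersion curve, shorter wavelengths yield a larger index (e.g. `refractive_index(450.0) > refractive_index(650.0)`).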
With color sampling and the dispersion function giving the correct refractive index for each wavelength, we can use the lens tester application to test our spectral ray propagation. In the lens tester shown below, we can observe the color band caused by dispersion: red light bends the least while blue light bends the most. Notice that because of dispersion, light of different wavelengths cannot focus at a single point, which causes the chromatic aberration visible in the rendered scenes later on.
Spectral Lens Tester
- Wavelength-dependent BSDF
Now we need to modify all the BSDFs to be wavelength-dependent. An intuitive way to think of this is that a red wall reflects red light the most while reflecting almost no blue or green light. To achieve this, we apply a wavelength-dependent weight to the albedo of each material.
Wavelength-dependent albedo formula
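One way to realize such a weight is a Gaussian falloff around the material's dominant wavelength. The sketch below is a hypothetical stand-in for the formula in the figure (the function name, the choice of a Gaussian, and the default width are our illustrative assumptions):

```python
import math

def spectral_albedo(base_albedo, wavelength_nm, center_nm, sigma_nm=40.0):
    """Hypothetical wavelength-dependent weight: a Gaussian falloff around the
    material's dominant wavelength, applied to the scalar albedo."""
    weight = math.exp(-((wavelength_nm - center_nm) ** 2) / (2.0 * sigma_nm ** 2))
    return base_albedo * weight
```

For a red wall centered near 620 nm, a 'red' ray keeps most of its energy while a 'blue' ray is attenuated almost to zero.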
Black Body Radiation
With the spectral ray tracer system, we are able to model light sources as objects with blackbody radiation. We follow Planck's Law:
Planck's Law
where k_B is the Boltzmann constant, h the Planck constant, and c the speed of light in the medium. Given a temperature and wavelength, we can therefore calculate the energy emitted by the light source. One more detail: by Wien's displacement law, the product of the peak wavelength and the temperature is a constant, so for a given temperature we normalize the emitted energy by the radiance at the peak wavelength predicted by Wien's law.
Thin Film Rendering
There are stunning natural phenomena caused by light interference; thin films are one of them, and the effect can be rendered with our spectral ray tracer. In this project we specifically considered soap bubbles, but a similar effect can be seen on an oil slick or a compact disc.
- Light Interference
The key concept in thin-film rendering is light interference. When two waves of the same wavelength are exactly in phase, the result is perfect reinforcement; when exactly out of phase, perfect cancellation; and when they are neither exactly in phase nor exactly out of phase, the result is somewhere in between. A straightforward illustration is shown below.
Destructive and constructive waves
A thin film makes light waves of different wavelengths constructive or destructive under different conditions, and the resulting visibility or invisibility of different wavelengths produces the color bands on thin-film surfaces. The phase difference among wavelengths comes from their different effective optical path lengths (EOPL).
- Phase Shift
We treat light as waves. When light hits the surface of the thin film, some of it bounces right back while the rest enters the film and eventually returns to the air, where it interferes with the light that bounced off directly. The phase shift has two components: one is the extra path traveled by the wave inside the film (highlighted in orange in the left figure below; A2 travels an extra path inside the film compared with A1); the other is the optical phase shift, in which light shifts by half its wavelength when it reflects off a medium of higher refractive index. If the extra path inside the film is d, the effective optical path length is EOPL = d + λ/2. A little more math gives us the equation:
EOPL equation
where w is the thickness of the thin film and n is its index of refraction. In reality, light bounces many times until its energy dies out, as shown in the figure below on the right, but we found that a single bounce inside the thin film is enough to render a realistic thin-film effect.
Light path in thin film
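A common closed form for the in-film path is 2·n·w·cos(θ_t); the sketch below uses that form plus the half-wavelength shift, and turns the EOPL into a smooth interference weight (1 when the EOPL is an integer multiple of λ, 0 when fully destructive). The function names and the cosine weighting are our illustrative assumptions, not the exact code:

```python
import math

def eopl(thickness_nm, n_film, cos_theta_t, wavelength_nm):
    """Effective optical path length: extra in-film path 2*n*w*cos(theta_t)
    plus the half-wavelength shift from reflecting off the denser medium."""
    return 2.0 * n_film * thickness_nm * cos_theta_t + wavelength_nm / 2.0

def interference_weight(thickness_nm, n_film, cos_theta_t, wavelength_nm):
    """1 when fully constructive (EOPL an integer multiple of lambda),
    0 when fully destructive, and smoothly in between."""
    phase = 2.0 * math.pi * eopl(thickness_nm, n_film, cos_theta_t, wavelength_nm) / wavelength_nm
    return 0.5 * (1.0 + math.cos(phase))
```

Because the weight depends on the ratio of EOPL to wavelength, a single film thickness boosts some wavelengths and cancels others, which is what produces the color bands.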
- Fresnel Approximation
We used Schlick's approximation to model the amounts of refraction and reflection. But remember that we need to consider multiple bounces at the thin-film surface. In the figure below, blue rays are refracted rays and orange rays are reflected rays. We calculate two reflection coefficients: one for the entering ray (labeled r1) and one for the ray that enters the film and reflects off its inner surface (labeled r2). The total reflection coefficient is then r1 + r1 * r2 * (1 - r2), as shown below. We also calculate the refraction coefficient. These two values should sum to one if they are exact, but since we are using an approximation, we normalize each by their sum.
Fresnel Approximation for thin film effect
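The combination above can be sketched as follows; the reflection term follows the report's r1 + r1·r2·(1 − r2), while the refraction term (1 − r1)(1 − r2) and the function names are our illustrative assumptions:

```python
def schlick(cos_theta, n1, n2):
    """Schlick's approximation to the Fresnel reflectance."""
    r0 = ((n1 - n2) / (n1 + n2)) ** 2
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5

def thin_film_coefficients(cos_i, cos_t, n_air, n_film):
    """Combine the outer reflection r1 with the inner reflection r2 using the
    report's r1 + r1*r2*(1 - r2), assume refraction (1 - r1)*(1 - r2), then
    renormalize so the two coefficients sum to one."""
    r1 = schlick(cos_i, n_air, n_film)   # ray entering the film
    r2 = schlick(cos_t, n_film, n_air)   # ray reflected inside the film
    reflect = r1 + r1 * r2 * (1.0 - r2)
    refract = (1.0 - r1) * (1.0 - r2)
    total = reflect + refract
    return reflect / total, refract / total
```

The final normalization step enforces energy conservation even though each coefficient is only approximate.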
- Thickness Approximation
One more detail for rendering a soap bubble is that its thickness is not constant, due to gravity: the bottom of the bubble is thicker than the top. We model this as a linear change in thickness with the height of the ray intersection point. A more realistic model could add Perlin noise to the thickness.
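The linear model is a simple interpolation with height; the endpoint thicknesses below are illustrative values, not the ones used in our renders:

```python
def bubble_thickness(hit_y, bottom_y, top_y,
                     thick_bottom_nm=700.0, thick_top_nm=300.0):
    """Linearly interpolate film thickness with height: thicker at the bottom
    (gravity pulls the film down), thinner at the top."""
    t = (hit_y - bottom_y) / (top_y - bottom_y)  # 0 at bottom, 1 at top
    return thick_bottom_nm + t * (thick_top_nm - thick_bottom_nm)
```

Since the interference weight depends on thickness, this height variation is what makes the color bands drift across the bubble surface.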
Modifying dae files
Blender didn't play nicely with the .dae files we were using in our ray tracer: it would load a giant box around the scene, making it difficult to edit. It may have been possible to work around this, but it was easier to edit the .dae files manually with a text editor. Inside a dae file, a scene has nodes, each containing a geometry/transformation-matrix pair along with material information. The geometry information contains a mesh of vertices and the faces constructed from them, and the material information sets things like dispersion, lighting, and extra effects such as glass or chrome. Making a scene with many spheres is easy because they all share the same geometry and differ only in position and scale, whereas making a prism is harder: it needs specific vertex and face information as well as a rotation, making it a much more difficult node to create.
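For illustration, a node of the kind described above might look like the sketch below; element names follow COLLADA conventions, and the ids, matrix values, and material names are hypothetical:

```xml
<node id="sphere1" name="sphere1">
  <!-- Row-major 4x4 transform: scale and position only for a sphere -->
  <matrix>1 0 0 0.5  0 1 0 1.0  0 0 1 0  0 0 0 1</matrix>
  <instance_geometry url="#sphere_mesh">
    <bind_material>
      <technique_common>
        <instance_material symbol="mat" target="#glass_material"/>
      </technique_common>
    </bind_material>
  </instance_geometry>
</node>
```

A prism node would additionally need its own mesh under the referenced geometry and a rotation baked into the matrix, which is why it was harder to author by hand.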
Results
Warm and Cool Lighting
By setting the temperature to 3500K (warm light) and 7500K (cool light), we are able to render our scene with different temperature atmospheres.
Cornell box under different temperature lightings (LEFT) rendered with 3500K (RIGHT) rendered with 7500K
Chromatic Aberration
Our spectral ray tracer can show the chromatic aberration caused by a lens that cannot focus light of different wavelengths at a single point. The Cornell box image on the left shows this effect prominently at the light. Since we define our focus metric as focusing the green wavelength, blue and red light cannot be focused, and the mixture of the two produces a magenta-like halo around the area light on the ceiling. This effect is usually called "purple fringing" in photography. On the right, a similar chromatic aberration can be seen at the edges of the scene.
Rendered scenes with chromatic aberration
Soap Bubbles
All our renderings use 4096 samples per pixel and a maximum ray depth of 100. In both images, the lens focuses on the biggest soap bubble on the right. We can observe prominent light interference producing color bands (more obvious in the later chrome-bubble renderings). The reflection of the environment can also be seen in the soap bubbles.
Rendered soap bubbles
In order to observe more striking light interference at the soap bubble surface, we manually set the refraction to zero for all rays hitting the bubbles, leaving only the effect of reflections.
Notice that the right rendering has some color leaking that causes highlight artifacts. Under the same rendering settings (4096 samples per pixel and depth 100 for both), the relative position between the environment lighting and the bubbles affects the resulting quality; more samples might be needed for the second scene to converge and be noise-free.
Chrome Bubbles
Rendered chrome bubbles
Contributions
Cecilia and Ashwin
Ashwin: Changed raytrace_pixel to ask for multiple ray samples for each color channel, then combined those color channels into a Spectrum
Ashwin: Rewrote bsdfs of colored objects to return a wavelength dependent magnitude as opposed to a constant spectrum argument
Cecilia: Changed camera.generate_ray to take in a color channel argument and sample that color channel's wavelength distribution (Gaussian) to set the ray's wavelength
Cecilia: Modified lens file and parser to include dispersion formula used for refractive index calculation
Cecilia: Wrote environment lighting that uses environment maps for the scene
Cecilia: Changed lenstester to take in wavelength dependent glass and bend lights based on different wavelengths
Cecilia: Changed lens_camera’s tracing through the lens to use the wavelength argument to change indices of refraction when tracing through the lens
Cecilia: Implemented black body radiation
Cecilia: Wrote the technical and result sections of the report (since graduate students are required to put more detail into the report)
Ashwin/Cecilia: Refactored code so that rays carry a wavelength argument that can be passed in and checked, and so that functions that returned Spectrums now return a single intensity value
Ashwin/Cecilia: Changed sample_L of lights to have a wavelength-dependent intensity to simulate different colors of light (initializing lights with a temperature argument and modeling them as ideal black bodies to get the intensity for each color)
Ashwin/Cecilia: Rewrote the glass BSDF to have wavelength-dependent indices of refraction (similar code to lens_camera's tracing)
Ashwin/Cecilia: Wrote a bubble/thin-film interference BSDF that uses wavelength, thickness, and light to determine whether interference occurs (integer multiples of the wavelength)
Justin
I mostly played a support role in this project. I didn't affect the main pipeline directly, but I helped debug in some of the later stages of the project and made design decisions with the team early on. Most of my help came through providing a copious number of .dae files; most of them we didn't end up using in the presentation or even the final renders. I made empty Cornell boxes, several bubble scenes, and finally a prism scene. I actually spent most of my time on that scene, trying to find the best position and angle for the prism, as well as reducing the size of the area light and helping to make it shine only downward, like a laser. I created the prism from scratch and transformed it with a 4x4 rotation matrix. We think we didn't have enough light samples, which is why we couldn't find anything resembling a rainbow in the scene. A high-sample-rate image is rendering now, so the scene wasn't added to the writeup or presentation, but the bubble scenes did a good job holding their own and showing off the effects by themselves.