
Raytracing concepts and code, part 9, adding a background image



This is an article in a multipart series on the concepts of ray tracing. I am not sure where this will lead, but I am open to suggestions. We will be creating code that runs inside Blender. Blender has ray tracing renderers of course, but that is not the point: by reusing Python libraries and Blender's scene building capabilities we can concentrate on true ray tracing issues like shader models, lighting, etc.
I generally present stuff in a back-to-front manner: first an article with some (well commented) code and images of the results, then one or more articles discussing the concepts. The idea is that this encourages you to experiment and have a look at the code yourself before being introduced to theory. How well this works out we will see :-)

So far the series consists of several articles, all labeled ray tracing concepts.

When a ray does not hit anything we would like it to intersect with some infinite spherical background.

In Blender it is quite easy to associate a texture with the world background. This texture is used by Blender Internal (but not by Cycles, which uses node-based world settings exclusively):



(The important bit when creating a new texture is to select World texture and of course an appropriate mapping for the image. [Here we use an image from HDRIHaven.])

The ray tracing renderer that we are writing will make use of this texture for the background image as well as for some environment lighting (more on this in a future article).

Elevation and azimuth

Any image texture object offers an evaluate() function that takes a coordinate whose x and y components lie in [-1,1] and returns the color at that point. The point (0,0) will be exactly in the middle of the image, regardless of its dimensions.
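For example, a quick way to probe this from Blender's Python console (a minimal sketch, assuming the current scene has a World with an active image texture):

    import bpy

    # a sketch: assumes the scene's World has an active (image) texture
    tex = bpy.context.scene.world.active_texture
    # (0, 0, 0) samples the exact middle of the image; the result may
    # contain an alpha channel besides the rgb components
    print(tex.evaluate((0.0, 0.0, 0.0)).xyz)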

If we assume this image covers the whole skydome, this means that all we have to do when a ray doesn't hit anything in the scene is to find out how high above (or below) the horizon the ray is pointing, and in what direction along the horizon.

The first item is called the elevation (or altitude) and ranges from -90° (or -pi/2, pointing straight down) to +90° (+pi/2, i.e. straight up).

The second item is called the azimuth and ranges from -180° (-pi) to +180° (+pi) relative to some fixed direction. We will use the positive x-axis as our reference.

Given a normalized direction in Cartesian (x,y,z) coordinates, finding the elevation (theta) and azimuth (phi) is pretty straightforward.
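In Python this boils down to the following (a sketch; acos and atan2 come from the standard math module):

    from math import acos, atan2, pi

    def elevation_azimuth(x, y, z):
        # assumes (x, y, z) is a normalized direction vector
        elevation = pi/2 - acos(z)  # [-pi/2, pi/2]; z is the cosine of the angle to the z-axis
        azimuth = atan2(y, x)       # [-pi, pi], measured from the positive x-axis
        return elevation, azimuth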

Code


Calculating theta and phi and mapping them to [-1,1] is shown in the code below:

    # the ray hit nothing: sample the world background texture instead
    elif scene.world.active_texture:
        # dir is normalized, so dir.z is the cosine of the angle to the z-axis;
        # remap so that straight down -> 0 and straight up -> 1
        theta = 1 - acos(dir.z)/pi
        # angle around the z-axis relative to the +x axis, scaled to [-1,1]
        phi = atan2(dir.y, dir.x)/pi
        # evaluate() takes x and y in [-1,1]; .xyz drops any alpha channel
        color = np.array(scene.world.active_texture.evaluate(
                                     (-phi, 2*theta - 1, 0)).xyz)

Because the dir vector is normalized, the z component gives us the cosine of the angle between the vector and the z-axis. The conversions are necessary to relate this angle (theta) to the horizontal plane and scale it to [-1,1]. The angle phi is calculated relative to the positive x-axis. The color we get back from the evaluate() call may contain an alpha channel, so we make sure we keep only the first three components with the .xyz attribute.
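As a quick sanity check of this mapping, a hypothetical helper that returns just the lookup coordinates behaves as expected for a couple of obvious directions:

    from math import acos, atan2, pi

    def uv_from_dir(x, y, z):
        # the same remapping as in the renderer code above
        theta = 1 - acos(z)/pi
        phi = atan2(y, x)/pi
        return -phi, 2*theta - 1

    print(uv_from_dir(1, 0, 0))  # along +x: (-0.0, 0.0), the center of the image
    print(uv_from_dir(0, 0, 1))  # straight up: (-0.0, 1.0), the top edge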
The code is available on GitHub. It also contains the necessary code to show a panel with options to associate a background texture with a World (it reuses a panel from Blender's internal renderer).


