OSL debugging tips

The other day I was debugging a shader that didn't produce the results I expected; in fact, it didn't produce any output at all. In its basic form it resembled the following shader:


shader S(
    vector position = P,
    output color Color = 0
){
    ... many calculations ...
    if( ... some condition ... ){ Color = 1; }
}

When I hooked this shader up in some node setup I expected it to produce a pattern, but all I got was uniform black. Obviously the condition was never true and Color was never assigned a value of 1. So I had to find out what some of the intermediate values were to get an idea of why the if statement never fired.

So I inserted a printf() call:


shader S(
    vector position = P,
    output color Color = 0
){
    ... many calculations ...
    printf("%.2f\n", position);
    if( ... some condition ... ){ Color = 1; }
}

I clicked the recompile button, looked in the console and ... nothing was printed! To be 100% sure that the shader was evaluated I changed it to:


shader S(
    vector position = P,
    output color Color = 0
){
    ... many calculations ...
    printf("%.2f\n", position);
    if( ... some condition ... ){ Color = 1; }
    Color = 0.5;
}

Now the node setup produced the expected shade of gray, so it was hooked up correctly and was indeed being executed. More surprisingly, the console now showed the printf() output as well.

This baffled me: if I commented out the final assignment to Color, the printf() was not executed. I thought I had found a bug in OSL and reported it. But was it a bug?

Constant folding

It turned out that this is intended behavior. The OSL compiler is a clever piece of software and does its best to eliminate unnecessary calculations. One of the tricks it employs is called constant folding, a technique in common use in many compilers. Basically, anything that can be calculated at compile time is calculated at compile time, for example:

twopi = 2 * 3.14;
fourpi = 2 * twopi;

becomes after compilation:

twopi = 6.28;
fourpi = 12.56;

This keeps your code readable while reducing the actual calculation done at runtime, which can lead to significant speedups when you are shading millions of samples.
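To make the idea concrete, here is a minimal Python sketch of constant folding over an expression tree. This is an illustration of the technique only, not how the OSL compiler is implemented, and it folds only literal operands (a real compiler also propagates constants through variables like twopi):

```python
import ast

class ConstantFolder(ast.NodeTransformer):
    """Fold binary operations whose operands are literal constants."""
    def visit_BinOp(self, node):
        self.generic_visit(node)  # fold the operands first (bottom-up)
        if isinstance(node.left, ast.Constant) and isinstance(node.right, ast.Constant):
            expr = ast.Expression(body=node)
            ast.fix_missing_locations(expr)
            value = eval(compile(expr, "<fold>", "eval"))
            return ast.copy_location(ast.Constant(value=value), node)
        return node

tree = ast.parse("twopi = 2 * 3.14\nfourpi = 2 * (2 * 3.14)")
folded = ast.fix_missing_locations(ConstantFolder().visit(tree))
print(ast.unparse(folded))  # prints: twopi = 6.28 and fourpi = 12.56
```

After folding, no multiplication is left in the program; the same happens, far more thoroughly, inside the OSL optimizer.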

OSL has a lot more of these tricks up its sleeve and can reduce quite complex expressions, even removing branches from if/else statements. But it goes a step further still: if the compiler detects that no output value is changed from its default by a shader, that shader is not executed at all.

For my example shader this implies that the condition in the if statement was apparently based on values that could be computed at compile time and was never true, so the if statement was optimized away. As a result no output value would ever be changed by the shader, and the whole shader was optimized away. This happens even if we put printf(), warning() or error() calls in the shader: statements that produce only side effects and do not contribute to the output of the shader are not, by themselves, considered enough to keep an if/else branch, or even the shader itself, from being optimized away.
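As a minimal illustration (a sketch; the exact decisions depend on the optimization level), a shader like the following can be eliminated completely, because its only branch condition is a compile-time constant and its output therefore never leaves its default:

```
shader Dead(
    output color Color = 0
){
    // always false and known at compile time: the branch is removed
    if (2 * 3.14 > 10.0){ Color = 1; }
    // a side effect alone does not keep the shader alive
    printf("you will not see this\n");
}
```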

Debugging OSL code

Now if we want to debug some code we might have a problem, because simply inserting a printf() call is no guarantee that the information we are interested in will be printed. Assigning a value to an output socket might not be enough either:


shader S(
    vector position = P,
    output color Color = 0
){
    ...
    printf(.....);
    Color = 1;
    Color = position[0];
}

Neither of the last two statements will ensure that the printf() call is executed. The first is an assignment of a constant and can be evaluated at compile time. The second depends only on the input and can be 'hardwired', that is, there is no need to execute the shader; the compiler can simply connect input to output.

So if we want to make sure the printf() executes, we need an assignment to Color that cannot be calculated at compile time, for example one of:

Color = position[0]>0;
Color = rand();

Another option is to create an extra output socket just for debugging and assign the variable you are interested in to this socket. If you connect it to a diffuse shader, the resulting color can tell you a lot, though of course not all variables are suitable for this approach (strings, to name one).
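Such a debug socket might look like this (a sketch; the noise() call stands in for whatever intermediate value you actually want to inspect):

```
shader S(
    vector position = P,
    output color Color = 0,
    output color Debug = 0
){
    float v = noise("perlin", position);
    if (v > 0.5){ Color = 1; }
    // Debug is always assigned from a runtime value, so it cannot be
    // folded away; connect it to a diffuse shader to visualize v
    Debug = v;
}
```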

New debugging options

From release r57660 (download it from the daily builds; I tested the 64-bit Windows version) you can disable OSL optimizations by assigning a value to the OSL_OPTIONS environment variable. On Windows this can be done like this:

  • open a command prompt
  • set the variable (don't use quotes)
    set OSL_OPTIONS=optimize=0
  • run Blender from the command line
    C:\path\to\blender.exe
  • don't forget to recompile your OSL shader
On unix-like systems you start from an xterm or equivalent and type:

  export OSL_OPTIONS="optimize=0"
  path/to/blender
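Because environment variables are inherited only by child processes, Blender must be started from the same shell. A quick sketch to convince yourself:

```shell
# disable the OSL optimizer for this shell session only
export OSL_OPTIONS="optimize=0"

# any child process started from this shell (such as Blender) inherits it
sh -c 'echo "child sees: $OSL_OPTIONS"'   # prints: child sees: optimize=0
```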
(If you do not use bash but tcsh, use setenv instead of export.) Setting these variables is local to the command prompt/xterm you are working in, so when you exit Blender and close the command prompt the optimization override is gone. Changing the environment at a more global level is not recommended. Also, when you're done it might be a good idea to remove all printf() calls from your production code, because printing to a console is extremely slow. If you still want to print information in exceptional situations, consider using error() for that purpose.

Conclusion

In many cases using printf() calls is still the method of choice for debugging your OSL shaders, but if nothing shows up on the console, make sure these calls are not being optimized away.
