Writing a basic OSL color shader

The following is an introduction to basic OSL shader syntax, using a simple color blending shader as an example. A more general explanation of the subject can be read here.

Notes:
> It’s highly recommended to get acquainted with basic C language syntax, since it’s the basis for common shading languages like OSL, HLSL and GLSL.
> More detailed information about writing OSL shaders can be found in the osl-languagespec PDF document in the Imageworks OSL GitHub repository.

This example shader blends 2 color sources according to the surface viewing angle (aka "facing ratio", "incident angle" or "Perpendicular-Parallel"). The user can choose a facing ("front") color or texture and a side color or texture, and the shader's output will be a mix of these 2 inputs that depends on the angle the surface is viewed at.

This is the full code of the shader; you're also welcome to download it here.

 #include "stdosl.h"
 
shader cglColorAngleBlend
[[ string help = "Blend colors by view angle" ]]
(
	color facing_color = color(0, 0, 0)
		[[ string help = "The color (or texture) that will appear at perpendicular view angle" ]],
	color side_color = color(1, 1, 1)
		[[ string help = "The color (or texture) that will appear at grazing view angle" ]],
	float base_blend = 0.0
		[[ string help = "The percent of side_color that is mixed with facing_color at perpendicular view angle",
		float min = 0.0, float max = 1.0]],
	float curve_exponent = 1.0
		[[ string help = "A power exponent value by which the blend value is raised to control the blend curve",
		float min = 0.001, float max = 10.0]],
	output color color_out = color(1, 1, 1))
{
	// calculate the linear facing ratio:
	float facing_ratio = acos(abs(dot(-I,N))) / M_PI_2;
	// calculate the curve facing ratio:
	float final_blend_ratio = pow(facing_ratio , curve_exponent);
	// blend the facing color:
	color final_facing_color = (facing_color * (1 - base_blend)) + (side_color * base_blend);
	// blend and output the final color:
	color_out = ((final_facing_color * (1 - final_blend_ratio)) + (side_color * final_blend_ratio));
}

The first statement:

#include "stdosl.h"

The #include statement is a standard C preprocessor directive that includes the OSL standard library file stdosl.h in the shader's code, so that the OSL data types and functions used in the code will be recognized.
* Some systems compile the code successfully without this statement. I'm not sure whether their compiler includes stdosl.h automatically or not.

The double-square bracketed statements provide both help annotations and value range limits for the shader parameters:

[[ string help = "The percent of side_color that is mixed with facing_color at perpendicular view angle", float min = 0.0, float max = 1.0]]

Note that these statements are appended to individual parameters in the shader, right before the comma character that ends the parameter statement.
* Not all shading systems that support OSL also implement the annotations in the shader interface generated by the host software (the shader will work, but its parameters won't be described or limited to the defined value range).

Removing the #include statement, annotations and comments,
we can see that the OSL shader structure is very similar to a C function:

shader <identifier> (input/output parameters) {code}

shader cglColorAngleBlend
(
color facing_color = color(0, 0, 0),
color side_color = color(1, 1, 1),
float base_blend = 0.0,
float curve_exponent = 1.0,
output color color_out = color(1, 1, 1)
)
{
float facing_ratio = acos(abs(dot(-I,N))) / M_PI_2;
float final_blend_ratio = pow(facing_ratio , curve_exponent);
color final_facing_color = (facing_color * (1 - base_blend)) + (side_color * base_blend);
color_out = ((final_facing_color * (1 - final_blend_ratio)) + (side_color * final_blend_ratio));
}

First comes the shader type, in this case shader, followed by the shader identifier, in this case "cglColorAngleBlend":

shader cglColorAngleBlend

After the shader's type and identifier, a list of parameters is defined within parentheses, separated by commas. These parameters define the shader's inputs, outputs, and default values. Output parameters are preceded by the output modifier.

(
<parameter type> <parameter identifier> = <parameter default value>,
<parameter type> <parameter identifier> = <parameter default value>,
output <parameter type> <parameter identifier> = <parameter default value>
)

(
color facing_color = color(0, 0, 0),
color side_color = color(1, 1, 1),
float base_blend = 0.0,
float curve_exponent = 1.0,
output color color_out = color(1, 1, 1)
)

In this case the shader has 4 user input parameters, and 1 output parameter.
There are 2 color parameters, "facing_color" and "side_color", for the facing and side colors that will be blended together; a float* parameter "base_blend" that specifies how much of the side color will be mixed with the facing color regardless of view angle; and a second float parameter "curve_exponent" specifying a power exponent by which the blend value will be raised to create a non-linear blend curve.
The output parameter "color_out" is a color that will be calculated by the shader.
* Note that even though the output parameter will be calculated by the shader, it is required to define a default value for it for the shader to compile.

After the shader parameters, enclosed within curly braces, is the actual body of the shader code, containing the instructions, each ending with a semicolon (;) character:

{
<instruction>;
<instruction>;
…..
}

{
float facing_ratio = acos(abs(dot(-I,N))) / M_PI_2;
float final_blend_ratio = pow(facing_ratio , curve_exponent);
color final_facing_color = (facing_color * (1 - base_blend)) + (side_color * base_blend);
color_out = ((final_facing_color * (1 - final_blend_ratio)) + (side_color * final_blend_ratio));
}

In the case of our shader, the first code instruction defines a new internal float variable named "facing_ratio", calculates the surface/view angle, and assigns the resulting value to it:

float facing_ratio = acos(abs(dot(-I,N))) / M_PI_2;

The expression:

acos( abs( dot( -I, N ) ) )  / M_PI_2

calculates the incident angle, i.e. the angle between 2 vectors originating at the surface shading point, one pointing towards the origin of the incoming ray and the other being the surface normal, as a factor of 0 to 1 representing 0 – 90 degrees.
These 2 vectors are easily obtained through the built-in OSL global variables N and I. N is the surface normal at the shading point, and I is the incoming ray vector pointing to the shading point, which is inverted in this case to point backwards by typing a minus before it: -I.
The incident angle is calculated in radians as the arc-cosine of the dot-product of N and -I, and then divided by half π to convert it to a linear factor of 0 to 1 representing 0 to 90 degrees, M_PI_2 being a convenient half-π constant. As a quick check of the extremes: when the surface directly faces the viewer, dot(-I, N) is 1, acos(1) is 0 and the ratio is 0, while at a grazing angle dot(-I, N) approaches 0, acos(0) is π/2 and the ratio is 1.
* M_PI is a full π and M_2PI is 2π, representing 180 degrees and 360 degrees in radians respectively (OSL provides more constants in this series).

The second instruction raises the facing ratio that was calculated in the previous instruction by a power value provided by the curve_exponent input parameter, to create a non-linear angle/color blend when the exponent is set to values other than 1.0.
The resulting modified blend value is stored in a new internal variable final_blend_ratio:

float final_blend_ratio = pow(facing_ratio , curve_exponent);

Note:
We could avoid setting a new variable by modifying the value of the facing_ratio variable, and we could also combine these 2 instructions into 1 bigger expression like this:

pow( acos( abs( dot( -I, N ) ) ) / M_PI_2 , curve_exponent )

But I decided to keep it separated into 2 variables and 2 instructions for clarity.
* Try modifying the code as an exercise.
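
If you'd like a starting point for that exercise, this is a minimal sketch of the combined form, equivalent to the 2 separate instructions above:

// calculate the curved facing ratio in a single instruction:
float final_blend_ratio = pow(acos(abs(dot(-I, N))) / M_PI_2, curve_exponent);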

The third instruction modifies the input color facing_color by premixing it with the input side_color according to the percent given by the input parameter base_blend, and assigns the resulting color to a new internal variable named final_facing_color:

color final_facing_color = (facing_color * (1 - base_blend)) + (side_color * base_blend);

The expression:

( facing_color * ( 1 - base_blend ) ) + ( side_color * base_blend )

Calculates a linear combination** (linear interpolation) between the 2 input colors, using base_blend as a 0 – 1 factor between them.
* Note that OSL allows arithmetic operations to be freely defined between colors and floats.
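
As a side note, OSL's standard library also provides a mix() function that performs exactly this kind of linear interpolation, so the same instruction could also be written like this (just a sketch, not the form used in this shader):

// mix() blends the 2 colors: facing_color * (1 - base_blend) + side_color * base_blend
color final_facing_color = mix(facing_color, side_color, base_blend);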

The fourth and final instruction creates the final mix between the modified facing color stored in the final_facing_color variable and the side color given by the input color parameter side_color, by again calculating a linear combination between the 2 colors, this time using the final_blend_ratio value we calculated previously as the combination factor. Very importantly, it finally assigns the mixed result to the shader output parameter color_out, so that it becomes the final output of the shader:

color_out = ((final_facing_color * (1 - final_blend_ratio)) + (side_color * final_blend_ratio));

This screen capture shows this shader at work in Blender and Cycles, connected to a Principled BSDF shader as its base color source:
Annotation 2020-06-17 235933

That's it! 🙂
I hope you find this article informative and useful.

Clarifications:

* A "float" data type is simply the computer-science way of saying "accurate non-integer number". When we have to store numbers that describe geometry and color, we need a data type that isn't limited to integers, so for that purpose we use float values. There's actually a lot more to the formal definition of a float in computer science, but for our purpose here this will suffice.

** A Linear Combination, or Linear interpolation (lerp) is one of the most useful numerical operations in 3D geometry and color processing (vector math):
A * ( 1 - t ) + B * t
A and B being your source and target locations, colors, or any other values you need to interpolate, and t being the blending factor from 0 to 1.

Related posts:

  1. What are OSL shaders?
  2. Using OSL Shaders in Maya and Arnold
  3. Using OSL Shaders in Blender and Cycles
  4. Using OSL Shaders in 3ds max and V-Ray

Blender 2.83 – OSL bug and fix

Software:
Blender 2.83

I recently found that some of my OSL shaders don’t work in Blender 2.83.
More specifically, shaders compiled with Blender 2.83 had their microfacet BSDF closures not recognized by the renderer and returned 0.
* Shaders that were compiled by previous versions did work.
Looking into the problem, I found that the OSL source files that ship with Blender 2.83 are significantly different from the files shipped with previous versions.

Simple fix:
Copy the stdosl.h and oslutil.h from Blender 2.82 to 2.83.
On Windows these files are found in this folder:
C:\Program Files\Blender Foundation\Blender 2.83\2.83\scripts\addons\cycles\shader

* I'm not suggesting downloading the files straight from the Imageworks GitHub repository, because I'm not sure whether the ones that ship with Blender & Cycles are modified or not.

I have reported the bug here.

Related:
Using OSL Shaders in Blender & Cycles

Using OSL Shaders in Blender & Cycles

Software:
Blender 2.82 | Cycles

The Cycles render engine in Blender has a very convenient OSL Shader development and usage workflow.
Shaders can either be loaded from external files or written and compiled directly inside Blender.

Before you begin:
Make sure your Blender scene is set to use the Cycles render engine, in CPU rendering mode, and also check the option Open Shading Language:
Annotation 2020-05-28 165830

 

To write an OSL shader in Blender:

  1. Write your shader code in Blender's Text Editor (a minimal test shader is sketched after these steps):
    Annotation 2020-05-28 173405
  2. In your object’s material shader graph (Shader Editor view),
    Create a Script node:
    Annotation 2020-05-28 170331
  3. Set the Script node‘s mode to Internal,
    And select your shader’s text from the Script node‘s source drop-down:
    Annotation 2020-05-28 170711
  4. If the shader compiles successfully, the Script node will display its input and output parameters, and you can connect its output to an appropriate input in your shading graph.
    * If your shader is a material (color closure), connect it directly to the Material Output node's Surface input; if it's a volume, to the Volume input; or if it's a texture, to other material inputs as needed.
    Annotation 2020-05-28 171419
  5. If the shader code contains errors, it will fail to compile, and you'll be able to read the error messages in Blender's System Console window:
    Annotation 2020-05-28 172423
  6. After fixing errors or updating the shader's code, press the Script Node Update button on the Script node to re-compile the shader:
    Annotation 2020-05-28 172735
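
If you just need something quick to paste into the Text Editor to test this workflow, here is a minimal sketch of a test shader (the name and output are arbitrary) that simply outputs the shading point's UV coordinates as a color:

#include "stdosl.h"

shader cglUVTest
(
	output color color_out = color(0, 0, 0)
)
{
	// visualize the surface UV coordinates as red and green:
	color_out = color(u, v, 0);
}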

 

Loading an external OSL shader into Cycles:

Exactly the same workflow described in the previous section, except setting the Script node's mode to External and either typing a path to the shader file in the Script node or pressing the little folder button to locate it using the file browser:
Annotation 2020-05-28 173159

 

Related:

  1. What are OSL shaders?
  2. Using OSL shaders in Arnold for Maya
  3. Using OSL shaders in V-ray for 3ds max
  4. Writing a basic OSL color shader
  5. Blender 2.83 OSL bug & fix

Using OSL Shaders in V-Ray for 3ds max

Software:
3ds max 2020 | V-Ray next

V-Ray for 3ds max supports compiling and rendering OSL shaders,
And also offers some handy shaders for download on the V-Ray documentation website.
Note:
OSL shaders are supported only in V-Ray Advanced and not in V-Ray GPU.

 

To load an external OSL shader:

  1. For a material (color closure) shader, create a:
    Materials > V-Ray > VRayOSLMtl
    For a texture shader create a:
    Maps > V-Ray > VRayOSLTex
  2. In the VRayOSLMtl or VRayOSLTex‘s General properties,
    Click the Shader File slot-button to locate and load the *.osl file.
  3. Provided that the shader has loaded and compiled successfully,
    You will now be able to set its custom parameters in its Parameters section:
    Annotation 2020-05-25 200439
  4. If compile errors are found, you'll be able to read the error messages in the V-Ray messages window:
    Annotation 2020-05-25 200625

 

To write an OSL shader:

  1. To write a material shader (color closure) create a:
    Materials > V-Ray > VRayOSLMtl
    To write a texture shader create a:
    Maps > V-Ray > VRayOSLTex
  2. Expand the Quick Shader section of the node's properties,
    And check the Enable option.
  3. Write your OSL code and press Compile (a minimal test shader is sketched after these steps).
    Annotation 2020-05-25 204835
  4. Provided that the shader compiled successfully,
    You will now be able to set its custom parameters in its Parameters section:
    Annotation 2020-05-25 205225
  5. If compile errors are found, you'll be able to read the error messages in the V-Ray messages window.
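
As a quick test for the Quick Shader workflow mentioned in step 3, here is a minimal sketch of a texture-type shader (the name and parameters are arbitrary) that outputs a simple UV-based checker color:

#include "stdosl.h"

shader cglSimpleChecker
(
	float scale = 4.0
		[[ string help = "Number of checker squares per UV unit" ]],
	output color color_out = color(0, 0, 0)
)
{
	// alternate between black and white squares in UV space:
	float checker = mod(floor(u * scale) + floor(v * scale), 2.0);
	color_out = color(checker);
}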

 

Related:

  1. What are OSL shaders?
  2. Using OSL shaders in Arnold for maya 
  3. Using OSL shaders in Blender & Cycles
  4. Writing a basic OSL color shader

Using OSL Shaders in Arnold for Maya

Software:
Maya 2020 | Arnold 6

Autodesk Maya 2020 & Arnold 6 offer a flexible OSL development and usage workflow.
You can either load or write OSL shaders on the fly, compile, test, and render them,
And also define a shader folder path for shaders to be available as part of your library for all projects.
Steps for using OSL shaders in Maya & Arnold:

Writing an OSL shader or loading it for single use (just the current project):

  1. Create a new aiOslShader node:
    Annotation 2020-05-19 220416
  2. Select the new aiOslShader node and in its attributes either write new OSL code in the OSL Code section, or press Import to load an OSL shader file (*.osl):
    Annotation 2020-05-19 220454
  3. When new shader code is imported, it’s automatically compiled:
    Annotation 2020-05-19 220649
  4. If you've written new code or changed the code, it will have to be re-compiled.
    In that case press Compile OSL Code:
    Annotation 2020-05-19 223632
  5. The code may contain errors; in that case you will see a red Compile Failure message:
    Annotation 2020-05-19 224342
    You can read the error message in the Maya output window or in the Maya Script Editor. Correct the code and press Compile OSL Code again.
    Annotation 2020-05-19 224410
  6. After the OSL code is compiled successfully, the shader’s input parameters can be accessed in the OSL Attributes section below the code:
    Annotation 2020-05-19 224431
  7. Depending on the type of output the OSL shader generates, the aiOSLShader node should be connected to an input in the object's shader graph or Shading Group.
    * OSL shaders can be surface shaders, volume shaders, procedural textures, texture processors and more.
    To Apply the OSL shader to an object as a surface shader, disconnect the object’s current surface shader if it has one,
    And then drag and drop the aiOSLShader node from the Hypershade window onto the object.
    In the Connection Editor, select outValue on the left side (node outputs) and surfaceShader on the right side (object inputs):
    Annotation 2020-05-19 220810

 

 

Installing OSL shaders so they will always be available as custom nodes in the Hypershade library

  1. Create a folder for storing your OSL shaders, and place your OSL shader files (*.osl) in this folder.
    Annotation 2020-05-20 225134
  2. Locate Maya’s Maya.env file.
    This is an ASCII text file containing environment variables that Maya loads at startup.
    The Maya.env will usually be located at:
    C:\Users\<your user>\Documents\maya\<maya version>
    Annotation 2020-05-20 225221
  3. Open Maya.env in a text editor and add the following line to it:
    ARNOLD_PLUGIN_PATH=<path to your OSL shaders folder>
    for example:
    ARNOLD_PLUGIN_PATH=D:\3DAssets\OSL_Shaders
    Annotation 2020-05-20 225400
  4. Restart Maya.
    When Maya loads, the MtoA (Maya to Arnold) plugin will automatically compile the shaders found in the folder, report the compilation results or errors in the Maya output window, and create a compiled *.oso file for each shader:
    Annotation 2020-05-20 230917
  5. The compiled shaders will now be available as custom nodes in the Hypershade Arnold library with the typical “ai” (Arnold Interface) prefix added to their names:
    Annotation 2020-05-20 231138
  6. The OSL shaders will be created as nodes with their editable attributes, that can be connected to an object’s shading network graph:
    * Connecting the node to the graph is the same as described in the previous part (7)
    Annotation 2020-05-20 231232

 

Related:

  1. What are OSL Shaders?
  2. Using OSL shaders in V-Ray for 3ds max
  3. Using OSL shaders in Blender & Cycles
  4. Writing a basic OSL color shader

What are OSL shaders?

OSL is an acronym for Open Shading Language.
Originally developed at Sony Pictures Imageworks for the Arnold render engine, Open Shading Language is a C-like programming language with which custom materials, textures and shading effects can be developed as OSL shaders (*.osl files), which are supported by many popular render engines.

OSL allows development of complex texturing and shading effects using scene input parameters like the shading point's world position vector, normal vector, UV coordinates etc., and optical ray-tracing functions, called BSDFs (Bidirectional Scattering Distribution Functions) or "Color Closures" in OSL, like Diffuse, Glossy and Refraction light scattering, all of which can be combined with C-like logic and math programming.
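
To make the idea of a color closure more concrete, here is a minimal sketch of a material shader (just an illustration, not one of the shaders discussed in these posts) that outputs a diffuse closure tinted by an input color:

shader simple_diffuse_material
(
	color base_color = color(0.8, 0.8, 0.8),
	output closure color BSDF = 0
)
{
	// diffuse() is one of OSL's built-in closure functions,
	// and N is the built-in surface normal global variable
	BSDF = base_color * diffuse(N);
}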

*.osl files are compiled to *.oso files for rendering.
Most render engines supporting OSL shaders ship with an OSL compiler.

 

Useful OSL shader libraries found on the web:

> OSL Shaders for download on the Chaos Group website:
https://docs.chaosgroup.com/display/OSLShaders/OSL+Shaders+Home

> OSL Shaders for download at the Autodesk Developer Network Github repository:
https://github.com/ADN-DevTech/3dsMax-OSL-Shaders
These are the OSL shaders that ship with 3ds max 2019 or newer, and they provide texture and pattern processing tools, but not materials.
* Material shaders or “Closures” as they are referred to in OSL are not supported by 3ds max’s native implementation of OSL.

> A library of OSL shaders collected by Shane Ambler:
https://github.com/sambler/osl-shaders

 

Notes:

  1. In general, OSL shaders are supported only in CPU rendering, and not by GPU renderers. There are some attempts to develop OSL support for GPU renderers, but as far as I know they are limited.
  2. Some OSL shaders will work on one or more render engines, and not work as expected on others, the reason being that each render engine has its own implementation of OSL.
    These differences may show as a different rendered result or even a compile failure.

 

Basic example:
The following example renders show how a combination of two basic OSL shaders I've written, one a dielectric material shader and the other a color/angle blend procedural texture, produces fairly consistent results when rendered in different render engines.
* Note the difference in specular roughness interpretation for the same 0.1 value.

> You're welcome to download these two basic OSL shaders here.

Arnold for Maya:
Annotation 2020-05-22 213609

V-Ray for 3ds max:
Annotation 2020-05-22 213629

Cycles for Blender:
Annotation 2020-05-22 213723

 

Related:

  1. Using OSL shaders in Arnold for Maya
  2. Using OSL shaders in V-Ray for 3ds max
  3. Using OSL shaders in Blender & Cycles
  4. Writing a basic OSL color shader

Understanding Transparency Render Settings

In theory, all clear* refractive surfaces should have their shadows calculated using a refractive caustics calculation in order to render the refractive lensing** effect correctly, have their transparency color calculated as volumetric absorption of light through the medium in order to render the color correctly for areas of different thickness, and have not only external reflections but also internal reflections calculated, in order to render the interaction between light and the transparent body correctly.
However, for thin surfaces of even thickness, like window glazing and car windshields, these optical effects can be rendered with much cheaper (non-physical) methods, with very little compromise on final image quality or look, and often with an easier setup as well.
For this reason most popular render engines have object (mesh) and material (shader) parameters that allow configuration of the way these transparency effects will be rendered.
In this short article we’ll cover the different methods for rendering transparency effects, the reasoning behind them and the way to configure these settings in different render-engines.

In the comparison images below (rendered with Cycles), the images on the left were rendered with physically correct glass settings, 8192 samples + denoising,
And the images on the right were rendered with “flat” transparency settings and 1024 samples + denoising.
> See the shader settings below
Note that while for the monkey statue, the fast flat transparency settings produce an unrealistic result, the window glazing model loses very little of its look with the flat fast settings:

Transparency_Settings

Lensing, caustics and transparent shadows:

3D-Rendering-of-glassware

It's a common intuitive mistake to think that transparent objects don't cast shadows, but they actually do. They don't block light, they change its direction. Light is refracted through them and gets focused in some areas of their surroundings (caustics), but it can't pass through them directly, so a shadow is created.
A good example of this would be a glass ball acting like a lens, focusing the light into a tiny area, and otherwise having a regular elliptical shadow. If we tell the render engine to just let direct light pass through the object, we won't get a correct, realistic result, even if the light gets colored by the object's transparency color.
There is however one case where letting the direct light simply pass through the object can both look correct and save a lot of calculations, and that is when the object is a thin surface with consistent thickness like window glazing.
So in many popular render engines, when rendering an irregular, thick, solid transparent body like a glass statue or a glass filled with liquid, we have to counter-intuitively set the object or material to be opaque for direct light and let the indirect refracted light (caustics) create the correct lensing effect (focused light patterns in the shadow area).
> Physically, light passing through a material medium is always refracted, i.e. indirect light. But for thin surfaces with even thickness like glazing, the lensing effect is insignificant and can be completely disregarded by letting light pass directly through the object and be rendered as a 'transparent shadow'.
So the general rule regarding calculating caustics (lensing) vs. casting transparent shadows (non-physical) is that if the transparent object is a solid irregular shape with varying thickness, like a statue or a bottle of liquid, it should be rendered as opaque for direct light but with fully calculated caustics, i.e. refracted indirect light.

Transparency color:

Cola_Test_ODED_ERELL_3D_Crop_signed

Physically, the color of transparency*** is always created by volumetric absorption of light traveling within the material medium. As light travels further through a material, more and more of its energy gets absorbed in the medium**** (converted to heat), therefore the thicker the object, the less light will reach its other side, and the darker it will appear. This volumetric absorption of light isn't consistent for all wavelengths (colors) of light, so the object appears to have a color.
For example, common glass absorbs red and blue light at a higher rate than green light, and therefore objects seen through it will appear greenish. When we look at the thin side of a common glazing surface, we see a darker green color because we see light that has traveled through more glass (a thicker volume of glass), as refraction bends the light along the length of the surface. Tea in a glass generally looks dark orange-brown, but if spilled on the floor it will 'lose' its color and look clear like water, because spread that thin it's too shallow to absorb a significant amount of light and appear to have a color.
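The underlying relation here is the Beer–Lambert law: the fraction of light transmitted through a thickness d of the medium falls off exponentially, roughly T = exp(-a × d), where a is the material's absorption rate (per color channel) and d is the distance the light travels through the medium. Doubling the thickness therefore squares the transmitted fraction rather than simply halving it, which is why thickness has such a strong effect on the perceived color.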
Most render engines allow setting the transparency ("refraction"/"transmission") color of the material either as a 'flat', non-physical filter color, or as a physical RGB light absorption rate (sometimes referred to as 'fog' color), which can in some cases be tuned more accurately by additional multiplier or depth parameters.
Setting an object's transparency color using physical absorption (fog) usually requires more tweaking, because in this method the final rendered color depends not only on the color we set in the material/shader, but also on the model's actual real-world thickness.*****
In general, the transparency color of thick, solid, irregularly shaped objects (with varying thickness) must be set as a physical absorption rate color, and not as a simple filter color, otherwise the resulting color will not be affected by the material thickness, and look wrong.
For thin surfaces with consistent thickness, like window glazing, however, it's more efficient to set up the transparency color as a 'flat' filter color, because it's more convenient and predictable to set up, and it produces a correct-looking result.
For example, if we need to render an architectural glazing surface that will filter exactly 50 percent of the light passing through it, it's much simpler to set it up using a simple 50% grey transparency filter color, because this method disregards the glass model's thickness. This approach isn't physical, but for an evenly thick glazing surface, the result has no apparent difference from a physical volumetric absorption approach to the same task.
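In terms of the absorption relation above, transmitting exactly 50% through a pane of thickness d would mean solving exp(-a × d) = 0.5, i.e. a = ln(2) / d ≈ 0.693 / d, a value that would have to be re-derived whenever the glass thickness changes, which is exactly why the thickness-independent 50% grey filter color is so much simpler to set up for glazing.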

Internal reflections:

Diamond-close-up-inspection

It’s not intuitive to think that the air surface itself has reflections when seen through a transparent material volume like water or glass.
Viewed from under water, the air surface above acts like a mirror at certain angles, reflecting objects that are under water. A glass ball lit by a lamp has a very distinct highlight, which is the reflected image of the light source itself (specular reflection), but it also has an internal highlight appearing on the inside, where the glass volume meets the air volume. We can easily 'miss' this internal highlight because in many cases its appearance converges with the bright focused light behind the ball caused by lensing (refractive caustics). The distinctly shiny appearance of diamonds, for example, is very much dependent on bright internal reflections. Diamond cutting patterns are specifically designed to reflect a large percentage of light back to the viewer and look shiny, and if we wish to create a realistic rendering of diamonds, we will not only have to set up the correct refractive index for the material, but also model the geometric shape of the diamond correctly, and of course set the material to render both external ("regular"******) reflections and internal reflections.
You're probably already guessing what I'm about to say next..
For thin surfaces with even thickness, the internal reflection is barely noticeable, because it converges with the main surface reflection, and for this reason, when rendering window glazing, car windshields and the like, we can usually turn the internal reflections calculation off to save render time.

Underwater_31.12.18

Render Settings:

Simplified settings summary table:

                Flat (Glazing)        Physical (irregular volume)
Shadow          Transparent           Caustics
Color           Filter                Volumetric Absorption
Reflections     External only         External and Internal

Example Cycles (Blender) shaders:
> The Flat glazing shader is actually more complex to define, since it involves defining different types of calculations for different types of rays being traced (cheating).
In general, for Shadow and Diffuse rays the shader is calculated as a simple Transparent shader and not a refraction shader, and when back-facing, the shader is calculated as pure white transparent instead of glossy to remove the internal reflections.
> While the flat glazing shader is only connected to the Surface input of the Material Output node, the physical glass shader also has a Volume Absorption node connected to the Volume input of the Material Output node.
> Note that a simple Principled BSDF material will have flat transparency and physical shadow (caustics) by default.
> For caustics to be calculated, the Refractive Caustics option has to be enabled in the Light Path > Caustics settings in the Cycles render settings.

Cycles

Example V-Ray Next for 3ds max material settings:
> In V-Ray for 3ds max (and Maya), the Affect Shadows parameter in the VRayMtl Refraction settings determines whether the shadows will be fake transparent shadows suitable for glazing (on) or opaque (off), which is the suitable setting for caustics.
> The caustics calculation is either GI Caustics which is activated by default in the main GI settings or a dedicated Caustics calculation that can be activated, also in the GI settings.
> For flat glazing the color is defined as Refraction Color and for physical glass the Refraction color is pure white and the glass color is set as Fog color.

V-Ray_Glass

Example Arnold for Maya settings:
> In Arnold 5 for Maya the Opaque setting in the shape node Arnold attributes must be unchecked for transparent shadows, and checked for opaque shadows suitable for caustics.
> For rendering refractive caustics in Arnold for Maya more settings are needed.
> When the Transmission Depth attribute is set to 0 the Transmission Color will be rendered as flat filter color, and when the Transmission Depth attribute is a value higher than 0 the transparency color will be calculated as volumetric absorption reaching the Transmission Color at the specified depth.

ArnoldMaya

General notes:

> In brute-force path tracers like Cycles and Arnold, the caustics calculation is actually a Diffuse indirect light path. This seems unintuitive, but the light pattern appearing on a table surface in the shadow of a transparent glass is actually part of the table surface's diffuse reflection phenomenon.

> What we refer to as 'Diffuse Color' in dielectric materials (non-metals) is actually a simplification of the absorption of light scattered inside the object volume (SSS).

* Optically, all dielectric materials (non-metals) are refractive, but not all of them are also clear. That is, most of them actually have micro particles or structures within their volume that scatter and absorb light traveling through them, creating the effects we're used to referring to as "Subsurface Scattering" (SSS) and, at higher densities, "diffuse reflection".

** Lensing is a term used to describe the effect of a material medium bending light, focusing and dispersing it, and so acting as a lens.

*** Actually all color in dielectric (non metallic) materials is created by Volumetric Absorption.

**** Light isn’t only absorbed as it travels through medium, it’s also scattered.

***** Volumetric shading effects usually use the model's original scale (the true mesh scale), so to avoid unexpected results it's best that the object's transform scale be 1.0 (or 100%, depending on the program's notation).

Related Posts:
> Cycles Nested Transparencies
> Arnold for Maya Refractive Caustics
> Arnold for Maya Transmission Scattering
> Understanding Fresnel Reflections
> Advanced Architectural Glazing shader for Blender
> V-Ray Underwater Rendering

Understanding the Photometric Light Measurement Units

There are two sets of light intensity measurement units:
Photometric units and Radiometric* units.
The Photometric units measure the intensity of visible light** as it is perceived by the human eye, and the Radiometric units measure the intensity of electromagnetic radiation***, which is the broader physical phenomenon of light, including the whole spectrum of radiation beyond visible light** (like x-rays and infrared radiation for example).

Light intensity is generally measured in three ways:

1. The directional intensity received from a light source as it is measured from a point in space, i.e. Luminous Intensity in Photometric units or Radiant Intensity in Radiometric units.

2. The total light intensity output a light source emits in all directions, i.e. Luminous Flux in Photometric units or Radiant Flux in Radiometric units.

3. The amount of light intensity received by a surface from all directions, i.e. Illuminance in Photometric units or Irradiance in Radiometric units.

Similarly to the way horsepower bases the measurement of mechanical power on the power of an ideal horse, the Photometric measurement units base the scale of light intensity on the light emitted by an ideal candle.

Ster

Luminous Intensity (Candela):
When measured from any point in space, the Luminous Intensity of an ideal candle in the direction of that point is 1 'Candela', i.e. 1 'cd'.
> In 3D rendering, a photometric IES file describes a light source's light beam pattern by listing the Luminous Intensity (cd) of the light source in different directions.

For light-emitting surfaces like LCD screens, the related quantity Luminance is measured in Candelas per 1 square meter of surface, i.e. cd/m2. Typical LCD computer monitors, for instance, have a Luminance of about 250 cd/m2. Imagine your computer screen displaying pure white and extended to an area of 1m x 1m; the light intensity perceived from it would be as if about 250 candles were spread over the whole area.

Luminous Flux (Lumen):
The amount of light emitted by an ideal candle through a conic beam distribution of 1 solid angle 'Steradian'**** is measured as 1 'Lumen' or 1 'lm'. The total Luminous Flux of the candle in all directions is 4 x π lumens, i.e. about 12.56 lumens, 4π simply being the whole surface area of the unit sphere.
> The total amount of light produced by different kind of light bulbs is usually specified by Luminous Flux measurement i.e how many Lumens does the light source output.
> If an optical reflector is placed next to a light source, focusing all its light output into a narrow direction, it won't change the light source's Luminous Flux (Lumen) output, but since the same Luminous Flux will be focused into a narrower beam, it will have a higher Luminous Intensity (cd) measured from that direction, and therefore surfaces in that direction will receive a brighter Illuminance (Lux) (see below).

Illuminance (Lux):
A surface area of 1 m2 (square meter) receiving illumination of 1 lumen has a measured Illuminance of 1 lux, or 1 lx. Illuminance is measured by how many lumens a surface receives per square meter.
> In photography, the amount of Illuminance at which a surface is lit is important for determining the proper photographic exposure for the picture.

The inverse-square law:
As a light beam travels through space its distribution covers a larger and larger area, therefore its energy per area is reduced. The light energy a candle emits through 1 solid angle steradian will, at a distance of 1 meter, cover an area of 1 square meter, therefore that area of 1 square meter will receive 1 lumen of light energy and will be illuminated with an illuminance of 1 lux. As that 1 lumen of light energy travels another 1 meter further, to a distance of 2 meters from the candle, it spreads and covers an area of 4 square meters. Each square meter of the 4 now receives just 1/4 of a lumen, so it's illuminated by only 1/4 lux. As that 1 lumen of light energy travels another 1 meter further, to a distance of 3 meters from the candle, it spreads and covers an area of 9 square meters. Each square meter of the 9 now receives just 1/9 of a lumen, so it's illuminated by only 1/9 lux. After a distance of 4 meters, the same 1 lumen of light energy will be spread over an area of 16 square meters, so each square meter will be illuminated by 1/16 lux. You can already see the emerging pattern: the illumination intensity is inversely proportional to the square of the distance to the light source. This phenomenon is referred to as 'The inverse-square law', and in practical terms it means that surface illumination is greatly influenced by its distance from the light source.
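Written compactly (for an idealized point light source): Illuminance = Luminous Intensity / distance², or E (lux) = I (cd) / d², so a 1 cd source produces 1 lx at 1 meter, 1/4 lx at 2 meters, 1/9 lx at 3 meters, and so on.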

inv

Notes:

* Radiometric units measure light intensity in Watts.
Note that this isn't the Watt measurement of electric consumption we're used to classifying electric light sources with, but a Watt measurement of the actual energy carried by the light itself.

** Electromagnetic radiation of wave lengths that stimulate the human eye.

*** Also referred to as ‘light‘ in physics.

**** A 'Steradian', also referred to as a 'square radian', is a measurement unit of 3D conic span or 'solid angle'. A solid angle of 1 Steradian beginning at the center of a unit sphere covers an area of exactly 1 (radius squared) on the surface of the sphere (the whole surface area of the unit sphere being 4π). The Steradian can be thought of as the Radian's 3-dimensional 'cousin'.

Related posts:
IES lighting in CG
Fresnel reflections

Basic Cloth Material in Arnold for Maya

Software:
Maya 2018 | Arnold 5

An example of a basic traditional (not scanned) cloth material setup in Arnold 5 for Maya using an aiStandardSurface shader.

Untitled-1.jpg

The shading network uses a classic angle dependent color blend to simulate the color of the cloth being washed out at grazing angle of view.

Explanation of the node graph:

Untitled-2.jpg

  1. A black and white fabric weave texture that will serve as input for multiple shading channels.
    * This is actually not the best example of such a pattern, and could be replaced with a much better texture.
    cotton grey bump
  2. A remapValue node is used to set contrast to the fabric pattern (reduce contrast in this case) prior to it being multiplied with the fabric colors.
    * Note that only one of the texture's RGB channels is connected to the remapValue node, since it's a float (mono) processor and not RGB.
    Untitled-3
    * Note that depending on the fabric texture, you may have to design different curves to achieve the right effect.
  3. Two colors are defined with colorConstant nodes:
    A deep color as the main fabric color, and a washed out color for grazing angle view (“side color”).
  4. An aiFacingRatio node is used as an input for incident angle info.
    * Note that in this case I checked the node's invert option to make it behave more like other systems I'm used to (if you don't use invert, the angle blend curve in step 5 will be different).
  5. A remapValue node is used to set the angle blend curve, or in other words, how much the color appears washed out per change of view angle of the cloth surface.
    * The longer it takes the curve to become steep from left to right, the more dominant the main color will be before the washed-out color appears.
    Untitled-4
  6. A colorCorrect node is used in this example just as a way to convert the remapped float value back to RGB for being multiplied with the cloth colors.
    * We could also connect it directly to the individual float components of the RGB colors but this way the node graph is cleaner.
  7. A multiplyDivide node is used to multiply the processed fabric texture with the 2 fabric colors “baking” the pattern into the color.
  8. A blendColors node is used to blend the 2 processed fabric colors together according to the processed facingRatio angle input.
    The result is the final cloth color that is connected to the aiStandardSurface shader.
  9. An aiBump2d node is used to convert the fabric pattern to normal data that will be connected to the aiStandardSurface shader to produce bumps.
  10. An aiStandardSurface shader serving as the main shading node for this material.
    * Note that under Geometry the Thin Walled option is checked so that the Subsurface layer of the shader will act as a Paper Shader rather than SSS.
    * The main cloth color is connected to the SubSurface Color input.
    Untitled-5

 

More Arnold shading posts

Simple Snow Material in VRay for 3ds max

Software:
3ds max 2019 | V-Ray Next

A simple way to create a snow material in V-Ray for 3ds max is to combine a VRayFastSSS2 material with a VRayFlakesMtl using a VRayBlendMtl.
The VRayFastSSS2 creates the soft translucent shading for the snow, and the VRayFlakesMtl adds sparkling highlights.

  • Note that depending on the scene and view scale,
    The VRayFlakesMtl's 'flake glossiness', 'flake density' and 'flake size' parameters have to be tweaked to achieve the wanted result.

Untitled-1

Untitled-2

Untitled-3