UE4 – Enable DXR Raytracing

Software:
Unreal Engine 4.25

Steps for activating DXR Ray-tracing in a UE4 project:

  1. Project Settings:
    Platforms > Windows > Targeted RHIs:
    Set Default RHI to DirectX 12
    * RHI = Rendering Hardware Interface
  2. Project Settings:
    Engine > Rendering > Ray Tracing:
    Check Ray Tracing
    * Requires restarting the editor, and the project may take a while to load afterwards.
    * I’m actually not sure whether the delay in re-launching the project is caused by a full rebuild of the lighting or by shader compilation.
  3. Post Process Volume > Rendering Features > Reflections:
    Set Type to: Ray Tracing
  4. Post Process Volume > Rendering Features > Ray Tracing Reflections:
    Set Max Bounces to more than 1 if needed
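For reference, these UI settings end up in the project’s Config/DefaultEngine.ini file. The following is a rough sketch of what the relevant entries typically look like in UE 4.25 (section and key names from memory, so verify against your own project file after changing the settings in the editor):

[/Script/WindowsTargetPlatform.WindowsTargetSettings]
DefaultGraphicsRHI=DefaultGraphicsRHI_DX12

[/Script/Engine.RendererSettings]
r.RayTracing=True
r.SkinCache.CompileShaders=True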

No DXR Reflections:
Annotation 2020-07-05 011317

DXR Reflections on a GTX 1070 GPU:
Annotation 2020-07-05 020433

 

Related posts:

  1. UE4 Light calculation tips
  2. UE4 HDRI lighting
  3. UE4 – Technical model quality tips

Writing a basic OSL color shader

The following is an introduction to basic OSL shader syntax using a simple color blending shader example. A more general explanation of the subject can be read here.

Notes:
> It’s highly recommended to get acquainted with basic C language syntax, since it’s the basis for common shading languages like OSL, HLSL and GLSL.
> More detailed information about writing OSL shaders can be found in the osl-languagespec PDF document from ImageWorks’s OSL GitHub.

This example shader blends 2 color sources according to the surface viewing angle (aka “facing ratio”, “incident angle” or “perpendicular-parallel”). The user can choose a facing (“front”) color or texture and a side color or texture, and the shader’s output will be a mix of these 2 inputs that depends on the angle at which the surface is viewed.

This is the full code of the shader; you’re also welcome to download it here.

 #include "stdosl.h"
 
shader cglColorAngleBlend
[[ string help = "Blend colors by view angle" ]]
(
	color facing_color = color(0, 0, 0)
		[[ string help = "The color (or texture) that will appear at perpendicular view angle" ]],
	color side_color = color(1, 1, 1)
		[[ string help = "The color (or texture) that will appear at grazing view angle" ]],
	float base_blend = 0.0
		[[ string help = "The percent of side_color that is mixed with facing_color at perpendicular view angle",
		float min = 0.0, float max = 1.0]],
	float curve_exponent = 1.0
		[[ string help = "A power exponent value by which the blend value is raised to control the blend curve",
		float min = 0.001, float max = 10.0]],
	output color color_out = color(1, 1, 1))
{
	// calculate the linear facing ratio:
	float facing_ratio = acos(abs(dot(-I,N))) / M_PI_2;
	// calculate the curve facing ratio:
	float final_blend_ratio = pow(facing_ratio , curve_exponent);
	// blend the facing color:
	color final_facing_color = (facing_color * (1 - base_blend)) + (side_color * base_blend);
	// blend and output the final color:
	color_out = ((final_facing_color * (1 - final_blend_ratio)) + (side_color * final_blend_ratio));
}

The first statement:

#include "stdosl.h"

The #include statement is a standard C preprocessor directive that includes the OSL standard library file stdosl.h in the shader’s code, so that the OSL data types and functions used in the code will be recognized.
* Some systems compile the code successfully without this statement. I’m not sure whether their compiler includes stdosl.h automatically or not.

The double-square bracketed statements provide both help annotations and value range limits for the shader parameters:

[[ string help = "The percent of side_color that is mixed with facing_color at perpendicular view angle", float min = 0.0, float max = 1.0]]

Note that these statements are appended to single parameters in the shader right before the comma character that ends the parameter statement.
* Not all shading systems that support OSL also implement the annotations in the shader interface generated by the host software (the shader will work, but its parameters won’t be described or limited to the defined value range).

Removing the #include statement, annotations and comments,
We can see that the OSL shader structure is very similar to a C function:

shader <identifier> (input/output parameters) {code}

shader cglColorAngleBlend
(
color facing_color = color(0, 0, 0),
color side_color = color(1, 1, 1),
float base_blend = 0.0,
float curve_exponent = 1.0,
output color color_out = color(1, 1, 1)
)
{
float facing_ratio = acos(abs(dot(-I,N))) / M_PI_2;
float final_blend_ratio = pow(facing_ratio , curve_exponent);
color final_facing_color = (facing_color * (1 - base_blend)) + (side_color * base_blend);
color_out = ((final_facing_color * (1 - final_blend_ratio)) + (side_color * final_blend_ratio));
}

First the data type, in this case shader, followed by the shader identifier, in this case “cglColorAngleBlend”:

shader cglColorAngleBlend

After the shader’s type and identifier, a list of parameters is defined within parentheses, separated by commas. These parameters define the shader’s inputs, outputs, and default values. Output parameters are preceded by the output modifier.

(
<parameter type> <parameter identifier> = <parameter default value>,
<parameter type> <parameter identifier> = <parameter default value>,
output <parameter type> <parameter identifier> = <parameter default value>
)

(
color facing_color = color(0, 0, 0),
color side_color = color(1, 1, 1),
float base_blend = 0.0,
float curve_exponent = 1.0,
output color color_out = color(1, 1, 1)
)

In this case the shader has 4 user input parameters, and 1 output parameter.
2 color type parameters, “facing_color” and “side_color” for the facing and side color that will be blended together, a float* parameter “base_blend” that specifies how much of the side color will be mixed with the facing color regardless of view angle, and a second float parameter “curve_exponent” specifying a power exponent by which the blend value will be raised to create a non-linear blend curve.
The output parameter “color_out” is a color that will be calculated by the shader.
* Note that even though the output parameter will be calculated by the shader, a default value must be defined for it for the shader to compile.

After the shader parameters, enclosed within curly braces, is the actual body of the shader code, containing the instructions, each ending with a semicolon ; character:

{
<instruction>;
<instruction>;
…..
}

{
float facing_ratio = acos(abs(dot(-I,N))) / M_PI_2;
float final_blend_ratio = pow(facing_ratio , curve_exponent);
color final_facing_color = (facing_color * (1 - base_blend)) + (side_color * base_blend);
color_out = ((final_facing_color * (1 - final_blend_ratio)) + (side_color * final_blend_ratio));
}

In the case of our shader, the first instruction defines a new internal float variable named “facing_ratio”, calculates the surface/view angle and assigns the resulting value to it:

float facing_ratio = acos(abs(dot(-I,N))) / M_PI_2;

The expression:

acos( abs( dot( -I, N ) ) )  / M_PI_2

calculates the incident angle, i.e. the angle between 2 vectors originating at the surface shading point, one pointing towards the origin of the incoming ray and the other being the surface normal, as a factor of 0 to 1 representing 0 – 90 degrees.
These 2 vectors are easily obtained through the built-in OSL global variables N and I. N is the surface normal at the shading point, and I is the incoming ray vector pointing to the shading point, which is inverted in this case to point backwards by placing a minus before it: -I.
The incident angle is calculated in radians as the arc-cosine of the dot product of N and -I, and then divided by half π to convert it to a linear factor of 0 to 1 representing 0 to 90 degrees, M_PI_2 being a convenient half-π constant.
* M_PI is a full π and M_2PI is 2π, representing 180 and 360 degrees in radians respectively (OSL provides more constants in this series).
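As a quick sanity check of this formula: when the surface directly faces the viewer, dot(-I, N) = 1, so acos(1) = 0 and the ratio is 0 / (π/2) = 0; at a 60 degree incident angle, dot(-I, N) = 0.5, acos(0.5) = π/3 ≈ 1.047, and the ratio is ≈ 0.667; at a grazing angle, dot(-I, N) = 0, acos(0) = π/2 and the ratio is 1.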

The second instruction raises the facing ratio that was calculated in the previous instruction by a power value provided by the curve_exponent input parameter, to create a non-linear angle/color blend when the exponent value is other than 1.0.
The resulting modified blend value is stored in a new internal variable final_blend_ratio:

float final_blend_ratio = pow(facing_ratio , curve_exponent);

Note:
We could avoid defining a new variable by modifying the value of the facing_ratio variable, and we could also combine these 2 instructions into 1 bigger expression like this:

pow( acos( abs( dot( -I, N ) ) ) / M_PI_2 , curve_exponent )

But I decided to keep it separated into 2 variables and 2 instructions for clarity.
* Try modifying the code this way as an exercise.
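For reference, a combined version (a sketch of that exercise, not part of the shader code above) could look like this:

	// calculate the curved facing ratio in a single instruction:
	float final_blend_ratio = pow(acos(abs(dot(-I, N))) / M_PI_2, curve_exponent);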

The third instruction modifies the input color facing_color by premixing it with the input side_color according to the percentage given by the input parameter base_blend, and assigns the resulting color to a new internal variable named final_facing_color:

color final_facing_color = (facing_color * (1 - base_blend)) + (side_color * base_blend);

The expression:

( facing_color * ( 1 - base_blend ) ) + ( side_color * base_blend )

Calculates a linear combination** (linear interpolation) between the 2 input colors, using base_blend as a 0 – 1 factor between them.
* Note that OSL freely allows arithmetic operations between colors and floats.

The fourth and final instruction creates the final mix between the modified facing color stored in the final_facing_color variable and the side color given by the input parameter side_color. Again, this is a linear combination between the 2 colors, this time using the final_blend_ratio value calculated previously as the combination factor. Finally, and very importantly, the mixed result is assigned to the shader output parameter color_out so it becomes the final output of the shader:

color_out = ((final_facing_color * (1 - final_blend_ratio)) + (side_color * final_blend_ratio));

This screen capture shows this shader at work in Blender and Cycles, connected to a Principled BSDF shader as its base color source:
Annotation 2020-06-17 235933

That’s it! 🙂
Hope you find this article informative and useful.

Clarifications:

* A “float” data type is simply the computer-science geeky way of saying “accurate non-integer number”. When we have to store numbers that describe geometry and color, we need a data type that isn’t limited to integers, so for that purpose we use float values. There’s actually a lot more to the formal definition of floats in computer science, but for our purpose here this will suffice.

** A Linear Combination, or Linear interpolation (lerp) is one of the most useful numerical operations in 3D geometry and color processing (vector math):
A * ( 1 - t ) + B * t
A and B being your source and target locations or colors or any other value you need to interpolate and t being the blending factor from 0 – 1.
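For example, with A = 0.2, B = 0.8 and t = 0.25: 0.2 * (1 - 0.25) + 0.8 * 0.25 = 0.15 + 0.2 = 0.35, i.e. a quarter of the way from A to B.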

Related posts:

  1. What are OSL shaders?
  2. Using OSL Shaders in Maya and Arnold
  3. Using OSL Shaders in Blender and Cycles
  4. Using OSL Shaders in 3ds max and V-Ray

Using OSL Shaders in Blender & Cycles

Software:
Blender 2.82 | Cycles

The Cycles render engine in Blender has a very convenient OSL Shader development and usage workflow.
Shaders can be both loaded from external files or written and compiled directly inside Blender.

Before you begin:
Make sure your Blender scene is set to use the Cycles render engine, in CPU rendering mode, and also check the option Open Shading Language:
Annotation 2020-05-28 165830

 

To write an OSL shader in Blender:

  1. Write your shader code in Blender‘s Text Editor:
    Annotation 2020-05-28 173405
  2. In your object’s material shader graph (Shader Editor view),
    Create a Script node:
    Annotation 2020-05-28 170331
  3. Set the Script node‘s mode to Internal,
    And select your shader’s text from the Script node‘s source drop-down:
    Annotation 2020-05-28 170711
  4. If the shader compiles successfully, the Script node will display its input and output parameters, and you can connect its output to an appropriate input in your shading graph.
    * If your shader is a material (color closure), connect it directly to the Material Output node’s Surface input; if it’s a volume, to the Volume input; or if it’s a texture, to other material inputs as needed.
    Annotation 2020-05-28 171419
  5. If the shader code contains errors, it will fail to compile, and you’ll be able to read the error messages in Blender‘s System Console window:
    Annotation 2020-05-28 172423
  6. After fixing errors or updating the shader’s code, press the Script Node Update button on the Script node to re-compile the shader:
    Annotation 2020-05-28 172735
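If you just want to verify that the Script node workflow is set up correctly, a minimal pass-through shader like this (a throwaway sketch, the names are arbitrary) can be pasted into the Text Editor and selected in the Script node:

#include "stdosl.h"

shader test_color(
	color in_color = color(1, 0, 0),
	output color out_color = color(0, 0, 0))
{
	// simply pass the input color through to the output:
	out_color = in_color;
}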

 

Loading an external OSL shader into Cycles:

Exactly the same workflow as described in the previous section, except setting the Script node‘s mode to External, and either typing a path to the shader file in the Script node or pressing the little folder button to locate it using the file browser:
Annotation 2020-05-28 173159

 

Related:

  1. What are OSL shaders?
  2. Using OSL shaders in Arnold for Maya
  3. Using OSL shaders in V-ray for 3ds max
  4. Writing a basic OSL color shader
  5. Blender 2.83 OSL bug & fix

Using OSL Shaders in Arnold for Maya

Software:
Maya 2020 | Arnold 6

Autodesk Maya 2020 & Arnold 6 offer a flexible OSL development and usage workflow.
You can either load or write OSL shaders on the fly, compile, test, and render them,
And also define a shader folder path for shaders to be available as part of your library for all projects.
Steps for using OSL shaders in Maya & Arnold:

Writing an OSL shader or loading it for single use (just the current project):

  1. Create a new aiOslShader node:
    Annotation 2020-05-19 220416
  2. Select the new aiOslShader node and in its attributes either write new OSL code in the OSL Code section, or press Import to load an OSL shader file (*.osl):
    Annotation 2020-05-19 220454
  3. When new shader code is imported, it’s automatically compiled:
    Annotation 2020-05-19 220649
  4. If you’ve written new code, or changed the code, it will have to be re-compiled.
    In that case press Compile OSL Code:
    Annotation 2020-05-19 223632
  5. The code may contain errors; in that case you will see a red Compile Failure message:
    Annotation 2020-05-19 224342
    You can read the error message in the Maya Output Window or in the Maya Script Editor. Correct the code and press Compile OSL Code again.
    Annotation 2020-05-19 224410
  6. After the OSL code is compiled successfully, the shader’s input parameters can be accessed in the OSL Attributes section below the code:
    Annotation 2020-05-19 224431
  7. Depending on the type of output the OSL shader generates, the aiOSLShader node should be connected to an input in the object’s shader graph or Shading Group.
    * OSL shaders can be surface shaders, volume shaders, procedural textures, texture processors and more..
    To apply the OSL shader to an object as a surface shader, disconnect the object’s current surface shader if it has one,
    And then drag and drop the aiOSLShader node from the Hypershade window onto the object.
    In the Connection Editor select outValue on the left side (node outputs) and surfaceShader on the right side (object inputs):
    Annotation 2020-05-19 220810

 

 

Installing OSL shaders so they will always be available as custom nodes in the Hypershade library

  1. Create a folder for storing your OSL shaders, and place your OSL shader files (*.osl) in this folder.
    Annotation 2020-05-20 225134
  2. Locate Maya’s Maya.env file.
    This is an ASCII text file containing environment variables that Maya loads at startup.
    The Maya.env will usually be located at:
    C:\Users\<your user>\Documents\maya\<maya version>
    Annotation 2020-05-20 225221
  3. Open Maya.env in a text editor and add the following line to it:
    ARNOLD_PLUGIN_PATH=<path to your OSL shaders folder>
    for example:
    ARNOLD_PLUGIN_PATH=D:\3DAssets\OSL_Shaders
    Annotation 2020-05-20 225400
  4. Restart Maya.
    When Maya loads, the MtoA (Maya to Arnold) plugin will automatically compile the shaders found in the folder, report the compilations or any errors found in the Maya Output Window, and create compiled *.oso files for each shader:
    Annotation 2020-05-20 230917
  5. The compiled shaders will now be available as custom nodes in the Hypershade Arnold library with the typical “ai” (Arnold Interface) prefix added to their names:
    Annotation 2020-05-20 231138
  6. The OSL shaders will be created as nodes with their editable attributes, that can be connected to an object’s shading network graph:
    * Connecting the node to the graph is the same as described in the previous part (7)
    Annotation 2020-05-20 231232

 

Related:

  1. What are OSL Shaders?
  2. Using OSL shaders in V-Ray for 3ds max
  3. Using OSL shaders in Blender & Cycles
  4. Writing a basic OSL color shader

What are OSL shaders?

OSL is an acronym for Open Shading Language.
Developed Originally at Sony Pictures Imageworks for the Arnold render engine, Open Shading language is a C like programming language with which custom material, textures and shading effects can be developed OSL shaders (*.osl files), that are supported many by popular render engines.

OSL allows development of complex texturing and shading effects using scene input parameters like the shading point’s world position vector, normal vector, UV coordinates etc., and optical ray-tracing functions – BSDFs (Bidirectional Scattering Distribution Functions), or “Color Closures” as they are called in OSL, like Diffuse, Glossy and Refraction light scattering – all of which can be combined with C-like logic and math programming.
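As a small illustration, a minimal closure-based OSL material could look something like this (a sketch only, not a production shader; the names are arbitrary):

#include "stdosl.h"

shader simple_diffuse_material(
	color base_color = color(0.8, 0.8, 0.8),
	output closure color material_out = 0)
{
	// tint a diffuse BSDF closure with the input color:
	material_out = base_color * diffuse(N);
}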

*.osl files are compiled to *.oso files for rendering.
Most render engines supporting OSL shaders ship with an OSL compiler.
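For example, with the standalone oslc compiler from the open-source OSL project (assuming it is installed and on your PATH), compiling a shader is a single command that writes the matching *.oso file next to the source (my_shader.osl is a placeholder name):

oslc my_shader.osl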

 

Useful OSL shader libraries found on the web:

> OSL Shaders for download on the Chaos Group website:
https://docs.chaosgroup.com/display/OSLShaders/OSL+Shaders+Home

> OSL Shaders for download at the Autodesk Developer Network Github repository:
https://github.com/ADN-DevTech/3dsMax-OSL-Shaders
These are the OSL shaders that ship with 3ds max 2019 and newer; they provide texture and pattern processing tools, but not materials.
* Material shaders or “Closures” as they are referred to in OSL are not supported by 3ds max’s native implementation of OSL.

> A library of OSL shaders collected by Shane Ambler:
https://github.com/sambler/osl-shaders

 

Notes:

  1. In general, OSL shaders are supported only in CPU rendering, and not by GPU renderers. There are some attempts to develop OSL support for GPU renderers, but as far as I know they are limited.
  2. Some OSL shaders will work on one or more render engines, and not work as expected on others, the reason being that each render engine has its own implementation of OSL.
    These differences may show as a different rendered result or even a compile failure.

 

Basic example:
The following example renders show how a combination of two basic OSL shaders I’ve written, one a dielectric material shader and the other a color/angle blend procedural texture, produces fairly consistent results when rendered in different render engines.
* Note the difference in specular/glossy roughness interpretation for the same 0.1 value.

> You’re welcome to download these two basic OSL shaders here.

Arnold for Maya:
Annotation 2020-05-22 213609

V-Ray for 3ds max:
Annotation 2020-05-22 213629

Cycles for Blender:
Annotation 2020-05-22 213723

 

Related:

  1. Using OSL shaders in Arnold for Maya
  2. Using OSL shaders in V-Ray for 3ds max
  3. Using OSL shaders in Blender & Cycles
  4. Writing a basic OSL color shader

3ds max & V-Ray to UE4 – Datasmith workflow basics and tips

Software:
3ds max 2020 | V-Ray Next | Unreal Engine 4.25

This post details basic steps and tips for exporting models from 3ds max & V-Ray to Unreal Engine using the Datasmith plugin.
The Datasmith plugin from Epic Games is revolutionary in the relatively painless workflow it enables for exporting 3ds max & V-Ray architectural scenes into Unreal Engine.
Bear in mind, however, that Datasmith‘s streamlined workflow can’t always free us from the need to meticulously prepare models as game assets by the book (UV unwrapping, texture baking, mesh and material unifying etc.), especially if we need very high game performance.
That being said, the Datasmith plugin has definitely revolutionized the process of importing assets into Unreal, making it much more convenient and accessible.

 

Preparation:
Download and Install the Datasmith exporter plugin compatible with your modeling software and Unreal Engine version:
https://www.unrealengine.com/en-US/datasmith/plugins

 

In 3ds max & V-Ray:

  1. Make sure all materials are VRayMtl type (these get interpreted relatively accurately by Datasmith)
  2. Make sure all material textures are properly located so the Datasmith exporter will be able to export them properly.
  3. In Rendering > Exposure Control:
    Make sure Exposure control is disabled.
    Explanation:
    If Exposure Control is active, it will be exported to the Datasmith file, and when imported to your Unreal Level/Map, a “Global_Exposure” actor will be created with the same exposure settings.
    Sounds good, right? So what’s the problem?
    The problem is that these exposure settings will usually be compatible with photometric light sources like a VRaySun, but when imported to Unreal, the VRaySun does not keep its photometric intensity (in my tests it got a 10lx intensity on import). The result is that the imported exposure settings cause the level to be displayed completely dark.
    Of course you can simply delete the “Global_Exposure” actor after import, but honestly, I always forget it’s there, and start looking for a reason why everything is black for no apparent reason…
    * If you’re familiar with photometric units, you can set the VRaySun to its correct intensity of about 100,000lx, and also adjust the other light sources’ intensity to be compatible with the exposure settings.
  4. Annotation 2020-05-12 192439
  5. Select all of the model’s objects intended for export,
    And File > Export > Export Selected:
    * If you choose File > Export > Export you’ll still have the option to export only selected objects.
    Annotation 2020-05-12 192506
  6. In the File Export window,
    Select the export location, name the exported file,
    And in the File type drop-down select Unreal Datasmith:
    Annotation 2020-05-12 192550
  7. In the Datasmith Export Options dialog,
    Set export options, and click OK.
    * Here you select whether to export only selected objects or all objects (again)
    Annotation 2020-05-12 192654
  8. Depending on the way you prepared your model,
    You may get warning messages after the export has finished:
    Explanation:
    Traditionally, models intended for use in a game engine should be very carefully prepared with completely unwrapped texture UV coordinates and no overlapping or redundant geometry UV space.
    Datasmith allows for a significantly more forgiving, streamlined (and friendly) workflow, but still warns about problems it locates.
    In many cases these warnings will not have an actual effect (especially if Lightmap UV’s are generated by Unreal on import), but take into account that if you do encounter material/lighting issues down the road, these warnings may be related.
    Annotation 2020-05-12 192730
  9. Note that the Datasmith exporter created both a Datasmith (*.udatasmith) file, and a corresponding folder containing assets.
    It’s important to keep both these items in their relative locations:
    Annotation 2020-05-12 204541

 

In Unreal Editor:

  1. Go to Edit > Plugins to open the Plugins Manager:
    Annotation 2020-05-12 192802
  2. In the Plugins Manager search field, type “Datasmith” to find the Datasmith Importer plugin in the list, and make sure Enabled is checked for it.
    * Depending on the project template you started with, it may already be enabled.
    * If the plugin wasn’t enabled, the Unreal Editor will prompt you to restart it.
    Annotation 2020-05-12 192901
  3. In the Unreal project Content, create a folder to which the new assets will be imported:
    * You can also do this later in the import stage
    Annotation 2020-05-12 193030
  4. In the main toolbar, Click the Datasmith button to import your model:
    Annotation 2020-05-12 193043
  5. Locate the *.udatasmith file you exported earlier, double click it or select it and press Open:
    Annotation 2020-05-12 193129
  6. In the Choose Location… dialog that opens,
    Select the folder to which you want to import the assets:
    * If you didn’t create a folder prior to this stage you can right click and create one now.
    Annotation 2020-05-12 193301
  7. The Datasmith Import Options dialog lets you set import options:
    * This can be a good time to raise the Lightmap resolution for the models if needed.
    Annotation 2020-05-12 193326
  8. Wait for the new imported shaders (materials) to compile..
    Annotation 2020-05-12 193408
  9. The new assets will automatically be placed into the active Map\Level in the Editor.
    All of the imported actors will be automatically parented to an empty actor named the same as the imported Datasmith file.
    In the Outliner window, locate the imported parent actor, and transform it in-order to transform all of the imported assets together:
    * If your map’s display turns completely dark or otherwise weird on import, locate the “Global_Exposure” actor that was imported and delete it (you can of course set new exposure settings or adjust the light settings to be compatible)
    Annotation 2020-05-12 193517

 

 

Related:

  1. Preparing an FPS project for archviz
  2. Unreal – Architectural glass material
  3. Unreal – Camera animation
  4. UE4 – Archviz Light calculation tips

Python for 3ds max – Loading an image file and reading pixel values

Software:
3ds max 2020

An example of loading and displaying an image file using Python for 3ds max:
* The EXR image file is located in the same directory as the 3ds max file in this case.
Annotation 2020-05-12 123229
from MaxPlus import BitmapManager, Factory
image_file_path = r'BG_park_A.exr'
bmp_storage = Factory.CreateStorage(17)
bmp_info = bmp_storage.GetBitmapInfo()
bmp_info.SetName(image_file_path)
bmp = BitmapManager.Load(bmp_info)
bmp.Display()

Script explanation line by line:
1. Import the BitmapManager and Factory classes needed to load the image file.
2. Set a variable containing the path to the image file
3. Call the MaxPlus.Factory class’s CreateStorage method to initiate a BitmapStorage object.
This is embarrassing IMO..
And it may very well be that I simply didn’t find the correct way it should be done..
I couldn’t find any other way to independently initiate the BitmapInfo object needed for loading the image, other than initiating a BitmapStorage object and getting a reference to its BitmapInfo object (the BitmapInfo class has no constructor).
* If you know a better method to do this I’ll be very grateful if you take the time to comment.
Note:
The integer argument 17 that we supply sets the storage to be compatible with:
32-bit floating-point per-component color format (RGB, with or without an alpha channel).
See list of other color format options in this example here:
https://help.autodesk.com/view/3DSMAX/2020/ENU/?guid=__developer_using_maxplus_creating_a_bitmap_html
* They wrote a class containing convenient named constants of the integer arguments (see example code below).
* In this example, where the BitmapStorage is created just as a way to generate a BitmapInfo object, the actual format you supply doesn’t matter, but you can’t use a format that can’t be written to, like 8 for example (see list below)
4. Get a reference to the BitmapInfo object contained in the BitmapStorage object.
5. Setting the name property (full file path) of the BitmapInfo object.
6. Loading the image.
7. Displaying the image in 3ds max‘s image viewer window.

Example code for a BitmapStorage format constants container class:

* This example’s source is in the 3ds max help Python examples:
https://help.autodesk.com/view/3DSMAX/2020/ENU/?guid=__developer_using_maxplus_creating_a_bitmap_html
class BitmapTypes(object):
     BMM_NO_TYPE = 0 # Not allocated yet
     BMM_LINE_ART = 1 # 1-bit monochrome image
     BMM_PALETTED = 2 # 8-bit paletted image. Each pixel value is an index into the color table.
     BMM_GRAY_8 = 3 # 8-bit grayscale bitmap.
     BMM_GRAY_16 = 4 # 16-bit grayscale bitmap.
     BMM_TRUE_16 = 5 # 16-bit true color image.
     BMM_TRUE_32 = 6 # 32-bit color: 8 bits each for Red, Green, Blue, and Alpha.
     BMM_TRUE_64 = 7 # 64-bit color: 16 bits each for Red, Green, Blue, and Alpha.
     BMM_TRUE_24 = 8 # 24-bit color: 8 bits each for Red, Green, and Blue. Cannot be written to.
     BMM_TRUE_48 = 9 # 48-bit color: 16 bits each for Red, Green, and Blue. Cannot be written to.
     BMM_YUV_422 = 10 # This is the YUV format - CCIR 601. Cannot be written to.
     BMM_BMP_4 = 11 # Windows BMP 16-bit color bitmap. Cannot be written to.
     BMM_PAD_24 = 12 # Padded 24-bit (in a 32 bit register). Cannot be written to.
     BMM_LOGLUV_32 = 13
     BMM_LOGLUV_24 = 14
     BMM_LOGLUV_24A = 15
     BMM_REALPIX_32 = 16 # The 'Real Pixel' format.
     BMM_FLOAT_RGBA_32 = 17 # 32-bit floating-point per component (non-compressed), RGB with or without alpha
     BMM_FLOAT_GRAY_32 = 18 # 32-bit floating-point (non-compressed), monochrome/grayscale
     BMM_FLOAT_RGB_32 = 19
     BMM_FLOAT_A_32 = 20
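# Using this class, the CreateStorage call from the script above could be written more readably,
# for example (sketch): bmp_storage = Factory.CreateStorage(BitmapTypes.BMM_FLOAT_RGBA_32)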

Reading pixel values from the image:
Annotation 2020-05-12 143904
bmp_storage = bmp.GetStorage()
hdr_pixel = bmp_storage.GetHDRPixel(3000,200)
print(hdr_pixel)
1. Get a reference to the Bitmap‘s BitmapStorage object.
* In this case, this overwrites the bmp_storage variable we created earlier just to get the BitmapInfo object.
2. Read the pixel value.

Note:

When copying and pasting a script from this example, the indentation may not be pasted correctly.


GitHub Desktop – Not launching UI – Problem & Fix

Software:
GitHub Desktop 2.4.3 | Windows 10

About 1 or 2 weeks ago GitHub Desktop stopped opening its window when launched.
It would run in the background, you could see it in the Windows Task Manager, but its window would not open.
Looking into this on various web discussions, I found that deleting the folder named app-2.4.3 in its application data solves the problem:
Annotation 2020-05-09 200238
This is the path to GitHub Desktop‘s data:
C:\Users\<YOUR USER>\AppData\Local\GitHubDesktop
After deleting the folder and re-launching GitHub Desktop, I found it generated a new folder with the same name, so I guess settings for the latest update got corrupted, and the software generated new settings..

 

Related:
UE4 GitHub Setup

 

Using Color Lookup Tables (CLUTs)

What is this all about?

Color Lookup Tables – CLUTs (also “Color LUTs“) are a method of storing and reusing complex linear color transformations*.
CLUTs have the advantages of being supported by many video and image processing software packages, and also the ability to be calculated in real-time on the GPU, costing very little computing resources.
* Simpler, everyday terms could be “color styles” or “color corrections”

CLUTs are used in the movie production industry to perform color conversions of images acquired from different sources for monitoring and editing purposes, and also for testing, applying and sharing different creative color styles across different departments, and stages of the production.
Examples of common CLUT file formats are *.3DL and *.CUBE

Why is this called a “3D” or “Cube” Lookup Table?

The reason CLUTs are referred to as “3D” color lookup tables or “Cube LUT” is that they store the effect of color operations as linear transformations of a 3D cubic space.
To understand this we have to imagine RGB color encoding as a 3D space with the R, G and B values of each color being coordinates in this cubic color space.
This means that the color correction operations we perform to create a color style, like adding contrast, saturation, warming the hues etc., are all defined as a function that, for every color coordinate in the RGB color cube space, defines the new coordinate where the corrected or stylized color is found.
The term Lookup table means that the new color values don’t have to be calculated every time because they have been pre-calculated and stored in a table of values.
3D CLUTs are often processed and stored as 3D cubic textures, like this example of a 32 x 32 x 32 value CLUT generated with Blackmagic Fusion.
Imagine the little 32 x 32 square patches all stacked one upon the other; that would create a 32 x 32 x 32 RGB color cube, with which color transformations can be stored by simply applying them to this texture:
Cube0000
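As a rough sketch of how the lookup itself works (assuming RGB values normalized to a 0 – 1 range and an N x N x N table):

index = ( R * (N - 1), G * (N - 1), B * (N - 1) )

The output color is the table entry at that index, and since the index usually falls between stored entries, it is trilinearly interpolated from the 8 nearest entries. For a 32 x 32 x 32 CLUT, an input of (0.5, 0.5, 0.5) would be looked up around index (15.5, 15.5, 15.5).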

Working with CLUTs:

In this post we’ll go through the process of creating and using a CLUT in some popular creative software packages.
* Note that there are many other software packages that support creating and using CLUTs; the process should be similar.

Steps shown in the following software:
Adobe Photoshop 2020
Adobe After Effects 2020
Adobe Premiere 2020
Blackmagic Design Fusion 9

In this example, the source image with which the CLUT will be designed is an interior scene modeled with Blender 2.82, rendered with the Cycles render engine with “Filmic” tone-mapping applied, and saved as a PNG file.
* I usually save linear unclamped 32-bit float EXR files as the raw output from render engines, because this format provides the most freedom to manipulate and process rendered images and animations. However, from my experience CLUTs don’t work well on linear unclamped color, so I usually apply them at a later stage of the image development (usually after applying tone-mapping to the image).
This is why I saved the file directly as a tone-mapped PNG for this example.

A.Test_Image

Creating a CLUT in Adobe Photoshop:

For this example, a greenish-contrasty-desaturated color style is created in Photoshop by applying color adjustment layers to the image.
In this case Color Balance, Vibrance, and Curves.
* You can use different numbers and combinations of color adjustments

B.Photoshop_Grade

The new Color Style is now exported to CLUT files:

C.Photoshop_Export_CLUTS

The Export Color Lookup Tables dialog allows naming the CLUT, adding a description, setting a quality for the color transform it will define, and selecting the CLUT file formats that will be written.
After clicking OK, the CLUT files will be saved in a chosen location.
C1.Photoshop_Export_CLUTS

Note:
Saving the CLUT in the Presets\3DLUTs folder (found in the Adobe Photoshop installation folder) will allow reusing the CLUT as a preset look available by drop-down selection without having to locate the file each time.

Applying a CLUT in Adobe Photoshop:

With the image layer selected, add a Color Lookup adjustment layer:

D.Photoshop_Lut_Adjustment

In the Color Lookup adjustment properties, open the 3DLUT File drop-down, choose Load 3D LUT, and locate the CLUT file you saved:
E.Photoshop_Lut_Load

The original image now has the same color style we created earlier, but this time it’s applied by only a single Color Lookup adjustment layer:

F.Photoshop_Lut_Correction

An example of the same CLUT applied to a different image:

F1.Photoshop_Lut_Correction_B

Applying a CLUT in Adobe After Effects:

Add a Util > Apply Color LUT effect to a layer,
In the Effect Controls window, click Choose LUT and locate the wanted CLUT file:

G.AE_Apply_CLUT

Applying a CLUT in Adobe Premiere:

  1. Select the image/video clip in the timeline.
  2. Switch to the Color UI tab to get access to the Lumetri Color controls on the right.
  3. In the Creative section of the Lumetri Color controls, open the Look dropdown, choose Browse and locate the wanted CLUT file.

I.Premiere_Apply_CLUT

I.Premiere_Apply_CLUT_B

Applying a CLUT in Blackmagic Design Fusion:

Add a LUT > File LUT node to the image source.
In the File LUT properties, click the browse button and locate the wanted CLUT file:

H.Fusion_Apply_CLUT

Creating a CLUT in Blackmagic Design Fusion:

* See the numbered nodes in the flow graph below

  1. Source image/video on which the CLUT is designed.
  2. A LUT Cube Creator node, generating default neutral 3D CLUT data in the form of a Color Cube map.
  3. The nodes creating the actual color style (in this case a Color Corrector and Color Curve nodes) are operating on the LUT Cube Creator node’s output.
  4. A LUT Cube Apply node is applying the stylized CLUT data to the image/video for previewing purpose (displayed on the right viewer)
  5. A LUT Cube Analyzer node generates CLUT data from the styled LUT Cube Creator data, and allows saving it to disk as a CLUT file.
    * Choose a location and click Write File to save the CLUT file.

K.Fusion_Create_LUT

Blender – Adding a texture to an Area Light

Software:
Blender 2.82

Adding a texture to an area light can make it produce softer and more detailed highlights and an overall more organic lighting effect.

Note:
Since an Area light in Blender isn’t rendered as an actual mesh object with UV coordinates, its texture coordinates are parametric (see below).

Adding a texture to an Area Light:

  1. In the Area Light properties click the Use Nodes button (see image A) to initiate its node graph and allow texturing it.
  2. In the Shader Editor view (with the light selected), drop your texture to the light’s node graph and connect it to the light’s Emission node’s Color input. (see image B)
  3. Create a new Input > Geometry node, and connect its Parametric output to the Image Texture’s Vector input. (see image B)

A. Without a texture the Area light produces a hard flat highlight:
a

B. With the vignette texture, the Area light now has a more subtle organic effect:
* The Emission node’s Strength was increased in this case to compensate for the lower light output with the texture.
b
Related posts:

  1. Cycles Area Light pleasant surprise
  2. Cycles Area Light shader visibility