The cglSurfaceCarPaint car-paint material combines 3 layers:
Base: A blend of diffuse/metallic shading with a view-angle color mix
Metallic flakes: Distance-blended procedural metallic flakes
Clear coat: Glossy clear-coat layer with built-in procedural bump irregularity
It has been tested with:
Blender & Cycles
Maya & Arnold
3ds max & V-Ray
The TwoSidedSign node lets the shader “know” whether a rendered polygon is facing the camera or not by outputting a value of 1 for front-facing polys and -1 for back-facing polys.
This is useful for creating materials that have different properties when seen front-facing or back-facing.
Example 1:
Blending two different colors based on face direction:
Check the Two Sided material attribute.
* Needed so that the engine will render the polygons’ back sides.
In the material blueprint, create a blend of two colors using a Lerp (LinearInterpolate) node and connect it to the material’s Base Color input.
Add a TwoSidedSign node to get polygon facing input (1,-1).
Connect the TwoSidedSign node’s output to a Clamp node to clamp the values to the 0 – 1 range.
Connect the Clamp node’s output to the Lerp node’s Alpha input so that the polygon’s facing direction will control the Lerp blend.
Note:
You can use this method to blend any other material attribute based on polygon facing direction.
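Expressed as plain arithmetic, the node chain of Example 1 boils down to a few lines. Here is a minimal Python sketch (the function names mirror the node names and are illustrative, not actual UE4 API):

```python
def lerp(a, b, alpha):
    """LinearInterpolate node: returns a when alpha = 0, b when alpha = 1."""
    return a + (b - a) * alpha

def clamp01(x):
    """Clamp node set to the 0-1 range."""
    return max(0.0, min(1.0, x))

def base_color(two_sided_sign, back_color, front_color):
    # TwoSidedSign outputs 1 for front-facing polys, -1 for back-facing polys,
    # so the clamped value is 1 on front faces and 0 on back faces.
    return lerp(back_color, front_color, clamp01(two_sided_sign))

print(base_color(1.0, 0.2, 0.8))   # front face -> 0.8 (front color)
print(base_color(-1.0, 0.2, 0.8))  # back face  -> 0.2 (back color)
```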
Example 2:
Create an “inwards facing” flipped normal material:
Set the material’s Blend Mode to Masked.
* Needed to be able to make parts of the mesh invisible.
Check the Two Sided material attribute.
* Needed so that the engine will render the polygons’ back sides.
Add a TwoSidedSign node to get polygon facing input (1,-1).
Connect the TwoSidedSign node’s output to a Clamp node to clamp the values to the 0 – 1 range.
Connect the Clamp node’s output to a 1-X node to invert the facing input.
Connect the 1-X node’s output to the material’s Opacity Mask input so that polygons facing the camera will be invisible.
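In the same sketch style as Example 1 (Python standing in for the node graph), Example 2’s opacity mask is simply the inverted facing value:

```python
def opacity_mask(two_sided_sign):
    facing = max(0.0, min(1.0, two_sided_sign))  # Clamp node: 1 front, 0 back
    return 1.0 - facing                          # 1-x node inverts the mask

print(opacity_mask(1.0))   # front face -> 0.0 (masked out / invisible)
print(opacity_mask(-1.0))  # back face  -> 1.0 (visible)
```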
An example of a simple fog effect created using a Post Process material:
The fog material Blueprint: The method for creating the fog effect is to take the distance of the objects from the camera, map it to a value range suitable for color blending (0 – 1), and use that value to blend the object’s color with the fog color, so the further away the object is, the more it’s colored by the fog (a code sketch of this math follows the node breakdown below).
Start by creating a new material, and follow the details below to create the Blueprint:
The Material Domain is set to Post Process.
It also has its Blendable Location parameter set to Before Tonemapping so it will be applied to the raw render.
A SceneTexture node with its Scene Texture Id parameter set to PostProcessInput0 serves as the input of the view’s original rendered pixel colors:
A Lerp (LinearInterpolate) node calculates the blending of the view’s original pixel colors with the Fog color to create the fog effect.
A SceneTexture node with its Scene Texture Id parameter set to SceneDepth serves as the input of depth of each pixel (distance from camera):
A ComponentMask node set to the R channel allows using the depth data as a single float value instead of a Vector4:
A Clamp node is used to clamp (limit) the depth value to the Fog’s maximum depth value (see below)
A RemapValueRange node maps the distance value to a fog density value that will be used as the Lerp node’s Alpha parameter.
Simply put, the further the object, the more the original color will be blended with the fog color.
A Power node (raising the fog blend factor by an exponent) makes the fog blending non-linear, beginning gently for closer objects and then increasing more drastically as the distance grows (provided that the exponent value is above 1).
A Constant Vector4 serves as an input for the fog color.
* Note that having this input be a Vector4 rather than a Vector3 allows it to be interpolated directly with the PostProcessInput0 data; otherwise, a ComponentMask (RGB) node would have been necessary to convert PostProcessInput0 to a Vector3.
A float constant serves as an input for the fog’s minimal distance (from camera).
A float constant serves as an input for the fog’s maximal distance (from camera).
* Note that it’s connected both to the Clamp node and to the RemapValueRange node.
A float constant serves as an input for the fog’s minimal opacity (blend amount).
A float constant serves as an input for the fog’s maximal opacity (blend amount).
A float constant serves as an input for the fog’s blend exponent.
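Putting the whole chain together, here is a hedged Python sketch of the per-pixel fog math. The constants are placeholder values, not the post’s exact settings, and the helper names only mirror the node names:

```python
def lerp(a, b, t):
    return a + (b - a) * t

def remap_value_range(value, input_low, input_high, target_low, target_high):
    # RemapValueRange node: linear remap between the two ranges
    return target_low + (value - input_low) * (target_high - target_low) / (input_high - input_low)

def fog_pixel(scene_color, scene_depth,
              fog_color=(0.7, 0.75, 0.8),       # Constant Vector4 (placeholder color)
              min_distance=500.0, max_distance=5000.0,
              min_opacity=0.0, max_opacity=1.0,
              blend_exponent=2.0):
    depth = min(scene_depth, max_distance)       # Clamp node (fog's maximum depth)
    density = remap_value_range(depth, min_distance, max_distance,
                                min_opacity, max_opacity)
    density = max(0.0, min(1.0, density)) ** blend_exponent  # Power node
    # Lerp the original pixel color (PostProcessInput0) with the fog color
    return tuple(lerp(c, f, density) for c, f in zip(scene_color, fog_color))
```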
Applying the Post Process material to the level:
Select the PostProcessVolume actor in the World Outliner window.
* Create a PostProcessVolume actor if necessary.
In the Details panel, under Rendering Features > Post Process Materials, add a new item to the array, choose Asset Reference as the new item’s value, and then select your fog material:
Using the GradientRamp procedural texture map in Mapped mode can be very useful for creating procedural material effects.
The idea is that the lightness value from a different map determines what part of the GradientRamp is sampled.
In this example the GradientRamp uses values produced by a procedural Falloff map set to Perpendicular-Parallel mode, as its coordinates source, to create richly colored metal that changes its Hue depending on View/Incident angle:
In this example the GradientRamp uses values produced by a procedural Noise map as its coordinates source to create an irregular color effect:
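Conceptually, Mapped mode works like a 1D lookup table: the driver map’s output picks the position sampled along the ramp. A small Python sketch of that idea (illustrative only, not 3ds max API):

```python
def sample_gradient(stops, t):
    """stops: list of (position, color) pairs sorted by position; t in 0-1."""
    t = max(0.0, min(1.0, t))
    for (p0, c0), (p1, c1) in zip(stops, stops[1:]):
        if p0 <= t <= p1:
            w = (t - p0) / (p1 - p0) if p1 > p0 else 0.0
            return tuple(a + (b - a) * w for a, b in zip(c0, c1))
    return stops[-1][1]

# A Falloff or Noise map's lightness value serves as t, e.g. a facing ratio:
ramp = [(0.0, (0.8, 0.2, 0.1)), (0.5, (0.9, 0.6, 0.1)), (1.0, (0.2, 0.4, 0.8))]
print(sample_gradient(ramp, 0.25))  # the color a quarter of the way along the ramp
```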
Note:
The examples here were rendered using V-Ray Next for 3ds max, but this technique could also be used with other rendering engines.
The example explained in this article is creating a blend between a mud material, and a mud-leaves material using a mask (Alpha) texture.
>> The scanned PBR materials demonstrated in this post are from Texture Haven (texturehaven.com)
How does it work? There is actually no blending of Unreal materials, but rather a regular Unreal material in which each parameter is defined as a linear blend between 2 different source values for that parameter.
We could create a material blueprint that uses Lerp (Linear Interpolate) nodes to provide each material parameter with a blend of 2 input textures/colors or parameters, connect the alpha texture to all the Lerp nodes’ Alpha inputs, and effectively achieve a blend of 2 different materials. But that would be a complex blueprint in which it’s very inconvenient to design each of the individual materials participating in the blend:
This complexity can be greatly simplified by collecting each of the participating materials parameters into a Material Attributes data structure.
The Material Attributes data structure contains all the data needed to compile a material, and allows input, output, and processing of this data as a single blueprint data stream (connection).
For example, when the material parameters are grouped as a Material Attributes data structure, they can be blended by connecting them into a single BlendMaterialAttributes node, instead of “Lerping” (blending) between 2 inputs for each individual material parameter, which produces an unworkably complex material blueprint like the previous example (a conceptual code sketch of the difference appears further below).
> To collect material parameters into a Material Attributes data structure, connect them into a MakeMaterialAttributes node:
> To create a blend between 2 Material Attributes data streams, use the BlendMaterialAttributes node:
* The Alpha parameter determines the weights of the blend (a black and white texture can be connected to it as the blend mask)
> In order for the material output to receive a grouped Material attributes input instead of individual inputs for each parameter, select the material output, and in the Details panel, check the Use Material Attributes option:
Using the Material Attributes data structure, the blended material’s Blueprint is now much simpler and cleaner, while producing the exact same result as before:
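The difference between the two approaches can be summarized in code. Below is a conceptual Python sketch (the struct fields and names are illustrative stand-ins, not the UE4 API): instead of a Lerp per parameter, the whole attribute set is blended in one call:

```python
from dataclasses import dataclass

def lerp(a, b, t):
    if isinstance(a, tuple):
        return tuple(lerp(x, y, t) for x, y in zip(a, b))
    return a + (b - a) * t

@dataclass
class MaterialAttributes:  # stand-in for UE4's Material Attributes struct
    base_color: tuple
    roughness: float
    normal: tuple

def blend_material_attributes(a, b, alpha):
    """One BlendMaterialAttributes-style call replaces a Lerp node per parameter."""
    return MaterialAttributes(
        base_color=lerp(a.base_color, b.base_color, alpha),
        roughness=lerp(a.roughness, b.roughness, alpha),
        normal=lerp(a.normal, b.normal, alpha),
    )

mud = MaterialAttributes((0.22, 0.18, 0.12), 0.9, (0.0, 0.0, 1.0))
leaves = MaterialAttributes((0.15, 0.30, 0.08), 0.7, (0.0, 0.0, 1.0))
blended = blend_material_attributes(mud, leaves, 0.5)  # alpha comes from the mask texture
```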
But designing 2 different materials within one material Blueprint is still far from ideal.
What if we want to use just one of these materials on some surfaces?
And if the individual materials are not as simple as the ones shown here, it would be much more efficient to have one Blueprint per material, allowing us to focus on its development and preview it.
We can achieve this desired workflow by developing each of the materials as a Material Function.
Each of the participating materials is created as a Material Function with a Material Attributes output.
> One of the huge advantages of UE4’s material editing is that it allows us to preview a full material while developing it as a Material Function.
* This may sound trivial, but it isn’t. The Material Function isn’t compiled by itself as a material; it just produces data needed to define a material. In many other media production systems, this would have meant that you could develop data within the function but only preview it in the main material where the function is used.
The Material Function defining the mud-leaves material:
The Blend material using the Material Function nodes:
Note:
When blending a non-metallic material with a metal material, the alpha values (mask colors) should be only 0 or 1 (black or white); otherwise, blend areas that have a mid-range metallic value will make no sense visually.
> A RemapValueRange node can be used to force a color threshold on the mask texture or value.
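For instance, a remap with a narrow input range acts as a threshold. A quick Python illustration (the 0.45/0.55 pivot values are arbitrary examples, not from the post):

```python
def mask_threshold(x, low=0.45, high=0.55):
    # Mask values below `low` map to 0, above `high` map to 1,
    # so blended areas keep a meaningful (near 0 or 1) metallic value.
    t = (x - low) / (high - low)
    return max(0.0, min(1.0, t))

print(mask_threshold(0.3))  # -> 0.0 (treated as non-metal)
print(mask_threshold(0.7))  # -> 1.0 (treated as metal)
```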
Material Functions encapsulate shading flow graphs (material blueprints) into reusable shading nodes that have their own inputs and outputs.
This allows development of custom shading nodes, and saving the time it takes to recreate the same flow graphs multiple times or even copy and paste material flow graphs.
Common shading processes and operations that have to be performed in many different materials, and even multiple times in a single material, can be defined as a Material Function for quick and easy re-usability. Material Functions can also be used to encapsulate a full material blueprint with a Material Attributes output. This provides a convenient workflow for blending different materials together.
In this post I’ll detail the steps needed to create and use a Material Function.
The Material Function example we’ll create, called “ColorAngleBlend”, performs a commonly needed shading operation: blending 2 colors or textures according to the surface viewing angle (facing ratio).
The ColorAngleBlend Material Function will have the following inputs:
color a:
The color or texture appearing when viewing the surface at perpendicular angle.
color b:
The color or texture appearing when viewing the surface at grazing view angle.
curve exponent:
The steepness of the blend curve between the colors, 1 being a linear blend, and higher values displaying color a at more angles, “pushing” color b so it’s seen only at grazing angles.
base color blend:
The percent of color b seen at perpendicular view angle.
normal:
Bump normal input.
The final “ColorAngleBlend” Material Function Blueprint:
* The internals of the “ColorAngleBlend” Material Function
An example of the “ColorAngleBlend” Material Function node used to create a rich view-angle-dependent color blend for a steampunk metal material:
An example of the “ColorAngleBlend” Material Function node used to create a rich color for a car-paint material:
An example of the “ColorAngleBlend” Material Function node used to create a washed-out effect for a cloth material:
Steps for creating the “ColorAngleBlend” Material Function:
In the content browser, create a Material Function Object and name it “ColorAngleBlend”:
Double click the ColorAngleBlend Material Function to open it for editing:
Click the background of the workspace to deselect the Output Result node,
so that the Details panel on the left will display the Material Function’s properties.
Type a description into the Description field, check the Expose to Library option so that the new Material Function will be available to all materials in the Palette and node search, and define in which node categories it should appear:
Select the Output Result node and in the Details panel on the left set its output name to “color”:
Add a Linear Interpolate (Lerp) node, a Fresnel node and a Transform Vector (Tangent space to World space) node to the Blueprint and connect the nodes like this:
* The Lerp node will blend the 2 color inputs, with the Fresnel node providing view-angle data as the alpha for the Lerp.
The Transform Vector node is needed to convert the normal (bump) input to world space for the Fresnel node.
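In rough Python terms, the function evaluates something like the following per pixel (the Fresnel formula here is an approximation of UE4’s Fresnel node behavior, so treat the exact curve as an assumption):

```python
def color_angle_blend(color_a, color_b, curve_exponent, base_color_blend,
                      normal_ws, view_dir_ws):
    # Facing ratio: 1 when viewing the surface head-on, 0 at grazing angle
    facing = max(0.0, sum(n * v for n, v in zip(normal_ws, view_dir_ws)))
    # Fresnel node: BaseReflectFractionIn at perpendicular view,
    # rising toward 1 at grazing angle with ExponentIn controlling steepness
    fresnel = base_color_blend + (1.0 - base_color_blend) * (1.0 - facing) ** curve_exponent
    # Lerp: fresnel = 0 -> color a, fresnel = 1 -> color b
    return tuple(a + (b - a) * fresnel for a, b in zip(color_a, color_b))
```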
Adding function inputs:
Create 2 Function Input nodes. In their Details panel, name them “color a” and “color b”, leave their Input Type at the default Vector3D, check the Use Preview Value as Default option, and set their Sort Priority parameters to 0 and 1 so they appear as the first inputs of the ColorAngleBlend node when it’s used in a material. Connect them to the Lerp node’s A and B inputs:
Adding function inputs:
Create 2 new Function Input nodes and name them “curve exponent” and “base color blend”. This time set their Input Type to Scalar, check the Use Preview Value as Default option, set their Sort Priority parameters to 2 and 3, and connect their outputs to the Fresnel node’s ExponentIn and BaseReflectFractionIn inputs:
Adding function inputs: Create the final Function Input node, name it “normal”, set its Input Type to Vector3D, check its Use Preview Value as Default option, set its Sort Priority to 4, and connect its output to the Fresnel node’s Normal input:
Adding default inputs:
Finally, add constant nodes to serve as default input values for the Material Function.
A pure white Constant3Vector (color) constant as the default value for the “color a” input,
A pure black Constant3Vector (color) constant as the default value for the “color b” input,
A Constant with value 1.0 as the default value for the “curve exponent” input,
A Constant with value 0.0 as the default value for the “base color blend” input,
A pure blue Constant3Vector (color) constant as the default value for the “normal” input.
> Tip: for quick creation of constant value nodes, hold the 1 key (Constant) or the 3 key (Constant3Vector) while clicking the Blueprint background.
Save the new Material Function.
To use the new ColorAngleBlend Material Function, create a new material, start typing “color…” in the node search to locate the ColorAngleBlend node, create it, and connect it to the desired material input.
> Material Functions can also be used by dragging them from the Content Browser into the Material Blueprint.
Having to remap a value range is very common in designing shaders, whether it’s to perform a traditional “levels” operation on a texture, or use just a specific range of values in a float input.
The RemapValueRange node lets us do just that. This node has 5 inputs:
Input: The input value
Input Low: Input mapping range minimum
Input High: Input mapping range maximum
Target Low: Output range minimum
Target High: Output range maximum
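The node evaluates a straightforward linear remap. A minimal Python equivalent, reproducing the noise examples that follow:

```python
def remap_value_range(value, input_low, input_high, target_low, target_high):
    t = (value - input_low) / (input_high - input_low)
    return target_low + t * (target_high - target_low)

print(remap_value_range(0.5, 0.0, 1.0, 0.3, 0.7))   # 0.5  (contrast reduced)
print(remap_value_range(0.3, 0.3, 0.7, 0.0, 1.0))   # 0.0  (contrast increased)
print(remap_value_range(0.25, 0.0, 1.0, 1.0, 0.0))  # 0.75 (pattern inverted)
```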
Mono Noise examples:
The original noise pattern:
Mapping input 0 -> 1 to output 0 -> 1 obviously has no effect:
Mapping input 0 -> 1 to output 0.3 -> 0.7 reduces the pattern contrast:
Mapping input 0.3 -> 0.7 to output 0 -> 1 increases the pattern contrast:
Mapping input 0 -> 1 to output 1 -> 0 inverts the pattern:
In this example, RemapValueRange nodes are applied to a texture’s individual color channels to increase contrast (Levels):
> Note that this operation can be performed with a simpler graph by using float3 add and multiply operations on the texture color (this will be the subject of a different post).
> Note that a different remapping operation can be performed on each color channel of a texture to adjust its color balance.
Before the value remapping:
After remapping the value in each channel from 0.1 -> 0.9 to 0.0 -> 1.0:
The same operation approximated by multiplying by 1.2 and subtracting 0.1:
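For reference, the exact multiply-add constants for this particular remap can be derived directly; a quick Python check (1.25 / 0.125 is the exact solution, with 1.2 / 0.1 as the close rounded version used above):

```python
def remap(x):         # 0.1 -> 0.9 remapped to 0.0 -> 1.0
    return (x - 0.1) / 0.8

def multiply_add(x):  # the same curve expressed as a multiply-add
    return x * 1.25 - 0.125

print(remap(0.1), multiply_add(0.1))  # 0.0 0.0
print(remap(0.9), multiply_add(0.9))  # 1.0 1.0
```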
CG-Lion Architectural Glazing Presets Pack 1.0 is a custom architectural glazing shader I developed for the Cycles render engine, which provides easy setup of real-world architectural glazing surfaces and ships with 40 ready-to-use material presets.
The shader has architecture-friendly real-world parameters like ‘frosted‘, ‘milky‘, ‘smoked‘ glass etc., has convenient built-in inputs for effects like selective sand-blasting or selective graphic coating, and is internally optimized for transparent shadow casting.