UE4 – Basic Material Blending

Software:
Unreal Engine 4.24

The example explained in this article creates a blend between a mud material and a mud-leaves material using a mask (Alpha) texture.
> The scanned PBR materials demonstrated in this post are from Texture Haven (texturehaven.com)

How does it work?
There is actually no blending of Unreal materials. Instead, we build a regular Unreal material in which each parameter is defined as a linear blend between 2 different source values for that parameter.
We could create a material Blueprint that uses a Lerp (Linear Interpolate) node to provide each of the material parameters with a blend of 2 input textures, colors, or parameters, connecting the alpha texture to all the Lerp nodes’ Alpha inputs, and effectively achieve blending of 2 different materials. But it would be a complex Blueprint in which it’s very inconvenient to design each of the individual materials participating in the blend.
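Conceptually, each blended parameter is just a linear interpolation between the two source materials, weighted by the mask (alpha) value. Here is a minimal plain-Python sketch of that idea (the parameter names and values are made up for illustration):

# Minimal sketch: per-parameter linear interpolation, the same math a Lerp node performs.
def lerp(a, b, alpha):
    return a * (1.0 - alpha) + b * alpha

mud        = {'roughness': 0.9, 'base_color': (0.25, 0.18, 0.12)}   # made-up values
mud_leaves = {'roughness': 0.6, 'base_color': (0.20, 0.28, 0.10)}   # made-up values

alpha = 0.35  # value sampled from the mask texture at one pixel (0 = mud, 1 = mud-leaves)

blended = {
    'roughness': lerp(mud['roughness'], mud_leaves['roughness'], alpha),
    'base_color': tuple(lerp(a, b, alpha)
                        for a, b in zip(mud['base_color'], mud_leaves['base_color'])),
}
print(blended)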

This complexity can be greatly simplified by collecting each participating material’s parameters into a Material Attributes data structure.
The Material Attributes data structure contains all the data needed to compile a material, and allows input, output, and processing of this data as a single blueprint data stream (connection).
For example, when the material parameters are grouped as a Material Attributes data structure, they can be blended by connecting them to a single BlendMaterialAttributes node, instead of “Lerping” (blending) between 2 inputs for each individual material parameter, which produces an unworkably complex material Blueprint like the previous example.

> To collect material parameters into a Material Attributes data structure, connect them into a MakeMaterialAttributes node:

> To create a blend between 2 Material Attributes data streams, use the BlendMaterialAttributes node:
* The Alpha parameter determines the weights of the blend (a black and white texture can be connected to it as the blend mask)

> In order for the material output to receive a grouped Material Attributes input instead of individual inputs for each parameter, select the material output node, and in the Details panel, check the Use Material Attributes option:

Using the Material Attributes data structure, the blended material’s Blueprint is now much simpler and cleaner, while producing the exact same result as before.
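For reference, the same Make → Blend wiring can, in principle, also be assembled with the editor’s Python MaterialEditingLibrary. This is only a rough sketch, assuming the expression classes and editor properties used here (including use_material_attributes) are exposed in your engine version; the asset path and node positions are made up:

import unreal

# Rough sketch only: create MakeMaterialAttributes -> BlendMaterialAttributes nodes
# inside an existing material via editor scripting.
mat = unreal.load_asset('/Game/Materials/M_MudBlend')  # hypothetical asset path

mel = unreal.MaterialEditingLibrary
make_a = mel.create_material_expression(mat, unreal.MaterialExpressionMakeMaterialAttributes, -600, -200)
make_b = mel.create_material_expression(mat, unreal.MaterialExpressionMakeMaterialAttributes, -600, 200)
blend = mel.create_material_expression(mat, unreal.MaterialExpressionBlendMaterialAttributes, -300, 0)

# Feed both Material Attributes streams into the blend node (pin names as shown in the editor).
mel.connect_material_expressions(make_a, '', blend, 'A')
mel.connect_material_expressions(make_b, '', blend, 'B')

# Switch the material output to a single Material Attributes input, assuming the
# bUseMaterialAttributes flag is exposed to Python as 'use_material_attributes'.
mat.set_editor_property('use_material_attributes', True)

# The Blend node's output still has to be connected to the material output
# (easily done in the material editor); then recompile.
mel.recompile_material(mat)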

But designing 2 different materials within one material Blueprint is still far from ideal.
What if we want to use just one of these materials on some surfaces?
What if the individual materials are not as simple as the ones shown here? It would be much more efficient to have one Blueprint for each material, allowing us to focus on its development and preview it separately.
We can achieve this desired workflow by developing each of the materials as a Material Function.
Each of the participating materials is created as a Material Function with a Material Attributes output.

> One of the huge advantages of UE4’s material editing is that it allows us to preview a full material while developing it as a Material Function.
* This may sound trivial, but it isn’t. The Material Function isn’t compiled by itself as a material; it only produces the data needed to define a material. In many other media production systems, this would mean that you can develop data within the function but only preview it in the main material where the function is used.

> Learn how to create Material Functions

The Material Function defining the mud material:

The Material Function defining the mud-leaves material:

The Blend material using the Material Function nodes:

Note:
When blending a non-metallic material with a metal material, the alpha values (mask colors) should be only 0 or 1 (black or white); otherwise, blend areas with a mid-range Metallic value will make no sense visually.
> A RemapValueRange node can be used to force a color threshold on the mask texture or value.
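As a rough illustration of what forcing such a threshold means (plain Python, my own example values; this is not the node’s internal code):

# Clamp a mask value to strictly 0.0 or 1.0 around a chosen threshold,
# so the blended Metallic input never receives a mid-range value.
def hard_threshold(value, threshold=0.5):
    return 1.0 if value >= threshold else 0.0

for v in (0.1, 0.45, 0.5, 0.8):
    print(v, '->', hard_threshold(v))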

Related:
Material Functions
Material Instances
Texture Painting

UE4 – RemapValueRange Node

Software:
Unreal Engine 4.24

Having to remap a value range is very common when designing shaders, whether it’s to perform a traditional “levels” operation on a texture or to use just a specific range of values from a float input.
The RemapValueRange node lets us do just that. This node has 5 inputs:

  1. Input: The input value
  2. Input Low: Input mapping range minimum
  3. Input High: Input mapping range maximum
  4. Target Low: Output range minimum
  5. Target High: Output range maximum
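The remap itself is simple linear math. Here is a minimal plain-Python sketch of what such a mapping does (my own helper function, not the node’s internal code):

# Map 'value' from the range [in_low, in_high] to the range [target_low, target_high].
def remap_value_range(value, in_low, in_high, target_low, target_high):
    t = (value - in_low) / (in_high - in_low)            # normalize to 0..1 within the input range
    return target_low + t * (target_high - target_low)   # scale/offset into the target range

print(remap_value_range(0.5, 0.0, 1.0, 0.3, 0.7))   # 0.5  (contrast reduction example)
print(remap_value_range(0.5, 0.3, 0.7, 0.0, 1.0))   # 0.5  (contrast increase example)
print(remap_value_range(0.25, 0.0, 1.0, 1.0, 0.0))  # 0.75 (inversion example)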

Mono Noise examples:

  1. The original noise pattern
  2. Mapping input 0 -> 1 to output 0 -> 1 obviously has no effect
  3. Mapping input 0 -> 1 to output 0.3 -> 0.7 reduces the pattern contrast
  4. Mapping input 0.3 -> 0.7 to output 0 -> 1 increases the pattern contrast
  5. Mapping input 0 -> 1 to output 1 -> 0 inverts the pattern

In this example, RemapValueRange nodes are applied to a texture’s individual color channels to increase contrast (Levels):
> Note that this operation can be performed with a simpler graph, using float3 add and multiply operations on the texture color (this will be the subject of a different post).
> Note that a different remapping operation can be performed on each color channel of a texture to adjust its color balance.

Before the value remapping:

After remapping the value in each channel from 0.1 -> 0.9 to 0.0 -> 1.0:

The same operation performed by multiplying by 1.2 and subtracting 0.1:
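In general, any linear remap collapses to a single multiply and add. Here is a small plain-Python sketch of the equivalence (my own derivation, not taken from the node):

# A remap from [in_low, in_high] to [target_low, target_high] equals value * scale + offset.
def remap_as_multiply_add(value, in_low, in_high, target_low, target_high):
    scale = (target_high - target_low) / (in_high - in_low)
    offset = target_low - in_low * scale
    return value * scale + offset

# For example, an exact remap of 0.1 -> 0.9 to 0.0 -> 1.0 corresponds to scale = 1.25, offset = -0.125.
print(remap_as_multiply_add(0.5, 0.1, 0.9, 0.0, 1.0))  # 0.5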

UE4 – Create and Play a Level Sequence

Software:
Unreal Engine 4.24

To create animations and trigger them to play on game start:

First create a Level Sequence containing the animation:

  1. Create a new Level Sequence actor.
  2. Name the new Level Sequence and drag it into the level.
  3. Select the actor you want to animate in the level, and double-click the Level Sequence in the Content Browser to open it in the Sequencer window.
  4. In the Sequencer window, press the +Track button to add a sequence track and choose the uppermost option, Actor To Sequence; the option to add the selected actor will automatically appear first in the menu that opens on the right.
  5. Add the selected actor as a sequence track, expand the track’s Transform channels to reveal the Transform property you would like to animate, and click the + button for that channel to create the first key-frame.
  6. Activate the Create when channels/properties change option button.
  7. Move the time slider to a desired time for the motion and move/change the actor’s transform to create a new key-frame.
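If you prefer scripting, creating the Level Sequence asset itself (step 1) can also be done from the editor’s Python API. A minimal sketch, assuming the AssetTools / LevelSequenceFactoryNew route is available in your engine version (the asset name and folder are made up):

import unreal

# Create a new Level Sequence asset in the Content Browser (equivalent to step 1 above).
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
sequence = asset_tools.create_asset(
    asset_name='MyLevelSequence',        # hypothetical name
    package_path='/Game/Sequences',      # hypothetical folder
    asset_class=unreal.LevelSequence,
    factory=unreal.LevelSequenceFactoryNew())
unreal.EditorAssetLibrary.save_loaded_asset(sequence)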

The Level Sequence now contains animation for the Actor, but when we play the game, the animation doesn’t play.
For the animation to play in game, we must trigger it from a Blueprint, in this case the Level Blueprint:

  1. From the Editor Blueprints menu, choose Open Level Blueprint.
  2. In the Level Blueprint, drag from the Event BeginPlay node’s execution pin and create a CreateLevelSequencePlayer node that will follow it.
  3. Drag the CreateLevelSequencePlayer node’s Return Value output and create a Play node that will be executed after it and receive its output.
  4. The Level Blueprint now has instructions to play a Level Sequence, but it’s not yet specified which Level Sequence to play.
  5. In the Variables list on the left, press the +Variable button to create a new variable and name it.
  6. With the new variable selected, in its details on the right, press the Variable Type button and locate the Level Sequence – Object Reference type.
  7. The Level Blueprint now contains a variable named seq of type Level Sequence – Object Reference.
  8. Drag the new variable into the Blueprint and choose Get when placing it.
  9. Connect the variable’s output to the Level Sequence input of the CreateLevelSequencePlayer node.
  10. With the variable selected, in the Details panel on the right, select the Level Sequence object it will be referencing.
  11. Press Compile and save the Level Blueprint.

The Level Blueprint now has instructions to play the desired Level Sequence when the level begins playing, so the animation we created plays when we press Play in the editor.

 

Related:
UE4 Camera Animation

UE4 – Python – Placing level actors’ bottom at Z 0.0

Software:
Unreal Engine 4.22

This simple Unreal Editor Python example sets the Z axis location of all actors with names beginning with ‘Sphere_’ so that their bottom (minimum Z bound) is at height 0.0.

Download the script

> Learn how to run Python scripts in the UE4 Editor

import unreal
from unreal import Vector

# Iterate over all actors in the current level.
lst_actors = unreal.EditorLevelLibrary.get_all_level_actors()
print('place actors at 0 z')
for act in lst_actors:
    act_label = act.get_actor_label()
    # Only process actors whose label begins with 'Sphere_'.
    if act_label.startswith('Sphere_'):
        print('placing: {}'.format(act_label))
        act_location = act.get_actor_location()
        # get_actor_bounds returns (origin, box_extent) as world-space vectors.
        act_bounds = act.get_actor_bounds(False)
        # The actor's lowest point is the bounds origin Z minus the Z extent.
        act_min_z = act_bounds[0].z - act_bounds[1].z
        # Shift the actor so its lowest point sits at Z = 0.0.
        location_offset = Vector(act_location.x, act_location.y, act_location.z - act_min_z)
        act.set_actor_location(location_offset, False, False)

* Note that when copying and pasting the script from this example, the indentation may not be pasted correctly.

Note:
The get_actor_bounds unreal.Actor class method returns a tuple containing 2 unreal.Vector objects: the first is the world-space location of the actor’s geometric center (the bounds origin), and the second is the box extent, i.e. the distance from that center to the corner of the bounding box.

‘Sphere_*’ actors before running the script:


‘Sphere_*’ actors after running the script:


 

Related:
Get started with Python for Unreal Editor
UE4 – Python – Importing assets

UE4 – Package a Project for Windows

Software:
Unreal Engine 4.21

Basic steps for packaging a simple UE4 project for Windows:

Package settings:
Open the Project Settings window:

  1. In: Project > Description
    Set the project’s details and thumbnail.
    > The project thumbnail will appear in the UE4 Editor browser.
    > The thumbnail image must be a 192 x 192 resolution PNG.
  2. In: Project > Maps & Modes
    Set the default level (Map) for the project.
  3. In: Project > Packaging
    Choose the build configuration.
    > For final distribution choose ‘Shipping’.
  4. In: Project > Supported Platforms
    Make sure Windows is selected.

Note:
If the project folder is located within a deep folder structure, there might be packaging errors because of long file paths.

Adding a Quit command:
Select the FirstPersonCharacter Actor and enter its editing mode.
In the Event Graph Blueprint, add an Escape key-press event node and connect it to a Quit command node.


 

Creating the game package:
Choose: File > Package Project > Windows > Windows (64-bit), and select an output folder.


A folder named “WindowsNoEditor” will be created, and inside it will be the game executable along with code and asset folders.
This package can be renamed and copied to other locations.

 

UE4 – Connect a Directional Light to an Atmospheric Fog effect

Software:
Unreal Engine 4.21

UE4‘s Directional Light mimics sun lighting and shadows fairly realistically, and the Atmospheric Fog visual effect produces a realistic clear-sky background that includes the sun’s disk in the sky.
> The Atmospheric Fog produces a result similar to an implementation of a Preetham physical sky model in common rendering software.

Combined, these two elements can produce an effective daylight system, but in order for this to work, the sun’s direction should control the time of day simulated by the Atmospheric Fog.

For the Directional light to control the Atmospheric fog:
In the Directional Light’s details, under Light, expand the settings and enable:
Atmosphere / Fog Sun Light

Note:
There is a limitation to this setup’s realism: the sunlight’s color shouldn’t be pure white, but rather yellowish to orange, depending on the time of day (you can find tables of the sunlight’s color temperature for various times of day on the web).
There is no problem setting the sun’s color, but doing so adds extra coloring to the Atmospheric Fog, which isn’t needed since the fog already simulates the sky’s color correctly.
The result of this limitation is that we can see a sunset sky with a white sunlight, which is wrong.
I haven’t found a workaround for this yet; I can’t find a way to set the Atmospheric Fog’s sun vector without connecting it to a Directional Light, and I can’t find a way to make the Directional Light’s color not affect the Atmospheric Fog’s color.

Update:
It seems this limitation can be overcome by disconnecting the Atmospheric Fog from the Directional Light and using the Atmospheric Fog Object’s rotation to control it.


UE4 – Enable complex collision for models

Software:
Unreal Engine 4.21

By default, UE4 uses fast, simplified convex collision shapes to calculate collision for Static Mesh actors.
This means that the player or projectiles won’t be able to pass through holes, openings, or doors in the model.

To set complex (concave) collision for a static mesh model:

In the Static Mesh editing window, in the Details pane, under Collision:
Set Collision Complexity to: Use Complex Collision As Simple

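The same setting can, in principle, also be changed from an editor Python script. This is only a sketch, assuming the body_setup and collision_trace_flag properties are exposed to Python in your engine version; the asset path is made up:

import unreal

# Switch a Static Mesh asset to 'Use Complex Collision As Simple' (sketch, see assumptions above).
static_mesh = unreal.load_asset('/Game/Meshes/SM_Gate')  # hypothetical asset path

body_setup = static_mesh.get_editor_property('body_setup')
body_setup.set_editor_property('collision_trace_flag',
                               unreal.CollisionTraceFlag.CTF_USE_COMPLEX_AS_SIMPLE)

unreal.EditorAssetLibrary.save_loaded_asset(static_mesh)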

This example shows the default behavior for a model that has an opening; neither the projectiles nor the player can pass:


In this example collision for the model was set to Use Complex Collision As Simple:

Complex_Collision.gif