UE4 – HLSL texture sample quick tip

Software:
Unreal Engine 4.26

When sampling textures in an HLSL Custom node,
the UE4 TextureObject input will automatically have a sampler object generated for it, named:

<your TextureObject name>Sampler

For example, if you named your TextureObject input “tex_in”, the available sampler will be named “tex_inSampler”.
So the code for sampling the texture will be:

Texture2DSample(tex_in, tex_inSampler, coords);

The following is an example of a simple horizontal (U direction) blur for a Custom node, with 4 node inputs:
  1. tex_in – TextureObject
  2. coords – float2
  3. size – float
  4. samples – float

int blur_samples = int(samples * 0.5f);
float3 col = 0.0f;
float2 sample_coords = 0.0f;
for (int i = -blur_samples; i < blur_samples; i++)
{
	// Offset the sample coordinates along U
	sample_coords.x = i * ((size / blur_samples) * 2);
	// Accumulate the averaged texture samples
	col += Texture2DSample(tex_in, tex_inSampler, coords + sample_coords).rgb / (blur_samples * 2);
}
return col;

The above code can be typed directly into the Custom node’s Code field, or loaded from an external .usf file.

See also:
Loading HLSL shaders from the project folder

UE4 – Tiny tips for tiny scale projects

Software:
Unreal Engine 4.26

When developing a UE4 project that deals with a tiny world scale, for example a whole level that is less than 50cm in size,
the following steps may help make the project more easily navigable and convenient to work on.

  1. Scale down the camera Icon:
    UE4 has by default a huge, bulky, opaque camera icon.
    For a tiny scale project, this camera icon may cover the whole level and be very inconvenient to work with.
    Select the relevant camera component and scale it down.
    In my tests, this scale modification didn’t break the camera optics in any way,
    but if you prefer to have no such scale offsets in your project, you can also replace the camera icon with a suitably small one.
  2. In Editor Preferences > Viewport:
    Decrease both:
    Mouse Scroll Camera Speed and Mouse Sensitivity,
    to allow finer navigation at a small scale.

Note:
A global scale conversion factor can be used instead of taking these measures,
and in many cases this can be a more practical solution for managing a sub-50cm world,
for example, building everything at 100X size, so that 1 meter represents 1 centimeter in your project’s world.
But take into account that if the project demands rendering realistic physical lighting and optics, extra conversions will have to be made to account for the scale conversion factor, so in such cases it may be better to set up a real-world scale project.

UE4 – Replace the Camera component icon

Software:
Unreal Engine 4.26

Note:
This seems like an awkward workaround…
So if I missed something here, and there’s a better method to do this,
I’ll be very grateful if you share it in the comments.
Also,
the following tip is only relevant for the CineCameraActor and won’t work with a regular CameraActor, as it has a different built-in offset (hopefully I’ll have time to add this to the post later…)

Replacing the camera icon:
It’s fairly simple to replace the Camera component’s mesh icon:
just select the component and replace its Camera Mesh static mesh with a different static mesh object:

So what’s the problem?
The problem is that the default mesh used for the camera icon doesn’t have its natural pivot at the camera’s focal point, but somewhere at its bottom,
and there is a hardcoded transform offset that compensates for that and places the icon mesh so that the icon’s lens is roughly at the actual Camera actor’s pivot / focal point:

* I haven’t found any exposed transform parameter that allows moving the icon itself without moving the camera.
So in order to replace the camera mesh with an alternative icon mesh, and have it align properly with the camera’s pivot / focal point (without changing engine code and rebuilding it), the built-in offset must be negatively pre-applied to the new mesh model:

In this example in Blender, a new icon is modeled facing positive Y, with a pre-built offset that compensates for the hardcoded offset in UE.

The Camera actor with the alternative Icon:

Note:
In this example, I’ve intentionally replaced the camera icon with a much smaller model, to suit a tiny scale project.
You can also scale the icon without replacing the model.

Read-list: Intro to UE4 Arch-viz

A list of UE4 Architectural-Visualization related tutorials:

  1. 3ds max & V-Ray to UE4 – Datasmith workflow basics and tips
  2. UE4 – “Cleaning up” the FPS template for an Archviz project
  3. Basic architectural glazing material in UE4
  4. UE4 – HDRI Environment & Lighting
  5. UE4 – Enable DXR Raytracing
  6. UE4 – Lighting calculation tips for Archviz
  7. Creating a camera animation in UE4
  8. UE4 – Technical model visualization tips

UE4 – Loading shaders from within the project folder

Software:
Unreal Engine 4.25

*Also tested on Unreal Engine 5.0.1

Disclaimer:
I’m probably the 10th guy that’s documenting these steps on the web,
I didn’t come up with this solution myself, I learned it from the sources listed below.
The reason I’m documenting this (again) myself is to have a clear source I can come back to for this issue, because I’m absolutely incapable of remembering subjects like this… :-\
If you find inaccuracies in the steps I’m detailing or my explanation, I’ll be very grateful if you share a comment.

  1. https://forums.unrealengine.com/development-discussion/rendering/1562454-virtual-shader-source-path-link-custom-shaders-shadertoy-demo-download
  2. https://dinwy.github.io/study/2019/july/customShaderInUE4/
  3. https://forums.unrealengine.com/community/community-content-tools-and-tutorials/1710373-using-external-shader-files-ush-usf-and-getting-the-most-of-the-custom-node

In short:
AFAIK, since version 4.21, UE no longer loads custom node shader code from your project’s /Shaders folder by default, but only from the Shaders folder in the engine installation, which makes it less practical for developing shaders for specific projects.


Steps for setting up the UE project to load shaders from the project folder in UE 4.22:

> The examples here are for a project named: “Tech_Lab”

A. The project must be a C++ project:

So either create a new project defined as such, or just create a new C++ class and compile the project with it to convert it to a C++ project.
Notes:
a. You may need to right-click the .uproject file icon and choose Generate Visual Studio Project Files for the project to load correctly into Visual Studio and compile.
b. You can delete the unneeded C++ class you added after the new settings take effect.

B. Create a folder for the shader files:

Typically, it will be called “Shaders” and will be placed in the project root folder.

C. Add the RenderCore module to the project:

This is done by adding the string "RenderCore" to the array of public dependency modules in the <project>.Build.cs file:

PublicDependencyModuleNames.AddRange(new string[] { "RenderCore", "Core", "CoreUObject", "Engine", "InputCore" });

Notes:
a. In UE 4.21 the module should be "ShaderCore" instead.
b. This addition is needed in order to compile a new primary project module (next step).

D. Define a custom primary module for your project:

In the <project_name>.h file, add a new module class named F<project_name>Module, with a StartupModule function override (see the sketch after the notes below).
Notes:
a. We have to add an include statement for “Modules/ModuleManager.h”.
b. The <project_name>.h file is located in the /Source/<project_name> folder.
c. Some sources state that you also have to override the ShutdownModule function with an empty override; it works for me without it (maybe it’s just a mistake…)
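For the “Tech_Lab” example project, the <project_name>.h could look roughly like this. This is a minimal sketch; deriving the module class from FDefaultGameModuleImpl is my assumption (some sources derive from IModuleInterface and also override ShutdownModule):

#pragma once

#include "CoreMinimal.h"
#include "Modules/ModuleManager.h"

// Custom primary game module for the project
class FTech_LabModule : public FDefaultGameModuleImpl
{
public:
	// Called when the module is loaded - the shader path mapping is added here
	virtual void StartupModule() override;
};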

E. Implement the function override,
and set the custom module as the project’s primary module:

In the <project_name>.cpp file, add the StartupModule override,
with the definition of the added shaders path:
FString ShaderDirectory = FPaths::Combine(FPaths::ProjectDir(), TEXT("Shaders"));

and map this new path as “/Project” so it can be conveniently included:
AddShaderSourceDirectoryMapping("/Project", ShaderDirectory);

The last thing to do is to replace “FDefaultGameModuleImpl” with our custom module name in the IMPLEMENT_PRIMARY_GAME_MODULE macro:
IMPLEMENT_PRIMARY_GAME_MODULE(FTech_LabModule, Tech_Lab, "Tech_Lab");

Notes:
a. We must include “Misc/Paths.h”.
b. Note that the addition of this folder mapping is restricted to versions 4.22 and higher via a compiler directive condition (see the sketch below). For version 4.21, you should state ENGINE_MINOR_VERSION >= 21 instead.
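
Putting it together for the “Tech_Lab” example, the <project_name>.cpp could look roughly like this (a minimal sketch based on the snippets above; the ShaderCore.h include for AddShaderSourceDirectoryMapping is my assumption and may already be pulled in indirectly):

#include "Tech_Lab.h"
#include "Misc/Paths.h"
#include "ShaderCore.h" // declares AddShaderSourceDirectoryMapping (RenderCore module)

void FTech_LabModule::StartupModule()
{
#if (ENGINE_MINOR_VERSION >= 22)
	// Map <project>/Shaders to the virtual shader path "/Project"
	FString ShaderDirectory = FPaths::Combine(FPaths::ProjectDir(), TEXT("Shaders"));
	AddShaderSourceDirectoryMapping("/Project", ShaderDirectory);
#endif
}

// Set the custom module as the project's primary module
IMPLEMENT_PRIMARY_GAME_MODULE(FTech_LabModule, Tech_Lab, "Tech_Lab");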

Note for UE5:
Unreal Engine 5 supports this from the get-go, so this compiler directive condition should be deleted for this to work (in UE5 the minor version number resets, so the 4.22 check would evaluate as false and skip the mapping).

F. Wrapping up:

After taking these steps and compiling the project,
you should be able to include .ush and .usf files stored in <your_ue_project>/Shaders using the “/Project” path mapping:
#include "/Project/test.usf"

That’s it! 🙂

I hope you found this helpful,
And if you encounter errors or inaccuracies,
I’ll be grateful if you take the time to comment.


Related:

  1. UE4 – Cyber enhancement shader
  2. UE4 – Fog post process effect

Blender to Unreal Engine tips

Software:
Blender 2.9 | Unreal Engine 4.25

The following is a list of guidelines for preparation and export of 3D content from Blender to Unreal Engine 4 via the FBX file format.

Disclaimer:
This is not a formal specification.
It’s a list of tips I found to work well in my own experience.
* Some of the issues listed here may have already been solved


Blender Scene and model settings:

System units in Blender:
Define the scene units in Blender as:
Metric, with a 0.01 Unit Scale (centimeters),
and model your content correctly using centimeter units.
* Modeling in 1 meter units may seem to import correctly into UE4, but will cause unsolvable problems, like a skeletal mesh physics asset having incorrect auto-generated shapes, a problem that in my experience can’t be fixed manually.

Transform:
Model your content in Blender facing the -Y world axis, with +Z being up (Blender’s up axis).
* This way the model is aligned to Blender’s views, so the front view displays the model’s front, etc.
Make sure to apply your model’s transformations before export.

Armatures:
Make sure the Armature object isn’t named “Armature”.
Leaving the Blender skeleton named “Armature” will cause the UE4 importer to fail due to “multiple roots”.
* I also remember a weird related bug where animation scale was imported incorrectly, but I can’t confirm this now…
There is no need for a dedicated root bone in the hierarchy; the Armature object is the root of the bone hierarchy.
* See the export option below.

Texture baking:
Set the normal map’s green channel to -Y.
* This is not critical at all, because if it’s baked as +Y it can easily be fixed in UE4.

Metadata:
Blender custom properties are imported as UE4 asset metadata that can be read by editor scripts for automation purposes.
* See export option below


FBX Blender export and UE4 import settings:

I recommend saving an FBX export preset with these settings.

Optional:
I prefer the export settings to include only selected objects.
* It’s more efficient for me to select the specific objects I want to export into a single FBX file prior to export, than to delete all the temp / reference / draft objects from the scene.
If you want to export Blender custom properties to the FBX, check the “Custom Properties” option.

Axes:
Blender’s native model/world orientation is the model’s forward facing the -Y axis, left side facing +X and, of course, up facing +Z.
UE4’s native model/world orientation is the model’s forward facing the +X axis, left side facing -Y and up facing +Z.
There are axis settings in Blender’s FBX export module that, theoretically, should be set like this:

However, in tests I did, the axis settings made no difference when importing into UE4, even when intentionally setting incorrect upside-down axes.
Maybe the FBX exporter writes these settings to metadata that the UE4 importer doesn’t read…
From my experience, what’s important is to orient the model correctly in Blender (see above), apply the transformations,
and in the UE4 import menu, check the “Force Front XAxis” option:

Geometry:
Make sure either “Edge” or “Face” is chosen in the “Smoothing” option, to import the mesh’s smooth shading correctly and avoid a smoothing groups warning on import:

Optional:
Depending on how much control you need over the mesh’s tangent space,
you may want to check the “Tangent Space” export option.
This will make Blender export the full tangent space to the FBX, and make UE4 read it from the FBX instead of generating it automatically.
* For this option to be supported, the mesh geometry must contain only triangle or quad polys.
In the UE4 import settings, choose the “Import Normals and Tangents” option under “Normal Import Method”:

Armature:
Set “Armature FBXNode Type” to “Root”.
Uncheck the “Add Leaf Bones” option to avoid adding unneeded end bones.
Set the bones’ primary axis to X, and the secondary axis to -Z.

Animation:
Uncheck “All Actions” to avoid exporting actions that don’t actually belong to the skeleton.
* Unrelated animations in the FBX can also corrupt the character’s rest pose in UE.
The “NLA Strips” option is useful for exporting a library of animations with the skeleton.
* In Blender’s NLA editor, activate the actions you want exported to the FBX.


Related:
3ds max & V-Ray to UE4 Datasmith workflow

A collection of Python snippets for 3D

If you’re interested in taking a first step into Python for 3D software,
or simply would like to browse some script examples, you’re welcome to visit my Gist page.
It contains a useful library of Python code examples for Blender, Maya, 3ds max and Unreal Engine:
https://gist.github.com/CGLion

UE4 – Triplanar projection mapping setup

Software:
Unreal Engine 4.25

Triplanar Projection Mapping can be an effective texture mapping solution for cases where the model doesn’t have naturally flowing continuous UV coordinates, or there is a need to have the texture projected independently of UV channels, with minimally visible stretching and other mapping artifacts.
Classic use cases for Triplanar Projection Mapping are terrains and organic materials. Provided that the image being used is a seamless texture, no seams will be visible, because this projection type isn’t affected by UV coordinates.
Triplanar Projection Mapping can also be used in world space to create a continuous texture between separate meshes, allowing the meshes to be indestructibly transformed and edited.

How does Triplanar Projection Mapping work?
Triplanar Projection Mapping is a linear blend between 3 orthogonal 2D planar texture projections, typically each aligned to a natural world or object axis.
The more the surface faces an axis, the higher the weight of this axis projection in the final blend.

UE4 local (object) space Triplanar Projection Mapping material setup:
* It’s usually more efficient to create this setup as a Material Function (an HLSL sketch of the same math follows the list below).

  1. Local shading coordinates are multiplied by a “density” parameter to allow convenient scaling of the projected texture.
  2. The scaled coordinates vector is separated into its components, which are combined into 3 pairs of planar coordinates, XY, XZ and YZ, and fed as the sample coordinates to the 3 Texture Sample nodes.
  3. The Vertex Normal input vector is transformed to local space, converted to its absolute value (absolute orientation in the positive axes octant) and separated into its X, Y and Z components so they can serve as blend weights in the mix.
  4. Each of the 3 planar axis projections is multiplied by its blend factor, and the resulting values are added together to form the raw mix.
  5. A value of 1.0 is divided by the sum of the Normal vector’s components to obtain the factor needed to normalize the blend result to a value of 1.0.
  6. The raw blend value is multiplied by the normalizing factor so that the resulting blend color is normalized.
    * The blend weights should add up to a value of 1.0, but a unit vector’s components add up to more than 1.0 in diagonal directions. For this reason, without this final step, the texture color at points on the surface that are diagonal to the projection axes would appear brighter than at points that face a projection axis.
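
For reference, the same blend can be written in HLSL, for example in a Custom node. This is a minimal sketch of the math described above, not the exact node setup; tex_in (a TextureObject input), density, and the local position/normal inputs are assumptions:

// Scaled local-space projection coordinates
float3 p = local_position * density;
// Per-axis blend weights from the absolute local-space normal
float3 n = abs(local_normal);

// The 3 planar projections
float3 col_xy = Texture2DSample(tex_in, tex_inSampler, p.xy).rgb;
float3 col_xz = Texture2DSample(tex_in, tex_inSampler, p.xz).rgb;
float3 col_yz = Texture2DSample(tex_in, tex_inSampler, p.yz).rgb;

// Weighted sum, normalized so the weights add up to 1.0
float3 col = col_xy * n.z + col_xz * n.y + col_yz * n.x;
return col / (n.x + n.y + n.z);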

An example of Triplanar Projection Mapping in world space:

A bunch of Blender monkeys (Suzanne) continuously textured using world space Triplanar Projection Mapping:

Related posts:
UE4 – Material Functions
UE4 – Material Instances
UE4 – Bump mapping
UE4 – Procedural bump mapping

UE4 – Procedural 3D noise bump setups

Software:
Unreal Engine 4.25

Yet another case where I develop my own costly solution, only to find out afterwards that there’s actually a much more efficient built-in solution… 😀

In this case, the subject is deriving a bump normal from a procedural or non-UV-projected height map/texture (noise, or tri-planar mapping, for example).

The built-in way:
Using the pre-made material functions PreparePerturbNormalHQ and PerturbNormalHQ: the first uses the low-level Direct3D derivative functions DDX and DDY to derive the two extra surface-adjacent values needed to derive a bump normal, and the second uses the 3 values to generate a world-space bump normal:

240 instructions
  1. Noise coordinates are obtained by multiplying the surface shading point local position by a value to set the pattern density.
  2. The Noise output value is multiplied by a factor to set the resulting bump intensity.
  3. The PreparePerturbNormalHQ function is used to derive the 2 extra values needed to derive a bump normal.
  4. The PerturbNormalHQ function is used to derive the World-Space bump normal.
  5. Note:
    Using this method, the material’s normal input must be set to world-space by unchecking Tangent Space Normal in the material properties.

The method I’m using:
This method is significantly more expensive in the number of shader instructions, but in my opinion, generates a better quality bump.
Sampling 3 Noise nodes at 3 adjacent locations in tangent-space to derive the 3 input values necessary for the NormalFromFunction material function:

412 instructions
  1. Noise coordinates are obtained by multiplying the surface shading point local position by a value to set the pattern density.
  2. The vertex normal is crossed with the vertex tangent vector to derive the bitangent (sometimes called the “binormal”).
  3. The vertex tangent and bitangent vectors are multiplied by a bump-offset* factor to create the increment steps to the additionally sampled Noise values.
    * This factor should be a parameter for easy tuning, since it determines the distance between the height samples in tangent space.
  4. The increment vectors are added to the local position to get the final height sample positions.
  5. The NormalFromFunction material function is used to derive a tangent-space normal from the 3 supplied height samples.

Note:
From my experience, even though the UV1, UV2 and UV3 inputs of NormalFromFunction are annotated as V3, the function will only work if the inputs are scalar values and not a vector/color.
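
For reference, the underlying math of deriving a tangent-space normal from 3 height samples looks roughly like this in HLSL (a sketch of the principle, not the actual NormalFromFunction implementation; h0, h1, h2 and offset are assumptions):

// h0 = height at the shading point
// h1 = height sampled one offset step along the tangent direction
// h2 = height sampled one offset step along the bitangent direction
float du = (h1 - h0) / offset; // slope along the tangent
float dv = (h2 - h0) / offset; // slope along the bitangent
// The steeper the slope, the more the tangent-space normal tilts away from +Z
return normalize(float3(-du, -dv, 1.0f));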

Related:
UE4 – Material Functions
UE4 – Bump map
UE4 – fix an inverted normal map
UE4 – Triplanar mapping

UE4 – Quick fix for normal map encoding

Software:
Unreal Engine 4.25 | Photoshop 2020

Quick UE normal map tip:
If you load a normal map into a UE material and the result appears inverted, i.e. holes instead of bumps or the other way around:

The quick fix:

  1. In the texture settings, check the Flip Green Channel option and save it:


The deep fix:
* This can be performed as an automated action on multiple files

  1. Open the normal map in Photoshop
  2. In the Channels panel, select the Green channel
  3. Press Ctrl + I,
    or select Image > Adjustments > Invert,
    to invert the green channel.
  4. Save the texture and reload it into Unreal Engine.

Inverted normal map:

Fixed normal map:

Related:
UE4 Bump map
UE4 – Procedural Bump Normals