Software: Unreal Engine 4.25 *Also tested on Unreal Engine 5.0.1
Disclaimer: I’m probably the 10th person documenting these steps on the web. I didn’t come up with this solution myself; I learned it from the sources listed below. The reason I’m documenting this (again) myself is to have a clear source I can come back to for this issue, because I’m absolutely incapable of remembering subjects like this… :-\ If you find inaccuracies in the steps I’m detailing or in my explanation, I’ll be very grateful if you share a comment.
In short: AFAIK, since version 4.21 UE no longer loads custom node shader code from your project’s /Shaders folder by default, but only from the shaders folder in the engine installation, which makes it less practical to develop shaders for specific projects.
Steps for setting the UE project to load shaders from the project folder in UE 4.22:
> The examples here are for a project named: “Tech_Lab”
A. The project must be a C++ project:
So either create a new project defined as such, or just create a new C++ class and compile the project with it to convert it to a C++ project. Notes: a. You may need to right-click the .uproject file icon and Generate Visual Studio Project Files for the project to load correctly into Visual Studio and compile. b. You can delete the unneeded C++ class you added after the new settings take effect.
B. Create a folder for the shader files:
Typically, it will be called “Shaders” and will be placed in the project root folder.
C. Add the RenderCore module to the project:
This is done by adding the string “RenderCore” to the array of public dependency modules in the <project>.build.cs file:
Notes: a. In UE 4.21 it should be “ShaderCore”. b. This addition is needed in order to compile the new primary project module (next step).
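For reference, this is roughly what the dependency line in Tech_Lab.Build.cs ends up looking like (the other module names here are the standard template defaults and may differ in your project):

PublicDependencyModuleNames.AddRange(new string[] { "Core", "CoreUObject", "Engine", "InputCore", "RenderCore" });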
D. Define a custom primary module for your project:
In the <project_name>.h file, add a new module named F<project_name>Module, with a StartupModule function override. Notes: a. We have to add an include statement for “Modules/ModuleManager”. b. The <project_name>.h file is located in the /Source/<project_name> folder. c. Some sources state that you also have to override the ShutdownModule function with an empty override; it works for me without this (maybe it’s just a mistake..)
E. Implement the function override, and set the custom module as the project’s primary module:
In the <project_name>.cpp file, add the StartupModule override, with the definition of the added shaders path: FString ShaderDirectory = FPaths::Combine(FPaths::ProjectDir(), TEXT("Shaders"));
and map this new path as “/Project” for conveniently including it: AddShaderSourceDirectoryMapping("/Project", ShaderDirectory);
The last thing to do is to replace “FDefaultGameModuleImpl” with our custom module name in the IMPLEMENT_PRIMARY_GAME_MODULE macro: IMPLEMENT_PRIMARY_GAME_MODULE(FTech_LabModule, Tech_Lab, "Tech_Lab");
Notes: a. We must include “Misc/Paths”. b. The addition of this folder mapping is restricted to versions 4.22 and higher via a compiler directive condition; for version 4.21, the condition should be ENGINE_MINOR_VERSION >= 21. Note for UE5: Unreal Engine 5 supports this from the get-go, so this compiler directive condition should be deleted for this to work.
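Putting steps D and E together, here is a minimal sketch of what the two files end up looking like, based on the snippets above (the version check here combines the 4.22+ condition with a UE5 check so the same code compiles in both, per the notes):

// Tech_Lab.h
#pragma once

#include "CoreMinimal.h"
#include "Modules/ModuleManager.h"

class FTech_LabModule : public FDefaultGameModuleImpl
{
public:
	virtual void StartupModule() override;
};

// Tech_Lab.cpp
#include "Tech_Lab.h"
#include "Misc/Paths.h"
#include "ShaderCore.h"
#include "Runtime/Launch/Resources/Version.h"

void FTech_LabModule::StartupModule()
{
#if (ENGINE_MAJOR_VERSION >= 5) || (ENGINE_MINOR_VERSION >= 22)
	// Map <project>/Shaders to the virtual shader path "/Project":
	FString ShaderDirectory = FPaths::Combine(FPaths::ProjectDir(), TEXT("Shaders"));
	AddShaderSourceDirectoryMapping("/Project", ShaderDirectory);
#endif
}

IMPLEMENT_PRIMARY_GAME_MODULE(FTech_LabModule, Tech_Lab, "Tech_Lab");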
F. Wrapping up:
After taking these steps and compiling the project, you should be able to include .ush and .usf files stored in <your_ue_project>/Shaders with the “/Project” path mapping: #include "/Project/test.usf"
That’s it! 🙂
I hope you found this helpful, and if you encountered errors or inaccuracies, I’ll be grateful if you take the time to comment.
Note: This material setup uses V-Ray 5, but will also work in V-Ray Next; in previous versions, which don’t have the Metalness property, it can be implemented by unchecking Fresnel Reflections and setting the color through the Reflection color. This technique can also be implemented in V-Ray for other 3D software, like V-Ray for Maya etc.
The idea is simple: use a VRayBlendMtl to additively combine 3 anisotropic metallic materials, each of which reflects a pure primary RGB color but has a slightly different anisotropy rotation. The additive combination creates an anisotropic reflection that “spreads” the color of the spectrum. Note that in this example we use Anisotropy Rotation values of 0, 8 and 16, but this may change with different roughness and anisotropy values.
For the materials’ Diffuse Color property we set levels that will combine to create the general brightness and tint of the metal, and each material’s Reflection Color is set to 100% so they combine to a pure white reflection at grazing view angles. For example, if you want the general metallic color to be a yellow RGB 255, 186, 57, then the first, red, metallic component material would have a Diffuse Color of 255,0,0, the second, green, component material would have a Diffuse Color of 0,186,0, and the third material, which adds the blue component, would have a Diffuse Color of 0,0,57, so the combined blend has the desired general color. More info about the VRayMtl Metalness parameter
The following is a list of guidelines for preparation and export of 3D content from Blender to Unreal Engine 4 via the FBX file format.
Disclaimer: This is not a formal specification. It’s a list of tips I found to work well in my own experience. * Some of the issues listed here may have since been solved.
Blender Scene and model settings:
System units in Blender: Define the scene units in Blender as Metric with a 0.01 unit scale (centimeters), and model your content correctly using centimeter units. * Modeling in 1-meter units may seem to import correctly into UE4, but will cause unsolvable problems, like a skeletal mesh physics asset having incorrect auto-generated shapes, a problem that in my experience can’t be fixed manually. A Python sketch of these settings (and of the transform apply from the next item) appears after this list.
Transform: Build your model in Blender facing the -Y world axis, +Z obviously being up (obviously for Blender). * This way the model is aligned to Blender’s views, so the front view displays the model’s front etc. Make sure to apply your model’s transformations before export.
Armatures: Make sure the Armature object isn’t named “Armature”; leaving the Blender skeleton named “Armature” will cause the UE4 importer to fail due to “multiple roots”. * I also remember some related bug with animation scale being imported incorrectly, but I can’t confirm this now.. There’s no need for a dedicated root bone in the hierarchy; the Armature object is the root of the bone hierarchy. * See export option below
Texture baking: Set the normal map’s green channel to -Y. * This is not critical at all, because if it’s baked as +Y it can easily be fixed in UE4.
Metadata: Blender custom properties import as UE4 asset metadata that can be read by editor scripts for automation purposes. * See export option below
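As promised above, here is a minimal bpy sketch of the scene units setup and the transform apply (property and operator names per the Blender 2.8x Python API):

import bpy

# Metric units with a 0.01 unit scale (centimeters):
bpy.context.scene.unit_settings.system = 'METRIC'
bpy.context.scene.unit_settings.scale_length = 0.01

# apply the selected objects' transformations before export:
bpy.ops.object.transform_apply(location=True, rotation=True, scale=True)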
FBX Blender export and UE4 import settings:
I recommend saving an FBX export preset with these settings.
Optional: I prefer the export settings to include only selected objects. * It’s more efficient for me to select the specific objects I want to export into a single FBX file prior to export than to delete all the temp / reference / draft objects from the scene. If you want to export Blender custom properties to the FBX, check the “Custom Properties” option.
Axes: Blender’s native model/world orientation is the model’s forward facing the -Y axis, left side facing +X, and of course up facing +Z. UE4’s native model/world orientation is the model’s forward facing the +X axis, left side facing -Y, and up facing +Z. There are axis settings in Blender’s FBX export module that, theoretically, should be set like this:
However, in tests I did, the axis settings made no difference when importing into UE4, even when intentionally setting incorrect upside-down axes. Maybe the FBX exporter writes these settings to metadata that the UE4 importer doesn’t read.. From my experience, what’s important is to orient the model correctly in Blender (see above), apply the transformations, and in the UE4 import menu, check the “Force Front XAxis” option:
Geometry: Make sure either “Edge” or “Face” is chosen in the “Smoothing” option to import the mesh’s smooth shading correctly and avoid a smoothing groups warning on import:
Optional: Depending on how much control you need over the mesh’s tangent space, you may want to check the “Tangent Space” export option. This will make Blender export the full tangent space to the FBX, and make UE4 read it from the FBX instead of generating it automatically. * For this option to be supported, the mesh geometry must have only triangle or quad polys. In the UE4 import settings, choose the “Import Normals and Tangents” option in “Normal Import Method”:
Armature: Set “Armature FBXNode Type” to “Root”. Uncheck the “Add Leaf Bones” option to avoid adding unneeded end bones. Set the bones’ primary axis to X and the secondary axis to -Z.
Animation: Uncheck “All Actions” to avoid exporting actions that don’t actually belong to the skeleton. * Unrelated animations in the FBX can also corrupt the character rest pose in UE. The “NLA Strips” option is useful for exporting a library of animations with the skeleton. * In Blender’s NLA editor, activate the actions you want exported to the FBX. A sketch of an export call using these settings follows below.
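For reference, a minimal bpy sketch of an export call using the settings discussed above (the file path is illustrative, and as noted, the axis arguments didn’t seem to affect the UE4 import in my tests, so they’re left at their defaults here):

import bpy

bpy.ops.export_scene.fbx(
    filepath="character.fbx",         # illustrative output path
    use_selection=True,               # export only the selected objects
    use_custom_props=True,            # export Blender custom properties
    mesh_smooth_type='FACE',          # or 'EDGE', to avoid the smoothing groups warning
    use_tspace=True,                  # optional: export the full tangent space
    armature_nodetype='ROOT',         # "Armature FBXNode Type" set to Root
    add_leaf_bones=False,             # don't add unneeded end bones
    primary_bone_axis='X',            # bones' primary axis
    secondary_bone_axis='-Z',         # bones' secondary axis
    bake_anim_use_all_actions=False,  # don't export unrelated actions
    bake_anim_use_nla_strips=True     # export the activated NLA strips
)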
Steps for creating a new Visual Studio project based on existing code files:
Create an empty project folder and name it your intended project name.
Inside the new project folder, create a folder for your source code files. * I call it “Source”; not sure if it has to be named that way..
Copy your existing code files to the source folder.
Launch Visual Studio, and open it without code:
Select: File > New > Project From Existing Code… to open the Create New Project From Existing Code Files wizard:
Note: The documentation for this operation states that the wizard will copy files by itself; my own experience is that it doesn’t, it just links them to the project, which is why I copy the source files prior to this step. In the Create New Project From Existing Code Files wizard: 1. Set the path to your project folder. 2. Specify a project name. * I set this to be the same name as the project folder name; not sure if otherwise it will create a sub-folder.. 3. This is set to the same folder as the project folder. * It’s possible that I don’t understand this correctly.. but I think, theoretically, the intention is that you would add external folders to this list, from which source files would be copied, but like I said, when I tried that the files were not copied to the project.
Set your new project’s settings and press Finish to create the new project:
Note: New code files generated in a ported code project may be stored in a wrong folder by default, see this post for solutions.
If you’re interested in taking the first step into Python for 3D software, or simply would like to browse some script examples, you’re welcome to visit my Gist page. It contains a useful library of Python code examples for Blender, Maya, 3ds Max and Unreal Engine: https://gist.github.com/CGLion
Triplanar Projection Mapping can be an effective texture mapping solution for cases where the model doesn’t have naturally flowing continuous UV coordinates, or where there’s a need to have the texture projected independently of UV channels, with minimal visible stretching and other mapping artifacts. Classic use cases for Triplanar Projection Mapping are terrains and organic materials. Provided that the image being used is a seamless texture, no seams will be visible, because this projection type isn’t affected by UV coordinates. Triplanar Projection Mapping can also be used in world space to create a continuous texture across separate meshes, allowing the meshes to be freely transformed and edited without breaking the mapping.
How does Triplanar Projection Mapping work? Triplanar Projection Mapping is a linear blend between 3 orthogonal 2D planar texture projections, typically each aligned to a natural world or object axis. The more the surface faces an axis, the higher the weight of this axis projection in the final blend.
UE4 local (object) space Triplanar Projection Mapping material setup: * It’s usually more efficient to create this setup as a Material Function
Local shading coordinates are multiplied by a “density” parameter to allow convenient scaling of the projected texture.
The scaled coordinates vector is separated into its components, which are combined into 3 pairs of planar coordinates, XY, XZ and YZ, and fed as the sample coordinates to the 3 Texture Sample nodes.
The Vertex Normal input vector is transformed to local space, converted to its absolute value (absolute orientation in the positive axes octant) and separated into its X, Y and Z components so they can serve as blend weights in the mix.
Each of the 3 planar axis projections is multiplied by its blend factor, and the resulting values are added together to produce the raw mix.
A value of 1.0 is divided by the sum of the normal vector’s components to obtain the factor needed to normalize the blend result to a value of 1.0.
The raw blend value is multiplied by the normalizing factor so the resulting blend color is normalized. * The blend weights should add up to a value of 1.0, but a unit vector’s components add up to more than 1 in diagonal directions; for this reason, without this final step, the color of the texture at points on the surface that are diagonal to the projection axes would appear brighter than at points on the surface that face a projection axis.
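The same blend math as a minimal NumPy sketch, for illustration only (sample_tex stands in for a seamless 2D texture lookup, position and normal for the local-space shading inputs):

import numpy as np

def triplanar(sample_tex, position, normal, density=1.0):
    # scale the shading coordinates by the density parameter:
    p = position * density
    # blend weights from the absolute normal, normalized to sum to 1.0:
    w = np.abs(normal)
    w = w / (w[0] + w[1] + w[2])
    # each planar projection is weighted by how much the surface
    # faces that projection's axis:
    return (w[2] * sample_tex(p[0], p[1])    # XY projection
          + w[1] * sample_tex(p[0], p[2])    # XZ projection
          + w[0] * sample_tex(p[1], p[2]))   # YZ projection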
An example of Triplanar Projection Mapping in world space:
A bunch of Blender monkeys (Suzanne) continuously textured using world space Triplanar Projection Mapping:
Yet another case where I develop my own costly solution only to find out afterwards that there’s actually a much more efficient built-in solution.. 😀
In this case the subject is deriving a bump normal from a procedural or non-UV-projected height map/texture (like noise, or tri-planar mapping, for example).
The built-in way: Using the pre-made material functions PreparePerturbNormalHQ and PerturbNormalHQ: the first uses the low-level Direct3D derivative functions DDX and DDY to obtain the two extra surface-adjacent values needed to derive a bump normal, and the second uses the 3 values to generate a world-space bump normal:
240 instructions
Noise coordinates are obtained by multiplying the surface shading point local position by a value to set the pattern density.
The Noise output value is multiplied by a factor to set the resulting bump intensity.
The PreparePerturbNormalHQ function is used to derive the 2 extra values needed to derive a bump normal.
The PerturbNormalHQ function is used to derive the World-Space bump normal.
Note: Using this method, the material’s normal input must be set to world-space by unchecking Tangent Space Normal in the material properties.
The method I’m using: This method is significantly more expensive in the number of shader instructions but, in my opinion, generates a better quality bump. It samples 3 Noise nodes at 3 adjacent locations in tangent space to derive the 3 input values necessary for the NormalFromFunction material function:
412 instructions
Noise coordinates are obtained by multiplying the surface shading point local position by a value to set the pattern density.
Crossing the vertex normal with the vertex tangent vector derives the bitangent (sometimes called the “binormal”).
Multiplying the vertex tangent and bitangent vectors by a bump-offset* factor creates the increment steps to the additional sampled Noise values. * This factor should be a parameter for easy tuning, since it determines the distance between the height samples in tangent space.
The increment vectors are added to the local position to get the final height-sample positions.
The NormalFromFunction material function is used to derive a tangent-space normal from the 3 supplied height samples.
Note: From my experience, even though the UV1, UV2 and UV3 inputs of NormalFromFunction are annotated as V3, the function will only work if the inputs are scalar values and not a vector/color.
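For illustration, here is the finite-difference idea behind this setup as a minimal NumPy sketch (this is my reading of the technique, not UE4’s actual NormalFromFunction code; height stands in for the Noise evaluation, and the sign conventions may need flipping for a particular setup):

import numpy as np

def bump_normal_ts(height, position, tangent, bitangent, offset=0.01):
    # sample the height field at the shading point and at two points
    # offset along the tangent and bitangent directions:
    h0 = height(position)
    h1 = height(position + tangent * offset)
    h2 = height(position + bitangent * offset)
    # the height differences approximate the slope along each direction;
    # build a tangent-space normal from them and normalize:
    n = np.array([h0 - h1, h0 - h2, offset])
    return n / np.linalg.norm(n)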
Note that accessing a model’s animated vertex locations requires reading the model’s evaluated (deformed) mesh state per frame. For that reason, a new bmesh object is initiated for each frame with the model’s updated dependency graph.
import bpy
import bmesh

obj = bpy.context.active_object
frames = range(0, 10)

# get the object's evaluated dependency graph:
depgraph = bpy.context.evaluated_depsgraph_get()

# iterate animation frames:
for f in frames:
    bpy.context.scene.frame_set(f)
    # define a new bmesh object:
    bm = bmesh.new()
    # read the evaluated mesh data into the bmesh object:
    bm.from_object(obj, depgraph)
    bm.verts.ensure_lookup_table()
    # iterate the bmesh verts:
    for i, v in enumerate(bm.verts):
        print("frame: {}, vert: {}, location: {}".format(f, i, v.co))
    # free the bmesh data before the next frame:
    bm.free()