This post is a summary of the tips given by Epic Games technical artist Min Oh in his GDC 2017 lecture about improving photo-realism in product visualization, more specifically how to render high-quality surfaces.
I recommend watching the full lecture:
Render sharper reflections by increasing the Cubemap resolution of reflection captures: Project Settings > Engine > Rendering > Reflection > Reflection Capture Resolution
* use power-of-2 values, e.g. 256, 512, 1024…
Improve the accuracy of environment lighting by increasing the Cubemap resolution of the Skylight:
* use power-of-2 values, e.g. 256, 512, 1024…
Improve the accuracy of screen-space effects, like screen space reflections, by setting the engine to compute high-precision normals for the GBuffer:
Set Project Settings > Engine > Rendering > Optimizations > GBuffer Format to: High Precision Normals
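For reference, these settings can also be applied from code. Below is a minimal C++ sketch (assuming an editor-utility or startup context); the console-variable names correspond to the project settings above, but the numeric value used for the high-precision-normals GBuffer format is an assumption and should be verified against your engine version:

```cpp
#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"
#include "Components/SkyLightComponent.h"

void ApplyHighQualityReflectionSettings(USkyLightComponent* SkyLight)
{
    // Sharper reflection captures (same as Project Settings > Rendering > Reflection Capture Resolution).
    if (IConsoleVariable* CaptureRes = IConsoleManager::Get().FindConsoleVariable(TEXT("r.ReflectionCaptureResolution")))
    {
        CaptureRes->Set(1024); // power-of-2 value
    }

    // High-precision GBuffer normals (3 is assumed to be the high-precision-normals format in 4.2x).
    if (IConsoleVariable* GBufferFormat = IConsoleManager::Get().FindConsoleVariable(TEXT("r.GBufferFormat")))
    {
        GBufferFormat->Set(3);
    }

    // More accurate environment lighting via a higher Skylight cubemap resolution.
    if (SkyLight)
    {
        SkyLight->CubemapResolution = 512; // power-of-2 value
        SkyLight->RecaptureSky();
    }
}
```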
Use a high degree of tessellation (subdivision) for the models pre-import.
Simply put: use high-quality models.
Improve the surfaces’ tangent-space accuracy, and as a result also the shading/reflection accuracy, by setting the model’s static meshes to encode a high-precision tangent basis: Static Mesh Editor > Details > LOD 0 > Build Settings > Use High Precision Tangent Basis
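If you have many meshes to update, the same Build Settings flag can be toggled through editor scripting. This is only a hedged, editor-only sketch (the function name is illustrative):

```cpp
#include "CoreMinimal.h"
#include "Engine/StaticMesh.h"

#if WITH_EDITOR
void EnableHighPrecisionTangentBasis(UStaticMesh* StaticMesh)
{
    if (StaticMesh && StaticMesh->GetNumSourceModels() > 0)
    {
        // Same checkbox as Static Mesh Editor > LOD 0 > Build Settings > Use High Precision Tangent Basis.
        FStaticMeshSourceModel& SourceModel = StaticMesh->GetSourceModel(0);
        SourceModel.BuildSettings.bUseHighPrecisionTangentBasis = true;

        StaticMesh->Build(false);       // rebuild the mesh's render data
        StaticMesh->MarkPackageDirty(); // mark the asset so the change can be saved
    }
}
#endif // WITH_EDITOR
```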
Create materials with rich dual specular layers by enabling the clear coat second normal input: Project Settings > Engine > Rendering > Materials > Clear Coat Enable Second Normal
Then set the material’s Shading Model to Clear Coat and use a ClearCoatBottomNormal input node to add a normal map for the underlying layer:
Steps for activating DXR Ray-tracing in a UE4 project:
Project Settings: Platforms > Windows > Targeted RHIs:
Set Default RHI to DirectX 12
* RHI = Rendering Hardware Interface
Project Settings: Engine > Rendering > Ray Tracing:
Check Ray Tracing
* Requires restarting the editor, and the project may take a while to load afterwards.
* I’m actually not sure whether the delay in re-launching the project is caused by a full lighting rebuild or by shader compilation.
Post Process Volume > Rendering Features > Reflections:
Set Type to: Ray Tracing
Post Process Volume > Rendering Features > Ray Tracing Reflections:
Set Max Bounces to more than 1 if needed
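As a supplement to the editor steps above, here is a small hedged C++ sketch for checking that ray tracing is active and raising the reflection bounce count via a console command (the CVar name is an assumption based on UE 4.2x; the base Ray Tracing switch itself is a startup project setting and cannot be toggled at runtime):

```cpp
#include "CoreMinimal.h"
#include "Kismet/KismetSystemLibrary.h"
#include "RenderUtils.h"

void ConfigureRayTracedReflections(UObject* WorldContextObject)
{
    // Returns false if the project or hardware is not running with DXR ray tracing enabled.
    if (!IsRayTracingEnabled())
    {
        return;
    }

    // Mirrors Post Process Volume > Ray Tracing Reflections > Max Bounces.
    UKismetSystemLibrary::ExecuteConsoleCommand(WorldContextObject, TEXT("r.RayTracing.Reflections.MaxBounces 2"));
}
```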
The Static Lighting calculation in UE4 is performed by the Lightmass module (UE4’s integrated GI* engine), and the result of this calculation is stored in each object’s Lightmap, an extra texture map used for storing static light and shadow information.
This post provides a list of useful tips and techniques for improving your UE4 scene setup for an efficient light calculation.
Notes:
The following tips are aimed at achieving a good lighting calculation/solution but they don’t include optimization methods for high performance projects.
Namely, we don’t get into manual Lightmap UV optimizations here.
The following tips don’t take into account the now real-time ray-tracing options that have become available with Nvidia Geforce RTX / DirectX DXR.
Scene Setup:
Delete unseen polygons from your mesh so they won’t waste Lightmap resolution.
* For example, in an interior Archviz project, delete the outer polygons of the walls.
Set the architectural surfaces to cast shadows from both sides: Details > Lighting > Shadow Two Sided
Place “light blockers” around the structure to avoid light leaks.
* Wrap the structure on all sides with scaled cubes that have an absolute black material:
Set the “light blockers” to be invisible in rendering:
Scale the Lightmass Importance Volume to fit tightly around the structure.
Lightmap Resolution:
Optimize the Lightmap resolution of the architectural surfaces (static meshes).
A higher resolution will allow the Lightmap to store more detailed lighting.
The Static Mesh resolution setting is found in: Static Mesh Editor > Details > General Settings > Light Map Resolution:
* This setting can also be overridden per actor by selecting the actor in the map/level and activating: Details > Lighting > Override Lightmap Res
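The same override can be set per component from code; a minimal sketch (the function name is illustrative, the property names are those of UStaticMeshComponent):

```cpp
#include "CoreMinimal.h"
#include "Components/StaticMeshComponent.h"

void OverrideLightmapResolution(UStaticMeshComponent* MeshComponent, int32 Resolution)
{
    if (MeshComponent)
    {
        // Same as Details > Lighting > Override Lightmap Res on the selected actor.
        MeshComponent->bOverrideLightMapRes = true;
        MeshComponent->OverriddenLightMapRes = Resolution; // e.g. 512; takes effect on the next lighting build
    }
}
```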
Use the Lightmap Density optimization display mode to inspect the actual Lightmap texel density.
The Lightmap Density display mode also color codes the display to indicate the efficiency of the Lightmap resolution per object (green being optimal, and warm colors being too dense).
* Note that in many cases of Archviz you may want a higher density than the editor displays as optimal.
Lightmass Settings:
The Lightmass settings are found in: World Settings > Lightmass
Decrease the Volumetric Lightmap Detail Cell Size to increase the light calculation accuracy:
* This will increase the calculation time
Decrease the Indirect Lighting Smoothness to get more detailed shadows:
Disable Compress Lightmaps to avoid banding artifacts in the shadow gradient:
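For reference, these Lightmass values live on the level’s World Settings and can also be set from editor code. A hedged sketch (member names per FLightmassWorldInfoSettings; the example values are only illustrative):

```cpp
#include "CoreMinimal.h"
#include "Engine/World.h"
#include "GameFramework/WorldSettings.h"

void TuneLightmassSettings(UWorld* World)
{
    AWorldSettings* WorldSettings = World ? World->GetWorldSettings() : nullptr;
    if (WorldSettings)
    {
        WorldSettings->LightmassSettings.VolumetricLightmapDetailCellSize = 100.0f; // smaller cells = more accurate (and slower) calculation
        WorldSettings->LightmassSettings.IndirectLightingSmoothness = 0.75f;        // lower = more detailed indirect shadows
        WorldSettings->LightmassSettings.bCompressLightmaps = false;                // avoids banding in shadow gradients
    }
}
```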
Use the Lighting Only display mode to evaluate the lighting solution:
For final quality, set the Lighting Quality to Production: Build menu > Lighting Quality > Production
* GI – “Global Illumination” is a term referring to indirect light simulation, namely a calculation of how light reflects and bounces between surfaces.
The UE4 First Person template is a good way to start an Architectural virtual tour project, but we first need to “clean” it up, namely, get rid of all the unnecessary objects and settings.
Start with the obvious:
Delete all the cubes and blocks. (Simply select them and press delete)
The quickest way to select all these objects is through the World Outliner window.
Select all the unneeded objects (see image below) and delete them.
Note:
I’m intentionally keeping the 4 surrounding wall objects because I want them to serve as invisible barrier objects that will stop the player from wandering off the platform.
So now our level looks like this, with weird static shadows left by the “BigWall” objects that were deleted.
It’s not really critical to fix this at this stage, but if you want to get rid of the weird left-over shadows, simply press the Build button to re-build the lighting, and they will be gone.
Making the walls invisible:
Select the 4 wall objects, and in the Details window, in the Lighting Settings uncheck Cast Shadow,
And under Rendering uncheck Visible.
The level is now clear, and when we press play, we can free roam on the empty stage until we hit the invisible walls.
* You can re-build the lighting to get rid of the walls’ static shadows.
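The same wall setup can also be expressed in code, which makes the intent explicit: keep collision so the player is blocked, but stop the walls from rendering and casting shadows. A hedged sketch (the function name is illustrative):

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/PrimitiveComponent.h"

void MakeWallAnInvisibleBarrier(AActor* WallActor)
{
    if (!WallActor)
    {
        return;
    }

    TArray<UPrimitiveComponent*> Primitives;
    WallActor->GetComponents<UPrimitiveComponent>(Primitives);

    for (UPrimitiveComponent* Primitive : Primitives)
    {
        Primitive->SetCastShadow(false); // Details > Lighting > Cast Shadow
        Primitive->SetVisibility(false); // Details > Rendering > Visible
        // Collision is left untouched so the wall still blocks the player.
    }
}
```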
Time to get dirty!
We now have to get rid of the FPS rifle and shooting setup…
Select the FirstPersonCharacter actor, and in the World Outliner window click Edit FirstPersonCharacter to open the actor’s Blueprint:
In the FirstPersonCharacter Blueprint, navigate to the Viewport tab so you’ll be able to see the mesh components clearly,
And in the actor Components Window on the left, select all the unneeded components, delete them and press the Compile button.
* make sure you don’t select the FirstPersonCamera or any of the inherited components
A list of errors will now be displayed in the Compile Results window because we deleted objects that are referenced by the Blueprint; we will fix this in the next step:
Navigate to the Construction Script tab, Select the AttachComponentToComponent node (currently displaying an error) and delete it.
Navigate to the Event Graph tab and locate the first graph at the top of the Blueprint; this is the Event BeginPlay graph.
Select the 2 Set Hidden in Game nodes (currently displaying errors) and delete them:
Locate the Spawn projectile node graph at the bottom of the Event Graph,
Select this whole section, delete it and press Compile.
The Event Graph should now look like this, and compilation should be without errors because we deleted all the Blueprint parts that were referencing the deleted actor components:
Almost there..
It’s time to remove the small red targeting cross-hair icon displayed on the screen when playing.
The cross-hair icon is defined in the level’s HUD (Heads Up Display) Blueprint class.
The easiest way to remove it is to simply remove the HUD class from the level.
Note:
The FirstPersonHUD class can be useful to an Archviz project for displaying branding and architectural data on screen, so it’s good to keep it in the project. It can later be modified to suit our needs and used again (doing that is beyond the scope of this article).
If you wish to edit the HUD Blueprint instead of disconnecting it from the level, you’ll find it in Content > FirstPersonBP > Blueprints > FirstPersonHUD:
To remove the HUD from the level, navigate to the World Settings window,
If it isn’t displayed open it from Settings > World Settings:
In the World Settings window, under Game Mode > Selected GameMode, open the HUD Class drop-down and instead of FirstPersonHUD, choose None.
This will remove the HUD from the level but won’t delete it from the project:
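If you prefer doing this in code, a custom game mode can simply not specify a HUD class. This is only an illustrative sketch (the class name is hypothetical, and the default FirstPerson template uses a Blueprint game mode rather than a C++ one):

```cpp
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/GameModeBase.h"
#include "ArchvizGameMode.generated.h"

UCLASS()
class AArchvizGameMode : public AGameModeBase
{
    GENERATED_BODY()

public:
    AArchvizGameMode()
    {
        // No HUD will be spawned for levels that use this game mode.
        HUDClass = nullptr;
    }
};
```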
Creating an HDRI environment background and lighting* in UE4:
Note:
Lighting using a panoramic HDRI background is also referred to as IBL – Image Based Lighting.
Import the HDRI environment file.
Note:
The file must be saved as a *.hdr file and not *.exr, because AFAIK that’s the only way UE4 will recognize it as an HDRI environment and encode it as a Texture Cube (cube map).
Enable the HDRIBackdrop plugin:
Go to Edit > Plugins
Type “HDRI” in the search field to locate HDRIBackdrop and enable it.
* You’ll have to restart the UE Editor before using the plugin
Drag a Lights > HDRI Backdrop object to your level:
In the HDRIBackdrop details, select the wanted Cubemap:
> Set the HDRIBackdrop‘s Intensity (self-explanatory).
> Rotate the HDRIBackdrop around its Z axis to set the environment’s direction.
> Set the HDRIBackdrop‘s Size.
* Make it larger than your whole scene,
And if Use Camera Projection is unchecked, also make it large enough so that noticeable objects in the HDRI image are distant enough not to move incorrectly when you strafe.
* When Use Camera Projection is activated the Size property has no effect.
> If Use Camera Projection is unchecked, set the Projection Center Z value to define the background image height below which it is projected as a flat ground.
> Lighting Distance Factor defines the ground projection area that will appear to receive shadows from your scene objects.
* Set this attribute to 0 in order to turn off the ground projection shadow.
> Use Camera Projection:
Activate this option to get a traditional infinitely far background with no flat ground surface projection.
Software: 3ds max 2020 | V-Ray Next | Unreal Engine 4.25
This post details basic steps and tips for exporting models from 3ds max & V-Ray to Unreal Engine using the Datasmith plugin.
The Datasmith plugin from Epic Games is revolutionary in the relatively painless workflow it enables for exporting 3ds max & V-Ray architectural scenes into Unreal Engine.
Bear in mind however, that Datasmith‘s streamlined workflow can’t always free us from the need to meticulously prepare models as game assets by the book (UV unwrapping, texture baking, mesh and material unifying etc.) (especially if we need very high game performance).
That being said, the Datasmith plugin has definitely revolutionized the process of importing assets into Unreal, making it much more convenient and accessible.
Make sure all materials are VRayMtl type (these get interpreted relatively accurately by Datasmith)
Make sure all material textures are properly located so the Datasmith exporter will be able to export them properly.
In Rendering > Exposure Control:
Make sure Exposure Control is disabled.
Explanation:
If Exposure Control is active, it will be exported to the Datasmith file, and when imported to your Unreal Level/Map, a “Global_Exposure” actor will be created with the same exposure settings. Sounds good, right? So what’s the problem?
The problem is that these exposure settings will usually be compatible with photometric light sources like a VRaySun, for example, but when imported to Unreal, the VRaySun does not keep its photometric intensity (in my tests it got a 10 lx intensity on import). The result is that the imported exposure settings cause the level to be displayed completely dark.
Of course you can simply delete the “Global_Exposure” actor after import, but honestly, I always forget it’s there, and start looking for a reason why everything is black for no apparent reason…
* If you’re familiar with photometric units, you can set the VRaySun to its correct intensity of about 100,000 lx, and also adjust the other light sources’ intensities to be compatible with the exposure settings.
Select all of the model’s objects intended for export,
And File > Export > Export Selected:
* If you choose File > Export > Export, you’ll still have an option to export only selected objects.
In the File Export window,
Select the export location, name the exported file,
And in the File type drop-down select Unreal Datasmith:
In the Datasmith Export Options dialog,
Set export options, and click OK.
* Here you select whether to export only selected objects or all objects (again)
Depending on the way you prepared your model,
You may get warning messages after the export has finished:
Explanation:
Traditionally, models intended for use in a game engine should be very carefully prepared with completely unwrapped texture UV coordinates and no overlapping or redundant geometry UV space.
Datasmith allows for a significantly more forgiving and streamlined (and friendly) workflow, but it still warns about problems it locates.
In many cases these warnings will not have an actual effect (especially if Lightmap UVs are generated by Unreal on import), but take into account that if you do encounter material/lighting issues down the road, these warnings may be related.
Note that the Datasmith exporter created both a Datasmith (*.udatasmith) file, and a corresponding folder containing assets.
It’s important to keep both these items in their relative locations:
In Unreal Editor:
Go to Edit > Plugins to open the Plugins Manager:
In the Plugins Manager search field, type “Datasmith” to find the Datasmith Importer plugin in the list, and make sure Enabled is checked for it.
* Depending on the project template you started with, it may already be enabled.
* If the plugin wasn’t enabled, the Unreal Editor will prompt you to restart it.
In the Unreal project Content, create a folder to which the new assets will be imported:
* You can also do this later in the import stage
In the main toolbar, Click the Datasmith button to import your model:
Locate the *.udatasmith file you exported earlier, then double click it, or select it and press Open:
In the Choose Location… dialog that opens,
Select the folder to which you want to import the assets:
* If you didn’t create a folder prior to this stage you can right click and create one now.
The Datasmith Import Options dialog lets you set import options:
* This can be a good time to raise the Lightmap resolution for the models if needed.
Wait for the newly imported shaders (materials) to compile.
The new assets will automatically be placed into the active Map\Level in the Editor.
All of the imported actors will be automatically parented to an empty actor named the same as the imported Datasmith file.
In the Outliner window, locate the imported parent actor, and transform it in order to transform all of the imported assets together:
* If your map’s display turns completely dark or otherwise weird on import, locate the “Global_Exposure” actor that was imported and delete it (you can of course set new exposure settings or adjust the light settings to be compatible)
The TwoSidedSign node lets the shader “know” whether a rendered polygon is facing the camera or not by outputting a value of 1 for facing polygons and -1 for back-facing polygons.
This is useful for creating materials that have different properties when seen front-facing or back-facing.
Example 1:
Blending two different colors based on face direction:
Check the Two Sided material attribute.
* Needed so that the engine will render the polygons’ back sides.
In the material blueprint, create a blend of two colors using a Lerp (LinearInterpolate) node and connect it to the material’s Base Color input.
Add a TwoSidedSign node to get polygon facing input (1,-1).
Connect the TwoSidedSign node’s output to a Clamp node to clamp the values to (1,0).
Connect the Clamp node’s output to the Lerp node’s Alpha input so that the polygon’s facing direction will control the Lerp blend.
Note:
You can use this method to blend any other material attribute based on polygon facing direction.
Example 2:
Create an “inwards facing” flipped normal material:
Set the material’s Blend Mode to Masked.
* Needed to be able to make parts of the mesh invisible.
Check the Two Sided material attribute.
* Needed so that the engine will render the polygons’ back sides.
Add a TwoSidedSign node to get polygon facing input (1,-1).
Connect the TwoSidedSign node’s output to a Clamp node to clamp the values to (1,0).
Connect the Clamp node’s output to a 1-X node to invert the facing input.
Connect the 1-X node’s output to the material’s Opacity Mask input so that polygons facing the camera will be invisible.
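To make the logic of both examples explicit, here is a plain C++ sketch of the per-pixel math the node graphs perform (names are illustrative; in the material this is done with the TwoSidedSign, Clamp, Lerp and 1-x nodes):

```cpp
#include "CoreMinimal.h"

// Example 1: blend two colors by facing direction.
// TwoSidedSign is +1 for polygons facing the camera and -1 for back-facing polygons.
FLinearColor BlendByFacing(float TwoSidedSign, const FLinearColor& FrontColor, const FLinearColor& BackColor)
{
    const float FacingAlpha = FMath::Clamp(TwoSidedSign, 0.0f, 1.0f); // (-1..1) -> (0..1)
    return FMath::Lerp(BackColor, FrontColor, FacingAlpha);           // alpha 1 = front color, alpha 0 = back color
}

// Example 2: opacity mask for an "inwards facing" material.
float InwardFacingOpacityMask(float TwoSidedSign)
{
    // 1-x inverts the clamped facing value, so camera-facing polygons get mask 0 (invisible).
    return 1.0f - FMath::Clamp(TwoSidedSign, 0.0f, 1.0f);
}
```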
An example of a simple fog effect created using a Post Process material:
The fog material Blueprint: The method for creating the fog effect is to take each pixel’s distance from the camera, map it to a value range suitable for color blending (0 – 1), and use that to blend the object’s color with the fog color, so the further away the object, the more it is colored by the fog (a code sketch of this computation follows the node breakdown below).
Start by creating a new material, and follow the details below to create the Blueprint:
The Material Domain is set to Post Process.
And has its Blendable Location parameter set to Before Tonemapping so it will be applied on the raw render.
A SceneTexture node with its Scene Texture Id parameter set to PostProcessInput0 serves as the input of the view’s original rendered pixel colors:
A Lerp (LinearInterpolate) node calculates the blending of the view’s original pixel colors with the Fog color to create the fog effect.
A SceneTexture node with its Scene Texture Id parameter set to SceneDepth serves as the input of depth of each pixel (distance from camera):
A ComponentMask node set to the R channel allows using the depth data as a single float value instead of a Vector4:
A Clamp node is used to clamp (limit) the depth value to the Fog’s maximum depth value (see below)
A RemapValueRange maps the distance value to a fog density value that will be used as the Lerp (3) alpha parameter.
Simply put, the further the object, the more the original color will be blended with the fog color.
A Power node (raising the fog blend factor by an exponent) makes the fog blending non-linear, that is, beginning gently for closer objects and then increasing more drastically as the distance grows (provided that the exponent value is above 1).
A Constant Vector4 serves as an input for the fog color.
* Note that having this input be a Vector4 and not a Vector3 allows it to be interpolated with the PostProcessInput0 data, otherwise a ComponentMask (RGB) node would have been necessary to convert the PostProcessInput0 to a Vector3.
A float constant serves as an input for the fog’s minimal distance (from camera).
A float constant serves as an input for the fog’s maximal distance (from camera).
* Note that it’s connected both to the Clamp node and to the RemapValueRange node.
A float constant serves as an input for the fog’s minimal opacity (blend amount).
A float constant serves as an input for the fog’s maximal opacity (blend amount).
A float constant serves as an input for the fog’s blend exponent.
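Putting the nodes together, the per-pixel computation is roughly the following. This is only a plain C++ sketch of the math (parameter names are illustrative; in the material they are the constants listed above):

```cpp
#include "CoreMinimal.h"

FLinearColor ApplyDistanceFog(
    const FLinearColor& SceneColor, // PostProcessInput0 - the original rendered pixel color
    float SceneDepth,               // SceneDepth - the pixel's distance from the camera
    const FLinearColor& FogColor,
    float MinDistance, float MaxDistance,
    float MinOpacity, float MaxOpacity,
    float BlendExponent)
{
    // Clamp the depth to the fog's maximal distance.
    const float ClampedDepth = FMath::Clamp(SceneDepth, MinDistance, MaxDistance);

    // Remap the distance range to the opacity (blend) range.
    const float LinearBlend = FMath::GetMappedRangeValueClamped(
        FVector2D(MinDistance, MaxDistance), FVector2D(MinOpacity, MaxOpacity), ClampedDepth);

    // Raise to an exponent so the fog starts gently and thickens with distance (exponent > 1).
    const float FogBlend = FMath::Pow(LinearBlend, BlendExponent);

    // The further the pixel, the more the original color is blended towards the fog color.
    return FMath::Lerp(SceneColor, FogColor, FogBlend);
}
```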
Applying the Post Process material to the level:
Select the PostProcessVolume actor in the World Outliner window.
* Create a PostProcessVolume actor if necessary.
In the Details panel, under Rendering Features > Post Process materials,
Add a new item to the array; in the new item’s value, choose Asset Reference,
And then select your fog material:
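The same assignment can be done from C++ by adding the material to the volume’s blendables array; a hedged sketch (assuming an existing PostProcessVolume actor and the fog material already loaded as a UMaterialInterface):

```cpp
#include "CoreMinimal.h"
#include "Engine/PostProcessVolume.h"
#include "Materials/MaterialInterface.h"

void AddFogMaterialToVolume(APostProcessVolume* Volume, UMaterialInterface* FogMaterial)
{
    if (Volume && FogMaterial)
    {
        // Equivalent to adding an entry under Rendering Features > Post Process Materials.
        Volume->Settings.WeightedBlendables.Array.Add(FWeightedBlendable(1.0f, FogMaterial));
    }
}
```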