3ds max & V-Ray to UE4 – Datasmith workflow basics and tips

Software:
3ds max 2020 | V-Ray Next | Unreal Engine 4.25

This post details basic steps and tips for exporting models from 3ds max & V-Ray to Unreal Engine using the Datasmith plugin.
The Datasmith plugin from Epic Games enables a relatively painless workflow for bringing 3ds max & V-Ray architectural scenes into Unreal Engine.
Bear in mind, however, that Datasmith’s streamlined workflow can’t always free us from the need to meticulously prepare models as game assets by the book (UV unwrapping, texture baking, mesh and material unifying etc.), especially if we need very high game performance.
That being said, Datasmith has definitely revolutionized the process of importing assets into Unreal, making it much more convenient and accessible.

 

Preparation:
Download and Install the Datasmith exporter plugin compatible with your modeling software and Unreal Engine version:
https://www.unrealengine.com/en-US/datasmith/plugins

 

In 3ds max & V-Ray:

  1. Make sure all materials are of the VRayMtl type (these get interpreted relatively accurately by Datasmith).
  2. Make sure all material textures are properly located so the Datasmith exporter will be able to export them properly.
  3. In Rendering > Exposure Control:
    Make sure Exposure control is disabled.
    Explanation:
    If Exposure Control is active it will be exported to the Datasmith file, and when imported into your Unreal Level/Map a “Global_Exposure” actor will be created with the same exposure settings.
    Sounds good, right? So what’s the problem?
    The problem is that these exposure settings are usually compatible with photo-metric light sources like a VRaySun, but the VRaySun does not keep its photo-metric intensity when imported into Unreal (in my tests it got a 10lx intensity on import). The result is that the imported exposure settings cause the level to be displayed completely dark.
    Of course you can simply delete the “Global_Exposure” actor after import, but honestly, I always forget it’s there, and start looking for a reason why everything turned black for no apparent reason…
    * If you’re familiar with photo-metric units, you can set the VRaySun to its correct intensity of about 100000lx, and also adjust the other light sources’ intensities to be compatible with the exposure settings.
  4. Annotation 2020-05-12 192439
  5. Select all of the model’s objects intended for export,
    And choose File > Export > Export Selected:
    * If you choose File > Export > Export you’ll still have an option to export only selected objects.
    Annotation 2020-05-12 192506
  6. In the File Export window,
    Select the export location, name the exported file,
    And in the File type drop-down select Unreal Datasmith:
    Annotation 2020-05-12 192550
  7. In the Datasmith Export Options dialog,
    Set export options, and click OK.
    * Here you select whether to export only selected objects or all objects (again)
    Annotation 2020-05-12 192654
  8. Depending on the way you prepared your model,
    You may get warning messages after the export has finished:
    Explanation:
    Traditionally, models intended for use in a game engine should be very carefully prepared, with completely unwrapped texture UV coordinates and no overlapping or redundant geometry in UV space.
    Datasmith allows for a significantly more forgiving, streamlined (and friendly) workflow, but still warns about problems it detects.
    In many cases these warnings will have no actual effect (especially if Lightmap UVs are generated by Unreal on import), but take into account that if you do encounter material/lighting issues down the road, these warnings may be related.
    Annotation 2020-05-12 192730
  9. Note that the Datasmith exporter created both a Datasmith (*.udatasmith) file, and a corresponding folder containing assets.
    It’s important to keep both these items in their relative locations:
    Annotation 2020-05-12 204541
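As a quick sanity check of the exposure note in step 3, the sketch below converts an illuminance reading into an approximate EV100 value using the standard incident-light-meter relation. The calibration constant and the 100,000 lx sun figure are general photography conventions assumed here, not values taken from Datasmith:

```python
import math

def ev100_from_illuminance(lux, calibration=250.0, iso=100.0):
    """Approximate EV100 for an incident-light reading.
    Uses the standard incident-meter relation E = C * 2^EV / S
    (C ~ 250 for flat diffusers), i.e. EV = log2(E * S / C)."""
    return math.log2(lux * iso / calibration)

# A clear-sky sun of ~100,000 lx corresponds to roughly EV 15 (a "sunny day" exposure):
print(round(ev100_from_illuminance(100_000), 1))  # ~15.3

# A 10 lx source at the same exposure is about 13 stops darker -- effectively black,
# which is why the imported "Global_Exposure" settings blacken the level:
print(round(ev100_from_illuminance(10), 1))  # 2.0
```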

 

In Unreal Editor:

  1. Go to Edit > Plugins to open the Plugins Manager:
    Annotation 2020-05-12 192802
  2. In the Plugins Manager search field, type “Datasmith” to find the Datasmith Importer plugin in the list, and make sure its Enabled checkbox is checked.
    * Depending on the project template you started with, it may already be enabled.
    * If the plugin wasn’t enabled, the Unreal Editor will prompt you to restart it.
    Annotation 2020-05-12 192901
  3. In the Unreal project Content, create a folder into which the new assets will be imported:
    * You can also do this later in the import stage
    Annotation 2020-05-12 193030
  4. In the main toolbar, Click the Datasmith button to import your model:
    Annotation 2020-05-12 193043
  5. Locate the *.udatasmith file you exported earlier, and either double-click it or select it and press Open:
    Annotation 2020-05-12 193129
  6. In the Choose Location… dialog that opens,
    Select the folder to which you want to import the assets:
    * If you didn’t create a folder prior to this stage you can right click and create one now.
    Annotation 2020-05-12 193301
  7. The Datasmith Import Options dialog lets you set import options:
    * This can be a good time to raise the Lightmap resolution for the models if needed.
    Annotation 2020-05-12 193326
  8. Wait for the newly imported shaders (materials) to compile.
    Annotation 2020-05-12 193408
  9. The new assets will automatically be placed into the active Map\Level in the Editor.
    All of the imported actors will automatically be parented to an empty actor named the same as the imported Datasmith file.
    In the Outliner window, locate the imported parent actor, and transform it in order to transform all of the imported assets together:
    * If your map’s display turns completely dark or otherwise weird on import, locate the “Global_Exposure” actor that was imported and delete it (you can of course set new exposure settings or adjust the light settings to be compatible instead).
    Annotation 2020-05-12 193517

 

 

Related:

  1. Unreal – Architectural glass material
  2. Unreal – Camera animation

Maya – Setting the V-Ray Sun direction according to location, date and time

Software:
Maya 2018 | V-Ray 3.6

To set the VRaySun photometric light source direction according to the location in the world, the date and the time:

  1. Select the VRaySun parent node – ‘VRayGeoSun1Transform‘ and rotate it so its Z axis points to the architectural plan’s south.
  2. Select the VRaySun node – ‘VRayGeoSun1‘ and in its attributes un-check Manual Position.
    This will make the location / date / time parameters accessible.
  3. Set the GMT zone of your architectural project’s location in the world, and the Date and Time.
    * I haven’t found a way to set daylight-saving time…
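For reference, the direction V-Ray computes from these parameters can be approximated with textbook solar-position formulas. The sketch below is a rough approximation (not V-Ray's actual implementation) that estimates the sun's elevation and azimuth from latitude, day of year and local solar time:

```python
import math

def sun_position(lat_deg, day_of_year, solar_hour):
    """Rough solar elevation/azimuth (degrees) from latitude, day of year
    and local solar time. Simplified textbook formulas -- good enough to
    sanity-check a sun direction, not for precise surveying."""
    # Solar declination (Cooper-style approximation):
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Hour angle: 15 degrees per hour away from solar noon:
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, d, h = (math.radians(x) for x in (lat_deg, decl, hour_angle))
    elev = math.asin(math.sin(lat) * math.sin(d)
                     + math.cos(lat) * math.cos(d) * math.cos(h))
    az = math.atan2(-math.sin(h),
                    math.tan(d) * math.cos(lat) - math.sin(lat) * math.cos(h))
    return math.degrees(elev), math.degrees(az) % 360.0

# At the equator on the March equinox (~day 81), solar noon puts the sun near the zenith:
elev, az = sun_position(0.0, 81, 12.0)
print(round(elev, 1))  # close to 90
```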

Untitled-1

Related:
V-Ray for Maya Physical Camera
V-Ray for Maya White Balance
Daylight system addon for Blender

Architectural Visualization can be both physically correct and aesthetically pleasing

B_Sunset_EV-8_Oded-Erell

A common misconception in the field of Architectural Visualization is that we must “cheat” about the real-world lighting conditions of an architectural interior in order to render an aesthetically pleasing image of it.

I have been a professional in the field of digital 3D Visualization and Animation for the past 17 years, and the technologies we use to create synthetic imagery have developed dramatically during this period. The profession that is traditionally named “Computer Graphics”, can today rightfully be named “Virtual Photography”.

At the beginning of my career, photo-realistic rendering was impossible to perform on a reasonably priced desktop PC workstation. Today things are very different. In the early years, the process of digital 3D rendering produced images of a completely graphic nature. No one back then would mistake a synthetic 3D rendering for a real-world photograph.

About 12 years ago, the development of desktop CPU performance and the advent of 3D rendering software that uses Ray-Tracing* processes made possible a revolution in the ability to render photo-realistic images on desktop PCs. The term “photo-realistic” simply means that an uninformed viewer might mistake the synthetically generated image for a real-world photo, but it doesn’t mean the image is an accurate representation of the way a photograph of the subject would look if it really existed in the world. For a computer-generated image to faithfully represent how a real-world photo would look, it’s not enough for the rendering to be photo-realistic; it also needs to be physically correct and photo-metric.

“Physically correct” rendering means the rendered image was produced using an accurate virtual simulation of physical light behavior, and “Photo-Metric” rendering means that the virtual light sources in the 3D model have been defined using real-world physical units, and the rendered raw output is processed in a way that faithfully predicts the image that would result from a real-world camera exposure.

Most contemporary rendering software packages have the features I described above, and are therefore capable of generating photo-realistic images that are also physically correct and photo-metric, and so faithfully predict how a real-world photo of the architectural structure would look.

So what’s the problem?

The problem is that when we virtually simulate the optics of a scene using real-world physical light intensities, we come across the same challenges that exist in real-world photography, mainly the challenge of contrast management, or in more geeky terms, handling the huge dynamic range of real-world physical lighting. Simply put, we encounter the common photography artifacts: unpleasing “blown-out” or “burnt” highlights, light fixtures and windows.

Trying to solve the problem by lowering the camera exposure simply reveals more details in bright areas at the expense of darkening the more important areas of the image. Traditional photo-editing manipulations don’t do the trick either; they might serve as a blunt instrument to darken areas of the image selectively, but the result looks unnatural and fake. The traditional approach in interior rendering is therefore to simply give up the realism of the visualization by drastically reducing the intensities of visible light sources and adding invisible ones, a solution that might produce an aesthetic image, but not one that faithfully reflects how a real photograph of the place would look, or that can be said to be physically correct.

Fortunately, today we have tools and processes that allow for a much more effective development of physically accurate renders, somewhat similar in approach to technologies incorporated into professional digital photography. These techniques involve processing the rendered images using specialized file formats that contain a very high degree of color accuracy and can store the full dynamic range of the “virtual photograph”, a process called “Tone Mapping” designed to display an image in a way that mimics the way our eyes naturally see the world, and optically simulated lens effects that mimic the way a real lens would react to contrast and high intensities of light.
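As one concrete example of such a tone-mapping operator (a classic from the graphics literature, not necessarily the one any particular renderer uses), the Reinhard operator compresses an unbounded HDR luminance range into a displayable one:

```python
def reinhard(luminance, white_point=None):
    """Classic Reinhard tone-mapping operator: compresses unbounded HDR
    luminance into [0, 1). The extended form lets values at 'white_point'
    map to 1.0, so the brightest highlights still read as white."""
    L = luminance
    if white_point is None:
        return L / (1.0 + L)
    return L * (1.0 + L / (white_point * white_point)) / (1.0 + L)

# Mid-grey survives almost unchanged, while a 100x brighter highlight
# is smoothly compressed instead of clipping to pure white:
print(round(reinhard(0.18), 3))  # ~0.153
print(round(reinhard(18.0), 3))  # ~0.947
```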

Incorporating this workflow requires taking a completely different approach to creating and processing 3D rendered images than the traditional methods used in past decades. We give up some of the direct control we’re used to in computer graphics, but in return we are able to produce physically correct visualizations that are both aesthetically pleasing and have natural-feeling lighting.

Daylight_Oded_Erell

In conclusion, with effective usage of today’s imaging technologies, it’s possible to produce 3D visualization that will serve both as a faithful representation of a possible real world photograph of the architectural design, thus aiding the creative design and planning process, and at the same time provide a photo-realistic basis for producing highly aesthetic marketing media.

Thank you for reading! I would love to hear your opinion, discuss the subjects in the article and answer any questions that you may have about it.

* “Ray-Tracing” is a process that simulates the physical behavior of light by tracing the directions it travels as it hits surfaces, reflects off them and refracts through them. Ray-Tracing calculations are a key ingredient in photo-realistic rendering.

The author is Oded Erell, photo-realistic rendering specialist and instructor. The 3D visualizations displayed in this article have all been produced by CG LION Studio.
You’re welcome to visit our portfolio website and see more examples of our work.

 

Related Posts:

  1. Understanding the Photo-Metric Units
  2. IES Lighting
  3. Understanding Fresnel Reflections
  4. Understanding Transparency Render Settings
  5. Wooden floor material in V-Ray
  6. Advanced Spotlights for Blender & Cycles
  7. Advanced Architectural Glazing for Blender & Cycles

 

Measure Distance & Angle in Blender

Software:
Blender 2.79

To create and edit measurement Rulers & Protractors:

  1. In the Tool Shelf > Grease Pencil,
    Press Ruler/Protractor to activate Ruler/Protractor mode.
  2. LMB Click & Drag the 3D Viewport to create a measuring Ruler.
    * Hold Ctrl while creating the Ruler to snap its start and end points to 3D elements.
    * If one or more measuring Rulers already exist, Ctrl must be held anyway to create a new Ruler.
  3. LMB Click & Drag along an existing Ruler line to turn it into a Protractor for angle measurement.
    * Hold Ctrl while creating the Protractor to snap its apex to 3D elements.
  4. LMB Click & Drag a point in an existing Ruler or Protractor to change its location.
  5. LMB Click an existing Ruler or Protractor and Press Delete to delete it.
  6. When finished creating measurement Rulers & Protractors,
    Press Enter to save them for later use of the Ruler/Protractor mode,
    Or Press Escape to discard them (without discarding Rulers / Protractors that were previously saved).

Untitled-1

 

 

To Delete all Ruler / Protractor data:
In the Properties Panel > Grease Pencil Layers,
Delete the RulerData3D Layer.

Untitled-2

Animating the sample settings in Cycles

Software:
Blender 2.79

A quick Cycles rendering tip:

There are situations in which we need to render an animation with changing lighting complexity, and as a result, some parts of the animation need more samples than others to be rendered effectively.
For example, when the camera starts its movement outside in an exterior scene, and moves into an interior space like a house, a cave, or a vehicle, in many cases the exterior part of the animation can be rendered with far fewer samples than the interior part.

In such cases, rendering the whole animation with the higher sample settings will waste render time on the simpler parts of the animation.

One possible solution would be to simply render the animation as two separate render jobs with different sampling settings, one for the less demanding part and another for the more complex part, and then append the two parts in editing/compositing software. But that requires more work on the shot, more management, etc.

A simple solution is to animate the sample settings in Cycles.
Make tests at different times along the animation to determine how many samples are needed at each part, and key-frame the settings accordingly.
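In practice this means right-clicking the Samples field (scene.cycles.samples) and choosing Insert Keyframe at the relevant frames. As a plain-Python illustration of the resulting behavior (a stand-in for Blender's animation system, not actual Blender code), linear interpolation between the keys yields the per-frame sample count:

```python
def samples_at_frame(frame, keyframes):
    """Linearly interpolate a keyframed integer setting (e.g. Cycles'
    scene.cycles.samples) between sorted (frame, value) keys -- a plain-
    Python sketch of what Blender's animation system does with linear
    interpolation."""
    keys = sorted(keyframes)
    if frame <= keys[0][0]:
        return keys[0][1]
    if frame >= keys[-1][0]:
        return keys[-1][1]
    for (f0, v0), (f1, v1) in zip(keys, keys[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return round(v0 + t * (v1 - v0))

# Exterior start needs few samples, interior end needs many:
schedule = [(1, 200), (100, 1000)]
print(samples_at_frame(1, schedule))    # 200
print(samples_at_frame(50, schedule))   # 596
print(samples_at_frame(120, schedule))  # 1000
```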

AnimateSamples

Basic architectural glazing material in UE4

Software:
Unreal Engine 4.18

  1. Create a new material, and double click it to edit it.
  2. In the Details panel, under Material, set Blend Mode to Translucent.
  3. In the Details panel, under Translucency, set Lighting Mode to Surface Translucency Volume.
  4. Set Base Color to White.
  5. Set Metallic to 1.
  6. Set Roughness to 0.
  7. Create a Fresnel node and connect it to the Opacity input.
  8. In the Fresnel node, set Base Reflect Fraction to control reflection amount in perpendicular surface viewing angle (front).
    * Note that it’s connected to Opacity, but since the material is basically a flat mirror, when it’s not purely transparent it will be reflective.
  9. In the Fresnel node, set Exponent to control the reflection amount falloff curve from perpendicular surface viewing angle (front) to parallel surface viewing angle (sides).
    * Higher values will create a steep falloff curve, resulting in less reflection in most viewing angles.
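The falloff described in steps 8–9 can be sketched as a Schlick-style fresnel function. To the best of my understanding, UE4's Fresnel node computes something close to the following (treat the exact formula as an assumption):

```python
def fresnel(n_dot_v, exponent=5.0, base_reflect_fraction=0.04):
    """Schlick-style fresnel falloff, similar to what UE4's Fresnel node
    is assumed to compute: full reflection at grazing angles, only
    'base_reflect_fraction' when looking straight at the surface."""
    n_dot_v = max(0.0, min(1.0, n_dot_v))  # clamp dot(Normal, CameraVector)
    return base_reflect_fraction + (1.0 - base_reflect_fraction) * (1.0 - n_dot_v) ** exponent

# Facing the glass head-on (N.V = 1) only the base fraction reflects;
# at a grazing angle (N.V = 0) the surface becomes fully reflective:
print(round(fresnel(1.0), 2))  # 0.04
print(round(fresnel(0.0), 2))  # 1.0
```

This also explains step 9's note: a higher Exponent makes the (1 - N.V) term die off faster, so most viewing angles stay near the base reflect fraction and only very grazing angles reflect strongly.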

Untitled-3

Exporting 3D camera data from After Effects to 3ds max

Software:
After Effects 2020

  1. Download the AE3D_Export script here:
    http://www.urbanspaceman.net/shared/AEscripts/AE3Dexport/AE3D_Export.jsx
  2. Perform 3D tracking on the footage if necessary.
  3. Select the 3D Camera layer and also Null layers if available.
  4. Choose File > Scripts > Run Script File and locate the AE3D_Export script.
  5. In the Script parameters highlight 3ds max.
  6. Click Options and set the scale.
    * You may need to check the resulting scale in 3ds max and adjust this setting accordingly.
  7. Set a name for the exported ms (MaxScript) file.
  8. Click Export.
    The resulting MaxScript file will appear on the desktop named <your after effects project name>.ms
    Note:
    You may be prompted to check the Allow Scripts to Write Files and Access Network option in File > Preferences > Scripts & Expressions.
  9. Drag the generated MaxScript file into the 3ds max viewport.
    The script will run and create an animated Camera and Dummy object, and also set the timeline range to fit the animation.
  10. Create a new Point Helper object.
  11. Align the new Point Helper to the Dummy object in both position and orientation.
  12. Group the Camera and the Dummy objects together, and link the group to the new Point Helper.
    This will allow for easy orientation and scaling.
  13. Set the Point Helper object’s rotation to default (0,0,0) , this will also reset the Dummy + Camera group’s orientation relative to the world.
    Scale the Point Helper object if needed, to scale the whole camera setup.
  14. Display the original video sequence as viewport background to check how the camera motion fits the video.
    The center of your Point Helper should appear “glued” to a specific point in the background video. 
Untitled-2

 

Related:
After Effects 3D Camera tracking