Welcome! In this tutorial series we’ll be creating a full-featured, cinematic VR Experience with Unreal Engine, starting from scratch. Throughout each post in the series, we’ll go over every aspect of a high-fidelity real-time production, from the best practices for this type of content to various real-time engine tips, tricks and workarounds.

Our end product places the viewer in a dreamy, Celtic landscape in the short, silent moment between nighttime and dawn. Are they alone, or are ghosts rising for an instant, only to taste the dew and fall back once again into their slumber?

This tutorial is intended for seasoned game engine users and real-time rendering newcomers alike. We won’t be going into the actual creation of 3D assets much; instead we’ll be looking into how we can push real-time rendering to its limits and all the various, novel ways that we can use game engines to deliver high-quality pixels in real-time. The engines are coming, and when they do you’ll be ready for them!

Part 1: Terrain – Photogrammetry on a Micro and Macro Scale

Using Satellite Imagery with Unreal’s Landscape System

In more projects than it’s possible to count, natural landscapes are one of the key components of the world geometry. From high-fidelity real-time productions to low-poly arcade games, almost everyone using real-time engines uses real-time landscape tools… and abides by their rules.

Most real-time engine landscape systems follow a similar throughline:

  • A heightmap (painted or imported by the user) deforms a plane.
  • The plane is chunked into a number of manageable parts whose resolutions are dynamically adjusted for best performance.
  • Various textures (supplied by the user) are then blended across these chunked planes according to a splat map, essentially a color mask determining which textures go where.
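
The blending step in that recipe can be sketched in a few lines of NumPy. The arrays here are toy stand-ins; a real engine does this per-texel on the GPU, with mipmapping and the chunk-level LODs described above layered on top:

```python
import numpy as np

def blend_textures(textures, splat):
    """textures: list of (H, W, 3) float arrays, one per layer.
    splat: (H, W, N) float array of per-layer weights."""
    # Normalize so each texel's layer weights sum to 1.
    weights = splat / np.clip(splat.sum(axis=-1, keepdims=True), 1e-6, None)
    out = np.zeros_like(textures[0])
    for i, tex in enumerate(textures):
        out += tex * weights[..., i:i + 1]
    return out

# Toy 2x2 example: layer 0 is pure red, layer 1 is pure green.
red = np.tile([1.0, 0.0, 0.0], (2, 2, 1))
green = np.tile([0.0, 1.0, 0.0], (2, 2, 1))
splat = np.zeros((2, 2, 2))
splat[..., 0] = [[1, 1], [0, 0]]  # red layer covers the top row
splat[..., 1] = [[0, 0], [1, 1]]  # green layer covers the bottom row
result = blend_textures([red, green], splat)
```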

While this approach does allow for unprecedented quality and performance in the context of real-time rendering, it does (like all powerful pieces of engineering) come with conditions:

  • You’ll have to supply the engine with textures.
  • You’ll have to supply the engine with a heightmap.
  • You’ll have to supply the engine with a splat map.

These three conditions don’t really come as a surprise. We haven’t quite reached neural-link terrain generation yet, so we’ll have to stick with a more manual workflow. Fortunately for us, there are a few ways to work around these requirements. While the option to paint your own heightmap and splat map is always available, it’s no surprise that aside from minor adjustments, doing either of these things by hand is incredibly inefficient and time-consuming. In this part of the series, we’ll be doing two things to ease our landscape woes:

  • We’ll source the heightmap from satellite imagery.
  • We’ll come up with a way to generate our splat map based on terrain geometry, so we can automatically have cliff textures on our cliffs and grass textures on our rolling hills.
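
That second bullet boils down to deriving a mask from slope. As a hedged sketch of the idea (the function name and thresholds are illustrative, not Unreal's; in the engine this will happen per-pixel in the shader):

```python
import numpy as np

def slope_mask(heightmap, cell_size=1.0, bias=0.5, sharpness=4.0):
    """Return a 0..1 cliff mask from an (H, W) heightmap:
    0 on flat ground, 1 on steep slopes."""
    gy, gx = np.gradient(heightmap, cell_size)
    slope = np.sqrt(gx ** 2 + gy ** 2)  # rise over run per cell
    # Soft remap around the bias threshold, much like a blend node does.
    return np.clip((slope - bias) * sharpness, 0.0, 1.0)

flat = np.zeros((4, 4))                                   # no slope at all
ramp = np.arange(4, dtype=float).repeat(4).reshape(4, 4)  # slope of 1 along rows
```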

But first things first! Inside the Unreal project, we navigate to the Landscape tab where we can create and edit landscapes in our scene. There, you’re given the option to either create a blank slate or to import an existing heightmap.

We’re now at the stage where we need a heightmap for our scene. While we’re using sourced satellite data for our project, it’s important to know that if you need a more granular way to come up with a heightmap, there are a number of industry-grade tools you can use to generate terrain, like World Machine or Houdini’s heightfield tools. We’re not tied to a specific terrain layout in this project, however, so we can use satellite data.

There are a few ways to get satellite data online, but the most time-efficient one I’ve found yet is most definitely Terrain.Party. It started as a way to get real-world heightmaps for Cities: Skylines, but even the smallest online search will reveal that it’s become basically ubiquitous for our current purposes. We select an area (we stuck to our theme and went with northern Scotland), export and end up with a zip file containing several HDR texture files. We go with the sharpest-looking one; it’s our heightmap!

We can now load our heightmap file into Unreal. Before completing the import, we’re given the chance to tweak various import settings; for our purposes the defaults will do. Following import, we rescale the terrain to better fit our needs. We (and you, if you’re following along in your own project) should end up with something that looks like this:
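
One rescaling detail worth knowing: per Epic's Landscape documentation, a Z scale of 100 maps a 16-bit heightmap's full range to 512 meters of height, which makes the scale for a given real-world elevation range a one-liner. The 1000 m figure below is purely illustrative:

```python
def landscape_z_scale(height_range_m: float) -> float:
    """Z scale mapping the heightmap's full 16-bit range to
    height_range_m meters, given that a Z scale of 100 spans 512 m."""
    return height_range_m / 512.0 * 100.0

# E.g. for a tile covering roughly 1000 m of elevation (illustrative):
z_scale = landscape_z_scale(1000.0)  # 195.3125
```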

High-Fidelity PBR Materials and How to Efficiently Use Them

If you’re even moderately interested in 3DCG rendering in 2018, you’ve surely heard of either Megascans or Substances. These two products don’t operate the same way, but they adhere to the same underlying rules of Physically Based Rendering.

You’re probably going to be better served elsewhere if you’re looking for a deep dive into the intricacies of the PBR shader model, but we can definitely form a conclusion by comparing the two products: while the various calculations crunched by PBR ensure a physically accurate render, how close the various inputted parameters are to the real world is entirely up to the artist. Megascans’ values are photogrammetry-based while Substances are crafted by hand, and both look great; it’s merely a question of where you’d like to stand vis-à-vis photorealism and procedural handiness.

A good starting point between the two options can be to fetch non-PBR textures from sites like Textures.com and transmute them into being PBR compliant with utilities like Substance B2M or the Unity Technologies De-Lighting Tool. For the purposes of this project, we’ll be using Substance B2M; since we get to pick our own texture maps, it’s a little more granular than picking from the Megascans library, and following the theme of being as efficient as possible in our workflow, we can generate an entire array of PBR texture maps (Albedo, Roughness, Metalness, Ambient Occlusion, Displacement, etc.) from one base texture.
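
B2M's actual conversion is proprietary, but the core idea (deriving several maps from one base color) can be sketched. The recipe below, with luminance standing in for height and an inverted copy for roughness, is purely illustrative; real tools do far more, including delighting and normal synthesis:

```python
import numpy as np

def base_to_maps(albedo):
    """albedo: (H, W, 3) floats in 0..1. Returns (height, roughness)."""
    luma = albedo @ np.array([0.2126, 0.7152, 0.0722])  # Rec. 709 luminance
    height = (luma - luma.min()) / max(np.ptp(luma), 1e-6)
    # Assumption for this sketch: brighter texels read as higher and smoother.
    roughness = np.clip(1.0 - 0.8 * height, 0.0, 1.0)
    return height, roughness

albedo = np.zeros((2, 2, 3))
albedo[0, 0] = 1.0  # a single white texel on black
height, roughness = base_to_maps(albedo)
```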

We’ll need three textures for our terrain: a grass texture, a sandy cliff texture and a grassy soil texture to serve as an in-between for the two main textures. Here are the ones we’ve picked from Textures.com.

After locking in our choices, we run them through Substance B2M; you can tweak the various parameters to your liking. There are various PBR value reference tables online that you can use to be as real-world accurate as possible, but for the purposes of this project we’ve decided to just go with what looks best to the naked eye. Are we not, after all, the masters of our own art?

Once done with the entire B2M process, we should have generated a fair number of texture maps for our three materials. Following Epic’s asset and project naming conventions, we can now place them into our project and bring it all together!

Using Procedural Landscape Materials to Save Time

Now that we’ve created our terrain heightmap and textures, we can get to work on creating the Landscape Material for it. Similarly to run-of-the-mill meshes, real-time engine landscapes have materials applied to them that govern the way they intake a splat map and the way they output a finished product.

While some engines like Unity and Amazon Lumberyard offer a stock terrain material out-of-the-box, Unreal landscapes require a bit more work; for our purposes it’s not exactly an issue, however, as we’ll be taking advantage of this granular control to generate the entire splat map via shader code. Our final material graph will look like this:

It does look a bit daunting at first glance, but not to worry: it’s actually much less painful to understand than it is to look at. If we close in on the components, we can break the entire material down to a small recipe that gets repeated for the various maps:

We start with the LandscapeCoords node, which essentially supplies the UV coordinates of the terrain to the material editor. For more granular control of the UV tiling for our three textures, we simply divide that node’s output to get the texel size we’d like for our various maps.

We then make use of the WorldAlignedBlend node, which essentially outputs an alpha based on the normal directions of our mesh. It takes some adjusting of the bias and sharpness levels, but at the end of it we can generate two satisfying masks: one from grass to cliff, and one that adds the grassy soil on top of that so as to enhance the overall realism of our blending. Every material output slot we use (Albedo, Normal, Roughness and Ambient Occlusion) follows this simple loop.
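
For the curious, here is a rough approximation of what such a blend computes: an alpha from the world-space normal's alignment with the up axis, shaped by bias and sharpness. The exact remap inside Unreal's node may differ; this sketch only captures the general idea:

```python
import numpy as np

def world_aligned_alpha(normal, bias=0.5, sharpness=8.0):
    """normal: unit world-space normal. Returns ~1 on flat ground
    (grass), ~0 on near-vertical faces (cliff)."""
    up_dot = float(normal[2])  # cosine of the angle to straight-up
    return float(np.clip((up_dot - bias) * sharpness + 0.5, 0.0, 1.0))

flat = np.array([0.0, 0.0, 1.0])   # facing straight up
cliff = np.array([1.0, 0.0, 0.0])  # facing sideways
```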

Our parameter twiddling complete, our finished product for this post should look like this. A quick buffer visualization of the scene roughness really conveys the extent of the blending.

If you’d like, you can head over to BlueprintUE and snatch the material graph for yourself! For our purposes, this landscape is still very much bare-bones, but we’ve quickly set a baseline that we can build on.

In the next post, we’ll be looking at real-time foliage tips, material instancing and best practices for environment assets. See you soon!
