Creating a Photorealistic Outdoor Environment for Games in Unreal Engine
If you are embarking on your first photorealistic environment in Unreal Engine, this article is for you! Daniel Cano Martinez breaks down his entire process for creating an outdoor game environment in UE4.
Daniel Cano Martinez is a 25-year-old 3D Environment Artist and student at FX ANIMATION Barcelona 3D & Film School. He worked as a Motion Graphics Artist before deciding to immerse himself in the games industry.
Hello, I am Daniel Cano Martínez, an Environment Artist from Terrassa, Barcelona. I have always loved video games and I have been playing them ever since I can remember.
Due to my fascination with this discipline I decided to get involved in the games world. After graduating in Mobile Application Development I started taking my first steps in 3D art. Once I graduated in Interactive Environments I began working as a Motion Graphics Artist at Flop Work, Barcelona. I worked there for a while until I decided to pursue a master's degree at FX Animation in Barcelona, where I am now.
Inspiration and goals
This project was one made for FX Animation, where I had to make an environment and a creature. Both had some restrictions.
The environment had to be an open natural world with vegetation and architecture. The creature had to look like a gecko but several metres long, so I started to look for my references.
I have always liked the environments in Star Wars Battlefront and Star Wars Battlefront 2, which is why those were the first options I thought of. I remembered there was a planet called Yavin 4 in the Star Wars universe that matched the style I was looking for, so I started gathering references for this environment.
When I thought about the gecko I figured it should look more like a dinosaur than a gecko: bigger teeth, rougher skin, an angrier face, not as friendly as a gecko normally looks. So I took some references from films such as Peter Jackson's King Kong.
My first goal was for this project to look as realistic as possible. I knew that the only way to achieve such a realistic and leafy environment was through heavy optimisation, and this was my biggest difficulty.
The landscape was going to consume a lot of resources because of its texture files. To address that I decided to use the RH Normal workflow (Roughness, Height Map, Normal X and Normal Y). RH Normal is a workflow Blizzard uses in their games which saves a lot of memory and one texture file per set.
It packs everything into just 2 texture files and stores only 2 channels of the normal map; a few material expressions then reconstruct the full normal. Base Color and Ambient Occlusion go into one file, and Normal Map, Height Map and Roughness go into the other.
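The reconstruction step can be sketched outside the material editor. This is a minimal Python illustration (the function name is mine, not part of the workflow) of the math the material expressions perform: the blue channel of a unit normal is derived from the two packed channels, since the vector's length must be 1.

```python
import math

def reconstruct_normal(nx, ny):
    """Rebuild a unit normal from the two packed channels (0..1 range).

    This mirrors what the material expressions recompute: the missing
    Z component follows from the constraint |N| = 1.
    """
    # Unpack from [0, 1] texture range to [-1, 1]
    x = nx * 2.0 - 1.0
    y = ny * 2.0 - 1.0
    # Derive Z; clamp guards against rounding pushing the sum past 1
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return (x, y, z)
```

A flat-normal texel (0.5, 0.5) unpacks to the straight-up normal (0, 0, 1), which is a quick sanity check for the packing.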
The next step was the landscape shader. To get all the layers to blend well I used a node called Layer Blend; this is where I plugged in my texture sets with a height blend.
There was a problem with tiling: the repetition of the texture pattern was easily visible. To avoid this I created tile variations within the same texture set. I built a function that takes all the textures from a set, performs linear interpolations, and blends the two texture variations with a macro variation. Once every channel had its tile variation, I connected each one to the distance variation mask.
To make the distance variation mask I used the PixelDepth node, which reads the scene depth. With this mask I can decide at what distance one texture blends into another, which is another way to hide the tiling.
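The distance mask boils down to remapping pixel depth into a 0-1 blend factor. Here is a minimal Python sketch of that idea (the function and parameter names are mine; the start/end distances are placeholder values, and in the material this is built from PixelDepth plus a couple of scalar parameters):

```python
def smoothstep(edge0, edge1, x):
    # Standard smoothstep: 0 below edge0, 1 above edge1, smooth in between
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def distance_variation(near_value, far_value, pixel_depth, start, end):
    # Mask goes 0 -> 1 as the pixel gets farther from the camera,
    # fading the tiled texture into its large-scale variation
    mask = smoothstep(start, end, pixel_depth)
    return near_value + (far_value - near_value) * mask
```

Close to the camera the mask stays at 0 and you see the detailed tiled texture; far away it reaches 1 and the macro variation takes over, so the repeating pattern never dominates the frame.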
After that I made a heightmap correction mask using only the first texture set's heightmap. This set interacts with all the others: the blending of the remaining textures depends on the heightmap of the first texture set.
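Height-driven blending of this kind can be sketched as a biased lerp. The snippet below is a simplified Python take on what a HeightLerp-style blend does (names and the default contrast value are my own assumptions, not the exact Unreal implementation): the heightmap shifts the blend alpha so the incoming layer appears first where the height value is high, and a contrast factor sharpens the transition.

```python
def height_lerp(a, b, height, alpha, contrast=4.0):
    # Bias the blend alpha by the heightmap (0..1) so layer b shows up
    # first in high areas, then sharpen the transition with contrast
    t = (alpha - (1.0 - height)) * contrast + 0.5
    t = min(max(t, 0.0), 1.0)
    return a + (b - a) * t
```

With a mid-level heightmap value, alpha 0 keeps the base layer and alpha 1 fully reveals the second layer; in between, high spots flip to the new layer before low spots do, which is what makes the landscape blends read as natural.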
To get the displacement I used the VertexNormalWS node, multiplying the world-space normal by the heightmap of each texture set.
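In effect this is a world-position offset: each vertex is pushed along its own world-space normal by an amount driven by the heightmap. A minimal Python sketch (function name, centering convention and scale parameter are my assumptions for illustration):

```python
def displace_vertex(position, normal_ws, height, scale):
    # Offset the vertex along its world-space normal; the heightmap
    # sample (0..1, centered on 0.5) drives how far it moves
    h = (height - 0.5) * scale
    return tuple(p + n * h for p, n in zip(position, normal_ws))
```

A heightmap value of 0.5 leaves the vertex untouched, while values above or below push it outward or inward along the normal.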
Finally I connected each channel to its own layer blend input. Then each layer blend went to the correct input in the master material.
Kitbash and vegetation
Once the landscape was ready I built the blockout of the architecture in Unreal Engine to establish the measurements.
After that I started making the modules in 3ds Max. I made 2 types of modules: some used tileable textures and some used unique (ad hoc) ones. I packed the ad hoc ones into the same UV set so every structure could share the same texture, which significantly reduces draw calls.
Since the assets with ad hoc textures were going to have their own material, I decided to create high poly versions to bake in Marmoset, sculpting them in ZBrush. Before bringing the ad hoc assets into Marmoset I packed them into custom texture sets to save some memory.
The vegetation gave me some trouble! At first I anticipated problems with the number of draw calls on screen, as there were going to be a lot of trees, bushes and vines. I reduced those draw calls by making texture atlases and grouping the vegetation, as I had done with the modules. Something else I did to optimise the project was setting the architecture to Static and the vegetation to Movable.
I made the trees in TreeIt, but I didn't use the textures the program provides. Because of that, I had to rearrange the UVs to fit the texture I was going to use.
I used 3ds Max to model the fern modules. After modeling the ferns and vines I grouped them to optimise and reduce the draw calls on screen.
There was a specific tree trunk that twisted into winding forms. To build it I used a Spline in Unreal Engine.
The art of photogrammetry
One of the disciplines I wanted to try in this project was photogrammetry. I took some objects I thought would fit in the project and undertook the process of photogrammetry.
For assets related to the ground, I went outside on a cloudy day and took photos under indirect lighting. I used RealityCapture to reconstruct the model from the images. Once the model was calculated I brought it into ZBrush to clean it up, then decimated it and created its UVs.
Although RealityCapture generates a color texture from the photos, I wanted to give the model its own texture so it would fit the environment.
Texturing and creating the materials
Creating the textures and materials was fascinating. I had never worked the way I did on this project: I had always sculpted and painted my textures, but this time I decided to try a really interesting workflow.
I used Quixel Suite and Quixel Mixer. I took some textures from Quixel Bridge and then, using masks and blending options, turned them into new, different textures. With this workflow I was able to get textures that fit my environment amazingly well.
The assets with ad hoc textures were a bit harder. I made some custom masks in Substance Painter for these assets and brought them into Quixel Mixer. Those masks allowed me to paint each object as if I had used generators.
For the vegetation I packed the fern atlases into one file and generated all the channels for each atlas. There was one texture I made on my own: the translucency map. I put a black layer over the base color and added some transparency.
The moss growing on the assets had to sit on their upper surfaces. To get that result I created a dot product mask in my master material. It used a VertexNormalWS node to get the asset's normal and a Constant3Vector for the direction the moss should face; in my case I set Z to 1 so the moss grows from above. Then I added editable parameters to control the amount and contrast of the moss.
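The mask described above can be sketched in a few lines of Python (function name, default amount and contrast values are my own placeholders for the editable parameters): dot the world-space normal against the chosen up vector, then shift and sharpen the result.

```python
def moss_mask(normal_ws, up=(0.0, 0.0, 1.0), amount=0.5, contrast=2.0):
    # Dot product: 1 for upward-facing surfaces, 0 for vertical walls,
    # negative for downward-facing ones (mirrors VertexNormalWS dot
    # Constant3Vector in the material)
    d = sum(n * u for n, u in zip(normal_ws, up))
    # Shift by the amount parameter and sharpen with contrast,
    # clamped to a usable 0..1 mask
    return min(max((d - (1.0 - amount)) * contrast + 0.5, 0.0), 1.0)
```

Upward-facing polygons get a mask of 1 (full moss), side faces fall to 0, and the amount/contrast parameters let you push the moss line up or down the asset.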
Once I had the mask I simply blended the original texture with the moss using HeightLerps. Another problem appeared where assets met the terrain: they created a hard seam because of the poor blending. To fix that I used mesh distance fields to build a mask. Multiplying that mask with a node called DitherTemporalAA makes the assets fade out as they approach another surface.
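The distance-field fade works because the dither turns a soft 0-1 fade into a per-pixel keep/discard decision that temporal AA then averages into apparent translucency. A simplified Python sketch of that decision (function and parameter names are my own, and the dither value here stands in for the screen-space pattern DitherTemporalAA provides):

```python
def seam_fade_opacity(distance_to_surface, fade_distance, dither):
    # Opacity mask: fades the mesh out as it approaches another
    # surface (distance-field distance -> 0). The dither value in
    # [0, 1) stochastically keeps/discards pixels, which temporal AA
    # resolves into a smooth translucent-looking fade on an opaque material.
    fade = min(max(distance_to_surface / fade_distance, 0.0), 1.0)
    return 1.0 if dither < fade else 0.0
```

Far from any surface the fade factor is 1 and every pixel is kept; right at the intersection it drops to 0 and every pixel is discarded, hiding the hard seam.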
Lighting and post-processing
Lighting was one of the most important parts of this project. I had to be careful with the lighting I would use.
I decided to set the lighting between 17:00 and 18:00. The temperature range of the directional light is between 3500 and 3700 K. I was looking for lighting with strong contrast between light and shadow, so I kept both the intensity and the indirect intensity low.
I wanted the atmosphere to feel dense, which is why I used light shafts and Exponential Height Fog. The latter has a Volumetric Fog option that is really important to enable.
The movable vegetation cast some odd shadows; that is where cascaded shadow maps came into play.
For the post-process volume I slightly adjusted some color grading values. I wanted the environment to look a bit more cinematic, so I added contrast to the shadows and midtones and raised the saturation and gain in the global settings. This makes the vegetation look greener and more alive and the shadows darker, giving the viewer the feeling of an abandoned but living rainforest.
Learning the anatomy of the creature was a really fun and interesting challenge for me. Not only did I have to learn the anatomy of a gecko, the real challenge was to distort the friendly image of the animal and create a bigger, more dangerous one.
When I started modeling the first blockout in ZBrush I decided to make the body fatter than a regular gecko's, and I sculpted the paws a little thicker than the original. But the biggest difference was the teeth: I made them much bigger than a gecko's, thinking it would make the creature look more dangerous.
Another difference was the tongue: the creature's tongue had to be dirtier and rougher than a little gecko's.
When I was about 70% done with the high poly model I started the retopology. I made the low poly and set up the UVs in 3ds Max, with two material sets: one just for the eye and the other for the body. Once the low poly version was ready, I sculpted the rest of the high poly using the low poly as its base.
The remaining 30% of the high poly was the microdetail of the skin, which I added after using the Project tool in ZBrush. Doing it that way made the bakes faster thanks to the clean topology.
Texturing the gecko was a challenge because of the colours I planned to use. I thought that if the rainforest was going to be full of green tones, the gecko should have some too, as an adaptation to its surroundings. In the end I decided that was not a good idea and opted to paint the gecko with the complementary colours, red tones, which I thought would make it look more aggressive. Time to open Substance Painter!
Whilst I was texturing, I also did the skinning in 3ds Max using a dragon skeleton from the Unreal Marketplace. Since I had made the blockout following this skeleton, the skinning gave me no problems.
Once I had finished the gecko I made a render scene in Marmoset. I used a few tricks to make the gecko look better: I made a scatter map in Substance Painter to get subsurface scattering, and in Photoshop I created a mask to make the eye look like a lens.
I spent about two months on this project. While working on it, I read and watched a lot of Unreal documentation to learn new workflows and fix the problems I ran into.
Although I have improved my 3D art skills over these years, I still have plenty to learn from many professionals, and a lot of documentation left to read to keep improving.
I must thank Javier F. Flores, Daniel Sánchez and Alberto Martínez for their mentorship and support. Thank you to my family for always supporting me. I would also like to mention Star Wars Battlefront 2 by EA Dice for the references I took. Finally, I must give a big thanks to The Rookies for giving me the opportunity to write a breakdown on this project.