A Game of Tricks IV – Stylized normal mapping

Ok. First things first: I'm sorry this isn't the second part of the particles tutorial. I vastly underestimated how hard it would be to extract a clear explanation of our FX system, which is very cool in a twisted and intricate way. Making the rest of the explanation worthwhile will require isolating some code from a more complex codebase, and I'll probably have to split it into two more parts to make the effort manageable.

But today, I'll write about normal maps… and rocks. I have already explained how we used normal mapped sprites for Transcripted; if you don't know what normal maps are, you can take a look at that previous post to learn the basics. Drifting Lands is a much more 3D game than Transcripted was. All ships and a lot of background elements are "real 3D", but with pretty stylized modelling and fancy shaders that hopefully make the whole thing look more like an illustration. The overall look of the game is influenced by a lot of games (Journey, Diablo 3, Darksiders…) and animated movies (mainly Disney and Ghibli productions). The art direction focuses more on shapes, silhouettes and harmony of colors than on detailed hi-res textures or realistic physically based materials.

01_DriftingLands_Capture

Drifting Lands takes place over, around and inside a broken planet. There will be plenty of flying bits of rock in every environment of the game. These background rocks are a large part of the game's visual identity, so we spent quite some time finding the right balance between polycount and shader complexity to create nice-looking blocks. Right now, we will focus on how these rocks catch the light.

Look carefully at the floating islands of rock above. See how the external silhouette is really simple? Nothing random here: it's been handcrafted with care by alternating long and short edges at various angles, but it's still very low poly. Now pay closer attention to the "internal edges" where light and shadow meet. You'll still see the same general design but with a very important difference: there are lots of dents and notches making the whole thing much richer visually. Usually, this kind of detail requires quite complex geometry. If you tried to model the exact same stylized look in a traditional way, it would take you a lot of time, and you would most likely fail to achieve the same shading behavior. See how we can change the light's position and still get the same nice, dented but sharp-looking edges between light and shadow?

02_VariousLights

You could think that this effect is achieved by some kind of cel shading with a non-linear lighting ramp. But you would be wrong. Our rocks are indeed lit with a color ramp texture, but it's mostly a linear gradient. No, the trick is to tweak the normals of some pixels with normal maps and create something which is in fact physically impossible.
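If you wonder what "lit with a color ramp texture" means in a Unity surface shader, here is a generic sketch in the style of Unity's lighting examples. It is not the Drifting Lands shader (that one will come in a later post), and _Ramp is just a placeholder name:

// Generic ramp lighting sketch (not the actual Drifting Lands shader).
// The N.L term is remapped to [0,1] and used to look up a color in a 1D ramp
// texture; with a mostly linear ramp this stays close to standard Lambert shading.
sampler2D _Ramp;

half4 LightingRamp (SurfaceOutput s, half3 lightDir, half atten)
{
    half NdotL = dot(s.Normal, lightDir);
    half diff = NdotL * 0.5 + 0.5;                      // remap [-1,1] to [0,1]
    half3 ramp = tex2D(_Ramp, float2(diff, 0.5)).rgb;   // sample the ramp texture

    half4 c;
    c.rgb = s.Albedo * _LightColor0.rgb * ramp * atten;
    c.a = s.Alpha;
    return c;
}

You would enable a lighting model like this with #pragma surface surf Ramp in the shader.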

Let's consider only two adjacent faces of a pillar of rock. Each face of the pillar is composed of several triangles smoothed together: their normals are continuous and all point in roughly the same direction. What we want is to transform the normals of some pixels along the edge of the left face so that they align with the normals of the right face, and likewise modify some pixels of the right face to align with the normals of the left. A small worked example follows the figure below.

03_principle
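To make the principle more concrete, here is a tiny worked example (illustration only, with made-up numbers, not code from the game) of how a simple Lambert term reacts when a pixel borrows the adjacent face's normal:

// Illustration only: a pixel on the left face of a pillar borrows the right face's normal.
float2 DentIllustration()
{
    float3 nLeft    = normalize(float3(-0.7, 0.0, 0.7)); // left face, angled toward the light
    float3 nRight   = normalize(float3( 0.7, 0.0, 0.7)); // right face, angled away from it
    float3 lightDir = normalize(float3(-1.0, 0.0, 0.3));

    // Shaded with its own normal, the pixel is bright (N.L is about 0.9)...
    float litNormally = saturate(dot(nLeft, lightDir));
    // ...shaded with the adjacent face's normal, the same pixel goes dark (about 0.0):
    // a shadow "dent" bites into the lit face without adding a single triangle.
    float litAsDent = saturate(dot(nRight, lightDir));

    return float2(litNormally, litAsDent);
}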

If you want to translate that into "standard" geometry… well, you can't; you have to use normal maps or altered geometry. When we first went down this road, we went the "hard" way: hand-drawn normal maps. For each face of the stone pillars, we looked for the tangent space colors that would simulate the normals of the adjacent face. And here is what the result looks like…

04_handDrawnNM

It's not really hard to do, just tedious and inefficient. But the result was cool, so we started to think about a better way to do this. In 3DSMax, when you model something, normals are automatically calculated to be perpendicular to the surface. And when two triangles that are not strictly coplanar are smoothed together, the normal along their common edge is an average of the two triangles' normal directions.

06_smooth_crease

Left: polygons are not smoothed, normals are split / Right: polygons are smoothed, normals are shared by the two faces

Typically you don't manipulate normals directly, you only assign smoothing groups to triangles or polys. In Maya, you define creases between faces, which is the same as telling the triangles on either side of the edge that they belong to different smoothing groups. I don't know Maya well enough to say more, but in 3DSMax you can also edit normals manually and point them wherever you want, even if it has no "real-world" meaning. For example, if you completely invert the normals, the modified faces will be lit as if the light were coming from the other side: faces looking at the light will be dark and faces looking away will be bright. Ok, that one is not often useful… Now imagine several crossed planes used to create a fake volume of vegetation. Many light positions will betray the true nature of this model, making it look very cheap.

05_bushQuads

Left: "classic" normals / Right: edited normals

But if you take all the normals of these planes and point them in the same direction, let's say up… you've got a uniformly lit bush receiving an amount of light similar to the ground it sits on.

Let's get back to our rocks, shall we? Right now we have a low poly model, but we want to add all those dents along the various edges. First step: unwrap the UVs of the low poly model. The important point here is that faces whose edges will be modified must be separated on the UV layout. Because of texture filtering and mip mapping, you'll only be able to create 'clean' dents between faces belonging to different UV clusters. You also have to keep clear space for padding between your clusters; I'd recommend between 8 and 16 pixels depending on the normal map resolution and the size the objects will usually have on screen. You should end up with something like this:

07_UWTemplate

Now it's cutting time, but first you have to create a copy of your mesh. The original will stay low poly and the new one will be the 'high poly' version. You add geometry by cutting polygons along the edges you want to dent, but keep everything flat: you add vertices and edges but do not move anything! If you don't display edges in the viewport, the high poly version should for now look roughly the same as the low poly one. A few shading differences can appear due to the new tessellation, but we will deal with that later. You don't have to create really complex dent shapes right now; you'll be able to refine the normal map manually later because you'll have color references to work from.

08_low_high

Now it's time for 'Edit Normals', the Max modifier used to manipulate normals manually. Be aware that Edit Normals usually has to stay the top modifier in the stack: if you add other modifiers like Edit Mesh or Edit Poly on top of it, the smoothing groups are reapplied, normals are automatically recalculated and your manual editing is overwritten. For each poly to modify, the process is as follows:
– select the normals of the poly you want to edit (you can select them all at once through face selection)
– split the normals of the poly you’ve selected (this is the same as making this poly part of another smoothing group)
– unselect everything, because the newly created normals are also selected
– copy the direction of one of the normals of the adjacent face of the rock pillar
– reselect the poly you were editing; now only its normals are selected, not those of the surrounding faces
– paste the normal direction from the adjacent face
– repeat for all dents! Note that you can edit multiple polys at the same time if they face the same direction and must be converted to the same new direction.

You should now have something looking a lot like this:

09_normalsTweaked

Turn the light around in the viewport and see how it reacts along the edges. That's what we want! You could export this model as is and it would behave exactly the same in Unity (provided you've configured the FBX importer to 'import' normals rather than 'calculate' them). But we don't want to use that model because:
1 – it’s high poly
2 – the tessellation added for the dents probably created bad-looking smoothing irregularities

The next step is normal map baking. It's exactly what you do on a AAA game when you want to turn a 3-million-poly character into a 10,000-poly model: you've got a high poly model and a low poly model, and you want to calculate the normal map that will make the low poly look like the high poly. I won't cover the whole process in detail; there would be too much to say and you can find hundreds of tutorials out there about it. So let's keep it a simple step by step:
– place the low and high poly models at the same coordinates (they should match really well since they actually have the same 3D shape)
– select the low poly model
– go to the render to texture panel (shortcut ‘0’)
– choose your padding
– in the projection mapping section, enable projection mapping, pick the high poly object and go to the projection options
– select the RayTrace method, disable Use Cage, choose tangent normal space and set green orientation to up
– return to the render to texture panel, use the existing mapping channel, and in the output section add 'normal map'
– select file name and destination, texture resolution…
– … and finally click the render button!

If you see a window rendering something that does not look like a normal map, keep cool, it's ok (in the twisted minds of Autodesk engineers at least): the file written to your hard drive is a proper normal map. Open it in Photoshop and add a layer with the UV layout on top of it (you can easily render the UV layout from the Unwrap editor window: Tools > Render UVW Template). At this point you want to do two things:
– remove all the smoothing problems caused by the detail tessellation: every bluish color gradient that is not clearly a color spot corresponding to a notch or dent must be painted over with a uniform 128/128/255 (the color that decodes to a flat tangent space normal pointing straight out of the surface)
– if you want, add even more detail to the dents you roughly modeled in 3DSMax. You just have to reuse the colors calculated by 3DSMax, or 128/128/255 if you want to 'erase' some bits. Don't forget to create color padding around every color spot you add.

11_normalmap_edit

 

Left: raw normal maps from render to texture / Right: manually edited normal maps

Now you want to import the low poly model into Unity (important parameters: import normals, not calculated / split tangents on). If you add a material with normal mapping and import the normal map from Photoshop, it should work!

12_unity_final

… or maybe not completely. I was stuck for a long time on one last problem and only solved it recently. For some creases I was using the same technique, but in Unity the light behaved strangely with the normal map created in Max. This is due to the way normal maps are stored in Unity. To save resources, Unity stores normal maps in only 2 channels even though you need 3 values to define a normal. If you store 2 components of a 3D vector but you know its length, you can calculate the third one (because Pythagoras is your friend), and we do know the length of a normal: it's one. Every Unity built-in shader uses a function called UnpackNormal whose role is to calculate the missing Z (blue) component. But there's a tiny problem: there are in fact two possible Z values, one positive and one negative. By default Unity decides that the right one is the positive value, and honestly I can't blame that decision. In a realistic world, tangent space normals with a negative Z value are impossible… but our technique relies on creating impossible surfaces.
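To see why the positive value always wins, here is a rough sketch of what that unpacking does (simplified; the real UnpackNormal reads different channels depending on the texture compression format, but the principle is the same):

// Simplified sketch of Unity-style normal unpacking (not the exact built-in code).
half3 UnpackNormalSketch (half4 packednormal)
{
    half3 n;
    n.xy = packednormal.xy * 2.0 - 1.0;            // remap the two stored channels to [-1,1]
    n.z  = sqrt(saturate(1.0 - dot(n.xy, n.xy)));  // |n| = 1, so z = sqrt(1 - x^2 - y^2)
    return n;                                      // sqrt() never returns a negative value, so z >= 0
}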

I'll get back to the shader we use in a later post, but here is our solution: we import the normal map as a standard texture. Our shader is a surface shader, and in the surface function we replace the usual:

o.Normal = UnpackNormal(tex2D(_BumpMap, IN.uv_BumpMap));

by…

o.Normal = tex2D(_BumpMap, IN.uv_BumpMap).rgb * 2.0 - 1.0;

And yes, the editor does complain that we use a standard texture as a normal map but we know better :)
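For context, here is what a minimal surface shader built around that line could look like. This is a stripped-down sketch with placeholder names, not our actual rock shader (which also handles the ramp lighting and other effects):

Shader "Sketch/StylizedNormalDecode"
{
    Properties
    {
        _MainTex ("Albedo", 2D) = "white" {}
        _BumpMap ("Normal map (imported as a regular texture)", 2D) = "bump" {}
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }

        CGPROGRAM
        #pragma surface surf Lambert

        sampler2D _MainTex;
        sampler2D _BumpMap;

        struct Input
        {
            float2 uv_MainTex;
            float2 uv_BumpMap;
        };

        void surf (Input IN, inout SurfaceOutput o)
        {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
            // Decode all three channels ourselves so that negative Z values survive.
            o.Normal = tex2D(_BumpMap, IN.uv_BumpMap).rgb * 2.0 - 1.0;
            o.Alpha = 1.0;
        }
        ENDCG
    }
    FallBack "Diffuse"
}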

A few additional notes:
– the Unity normal map storage problem only arises when you try to shift a normal by more than 90° on one of the axes: past that point the Z component of the tangent space normal becomes negative, which the default unpacking can't represent. If you don't do that, you should be safe with classic Unity normal maps.
– this technique is not without problems: dents on the outer silhouette of the object don't always look great, so you should keep them small.

13_lightProblems

And that’s all for now! Do not hesitate to ask questions or propose improvements to this technique.

If you like what we do, please support our work by liking our Facebook page, following us on Twitter or sharing our YouTube channel. We'll self-publish Drifting Lands and we need all the love we can get out there ;)

8 thoughts on "A Game of Tricks IV – Stylized normal mapping"

  1. Very interesting – would you be able to achieve something similar by exporting world space normals to get the right colors, painting on those, and then converting them to tangent space?

    • Errr, yes of course, but since a lot of tools can give you tangent space normals directly, I'm not sure it would be terribly efficient to do it that way.

  2. Very impressive and stylish result, but I have a feeling the pipeline could be simplified considerably.
    Why didn't you use nDO to draw those normals in the first place? Modeling actual edge geometry for cracks and then baking seems like an inefficient and overly complicated approach to me – making a brush stroke in PS and then nDO'ing it would achieve the same much faster, or is that not the case?
    Then there’s always the baking-from-ZBrush approach as opposed to Edit Normals in 3dsmax, but I think you can achieve http://www.alkemi-games.com/wp-content/uploads/2014/07/04_handDrawnNM.jpg and even more detailed results in nDO in one click.

    As for the baking itself, I'm not sure why you use such a roundabout method when xnormal is available and can bake tangent space normal maps, curvature and all the other nice stuff for you. Again, with a proper baking setup you will get an absolutely clean tangent space bake without all those artifacts seen here http://www.alkemi-games.com/wp-content/uploads/2014/07/11_normalmap_edit.jpg
    By the way, I'm hoping you did not remove them "manually", as it seems it could easily be done via a mask baked from MatId.
    As for the normal storage issue, what texture encoding settings did you use on import into Unity?
    Please clarify if I’m mistaken in some of the assumptions.

    • I do not know nDO, but using ZBrush will not achieve the same result. We do not want to add relief or complexity to the surface; we want to add details to the polygon edges without adding any additional orientations. So basically, on each polygon we need normals pointing in the direction of the adjacent polygon's normal. That's not something you can do with ZBrush, and that explains our 'overly complicated approach'. We do not want more detailed results, we do not want more realistic normals. And yes, going for something stylized is often more complicated than going for realism :)

      I don't understand your point about baking from ZBrush… We're baking in 3DS. The only difference is that instead of doing a realistic high poly model in ZBrush, we're doing an unrealistic, physically impossible high poly model in 3DS by editing normals. That's actually not more work, it's a different kind of work, that's all.

      For the baking itself, we're using render to texture in 3DS, which is pretty much the most straightforward thing you can use when you only use 3DS. The artifacts we get in our render to texture are perfectly normal; we would not get them if we spent more time on the proper tessellation of our high poly models. But modeling them roughly and cleaning the artifacts in Photoshop in less than 5 minutes is actually a lot faster than you might imagine. And no, there's no other way to remove them except by doing the modelling more cleanly.

      We use classical normal maps and not Unity's default compressed storage method. By default Unity will ignore the Z axis and store only the X and Y components of the normals. Z can usually be recalculated because the length of the normal vector is one, but that equation has two solutions and Unity will always choose the positive one: the vector pointing roughly in the same direction as the triangle. Since our technique uses physically impossible configurations of normals, we can't use the Unity way of storing normals and the UnpackNormal method in the surface shaders. Sometimes our normals point in the opposite direction of the face holding them, and even if it's weird, it's what we're looking for.

      Hope you understand better what we're doing. I sincerely think there aren't many ways to do this, and exactly this, faster. Now I can understand if you argue the result is not worth the effort :D

  3. Have a look at http://quixel.se/dev/ndo and http://www.xnormal.net/ and see if they can somehow benefit your pipeline. With every effect there are computational complexity / authoring time (= budget) considerations, so it's surely hard to balance all that with artistic requirements.
    If you need to keep only the parts of the normal map that correspond to the newly added "chip" polygons, perhaps if you assigned a different material ID to those faces and baked it as a mask, the cleanup process could be automated even if you get messy bakes (but with xnormal you should be getting clean ones, I think).
    As for the "impossible normals" part, couldn't there be a situation where the effect does not work on mobile because someone "optimised" the renderer there to assume the positive solution and all that? As with your bushes example, I think the idea of messing with normals is rather cool, but I wonder if it will work across APIs/environments even if it works for DX on PC right now.

    • Before I take a look at nDo and xNormal, I will surely spend more time learning Substance Designer.

      Our messy bakes are normal: we project a high poly model whose tessellation modifies the smoothing of the low poly model. We could remove all smoothing to avoid the problem, but it would give slightly less 'correct' normals for the chipped areas. And I promise, cleaning these artifacts is a 5-minute task at most, done maybe once or twice a day.

      I see no reason why using uncompressed normals would fail on mobile. This is just how normal maps worked for a long time before some optimization techniques were added to Unity and other engines. Normal mapping is really something very simple and 'low tech'; it just means you have to write your shaders yourself. Right now it works perfectly on DX and OpenGL (Mac, Linux). As for mobile problems, I'm afraid I'll leave those to others; we're not interested right now because Drifting Lands will never be ported to mobile.
