A game of Tricks – Normal mapped sprites

Honestly, I’m a bit ashamed at how little time we took to communicate this last year. I’ve always wanted to share my experiences, just to repay everything I’ve learned from others generous enough to publish sources, tutorials and case studies in the game industry. When we started Alkemi 3 years ago, I tried to write tutorials regularly but I was soon overwhelmed by the work required to finish our game and the contract jobs necessary to finance its production. 2012 was a stressful year but hopefully all of that is behind us now and I really want to get back to playing with small ideas and sharing a few tricks here and there.

Designing small visual or development techniques and tricks is my favorite part of the game-making process. It is in fact far more rewarding for me than game design or pure visual creation. I’ll share here what I’ve learned over the years, along with more recent stuff.

My first topic will be something that is massively used in Transcripted: normal-mapped sprites. In other words, how to make your 2D game look like it’s real-time 3D. Well… to some extent. A lot of recent 2D games are made with 3D engines. For tools like Unity, a 2D game is just a simple scene with an orthographic camera filming a lot of planes with pretty textures mapped on each of them.

Let’s face it: planes are not really interesting when it comes to how they react to light. There’s not much you can do with dynamic lighting and dumb planes or sprites. Sure, you can use light attenuation ranges to create halos of light in darkness but you won’t get a lot further than that…


Halos of light, that’s about all you can do with flat planes and lights in 2D games…

Normal mapping has been around in games for more than 10 years now. Basically it’s just a texture (a normal map) telling a triangle how it is not so flat after all. The color information of the normal map translates into space directions, either absolute (aka world space), or relative to the direction faced by the triangle (aka tangent space). You need 3 values to define a direction in 3D space and guess what? There are also 3 values in a color! The RGB values of each pixel define the X, Y, Z components of a direction in space. Most normal maps appear as blueish textures, which can be explained by the fact that all pixels “looking” roughly in the same direction as the base triangle use a color near R = 0.5 // G = 0.5 // B = 1. This exact color has in fact absolutely no effect when it comes to changing the pixel’s orientation compared to the original triangle’s surface.
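The color-to-direction mapping above can be sketched in a few lines. This is a minimal illustration (the function names are mine, not from any real engine):

```python
# Sketch: how a normal map pixel's RGB color encodes a direction.
# Channels in [0, 1] map linearly to direction components in [-1, 1].

def decode_normal(r, g, b):
    """Map color channels in [0, 1] to direction components in [-1, 1]."""
    return (r * 2.0 - 1.0, g * 2.0 - 1.0, b * 2.0 - 1.0)

def encode_normal(x, y, z):
    """Inverse mapping: direction components back to color channels."""
    return ((x + 1.0) / 2.0, (y + 1.0) / 2.0, (z + 1.0) / 2.0)

# The "neutral" normal map color (0.5, 0.5, 1.0 — i.e. 127/127/255 in
# 8-bit) decodes to (0, 0, 1): a pixel pointing straight along the
# triangle's own normal, leaving its orientation unchanged.
neutral = decode_normal(0.5, 0.5, 1.0)
print(neutral)  # (0.0, 0.0, 1.0)
```

This is why a flat 127/127/255 normal map has no visible effect: every pixel just repeats the triangle’s original orientation.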

Ok, so how is it useful for a 2D game? In 3D, normal mapping is used to give the impression an object has many more polygons than it really has. You can’t fake the silhouettes but the lighting will behave as if the object was really complex.

In 2D games most objects are made of 2 polygons forming a quad (provided that you’re using a 3D engine of course). With a normal map on a quad, you can fake the lighting of a volume made of as many polygons as you want. You’re still limited to a single point of view like with any 2D asset, but you can make it react to light as if it was really 3D.


Same assets, different light positions… nice !

If it looks like 3D, why bother? Let’s do real 3D! Well, with actual 3D objects you still have to display a heavy load of polygons if you want decent silhouettes. With normal mapped sprites your silhouettes are defined by an alpha channel, not by real-time polygons. Your only limitations are the same as for any other 2D game: resolution and memory usage. In Transcripted, organic backgrounds with the same visual quality would have required millions of polygons.

Ok, so with normal maps you can have dynamic lighting, but you can also have specular effects on sprites. For each step of your animations, you can even define a gloss map to limit the area where the specular effect will appear, like for any 3D model. In Transcripted, we used a trick to avoid adding a gloss map: most base textures of assets are already rendered with some kind of static ambient lighting (in fact it’s ambient occlusion mixed with rim light). In the shader, we use the luminosity of the diffuse texture as a gloss map to define where the specular effect will appear. In dark areas, the specular effect is nullified, preventing to some extent the impression that the whole thing is wrapped in a plastic bag.
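The luminosity-as-gloss trick boils down to a one-line scale on the specular term. Here is a rough Python sketch of the idea (not the actual Transcripted shader — a Blinn-Phong-style term used for illustration):

```python
# Sketch: scale a specular highlight by the diffuse texel's luminosity,
# so dark areas of the sprite suppress the highlight (the "faked gloss
# map" trick described above).

def luminance(r, g, b):
    # Standard Rec. 709 luma weights
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def dot3(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def specular(normal, half_vector, diffuse_texel, shininess=16.0):
    """Blinn-Phong-style specular term scaled by diffuse luminosity."""
    n_dot_h = max(0.0, dot3(normal, half_vector))
    gloss = luminance(*diffuse_texel)  # faked gloss map
    return (n_dot_h ** shininess) * gloss

# A bright texel keeps its highlight; a near-black one suppresses it,
# avoiding the "wrapped in a plastic bag" look in shadowed areas.
n = (0.0, 0.0, 1.0)
h = (0.0, 0.0, 1.0)
bright = specular(n, h, (0.9, 0.9, 0.9))
dark = specular(n, h, (0.05, 0.05, 0.05))
```

In a real shader this would be the same multiply applied per fragment, with the diffuse sample already fetched for the base color.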


Specular without (left) and with (right) faked gloss map based on the diffuse luminosity

You can also apply rim light effects to your sprites. Rim light is a generic term for applying light colors to the borders of 3D objects, or more specifically to the parts of the object ‘not looking at the camera’. It can be used for different purposes in 3D:
– to soften a hard directional lighting and create materials looking more like skin or velvet (this is like faking a very cheap subsurface scattering),
– to create special effects like force fields or auras if it’s added on top of the base texture,
– to create an X-ray or ghost version of your assets, like most enemies of Transcripted when you’re invulnerable. We used this ghost-like appearance for hostiles to make it clearer that you can pass through them with no harm when you’re holding a cube.
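The rim term itself is simple: it peaks where the normal is perpendicular to the view direction, which is exactly why the ghost effect needs only the normal map. A minimal sketch (the `power` exponent is an illustrative tightness control, not a value from the game):

```python
# Sketch of a standard rim-light factor: light the parts of the surface
# "not looking at the camera".

def dot3(a, b):
    return sum(x * y for x, y in zip(a, b))

def rim_factor(normal, view_dir, power=2.0):
    """1 at grazing angles (silhouette), 0 where the normal faces the camera."""
    facing = max(0.0, dot3(normal, view_dir))
    return (1.0 - facing) ** power

view = (0.0, 0.0, 1.0)  # camera looking straight at the quad
print(rim_factor((0.0, 0.0, 1.0), view))  # 0.0 — facing the camera, no rim
print(rim_factor((1.0, 0.0, 0.0), view))  # 1.0 — silhouette edge, full rim
```

Multiplying this factor by a rim color and nothing else gives the X-ray/ghost look; adding it on top of the diffuse gives force fields and auras.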


While invulnerable, hostiles appear as ghosts in Transcripted: a rim light effect requiring only the normal map and no other texture.

This ghost version of hostiles needs only the normal map version of the sprites to render; in this case you don’t even need the diffuse base texture. Sure, we could have created an entirely new set of sprites for the invulnerable mode, making the rim light effect ‘by hand’, but it would have been a terrible waste of resources: normal maps are already used for dynamic lighting, so it was really just a change of shader to reuse the same assets for another visual version of hostiles.

We’ve seen a few things you can pull off in 2D games with normal mapped sprites. Now let’s see how you actually create these. When you create 2D animated games, you possibly engage in a long process of creation where you have to produce a high number of animation steps or keyframes for a large number of assets. If you want dynamic lighting for those, you’ll have to create a normal map fitting its standard equivalent exactly for every single step of animation. The first and easiest way to do that is to create all your 2D assets from 3D renderings. Even if it may seem odd to go through the pain of creating and animating 3D assets just to have them pre-rendered and used as textures in your game, you can achieve a far higher quality even for very, very complex objects with this technique. Transcripted is mainly composed of very simple 3D objects which have been enhanced with A LOT of animated procedural displacement. So much displacement that it would require far too many polygons to approximate the silhouettes in real-time 3D.

For each step of animation all objects were rendered in 2 versions: first with a static ambient lighting to create the base or diffuse texture, then with a special material to render normal maps.


These normal maps must be rendered in the reference space defined by the plane of the camera. I’m a 3DSMax / Mental Ray user and originally I created a composite material with 3 falloff (toward / away) maps bound to the 3 RGB channels to render the X, Y, Z components of the normal direction in camera space (it sounds a lot more complicated than it really is)… It kinda worked but it turned out to be completely useless since Mental Ray already includes something that does exactly that! To create the correct material in 3DSMax, you just have to:
– turn on Mental Ray as the renderer
– create a 100% self-illuminated material
– choose 127/127/255 as the background color
– assign the ambient / reflective occlusion map to the diffuse channel
– choose 3 as the type parameter in the AO map
– choose a decent number of samples (128+) to create a high quality render of your normals


Ambient occlusion parameters to render normal maps for 2D sprites

You can have all the effects you want applied in the bump slot, the displacement slot, or any warps and deformers; the final result will be correctly rendered in the normal output. But if you want really top quality normal maps you’re not quite there yet. There are two more things to take care of: getting rid of antialiasing and adding padding to your normal maps.

What’s the problem with antialiasing? If you didn’t touch Mental Ray’s basic settings, you most likely just made a nice rendering of your object with good old antialiasing against the background of your image. The intermediary colored pixels generated by the antialiasing have nothing to do with normals and can generate lighting artifacts in real time.


left : with antialiasing // right : no antialiasing

Mental Ray doesn’t care about the ‘don’t antialias against background’ option of 3DSMax; you have to prevent antialiasing directly in the sampling options of the renderer. Empirically I got my best results with the Lanczos filter type and with width and height parameters equal to 1.


Sampling parameter to eliminate AA

Beware! You don’t want to render your base textures with these parameters! You definitely WANT antialiasing for your base / diffuse textures for clean, smooth silhouettes. And now the padding. What is this padding thing, and why do you need it? Adding padding to a texture is the simple act of repeating the border pixels of a texture outward, progressively past its edges.


left : normals without padding // right : normals with padding

In this case, it’s useful for two reasons. First, since you have eliminated the antialiased pixels of your normal map, if you try to make it fit the diffuse version it will not cover the outermost semi-transparent pixels. It means that the ‘uncovered’ pixels of your diffuse map will use the background color of the normal map (127/127/255) as their normal value, which is incorrect: they should most likely have a normal value equal or close to that of their direct inward neighbors. With padding, you ‘transfer’ the correct normal value to the outermost diffuse pixels.

You also need padding in your normal maps to compensate for mip mapping. If you display your sprites at a smaller size than their original resolution, the 3D engine will most likely use mip maps for your diffuse and normal textures, which are sets of lower resolution textures generated from the ones you’ve provided. Without going into too many details about mip maps, let’s just say that at lower sizes, the background color has a tendency to bleed into your objects, creating once more potential lighting artifacts. If you want to prevent those you have to create a fair amount of padding: 4 or even 8 pixels is a minimum, and you could have to go up to 16 for high resolution objects which can be seen from both close up and far away.

Now, how do you create padding? It’s quite simple: you need to export a PNG or any RGBA image from 3DSMax instead of just an RGB file with a 127/127/255 background. Then in software like Photoshop:
– you duplicate the layer of your transparent normal map eight times
– you move each of the 8 lower layers by one pixel in one of the 4 cardinal directions or the 4 diagonals
– you flatten the nine layers and repeat the operation from the start to add padding one pixel at a time
– just before saving the result as an RGB file (since you don’t want any transparency), you add an opaque background with the 127/127/255 color. Et voila!

It’s quite easy to create a padding script with Photoshop actions, but you can also find plenty of those around.
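The same layer-shifting procedure amounts to a dilation pass: each transparent pixel copies any opaque 8-connected neighbor, repeated once per pixel of padding. A minimal sketch, using plain dictionaries instead of a real image library:

```python
# Sketch of the padding pass described above: repeatedly fill each
# transparent pixel from an opaque 8-connected neighbor, one pixel
# of padding per iteration.

def pad_once(pixels, alpha, w, h):
    """One dilation step: fill transparent pixels from any opaque neighbor."""
    new_pixels, new_alpha = dict(pixels), dict(alpha)
    for y in range(h):
        for x in range(w):
            if alpha.get((x, y), 0):
                continue  # already opaque, nothing to do
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    if alpha.get((x + dx, y + dy), 0):
                        new_pixels[(x, y)] = pixels[(x + dx, y + dy)]
                        new_alpha[(x, y)] = 1
                        break
                else:
                    continue
                break
    return new_pixels, new_alpha

def add_padding(pixels, alpha, w, h, amount=4):
    for _ in range(amount):
        pixels, alpha = pad_once(pixels, alpha, w, h)
    return pixels, alpha

# Tiny example: one opaque pixel in a 5x5 image, 1 pixel of padding.
pixels = {(2, 2): (0.5, 0.5, 1.0)}
alpha = {(2, 2): 1}
pixels, alpha = add_padding(pixels, alpha, 5, 5, amount=1)
# Its 8 neighbors now carry the same normal color.
```

With a real image library you would do the same thing with shifted copies of the whole layer, which is exactly what the Photoshop recipe above automates.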

I’ve said going pre-rendered 3D is the easiest way to create normal mapped sprites, but there are plenty of things to try for the creative mind. Here is a small experiment I’ve been toying with but which hasn’t yet made it into any serious production: the creation of a fully 2D hand-drawn character with ‘faked’ normal maps used to create dynamic backlighting. Backlighting is a commonly used graphic trick to enhance the volume of 2D shapes, add drama, or whatever… It basically adds a light color on the borders of 2D shapes in certain directions to simulate a light coming from behind the object.

Let’s start from this 2D character drawn in Flash :


I can easily create a height map to serve my purpose in Photoshop. I only need to create a black picture with a white silhouette of my character in a layer. Then, I just have to add a simple inner shadow filter to the layer, oriented toward the bottom, to obtain this:


Height map generated with a simple inner shadow filter and its corresponding normal map

This texture represents the distance between the camera and the object, thereby describing very, very roughly the volume of the character. This height map can be turned into a normal map either directly in Unity or in Photoshop with NVidia’s texture tools (doing it in Photoshop instead of Unity allows a post-process step to add padding).
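The height-to-normal conversion those tools perform is essentially a gradient computation: take finite differences of the height field and build a tangent-space normal from them. A rough sketch of the idea (the `strength` parameter is my own illustrative knob, equivalent to the scale/bumpiness setting in such tools):

```python
# Sketch: convert a height map sample into a tangent-space normal color
# using central differences, the way height-to-normal tools roughly work.

import math

def height_to_normal(height, x, y, w, h, strength=1.0):
    """height is a function (x, y) -> [0, 1]; returns an encoded RGB color."""
    # Central differences, clamped at the image borders
    left = height(max(x - 1, 0), y)
    right = height(min(x + 1, w - 1), y)
    up = height(x, max(y - 1, 0))
    down = height(x, min(y + 1, h - 1))
    nx = (left - right) * strength
    ny = (up - down) * strength
    nz = 1.0
    # Normalize, then encode the [-1, 1] direction into a [0, 1] color
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    nx, ny, nz = nx / length, ny / length, nz / length
    return ((nx + 1) / 2, (ny + 1) / 2, (nz + 1) / 2)

# A flat height map yields the neutral normal color (0.5, 0.5, 1.0)
flat = height_to_normal(lambda x, y: 0.5, 4, 4, 8, 8)
```

Run over every pixel of the inner-shadow height map, this produces exactly the kind of blueish normal map shown above.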

Once in Unity, the diffuse texture and its normal counterpart allow me to create different lighting variations of the character which can be animated dynamically. Nice, eh?


This is just a plane: one or two lights placed at the same depth as the quad and some tweaking of the ambient light.

There must be a lot more things to do with normal maps in 2D games. Even if our next game is going to be a lot more 3D-oriented than Transcripted, I’ll probably get back to this at some point!

10 thoughts on “A game of Tricks – Normal mapped sprites”

  1. I just thought about this today after seeing a fun Speed Photoshop tribute to Metroid…

    I am a Flash dev who is interested in procedural special effects. It led me to question why normal mapping has never been used for 2D assets. It definitely piqued my interest. I might have to play around with the concept some more to see what can be done.

    Thanks for the post!

  2. That last effect is pretty interesting but what I’m wondering is if there is a way to generate an animated normal map for special effects like we do with water and such. Say a simple shockwave or hit after-effect, maybe a black hole. You know, cool stuff. xP

    • You can generate animated normal maps for distortion effects quite easily. You just have to either animate a black & white height map in 2D or model a 3D rippled plane. Then you convert the height values into normal values with NVidia’s texture tools. If you want distortion though, your shader must not use the normal information to provide lighting but to apply a refraction effect to what is behind the normal mapped sprite. In Unity, it will involve a shader with a grab pass (Pro version only).

    • Mixed feelings about this: drawing the lighting profiles for cardinal light directions is actually the same thing as creating the normal map manually. There’s no magic trick or anything, you do all the work yourself. If you have a basic understanding of normal maps and a good sense of volume you can do this with Photoshop using color channels and a few level filters. Creating 2-5 light profiles per animation frame is a huuuuuuuuge amount of work if you want decently fluid animations. I’d find this interesting if it somehow interpreted key frames and was able to interpolate the in-betweens. I may be wrong, but I think the author’s estimate that it takes just twice as much time to create all the assets rather than a simple color image is at best very optimistic.

    • For Transcripted, nope, everything was pure flatness. Pre-rendered 3D in fact applied on flat quads. Drifting Lands on the other hand is nearly all 3D except most FXs and some background stuff like clouds.

  3. Very interesting idea for sprites. Big thanks!
    Can you make simple example on Unity for animation with different normal maps ?

    • Sorry, it just took me 3 months to answer… Well, animating normal maps frame by frame is just the same as it is for the diffuse. I’m not familiar with the current 2D tools in Unity because we’ve been using our own for several years now. If Unity doesn’t offer an easy way to animate normal maps with a spritesheet synchronized with a diffuse spritesheet… well no, I have no simple example to show you how to do it :(
