iOS OpenGL Color Cycle effect with 8-bit palette

Aug 9, 2013

Greetings

OpenGL is really great for many things, but at times it can be frustrating. One thing that was super easy to do with old-school graphics hardware was to create an 8-bit palette texture and then color cycle the displayed pixels by updating values in the palette instead of the texture. This operation can be very efficient and can produce reasonable results in many cases. Here are a couple of links that describe how and why color cycling was used in video games, along with some nice examples.

The reason a color cycle is so amazingly fast is that each frame update need only transfer a table of 256 color values to the graphics card. But this approach is not so easy to implement using OpenGL ES 2.0 under iOS. It took quite some time to learn the tricky details of GLSL, the shading language a developer uses to write the code that runs on the graphics card as a fragment shader in OpenGL ES 2.0. Once that code was working, it was not too hard to apply this logic to an interesting little problem.
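The core of a classic color cycle can be sketched in a few lines (a hypothetical illustration, not code from this app): the pixel indexes never change; only a slice of the 256-entry palette rotates each frame, and that tiny table is all that needs to be re-uploaded.

```python
# Sketch of classic palette color cycling (hypothetical, not the app's
# actual code). The image stores 1-byte indexes; each frame only a slice
# of the 256-entry RGBA palette rotates and would be re-uploaded.

def cycle_palette(palette, start, end):
    """Rotate palette entries [start, end) by one position per frame."""
    cycled = list(palette)
    segment = cycled[start:end]
    cycled[start:end] = segment[-1:] + segment[:-1]
    return cycled

palette = [(i, i, i, 255) for i in range(256)]  # grayscale ramp
frame1 = cycle_palette(palette, 1, 255)  # leave entry 0 (black) fixed
```

Per frame, the transfer is just 256 RGBA entries, regardless of how large the indexed image is.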

The Problem:

I want to render a smiley face in a variety of different colors. But I do not want to take up a lot of OpenGL resources to do it. So, what I want to do is take a smiley face image and run it through a desaturate filter to produce a grayscale image that looks like this:

Gray Smiley Face

The smiley face above should then be colored in with different colors and different brightness levels. Here is how the textures should appear at full screen size on an iPhone in portrait mode with blue and green colors:

Green Smiley Face     Blue Smiley Face

The grayscale image is used to determine how much of a specific color to include at a specific pixel. The black and white values remain unchanged in the generated 8-bit table; notice, for example, that the full white color around the smiley face stays white. This little trick is implemented by simply not changing the first and last palette entries, since those two entries are used for black and white. The image is a little bit jaggy, but the goal here is to demonstrate palette based color fill, not a perfectly scaled texture with perfectly blended edges.
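A palette that implements this kind of color fill could be generated along these lines (a hypothetical sketch, assuming a simple linear scaling of the target color by gray level; `tint_palette` is an illustrative name, not a function from the app):

```python
# Sketch of generating a 256-entry palette that tints grayscale values
# with a target color. (Hypothetical helper, assuming a simple linear
# blend by gray level.)

def tint_palette(r, g, b):
    palette = []
    for i in range(256):
        # Scale the target color by the gray level i/255.
        palette.append((r * i // 255, g * i // 255, b * i // 255, 255))
    # Keep the first and last entries as pure black and pure white, so
    # black and white pixels in the source image are not tinted.
    palette[0] = (0, 0, 0, 255)
    palette[255] = (255, 255, 255, 255)
    return palette

green = tint_palette(0, 255, 0)
```

Swapping in a different target color is then just another 256-entry upload; the big index texture never changes.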

Now let's make things interesting. The key aspect of using a palette of 256 values is to show that minimal resources are used in the render cycle. Anyone who has ever coded a game in OpenGL knows that there is always a tradeoff between what a visual effect might add to the game vs. the impact on FPS performance on the device. Any effect that uses a lot of resources will slow down the GPU processing, and the overall game becomes slower. This example uses a smiley face texture at the largest size that can be allocated on the iOS device, to show that the impact on FPS performance is minimal.

The grayscale texture is 2048x2048 on iPhone class devices and 4096x4096 on iPad class devices. This texture is really, really large; it would be about 67 megs of data on an iPad if stored with 4 bytes per pixel. But because 1-byte index values are stored instead, this massive texture takes up only 17 megs of memory. That is still big, but the real benefit is that this 17 meg texture only needs to be transferred to OpenGL once on startup. During each invocation of the render loop, the app transfers only the small 8-bit table. This is a big advantage over attempting to transfer the big 17 meg texture on every draw cycle: my own testing showed that FPS performance of this trivial app would drop from 60 to about 30 on an iPad 2 if the 17 meg texture were transferred on every draw cycle. Okay, enough buildup! Here is the source code:

The GLSL Shader Code:

The tricky part of this implementation was the GLSL shader. A shader is a piece of code that is compiled and uploaded to the graphics card in order to compute the color of specific pixels. Basically, this code reads a normalized value from the large indexes texture that encodes an index in the range 0 to 255, and then does a lookup into the palette texture to determine the RGBA color value associated with that 8-bit index.

// GL_TEXTURE0 = indexes, GL_TEXTURE1 = lut
varying highp vec2 coordinate;
uniform sampler2D indexes;              // 8 bit grayscale index texture
uniform sampler2D lut;                  // 256x1 RGBA palette texture
uniform highp float lutScale;           // scales the sample to a lut coordinate
uniform highp float lutHalfPixelOffset; // centers the coordinate on a texel

void main()
{
  // Normalized sample in [0.0, 1.0] that encodes an index in [0, 255]
  highp float val = texture2D(indexes, coordinate.xy).r;
  // Map the index to the center of the matching texel in the lut
  highp float denormalized = (val * lutScale) + lutHalfPixelOffset;
  highp vec2 lookupCoord = vec2(denormalized, 0.0);
  gl_FragColor = texture2D(lut, lookupCoord);
}

The shader depends on a couple of parameters passed in from the Objective-C code via OpenGL uniform values. See the implementation in ViewController.m for the details of how the two input texture units are bound and how the scale and pixel offset values are passed into the shader program. This Stack Overflow post covers the tricky details of centering sampled texture coordinates. It would have been easier to use a 1D texture for the 8-bit table, but 1D textures are not supported on iOS with OpenGL ES 2.0. This render logic was not easy to get working, so if you want to experiment with this type of thing, it would be better to start with this already working example and go from there.
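The two uniform values follow from the usual texel-centering math. This is my reconstruction of that math, assuming a 256x1 lut texture (the app's actual values live in ViewController.m; `lut_scale` and `lookup_coord` here are illustrative names):

```python
# Derive lutScale and lutHalfPixelOffset for a 256x1 palette texture.
# (My reconstruction of the texel-centering math, not the app's code.)
LUT_WIDTH = 256

# texture2D returns val = index / 255. We want the lookup coordinate
# s = (index + 0.5) / 256, i.e. s = val * (255 / 256) + 0.5 / 256.
lut_scale = 255.0 / LUT_WIDTH
lut_half_pixel_offset = 0.5 / LUT_WIDTH

def lookup_coord(index):
    val = index / 255.0  # normalized value the sampler would return
    return val * lut_scale + lut_half_pixel_offset

# Index 0 lands on the center of the first texel (0.5/256) and index
# 255 on the center of the last texel (255.5/256), so nearest-neighbor
# sampling never straddles two palette entries.
```

Without the half-pixel offset, index 0 would sample the left edge of texel 0 and index 255 the right edge of the texture, where filtering and clamping behavior make the result unreliable.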

The GLKit iOS framework was used to implement the view controller in this example app. Using GLKit is a lot easier than the previous approach, where a developer needed to explicitly set up a renderbuffer via a CAEAGLLayer. GLKit takes care of rendering based on a timer or rendering on demand, and it integrates with UIKit in a clean way.

I hope this example gives you some interesting ideas about how an 8-bit palette could be useful in your own projects. Using a shader program to do the table lookup and blit operation is very efficient because the GPU is able to do many operations in parallel. This 8-bit table lookup appears to be faster than anything that could be done in regular ARM or NEON code.