Load OpenGL textures with alpha channel on iOS

June 15, 2013

Hello, OpenGL hackers!

So, you have been using OpenGL and you know the difference between a viewport and a frustum. Great! But now suppose you run into a problem that is not so easy to solve: how does one send multiple textures from a movie to OpenGL? That is hard enough, but make it harder still by requiring that the textures include an alpha channel. The texture could be anything, but for the purposes of this example a 64x64 goldfish animation like this one will do just fine.


GoldFish

This animation contains 20 frames of a goldfish swimming. This post will show how to include the animation in an iOS project as a source of OpenGL textures. A texture is basically a 2D image, except that OpenGL maps the texture into 3D space. Instead of starting from scratch, let's use the code from OpenGL ES 2.0 for iPhone Tutorial Part 2 by Ray Wenderlich. That code displays a still image of a fish on the side of a spinning cube. It is a great little demo, and it will be even more interesting once the swimming fish shown above is added to the project.

TexturedCube

Now for the implementation. The first thing one might think of is simply including a series of PNG images in the iOS project. That is not hard to do, but this simple approach wastes a lot of space. If each PNG image in this animation is stored in a zip file, that file comes to 198,118 bytes, or about 198 KB. That is not huge, but it is not hard to do a lot better.

With the AVAnimator library for iOS, the total size of the animation can be compressed down to 121,965 bytes, or about 122 KB. The space savings is possible because AVAnimator includes code that decompresses the image data with 7zip, which compresses more effectively than the zlib compression used by plain PNG images. The result is roughly 60% of the original size, a significant reduction that means the final app will download more quickly for the end user.

In addition to reducing app size, AVAnimator is able to decompress multiple images much more efficiently than would be possible when decompressing a series of PNG files. In this example only a single movie with 20 frames will be decompressed, so CPU time used on the device will not be critical. But if a developer wanted to decode 2, 4, or 8 videos at the same time, then execution time on the iOS device would become a real issue.

Okay okay, enough talk. Let's see some results!

TexturedCube2

The image above is a screenshot from an iPhone running the demo with the addition of the goldfish animation. Of course, you will need to actually download the source code and run it yourself to see how nice the goldfish texture looks animating on the side of the cube.

The most interesting code is in OpenGLView.m; see the method named "render", the CADisplayLink callback that is invoked once for each rendered frame. The very first display link call cannot actually render the fish, since the media still needs to be decoded and prepared for rendering. Once the fish animation is loaded, it is pushed into OpenGL by the following code in the render method:

if (self->_frameDecoder) {
  // Texture frames are ready to display now
  [self loadNextGoldfishTexture];
  glActiveTexture(GL_TEXTURE0);
  glBindTexture(GL_TEXTURE_2D, _fishTexture);
  glUniform1i(_textureUniform, 0);
}

See the implementation of the loadNextGoldfishTexture method for the details of how texture frames are extracted from the animation file. The most interesting part of loadNextGoldfishTexture is the logic that uploads a texture with an alpha channel to OpenGL:

if (self->_goldfishFrame > 0) {
  GLuint texName = self->_fishTexture;
  glDeleteTextures(1, &texName);
}
...
AVFrame *frame = [_frameDecoder advanceToFrame:offset];
...
uint32_t *pixels = (uint32_t*) cgFramebuffer.pixels;
...
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height,
  0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, pixels);

Each time a fish animation frame is loaded into a texture on the graphics card, the previous one needs to be deallocated. The code then advances to the next frame and gets a pointer to the first word of the next framebuffer. This "pixels" pointer is then passed to glTexImage2D(), which copies the framebuffer to graphics memory. Note the use of GL_BGRA_EXT: this Apple-specific extension makes uploading BGRA little-endian texture data more efficient. This optimized texture loading is possible because AVAnimator internally stores movie data in a premultiplied BGRA format.

Many thanks go to Ray Wenderlich for providing such a nice, compact OpenGL ES 2 demo. The fish animation comes from a couple of CodeProject demos, A-lovely-goldfish and Fishy-Fishy-Fish.