The examples that follow demonstrate interesting ways that AVAnimator can be used in an iOS app. Each example is a complete Xcode project that contains the full source code for AVAnimator. All you need to do is download the example, build it in Xcode, and then run it either in the simulator or on iOS hardware.
To integrate AVAnimator into your own Xcode project, copy the AVAnimator and LZMASDK directories into your project and add them to the project file in the normal way. Older non-ARC projects will also need to copy over the AutoPropertyRelease.m file. Modern versions of Xcode handle framework linking automatically, except that libz must be explicitly added to the linked libraries to support APNG decoding; on iOS 9, explicitly add libcompression as well. Unlike previous releases, no special flags should be needed, and the code will automatically work in both ARC and non-ARC projects.
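The copy step above can be sketched from the command line. The checkout and app directory names here are assumptions for illustration only; the remaining steps (adding the directories to the project file, linking libz and libcompression) still happen inside Xcode:

```shell
# Stand-in for an AVAnimator checkout so this sketch runs anywhere;
# in practice these directories come from the AVAnimator download.
mkdir -p AVAnimator/AVAnimator AVAnimator/LZMASDK

# Copy the library sources into the app's source tree (names are illustrative).
mkdir -p MyApp
cp -R AVAnimator/AVAnimator MyApp/AVAnimator
cp -R AVAnimator/LZMASDK MyApp/LZMASDK

# Next, in Xcode: add both directories to the project, then add libz
# (and libcompression on iOS 9) under "Link Binary With Libraries".
ls MyApp
```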
StreetFighter Example : StreetFighter II for iPhone
This iPhone project is a quick little hack based on this page that shows punch, kick, and special move buttons. When you press a button, Ryu throws a punch, a kick, or a fireball.
This example is not a playable game; it is just a quick demo of playing audio and video loops in response to a user action. The demo was whipped up in a couple of hours with some files off the net, which shows how quickly one can create an iPhone app from existing audio/video resources. Take a look at the implementation and run the example to see how multiple audio and video resources are loaded, played, and unloaded without hogging the CPU or taking up all the system memory.
Fireworks Example : Touch Screen for Fireworks
This iPhone/iPad project shows off the H.264 RGB+Alpha decoding logic in AVAnimator. The first time the app is launched, each firework video is converted from H.264-encoded data to an MVID file cached on disk. No audio clips are included in this example. Each user touch on the screen kicks off a randomly selected firework that explodes near the touch position.
This example demonstrates how high quality explosion videos can be included in an iOS app without introducing too much bloat in the app download size: the app includes 10 videos, yet the total download size is only about 2 megs. This example also demonstrates how multiple explosion videos can be displayed on screen at the same time. AVAnimator makes this possible because it does not depend on the H.264 hardware decoder at runtime, so playback is not constrained by the limited number of concurrent hardware decode sessions.
AVRender Example : Comp images, video, and text
This example demonstrates how to use AVAnimator's AVOfflineComposition module to compose two different videos and text into a single output video.
The app displays a larger video of a person in a newsroom-like setting, with a smaller alpha-channel video of a specific news story superimposed over an area in the upper right corner. The smaller video is visible for only a portion of the time the larger movie plays. The CoreText framework is used to render pixel-perfect red text in the lower portion of the screen. The composition is controlled by XML files that describe each operation. After the background composition operation completes, the resulting lossless movie is played. This type of offline composition can be very useful in certain cases: for example, one might compose an alpha channel video over different backgrounds to produce a 24BPP movie that could be exported via email. Note that composition does not happen in real time, since generating frames and rendering the output video can take quite some time.
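To give a feel for what such a composition description might look like, here is a loudly hypothetical plist-style fragment. Every key name and value below is invented for illustration and is not AVOfflineComposition's actual schema; consult the module's source in the example project for the real keys it parses:

```xml
<!-- Hypothetical composition description: all key names are illustrative. -->
<dict>
  <key>CompWidth</key>            <integer>480</integer>
  <key>CompHeight</key>           <integer>320</integer>
  <key>CompDurationSeconds</key>  <real>10.0</real>
  <key>CompClips</key>
  <array>
    <!-- Full-frame background video, plays for the whole composition. -->
    <dict>
      <key>ClipSource</key>       <string>Newsroom.mvid</string>
      <key>ClipStartSeconds</key> <real>0.0</real>
    </dict>
    <!-- Smaller alpha-channel clip, positioned in the upper right
         corner and visible for only part of the timeline. -->
    <dict>
      <key>ClipSource</key>       <string>NewsStoryAlpha.mvid</string>
      <key>ClipX</key>            <integer>300</integer>
      <key>ClipY</key>            <integer>20</integer>
      <key>ClipStartSeconds</key> <real>2.0</real>
    </dict>
  </array>
</dict>
```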
AVSync Example : Synchronize Audio and Video
This example project demonstrates how a video track can be displayed in sync with an audio track using the AVAnimator library.
The video content is a C major scale lesson from iPractice, a digital practice assistant for guitar players. A key requirement for a musical app is that the visuals be tightly synchronized to the audio track; if the video sync is off by as little as 1/20th of a second, the user experience will suffer. This example contains two audio tracks: a fast tempo at 70 BPM and a slow tempo at 45 BPM. It contains only one video track, so the playback rate of the video track is dynamically adjusted to match the tempo of the selected audio track. This type of playback rate adjustment, and dynamic selection of the audio track used with a specific video track, is not possible using the built-in movie player classes shipped with iOS. In addition, this example shows how a developer can use custom movie controls that look like the built-in movie controls.