Every game should have an audio system. My little engine supports two types of sound: 3D sound effects and background music (BGM). I separate these two types because the BGM can be decoded in hardware, via an Audio Queue on iOS.

Sound effects
Sound effects are played using OpenAL, an API similar in style to OpenGL. The engine has a dedicated audio thread where all OpenAL calls are executed, and this thread communicates with the main thread through a command buffer similar to the one used in graphics programming. For example, during the main-thread update the game logic may request an explosion sound; an audio command is then created and pushed into the command buffer. When the audio thread finds a command in the buffer, it executes the command and issues the corresponding OpenAL call. I set up the command buffer and audio thread because an OpenAL call may stall the calling thread, according to this article.

On iPhone there is AudioFileOpenURL(), but using it would mean opening the audio file a second time just to obtain the file's description. So I decided to extract this data myself, and I picked Apple's CAF file format with AAC compression because it is an open format and Macs ship with a command-line tool (afconvert) to convert audio files into it. (Note that on iPhone, the Audio Queue can decode only one song in hardware; if more audio needs to be played, it falls back to software decoding.)

CAF file format
Just like the WAV file format, a CAF file is divided into chunks, such as the audio description chunk (which stores the sample rate, channels per frame, etc.) and the data chunk (which stores the audio sample data). The specification of CAF can be found in Apple's documentation, and the parsing code is in the sample project.

Screen Shot from Sample Project

Apple Audio Queue
To play back audio using the Audio Queue API, a few steps are needed:

  1. An audio output queue needs to be created using AudioQueueNewOutput().
  2. We set the properties of the newly created queue with AudioQueueSetProperty(), supplying the magic cookie, which is required by some audio formats such as AAC.
  3. A property listener should be set up using AudioQueueAddPropertyListener() to listen for events in the audio queue, such as playback finishing.
  4. We allocate buffers that also hold the packet descriptions with AudioQueueAllocateBufferWithPacketDescriptions().
  5. After filling in the descriptions, the audio sample data is put into the audio queue with AudioQueueEnqueueBuffer().
  6. After that, we tell the hardware to decode the audio samples with AudioQueuePrime().
  7. Finally, the audio is ready to be played back using AudioQueueStart().
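The seven steps above can be sketched as follows. This compiles only against Apple's AudioToolbox framework and omits all error handling; `startBgm`, its parameters, and the callback bodies are hypothetical, and actually filling the buffer with packets from the CAF data chunk is left as a comment.

```cpp
#include <AudioToolbox/AudioToolbox.h>

static void outputCallback(void* userData, AudioQueueRef queue,
                           AudioQueueBufferRef buffer) {
    // Refill `buffer` with the next packets and re-enqueue it,
    // or let the queue drain when the song ends.
}

static void isRunningListener(void* userData, AudioQueueRef queue,
                              AudioQueuePropertyID propertyId) {
    // Fired when kAudioQueueProperty_IsRunning changes,
    // e.g. when playback finishes.
}

void startBgm(const AudioStreamBasicDescription& format,
              const void* magicCookie, UInt32 cookieSize,
              UInt32 bufferByteSize, UInt32 packetCount) {
    AudioQueueRef queue = nullptr;

    // 1. Create the output queue (null run loop = internal thread).
    AudioQueueNewOutput(&format, outputCallback, nullptr,
                        nullptr, nullptr, 0, &queue);

    // 2. Hand the magic cookie (codec setup data) to the queue.
    AudioQueueSetProperty(queue, kAudioQueueProperty_MagicCookie,
                          magicCookie, cookieSize);

    // 3. Get notified when playback starts or stops.
    AudioQueueAddPropertyListener(queue, kAudioQueueProperty_IsRunning,
                                  isRunningListener, nullptr);

    // 4. Allocate a buffer that also carries packet descriptions
    //    (needed for VBR formats such as AAC).
    AudioQueueBufferRef buffer = nullptr;
    AudioQueueAllocateBufferWithPacketDescriptions(queue, bufferByteSize,
                                                   packetCount, &buffer);

    // 5. Fill buffer->mAudioData and buffer->mPacketDescriptions from the
    //    CAF data chunk here, then enqueue (the buffer carries its own
    //    descriptions, so none are passed explicitly).
    AudioQueueEnqueueBuffer(queue, buffer, 0, nullptr);

    // 6. Decode frames ahead of time, then 7. start playback.
    AudioQueuePrime(queue, 0, nullptr);
    AudioQueueStart(queue, nullptr);
}
```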

To stop an audio queue, three steps are needed:

  1. AudioQueueStop() needs to be called to stop the playback.
  2. The property listener set up in step 3 above needs to be removed with AudioQueueRemovePropertyListener().
  3. Finally, AudioQueueDispose() is called to release all the audio queue resources.
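The teardown can be sketched in the same hedged way (Apple-only, no error handling; `stopBgm` and the listener stub are hypothetical names matching the setup sketch):

```cpp
#include <AudioToolbox/AudioToolbox.h>

// Stub standing in for the listener registered during setup.
static void isRunningListener(void* userData, AudioQueueRef queue,
                              AudioQueuePropertyID propertyId) {}

void stopBgm(AudioQueueRef queue) {
    // 1. Stop playback; `true` stops immediately instead of
    //    letting already-enqueued buffers finish.
    AudioQueueStop(queue, true);

    // 2. Remove the kAudioQueueProperty_IsRunning listener added in setup.
    AudioQueueRemovePropertyListener(queue, kAudioQueueProperty_IsRunning,
                                     isRunningListener, nullptr);

    // 3. Release the queue and every buffer it owns.
    AudioQueueDispose(queue, true);
}
```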

You may refer to the sample project for a full understanding of how to use the audio queue, especially the part that enqueues the audio samples into the audio queue buffers.

Playing back 3D sound effects using OpenAL on iPhone is similar to other platforms, while playing back the BGM took some effort because I had to obtain the audio file description myself: Apple's sample code only shows how to play back audio by specifying a file path, not audio that is already loaded in memory. I hope my sample code can help someone who faces the same problem.