Using a PID Controller

Introduction
Several weeks ago, I needed to implement a game screen for the user to choose a level. At that time, I had read some posts and articles about using a PID controller to control the behavior of a system, so I decided to try it on the UI. A PID controller is a control-loop feedback mechanism that generates an output to a system based on the difference between a measured value and a desired value:

f(t) = P·e(t) + I·∫e(t)dt + D·de(t)/dt

where f(t) is the output applied back to the system, e(t) is the difference between the measured value and the desired value, and P, I, D are tuning constants that control the behavior. More information can be found in the references below.

Using the PID Controller
In the UI, the user can drag the view to choose a level; when the user swipes across the icons, they scroll according to a velocity and an acceleration. The output of the PID controller is used to control the acceleration so that an icon always settles in the middle of the screen once the system becomes stable. The code is ported to WebGL below (a WebGL-enabled browser is needed to view it):

[Interactive WebGL demo: scrolling level icons, with input fields for the P, I, and D constants. A WebGL-enabled browser is required; otherwise a static image is shown.]

Tuning the PID variables

The behavior of the scrolling can be controlled by tuning the three constants P, I, and D. The effects of changing them are summarized below:
Summary of the effects of the PID constants (from Wikipedia):

  Constant | Rise time    | Overshoot | Settling time | Steady-state error
  P        | Decrease     | Increase  | Small change  | Decrease
  I        | Decrease     | Increase  | Increase      | Eliminate
  D        | Minor change | Decrease  | Decrease      | No effect in theory

You can play around with the three constants using the input fields above if you have a WebGL-enabled browser.

Conclusion
Using a PID controller to manage the system is convenient, but it is a bit tricky to tune the PID constants. It is easier to tune them one at a time while referring to the table above. The source code of my implementation can be downloaded here.

Reference:
[1]: http://en.wikipedia.org/wiki/PID_controller
[2]: http://altdevblogaday.com/2011/08/07/animation-using-closed-loop-control/
[3]: http://altdevblogaday.com/2011/02/27/webgl-part-2-in-the-beginning-there-was/
[4]: http://www.learnopengles.com/how-to-embed-webgl-into-a-wordpress-post/

Writing an iPhone Game Engine (Part 6- Performance)

Introduction
The performance of a game is very important: it is important to maintain a constant frame rate. So I will talk about how I keep track of the performance of the game. However, since the game is not finished, I can only profile the engine with placeholder assets, and the profiled data might not match the final data once the game is complete. Anyway, this is my first attempt at profiling my engine. (All data are measured on an iPhone 3G.)

Xcode OpenGL ES Analysis
For profiling graphics performance on the iPhone, there is an OpenGL ES analyzer that comes with Xcode 4. It can detect redundant state changes and give advice on improving rendering performance. It also helped me spot some bugs, such as forgetting to set the texture states to use mip-maps.
After using the analyzer, I discovered 2 recommendations that improved my engine's rendering performance significantly. The first suggestion is to use the EXT_discard_framebuffer extension after rendering each frame.
Before using the analyzer, I did not notice that the iPhone has this OpenGL extension. This reduced the frame time from 20.8ms per frame to 19.2ms per frame. The second suggestion is to use a more compact back-buffer format to reduce fill rate.
After changing the back-buffer format from RGBA8888 to RGB565, the time taken for each frame dropped from 19.2ms to 18.52ms.

Performance Graph
I also generate a performance graph to display how much time each subsystem uses in the engine. It is easier to spot spikes with a graph, so that I can investigate which subsystem caused them.

Conclusion
The above 2 techniques are used to track down performance issues and increase the frame rate. However, as the game is still under development, there are still some areas, such as physics and scripting, that can be optimized. Those optimizations will be done when most of the game is completed. I am going to stop writing this series, as most of the problems I encountered during development have been covered. Maybe I will write a postmortem after the game is finished, but that will not be in the near future, as there are still tons of artwork left to do. Hope you all enjoyed the series.

Writing an iPhone Game Engine (Part 5- Audio)

Introduction
Every game should have an audio system. My little engine supports 2 types of sound: 3D effect sounds and BGM. These 2 types are separated because the BGM can be decoded in hardware. A sample project is provided that plays BGM using the Audio Queue on iOS.

Effect sounds
Effect sounds are played using OpenAL, an API similar to OpenGL. In the engine, there is an audio thread where all OpenAL calls are executed, and this thread communicates with the main thread through a command buffer similar to the one used in graphics programming. For example, during the main-thread update, the game logic may request to play an explosion sound; an audio command is then made and pushed into the command buffer. When the audio thread finds a command inside the buffer, it executes the command and initiates an OpenAL call. I set up the command buffer and audio thread because an OpenAL call may stall the calling thread, according to this article.

BGM
On the iPhone, there is hardware to decode audio, which can be used through Apple's Audio Queue API. With this API, you can play sound by creating an audio queue output using AudioQueueNewOutput(), providing the audio file description, such as the sample rate. You can get this description by calling AudioFileOpenURL(). Unfortunately, this is not suitable in my case, as my audio file is already loaded in memory when a game-world tile is streamed; I don't want to call AudioFileOpenURL() to open the audio file again just to get its description. So I decided to extract this data myself, and I picked Apple's CAF file format with AAC compression because it is an open format and Mac machines have a command-line tool to convert files into it. (Note that on the iPhone, the Audio Queue can only decompress 1 song using hardware decoding. If more audio needs to be played, it falls back to software decoding.)

CAF file format
Just like the WAV file format, the CAF file format is divided into chunks, such as the description chunk (which stores the sample rate, channels per frame, ...) and the data chunk (which stores the audio sample data). The specification of CAF can be found here. We need the data inside those chunks to play back the BGM using an Audio Queue. For details, take a look at AudioCAFHelper.cpp in the sample project.
[Screenshot from the sample project]
Apple Audio Queue
To play back audio using the Audio Queue API, we need a couple of steps:
  1. An audio queue output needs to be created using AudioQueueNewOutput().
  2. We need to set the properties of the newly created queue with AudioQueueSetProperty(), supplying the magic cookie property that is required by the audio format.
  3. A property listener should be set up using AudioQueueAddPropertyListener() to listen for events that occur in the audio queue, such as playback finishing.
  4. We need to allocate memory for the audio queue to hold the packet descriptions with AudioQueueAllocateBufferWithPacketDescriptions().
  5. After setting up the descriptions, the audio sample data needs to be put into the audio queue with AudioQueueEnqueueBuffer().
  6. After that, we need to tell the hardware to decode the audio samples with AudioQueuePrime().
  7. Finally, the audio is ready to be played back using AudioQueueStart().
To stop an audio queue, 3 steps are needed:
  1. AudioQueueStop() needs to be called to stop the playback.
  2. The property listener set up in step 3 above needs to be removed with AudioQueueRemovePropertyListener().
  3. Finally, AudioQueueDispose() is called to release all the audio queue resources.
You may refer to the sample project for a full understanding of how to use the audio queue, especially the part that enqueues audio samples into the audio queue buffers.

Conclusion
Playing back 3D effect sounds using OpenAL on the iPhone is similar to other platforms, while playing back BGM takes some effort, because I needed to get the audio file description myself: the sample code from Apple only shows how to play back audio specified by a file path, not audio that is already loaded in memory. I hope my sample code can help someone who faces the same problem.

Reference:
Sample code: http://code.google.com/p/audio-queue-caf-sample/downloads/list
CAF file format:  http://developer.apple.com/library/mac/#documentation/MusicAudio/Reference/CAFSpec/CAF_spec/CAF_spec.html%23//apple_ref/doc/uid/TP40001862-CH210-DontLinkElementID_64
Audio Queue Reference:  http://developer.apple.com/library/ios/#DOCUMENTATION/MusicAudio/Reference/AudioQueueReference/Reference/reference.html
Using Audio on iOS: http://developer.apple.com/library/IOS/#documentation/AudioVideo/Conceptual/MultimediaPG/UsingAudio/UsingAudio.html