Steven Hickson Final Project Proposal

I propose to use the Microsoft Kinect sensor and the work we have done on particle systems to create an interactive, augmented reality game. At minimum, this game will involve one person. I have created an open-source Kinect library for Windows, located here, that can extract people's skeletons, track people, and use 3D geometry to create 3D reconstructions of a scene in real time. Recently, I have also applied the knowledge gained from this class to display the 3D reconstructions in real time. Results of the 3D reconstruction are shown below in Figure 1:
Figure 1
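The core of the reconstruction step can be sketched as back-projecting each depth pixel into a 3D camera-space point with the pinhole camera model. This is a minimal sketch, not code from my library; the intrinsic values below are typical published Kinect depth-camera figures and should be treated as assumptions.

```python
# Sketch: back-project a Kinect depth pixel to a 3D point (pinhole model).
# The intrinsics below are assumed, typical-for-Kinect values, not measured ones.
FX, FY = 594.21, 591.04   # focal lengths in pixels (assumed)
CX, CY = 339.5, 242.7     # principal point in pixels (assumed)

def depth_to_point(u, v, depth_m):
    """Map pixel (u, v) with depth in meters to a camera-space 3D point."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return (x, y, depth_m)

# A pixel at the principal point lies straight down the optical axis.
print(depth_to_point(339.5, 242.7, 2.0))  # -> (0.0, 0.0, 2.0)
```

Doing this for every valid depth pixel in a frame yields the point cloud that gets rendered as the reconstruction.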

Using this information, I can extract a skeleton for each person using the joint-based structure shown below in Figure 2.

Figure 2

I can display the skeleton information in OpenGL as a rendered ball-and-stick model. Although a deformable model would be nicer, I could not find one online for OpenGL; I will continue to look but am not hopeful. For the game, I will therefore display the information either as the 3D reconstructed data or as something like what is shown in Figure 3 below.

Figure 3
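The ball-and-stick model reduces to a set of joint positions plus the bone segments that connect them; a renderer draws a sphere at every joint and a cylinder along every segment. The sketch below illustrates that data layout; the joint names, positions, and bone list are made up for illustration and are not the Kinect SDK's actual joint enumeration.

```python
# Sketch: a ball-and-stick skeleton as joint positions plus bone segments.
# Joint names, positions, and bones are illustrative, not the Kinect SDK's.
import math

joints = {  # example 3D joint positions in meters (made up for illustration)
    "shoulder": (0.0, 1.4, 2.0),
    "elbow":    (0.3, 1.2, 2.0),
    "wrist":    (0.6, 1.3, 2.0),
}
bones = [("shoulder", "elbow"), ("elbow", "wrist")]

def bone_segments(joints, bones):
    """Endpoints and length of each stick; a renderer would place a
    cylinder of this length between the two endpoints."""
    segs = []
    for a, b in bones:
        pa, pb = joints[a], joints[b]
        segs.append((pa, pb, math.dist(pa, pb)))
    return segs

for pa, pb, length in bone_segments(joints, bones):
    print(pa, "->", pb, f"len={length:.3f}")
```

Updating the joint dictionary from each new Kinect frame and redrawing the segments gives the animated stick figure.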
I can then use a particle system (ideally a volumetric one driven by Perlin or simplex noise) to render a fireball close to the one shown below in Figure 4.

Figure 4
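The fireball effect boils down to a particle system whose particles are jittered by a noise field, rise with buoyancy, and die off over time. Below is a minimal sketch of that update loop; simple seeded random jitter stands in for the Perlin/simplex noise field the real effect would sample, and all constants are assumptions.

```python
# Sketch: a minimal fireball particle system. Seeded random jitter stands in
# for the Perlin/simplex noise field; constants are illustrative assumptions.
import random

random.seed(0)  # deterministic for the example

class Particle:
    def __init__(self, pos, vel, life):
        self.pos, self.vel, self.life = list(pos), list(vel), life

def step(particles, dt=0.016, buoyancy=0.5):
    """Advance all particles one frame; hot particles drift up and fade out."""
    for p in particles:
        for i in range(3):
            p.vel[i] += random.uniform(-0.1, 0.1)  # noise-field stand-in
        p.vel[1] += buoyancy * dt                  # fire rises
        for i in range(3):
            p.pos[i] += p.vel[i] * dt
        p.life -= dt
    particles[:] = [p for p in particles if p.life > 0]  # cull dead particles

flame = [Particle((0, 0, 2), (0, 0.5, 0), life=1.0) for _ in range(100)]
for _ in range(30):
    step(flame)
print(len(flame), "particles alive")  # 0.48 s elapsed < 1.0 s life, all survive
```

A volumetric version would render each particle as a textured billboard or into a density volume, but the simulation loop is the same shape.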
An SVM classifier will be trained to recognize a throwing motion from the relative position changes of the arm joints. The velocity of the arm joints will also be tracked. When the classifier returns a result above x%, I determine that a fireball is being thrown and generate one with its initial velocity set from the arm joints. This will allow the user to throw a fireball.
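The detection step can be sketched as follows. The hand-tuned logistic score below is only a stand-in for the trained SVM's decision function, and the joint names, feature layout, and 0.8 threshold are all assumptions made for illustration.

```python
# Sketch: detect a throw from arm-joint motion and spawn a fireball.
# The logistic score stands in for the trained SVM's decision function;
# joint names, feature layout, and the 0.8 threshold are assumptions.
import math

def arm_features(prev, curr, dt):
    """Per-axis relative displacement (joint minus shoulder) and raw
    velocity for the wrist and elbow between two frames."""
    feats = []
    for j in ("wrist", "elbow"):
        for i in range(3):
            rel_prev = prev[j][i] - prev["shoulder"][i]
            rel_curr = curr[j][i] - curr["shoulder"][i]
            feats.append(rel_curr - rel_prev)             # relative displacement
            feats.append((curr[j][i] - prev[j][i]) / dt)  # joint velocity
    return feats

def throw_score(feats):
    """Stand-in for the SVM: squash the forward wrist speed (feature 5,
    the wrist z-velocity; z shrinks as the arm extends) into [0, 1]."""
    forward_speed = -feats[5]
    return 1.0 / (1.0 + math.exp(-4.0 * (forward_speed - 1.0)))

def maybe_throw(prev, curr, dt, threshold=0.8):
    """Spawn a fireball at the wrist, inheriting the wrist's velocity."""
    if throw_score(arm_features(prev, curr, dt)) > threshold:
        vel = tuple((curr["wrist"][i] - prev["wrist"][i]) / dt for i in range(3))
        return {"pos": curr["wrist"], "vel": vel}
    return None

prev = {"shoulder": (0, 1.4, 2.0), "elbow": (0.2, 1.3, 1.9), "wrist": (0.3, 1.3, 1.8)}
curr = {"shoulder": (0, 1.4, 2.0), "elbow": (0.3, 1.3, 1.7), "wrist": (0.5, 1.3, 1.3)}
fireball = maybe_throw(prev, curr, dt=0.1)
print(fireball)  # fast forward wrist motion -> a fireball is spawned
```

In the real system, the trained SVM replaces `throw_score`, and the spawned fireball's velocity seeds the particle system.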
Hopefully, I can then use this to build a game in which two people throw fireballs at each other.

Below is a video of the finished results. The first section shows the results described above; the second shows the result of porting my code to Unity3D to get a well-rendered character throwing a fireball.