Kinect 4 Windows v2 – Custom Gestures in Unity

I have received a few questions recently about detecting custom Kinect gestures in Unity 3D, and it seems there are a few issues with getting this up and running. I have previously posted about Kinect custom gestures and about using the ‘built-in’ gestures from the Kinect 4 Windows SDK inside Unity, so it makes sense to complete the loop here. I’ll run through my experience. A few things to set up straight off the bat:

I created a new Unity project and chose to install the Visual Studio 2013 Tools for Unity package (not required, but I like to debug using Visual Studio).

I imported the Kinect custom packages for Unity: Assets > Import Package > Custom Package, then navigated to the Kinect Unity package location on my local disk (the Kinect Unity package can be downloaded from http://go.microsoft.com/fwlink/?LinkID=513177). Add both the Kinect and Kinect.VisualGestureBuilder packages.

Creating the Scene

As in this previous post, I created an empty game object and a new C# script, and added the following code to detect the Kinect sensor, pull the body data (exposing it to other scripts) and retrieve the body tracking ID, which is required to identify the body on which we are detecting the gestures. To keep things simple I assumed only one tracked body and, as always, remember this is intended as a sample, so I may not be as careful with cleaning up event handlers, etc.

using UnityEngine;
using Windows.Kinect;

public class BodySourceManager : MonoBehaviour
{
    private KinectSensor _Sensor;
    private BodyFrameReader _Reader;
    private Body[] _Data = null;
    private ulong _trackingId = 0;

    // Set in the Inspector so the gesture manager can be told which body to track
    public CustomGestureManager GestureManager;

    public Body[] GetData()
    {
        return _Data;
    }

    void Start()
    {
        _Sensor = KinectSensor.GetDefault();

        if (_Sensor != null)
        {
            _Reader = _Sensor.BodyFrameSource.OpenReader();

            if (!_Sensor.IsOpen)
            {
                _Sensor.Open();
            }
        }
    }

    void Update()
    {
        if (_Reader != null)
        {
            var frame = _Reader.AcquireLatestFrame();
            if (frame != null)
            {
                if (_Data == null)
                {
                    _Data = new Body[_Sensor.BodyFrameSource.BodyCount];
                }

                _trackingId = 0;
                frame.GetAndRefreshBodyData(_Data);

                frame.Dispose();
                frame = null;

                // Keep it simple: take the first tracked body and hand its
                // tracking ID to the gesture manager
                foreach (var body in _Data)
                {
                    if (body != null && body.IsTracked)
                    {
                        _trackingId = body.TrackingId;
                        if (GestureManager != null)
                        {
                            GestureManager.SetTrackingId(body.TrackingId);
                        }
                        break;
                    }
                }
            }
        }
    }

    void OnApplicationQuit()
    {
        if (_Reader != null)
        {
            _Reader.Dispose();
            _Reader = null;
        }

        if (_Sensor != null)
        {
            if (_Sensor.IsOpen)
            {
                _Sensor.Close();
            }

            _Sensor = null;
        }
    }
}

The only difference from the previous post here is the call to set the tracking ID. Next, I added another script to load the custom gesture database and detect the gestures:

using UnityEngine;
using Windows.Kinect;
using Microsoft.Kinect.VisualGestureBuilder;

public class CustomGestureManager : MonoBehaviour
{
    VisualGestureBuilderDatabase _gestureDatabase;
    VisualGestureBuilderFrameSource _gestureFrameSource;
    VisualGestureBuilderFrameReader _gestureFrameReader;
    KinectSensor _kinect;
    Gesture _salute;
    Gesture _saluteProgress;
    ParticleSystem _ps;

    // Set in the Inspector; its scale and particle system react to the gesture
    public GameObject AttachedObject;

    public void SetTrackingId(ulong id)
    {
        _gestureFrameReader.IsPaused = false;
        _gestureFrameSource.TrackingId = id;

        // Remove the handler before adding it so repeated calls from Update
        // don't stack up duplicate subscriptions
        _gestureFrameReader.FrameArrived -= _gestureFrameReader_FrameArrived;
        _gestureFrameReader.FrameArrived += _gestureFrameReader_FrameArrived;
    }

    // Use this for initialization
    void Start()
    {
        if (AttachedObject != null)
        {
            // GameObject.particleSystem is the Unity 4.x shortcut; use
            // GetComponent<ParticleSystem>() in later versions
            _ps = AttachedObject.particleSystem;
            _ps.emissionRate = 4;
            _ps.startColor = Color.blue;
        }
        _kinect = KinectSensor.GetDefault();

        // Load the gesture database from StreamingAssets and register its gestures
        _gestureDatabase = VisualGestureBuilderDatabase.Create(Application.streamingAssetsPath + "/salute.gbd");
        _gestureFrameSource = VisualGestureBuilderFrameSource.Create(_kinect, 0);

        foreach (var gesture in _gestureDatabase.AvailableGestures)
        {
            _gestureFrameSource.AddGesture(gesture);

            if (gesture.Name == "salute")
            {
                _salute = gesture;
            }
            if (gesture.Name == "saluteProgress")
            {
                _saluteProgress = gesture;
            }
        }

        // Stay paused until we have a body tracking ID to work with
        _gestureFrameReader = _gestureFrameSource.OpenReader();
        _gestureFrameReader.IsPaused = true;
    }

    void _gestureFrameReader_FrameArrived(object sender, VisualGestureBuilderFrameArrivedEventArgs e)
    {
        VisualGestureBuilderFrameReference frameReference = e.FrameReference;
        using (VisualGestureBuilderFrame frame = frameReference.AcquireFrame())
        {
            if (frame != null && frame.DiscreteGestureResults != null)
            {
                if (AttachedObject == null)
                    return;

                DiscreteGestureResult result = null;

                if (frame.DiscreteGestureResults.Count > 0)
                    result = frame.DiscreteGestureResults[_salute];
                if (result == null)
                    return;

                if (result.Detected == true)
                {
                    // Use the continuous 'saluteProgress' result to drive scale and emission
                    var progressResult = frame.ContinuousGestureResults[_saluteProgress];
                    var prog = progressResult.Progress;
                    float scale = 0.5f + prog * 3.0f;
                    AttachedObject.transform.localScale = new Vector3(scale, scale, scale);
                    if (_ps != null)
                    {
                        _ps.emissionRate = 100 * prog;
                        _ps.startColor = Color.red;
                    }
                }
                else
                {
                    if (_ps != null)
                    {
                        _ps.emissionRate = 4;
                        _ps.startColor = Color.blue;
                    }
                }
            }
        }
    }
}

This code is more or less the same as the code here /kinect/unity/winrt/2014/11/21/kinect-4-windows-v2-custom-gestures-in-unity.html.

Attaching a Game Object

Once this script is in place you can set the ‘AttachedObject’ variable via the Unity editor by dragging a game object onto the field in the Inspector. I created a particle system for the sample, changed some of its properties, and dragged it on. When the gesture is detected I use the progress value to alter some properties of the particle system, changing its colour and emission rate.
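The BodySourceManager’s public GestureManager field gets wired up the same way, by dragging the game object carrying the CustomGestureManager script onto it. If you would rather not depend on the Inspector, a fallback lookup is one option; this is a sketch of my own rather than part of the original sample (it assumes a single CustomGestureManager exists in the scene):

// Optional fallback, not in the original sample: call this at the top of
// BodySourceManager.Start() to resolve the gesture manager at runtime if the
// Inspector field was left empty.
private void EnsureGestureManager()
{
    if (GestureManager == null)
    {
        // FindObjectOfType walks the loaded scene; assumes exactly one CustomGestureManager
        GestureManager = FindObjectOfType(typeof(CustomGestureManager)) as CustomGestureManager;
    }
}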

Accessing the Gesture Database

Notice that the code above loads the .gbd file from my previous post, which contains a discrete gesture ‘salute’ and a continuous gesture ‘saluteProgress’. To access the database you can use the streamingAssetsPath provided by Unity to gain access to local resources; more details here.
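For this to work at runtime the salute.gbd file needs to sit in a StreamingAssets folder inside the Unity project so that it gets copied into the build. A minimal sketch of loading it inside Start (assuming that folder layout, and simply logging the gestures it contains):

// Assumes salute.gbd has been copied to Assets/StreamingAssets so that
// Application.streamingAssetsPath resolves to it both in the editor and in the build.
string databasePath = System.IO.Path.Combine(Application.streamingAssetsPath, "salute.gbd");
var database = VisualGestureBuilderDatabase.Create(databasePath);

foreach (var gesture in database.AvailableGestures)
{
    Debug.Log("Gesture available: " + gesture.Name);
}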

Build settings

You can use the Unity File > Build Settings… dialog to configure the output project (in this case Windows Store output).

[Image: Unity build settings dialog]

Building this will generate a Windows Store app project which you can load into Visual Studio and run/debug as required. At this stage it is important to remember to do the following:

Add Webcam and Microphone capabilities to your app by editing the Windows Store app package manifest

[Image: package manifest capabilities]
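In the raw Package.appxmanifest those capabilities look something like the excerpt below (an illustration assuming the standard Windows Store app manifest; ticking Webcam and Microphone on the manifest designer’s Capabilities tab achieves the same thing):

<!-- Excerpt from Package.appxmanifest: the Kinect sensor requires both capabilities -->
<Capabilities>
  <DeviceCapability Name="webcam" />
  <DeviceCapability Name="microphone" />
</Capabilities>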

Add references in the app to WindowsPreview.Kinect and Microsoft.Kinect.VisualGestureBuilder

[Image: project references]

These last two have caught me out on a number of occasions and it’s not always obvious when this is the cause of problems.

[Image: salute gesture detected]

This shows the final result captured mid-salute.

Find the sample project here.

Other useful resources:

Another working sample - https://github.com/carmines/workshop/blob/dev/Unity/VGBSample.unitypackage
