Integrating Perceptual Computing in Unity

Perceptual Computing can be defined as a set of techniques available to developers to provide a new user experience. This concept, developed by Intel®, is composed of these elements:

  • a camera with several sensors (HD, depth, stereo microphone);
  • an SDK, developed by Intel, to retrieve the events captured by the camera. The SDK can perform the following tasks:
  • tracking of fingers and hands;
  • recognition of poses and gestures;
  • tracking of parts of the face;
  • face recognition;
  • speech recognition;
  • speech synthesis;
  • etc.

This technology has been available to developers since the end of 2012 and is due to be integrated into Ultrabooks in 2014.

Many examples are provided with the SDK, which interfaces with many frameworks and tools available on the market, such as openFrameworks, Unity, etc.

It is available in C++ and C#.

We will first see how to install the SDK, then how to integrate it into Unity, and finally how to use the camera and the SDK to implement a flight controller in a 3D game.

I. Camera and SDK

IA. Download and install the SDK

The first step is to get the SDK from the Intel site. The address is: http://software.intel.com/fr-fr/vcsource/tools/perceptual-computing-sdk .


Once the installation file is downloaded, run it. You can keep the default directory (C:\Program Files (x86)\Intel\PCSDK). The installation may take some time if you decide to install the voice-recognition tools.

IB. Purchase camera

It is possible to buy a perceptual camera, intended for developers, from the Intel site. More recently, at Computex, Intel announced that Creative was bringing to market a consumer camera named Senz3D.

IC. Test the installation and camera

You can test the installation by running the SDK samples delivered with it. Open your file explorer and go to the folder C:\Program Files (x86)\Intel\PCSDK\bin. Then choose the folder for the appropriate platform; in my case, I will use the samples in the x64 folder.


Run the gesture_viewer sample program:


II. Integration in Unity

II-A. Presentation of Unity

Unity is a tool that facilitates the development of video games. It allows 3D objects to be integrated into scenes and handles many formats (.obj, .blend, etc.). It is also possible to add behaviors such as gravity, collision handling, visual effects, etc. The behavior of the various elements of a scene is defined using scripts written in C#, JavaScript or Boo. The Perceptual Computing SDK is available in Unity through a plugin, which is actually a C# file wrapping the C++ interfaces commonly used by Perceptual Computing developers.

II-B. Prerequisites for using the SDK in Unity

To run plugins in Unity, you need the professional version. This version is not free; however, it can be used free of charge for 30 days. The Unity site ( http://unity3d.com/ ) will give you all the necessary details.

You must also have a perceptual camera and have installed the SDK. If these criteria are not met, please refer to the previous sections.

II-C. Retrieving the project

To ease development and really focus on the Perceptual Computing aspect, you can get the project we will be working on in the following zip ( Developpez-lab.zip ).

This project contains:

  • a terrain representing an island;
  • water;
  • a ship;
  • a cube for handling collisions between the ship and the water;
  • a light.

The project can be run, but nothing happens yet. We'll start by adding the plugin. To do this, we must first create a folder named Plugins in the Assets folder of the Unity project.


It is then necessary to copy some files into the newly created directory. In your file explorer, get the files from the folder C:\Program Files (x86)\Intel\PCSDK\framework\Unity\hellounity\Assets\Plugins and copy them into the Plugins folder of the Unity project.


Your project is now configured. We can create a script to control the ship.

III. Controlling the ship

III-A. Creating a script to move the ship

To move the ship, we will attach to it a C# script that retrieves the events of the Perceptual Computing SDK and then acts on the position of the ship.

To create a script, select the ship and, in the Inspector, click "Add Component". Create a C# script and name it SpaceShipMoves.

Open this newly created script by double-clicking it. Unity automatically creates two methods:

  • void Start ();
  • void Update ().

The Start() method is called when the application launches. The Update() method is called every frame.

We will add, at the end of the Update() method, a line to move the ship:

void Update()
{
    transform.position = transform.position + transform.forward * SPEED_FACTOR;
}

When a script is attached to an object, it has a transform variable for accessing data about its position, orientation, etc. Thus, the position can be retrieved through transform.position and the direction vector of our object through transform.forward. We use a constant named SPEED_FACTOR whose value is initialized to 3f:

public float SPEED_FACTOR = 3f;

If you run the game now, you will see that the ship moves forward on its own.
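Assembled from the snippets above, the whole SpaceShipMoves script at this stage is only a few lines long. This is a sketch of the intermediate state; the empty Start() will be filled in later when the SDK is wired in:

```csharp
using UnityEngine;

// Intermediate state of the SpaceShipMoves script:
// the ship simply moves forward along its local Z axis every frame.
public class SpaceShipMoves : MonoBehaviour
{
    public float SPEED_FACTOR = 3f;

    void Start()
    {
        // Nothing yet; the SDK initialization will go here.
    }

    void Update()
    {
        transform.position = transform.position + transform.forward * SPEED_FACTOR;
    }
}
```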

III-B. Movements

We will implement a control on three axes:

  • Pitch : raising or lowering the nose of the aircraft;
  • Roll : tilting the wings up or down;
  • Yaw : pointing the nose of the aircraft to the right or left.

To act on the pitch, we use the distance from the camera of the nearest hand (Y axis). A reference point is needed: when the hands are between the screen and the reference point, the ship moves down; when the user's hands are between their body and the reference point, the ship moves up.


To act on the roll, we use the difference in height (Z axis) between the two hands. The movement can be compared to the motion a driver makes to turn a steering wheel.


Finally, to act on the yaw, the user pulls back the hand on the side towards which the craft should turn. Thus, to turn right, pull back the right hand.


III-C. Using the SDK

The next step is to read the data from the SDK and then use them in the game. When the application launches, a few variables must be initialized. We start by declaring four new attributes in our class:

private PXCUPipeline.Mode mode = PXCUPipeline.Mode.GESTURE;
private PXCUPipeline pp;
private bool calibrated = false;
private float calibrationY;

void Start()
{
    pp = new PXCUPipeline();
    pp.Init(mode);
}

We take this opportunity to give the code of the Start() method, which initializes some SDK variables.

The mode attribute is actually a constant that we use to configure the SDK. The pp attribute is our link to the SDK interfaces: all our data reads will go through this variable.

calibrated will let us know whether our reference point (needed to compute the pitch) is reliable. Finally, calibrationY stores the Y distance of our reference point.

We now complete the Update() method to retrieve the data from the SDK:

void Update()
{
    // Creation of two references to retrieve the data
    // coming from the right hand and from the left hand.
    PXCMGesture.GeoNode mainHand;
    PXCMGesture.GeoNode secondaryHand;

    // The AcquireFrame() method blocks when it is used with
    // its parameter set to true. Letting this method block allows us to synchronize
    // the frame rate of the game with that of the camera.
    if (!pp.AcquireFrame(true)) return;

    // We use the QueryNode method to read the position of the hands.
    // This is done in two steps: first for the primary hand,
    // then for the secondary hand.
    if (pp.QueryNode(PXCMGesture.GeoNode.Label.LABEL_BODY_HAND_PRIMARY | PXCMGesture.GeoNode.Label.LABEL_HAND_MIDDLE, out mainHand)
        && pp.QueryNode(PXCMGesture.GeoNode.Label.LABEL_BODY_HAND_SECONDARY | PXCMGesture.GeoNode.Label.LABEL_HAND_MIDDLE, out secondaryHand))
    {
        // We do things here
    }
    else { calibrated = false; }

    // Release the lock.
    pp.ReleaseFrame();

    // We can use the data here!
    transform.position = transform.position + transform.forward * SPEED_FACTOR;
}

The AcquireFrame method ensures that a frame is actually available. Indeed, we must not read a frame while the SDK is still writing to the same resource, so this function protects access to the data. When the parameter passed to AcquireFrame is true, the function is blocking.

We then use the QueryNode method to fill in the contents of the mainHand and secondaryHand variables. QueryNode takes the following arguments:

  • PXCMGesture.GeoNode.Label body : the body part we are interested in. We build a specific label using the | operator to indicate the hand and the part of the hand that interests us. Thus, in the example PXCMGesture.GeoNode.Label.LABEL_BODY_HAND_LEFT | PXCMGesture.GeoNode.Label.LABEL_HAND_MIDDLE, we target the central part of the left hand;
  • PXCMGesture.GeoNode geonode : the reference into which the retrieved data is written.

If a call to QueryNode fails, we need to signal to our game that the data is no longer reliable. We therefore set calibrated to false.

Above all, do not forget to call the ReleaseFrame() method to release the lock.

However, this skeleton is not enough. If you run the game again, you will see that the behavior has not changed: for the moment we are not using any of the information.

Before implementing the roll, the pitch and the yaw, we must check that the hands are correctly mapped. At the time of writing, the SDK is still young and some features are not reliable. To verify that the hands are correctly mapped (right hand on the right, left hand on the left), we read the X coordinate of each hand and check that the x coordinate of the secondary hand is greater than that of the primary hand.

void checkHands(ref PXCMGesture.GeoNode mainHand, ref PXCMGesture.GeoNode secondaryHand)
{
    // If the x position of the primary hand is greater than that of the
    // secondary hand, we swap the references.
    if (mainHand.positionWorld.x > secondaryHand.positionWorld.x)
    {
        PXCMGesture.GeoNode temp = mainHand;
        mainHand = secondaryHand;
        secondaryHand = temp;
    }
}

This convention is arbitrary; we could have swapped the hands the other way. What matters is to always do the same thing and to apply the right sign in the calculations of the pitch, the roll and the yaw.

This function is called when the data has been read successfully (and since our movements depend on it, it is best to call it right after reading the positions).

Again, we modify the Update () function:

void Update()
{
    // Creation of two references to retrieve the data
    // coming from the right hand and from the left hand.
    PXCMGesture.GeoNode mainHand;
    PXCMGesture.GeoNode secondaryHand;

    // The AcquireFrame() method blocks when it is used with
    // its parameter set to true. Letting this method block allows us to synchronize
    // the frame rate of the game with that of the camera.
    if (!pp.AcquireFrame(true)) return;

    // We use the QueryNode method to read the position of the hands.
    // This is done in two steps: first for the primary hand,
    // then for the secondary hand.
    if (pp.QueryNode(PXCMGesture.GeoNode.Label.LABEL_BODY_HAND_PRIMARY | PXCMGesture.GeoNode.Label.LABEL_HAND_MIDDLE, out mainHand)
        && pp.QueryNode(PXCMGesture.GeoNode.Label.LABEL_BODY_HAND_SECONDARY | PXCMGesture.GeoNode.Label.LABEL_HAND_MIDDLE, out secondaryHand))
    {
        checkHands(ref mainHand, ref secondaryHand);

        // Calibrate or recalibrate the game if needed
        calibrate(ref mainHand);
    }
    else { calibrated = false; }

    // Release the lock.
    pp.ReleaseFrame();

    if (!calibrated) return;

    // We can use the data here!
    transform.position = transform.position + transform.forward * SPEED_FACTOR;
}

Compared with the previous version, the additions are the call to checkHands(), the call to calibrate() and the check on calibrated.


We use the calibrate() function to calibrate or recalibrate the game when necessary. Its code is as follows:

void calibrate(ref PXCMGesture.GeoNode mainHand)
{
    PXCMGesture.Gesture dataMain;
    PXCMGesture.Gesture dataSecondary;

    // We try to retrieve a "thumbs up" event on the primary hand
    if (pp.QueryGesture(PXCMGesture.GeoNode.Label.LABEL_BODY_HAND_PRIMARY, out dataMain))
    {
        // If the event is there, we record the position of the hand
        if (dataMain.label == PXCMGesture.Gesture.Label.LABEL_POSE_THUMB_UP)
        {
            calibrated = true;
            calibrationY = mainHand.positionWorld.y;
        }
    }
    // We try to retrieve a "thumbs up" event on the secondary hand
    else if (pp.QueryGesture(PXCMGesture.GeoNode.Label.LABEL_BODY_HAND_SECONDARY, out dataSecondary))
    {
        // If the event is there, we record the position of the hand
        if (dataSecondary.label == PXCMGesture.Gesture.Label.LABEL_POSE_THUMB_UP)
        {
            calibrated = true;
            calibrationY = mainHand.positionWorld.y;
        }
    }
}

If you try to run the game now, you will see the behavior change: the ship no longer moves forward. To make it advance, place your hands in front of the camera and give a thumbs up with either hand (or both if you wish). As you can see, we use the positionWorld variable to retrieve the position of the hand. This variable is a three-dimensional vector; in our example, we use its Y component to get the distance of the hand from the camera.


Once the game is calibrated, as long as your hands are in front of the camera, the ship keeps moving forward. It only remains to integrate the rotations of the ship and you're done!

III-D. Implementation of controls

III-D-1. Precomputing useful data

Our mainHand and secondaryHand variables contain a lot of data that is not relevant to us. To simplify the calculations, we extract the useful values before passing them to the functions that apply the rotations.

We once again use the positionWorld attribute of our nodes.

float mainHandY = mainHand.positionWorld.y;

float mainHandZ = mainHand.positionWorld.z;

float secondaryHandY = secondaryHand.positionWorld.y;

float secondaryHandZ = secondaryHand.positionWorld.z;

It remains only to apply rotations by calling the functions created for this purpose.

controlRoll(mainHandZ, secondaryHandZ);
controlYaw(mainHandY, secondaryHandY);
controlPitch(mainHandY, secondaryHandY);

The code of the Update() method is now complete and looks like this:

void Update()
{
    // Creation of two references to retrieve the data
    // coming from the right hand and from the left hand.
    // They are initialized so the compiler accepts their use after the if/else.
    PXCMGesture.GeoNode mainHand = new PXCMGesture.GeoNode();
    PXCMGesture.GeoNode secondaryHand = new PXCMGesture.GeoNode();

    // The AcquireFrame() method blocks when it is used with
    // its parameter set to true. Letting this method block allows us to synchronize
    // the frame rate of the game with that of the camera.
    if (!pp.AcquireFrame(true)) return;

    // We use the QueryNode method to read the position of the hands.
    // This is done in two steps: first for the primary hand,
    // then for the secondary hand.
    if (pp.QueryNode(PXCMGesture.GeoNode.Label.LABEL_BODY_HAND_PRIMARY | PXCMGesture.GeoNode.Label.LABEL_HAND_MIDDLE, out mainHand)
        && pp.QueryNode(PXCMGesture.GeoNode.Label.LABEL_BODY_HAND_SECONDARY | PXCMGesture.GeoNode.Label.LABEL_HAND_MIDDLE, out secondaryHand))
    {
        checkHands(ref mainHand, ref secondaryHand);

        // Calibrate or recalibrate the game if needed
        calibrate(ref mainHand);
    }
    else { calibrated = false; }

    // Release the lock.
    pp.ReleaseFrame();

    if (!calibrated) return;

    // Retrieve the interesting data
    float mainHandY = mainHand.positionWorld.y;
    float mainHandZ = mainHand.positionWorld.z;
    float secondaryHandY = secondaryHand.positionWorld.y;
    float secondaryHandZ = secondaryHand.positionWorld.z;

    // Apply the rotations
    controlRoll(mainHandZ, secondaryHandZ);
    controlYaw(mainHandY, secondaryHandY);
    controlPitch(mainHandY, secondaryHandY);

    transform.position = transform.position + transform.forward * SPEED_FACTOR;
}

We will detail the controlRoll, controlYaw and controlPitch functions below.

III-D-2. Implementing the pitch

The pitch is actually the hardest control to implement, because it requires a reference point. The function itself is nevertheless very simple.

void controlPitch(float mainHandY, float secondaryHandY)
{
    // We use the hand closest to the camera
    float positionY = (mainHandY < secondaryHandY) ? mainHandY : secondaryHandY;
    float pitch = calibrationY - positionY;

    // Squaring (while keeping the sign) makes the movements feel smooth
    pitch *= Mathf.Abs(pitch);
    pitch *= sensibilityFactor;

    transform.RotateAroundLocal(transform.right, pitch);
}

We start by retrieving the Y position of the hand closest to the camera (in order to have stable behavior when the yaw is used). The pitch is then the difference between the reference point and the position of the selected hand. To soften the value and obtain a flexible control, we square it while keeping its sign. We then multiply by a configurable sensitivity factor.

The last line applies the rotation. Remember, we used transform.position to move the ship; we now use the RotateAroundLocal function around the local X axis (transform.right) to tilt the ship up or down.

Squaring the value is a real advantage: it attenuates the effect of small movements, giving very precise control, while still allowing large rotations beyond a certain amplitude. A detailed explanation of this technique is the subject of a white paper available online.
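To see the effect in numbers, here is a small standalone C# sketch of the signed-square curve. It is illustrative only: offset stands for (calibrationY - positionY), and the Unity code obtains the same result with pitch *= Mathf.Abs(pitch).

```csharp
using System;

class ResponseCurveDemo
{
    // Square the magnitude while preserving the sign.
    static float Soften(float offset)
    {
        return offset * Math.Abs(offset);
    }

    static void Main()
    {
        // A small offset of 0.1 becomes about 0.01 (strongly attenuated:
        // precise control near the reference point), while a large offset
        // of 0.5 becomes 0.25 (barely halved: wide rotations remain possible).
        foreach (float offset in new[] { -0.5f, -0.1f, 0.1f, 0.5f })
            Console.WriteLine("{0} -> {1}", offset, Soften(offset));
    }
}
```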

III-D-3. Implementation of the roll

The implementation of the roll is very similar to that of the pitch and therefore will not be detailed.

void controlRoll(float mainHandZ, float secondaryHandZ)
{
    float roll = mainHandZ - secondaryHandZ;
    roll *= Mathf.Abs(roll);
    roll *= sensibilityFactor;

    transform.RotateAroundLocal(transform.forward, roll);
}

III-D-4. Implementation of yaw

As with the roll, the details have already been explained above.

void controlYaw(float mainHandY, float secondaryHandY)
{
    float yaw = mainHandY - secondaryHandY;
    yaw *= Mathf.Abs(yaw);
    yaw *= sensibilityFactor;

    transform.RotateAroundLocal(transform.up, yaw);
}

IV. Conclusion

We have seen how to use the Perceptual Computing SDK in Unity and how to implement a game controller in a few lines of code. Our game does not handle collisions, has no goal and is very basic. You are free to do whatever you want with it!
