Use Syphon to send Processing frames to VPT


I'm using Processing with FaceOSCSyphon and need to use Syphon to send the frames from Processing to VPT. What do I need to add to my code for this to work?


1 Answer

Answered by George Profenza:

I haven't used FaceOSCSyphon, but I've played a bit with the FaceTracker library. Looking at the examples, the FaceOSCSyphon.pde sketch acts as a Syphon Client.

In your case, according to page 33 of the VPT Documentation (large PDF link), your Processing sketch needs to be a Syphon Server.

In Processing, run Examples > Contributed Libraries > Syphon > Send Frames.
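
For reference, the Send Frames example boils down to something like the minimal sketch below. I'm writing this from memory, so treat it as a rough sketch; the server name "Processing Syphon" is just an arbitrary label, and whatever name you pass in is what will show up in VPT:

import codeanticode.syphon.*;

PGraphics canvas;
SyphonServer server;

void setup() {
  // Syphon needs an OpenGL-backed renderer (P2D or P3D)
  size(640, 480, P2D);
  canvas = createGraphics(640, 480, P2D);
  // the name passed here is what VPT lists in its syph section
  server = new SyphonServer(this, "Processing Syphon");
}

void draw() {
  // draw something simple into the off-screen buffer
  canvas.beginDraw();
  canvas.background(0);
  canvas.stroke(255);
  canvas.noFill();
  canvas.ellipse(mouseX, mouseY, 50, 50);
  canvas.endDraw();

  // display the buffer locally and publish the same frame over Syphon
  image(canvas, 0, 0);
  server.sendImage(canvas);
}

Drawing into a PGraphics and passing it to sendImage() keeps the Syphon output independent of whatever else the sketch draws on screen.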

In VPT, scroll the layer list on the right towards the bottom until you find the syph section. You should then be able to choose the Processing Syphon server run by the Send Frames example:

(Screenshot: Syphon input in VPT)

(Screenshot: Processing sketch sending frames over Syphon)

Now you should have a clear idea of how to get frames from Processing into VPT.

Regarding the FaceOSC part, I'd recommend merging the SendFrames example with the FaceOSCReceiverClass example: read the FaceOSC data as before, but instead of setting up a Syphon Client, set up a Syphon Server.

For example:

//
// a template for receiving face tracking osc messages from
// Kyle McDonald's FaceOSC https://github.com/kylemcdonald/ofxFaceTracker
//
// this example includes a class to abstract the Face data
//
// 2012 Dan Wilcox danomatika.com
// for the IACD Spring 2012 class at the CMU School of Art
//
// adapted from from Greg Borenstein's 2011 example
// http://www.gregborenstein.com/
// https://gist.github.com/1603230
//
import codeanticode.syphon.*;
import oscP5.*;
OscP5 oscP5;

PGraphics canvas;
SyphonServer server;


// our FaceOSC tracked face data
Face face = new Face();

void setup() {
  size(640, 480, P2D);
  frameRate(30);

  oscP5 = new OscP5(this, 8338);

  canvas = createGraphics(640, 480, P2D);

  // Create a Syphon server to send frames out.
  server = new SyphonServer(this, "FaceOSC Processing Syphon");
}

void draw() {  
  canvas.beginDraw();
  canvas.background(255);
  canvas.stroke(0);

  if(face.found > 0) {
    canvas.translate(face.posePosition.x, face.posePosition.y);
    canvas.scale(face.poseScale);
    canvas.noFill();
    canvas.ellipse(-20, face.eyeLeft * -9, 20, 7);
    canvas.ellipse(20, face.eyeRight * -9, 20, 7);
    canvas.ellipse(0, 20, face.mouthWidth * 3, face.mouthHeight * 3);
    canvas.ellipse(-5, face.nostrils * -1, 7, 3);
    canvas.ellipse(5, face.nostrils * -1, 7, 3);
    canvas.rectMode(CENTER);
    canvas.fill(0);
    canvas.rect(-20, face.eyebrowLeft * -5, 25, 5);
    canvas.rect(20, face.eyebrowRight * -5, 25, 5);

    print(face.toString());
  }
  canvas.endDraw();
  image(canvas, 0, 0);
  server.sendImage(canvas);
}

// OSC CALLBACK FUNCTIONS

void oscEvent(OscMessage m) {
  face.parseOSC(m);
}

// a single tracked face from FaceOSC
class Face {

  // num faces found
  int found;

  // pose
  float poseScale;
  PVector posePosition = new PVector();
  PVector poseOrientation = new PVector();

  // gesture
  float mouthHeight, mouthWidth;
  float eyeLeft, eyeRight;
  float eyebrowLeft, eyebrowRight;
  float jaw;
  float nostrils;

  Face() {}

  // parse an OSC message from FaceOSC
  // returns true if a message was handled
  boolean parseOSC(OscMessage m) {

    if(m.checkAddrPattern("/found")) {
        found = m.get(0).intValue();
        return true;
    }      

    // pose
    else if(m.checkAddrPattern("/pose/scale")) {
        poseScale = m.get(0).floatValue();
        return true;
    }
    else if(m.checkAddrPattern("/pose/position")) {
        posePosition.x = m.get(0).floatValue();
        posePosition.y = m.get(1).floatValue();
        return true;
    }
    else if(m.checkAddrPattern("/pose/orientation")) {
        poseOrientation.x = m.get(0).floatValue();
        poseOrientation.y = m.get(1).floatValue();
        poseOrientation.z = m.get(2).floatValue();
        return true;
    }

    // gesture
    else if(m.checkAddrPattern("/gesture/mouth/width")) {
        mouthWidth = m.get(0).floatValue();
        return true;
    }
    else if(m.checkAddrPattern("/gesture/mouth/height")) {
        mouthHeight = m.get(0).floatValue();
        return true;
    }
    else if(m.checkAddrPattern("/gesture/eye/left")) {
        eyeLeft = m.get(0).floatValue();
        return true;
    }
    else if(m.checkAddrPattern("/gesture/eye/right")) {
        eyeRight = m.get(0).floatValue();
        return true;
    }
    else if(m.checkAddrPattern("/gesture/eyebrow/left")) {
        eyebrowLeft = m.get(0).floatValue();
        return true;
    }
    else if(m.checkAddrPattern("/gesture/eyebrow/right")) {
        eyebrowRight = m.get(0).floatValue();
        return true;
    }
    else if(m.checkAddrPattern("/gesture/jaw")) {
        jaw = m.get(0).floatValue();
        return true;
    }
    else if(m.checkAddrPattern("/gesture/nostrils")) {
        nostrils = m.get(0).floatValue();
        return true;
    }

    return false;
  }

  // get the current face values as a string (includes end lines)
  String toString() {
    return "found: " + found + "\n"
           + "pose" + "\n"
           + " scale: " + poseScale + "\n"
           + " position: " + posePosition.toString() + "\n"
           + " orientation: " + poseOrientation.toString() + "\n"
           + "gesture" + "\n"
           + " mouth: " + mouthWidth + " " + mouthHeight + "\n"
           + " eye: " + eyeLeft + " " + eyeRight + "\n"
           + " eyebrow: " + eyebrowLeft + " " + eyebrowRight + "\n"
           + " jaw: " + jaw + "\n"
           + " nostrils: " + nostrils + "\n";
  }

}

Note: this is merged code that I haven't been able to test right now, but it should get the point across.

Make sure the sketch is running before you launch VPT (otherwise, restart VPT).

(Screenshot: FaceOSC to Processing to Syphon to VPT)