I want to implement a kind of drum kit. For every hit I want to play a sound, so I need to detect each "hit" and its position. Before I start implementing a function that analyses positions and detects hits myself, I want to be sure there is no other solution: is there any event or gesture-detection mechanism that would allow me to detect that?
Kinect gesture recognition in SimpleOpenNI
There are 2 answers
I made a drum kit with the Kinect using the Hotpoint class below, which is a wonderful class for placing and triggering boxes in the Kinect's 3D space. Import the libraries:
import processing.opengl.*;
import SimpleOpenNI.*;
Use something like this inside your setup():

myTrigger = new Hotpoint(200, 10, 800, 150);   // x, y, z center in mm and the box edge length
Use the methods inside your draw():

if (myTrigger.currentlyHit()) {
  // play your sound here; the Hotpoint class itself has no audio
  println("trigger hit");
}
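On its own that snippet will never fire, because the class only counts points you feed it: each frame you have to pass every depth point to check(), then test for the hit, then call clear(). Here is a minimal sketch of that per-frame loop, assuming SimpleOpenNI's depthMapRealWorld() point cloud and Minim for the audio; the box coordinates and "snare.wav" are placeholders to replace with your own:

import processing.opengl.*;
import SimpleOpenNI.*;
import ddf.minim.*;

SimpleOpenNI kinect;
Minim minim;
AudioPlayer snare;   // placeholder sample, load whatever sound you like
Hotpoint myTrigger;

void setup() {
  size(1024, 768, OPENGL);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
  minim = new Minim(this);
  snare = minim.loadFile("snare.wav");
  myTrigger = new Hotpoint(200, 10, 800, 150);
  myTrigger.setThreshold(10);   // ignore a few stray depth points
}

void draw() {
  background(0);
  kinect.update();

  // Feed every real-world depth point to the hotpoint so it can
  // count how many fall inside its box.
  PVector[] depthPoints = kinect.depthMapRealWorld();
  for (int i = 0; i < depthPoints.length; i++) {
    myTrigger.check(depthPoints[i]);
  }

  // isHit() fires once on entry instead of on every frame the
  // hand stays inside the box.
  if (myTrigger.isHit()) {
    snare.rewind();
    snare.play();
    println("trigger hit");
  }

  myTrigger.draw();
  myTrigger.clear();   // reset the count; must run every frame
}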
And here is the Hotpoint class itself:
class Hotpoint {
  PVector center;        // center of the box in real-world coordinates
  color fillColor;
  color strokeColor;
  int size;              // edge length of the box
  int pointsIncluded;    // depth points found inside the box this frame
  int maxPoints;         // point count at which the box draws fully opaque
  boolean wasJustHit;    // true if the box was already hit last frame
  int threshold;         // minimum points inside before it counts as a hit

  Hotpoint(float centerX, float centerY, float centerZ, int boxSize) {
    center = new PVector(centerX, centerY, centerZ);
    size = boxSize;
    pointsIncluded = 0;
    maxPoints = 1000;
    threshold = 0;
    fillColor = strokeColor = color(random(255), random(255), random(255));
  }

  void setThreshold(int newThreshold) {
    threshold = newThreshold;
  }

  void setMaxPoints(int newMaxPoints) {
    maxPoints = newMaxPoints;
  }

  void setColor(float red, float green, float blue) {
    fillColor = strokeColor = color(red, green, blue);
  }

  // Call this for every point in the depth cloud; it counts the
  // points that fall inside the box.
  boolean check(PVector point) {
    boolean result = false;
    if (point.x > center.x - size/2 && point.x < center.x + size/2) {
      if (point.y > center.y - size/2 && point.y < center.y + size/2) {
        if (point.z > center.z - size/2 && point.z < center.z + size/2) {
          result = true;
          pointsIncluded++;
        }
      }
    }
    return result;
  }

  void draw() {
    pushMatrix();
    translate(center.x, center.y, center.z);
    // The fill gets more opaque the more points are inside the box.
    fill(red(fillColor), green(fillColor), blue(fillColor),
      255 * percentIncluded());
    stroke(red(strokeColor), green(strokeColor), blue(strokeColor), 255);
    box(size);
    popMatrix();
  }

  float percentIncluded() {
    return map(pointsIncluded, 0, maxPoints, 0, 1);
  }

  // True on every frame in which enough points are inside the box.
  boolean currentlyHit() {
    return (pointsIncluded > threshold);
  }

  // True only on the first frame of a hit, so a strike triggers once.
  boolean isHit() {
    return currentlyHit() && !wasJustHit;
  }

  // Call once per frame, after checking, to reset the point count.
  void clear() {
    wasJustHit = currentlyHit();
    pointsIncluded = 0;
  }
}
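The design choice worth noting: check() only tallies points, and clear() both resets the tally and records wasJustHit, which is what turns the continuous "hand is inside the box" state into an edge-triggered isHit() that fires once per strike. Raising the threshold with setThreshold() filters out stray depth noise, so a single errant point doesn't trigger a drum.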
As far as I know, there is no "event" defined natively other than the stream callbacks, which are called when you receive data such as joint positions and the depth image. That should be enough to get you started.
I would take a look at https://code.google.com/p/kineticspace/ to see what to expect and how to proceed with your own code.
Once you manage to get the skeleton data, find where the hand is, put a threshold on its position, and track it for a certain amount of time to see whether its movement path fits your pattern for a particular gesture, such as "translation in the y direction for x amount of seconds". That gives you very simple "hit" gesture detection, as in the sketch below. It can get as complex as you need, but at the basic level there is not much to it in terms of what you receive from the library side.
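To make that concrete, here is a minimal sketch of such a hit detector, assuming SimpleOpenNI's skeleton API (getJointPositionSkeleton with the SKEL_RIGHT_HAND constant); the user-tracking callbacks differ between library versions, and the 20 mm-per-frame downward-velocity threshold is a made-up value you would tune:

import SimpleOpenNI.*;

SimpleOpenNI kinect;
float prevHandY;
float hitVelocity = 20;   // hypothetical threshold: mm of downward motion per frame

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
  kinect.enableUser();   // older versions: enableUser(SimpleOpenNI.SKEL_PROFILE_ALL)
}

void draw() {
  kinect.update();
  image(kinect.depthImage(), 0, 0);

  int[] users = kinect.getUsers();
  for (int i = 0; i < users.length; i++) {
    if (kinect.isTrackingSkeleton(users[i])) {
      PVector hand = new PVector();
      float confidence = kinect.getJointPositionSkeleton(users[i],
          SimpleOpenNI.SKEL_RIGHT_HAND, hand);
      if (confidence > 0.5) {
        // In real-world coordinates y points up, so a fast drop
        // in y between frames reads as a drum hit.
        if (prevHandY - hand.y > hitVelocity) {
          println("hit at " + hand);   // hand holds the 3D hit position
        }
        prevHandY = hand.y;   // single-user sketch; track per user for more
      }
    }
  }
}

// SimpleOpenNI calls this when a user appears; start tracking right away.
void onNewUser(SimpleOpenNI context, int userId) {
  context.startTrackingSkeleton(userId);
}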
Good luck.