Classify gestures from IMU data using machine learning


I have an interesting problem to solve. The aim is to classify gestures recorded as sensor data (acceleration, roll/pitch/yaw, etc.). Every gesture is a fixed-size array of values. The objective is to use supervised learning algorithms to classify a gesture.

Here are some details about the nature of the data. Each gesture consists of:
1. 40 rows
2. 6 columns (Acceleration X, Y, Z; Angular Velocity X, Y, Z)

Each gesture is therefore a 40 x 6 matrix of float values of sensor data.

Here is a link to sample data https://docs.google.com/spreadsheets/d/1tW1xJqnNZa1PhVDAE-ieSVbcdqhT8XfYGy8ErUEY_X4/edit?usp=sharing

This file has 16 gestures, all belonging to the same class (class 1). There are other columns as well, but to begin with let's use only the six listed above.

Let's say there are ten types of gestures. Once the classifier is trained, the model should predict which of the ten classes a new gesture belongs to. The number of classes is fixed, and every data point has the same shape (40 x 6).
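Because every gesture has the same fixed shape, one common approach is to flatten each 40 x 6 gesture into a single 240-element feature vector and feed the resulting matrix to a standard sklearn classifier. The snippet below is only a minimal sketch of that idea; the random placeholder data, the label layout, and the choice of RandomForestClassifier are assumptions for illustration, not part of the original data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Assume `gestures` holds the recordings as an array of shape (n_samples, 40, 6)
# and `labels` holds one class id (1..10) per gesture.
n_samples = 160                                   # hypothetical: 16 recordings x 10 classes
gestures = np.random.rand(n_samples, 40, 6)       # placeholder for real sensor data
labels = np.repeat(np.arange(1, 11), 16)          # placeholder class labels

# Flatten each 40 x 6 gesture into one 240-dimensional feature vector.
X = gestures.reshape(n_samples, -1)               # shape: (n_samples, 240)
y = labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```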

Following are my questions to the community:

  1. How much data does the learning procedure need for the model to perform well?
  2. How do I convert this data into an intermediate form that can be fed into a classifier?

Any suggestions on how to approach the problem will be helpful. Thanks in advance.

So far I have tried the approach from Create Pandas dataframe with list as values in rows, and I am stuck on how to convert this into a suitable format that can be fed into a classifier. I will be using sklearn for the task.
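Rather than storing lists inside DataFrame cells, one way to get from a raw table like the linked spreadsheet to a per-gesture feature matrix is to keep one sensor reading per row and group every 40 consecutive rows into a single sample. The sketch below assumes a CSV export with the six sensor columns plus a label column repeated on every row of a gesture; the file name and column names are hypothetical and would need to match the actual data.

```python
import numpy as np
import pandas as pd

# Hypothetical column names; adjust to match the actual export of the spreadsheet.
sensor_cols = ["AccX", "AccY", "AccZ", "GyroX", "GyroY", "GyroZ"]

df = pd.read_csv("gestures.csv")        # one sensor reading per row, 40 rows per gesture

rows_per_gesture = 40
values = df[sensor_cols].to_numpy()
n_gestures = len(df) // rows_per_gesture

# Reshape the flat table into (n_gestures, 40, 6), then flatten each gesture
# into a 240-element feature vector for sklearn.
gestures = values[: n_gestures * rows_per_gesture].reshape(
    n_gestures, rows_per_gesture, len(sensor_cols)
)
X = gestures.reshape(n_gestures, -1)

# If the file also carries a class label (assumed here to be a "label" column
# repeated on every row of a gesture), take one label per 40-row block.
y = df["label"].to_numpy()[::rows_per_gesture][:n_gestures]
```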
