I'm almost there? 4 DOF robot arm forward kinematics


I started a project to learn forward kinematics for a custom 4 DOF robot arm I built, with the eventual goal of moving on to inverse kinematics (fingers crossed). The part I cannot seem to grasp is the rotation axis; I'm sure it will click once I get past whatever mental block is stopping me.

I set out to create a Joint class that identifies each joint's length and rotation axis. Again, I can't wrap my head around the rotation axis. Do you have any suggestions for getting past this mental block?

  public class Joint {

    public double Length { get; }
    public Vector3 RotationAxis { get; }

    public Joint(double length, Vector3 rotationAxis) {

      Length = length;
      RotationAxis = Vector3.Normalize(rotationAxis);
    }
  }

The heart of the class loops through each joint, carrying the rotation state forward while accumulating the Vector3 offset for the end effector. But I feel like I am misusing Quaternions, or else the axis I pass to Transform is incorrect. Most likely it's the axis, because that's what I can't get my head around.

using System;
using System.Collections.Generic;
using System.Numerics;

namespace InverseKinematicsMover {

  public class Joint {

    public double Length { get; }
    public Vector3 RotationAxis { get; }

    public Joint(double length, Vector3 rotationAxis) {

      Length = length;
      RotationAxis = Vector3.Normalize(rotationAxis);
    }
  }

  public class Forward2 {

    public List<Joint> Joints = new List<Joint>();

    public Forward2() {

      Joints = new List<Joint>();
    }

    float degreeToRadian(int angle) {

      // because 90 is the center for servos
      int angleDegrees = angle - 90;

      return (float)Math.PI * angleDegrees / 180.0f;
    }

    public Vector3 CalculateEndEffectorPosition(List<int> jointAngles) {

      Vector3 endEffectorPosition = Vector3.Zero;
      Quaternion cumulativeRotation = Quaternion.Identity;

      for (int i = 0; i < Joints.Count; i++) {

        // continue to keep track of the rotation as we progress through joints
        // ie 45 degrees of the current joint is relative to 90 degrees of previous joint
        cumulativeRotation += Quaternion.CreateFromAxisAngle(Joints[i].RotationAxis, degreeToRadian(jointAngles[i]));
        
        // add to the end effector with the quaternion transformation of this joint's rotation axis
        endEffectorPosition += (float)Joints[i].Length * Vector3.Transform(Joints[i].RotationAxis, cumulativeRotation);
      }

      return endEffectorPosition;
    }
  }
}

Lastly, my test program uses each servo's value (1-180):

      Forward2 robotArm = new Forward2();

      // Base joint rotates the arm
      robotArm.Joints.Add(new Joint(0, new Vector3(0, 1, 0)));

      // Joints extend the robot arm
      robotArm.Joints.Add(new Joint(128.0, new Vector3(1, 0, 0)));
      robotArm.Joints.Add(new Joint(148.0, new Vector3(1, 0, 0)));
      robotArm.Joints.Add(new Joint(146.0, new Vector3(1, 0, 0)));

      // Specify joint angles (in degrees) for each joint
      var jointAngles = new List<int> {
        GetServoPosition(0),
        GetServoPosition(1),
        GetServoPosition(2),
        GetServoPosition(3),
      };

      // Calculate the end effector position
      Vector3 endEffectorPosition = robotArm.CalculateEndEffectorPosition(jointAngles);

      // Display the end effector position
      log($"End Effector Position: X={endEffectorPosition.X}, Y={endEffectorPosition.Y}, Z={endEffectorPosition.Z}");

I get strange behavior in the logged output. Whenever the base rotation (X) is non-zero, the Y and Z values change along with it, almost as if the X value were acting as a multiplier on them.

I have tried configuring the Joint definitions with different axes. The first joint is a base rotation, as seen on common robot arms. The remaining joints all rotate the arm vertically (up and down).

I also wondered whether the accumulated end effector value needs the length multiplied into the rotation axis before the transform:

  endEffectorPosition += Vector3.Transform(Joints[i].RotationAxis * (float)Joints[i].Length, cumulativeRotation);

That produces similar results: the Y and Z values still depend heavily on the X (base) rotation and still appear to scale with it.

1 Answer

Answer by Ponderer (accepted):

Ah, after some further research I figured out my issue with the axis. The arm extends upward from the base when all joint values are 0, which means each link lies along the Y axis relative to the previous one.

Also, the first joint had been configured to rotate about the Z axis, which is incorrect. It should be the Y axis, since the base rotates the arm around it.

The main issue was transforming the rotation axis itself. Because the links extend along the Y axis out of the base, the line merely needs to change to:

endEffectorPosition += (float)Joints[i].Length * Vector3.Transform(Vector3.UnitY, cumulativeRotation);
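One detail the single changed line above doesn't show: System.Numerics quaternions compose by multiplication, not addition, so the accumulation line also needs `*=` rather than `+=`. Adding unit quaternions yields a non-unit quaternion, which scales the transformed vectors and produces exactly the "multiplier" symptom described in the question. Below is a minimal self-contained sketch of the corrected loop, reusing the joint lengths and axes from the question; the class and method names here are my own, not from the original post.

```csharp
using System;
using System.Collections.Generic;
using System.Numerics;

public class Joint {
    public double Length { get; }
    public Vector3 RotationAxis { get; }

    public Joint(double length, Vector3 rotationAxis) {
        Length = length;
        RotationAxis = Vector3.Normalize(rotationAxis);
    }
}

public static class ForwardKinematics {
    // Servos are centered at 90, so a servo value of 90 maps to 0 rad.
    static float DegreeToRadian(int angle) =>
        (float)Math.PI * (angle - 90) / 180.0f;

    public static Vector3 EndEffector(List<Joint> joints, List<int> angles) {
        var position = Vector3.Zero;
        var cumulative = Quaternion.Identity;

        for (int i = 0; i < joints.Count; i++) {
            // Compose rotations by multiplication: each joint's local
            // rotation is applied in the frame of the joints before it.
            cumulative *= Quaternion.CreateFromAxisAngle(
                joints[i].RotationAxis, DegreeToRadian(angles[i]));

            // Each link extends along +Y in its local frame, so transform
            // UnitY (not the rotation axis) by the cumulative rotation.
            position += (float)joints[i].Length
                      * Vector3.Transform(Vector3.UnitY, cumulative);
        }
        return position;
    }
}
```

With all servos at 90 (the neutral pose) the arm points straight up and this returns (0, 128+148+146, 0) = (0, 422, 0), and rotating only the base leaves that straight-up position unchanged, as it should, instead of bleeding into the other axes.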