The Bias in my Single Layer Perceptron doesn't work



I am making a single-layer perceptron that learns to detect whether a point is above or below a given line (it outputs 1 or -1). It works fine when the line's y-intercept is 0 and there is no bias, but when I incorporate a bias and change the y-intercept (in this case to -150), the bias weight keeps decreasing past -150 and the perceptron can't solve it. How do I fix this?

(Don't worry about the details of the JFrame and the dots; some methods that I know work fine were deleted from this post to simplify the code.)

The main class:

import java.awt.Color;
import java.util.ArrayList;
import javax.swing.JFrame;
import javax.swing.JLabel;

public class Runner {

    ArrayList<JLabel> dots = new ArrayList<JLabel>();
    JFrame frame = new JFrame();
    Brain brain = new Brain();

    public Runner() {

        frame.setBounds(0, 0, 1400, 740);
        frame.setLayout(null);
        frame.setVisible(true);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);

        addDots(1000, 1400, 740); // 1000 dots are added with x from 0-1400 and y from 0-740
    }

    public static void main(String[] args) {
        new Runner();
    }

    public void trainBrain(int dotsInd) {

        JLabel dot = dots.get(dotsInd);

        brain.guess(dot.getX(), dot.getY());
        float er = brain.getError(getTarget(dot));
        brain.changeWeights(er);

        // Green if the guess was correct (error 0), red otherwise
        if (er == 0) { dot.setBackground(Color.GREEN); }
        else { dot.setBackground(Color.RED); }
    }

    // Returns the target label relative to the line y = x/2 - 150
    // (equivalently, x = 2y + 300)
    public int getTarget(JLabel dot) {

        if (dot.getX() > (2 * dot.getY() + 300)) { return 1; }
        return -1;
    }

}

The Brain class (The Perceptron):

public class Brain {

    int inputOne = 0; // x of a dot
    int inputTwo = 0; // y of a dot
    float weightOne = (float) (Math.random() * 3 - 1); // random initial weight in [-1, 2)
    float weightTwo = (float) (Math.random() * 3 - 1);
    int output = 0;
    float biasWeight = (float) (Math.random() * 3 - 1); // weight on a constant bias input of 1

    public int guess(int iOne, int iTwo) {

        inputOne = iOne;
        inputTwo = iTwo;

        output = activationFunc(iOne * weightOne + iTwo * weightTwo + biasWeight);
        return output;
    }

    public float getError(int target) {
        return target - output; // -2, 0, or 2, since target and output are each 1 or -1
    }

    public void changeWeights(float error) {

        weightOne += error * inputOne * 1; // the trailing 1 acts as a learning rate
        weightTwo += error * inputTwo * 1;
        biasWeight += error * 1; // bias input is implicitly 1
    }

    public int activationFunc(float d) {

        if (d > 0) return 1;
        return -1;
    }

}


1 Answer

Answered by John Angland:

Based on your code, the bias will only stop diverging if, at some point, positive error values start occurring at the same rate as negative error values. Every update adds the raw error (±2) to the bias weight, so as long as misclassifications on one side of the boundary outnumber those on the other, the bias keeps drifting in that direction.
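
The standard perceptron update rule also scales each step by a small learning rate, and keeping the inputs on a scale comparable to the bias input (which is effectively 1) lets the bias weight move at a pace commensurate with the other weights. The sketch below is a minimal illustration of that idea, not code from the question or the answer; the learning rate of 0.01 and the division by the frame dimensions (1400 and 740) are assumed values chosen for this example:

public class ScaledBrain {

    float inputOne = 0; // x of a dot, scaled to roughly [0, 1]
    float inputTwo = 0; // y of a dot, scaled to roughly [0, 1]
    float weightOne = (float) (Math.random() * 2 - 1); // random in [-1, 1)
    float weightTwo = (float) (Math.random() * 2 - 1);
    float biasWeight = (float) (Math.random() * 2 - 1);
    int output = 0;

    static final float LEARNING_RATE = 0.01f; // assumed value; tune as needed

    public int guess(int x, int y) {
        inputOne = x / 1400f; // frame width from the question
        inputTwo = y / 740f;  // frame height from the question
        output = (inputOne * weightOne + inputTwo * weightTwo + biasWeight) > 0 ? 1 : -1;
        return output;
    }

    public float getError(int target) {
        return target - output; // -2, 0, or 2
    }

    public void changeWeights(float error) {
        // Each step now moves a weight by at most 2 * LEARNING_RATE,
        // and the bias update is on the same scale as the input updates.
        weightOne += LEARNING_RATE * error * inputOne;
        weightTwo += LEARNING_RATE * error * inputTwo;
        biasWeight += LEARNING_RATE * error; // bias input is implicitly 1
    }
}

With scaled inputs, the decision boundary weightOne * x' + weightTwo * y' + biasWeight = 0 can still represent the line y = x/2 - 150, but all three weights end up with similar magnitudes, so the positive and negative errors can balance out instead of the bias drifting indefinitely.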