sklearn Perceptron not able to classify NAND function


I am new to machine learning. I was implementing a Perceptron to see which logic gate functions are linearly separable. For the NAND gate, I got an unexpected result. For inputs X and output y, the model gives a score of 0.5, and the predicted y is [1, 0, 0, 0] instead of [1, 1, 1, 0]. Can anyone help me figure out what's wrong with my model?

import numpy as np
from sklearn.linear_model import Perceptron

clf = Perceptron(max_iter=100, random_state=0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([1, 1, 1, 0])
clf.fit(X, y)
score = clf.score(X, y)
print(score)
a = clf.predict(X)
print(a)

I am getting this output for the code:

0.5
[1 0 0 0]

I tried calculating z (net input) from the weights and bias of the model like this:

w = clf.coef_
b = clf.intercept_
z = np.dot(X, w.T) + b
print(z.T)

which gave me the following result:

[[ 2.  0.  0. -2.]]

1 Answer

Answered by Muhammed Yunus

The perceptron converges iteratively, so the low accuracy might be because it is not getting enough samples. You can give it more examples as follows:

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]] * 2)
y = np.array([1, 1, 1, 0] * 2)

This repeats your original examples, so the model trains on two copies of each sample.
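A minimal sketch of retraining on the duplicated data (reusing the Perceptron setup from your question) might look like this:

import numpy as np
from sklearn.linear_model import Perceptron

# Repeat the NAND truth table so each training epoch sees every example twice
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]] * 2)
y = np.array([1, 1, 1, 0] * 2)

clf = Perceptron(max_iter=100, random_state=0)
clf.fit(X, y)

print(clf.score(X, y))   # accuracy on the (duplicated) training set
print(clf.predict(X))    # predictions for all eight rows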

Regarding your second point about the matrix operation: the perceptron applies a step function at the very end that maps strictly positive scores to 1 and everything else to 0. After you perform the matrix operation, push the result through that step function (np.where helps), and you should get the same classification output as the perceptron: manual_predict = np.where(X @ w.T + b > 0, 1, 0).
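Putting that together, a minimal sketch of the manual prediction (assuming the same clf, X, and y as in your question) would be:

import numpy as np
from sklearn.linear_model import Perceptron

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([1, 1, 1, 0])  # NAND truth table

clf = Perceptron(max_iter=100, random_state=0)
clf.fit(X, y)

# Linear part of the model: z = X.w^T + b
w = clf.coef_
b = clf.intercept_
z = X @ w.T + b

# Step function: strictly positive scores -> class 1, everything else -> class 0
manual_predict = np.where(z > 0, 1, 0).ravel()

print(manual_predict)   # manual classification
print(clf.predict(X))   # should match sklearn's own prediction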