Can ReLU Replace a Sigmoid Activation Function in a Neural Network?


I'm new to this, and I'm trying to replace the sigmoid activation function in the following simple NN with ReLU. Can I do that? I've tried replacing the sigmoid function (my attempt is sketched after the code below), but it's not working. The network should learn an AND gate: for input (0, 0) the output should be 0, and only (1, 1) should give 1.

import numpy as np

# sigmoid function (with deriv=True, x is expected to already be the sigmoid output)
def nonlin(x, deriv=False):
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

# input dataset
X = np.array([[0, 0],
              [0, 1],
              [1, 0],
              [1, 1]])

# output dataset: AND of the two inputs
y = np.array([[0, 0, 0, 1]]).T

# seed random numbers to make calculation
# deterministic (just a good practice)
np.random.seed(1)

# initialize weights randomly with mean 0
syn0 = 2*np.random.random((2, 1)) - 1

for _ in range(10000):

    # forward propagation
    l0 = X
    l1 = nonlin(np.dot(l0,syn0))

    # how much did we miss?
    l1_error = y - l1

    # scale the error by the slope of the sigmoid at l1
    l1_delta = l1_error * nonlin(l1, True)

    # update the weights
    syn0 += np.dot(l0.T, l1_delta)
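
For reference, my ReLU replacement looked roughly like this (a sketch from memory; nonlin_relu is just the name I gave it, and the derivative is computed from the layer output, the same way the sigmoid version does it):

# ReLU attempt, meant as a drop-in replacement for nonlin above
def nonlin_relu(x, deriv=False):
    if deriv:
        # slope of ReLU, taken from the layer output:
        # 1 where the output is positive, 0 elsewhere
        return (x > 0).astype(float)
    return np.maximum(0, x)

I swapped nonlin for nonlin_relu in both the forward pass and the delta line, but the outputs never settle near the AND targets.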
