When building a Sequential model, I noticed there is a difference between adding a relu layer and adding a LeakyReLU layer.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LeakyReLU

test = Sequential()
test.add(Dense(1024, activation="relu"))  # activation passed as a string
test.add(LeakyReLU(0.2))                  # activation added as its own layer
- Why can't we add the layer with activation="LeakyReLU"? (LeakyReLU is not a string that Keras can work with.)
- When adding a relu layer, we set the number of units (1024 in my example). Why can't we do the same for LeakyReLU? (See the sketch after this list.)
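For context, here is a minimal sketch (assuming tf.keras) of the pattern as I currently understand it: the Dense layer carries the units, and LeakyReLU is applied element-wise to Dense's output, which would explain why it takes no unit count of its own.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LeakyReLU

# Dense owns the 1024 units; LeakyReLU transforms Dense's output
# element-wise, so it never needs a unit count itself.
model = Sequential()
model.add(Dense(1024))    # linear output (no built-in activation)
model.add(LeakyReLU(0.2)) # slope 0.2 for negative inputs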
I was sure that the only difference between relu and LeakyReLU was the behavior of the activation function itself, but it seems to be more than that.
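To make that behavioral difference concrete, here is a small NumPy sketch of the two functions (the 0.2 slope matches my example above):

import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.2):
    # Identical to relu for x > 0; scales negatives by alpha instead of zeroing them.
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))        # [0.  0.  0.  1.5]
print(leaky_relu(x))  # [-0.4 -0.1  0.   1.5]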