Zero initialiser for biases using get_variable in tensorflow


The code I'm modifying uses tf.get_variable for weight variables and tf.Variable for the bias. After some searching, it seems that get_variable should generally be preferred because it supports variable sharing/reuse. So I tried to change the bias variable to get_variable, but I can't get it to work.

Original: tf.Variable(tf.zeros([128]), trainable=True, name="b1")

My attempt: tf.get_variable(name="b1", shape=[128], initializer=tf.zeros_initializer(shape=[128]))

I get an error saying that the shape should not be specified for this initializer, but removing the shape argument then throws an error about missing arguments.

I'm very new to tf so I'm probably misunderstanding something fundamental here. Thanks for the help in advance :)

1 Answer

Accepted answer, by user1454804:

The following should work: tf.get_variable(name="b1", shape=[128], initializer=tf.zeros_initializer()). Note that tf.zeros_initializer() takes no shape argument; get_variable passes its own shape argument to the initializer when the variable is created, which is why specifying the shape in both places raises an error.
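
For context, here is a minimal sketch of a layer's weight and bias both created with get_variable (TF 1.x-style API; the scope and variable names are just illustrative):

    import tensorflow as tf

    with tf.variable_scope("layer1"):
        # Weight: shape goes to get_variable, the initializer only sets the distribution.
        w1 = tf.get_variable(name="w1", shape=[784, 128],
                             initializer=tf.truncated_normal_initializer(stddev=0.1))
        # Bias: zeros_initializer() takes no shape; get_variable supplies [128] when creating it.
        b1 = tf.get_variable(name="b1", shape=[128],
                             initializer=tf.zeros_initializer())

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        print(sess.run(b1)[:5])  # -> [0. 0. 0. 0. 0.]

Because both variables are created through get_variable inside a variable_scope, they can later be reused elsewhere by reopening the scope with reuse=True, which is the sharing benefit mentioned in the question.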