I'm using a pre-trained VGG-16 net to transform images into features. I can do this fine sequentially. However, I'd like to do this in parallel and I'm not sure how to properly construct the batch.
Specifically, let's say I load up 16 images that are held in a numpy array (i.e. 16x224x224x3). I want to transform these in parallel. Here's what I have so far:
import tensorflow as tf
from tensorflow.contrib.slim.nets import vgg  # TF-Slim's VGG definition (import path assumed)

slim = tf.contrib.slim
vgg_16, vgg_arg_scope = vgg.vgg_16, vgg.vgg_arg_scope

checkpoint_file = './vgg_16.ckpt'
input_tensor = tf.placeholder(tf.float32, shape=(None, 224, 224, 3), name='input_image')

# Rescale pixel values from [0, 255] to [-1, 1]
scaled_input_tensor = tf.scalar_mul((1.0 / 255), input_tensor)
scaled_input_tensor = tf.subtract(scaled_input_tensor, 0.5)
scaled_input_tensor = tf.multiply(scaled_input_tensor, 2.0)

arg_scope = vgg_arg_scope()
with slim.arg_scope(arg_scope):
    _, end_points = vgg_16(scaled_input_tensor, is_training=False)

sess = tf.Session()
saver = tf.train.Saver()
saver.restore(sess, checkpoint_file)

images = get_16_images()  # arbitrary function; images is 16x224x224x3
images = tf.convert_to_tensor(images)
batch = tf.train.batch(images, 2, num_threads=6, enqueue_many=True,
                       allow_smaller_final_batch=True)
sess.run(end_points['vgg_16/fc7'], feed_dict={input_tensor: batch})  # Error
I end up getting an error:
*** ValueError: setting an array element with a sequence.
Can anyone help me here? The batch tutorials seem to be focused on creating batches while reading in data, but I've already read in the data and simply want to create a batch to parallelize the network's computation over the different images.
OK, this was a stupid question, but I think other TensorFlow novices might run into the same issue.
If it's not clear: feed_dict takes in numpy arrays, not tensors (the error above comes from feeding it the tf.Tensor returned by tf.train.batch). So you don't construct an explicit batch tensor at all; simply pass in multiple numpy arrays and TensorFlow will handle it. Note where I defined the input tensor: its shape is (None, 224, 224, 3), so TensorFlow already knows the input will be a "batch", so to speak. The variable images (from get_16_images())
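For concreteness, here's the fixed version of the last line (a minimal sketch; it reuses sess, end_points, and input_tensor from the question, and get_16_images is the same arbitrary loader):

# Feed the numpy array directly: no tf.convert_to_tensor, no tf.train.batch.
images = get_16_images()  # numpy array of shape (16, 224, 224, 3)
features = sess.run(end_points['vgg_16/fc7'],
                    feed_dict={input_tensor: images})
# With 224x224 inputs, slim's vgg_16 implements fc7 as a 1x1 conv,
# so features has shape (16, 1, 1, 4096); squeeze to get (16, 4096).
features = features.squeeze(axis=(1, 2))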
is already exactly what I need. You can confirm this is parallel by checking CPU allocation: Python's CPU usage spiked up to 650% for that batch.
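If you want explicit control over how many threads TensorFlow uses for this, the session config exposes that (the thread counts below are illustrative, not from my run):

# Optional: tune TensorFlow's thread pools; by default it uses all cores.
config = tf.ConfigProto(intra_op_parallelism_threads=8,  # threads inside a single op (e.g. a conv)
                        inter_op_parallelism_threads=2)  # ops that may run concurrently
sess = tf.Session(config=config)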