Code:
import tensorflow as tf
from keras.applications import InceptionV3
# fgm and batch_eval are adversarial-attack helpers defined elsewhere

model = InceptionV3(weights="imagenet")
shape = (None, image_size, image_size, num_channels)
x = tf.placeholder(tf.float32, shape=shape)

# model.predict is called on the symbolic placeholder x, which raises the error below
adv_x, grad_x = fgm(x, model, model.predict(x), y=y, targeted=True, eps=0, clip_min=-0.5, clip_max=0.5)
adv_, grad_ = batch_eval(sess, [x, y], [adv_x, grad_x], [inputs, targets], args={'batch_size': args['batch_size']})
model.predict(x)
Error:
File "/u/.../env/lib/python3.5/site-packages/keras/engine/training.py", line 1594, in predict
batch_size=batch_size, verbose=verbose)
File "/u/.../env/lib/python3.5/site-packages/keras/engine/training.py", line 1208, in _predict_loop
batches = _make_batches(samples, batch_size)
File "/u/.../env/lib/python3.5/site-packages/keras/engine/training.py", line 364, in _make_batches
num_batches = int(np.ceil(size / float(batch_size)))
TypeError: unsupported operand type(s) for /: 'Dimension' and 'float'
I can use model.predict on actual images, but I end up with this error when the input is a tf.placeholder or tf.Variable. Can anyone help me debug this error?
Keras' Model.predict expects a NumPy array as input data, not a symbolic tensor such as a tf.placeholder or tf.Variable, which is why it works on actual images but fails here. You may also want to pass an explicit batch_size value, unless your batch size is 32.
From the documentation: