I built and trained an LSTM model for a regression task and everything works fine. I would like to use the fast_gradient_method function from CleverHans (or any other CleverHans attack, since the issue is the same for all of them).

I don't understand how I am supposed to pass the model to the function. From the CleverHans documentation:

:param model_fn: a callable that takes an input tensor and returns the model logits

Whatever input I give to the function (the model itself, the weights I get from get_weights, the weights of the layer right before the dense layer, ...), I get this error:

TypeError: 'module' object is not callable

What would be the correct input to make it work?

In the only working example I found, the following line of code is used to define logits_model, which is then passed as model_fn, but I still get the error above:

logits_model = tf.keras.Model(model.input,model.layers[-1].output)

1 Answer

Accepted answer (khada):

To pass a valid model, it should be defined in the following way (this is just an example):

"make" is only needed for model.summary() to work, I found the code in another SO post that I can't seem to find right now

import tensorflow as tf
from tensorflow.keras import Model
from tensorflow.keras.layers import GRU, Dense

# hidden_size1, hidden_size2, K and input_size are assumed to be defined elsewhere
class modSubclass(Model):
    def __init__(self):
        super(modSubclass, self).__init__()
        # GRU layers are used here as an example; LSTM layers work the same way
        self.lstm1 = GRU(hidden_size1, activation='relu', return_sequences=True, input_shape=(input_size, 1))
        self.lstm2 = GRU(hidden_size2, activation='relu')
        self.dense1 = Dense(K, activation='relu')

    def call(self, x):
        x = self.lstm1(x)
        x = self.lstm2(x)
        x = self.dense1(x)
        return x

    def make(self, input_shape):
        '''
        This method makes the command "model.summary()" work.
        input_shape: (H,W,C), do not specify batch B
        '''
        x = tf.keras.layers.Input(shape=input_shape)
        model = tf.keras.Model(inputs=[x], outputs=self.call(x), name='actor')
        print(model.summary())
        return model
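
Because the model is now a subclass of tf.keras.Model, the instance itself is a callable that takes an input tensor and returns the model output, which is exactly what model_fn expects, so you can pass it directly to the attack. Below is a minimal sketch of how the call could look; the import path cleverhans.tf2.attacks.fast_gradient_method refers to the TF2 attacks in recent CleverHans releases and may differ in your version, and x_test, eps=0.1 and the hyperparameters are only placeholders:

import numpy as np
import tensorflow as tf
from cleverhans.tf2.attacks.fast_gradient_method import fast_gradient_method

model = modSubclass()
model.make((input_size, 1))          # optional: builds the model so model.summary() works
model.compile(optimizer='adam', loss='mse')
# model.fit(x_train, y_train, ...)   # train as usual

# Pass the model instance itself as model_fn: it is a callable that maps an
# input tensor to the model output.
x_test = tf.convert_to_tensor(np.random.rand(32, input_size, 1), dtype=tf.float32)  # placeholder batch
x_adv = fast_gradient_method(model, x_test, 0.1, np.inf)

The key point is that model_fn must be a callable; a subclassed Keras model instance (or a functional tf.keras.Model) satisfies that, whereas a module, a list of weights, or the output of get_weights does not.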