Unable to load_model due to 'unknown activation_function: LeakyReLU'


I have constructed, fitted, and saved the following model:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras import preprocessing
from tensorflow.keras.models import Sequential
import config
from tensorflow.keras import applications  

# input_shape and num_classes are defined elsewhere in the training script
model = Sequential()
model.add(layers.Flatten(input_shape=input_shape.shape[1:]))
model.add(layers.Dense(100, activation=keras.layers.LeakyReLU(alpha=0.3)))
model.add(layers.Dropout(0.5))
model.add(layers.Dense(50, activation=keras.layers.LeakyReLU(alpha=0.3)))
model.add(layers.Dropout(0.3))
model.add(layers.Dense(num_classes, activation='softmax'))
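For context, the model was compiled, fit, and saved along these lines (a rough sketch; the training data variables and the file name below are placeholders):

model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(X_train, y_train, epochs=10, validation_split=0.1)  # X_train/y_train are placeholders
model.save('model.h5')  # placeholder file name; this is the file that load_model later fails on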

I am using the load_model function for evaluation. I had no trouble with it until now, but I am suddenly getting the following error:

ValueError: Unknown activation function: LeakyReLU

Are there any syntactic changes I should make to the architecture, or is there a deeper issue here? Any advice would be appreciated, as I have already tried setting some custom objects as described here: https://github.com/BBQuercus/deepBlink/issues/107
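The custom-objects attempt looked roughly like this (a minimal sketch; the model path is a placeholder), and it did not resolve the error:

from tensorflow import keras
from tensorflow.keras.models import load_model

# register LeakyReLU as a custom object when loading ('model.h5' is a placeholder path)
model = load_model('model.h5', custom_objects={'LeakyReLU': keras.layers.LeakyReLU})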

Edit: My imports in the file where I am calling load_model are the following:

import config
import numpy as np
from tensorflow.keras.preprocessing.image import img_to_array, load_img 
from models.create_image_model import make_vgg
import argparse
from tensorflow.keras.models import load_model
import time
from tensorflow import keras
from tensorflow.keras import layers

1 Answer

Answer by desertnaut (accepted):

There seem to be some issues when saving and loading models with such "non-standard" activations, as also implied in the SO thread keras.load_model() can't recognize Tensorflow's activation functions; the safest way is to rewrite your model with LeakyReLU as a layer, and not as an activation:

model = Sequential()  
model.add(layers.Flatten(input_shape=input_shape.shape[1:]))  
model.add(layers.Dense(100)) # no activation here
model.add(layers.LeakyReLU(alpha=0.3)) # activation layer here instead 
model.add(layers.Dropout(0.5))  
model.add(layers.Dense(50)) # no activation here
model.add(layers.LeakyReLU(alpha=0.3))  # activation layer here instead
model.add(layers.Dropout(0.3)) 
model.add(layers.Dense(num_classes, activation='softmax'))

This is exactly equivalent to your own model, and more consistent with the design choices of Keras, which, for good or bad, includes LeakyReLU as a layer and not as a standard activation function.
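With LeakyReLU as its own layer, a plain save-and-load round trip should then work without any custom_objects argument (a minimal sketch; the file name is a placeholder):

from tensorflow.keras.models import load_model

model.save('model.h5')             # placeholder file name
reloaded = load_model('model.h5')  # the LeakyReLU layer deserializes without custom objects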