I'm using TensorFlow 1.8.0rc1. I'm trying to save a very simple NN model in TFLite format, with weight quantization, following this documentation:

However, when converting with toco, I receive this error:

Array Relu, which is an input to the Div operator producing the output array dropout/div, is lacking min/max data, which is necessary for quantization. Either target a non-quantized output format, or change the input graph to contain min/max information, or pass --default_ranges_min= and --default_ranges_max= if you do not care about the accuracy of results.
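For context on why the converter insists on min/max: for a quantized output format, every array is stored as uint8 plus a scale and zero-point computed from that array's float value range. Here is a rough sketch of that arithmetic (my own illustration of standard affine quantization, not TOCO's actual code):

```python
# Why TOCO needs min/max for every array: quantized inference stores
# each tensor as uint8 plus a (scale, zero_point) pair derived from the
# tensor's float range. With no recorded range for the Relu output,
# these parameters cannot be computed.
# (Illustrative sketch only; not TOCO's actual implementation.)

def quantization_params(rmin, rmax, qmin=0, qmax=255):
    """Scale and zero-point mapping the float range [rmin, rmax] onto [qmin, qmax]."""
    scale = (rmax - rmin) / (qmax - qmin)
    zero_point = int(round(qmin - rmin / scale))
    return scale, min(qmax, max(qmin, zero_point))

def quantize(x, scale, zero_point, qmin=0, qmax=255):
    """Map a float to its uint8 representation (clamped to [qmin, qmax])."""
    q = int(round(x / scale)) + zero_point
    return min(qmax, max(qmin, q))

# Hypothetical range for a Relu output, e.g. [0, 6] as with Relu6:
scale, zp = quantization_params(0.0, 6.0)
print(quantize(0.0, scale, zp))  # the range minimum lands on the zero-point
```

Without a recorded (min, max) pair there is simply no way to derive `scale` and `zero_point`, which is what the error is saying.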

And this is the graph:

At some point it was complaining not about Relu but about Assign operations (I fixed that, though I'm not sure how), and if I remove the Relu layers it complains about the Add layers instead. Any idea what's going on?


Update: I just realized that between dropout_1 and activation2 (see picture) there's an act_quant node that must be the fake quantization of activation2 (a Relu). This is not happening in the first layer, between dropout and activation1. I guess this is the problem? According to the TensorFlow quantization tutorial (linked above), the scripts described there should rewrite the graph with all the information toco needs to quantize the weights.
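To illustrate what that act_quant node contributes (my understanding of fake quantization, as a plain-Python sketch with a hypothetical range): during training it records the activation's min/max and simulates quantization by snapping the float values to one of 256 levels within that range, and toco later reads those recorded ranges at conversion time. A layer that lacks such a node has no range to read, which would match the error above:

```python
# Sketch of what a fake-quant node (like act_quant) does: clamp the
# activation into its recorded [min, max] range, snap it to one of
# 256 levels, and return the result as a float ("quantize-dequantize").
# The recorded range is what TOCO later uses as min/max.
# (Illustrative only; the range below is hypothetical, not from my model.)

def fake_quant(x, rmin, rmax, bits=8):
    levels = 2 ** bits - 1              # 255 steps for 8-bit quantization
    scale = (rmax - rmin) / levels
    x = min(rmax, max(rmin, x))         # clamp into the recorded range
    q = round((x - rmin) / scale)       # nearest quantized level
    return rmin + q * scale             # back to float

print(fake_quant(0.5, 0.0, 6.0))  # snapped to the nearest of the 256 levels
```

If the rewrite script really should have inserted one of these before activation1 as well, its absence there would explain why toco has no min/max for that Relu.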