Is there an implementation of Keras Adam Optimizer to support Float16


I am currently deploying tiny-YOLOv3 with the OpenVINO toolkit, and for that I need to convert my model to float16. That in turn requires an optimizer that supports FP16. I tried modifying SGD to support FP16, but its accuracy is too low, so I would like to use Adam instead. However, Adam does not support FP16, at least not in the Keras implementation. I am aware that I could achieve this with tf.keras.mixed_precision, but that requires TF 2.0, and OpenVINO does not fully support TF 2.0 yet. If anyone has faced this issue and can help me figure it out, it would be really helpful.

There are 0 answers