TensorFlow Lite Inference on a PC very slow

I'm experimenting with TensorFlow Lite on the PC:

import os
import time

import cv2
import numpy as np

from tensorflow.contrib.lite.python import interpreter as interpreter_wrapper


model_path = os.path.join(ROOT_DIR, 'model', 'yolov3.tflite')

interpreter = interpreter_wrapper.Interpreter(model_path=model_path)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# True when the model expects float input rather than quantized uint8
floating_model = (input_details[0]['dtype'] == np.float32)

orig = cv2.imread('data/dog-cycle-car.png')

height = input_details[0]['shape'][1]
width = input_details[0]['shape'][2]

image, image_data = preprocess_image(orig, (height, width))

start = time.time()
interpreter.set_tensor(input_details[0]['index'], image_data)
interpreter.invoke()
end = time.time()

output_data = interpreter.get_tensor(output_details[0]['index'])

# Takes around 30 seconds on a PC
print("Inference time: {:.2f}s".format((end - start)))

However, a single inference takes around 30 seconds on the PC, which seems abnormally slow for this model. Am I missing something?
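For what it's worth, the first `invoke()` after `allocate_tensors()` can pay one-off setup costs, so timing a single call may overstate steady-state latency. A minimal sketch of a fairer timing loop, using warm-up runs and averaging (the `benchmark` helper and its defaults are illustrative, not part of the question's code):

```python
import time

def benchmark(fn, warmup=3, runs=10):
    """Return the average wall-clock time of fn() over several runs."""
    # Warm-up calls absorb any one-off initialization cost
    for _ in range(warmup):
        fn()
    # Time the remaining runs with a monotonic high-resolution clock
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        times.append(time.perf_counter() - start)
    return sum(times) / len(times)

# In the question's setup this would be called as, e.g.:
# avg = benchmark(interpreter.invoke)
# print("Average inference time: {:.2f}s".format(avg))
```

If the time per call stays near 30 s even after warm-up, the slowness is in the interpreter itself rather than in one-time allocation.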
