I successfully created a network that can run inference on an NCS2. Now I want to speed up inference by using multiple NCS2 devices that work on the inference in parallel. I do this as follows:
from openvino.inference_engine import IECore

plugin = IECore()
net = plugin.read_network(model, weight)
# try to split the network across both sticks with the HETERO plugin
exec_net = plugin.load_network(net, 'HETERO:MYRIAD.1.1.2-ma2480,MYRIAD.1.2-ma2480')
However, my configuration didn't work; the inference time didn't change. I also checked the query layers map, and all layers are always assigned to only one NCS2 device. Has anyone solved this problem? If you have any information, please let me know. Thank you very much.
What you are looking for is the multi-device (MULTI) plugin. To use it, replace HETERO with MULTI in the device string, like this:
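Here is a minimal sketch, assuming the same model files and MYRIAD device names as in your snippet (check ie.available_devices for the exact names on your machine):

from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model, weight)
# MULTI distributes inference requests across the listed devices
exec_net = ie.load_network(net, 'MULTI:MYRIAD.1.1.2-ma2480,MYRIAD.1.2-ma2480', num_requests=4)

Note that MULTI parallelizes across inference requests, so to actually see a speedup you need to keep several asynchronous requests in flight (exec_net.start_async) instead of issuing a single synchronous infer call.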
More info about the plugin: https://docs.openvinotoolkit.org/latest/openvino_docs_IE_DG_supported_plugins_MULTI.html