I'm trying to analyze the performance of DeepSpeech (a third-party library that uses TensorFlow and TFLite) on Android devices, and I have built it successfully as described in its docs.
After reading the source code, I found that TensorFlow uses Google's ruy as the back-end for TFLite's matrix operations. However, I also noticed that the TFLite sources contain support for other GEMM libraries such as Eigen and gemmlowp, but I was unable to find a way to build TFLite with them.
How can I use them instead of ruy?
My build command is almost the same as the one in the DeepSpeech docs.
bazel build --jobs 5 --workspace_status_command="bash native_client/bazel_workspace_status_cmd.sh" --config=monolithic --config=android --config=android_arm64 --define=runtime=tflite --action_env ANDROID_NDK_API_LEVEL=21 --cxxopt=-std=c++14 --copt=-D_GLIBCXX_USE_C99 --copt=-g --cxxopt=-g //native_client:libdeepspeech.so
What should I change in this command to switch the back-end library?
Please note that I have no problem building the library; it builds successfully and works fine for me. I just want to change the back-end GEMM library used by TFLite.
I haven't tested this with the DeepSpeech build, but the following Bazel flag disables ruy so that the other GEMM libraries (Eigen, gemmlowp) are used when TensorFlow Lite is compiled through Bazel:
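--define=tflite_with_ruy=false

For example (untested on my side; this is just your original DeepSpeech command with the flag appended), the invocation would become:

bazel build --jobs 5 --workspace_status_command="bash native_client/bazel_workspace_status_cmd.sh" --config=monolithic --config=android --config=android_arm64 --define=runtime=tflite --define=tflite_with_ruy=false --action_env ANDROID_NDK_API_LEVEL=21 --cxxopt=-std=c++14 --copt=-D_GLIBCXX_USE_C99 --copt=-g --cxxopt=-g //native_client:libdeepspeech.so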
Roughly, the TensorFlow Lite kernels choose the GEMM library based on this flag as follows:

On ARM platforms:
- tflite_with_ruy=true (the default): ruy is used for both float and quantized kernels.
- tflite_with_ruy=false: Eigen is used for float kernels and gemmlowp for quantized kernels (ruy still covers a few cases that gemmlowp does not support, such as per-channel quantization).

On x86 platforms:
- tflite_with_ruy=true: ruy is used for both float and quantized kernels.
- tflite_with_ruy=false: Eigen is used for float kernels and gemmlowp/ruy for quantized kernels.
See more details here.
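If it helps to see the mechanism, here is a rough, simplified sketch of the selection that the flag controls. It is not the actual TensorFlow Lite source; the real compile-time dispatch lives in tensorflow/lite/kernels/cpu_backend_gemm.h, and the GemmBackend enum and SelectGemmBackend function below are made up for illustration. The Bazel define maps to the TFLITE_WITH_RUY preprocessor macro, which picks the back-end at compile time:

#include <type_traits>

// Illustrative only -- these names are not part of TensorFlow Lite.
enum class GemmBackend { kRuy, kEigen, kGemmlowp };

// --define=tflite_with_ruy=false removes the TFLITE_WITH_RUY macro from the
// build, which flips this compile-time selection.
template <typename LhsScalar>
constexpr GemmBackend SelectGemmBackend() {
#ifdef TFLITE_WITH_RUY
  // ruy handles both float and quantized kernels.
  return GemmBackend::kRuy;
#else
  // Without ruy: float kernels go to Eigen, 8-bit quantized kernels to gemmlowp.
  return std::is_same<LhsScalar, float>::value ? GemmBackend::kEigen
                                               : GemmBackend::kGemmlowp;
#endif
}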