I am new to TensorFlow for deep learning and interested in the deconvolution (transposed convolution) operation. I need to take a look at the source code that implements deconvolution. The function is, I believe, `conv2d_transpose()` in `nn_ops.py`. However, that function calls another function, `gen_nn_ops.conv2d_backprop_input()`. I need to see what is inside this function, but I am unable to find it in the repository. Any help would be appreciated.
You can't find this source because it is automatically generated by Bazel. If you build from source, you'll see this file inside `bazel-genfiles`. It's also present in your local distribution, which you can locate using the `inspect` module. The file contains automatically generated Python wrappers around the underlying C++ implementations, so it basically consists of a bunch of one-line functions. A shortcut to find the underlying C++ implementation of such a generated Python op is to convert the snake-case name to camel case, i.e. `conv2d_backprop_input` -> `Conv2DBackpropInput`.
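For example, both tricks can be sketched in a few lines of Python (a minimal sketch: note the naive snake-to-camel conversion is mechanical except around digits, where the registered C++ name upper-cases the letter after `2`):

```python
import inspect

def snake_to_camel(name):
    """Naively convert a generated op's snake-case name to camel case."""
    return "".join(part.capitalize() for part in name.split("_"))

# Naive conversion gives "Conv2dBackpropInput"; the op actually registered
# in C++ is "Conv2DBackpropInput" (capital D), so digits need a fix-up.
print(snake_to_camel("conv2d_backprop_input"))

# Locating the generated module on disk (assumes TensorFlow is installed):
#   from tensorflow.python.ops import gen_nn_ops
#   print(inspect.getsourcefile(gen_nn_ops))
# The same inspect call works on any importable module, e.g. the stdlib:
print(inspect.getsourcefile(inspect))
```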
If you care to find out how this file really came about, you can follow the trail of Bazel dependencies in `BUILD` files to find the Bazel target that generated it from the TensorFlow source tree. Going to the `BUILD` file inside `tensorflow/python`, you see that `gen_nn_ops` is a target of type `tf_gen_op_wrapper_private_py`, which in turn calls `tf_gen_op_wrapper_py` from `tensorflow/tensorflow.bzl`. The `native.cc_binary` construct used there is a way to have a Bazel target that represents execution of an arbitrary command; in this case it invokes a code-generation tool with some arguments. With a couple more steps you can find that the "tool" here is compiled from framework/python_op_gen_main.cc.

The reason for this complication is that TensorFlow was designed to be language agnostic. So, in an ideal world, you would have each op described in ops.pbtxt, and then each op would have one implementation per hardware type using
`REGISTER_KERNEL_BUILDER`, so all implementations would be done in C++/CUDA/assembly and automatically become available to all language front-ends. There would be an equivalent translator like "python_op_gen_main" for every language, and all client library code would be generated automatically. However, because Python is so dominant, there was pressure to add features on the Python side. So now there are two kinds of ops: pure TensorFlow ops in automatically generated files like `gen_nn_ops.py`, and Python-only ops in files like `nn_ops.py`, which typically wrap the ops from generated files like `gen_nn_ops.py` but add extra features/syntax sugar. Also, originally all names were camel case, but it was decided that the public-facing release should be PEP 8 compliant with the more common Python style, which is the reason for the camel-case/snake-case mismatch between the C++ and Python interfaces of the same op.
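The two-layer structure described above can be illustrated with a toy sketch (hypothetical stand-in functions, not TensorFlow's actual code): the generated file exposes a thin stub over the C++ kernel, and the hand-written file wraps it with extra sugar.

```python
# --- stand-in for a generated file like gen_nn_ops.py ---
def conv2d_backprop_input(input_sizes, filter, out_backprop, strides):
    # In real TensorFlow this is a one-liner dispatching to the C++ op
    # registered as "Conv2DBackpropInput"; here we just record the call.
    return {"op": "Conv2DBackpropInput",
            "input_sizes": input_sizes,
            "strides": strides}

# --- stand-in for a hand-written file like nn_ops.py ---
def conv2d_transpose(value, filter, output_shape, strides, name=None):
    # Python-side sugar: validation and argument re-ordering before
    # delegating to the generated wrapper (the real code also handles
    # shape inference, name scopes, etc.).
    if len(strides) != 4:
        raise ValueError("strides must have length 4")
    return conv2d_backprop_input(output_shape, filter, value, strides)

print(conv2d_transpose("grad", "filt", [1, 8, 8, 3], [1, 2, 2, 1]))
```

This mirrors the relationship in the question: `conv2d_transpose()` in `nn_ops.py` is the hand-written sugar layer, and `gen_nn_ops.conv2d_backprop_input()` is the generated stub over the C++ kernel.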