I am currently writing some code that involves two neural networks, g and f, composed as f(g(x)). In this structure, I am giving batch data x to g, whose output is fed into f, and I am trying to train g through some loss function applied to f(g(x)). For this problem, f has already been pretrained and is much larger relative to g. I do not want to perform any updates on f; I simply want to train g. The issue I am having is that performing backpropagation to update the parameters of g is taking much longer than expected; it is on the same order as what it took for f itself. I am now thinking that since the output of g is being fed into f, backpropagation needs to run through both networks even though f won't be updated. Is this how backpropagation works, and could it be the source of my slow code? And if so, are there any workarounds so I don't need to compute the backpropagation through the larger network f?
Backpropagation through composition of neural networks where one has fixed parameters
31 views. Asked by cdmath.
There is 1 answer.
In backpropagation, the errors are propagated backwards from the outputs through the neurons of the network in order to compute the gradients. In your arrangement, the network that is not trained (the pretrained f) sits closer to the outputs than the network you intend to train (g). Hence, by the chain rule, you need to propagate the errors through the frozen network in order to compute the gradients for the network you intend to train. So as far as I know, you cannot avoid these computations in your setting if you are using the backpropagation algorithm for training.
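To make this concrete, here is a minimal pure-Python sketch of the scalar case (the functions g(x) = wx and f(u) = au² are invented for illustration and are not meant to resemble your actual networks): the gradient of the loss with respect to g's parameter w necessarily contains the factor f'(g(x)), so the frozen outer network must take part in the backward pass even though its own parameter gradient is never used.

```python
# Scalar illustration of backpropagation through a frozen outer network.
# g(x) = w * x       -- trainable "inner network" with parameter w
# f(u) = a * u**2    -- frozen, "pretrained" outer network with parameter a
# L    = (f(g(x)) - y)**2

def g(w, x):
    return w * x

def f(a, u):
    return a * u * u

def loss(a, w, x, y):
    return (f(a, g(w, x)) - y) ** 2

def grad_w(a, w, x, y):
    # Chain rule: dL/dw = 2*(f(g(x)) - y) * f'(g(x)) * dg/dw.
    u = g(w, x)           # forward pass through g
    df_du = 2 * a * u     # f'(u): the frozen network's derivative is required
    dg_dw = x
    return 2 * (f(a, u) - y) * df_du * dg_dw

a, w, x, y = 1.5, 0.8, 2.0, 3.0
analytic = grad_w(a, w, x, y)

# Sanity check against a central finite difference.
eps = 1e-6
numeric = (loss(a, w + eps, x, y) - loss(a, w - eps, x, y)) / (2 * eps)
print(analytic, numeric)
```

One practical note: in frameworks such as PyTorch, setting `requires_grad=False` on f's parameters (or wrapping only f's parameter updates, not its forward pass) avoids computing and storing f's *parameter* gradients, which saves some time and memory; however, the backward pass must still traverse f's activations, as the sketch above shows.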
If this is a problem for you, then you could try another, non-gradient-based training algorithm. Possible methods for this are genetic or evolutionary algorithms, which only need forward evaluations of the composed model.
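As a toy illustration of the gradient-free route (a simple accept-if-better random search, not a production evolutionary-algorithm library; the scalar functions are again invented for the demo), note that the frozen f is only ever evaluated forward here, never differentiated:

```python
import random

# Toy gradient-free training of the inner "network" g, treating the
# frozen outer "network" f as a black box: forward passes only, no backprop.

def g(w, x):
    return w * x

def f(u):
    return 1.5 * u * u  # stands in for the frozen, pretrained network

def loss(w, x, y):
    return (f(g(w, x)) - y) ** 2

def random_search(x, y, steps=2000, sigma=0.1, seed=0):
    rng = random.Random(seed)
    w = 0.0
    best = loss(w, x, y)
    for _ in range(steps):
        cand = w + rng.gauss(0.0, sigma)  # perturb g's parameter
        c = loss(cand, x, y)
        if c < best:                      # keep improvements only
            w, best = cand, c
    return w, best

w, final = random_search(2.0, 3.0)
print(w, final)
```

Whether this is competitive depends heavily on how many parameters g has; for more than a handful, population-based evolution strategies scale better than this single-candidate search.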