I'm trying to make a network whose input lingers/decays. The raw input is a vector with 0, 1, or -1 in each element. I'm curious whether there is any value in simultaneous activation of the inputs, so I would like each input value to decay from 1 or -1 back toward 0 rather than just becoming 0 on the next iteration, a crude form of memory I guess. An example of what I'm trying to say:
Normal input:
1 -> 0 -> 0 -> -1 -> 0 ...
With decay .2:
1 -> .8 -> .6 -> -1 -> -.8 ...
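For example, applying that rule by hand (rough sketch):

```python
def apply_decay(raw, decay):
    """Decay rule: a nonzero raw value resets the state, a zero raw value
    moves the previous state toward 0 by `decay` (but never past 0)."""
    state = 0.0
    out = []
    for x in raw:
        if x != 0:
            state = float(x)
        else:
            sign = 1.0 if state > 0 else -1.0 if state < 0 else 0.0
            state = sign * max(abs(state) - decay, 0.0)
        out.append(state)
    return out

print(apply_decay([1, 0, 0, -1, 0], 0.2))  # roughly [1.0, 0.8, 0.6, -1.0, -0.8]
```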
This is easy to do manually by adding an extra input that takes a vector of decay values, but I want to know whether the network can learn its own decay values here, so that it can give smaller decays to the inputs that are more important.
Since each neuron outputs one value, it should be possible to have N neurons (one for each required decay value), feed them 1 as a constant input so that each simply outputs its weight, run that through a sigmoid activation, and use the results as the decay values.
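Roughly what I have in mind (just a sketch; the layer setup is illustrative):

```python
import numpy as np
import tensorflow as tf

n_inputs = 3  # number of decay values needed (just an example)

# A Dense layer with no bias: fed a constant 1, its output is simply
# sigmoid(weight) for each unit, i.e. one candidate decay value per input.
decay_layer = tf.keras.layers.Dense(n_inputs, use_bias=False, activation="sigmoid")
ones = np.ones((1, 1), dtype="float32")
print(decay_layer(ones).numpy())  # shape (1, n_inputs): the would-be decay values
```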
Will this layer learn its weights given that its input is always 1? If not, is there a way to do this?
NOTES: The data is sequential, which is why I assume the activations could affect each other. I am aware that recurrent networks are made to have memory, but I don't know if I have enough data for them to learn the relationships. Also, this custom decay function can actually reach 0 because it subtracts the decay; multiplying by a small weight would only approach 0 asymptotically, which, if I understand correctly, is what an RNN would do.
You can create this type of architecture easily using the TensorFlow functional API.
Dataset and model creation code:
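Something along these lines (TensorFlow 2.x / tf.keras assumed; the toy dataset, the shapes, and the layer names such as "decay" are only illustrative):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

T, n_features, n_samples = 10, 4, 512   # illustrative shapes

# Raw sequential input: values in {-1, 0, 1}.
X = np.random.choice([-1.0, 0.0, 1.0], size=(n_samples, T, n_features)).astype("float32")
# Constant-1 input for the decay neurons.
ones = np.ones((n_samples, 1), dtype="float32")
# Purely synthetic binary target, just so there is something to fit.
y = (X.sum(axis=(1, 2)) > 0).astype("float32").reshape(-1, 1)

def apply_decay(args):
    """Subtractive decay along the time axis: a nonzero raw value resets the
    state, a zero raw value moves the previous state toward 0 by `decay`."""
    seq, decay = args                    # seq: (batch, T, F), decay: (batch, F)
    steps = tf.unstack(seq, axis=1)      # T tensors of shape (batch, F)
    state = steps[0]
    outputs = [state]
    for x in steps[1:]:
        decayed = tf.sign(state) * tf.maximum(tf.abs(state) - decay, 0.0)
        state = tf.where(tf.equal(x, 0.0), decayed, x)
        outputs.append(state)
    return tf.stack(outputs, axis=1)     # (batch, T, F)

seq_in = layers.Input(shape=(T, n_features), name="raw_sequence")
one_in = layers.Input(shape=(1,), name="constant_one")

# Decay branch: a Dense layer with no bias fed a constant 1, so its output is
# sigmoid(weight) per feature, i.e. one learnable decay value in (0, 1) per element.
decay = layers.Dense(n_features, use_bias=False, activation="sigmoid", name="decay")(one_in)

lingering = layers.Lambda(apply_decay, name="apply_decay")([seq_in, decay])
hidden = layers.Dense(16, activation="relu")(layers.Flatten()(lingering))
out = layers.Dense(1, activation="sigmoid")(hidden)

model = Model(inputs=[seq_in, one_in], outputs=out)
```

The Lambda just reproduces the rule from the question in a differentiable way; gradients reach the decay weights through the tf.maximum term wherever a zero raw value lets the decayed state through.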
Your model looks like this:
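You can inspect the structure with a summary, or render a diagram if pydot and graphviz are installed (assuming the model object from the sketch above):

```python
model.summary()

# Optional: render the layer graph to an image (requires pydot and graphviz).
tf.keras.utils.plot_model(model, show_shapes=True, to_file="model.png")
```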
Training process:
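Training is then the usual compile/fit loop; the optimizer, loss, and epoch count below are arbitrary illustrative choices:

```python
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

history = model.fit(
    [X, ones], y,
    epochs=20,
    batch_size=32,
    validation_split=0.2,
    verbose=1,   # prints the per-epoch loss
)
```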
The final value of your decay rate can then be read back from the trained layer:
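Assuming the layer name "decay" from the sketch above, the learned decay values are the sigmoid of that layer's kernel:

```python
w = model.get_layer("decay").get_weights()[0]    # raw kernel, shape (1, n_features)
learned_decays = tf.sigmoid(w).numpy().ravel()   # the actual decay values in (0, 1)
print(learned_decays)
```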
Things to remember:
Learning depends not only on your input but also on your output. When you work out the gradient for such a weight, both the target and the predicted output appear in the expression, so even though the decay neurons always see a constant input of 1, they still receive a non-zero gradient whenever the prediction differs from the target. Learning will still take place.
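As a quick toy check (standalone and deliberately simplified so the loss depends directly on the constant-input weight), you can verify with tf.GradientTape that the gradient for such a weight is non-zero whenever prediction and target differ:

```python
import tensorflow as tf

w = tf.Variable([0.3])                  # the weight behind one decay value
x = tf.constant([1.0])                  # its input is always 1
y_true = tf.constant([0.9])             # some target

with tf.GradientTape() as tape:
    y_pred = tf.sigmoid(w * x)          # the "decay" output
    loss = tf.reduce_mean(tf.square(y_true - y_pred))

grad = tape.gradient(loss, w)
print(grad.numpy())                     # non-zero as long as y_pred != y_true
```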