I am building a simple feed-forward neural network at compile time using const generics and macros. It is essentially a bunch of matrices applied one after the other.
I have created the network! macro, which works like this:
network!(2, 4, 1)
The first item is the number of inputs, and the rest are the number of neurons per layer. The macro looks as follows:
#[macro_export]
macro_rules! network {
    ( $inputs:expr, $($outputs:expr),* ) => {
        {
            Network {
                layers: [
                    $(
                        &Layer::<$inputs, $outputs>::new(),
                    )*
                ]
            }
        }
    };
}
It declares an array of layer elements, which use const generics to hold a fixed-size array of weights in each layer; the first const parameter is the number of inputs the layer expects and the second is the number of outputs.
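For reference, here is a simplified sketch of what Layer could look like, with the weight matrix sized by the two const parameters (the weights field name and the f64 element type are placeholders; the real type is not shown here):

struct Layer<const IN: usize, const OUT: usize> {
    // One weight per (neuron, input) pair, sized entirely at compile time.
    weights: [[f64; IN]; OUT],
}

impl<const IN: usize, const OUT: usize> Layer<IN, OUT> {
    fn new() -> Self {
        Layer {
            weights: [[0.0; IN]; OUT],
        }
    }
}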
This macro produces the following code:
Network {
    layers: [
        &Layer::<2, 4>::new(),
        &Layer::<2, 1>::new(),
    ]
}
This is completely wrong, because each layer's number of inputs should be the number of outputs of the previous layer, like so (note how the second layer's 2 becomes a 4):
Network {
    layers: [
        &Layer::<2, 4>::new(),
        &Layer::<4, 1>::new(),
    ]
}
To do this, I need to replace the $inputs value with the value of $outputs on each iteration, but I have no clue how to do it.
You can match on the two leading values followed by all the rest: build a layer from those two values, then call the macro recursively, reusing the second value as the input size of the next layer.
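Here is a minimal sketch of that approach, reusing the Network and Layer types from your question; the internal @accum rule and the push-down accumulator are one way to write it, not the only one:

#[macro_export]
macro_rules! network {
    // Internal rule, base case: exactly two sizes left. Emit the Network with
    // the accumulated layers plus the final one.
    (@accum [ $($layers:expr,)* ] ; $inputs:expr, $outputs:expr) => {
        Network {
            layers: [
                $($layers,)*
                &Layer::<$inputs, $outputs>::new(),
            ]
        }
    };
    // Internal rule, recursive case: peel off the two leading sizes, push one
    // layer onto the accumulator, and recurse with the second size reused as
    // the next layer's input.
    (@accum [ $($layers:expr,)* ] ; $inputs:expr, $outputs:expr, $($rest:expr),+) => {
        network!(@accum [ $($layers,)* &Layer::<$inputs, $outputs>::new(), ] ; $outputs, $($rest),+)
    };
    // Public entry point: start the recursion with an empty accumulator.
    ( $inputs:expr, $($outputs:expr),* ) => {
        network!(@accum [] ; $inputs, $($outputs),*)
    };
}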
Expanding network!(2, 4, 1) with this sketch should then lead to the desired code:
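Network {
    layers: [
        &Layer::<2, 4>::new(),
        &Layer::<4, 1>::new(),
    ]
}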