Are there any plans to implement a leaky ReLU in H2O?


Are there any plans to implement a leaky ReLU in the Deep Learning module of H2O? I am a beginner with neural nets, but in the limited amount of model building and parameter tuning I have done, I have found ReLUs to generalize better, and was wondering whether even better performance might be obtained by using leaky ReLUs to avoid the dying ReLU problem.
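For concreteness, here is a minimal NumPy sketch (an illustration of the activation functions themselves, not H2O code; the function names and the alpha parameter are my own) of the difference: a plain ReLU outputs zero, and therefore backpropagates a zero gradient, for all negative inputs, which is what lets a unit "die", while a leaky ReLU keeps a small slope there.

```python
import numpy as np

def relu(x):
    # Plain ReLU: zero output (and zero gradient) for x < 0,
    # so a unit stuck in the negative regime stops learning.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: a small slope alpha for x < 0 keeps the
    # gradient nonzero, avoiding the dying ReLU problem.
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))        # [0.  0.  0.  0.5 2. ]
print(leaky_relu(x))  # [-0.02  -0.005  0.     0.5    2.   ]
```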


1 Answer

Answered by Lan:

This is not a direct answer to your question, because the product roadmap is not really something we can comment on. However, if you are worried about the dying ReLU problem in H2O, why not use ExpRectifier, H2O's exponential linear unit (ELU) activation, which does not suffer from the dying ReLU problem? In fact, the ELU paper (Clevert et al., 2015) reports that ELU outperformed the ReLU variants in their experiments. The only drawback is that it is more computationally expensive, since it involves an exponential in its calculation.
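For reference, the ELU behind ExpRectifier is defined in that paper as f(x) = x for x > 0 and alpha * (exp(x) - 1) otherwise. A minimal NumPy sketch (an illustration of the formula, not H2O's internal implementation; function names and alpha are mine) shows both points made above: the gradient never reaches exactly zero for negative inputs, and evaluating it requires an exponential that plain ReLU does not.

```python
import numpy as np

def elu(x, alpha=1.0):
    # ELU: identity for x > 0; alpha * (exp(x) - 1) for x <= 0.
    # The exp() call is the extra cost relative to plain ReLU.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def elu_grad(x, alpha=1.0):
    # Derivative: 1 for x > 0; alpha * exp(x) for x <= 0.
    # Always positive, so units cannot "die" the way ReLU units can.
    return np.where(x > 0, 1.0, alpha * np.exp(x))

x = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
print(elu(x))       # approx. [-0.95  -0.632  0.     1.     3.   ]
print(elu_grad(x))  # approx. [ 0.05   0.368  1.     1.     1.  ]
```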