I'm interested in using Alea GPU alongside a third-party library and am trying to get a sense of my options. Specifically, I'd like to use this L-BFGS library. I'm fairly new to the F# ecosystem, but I do have experience with both CUDA and functional programming.
I've been using that L-BFGS library as part of a program that implements logistic regression. It would be neat if I could treat the library as correct and write the rest of my code (including the parts that run on the GPU) in type-safe F#.
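For concreteness, here is a minimal F# sketch of the kind of objective/gradient pair the optimizer would be minimizing. The function name, the flat row-major layout of `x`, and the 0.0/1.0 labels in `y` are my own illustrative assumptions, not the library's API:

```fsharp
// Sketch only: negative log-likelihood of logistic regression and its gradient,
// the function an L-BFGS routine would repeatedly evaluate.
let sigmoid z = 1.0 / (1.0 + exp (-z))

/// Objective and gradient for weights w, given n samples of dimension d
/// stored row-major in x, with labels y assumed to be 0.0 or 1.0.
let nllAndGradient (x: float[]) (y: float[]) (n: int) (d: int) (w: float[]) =
    let grad = Array.zeroCreate d
    let mutable nll = 0.0
    for i in 0 .. n - 1 do
        // dot product of the i-th sample with the weights
        let mutable dot = 0.0
        for j in 0 .. d - 1 do
            dot <- dot + w.[j] * x.[i * d + j]
        let p = sigmoid dot
        // accumulate negative log-likelihood and gradient contributions
        nll <- nll - (y.[i] * log p + (1.0 - y.[i]) * log (1.0 - p))
        for j in 0 .. d - 1 do
            grad.[j] <- grad.[j] + (p - y.[i]) * x.[i * d + j]
    nll, grad
```

The per-sample loop here is the part I would eventually want to evaluate on the GPU.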
It seems possible to link C++ with F#. Assuming I figure out how to integrate the L-BFGS library into an F# program, would introducing Alea GPU cause any issues?
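As a baseline, I was picturing something like the following P/Invoke approach, assuming the C++ library can be compiled into a native DLL with an `extern "C"` entry point. The DLL name `lbfgs_native` and the `lbfgs_minimize`/`EvaluateCallback` signatures below are hypothetical; the real export would dictate the exact types:

```fsharp
// Hedged sketch: calling a native L-BFGS build from F# via P/Invoke.
open System.Runtime.InteropServices

// Hypothetical callback the native optimizer invokes to evaluate the objective:
// given the current weights and a gradient buffer (both of length n),
// fill the gradient and return the objective value.
[<UnmanagedFunctionPointer(CallingConvention.Cdecl)>]
type EvaluateCallback = delegate of nativeint * nativeint * int -> float

[<DllImport("lbfgs_native", CallingConvention = CallingConvention.Cdecl)>]
extern int lbfgs_minimize(float[] w, int n, EvaluateCallback evaluate)

let evaluate =
    EvaluateCallback(fun wPtr gradPtr n ->
        // copy the native weight buffer into a managed array
        let w = Array.zeroCreate<float> n
        Marshal.Copy(wPtr, w, 0, n)
        // compute the objective and gradient from w here
        // (this is the seam where GPU code, e.g. via Alea GPU, could be plugged in)
        let grad = Array.zeroCreate<float> n
        Marshal.Copy(grad, 0, gradPtr, n)
        0.0) // dummy objective value for the sketch

let weights = Array.zeroCreate<float> 10
let status = lbfgs_minimize(weights, weights.Length, evaluate)
```

In this layout the optimizer stays in native code and only the objective evaluation lives in F#, which is where the GPU work would go.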
What I am trying to avoid is re-writing L-BFGS in F# using Alea. However, maybe that's actually the easiest path to using F#. If Alea has any facilities for nonlinear optimization, I could probably use those instead.
Alea GPU does not have a nonlinear optimizer yet. The CUDA version of L-BFGS has a slightly different implementation than the standard CPU version, which sometimes causes accuracy issues. Apart from that I did not face any problems with the code. Note, however, that the performance win depends significantly on the objective function: the objective for logistic regression is numerically rather cheap, so the potential speedup from the GPU is limited.
We have an internal C# version of this code ported to Alea GPU, which could also be used from F#, and we plan to release it in a future version.