modern-fortran / neural-fortran

A parallel framework for deep learning
MIT License

Provide different back-ends? #28

Closed: ivan-pi closed this issue 16 hours ago

ivan-pi commented 3 years ago

Would it be possible to use submodules to provide different backends to the neural-fortran interface?

What I have in mind are adaptors to established frameworks such as TensorFlow or PyTorch.

While I admire the effort to build a pure Fortran NN library, the amount of effort (and money) being put into these other libraries is simply enormous. Perhaps this way disciplines traditionally reliant upon Fortran (meteorology, quantum chemistry, ...) could also benefit from the numerous existing machine learning frameworks containing all kinds of advanced graph and runtime optimizations.

The way I see this working is that we first define (and possibly expand) the high-level interface for creating and training NNs. The non-Fortran implementations (effectively just adaptors to other frameworks) could then be placed in submodules, switched on by a CMake flag.
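As a concrete illustration, here is a minimal sketch of the submodule mechanism (all names hypothetical): the parent module declares only the training interface, and each backend supplies the implementation in its own submodule, so a CMake flag merely selects which submodule source file gets compiled.

    module nn_backend
      implicit none
      private
      public :: train

      ! Only the interface lives here; the implementation is deferred
      ! to whichever backend submodule the build system picks.
      interface
        module subroutine train(input, labels, num_epochs)
          real, intent(in) :: input(:,:), labels(:,:)
          integer, intent(in) :: num_epochs
        end subroutine train
      end interface

    end module nn_backend

    ! Pure-Fortran backend; an alternative submodule could instead
    ! call out to TensorFlow or PyTorch through their C APIs.
    submodule (nn_backend) nn_backend_native
    contains
      module subroutine train(input, labels, num_epochs)
        real, intent(in) :: input(:,:), labels(:,:)
        integer, intent(in) :: num_epochs
        ! ... native Fortran training loop goes here ...
      end subroutine train
    end submodule nn_backend_native

Only one backend submodule would ever be compiled into a given build, so the switch costs nothing at run time.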

ivan-pi commented 3 years ago

Digging deeper into TensorFlow as an example, they have some documents on how to build bindings for other languages through the C API.

The C API, however, is still in development and doesn't support all the features of the Python TensorFlow. I've found a few blog posts on how the C API can be used to call an existing graph.

It looks more complicated than I expected.
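For a sense of scale, even the smallest possible call into the C API, fetching the library version string, already needs hand-written iso_c_binding interfaces. A minimal sketch (TF_Version comes from tensorflow/c/c_api.h; strlen is borrowed from libc to size the returned C string; link with -ltensorflow):

    program tf_version_demo
      use iso_c_binding, only: c_ptr, c_char, c_size_t, c_f_pointer
      implicit none

      interface
        ! const char* TF_Version(void); declared in tensorflow/c/c_api.h
        function TF_Version() bind(c, name="TF_Version") result(ver)
          import :: c_ptr
          type(c_ptr) :: ver
        end function TF_Version

        ! size_t strlen(const char*); used to size the returned C string
        function c_strlen(str) bind(c, name="strlen") result(n)
          import :: c_ptr, c_size_t
          type(c_ptr), value :: str
          integer(c_size_t) :: n
        end function c_strlen
      end interface

      type(c_ptr) :: p
      character(kind=c_char), pointer :: version(:)
      integer :: i

      p = TF_Version()
      call c_f_pointer(p, version, [int(c_strlen(p))])
      print '(*(a))', (version(i), i = 1, size(version))
    end program tf_version_demo

Everything beyond that (building graphs, running sessions) follows the same pattern, just with many more opaque handles and status checks.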

milancurcic commented 3 years ago

Thank you for the proposal! It sounds like a useful though daunting effort. I won't be able to lead the development but I'd be happy to help. I'm also open to changes to the API.

All this assuming that any added back-end would be optional at build time.

milancurcic commented 2 years ago

I just became aware of pytorch-fortran.

ivan-pi commented 2 years ago

Very cool that NVIDIA is involved. I see one just needs to declare a model handle and the input/output tensors:

    type(torch_module) :: torch_mod               ! handle to a loaded PyTorch model
    type(torch_tensor) :: in_tensor, out_tensor   ! tensors exchanged with the model

The tensors can be initialized from Fortran arrays and vice versa.
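Going by the pytorch-fortran README, the round trip looks roughly like this (a sketch, so treat the exact method names as approximate):

    use torch_ftn
    use iso_fortran_env, only: real32

    real(real32) :: input(10, 10)
    real(real32), pointer :: output(:, :)
    type(torch_module) :: torch_mod
    type(torch_tensor) :: in_tensor, out_tensor

    call in_tensor%from_array(input)              ! wrap a Fortran array as a tensor
    call torch_mod%load("traced_model.pt")        ! load a TorchScript-exported model
    call torch_mod%forward(in_tensor, out_tensor) ! run inference
    call out_tensor%to_array(output)              ! view the result as a Fortran array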

It reminds me a bit of the tensor classes by Patrick Seewald: fortran-einsum-example

With TensorFlow I didn't get any further than what I posted at Discourse.

milancurcic commented 2 years ago

There's also Fortran Torch Adapter, which takes an approach similar to pytorch-fortran.

milancurcic commented 16 hours ago

Thank you for the suggestion again. After some time, I'm more convinced that neural-fortran will remain a pure-Fortran library, so alternative backends are out of scope.