ACEsuit / Polynomials4ML.jl

Polynomials for ML: fast evaluation, batching, differentiation
MIT License

Lux Extension #43

Closed · cortner closed this 2 months ago

cortner commented 1 year ago

Since Julia 1.9 we have very convenient package extensions. We can implement additional Lux or Zygote functionality (or ...) that gets loaded if and only if those packages are loaded by the user.

Just a thought for now. We can discuss.
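
For concreteness, a minimal sketch of how such an extension could be wired up (the extension name and file layout here are illustrative, not what P4ML currently ships):

```toml
# Project.toml: a weak dependency is never installed or loaded
# unless the user's environment already pulls it in.
[weakdeps]
Lux = "b2108857-7c20-44ae-9111-449ecde12c47"

[extensions]
Polynomials4MLLuxExt = "Lux"
```

```julia
# ext/Polynomials4MLLuxExt.jl: compiled and loaded automatically once
# the user does `using Polynomials4ML, Lux`.
module Polynomials4MLLuxExt

using Polynomials4ML, Lux

# Lux-specific wrappers and utilities would live here.

end
```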

cortner commented 1 year ago

@CheukHinHoJerry -- I do wonder whether we are overcomplicating our lives by not just making Lux.jl a direct dependency. What do you think? If we make it a dependency then we can just add all sorts of utility functions to the package to make interesting composed models.

CheukHinHoJerry commented 1 year ago

I think it depends on the future direction of this package. If we simply want this to be a kernel-like package, it is better not to add that.

But personally I prefer adding it; we would then have the flexibility to export functionality like AceLuxBlocks (e.g. ProductBasis in ACEpsi.jl) and even pretrained models from P4ML. It is annoying to rewrite the same block every time we need it.

I was actually planning to start a new package myself for the blocks and all the workflows around different uses of ACE, but it would be great if we could do that directly in P4ML.
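
To illustrate the kind of reusable block being discussed, here is a minimal sketch of wrapping a basis as a parameter-free Lux layer; the names are hypothetical and the interface assumes a recent LuxCore where layers subtype `LuxCore.AbstractLuxLayer`:

```julia
using LuxCore, Random

# Hypothetical wrapper: any callable basis becomes a Lux-compatible layer.
struct BasisLayer{TB} <: LuxCore.AbstractLuxLayer
    basis::TB
end

# A fixed basis carries no trainable parameters and no state.
LuxCore.initialparameters(::Random.AbstractRNG, ::BasisLayer) = NamedTuple()
LuxCore.initialstates(::Random.AbstractRNG, ::BasisLayer) = NamedTuple()

# Lux calling convention: (input, parameters, state) -> (output, state).
(l::BasisLayer)(X, ps, st) = l.basis(X), st
```

Defining such a block once, here or in a companion package, would avoid re-implementing it in ACEpsi.jl and elsewhere.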

cortner commented 1 year ago

So basically there are two ways to think about this: either make Lux a direct dependency and build the composed models right here in P4ML, or keep P4ML as a pure kernel package and put the Lux layers into a separate front-end package.

The backend can then be swapped out as needed, e.g. @tjjarvinen will work on GPU kernels. I'm a bit on the fence. It feels like we can just work in P4ML for now, but on the other hand there will be a bigger and bigger barrier to refactoring later.

cortner commented 1 year ago

In terms of branding, I like the idea of an EquivariantLux.jl package. It would probably be the best way to get others' attention.

CheukHinHoJerry commented 1 year ago

I think keeping the front end and back end separate will be much better if we also want to support GPU kernels and long-term development and support, somewhat like other deep learning packages calling CUDA BLAS backends.

CheukHinHoJerry commented 1 year ago

Maybe we should have a meeting with @tjjarvinen and @zhanglw0521 to discuss this, so that we have a clearer direction to work towards?

cortner commented 1 year ago

Yes, that sounds useful.

cortner commented 1 year ago

@CheukHinHoJerry -- what was the outcome of our meeting on that point? Where do the Lux layers go: into P4ML or into EquivariantModels?

CheukHinHoJerry commented 1 year ago

I remember that we were inclined to keep it in P4ML, but I don't have a clear memory of it.

@tjjarvinen @zhanglw0521

Any idea?

cortner commented 1 year ago

Right - I think Teemu said that there will simply be dispatch based on whether arrays are CPU types or GPU types.
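
A toy sketch of what that could look like, using a Chebyshev recurrence as a stand-in for a real P4ML basis (assumes CUDA.jl; illustrative, not the actual kernels):

```julia
using CUDA

# P[i, k] = T_{k-1}(X[i]), with T_k(x) = 2x*T_{k-1}(x) - T_{k-2}(x).
# CPU method: explicit loops over samples and basis index.
function cheb_basis!(P::Matrix, X::Vector)
    N = size(P, 2)
    @inbounds for i in eachindex(X)
        P[i, 1] = 1
        N >= 2 && (P[i, 2] = X[i])
        for k in 3:N
            P[i, k] = 2 * X[i] * P[i, k-1] - P[i, k-2]
        end
    end
    return P
end

# GPU method: the same recurrence as column-wise broadcasts, which fuse
# into CUDA kernels; picked automatically because the inputs are CuArrays.
function cheb_basis!(P::CuMatrix, X::CuVector)
    N = size(P, 2)
    P[:, 1] .= 1
    N >= 2 && (P[:, 2] .= X)
    @views for k in 3:N
        P[:, k] .= 2 .* X .* P[:, k-1] .- P[:, k-2]
    end
    return P
end
```

The caller never chooses a backend; moving the data to the GPU (`CuArray(X)`) is enough to select the CUDA method.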

So now all we have left to decide is whether we make a Lux extension and have all Lux wrappers in there, or just make Lux a dependency and keep all Lux functionality locally with the different layers.

I think we can choose one and change later, so the decision isn't crucial. Do you have a preference?

CheukHinHoJerry commented 1 year ago

Yes, now I remember, as you mentioned. Given that we are only using LuxCore, it should be fine to just keep it as a direct dependency for now.

But if we want to make use of Lux itself then I think we should make it an extension.
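
Under that reasoning, the hybrid arrangement might look like this in Project.toml (a sketch; the UUIDs are from memory and should be checked against the General registry):

```toml
[deps]
# LuxCore only defines the layer interface, so it is a lightweight
# direct dependency.
LuxCore = "bb33d45b-7691-41d6-9220-0943567d0623"

[weakdeps]
# Full Lux functionality stays optional, behind an extension.
Lux = "b2108857-7c20-44ae-9111-449ecde12c47"
```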

cortner commented 2 months ago

I think this can be closed for now and revisited at some point if we feel like reorganizing the code again...