isayev / ASE_ANI

ANI-1 neural net potential with python interface (ASE)
MIT License

Portable version? #19

Closed jchodera closed 5 years ago

jchodera commented 6 years ago

Just curious why you're distributing binary libraries instead of a platform-portable version here. Is the actual code that could be targeted to multiple systems available in another repo?

isayev commented 6 years ago

John: Thanks for the comments. We will keep this in mind. Let me know if you would like to play with ANI in a Docker container. We have a prototype version.

jchodera commented 6 years ago

A Docker container at least enables reproducibility and portability, though not understanding and innovation. I think a Docker container would still be useful for us to play around with! Do you have one on Docker Hub?

dgasmith commented 6 years ago

Could these custom models be transferred over to standard TF/Keras/PyTorch backends? That would make transferability and accessibility an order of magnitude higher.

jchodera commented 6 years ago

Looks like someone has already done this!

https://github.com/proteneer/khan

dgasmith commented 6 years ago

Cool. The resulting force field would be slightly different, but close enough for trial runs.

roitberg commented 6 years ago

Hi guys. We have a beta version in PyTorch and will release it soon. The khan version is not quite ANI. The DeepChem one isn't either. The basics are there, but the devil is in the details of activation functions, epochs, gradients, etc.

jchodera commented 6 years ago

The basics are there, but the devil is in the details of activation functions, epochs, gradients, etc.

So the paper is useless without releasing all the code, since it's an incomplete description of what you actually did? :)

jchodera commented 6 years ago

Sounds like a great argument for releasing the code!

roitberg commented 6 years ago

It is actually the COMPLETE opposite. In the case of DeepChem, for instance, they implemented whatever the hell they felt like, even when the paper was VERY clear about our choices. They published a paper showing that ANI was bad, and it turned out to be their own mistakes. That is what I meant. Take khan or DeepChem, read our paper carefully, and implement OUR choices there. Then the comparison is fair.

jchodera commented 6 years ago

Got it! Thanks for the clarification!

isayev commented 6 years ago

@jchodera The paper has a complete and honest description of what we did. Not everyone pays attention to the full technical details in the SI, or they simply make their own informed decisions about those parameters.

Like, "wtf, these guys are crazy to use a Gaussian activation function, let me use the latest greatest ReLU," etc.

jchodera commented 6 years ago

Awesome. Looking forward to the pytorch implementation then!

proteneer commented 6 years ago

We clarified the use of the gaussian activation function with you guys here:

https://github.com/isayev/ASE_ANI/issues/9

In addition to the arxiv paper:

https://arxiv.org/pdf/1610.08935.pdf

Which specifically mentions:

"All hidden layer nodes use a Gaussian activation function[47] while the output node uses a linear activation function."

proteneer commented 6 years ago

PS - we're also looking forward to the pytorch version!

isayev commented 5 years ago

@jchodera @proteneer @dgasmith PyTorch ANI is now available: https://github.com/aiqm/torchani