nikitakit / self-attentive-parser

High-accuracy NLP parser with models for 11 languages.
https://parser.kitaev.io/
MIT License

Reason for depending on TF for install, but PT for training? #34

Closed: BramVanroy closed this issue 5 years ago

BramVanroy commented 5 years ago

I am curious to find out the reason for this inconsistency. Why does the package depend on TensorFlow for installation, while training requires PyTorch? Did you start off with TensorFlow and then make the switch to PyTorch?

nikitakit commented 5 years ago

The project was in PyTorch from the very beginning, but when it came time to do a parser release I wasn't very happy with the tools that PyTorch offered. In PyTorch (as of version 0.4) you can't distribute models independently of the Python code that was used to create them, which causes several issues: (a) the training code requires Python 3.6+ and doesn't run on Windows; (b) it's very disruptive for me to upgrade PyTorch (or other library) versions while I'm actively working on a research project, but a release codebase needs to support new framework versions whenever they come out; and (c) TensorFlow has much better tools for model quantization/compression, which I use to keep model download sizes low.
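The coupling described above can be demonstrated without PyTorch itself: `torch.save` is built on Python's `pickle`, and pickling an object records a *reference* to its class by import path rather than the class's code. A minimal stdlib-only sketch (`TinyModel` is an invented stand-in for a model class, not anything from this repo):

```python
import pickle

# torch.save uses pickle underneath, so a checkpoint that pickles a model
# object stores a reference to its class (by module path), not the code.
# TinyModel is an invented stand-in for a model class.
class TinyModel:
    def __init__(self):
        self.weights = [0.1, 0.2, 0.3]

blob = pickle.dumps(TinyModel())

# Loading works as long as the defining code is importable...
restored = pickle.loads(blob)
print(restored.weights)  # [0.1, 0.2, 0.3]

# ...but in an environment without that code, the same bytes are unusable.
del TinyModel
failed = False
try:
    pickle.loads(blob)
except AttributeError as exc:
    failed = True
    print("unpickling failed without the class definition:", exc)
assert failed
```

Even saving a plain `state_dict` of tensors only sidesteps part of this: reconstructing the module to load the weights into still requires shipping the original model-building Python code.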

The nice thing about TensorFlow is that you can compile your model into a computation graph and save the graph to disk, after which it can be loaded and used independently of the Python code that created it. I figured that forward compatibility would be better for the TensorFlow on-disk format than for any Python code I write. I also tried my best to make sure that the release code runs across a wide range of environments, including Python 2.
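The contrast with the pickle-based approach can be illustrated without TensorFlow: a graph serialized in a self-describing format can be executed by a generic interpreter that knows nothing about the code that built it. A stdlib-only analogy (the dict layout and op names here are invented for illustration; this is not TensorFlow's actual GraphDef/SavedModel schema):

```python
import json

# A toy "computation graph": ops and parameters in a self-describing
# format, analogous in spirit to a frozen graph saved to disk.
graph = {
    "ops": [
        {"op": "scale", "factor": 2.0},
        {"op": "shift", "offset": 1.0},
    ]
}
blob = json.dumps(graph)  # the on-disk artifact

# A generic interpreter can run the graph with no knowledge of the
# code that produced it -- only of the serialization format.
def run(serialized, x):
    for node in json.loads(serialized)["ops"]:
        if node["op"] == "scale":
            x = x * node["factor"]
        elif node["op"] == "shift":
            x = x + node["offset"]
    return x

print(run(blob, 3.0))  # 3.0 * 2.0 + 1.0 = 7.0
```

Because the artifact describes the computation itself, forward compatibility hinges on the stability of the format, not on any particular version of the model-building code.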

At one point I also wanted to write release bindings for languages other than Python, but I never got around to doing that.

BramVanroy commented 5 years ago

Thank you for the elaborate reply!

Many projects are switching to support Python >=3.6, and Python 2.x support is being dropped everywhere; official support ends on January 1st anyway. This is a good website to get an idea of who's dropping support (spoiler: almost all major packages). I don't think that making your project depend on Python 3.6 or later is a bad idea; perhaps work towards it for a new release at the end of this year? It will save you a lot of compatibility headaches, I'm sure!

I'm not sure why PyTorch models won't run on Windows. I've been using pretrained models since 0.4.1 and it works fine. It's true that models might not be as small as the TF computation graphs, but one could argue that that shouldn't be too big an issue. The model is only downloaded once. Using computation graphs vs. model states is a discussion that has a lot of pros and cons on both sides. I see that in the field of NLP, many implementations have moved to PyTorch >= 1.0.

All this to say that I understand and of course respect your decision. Thank you again for this package!