PallHaraldsson opened 3 years ago
idk. TensorFlow.jl is still a very featureful binding for TensorFlow 1.0. I don't know how this compares to GenTF, in either goals or capacity.
The goal of TensorFlow.jl was to train neural networks with an idiomatic Julia API.
It was not intended to be a restricted or limited API, and indeed while we were working on it, it was the most feature-complete client outside of Python (for a few months it could even do things Python couldn't, like overloaded getindex).
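To illustrate what "overloaded getindex" buys you — this is a hypothetical sketch, not TensorFlow.jl's actual code, showing how Julia lets a wrapper type reuse ordinary indexing syntax:

```julia
# Hypothetical wrapper type, for illustration only.
struct LazyTensor
    data::Array{Float64}
end

# Overloading Base.getindex means `t[2:3]` goes through the wrapper,
# so a TF-style graph node could lower slicing to a slice op —
# syntax the Python TF client of that era didn't support.
Base.getindex(t::LazyTensor, idx...) = LazyTensor(t.data[idx...])

t = LazyTensor([1.0, 2.0, 3.0, 4.0])
t[2:3]   # ordinary Julia indexing syntax, dispatched to our method
```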
What is considered an idiomatic Julia API has changed a little since this package was created and actively worked on.
E.g. Julia used to do implicit broadcasting like NumPy and TensorFlow, and so TensorFlow.jl still allows that.
Meanwhile, Julia packages (like Flux) have come a long way and are more flexible, idiomatic, and, perhaps surprisingly, comparatively performant.
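For context, the broadcasting shift looks like this: modern Julia requires the explicit dot syntax where NumPy/TensorFlow (and old Julia) would broadcast implicitly:

```julia
a = [1.0, 2.0, 3.0]
b = 10.0

# Modern Julia: broadcasting must be explicit via the dot syntax.
a .+ b     # [11.0, 12.0, 13.0]
sin.(a)    # any function broadcasts elementwise with a dot

# `a + b` (implicit broadcast of a scalar onto a vector) used to work
# in early Julia, NumPy-style; today it's a MethodError.
```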
https://github.com/probcomp/GenTF is not intended to be a Julia wrapper for TensorFlow, but instead a plugin for Gen that lets you write differentiable pieces of generative models and inference models in TensorFlow. Specifically, GenTF enforces certain modeling disciplines (e.g. use of probabilistic loss functions that are computed by Gen itself) that TensorFlow does not. Also, GenTF wraps the Python TensorFlow API.
TensorFlow.jl mostly wraps libtensorflow (the C API), but for some graph manipulation it calls the Python TensorFlow API via PyCall (because those operations are not exposed in libtensorflow). I suspect that GenTF needs similar graph manipulation, so it probably also needs PyCall.
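For anyone unfamiliar with that pattern, here is a minimal sketch of reaching the Python TensorFlow API from Julia via PyCall (assumes a working Python TensorFlow install; the graph calls shown are ordinary TF 1.x API, not anything TensorFlow.jl-specific):

```julia
using PyCall

tf = pyimport("tensorflow")   # loads the Python module into Julia

# Python objects are usable directly; graph-manipulation calls that
# libtensorflow doesn't expose would go through this handle.
g = tf.Graph()
```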
GenTF is not intended to be a Julia wrapper for TensorFlow
Right, I realized that. I was curious why this package wasn't used at the time (maybe you didn't know of it). By now it's, I guess, a good choice, though either choice would (still) be valid.
Another choice might be to change it to use, say, Flux.
TensorFlow.jl is still a very featureful binding for TensorFlow 1.0
Yes, only up to "1.13.1", not 1.5 or 2.5; and even 1.13.2 and 1.15.5 have some security issues fixed:
https://github.com/tensorflow/tensorflow/releases/tag/v1.13.2
https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-26271
Now, I'm not really worried in case I was just learning; still, it seems warranted to learn TF using the Python ecosystem directly, also for tutorial reasons, even if I want to end up using Julia.
The goal of TensorFlow.jl was to train neural networks with an idiomatic Julia API.
Does it have any advantage over using TF through PyCall, other than being idiomatic? I really liked that; this was my favorite example of a Julia API wrapper, since it was better than Google's official one.
A more important question is: is using TF through PyCall better than just using Flux (for reasons besides pedagogical ones), and/or possibly something else in the Julia ecosystem?
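For comparison, here is what the pure-Julia Flux route looks like — a minimal sketch with made-up layer sizes and random data, written against the recent explicit-gradient Flux API (details differ across Flux releases):

```julia
using Flux

# Assumed architecture for illustration: 4 features -> 2 classes.
model = Chain(Dense(4 => 8, relu), Dense(8 => 2))

x = rand(Float32, 4, 16)                  # 16 samples, 4 features each
y = Flux.onehotbatch(rand(1:2, 16), 1:2)  # fake one-hot labels

loss(m, x, y) = Flux.logitcrossentropy(m(x), y)

# Reverse-mode AD (via Zygote) over the whole model — no graph building.
grads = Flux.gradient(m -> loss(m, x, y), model)
```

The notable design difference from TF 1.x is that there is no separate graph-construction step: the model is plain Julia code, and gradients are taken of it directly.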
@marcoct
I came across this Julia project: https://github.com/probcomp/GenTF
that uses TF through Python. It seems to be younger than TensorFlow.jl, yet maintained for longer. That may always remain a valid option for using TF.
The (former) maintainers here seem to prefer Flux over TF. There's nothing stopping me or anyone else from forking this project and supporting the latest version. I'm just wondering: is there a need, or is it advised? I'm all for Julia-only solutions if/when they are better, and maybe Flux is the future for Julia, and indeed for the ML community in general.
I want to get more into ML and learn about 1) the Julia packages worth using and helping with, and/or the mainstream ones, 2) TF and PyTorch. For someone rather new to the packages (less new to the theory), would you recommend looking into the mainstream ones first?
Is there some restricted solution needed (or would this package already be it), similar to https://github.com/FluxML/Torch.jl?