LaurentMazare / ocaml-torch

OCaml bindings for PyTorch
Apache License 2.0
412 stars 38 forks

The simple example does not build #44

Closed philtomson closed 3 years ago

philtomson commented 4 years ago

I can run the simple example from the README.md in the toplevel:

```
utop # #require "torch.toplevel";;
utop # Torch_toplevel.register_all_pps ();;
- : unit = ()
utop # open Torch;;
utop # let () =
         let tensor = Tensor.randn [ 4; 2 ] in
         Tensor.print tensor;;
-0.8721 -0.6984
 1.2961 -0.3749
 0.9753 -0.9037
-0.4380  1.5404
[ CPUFloatType{4,2} ]
```

However, when I try to build it according to the instructions (I created example.ml and a dune file):
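(For reference, a minimal sketch of the two files, reconstructed from the toplevel session above; the dune stanza assumes the executable is named `example`, matching the `dune build example.exe` command below:)

```ocaml
(* example.ml — the same code as the toplevel session above *)
open Torch

let () =
  let tensor = Tensor.randn [ 4; 2 ] in
  Tensor.print tensor

(* A matching dune file (assumed layout):
   (executable
    (name example)
    (libraries torch)) *)
```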

```
$ dune build example.exe
    gcc src/wrapper/torch_api.o (exit 1)
(cd _build/default/src/wrapper && /usr/bin/gcc -I /home/phil/.opam/4.10.0/lib/ocaml -I /home/phil/.opam/4.10.0/lib/bytes -I /home/phil/.opam/4.10.0/lib/ctypes -I /home/phil/.opam/4.10.0/lib/integers -I /home/phil/.opam/4.10.0/lib/ocaml/threads -std=c++14 -fPIC -D_GLIBCXX_USE_CXX11_ABI=1 -isystem /home/phil/.opam/4.10.0/lib/libtorch/include -isystem /home/phil/.opam/4.10.0/lib/libtorch/include/torch/csrc/api/include -g -o torch_api.o -c torch_api.cpp)
In file included from torch_api.cpp:6:0:
torch_api.cpp: In function ‘torch::optim::Optimizer* ato_adam(double, double, double, double)’:
torch_api.cpp:326:10: error: ‘struct torch::optim::AdamOptions’ has no member named ‘betas’; did you mean ‘beta1’?
     .betas(std::tuple<double, double>(beta1, beta2))
torch_api.h:14:5: note: in definition of macro ‘PROTECT’
torch_api.cpp: In function ‘void ato_add_parameters(optimizer, at::Tensor*, int)’:
torch_api.cpp:372:10: error: ‘class torch::optim::Optimizer’ has no member named ‘param_groups’; did you mean ‘parameters’?
     t->param_groups()[0].params().push_back((tensors[i]));
torch_api.cpp: In function ‘void ato_set_learning_rate(optimizer, double)’:
torch_api.cpp:378:19: error: ‘OptimizerOptions’ is not a member of ‘torch::optim’
     torch::optim::OptimizerOptions d = &(t->defaults());
torch_api.cpp:378:19: note: suggested alternative: ‘Optimizer’
torch_api.cpp:378:37: error: ‘d’ was not declared in this scope
torch_api.cpp:378:46: error: ‘class torch::optim::Optimizer’ has no member named ‘defaults’
torch_api.cpp:379:63: error: cannot dynamic_cast ‘d’ (of type ‘’) to type ‘struct torch::optim::AdamOptions’ (source is not a pointer)
     if (auto adam = dynamic_cast<torch::optim::AdamOptions>(d))
torch_api.cpp:381:70: error: cannot dynamic_cast ‘d’ (of type ‘’) to type ‘struct torch::optim::RMSpropOptions’ (source is not a pointer)
     else if (auto rms = dynamic_cast<torch::optim::RMSpropOptions>(d))
torch_api.cpp:383:66: error: cannot dynamic_cast ‘d’ (of type ‘’) to type ‘struct torch::optim::SGDOptions’ (source is not a pointer)
     else if (auto sgd = dynamic_cast<torch::optim::SGDOptions>(d))
torch_api.cpp: In function ‘void ato_set_momentum(optimizer, double)’:
torch_api.cpp:392:19: error: ‘OptimizerOptions’ is not a member of ‘torch::optim’
     torch::optim::OptimizerOptions d = &(t->defaults());
torch_api.cpp:392:19: note: suggested alternative: ‘Optimizer’
torch_api.cpp:392:37: error: ‘d’ was not declared in this scope
torch_api.cpp:392:46: error: ‘class torch::optim::Optimizer’ has no member named ‘defaults’
torch_api.cpp:393:63: error: cannot dynamic_cast ‘d’ (of type ‘’) to type ‘struct torch::optim::AdamOptions’ (source is not a pointer)
torch_api.cpp:397:70: error: cannot dynamic_cast ‘d’ (of type ‘’) to type ‘struct torch::optim::RMSpropOptions’ (source is not a pointer)
torch_api.cpp:399:66: error: cannot dynamic_cast ‘d’ (of type ‘’) to type ‘struct torch::optim::SGDOptions’ (source is not a pointer)
torch_api.cpp: In function ‘int ati_tag(ivalue)’:
torch_api.cpp:641:17: error: ‘struct c10::IValue’ has no member named ‘isList’; did you mean ‘isInt’?
     else if (i->isList()) return 12;
```

... and lots more.

LaurentMazare commented 4 years ago

Looking at the error messages, it seems that you are compiling against an old version of PyTorch: for the Adam optimizer, `beta1` and `beta2` were replaced by `betas` in the C++ API in PyTorch 1.5. If you installed the libtorch package via opam, you may want to update it to 1.5.0. Otherwise, if you installed libtorch manually, you can get the 1.5 version from the PyTorch website.
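For reference, the API change behind the first error looks roughly like this in the libtorch C++ API (a sketch inferred from the error messages above, not exact upstream code; it assumes libtorch ≥ 1.5 headers are available):

```cpp
#include <torch/torch.h>

int main() {
  // libtorch <= 1.4 used separate setters on AdamOptions, e.g.
  //   torch::optim::AdamOptions(1e-3).beta1(0.9).beta2(0.999);
  // libtorch >= 1.5 combines them into a single betas tuple, which is
  // exactly the call the wrapper makes and the 1.4 headers reject:
  auto options = torch::optim::AdamOptions(1e-3)
                     .betas(std::make_tuple(0.9, 0.999));
  (void)options;
  return 0;
}
```

So the wrapper in torch 0.8+ targets the 1.5 API, and compiling it against 1.4 headers produces the "no member named ‘betas’" error shown above.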

philtomson commented 4 years ago

You're right. I had libtorch 1.4. I did:

```
$ opam upgrade libtorch
```

And then:

```
$ opam update
$ opam upgrade
```

```
$ opam install torch
The following actions will be performed:
  ↘ downgrade libtorch 1.5.0 to 1.4.0 [required by torch]
  ∗ install   torch    0.8
```

Opam wants to downgrade libtorch back to 1.4.0. Any idea how I can tell it to use the newer libtorch package?

LaurentMazare commented 4 years ago

We've just released torch 0.9, which should work with libtorch 1.5.0. As this just happened, you may have to update your opam repository. (And maybe you don't even need the torch opam package to be installed if you're actually compiling inside a clone of the ocaml-torch repo.)
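Concretely, the update sequence might look like this (a sketch; the version numbers are the ones mentioned in this thread, and the exact solver output may differ):

```shell
$ opam update                  # refresh the repository so torch 0.9 is visible
$ opam upgrade libtorch torch  # should now pick libtorch 1.5.0 and torch 0.9
```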