HIPS / autograd

Efficiently computes derivatives of NumPy code.
MIT License

Support for advanced library based on autograd #613

Closed · linjing-lab closed this 12 months ago

linjing-lab commented 12 months ago

I use the sympy.jacobian method in the development of optimization algorithms, and I find that symbolic differentiation gives high precision in the objective function values. @j-towns This repository recently merged a pull request, so I would like some support on applications and examples of autograd.
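For context, this is roughly how I compare the two approaches today; the toy objective below is only an illustration, not code from my library:

```python
import sympy
import autograd.numpy as np   # thinly wrapped NumPy
from autograd import jacobian

# Symbolic jacobian with sympy
x1, x2 = sympy.symbols("x1 x2")
F = sympy.Matrix([x1**2 + x2**2, x1 * x2])
J_sym = F.jacobian([x1, x2])            # exact symbolic matrix

# Automatic differentiation with autograd
def f(x):
    return np.array([x[0]**2 + x[1]**2, x[0] * x[1]])

J_auto = jacobian(f)                    # callable that evaluates the jacobian

print(J_sym.subs({x1: 1.0, x2: 2.0}))   # symbolic result, then substituted
print(J_auto(np.array([1.0, 2.0])))     # numeric result via autodiff
```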

First, I am developing Python software that targets Python 3.7+, and I found that the currently released version of autograd seems to be published under python < '3'. JAX does much more than automatically differentiate native Python and NumPy code, so adopting JAX just to wrap an advanced library seems excessive, whereas autograd, as I understand it, relies solely on NumPy to achieve automatic differentiation in native scenarios.

May I ask whether it is possible to publish a new release that supports typing and Python 3.7+? I cannot adopt hessian right now, because test_wrappers may execute under a Python version smaller than 3 while tox.ini only lists py3. Also, are there example use cases for autograd, such as a machine learning library?
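What I would like to call from Python 3.7+ is something like the following sketch (the quadratic objective is just a placeholder for my real test problems):

```python
import autograd.numpy as np
from autograd import hessian

# Placeholder objective; real objectives come from optimization test problems.
def f(x):
    return x[0]**2 + 3.0 * x[0] * x[1] + x[1]**2

H = hessian(f)                  # callable that computes the Hessian matrix
print(H(np.array([1.0, 2.0])))  # [[2. 3.] [3. 2.]]
```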

CamDavidsonPilon commented 12 months ago

autograd is Python3 compatible: https://github.com/HIPS/autograd/blob/9a90bd6172d1882235c326c56c17a9540357d86b/pyproject.toml#L28-L34

Have you tried to install and use it in a python3 environment?
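A quick smoke test in a fresh Python 3 environment would look something like this:

```python
# pip install autograd
import autograd.numpy as np
from autograd import grad

def tanh(x):
    return (1.0 - np.exp(-x)) / (1.0 + np.exp(-x))

grad_tanh = grad(tanh)   # build the gradient function
print(grad_tanh(1.0))    # ≈ 0.3932
```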

linjing-lab commented 12 months ago

OK, I have been busy making improvements to advanced software such as optimtool and its architecture by optimizing the numeric approximation. I now know that autograd supports Python 3. Does autograd adopt the mature solution as-is, with nothing further to optimize? I would like to use this software, though I haven't delved deeply into it yet.

CamDavidsonPilon commented 12 months ago

?? You'll have to try it to see if it fits your application

linjing-lab commented 12 months ago

Got it, I have already downloaded autograd!