Accenture / AmpliGraph

Python library for Representation Learning on Knowledge Graphs https://docs.ampligraph.org
Apache License 2.0

Add gradient-based optimization for calibration with negatives #239

Closed adrijanik closed 3 years ago

adrijanik commented 3 years ago

Background and Context
At the moment, model calibration with negatives is done via the default SciPy optimizer (BFGS). This works as expected for small graphs, but as the graph grows the optimization takes much longer and requires much more memory, making it slow and heavy to calibrate a model. This can be solved by switching to a gradient-based optimizer.
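
For concreteness, here is a minimal sketch of what BFGS-based Platt-scaling calibration looks like. The score arrays and the platt_nll helper are illustrative stand-ins, not AmpliGraph internals; the point is that every BFGS function evaluation touches the full score array at once, so time and memory grow with the number of triples.

```python
import numpy as np
from scipy.optimize import minimize

def platt_nll(params, scores, labels):
    """Negative log-likelihood of Platt scaling: p = sigmoid(a * s + b)."""
    a, b = params
    logits = a * scores + b
    # Numerically stable form of -sum[y*log(p) + (1-y)*log(1-p)].
    return np.sum(np.logaddexp(0.0, -logits) + (1.0 - labels) * logits)

# Synthetic scores stand in for the model's scores on positive/negative triples.
rng = np.random.default_rng(0)
pos_scores = rng.normal(1.0, 1.0, size=1000)
neg_scores = rng.normal(-1.0, 1.0, size=1000)

scores = np.concatenate([pos_scores, neg_scores])
labels = np.concatenate([np.ones_like(pos_scores), np.zeros_like(neg_scores)])

result = minimize(platt_nll, x0=np.array([1.0, 0.0]),
                  args=(scores, labels), method="BFGS")
a, b = result.x  # fitted calibration parameters
```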

Description
The fix would be either to add an extra option to the calibrate function, e.g. gradient_optimizer=False, enabling the gradient-based path when the default optimization doesn't work while keeping the functionality as it used to be, or to remove the SciPy optimization altogether and substitute a gradient-based one.
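
By contrast, a gradient-based optimizer can fit the same two parameters from mini-batches, so memory stays bounded as the graph grows. Below is a minimal sketch using TensorFlow 2 and Adam; calibrate_with_adam is a hypothetical illustration of the idea, not the library's API (the actual fix lives inside AmpliGraph's own TensorFlow code).

```python
import numpy as np
import tensorflow as tf

def calibrate_with_adam(scores, labels, epochs=50, batch_size=256, lr=1e-2):
    """Fit Platt-scaling parameters (a, b) with mini-batch Adam
    instead of full-batch SciPy BFGS."""
    a = tf.Variable(1.0)
    b = tf.Variable(0.0)
    opt = tf.keras.optimizers.Adam(learning_rate=lr)
    ds = (tf.data.Dataset
            .from_tensor_slices((scores.astype(np.float32),
                                 labels.astype(np.float32)))
            .shuffle(len(scores))
            .batch(batch_size))
    for _ in range(epochs):
        for s, y in ds:  # only one mini-batch is materialized at a time
            with tf.GradientTape() as tape:
                logits = a * s + b
                loss = tf.reduce_mean(
                    tf.nn.sigmoid_cross_entropy_with_logits(labels=y,
                                                            logits=logits))
            grads = tape.gradient(loss, [a, b])
            opt.apply_gradients(zip(grads, [a, b]))
    return a.numpy(), b.numpy()
```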

adrijanik commented 3 years ago

Closing with commit 2974593df460e55ee3ac5df624a8fb46fceccddc (SciPy optimization removed in favor of the Adam optimizer).