allenai / allennlp

An open-source NLP research library, built on PyTorch.
http://www.allennlp.org
Apache License 2.0

Remove upper bounds for requirements #5728

Closed aphedges closed 1 year ago

aphedges commented 1 year ago

Is your feature request related to a problem? Please describe.

Remove upper bounds for dependencies in requirements.txt, where possible

Describe the solution you'd like

I would like the upper bounds on versions in requirements.txt to be removed, except where a newer version is known to be broken. In #4324, the stated reason for pinning was to prevent CI breakage, and that issue also mentioned a bot that upgraded the pins automatically. However, now that the library is being retired and Dependabot was disabled in 20df7cdd3eea7f895ceee9c57e2be1a843510748, this upgrade process is no longer happening.

I would like to keep using this library, at least for a little while, and removing these upper bounds would make that much easier. Given the changed circumstances, the original reasons may no longer apply, so I think the topic is worth reconsidering. To prevent CI breakage after the upper bounds are removed, the pins could be kept in a constraints file that is used only in CI (see the sketch below). Going forward, it would be up to end users to resolve any dependency conflicts they encounter.
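For illustration, here is a minimal sketch of how the constraints-file approach could look. The file name constraints.txt, the pinned versions, and the exact commands are assumptions on my part, not anything already decided for this repository:

# requirements.txt (what users install against): lower bounds only
spacy>=2.1.0

# constraints.txt (used only in CI): retains the old upper bounds
spacy<3.5

# CI installs against the known-good pins:
pip install -r requirements.txt -c constraints.txt

# End users install without the constraints file and resolve any conflicts themselves:
pip install -r requirements.txt

Because pip applies a constraints file only to packages it is already installing, this keeps CI on tested versions without restricting what end users can resolve.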

I understand if you don't want to go through the effort for this, but I can create a PR if this idea is approved.

Describe alternatives you've considered

I've considered forking this library and applying a patch so I can upgrade spacy:

diff --git a/requirements.txt b/requirements.txt
index a9514301d70..c2bbe742460 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -7,7 +7,7 @@ cached-path>=1.1.3,<1.2.0
 fairscale==0.4.6
 jsonnet>=0.10.0 ; sys.platform != 'win32'
 nltk>=3.6.5
-spacy>=2.1.0,<3.4
+spacy>=2.1.0,<3.5
 numpy>=1.21.4
 tensorboardX>=1.2
 requests>=2.28

However, this approach requires everyone to maintain their own fork, which seems inefficient. Removing the upper bounds centrally would save work overall and keep the library usable after its official retirement.
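For completeness, installing from such a patched fork is straightforward; the fork location and branch name below are hypothetical placeholders, not real repositories:

pip install git+https://github.com/<your-fork>/allennlp.git@relax-spacy-pin

But every downstream project would still be tracking its own patched copy, which is the duplication I would like to avoid.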

epwalsh commented 1 year ago

Hey @aphedges, yes this seems like a good solution to me. I'd be happy to review a PR.