Closed mgoin closed 6 months ago
vLLM now requires torch 2.3.0, so we should relax the restriction in SparseML. Going forward, I don't think we should be so restrictive toward newer versions of PyTorch; at most we should raise a warning, not an explicit exception.
@bfineran @robertgshaw2-neuralmagic is this good with you going forward to not restrict pytorch?
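A minimal sketch of what "warn instead of raise" could look like. The helper name `check_torch_version` and the `MAX_TESTED_TORCH` constant are hypothetical, not SparseML's actual API; version parsing here is a simple numeric-tuple comparison to stay dependency-free:

```python
import warnings


# Assumed upper bound for illustration; not SparseML's real constant.
MAX_TESTED_TORCH = "2.3.0"


def _parse(ver: str) -> tuple:
    """Parse a dotted version string into a tuple of ints, e.g. '2.3.0' -> (2, 3, 0)."""
    return tuple(int(part) for part in ver.split("+")[0].split(".")[:3])


def check_torch_version(installed: str, max_tested: str = MAX_TESTED_TORCH) -> None:
    """Warn (rather than raise) when the installed torch is newer than the last tested version."""
    if _parse(installed) > _parse(max_tested):
        warnings.warn(
            f"torch {installed} is newer than the latest tested version "
            f"({max_tested}); SparseML may not behave as expected.",
            UserWarning,
        )
```

With this approach, users on torch 2.4+ would see a warning at import time instead of a hard failure, which matches the proposal above.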
@mgoin should we land this (after the GHA tests have passed)?