zhou9110 opened 2 months ago
This is a good point - @tazarov should we bump the version?
Pydantic 1.x also has a similar function, so maybe we can implement a version-aware adapter to keep backward compatibility.
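A version-aware adapter could look roughly like the sketch below. This is a minimal illustration, assuming the only difference we need to bridge is pydantic v2's `model_validate` vs. v1's `parse_obj`; `validate_model` is a hypothetical helper name, not anything in the Chroma codebase:

```python
# Sketch of a version-aware validation helper: call pydantic v2's
# model_validate() when available, fall back to v1's parse_obj() otherwise.
from typing import Any, Type, TypeVar

import pydantic
from pydantic import BaseModel

M = TypeVar("M", bound=BaseModel)

# pydantic.VERSION is a string like "2.7.1" or "1.10.12" in both major versions
PYDANTIC_V2 = pydantic.VERSION.startswith("2")


def validate_model(model_cls: Type[M], data: Any) -> M:
    """Validate `data` into `model_cls` on both pydantic 1.x and 2.x."""
    if PYDANTIC_V2:
        return model_cls.model_validate(data)
    return model_cls.parse_obj(data)
```

Call sites would then use `validate_model(MyModel, payload)` instead of calling `MyModel.model_validate(payload)` directly.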
We've added the orjson serialization in 0.5.0, and so far @zhou9110 is the first to notice, which leads me to believe that not many people are running pydantic <2.x. Maybe bumping the pydantic range to >=2.0 and ditching 1.x as part of the 0.5.x versions can be an OK approach.
However, if we go the 2.x route, we may also want to add the pydantic-settings package and refactor the config (can be done in a separate PR).
Looking at the pydantic-settings package, I can see that it basically requires pydantic >2.7:
https://github.com/pydantic/pydantic-settings/blob/6d25cee4bb7a6db592ca0da123c53f3d775cd1e1/pyproject.toml#L43
Using pydantic-settings as a package dependency, we're effectively forcing a pydantic>2.0 upgrade, which in @zhou9110's and others' cases might be a breaking change.
Another interesting aspect is our fastapi dep, fastapi>=0.95.2, which translates to:
https://github.com/tiangolo/fastapi/blob/c81e136d75f5ac4252df740b35551cf2afb4c7f1/pyproject.toml#L45
The above will cause some dependency resolution issues with older fastapi versions.
In the end, it's a trade-off, and a bit of a jerk move to force users to upgrade deps that may be pinned by their other dependencies.
Bottom line:

Option 1: pydantic>2.0 with pydantic-settings

- pro: makes the code a bit cleaner, as we'll remove all the pydantic 1.x compatibility logic
- pro: allows us to bump fastapi
- con: we add an extra dependency, pydantic-settings
- con: forces users to upgrade to pydantic>2.0
- con: there are possible paths that lead to irreconcilable dependency conflicts and breaking changes

Option 2: support for both versions

- pro: we don't have to add pydantic-settings as a direct dependency, but in case it is there we can use it
- pro: no breaking changes for users
- con: adds more boilerplate code to support both versions, which needs to be maintained and tested
- con: our tests always run on 2.x, so properly testing with the older version requires additional CI with little gain, but it would help us avoid the above error

@HammadB, any strong opinion on which option to go for?
What happened?

After installing ChromaDB and running chroma run, it throws an error when I try to create a new index using the NodeJS client. The reason, I found out, is that the version of pydantic on my machine is too old. Mine was 1.10.12, and upgrading the library to the latest version (2.7.1) solves this issue. It seems the version in requirements.txt is pydantic>=1.9, so the library doesn't get upgraded automatically during install; should the version number get bumped? Reference: https://github.com/chroma-core/chroma/blob/b34f90ce41a82d54ca4d68e43009309cd8b43f89/requirements.txt#L18

The code that throws the error (model_validate does not exist on 1.10): https://github.com/chroma-core/chroma/blob/b34f90ce41a82d54ca4d68e43009309cd8b43f89/chromadb/server/fastapi/__init__.py#L561

Versions
Chroma v0.5.0, Python 3.9.13, MacOS 14.4.1
Relevant log output
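The version difference behind the error can be checked with a minimal sketch; `Payload` below is a made-up model for illustration, not the one in Chroma's server code:

```python
# On pydantic 1.x, BaseModel has no model_validate attribute, which is why
# the server code linked above raises AttributeError on older installs.
import pydantic
from pydantic import BaseModel


class Payload(BaseModel):
    name: str


if pydantic.VERSION.startswith("1"):
    # pydantic 1.x: only parse_obj exists
    assert not hasattr(Payload, "model_validate")
    assert hasattr(Payload, "parse_obj")
else:
    # pydantic 2.x: model_validate is the supported entry point
    assert hasattr(Payload, "model_validate")
```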