Closed hd10180 closed 1 year ago
After debugging I found that the find query's limit, skip, and sort_expressions are not passed to the aggregation. See https://github.com/roman-right/beanie/blob/main/beanie/odm/queries/find.py#L527-L556
```python
def aggregate(
    self,
    aggregation_pipeline: List[Any],
    projection_model: Optional[Type[FindQueryProjectionType]] = None,
    session: Optional[ClientSession] = None,
    ignore_cache: bool = False,
    **pymongo_kwargs,
) -> Union[
    AggregationQuery[Dict[str, Any]],
    AggregationQuery[FindQueryProjectionType],
]:
    """
    Provide search criteria to the [AggregationQuery](https://roman-right.github.io/beanie/api/queries/#aggregationquery)

    :param aggregation_pipeline: list - aggregation pipeline. MongoDB doc:
        <https://docs.mongodb.com/manual/core/aggregation-pipeline/>
    :param projection_model: Type[BaseModel] - Projection Model
    :param session: Optional[ClientSession] - PyMongo session
    :param ignore_cache: bool
    :return: [AggregationQuery](https://roman-right.github.io/beanie/api/queries/#aggregationquery)
    """
    self.set_session(session=session)
    return self.AggregationQueryType(
        aggregation_pipeline=aggregation_pipeline,
        document_model=self.document_model,
        projection_model=projection_model,
        find_query=self.get_filter_query(),
        ignore_cache=ignore_cache,
        **pymongo_kwargs,
    ).set_session(session=self.session)
```
Maybe we can pass them through, like:
```python
return self.AggregationQueryType(
    aggregation_pipeline=aggregation_pipeline,
    document_model=self.document_model,
    projection_model=projection_model,
    find_query=self.get_filter_query(),
    ignore_cache=ignore_cache,
    limit=self.limit_number,  # pass the limit
    skip=self.skip_number,  # pass the skip
    sort_expressions=self.sort_expressions,  # pass the sort expressions
    **pymongo_kwargs,
).set_session(session=self.session)
```
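For context on why those three values are available to forward: Beanie's chained find methods just record state on the query object before `aggregate()` runs. This toy stand-in (not Beanie's actual class; attribute names mirror `limit_number`, `skip_number`, and `sort_expressions` from the snippet above) illustrates the idea:

```python
from typing import List, Tuple


class ToyFindQuery:
    """Minimal stand-in for a chained find query: each call records
    state and returns self, so aggregate() could forward it later."""

    def __init__(self) -> None:
        self.limit_number = 0
        self.skip_number = 0
        self.sort_expressions: List[Tuple[str, int]] = []

    def limit(self, n: int) -> "ToyFindQuery":
        self.limit_number = n
        return self

    def skip(self, n: int) -> "ToyFindQuery":
        self.skip_number = n
        return self

    def sort(self, field: str, direction: int = 1) -> "ToyFindQuery":
        self.sort_expressions.append((field, direction))
        return self


# Chaining records the modifiers; nothing is lost until aggregate()
# fails to pass them on.
q = ToyFindQuery().sort("age", -1).skip(5).limit(10)
```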
and then use those parameters in aggregation.py's `get_aggregation_pipeline` function:
```python
def get_aggregation_pipeline(
    self,
) -> List[Mapping[str, Any]]:
    match_pipeline: List[Mapping[str, Any]] = (
        [{"$match": self.find_query}] if self.find_query else []
    )
    # use the params
    sort_pipeline = {"$sort": {i[0]: i[1] for i in self.sort_expressions}}
    if sort_pipeline["$sort"]:
        match_pipeline.append(sort_pipeline)
    if self.skip_number:
        match_pipeline.append({"$skip": self.skip_number})
    if self.limit_number:
        match_pipeline.append({"$limit": self.limit_number})
    # end of patch
    projection_pipeline: List[Mapping[str, Any]] = []
    if self.projection_model:
        projection = get_projection(self.projection_model)
        if projection is not None:
            projection_pipeline = [{"$project": projection}]
    return match_pipeline + self.aggregation_pipeline + projection_pipeline
```
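The proposed stage ordering can be checked without a database. Below is a sketch that lifts the same logic into a free function (`build_pipeline` is a hypothetical name for illustration; it is not part of Beanie). Note that the find modifiers land *before* the user-supplied stages, matching the patch above:

```python
from typing import Any, List, Mapping, Optional, Tuple


def build_pipeline(
    find_query: Optional[Mapping[str, Any]],
    sort_expressions: List[Tuple[str, int]],
    skip_number: int,
    limit_number: int,
    aggregation_pipeline: List[Mapping[str, Any]],
) -> List[Mapping[str, Any]]:
    # Same ordering as the patch: $match, $sort, $skip, $limit,
    # then the user-supplied stages.
    pipeline: List[Mapping[str, Any]] = (
        [{"$match": find_query}] if find_query else []
    )
    sort_stage = {"$sort": {field: direction for field, direction in sort_expressions}}
    if sort_stage["$sort"]:
        pipeline.append(sort_stage)
    if skip_number:
        pipeline.append({"$skip": skip_number})
    if limit_number:
        pipeline.append({"$limit": limit_number})
    return pipeline + aggregation_pipeline


stages = build_pipeline(
    find_query={"active": True},
    sort_expressions=[("age", -1)],
    skip_number=5,
    limit_number=10,
    aggregation_pipeline=[{"$group": {"_id": "$city", "n": {"$sum": 1}}}],
)
```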
@roman-right
Hey! Sorry for the delay. It looks like a bug. I'll pick it up soon. Thank you
This issue is stale because it has been open 30 days with no activity.
This issue was closed because it has been stalled for 14 days with no activity.
I tried a chained call with find, sort, limit, and skip, finishing with aggregate, but got an unexpected result.
a. insert 100 records into the db
b. find data using {} as the condition
c. use limit(10) to cap the number of results: what I need is 10
d. use aggregate to query across collections
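Because of the bug, the limit from step c is silently dropped, so the aggregation in step d sees all 100 documents. Until the fix lands, a workaround is to express the find modifiers as explicit pipeline stages. A minimal sketch (the collection and field names `cups` and `water_id` are made up for illustration):

```python
# Workaround: put $limit (and $sort/$skip if needed) directly into the
# aggregation pipeline instead of relying on .limit()/.skip()/.sort(),
# which aggregate() currently ignores.
explicit_pipeline = [
    {"$limit": 10},  # replaces .limit(10) from step c
    {
        "$lookup": {  # the cross-collection query from step d
            "from": "cups",
            "localField": "_id",
            "foreignField": "water_id",
            "as": "cups",
        }
    },
]
```

This list could then be passed as the pipeline argument, e.g. something like `Water.find({}).aggregate(explicit_pipeline)`.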
my env:
Q1: Can I chain the method calls like
DocType.find().sort().limit().skip().aggregate()
?
Q2: What is the correct result of the test case?
Reproduce steps: the code is complete; it should run with pytest:
pytest --setup-show --log-cli-level=INFO
Code:
```python
import logging

import motor.motor_asyncio
import pytest
from pydantic import BaseSettings

from .models import Cup, Water, WaterInCup

LOGGER = logging.getLogger(__name__)


class Settings(BaseSettings):
    uri: str = "mongodb://127.0.0.1:27017/pytest"
    db_name: str = "test_db"


settings = Settings()


@pytest.fixture
def motor_client():
    LOGGER.info(">>>>>>>>>>init_client")
    return motor.motor_asyncio.AsyncIOMotorClient(settings.uri)


@pytest.fixture
def db(motor_client):
    LOGGER.info(">>>>>>>>>>init_db")
    db = motor_client[settings.db_name]
    return db


@pytest.fixture(autouse=True)
async def lifespan(motor_client, db):
    ...
```