milvus-io / milvus

A cloud-native vector database, storage for next generation AI applications
https://milvus.io
Apache License 2.0

[Bug]: Milvus search exception #36905

Open HuaJieHappy opened 1 month ago

HuaJieHappy commented 1 month ago

Is there an existing issue for this?

Environment

- Milvus version: 2.2.12
- Deployment mode (standalone or cluster): standalone
- MQ type (rocksmq, pulsar or kafka): default
- SDK version (e.g. pymilvus v2.0.0rc2): 2.2.12
- OS (Ubuntu or CentOS): CentOS 7.5
- CPU/Memory: 8 cores / 32 GB
- GPU: No
- Others:

Current Behavior

The application was stress tested with 30 concurrent users, and an error occurred during the search call. The error message is:

```
The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/chatfile/app/service/chat_sse_support/multiple_recall.py", line 41, in multiple_recall
    milvus_docs, milvus_image_ids = milvus_search_based_service(data, question_embedding, search_limit,
  File "/chatfile/app/recall_pics/milvus_recall.py", line 167, in milvus_search_based_service
    result = search_vectors(collection=collection, vectors_to_search=content_embedding, search_params=search_params,
  File "/chatfile/app/service/chat/milvus_search.py", line 19, in search_vectors
    results = collection.search(data=[vectors_to_search], param=search_params, limit=milvus_search_limit,
  File "/usr/local/lib/python3.8/site-packages/pymilvus/orm/collection.py", line 629, in search
    res = conn.search(self._name, data, anns_field, param, limit, expr,
  File "/usr/local/lib/python3.8/site-packages/pymilvus/decorators.py", line 109, in handler
    raise e
  File "/usr/local/lib/python3.8/site-packages/pymilvus/decorators.py", line 105, in handler
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/pymilvus/decorators.py", line 136, in handler
    ret = func(self, *args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/pymilvus/decorators.py", line 56, in handler
    raise MilvusException(message=str(e)) from e
pymilvus.exceptions.MilvusException: <MilvusException: (code=1, message=<_InactiveRpcError of RPC that terminated with:
    status = StatusCode.CANCELLED
    details = "Channel closed!"
    debug_error_string = "UNKNOWN:Channel closed! {created_time:"2024-10-09T19:01:25.108709223+08:00", grpc_message:"Channel closed!", grpc_status:1}"
>)>
```
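
For reference, the failing call in the traceback is an ordinary `collection.search(...)` from pymilvus. Below is a minimal sketch of that call pattern; the connection details, collection name, vector field name, and search params are illustrative assumptions, not values from the report.

```python
from pymilvus import Collection, connections

# Hypothetical connection details and schema names; the real values are not in the report.
connections.connect(alias="default", host="127.0.0.1", port="19530")
collection = Collection("docs")      # assumed collection name
collection.load()

search_params = {"metric_type": "L2", "params": {"nprobe": 10}}  # assumed search params


def search_vectors(vectors_to_search, milvus_search_limit=10):
    # Mirrors the call shown in the traceback: a single search RPC, which is
    # what fails with "Channel closed!" under concurrent load.
    return collection.search(
        data=[vectors_to_search],
        anns_field="embedding",      # assumed vector field name
        param=search_params,
        limit=milvus_search_limit,
    )
```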

Expected Behavior

Is it necessary to modify the configuration or upgrade the version?

Steps To Reproduce

use milvus default config

Milvus Log

No response

Anything else?

No response

yanliang567 commented 1 month ago

@HuaJieHappy Please refer to this doc to export the full Milvus logs for investigation. For Milvus installed with docker-compose, you can use `docker-compose logs > milvus.log` to export the logs.

Also, two quick questions:

  1. Do you have any write requests during the stress test?
  2. Do you have any metrics on resource usage and the Milvus pods? If convenient, I also suggest upgrading to 2.3.22 or 2.4.13, as the running Milvus 2.2.12 is quite old.

/assign @HuaJieHappy
/unassign

HuaJieHappy commented 1 month ago

Do we have a complete upgrade plan, e.g. from 2.2.12 to 2.4.13? The data can't be lost.

HuaJieHappy commented 1 month ago

How to upgrade Milvus from version 2.2.12 to 2.4.13 or another version while preserving the data and ensuring that queries can still be executed?

xiaofan-luan commented 1 month ago

Our suggestion is to upgrade from 2.2.12 to 2.3.22 (the latest 2.3) and see.

xiaofan-luan commented 1 month ago

If there are any issues, we can help investigate. All Zilliz Cloud instances have already been upgraded to 2.4, and 2.3 is already at its end of life.

HuaJieHappy commented 1 month ago

```
milvus-standalone | [2024/10/18 02:17:12.614 +00:00] [WARN] [grpcclient/client.go:341] ["ClientBase ReCall grpc first call get error"] [role=datacoord] [error="err: rpc error: code = Canceled desc = context canceled
, /go/src/github.com/milvus-io/milvus/internal/util/trace/stack_trace.go:51 github.com/milvus-io/milvus/internal/util/trace.StackTrace
/go/src/github.com/milvus-io/milvus/internal/util/grpcclient/client.go:340 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).ReCall
/go/src/github.com/milvus-io/milvus/internal/distributed/datacoord/client/client.go:435 github.com/milvus-io/milvus/internal/distributed/datacoord/client.(*Client).GetRecoveryInfoV2
/go/src/github.com/milvus-io/milvus/internal/indexcoord/index_coord.go:855 github.com/milvus-io/milvus/internal/indexcoord.(*IndexCoord).getIndexedStats
/go/src/github.com/milvus-io/milvus/internal/indexcoord/index_coord.go:936 github.com/milvus-io/milvus/internal/indexcoord.(*IndexCoord).DescribeIndex
/go/src/github.com/milvus-io/milvus/internal/distributed/indexcoord/service.go:279 github.com/milvus-io/milvus/internal/distributed/indexcoord.(*Server).DescribeIndex
/go/src/github.com/milvus-io/milvus/internal/proto/indexpb/index_coord.pb.go:2669 github.com/milvus-io/milvus/internal/proto/indexpb._IndexCoord_DescribeIndex_Handler.func1
/go/src/github.com/milvus-io/milvus/internal/util/interceptor/cluster_interceptor.go:69 github.com/milvus-io/milvus/internal/util/interceptor.ClusterValidationUnaryServerInterceptor.func1
/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware@v1.3.0/chain.go:25 github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1.1.1
/go/src/github.com/milvus-io/milvus/internal/util/logutil/grpc_interceptor.go:22 github.com/milvus-io/milvus/internal/util/logutil.UnaryTraceLoggerInterceptor
"]
```

@xiaofan-luan

HuaJieHappy commented 1 month ago

15 threads for the search function.
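
For context, a stress loop along the following lines would drive the same code path with 15 concurrent search threads. This is only an illustrative sketch; the collection name, vector field, embedding dimension, and query count are assumptions, not values from the report.

```python
import random
from concurrent.futures import ThreadPoolExecutor

from pymilvus import Collection, connections

# Hypothetical connection details and schema names; not taken from the report.
connections.connect(host="127.0.0.1", port="19530")
collection = Collection("docs")
collection.load()

DIM = 768  # assumed embedding dimension
SEARCH_PARAMS = {"metric_type": "L2", "params": {"nprobe": 10}}


def one_query(_):
    # Generate a random query vector and run a single search, as each worker
    # thread in the stress test would.
    vec = [random.random() for _ in range(DIM)]
    return collection.search(data=[vec], anns_field="embedding",
                             param=SEARCH_PARAMS, limit=5)


# 15 concurrent searchers, matching the thread count mentioned above.
with ThreadPoolExecutor(max_workers=15) as pool:
    results = list(pool.map(one_query, range(1000)))
```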

HuaJieHappy commented 1 month ago

> Our suggestion is to upgrade from 2.2.12 to 2.3.22 (the latest 2.3) and see.

Can the previous data be retained after the upgrade?

xiaofan-luan commented 1 month ago

Yes, it should be all compatible.
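
A quick sanity check run before and after the upgrade can help confirm the data was retained. This is only a sketch using standard pymilvus calls; the host and port are assumptions.

```python
from pymilvus import Collection, connections, utility

# Assumed host/port; run the same script before and after the upgrade
# and compare the output.
connections.connect(host="127.0.0.1", port="19530")

print("server version:", utility.get_server_version())
for name in utility.list_collections():
    # Entity counts per collection should match across the upgrade.
    print(name, "entities:", Collection(name).num_entities)
```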

xiaofan-luan commented 1 month ago

Doing a backup beforehand is recommended, just in case.

HuaJieHappy commented 1 month ago

> Doing a backup beforehand is recommended, just in case.

Use the Milvus backup tool?

HuaJieHappy commented 1 month ago

(Same datacoord warning as in the log posted above: `ClientBase ReCall grpc first call get error ... rpc error: code = Canceled desc = context canceled`.) @xiaofan-luan

Need help, please. @yanliang567

stale[bot] commented 1 week ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. Rotten issues close after 30d of inactivity. Reopen the issue with /reopen.

dbc-2024 commented 2 days ago

/reopen