
[Bug]: index not found & collection not loaded #39256

Closed
1 task done
iurii-stepanov opened this issue Jan 14, 2025 · 1 comment

@iurii-stepanov

Is there an existing issue for this?

  • I have searched the existing issues

Environment

- Milvus version: 2.4.10
- Deployment mode(standalone or cluster): cluster
- MQ type(rocksmq, pulsar or kafka): external Kafka
- SDK version(e.g. pymilvus v2.0.0rc2):
- OS(Ubuntu or CentOS): 
- CPU/Memory: 
- GPU: 
- Others:

Current Behavior

On December 25, we created the "obo_3200_union" collection and inserted about 3.2 million items into it.
Then, on December 26, an IVF_FLAT index was created (COSINE, nlist: 512). After the index was built, the number of objects in the collection dropped by about 500k, to roughly 2.7 million.
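
For reference, this is roughly how the index was created via pymilvus (a minimal sketch; the connection endpoint and the vector field name "embedding" are placeholders, not our actual values):

```python
from pymilvus import connections, Collection

# Placeholder endpoint; our cluster uses different connection settings.
connections.connect(alias="default", uri="http://milvus-proxy:19530",
                    db_name="shopping_assistant_dev")

coll = Collection("obo_3200_union")

# IVF_FLAT index with the parameters mentioned above.
coll.create_index(
    field_name="embedding",  # assumed field name
    index_params={
        "index_type": "IVF_FLAT",
        "metric_type": "COSINE",
        "params": {"nlist": 512},
    },
)

# The row count we watched is the collection statistic reported by the server.
print(coll.num_entities)  # dropped from ~3.2M to ~2.7M after indexing
```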

Today I noticed that the Proxy logs contain errors about an unloaded collection and a missing index:

{"level":"WARN","time":"2025/01/13 14:04:34.722 +00:00","caller":"proxy/task_scheduler.go:477","message":"Failed to execute task: ","traceID":"766f3d4c49faa20a1c9c61fa2a66ce1c","error":"failed to query: collection not loaded[collection=454267287951409670]","errorVerbose":"failed to query: collection not loaded[collection=454267287951409670]\n(1) attached stack trace\n  -- stack trace:\n  | github.com/milvus-io/milvus/internal/proxy.(*queryTask).Execute\n  | \t/workspace/source/internal/proxy/task_query.go:471\n  | github.com/milvus-io/milvus/internal/proxy.(*taskScheduler).processTask\n  | \t/workspace/source/internal/proxy/task_scheduler.go:474\n  | github.com/milvus-io/milvus/internal/proxy.(*taskScheduler).queryLoop.func1\n  | \t/workspace/source/internal/proxy/task_scheduler.go:553\n  | github.com/milvus-io/milvus/pkg/util/conc.(*Pool[...]).Submit.func1\n  | \t/workspace/source/pkg/util/conc/pool.go:81\n  | github.com/panjf2000/ants/v2.(*goWorker).run.func1\n  | \t/go/pkg/mod/github.com/panjf2000/ants/[email protected]/worker.go:67\n  | runtime.goexit\n  | \t/usr/local/go/src/runtime/asm_amd64.s:1650\nWraps: (2) failed to query\nWraps: (3) collection not loaded[collection=454267287951409670]\nError types: (1) *withstack.withStack (2) *errutil.withPrefix (3) merr.milvusError"}
{"level":"WARN","time":"2025/01/13 14:04:34.722 +00:00","caller":"proxy/impl.go:3490","message":"Query failed to WaitToFinish","traceID":"766f3d4c49faa20a1c9c61fa2a66ce1c","role":"proxy","db":"shopping_assistant_dev","collection":"obo_3200_union","partitions":[],"ConsistencyLevel":"Strong","useDefaultConsistency":false,"error":"failed to query: collection not loaded[collection=454267287951409670]","errorVerbose":"failed to query: collection not loaded[collection=454267287951409670]\n(1) attached stack trace\n  -- stack trace:\n  | github.com/milvus-io/milvus/internal/proxy.(*queryTask).Execute\n  | \t/workspace/source/internal/proxy/task_query.go:471\n  | github.com/milvus-io/milvus/internal/proxy.(*taskScheduler).processTask\n  | \t/workspace/source/internal/proxy/task_scheduler.go:474\n  | github.com/milvus-io/milvus/internal/proxy.(*taskScheduler).queryLoop.func1\n  | \t/workspace/source/internal/proxy/task_scheduler.go:553\n  | github.com/milvus-io/milvus/pkg/util/conc.(*Pool[...]).Submit.func1\n  | \t/workspace/source/pkg/util/conc/pool.go:81\n  | github.com/panjf2000/ants/v2.(*goWorker).run.func1\n  | \t/go/pkg/mod/github.com/panjf2000/ants/[email protected]/worker.go:67\n  | runtime.goexit\n  | \t/usr/local/go/src/runtime/asm_amd64.s:1650\nWraps: (2) failed to query\nWraps: (3) collection not loaded[collection=454267287951409670]\nError types: (1) *withstack.withStack (2) *errutil.withPrefix (3) merr.milvusError"}
{"level":"WARN","time":"2025/01/13 14:59:47.520 +00:00","caller":"proxy/task_scheduler.go:477","message":"Failed to execute task: ","traceID":"1fabf9b994aa1bc827eedae6362fc3c8","error":"index not found[collection=obo_3200_union]"}
{"level":"WARN","time":"2025/01/13 14:59:47.521 +00:00","caller":"proxy/impl.go:2166","message":"DescribeIndex failed to WaitToFinish","traceID":"1fabf9b994aa1bc827eedae6362fc3c8","role":"proxy","db":"shopping_assistant_dev","collection":"obo_3200_union","field":"","index name":"","error":"index not found[collection=obo_3200_union]","BeginTs":455286572482035716,"EndTs":455286572482035716}

Today I also tried releasing the collection from memory and loading it again from the web UI, and then recreating the index, but the errors still appear in the Proxy logs.
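
The same release / reindex / reload cycle, expressed as pymilvus calls instead of web-UI clicks (again a sketch with a placeholder endpoint and an assumed field name):

```python
from pymilvus import connections, Collection

connections.connect(alias="default", uri="http://milvus-proxy:19530",
                    db_name="shopping_assistant_dev")  # placeholder endpoint

coll = Collection("obo_3200_union")
coll.release()      # unload the collection from the query nodes
coll.drop_index()   # drop the existing IVF_FLAT index
coll.create_index(
    field_name="embedding",  # assumed field name
    index_params={"index_type": "IVF_FLAT", "metric_type": "COSINE",
                  "params": {"nlist": 512}},
)
coll.load()         # load the collection back into memory
```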

At the same time, search against the collection still works, but we would like to figure out the following (a diagnostic sketch follows this list):

  • why did the number of entities in the collection decrease after the index was created?
  • why do the Proxy logs report a missing index and an unloaded collection?
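
A diagnostic sketch we can run to double-check what the server reports (the endpoint is a placeholder; the count(*) comparison is only a guess at why the two figures might differ):

```python
from pymilvus import connections, Collection, utility

connections.connect(alias="default", uri="http://milvus-proxy:19530",
                    db_name="shopping_assistant_dev")  # placeholder endpoint

coll = Collection("obo_3200_union")

# 1. Load state as seen by the coordinator / query nodes.
print(utility.load_state("obo_3200_union"))
print(utility.loading_progress("obo_3200_union"))

# 2. Indexes the server knows about for this collection.
for idx in coll.indexes:
    print(idx.field_name, idx.params)

# 3. Cached row count vs. an actual count(*) query; num_entities is a
#    segment statistic and may not match a live count after compaction.
print("num_entities:", coll.num_entities)
print("count(*):", coll.query(expr="", output_fields=["count(*)"]))
```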

Expected Behavior

Expected:

  • the collection size does not decrease after the index is created
  • no such errors appear in the Proxy logs

Steps To Reproduce

No response

Milvus Log

milvus-log.tar.gz

Anything else?

No response

@iurii-stepanov added the kind/bug and needs-triage labels on Jan 14, 2025
@iurii-stepanov (Author)

This issue was created by mistake.
