Connection timed out occurred while dumping a big Redis instance #13
Comments
What is the timeout value set in your
0, it has always been 0. Well, other clients work fine, so I guess the huge number of keys causes the problem (it consumes too much memory or something).
Do you know how large the key is (estimated)?
I use "keys *" on a smaller redis instance(around 30,000 results), it works. And on the large one(more than 10,000,000 results return), it failed. |
Seems like timeout 0 doesn't work as assumed. When I set it to 0 I get many timeouts (from other clients too); when set to 120 it works more stably... strange.
Having this issue when dumping a keyspace of 8 million keys. The timeout is set to a huge number but
Ya, I imagine that's annoying. @shortdudey123 what's the timeout set to in your
@delano not sure off the top of my head, but it's several hours
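For anyone comparing notes, here is a minimal redis-py sketch of checking and changing the server-side idle timeout this thread keeps coming back to, alongside a client-side socket timeout. The host, port, and values are placeholders, not settings taken from the thread.

```python
import redis

# Client-side socket timeout in seconds (placeholder value).
r = redis.Redis(host="localhost", port=6379, socket_timeout=120)

# Server-side idle-connection timeout; "timeout 0" in redis.conf means
# the server never closes idle connections.
print(r.config_get("timeout"))   # e.g. {'timeout': '0'}

# Change it at runtime (not persisted across a restart unless rewritten
# into the config file).
r.config_set("timeout", 300)
```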
Same issue for me - a timeout after a few seconds, even when a filter is applied in order to dump a small number of keys.
It's also slow as shit because it doesn't use pipelining.
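To illustrate what pipelining would buy here, a rough redis-py sketch (redis-dump itself is a Ruby tool, so this is only an analogy, and the keys are made up): batching commands avoids paying a network round trip per key.

```python
import redis

r = redis.Redis(host="localhost", port=6379)
keys = ["user:1", "user:2", "user:3"]   # illustrative keys only

# One network round trip per command:
slow = [(r.type(k), r.ttl(k)) for k in keys]

# Pipelined: commands are buffered and sent as a single batch.
pipe = r.pipeline(transaction=False)
for k in keys:
    pipe.type(k)
    pipe.ttl(k)
fast = pipe.execute()   # results come back in command order
```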
I have run into the same problem. How can I resolve it?
redis-dump -u :[email protected]:6379 -D -d 3 -c 500 > ./redisprodb3.json (use -c to set the chunk size; 500 keys per batch works around the timeout)
When I dump a Redis instance with 10,000,000 keys to JSON, I get a "Connection timed out" error. I suspect the issue may be caused by the "keys" command; could there be a hotfix in the near future?
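Since the report points at the blocking KEYS command, and the -c 500 workaround above amounts to processing keys in chunks, here is a hedged redis-py sketch of walking the keyspace with cursor-based SCAN instead of KEYS. The batch size, db number, and string-only handling are assumptions for illustration, not what redis-dump actually does.

```python
import json
import redis

r = redis.Redis(host="localhost", port=6379, db=3, decode_responses=True)

# SCAN walks the keyspace incrementally with a cursor, so the server never
# has to materialise all 10,000,000 key names at once the way KEYS * does.
# count=500 is only a hint to the server, echoing the -c 500 workaround.
dump = {}
for key in r.scan_iter(match="*", count=500):
    if r.type(key) == "string":
        dump[key] = r.get(key)
    # lists, hashes, sets, and sorted sets would each need their own reader

with open("redisprodb3.json", "w") as fh:
    json.dump(dump, fh)
```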