[Bug]: #721
Comments
ulimit -n
Testing if ulimit -n 2048 fixes it. Checked ulimit -Hn; running with ulimit -n 4096 does not fix the issue.
I am now at the per-user hard limit of open files, which I cannot easily change.
Hey @awb99. How big is your database? Probably the …
The eavt dump is 12.8MB. The import on the old Datahike version worked fine.
The branching factor is 512, which should yield approximately one file per 512 datoms per index; with history there are 6 indices, without history 3. (6/512) * 100000 = 1171.875 files, so this would exceed a ulimit of 1024. I think a batch size of 10k should be safe and fast enough.
I got it to work now! Batching solved it! Thanks @whilo. BTW: Datahike's migrate prints that it does batches, but in reality it does not. When importing from CBOR, is it still necessary to set :max-eid and :max-tx?
Great! I am not sure about :max-eid and :max-tx. @yflim, do you remember why you put them there? In case you have a modified import function, feel free to open a PR; I think we can stick to 10k batching as a default for now.
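
For readers landing here: the fix discussed above amounts to transacting the exported datoms in chunks instead of all at once, so each transaction flushes only a bounded number of index files. A minimal sketch, assuming a `conn` from datahike.api and a hypothetical `datoms` sequence read from the dump (the name and the 10k chunk size are illustrative, not Datahike's actual migrate internals):

```clojure
(require '[datahike.api :as d])

(defn import-batched
  "Transacts `datoms` into `conn` in chunks of 10k so that each
  transaction only touches a bounded number of index files."
  [conn datoms]
  (doseq [batch (partition-all 10000 datoms)]
    (d/transact conn {:tx-data (vec batch)})))
```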
What version of Datahike are you using?
0.6.1594
What version of Java are you using?
openjdk 19.0.2 2023-01-17
What operating system are you using?
GUIX
What database EDN configuration are you using?
{:store {:backend :file
         :path "data/datahike-db"}
 :keep-history? false
 :schema-flexibility :write}
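
For context, a config like this is used with the Datahike API roughly as follows (a sketch of standard usage, not part of the original report):

```clojure
(require '[datahike.api :as d])

(def cfg {:store {:backend :file
                  :path "data/datahike-db"}
          :keep-history? false
          :schema-flexibility :write})

(d/create-database cfg)      ;; one-time setup
(def conn (d/connect cfg))   ;; connection used for transact/q
```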
Describe the bug
This is what I did:
io.replikativ/konserve {:mvn/version "0.7.311"} ==> io.replikativ/konserve {:mvn/version "0.7.319"}
io.replikativ/datahike {:mvn/version "0.6.1542"} ==> io.replikativ/datahike {:mvn/version "0.6.1594"}
I guess this exception is good, as it means that Datahike is much faster, so I must be hitting some kind of filesystem limit. I will investigate how to raise that limit.
What is the expected behaviour?
No crash.
How can the behaviour be reproduced?
I can share a private repo with the dataset that produces the error.