Problem

Currently Wyng's deduplication code is RAM-bound (as are most deduplicators), which puts an effective limit on the size of an archive that can be deduplicated.
Possible solution
- detect the large-archive condition and the available RAM resources
- move the lion's share of dedup indexes out of RAM (and out of /tmp)
This would trade off performance for the ability to perform the dedup; a rough sketch of the idea follows.
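Since Wyng is written in Python, here is a minimal sketch of what that could look like. This is not Wyng's actual code; the names (`DiskIndex`, `make_index`, the per-entry cost constant) and the SQLite spill strategy are hypothetical illustrations of choosing between an in-RAM dict and an on-disk index based on estimated archive size and `MemAvailable`:

```python
import sqlite3

RAM_BYTES_PER_ENTRY = 200  # assumed rough cost of one in-RAM index entry

def available_ram_bytes():
    """Approximate available RAM on Linux via /proc/meminfo."""
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemAvailable:"):
                return int(line.split()[1]) * 1024  # value is in kB
    return 0

class DiskIndex:
    """Hash -> chunk-location map kept in SQLite instead of RAM.

    Placing the file on a real disk (e.g. inside the archive dir) keeps it
    out of /tmp, which is often a RAM-backed tmpfs. Mirrors just enough of
    the dict interface (idx[h] = loc, idx.get(h)) for the caller.
    """
    def __init__(self, path):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS idx (hash BLOB PRIMARY KEY, loc TEXT)")

    def __setitem__(self, chunk_hash, location):
        self.db.execute("INSERT OR REPLACE INTO idx VALUES (?, ?)",
                        (chunk_hash, location))

    def get(self, chunk_hash, default=None):
        row = self.db.execute("SELECT loc FROM idx WHERE hash = ?",
                              (chunk_hash,)).fetchone()
        return row[0] if row else default

def make_index(estimated_chunks, spill_path):
    """Use a plain dict while it fits comfortably in RAM; otherwise spill."""
    if estimated_chunks * RAM_BYTES_PER_ENTRY < available_ram_bytes() // 2:
        return {}                     # small archive: in-RAM dict is fastest
    return DiskIndex(spill_path)      # large archive: trade speed for capacity
```

The dict path stays on the current fast behavior for archives that fit; only the large-archive case pays the on-disk lookup cost.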
Alternate solution (workaround)
For unencrypted archives, users could have jdupes (or a similar utility) do a hardlink or reflink dedup on the archive dir. Otherwise, a dedup-capable filesystem like Btrfs or ZFS could be used. (These options would not work on encrypted archives unless Wyng started offering a deterministic encryption mode.)
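To make the workaround concrete, a hedged sketch that shells out to jdupes (assuming it is installed; `-r` recurses and `-L` replaces duplicates with hardlinks; the archive path shown is a hypothetical example):

```python
import subprocess

def hardlink_dedup(archive_dir):
    # -r: recurse into the archive dir; -L: replace duplicate files with
    # hardlinks. On filesystems with reflink support (e.g. Btrfs),
    # ["jdupes", "-r", "-B", archive_dir] would do a block-clone dedupe
    # instead, which avoids sharing inodes between chunk files.
    subprocess.run(["jdupes", "-r", "-L", archive_dir], check=True)

hardlink_dedup("/mnt/backups/wyng-archive")  # hypothetical path
```

Note that hardlinked duplicates share a single inode, so a reflink-style dedupe is the safer choice where the filesystem supports it.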