Hello, I have about 6 TB of data (about 100,000 files, the largest being 120 GB) to copy to a CIFS share. Is gocryptfs robust enough to handle encrypting such a large amount of data?
Thanks for your help.
Replies: 3 comments
-
@rfjakob where are you, dear?
-
The volume of data is no problem, but there is a known issue with CIFS: #532. It seems to be a bug in the Linux kernel and/or the SMB server. This comment, restic/restic#2659 (comment), suggests that the problem has been fixed in the latest Linux kernels. So before deleting the original files, do make sure that all files in gocryptfs are readable; "gocryptfs -fsck" checks that. Alternatively, take the sha256sum of all original files and check the copies against those hashes, as sketched below.
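For example, a minimal sketch of both verification approaches (the paths here are placeholders, not from this thread):

# integrity check of the encrypted directory (the dir holding the encrypted files, not the mount)
gocryptfs -fsck /path/to/cipherdir

# or: hash every original file, then verify the copy against the list
(cd /path/to/originals && find . -type f -exec sha256sum {} +) > /tmp/orig.sha256
(cd /mnt/gocryptfs-copy && sha256sum -c /tmp/orig.sha256)

The second approach also catches any corruption introduced by the copy itself, not just unreadable files.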
-
Very interesting, thank you for the links; I will read all of this.