I have created a chunk store of 3 files, each 500 MB in size.
The chunk store is hosted on an HTTP server.
While extracting, the size of the cache directory keeps increasing: even after the first 500 MB file has been reconstructed in the target path, the cache/tmp path still holds its old chunks.
An option to remove already-used chunks would be useful on systems with limited disk space.
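Until such an option exists, a post-extraction cleanup can cap the cache size. The sketch below is a hypothetical workaround, not a casync feature: the cache path is an assumption (substitute whatever directory you point casync's cache at), and the `.cacnk` suffix matches casync's default chunk file naming. Only run it after extraction has fully completed, since deleting chunks mid-run would force them to be re-downloaded.

```shell
#!/bin/sh
# Hypothetical workaround: prune cached chunk files after an extract finishes.
# CACHE_DIR is an assumption -- point it at the cache directory you use.
CACHE_DIR="${CACHE_DIR:-./casync-cache-demo}"

# Demo setup so this sketch is self-contained: a fake cache laid out like a
# casync chunk store (subdirectories holding *.cacnk chunk files).
mkdir -p "$CACHE_DIR/0a1b"
printf 'chunkdata' > "$CACHE_DIR/0a1b/0a1b2c.cacnk"
printf 'chunkdata' > "$CACHE_DIR/0a1b/0a1b3d.cacnk"

echo "before: $(find "$CACHE_DIR" -name '*.cacnk' | wc -l | tr -d ' ') chunks"

# After `casync extract` has reconstructed the target file, its chunks are no
# longer needed locally; delete them so the cache does not keep growing.
find "$CACHE_DIR" -name '*.cacnk' -type f -delete

echo "after: $(find "$CACHE_DIR" -name '*.cacnk' | wc -l | tr -d ' ') chunks"
```

The trade-off is that a later extract of a similar file loses the benefit of the cache and re-fetches shared chunks over HTTP, which is exactly the memory-versus-bandwidth choice the feature request is about.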
You could have a look at casync-nano. If I remember correctly, it does not have this nasty habit of keeping already-used chunks locally, but I am not 100% sure about that.