split up zip files so that they are in 500gig chunks #328
Comments
Also, in 7za the command just needs -v500g for a 500 GB split. This will result in the file extension of a sub-500 GB file always being .zip.001.
Hi Kieron,
Just catching up with stuff after a week on leave, sorry for delay...
I know I explained our setup in a separate email, but just for this context: if we weren't getting that managed by the Spectra systems, I'd definitely go for your approach of dividing into smaller datasets for storage to tape. An arbitrary 500 or 250 GB sounds good, but I wonder if there's a size that aligns systematically with your data tape infrastructure's parameters? E.g. a percentage of the usable storage capacity on the tapes?
Just a thought, because our system uses that approach, chunking using a dynamic calculation against the usable TB on the destination. Don't know why, precisely, but I've always assumed it optimises the result. Secret sauce.
Anyway I like your approach and would love to keep posted on how it goes. Blog, blog, blog, blog........
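A minimal sketch of that dynamic-calculation idea (the capacities used below are assumptions for illustration, not the Spectra system's actual parameters): pick the largest chunk size at or below the target that divides a tape's usable capacity into a whole number of chunks, so no tail space is stranded at the end of each tape.

```python
def chunk_size_for_tape(usable_bytes: int, target_chunk_bytes: int) -> int:
    """Largest chunk size <= target_chunk_bytes that packs the tape's
    usable capacity into a whole number of equal-sized chunks."""
    n_chunks = -(-usable_bytes // target_chunk_bytes)  # ceiling division
    return usable_bytes // n_chunks
```

For example, with a hypothetical 6.4 TB of usable capacity and a 500 GB target, this yields 13 chunks of roughly 492 GB each, filling the tape almost exactly, whereas fixed 500 GB chunks would fit only 12 to a tape and strand 400 GB.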
I think I should plough ahead with this. It should all be simple, though some thinking is needed to make sure that the numerical file extensions (rather than .zip) won't affect any workflows. I doubt they will.
So there are too many approx. 1.5 TB zipfiles that are partially filling up LTO tapes in our library, and we only have a license for 24 slots. To save us constantly swapping out tapes, perhaps splitting the zip files into smaller chunks will let them fill the tapes more completely on an ongoing basis.
500 GB seems like it should be fine; maybe something smaller like 250 GB is better...
Curious to know what @stephenmcconnachie thinks...