S3Drive
Community / support / Error when downloading a lot of files (1.5+GB)
I2rys (安全) 7/20/2024 12:20 PM
I was downloading some files (1.5+ GB in total) from my bucket because the internet today is pretty fast: about 80 Mbps average (both upload and download, but again, in S3Drive it can't even reach 10-11 Mbps), yet it keeps showing this ;-; It happens on Backblaze, iDrive e2 and Storj, and I'm using the latest version. (edited)
12:20 PM
Does it affect the archived file? Yup, it won't download XD RIP (I won't be able to get my files). Suggestions: 1. Perhaps an offline CLI to decrypt directories and files. 2. Download files individually without archiving. (edited)
I2rys (安全) changed the channel name: Error when downloading a lot of files (1.5+GB) 7/20/2024 2:31 PM
Tom, replying to I2rys (安全):
Until we improve the speed and resilience of the standard download, you can either use the in-app Sync feature or connect directly from the CLI, as you say. Since the encryption is Rclone-compatible, you can configure an Rclone profile manually: https://docs.s3drive.app/advanced/ Alternatively, if you use the Sync feature (e.g. you select a source folder from your S3 account), it will configure the Rclone entry automatically (for internal use), which you can then use from Rclone, e.g.: rclone ls s3drive_auto_MSjaArpfKFyW8PyLProF1U0OFOhfaLsRMY1vPLNpxYV:bucketname or copy with a configured alias, e.g.: rclone copy s3drive_noversion:yourfile.txt c:\folder I've also added a feature request to download multiple files without an archive: https://s3drive.canny.io/feature-requests/p/implement-multiple-files-download-without-archive If you have further trouble accessing your encrypted bucket from the Rclone CLI, please let us know; we'll certainly help out. (edited)
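For reference, a manually configured profile (following the docs link above) might look roughly like the sketch below in rclone.conf. Every value is a placeholder, not a real credential, and the remote name s3drive_manual is made up for illustration:

```ini
; Hypothetical rclone.conf entry; all values below are placeholders.
[s3drive_manual]
type = s3
provider = Other
access_key_id = YOUR_ACCESS_KEY_ID
secret_access_key = YOUR_SECRET_ACCESS_KEY
endpoint = https://s3.your-provider.example
```

You would then list a bucket with e.g. rclone ls s3drive_manual:bucketname.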
I2rys (安全) 7/23/2024 12:22 AM
Hi, when I tried Sync it didn't manage to pull all the files (3141 according to S3Drive's info, but it pulled 3094; I later found out that the count includes .empty files), and if I try with rclone it doesn't decrypt them, so perhaps I got the command wrong (example command: rclone copy "s3drive_auto_U2Um8OrYEYMkBBef6bElnpL5wQs1YiFriJTSdrQVx4g:private-ne-fn-b" "C:\Users\yes\Documents\y" --transfers=8; with the enc one it does nothing, it just exits). Edit: after further testing, it turns out the error came from the .empty files. Also, how would I know if it synced successfully? (edited)
Tom, replying to I2rys (安全):
I hadn't mentioned that if encryption is enabled, then in your rclone copy command you need to use the backend with type = crypt; it is also created automatically. For instance, the non-encrypted backend is s3drive_auto_acddd458-d307-4053-b072-1180909eb54a, whereas the encrypted one is called s3drive_enc_s3ultimate and points to: remote = s3drive_auto_acddd458-d307-4053-b072-1180909eb54a:bucket (please find the attached screenshot). In other words, in your copy you would use s3drive_enc_s3ultimate instead of s3drive_auto_acddd458-d307-4053-b072-1180909eb54a:bucket. (edited)
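To illustrate the relationship between the two entries, here is a sketch of how they might look in rclone.conf, assuming the names quoted above; the credential and password lines are placeholders for the obscured values the app generates:

```ini
; Sketch only; credentials and passwords are placeholders.
[s3drive_auto_acddd458-d307-4053-b072-1180909eb54a]
type = s3
; ... credentials for the underlying S3 account ...

[s3drive_enc_s3ultimate]
type = crypt
remote = s3drive_auto_acddd458-d307-4053-b072-1180909eb54a:bucket
password = OBSCURED_PASSWORD
```

A copy that decrypts on the fly then targets the crypt remote, e.g.: rclone copy s3drive_enc_s3ultimate: "C:\Users\yes\Documents\y"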
Tom, replying to I2rys (安全): "Also how would I know if it synced successfully?"
As long as an error is returned, there is no guarantee that the sync/copy has finished. In fact it may be caused by the .empty file, which isn't expected to be empty given that encryption is enabled. Can you please try adding the --exclude ".empty" flag to your rclone copy? We'll look into this issue more closely. We might need to exclude the .empty file ourselves or make it encryption-compliant (that is, regardless of whether it's empty or not, it likely should have encryption headers included).
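As a rough sanity check after a copy, one can compare the destination folder against a source folder (or a known-good earlier download) by relative path and file size. This is only a sketch of the idea, with .empty excluded as suggested above; rclone's own check command is the proper tool, and all names here are hypothetical:

```python
from pathlib import Path

def tree_summary(root):
    """Map each file's path (relative to root) to its size in bytes."""
    root = Path(root)
    return {str(p.relative_to(root)): p.stat().st_size
            for p in root.rglob("*") if p.is_file()}

def verify_copy(src, dst, exclude=(".empty",)):
    """Compare two local directory trees.

    Returns (missing, mismatched): files present in src but absent
    from dst, and files whose sizes differ. Basenames listed in
    `exclude` are skipped on the source side.
    """
    src_files = {rel: size for rel, size in tree_summary(src).items()
                 if Path(rel).name not in exclude}
    dst_files = tree_summary(dst)
    missing = sorted(rel for rel in src_files if rel not in dst_files)
    mismatched = sorted(rel for rel in src_files
                        if rel in dst_files and dst_files[rel] != src_files[rel])
    return missing, mismatched
```

Note this only works between two plain local folders; against a crypt remote the stored sizes include encryption overhead, which is why rclone check remains the authoritative answer.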
I2rys (安全) 7/24/2024 1:36 PM
I see. Btw, when I tried that, it now keeps saying: Skipping undecryptable dir name: bad PKCS#7 padding - too long
1:37 PM
Command used: rclone copy s3drive_enc_main: test --exclude ".empty"
Tom, replying to I2rys (安全):
This is a NOTICE returned by Rclone when it fails to decrypt a filename; it doesn't fail by default and continues further: https://rclone.org/crypt/#crypt-strict-names You might give it a go and add the -vvv flag to Rclone to run in verbose mode; this might display the affected file paths. This error may indicate a couple of things: usually either the password used for this filepath was different, or it wasn't encrypted in the first place (therefore Rclone fails to decrypt something which was never encrypted). If you don't see weird-looking filenames at the destination you were copying/syncing files to, then that would support this hypothesis.
I2rys (安全) 7/25/2024 12:01 AM
Hi, it turns out the password was indeed wrong (I somehow messed up the rclone config 💀), but it's all good now and it's downloading great. Thanks!
👌 1
Exported 11 message(s)
Timezone: UTC+0