S3Drive
Community / general
For all on-topic discussion about S3Drive or related storage providers.
slano
[s3drive_remote]
type = s3
provider = Other
access_key_id = <redacted>
secret_access_key = <redacted>
endpoint = s3.eu-central-003.backblazeb2.com

[s3drive_crypt]
type = crypt
filename_encoding = base64
remote = s3drive_remote:bucket_name
password = <password generated with echo "password-from-e2e-in-s3drive" | rclone obscure ->
filename_encryption = standard
directory_name_encryption = true
suffix = none
That looks fine indeed. What's your Rclone version?
Tom
That looks fine indeed. What's your Rclone version?
v1.8.0
slano
v1.8.0
rclone version, I guess you've provided S3Drive version?
Tom
rclone version, I guess you've provided S3Drive version?
yes, that was the S3Drive version, sorry. rclone: v1.53.3-DEV
  • os/arch: linux/amd64
  • go version: go1.18.1
slano
yes, that was the S3Drive version, sorry. rclone: v1.53.3-DEV
  • os/arch: linux/amd64
  • go version: go1.18.1
We haven't tested that with anything below 1.6.5, I can't recall exactly, but there was some issue with S3Drive <> Rclone compatibility below that version. Would you be keen to upgrade your Rclone version and see if that config works for you?
Tom
We haven't tested that with anything below 1.6.5, I can't recall exactly, but there was some issue with S3Drive <> Rclone compatibility below that version. Would you be keen to upgrade your Rclone version and see if that config works for you?
sure thing, on it
slano
[s3drive_remote]
type = s3
provider = Other
access_key_id = <redacted>
secret_access_key = <redacted>
endpoint = s3.eu-central-003.backblazeb2.com

[s3drive_crypt]
type = crypt
filename_encoding = base64
remote = s3drive_remote:bucket_name
password = <password generated with echo "password-from-e2e-in-s3drive" | rclone obscure ->
filename_encryption = standard
directory_name_encryption = true
suffix = none
directory_name_encryption = true - do you also have filename/filepath encryption enabled on the S3Drive side?
Tom
directory_name_encryption = true - do you also have filename/filepath encryption enabled on the S3Drive side?
yes I do
Tom
We haven't tested that with anything below 1.6.5, I can't recall exactly, but there was some issue with S3Drive <> Rclone compatibility below that version. Would you be keen to upgrade your Rclone version and see if that config works for you?
well, upgrading to 1.66.0 and regenerating the obscured password did the trick, thanks! There's probably some breaking change in rclone, since I'm unable to decrypt a file uploaded via rclone version 1.53.3: 2024/03/21 14:49:41 NOTICE: q6iai7p6mlrj2joi3k4pvp6nio: Skipping undecryptable file name: not a multiple of blocksize (edited)
👍 1
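One possible culprit worth ruling out when an obscured password stops matching (an assumption on my part, not something confirmed above): plain `echo` appends a trailing newline to whatever it pipes into `rclone obscure -`, and depending on the rclone version that newline may or may not be stripped before obscuring. The byte counts below show the difference; `printf` is the safer habit:

```shell
# echo adds a trailing "\n"; printf '%s' does not:
echo 'secret' | wc -c          # 7 bytes, newline included
printf '%s' 'secret' | wc -c   # 6 bytes, exactly the password
# Safer way to generate the obscured form for the rclone config:
#   printf '%s' 'password-from-e2e-in-s3drive' | rclone obscure -
```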
I’m using MinIO on another system on my LAN. Yes, everything works just fine in the S3Drive program. I’m going to assume yes on the Read question for IAM, as I can use the S3Drive program to its full function. No, I do not have E2E encryption enabled. (edited)
Ok, so with a little bit more sleuthing, here is what I have been able to do:
  • I can mount S3Drive in Settings and all works fine.
  • I can see the "existing files" that were in the S3 bucket in the mounted folder/drive, but I cannot open them on the local machine that S3Drive is on.
  • I downloaded an image from Google and saved it in the mounted folder, and I am able to view the image in S3Drive and open it in the mounted folder/drive.
5:19 PM
So I'm going to isolate my question to specifically being able to view files that existed in the S3 bucket prior to installing S3Drive. Thank you for your continued responses, as I'm sure you are busy with current updates. This is a great program and solution, and I look forward to continuing to use it. I can move this convo to support if you would like to not clutter up the gen-pop channel.
BabyPop joined the server. 3/21/2024 7:22 PM
CyberKiller joined the server. 3/22/2024 7:53 AM
CyberKiller
9:37 AM
hi, I stumbled on this tool today. I've been looking on and off for something that lets me use cheap pay-as-you-go S3 storage, instead of shelling out upfront for terabytes of Google Drive or Dropbox, and this seems like it could be what I'm looking for
9:38 AM
that is if I manage to get it working, as I hit a problem on my OpenSUSE, posted a bug on the support chat
CyberKiller
hi, I stumbled on this tool today. I've been looking on and off for something that lets me use cheap pay-as-you-go S3 storage, instead of shelling out upfront for terabytes of Google Drive or Dropbox, and this seems like it could be what I'm looking for
Indeed, that was one of the initial ideas when starting this project: the ability to use reasonably priced commodity storage as a file cloud, with an option to self-host if needed (e.g. MinIO).
CyberKiller
that is if I manage to get it working, as I hit a problem on my OpenSUSE, posted a bug on the support chat
We'll try to address your issue as soon as possible. We use a Linux-based distro ourselves for app development and we love it, but believe me, it's somewhat tricky to get it all right across all Linux distributions (especially when combined with budget/resource planning and Linux's tiny userbase vs Windows/Mac) (edited)
Tom
We'll try to address your issue as soon as possible. We use a Linux-based distro ourselves for app development and we love it, but believe me, it's somewhat tricky to get it all right across all Linux distributions (especially when combined with budget/resource planning and Linux's tiny userbase vs Windows/Mac) (edited)
That's cool. I'm a senior engineer myself, so I should be able to provide you with all the debug info needed
sifayne joined the server. 3/24/2024 7:11 PM
Abubakr joined the server. 3/25/2024 6:44 AM
1azytrip joined the server. 3/26/2024 3:39 PM
raas joined the server. 3/26/2024 8:00 PM
Deleted User joined the server. 3/28/2024 4:40 PM
Deleted User 3/28/2024 4:42 PM
Hi I just found s3drive today. I’ve been trying to upload a video file but each time I try it just fails half way through. I’ve tried via the app on iOS and the web browser. Any solution ?
Deleted User
Hi I just found s3drive today. I’ve been trying to upload a video file but each time I try it just fails half way through. I’ve tried via the app on iOS and the web browser. Any solution ?
Can you please create a support item? #support Please specify the error (you will probably find one in the Transfers tab), S3 provider, whether encryption is enabled and the approximate file size. Thanks.
👍 1
Riccardo Bellanova joined the server. 3/29/2024 9:13 PM
Riccardo Bellanova 3/29/2024 9:19 PM
9:21 PM
Hello 👋🏻, I've been looking for a long time for something that works as a drive with Cubbit S3 storage... Is this the right place? Is it a dream?
Riccardo Bellanova
Hello 👋🏻, I've been looking for a long time for something that works as a drive with Cubbit S3 storage... Is this the right place? Is it a dream?
Hi Riccardo, we've got quite a few users using S3Drive with Cubbit S3.
Tom
Hi Riccardo, we've got quite a few users using S3Drive with Cubbit S3.
Riccardo Bellanova 3/29/2024 9:53 PM
Top! I will try it in the coming days. Thank you
Biel joined the server. 3/30/2024 3:47 AM
t_rott
So I'm going to isolate my question to specifically being able to view files that existed in the S3 bucket prior to installing S3Drive. Thank you for your continued responses, as I'm sure you are busy with current updates. This is a great program and solution, and I look forward to continuing to use it. I can move this convo to support if you would like to not clutter up the gen-pop channel.
Sorry, I've missed this. Please go ahead and create a support item, after which we can hopefully find out where the problem lies.
InfiniteAds559 3/30/2024 1:48 PM
Hi Tom, I remember I was able to download (not to my phone) individual media for offline use, and now the offline option is only available for folders.
Deleted User 3/30/2024 4:57 PM
Does anyone know any zero knowledge clouds with encryption/vault that can be used via s3drive ?
kefir joined the server. 3/30/2024 9:49 PM
Hi, I'm busy backing up my files to a remote s3 storage, in essence migrating away from Google photos. But I'm curious about the roadmap and the "Photo management tools" on the roadmap for 2024. I often search my media files by face recognition, by geographic location, and by date/time. How will that work with the planned s3drive features? Will a shared database/index be stored on s3, or will an index be local to a device? Will I have to re-index/process all files before they can be searched?
Aru99 joined the server. 3/31/2024 6:18 AM
Tom
Hi Riccardo, we've got quite a few users using S3Drive with Cubbit S3.
Hi, I just installed this app and it's saying this, can you please help me out?
ZenGnostic joined the server. 3/31/2024 7:56 AM
Riccardo Bellanova started a thread. 3/31/2024 9:51 AM
Deleted User 3/31/2024 7:52 PM
Has anyone used koofr or koofr vault with the s3drive app ? if so can anyone help me please
How much does Cubbit cost?
5:17 AM
Their pricing page says nothing
Ronco joined the server. 4/1/2024 10:15 AM
beli3ver joined the server. 4/2/2024 9:43 AM
Thank you. That's all I want to say. Just thanks to you. All the other providers don't really allow me to preview videos and images when I encrypt them. Or there's no mounting on Linux as a drive. Thank you. Really, thank you.
beli3ver
Thank you. That's all I want to say. Just thanks to you. All the other providers don't really allow me to preview videos and images when I encrypt them. Or there's no mounting on Linux as a drive. Thank you. Really, thank you.
Thanks for the kind words. We're doing our best to make the encrypted experience as seamless as possible and are constantly working on improving the cipher in terms of security and performance. I hope that in the near future we will be able to provide even more encrypted features, as we're actively working on cipher improvements: https://github.com/rclone/rclone/issues/7192
d@rshan joined the server. 4/2/2024 1:26 PM
@Tom Can you support an encrypted export of the config?
@Tom how can I sync a folder from the system without copying it to the storage? I use B2 and want to sync my /home/user/.ssh folder. Thanks.
beli3ver
@Tom Can you support an encrypted export of the config?
I've just added this as a feature request: https://s3drive.canny.io/feature-requests/p/encrypted-export-of-the-config We will likely have it added in a couple of months, or sooner if we have some spare time.
beli3ver
@Tom how can I sync a folder from the system without copying it to the storage? I use B2 and want to sync my /home/user/.ssh folder. Thanks.
Sorry, I didn't get that. You can select a local path, like on the attached screenshot, and then select the remote. Wouldn't that work?
Found it, but when I click on the clock next to the time, I just see this:
4:08 PM
Just a black overlay but no window to set the sync time
4:08 PM
@Tom
beli3ver
Just a black overlay but no window to set the sync time
Oh dear, it seems there is a regression after we updated our dependencies, and our tests haven't captured it. I've passed it along for fixing; in the meantime you can use the default "sync every" setting. Thanks for spotting that.
Android the same 😄
4:13 PM
no problem just reporting it 🙂
4:14 PM
Should I test it with iOS too 😄
beli3ver
no problem just reporting it 🙂
Apparently the "quick" workaround would be to set the theme to Light (instead of Dark - this can be done in the Settings after opening a drawer menu), setting your preferred time and then reverting Theme settings back to your preferred. (edited)
beli3ver
Should I test it with iOS too 😄
Nope, it's broken on all platforms.
Thanks, the workaround works
Darktoxicola joined the server. 4/3/2024 10:29 AM
Jo Colina joined the server. 4/3/2024 7:25 PM
Alberto joined the server. 4/4/2024 6:44 PM
EmporioBreak joined the server. 4/5/2024 5:27 AM
energetic joined the server. 4/5/2024 6:59 PM
ahtE joined the server. 4/6/2024 8:26 PM
VATER joined the server. 4/6/2024 8:52 PM
InfiniteAds559 4/8/2024 1:00 AM
Hi Tom, what's your 3-2-1 backup implementation? I'm looking for a way to automate my local backup. Keeping tabs on which files I haven't backed up yet to a local drive is tedious. What I'm thinking, since I'm already using S3Drive with Backblaze, is to back up Backblaze to a local drive via a spare Mac mini I have lying around that auto-backs up when it detects new files. Kind of like Syncthing (not sure if you're familiar with it). Thoughts?
InfiniteAds559
Hi Tom, what's your 3-2-1 backup implementation? I'm looking for a way to automate my local backup. Keeping tabs on which files I haven't backed up yet to a local drive is tedious. What I'm thinking, since I'm already using S3Drive with Backblaze, is to back up Backblaze to a local drive via a spare Mac mini I have lying around that auto-backs up when it detects new files. Kind of like Syncthing (not sure if you're familiar with it). Thoughts?
I wouldn't say my setup is perfect or automated. I usually use the old fashioned:
rsync -av --exclude='cache' --exclude='build' source dest
to sync data to another local machine, and then archive things and send them compressed and password protected to Backblaze:
7z a -mhc=on -mhe=on -pVeryHardPasswordHere $folder.7z /home/tom/$folder/*
AWS_ACCESS_KEY_ID=<key> AWS_SECRET_ACCESS_KEY=<access> aws --endpoint-url https://s3.eu-central-001.backblazeb2.com s3 cp $folder.7z s3://my-backup-bucket
I use S3Drive to back up media from my phone to the cloud and for online access to other media files (mostly older photos). I'm yet to find the perfect backup strategy for photos, but I would say at this stage the bigger problem is keeping things tidy, organized and deduplicated. Eventually I will get to that. (edited)
CyberKiller 4/8/2024 7:33 AM
if I may cut in... tools like this one, or other synced storage apps, aren't really a good backup solution. They only count as a backup in case of hardware failure, but not in other data loss scenarios. E.g. if you delete a file by accident and that deletion gets immediately propagated to the remote storage, you lose both copies. A good backup tool takes snapshots periodically and keeps a number of changed versions, so in case you delete a local copy, there is an earlier one that you can get back to. For cloud object storage I know of 2 pretty good backup tools. One is Restic, which can cooperate with Rclone to support a huge number of providers, and compresses and encrypts the backups. Another is the closed source CloudBerry Backup, though to get reasonable features you have to get a paid version, and still it's mostly limited to S3 and Swift types. Restic requires scripting around it: setting up cron jobs or timers, dropping old snapshots, pruning storage etc. (pruning takes a lot of bandwidth, so it's a rare operation); CloudBerry has a built-in scheduler. In any case, both are nice and, most of all, you set them up once and don't have to maintain much later. (edited)
CyberKiller
if I may cut in... tools like this one, or other synced storage apps, aren't really a good backup solution. They only count as a backup in case of hardware failure, but not in other data loss scenarios. E.g. if you delete a file by accident and that deletion gets immediately propagated to the remote storage, you lose both copies. A good backup tool takes snapshots periodically and keeps a number of changed versions, so in case you delete a local copy, there is an earlier one that you can get back to. For cloud object storage I know of 2 pretty good backup tools. One is Restic, which can cooperate with Rclone to support a huge number of providers, and compresses and encrypts the backups. Another is the closed source CloudBerry Backup, though to get reasonable features you have to get a paid version, and still it's mostly limited to S3 and Swift types. Restic requires scripting around it: setting up cron jobs or timers, dropping old snapshots, pruning storage etc. (pruning takes a lot of bandwidth, so it's a rare operation); CloudBerry has a built-in scheduler. In any case, both are nice and, most of all, you set them up once and don't have to maintain much later. (edited)
I've also heard positive comments about https://github.com/kopia/kopia tool, though I haven't used it personally.
A good backup tool takes snapshots periodically, and keeps a number of changed versions, so in case you delete a local copy, there is an earlier one that you can get back to.
I am not saying that I would recommend it (given that plenty of other tools suited for backup purposes exist), but this could be achieved with Rclone and even S3Drive (if someone is afraid of the Rclone + cron CLI). There are 3 ingredients:
  • S3 bucket with enabled versioning,
  • Sync mode with periodic timer (e.g. every 24 hours),
  • Lifecycle policy to clean up older versions
(edited)
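The three ingredients above can be sketched with the AWS CLI (a sketch under assumptions: the bucket name and the 90-day retention are placeholders, and other S3-compatible providers expose versioning/lifecycle slightly differently, e.g. Backblaze B2 has its own lifecycle rules):

```shell
# 1) Turn on versioning so overwrites/deletes keep prior object versions:
#      aws s3api put-bucket-versioning --bucket my-backup-bucket \
#        --versioning-configuration Status=Enabled
# 2) The periodic sync itself is the S3Drive "sync every" timer, or e.g.:
#      rclone sync /home/user/documents s3drive_remote:my-backup-bucket/documents
# 3) Lifecycle rule: expire versions 90 days after they stop being current.
cat > lifecycle.json <<'EOF'
{"Rules": [{
  "ID": "expire-old-versions",
  "Status": "Enabled",
  "Filter": {},
  "NoncurrentVersionExpiration": {"NoncurrentDays": 90}
}]}
EOF
#      aws s3api put-bucket-lifecycle-configuration --bucket my-backup-bucket \
#        --lifecycle-configuration file://lifecycle.json
```

The `aws` calls are commented out so the snippet is safe to paste; only the policy file is written locally.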
CyberKiller 4/8/2024 7:45 AM
a versioned bucket is ok if you want to restore just single files, but if you wanted a directory tree from a particular date, then it's going to be a chore to select all those objects.
7:47 AM
I haven't heard of kopia before, but the features look nice
7:48 AM
though one thing that I feel is missing from restic is support for cold storage like Glacier or OVH archive; e.g. using a separate hot bucket for metadata and the cold one just for data blobs
CyberKiller
a versioned bucket is ok if you want to restore just single files, but if you'd want a directory tree from a particular date, then it's going to be a chore to select all those objects.
You're right, but it's a matter of finding the right tooling to do the restore, because all of the data is materialized as versions with proper timestamps in the S3 bucket. For instance, you could say: I want this directory with all of its files as of 3rd Jan 2022 2:00 PM, and it should retrieve the relevant data and restore the hierarchy. (edited)
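To make that point-in-time selection concrete, here is a rough sketch (bucket and prefix names are placeholders, and the logic is demonstrated on an inline fixture rather than a live bucket): for each key, pick the newest version at or before the cutoff from `aws s3api list-object-versions` output.

```shell
CUTOFF="2022-01-03T14:00:00Z"
# Per key, keep the newest version at or before the cutoff:
FILTER='[.Versions[] | select(.LastModified <= $cutoff)]
        | group_by(.Key) | map(max_by(.LastModified))
        | .[] | "\(.Key)\t\(.VersionId)"'
# Live input would come from:
#   aws s3api list-object-versions --bucket my-backup-bucket --prefix photos/
# Demo on an inline fixture (two versions of "a", one of "b"):
echo '{"Versions":[
  {"Key":"a","VersionId":"v1","LastModified":"2021-12-01T00:00:00Z"},
  {"Key":"a","VersionId":"v2","LastModified":"2022-02-01T00:00:00Z"},
  {"Key":"b","VersionId":"v3","LastModified":"2021-11-01T00:00:00Z"}]}' \
  | jq -r --arg cutoff "$CUTOFF" "$FILTER"
# Prints a/v1 and b/v3: version v2 of "a" is newer than the cutoff, so v1 wins.
# Each key/version pair could then be fetched with:
#   aws s3api get-object --bucket <bucket> --key "$key" --version-id "$vid" "out/$key"
```

The timestamp comparison works lexicographically because all `LastModified` values share the same ISO 8601 UTC format.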
CyberKiller 4/8/2024 7:49 AM
in fact, I haven't seen any modern tools which could do it, apart from Bareos (used with the AWS tape storage emulator), but I wouldn't call it modern 😉 (edited)
7:51 AM
in any case, it's always a matter of the effort needed to use these kinds of tools; having any backup is always good, even if it's just a copied floppy disk like we did in the 90s ;-D
CyberKiller
in any case, it's always a matter of the effort needed to use these kinds of tools; having any backup is always good, even if it's just a copied floppy disk like we did in the 90s ;-D
I am fine with spending a little bit more effort on the restore side (as long as it works) and less on the backup side, since the frequency of these two is quite different. In other words, I rely on backups more as disaster recovery rather than as a working copy.
CyberKiller 4/8/2024 7:53 AM
oh yes, definitely
7:56 AM
my hobby server data survived the great OVH fire in 2021 (zero data lost!), so I take it my strategy is good enough (I used restic daily snapshots, pushed to Swift in another datacenter location) (edited)
👍 1
Neoth joined the server. 4/8/2024 9:54 AM
greetings gents
🫡 2
Yusarina joined the server. 4/8/2024 3:16 PM
sencha joined the server. 4/8/2024 5:06 PM
Is it normal that you cannot upload files to the bucket via the mounted drive? If I upload a file via the S3Drive application, it can also be seen later in iDrive e2 and in the mounted drive of S3Drive. If a file is placed directly in the mounted drive, it is not uploaded to the iDrive e2 bucket.
ReplaX
Is it normal that you cannot upload files to the bucket via the mounted drive? If I upload a file via the S3Drive application, it can also be seen later in iDrive e2 and in the mounted drive of S3Drive. If a file is placed directly in the mounted drive, it is not uploaded to the iDrive e2 bucket.
Depending on the cache settings (available in the app), if a file is copied to the mounted drive, it shall either appear immediately in the bucket (cache: Off) or with some small delay (depending on the connection speed, file size etc.). If the file never appears, then there might be some issue in the upload process. What cache settings do you use? Do you have E2E enabled? What region do you use for iDrive e2? (edited)
Tom
Depending on the cache settings (available in the app), if a file is copied to the mounted drive, it shall either appear immediately in the bucket (cache: Off) or with some small delay (depending on the connection speed, file size etc.). If the file never appears, then there might be some issue in the upload process. What cache settings do you use? Do you have E2E enabled? What region do you use for iDrive e2? (edited)
Mount cache mode is "writes", E2EE enabled, region is Frankfurt, Germany
4:50 PM
but I'm not in cached mode
ReplaX
Mount cache mode is "writes", E2EE enabled, region is Frankfurt, Germany
Thanks. Is name/filepath encryption also enabled? I will set up a test environment on my end and will let you know if I've found anything later today.
👍🏻 1
yes it is
ReplaX
yes it is
Hi, I've found the issue. If the "Default encryption" setting is enabled on the iDrive side, then they return different object hashes, which causes the integrity checks to fail for the mount. https://www.idrive.com/s3-storage-e2/rclone https://forum.rclone.org/t/issues-with-idrive-e2-corrupted-transfers/36085 We can fix that by setting server_side_encryption = aws:kms in the config, which we've checked solves the issue; the challenge is that we don't know whether the user actually enabled that setting on the iDrive side. The quick fix is to turn off the "Default encryption" setting for the iDrive bucket; then the mount shall upload objects to iDrive without issues. We need to spend more time on this to research whether we can detect this setting or whether we need to implement a prompt/question for the user and provide a configurable setting. (edited)
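For reference, a sketch of where that setting would sit in an rclone-style config (the remote name and endpoint are placeholders, and `provider = IDrive` is assumed from rclone's S3 provider list, not stated above):

```ini
[idrive_remote]
type = s3
provider = IDrive
access_key_id = <redacted>
secret_access_key = <redacted>
endpoint = <your-idrive-e2-endpoint>
; Needed when the bucket has iDrive's "Default encryption" enabled, otherwise
; the returned object hashes differ and the mount's integrity checks fail:
server_side_encryption = aws:kms
```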
Tom
Hi, I've found the issue. If the "Default encryption" setting is enabled on the iDrive side, then they return different object hashes, which causes the integrity checks to fail for the mount. https://www.idrive.com/s3-storage-e2/rclone https://forum.rclone.org/t/issues-with-idrive-e2-corrupted-transfers/36085 We can fix that by setting server_side_encryption = aws:kms in the config, which we've checked solves the issue; the challenge is that we don't know whether the user actually enabled that setting on the iDrive side. The quick fix is to turn off the "Default encryption" setting for the iDrive bucket; then the mount shall upload objects to iDrive without issues. We need to spend more time on this to research whether we can detect this setting or whether we need to implement a prompt/question for the user and provide a configurable setting. (edited)
Tom you are the best! 😊 I do indeed have Default encryption enabled at iDrive. I used your documentation for the bucket setup https://docs.s3drive.app/setup/bucket/ and there default encryption is also enabled. But as you wrote, you used Backblaze there; maybe it's not a problem with them. You could add a note there that for iDrive it should be disabled. I did the quick fix and now it's working. Thank you very much!
👍 1
Exported 100 message(s)
Timezone: UTC+0