S3Drive
Community / support / Very slow folder list on android
I have an s3 bucket that I'm syncing my media files to. There are also some other folders that I've synced in with rclone. When listing the entire bucket contents with rclone, I can time the operation to 40 seconds on my linux computer, by running something like "time rclone ls mybucket:". The resulting text file is approximately 7.5MB. When I try to view the content from s3drive, I get a notification almost every time that the listing is outdated, and I'm prompted to refresh the file list. If I agree and try to update this, the listing still isn't ready after 30 minutes. I've tried timing this on several occasions, but it turns out I don't have the time to sit and stare at my phone for an hour to get an accurate timing. I believe I've reported something similar before, probably more than a year ago. I've had this issue from day one. I keep my app up to date.
kefir
Thanks for your feedback.
I have an s3 bucket that I'm syncing my media files to
What's your S3 provider, and how many files do you expect to have in that bucket? Did you configure that bucket in S3Drive as an S3 back-end, or as an Rclone (type: s3) back-end?
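Since the object count matters here, one quick way to get it is rclone's size command. A minimal sketch, assuming the same `mybucket:` remote name used earlier in this thread:

```shell
# Report the number of objects and total size in the bucket.
rclone size mybucket:

# If the bucket has versioning enabled, include past revisions in the count:
rclone size mybucket: --s3-versions
```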
I can time the operation to 40 seconds on my linux computer, by running something like "time rclone ls mybucket:".
For a faster response, can you try the lsf command instead of ls? You probably don't need the full recursive output/dump at once; instead you likely want to list only the directories and files in a given path. Unless you just wanted to compare it to S3Drive's "Cached mode" speed.
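The difference between the two can be sketched as follows (remote and path names are placeholders for this thread's setup):

```shell
# Full recursive listing of every object -- the operation timed at ~40 s:
time rclone ls mybucket:

# Listing only the immediate entries of one path, non-recursively --
# usually far faster, and closer to what a file browser needs per screen:
time rclone lsf mybucket:path/to/folder
```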
When I try to view the content from s3drive, I get a notification almost every time that the listing is outdated, and I'm prompted to refresh the file list.
In principle you don't need "Cached mode" if you don't need sorting or search all the time. If you use standard mode, you should get results straight away.
If I agree and try to update this, the listing still isn't ready after 30 minutes.
Do you have versioning enabled at the bucket level? Do you expect to have a lot of versions (past revisions) of files? We would need to double-check how we build the "Cached view", but 40 s for Rclone versus 30-60 minutes for S3Drive sounds like there is room for improvement on our end.
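To check whether versioning is enabled, either tool below should work; a hedged sketch, assuming the remote points at the bucket and the provider supports the versioning API (bucket names are placeholders):

```shell
# Read the bucket's versioning status via rclone's s3 backend command:
rclone backend versioning mybucket:

# Equivalent check with the AWS CLI, if your provider supports it:
aws s3api get-bucket-versioning --bucket my-bucket-name
```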
Exported 2 message(s)
Timezone: UTC+0