s3-benchmark vs s5cmd

| | s3-benchmark | s5cmd |
|---|---|---|
| Mentions | 4 | 11 |
| Stars | 776 | 2,339 |
| Growth | - | 2.2% |
| Activity | 0.0 | 7.3 |
| Last commit | 4 months ago | about 2 months ago |
| Language | Go | Go |
| License | MIT License | MIT License |
- Stars: the number of stars a project has on GitHub.
- Growth: month-over-month growth in stars.
- Activity: a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
s3-benchmark
- S3 Benchmark: Measure Amazon S3's performance from any location
- S3 Benchmark
- Ask HN: Have you ever switched cloud?
There's another benchmark somewhere showing S3 can max out a 100Gbps instance.
https://github.com/dvassallo/s3-benchmark
Another potential issue is ListBucket rate limiting: if you have lots of small objects, you'll spend more time discovering the object names than transferring data.
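For a rough sense of scale (request latency is an assumed round number, not a measurement): S3's ListObjectsV2 returns at most 1,000 keys per page, and pages within a prefix are fetched sequentially, so enumeration alone can take minutes:

```sh
# Back-of-envelope: 10M objects, 1,000 keys per LIST page,
# ~50 ms per request (assumed).
awk 'BEGIN {
  objects = 10000000; per_page = 1000; rtt = 0.05
  pages = objects / per_page
  printf "%d LIST requests, ~%.1f minutes just to enumerate keys\n",
         pages, pages * rtt / 60
}'
```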
- A distributed POSIX file system built on top of Redis and S3
TTFB in S3 is 20-30 ms around the 50th percentile; it can go much higher at p99 [1]. In any case, rotational latency for HDD drives is an order of magnitude lower (typically 2-5 ms for a seek operation).
S3 is great for higher-throughput workloads where TTFB is amortized across larger downloads (this is why it's very common to use S3 as a "data lake" storing larger columnar files, usually on the order of hundreds of MiB); the rough numbers sketched below illustrate the effect.
I think it's an interesting project but perhaps explaining the use cases where this solution is beneficial would go a long way here.
[1] https://github.com/dvassallo/s3-benchmark
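To make the amortization concrete, here is a minimal sketch assuming a 25 ms TTFB and 100 MiB/s of per-stream bandwidth (both assumed round numbers); effective throughput is size / (TTFB + size / bandwidth):

```sh
# Effective throughput for a 4 KiB, 1 MiB, and 256 MiB object.
for size in 4096 1048576 268435456; do
  awk -v s="$size" 'BEGIN {
    ttfb = 0.025               # 25 ms time to first byte (assumed)
    bw   = 100 * 1024 * 1024   # 100 MiB/s stream bandwidth (assumed)
    printf "%11d bytes -> %6.2f MiB/s effective\n",
           s, (s / (ttfb + s / bw)) / (1024 * 1024)
  }'
done
```

Under these assumptions the 4 KiB object crawls at roughly 0.16 MiB/s while the 256 MiB object sees about 99 MiB/s, which is why large columnar files suit S3 so well.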
s5cmd
- GitHub issues from top Open Source Golang Repositories that you should contribute to
s5cmd - Extended character support for s3 compatible backend
- Migrate 5 TB S3 bucket from one AWS account to another
I've used a tool in the past called s5cmd to copy millions of objects, and it was strikingly fast: https://github.com/peak/s5cmd
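As a hedged sketch of what that kind of copy can look like (bucket names hypothetical; worker count tuned per machine):

```sh
# Fan the copy out across many parallel workers; credentials must
# be valid for both the source and destination buckets.
s5cmd --numworkers 256 cp 's3://source-bucket/*' 's3://destination-bucket/'
```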
- Those using AWS, have you ever tried to use AWS Transfer Family to transfer files into an S3 bucket? Can I use Python to make these uploads, and if so, how do I set it up in AWS?
Some folks say https://github.com/peak/s5cmd is faster than the two options above.
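If the goal is simply scripted bulk uploads into the bucket behind Transfer Family, a minimal s5cmd sketch (bucket name and prefix hypothetical) would be:

```sh
# Upload everything under ./uploads to a bucket prefix in parallel.
s5cmd cp './uploads/*' 's3://my-transfer-bucket/incoming/'
```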
- Gcloud storage: up to 94% faster data transfers for Cloud Storage
- Faster way to empty S3 buckets?
- A Dockerfile for Perl 5.36 / Alpine, with working SSL

```dockerfile
# Fetch the s5cmd release binary in a single layer so the `cd`
# persists and the temp directory is cleaned up after the move.
RUN mkdir /tmp/output && cd /tmp/output \
    && wget --no-check-certificate https://github.com/peak/s5cmd/releases/download/v1.2.1/s5cmd_1.2.1_Linux-64bit.tar.gz \
    && tar xvzf s5cmd_1.2.1_Linux-64bit.tar.gz \
    && mv s5cmd /usr/bin/s5cmd \
    && cd / && rm -rf /tmp/output
```
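A quick sanity check that the binary landed on the PATH (image tag hypothetical):

```sh
docker build -t perl-s5cmd . && docker run --rm perl-s5cmd s5cmd version
```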
- DataSync Vs AWS S3 sync?
Not that I've seen, but you might check out https://github.com/peak/s5cmd
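s5cmd also ships a sync subcommand, which behaves like a one-way `aws s3 sync`; a sketch with hypothetical buckets:

```sh
# Copy only new or changed objects from source to destination.
s5cmd sync 's3://source-bucket/data/*' 's3://destination-bucket/data/'
```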
- S3/100gbps question
I like to use https://github.com/peak/s5cmd
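On large instances most of the win comes from fan-out across objects; a hedged example (bucket and local path hypothetical, worker count tuned to the machine):

```sh
# Pull a whole prefix onto local NVMe with many objects in flight.
s5cmd --numworkers 512 cp 's3://big-bucket/prefix/*' /mnt/nvme/data/
```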
- Downloading files from S3 with multithreading and Boto3
Excellent walkthrough, love boto. We've recently been using s5cmd, which we've found to be ridiculously faster than boto without any extra boto tricks.
https://github.com/peak/s5cmd
- How to download millions of files from S3? (AWS CLI stops working after 1st million)
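For the millions-of-objects case, s5cmd's run mode reads a file of operations and executes them concurrently, avoiding per-invocation overhead; a minimal sketch with made-up keys:

```sh
# commands.txt: one s5cmd operation per line (keys are hypothetical).
cat > commands.txt <<'EOF'
cp s3://my-bucket/data/part-0000.bin ./data/
cp s3://my-bucket/data/part-0001.bin ./data/
EOF
s5cmd run commands.txt
```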
What are some alternatives?
warp - S3 benchmarking tool
rclone - "rsync for cloud storage" - Google Drive, S3, Dropbox, Backblaze B2, One Drive, Swift, Hubic, Wasabi, Google Cloud Storage, Azure Blob, Azure Files, Yandex Files