| | s3fs | smart_open |
|---|---|---|
| Mentions | 2 | 6 |
| Stars | 149 | 3,093 |
| Growth | 0.7% | 0.8% |
| Activity | 0.0 | 8.3 |
| Latest Commit | 4 months ago | 14 days ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
s3fs
- Best Linux friendly cloud storage services
  s3fs with a provider like Backblaze will probably be the absolute cheapest you’ll get.
- Flask app, I'm connecting to an s3 bucket. How do I persist the connection for a user session?
  Alternatively, if you want the cutting-edge code, you can check out the GitHub repo at https://github.com/pyfilesystem/s3fs
smart_open
- smart_open: Utils for streaming large files (S3, HDFS, gzip, bz2...)
- Use AWS to unzip all of Wikipedia in 10 minutes
  We’re using smart_open, which is an amazing library that lets you open objects in S3 (and other cloud object stores) as if they’re files on your filesystem. It’s obviously critical that we’re able to seek to an arbitrary position in an S3 file without first downloading the whole thing. We’ll assume you’re using Poetry, but you should be able to follow along with any other package manager.
- Using AWS and Hyperscan to match regular expressions on 100GB of text
  If you didn’t follow along with the first article in this series, you should be able to follow this article with your own dataset as long as you install smart_open and Meadowrun. smart_open is an amazing library that lets you open objects in S3 (and other cloud object stores) as if they’re files on your filesystem, and Meadowrun makes it easy to run your Python code on the cloud.
- Ask HN: Codebases with great, easy to read code?
  I see that you're primarily looking into Python work, so I'd recommend `smart_open` as a nice, compact way to get started.
  https://github.com/RaRe-Technologies/smart_open
- How to open an s3 binary file in lambda using python open() function?
  You want smart_open. It gives you a (more complete) file-like interface to many different storage systems, including S3. You can read and seek as needed.
- Fsspec: Filesystem Interfaces for Python
  See also smart_open: https://github.com/RaRe-Technologies/smart_open which might be more user-friendly? Never used it myself, but it was on HN before. Discussion on their bug tracker: https://github.com/RaRe-Technologies/smart_open/issues/579
What are some alternatives?
- rclone - "rsync for cloud storage" - Google Drive, S3, Dropbox, Backblaze B2, One Drive, Swift, Hubic, Wasabi, Google Cloud Storage, Azure Blob, Azure Files, Yandex Files
- Streamz - Real-time stream processing for python
- aws-sdk-go-v2 - AWS SDK for the Go programming language.
- s3path - s3path is a pathlib extension for AWS S3 Service
- s3www - Serve static files from any S3 compatible object storage services (Let's Encrypt ready)
- PyFilesystem2 - Python's Filesystem abstraction layer
- django-s3file - A lightweight file upload input for Django and Amazon S3
- rxsci - ReactiveX for data science
- fluvio-client-python - The Fluvio Python Client!
- BorgBackup - Deduplicating archiver with compression and authenticated encryption.
- requests - A simple, yet elegant, HTTP library.