boto3 VS aws-cli

Compare boto3 vs aws-cli and see what their differences are.

                  boto3                 aws-cli
Mentions          36                    48
Stars             8,649                 14,811
Growth            1.1%                  1.3%
Activity          9.7                   9.8
Latest commit     1 day ago             about 16 hours ago
Language          Python                Python
License           Apache License 2.0    GNU General Public License v3.0 or later
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

boto3

Posts with mentions or reviews of boto3. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-03-03.
  • Bug in std:shared_mutex on Windows
    7 projects | news.ycombinator.com | 3 Mar 2024
    Former AWS here.

    My literal job for the last part of my time at AWS was "help triage bugs in the AWS SDK." This is by far the best repro I've ever seen for such an in-depth event.

    Most of the tickets you get in open ticket trackers are incomplete [ https://github.com/boto/boto3/issues/4011 ], nonsensical [ https://github.com/boto/boto3/issues/4018 ], or weird [ https://github.com/boto/boto3/issues/358 ].

  • Asynchronous Python lib to work with Amazon SQS
    3 projects | news.ycombinator.com | 19 Nov 2023
  • Beginning Python: Project Management With PDM
    4 projects | dev.to | 12 Oct 2023
    A majority of software in the modern world is built upon various third-party packages. These packages help offload work that would otherwise be rather tedious. This includes interacting with cloud APIs, developing scientific applications, or even creating web applications. As you gain experience in Python you'll be using more and more of these packages developed by others to power your own code. In this example I've decided to expand our math functionality with NumPy. pdm add is what's used to add dependencies like this to our project; a sketch of the resulting usage follows.
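
    A minimal sketch, assuming pdm add numpy has already been run; the module and function below are hypothetical, not from the post:

    # math_utils.py - hypothetical module in a PDM-managed project
    import numpy as np

    def moving_average(values: list[float], window: int = 3) -> np.ndarray:
        # NumPy's convolution gives a simple moving average in one call
        return np.convolve(values, np.ones(window) / window, mode="valid")

    print(moving_average([1.0, 2.0, 3.0, 4.0, 5.0]))  # [2. 3. 4.]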
  • Creating RSS feeds for language/module specific AWS SDK updates
    3 projects | /r/aws | 25 Sep 2023
    The updates could be parsed from the GitHub repo's CHANGELOG files (ex: javascript, java, python). I'm picturing an RSS feed generated for a specific language and module (ex: python s3, javascript s3, java sqs)
  • Teaching boto3 to store floats and datetime objects in DynamoDB
    2 projects | dev.to | 7 Sep 2023
    This can be quite annoying because it makes you wonder why the high-level API isn't able to deal with these common data types. Part of the reason is most likely that floats in Python can be counter-intuitive, so Decimal is a better data type if you want numbers to behave as non-computer-scientists expect. To learn more about these complexities, check out this discussion on GitHub about implementing float support in boto3 and the Python documentation on the subject. Additionally, DynamoDB has no native DateTime data type, so there is no straightforward mapping; the usual workaround is sketched below.
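
    A minimal sketch of that workaround (the table and attribute names are made up): convert floats through str to Decimal, and store datetimes as ISO 8601 strings:

    from datetime import datetime, timezone
    from decimal import Decimal

    import boto3

    table = boto3.resource("dynamodb").Table("measurements")  # hypothetical table

    # the high-level API rejects float, so go float -> str -> Decimal;
    # datetimes become ISO 8601 strings since DynamoDB has no native type
    table.put_item(
        Item={
            "pk": "sensor#1",
            "temperature": Decimal(str(23.5)),
            "recorded_at": datetime.now(timezone.utc).isoformat(),
        }
    )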
  • Interacting with Amazon S3 using AWS Data Wrangler (awswrangler) SDK for Pandas: A Comprehensive Guide
    5 projects | dev.to | 20 Aug 2023
    AWS Data Wrangler is a Python library that simplifies the process of interacting with various AWS services, built on top of some useful data tools and open-source projects such as Pandas, Apache Arrow, and Boto3. It offers streamlined functions to connect to, retrieve, transform, and load data from AWS services, with a strong focus on Amazon S3.
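
    For a feel of how streamlined that is, a hedged sketch (the bucket path is a placeholder): writing and reading a DataFrame as Parquet on S3 takes one call each:

    import awswrangler as wr
    import pandas as pd

    df = pd.DataFrame({"id": [1, 2], "value": [0.1, 0.2]})

    # round-trip a DataFrame through S3 as Parquet
    wr.s3.to_parquet(df=df, path="s3://my-bucket/data/example.parquet")  # placeholder bucket
    df2 = wr.s3.read_parquet(path="s3://my-bucket/data/example.parquet")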
  • Migrate 5 TB S3 bucket from one AWS account to another
    4 projects | /r/aws | 27 Jun 2023
    Alternatively, you could create a Python script using either Boto3 or its asynchronous sibling, aioboto3, that will spin through the contents of the origin bucket and move it over to the destination; a rough synchronous sketch follows.
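
    A rough Boto3 sketch of that idea (bucket names are placeholders). S3's server-side copy keeps the object bytes inside AWS, which matters at 5 TB:

    import boto3

    s3 = boto3.client("s3")
    src, dst = "origin-bucket", "destination-bucket"  # placeholders

    # walk every object in the origin bucket and server-side copy it across
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=src):
        for obj in page.get("Contents", []):
            s3.copy({"Bucket": src, "Key": obj["Key"]}, dst, obj["Key"])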
  • Growing Outside of Work: My Journey with the Cloud Resume Challenge
    3 projects | dev.to | 22 Apr 2023
    Once my site was stood up, I needed to build out the user count API. Through the console, I set up a DynamoDB table and created a user count item. Getting my Lambda to interface with AWS resources was a breeze with the Boto3 SDK. You can see my Python code that increments the user count whenever someone visits the site here. The key is the usage of the update_item method that comes from Boto3; the general pattern is sketched below.
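
    The general shape of that call (table, key, and attribute names here are made up; the post links to the author's actual code):

    import boto3

    table = boto3.resource("dynamodb").Table("visitor-count")  # hypothetical table

    # atomically increment the counter and return the updated value
    response = table.update_item(
        Key={"pk": "site"},
        UpdateExpression="ADD visits :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    print(response["Attributes"]["visits"])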
  • Logging code mess
    3 projects | /r/Python | 14 Apr 2023
    If you want to get a feel for the kind and amount of logging done in real projects, boto3 is a very widely used SDK created by Amazon: https://github.com/boto/boto3
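
    boto3 also ships a convenience hook for surfacing its own log output, which is a quick way to see how chatty the SDK is; a minimal sketch:

    import logging

    import boto3

    # route boto3's and botocore's log records to stderr at DEBUG level
    boto3.set_stream_logger("boto3", logging.DEBUG)
    boto3.set_stream_logger("botocore", logging.DEBUG)

    s3 = boto3.client("s3")  # credential and endpoint resolution now gets logged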
  • Guide to Serverless & Lambda Testing — Part 2 — Testing Pyramid
    6 projects | dev.to | 13 Mar 2023
    Schema validation logic: I use Pydantic for input validation and schema validation use cases (boto responses, API responses, input validation, etc.). The Pydantic schema can contain type and value constraint checks, or even more complicated logic with custom validator code.
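
    A small illustration of that approach, using Pydantic v2 syntax; the field names mirror S3's list_buckets response shape, but the models themselves are mine:

    from datetime import datetime

    from pydantic import BaseModel, field_validator

    class Bucket(BaseModel):
        Name: str
        CreationDate: datetime

        @field_validator("Name")
        @classmethod
        def name_not_empty(cls, v: str) -> str:
            # a value constraint on top of the type check
            if not v:
                raise ValueError("bucket name must be non-empty")
            return v

    class ListBucketsResponse(BaseModel):
        Buckets: list[Bucket]

    # validate a boto response shape before the handler logic touches it
    resp = ListBucketsResponse(
        Buckets=[{"Name": "my-bucket", "CreationDate": "2023-03-13T00:00:00Z"}]
    )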

aws-cli

Posts with mentions or reviews of aws-cli. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-08-14.
  • Top 10 CLI Tools for DevOps Teams
    11 projects | dev.to | 14 Aug 2023
    The AWS CLI is a must-have tool if your team relies on Amazon Web Services. It lets you effortlessly interact with AWS services, orchestrate resource management, and automate tasks from the comfort of your terminal. Once you get used to the tool, you'll notice how conveniently and quickly it fits into your processes, especially compared to going through AWS's web-based user interface.
  • My First Impressions of Nix
    33 projects | news.ycombinator.com | 19 Jun 2023
    Just for your consideration, the network effect is very real with package managers, too:

    https://search.nixos.org/packages?channel=23.05&show=awscli2 is at 2.11.27 (even on the "unstable" channel), versus https://formulae.brew.sh/formula/awscli#default at 2.12.1, which is indeed the most current (https://github.com/aws/aws-cli/tags)

  • [Engineering_Stuff] s3fs-fuse - Lets you mount your S3/Minio bucket to your local directory
    2 projects | /r/enfrancais | 28 Apr 2023
  • s3fs-fuse - allows to mount your s3/minio bucket link to your local directory
    3 projects | /r/engineering_stuff | 30 Mar 2023
    s3fs allows Linux, macOS, and FreeBSD to mount an S3 bucket via FUSE (Filesystem in Userspace). s3fs lets you operate on files and directories in an S3 bucket as if they were on a local file system. s3fs preserves the native object format for files, allowing use of other tools like the AWS CLI.
  • AWS Announces Open Source Mountpoint for Amazon S3
    4 projects | news.ycombinator.com | 26 Mar 2023
    AFAIK it's still a Python package: https://github.com/aws/aws-cli/tree/2.11.6
  • Event Based System with Localstack (Elixir Edition): Uploading files to S3 with PresignedURL's
    3 projects | dev.to | 9 Feb 2023
    And this is the init_localstack.sh file content. A handy thing about Localstack is that you can run the same commands against it as you would with the aws-cli tool. Also, the container deletes all content and config once it stops, so the script file must create all the resources you need from Localstack; the same trick works from Python, as sketched below.
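
    A minimal Boto3 sketch of the same idea, pointing the SDK at Localstack's default edge endpoint on port 4566 instead of real AWS; the bucket name is made up:

    import boto3

    # talk to Localstack instead of AWS; dummy credentials are fine locally
    s3 = boto3.client(
        "s3",
        endpoint_url="http://localhost:4566",  # Localstack's default edge port
        aws_access_key_id="test",
        aws_secret_access_key="test",
        region_name="us-east-1",
    )
    s3.create_bucket(Bucket="uploads")  # hypothetical bucket from the init script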
  • Yq is a portable yq: command-line YAML, JSON, XML, CSV and properties processor
    11 projects | news.ycombinator.com | 4 Feb 2023
    Not to mention that JMESPath appears to be abandoned.

    There is a fork (https://github.com/jmespath-community/jmespath.spec), but it seems unlikely to be used by the aws cli (https://github.com/aws/aws-cli/issues/7396). Although, for that matter, jq is semi-abandoned itself.
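
    For context, JMESPath is the query language behind the aws cli's --query flag, and the Python package of the same name evaluates the same expressions; a quick sketch:

    import jmespath  # the query language used by aws ... --query

    data = {
        "Reservations": [
            {"Instances": [{"InstanceId": "i-0abc", "State": {"Name": "running"}}]}
        ]
    }
    # equivalent to: aws ec2 describe-instances --query 'Reservations[].Instances[].InstanceId'
    print(jmespath.search("Reservations[].Instances[].InstanceId", data))  # ['i-0abc']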

  • Setting up ad hoc development environments for Django applications with AWS ECS, Terraform and GitHub Actions
    4 projects | dev.to | 11 Jun 2022
    #!/bin/bash

    # This script will be called to update an ad hoc environment backend
    # with a new image tag. It will first run pre-update tasks (such as migrations)
    # and then do a rolling update of the backend services.
    # It is called from the ad_hock_backend_update.yml GitHub Actions file

    # Required environment variables that need to be exported before running this script:
    # WORKSPACE - ad hoc environment workspace
    # SHARED_RESOURCES_WORKSPACE - shared resources workspace
    # BACKEND_IMAGE_TAG - backend image tag to update services to (e.g. v1.2.3)
    # AWS_ACCOUNT_ID - AWS account ID is used for the ECR repository URL

    echo "Updating backend services..."

    # first define a variable containing the new image URI
    NEW_BACKEND_IMAGE_URI="$AWS_ACCOUNT_ID.dkr.ecr.us-east-1.amazonaws.com/backend:$BACKEND_IMAGE_TAG"

    # register new task definitions
    # https://docs.aws.amazon.com/cli/latest/reference/ecs/describe-task-definition.html#description
    for TASK in "migrate" "gunicorn" "default" "beat"
    do
      echo "Updating $TASK task definition..."

      # in Terraform we name our tasks based on the ad hoc environment name
      # (also the Terraform workspace name) and the name of the task
      # (e.g. migrate, gunicorn, default, beat)
      TASK_FAMILY=$WORKSPACE-$TASK

      # save the task definition JSON to a variable
      TASK_DESCRIPTION=$(aws ecs describe-task-definition \
        --task-definition $TASK_FAMILY \
      )

      # save container definitions to a file for each task
      echo $TASK_DESCRIPTION | jq -r \
        .taskDefinition.containerDefinitions \
        > /tmp/$TASK_FAMILY.json

      # write new container definition JSON with updated image
      echo "Writing new $TASK_FAMILY container definitions JSON..."

      # replace old image URI with new image URI in a new container definitions JSON
      cat /tmp/$TASK_FAMILY.json \
        | jq \
        --arg IMAGE "$NEW_BACKEND_IMAGE_URI" '.[0].image |= $IMAGE' \
        > /tmp/$TASK_FAMILY-new.json

      # Get the existing configuration for the task definition (memory, cpu, etc.)
      # from the variable that we saved the task definition JSON to earlier
      echo "Getting existing configuration for $TASK_FAMILY..."
      MEMORY=$( echo $TASK_DESCRIPTION | jq -r \
        .taskDefinition.memory \
      )
      CPU=$( echo $TASK_DESCRIPTION | jq -r \
        .taskDefinition.cpu \
      )
      ECS_EXECUTION_ROLE_ARN=$( echo $TASK_DESCRIPTION | jq -r \
        .taskDefinition.executionRoleArn \
      )
      ECS_TASK_ROLE_ARN=$( echo $TASK_DESCRIPTION | jq -r \
        .taskDefinition.taskRoleArn \
      )

      # check the content of the new container definition JSON
      cat /tmp/$TASK_FAMILY-new.json

      # register new task definition using the new container definitions
      # and the values that we read off of the existing task definitions
      echo "Registering new $TASK_FAMILY task definition..."
      aws ecs register-task-definition \
        --family $TASK_FAMILY \
        --container-definitions file:///tmp/$TASK_FAMILY-new.json \
        --memory $MEMORY \
        --cpu $CPU \
        --network-mode awsvpc \
        --execution-role-arn $ECS_EXECUTION_ROLE_ARN \
        --task-role-arn $ECS_TASK_ROLE_ARN \
        --requires-compatibilities "FARGATE"
    done

    # Now we need to run migrate, collectstatic and any other commands that need to be run
    # before doing a rolling update of the backend services
    # We will use the new task definitions we just created to run these commands

    # get the ARN of the most recent revision of the migrate task definition
    TASK_DEFINITION=$( \
      aws ecs describe-task-definition \
        --task-definition $WORKSPACE-migrate \
        | jq -r \
        .taskDefinition.taskDefinitionArn \
    )

    # get private subnets as space separated string from shared resources VPC
    SUBNETS=$( \
      aws ec2 describe-subnets \
        --filters "Name=tag:env,Values=$SHARED_RESOURCES_WORKSPACE" "Name=tag:Name,Values=*private*" \
        --query 'Subnets[*].SubnetId' \
        --output text \
    )

    # replace spaces with commas using tr
    # https://github.com/aws/aws-cli/issues/5348
    SUBNET_IDS=$(echo $SUBNETS | tr ' ' ',')

    # get ecs_sg_id - just a single value
    ECS_SG_ID=$( \
      aws ec2 describe-security-groups \
        --filters "Name=tag:Name,Values=$SHARED_RESOURCES_WORKSPACE-ecs-sg" \
        --query 'SecurityGroups[*].GroupId' \
        --output text \
    )

    echo "Running database migrations..."

    # timestamp used for log retrieval (milliseconds after Jan 1, 1970 00:00:00 UTC)
    START_TIME=$(date +%s000)

    # run the migration task and capture the taskArn into a variable called TASK_ID
    TASK_ID=$( \
      aws ecs run-task \
        --cluster $WORKSPACE-cluster \
        --task-definition $TASK_DEFINITION \
        --network-configuration "awsvpcConfiguration={subnets=[$SUBNET_IDS],securityGroups=[$ECS_SG_ID],assignPublicIp=ENABLED}" \
        | jq -r '.tasks[0].taskArn' \
    )
    echo "Task ID is $TASK_ID"

    # wait for the migrate task to exit
    # https://docs.aws.amazon.com/cli/latest/reference/ecs/wait/tasks-stopped.html#description
    # > It will poll every 6 seconds until a successful state has been reached.
    # > This will exit with a return code of 255 after 100 failed checks.
    aws ecs wait tasks-stopped \
      --tasks $TASK_ID \
      --cluster $WORKSPACE-cluster

    # timestamp used for log retrieval (milliseconds after Jan 1, 1970 00:00:00 UTC)
    END_TIME=$(date +%s000)

    # print the CloudWatch log events to STDOUT
    aws logs get-log-events \
      --log-group-name "/ecs/$WORKSPACE/migrate" \
      --log-stream-name "migrate/migrate/${TASK_ID##*/}" \
      --start-time $START_TIME \
      --end-time $END_TIME \
      | jq -r '.events[].message'

    echo "Migrations complete. Starting rolling update for backend services..."

    # update backend services
    for TASK in "gunicorn" "default" "beat"
    do
      # get taskDefinitionArn for each service to be used in update-service command
      # this will get the most recent revision of each task (the one that was just created)
      # https://docs.aws.amazon.com/cli/latest/reference/ecs/describe-task-definition.html#description
      TASK_DEFINITION=$( \
        aws ecs describe-task-definition \
          --task-definition $WORKSPACE-$TASK \
          | jq -r \
          .taskDefinition.taskDefinitionArn \
      )

      # update each service with new task definition
      aws ecs update-service \
        --cluster $WORKSPACE-cluster \
        --service $WORKSPACE-$TASK \
        --task-definition $TASK_DEFINITION \
        --no-cli-pager
    done

    echo "Services updated. Waiting for services to become stable..."

    # wait for all services to be stable (runningCount == desiredCount for each service)
    aws ecs wait services-stable \
      --cluster $WORKSPACE-cluster \
      --services $WORKSPACE-gunicorn $WORKSPACE-default $WORKSPACE-beat

    echo "Services are now stable. Backend services are now up to date with $BACKEND_IMAGE_TAG."
    echo "Backend update is now complete!"
  • AWS linux container image
    2 projects | /r/aws | 5 Mar 2022
    The Dockerfile is here (to confirm that it builds on Amazon Linux 2): https://github.com/aws/aws-cli/blob/v2/docker/Dockerfile
  • AWS CLI releases
    2 projects | /r/aws | 15 Feb 2022
    > AWS CLI 2.0.0dev preview release

What are some alternatives?

When comparing boto3 and aws-cli you can also consider the following projects:

rclone - "rsync for cloud storage" - Google Drive, S3, Dropbox, Backblaze B2, One Drive, Swift, Hubic, Wasabi, Google Cloud Storage, Yandex Files

terraform - Terraform enables you to safely and predictably create, change, and improve infrastructure. It is a source-available tool that codifies APIs into declarative configuration files that can be shared amongst team members, treated as code, edited, reviewed, and versioned.

apache-libcloud - Apache Libcloud is a Python library which hides differences between different cloud provider APIs and allows you to manage different cloud resources through a unified and easy to use API.

SAWS - A supercharged AWS command line interface (CLI).

boto - For the latest version of boto, see https://github.com/boto/boto3 -- Python interface to Amazon Web Services

httpie - 🥧 HTTPie CLI — modern, user-friendly command-line HTTP client for the API era. JSON support, colors, sessions, downloads, plugins & more.

Telethon - Pure Python 3 MTProto API Telegram client library, for bots too!

google-api-python-client - 🐍 The official Python client library for Google's discovery based APIs.

thefuck - Magnificent app which corrects your previous console command.

aws-vault - A vault for securely storing and accessing AWS credentials in development environments

pgcli - Postgres CLI with autocompletion and syntax highlighting

gspread - Google Sheets Python API