How to execute Boto3 methods on Multiple AWS Accounts


Have you ever needed to execute boto3 methods on multiple AWS accounts at once?

Manually switching between accounts and running the same commands over and over can be a tedious and time-consuming task.

In this guide, we’ll show you how to run Boto3 methods on multiple AWS accounts at once using Python.

How to execute Boto3 methods on Multiple AWS Accounts at once

Before you start, make sure you’ve completed the following prerequisites so you can run the Python script on your AWS account.

  1. Install the AWS CLI and configure an AWS profile
  2. Set up the Python environment
  3. Create an IAM role on the target AWS Account

If you’ve already done this, you can proceed to step 4.

1. Install AWS CLI and configure an AWS profile

The AWS CLI is a command line tool that allows you to interact with AWS services from your terminal. Depending on whether you’re running Linux, macOS, or Windows, the installation goes like this:

# macOS install method:
brew install awscli

# Windows install method:
msiexec.exe /i https://awscli.amazonaws.com/AWSCLIV2.msi

# Linux (Ubuntu) install method:
sudo apt install awscli

In order to access your AWS account with the AWS CLI, you first need to configure an AWS profile. There are two ways to configure a profile:

  • Access and secret key credentials from an IAM user
  • AWS Single Sign-on (SSO) user

In this article, I’ll briefly explain how to configure the first method so that you can proceed with running the Python script on your AWS account.

If you wish to set up the AWS profile more securely, then I’d suggest you read and apply the steps described in setting up AWS CLI with AWS Single Sign-On (SSO).
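For reference, the AWS CLI v2 can set up an SSO-backed profile interactively; the wizard prompts you for the SSO start URL, region, and a profile name:

# Configure an AWS profile backed by IAM Identity Center (AWS SSO)
aws configure sso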

To configure the AWS CLI with your IAM user’s access and secret key credentials, you need to log in to the AWS Console.

Go to IAM > Users, select your IAM user, and click on the Security credentials tab to create an access and secret key.

Then configure the AWS profile on the AWS CLI as follows:

➜ aws configure
AWS Access Key ID [None]: <insert_access_key>
AWS Secret Access Key [None]: <insert_secret_key>
Default region name [None]: <insert_aws_region>
Default output format [json]: json

Your AWS credentials are stored in ~/.aws/credentials, and you can validate that your AWS profile is working by running the command:

➜ aws sts get-caller-identity
{
    "UserId": "AIDA5BRFSNF24CDMD7FNY",
    "Account": "012345678901",
    "Arn": "arn:aws:iam::012345678901:user/test-user"
}

2. Set up the Python environment

To be able to run the Python boto3 script, you will need to have Python installed on your machine.

Depending on whether you’re running Linux, macOS, or Windows, the installation goes like this:

# macOS install method:
brew install python

# Windows install method (download the installer and run it; pip is bundled with the official installer):
curl -o python-3.11.2-amd64.exe https://www.python.org/ftp/python/3.11.2/python-3.11.2-amd64.exe
.\python-3.11.2-amd64.exe

# If pip is missing from your installation, install it separately:
curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
python get-pip.py

# Linux (Ubuntu) install method:
sudo apt install python3 python3-pip

Once you have installed Python, you will need to install the Boto3 library.

You can install Boto3 using pip, the Python package manager, by running the following command in your terminal:

pip install boto3
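
To quickly check that Boto3 is installed and picks up the credentials from your AWS profile, you can run a small sanity check like the sketch below (the file name verify_boto3.py is just an example):

# verify_boto3.py - sanity check that Boto3 is installed and your default credentials work
import boto3

print("Boto3 version:", boto3.__version__)

# Calls STS GetCallerIdentity using the credentials of your default AWS profile
identity = boto3.client("sts").get_caller_identity()
print("Running as:", identity["Arn"])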

3. Create an IAM role on the target AWS Account

On every target account on which you want to run Boto3 methods, you need to create an IAM role that can be assumed by the source AWS account, i.e. the account from which you’ll run the multi_account_execution.py script that we’re going to create in step 4.

AWSTemplateFormatVersion: "2010-09-09"
Description: A CloudFormation template that creates a cross-account role that can be assumed by the source account.
Parameters:
  SourceAccount: { Description: Source AWS account ID, Type: String }
Resources:
  CrossTargetAccountRole:
    Type: AWS::IAM::Role
    Properties:
      RoleName: crossaccount-role
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal:
              AWS: !Sub arn:aws:iam::${SourceAccount}:root
            Action: sts:AssumeRole
      Path: "/"
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/AdministratorAccess

Note: make sure to fill in the SourceAccount parameter with the correct AWS account ID.
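
One way to deploy this template is with the AWS CLI on each target account; the stack name, template file name, and profile below are placeholders for this sketch:

# Deploy the cross-account role in a target account (use credentials/profile for that target account)
aws cloudformation deploy \
  --template-file crossaccount-role.yml \
  --stack-name crossaccount-role \
  --parameter-overrides SourceAccount=111111111111 \
  --capabilities CAPABILITY_NAMED_IAM \
  --profile target-account-profile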

4. Create the Python script that allows you to run Boto3 commands on Multiple AWS Accounts

Once you have your environment set up, you can create the Python script.

Copy the following code into a new file in the desired location and name it multi_account_execution.py.

The script assumes a role in each account, sets up a Boto3 client, and runs the delete_awsconfig_rule_evaluations function on each account.

#  https://github.com/dannysteenman/aws-toolbox
#
#  License: MIT
#
# This script gives you the ability to run Boto3 commands on all accounts which are specified in the aws_account_list

import boto3

aws_account_list = ["111111111111", "222222222222", "333333333333"]


def role_arn_to_session(**args):
    client = boto3.client("sts")
    response = client.assume_role(**args)
    return boto3.Session(
        aws_access_key_id=response["Credentials"]["AccessKeyId"],
        aws_secret_access_key=response["Credentials"]["SecretAccessKey"],
        aws_session_token=response["Credentials"]["SessionToken"],
    )


# This decides which role to assume, the name of the session you will start, and optionally an external ID.
# The external ID can be used as a passcode to protect your role.
def set_boto3_clients(account_id):
    return role_arn_to_session(
        RoleArn="arn:aws:iam::" + account_id + ":role/crossaccount-role",
        RoleSessionName=f"{account_id}-crossaccount-role",
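        # ExternalId="example-external-id",  # optional, hypothetical value: add this if the target role's trust policy requires an external ID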
    )


# This is an example function which deletes evaluation results for a specific config rule.
# You can create your own Boto3 function which you want to execute on multiple accounts.
def delete_awsconfig_rule_evaluations(awsconfig):
    return awsconfig.delete_evaluation_results(ConfigRuleName="SHIELD_002")


def lambda_handler(event, context):
    for account_id in aws_account_list:
        run_boto3_in_account = set_boto3_clients(account_id)
        # You can use run_boto3_in_account as if you are using boto in another account
        # For example: s3 = run_boto3_in_account.client('s3')
        awsconfig = run_boto3_in_account.client("config")
        delete_awsconfig_rule_evaluations(awsconfig)


if __name__ == "__main__":
    lambda_handler({"invokingEvent": '{"messageType":"ScheduledNotification"}'}, None)

First, create a list of AWS account IDs that you want to run Boto3 methods on under aws_account_list. You can specify any number of accounts in this list.

Do note that you need to deploy the IAM role as shown in step 3 on each target AWS account; otherwise, the sts:AssumeRole call will fail with an access denied error.

Next, you have to update the RoleArn in the set_boto3_clients function to match the role name that you’ve deployed on the target AWS accounts. This allows the Python function to assume the IAM role on the target AWS account.

Finally, you can configure the lambda_handler to your own liking and add your own Boto3 methods that you wish to run on your target AWS accounts.

In the example script, we’ve created a function that deletes evaluation results for a specific AWS Config rule.
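
For instance, if you’d rather list the S3 buckets in every account, you could swap in a sketch like the one below (list_s3_buckets is an example name, not part of the original script):

# Example drop-in replacement: list all S3 buckets in each target account
def list_s3_buckets(s3_client):
    response = s3_client.list_buckets()
    return [bucket["Name"] for bucket in response["Buckets"]]


def lambda_handler(event, context):
    for account_id in aws_account_list:
        run_boto3_in_account = set_boto3_clients(account_id)
        s3 = run_boto3_in_account.client("s3")
        print(account_id, list_s3_buckets(s3))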

5. Run Boto3 methods on Multiple AWS Accounts

Make sure that the Lambda execution role or the IAM role that you use in your terminal is able to assume the IAM role on the target account.

This means the target IAM role name must match the role name in the script, and the target IAM role must have the source AWS account ID added as a trusted entity.
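
Before running the full script, you can optionally confirm the trust relationship by assuming the target role manually with your source account credentials (the account ID below is a placeholder), and then run the script locally with Python:

# Optional: verify that the source credentials can assume the target role
aws sts assume-role \
  --role-arn arn:aws:iam::111111111111:role/crossaccount-role \
  --role-session-name assume-role-test

# Run the multi-account script locally
python multi_account_execution.py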

Conclusion

In this blog post, we’ve shown you how to run Boto3 methods on multiple AWS accounts at once.

By utilizing the AWS Security Token Service (STS) and the boto3.Session object, you can assume a specific role in each account and run the desired Boto3 functions.

This allows you to perform tasks across multiple accounts, such as cleaning up or monitoring AWS resources, in a more streamlined and efficient way.

