Amazon Web Services (AWS) is the most widely adopted cloud service provider, and Python consistently ranks among the top three languages used by developers. It only makes sense that you might want to use the two in tandem, and thankfully, AWS has anticipated this and provides ways to integrate them smoothly.
Here, we’ll look at how Python and AWS can be combined and see an example script used to work with metrics through CloudWatch.
Boto 3: AWS Python SDK
Boto 3 is AWS’ Python Software Development Kit (SDK). It supports Python 2.6.5+, 2.7 and 3.3+. With it, you can easily integrate Python applications, libraries or scripts with over 50 AWS services. It includes many service-specific features, such as multi-part transfers for S3 and simplified query conditions for DynamoDB.
Boto 3 comes with two different APIs: client and resource. The client API provides one-to-one mappings to the underlying HTTP operations. The resource API hides the network calls and instead exposes resource objects and collections, letting you access attributes and perform actions on them.
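To see the difference, here is a rough sketch of the same task, listing your S3 buckets, through each API (the output depends on what exists in your account):
import boto3
# Client API: a one-to-one call to the ListBuckets operation
s3_client = boto3.client('s3')
for bucket in s3_client.list_buckets()['Buckets']:
    print(bucket['Name'])
# Resource API: iterate over Bucket objects and read their attributes
s3_resource = boto3.resource('s3')
for bucket in s3_resource.buckets.all():
    print(bucket.name)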
Both have dynamic classes driven by JSON models, which allow fast updating and strong consistency across services.
This SDK also comes with “waiters” for both client and resource APIs, which are used to poll for pre-defined status changes to resources. This allows you to specify when your code will run based on the status reported by the service you wish to interact with.
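For example, here is a minimal sketch (the bucket name is made up) that creates an S3 bucket and then blocks until S3 reports it exists:
import boto3
s3 = boto3.client('s3')
# Outside us-east-1 you would also pass a CreateBucketConfiguration
s3.create_bucket(Bucket='my-example-bucket')
# Poll until the bucket is reported as existing, then continue
waiter = s3.get_waiter('bucket_exists')
waiter.wait(Bucket='my-example-bucket')
print('Bucket is ready')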
Integrated Development Environment (IDE) Options
You have a couple different compatible IDEs to choose from when working with Python in AWS.
PyCharm, IntelliJ and Visual Studio Code are all available for use via open-source plug-ins. You can also use Cloud9, AWS’ cloud-based IDE that runs on an EC2 instance or a Linux server with SSH.
Cloud9 works with over 40 different languages and includes a code editor, debugger and terminal. It comes pre-configured with SDKs, libraries and plug-ins to help you develop serverless applications. It allows real-time collaborative coding and grants direct terminal access to AWS through your EC2 instance.
Get Up and Running
Your first step will be to install the latest version of Boto 3. This can be done with the pip install boto3 command or by installing the tarball from PyPI.
Before you can begin using Boto 3, you need to set up authentication credentials. You can do so through the Identity and Access Management (IAM) console. There you can either create a new user or give permissions to an existing user.
You also need to generate a set of access and secret keys for this user. You can do this either through the CLI with aws configure or manually by creating the file ~/.aws/credentials. If you choose the manual route, the file should contain the following:
[default]
aws_access_key_id = {ACCESS KEY ID}
aws_secret_access_key = {SECRET ACCESS KEY}
Regardless of which way you choose, it’s good practice to keep your keys in a credential file rather than in your code, both to simplify passing them around and to keep secrets out of source control. Boto 3 reads ~/.aws/credentials by default and also falls back to the legacy Boto locations: ~/.boto in your user’s home directory or /etc/boto.cfg for system-wide use.
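If you keep more than one profile in that file, you can select one explicitly when creating a session; a minimal sketch, assuming a profile named default:
import boto3
# Build a session from a named profile in ~/.aws/credentials
session = boto3.Session(profile_name='default')
s3 = session.resource('s3')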
After your installation and configuration are done, simply import Boto 3 and tell it which service you want to work with, like so:
import boto3
s3 = boto3.resource('s3')
Creating Lambda Functions
AWS Lambda is a serverless compute service that allows you to run and schedule code in a wide range of languages. By uploading code to Lambda, you can perform nearly any task the AWS APIs allow, from automating EBS snapshots to bulk deployment of instances. Lambda is a good option for running any scripts you write if you do not have a dedicated server, and its Python environment already includes Boto 3.
To use Lambda, you must first create an execution role, which can be done through the IAM console (a scripted alternative appears after the list). This role should have the following properties:
- Trusted entity - Lambda
- Permissions - AWSLambdaBasicExecutionRole
- Role name - lambda-role
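If you’d rather script that setup than click through the console, here is a sketch using Boto 3’s IAM client with the same names; it assumes your credentials are allowed to create IAM roles:
import json
import boto3
iam = boto3.client('iam')
# Create a role that Lambda is trusted to assume
iam.create_role(
    RoleName='lambda-role',
    AssumeRolePolicyDocument=json.dumps({
        'Version': '2012-10-17',
        'Statement': [{
            'Effect': 'Allow',
            'Principal': {'Service': 'lambda.amazonaws.com'},
            'Action': 'sts:AssumeRole'
        }]
    })
)
# Attach the basic execution policy so the function can write logs
iam.attach_role_policy(
    RoleName='lambda-role',
    PolicyArn='arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole'
)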
To create a function, you can either do so through the Lambda console, which uses Cloud9, or you can import code from your preferred editor. You need to configure the following for each function you create:
- Name - my-function
- Runtime - Python 3.7
- Role - Choose an existing role
- Existing role - lambda-role
Each function you create is automatically placed in a source file named lambda_function.py, which exports a function named lambda_handler. Lambda calls this handler to run your code and automatically creates a deployment package each time you save.
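As a minimal sketch, a lambda_function.py could look like this (the event contents depend on whatever triggers the function):
import json

def lambda_handler(event, context):
    # Lambda calls this entry point with the triggering event
    print('Received event: ' + json.dumps(event))
    return {'statusCode': 200, 'body': 'OK'}
Once deployed, you can also invoke the function from any Python script through Boto 3’s Lambda client, using the function name configured above:
import json
import boto3
lambda_client = boto3.client('lambda')
# Synchronously invoke the function with a small JSON payload
response = lambda_client.invoke(
    FunctionName='my-function',
    Payload=json.dumps({'key': 'value'})
)
print(response['Payload'].read())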
Example Function: Retrieving Metrics From CloudWatch
To get a better idea of what your end result will look like, here’s a simple example for working with CloudWatch metrics. This script allows you to get a list of available metrics and push data to those metrics. It can be used either with AWS resources or your own applications. It works with metrics that are already being collected as well as those you wish to create.
To list metrics:
import boto3
# Create CloudWatch client
cloudwatch = boto3.client('cloudwatch')
# List metrics through the pagination interface
paginator = cloudwatch.get_paginator('list_metrics')
for response in paginator.paginate(Dimensions=[{'Name': 'LogGroupName'}],
                                   MetricName='IncomingLogEvents',
                                   Namespace='AWS/Logs'):
    print(response['Metrics'])
To push data points:
import boto3
# Create CloudWatch client
cloudwatch = boto3.client('cloudwatch')
# Put custom metrics
cloudwatch.put_metric_data(
    MetricData=[
        {
            'MetricName': 'PAGES_VISITED',
            'Dimensions': [
                {
                    'Name': 'UNIQUE_PAGES',
                    'Value': 'URLS'
                },
            ],
            'Unit': 'None',
            'Value': 1.0
        },
    ],
    Namespace='SITE/TRAFFIC'
)
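To verify that the data arrived, you can read the metric back; a sketch, assuming the same namespace and dimension as above:
import datetime
import boto3
cloudwatch = boto3.client('cloudwatch')
# Retrieve hourly sums for the custom metric over the last day
stats = cloudwatch.get_metric_statistics(
    Namespace='SITE/TRAFFIC',
    MetricName='PAGES_VISITED',
    Dimensions=[{'Name': 'UNIQUE_PAGES', 'Value': 'URLS'}],
    StartTime=datetime.datetime.utcnow() - datetime.timedelta(days=1),
    EndTime=datetime.datetime.utcnow(),
    Period=3600,
    Statistics=['Sum']
)
print(stats['Datapoints'])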
Conclusion
Using AWS’ Boto 3 SDK with Python lets you manage your AWS configurations and monitor your systems more easily. With this combination, you can integrate administrative workflows into your current development processes and gain access to higher-level features.
Hopefully, this article gave you some solid foundations to start from and you’re excited to keep going from here. Whether you’re already familiar with AWS configurations, or you’re just getting started, your next step will likely be checking out the API documentation.
Author Bio
Gilad David Maayan is a technology writer who has worked with over 150 technology companies including SAP, Samsung NEXT, NetApp and Imperva, producing technical and thought leadership content that elucidates technical solutions for developers and IT leadership.