
Python3 and AWS

So, here we are. Deep into cloud architecture, getting our feet wet with application deployment and distributed systems, and we find ourselves doing a ton of manual tasks. These manual tasks could be point-and-click, or they could be simply re-running the same AWS CLI commands over and over again. We're going to use another way to hit the AWS API, and that's with Python.

PLEASE NOTE: The code is pasted at the bottom. Feel free to copy/paste it while we're going over it. I have screenshots for a visual as well.

Python has an amazing module (one of its best, in my opinion) called Boto3. Boto3 is a library that wraps AWS's API. Just like the AWS CLI, all you need is an access key/secret key pair.

For this blog, we're going to list EC2 instances.
You will need the following to follow along:
  1. PyCharm for the IDE
  2. An internet connection
  3. An AWS account that has access to EC2 (please don't experiment in your prod environment).
  4. A coffee, or a beer
  5. Prior knowledge of functions, if statements, booleans, and variables in Python. If you know them in another language, you should still be able to follow along.
The first thing we will do is spin up an EC2 instance, as shown below.

This can be any AMI/Distro. That won't matter in this read.

Next, we're going to create an IAM role. For testing purposes ONLY, this IAM role will have admin access.

After your IAM role is configured, let's open up PyCharm.

You'll want to create a new project by going to File > New Project > Name it whatever you'd like > pick a location to store your code.

Next, go ahead and go to File > New File > Python > (feel free to name it whatever you'd like. Just ensure to put the .py at the end of the name).

Once the file is open, go to File > Default Settings > Your project interpreter > Click the plus (+) sign at the bottom > search for Boto3.

Now that we're all ready, you should have a blank file ready to go.

The first thing we are going to do is import boto3.

Next, let's define our first function.

Our first function will do the following:
  1. Be named AWS_Access_Key():
  2. It will have one parameter, the access key.
  3. It will have an if statement: if accesskey is not None (meaning it's not null), it'll print 'Access key accepted'.
  4. Next, we will return the parameter value. Returning the parameter value allows us to use the function's input in another function.
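The steps above can be sketched as follows. Note that because the input() call sits in the parameter's default value, the prompt fires as soon as the def line runs; in this sketch we feed a fake placeholder key ('AKIAEXAMPLE') on stdin so the example runs non-interactively.

```python
import io
import sys

# Feed a placeholder key on stdin so this sketch runs non-interactively;
# run interactively, the def line below prompts for the key immediately.
sys.stdin = io.StringIO('AKIAEXAMPLE\n')

def AWS_Access_Key(accesskey=input('Please enter access key: ')):
    # None would mean no value was captured for the key
    if accesskey is not None:
        print('Access key accepted')
    # Return the key so another function can reuse it
    return accesskey
```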
You might be wondering why I have an input() call inside the parameter definition. This is for security purposes. The access key/secret key pair is very powerful. If anyone has it, and it has admin access, they can do literally whatever they want in your AWS environment. Some folks prefer not to do it this way, but this is the way I choose to do it. If you don't want to do it this way, you can define your parameter like so:
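The alternative mentioned above, leaving the prompt to the caller rather than baking input() into the default value, might look like this sketch (not a screenshot from the original post):

```python
def AWS_Access_Key(accesskey):
    # The caller supplies the key, for example:
    #   AWS_Access_Key(input('Please enter access key: '))
    if accesskey is not None:
        print('Access key accepted')
    return accesskey
```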

For the secret key, we're going to do the same exact steps.

Next, we're going to get into the fun.
Let's break down what's happening in the below screenshot.
  1. We're creating a new function called listService():
  2. It will be given two parameters that need to be filled in: awsService and region (we will break these down together).
  3. Call the boto3 module's client() function, fill in the appropriate parameters, and assign the result to the "service" variable. Let's break down the parameters in boto3.client():
    • the service_name represents which AWS service we want to use. In our case, it's ec2
    • the region_name represents which region we want to scan. In my case, it's us-east-1
    • aws_access_key_id is our access key from the first prompt (notice that it shows the function name as the value, because the function holds the value thanks to its return statement)
    • aws_secret_access_key is our secret key from the second prompt (same idea: the function holds the value thanks to its return statement)
  4. Next, we're going to take our "service" variable, which holds the client returned by boto3.client(), and call a method on it named describe_instances() (this is a Boto3 client method).
  5. We're now going to add our filters. I want you to go back to your AWS console, click on your EC2 instance, and take a look at the description. Do you see those values? Those are the values that we can pull. In our case, we will filter on the instance state name and only pull instances that are running.
  6. Next, we're going to print the variable output so we can get our instances.
  7. Finally, we're going to call our function with its 'ec2' and 'us-east-1' values.

Now, we're ready to run! Get your access key and secret key ready.

Go ahead and run the function. You will see the following:

After you enter your access key, go ahead and enter your secret key.

There you have it! You are now on your way to being a Boto3 expert. Thank you for reading, and I hope you found this both insightful and entertaining!

As promised, below is the code:

import boto3

def AWS_Access_Key(accesskey=input('Please enter access key: ')):
    if accesskey is not None:
        print('Access key accepted')
    return accesskey

def AWS_Secret_Key(secretkey=input('Please enter secret key: ')):
    if secretkey is not None:
        print('Secret key accepted')
    return secretkey

def listService(awsService, region):
    service = boto3.client(
        service_name=awsService,
        region_name=region,
        aws_access_key_id=AWS_Access_Key(),
        aws_secret_access_key=AWS_Secret_Key()
    )
    instances = service.describe_instances(
        Filters=[
            {'Name': 'instance-state-name', 'Values': ['running']}
        ])
    print(instances)

listService('ec2', 'us-east-1')

