Programmatically Write Files from an EC2 Instance to an S3 Bucket in AWS

Tayo617
11 min readOct 22, 2020

Disclaimer: Over the past couple of months, I’ve been assigned various projects utilizing software tools that will help me develop hands-on experience in my pursuit of a promising DevOps opportunity. Although I still feel a bit wet behind the ears, I’m trying to avoid being intimidated by the complex functions of AWS because I see the immense value & benefits this platform provides for its customers. Please bear with me; the purpose of my Medium blog is simply to document my progression while I prepare for various certifications.

Amazon S3 provides access to reliable, fast, and inexpensive data storage infrastructure. It is designed to make web-scale computing easier by enabling you to store and retrieve any amount of data, at any time, from within Amazon EC2 or anywhere on the web. Amazon S3 stores data objects redundantly on multiple devices across multiple facilities and allows concurrent read or write access to these data objects by many separate clients or application threads.

Given the benefits of Amazon S3 for storage, you might decide to use this service to store files and data sets for use with EC2 instances. There are several ways to move data between Amazon S3 and your instances, and a variety of tools for accessing your data in Amazon S3 from your computer or your instance.

For today’s project, I will demonstrate how to write files from an EC2 instance to an S3 bucket via the AWS Management Console and a Terminal window.

1. Create an S3 bucket

An S3 bucket is where we store all objects (files, images, spreadsheets, etc.). First, we’ll log into the AWS Management Console (always make sure you’re in the region closest to your physical location). Go to or search for S3 under the Storage services section, then click Create Bucket. On the next screen, you’ll be asked to name the new bucket; I’ve named mine luit-ec2. Confirm once more that you’re logged into the proper region, then click Next.

S3 Bucket, Create a new Bucket
Naming the new S3 Bucket

We’ll skip the Properties & Set Permissions steps: click Next twice to proceed to the Review step, then press the Create Bucket button. The new bucket should now appear on the bucket list screen.

Complete creation of the S3 Bucket
Listed above, luit-ec2 Bucket
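For readers who prefer scripting, the same bucket can also be created programmatically with boto3 (the library we install later in this tutorial). This is just a sketch: the bucket name and region match my demo, and the live call is left commented because it requires configured AWS credentials.

```python
# Sketch: creating the demo bucket with boto3 instead of the console.
# The live call is commented out because it needs AWS credentials.

def bucket_request(name, region):
    """Build the keyword arguments for an S3 CreateBucket call.
    Regions other than us-east-1 require a LocationConstraint."""
    kwargs = {"Bucket": name}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs

# With credentials configured, the real call would be:
# import boto3
# s3 = boto3.client("s3", region_name="us-west-1")
# s3.create_bucket(**bucket_request("luit-ec2", "us-west-1"))
print(bucket_request("luit-ec2", "us-west-1"))
```

Note the region quirk: for us-east-1, S3 rejects an explicit LocationConstraint, which is why the helper only adds it for other regions.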

2. Create an EC2 Instance

Our EC2 instance operates as the virtual server, or compute piece, of our infrastructure. In the AWS Management Console, simply click the Services button in the upper-left corner, navigate to the EC2 Dashboard, then press the Launch Instance button. You’ll be prompted to ‘Choose an Amazon Machine Image (AMI)’; on the left-hand side of the screen, check the Free tier only box to ensure you choose an AMI that will not incur charges. We now Select the very first free-tier eligible AMI on the list.

On the Choose Instance Type screen, make sure to Select free-tier eligible t2.micro. Click Review & Launch, then Launch.

Choose a free-tier eligible AMI
Choose the free-tier eligible t2.micro Instance Type

A pop-up will appear requesting creation of a key pair, which is MANDATORY in order to SSH into the EC2 instance through the Terminal application. Make a note of where on your computer the key pair file is saved. Be sure to do the following:

  1. Select Create a new key pair.
  2. Provide a name for the key pair; I’ve named mine luit-ec2-key
  3. Click the Download Key Pair button
  4. Click the Launch Instances button

Note: At this point, I received an error because the EC2’s security group defaulted to allow SSH access from any IP address, but we want the rule to only allow access from our computer’s IP address. The error notification provides a link that navigates you to edit your security groups. Here, you’ll create a new security group and edit the rule’s source to designate it as ‘My IP’. Once you do that, hit the Review & Launch button. You should be brought back to the key pair screen from above, only this time choose the existing key pair option & select the key pair name you previously created. Click the Launch button, and you should now see a screen like the one below:

Successful EC2 Launch Status
EC2 Instance Dashboard with running Instances
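As an aside, the ‘My IP’ rule the console builds here boils down to a single SSH ingress permission. Scripted with boto3 it would look roughly like the sketch below; the IP address and security group ID are placeholders, not values from my account.

```python
# Sketch: an SSH-only ingress rule restricted to one address.
# "203.0.113.10/32" and the group ID below are placeholders.
ssh_rule = {
    "IpProtocol": "tcp",
    "FromPort": 22,
    "ToPort": 22,
    "IpRanges": [{"CidrIp": "203.0.113.10/32", "Description": "My IP only"}],
}

# Applying it would require boto3 and credentials:
# import boto3
# ec2 = boto3.client("ec2")
# ec2.authorize_security_group_ingress(
#     GroupId="sg-0123456789abcdef0",  # placeholder security group ID
#     IpPermissions=[ssh_rule],
# )
print(ssh_rule)
```

The /32 suffix is what limits the rule to exactly one address, which is what the console’s ‘My IP’ option fills in for you.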

3. Create an IAM Role

AWS Identity and Access Management (IAM) enables you to manage access to AWS services and resources securely. Using IAM, you can create and manage AWS users and groups, and use permissions to allow and deny their access to AWS resources.

An IAM role is an IAM entity that defines a set of permissions for making AWS service requests. IAM roles are not associated with a specific user or group. Instead, trusted entities assume roles, such as IAM users, applications, or AWS services such as EC2. IAM roles are designed so that your applications can securely make API requests from your instances, without requiring you to manage the security credentials that the applications use. In this case, we can use IAM roles to grant permissions to applications running on instances that need to use a bucket in Amazon S3.
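Under the hood, choosing EC2 as the trusted entity simply attaches a small JSON trust policy to the role, stating that the EC2 service may assume it. Below is a sketch of roughly what the console generates for you:

```python
import json

# Roughly the trust policy the console generates when EC2 is
# chosen as the trusted service for a new role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "ec2.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

print(json.dumps(trust_policy, indent=2))
```

This is separate from the permission policy (AmazonS3FullAccess) we attach next: the trust policy says who may assume the role, while the permission policy says what the role may do.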

Click the top-left Services button in the AWS Management Console, find & select the IAM service, then click on Roles. On the next screen, click the Create Role button. Click the AWS service box, then choose EC2, since we will be assigning permissions to the EC2 server. Select Next: Permissions.

IAM Role Dashboard
IAM Roles 2

On the Attach Permission Policies screen, filter the policies by entering S3 in the search field. In order to permit our EC2 to access an S3 bucket, we need to select AmazonS3FullAccess, then Next twice until you’ve reached the Review screen.

IAM Role Permission Policies

Create a Role Name for this policy (I’ve created LUIT-EC2-S3-Access). Click on the Create Role button.

Now you should see a screen like the following, confirming successful creation of your IAM role. In our next step, we’ll attach the IAM role to our EC2 instance.

Successful creation of IAM Role for EC2 to S3 Access

4. Apply the IAM role to an EC2 instance

We’ll need to SSH into our EC2 instance. Again, return back to the AWS Management Console’s Services button & navigate to the EC2 dashboard under Compute services. Note: For the following steps, I switched to the old EC2 dashboard view because it is visibly easier to follow.

a. In the EC2 dashboard, click on the Instances button. You should see your EC2 instance on this screen; select it and then click the Connect button. A pop-up window will open to instruct you on how to connect to your EC2 instance.

b. I am a macOS user, so I’ll be using my Mac Terminal app to complete the following step(s). Included in this pop-up window are the access instructions & commands to apply in your Terminal window, along with the name of the key pair file downloaded in a previous step. In the terminal window, we’ll navigate to the downloads folder by entering the cd Downloads command

c. Run the chmod command from the pop-up window from the terminal. In our case, chmod 400 luit-ec2-key.pem

d. To SSH connect into the EC2 instance using its Public DNS, we can copy & paste the following command into our terminal:

ssh -i "luit-ec2-key.pem" ec2-user@ec2-13-56-18-198.us-west-1.compute.amazonaws.com

Connection instructions for EC2 instance
SSH connecting into EC2 instance via Terminal

e. While we’re inside the EC2, we’re going to attempt to access the S3 bucket without adding the IAM role. Run the command:

aws s3 ls

The command returned an error: “Unable to locate credentials. You can configure credentials by running ‘aws configure’”, so instead we’ll add our IAM role from the AWS console. (Note: do NOT close your terminal window. We’re done here for now, but we’ll be running more commands further along, so keep the terminal open.)

f. Return to our EC2 instance screen and close the EC2 connection instructions pop-up window. With the EC2 instance still selected, click the Actions button. This opens a drop-down menu; go to Instance Settings → Attach/Replace IAM Role. Find & select our created IAM role in the drop-down list, then click the Apply button.

Applying the IAM role to EC2
Applying IAM role to EC2

g. After successfully applying the IAM role, head back to the terminal window and re-run aws s3 ls, which should list all existing S3 buckets. For this demo, it is luit-ec2.
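The same check can be reproduced in Python with boto3, which is a nice preview of what we do in the next step. The live call only works on the instance once the role is attached, so it’s left commented here; the small helper just shows the shape of the ListBuckets response.

```python
# Sketch: the boto3 equivalent of `aws s3 ls`. The live call is
# commented out since it only works with the instance role attached.
# import boto3
# s3 = boto3.client("s3")
# response = s3.list_buckets()

def bucket_names(response):
    """Pull the bucket names out of a ListBuckets response dict."""
    return [b["Name"] for b in response["Buckets"]]

# A ListBuckets response carries a "Buckets" list of dicts:
sample = {"Buckets": [{"Name": "luit-ec2"}]}
print(bucket_names(sample))  # ['luit-ec2']
```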

5. Write Python onto the EC2 instance

a. Upgrade to Python3 on EC2: By default, Python2 comes pre-installed on EC2 instances; however, we’re going to upgrade to Python3. While still logged into our EC2 instance in the terminal, we’ll run the following command to determine what versions are currently available:

sudo yum list | grep python3

The result above confirms Python3 is in fact available, so now we’ll run the following command to install it onto our EC2 instance (while running, it will prompt you to confirm the installation, to which we type ‘y’):

sudo yum install python3
Installation upgrade to Python3
Completed installation upgrade to Python3

b. Install packages with pip: We’ll now use the boto3 package to programmatically access S3. Install this package using this command:

sudo pip3 install boto3

c. Create an empty Python file using the below command:

touch my_script.py

d. Add logging code to the Python file: Using the Vim text editor, open and edit the file with the command vi my_script.py

Note: Performing this step requires you to utilize Vim modes & commands in order to properly navigate & edit the file. After the blank editor screen appears, copy & paste the following code into the file. The purpose of this code is to use the boto3 package to write a text file called “ec2.txt” to S3 containing the current time in a string. (I switched into Vim’s insert mode to paste the code and add a couple of blank lines between the statements, then hit ESC to return to command mode and entered :wq at the bottom of the screen to save & exit the text editor.)

import boto3
from datetime import datetime

cli = boto3.client('s3')
cli.put_object(
    Body='The time now is ' + str(datetime.now()),
    Bucket='luit-ec2',
    Key='ec2.txt')

We’re returned to the shell prompt where we installed the boto3 package, still logged into the EC2 instance. We run the script above with the command below:

python3 my_script.py

I initially received an error message because I did NOT correctly apply my S3 bucket name in the Python code, so I had to re-open the file in Vim (vi my_script.py), add my luit-ec2 S3 bucket name within the code, and save & exit again. The results are now correct and are illustrated below.
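For an extra check before leaving the terminal, the object can also be read back with boto3 from the same session. This is a sketch using the same bucket and key as the script above; the GetObject call is commented since it must run on the instance with the role attached.

```python
# Sketch: reading ec2.txt back to confirm the write succeeded.
# On the instance, the live read-back would be:
# import boto3
# cli = boto3.client('s3')
# obj = cli.get_object(Bucket='luit-ec2', Key='ec2.txt')
# print(decode_body(obj['Body'].read()))

def decode_body(raw_bytes):
    """GetObject bodies stream back as bytes; decode them to text."""
    return raw_bytes.decode('utf-8')

# Example with the kind of content our script wrote:
print(decode_body(b'The time now is 2020-10-22 14:05:09.123456'))
```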

This concludes all the necessary steps within the terminal window. We can now return to the AWS console.

6. Inspect the Python file in AWS Console

We previously left off in the EC2 instance dashboard, now we’ll switch back to our S3 dashboard by clicking the top-left corner Services button, and navigate to S3. (Note: my screenshots are of the old console version)

a. Select the S3 bucket checkbox, then click the bucket name to view its contents.

luit-ec2 S3 bucket
ec2.txt file inside the S3 bucket (Python file created via Mac Terminal from step #5)

b. Click the file name, then click the Download button, which downloads the file to your computer. Open it with your computer’s default text editor application (for macOS users, it’s the TextEdit app). Voila! Now we see the contents of the .txt file, written by our Python code.

ec2.txt file
Contents inside the ec2.txt file

Conclusion: Although constantly switching between the console and the terminal made this rather complex, this was an enjoyable project for me because I came to understand the flow of writing files from an EC2 instance to an S3 bucket while also resolving the troubleshooting issues I encountered along the way. Due to my lack of exposure, I watched a couple of YouTube videos to assist me with some minor steps I did not initially grasp. For my first opportunity using the Vim text editor & writing Python code, I found myself comfortable applying their functions to this project, and I understand how valuable proficiency with these tools is for completing various projects/tasks.

I feel a slight bit of confidence that I could perform a similar project in a test or live production environment (hopefully by that point, I will have obtained a couple of certifications and have a better knowledge of Python). I look forward to developing more experience with both the console & the terminal. For those of you just starting out in AWS, I hope you enjoyed this tutorial and found it useful. Thanks for your time.
