
AWS Lambda Function for S3 Object Monitoring and Email Notification, Assignments of Computer Science

A comprehensive guide to creating an AWS Lambda function that monitors an S3 URI for new uploads. The function captures details of uploaded objects, including URI, name, size, and type, and sends an email notification to specified users at the end of each day. Includes code examples, instructions for setting up AWS services, and explanations of key concepts.

Typology: Assignments

2023/2024

Uploaded on 11/16/2024

ganeshmalyala 🇮🇳



Section 1

Create a New Folder:

  1. Open any directory of your choice.
  2. Right-click and select "New" -> "Folder."
  3. Name the folder "custom-docker".

Open Visual Studio Code:

  1. Download and install Visual Studio Code from https://code.visualstudio.com/.
  2. Open Visual Studio Code.

Open the Folder in Visual Studio Code:

  1. Click on "File" -> "Open Folder..."
  2. Navigate to the "custom-docker" folder and select it.

Create a New File:

  1. Inside Visual Studio Code, right-click on the "custom-docker" folder in the Explorer.
  2. Select "New File" and name it "dockerfile" (without an extension).

Install Docker Extension:

WORKDIR sets the working directory inside the container:

WORKDIR /vim

ENTRYPOINT specifies which command to run when the container starts - any executable can serve as the start point:

ENTRYPOINT ["/bin/bash"]

Open Terminal in Visual Studio Code:

  1. Click on "View" -> "Terminal" to open the integrated terminal.

Run Docker Commands:

  1. docker build -t myubuntu . [This builds a custom image with the tag 'myubuntu'. The dot indicates the current location.]
  2. docker images [This lists the images. Verify that the 'myubuntu' image is present.]
  3. docker run -it myubuntu [This starts the container and places you in the /vim directory.]
  4. pwd [This prints the current working directory (/vim).]
  5. vim [This opens Vim. Press Escape, then type :q to exit.]

Screenshot provided on the next page.
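Putting the fragments above together, the dockerfile might look like the following minimal sketch. Only the WORKDIR and ENTRYPOINT lines appear in the text; the base image and the Vim install step are assumptions filled in for illustration:

```dockerfile
# Assumed base image and Vim install (not shown in the original text)
FROM ubuntu:latest
RUN apt-get update && apt-get install -y vim

# Set the container's working directory
WORKDIR /vim

# Start an interactive bash shell when the container runs
ENTRYPOINT ["/bin/bash"]
```

With this file saved as "dockerfile" in the custom-docker folder, the docker build and docker run commands listed above apply unchanged.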

Section 2

Using AWS Lambda, write a program to monitor an S3 URI. Whenever a user uploads data into the S3 storage, the program should capture the details. At the end of the day the program must send out an email to select users displaying the following information:

  1. S3 URI
  2. Object Name
  3. Object Size
  4. Object Type

In addition to the above, the program should create a thumbnail and store it in the same URI in case the user uploads an image (.jpg/.jpeg/.png).

First, create an AWS S3 (Simple Storage Service) bucket. This is where we are going to upload our files, and we want that upload to trigger our Lambda function. So we will create a new bucket and name it S3-Media-bucket-for-Thumbnail-creation-20-11- For region, I am choosing North Virginia as I feel it will be one of the cheapest regions.

Leave everything else at the defaults, then click the "Create bucket" button.

We will upload files to this S3 Storage later. Similarly, create another S3 Bucket for storing the resized version of the images (thumbnails).

Now let us look at the Lambda function creation. Go to the main console and search for the "Lambda" service. Next click on Functions -> Create function. We are going to start from scratch, so click on the "Author from scratch" option. Then I am giving the Function name as "myThumbnailCreatorFunction".

Create an account in the Amazon Simple Email Service (SES) and follow the instructions until you get Verified as shown below.

Let us now put our code in the lambda_function tab:

import json
import urllib.parse
import boto3

print('Loading function')

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # print("Received event: " + json.dumps(event, indent=2))

    # Get the object from the event and show its content type
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    try:
        response = s3.get_object(Bucket=bucket, Key=key)
        print("CONTENT TYPE: " + response['ContentType'])
        return response['ContentType']
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as this function.'.format(key, bucket))
        raise e

Now click on the "Add trigger" button. This code needs to be triggered by an event - whenever a new file is uploaded to the S3 Bucket.
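The handler pulls the bucket name and object key out of the S3 event payload. S3 percent-encodes special characters in event notifications, which is why the key is passed through unquote_plus. As a quick illustration, using a made-up event record (not output from the actual deployment):

```python
import urllib.parse

# A minimal, hypothetical S3 event payload of the shape Lambda receives
event = {
    'Records': [
        {
            's3': {
                'bucket': {'name': 'my-media-bucket'},
                'object': {'key': 'photos/summer+trip/cat%3A1.jpg'}
            }
        }
    ]
}

bucket = event['Records'][0]['s3']['bucket']['name']
# unquote_plus turns '+' into spaces and decodes %XX escapes
key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'],
                                encoding='utf-8')

print(bucket)  # my-media-bucket
print(key)     # photos/summer trip/cat:1.jpg
```

Without the decoding step, a key containing spaces or punctuation would fail the subsequent get_object call.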

To create the policy (console)

  1. Open the Policies page of the AWS Identity and Access Management (IAM) console.
  2. Choose Create policy.
  3. Choose the JSON tab, and then paste the following custom policy into the JSON editor.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:PutLogEvents",
                "logs:CreateLogGroup",
                "logs:CreateLogStream"
            ],
            "Resource": "arn:aws:logs:::"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::/"
        },
        {
            "Effect": "Allow",
            "Action": [
                "ses:SendEmail",
                "ses:SendRawEmail"
            ],
            "Resource": ""
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::/"
        }
    ]
}

  4. Choose Next: Tags.
  5. Choose Next: Review.
  6. Under Review policy, for Name, enter AWSLambdaS3ExecutionRole.
  7. Choose Create policy.
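The IAM console rejects malformed JSON, and flattened copies of this policy tend to lose commas and closing braces. One way to sanity-check the document before pasting is to round-trip it through Python's json module (the resource ARNs are left as the placeholders that appear in the text):

```python
import json

# The custom policy from the steps above, expressed as a Python dict
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow",
         "Action": ["logs:PutLogEvents", "logs:CreateLogGroup", "logs:CreateLogStream"],
         "Resource": "arn:aws:logs:::"},
        {"Effect": "Allow", "Action": ["s3:GetObject"], "Resource": "arn:aws:s3:::/"},
        {"Effect": "Allow", "Action": ["ses:SendEmail", "ses:SendRawEmail"], "Resource": ""},
        {"Effect": "Allow", "Action": ["s3:PutObject"], "Resource": "arn:aws:s3:::/"},
    ],
}

# Serializing and re-parsing confirms the policy is well-formed JSON
policy_json = json.dumps(policy, indent=4)
assert json.loads(policy_json)["Version"] == "2012-10-17"
print(policy_json)
```

Any missing delimiter would surface here as a json.JSONDecodeError rather than a cryptic console error later.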

Create the function deployment package:

body = (
    f"Dear User,\n\n"
    f"This is to inform you that an object has been uploaded to the S3 storage. Below are the details:\n\n"
    f"- S3 Uri: {s3_uri}\n"
    f"- Object Name: {object_name}\n"
    f"- Object Size: {object_size} bytes\n"
    f"- Object Type: {object_type}\n\n"
    f"Thank you for using our service.\n\n"
    f"Best regards,\nGanesh Malyala"
)

Placeholder code for demonstration (prints the email instead of sending it):

print(f"Subject: {subject}\n\n{body}")
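Filling the template with sample values shows what the printed placeholder output looks like. The values and the subject line below are illustrative assumptions, not data from the original deployment:

```python
# Hypothetical values standing in for a real upload
s3_uri = 's3://my-media-bucket/photos/cat.jpg'
object_name = 'cat.jpg'
object_size = 20480
object_type = 'image/jpeg'
subject = 'S3 Upload Notification'  # assumed subject; not shown in the text

body = (
    f"Dear User,\n\n"
    f"This is to inform you that an object has been uploaded to the S3 storage. Below are the details:\n\n"
    f"- S3 Uri: {s3_uri}\n"
    f"- Object Name: {object_name}\n"
    f"- Object Size: {object_size} bytes\n"
    f"- Object Type: {object_type}\n\n"
    f"Thank you for using our service.\n\n"
    f"Best regards,\nGanesh Malyala"
)

print(f"Subject: {subject}\n\n{body}")
```

This print-only version is useful while the SES identities are still pending verification, since send_email fails until both sender and recipient are verified in sandbox mode.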

Send email using AWS SES:

response = ses_client.send_email(
    Source='2023ht66013@wilp.bits-pilani.ac.in',
    Destination={
        'ToAddresses': ['2023ht66013@wilp.bits-pilani.ac.in']
    },
    Message={
        'Subject': {'Data': subject, 'Charset': 'utf-8'},
        'Body': {'Text': {'Data': body, 'Charset': 'utf-8'}}
    }
)

Print the message ID from the response:

print(f"Email sent successfully. Message ID: {response['MessageId']}")

def lambda_handler(event, context):
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = unquote_plus(record['s3']['object']['key'])
        tmpkey = key.replace('/', '')
        download_path = '/tmp/{}{}'.format(uuid.uuid4(), tmpkey)
        upload_path = '/tmp/resized-{}'.format(tmpkey)

        # Download the original image
        s3_client.download_file(bucket, key, download_path)

        # Resize image and upload
        resize_image(download_path, upload_path)
        s3_client.upload_file(upload_path, '{}-resized'.format(bucket), 'resized-{}'.format(key))

        # Extract object details
        s3_uri = f's3://{bucket}/{key}'
        object_name = os.path.basename(key)
        object_size = os.path.getsize(download_path)
        object_type, _ = mimetypes.guess_type(download_path)

        # Check if the uploaded file is an image
        if object_type and object_type.startswith('image'):
            # Generate thumbnail
            thumbnail_data = generate_thumbnail(bucket, key)
            thumbnail_key = f'thumbnails/{object_name}'

            # Upload thumbnail to the same S3 URI
            s3_client.put_object(Body=thumbnail_data, Bucket=bucket, Key=thumbnail_key)

        # Send email with object details
        send_email(s3_uri, object_name, object_size, object_type)

        # Clean up temporary files
        os.remove(download_path)
        os.remove(upload_path)
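The helpers resize_image and generate_thumbnail are referenced above but never defined in the text. A minimal sketch of what they might look like using Pillow is shown below; the function names, the 128x128 thumbnail size, and the local-file variant of thumbnail generation (the generate_thumbnail in the handler takes bucket and key and would download the object first) are all assumptions:

```python
import io

from PIL import Image

THUMBNAIL_SIZE = (128, 128)  # assumed size; not specified in the original

def resize_image(image_path, resized_path):
    # Shrink the image in place, preserving aspect ratio, and save a copy
    with Image.open(image_path) as image:
        image.thumbnail(THUMBNAIL_SIZE)
        image.save(resized_path)

def make_thumbnail_bytes(image_path):
    # Return thumbnail bytes suitable for s3_client.put_object(Body=...)
    with Image.open(image_path) as image:
        image.thumbnail(THUMBNAIL_SIZE)
        buffer = io.BytesIO()
        image.save(buffer, format='PNG')
        return buffer.getvalue()
```

Image.thumbnail never enlarges and keeps the aspect ratio, so a 640x480 upload becomes a 128x96 thumbnail rather than a distorted square.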

Note: Ensure that the Lambda function has the necessary IAM roles and permissions to access S3 and SES/SNS.

In the same directory in which we created our lambda_function.py file, we will create a new directory named package and install the Pillow (PIL) library and the AWS SDK for Python (Boto3). Although the Lambda Python runtime includes a version of the Boto3 SDK, we will add all of our function's dependencies to our deployment package, even if they are included in the runtime.

mkdir package
pip install \
    --platform manylinux2014_x86_64 \
    --target=package \
    --implementation cp \
    --python-version 3.9 \
    --only-binary=:all: --upgrade \
    pillow boto3

The Pillow library contains C/C++ code. By using the --platform manylinux2014_x86_64 and --only-binary=:all: options, pip will download and install a version of Pillow that contains pre-compiled binaries compatible with the Amazon Linux 2 operating system. This ensures that our deployment package will work in the Lambda execution environment, regardless of the operating system and architecture of the local build machine.

Now we will create a .zip file containing our application code and the Pillow and Boto3 libraries, making sure that our lambda_function.py file and the folders containing our dependencies are all at the root of the lambda_function.zip file.
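The zipping step can also be scripted. A sketch using Python's standard-library zipfile module (the function name and paths are illustrative), which places the installed dependencies and lambda_function.py at the root of the archive as Lambda requires:

```python
import os
import zipfile

def build_deployment_zip(package_dir, handler_file, zip_path):
    # Write everything under package_dir (the installed dependencies) and the
    # handler file itself to the root of the .zip archive.
    with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(package_dir):
            for name in files:
                full = os.path.join(root, name)
                # arcname relative to package_dir so dependencies sit at the root
                zf.write(full, os.path.relpath(full, package_dir))
        zf.write(handler_file, os.path.basename(handler_file))

# Example (paths assume the layout described above):
# build_deployment_zip('package', 'lambda_function.py', 'lambda_function.zip')
```

If the dependencies end up nested under a package/ prefix inside the archive instead of at the root, Lambda raises an import error for PIL at runtime.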