
















A comprehensive guide to creating an AWS Lambda function that monitors an S3 URI for new uploads. The function captures the details of uploaded objects (URI, name, size, and type) and sends an email notification to selected users at the end of each day. Includes code examples, instructions for setting up the AWS services, and explanations of key concepts.
Typology: Assignments
Create a new folder for the project, open that folder in Visual Studio Code, and then open an integrated terminal in Visual Studio Code.
Using AWS Lambda, write a program to monitor an S3 URI. Whenever a user uploads data into the S3 storage, the program should capture the details. At the end of the day, the program must send out an email to selected users displaying the following information.
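The details the email must report can be assembled entirely with the standard library. The sketch below is a hypothetical helper (not part of the assignment's own code) showing how the URI, name, size, and type might be derived for an object that has been downloaded locally:

```python
import os
import mimetypes

def describe_object(bucket, key, local_path):
    """Assemble the details the notification email should report.
    Illustrative helper; the names are assumptions, not assignment code."""
    object_type, _ = mimetypes.guess_type(local_path)
    return {
        's3_uri': f's3://{bucket}/{key}',             # full S3 URI of the upload
        'object_name': os.path.basename(key),         # file name without prefix
        'object_size': os.path.getsize(local_path),   # size in bytes
        'object_type': object_type,                   # MIME type, e.g. image/png
    }
```

`mimetypes.guess_type` infers the type from the file extension, which is usually sufficient for this exercise; inspecting the object's `ContentType` metadata in S3 is an alternative.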
Leave everything else at its defaults, then click on the “Create bucket” button.
We will upload files to this S3 bucket later. Similarly, create another S3 bucket for storing the resized versions of the images (thumbnails).
Now let us move on to creating the Lambda function. Go to the main console and search for the “Lambda” service. Next, click on Functions -> Create function. We are going to start from scratch, so click on the “Author from scratch” option. Then I am giving the function name as “myThumbnailCreatorFunction”.
Create an account in the Amazon Simple Email Service (SES) and follow the instructions until your identity shows as Verified, like below. Let us now put our code in the lambda_function tab:

import json
import urllib.parse
import boto3

print('Loading function')

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # print("Received event: " + json.dumps(event, indent=2))
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    try:
        response = s3.get_object(Bucket=bucket, Key=key)
        print("CONTENT TYPE: " + response['ContentType'])
        return response['ContentType']
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as this function.'.format(key, bucket))
        raise e

Now click on the “Add trigger” button. This code needs to be triggered by an event: whenever a new file is uploaded to the S3 bucket.
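When S3 invokes the function, the event carries the bucket name and the URL-encoded object key. A minimal sketch of the record shape (the bucket and key values here are made up for illustration; the Lambda console's built-in “s3-put” test template has the same structure) and how the handler decodes the key:

```python
import urllib.parse

# Illustrative S3 put-event record; values are placeholders.
sample_event = {
    'Records': [{
        's3': {
            'bucket': {'name': 'my-upload-bucket'},
            'object': {'key': 'photos/my+cat.jpg'}  # '+' encodes a space
        }
    }]
}

record = sample_event['Records'][0]
bucket = record['s3']['bucket']['name']
# unquote_plus turns '+' back into a space and decodes %XX escapes
key = urllib.parse.unquote_plus(record['s3']['object']['key'], encoding='utf-8')
print(bucket, key)  # my-upload-bucket photos/my cat.jpg
```

Pasting an event like this into the Lambda console's Test tab lets you exercise the handler without uploading a real file.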
To create the policy (console)
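The function's execution role needs to read uploads and write thumbnails in S3, send mail through SES, and write CloudWatch logs. The sketch below builds such a policy as a Python dict and dumps it to JSON for pasting into the console; the wildcard resources are assumptions for this exercise and should be narrowed to your specific bucket and identity ARNs in practice:

```python
import json

# Hypothetical policy document; resource ARNs are placeholders.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # read uploads and write thumbnails
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::*"
        },
        {   # send the notification email via SES
            "Effect": "Allow",
            "Action": ["ses:SendEmail"],
            "Resource": "*"
        },
        {   # CloudWatch logging for the function
            "Effect": "Allow",
            "Action": ["logs:CreateLogGroup", "logs:CreateLogStream",
                       "logs:PutLogEvents"],
            "Resource": "*"
        }
    ]
}

print(json.dumps(policy, indent=2))
```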
body = (
    f"Dear User,\n\n"
    f"This is to inform you that an object has been uploaded to the S3 storage. Below are the details:\n\n"
    f"- S3 Uri: {s3_uri}\n"
    f"- Object Name: {object_name}\n"
    f"- Object Size: {object_size} bytes\n"
    f"- Object Type: {object_type}\n\n"
    f"Thank you for using our service.\n\n"
    f"Best regards,\nGanesh Malyala"
)

print(f"Subject: {subject}\n\n{body}")

response = ses_client.send_email(
    Source='2023ht66013@wilp.bits-pilani.ac.in',
    Destination={
        'ToAddresses': ['2023ht66013@wilp.bits-pilani.ac.in']
    },
    Message={
        'Subject': {'Data': subject, 'Charset': 'utf-8'},
        'Body': {'Text': {'Data': body, 'Charset': 'utf-8'}}
    }
)

print(f"Email sent successfully. Message ID: {response['MessageId']}")

def lambda_handler(event, context):
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = unquote_plus(record['s3']['object']['key'])
        tmpkey = key.replace('/', '')
        download_path = '/tmp/{}{}'.format(uuid.uuid4(), tmpkey)
        upload_path = '/tmp/resized-{}'.format(tmpkey)

        s3_client.download_file(bucket, key, download_path)
        resize_image(download_path, upload_path)
        s3_client.upload_file(upload_path, '{}-resized'.format(bucket), 'resized-{}'.format(key))

        s3_uri = f's3://{bucket}/{key}'
        object_name = os.path.basename(key)
        object_size = os.path.getsize(download_path)
        object_type, _ = mimetypes.guess_type(download_path)

        if object_type and object_type.startswith('image'):
            thumbnail_data = generate_thumbnail(bucket, key)
            thumbnail_key = f'thumbnails/{object_name}'
            s3_client.put_object(Body=thumbnail_data, Bucket=bucket, Key=thumbnail_key)

        send_email(s3_uri, object_name, object_size, object_type)

        os.remove(download_path)
        os.remove(upload_path)
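The handler above calls resize_image, which is not shown in this excerpt. A plausible Pillow-based sketch of it (the 128×128 maximum thumbnail size is an assumption):

```python
from PIL import Image

def resize_image(image_path, resized_path, size=(128, 128)):
    """Downscale the uploaded image to a thumbnail.
    Image.thumbnail resizes in place and preserves the aspect ratio,
    so the result fits within `size` without distortion."""
    with Image.open(image_path) as image:
        image.thumbnail(size)
        image.save(resized_path)
```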
In the same directory in which we created our lambda_function.py file, we will create a new directory named package and install the Pillow (PIL) library and the AWS SDK for Python (Boto3). Although the Lambda Python runtime includes a version of the Boto3 SDK, we will add all of our function's dependencies to our deployment package, even if they are included in the runtime.

mkdir package
pip install \
    --platform manylinux2014_x86_64 \
    --target=package \
    --implementation cp \
    --python-version 3.9 \
    --only-binary=:all: --upgrade \
    pillow boto3

The Pillow library contains C/C++ code. By using the --platform manylinux2014_x86_64 and --only-binary=:all: options, pip will download and install a version of Pillow that contains pre-compiled binaries compatible with the Amazon Linux 2 operating system. This ensures that our deployment package will work in the Lambda execution environment, regardless of the operating system and architecture of the local build machine. Now we will create a .zip file containing our application code and the Pillow and Boto3 libraries, making sure that our lambda_function.py file and the folders containing our dependencies are all at the root of the lambda_function.zip file.
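If the zip CLI is available, this is just zipping the contents of package/ and then adding lambda_function.py on top. For a portable alternative, the packaging step can be sketched with the standard-library zipfile module (the function name is illustrative):

```python
import os
import zipfile

def build_deployment_zip(package_dir, handler_file, zip_path):
    """Zip the installed dependencies and the handler so that both
    sit at the ROOT of the archive, as Lambda requires."""
    with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zf:
        for root, _, files in os.walk(package_dir):
            for name in files:
                full = os.path.join(root, name)
                # store paths relative to package/ so deps land at the root
                zf.write(full, os.path.relpath(full, package_dir))
        zf.write(handler_file, os.path.basename(handler_file))
```

The archive can then be uploaded on the function's Code tab (or pointed at from S3 if it exceeds the console upload limit).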