The goal is to define an AWS Lambda function that responds to an event emitted by Amazon S3.
Lambda function responding to an S3 event
There are many real-world applications that fit this scenario. For example, uploading an image to Amazon S3 automatically triggers a put event. The put event then triggers an AWS Lambda function that resizes the image, generates a new image, and puts the new image back to Amazon S3. The whole process can be illustrated with the following picture from Amazon,

where we have the following processing steps.
- User uploads an object to an S3 bucket (object-created event).
- Amazon S3 detects the object-created event.
- Amazon S3 invokes a Lambda function that is specified in the bucket notification configuration.
- AWS Lambda executes the Lambda function by assuming the execution role that you specified at the time you created the Lambda function.
- The Lambda function executes.
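In the walkthrough below, the example function writes its output back under a `resized-` prefix (which is why the later steps look for `resized-HappyFace.jpg`). The naming convention alone can be sketched in shell; this mirrors the behaviour of the example `CreateThumbnail.js`, not the actual resize code:

```shell
# derive the target object key from the source key
# (naming convention only; the image processing itself
#  happens inside the Lambda function)
src_key=HappyFace.jpg
dst_key=resized-$src_key
echo "$dst_key"   # resized-HappyFace.jpg
```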
‘Using AWS Lambda with Amazon S3’ is a fairly nice tutorial from Amazon which gives step-by-step instructions to achieve this goal. Meanwhile, everything documented there can also be achieved with the following scripts via the Amazon AWS CLI.
Lambda function in JavaScript
The bash script for running JavaScript through an AWS Lambda function can be found on my GitHub.
Create a Lambda function
Set up environment variables
```bash
# variable names
source_bucket=hongyusuoriginal
target_bucket=${source_bucket}resized
function=CreateThumbnail
```

and define the names of the different access roles
```bash
# role names
lambda_execution_role_name=lambda-$function-execution
lambda_execution_access_policy_name=lambda-$function-execution-access
lambda_invocation_role_name=lambda-$function-invocation
lambda_invocation_access_policy_name=lambda-$function-invocation-access
log_group_name=/aws/lambda/$function
```
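With `function` set as above, the derived names expand as follows; a quick local sanity check that involves no AWS calls:

```shell
# reproduce the naming scheme locally to confirm the expansions
function=CreateThumbnail
lambda_execution_role_name=lambda-$function-execution
lambda_invocation_role_name=lambda-$function-invocation
log_group_name=/aws/lambda/$function
echo "$lambda_execution_role_name"   # lambda-CreateThumbnail-execution
echo "$log_group_name"               # /aws/lambda/CreateThumbnail
```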
Create two buckets in Amazon S3 and upload a sample image
```bash
# buckets
aws s3 mb s3://$source_bucket
aws s3 mb s3://$target_bucket
aws s3 cp HappyFace.jpg s3://$source_bucket/
```
Create a Lambda function deployment package by downloading the JavaScript source

```bash
wget -q -O $function.js http://run.alestic.com/lambda/aws-examples/CreateThumbnail.js
```

and build a deployable JavaScript package with all dependencies

```bash
npm install async gm
zip -r $function.zip $function.js node_modules
```
Define Lambda role and policy
Create an IAM role for the Lambda function
```bash
lambda_execution_role_arn=$(aws iam create-role \
  --role-name "$lambda_execution_role_name" \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Sid": "",
        "Effect": "Allow",
        "Principal": { "Service": "lambda.amazonaws.com" },
        "Action": "sts:AssumeRole"
      }
    ]
  }' \
  --output text \
  --query 'Role.Arn'
)
echo lambda_execution_role_arn=$lambda_execution_role_arn
```

and the associated policy
```bash
aws iam put-role-policy \
  --role-name "$lambda_execution_role_name" \
  --policy-name "$lambda_execution_access_policy_name" \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Action": ["logs:*"],
        "Resource": "arn:aws:logs:*:*:*"
      },
      {
        "Effect": "Allow",
        "Action": ["s3:GetObject"],
        "Resource": "arn:aws:s3:::'$source_bucket'/*"
      },
      {
        "Effect": "Allow",
        "Action": ["s3:PutObject"],
        "Resource": "arn:aws:s3:::'$target_bucket'/*"
      }
    ]
  }'
```
Upload the Lambda function and test it with a fake S3 event
Upload the deployment package
```bash
aws lambda create-function \
  --region us-west-2 \
  --function-name "$function" \
  --zip-file "fileb://$function.zip" \
  --role "$lambda_execution_role_arn" \
  --handler "$function.handler" \
  --runtime nodejs \
  --timeout 30 \
  --memory-size 1024
```
Define a fake S3 event consisting of the data that will be passed to the Lambda function
```bash
cat > $function-data.json <<EOM
{
  "Records":[
    {
      "eventVersion":"2.0",
      "eventSource":"aws:s3",
      "awsRegion":"us-east-1",
      "eventTime":"1970-01-01T00:00:00.000Z",
      "eventName":"ObjectCreated:Put",
      "userIdentity":{
        "principalId":"AIDAJDPLRKLG7UEXAMPLE"
      },
      "requestParameters":{
        "sourceIPAddress":"127.0.0.1"
      },
      "responseElements":{
        "x-amz-request-id":"C3D13FE58DE4C810",
        "x-amz-id-2":"FMyUVURIY8/IgAtTv8xRjskZQpcIZ9KG4V5Wp6S7S/JRWeUWerMUE5JgHvANOjpD"
      },
      "s3":{
        "s3SchemaVersion":"1.0",
        "configurationId":"testConfigRule",
        "bucket":{
          "name":"$source_bucket",
          "ownerIdentity":{
            "principalId":"A3NL1KOZZKExample"
          },
          "arn":"arn:aws:s3:::$source_bucket"
        },
        "object":{
          "key":"HappyFace.jpg",
          "size":1024,
          "eTag":"d41d8cd98f00b204e9800998ecf8427e",
          "versionId":"096fKKXTRTtl3on89fVO.nfljtsv6qko"
        }
      }
    }
  ]
}
EOM
```
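Since the heredoc interpolates `$source_bucket` into the JSON, it is worth sanity-checking that the generated file parses as valid JSON before invoking Lambda. A minimal version of the check (assumes `python3` is available; the event below is a cut-down stand-in for the full one):

```shell
# build a minimal version of the test event and verify it parses as JSON
source_bucket=hongyusuoriginal
function=CreateThumbnail
cat > $function-data.json <<EOM
{
  "Records": [
    {
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": { "name": "$source_bucket" },
        "object": { "key": "HappyFace.jpg" }
      }
    }
  ]
}
EOM
python3 -m json.tool $function-data.json > /dev/null && echo "event file OK"
```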
Invoke the Lambda function and pass in the JSON data
```bash
aws lambda invoke-async \
  --function-name "$function" \
  --invoke-args "$function-data.json"
```

and check the generated image in the target S3 bucket

```bash
aws s3 ls s3://$target_bucket
```
Define S3 role and policy
Create an IAM role for S3
```bash
lambda_invocation_role_arn=$(aws iam create-role \
  --role-name "$lambda_invocation_role_name" \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Sid": "",
        "Effect": "Allow",
        "Principal": { "Service": "s3.amazonaws.com" },
        "Action": "sts:AssumeRole",
        "Condition": {
          "StringLike": { "sts:ExternalId": "arn:aws:s3:::*" }
        }
      }
    ]
  }' \
  --output text \
  --query 'Role.Arn'
)
echo lambda_invocation_role_arn=$lambda_invocation_role_arn
```
Create the policy for this role
```bash
aws iam put-role-policy \
  --role-name "$lambda_invocation_role_name" \
  --policy-name "$lambda_invocation_access_policy_name" \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Action": ["lambda:InvokeFunction"],
        "Resource": ["*"]
      }
    ]
  }'
```
Connect Lambda with S3
Get the ARN of the Lambda function
```bash
lambda_function_arn=$(aws lambda get-function-configuration \
  --function-name "$function" \
  --output text \
  --query 'FunctionArn'
)
echo lambda_function_arn=$lambda_function_arn
```
Configure S3 to invoke the Lambda function
```bash
aws s3api put-bucket-notification \
  --bucket "$source_bucket" \
  --notification-configuration '{
    "CloudFunctionConfiguration": {
      "CloudFunction": "'$lambda_function_arn'",
      "InvocationRole": "'$lambda_invocation_role_arn'",
      "Event": "s3:ObjectCreated:*"
    }
  }'
```
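Note that newer AWS CLI releases replace `put-bucket-notification` with `put-bucket-notification-configuration`, which takes a `LambdaFunctionConfigurations` payload (and grants S3 a resource-based permission on the function via `aws lambda add-permission` rather than an invocation role). A hedged sketch of the equivalent call, using an example ARN, might look like:

```shell
# hypothetical equivalent using the newer notification API; the ARN is an example
lambda_function_arn=arn:aws:lambda:us-west-2:123456789012:function:CreateThumbnail
cat > notification.json <<EOM
{
  "LambdaFunctionConfigurations": [
    {
      "LambdaFunctionArn": "$lambda_function_arn",
      "Events": ["s3:ObjectCreated:*"]
    }
  ]
}
EOM
# aws s3api put-bucket-notification-configuration \
#   --bucket "$source_bucket" \
#   --notification-configuration file://notification.json
python3 -m json.tool notification.json > /dev/null && echo "notification config OK"
```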
Check Lambda with a real S3 event
Check the Lambda function with a real S3 event
```bash
aws s3 ls s3://$source_bucket
aws s3 ls s3://$target_bucket
aws s3 rm s3://$source_bucket/HappyFace.jpg
aws s3 rm s3://$target_bucket/resized-HappyFace.jpg
aws s3 cp HappyFace.jpg s3://$source_bucket/
aws s3 ls s3://$source_bucket
aws s3 ls s3://$target_bucket
```
Clean up
Clean up the workspace on the Amazon side
```bash
aws s3 rm s3://$target_bucket/resized-HappyFace.jpg
aws s3 rm s3://$source_bucket/HappyFace.jpg
aws s3 rb s3://$target_bucket/
aws s3 rb s3://$source_bucket/
aws lambda delete-function \
  --function-name "$function"
aws iam delete-role-policy \
  --role-name "$lambda_execution_role_name" \
  --policy-name "$lambda_execution_access_policy_name"
aws iam delete-role \
  --role-name "$lambda_execution_role_name"
aws iam delete-role-policy \
  --role-name "$lambda_invocation_role_name" \
  --policy-name "$lambda_invocation_access_policy_name"
aws iam delete-role \
  --role-name "$lambda_invocation_role_name"
log_stream_names=$(aws logs describe-log-streams \
  --log-group-name "$log_group_name" \
  --output text \
  --query 'logStreams[*].logStreamName') &&
for log_stream_name in $log_stream_names; do
  echo "deleting log-stream $log_stream_name"
  aws logs delete-log-stream \
    --log-group-name "$log_group_name" \
    --log-stream-name "$log_stream_name"
done
aws logs delete-log-group \
  --log-group-name "$log_group_name"
```
Lambda function in Python
Coming soon … :laughing:
Extra reading materials
There are always very good external reading materials available on the web.
- For example, the article ‘A beginner’s guide to Amazon S3 and web hosting’ is part of a website built for the practical course ‘Small data journalism’. It offers good introductory knowledge on hosting a website with Amazon services.
- In addition, you can read the article ‘How to serve 100k users without breaking server’ for more practical details on building a website with Amazon S3.
- Still, ‘Getting started with AWS and Python’ offers an excellent running example of a small web application on Amazon using S3, EC2, and SQS. However, it is a pretty old post.
- And maybe ‘Setting up EC2 and Django’ is also useful for this purpose.
- This blog post relies heavily on the AWS Lambda walkthrough using the AWS CLI.
- And don’t forget Amazon’s own tutorial Using AWS lambda with Amazon S3.
- Also Hosting a web app on Amazon Web Services.