This project demonstrates a basic event-driven architecture using AWS Lambda and Amazon S3. When a file is uploaded to a specific S3 bucket, the Lambda function is triggered and logs the file name to Amazon CloudWatch.
## Objectives

- Understand how S3 triggers work with Lambda
- Learn how to handle AWS events in Python
- Practice role-based permissions using IAM
- Monitor function execution via CloudWatch Logs
## Services Used

- Amazon S3 – event source for uploads
- AWS Lambda – event-driven compute
- IAM – role that lets Lambda access S3 and CloudWatch
- Amazon CloudWatch – to log and view events
## How It Works

- A file is uploaded to the S3 bucket
- S3 sends an event notification to Lambda
- The Lambda function logs the file name
- The output is visible in CloudWatch Logs
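For reference, the event that Lambda receives looks roughly like this (trimmed to the fields this project uses, with illustrative values; the real payload carries more metadata):

```json
{
  "Records": [
    {
      "eventSource": "aws:s3",
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": { "name": "lambda-s3-trigger-demo1" },
        "object": { "key": "test.txt", "size": 1024 }
      }
    }
  ]
}
```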
## Step 1: Create the S3 Bucket

- Go to the S3 console → Create bucket
- Example name: `lambda-s3-trigger-demo1`
- Ensure ACLs are disabled and public access is blocked
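If you prefer to script this step, here is a minimal boto3 sketch (assuming the example bucket name above and the `us-east-1` region; bucket names are globally unique, so yours may need to differ):

```python
import boto3

s3 = boto3.client("s3")

# Create the bucket (in us-east-1 no LocationConstraint is needed;
# other regions require CreateBucketConfiguration={"LocationConstraint": ...}).
s3.create_bucket(Bucket="lambda-s3-trigger-demo1")

# Block all public access, matching the console settings above.
s3.put_public_access_block(
    Bucket="lambda-s3-trigger-demo1",
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```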
## Step 2: Create the Lambda Function

- Go to the Lambda console → Create function
- Runtime: `Python 3.10`
- Permissions: Create a new role with basic Lambda permissions
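The function can also be created programmatically. A hypothetical sketch (the function name and role ARN are placeholders, and it assumes the handler code from the next step is saved locally as `lambda_function.py`):

```python
import io
import zipfile
import boto3

lambda_client = boto3.client("lambda")

# Zip the handler file in memory; create_function expects a deployment package.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.write("lambda_function.py")
buf.seek(0)

lambda_client.create_function(
    FunctionName="s3-trigger-demo",  # placeholder name
    Runtime="python3.10",
    Handler="lambda_function.lambda_handler",
    Role="arn:aws:iam::123456789012:role/lambda-s3-demo-role",  # placeholder ARN
    Code={"ZipFile": buf.read()},
)
```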
Paste the following code:

```python
import json

def lambda_handler(event, context):
    # The uploaded object's key is carried in the S3 event record
    file_name = event['Records'][0]['s3']['object']['key']
    print(f"New file uploaded: {file_name}")
    return {
        'statusCode': 200,
        'body': json.dumps(f"Processed file: {file_name}")
    }
```

## Step 3: Add the S3 Trigger

- In Lambda → Click Add trigger
- Choose `Amazon S3`
- Select the bucket created (`lambda-s3-trigger-demo1`)
- Event type: `PUT`
- Save
When prompted, check the acknowledgment box:
"I acknowledge that using the same S3 bucket for both input and output is not recommended..."
Otherwise, the trigger won't be created.
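The console wires this up for you; scripting it takes two calls. A sketch with placeholder names and ARNs (the function name matches the Step 2 sketch):

```python
import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

# Grant S3 permission to invoke the function (the console does this implicitly).
lambda_client.add_permission(
    FunctionName="s3-trigger-demo",  # placeholder name
    StatementId="s3-invoke",
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn="arn:aws:s3:::lambda-s3-trigger-demo1",
)

# Point the bucket's PUT notifications at the function.
s3.put_bucket_notification_configuration(
    Bucket="lambda-s3-trigger-demo1",
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                # Placeholder ARN; use your function's real ARN.
                "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:s3-trigger-demo",
                "Events": ["s3:ObjectCreated:Put"],
            }
        ]
    },
)
```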
## Step 4: Test the Setup

- Upload a test file (e.g., `test.txt`) into the S3 bucket
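The upload can also be done with boto3 (assuming a local `test.txt` and the example bucket name):

```python
import boto3

# Upload the local file; the third argument is the object key in the bucket.
boto3.client("s3").upload_file("test.txt", "lambda-s3-trigger-demo1", "test.txt")
```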
- Go to CloudWatch → Log Groups → `/aws/lambda/your-function-name`
- Verify that you see a log line like:
  `New file uploaded: test.txt`
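The same check can be scripted. A sketch that prints the newest log stream in the group (substitute your function's name):

```python
import boto3

logs = boto3.client("logs")
group = "/aws/lambda/your-function-name"  # substitute your function's name

# Find the most recent stream in the group, then dump its events.
streams = logs.describe_log_streams(
    logGroupName=group, orderBy="LastEventTime", descending=True, limit=1
)
latest = streams["logStreams"][0]["logStreamName"]
for event in logs.get_log_events(logGroupName=group, logStreamName=latest)["events"]:
    print(event["message"], end="")
```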
## Notes

- This project demonstrates a simple event trigger setup
- You can expand the Lambda to process file content, send notifications, or trigger other AWS services (see the sketch below)
- Make sure the IAM role has `AmazonS3ReadOnlyAccess` and `AWSLambdaBasicExecutionRole` attached
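As one example of processing file content, the handler could fetch the uploaded object with boto3 (a sketch, not part of the original setup; `s3:GetObject` is covered by `AmazonS3ReadOnlyAccess`):

```python
import json
import urllib.parse
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    # Object keys arrive URL-encoded in the event, so decode before use.
    key = urllib.parse.unquote_plus(record["object"]["key"])

    # Read the object body; permitted by AmazonS3ReadOnlyAccess.
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    print(f"New file uploaded: {key} ({len(body)} bytes)")

    return {
        "statusCode": 200,
        "body": json.dumps(f"Processed file: {key}"),
    }
```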



