Lambda Function Layers
Welcome to CloudAffaire and this is Debjeet.
In the last blog post, we discussed and configured environment variables for our Lambda function.
In this blog post, we are going to discuss layers in a Lambda function. We will also create a layer for the Python package pymysql. Our function will be triggered on S3 object creation and will write the bucket name and object name to a MySQL table in Amazon RDS.
Lambda Function Layers:
You can configure your Lambda function to pull in additional code and content in the form of layers. A layer is a ZIP archive that contains libraries, a custom runtime, or other dependencies. With layers, you can use libraries in your function without needing to include them in your deployment package.
Layers let you keep your deployment package small, which makes development easier. You can avoid errors that can occur when you install and package dependencies with your function code. For Node.js, Python, and Ruby functions, you can develop your function code in the Lambda console as long as you keep your deployment package under 3 MB.
Note: A function can use up to 5 layers at a time. The total unzipped size of the function and all layers can’t exceed the unzipped deployment package size limit of 250 MB.
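For Python, Lambda unpacks each layer into /opt and adds its top-level python/ directory to sys.path, so the layer ZIP must have python/ at the archive root. A minimal local sketch of that layout (mypackage here is a throwaway placeholder module, not a real dependency):

```shell
# Build a minimal layer archive and inspect its layout.
# 'mypackage' is a placeholder module used only to illustrate the structure.
mkdir -p layer/python
echo "VERSION = '1.0'" > layer/python/mypackage.py
cd layer
zip -r9 ../layer.zip .        # zip from inside the directory so python/ sits at the archive root
cd ..
unzip -l layer.zip            # listing should show python/mypackage.py
```

If the listing shows layer/python/mypackage.py instead of python/mypackage.py, the ZIP was created one directory too high and Lambda will not find the module.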
Prerequisite for this demo:
An S3 bucket and a MySQL RDS instance with a table named s3_upload.
We have already created an S3 bucket named cloudaffaire.
We have also created a MySQL RDS instance with a database mysqldb and a table s3_upload.
mysql -h mysqlinstance.xxxxxxxxxxxx.ap-south-1.rds.amazonaws.com -P 3306 -u mysqluser -p
CREATE TABLE s3_upload (bucket_name varchar(50) not null, object_name varchar(50));
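If you want to sanity-check the DDL and the insert/select flow without connecting to RDS, the same statements run unchanged against Python's built-in sqlite3 module (used here purely as a local stand-in for MySQL; this simple table definition is compatible with both engines):

```python
import sqlite3

# In-memory database as a local stand-in for the RDS instance
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Same DDL as the MySQL table above
cur.execute("CREATE TABLE s3_upload (bucket_name varchar(50) not null, object_name varchar(50))")

# Insert a sample row and read it back
cur.execute("INSERT INTO s3_upload (bucket_name, object_name) VALUES (?, ?)",
            ("cloudaffaire", "test.txt"))
cur.execute("SELECT * FROM s3_upload")
print(cur.fetchall())  # [('cloudaffaire', 'test.txt')]
conn.close()
```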
Next, we are going to modify our lambda function.
Step 1: Log in to the AWS console and navigate to ‘Lambda’.
Step 2: Click on the function name.
Step 3: Modify the function code and add an S3 trigger for object-create events on the cloudaffaire bucket.
import pymysql

AWSRegionName = 'ap-south-1'

# Database connection settings
dbServerName = "mysqlinstance.xxxxxxxxxxxxx.ap-south-1.rds.amazonaws.com"
dbUser = "mysqluser"
dbPassword = "PASSWORD"
dbName = "mysqldb"

def lambda_handler(event, context):
    # Extract the bucket and object name from the S3 event record
    BucketName = event['Records'][0]['s3']['bucket']['name']
    ObjectName = event['Records'][0]['s3']['object']['key']
    Message = ObjectName + ' has been uploaded in ' + BucketName + ' bucket!'
    print(Message)
    try:
        # Create a connection object
        print("connecting to mysql rds!")
        connectionObject = pymysql.connect(host=dbServerName, user=dbUser,
                                           password=dbPassword, db=dbName)
        try:
            # Create a cursor object
            cursorObject = connectionObject.cursor()
            # Insert the bucket and object name into the MySQL table
            print("inserting bucket and object name in s3_upload table!")
            insertStatement = "INSERT INTO s3_upload (bucket_name, object_name) VALUES (%s, %s)"
            cursorObject.execute(insertStatement, (BucketName, ObjectName))
            connectionObject.commit()
            # SQL query to retrieve the rows
            sqlQuery = "select * from s3_upload"
            cursorObject.execute(sqlQuery)
            # Fetch all the rows returned by the query
            rows = cursorObject.fetchall()
            print("printing data from mysql rds instance!")
            for row in rows:
                print(row)
        except Exception as e:
            print("query failed: {}".format(e))
            raise
        finally:
            connectionObject.close()
    except Exception as e:
        print("could not connect to mysql rds: {}".format(e))
        raise
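The first two lines of the handler depend on the shape of the S3 notification payload: Records is a list, so the record must be indexed before drilling down to s3.bucket.name and s3.object.key. A minimal, runnable check with a hand-built sample event (field values are illustrative, reduced to only the keys the handler reads):

```python
# Minimal S3 put-event payload containing only the fields the handler uses
sample_event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "cloudaffaire"},
                "object": {"key": "test.txt"},
            }
        }
    ]
}

# Same extraction logic as the handler
bucket = sample_event["Records"][0]["s3"]["bucket"]["name"]
key = sample_event["Records"][0]["s3"]["object"]["key"]
print(key + " has been uploaded in " + bucket + " bucket!")
# test.txt has been uploaded in cloudaffaire bucket!
```

Note that object keys arrive URL-encoded in real notifications (spaces become +), so keys with special characters may need unquoting before use.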
Next, test the function by uploading an object to the S3 bucket.
Step 4: Upload an object to the S3 bucket.
Check the CloudWatch logs.
Observe: the Lambda function execution failed because the pymysql module is missing from the runtime.
Next, we are going to create and add a layer for the pymysql module.
Step 5: Log in to an EC2 instance and create the layer using the AWS CLI.
#create module directory
mkdir -p temp/python

#install pymysql module into the layer directory
pip install pymysql -t temp/python

#create a zip file with the required python/ top-level directory
cd temp
zip -r9 ../pymysql.zip .
cd ..

#create the lambda layer
aws lambda publish-layer-version --layer-name pymysql \
--description "pymysql for mysql access" \
--zip-file fileb://pymysql.zip \
--compatible-runtimes python3.8
Step 6: Add the layer to our existing lambda function and click ‘Save’.
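The same attachment can also be done from the AWS CLI instead of the console. A sketch, where the function name and the layer version ARN are placeholders — substitute your own function name and the LayerVersionArn returned by publish-layer-version:

```shell
# Attach the published layer version to the function
# (<function-name> and the ARN below are placeholders - use the values from your account)
aws lambda update-function-configuration \
  --function-name <function-name> \
  --layers arn:aws:lambda:ap-south-1:111122223333:layer:pymysql:1
```

Note that --layers replaces the function's full layer list, so include every layer the function should keep.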
Step 7: Test the function by uploading an object into the S3 bucket.
Check CloudWatch Logs.
Observe: our Lambda function executed successfully now that the layer provides the pymysql module.
You can also check the MySQL database to confirm the row was inserted.
Finally, remove the pymysql layer from the Lambda function to clean up.
Hope you have enjoyed this article. In the next blog post, instead of a layer, we will include the pymysql module in the function's deployment package.
To get more details on Lambda, please refer to the AWS documentation below.