Question:
I want to deploy machine-learning-related code to an AWS Lambda function as a Docker image. The base images provided by AWS for Python don't allow installing packages with the apt-get command, so I created a custom Docker image for AWS Lambda. Below is my Dockerfile.
ref: Create an image from an alternative base image
ARG FUNCTION_DIR="/function"

FROM python:3.7-buster as build-image

# Install aws-lambda-cpp build dependencies
RUN apt-get update && \
    apt-get install -y \
    g++ \
    make \
    cmake \
    unzip \
    libcurl4-openssl-dev

RUN apt-get install -y python-opencv

# Include global arg in this stage of the build
ARG FUNCTION_DIR

# Create function directory
RUN mkdir -p ${FUNCTION_DIR}

# Install the runtime interface client
RUN pip install \
    --target ${FUNCTION_DIR} \
    awslambdaric

FROM python:3.7-buster

ARG FUNCTION_DIR

# Set working directory to function root directory
WORKDIR ${FUNCTION_DIR}

# Copy in the build image dependencies
COPY --from=build-image ${FUNCTION_DIR} ${FUNCTION_DIR}

COPY requirements.txt /
#RUN pip install -r /requirements.txt
COPY . /function

ENV AWS_LAMBDA_RUNTIME_API=python3.7

ENTRYPOINT [ "/usr/local/bin/python", "-m", "awslambdaric" ]
CMD [ "app.handler" ]
My folder structure is as follows:
aws_lambda (Folder)
  - Dockerfile
  - app.py
  - function (Folder)
    - app.py
    - requirements.txt
When I run the Docker image, it shows the following error:
[ERROR] [1614258613176] LAMBDA_RUNTIME Failed to get next invocation. No Response from endpoint
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/local/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/function/awslambdaric/__main__.py", line 21, in <module>
    main(sys.argv)
  File "/function/awslambdaric/__main__.py", line 17, in main
    bootstrap.run(app_root, handler, lambda_runtime_api_addr)
  File "/function/awslambdaric/bootstrap.py", line 416, in run
    event_request = lambda_runtime_client.wait_next_invocation()
  File "/function/awslambdaric/lambda_runtime_client.py", line 76, in wait_next_invocation
    response_body, headers = runtime_client.next()
RuntimeError: Failed to get next
Executing 'app.handler' in function directory '/function'
It seems it is unable to find the /function directory, even though it exists.
Answer:
I ran into this recently, and this is what works for me.
My Dockerfile:
ARG FUNCTION_DIR="/opt/myapp"

FROM ubuntu:16.04

# Install Python 3.6 from the deadsnakes PPA
RUN apt update \
    && apt upgrade \
    && apt-get install -y software-properties-common python-software-properties \
    && apt-add-repository -y ppa:deadsnakes/ppa \
    && apt update \
    && apt-get install -y python3.6 python3.6-dev python3-pip

# Install the Lambda runtime interface client
RUN python3.6 -m pip --no-cache-dir install awslambdaric

# Download the runtime interface emulator for local testing
ADD https://github.com/aws/aws-lambda-runtime-interface-emulator/releases/latest/download/aws-lambda-rie /usr/bin/aws-lambda-rie

ARG FUNCTION_DIR
RUN mkdir -p ${FUNCTION_DIR}

WORKDIR /opt
ADD myapp myapp
COPY entry.sh /opt/myapp/entry.sh
RUN chmod 755 /usr/bin/aws-lambda-rie /opt/myapp/entry.sh

WORKDIR /tmp

ENTRYPOINT [ "/opt/myapp/entry.sh" ]
CMD [ "/opt/myapp/app.handler" ]
My entry point script, entry.sh:
#!/bin/sh

# On Lambda, AWS_LAMBDA_RUNTIME_API is set by the service, so the runtime
# interface client runs directly; locally it is unset, so wrap it with the
# runtime interface emulator.
if [ -z "${AWS_LAMBDA_RUNTIME_API}" ]; then
    exec /usr/bin/aws-lambda-rie /usr/bin/python3.6 -m awslambdaric $1
else
    exec /usr/bin/python3.6 -m awslambdaric $1
fi
Then I built my image with:
docker build --rm -t myimage .
And tested it locally by running:
docker run -p 9000:8080 --rm -it myimage:latest
In another terminal, I ran:
curl -XPOST "http://localhost:9000/2015-03-31/functions/function/invocations" -d '{"field1":"..", "field2":".."}'
The processing then starts. My app.py contains a handler method, which parses the event argument and retrieves the fields (you can see them in the curl call), then starts the processing. I set my workdir to /tmp because it's the only writable folder in AWS Lambda, so I can store intermediate files of my processing there.
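
For reference, here is a minimal sketch of what such an app.py handler could look like. This is a hypothetical example, not the actual code; it only assumes the two fields shown in the curl call above, and the real processing logic is omitted.

# Hypothetical minimal app.py; the field names come from the curl example above.
import json

def handler(event, context):
    # awslambdaric passes the invocation payload as an already-parsed dict
    field1 = event.get("field1")
    field2 = event.get("field2")

    # ... run the actual processing here, writing intermediate files under /tmp ...

    return {
        "statusCode": 200,
        "body": json.dumps({"field1": field1, "field2": field2}),
    }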