Hands-On: Deploy the Serverless Application and Inspect it in AWS
Explore how to deploy a serverless Python application using the Serverless Framework and AWS Lambda. Understand environment configuration, deployment commands, and how to inspect deployed functions and resources in AWS. Gain hands-on experience managing serverless services and verifying functionality via the AWS console.
Creating a simple serverless application
The following steps explain the creation of a simple serverless application.
Note: This application will be used throughout the course and will deploy multiple services.
Steps
Enter the AWS credentials in the environment variables.
Click “Connect,” and then perform the following tasks:
Configure the AWS credentials using the environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION, and AWS_OUTPUT).
Log in to Serverless, either through the dashboard or the console. We have used the dashboard method. To log in through the dashboard, you must first register on Serverless if you are not already registered.
After logging in to Serverless, create a serverless application using the AWS - Python - Starter template, enter the project (app) name, and then select the appropriate org. For now, we won't deploy the application, so enter "no" when you are asked to deploy the application.
These steps are shown in the following slides:
Demo
The serverless application can be created using the following terminal:
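In a local terminal, the same setup can be sketched as follows. All credential values are placeholders, and the Serverless commands are shown as comments because they run interactively:

```shell
# Placeholder AWS credentials; substitute your own values.
export AWS_ACCESS_KEY_ID="AKIAXXXXXXXXXXXXXXXX"
export AWS_SECRET_ACCESS_KEY="example-secret-key"
export AWS_REGION="us-east-1"
export AWS_OUTPUT="json"

# Run these interactively in a real terminal (not executed here):
#   serverless login   # authenticate via the Serverless dashboard
#   serverless         # interactive onboarding: pick the AWS - Python - Starter
#                      # template, name the app, choose your org, and answer
#                      # "no" when asked whether to deploy
echo "Environment prepared for Serverless onboarding"
```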
Deploying a simple Python application
Let’s deploy our first application with the following command: serverless deploy --stage dev --verbose. We'll be discussing two options here:
stage: The Serverless Framework allows us to create stages (for example, testing and development) for our project. A staging environment is usually an independent clone of the production environment, which allows us to test and ensure that the relevant version of the code is good to go for deployment. By default, SLS deploys to the dev stage and appends the dev name to services deployed to AWS.
verbose: This option produces an extensive log of the activities SLS performs during the deployment.
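The stage also has a home in the project's serverless.yml. The following is a minimal sketch, assuming the fields generated by the starter template (the runtime version may differ in your project; Framework v3 is assumed based on the starter's greeting message):

```yaml
service: sls-aws-python-starter-xxxx   # your unique service name
frameworkVersion: "3"

provider:
  name: aws
  runtime: python3.9   # assumed; use the runtime generated for your project
  stage: dev           # default stage used when --stage is not passed

functions:
  hello:
    handler: handler.hello   # file handler.py, function hello
```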
As we can see, with very little code, SLS performs many actions on behalf of the user, provided that our role has enough privileges. In short, SLS does the following:
Creates a package of all the files
Converts it to CloudFormation (the native AWS language for deploying applications)
Uploads the package to Amazon S3 and then deploys or updates the existing stack
At the end of the process, SLS also provides an output of generated resources.
function: This is our function from handler.py. Note how the name is composed as {SERVICE}-{STAGE}-{FUNCTION}, with a corresponding ARN, a unique Amazon Resource Name that identifies the specific version of the function.
IAM Role: SLS will create a role in AWS so that the function can execute properly. For the sake of our example, that is usable, but in real-life production use we wouldn't let SLS create the role or create the IAM role ourselves; rather, we'd get it from another team that is responsible for managing our company’s AWS environment.
S3 bucket: If no bucket is provided by the user, a new bucket will be generated on their behalf for every service. In practice, it makes sense to organize deployments in our own buckets.
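The naming convention can be illustrated with a small helper. The helper function and the example values below are hypothetical; only the {SERVICE}-{STAGE}-{FUNCTION} pattern comes from the deployment output:

```python
def lambda_function_name(service: str, stage: str, function: str) -> str:
    """Compose the AWS Lambda name the way SLS does: {SERVICE}-{STAGE}-{FUNCTION}."""
    return f"{service}-{stage}-{function}"

# For a service "sls-aws-python-starter-xxxx" deployed to "dev" with function "hello":
print(lambda_function_name("sls-aws-python-starter-xxxx", "dev", "hello"))
# → sls-aws-python-starter-xxxx-dev-hello
```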
Inspecting our application in AWS
We can view and verify the contents of the automatically generated S3 bucket under our “service” name and “dev” stage in AWS. All zipped deployment packages are stored under that prefix. If we had deployed to another stage, such as “prod,” they would have been separated under another prefix.
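The prefix layout can be sketched like this. The exact serverless/{service}/{stage}/ layout is an assumption based on the default deployment-bucket structure, so verify it against your own bucket:

```python
def deployment_prefix(service: str, stage: str) -> str:
    # Assumed default layout: artifacts live under serverless/{service}/{stage}/
    return f"serverless/{service}/{stage}/"

# Each stage gets its own prefix, keeping dev and prod artifacts separate.
print(deployment_prefix("sls-aws-python-starter-xxxx", "dev"))
print(deployment_prefix("sls-aws-python-starter-xxxx", "prod"))
```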
Similarly, we can also go to the AWS Lambda section in AWS, and we'll find our newly deployed service under “Functions.”
If we scroll a little lower, we'll see the possibility of testing our function. By clicking "Test" (this might ask us to create a new test), we can see the same printed text as in handler.py.
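The same test can be run from code instead of the console. Below is a minimal sketch using boto3's Lambda `invoke` call; the function name is a placeholder, and the client is passed in as a parameter so the helper can be exercised without real AWS access:

```python
import json

def invoke_hello(lambda_client, function_name: str) -> dict:
    """Invoke a Lambda function synchronously and decode its JSON response."""
    response = lambda_client.invoke(
        FunctionName=function_name,
        InvocationType="RequestResponse",  # synchronous invocation
        Payload=json.dumps({}),            # empty test event
    )
    # The Payload comes back as a streaming body; read and parse it.
    return json.loads(response["Payload"].read())

# With real credentials you would pass a real client, e.g.:
#   import boto3
#   result = invoke_hello(boto3.client("lambda"),
#                         "sls-aws-python-starter-xxxx-dev-hello")
#   print(result["statusCode"])
```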
Testing yourself
The following steps explain how to deploy a simple Python application using Serverless.
Enter the values for the following environment variables:
SERVERLESS_ORG: This name is a unique tenant within the Serverless Console. When you sign up, the Serverless Console generates a default org name.
SERVERLESS_APP: The name of your project (application) on Serverless.
SERVERLESS_SERVICE: The name of the service that you'll deploy.
Note: The environment variables used in the above terminal should be autofilled in the following playground. The service name will be unique for every application. You can name your service according to the following pattern: sls-aws-python-starter-xxxx.
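These variables can be exported in a local shell before running the commands; all values below are placeholders:

```shell
export SERVERLESS_ORG="my-org"                           # default org from your Serverless Console signup
export SERVERLESS_APP="my-first-serverless-app"          # project (app) name
export SERVERLESS_SERVICE="sls-aws-python-starter-demo"  # must be unique per application
echo "Service $SERVERLESS_SERVICE in app $SERVERLESS_APP (org $SERVERLESS_ORG)"
```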
Click “Run.” When you click “Run,” the following occurs:
AWS credentials are configured using the environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION, and AWS_OUTPUT).
You are asked to log in to Serverless, either through the dashboard or the console. We use the dashboard method.
After logging in to Serverless, you are directed to the sls-aws-python-starter folder, where the serverless deploy command will run.
You can then open the Lambda window in the AWS console and run tests on the deployed Lambda function. It should indicate that the deployment was successful, with the message given in the handler.py file.
The Python application that gets deployed consists of the following handler code (handler.py):
import json

def hello(event, context):
    body = {
        "message": "Go Serverless v3.0! Your function executed successfully!",
        "input": event,
    }

    return {"statusCode": 200, "body": json.dumps(body)}
Note: When you click the "Run" button after the first iteration, you need to re-enter the serverless deploy command in the specific project directory. If you have made any changes to the files, those changes will also be reflected here.
Congratulations! We've got our first Lambda function deployed with the Serverless Framework. Let’s move on to more advanced configurations!