## Introduction
[Elastic Load Balancer (ELB)](https://aws.amazon.com/elasticloadbalancing/) is a service that distributes incoming application traffic across multiple targets, such as EC2 instances, containers, IP addresses, and Lambda functions.
ELBs can be physical hardware or virtual software components. They accept incoming traffic and distribute it across multiple targets in one or more Availability Zones.
Using ELB, you can quickly scale your load balancer to accommodate changes in traffic over time, ensuring optimal performance for your application and workloads running on the AWS infrastructure.
ELB provides four types of load balancers:
- **[Application Load Balancer](https://docs.aws.amazon.com/elasticloadbalancing/latest/application/introduction.html)**: Manages HTTP/HTTPS traffic, offering advanced routing features at the application layer.
- **[Network Load Balancer](https://docs.aws.amazon.com/elasticloadbalancing/latest/network/introduction.html)**: Handles TCP traffic with high performance and low latency at the transport layer.
- **[Gateway Load Balancer](https://docs.aws.amazon.com/elasticloadbalancing/latest/gateway/introduction.html)**: Deploys, scales, and manages third-party virtual appliances with a transparent network gateway.
- **[Classic Load Balancer](https://docs.aws.amazon.com/elasticloadbalancing/latest/classic/introduction.html)**: Provides basic load balancing for both HTTP/HTTPS and TCP traffic.
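
As a quick illustration of how the type is chosen in practice, the first three are all created through the ELBv2 API, where the `--type` flag selects the load balancer flavor; the subnet ID below is a placeholder:

```bash
# Hypothetical example: choosing the load balancer type via the ELBv2 API.
# Classic Load Balancers use the older `elb` API instead.
awslocal elbv2 create-load-balancer \
  --name demo-alb \
  --type application \
  --subnets subnet-12345678
```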
In this tutorial, we focus on the Application Load Balancer (ALB), which operates at Layer 7 (Application layer) of the OSI model and is specifically designed for load balancing HTTP and HTTPS traffic for web applications.

ALB works at the request level, allowing advanced load-balancing features for HTTP and HTTPS requests. It also enables you to register Lambda functions as targets.
You can configure a listener rule that forwards requests to a target group for your Lambda function, triggering its execution to process the request.
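For instance, outside of any framework, the wiring described above can be sketched with a few `awslocal` CLI calls; the ARNs below are placeholders, and the rest of this tutorial automates these steps with the Serverless Framework:

```bash
# Create a target group whose targets are Lambda functions
awslocal elbv2 create-target-group \
  --name hello-lambda-tg \
  --target-type lambda

# Register a Lambda function (by ARN) as the target
awslocal elbv2 register-targets \
  --target-group-arn <target-group-arn> \
  --targets Id=<lambda-function-arn>

# Add a listener rule that forwards matching requests to the target group
awslocal elbv2 create-rule \
  --listener-arn <listener-arn> \
  --priority 1 \
  --conditions Field=path-pattern,Values=/hello1 \
  --actions Type=forward,TargetGroupArn=<target-group-arn>
```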
[LocalStack Pro](https://localstack.cloud) extends support for ELB Application Load Balancers and the configuration of target groups, including Lambda functions.
Additionally, we will demonstrate how to set up ELB endpoints to efficiently forward requests.

To follow along, you will also need [curl](https://curl.se/) and [jq](https://jqlang.github.io/jq/) installed.
## Architecture
The architecture emulates a scalable AWS setup locally: Clients send HTTP/HTTPS requests to the ALB's DNS endpoint. The ALB listener (e.g., on port 80) routes traffic based on path rules to target groups, which forward to registered Lambda functions. These functions process requests and return responses. The setup runs within a VPC and subnet for networking isolation, all emulated in LocalStack.

### Key components

- Clients/Users: Initiate traffic via browsers or tools like curl.
- ALB Listener: Receives and routes based on rules (e.g., /hello1 → hello1 Lambda).
- Target Group: Manages health checks and forwards to Lambda targets.
- Lambda Functions: Handle business logic (e.g., return "Hello 1").
- VPC/Subnet: Provides network boundaries.


## Set up a Serverless project
Serverless is an open-source framework that enables you to build, package, and deploy serverless applications seamlessly across various cloud providers and platforms. With the Serverless framework, you can easily set up your serverless development environment, define your applications as functions and events, and deploy your entire infrastructure to the cloud using a single command.
To start using the Serverless framework, install it globally by executing the following `npm` command:
```bash
npm install -g serverless
```

The above command installs the Serverless framework globally on your machine. After the installation is complete, you can verify it by running the following command:
```bash
serverless --version
```
This command displays the version numbers of the Serverless framework's core, plugins, and SDK you installed.
Now, let's proceed with creating a new Serverless project using the `serverless` command:
```bash
serverless
```

Follow the interactive prompts to scaffold the project, then switch into the project directory and install the `serverless-localstack` and `serverless-deployment-bucket` plugins as development dependencies. The `serverless-localstack` plugin enables your Serverless project to redirect AWS API calls to LocalStack, while the `serverless-deployment-bucket` plugin creates and manages a custom S3 deployment bucket.
This bucket is responsible for storing the deployment artifacts and ensuring that old deployment buckets are properly cleaned up after each deployment.
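If you are installing the plugins yourself, a typical way to add both as development dependencies is:

```bash
npm install --save-dev serverless-localstack serverless-deployment-bucket
```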
We have a `serverless.yml` file in the directory to define our Serverless project's configuration, which includes information such as the service name, the provider (AWS in this case), the functions, and example events that trigger those functions.
To set up the plugins we installed earlier, you need to add the following properties to your `serverless.yml` file:
```yaml showLineNumbers
plugins:
  - serverless-localstack
  - serverless-deployment-bucket

custom:
  localstack:
    # activate the LocalStack plugin only for the local stage
    stages:
      - local
```
To configure Serverless to use the LocalStack plugin specifically for the `local` stage and ensure that your Serverless project only deploys to LocalStack instead of the real AWS Cloud, you need to set the `--stage` flag when using the `serverless deploy` command and specify the flag variable as `local`.
Configure a `deploy` script in your `package.json` file to simplify the deployment process. It lets you run the `serverless deploy` command directly over your local infrastructure.
Update your `package.json` file to include the following:
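The exact contents depend on your generated project; a minimal sketch of the relevant `scripts` entry, wiring the `deploy` script to the `local` stage as described above, looks like this:

```json
{
  "scripts": {
    "deploy": "serverless deploy --stage local"
  }
}
```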
We have defined the `hello1` and `hello2` Lambda functions in the updated code (see the sketch below). Each function receives an event parameter and logs it to the console. The function then returns a response with a status code of 200 and a plain text body containing the respective `"Hello"` message.
It's important to note that the `isBase64Encoded` property is not required for plain text responses. It is typically used when you need to include binary content in the response body and want to indicate that the content is Base64 encoded.
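As a reference, a minimal handler sketch consistent with this description (assuming the functions are exported from the project's handler module as `hello1` and `hello2`) could look like the following:

```javascript
'use strict';

// Shared helper: build a plain-text response in the shape the ALB expects
const respond = (message) => ({
  isBase64Encoded: false, // optional for plain text; only needed for binary bodies
  statusCode: 200,
  headers: { 'Content-Type': 'text/plain' },
  body: message,
});

module.exports.hello1 = async (event) => {
  console.log(event); // log the incoming ALB event
  return respond('Hello 1');
};

module.exports.hello2 = async (event) => {
  console.log(event);
  return respond('Hello 2');
};
```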
Let us now configure the `serverless.yml` file to create an Application Load Balancer (ALB) and attach the Lambda functions to it.
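A sketch of the configuration described below, assuming the handlers live in `handler.js` and referencing an ALB listener resource named `HTTPListener` (defined in the next step), might look like this:

```yaml showLineNumbers
service: serverless-elb

provider:
  name: aws
  runtime: nodejs12.x

plugins:
  - serverless-localstack
  - serverless-deployment-bucket

functions:
  hello1:
    handler: handler.hello1
    events:
      - alb:
          listenerArn:
            Ref: HTTPListener # references the listener defined under resources
          priority: 1
          conditions:
            path: /hello1
            method:
              - GET
  hello2:
    handler: handler.hello2
    events:
      - alb:
          listenerArn:
            Ref: HTTPListener
          priority: 2
          conditions:
            path: /hello2
            method:
              - GET

custom:
  localstack:
    stages:
      - local
```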
In the above configuration, we specify the service name (`serverless-elb` in this case) and set the provider to AWS with the Node.js 12.x runtime. We include the necessary plugins, `serverless-localstack` and `serverless-deployment-bucket`, for LocalStack support and deployment bucket management.
Next, we define the `hello1` and `hello2` functions with their respective handlers and event triggers. In this example, both functions are triggered by HTTP GET requests to the `/hello1` and `/hello2` paths.

Lastly, let's create a VPC, a subnet, an Application Load Balancer, and an HTTP listener on the load balancer that redirects traffic to the target group.
To do this, add the following resources to your `serverless.yml` file:
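A sketch of such a resources section, assuming logical names `VPC`, `Subnet`, `LoadBalancer`, and `HTTPListener`, and a parent VPC CIDR of `12.2.0.0/16` for the subnet's `12.2.1.0/24` range, might look like this:

```yaml showLineNumbers
resources:
  Resources:
    VPC:
      Type: AWS::EC2::VPC
      Properties:
        CidrBlock: 12.2.0.0/16 # assumed parent range for the subnet below
    Subnet:
      Type: AWS::EC2::Subnet
      Properties:
        VpcId:
          Ref: VPC
        CidrBlock: 12.2.1.0/24
    LoadBalancer:
      Type: AWS::ElasticLoadBalancingV2::LoadBalancer
      Properties:
        Name: serverless-elb-alb
        Type: application
        # a single subnet is fine for local emulation; real AWS ALBs
        # require subnets in at least two Availability Zones
        Subnets:
          - Ref: Subnet
    HTTPListener:
      Type: AWS::ElasticLoadBalancingV2::Listener
      Properties:
        LoadBalancerArn:
          Ref: LoadBalancer
        Port: 80
        Protocol: HTTP
        DefaultActions:
          # Fallback for unmatched paths; the /hello1 and /hello2 rules created
          # by the `alb` events forward requests to the Lambda target groups.
          - Type: fixed-response
            FixedResponseConfig:
              StatusCode: "404"
              ContentType: text/plain
              MessageBody: Not Found
```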
You have completed the configuration of your Serverless project! Now you can create your local AWS infrastructure on LocalStack and deploy your Application Load Balancers with the two Lambda functions as targets.
## Creating the infrastructure on LocalStack

Now that we have completed the initial setup, let's run LocalStack's AWS emulation on our local machine.
Start LocalStack by running the following command:
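Assuming you are using the LocalStack CLI and have your LocalStack Pro auth token configured in your environment, the detached mode flag starts it in the background:

```bash
localstack start -d
```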
This command launches LocalStack in the background, enabling you to use the AWS services locally. Now, let's deploy our Serverless project and verify the resources created in LocalStack.
Run the following command:
```bash
npm run deploy
```
This command deploys your Serverless project using the `local` stage.

The deployment output confirms that your Serverless service was successfully deployed to the `local` stage in LocalStack. It also displays information about the deployed Lambda functions (`hello1` and `hello2`).
You can run the following command to verify that the functions and the load balancers have been deployed:
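A sketch of the verification, using the `awslocal` wrapper (the same commands referenced in the Testing section below), is:

```bash
# List the deployed Lambda functions
awslocal lambda list-functions

# List the load balancers created for the stack
awslocal elbv2 describe-load-balancers
```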
## Testing
In the testing phase, we will exercise the endpoints and run a few validation checks, including a health check and basic error handling. To test the endpoints, you can use the `curl` command along with the `jq` tool for better formatting.

1. **Verify Deployment:** Use the `awslocal lambda list-functions` and `awslocal elbv2 describe-load-balancers` commands to confirm that the Lambda functions and the load balancer exist.
2. **Test Endpoints:** Run the following commands:
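The exact hostname comes from your deployment; assuming the DNS name reported by `awslocal elbv2 describe-load-balancers` and LocalStack's default edge port `4566`, the requests look like this:

```bash
# Look up the DNS name of the emulated ALB (assumes a single load balancer)
ALB_DNS=$(awslocal elbv2 describe-load-balancers \
  --query 'LoadBalancers[0].DNSName' --output text)

# Call the two listener paths and format the responses with jq
curl -s "http://$ALB_DNS:4566/hello1" | jq
curl -s "http://$ALB_DNS:4566/hello2" | jq
```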
Both commands send an HTTP GET request to the endpoints and use `jq` to format the response. The expected outputs are `Hello 1` and `Hello 2`, representing the Lambda functions' responses.

If the tests fail, ensure LocalStack is healthy (`localstack status services`), check that the edge port (default `4566`) is reachable, and restart LocalStack if needed.
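A few LocalStack CLI commands that help with this kind of troubleshooting:

```bash
localstack status services   # confirm the required services are running
localstack logs              # inspect recent logs for errors
localstack restart           # restart the container if it is unhealthy
```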
## Conclusion
In this tutorial, we have learned how to create an Application Load Balancer (ALB) with two Lambda functions as targets using LocalStack.
We have also explored creating, configuring, and deploying a Serverless project with LocalStack, enabling developers to develop and test cloud and serverless applications locally without incurring AWS costs, accelerating iteration for cloud-native workloads.
LocalStack offers integrations with various popular tools such as Terraform, Pulumi, Serverless Application Model (SAM), and more. For more information about LocalStack integrations, you can refer to our [Integration documentation](https://docs.localstack.cloud/aws/integrations). To further explore and experiment with the concepts covered in this tutorial, you can access the code and resources in our [LocalStack Pro samples on GitHub](https://github.com/localstack/localstack-pro-samples/tree/master/elb-load-balancing), along with a `Makefile` for step-by-step execution.