...

Design Load Balancers

Learn to design a system of load balancers.

Overview

Millions of requests can arrive per second at a typical data center. To serve these requests, thousands (or even hundreds of thousands) of servers work together to share the load of incoming requests.

Note: Here, it’s important to consider how the incoming requests will be divided among all the available servers.

A load balancer (LB) is the answer to this question. The job of a load balancer is to fairly divide all clients’ requests among the pool of available servers, so that no individual server is overloaded or crashes.
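
To make the idea concrete, here is a minimal round-robin sketch in Python; it is not from the course, and the server names and the route() helper are hypothetical placeholders. It simply hands each incoming request to the next server in a fixed rotation.

```python
from itertools import cycle

# Hypothetical pool of backend servers.
servers = ["server-1", "server-2", "server-3"]
next_server = cycle(servers)          # endless round-robin iterator

def route(request_id: str) -> str:
    """Assign an incoming request to the next server in the rotation."""
    chosen = next(next_server)
    print(f"request {request_id} -> {chosen}")
    return chosen

# Four requests are spread evenly across the three servers.
for rid in ["r1", "r2", "r3", "r4"]:
    route(rid)
```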

The load balancing layer is the first point of contact within a data center after the firewall. A load balancer may not be required if a service handles only a few hundred or even a few thousand requests per second. However, as the volume of client requests grows, load balancers provide the following capabilities:

  • Scalability: By adding servers, the capacity of the application/service can be increased seamlessly. Load balancers make such upscaling or downscaling transparent to the end users.
  • Availability: Even if some servers go down or suffer a fault, the system still remains available. One of the jobs of the load balancers is to hide faults and failures of servers.
  • Performance: Load balancers can forward requests to servers with less load, so users get quicker response times. This not only improves performance but also improves resource utilization (a minimal selection sketch follows this list).
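
As a rough illustration of that last point, here is a small Python sketch (not from the course) of a least-connections policy: the active_connections counts and the pick_least_loaded helper are hypothetical, standing in for the connection tracking a real load balancer maintains.

```python
# Hypothetical in-memory view of active connections per server;
# a real load balancer updates this as connections open and close.
active_connections = {"server-1": 12, "server-2": 3, "server-3": 7}

def pick_least_loaded() -> str:
    """Forward the next request to the server handling the fewest connections."""
    return min(active_connections, key=active_connections.get)

server = pick_least_loaded()
active_connections[server] += 1   # the chosen server takes on one more request
print(f"forwarding to {server}")  # prints "forwarding to server-2" here
```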

Here’s an abstract depiction of how load balancers work:

Figure: Simplified working of a load balancer

Placing load balancers

Generally, LBs sit between clients and servers. Requests pass through the load balancing layer on their way to the servers and then back to the clients. However, that isn’t the only place where load balancers are used.

Let’s consider three well-known groups of servers: the web, the application, and the database servers. To divide the traffic load among the available servers, load balancers can be placed between the server instances of these three services in the following ways:

  • Place LBs between end users of the application and web servers/application gateway.
  • Place LBs between the web servers and application servers that run the business/application logic.
  • Place LBs between the application servers and database servers.

Figure: Possible usage of load balancers in a three-tier architecture

In reality, load balancers can potentially be used between any two services that have multiple instances within a system’s design.
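
As a rough sketch of this idea (again, not from the course), the toy Python example below reuses one LoadBalancer class between every pair of tiers. The tier instances are stand-in functions, and random selection stands in for a real balancing policy.

```python
import random

class LoadBalancer:
    """Toy load balancer that spreads calls over any pool of service instances."""

    def __init__(self, instances):
        self.instances = list(instances)

    def forward(self, request):
        # Random choice stands in for a real policy (round robin, least load, ...).
        instance = random.choice(self.instances)
        return instance(request)

# Hypothetical tiers: each "instance" is just a function here.
db_lb = LoadBalancer([lambda q: f"db-1:{q}", lambda q: f"db-2:{q}"])
app_lb = LoadBalancer([lambda r: db_lb.forward(f"query({r})"),
                       lambda r: db_lb.forward(f"query({r})")])
web_lb = LoadBalancer([lambda r: app_lb.forward(f"logic({r})")])

print(web_lb.forward("GET /home"))  # flows web tier -> app tier -> database tier
```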

Services offered by load balancers

LBs not only make services scalable, available, and highly performant, they also offer key services such as the following:

  • Health checking: LBs use the heartbeat protocol, a failure-detection technique in which every node in a cluster periodically reports its health to a monitoring service, to monitor the health and, therefore, the reliability of end servers. Another advantage of health checking is an improved user experience.
  • TLS termination: LBs reduce the burden on end servers by handling TLS termination (also called TLS/SSL offloading) with the client, that is, by establishing the secure communication channel and performing the encryption/decryption of data on the servers’ behalf.
  • Predictive analytics: LBs can predict traffic patterns
...