Creating a Load Balancer

1.5 hours
  • 4 Learning Objectives

About this Hands-on Lab

In this learning activity, you will install and configure a load balancer to be the front end for two pre-built Apache nodes. The load balancer should be configured for best-effort stickiness.

Learning Objectives

Successfully complete this lab by achieving the following learning objectives:

Install HAProxy on `Server1`

Now, we’ll start off by installing HAProxy:

[root@Server1]# yum -y install haproxy

Enable and start the service:

[root@Server1]# systemctl enable haproxy
[root@Server1]# systemctl start haproxy
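
Optionally, confirm that the service came up before moving on:

[root@Server1]# systemctl is-active haproxy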

Then verify that incoming port 80 traffic is permitted through the firewall: running `firewall-cmd --list-all` should show `http` in the list of allowed services.
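For example (the last two firewalld commands are only needed if `http` isn’t already allowed):

[root@Server1]# firewall-cmd --list-all
[root@Server1]# firewall-cmd --permanent --add-service=http
[root@Server1]# firewall-cmd --reload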

Configure HAProxy on `Server1`

We need to configure HAProxy’s frontend and backend: HAProxy should listen on port 80 and use `node1` and `node2` as the backend nodes. Let’s edit `/etc/haproxy/haproxy.cfg` and add the following configuration:


    timeout check            10s
    maxconn                  3000

# Our code starts here:

frontend app1
        bind *:80       
        mode http
        default_backend apache_nodes

backend apache_nodes
        mode http
        # source-IP hashing provides best-effort stickiness
        balance source
        # "check" enables health checks on each backend node
        server node1 10.0.1.20:8080 check
        server node2 10.0.1.30:8080 check

# End of our code
#-------------------------------------------------------
# main frontend which proxys to the backends
#-------------------------------------------------------

We’ll have to restart the daemon so that our changes take effect, then confirm with `ss` that HAProxy is listening on port 80:

[root@Server1]# systemctl restart haproxy
[root@Server1]# ss -lntp
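
If the restart fails, a quick syntax check of the configuration file can help pinpoint the problem (assuming the default config path):

[root@Server1]# haproxy -c -f /etc/haproxy/haproxy.cfg

In the `ss` output, HAProxy should now show up listening on *:80.
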
Configure the `node1` and `node2` Firewalls

We need to permit incoming port 8080/TCP traffic on `node1` and `node2`. On each node (after logging in, then becoming root), perform the following:

[root@node]# firewall-cmd --permanent --add-port=8080/tcp

Then reload the firewall configuration:

[root@node]# firewall-cmd --reload
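
To confirm the rule took effect, list the open ports on each node; 8080/tcp should appear:

[root@node]# firewall-cmd --list-ports
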
From `Client1`, Validate That Settings Are Correct

On `Client1`, we can run `curl` against the `Server1` private IP address:

[cloud_user@Client1]$ curl <Server1_IP_ADDRESS>
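
To see the best-effort stickiness in action, repeat the request a few times. With `balance source`, requests from the same client IP should keep landing on the same backend node (this assumes each Apache node serves a page that identifies it):

[cloud_user@Client1]$ for i in 1 2 3; do curl <Server1_IP_ADDRESS>; done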

Additional Resources

A business unit is requesting your assistance in resolving some performance issues. They've had additional application nodes created, but need you to build a load balancer to act as their front end.

The application nodes will listen on port 8080, and the load balancer should listen on port 80 for web traffic.
