Distributed Web Server with Load Balancer

Updated: 26 June 2015, 18:20

This article discusses load balancing, from the general definition to an example implementation. Load balancing is ideal for a website that needs to handle high traffic with zero downtime. It also helps when your website is growing rapidly and upgrading a single server's specification is no longer feasible, so you have to scale out by adding more servers. In either case, this article is meant as a starter guideline.

Definition


The main idea of load balancing is to distribute work across more than one server; the goal is to reduce the load on each server and deliver content in less time. Load balancing is a technique applied to many services, not only web servers. Databases, file servers, and other services use it too, often under different names such as 'clustering', 'parallel systems', or 'distributed file systems'.

Since this article is about web servers, I will stick to the term load balancing.

Products


There are plenty of load balancer products to choose from; you can implement it with dedicated hardware or entirely in software.

If you prefer hardware load balancing, vendors such as F5 and Barracuda have plenty of experience in this area. Like other appliances, they also ship easy-to-use software for operating the hardware remotely.

But if you are on a low budget, or you simply like exploring what your servers can do, software-based load balancing is the reasonable option. Many programs offer load balancing not as their main feature but as an optional one; for example, Squid (a proxy) and Nginx (a web server).

Since Squid and Nginx are the products I have used before, this article covers only those two applications, including in the example later on.

Methods


A load balancer can work with many different algorithms. The most common methods are:


  • Round Robin: the load balancer passes each request to the next server in line.
  • Weighted Round Robin: same as Round Robin, but the number of requests passed to each server is defined by a weight parameter; the higher the weight, the more requests that server receives.
  • Random: the load balancer forwards each request to a randomly chosen server, without considering server availability or performance.
  • and so on.


Each algorithm has its own pros and cons. Do some research to find the method that best fits your overall server configuration and environment; a sketch of the first two methods is shown below.
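
Here is a minimal illustration, in Nginx's own upstream syntax, of how Round Robin and Weighted Round Robin are expressed. The upstream names and server addresses are placeholders of my own, not part of the example configuration later in this article:

    upstream app_servers {
        # Round Robin is Nginx's default: requests alternate between the servers
        server 10.0.0.2;
        server 10.0.0.3;
    }

    upstream weighted_app_servers {
        # Weighted Round Robin: the first server receives roughly three requests
        # for every one request sent to the second
        server 10.0.0.2 weight=3;
        server 10.0.0.3 weight=1;
    }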

Example Configuration


For this example, I use one server as the load balancer (IP: 192.168.0.1) and two as backend web servers (IP: 192.168.0.10 and 192.168.0.11). The load balancer does not need a high specification; the one thing to consider is a network card that supports high bandwidth.

Nginx
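
The setup above can be sketched as a minimal Nginx configuration like the one below. This is my own illustration, assuming both backend servers serve plain HTTP on port 80; the upstream name backend_pool is a placeholder:

    http {
        upstream backend_pool {
            # Plain Round Robin across the two backend servers;
            # add a weight=N parameter for Weighted Round Robin
            server 192.168.0.10;
            server 192.168.0.11;
        }

        server {
            # The load balancer (192.168.0.1) listens for incoming traffic
            listen 80;

            location / {
                # Forward every request to one of the backends
                proxy_pass http://backend_pool;
                proxy_set_header Host $host;
                proxy_set_header X-Real-IP $remote_addr;
            }
        }
    }

With this in place, requests arriving at 192.168.0.1 are forwarded alternately to 192.168.0.10 and 192.168.0.11.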
