Why is Redis, AMQP or 0MQ needed along with Elasticsearch and Logstash?


I am a beginner with Elasticsearch and Logstash. After going through any number of documents I still couldn't figure out why a broker is needed between the log-shipping and indexing components. Can't we send the logs directly to Elasticsearch and start indexing?


There are 2 answers

Tom Kregenbild (best answer)

The message queue

The role of the message queue is to act as a gatekeeper and protect the parser from being overloaded with too many log messages. Every parser can only process so many events per second; when the incoming rate exceeds that maximum, the parser starts dropping events and you lose data. To prevent this situation a message queue is mandatory.
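To make the gatekeeper idea concrete, here is a rough sketch of the shipper side, assuming Redis as the broker, the redis-py client, and an illustrative host and list name (this is not Logstash's own code):

```python
import json
import time

import redis  # assumed: the redis-py client, used here as the broker interface

# Hypothetical broker location and list name for this sketch.
broker = redis.Redis(host="broker.example.com", port=6379)

def ship(event: dict) -> None:
    """Push one log event onto the queue; the parser drains it at its own pace."""
    broker.rpush("logs", json.dumps(event))

if __name__ == "__main__":
    for i in range(1000):
        ship({"message": f"request {i}", "ts": time.time()})
```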

Pull vs Push

When you send log messages directly from the log shipper to the log parser, you are pushing them and hoping the parser can handle the rate at which they arrive. With a message queue, the log parser pulls messages at the rate it can handle. When the incoming rate is too high for the parser to keep up, messages accumulate in the queue; once the rate drops, the parser pulls the backlog and clears the queue. A message queue is your best protection against temporary high load on your central logging solution.
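And the pull side, under the same assumptions, with a placeholder `parse_and_index()` standing in for the real parsing and indexing work:

```python
import json

import redis

broker = redis.Redis(host="broker.example.com", port=6379)

def parse_and_index(event: dict) -> None:
    # Placeholder for the real work (e.g. Logstash-style filtering + indexing).
    ...

while True:
    # Block until a message is available, then pull exactly one.
    # A burst from the shippers only lengthens the Redis list; it never
    # pushes events at the parser faster than this loop asks for them.
    item = broker.blpop("logs", timeout=5)
    if item is None:
        continue  # queue is empty; try again
    _key, raw = item
    parse_and_index(json.loads(raw))
```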

The database crisis

In rare cases your database server will crash, and during that time the parser has no destination for its parsed log messages. On the input side it keeps receiving more and more messages from the log shipper and starts dropping them, so every log message generated during the outage is gone. A message queue is a great solution here: the parser simply stops pulling events and lets them accumulate in the queue. Once the connection to the database is restored, the parser pulls the backlog and sends it to the database. Parsing and writing such a large queue may take a while, but eventually you have complete access to your generated log data and nothing is lost.
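A sketch of how the parser could ride out such an outage, again assuming a Redis queue, and talking to Elasticsearch's standard REST `_doc` endpoint via `requests`; the host names and index name are placeholders:

```python
import json
import time

import redis
import requests  # used to reach Elasticsearch's REST API directly

broker = redis.Redis(host="broker.example.com", port=6379)
ES_URL = "http://elasticsearch.example.com:9200/logs/_doc"  # assumed index name "logs"

def index_with_retry(event: dict) -> None:
    """Keep retrying until Elasticsearch is reachable again; while this loop
    waits, the parser pulls nothing more, so new events accumulate in Redis."""
    while True:
        try:
            resp = requests.post(ES_URL, json=event, timeout=5)
            resp.raise_for_status()
            return
        except requests.RequestException:
            time.sleep(10)  # back off until the outage is over

while True:
    item = broker.blpop("logs", timeout=5)
    if item is None:
        continue
    _key, raw = item
    # Note: in this simplified sketch an event that was already popped is lost
    # if the parser process itself dies; a real setup would hand off more carefully.
    index_with_retry(json.loads(raw))
```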

A layer of safety

In some cases your log sources are scattered across servers outside your data center, and you still want them to send data to your centralized logging solution. With a message queue you can keep that data safe, sending it encrypted and limiting inbound access to a single port on the message queue server. The security aspects of a centralized logging solution deserve careful consideration, especially in a distributed server environment.
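For illustration, a remote shipper could connect to that single port over TLS; the redis-py TLS options and the certificate path below are assumptions for the sketch:

```python
import redis

# Remote shippers connect to a single TLS-protected port on the queue server;
# host, port and certificate path are illustrative placeholders.
broker = redis.Redis(
    host="queue.example.com",
    port=6380,                      # the one inbound port opened in the firewall
    ssl=True,                       # encrypt log traffic in transit
    ssl_ca_certs="/etc/ssl/certs/internal-ca.pem",
)
broker.ping()  # fails fast if the encrypted connection cannot be established
```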

Jettro Coenradie

Yes, you can send the logs straight to the indexer. However, there are scalability and maintainability reasons to use a broker. If the indexer becomes overloaded at some point, sending logs directly would slow the shippers down. And if you want to restart the indexer for any reason, the broker lets you keep sending logs in the meantime.
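For example, with a Redis list as the broker (host and list name are placeholders for this sketch), the backlog that builds up while the indexer is restarted is simply the list length, and the normal consumer loop drains it afterwards:

```python
import redis

broker = redis.Redis(host="broker.example.com", port=6379)

# While the indexer is stopped for maintenance, shippers keep pushing and the
# backlog stays visible on the broker instead of being dropped.
print("pending events:", broker.llen("logs"))
```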