Dynamic elasticsearch index_type using logstash


I am storing data in Elasticsearch using Logstash, with a RabbitMQ server as the input.

My Logstash command looks like:

logstash -e 'input {
  rabbitmq {
    exchange => "redwine_log"
    key => "info.redwine"
    host => "localhost"
    durable => true
    user => "guest"
    password => "guest"
  }
}

filter {
  json {
    source => "message"
    remove_field => [ "message" ]
  }
}

output {
  elasticsearch {
    host => "localhost"
    index => "redwine"
  }
}'

But I need Logstash to put the data into different types in the Elasticsearch cluster. What I mean by type is:

"hits": {
      "total": 3,
      "max_score": 1,
      "hits": [
         {
            "_index": "logstash-2014.11.19",
            "_type": "logs",
            "_id": "ZEea8HBOSs-QwH67q1Kcaw",
            "_score": 1,
            "_source": {
               "context": [],
               "level": 200,
               "level_name": "INFO",

This is part of a search result, where you can see that Logstash by default creates a type named "logs" (_type: "logs"). In my project the type needs to be dynamic, created from the input data. For example, my input data looks like:

{
    "data":"some data",
    "type": "type_1"
}

and I need Logstash to create a new type in Elasticsearch named "type_1".

I tried using grok, but couldn't meet this specific requirement.

There is 1 answer

Answered by Ysak

It worked for me this way:

  elasticsearch {
    host => "localhost"
    index_type => "%{type}"
  }