ELK data insertion fails with a type mapping error because the incoming data type changes


I receive JSON to insert into Elasticsearch, and I have data type mappings configured for certain indices. The problem is that the types of some fields change occasionally, and I have no control over those changes. That breaks data insertion into ELK. Is there a way to specify a list of allowed data types for a certain field? Or is there a better solution to my problem?


1 Answer

Answered by leandrojmp (Best Answer)

Each field in Elasticsearch can only be mapped to one data type. If, for example, a field is mapped as numeric, you can't store a text value in that field; the insert will fail with a mapping exception.
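To illustrate (the index and field names below are made up for this example), the following requests, run in Kibana Dev Tools, create an index with a numeric field and then try to index a document whose value cannot be parsed as a number, which is rejected with a mapper_parsing_exception:

```
# Hypothetical index and field names, for illustration only.
PUT /my-index
{
  "mappings": {
    "properties": {
      "price": { "type": "long" }
    }
  }
}

# Accepted: "price" is a number.
POST /my-index/_doc
{
  "price": 10
}

# Rejected with a mapper_parsing_exception:
# "abc" cannot be parsed as a long.
POST /my-index/_doc
{
  "price": "abc"
}
```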

If you have a field whose type can change between documents, you should map it in a way that works for every case. For example, if the value can be an integer, a string, or a date, map the field as keyword or text. The trade-off is that you won't be able to perform numeric or date operations on that field, such as sums or date range queries. A sketch of such a mapping is shown below.
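As a rough sketch (the index and field names are again hypothetical), a field that sometimes arrives as a number, sometimes as a string, and sometimes as a date can be mapped as keyword so that any incoming value is accepted and stored as a string:

```
# Hypothetical example: "value" may arrive as a number, string, or date,
# so it is mapped as keyword and every value is indexed as a string.
PUT /flexible-index
{
  "mappings": {
    "properties": {
      "value": { "type": "keyword" }
    }
  }
}

# All of these documents are accepted, but "value" only supports
# exact-match / term-style queries, not sums or date range queries.
POST /flexible-index/_doc
{ "value": 42 }

POST /flexible-index/_doc
{ "value": "some text" }

POST /flexible-index/_doc
{ "value": "2021-06-01" }
```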

You can also set the option ignore_malformed to true on your index. That way, if a field arrives with a data type different from the one in your mapping, only that field is skipped and the other fields in the document are still indexed. Without this option, the whole document is rejected and not indexed.
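A minimal sketch of that option, using a hypothetical index and setting ignore_malformed at the index level via index.mapping.ignore_malformed so it applies to every field that supports it (it can also be set per field in the mapping):

```
# Hypothetical index; index.mapping.ignore_malformed applies to all
# fields that support the option.
PUT /tolerant-index
{
  "settings": {
    "index.mapping.ignore_malformed": true
  },
  "mappings": {
    "properties": {
      "price":     { "type": "long" },
      "timestamp": { "type": "date" }
    }
  }
}

# "price" is malformed here, so it is skipped (and listed in the
# document's _ignored metadata field), but "timestamp" is still
# indexed and the document is not rejected.
POST /tolerant-index/_doc
{
  "price": "not a number",
  "timestamp": "2021-06-01T12:00:00Z"
}
```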