Precision gets lost for big numbers. I am using the tail input plugin to read a file whose contents are in JSON format. Below is the configuration:
[[inputs.tail]]
  files = ["E:/Telegraph/MSTCIVRRequestLog_*.json"]
  from_beginning = true
  name_override = "tcivrrequest"
  data_format = "json"
  json_strict = true
[[outputs.file]]
  files = ["E:/Telegraph/output.json"]
  data_format = "json"
Input file contains
{"RequestId":959011990586458245}
Expected Output
{"fields":{"RequestId":959011990586458245},"name":"tcivrrequest","tags":{},"timestamp":1632994599}
Actual Output
{"fields":{"RequestId":959011990586458200},"name":"tcivrrequest","tags":{},"timestamp":1632994599}
The number 959011990586458245 is converted into 959011990586458200 (note the last few digits).
I have already tried the things below, but none of them worked:
json_string_fields = ["RequestId"]
[[processors.converter]]
  [processors.converter.fields]
    string = ["RequestId"]
precision = "1s"
json_int64_fields = ["RequestId"]
character_encoding = "utf-8"
json_strict = true
I was able to reproduce this with the json parser as well. My suggestion would be to move to the json_v2 parser with a config like the following.
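Here is a sketch adapted from your config above; the [[inputs.tail.json_v2.field]] block is the essential part, and it assumes RequestId stays a top-level key as in your sample:

[[inputs.tail]]
  files = ["E:/Telegraph/MSTCIVRRequestLog_*.json"]
  from_beginning = true
  name_override = "tcivrrequest"
  data_format = "json_v2"

  [[inputs.tail.json_v2]]
    [[inputs.tail.json_v2.field]]
      path = "RequestId"
      type = "int"

I was able to get a result as follows, with the full value intact:

{"fields":{"RequestId":959011990586458245},"name":"tcivrrequest","tags":{},"timestamp":1632994599}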
The newer json_v2 parser is generally more flexible for simple cases like the one you provided, and the key difference here is the number handling: the original json parser reads every number as a float64, and a float64 can represent integers exactly only up to 2^53 (9007199254740992), so the trailing digits of 959011990586458245 get rounded away. Declaring the field as type = "int" preserves the full 64-bit value.
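If you want to see where the rounded value comes from, this small standalone Go program (illustrative only, not Telegraf code) performs the same float64 round trip:

package main

import (
	"fmt"
	"strconv"
)

func main() {
	// A float64 mantissa has 53 bits, so integers above 2^53
	// (9007199254740992) can no longer be represented exactly.
	id := int64(959011990586458245)
	asFloat := float64(id) // what the json parser effectively does

	// Shortest decimal string that round-trips the float64, i.e.
	// the form that ends up in the serialized output:
	fmt.Println(strconv.FormatFloat(asFloat, 'f', -1, 64))
	// Prints: 959011990586458200
}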
Thanks!