How can I extract the JSON data into InfluxDB by tailing a file that contains it?

For example, my logs look like this:

2020-12-01T18:34:06+02:00 10.132.90.194 {"wfd_successful_hits_sec": "0", "sql_hits_sec_max": "0", "timestamp": "2020/12/01 18:34:01", "connection_sec_max": "1922", "http_hits_sec_max": "1106", "http_hits_sec": "106", "wfd_successful_hits_sec_max": "0", "sql_hits_sec": "0", "sql_audit_phase2_events_sec_max": "0", "hdfs_hits_sec": "0", "connection_sec": "26"}

Is there a way to just extract the JSON part and send it to influx?

I know the grok pattern (\{.*\})$ will extract the JSON part.

My config looks like this:

[global_tags]
[agent]
  interval = "10s"
  round_interval = true
  metric_batch_size = 1000
  metric_buffer_limit = 10000
  collection_jitter = "0s"
  flush_interval = "10s"
  flush_jitter = "0s"
  precision = ""
  hostname = ""
  omit_hostname = false
[[inputs.tail]]
  files = ["/opt/share/host*log"]
  data_format = "json"
[[outputs.influxdb_v2]]
  urls = ["http://localhost:8086"]
  token = "TOKEN"
  organization = "ORG"
  bucket = "performance_stats

1 Answer

Since my JSON output contains a timestamp field, I modified my rsyslog config to save the syslog rawmsg starting from the first { it matches:

$Template tpl,"%rawmsg:R,ERE,0,DFLT:(\{.*)--end%\n"
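
With only the JSON left on each line, the tail input from my question can keep data_format = "json", and the timestamp field inside the JSON can drive the metric time. Roughly what I pointed Telegraf at (the json_* option names are how I read the Telegraf JSON parser docs; the json_string_fields list is just the fields from my sample line, since quoted values are otherwise skipped):

[[inputs.tail]]
  files = ["/opt/share/host*log"]
  data_format = "json"
  ## use the timestamp embedded in the JSON instead of the read time
  json_time_key = "timestamp"
  json_time_format = "2006/01/02 15:04:05"
  ## string values are ignored by the JSON parser unless listed here
  json_string_fields = ["http_hits_sec", "connection_sec", "sql_hits_sec", "hdfs_hits_sec"]
  # json_timezone = "Local"   ## assumption: my log timestamps are local time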