
I am using a fluentd, Elasticsearch and Grafana stack for logging and monitoring. I am tailing the logs (in JSON format) coming from a NodeJS container/pod deployed in an EKS cluster and dumping them into Elasticsearch.

Problem: The log format coming to stdout from Node (run under the pm2 process manager) is not valid JSON. For example:

10:37:01 0|server | {"timestamp":"*","level":"*","label":"*","user":"*","message":"*","RequestMethod":"*","RequestPath":"*","StatusCode":*,"MachineName":"V","EnvironmentName":"*","Appplication":"*","CorrelationId":"*"}

As you can see above, the unnecessary prefix 10:37:01 0|server | appears outside the JSON {}, so when parsing the logs with a filter directive, fluentd throws a "pattern not matched" error.
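For reference, a filter directive along these lines reproduces that error. This is a minimal sketch, not my exact config; the kubernetes.** tag and the log record key are assumptions based on a typical EKS fluentd DaemonSet setup:

    # Sketch: @type json cannot parse the line because of the pm2 prefix
    <filter kubernetes.**>
      @type parser
      key_name log        # the record key holding the container stdout line
      <parse>
        @type json        # fails: "10:37:01 0|server | " precedes the JSON
      </parse>
    </filter>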

Goal: My logs should be fetched from stdout via fluentd and dumped into Elasticsearch, so I need to fix the log structure to make sure each log line looks like this:

{"timestamp":"*","level":"*","label":"*","user":"*","message":"*","RequestMethod":"*","RequestPath":"*","StatusCode":*,"MachineName":"V","EnvironmentName":"*","Appplication":"*","CorrelationId":"*"}

That is, no data such as 10:37:01 0|server | outside the JSON {} body. FYI, I also checked on the pm2 side to see whether this prefix can be removed from the logs, but couldn't find anything. Any method that could fix my problem would be great.
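One approach that might work is to strip the prefix inside fluentd itself by chaining two parser filters: the first captures only the {...} portion of the line with a regexp, and the second parses that capture as JSON. Again a sketch under the same assumptions about the tag and the log key; json_body is a hypothetical intermediate field name:

    # 1) Keep only the JSON body: drop everything before the first "{"
    <filter kubernetes.**>
      @type parser
      key_name log
      <parse>
        @type regexp
        expression /^[^{]*(?<json_body>\{.*\})\s*$/
      </parse>
    </filter>

    # 2) Parse the captured JSON string into structured fields
    <filter kubernetes.**>
      @type parser
      key_name json_body
      <parse>
        @type json
      </parse>
    </filter>

With the default reserve_data false, the first filter replaces the record with just the json_body capture, and the second expands it into the structured fields (timestamp, level, label, etc.) before the output plugin ships them to Elasticsearch.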
