
I'm using an ELK stack for server monitoring. My application's access logs, which come from an AWS ELB, are stored in AWS S3. I am trying to feed them into Logstash with the following input:

input {
  s3 {
    access_key_id => "my_id"
    secret_access_key => "my_key"
    bucket => "my_bucket"
    region => "region"
    prefix => "AWSLogs/828557649675/elasticloadbalancing/eu-west-1/**/**/**/*.log"
    type => "elb"
  }
}

But nothing is added to Logstash. What am I doing wrong? Is there another way to add them to Logstash? The filter and output parts seem OK, so I won't post them.

Note: I am using Logstash version 2.1.

2 Answers


Could you add sincedb_path => "/tmp/alb-sincedb" and simplify the prefix to prefix => "AWSLogs/"? The prefix is matched literally against S3 keys, so a broader prefix lets the plugin discover the log objects on its own.
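
A minimal sketch of that change applied to the input from the question (the /tmp/alb-sincedb location is just an example; any file Logstash can write to works):

input {
  s3 {
    access_key_id => "my_id"
    secret_access_key => "my_key"
    bucket => "my_bucket"
    region => "region"
    prefix => "AWSLogs/"
    sincedb_path => "/tmp/alb-sincedb"
    type => "elb"
  }
}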

Also, it would be good to install the latest version of the s3 input plugin. I believe recent versions changed how objects in the bucket are iterated with the prefix, switching to the V2 resources API. My current version is logstash-input-s3-3.1.2.gem.
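
If you update through the plugin manager, something like the following should work (assuming the bin/plugin tool that ships with Logstash 2.x; newer releases rename it to bin/logstash-plugin):

bin/plugin update logstash-input-s3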


It seems the prefix should be the following, since the plugin does not support globs (*):

 prefix => "AWSLogs/828557649675/elasticloadbalancing/eu-west-1/"

Full example:

input {
  s3 {
    access_key_id => "my_id"
    secret_access_key => "my_key"
    bucket => "my_bucket"
    region => "region"
    prefix => "AWSLogs/828557649675/elasticloadbalancing/eu-west-1/"
  }
}