
I have an ELK stack configuration and, at first, was doing the standard Filebeat syslog feed into Logstash with the elasticsearch output plugin. It worked just fine.

Now I have added a TCP input (with a "type" assigned to this data so I can use if [type] == "thistype" to differentiate in filters), its own grok filter, and output both to Elasticsearch, with its own unique index name and document_type, and to a file. When data comes in over the TCP port, the properly formatted events are written to the output file as expected by the file output plugin, but no data shows up in Kibana when I choose the corresponding index. Kibana recognizes the index from my output configuration and even lists all the fields/keys I assign in the grok filter; however, no data is searchable. The data is definitely being grok'd properly, since, as I mentioned, it appears in the file plugin output.

What am I doing wrong here? My configuration is as follows:

input {
  tcp {
    host => "10.1.1.10"
    port => 12345
    type => "odata"
    id => "odata"
    codec => line
  }
}

filter {
  if [type] == "odata" {
    grok {
      match => { "message" => "%{QUOTEDSTRING:oid},\"%{WORD:oword1}\",\"%{IPV4:oclientip}\",\"%{DATA:otimestamp}\",%{QUOTEDSTRING:opl},%{QUOTEDSTRING:oos},%{QUOTEDSTRING:oua}" }
      remove_field => "message"
    }

    date {
      match => ["otimestamp", "YYYY-MM-dd HH:mm:ss Z"]
    }

    mutate {
      remove_field => "otimestamp"
      remove_field => "host"
      remove_field => "@version"
    }
  }
}

output {
# The conditional is here because other types are handled in this output
# as well; I centralize the input, filter, and output configuration in
# three distinct files.
  if [type] == "odata" {
    elasticsearch {
      hosts => ["10.1.1.1:9200", "10.1.1.2:9200"]
      sniffing => false
      index => "odataindex"
      document_type => "odatatype"
    }
    file {
      path => "/tmp/odata_output"
    }
  }
}

Again: the grok filter is fine, the data is ingested via TCP just fine, and the file output plugin writes the interpreted/grok'd fields fine. Kibana sees the "odataindex" index and the fields from the grok filter (such as oid, oclientip, oua, etc.). It just doesn't return any data when I search.

Any ideas? I am new to elastic/kibana/logstash and would also appreciate any tips on debugging these things.

Thank you in advance!

Brendan
  • I still have not been able to determine why the data is not displaying in Kibana through Elasticsearch, but the output goes out the file output plugin just fine to a local text file on the server. Any ideas? – Brendan Nov 08 '17 at 19:27

1 Answer


Being unfamiliar with Kibana, I was not aware that, by default, it only searches and displays the last 15 minutes of data. The incoming data was being timestamped (the @timestamp field) by the date filter with the original event date, NOT the time of the insertion into Elasticsearch over the TCP port; as a result, none of it fell within that default 15-minute window, and I had no idea the window even existed. If I had read the part of the documentation about the time range, I would have known. Once I widened the time range far enough into the past, all my data appeared.

So, if anyone else is hitting this: it's probably because you indexed time-dependent data and haven't clicked the 'time' button in the top right corner of Kibana to change the time frame.
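Alternatively, if you'd rather keep @timestamp as the ingest time (so new events always show up in the default window) and store the original event time in a separate field, the date filter's target option does that. A minimal sketch based on my filter above; the field name event_time is just an example:

```
date {
  match  => ["otimestamp", "YYYY-MM-dd HH:mm:ss Z"]
  # Write the parsed date to its own field instead of @timestamp,
  # which is then left at the event's ingest time.
  target => "event_time"
}
```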

This, my friends, is why you read the manual!
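One debugging tip I picked up along the way: a temporary stdout output with the rubydebug codec prints every event, with all its fields, to the Logstash console, so you can see exactly what is (or isn't) being sent to Elasticsearch. A sketch:

```
output {
  # Temporary: dump each event and all of its fields to the console.
  # Remove once the pipeline is behaving as expected.
  stdout { codec => rubydebug }
}
```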

Brendan