Elasticsearch on ELK stack keeps running out of space

I have a problem with Elasticsearch, Kibana and Logstash: the disk where my data is stored is 92% full (according to df -h). I don't need to keep all of the data, so I ran this command:

curl -XPUT -H "Content-Type: application/json" http://localhost:9200/_all/_settings -d '{"index.blocks.read_only_allow_delete": null}'

but it only applies to existing indices, not to new ones, so I have to keep reapplying the command to avoid this message:

[2019-04-17T09:09:31,545][INFO ][logstash.outputs.elasticsearch] retrying failed action with response code: 403 ({"type"=>"cluster_block_exception", "reason"=>"blocked by: [FORBIDDEN/12/index read-only / allow delete (api)];"})
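That 403 cluster_block_exception typically appears because the node has crossed Elasticsearch's flood-stage disk watermark (95% of the disk by default), at which point Elasticsearch automatically marks indices read-only/allow-delete. Clearing the block with index.blocks.read_only_allow_delete: null only helps until the next index crosses the threshold again. A minimal sketch of raising the watermarks cluster-wide, assuming Elasticsearch on localhost:9200 (the percentage values are illustrative, not a recommendation; freeing disk space remains the real fix):

```shell
# Sketch: raise the disk watermarks so indices are not marked
# read-only at the default thresholds (low 85%, high 90%,
# flood_stage 95%). The values below are illustrative only.
curl -XPUT -H "Content-Type: application/json" \
  http://localhost:9200/_cluster/settings -d '{
  "persistent": {
    "cluster.routing.allocation.disk.watermark.low": "90%",
    "cluster.routing.allocation.disk.watermark.high": "95%",
    "cluster.routing.allocation.disk.watermark.flood_stage": "97%"
  }
}'
```

Settings under "persistent" survive a cluster restart. Note that running a node at 92%+ disk usage is still risky, so deleting data you don't need is the safer long-term option.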

Did I miss some configuration, or do I have to automate it with a cron job?
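Since Logstash writes one index per day by default (named logstash-YYYY.MM.dd), a common way to stop the disk from filling up is a daily cron job that deletes the index that has aged out of a retention window. A minimal sketch, assuming the default index naming, Elasticsearch on localhost:9200, and a hypothetical 30-day retention:

```shell
#!/bin/sh
# Sketch: delete the daily logstash index that is RETENTION_DAYS old.
# Assumes the default logstash-YYYY.MM.dd naming and GNU date.
RETENTION_DAYS=30
OLD_INDEX="logstash-$(date -d "$RETENTION_DAYS days ago" +%Y.%m.%d)"
echo "Deleting index $OLD_INDEX"
curl -XDELETE "http://localhost:9200/$OLD_INDEX"
```

Scheduled via crontab, e.g. 0 1 * * * /path/to/cleanup.sh. Elastic's Curator tool provides the same kind of time-based index deletion with more safeguards, if you'd rather not script it yourself.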

Brice Harismendy

Posted 2019-04-17T07:31:52.133

No answers