I am looking to scope the servers required for an Elasticsearch proof-of-concept.
Ultimately, my question is this:
Given 1 GB of JSON text indexed by Elasticsearch, how much disk space can I expect Elasticsearch to occupy?
Obviously there are many variables, but I'm only after an order of magnitude: 100 MB? 100 GB?
I understand that Elasticsearch performs compression (http://www.elasticsearch.org/guide/reference/index-modules/store/), but I don't know what kind of footprint the indexes and other data structures occupy.
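For context, once the PoC cluster is up I plan to compare the raw JSON size against what the index stats API reports, roughly along the lines of the sketch below. The host and index name are placeholders, and I'm assuming a version where `GET /{index}/_stats/store` is available.

```python
import json
import urllib.request

# Placeholder host and index name; adjust for your cluster.
ES_HOST = "http://localhost:9200"
INDEX = "poc-index"

# Ask the stats API for the store (on-disk) size of the index.
url = f"{ES_HOST}/{INDEX}/_stats/store"
with urllib.request.urlopen(url) as resp:
    stats = json.load(resp)

# Report primary-shard store size, i.e. excluding replica copies.
size_bytes = stats["indices"][INDEX]["primaries"]["store"]["size_in_bytes"]
print(f"{INDEX}: {size_bytes / 1024 ** 2:.1f} MB on disk (primaries only)")
```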
Anecdotal answers are acceptable, but please also let me know what version you're using.