| Argument | Description |
|---|---|
| -q, --query QUERY | Query string in Lucene syntax. [required] |
| -o, --output-file FILE | CSV file location. [required] |
| -u, --url URL | Elasticsearch host URL. Default is "http://localhost:9200". |
| -a, --auth | Elasticsearch basic authentication in the form of username:password. |
| -i, --index-prefixes INDEX [INDEX ...] | Index name prefix(es). Default is ['logstash-*']. |
| -D, --doc-types DOC_TYPE [DOC_TYPE ...] | Document type(s). |
| -t, --tags TAGS [TAGS ...] | Query tags. |
| -f, --fields FIELDS [FIELDS ...] | List of selected fields in output. Default is ['_all']. |
| -S, --sort FIELDS [FIELDS ...] | List of <field>:<direction> pairs to sort on. Default is []. |
| -d, --delimiter DELIMITER | Delimiter to use in CSV file. Default is ",". |
| -m, --max INTEGER | Maximum number of results to return. Default is 0 (no limit). |
| -s, --scroll-size INTEGER | Scroll size for each batch of results. Default is 100. |
| -k, --kibana-nested | Format nested fields in Kibana style. |
| -r, --raw-query | Treat the query as raw Query DSL (JSON) instead of Lucene syntax. |
| -e, --meta-fields | Add meta-fields in output. |
| --verify-certs | Verify SSL certificates. Default is False. |
| --ca-certs CA_CERTS | Location of CA bundle. |
| --client-cert CLIENT_CERT | Location of Client Auth cert. |
| --client-key CLIENT_KEY | Location of Client Cert Key. |
| -v, --version | Show version and exit. |
| --debug | Debug mode on. |
| -h, --help | Show this help message and exit. |
Searching on http://localhost:9200, by default:
$ es2csv -q 'user: John' -o database.csv

Saving to the my_database.csv file:
$ es2csv -q 'user: John' -o my_database.csv

On a custom Elasticsearch host:
$ es2csv -u my.cool.host.com:9200 -q 'user: John' -o database.csv

Using secure Elasticsearch behind nginx? No problem:
$ es2csv -u http://my.cool.host.com/es/ -q 'user: John' -o database.csv

Non-default port:
$ es2csv -u my.cool.host.com:6666/es/ -q 'user: John' -o database.csv

With authorization in the URL:
$ es2csv -u http://login:password@my.cool.host.com:6666/es/ -q 'user: John' -o database.csv

With explicit authorization:
$ es2csv -a login:password -u http://my.cool.host.com:6666/es/ -q 'user: John' -o database.csv

Specifying an index:
$ es2csv -i logstash-2015-07-07 -q 'user: John' -o database.csv

Multiple indexes:
$ es2csv -i logstash-2015-07-07 logstash-2015-08-08 -q 'user: John' -o database.csv

Or an index mask:
$ es2csv -i logstash-2015-* -q 'user: John' -o database.csv

And now together:
$ es2csv -i logstash-2015-01-0* logstash-2015-01-10 -q 'user: John' -o database.csv

Collecting all data from all indices:
$ es2csv -i _all -q '*' -o database.csv

Specifying a document type:
$ es2csv -D log -i _all -q '*' -o database.csv

With a tag:
$ es2csv -t dev -q 'user: John' -o database.csv

Multiple tags:
$ es2csv -t dev prod -q 'user: John' -o database.csv

Selecting only the fields you are interested in, if you don't need all of them (the query runs faster):
$ es2csv -f host status date -q 'user: John' -o database.csv

Or a field mask:
$ es2csv -f 'ho*' 'st*us' '*ate' -q 'user: John' -o database.csv

Selecting all fields, the default:
$ es2csv -f _all -q 'user: John' -o database.csv

Selecting nested fields:
$ es2csv -f comments.comment comments.date comments.name -q '*' -i twitter -o database.csv

Sorting by fields, in the order you are interested in; a bare field name is sorted in ascending order:
$ es2csv -S key -q '*' -o database.csv

Or a field pair, field name and direction (desc or asc):
$ es2csv -S status:desc -q '*' -o database.csv

Using multiple pairs:
$ es2csv -S key:desc status:asc -q '*' -o database.csv

Selecting some field(s) but sorting by other(s):
$ es2csv -S key -f user -q '*' -o database.csv
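Under the hood, `field:direction` pairs like these map naturally onto an Elasticsearch sort clause. A minimal sketch of that conversion (the function name is hypothetical, not part of es2csv):

```python
def to_sort_clause(pairs):
    """Convert ['key:desc', 'status'] style arguments into an
    Elasticsearch sort clause; a bare field name sorts ascending."""
    clause = []
    for pair in pairs:
        field, _, direction = pair.partition(":")
        clause.append({field: direction or "asc"})
    return clause

print(to_sort_clause(["key:desc", "status:asc"]))
# [{'key': 'desc'}, {'status': 'asc'}]
```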
Changing the column delimiter in the CSV file, ',' by default:
$ es2csv -d ';' -q '*' -i twitter -o database.csv

Maximum number of results:
$ es2csv -m 6283185 -q '*' -i twitter -o database.csv

Retrieving 2000 results in just 2 requests (two scrolls of 1000 each):
$ es2csv -m 2000 -s 1000 -q '*' -i twitter -o database.csv

Changing the nested-column output format to Kibana style:
$ es2csv -k -q '*' -i twitter -o database.csv

An example JSON document:
{
"title": "Nest eggs",
"body": "Making your money work...",
"tags": [ "cash", "shares" ],
"comments": [
{
"name": "John Smith",
"comment": "Great article",
"age": 28,
"stars": 4,
"date": "2014-09-01"
},
{
"name": "Alice White",
"comment": "More like this please",
"age": 31,
"stars": 5,
"date": "2014-10-22"
}
]
}

A CSV file in Kibana-style format:
body,comments.age,comments.comment,comments.date,comments.name,comments.stars,tags,title
Making your money work...,"28,31","Great article,More like this please","2014-09-01,2014-10-22","John Smith,Alice White","4,5","cash,shares",Nest eggs

A CSV file in the default format:
body,comments.0.age,comments.0.comment,comments.0.date,comments.0.name,comments.0.stars,comments.1.age,comments.1.comment,comments.1.date,comments.1.name,comments.1.stars,tags.0,tags.1,title
Making your money work...,28,Great article,2014-09-01,John Smith,4,31,More like this please,2014-10-22,Alice White,5,cash,shares,Nest eggs
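To make the difference between the two layouts concrete, here is a rough sketch (not es2csv's actual implementation) of how a nested document could be flattened into each column style, using a trimmed version of the document above:

```python
# Hypothetical flattening helpers illustrating the two column styles.
doc = {
    "title": "Nest eggs",
    "tags": ["cash", "shares"],
    "comments": [
        {"name": "John Smith", "stars": 4},
        {"name": "Alice White", "stars": 5},
    ],
}

def flatten_default(obj, prefix=""):
    """Default style: list elements get numeric suffixes (comments.0.name)."""
    out = {}
    if isinstance(obj, dict):
        for key, val in obj.items():
            out.update(flatten_default(val, prefix + key + "."))
    elif isinstance(obj, list):
        for i, val in enumerate(obj):
            out.update(flatten_default(val, prefix + str(i) + "."))
    else:
        out[prefix[:-1]] = obj
    return out

def flatten_kibana(obj, prefix=""):
    """Kibana style: one column per leaf field, list values joined by commas."""
    out = {}
    if isinstance(obj, list):
        for item in obj:
            for key, val in flatten_kibana(item, prefix).items():
                out[key] = out[key] + "," + val if key in out else val
    elif isinstance(obj, dict):
        for key, val in obj.items():
            out.update(flatten_kibana(val, prefix + key + "."))
    else:
        out[prefix[:-1]] = str(obj)
    return out

print(flatten_default(doc)["comments.0.name"])  # John Smith
print(flatten_kibana(doc)["comments.name"])     # John Smith,Alice White
```

For the fields included, the results match the two CSV rows shown above: the default style produces one column per list element, while the Kibana style merges each leaf field into a single comma-joined column.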
Query DSL syntax:
$ es2csv -r -q '{"query": {"match": {"user": "John"}}}' -o database.csv

Very long queries can be read from a file:
$ es2csv -r -q @'~/query string file.json' -o database.csv
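The `@` prefix means the query is loaded from the named file rather than taken literally. A hypothetical sketch of that convention (the function name is illustrative, not es2csv's API):

```python
def resolve_query(arg):
    """If the argument starts with '@', read the query from that file;
    otherwise use the argument itself as the query string."""
    if arg.startswith("@"):
        with open(arg[1:], encoding="utf-8") as fh:
            return fh.read()
    return arg
```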
Selecting meta-fields (_id, _index, _score, _type):
$ es2csv -e -f _all -q 'user: John' -o database.csv

With SSL certificate verification enabled (off by default):
$ es2csv --verify-certs -u https://my.cool.host.com/es/ -q 'user: John' -o database.csv

With your own certificate authority bundle:
$ es2csv --ca-certs '/path/to/your/ca_bundle' --verify-certs -u https://host.com -q '*' -o out.csv