csv - Logstash not_analyzed -


I am a total newbie to the ELK stack, and I am probably trying to set up a rather complicated config to start with... :-)

I am running the whole stack on a Windows 7 laptop. Importing the CSV works, but I cannot get the string fields to be not_analyzed, which gives me broken-up text in my Kibana visualisations.

My latest attempt uses a template.

Both the template and the conf file are located in the c:\logstash-1.5.0\bin directory.

This is the conf file:

```
input {
  file {
    path => "c:\users\jeroen\documents\temp\csv\elasticsearch_input_vc.csv"
    type => "core2"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["snapshot_date_time","country","tower","service","division","usd group","ref nr","processtype","importance","priority","severity","status , reason","category","is_valid_category","summary","open date time","closed date time","opened by","last modified","resolve completed date time","hrs_assigned_to_completed","first assign date time","hrs_new_to_assign","customer organization","requested by","assignee","active flag","in out sla resolution 1"]
    separator => ";"
  }
  date {
    match => [ "snapshot_date_time", "yyyy-MM-dd HH:mm:ss" ]
  }
  mutate {
    convert => { "hrs_assigned_to_completed" => "float" }
    convert => { "hrs_new_to_assign" => "float" }
  }
}

output {
  elasticsearch {
    action => "index"
    host => "localhost"
    index => "qdb-%{+YYYY.MM.dd}"
    workers => 1
    template => "template.json"
  }
  #stdout {
  #  codec => rubydebug
  #}
}
```
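One quick sanity check on a config like this is whether the csv filter's column list actually matches the number of ";"-separated fields per line in the file. A minimal sketch in Python; the sample row below is made up for illustration, since the real CSV is not shown:

```python
import csv
import io

# The 28 column names from the csv filter above.
columns = [
    "snapshot_date_time", "country", "tower", "service", "division",
    "usd group", "ref nr", "processtype", "importance", "priority",
    "severity", "status , reason", "category", "is_valid_category",
    "summary", "open date time", "closed date time", "opened by",
    "last modified", "resolve completed date time",
    "hrs_assigned_to_completed", "first assign date time",
    "hrs_new_to_assign", "customer organization", "requested by",
    "assignee", "active flag", "in out sla resolution 1",
]

# Hypothetical sample row with one dummy value per column; in practice,
# read the first data line of elasticsearch_input_vc.csv instead.
sample = ";".join(str(i) for i in range(len(columns)))

row = next(csv.reader(io.StringIO(sample), delimiter=";"))
assert len(row) == len(columns), (len(row), len(columns))
print(f"{len(columns)} columns defined, sample row has {len(row)} fields")
```

If the counts differ, the csv filter will tag events with a parse failure rather than silently producing the fields you expect.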

And this is the template (which I copied from another topic and where I changed the "template" name); I am in doubt whether the 7th line is specific to the data used by the original author...

template.json:

```json
{
  "template": "qdb-%{+YYYY.MM.dd}",
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0,
    "index": { "query": { "default_field": "userid" } }
  },
  "mappings": {
    "_default_": {
      "_all": { "enabled": false },
      "_source": { "compress": true },
      "dynamic_templates": [
        {
          "string_template": {
            "match": "*",
            "match_mapping_type": "string",
            "mapping": { "type": "string", "index": "not_analyzed" }
          }
        }
      ],
      "properties": {
        "date": { "type": "date", "format": "yyyy-MM-dd HH:mm:ss" },
        "device": { "type": "string", "fields": { "raw": { "type": "string", "index": "not_analyzed" } } },
        "distance": { "type": "integer" }
      }
    }
  }
}
```
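A malformed template.json tends to fail silently, so it is worth validating the JSON before pointing Logstash at it. A minimal sketch in Python, with a trimmed-down copy of the template inlined so the check is self-contained; in practice you would load the real file (e.g. from c:\logstash-1.5.0\bin) instead:

```python
import json

# Trimmed-down copy of the template above; in practice use
# json.load(open(r"c:\logstash-1.5.0\bin\template.json")).
TEMPLATE = """
{
  "template": "qdb-*",
  "mappings": {
    "_default_": {
      "dynamic_templates": [
        { "string_template": {
            "match": "*",
            "match_mapping_type": "string",
            "mapping": { "type": "string", "index": "not_analyzed" }
        } }
      ]
    }
  }
}
"""

tpl = json.loads(TEMPLATE)  # raises ValueError if the JSON is malformed

# Confirm the catch-all rule really maps string fields as not_analyzed.
rule = tpl["mappings"]["_default_"]["dynamic_templates"][0]["string_template"]
assert rule["mapping"]["index"] == "not_analyzed"
print("template parses, strings mapped not_analyzed")
```

Note the "qdb-*" pattern here is an assumption: an index template's "template" field is a literal wildcard match against index names, so a pattern containing an unexpanded `%{+YYYY.MM.dd}` sprintf token would never match the indexes Logstash creates.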

Any help/hints/tips appreciated!

What you need is to put the mapping into Elasticsearch first; after that, import the data via Logstash and you will see the data in Kibana with the string fields not analyzed:

```
http://host:9200/yourindex/_mapping/yourtype

{
  "yourtype": {
    "properties": {
      "user": {
        "type": "string",
        "index": "not_analyzed"
      },
      "data": {
        "type": "string",
        "index": "not_analyzed"
      }
    }
  }
}
```
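The mapping above can be built and inspected before sending it to Elasticsearch; a hedged sketch in Python, where "yourtype", "user" and "data" are the placeholder names from the answer, not fields from the actual CSV:

```python
import json

# Placeholder type and field names from the answer; replace them
# with your own type and CSV column names.
mapping = {
    "yourtype": {
        "properties": {
            "user": {"type": "string", "index": "not_analyzed"},
            "data": {"type": "string", "index": "not_analyzed"},
        }
    }
}

body = json.dumps(mapping, indent=2)
print(body)
```

Saved to a file, this body could then be applied with something like `curl -XPUT http://host:9200/yourindex/_mapping/yourtype -d @mapping.json` before running the Logstash import (assuming the index already exists).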
