First, a quick digression. The hello-world.json template file used by logstash was also covered in the previous post, but that was the 7.9.0 version. Note that there is no type node under mappings, because the default type is already _doc:
{ "index_patterns": ["hello-world-%{+YYYY.MM.dd}"], "order": 0, "settings": { "index.refresh_interval": "10s" }, "mappings": { "properties": { "createTime": { "type": "long" }, "sessionId": { "type": "text", "fielddata": true, "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } }, "chart": { "type": "text", "analyzer": "ik_max_word", "search_analyzer": "ik_max_word" } } } }
If we take this template and run it with logstash 6.5.3, it fails with a mapper_parsing_exception:
Root mapping definition has unsupported parameters: [createTime : {type=long}] [sessionId : {fielddata=true, type=text, fields={keyword={ignore_above=256, type=keyword}}}] [chart : {search_analyzer=ik_max_word, analyzer=ik_max_word, type=text}]
Why is it unsupported? Because we did not supply a mapping type, so on this version we have to add one explicitly. Let's use _doc and add a _doc node to the template file:
{ "index_patterns": [ "hello-world-%{+YYYY.MM.dd}" ], "order": 0, "settings": { "index.refresh_interval": "10s" }, "mappings": { "_doc": { "properties": { "createTime": { "type": "long" }, "sessionId": { "type": "text", "fielddata": true, "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } }, "chart": { "type": "text", "analyzer": "ik_max_word", "search_analyzer": "ik_max_word" } } } } }
Good. Restart logstash and this time it starts without errors. Now back to the main topic. The input source we configured for logstash is:
input {
  file {
    path => "D:\wlf\logs\*"
    start_position => "beginning"
    type => "log"
  }
}
Logstash starts without errors, but then nothing happens. The files are on the D drive and they do contain data, yet there is no activity at all and elasticsearch receives nothing. The odd part is the directory separator in path: it has to be the Linux-style forward slash, not the backslash we are used to on Windows. So change path to:
path => "D:/wlf/logs/*"
Restart logstash, and this time there is finally some activity after startup: data is being sent to elasticsearch. The inserts fail, though, and we get a new error:
[2020-09-10T09:46:23,198][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"hello-world-2020.09.10", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x15dfbf91>], :response=>{"index"=>{"_index"=>"after-cdr-2020.09.10", "_type"=>"doc", "_id"=>"jX-xdXQBTQnatMJTUOJU", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Rejecting mapping update to [hello-world-2020.09.10] as the final mapping would have more than 1 type: [_doc, doc]"}}}}
The log says that the hello-world-2020.09.10 index in elasticsearch expects the mapping type _doc, but we sent doc, so it refuses to update the mapping for this index. Let's confirm this in kibana:
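The original screenshots are not reproduced here; roughly, the check in the Kibana Dev Tools console would look like this (the template name hello matches the template_name we give logstash below):

GET _template/hello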
The template is correct. Let's try inserting a document:
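Again as a sketch of what the screenshot shows: indexing a test document with the _doc mapping type goes through. The field values below are invented purely for illustration:

PUT hello-world-2020.09.10/_doc/1
{
  "createTime": 1599702383000,
  "sessionId": "test-session",
  "chart": "hello world"
}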
No problem at all. Then, following the hint in the log, let's try the doc mapping type:
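Something along these lines, which elasticsearch rejects with the same "more than 1 type: [_doc, doc]" error we saw in the logstash log:

PUT hello-world-2020.09.10/doc/2
{
  "createTime": 1599702383000,
  "sessionId": "test-session",
  "chart": "hello world"
}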
The problem is reproduced. So why does logstash use the type doc here instead of _doc? It is the default value at work again. The fix is simply not to rely on the default: specify the mapping type as _doc in the output:
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "hello-world-%{+YYYY.MM.dd}"
    manage_template => true
    template_name => "hello"
    template_overwrite => true
    template => "D:\elk\logstash-6.5.3\config\hello-world.json"
    document_type => "_doc"
  }
}
Restart logstash one more time and it starts fine, but the files on the D drive have already been picked up once (even though that attempt failed), so elasticsearch stays quiet. To trigger new input, open one of the files, copy a line of data and save it, or simply copy a whole file; the new data will then be shipped to elasticsearch. This time everything goes through without exceptions, and we can query the synced data directly in elasticsearch.
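A quick way to check, again in the Kibana console (a minimal sketch; the index name follows the daily pattern above):

GET hello-world-2020.09.10/_search
{
  "query": {
    "match_all": {}
  }
}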