As a core component of the Elastic Stack, Logstash is most commonly used to load data into Elasticsearch. Doing so is straightforward: just add an Elasticsearch output to the pipeline configuration file.
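A minimal pipeline showing the idea (the stdin input here is only for illustration; the hostname assumes the Docker setup used later in this post):

```conf
input {
  stdin { }
}

output {
  elasticsearch {
    # "elasticsearch" resolves inside the Docker network created below
    hosts => ["elasticsearch:9200"]
  }
}
```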
Pull the images:

```shell
docker pull kibana:7.4.1
docker pull elasticsearch:7.4.1
```
The internal network was already created in Section 1:

```shell
docker network create elknetwork
```
Start Elasticsearch:

```shell
docker run -dti --name elasticsearch --net elknetwork -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" elasticsearch:7.4.1
```
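The same single-node setup can equivalently be written as a docker-compose.yml (a sketch; it assumes the `elknetwork` network from Section 1 already exists):

```yaml
version: "3"
services:
  elasticsearch:
    image: elasticsearch:7.4.1
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
      - "9300:9300"
    networks:
      - elknetwork

networks:
  elknetwork:
    external: true   # created beforehand with `docker network create elknetwork`
```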
Check that Elasticsearch started successfully:

```shell
curl 127.0.0.1:9200
```
```json
{
  "name" : "9efc2039c911",
  "cluster_name" : "docker-cluster",
  "cluster_uuid" : "fVtC7F7xRYm3GjxHVUcRwA",
  "version" : {
    "number" : "7.4.1",
    "build_flavor" : "default",
    "build_type" : "docker",
    "build_hash" : "fc0eeb6e2c25915d63d871d344e3d0b45ea0ea1e",
    "build_date" : "2019-10-22T17:16:35.176724Z",
    "build_snapshot" : false,
    "lucene_version" : "8.2.0",
    "minimum_wire_compatibility_version" : "6.8.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "You Know, for Search"
}
```
Problem
[WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticse arch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x325cb33c>], :response=>{"index"=>{"_index"=>"logstash-2020.07.22-000001", "_type"=>"_doc", "_id"=>"2wfgonMBomMZZiZqvHT_", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text] in document with id '2wfgonMBomMZZiZqvHT_'
This happens because the beats input ships `host` as an object (containing `[host][name]` and other fields), which conflicts with the index's existing `text` mapping for the `host` field. Adding the following filter to the Logstash pipeline configuration file resolved the problem:
```conf
filter {
  mutate {
    rename => { "[host][name]" => "host" }
  }
}
```
The final filelog.conf:
```conf
input {
  beats {
    port => "5044"
  }
}

filter {
  mutate {
    rename => { "[host][name]" => "host" }
  }
}

output {
  stdout { }
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
}
```
Testing

List the indices:

```shell
curl -XGET 'http://172.168.1.10:9200/logstash-*?pretty'
```
Search the data:

```shell
curl -XGET 'http://172.168.1.10:9200/logstash-*/_search?pretty'
```
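To narrow the results instead of returning everything, a query body can be sent with the request (a sketch that assumes the running cluster above; the `message` field and the search term are illustrative, depending on what the beats input actually shipped):

```shell
curl -XGET 'http://172.168.1.10:9200/logstash-*/_search?pretty' \
  -H 'Content-Type: application/json' \
  -d '{"query": {"match": {"message": "error"}}}'
```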
References

Elasticsearch tutorial: https://www.yiibai.com/elasticsearch