Vorder's blog

First Steps with ELK

2018/12/19

/*
This environment is built with docker-elk
*/

Preparation

Environment setup

Install Docker

curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh

Install docker-compose 1.6

pip install docker-compose==1.6

Clone the docker-elk repository

git clone https://github.com/deviantony/docker-elk.git

From the docker-elk directory, run

docker-compose up

[!] If you hit an `invalid reference format` error, edit each of

/elasticsearch/Dockerfile   
/logstash/Dockerfile
/kibana/Dockerfile
and change every ELK_VERSION, except the ARG ELK_VERSION on the first line, to 6.5.2
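The edit above can be scripted; a minimal sketch, assuming GNU sed (for `-i`) and that the Dockerfiles reference the version as `${ELK_VERSION}`. `pin_elk_version` is my name for it, not part of docker-elk:

```shell
# pin_elk_version FILE VERSION
# Replace ${ELK_VERSION} with a literal version everywhere except line 1,
# so the leading "ARG ELK_VERSION" declaration is left alone.
pin_elk_version() {
  file=$1 version=$2
  sed -i '2,$ s/${ELK_VERSION}/'"$version"'/g' "$file"
}

# usage:
# for f in elasticsearch/Dockerfile logstash/Dockerfile kibana/Dockerfile; do
#   pin_elk_version "$f" 6.5.2
# done
```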

For Windows and other environments, see the official documentation: https://github.com/deviantony/docker-elk

Localizing Kibana

docker ps -a   # find the Kibana container id
docker exec -it xxxxxx /bin/bash   # xxxxxx = container id
mkdir cn
cd cn
mkdir config
curl -o config/kibana_resource.json https://raw.githubusercontent.com/anbai-inc/Kibana_Hanization/master/config/kibana_resource.json
curl -O https://raw.githubusercontent.com/anbai-inc/Kibana_Hanization/master/main.py
python main.py "/usr/share/kibana"

P.S.: the translation breaks on 6.5.2, and as luck would have it, 6.5.2 is exactly the version I am running.

Import stage

Setting up Elasticsearch

Create an index

curl -H 'Content-Type: application/json' -XPUT '127.0.0.1:9200/14e' -d '{}'

Create the mapping (the string type was removed in 6.x, and index became a boolean)

curl -H 'Content-Type:application/json' -XPUT '127.0.0.1:9200/14e/data0/_mapping' -d '{
  "data0": {
    "properties": {
      "email":  { "type": "text" },
      "passwd": { "type": "text" }
    }
  }
}'
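To confirm the index and mapping took effect, Elasticsearch can be queried back. A sketch, assuming ES is listening on 127.0.0.1:9200 as above; `check_es_index` is my name for the helper:

```shell
# check_es_index NAME: list all indices, then dump NAME's mapping
check_es_index() {
  curl -s '127.0.0.1:9200/_cat/indices?v'        # the index should appear here
  curl -s "127.0.0.1:9200/$1/_mapping?pretty"    # the mapping we just PUT
}

# usage: check_es_index 14e
```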

Configure Logstash to clean and import the data

Edit /docker-elk/logstash/pipeline/logstash.conf

input {
  tcp {
    port => 5000
  }
}

filter {
  mutate {
    split => ["message", ":"]
    add_field => { "email"  => "%{[message][0]}" }
    add_field => { "passwd" => "%{[message][1]}" }
    # drop fields we do not want to see in the output
    remove_field => ["path", "message", "host", "@version", "port"]
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "14e"
    document_type => "data0"
  }
}
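What the mutate split does to each record can be imitated with plain shell parameter expansion; the sample data is made up. One caveat: logstash splits on every ':', so %{[message][1]} is only the piece between the first and second colon, while the shell form below keeps everything after the first:

```shell
line='user@example.com:hunter2'   # one record, as piped in over nc

email=${line%%:*}    # text before the first ':'  (like %{[message][0]})
passwd=${line#*:}    # text after the first ':'   (logstash's [message][1]
                     # would stop at a second ':', this keeps the rest)
echo "$email"        # -> user@example.com
echo "$passwd"       # -> hunter2

# with no ':' at all, the expansion falls through unchanged, similar to
# logstash writing the literal %{[message][1]} into passwd
bad='no-separator-here'
echo "${bad#*:}"     # -> no-separator-here
```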

The document_type option has changed, a big pitfall: https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-document_type
On splitting, see the official docs: https://www.elastic.co/guide/en/logstash/current/plugins-filters-split.html
For detailed Logstash syntax, see this guide: https://doc.yonyoucloud.com/doc/logstash-best-practice-cn

Feed data in with nc

nc localhost 5000 < 0

Batch import with nc

cat ./* | nc localhost 5000   # a bare "< ./*" redirect breaks on multiple files
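An alternative is to send each file over its own TCP connection; a minimal sketch, where `send_each` is my name for the helper and the `nc -q 1` flag (close shortly after EOF) is an assumption about the local netcat build:

```shell
# send_each FILE... : pipe each regular file into logstash's tcp input,
# one connection per file
send_each() {
  for f in "$@"; do
    [ -f "$f" ] || continue          # skip directories and unmatched globs
    nc -q 1 localhost 5000 < "$f"
  done
}

# usage: send_each ./*
```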

Wait for the data to load.
Results

Building charts for analysis

create pie->split slices

Under split slices, set the sub-aggregation to Filters
and add the filters one by one

email:"xxxx.xx"

[screenshot: 22.png]

Follow-ups

Logstash does not check for empty fields, so passwd can end up as the literal %{[message][1]}
Garbled characters such as \ux15 are not handled yet
The data import is still running..
Haven't tried the other features yet..
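For the first follow-up item, the filter could guard on the split result before adding fields. A sketch against the pipeline above, not verified on 6.5.2; whether to drop or merely tag colon-less lines is a design choice:

```
filter {
  mutate { split => ["message", ":"] }
  if [message][1] {
    mutate {
      add_field => { "email"  => "%{[message][0]}" }
      add_field => { "passwd" => "%{[message][1]}" }
    }
  } else {
    drop { }   # or add_tag a "malformed" marker instead of dropping
  }
  mutate { remove_field => ["path", "message", "host", "@version", "port"] }
}
```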

CATALOG
  1. 1. Preparation
    1. 1.1. Environment setup
    2. 1.2. Localizing Kibana
  2. 2. Import stage
    1. 2.1. Setting up Elasticsearch
    2. 2.2. Configure Logstash to clean and import the data
  3. 3. Building charts for analysis
  4. 4. Follow-ups