# Elasticsearch: ELK Installation and Configuration

[TOC]

## Installation

Use docker-compose:

```bash
git clone https://github.com/deviantony/docker-elk.git
cd docker-elk
sudo docker-compose up -d
```

## Configuration

### Setting up passwords

Run the following command to generate passwords for the ES built-in users:

```bash
sudo docker-compose exec -T elasticsearch bin/elasticsearch-setup-passwords auto --batch
```

For example, my output:

```text
Changed password for user apm_system
PASSWORD apm_system = mkiFZAiUFtYusw0hKidx

Changed password for user kibana_system
PASSWORD kibana_system = 8KnPbnm5doTZBBfk5lxd

Changed password for user kibana
PASSWORD kibana = 8KnPbnm5doTZBBfk5lxd

Changed password for user logstash_system
PASSWORD logstash_system = IZILJ4Z9cqC6oARU2DJb

Changed password for user beats_system
PASSWORD beats_system = ZwZV65JUoQPq4Drwx79V

Changed password for user remote_monitoring_user
PASSWORD remote_monitoring_user = yGD4EK9o9fDDFpPGaC34

Changed password for user elastic
PASSWORD elastic = XCkZUSgrTHOF0krUsXgJ
```
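If you want to reuse these generated credentials in scripts, the output format above is easy to parse into a dict. A minimal sketch (the sample string below just reproduces the output format shown above; this is not an official API):

```python
import re

# Sample of the `elasticsearch-setup-passwords auto --batch` output format.
output = """\
Changed password for user kibana_system
PASSWORD kibana_system = 8KnPbnm5doTZBBfk5lxd

Changed password for user elastic
PASSWORD elastic = XCkZUSgrTHOF0krUsXgJ
"""

# Each credential line has the form "PASSWORD <user> = <password>".
passwords = dict(re.findall(r"^PASSWORD (\S+) = (\S+)$", output, re.MULTILINE))

print(passwords["elastic"])  # the superuser password used in the configs below
```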

Edit the Kibana config file /path/to/docker-elk/kibana/config/kibana.yml and update the credentials:

```yaml
elasticsearch.username: kibana_system
elasticsearch.password: 8KnPbnm5doTZBBfk5lxd
```

Edit the Logstash config file /path/to/docker-elk/logstash/config/logstash.yml and update the credentials:

```yaml
xpack.monitoring.elasticsearch.username: logstash_system
xpack.monitoring.elasticsearch.password: IZILJ4Z9cqC6oARU2DJb
```

Edit /path/to/docker-elk/logstash/pipeline/logstash.conf and update the ES connection settings:

```conf
output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => "elastic"
    password => "XCkZUSgrTHOF0krUsXgJ"
    ecs_compatibility => disabled
  }
}
```
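For context, this output block sits inside the full pipeline file. In the docker-elk repository the default pipeline looks roughly like the sketch below (the beats input on port 5044 is the repo's default; verify against your checkout before relying on it):

```conf
input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => "elastic"
    password => "XCkZUSgrTHOF0krUsXgJ"
    ecs_compatibility => disabled
  }
}
```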

Finally, restart Kibana and Logstash:

```bash
sudo docker-compose restart kibana logstash
```

## The ik Tokenizer Plugin

### Installation

1. Method 1 (recommended)

   This method is recommended because it makes it easy to customize the tokenizer dictionary later.

   Unzip the plugin into a folder named analysis-ik, then copy that folder to /path/to/docker-elk/elasticsearch/plugins.

   Edit the Dockerfile at /path/to/docker-elk/elasticsearch/Dockerfile, adding:

   ```dockerfile
   COPY plugins plugins
   ```

   Build and restart:

   ```bash
   sudo docker-compose build
   sudo docker-compose up -d
   ```
2. Method 2 (not recommended)

   Edit the ES Dockerfile at /path/to/docker-elk/elasticsearch/Dockerfile, adding:

   ```dockerfile
   RUN elasticsearch-plugin install -b http://192.168.79.1:8000/elasticsearch-analysis-ik-7.11.2.zip
   ```

   Note: replace the plugin package URL with your own.

   Then build and restart:

   ```bash
   sudo docker-compose build
   sudo docker-compose up -d
   ```
    

### ik_smart

```json
GET _analyze
{
  "analyzer": "ik_smart",
  "text": "vue.js 教程"
}
```

Output:

```json
{
  "tokens" : [
    {
      "token" : "vue.js",
      "start_offset" : 0,
      "end_offset" : 6,
      "type" : "LETTER",
      "position" : 0
    },
    {
      "token" : "教程",
      "start_offset" : 7,
      "end_offset" : 9,
      "type" : "CN_WORD",
      "position" : 1
    }
  ]
}
```

### ik_max_word

```json
GET _analyze
{
  "analyzer": "ik_max_word",
  "text": "vue.js 教程"
}
```

Output:

```json
{
  "tokens" : [
    {
      "token" : "vue.js",
      "start_offset" : 0,
      "end_offset" : 6,
      "type" : "LETTER",
      "position" : 0
    },
    {
      "token" : "vue",
      "start_offset" : 0,
      "end_offset" : 3,
      "type" : "ENGLISH",
      "position" : 1
    },
    {
      "token" : "js",
      "start_offset" : 4,
      "end_offset" : 6,
      "type" : "ENGLISH",
      "position" : 2
    },
    {
      "token" : "教程",
      "start_offset" : 7,
      "end_offset" : 9,
      "type" : "CN_WORD",
      "position" : 3
    }
  ]
}
```
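The start_offset and end_offset values are character indices into the analyzed text, so each token should equal the corresponding slice of the input string. A quick sanity check in Python (the token triples are copied from the ik_max_word response above):

```python
text = "vue.js 教程"

# (token, start_offset, end_offset) triples copied from the response above.
tokens = [
    ("vue.js", 0, 6),
    ("vue", 0, 3),
    ("js", 4, 6),
    ("教程", 7, 9),
]

# Every token must match the slice of the original text at its offsets.
for token, start, end in tokens:
    assert text[start:end] == token

print("all offsets consistent")
```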

### Custom Dictionaries

IK does not recognize some common personal names and proper nouns. For example:

```json
GET _analyze
{
  "analyzer": "ik_max_word",
  "text": "阮一峰的网络日志"
}
```

Output:

```json
{
  "tokens" : [
    {
      "token" : "阮",
      "start_offset" : 0,
      "end_offset" : 1,
      "type" : "CN_CHAR",
      "position" : 0
    },
    {
      "token" : "一",
      "start_offset" : 1,
      "end_offset" : 2,
      "type" : "TYPE_CNUM",
      "position" : 1
    },
    {
      "token" : "峰",
      "start_offset" : 2,
      "end_offset" : 3,
      "type" : "CN_CHAR",
      "position" : 2
    },
    {
      "token" : "的",
      "start_offset" : 3,
      "end_offset" : 4,
      "type" : "CN_CHAR",
      "position" : 3
    },
    {
      "token" : "网络日志",
      "start_offset" : 4,
      "end_offset" : 8,
      "type" : "CN_WORD",
      "position" : 4
    },
    {
      "token" : "网络",
      "start_offset" : 4,
      "end_offset" : 6,
      "type" : "CN_WORD",
      "position" : 5
    },
    {
      "token" : "日志",
      "start_offset" : 6,
      "end_offset" : 8,
      "type" : "CN_WORD",
      "position" : 6
    }
  ]
}
```

As you can see, the name "阮一峰" was not tokenized as a whole, so we need a custom dictionary that adds this name.

First, create a custom dictionary file /path/to/docker-elk/elasticsearch/plugins/analysis-ik/config/boli.dic. Its content is simply:

```text
阮一峰
```

Then edit the ik config file to point it at the custom dictionary:

```bash
vim elasticsearch/plugins/analysis-ik/config/IKAnalyzer.cfg.xml
```

Set the ext_dict entry to your dictionary file:

```xml
<!--用户可以在这里配置自己的扩展字典 -->
<entry key="ext_dict">boli.dic</entry>
```
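For reference, this entry lives inside the plugin's properties file; a typical IKAnalyzer.cfg.xml looks roughly like the sketch below (based on the ik plugin's default config; verify against your own copy):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
    <comment>IK Analyzer extension configuration</comment>
    <!-- custom dictionary files, relative to this config directory -->
    <entry key="ext_dict">boli.dic</entry>
    <!-- custom stop-word dictionary files -->
    <entry key="ext_stopwords"></entry>
</properties>
```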

Then build and restart ES:

```bash
sudo docker-compose build
sudo docker-compose up -d
```

Test again:

```json
GET _analyze
{
  "analyzer": "ik_max_word",
  "text": "阮一峰的网络日志"
}
```

Output:

```json
{
  "tokens" : [
    {
      "token" : "阮一峰",
      "start_offset" : 0,
      "end_offset" : 3,
      "type" : "CN_WORD",
      "position" : 0
    },
    {
      "token" : "一",
      "start_offset" : 1,
      "end_offset" : 2,
      "type" : "TYPE_CNUM",
      "position" : 1
    },
    {
      "token" : "峰",
      "start_offset" : 2,
      "end_offset" : 3,
      "type" : "CN_CHAR",
      "position" : 2
    },
    {
      "token" : "的",
      "start_offset" : 3,
      "end_offset" : 4,
      "type" : "CN_CHAR",
      "position" : 3
    },
    {
      "token" : "网络日志",
      "start_offset" : 4,
      "end_offset" : 8,
      "type" : "CN_WORD",
      "position" : 4
    },
    {
      "token" : "网络",
      "start_offset" : 4,
      "end_offset" : 6,
      "type" : "CN_WORD",
      "position" : 5
    },
    {
      "token" : "日志",
      "start_offset" : 6,
      "end_offset" : 8,
      "type" : "CN_WORD",
      "position" : 6
    }
  ]
}
```

As you can see, the text is now tokenized correctly.
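Dictionary-based analyzers like ik essentially run maximum matching against their word lists, which is why adding a single entry changes the result. A toy forward-maximum-match sketch (not ik's actual algorithm) illustrating the effect of the custom entry:

```python
def fmm_tokenize(text, dictionary, max_len=5):
    """Toy forward maximum matching: at each position, take the longest
    dictionary word starting there; fall back to a single character."""
    tokens, i = [], 0
    while i < len(text):
        for length in range(min(max_len, len(text) - i), 0, -1):
            word = text[i:i + length]
            if length == 1 or word in dictionary:
                tokens.append(word)
                i += length
                break
    return tokens

base_dict = {"网络", "日志", "网络日志"}
text = "阮一峰的网络日志"

print(fmm_tokenize(text, base_dict))              # the name falls apart into single chars
print(fmm_tokenize(text, base_dict | {"阮一峰"}))  # the custom entry keeps the name whole
```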

## Clustering

TODO
