Collecting Logs with ELK: Using Logstash
Source: cnblogs · Author: zbzSH · Date: 2021/12/31

I. Using Logstash

1. Collecting file logs with Logstash

  1. Logs are normally stored in log files, so when collecting them with an input plugin we use the file plugin to read the log content from a file. This section first shows how to write that content out to another file, so log files can be gathered under a single directory and are easier to find.
  2. Note: unlike other services, Logstash's collection config files have to be written by hand to match the actual environment.
  3. Prerequisite: Logstash must have read permission on the log files being collected and write permission on the destination file (a quick check is sketched after this list).
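A quick way to check the read side (a sketch; it assumes Logstash runs as the local logstash user, as it does when started via the packaged service):

  # If this prints a line, the logstash user can read the source log;
  # a "Permission denied" error means read access still has to be granted
  [root@logstash ~]# sudo -u logstash head -n 1 /var/log/messages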

2. Configuring Logstash

  # Default configuration file
  [root@logstash ~]# vim /etc/logstash/logstash.yml
  # On startup, Logstash reads the config files under conf.d
  path.config: /etc/logstash/conf.d

3. Configuring Logstash to collect file logs into a file

1) Configuration

  [root@logstash ~]# vim /etc/logstash/conf.d/message.conf
  input {
    file {
      path => "/var/log/messages"
      start_position => "beginning"
    }
  }
  output {
    file {
      path => "/tmp/message_%{+YYYY.MM.dd}.log"
    }
  }

2) Start Logstash

  # Check the syntax first
  [root@logstash ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/message.conf -t
  # Start in the background
  [root@logstash ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/message.conf &

3) Check the contents of the new file

  [root@logstash ~]# tail /var/log/messages
  Jul 17 15:01:01 logstash systemd: Started Session 448 of user root.
  Jul 17 15:05:01 logstash systemd: Started Session 449 of user root.
  [root@logstash ~]# tail /tmp/message_2020.07.17.log
  {"@version":"1","path":"/var/log/messages","message":"Jul 17 15:01:01 logstash systemd: Started Session 448 of user root.","@timestamp":"2020-07-17T07:05:42.341Z","host":"logstash"}
  {"@version":"1","path":"/var/log/messages","message":"Jul 17 15:05:01 logstash systemd: Started Session 449 of user root.","@timestamp":"2020-07-17T07:05:42.341Z","host":"logstash"}

4. Configuring log collection into Elasticsearch

1) Configuration

  [root@logstash tmp]# vim /etc/logstash/conf.d/message_es.conf
  input {
    file {
      path => "/var/log/messages"
      start_position => "beginning"
    }
  }
  output {
    elasticsearch {
      hosts => ["10.0.0.51:9200"]
      index => "messages_%{+YYYY-MM-dd}.log"
    }
  }

2) Start Logstash

  # Check the syntax first
  [root@logstash ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/message_es.conf -t
  # Start in the background
  [root@logstash ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/message_es.conf &
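To confirm that events are reaching Elasticsearch, query the cat API (assuming Elasticsearch is reachable at 10.0.0.51:9200, as in the config above):

  # The messages_* index should appear once the first events have been flushed
  [root@logstash ~]# curl -s 'http://10.0.0.51:9200/_cat/indices?v' | grep messages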

Running several Logstash processes at the same time requires a separate data directory for each one; otherwise you will see an error like the following:

  [ERROR] 2020-07-20 11:59:22.363 [LogStash::Runner] Logstash - java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

5. Running multiple Logstash instances

1) Create a data directory for each instance

  [root@logstash ~]# mkdir /data/logstash/{message_file,secure_file} -p
  # Give the logstash user ownership of the directories
  [root@logstash ~]# chown -R logstash.logstash /data/logstash/

2) Start the instances

  # When starting multiple instances, add --path.data so each instance uses its own data directory
  [root@logstash ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/message_es.conf --path.data=/data/logstash/message_file &
  [root@logstash tmp]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/secure_es.conf --path.data=/data/logstash/secure_file &
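The secure_es.conf referenced above is not shown in the original; following the same pattern as message_es.conf, a sketch of it would look like this:

  [root@logstash ~]# vim /etc/logstash/conf.d/secure_es.conf
  input {
    file {
      path => "/var/log/secure"
      start_position => "beginning"
    }
  }
  output {
    elasticsearch {
      hosts => ["10.0.0.51:9200"]
      index => "secure_%{+YYYY-MM-dd}.log"
    }
  }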

6. Collecting multiple logs with a single process

1) Stop the old processes and delete the old indices
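One way to do this is sketched below (it assumes the two background instances from the previous step are still running, that Elasticsearch is at 10.0.0.51:9200, and that wildcard index deletion is allowed on the cluster):

  # Stop the two background Logstash instances
  [root@logstash ~]# pkill -f 'logstash.*message_es.conf'
  [root@logstash ~]# pkill -f 'logstash.*secure_es.conf'
  # Delete the indices they created
  [root@logstash ~]# curl -XDELETE 'http://10.0.0.51:9200/messages_*'
  [root@logstash ~]# curl -XDELETE 'http://10.0.0.51:9200/secure_*'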

2) Configuration, method one:

  [root@logstash ~]# vim /etc/logstash/conf.d/double_es.conf
  input {
    file {
      type => "messages_log"
      path => "/var/log/messages"
      start_position => "beginning"
    }
    file {
      type => "secure_log"
      path => "/var/log/secure"
      start_position => "beginning"
    }
  }
  output {
    if [type] == "messages_log" {
      elasticsearch {
        hosts => ["10.0.0.51:9200"]
        index => "messages_%{+YYYY-MM-dd}.log"
      }
    }
    if [type] == "secure_log" {
      elasticsearch {
        hosts => ["10.0.0.51:9200"]
        index => "secure_%{+YYYY-MM-dd}.log"
      }
    }
  }

3) Configuration, method two:

  [root@logstash ~]# vim /etc/logstash/conf.d/doubles_es.conf
  input {
    file {
      type => "messages_log"
      path => "/var/log/messages"
      start_position => "beginning"
    }
    file {
      type => "secure_log"
      path => "/var/log/secure"
      start_position => "beginning"
    }
  }
  output {
    elasticsearch {
      hosts => ["10.0.0.51:9200"]
      index => "%{type}_%{+YYYY-MM-dd}.log"
    }
  }

4) Start

  [root@logstash ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/doubles_es.conf

II. Collecting Tomcat Logs

1. Install Tomcat

  # Upload the package
  # Install the Java environment
  # Extract the package
  [root@logstash ~]# tar xf apache-tomcat-9.0.30.tar.gz
  # Move it into place and create a symlink
  [root@logstash ~]# mv apache-tomcat-9.0.30 /usr/local/
  [root@logstash ~]# ln -s /usr/local/apache-tomcat-9.0.30 /usr/local/tomcat

2. Start Tomcat

  # Create a test page
  [root@logstash ~]# echo "test logstash log" > /usr/local/tomcat/webapps/ROOT/index.html
  # Start Tomcat
  [root@logstash ~]# /usr/local/tomcat/bin/startup.sh
  [root@logstash ~]# netstat -lntp
  tcp6       0      0 :::8080       :::*        LISTEN      84967/java
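A quick request against the test page confirms Tomcat is serving and also produces an access-log line for Logstash to pick up later:

  [root@logstash ~]# curl http://127.0.0.1:8080/
  test logstash log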

3. Configure Logstash to collect Tomcat logs

  [root@logstash ~]# vim /etc/logstash/conf.d/tomcat_es.conf
  input {
    file {
      # The file input does not expand date variables in the path, and only the current
      # day's catalina log keeps receiving writes (older files stop changing), so a * glob
      # is enough to pick up each day's log.
      path => "/usr/local/tomcat/logs/catalina.*.log"
      start_position => "beginning"
    }
  }
  output {
    elasticsearch {
      hosts => ["10.0.0.51:9200"]
      index => "tomcat_%{+YYYY-MM-dd}.log"
    }
  }
  [root@logstash ~]# vim /etc/logstash/conf.d/tomcat_access_es.conf
  input {
    file {
      path => "/usr/local/tomcat/logs/localhost_access_log.*.txt"
      start_position => "beginning"
    }
  }
  output {
    elasticsearch {
      hosts => ["10.0.0.51:9200"]
      index => "tomcat_access_%{+YYYY-MM-dd}.log"
    }
  }

4. Start

  [root@logstash ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/tomcat_access_es.conf

5. Collecting Tomcat error logs

1) Background

  When collecting Tomcat error logs, a single error can span many lines; after collection it turns into many separate events, which makes it awkward to read.
  # Two ways to solve this:
  1. Work with the developers to switch the Tomcat log format to JSON, then collect it directly.
  2. Use Logstash's multiline codec to merge the lines back into single events.

2) Method one:

  # Go to the Tomcat configuration directory
  [root@elkstack03 ~]# cd /usr/local/tomcat/conf
  # Edit the server configuration file
  [root@elkstack03 conf]# vim server.xml
  # Around line 138, add the following access-log valve so each access-log line is written as JSON
  <Valve className="org.apache.catalina.valves.AccessLogValve" directory="logs"
         prefix="tomcat_access_log" suffix=".log"
         pattern="{&quot;clientip&quot;:&quot;%h&quot;,&quot;ClientUser&quot;:&quot;%l&quot;,&quot;authenticated&quot;:&quot;%u&quot;,&quot;AccessTime&quot;:&quot;%t&quot;,&quot;method&quot;:&quot;%r&quot;,&quot;status&quot;:&quot;%s&quot;,&quot;SendBytes&quot;:&quot;%b&quot;,&quot;Query?string&quot;:&quot;%q&quot;,&quot;partner&quot;:&quot;%{Referer}i&quot;,&quot;AgentVersion&quot;:&quot;%{User-Agent}i&quot;}"/>
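After restarting Tomcat so the new valve takes effect, each access-log line is already a JSON document and can be parsed directly with the file input's json codec. A minimal sketch (the path matches the prefix/suffix in the valve above; the index name here is only an assumption):

  input {
    file {
      path => "/usr/local/tomcat/logs/tomcat_access_log.*.log"
      start_position => "beginning"
      codec => "json"    # parse each JSON line into separate fields
    }
  }
  output {
    elasticsearch {
      hosts => ["10.0.0.51:9200"]
      index => "tomcat_access_json_%{+YYYY-MM-dd}.log"
    }
  }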

3) Method two:

  [root@logstash ~]# vim /etc/logstash/conf.d/tomcat_mutiline_es.conf
  input {
    file {
      type => "java_log"
      path => "/usr/local/tomcat/logs/localhost_access_log.*.txt"
      start_position => "beginning"
      codec => multiline {
        pattern => "^\["
        negate => true
        what => "previous"
      }
    }
  }
  output {
    elasticsearch {
      hosts => ["10.0.0.51:9200"]
      index => "tomcat_mutiline_%{+YYYY-MM-dd}.log"
    }
  }
  # Annotated stdin example of the multiline codec:
  [root@elkstack03 ~]# vim /etc/logstash/conf.d/java.conf
  input {
    stdin {
      codec => multiline {
        # A line starting with "[" marks the beginning of a new event
        pattern => "^\["
        # negate => true applies "what" to lines that do NOT match the pattern;
        # false would apply it to lines that DO match
        negate => true
        # Merge such lines with the previous line; "next" would merge with the following line
        what => "previous"
      }
    }
  }
  output {
    stdout {
      codec => rubydebug
    }
  }
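To try the annotated stdin example, run it in the foreground and paste a multi-line Java stack trace; every line that does not start with "[" is folded into the previous event and printed as a single rubydebug record:

  [root@logstash ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/java.conf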

Original article: http://www.cnblogs.com/zbzSH/p/15727583.html
