= Logging =

== Architecture ==

{{{

+------------ rs236235 ------
|
| Elasticsearch +- Kibana
|   +
|   |
| Fluentd --+ file:aggregation.current
|   +
|   |
+---|------------------------
    |
+---|----------service host--
|   |
|  Fluentd --+ file:aggregation.current
|   +                    +
|   |                    |
|  Docker daemon     Host service
|   +
|   |
| +-|-----docker container--
| | |
| | Fluentd
| | |
| | +
| | Application log
| +-------------------------

}}}
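
For illustration, a minimal, hypothetical sketch of the Fluentd configuration on a service host: receiving container output (e.g. via Docker's fluentd logging driver), writing the local aggregation file and relaying everything to the central host. All ports, paths and match patterns here are assumptions, not taken from the actual setup:

{{{
# Hypothetical service host configuration (ports, paths and
# patterns are assumptions)
<source>
  # Containers started with Docker's fluentd logging driver
  # send their output to this forward input
  @type forward
  port 24224
</source>

<match **>
  @type copy
  <store>
    # Local aggregation file
    @type file
    path /var/log/fluent/aggregation
  </store>
  <store>
    # Relay all events to the central aggregation host
    @type forward
    <server>
      host rs236235.rs.hosteurope.de
      port 24224
    </server>
  </store>
</match>
}}}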

== Log aggregation ==

Service hosts collect log information from their services (typically through the Docker daemon) using Fluentd. These logs are aggregated locally and sent to the central log aggregation host ([[SystemAdministration/Hosts/rs236235.rs.hosteurope.de|rs236235]]). Here, the logs are processed and sent to two targets (see the configuration sketch below):
* The file system: `/var/log/fluent/aggregation.current`, which is flushed and rotated periodically
* A local Elasticsearch instance (using the [https://docs.fluentd.org/v1.0/articles/out_elasticsearch Elasticsearch output plugin])
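
A minimal, hypothetical sketch of the corresponding Fluentd configuration on the aggregation host, fanning every event out to both targets with a `copy` output; the listening port, match pattern, file path and Elasticsearch connection details are assumptions, not taken from the actual setup:

{{{
# Hypothetical aggregator configuration (ports, paths and
# patterns are assumptions)
<source>
  # Receive events forwarded by the Fluentd instances on the service hosts
  @type forward
  port 24224
</source>

<match **>
  # Fan every event out to both targets
  @type copy
  <store>
    # File output, flushed and rotated by Fluentd
    @type file
    path /var/log/fluent/aggregation
  </store>
  <store>
    # Local Elasticsearch instance (Elasticsearch output plugin)
    @type elasticsearch
    host localhost
    port 9200
  </store>
</match>
}}}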

A Kibana instance, connected to this Elasticsearch instance, is also running and is available at [https://rs236235.rs.hosteurope.de:5601]. It can be used to view, query and visualise the indexed log messages and the data contained therein.

== Data processing via logs ==

Using Kibana, logs can not only be viewed and queried but also be used for data visualisation, for example to chart the evolution over time of a value that is included in regular log messages. For this, the logs have to be parsed into fields, which can happen in various places. Ideally this is done at the primary source of the log collection, i.e. in the Fluentd enabled application container (see the Solr example below), but it can also happen further down the pipeline, as illustrated by the sketch below.

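Where parsing at the source is not an option, a parser filter on a downstream Fluentd instance can do the job instead. The following is a minimal, hypothetical sketch; the `docker.myapp` tag, the field names and the expression are illustrative only:

{{{
# Hypothetical downstream parser filter (tag, field names and
# expression are assumptions)
<filter docker.myapp>
  @type parser
  # Field of the incoming record that holds the raw log line
  key_name log
  # Keep the original fields alongside the parsed ones
  reserve_data true
  <parse>
    @type regexp
    expression /^(?<log_time>\S+) (?<log_level>\S+) (?<message>.*)$/
  </parse>
</filter>
}}}
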
=== Log parsing ===

==== Examples ====

===== Solr =====
This configuration uses the [https://docs.fluentd.org/v0.12/articles/parser_regexp regexp parser] to extract a number of fields from the Solr log, and also specifies field types for those fields that should not be interpreted as strings, so that they can be used in numeric aggregations and visualisations in Kibana.

{{{
<source>
  # Follow the Solr log file as it grows
  @type tail
  path /opt/solr/server/logs/solr.log
  # Keeps track of the last read position across restarts
  pos_file /opt/solr/server/logs/solr.log.pos
  tag solr
  # Also emit lines that do not match the expression below
  emit_unmatched_lines true
  <parse>
    @type regexp
    expression /^(?<log_time>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+)\s+(?<log_level>([^\s]+))\s+((\([^\)]*\)\s+\[\s*(?<solr_index>[^\]]+)\s*\].*(webapp=(?<solr_webapp>[^\s]+)).*(path=(?<solr_path>[^\s]+)).*(params=(?<solr_params>{[^\}]+}))(.*hits=(?<solr_hits>\d+)(.*status=(?<solr_status>\d+)(.*QTime=(?<solr_qtime>\d+)?)?)?)?)?(?<message>.+)?)$/
    # Fields not listed here remain strings
    types solr_hits:integer,solr_status:integer,solr_qtime:integer
  </parse>
</source>
}}}