Kubernetes logs
StackState Self-hosted v5.0.x
This page describes StackState version 5.0.
Overview
In a Kubernetes setup, StackState functions are distributed across different pods and the logs for each function are stored per pod and container. You can access recent logs using `kubectl`, although for long-term storage it is recommended to set up log aggregation.
Kubernetes pods for logging
StackState logs are stored per pod and container. The table below shows the pod to access for logs relating to a specific StackState function. Note that actual pod names will include a number or random string suffix (e.g. `stackstate-receiver-5b9d79db86-h2hkz`) and may also be prefixed with the release name specified when StackState was deployed.
Note that logs stored on pods are regularly removed. For long-term access to logs, it is advised that you set up log aggregation for your Kubernetes cluster.
| StackState function | Pod |
| --- | --- |
| API (including topology, charts and settings) | `stackstate-api` |
| Checks | `stackstate-checks` |
| Data indexing into Elasticsearch | `stackstate-mm2es` (metrics), `stackstate-e2es` (events), `stackstate-trace2es` (traces), `stackstate-sts2es` (events generated by StackState) |
| Data ingestion | `stackstate-receiver` |
| Event handlers | `stackstate-view-health` |
| State propagation | `stackstate-state` |
| Synchronization | `stackstate-sync` |
| View health state | `stackstate-view-health` |
You can access logs on a specific pod using the `kubectl logs` command. For example:
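The commands below are a generic sketch; the pod name suffix and the `stackstate` namespace are assumptions that depend on how StackState was installed in your cluster.

```sh
# List the StackState pods to find the exact pod name
# (replace "stackstate" with the namespace StackState was installed in)
kubectl get pods -n stackstate

# Retrieve the logs of a specific pod (the suffix shown here is illustrative)
kubectl logs stackstate-receiver-5b9d79db86-h2hkz -n stackstate
```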
Access recent logs
Pod or container logs
The most recent logs can be retrieved from Kubernetes using the `kubectl logs` command. Check the table above to find the pod that holds the logs you need.
For example:
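In the sketch below, the pod name suffix, the container name and the `stackstate` namespace are placeholders to adjust to your own installation.

```sh
# Follow the most recent logs of the API pod
kubectl logs -f stackstate-api-5b9d79db86-h2hkz -n stackstate

# If the pod runs more than one container, specify the container explicitly
kubectl logs stackstate-api-5b9d79db86-h2hkz -c <container-name> -n stackstate
```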
Synchronization logs
All synchronization logs can be found in a pod stackstate-sync-<suffix>
. You can use the synchronization name to locate specific log information in a log snapshot.
For example:
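A minimal sketch, assuming a synchronization named "Example Synchronization"; the pod suffix, namespace and synchronization name are placeholders.

```sh
# Retrieve the synchronization pod logs and filter them for a specific synchronization name
kubectl logs stackstate-sync-5b9d79db86-h2hkz -n stackstate | grep "Example Synchronization"
```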
Log aggregation
For long-term storage of StackState log data, it is advised that you set up log aggregation on your Kubernetes cluster. This can be done with a third-party storage system such as Elasticsearch, Splunk or Logz.io and a log shipper such as Logstash or Fluentd; a minimal example of deploying a log shipper is included below.
For more details on how this can be done, check:

* Shipping logs with Fluentd (fluentd.org)
* A complete overview of setting up log aggregation into Elasticsearch (bitnami.com)
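As a starting point, a log shipper can be deployed with Helm. The sketch below uses the community Fluent Bit chart; the repository URL, chart name and `logging` namespace are assumptions to verify against the shipper's own documentation, and the output (for example Elasticsearch) still needs to be configured in the chart values.

```sh
# Add the Fluent Helm chart repository and deploy Fluent Bit as a log shipper
# (repository URL, chart name and namespace are assumptions - check the Fluent Bit docs)
helm repo add fluent https://fluent.github.io/helm-charts
helm repo update
helm install fluent-bit fluent/fluent-bit --namespace logging --create-namespace
```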
See also