Integrate with SAP Cloud Logging
| Category | Value |
|---|---|
| Signal types | logs, traces, metrics |
| Backend type | third-party remote |
| OTLP-native | yes |
Configure the Telemetry module to send logs, metrics, and traces from your cluster to an SAP Cloud Logging instance. By centralizing this data in your SAP Cloud Logging instance, you can store, visualize, and analyze the telemetry data of your applications in one place.
Table of Contents
- Table of Contents
- Prerequisites
- Context
- Ship Logs to SAP Cloud Logging
- Ship Traces to SAP Cloud Logging
- Ship Metrics to SAP Cloud Logging
- Set Up Kyma Dashboard Integration
- Use SAP Cloud Logging Alerts
- Use SAP Cloud Logging Dashboards
Prerequisites
- Kyma as the target deployment environment, with the following modules added (see Quick Install):
- Telemetry module
- To collect data from your Istio service mesh: Istio module (default module)
- SAP BTP Operator module (default module)
- An instance of SAP Cloud Logging with OpenTelemetry ingestion enabled. For details, see Ingest via OpenTelemetry API Endpoint.
TIP
Create the SAP Cloud Logging instance with the SAP BTP service operator (see Create an SAP Cloud Logging Instance through SAP BTP Service Operator), because it takes care of creation and rotation of the required Secret. However, you can choose any other method of creating the instance and the Secret, as long as the parameter for OTLP ingestion is enabled in the instance. For details, see Configuration Parameters.
- A Secret in the respective namespace in your Kyma cluster, holding the credentials and endpoints for the instance. It’s recommended that you rotate your Secret (see SAP BTP Security Recommendation BTP-CLS-0003). In the following examples, the Secret is named `sap-cloud-logging` and the namespace is `sap-cloud-logging-integration`, as illustrated in the secret-example.yaml (see also the sketch after this list).
- Kubernetes CLI (kubectl) (see Install the Kubernetes Command Line Tool).
- UNIX shell or Windows Subsystem for Linux (WSL) to execute commands.
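For orientation, the following is a minimal sketch of the shape such a Secret takes when all ingestion keys are present. The key names are taken from the pipeline examples later in this guide; the placeholder values stand for the endpoints and client credentials of your SAP Cloud Logging instance, which the SAP BTP service operator fills in for you if you follow the recommended setup:
```bash
kubectl apply -f - <<EOF
apiVersion: v1
kind: Secret
metadata:
  name: sap-cloud-logging
  namespace: sap-cloud-logging-integration
stringData:
  # OTLP ingestion (used by the OpenTelemetry method)
  ingest-otlp-endpoint: "<otlp ingestion endpoint>"
  ingest-otlp-cert: "<client certificate>"
  ingest-otlp-key: "<client key>"
  # mTLS HTTP ingestion (used by the legacy Fluent Bit method)
  ingest-mtls-endpoint: "<http ingestion endpoint>"
  ingest-mtls-cert: "<client certificate>"
  ingest-mtls-key: "<client key>"
EOF
```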
Context
The Telemetry module supports shipping logs, distributed traces, and metrics from applications and the Istio service mesh to SAP Cloud Logging.
First, set up the Telemetry module to ship logs, traces, and metrics to your backend by deploying the pipelines and other required resources. Then, configure the Kyma dashboard integration. Finally, set up SAP Cloud Logging alerts and dashboards.
SAP Cloud Logging is an instance-based and environment-agnostic observability service to store, visualize, and analyze logs, metrics, and traces.
Ship Logs to SAP Cloud Logging
The Telemetry module supports two protocols for shipping logs to your backend. The method you choose determines how you configure your LogPipeline and which SAP Cloud Logging dashboards you can use.
Context
Choose one of the following methods to configure shipping for application and access logs:
- OpenTelemetry (recommended): Use the OTLP-native method for all new configurations. It provides a unified way to send logs, metrics, and traces.
- Fluent Bit (legacy): Use this method only if you depend on the preconfigured Kyma_* dashboards in SAP Cloud Logging. These dashboards were designed for the HTTP output and are not compatible with the OTLP logging output.
Procedure
Set Up Application Logs
Deploy a LogPipeline for application logs:
- For OTLP, run:
Script: Application Logs
```bash
kubectl apply -f - <<EOF
apiVersion: telemetry.kyma-project.io/v1alpha1
kind: LogPipeline
metadata:
  name: sap-cloud-logging
spec:
  input:
    application:
      enabled: true
  output:
    otlp:
      endpoint:
        valueFrom:
          secretKeyRef:
            name: sap-cloud-logging
            namespace: sap-cloud-logging-integration
            key: ingest-otlp-endpoint
      tls:
        cert:
          valueFrom:
            secretKeyRef:
              name: sap-cloud-logging
              namespace: sap-cloud-logging-integration
              key: ingest-otlp-cert
        key:
          valueFrom:
            secretKeyRef:
              name: sap-cloud-logging
              namespace: sap-cloud-logging-integration
              key: ingest-otlp-key
EOF
```
- For HTTP, run:
Script: Application Logs
```bash
kubectl apply -f - <<EOF
apiVersion: telemetry.kyma-project.io/v1alpha1
kind: LogPipeline
metadata:
  name: sap-cloud-logging-application-logs
spec:
  input:
    application:
      containers:
        exclude:
          - istio-proxy
  output:
    http:
      dedot: true
      host:
        valueFrom:
          secretKeyRef:
            name: sap-cloud-logging
            namespace: sap-cloud-logging-integration
            key: ingest-mtls-endpoint
      tls:
        cert:
          valueFrom:
            secretKeyRef:
              name: sap-cloud-logging
              namespace: sap-cloud-logging-integration
              key: ingest-mtls-cert
        key:
          valueFrom:
            secretKeyRef:
              name: sap-cloud-logging
              namespace: sap-cloud-logging-integration
              key: ingest-mtls-key
      uri: /customindex/kyma
EOF
```
Verify that the LogPipeline is running:
```bash
kubectl get logpipelines
```
Set Up Istio Access Logs
By default, Istio sidecar injection and Istio access logs are disabled in Kyma. To analyze access logs, you must enable both:
- Enable Istio sidecar injection for your workload (see Enabling Istio Sidecar Proxy Injection).
- Depending on your log shipment protocol, configure the Istio Telemetry resource (see Configure Istio Access Logs):
  - For OTLP, set up the Istio Telemetry resource with the OTLP-based kyma-logs extension provider (a sketch of such a resource follows after this procedure).
  - For HTTP:
    - Set up the Istio Telemetry resource with the `stdout-json` extension provider.
    - Deploy a LogPipeline for Istio access logs:
Script: Access Logs
```bash
kubectl apply -f - <<EOF
apiVersion: telemetry.kyma-project.io/v1alpha1
kind: LogPipeline
metadata:
  name: sap-cloud-logging-access-logs
spec:
  input:
    application:
      containers:
        include:
          - istio-proxy
  output:
    http:
      dedot: true
      host:
        valueFrom:
          secretKeyRef:
            name: sap-cloud-logging
            namespace: sap-cloud-logging-integration
            key: ingest-mtls-endpoint
      tls:
        cert:
          valueFrom:
            secretKeyRef:
              name: sap-cloud-logging
              namespace: sap-cloud-logging-integration
              key: ingest-mtls-cert
        key:
          valueFrom:
            secretKeyRef:
              name: sap-cloud-logging
              namespace: sap-cloud-logging-integration
              key: ingest-mtls-key
      uri: /customindex/istio-envoy-kyma
EOF
```
Verify that the LogPipeline for Istio access logs is running:
```bash
kubectl get logpipelines
```
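For the OTLP method mentioned above, the Istio Telemetry resource might look roughly as follows. This is only a sketch: the resource name, the mesh-wide scope in the istio-system namespace, and the API version are assumptions; see Configure Istio Access Logs for the authoritative configuration.
```bash
kubectl apply -f - <<EOF
apiVersion: telemetry.istio.io/v1
kind: Telemetry
metadata:
  name: access-config        # illustrative name
  namespace: istio-system    # mesh-wide scope; use a workload namespace to narrow it down
spec:
  accessLogging:
    - providers:
        - name: kyma-logs    # OTLP-based kyma-logs extension provider
EOF
```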
Ship Traces to SAP Cloud Logging
You can set up ingestion of distributed traces from applications and the Istio service mesh to the OTLP endpoint of the SAP Cloud Logging service instance.
Procedure
Set Up Traces
Deploy a TracePipeline:
Script: Distributed Traces
```bash
kubectl apply -f - <<EOF
apiVersion: telemetry.kyma-project.io/v1alpha1
kind: TracePipeline
metadata:
  name: sap-cloud-logging
spec:
  output:
    otlp:
      endpoint:
        valueFrom:
          secretKeyRef:
            name: sap-cloud-logging
            namespace: sap-cloud-logging-integration
            key: ingest-otlp-endpoint
      tls:
        cert:
          valueFrom:
            secretKeyRef:
              name: sap-cloud-logging
              namespace: sap-cloud-logging-integration
              key: ingest-otlp-cert
        key:
          valueFrom:
            secretKeyRef:
              name: sap-cloud-logging
              namespace: sap-cloud-logging-integration
              key: ingest-otlp-key
EOF
```
Verify that the TracePipeline is running:
```bash
kubectl get tracepipelines
```
Set Up Istio Tracing
By default, Istio sidecar injection and Istio tracing are disabled in Kyma. To analyze traces from the service mesh, you must enable both:
- Enable Istio sidecar injection for your workload (see Enabling Istio Sidecar Proxy Injection).
- Configure the Istio Telemetry resource to use the kyma-traces extension provider based on OTLP (see Configure Istio Tracing); a sketch follows below.
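As a rough sketch, such an Istio Telemetry resource could look like the following. The resource name, mesh-wide scope, sampling percentage, and API version are assumptions; see Configure Istio Tracing for the authoritative configuration.
```bash
kubectl apply -f - <<EOF
apiVersion: telemetry.istio.io/v1
kind: Telemetry
metadata:
  name: tracing-default      # illustrative name
  namespace: istio-system    # mesh-wide scope
spec:
  tracing:
    - providers:
        - name: kyma-traces  # OTLP-based kyma-traces extension provider
      randomSamplingPercentage: 1.0   # adjust the sampling rate to your needs
EOF
```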
Ship Metrics to SAP Cloud Logging
You can set up ingestion of metrics from applications and the Istio service mesh to the OTLP endpoint of the SAP Cloud Logging service instance.
Procedure
Deploy a MetricPipeline:
Script: SAP Cloud Logging
```bash
kubectl apply -f - <<EOF
apiVersion: telemetry.kyma-project.io/v1alpha1
kind: MetricPipeline
metadata:
  name: sap-cloud-logging
spec:
  input:
    prometheus:
      enabled: false
    istio:
      enabled: false
    runtime:
      enabled: false
  output:
    otlp:
      endpoint:
        valueFrom:
          secretKeyRef:
            name: sap-cloud-logging
            namespace: sap-cloud-logging-integration
            key: ingest-otlp-endpoint
      tls:
        cert:
          valueFrom:
            secretKeyRef:
              name: sap-cloud-logging
              namespace: sap-cloud-logging-integration
              key: ingest-otlp-cert
        key:
          valueFrom:
            secretKeyRef:
              name: sap-cloud-logging
              namespace: sap-cloud-logging-integration
              key: ingest-otlp-key
EOF
```
The default configuration creates a gateway to receive OTLP metrics from your applications.
Optional: To collect additional metrics, such as those from the runtime or Istio, configure the presets in the input section of the MetricPipeline, as sketched below. For the available options, see Configure Metrics Collection.
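For example, assuming the MetricPipeline shown above, one way to switch on the additional inputs is a merge patch (the field names match the manifest above; you can equally edit and re-apply the manifest):
```bash
# Enable the runtime, Istio, and Prometheus inputs on the existing MetricPipeline
kubectl patch metricpipeline sap-cloud-logging --type merge \
  -p '{"spec":{"input":{"runtime":{"enabled":true},"istio":{"enabled":true},"prometheus":{"enabled":true}}}}'
```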
Verify that the MetricPipeline is running:
```bash
kubectl get metricpipelines
```
Set Up Kyma Dashboard Integration
To add direct links from Kyma dashboard to SAP Cloud Logging, apply the ConfigMap that corresponds to your chosen log shipping method.
Context
Apply the ConfigMap that matches the output you use in your LogPipeline. If your Secret has a different name or namespace, then download the file first and adjust the name and namespace accordingly in the dataSources section of the file.
Procedure
- If your Secret has a name or namespace different from the example, download the file and edit the dataSources section before you apply it.
- Apply the ConfigMap:
For OTLP, run:
```bash
kubectl apply -f https://raw.githubusercontent.com/kyma-project/telemetry-manager/main/docs/user/integration/sap-cloud-logging/kyma-dashboard-configmap.yaml
```
For HTTP, run:
```bash
kubectl apply -f https://raw.githubusercontent.com/kyma-project/telemetry-manager/main/docs/user/integration/sap-cloud-logging/kyma-dashboard-http-configmap.yaml
```
Use SAP Cloud Logging Alerts
You can import predefined alerts for SAP Cloud Logging to monitor the health of your telemetry integration.
Procedure
- In the SAP Cloud Logging dashboard, define a “notification channel” to receive alert notifications.
- To import a monitor, use the development tools of the SAP Cloud Logging dashboard.
- Execute `POST _plugins/_alerting/monitors`, followed by the contents of the respective JSON file.
- Depending on the pipelines you are using, enable some or all of the following alerts. The alerts are based on JSON documents that define a monitor for the alerting plugin:
| Monitored Component | File | Description |
|---|---|---|
| SAP Cloud Logging | alert-health.json | Monitors the health of the underlying OpenSearch cluster in SAP Cloud Logging using the cluster health API. Triggers if the cluster status becomes red. |
| SAP Cloud Logging | alert-rejection-in-progress.json | Monitors the cls-rejected-* index for new data. Triggers if new rejected data is observed. |
| Kyma Telemetry Integration | alert-telemetry-status.json | Monitors the status of the Telemetry module. Triggers if the module reports a non-ready state. |
| Kyma Telemetry Integration | alert-log-ingestion.json | For OTLP: Monitors the single LogPipeline used in the OTLP method. Triggers if log data stops flowing. |
| Kyma Telemetry Integration | alert-app-log-ingestion.json | For HTTP (legacy): Monitors the application log LogPipeline. Triggers if log data stops flowing. |
| Kyma Telemetry Integration | alert-access-log-ingestion.json | For HTTP (legacy): Monitors the Istio access log LogPipeline. Triggers if log data stops flowing. |
| Kyma Telemetry Integration | alert-trace-ingestion.json | Monitors the TracePipeline. Triggers if trace data stops flowing to SAP Cloud Logging. |
| Kyma Telemetry Integration | alert-metric-ingestion.json | Monitors the MetricPipeline. Triggers if metric data stops flowing to SAP Cloud Logging. |
- After importing, edit the monitor to attach your notification channel or destination and adjust thresholds as needed.
- Verify that the new monitor definition is listed among the SAP Cloud Logging alerts.
Use SAP Cloud Logging Dashboards
You can view logs, traces, and metrics in SAP Cloud Logging dashboards. Several dashboards come with SAP Cloud Logging, and you can import additional dashboards as needed.
Context
The preconfigured Kyma_* dashboards in SAP Cloud Logging are compatible only with the legacy (HTTP) logging method.
Procedure
- For the status of the SAP Cloud Logging integration with the Telemetry module, import the file dashboard-status.ndjson.
- For application logs and Istio access logs using the `http` output, use the preconfigured dashboards prefixed with `Kyma_`.
- For traces, use the OpenSearch plugin “Observability”.
- For runtime metrics, import the file dashboard-runtime.ndjson.
- For Istio Pod metrics, import the file dashboard-istio.ndjson.