Integrate with SAP Cloud Logging

Category

  • Signal types: logs, traces, metrics
  • Backend type: third-party remote
  • OTLP-native: yes

Configure the Telemetry module to send logs, metrics, and traces from your cluster to an SAP Cloud Logging instance. By centralizing this data in your SAP Cloud Logging instance, you can store, visualize, and analyze the observability of your applications.

Prerequisites

  • Kyma as the target deployment environment, with the following modules added (see Quick Install):
    • Telemetry module
    • To collect data from your Istio service mesh: Istio module (default module)
    • SAP BTP Operator module (default module)
  • An instance of SAP Cloud Logging with OpenTelemetry ingestion enabled. For details, see Ingest via OpenTelemetry API Endpoint.

    TIP

    Create the SAP Cloud Logging instance with the SAP BTP service operator (see Create an SAP Cloud Logging Instance through SAP BTP Service Operator), because it takes care of creation and rotation of the required Secret. However, you can choose any other method of creating the instance and the Secret, as long as the parameter for OTLP ingestion is enabled in the instance. For details, see Configuration Parameters.

  • A Secret in the respective namespace in your Kyma cluster, holding the credentials and endpoints for the instance. It’s recommended that you rotate your Secret (see SAP BTP Security Recommendation BTP-CLS-0003). In the following example, the Secret is named sap-cloud-logging and the namespace sap-cloud-logging-integration, as illustrated in the secret-example.yaml. For the keys that the pipelines in this guide read from the Secret, see the sketch after this list.
  • UNIX shell or Windows Subsystem for Linux (WSL) to execute commands.
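
The pipelines in this guide read the following keys from that Secret. The snippet below is only a sketch of the expected shape with placeholder values; when the instance is created through the SAP BTP service operator, the resulting binding Secret already provides these keys, so you typically don't write it by hand.

yaml
apiVersion: v1
kind: Secret
metadata:
  name: sap-cloud-logging
  namespace: sap-cloud-logging-integration
stringData:
  # Keys used by the OTLP pipelines (logs, traces, metrics)
  ingest-otlp-endpoint: "<otlp-ingest-host>:443"          # placeholder
  ingest-otlp-cert: "<PEM-encoded client certificate>"    # placeholder
  ingest-otlp-key: "<PEM-encoded client key>"             # placeholder
  # Keys used by the HTTP (Fluent Bit) LogPipelines
  ingest-mtls-endpoint: "<https://mtls-ingest-host>"      # placeholder
  ingest-mtls-cert: "<PEM-encoded client certificate>"    # placeholder
  ingest-mtls-key: "<PEM-encoded client key>"             # placeholder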

Context

The Telemetry module supports shipping logs, distributed traces, and metrics from applications and the Istio service mesh to SAP Cloud Logging.

First, set up the Telemetry module to ship logs, traces, and metrics to your backend by deploying the pipelines and other required resources. Then, configure the Kyma dashboard integration. Finally, set up SAP Cloud Logging alerts and dashboards.

SAP Cloud Logging is an instance-based and environment-agnostic observability service to store, visualize, and analyze logs, metrics, and traces.

Ship Logs to SAP Cloud Logging

The Telemetry module supports two protocols for shipping logs to your backend. The method you choose determines how you configure your LogPipeline and which SAP Cloud Logging dashboards you can use.

Context

Choose one of the following methods to configure shipping for application and access logs:

  • OpenTelemetry (recommended): Use the OTLP-native method for all new configurations. It provides a unified way to send logs, metrics, and traces.
  • Fluent Bit (legacy): Use this method only if you depend on the preconfigured Kyma_* dashboards in SAP Cloud Logging. These dashboards were designed for the HTTP output and are not compatible with the OTLP logging output.

Procedure

Set Up Application Logs

  1. Deploy a LogPipeline for application logs:

    • For OTLP, run:
    Script: Application Logs
    bash
    kubectl apply -f - <<EOF
    apiVersion: telemetry.kyma-project.io/v1alpha1
    kind: LogPipeline
    metadata:
      name: sap-cloud-logging
    spec:
      input:
        application:
          enabled: true
      output:
        otlp:
          endpoint:
            valueFrom:
              secretKeyRef:
                name: sap-cloud-logging
                namespace: sap-cloud-logging-integration
                key: ingest-otlp-endpoint
          tls:
            cert:
              valueFrom:
                secretKeyRef:
                  name: sap-cloud-logging
                  namespace: sap-cloud-logging-integration
                  key: ingest-otlp-cert
            key:
              valueFrom:
                secretKeyRef:
                  name: sap-cloud-logging
                  namespace: sap-cloud-logging-integration
                  key: ingest-otlp-key
    EOF
    • For HTTP, run:
    Script: Application Logs
    bash
    kubectl apply -f - <<EOF
    apiVersion: telemetry.kyma-project.io/v1alpha1
    kind: LogPipeline
    metadata:
      name: sap-cloud-logging-application-logs
    spec:
      input:
        application:
          containers:
            exclude:
              - istio-proxy
      output:
        http:
          dedot: true
          host:
            valueFrom:
              secretKeyRef:
                name: sap-cloud-logging
                namespace: sap-cloud-logging-integration
                key: ingest-mtls-endpoint
          tls:
            cert:
              valueFrom:
                secretKeyRef:
                  name: sap-cloud-logging
                  namespace: sap-cloud-logging-integration
                  key: ingest-mtls-cert
            key:
              valueFrom:
                secretKeyRef:
                  name: sap-cloud-logging
                  namespace: sap-cloud-logging-integration
                  key: ingest-mtls-key
          uri: /customindex/kyma
    EOF
  2. Verify that the LogPipeline is running:

bash
kubectl get logpipelines

Set Up Istio Access Logs

By default, Istio sidecar injection and Istio access logs are disabled in Kyma. To collect and analyze access logs, enable both:

  1. Enable Istio sidecar injection for your workload (see Enabling Istio Sidecar Proxy Injection).

  2. Depending on your log shipment protocol, configure the Istio Telemetry resource (see Configure Istio Access Logs; a minimal sketch also follows this procedure):

    • For OTLP, set up the Istio Telemetry resource with the OTLP-based kyma-logs extension provider.

    • For HTTP:

      1. Set up the Istio Telemetry resource with the stdout-json extension provider.
      2. Deploy a LogPipeline for Istio access logs:
      Script: Access Logs
      bash
      kubectl apply -f - <<EOF
      apiVersion: telemetry.kyma-project.io/v1alpha1
      kind: LogPipeline
      metadata:
        name: sap-cloud-logging-access-logs
      spec:
        input:
          application:
            containers:
              include:
                - istio-proxy
        output:
          http:
            dedot: true
            host:
              valueFrom:
                secretKeyRef:
                  name: sap-cloud-logging
                  namespace: sap-cloud-logging-integration
                  key: ingest-mtls-endpoint
            tls:
              cert:
                valueFrom:
                  secretKeyRef:
                    name: sap-cloud-logging
                    namespace: sap-cloud-logging-integration
                    key: ingest-mtls-cert
              key:
                valueFrom:
                  secretKeyRef:
                    name: sap-cloud-logging
                    namespace: sap-cloud-logging-integration
                    key: ingest-mtls-key
            uri: /customindex/istio-envoy-kyma
      EOF
      3. Verify that the LogPipeline for Istio access logs is running:

        bash
        kubectl get logpipelines
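
For orientation, a mesh-wide Istio Telemetry resource enabling access logs could look like the following sketch. The resource name and mesh-wide scope are illustrative assumptions, and depending on your Istio version the API group may be telemetry.istio.io/v1alpha1; follow Configure Istio Access Logs for the exact, supported configuration and for namespace- or workload-level scoping.

bash
kubectl apply -f - <<EOF
apiVersion: telemetry.istio.io/v1
kind: Telemetry
metadata:
  name: access-config        # hypothetical name
  namespace: istio-system    # applies mesh-wide
spec:
  accessLogging:
    - providers:
        - name: kyma-logs    # for OTLP; use stdout-json for the HTTP (legacy) method
EOF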

Ship Traces to SAP Cloud Logging

You can set up ingestion of distributed traces from applications and the Istio service mesh to the OTLP endpoint of the SAP Cloud Logging service instance.

Procedure

Set Up Traces

  1. Deploy a TracePipeline:

    Script: Distributed Traces
    bash
    kubectl apply -f - <<EOF
    apiVersion: telemetry.kyma-project.io/v1alpha1
    kind: TracePipeline
    metadata:
      name: sap-cloud-logging
    spec:
      output:
        otlp:
          endpoint:
            valueFrom:
              secretKeyRef:
                name: sap-cloud-logging
                namespace: sap-cloud-logging-integration
                key: ingest-otlp-endpoint
          tls:
            cert:
              valueFrom:
                secretKeyRef:
                  name: sap-cloud-logging
                  namespace: sap-cloud-logging-integration
                  key: ingest-otlp-cert
            key:
              valueFrom:
                secretKeyRef:
                  name: sap-cloud-logging
                  namespace: sap-cloud-logging-integration
                  key: ingest-otlp-key
    EOF
  2. Verify that the TracePipeline is running:

bash
kubectl get tracepipelines

Set Up Istio Tracing

By default, Istio sidecar injection and Istio tracing are disabled in Kyma. To collect and analyze trace data from the service mesh, enable both:

  1. Enable Istio sidecar injection for your workload (see Enabling Istio Sidecar Proxy Injection).
  2. Configure the Istio Telemetry resource to use the kyma-traces extension provider based on OTLP (see Configure Istio Tracing).
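
For orientation, a mesh-wide sketch of such a resource could look as follows. The resource name and the sampling percentage are illustrative assumptions, and depending on your Istio version the API group may be telemetry.istio.io/v1alpha1; see Configure Istio Tracing for the recommended settings.

bash
kubectl apply -f - <<EOF
apiVersion: telemetry.istio.io/v1
kind: Telemetry
metadata:
  name: tracing-config            # hypothetical name
  namespace: istio-system         # applies mesh-wide
spec:
  tracing:
    - providers:
        - name: kyma-traces
      randomSamplingPercentage: 1.0   # example value; tune for your traffic volume
EOF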

Ship Metrics to SAP Cloud Logging

You can set up ingestion of metrics from applications and the Istio service mesh to the OTLP endpoint of the SAP Cloud Logging service instance.

Procedure

  1. Deploy a MetricPipeline:

    Script: SAP Cloud Logging
    bash
    kubectl apply -f - <<EOF
    apiVersion: telemetry.kyma-project.io/v1alpha1
    kind: MetricPipeline
    metadata:
      name: sap-cloud-logging
    spec:
      input:
        prometheus:
          enabled: false
        istio:
          enabled: false
        runtime:
          enabled: false
      output:
        otlp:
          endpoint:
            valueFrom:
              secretKeyRef:
                name: sap-cloud-logging
                namespace: sap-cloud-logging-integration
                key: ingest-otlp-endpoint
          tls:
            cert:
              valueFrom:
                secretKeyRef:
                  name: sap-cloud-logging
                  namespace: sap-cloud-logging-integration
                  key: ingest-otlp-cert
            key:
              valueFrom:
                secretKeyRef:
                  name: sap-cloud-logging
                  namespace: sap-cloud-logging-integration
                  key: ingest-otlp-key
    EOF

    The default configuration creates a gateway to receive OTLP metrics from your applications.

  2. Optional: To collect additional metrics, such as those from the runtime or Istio, configure the presets in the input section of the MetricPipeline (see the fragment after this procedure). For the available options, see Configure Metrics Collection.

  3. Verify that the MetricPipeline is running:

bash
kubectl get metricpipelines
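
For example, to also collect runtime, Istio, and Prometheus-based metrics, switch the corresponding presets to true in the input section of the MetricPipeline shown above. The following fragment only illustrates the shape of that section:

yaml
spec:
  input:
    prometheus:
      enabled: true   # metrics from workloads annotated for Prometheus scraping
    istio:
      enabled: true   # Istio service mesh metrics
    runtime:
      enabled: true   # container and Pod runtime metrics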

Set Up Kyma Dashboard Integration

To add direct links from Kyma dashboard to SAP Cloud Logging, apply the ConfigMap that corresponds to your chosen log shipping method.

Context

Depending on the output you use in your LogPipeline, apply the corresponding ConfigMap. If your Secret has a different name or namespace, download the file first and adjust the name and namespace accordingly in the dataSources section of the file.

Procedure

  1. If your Secret has a name or namespace different from the example, download the file and edit the dataSources section before you apply it (see the sketch after this procedure).
  2. Apply the ConfigMap:
    • For OTLP, run:

      bash
      kubectl apply -f https://raw.githubusercontent.com/kyma-project/telemetry-manager/main/docs/user/integration/sap-cloud-logging/kyma-dashboard-configmap.yaml
    • For HTTP, run:

      bash
      kubectl apply -f https://raw.githubusercontent.com/kyma-project/telemetry-manager/main/docs/user/integration/sap-cloud-logging/kyma-dashboard-http-configmap.yaml
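
If you need to adjust the Secret reference, a possible flow is to download the file, edit it locally, and apply the local copy (shown here for the OTLP variant; use the HTTP URL from the previous step otherwise):

bash
# Download the ConfigMap, adjust the dataSources section, then apply the local copy
curl -fsSLO https://raw.githubusercontent.com/kyma-project/telemetry-manager/main/docs/user/integration/sap-cloud-logging/kyma-dashboard-configmap.yaml
# Edit kyma-dashboard-configmap.yaml: set your Secret's name and namespace in the dataSources section
kubectl apply -f kyma-dashboard-configmap.yaml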

Use SAP Cloud Logging Alerts

You can import predefined alerts for SAP Cloud Logging to monitor the health of your telemetry integration.

Procedure

  1. In the SAP Cloud Logging dashboard, define a “notification channel” to receive alert notifications.
  2. To import a monitor, use the development tools of the SAP Cloud Logging dashboard.
  3. Execute POST _plugins/_alerting/monitors, followed by the contents of the respective JSON file (see the pattern after this procedure).
  4. Depending on the pipelines you use, enable some or all of the following alerts. Each alert is a JSON document that defines a Monitor for the alerting plugin:
    • alert-health.json (SAP Cloud Logging): Monitors the health of the underlying OpenSearch cluster in SAP Cloud Logging using the cluster health API. Triggers if the cluster status becomes red.
    • alert-rejection-in-progress.json (SAP Cloud Logging): Monitors the cls-rejected-* index for new data. Triggers if new rejected data is observed.
    • alert-telemetry-status.json (Kyma Telemetry Integration): Monitors the status of the Telemetry module. Triggers if the module reports a non-ready state.
    • alert-log-ingestion.json (Kyma Telemetry Integration): For OTLP: Monitors the single LogPipeline used in the OTLP method. Triggers if log data stops flowing.
    • alert-app-log-ingestion.json (Kyma Telemetry Integration): For HTTP (legacy): Monitors the application log LogPipeline. Triggers if log data stops flowing.
    • alert-access-log-ingestion.json (Kyma Telemetry Integration): For HTTP (legacy): Monitors the Istio access log LogPipeline. Triggers if log data stops flowing.
    • alert-trace-ingestion.json (Kyma Telemetry Integration): Monitors the TracePipeline. Triggers if trace data stops flowing to SAP Cloud Logging.
    • alert-metric-ingestion.json (Kyma Telemetry Integration): Monitors the MetricPipeline. Triggers if metric data stops flowing to SAP Cloud Logging.
  5. After importing, edit the monitor to attach your notification channel or destination and adjust thresholds as needed.
  6. Verify that the new monitor definition is listed among the SAP Cloud Logging alerts.
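
For orientation, an import in the development tools follows this pattern; the request body is the unchanged content of the chosen JSON file (alert-health.json is only an example):

POST _plugins/_alerting/monitors
<content of alert-health.json>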

Use SAP Cloud Logging Dashboards

You can view logs, traces, and metrics in SAP Cloud Logging dashboards. Several dashboards come with SAP Cloud Logging, and you can import additional dashboards as needed.

Context

The preconfigured Kyma_* dashboards in SAP Cloud Logging are compatible only with the legacy (HTTP) logging method.

Procedure

  • For the status of the SAP Cloud Logging integration with the Telemetry module, import the file dashboard-status.ndjson.
  • For application logs and Istio access logs shipped with the http output, use the preconfigured dashboards prefixed with Kyma_*.
  • For traces, use the OpenSearch plugin “Observability”.
  • For runtime metrics, import the file dashboard-runtime.ndjson.
  • For Istio Pod metrics, import the file dashboard-istio.ndjson.