Integrations Cleanup and Updates (#1197)

* Remove extra lines in app-o11y notes

* Added namespaces property

The namespaces property is already supported by the ksm helm chart, but wasn't documented in the k8s chart
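
A rough sketch of how that looks in the feature-cluster-metrics values (only the `namespaces` key comes from this change; the example namespaces are illustrative):

```yaml
kube-state-metrics:
  # Collect resources only from these namespaces (comma-separated string,
  # per the schema in this change); leave empty to collect from all namespaces.
  namespaces: "monitoring,production"
```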

* Drop debug logs by default
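
To keep debug lines for a given integration instance, the new default can be set back to an empty list. A hedged sketch using the per-instance `logs.tuning` keys from this change (the `loki.instances` nesting mirrors the integration templates here and may sit one level deeper in the umbrella chart):

```yaml
loki:
  instances:
    - name: loki
      labelSelectors:
        app.kubernetes.io/name: loki
      logs:
        tuning:
          # The default is now ["debug"]; an empty list keeps every log level.
          dropLogLevels: []
```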

* fixed scrubTimestamp bug

Ensured that only a leading or trailing space is removed when the timestamp is scrubbed
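
Both knobs live under the per-instance `logs.tuning` block shown in this diff; a minimal sketch:

```yaml
logs:
  tuning:
    # Remove the extracted timestamp (and only its adjacent leading/trailing
    # whitespace) from the stored log line.
    scrubTimestamp: true
    timestampFormat: RFC3339Nano
```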

* Update ts format

* Documentation

* Set default allow list to null
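
The allow list can still be tuned per instance; a sketch built from the `metrics.tuning` keys that appear in this diff (the extra metric name is purely illustrative):

```yaml
grafana:
  instances:
    - name: grafana
      labelSelectors:
        app.kubernetes.io/name: grafana
      metrics:
        tuning:
          # Keep the curated default allow list for this integration...
          useDefaultAllowList: true
          # ...plus any extra metrics named here.
          includeMetrics:
            - grafana_http_request_duration_seconds_count
          excludeMetrics: []
```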

* Updated Meta-Monitoring Example

* Updates

* Updated Tests and Rebuilt

* Update charts/k8s-monitoring/charts/feature-cluster-metrics/values.yaml

Co-authored-by: Pete Wall <pete.wall@grafana.com>

* Fix a few things: (#1200)

* Fix a few things:
* Pod Logs annotations and labels
* Turn off authentication for Grafana in integration tests
* Fix validation messages and check for pod logs

Signed-off-by: Pete Wall <pete.wall@grafana.com>

* Fix test

Signed-off-by: Pete Wall <pete.wall@grafana.com>

---------

Signed-off-by: Pete Wall <pete.wall@grafana.com>

* Added otlp-gateway as part of validation check (#1202)

* Added otlp-gateway as part of validation check

* updated check to use regex instead of contains

* added tempo checks

* Fixed metrics/logs messaging

* Bump v2 version to 2.0.7

Signed-off-by: Robbie Lankford <robert.lankford@grafana.com>
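
For context, the kind of destination configuration these checks target looks roughly like this (a sketch; field names other than `otlp` are my reading of the chart's destination options, and the URLs are illustrative Grafana Cloud endpoints):

```yaml
destinations:
  - name: grafana-cloud-otlp
    type: otlp
    protocol: http
    # OTLP gateway endpoint, now recognized by the Grafana Cloud validation check.
    url: https://otlp-gateway-prod-us-east-0.grafana.net/otlp
  - name: grafana-cloud-traces
    type: otlp
    protocol: grpc
    # Hosted Tempo endpoint, covered by the added tempo checks.
    url: https://tempo-prod-04-prod-us-east-0.grafana.net:443
```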

* Add unit tests for the new Grafana cloud validators. (#1204)

utilize dig that handles undefined fields

Signed-off-by: Pete Wall <pete.wall@grafana.com>

* Bump v2 version to 2.0.8

Signed-off-by: Robbie Lankford <robert.lankford@grafana.com>

* Update dependency "kepler" for Helm chart "feature-cluster-metrics" to 0.5.13 (#1194)

Co-authored-by: petewall <petewall@users.noreply.github.com>

* Update dependency "kepler" for Helm chart "k8s-monitoring-v1" to 0.5.13 (#1195)

Co-authored-by: petewall <petewall@users.noreply.github.com>

* add application-observability platform test (#1206)

* add application-observability platform test

---------

Signed-off-by: Robbie Lankford <robert.lankford@grafana.com>

* Added Tempo Integration (#1168)

* Initial commit for tempo integration.

* Adding in tempo integration tests

* Update to default allow list for metrics
Don't have the default allow list currently, so blanking it out while I compile it.

* Adding default metric allow list

* Fix to tempo integration test

* Add in tempo single binary deployment for tests

* removing test values file

* updating generated files

* Fix to test for keep_metrics with tempo integration

* Updating port name for metrics collection in tempo integration test.

* updating output.yaml for tempo integration test

* Test fix - port name for tempo-monolith is different than tempo distributed
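
A minimal sketch of enabling the new integration, modeled on the grafana/loki instance values shown later in this diff (the selector value and metrics tuning are assumptions):

```yaml
tempo:
  instances:
    - name: tempo
      labelSelectors:
        app.kubernetes.io/name: tempo
      metrics:
        tuning:
          # Use the default metric allow list added in this change.
          useDefaultAllowList: true
      logs:
        enabled: true
```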

* Add validation for otlp destination protocol (#1212)

* Add validation for otlp destination protocol

Signed-off-by: Pete Wall <pete.wall@grafana.com>

* Update charts/k8s-monitoring/tests/destination_validations_test.yaml

Co-authored-by: Robert Lankford <rlankfo@gmail.com>

* Actually test the right error message

Signed-off-by: Pete Wall <pete.wall@grafana.com>

---------

Signed-off-by: Pete Wall <pete.wall@grafana.com>
Co-authored-by: Robert Lankford <rlankfo@gmail.com>
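
A rough idea of what such a helm-unittest case can look like (a sketch only; the template name, values, and assertion details are assumptions, not the actual contents of destination_validations_test.yaml):

```yaml
suite: Test destination validations
templates:
  - alloy-config.yaml  # hypothetical template name
tests:
  - it: fails when an OTLP destination uses an unsupported protocol
    set:
      cluster:
        name: test-cluster
      destinations:
        - name: otlp-endpoint
          type: otlp
          protocol: ftp  # only grpc or http should pass validation
          url: https://example.com/otlp
    asserts:
      - failedTemplate: {}  # rendering should fail with a validation error
```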

* Actually implement proxy URL for loki destinations (#1215)

Signed-off-by: Pete Wall <pete.wall@grafana.com>
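
A hedged example of where this lands in the values (the `proxyURL` key name is my reading of the destination options and should be checked against the chart docs; the other fields are illustrative):

```yaml
destinations:
  - name: loki
    type: loki
    url: https://loki.example.com/loki/api/v1/push
    # Route log writes through an HTTP proxy; before this fix the value was
    # accepted in values but not wired into the generated Alloy config.
    proxyURL: http://proxy.internal:3128
```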

* Add the ability to set an additional service for the receiver (#1213)

Signed-off-by: Pete Wall <pete.wall@grafana.com>

* adjust default batch size for feature-application-observability (#1205)

* update default batch size in feature-application-observability

Signed-off-by: Robbie Lankford <robert.lankford@grafana.com>

* update tests and generated files

---------

Signed-off-by: Robbie Lankford <robert.lankford@grafana.com>
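
If the new default does not fit a workload, the batch processor can be tuned per release; a sketch assuming the feature's `processors.batch` keys (the numbers are illustrative overrides, not the new defaults):

```yaml
applicationObservability:
  enabled: true
  processors:
    batch:
      # Illustrative override; tune to the telemetry volume of the cluster.
      size: 8192
      maxSize: 0
      timeout: 2s
```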

* make build (#1216)

Signed-off-by: Robbie Lankford <robert.lankford@grafana.com>

* Make it possible to skip mysql logs integration (#1218)

* Make it possible to skip mysql logs integration

Signed-off-by: Pete Wall <pete.wall@grafana.com>

* Catch more instances

Signed-off-by: Pete Wall <pete.wall@grafana.com>

---------

Signed-off-by: Pete Wall <pete.wall@grafana.com>
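
In values form that looks something like the sketch below (fields other than `logs.enabled` are assumptions modeled on the other integrations):

```yaml
mysql:
  instances:
    - name: prod-db
      labelSelectors:
        app.kubernetes.io/name: mysql
      logs:
        # Opt this instance out of the MySQL logs integration.
        enabled: false
```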

* Fix Pod log annotation and label assignment (#1222)

Signed-off-by: Pete Wall <pete.wall@grafana.com>
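
For reference, the Pod Logs feature lets you map pod labels and annotations onto the resulting log streams; a sketch (the key orientation and the annotation name are assumptions, not taken from this diff):

```yaml
podLogs:
  enabled: true
  # Assumed mapping: log label name -> pod label / annotation to copy from.
  labels:
    app_kubernetes_io_name: app.kubernetes.io/name
  annotations:
    log_format: logs.grafana.com/format
```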

* Bump versions to 1.6.24 and 2.0.9

Signed-off-by: Pete Wall <pete.wall@grafana.com>

* Updates

* Fixed integration bug not merging booleans

* WIP

* Fixed Deep Copy / Merge of Loki Values and added Tests

* Fixed Deep Copy / Merge of Mimir Values and added Tests

* Fixed Deep Copy / Merge of Grafana Values and added Tests

* Fixed Deep Copy / Merge of Tempo Values and added Tests

* Rebuilt

* Fix a few things: (#1200)

* Fix a few things:
* Pod Logs annotations and labels
* Turn off authentication for Grafana in integration tests
* Fix validation messages and check for pod logs

Signed-off-by: Pete Wall <pete.wall@grafana.com>

* Fix test

Signed-off-by: Pete Wall <pete.wall@grafana.com>

---------

Signed-off-by: Pete Wall <pete.wall@grafana.com>

* Only run these workflows on weekday mornings

Signed-off-by: Pete Wall <pete.wall@grafana.com>
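
For reference, a weekday-morning trigger in a GitHub Actions workflow looks roughly like this (the exact hour is an assumption, not necessarily what these workflows use):

```yaml
on:
  schedule:
    # 06:00 UTC, Monday through Friday.
    - cron: "0 6 * * 1-5"
```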

* Updated Tests and Rebuilt

* Rebuilt

* Rebuilt

* Fixed Tests

---------

Signed-off-by: Pete Wall <pete.wall@grafana.com>
Signed-off-by: Robbie Lankford <robert.lankford@grafana.com>
Co-authored-by: Pete Wall <pete.wall@grafana.com>
Co-authored-by: Robbie Lankford <robert.lankford@grafana.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: petewall <petewall@users.noreply.github.com>
Co-authored-by: Robert Lankford <rlankfo@gmail.com>
Co-authored-by: Sheldon Jackson <5133739+Imshelledin21@users.noreply.github.com>
7 people authored Feb 12, 2025
1 parent f13f348 commit 2bd0c7c
Showing 55 changed files with 691 additions and 234 deletions.
@@ -17,10 +17,10 @@ Gather application data via {{ include "english_list" $receivers }} {{ $receiver
Configure your applications to send telemetry data to:
{{- if .Values.receivers.otlp.grpc.enabled }}
* http://{{ .Collector.ServiceName }}.{{ .Collector.Namespace }}.svc.cluster.local:{{ .Values.receivers.otlp.grpc.port }} (OTLP gRPC)
{{ end }}
{{- end }}
{{- if .Values.receivers.otlp.http.enabled }}
* http://{{ .Collector.ServiceName }}.{{ .Collector.Namespace }}.svc.cluster.local:{{ .Values.receivers.otlp.http.port }} (OTLP HTTP)
{{ end }}
{{- end }}
{{- if .Values.receivers.jaeger.grpc.enabled }}
* http://{{ .Collector.ServiceName }}.{{ .Collector.Namespace }}.svc.cluster.local:{{ .Values.receivers.jaeger.grpc.port }} (Jaeger gRPC)
{{- end }}
@@ -35,7 +35,7 @@ Configure your applications to send telemetry data to:
{{- end }}
{{- if .Values.receivers.zipkin.enabled }}
* http://{{ .Collector.ServiceName }}.{{ .Collector.Namespace }}.svc.cluster.local:{{ .Values.receivers.zipkin.port }} (Zipkin)
{{ end }}
{{- end }}
{{- end }}

{{- define "feature.applicationObservability.summary" -}}
@@ -203,6 +203,7 @@ Be sure perform actual integration testing in a live environment in the main [k8s
| kube-state-metrics.metricsTuning.includeMetrics | list | `[]` | Metrics to keep. Can use regular expressions. |
| kube-state-metrics.metricsTuning.useDefaultAllowList | bool | `true` | Filter the list of metrics from Kube State Metrics to a useful, minimal set. |
| kube-state-metrics.namespace | string | `""` | Namespace to locate kube-state-metrics pods. If `deploy` is set to `true`, this will automatically be set to the namespace where this Helm chart is deployed. |
| kube-state-metrics.namespaces | string | `""` | Comma-separated list(string) or yaml list of namespaces to be enabled for collecting resources. By default all namespaces are collected. |
| kube-state-metrics.scrapeInterval | string | `60s` | How frequently to scrape kube-state-metrics metrics. |
| kube-state-metrics.service.portName | string | `"http"` | The port name used by kube-state-metrics. |
| kube-state-metrics.service.scheme | string | `"http"` | The scrape scheme used by kube-state-metrics. |
@@ -323,6 +323,9 @@
"namespace": {
"type": "string"
},
"namespaces": {
"type": "string"
},
"nodeSelector": {
"type": "object",
"properties": {
@@ -434,6 +434,10 @@ kube-state-metrics:
# @section -- kube-state-metrics
namespace: ""

# -- Comma-separated list(string) or yaml list of namespaces to be enabled for collecting resources. By default all namespaces are collected.
# @section -- kube-state-metrics
namespaces: ""

# -- Rule blocks to be added to the discovery.relabel component for kube-state-metrics.
# These relabeling rules are applied pre-scrape against the targets from service discovery.
# Before the scrape, any remaining target labels that start with __ (i.e. __meta_kubernetes*) are dropped.
@@ -23,7 +23,7 @@
| Key | Type | Default | Description |
|-----|------|---------|-------------|
| logs.enabled | bool | `true` | Whether to enable special processing of Grafana pod logs. |
| logs.tuning.dropLogLevels | list | `[]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.dropLogLevels | list | `["debug"]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.excludeLines | list | `[]` | Line patterns (valid RE2 regular expression)to exclude from the logs. |
| logs.tuning.scrubTimestamp | bool | `true` | Whether the timestamp should be scrubbed from the log line |
| logs.tuning.structuredMetadata | object | `{}` | The structured metadata mappings to set. To not set any structured metadata, set this to an empty object (e.g. `{}`) |
@@ -16,7 +16,7 @@
| Key | Type | Default | Description |
|-----|------|---------|-------------|
| logs.enabled | bool | `true` | Whether to enable special processing of Loki pod logs. |
| logs.tuning.dropLogLevels | list | `[]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.dropLogLevels | list | `["debug"]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.excludeLines | list | `[]` | Line patterns (valid RE2 regular expression)to exclude from the logs. |
| logs.tuning.scrubTimestamp | bool | `true` | Whether the timestamp should be scrubbed from the log line |
| logs.tuning.structuredMetadata | object | `{}` | The structured metadata mappings to set. To not set any structured metadata, set this to an empty object (e.g. `{}`) |
@@ -15,7 +15,7 @@
| Key | Type | Default | Description |
|-----|------|---------|-------------|
| logs.enabled | bool | `true` | Whether to enable special processing of Mimir pod logs. |
| logs.tuning.dropLogLevels | list | `[]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.dropLogLevels | list | `["debug"]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.excludeLines | list | `[]` | Line patterns (valid RE2 regular expression)to exclude from the logs. |
| logs.tuning.scrubTimestamp | bool | `true` | Whether the timestamp should be scrubbed from the log line |
| logs.tuning.structuredMetadata | object | `{}` | The structured metadata mappings to set. To not set any structured metadata, set this to an empty object (e.g. `{}`) |
@@ -16,7 +16,7 @@
| Key | Type | Default | Description |
|-----|------|---------|-------------|
| logs.enabled | bool | `true` | Whether to enable special processing of Tempo pod logs. |
| logs.tuning.dropLogLevels | list | `[]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.dropLogLevels | list | `["debug"]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.excludeLines | list | `[]` | Line patterns (valid RE2 regular expression)to exclude from the logs. |
| logs.tuning.scrubTimestamp | bool | `true` | Whether the timestamp should be scrubbed from the log line |
| logs.tuning.structuredMetadata | object | `{}` | The structured metadata mappings to set. To not set any structured metadata, set this to an empty object (e.g. `{}`) |
@@ -69,7 +69,7 @@ logs:
# -- The timestamp format to use for the log line, if not set the default timestamp which is the collection
# will be used for the log line
# @section -- Logs Settings
timestampFormat: "RFC3339Nano"
timestampFormat: RFC3339Nano

# -- Whether the timestamp should be scrubbed from the log line
# @section -- Logs Settings
@@ -78,7 +78,8 @@ logs:
# -- The log levels to drop.
# Will automatically keep all log levels unless specified here.
# @section -- Logs Settings
dropLogLevels: []
dropLogLevels:
- debug

# -- Line patterns (valid RE2 regular expression)to exclude from the logs.
# @section -- Logs Settings
@@ -68,7 +68,7 @@ logs:
# -- The timestamp format to use for the log line, if not set the default timestamp which is the collection
# will be used for the log line
# @section -- Logs Settings
timestampFormat: "RFC3339Nano"
timestampFormat: RFC3339Nano

# -- Whether the timestamp should be scrubbed from the log line
# @section -- Logs Settings
@@ -77,7 +77,8 @@ logs:
# -- The log levels to drop.
# Will automatically keep all log levels unless specified here.
# @section -- Logs Settings
dropLogLevels: []
dropLogLevels:
- debug

# -- Line patterns (valid RE2 regular expression)to exclude from the logs.
# @section -- Logs Settings
@@ -68,7 +68,7 @@ logs:
# -- The timestamp format to use for the log line, if not set the default timestamp which is the collection
# will be used for the log line
# @section -- Logs Settings
timestampFormat: "RFC3339Nano"
timestampFormat: RFC3339Nano

# -- Whether the timestamp should be scrubbed from the log line
# @section -- Logs Settings
@@ -77,7 +77,8 @@ logs:
# -- The log levels to drop.
# Will automatically keep all log levels unless specified here.
# @section -- Logs Settings
dropLogLevels: []
dropLogLevels:
- debug

# -- Line patterns (valid RE2 regular expression)to exclude from the logs.
# @section -- Logs Settings
@@ -68,7 +68,7 @@ logs:
# -- The timestamp format to use for the log line, if not set the default timestamp which is the collection
# will be used for the log line
# @section -- Logs Settings
timestampFormat: "RFC3339Nano"
timestampFormat: RFC3339Nano

# -- Whether the timestamp should be scrubbed from the log line
# @section -- Logs Settings
@@ -77,7 +77,8 @@ logs:
# -- The log levels to drop.
# Will automatically keep all log levels unless specified here.
# @section -- Logs Settings
dropLogLevels: []
dropLogLevels:
- debug

# -- Line patterns (valid RE2 regular expression)to exclude from the logs.
# @section -- Logs Settings
@@ -25,7 +25,10 @@
"type": "object",
"properties": {
"dropLogLevels": {
"type": "array"
"type": "array",
"items": {
"type": "string"
}
},
"excludeLines": {
"type": "array"
@@ -25,7 +25,10 @@
"type": "object",
"properties": {
"dropLogLevels": {
"type": "array"
"type": "array",
"items": {
"type": "string"
}
},
"excludeLines": {
"type": "array"
@@ -22,7 +22,10 @@
"type": "object",
"properties": {
"dropLogLevels": {
"type": "array"
"type": "array",
"items": {
"type": "string"
}
},
"excludeLines": {
"type": "array"
@@ -25,7 +25,10 @@
"type": "object",
"properties": {
"dropLogLevels": {
"type": "array"
"type": "array",
"items": {
"type": "string"
}
},
"excludeLines": {
"type": "array"
@@ -3,8 +3,8 @@
{{- $defaultValues := "integrations/grafana-values.yaml" | .Files.Get | fromYaml }}
{{- $logsEnabled := false }}
{{- range $instance := .Values.grafana.instances }}
{{- with merge $instance $defaultValues (dict "type" "integration.grafana") }}
{{- $logsEnabled = or $logsEnabled $instance.logs.enabled }}
{{- with merge (deepCopy $defaultValues) (deepCopy $instance) (dict "type" "integration.grafana") }}
{{- $logsEnabled = or $logsEnabled .logs.enabled }}
{{- end }}
{{- end }}
{{- $logsEnabled -}}
@@ -13,7 +13,7 @@
{{- define "integrations.grafana.logs.discoveryRules" }}
{{- $defaultValues := "integrations/grafana-values.yaml" | .Files.Get | fromYaml }}
{{- range $instance := $.Values.grafana.instances }}
{{- with mergeOverwrite $defaultValues (deepCopy $instance) }}
{{- with $defaultValues | merge (deepCopy $instance) }}
{{- if .logs.enabled }}
{{- $labelList := list }}
{{- $valueList := list }}
@@ -52,9 +52,9 @@ rule {
{{- define "integrations.grafana.logs.processingStage" }}
{{- if eq (include "integrations.grafana.type.logs" .) "true" }}
{{- $defaultValues := "integrations/grafana-values.yaml" | .Files.Get | fromYaml }}
// Integration: Loki
// Integration: Grafana
{{- range $instance := $.Values.grafana.instances }}
{{- with mergeOverwrite $defaultValues (deepCopy $instance) }}
{{- with $defaultValues | merge (deepCopy $instance) }}
{{- if .logs.enabled }}
stage.match {
{{- if $instance.namespaces }}
@@ -66,10 +66,8 @@ stage.match {
// extract some of the fields from the log line
stage.logfmt {
mapping = {
"timestamp" = "t",
"ts" = "t",
"level" = "",
"logger" = "",
"type" = "",
{{- range $key, $value := .logs.tuning.structuredMetadata }}
{{ $key | quote }} = {{ if $value }}{{ $value | quote }}{{ else }}{{ $key | quote }}{{ end }},
{{- end }}
@@ -86,22 +84,31 @@ stage.match {
{{- if .logs.tuning.timestampFormat }}
// reset the timestamp to the extracted value
stage.timestamp {
source = "timestamp"
source = "ts"
format = {{ .logs.tuning.timestampFormat | quote }}
}
{{- end }}

{{- if .logs.tuning.scrubTimestamp }}
// remove the timestamp from the log line
stage.replace {
expression = "( t=[^ ]+\\s+)"
expression = `(?:^|\s+)(t=\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+[^ ]*\s+)`
replace = ""
}
{{- end }}

{{- if hasKey .logs.tuning.structuredMetadata "caller" }}
// clean up the caller to remove the line
stage.replace {
source = "caller"
expression = "(:[0-9]+$)"
replace = ""
}
{{- end }}

{{- /* the stage.structured_metadata block needs to be conditionalized because the support for enabling structured metadata can be disabled */ -}}
{{- /* through the grafana limits_conifg on a per-tenant basis, even if there are no values defined or there are values defined but it is disabled */ -}}
{{- /* in Loki, the write will fail. */ -}}
{{- /* in Grafana, the write will fail. */ -}}
{{- if gt (len .logs.tuning.structuredMetadata) 0 }}
// set the structured metadata values
stage.structured_metadata {
@@ -130,6 +137,7 @@ stage.match {
drop_counter_reason = "grafana-exclude-line"
}
{{- end }}

}
{{- end }}
{{- end }}
@@ -4,8 +4,11 @@
{{/* Inputs: instance (grafana integration instance) Files (Files object) */}}
{{- define "integrations.grafana.allowList" }}
{{- $allowList := list -}}
{{- if .instance.metrics.tuning.useDefaultAllowList -}}
{{- $allowList = concat $allowList (list "up" "scrape_samples_scraped") (.Files.Get "default-allow-lists/grafana.yaml" | fromYamlArray) -}}
{{- end -}}
{{- if .instance.metrics.tuning.includeMetrics -}}
{{- $allowList = concat $allowList .instance.metrics.tuning.includeMetrics -}}
{{- $allowList = concat $allowList (list "up" "scrape_samples_scraped") .instance.metrics.tuning.includeMetrics -}}
{{- end -}}
{{ $allowList | uniq | toYaml }}
{{- end -}}
@@ -89,7 +92,7 @@ declare "grafana_integration" {
}

argument "job_label" {
comment = "The job label to add for all Loki metrics (default: integrations/grafana)"
comment = "The job label to add for all Grafana metrics (default: integrations/grafana)"
optional = true
}

@@ -137,9 +140,27 @@ declare "grafana_integration" {
// drop metrics that match the drop_metrics regex
rule {
source_labels = ["__name__"]
regex = coalesce(argument.drop_metrics.value, "(^(go|process)_.+$)")
regex = coalesce(argument.drop_metrics.value, "")
action = "drop"
}

// keep only metrics that match the keep_metrics regex
rule {
source_labels = ["__name__"]
regex = coalesce(argument.keep_metrics.value, "(.+)")
action = "keep"
}

// the grafana-mixin expects the instance label to be the node name
rule {
source_labels = ["node"]
target_label = "instance"
replacement = "$1"
}
rule {
action = "labeldrop"
regex = "node"
}
}
}
{{- range $instance := $.Values.grafana.instances }}
@@ -151,10 +172,10 @@
{{/* Instantiates the grafana integration */}}
{{/* Inputs: integration (grafana integration definition), Values (all values), Files (Files object) */}}
{{- define "integrations.grafana.include.metrics" }}
{{- $defaultValues := "integrations/grafana-values.yaml" | .Files.Get | fromYaml }}
{{- with mergeOverwrite $defaultValues (deepCopy .instance) }}
{{- $defaultValues := fromYaml (.Files.Get "integrations/grafana-values.yaml") }}
{{- with mergeOverwrite $defaultValues .instance (dict "type" "integration.grafana") }}
{{- $metricAllowList := include "integrations.grafana.allowList" (dict "instance" . "Files" $.Files) | fromYamlArray }}
{{- $metricDenyList := .excludeMetrics }}
{{- $metricDenyList := .metrics.tuning.excludeMetrics }}
{{- $labelSelectors := list }}
{{- range $k, $v := .labelSelectors }}
{{- if kindIs "slice" $v }}
Expand All @@ -174,7 +195,7 @@ grafana_integration_discovery {{ include "helper.alloy_name" .name | quote }} {

grafana_integration_scrape {{ include "helper.alloy_name" .name | quote }} {
targets = grafana_integration_discovery.{{ include "helper.alloy_name" .name }}.output
job_label = {{ .jobLabel | quote }}
job_label = "integrations/grafana"
clustering = true
{{- if $metricAllowList }}
keep_metrics = {{ $metricAllowList | join "|" | quote }}
@@ -3,8 +3,8 @@
{{- $defaultValues := "integrations/loki-values.yaml" | .Files.Get | fromYaml }}
{{- $logsEnabled := false }}
{{- range $instance := .Values.loki.instances }}
{{- with merge $instance $defaultValues (dict "type" "integration.loki") }}
{{- $logsEnabled = or $logsEnabled $instance.logs.enabled }}
{{- with merge (deepCopy $instance) (deepCopy $defaultValues) (dict "type" "integration.loki") }}
{{- $logsEnabled = or $logsEnabled .logs.enabled }}
{{- end }}
{{- end }}
{{- $logsEnabled -}}
@@ -105,7 +105,7 @@ stage.match {
{{- if .logs.tuning.scrubTimestamp }}
// remove the timestamp from the log line
stage.replace {
expression = "(ts=[^ ]+\\s+)"
expression = `(?:^|\s+)(ts=\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+[^ ]*\s+)`
replace = ""
}
{{- end }}