
Works with debug, but not something like googlecloud #2

@iancote-ionq

Description


Hello! Thank you for working on this; it's exactly what I need.

When I try to export to the googlecloud exporter, I get errors like this:

2024-11-12T01:11:33.994Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 1, "metrics": 6, "data points": 8}
2024-11-12T01:11:33.995Z	info	ResourceMetrics #0
Resource SchemaURL: 
Resource attributes:
     -> service.name: Str(test)
     -> service.instance.id: Str(more_test)
ScopeMetrics #0
ScopeMetrics SchemaURL: 
InstrumentationScope  
Metric #0
Descriptor:
     -> Name: ping.rtt
     -> Description: 
     -> Unit: ms
     -> DataType: Gauge
NumberDataPoints #0
Data point attributes:
     -> net.peer.ip: Str(8.8.8.8)
     -> net.peer.name: Str(8.8.8.8)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2024-11-12 01:11:31.951675877 +0000 UTC
Value: 1.404978
NumberDataPoints #1
Data point attributes:
     -> net.peer.ip: Str(8.8.8.8)
     -> net.peer.name: Str(8.8.8.8)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2024-11-12 01:11:32.952326551 +0000 UTC
Value: 1.439537
NumberDataPoints #2
Data point attributes:
     -> net.peer.ip: Str(8.8.8.8)
     -> net.peer.name: Str(8.8.8.8)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2024-11-12 01:11:33.95269958 +0000 UTC
Value: 1.412412
Metric #1
Descriptor:
     -> Name: ping.rtt.min
     -> Description: 
     -> Unit: ms
     -> DataType: Gauge
NumberDataPoints #0
Data point attributes:
     -> net.peer.ip: Str(8.8.8.8)
     -> net.peer.name: Str(8.8.8.8)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2024-11-12 01:11:33.994490837 +0000 UTC
Value: 1.404978
Metric #2
Descriptor:
     -> Name: ping.rtt.max
     -> Description: 
     -> Unit: ms
     -> DataType: Gauge
NumberDataPoints #0
Data point attributes:
     -> net.peer.ip: Str(8.8.8.8)
     -> net.peer.name: Str(8.8.8.8)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2024-11-12 01:11:33.994490837 +0000 UTC
Value: 1.439537
Metric #3
Descriptor:
     -> Name: ping.rtt.avg
     -> Description: 
     -> Unit: ms
     -> DataType: Gauge
NumberDataPoints #0
Data point attributes:
     -> net.peer.ip: Str(8.8.8.8)
     -> net.peer.name: Str(8.8.8.8)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2024-11-12 01:11:33.994490837 +0000 UTC
Value: 1.418976
Metric #4
Descriptor:
     -> Name: ping.rtt.stddev
     -> Description: 
     -> Unit: ms
     -> DataType: Gauge
NumberDataPoints #0
Data point attributes:
     -> net.peer.ip: Str(8.8.8.8)
     -> net.peer.name: Str(8.8.8.8)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2024-11-12 01:11:33.994490837 +0000 UTC
Value: 0.014852
Metric #5
Descriptor:
     -> Name: ping.loss.ratio
     -> Description: 
     -> Unit: 
     -> DataType: Gauge
NumberDataPoints #0
Data point attributes:
     -> net.peer.ip: Str(8.8.8.8)
     -> net.peer.name: Str(8.8.8.8)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2024-11-12 01:11:33.994490837 +0000 UTC
Value: 0.000000
	{"kind": "exporter", "data_type": "metrics", "name": "debug"}
2024-11-12T01:11:34.075Z	error	internal/queue_sender.go:92	Exporting failed. Dropping data.	{"kind": "exporter", "data_type": "metrics", "name": "googlecloud", "error": "rpc error: code = InvalidArgument desc = One or more TimeSeries could not be written: timeSeries[2]: Field timeSeries[2] had an invalid value: Duplicate TimeSeries encountered. Only one point can be written per TimeSeries per request.; timeSeries[1]: Field timeSeries[1] had an invalid value: Duplicate TimeSeries encountered. Only one point can be written per TimeSeries per request.\nerror details: name = Unknown  desc = total_point_count:8 success_point_count:6 errors:{status:{code:3} point_count:2}", "dropped_items": 8}
go.opentelemetry.io/collector/exporter/exporterhelper/internal.NewQueueSender.func1
	go.opentelemetry.io/collector/exporter@v0.111.0/exporterhelper/internal/queue_sender.go:92
go.opentelemetry.io/collector/exporter/internal/queue.(*boundedMemoryQueue[...]).Consume
	go.opentelemetry.io/collector/exporter@v0.111.0/internal/queue/bounded_memory_queue.go:52
go.opentelemetry.io/collector/exporter/internal/queue.(*Consumers[...]).Start.func1
	go.opentelemetry.io/collector/exporter@v0.111.0/internal/queue/consumers.go:43

If I use something simple like the debug exporter instead, it works fine.

I'm configuring the collector like this (a stripped-down config for testing; the googlecloud part already works fine for a number of other metrics and logs we send):

receivers:
  icmpcheck:
    collection_interval: 30s
    targets:
      - target: 8.8.8.8

processors:
  resource:
    attributes:
      - key: service.name
        value: test
        action: upsert
      - key: service.instance.id
        value: more_test
        action: upsert

exporters:
  googlecloud:
    log:
      default_log_name: opentelemetry.io/collector-exported-log

  debug:
    verbosity: detailed

service:
  pipelines:
    metrics/test:
      receivers:
        - icmpcheck
      processors:
        - resource
      exporters:
        - googlecloud
        - debug

The metrics are showing up in GCP as expected, though, just with this log noise. Any ideas?
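In case it helps narrow things down: reading the error, ping.rtt has three data points with identical attributes in a single export batch (total_point_count:8, success_point_count:6, two rejected), and Cloud Monitoring only accepts one point per TimeSeries per write request. One thing I was thinking of trying, purely a guess on my part and untested, is the interval processor from opentelemetry-collector-contrib, which as I understand it keeps only the latest point per series per interval:

```yaml
# Untested sketch: aggregate down to at most one point per TimeSeries
# per interval before the googlecloud exporter sees the batch. Assumes
# the interval processor is compiled into this collector build.
processors:
  interval:
    interval: 30s   # match the receiver's collection_interval

service:
  pipelines:
    metrics/test:
      receivers:
        - icmpcheck
      processors:
        - resource
        - interval   # dedupe points per series before export
      exporters:
        - googlecloud
        - debug
```

No idea yet whether that interacts badly with the min/max/avg/stddev gauges, so treat it as a sketch, not a fix.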
