Data race in otel collector #2166

Open
krdln opened this issue Jun 19, 2023 · 3 comments
Labels
Agent Telemetry, bug (Something isn't working)

Comments

krdln (Contributor) commented Jun 19, 2023

Describe the bug
Data race in otel collector

Steps to reproduce
Run tilt up -- --race and wait.

What did you see instead?

==================
==================
WARNING: DATA RACE
Read at 0x00c00107da48 by goroutine 1449:
  container/list.(*List).move()
      /usr/local/go/src/container/list/list.go:122 +0x199
  container/list.(*List).MoveToFront()
      /usr/local/go/src/container/list/list.go:185 +0x159
  github.com/golang/groupcache/lru.(*Cache).Get()
      /go/pkg/mod/github.com/golang/[email protected]/lru/lru.go:79 +0xed
  github.com/open-telemetry/opentelemetry-collector-contrib/internal/filter/filterset/regexp.(*FilterSet).Matches()
      /go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/internal/[email protected]/filterset/regexp/regexpfilterset.go:47 +0xc6
  github.com/open-telemetry/opentelemetry-collector-contrib/internal/filter/filtermetric.(*nameMatcher).Eval()
      /go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/internal/[email protected]/filtermetric/name_matcher.go:35 +0x91
  github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics.func1.1.1()
      /go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/metrics.go:128 +0x15a
  go.opentelemetry.io/collector/pdata/pmetric.MetricSlice.RemoveIf()
      /go/pkg/mod/github.com/fluxninja/opentelemetry-collector/[email protected]/pmetric/generated_metricslice.go:103 +0xb1
  github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics.func1.1()
      /go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/metrics.go:126 +0xeb
  go.opentelemetry.io/collector/pdata/pmetric.ScopeMetricsSlice.RemoveIf()
      /go/pkg/mod/github.com/fluxninja/opentelemetry-collector/[email protected]/pmetric/generated_scopemetricsslice.go:103 +0xb1
  github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics.func1()
      /go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/metrics.go:124 +0x1a4
  go.opentelemetry.io/collector/pdata/pmetric.ResourceMetricsSlice.RemoveIf()
      /go/pkg/mod/github.com/fluxninja/opentelemetry-collector/[email protected]/pmetric/generated_resourcemetricsslice.go:103 +0xb1
  github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics()
      /go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/metrics.go:112 +0x11e
  github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics-fm()
      <autogenerated>:1 +0x64
  go.opentelemetry.io/collector/processor/processorhelper.NewMetricsProcessor.func1()
      /go/pkg/mod/github.com/fluxninja/[email protected]/processor/processorhelper/metrics.go:52 +0x134
  go.opentelemetry.io/collector/consumer.ConsumeMetricsFunc.ConsumeMetrics()
      /go/pkg/mod/github.com/fluxninja/opentelemetry-collector/[email protected]/metrics.go:25 +0x79
  go.opentelemetry.io/collector/consumer.(*baseMetrics).ConsumeMetrics()
      <autogenerated>:1 +0x29
  go.opentelemetry.io/collector/processor/processorhelper.(*metricsProcessor).ConsumeMetrics()
      <autogenerated>:1 +0x7e
  go.opentelemetry.io/collector/consumer.Metrics.ConsumeMetrics-fm()
      <autogenerated>:1 +0x75
  go.opentelemetry.io/collector/consumer.ConsumeMetricsFunc.ConsumeMetrics()
      /go/pkg/mod/github.com/fluxninja/opentelemetry-collector/[email protected]/metrics.go:25 +0x79
  go.opentelemetry.io/collector/service/internal/graph.(*capabilitiesNode).ConsumeMetrics()
      <autogenerated>:1 +0x29
  github.com/open-telemetry/opentelemetry-collector-contrib/receiver/prometheusreceiver/internal.(*transaction).Commit()
      /go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/receiver/[email protected]/internal/transaction.go:256 +0x245
  github.com/prometheus/prometheus/scrape.(*scrapeLoop).scrapeAndReport.func1()
      /go/pkg/mod/github.com/prometheus/[email protected]/scrape/scrape.go:1311 +0x99
  runtime.deferreturn()
      /usr/local/go/src/runtime/panic.go:476 +0x32
  github.com/prometheus/prometheus/scrape.(*scrapeLoop).run()
      /go/pkg/mod/github.com/prometheus/[email protected]/scrape/scrape.go:1264 +0x4ec
  github.com/prometheus/prometheus/scrape.(*scrapePool).sync.func3()
      /go/pkg/mod/github.com/prometheus/[email protected]/scrape/scrape.go:609 +0x4a

Previous write at 0x00c00107da48 by goroutine 1450:
  container/list.(*List).move()
      /usr/local/go/src/container/list/list.go:123 +0x217
  container/list.(*List).MoveToFront()
      /usr/local/go/src/container/list/list.go:185 +0x159
  github.com/golang/groupcache/lru.(*Cache).Get()
      /go/pkg/mod/github.com/golang/[email protected]/lru/lru.go:79 +0xed
  github.com/open-telemetry/opentelemetry-collector-contrib/internal/filter/filterset/regexp.(*FilterSet).Matches()
      /go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/internal/[email protected]/filterset/regexp/regexpfilterset.go:47 +0xc6
  github.com/open-telemetry/opentelemetry-collector-contrib/internal/filter/filtermetric.(*nameMatcher).Eval()
      /go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/internal/[email protected]/filtermetric/name_matcher.go:35 +0x91
  github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics.func1.1.1()
      /go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/metrics.go:128 +0x15a
  go.opentelemetry.io/collector/pdata/pmetric.MetricSlice.RemoveIf()
      /go/pkg/mod/github.com/fluxninja/opentelemetry-collector/[email protected]/pmetric/generated_metricslice.go:103 +0xb1
  github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics.func1.1()
      /go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/metrics.go:126 +0xeb
  go.opentelemetry.io/collector/pdata/pmetric.ScopeMetricsSlice.RemoveIf()
      /go/pkg/mod/github.com/fluxninja/opentelemetry-collector/[email protected]/pmetric/generated_scopemetricsslice.go:103 +0xb1
  github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics.func1()
      /go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/metrics.go:124 +0x1a4
  go.opentelemetry.io/collector/pdata/pmetric.ResourceMetricsSlice.RemoveIf()
      /go/pkg/mod/github.com/fluxninja/opentelemetry-collector/[email protected]/pmetric/generated_resourcemetricsslice.go:103 +0xb1
  github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics()
      /go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/metrics.go:112 +0x11e
  github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics-fm()
      <autogenerated>:1 +0x64
  go.opentelemetry.io/collector/processor/processorhelper.NewMetricsProcessor.func1()
      /go/pkg/mod/github.com/fluxninja/[email protected]/processor/processorhelper/metrics.go:52 +0x134
  go.opentelemetry.io/collector/consumer.ConsumeMetricsFunc.ConsumeMetrics()
      /go/pkg/mod/github.com/fluxninja/opentelemetry-collector/[email protected]/metrics.go:25 +0x79
  go.opentelemetry.io/collector/consumer.(*baseMetrics).ConsumeMetrics()
      <autogenerated>:1 +0x29
  go.opentelemetry.io/collector/processor/processorhelper.(*metricsProcessor).ConsumeMetrics()
      <autogenerated>:1 +0x7e
  go.opentelemetry.io/collector/consumer.Metrics.ConsumeMetrics-fm()
      <autogenerated>:1 +0x75
  go.opentelemetry.io/collector/consumer.ConsumeMetricsFunc.ConsumeMetrics()
      /go/pkg/mod/github.com/fluxninja/opentelemetry-collector/[email protected]/metrics.go:25 +0x79
  go.opentelemetry.io/collector/service/internal/graph.(*capabilitiesNode).ConsumeMetrics()
      <autogenerated>:1 +0x29
  github.com/open-telemetry/opentelemetry-collector-contrib/receiver/prometheusreceiver/internal.(*transaction).Commit()
      /go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/receiver/[email protected]/internal/transaction.go:256 +0x245
  github.com/prometheus/prometheus/scrape.(*scrapeLoop).scrapeAndReport.func1()
      /go/pkg/mod/github.com/prometheus/[email protected]/scrape/scrape.go:1311 +0x99
  runtime.deferreturn()
      /usr/local/go/src/runtime/panic.go:476 +0x32
  github.com/prometheus/prometheus/scrape.(*scrapeLoop).run()
      /go/pkg/mod/github.com/prometheus/[email protected]/scrape/scrape.go:1264 +0x4ec
  github.com/prometheus/prometheus/scrape.(*scrapePool).sync.func3()
      /go/pkg/mod/github.com/prometheus/[email protected]/scrape/scrape.go:609 +0x4a

Goroutine 1449 (running) created at:
  github.com/prometheus/prometheus/scrape.(*scrapePool).sync()
      /go/pkg/mod/github.com/prometheus/[email protected]/scrape/scrape.go:609 +0x1189
  github.com/prometheus/prometheus/scrape.(*scrapePool).Sync()
      /go/pkg/mod/github.com/prometheus/[email protected]/scrape/scrape.go:511 +0x696
  github.com/prometheus/prometheus/scrape.(*Manager).reload.func1()
      /go/pkg/mod/github.com/prometheus/[email protected]/scrape/manager.go:228 +0x53
  github.com/prometheus/prometheus/scrape.(*Manager).reload.func2()
      /go/pkg/mod/github.com/prometheus/[email protected]/scrape/manager.go:230 +0x74

Goroutine 1450 (running) created at:
  github.com/prometheus/prometheus/scrape.(*scrapePool).sync()
      /go/pkg/mod/github.com/prometheus/[email protected]/scrape/scrape.go:609 +0x1189
  github.com/prometheus/prometheus/scrape.(*scrapePool).Sync()
      /go/pkg/mod/github.com/prometheus/[email protected]/scrape/scrape.go:511 +0x696
  github.com/prometheus/prometheus/scrape.(*Manager).reload.func1()
      /go/pkg/mod/github.com/prometheus/[email protected]/scrape/manager.go:228 +0x53
  github.com/prometheus/prometheus/scrape.(*Manager).reload.func2()
      /go/pkg/mod/github.com/prometheus/[email protected]/scrape/manager.go:230 +0x74
==================
==================
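
Both goroutines race inside github.com/golang/groupcache/lru.(*Cache).Get, which moves the accessed entry to the front of an internal container/list and is therefore not safe for concurrent use; the regexp FilterSet shares one such cache across the Prometheus scrape goroutines. As a rough illustration only (a standalone sketch with made-up names, not the collector's code), the following trips the race detector the same way:

// race_sketch.go — hypothetical standalone reproducer, not part of the collector.
// Run with: go run -race race_sketch.go
package main

import (
	"sync"

	"github.com/golang/groupcache/lru"
)

func main() {
	// One LRU cache shared by two goroutines, mirroring how the regexp
	// FilterSet's match cache is shared across concurrent scrape commits.
	cache := lru.New(128)
	cache.Add("metric_name", true)

	var wg sync.WaitGroup
	for i := 0; i < 2; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := 0; j < 100000; j++ {
				// Get mutates the cache's internal recency list
				// (list.MoveToFront), so concurrent calls race.
				cache.Get("metric_name")
			}
		}()
	}
	wg.Wait()
}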

What version did you use?
03c185f

Environment
playground
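
Additional context (an assumption on my side, not something the filterset does as of this commit): since lru.Cache provides no locking of its own, any fix presumably has to serialize access around it. A minimal mutex-guarded wrapper, purely as a sketch of that direction, could look like this:

// synced_cache.go — hypothetical wrapper sketch; the actual fix in
// opentelemetry-collector-contrib may look different.
package main

import (
	"sync"

	"github.com/golang/groupcache/lru"
)

// syncedCache serializes access to an lru.Cache, which is not safe for
// concurrent use because Get mutates its internal recency list.
type syncedCache struct {
	mu    sync.Mutex
	cache *lru.Cache
}

func newSyncedCache(maxEntries int) *syncedCache {
	return &syncedCache{cache: lru.New(maxEntries)}
}

// Get wraps lru.Cache.Get under the mutex.
func (c *syncedCache) Get(key lru.Key) (interface{}, bool) {
	c.mu.Lock()
	defer c.mu.Unlock()
	return c.cache.Get(key)
}

// Add wraps lru.Cache.Add under the same mutex.
func (c *syncedCache) Add(key lru.Key, value interface{}) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.cache.Add(key, value)
}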

krdln added the Agent Telemetry and bug (Something isn't working) labels on Jun 19, 2023
hdkshingala (Contributor) commented

@krdln I tried reproducing this on the mentioned commit and waited for more than 20 minutes, but was not able to see the data race. Are there any other steps we have to run to reproduce this?

krdln (Contributor, Author) commented Jul 12, 2023

No, it was just happening randomly for me, once every few minutes.

hdkshingala (Contributor) commented

Strange… I also tried running on this commit a couple of times but didn't see it.
