
BatchExporting - issues #1968

Open

cijothomas opened this issue Jul 26, 2024 · 4 comments
Comments

cijothomas (Member) commented Jul 26, 2024

Opening an issue to track known issues and fixes for the BatchExportProcessor.

  1. We currently duplicate the entire logic for Spans and Logs. It might be possible to build a BatchExportProcessor<T> where T can be Span or LogRecord. There is some awkwardness, as SpanProcessor has begin() and end() while LogProcessor only has emit(). It may be possible to treat begin() as a no-op and emit() as end(), or to extract the common logic and reuse it wherever feasible.
  2. For spans, there is a capability to perform concurrent exports, while Logs lack this. Also, the environment variable that configures this for spans ("OTEL_BSP_MAX_CONCURRENT_EXPORTS") is not part of the spec.
  3. The timers are triggered immediately when the processor/provider is built. We should apply https://github.com/open-telemetry/opentelemetry-rust/pull/1766/files to the BatchProcessor as well. Fixed for spans in "BatchSpanProcessor to trigger first export after the interval instead of right away" #1971.
  4. It is unclear how much copying/cloning is involved. For Logs, the LogRecord is cloned (unavoidably) in the BatchProcessor. But it is then sent through a channel, added to a Vec, and then copied to another Vec to be passed to the exporter. A lot of this can potentially be optimized.
  5. The channel is created with a capacity of 2048, but the batch Vec can also hold up to 512 items, so effectively up to 2560 items can be in memory at once. When a user sets 2048 as the capacity, the expectation is that at most 2048 items are kept in memory (and hence lost if there is a crash). We need to revisit this.
  6. Lack of test coverage. I see a lot of issues reported, especially related to shutdown. We'll need to develop test coverage for each runtime this repo offers. Reported here: "Shutting down the global tracer hangs" #868.
  7. Offer a BatchProcessor without any runtime dependency. This could be done by spawning a dedicated background thread, or by providing a runtime implementation backed by a dedicated thread. "[Feature]: Support Metrics, BatchProcessor without async runtime" #1437.
  8. Exporting does not happen at regular intervals. See "[Feature]: Support Metrics, BatchProcessor without async runtime" #1437 (comment).
  9. When we spawn threads (TokioCurrentThread, for example), they should be named to make debugging easier. Reported here: "[Feature]: Name spawned threads" #1949.
    // TODO: Add more. Create dedicated issues for tracking each of them separately, if needed.

This is important to address, as the BatchProcessor is critical for OTLP exporting, which is the most used exporter and the spec default.

cijothomas self-assigned this Jul 27, 2024
cijothomas (Member, Author)
I self-assigned it for now. I am currently busy with Metrics, so this will take a while. If anyone wants to pick this up (even parts of it), please comment and I'll assign it to you.

cijothomas (Member, Author)
@howardjohn Would you have interest/bandwidth to help with some of these?

cijothomas (Member, Author)
@ThomsonTan agreed to look at this and help with some of these issues.

TommyCpp (Contributor) commented Aug 6, 2024

Also the env variable to configure this for span ("OTEL_BSP_MAX_CONCURRENT_EXPORTS"), is not part of spec

I think concurrent exporting is a nice feature for performance, and we should look into adding it to the other signals. The env var is indeed not part of the spec, but it provides control over the concurrent-exporting feature and doesn't conflict with anything in the spec. We should keep it unless there are other concerns.
