
[processor/deltatocumulative]: evicts all streams under heavy load #33014

Closed
sh0rez opened this issue May 13, 2024 · 3 comments · Fixed by #33015
Labels
bug (Something isn't working), processor/deltatocumulative

Comments

@sh0rez
Contributor

sh0rez commented May 13, 2024

Component(s)

processor/deltatocumulative

What happened?

Description

When running with a stream limit that is much lower than the rate at which new streams are ingested, the deltatocumulativeprocessor deletes all of its tracked streams.

This is caused by its eviction logic, which deletes the oldest stream (whether stale or not) once the stream limit is exceeded.
Under high load this eviction happens constantly, deleting all streams over and over again.
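To illustrate the failure mode, here is a minimal, self-contained Go sketch. The `tracker`/`stream` types are hypothetical and not the processor's actual code; it only shows how "evict the oldest at the limit" churns out live streams under sustained load:

```go
package main

import (
	"fmt"
	"time"
)

// stream and tracker are illustrative stand-ins, not the processor's types.
type stream struct {
	id       string
	lastSeen time.Time
}

type tracker struct {
	limit   int
	streams map[string]*stream
}

// track admits a stream, evicting the least recently seen one once the
// limit is reached, regardless of whether it is stale.
func (t *tracker) track(id string, now time.Time) {
	if s, ok := t.streams[id]; ok {
		s.lastSeen = now
		return
	}
	if len(t.streams) >= t.limit {
		var oldest *stream
		for _, s := range t.streams {
			if oldest == nil || s.lastSeen.Before(oldest.lastSeen) {
				oldest = s
			}
		}
		delete(t.streams, oldest.id) // evicts a live stream under load
	}
	t.streams[id] = &stream{id: id, lastSeen: now}
}

func main() {
	t := &tracker{limit: 2, streams: map[string]*stream{}}
	start := time.Now()
	// Three active series but a limit of two: every new arrival evicts a
	// live stream, so each series repeatedly loses its accumulated state.
	for i, id := range []string{"a", "b", "c", "a", "b", "c"} {
		t.track(id, start.Add(time.Duration(i)*time.Second))
	}
	fmt.Println("tracked streams:", len(t.streams)) // still 2, but constantly churned
}
```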

Steps to Reproduce

Expected Result

Only stale streams are deleted. New streams that don't fit are dropped, so the processor operates on the subset of streams it can track within the limit.
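For comparison, a sketch of the expected behavior, reusing the hypothetical `tracker` type from the sketch above (an illustration under assumed names, not the actual fix in #33015): at the limit, only streams idle longer than some `maxStale` threshold may be evicted; if none qualify, the new stream is dropped and the tracked set stays intact.

```go
// trackStaleOnly only evicts streams whose last datapoint is older than
// maxStale. If the map is full and nothing is stale, the new stream is
// dropped (returns false) instead of evicting a live one.
func (t *tracker) trackStaleOnly(id string, now time.Time, maxStale time.Duration) bool {
	if s, ok := t.streams[id]; ok {
		s.lastSeen = now
		return true
	}
	if len(t.streams) >= t.limit {
		evicted := false
		for key, s := range t.streams {
			if now.Sub(s.lastSeen) > maxStale {
				delete(t.streams, key) // safe: this stream is genuinely stale
				evicted = true
				break
			}
		}
		if !evicted {
			return false // drop the new stream; keep the tracked subset intact
		}
	}
	t.streams[id] = &stream{id: id, lastSeen: now}
	return true
}
```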

Actual Result

The processor evicts live (non-stale) streams and admits a new one, which is subsequently evicted as well.

Collector version

a133a8e

Environment information

No response

OpenTelemetry Collector configuration

No response

Log output

No response

Additional context

No response


Pinging code owners for processor/deltatocumulative: @sh0rez @RichieSams. See Adding Labels via Comments if you do not have permissions to add labels yourself.


@crobert-1
Member

The proposed solution/expected behavior makes sense to me, and the issue was filed by a code owner. Removing needs triage.

@crobert-1 removed the needs triage label May 13, 2024
jpkrohling pushed a commit that referenced this issue May 16, 2024
Changes eviction behavior at the limit to only delete streams if they are
actually stale. The current behavior just deletes the oldest, which leads to
rapid deletion of all streams under heavy load, making this processor
unusable.

**Link to tracking Issue:** Fixes #33014

cparkins pushed a commit to AmadeusITGroup/opentelemetry-collector-contrib that referenced this issue Jul 11, 2024