
Dynamically determine the topic when exporting with Kafka Exporter #31178

Closed
pyama86 opened this issue Feb 11, 2024 · 10 comments · Fixed by #31809
Labels
enhancement New feature or request exporter/kafka

Comments

@pyama86
Contributor

pyama86 commented Feb 11, 2024

Component(s)

exporter/kafka

Is your feature request related to a problem? Please describe.

When sending logs to Kafka, topics are often separated by log file or log type. Currently, doing this requires writing many exporter definitions.

Describe the solution you'd like

I want to dynamically determine the topic from metadata, such as the file name obtained from the Filelog Receiver.

Describe alternatives you've considered

No response

Additional context

No response

@pyama86 pyama86 added enhancement New feature or request needs triage New item requiring triage labels Feb 11, 2024
Contributor

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@shivanshuraj1333
Member

This looks like a reasonable request to me. If this gets triaged, I'd love to work on it.

@crobert-1
Member

Hello @pyama86, can you share a proposal for what the configuration of this option would look like?

@pyama86
Contributor Author

pyama86 commented Mar 5, 2024

@crobert-1
Thank you for your reply. I'm hoping to be able to write definitions like the one below.

receivers:
  filelog:
    include: [ /var/log/containers/*.log ]
    operators:
      - type: json_parser

processors:
  # This processor is expected to extract k8s-related metadata from the logs,
  # and add it as attributes to the logs.
  k8sattributes:

exporters:
  kafka/custom:
    default_topic: default-topic
    brokers:
      - localhost:9092
    topic_from_attribute: k8s.app

service:
  pipelines:
    logs:
      receivers: [filelog]
      processors: [k8sattributes]
      exporters: [kafka/custom]

Currently there seems to be no way for the exporter to access attributes created by processors, even though that would be useful for many purposes.

@crobert-1
Member

The idea sounds good to me, but I do wonder a bit about the implementation details. Do you plan on only allowing resource attribute names to be the source of the value for topic_from_attribute?

@pyama86
Contributor Author

pyama86 commented Mar 5, 2024

I think users will likely want to supply various values as needed, and processors can generate or transform attributes accordingly, which should allow reasonably flexible behavior. On the other hand, I'm not very familiar with the otel collector's internals, so if there are concerns or points that should be considered, raising them here would enable a deeper examination.

@crobert-1
Member

That's fair. I think initially we can restrict it to resource attributes as the source. The resource attribute name can be the value for topic_from_attribute, and then the resource attribute's value will be the topic. 👍
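To make the agreed semantics concrete, here's a minimal sketch of the lookup behavior described above: the value of `topic_from_attribute` names a resource attribute, and that attribute's value becomes the topic, falling back to `default_topic` otherwise. `resolveTopic` and the plain string map are illustrative stand-ins, not the actual kafkaexporter code (which operates on pdata resource attributes).

```go
package main

import "fmt"

// resolveTopic sketches the discussed behavior: look up the configured
// attribute name in the record's resource attributes; if present and
// non-empty, use its value as the Kafka topic, else use the default topic.
// This is a hypothetical helper for illustration only.
func resolveTopic(defaultTopic, topicFromAttribute string, attrs map[string]string) string {
	if topicFromAttribute != "" {
		if v, ok := attrs[topicFromAttribute]; ok && v != "" {
			return v
		}
	}
	return defaultTopic
}

func main() {
	attrs := map[string]string{"k8s.app": "checkout-service"}
	fmt.Println(resolveTopic("default-topic", "k8s.app", attrs)) // checkout-service
	fmt.Println(resolveTopic("default-topic", "k8s.app", nil))   // default-topic
}
```

With the config from the earlier comment, a log record carrying `k8s.app: checkout-service` would be routed to the `checkout-service` topic, and records without that attribute would go to `default-topic`.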

@crobert-1 crobert-1 removed the needs triage New item requiring triage label Mar 13, 2024
@pyama86
Contributor Author

pyama86 commented Mar 15, 2024

Is there anything else I can contribute here? I'm willing to help however I can.

@crobert-1
Member

You're welcome to submit a PR if you're able! Otherwise it will depend on the priorities and availability of the component code owners, or other contributors.

@pyama86
Contributor Author

pyama86 commented Mar 15, 2024

That's great! Whether it gets merged or not, I believe writing code is the best way to express what you want to do, so I'll give it a try. Thank you.

codeboten added a commit that referenced this issue May 3, 2024
…ibute. (#31809)

I've implemented the feature based on the discussion in the referenced
issue. I'm assuming string (text) attributes will suffice in most cases,
so I haven't handled arrays or maps.

Fixes #31178

I've implemented unit tests. For the tests related to kafka_exporter, I
followed the existing implementation, but I find it somewhat redundant.
If it's okay, I'd like to switch to table-driven tests.

---------

Co-authored-by: Curtis Robert <crobert@splunk.com>
Co-authored-by: Alex Boten <223565+codeboten@users.noreply.github.com>
rimitchell pushed a commit to rimitchell/opentelemetry-collector-contrib that referenced this issue May 8, 2024
…ibute. (open-telemetry#31809)

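The table-driven test style mentioned in the commit message above can be sketched as follows. `lookupTopic` and the cases are hypothetical illustrations, not the actual kafkaexporter tests (which would use the `testing` package and `t.Run` per row); the point is that each case is one row in a table and a single loop exercises them all, instead of duplicating assertion logic per case.

```go
package main

import "fmt"

// lookupTopic is a hypothetical stand-in for the exporter's topic lookup:
// return the attribute's value if present and non-empty, else the default.
func lookupTopic(defaultTopic, attrName string, attrs map[string]string) string {
	if v, ok := attrs[attrName]; ok && v != "" {
		return v
	}
	return defaultTopic
}

func main() {
	// Table-driven style: each test case is a row; one loop runs them all.
	cases := []struct {
		name  string
		attrs map[string]string
		want  string
	}{
		{"attribute present", map[string]string{"k8s.app": "app-logs"}, "app-logs"},
		{"attribute empty", map[string]string{"k8s.app": ""}, "default-topic"},
		{"attribute missing", nil, "default-topic"},
	}
	for _, c := range cases {
		got := lookupTopic("default-topic", "k8s.app", c.attrs)
		fmt.Printf("%-18s got=%q want=%q\n", c.name, got, c.want)
	}
}
```

Adding a new scenario then means appending one row to the table rather than copying a whole test function.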