[receiver/awscloudwatch] Cannot scrape full named loggroups #32345

Closed
mikel-jason opened this issue Apr 12, 2024 · 4 comments
Labels
bug (Something isn't working), receiver/awscloudwatch

Comments

@mikel-jason
Contributor

Component(s)

receiver/awscloudwatch

What happened?

Description

The receiver's docs include an example for scraping a loggroup by name without filtering for streams:

awscloudwatch:
  profile: 'my-profile'
  region: us-west-1
  logs:
    poll_interval: 5m
    groups:
      named:
        /aws/eks/dev-0/cluster:

This example does not work: no data is scraped or exported.

I tried a workaround of setting an empty prefix, but according to the AWS SDK used here, this parameter requires at least one character.

// Filters the results to include only events from log streams that have names
// starting with this prefix.
//
// If you specify a value for both logStreamNamePrefix and logStreamNames, but
// the value for logStreamNamePrefix does not match any log stream names specified
// in logStreamNames, the action returns an InvalidParameterException error.
LogStreamNamePrefix *string `locationName:"logStreamNamePrefix" min:"1" type:"string"`

From https://docs.aws.amazon.com/sdk-for-go/api/service/cloudwatchlogs/#FilterLogEventsInput
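
For illustration, a minimal sketch against aws-sdk-go v1 (the SDK documented at the link above) showing why the empty-prefix workaround cannot work: client-side validation rejects a LogStreamNamePrefix shorter than one character, while simply omitting the prefix is valid. The region and loggroup name below are placeholders, not values taken from the receiver's code.

package main

import (
    "fmt"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/cloudwatchlogs"
)

func main() {
    // Assumes credentials and region are resolved from the environment or shared config.
    sess := session.Must(session.NewSession(&aws.Config{Region: aws.String("eu-central-1")}))
    client := cloudwatchlogs.New(sess)

    // Setting an empty prefix violates the min:"1" constraint shown above, so the
    // SDK's client-side validation rejects the input before any request is sent.
    _, err := client.FilterLogEvents(&cloudwatchlogs.FilterLogEventsInput{
        LogGroupName:        aws.String("testloggroup"), // placeholder loggroup name
        LogStreamNamePrefix: aws.String(""),
    })
    fmt.Println("empty prefix:", err) // expect a validation error mentioning LogStreamNamePrefix

    // Leaving the prefix unset (nil) is the valid way to ask for events from
    // every stream in the group.
    _, err = client.FilterLogEvents(&cloudwatchlogs.FilterLogEventsInput{
        LogGroupName: aws.String("testloggroup"),
    })
    fmt.Println("no prefix:", err)
}

The first call fails locally with a validation error; the second one only requires valid AWS credentials and an existing loggroup.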

Steps to Reproduce

With an exporter and corresponding system of your choice (I use Loki & Grafana), configure the receiver with a named loggroup but without stream filtering (no name and no prefix). Then write data to the loggroup. Wait for the batch timeout if configured. Check the storage system for data.

Expected Result

The logs from all streams of the referenced loggroup are correctly scraped and exported to the storage system (Loki/Grafana in my case).

Actual Result

Data is not scraped/exported.

Collector version

v0.90.1

Environment information

Environment

AWS EKS 1.27
Nodes: x86_amd64, latest AWS-provided BottlerocketOS
OTel operator: Binary version v0.90.0, chart version 0.44.2, installed using the provided helm chart https://github.com/open-telemetry/opentelemetry-helm-charts/tree/main/charts/opentelemetry-operator
OTelCol: Binary v0.90.1, deployed by OTel-operator via OTelCol CRD

OpenTelemetry Collector configuration

receivers:
  awscloudwatch:
    region: eu-central-1
    logs:
      poll_interval: 2m
      groups:
        named:
          /aws/eks/eks/cluster: # this is working fine
            prefixes: [kube-apiserver]
          testloggroup: # this is not working
processors:
  batch:
    send_batch_size: 1000
    send_batch_max_size: 1500
    timeout: 30s
  resource:
    attributes:
      - action: insert
        key: loki.resource.labels
        value: cloudwatch.log.group.name
exporters:
  loki:
    endpoint: "<REDACTED>"
    headers:
      X-Scope-OrgID: "<REDACTED>"
    compression: gzip
service:
  pipelines:
    logs:
      receivers: [awscloudwatch]
      processors: [batch,resource]
      exporters: [loki]

Log output

No response

Additional context

I identified the problem here:

func newLogsReceiver(cfg *Config, logger *zap.Logger, consumer consumer.Logs) *logsReceiver {
    groups := []groupRequest{}
    for logGroupName, sc := range cfg.Logs.Groups.NamedConfigs {
        for _, prefix := range sc.Prefixes {
            groups = append(groups, &streamPrefix{group: logGroupName, prefix: prefix})
        }
        if len(sc.Names) > 0 {
            groups = append(groups, &streamNames{group: logGroupName, names: sc.Names})
        }
    }
    // ...

A named loggroup is only turned into a group request if at least one stream prefix or stream name is configured for it. If neither is present, nothing is appended, so the receiver does not know about the loggroup at all.
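
For illustration only, a sketch of the kind of fallback that would address this; the actual change is in the linked PR and may differ, and whether a streamNames request with no names behaves as "no filter" is an assumption about the receiver's internals:

for logGroupName, sc := range cfg.Logs.Groups.NamedConfigs {
    for _, prefix := range sc.Prefixes {
        groups = append(groups, &streamPrefix{group: logGroupName, prefix: prefix})
    }
    if len(sc.Names) > 0 {
        groups = append(groups, &streamNames{group: logGroupName, names: sc.Names})
    }
    // Hypothetical fallback: register the group even when no stream filtering is
    // configured, so that all of its streams get polled.
    if len(sc.Prefixes) == 0 && len(sc.Names) == 0 {
        groups = append(groups, &streamNames{group: logGroupName})
    }
}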

I implemented a fix and tested it in-cluster; it works as expected. Please see the linked PR and confirm the approach or provide feedback. Thanks a lot for investing your time in this and the overall receiver! <3

@mikel-jason added the bug (Something isn't working) and needs triage (New item requiring triage) labels on Apr 12, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@crobert-1
Member

Removing needs triage based on the code owner's response on the PR that will resolve this issue.

@crobert-1 removed the needs triage (New item requiring triage) label on Apr 15, 2024
djaglowski pushed a commit that referenced this issue Apr 18, 2024
**Description:** Allow receiving named loggroups without stream filtering, as shown in the example linked in the README.

**Link to tracking Issue:** #32345

**Testing:** Adds an additional unit test for this specific config case.

**Documentation:** None; the implementation now matches the documented example, which was not the case before.
rimitchell pushed a commit to rimitchell/opentelemetry-collector-contrib that referenced this issue May 8, 2024 (the same change, open-telemetry#32346, with the same commit message as above).

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions bot added the Stale label on Jun 17, 2024
@crobert-1
Member

Was this issue resolved by #32346? If so, we can close it.
