
[controller-runtime] log.SetLogger(...) was never called; logs will not be displayed. #981

Open
jessebot opened this issue Jun 22, 2024 · 0 comments
Labels: bug (Something isn't working)
Description

I always get the following in the logs of any backup pod: [controller-runtime] log.SetLogger(...) was never called; logs will not be displayed. The backup pods always complete successfully and the data is actually backed up, but they always print this traceback and I don't know why. Nothing is broken; it's just a bit confusing.

Additional Context

I'm using s3 as my backend and B2 as the remote s3 provider, and all is working fine there. Again, none of this breaks anything; it just produces odd log messages that I don't understand. Thanks for any help you can provide, and thanks for continuing to maintain this project!

Logs

2024-06-22T08:34:49Z    INFO    k8up.restic.restic.backup.progress      backup finished {"new files": 0, "changed files": 56, "errors": 0}
2024-06-22T08:34:49Z    INFO    k8up.restic.restic.backup.progress      stats   {"time": 248.825449061, "bytes added": 2842538507, "bytes processed": 14227999250}
2024-06-22T08:34:49Z    INFO    k8up.restic.statsHandler.promStats      sending prometheus stats        {"url": "push-gateway.prometheus.svc:9091"}
2024-06-22T08:34:49Z    INFO    k8up.restic.restic.backup       backup finished, sending snapshot list
2024-06-22T08:34:49Z    INFO    k8up.restic.restic.snapshots    getting list of snapshots
2024-06-22T08:34:49Z    INFO    k8up.restic.restic.snapshots.command    restic command  {"path": "/usr/local/bin/restic", "args": ["snapshots", "--option", "", "--json"]}
2024-06-22T08:34:49Z    INFO    k8up.restic.restic.snapshots.command    Defining RESTIC_PROGRESS_FPS    {"frequency": 0.016666666666666666}
[controller-runtime] log.SetLogger(...) was never called; logs will not be displayed.
Detected at:
        >  goroutine 1 [running]:
        >  runtime/debug.Stack()
        >       /opt/hostedtoolcache/go/1.21.9/x64/src/runtime/debug/stack.go:24 +0x5e
        >  sigs.k8s.io/controller-runtime/pkg/log.eventuallyFulfillRoot()
        >       /home/runner/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.2/pkg/log/log.go:60 +0xcd
        >  sigs.k8s.io/controller-runtime/pkg/log.(*delegatingLogSink).WithName(0xc00003fa00, {0x1a01e56, 0x14})
        >       /home/runner/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.2/pkg/log/deleg.go:147 +0x45
        >  github.com/go-logr/logr.Logger.WithName({{0x1cab1c8, 0xc00003fa00}, 0x0}, {0x1a01e56?, 0x19e676a?})
        >       /home/runner/go/pkg/mod/github.com/go-logr/logr@v1.4.1/logr.go:345 +0x3d
        >  sigs.k8s.io/controller-runtime/pkg/client.newClient(0xc0003c5008?, {0x0, 0xc0004ca070, {0x0, 0x0}, 0x0, {0x0, 0x0}, 0x0})
        >       /home/runner/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.2/pkg/client/client.go:129 +0xec
        >  sigs.k8s.io/controller-runtime/pkg/client.New(0xc0004ca000?, {0x0, 0xc0004ca070, {0x0, 0x0}, 0x0, {0x0, 0x0}, 0x0})
        >       /home/runner/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.17.2/pkg/client/client.go:110 +0x7d
        >  github.com/k8up-io/k8up/v2/restic/kubernetes.NewTypedClient()
        >       /home/runner/work/k8up/k8up/restic/kubernetes/config.go:45 +0xb0
        >  github.com/k8up-io/k8up/v2/restic/kubernetes.SyncSnapshotList({0x1ca7638?, 0xc000184000}, {0xc000252a00, 0x1e, 0x0?}, {0xc000012019, 0x6}, {0xc000044062, 0x38})
        >       /home/runner/work/k8up/k8up/restic/kubernetes/snapshots.go:19 +0x3db
        >  github.com/k8up-io/k8up/v2/restic/cli.(*Restic).sendSnapshotList(0xc000000000)
        >       /home/runner/work/k8up/k8up/restic/cli/backup.go:116 +0x18b
        >  github.com/k8up-io/k8up/v2/restic/cli.(*Restic).Backup(0xc000000000, {0x19e7bb6, 0x5}, {0x0, 0x0, 0x0})
        >       /home/runner/work/k8up/k8up/restic/cli/backup.go:46 +0x3eb
        >  github.com/k8up-io/k8up/v2/cmd/restic.doBackup({0x1ca7328?, 0x29bde40?}, 0xc0002126c0?, {{0x1caa8a8?, 0xc0002126c0?}, 0x0?})
        >       /home/runner/work/k8up/k8up/cmd/restic/main.go:270 +0xf3
        >  github.com/k8up-io/k8up/v2/cmd/restic.run({0x1ca7328, 0x29bde40}, 0x1caa8a8?, {{0x1caa8a8?, 0xc0002126c0?}, 0x1c93950?})
        >       /home/runner/work/k8up/k8up/cmd/restic/main.go:143 +0xb6
        >  github.com/k8up-io/k8up/v2/cmd/restic.resticMain(0xc0002104c0)
        >       /home/runner/work/k8up/k8up/cmd/restic/main.go:127 +0x2c5
        >  github.com/urfave/cli/v2.(*Command).Run(0x2974ce0, 0xc0002104c0, {0xc0002125a0, 0x3, 0x3})
        >       /home/runner/go/pkg/mod/github.com/urfave/cli/v2@v2.23.7/command.go:271 +0x998
        >  github.com/urfave/cli/v2.(*Command).Run(0xc00021e640, 0xc0002106c0, {0xc00003e0c0, 0x4, 0x4})
        >       /home/runner/go/pkg/mod/github.com/urfave/cli/v2@v2.23.7/command.go:264 +0xbe5
        >  github.com/urfave/cli/v2.(*App).RunContext(0xc00021a3c0, {0x1ca7328?, 0x29bde40}, {0xc00003e0c0, 0x4, 0x4})
        >       /home/runner/go/pkg/mod/github.com/urfave/cli/v2@v2.23.7/app.go:333 +0x579
        >  github.com/urfave/cli/v2.(*App).Run(...)
        >       /home/runner/go/pkg/mod/github.com/urfave/cli/v2@v2.23.7/app.go:310
        >  main.main()
        >       /home/runner/work/k8up/k8up/cmd/k8up/main.go:30 +0x3b
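For context, the trace shows the warning fires inside controller-runtime's client.New, reached from k8up's NewTypedClient: controller-runtime prints this message when its package-level logger is used before log.SetLogger has ever been called. A minimal sketch of what satisfies it is below; the placement in k8up's restic entrypoint is an assumption on my part, not a claim about where the fix belongs:

```go
package main

import (
	"github.com/go-logr/logr"
	ctrllog "sigs.k8s.io/controller-runtime/pkg/log"
)

func main() {
	// Calling SetLogger once, before any controller-runtime client is
	// created, prevents the "log.SetLogger(...) was never called" warning.
	// logr.Discard() silences controller-runtime output entirely; a real
	// fix would presumably pass the process's existing logr.Logger here
	// instead, so client errors still show up in the k8up logs.
	ctrllog.SetLogger(logr.Discard())

	// ... rest of the entrypoint (client creation, backup run, etc.) ...
}
```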

Expected Behavior

No stack traces in the logs.

Steps To Reproduce

I'm using the 4.7.0 Helm chart on Kubernetes. I deploy the CRDs first and then the Helm chart, both via Argo CD. You can see my full values here:
https://github.com/small-hack/argocd-apps/blob/730c3494444d19f8cad59184e0ba2039bcead4d6/k8up/k8up_argocd_appset.yaml#L42-L75

This happens with both Backup and Schedule manifests applied manually through kubectl.

Version of K8up

v2.10.0

Version of Kubernetes

1.29.5

Distribution of Kubernetes

K3s
