
vendor: otel v0.56.0 / v1.31.0 #49276

Merged
thaJeztah merged 1 commit into moby:master from thaJeztah:vendor_otel
Jan 14, 2025

Conversation

thaJeztah (Member) commented Jan 14, 2025

vendor: otel v0.56.0 / v1.31.0

Reverts otel workaround, added in cca7085,
as it's no longer needed:

=== Failed
=== FAIL: cmd/dockerd TestOtelMeterLeak (0.64s)
    daemon_test.go:303: Allocations: 3
    daemon_test.go:307: Allocations count decreased. OTEL leak workaround is no longer needed!


Review comment on removed lines 284 to 309:

// TestOtelMeterLeak tests for a memory leak in the OTEL meter implementation.
// Once the fixed OTEL is vendored, this test will fail - the workaround
// and this test should be removed then.
func TestOtelMeterLeak(t *testing.T) {
	meter := otel.Meter("foo")

	var before runtime.MemStats
	runtime.ReadMemStats(&before)

	const counters = 10 * 1000 * 1000
	for i := 0; i < counters; i++ {
		_, _ = meter.Int64Counter("bar")
	}

	var after runtime.MemStats
	runtime.ReadMemStats(&after)

	allocs := after.Mallocs - before.Mallocs
	t.Log("Allocations:", allocs)

	if allocs < 10 {
		// TODO: remove the workaround for the OTEL memory leak in cmd/dockerd/daemon.go
		t.Fatal("Allocations count decreased. OTEL leak workaround is no longer needed!")
	}
}
Contributor:

Wondering, maybe we should leave it, but inverse the check and fail to let us know in case it regresses in future?

thaJeztah (Member Author):

Yeah, I was considering that for a bit, but that's also assuming the exact same bug would come back, I guess? We wouldn't catch issues in other areas of OTEL, or do you think those are less likely to happen?

Contributor:

Yeah, it's mostly for this exact bug. Although one case it would also catch is when something (like a vendored dependency in a func init()) overrides the global OTEL tracer provider.

thaJeztah (Member Author):

OK; updated the test to check for "more than 10" allocations, and added some comments to provide some context.
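The core of the test is the allocation-counting pattern: snapshot runtime.MemStats before and after a workload and compare the cumulative Mallocs counters. The sketch below isolates that pattern with a stdlib-only stand-in workload (the real test calls meter.Int64Counter in a loop); the measureAllocs helper, the sink variable, and the threshold value are illustrative assumptions, not the PR's exact code.

```go
package main

import (
	"fmt"
	"runtime"
)

// sink is package-level so assignments to it escape to the heap and
// cannot be optimized away by escape analysis.
var sink []byte

// measureAllocs returns the number of heap allocations performed by fn,
// using the same MemStats before/after snapshot as TestOtelMeterLeak.
func measureAllocs(fn func()) uint64 {
	runtime.GC() // settle the heap so pending garbage doesn't skew the count
	var before, after runtime.MemStats
	runtime.ReadMemStats(&before)
	fn()
	runtime.ReadMemStats(&after)
	return after.Mallocs - before.Mallocs
}

func main() {
	// Stand-in workload; the real test repeatedly creates OTEL counters.
	allocs := measureAllocs(func() {
		for i := 0; i < 1000; i++ {
			sink = make([]byte, 64)
		}
	})
	fmt.Println("Allocations:", allocs)

	// Inverted check, as adopted in the PR: instead of failing when the
	// allocation count drops (workaround no longer needed), fail when it
	// grows past a small threshold, flagging a possible regression.
	const threshold = 10
	if allocs > threshold {
		fmt.Println("allocation count above threshold; possible leak regression")
	}
}
```

Note that Mallocs is a cumulative process-wide counter, so background goroutines can inflate the result slightly; a small threshold rather than an exact count keeps the check robust.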

thaJeztah marked this pull request as ready for review January 14, 2025 13:02
Reverts otel workaround, added in cca7085,
as it's no longer needed:

    === Failed
    === FAIL: cmd/dockerd TestOtelMeterLeak (0.64s)
        daemon_test.go:303: Allocations: 3
        daemon_test.go:307: Allocations count decreased. OTEL leak workaround is no longer needed!

We're keeping the test for now, so that we can check for possible
regressions in the OTel dependencies.

Co-authored-by: Derek McGowan <[email protected]>
Signed-off-by: Derek McGowan <[email protected]>
Signed-off-by: Sebastiaan van Stijn <[email protected]>
thaJeztah (Member Author):

Failure on Windows looks like a flaky test, but let me kick it again.

== FAIL: github.com/docker/docker/integration/container TestLogs/driver_local/without_tty/only_stderr (61.12s)
    logs_test.go:133: assertion failed: error is not nil: error during connect: Post "http://%2F%2F.%2Fpipe%2Fdocker_engine/v1.48/containers/create": EOF

thaJeztah merged commit 2b7e859 into moby:master Jan 14, 2025
thaJeztah deleted the vendor_otel branch January 14, 2025 16:30