Closed
Labels: kind/failing-test, kind/flake, needs-sig, needs-triage
Description
Which jobs are flaking?
pull-kubernetes-unit; see an example failure at https://prow.k8s.io/view/gs/kubernetes-jenkins/pr-logs/pull/116554/pull-kubernetes-unit/1636476265563164672
{Failed === RUN TestHollowNode/kubelet
hollow_node_test.go:82: read 290, err=<nil>
I0316 21:26:45.874715 42015 hollow_node.go:178] Version: v0.0.0-master+$Format:%H$
I0316 21:26:45.878890 42015 hollow_kubelet.go:154] Using /tmp/hollow-kubelet.3224151551 as root dir for hollow-kubelet
W0316 21:26:45.902348 42015 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0316 21:26:45.902475 42015 kubelet.go:405] "Attempting to sync node with API server"
I0316 21:26:45.902560 42015 kubelet.go:298] "Adding static pod path" path="/tmp/hollow-kubelet.3224151551/static-pods193672360"
I0316 21:26:45.902725 42015 kubelet.go:309] "Adding apiserver pod source"
I0316 21:26:45.902798 42015 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
W0316 21:26:45.902871 42015 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0316 21:26:45.903273 42015 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0316 21:26:45.903382 42015 state_mem.go:36] "Initialized new in-memory state store"
I0316 21:26:45.903428 42015 state_mem.go:35] "Initializing new in-memory state store"
I0316 21:26:45.903461 42015 fake_topology_manager.go:33] "NewFakeManager"
I0316 21:26:45.950888 42015 kuberuntime_manager.go:257] "Container runtime initialized" containerRuntime="fakeRuntime" version="0.1.0" apiVersion="0.1.0"
W0316 21:26:45.974590 42015 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0316 21:26:45.980015 42015 fake_topology_manager.go:33] "NewFakeManager"
I0316 21:26:45.980402 42015 server.go:1168] "Started kubelet"
E0316 21:26:45.981665 42015 kubelet.go:1398] "Image garbage collection failed once. Stats initialization may not have completed yet" err="imageFs information is unavailable"
I0316 21:26:45.981868 42015 server.go:162] "Starting to listen" address="0.0.0.0" port=0
W0316 21:26:45.982030 42015 reflector.go:533] k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: serializer for text/plain; charset=utf-8 doesn't exist
E0316 21:26:45.982146 42015 reflector.go:148] k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: serializer for text/plain; charset=utf-8 doesn't exist
I0316 21:26:45.982302 42015 ratelimit.go:65] "Setting rate limiting for podresources endpoint" qps=100 burstTokens=10
W0316 21:26:45.985931 42015 reflector.go:533] k8s.io/client-go/informers/factory.go:150: failed to list *v1.Service: serializer for text/plain; charset=utf-8 doesn't exist
E0316 21:26:45.986041 42015 reflector.go:148] k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Service: failed to list *v1.Service: serializer for text/plain; charset=utf-8 doesn't exist
==================
WARNING: DATA RACE
Write at 0x00c00088f988 by goroutine 191:
k8s.io/kubernetes/pkg/kubelet.(*Kubelet).setupDataDirs()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/kubelet/kubelet.go:1324 +0x96
k8s.io/kubernetes/pkg/kubelet.(*Kubelet).initializeModules()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/kubelet/kubelet.go:1425 +0x308
k8s.io/kubernetes/pkg/kubelet.(*Kubelet).Run()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/kubelet/kubelet.go:1554 +0x4c6
k8s.io/kubernetes/cmd/kubelet/app.startKubelet.func1()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/cmd/kubelet/app/server.go:1175 +0x59
Previous read at 0x00c00088f988 by goroutine 193:
k8s.io/kubernetes/pkg/kubelet.(*Kubelet).getRootDir()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/kubelet/kubelet_getters.go:48 +0x56
k8s.io/kubernetes/pkg/kubelet.(*Kubelet).getPodResourcesDir()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/kubelet/kubelet_getters.go:175 +0x6d
k8s.io/kubernetes/pkg/kubelet.(*Kubelet).ListenAndServePodResources()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/kubelet/kubelet.go:2780 +0xbf
k8s.io/kubernetes/cmd/kubelet/app.startKubelet.func4()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/cmd/kubelet/app/server.go:1185 +0x48
Goroutine 191 (running) created at:
k8s.io/kubernetes/cmd/kubelet/app.startKubelet()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/cmd/kubelet/app/server.go:1175 +0x167
k8s.io/kubernetes/cmd/kubelet/app.RunKubelet()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/cmd/kubelet/app/server.go:1167 +0xa07
k8s.io/kubernetes/pkg/kubemark.(*HollowKubelet).Run()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/kubemark/hollow_kubelet.go:129 +0x185
k8s.io/kubernetes/cmd/kubemark/app.run()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/cmd/kubemark/app/hollow_node.go:264 +0x129a
k8s.io/kubernetes/cmd/kubemark/app.TestHollowNode.func2.1()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/cmd/kubemark/app/hollow_node_test.go:83 +0x130
Goroutine 193 (running) created at:
k8s.io/kubernetes/cmd/kubelet/app.startKubelet()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/cmd/kubelet/app/server.go:1185 +0x5b2
k8s.io/kubernetes/cmd/kubelet/app.RunKubelet()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/cmd/kubelet/app/server.go:1167 +0xa07
k8s.io/kubernetes/pkg/kubemark.(*HollowKubelet).Run()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/kubemark/hollow_kubelet.go:129 +0x185
k8s.io/kubernetes/cmd/kubemark/app.run()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/cmd/kubemark/app/hollow_node.go:264 +0x129a
k8s.io/kubernetes/cmd/kubemark/app.TestHollowNode.func2.1()
/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/cmd/kubemark/app/hollow_node_test.go:83 +0x130
==================
I0316 21:26:46.050012 42015 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
I0316 21:26:46.051412 42015 volume_manager.go:284] "Starting Kubelet Volume Manager"
I0316 21:26:46.052627 42015 reconciler_new.go:29] "Reconciler: start to sync state"
I0316 21:26:46.052575 42015 desired_state_of_world_populator.go:145] "Desired state populator starts to run"
E0316 21:26:46.053506 42015 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"a86e2dd0-c43f-11ed-86ed-9e80c8e2f64d\" not found"
W0316 21:26:46.054008 42015 reflector.go:533] k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: serializer for text/plain; charset=utf-8 doesn't exist
E0316 21:26:46.054105 42015 reflector.go:148] k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: serializer for text/plain; charset=utf-8 doesn't exist
E0316 21:26:46.054558 42015 controller.go:146] "Failed to ensure lease exists, will retry" err="serializer for text/plain; charset=utf-8 doesn't exist" interval="200ms"
I0316 21:26:46.058447 42015 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
I0316 21:26:46.060501 42015 server.go:461] "Adding debug handlers to kubelet server"
E0316 21:26:46.068197 42015 kubelet_network_linux.go:83] "Failed to ensure that iptables hint chain exists" err=<
error creating chain "KUBE-IPTABLES-HINT": exit status 4: Fatal: can't open lock file /run/xtables.lock: Permission denied
>
I0316 21:26:46.068299 42015 kubelet_network_linux.go:71] "Failed to initialize iptables rules; some functionality may be missing." protocol=IPv4
E0316 21:26:46.070234 42015 kubelet_network_linux.go:83] "Failed to ensure that iptables hint chain exists" err=<
error creating chain "KUBE-IPTABLES-HINT": exit status 4: Fatal: can't open lock file /run/xtables.lock: Permission denied
>
I0316 21:26:46.070326 42015 kubelet_network_linux.go:71] "Failed to initialize iptables rules; some functionality may be missing." protocol=IPv6
I0316 21:26:46.070390 42015 status_manager.go:211] "Starting to sync pod status with apiserver"
I0316 21:26:46.070522 42015 kubelet.go:2233] "Starting kubelet main sync loop"
E0316 21:26:46.070719 42015 kubelet.go:2257] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
W0316 21:26:46.073049 42015 reflector.go:533] k8s.io/client-go/informers/factory.go:150: failed to list *v1.RuntimeClass: serializer for text/plain; charset=utf-8 doesn't exist
E0316 21:26:46.073147 42015 reflector.go:148] k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: serializer for text/plain; charset=utf-8 doesn't exist
I0316 21:26:46.160348 42015 kubelet_node_status.go:70] "Attempting to register node" node="a86e2dd0-c43f-11ed-86ed-9e80c8e2f64d"
E0316 21:26:46.166886 42015 kubelet_node_status.go:92] "Unable to register node with API server" err="serializer for text/plain; charset=utf-8 doesn't exist" node="a86e2dd0-c43f-11ed-86ed-9e80c8e2f64d"
E0316 21:26:46.256778 42015 controller.go:146] "Failed to ensure lease exists, will retry" err="serializer for text/plain; charset=utf-8 doesn't exist" interval="400ms"
I0316 21:26:46.370067 42015 kubelet_node_status.go:70] "Attempting to register node" node="a86e2dd0-c43f-11ed-86ed-9e80c8e2f64d"
E0316 21:26:46.372104 42015 kubelet_node_status.go:92] "Unable to register node with API server" err="serializer for text/plain; charset=utf-8 doesn't exist" node="a86e2dd0-c43f-11ed-86ed-9e80c8e2f64d"
E0316 21:26:46.659635 42015 controller.go:146] "Failed to ensure lease exists, will retry" err="serializer for text/plain; charset=utf-8 doesn't exist" interval="800ms"
I0316 21:26:46.775905 42015 kubelet_node_status.go:70] "Attempting to register node" node="a86e2dd0-c43f-11ed-86ed-9e80c8e2f64d"
E0316 21:26:46.777935 42015 kubelet_node_status.go:92] "Unable to register node with API server" err="serializer for text/plain; charset=utf-8 doesn't exist" node="a86e2dd0-c43f-11ed-86ed-9e80c8e2f64d"
W0316 21:26:46.967940 42015 reflector.go:533] k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: serializer for text/plain; charset=utf-8 doesn't exist
E0316 21:26:46.968062 42015 reflector.go:148] k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: serializer for text/plain; charset=utf-8 doesn't exist
W0316 21:26:47.293875 42015 reflector.go:533] k8s.io/client-go/informers/factory.go:150: failed to list *v1.Service: serializer for text/plain; charset=utf-8 doesn't exist
E0316 21:26:47.294001 42015 reflector.go:148] k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Service: failed to list *v1.Service: serializer for text/plain; charset=utf-8 doesn't exist
W0316 21:26:47.402628 42015 reflector.go:533] k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: serializer for text/plain; charset=utf-8 doesn't exist
E0316 21:26:47.402712 42015 reflector.go:148] k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: serializer for text/plain; charset=utf-8 doesn't exist
E0316 21:26:47.462244 42015 controller.go:146] "Failed to ensure lease exists, will retry" err="serializer for text/plain; charset=utf-8 doesn't exist" interval="1.6s"
W0316 21:26:47.474542 42015 reflector.go:533] k8s.io/client-go/informers/factory.go:150: failed to list *v1.RuntimeClass: serializer for text/plain; charset=utf-8 doesn't exist
E0316 21:26:47.474640 42015 reflector.go:148] k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: serializer for text/plain; charset=utf-8 doesn't exist
I0316 21:26:47.580660 42015 kubelet_node_status.go:70] "Attempting to register node" node="a86e2dd0-c43f-11ed-86ed-9e80c8e2f64d"
E0316 21:26:47.582547 42015 kubelet_node_status.go:92] "Unable to register node with API server" err="serializer for text/plain; charset=utf-8 doesn't exist" node="a86e2dd0-c43f-11ed-86ed-9e80c8e2f64d"
hollow_node_test.go:90: Morph "kubelet" hasn't crashed for 3s. Calling success.
testing.go:1446: race detected during execution of test
--- FAIL: TestHollowNode/kubelet (3.00s)
}
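The race detector output above reports an unsynchronized write reached via (*Kubelet).Run -> initializeModules -> setupDataDirs (kubelet.go:1324) conflicting with a read reached via ListenAndServePodResources -> getPodResourcesDir -> getRootDir (kubelet_getters.go:48), both on goroutines spawned from startKubelet. As a rough illustration only (not the actual kubelet code; fakeKubelet and rootDir are hypothetical stand-ins for whatever field is being raced on), a minimal sketch of the same pattern that go test -race would flag:

// race_sketch_test.go -- illustrative only; run with: go test -race
package racesketch

import (
	"sync"
	"testing"
)

type fakeKubelet struct {
	// rootDir stands in for the field raced on between
	// setupDataDirs (write) and getRootDir (read) in the trace above.
	rootDir string
}

// setupDataDirs writes the field with no synchronization,
// analogous to (*Kubelet).setupDataDirs in the report.
func (k *fakeKubelet) setupDataDirs() {
	k.rootDir = "/tmp/hollow-kubelet"
}

// getRootDir reads the same field, analogous to (*Kubelet).getRootDir.
func (k *fakeKubelet) getRootDir() string {
	return k.rootDir
}

func TestRaceSketch(t *testing.T) {
	k := &fakeKubelet{}
	var wg sync.WaitGroup
	wg.Add(2)
	// Analogous to startKubelet launching Kubelet.Run, which ends up in setupDataDirs.
	go func() {
		defer wg.Done()
		k.setupDataDirs()
	}()
	// Analogous to startKubelet launching ListenAndServePodResources, which reads via getRootDir.
	go func() {
		defer wg.Done()
		_ = k.getRootDir()
	}()
	wg.Wait()
}

If that is indeed the shape of the race, the flakiness would come down to scheduling: the detector only fires on runs where it observes both conflicting accesses, which matches the test sometimes passing and sometimes failing under -race.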
Which tests are flaking?
k8s.io/kubernetes/cmd/kubemark/app: TestHollowNode/kubelet
Since when has it been flaking?
It appears to be a new test added in #116645
Testgrid link
No response
Reason for failure (if possible)
No response
Anything else we need to know?
No response
Relevant SIG(s)
/sig