2026-03-10T15:35:21.715 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-10T15:35:21.738 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-10T15:35:21.789 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1080
branch: squid
description: orch/cephadm/workunits/{0-distro/ubuntu_22.04 agent/off mon_election/classic task/test_cephadm}
email: null
first_in_suite: false
flavor: default
job_id: '1080'
ktype: distro
last_in_suite: false
machine_type: vps
name: kyr-2026-03-10_01:00:38-orch-squid-none-default-vps
no_nested_subset: false
os_type: ubuntu
os_version: '22.04'
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    conf:
      global:
        mon election default strategy: 1
      mgr:
        debug mgr: 20
        debug ms: 1
        mgr/cephadm/use_agent: false
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
      osd:
        debug ms: 1
        debug osd: 20
        osd mclock iops capacity threshold hdd: 49000
    flavor: default
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
    extra_system_packages:
      deb:
      - python3-xmltodict
      - python3-jmespath
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-xmltodict
      - python3-jmespath
  workunit:
    branch: tt-squid
    sha1: 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - mon.a
  - mgr.x
  - osd.0
  - client.0
seed: 8043
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
subset: 1/64
suite: orch
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b
targets:
  vm01.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAhx6XRUk25b6A8cVjgYjRF14uv5O0rPFubeDfZOHJWLgU1fDxDTvMpc3G6bfkkiCHslxdkLrk0uO00aCs7F2Mo=
tasks:
- install: null
- exec:
    mon.a:
    - yum install -y python3 || apt install -y python3
- workunit:
    clients:
      client.0:
      - cephadm/test_cephadm.sh
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge: []
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-10_01:00:38
tube: vps
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
2026-03-10T15:35:21.789 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa; will attempt to use it
2026-03-10T15:35:21.790 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks
2026-03-10T15:35:21.790 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-10T15:35:21.790 INFO:teuthology.task.internal:Checking packages...
2026-03-10T15:35:21.791 INFO:teuthology.task.internal:Checking packages for os_type 'ubuntu', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-10T15:35:21.791 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-10T15:35:21.791 INFO:teuthology.packaging:ref: None
2026-03-10T15:35:21.791 INFO:teuthology.packaging:tag: None
2026-03-10T15:35:21.791 INFO:teuthology.packaging:branch: squid
2026-03-10T15:35:21.791 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T15:35:21.791 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&ref=squid
2026-03-10T15:35:22.411 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678-ge911bdeb-1jammy
2026-03-10T15:35:22.412 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-10T15:35:22.413 INFO:teuthology.task.internal:no buildpackages task found
2026-03-10T15:35:22.413 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-10T15:35:22.413 INFO:teuthology.task.internal:Saving configuration
2026-03-10T15:35:22.416 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-10T15:35:22.417 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-10T15:35:22.425 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm01.local', 'description': '/archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1080', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'ubuntu', 'os_version': '22.04', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-10 15:34:44.606755', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:01', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAhx6XRUk25b6A8cVjgYjRF14uv5O0rPFubeDfZOHJWLgU1fDxDTvMpc3G6bfkkiCHslxdkLrk0uO00aCs7F2Mo='}
2026-03-10T15:35:22.425 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-10T15:35:22.426 INFO:teuthology.task.internal:roles: ubuntu@vm01.local - ['mon.a', 'mgr.x', 'osd.0', 'client.0']
2026-03-10T15:35:22.426 INFO:teuthology.run_tasks:Running task console_log...
2026-03-10T15:35:22.432 DEBUG:teuthology.task.console_log:vm01 does not support IPMI; excluding
2026-03-10T15:35:22.432 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7f1a3787a290>, signals=[15])
2026-03-10T15:35:22.432 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-10T15:35:22.433 INFO:teuthology.task.internal:Opening connections...
2026-03-10T15:35:22.433 DEBUG:teuthology.task.internal:connecting to ubuntu@vm01.local
2026-03-10T15:35:22.433 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm01.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T15:35:22.488 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-10T15:35:22.489 DEBUG:teuthology.orchestra.run.vm01:> uname -m
2026-03-10T15:35:22.621 INFO:teuthology.orchestra.run.vm01.stdout:x86_64
2026-03-10T15:35:22.622 DEBUG:teuthology.orchestra.run.vm01:> cat /etc/os-release
2026-03-10T15:35:22.665 INFO:teuthology.orchestra.run.vm01.stdout:PRETTY_NAME="Ubuntu 22.04.5 LTS"
2026-03-10T15:35:22.665 INFO:teuthology.orchestra.run.vm01.stdout:NAME="Ubuntu"
2026-03-10T15:35:22.665 INFO:teuthology.orchestra.run.vm01.stdout:VERSION_ID="22.04"
2026-03-10T15:35:22.665 INFO:teuthology.orchestra.run.vm01.stdout:VERSION="22.04.5 LTS (Jammy Jellyfish)"
2026-03-10T15:35:22.665 INFO:teuthology.orchestra.run.vm01.stdout:VERSION_CODENAME=jammy
2026-03-10T15:35:22.665 INFO:teuthology.orchestra.run.vm01.stdout:ID=ubuntu
2026-03-10T15:35:22.665 INFO:teuthology.orchestra.run.vm01.stdout:ID_LIKE=debian
2026-03-10T15:35:22.665 INFO:teuthology.orchestra.run.vm01.stdout:HOME_URL="https://www.ubuntu.com/"
2026-03-10T15:35:22.665 INFO:teuthology.orchestra.run.vm01.stdout:SUPPORT_URL="https://help.ubuntu.com/"
2026-03-10T15:35:22.666 INFO:teuthology.orchestra.run.vm01.stdout:BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
2026-03-10T15:35:22.666 INFO:teuthology.orchestra.run.vm01.stdout:PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
2026-03-10T15:35:22.666 INFO:teuthology.orchestra.run.vm01.stdout:UBUNTU_CODENAME=jammy
2026-03-10T15:35:22.666 INFO:teuthology.lock.ops:Updating vm01.local on lock server
2026-03-10T15:35:22.671 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-10T15:35:22.674 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-10T15:35:22.675 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-10T15:35:22.675 DEBUG:teuthology.orchestra.run.vm01:> test '!' -e /home/ubuntu/cephtest
2026-03-10T15:35:22.709 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-10T15:35:22.710 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-10T15:35:22.710 DEBUG:teuthology.orchestra.run.vm01:> test -z $(ls -A /var/lib/ceph)
2026-03-10T15:35:22.754 INFO:teuthology.orchestra.run.vm01.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-10T15:35:22.754 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-10T15:35:22.764 DEBUG:teuthology.orchestra.run.vm01:> test -e /ceph-qa-ready
2026-03-10T15:35:22.800 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T15:35:23.023 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-10T15:35:23.025 INFO:teuthology.task.internal:Creating test directory...
2026-03-10T15:35:23.025 DEBUG:teuthology.orchestra.run.vm01:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-10T15:35:23.029 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-10T15:35:23.030 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-10T15:35:23.031 INFO:teuthology.task.internal:Creating archive directory...
2026-03-10T15:35:23.031 DEBUG:teuthology.orchestra.run.vm01:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-10T15:35:23.075 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-10T15:35:23.076 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-10T15:35:23.076 DEBUG:teuthology.orchestra.run.vm01:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-10T15:35:23.117 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T15:35:23.117 DEBUG:teuthology.orchestra.run.vm01:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-10T15:35:23.166 INFO:teuthology.orchestra.run.vm01.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T15:35:23.170 INFO:teuthology.orchestra.run.vm01.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T15:35:23.170 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-10T15:35:23.172 INFO:teuthology.task.internal:Configuring sudo...
2026-03-10T15:35:23.172 DEBUG:teuthology.orchestra.run.vm01:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-10T15:35:23.220 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-10T15:35:23.222 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-10T15:35:23.222 DEBUG:teuthology.orchestra.run.vm01:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-10T15:35:23.264 DEBUG:teuthology.orchestra.run.vm01:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T15:35:23.308 DEBUG:teuthology.orchestra.run.vm01:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T15:35:23.353 DEBUG:teuthology.orchestra.run.vm01:> set -ex
2026-03-10T15:35:23.353 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-10T15:35:23.406 DEBUG:teuthology.orchestra.run.vm01:> sudo service rsyslog restart
2026-03-10T15:35:23.461 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-10T15:35:23.463 INFO:teuthology.task.internal:Starting timer...
2026-03-10T15:35:23.463 INFO:teuthology.run_tasks:Running task pcp...
2026-03-10T15:35:23.465 INFO:teuthology.run_tasks:Running task selinux...
2026-03-10T15:35:23.469 INFO:teuthology.task.selinux:Excluding vm01: VMs are not yet supported
2026-03-10T15:35:23.469 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-10T15:35:23.469 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-10T15:35:23.469 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-10T15:35:23.469 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-10T15:35:23.471 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-10T15:35:23.471 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/ceph/ceph-cm-ansible.git
2026-03-10T15:35:23.472 INFO:teuthology.repo_utils:Fetching github.com_ceph_ceph-cm-ansible_main from origin
2026-03-10T15:35:23.998 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-10T15:35:24.005 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-10T15:35:24.005 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventorysqxtwmk3 --limit vm01.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-10T15:37:25.760 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm01.local')]
2026-03-10T15:37:25.761 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm01.local'
2026-03-10T15:37:25.761 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm01.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T15:37:25.824 DEBUG:teuthology.orchestra.run.vm01:> true
2026-03-10T15:37:26.056 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm01.local'
2026-03-10T15:37:26.057 INFO:teuthology.run_tasks:Running task clock...
2026-03-10T15:37:26.078 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-10T15:37:26.079 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-10T15:37:26.079 DEBUG:teuthology.orchestra.run.vm01:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: ntpd 4.2.8p15@1.3728-o Wed Feb 16 17:13:02 UTC 2022 (1): Starting
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: Command line: ntpd -gq
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: ----------------------------------------------------
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: ntp-4 is maintained by Network Time Foundation,
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: Inc. (NTF), a non-profit 501(c)(3) public-benefit
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: corporation. Support and training for ntp-4 are
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: available at https://www.nwtime.org/support
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: ----------------------------------------------------
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: proto: precision = 0.029 usec (-25)
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: basedate set to 2022-02-04
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: gps base set to 2022-02-06 (week 2196)
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): good hash signature
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): loaded, expire=2025-12-28T00:00:00Z last=2017-01-01T00:00:00Z ofs=37
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stderr:10 Mar 15:37:26 ntpd[16106]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): expired 73 days ago
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: Listen and drop on 0 v6wildcard [::]:123
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: Listen and drop on 1 v4wildcard 0.0.0.0:123
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: Listen normally on 2 lo 127.0.0.1:123
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: Listen normally on 3 ens3 192.168.123.101:123
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: Listen normally on 4 lo [::1]:123
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: Listen normally on 5 ens3 [fe80::5055:ff:fe00:1%2]:123
2026-03-10T15:37:26.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:26 ntpd[16106]: Listening on routing socket on fd #22 for interface updates
2026-03-10T15:37:27.113 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:27 ntpd[16106]: Soliciting pool server 46.38.241.235
2026-03-10T15:37:28.111 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:28 ntpd[16106]: Soliciting pool server 185.207.105.38
2026-03-10T15:37:28.112 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:28 ntpd[16106]: Soliciting pool server 116.203.218.109
2026-03-10T15:37:29.111 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:29 ntpd[16106]: Soliciting pool server 144.76.66.156
2026-03-10T15:37:29.111 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:29 ntpd[16106]: Soliciting pool server 93.177.65.20
2026-03-10T15:37:29.112 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:29 ntpd[16106]: Soliciting pool server 37.114.42.119
2026-03-10T15:37:30.110 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:30 ntpd[16106]: Soliciting pool server 93.241.86.156
2026-03-10T15:37:30.110 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:30 ntpd[16106]: Soliciting pool server 194.36.144.87
2026-03-10T15:37:30.110 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:30 ntpd[16106]: Soliciting pool server 185.233.107.180
2026-03-10T15:37:30.111 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:30 ntpd[16106]: Soliciting pool server 131.188.3.221
2026-03-10T15:37:31.110 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:31 ntpd[16106]: Soliciting pool server 5.75.181.179
2026-03-10T15:37:31.110 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:31 ntpd[16106]: Soliciting pool server 176.9.42.91
2026-03-10T15:37:31.110 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:31 ntpd[16106]: Soliciting pool server 104.167.24.26
2026-03-10T15:37:31.163 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:31 ntpd[16106]: Soliciting pool server 185.125.190.58
2026-03-10T15:37:31.179 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:31 ntpd[16106]: Soliciting pool server 5.45.97.204
2026-03-10T15:37:32.109 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:32 ntpd[16106]: Soliciting pool server 185.125.190.57
2026-03-10T15:37:32.109 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:32 ntpd[16106]: Soliciting pool server 46.224.156.215
2026-03-10T15:37:32.110 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:32 ntpd[16106]: Soliciting pool server 172.104.134.72
2026-03-10T15:37:32.110 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:32 ntpd[16106]: Soliciting pool server 116.203.96.227
2026-03-10T15:37:36.136 INFO:teuthology.orchestra.run.vm01.stdout:10 Mar 15:37:36 ntpd[16106]: ntpd: time slew +0.022035 s
2026-03-10T15:37:36.136 INFO:teuthology.orchestra.run.vm01.stdout:ntpd: time slew +0.022035s
2026-03-10T15:37:36.154 INFO:teuthology.orchestra.run.vm01.stdout:     remote           refid      st t when poll reach   delay   offset   jitter
2026-03-10T15:37:36.154 INFO:teuthology.orchestra.run.vm01.stdout:==============================================================================
2026-03-10T15:37:36.154 INFO:teuthology.orchestra.run.vm01.stdout: 0.ubuntu.pool.n .POOL.          16 p    -   64    0   0.000   +0.000   0.000
2026-03-10T15:37:36.154 INFO:teuthology.orchestra.run.vm01.stdout: 1.ubuntu.pool.n .POOL.          16 p    -   64    0   0.000   +0.000   0.000
2026-03-10T15:37:36.154 INFO:teuthology.orchestra.run.vm01.stdout: 2.ubuntu.pool.n .POOL.          16 p    -   64    0   0.000   +0.000   0.000
2026-03-10T15:37:36.154 INFO:teuthology.orchestra.run.vm01.stdout: 3.ubuntu.pool.n .POOL.          16 p    -   64    0   0.000   +0.000   0.000
2026-03-10T15:37:36.154 INFO:teuthology.orchestra.run.vm01.stdout: ntp.ubuntu.com  .POOL.          16 p    -   64    0   0.000   +0.000   0.000
2026-03-10T15:37:36.155 INFO:teuthology.run_tasks:Running task install...
2026-03-10T15:37:36.156 DEBUG:teuthology.task.install:project ceph
2026-03-10T15:37:36.157 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-10T15:37:36.157 DEBUG:teuthology.task.install:config {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-10T15:37:36.157 INFO:teuthology.task.install:Using flavor: default
2026-03-10T15:37:36.159 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'ceph-volume', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'ceph-volume', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-10T15:37:36.159 INFO:teuthology.task.install:extra packages: []
2026-03-10T15:37:36.159 DEBUG:teuthology.orchestra.run.vm01:> sudo apt-key list | grep Ceph
2026-03-10T15:37:36.231 INFO:teuthology.orchestra.run.vm01.stderr:Warning: apt-key is deprecated. Manage keyring files in trusted.gpg.d instead (see apt-key(8)).
2026-03-10T15:37:36.249 INFO:teuthology.orchestra.run.vm01.stdout:uid [ unknown] Ceph automated package build (Ceph automated package build)
2026-03-10T15:37:36.249 INFO:teuthology.orchestra.run.vm01.stdout:uid [ unknown] Ceph.com (release key)
2026-03-10T15:37:36.249 INFO:teuthology.task.install.deb:Installing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on remote deb x86_64
2026-03-10T15:37:36.249 INFO:teuthology.task.install.deb:Installing system (non-project) packages: python3-xmltodict, python3-jmespath on remote deb x86_64
2026-03-10T15:37:36.249 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T15:37:36.822 INFO:teuthology.task.install.deb:Pulling from https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default/
2026-03-10T15:37:36.822 INFO:teuthology.task.install.deb:Package version is 19.2.3-678-ge911bdeb-1jammy
2026-03-10T15:37:37.347 DEBUG:teuthology.orchestra.run.vm01:> set -ex
2026-03-10T15:37:37.347 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/etc/apt/sources.list.d/ceph.list
2026-03-10T15:37:37.356 DEBUG:teuthology.orchestra.run.vm01:> sudo apt-get update
2026-03-10T15:37:37.556 INFO:teuthology.orchestra.run.vm01.stdout:Hit:1 https://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-10T15:37:37.560 INFO:teuthology.orchestra.run.vm01.stdout:Hit:2 https://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-10T15:37:37.568 INFO:teuthology.orchestra.run.vm01.stdout:Hit:3 https://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-10T15:37:37.658 INFO:teuthology.orchestra.run.vm01.stdout:Hit:4 https://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-10T15:37:38.047 INFO:teuthology.orchestra.run.vm01.stdout:Ign:5 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy InRelease
2026-03-10T15:37:38.157 INFO:teuthology.orchestra.run.vm01.stdout:Get:6 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy Release [7662 B]
2026-03-10T15:37:38.267 INFO:teuthology.orchestra.run.vm01.stdout:Ign:7 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy Release.gpg
2026-03-10T15:37:38.376 INFO:teuthology.orchestra.run.vm01.stdout:Get:8 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 Packages [18.1 kB]
2026-03-10T15:37:38.455 INFO:teuthology.orchestra.run.vm01.stdout:Fetched 25.8 kB in 1s (27.8 kB/s)
2026-03-10T15:37:39.164 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:37:39.176 DEBUG:teuthology.orchestra.run.vm01:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install ceph=19.2.3-678-ge911bdeb-1jammy cephadm=19.2.3-678-ge911bdeb-1jammy ceph-mds=19.2.3-678-ge911bdeb-1jammy ceph-mgr=19.2.3-678-ge911bdeb-1jammy ceph-common=19.2.3-678-ge911bdeb-1jammy ceph-fuse=19.2.3-678-ge911bdeb-1jammy ceph-test=19.2.3-678-ge911bdeb-1jammy ceph-volume=19.2.3-678-ge911bdeb-1jammy radosgw=19.2.3-678-ge911bdeb-1jammy python3-rados=19.2.3-678-ge911bdeb-1jammy python3-rgw=19.2.3-678-ge911bdeb-1jammy python3-cephfs=19.2.3-678-ge911bdeb-1jammy python3-rbd=19.2.3-678-ge911bdeb-1jammy libcephfs2=19.2.3-678-ge911bdeb-1jammy libcephfs-dev=19.2.3-678-ge911bdeb-1jammy librados2=19.2.3-678-ge911bdeb-1jammy librbd1=19.2.3-678-ge911bdeb-1jammy rbd-fuse=19.2.3-678-ge911bdeb-1jammy
2026-03-10T15:37:39.210 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:37:39.414 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-10T15:37:39.415 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-10T15:37:39.581 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-10T15:37:39.581 INFO:teuthology.orchestra.run.vm01.stdout:  kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-10T15:37:39.582 INFO:teuthology.orchestra.run.vm01.stdout:  libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-10T15:37:39.582 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-10T15:37:39.582 INFO:teuthology.orchestra.run.vm01.stdout:The following additional packages will be installed:
2026-03-10T15:37:39.582 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-base ceph-mgr-cephadm ceph-mgr-dashboard ceph-mgr-diskprediction-local
2026-03-10T15:37:39.582 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-k8sevents ceph-mgr-modules-core ceph-mon ceph-osd jq
2026-03-10T15:37:39.582 INFO:teuthology.orchestra.run.vm01.stdout:  libdouble-conversion3 libfuse2 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-10T15:37:39.582 INFO:teuthology.orchestra.run.vm01.stdout:  liboath0 libonig5 libpcre2-16-0 libqt5core5a libqt5dbus5 libqt5network5
2026-03-10T15:37:39.582 INFO:teuthology.orchestra.run.vm01.stdout:  libradosstriper1 librdkafka1 libreadline-dev librgw2 libsqlite3-mod-ceph
2026-03-10T15:37:39.582 INFO:teuthology.orchestra.run.vm01.stdout:  libthrift-0.16.0 lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli
2026-03-10T15:37:39.582 INFO:teuthology.orchestra.run.vm01.stdout:  pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-10T15:37:39.582 INFO:teuthology.orchestra.run.vm01.stdout:  python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-10T15:37:39.582 INFO:teuthology.orchestra.run.vm01.stdout:  python3-cherrypy3 python3-google-auth python3-iniconfig
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python3-pastescript python3-pecan python3-pluggy python3-portend
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python3-prettytable python3-psutil python3-py python3-pygments
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python3-pyinotify python3-pytest python3-repoze.lru
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python3-simplejson python3-singledispatch python3-sklearn
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python3-toml python3-waitress python3-wcwidth python3-webob
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python3-websocket python3-webtest python3-werkzeug python3-zc.lockfile
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  qttranslations5-l10n smartmontools socat unzip xmlstarlet zip
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:Suggested packages:
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python3-influxdb readline-doc python3-beaker python-mako-doc
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python-natsort-doc httpd-wsgi libapache2-mod-python libapache2-mod-scgi
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  libjs-mochikit python-pecan-doc python-psutil-doc subversion
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python-pygments-doc ttf-bitstream-vera python-pyinotify-doc python3-dap
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python-sklearn-doc ipython3 python-waitress-doc python-webob-doc
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  python-webtest-doc python-werkzeug-doc python3-watchdog gsmartcontrol
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  smart-notifier mailx | mailutils
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:Recommended packages:
2026-03-10T15:37:39.583 INFO:teuthology.orchestra.run.vm01.stdout:  btrfs-tools
2026-03-10T15:37:39.619 INFO:teuthology.orchestra.run.vm01.stdout:The following NEW packages will be installed:
2026-03-10T15:37:39.619 INFO:teuthology.orchestra.run.vm01.stdout:  ceph ceph-base ceph-common ceph-fuse ceph-mds ceph-mgr ceph-mgr-cephadm
2026-03-10T15:37:39.619 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-k8sevents
2026-03-10T15:37:39.619 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-modules-core ceph-mon ceph-osd ceph-test ceph-volume cephadm jq
2026-03-10T15:37:39.619 INFO:teuthology.orchestra.run.vm01.stdout:  libcephfs-dev libcephfs2 libdouble-conversion3 libfuse2 libjq1 liblttng-ust1
2026-03-10T15:37:39.619 INFO:teuthology.orchestra.run.vm01.stdout:  liblua5.3-dev libnbd0 liboath0 libonig5 libpcre2-16-0 libqt5core5a
2026-03-10T15:37:39.619 INFO:teuthology.orchestra.run.vm01.stdout:  libqt5dbus5 libqt5network5 libradosstriper1 librdkafka1 libreadline-dev
2026-03-10T15:37:39.619 INFO:teuthology.orchestra.run.vm01.stdout:  librgw2 libsqlite3-mod-ceph libthrift-0.16.0 lua-any lua-sec lua-socket
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  lua5.1 luarocks nvme-cli pkg-config python-asyncssh-doc
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  python3-ceph-argparse python3-ceph-common python3-cephfs python3-cheroot
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  python3-cherrypy3 python3-google-auth python3-iniconfig
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  python3-pastescript python3-pecan python3-pluggy python3-portend
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  python3-prettytable python3-psutil python3-py python3-pygments
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  python3-pyinotify python3-pytest python3-rados python3-rbd
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  python3-repoze.lru python3-requests-oauthlib python3-rgw python3-routes
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  python3-rsa python3-simplegeneric python3-simplejson python3-singledispatch
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  python3-sklearn python3-sklearn-lib python3-tempita python3-tempora
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  python3-threadpoolctl python3-toml python3-waitress python3-wcwidth
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:  python3-zc.lockfile qttranslations5-l10n
radosgw rbd-fuse smartmontools 2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout: socat unzip xmlstarlet zip 2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be upgraded: 2026-03-10T15:37:39.620 INFO:teuthology.orchestra.run.vm01.stdout: librados2 librbd1 2026-03-10T15:37:40.074 INFO:teuthology.orchestra.run.vm01.stdout:2 upgraded, 107 newly installed, 0 to remove and 12 not upgraded. 2026-03-10T15:37:40.074 INFO:teuthology.orchestra.run.vm01.stdout:Need to get 178 MB of archives. 2026-03-10T15:37:40.074 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 782 MB of additional disk space will be used. 2026-03-10T15:37:40.074 INFO:teuthology.orchestra.run.vm01.stdout:Get:1 https://archive.ubuntu.com/ubuntu jammy/main amd64 liblttng-ust1 amd64 2.13.1-1ubuntu1 [190 kB] 2026-03-10T15:37:40.205 INFO:teuthology.orchestra.run.vm01.stdout:Get:2 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 librbd1 amd64 19.2.3-678-ge911bdeb-1jammy [3257 kB] 2026-03-10T15:37:40.532 INFO:teuthology.orchestra.run.vm01.stdout:Get:3 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libdouble-conversion3 amd64 3.1.7-4 [39.0 kB] 2026-03-10T15:37:40.546 INFO:teuthology.orchestra.run.vm01.stdout:Get:4 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libpcre2-16-0 amd64 10.39-3ubuntu0.1 [203 kB] 2026-03-10T15:37:40.641 INFO:teuthology.orchestra.run.vm01.stdout:Get:5 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5core5a amd64 5.15.3+dfsg-2ubuntu0.2 [2006 kB] 2026-03-10T15:37:40.912 INFO:teuthology.orchestra.run.vm01.stdout:Get:6 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5dbus5 amd64 5.15.3+dfsg-2ubuntu0.2 [222 kB] 2026-03-10T15:37:40.926 INFO:teuthology.orchestra.run.vm01.stdout:Get:7 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5network5 amd64 5.15.3+dfsg-2ubuntu0.2 
[731 kB] 2026-03-10T15:37:40.984 INFO:teuthology.orchestra.run.vm01.stdout:Get:8 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libthrift-0.16.0 amd64 0.16.0-2 [267 kB] 2026-03-10T15:37:40.993 INFO:teuthology.orchestra.run.vm01.stdout:Get:9 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libnbd0 amd64 1.10.5-1 [71.3 kB] 2026-03-10T15:37:40.995 INFO:teuthology.orchestra.run.vm01.stdout:Get:10 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-wcwidth all 0.2.5+dfsg1-1 [21.9 kB] 2026-03-10T15:37:40.996 INFO:teuthology.orchestra.run.vm01.stdout:Get:11 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-prettytable all 2.5.0-2 [31.3 kB] 2026-03-10T15:37:40.998 INFO:teuthology.orchestra.run.vm01.stdout:Get:12 https://archive.ubuntu.com/ubuntu jammy/universe amd64 librdkafka1 amd64 1.8.0-1build1 [633 kB] 2026-03-10T15:37:41.007 INFO:teuthology.orchestra.run.vm01.stdout:Get:13 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 librados2 amd64 19.2.3-678-ge911bdeb-1jammy [3597 kB] 2026-03-10T15:37:41.020 INFO:teuthology.orchestra.run.vm01.stdout:Get:14 https://archive.ubuntu.com/ubuntu jammy/main amd64 libreadline-dev amd64 8.1.2-1 [166 kB] 2026-03-10T15:37:41.023 INFO:teuthology.orchestra.run.vm01.stdout:Get:15 https://archive.ubuntu.com/ubuntu jammy/main amd64 liblua5.3-dev amd64 5.3.6-1build1 [167 kB] 2026-03-10T15:37:41.027 INFO:teuthology.orchestra.run.vm01.stdout:Get:16 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua5.1 amd64 5.1.5-8.1build4 [94.6 kB] 2026-03-10T15:37:41.114 INFO:teuthology.orchestra.run.vm01.stdout:Get:17 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua-any all 27ubuntu1 [5034 B] 2026-03-10T15:37:41.114 INFO:teuthology.orchestra.run.vm01.stdout:Get:18 https://archive.ubuntu.com/ubuntu jammy/main amd64 zip amd64 3.0-12build2 [176 kB] 2026-03-10T15:37:41.117 INFO:teuthology.orchestra.run.vm01.stdout:Get:19 
https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 unzip amd64 6.0-26ubuntu3.2 [175 kB] 2026-03-10T15:37:41.121 INFO:teuthology.orchestra.run.vm01.stdout:Get:20 https://archive.ubuntu.com/ubuntu jammy/universe amd64 luarocks all 3.8.0+dfsg1-1 [140 kB] 2026-03-10T15:37:41.128 INFO:teuthology.orchestra.run.vm01.stdout:Get:21 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 liboath0 amd64 2.6.7-3ubuntu0.1 [41.3 kB] 2026-03-10T15:37:41.129 INFO:teuthology.orchestra.run.vm01.stdout:Get:22 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs2 amd64 19.2.3-678-ge911bdeb-1jammy [979 kB] 2026-03-10T15:37:41.129 INFO:teuthology.orchestra.run.vm01.stdout:Get:23 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.functools all 3.4.0-2 [9030 B] 2026-03-10T15:37:41.142 INFO:teuthology.orchestra.run.vm01.stdout:Get:24 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-rados amd64 19.2.3-678-ge911bdeb-1jammy [357 kB] 2026-03-10T15:37:41.146 INFO:teuthology.orchestra.run.vm01.stdout:Get:25 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-argparse all 19.2.3-678-ge911bdeb-1jammy [32.9 kB] 2026-03-10T15:37:41.146 INFO:teuthology.orchestra.run.vm01.stdout:Get:26 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-cephfs amd64 19.2.3-678-ge911bdeb-1jammy [184 kB] 2026-03-10T15:37:41.150 INFO:teuthology.orchestra.run.vm01.stdout:Get:27 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-common all 19.2.3-678-ge911bdeb-1jammy [70.1 kB] 2026-03-10T15:37:41.151 INFO:teuthology.orchestra.run.vm01.stdout:Get:28 
https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-rbd amd64 19.2.3-678-ge911bdeb-1jammy [334 kB] 2026-03-10T15:37:41.156 INFO:teuthology.orchestra.run.vm01.stdout:Get:29 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 librgw2 amd64 19.2.3-678-ge911bdeb-1jammy [6935 kB] 2026-03-10T15:37:41.212 INFO:teuthology.orchestra.run.vm01.stdout:Get:30 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-cheroot all 8.5.2+ds1-1ubuntu3.1 [71.1 kB] 2026-03-10T15:37:41.213 INFO:teuthology.orchestra.run.vm01.stdout:Get:31 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.classes all 3.2.1-3 [6452 B] 2026-03-10T15:37:41.213 INFO:teuthology.orchestra.run.vm01.stdout:Get:32 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.text all 3.6.0-2 [8716 B] 2026-03-10T15:37:41.213 INFO:teuthology.orchestra.run.vm01.stdout:Get:33 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.collections all 3.4.0-2 [11.4 kB] 2026-03-10T15:37:41.310 INFO:teuthology.orchestra.run.vm01.stdout:Get:34 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-tempora all 4.1.2-1 [14.8 kB] 2026-03-10T15:37:41.310 INFO:teuthology.orchestra.run.vm01.stdout:Get:35 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-portend all 3.0.0-1 [7240 B] 2026-03-10T15:37:41.310 INFO:teuthology.orchestra.run.vm01.stdout:Get:36 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-zc.lockfile all 2.0-1 [8980 B] 2026-03-10T15:37:41.310 INFO:teuthology.orchestra.run.vm01.stdout:Get:37 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cherrypy3 all 18.6.1-4 [208 kB] 2026-03-10T15:37:41.313 INFO:teuthology.orchestra.run.vm01.stdout:Get:38 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-natsort all 8.0.2-1 [35.3 kB] 2026-03-10T15:37:41.314 
INFO:teuthology.orchestra.run.vm01.stdout:Get:39 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-logutils all 0.3.3-8 [17.6 kB] 2026-03-10T15:37:41.407 INFO:teuthology.orchestra.run.vm01.stdout:Get:40 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-mako all 1.1.3+ds1-2ubuntu0.1 [60.5 kB] 2026-03-10T15:37:41.408 INFO:teuthology.orchestra.run.vm01.stdout:Get:41 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-simplegeneric all 0.8.1-3 [11.3 kB] 2026-03-10T15:37:41.408 INFO:teuthology.orchestra.run.vm01.stdout:Get:42 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-singledispatch all 3.4.0.3-3 [7320 B] 2026-03-10T15:37:41.409 INFO:teuthology.orchestra.run.vm01.stdout:Get:43 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-webob all 1:1.8.6-1.1ubuntu0.1 [86.7 kB] 2026-03-10T15:37:41.470 INFO:teuthology.orchestra.run.vm01.stdout:Get:44 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-rgw amd64 19.2.3-678-ge911bdeb-1jammy [112 kB] 2026-03-10T15:37:41.472 INFO:teuthology.orchestra.run.vm01.stdout:Get:45 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libradosstriper1 amd64 19.2.3-678-ge911bdeb-1jammy [470 kB] 2026-03-10T15:37:41.477 INFO:teuthology.orchestra.run.vm01.stdout:Get:46 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-common amd64 19.2.3-678-ge911bdeb-1jammy [26.5 MB] 2026-03-10T15:37:41.505 INFO:teuthology.orchestra.run.vm01.stdout:Get:47 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-waitress all 1.4.4-1.1ubuntu1.1 [47.0 kB] 2026-03-10T15:37:41.506 INFO:teuthology.orchestra.run.vm01.stdout:Get:48 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-tempita all 0.5.2-6ubuntu1 [15.1 kB] 2026-03-10T15:37:41.506 
INFO:teuthology.orchestra.run.vm01.stdout:Get:49 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-paste all 3.5.0+dfsg1-1 [456 kB] 2026-03-10T15:37:41.513 INFO:teuthology.orchestra.run.vm01.stdout:Get:50 https://archive.ubuntu.com/ubuntu jammy/main amd64 python-pastedeploy-tpl all 2.1.1-1 [4892 B] 2026-03-10T15:37:41.513 INFO:teuthology.orchestra.run.vm01.stdout:Get:51 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pastedeploy all 2.1.1-1 [26.6 kB] 2026-03-10T15:37:41.513 INFO:teuthology.orchestra.run.vm01.stdout:Get:52 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-webtest all 2.0.35-1 [28.5 kB] 2026-03-10T15:37:41.603 INFO:teuthology.orchestra.run.vm01.stdout:Get:53 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pecan all 1.3.3-4ubuntu2 [87.3 kB] 2026-03-10T15:37:41.607 INFO:teuthology.orchestra.run.vm01.stdout:Get:54 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-werkzeug all 2.0.2+dfsg1-1ubuntu0.22.04.3 [181 kB] 2026-03-10T15:37:41.611 INFO:teuthology.orchestra.run.vm01.stdout:Get:55 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libfuse2 amd64 2.9.9-5ubuntu3 [90.3 kB] 2026-03-10T15:37:41.611 INFO:teuthology.orchestra.run.vm01.stdout:Get:56 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python3-asyncssh all 2.5.0-1ubuntu0.1 [189 kB] 2026-03-10T15:37:41.700 INFO:teuthology.orchestra.run.vm01.stdout:Get:57 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-repoze.lru all 0.7-2 [12.1 kB] 2026-03-10T15:37:41.700 INFO:teuthology.orchestra.run.vm01.stdout:Get:58 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-routes all 2.5.1-1ubuntu1 [89.0 kB] 2026-03-10T15:37:41.702 INFO:teuthology.orchestra.run.vm01.stdout:Get:59 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn-lib amd64 0.23.2-5ubuntu6 [2058 kB] 2026-03-10T15:37:41.742 INFO:teuthology.orchestra.run.vm01.stdout:Get:60 https://archive.ubuntu.com/ubuntu jammy/universe amd64 
python3-joblib all 0.17.0-4ubuntu1 [204 kB] 2026-03-10T15:37:41.745 INFO:teuthology.orchestra.run.vm01.stdout:Get:61 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-threadpoolctl all 3.1.0-1 [21.3 kB] 2026-03-10T15:37:41.745 INFO:teuthology.orchestra.run.vm01.stdout:Get:62 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn all 0.23.2-5ubuntu6 [1829 kB] 2026-03-10T15:37:41.823 INFO:teuthology.orchestra.run.vm01.stdout:Get:63 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cachetools all 5.0.0-1 [9722 B] 2026-03-10T15:37:41.823 INFO:teuthology.orchestra.run.vm01.stdout:Get:64 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-rsa all 4.8-1 [28.4 kB] 2026-03-10T15:37:41.824 INFO:teuthology.orchestra.run.vm01.stdout:Get:65 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-google-auth all 1.5.1-3 [35.7 kB] 2026-03-10T15:37:41.824 INFO:teuthology.orchestra.run.vm01.stdout:Get:66 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-requests-oauthlib all 1.3.0+ds-0.1 [18.7 kB] 2026-03-10T15:37:41.895 INFO:teuthology.orchestra.run.vm01.stdout:Get:67 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-websocket all 1.2.3-1 [34.7 kB] 2026-03-10T15:37:41.896 INFO:teuthology.orchestra.run.vm01.stdout:Get:68 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-kubernetes all 12.0.1-1ubuntu1 [353 kB] 2026-03-10T15:37:41.899 INFO:teuthology.orchestra.run.vm01.stdout:Get:69 https://archive.ubuntu.com/ubuntu jammy/main amd64 libonig5 amd64 6.9.7.1-2build1 [172 kB] 2026-03-10T15:37:41.901 INFO:teuthology.orchestra.run.vm01.stdout:Get:70 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libjq1 amd64 1.6-2.1ubuntu3.1 [133 kB] 2026-03-10T15:37:41.902 INFO:teuthology.orchestra.run.vm01.stdout:Get:71 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 jq amd64 1.6-2.1ubuntu3.1 [52.5 kB] 2026-03-10T15:37:41.993 INFO:teuthology.orchestra.run.vm01.stdout:Get:72 
https://archive.ubuntu.com/ubuntu jammy/main amd64 socat amd64 1.7.4.1-3ubuntu4 [349 kB] 2026-03-10T15:37:41.997 INFO:teuthology.orchestra.run.vm01.stdout:Get:73 https://archive.ubuntu.com/ubuntu jammy/universe amd64 xmlstarlet amd64 1.6.1-2.1 [265 kB] 2026-03-10T15:37:41.999 INFO:teuthology.orchestra.run.vm01.stdout:Get:74 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua-socket amd64 3.0~rc1+git+ac3201d-6 [78.9 kB] 2026-03-10T15:37:42.000 INFO:teuthology.orchestra.run.vm01.stdout:Get:75 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua-sec amd64 1.0.2-1 [37.6 kB] 2026-03-10T15:37:42.000 INFO:teuthology.orchestra.run.vm01.stdout:Get:76 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 nvme-cli amd64 1.16-3ubuntu0.3 [474 kB] 2026-03-10T15:37:42.091 INFO:teuthology.orchestra.run.vm01.stdout:Get:77 https://archive.ubuntu.com/ubuntu jammy/main amd64 pkg-config amd64 0.29.2-1ubuntu3 [48.2 kB] 2026-03-10T15:37:42.091 INFO:teuthology.orchestra.run.vm01.stdout:Get:78 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python-asyncssh-doc all 2.5.0-1ubuntu0.1 [309 kB] 2026-03-10T15:37:42.094 INFO:teuthology.orchestra.run.vm01.stdout:Get:79 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-iniconfig all 1.1.1-2 [6024 B] 2026-03-10T15:37:42.094 INFO:teuthology.orchestra.run.vm01.stdout:Get:80 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pastescript all 2.0.2-4 [54.6 kB] 2026-03-10T15:37:42.095 INFO:teuthology.orchestra.run.vm01.stdout:Get:81 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pluggy all 0.13.0-7.1 [19.0 kB] 2026-03-10T15:37:42.188 INFO:teuthology.orchestra.run.vm01.stdout:Get:82 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-psutil amd64 5.9.0-1build1 [158 kB] 2026-03-10T15:37:42.190 INFO:teuthology.orchestra.run.vm01.stdout:Get:83 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-py all 1.10.0-1 [71.9 kB] 2026-03-10T15:37:42.191 
INFO:teuthology.orchestra.run.vm01.stdout:Get:84 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-pygments all 2.11.2+dfsg-2ubuntu0.1 [750 kB] 2026-03-10T15:37:42.199 INFO:teuthology.orchestra.run.vm01.stdout:Get:85 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pyinotify all 0.9.6-1.3 [24.8 kB] 2026-03-10T15:37:42.199 INFO:teuthology.orchestra.run.vm01.stdout:Get:86 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-toml all 0.10.2-1 [16.5 kB] 2026-03-10T15:37:42.286 INFO:teuthology.orchestra.run.vm01.stdout:Get:87 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pytest all 6.2.5-1ubuntu2 [214 kB] 2026-03-10T15:37:42.289 INFO:teuthology.orchestra.run.vm01.stdout:Get:88 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-simplejson amd64 3.17.6-1build1 [54.7 kB] 2026-03-10T15:37:42.289 INFO:teuthology.orchestra.run.vm01.stdout:Get:89 https://archive.ubuntu.com/ubuntu jammy/universe amd64 qttranslations5-l10n all 5.15.3-1 [1983 kB] 2026-03-10T15:37:42.310 INFO:teuthology.orchestra.run.vm01.stdout:Get:90 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 smartmontools amd64 7.2-1ubuntu0.1 [583 kB] 2026-03-10T15:37:42.609 INFO:teuthology.orchestra.run.vm01.stdout:Get:91 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-base amd64 19.2.3-678-ge911bdeb-1jammy [5178 kB] 2026-03-10T15:37:42.750 INFO:teuthology.orchestra.run.vm01.stdout:Get:92 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-modules-core all 19.2.3-678-ge911bdeb-1jammy [248 kB] 2026-03-10T15:37:42.821 INFO:teuthology.orchestra.run.vm01.stdout:Get:93 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libsqlite3-mod-ceph amd64 19.2.3-678-ge911bdeb-1jammy [125 kB] 2026-03-10T15:37:42.822 
INFO:teuthology.orchestra.run.vm01.stdout:Get:94 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr amd64 19.2.3-678-ge911bdeb-1jammy [1081 kB] 2026-03-10T15:37:42.842 INFO:teuthology.orchestra.run.vm01.stdout:Get:95 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mon amd64 19.2.3-678-ge911bdeb-1jammy [6239 kB] 2026-03-10T15:37:43.074 INFO:teuthology.orchestra.run.vm01.stdout:Get:96 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-osd amd64 19.2.3-678-ge911bdeb-1jammy [23.0 MB] 2026-03-10T15:37:43.899 INFO:teuthology.orchestra.run.vm01.stdout:Get:97 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph amd64 19.2.3-678-ge911bdeb-1jammy [14.2 kB] 2026-03-10T15:37:43.899 INFO:teuthology.orchestra.run.vm01.stdout:Get:98 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-fuse amd64 19.2.3-678-ge911bdeb-1jammy [1173 kB] 2026-03-10T15:37:43.975 INFO:teuthology.orchestra.run.vm01.stdout:Get:99 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mds amd64 19.2.3-678-ge911bdeb-1jammy [2503 kB] 2026-03-10T15:37:44.071 INFO:teuthology.orchestra.run.vm01.stdout:Get:100 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 cephadm amd64 19.2.3-678-ge911bdeb-1jammy [798 kB] 2026-03-10T15:37:44.093 INFO:teuthology.orchestra.run.vm01.stdout:Get:101 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-cephadm all 19.2.3-678-ge911bdeb-1jammy [157 kB] 
2026-03-10T15:37:44.094 INFO:teuthology.orchestra.run.vm01.stdout:Get:102 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-dashboard all 19.2.3-678-ge911bdeb-1jammy [2396 kB] 2026-03-10T15:37:44.186 INFO:teuthology.orchestra.run.vm01.stdout:Get:103 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-diskprediction-local all 19.2.3-678-ge911bdeb-1jammy [8625 kB] 2026-03-10T15:37:44.481 INFO:teuthology.orchestra.run.vm01.stdout:Get:104 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-k8sevents all 19.2.3-678-ge911bdeb-1jammy [14.3 kB] 2026-03-10T15:37:44.481 INFO:teuthology.orchestra.run.vm01.stdout:Get:105 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-test amd64 19.2.3-678-ge911bdeb-1jammy [52.1 MB] 2026-03-10T15:37:46.426 INFO:teuthology.orchestra.run.vm01.stdout:Get:106 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-volume all 19.2.3-678-ge911bdeb-1jammy [135 kB] 2026-03-10T15:37:46.426 INFO:teuthology.orchestra.run.vm01.stdout:Get:107 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-dev amd64 19.2.3-678-ge911bdeb-1jammy [41.0 kB] 2026-03-10T15:37:46.426 INFO:teuthology.orchestra.run.vm01.stdout:Get:108 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 radosgw amd64 19.2.3-678-ge911bdeb-1jammy [13.7 MB] 2026-03-10T15:37:46.908 INFO:teuthology.orchestra.run.vm01.stdout:Get:109 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default 
jammy/main amd64 rbd-fuse amd64 19.2.3-678-ge911bdeb-1jammy [92.2 kB] 2026-03-10T15:37:47.223 INFO:teuthology.orchestra.run.vm01.stdout:Fetched 178 MB in 7s (24.4 MB/s) 2026-03-10T15:37:47.512 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package liblttng-ust1:amd64. 2026-03-10T15:37:47.548 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 111717 files and directories currently installed.) 2026-03-10T15:37:47.550 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../000-liblttng-ust1_2.13.1-1ubuntu1_amd64.deb ... 2026-03-10T15:37:47.621 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking liblttng-ust1:amd64 (2.13.1-1ubuntu1) ... 2026-03-10T15:37:47.670 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libdouble-conversion3:amd64. 2026-03-10T15:37:47.676 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../001-libdouble-conversion3_3.1.7-4_amd64.deb ... 2026-03-10T15:37:47.680 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libdouble-conversion3:amd64 (3.1.7-4) ... 2026-03-10T15:37:47.698 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libpcre2-16-0:amd64. 2026-03-10T15:37:47.702 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../002-libpcre2-16-0_10.39-3ubuntu0.1_amd64.deb ... 
2026-03-10T15:37:47.703 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ... 2026-03-10T15:37:47.727 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libqt5core5a:amd64. 2026-03-10T15:37:47.732 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../003-libqt5core5a_5.15.3+dfsg-2ubuntu0.2_amd64.deb ... 2026-03-10T15:37:47.737 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-10T15:37:47.782 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libqt5dbus5:amd64. 2026-03-10T15:37:47.784 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../004-libqt5dbus5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ... 2026-03-10T15:37:47.785 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-10T15:37:47.804 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libqt5network5:amd64. 2026-03-10T15:37:47.810 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../005-libqt5network5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ... 2026-03-10T15:37:47.811 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-10T15:37:47.839 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libthrift-0.16.0:amd64. 2026-03-10T15:37:47.844 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../006-libthrift-0.16.0_0.16.0-2_amd64.deb ... 2026-03-10T15:37:47.845 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libthrift-0.16.0:amd64 (0.16.0-2) ... 2026-03-10T15:37:47.871 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../007-librbd1_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:47.874 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking librbd1 (19.2.3-678-ge911bdeb-1jammy) over (17.2.9-0ubuntu0.22.04.2) ... 
2026-03-10T15:37:47.952 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../008-librados2_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:47.954 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking librados2 (19.2.3-678-ge911bdeb-1jammy) over (17.2.9-0ubuntu0.22.04.2) ... 2026-03-10T15:37:48.030 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libnbd0. 2026-03-10T15:37:48.035 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../009-libnbd0_1.10.5-1_amd64.deb ... 2026-03-10T15:37:48.036 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libnbd0 (1.10.5-1) ... 2026-03-10T15:37:48.055 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libcephfs2. 2026-03-10T15:37:48.060 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../010-libcephfs2_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:48.060 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libcephfs2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:48.088 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-rados. 2026-03-10T15:37:48.094 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../011-python3-rados_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:48.095 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-rados (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:48.117 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-ceph-argparse. 2026-03-10T15:37:48.124 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../012-python3-ceph-argparse_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-10T15:37:48.124 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-ceph-argparse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:48.140 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-cephfs. 
2026-03-10T15:37:48.145 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../013-python3-cephfs_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:48.146 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-cephfs (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:48.163 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-ceph-common. 2026-03-10T15:37:48.168 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../014-python3-ceph-common_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-10T15:37:48.169 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:48.189 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-wcwidth. 2026-03-10T15:37:48.194 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../015-python3-wcwidth_0.2.5+dfsg1-1_all.deb ... 2026-03-10T15:37:48.195 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-wcwidth (0.2.5+dfsg1-1) ... 2026-03-10T15:37:48.213 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-prettytable. 2026-03-10T15:37:48.218 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../016-python3-prettytable_2.5.0-2_all.deb ... 2026-03-10T15:37:48.219 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-prettytable (2.5.0-2) ... 2026-03-10T15:37:48.235 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-rbd. 2026-03-10T15:37:48.240 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../017-python3-rbd_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:48.241 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-rbd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:48.261 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package librdkafka1:amd64. 
2026-03-10T15:37:48.266 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../018-librdkafka1_1.8.0-1build1_amd64.deb ... 2026-03-10T15:37:48.267 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking librdkafka1:amd64 (1.8.0-1build1) ... 2026-03-10T15:37:48.288 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libreadline-dev:amd64. 2026-03-10T15:37:48.294 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../019-libreadline-dev_8.1.2-1_amd64.deb ... 2026-03-10T15:37:48.294 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libreadline-dev:amd64 (8.1.2-1) ... 2026-03-10T15:37:48.313 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package liblua5.3-dev:amd64. 2026-03-10T15:37:48.318 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../020-liblua5.3-dev_5.3.6-1build1_amd64.deb ... 2026-03-10T15:37:48.319 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking liblua5.3-dev:amd64 (5.3.6-1build1) ... 2026-03-10T15:37:48.340 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package lua5.1. 2026-03-10T15:37:48.345 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../021-lua5.1_5.1.5-8.1build4_amd64.deb ... 2026-03-10T15:37:48.346 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking lua5.1 (5.1.5-8.1build4) ... 2026-03-10T15:37:48.366 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package lua-any. 2026-03-10T15:37:48.371 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../022-lua-any_27ubuntu1_all.deb ... 2026-03-10T15:37:48.372 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking lua-any (27ubuntu1) ... 2026-03-10T15:37:48.386 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package zip. 2026-03-10T15:37:48.391 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../023-zip_3.0-12build2_amd64.deb ... 
2026-03-10T15:37:48.392 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking zip (3.0-12build2) ... 2026-03-10T15:37:48.413 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package unzip. 2026-03-10T15:37:48.418 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../024-unzip_6.0-26ubuntu3.2_amd64.deb ... 2026-03-10T15:37:48.419 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking unzip (6.0-26ubuntu3.2) ... 2026-03-10T15:37:48.440 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package luarocks. 2026-03-10T15:37:48.445 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../025-luarocks_3.8.0+dfsg1-1_all.deb ... 2026-03-10T15:37:48.446 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking luarocks (3.8.0+dfsg1-1) ... 2026-03-10T15:37:48.497 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package librgw2. 2026-03-10T15:37:48.503 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../026-librgw2_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:48.504 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking librgw2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:48.628 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-rgw. 2026-03-10T15:37:48.634 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../027-python3-rgw_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:48.634 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-rgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:48.654 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package liboath0:amd64. 2026-03-10T15:37:48.657 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../028-liboath0_2.6.7-3ubuntu0.1_amd64.deb ... 2026-03-10T15:37:48.658 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking liboath0:amd64 (2.6.7-3ubuntu0.1) ... 
2026-03-10T15:37:48.675 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libradosstriper1. 2026-03-10T15:37:48.680 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../029-libradosstriper1_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:48.682 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libradosstriper1 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:48.710 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-common. 2026-03-10T15:37:48.712 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../030-ceph-common_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:48.713 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:49.158 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-base. 2026-03-10T15:37:49.161 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../031-ceph-base_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:49.166 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-base (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:49.528 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-jaraco.functools. 2026-03-10T15:37:49.533 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../032-python3-jaraco.functools_3.4.0-2_all.deb ... 2026-03-10T15:37:49.534 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-jaraco.functools (3.4.0-2) ... 2026-03-10T15:37:49.549 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-cheroot. 2026-03-10T15:37:49.555 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../033-python3-cheroot_8.5.2+ds1-1ubuntu3.1_all.deb ... 2026-03-10T15:37:49.556 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-cheroot (8.5.2+ds1-1ubuntu3.1) ... 
2026-03-10T15:37:49.577 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-jaraco.classes. 2026-03-10T15:37:49.582 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../034-python3-jaraco.classes_3.2.1-3_all.deb ... 2026-03-10T15:37:49.583 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-jaraco.classes (3.2.1-3) ... 2026-03-10T15:37:49.597 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-jaraco.text. 2026-03-10T15:37:49.601 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../035-python3-jaraco.text_3.6.0-2_all.deb ... 2026-03-10T15:37:49.602 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-jaraco.text (3.6.0-2) ... 2026-03-10T15:37:49.618 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-jaraco.collections. 2026-03-10T15:37:49.623 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../036-python3-jaraco.collections_3.4.0-2_all.deb ... 2026-03-10T15:37:49.624 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-jaraco.collections (3.4.0-2) ... 2026-03-10T15:37:49.641 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-tempora. 2026-03-10T15:37:49.646 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../037-python3-tempora_4.1.2-1_all.deb ... 2026-03-10T15:37:49.647 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-tempora (4.1.2-1) ... 2026-03-10T15:37:49.664 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-portend. 2026-03-10T15:37:49.670 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../038-python3-portend_3.0.0-1_all.deb ... 2026-03-10T15:37:49.670 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-portend (3.0.0-1) ... 2026-03-10T15:37:49.686 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-zc.lockfile. 
2026-03-10T15:37:49.692 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../039-python3-zc.lockfile_2.0-1_all.deb ... 2026-03-10T15:37:49.693 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-zc.lockfile (2.0-1) ... 2026-03-10T15:37:49.710 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-cherrypy3. 2026-03-10T15:37:49.715 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../040-python3-cherrypy3_18.6.1-4_all.deb ... 2026-03-10T15:37:49.716 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-cherrypy3 (18.6.1-4) ... 2026-03-10T15:37:49.745 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-natsort. 2026-03-10T15:37:49.750 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../041-python3-natsort_8.0.2-1_all.deb ... 2026-03-10T15:37:49.751 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-natsort (8.0.2-1) ... 2026-03-10T15:37:49.770 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-logutils. 2026-03-10T15:37:49.776 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../042-python3-logutils_0.3.3-8_all.deb ... 2026-03-10T15:37:49.777 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-logutils (0.3.3-8) ... 2026-03-10T15:37:49.796 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-mako. 2026-03-10T15:37:49.801 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../043-python3-mako_1.1.3+ds1-2ubuntu0.1_all.deb ... 2026-03-10T15:37:49.802 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-mako (1.1.3+ds1-2ubuntu0.1) ... 2026-03-10T15:37:49.825 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-simplegeneric. 2026-03-10T15:37:49.831 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../044-python3-simplegeneric_0.8.1-3_all.deb ... 
2026-03-10T15:37:49.832 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-simplegeneric (0.8.1-3) ... 2026-03-10T15:37:49.849 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-singledispatch. 2026-03-10T15:37:49.856 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../045-python3-singledispatch_3.4.0.3-3_all.deb ... 2026-03-10T15:37:49.857 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-singledispatch (3.4.0.3-3) ... 2026-03-10T15:37:49.872 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-webob. 2026-03-10T15:37:49.878 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../046-python3-webob_1%3a1.8.6-1.1ubuntu0.1_all.deb ... 2026-03-10T15:37:49.879 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-webob (1:1.8.6-1.1ubuntu0.1) ... 2026-03-10T15:37:49.899 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-waitress. 2026-03-10T15:37:49.904 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../047-python3-waitress_1.4.4-1.1ubuntu1.1_all.deb ... 2026-03-10T15:37:49.906 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-waitress (1.4.4-1.1ubuntu1.1) ... 2026-03-10T15:37:49.924 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-tempita. 2026-03-10T15:37:49.930 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../048-python3-tempita_0.5.2-6ubuntu1_all.deb ... 2026-03-10T15:37:49.931 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-tempita (0.5.2-6ubuntu1) ... 2026-03-10T15:37:49.948 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-paste. 2026-03-10T15:37:49.955 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../049-python3-paste_3.5.0+dfsg1-1_all.deb ... 2026-03-10T15:37:49.956 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-paste (3.5.0+dfsg1-1) ... 
2026-03-10T15:37:49.998 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python-pastedeploy-tpl. 2026-03-10T15:37:50.004 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../050-python-pastedeploy-tpl_2.1.1-1_all.deb ... 2026-03-10T15:37:50.004 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python-pastedeploy-tpl (2.1.1-1) ... 2026-03-10T15:37:50.020 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pastedeploy. 2026-03-10T15:37:50.026 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../051-python3-pastedeploy_2.1.1-1_all.deb ... 2026-03-10T15:37:50.027 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pastedeploy (2.1.1-1) ... 2026-03-10T15:37:50.044 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-webtest. 2026-03-10T15:37:50.050 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../052-python3-webtest_2.0.35-1_all.deb ... 2026-03-10T15:37:50.051 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-webtest (2.0.35-1) ... 2026-03-10T15:37:50.068 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pecan. 2026-03-10T15:37:50.074 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../053-python3-pecan_1.3.3-4ubuntu2_all.deb ... 2026-03-10T15:37:50.075 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pecan (1.3.3-4ubuntu2) ... 2026-03-10T15:37:50.107 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-werkzeug. 2026-03-10T15:37:50.112 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../054-python3-werkzeug_2.0.2+dfsg1-1ubuntu0.22.04.3_all.deb ... 2026-03-10T15:37:50.113 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-werkzeug (2.0.2+dfsg1-1ubuntu0.22.04.3) ... 
2026-03-10T15:37:50.137 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr-modules-core. 2026-03-10T15:37:50.143 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../055-ceph-mgr-modules-core_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-10T15:37:50.144 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr-modules-core (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:50.180 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libsqlite3-mod-ceph. 2026-03-10T15:37:50.185 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../056-libsqlite3-mod-ceph_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:50.186 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libsqlite3-mod-ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:50.204 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr. 2026-03-10T15:37:50.208 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../057-ceph-mgr_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:50.209 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:50.243 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mon. 2026-03-10T15:37:50.248 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../058-ceph-mon_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:50.249 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mon (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:50.345 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libfuse2:amd64. 2026-03-10T15:37:50.350 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../059-libfuse2_2.9.9-5ubuntu3_amd64.deb ... 2026-03-10T15:37:50.351 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libfuse2:amd64 (2.9.9-5ubuntu3) ... 
2026-03-10T15:37:50.370 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-osd. 2026-03-10T15:37:50.375 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../060-ceph-osd_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:50.376 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-osd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:50.699 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph. 2026-03-10T15:37:50.705 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../061-ceph_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:50.706 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:50.723 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-fuse. 2026-03-10T15:37:50.728 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../062-ceph-fuse_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:50.729 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:50.763 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mds. 2026-03-10T15:37:50.768 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../063-ceph-mds_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:50.769 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mds (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:50.819 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package cephadm. 2026-03-10T15:37:50.825 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../064-cephadm_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:50.826 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking cephadm (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-10T15:37:50.845 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-asyncssh. 2026-03-10T15:37:50.852 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../065-python3-asyncssh_2.5.0-1ubuntu0.1_all.deb ... 2026-03-10T15:37:50.853 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-asyncssh (2.5.0-1ubuntu0.1) ... 2026-03-10T15:37:50.882 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr-cephadm. 2026-03-10T15:37:50.888 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../066-ceph-mgr-cephadm_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-10T15:37:50.889 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr-cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:50.914 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-repoze.lru. 2026-03-10T15:37:50.921 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../067-python3-repoze.lru_0.7-2_all.deb ... 2026-03-10T15:37:50.922 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-repoze.lru (0.7-2) ... 2026-03-10T15:37:50.939 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-routes. 2026-03-10T15:37:50.945 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../068-python3-routes_2.5.1-1ubuntu1_all.deb ... 2026-03-10T15:37:50.946 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-routes (2.5.1-1ubuntu1) ... 2026-03-10T15:37:50.970 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr-dashboard. 2026-03-10T15:37:50.975 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../069-ceph-mgr-dashboard_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-10T15:37:50.976 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr-dashboard (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-10T15:37:51.357 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-sklearn-lib:amd64. 2026-03-10T15:37:51.360 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../070-python3-sklearn-lib_0.23.2-5ubuntu6_amd64.deb ... 2026-03-10T15:37:51.361 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ... 2026-03-10T15:37:51.685 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-joblib. 2026-03-10T15:37:51.688 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../071-python3-joblib_0.17.0-4ubuntu1_all.deb ... 2026-03-10T15:37:51.689 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-joblib (0.17.0-4ubuntu1) ... 2026-03-10T15:37:51.722 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-threadpoolctl. 2026-03-10T15:37:51.728 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../072-python3-threadpoolctl_3.1.0-1_all.deb ... 2026-03-10T15:37:51.728 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-threadpoolctl (3.1.0-1) ... 2026-03-10T15:37:51.745 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-sklearn. 2026-03-10T15:37:51.751 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../073-python3-sklearn_0.23.2-5ubuntu6_all.deb ... 2026-03-10T15:37:51.752 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-sklearn (0.23.2-5ubuntu6) ... 2026-03-10T15:37:51.897 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr-diskprediction-local. 2026-03-10T15:37:51.904 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../074-ceph-mgr-diskprediction-local_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-10T15:37:51.905 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr-diskprediction-local (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-10T15:37:52.199 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-cachetools. 2026-03-10T15:37:52.205 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../075-python3-cachetools_5.0.0-1_all.deb ... 2026-03-10T15:37:52.205 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-cachetools (5.0.0-1) ... 2026-03-10T15:37:52.224 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-rsa. 2026-03-10T15:37:52.229 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../076-python3-rsa_4.8-1_all.deb ... 2026-03-10T15:37:52.229 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-rsa (4.8-1) ... 2026-03-10T15:37:52.251 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-google-auth. 2026-03-10T15:37:52.255 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../077-python3-google-auth_1.5.1-3_all.deb ... 2026-03-10T15:37:52.255 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-google-auth (1.5.1-3) ... 2026-03-10T15:37:52.275 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-requests-oauthlib. 2026-03-10T15:37:52.278 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../078-python3-requests-oauthlib_1.3.0+ds-0.1_all.deb ... 2026-03-10T15:37:52.279 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-requests-oauthlib (1.3.0+ds-0.1) ... 2026-03-10T15:37:52.296 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-websocket. 2026-03-10T15:37:52.300 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../079-python3-websocket_1.2.3-1_all.deb ... 2026-03-10T15:37:52.300 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-websocket (1.2.3-1) ... 2026-03-10T15:37:52.320 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-kubernetes. 
2026-03-10T15:37:52.325 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../080-python3-kubernetes_12.0.1-1ubuntu1_all.deb ... 2026-03-10T15:37:52.342 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-kubernetes (12.0.1-1ubuntu1) ... 2026-03-10T15:37:52.503 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr-k8sevents. 2026-03-10T15:37:52.509 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../081-ceph-mgr-k8sevents_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-10T15:37:52.510 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr-k8sevents (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:52.527 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libonig5:amd64. 2026-03-10T15:37:52.533 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../082-libonig5_6.9.7.1-2build1_amd64.deb ... 2026-03-10T15:37:52.534 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libonig5:amd64 (6.9.7.1-2build1) ... 2026-03-10T15:37:52.556 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libjq1:amd64. 2026-03-10T15:37:52.563 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../083-libjq1_1.6-2.1ubuntu3.1_amd64.deb ... 2026-03-10T15:37:52.565 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libjq1:amd64 (1.6-2.1ubuntu3.1) ... 2026-03-10T15:37:52.845 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package jq. 2026-03-10T15:37:52.851 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../084-jq_1.6-2.1ubuntu3.1_amd64.deb ... 2026-03-10T15:37:52.852 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking jq (1.6-2.1ubuntu3.1) ... 2026-03-10T15:37:52.872 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package socat. 2026-03-10T15:37:52.877 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../085-socat_1.7.4.1-3ubuntu4_amd64.deb ... 
2026-03-10T15:37:52.878 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking socat (1.7.4.1-3ubuntu4) ... 2026-03-10T15:37:52.905 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package xmlstarlet. 2026-03-10T15:37:52.911 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../086-xmlstarlet_1.6.1-2.1_amd64.deb ... 2026-03-10T15:37:52.911 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking xmlstarlet (1.6.1-2.1) ... 2026-03-10T15:37:52.963 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-test. 2026-03-10T15:37:52.968 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../087-ceph-test_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:52.969 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-test (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:53.693 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-volume. 2026-03-10T15:37:53.699 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../088-ceph-volume_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-10T15:37:53.700 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-volume (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:53.729 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libcephfs-dev. 2026-03-10T15:37:53.734 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../089-libcephfs-dev_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:53.735 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libcephfs-dev (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:53.750 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package lua-socket:amd64. 2026-03-10T15:37:53.754 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../090-lua-socket_3.0~rc1+git+ac3201d-6_amd64.deb ... 
2026-03-10T15:37:53.755 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking lua-socket:amd64 (3.0~rc1+git+ac3201d-6) ... 2026-03-10T15:37:53.782 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package lua-sec:amd64. 2026-03-10T15:37:53.786 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../091-lua-sec_1.0.2-1_amd64.deb ... 2026-03-10T15:37:53.787 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking lua-sec:amd64 (1.0.2-1) ... 2026-03-10T15:37:53.809 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package nvme-cli. 2026-03-10T15:37:53.813 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../092-nvme-cli_1.16-3ubuntu0.3_amd64.deb ... 2026-03-10T15:37:53.814 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking nvme-cli (1.16-3ubuntu0.3) ... 2026-03-10T15:37:53.856 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package pkg-config. 2026-03-10T15:37:53.861 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../093-pkg-config_0.29.2-1ubuntu3_amd64.deb ... 2026-03-10T15:37:53.862 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking pkg-config (0.29.2-1ubuntu3) ... 2026-03-10T15:37:53.879 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python-asyncssh-doc. 2026-03-10T15:37:53.884 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../094-python-asyncssh-doc_2.5.0-1ubuntu0.1_all.deb ... 2026-03-10T15:37:53.885 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python-asyncssh-doc (2.5.0-1ubuntu0.1) ... 2026-03-10T15:37:53.932 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-iniconfig. 2026-03-10T15:37:53.937 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../095-python3-iniconfig_1.1.1-2_all.deb ... 2026-03-10T15:37:53.938 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-iniconfig (1.1.1-2) ... 
2026-03-10T15:37:53.956 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pastescript. 2026-03-10T15:37:53.960 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../096-python3-pastescript_2.0.2-4_all.deb ... 2026-03-10T15:37:53.961 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pastescript (2.0.2-4) ... 2026-03-10T15:37:53.984 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pluggy. 2026-03-10T15:37:53.989 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../097-python3-pluggy_0.13.0-7.1_all.deb ... 2026-03-10T15:37:53.990 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pluggy (0.13.0-7.1) ... 2026-03-10T15:37:54.015 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-psutil. 2026-03-10T15:37:54.020 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../098-python3-psutil_5.9.0-1build1_amd64.deb ... 2026-03-10T15:37:54.021 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-psutil (5.9.0-1build1) ... 2026-03-10T15:37:54.046 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-py. 2026-03-10T15:37:54.050 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../099-python3-py_1.10.0-1_all.deb ... 2026-03-10T15:37:54.051 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-py (1.10.0-1) ... 2026-03-10T15:37:54.083 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pygments. 2026-03-10T15:37:54.084 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../100-python3-pygments_2.11.2+dfsg-2ubuntu0.1_all.deb ... 2026-03-10T15:37:54.085 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pygments (2.11.2+dfsg-2ubuntu0.1) ... 2026-03-10T15:37:54.149 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pyinotify. 
2026-03-10T15:37:54.155 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../101-python3-pyinotify_0.9.6-1.3_all.deb ... 2026-03-10T15:37:54.156 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pyinotify (0.9.6-1.3) ... 2026-03-10T15:37:54.174 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-toml. 2026-03-10T15:37:54.180 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../102-python3-toml_0.10.2-1_all.deb ... 2026-03-10T15:37:54.181 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-toml (0.10.2-1) ... 2026-03-10T15:37:54.199 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pytest. 2026-03-10T15:37:54.205 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../103-python3-pytest_6.2.5-1ubuntu2_all.deb ... 2026-03-10T15:37:54.206 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pytest (6.2.5-1ubuntu2) ... 2026-03-10T15:37:54.235 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-simplejson. 2026-03-10T15:37:54.241 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../104-python3-simplejson_3.17.6-1build1_amd64.deb ... 2026-03-10T15:37:54.241 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-simplejson (3.17.6-1build1) ... 2026-03-10T15:37:54.264 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package qttranslations5-l10n. 2026-03-10T15:37:54.269 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../105-qttranslations5-l10n_5.15.3-1_all.deb ... 2026-03-10T15:37:54.270 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking qttranslations5-l10n (5.15.3-1) ... 2026-03-10T15:37:54.381 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package radosgw. 2026-03-10T15:37:54.387 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../106-radosgw_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 
2026-03-10T15:37:54.388 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking radosgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:54.622 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package rbd-fuse. 2026-03-10T15:37:54.628 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../107-rbd-fuse_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-10T15:37:54.629 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking rbd-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:37:54.650 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package smartmontools. 2026-03-10T15:37:54.656 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../108-smartmontools_7.2-1ubuntu0.1_amd64.deb ... 2026-03-10T15:37:54.664 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking smartmontools (7.2-1ubuntu0.1) ... 2026-03-10T15:37:54.708 INFO:teuthology.orchestra.run.vm01.stdout:Setting up smartmontools (7.2-1ubuntu0.1) ... 2026-03-10T15:37:54.955 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/smartd.service → /lib/systemd/system/smartmontools.service. 2026-03-10T15:37:54.955 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/smartmontools.service → /lib/systemd/system/smartmontools.service. 2026-03-10T15:37:55.314 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-iniconfig (1.1.1-2) ... 2026-03-10T15:37:55.380 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libdouble-conversion3:amd64 (3.1.7-4) ... 2026-03-10T15:37:55.382 INFO:teuthology.orchestra.run.vm01.stdout:Setting up nvme-cli (1.16-3ubuntu0.3) ... 2026-03-10T15:37:55.448 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmefc-boot-connections.service → /lib/systemd/system/nvmefc-boot-connections.service. 
2026-03-10T15:37:55.659 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmf-autoconnect.service → /lib/systemd/system/nvmf-autoconnect.service.
2026-03-10T15:37:56.045 INFO:teuthology.orchestra.run.vm01.stdout:nvmf-connect.target is a disabled or a static unit, not starting it.
2026-03-10T15:37:56.052 INFO:teuthology.orchestra.run.vm01.stdout:Could not execute systemctl: at /usr/bin/deb-systemd-invoke line 142.
2026-03-10T15:37:56.054 INFO:teuthology.orchestra.run.vm01.stdout:Setting up cephadm (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:37:56.095 INFO:teuthology.orchestra.run.vm01.stdout:Adding system user cephadm....done
2026-03-10T15:37:56.103 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-waitress (1.4.4-1.1ubuntu1.1) ...
2026-03-10T15:37:56.177 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-jaraco.classes (3.2.1-3) ...
2026-03-10T15:37:56.245 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-10T15:37:56.248 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-jaraco.functools (3.4.0-2) ...
2026-03-10T15:37:56.411 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-repoze.lru (0.7-2) ...
2026-03-10T15:37:56.536 INFO:teuthology.orchestra.run.vm01.stdout:Setting up liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-10T15:37:56.539 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-py (1.10.0-1) ...
2026-03-10T15:37:56.631 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-joblib (0.17.0-4ubuntu1) ...
2026-03-10T15:37:56.752 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-cachetools (5.0.0-1) ...
2026-03-10T15:37:56.821 INFO:teuthology.orchestra.run.vm01.stdout:Setting up unzip (6.0-26ubuntu3.2) ...
2026-03-10T15:37:56.830 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pyinotify (0.9.6-1.3) ...
2026-03-10T15:37:56.897 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-threadpoolctl (3.1.0-1) ...
2026-03-10T15:37:56.963 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-ceph-argparse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:37:57.033 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-10T15:37:57.036 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libnbd0 (1.10.5-1) ...
2026-03-10T15:37:57.039 INFO:teuthology.orchestra.run.vm01.stdout:Setting up lua-socket:amd64 (3.0~rc1+git+ac3201d-6) ...
2026-03-10T15:37:57.042 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libreadline-dev:amd64 (8.1.2-1) ...
2026-03-10T15:37:57.044 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-10T15:37:57.047 INFO:teuthology.orchestra.run.vm01.stdout:Setting up lua5.1 (5.1.5-8.1build4) ...
2026-03-10T15:37:57.050 INFO:teuthology.orchestra.run.vm01.stdout:update-alternatives: using /usr/bin/lua5.1 to provide /usr/bin/lua (lua-interpreter) in auto mode
2026-03-10T15:37:57.052 INFO:teuthology.orchestra.run.vm01.stdout:update-alternatives: using /usr/bin/luac5.1 to provide /usr/bin/luac (lua-compiler) in auto mode
2026-03-10T15:37:57.055 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-10T15:37:57.057 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-psutil (5.9.0-1build1) ...
2026-03-10T15:37:57.178 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-natsort (8.0.2-1) ...
2026-03-10T15:37:57.251 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-routes (2.5.1-1ubuntu1) ...
2026-03-10T15:37:57.322 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-simplejson (3.17.6-1build1) ...
2026-03-10T15:37:57.400 INFO:teuthology.orchestra.run.vm01.stdout:Setting up zip (3.0-12build2) ...
2026-03-10T15:37:57.402 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pygments (2.11.2+dfsg-2ubuntu0.1) ...
2026-03-10T15:37:57.671 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-tempita (0.5.2-6ubuntu1) ...
2026-03-10T15:37:57.739 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python-pastedeploy-tpl (2.1.1-1) ...
2026-03-10T15:37:57.741 INFO:teuthology.orchestra.run.vm01.stdout:Setting up qttranslations5-l10n (5.15.3-1) ...
2026-03-10T15:37:57.743 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-10T15:37:57.834 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-10T15:37:57.979 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-paste (3.5.0+dfsg1-1) ...
2026-03-10T15:37:58.218 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-cheroot (8.5.2+ds1-1ubuntu3.1) ...
2026-03-10T15:37:58.359 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-werkzeug (2.0.2+dfsg1-1ubuntu0.22.04.3) ...
2026-03-10T15:37:58.475 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-jaraco.text (3.6.0-2) ...
2026-03-10T15:37:58.543 INFO:teuthology.orchestra.run.vm01.stdout:Setting up socat (1.7.4.1-3ubuntu4) ...
2026-03-10T15:37:58.545 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:37:58.636 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-10T15:37:59.190 INFO:teuthology.orchestra.run.vm01.stdout:Setting up pkg-config (0.29.2-1ubuntu3) ...
2026-03-10T15:37:59.210 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-10T15:37:59.215 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-toml (0.10.2-1) ...
2026-03-10T15:37:59.283 INFO:teuthology.orchestra.run.vm01.stdout:Setting up librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-10T15:37:59.285 INFO:teuthology.orchestra.run.vm01.stdout:Setting up xmlstarlet (1.6.1-2.1) ...
2026-03-10T15:37:59.287 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pluggy (0.13.0-7.1) ...
2026-03-10T15:37:59.355 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-zc.lockfile (2.0-1) ...
2026-03-10T15:37:59.421 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-10T15:37:59.423 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-rsa (4.8-1) ...
2026-03-10T15:37:59.495 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-singledispatch (3.4.0.3-3) ...
2026-03-10T15:37:59.561 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-logutils (0.3.3-8) ...
2026-03-10T15:37:59.629 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-tempora (4.1.2-1) ...
2026-03-10T15:37:59.695 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-simplegeneric (0.8.1-3) ...
2026-03-10T15:37:59.796 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-prettytable (2.5.0-2) ...
2026-03-10T15:37:59.885 INFO:teuthology.orchestra.run.vm01.stdout:Setting up liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-10T15:37:59.887 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-websocket (1.2.3-1) ...
2026-03-10T15:37:59.962 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-10T15:37:59.964 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-10T15:38:00.040 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-mako (1.1.3+ds1-2ubuntu0.1) ...
2026-03-10T15:38:00.127 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-10T15:38:00.219 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-jaraco.collections (3.4.0-2) ...
2026-03-10T15:38:00.289 INFO:teuthology.orchestra.run.vm01.stdout:Setting up liblua5.3-dev:amd64 (5.3.6-1build1) ...
2026-03-10T15:38:00.292 INFO:teuthology.orchestra.run.vm01.stdout:Setting up lua-sec:amd64 (1.0.2-1) ...
2026-03-10T15:38:00.294 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-10T15:38:00.296 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pytest (6.2.5-1ubuntu2) ...
2026-03-10T15:38:00.441 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pastedeploy (2.1.1-1) ...
2026-03-10T15:38:00.511 INFO:teuthology.orchestra.run.vm01.stdout:Setting up lua-any (27ubuntu1) ...
2026-03-10T15:38:00.513 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-portend (3.0.0-1) ...
2026-03-10T15:38:00.582 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-10T15:38:00.584 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-google-auth (1.5.1-3) ...
2026-03-10T15:38:00.663 INFO:teuthology.orchestra.run.vm01.stdout:Setting up jq (1.6-2.1ubuntu3.1) ...
2026-03-10T15:38:00.665 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-webtest (2.0.35-1) ...
2026-03-10T15:38:00.739 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-cherrypy3 (18.6.1-4) ...
2026-03-10T15:38:00.957 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pastescript (2.0.2-4) ...
2026-03-10T15:38:01.118 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pecan (1.3.3-4ubuntu2) ...
2026-03-10T15:38:01.222 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-10T15:38:01.224 INFO:teuthology.orchestra.run.vm01.stdout:Setting up librados2 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:01.226 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libsqlite3-mod-ceph (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:01.228 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-10T15:38:01.791 INFO:teuthology.orchestra.run.vm01.stdout:Setting up luarocks (3.8.0+dfsg1-1) ...
2026-03-10T15:38:01.796 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libcephfs2 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:01.798 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libradosstriper1 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:01.800 INFO:teuthology.orchestra.run.vm01.stdout:Setting up librbd1 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:01.802 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr-modules-core (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:01.804 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:01.868 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/remote-fs.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target.
2026-03-10T15:38:01.868 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target.
2026-03-10T15:38:02.258 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libcephfs-dev (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:02.260 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-rados (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:02.262 INFO:teuthology.orchestra.run.vm01.stdout:Setting up librgw2 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:02.264 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-rbd (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:02.266 INFO:teuthology.orchestra.run.vm01.stdout:Setting up rbd-fuse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:02.268 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-rgw (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:02.270 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-cephfs (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:02.272 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:02.303 INFO:teuthology.orchestra.run.vm01.stdout:Adding group ceph....done
2026-03-10T15:38:02.335 INFO:teuthology.orchestra.run.vm01.stdout:Adding system user ceph....done
2026-03-10T15:38:02.342 INFO:teuthology.orchestra.run.vm01.stdout:Setting system user ceph properties....done
2026-03-10T15:38:02.345 INFO:teuthology.orchestra.run.vm01.stdout:chown: cannot access '/var/log/ceph/*.log*': No such file or directory
2026-03-10T15:38:02.407 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /lib/systemd/system/ceph.target.
2026-03-10T15:38:02.627 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/rbdmap.service → /lib/systemd/system/rbdmap.service.
2026-03-10T15:38:02.995 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-test (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:02.997 INFO:teuthology.orchestra.run.vm01.stdout:Setting up radosgw (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:03.248 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target.
2026-03-10T15:38:03.249 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target.
2026-03-10T15:38:03.672 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-base (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:03.794 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /lib/systemd/system/ceph-crash.service.
2026-03-10T15:38:04.167 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mds (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:04.233 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target.
2026-03-10T15:38:04.233 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target.
2026-03-10T15:38:04.596 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:04.663 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target.
2026-03-10T15:38:04.663 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target.
2026-03-10T15:38:05.037 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-osd (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:05.113 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target.
2026-03-10T15:38:05.114 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target.
2026-03-10T15:38:05.481 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr-k8sevents (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:05.484 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr-diskprediction-local (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:05.498 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mon (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:05.561 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target.
2026-03-10T15:38:05.561 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target.
2026-03-10T15:38:05.962 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr-cephadm (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:06.178 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:06.235 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr-dashboard (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:06.250 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-volume (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:38:06.370 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for mailcap (3.70+nmu1ubuntu1) ...
2026-03-10T15:38:06.378 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-10T15:38:06.397 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-10T15:38:06.487 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for install-info (6.8-4build1) ...
2026-03-10T15:38:06.837 INFO:teuthology.orchestra.run.vm01.stdout:
2026-03-10T15:38:06.837 INFO:teuthology.orchestra.run.vm01.stdout:Running kernel seems to be up-to-date.
2026-03-10T15:38:06.837 INFO:teuthology.orchestra.run.vm01.stdout:
2026-03-10T15:38:06.837 INFO:teuthology.orchestra.run.vm01.stdout:Services to be restarted:
2026-03-10T15:38:06.843 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart packagekit.service
2026-03-10T15:38:06.846 INFO:teuthology.orchestra.run.vm01.stdout:
2026-03-10T15:38:06.846 INFO:teuthology.orchestra.run.vm01.stdout:Service restarts being deferred:
2026-03-10T15:38:06.846 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart networkd-dispatcher.service
2026-03-10T15:38:06.846 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart unattended-upgrades.service
2026-03-10T15:38:06.846 INFO:teuthology.orchestra.run.vm01.stdout:
2026-03-10T15:38:06.846 INFO:teuthology.orchestra.run.vm01.stdout:No containers need to be restarted.
2026-03-10T15:38:06.846 INFO:teuthology.orchestra.run.vm01.stdout:
2026-03-10T15:38:06.846 INFO:teuthology.orchestra.run.vm01.stdout:No user sessions are running outdated binaries.
2026-03-10T15:38:06.846 INFO:teuthology.orchestra.run.vm01.stdout:
2026-03-10T15:38:06.846 INFO:teuthology.orchestra.run.vm01.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host.
2026-03-10T15:38:07.896 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-10T15:38:07.898 DEBUG:teuthology.orchestra.run.vm01:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install python3-xmltodict python3-jmespath
2026-03-10T15:38:07.975 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:38:08.186 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-10T15:38:08.186 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-10T15:38:08.348 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-10T15:38:08.348 INFO:teuthology.orchestra.run.vm01.stdout: kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-10T15:38:08.348 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-10T15:38:08.348 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-10T15:38:08.363 INFO:teuthology.orchestra.run.vm01.stdout:The following NEW packages will be installed:
2026-03-10T15:38:08.363 INFO:teuthology.orchestra.run.vm01.stdout: python3-jmespath python3-xmltodict
2026-03-10T15:38:08.571 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 2 newly installed, 0 to remove and 12 not upgraded.
2026-03-10T15:38:08.571 INFO:teuthology.orchestra.run.vm01.stdout:Need to get 34.3 kB of archives.
2026-03-10T15:38:08.571 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 146 kB of additional disk space will be used.
2026-03-10T15:38:08.571 INFO:teuthology.orchestra.run.vm01.stdout:Get:1 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jmespath all 0.10.0-1 [21.7 kB]
2026-03-10T15:38:08.649 INFO:teuthology.orchestra.run.vm01.stdout:Get:2 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-xmltodict all 0.12.0-2 [12.6 kB]
2026-03-10T15:38:08.853 INFO:teuthology.orchestra.run.vm01.stdout:Fetched 34.3 kB in 0s (121 kB/s)
2026-03-10T15:38:08.995 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-jmespath.
2026-03-10T15:38:09.021 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118577 files and directories currently installed.)
2026-03-10T15:38:09.023 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../python3-jmespath_0.10.0-1_all.deb ...
2026-03-10T15:38:09.101 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-jmespath (0.10.0-1) ...
2026-03-10T15:38:09.118 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-xmltodict.
2026-03-10T15:38:09.124 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../python3-xmltodict_0.12.0-2_all.deb ...
2026-03-10T15:38:09.125 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-xmltodict (0.12.0-2) ...
2026-03-10T15:38:09.160 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-xmltodict (0.12.0-2) ...
2026-03-10T15:38:09.231 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-jmespath (0.10.0-1) ...
2026-03-10T15:38:09.574 INFO:teuthology.orchestra.run.vm01.stdout:
2026-03-10T15:38:09.574 INFO:teuthology.orchestra.run.vm01.stdout:Running kernel seems to be up-to-date.
2026-03-10T15:38:09.574 INFO:teuthology.orchestra.run.vm01.stdout:
2026-03-10T15:38:09.574 INFO:teuthology.orchestra.run.vm01.stdout:Services to be restarted:
2026-03-10T15:38:09.579 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart packagekit.service
2026-03-10T15:38:09.582 INFO:teuthology.orchestra.run.vm01.stdout:
2026-03-10T15:38:09.582 INFO:teuthology.orchestra.run.vm01.stdout:Service restarts being deferred:
2026-03-10T15:38:09.582 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart networkd-dispatcher.service
2026-03-10T15:38:09.582 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart unattended-upgrades.service
2026-03-10T15:38:09.582 INFO:teuthology.orchestra.run.vm01.stdout:
2026-03-10T15:38:09.582 INFO:teuthology.orchestra.run.vm01.stdout:No containers need to be restarted.
2026-03-10T15:38:09.582 INFO:teuthology.orchestra.run.vm01.stdout:
2026-03-10T15:38:09.582 INFO:teuthology.orchestra.run.vm01.stdout:No user sessions are running outdated binaries.
2026-03-10T15:38:09.582 INFO:teuthology.orchestra.run.vm01.stdout:
2026-03-10T15:38:09.582 INFO:teuthology.orchestra.run.vm01.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host.
2026-03-10T15:38:10.653 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-10T15:38:10.657 DEBUG:teuthology.parallel:result is None
2026-03-10T15:38:10.657 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T15:38:11.259 DEBUG:teuthology.orchestra.run.vm01:> dpkg-query -W -f '${Version}' ceph
2026-03-10T15:38:11.269 INFO:teuthology.orchestra.run.vm01.stdout:19.2.3-678-ge911bdeb-1jammy
2026-03-10T15:38:11.269 INFO:teuthology.packaging:The installed version of ceph is 19.2.3-678-ge911bdeb-1jammy
2026-03-10T15:38:11.269 INFO:teuthology.task.install:The correct ceph version 19.2.3-678-ge911bdeb-1jammy is installed.
2026-03-10T15:38:11.271 INFO:teuthology.task.install.util:Shipping valgrind.supp...
2026-03-10T15:38:11.271 DEBUG:teuthology.orchestra.run.vm01:> set -ex
2026-03-10T15:38:11.271 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-10T15:38:11.319 INFO:teuthology.task.install.util:Shipping 'daemon-helper'...
2026-03-10T15:38:11.319 DEBUG:teuthology.orchestra.run.vm01:> set -ex
2026-03-10T15:38:11.319 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/usr/bin/daemon-helper
2026-03-10T15:38:11.372 DEBUG:teuthology.orchestra.run.vm01:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-10T15:38:11.421 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'...
2026-03-10T15:38:11.422 DEBUG:teuthology.orchestra.run.vm01:> set -ex
2026-03-10T15:38:11.422 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-10T15:38:11.470 DEBUG:teuthology.orchestra.run.vm01:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-10T15:38:11.518 INFO:teuthology.task.install.util:Shipping 'stdin-killer'...
2026-03-10T15:38:11.518 DEBUG:teuthology.orchestra.run.vm01:> set -ex
2026-03-10T15:38:11.518 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/usr/bin/stdin-killer
2026-03-10T15:38:11.566 DEBUG:teuthology.orchestra.run.vm01:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-10T15:38:11.614 INFO:teuthology.run_tasks:Running task exec...
2026-03-10T15:38:11.616 INFO:teuthology.task.exec:Executing custom commands...
2026-03-10T15:38:11.616 INFO:teuthology.task.exec:Running commands on role mon.a host ubuntu@vm01.local
2026-03-10T15:38:11.616 DEBUG:teuthology.orchestra.run.vm01:> sudo TESTDIR=/home/ubuntu/cephtest bash -c 'yum install -y python3 || apt install -y python3'
2026-03-10T15:38:11.662 INFO:teuthology.orchestra.run.vm01.stderr:bash: line 1: yum: command not found
2026-03-10T15:38:11.665 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-10T15:38:11.666 INFO:teuthology.orchestra.run.vm01.stderr:WARNING: apt does not have a stable CLI interface. Use with caution in scripts.
2026-03-10T15:38:11.666 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-10T15:38:11.690 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:38:11.882 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-10T15:38:11.882 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-10T15:38:12.000 INFO:teuthology.orchestra.run.vm01.stdout:python3 is already the newest version (3.10.6-1~22.04.1).
2026-03-10T15:38:12.000 INFO:teuthology.orchestra.run.vm01.stdout:python3 set to manually installed.
2026-03-10T15:38:12.000 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-10T15:38:12.000 INFO:teuthology.orchestra.run.vm01.stdout: kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-10T15:38:12.000 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-10T15:38:12.000 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-10T15:38:12.084 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 12 not upgraded.
2026-03-10T15:38:12.145 INFO:teuthology.run_tasks:Running task workunit...
2026-03-10T15:38:12.149 INFO:tasks.workunit:Pulling workunits from ref 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b
2026-03-10T15:38:12.149 INFO:tasks.workunit:Making a separate scratch dir for every client...
2026-03-10T15:38:12.149 DEBUG:teuthology.orchestra.run.vm01:> stat -- /home/ubuntu/cephtest/mnt.0
2026-03-10T15:38:12.189 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T15:38:12.190 INFO:teuthology.orchestra.run.vm01.stderr:stat: cannot statx '/home/ubuntu/cephtest/mnt.0': No such file or directory
2026-03-10T15:38:12.190 DEBUG:teuthology.orchestra.run.vm01:> mkdir -- /home/ubuntu/cephtest/mnt.0
2026-03-10T15:38:12.233 INFO:tasks.workunit:Created dir /home/ubuntu/cephtest/mnt.0
2026-03-10T15:38:12.233 DEBUG:teuthology.orchestra.run.vm01:> cd -- /home/ubuntu/cephtest/mnt.0 && mkdir -- client.0
2026-03-10T15:38:12.277 INFO:tasks.workunit:timeout=3h
2026-03-10T15:38:12.277 INFO:tasks.workunit:cleanup=True
2026-03-10T15:38:12.278 DEBUG:teuthology.orchestra.run.vm01:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b
2026-03-10T15:38:12.322 INFO:tasks.workunit.client.0.vm01.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'...
2026-03-10T15:38:57.530 INFO:tasks.workunit.client.0.vm01.stderr:Note: switching to '75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b'.
2026-03-10T15:38:57.530 INFO:tasks.workunit.client.0.vm01.stderr:
2026-03-10T15:38:57.530 INFO:tasks.workunit.client.0.vm01.stderr:You are in 'detached HEAD' state. You can look around, make experimental
2026-03-10T15:38:57.530 INFO:tasks.workunit.client.0.vm01.stderr:changes and commit them, and you can discard any commits you make in this
2026-03-10T15:38:57.530 INFO:tasks.workunit.client.0.vm01.stderr:state without impacting any branches by switching back to a branch.
2026-03-10T15:38:57.530 INFO:tasks.workunit.client.0.vm01.stderr:
2026-03-10T15:38:57.530 INFO:tasks.workunit.client.0.vm01.stderr:If you want to create a new branch to retain commits you create, you may
2026-03-10T15:38:57.530 INFO:tasks.workunit.client.0.vm01.stderr:do so (now or later) by using -c with the switch command. Example:
2026-03-10T15:38:57.530 INFO:tasks.workunit.client.0.vm01.stderr:
2026-03-10T15:38:57.530 INFO:tasks.workunit.client.0.vm01.stderr: git switch -c
2026-03-10T15:38:57.530 INFO:tasks.workunit.client.0.vm01.stderr:
2026-03-10T15:38:57.530 INFO:tasks.workunit.client.0.vm01.stderr:Or undo this operation with:
2026-03-10T15:38:57.531 INFO:tasks.workunit.client.0.vm01.stderr:
2026-03-10T15:38:57.531 INFO:tasks.workunit.client.0.vm01.stderr: git switch -
2026-03-10T15:38:57.531 INFO:tasks.workunit.client.0.vm01.stderr:
2026-03-10T15:38:57.531 INFO:tasks.workunit.client.0.vm01.stderr:Turn off this advice by setting config variable advice.detachedHead to false
2026-03-10T15:38:57.531 INFO:tasks.workunit.client.0.vm01.stderr:
2026-03-10T15:38:57.531 INFO:tasks.workunit.client.0.vm01.stderr:HEAD is now at 75a68fd8ca3 qa/suites/orch/cephadm/osds: drop nvme_loop task
2026-03-10T15:38:57.537 DEBUG:teuthology.orchestra.run.vm01:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0
2026-03-10T15:38:57.582 INFO:tasks.workunit.client.0.vm01.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done
2026-03-10T15:38:57.584 INFO:tasks.workunit.client.0.vm01.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io'
2026-03-10T15:38:57.584 INFO:tasks.workunit.client.0.vm01.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test
2026-03-10T15:38:57.623 INFO:tasks.workunit.client.0.vm01.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io
2026-03-10T15:38:57.654 INFO:tasks.workunit.client.0.vm01.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read
2026-03-10T15:38:57.680 INFO:tasks.workunit.client.0.vm01.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io'
2026-03-10T15:38:57.681 INFO:tasks.workunit.client.0.vm01.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs'
2026-03-10T15:38:57.681 INFO:tasks.workunit.client.0.vm01.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc
2026-03-10T15:38:57.707 INFO:tasks.workunit.client.0.vm01.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs'
2026-03-10T15:38:57.710 DEBUG:teuthology.orchestra.run.vm01:> set -ex
2026-03-10T15:38:57.710 DEBUG:teuthology.orchestra.run.vm01:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout
2026-03-10T15:38:57.754 INFO:tasks.workunit:Running workunits matching cephadm/test_cephadm.sh on client.0...
2026-03-10T15:38:57.755 INFO:tasks.workunit:Running workunit cephadm/test_cephadm.sh...
2026-03-10T15:38:57.755 DEBUG:teuthology.orchestra.run.vm01:workunit test cephadm/test_cephadm.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/test_cephadm.sh
2026-03-10T15:38:57.801 INFO:tasks.workunit.client.0.vm01.stderr:++ basename /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/test_cephadm.sh
2026-03-10T15:38:57.802 INFO:tasks.workunit.client.0.vm01.stderr:+ SCRIPT_NAME=test_cephadm.sh
2026-03-10T15:38:57.802 INFO:tasks.workunit.client.0.vm01.stderr:+++ dirname /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/test_cephadm.sh
2026-03-10T15:38:57.803 INFO:tasks.workunit.client.0.vm01.stderr:++ cd /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm
2026-03-10T15:38:57.803 INFO:tasks.workunit.client.0.vm01.stderr:++ pwd
2026-03-10T15:38:57.803 INFO:tasks.workunit.client.0.vm01.stderr:+ SCRIPT_DIR=/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm
2026-03-10T15:38:57.803 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' -z '' ']'
2026-03-10T15:38:57.803 INFO:tasks.workunit.client.0.vm01.stderr:+ CLEANUP=true
2026-03-10T15:38:57.803 INFO:tasks.workunit.client.0.vm01.stderr:+ FSID=00000000-0000-0000-0000-0000deadbeef
2026-03-10T15:38:57.803 INFO:tasks.workunit.client.0.vm01.stderr:+ IMAGE_MAIN=quay.ceph.io/ceph-ci/ceph:main
2026-03-10T15:38:57.803 INFO:tasks.workunit.client.0.vm01.stderr:+ IMAGE_QUINCY=quay.ceph.io/ceph-ci/ceph:quincy
2026-03-10T15:38:57.803 INFO:tasks.workunit.client.0.vm01.stderr:+ IMAGE_REEF=quay.ceph.io/ceph-ci/ceph:reef
2026-03-10T15:38:57.803 INFO:tasks.workunit.client.0.vm01.stderr:+ IMAGE_SQUID=quay.ceph.io/ceph-ci/ceph:squid
2026-03-10T15:38:57.803 INFO:tasks.workunit.client.0.vm01.stderr:+ IMAGE_DEFAULT=quay.ceph.io/ceph-ci/ceph:squid
2026-03-10T15:38:57.803 INFO:tasks.workunit.client.0.vm01.stderr:+ OSD_IMAGE_NAME=test_cephadm_osd.img
2026-03-10T15:38:57.803 INFO:tasks.workunit.client.0.vm01.stderr:+ OSD_IMAGE_SIZE=6G
2026-03-10T15:38:57.803 INFO:tasks.workunit.client.0.vm01.stderr:+ OSD_TO_CREATE=2
2026-03-10T15:38:57.804 INFO:tasks.workunit.client.0.vm01.stderr:+ OSD_VG_NAME=test_cephadm
2026-03-10T15:38:57.804 INFO:tasks.workunit.client.0.vm01.stderr:+ OSD_LV_NAME=test_cephadm
2026-03-10T15:38:57.804 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' -d '' ']'
2026-03-10T15:38:57.804 INFO:tasks.workunit.client.0.vm01.stderr:++ mktemp -d tmp.test_cephadm.sh.XXXXXX
2026-03-10T15:38:57.804 INFO:tasks.workunit.client.0.vm01.stderr:+ TMPDIR=tmp.test_cephadm.sh.dSdiir
2026-03-10T15:38:57.804 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' -d '' ']'
2026-03-10T15:38:57.804 INFO:tasks.workunit.client.0.vm01.stderr:++ mktemp -d tmp.test_cephadm.sh.XXXXXX
2026-03-10T15:38:57.805 INFO:tasks.workunit.client.0.vm01.stderr:+ TMPDIR_TEST_MULTIPLE_MOUNTS=tmp.test_cephadm.sh.l4jfBR
2026-03-10T15:38:57.805 INFO:tasks.workunit.client.0.vm01.stderr:+ CEPHADM_SRC_DIR=/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm
2026-03-10T15:38:57.805 INFO:tasks.workunit.client.0.vm01.stderr:+ CEPHADM_SAMPLES_DIR=/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples
2026-03-10T15:38:57.805 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' -z '' ']'
2026-03-10T15:38:57.805 INFO:tasks.workunit.client.0.vm01.stderr:+ SUDO=sudo
2026-03-10T15:38:57.805 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' -z '' ']'
2026-03-10T15:38:57.805 INFO:tasks.workunit.client.0.vm01.stderr:+ command -v cephadm
2026-03-10T15:38:57.806 INFO:tasks.workunit.client.0.vm01.stderr:++ command -v cephadm
2026-03-10T15:38:57.806 INFO:tasks.workunit.client.0.vm01.stderr:+ CEPHADM=/usr/sbin/cephadm
2026-03-10T15:38:57.806 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' -z /usr/sbin/cephadm ']'
2026-03-10T15:38:57.806 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' -x /usr/sbin/cephadm ']'
2026-03-10T15:38:57.806 INFO:tasks.workunit.client.0.vm01.stderr:+ CEPHADM_ARGS=' --image quay.ceph.io/ceph-ci/ceph:squid'
2026-03-10T15:38:57.806 INFO:tasks.workunit.client.0.vm01.stderr:+ CEPHADM_BIN=/usr/sbin/cephadm
2026-03-10T15:38:57.806 INFO:tasks.workunit.client.0.vm01.stderr:+ CEPHADM='sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid'
2026-03-10T15:38:57.806 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-cluster --fsid 00000000-0000-0000-0000-0000deadbeef --force
2026-03-10T15:38:57.896 INFO:tasks.workunit.client.0.vm01.stdout:Deleting cluster with fsid: 00000000-0000-0000-0000-0000deadbeef
2026-03-10T15:38:59.131 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo vgchange -an test_cephadm
2026-03-10T15:38:59.142 INFO:tasks.workunit.client.0.vm01.stderr: Volume group "test_cephadm" not found
2026-03-10T15:38:59.142 INFO:tasks.workunit.client.0.vm01.stderr: Cannot process volume group test_cephadm
2026-03-10T15:38:59.172 INFO:tasks.workunit.client.0.vm01.stderr:+ true
2026-03-10T15:38:59.172 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo losetup -a
2026-03-10T15:38:59.172 INFO:tasks.workunit.client.0.vm01.stderr:++ awk -F : '{print $1}'
2026-03-10T15:38:59.174 INFO:tasks.workunit.client.0.vm01.stderr:+++ basename test_cephadm_osd.img
2026-03-10T15:38:59.176 INFO:tasks.workunit.client.0.vm01.stderr:++ grep test_cephadm_osd.img
2026-03-10T15:38:59.179 INFO:tasks.workunit.client.0.vm01.stderr:+ loopdev=
2026-03-10T15:38:59.179 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' '' = '' ']'
2026-03-10T15:38:59.179 INFO:tasks.workunit.client.0.vm01.stderr:+ trap cleanup EXIT
2026-03-10T15:38:59.179 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid check-host
2026-03-10T15:38:59.274 INFO:tasks.workunit.client.0.vm01.stderr:docker (/usr/bin/docker) is present
2026-03-10T15:38:59.274 INFO:tasks.workunit.client.0.vm01.stderr:systemctl is present
2026-03-10T15:38:59.274 INFO:tasks.workunit.client.0.vm01.stderr:lvcreate is present
2026-03-10T15:38:59.298 INFO:tasks.workunit.client.0.vm01.stderr:Unit ntp.service is enabled and running
2026-03-10T15:38:59.298 INFO:tasks.workunit.client.0.vm01.stderr:Host looks OK
2026-03-10T15:38:59.315 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid gather-facts
2026-03-10T15:38:59.456 INFO:tasks.workunit.client.0.vm01.stdout:{
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "arch": "x86_64",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "bios_date": "04/01/2014",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "bios_version": "1.16.3-debian-1.16.3-2",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "board_serial": "Unknown",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "chassis_serial": "",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "cpu_cores": 1,
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "cpu_count": 2,
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "cpu_load": {
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "15min": 0.23,
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "1min": 1.09,
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "5min": 0.55
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: },
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "cpu_model": "AMD Ryzen 9 7950X3D 16-Core Processor",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "cpu_threads": 1,
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "enclosure_count": 0,
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "enclosures": {},
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "flash_capacity": "0.0",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "flash_capacity_bytes": 0,
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "flash_count": 0,
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "flash_list": [],
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "fqdn": "vm01.local",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "hdd_capacity": "128.8GB",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "hdd_capacity_bytes": 128849018880,
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "hdd_count": 5,
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "hdd_list": [
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: {
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "alt_dev_name": "",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "description": "Virtio Block Device Unknown (21.5GB)",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "dev_name": "vdd",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "disk_size_bytes": 21474836480,
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "disk_type": "hdd",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "enclosure_id": "",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "enclosure_slot": "",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "model": "Unknown",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "mpath": "",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "path_id": "/dev/disk/by-path/virtio-pci-0000:00:08.0",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "rev": "Unknown",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "scsi_addr": "",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "serial": "",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "vendor": "Virtio Block Device",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "wwid": "Unknown"
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: },
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: {
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "alt_dev_name": "",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "description": "Virtio Block Device Unknown (21.5GB)",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "dev_name": "vdb",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "disk_size_bytes": 21474836480,
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "disk_type": "hdd",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "enclosure_id": "",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "enclosure_slot": "",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "model": "Unknown",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "mpath": "",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "path_id": "/dev/disk/by-path/virtio-pci-0000:00:06.0",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "rev": "Unknown",
2026-03-10T15:38:59.457 INFO:tasks.workunit.client.0.vm01.stdout: "scsi_addr": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "serial": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "vendor": "Virtio Block Device",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "wwid": "Unknown"
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: },
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: {
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "alt_dev_name": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "description": "Virtio Block Device Unknown (21.5GB)",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "dev_name": "vde",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "disk_size_bytes": 21474836480,
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "disk_type": "hdd",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "enclosure_id": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "enclosure_slot": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "model": "Unknown",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "mpath": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "path_id": "/dev/disk/by-path/pci-0000:00:09.0",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "rev": "Unknown",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "scsi_addr": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "serial": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "vendor": "Virtio Block Device",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "wwid": "Unknown"
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: },
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: {
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "alt_dev_name": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "description": "Virtio Block Device Unknown (21.5GB)",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "dev_name": "vdc",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "disk_size_bytes": 21474836480,
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "disk_type": "hdd",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "enclosure_id": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "enclosure_slot": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "model": "Unknown",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "mpath": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "path_id": "/dev/disk/by-path/pci-0000:00:07.0",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "rev": "Unknown",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "scsi_addr": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "serial": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "vendor": "Virtio Block Device",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "wwid": "Unknown"
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: },
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: {
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "alt_dev_name": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "description": "Virtio Block Device Unknown (42.9GB)",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "dev_name": "vda",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "disk_size_bytes": 42949672960,
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "disk_type": "hdd",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "enclosure_id": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "enclosure_slot": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "model": "Unknown",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "mpath": "",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "path_id": "/dev/disk/by-path/virtio-pci-0000:00:05.0",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "rev": "Unknown",
2026-03-10T15:38:59.458 INFO:tasks.workunit.client.0.vm01.stdout: "scsi_addr": "",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "serial": "",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "vendor": "Virtio Block Device",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "wwid": "Unknown"
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: }
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: ],
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "hostname": "vm01",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "interfaces": {
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "docker0": {
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "driver": "",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "iftype": "logical",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "ipv4_address": "172.17.0.1/16",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "ipv6_address": "",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "lower_devs_list": [],
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "mtu": 1500,
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "nic_type": "bridge",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "operstate": "down",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "speed": -1,
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "upper_devs_list": []
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: },
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "ens3": {
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "driver": "virtio_net",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "iftype": "physical",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "ipv4_address": "192.168.123.101/24",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "ipv6_address": "fe80::5055:ff:fe00:1/64",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "lower_devs_list": [],
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "mtu": 1500,
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "nic_type": "ethernet",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "operstate": "up",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "speed": -1,
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "upper_devs_list": []
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: }
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: },
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "kernel": "5.15.0-1092-kvm",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "kernel_parameters": {
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ip_nonlocal_bind": "0"
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: },
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "kernel_security": {
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "description": "AppArmor: Enabled(40 enforce)",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "enforce": 40,
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "type": "AppArmor"
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: },
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "memory_available_kb": 7830312,
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "memory_free_kb": 1811848,
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "memory_total_kb": 8156564,
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "model": " (Standard PC (i440FX + PIIX, 1996))",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "nic_count": 1,
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "operating_system": "Ubuntu 22.04.5 LTS (Jammy Jellyfish)",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "product_serial": "",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "selinux_enabled": false,
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "shortname": "vm01",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "subscribed": "Unknown",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "sysctl_options": {
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "abi.vsyscall32": "1",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "debug.exception-trace": "1",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "dev.cdrom.autoclose": "1",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "dev.cdrom.autoeject": "0",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "dev.cdrom.check_media": "0",
2026-03-10T15:38:59.459 INFO:tasks.workunit.client.0.vm01.stdout: "dev.cdrom.debug": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "dev.cdrom.info": "",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "dev.cdrom.lock": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "dev.scsi.logging_level": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "dev.tty.ldisc_autoload": "1",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.aio-max-nr": "1048576",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.aio-nr": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.dentry-state": "68327\t57884\t45\t0\t16112\t0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.epoll.max_user_watches": "1814839",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.fanotify.max_queued_events": "16384",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.fanotify.max_user_groups": "128",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.fanotify.max_user_marks": "66044",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.file-max": "9223372036854775807",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.file-nr": "1280\t0\t9223372036854775807",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.inode-nr": "117023\t75000",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.inode-state": "117023\t75000\t0\t0\t0\t0\t0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.inotify.max_queued_events": "16384",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.inotify.max_user_instances": "128",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.inotify.max_user_watches": "62113",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.lease-break-time": "45",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.leases-enable": "1",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.mount-max": "100000",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.mqueue.msg_default": "10",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.mqueue.msg_max": "10",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.mqueue.msgsize_default": "8192",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.mqueue.msgsize_max": "8192",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.mqueue.queues_max": "256",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.nr_open": "1048576",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.overflowgid": "65534",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.overflowuid": "65534",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.pipe-max-size": "1048576",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.pipe-user-pages-hard": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.pipe-user-pages-soft": "16384",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.protected_fifos": "1",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.protected_hardlinks": "1",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.protected_regular": "2",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.protected_symlinks": "1",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.quota.allocated_dquots": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.quota.cache_hits": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.quota.drops": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.quota.free_dquots": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.quota.lookups": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.quota.reads": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.quota.syncs": "16",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.quota.warnings": "1",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.quota.writes": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.suid_dumpable": "2",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "fs.verity.require_signatures": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.apparmor_display_secid_mode": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.auto_msgmni": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.bootloader_type": "114",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.bootloader_version": "2",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.bpf_stats_enabled": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.cad_pid": "1",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.cap_last_cap": "40",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.core_pattern": "/home/ubuntu/cephtest/archive/coredump/%t.%p.core",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.core_pipe_limit": "10",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.core_uses_pid": "1",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.ctrl-alt-del": "0",
2026-03-10T15:38:59.460 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.dmesg_restrict": "1",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.domainname": "(none)",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.ftrace_dump_on_oops": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.hostname": "vm01",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.hotplug": "",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.io_delay_type": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.io_uring_disabled": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.io_uring_group": "-1",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.kexec_load_disabled": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.keys.gc_delay": "300",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.keys.maxbytes": "20000",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.keys.maxkeys": "200",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.keys.root_maxbytes": "25000000",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.keys.root_maxkeys": "1000000",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.kptr_restrict": "1",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.max_lock_depth": "1024",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.max_rcu_stall_to_panic": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.modprobe": "/sbin/modprobe",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.modules_disabled": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.msg_next_id": "-1",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.msgmax": "8192",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.msgmnb": "16384",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.msgmni": "32000",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.ngroups_max": "65536",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.ns_last_pid": "19563",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.numa_balancing": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.oops_all_cpu_backtrace": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.oops_limit": "10000",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.osrelease": "5.15.0-1092-kvm",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.ostype": "Linux",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.overflowgid": "65534",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.overflowuid": "65534",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.panic": "-1",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.panic_on_io_nmi": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.panic_on_oops": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.panic_on_rcu_stall": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.panic_on_unrecovered_nmi": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.panic_on_warn": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.panic_print": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.perf_cpu_time_max_percent": "25",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.perf_event_max_contexts_per_stack": "8",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.perf_event_max_sample_rate": "100000",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.perf_event_max_stack": "127",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.perf_event_mlock_kb": "516",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.perf_event_paranoid": "4",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.pid_max": "4194304",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.poweroff_cmd": "/sbin/poweroff",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.print-fatal-signals": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.printk": "4\t4\t1\t7",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.printk_delay": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.printk_devkmsg": "on",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.printk_ratelimit": "5",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.printk_ratelimit_burst": "10",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.pty.max": "4096",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.pty.nr": "0",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.pty.reserve": "1024",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.random.boot_id": "2973e642-371b-4ee2-8c90-57ae31cd64cd",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.random.entropy_avail": "256",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.random.poolsize": "256",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.random.urandom_min_reseed_secs": "60",
2026-03-10T15:38:59.461 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.random.uuid": "df40b6f4-edaa-4c0e-96c8-ec2883844115",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.random.write_wakeup_threshold": "256",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.randomize_va_space": "2",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.real-root-dev": "0",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.sched_cfs_bandwidth_slice_us": "5000",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.sched_child_runs_first": "0",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.sched_deadline_period_max_us": "4194304",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.sched_deadline_period_min_us": "100",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.sched_energy_aware": "1",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.sched_rr_timeslice_ms": "100",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.sched_rt_period_us": "1000000",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.sched_rt_runtime_us": "950000",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.sched_schedstats": "0",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.sched_util_clamp_max": "1024",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.sched_util_clamp_min": "1024",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.sched_util_clamp_min_rt_default": "1024",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.seccomp.actions_avail": "kill_process kill_thread trap errno user_notif trace log allow",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.seccomp.actions_logged": "kill_process kill_thread trap errno user_notif trace log",
2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.sem": "32000\t1024000000\t500\t32000", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.sem_next_id": "-1", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.shm_next_id": "-1", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.shm_rmid_forced": "0", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.shmall": "18446744073692774399", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.shmmax": "18446744073692774399", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.shmmni": "4096", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.sysctl_writes_strict": "1", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.tainted": "0", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.task_delayacct": "0", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.threads-max": "63692", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.timer_migration": "1", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.traceoff_on_warning": "0", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.tracepoint_printk": "0", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.unknown_nmi_panic": "0", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.unprivileged_bpf_disabled": "2", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.unprivileged_userns_apparmor_policy": "1", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.unprivileged_userns_clone": "1", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.usermodehelper.bset": "4294967295\t511", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: 
"kernel.usermodehelper.inheritable": "4294967295\t511", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.version": "#97-Ubuntu SMP Fri Jan 23 15:00:24 UTC 2026", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.warn_limit": "0", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "kernel.yama.ptrace_scope": "1", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.bpf_jit_enable": "1", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.bpf_jit_harden": "0", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.bpf_jit_kallsyms": "1", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.bpf_jit_limit": "528482304", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.busy_poll": "0", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.busy_read": "0", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.default_qdisc": "pfifo_fast", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.dev_weight": "64", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.dev_weight_rx_bias": "1", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.dev_weight_tx_bias": "1", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.devconf_inherit_init_net": "0", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.fb_tunnels_only_for_init_net": "0", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.flow_limit_cpu_bitmap": "0", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.flow_limit_table_len": "4096", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.gro_normal_batch": "8", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: 
"net.core.high_order_alloc_disable": "0", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.max_skb_frags": "17", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.message_burst": "10", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.message_cost": "5", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.netdev_budget": "300", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.netdev_budget_usecs": "8000", 2026-03-10T15:38:59.462 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.netdev_max_backlog": "1000", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.netdev_rss_key": "00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.netdev_tstamp_prequeue": "1", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.netdev_unregister_timeout_secs": "10", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.optmem_max": "20480", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.rmem_default": "212992", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.rmem_max": "212992", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.rps_sock_flow_entries": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.somaxconn": "4096", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.tstamp_allow_data": "1", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.warnings": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.wmem_default": "212992", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.wmem_max": 
"212992", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.xfrm_acq_expires": "30", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.xfrm_aevent_etime": "10", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.xfrm_aevent_rseqth": "2", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.core.xfrm_larval_drop": "1", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.cipso_cache_bucket_size": "10", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.cipso_cache_enable": "1", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.cipso_rbm_optfmt": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.cipso_rbm_strictvalid": "1", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.accept_local": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.accept_redirects": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.accept_source_route": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.arp_accept": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.arp_announce": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.arp_filter": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.arp_ignore": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.arp_notify": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.bc_forwarding": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.bootp_relay": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.disable_policy": "0", 
2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.disable_xfrm": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.drop_gratuitous_arp": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.drop_unicast_in_l2_multicast": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.force_igmp_version": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.forwarding": "1", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.igmpv2_unsolicited_report_interval": "10000", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.igmpv3_unsolicited_report_interval": "1000", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.ignore_routes_with_linkdown": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.log_martians": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.mc_forwarding": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.medium_id": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.promote_secondaries": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.proxy_arp": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.proxy_arp_pvlan": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.route_localnet": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.rp_filter": "2", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.secure_redirects": "1", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.send_redirects": "1", 
2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.shared_media": "1", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.src_valid_mark": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.all.tag": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.accept_local": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.accept_redirects": "1", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.accept_source_route": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.arp_accept": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.arp_announce": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.arp_filter": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.arp_ignore": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.arp_notify": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.bc_forwarding": "0", 2026-03-10T15:38:59.463 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.bootp_relay": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.disable_policy": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.disable_xfrm": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.drop_gratuitous_arp": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.drop_unicast_in_l2_multicast": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.force_igmp_version": "0", 
2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.forwarding": "1", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.igmpv2_unsolicited_report_interval": "10000", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.igmpv3_unsolicited_report_interval": "1000", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.ignore_routes_with_linkdown": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.log_martians": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.mc_forwarding": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.medium_id": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.promote_secondaries": "1", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.proxy_arp": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.proxy_arp_pvlan": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.route_localnet": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.rp_filter": "2", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.secure_redirects": "1", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.send_redirects": "1", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.shared_media": "1", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.src_valid_mark": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.default.tag": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: 
"net.ipv4.conf.docker0.accept_local": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.accept_redirects": "1", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.accept_source_route": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.arp_accept": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.arp_announce": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.arp_filter": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.arp_ignore": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.arp_notify": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.bc_forwarding": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.bootp_relay": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.disable_policy": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.disable_xfrm": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.drop_gratuitous_arp": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.drop_unicast_in_l2_multicast": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.force_igmp_version": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.forwarding": "1", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.igmpv2_unsolicited_report_interval": "10000", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.igmpv3_unsolicited_report_interval": "1000", 
2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.ignore_routes_with_linkdown": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.log_martians": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.mc_forwarding": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.medium_id": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.promote_secondaries": "1", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.proxy_arp": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.proxy_arp_pvlan": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.route_localnet": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.rp_filter": "2", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.secure_redirects": "1", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.send_redirects": "1", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.shared_media": "1", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.src_valid_mark": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.docker0.tag": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.accept_local": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.accept_redirects": "1", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.accept_source_route": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.arp_accept": "0", 
2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.arp_announce": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.arp_filter": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.arp_ignore": "0", 2026-03-10T15:38:59.464 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.arp_notify": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.bc_forwarding": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.bootp_relay": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.disable_policy": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.disable_xfrm": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.drop_gratuitous_arp": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.drop_unicast_in_l2_multicast": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.force_igmp_version": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.forwarding": "1", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.igmpv2_unsolicited_report_interval": "10000", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.igmpv3_unsolicited_report_interval": "1000", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.ignore_routes_with_linkdown": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.log_martians": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.mc_forwarding": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.medium_id": "0", 
2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.promote_secondaries": "1", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.proxy_arp": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.proxy_arp_pvlan": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.route_localnet": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.rp_filter": "2", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.secure_redirects": "1", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.send_redirects": "1", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.shared_media": "1", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.src_valid_mark": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.ens3.tag": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.accept_local": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.accept_redirects": "1", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.accept_source_route": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.arp_accept": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.arp_announce": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.arp_filter": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.arp_ignore": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.arp_notify": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.bc_forwarding": 
"0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.bootp_relay": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.disable_policy": "1", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.disable_xfrm": "1", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.drop_gratuitous_arp": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.drop_unicast_in_l2_multicast": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.force_igmp_version": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.forwarding": "1", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.igmpv2_unsolicited_report_interval": "10000", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.igmpv3_unsolicited_report_interval": "1000", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.ignore_routes_with_linkdown": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.log_martians": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.mc_forwarding": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.medium_id": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.promote_secondaries": "1", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.proxy_arp": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.proxy_arp_pvlan": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.route_localnet": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.rp_filter": "2", 2026-03-10T15:38:59.465 
INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.secure_redirects": "1", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.send_redirects": "1", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.shared_media": "1", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.src_valid_mark": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.conf.lo.tag": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.fib_notify_on_flag_change": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.fib_sync_mem": "524288", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.fwmark_reflect": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.icmp_echo_enable_probe": "0", 2026-03-10T15:38:59.465 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.icmp_echo_ignore_all": "0", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.icmp_echo_ignore_broadcasts": "1", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.icmp_errors_use_inbound_ifaddr": "0", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.icmp_ignore_bogus_error_responses": "1", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.icmp_msgs_burst": "50", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.icmp_msgs_per_sec": "1000", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.icmp_ratelimit": "1000", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.icmp_ratemask": "6168", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.igmp_link_local_mcast_reports": "1", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.igmp_max_memberships": "20", 
2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.igmp_max_msf": "10", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.inet_peer_maxttl": "600", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.inet_peer_minttl": "120", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.inet_peer_threshold": "65664", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ip_autobind_reuse": "0", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ip_default_ttl": "64", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ip_dynaddr": "0", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ip_early_demux": "1", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ip_forward": "1", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ip_forward_update_priority": "1", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ip_forward_use_pmtu": "0", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ip_local_port_range": "32768\t60999", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ip_local_reserved_ports": "", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ip_no_pmtu_disc": "0", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ip_nonlocal_bind": "0", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ip_unprivileged_port_start": "1024", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ipfrag_high_thresh": "4194304", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ipfrag_low_thresh": "3145728", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ipfrag_max_dist": "64", 2026-03-10T15:38:59.466 
INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ipfrag_secret_interval": "0", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ipfrag_time": "30", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.anycast_delay": "100", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.app_solicit": "0", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.base_reachable_time_ms": "30000", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.delay_first_probe_time": "5", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.gc_interval": "30", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.gc_stale_time": "60", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.gc_thresh1": "128", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.gc_thresh2": "512", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.gc_thresh3": "1024", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.locktime": "100", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.mcast_resolicit": "0", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.mcast_solicit": "3", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.proxy_delay": "80", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.proxy_qlen": "64", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.retrans_time_ms": "1000", 2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.ucast_solicit": "3", 2026-03-10T15:38:59.466 
INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.unres_qlen": "101",
2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.default.unres_qlen_bytes": "212992",
2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.docker0.anycast_delay": "100",
2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.docker0.app_solicit": "0",
2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.docker0.base_reachable_time_ms": "30000",
2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.docker0.delay_first_probe_time": "5",
2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.docker0.gc_stale_time": "60",
2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.docker0.locktime": "100",
2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.docker0.mcast_resolicit": "0",
2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.docker0.mcast_solicit": "3",
2026-03-10T15:38:59.466 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.docker0.proxy_delay": "80",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.docker0.proxy_qlen": "64",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.docker0.retrans_time_ms": "1000",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.docker0.ucast_solicit": "3",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.docker0.unres_qlen": "101",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.docker0.unres_qlen_bytes": "212992",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.ens3.anycast_delay": "100",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.ens3.app_solicit": "0",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.ens3.base_reachable_time_ms": "30000",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.ens3.delay_first_probe_time": "5",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.ens3.gc_stale_time": "60",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.ens3.locktime": "100",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.ens3.mcast_resolicit": "0",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.ens3.mcast_solicit": "3",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.ens3.proxy_delay": "80",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.ens3.proxy_qlen": "64",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.ens3.retrans_time_ms": "1000",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.ens3.ucast_solicit": "3",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.ens3.unres_qlen": "101",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.ens3.unres_qlen_bytes": "212992",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.lo.anycast_delay": "100",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.lo.app_solicit": "0",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.lo.base_reachable_time_ms": "30000",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.lo.delay_first_probe_time": "5",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.lo.gc_stale_time": "60",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.lo.locktime": "100",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.lo.mcast_resolicit": "0",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.lo.mcast_solicit": "3",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.lo.proxy_delay": "80",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.lo.proxy_qlen": "64",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.lo.retrans_time_ms": "1000",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.lo.ucast_solicit": "3",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.lo.unres_qlen": "101",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.neigh.lo.unres_qlen_bytes": "212992",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.nexthop_compat_mode": "1",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.ping_group_range": "0\t2147483647",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.raw_l3mdev_accept": "1",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.route.error_burst": "1250",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.route.error_cost": "250",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.route.gc_elasticity": "8",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.route.gc_interval": "60",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.route.gc_min_interval": "0",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.route.gc_min_interval_ms": "500",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.route.gc_thresh": "-1",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.route.gc_timeout": "300",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.route.max_size": "2147483647",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.route.min_adv_mss": "256",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.route.min_pmtu": "552",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.route.mtu_expires": "600",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.route.redirect_load": "5",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.route.redirect_number": "9",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.route.redirect_silence": "5120",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_abort_on_overflow": "0",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_adv_win_scale": "1",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_allowed_congestion_control": "reno cubic",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_app_win": "31",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_autocorking": "1",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_available_congestion_control": "reno cubic",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_available_ulp": "espintcp mptcp",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_base_mss": "1024",
2026-03-10T15:38:59.467 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_challenge_ack_limit": "1000",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_comp_sack_delay_ns": "1000000",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_comp_sack_nr": "44",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_comp_sack_slack_ns": "100000",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_congestion_control": "cubic",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_dsack": "1",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_early_demux": "1",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_early_retrans": "3",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_ecn": "2",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_ecn_fallback": "1",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_fack": "0",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_fastopen": "1",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_fastopen_blackhole_timeout_sec": "0",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_fastopen_key": "a35e87f5-7fe0a87c-3dfbbef8-5515932d",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_fin_timeout": "60",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_frto": "2",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_fwmark_accept": "0",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_invalid_ratelimit": "500",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_keepalive_intvl": "75",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_keepalive_probes": "9",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_keepalive_time": "7200",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_l3mdev_accept": "0",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_limit_output_bytes": "1048576",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_low_latency": "0",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_max_orphans": "32768",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_max_reordering": "300",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_max_syn_backlog": "512",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_max_tw_buckets": "32768",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_mem": "95214\t126952\t190428",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_migrate_req": "0",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_min_rtt_wlen": "300",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_min_snd_mss": "48",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_min_tso_segs": "2",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_moderate_rcvbuf": "1",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_mtu_probe_floor": "48",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_mtu_probing": "0",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_no_metrics_save": "0",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_no_ssthresh_metrics_save": "1",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_notsent_lowat": "4294967295",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_orphan_retries": "0",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_pacing_ca_ratio": "120",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_pacing_ss_ratio": "200",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_probe_interval": "600",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_probe_threshold": "8",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_recovery": "1",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_reflect_tos": "0",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_reordering": "3",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_retrans_collapse": "1",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_retries1": "3",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_retries2": "15",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_rfc1337": "0",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_rmem": "4096\t131072\t6291456",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_rx_skb_cache": "0",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_sack": "1",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_slow_start_after_idle": "1",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_stdurg": "0",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_syn_retries": "6",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_synack_retries": "5",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_syncookies": "1",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_thin_linear_timeouts": "0",
2026-03-10T15:38:59.468 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_timestamps": "1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_tso_win_divisor": "3",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_tw_reuse": "2",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_tx_skb_cache": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_window_scaling": "1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_wmem": "4096\t16384\t4194304",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.tcp_workaround_signed_windows": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.udp_early_demux": "1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.udp_l3mdev_accept": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.udp_mem": "190428\t253904\t380856",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.udp_rmem_min": "4096",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.udp_wmem_min": "4096",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv4.xfrm4_gc_thresh": "32768",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.anycast_src_echo_reply": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.auto_flowlabels": "1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.bindv6only": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.calipso_cache_bucket_size": "10",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.calipso_cache_enable": "1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.accept_dad": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.accept_ra": "1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.accept_ra_defrtr": "1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.accept_ra_from_local": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.accept_ra_min_hop_limit": "1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.accept_ra_min_lft": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.accept_ra_mtu": "1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.accept_ra_pinfo": "1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.accept_redirects": "1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.accept_source_route": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.addr_gen_mode": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.autoconf": "1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.dad_transmits": "1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.disable_ipv6": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.disable_policy": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.drop_unicast_in_l2_multicast": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.drop_unsolicited_na": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.enhanced_dad": "1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.force_mld_version": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.force_tllao": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.forwarding": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.hop_limit": "64",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.ignore_routes_with_linkdown": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.ioam6_enabled": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.ioam6_id": "65535",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.ioam6_id_wide": "4294967295",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.keep_addr_on_down": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.max_addresses": "16",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.max_desync_factor": "600",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.mldv1_unsolicited_report_interval": "10000",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.mldv2_unsolicited_report_interval": "1000",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.mtu": "1280",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.ndisc_notify": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.ndisc_tclass": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.proxy_ndp": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.ra_defrtr_metric": "1024",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.regen_max_retry": "3",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.router_solicitation_delay": "1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.router_solicitation_interval": "4",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.router_solicitation_max_interval": "3600",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.router_solicitations": "-1",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.rpl_seg_enabled": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.seg6_enabled": "0",
2026-03-10T15:38:59.469 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.suppress_frag_ndisc": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.temp_prefered_lft": "86400",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.temp_valid_lft": "604800",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.use_oif_addrs_only": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.all.use_tempaddr": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.accept_dad": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.accept_ra": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.accept_ra_defrtr": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.accept_ra_from_local": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.accept_ra_min_hop_limit": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.accept_ra_min_lft": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.accept_ra_mtu": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.accept_ra_pinfo": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.accept_redirects": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.accept_source_route": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.addr_gen_mode": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.autoconf": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.dad_transmits": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.disable_ipv6": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.disable_policy": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.drop_unicast_in_l2_multicast": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.drop_unsolicited_na": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.enhanced_dad": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.force_mld_version": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.force_tllao": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.forwarding": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.hop_limit": "64",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.ignore_routes_with_linkdown": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.ioam6_enabled": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.ioam6_id": "65535",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.ioam6_id_wide": "4294967295",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.keep_addr_on_down": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.max_addresses": "16",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.max_desync_factor": "600",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.mldv1_unsolicited_report_interval": "10000",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.mldv2_unsolicited_report_interval": "1000",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.mtu": "1280",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.ndisc_notify": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.ndisc_tclass": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.proxy_ndp": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.ra_defrtr_metric": "1024",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.regen_max_retry": "3",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.router_solicitation_delay": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.router_solicitation_interval": "4",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.router_solicitation_max_interval": "3600",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.router_solicitations": "-1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.rpl_seg_enabled": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.seg6_enabled": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.suppress_frag_ndisc": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.temp_prefered_lft": "86400",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.temp_valid_lft": "604800",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.use_oif_addrs_only": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.default.use_tempaddr": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.accept_dad": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.accept_ra": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.accept_ra_defrtr": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.accept_ra_from_local": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.accept_ra_min_hop_limit": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.accept_ra_min_lft": "0",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.accept_ra_mtu": "1",
2026-03-10T15:38:59.470 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.accept_ra_pinfo": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.accept_redirects": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.accept_source_route": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.addr_gen_mode": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.autoconf": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.dad_transmits": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.disable_ipv6": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.disable_policy": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.drop_unicast_in_l2_multicast": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.drop_unsolicited_na": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.enhanced_dad": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.force_mld_version": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.force_tllao": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.forwarding": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.hop_limit": "64",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.ignore_routes_with_linkdown": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.ioam6_enabled": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.ioam6_id": "65535",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.ioam6_id_wide": "4294967295",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.keep_addr_on_down": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.max_addresses": "16",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.max_desync_factor": "600",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.mldv1_unsolicited_report_interval": "10000",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.mldv2_unsolicited_report_interval": "1000",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.mtu": "1500",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.ndisc_notify": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.ndisc_tclass": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.proxy_ndp": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.ra_defrtr_metric": "1024",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.regen_max_retry": "3",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.router_solicitation_delay": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.router_solicitation_interval": "4",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.router_solicitation_max_interval": "3600",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.router_solicitations": "-1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.rpl_seg_enabled": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.seg6_enabled": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.suppress_frag_ndisc": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.temp_prefered_lft": "86400",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.temp_valid_lft": "604800",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.use_oif_addrs_only": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.docker0.use_tempaddr": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.accept_dad": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.accept_ra": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.accept_ra_defrtr": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.accept_ra_from_local": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.accept_ra_min_hop_limit": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.accept_ra_min_lft": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.accept_ra_mtu": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.accept_ra_pinfo": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.accept_redirects": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.accept_source_route": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.addr_gen_mode": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.autoconf": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.dad_transmits": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.disable_ipv6": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.disable_policy": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.drop_unicast_in_l2_multicast": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.drop_unsolicited_na": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.enhanced_dad": "1",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.force_mld_version": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.force_tllao": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.forwarding": "0",
2026-03-10T15:38:59.471 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.hop_limit": "64",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.ignore_routes_with_linkdown": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.ioam6_enabled": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.ioam6_id": "65535",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.ioam6_id_wide": "4294967295",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.keep_addr_on_down": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.max_addresses": "16",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.max_desync_factor": "600",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.mldv1_unsolicited_report_interval": "10000",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.mldv2_unsolicited_report_interval": "1000",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.mtu": "1500",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.ndisc_notify": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.ndisc_tclass": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.proxy_ndp": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.ra_defrtr_metric": "1024",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.regen_max_retry": "3",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.router_solicitation_delay": "1",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.router_solicitation_interval": "4",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.router_solicitation_max_interval": "3600",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.router_solicitations": "-1",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.rpl_seg_enabled": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.seg6_enabled": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.suppress_frag_ndisc": "1",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.temp_prefered_lft": "86400",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.temp_valid_lft": "604800",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.use_oif_addrs_only": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.ens3.use_tempaddr": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.accept_dad": "-1",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.accept_ra": "1",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.accept_ra_defrtr": "1",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.accept_ra_from_local": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.accept_ra_min_hop_limit": "1",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.accept_ra_min_lft": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.accept_ra_mtu": "1",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.accept_ra_pinfo": "1",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.accept_redirects": "1",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.accept_source_route": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.addr_gen_mode": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.autoconf": "1",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.dad_transmits": "1",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.disable_ipv6": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.disable_policy": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.drop_unicast_in_l2_multicast": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.drop_unsolicited_na": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.enhanced_dad": "1",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.force_mld_version": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.force_tllao": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.forwarding": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.hop_limit": "64",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.ignore_routes_with_linkdown": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.ioam6_enabled": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.ioam6_id": "65535",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.ioam6_id_wide": "4294967295",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.keep_addr_on_down": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.max_addresses": "16",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.max_desync_factor": "600",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.mldv1_unsolicited_report_interval": "10000",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.mldv2_unsolicited_report_interval": "1000",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.mtu": "65536",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.ndisc_notify": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.ndisc_tclass": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.proxy_ndp": "0",
2026-03-10T15:38:59.472 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.ra_defrtr_metric": "1024",
2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.regen_max_retry": "3",
2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.router_solicitation_delay": "1",
2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.router_solicitation_interval": "4",
2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.router_solicitation_max_interval": "3600",
2026-03-10T15:38:59.473
INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.router_solicitations": "-1", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.rpl_seg_enabled": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.seg6_enabled": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.suppress_frag_ndisc": "1", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.temp_prefered_lft": "86400", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.temp_valid_lft": "604800", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.use_oif_addrs_only": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.conf.lo.use_tempaddr": "-1", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.fib_multipath_hash_fields": "7", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.fib_multipath_hash_policy": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.fib_notify_on_flag_change": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.flowlabel_consistency": "1", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.flowlabel_reflect": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.flowlabel_state_ranges": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.fwmark_reflect": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.icmp.echo_ignore_all": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.icmp.echo_ignore_anycast": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.icmp.echo_ignore_multicast": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: 
"net.ipv6.icmp.ratelimit": "1000", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.icmp.ratemask": "0-1,3-127", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.idgen_delay": "1", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.idgen_retries": "3", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.ioam6_id": "16777215", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.ioam6_id_wide": "72057594037927935", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.ip6frag_high_thresh": "4194304", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.ip6frag_low_thresh": "3145728", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.ip6frag_secret_interval": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.ip6frag_time": "60", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.ip_nonlocal_bind": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.max_dst_opts_length": "2147483647", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.max_dst_opts_number": "8", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.max_hbh_length": "2147483647", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.max_hbh_opts_number": "8", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.mld_max_msf": "64", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.mld_qrv": "2", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.anycast_delay": "100", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.app_solicit": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: 
"net.ipv6.neigh.default.base_reachable_time_ms": "30000", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.delay_first_probe_time": "5", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.gc_interval": "30", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.gc_stale_time": "60", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.gc_thresh1": "128", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.gc_thresh2": "512", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.gc_thresh3": "1024", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.locktime": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.mcast_resolicit": "0", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.mcast_solicit": "3", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.proxy_delay": "80", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.proxy_qlen": "64", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.retrans_time_ms": "1000", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.ucast_solicit": "3", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.unres_qlen": "101", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.default.unres_qlen_bytes": "212992", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.docker0.anycast_delay": "100", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.docker0.app_solicit": "0", 2026-03-10T15:38:59.473 
INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.docker0.base_reachable_time_ms": "30000", 2026-03-10T15:38:59.473 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.docker0.delay_first_probe_time": "5", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.docker0.gc_stale_time": "60", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.docker0.locktime": "0", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.docker0.mcast_resolicit": "0", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.docker0.mcast_solicit": "3", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.docker0.proxy_delay": "80", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.docker0.proxy_qlen": "64", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.docker0.retrans_time_ms": "1000", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.docker0.ucast_solicit": "3", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.docker0.unres_qlen": "101", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.docker0.unres_qlen_bytes": "212992", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.ens3.anycast_delay": "100", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.ens3.app_solicit": "0", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.ens3.base_reachable_time_ms": "30000", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.ens3.delay_first_probe_time": "5", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.ens3.gc_stale_time": "60", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.ens3.locktime": 
"0", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.ens3.mcast_resolicit": "0", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.ens3.mcast_solicit": "3", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.ens3.proxy_delay": "80", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.ens3.proxy_qlen": "64", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.ens3.retrans_time_ms": "1000", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.ens3.ucast_solicit": "3", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.ens3.unres_qlen": "101", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.ens3.unres_qlen_bytes": "212992", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.lo.anycast_delay": "100", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.lo.app_solicit": "0", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.lo.base_reachable_time_ms": "30000", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.lo.delay_first_probe_time": "5", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.lo.gc_stale_time": "60", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.lo.locktime": "0", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.lo.mcast_resolicit": "0", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.lo.mcast_solicit": "3", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.lo.proxy_delay": "80", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.lo.proxy_qlen": "64", 2026-03-10T15:38:59.474 
INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.lo.retrans_time_ms": "1000", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.lo.ucast_solicit": "3", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.lo.unres_qlen": "101", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.neigh.lo.unres_qlen_bytes": "212992", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.route.gc_elasticity": "9", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.route.gc_interval": "30", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.route.gc_min_interval": "0", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.route.gc_min_interval_ms": "500", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.route.gc_thresh": "1024", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.route.gc_timeout": "60", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.route.max_size": "2147483647", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.route.min_adv_mss": "1220", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.route.mtu_expires": "600", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.route.skip_notify_on_dev_down": "0", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.seg6_flowlabel": "0", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.ipv6.xfrm6_gc_thresh": "32768", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.mptcp.add_addr_timeout": "120", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.mptcp.allow_join_initial_addr_port": "1", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.mptcp.checksum_enabled": "0", 
2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.mptcp.enabled": "1", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.mptcp.stale_loss_cnt": "4", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_acct": "0", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_buckets": "262144", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_checksum": "1", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_count": "89", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_dccp_loose": "1", 2026-03-10T15:38:59.474 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_dccp_timeout_closereq": "64", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_dccp_timeout_closing": "64", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_dccp_timeout_open": "43200", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_dccp_timeout_partopen": "480", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_dccp_timeout_request": "240", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_dccp_timeout_respond": "480", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_dccp_timeout_timewait": "240", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_events": "1", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_expect_max": "4096", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_frag6_high_thresh": "4194304", 2026-03-10T15:38:59.475 
INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_frag6_low_thresh": "3145728", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_frag6_timeout": "60", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_generic_timeout": "600", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_gre_timeout": "30", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_gre_timeout_stream": "180", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_helper": "0", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_icmp_timeout": "30", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_icmpv6_timeout": "30", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_log_invalid": "0", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_max": "262144", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_sctp_timeout_closed": "10", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_sctp_timeout_cookie_echoed": "3", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_sctp_timeout_cookie_wait": "3", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_sctp_timeout_established": "210", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_sctp_timeout_heartbeat_sent": "30", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_sctp_timeout_shutdown_ack_sent": "3", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: 
"net.netfilter.nf_conntrack_sctp_timeout_shutdown_recd": "3", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_sctp_timeout_shutdown_sent": "3", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_tcp_be_liberal": "0", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_tcp_ignore_invalid_rst": "0", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_tcp_loose": "1", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_tcp_max_retrans": "3", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_tcp_timeout_close": "10", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_tcp_timeout_close_wait": "60", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_tcp_timeout_established": "432000", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_tcp_timeout_fin_wait": "120", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_tcp_timeout_last_ack": "30", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_tcp_timeout_max_retrans": "300", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_tcp_timeout_syn_recv": "60", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_tcp_timeout_syn_sent": "120", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_tcp_timeout_time_wait": "120", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_tcp_timeout_unacknowledged": "300", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: 
"net.netfilter.nf_conntrack_timestamp": "0", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_udp_timeout": "30", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_conntrack_udp_timeout_stream": "120", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_flowtable_tcp_timeout": "30", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_flowtable_udp_timeout": "30", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_hooks_lwtunnel": "0", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_log.0": "NONE", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_log.1": "NONE", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_log.10": "NONE", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_log.11": "NONE", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_log.12": "NONE", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_log.2": "NONE", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_log.3": "NONE", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_log.4": "NONE", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_log.5": "NONE", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_log.6": "NONE", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_log.7": "NONE", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_log.8": "NONE", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.netfilter.nf_log.9": "NONE", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: 
"net.netfilter.nf_log_all_netns": "0", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.nf_conntrack_max": "262144", 2026-03-10T15:38:59.475 INFO:tasks.workunit.client.0.vm01.stdout: "net.unix.max_dgram_qlen": "512", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "sunrpc.max_resvport": "1023", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "sunrpc.min_resvport": "665", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "sunrpc.tcp_fin_timeout": "15", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "sunrpc.tcp_max_slot_table_entries": "65536", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "sunrpc.tcp_slot_table_entries": "2", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "sunrpc.udp_slot_table_entries": "16", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "user.max_cgroup_namespaces": "31846", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "user.max_fanotify_groups": "128", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "user.max_fanotify_marks": "66044", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "user.max_inotify_instances": "128", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "user.max_inotify_watches": "62113", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "user.max_ipc_namespaces": "31846", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "user.max_mnt_namespaces": "31846", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "user.max_net_namespaces": "31846", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "user.max_pid_namespaces": "31846", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "user.max_time_namespaces": "31846", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "user.max_user_namespaces": "31846", 
2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "user.max_uts_namespaces": "31846", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.admin_reserve_kbytes": "8192", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.dirty_background_bytes": "0", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.dirty_background_ratio": "10", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.dirty_bytes": "0", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.dirty_expire_centisecs": "3000", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.dirty_ratio": "20", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.dirty_writeback_centisecs": "500", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.dirtytime_expire_seconds": "43200", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.hugetlb_shm_group": "0", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.laptop_mode": "0", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.legacy_va_layout": "0", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.lowmem_reserve_ratio": "256\t32\t0", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.max_map_count": "65530", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.min_free_kbytes": "11421", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.min_slab_ratio": "5", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.min_unmapped_ratio": "1", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.mmap_min_addr": "65536", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.mmap_rnd_bits": "28", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.mmap_rnd_compat_bits": "8", 2026-03-10T15:38:59.476 
INFO:tasks.workunit.client.0.vm01.stdout: "vm.nr_hugepages": "0", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.nr_hugepages_mempolicy": "0", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.nr_overcommit_hugepages": "0", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.numa_stat": "1", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.numa_zonelist_order": "Node", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.oom_dump_tasks": "1", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.oom_kill_allocating_task": "0", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.overcommit_kbytes": "0", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.overcommit_memory": "0", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.overcommit_ratio": "50", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.page-cluster": "3", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.page_lock_unfairness": "5", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.panic_on_oom": "0", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.percpu_pagelist_high_fraction": "0", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.stat_interval": "1", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.swappiness": "60", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.unprivileged_userfaultfd": "0", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.user_reserve_kbytes": "131072", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.vfs_cache_pressure": "100", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.watermark_boost_factor": "15000", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.watermark_scale_factor": 
"10", 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "vm.zone_reclaim_mode": "0" 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: }, 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "system_uptime": 253.31, 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: "tcp6_ports_used": [ 2026-03-10T15:38:59.476 INFO:tasks.workunit.client.0.vm01.stdout: 49929, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 22, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 111 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: ], 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: "tcp_ports_used": [ 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 22, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 111, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 5345, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 55501, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 53, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 33167 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: ], 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: "timestamp": 1773157139.4566596, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: "udp6_ports_used": [ 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 111, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 123, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 123, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 123, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 34103 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: ], 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: "udp_ports_used": [ 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 34680, 
2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 53, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 68, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 111, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 123, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 123, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 123, 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: 880 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: ], 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout: "vendor": "QEMU" 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stdout:} 2026-03-10T15:38:59.477 INFO:tasks.workunit.client.0.vm01.stderr:+ /usr/sbin/cephadm version 2026-03-10T15:38:59.559 INFO:tasks.workunit.client.0.vm01.stdout:cephadm version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable) 2026-03-10T15:38:59.577 INFO:tasks.workunit.client.0.vm01.stderr:+ /usr/sbin/cephadm version 2026-03-10T15:38:59.577 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'cephadm version' 2026-03-10T15:38:59.672 INFO:tasks.workunit.client.0.vm01.stdout:cephadm version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable) 2026-03-10T15:38:59.672 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' -z '' ']' 2026-03-10T15:38:59.672 INFO:tasks.workunit.client.0.vm01.stderr:+ /usr/sbin/cephadm version 2026-03-10T15:38:59.672 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -v UNSET 2026-03-10T15:38:59.762 INFO:tasks.workunit.client.0.vm01.stdout:cephadm version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable) 2026-03-10T15:38:59.762 INFO:tasks.workunit.client.0.vm01.stderr:+ /usr/sbin/cephadm version 2026-03-10T15:38:59.762 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -v UNKNOWN 2026-03-10T15:38:59.858 INFO:tasks.workunit.client.0.vm01.stdout:cephadm version 
19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable) 2026-03-10T15:38:59.858 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef -- ceph -v 2026-03-10T15:38:59.858 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'ceph version' 2026-03-10T15:39:02.013 INFO:tasks.workunit.client.0.vm01.stderr:Unable to find image 'quay.ceph.io/ceph-ci/ceph:squid' locally 2026-03-10T15:39:03.409 INFO:tasks.workunit.client.0.vm01.stderr:squid: Pulling from ceph-ci/ceph 2026-03-10T15:39:03.409 INFO:tasks.workunit.client.0.vm01.stderr:8e380faede39: Pulling fs layer 2026-03-10T15:39:03.409 INFO:tasks.workunit.client.0.vm01.stderr:1752b8d01aa0: Pulling fs layer 2026-03-10T15:39:08.933 INFO:tasks.workunit.client.0.vm01.stderr:8e380faede39: Verifying Checksum 2026-03-10T15:39:08.933 INFO:tasks.workunit.client.0.vm01.stderr:8e380faede39: Download complete 2026-03-10T15:39:10.563 INFO:tasks.workunit.client.0.vm01.stderr:8e380faede39: Pull complete 2026-03-10T15:39:26.085 INFO:tasks.workunit.client.0.vm01.stderr:1752b8d01aa0: Verifying Checksum 2026-03-10T15:39:26.085 INFO:tasks.workunit.client.0.vm01.stderr:1752b8d01aa0: Download complete 2026-03-10T15:39:34.461 INFO:tasks.workunit.client.0.vm01.stderr:1752b8d01aa0: Pull complete 2026-03-10T15:39:34.464 INFO:tasks.workunit.client.0.vm01.stderr:Digest: sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T15:39:34.464 INFO:tasks.workunit.client.0.vm01.stderr:Status: Downloaded newer image for quay.ceph.io/ceph-ci/ceph:squid 2026-03-10T15:39:36.136 INFO:tasks.workunit.client.0.vm01.stdout:ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable) 2026-03-10T15:39:36.137 INFO:tasks.workunit.client.0.vm01.stderr:+ grep FOO=BAR 2026-03-10T15:39:36.137 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid 
shell --fsid 00000000-0000-0000-0000-0000deadbeef -e FOO=BAR -- printenv 2026-03-10T15:39:38.617 INFO:tasks.workunit.client.0.vm01.stdout:FOO=BAR 2026-03-10T15:39:38.617 INFO:tasks.workunit.client.0.vm01.stderr:+ echo foo 2026-03-10T15:39:38.617 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q foo 2026-03-10T15:39:38.617 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell -- cat 2026-03-10T15:39:39.849 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-cluster --fsid 00000000-0000-0000-0000-0000deadbeef --force 2026-03-10T15:39:39.935 INFO:tasks.workunit.client.0.vm01.stdout:Deleting cluster with fsid: 00000000-0000-0000-0000-0000deadbeef 2026-03-10T15:39:41.148 INFO:tasks.workunit.client.0.vm01.stderr:++ mktemp -p tmp.test_cephadm.sh.dSdiir 2026-03-10T15:39:41.149 INFO:tasks.workunit.client.0.vm01.stderr:+ ORIG_CONFIG=tmp.test_cephadm.sh.dSdiir/tmp.QIVcPbMVhB 2026-03-10T15:39:41.149 INFO:tasks.workunit.client.0.vm01.stderr:++ mktemp -p tmp.test_cephadm.sh.dSdiir 2026-03-10T15:39:41.150 INFO:tasks.workunit.client.0.vm01.stderr:+ CONFIG=tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA 2026-03-10T15:39:41.150 INFO:tasks.workunit.client.0.vm01.stderr:++ mktemp -p tmp.test_cephadm.sh.dSdiir 2026-03-10T15:39:41.151 INFO:tasks.workunit.client.0.vm01.stderr:+ MONCONFIG=tmp.test_cephadm.sh.dSdiir/tmp.9JMNgNKnno 2026-03-10T15:39:41.151 INFO:tasks.workunit.client.0.vm01.stderr:++ mktemp -p tmp.test_cephadm.sh.dSdiir 2026-03-10T15:39:41.152 INFO:tasks.workunit.client.0.vm01.stderr:+ KEYRING=tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI 2026-03-10T15:39:41.152 INFO:tasks.workunit.client.0.vm01.stderr:+ IP=127.0.0.1 2026-03-10T15:39:41.152 INFO:tasks.workunit.client.0.vm01.stderr:+ cat 2026-03-10T15:39:41.153 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid bootstrap --mon-id a --mgr-id x --mon-ip 127.0.0.1 --fsid 
00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.QIVcPbMVhB --output-config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --output-keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI --output-pub-ssh-key tmp.test_cephadm.sh.dSdiir/ceph.pub --allow-overwrite --skip-mon-network --skip-monitoring-stack 2026-03-10T15:39:41.241 INFO:tasks.workunit.client.0.vm01.stderr:Specifying an fsid for your cluster offers no advantages and may increase the likelihood of fsid conflicts. 2026-03-10T15:39:41.241 INFO:tasks.workunit.client.0.vm01.stdout:Verifying podman|docker is present... 2026-03-10T15:39:41.242 INFO:tasks.workunit.client.0.vm01.stdout:Verifying lvm2 is present... 2026-03-10T15:39:41.242 INFO:tasks.workunit.client.0.vm01.stdout:Verifying time synchronization is in place... 2026-03-10T15:39:41.266 INFO:tasks.workunit.client.0.vm01.stdout:Unit ntp.service is enabled and running 2026-03-10T15:39:41.266 INFO:tasks.workunit.client.0.vm01.stdout:Repeating the final host check... 2026-03-10T15:39:41.266 INFO:tasks.workunit.client.0.vm01.stdout:docker (/usr/bin/docker) is present 2026-03-10T15:39:41.266 INFO:tasks.workunit.client.0.vm01.stdout:systemctl is present 2026-03-10T15:39:41.266 INFO:tasks.workunit.client.0.vm01.stdout:lvcreate is present 2026-03-10T15:39:41.291 INFO:tasks.workunit.client.0.vm01.stdout:Unit ntp.service is enabled and running 2026-03-10T15:39:41.291 INFO:tasks.workunit.client.0.vm01.stdout:Host looks OK 2026-03-10T15:39:41.291 INFO:tasks.workunit.client.0.vm01.stdout:Cluster fsid: 00000000-0000-0000-0000-0000deadbeef 2026-03-10T15:39:41.291 INFO:tasks.workunit.client.0.vm01.stdout:Verifying IP 127.0.0.1 port 3300 ... 2026-03-10T15:39:41.291 INFO:tasks.workunit.client.0.vm01.stdout:Verifying IP 127.0.0.1 port 6789 ... 
2026-03-10T15:39:41.291 INFO:tasks.workunit.client.0.vm01.stdout:Internal network (--cluster-network) has not been provided, OSD replication will default to the public_network 2026-03-10T15:39:41.291 INFO:tasks.workunit.client.0.vm01.stdout:Pulling container image quay.ceph.io/ceph-ci/ceph:squid... 2026-03-10T15:39:42.417 INFO:tasks.workunit.client.0.vm01.stdout:Ceph version: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable) 2026-03-10T15:39:42.417 INFO:tasks.workunit.client.0.vm01.stdout:Extracting ceph user uid/gid from container image... 2026-03-10T15:39:42.509 INFO:tasks.workunit.client.0.vm01.stdout:Creating initial keys... 2026-03-10T15:39:42.834 INFO:tasks.workunit.client.0.vm01.stdout:Creating initial monmap... 2026-03-10T15:39:42.953 INFO:tasks.workunit.client.0.vm01.stdout:Creating mon... 2026-03-10T15:39:43.989 INFO:tasks.workunit.client.0.vm01.stdout:Waiting for mon to start... 2026-03-10T15:39:43.989 INFO:tasks.workunit.client.0.vm01.stdout:Waiting for mon... 2026-03-10T15:39:44.225 INFO:tasks.workunit.client.0.vm01.stdout:mon is available 2026-03-10T15:39:44.225 INFO:tasks.workunit.client.0.vm01.stdout:Assimilating anything we can from ceph.conf... 2026-03-10T15:39:44.421 INFO:tasks.workunit.client.0.vm01.stdout:Generating new minimal ceph.conf... 2026-03-10T15:39:44.616 INFO:tasks.workunit.client.0.vm01.stdout:Restarting the monitor... 2026-03-10T15:39:44.727 INFO:tasks.workunit.client.0.vm01.stdout:Wrote config to tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA 2026-03-10T15:39:44.728 INFO:tasks.workunit.client.0.vm01.stdout:Wrote keyring to tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI 2026-03-10T15:39:44.728 INFO:tasks.workunit.client.0.vm01.stdout:Creating mgr... 2026-03-10T15:39:44.728 INFO:tasks.workunit.client.0.vm01.stdout:Verifying port 0.0.0.0:9283 ... 2026-03-10T15:39:44.728 INFO:tasks.workunit.client.0.vm01.stdout:Verifying port 0.0.0.0:8765 ... 
2026-03-10T15:39:45.115 INFO:tasks.workunit.client.0.vm01.stdout:Waiting for mgr to start... 2026-03-10T15:39:45.115 INFO:tasks.workunit.client.0.vm01.stdout:Waiting for mgr... 2026-03-10T15:39:45.331 INFO:tasks.workunit.client.0.vm01.stdout:mgr not available, waiting (1/15)... 2026-03-10T15:39:47.597 INFO:tasks.workunit.client.0.vm01.stdout:mgr not available, waiting (2/15)... 2026-03-10T15:39:49.913 INFO:tasks.workunit.client.0.vm01.stdout:mgr is available 2026-03-10T15:39:50.166 INFO:tasks.workunit.client.0.vm01.stdout:Enabling cephadm module... 2026-03-10T15:39:51.882 INFO:tasks.workunit.client.0.vm01.stdout:Waiting for the mgr to restart... 2026-03-10T15:39:51.882 INFO:tasks.workunit.client.0.vm01.stdout:Waiting for mgr epoch 5... 2026-03-10T15:39:55.720 INFO:tasks.workunit.client.0.vm01.stdout:mgr epoch 5 is available 2026-03-10T15:39:55.720 INFO:tasks.workunit.client.0.vm01.stdout:Setting orchestrator backend to cephadm... 2026-03-10T15:39:56.454 INFO:tasks.workunit.client.0.vm01.stdout:Generating ssh key... 2026-03-10T15:39:56.990 INFO:tasks.workunit.client.0.vm01.stdout:Wrote public SSH key to tmp.test_cephadm.sh.dSdiir/ceph.pub 2026-03-10T15:39:56.990 INFO:tasks.workunit.client.0.vm01.stdout:Adding key to root@localhost authorized_keys... 2026-03-10T15:39:56.990 INFO:tasks.workunit.client.0.vm01.stdout:Adding host vm01... 2026-03-10T15:39:59.114 INFO:tasks.workunit.client.0.vm01.stdout:Deploying mon service with default placement... 2026-03-10T15:39:59.437 INFO:tasks.workunit.client.0.vm01.stdout:Deploying mgr service with default placement... 2026-03-10T15:39:59.694 INFO:tasks.workunit.client.0.vm01.stdout:Deploying crash service with default placement... 2026-03-10T15:40:00.614 INFO:tasks.workunit.client.0.vm01.stdout:Enabling the dashboard module... 2026-03-10T15:40:02.162 INFO:tasks.workunit.client.0.vm01.stdout:Waiting for the mgr to restart... 2026-03-10T15:40:02.162 INFO:tasks.workunit.client.0.vm01.stdout:Waiting for mgr epoch 9... 
2026-03-10T15:40:05.989 INFO:tasks.workunit.client.0.vm01.stdout:mgr epoch 9 is available 2026-03-10T15:40:05.989 INFO:tasks.workunit.client.0.vm01.stdout:Generating a dashboard self-signed certificate... 2026-03-10T15:40:06.350 INFO:tasks.workunit.client.0.vm01.stdout:Creating initial admin user... 2026-03-10T15:40:06.967 INFO:tasks.workunit.client.0.vm01.stdout:Fetching dashboard port number... 2026-03-10T15:40:07.298 INFO:tasks.workunit.client.0.vm01.stdout:Ceph Dashboard is now available at: 2026-03-10T15:40:07.298 INFO:tasks.workunit.client.0.vm01.stdout: 2026-03-10T15:40:07.298 INFO:tasks.workunit.client.0.vm01.stdout: URL: https://vm01.local:8443/ 2026-03-10T15:40:07.298 INFO:tasks.workunit.client.0.vm01.stdout: User: admin 2026-03-10T15:40:07.298 INFO:tasks.workunit.client.0.vm01.stdout: Password: 4qxuss3hyq 2026-03-10T15:40:07.298 INFO:tasks.workunit.client.0.vm01.stdout: 2026-03-10T15:40:07.298 INFO:tasks.workunit.client.0.vm01.stdout:Saving cluster configuration to /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/config directory 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout:You can access the Ceph CLI as following in case of multi-cluster or non-default config: 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout: 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout: sudo /usr/sbin/cephadm shell --fsid 00000000-0000-0000-0000-0000deadbeef -c tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA -k tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout: 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout:Or, if you are only running a single cluster on this host: 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout: 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout: sudo /usr/sbin/cephadm shell 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout: 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout:Please 
consider enabling telemetry to help improve Ceph: 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout: 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout: ceph telemetry on 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout: 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout:For more information see: 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout: 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout: https://docs.ceph.com/en/latest/mgr/telemetry/ 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout: 2026-03-10T15:40:07.615 INFO:tasks.workunit.client.0.vm01.stdout:Bootstrap complete. 2026-03-10T15:40:07.636 INFO:tasks.workunit.client.0.vm01.stderr:+ test -e tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA 2026-03-10T15:40:07.636 INFO:tasks.workunit.client.0.vm01.stderr:+ test -e tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI 2026-03-10T15:40:07.636 INFO:tasks.workunit.client.0.vm01.stderr:+ rm -f tmp.test_cephadm.sh.dSdiir/tmp.QIVcPbMVhB 2026-03-10T15:40:07.637 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo test -e /var/log/ceph/00000000-0000-0000-0000-0000deadbeef/ceph-mon.a.log 2026-03-10T15:40:07.646 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo test -e /var/log/ceph/00000000-0000-0000-0000-0000deadbeef/ceph-mgr.x.log 2026-03-10T15:40:07.652 INFO:tasks.workunit.client.0.vm01.stderr:+ for u in ceph.target ceph-$FSID.target ceph-$FSID@mon.a ceph-$FSID@mgr.x 2026-03-10T15:40:07.652 INFO:tasks.workunit.client.0.vm01.stderr:+ systemctl is-enabled ceph.target 2026-03-10T15:40:07.654 INFO:tasks.workunit.client.0.vm01.stdout:enabled 2026-03-10T15:40:07.654 INFO:tasks.workunit.client.0.vm01.stderr:+ systemctl is-active ceph.target 2026-03-10T15:40:07.656 INFO:tasks.workunit.client.0.vm01.stdout:active 2026-03-10T15:40:07.656 INFO:tasks.workunit.client.0.vm01.stderr:+ for u in ceph.target ceph-$FSID.target ceph-$FSID@mon.a ceph-$FSID@mgr.x 2026-03-10T15:40:07.656 
INFO:tasks.workunit.client.0.vm01.stderr:+ systemctl is-enabled ceph-00000000-0000-0000-0000-0000deadbeef.target 2026-03-10T15:40:07.657 INFO:tasks.workunit.client.0.vm01.stdout:enabled 2026-03-10T15:40:07.658 INFO:tasks.workunit.client.0.vm01.stderr:+ systemctl is-active ceph-00000000-0000-0000-0000-0000deadbeef.target 2026-03-10T15:40:07.659 INFO:tasks.workunit.client.0.vm01.stdout:active 2026-03-10T15:40:07.659 INFO:tasks.workunit.client.0.vm01.stderr:+ for u in ceph.target ceph-$FSID.target ceph-$FSID@mon.a ceph-$FSID@mgr.x 2026-03-10T15:40:07.659 INFO:tasks.workunit.client.0.vm01.stderr:+ systemctl is-enabled ceph-00000000-0000-0000-0000-0000deadbeef@mon.a 2026-03-10T15:40:07.661 INFO:tasks.workunit.client.0.vm01.stdout:enabled 2026-03-10T15:40:07.661 INFO:tasks.workunit.client.0.vm01.stderr:+ systemctl is-active ceph-00000000-0000-0000-0000-0000deadbeef@mon.a 2026-03-10T15:40:07.662 INFO:tasks.workunit.client.0.vm01.stdout:active 2026-03-10T15:40:07.663 INFO:tasks.workunit.client.0.vm01.stderr:+ for u in ceph.target ceph-$FSID.target ceph-$FSID@mon.a ceph-$FSID@mgr.x 2026-03-10T15:40:07.663 INFO:tasks.workunit.client.0.vm01.stderr:+ systemctl is-enabled ceph-00000000-0000-0000-0000-0000deadbeef@mgr.x 2026-03-10T15:40:07.665 INFO:tasks.workunit.client.0.vm01.stdout:enabled 2026-03-10T15:40:07.665 INFO:tasks.workunit.client.0.vm01.stderr:+ systemctl is-active ceph-00000000-0000-0000-0000-0000deadbeef@mgr.x 2026-03-10T15:40:07.666 INFO:tasks.workunit.client.0.vm01.stdout:active 2026-03-10T15:40:07.666 INFO:tasks.workunit.client.0.vm01.stderr:+ systemctl 2026-03-10T15:40:07.667 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q .slice 2026-03-10T15:40:07.667 INFO:tasks.workunit.client.0.vm01.stderr:+ grep system-ceph 2026-03-10T15:40:07.670 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring 
tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph -s 2026-03-10T15:40:07.670 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 00000000-0000-0000-0000-0000deadbeef 2026-03-10T15:40:08.111 INFO:tasks.workunit.client.0.vm01.stdout: id: 00000000-0000-0000-0000-0000deadbeef 2026-03-10T15:40:08.112 INFO:tasks.workunit.client.0.vm01.stderr:+ for t in mon mgr node-exporter prometheus grafana 2026-03-10T15:40:08.112 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph orch apply mon --unmanaged 2026-03-10T15:40:08.474 INFO:tasks.workunit.client.0.vm01.stdout:Scheduled mon update... 2026-03-10T15:40:08.538 INFO:tasks.workunit.client.0.vm01.stderr:+ for t in mon mgr node-exporter prometheus grafana 2026-03-10T15:40:08.538 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph orch apply mgr --unmanaged 2026-03-10T15:40:08.874 INFO:tasks.workunit.client.0.vm01.stdout:Scheduled mgr update... 2026-03-10T15:40:09.012 INFO:tasks.workunit.client.0.vm01.stderr:+ for t in mon mgr node-exporter prometheus grafana 2026-03-10T15:40:09.012 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph orch apply node-exporter --unmanaged 2026-03-10T15:40:09.424 INFO:tasks.workunit.client.0.vm01.stdout:Scheduled node-exporter update... 
2026-03-10T15:40:09.533 INFO:tasks.workunit.client.0.vm01.stderr:+ for t in mon mgr node-exporter prometheus grafana 2026-03-10T15:40:09.533 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph orch apply prometheus --unmanaged 2026-03-10T15:40:09.932 INFO:tasks.workunit.client.0.vm01.stdout:Scheduled prometheus update... 2026-03-10T15:40:10.088 INFO:tasks.workunit.client.0.vm01.stderr:+ for t in mon mgr node-exporter prometheus grafana 2026-03-10T15:40:10.088 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph orch apply grafana --unmanaged 2026-03-10T15:40:10.584 INFO:tasks.workunit.client.0.vm01.stdout:Scheduled grafana update... 
2026-03-10T15:40:10.652 INFO:tasks.workunit.client.0.vm01.stderr:+ jq '.[]' 2026-03-10T15:40:10.653 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ls 2026-03-10T15:40:10.655 INFO:tasks.workunit.client.0.vm01.stderr:+ jq 'select(.name == "mon.a").fsid' 2026-03-10T15:40:10.655 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 00000000-0000-0000-0000-0000deadbeef 2026-03-10T15:40:15.514 INFO:tasks.workunit.client.0.vm01.stdout:"00000000-0000-0000-0000-0000deadbeef" 2026-03-10T15:40:15.514 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ls 2026-03-10T15:40:15.516 INFO:tasks.workunit.client.0.vm01.stderr:+ jq '.[]' 2026-03-10T15:40:15.516 INFO:tasks.workunit.client.0.vm01.stderr:+ jq 'select(.name == "mgr.x").fsid' 2026-03-10T15:40:15.517 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 00000000-0000-0000-0000-0000deadbeef 2026-03-10T15:40:20.734 INFO:tasks.workunit.client.0.vm01.stdout:"00000000-0000-0000-0000-0000deadbeef" 2026-03-10T15:40:20.734 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ls 2026-03-10T15:40:20.734 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q '\.' 
2026-03-10T15:40:20.735 INFO:tasks.workunit.client.0.vm01.stderr:+ jq 'select(.name == "mon.a").version' 2026-03-10T15:40:20.739 INFO:tasks.workunit.client.0.vm01.stderr:+ jq '.[]' 2026-03-10T15:40:24.719 INFO:tasks.workunit.client.0.vm01.stderr:+ cp tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA tmp.test_cephadm.sh.dSdiir/tmp.9JMNgNKnno 2026-03-10T15:40:24.720 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'public addrv = [v2:127.0.0.1:3301,v1:127.0.0.1:6790]' 2026-03-10T15:40:24.721 INFO:tasks.workunit.client.0.vm01.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name mon.b --arg keyring /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/keyring --arg config tmp.test_cephadm.sh.dSdiir/tmp.9JMNgNKnno '{"fsid": $fsid, "name": $name, "params":{"keyring": $keyring, "config": $config}}' 2026-03-10T15:40:24.721 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid _orch deploy 2026-03-10T15:40:24.827 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-mon-b 2026-03-10T15:40:24.827 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:40:24.827 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-mon-b 2026-03-10T15:40:24.836 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-mon.b 2026-03-10T15:40:24.836 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:40:24.836 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-mon.b 2026-03-10T15:40:24.836 INFO:tasks.workunit.client.0.vm01.stderr:Deploy daemon 
mon.b ... 2026-03-10T15:40:25.480 INFO:tasks.workunit.client.0.vm01.stderr:+ for u in ceph-$FSID@mon.b 2026-03-10T15:40:25.481 INFO:tasks.workunit.client.0.vm01.stderr:+ systemctl is-enabled ceph-00000000-0000-0000-0000-0000deadbeef@mon.b 2026-03-10T15:40:25.483 INFO:tasks.workunit.client.0.vm01.stdout:enabled 2026-03-10T15:40:25.483 INFO:tasks.workunit.client.0.vm01.stderr:+ systemctl is-active ceph-00000000-0000-0000-0000-0000deadbeef@mon.b 2026-03-10T15:40:25.484 INFO:tasks.workunit.client.0.vm01.stdout:active 2026-03-10T15:40:25.485 INFO:tasks.workunit.client.0.vm01.stderr:+ cond='sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph mon stat | grep '\''2 mons'\''' 2026-03-10T15:40:25.485 INFO:tasks.workunit.client.0.vm01.stderr:+ is_available mon.b 'sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph mon stat | grep '\''2 mons'\''' 30 2026-03-10T15:40:25.485 INFO:tasks.workunit.client.0.vm01.stderr:+ local name=mon.b 2026-03-10T15:40:25.485 INFO:tasks.workunit.client.0.vm01.stderr:+ local 'condition=sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph mon stat | grep '\''2 mons'\''' 2026-03-10T15:40:25.485 INFO:tasks.workunit.client.0.vm01.stderr:+ local tries=30 2026-03-10T15:40:25.485 INFO:tasks.workunit.client.0.vm01.stderr:+ local num=0 2026-03-10T15:40:25.485 INFO:tasks.workunit.client.0.vm01.stderr:+ eval 'sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config 
tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph mon stat | grep '\''2 mons'\''' 2026-03-10T15:40:25.485 INFO:tasks.workunit.client.0.vm01.stderr:++ grep '2 mons' 2026-03-10T15:40:25.485 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph mon stat 2026-03-10T15:40:32.161 INFO:tasks.workunit.client.0.vm01.stdout:e2: 2 mons at {a=[v2:127.0.0.1:3300/0,v1:127.0.0.1:6789/0],b=[v2:127.0.0.1:3301/0,v1:127.0.0.1:6790/0]} removed_ranks: {} disallowed_leaders: {}, election epoch 10, leader 0 a, quorum 0,1 a,b 2026-03-10T15:40:32.161 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'mon.b is available' 2026-03-10T15:40:32.161 INFO:tasks.workunit.client.0.vm01.stdout:mon.b is available 2026-03-10T15:40:32.161 INFO:tasks.workunit.client.0.vm01.stderr:+ true 2026-03-10T15:40:32.161 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph auth get-or-create mgr.y mon 'allow profile mgr' osd 'allow *' mds 'allow *' 2026-03-10T15:40:32.604 INFO:tasks.workunit.client.0.vm01.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name mgr.y --arg keyring tmp.test_cephadm.sh.dSdiir/keyring.mgr.y --arg config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA '{"fsid": $fsid, "name": $name, "params":{"keyring": $keyring, "config": $config}}' 2026-03-10T15:40:32.604 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid _orch deploy 2026-03-10T15:40:32.714 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container 
inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-mgr-y 2026-03-10T15:40:32.714 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:40:32.714 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-mgr-y 2026-03-10T15:40:32.723 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-mgr.y 2026-03-10T15:40:32.723 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:40:32.723 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-mgr.y 2026-03-10T15:40:32.723 INFO:tasks.workunit.client.0.vm01.stderr:Deploy daemon mgr.y ... 2026-03-10T15:40:33.245 INFO:tasks.workunit.client.0.vm01.stderr:+ for u in ceph-$FSID@mgr.y 2026-03-10T15:40:33.245 INFO:tasks.workunit.client.0.vm01.stderr:+ systemctl is-enabled ceph-00000000-0000-0000-0000-0000deadbeef@mgr.y 2026-03-10T15:40:33.247 INFO:tasks.workunit.client.0.vm01.stdout:enabled 2026-03-10T15:40:33.248 INFO:tasks.workunit.client.0.vm01.stderr:+ systemctl is-active ceph-00000000-0000-0000-0000-0000deadbeef@mgr.y 2026-03-10T15:40:33.249 INFO:tasks.workunit.client.0.vm01.stdout:active 2026-03-10T15:40:33.249 INFO:tasks.workunit.client.0.vm01.stderr:++ seq 1 30 2026-03-10T15:40:33.250 INFO:tasks.workunit.client.0.vm01.stderr:+ for f in `seq 1 30` 2026-03-10T15:40:33.250 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph -s -f json-pretty 2026-03-10T15:40:33.250 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q 1 
2026-03-10T15:40:33.253 INFO:tasks.workunit.client.0.vm01.stderr:+ jq .mgrmap.num_standbys 2026-03-10T15:40:33.755 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 1 2026-03-10T15:40:34.758 INFO:tasks.workunit.client.0.vm01.stderr:+ for f in `seq 1 30` 2026-03-10T15:40:34.759 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph -s -f json-pretty 2026-03-10T15:40:34.763 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q 1 2026-03-10T15:40:34.763 INFO:tasks.workunit.client.0.vm01.stderr:+ jq .mgrmap.num_standbys 2026-03-10T15:40:35.286 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 1 2026-03-10T15:40:36.287 INFO:tasks.workunit.client.0.vm01.stderr:+ for f in `seq 1 30` 2026-03-10T15:40:36.287 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph -s -f json-pretty 2026-03-10T15:40:36.288 INFO:tasks.workunit.client.0.vm01.stderr:+ jq .mgrmap.num_standbys 2026-03-10T15:40:36.288 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q 1 2026-03-10T15:40:36.769 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 1 2026-03-10T15:40:37.770 INFO:tasks.workunit.client.0.vm01.stderr:+ for f in `seq 1 30` 2026-03-10T15:40:37.770 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph -s -f json-pretty 2026-03-10T15:40:37.770 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q 1 2026-03-10T15:40:37.771 INFO:tasks.workunit.client.0.vm01.stderr:+ jq 
.mgrmap.num_standbys 2026-03-10T15:40:38.185 INFO:tasks.workunit.client.0.vm01.stderr:+ break 2026-03-10T15:40:38.185 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph -s -f json-pretty 2026-03-10T15:40:38.185 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q 1 2026-03-10T15:40:38.186 INFO:tasks.workunit.client.0.vm01.stderr:+ jq .mgrmap.num_standbys 2026-03-10T15:40:38.615 INFO:tasks.workunit.client.0.vm01.stderr:+ dd if=/dev/zero of=tmp.test_cephadm.sh.dSdiir/test_cephadm_osd.img bs=1 count=0 seek=6G 2026-03-10T15:40:38.616 INFO:tasks.workunit.client.0.vm01.stderr:0+0 records in 2026-03-10T15:40:38.616 INFO:tasks.workunit.client.0.vm01.stderr:0+0 records out 2026-03-10T15:40:38.616 INFO:tasks.workunit.client.0.vm01.stderr:0 bytes copied, 4.841e-05 s, 0.0 kB/s 2026-03-10T15:40:38.617 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo losetup -f 2026-03-10T15:40:38.622 INFO:tasks.workunit.client.0.vm01.stderr:+ loop_dev=/dev/loop3 2026-03-10T15:40:38.622 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo vgremove -f test_cephadm 2026-03-10T15:40:38.632 INFO:tasks.workunit.client.0.vm01.stderr: Volume group "test_cephadm" not found 2026-03-10T15:40:38.632 INFO:tasks.workunit.client.0.vm01.stderr: Cannot process volume group test_cephadm 2026-03-10T15:40:38.665 INFO:tasks.workunit.client.0.vm01.stderr:+ true 2026-03-10T15:40:38.665 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo losetup /dev/loop3 tmp.test_cephadm.sh.dSdiir/test_cephadm_osd.img 2026-03-10T15:40:38.685 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo pvcreate /dev/loop3 2026-03-10T15:40:38.731 INFO:tasks.workunit.client.0.vm01.stdout: Physical volume "/dev/loop3" successfully created. 
2026-03-10T15:40:38.769 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo vgcreate test_cephadm /dev/loop3 2026-03-10T15:40:38.841 INFO:tasks.workunit.client.0.vm01.stdout: Volume group "test_cephadm" successfully created 2026-03-10T15:40:38.875 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph auth get client.bootstrap-osd 2026-03-10T15:40:39.294 INFO:tasks.workunit.client.0.vm01.stderr:++ seq 0 1 2026-03-10T15:40:39.295 INFO:tasks.workunit.client.0.vm01.stderr:+ for id in `seq 0 $((--OSD_TO_CREATE))` 2026-03-10T15:40:39.295 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo lvcreate -l 50%VG -n test_cephadm.0 test_cephadm 2026-03-10T15:40:39.377 INFO:tasks.workunit.client.0.vm01.stdout: Logical volume "test_cephadm.0" created. 2026-03-10T15:40:39.413 INFO:tasks.workunit.client.0.vm01.stderr:+ for id in `seq 0 $((--OSD_TO_CREATE))` 2026-03-10T15:40:39.413 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo lvcreate -l 50%VG -n test_cephadm.1 test_cephadm 2026-03-10T15:40:39.475 INFO:tasks.workunit.client.0.vm01.stdout: Logical volume "test_cephadm.1" created. 
2026-03-10T15:40:39.509 INFO:tasks.workunit.client.0.vm01.stderr:++ seq 0 1 2026-03-10T15:40:39.510 INFO:tasks.workunit.client.0.vm01.stderr:+ for id in `seq 0 $((--OSD_TO_CREATE))` 2026-03-10T15:40:39.510 INFO:tasks.workunit.client.0.vm01.stderr:+ device_name=/dev/test_cephadm/test_cephadm.0 2026-03-10T15:40:39.510 INFO:tasks.workunit.client.0.vm01.stderr:+ CEPH_VOLUME='sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/keyring.bootstrap.osd --' 2026-03-10T15:40:39.510 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/keyring.bootstrap.osd -- lvm prepare --bluestore --data /dev/test_cephadm/test_cephadm.0 --no-systemd 2026-03-10T15:40:43.831 INFO:tasks.workunit.client.0.vm01.stdout: 2026-03-10T15:40:43.842 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/keyring.bootstrap.osd -- lvm batch --no-auto /dev/test_cephadm/test_cephadm.0 --yes --no-systemd 2026-03-10T15:40:44.295 INFO:tasks.workunit.client.0.vm01.stdout: 2026-03-10T15:40:44.312 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/keyring.bootstrap.osd -- lvm list --format json /dev/test_cephadm/test_cephadm.0 2026-03-10T15:40:44.826 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo cat tmp.test_cephadm.sh.dSdiir/osd.map 2026-03-10T15:40:44.826 
INFO:tasks.workunit.client.0.vm01.stderr:++ jq -cr '.. | ."ceph.osd_id"? | select(.)' 2026-03-10T15:40:44.835 INFO:tasks.workunit.client.0.vm01.stderr:+ osd_id=0 2026-03-10T15:40:44.835 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo cat tmp.test_cephadm.sh.dSdiir/osd.map 2026-03-10T15:40:44.835 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -cr '.. | ."ceph.osd_fsid"? | select(.)' 2026-03-10T15:40:44.845 INFO:tasks.workunit.client.0.vm01.stderr:+ osd_fsid=11fc08f6-b35b-4c01-b71a-ce5565f8458e 2026-03-10T15:40:44.845 INFO:tasks.workunit.client.0.vm01.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name osd.0 --arg keyring tmp.test_cephadm.sh.dSdiir/keyring.bootstrap.osd --arg config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --arg osd_fsid 11fc08f6-b35b-4c01-b71a-ce5565f8458e '{"fsid": $fsid, "name": $name, "params":{"keyring": $keyring, "config": $config, "osd_fsid": $osd_fsid}}' 2026-03-10T15:40:44.845 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid _orch deploy 2026-03-10T15:40:44.952 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-osd-0 2026-03-10T15:40:44.952 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:40:44.952 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-osd-0 2026-03-10T15:40:44.960 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-osd.0 2026-03-10T15:40:44.960 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:40:44.960 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: 
ceph-00000000-0000-0000-0000-0000deadbeef-osd.0 2026-03-10T15:40:44.960 INFO:tasks.workunit.client.0.vm01.stderr:Deploy daemon osd.0 ... 2026-03-10T15:40:45.895 INFO:tasks.workunit.client.0.vm01.stderr:+ for id in `seq 0 $((--OSD_TO_CREATE))` 2026-03-10T15:40:45.895 INFO:tasks.workunit.client.0.vm01.stderr:+ device_name=/dev/test_cephadm/test_cephadm.1 2026-03-10T15:40:45.895 INFO:tasks.workunit.client.0.vm01.stderr:+ CEPH_VOLUME='sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/keyring.bootstrap.osd --' 2026-03-10T15:40:45.895 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/keyring.bootstrap.osd -- lvm prepare --bluestore --data /dev/test_cephadm/test_cephadm.1 --no-systemd 2026-03-10T15:40:49.639 INFO:tasks.workunit.client.0.vm01.stdout: 2026-03-10T15:40:49.660 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/keyring.bootstrap.osd -- lvm batch --no-auto /dev/test_cephadm/test_cephadm.1 --yes --no-systemd 2026-03-10T15:40:50.128 INFO:tasks.workunit.client.0.vm01.stdout: 2026-03-10T15:40:50.145 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/keyring.bootstrap.osd -- lvm list --format json /dev/test_cephadm/test_cephadm.1 2026-03-10T15:40:50.814 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo cat 
tmp.test_cephadm.sh.dSdiir/osd.map 2026-03-10T15:40:50.814 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -cr '.. | ."ceph.osd_id"? | select(.)' 2026-03-10T15:40:50.824 INFO:tasks.workunit.client.0.vm01.stderr:+ osd_id=1 2026-03-10T15:40:50.825 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo cat tmp.test_cephadm.sh.dSdiir/osd.map 2026-03-10T15:40:50.825 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -cr '.. | ."ceph.osd_fsid"? | select(.)' 2026-03-10T15:40:50.834 INFO:tasks.workunit.client.0.vm01.stderr:+ osd_fsid=44643506-1f95-4860-a057-ee722ab0b2a6 2026-03-10T15:40:50.835 INFO:tasks.workunit.client.0.vm01.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name osd.1 --arg keyring tmp.test_cephadm.sh.dSdiir/keyring.bootstrap.osd --arg config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --arg osd_fsid 44643506-1f95-4860-a057-ee722ab0b2a6 '{"fsid": $fsid, "name": $name, "params":{"keyring": $keyring, "config": $config, "osd_fsid": $osd_fsid}}' 2026-03-10T15:40:50.835 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid _orch deploy 2026-03-10T15:40:50.945 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-osd-1 2026-03-10T15:40:50.945 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:40:50.945 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-osd-1 2026-03-10T15:40:50.955 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-osd.1 2026-03-10T15:40:50.955 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:40:50.955 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from 
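After each `lvm prepare`, the script reads the OSD id and fsid back out of ceph-volume's JSON with a recursive-descent jq filter. A self-contained illustration against a made-up, heavily trimmed stand-in for that JSON (the real `lvm list --format json` output carries many more fields):

```shell
# Hypothetical miniature of `ceph-volume lvm list --format json` output
osd_map='{"0": [{"tags": {"ceph.osd_id": "0", "ceph.osd_fsid": "11fc08f6-b35b-4c01-b71a-ce5565f8458e"}}]}'

# `..` descends through every value; `."ceph.osd_id"?` probes each one for the
# dotted key (the `?` suppresses errors on non-objects); select() drops nulls.
osd_id=$(echo "$osd_map" | jq -cr '.. | ."ceph.osd_id"? | select(.)')
osd_fsid=$(echo "$osd_map" | jq -cr '.. | ."ceph.osd_fsid"? | select(.)')
echo "osd_id=$osd_id osd_fsid=$osd_fsid"
```

The recursive descent makes the filter indifferent to where in the nesting ceph-volume places the tags, which is why the script can use one filter for any device.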
daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-osd.1 2026-03-10T15:40:50.955 INFO:tasks.workunit.client.0.vm01.stderr:Deploy daemon osd.1 ... 2026-03-10T15:40:51.745 INFO:tasks.workunit.client.0.vm01.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name node-exporter.a '{"fsid": $fsid, "name": $name}' 2026-03-10T15:40:51.745 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm _orch deploy 2026-03-10T15:40:51.893 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-node-exporter-a 2026-03-10T15:40:51.893 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:40:51.893 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-node-exporter-a 2026-03-10T15:40:51.905 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-node-exporter.a 2026-03-10T15:40:51.905 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:40:51.906 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-node-exporter.a 2026-03-10T15:40:51.906 INFO:tasks.workunit.client.0.vm01.stderr:Deploy daemon node-exporter.a ... 
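Each `cephadm _orch deploy` call in the trace reads a JSON spec on stdin, assembled with `jq --null-input`. For the simple node-exporter case the spec reduces to the following; the fsid and daemon name are copied from the trace, and only the JSON construction is shown, not the deploy itself:

```shell
# Build the minimal deploy spec the script pipes into `cephadm _orch deploy`
spec=$(jq --null-input \
    --arg fsid 00000000-0000-0000-0000-0000deadbeef \
    --arg name node-exporter.a \
    '{"fsid": $fsid, "name": $name}')
echo "$spec"
```

Daemons that need more context (the OSDs, prometheus, grafana, nfs) extend the same shape with `params` and `config_blobs` keys, as the later jq invocations in the trace show.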
2026-03-10T15:40:52.770 INFO:tasks.workunit.client.0.vm01.stderr:+ cond='curl '\''http://localhost:9100'\'' | grep -q '\''Node Exporter'\''' 2026-03-10T15:40:52.770 INFO:tasks.workunit.client.0.vm01.stderr:+ is_available node-exporter 'curl '\''http://localhost:9100'\'' | grep -q '\''Node Exporter'\''' 10 2026-03-10T15:40:52.770 INFO:tasks.workunit.client.0.vm01.stderr:+ local name=node-exporter 2026-03-10T15:40:52.770 INFO:tasks.workunit.client.0.vm01.stderr:+ local 'condition=curl '\''http://localhost:9100'\'' | grep -q '\''Node Exporter'\''' 2026-03-10T15:40:52.770 INFO:tasks.workunit.client.0.vm01.stderr:+ local tries=10 2026-03-10T15:40:52.770 INFO:tasks.workunit.client.0.vm01.stderr:+ local num=0 2026-03-10T15:40:52.770 INFO:tasks.workunit.client.0.vm01.stderr:+ eval 'curl '\''http://localhost:9100'\'' | grep -q '\''Node Exporter'\''' 2026-03-10T15:40:52.770 INFO:tasks.workunit.client.0.vm01.stderr:++ curl http://localhost:9100 2026-03-10T15:40:52.770 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -q 'Node Exporter' 2026-03-10T15:40:52.811 INFO:tasks.workunit.client.0.vm01.stderr: % Total % Received % Xferd Average Speed Time Time Time Current 2026-03-10T15:40:52.811 INFO:tasks.workunit.client.0.vm01.stderr: Dload Upload Total Spent Left Speed 2026-03-10T15:40:52.815 INFO:tasks.workunit.client.0.vm01.stderr: 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 2026-03-10T15:40:52.815 INFO:tasks.workunit.client.0.vm01.stderr:curl: (7) Failed to connect to localhost port 9100 after 2 ms: Connection refused 2026-03-10T15:40:52.815 INFO:tasks.workunit.client.0.vm01.stderr:+ num=1 2026-03-10T15:40:52.816 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 1 -ge 10 ']' 2026-03-10T15:40:52.816 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 5 2026-03-10T15:40:57.813 INFO:tasks.workunit.client.0.vm01.stderr:+ eval 'curl '\''http://localhost:9100'\'' | grep -q '\''Node Exporter'\''' 2026-03-10T15:40:57.813 
INFO:tasks.workunit.client.0.vm01.stderr:++ grep -q 'Node Exporter' 2026-03-10T15:40:57.814 INFO:tasks.workunit.client.0.vm01.stderr:++ curl http://localhost:9100 2026-03-10T15:40:57.819 INFO:tasks.workunit.client.0.vm01.stderr: % Total % Received % Xferd Average Speed Time Time Time Current 2026-03-10T15:40:57.819 INFO:tasks.workunit.client.0.vm01.stderr: Dload Upload Total Spent Left Speed 2026-03-10T15:40:57.819 INFO:tasks.workunit.client.0.vm01.stderr: 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 150 100 150 0 0 353k 0 --:--:-- --:--:-- --:--:-- 146k 2026-03-10T15:40:57.819 INFO:tasks.workunit.client.0.vm01.stdout:node-exporter is available 2026-03-10T15:40:57.820 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'node-exporter is available' 2026-03-10T15:40:57.820 INFO:tasks.workunit.client.0.vm01.stderr:+ true 2026-03-10T15:40:57.820 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm _orch deploy 2026-03-10T15:40:57.820 INFO:tasks.workunit.client.0.vm01.stderr:++ cat /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples/prometheus.json 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name prometheus.a --argjson config_blobs '{ 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr: "files": { 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr: "prometheus.yml": [ 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr: "global:", 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr: " scrape_interval: 5s", 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr: " evaluation_interval: 10s", 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr: "", 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr: "rule_files: ", 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr: " - '\''/etc/prometheus/alerting/*'\''", 2026-03-10T15:40:57.826 
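The `is_available` helper traced above polls an arbitrary condition until it passes or the retry budget runs out. A reconstruction matching the variables visible in the trace (the retry interval is shortened from the log's 5s for the demo call at the end):

```shell
# Poll `condition` up to `tries` times; the trace shows the same
# name/condition/tries/num locals and the same eval-then-count structure.
is_available() {
    local name=$1
    local condition=$2
    local tries=$3
    local num=0
    while ! eval "$condition"; do
        num=$((num + 1))
        if [ "$num" -ge "$tries" ]; then
            echo "$name is not available after $tries tries" >&2
            return 1
        fi
        sleep 0.1   # the real script sleeps 5s between attempts
    done
    echo "$name is available"
}

# Demo: a condition that succeeds immediately
is_available demo true 3
```

In the log, the first curl to port 9100 is refused (the container is still starting), the loop sleeps once, and the second attempt matches "Node Exporter" — exactly the success path above.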
INFO:tasks.workunit.client.0.vm01.stderr: "", 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr: "scrape_configs:", 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr: " - job_name: '\''prometheus'\''", 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr: " static_configs:", 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr: " - targets: ['\''localhost:9095'\'']" 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr: ] 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr: } 2026-03-10T15:40:57.826 INFO:tasks.workunit.client.0.vm01.stderr:}' '{"fsid": $fsid, "name": $name, "config_blobs": $config_blobs}' 2026-03-10T15:40:58.014 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-prometheus-a 2026-03-10T15:40:58.014 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:40:58.014 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-prometheus-a 2026-03-10T15:40:58.026 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-prometheus.a 2026-03-10T15:40:58.026 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:40:58.026 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-prometheus.a 2026-03-10T15:40:58.026 INFO:tasks.workunit.client.0.vm01.stderr:Deploy daemon prometheus.a ... 
2026-03-10T15:41:04.272 INFO:tasks.workunit.client.0.vm01.stderr:+ cond='curl '\''localhost:9095/api/v1/query?query=up'\''' 2026-03-10T15:41:04.272 INFO:tasks.workunit.client.0.vm01.stderr:+ is_available prometheus 'curl '\''localhost:9095/api/v1/query?query=up'\''' 10 2026-03-10T15:41:04.272 INFO:tasks.workunit.client.0.vm01.stderr:+ local name=prometheus 2026-03-10T15:41:04.272 INFO:tasks.workunit.client.0.vm01.stderr:+ local 'condition=curl '\''localhost:9095/api/v1/query?query=up'\''' 2026-03-10T15:41:04.272 INFO:tasks.workunit.client.0.vm01.stderr:+ local tries=10 2026-03-10T15:41:04.272 INFO:tasks.workunit.client.0.vm01.stderr:+ local num=0 2026-03-10T15:41:04.272 INFO:tasks.workunit.client.0.vm01.stderr:+ eval 'curl '\''localhost:9095/api/v1/query?query=up'\''' 2026-03-10T15:41:04.272 INFO:tasks.workunit.client.0.vm01.stderr:++ curl 'localhost:9095/api/v1/query?query=up' 2026-03-10T15:41:04.280 INFO:tasks.workunit.client.0.vm01.stderr: % Total % Received % Xferd Average Speed Time Time Time Current 2026-03-10T15:41:04.280 INFO:tasks.workunit.client.0.vm01.stderr: Dload Upload Total Spent Left Speed 2026-03-10T15:41:04.283 INFO:tasks.workunit.client.0.vm01.stderr: 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 2026-03-10T15:41:04.285 INFO:tasks.workunit.client.0.vm01.stderr:curl: (7) Failed to connect to localhost port 9095 after 6 ms: Connection refused 2026-03-10T15:41:04.285 INFO:tasks.workunit.client.0.vm01.stderr:+ num=1 2026-03-10T15:41:04.285 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 1 -ge 10 ']' 2026-03-10T15:41:04.285 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 5 2026-03-10T15:41:09.286 INFO:tasks.workunit.client.0.vm01.stderr:+ eval 'curl '\''localhost:9095/api/v1/query?query=up'\''' 2026-03-10T15:41:09.287 INFO:tasks.workunit.client.0.vm01.stderr:++ curl 'localhost:9095/api/v1/query?query=up' 2026-03-10T15:41:09.290 INFO:tasks.workunit.client.0.vm01.stderr: % Total % Received % Xferd Average 
Speed Time Time Time Current 2026-03-10T15:41:09.290 INFO:tasks.workunit.client.0.vm01.stderr: Dload Upload Total Spent Left Speed 2026-03-10T15:41:09.290 INFO:tasks.workunit.client.0.vm01.stderr: 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 63 100 63 0 0 86657 0 --:--:-- --:--:-- --:--:-- 63000 2026-03-10T15:41:09.291 INFO:tasks.workunit.client.0.vm01.stdout:{"status":"success","data":{"resultType":"vector","result":[]}}prometheus is available 2026-03-10T15:41:09.291 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'prometheus is available' 2026-03-10T15:41:09.291 INFO:tasks.workunit.client.0.vm01.stderr:+ true 2026-03-10T15:41:09.291 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm _orch deploy 2026-03-10T15:41:09.291 INFO:tasks.workunit.client.0.vm01.stderr:++ cat /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples/grafana.json 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name grafana.a --argjson config_blobs '{ 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "files": { 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "grafana.ini": [ 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "[users]", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " default_theme = light", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "[auth.anonymous]", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " enabled = true", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " org_name = '\''Main Org.'\''", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " org_role = '\''Viewer'\''", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "[server]", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " domain = '\''bootstrap.storage.lab'\''", 2026-03-10T15:41:09.293 
INFO:tasks.workunit.client.0.vm01.stderr: " protocol = https", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " cert_file = /etc/grafana/certs/cert_file", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " cert_key = /etc/grafana/certs/cert_key", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " http_port = 3000", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " http_addr = localhost", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "[security]", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " admin_user = admin", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " admin_password = admin", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " allow_embedding = true" 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: ], 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "provisioning/datasources/ceph-dashboard.yml": [ 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "deleteDatasources:", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " - name: '\''Dashboard'\''", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " orgId: 1", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " ", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "datasources:", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " - name: '\''Dashboard'\''", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " type: '\''prometheus'\''", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " access: '\''proxy'\''", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " orgId: 1", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " url: '\''http://localhost:9095'\''", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " basicAuth: false", 2026-03-10T15:41:09.293 
INFO:tasks.workunit.client.0.vm01.stderr: " isDefault: true", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: " editable: false" 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: ], 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "certs/cert_file": [ 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "-----BEGIN CERTIFICATE-----", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "MIIDLTCCAhWgAwIBAgIUEH0mq6u93LKsWlNXst5pxWcuqkQwDQYJKoZIhvcNAQEL", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "BQAwJjELMAkGA1UECgwCSVQxFzAVBgNVBAMMDmNlcGgtZGFzaGJvYXJkMB4XDTIw", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "MDEwNTIyNDYyMFoXDTMwMDEwMjIyNDYyMFowJjELMAkGA1UECgwCSVQxFzAVBgNV", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "BAMMDmNlcGgtZGFzaGJvYXJkMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKC", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "AQEAqxh6eO0NTZJe+DoKZG/kozJCf+83eB3gWzwXoNinRmV/49f5WPR20DIxAe0R", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "saO6XynJXTrhvXT1bsARUq+LSmjWNFoYXopFuOJhGdWn4dmpuHwtpcFv2kjzNOKj", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "U2EG8j6bsRp1jFAzn7kdbSWT0UHySRXp9DPAjDiF3LjykMXiJMReccFXrB1pRi93", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "nJxED8d6oT5GazGB44svb+Zi6ABamZu5SDJC1Fr/O5rWFNQkH4hQEqDPj1817H9O", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "sm0mZiNy77ZQuAzOgZN153L3QOsyJismwNHfAMGMH9mzPKOjyhc13VlZyeEzml8p", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "ZpWQ2gi8P2r/FAr8bFL3MFnHKwIDAQABo1MwUTAdBgNVHQ4EFgQUZg3v7MX4J+hx", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "w3HENCrUkMK8tbwwHwYDVR0jBBgwFoAUZg3v7MX4J+hxw3HENCrUkMK8tbwwDwYD", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: 
"VR0TAQH/BAUwAwEB/zANBgkqhkiG9w0BAQsFAAOCAQEAaR/XPGKwUgVwH3KXAb6+", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "s9NTAt6lCmFdQz1ngoqFSizW7KGSXnOgd6xTiUCR0Tjjo2zKCwhIINaI6mwqMbrg", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "BOjb7diaqwFaitRs27AtdmaqMGndUqEBUn/k64Ld3VPGL4p0W2W+tXsyzZg1qQIn", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "JXb7c4+oWzXny7gHFheYQTwnHzDcNOf9vJiMGyYYvU1xTOGucu6dwtOVDDe1Z4Nq", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "AyIYWDScRr2FeAOXyx4aW2v5bjpTxvP+79/OOBbQ+p4y5F4PDrPeOSweGoo6huTR", 2026-03-10T15:41:09.293 INFO:tasks.workunit.client.0.vm01.stderr: "+T+YI9Jfw2XCgV7NHWhfdt3fHHwUQzO6WszWU557pmCODLvXWsQ8P+GRiG7Nywm3", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "uA==", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "-----END CERTIFICATE-----" 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: ], 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "certs/cert_key": [ 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "-----BEGIN PRIVATE KEY-----", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "MIIEvAIBADANBgkqhkiG9w0BAQEFAASCBKYwggSiAgEAAoIBAQCrGHp47Q1Nkl74", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "Ogpkb+SjMkJ/7zd4HeBbPBeg2KdGZX/j1/lY9HbQMjEB7RGxo7pfKcldOuG9dPVu", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "wBFSr4tKaNY0WhheikW44mEZ1afh2am4fC2lwW/aSPM04qNTYQbyPpuxGnWMUDOf", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "uR1tJZPRQfJJFen0M8CMOIXcuPKQxeIkxF5xwVesHWlGL3ecnEQPx3qhPkZrMYHj", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "iy9v5mLoAFqZm7lIMkLUWv87mtYU1CQfiFASoM+PXzXsf06ybSZmI3LvtlC4DM6B", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "k3XncvdA6zImKybA0d8AwYwf2bM8o6PKFzXdWVnJ4TOaXylmlZDaCLw/av8UCvxs", 2026-03-10T15:41:09.294 
INFO:tasks.workunit.client.0.vm01.stderr: "UvcwWccrAgMBAAECggEAeBv0BiYrm5QwdUORfhaKxAIJavRM1Vbr5EBYOgM90o54", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "bEN2ePsM2XUSsE5ziGfu8tVL1dX7GNwdW8UbpBc1ymO0VAYXa27YKUVKcy9o7oS1", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "v5v1E5Kq6esiSLL9gw/vJ2nKNFblxD2dL/hs7u1dSp5n7uSiW1tlRUp8toljRzts", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "1Cenp0J/a82HwWDE8j/H9NvitTOZ2cdwJ76V8GkBynlvr2ARjRfZGx0WXEJmoZYD", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "YUQVU303DB6Q2tkFco4LbPofkuhhMPhXsz3fZ/blHj/c78tqP9L5sQ29oqoPE1pS", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "DBOwKC/eoi5FY34RdLNL0dKq9MzbuYqEcCfZOJgxoQKBgQDf+5XF+aXQz2OmSaj6", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "1Yr+3KAKdfX/AYp22X1Wy4zWcZlgujgwQ1FG0zay8HVBM0/xn4UgOtcKCoXibePh", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "ag1t8aZINdRE1JcMzKmZoSvU9Xk30CNvygizuJVEKsJFPDbPzCpauDSplzcQb4pZ", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "wepucPuowkPMBx0iU3x0qSThWwKBgQDDjYs7d30xxSqWWXyCOZshy7UtHMNfqP15", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "kDfTXIZzuHvDf6ZNci10VY1eDZbpZfHgc6x1ElbKv2H4dYsgkENJZUi1YQDpVPKq", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "4N5teNykgAuagiR7dRFltSju3S7hIE6HInTv3hShaFPymlEE7zuBMuEUcuvYz5YN", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "RjxsvypKcQKBgCuuV+Y1KqZPW8K5SNAqRyIvCrMfkCr8NPG6tpvvtHa5zsyzZHPd", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "HQOv+1HoXSWrCSM5FfBUKU3XAYdIIRH76cSQRPp+LPiDcTXY0Baa/P5aJRrCZ7bM", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "cugBznJt2FdCR/o8eeIZXIPabq2w4w1gKQUC2cFuqWQn2wGvwGzL89pTAoGAAfpx", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "mSVpT9KVzrWTC+I3To04BP/QfixAfDVYSzwZZBxOrDijXw8zpISlDHmIuE2+t62T", 
2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "5g9Mb3qmLBRMVwT+mUR8CtGzZ6jjV5U0yti5KrTc6TA93D3f8i51/oygR8jC4p0X", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "n8GYZdWfW8nx3eHpsTHpkwJinmvjMbkvLU51yBECgYAnUAMyhNOWjbYS5QWd8i1W", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "SFQansVDeeT98RebrzmGwlgrCImHItJz0Tz8gkNB3+S2B2balqT0WHaDxQ8vCtwX", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "xB4wd+gMomgdYtHGRnRwj1UyRXDk0c1TgGdRjOn3URaezBMibHTQSbFgPciJgAuU", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "mEl75h1ToBX9yvnH39o50g==", 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: "-----END PRIVATE KEY-----" 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: ] 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr: } 2026-03-10T15:41:09.294 INFO:tasks.workunit.client.0.vm01.stderr:}' '{"fsid": $fsid, "name": $name, "config_blobs": $config_blobs}' 2026-03-10T15:41:09.396 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-grafana-a 2026-03-10T15:41:09.396 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:41:09.396 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-grafana-a 2026-03-10T15:41:09.404 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-grafana.a 2026-03-10T15:41:09.404 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:41:09.404 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-grafana.a 2026-03-10T15:41:09.404 
INFO:tasks.workunit.client.0.vm01.stderr:Deploy daemon grafana.a ... 2026-03-10T15:41:18.430 INFO:tasks.workunit.client.0.vm01.stderr:+ cond='curl --insecure '\''https://localhost:3000'\'' | grep -q '\''grafana'\''' 2026-03-10T15:41:18.430 INFO:tasks.workunit.client.0.vm01.stderr:+ is_available grafana 'curl --insecure '\''https://localhost:3000'\'' | grep -q '\''grafana'\''' 50 2026-03-10T15:41:18.430 INFO:tasks.workunit.client.0.vm01.stderr:+ local name=grafana 2026-03-10T15:41:18.430 INFO:tasks.workunit.client.0.vm01.stderr:+ local 'condition=curl --insecure '\''https://localhost:3000'\'' | grep -q '\''grafana'\''' 2026-03-10T15:41:18.430 INFO:tasks.workunit.client.0.vm01.stderr:+ local tries=50 2026-03-10T15:41:18.430 INFO:tasks.workunit.client.0.vm01.stderr:+ local num=0 2026-03-10T15:41:18.431 INFO:tasks.workunit.client.0.vm01.stderr:+ eval 'curl --insecure '\''https://localhost:3000'\'' | grep -q '\''grafana'\''' 2026-03-10T15:41:18.431 INFO:tasks.workunit.client.0.vm01.stderr:++ curl --insecure https://localhost:3000 2026-03-10T15:41:18.434 INFO:tasks.workunit.client.0.vm01.stderr: % Total % Received % Xferd Average Speed Time Time Time Current 2026-03-10T15:41:18.434 INFO:tasks.workunit.client.0.vm01.stderr: Dload Upload Total Spent Left Speed 2026-03-10T15:41:18.434 INFO:tasks.workunit.client.0.vm01.stderr: 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 2026-03-10T15:41:18.434 INFO:tasks.workunit.client.0.vm01.stderr:curl: (7) Failed to connect to localhost port 3000 after 0 ms: Connection refused 2026-03-10T15:41:18.434 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -q grafana 2026-03-10T15:41:18.436 INFO:tasks.workunit.client.0.vm01.stderr:+ num=1 2026-03-10T15:41:18.436 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 1 -ge 50 ']' 2026-03-10T15:41:18.436 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 5 2026-03-10T15:41:23.437 INFO:tasks.workunit.client.0.vm01.stderr:+ eval 'curl --insecure 
'\''https://localhost:3000'\'' | grep -q '\''grafana'\''' 2026-03-10T15:41:23.437 INFO:tasks.workunit.client.0.vm01.stderr:++ curl --insecure https://localhost:3000 2026-03-10T15:41:23.437 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -q grafana 2026-03-10T15:41:23.440 INFO:tasks.workunit.client.0.vm01.stderr: % Total % Received % Xferd Average Speed Time Time Time Current 2026-03-10T15:41:23.440 INFO:tasks.workunit.client.0.vm01.stderr: Dload Upload Total Spent Left Speed 2026-03-10T15:41:23.451 INFO:tasks.workunit.client.0.vm01.stderr: 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 19155 0 19155 0 0 1597k 0 --:--:-- --:--:-- --:--:-- 1700k 2026-03-10T15:41:23.451 INFO:tasks.workunit.client.0.vm01.stderr:curl: (23) Failure writing output to destination 2026-03-10T15:41:23.452 INFO:tasks.workunit.client.0.vm01.stdout:grafana is available 2026-03-10T15:41:23.452 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'grafana is available' 2026-03-10T15:41:23.452 INFO:tasks.workunit.client.0.vm01.stderr:+ true 2026-03-10T15:41:23.452 INFO:tasks.workunit.client.0.vm01.stderr:+ nfs_stop 2026-03-10T15:41:23.452 INFO:tasks.workunit.client.0.vm01.stderr:+ local 'units=nfs-server nfs-kernel-server' 2026-03-10T15:41:23.452 INFO:tasks.workunit.client.0.vm01.stderr:+ for unit in $units 2026-03-10T15:41:23.452 INFO:tasks.workunit.client.0.vm01.stderr:+ systemctl --no-pager status nfs-server 2026-03-10T15:41:23.456 INFO:tasks.workunit.client.0.vm01.stderr:+ for unit in $units 2026-03-10T15:41:23.456 INFO:tasks.workunit.client.0.vm01.stderr:+ systemctl --no-pager status nfs-kernel-server 2026-03-10T15:41:23.458 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_false 'sudo ss -tlnp '\''( sport = :nfs )'\'' | grep LISTEN' 2026-03-10T15:41:23.458 INFO:tasks.workunit.client.0.vm01.stderr:+ set -x 2026-03-10T15:41:23.458 INFO:tasks.workunit.client.0.vm01.stderr:+ eval 'sudo ss -tlnp '\''( sport = :nfs )'\'' | grep LISTEN' 2026-03-10T15:41:23.458 
INFO:tasks.workunit.client.0.vm01.stderr:++ sudo ss -tlnp '( sport = :nfs )' 2026-03-10T15:41:23.458 INFO:tasks.workunit.client.0.vm01.stderr:++ grep LISTEN 2026-03-10T15:41:23.473 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-10T15:41:23.473 INFO:tasks.workunit.client.0.vm01.stderr:++ cat /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples/nfs.json 2026-03-10T15:41:23.473 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r '.["pool"]' 2026-03-10T15:41:23.482 INFO:tasks.workunit.client.0.vm01.stderr:+ nfs_rados_pool=nfs-ganesha 2026-03-10T15:41:23.483 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph osd pool create nfs-ganesha 64 2026-03-10T15:41:24.437 INFO:tasks.workunit.client.0.vm01.stderr:pool 'nfs-ganesha' created 2026-03-10T15:41:24.525 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- rados --pool nfs-ganesha --namespace nfs-ns create conf-nfs.a 2026-03-10T15:41:25.519 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph orch pause 2026-03-10T15:41:25.909 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid _orch deploy 2026-03-10T15:41:25.909 INFO:tasks.workunit.client.0.vm01.stderr:++ cat /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples/nfs.json 2026-03-10T15:41:25.915 
INFO:tasks.workunit.client.0.vm01.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name nfs.a --arg keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI --arg config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --argjson config_blobs '{ 2026-03-10T15:41:25.916 INFO:tasks.workunit.client.0.vm01.stderr: "pool" : "nfs-ganesha", 2026-03-10T15:41:25.916 INFO:tasks.workunit.client.0.vm01.stderr: "namespace" : "nfs-ns", 2026-03-10T15:41:25.916 INFO:tasks.workunit.client.0.vm01.stderr: "files": { 2026-03-10T15:41:25.916 INFO:tasks.workunit.client.0.vm01.stderr: "ganesha.conf": [ 2026-03-10T15:41:25.916 INFO:tasks.workunit.client.0.vm01.stderr: "RADOS_URLS {", 2026-03-10T15:41:25.916 INFO:tasks.workunit.client.0.vm01.stderr: " userid = admin;", 2026-03-10T15:41:25.916 INFO:tasks.workunit.client.0.vm01.stderr: "}", 2026-03-10T15:41:25.916 INFO:tasks.workunit.client.0.vm01.stderr: "", 2026-03-10T15:41:25.916 INFO:tasks.workunit.client.0.vm01.stderr: "%url rados://nfs-ganesha/nfs-ns/conf-nfs.a", 2026-03-10T15:41:25.916 INFO:tasks.workunit.client.0.vm01.stderr: "" 2026-03-10T15:41:25.916 INFO:tasks.workunit.client.0.vm01.stderr: ], 2026-03-10T15:41:25.916 INFO:tasks.workunit.client.0.vm01.stderr: "idmap.conf": "" 2026-03-10T15:41:25.916 INFO:tasks.workunit.client.0.vm01.stderr: } 2026-03-10T15:41:25.916 INFO:tasks.workunit.client.0.vm01.stderr:}' '{"fsid": $fsid, "name": $name, "params": {"keyring": $keyring, "config": $config}, "config_blobs": $config_blobs}' 2026-03-10T15:41:26.027 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-nfs-a 2026-03-10T15:41:26.027 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:41:26.027 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-nfs-a 2026-03-10T15:41:26.036 
INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-nfs.a 2026-03-10T15:41:26.036 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:41:26.036 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-nfs.a 2026-03-10T15:41:26.036 INFO:tasks.workunit.client.0.vm01.stderr:Deploy daemon nfs.a ... 2026-03-10T15:41:26.149 INFO:tasks.workunit.client.0.vm01.stderr:Verifying port 0.0.0.0:2049 ... 2026-03-10T15:41:26.151 INFO:tasks.workunit.client.0.vm01.stderr:Creating ganesha config... 2026-03-10T15:41:26.151 INFO:tasks.workunit.client.0.vm01.stderr:Write file: /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/nfs.a/etc/ganesha/ganesha.conf 2026-03-10T15:41:26.152 INFO:tasks.workunit.client.0.vm01.stderr:Write file: /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/nfs.a/etc/ganesha/idmap.conf 2026-03-10T15:41:26.596 INFO:tasks.workunit.client.0.vm01.stderr:+ cond='sudo ss -tlnp '\''( sport = :nfs )'\'' | grep '\''ganesha.nfsd'\''' 2026-03-10T15:41:26.596 INFO:tasks.workunit.client.0.vm01.stderr:+ is_available nfs 'sudo ss -tlnp '\''( sport = :nfs )'\'' | grep '\''ganesha.nfsd'\''' 10 2026-03-10T15:41:26.596 INFO:tasks.workunit.client.0.vm01.stderr:+ local name=nfs 2026-03-10T15:41:26.596 INFO:tasks.workunit.client.0.vm01.stderr:+ local 'condition=sudo ss -tlnp '\''( sport = :nfs )'\'' | grep '\''ganesha.nfsd'\''' 2026-03-10T15:41:26.596 INFO:tasks.workunit.client.0.vm01.stderr:+ local tries=10 2026-03-10T15:41:26.596 INFO:tasks.workunit.client.0.vm01.stderr:+ local num=0 2026-03-10T15:41:26.596 INFO:tasks.workunit.client.0.vm01.stderr:+ eval 'sudo ss -tlnp '\''( sport = :nfs )'\'' | grep '\''ganesha.nfsd'\''' 2026-03-10T15:41:26.596 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo ss -tlnp '( sport = :nfs )' 
2026-03-10T15:41:26.600 INFO:tasks.workunit.client.0.vm01.stderr:++ grep ganesha.nfsd 2026-03-10T15:41:26.609 INFO:tasks.workunit.client.0.vm01.stderr:+ num=1 2026-03-10T15:41:26.610 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 1 -ge 10 ']' 2026-03-10T15:41:26.610 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 5 2026-03-10T15:41:31.611 INFO:tasks.workunit.client.0.vm01.stderr:+ eval 'sudo ss -tlnp '\''( sport = :nfs )'\'' | grep '\''ganesha.nfsd'\''' 2026-03-10T15:41:31.611 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo ss -tlnp '( sport = :nfs )' 2026-03-10T15:41:31.611 INFO:tasks.workunit.client.0.vm01.stderr:++ grep ganesha.nfsd 2026-03-10T15:41:31.624 INFO:tasks.workunit.client.0.vm01.stdout:LISTEN 0 4096 *:2049 *:* users:(("ganesha.nfsd",pid=37017,fd=24)) 2026-03-10T15:41:31.624 INFO:tasks.workunit.client.0.vm01.stdout:nfs is available 2026-03-10T15:41:31.624 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'nfs is available' 2026-03-10T15:41:31.624 INFO:tasks.workunit.client.0.vm01.stderr:+ true 2026-03-10T15:41:31.624 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --keyring tmp.test_cephadm.sh.dSdiir/tmp.9q8NGDmkaI -- ceph orch resume 2026-03-10T15:41:32.030 INFO:tasks.workunit.client.0.vm01.stderr:++ cat /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples/custom_container.json 2026-03-10T15:41:32.030 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r .image 2026-03-10T15:41:32.039 INFO:tasks.workunit.client.0.vm01.stderr:+ alertmanager_image=quay.io/prometheus/alertmanager:v0.20.0 2026-03-10T15:41:32.039 INFO:tasks.workunit.client.0.vm01.stderr:++ jq .ports /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples/custom_container.json 2026-03-10T15:41:32.048 INFO:tasks.workunit.client.0.vm01.stderr:+ tcp_ports='[ 
2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: 9093, 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: 9094 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr:]' 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr:++ cat /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephadm/../../../src/cephadm/samples/custom_container.json 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm _orch deploy 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr:+ jq --null-input --arg fsid 00000000-0000-0000-0000-0000deadbeef --arg name container.alertmanager.a --arg keyring tmp.test_cephadm.sh.dSdiir/keyring.bootstrap.osd --arg config tmp.test_cephadm.sh.dSdiir/tmp.vW4QlHNtFA --arg image quay.io/prometheus/alertmanager:v0.20.0 --argjson tcp_ports '[ 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: 9093, 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: 9094 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr:]' --argjson config_blobs '{ 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: "image": "quay.io/prometheus/alertmanager:v0.20.0", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: "ports": [9093, 9094], 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: "args": [ 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: "-p", "9093:9093", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: "-p", "9094:9094" 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: ], 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: "dirs": ["etc/alertmanager"], 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: "files": { 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: "etc/alertmanager/alertmanager.yml": [ 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: "global:", 2026-03-10T15:41:32.049 
INFO:tasks.workunit.client.0.vm01.stderr: " resolve_timeout: 5m", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: "", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: "route:", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: " group_by: ['\''alertname'\'']", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: " group_wait: 10s", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: " group_interval: 10s", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: " repeat_interval: 1h", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: " receiver: '\''web.hook'\''", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: "receivers:", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: "- name: '\''web.hook'\''", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: " webhook_configs:", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: " - url: '\''http://127.0.0.1:5001/'\''", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: "inhibit_rules:", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: " - source_match:", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: " severity: '\''critical'\''", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: " target_match:", 2026-03-10T15:41:32.049 INFO:tasks.workunit.client.0.vm01.stderr: " severity: '\''warning'\''", 2026-03-10T15:41:32.050 INFO:tasks.workunit.client.0.vm01.stderr: " equal: ['\''alertname'\'', '\''dev'\'', '\''instance'\'']" 2026-03-10T15:41:32.050 INFO:tasks.workunit.client.0.vm01.stderr: ] 2026-03-10T15:41:32.050 INFO:tasks.workunit.client.0.vm01.stderr: }, 2026-03-10T15:41:32.050 INFO:tasks.workunit.client.0.vm01.stderr: "volume_mounts": { 2026-03-10T15:41:32.050 INFO:tasks.workunit.client.0.vm01.stderr: "etc/alertmanager": "/etc/alertmanager" 2026-03-10T15:41:32.050 
INFO:tasks.workunit.client.0.vm01.stderr: } 2026-03-10T15:41:32.050 INFO:tasks.workunit.client.0.vm01.stderr:}' '{"fsid": $fsid, "name": $name, "image": $image, "params": {"keyring": $keyring, "config": $config, "tcp_ports": $tcp_ports}, "config_blobs": $config_blobs}' 2026-03-10T15:41:32.150 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-container-alertmanager-a 2026-03-10T15:41:32.150 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:41:32.150 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-container-alertmanager-a 2026-03-10T15:41:32.158 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-container.alertmanager.a 2026-03-10T15:41:32.158 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:41:32.158 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-container.alertmanager.a 2026-03-10T15:41:32.158 INFO:tasks.workunit.client.0.vm01.stderr:Deploy daemon container.alertmanager.a ... 2026-03-10T15:41:32.158 INFO:tasks.workunit.client.0.vm01.stderr:Verifying port 0.0.0.0:9093 ... 2026-03-10T15:41:32.158 INFO:tasks.workunit.client.0.vm01.stderr:Verifying port 0.0.0.0:9094 ... 2026-03-10T15:41:32.159 INFO:tasks.workunit.client.0.vm01.stderr:Verifying port 0.0.0.0:9093 ... 2026-03-10T15:41:32.159 INFO:tasks.workunit.client.0.vm01.stderr:Verifying port 0.0.0.0:9094 ... 2026-03-10T15:41:32.159 INFO:tasks.workunit.client.0.vm01.stderr:Creating custom container configuration dirs/files in /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/container.alertmanager.a ... 
2026-03-10T15:41:32.159 INFO:tasks.workunit.client.0.vm01.stderr:Creating directory: etc/alertmanager 2026-03-10T15:41:32.159 INFO:tasks.workunit.client.0.vm01.stderr:Creating file: etc/alertmanager/alertmanager.yml 2026-03-10T15:41:32.613 INFO:tasks.workunit.client.0.vm01.stderr:+ cond='sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name container.alertmanager.a -- test -f /etc/alertmanager/alertmanager.yml' 2026-03-10T15:41:32.613 INFO:tasks.workunit.client.0.vm01.stderr:+ is_available alertmanager.yml 'sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name container.alertmanager.a -- test -f /etc/alertmanager/alertmanager.yml' 10 2026-03-10T15:41:32.613 INFO:tasks.workunit.client.0.vm01.stderr:+ local name=alertmanager.yml 2026-03-10T15:41:32.613 INFO:tasks.workunit.client.0.vm01.stderr:+ local 'condition=sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name container.alertmanager.a -- test -f /etc/alertmanager/alertmanager.yml' 2026-03-10T15:41:32.613 INFO:tasks.workunit.client.0.vm01.stderr:+ local tries=10 2026-03-10T15:41:32.613 INFO:tasks.workunit.client.0.vm01.stderr:+ local num=0 2026-03-10T15:41:32.613 INFO:tasks.workunit.client.0.vm01.stderr:+ eval 'sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name container.alertmanager.a -- test -f /etc/alertmanager/alertmanager.yml' 2026-03-10T15:41:32.613 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name container.alertmanager.a -- test -f /etc/alertmanager/alertmanager.yml 2026-03-10T15:41:32.704 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/container.alertmanager.a/config 
2026-03-10T15:41:32.713 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-container-alertmanager-a 2026-03-10T15:41:32.714 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:41:32.714 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-container-alertmanager-a 2026-03-10T15:41:32.722 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from /usr/bin/docker container inspect --format {{.State.Status}} ceph-00000000-0000-0000-0000-0000deadbeef-container.alertmanager.a 2026-03-10T15:41:32.722 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stdout 2026-03-10T15:41:32.722 INFO:tasks.workunit.client.0.vm01.stderr:/usr/bin/docker: stderr Error response from daemon: No such container: ceph-00000000-0000-0000-0000-0000deadbeef-container.alertmanager.a 2026-03-10T15:41:32.722 INFO:tasks.workunit.client.0.vm01.stderr:ERROR: unable to find container "ceph-00000000-0000-0000-0000-0000deadbeef-container-alertmanager-a" 2026-03-10T15:41:32.737 INFO:tasks.workunit.client.0.vm01.stderr:+ num=1 2026-03-10T15:41:32.737 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 1 -ge 10 ']' 2026-03-10T15:41:32.737 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 5 2026-03-10T15:41:37.738 INFO:tasks.workunit.client.0.vm01.stderr:+ eval 'sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name container.alertmanager.a -- test -f /etc/alertmanager/alertmanager.yml' 2026-03-10T15:41:37.738 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name container.alertmanager.a -- test -f /etc/alertmanager/alertmanager.yml 2026-03-10T15:41:37.837 
INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/container.alertmanager.a/config 2026-03-10T15:41:37.903 INFO:tasks.workunit.client.0.vm01.stdout:alertmanager.yml is available 2026-03-10T15:41:37.903 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'alertmanager.yml is available' 2026-03-10T15:41:37.903 INFO:tasks.workunit.client.0.vm01.stderr:+ true 2026-03-10T15:41:37.903 INFO:tasks.workunit.client.0.vm01.stderr:+ cond='curl '\''http://localhost:9093'\'' | grep -q '\''Alertmanager'\''' 2026-03-10T15:41:37.903 INFO:tasks.workunit.client.0.vm01.stderr:+ is_available alertmanager 'curl '\''http://localhost:9093'\'' | grep -q '\''Alertmanager'\''' 10 2026-03-10T15:41:37.903 INFO:tasks.workunit.client.0.vm01.stderr:+ local name=alertmanager 2026-03-10T15:41:37.903 INFO:tasks.workunit.client.0.vm01.stderr:+ local 'condition=curl '\''http://localhost:9093'\'' | grep -q '\''Alertmanager'\''' 2026-03-10T15:41:37.903 INFO:tasks.workunit.client.0.vm01.stderr:+ local tries=10 2026-03-10T15:41:37.903 INFO:tasks.workunit.client.0.vm01.stderr:+ local num=0 2026-03-10T15:41:37.903 INFO:tasks.workunit.client.0.vm01.stderr:+ eval 'curl '\''http://localhost:9093'\'' | grep -q '\''Alertmanager'\''' 2026-03-10T15:41:37.903 INFO:tasks.workunit.client.0.vm01.stderr:++ curl http://localhost:9093 2026-03-10T15:41:37.903 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -q Alertmanager 2026-03-10T15:41:37.907 INFO:tasks.workunit.client.0.vm01.stderr: % Total % Received % Xferd Average Speed Time Time Time Current 2026-03-10T15:41:37.907 INFO:tasks.workunit.client.0.vm01.stderr: Dload Upload Total Spent Left Speed 2026-03-10T15:41:37.909 INFO:tasks.workunit.client.0.vm01.stderr: 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 1314 100 1314 0 0 1649k 0 --:--:-- --:--:-- --:--:-- 1283k 2026-03-10T15:41:37.909 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'alertmanager is available' 2026-03-10T15:41:37.909 
INFO:tasks.workunit.client.0.vm01.stdout:alertmanager is available 2026-03-10T15:41:37.909 INFO:tasks.workunit.client.0.vm01.stderr:+ true 2026-03-10T15:41:37.909 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- is-enabled 2026-03-10T15:41:38.000 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-10T15:41:42.561 INFO:tasks.workunit.client.0.vm01.stderr:stdout enabled 2026-03-10T15:41:42.586 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- is-active 2026-03-10T15:41:42.689 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-10T15:41:46.595 INFO:tasks.workunit.client.0.vm01.stderr:stdout active 2026-03-10T15:41:46.614 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_false sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.xyz -- is-active 2026-03-10T15:41:46.614 INFO:tasks.workunit.client.0.vm01.stderr:+ set -x 2026-03-10T15:41:46.614 INFO:tasks.workunit.client.0.vm01.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.xyz -- is-active 2026-03-10T15:41:46.614 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.xyz -- is-active 2026-03-10T15:41:46.712 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.xyz/config 2026-03-10T15:41:50.628 INFO:tasks.workunit.client.0.vm01.stderr:ERROR: Daemon not found: mon.xyz. 
See `cephadm ls` 2026-03-10T15:41:50.642 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-10T15:41:50.642 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- disable 2026-03-10T15:41:50.728 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-10T15:41:54.876 INFO:tasks.workunit.client.0.vm01.stderr:stderr Removed /etc/systemd/system/ceph-00000000-0000-0000-0000-0000deadbeef.target.wants/ceph-00000000-0000-0000-0000-0000deadbeef@mon.a.service. 2026-03-10T15:41:54.889 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_false sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- is-enabled 2026-03-10T15:41:54.889 INFO:tasks.workunit.client.0.vm01.stderr:+ set -x 2026-03-10T15:41:54.889 INFO:tasks.workunit.client.0.vm01.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- is-enabled 2026-03-10T15:41:54.889 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- is-enabled 2026-03-10T15:41:54.976 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-10T15:41:58.707 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 1 from systemctl is-enabled ceph-00000000-0000-0000-0000-0000deadbeef@mon.a 2026-03-10T15:41:58.707 INFO:tasks.workunit.client.0.vm01.stderr:stdout disabled 2026-03-10T15:41:58.726 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-10T15:41:58.726 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 
00000000-0000-0000-0000-0000deadbeef --name mon.a -- enable 2026-03-10T15:41:58.816 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-10T15:42:02.948 INFO:tasks.workunit.client.0.vm01.stderr:stderr Created symlink /etc/systemd/system/ceph-00000000-0000-0000-0000-0000deadbeef.target.wants/ceph-00000000-0000-0000-0000-0000deadbeef@mon.a.service → /etc/systemd/system/ceph-00000000-0000-0000-0000-0000deadbeef@.service. 2026-03-10T15:42:02.966 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- is-enabled 2026-03-10T15:42:03.061 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-10T15:42:06.775 INFO:tasks.workunit.client.0.vm01.stderr:stdout enabled 2026-03-10T15:42:06.789 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- status 2026-03-10T15:42:06.875 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout ● ceph-00000000-0000-0000-0000-0000deadbeef@mon.a.service - Ceph mon.a for 00000000-0000-0000-0000-0000deadbeef 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout Loaded: loaded (/etc/systemd/system/ceph-00000000-0000-0000-0000-0000deadbeef@.service; enabled; vendor preset: enabled) 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout Active: active (running) since Tue 2026-03-10 15:39:44 UTC; 2min 26s ago 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout Main PID: 21186 (bash) 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout Tasks: 10 (limit: 9553) 
2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout Memory: 8.6M 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout CPU: 54ms 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout CGroup: /system.slice/system-ceph\x2d00000000\x2d0000\x2d0000\x2d0000\x2d0000deadbeef.slice/ceph-00000000-0000-0000-0000-0000deadbeef@mon.a.service 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout ├─21186 /bin/bash /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/unit.run 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout └─21205 /usr/bin/docker run --rm --ipc=host --stop-signal=SIGTERM --ulimit nofile=1048576 --net=host --entrypoint /usr/bin/ceph-mon --privileged --group-add=disk --init --name ceph-00000000-0000-0000-0000-0000deadbeef-mon-a --pids-limit=0 -e CONTAINER_IMAGE=quay.ceph.io/ceph-ci/ceph:squid -e NODE_NAME=vm01 -e TCMALLOC_MAX_TOTAL_THREAD_CACHE_BYTES=134217728 -v /var/run/ceph/00000000-0000-0000-0000-0000deadbeef:/var/run/ceph:z -v /var/log/ceph/00000000-0000-0000-0000-0000deadbeef:/var/log/ceph:z -v /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/crash:/var/lib/ceph/crash:z -v /dev:/dev -v /run/udev:/run/udev -v /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a:/var/lib/ceph/mon/ceph-a:z -v /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config:/etc/ceph/ceph.conf:z quay.ceph.io/ceph-ci/ceph:squid -n mon.a -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-stderr=true "--default-log-stderr-prefix=debug " --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-stderr=true 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:03 vm01 bash[21205]: cluster 2026-03-10T15:42:01.980206+0000 mgr.x (mgr.14150) 82 : cluster [DBG] pgmap v71: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 
MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects degraded (33.333%) 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:03 vm01 bash[21205]: cluster 2026-03-10T15:42:01.980206+0000 mgr.x (mgr.14150) 82 : cluster [DBG] pgmap v71: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects degraded (33.333%) 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:04 vm01 bash[21205]: debug 2026-03-10T15:42:04.974+0000 7f6774fe8640 1 mon.a@0(leader).osd e16 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:05 vm01 bash[21205]: cluster 2026-03-10T15:42:03.980448+0000 mgr.x (mgr.14150) 83 : cluster [DBG] pgmap v72: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects degraded (33.333%) 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:05 vm01 bash[21205]: cluster 2026-03-10T15:42:03.980448+0000 mgr.x (mgr.14150) 83 : cluster [DBG] pgmap v72: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects degraded (33.333%) 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:07 vm01 bash[21205]: cluster 2026-03-10T15:42:05.980722+0000 mgr.x (mgr.14150) 84 : cluster [DBG] pgmap v73: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects degraded (33.333%) 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:07 vm01 bash[21205]: cluster 2026-03-10T15:42:05.980722+0000 mgr.x (mgr.14150) 84 : cluster [DBG] pgmap v73: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects 
degraded (33.333%) 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:09 vm01 bash[21205]: cluster 2026-03-10T15:42:07.981010+0000 mgr.x (mgr.14150) 85 : cluster [DBG] pgmap v74: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects degraded (33.333%) 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:09 vm01 bash[21205]: cluster 2026-03-10T15:42:07.981010+0000 mgr.x (mgr.14150) 85 : cluster [DBG] pgmap v74: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects degraded (33.333%) 2026-03-10T15:42:10.810 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:09 vm01 bash[21205]: debug 2026-03-10T15:42:09.974+0000 7f6774fe8640 1 mon.a@0(leader).osd e16 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 2026-03-10T15:42:10.827 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- stop 2026-03-10T15:42:10.913 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-10T15:42:15.153 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_return_code 3 sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- status 2026-03-10T15:42:15.154 INFO:tasks.workunit.client.0.vm01.stderr:+ set -x 2026-03-10T15:42:15.154 INFO:tasks.workunit.client.0.vm01.stderr:+ local expected_code=3 2026-03-10T15:42:15.154 INFO:tasks.workunit.client.0.vm01.stderr:+ shift 2026-03-10T15:42:15.154 INFO:tasks.workunit.client.0.vm01.stderr:+ local 'command=sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- status' 
2026-03-10T15:42:15.154 INFO:tasks.workunit.client.0.vm01.stderr:+ set +e 2026-03-10T15:42:15.154 INFO:tasks.workunit.client.0.vm01.stderr:+ eval 'sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- status' 2026-03-10T15:42:15.154 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- status 2026-03-10T15:42:15.249 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:Non-zero exit code 3 from systemctl status ceph-00000000-0000-0000-0000-0000deadbeef@mon.a 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout ○ ceph-00000000-0000-0000-0000-0000deadbeef@mon.a.service - Ceph mon.a for 00000000-0000-0000-0000-0000deadbeef 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout Loaded: loaded (/etc/systemd/system/ceph-00000000-0000-0000-0000-0000deadbeef@.service; enabled; vendor preset: enabled) 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout Active: inactive (dead) since Tue 2026-03-10 15:42:15 UTC; 3s ago 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout Process: 21186 ExecStart=/bin/bash /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/unit.run (code=exited, status=0/SUCCESS) 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout Process: 37721 ExecStop=/bin/bash -c bash /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/unit.stop (code=exited, status=0/SUCCESS) 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout Process: 37777 ExecStopPost=/bin/bash /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/unit.poststop (code=exited, status=0/SUCCESS) 2026-03-10T15:42:18.884 
INFO:tasks.workunit.client.0.vm01.stderr:stdout Main PID: 21186 (code=exited, status=0/SUCCESS) 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout CPU: 126ms 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:13 vm01 bash[21205]: cluster 2026-03-10T15:42:11.981698+0000 mgr.x (mgr.14150) 87 : cluster [DBG] pgmap v76: 64 pgs: 1 active+undersized+degraded, 63 active+undersized; 0 B data, 53 MiB used, 5.9 GiB / 6.0 GiB avail; 1/3 objects degraded (33.333%) 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:14 vm01 systemd[1]: Stopping Ceph mon.a for 00000000-0000-0000-0000-0000deadbeef... 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:14 vm01 bash[21205]: debug 2026-03-10T15:42:14.882+0000 7f6777fee640 -1 received signal: Terminated from /sbin/docker-init -- /usr/bin/ceph-mon -n mon.a -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-stderr=true --default-log-stderr-prefix=debug --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-stderr=true (PID: 1) UID: 0 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:14 vm01 bash[21205]: debug 2026-03-10T15:42:14.882+0000 7f6777fee640 -1 mon.a@0(leader) e2 *** Got Signal Terminated *** 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:14 vm01 bash[21205]: debug 2026-03-10T15:42:14.882+0000 7f6777fee640 1 mon.a@0(leader) e2 shutdown 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:14 vm01 bash[21205]: debug 2026-03-10T15:42:14.890+0000 7f6779a13d80 4 rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:14 vm01 bash[21205]: debug 2026-03-10T15:42:14.894+0000 7f6779a13d80 4 
rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:15 vm01 bash[37745]: ceph-00000000-0000-0000-0000-0000deadbeef-mon-a 2026-03-10T15:42:18.884 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:15 vm01 systemd[1]: ceph-00000000-0000-0000-0000-0000deadbeef@mon.a.service: Deactivated successfully. 2026-03-10T15:42:18.885 INFO:tasks.workunit.client.0.vm01.stderr:stdout Mar 10 15:42:15 vm01 systemd[1]: Stopped Ceph mon.a for 00000000-0000-0000-0000-0000deadbeef. 2026-03-10T15:42:18.901 INFO:tasks.workunit.client.0.vm01.stderr:+ local return_code=3 2026-03-10T15:42:18.901 INFO:tasks.workunit.client.0.vm01.stderr:+ set -e 2026-03-10T15:42:18.901 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' '!' 3 -eq 3 ']' 2026-03-10T15:42:18.901 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-10T15:42:18.901 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid unit --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- start 2026-03-10T15:42:18.991 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-10T15:42:22.941 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef -- true 2026-03-10T15:42:26.945 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.b/config 2026-03-10T15:42:27.098 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef -- test -d /var/log/ceph 2026-03-10T15:42:30.977 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.b/config 2026-03-10T15:42:31.141 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_false 
sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid --timeout 10 shell --fsid 00000000-0000-0000-0000-0000deadbeef -- sleep 60 2026-03-10T15:42:31.141 INFO:tasks.workunit.client.0.vm01.stderr:+ set -x 2026-03-10T15:42:31.142 INFO:tasks.workunit.client.0.vm01.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid --timeout 10 shell --fsid 00000000-0000-0000-0000-0000deadbeef -- sleep 60 2026-03-10T15:42:31.142 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid --timeout 10 shell --fsid 00000000-0000-0000-0000-0000deadbeef -- sleep 60 2026-03-10T15:42:35.018 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.b/config 2026-03-10T15:42:45.020 INFO:tasks.workunit.client.0.vm01.stderr:ERROR: Command `['/usr/bin/docker', 'run', '--rm', '--ipc=host', '--net=host', '--privileged', '--group-add=disk', '--init', '-i', '-e', 'CONTAINER_IMAGE=quay.ceph.io/ceph-ci/ceph:squid', '-e', 'NODE_NAME=vm01', '-v', '/var/run/ceph/00000000-0000-0000-0000-0000deadbeef:/var/run/ceph:z', '-v', '/var/log/ceph/00000000-0000-0000-0000-0000deadbeef:/var/log/ceph:z', '-v', '/var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/crash:/var/lib/ceph/crash:z', '-v', '/var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.b/config:/etc/ceph/ceph.conf:z', '-v', '/var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/config/ceph.client.admin.keyring:/etc/ceph/ceph.keyring:z', '--entrypoint', 'sleep', 'quay.ceph.io/ceph-ci/ceph:squid', '60']` timed out after 10 seconds 2026-03-10T15:42:45.038 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-10T15:42:45.038 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid --timeout 60 shell --fsid 00000000-0000-0000-0000-0000deadbeef -- sleep 10 2026-03-10T15:42:49.686 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config 
/var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.b/config 2026-03-10T15:42:59.847 INFO:tasks.workunit.client.0.vm01.stderr:++ basename tmp.test_cephadm.sh.dSdiir 2026-03-10T15:42:59.849 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid shell --fsid 00000000-0000-0000-0000-0000deadbeef --mount tmp.test_cephadm.sh.dSdiir tmp.test_cephadm.sh.l4jfBR -- stat /mnt/tmp.test_cephadm.sh.dSdiir 2026-03-10T15:43:04.499 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.b/config 2026-03-10T15:43:04.622 INFO:tasks.workunit.client.0.vm01.stdout: File: /mnt/tmp.test_cephadm.sh.dSdiir 2026-03-10T15:43:04.622 INFO:tasks.workunit.client.0.vm01.stdout: Size: 4096 Blocks: 8 IO Block: 4096 directory 2026-03-10T15:43:04.622 INFO:tasks.workunit.client.0.vm01.stdout:Device: fe01h/65025d Inode: 1046137 Links: 2 2026-03-10T15:43:04.622 INFO:tasks.workunit.client.0.vm01.stdout:Access: (0700/drwx------) Uid: ( 1000/ UNKNOWN) Gid: ( 1000/ UNKNOWN) 2026-03-10T15:43:04.622 INFO:tasks.workunit.client.0.vm01.stdout:Access: 2026-03-10 15:38:57.802659945 +0000 2026-03-10T15:43:04.622 INFO:tasks.workunit.client.0.vm01.stdout:Modify: 2026-03-10 15:40:44.310659945 +0000 2026-03-10T15:43:04.622 INFO:tasks.workunit.client.0.vm01.stdout:Change: 2026-03-10 15:40:44.310659945 +0000 2026-03-10T15:43:04.622 INFO:tasks.workunit.client.0.vm01.stdout: Birth: 2026-03-10 15:38:57.802659945 +0000 2026-03-10T15:43:04.674 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_false sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter 2026-03-10T15:43:04.674 INFO:tasks.workunit.client.0.vm01.stderr:+ set -x 2026-03-10T15:43:04.674 INFO:tasks.workunit.client.0.vm01.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter 2026-03-10T15:43:04.674 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter 
2026-03-10T15:43:04.765 INFO:tasks.workunit.client.0.vm01.stderr:usage: cephadm enter [-h] [--fsid FSID] --name NAME ... 2026-03-10T15:43:04.765 INFO:tasks.workunit.client.0.vm01.stderr:cephadm enter: error: the following arguments are required: --name/-n 2026-03-10T15:43:04.787 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-10T15:43:04.787 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- test -d /var/lib/ceph/mon/ceph-a 2026-03-10T15:43:04.878 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-10T15:43:04.938 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name mgr.x -- test -d /var/lib/ceph/mgr/ceph-x 2026-03-10T15:43:05.041 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mgr.x/config 2026-03-10T15:43:05.118 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- pidof ceph-mon 2026-03-10T15:43:05.215 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-10T15:43:05.248 INFO:tasks.workunit.client.0.vm01.stdout:7 2026-03-10T15:43:05.271 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_false sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name mgr.x -- pidof ceph-mon 2026-03-10T15:43:05.271 INFO:tasks.workunit.client.0.vm01.stderr:+ set -x 2026-03-10T15:43:05.271 INFO:tasks.workunit.client.0.vm01.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name 
mgr.x -- pidof ceph-mon 2026-03-10T15:43:05.271 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name mgr.x -- pidof ceph-mon 2026-03-10T15:43:05.359 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mgr.x/config 2026-03-10T15:43:05.420 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-10T15:43:05.420 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid enter --fsid 00000000-0000-0000-0000-0000deadbeef --name mgr.x -- pidof ceph-mgr 2026-03-10T15:43:05.520 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mgr.x/config 2026-03-10T15:43:05.561 INFO:tasks.workunit.client.0.vm01.stdout:8 2026-03-10T15:43:05.582 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid --timeout 60 enter --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a -- sleep 10 2026-03-10T15:43:05.677 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.a/config 2026-03-10T15:43:15.738 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ceph-volume --fsid 00000000-0000-0000-0000-0000deadbeef -- inventory --format=json 2026-03-10T15:43:15.738 INFO:tasks.workunit.client.0.vm01.stderr:+ jq '.[]' 2026-03-10T15:43:20.373 INFO:tasks.workunit.client.0.vm01.stderr:Inferring config /var/lib/ceph/00000000-0000-0000-0000-0000deadbeef/mon.b/config 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout:{ 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "path": "/dev/vdb", 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "sys_api": { 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "removable": "0", 
2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "ro": "0", 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "vendor": "0x1af4", 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "model": "", 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "rev": "", 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "sas_address": "", 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "sas_device_handle": "", 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "support_discard": "512", 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "rotational": "1", 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "nr_requests": "256", 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "partitions": {}, 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "device_nodes": [ 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "vdb" 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: ], 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "actuators": null, 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "scheduler_mode": "none", 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "sectors": 0, 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "sectorsize": "512", 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "size": 21474836480, 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "human_readable_size": "20.00 GB", 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "path": "/dev/vdb", 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "devname": "vdb", 2026-03-10T15:43:20.991 INFO:tasks.workunit.client.0.vm01.stdout: "type": "disk", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "parent": "/dev/vdb", 2026-03-10T15:43:20.992 
INFO:tasks.workunit.client.0.vm01.stdout: "id_bus": "" 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: }, 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "ceph_device_lvm": false, 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "being_replaced": false, 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "lsm_data": {}, 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "available": true, 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "rejected_reasons": [], 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "device_id": "DWNBRSTVMM01001", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "lvs": [] 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout:} 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout:{ 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "path": "/dev/vdc", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "sys_api": { 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "removable": "0", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "ro": "0", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "vendor": "0x1af4", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "model": "", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "rev": "", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "sas_address": "", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "sas_device_handle": "", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "support_discard": "512", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "rotational": "1", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "nr_requests": "256", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "partitions": {}, 2026-03-10T15:43:20.992 
INFO:tasks.workunit.client.0.vm01.stdout: "device_nodes": [ 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "vdc" 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: ], 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "actuators": null, 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "scheduler_mode": "none", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "sectors": 0, 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "sectorsize": "512", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "size": 21474836480, 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "human_readable_size": "20.00 GB", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "path": "/dev/vdc", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "devname": "vdc", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "type": "disk", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "parent": "/dev/vdc", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "id_bus": "" 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: }, 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "ceph_device_lvm": false, 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "being_replaced": false, 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "lsm_data": {}, 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "available": true, 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "rejected_reasons": [], 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "device_id": "DWNBRSTVMM01002", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "lvs": [] 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout:} 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout:{ 2026-03-10T15:43:20.992 
INFO:tasks.workunit.client.0.vm01.stdout: "path": "/dev/vdd", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "sys_api": { 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "removable": "0", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "ro": "0", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "vendor": "0x1af4", 2026-03-10T15:43:20.992 INFO:tasks.workunit.client.0.vm01.stdout: "model": "", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "rev": "", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "sas_address": "", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "sas_device_handle": "", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "support_discard": "512", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "rotational": "1", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "nr_requests": "256", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "partitions": {}, 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "device_nodes": [ 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "vdd" 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: ], 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "actuators": null, 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "scheduler_mode": "none", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "sectors": 0, 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "sectorsize": "512", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "size": 21474836480, 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "human_readable_size": "20.00 GB", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "path": "/dev/vdd", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "devname": "vdd", 
2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "type": "disk", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "parent": "/dev/vdd", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "id_bus": "" 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: }, 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "ceph_device_lvm": false, 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "being_replaced": false, 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "lsm_data": {}, 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "available": true, 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "rejected_reasons": [], 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "device_id": "DWNBRSTVMM01003", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "lvs": [] 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout:} 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout:{ 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "path": "/dev/vde", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "sys_api": { 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "removable": "0", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "ro": "0", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "vendor": "0x1af4", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "model": "", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "rev": "", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "sas_address": "", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "sas_device_handle": "", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "support_discard": "512", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "rotational": "1", 
2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "nr_requests": "256", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "partitions": {}, 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "device_nodes": [ 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "vde" 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: ], 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "actuators": null, 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "scheduler_mode": "none", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "sectors": 0, 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "sectorsize": "512", 2026-03-10T15:43:20.993 INFO:tasks.workunit.client.0.vm01.stdout: "size": 21474836480, 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "human_readable_size": "20.00 GB", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "path": "/dev/vde", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "devname": "vde", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "type": "disk", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "parent": "/dev/vde", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "id_bus": "" 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: }, 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "ceph_device_lvm": false, 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "being_replaced": false, 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "lsm_data": {}, 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "available": true, 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "rejected_reasons": [], 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "device_id": "DWNBRSTVMM01004", 2026-03-10T15:43:20.994 
INFO:tasks.workunit.client.0.vm01.stdout: "lvs": [] 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout:} 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout:{ 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "path": "/dev/sr0", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "sys_api": { 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "removable": "1", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "ro": "0", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "vendor": "QEMU", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "model": "QEMU DVD-ROM", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "rev": "2.5+", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "sas_address": "", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "sas_device_handle": "", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "support_discard": "0", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "rotational": "1", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "nr_requests": "2", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "partitions": {}, 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "device_nodes": [ 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "sr0" 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: ], 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "actuators": null, 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "scheduler_mode": "mq-deadline", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "sectors": 0, 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "sectorsize": "2048", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "size": 374784, 2026-03-10T15:43:20.994 
INFO:tasks.workunit.client.0.vm01.stdout: "human_readable_size": "366.00 KB", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "path": "/dev/sr0", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "devname": "sr0", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "type": "disk", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "parent": "/dev/sr0", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "id_bus": "ata" 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: }, 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "ceph_device_lvm": false, 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "being_replaced": false, 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "lsm_data": {}, 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "available": false, 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "rejected_reasons": [ 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "Has a FileSystem", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "Insufficient space (<5GB)" 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: ], 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "device_id": "QEMU_DVD-ROM_QM00003", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "lvs": [] 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout:} 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout:{ 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "path": "/dev/vda", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "sys_api": { 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "removable": "0", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "ro": "0", 2026-03-10T15:43:20.994 INFO:tasks.workunit.client.0.vm01.stdout: "vendor": "0x1af4", 
2026-03-10T15:43:20.995 INFO:tasks.workunit.client.0.vm01.stdout: "model": "", 2026-03-10T15:43:20.995 INFO:tasks.workunit.client.0.vm01.stdout: "rev": "", 2026-03-10T15:43:20.995 INFO:tasks.workunit.client.0.vm01.stdout: "sas_address": "", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "sas_device_handle": "", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "support_discard": "512", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "rotational": "1", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "nr_requests": "256", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "partitions": { 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "vda15": { 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "start": "10240", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "sectors": "217088", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "sectorsize": 512, 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "size": 111149056, 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "human_readable_size": "106.00 MB", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "holders": [] 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: }, 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "vda1": { 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "start": "227328", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "sectors": "83658719", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "sectorsize": 512, 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "size": 42833264128, 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "human_readable_size": "39.89 GB", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "holders": [] 2026-03-10T15:43:21.011 
INFO:tasks.workunit.client.0.vm01.stdout: }, 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "vda14": { 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "start": "2048", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "sectors": "8192", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "sectorsize": 512, 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "size": 4194304, 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "human_readable_size": "4.00 MB", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "holders": [] 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: } 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: }, 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "device_nodes": [ 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "vda" 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: ], 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "actuators": null, 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "scheduler_mode": "none", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "sectors": 0, 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "sectorsize": "512", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "size": 42949672960, 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "human_readable_size": "40.00 GB", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "path": "/dev/vda", 2026-03-10T15:43:21.011 INFO:tasks.workunit.client.0.vm01.stdout: "devname": "vda", 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stdout: "type": "disk", 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stdout: "parent": "/dev/vda", 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stdout: "id_bus": "" 2026-03-10T15:43:21.012 
INFO:tasks.workunit.client.0.vm01.stdout: }, 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stdout: "ceph_device_lvm": false, 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stdout: "being_replaced": false, 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stdout: "lsm_data": {}, 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stdout: "available": false, 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stdout: "rejected_reasons": [ 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stdout: "Has GPT headers", 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stdout: "Has partitions" 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stdout: ], 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stdout: "device_id": "", 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stdout: "lvs": [] 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stdout:} 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' true = false ']' 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_false sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-daemon --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stderr:+ set -x 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-daemon --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a 2026-03-10T15:43:21.012 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-daemon --fsid 00000000-0000-0000-0000-0000deadbeef --name mon.a 2026-03-10T15:43:21.109 INFO:tasks.workunit.client.0.vm01.stderr:ERROR: must pass --force to proceed: this command may destroy precious data! 
2026-03-10T15:43:21.128 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-10T15:43:21.128 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-daemon --fsid 00000000-0000-0000-0000-0000deadbeef --name mgr.x
2026-03-10T15:43:21.662 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_false sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid zap-osds --fsid 00000000-0000-0000-0000-0000deadbeef
2026-03-10T15:43:21.663 INFO:tasks.workunit.client.0.vm01.stderr:+ set -x
2026-03-10T15:43:21.663 INFO:tasks.workunit.client.0.vm01.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid zap-osds --fsid 00000000-0000-0000-0000-0000deadbeef
2026-03-10T15:43:21.663 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid zap-osds --fsid 00000000-0000-0000-0000-0000deadbeef
2026-03-10T15:43:21.756 INFO:tasks.workunit.client.0.vm01.stderr:ERROR: must pass --force to proceed: this command may destroy precious data!
2026-03-10T15:43:21.771 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-10T15:43:21.771 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid zap-osds --fsid 00000000-0000-0000-0000-0000deadbeef --force
2026-03-10T15:43:22.523 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_false sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-cluster --fsid 00000000-0000-0000-0000-0000deadbeef --zap-osds
2026-03-10T15:43:22.523 INFO:tasks.workunit.client.0.vm01.stderr:+ set -x
2026-03-10T15:43:22.523 INFO:tasks.workunit.client.0.vm01.stderr:+ eval sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-cluster --fsid 00000000-0000-0000-0000-0000deadbeef --zap-osds
2026-03-10T15:43:22.523 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-cluster --fsid 00000000-0000-0000-0000-0000deadbeef --zap-osds
2026-03-10T15:43:22.614 INFO:tasks.workunit.client.0.vm01.stderr:ERROR: must pass --force to proceed: this command may destroy precious data!
2026-03-10T15:43:22.628 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-10T15:43:22.628 INFO:tasks.workunit.client.0.vm01.stderr:+ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid rm-cluster --fsid 00000000-0000-0000-0000-0000deadbeef --force --zap-osds
2026-03-10T15:43:22.723 INFO:tasks.workunit.client.0.vm01.stdout:Deleting cluster with fsid: 00000000-0000-0000-0000-0000deadbeef
2026-03-10T15:43:43.123 INFO:tasks.workunit.client.0.vm01.stdout:PASS
2026-03-10T15:43:43.123 INFO:tasks.workunit.client.0.vm01.stderr:+ echo PASS
2026-03-10T15:43:43.123 INFO:tasks.workunit.client.0.vm01.stderr:+ cleanup
2026-03-10T15:43:43.123 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' true = false ']'
2026-03-10T15:43:43.123 INFO:tasks.workunit.client.0.vm01.stderr:+ dump_all_logs 00000000-0000-0000-0000-0000deadbeef
2026-03-10T15:43:43.123 INFO:tasks.workunit.client.0.vm01.stderr:+ local fsid=00000000-0000-0000-0000-0000deadbeef
2026-03-10T15:43:43.123 INFO:tasks.workunit.client.0.vm01.stderr:++ sudo /usr/sbin/cephadm --image quay.ceph.io/ceph-ci/ceph:squid ls
2026-03-10T15:43:43.123 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r '.[] | select(.fsid == "00000000-0000-0000-0000-0000deadbeef").name'
2026-03-10T15:43:44.262 INFO:tasks.workunit.client.0.vm01.stdout:dumping logs for daemons:
2026-03-10T15:43:44.263 INFO:tasks.workunit.client.0.vm01.stderr:+ local names=
2026-03-10T15:43:44.263 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'dumping logs for daemons: '
2026-03-10T15:43:44.263 INFO:tasks.workunit.client.0.vm01.stderr:+ rm -rf tmp.test_cephadm.sh.dSdiir
2026-03-10T15:43:44.264 INFO:teuthology.orchestra.run:Running command with timeout 3600
2026-03-10T15:43:44.264 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp
2026-03-10T15:43:44.312 INFO:tasks.workunit:Stopping ['cephadm/test_cephadm.sh'] on client.0...
2026-03-10T15:43:44.312 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0
2026-03-10T15:43:44.755 DEBUG:teuthology.parallel:result is None
2026-03-10T15:43:44.755 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0
2026-03-10T15:43:44.764 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0
2026-03-10T15:43:44.764 DEBUG:teuthology.orchestra.run.vm01:> rmdir -- /home/ubuntu/cephtest/mnt.0
2026-03-10T15:43:44.812 INFO:tasks.workunit:Deleted artificial mount point /home/ubuntu/cephtest/mnt.0/client.0
2026-03-10T15:43:44.812 DEBUG:teuthology.run_tasks:Unwinding manager install
2026-03-10T15:43:44.838 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer...
2026-03-10T15:43:44.838 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer
2026-03-10T15:43:44.873 INFO:teuthology.task.install.deb:Removing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on Debian system.
2026-03-10T15:43:44.873 DEBUG:teuthology.orchestra.run.vm01:> for d in ceph cephadm ceph-mds ceph-mgr ceph-common ceph-fuse ceph-test ceph-volume radosgw python3-rados python3-rgw python3-cephfs python3-rbd libcephfs2 libcephfs-dev librados2 librbd1 rbd-fuse ; do sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" purge $d || true ; done
2026-03-10T15:43:44.944 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:43:45.150 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-10T15:43:45.150 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-10T15:43:45.318 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-10T15:43:45.319 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-10T15:43:45.319 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-10T15:43:45.319 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-10T15:43:45.333 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-10T15:43:45.334 INFO:teuthology.orchestra.run.vm01.stdout: ceph*
2026-03-10T15:43:45.526 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 1 to remove and 12 not upgraded.
2026-03-10T15:43:45.526 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 47.1 kB disk space will be freed.
2026-03-10T15:43:45.568 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 118605 files and directories currently installed.)
2026-03-10T15:43:45.571 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:46.835 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-10T15:43:46.869 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:43:47.061 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-10T15:43:47.061 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-10T15:43:47.274 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-10T15:43:47.275 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-10T15:43:47.276 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev
2026-03-10T15:43:47.276 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-10T15:43:47.290 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-10T15:43:47.308 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-cephadm* cephadm*
2026-03-10T15:43:47.466 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 2 to remove and 12 not upgraded.
2026-03-10T15:43:47.466 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 1775 kB disk space will be freed.
2026-03-10T15:43:47.513 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 118603 files and directories currently installed.)
2026-03-10T15:43:47.516 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr-cephadm (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:47.533 INFO:teuthology.orchestra.run.vm01.stdout:Removing cephadm (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:47.563 INFO:teuthology.orchestra.run.vm01.stdout:Looking for files to backup/remove ...
2026-03-10T15:43:47.565 INFO:teuthology.orchestra.run.vm01.stdout:Not backing up/removing `/var/lib/cephadm', it matches ^/var/.*.
2026-03-10T15:43:47.567 INFO:teuthology.orchestra.run.vm01.stdout:Removing user `cephadm' ...
2026-03-10T15:43:47.567 INFO:teuthology.orchestra.run.vm01.stdout:Warning: group `nogroup' has no more members.
2026-03-10T15:43:47.579 INFO:teuthology.orchestra.run.vm01.stdout:Done.
2026-03-10T15:43:47.600 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-10T15:43:47.695 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 118529 files and directories currently installed.)
2026-03-10T15:43:47.698 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for cephadm (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:48.912 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-10T15:43:48.947 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:43:49.060 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-10T15:43:49.060 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-10T15:43:49.211 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-10T15:43:49.211 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-10T15:43:49.211 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev
2026-03-10T15:43:49.211 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-10T15:43:49.222 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-10T15:43:49.223 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mds*
2026-03-10T15:43:49.402 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 1 to remove and 12 not upgraded.
2026-03-10T15:43:49.402 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 7437 kB disk space will be freed.
2026-03-10T15:43:49.443 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 118529 files and directories currently installed.)
2026-03-10T15:43:49.445 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mds (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:49.836 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-10T15:43:49.937 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 118521 files and directories currently installed.)
2026-03-10T15:43:49.940 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-mds (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:51.616 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-10T15:43:51.651 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:43:51.832 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-10T15:43:51.832 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core ceph-mon kpartx libboost-iostreams1.74.0
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libpmemobj1 libsgutils2-2 python-asyncssh-doc
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools python3-cheroot
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-psutil python3-pyinotify
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout: python3-repoze.lru python3-requests-oauthlib python3-routes python3-rsa
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout: python3-simplegeneric python3-simplejson python3-singledispatch
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn python3-sklearn-lib python3-tempita python3-tempora
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout: python3-threadpoolctl python3-waitress python3-webob python3-websocket
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout: sg3-utils-udev
2026-03-10T15:43:52.014 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-10T15:43:52.022 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-10T15:43:52.022 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr* ceph-mgr-dashboard* ceph-mgr-diskprediction-local*
2026-03-10T15:43:52.022 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-k8sevents*
2026-03-10T15:43:52.303 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 4 to remove and 12 not upgraded.
2026-03-10T15:43:52.304 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 165 MB disk space will be freed.
2026-03-10T15:43:52.384 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 118521 files and directories currently installed.)
2026-03-10T15:43:52.387 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr-k8sevents (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:52.399 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr-diskprediction-local (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:52.422 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr-dashboard (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:52.457 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:52.962 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117937 files and directories currently installed.)
2026-03-10T15:43:52.965 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:54.609 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-10T15:43:54.645 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:43:54.863 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-10T15:43:54.864 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-10T15:43:55.064 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-10T15:43:55.064 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-10T15:43:55.065 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-10T15:43:55.065 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-10T15:43:55.065 INFO:teuthology.orchestra.run.vm01.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-10T15:43:55.065 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth
2026-03-10T15:43:55.065 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-10T15:43:55.065 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-10T15:43:55.065 INFO:teuthology.orchestra.run.vm01.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-10T15:43:55.065 INFO:teuthology.orchestra.run.vm01.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable
2026-03-10T15:43:55.065 INFO:teuthology.orchestra.run.vm01.stdout: python3-psutil python3-pyinotify python3-repoze.lru
2026-03-10T15:43:55.065 INFO:teuthology.orchestra.run.vm01.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric
2026-03-10T15:43:55.065 INFO:teuthology.orchestra.run.vm01.stdout: python3-simplejson python3-singledispatch python3-sklearn
2026-03-10T15:43:55.066 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl
2026-03-10T15:43:55.066 INFO:teuthology.orchestra.run.vm01.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket
2026-03-10T15:43:55.066 INFO:teuthology.orchestra.run.vm01.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils
2026-03-10T15:43:55.066 INFO:teuthology.orchestra.run.vm01.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-10T15:43:55.066 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-10T15:43:55.080 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-10T15:43:55.081 INFO:teuthology.orchestra.run.vm01.stdout: ceph-base* ceph-common* ceph-mon* ceph-osd* ceph-test* ceph-volume* radosgw*
2026-03-10T15:43:55.281 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 7 to remove and 12 not upgraded.
2026-03-10T15:43:55.281 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 472 MB disk space will be freed.
2026-03-10T15:43:55.326 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117937 files and directories currently installed.)
2026-03-10T15:43:55.329 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-volume (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:55.388 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-osd (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:55.825 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mon (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:56.259 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-base (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:56.753 INFO:teuthology.orchestra.run.vm01.stdout:Removing radosgw (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:57.107 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-test (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:57.139 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:57.564 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-10T15:43:57.598 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-10T15:43:57.675 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117456 files and directories currently installed.)
2026-03-10T15:43:57.677 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for radosgw (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:58.320 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-mon (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:58.768 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-base (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:59.162 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:43:59.603 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-osd (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:44:01.334 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-10T15:44:01.368 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:44:01.568 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-10T15:44:01.569 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: python3-psutil python3-pyinotify python3-repoze.lru
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: python3-simplejson python3-singledispatch python3-sklearn
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-10T15:44:01.755 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-10T15:44:01.762 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-10T15:44:01.762 INFO:teuthology.orchestra.run.vm01.stdout: ceph-fuse*
2026-03-10T15:44:01.955 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 1 to remove and 12 not upgraded.
2026-03-10T15:44:01.955 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 3673 kB disk space will be freed.
2026-03-10T15:44:01.996 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117443 files and directories currently installed.)
2026-03-10T15:44:01.998 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:44:02.443 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-10T15:44:02.924 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117434 files and directories currently installed.)
2026-03-10T15:44:02.927 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:44:04.584 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-10T15:44:04.627 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:44:04.827 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-10T15:44:04.828 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-10T15:44:05.022 INFO:teuthology.orchestra.run.vm01.stdout:Package 'ceph-test' is not installed, so not removed
2026-03-10T15:44:05.022 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-10T15:44:05.022 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-10T15:44:05.022 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-10T15:44:05.023 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-10T15:44:05.023 INFO:teuthology.orchestra.run.vm01.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-10T15:44:05.023 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth
2026-03-10T15:44:05.023 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-10T15:44:05.023 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-10T15:44:05.023 INFO:teuthology.orchestra.run.vm01.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-10T15:44:05.023 INFO:teuthology.orchestra.run.vm01.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable
2026-03-10T15:44:05.023 INFO:teuthology.orchestra.run.vm01.stdout: python3-psutil python3-pyinotify python3-repoze.lru
2026-03-10T15:44:05.023 INFO:teuthology.orchestra.run.vm01.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric
2026-03-10T15:44:05.023 INFO:teuthology.orchestra.run.vm01.stdout: python3-simplejson python3-singledispatch python3-sklearn
2026-03-10T15:44:05.023 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl
2026-03-10T15:44:05.023 INFO:teuthology.orchestra.run.vm01.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket
2026-03-10T15:44:05.023 INFO:teuthology.orchestra.run.vm01.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils
2026-03-10T15:44:05.023 INFO:teuthology.orchestra.run.vm01.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-10T15:44:05.023 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-10T15:44:05.047 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 12 not upgraded.
2026-03-10T15:44:05.047 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-10T15:44:05.080 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:44:05.270 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-10T15:44:05.270 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-10T15:44:05.454 INFO:teuthology.orchestra.run.vm01.stdout:Package 'ceph-volume' is not installed, so not removed 2026-03-10T15:44:05.454 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required: 2026-03-10T15:44:05.454 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-10T15:44:05.454 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-10T15:44:05.454 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-10T15:44:05.454 INFO:teuthology.orchestra.run.vm01.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools 2026-03-10T15:44:05.454 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth 2026-03-10T15:44:05.454 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools 2026-03-10T15:44:05.454 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils 2026-03-10T15:44:05.454 INFO:teuthology.orchestra.run.vm01.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-10T15:44:05.454 INFO:teuthology.orchestra.run.vm01.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable 2026-03-10T15:44:05.454 INFO:teuthology.orchestra.run.vm01.stdout: python3-psutil python3-pyinotify python3-repoze.lru 2026-03-10T15:44:05.454 INFO:teuthology.orchestra.run.vm01.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric 2026-03-10T15:44:05.455 INFO:teuthology.orchestra.run.vm01.stdout: python3-simplejson python3-singledispatch python3-sklearn 2026-03-10T15:44:05.455 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn-lib python3-tempita python3-tempora 
python3-threadpoolctl 2026-03-10T15:44:05.455 INFO:teuthology.orchestra.run.vm01.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket 2026-03-10T15:44:05.455 INFO:teuthology.orchestra.run.vm01.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-10T15:44:05.455 INFO:teuthology.orchestra.run.vm01.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-10T15:44:05.455 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-10T15:44:05.472 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 12 not upgraded. 2026-03-10T15:44:05.473 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-10T15:44:05.504 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists... 2026-03-10T15:44:05.687 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree... 2026-03-10T15:44:05.688 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information... 
2026-03-10T15:44:05.865 INFO:teuthology.orchestra.run.vm01.stdout:Package 'radosgw' is not installed, so not removed 2026-03-10T15:44:05.865 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required: 2026-03-10T15:44:05.866 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-10T15:44:05.866 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-10T15:44:05.867 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-10T15:44:05.867 INFO:teuthology.orchestra.run.vm01.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools 2026-03-10T15:44:05.867 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth 2026-03-10T15:44:05.867 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools 2026-03-10T15:44:05.867 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils 2026-03-10T15:44:05.867 INFO:teuthology.orchestra.run.vm01.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-10T15:44:05.867 INFO:teuthology.orchestra.run.vm01.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable 2026-03-10T15:44:05.867 INFO:teuthology.orchestra.run.vm01.stdout: python3-psutil python3-pyinotify python3-repoze.lru 2026-03-10T15:44:05.867 INFO:teuthology.orchestra.run.vm01.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric 2026-03-10T15:44:05.867 INFO:teuthology.orchestra.run.vm01.stdout: python3-simplejson python3-singledispatch python3-sklearn 2026-03-10T15:44:05.867 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn-lib python3-tempita python3-tempora 
python3-threadpoolctl 2026-03-10T15:44:05.867 INFO:teuthology.orchestra.run.vm01.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket 2026-03-10T15:44:05.867 INFO:teuthology.orchestra.run.vm01.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-10T15:44:05.867 INFO:teuthology.orchestra.run.vm01.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-10T15:44:05.867 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-10T15:44:05.891 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 12 not upgraded. 2026-03-10T15:44:05.891 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-10T15:44:05.924 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists... 2026-03-10T15:44:06.127 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree... 2026-03-10T15:44:06.128 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information... 
2026-03-10T15:44:06.316 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required: 2026-03-10T15:44:06.316 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-10T15:44:06.316 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-10T15:44:06.316 INFO:teuthology.orchestra.run.vm01.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-10T15:44:06.317 INFO:teuthology.orchestra.run.vm01.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-10T15:44:06.317 INFO:teuthology.orchestra.run.vm01.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-10T15:44:06.317 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-10T15:44:06.317 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-10T15:44:06.317 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-10T15:44:06.317 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-10T15:44:06.317 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-10T15:44:06.317 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-10T15:44:06.317 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-10T15:44:06.317 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-10T15:44:06.317 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch 
python3-sklearn python3-sklearn-lib python3-tempita 2026-03-10T15:44:06.317 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-10T15:44:06.317 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-10T15:44:06.317 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-10T15:44:06.317 INFO:teuthology.orchestra.run.vm01.stdout: xmlstarlet zip 2026-03-10T15:44:06.318 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-10T15:44:06.335 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED: 2026-03-10T15:44:06.335 INFO:teuthology.orchestra.run.vm01.stdout: python3-cephfs* python3-rados* python3-rgw* 2026-03-10T15:44:06.520 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 3 to remove and 12 not upgraded. 2026-03-10T15:44:06.521 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 2062 kB disk space will be freed. 2026-03-10T15:44:06.562 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117434 files and directories currently installed.) 2026-03-10T15:44:06.564 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-cephfs (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-10T15:44:06.575 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-rgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:44:06.585 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-rados (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:44:07.824 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-10T15:44:07.860 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists... 2026-03-10T15:44:08.073 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree... 2026-03-10T15:44:08.074 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information... 2026-03-10T15:44:08.222 INFO:teuthology.orchestra.run.vm01.stdout:Package 'python3-rgw' is not installed, so not removed 2026-03-10T15:44:08.222 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required: 2026-03-10T15:44:08.222 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-10T15:44:08.222 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-10T15:44:08.222 INFO:teuthology.orchestra.run.vm01.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections 
python3-jaraco.functools python3-jaraco.text 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout: xmlstarlet zip 2026-03-10T15:44:08.223 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-10T15:44:08.242 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 12 not upgraded. 2026-03-10T15:44:08.243 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-10T15:44:08.275 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists... 2026-03-10T15:44:08.473 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree... 
2026-03-10T15:44:08.473 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information... 2026-03-10T15:44:08.667 INFO:teuthology.orchestra.run.vm01.stdout:Package 'python3-cephfs' is not installed, so not removed 2026-03-10T15:44:08.667 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required: 2026-03-10T15:44:08.667 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-10T15:44:08.667 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-10T15:44:08.667 INFO:teuthology.orchestra.run.vm01.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-10T15:44:08.667 INFO:teuthology.orchestra.run.vm01.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-10T15:44:08.667 INFO:teuthology.orchestra.run.vm01.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-10T15:44:08.667 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-10T15:44:08.667 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-10T15:44:08.668 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-10T15:44:08.668 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-10T15:44:08.668 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-10T15:44:08.668 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-10T15:44:08.668 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 
2026-03-10T15:44:08.668 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-10T15:44:08.668 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-10T15:44:08.668 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-10T15:44:08.668 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-10T15:44:08.668 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-10T15:44:08.668 INFO:teuthology.orchestra.run.vm01.stdout: xmlstarlet zip 2026-03-10T15:44:08.668 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-10T15:44:08.687 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 12 not upgraded. 2026-03-10T15:44:08.687 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-10T15:44:08.720 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists... 2026-03-10T15:44:08.891 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree... 2026-03-10T15:44:08.891 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information... 
2026-03-10T15:44:09.055 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required: 2026-03-10T15:44:09.055 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-10T15:44:09.055 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-10T15:44:09.056 INFO:teuthology.orchestra.run.vm01.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-10T15:44:09.056 INFO:teuthology.orchestra.run.vm01.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-10T15:44:09.056 INFO:teuthology.orchestra.run.vm01.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-10T15:44:09.056 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-10T15:44:09.056 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-10T15:44:09.056 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-10T15:44:09.056 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-10T15:44:09.056 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-10T15:44:09.056 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-10T15:44:09.056 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-10T15:44:09.056 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-10T15:44:09.056 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch 
python3-sklearn python3-sklearn-lib python3-tempita 2026-03-10T15:44:09.056 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-10T15:44:09.056 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-10T15:44:09.057 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-10T15:44:09.057 INFO:teuthology.orchestra.run.vm01.stdout: xmlstarlet zip 2026-03-10T15:44:09.057 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-10T15:44:09.072 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED: 2026-03-10T15:44:09.072 INFO:teuthology.orchestra.run.vm01.stdout: python3-rbd* 2026-03-10T15:44:09.241 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 1 to remove and 12 not upgraded. 2026-03-10T15:44:09.241 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 1186 kB disk space will be freed. 2026-03-10T15:44:09.279 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117410 files and directories currently installed.) 2026-03-10T15:44:09.281 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-rbd (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-10T15:44:10.449 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-10T15:44:10.483 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists... 2026-03-10T15:44:10.681 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree... 2026-03-10T15:44:10.682 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information... 2026-03-10T15:44:10.850 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required: 2026-03-10T15:44:10.850 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-10T15:44:10.851 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-10T15:44:10.851 INFO:teuthology.orchestra.run.vm01.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout: 
python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout: xmlstarlet zip 2026-03-10T15:44:10.852 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-10T15:44:10.859 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED: 2026-03-10T15:44:10.859 INFO:teuthology.orchestra.run.vm01.stdout: libcephfs-dev* libcephfs2* 2026-03-10T15:44:11.035 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 2 to remove and 12 not upgraded. 2026-03-10T15:44:11.035 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 3202 kB disk space will be freed. 2026-03-10T15:44:11.115 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117402 files and directories currently installed.) 2026-03-10T15:44:11.117 INFO:teuthology.orchestra.run.vm01.stdout:Removing libcephfs-dev (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:44:11.128 INFO:teuthology.orchestra.run.vm01.stdout:Removing libcephfs2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-10T15:44:11.151 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ... 2026-03-10T15:44:12.316 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-10T15:44:12.349 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists... 2026-03-10T15:44:12.550 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree... 2026-03-10T15:44:12.553 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information... 
2026-03-10T15:44:12.691 INFO:teuthology.orchestra.run.vm01.stdout:Package 'libcephfs-dev' is not installed, so not removed 2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required: 2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa 
python3-simplegeneric python3-simplejson
2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip
2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout: xmlstarlet zip
2026-03-10T15:44:12.692 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-10T15:44:12.713 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 12 not upgraded.
2026-03-10T15:44:12.713 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-10T15:44:12.745 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:44:12.928 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-10T15:44:12.929 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-10T15:44:13.081 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-10T15:44:13.081 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-10T15:44:13.081 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0
2026-03-10T15:44:13.081 INFO:teuthology.orchestra.run.vm01.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-10T15:44:13.081 INFO:teuthology.orchestra.run.vm01.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout: smartmontools socat unzip xmlstarlet zip
2026-03-10T15:44:13.082 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-10T15:44:13.094 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-10T15:44:13.095 INFO:teuthology.orchestra.run.vm01.stdout: librados2* libradosstriper1* librbd1* librgw2* libsqlite3-mod-ceph*
2026-03-10T15:44:13.095 INFO:teuthology.orchestra.run.vm01.stdout: qemu-block-extra* rbd-fuse*
2026-03-10T15:44:13.269 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 7 to remove and 12 not upgraded.
2026-03-10T15:44:13.269 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 51.6 MB disk space will be freed.
2026-03-10T15:44:13.307 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117387 files and directories currently installed.)
2026-03-10T15:44:13.309 INFO:teuthology.orchestra.run.vm01.stdout:Removing rbd-fuse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:44:13.322 INFO:teuthology.orchestra.run.vm01.stdout:Removing libsqlite3-mod-ceph (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:44:13.335 INFO:teuthology.orchestra.run.vm01.stdout:Removing libradosstriper1 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:44:13.349 INFO:teuthology.orchestra.run.vm01.stdout:Removing qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ...
2026-03-10T15:44:13.767 INFO:teuthology.orchestra.run.vm01.stdout:Removing librbd1 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:44:13.778 INFO:teuthology.orchestra.run.vm01.stdout:Removing librgw2 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:44:13.790 INFO:teuthology.orchestra.run.vm01.stdout:Removing librados2 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:44:13.815 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-10T15:44:13.849 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-10T15:44:13.917 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117336 files and directories currently installed.)
2026-03-10T15:44:13.919 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ...
2026-03-10T15:44:15.596 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-10T15:44:15.629 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:44:15.826 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-10T15:44:15.826 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-10T15:44:16.023 INFO:teuthology.orchestra.run.vm01.stdout:Package 'librbd1' is not installed, so not removed
2026-03-10T15:44:16.023 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-10T15:44:16.023 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-10T15:44:16.023 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0
2026-03-10T15:44:16.023 INFO:teuthology.orchestra.run.vm01.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-10T15:44:16.023 INFO:teuthology.orchestra.run.vm01.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-10T15:44:16.023 INFO:teuthology.orchestra.run.vm01.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout: smartmontools socat unzip xmlstarlet zip
2026-03-10T15:44:16.024 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-10T15:44:16.051 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 12 not upgraded.
2026-03-10T15:44:16.051 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-10T15:44:16.084 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:44:16.280 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-10T15:44:16.281 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-10T15:44:16.467 INFO:teuthology.orchestra.run.vm01.stdout:Package 'rbd-fuse' is not installed, so not removed
2026-03-10T15:44:16.467 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-10T15:44:16.467 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-10T15:44:16.467 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0
2026-03-10T15:44:16.467 INFO:teuthology.orchestra.run.vm01.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-10T15:44:16.467 INFO:teuthology.orchestra.run.vm01.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-10T15:44:16.467 INFO:teuthology.orchestra.run.vm01.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout: smartmontools socat unzip xmlstarlet zip
2026-03-10T15:44:16.468 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-10T15:44:16.493 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 12 not upgraded.
2026-03-10T15:44:16.493 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-10T15:44:16.495 DEBUG:teuthology.orchestra.run.vm01:> dpkg -l | grep '^.\(U\|H\)R' | awk '{print $2}' | sudo xargs --no-run-if-empty dpkg -P --force-remove-reinstreq
2026-03-10T15:44:16.550 DEBUG:teuthology.orchestra.run.vm01:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" autoremove
2026-03-10T15:44:16.630 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:44:16.830 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-10T15:44:16.830 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-10T15:44:17.039 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-10T15:44:17.039 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-10T15:44:17.039 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0
2026-03-10T15:44:17.039 INFO:teuthology.orchestra.run.vm01.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-10T15:44:17.039 INFO:teuthology.orchestra.run.vm01.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: python3-pecan python3-portend python3-prettytable python3-psutil
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev
2026-03-10T15:44:17.040 INFO:teuthology.orchestra.run.vm01.stdout: smartmontools socat unzip xmlstarlet zip
2026-03-10T15:44:17.209 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 87 to remove and 12 not upgraded.
2026-03-10T15:44:17.210 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 107 MB disk space will be freed.
2026-03-10T15:44:17.254 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 117336 files and directories currently installed.)
2026-03-10T15:44:17.257 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr-modules-core (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:44:17.273 INFO:teuthology.orchestra.run.vm01.stdout:Removing jq (1.6-2.1ubuntu3.1) ...
2026-03-10T15:44:17.285 INFO:teuthology.orchestra.run.vm01.stdout:Removing kpartx (0.8.8-1ubuntu1.22.04.4) ...
2026-03-10T15:44:17.337 INFO:teuthology.orchestra.run.vm01.stdout:Removing libboost-iostreams1.74.0:amd64 (1.74.0-14ubuntu3) ...
2026-03-10T15:44:17.518 INFO:teuthology.orchestra.run.vm01.stdout:Removing libboost-thread1.74.0:amd64 (1.74.0-14ubuntu3) ...
2026-03-10T15:44:17.531 INFO:teuthology.orchestra.run.vm01.stdout:Removing libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-10T15:44:17.543 INFO:teuthology.orchestra.run.vm01.stdout:Removing libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-10T15:44:17.554 INFO:teuthology.orchestra.run.vm01.stdout:Removing libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-10T15:44:17.565 INFO:teuthology.orchestra.run.vm01.stdout:Removing libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-10T15:44:17.583 INFO:teuthology.orchestra.run.vm01.stdout:Removing libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-10T15:44:17.594 INFO:teuthology.orchestra.run.vm01.stdout:Removing libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-10T15:44:17.607 INFO:teuthology.orchestra.run.vm01.stdout:Removing libgfapi0:amd64 (10.1-1ubuntu0.2) ...
2026-03-10T15:44:17.618 INFO:teuthology.orchestra.run.vm01.stdout:Removing libgfrpc0:amd64 (10.1-1ubuntu0.2) ...
2026-03-10T15:44:17.629 INFO:teuthology.orchestra.run.vm01.stdout:Removing libgfxdr0:amd64 (10.1-1ubuntu0.2) ...
2026-03-10T15:44:17.640 INFO:teuthology.orchestra.run.vm01.stdout:Removing libglusterfs0:amd64 (10.1-1ubuntu0.2) ...
2026-03-10T15:44:17.651 INFO:teuthology.orchestra.run.vm01.stdout:Removing libiscsi7:amd64 (1.19.0-3build2) ...
2026-03-10T15:44:17.663 INFO:teuthology.orchestra.run.vm01.stdout:Removing libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-10T15:44:17.673 INFO:teuthology.orchestra.run.vm01.stdout:Removing liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-10T15:44:17.683 INFO:teuthology.orchestra.run.vm01.stdout:Removing luarocks (3.8.0+dfsg1-1) ...
2026-03-10T15:44:17.706 INFO:teuthology.orchestra.run.vm01.stdout:Removing liblua5.3-dev:amd64 (5.3.6-1build1) ...
2026-03-10T15:44:17.716 INFO:teuthology.orchestra.run.vm01.stdout:Removing libnbd0 (1.10.5-1) ...
2026-03-10T15:44:17.726 INFO:teuthology.orchestra.run.vm01.stdout:Removing liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-10T15:44:17.735 INFO:teuthology.orchestra.run.vm01.stdout:Removing libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-10T15:44:17.745 INFO:teuthology.orchestra.run.vm01.stdout:Removing libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-10T15:44:17.754 INFO:teuthology.orchestra.run.vm01.stdout:Removing libpmemobj1:amd64 (1.11.1-3build1) ...
2026-03-10T15:44:17.763 INFO:teuthology.orchestra.run.vm01.stdout:Removing librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-10T15:44:17.772 INFO:teuthology.orchestra.run.vm01.stdout:Removing libreadline-dev:amd64 (8.1.2-1) ...
2026-03-10T15:44:17.781 INFO:teuthology.orchestra.run.vm01.stdout:Removing sg3-utils-udev (1.46-1ubuntu0.22.04.1) ...
2026-03-10T15:44:17.787 INFO:teuthology.orchestra.run.vm01.stdout:update-initramfs: deferring update (trigger activated)
2026-03-10T15:44:17.795 INFO:teuthology.orchestra.run.vm01.stdout:Removing sg3-utils (1.46-1ubuntu0.22.04.1) ...
2026-03-10T15:44:17.813 INFO:teuthology.orchestra.run.vm01.stdout:Removing libsgutils2-2:amd64 (1.46-1ubuntu0.22.04.1) ...
2026-03-10T15:44:17.825 INFO:teuthology.orchestra.run.vm01.stdout:Removing lua-any (27ubuntu1) ...
2026-03-10T15:44:17.836 INFO:teuthology.orchestra.run.vm01.stdout:Removing lua-sec:amd64 (1.0.2-1) ...
2026-03-10T15:44:17.848 INFO:teuthology.orchestra.run.vm01.stdout:Removing lua-socket:amd64 (3.0~rc1+git+ac3201d-6) ...
2026-03-10T15:44:17.863 INFO:teuthology.orchestra.run.vm01.stdout:Removing lua5.1 (5.1.5-8.1build4) ...
2026-03-10T15:44:17.881 INFO:teuthology.orchestra.run.vm01.stdout:Removing nvme-cli (1.16-3ubuntu0.3) ...
2026-03-10T15:44:18.281 INFO:teuthology.orchestra.run.vm01.stdout:Removing pkg-config (0.29.2-1ubuntu3) ...
2026-03-10T15:44:18.312 INFO:teuthology.orchestra.run.vm01.stdout:Removing python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-10T15:44:18.337 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-pecan (1.3.3-4ubuntu2) ...
2026-03-10T15:44:18.393 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-webtest (2.0.35-1) ...
2026-03-10T15:44:18.448 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-pastescript (2.0.2-4) ...
2026-03-10T15:44:18.500 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-pastedeploy (2.1.1-1) ...
2026-03-10T15:44:18.550 INFO:teuthology.orchestra.run.vm01.stdout:Removing python-pastedeploy-tpl (2.1.1-1) ...
2026-03-10T15:44:18.561 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-10T15:44:18.614 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-10T15:44:18.871 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-google-auth (1.5.1-3) ...
2026-03-10T15:44:18.922 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-cachetools (5.0.0-1) ...
2026-03-10T15:44:18.970 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-ceph-argparse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:44:19.016 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-10T15:44:19.065 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-cherrypy3 (18.6.1-4) ...
2026-03-10T15:44:19.125 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-cheroot (8.5.2+ds1-1ubuntu3.1) ...
2026-03-10T15:44:19.182 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-jaraco.collections (3.4.0-2) ...
2026-03-10T15:44:19.237 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-jaraco.classes (3.2.1-3) ...
2026-03-10T15:44:19.290 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-portend (3.0.0-1) ...
2026-03-10T15:44:19.341 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-tempora (4.1.2-1) ...
2026-03-10T15:44:19.396 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-jaraco.text (3.6.0-2) ...
2026-03-10T15:44:19.451 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-jaraco.functools (3.4.0-2) ...
2026-03-10T15:44:19.504 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-10T15:44:19.628 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-joblib (0.17.0-4ubuntu1) ...
2026-03-10T15:44:19.691 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-logutils (0.3.3-8) ...
2026-03-10T15:44:19.743 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-mako (1.1.3+ds1-2ubuntu0.1) ...
2026-03-10T15:44:19.811 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-natsort (8.0.2-1) ...
2026-03-10T15:44:20.041 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-paste (3.5.0+dfsg1-1) ...
2026-03-10T15:44:20.104 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-prettytable (2.5.0-2) ...
2026-03-10T15:44:20.151 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-psutil (5.9.0-1build1) ...
2026-03-10T15:44:20.203 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-pyinotify (0.9.6-1.3) ...
2026-03-10T15:44:20.250 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-routes (2.5.1-1ubuntu1) ...
2026-03-10T15:44:20.302 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-repoze.lru (0.7-2) ...
2026-03-10T15:44:20.350 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-10T15:44:20.402 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-rsa (4.8-1) ...
2026-03-10T15:44:20.453 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-simplegeneric (0.8.1-3) ...
2026-03-10T15:44:20.500 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-simplejson (3.17.6-1build1) ...
2026-03-10T15:44:20.553 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-singledispatch (3.4.0.3-3) ...
2026-03-10T15:44:20.604 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-10T15:44:20.628 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-tempita (0.5.2-6ubuntu1) ...
2026-03-10T15:44:20.675 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-threadpoolctl (3.1.0-1) ...
2026-03-10T15:44:20.721 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-waitress (1.4.4-1.1ubuntu1.1) ...
2026-03-10T15:44:20.771 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-10T15:44:20.822 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-10T15:44:20.870 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-websocket (1.2.3-1) ...
2026-03-10T15:44:20.920 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-werkzeug (2.0.2+dfsg1-1ubuntu0.22.04.3) ...
2026-03-10T15:44:20.974 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-zc.lockfile (2.0-1) ...
2026-03-10T15:44:21.021 INFO:teuthology.orchestra.run.vm01.stdout:Removing qttranslations5-l10n (5.15.3-1) ...
2026-03-10T15:44:21.044 INFO:teuthology.orchestra.run.vm01.stdout:Removing smartmontools (7.2-1ubuntu0.1) ...
2026-03-10T15:44:21.515 INFO:teuthology.orchestra.run.vm01.stdout:Removing socat (1.7.4.1-3ubuntu4) ...
2026-03-10T15:44:21.527 INFO:teuthology.orchestra.run.vm01.stdout:Removing unzip (6.0-26ubuntu3.2) ...
2026-03-10T15:44:21.548 INFO:teuthology.orchestra.run.vm01.stdout:Removing xmlstarlet (1.6.1-2.1) ...
2026-03-10T15:44:21.568 INFO:teuthology.orchestra.run.vm01.stdout:Removing zip (3.0-12build2) ...
2026-03-10T15:44:21.594 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-10T15:44:21.606 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-10T15:44:21.653 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for mailcap (3.70+nmu1ubuntu1) ...
2026-03-10T15:44:21.660 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for initramfs-tools (0.140ubuntu13.5) ...
2026-03-10T15:44:21.677 INFO:teuthology.orchestra.run.vm01.stdout:update-initramfs: Generating /boot/initrd.img-5.15.0-1092-kvm
2026-03-10T15:44:23.217 INFO:teuthology.orchestra.run.vm01.stdout:W: mkconf: MD subsystem is not loaded, thus I cannot scan for arrays.
2026-03-10T15:44:23.218 INFO:teuthology.orchestra.run.vm01.stdout:W: mdadm: failed to auto-generate temporary mdadm.conf file.
2026-03-10T15:44:25.396 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-10T15:44:25.400 DEBUG:teuthology.parallel:result is None
2026-03-10T15:44:25.400 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm01.local
2026-03-10T15:44:25.400 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -f /etc/apt/sources.list.d/ceph.list
2026-03-10T15:44:25.450 DEBUG:teuthology.orchestra.run.vm01:> sudo apt-get update
2026-03-10T15:44:25.752 INFO:teuthology.orchestra.run.vm01.stdout:Hit:1 https://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-10T15:44:25.755 INFO:teuthology.orchestra.run.vm01.stdout:Hit:2 https://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-10T15:44:25.761 INFO:teuthology.orchestra.run.vm01.stdout:Hit:3 https://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-10T15:44:25.763 INFO:teuthology.orchestra.run.vm01.stdout:Hit:4 https://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-10T15:44:26.797 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-10T15:44:26.811 DEBUG:teuthology.parallel:result is None
2026-03-10T15:44:26.811 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-10T15:44:26.813 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-10T15:44:26.813 DEBUG:teuthology.orchestra.run.vm01:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout: remote refid st t when poll reach delay offset jitter
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:==============================================================================
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout: 0.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout: 1.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout: 2.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout: 3.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout: ntp.ubuntu.com .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:+node-3.infogral 168.239.11.197 2 u 7 64 177 23.541 +1.049 9.770
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:+zeus.f5s.de 131.188.3.222 2 u 2 64 177 25.168 +0.807 9.589
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:+ntp3.uni-ulm.de 129.69.253.1 2 u 66 64 77 27.198 -1.262 12.205
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:+ns8.starka.st 129.134.28.123 2 u 66 64 77 22.681 -2.782 11.201
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:+nur1.aup.dk 131.188.3.222 2 u 63 64 77 23.645 -0.206 10.804
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:+time1.hs-augsbu 131.188.3.220 2 u 67 64 77 32.110 -1.492 11.864
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:+static.215.156. 35.73.197.144 2 u 61 64 77 23.620 -0.387 10.476
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:+where-you.at 31.209.85.243 2 u 65 64 77 25.099 -0.153 10.880
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:+104-167-24-26.l 185.232.69.65 2 u 3 64 177 26.387 -2.019 8.725
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:+vps-nue1.orlean 195.145.119.188 2 u 67 64 77 28.433 -2.306 11.514
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:+nono.io 218.73.139.35 2 u 1 64 177 25.166 +1.233 8.662
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:+mail.morbitzer. 205.46.178.169 2 u 61 64 77 28.343 -3.378 10.189
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:+185.125.190.58 145.238.80.80 2 u 6 64 177 35.266 -0.407 8.138
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:*home.of.the.smi .LIgp. 1 u 66 64 77 38.645 +1.930 10.184
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:+mail.johanneskr 131.188.3.222 2 u 66 64 77 21.054 -0.545 11.213
2026-03-10T15:44:27.821 INFO:teuthology.orchestra.run.vm01.stdout:+static.179.181. 213.239.239.166 3 u 64 64 77 23.657 +0.171 11.226
2026-03-10T15:44:27.822 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-10T15:44:27.827 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-10T15:44:27.827 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-10T15:44:27.829 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-10T15:44:27.847 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-10T15:44:27.864 INFO:teuthology.task.internal:Duration was 544.401150 seconds
2026-03-10T15:44:27.864 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-10T15:44:27.867 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-10T15:44:27.867 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-10T15:44:27.892 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-10T15:44:27.892 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm01.local
2026-03-10T15:44:27.892 DEBUG:teuthology.orchestra.run.vm01:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-10T15:44:27.943 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-10T15:44:27.943 DEBUG:teuthology.orchestra.run.vm01:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T15:44:28.085 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-10T15:44:28.086 DEBUG:teuthology.orchestra.run.vm01:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-10T15:44:28.092 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T15:44:28.093 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T15:44:28.093 INFO:teuthology.orchestra.run.vm01.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T15:44:28.093 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-10T15:44:28.093 INFO:teuthology.orchestra.run.vm01.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-10T15:44:28.110 INFO:teuthology.orchestra.run.vm01.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 91.3% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-10T15:44:28.111 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-10T15:44:28.114 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-10T15:44:28.114 DEBUG:teuthology.orchestra.run.vm01:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-10T15:44:28.160 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-10T15:44:28.162 DEBUG:teuthology.orchestra.run.vm01:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-10T15:44:28.208 INFO:teuthology.orchestra.run.vm01.stdout:kernel.core_pattern = core
2026-03-10T15:44:28.216 DEBUG:teuthology.orchestra.run.vm01:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-10T15:44:28.259 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T15:44:28.259 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-10T15:44:28.262 INFO:teuthology.task.internal:Transferring archived files...
2026-03-10T15:44:28.263 DEBUG:teuthology.misc:Transferring archived files from vm01:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1080/remote/vm01
2026-03-10T15:44:28.263 DEBUG:teuthology.orchestra.run.vm01:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-10T15:44:28.311 INFO:teuthology.task.internal:Removing archive directory...
2026-03-10T15:44:28.311 DEBUG:teuthology.orchestra.run.vm01:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-10T15:44:28.356 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-10T15:44:28.359 INFO:teuthology.task.internal:Not uploading archives.
2026-03-10T15:44:28.359 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-10T15:44:28.361 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-10T15:44:28.361 DEBUG:teuthology.orchestra.run.vm01:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-10T15:44:28.400 INFO:teuthology.orchestra.run.vm01.stdout: 258207 4 drwxr-xr-x 2 ubuntu ubuntu 4096 Mar 10 15:44 /home/ubuntu/cephtest
2026-03-10T15:44:28.400 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-10T15:44:28.406 INFO:teuthology.run:Summary data:
description: orch/cephadm/workunits/{0-distro/ubuntu_22.04 agent/off mon_election/classic task/test_cephadm}
duration: 544.4011504650116
flavor: default
owner: kyr
success: true

2026-03-10T15:44:28.406 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-10T15:44:28.425 INFO:teuthology.run:pass